The present disclosure generally relates to the field of vehicles and, more specifically, to methods and systems for associating measurements in vehicles, such as automobiles.
Many vehicles today have systems that track a position or movement of objects (for example, other vehicles) that may be in proximity to the vehicle. Such systems may include, by way of example, adaptive cruise control (ACC) systems, avoidance systems, active braking systems, active steering systems, driver assist systems, warning systems, and the like. However, it may be difficult in certain situations to provide optimal tracking of such objects over time.
Accordingly, it is desirable to provide improved methods and systems for measurement association in vehicles, for example with respect to measurements pertaining to detected objects that may be in proximity to the vehicle. Furthermore, other desirable features and characteristics of the present invention will become apparent from the subsequent detailed description and the appended claims, taken in conjunction with the accompanying drawings and the foregoing technical field and background.
In accordance with an exemplary embodiment, a method is provided. The method comprises identifying an object proximate a vehicle, obtaining one or more measurements or classifications that may potentially be associated with the object via one or more sensors, generating a first tracking gate that is based at least in part on a characteristic of one of the sensors used to obtain the measurements or classifications, and generating a second tracking gate that is based at least in part on the first tracking gate and a measurement history.
In accordance with another exemplary embodiment, a method is provided. The method comprises obtaining initial first measurements via a first type of sensor, obtaining initial second measurements via a second type of sensor that is different from the first type of sensor, generating a fusion system incorporating the initial first measurements and the initial second measurements, generating a predicted value using the initial first measurements, the initial second measurements, and the fusion system, obtaining additional measurements via the first type of sensor, the second type of sensor, or both, and comparing the predicted value with the additional measurements.
In accordance with a further exemplary embodiment, a system is provided. The system comprises one or more sensors and a processor. The one or more sensors are configured to provide one or more measurements. The processor is coupled to the one or more sensors, and is configured to at least facilitate identifying an object proximate a vehicle, generating a first tracking gate that is based at least in part on a characteristic of one of the sensors used to obtain the measurements, and generating a second tracking gate that is based at least in part on the first tracking gate and a measurement history.
The present disclosure will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and wherein:
The following detailed description is merely exemplary in nature and is not intended to limit the disclosure or the application and uses thereof. Furthermore, there is no intention to be bound by any theory presented in the preceding background or the following detailed description.
As depicted in
The vehicle 100 (as well as each of the target vehicles and third vehicles) may be any one of a number of different types of automobiles, such as, for example, a sedan, a wagon, a truck, or a sport utility vehicle (SUV), and may be two-wheel drive (2WD) (i.e., rear-wheel drive or front-wheel drive), four-wheel drive (4WD) or all-wheel drive (AWD). The vehicle 100 may also incorporate any one of, or combination of, a number of different types of propulsion systems, such as, for example, a gasoline or diesel fueled combustion engine, a “flex fuel vehicle” (FFV) engine (i.e., using a mixture of gasoline and ethanol), a gaseous compound (e.g., hydrogen or natural gas) fueled engine, a combustion/electric motor hybrid engine, and an electric motor.
In the exemplary embodiment illustrated in
Still referring to
The ESS 122 is mounted on the chassis 112, and is electrically connected to the inverter 126. The ESS 122 preferably comprises a battery having a pack of battery cells. In one embodiment, the ESS 122 comprises a lithium iron phosphate battery, such as a nanophosphate lithium ion battery. Together the ESS 122 and electric propulsion system(s) 129 provide a drive system to propel the vehicle 100.
The radiator 128 is connected to the frame at an outer portion thereof and, although not illustrated in detail, includes multiple cooling channels therein that contain a cooling fluid (i.e., coolant), such as water and/or ethylene glycol (i.e., “antifreeze”), and is coupled to the combustion engine 130 and the inverter 126.
The steering system 150 is mounted on the chassis 112, and controls steering of the wheels 116. The steering system 150 includes a steering wheel and a steering column (not depicted). The steering wheel receives inputs from a driver of the vehicle. The steering column results in desired steering angles for the wheels 116 via the drive shafts 134 based on the inputs from the driver.
The braking system 160 is mounted on the chassis 112, and provides braking for the vehicle 100. The braking system 160 receives inputs from the driver via a brake pedal (not depicted), and provides appropriate braking via brake units (also not depicted). The driver also provides inputs via an accelerator pedal (not depicted) as to a desired speed or acceleration of the vehicle, inputs via a cruise control resume switch (not depicted), and various other inputs for various vehicle devices and/or systems, such as one or more vehicle radios, other entertainment systems, environmental control systems, lighting units, navigation systems, and the like (also not depicted). In a preferred embodiment, the braking system 160 includes both a regenerative braking capability and a friction braking capability for the vehicle 100.
The control system 170 is mounted on the chassis 112. The control system 170 may be coupled to various other vehicle devices and systems, such as, among others, the actuator assembly 120, the steering system 150, the braking system 160, and the electronic control system 118. The control system 170 detects and tracks objects that may be proximate the vehicle 100, including the tracking of positions and movements of such objects. In addition, the control system 170 associates measurements pertaining to such objects using multiple tracking gates in executing the steps of the processes 300, 500 set forth in
With reference to
The sensor array 202 measures and obtains information for use by the controller 204 pertaining to objects (for example, other vehicles) that may be proximate the vehicle 100 of
The controller 204 is coupled to the sensor array 202. The controller 204 processes the data and information received from the sensor array 202, and associates measurements therefrom pertaining to objects that may be proximate the vehicle. In one embodiment, the controller 204 performs these features in accordance with the steps of the processes 300, 500 depicted in
As depicted in
In the depicted embodiment, the computer system of the controller 204 includes a processor 220, a memory 222, an interface 224, a storage device 226, and a bus 228. The processor 220 performs the computation and control functions of the controller 204, and may comprise any type of processor or multiple processors, single integrated circuits such as a microprocessor, or any suitable number of integrated circuit devices and/or circuit boards working in cooperation to accomplish the functions of a processing unit. During operation, the processor 220 executes one or more programs 230 contained within the memory 222 and, as such, controls the general operation of the controller 204 and the computer system of the controller 204, preferably in executing the steps of the processes described herein, such as the steps of the processes 300, 500 (and any sub-processes thereof) in connection with
The memory 222 can be any type of suitable memory. This would include the various types of dynamic random access memory (DRAM) such as SDRAM, the various types of static RAM (SRAM), and the various types of non-volatile memory (PROM, EPROM, and flash). In certain examples, the memory 222 is located on and/or co-located on the same computer chip as the processor 220. In the depicted embodiment, the memory 222 stores the above-referenced program 230 along with one or more stored values 232 (preferably, including look-up tables) for use in associating the measurements from the sensor array 202.
The bus 228 serves to transmit programs, data, status and other information or signals between the various components of the computer system of the controller 204. The interface 224 allows communication to the computer system of the controller 204, for example from a system driver and/or another computer system, and can be implemented using any suitable method and apparatus. It can include one or more network interfaces to communicate with other systems or components. The interface 224 may also include one or more network interfaces to communicate with technicians, and/or one or more storage interfaces to connect to storage apparatuses, such as the storage device 226.
The storage device 226 can be any suitable type of storage apparatus, including direct access storage devices such as hard disk drives, flash systems, floppy disk drives and optical disk drives. In one exemplary embodiment, the storage device 226 comprises a program product from which memory 222 can receive a program 230 that executes one or more embodiments of one or more processes of the present disclosure, such as the steps of the processes 300, 500 (and any sub-processes thereof) of
The bus 228 can be any suitable physical or logical means of connecting computer systems and components. This includes, but is not limited to, direct hard-wired connections, fiber optics, infrared and wireless bus technologies. During operation, the program 230 is stored in the memory 222 and executed by the processor 220.
It will be appreciated that while this exemplary embodiment is described in the context of a fully functioning computer system, those skilled in the art will recognize that the mechanisms of the present disclosure are capable of being distributed as a program product with one or more types of non-transitory computer-readable signal bearing media used to store the program and the instructions thereof and carry out the distribution thereof, such as a non-transitory computer readable medium bearing the program and containing computer instructions stored therein for causing a computer processor (such as the processor 220) to perform and execute the program. Such a program product may take a variety of forms, and the present disclosure applies equally regardless of the particular type of computer-readable signal bearing media used to carry out the distribution. Examples of signal bearing media include: recordable media such as floppy disks, hard drives, memory cards and optical disks, and transmission media such as digital and analog communication links. It will similarly be appreciated that the computer system of the controller 204 may also otherwise differ from the embodiment depicted in
As depicted in
Additional measurements are also obtained (step 302). The additional measurements pertain to additional values of objects that may be in proximity to the vehicle, and they may pertain to the object identified in step 301. The additional measurements are preferably made by the sensor array 202 of
In certain embodiments, the object is identified in step 301 based on a first measurement from a first sensor of the sensor array 202 of
Historical data is obtained (step 304). The historical data preferably pertains to a measurement history for the object identified in step 301, including the measurements used to identify the object in step 301 as well as the additional measurements of step 302. The historical data is preferably stored in the memory 222 of
A first tracking gate is generated (step 306). The first tracking gate represents an initial boundary for tracking the measurements and associating them with the object identified in step 301. The first tracking gate is preferably generated by the processor 220 of
An exemplary first tracking gate 404 is depicted in
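By way of a non-limiting illustration, the first tracking gate may be realized as a boundary sized from the characterized noise of the sensor that produced the measurement, for example an n-sigma region in range and azimuth. The sketch below is merely one possible implementation; the function name, gate shape, and numeric noise values are assumptions and not part of the disclosed embodiments.

```python
def first_gate_contains(measurement, track_position, range_sigma, azimuth_sigma, n_sigma=3.0):
    """Return True if a (range, azimuth) measurement falls inside the first tracking gate.

    The gate is an n-sigma box centered on the object's last known position and
    sized from the characterized range and azimuth noise of the sensor that
    produced the measurement (illustrative values only).
    """
    d_range = abs(measurement[0] - track_position[0])
    d_azimuth = abs(measurement[1] - track_position[1])
    return d_range <= n_sigma * range_sigma and d_azimuth <= n_sigma * azimuth_sigma


# Example: a radar-style gate (fine range resolution, coarser azimuth resolution).
print(first_gate_contains(
    measurement=(25.3, 0.10),        # meters, radians
    track_position=(25.0, 0.08),
    range_sigma=0.5,                 # assumed radar range noise (m)
    azimuth_sigma=0.02,              # assumed radar azimuth noise (rad)
))
```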
A second tracking gate is also generated (step 308). The second tracking gate represents an additional boundary for tracking the measurements and associating them with the object identified in step 301. The second tracking gate is preferably generated by the processor 220 of
An exemplary second tracking gate 406 is depicted in
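By way of further illustration, a second tracking gate that reflects the measurement history may be expressed as a statistical distance test against the value predicted from that history. The following is a minimal sketch under that assumption; the threshold, covariance values, and function name are illustrative only.

```python
import numpy as np

def second_gate_contains(measurement, predicted_measurement, innovation_cov, threshold=9.21):
    """Return True if a measurement falls inside the second (history-based) tracking gate.

    The gate is expressed as a Mahalanobis-distance test against the value
    predicted from the object's measurement history; 9.21 corresponds roughly
    to a 99% chi-square gate with 2 degrees of freedom.
    """
    nu = np.asarray(measurement, dtype=float) - np.asarray(predicted_measurement, dtype=float)
    d2 = nu @ np.linalg.inv(innovation_cov) @ nu
    return d2 <= threshold


# Example with an assumed 2x2 innovation covariance (range, azimuth).
S = np.array([[0.30, 0.0],
              [0.0, 0.01]])
print(second_gate_contains((25.3, 0.10), (25.1, 0.09), S))
```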
For each measurement of step 302, a determination is made as to whether the measurement falls within the boundary of the first tracking gate based on a comparison of the measurement with the first tracking gate (step 310). This determination is preferably made by the processor 220 of
In certain embodiments in which multiple first tracking gates are used (e.g., for different types of sensors), the comparison of step 310 preferably comprises a determination of whether a measurement falls within the boundary of a particular first tracking gate that is associated with the type of sensor that was used for generating the particular measurement at issue. In one embodiment, the comparison comprises a probability score that the measurement falls within the boundary of the first tracking gate.
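For example, one possible form of such a probability score, assuming a two-dimensional Gaussian measurement residual, is shown below; the closed-form expression and function name are illustrative assumptions rather than the disclosed scoring method.

```python
import math

def association_score(mahalanobis_sq):
    """One possible probability score for the gating comparison of step 310.

    For a two-dimensional Gaussian residual, the squared Mahalanobis distance is
    chi-square distributed with 2 degrees of freedom, whose survival function has
    the closed form exp(-d2 / 2): the probability that a measurement truly
    originating from the tracked object would land at least this far away.
    """
    return math.exp(-mahalanobis_sq / 2.0)


# A measurement very close to the prediction scores near 1; a distant one near 0.
print(association_score(0.14), association_score(12.0))
```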
If it is determined in step 310 that the measurement is not within the boundary of the first tracking gate (i.e., if the measurement is outside the boundary, such as with the third additional measurement 414 of
Conversely, if it is determined in step 310 that the measurement is within the boundary of the first tracking gate (such as with the first and second additional measurements 410, 412 of
A determination is also made (for each measurement falling within the boundary of the first tracking gate) as to whether the measurement also falls within the boundary of the second tracking gate based on a comparison of the measurement with the second tracking gate (step 314). This determination is preferably made by the processor 220 of
If it is determined in step 314 that the measurement is not within the boundary of the second tracking gate (i.e., if the measurement is outside the boundary, such as with the second and third additional measurements 412, 414 in the example of
Conversely, if it is determined in step 314 that the measurement is within the boundary of the second tracking gate (such as with the first additional measurement 410 in the example of
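Taken together, steps 310 through 318 amount to a two-stage classification of each measurement. The following sketch illustrates one possible form of that decision flow; the return labels describe assumed dispositions for illustration only and are not taken from the disclosed embodiments.

```python
def classify_measurement(in_first_gate, in_second_gate):
    """Classify one measurement under the two-stage gating of steps 310-318.

    The return labels are assumed dispositions for illustration only:
      "reject"       - outside the first (sensor-based) gate
      "provisional"  - inside the first gate but outside the second gate
      "associate"    - inside both gates; the measurement is associated with the
                       object and used to update the second gate (step 318)
    """
    if not in_first_gate:
        return "reject"
    if not in_second_gate:
        return "provisional"
    return "associate"


# Example mirroring the depicted scenario: measurement 410 falls inside both gates,
# measurement 412 only inside the first gate, and measurement 414 outside the first gate.
for first, second in [(True, True), (True, False), (False, False)]:
    print(classify_measurement(first, second))
```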
During step 318, the second tracking gate is updated. Specifically, in a preferred embodiment, the second gate is updated in a recursive manner by adding the measurements as new inputs into the Kalman filter from the previous iteration. As shown in
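A standard linear Kalman filter measurement update is one way such a recursive update may be realized. The sketch below assumes a linear measurement model; the function name and variable conventions are illustrative.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Perform one recursive Kalman filter measurement update (step 318).

    x, P : state estimate and covariance carried over from the previous iteration
    z    : newly associated measurement
    H, R : measurement model and measurement noise covariance
    The innovation covariance S also provides the size of the refreshed second
    tracking gate for the next iteration.
    """
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new
```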
During step 320, the measurements and determinations of steps 302-318 are used to update the historical data. The updated historical data is preferably stored in the memory 222 of
In addition, in certain embodiments, in the event that objects from several sensors have been grouped together, these objects may be disassociated if an incorrect association has occurred or if a better match exists with another object. Accordingly, in one embodiment, the association history of the grouping is checked at each time step. If, after a specifiable (calibratable) number of cycles, the data from the previously associated objects no longer warrant association, or the objects have moved too far away from the currently associated track, the measurements that no longer match the fusion track are removed from that track and either added to another fusion track, if there is a good match, or used to create a new track.
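The following sketch illustrates one possible form of that association-history check; the parameter names and the calibratable limits are illustrative assumptions.

```python
def review_association(miss_count, max_miss_cycles, distance_to_track, max_distance):
    """Decide whether a previously grouped sensor object should remain on its fusion track.

    miss_count        : consecutive cycles in which the object no longer warranted association
    max_miss_cycles   : calibratable number of cycles tolerated before disassociation
    distance_to_track : current separation from the associated fusion track
    max_distance      : calibratable separation limit
    Returns "disassociate" when the object should be removed from the track and
    either re-associated to a better-matching track or used to start a new one.
    """
    if miss_count >= max_miss_cycles or distance_to_track > max_distance:
        return "disassociate"
    return "keep"
```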
As depicted in
Measurements, determinations, and/or classifications are also obtained from a second type of sensor (step 504). In one embodiment, during step 504 measurements are obtained from one or more radar sensors 212 of
Targets are identified based on the measurements, determinations, and/or classifications (steps 506, 508). Specifically, during step 506, targets are identified based on the measurements, determinations, and/or classifications from the first type of sensor of step 502. Similarly, during step 508, targets are identified based on the measurements, determinations, and/or classifications from the second type of sensor of step 504. Accordingly, in one embodiment, vision sensor targets are identified in step 506, and radar sensor targets are identified in step 508. In certain embodiments, targets from three or more different types of sensors (and/or other detection devices and/or techniques) may be identified. In one embodiment, the targets of steps 506 and 508 pertain to different positions of the same object. In another embodiment, the targets of steps 506 and 508 pertain to different objects. In certain embodiments, the identifications of steps 506 and 508 are preferably performed by the processor 220 of
Data association algorithms are utilized with respect to the targets identified in steps 506, 508 (steps 510, 512). Specifically, during step 510, a data association algorithm for the first type of sensor (e.g., for a vision sensor) is used to generate a first tracking gate for the targets of step 506 based on the characteristics of the first type of sensor. Similarly, during step 512, a data association algorithm for the second type of sensor (e.g., for a radar sensor) is used to generate a first tracking gate for the targets of step 508 based on the characteristics of the second type of sensor. Accordingly, in one embodiment, steps 510 and 512 correspond to the creation of multiple first gates for different types of sensors in step 306 of the process 300 of
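By way of illustration, the per-sensor association algorithms of steps 510 and 512 may each apply a first gate sized to the producing sensor's characteristics, for example fine azimuth accuracy for a vision sensor and fine range accuracy for a radar sensor. The gate limits and function names below are assumed values for illustration only.

```python
def vision_first_gate(measurement, target):
    """First gate tuned to assumed vision-sensor characteristics (fine azimuth, coarse range)."""
    return abs(measurement[0] - target[0]) <= 4.5 and abs(measurement[1] - target[1]) <= 0.015

def radar_first_gate(measurement, target):
    """First gate tuned to assumed radar-sensor characteristics (fine range, coarse azimuth)."""
    return abs(measurement[0] - target[0]) <= 1.5 and abs(measurement[1] - target[1]) <= 0.06

# Steps 510/512: the association algorithm (and therefore the first gate) is
# selected according to the type of sensor that produced the target.
FIRST_GATE_BY_SENSOR = {"vision": vision_first_gate, "radar": radar_first_gate}

print(FIRST_GATE_BY_SENSOR["radar"]((25.3, 0.10), (25.0, 0.08)))
```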
A fusion system is generated (step 514). The fusion system preferably corresponds to the second gate of step 308 of the process 300 of
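One possible way a fusion system may combine the targets from both sensor types is a covariance-weighted combination of their estimates; this is an assumption for illustration, not necessarily the disclosed fusion method.

```python
import numpy as np

def fuse_estimates(x_vision, P_vision, x_radar, P_radar):
    """Fuse vision and radar estimates of the same target (step 514).

    A standard covariance-weighted combination: the fused estimate leans toward
    whichever sensor is more certain in each dimension, which is one way a
    fusion system can incorporate both sets of measurements.
    """
    info = np.linalg.inv(P_vision) + np.linalg.inv(P_radar)
    P_fused = np.linalg.inv(info)
    x_fused = P_fused @ (np.linalg.inv(P_vision) @ x_vision +
                         np.linalg.inv(P_radar) @ x_radar)
    return x_fused, P_fused


# Example: vision is assumed better laterally, radar longitudinally (assumed covariances).
x_f, P_f = fuse_estimates(np.array([25.2, 1.0]), np.diag([2.0, 0.1]),
                          np.array([25.0, 1.3]), np.diag([0.2, 1.5]))
print(x_f)
```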
Fusion targets are generated (step 516). Specifically, fusion targets are generated using the fusion system of step 514, preferably by the processor 220 of
An analysis of the fusion targets is performed (step 518) and used to generate a target motion model (step 520). Specifically, in one embodiment, a tracking of fusion targets of step 516 over time is used to generate a pattern of movement of the fusion targets over time. The target motion model is used to predict fusion targets into the future (step 522), preferably using the prior fusion targets of step 516 in conjunction with the target motion model (and associated pattern of movement) of step 520. Steps 518-522 are preferably performed by the processor 220 of
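By way of illustration, a constant-velocity model is one simple target motion model that may be used to predict fusion targets into the future; the state layout, process-noise form, and function name below are assumptions rather than the disclosed model.

```python
import numpy as np

def predict_fusion_target(x, P, dt, q=0.5):
    """Predict a fusion target dt seconds into the future (step 522).

    Uses an assumed constant-velocity target motion model with state
    x = [px, py, vx, vy] and covariance P; q scales the process noise.
    """
    F = np.array([[1.0, 0.0, dt,  0.0],
                  [0.0, 1.0, 0.0, dt ],
                  [0.0, 0.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0, 1.0]])
    G = np.array([[0.5 * dt**2, 0.0],
                  [0.0, 0.5 * dt**2],
                  [dt, 0.0],
                  [0.0, dt]])
    Q = q * (G @ G.T)                      # simple acceleration-driven process noise
    return F @ x, F @ P @ F.T + Q
```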
In addition, additional measurements are obtained (preferably, from both types of sensors), and additional corresponding targets (preferably, also for both types of sensors) are identified in new iterations of steps 502-508. Such new iterations occur at a time subsequent to the time at which the previous iterations of steps 502-508 were performed.
The corresponding targets of the new iterations of steps 506 and 508 are compared with the predicted fusion targets of step 522 in steps 523 and 524, respectively, preferably by the processor 220 of
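The comparison of steps 523 and 524 may, for example, take the form of a nearest-neighbor match between the new targets and the predicted fusion targets within a calibratable threshold; the sketch below is illustrative only, and the function name and threshold are assumptions.

```python
import numpy as np

def match_to_predictions(new_targets, predicted_targets, max_distance):
    """Compare newly identified sensor targets with predicted fusion targets (steps 523, 524).

    A simple nearest-neighbor comparison: each new target is paired with the
    closest predicted fusion target if the separation is within a calibratable
    threshold; unmatched targets may seed new fusion tracks.
    """
    matches, unmatched = [], []
    for i, t in enumerate(new_targets):
        dists = [np.linalg.norm(np.asarray(t, dtype=float) - np.asarray(p, dtype=float))
                 for p in predicted_targets]
        if dists and min(dists) <= max_distance:
            matches.append((i, int(np.argmin(dists))))
        else:
            unmatched.append(i)
    return matches, unmatched
```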
The process then proceeds with a new iteration of steps 514-522, in which the fusion system is updated accordingly and used to generate updated fusion targets, an updated target motion model, updated predicted fusion targets, and so on, in a continuous loop. The process continues to repeat in this manner throughout the ignition cycle of the vehicle. Accordingly, with each iteration, the fusion system is updated accordingly, to provide for potentially greater accuracy and precision in tracking objects. Also, similar to the discussion above, it will be appreciated that while two types of sensors are mentioned in connection with
Accordingly, methods and systems are provided for associating measurements pertaining to objects that may be detected proximate a vehicle. The disclosed methods and systems provide for tracking measurements pertaining to an object along multiple tracking gates. The disclosed methods and systems thus provide for potentially improved tracking of objects that may be proximate the vehicle.
It will be appreciated that the vehicle of
While at least one exemplary embodiment has been presented in the foregoing detailed description, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the invention in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing the exemplary embodiment or exemplary embodiments. It should be understood that various changes can be made in the function and arrangement of elements without departing from the scope of the invention as set forth in the appended claims and the legal equivalents thereof.