This application is a U.S. National-Stage entry under 35 U.S.C. § 371 based on International Application No. PCT/US14/45470, filed Jul. 3, 2014, which was published under PCT Article 21(2) and which is incorporated herein in its entirety.
This application pertains to vehicles, and more particularly relates to methods and systems for vehicle radar systems.
Certain vehicles today utilize radar systems. For example, certain vehicles utilize radar systems to detect other vehicles, pedestrians, or other objects on a road on which the vehicle is travelling. Radar systems may be used in this manner, for example, in implementing automatic braking systems, adaptive cruise control, and avoidance features, among other vehicle features. While radar systems are generally useful for such vehicle features, in certain situations existing radar systems may have certain limitations.
Accordingly, it is desirable to provide techniques for radar system performance in vehicles, for example pertaining to the detection and/or tracking of objects on the road on which the vehicle is travelling. It is also desirable to provide methods, systems, and vehicles utilizing such techniques. Furthermore, other desirable features and characteristics of the present invention will be apparent from the subsequent detailed description and the appended claims, taken in conjunction with the accompanying drawings and the foregoing technical field and background.
In accordance with an exemplary embodiment, a method is provided for selectively analyzing radar signals of a radar system of a vehicle. The method comprises receiving a plurality of radar signals via a radar system of a vehicle, obtaining data from one or more sensors of the vehicle having a modality that is different from the radar system, and selectively analyzing the plurality of radar signals differently based upon the data.
In accordance with an exemplary embodiment, a radar control system is provided. The radar control system comprises a receiver, an interface, and a processor. The receiver is configured to receive a plurality of radar signals of a radar system of a vehicle. The interface is configured to obtain data from one or more sensors of the vehicle having a modality that is different from the radar system. The processor is coupled to the receiver and to the interface, and is configured to selectively analyze the plurality of radar signals differently based upon the data.
In accordance with an exemplary embodiment, a vehicle is provided. The vehicle includes a radar system, one or more sensors, an interface, and a processor. The radar system includes a receiver that is configured to receive a plurality of radar signals. The one or more sensors have a modality that is different from the radar system. The interface is configured to obtain data from the one or more sensors. The processor is coupled to the receiver and to the interface, and is configured to selectively analyze the plurality of radar signals differently based upon the data from the one or more sensors.
The present disclosure will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and wherein:

FIG. 1 is a functional block diagram of a vehicle that includes a control system having a radar system and one or more additional sensors, in accordance with an exemplary embodiment;

FIG. 2 is a functional block diagram of the control system of the vehicle of FIG. 1, including a sensor array and a controller, in accordance with an exemplary embodiment;

FIG. 3 is a functional block diagram of the radar system of FIGS. 1 and 2, in accordance with an exemplary embodiment;

FIG. 4 is a flowchart of a method for selectively analyzing radar signals of a radar system of a vehicle, which can be used in connection with the vehicle of FIG. 1, the control system of FIGS. 1 and 2, and the radar system of FIGS. 1-3, in accordance with an exemplary embodiment; and

FIG. 5 is a diagram of a road on which the vehicle of FIG. 1 is travelling, including a target vehicle and a guard rail, in accordance with an exemplary embodiment.
The following detailed description is merely exemplary in nature and is not intended to limit the disclosure or the application and uses thereof. Furthermore, there is no intention to be bound by any theory presented in the preceding background or the following detailed description.
With reference again to FIG. 1, the vehicle 100 includes a chassis 112, a body 114, wheels 116, an electronic control system 118, an actuator assembly 120, a steering system 150, a braking system 160, and the control system 102.
In the exemplary embodiment illustrated in FIG. 1, the vehicle 100 is an automobile.
Still referring to FIG. 1, the actuator assembly 120 is mounted on the chassis 112 and drives the wheels 116 via the drive shafts 134.
The steering system 150 is mounted on the chassis 112, and controls steering of the wheels 116. The steering system 150 includes a steering wheel and a steering column (not depicted). The steering wheel receives inputs from a driver of the vehicle. Based on the inputs from the driver, the steering column produces desired steering angles for the wheels 116 via the drive shafts 134.
The braking system 160 is mounted on the chassis 112, and provides braking for the vehicle 100. The braking system 160 receives inputs from the driver via a brake pedal (not depicted), and provides appropriate braking via brake units (also not depicted). The driver also provides inputs via an accelerator pedal (not depicted) as to a desired speed or acceleration of the vehicle, as well as various other inputs for various vehicle devices and/or systems, such as one or more vehicle radios, other entertainment systems, environmental control systems, lighting units, navigation systems, and the like (also not depicted).
Also as depicted in FIG. 1, the vehicle 100 includes the control system 102, which comprises the radar system 103 and the one or more additional sensors 104.
The control system 102 is mounted on the chassis 112. As mentioned above, the control system 102 provides selective analysis of received radar signals of the radar system 103 based upon additional information provided by one or more additional sensors 104 having a different modality from the radar system 103. Specifically, in one embodiment, the control system 102 uses the additional information to help prevent incorrect “ghost” interpretations from the received radar signals, for example by selectively ignoring radar signals that are determined to be returned from objects that are not on the same road as the vehicle 100. The control system 102, in one example, provides these functions in accordance with the process 400 described further below in connection with FIG. 4.
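Although the disclosure contains no source code, the selective-ignoring behavior described above can be illustrated with a short sketch. Everything below (the class names, the angle-matching rule, and the two-degree tolerance) is a hypothetical assumption for illustration, not the disclosed implementation:

```python
# Minimal sketch only; not the patented implementation. It illustrates
# discarding radar returns attributable to objects (e.g., a guard rail)
# that other-modality sensors indicate do not require tracking.
from dataclasses import dataclass

@dataclass
class RadarDetection:
    angle_deg: float   # angle of arrival relative to the host heading (assumed)
    range_m: float

@dataclass
class PerceivedObject:
    angle_deg: float          # associated angle from camera/LIDAR data
    requires_tracking: bool   # False for roadside objects such as guard rails

def fuse_and_filter(detections, objects, tol_deg=2.0):
    """Keep radar detections whose angles match objects requiring tracking."""
    kept = []
    for det in detections:
        # Find the perceived object closest in angle to this radar return.
        match = min(objects,
                    key=lambda o: abs(o.angle_deg - det.angle_deg),
                    default=None)
        if (match is not None
                and abs(match.angle_deg - det.angle_deg) <= tol_deg
                and match.requires_tracking):
            kept.append(det)   # use this return for tracking
        # otherwise the return is treated as a potential "ghost" and ignored
    return kept
```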
While the control system 102, the radar system 103, and the additional sensors 104 are depicted as being part of the same system, it will be appreciated that in certain embodiments these features may comprise two or more systems. In addition, in various embodiments the control system 102 may comprise all or part of, and/or may be coupled to, various other vehicle devices and systems, such as, among others, the actuator assembly 120, and/or the electronic control system 118.
With reference to FIG. 2, the control system 102 includes a sensor array 202 and a controller 204.
The sensor array 202 includes the radar system 103 and the one or more additional sensors 104 of FIG. 1.
Also as depicted in FIG. 2, the additional sensors 104 include a camera 210 and a light detection and ranging (LIDAR) system 212.
Also as depicted in FIG. 2, the radar system 103 includes a processing unit 226.
The processing unit 226 may be a microprocessor, microcontroller, application specific integrated circuit (ASIC) or other suitable device as realized by those skilled in the art. Of course, the radar system 103 may include multiple processing units 226, working together or separately, as is also realized by those skilled in the art. In certain embodiments, the processing unit 226 also includes or is associated with a memory (not depicted) of the radar system 103 for storing values for use in the process 400 of FIG. 4.
As depicted in FIG. 2, the controller 204 is coupled to the radar system 103 and to the additional sensors 104 of the sensor array 202.
As depicted in FIG. 2, the controller 204 comprises a computer system.
In the depicted embodiment, the computer system of the controller 204 includes a processor 230, a memory 232, an interface 234, a storage device 236, and a bus 238. The processor 230 performs the computation and control functions of the controller 204, and may comprise any type of processor or multiple processors, single integrated circuits such as a microprocessor, or any suitable number of integrated circuit devices and/or circuit boards working in cooperation to accomplish the functions of a processing unit. During operation, the processor 230 executes one or more programs 240 contained within the memory 232 and, as such, controls the general operation of the controller 204 and the computer system of the controller 204, generally in executing the steps of the processes described herein, such as the steps of the method 400 described further below in connection with FIG. 4.
The memory 232 can be any type of suitable memory. This would include the various types of dynamic random access memory (DRAM) such as SDRAM, the various types of static RAM (SRAM), and the various types of non-volatile memory (PROM, EPROM, and flash). In certain examples, the memory 232 is located on and/or co-located on the same computer chip as the processor 230. In the depicted embodiment, the memory 232 stores the above-referenced program 240 along with one or more stored values 242 for use in making the determinations.
The bus 238 serves to transmit programs, data, status and other information or signals between the various components of the computer system of the controller 204. The interface 234 allows communication to the computer system of the controller 204, for example from a system driver and/or another computer system, and can be implemented using any suitable method and apparatus. In one embodiment, the interface 234 obtains the additional data from the additional sensors 104 (e.g., camera data from the camera 210 and LIDAR data from the LIDAR system 212) for use in selectively analyzing the received radar signals of the radar system 103. The interface 234 can include one or more network interfaces to communicate with other systems or components. The interface 234 may also include one or more network interfaces to communicate with technicians, and/or one or more storage interfaces to connect to storage apparatuses, such as the storage device 236.
The storage device 236 can be any suitable type of storage apparatus, including direct access storage devices such as hard disk drives, flash systems, floppy disk drives and optical disk drives. In one exemplary embodiment, the storage device 236 comprises a program product from which memory 232 can receive a program 240 that executes one or more embodiments of one or more processes of the present disclosure, such as the steps of the method 400 (and any sub-processes thereof) described further below in connection with FIG. 4.
The bus 238 can be any suitable physical or logical means of connecting computer systems and components. This includes, but is not limited to, direct hard-wired connections, fiber optics, infrared and wireless bus technologies. During operation, the program 240 is stored in the memory 232 and executed by the processor 230.
It will be appreciated that while this exemplary embodiment is described in the context of a fully functioning computer system, those skilled in the art will recognize that the mechanisms of the present disclosure are capable of being distributed as a program product with one or more types of non-transitory computer-readable signal bearing media used to store the program and the instructions thereof and carry out the distribution thereof, such as a non-transitory computer readable medium bearing the program and containing computer instructions stored therein for causing a computer processor (such as the processor 230) to perform and execute the program. Such a program product may take a variety of forms, and the present disclosure applies equally regardless of the particular type of computer-readable signal bearing media used to carry out the distribution. Examples of signal bearing media include: recordable media such as floppy disks, hard drives, memory cards and optical disks, and transmission media such as digital and analog communication links. It will similarly be appreciated that the computer system of the controller 204 may also otherwise differ from the embodiment depicted in FIG. 2, for example in that the computer system of the controller 204 may be coupled to or may otherwise utilize one or more remote computer systems and/or other control systems.
As depicted in FIG. 3, the radar system 103 includes a signal generator 302, a filter 304, an amplifier 306, and an antenna 308 on the transmit side, along with an antenna 310, an amplifier 312, a mixer 314, and a sampler/digitizer 316 on the receive side, as well as the processing unit 226.
The radar system 103 generates the transmitted radar signals via the signal generator 302. The transmitted radar signals are filtered via the filter 304, amplified via the amplifier 306, and transmitted from the radar system 103 (and from the vehicle 100, also referred to herein as the “host vehicle”) via the antenna 308. The transmitted radar signals subsequently contact other vehicles and/or other objects on or alongside the road on which the host vehicle is travelling. After contacting the other vehicles and/or other objects, the radar signals are reflected, and travel from the other vehicles and/or other objects in various directions, including some signals returning toward the host vehicle. The radar signals returning to the host vehicle (also referred to herein as received radar signals) are received by the antenna 310, amplified by the amplifier 312, mixed by the mixer 314, and digitized by the sampler/digitizer 316. The received radar signals are then provided to the processing unit 226 for processing.
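The disclosure does not specify a modulation scheme for the radar signals. Purely to illustrate the mixer and sampler/digitizer chain described above, the following sketch assumes a frequency-modulated continuous-wave (FMCW) radar, in which the digitized mixer output has a beat frequency proportional to target range; all parameter values are invented for the example:

```python
# Illustration only: assumes an FMCW radar, which the disclosure does not
# require. The mixer output ("beat" signal) is simulated for one target and
# the target range is recovered from the dominant beat frequency.
import numpy as np

c = 3e8        # speed of light, m/s
B = 150e6      # chirp bandwidth, Hz (assumed)
T = 1e-3       # chirp duration, s (assumed)
fs = 1e6       # sampler/digitizer rate, Hz (assumed)

t = np.arange(0, T, 1 / fs)
slope = B / T                     # chirp slope, Hz/s
true_range = 40.0                 # simulated target 40 m away
tau = 2 * true_range / c          # round-trip delay, s
beat = np.cos(2 * np.pi * slope * tau * t)   # idealized mixer output

# The processing unit recovers range from the beat-frequency peak via an FFT.
spectrum = np.abs(np.fft.rfft(beat))
freqs = np.fft.rfftfreq(len(t), 1 / fs)
f_beat = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
print(f"estimated range: {f_beat * c / (2 * slope):.1f} m")   # ~40.0 m
```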
In addition to the received radar signals, the processing unit 226 also obtains additional data from the additional sensors 104 (namely, the camera 210 and the LIDAR system 212 of FIG. 2) for use in selectively analyzing the received radar signals.
As depicted in FIG. 4, the method 400 begins with the transmission of radar signals by the radar system 103 of the vehicle 100.
After the radar signals are reflected from objects on or around the road (e.g., the target vehicle 502 and the guard rail 504 of FIG. 5), return radar signals are received by the radar system 103 at 404 of FIG. 4.
Additional data is obtained from one or more additional sensors at 405 of FIG. 4. In one embodiment, object-level camera data is obtained from the camera 210 of FIG. 2 at 406, and object-level LIDAR data is obtained from the LIDAR system 212 of FIG. 2 at 408.
Initial determinations are made as to objects on or around the road at 412. In one embodiment, initial determinations are made as to the type of objects on or along the road, and/or the location, placement, and/or dimensions thereof, using the received radar signals and the additional information. In one embodiment, geographic coordinates and physical measurements (e.g., length, width, height) of the objects (e.g., the target vehicle 502 and the guard rail 504) are determined using the received radar signals of 404 and the additional data of 405 (e.g., using the object-level camera data of 406 and the object-level LIDAR data of 408), and are then used to determine the types of the objects that have been detected. In one embodiment, these perception tasks may be referred to as object segmentation and classification. While there may be many different approaches and algorithms to perform these tasks, in one embodiment the segmentation is based on diverse clustering methods, while classification is based on building diverse classifiers such as support vector machines (SVM), relevance vector machines (RVM), AdaBoost, and neural networks (e.g., multi-layer perceptron (MLP), radial basis function (RBF), and so on). In one embodiment, more complex methods, such as Markov random field (MRF)/conditional random field (CRF) techniques, may be used to solve these two problems simultaneously.
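As one hedged illustration of the segmentation-then-classification idea named above, the sketch below clusters simulated LIDAR points with DBSCAN and classifies each cluster with a support vector machine, one of the classifier families the text mentions. The features, training examples, and parameters are invented for the example:

```python
# Sketch only: clustering (segmentation) followed by an SVM (classification).
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Simulated LIDAR returns (x, y in meters): a compact car-like cluster and
# an elongated guard-rail-like cluster.
car = rng.normal([20.0, 0.0], [0.8, 0.6], size=(60, 2))
rail = np.column_stack([np.linspace(5, 45, 80), rng.normal(4.0, 0.1, 80)])
points = np.vstack([car, rail])

# Segmentation: group the points into object clusters.
labels = DBSCAN(eps=1.5, min_samples=5).fit_predict(points)

def describe(cluster):
    # Per-cluster features: extent in x, extent in y, and point count.
    ext = cluster.max(axis=0) - cluster.min(axis=0)
    return [ext[0], ext[1], len(cluster)]

feats = [describe(points[labels == k]) for k in sorted(set(labels)) if k != -1]

# Classification: a classifier trained offline would assign object types;
# here two toy training examples are used purely so the code runs end to end.
clf = SVC().fit([[4.0, 1.8, 60], [40.0, 0.4, 80]], ["vehicle", "guard_rail"])
print([clf.predict([f])[0] for f in feats])   # e.g., ['vehicle', 'guard_rail']
```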
For example, with respect to the objects from which the received radar signals were reflected en route to the host vehicle 100, the camera and LIDAR signals are used to determine whether such an object is (i) a type of object that would need to be tracked by the radar system 103 (e.g., a target vehicle, a pedestrian, or an object on the road); or (ii) a type of object that could interfere with the tracking of such a tracked object (e.g., a guard rail alongside the road). In one embodiment, an object is characterized as one requiring tracking if the object is disposed on the road in a path in which the host vehicle is expected to travel (e.g., the target vehicle 502 on the road 500 in FIG. 5), and is characterized as one that could interfere with tracking if the object is disposed alongside the road (e.g., the guard rail 504 alongside the road 500 in FIG. 5).
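A minimal sketch of one way such a characterization could be realized follows, assuming a straight predicted path and a lane half-width of 1.85 m; neither assumption comes from the disclosure:

```python
# Sketch only: characterize an object as requiring tracking when it lies
# within the host vehicle's assumed travel corridor.
def requires_tracking(obj_x_m: float, obj_y_m: float,
                      lane_half_width_m: float = 1.85) -> bool:
    """obj_x_m: longitudinal distance ahead of the host vehicle (m).
    obj_y_m: lateral offset from the host's centerline (m)."""
    return obj_x_m > 0 and abs(obj_y_m) <= lane_half_width_m

print(requires_tracking(40.0, 0.3))   # True: target vehicle ahead in lane
print(requires_tracking(25.0, 4.0))   # False: e.g., a guard rail to the side
```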
For each of the objects identified at 412, associated angles are also determined for such objects at 414. The associated angles generally pertain to an angle made between the host vehicle 100 and the object with respect to a direction of travel for the host vehicle 100. The associated angles are determined for each of the detected objects, including the objects that need to be tracked by the radar system 103 and the objects that could interfere with the tracking of such tracked objects by the radar system 103. The associated angles from 414 are determined by a processor, such as the processing unit 226 and/or the processor 230 of FIG. 2.
By way of additional explanation, LIDAR data comprises three-dimensional (3D) data, so that an angle between any detected object and a LIDAR sensor is a result of direct measurement in one embodiment. Also in one embodiment, in order to obtain associated angles (between the host vehicle/radar sensor and objects), simple linear transformations are performed based on calibration parameters. In one embodiment, calibration parameters may be calculated via an offline calibration procedure (during sensor installation in the vehicle, before real driving). In one embodiment, similar information can be extracted based on a single mono camera sensor only. In one embodiment, both camera and radar data are projected on the ground (homographic transformations). In one embodiment, such homographic mapping is built during the calibration procedure and assumes that the ground/road is flat. In another embodiment, three-dimensional (3D) data can be created using structure from motion (SFM) algorithms for a single monocular (mono) camera or three-dimensional (3D) stereo reconstruction for multiple cameras.
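The angle computation described above might be sketched as follows, with the extrinsic calibration expressed as a rotation matrix and a translation vector; the numerical values are illustrative assumptions only:

```python
# Sketch only: map a LIDAR point into the radar/host frame with offline
# calibration parameters, then compute the associated angle with atan2.
import numpy as np

R = np.eye(3)                    # rotation, LIDAR frame to radar frame (assumed aligned)
t = np.array([1.2, 0.0, -0.3])   # translation between the sensors, m (assumed)

def associated_angle_deg(p_lidar: np.ndarray) -> float:
    """Angle between the host's direction of travel (x-axis) and the object."""
    p_radar = R @ p_lidar + t    # the simple linear transformation noted above
    return float(np.degrees(np.arctan2(p_radar[1], p_radar[0])))

# Object detected by LIDAR 30 m ahead and 2 m to the left of the sensor:
print(f"{associated_angle_deg(np.array([30.0, 2.0, 0.5])):.1f} deg")  # ~3.7
```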
Each received radar signal is then selectively analyzed, beginning with a determination of an angle for each of the radar signals received by the radar system at 416 of FIG. 4.
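The disclosure does not state how the angle of arrival of a received radar signal is measured. One common textbook method, shown below purely as an illustration, derives it from the phase difference between two receive antennas spaced half a wavelength apart:

```python
# Sketch only: phase-comparison angle-of-arrival estimation with two
# receive antennas; the 77 GHz wavelength is an assumed value.
import numpy as np

wavelength = 0.0039     # ~77 GHz automotive radar, m (assumed)
d = wavelength / 2      # antenna spacing, m

def angle_of_arrival_deg(phase_diff_rad: float) -> float:
    """sin(theta) = dphi * lambda / (2 * pi * d)."""
    s = phase_diff_rad * wavelength / (2 * np.pi * d)
    return float(np.degrees(np.arcsin(np.clip(s, -1.0, 1.0))))

# A target at 10 degrees induces a phase difference of 2*pi*d*sin(10 deg)/lambda.
dphi = 2 * np.pi * d * np.sin(np.radians(10.0)) / wavelength
print(f"{angle_of_arrival_deg(dphi):.1f} deg")   # 10.0
```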
At 418 of FIG. 4, a determination is made as to whether the received radar signal being analyzed corresponds to an object that requires tracking, for example by comparing the angle determined at 416 with the associated angles of the detected objects from 414. This determination is performed by a processor, such as the processing unit 226 and/or the processor 230 of FIG. 2.
If it is determined at 418 of FIG. 4 that the received radar signal being analyzed corresponds to an object that does not require tracking (e.g., the guard rail 504 of FIG. 5), then the radar signal being analyzed is disregarded, or filtered out, in the subsequent identification and tracking of objects on the road.
Conversely, if it is determined at 418 that the received radar signal being analyzed corresponds to an object that requires tracking (e.g., an object that is on the road in front of the host vehicle 100 in its present direction of travel), then the radar signal being analyzed is utilized for the purposes of identifying and tracking the object on the road at 422. For example, in one embodiment, if the received radar signal being analyzed is determined to be originating from the target vehicle 502 rather than the guard rail 504 of FIG. 5, then the received radar signal is used in identifying and tracking the target vehicle 502.
In one example, the received radar signal is used, along with any previously-processed signals pertaining to the object (e.g., the target vehicle 502 of FIG. 5), to track the position and movement of the object at 424.
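The tracking computation at 424 is not spelled out in the text; as one conventional possibility, a simple alpha-beta filter can track a target's range and range rate from successive radar measurements. The gains and measurement cadence below are assumed:

```python
# Sketch only: alpha-beta tracking of range and range rate. The gains
# (alpha, beta) and 50 ms update period are illustrative assumptions.
def alpha_beta_update(x, v, z, dt, alpha=0.5, beta=0.2):
    """One update: predict with constant velocity, correct with measurement z."""
    x_pred = x + v * dt          # predicted range
    r = z - x_pred               # measurement residual
    return x_pred + alpha * r, v + (beta / dt) * r

# Track a target closing from 50 m at 2 m/s, measured every 50 ms.
x, v = 50.0, 0.0
for k in range(1, 11):
    z = 50.0 - 2.0 * 0.05 * k    # noiseless simulated range measurements
    x, v = alpha_beta_update(x, v, z, dt=0.05)
print(f"range {x:.1f} m, range rate {v:.1f} m/s")   # ~49.0 m, ~-2.1 m/s
```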
At 426, a determination is made as to whether there are any additional received radar signals (or data points) to be analyzed. This determination is performed by a processor, such as the processing unit 226 and/or the processor 230 of FIG. 2. If there are additional received radar signals to be analyzed, then the process returns to 416, and the selective analysis continues with the next received radar signal.
Once each of the received radar signals (or data points) is analyzed, the tracking results of 424 are implemented at 428. Specifically, one or more vehicle actions and/or alerts may be initiated as appropriate based upon the tracking results from 424. In one example, if a distance between the host vehicle 100 and the target vehicle 502 is less than a predetermined threshold (or an estimated time of contact between the host vehicle 100 and the target vehicle 502 under their current respective trajectories is less than a predetermined threshold), then an alert (e.g., a visual or audio alert to the driver) may be provided and/or an automatic vehicle control action (e.g., automatic braking and/or automatic steering) may be initiated, for example by the processor outputting one or more control signals for the steering system 150 and/or the braking system 160 of FIG. 1.
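A sketch of such threshold logic follows; the distance and time-to-contact thresholds are invented for the example, as the text does not give numerical values:

```python
# Sketch only: choose an action from range and closing speed. Threshold
# values are assumptions, not values from the disclosure.
def choose_action(distance_m: float, closing_speed_mps: float,
                  d_min_m: float = 15.0, ttc_min_s: float = 2.0) -> str:
    """Return an action based on range and estimated time to contact."""
    ttc = distance_m / closing_speed_mps if closing_speed_mps > 0 else float("inf")
    if distance_m < d_min_m or ttc < ttc_min_s:
        return "automatic_braking"   # e.g., control signal to braking system 160
    if ttc < 2 * ttc_min_s:
        return "driver_alert"        # e.g., visual or audio alert to the driver
    return "no_action"

print(choose_action(distance_m=30.0, closing_speed_mps=10.0))  # driver_alert
```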
Accordingly, the method 400 provides for selective analysis of radar system data utilizing additional object-level data obtained from one or more additional sensors having a different modality from the radar system. Specifically, in accordance with one embodiment, the method 400 allows for potentially more accurate and/or precise tracking of an object (e.g., the target vehicle 502 of FIG. 5) on the road ahead of the host vehicle 100, by disregarding received radar signals that are reflected from objects alongside the road (e.g., the guard rail 504 of FIG. 5) and that could otherwise result in incorrect “ghost” interpretations.
It will be appreciated that the disclosed methods, systems, and vehicles may vary from those depicted in the Figures and described herein. For example, the vehicle 100, the control system 102, the radar system 103, the additional sensors 104, the sensor array 202, the controller 204, and/or various components thereof may vary from that depicted in FIGS. 1-3 and described in connection therewith. It will similarly be appreciated that certain steps of the method 400 may vary from those depicted in FIG. 4 and/or described above, and/or may occur simultaneously or in a different order than presented.
While at least one exemplary embodiment has been presented in the foregoing detailed description, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the disclosure in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing the exemplary embodiment or exemplary embodiments. It should be understood that various changes can be made in the function and arrangement of elements without departing from the scope of the appended claims and the legal equivalents thereof.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US2014/045470 | 7/3/2014 | WO | 00

Publishing Document | Publishing Date | Country | Kind
---|---|---|---
WO2016/003473 | 1/7/2016 | WO | A
Number | Name | Date | Kind
---|---|---|---
5479173 | Yoshioka | Dec 1995 | A
6518916 | Ashihara | Feb 2003 | B1
6583403 | Koike | Jun 2003 | B1
7376247 | Ohta | May 2008 | B2
8027029 | Lu et al. | Sep 2011 | B2
8350638 | White et al. | Jan 2013 | B2
8390509 | Asanuma | Mar 2013 | B2
8466827 | Nanami | Jun 2013 | B2
8542106 | Hilsebecher | Sep 2013 | B2
8610620 | Katoh | Dec 2013 | B2
8629977 | Phillips et al. | Jan 2014 | B2
8686906 | White et al. | Apr 2014 | B2
8704719 | Song et al. | Apr 2014 | B2
8907839 | Oh | Dec 2014 | B2
8970397 | Nitanda | Mar 2015 | B2
9077072 | Song et al. | Jul 2015 | B2
20040178945 | Buchanan | Sep 2004 | A1
20080019567 | Takagi et al. | Jan 2008 | A1
20080077015 | Boric-Lubecke et al. | Mar 2008 | A1
20080260019 | Aoyagi | Oct 2008 | A1
20110140949 | Lee | Jun 2011 | A1
20120140061 | Zeng | Jun 2012 | A1
20120330528 | Schwindt | Dec 2012 | A1
20130314272 | Gross | Nov 2013 | A1
20140035774 | Khlifi | Feb 2014 | A1
Number | Date | Country
---|---|---
101581780 | Nov 2009 | CN
102879777 | Jan 2013 | CN
102944876 | Feb 2013 | CN
4442189 | May 1996 | DE
10326648 | Jan 2005 | DE
102009054776 | Aug 2010 | DE
102010024328 | Dec 2011 | DE
Entry
---
State Intellectual Property Office of the People's Republic of China, Office Action in Chinese Patent Application No. 201510383757.8 dated Mar. 20, 2017.
Amit Kumar Mishra, et al., "Information sensing for radar target classification using compressive sensing," IRS 2012, 19th International Radar Symposium, May 23-25, Warsaw, Poland, pp. 326-330.
Ming-Hua Xue, et al., "Research on Three-Dimensional Imaging Algorithm of Radar Target," Radar Science and Technology, vol. 11, No. 1, Feb. 2013, pp. 65-70.
International Searching Authority, International Search Report for PCT/US2014/45475, dated Dec. 12, 2014.
International Searching Authority, International Search Report for PCT/US2014/45471, dated Nov. 7, 2014.
International Searching Authority, International Search Report for PCT/US2014/45470, dated Nov. 7, 2014.
European Patent Office, Extended Search Report issued in European Patent Application No. 14896809.2 dated Jan. 3, 2018.
European Patent Office, Extended Search Report issued in European Patent Application No. 14896397.8 dated Dec. 15, 2017.
European Patent Office, Extended Search Report issued in European Patent Application No. 14896673.2 dated Dec. 15, 2017.
European Patent Office, Extended Search Report issued in European Patent Application No. 14896809.2 dated Apr. 19, 2018.
Translation of DE10326648 (Year: 2005).
Number | Date | Country | Kind
---|---|---|---
20170261602 | Sep 2017 | US | A1