Vehicle sensor with integrated radar and image sensors

Information

  • Patent Grant
  • Patent Number
    10,852,418
  • Date Filed
    Thursday, August 24, 2017
  • Date Issued
    Tuesday, December 1, 2020
Abstract
A sensing system of a vehicle includes a sensor module disposed at the vehicle. The sensor module includes first and second radar sensors and a camera. A field of sensing of the first radar sensor is encompassed by a portion of a field of view of the camera and a field of sensing of the second radar sensor is encompassed by another portion of the field of view of the camera. Outputs of the radar sensors and the camera are communicated to a control. The control, responsive to processing of the outputs of the radar sensors, detects the presence of an object exterior the vehicle and within the field of sensing of at least one of the radar sensors. The control, responsive to detection of an object via processing of the outputs of the radar sensors, processes the output of the camera to classify the detected object.
Description
FIELD OF THE INVENTION

The present invention relates generally to a vehicle sensing system for a vehicle and, more particularly, to a vehicle sensing system that utilizes one or more sensors at a vehicle to provide a field of sensing at or around the vehicle.


BACKGROUND OF THE INVENTION

Use of imaging sensors or ultrasonic sensors or radar sensors in vehicle sensing systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 8,013,780 and 5,949,331 and/or U.S. publication No. US-2010-0245066 and/or International Publication No. WO 2011/090484, which are hereby incorporated herein by reference in their entireties.


SUMMARY OF THE INVENTION

The present invention provides a driver assistance system or sensing system for a vehicle that utilizes a sensor module disposed at the vehicle to sense a respective region exterior of the vehicle, with the sensor module comprising one or more radar sensors and at least one camera disposed in a common housing. A field of sensing of the radar sensor(s) is encompassed by a portion of a field of view of the at least one camera. The system includes a control, where outputs of the radar sensor(s) and the at least one camera are communicated to the control, and where the control, responsive to processing of the outputs of the radar sensor(s), detects the presence of one or more objects exterior the vehicle and within the field of sensing of at least one of the one or more radar sensors, and where the control, responsive to processing of the output of the at least one camera, classifies the detected object.


According to an aspect of the present invention, a sensing system of a vehicle includes a sensor module disposed at a vehicle. The sensor module includes first and second radar sensors and a camera. A field of sensing of the first radar sensor is encompassed by a portion of a field of view of the camera and a field of sensing of the second radar sensor is encompassed by another portion of the field of view of the camera. Outputs of the radar sensors and the camera are communicated to a control. The control, responsive to processing of the outputs of the radar sensors, detects the presence of an object exterior the vehicle and within the field of sensing of at least one of the radar sensors. The control, responsive to detection of an object via processing of the outputs of the radar sensors, processes the output of the camera to classify the detected object.


The sensor module may comprise a circuit board with the first and second radar sensors and the camera disposed at the circuit board, and with the camera disposed at the circuit board at a location between the first and second radar sensors. Physical orientation of the second radar sensor at the circuit board relative to the camera may be different from physical orientation of the first radar sensor at the circuit board relative to the camera. For example, one may be oriented for horizontal resolution and one may be oriented for vertical resolution (where they are oriented with their antenna arrays rotated about 90 degrees relative to one another).


The sensor module may be disposed at a rear of the vehicle such that the camera views rearward of the vehicle. During a reversing maneuver of the vehicle, a display device of the vehicle may display video images derived from the output of the camera.


These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a perspective view of a vehicle with a sensing system that incorporates a radar sensor in accordance with the present invention;



FIG. 2 is a schematic showing the regions encompassed by the fields of sensing of a pair of radar sensors and a camera;



FIG. 3 is a schematic showing the regions encompassed by the fields of sensing of a radar and a camera disposed in a common housing in accordance with the present invention;



FIG. 4 is a side elevation of a vehicle equipped with a sensing system of the present invention, showing the pitch boresight of the radar sensor;



FIG. 5 is a rear elevation of the vehicle equipped with the sensing system, showing the roll boresight of the radar sensor;



FIG. 6 is a top plan view of the vehicle equipped with the sensing system, showing the yaw boresight of the radar sensor;



FIG. 7 is a schematic of an integrated camera and radar device, showing the radar sensors at 90 degrees relative to one another; and



FIG. 8 is a schematic of an integrated camera and radar device, showing the radar sensors at 70 degrees relative to one another.





DESCRIPTION OF THE PREFERRED EMBODIMENTS

A vehicle sensing system, such as a driver or driving assist system, object detection system, parking assist system and/or alert system, operates to capture sensing data exterior of the vehicle and may process the captured data to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a forward or rearward direction or to assist the driver in parking the vehicle in a parking space. The system includes a processor that is operable to receive sensing data from multiple sensors and to provide an output to a control that, responsive to the output, generates an alert or controls an accessory or system of the vehicle, or highlights or overlays an alert on a display screen (that may be displaying video images captured by a single rearward viewing camera or multiple cameras providing forward, side or 360 degree surround views of the area surrounding the vehicle during a reversing or low speed maneuver of the vehicle).


Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes a driver assistance system or sensing system 12 that includes at least one radar sensor unit, such as a forward facing radar sensor unit 14 (and the system may optionally include multiple exterior facing sensors, such as cameras or other sensors, such as a rearward facing sensor at the rear of the vehicle, and a sideward/rearward facing sensor at respective sides of the vehicle), which sense regions exterior of the vehicle. The sensing system 12 includes a control or electronic control unit (ECU) or processor that is operable to process data captured by the sensor or sensors and may detect objects or the like. The data transfer or signal communication from the sensor to the ECU may comprise any suitable data or communication link, such as a vehicle network bus or the like of the equipped vehicle.


Some automotive radars use MIMO (Multiple Input Multiple Output) techniques to create an effective virtual antenna aperture that is significantly larger than the real antenna aperture and delivers much better angular resolution than conventional radars, such as, for example, conventional scanning radars. MIMO techniques may be used to create virtual antenna apertures not only from linear arrays of real antennas but also from two dimensional arrays of real antennas. For example, the antenna array of a MIMO sensor may comprise two transmitting antennas and two or more receiving antennas, arranged either in a one-dimensional array or in a two-dimensional array. Thus, the virtual antenna array may consist of four virtual antennas (arranged as a 2×2 MIMO virtual array) or of any other combination of transmitting and receiving antennas, whose product is the number of virtual antennas, such as, for example, 16 virtual antennas (a 4×4 array) or more or fewer than 16 virtual antennas.
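
A minimal sketch of this relationship (the element spacings below are illustrative assumptions, not values from this disclosure): the number of virtual antennas is the product of the transmit and receive antenna counts, and each virtual element sits at the vector sum of one transmit and one receive element position, which is what enlarges the effective aperture.

```python
from itertools import product
from typing import List, Tuple

def virtual_array(tx_positions: List[Tuple[float, float]],
                  rx_positions: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Virtual element positions of a MIMO array: one element per transmit/receive pair."""
    return [(tx[0] + rx[0], tx[1] + rx[1])
            for tx, rx in product(tx_positions, rx_positions)]

# Illustrative 2-Tx / 2-Rx layout (positions in wavelengths): 2 x 2 = 4 virtual antennas.
tx = [(0.0, 0.0), (1.0, 0.0)]
rx = [(0.0, 0.0), (0.5, 0.0)]
elements = virtual_array(tx, rx)
print(len(elements), elements)   # 4 elements spanning a larger aperture than either real array
# A 4-Tx / 4-Rx sensor would give 16 virtual antennas in the same way.
```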


For vision systems having an exterior viewing camera, the challenge is that the OEM system level requirements may specify that objects need to be detected and classified between 0 m and 18 m with a resolution in the 10 cm range. In order to ‘see’ enough of the object in the close range (0 to 2 m), a camera lens with about a 185 degree opening angle is needed, producing a fisheye image. The large opening angle of the lens, in combination with an imager of limited resolution (such as around 1 MP, with the pixels spread over that wide angle to cover the desired field of view), results in very poor resolution at long range. For example, at 10 m range the pixel resolution is around 1.2 m, meaning that distance measurement with sufficient precision beyond 10 m is difficult to nearly impossible.
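
A minimal sketch under assumed parameters (an equidistant fisheye projection, roughly 800 image rows spanning the 185 degree field, and a camera mounted about 0.6 m above the ground, none of which are specified here) of why camera-only distance measurement degrades with range: the road distance covered by a single pixel row grows roughly with the square of the range, so a one-pixel error near the vehicle is centimeters but becomes on the order of a meter at and beyond 10 m.

```python
import math

# Assumed example values (not from the patent):
FOV_DEG    = 185.0   # lens opening angle
IMAGE_ROWS = 800     # rows of a ~1 MP imager covering that angle
CAM_HEIGHT = 0.6     # camera mounting height in metres

RAD_PER_ROW = math.radians(FOV_DEG) / IMAGE_ROWS   # angular span of one pixel row

def range_step_per_row(distance_m: float) -> float:
    """Road distance covered by one pixel row at the given range (ground-plane model)."""
    depression = math.atan2(CAM_HEIGHT, distance_m)           # angle below the horizon
    nearer = CAM_HEIGHT / math.tan(depression + RAD_PER_ROW)  # range one row lower in the image
    return distance_m - nearer

for d in (2.0, 5.0, 10.0, 15.0):
    print(f"{d:5.1f} m -> ~{range_step_per_row(d):.2f} m of road per pixel row")
# A few centimetres per row near the vehicle, but on the order of a metre per row
# at and beyond 10 m, so camera-only distance measurement becomes imprecise.
```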


For a radar sensing system with a 16 virtual receiver configuration, the angular resolution is approximately 10 degrees. The width of a slice of the radar's field of view is approximated by (angular resolution×range)/57, the factor 57 converting degrees to radians. Thus, at one meter, the slice is about 17 cm wide, and the spot size increases as the range is extended.


If a radar sensor with horizontal slices and another radar sensor with vertical slices are utilized to cover an FOV, the intersection of their slices (one from each sensor) yields an approximately 17 cm by 17 cm spot in two dimensions, providing fine resolution over the common FOV. The BSD (blind spot detection) corner sensors allow the spot to be extended to a cube for three dimensional (3D) and terrain monitoring.
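
For reference, a short numeric sketch of the approximation used above: the factor 57 converts degrees to radians, the slice widens linearly with range, and intersecting one horizontal and one vertical slice bounds a roughly square cell over the common FOV.

```python
def slice_width_m(angular_resolution_deg: float, range_m: float) -> float:
    """Approximate width of a radar field-of-view slice: (angular resolution x range) / 57."""
    return angular_resolution_deg * range_m / 57.0

ANG_RES_DEG = 10.0   # ~10 degree angular resolution of a 16 virtual receiver configuration
for r in (1.0, 5.0, 10.0, 30.0):
    print(f"{r:5.1f} m -> slice width ~{slice_width_m(ANG_RES_DEG, r) * 100:.1f} cm")
# At 1 m the slice is ~17.5 cm wide (the ~17 cm spot noted above) and widens with range.

# Crossing a horizontally resolving slice with a vertically resolving slice bounds the
# detection in both axes, giving a roughly square cell at a given range.
r = 1.0
cell_cm = slice_width_m(ANG_RES_DEG, r) * 100
print(f"2-D cell at {r:.0f} m: ~{cell_cm:.1f} cm x {cell_cm:.1f} cm")
```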


Corner radar sensors may be used for detection and cameras may be used for classification. For example, radar sensors mounted at the corners of the vehicle are used for object detection and the rearward viewing camera may be used for classification (and optionally for display of video images derived from and representative of image data captured by the rearward viewing camera). The radar sensors can be used for measurement of object height and object location. Higher accuracy is achieved in the FOV where the radar sensors overlap. However, a radar blind spot may occur at the center area of the vehicle (see FIG. 2).
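
A hypothetical sketch of this division of labor (the data types, pinhole intrinsics and helper names are illustrative assumptions, not the patent's implementation): radar detections determine where in the camera image the classifier is run, so the vision processing only classifies what the radar has already found.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class RadarDetection:
    x_m: float            # longitudinal distance from the sensor
    y_m: float            # lateral offset
    height_m: float       # object height estimated by the radar
    rel_speed_mps: float  # relative velocity of the object

def project_to_pixel(det: RadarDetection, focal_px: float = 300.0,
                     cx: int = 640, cy: int = 400) -> Tuple[int, int]:
    """Rough pinhole projection of a radar detection into the image (assumed intrinsics)."""
    u = int(cx + focal_px * det.y_m / max(det.x_m, 0.1))
    v = int(cy - focal_px * det.height_m / max(det.x_m, 0.1))
    return u, v

def classify_detections(detections: List[RadarDetection], image,
                        classifier: Callable) -> List[str]:
    """Run the image classifier only where the radar has already detected something."""
    labels = []
    for det in detections:
        u, v = project_to_pixel(det)
        roi = image[max(v - 64, 0):v + 64, max(u - 64, 0):u + 64]  # crop around the detection
        labels.append(classifier(roi))  # e.g. 'pedestrian', 'vehicle', 'bicycle'
    return labels
```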


The present invention provides a radar sensor and a camera integrated into the same or common housing (see FIG. 3). The radar sensor is used for object detection and object localization, with a vertical opening angle of about 150 degrees, a range of about 30 m, and a range resolution of about 7.5 cm. The radar sensor allows for measurement of the height of objects, as well as their relative velocity and path. The camera and machine vision/image processing are used for object classification (such as, for example, pedestrians, vehicles, bicycles, and/or the like).


For example, the radar and camera data may be fused for an automatic emergency braking (AEB) system of the vehicle. Because the radar performs detection, the machine vision/image processing of the camera no longer requires object detection, and therefore poses less risk for smart camera implementations (less processing performance is required). If the camera and radar are packaged together, there is less integration effort for the OEM, and wiring is minimized.
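
For illustration only, a minimal sketch of such fusion for AEB (the time-to-collision thresholds and class-dependent margins are assumptions, not values from this disclosure): the radar supplies range and closing speed, the camera supplies the object class, and a simple time-to-collision test decides whether to request braking.

```python
def time_to_collision(range_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact at the current closing speed (infinite if the gap is opening)."""
    return float("inf") if closing_speed_mps <= 0.0 else range_m / closing_speed_mps

def aeb_brake_request(range_m: float, closing_speed_mps: float, obj_class: str) -> bool:
    # Brake earlier for vulnerable road users (assumed margins for illustration).
    ttc_threshold_s = {"pedestrian": 2.0, "bicycle": 1.8, "vehicle": 1.2}.get(obj_class, 1.0)
    return time_to_collision(range_m, closing_speed_mps) < ttc_threshold_s

print(aeb_brake_request(6.0, 4.0, "pedestrian"))   # TTC = 1.5 s < 2.0 s -> True
print(aeb_brake_request(6.0, 4.0, "vehicle"))      # TTC = 1.5 s >= 1.2 s -> False
```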


Optionally, the system may integrate radar sensors in a stereo three dimensional (3D) configuration and a camera into a common housing. The radar sensors are positioned such that their horizontal and vertical azimuths intersect. It is envisioned that the radar sensors have common horizontal and vertical fields of view or fields of sensing and common azimuth resolution. Each radar sensor is used for object detection and object localization, with a vertical opening angle of about 150 degrees, a range of about 30 m or more, and a range resolution of about 7.5 cm. The radar sensor allows measurement of the height of objects and creation of a terrain map. The pitch of the radar sensor mounting (see FIG. 4) is selected to maximize the coverage close to the vehicle, with the vertical field of view/sensing ideally intersecting the vehicle. The roll (FIG. 5) and yaw (FIG. 6) of the radar sensors are selected to balance depth measurement across the entire combined field of sensing of all of the radar sensors.
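
A minimal sketch, assuming an example mounting height of 0.7 m and a downward mounting pitch of 10 degrees (neither value is specified here), of how a detection's range and elevation angle in the sensor frame can be converted to a height above the road, as would feed object height measurement or a terrain map.

```python
import math

SENSOR_HEIGHT_M  = 0.7     # assumed mounting height above the road
SENSOR_PITCH_DEG = -10.0   # assumed downward pitch so coverage reaches close to the vehicle

def detection_height_above_ground(range_m: float, elevation_deg: float) -> float:
    """Height of a detection above the road, given range and elevation in the sensor frame."""
    world_elev = math.radians(elevation_deg + SENSOR_PITCH_DEG)   # rotate by the mounting pitch
    return SENSOR_HEIGHT_M + range_m * math.sin(world_elev)

print(detection_height_above_ground(5.0, 10.0))   # ~0.70 m: a return level with the sensor
print(detection_height_above_ground(4.0, 0.0))    # ~0.0 m: a return from the road surface at ~4 m
```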


It is envisioned that the sensors' horizontal planes are rotated greater than about 90 degrees relative to each other, providing intersecting beams in range, velocity, azimuth and elevation for all detections, creating radar data cubes of known size for all volumes in 3D space within the combined field of view of at least three sensors on the vehicle. The camera, radar and machine vision/image processing are used for object classification (such as, for example, pedestrians, vehicles, bicycles, and/or the like). The captured radar data and camera/image data are fused for automatic emergency braking (AEB) applications.


Optionally, the radar sensors may operate collaboratively, where the transmitted signals from each transmit antenna of radar sensor #1 are received and processed by all receive antennas of both radar sensor #1 and radar sensor #2, thereby increasing the effective aperture, accuracy and resolution of the system.
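
A small sketch of the aperture gain from such collaborative operation (the antenna counts below are illustrative assumptions): when every receive antenna also processes the other sensor's transmissions, the number of virtual channels grows from Tx1·Rx1 + Tx2·Rx2 to (Tx1 + Tx2)·(Rx1 + Rx2).

```python
from typing import Tuple

def virtual_channels(tx1: int, rx1: int, tx2: int, rx2: int) -> Tuple[int, int]:
    """Virtual channel counts for independent versus collaborative operation."""
    independent   = tx1 * rx1 + tx2 * rx2          # each radar sensor processed on its own
    collaborative = (tx1 + tx2) * (rx1 + rx2)      # transmissions shared across both sensors
    return independent, collaborative

# Illustrative counts (not from the patent): two 2-Tx / 4-Rx sensors.
print(virtual_channels(2, 4, 2, 4))   # (16, 32): twice as many virtual channels
```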


The machine vision/image processing of the camera-captured image data does not require object detection, and thus poses less risk for smart camera implementations (less processing performance is required). If the camera and radar are packaged together, there is less integration effort for the OEM, while wiring requirements are also minimized or reduced.


As shown in FIGS. 7 and 8, an integrated camera and radar package may include a camera disposed between two radar sensors. Optionally, the radar sensors may be arranged at 90 degrees relative to one another (FIG. 7), such that (when the device is mounted at a vehicle) one radar sensor antenna is at a vertical resolution orientation and the other radar sensor antenna is at a horizontal resolution orientation, thus providing a symmetric radar cube package. Optionally, the radar sensors may be arranged at a different angle (such as, for example, 70 degrees in FIG. 8) relative to one another, such that (when the device is mounted at a vehicle) one radar sensor antenna is at a roll+35 degree orientation and the other radar sensor antenna is at a roll−35 degree orientation, thus providing an asymmetric radar cube package. Clearly, other orientations may be implemented depending on the particular application of the integrated radar and camera sensing device and system.
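
As a geometric sketch of the two arrangements (the image-plane axis convention is an assumption for illustration), the fine-resolution axis of each radar sensor can be treated as a unit vector rolled about the camera's viewing axis: a 90 degree relative rotation gives one horizontal and one vertical axis, while a 70 degree relative rotation gives axes at roll +35 and −35 degrees.

```python
import math

def resolution_axis(roll_deg: float):
    """Unit vector of a sensor's fine-resolution axis after a roll rotation (assumed convention)."""
    a = math.radians(roll_deg)
    return (round(math.cos(a), 3), round(math.sin(a), 3))

# FIG. 7: sensors rotated 90 degrees apart -> one horizontal and one vertical axis (symmetric package).
print(resolution_axis(0.0), resolution_axis(90.0))    # (1.0, 0.0) and (0.0, 1.0)

# FIG. 8: sensors rotated 70 degrees apart (roll +35 / -35 degrees) -> an asymmetric package.
print(resolution_axis(35.0), resolution_axis(-35.0))  # (0.819, 0.574) and (0.819, -0.574)
```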


The system may utilize sensors, such as radar or lidar sensors or the like. The sensing system may utilize aspects of the systems described in U.S. Pat. Nos. 9,753,121; 9,689,967; 9,599,702; 9,575,160; 9,146,898; 9,036,026; 8,027,029; 8,013,780; 6,825,455; 7,053,357; 7,408,627; 7,405,812; 7,379,163; 7,379,100; 7,375,803; 7,352,454; 7,340,077; 7,321,111; 7,310,431; 7,283,213; 7,212,663; 7,203,356; 7,176,438; 7,157,685; 6,919,549; 6,906,793; 6,876,775; 6,710,770; 6,690,354; 6,678,039; 6,674,895 and/or 6,587,186, and/or International Publication No. WO 2011/090484 and/or U.S. Publication Nos. US-2017-0222311 and/or US-2010-0245066, and/or U.S. patent application Ser. No. 15/647,339, filed Jul. 12, 2017, Ser. No. 15/619,627, filed Jun. 12, 2017, Ser. No. 15/584,265, filed May 2, 2017, Ser. No. 15/467,247, filed Mar. 23, 2017, Ser. No. 15/446,220, filed Mar. 1, 2017, and/or Ser. No. 15/675,919, filed Aug. 14, 2017, and/or International PCT Application No. PCT/IB2017/054120, filed Jul. 7, 2017, which are hereby incorporated herein by reference in their entireties.


Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.

Claims
  • 1. A sensing system of a vehicle, said sensing system comprising: a sensor module disposed at a vehicle equipped with said sensing system;wherein said sensor module comprises a sensor package that houses in a common housing (i) a first radar sensor disposed in the common housing and having a plurality of transmitting antennas and a plurality of receiving antennas disposed at a first substrate, (ii) a second radar sensor disposed in the common housing and having a plurality of transmitting antennas and a plurality of receiving antennas disposed at a second substrate, and (iii) a camera disposed in the common housing between said first radar sensor and said second radar sensor;wherein the transmitting and receiving antennas of said first radar sensor are oriented in a first orientation at the first substrate;wherein the transmitting and receiving antennas of said second radar sensor are oriented in a second orientation at the second substrate;wherein the first orientation is different than the second orientation by rotation of said first radar sensor relative to said second radar sensor about a first axis perpendicular to the first substrate;wherein said first radar sensor is rotated clockwise relative to said camera about the first axis, and wherein said second radar sensor is rotated counter-clockwise relative to said camera about a second axis perpendicular to the second substrate;wherein said first radar sensor has a first field of sensing exterior the equipped vehicle, and wherein said second radar sensor has a second field of sensing exterior the equipped vehicle, and wherein said camera has a field of view exterior the equipped vehicle;wherein the first field of sensing of said first radar sensor is encompassed by a portion of the field of view of said camera, and wherein the second field of sensing of said second radar sensor is encompassed by another portion of the field of view of said camera;a control, wherein outputs of said first and second radar sensors and said camera are communicated to said control;wherein said control comprises a processor that processes the outputs;wherein said control, responsive to processing of the outputs of said first and second radar sensors, detects the presence of an object exterior the vehicle and within the field of sensing of at least one of said first radar sensor and said second radar sensor; andwherein said control, responsive to detection of the presence of an object via processing of the outputs of said first and second radar sensors, processes the output of said camera to classify the detected object.
  • 2. The sensing system of claim 1, wherein physical orientation of said second radar sensor relative to said camera is different from physical orientation of said first radar sensor relative to said camera.
  • 3. The sensing system of claim 1, wherein said first and second radar sensors are positioned such that their horizontal and vertical azimuths are intersecting.
  • 4. The sensing system of claim 1, wherein said sensor module comprises a circuit board and wherein said first and second radar sensors and said camera are disposed at said circuit board, and wherein said camera is disposed at said circuit board at a location between said first and second radar sensors.
  • 5. The sensing system of claim 1, wherein said sensor module is disposed at a rear of the vehicle such that said camera views rearward of the vehicle.
  • 6. The sensing system of claim 5, wherein, during a reversing maneuver of the vehicle, a display device of the vehicle displays video images derived from the output of said camera.
  • 7. The sensing system of claim 1, wherein a display device of the vehicle is operable to display video images derived from the output of said camera.
  • 8. The sensing system of claim 1, wherein each of said first and second radar sensors comprises an array of virtual antennas.
  • 9. The sensing system of claim 8, wherein said array of virtual antennas comprises at least four virtual antennas.
  • 10. The sensing system of claim 1, wherein said sensor module is part of a sensing system capable of providing driver assist system functions.
  • 11. The sensing system of claim 10, wherein said sensing system provides detection for at least one of automated parking, blind spot detection, cross traffic alert, lane change and merge aid, automatic emergency braking, pedestrian detection, turn assist, terrain mapping and intersection collision mitigation.
  • 12. The sensing system of claim 1, wherein radar signals transmitted by said plurality of transmitting antennas of said first radar sensor are received by said plurality of receiving antennas of said second radar sensor, and wherein radar signals transmitted by said plurality of transmitting antennas of said second radar sensor are received by said plurality of receiving antennas of said first radar sensor.
  • 13. The sensing system of claim 1, wherein said first radar sensor, said second radar sensor and said camera are disposed on a circuit board, and wherein said circuit board is disposed along the first and second substrates.
  • 14. The sensing system of claim 1, wherein said first radar sensor is rotated 90 degrees about the first axis relative said second radar sensor.
  • 15. A sensing system of a vehicle, said sensing system comprising: a sensor module disposed at a vehicle equipped with said sensing system;wherein said sensor module comprises a sensor package that houses in a common housing (i) a first radar sensor disposed in the common housing and having a plurality of transmitting antennas and a plurality of receiving antennas disposed at a first substrate, (ii) a second radar sensor disposed in the common housing and having a plurality of transmitting antennas and a plurality of receiving antennas disposed at a second substrate, and (iii) a camera disposed in the common housing between said first radar sensor and said second radar sensor;wherein the transmitting and receiving antennas of said first radar sensor are oriented in a first orientation at the first substrate;wherein the transmitting and receiving antennas of said second radar sensor are oriented in a second orientation at the second substrate;wherein the first orientation is different than the second orientation by rotation of said first radar sensor relative to said second radar sensor about a first axis perpendicular to the first substrate;wherein said first radar sensor is rotated by a selected angle relative to said camera about the first axis in a clockwise direction when viewing the first substrate, and wherein said second radar sensor is rotated by the selected angle relative to said camera about a second axis perpendicular to the second substrate in a counter-clockwise direction when viewing the second substrate;wherein said first radar sensor has a first field of sensing exterior the equipped vehicle, and wherein said second radar sensor has a second field of sensing exterior the equipped vehicle, and wherein said camera has a field of view exterior the equipped vehicle;wherein the first field of sensing of said first radar sensor is encompassed by a portion of the field of view of said camera, and wherein the second field of sensing of said second radar sensor is encompassed by another portion of the field of view of said camera;a control, wherein outputs of said first and second radar sensors and said camera are communicated to said control;wherein said control comprises a processor that processes the outputs;wherein said control, responsive to processing of the outputs of said first and second radar sensors, detects the presence of an object exterior the vehicle and within the field of sensing of at least one of said first radar sensor and said second radar sensor; andwherein said control, responsive to detection of the presence of an object via processing of the outputs of said first and second radar sensors, processes the output of said camera to classify the detected object.
  • 16. The sensing system of claim 15, wherein the first axis is parallel to the second axis.
  • 17. The sensing system of claim 16, wherein the selected angle is less than or equal to 45 degrees.
  • 18. The sensing system of claim 16, wherein the selected angle is 35 degrees.
CROSS REFERENCE TO RELATED APPLICATIONS

The present application claims the filing benefits of U.S. provisional applications, Ser. No. 62/398,094, filed Sep. 22, 2016, and Ser. No. 62/378,849, filed Aug. 24, 2016, which are hereby incorporated herein by reference in their entireties.

US Referenced Citations (81)
Number Name Date Kind
3778823 Sato Dec 1973 A
5949331 Schofield et al. Sep 1999 A
6587186 Bamji et al. Jul 2003 B2
6674895 Rafii et al. Jan 2004 B2
6678039 Charbon Jan 2004 B2
6690268 Schofield et al. Feb 2004 B2
6690354 Sze Feb 2004 B2
6693517 McCarthy et al. Feb 2004 B2
6710770 Tomasi et al. Mar 2004 B2
6825455 Schwarte Nov 2004 B1
6876775 Torunoglu Apr 2005 B2
6906793 Bamji et al. Jun 2005 B2
6919549 Bamji et al. Jul 2005 B2
7053357 Schwarte May 2006 B2
7157685 Bamji et al. Jan 2007 B2
7176438 Bamji et al. Feb 2007 B2
7203356 Gokturk et al. Apr 2007 B2
7212663 Tomasi May 2007 B2
7283213 O'Connor et al. Oct 2007 B2
7310431 Gokturk et al. Dec 2007 B2
7321111 Bamji et al. Jan 2008 B2
7340077 Gokturk et al. Mar 2008 B2
7352454 Bamji et al. Apr 2008 B2
7375803 Bamji May 2008 B1
7379100 Gokturk et al. May 2008 B2
7379163 Rafii et al. May 2008 B2
7405812 Bamji Jul 2008 B1
7408627 Bamji et al. Aug 2008 B2
7580795 McCarthy et al. Aug 2009 B2
8013780 Lynam Sep 2011 B2
8027029 Lu et al. Sep 2011 B2
8698894 Briggance Apr 2014 B2
8855849 Ferguson Oct 2014 B1
9036026 Dellantoni et al. May 2015 B2
9146898 Ihlenburg et al. Sep 2015 B2
9575160 Davis et al. Feb 2017 B1
9599702 Bordes et al. Mar 2017 B1
9689967 Stark et al. Jun 2017 B1
9753121 Davis et al. Sep 2017 B1
20050267683 Fujiwara Dec 2005 A1
20080169963 White Jul 2008 A1
20100001897 Lyman Jan 2010 A1
20100245066 Sarioglu et al. Sep 2010 A1
20120062743 Lynam et al. Mar 2012 A1
20120218412 Dellantoni et al. Aug 2012 A1
20130063257 Schwindt Mar 2013 A1
20130215271 Lu Aug 2013 A1
20130222592 Gieseke Aug 2013 A1
20130241766 Kishigami Sep 2013 A1
20140062762 Kurono Mar 2014 A1
20140218529 Mahmoud et al. Aug 2014 A1
20140375476 Johnson et al. Dec 2014 A1
20150124096 Koravadi May 2015 A1
20150158499 Koravadi Jun 2015 A1
20150251599 Koravadi Sep 2015 A1
20150352953 Koravadi Dec 2015 A1
20160036917 Koravadi et al. Feb 2016 A1
20160116573 Appia Apr 2016 A1
20160210853 Koravadi Jul 2016 A1
20160291146 Wang Oct 2016 A1
20170129489 Pawlicki et al. May 2017 A1
20170222311 Hess et al. Aug 2017 A1
20170254873 Koravadi Sep 2017 A1
20170276788 Wodrich Sep 2017 A1
20170315231 Wodrich Nov 2017 A1
20170328997 Silverstein Nov 2017 A1
20170356994 Wodrich et al. Dec 2017 A1
20180015875 May et al. Jan 2018 A1
20180045812 Hess Feb 2018 A1
20180065623 Wodrich et al. Mar 2018 A1
20180067194 Wodrich et al. Mar 2018 A1
20180105176 Pawlicki et al. Apr 2018 A1
20180231635 Woehlte Aug 2018 A1
20180231657 Woehlte Aug 2018 A1
20180299533 Pliefke et al. Oct 2018 A1
20190061760 Pawlicki et al. Feb 2019 A1
20190072666 Duque Biarge et al. Mar 2019 A1
20190072667 Duque Biarge et al. Mar 2019 A1
20190072668 Duque Biarge et al. Mar 2019 A1
20190072669 Duque Biarge et al. Mar 2019 A1
20190217775 May et al. Jul 2019 A1
Foreign Referenced Citations (3)
Number Date Country
102007039834 Feb 2009 DE
2011090484 Jul 2011 WO
2018007995 Jan 2018 WO
Non-Patent Literature Citations (3)
Entry
Nissan North America, Inc., “2012 Pathfinder Owner's Manual”, 2011 (Year: 2011).
Machine translation of specification of DE102007039834 (Year: 2009).
Machine translation of claims of DE102007039834 (Year: 2009).
Related Publications (1)
Number Date Country
20180059236 A1 Mar 2018 US
Provisional Applications (2)
Number Date Country
62398094 Sep 2016 US
62378849 Aug 2016 US