The present invention relates generally to a vehicle vision system for a vehicle and, more particularly, to a vehicle vision system that utilizes one or more cameras at a vehicle.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.
The present invention provides a driver assistance system or vision system or imaging system for a vehicle that utilizes one or more cameras to capture image data representative of images interior of the vehicle, and provides an illumination source that emits non-visible light that illuminates at least a portion of a driver of the vehicle. A specular-selective reflector is disposed at the vehicle windshield and within a line of sight of the illuminated portion of the driver. The reflector reflects at least some non-visible light (such as infrared (IR) and/or near-infrared (NIR) radiation) incident at the reflector and allows visible light to pass through the reflector and the windshield, so as to direct the emitted non-visible light toward the driver, while allowing the driver to view through the reflector at the windshield. The camera is disposed in the dashboard of the vehicle and has a field of view that is directed away from the driver and that includes or encompasses the reflector. A control includes an image processor operable to process image data captured by the camera. The camera is sensitive to non-visible (IR/NIR) light and captures non-visible light that is emitted by the illumination source and that reflects off the illuminated portion of the driver of the vehicle and that reflects off the reflector so as to reflect toward the camera. The control, responsive to image processing of image data captured by the camera, monitors the illuminated portion of the driver.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle vision system and/or driver assist system and/or object detection system and/or alert system operates to capture images exterior and/or interior of the vehicle and may process the captured image data to monitor occupants of the vehicle and/or to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and provide an output to a display device for displaying images representative of the captured image data. Optionally, the vision system may provide a display, such as a rearview display or a top down or bird's eye or surround view display or the like.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that may include at least one exterior viewing imaging sensor or camera, such as a rearward viewing imaging sensor or camera 14a (and the system may optionally include multiple exterior viewing imaging sensors or cameras, such as a forward viewing camera 14b at the front (or at the windshield) of the vehicle, and a sideward/rearward viewing camera 14c, 14d at respective sides of the vehicle), which captures images exterior of the vehicle, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera.
The system may also or otherwise include a driver monitoring camera 44 that captures image data representative of the driver's head. The camera is, for example, disposed at a dashboard or instrument panel of the vehicle and has the principal axis of its field of view directed toward a portion of the windshield where a reflector 40 is disposed. The reflector comprises a thin transparent film or coating, or a stack of thin films or coatings deposited on a transparent plastic substrate, that can be configured to conform to the curvature of the in-cabin surface of the windshield portion when disposed at the windshield portion. The reflector reflects infrared (IR) or near-infrared (NIR) light incident at the reflector/windshield portion and allows visible light to pass through the reflector and the windshield portion (so that the driver can view through the windshield at the reflector). For example, a multi-layer stack of dielectric coatings may be utilized. The IR/NIR light present at the driver's head region is imaged by the camera 44 for a driver monitoring system via reflection of the IR/NIR light off the reflector 40, as discussed in detail below.
Driver monitoring systems may be used to track the head and eyes of the driver of a vehicle. To accurately track the eyes of the driver, the pupils of the eyes must be clearly viewed by a camera or other imaging device. Because of this, a preferred camera position is directly in front of the driver and at or below a line-of-sight to the road. This allows the camera to view both pupils in most situations and also avoid having the driver's eyelashes in the way (which is a problem that occurs when cameras are mounted above the line-of-sight to the road).
As shown in
Referring now to
Optionally, an infrared or near-infrared light emitting diode (LED) or other illumination source may be disposed in the vehicle (and such as at or near the camera) that directs IR/NIR light (or other non-visible light) toward the driver's head region, whereby the camera images the illuminated driver's head. The LED and camera may operate together when the driver monitoring system is operating. The illumination source may be disposed at the camera so that the light emitted by the illumination source, when energized or powered, reflects off the reflector toward the driver's head region so as to illuminate the driver's head region with IR/NIR light, or the illumination source may be disposed elsewhere in the vehicle and emit light generally directly toward the driver's head region.
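As a minimal sketch of such coordinated operation, the following example (using placeholder classes and assumed names, not an actual camera or LED driver interface) energizes the illumination source only while the driver monitoring system is capturing and processing frames:

```python
# Illustrative sketch only: placeholder classes stand in for the NIR LED and
# the dashboard-mounted driver monitoring camera (these are not a real
# driver/camera API). The LED operates together with the camera, energized
# while frames are captured and de-energized when monitoring stops.
import time

class NirLed:
    """Placeholder for the IR/NIR illumination source."""
    def __init__(self):
        self.energized = False
    def enable(self):
        self.energized = True
    def disable(self):
        self.energized = False

class MonitoringCamera:
    """Placeholder for the NIR-sensitive driver monitoring camera."""
    def capture_frame(self):
        return b"\x00" * 64   # stand-in for captured NIR image data

def process_frame(frame):
    """Stand-in for the image processor that monitors the driver."""
    return len(frame)

def run_driver_monitoring(camera, led, num_frames=10, frame_period_s=1 / 30):
    led.enable()                                   # LED on while monitoring
    try:
        for _ in range(num_frames):
            process_frame(camera.capture_frame())  # capture and process a frame
            time.sleep(frame_period_s)
    finally:
        led.disable()                              # LED off when monitoring stops

run_driver_monitoring(MonitoringCamera(), NirLed())
```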
Referring now to
Referring now to
Referring now to
The position and angle of the illumination source and the position of the reflector at the windshield may be selected based on the angle or rake of the windshield so that the light emitted by the illumination source reflects off the reflector and toward the driver's head region, with a principal axis of the reflected light (as reflected toward the region to be illuminated, e.g., the driver's head region) impinging the driver's eyes within a threshold angle range (e.g., 5-30 degrees) relative to the principal axis of the driver's forward view or field of view (which is typically generally horizontal when the driver views through the windshield and forward of the vehicle). Thus, the reflector may be positioned higher or lower at the windshield depending on the location and angle of the light source and the angle of the windshield so as to direct the light path toward the driver's eyes (or other portion of the driver to be illuminated) within the threshold angle range.
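For illustration only, the following sketch checks such a layout numerically, treating the reflector as a planar mirror on the windshield and using assumed example positions for the illuminator, reflector, and driver's eyes and an assumed 30 degree windshield rake (none of these values are specified herein):

```python
import numpy as np

def reflect(direction, normal):
    """Mirror a ray direction about a (not necessarily unit) surface normal."""
    d = direction / np.linalg.norm(direction)
    n = normal / np.linalg.norm(normal)
    return d - 2.0 * np.dot(d, n) * n

def angle_deg(u, v):
    c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

# Assumed example layout (hypothetical values, in meters; x = forward, z = up,
# lateral offsets suppressed for a simple side-view check).
rake = np.radians(30.0)                              # windshield angle from horizontal
windshield_normal = np.array([-np.sin(rake), 0.0, -np.cos(rake)])  # cabin-facing normal
illuminator_pos = np.array([1.05, 0.0, 0.83])        # LED/camera low in the dash
reflector_pos   = np.array([0.84, 0.0, 1.05])        # reflector patch on the glass
eye_pos         = np.array([0.04, 0.0, 1.25])        # driver's eye region

incident  = reflector_pos - illuminator_pos          # illuminator -> reflector
reflected = reflect(incident, windshield_normal)     # reflector -> cabin
to_eye    = eye_pos - reflector_pos                  # reflector -> eye

# How well the reflected principal axis is centered on the eye region.
print(f"beam/eye misalignment: {angle_deg(reflected, to_eye):.1f} deg")

# Angle at which the illumination arrives at the eyes, measured against the
# (roughly horizontal) principal axis of the driver's forward view.
forward_view = np.array([1.0, 0.0, 0.0])
arrival = angle_deg(reflector_pos - eye_pos, forward_view)
print(f"arrival angle vs forward view: {arrival:.1f} deg "
      f"(within 5-30 deg: {5.0 <= arrival <= 30.0})")
```

With these assumed positions, the reflected principal axis is nearly centered on the eye region and arrives at roughly 14 degrees relative to the forward view axis, within the example 5-30 degree threshold range.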
Optionally, and desirably, the reflector 40 reflects non-visible light and allows visible light to pass through. That is, the reflector's reflectivity at the operating wavelength of the driver monitoring camera 44 is at or approaching 100 percent, while the reflector's transmissivity at the operating wavelength is at or near zero. The operating wavelength, as discussed above, may be in the NIR range (e.g., around 700 nm to around 940 nm). The reflector 40 thus may be configured to reflect light at the operating wavelength and to transmit light at other wavelengths, particularly in the visible spectrum. With such a design, the driver may see through the reflector 40 without visual obstruction, allowing an unimpeded view of the vehicle's surroundings (e.g., the road).
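As a simple illustration of this idealized spectral behavior, the following sketch models the reflector as nearly totally reflective across an assumed 700-940 nm operating band and nearly totally transmissive elsewhere (the specific reflectance values are assumed, and absorption in the film stack is neglected):

```python
# Idealized (not measured) spectral model of the reflector: nearly total
# reflection across an assumed 700-940 nm NIR operating band and nearly total
# transmission of visible light, with absorption neglected so T = 1 - R.
def reflectance(wavelength_nm, band=(700.0, 940.0), r_in_band=0.98, r_out_of_band=0.02):
    lo, hi = band
    return r_in_band if lo <= wavelength_nm <= hi else r_out_of_band

def transmittance(wavelength_nm):
    return 1.0 - reflectance(wavelength_nm)

# Visible samples (450-650 nm) pass through; NIR samples (850, 940 nm) reflect.
for wl in (450, 550, 650, 850, 940):
    print(f"{wl} nm: R = {reflectance(wl):.2f}, T = {transmittance(wl):.2f}")
```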
Inside the lens of the camera 44 (or other imaging sensor), there may be an optical filter that passes the operating wavelength and blocks most or all other wavelengths. Thus, for example, only the NIR light emitted from the illuminator(s) and reflected from a portion of the driver enters the optical path of the camera 44, allowing capture of image data representative of the illuminated portion of the driver. Light in all spectral ranges other than the operating wavelength may be blocked by the optical filter in the camera lens. For example, as shown in
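The following sketch (with assumed band edges and efficiencies, not values specified herein) illustrates how such an in-lens band-pass filter combines with the reflector so that only light near the operating wavelength contributes appreciably to the image data captured via the reflector:

```python
# Illustrative only (assumed band edges and efficiencies): the in-lens
# band-pass filter, combined with the NIR-reflecting reflector, strongly
# suppresses out-of-band (e.g., visible ambient) light before it reaches the
# imager, so essentially only the operating wavelength forms the image.
def lens_bandpass(wavelength_nm, lo=820.0, hi=880.0, t_pass=0.90, t_block=0.001):
    """Assumed in-lens band-pass filter centered near an 850 nm operating band."""
    return t_pass if lo <= wavelength_nm <= hi else t_block

def reflector_reflectance(wavelength_nm, lo=700.0, hi=940.0, r_in=0.98, r_out=0.02):
    """Assumed reflector: reflects NIR toward the camera, transmits visible."""
    return r_in if lo <= wavelength_nm <= hi else r_out

def relative_response(wavelength_nm):
    """Relative signal at the imager for light arriving via the reflector."""
    return reflector_reflectance(wavelength_nm) * lens_bandpass(wavelength_nm)

for wl in (550, 650, 850, 940):
    print(f"{wl} nm -> relative response {relative_response(wl):.5f}")
```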
Thus, the system uses a non-visible light emitting illumination source, a reflector, and a camera to monitor, for example, the head and/or eyes of a driver of a vehicle. The non-visible light reflects off a portion of the driver of the vehicle, and the reflected non-visible light that is incident at the reflector reflects off the reflector so as to be within view of the camera. The camera, based on the non-visible light emitted by the illumination source and reflected off the driver and off the reflector, captures image data representative of the illuminated portion of the driver. Responsive to processing of the captured image data, the illuminated portion (e.g., the eyes of the driver) is monitored. Because of the reflective properties of the reflector, the camera may be placed completely out of sight of the driver. The system may, for example, determine the general viewing direction of the driver, the focal distance of the driver's gaze, or the alertness of the driver.
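As one simplified illustration of how processing of the captured image data may yield such information (a generic sketch, not the particular algorithm of the driver monitoring system), a coarse gaze direction can be estimated from the offset of a detected pupil center relative to the detected eye corners:

```python
# Generic illustration (hypothetical geometry and gain, not the system's
# actual algorithm): estimate a coarse gaze yaw/pitch, in degrees, from the
# offset of the detected pupil center relative to the eye-corner midpoint,
# normalized by eye width so the estimate is independent of image scale.
def coarse_gaze_deg(pupil_px, eye_inner_px, eye_outer_px, deg_per_normalized_offset=25.0):
    cx = (eye_inner_px[0] + eye_outer_px[0]) / 2.0
    cy = (eye_inner_px[1] + eye_outer_px[1]) / 2.0
    eye_width = abs(eye_outer_px[0] - eye_inner_px[0]) or 1.0
    yaw = deg_per_normalized_offset * (pupil_px[0] - cx) / eye_width
    pitch = -deg_per_normalized_offset * (pupil_px[1] - cy) / eye_width
    return yaw, pitch

# Example: pupil slightly left of and above the eye-corner midpoint.
print(coarse_gaze_deg(pupil_px=(118, 80), eye_inner_px=(100, 82), eye_outer_px=(140, 82)))
```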
The driver monitoring system may utilize aspects of head and face direction and position tracking systems and/or eye tracking systems and/or gesture recognition systems. Such head and face direction and/or position tracking systems and/or eye tracking systems and/or gesture recognition systems may utilize aspects of the systems described in U.S. Pat. No. 9,405,120 and/or U.S. Publication Nos. US-2017-0274906; US-2016-0209647; US-2016-0137126; US-2015-0352953; US-2015-0296135; US-2015-0294169; US-2015-0232030; US-2015-0092042; US-2015-0022664; US-2015-0015710; US-2015-0009010 and/or US-2014-0336876, which are hereby incorporated herein by reference in their entireties.
The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.
The imaging device and control and image processor and any associated illumination source, if applicable, may comprise any suitable components, and may utilize aspects of the cameras (such as various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like) and vision systems described in U.S. Pat. Nos. 5,760,962; 5,715,093; 6,922,292; 6,757,109; 6,717,610; 6,590,719; 6,201,642; 5,796,094; 6,559,435; 6,831,261; 6,822,563; 6,946,978; 7,720,580; 8,542,451; 7,965,336; 7,480,149; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,937,667; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454 and/or 6,824,281, and/or International Publication Nos. WO 2009/036176; WO 2009/046268; WO 2010/099416; WO 2011/028686 and/or WO 2013/016409, and/or U.S. Publication Nos. US 2010-0020170 and/or US-2009-0244361, which are all hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application claims priority of U.S. provisional application Ser. No. 62/885,983, filed Aug. 13, 2019, and U.S. provisional application Ser. No. 62/754,089, filed Nov. 1, 2018, which are hereby incorporated herein by reference in their entireties.
U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---
4712879 | Lynam et al. | Dec 1987 | A |
5073012 | Lynam | Dec 1991 | A |
5076673 | Lynam et al. | Dec 1991 | A |
5115346 | Lynam | May 1992 | A |
5140455 | Varaprasad et al. | Aug 1992 | A |
5142407 | Varaprasad et al. | Aug 1992 | A |
5151816 | Varaprasad et al. | Sep 1992 | A |
5253109 | O'Farrell et al. | Oct 1993 | A |
5406414 | O'Farrell et al. | Apr 1995 | A |
5525264 | Cronin et al. | Jun 1996 | A |
5550677 | Schofield et al. | Aug 1996 | A |
5567360 | Varaprasad et al. | Oct 1996 | A |
5610756 | Lynam et al. | Mar 1997 | A |
5668663 | Varaprasad et al. | Sep 1997 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5724187 | Varaprasad et al. | Mar 1998 | A |
5910854 | Varaprasad et al. | Jun 1999 | A |
5949331 | Schofield et al. | Sep 1999 | A |
6002511 | Varaprasad et al. | Dec 1999 | A |
6154306 | Varaprasad et al. | Nov 2000 | A |
6178034 | Allemand et al. | Jan 2001 | B1 |
6498620 | Schofield | Dec 2002 | B2 |
6690268 | Schofield et al. | Feb 2004 | B2 |
7184190 | McCabe et al. | Feb 2007 | B2 |
7195381 | Lynam et al. | Mar 2007 | B2 |
7255451 | McCabe et al. | Aug 2007 | B2 |
7274501 | McCabe et al. | Sep 2007 | B2 |
7626749 | Baur et al. | Dec 2009 | B2 |
7914187 | Higgins-Luthman et al. | Mar 2011 | B2 |
8725311 | Breed | May 2014 | B1 |
9405120 | Graf et al. | Aug 2016 | B2 |
20030142041 | Barlow et al. | Jul 2003 | A1 |
20030201895 | Harter et al. | Oct 2003 | A1 |
20060098166 | Scharenbroch et al. | May 2006 | A1 |
20060287779 | Smith et al. | Dec 2006 | A1 |
20100245093 | Kobetski et al. | Sep 2010 | A1 |
20120268582 | Rothenhausler | Oct 2012 | A1 |
20140218529 | Mahmoud et al. | Aug 2014 | A1 |
20140232869 | May et al. | Aug 2014 | A1 |
20140336876 | Gieseke et al. | Nov 2014 | A1 |
20150009010 | Biemer | Jan 2015 | A1 |
20150015710 | Tiryaki | Jan 2015 | A1 |
20150022664 | Pflug et al. | Jan 2015 | A1 |
20150092042 | Fursich | Apr 2015 | A1 |
20150124068 | Madau | May 2015 | A1 |
20150156383 | Biemer et al. | Jun 2015 | A1 |
20150232030 | Bongwald | Aug 2015 | A1 |
20150294169 | Zhou | Oct 2015 | A1 |
20150296135 | Wacquant et al. | Oct 2015 | A1 |
20150352953 | Koravadi | Dec 2015 | A1 |
20160137126 | Fursich et al. | May 2016 | A1 |
20160150218 | Yoon | May 2016 | A1 |
20160209647 | Fursich | Jul 2016 | A1 |
20170274906 | Hassan et al. | Sep 2017 | A1 |
Foreign Patent Documents

Number | Date | Country
---|---|---
102017205386 | Oct 2018 | DE |
Other Publications

Entry
---
CCS Inc., “Band-Pass Filters for Machine Vision Camera Lenses”, Jan. 19, 2018, pp. 1-2. (Year: 2018). |
Prior Publication Data

Number | Date | Country
---|---|---
20200143560 A1 | May 2020 | US |
Related U.S. Application Data

Number | Date | Country
---|---|---
62885983 | Aug 2019 | US | |
62754089 | Nov 2018 | US |