The present invention relates generally to a vehicle vision system for a vehicle and, more particularly, to a vehicle vision system that utilizes one or more cameras at an interior cabin of the vehicle.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.
The present invention provides a driver monitoring system that utilizes one or more cameras to capture image data representative of images interior of the vehicle, and that provides heart rate measurement and/or other vital signs via processing of image data captured by the camera(s). A control includes an image processor operable to process image data captured by the camera. The control, responsive to image processing of image data captured by the camera, monitors the imaged portion of the driver (such as a region of the driver's face) and determines the driver's heart rate and other physical characteristics that pertain to the health and/or status of the driver. The driver monitoring system may also monitor the eyes and/or head of the driver to determine attentiveness or drowsiness of the driver. Responsive to determination of a threshold change in the driver's heart rate and/or of a decrease in attentiveness below a threshold level, the system may generate an alert and/or may generate an output to cause a vehicle control system to take over control or partial control of the vehicle.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle vision system and/or driver assist system and/or object detection system and/or alert system operates to capture images exterior and/or interior of the vehicle and may process the captured image data to monitor occupants of the vehicle and/or display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and provide an output to a display device for displaying images representative of the captured image data.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes a driver monitoring system 12 that includes at least one interior viewing imaging sensor or camera 14, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera.
The driver monitoring system includes a driver monitoring camera 14 that captures image data representative of the driver's head. The camera is disposed at a dashboard or instrument panel of the vehicle and has the principal axis of its field of view directed toward the face of the driver.
The driver monitoring system may track the head and eyes of the driver of the vehicle. To accurately track the eyes of the driver, the pupils of the eyes must be clearly viewed by a camera or other imaging device. Because of this, a preferred camera position is directly in front of the driver and at or below a line-of-sight to the road. This allows the camera to view both pupils in most situations and also avoid having the driver's eyelashes in the way (which is a problem that occurs when cameras are mounted above the line-of-sight to the road).
The driver (and/or passenger) monitoring system (comprising the camera(s) and a controller or control unit having a processor for processing image data captured by the camera(s)) can also operate as a remote photoplethysmography (rPPG) system to monitor the person's heart rate and, optionally, other vital signs. The system, responsive to detection of a change in the driver's heartbeat, may determine whether the driver's capability of driving the vehicle is impaired, such as due to the driver having a health issue, being tired or drowsy, being stressed, or being drunk or under the influence of drugs or the like. The system may generate an alert or warning (such as to the driver, to a passenger in the vehicle, or to a system or control remote from the vehicle) if the system determines a health issue or the like with the driver.
Because blood flows into vessels in the human head, the blood flow can cause the facial skin to change its optical reflection at certain color spectra, or can cause the head to move by a tiny degree, neither of which is normally detectable by the human eye. These optical reflection changes and head movements are in synchronization with the person's heartbeat.
The interior camera(s) can be used to capture image data representative of the driver's or passenger's head and face, and computational algorithms are used to detect the tiny skin color changes and/or tiny head movements. For example, a camera imaging the driver's or passenger's head can transmit video or captured image data to an image processing unit that runs algorithms that extract the heartbeat frequency from the video image data. Optionally, one region of interest (ROI) or multiple regions of interest on the driver's or passenger's face (such as a rectangular region that includes or encompasses the driver's eyes, nose and mouth) may be selected and processed to extract the heartbeat signal.
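By way of non-limiting illustration only, the following Python/NumPy sketch shows one way such an algorithm might extract a heartbeat frequency from the mean green-channel intensity of a facial ROI over a sequence of frames; the function name, parameters, and processing steps are illustrative assumptions and are not part of this disclosure:

```python
import numpy as np

def estimate_heart_rate_bpm(frames, roi, fps):
    """Estimate heart rate in beats per minute from facial video frames.

    frames : array of shape (num_frames, height, width, 3), RGB order
    roi    : (row0, row1, col0, col1) bounding the facial region of interest
    fps    : camera frame rate in frames per second
    """
    r0, r1, c0, c1 = roi
    # Mean green-channel intensity over the ROI, one sample per frame;
    # the blood-volume (rPPG) signal is typically strongest in green.
    signal = frames[:, r0:r1, c0:c1, 1].mean(axis=(1, 2))

    # Crude detrend: subtract a one-second moving average to remove
    # slow illumination drift and head-pose changes.
    window = max(int(fps), 1)
    baseline = np.convolve(signal, np.ones(window) / window, mode="same")
    signal = signal - baseline

    # Magnitude spectrum of the windowed, detrended signal.
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)

    # Keep only the physiologically plausible band (roughly 42-240 bpm).
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0
```

In practice, the ROI would be supplied by a face or landmark tracker, and per-window estimates would typically be smoothed over successive windows.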
Natural light can be used as the illumination and a normal visible-spectrum camera can be used. The camera may comprise a CMOS image sensor (comprising one million or more photosensing elements arranged in a two-dimensional array of rows and columns) with a red-green-green-blue or RGGB (Bayer), red-clear-clear-blue or RCCB, red-clear-clear-clear or RCCC, red-green-blue-IR or RGB-IR, or other color or spectral filter array pattern at the image sensor array.
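For example, under the assumption of an RGGB (Bayer) layout, the green samples of a raw mosaic frame can be isolated with simple array slicing before the ROI averaging described above. This is an illustrative sketch only; the actual mosaic layout would need to be confirmed against the imager's datasheet:

```python
import numpy as np

def green_from_rggb(raw):
    """Average the two green planes of a raw RGGB (Bayer) mosaic.

    raw : 2-D array of raw sensor values, RGGB layout assumed
          (even rows R G R G ..., odd rows G B G B ...)
    Returns a half-resolution green image suitable for ROI averaging.
    """
    g1 = raw[0::2, 1::2]  # green samples on the red rows
    g2 = raw[1::2, 0::2]  # green samples on the blue rows
    h = min(g1.shape[0], g2.shape[0])
    w = min(g1.shape[1], g2.shape[1])
    return (g1[:h, :w].astype(np.float32) + g2[:h, :w].astype(np.float32)) / 2.0
```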
Optionally, an infrared or near-infrared light emitting diode (LED) or other light source may be disposed in the vehicle that directs IR/NIR light toward the driver's head region, whereby the camera images the illuminated driver's head. The LED and camera may operate together when the driver monitoring system is operating. The LED may be disposed at the camera or integrated with the camera or the LED may be disposed elsewhere in the vehicle (such as close to the camera) so as to emit light generally directly toward the driver's head region.
Because the infrared (IR) or near-infrared (NIR) spectrum is more sensitive to human blood than other optical spectra, an IR or NIR light emitting diode (LED) or laser is preferably used as the illuminator to enhance the camera's image sensitivity and to decrease or eliminate uncontrolled illumination from environmental light, thereby increasing the measurement accuracy of the system. Synchronizing the illumination source's pulse with the imager's exposure further increases the vision signal and decreases noise from environmental light and electrical circuits. Optionally, for example, an imager with a global shutter (e.g., an ON Semiconductor AR0234 or AR0144 imager) may be used to better synchronize the image exposure with the pulsed illumination.
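As a non-limiting illustration of the benefit of such synchronization: if alternate frames are exposed with the pulsed illuminator on and off, subtracting adjacent frame pairs largely cancels the roughly constant ambient contribution. A minimal sketch follows, where the interleaved on/off capture sequence is an assumption made for illustration:

```python
import numpy as np

def suppress_ambient(frames):
    """Cancel ambient light from an interleaved LED-on / LED-off capture.

    frames : array of shape (num_frames, height, width); even-indexed
             frames assumed exposed during the IR LED pulse, odd-indexed
             frames exposed with the LED off.
    Returns difference frames dominated by the controlled NIR illumination.
    """
    lit = frames[0::2].astype(np.float32)
    dark = frames[1::2].astype(np.float32)
    n = min(len(lit), len(dark))
    # Ambient light is approximately constant between adjacent frames,
    # so subtraction leaves mostly the pulsed-illuminator contribution.
    return np.clip(lit[:n] - dark[:n], 0.0, None)
```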
Therefore, the present invention provides a driver monitoring system or passenger monitoring system that processes image data captured by a camera (such as a camera having a high definition multi-megapixel color CMOS imager) to determine the driver's or passenger's heartbeat. This is accomplished by detecting small changes in the color of the person's skin and/or small movements of the person's head as blood flows into vessels in the person's head. By detecting such small (not perceivable to the human eye) color changes and/or head movements (indicative of blood flow), the system can determine the heartbeat or heart rate of the person and can determine changes in the heartbeat that may be indicative of a health issue. If such changes (such as a rapid increase or decrease in heart rate that is greater than a threshold change over a predetermined period of time, or an increase in heart rate above a threshold or normal level, or a slowing of the heartbeat below a threshold or normal level) are detected, the system may generate an alert to the driver or passenger or to a remote server or device located remotely from the vehicle. For example, the system may generate an alert to the driver's emergency contact or to a medical facility or service, depending on the severity of the detected changes. Optionally, the system may generate an output that causes an autonomous control system to take over driving of the vehicle, whereby the vehicle may be controlled to stop at the side of the road as soon as possible or to drive to the nearest medical facility.
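For illustration only, the threshold logic described above might take a form like the following sketch, where all threshold values and the returned actions are hypothetical placeholders rather than values taken from this disclosure:

```python
def assess_heart_rate(history_bpm, low_bpm=40.0, high_bpm=150.0,
                      max_swing_bpm=30.0):
    """Classify a sliding window of heart-rate estimates (oldest first).

    Returns 'ok', 'alert' (e.g., warn the driver, a passenger, or a
    remote service), or 'handoff' (e.g., request that a vehicle control
    system take over and stop the vehicle). Thresholds are placeholders.
    """
    if not history_bpm:
        return "ok"
    current = history_bpm[-1]
    swing = max(history_bpm) - min(history_bpm)  # change over the window
    if current < low_bpm or current > high_bpm:
        return "handoff"
    if swing > max_swing_bpm:
        return "alert"
    return "ok"
```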
The driver monitoring system may utilize aspects of head and face direction and position tracking systems and/or eye tracking systems and/or gesture recognition systems. Such head and face direction and/or position tracking systems and/or eye tracking systems and/or gesture recognition systems may utilize aspects of the systems described in U.S. Publication Nos. US-2017-0274906; US-2016-0137126; US-2015-0352953; US-2015-0296135; US-2015-0294169; US-2015-0232030; US-2015-0022664; US-2015-0015710; US-2015-0009010 and/or US-2014-0336876, which are hereby incorporated herein by reference in their entireties.
The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.
For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 9,233,641; 9,146,898; 9,174,574; 9,090,234; 9,077,098; 8,818,042; 8,886,401; 9,077,962; 9,068,390; 9,140,789; 9,092,986; 9,205,776; 8,917,169; 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or U.S. Publication Nos. US-2014-0340510; US-2014-0313339; US-2014-0347486; US-2014-0320658; US-2014-0336876; US-2014-0307095; US-2014-0327774; US-2014-0327772; US-2014-0320636; US-2014-0293057; US-2014-0309884; US-2014-0226012; US-2014-0293042; US-2014-0218535; US-2014-0247354; US-2014-0247355; US-2014-0247352; US-2014-0232869; US-2014-0211009; US-2014-0160276; US-2014-0168437; US-2014-0168415; US-2014-0160291; US-2014-0152825; US-2014-0139676; US-2014-0138140; US-2014-0104426; US-2014-0098229; US-2014-0085472; US-2014-0067206; US-2014-0049646; US-2014-0052340; US-2014-0025240; US-2014-0028852; US-2014-005907; US-2013-0314503; US-2013-0298866; US-2013-0222593; US-2013-0300869; US-2013-0278769; US-2013-0258077; US-2013-0242099; US-2013-0215271; US-2013-0141578 and/or US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO 2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.
The imaging device and control and image processor and any associated illumination source, if applicable, may comprise any suitable components, and may utilize aspects of the cameras (such as various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like) and vision systems described in U.S. Pat. Nos. 5,760,962; 5,715,093; 6,922,292; 6,757,109; 6,717,610; 6,590,719; 6,201,642; 5,796,094; 6,559,435; 6,831,261; 6,822,563; 6,946,978; 7,720,580; 8,542,451; 7,965,336; 7,480,149; 5,877,897; 6,498,620; 5,670,935; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,937,667; 7,123,168; 7,004,606; 7,038,577; 6,353,392; 6,320,176; 6,313,454 and/or 6,824,281, and/or International Publication Nos. WO 2009/036176; WO 2009/046268; WO 2010/099416; WO 2011/028686 and/or WO 2013/016409, and/or U.S. Publication Nos. US-2010-0020170 and/or US-2009-0244361, which are all hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application is a continuation of U.S. patent application Ser. No. 17/929,423, filed Sep. 2, 2022, now U.S. Pat. No. 11,618,454, which is a continuation of U.S. patent application Ser. No. 16/946,848, filed Jul. 9, 2020, now U.S. Pat. No. 11,433,906, which claims priority of U.S. provisional application Ser. No. 62/872,779, filed Jul. 11, 2019, which is hereby incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5550677 | Schofield et al. | Aug 1996 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5949331 | Schofield et al. | Sep 1999 | A |
6243015 | Yeo | Jun 2001 | B1 |
6485081 | Bingle et al. | Nov 2002 | B1 |
6621411 | McCarthy et al. | Sep 2003 | B2 |
6762676 | Teowee et al. | Jul 2004 | B2 |
8063786 | Manotas, Jr. | Nov 2011 | B2 |
8258932 | Wahlstrom | Sep 2012 | B2 |
9377852 | Shapiro et al. | Jun 2016 | B1 |
9750420 | Agrawal et al. | Sep 2017 | B1 |
9988055 | O'Flaherty et al. | Jun 2018 | B1 |
11433906 | Lu | Sep 2022 | B2 |
11618454 | Lu | Apr 2023 | B2 |
20070055164 | Huang | Mar 2007 | A1 |
20070257804 | Gunderson | Nov 2007 | A1 |
20090156904 | Shen | Jun 2009 | A1 |
20090273487 | Ferro | Nov 2009 | A1 |
20110018739 | Dehais | Jan 2011 | A1 |
20120150387 | Watson | Jun 2012 | A1 |
20130070043 | Geva et al. | Mar 2013 | A1 |
20140152792 | Krueger | Jun 2014 | A1 |
20140167967 | He et al. | Jun 2014 | A1 |
20140306814 | Ricci | Oct 2014 | A1 |
20140336876 | Gieseke et al. | Nov 2014 | A1 |
20150009010 | Biemer | Jan 2015 | A1 |
20150015710 | Tiryaki | Jan 2015 | A1 |
20150022664 | Pflug et al. | Jan 2015 | A1 |
20150232030 | Bongwald | Aug 2015 | A1 |
20150258892 | Wu | Sep 2015 | A1 |
20150294169 | Zhou et al. | Oct 2015 | A1 |
20150296135 | Wacquant et al. | Oct 2015 | A1 |
20150352953 | Koravadi | Dec 2015 | A1 |
20160090097 | Grube et al. | Mar 2016 | A1 |
20160137126 | Fursich et al. | May 2016 | A1 |
20170105104 | Ulmansky et al. | Apr 2017 | A1 |
20170274906 | Hassan et al. | Sep 2017 | A1 |
20170311831 | Freer et al. | Nov 2017 | A1 |
20170337438 | el Kaliouby, Jr. et al. | Nov 2017 | A1 |
20170367590 | Sebe et al. | Dec 2017 | A1 |
20200143560 | Lu et al. | May 2020 | A1 |
20200163560 | Chang et al. | May 2020 | A1 |
20200214614 | Rundo et al. | Jul 2020 | A1 |
20200283001 | Kulkarni | Sep 2020 | A1 |
Number | Date | Country | |
---|---|---|---|
20230242124 A1 | Aug 2023 | US |
Number | Date | Country | |
---|---|---|---|
62872779 | Jul 2019 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17929423 | Sep 2022 | US |
Child | 18194703 | US | |
Parent | 16946848 | Jul 2020 | US |
Child | 17929423 | US |