The present invention relates generally to a vehicle vision system for a vehicle and, more particularly, to a vehicle vision system that controls one or more lights at a vehicle.
Enteramine was discovered in the 1930s and serotonin in the 1940s as hormones in the human body. These were later recognized as being responsible for, or at least involved in, several body functions as neurotransmitters. Melatonin and cortisol were identified as two counteracting hormones of the human circadian clock (see
In 2001, a new eye photoreceptor was found: the intrinsically photosensitive retinal ganglion cells (ipRGCs), which make up about two percent of a human eye's ganglion cells (besides the rods and cones) and were found to be the receptor responsible for entrainment of the circadian clock. Both light intensity and light color were found to influence a human's hormone release (
A similar correlation was found for users of self-illuminated e-reader devices, which were found to negatively affect sleep, circadian timing and next-morning alertness. Companies such as f.lux came up with a similar approach for handheld devices and desktop monitors, all of which have self-illuminated displays in common. The f.lux software tunes the light color to warmer temperatures in the evening, overnight and in the morning, while tuning the light color to colder temperatures over the day in a sinusoidal-like time scheme, irrespective of the surrounding light (see
The present invention provides a driver assistance system or light control system for a vehicle that utilizes a control and one or more lights of the vehicle (such as interior lights of the vehicle) and controls the lights to provide a desired or selected color based on at least one of (i) a driver attentiveness input and (ii) a driver age input, and optionally a time of day input, an ambient light input, a weather condition input, and/or a navigation input. For example, the system and control may adjust the color of the vehicle interior lights (and/or of a display, such as the backlighting color of a display, to change the overall color scheme of the display) to accommodate a change in ambient light at the vehicle, the time of day of driving of the vehicle, an estimated arrival time for an input destination of a navigation system of the vehicle, or the like. The system thus provides a desired or appropriate color scheme of the interior lights and/or display (and optionally the headlights of the vehicle as well) to enhance the attentiveness or comfort of the driver while the driver is driving the vehicle.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes a lighting system 12 that includes a controller or control 14 for controlling at least one light 16 of the vehicle (with the light 16 comprising an interior light or lights of the vehicle and optionally the system may include one or more exterior lights of the vehicle). Optionally, a cabin monitor or driver monitor sensor 18 (such as one or more interior viewing cameras or the like) may be provided to capture image data representative of the driver's head and face and eyes, whereby the control may adjust the light 16 responsive to image processing of captured image data. The control 14 may also control a display 20 (such as a display screen or a head up display or the like) that displays information or images for viewing by the driver of the vehicle. The data transfer or signal communication from the sensor to the control or from the control to the display or lights may comprise any suitable data or communication link, such as a vehicle network bus or the like of the equipped vehicle.
Automotive displays, wearables and head up displays, as well as the ambient vehicle interior lighting, are nowadays finely tunable in color temperature. Vehicle headlights are typically not tunable in color, with the color of the headlights depending on the light source that is installed: for example, xenon lights burn hotter and are thereby brighter, with more blue (so called ‘colder’ light) components, compared to a conventional light bulb, which glows less hot and has more red (so called ‘warmer’ light) components.
The present invention provides a lighting control system for supporting the vehicle driver's circadian timing, with the control controlling the vehicle's instrument backlights and displays, wearables and head up displays to tune them in intensity and light color in accordance with the current local time, such as by tuning to a lower brightness and a color temperature of around 3,000 K the later it is in the evening or the earlier it is in the morning, and by tuning to a color temperature of up to 6,500 K and a higher brightness around noon (see
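For example, and by way of a non-limiting illustrative sketch only (the sinusoidal interpolation, the function names and the relative brightness range below are assumptions; only the approximately 3,000 K to 6,500 K endpoints follow the example values above), such a time scheme may map the current local time to a target color temperature and brightness as follows:

```python
import math
from datetime import datetime

def target_color_temperature(local_time: datetime,
                             min_cct_k: float = 3000.0,
                             max_cct_k: float = 6500.0) -> float:
    """Target correlated color temperature (CCT) in Kelvin.

    Sinusoidal-like time scheme: coolest (bluest) around noon,
    warmest in the late evening and early morning.
    """
    hours = local_time.hour + local_time.minute / 60.0
    # Cosine peaks at 12:00 (noon) and bottoms out at 0:00 (midnight).
    blend = (math.cos((hours - 12.0) / 24.0 * 2.0 * math.pi) + 1.0) / 2.0
    return min_cct_k + blend * (max_cct_k - min_cct_k)

def target_brightness(local_time: datetime,
                      min_level: float = 0.3,
                      max_level: float = 1.0) -> float:
    """Relative brightness level (0..1): dimmer at night, brighter around noon."""
    hours = local_time.hour + local_time.minute / 60.0
    blend = (math.cos((hours - 12.0) / 24.0 * 2.0 * math.pi) + 1.0) / 2.0
    return min_level + blend * (max_level - min_level)

# Example: at about 21:30 the instrument and display lighting is tuned warm and dim.
print(target_color_temperature(datetime(2016, 2, 1, 21, 30)))  # ~3,360 K
print(target_brightness(datetime(2016, 2, 1, 21, 30)))         # ~0.37
```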
Optionally, the interior light sources of the vehicle may also be tuned in intensity and light color in accordance with the current local time. Optionally, the amount of indirect light within the vehicle cabin may also be tuned in intensity and light color in accordance with the current local time. Optionally, the display and ambient lighting may be tuned to warmer light tones on days on which the light outside of the vehicle is colder, such as on rainy days or the like, for making the driver feel more cozy or comfortable, and may be tuned to a more bluish color on bright sunshine days. Optionally, the control may tune the display of the vehicle to colder light tones on days on which the light outside of the vehicle is colder (the opposite of the previous option), such as on rainy days or the like, and may tune the display to more reddish colors on bright sunshine days, for making displayed colors or white look more authentic (subjectively).
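By way of a non-limiting illustrative sketch only (the offset magnitude and the names below are assumptions), the two opposite options above may be expressed as a signed color-temperature offset applied on top of the time-of-day target, with its sign selected by whether the goal is driver comfort or authentic color reproduction:

```python
from enum import Enum

class Sky(Enum):
    SUNNY = "sunny"
    OVERCAST = "overcast"   # rainy day / colder outside light

class Goal(Enum):
    COMFORT = "comfort"       # oppose the cold outside light with warmer tones
    AUTHENTIC = "authentic"   # follow the outside light so displayed white looks authentic

def weather_cct_offset_k(sky: Sky, goal: Goal) -> float:
    """Signed offset in Kelvin added to the time-of-day color temperature.
    Positive = colder/bluer, negative = warmer/redder; 500 K is a placeholder."""
    if goal is Goal.COMFORT:
        return -500.0 if sky is Sky.OVERCAST else 500.0
    return 500.0 if sky is Sky.OVERCAST else -500.0
```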
Optionally, color tunable vehicle headlights may come into use, such as LED headlights that can be made as RGB LEDs similar to household illumination LEDs, tunable in light color and dimmable in light intensity. In most practical cases, the light intensity may not be tuned for the driver's comfort but rather for adapting the high beam and providing spot lighting. In those cases, just the light color may be tuned in accordance with the current local time.
Alternatively to the time scheme above, the vehicle user may have the option to tune the time scheme individually. This may be beneficial to a shift worker. For example, such a shift worker may have to start work late and sleep during the daytime, and thus he or she may prefer to have colder colors at the vehicle displays (and optionally also at the ambient lighting and headlights) when driving to work in the evening and warmer lights when driving home in the early morning. Another example may be a user flying in from another time zone who prefers to keep to his or her own time scheme, as flight attendants do, instead of adapting to the local time.
Alternatively to the time scheme above, the vehicle may have a driver drowsiness assistant. Since warmer colors support the driver's relaxation and sleepiness, warm colors may be counterproductive for drivers who want to stay awake. A vehicle may generally follow a time scheme according to the time of day, but may stretch the bright light and cold light color time interval in situations where the driver has planned a road trip extending into night time (and the system may provide such adjustment of the time scheme responsive to a navigation system of the vehicle when the driver has input a route or destination that requires driving into nighttime).
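A non-limiting illustrative sketch of such a stretched interval (the names below are assumptions; the predicted arrival time would come from the vehicle's navigation system) is:

```python
from datetime import datetime

def cold_light_interval_end(scheme_end: datetime,
                            predicted_arrival: datetime) -> datetime:
    """Return when the bright / cold-color interval should end.

    Normally the daytime scheme ends at `scheme_end`; if the navigation
    system predicts arrival after that time, the interval is stretched so
    that warm, relaxing colors are not shown while the driver still needs
    to stay awake.
    """
    return max(scheme_end, predicted_arrival)

# The daytime scheme would normally end at 20:00, but the planned route
# ends at 23:15, so the bright / cold-color interval is stretched until arrival.
end = cold_light_interval_end(datetime(2016, 2, 1, 20, 0),
                              datetime(2016, 2, 1, 23, 15))
print(end)  # 2016-02-01 23:15:00
```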
Optionally, sophisticated systems may have a bot on a cell phone or in the cloud, mainly based on the user's calendar and optionally or additionally based on artificial intelligence (AI), for detecting the driver's long term habits and short term duties. The system may use such a bot for analyzing and predicting whether a vehicle ride may last into earlier or later night time, so that the system can tune or adjust the drowsiness assistant's parameter(s) accordingly. Optionally, the drowsiness assistant may employ drowsiness sensors, such as eye lid detection. By these, the drowsiness assistant may detect close-to-sleep or short sleep events, upon which the driver is warned that he or she is drowsy. Optionally, the drowsiness assistant artificial intelligence system may employ a (long duration) reinforcement feedback learning process of any kind, which may learn the typical point at which the driver becomes drowsy when driving long and/or at evening or night time. The drowsiness assistant may learn to tune the extended bright light and cold light color time interval such that the driver is expected or predicted to be just arriving at the trip destination (according to the vehicle's or a remotely attached navigation device, a smart phone with navigation application, a personal calendar bot, or the drowsiness assistant AI), so that the driver's post-trip sleep is affected as little as possible, but he or she does not need to break the trip due to sleepiness (see
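As a non-limiting, simplified sketch of such a learning process (the exponential moving average, the names and the smoothing constant below are assumptions standing in for a reinforcement or feedback learning process of any kind), the system may track after how many minutes of driving drowsiness events typically occur and limit the cold-light extension accordingly:

```python
class DrowsinessOnsetModel:
    """Learn, over many trips, after how many minutes of driving the driver
    typically shows eyelid-based drowsiness events; a simple exponential
    moving average stands in here for the learning process."""

    def __init__(self, initial_minutes: float = 180.0, alpha: float = 0.2):
        self.typical_onset_min = initial_minutes
        self.alpha = alpha

    def update(self, minutes_until_first_drowsy_event: float) -> None:
        """Feed back the observed drowsiness onset time from the last trip."""
        self.typical_onset_min = ((1.0 - self.alpha) * self.typical_onset_min
                                  + self.alpha * minutes_until_first_drowsy_event)

    def cold_light_extension_min(self, remaining_trip_minutes: float) -> float:
        """Only extend the bright / cold-color interval for the portion of
        the trip during which drowsiness is actually expected."""
        return max(0.0, remaining_trip_minutes - self.typical_onset_min)
```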
Optionally, and such as for any or all of the solutions above, the vehicle's light scheme may be adapted to the driver's age, since, with aging, the transmittance of the human eye for shorter wavelengths diminishes more strongly than its transmittance for longer wavelengths (see
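As a non-limiting illustrative sketch (the linear rate and reference age below are placeholder assumptions), the age adaptation may shift the target color temperature slightly colder for older drivers to compensate for the reduced short-wavelength transmittance of the aging eye:

```python
def age_compensated_cct(base_cct_k: float, driver_age_years: int,
                        reference_age: int = 30,
                        kelvin_per_year: float = 15.0) -> float:
    """Shift the target color temperature slightly colder (bluer) for older
    drivers; the linear rate `kelvin_per_year` is a placeholder value."""
    extra_years = max(0, driver_age_years - reference_age)
    return base_cct_k + extra_years * kelvin_per_year

# e.g. a 60-year-old driver: a 4,000 K base target becomes 4,450 K
print(age_compensated_cct(4000.0, 60))
```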
As another aspect of the invention concerning head up displays, since the head up display's content is augmented or overlaid onto a real image (from behind the HUD's combiner or windshield, according to the HUD type), the displayed content may be too bright or too dim relative to the real background scene. Knowing the driver's eye positions (such as by using an eye tracker for tracking the driver's eyes), the position of the virtual HUD image, and the scene in front of the vehicle (by using a forwardly viewing vehicle camera), the system can determine how bright the background behind the HUD image is along the line of sight from the driver's eyes, and which color and texture it has. The system may adapt the brightness, and optionally also the color, at individual positions of the displayed HUD content to enhance contrast and evenness and to cope with cluttering due to the local background brightness and color.
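A non-limiting illustrative sketch of such per-position adaptation (the luminance maps, the minimum contrast ratio and the names below are assumptions; the background map would be sampled from the forward camera along the driver's line of sight) is:

```python
import numpy as np

def adapt_hud_brightness(hud_luma: np.ndarray,
                         background_luma: np.ndarray,
                         min_contrast: float = 1.5,
                         max_level: float = 1.0) -> np.ndarray:
    """Per-pixel (or per-region) brightness adaptation of HUD content.

    `hud_luma` and `background_luma` are relative luminance maps (0..1) of
    the HUD content and of the background scene behind the virtual image.
    The HUD luminance is raised where the background is bright so that a
    minimum contrast ratio is maintained, and is clipped to the display's
    maximum output level.
    """
    required = np.clip(background_luma * min_contrast, 0.0, max_level)
    return np.maximum(hud_luma, required)
```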
Thus, the system of the present invention provides for automatic adjustment of the color and intensity of a light of a vehicle (such as an interior cabin light and/or display backlight and/or the like) responsive to one or more inputs associated with the driver and/or the driving conditions and/or the like. The system, responsive to an input indicative of the driver's age and/or attentiveness, automatically adjusts the color and/or intensity of the vehicle lights to provide enhanced lighting for that particular driver at that time. The system may also or otherwise automatically adjust the vehicle lighting responsive to other inputs associated with the time of day or ambient lighting or weather, in order to provide enhanced lighting for the driver during the particular current driving conditions. The system may also or otherwise automatically adjust the vehicle lighting responsive to navigation input indicative of a planned trip such that the vehicle lighting provides appropriate lighting for the driver during the driving event. The system may also or otherwise automatically adjust the vehicle lighting responsive to a user input that allows the user or driver of the vehicle to select a color scheme or color change so that the system automatically adjusts the vehicle lighting at desired or selected times or during desired or selected events and/or the like.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an image processing chip selected from the EyeQ family of image processing chips available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
The display 20 is operable to display information and/or video images for viewing by the driver of the vehicle, with the control adjusting the backlighting of the display or the color of the display according to the time of day, age of the driver, weather conditions, navigation input, and/or the like. The display may utilize aspects of the display systems described in U.S. Pat. No. 8,427,751 and/or U.S. Publication Nos. US-2014-0333729; US-2014-0139676; US-2015-0092042; US-2015-0232030 and/or US-2016-0209647, which are all hereby incorporated herein by reference in their entireties. Optionally, the system (utilizing a forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2010/099416; WO 2011/028686; WO 2012/075250; WO 2013/019795; WO 2012/075250; WO 2012/145822; WO 2013/081985; WO 2013/086249 and/or WO 2013/109869, and/or U.S. Publication No. US-2012-0162427, which are hereby incorporated herein by reference in their entireties.
Optionally, for example, the system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or International Publication Nos. WO 2011/028686; WO 2010/099416; WO 2012/061567; WO 2012/068331; WO 2012/075250; WO 2012/103193; WO 2012/0116043; WO 2012/0145313; WO 2012/0145501; WO 2012/145818; WO 2012/145822; WO 2012/158167; WO 2012/075250; WO 2012/0116043; WO 2012/0145501; WO 2012/154919; WO 2013/019707; WO 2013/016409; WO 2013/019795; WO 2013/067083; WO 2013/070539; WO 2013/043661; WO 2013/048994; WO 2013/063014, WO 2013/081984; WO 2013/081985; WO 2013/074604; WO 2013/086249; WO 2013/103548; WO 2013/109869; WO 2013/123161; WO 2013/126715; WO 2013/043661; WO 2013/158592 and/or WO 2014/204794, which are hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application claims the filing benefits of U.S. provisional application Ser. No. 62/289,443, filed Feb. 1, 2016, which is hereby incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5172785 | Takahashi | Dec 1992 | A |
5550677 | Schofield et al. | Aug 1996 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5760962 | Schofield et al. | Jun 1998 | A |
5786772 | Schofield et al. | Jul 1998 | A |
5796094 | Schofield et al. | Aug 1998 | A |
5877897 | Schofield et al. | Mar 1999 | A |
5929786 | Schofield et al. | Jul 1999 | A |
5949331 | Schofield et al. | Sep 1999 | A |
6201642 | Bos | Mar 2001 | B1 |
6302545 | Schofield et al. | Oct 2001 | B1 |
6396397 | Bos et al. | May 2002 | B1 |
6498620 | Schofield et al. | Dec 2002 | B2 |
6523964 | Schofield et al. | Feb 2003 | B2 |
6536928 | Hein | Mar 2003 | B1 |
6587573 | Stam | Jul 2003 | B1 |
6611202 | Schofield et al. | Aug 2003 | B2 |
6690268 | Schofield et al. | Feb 2004 | B2 |
6717610 | Bos et al. | Apr 2004 | B1 |
6757109 | Bos | Jun 2004 | B2 |
6802617 | Schofield et al. | Oct 2004 | B2 |
6806452 | Bos et al. | Oct 2004 | B2 |
6822563 | Bos et al. | Nov 2004 | B2 |
6882287 | Schofield | Apr 2005 | B2 |
6891563 | Schofield et al. | May 2005 | B2 |
6946978 | Schofield | Sep 2005 | B2 |
7005974 | McMahon et al. | Feb 2006 | B2 |
7038577 | Pawlicki et al. | May 2006 | B2 |
7720580 | Higgins-Luthman | May 2010 | B2 |
7855755 | Weller et al. | Dec 2010 | B2 |
7859565 | Schofield et al. | Dec 2010 | B2 |
7881496 | Camilleri et al. | Feb 2011 | B2 |
8427751 | Rumpf et al. | Apr 2013 | B2 |
8694224 | Chundrlik, Jr. et al. | Apr 2014 | B2 |
9041806 | Baur et al. | May 2015 | B2 |
9126525 | Lynam et al. | Sep 2015 | B2 |
9210761 | Nackaerts | Dec 2015 | B2 |
9357208 | Gupta et al. | May 2016 | B2 |
9596387 | Achenbach et al. | Mar 2017 | B2 |
9762880 | Pflug | Sep 2017 | B2 |
9900522 | Lu | Feb 2018 | B2 |
20070282522 | Geelen | Dec 2007 | A1 |
20080065291 | Breed | Mar 2008 | A1 |
20090010494 | Bechtel | Jan 2009 | A1 |
20090273563 | Pryor | Nov 2009 | A1 |
20090292528 | Kameyama | Nov 2009 | A1 |
20100045797 | Schofield | Feb 2010 | A1 |
20110084852 | Szczerba | Apr 2011 | A1 |
20110090149 | Larsen | Apr 2011 | A1 |
20110178670 | Perkins | Jul 2011 | A1 |
20110241545 | Miller | Oct 2011 | A1 |
20120033123 | Inoue | Feb 2012 | A1 |
20120116632 | Bechtel | May 2012 | A1 |
20120162427 | Lynam | Jun 2012 | A1 |
20120209358 | Feng | Aug 2012 | A1 |
20130116859 | Ihlenburg | May 2013 | A1 |
20130286193 | Pflug | Oct 2013 | A1 |
20140036080 | Schut | Feb 2014 | A1 |
20140049973 | Adachi | Feb 2014 | A1 |
20140139676 | Wierich | May 2014 | A1 |
20140152778 | Ihlenburg | Jun 2014 | A1 |
20140152792 | Krueger | Jun 2014 | A1 |
20140226303 | Pasdar | Aug 2014 | A1 |
20140333729 | Pflug | Nov 2014 | A1 |
20140340510 | Ihlenburg et al. | Nov 2014 | A1 |
20150022664 | Pflug | Jan 2015 | A1 |
20150035437 | Panopoulos | Feb 2015 | A1 |
20150092042 | Fursich | Apr 2015 | A1 |
20150174361 | Baaijens | Jun 2015 | A1 |
20150232030 | Bongwald | Aug 2015 | A1 |
20150343945 | Salter | Dec 2015 | A1 |
20150344028 | Gieseke | Dec 2015 | A1 |
20160104486 | Penilla | Apr 2016 | A1 |
20160188987 | Lisseman | Jun 2016 | A1 |
20160209647 | Fursich | Jul 2016 | A1 |
20160267335 | Hampiholi | Sep 2016 | A1 |
Number | Date | Country |
---|---|---|
WO-2010144900 | Dec 2010 | WO |
Entry |
---|
Chang et al., “Evening use of light-emitting eReaders negatively affects sleep, circadian timing, and next-morning alertness,” CrossMark PNAS, Jan. 27, 2015, pp. 1232-1237. |
Number | Date | Country | |
---|---|---|---|
20170217367 A1 | Aug 2017 | US |
Number | Date | Country | |
---|---|---|---|
62289443 | Feb 2016 | US |