The present invention relates generally to a vehicle vision system for a vehicle and, more particularly, to a vehicle vision system that utilizes one or more cameras at a vehicle.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.
The present invention provides a driver assistance system or vision system or imaging system for a vehicle that utilizes one or more cameras (preferably one or more CMOS cameras) to capture image data representative of images exterior of the vehicle, and provides enhanced display of images (such as images captured by one or more exteriorly viewing cameras of the vehicle or images derived from or generated by a navigation system of the vehicle or telematics system of the vehicle or vehicle diagnostics system or other image generating system of the vehicle) for viewing by both eyes of the driver, with each eye viewing a respective displayed or projected image.
According to an aspect of the present invention, a display system of a vehicle includes a camera disposed at the vehicle so as to have a field of view interior of the vehicle that encompasses a head region of a driver of the vehicle. An image processor is operable to process image data captured by the camera to determine a location of both eyes of the driver of the vehicle. A display device is operable to display or project two images, and a first reflector element is adjustable to reflect the displayed or projected images so that each eye of the driver views a respective one of the reflected displayed or projected images.
The first reflector element may comprise a curved reflector element and each of the displayed or projected images may reflect off of a respective area of the curved reflector element. A second reflector element may be provided having two reflective portions for reflecting a respective reflected displayed or projected image from the curved reflector element. The second reflector element may comprise a portion of a windshield of the vehicle.
The first reflector element may be adjustable to adjust the optical path between the display device, the first reflector element and the driver's eyes (to accommodate different drivers and/or movement of the present driver of the vehicle). The display device may comprise two separate display screens, each operable to display images for viewing by a respective eye of the driver of the vehicle.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle vision system and/or driver assist system and/or object detection system and/or alert system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and provide an output to a display device for displaying images representative of the captured image data. Optionally, the vision system may provide display, such as a rearview display or a top down or bird's eye or surround view display or the like.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes at least one exterior facing imaging sensor or camera, such as a rearward facing imaging sensor or camera 14a (and the system may optionally include multiple exterior facing imaging sensors or cameras, such as a forwardly facing camera 14b at the front (or at the windshield) of the vehicle, and a sidewardly/rearwardly facing camera 14c, 14d at respective sides of the vehicle), which captures images exterior of the vehicle, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera.
The system of the present invention includes an interior cabin or driver monitoring system 22, which includes at least one camera 7 disposed so as to have a field of view interior of the vehicle that encompasses a head region of the driver of the vehicle.
Known vehicle windshield head up display systems, as well as combiner head up display systems (whether OEM or aftermarket equipment), have a single head box directed towards the driver's eyes after the light passes reflective elements, such as a curved mirror and the windshield or combiner, respectively.
The display system of the present invention provides a new and inventive system having two small eye boxes, each directed to a respective eye of the vehicle's driver. Thereby, each eye receives an independent image projected by a respective display device. The image pair may have stereoscopic properties, which enables the driver to see the projection in 3D without wearing additional appliances such as a shutter mask, green and red filtered goggles or polarization filtering glasses, such as are known from prior art 3D displays.
To generate two eye boxes, two images have to be generated. There may be two display elements, such as TFT, LED, OLED or laser displays or other suitable display screens or devices, from which each displayed image is directed via its own optical path to one eye only. Both optical paths may use identical reflective elements, such as by sharing the windshield as a reflective surface. Optionally, the first reflective element is shared by the optical paths of both eye images (eye boxes). Optionally, each eye box's optical path has a separate respective first reflective element. As an alternative option, one display may generate two images viewable from two different viewing angles, which may be directed via corresponding mirrors to the respective eye.
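As a minimal sketch of the two-image generation described above (not part of the specification; the function name and disparity value are hypothetical), a stereoscopic pair for the two eye boxes can be derived from a single source image by shifting it horizontally in opposite directions:

```python
# Hypothetical illustration: derive a left/right stereo pair from one
# row-major grayscale image by opposite horizontal shifts (disparity).
def stereo_pair(image, disparity):
    """Return (left, right) views, each shifted by half the disparity
    in opposite directions, zero-padding at the edges."""
    def shift(row, dx):
        if dx >= 0:
            return [0] * dx + row[:len(row) - dx]
        return row[-dx:] + [0] * (-dx)

    half = disparity // 2
    left = [shift(row, half) for row in image]    # content moves right
    right = [shift(row, -half) for row in image]  # content moves left
    return left, right

image = [[1, 2, 3, 4],
         [5, 6, 7, 8]]
left, right = stereo_pair(image, 2)
```

In practice the pair would more likely be rendered from two slightly offset virtual viewpoints; the sketch only shows how two distinct per-eye images arise from one source.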
To continuously track the eye boxes to the driver's eyes, the optical paths may be steerable. The optical paths may be steered by altering the reflective direction of the first mirror element (or mirror elements, where separate elements are used for each path). Optionally, the optical path may be altered by shifting and turning the display elements. Optionally, the optical paths may be altered by turning the second mirror element (the windshield itself may not be turnable), such as by turning and shifting the combiner. In some of these solutions, it may be necessary to rotate the source images electronically, or to rotate the display element or elements, to prevent rotation of the viewed image when the reflective elements or displays are turned for tracking the driver's eyes.
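The mirror-steering step can be illustrated with a small geometric sketch (an assumption for illustration only; the 2D simplification, coordinates and function name are hypothetical): by the law of reflection, the reflector's normal must bisect the reversed incoming ray from the display and the outgoing ray toward the tracked eye, so the required orientation follows directly from the display, mirror and eye positions:

```python
import math

def mirror_normal(display_pos, mirror_pos, eye_pos):
    """Unit normal of a flat mirror at mirror_pos that reflects a ray
    arriving from display_pos toward eye_pos (2D law of reflection)."""
    def unit(v):
        n = math.hypot(v[0], v[1])
        return (v[0] / n, v[1] / n)

    # Unit direction of the incoming ray (display -> mirror)
    d_in = unit((mirror_pos[0] - display_pos[0],
                 mirror_pos[1] - display_pos[1]))
    # Unit direction of the outgoing ray (mirror -> eye)
    d_out = unit((eye_pos[0] - mirror_pos[0],
                  eye_pos[1] - mirror_pos[1]))
    # d_out - d_in points along the bisector of the reversed incoming
    # and the outgoing directions, i.e. along the mirror normal
    return unit((d_out[0] - d_in[0], d_out[1] - d_in[1]))

# Display behind the mirror along x, eye above the mirror: a 90-degree fold
n = mirror_normal((0.0, 0.0), (1.0, 0.0), (1.0, 1.0))
```

When the tracked eye position moves, recomputing the normal gives the new target orientation for the first mirror element; a curved reflector adds a position-dependent local normal, but the bisection principle is the same.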
To track the viewer's eyes, the vehicle may have an eye tracking system (optionally also tracking the head), such as by utilizing aspects of the systems described in U.S. Pat. No. 7,914,187 and/or U.S. Publication Nos. US-2016-0137126; US-2015-0352953; US-2015-0296135; US-2015-0294169; US-2015-0232030; US-2015-0022664; US-2015-0015710; US-2015-0009010; US-2015-0158499 and/or US-2014-0336876, incorporated above. The eye tracking system's camera or cameras may be installed in front of or frontal-sideward of the driver for detecting the driver's eyes directly, or may be installed in a hidden area for indirectly imaging and detecting the eyes via one or more reflective elements. Optionally, the reflective elements may be identical to those of the head up display. Optionally, one eye tracking camera may be positioned centered between the display elements of the head up display. Optionally, the eye tracking cameras are positioned left and right of the display elements at the same height as the display elements.
As shown in the drawings, the reflector element 4 is adjustable to adjust the angle of reflection of the displayed images and thus to adjust the optical path between the display devices 6, the reflector element 4, the second reflector element or windshield 2 and the eye boxes 1. The reflector element 4 is adjustable via an actuator system 8 that pivots or moves the reflector element about a horizontal axis (so as to vertically adjust the reflection angle to accommodate drivers of different heights) and/or about a vertical axis (so as to horizontally adjust the reflection angle) and/or about multiple axes (such as via a ball and socket mount) to provide the appropriate reflection angle and optical path for the particular driver's eye locations.
The camera 7 captures image data and the image processor processes the captured image data to detect and track the driver's eyes (generally at the eye boxes 1), and the first reflective element or curved mirror 4 may be adjusted to adjust the display/projection and reflection of the displayed images to direct the displayed images to the respective eye boxes. The mirror 4 may be adjusted about multiple axes to adjust the optical paths as the driver's eye positions may change vertically or horizontally (such as due to a change to a different driver of different height or responsive to the driver moving his or her head sidewardly or vertically). Thus, the present invention projects images to each eye of the driver independently of the other eye. The present invention thus provides for enhanced display of images, such as three dimensional images, for viewing by the driver of the vehicle while the driver is normally operating the vehicle and viewing through the windshield of the vehicle.
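The tracking-and-adjustment behavior described above amounts to a closed control loop, sketched below as an assumption for illustration (the gain, degree units and actuator interface are hypothetical, not from the specification): the detected offset of the eye boxes from the driver's eyes is fed back as a proportional correction to the mirror actuator, halved because rotating a mirror deflects the reflected ray by twice the rotation angle:

```python
# Hypothetical illustration: proportional correction of the reflector's
# (pan, tilt) orientation toward the tracked eye position.
def update_actuator(angles, eye_offset, gain=0.5):
    """angles     -- current (pan, tilt) of the reflector, degrees
    eye_offset -- (dx, dy) reflection error of the detected eye-box
                  center relative to the driver's eyes, degrees"""
    pan, tilt = angles
    dx, dy = eye_offset
    # A mirror rotation of a moves the reflected ray by 2a, so apply
    # half the measured error per step (scaled by the loop gain).
    return (pan + gain * dx / 2.0, tilt + gain * dy / 2.0)

angles = (0.0, 0.0)
for _ in range(20):
    # Simulated plant: a constant eye offset of (4, -2) degrees that the
    # reflected ray reduces by twice the accumulated mirror rotation
    error = (4.0 - 2.0 * angles[0], -2.0 - 2.0 * angles[1])
    angles = update_actuator(angles, error)
```

The loop converges to the orientation at which the eye boxes coincide with the tracked eyes (here pan approaches 2 degrees and tilt approaches -1 degree); a production system would add limits, filtering of the eye tracker output and the image counter-rotation noted above.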
The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an image processing chip selected from the EyeQ family of image processing chips available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. Preferably, the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO/2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. Publication No. US-2012-0062743, which are hereby incorporated herein by reference in their entireties.
The imaging device and control and image processor and any associated illumination source, if applicable, may comprise any suitable components, and may utilize aspects of the cameras (such as various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like) and vision systems described in U.S. Pat. Nos. 5,760,962; 5,715,093; 6,922,292; 6,757,109; 6,717,610; 6,590,719; 6,201,642; 5,796,094; 6,559,435; 6,831,261; 6,822,563; 6,946,978; 7,720,580; 8,542,451; 7,965,336; 7,480,149; 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,937,667; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454 and/or 6,824,281, and/or International Publication Nos. WO 2009/036176; WO 2009/046268; WO 2010/099416; WO 2011/028686 and/or WO 2013/016409, and/or U.S. Pat. Publication Nos. US 2010-0020170 and/or US-2009-0244361, which are all hereby incorporated herein by reference in their entireties.
The camera module and circuit chip or board and imaging sensor may be implemented and operated in connection with various vehicular vision-based systems, and/or may be operable utilizing the principles of such other vehicular systems, such as a vehicle headlamp control system, such as the type disclosed in U.S. Pat. Nos. 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 7,004,606; 7,339,149 and/or 7,526,103, which are all hereby incorporated herein by reference in their entireties, a rain sensor, such as the types disclosed in commonly assigned U.S. Pat. Nos. 6,353,392; 6,313,454; 6,320,176 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties, a vehicle vision system, such as a forwardly, sidewardly or rearwardly directed vehicle vision system utilizing principles disclosed in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,877,897; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978 and/or 7,859,565, which are all hereby incorporated herein by reference in their entireties, a trailer hitching aid or tow check system, such as the type disclosed in U.S. Pat. No. 7,005,974, which is hereby incorporated herein by reference in its entirety, a reverse or sideward imaging system, such as for a lane change assistance system or lane departure warning system or for a blind spot or object detection system, such as imaging or detection systems of the types disclosed in U.S. Pat. Nos. 7,881,496; 7,720,580; 7,038,577; 5,929,786 and/or 5,786,772, which are hereby incorporated herein by reference in their entireties, a video device for internal cabin surveillance and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962; 5,877,897; 6,690,268 and/or 7,370,983, and/or U.S. Publication No. 
US-2006-0050018, which are hereby incorporated herein by reference in their entireties, a traffic sign recognition system, a system for determining a distance to a leading or trailing vehicle or object, such as a system utilizing the principles disclosed in U.S. Pat. Nos. 6,396,397 and/or 7,123,168, which are hereby incorporated herein by reference in their entireties, and/or the like.
Optionally, the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device, such as by utilizing aspects of the video display systems described in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187; 6,690,268; 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,677,851; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,508; 6,222,460; 6,513,252 and/or 6,642,851, and/or U.S. Publication Nos. US-2012-0162427; US-2006-0050018 and/or US-2006-0061008, which are all hereby incorporated herein by reference in their entireties. Optionally, the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or bird's-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2010/099416; WO 2011/028686; WO 2012/075250; WO 2013/019795; WO 2012/075250; WO 2012/145822; WO 2013/081985; WO 2013/086249 and/or WO 2013/109869, and/or U.S. Publication No. US-2012-0162427, which are hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application claims the filing benefits of U.S. provisional application Ser. No. 62/260,759 filed Nov. 30, 2015, which is hereby incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5550677 | Schofield et al. | Aug 1996 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5949331 | Schofield et al. | Sep 1999 | A |
7043056 | Edwards et al. | May 2006 | B2 |
7331671 | Hammoud | Feb 2008 | B2 |
7460693 | Loy et al. | Dec 2008 | B2 |
7572008 | Elvesjo et al. | Aug 2009 | B2 |
7653213 | Longhurst et al. | Jan 2010 | B2 |
7914187 | Higgins-Luthman et al. | Mar 2011 | B2 |
8066375 | Skogo et al. | Nov 2011 | B2 |
8120577 | Bouvin et al. | Feb 2012 | B2 |
8165347 | Heinzmann et al. | Apr 2012 | B2 |
8185845 | Bjorklund et al. | May 2012 | B2 |
8220926 | Blixt et al. | Jul 2012 | B2 |
8314707 | Kobetski et al. | Nov 2012 | B2 |
8339446 | Blixt et al. | Dec 2012 | B2 |
8342687 | Blixt et al. | Jan 2013 | B2 |
8562136 | Blixt et al. | Oct 2013 | B2 |
8610768 | Holmberg et al. | Dec 2013 | B2 |
10017114 | Bongwald | Jul 2018 | B2 |
20030169907 | Edwards et al. | Sep 2003 | A1 |
20040193371 | Koshiji et al. | Sep 2004 | A1 |
20060274973 | Mohamed et al. | Dec 2006 | A1 |
20070014916 | Daniels | Jan 2007 | A1 |
20070297692 | Hamatani et al. | Dec 2007 | A1 |
20080077882 | Kramer et al. | Mar 2008 | A1 |
20090304232 | Tsukizawa | Dec 2009 | A1 |
20100097580 | Yamamoto et al. | Apr 2010 | A1 |
20120093358 | Tschirhart | Apr 2012 | A1 |
20120154591 | Baur et al. | Jun 2012 | A1 |
20120224062 | Lacoste | Sep 2012 | A1 |
20130050258 | Liu et al. | Feb 2013 | A1 |
20130229523 | Higgins-Luthman et al. | Sep 2013 | A1 |
20140062946 | Graumann et al. | Mar 2014 | A1 |
20140063359 | Chen | Mar 2014 | A1 |
20140072230 | Ruan et al. | Mar 2014 | A1 |
20140139655 | Mimar | May 2014 | A1 |
20140218529 | Mahmoud et al. | Aug 2014 | A1 |
20140232746 | Ro | Aug 2014 | A1 |
20140247352 | Rathi et al. | Sep 2014 | A1 |
20140300739 | Mimar | Oct 2014 | A1 |
20140300830 | Wang | Oct 2014 | A1 |
20140336876 | Gieseke et al. | Nov 2014 | A1 |
20150009010 | Biemer | Jan 2015 | A1 |
20150022664 | Pflug et al. | Jan 2015 | A1 |
20150092042 | Fursich | Apr 2015 | A1 |
20150145995 | Shahraray et al. | May 2015 | A1 |
20150156383 | Biemer et al. | Jun 2015 | A1 |
20150158499 | Koravadi | Jun 2015 | A1 |
20150185834 | Wingrove et al. | Jul 2015 | A1 |
20150232030 | Bongwald | Aug 2015 | A1 |
20150294148 | Mohanakrishnan et al. | Oct 2015 | A1 |
20150294169 | Zhou et al. | Oct 2015 | A1 |
20150296135 | Wacquant et al. | Oct 2015 | A1 |
20160137126 | Fursich et al. | May 2016 | A1 |
20160209647 | Fursich | Jul 2016 | A1 |
Number | Date | Country | |
---|---|---|---|
20170153457 A1 | Jun 2017 | US |
Number | Date | Country | |
---|---|---|---|
62260759 | Nov 2015 | US |