The present invention relates generally to a vehicle vision system for a vehicle and, more particularly, to a vehicle vision system that utilizes one or more cameras at a vehicle.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.
The present invention provides a camera monitoring system for a vehicle that utilizes cameras to capture image data representative of images exterior of the vehicle, and provides a stereographic video display screen for displaying video images for viewing by the driver of the vehicle. The system includes a pair of rearward viewing cameras disposed at the vehicle, with each rearward viewing camera of the pair having a field of view at least rearward of the vehicle, and with the fields of view of the rearward viewing cameras overlapping. The rearward viewing cameras are mounted at the vehicle side by side one another, with the principal axes of their fields of view spaced apart by a known distance. The system also includes a driver-monitoring camera disposed at the vehicle and viewing a head region of a driver of the vehicle. The rearward viewing cameras and the driver-monitoring camera capture image data, and an image processor of a control of the system is operable to process image data captured by the driver-monitoring camera and the rearward viewing cameras. The control, via processing at the control of image data captured by the driver-monitoring camera, determines the location of each eye of the driver of the vehicle. Responsive to processing at the control of image data captured by each rearward viewing camera of the pair, the video display screen displays video images derived at least in part from image data captured by both rearward viewing cameras and provides depth perception to the driver of the vehicle viewing the displayed video images.
The stereographic video display screen provides depth perception by (i) projecting video images derived from image data captured by a left-side rearward viewing camera of the pair of rearward viewing cameras toward the left eye of the driver of the vehicle and (ii) projecting video images derived from image data captured by a right-side rearward viewing camera of the pair of rearward viewing cameras toward the right eye of the driver of the vehicle. The system may also provide for panning of the displayed video images to enhance the driver's view rearward (and/or sideward) of the vehicle when viewing the video display screen. For example, the system may pan or adjust the displayed video images at the stereographic video display screen responsive to determining, via image processing at the control of image data captured by the driver-monitoring camera, that the driver's head (and eyes) has changed position, in order to accommodate the driver's change in viewing angle toward the stereographic video display screen.
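Purely by way of illustration, a minimal sketch of the control flow described above is set forth below. The camera, eye-tracker and display interfaces (capture, locate_eyes, project and the like) are hypothetical placeholder names, not part of the invention as claimed:

```python
# Illustrative sketch only; the camera, eye-tracker, and display objects are
# hypothetical stand-ins for whatever interfaces a production ECU exposes.

def camera_monitoring_frame(left_cam, right_cam, driver_cam,
                            eye_tracker, stereo_display):
    """Process one frame of the stereographic camera-monitoring pipeline."""
    left_frame = left_cam.capture()       # left-side rearward viewing camera
    right_frame = right_cam.capture()     # right-side rearward viewing camera
    driver_frame = driver_cam.capture()   # driver-monitoring camera

    # Determine the location of each eye of the driver from the
    # driver-monitoring camera's image data.
    left_eye, right_eye = eye_tracker.locate_eyes(driver_frame)

    # Project each rearward camera's video toward the matching eye so that
    # the disparity between the two views yields depth perception.
    stereo_display.project(left_frame, target=left_eye)
    stereo_display.project(right_frame, target=right_eye)
```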
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle vision system and/or driver assist system and/or object detection system and/or alert system and/or camera monitoring system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a forward or rearward direction. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and provide an output to a display device for displaying images representative of the captured image data. Optionally, the vision system may provide a display, such as a rearview display or a top-down or bird's-eye or surround-view display or the like.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes a vision system or camera monitoring system 12 that includes two exterior viewing imaging sensors or cameras, such as two rearward viewing imaging sensors or cameras 14a, 14b (and the system may optionally include multiple exterior viewing imaging sensors or cameras, such as a forward viewing camera at the front (or at the windshield) of the vehicle 10, and sideward/rearward viewing cameras at respective sides of the vehicle 10), which capture images exterior of the vehicle 10, with each camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera.
A camera monitoring system (CMS), if using one rearward viewing camera and one display, lacks depth perception. The camera monitoring system 12, by using two rearward viewing cameras 14a, 14b that have similar and overlapping fields of view, that point or view in the same direction, and that are separated by a known distance (such as five inches or less, such as three inches or less, or any suitable distance), can provide or achieve depth perception in the displayed video images. The display device 20 has stereographic characteristics (i.e., it can display or project two images at the same time, with the two images projected toward different target locations, such as the individual eyes of the viewer), and the video images derived from image data captured by the left-side camera 14a are projected toward the driver's left eye, and the video images derived from image data captured by the right-side camera 14b are projected toward the driver's right eye.
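Although the system relies on the driver's own binocular fusion rather than on computed depth, the role of the known camera separation can be illustrated with the standard pinhole-stereo relation d = f·B/Z (an assumption here, with hypothetical numbers; no focal length is specified by the system):

```python
def stereo_disparity_px(depth_m: float,
                        baseline_m: float = 0.076,   # ~3 inches, per above
                        focal_px: float = 1000.0) -> float:
    """Pinhole-stereo disparity d = f * B / Z, in pixels.

    The 0.076 m baseline and 1000-pixel focal length are illustrative
    assumptions, not values specified by the system.
    """
    return focal_px * baseline_m / depth_m

# A vehicle 10 m behind the equipped vehicle shifts about 7.6 pixels between
# the two views; at 40 m the shift drops below 2 pixels, so the depth cue
# weakens with distance just as it does for unaided binocular vision.
print(stereo_disparity_px(10.0))   # 7.6
print(stereo_disparity_px(40.0))   # 1.9
```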
Optionally, pairs of cameras may be disposed at multiple locations at the vehicle, and image data captured by each pair of cameras may be used for displaying video images at a respective display inside the vehicle 10 at a suitable location. For example, a driver-side display screen may be disposed at or near the driver side of the interior cabin of the vehicle 10 (such as at or by the driver-side A-pillar of the vehicle) for displaying video images derived from image data captured by a pair of cameras in the driver-side exterior rearview mirror assembly, and a passenger-side display screen may be disposed at or near the passenger side of the interior cabin of the vehicle 10 (or to the right of the center region of the interior cabin of the vehicle) for displaying video images derived from image data captured by a pair of cameras in the passenger-side exterior rearview mirror assembly, while a center display screen may be disposed at or near a center region of the interior cabin of the vehicle (such as at the center stack or at the interior rearview mirror assembly) for displaying video images derived from image data captured by a pair of cameras generally centrally located at a rearward portion of the vehicle 10 and viewing rearward of the vehicle 10.
The camera monitoring system 12 determines the location of the driver's eyes and the driver's head position via processing image data captured by the driver-monitoring camera 16. With the respective displayed video images (derived from image data captured by a respective one of the cameras 14a, 14b) being projected toward a respective eye of the driver, the camera monitoring system 12 provides depth perception via the stereographic display screen.
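The system does not prescribe any particular eye-locating technique; as one purely illustrative possibility, an off-the-shelf detector such as OpenCV's stock Haar cascades could supply eye locations from the driver-monitoring camera's image data:

```python
import cv2

# Illustrative only: OpenCV's stock Haar cascades stand in for whatever
# eye-tracking method the control actually employs.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def locate_driver_eyes(driver_frame):
    """Return bounding boxes (x, y, w, h) of detected eyes, left-to-right."""
    gray = cv2.cvtColor(driver_frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.1, 5)
    if len(faces) == 0:
        return []
    x, y, w, h = faces[0]                  # assume the first face is the driver
    roi = gray[y:y + h, x:x + w]
    eyes = eye_cascade.detectMultiScale(roi, 1.1, 5)
    # Offset eye boxes back into full-frame coordinates, sorted left-to-right.
    return sorted((x + ex, y + ey, ew, eh) for ex, ey, ew, eh in eyes)
```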
The camera monitoring system 12 may also provide the driver with the ability to alter the images displayed on the display screen 20 in a manner similar to how a driver's rearward view is altered when the driver moves his or her head while viewing a rearview mirror of the vehicle. Thus, based on the determined head and eye position of the driver (as determined via processing of image data captured by the driver-monitoring camera 16), the video images displayed at the display screen 20 may be panned to mimic the driver's perception when viewing a typical interior mirror. For example, when the driver moves his or her head to the left, the camera monitoring system 12 may pan the images to the right (i.e., the display screen 20 may display or project video images of the scene to the right of a nominal or centered region of the cameras' fields of view) to mimic how the view would change if the driver were viewing a typical mirror when such a head position change occurs.
The displayed video images thus may be representative of a centered portion of the image data captured by the cameras 14a, 14b (such as a center block or sub-array of the overall two-dimensional array of photosensors of each camera, with side sub-regions and optionally upper and lower sub-regions of photosensors around the center block or sub-array), and then, when the camera monitoring system 12 determines that the driver's head changes position (optionally only when the camera monitoring system 12 also determines that the driver is viewing the display screen 20), the camera monitoring system 12 displays video images at the display screen 20 representative of part of the center block or sub-array of photosensors and part or all (depending on how far the driver's head moves) of the appropriate or respective side (or upper/lower) region of the overall array of photosensors. Thus, the camera monitoring system 12 allows for adjustment of the displayed video images in the sideward and optionally upward and/or downward directions responsive to the driver viewing the display screen 20 from different angles or locations.
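A minimal sketch of this panning behavior follows, assuming a simple linear mapping (with hypothetical gain and window sizes, neither of which is specified above) between the driver's head displacement and the selected sub-window of the photosensor array:

```python
def pan_crop(frame, head_dx_px, head_dy_px, out_w=1280, out_h=720, gain=-1.5):
    """Select the displayed sub-window of a captured frame (NumPy H x W x C).

    head_dx_px / head_dy_px: driver head displacement from its nominal
    position, in driver-monitoring-camera pixels. The negative gain mimics
    a real mirror: a head movement to the left pans the displayed view to
    the right. The window sizes and gain are illustrative assumptions.
    """
    h, w = frame.shape[:2]
    cx, cy = w // 2, h // 2                  # nominal centered window
    dx = int(gain * head_dx_px)
    dy = int(gain * head_dy_px)
    # Clamp so the window never leaves the overall photosensor array
    # (assumes the captured frame is larger than the displayed window).
    x0 = min(max(cx - out_w // 2 + dx, 0), w - out_w)
    y0 = min(max(cy - out_h // 2 + dy, 0), h - out_h)
    return frame[y0:y0 + out_h, x0:x0 + out_w]
```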
The ECU 18, responsive to processing of image data captured by the cameras 14a, 14b, may detect or determine the presence of objects or the like, and/or the camera monitoring system 12 provides displayed video images at the display device 20 for viewing by the driver of the vehicle 10.
The camera monitoring system 12 includes an image processor at the ECU 18 operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an image processing chip selected from the EYEQ family of image processing chips available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the camera monitoring system 12 may generate an alert to the driver of the vehicle 10 and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle 10.
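As a purely illustrative sketch of the overlay step (the detection itself would run on the image processor described above; the detection list format below is an assumption), detected objects could be highlighted in the displayed video as follows:

```python
import cv2

def highlight_detections(display_frame, detections):
    """Draw an overlay box and label around each detected object.

    detections: iterable of (x, y, w, h, label) tuples, a hypothetical
    output format for the object-detection stage.
    """
    for x, y, w, h, label in detections:
        # Red rectangle and caption to enhance the driver's awareness of
        # the detected object or vehicle in the displayed video images.
        cv2.rectangle(display_frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
        cv2.putText(display_frame, label, (x, y - 6),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 0, 255), 2)
    return display_frame
```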
The vehicle 10 may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. Preferably, the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
The cameras 14a, 14b and control 18 and image processor and any associated illumination source, if applicable, may comprise any suitable components, and may utilize aspects of the cameras (such as various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like) and vision systems described in U.S. Pat. Nos. 5,760,962; 5,715,093; 6,922,292; 6,757,109; 6,717,610; 6,590,719; 6,201,642; 5,796,094; 6,559,435; 6,831,261; 6,822,563; 6,946,978; 7,720,580; 8,542,451; 7,965,336; 7,480,149; 5,877,897; 6,498,620; 5,670,935; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,937,667; 7,123,168; 7,004,606; 7,038,577; 6,353,392; 6,320,176; 6,313,454; 8,451,107; 8,446,470; 9,126,525; 9,041,806; 10,793,067 and/or 6,824,281, and/or U.S. Publication Nos. US-2010-0020170 and/or US-2009-0244361, which are all hereby incorporated herein by reference in their entireties.
The interior camera 16 and the driver monitoring functions of the camera monitoring system 12 may utilize aspects of head and face direction and position tracking systems and/or eye tracking systems and/or gesture recognition systems, such as the systems described in U.S. Pat. Nos. 9,405,120; 10,703,204; 10,247,941; 10,017,114; 9,701,258 and/or 9,280,202, and/or U.S. Publication Nos. US-2016-0137126; US-2015-0352953; US-2015-0296135; US-2015-0294169; US-2015-0092042; US-2015-0022664 and/or US-2015-0009010, which are hereby incorporated herein by reference in their entireties.
Optionally, the video display device 20 may utilize aspects of the video display devices and systems described in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187; 6,690,268; 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,501; 6,222,460; 6,513,252; 9,598,016; 9,264,672 and/or 6,642,851, and/or U.S. Publication Nos. US-2006-0050018 and/or US-2006-0061008, which are all hereby incorporated herein by reference in their entireties.
Optionally, the vision system (utilizing a forward viewing camera and a rearward viewing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or bird's-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in U.S. Pat. Nos. 10,071,687; 9,900,522; 9,834,153; 9,762,880; 9,596,387; 9,126,525 and/or 9,041,806, and/or U.S. Publication Nos. US-2015-0022664 and/or US-2012-0162427, which are hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application claims the filing benefits of U.S. provisional application Ser. No. 62/942,305, filed Dec. 2, 2019, which is hereby incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
4546551 | Franks | Oct 1985 | A |
4953305 | Van Lente et al. | Sep 1990 | A |
5530240 | Larson et al. | Jun 1996 | A |
5550677 | Schofield et al. | Aug 1996 | A |
5576687 | Blank et al. | Nov 1996 | A |
5632092 | Blank et al. | May 1997 | A |
5668663 | Varaprasad et al. | Sep 1997 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5699044 | Van Lente et al. | Dec 1997 | A |
5708410 | Blank et al. | Jan 1998 | A |
5724187 | Varaprasad et al. | Mar 1998 | A |
5737226 | Olson et al. | Apr 1998 | A |
5802727 | Blank et al. | Sep 1998 | A |
5878370 | Olson | Mar 1999 | A |
5949331 | Schofield et al. | Sep 1999 | A |
6087953 | DeLine et al. | Jul 2000 | A |
6173501 | Blank et al. | Jan 2001 | B1 |
6222460 | DeLine et al. | Apr 2001 | B1 |
6329925 | Skiver et al. | Dec 2001 | B1 |
6513252 | Schierbeek et al. | Feb 2003 | B1 |
6642851 | DeLine et al. | Nov 2003 | B2 |
6690268 | Schofield et al. | Feb 2004 | B2 |
7004593 | Weller et al. | Feb 2006 | B2 |
7038577 | Pawlicki et al. | May 2006 | B2 |
7184190 | McCabe et al. | Feb 2007 | B2 |
7195381 | Lynam et al. | Mar 2007 | B2 |
7249860 | Kulas et al. | Jul 2007 | B2 |
7255451 | McCabe et al. | Aug 2007 | B2 |
7274501 | McCabe et al. | Sep 2007 | B2 |
7289037 | Uken et al. | Oct 2007 | B2 |
7308341 | Schofield et al. | Dec 2007 | B2 |
7329013 | Blank et al. | Feb 2008 | B2 |
7338177 | Lynam | Mar 2008 | B2 |
7370983 | DeWind et al. | May 2008 | B2 |
7446650 | Schofield et al. | Nov 2008 | B2 |
7581859 | Lynam | Sep 2009 | B2 |
7626749 | Baur et al. | Dec 2009 | B2 |
7720580 | Higgins-Luthman | May 2010 | B2 |
7855755 | Weller et al. | Dec 2010 | B2 |
9041806 | Baur et al. | May 2015 | B2 |
9126525 | Lynam et al. | Sep 2015 | B2 |
9264672 | Lynam | Feb 2016 | B2 |
9280202 | Gieseke et al. | Mar 2016 | B2 |
9405120 | Graf et al. | Aug 2016 | B2 |
9596387 | Achenbach et al. | Mar 2017 | B2 |
9598016 | Blank et al. | Mar 2017 | B2 |
9701258 | Tiryaki | Jul 2017 | B2 |
9762880 | Pflug | Sep 2017 | B2 |
9834153 | Gupta et al. | Dec 2017 | B2 |
9900522 | Lu | Feb 2018 | B2 |
10017114 | Bongwald | Jul 2018 | B2 |
10071687 | Ihlenburg et al. | Sep 2018 | B2 |
10247941 | Fursich | Apr 2019 | B2 |
10503989 | Nishino | Dec 2019 | B2 |
10703204 | Hassan et al. | Jul 2020 | B2 |
20060050018 | Hutzel et al. | Mar 2006 | A1 |
20060061008 | Karner et al. | Mar 2006 | A1 |
20080304705 | Pomerleau | Dec 2008 | A1 |
20120162427 | Lynam | Jun 2012 | A1 |
20150009010 | Biemer | Jan 2015 | A1 |
20150022664 | Pflug et al. | Jan 2015 | A1 |
20150092042 | Fursich | Apr 2015 | A1 |
20150294169 | Zhou et al. | Oct 2015 | A1 |
20150296135 | Wacquant et al. | Oct 2015 | A1 |
20150352953 | Koravadi | Dec 2015 | A1 |
20160049020 | Kuehnle et al. | Feb 2016 | A1 |
20160137126 | Fursich et al. | May 2016 | A1 |
20170153457 | Kunze | Jun 2017 | A1 |
20170174128 | Hu et al. | Jun 2017 | A1 |
20180134217 | Peterson et al. | May 2018 | A1 |
20180191954 | Pan | Jul 2018 | A1 |
20210155167 | Lynam et al. | May 2021 | A1 |
Number | Date | Country |
---|---|---|
20210162926 A1 | Jun 2021 | US |
Number | Date | Country |
---|---|---|
62942305 | Dec 2019 | US |