The present invention relates generally to a vehicle vision system for a vehicle and, more particularly, to a vehicle vision system that utilizes a forward viewing camera at a windshield of a vehicle.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties. Packaging of forward facing imagers is also known, such as described in U.S. Pat. Nos. 7,262,406; 7,265,342; 7,420,159; 7,480,149; 7,533,998; 7,538,316; 7,916,009; 8,179,437 and/or 8,405,726, which are hereby incorporated herein by reference in their entireties.
The present invention provides a vision system or imaging system for a vehicle that utilizes one or more cameras (preferably one or more CMOS cameras) to capture image data representative of images exterior of the vehicle. The system includes a forward viewing camera disposed behind and viewing through the windshield of the vehicle, with the windshield having a light-absorbing blackout area or hiding layer established thereat to limit viewability of the camera to a person viewing the vehicle windshield from outside the vehicle. The system of the present invention provides a customized or selected light-transmitting aperture through the hiding layer at the windshield to provide a desired field of view of the camera through the windshield via the light-transmitting aperture while limiting or reducing viewability of the camera to a person viewing the vehicle windshield from outside the vehicle.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle vision system and/or driver assist system and/or object detection system and/or alert system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction. The vision system includes a control having an image processor that is operable to process image data captured by one or more cameras and/or provide an output to a display device for displaying images representative of the captured image data.
A vehicular camera system or module can be installed on the inside of the front windshield of a vehicle, such as a car, truck, bus, or van. Such a camera system may be used for a variety of functions such as an object detection function, a lane keeping function, and an exterior light or high beam control function or the like.
The hiding layer or blackout layer or area thus includes a void or aperture or light-transmitting portion or region in front of and aligned with the forward viewing camera, such that the camera views through the windshield at the aperture and forward of the vehicle. The light-transmitting portion or void or aperture is sized and shaped to reduce or minimize the footprint or profile of the aperture (to limit or reduce or minimize the viewability of the camera to a person outside the vehicle), while providing a sufficient or desired field of view of the camera through the windshield and forward of the vehicle, as discussed below.
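By way of illustration only, the first-order geometry behind this trade-off can be sketched as follows (Python; the pinhole-style model, the example dimensions and the rake-angle stretch factor are illustrative assumptions, not parameters taken from the specification):

```python
import math

def aperture_size_mm(pupil_d, dist, hfov_deg, vfov_deg, rake_deg):
    """Estimate the minimum light-transmitting aperture through the
    hiding layer for a camera behind a raked windshield (a first-order
    sketch, not a lens design).

    pupil_d  -- entrance pupil diameter of the lens (mm)
    dist     -- perpendicular distance from the pupil to the glass (mm)
    hfov_deg -- required horizontal field of view (degrees)
    vfov_deg -- required vertical field of view (degrees)
    rake_deg -- windshield rake angle measured from horizontal (degrees)
    """
    # Width on the glass: the pupil plus the spread of the horizontal
    # FOV cone over the pupil-to-glass distance.
    width = pupil_d + 2.0 * dist * math.tan(math.radians(hfov_deg / 2.0))
    # Height measured along the raked glass: the vertical cross-section
    # is stretched because the glass lies at a shallow angle.
    cross = pupil_d + 2.0 * dist * math.tan(math.radians(vfov_deg / 2.0))
    height = cross / math.sin(math.radians(rake_deg))
    return width, height

# Hypothetical example: 4 mm pupil, 30 mm behind the glass, a 100 x 50
# degree field of view, and a 27 degree windshield rake.
w, h = aperture_size_mm(4.0, 30.0, 100.0, 50.0, 27.0)
```

The sketch shows why wide-angle apertures grow quickly: the width term scales with the tangent of half the horizontal field of view, and the shallow rake further stretches the vertical footprint along the glass, which is why the aperture shape is tailored rather than left rectangular.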
The imaging system or vision system that includes the forward viewing camera 10 may also include or be associated with at least one other exterior facing imaging sensor or camera, such as a rearward facing imaging sensor or camera (and the system may optionally include multiple exterior facing imaging sensors or cameras, such as the forwardly facing camera at the front (or at the windshield) of the vehicle, and a sidewardly/rearwardly facing camera at respective sides of the vehicle), which captures images exterior of the vehicle, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera. The vision system includes a control or electronic control unit (ECU) or processor that is operable to process image data captured by the cameras and may provide displayed images at a display device for viewing by the driver of the vehicle. The data transfer or signal communication from the camera to the ECU may comprise any suitable data or communication link, such as a vehicle network bus or the like of the equipped vehicle.
The camera system or camera module of the present invention may utilize aspects of the systems and/or modules described in International Publication Nos. WO 2013/123161 and/or WO 2013/019795, and/or U.S. Pat. Nos. 8,256,821; 7,480,149; 7,289,037; 7,004,593; 6,824,281; 6,690,268; 6,445,287; 6,428,172; 6,420,975; 6,326,613; 6,278,377; 6,243,003; 6,250,148; 6,172,613 and/or 6,087,953, and/or U.S. Pat. Publication No. US-2009-0295181, and/or U.S. Publication Nos. US-2015-0251605 and/or US-2014-0226012, which are all hereby incorporated herein by reference in their entireties. Optionally, and desirably, the camera utilizes aspects of the cameras described in U.S. Publication No. US-2014-0160284, which is hereby incorporated herein by reference in its entirety.
The camera module may attach at a camera mounting bracket via any suitable means, and may detachably attach so that the camera module may be detached for service or replacement while the bracket remains attached at the windshield surface. The camera bracket may be adhesively attached at the in-cabin surface of the vehicle windshield, such as at the light-absorbing layer at an upper central region of the windshield at or near or spaced from where a mirror mounting button (to which an interior rearview mirror assembly may attach) is adhesively attached at the in-cabin surface of the vehicle windshield. Optionally, a stray light shield or shroud is attached (such as via a plurality of fasteners, such as screws or the like) at the camera module and/or camera mounting bracket at the camera lens barrel. When the camera module is mounted at the bracket or structure that is attached at the windshield of the vehicle, the stray light shield or shroud is disposed generally at the in-cabin surface of the windshield so that the camera views through or across the recess established by the shroud and through the windshield, such as through the void or aperture or region of the windshield that is devoid of the blackout area or layer or opaque hiding layer or frit layer or the like.
As shown in
For wide field of view (FOV) applications, the void through the blackout layer can be very large and unsightly. For example, and such as shown in
In determining the void shape and dimensions, the void design for wide angle field of view applications should consider the use cases for the particular vision system.
For example, the void design (size and shape) should be based on the range of objects that may be detected in the field of view of the camera. An object detected at a specific distance occurs at a specific location in the image, which is generally at the horizon or middle of the image vertically. If an object continues to be of interest for the vehicle control, the position from the center of the sensor to the object may be less than the capability of the optical system as the vehicle approaches the object.
The void design (size and shape) may also or otherwise be determined based on the angle of an object relative to the vehicle camera. An object detected at a specific angle occurs at a specific location in the image, which is generally at the horizon or middle of the image vertically and near the edge. If an object continues to be of interest for the vehicle control, the position from the center of the sensor to the object may be less than the capability of the optical system as the vehicle approaches the object.
The void design (size and shape) may also or otherwise be determined based on the functional content of the sensor and system. For example, the particular size and shape of the void may be determined based at least in part on whether the vision system is for providing an emergency braking function, a warning function and/or an object detection function or the like.
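The distance and angle considerations above can be illustrated with a simple pinhole-camera sketch (Python; the focal lengths, principal point and scene values are hypothetical examples, not parameters from the specification):

```python
def object_pixel(fx, fy, cx, cy, lateral_m, dist_m, cam_height_m):
    """Pinhole-model sketch: image location of a point on the ground at
    a given forward distance and lateral offset from the camera axis.
    Returns (u, v) in pixels, with v increasing downward in the image.
    """
    u = cx + fx * (lateral_m / dist_m)     # horizontal position in the image
    v = cy + fy * (cam_height_m / dist_m)  # rows below the horizon line
    return u, v

# A distant object appears near the horizon row (v close to cy); as the
# vehicle approaches, the same object moves outward and downward in the
# image, so the void need only pass rays out to the angles at which the
# object is still of interest to the control function.
far = object_pixel(800.0, 800.0, 640.0, 360.0, lateral_m=1.5, dist_m=60.0, cam_height_m=1.3)
near = object_pixel(800.0, 800.0, 640.0, 360.0, lateral_m=1.5, dist_m=8.0, cam_height_m=1.3)
```

Under this model, distant objects of interest cluster near the horizon row of the image, while near objects migrate toward the image edges, which is consistent with concentrating the widest part of the void at the horizon band.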
The light-transmitting portion or void or aperture through the hiding layer should be sized and shaped taking these considerations into account. For example, and as can be seen with reference to
Optionally, the void may be provided or established in any shape that provides the desired field of view of the camera for the particular application or applications of the camera and vision system. For example, and such as shown in
In applications where the camera's field of view may encompass some of the hiding layer due to the smaller sized void or aperture (such as can be seen at the lower inboard corners of the cropped apertures in
As discussed above, the camera module may include a stray light shield or shroud attached at the camera module and/or camera mounting bracket at the camera lens barrel so as to be disposed generally at the in-cabin surface of the windshield so that the camera views through or across the recess established by the shroud and through the windshield, such as through the void or aperture or region of the windshield that is devoid of the blackout area or layer or opaque hiding layer or frit layer or the like. Such a stray light shield or glare shield is also enhanced by the present invention to limit or substantially preclude any dashboard glare from entering the camera. The present invention enhances the black-out void from the “outside looking in”, and provides for reduction or minimization of the glare shield mounted underneath the optical path or view of the camera to attenuate or block any reflections from the top of the dashboard area. These reflections can be caused by the likes of navigation systems or other equipment such as phone holders, documents or trash on the dashboard that reflect light up towards the windshield, where such reflected light would enter the camera lens without such a glare shield. As shown in
As shown in
Therefore, the present invention provides a customized or selected size and shape of the void or aperture through a blackout area or hiding layer to provide a desired or appropriate field of view to the forward viewing camera disposed behind the blackout area and viewing through the windshield at the void or aperture. The width of the aperture may be selected to provide a wide angle field of view at the areas where such a wide angle field of view is needed or desired for the camera, and may be narrowed at other regions (such as above and below the wider region) to reduce or minimize the size of the aperture or void, and thus to limit or reduce the viewability of the camera to a person viewing the vehicle and windshield from outside the vehicle.
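The footprint reduction achieved by narrowing the void above and below the wide band can be illustrated numerically (a hypothetical sketch with arbitrary example dimensions in millimeters; the shapes and values are not taken from the specification):

```python
def cropped_void_area(wide_w, wide_h, narrow_w, total_h):
    """Sketch of the tailored void: a wide horizontal band where the
    wide-angle view is needed, with a narrower strip above and below.
    Returns (cropped_area, full_area) to compare against an uncropped
    rectangular void of the same overall width and height.
    """
    full = wide_w * total_h                                    # plain rectangle
    cropped = wide_w * wide_h + narrow_w * (total_h - wide_h)  # cropped outline
    return cropped, full

cropped, full = cropped_void_area(wide_w=180.0, wide_h=40.0, narrow_w=60.0, total_h=100.0)
saving = 1.0 - cropped / full   # fraction of the blackout layer preserved
```

With these example values the cropped outline covers only 60 percent of the bounding rectangle, preserving 40 percent of the hiding layer that a plain rectangular void would have removed.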
The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an EyeQ2 or EyeQ3 image processing chip available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. Preferably, the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
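The stated minimum array sizes can be restated as a simple check (Python; a sketch of the pixel-count tiers in the paragraph above, with a hypothetical megapixel-class example array):

```python
def meets_minimums(cols, rows):
    """Check an imaging array against the sizes described above: at
    least a 640 x 480 array, and pixel-count tiers of at least 300,000,
    500,000 and 1 million photosensor elements."""
    pixels = cols * rows
    return {
        "vga": cols >= 640 and rows >= 480,
        "300k": pixels >= 300_000,
        "500k": pixels >= 500_000,
        "1m": pixels >= 1_000_000,
    }

# A hypothetical 1280 x 960 megapixel-class array clears every tier,
# while a bare 640 x 480 array meets only the lower tiers.
tiers = meets_minimums(1280, 960)
```

Note that a 640 x 480 array already satisfies the 300,000-element minimum (307,200 pixels) but not the preferred 500,000- or 1 million-element tiers.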
For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO/2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. Publication No. US-2012-0062743, which are hereby incorporated herein by reference in their entireties.
The camera module and circuit chip or board and imaging sensor may be implemented and operated in connection with various vehicular vision-based systems, and/or may be operable utilizing the principles of such other vehicular systems, such as a vehicle headlamp control system, such as the type disclosed in U.S. Pat. Nos. 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 7,004,606; 7,339,149 and/or 7,526,103, which are all hereby incorporated herein by reference in their entireties, a rain sensor, such as the types disclosed in commonly assigned U.S. Pat. Nos. 6,353,392; 6,313,454; 6,320,176 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties, a vehicle vision system, such as a forwardly, sidewardly or rearwardly directed vehicle vision system utilizing principles disclosed in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,877,897; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978 and/or 7,859,565, which are all hereby incorporated herein by reference in their entireties, a traffic sign recognition system, a system for determining a distance to a leading or trailing vehicle or object, such as a system utilizing the principles disclosed in U.S. Pat. Nos. 6,396,397 and/or 7,123,168, which are hereby incorporated herein by reference in their entireties, and/or the like.
Optionally, the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2010/099416; WO 2011/028686; WO 2012/075250; WO 2013/019795; WO 2012/145822; WO 2013/081985; WO 2013/086249 and/or WO 2013/109869, and/or U.S. Pat. No. 9,264,672, which are hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application claims the filing benefits of U.S. provisional application Ser. No. 62/131,517, filed Mar. 11, 2015, which is hereby incorporated herein by reference in its entirety.
| Number | Name | Date | Kind |
|---|---|---|---|
| 5096287 | Kakinami et al. | Mar 1992 | A |
| 5130804 | Tamura | Jul 1992 | A |
| 5550677 | Schofield et al. | Aug 1996 | A |
| 5670935 | Schofield et al. | Sep 1997 | A |
| 5796094 | Schofield et al. | Aug 1998 | A |
| 5877897 | Schofield et al. | Mar 1999 | A |
| 5920061 | Feng | Jul 1999 | A |
| 5949331 | Schofield et al. | Sep 1999 | A |
| 6087953 | DeLine et al. | Jul 2000 | A |
| 6124886 | DeLine et al. | Sep 2000 | A |
| 6172613 | DeLine et al. | Jan 2001 | B1 |
| 6243003 | DeLine et al. | Jun 2001 | B1 |
| 6250148 | Lynam | Jun 2001 | B1 |
| 6278377 | DeLine et al. | Aug 2001 | B1 |
| 6326613 | Heslin et al. | Dec 2001 | B1 |
| 6329925 | Skiver et al. | Dec 2001 | B1 |
| 6420975 | DeLine et al. | Jul 2002 | B1 |
| 6428172 | Hutzel et al. | Aug 2002 | B1 |
| 6445287 | Schofield et al. | Sep 2002 | B1 |
| 6466136 | DeLine | Oct 2002 | B2 |
| 6501387 | Skiver et al. | Dec 2002 | B2 |
| 6690268 | Schofield et al. | Feb 2004 | B2 |
| 6824281 | Schofield et al. | Nov 2004 | B2 |
| 7004593 | Weller et al. | Feb 2006 | B2 |
| 7038577 | Pawlicki et al. | May 2006 | B2 |
| 7095572 | Lee et al. | Aug 2006 | B2 |
| 7215479 | Bakin | May 2007 | B1 |
| 7262406 | Heslin et al. | Aug 2007 | B2 |
| 7265342 | Heslin et al. | Sep 2007 | B2 |
| 7289037 | Uken et al. | Oct 2007 | B2 |
| 7420159 | Heslin et al. | Sep 2008 | B2 |
| 7480149 | DeWard et al. | Jan 2009 | B2 |
| 7533998 | Schofield et al. | May 2009 | B2 |
| 7538316 | Heslin et al. | May 2009 | B2 |
| 7551103 | Schofield | Jun 2009 | B2 |
| 7720580 | Higgins-Luthman | May 2010 | B2 |
| 7855755 | Weller et al. | Dec 2010 | B2 |
| 7916009 | Schofield et al. | Mar 2011 | B2 |
| 7918570 | Weller et al. | Apr 2011 | B2 |
| 8063759 | Bos et al. | Nov 2011 | B2 |
| 8179437 | Schofield et al. | May 2012 | B2 |
| 8223203 | Ohsumi et al. | Jul 2012 | B2 |
| 8254011 | Baur et al. | Aug 2012 | B2 |
| 8256821 | Lawlor et al. | Sep 2012 | B2 |
| 8339453 | Blake, III et al. | Dec 2012 | B2 |
| 8405726 | Schofield et al. | Mar 2013 | B2 |
| 8451332 | Rawlings | May 2013 | B2 |
| 8513590 | Heslin et al. | Aug 2013 | B2 |
| 8531278 | DeWard et al. | Sep 2013 | B2 |
| 8534887 | DeLine | Sep 2013 | B2 |
| 8542451 | Lu et al. | Sep 2013 | B2 |
| 8743203 | Karner | Jun 2014 | B2 |
| 8851690 | Uken | Oct 2014 | B2 |
| 9090213 | Lawlor et al. | Jul 2015 | B2 |
| 9150165 | Fortin | Oct 2015 | B1 |
| 9156403 | Rawlings | Oct 2015 | B2 |
| 9277104 | Sesti et al. | Mar 2016 | B2 |
| 9451138 | Winden et al. | Sep 2016 | B2 |
| 9487159 | Achenbach | Nov 2016 | B2 |
| 9596387 | Achenbach et al. | Mar 2017 | B2 |
| 9871971 | Wang et al. | Jan 2018 | B2 |
| 20020003571 | Schofield | Jan 2002 | A1 |
| 20030169522 | Schofield et al. | Sep 2003 | A1 |
| 20040189862 | Gustavsson et al. | Sep 2004 | A1 |
| 20050141106 | Lee et al. | Jun 2005 | A1 |
| 20060050018 | Hutzel et al. | Mar 2006 | A1 |
| 20060061008 | Karner | Mar 2006 | A1 |
| 20060077575 | Nakai et al. | Apr 2006 | A1 |
| 20060103727 | Tseng | May 2006 | A1 |
| 20070109406 | Schofield et al. | May 2007 | A1 |
| 20070120657 | Schofield et al. | May 2007 | A1 |
| 20070171037 | Schofield | Jul 2007 | A1 |
| 20070221826 | Bechtel | Sep 2007 | A1 |
| 20080247751 | Lang et al. | Oct 2008 | A1 |
| 20080252882 | Kesterson | Oct 2008 | A1 |
| 20090295181 | Lawlor | Dec 2009 | A1 |
| 20100110192 | Johnston et al. | May 2010 | A1 |
| 20100134616 | Seger et al. | Jun 2010 | A1 |
| 20100165468 | Yamada et al. | Jul 2010 | A1 |
| 20100172542 | Stein | Jul 2010 | A1 |
| 20100279439 | Shah et al. | Nov 2010 | A1 |
| 20110025850 | Maekawa et al. | Feb 2011 | A1 |
| 20120008129 | Lu et al. | Jan 2012 | A1 |
| 20120013741 | Blake, III et al. | Jan 2012 | A1 |
| 20120081550 | Sewell | Apr 2012 | A1 |
| 20120105641 | Schofield | May 2012 | A1 |
| 20120265416 | Lu et al. | Oct 2012 | A1 |
| 20120310519 | Lawlor | Dec 2012 | A1 |
| 20130002873 | Hess | Jan 2013 | A1 |
| 20130037589 | Heslin | Feb 2013 | A1 |
| 20130141579 | Schofield | Jun 2013 | A1 |
| 20130144488 | Schofield | Jun 2013 | A1 |
| 20140043465 | Salomonsson | Feb 2014 | A1 |
| 20140160284 | Achenbach et al. | Jun 2014 | A1 |
| 20140226012 | Achenbach et al. | Aug 2014 | A1 |
| 20140241589 | Weber | Aug 2014 | A1 |
| 20140320946 | Tomkins | Oct 2014 | A1 |
| 20150015713 | Wang | Jan 2015 | A1 |
| 20150251605 | Uken et al. | Sep 2015 | A1 |
| 20150329063 | Lawlor et al. | Nov 2015 | A1 |
| Number | Date | Country |
|---|---|---|
| 20160264063 A1 | Sep 2016 | US |
| Number | Date | Country |
|---|---|---|
| 62131517 | Mar 2015 | US |