The present invention relates generally to a vehicle vision system for a vehicle and, more particularly, to a vehicle vision system that utilizes one or more cameras at a vehicle.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties. Various cameras have been proposed for such imaging systems, including cameras of the types described in U.S. Pat. No. 7,965,336 and U.S. Publication No. US-2009-0244361, which are hereby incorporated herein by reference in their entireties.
The present invention provides a vision system or imaging system for a vehicle that utilizes one or more cameras to capture image data representative of images exterior of the vehicle. The camera or camera module comprises an imager and a circuit board (or circuit boards) and a lens at a lens barrel. A front camera housing portion is configured to receive an imager printed circuit board therein, with the imager printed circuit board disposed at the lens barrel with the imager optically aligned with an optical axis of optical elements of the lens. A rear camera housing portion is mated with a rear portion of the front camera housing so as to encase and seal the imager printed circuit board in the camera module. A thermoelectric device is disposed at the rear camera housing portion, and a heat transfer element is disposed between and in thermal conductive contact with the thermoelectric device and the imager printed circuit board. The thermoelectric device is electrically powered to draw heat from the imager printed circuit board to the rear camera housing portion. The thermoelectric device may be operable responsive to a temperature sensor disposed in the camera module. Circuitry of the camera module is electrically connected to the imager and the thermoelectric device and is electrically connected to electrical connecting elements that are configured to electrically connect to a wire harness of the vehicle.
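The temperature-responsive operation of the thermoelectric device described above may be illustrated as a simple hysteresis control loop. The sketch below is illustrative only; the threshold values and function names are hypothetical assumptions and are not taken from this disclosure.

```python
# Minimal sketch of temperature-responsive thermoelectric (TEC) control
# using hysteresis (bang-bang) switching. Thresholds are hypothetical
# placeholders; a production controller would use values chosen for the
# specific imager and thermoelectric device.

TEC_ON_C = 65.0   # enable cooling above this sensed imager PCB temperature
TEC_OFF_C = 55.0  # disable cooling once the PCB has cooled below this

def update_tec(temp_c, tec_on):
    """Return the new TEC power state given the sensed temperature."""
    if temp_c >= TEC_ON_C:
        return True
    if temp_c <= TEC_OFF_C:
        return False
    return tec_on  # inside the hysteresis band: hold the current state

# Example: the TEC turns on as the module heats past 65 C and stays on
# until the temperature sensor reads 55 C or below.
state = False
for t in (50.0, 60.0, 70.0, 60.0, 50.0):
    state = update_tec(t, state)
```

The hysteresis band prevents rapid on/off cycling of the thermoelectric device when the sensed temperature hovers near a single threshold.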
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle vision system and/or driver assist system and/or object detection system and/or alert system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and provide an output to a display device for displaying images representative of the captured image data. Optionally, the vision system may provide a display, such as a rearview display or a top down or bird's eye or surround view display or the like.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes at least one exterior facing imaging sensor or camera, such as a rearward facing imaging sensor or camera 14a (and the system may optionally include multiple exterior facing imaging sensors or cameras, such as a forward facing camera 14b at the front (or at the windshield) of the vehicle, and a sideward/rearward facing camera 14c, 14d at respective sides of the vehicle), which captures images exterior of the vehicle, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera (
The control unit may comprise or may be part of an autonomous vehicle control system, whereby the cameras capture image data that is processed for use in autonomously controlling the vehicle. Such high resolution autonomous vehicle cameras require exceptional image quality for machine vision. Image quality degrades due to sensor noise at elevated temperatures, and automotive industry sensor suppliers observe this occurring to some degree at temperatures as low as 45 degrees C.
High-resolution and automotive cameras take steps to optimize heat transfer between the image sensor or imager and external heat dissipating features. Many such cameras are expected to operate in ambient temperatures as high as around 85 degrees C and thus can rarely achieve sensor junction temperatures below around 100 degrees C. Image degradation at these temperatures may be substantial and may limit machine vision capabilities.
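The sensitivity of sensor noise to temperature can be illustrated with a common rule of thumb that dark current in CMOS imagers roughly doubles for every several degrees C rise in junction temperature. The sketch below assumes a doubling interval of 7 degrees C; the actual interval is sensor-specific and is an illustrative assumption, not a value from this disclosure.

```python
# Illustrative sketch: dark-current scaling with junction temperature.
# Assumes dark current doubles every 7 degrees C (a typical rule of thumb
# for CMOS imagers; the actual doubling interval is sensor-specific).

def dark_current_ratio(t_junction_c, t_ref_c=60.0, doubling_interval_c=7.0):
    """Return dark current relative to its value at the reference temperature."""
    return 2.0 ** ((t_junction_c - t_ref_c) / doubling_interval_c)

# An imager held at 60 C by active cooling versus one running at 102 C:
# a 42 C difference spans six doubling intervals, i.e. 64x the dark current.
ratio = dark_current_ratio(102.0) / dark_current_ratio(60.0)
```

Under this assumption, even a modest reduction in junction temperature from active cooling yields a large reduction in thermally generated sensor noise.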
The autonomous vehicle camera of the present invention includes active cooling. As shown in
As also shown in
Thus, and such as shown in
The heat transfer element is formed to be disposed over and at least partially around the TEC and is attached to the rear housing portion via fasteners or screws, which may be tightened to clamp the heat transfer element to and around the TEC to provide and maintain contact between the heat transfer element and the TEC. Thermal insulators may be provided at the screws. Optionally, for example, a desiccant sheet with adhesive backing may be disposed between the heat transfer element and the rear housing portion. Optionally, a heat spreader, such as one made from graphite (such as a graphite sheet or layer or film, or such as another suitable heat spreading or heat diffusing sheet or layer or film, such as, for example, a graphene sheet or layer or film or the like), may be disposed between the heat transfer element and the rear housing portion to enhance the heat sinking capability of the rear cover.
In the illustrated embodiment, the camera module includes two printed circuit boards (in addition to the imager PCB) that include circuitry associated with the imager and camera. The heat transfer element protrudes through generally centrally aligned holes or apertures in the two PCBs and terminates at the rear of the imager PCB, such that the heat transfer element thermally conductively connects at the rear of the imager PCB to draw or conduct or transfer heat generated by the imager (during operation of the camera) away from the imager PCB to the TEC and rear cover or housing portion. Optionally, the heat transfer element may contact one or both of the other PCBs to draw heat from them as well. Optionally, the heat transfer element may engage a rear side of another circuit board (and not engage the imager PCB) to draw heat from circuitry or components that are disposed on the other circuit board (such that heat generated by an image processor of the other circuit board is drawn away from the image processor and toward the rear of the camera). Circuitry of the PCBs and of the camera module (including the image processor and TEC or circuitry associated with the TEC) is electrically connected to the imager and is electrically connected to electrical connecting elements that are configured to electrically connect to a wire harness of the vehicle when the camera module is disposed at the vehicle.
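The conduction path from the imager PCB through the heat transfer element and TEC to the rear housing can be reasoned about as thermal resistances in series (ignoring, for simplicity, the active heat pumping of the TEC, which further lowers the cold-side temperature). The resistance and power values below are hypothetical placeholders for illustration only, not values from this disclosure.

```python
# Illustrative series thermal-resistance model of the conduction path:
# imager PCB -> heat transfer element -> TEC -> rear housing.
# Resistance values (K/W) and dissipation (W) are hypothetical placeholders.

def junction_rise(power_w, resistances_k_per_w):
    """Temperature rise of the heat source above the rear housing,
    modeling each interface as a thermal resistance in series."""
    return power_w * sum(resistances_k_per_w)

# Example: 2 W of imager dissipation through three series interfaces
# of 1.5, 0.8 and 2.2 K/W gives a 9 K rise above the rear housing.
rise = junction_rise(2.0, [1.5, 0.8, 2.2])
```

This simple model shows why minimizing each interface resistance (clamped contact, desiccant sheet, graphite heat spreader) matters: the temperature rise scales with the sum of all resistances in the path.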
As shown in
Also, the image sensor PCB is directly adhesively bonded at the front camera housing or lens or lens barrel. To eliminate all sources of movement between the lens and image sensor, the imager or its printed circuit board is bonded directly to the lens structure or lens barrel, such as via a suitable quick-cure adhesive (see
The camera module may utilize aspects of the cameras and connectors described in U.S. Pat. Nos. 9,621,769; 9,596,387; 9,277,104; 9,077,098; 8,994,878; 8,542,451 and/or 7,965,336, and/or U.S. Publication Nos. US-2009-0244361; US-2013-0242099; US-2014-0373345; US-2015-0124098; US-2015-0222795; US-2015-0327398; US-2016-0243987; US-2016-0268716; US-2016-0286103; US-2016-0037028; US-2017-0054881; US-2017-0133811; US-2017-0201661; US-2017-0280034; US-2017-0295306; US-2017-0302829 and/or US-2018-0098033, and/or U.S. patent applications, Ser. No. 16/165,170, filed Oct. 19, 2018, now U.S. Pat. No. 10,750,064, and/or Ser. No. 16/165,253, filed Oct. 19, 2018, now U.S. Pat. No. 10,678,018, which are hereby incorporated herein by reference in their entireties.
The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an image processing chip selected from the EYEQ™ family of image processing chips (for example, an EYEQ3™, EYEQ4™ or EYEQ5™ image processing chip) available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 9,233,641; 9,146,898; 9,174,574; 9,090,234; 9,077,098; 8,818,042; 8,886,401; 9,077,962; 9,068,390; 9,140,789; 9,092,986; 9,205,776; 8,917,169; 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or U.S. Publication Nos. US-2014-0340510; US-2014-0313339; US-2014-0347486; US-2014-0320658; US-2014-0336876; US-2014-0307095; US-2014-0327774; US-2014-0327772; US-2014-0320636; US-2014-0293057; US-2014-0309884; US-2014-0226012; US-2014-0293042; US-2014-0218535; US-2014-0247354; US-2014-0247355; US-2014-0247352; US-2014-0232869; US-2014-0211009; US-2014-0160276; US-2014-0168437; US-2014-0168415; US-2014-0160291; US-2014-0152825; US-2014-0139676; US-2014-0138140; US-2014-0104426; US-2014-0098229; US-2014-0085472; US-2014-0067206; US-2014-0049646; US-2014-0052340; US-2014-0025240; US-2014-0028852; US-2014-005907; US-2013-0314503; US-2013-0298866; US-2013-0222593; US-2013-0300869; US-2013-0278769; US-2013-0258077; US-2013-0242099; US-2013-0215271; US-2013-0141578 and/or US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO 2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application is a continuation of U.S. patent application Ser. No. 17/247,081, filed Nov. 30, 2020, now U.S. Pat. No. 11,228,697, which is a continuation of U.S. patent application Ser. No. 16/165,204, filed Oct. 19, 2018, now U.S. Pat. No. 10,855,890, which claims the filing benefits of U.S. provisional application Ser. No. 62/575,651, filed Oct. 23, 2017, which is hereby incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
4712879 | Lynam et al. | Dec 1987 | A |
5393931 | Guenther | Feb 1995 | A |
5550677 | Schofield et al. | Aug 1996 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5949331 | Schofield et al. | Sep 1999 | A |
5978017 | Tino | Nov 1999 | A |
6151065 | Steed et al. | Nov 2000 | A |
6690268 | Schofield et al. | Feb 2004 | B2 |
6824281 | Schofield et al. | Nov 2004 | B2 |
7038577 | Pawlicki et al. | May 2006 | B2 |
7479986 | Karaki | Jan 2009 | B2 |
7480149 | DeWard et al. | Jan 2009 | B2 |
7720580 | Higgins-Luthman | May 2010 | B2 |
7855755 | Weller et al. | Dec 2010 | B2 |
7965336 | Bingle et al. | Jun 2011 | B2 |
8256821 | Lawlor et al. | Sep 2012 | B2 |
8542451 | Lu et al. | Sep 2013 | B2 |
8994878 | Byrne et al. | Mar 2015 | B2 |
9077098 | Latunski | Jul 2015 | B2 |
9233641 | Sesti et al. | Jan 2016 | B2 |
9277104 | Sesti et al. | Mar 2016 | B2 |
9596387 | Achenbach et al. | Mar 2017 | B2 |
9621769 | Mai et al. | Apr 2017 | B2 |
10207646 | Oh | Feb 2019 | B2 |
10274812 | Chen | Apr 2019 | B1 |
10855890 | Mleczko et al. | Dec 2020 | B2 |
11228697 | Mleczko et al. | Jan 2022 | B2 |
20030090569 | Poechmueller | May 2003 | A1 |
20040075870 | Karaki | Apr 2004 | A1 |
20040169771 | Washington et al. | Sep 2004 | A1 |
20090244361 | Gebauer et al. | Oct 2009 | A1 |
20090295181 | Lawlor et al. | Dec 2009 | A1 |
20110025850 | Maekawa et al. | Feb 2011 | A1 |
20110298925 | Inoue et al. | Dec 2011 | A1 |
20130242099 | Sauer et al. | Sep 2013 | A1 |
20140104184 | Meador et al. | Apr 2014 | A1 |
20140160284 | Achenbach et al. | Jun 2014 | A1 |
20140226012 | Achenbach | Aug 2014 | A1 |
20140373345 | Steigerwald | Dec 2014 | A1 |
20150015713 | Wang et al. | Jan 2015 | A1 |
20150029337 | Uchiyama et al. | Jan 2015 | A1 |
20150054961 | Saitoh et al. | Feb 2015 | A1 |
20150070557 | Petty et al. | Mar 2015 | A1 |
20150124098 | Winden et al. | May 2015 | A1 |
20150205186 | Park et al. | Jul 2015 | A1 |
20150222795 | Sauer et al. | Aug 2015 | A1 |
20150266430 | Mleczko et al. | Sep 2015 | A1 |
20150327398 | Achenbach et al. | Nov 2015 | A1 |
20150365569 | Mai et al. | Dec 2015 | A1 |
20160037028 | Biemer | Feb 2016 | A1 |
20160191863 | Minikey, Jr. et al. | Jun 2016 | A1 |
20160243987 | Kendall | Aug 2016 | A1 |
20160268716 | Conger et al. | Sep 2016 | A1 |
20160286103 | Van Dan Elzen | Sep 2016 | A1 |
20170036600 | Whitehead et al. | Feb 2017 | A1 |
20170054881 | Conger et al. | Feb 2017 | A1 |
20170133811 | Conger et al. | May 2017 | A1 |
20170201661 | Conger | Jul 2017 | A1 |
20170280034 | Hess et al. | Sep 2017 | A1 |
20170295306 | Mleczko | Oct 2017 | A1 |
20170302829 | Mleczko et al. | Oct 2017 | A1 |
20180027151 | Kazama et al. | Jan 2018 | A1 |
20180072239 | Wienecke et al. | Mar 2018 | A1 |
20180098033 | Mleczko et al. | Apr 2018 | A1 |
20180241917 | Zhang et al. | Aug 2018 | A1 |
20180345911 | Zurowski | Dec 2018 | A1 |
20190121051 | Byrne et al. | Apr 2019 | A1 |
20190124238 | Byrne et al. | Apr 2019 | A1 |
20190124243 | Mleczko et al. | Apr 2019 | A1 |
20190306966 | Byrne et al. | Oct 2019 | A1 |
20200001787 | Lu et al. | Jan 2020 | A1 |
20200010024 | Sesti et al. | Jan 2020 | A1 |
Number | Date | Country | |
---|---|---|---|
20220141366 A1 | May 2022 | US |
Number | Date | Country | |
---|---|---|---|
62575651 | Oct 2017 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17247081 | Nov 2020 | US |
Child | 17648004 | US | |
Parent | 16165204 | Oct 2018 | US |
Child | 17247081 | US |