The present invention relates generally to a vehicle vision system for a vehicle and, more particularly, to a vehicle vision system that utilizes one or more cameras at a vehicle.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties. Various cameras have been proposed for such imaging systems, including cameras of the types described in U.S. Pat. No. 7,965,336 and U.S. Publication No. US-2009-0244361, which are hereby incorporated herein by reference in their entireties.
The present invention provides a vision system or imaging system for a vehicle that utilizes one or more cameras to capture image data representative of images exterior of the vehicle. The camera or camera module comprises an imager and a circuit board (or circuit boards) and a lens. The lens comprises a plurality of optical elements fixedly disposed in a lens barrel. A front camera housing portion is configured to receive an imager printed circuit board therein, with the imager printed circuit board disposed at the lens barrel and bonded thereat with the imager optically aligned with an optical axis of the optical elements. The lens barrel and the front camera housing portion are integrally formed together as a single part. The front camera housing portion includes mounting structure configured to mount the camera module at a vehicle. A rear camera housing portion is mated with a rear portion of the front camera housing so as to encase and seal the imager printed circuit board in the camera module. The mounting structure is formed as part of the front camera housing so as to be evenly or centrally located about the optical axis and about a center of mass of the camera module. Circuitry of the camera module is electrically connected to the imager and is electrically connected to electrical connecting elements that are configured to electrically connect to a wire harness of the vehicle.
Optionally, the rear camera housing portion may comprise a heat sink. Optionally, the optical elements may be fixedly bonded to the lens barrel.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle vision system and/or driver assist system and/or object detection system and/or alert system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and provide an output to a display device for displaying images representative of the captured image data. Optionally, the vision system may provide a display function, such as a rearview display or a top down or bird's eye or surround view display or the like.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes at least one exterior facing imaging sensor or camera, such as a rearward facing imaging sensor or camera 14a (and the system may optionally include multiple exterior facing imaging sensors or cameras, such as a forward facing camera 14b at the front (or at the windshield) of the vehicle, and a sideward/rearward facing camera 14c, 14d at respective sides of the vehicle), which captures images exterior of the vehicle, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera.
The control unit may comprise or may be part of an autonomous vehicle control system, whereby the cameras capture image data that is processed for use in autonomously controlling the vehicle. Optical axis alignment and focus requirements are very precise for autonomous vehicle cameras. Typical automotive cameras are required to maintain alignment to within +/−1 degree, whereas autonomous vehicle cameras may have alignment requirements that are hundreds of times tighter. Also, typical cameras may have a focus range of greater than about 40 μm, whereas autonomous vehicle cameras may have a focus range of less than about 10 μm.
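For context only, and assuming a representative lens focal length of about 6 mm and an imager pixel pitch of about 3 μm (illustrative values, not values recited herein), a small angular misalignment θ of the optical axis shifts the image at the imager by approximately

\[ \delta \approx f\,\tan\theta \approx f\,\theta , \]

so that θ = 1 degree gives δ ≈ (6 mm)(0.0175) ≈ 105 μm, or roughly 35 pixels, while a requirement one hundred times tighter (θ ≈ 0.01 degree) keeps the shift near 1 μm, well under one pixel.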
Cameras comprise many interfacing components along the optical axis which may shift when exposed to thermal changes, vibrations or mechanical shocks. Some typical interfaces include: (1) lens elements-to-lens barrel, (2) lens barrel-to-lens holder, (3) lens holder-to-image sensor PCB, (4) lens holder-to-camera front housing, (5) camera front housing-to-camera rear housing and mounting features, and/or (6) camera rear housing mounting features-to-camera bracket. Each of these interfaces contributes to the overall alignment and focus tolerance expected of standard automotive cameras today.
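As a simplified illustration of how the number of interfaces drives the focus tolerance stack (the per-interface values below are assumptions for illustration, not values recited herein), if each of N interfaces can contribute an axial shift of about ±t_i, a root-sum-square (RSS) estimate of the combined stack is

\[ t_{\text{stack}} \approx \sqrt{\sum_{i=1}^{N} t_i^{2}} , \]

so six interfaces at roughly ±15 μm each give about ±37 μm (on the order of the ~40 μm typical focus range noted above), whereas, for example, three bonded interfaces at roughly ±5 μm each give about ±8.7 μm, within a sub-10 μm budget.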
The “high precision camera” of the present invention minimizes alignment and focus change. For example, while lens elements are typically placed within a “barrel” with some small amount of clearance and held in place with a compression force, the camera of the present invention has its lens elements permanently bonded in place in the lens barrel to prevent shifting.
Also, the camera directly adhesively bonds the image sensor PCB (printed circuit board) at the lens or lens barrel. To eliminate all sources of movement between the lens and image sensor, the imager or its printed circuit board is bonded directly to the lens structure or lens barrel, such as via a suitable quick-cure adhesive.
The camera also includes a unified lens and camera body structure. To eliminate the lens-to-camera body structure interface, these are a single piece or single construction. The lens barrel structure is extended to also become the camera body and includes the camera mounting features (that mount the camera at the vehicle, such as via fasteners or the like).
The structural rigidity and load balance or load symmetry are important for maintaining the camera's aim during dynamic conditions such as vehicle vibration. Thus, the mounting features are evenly or centrally located about the camera optical axis and center of mass of the camera.
Symmetrical heat distribution about the optical axis or key structures of the camera also becomes very important for these high levels of precision. The camera of the present invention is designed to evenly distribute heat about the optical axis and includes sources of heat dissipation, such as cooling fins or mounting features, positioned to do the same. This typically means positioning such features as symmetrically as possible around the optical axis.
If required for athermalization, the lens barrel may be constructed from a material with a lower CTE (coefficient of thermal expansion), such as titanium or the like, or a higher CTE, such as a zinc alloy or the like, to provide a focus CTE that is selected to maintain focus of images at the imager in varying temperatures. Optionally, a bridge member may be included between the lens barrel and image sensor to compensate for thermal effects. Such a bridge member may be permanently bonded to the lens barrel.
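One way to express this athermalization condition (with the numeric values below being illustrative assumptions rather than values recited herein) is that the residual defocus over a temperature change ΔT,

\[ \Delta z \approx \left(\frac{df}{dT} - \alpha_{\text{barrel}}\,L\right)\Delta T , \]

is driven toward zero by selecting the barrel (or bridge member) material so that its thermal growth α_barrel·L over the relevant mechanical length L approximately matches the lens focus drift df/dT. For example, if the lens focus were to drift by about +0.5 μm/°C over a 20 mm barrel length, a zinc alloy (CTE of about 26×10⁻⁶/°C) would contribute about 0.52 μm/°C of compensating growth and nearly cancel the drift, whereas a lower-CTE material such as titanium (about 8.6×10⁻⁶/°C) would be the better choice if the lens drift were smaller or of the opposite sign.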
The camera may reduce moisture in the air within the camera module or body by way of a desiccant, a nitrogen bath assembly or other gas, to prevent internal condensation on cooled components. Optionally, for example, a desiccant sheet with an adhesive backing may be used. Optionally, a heat spreader, such as one made from graphite (such as a graphite sheet or layer or film, or such as another suitable heat spreading or heat diffusing sheet or layer or film, such as, for example, a graphene sheet or layer or film or the like), may be used to enhance the heat sinking ability of the rear cover.
In the illustrated embodiment, the camera module includes two printed circuit boards (in addition to the imager PCB) that include circuitry associated with the imager and camera. Circuitry of the PCBs and of the camera module is electrically connected to the imager and is electrically connected to electrical connecting elements that are configured to electrically connect to a wire harness of the vehicle. One of the PCBs may include an image processor disposed thereat. A thermal element of the camera module may be in contact with the PCB having the processor, and may be in contact at or near the processor, so as to draw heat generated by the processor away from the processor and its PCB and toward the rear of the camera housing. For example, the thermal element may engage the rear surface of the processor PCB at or near or opposite from the processor.
The camera module may utilize aspects of the cameras and connectors described in U.S. Pat. Nos. 9,621,769; 9,596,387; 9,277,104; 9,077,098; 8,994,878; 8,542,451 and/or 7,965,336, and/or U.S. Publication Nos. US-2009-0244361; US-2013-0242099; US-2014-0373345; US-2015-0124098; US-2015-0222795; US-2015-0327398; US-2016-0243987; US-2016-0268716; US-2016-0286103; US-2016-0037028; US-2017-0054881; US-2017-0133811; US-2017-0201661; US-2017-0280034; US-2017-0295306; US-2017-0302829 and/or US-2018-0098033, and/or U.S. patent application Ser. No. 16/165,204, filed Oct. 19, 2018, and published on Apr. 25, 2019 as U.S. Publication No. US-2019-0124243, and/or Ser. No. 16/165,253, filed Oct. 19, 2018, now U.S. Pat. No. 10,678,018, which are hereby incorporated herein by reference in their entireties.
The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an image processing chip selected from the EYEQ™ family of image processing chips available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
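For illustration only, the following is a minimal sketch of the flow described above (capture image data, detect objects, then alert the driver and overlay the displayed image). All names in the sketch (DetectedObject, detect_objects, Display, process_frame) are hypothetical placeholders and do not represent the API of any particular image processing chip or object detection software:

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class DetectedObject:
    label: str                       # e.g., "vehicle" or "pedestrian"
    bbox: Tuple[int, int, int, int]  # bounding box in pixels: (x, y, w, h)
    confidence: float                # detection confidence in [0, 1]


def detect_objects(frame) -> List[DetectedObject]:
    # Stand-in for the object detection performed by the image processor;
    # a real system would analyze the captured image data here.
    return []


class Display:
    # Stand-in for the in-cabin display device showing the camera video.
    def issue_alert(self) -> None:
        print("ALERT: object detected in field of view")

    def draw_overlay(self, bbox: Tuple[int, int, int, int], label: str) -> None:
        print(f"overlay '{label}' at {bbox}")

    def show(self, frame) -> None:
        print("displaying frame")


def process_frame(frame, display: Display, alert_threshold: float = 0.5) -> None:
    """When a detected object exceeds the confidence threshold, alert the
    driver and highlight the detected object on the displayed image."""
    detections = [d for d in detect_objects(frame) if d.confidence >= alert_threshold]
    if detections:
        display.issue_alert()
        for det in detections:
            display.draw_overlay(det.bbox, det.label)
    display.show(frame)


if __name__ == "__main__":
    process_frame(frame=None, display=Display())
```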
For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 9,233,641; 9,146,898; 9,174,574; 9,090,234; 9,077,098; 8,818,042; 8,886,401; 9,077,962; 9,068,390; 9,140,789; 9,092,986; 9,205,776; 8,917,169; 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or U.S. Publication Nos. US-2014-0340510; US-2014-0313339; US-2014-0347486; US-2014-0320658; US-2014-0336876; US-2014-0307095; US-2014-0327774; US-2014-0327772; US-2014-0320636; US-2014-0293057; US-2014-0309884; US-2014-0226012; US-2014-0293042; US-2014-0218535; US-2014-0218535; US-2014-0247354; US-2014-0247355; US-2014-0247352; US-2014-0232869; US-2014-0211009; US-2014-0160276; US-2014-0168437; US-2014-0168415; US-2014-0160291; US-2014-0152825; US-2014-0139676; US-2014-0138140; US-2014-0104426; US-2014-0098229; US-2014-0085472; US-2014-0067206; US-2014-0049646; US-2014-0052340; US-2014-0025240; US-2014-0028852; US-2014-005907; US-2013-0314503; US-2013-0298866; US-2013-0222593; US-2013-0300869; US-2013-0278769; US-2013-0258077; US-2013-0258077; US-2013-0242099; US-2013-0215271; US-2013-0141578 and/or US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO/2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application is a continuation of U.S. patent application Ser. No. 16/947,775, filed Aug. 17, 2020, now U.S. Pat. No. 11,212,429, which is a continuation of U.S. patent application Ser. No. 16/165,170, filed Oct. 19, 2018, now U.S. Pat. No. 10,750,064, which claims the filing benefits of U.S. provisional application Ser. No. 62/575,650, filed Oct. 23, 2017, which is hereby incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5550677 | Schofield et al. | Aug 1996 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5949331 | Schofield et al. | Sep 1999 | A |
7038577 | Pawlicki et al. | May 2006 | B2 |
7720580 | Higgins-Luthman | May 2010 | B2 |
7855755 | Weller et al. | Dec 2010 | B2 |
7965336 | Bingle et al. | Jun 2011 | B2 |
8542451 | Lu et al. | Sep 2013 | B2 |
8994878 | Byrne et al. | Mar 2015 | B2 |
9077098 | Latunski | Jul 2015 | B2 |
9277104 | Sesti et al. | Mar 2016 | B2 |
9596387 | Achenbach et al. | Mar 2017 | B2 |
9621769 | Mai et al. | Apr 2017 | B2 |
10750064 | Byrne | Aug 2020 | B2 |
11212429 | Byrne | Dec 2021 | B2 |
20090244361 | Gebauer et al. | Oct 2009 | A1 |
20120019940 | Lu et al. | Jan 2012 | A1 |
20130242099 | Sauer et al. | Sep 2013 | A1 |
20140373345 | Steigerwald | Dec 2014 | A1 |
20150015713 | Wang et al. | Jan 2015 | A1 |
20150124098 | Winden et al. | May 2015 | A1 |
20150222795 | Sauer et al. | Aug 2015 | A1 |
20150327398 | Achenbach et al. | Nov 2015 | A1 |
20150379361 | Boulanger | Dec 2015 | A1 |
20160037028 | Biemer | Feb 2016 | A1 |
20160243987 | Kendall | Aug 2016 | A1 |
20160268716 | Conger et al. | Sep 2016 | A1 |
20160286103 | Van Dan Elzen | Sep 2016 | A1 |
20170048463 | Mleczko | Feb 2017 | A1 |
20170054881 | Conger et al. | Feb 2017 | A1 |
20170133811 | Conger et al. | May 2017 | A1 |
20170201661 | Conger | Jul 2017 | A1 |
20170280034 | Hess et al. | Sep 2017 | A1 |
20170295306 | Mleczko | Oct 2017 | A1 |
20170302829 | Mleczko et al. | Oct 2017 | A1 |
20180072239 | Wienecke et al. | Mar 2018 | A1 |
20180098033 | Mleczko et al. | Apr 2018 | A1 |
20190121051 | Byrne et al. | Apr 2019 | A1 |
20190124238 | Byrne et al. | Apr 2019 | A1 |
20190124243 | Mleczko et al. | Apr 2019 | A1 |
20190306966 | Byrne et al. | Oct 2019 | A1 |
Number | Date | Country | |
---|---|---|---|
20220124229 A1 | Apr 2022 | US |
Number | Date | Country | |
---|---|---|---|
62575650 | Oct 2017 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 16947775 | Aug 2020 | US |
Child | 17646002 | | US |
Parent | 16165170 | Oct 2018 | US |
Child | 16947775 | | US |