The present invention relates generally to a vehicle vision system for a vehicle and, more particularly, to a vehicle vision system that utilizes one or more cameras at a vehicle.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.
The present invention provides a camera assembly process or method for assembling a vehicle camera for a driver assistance system or vision system or imaging system for a vehicle that utilizes one or more cameras (preferably one or more CMOS cameras) to capture image data representative of images exterior of the vehicle. The method or process assembles the camera components such that the components (such as the PCB or PCBs and the electrical connector) can float or move along a z-axis of the camera assembly relative to the camera housing. The PCB or PCBs may be supported by a cage that is disposed in the housing and that limits the insertion of the lens holder and PCBs into the housing. After the components are assembled together, the components can be adjusted relative to the housing, and then an adhesive is applied and cured to retain the components relative to the housing. Thus, the camera assembly can be configured for the particular vehicle and vision system application.
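As a purely illustrative aid (not part of the claimed method), the floating relationship may be modeled in code; the class, field names and travel limits below are assumptions introduced only for clarity:

```python
from dataclasses import dataclass

# Illustrative model only: components such as the PCBs and the electrical
# connector remain free to slide along the z-axis (the housing's
# longitudinal axis) until the adhesive is applied and cured. The travel
# limits stand in for the stops provided by the cage and housing.

@dataclass
class FloatingComponent:
    name: str
    z_mm: float      # current position along the camera's z-axis
    z_min_mm: float  # travel limited by the cage / housing stops (assumed values)
    z_max_mm: float

    def adjust(self, dz_mm: float) -> None:
        """Slide the component along z, clamped to the allowed travel."""
        self.z_mm = min(self.z_max_mm, max(self.z_min_mm, self.z_mm + dz_mm))

# Example: fine-position the lens holder before the adhesive is cured.
lens_holder = FloatingComponent("lens holder", z_mm=0.0, z_min_mm=-0.5, z_max_mm=0.5)
lens_holder.adjust(0.2)
```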
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle vision system and/or driver assist system and/or object detection system and/or alert system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and provide an output to a display device for displaying images representative of the captured image data. Optionally, the vision system may provide a display, such as a rearview display or a top down or bird's eye or surround view display or the like.
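As a minimal sketch of the capture-process-display flow described above (the camera, processor and display objects and their method names are hypothetical stand-ins, not part of the disclosed system):

```python
# Hypothetical processing loop for the vision system described above;
# capture_frame(), detect_objects() and render() are stand-ins for the
# camera, image processor and display device interfaces.

def vision_loop(camera, processor, display):
    while True:
        frame = camera.capture_frame()             # image data exterior of the vehicle
        objects = processor.detect_objects(frame)  # e.g., objects in the predicted path
        display.render(frame, overlays=objects)    # rearview or surround-view display
```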
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes at least one exterior facing imaging sensor or camera, such as a rearward facing imaging sensor or camera 14a (and the system may optionally include multiple exterior facing imaging sensors or cameras, such as a forward facing camera 14b at the front (or at the windshield) of the vehicle, and a sideward/rearward facing camera 14c, 14d at respective sides of the vehicle), which captures images exterior of the vehicle, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera.
The present invention provides a universal camera design, with the lens already aligned and glued to the lens holder at the beginning of assembly. To obtain a “true” universal design where lines A and B are parallel, a flexible connection between the imager PCB and the connector PCB is needed. In that case, the customer connector is also at a well-defined position. This provides enhanced fit for the various vehicle applications. The camera thus provides a flexible connection, and the sub-assembly process for the PCB package will vary accordingly.
Thus, the connector PCB and imager PCB may be engaged with the cage and the cage may be inserted into the housing (along the longitudinal axis of the housing) until the cage stops or limits insertion with the lens holder at the front opening of the housing and the electrically conductive connectors of the connector PCB at the rear opening of the housing. The rear connector portion is then disposed at the rear of the housing to electrically connect the electrically conductive connectors of the connector PCB to the terminals of the connector portion. The rear connector portion is then adhesively or otherwise bonded at the rear of the housing and the lens holder is adhesively or otherwise bonded at the front of the housing to complete the camera assembly process.
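The ordering of these assembly steps may be summarized in pseudocode form; every function name below is an assumed label used only to make the sequence explicit, not an actual implementation:

```python
# Hypothetical encoding of the assembly order described above.

def assemble_camera(housing, cage, imager_pcb, connector_pcb, lens_holder, rear_connector):
    cage.engage(imager_pcb, connector_pcb)   # PCBs engage the cage
    housing.insert_along_z(cage)             # insert until the cage stops or limits insertion
    rear_connector.mate(connector_pcb)       # terminals meet the conductive connectors
    housing.bond(rear_connector, at="rear")  # adhesively or otherwise bond the rear connector
    housing.bond(lens_holder, at="front")    # adhesively or otherwise bond the lens holder
```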
A paste is applied (only if necessary) through a hole in the rear cover at a location that is later covered by the customer connector. The customer connector assembly 4 is provided with molded-in pins and is disposed or inserted into the header until it lies on the surface of the rear cover or housing portion (with its position defined or determined by reference holes at the rear cover, but not held in the z-direction). This position is maintained until the camera assembly is finished, with some adhesive provided between the rear cover 3 and the connector 4 at their mating surfaces A. Optionally, the subassembly comprising the rear cover and the connector may be assembled at the supplier.
An adhesive (such as a UV curable adhesive) is then applied (in its uncured state) at locations 5 and 6.
Thus, the assembly process of the present invention allows for assembly of the camera components, with the components being adjustable along the z-axis after initial assembly, such that application of and curing of adhesive applied at opposite end regions of the housing portion affixes the components relative to the housing at the desired or appropriate location relative to the housing. The optical axis of the lens assembly, the z-axis of the housing and the direction of the electrical connection at the rear electrical connector are parallel to one another to allow for z-axis adjustment during assembly (and prior to application of and curing of the adhesive), with the plane of the PCB or PCBs being normal to the z-axis.
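As a hedged illustration of such a z-axis adjustment followed by adhesive cure (the specification does not prescribe a focus metric; the sharpness measure, step size and travel span below are assumptions):

```python
# Illustrative z-axis adjustment prior to curing; the specification only
# states that the components are adjusted along z and then fixed by
# applying and curing the adhesive, so the details below are assumed.

def adjust_and_cure(stage, camera, dispenser, uv_lamp, step_mm=0.01, span_mm=0.5):
    center = stage.position_mm
    best_z, best_score = center, float("-inf")
    z = center - span_mm
    while z <= center + span_mm:    # sweep the floating components along z
        stage.move_to(z)
        score = camera.sharpness()  # hypothetical focus/sharpness measure
        if score > best_score:
            best_z, best_score = z, score
        z += step_mm
    stage.move_to(best_z)           # hold the chosen position,
    dispenser.apply_at_ends()       # apply adhesive at the opposite end regions,
    uv_lamp.cure()                  # and cure to affix the components to the housing
```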
The camera and cage may utilize aspects of the cameras described in U.S. patent application Ser. No. 15/642,749, filed Jul. 6, 2017, and published on Jan. 11, 2018 as U.S. Patent Publication No. US-2018-0013935, which is hereby incorporated herein by reference in its entirety. The lens and imager alignment of the camera may utilize aspects of the camera assemblies described in U.S. Pat. Nos. 8,542,451 and/or 9,277,104, which are hereby incorporated herein by reference in their entireties. The camera may include electrical connecting elements that may utilize aspects of the cameras and electrical connectors described in U.S. Pat. No. 9,233,641 and/or U.S. Publication Nos. US-2013-0242099; US-2014-0373345; US-2015-0222795; US-2015-0266430; US-2015-0365569; US-2016-0037028; US-2016-0268716; US-2017-0054881; US-2017-0133811 and/or US-2017-0201661, and/or U.S. patent applications Ser. No. 15/478,274, filed Apr. 4, 2017, and published Oct. 12, 2017 as U.S. Patent Publication No. US-2017-0295306, and/or Ser. No. 15/487,459, filed Apr. 14, 2017, and published Oct. 19, 2017 as U.S. Patent Publication No. US-2017-0302829, which are hereby incorporated herein by reference in their entireties.
The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an image processing chip selected from the EyeQ family of image processing chips available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
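A minimal sketch of the alert-and-overlay behavior described above (the detection record and the alert and display interfaces are hypothetical, introduced only for illustration):

```python
# Hypothetical alert/overlay handling: when the image processor reports a
# detected object or other vehicle, alert the driver and highlight the
# detection in the displayed image.

def handle_detections(detections, alert_unit, display):
    for det in detections:
        if det.kind in ("vehicle", "pedestrian", "object"):
            alert_unit.notify(det)                 # alert the driver of the vehicle
            display.add_overlay(det.bounding_box)  # highlight the detected object
```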
The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. Preferably, the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
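For reference, a quick arithmetic check of the pixel counts quoted above (a 640×480 array already exceeds the stated 300,000-element minimum):

```python
# Arithmetic check of the array sizes quoted above.
cols, rows = 640, 480
print(cols * rows)             # 307200, which exceeds the 300,000-pixel minimum
print(cols * rows >= 300_000)  # True; a megapixel array exceeds 1,000,000 pixels
```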
For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 9,233,641; 9,146,898; 9,174,574; 9,090,234; 9,077,098; 8,818,042; 8,886,401; 9,077,962; 9,068,390; 9,140,789; 9,092,986; 9,205,776; 8,917,169; 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or U.S. Publication Nos. US-2014-0340510; US-2014-0313339; US-2014-0347486; US-2014-0320658; US-2014-0336876; US-2014-0307095; US-2014-0327774; US-2014-0327772; US-2014-0320636; US-2014-0293057; US-2014-0309884; US-2014-0226012; US-2014-0293042; US-2014-0218535; US-2014-0247354; US-2014-0247355; US-2014-0247352; US-2014-0232869; US-2014-0211009; US-2014-0160276; US-2014-0168437; US-2014-0168415; US-2014-0160291; US-2014-0152825; US-2014-0139676; US-2014-0138140; US-2014-0104426; US-2014-0098229; US-2014-0085472; US-2014-0067206; US-2014-0049646; US-2014-0052340; US-2014-0025240; US-2014-0028852; US-2014-005907; US-2013-0314503; US-2013-0298866; US-2013-0222593; US-2013-0300869; US-2013-0278769; US-2013-0258077; US-2013-0242099; US-2013-0215271; US-2013-0141578 and/or US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO 2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application claims the filing benefits of U.S. provisional application Ser. No. 62/377,878, filed Aug. 22, 2016, which is hereby incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5204615 | Richards | Apr 1993 | A |
5525264 | Cronin et al. | Jun 1996 | A |
5550677 | Schofield et al. | Aug 1996 | A |
5559556 | Kagebeck | Sep 1996 | A |
5657539 | Orikasa et al. | Aug 1997 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5821532 | Beaman et al. | Oct 1998 | A |
5854708 | Komatsu et al. | Dec 1998 | A |
5872332 | Verma | Feb 1999 | A |
5920061 | Feng | Jul 1999 | A |
5949331 | Schofield et al. | Sep 1999 | A |
5978017 | Tino | Nov 1999 | A |
6002544 | Yatsu | Dec 1999 | A |
6013372 | Hayakawa et al. | Jan 2000 | A |
6071606 | Yamazaki et al. | Jun 2000 | A |
6072814 | Ryan et al. | Jun 2000 | A |
6117193 | Glenn | Sep 2000 | A |
6151065 | Steed | Nov 2000 | A |
6178034 | Allemand et al. | Jan 2001 | B1 |
6193378 | Tonar et al. | Feb 2001 | B1 |
6222447 | Schofield et al. | Apr 2001 | B1 |
6259475 | Ramachandran et al. | Jul 2001 | B1 |
6292311 | Bohn et al. | Sep 2001 | B1 |
6481003 | Maeda | Nov 2002 | B1 |
6483101 | Webster | Nov 2002 | B1 |
6535242 | Strumolo et al. | Mar 2003 | B1 |
6559439 | Tsuchida et al. | May 2003 | B1 |
6590658 | Case et al. | Jul 2003 | B2 |
6603612 | Nakano | Aug 2003 | B2 |
6651187 | Lacey, III | Nov 2003 | B2 |
6654187 | Ning | Nov 2003 | B2 |
6805767 | Shinomiya | Oct 2004 | B2 |
6897432 | Schmidtke et al. | May 2005 | B2 |
7015944 | Holz et al. | Mar 2006 | B2 |
7031075 | Tsuji | Apr 2006 | B2 |
7095123 | Prior | Aug 2006 | B2 |
7095572 | Lee et al. | Aug 2006 | B2 |
7215479 | Bakin | May 2007 | B1 |
7268957 | Frenzel et al. | Sep 2007 | B2 |
7289037 | Uken et al. | Oct 2007 | B2 |
7391458 | Sakamoto | Jun 2008 | B2 |
7419315 | Hirata et al. | Sep 2008 | B2 |
7423665 | Ray et al. | Sep 2008 | B2 |
7453509 | Losehand et al. | Nov 2008 | B2 |
7480149 | DeWard et al. | Jan 2009 | B2 |
7536316 | Ozer et al. | May 2009 | B2 |
7599134 | Bechtel et al. | Oct 2009 | B2 |
7665915 | Lee | Feb 2010 | B2 |
7697056 | Huang | Apr 2010 | B2 |
7768574 | Humpston | Aug 2010 | B2 |
7965336 | Bingle et al. | Jun 2011 | B2 |
8120652 | Bechtel et al. | Feb 2012 | B2 |
8256821 | Lawlor et al. | Sep 2012 | B2 |
8318512 | Shah et al. | Nov 2012 | B2 |
8482664 | Byrne | Jul 2013 | B2 |
8542451 | Lu et al. | Sep 2013 | B2 |
8994878 | Byrne et al. | Mar 2015 | B2 |
9233641 | Sesti et al. | Jan 2016 | B2 |
9277104 | Sesti et al. | Mar 2016 | B2 |
9338334 | Lu et al. | May 2016 | B2 |
9365160 | Byrne et al. | Jun 2016 | B2 |
9487159 | Achenbach | Nov 2016 | B2 |
9596387 | Achenbach et al. | Mar 2017 | B2 |
9992392 | Byrne et al. | Jun 2018 | B2 |
20020167605 | Akimoto et al. | Nov 2002 | A1 |
20020175832 | Mizusawa | Nov 2002 | A1 |
20030090569 | Poechmueller | May 2003 | A1 |
20030137595 | Takachi | Jul 2003 | A1 |
20050104995 | Spryshak et al. | May 2005 | A1 |
20050141106 | Lee et al. | Jun 2005 | A1 |
20050190283 | Ish-Shalom et al. | Sep 2005 | A1 |
20050274883 | Nagano | Dec 2005 | A1 |
20060049533 | Kamoshita | Mar 2006 | A1 |
20060050018 | Hutzel et al. | Mar 2006 | A1 |
20060054802 | Johnston | Mar 2006 | A1 |
20060056077 | Johnston | Mar 2006 | A1 |
20060061008 | Karner et al. | Mar 2006 | A1 |
20060065436 | Gally et al. | Mar 2006 | A1 |
20060077575 | Nakai et al. | Apr 2006 | A1 |
20060103727 | Tseng | May 2006 | A1 |
20060125919 | Camilleri et al. | Jun 2006 | A1 |
20060171704 | Bingle et al. | Aug 2006 | A1 |
20060184297 | Higgins-Luthman | Aug 2006 | A1 |
20070040034 | Hennick | Feb 2007 | A1 |
20070096020 | Mitsugi et al. | May 2007 | A1 |
20070279518 | Apel et al. | Dec 2007 | A1 |
20080024883 | Iwasaki | Jan 2008 | A1 |
20080043105 | Kallhammer et al. | Feb 2008 | A1 |
20080122965 | Fang | May 2008 | A1 |
20090010494 | Bechtel et al. | Jan 2009 | A1 |
20090012203 | Nakanishi et al. | Jan 2009 | A1 |
20090244361 | Gebauer et al. | Oct 2009 | A1 |
20090295181 | Lawlor et al. | Dec 2009 | A1 |
20100015713 | Deeter et al. | Jan 2010 | A1 |
20100097519 | Byrne et al. | Apr 2010 | A1 |
20100103308 | Butterfield et al. | Apr 2010 | A1 |
20100279439 | Shah et al. | Nov 2010 | A1 |
20110025850 | Maekawa et al. | Feb 2011 | A1 |
20110298968 | Tseng et al. | Dec 2011 | A1 |
20120081550 | Sewell | Apr 2012 | A1 |
20120265416 | Lu | Oct 2012 | A1 |
20130242099 | Sauer et al. | Sep 2013 | A1 |
20140000804 | Looi et al. | Jan 2014 | A1 |
20140022657 | Lu et al. | Jan 2014 | A1 |
20140313337 | Devota et al. | Oct 2014 | A1 |
20140373345 | Steigerwald | Dec 2014 | A1 |
20150124098 | Winden et al. | May 2015 | A1 |
20150222795 | Sauer et al. | Aug 2015 | A1 |
20150266430 | Mleczko et al. | Sep 2015 | A1 |
20150327398 | Achenbach et al. | Nov 2015 | A1 |
20150365569 | Mai et al. | Dec 2015 | A1 |
20160037028 | Biemer | Feb 2016 | A1 |
20160255257 | Lu et al. | Sep 2016 | A1 |
20160268716 | Conger et al. | Sep 2016 | A1 |
20170054881 | Conger et al. | Feb 2017 | A1 |
20170129419 | Conger et al. | May 2017 | A1 |
20170133811 | Conger et al. | May 2017 | A1 |
20170201661 | Conger | Jul 2017 | A1 |
20170295306 | Mleczko | Oct 2017 | A1 |
20170302829 | Mleczko et al. | Oct 2017 | A1 |
20180013935 | Kunze et al. | Jan 2018 | A1 |
20180072239 | Wienecke et al. | Mar 2018 | A1 |
20180098033 | Mleczko et al. | Apr 2018 | A1 |
Number | Date | Country | |
---|---|---|---|
20180054555 A1 | Feb 2018 | US |
Number | Date | Country | |
---|---|---|---|
62377878 | Aug 2016 | US |