The present invention relates generally to a vehicle vision system for a vehicle and, more particularly, to a vehicle vision system that utilizes one or more cameras at a vehicle.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.
The present invention provides a testing system for a camera for a driver assistance system or vision system or imaging system for a vehicle. The testing system or method measures or estimates the defocus of the camera. The system includes a collimator with an internal optic and target. The target may be angled relative to an image plane of the camera or include a stepped surface within the view of the optic and camera. The defocus of the camera may be measured based on image data captured of the angled or stepped target.
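The angled-target measurement lends itself to a brief illustration. Below is a minimal sketch, not taken from the patent: the Tenengrad-style gradient-energy sharpness metric, the strip geometry, and the depth_per_strip_mm calibration constant (which would follow from the target's tilt angle and the collimator optics) are all illustrative assumptions. Because the target is tilted relative to the image plane, each vertical strip of the captured image corresponds to a slightly different effective object distance, so the strip of peak sharpness indicates where the camera is focused.

```python
import numpy as np

def tenengrad(strip: np.ndarray) -> float:
    """Gradient-energy sharpness score for one grayscale image strip."""
    g = strip.astype(float)
    gx = np.diff(g, axis=1)  # horizontal gradients
    gy = np.diff(g, axis=0)  # vertical gradients
    return float((gx[:-1, :] ** 2 + gy[:, :-1] ** 2).mean())

def estimate_defocus(image: np.ndarray, n_strips: int = 32,
                     depth_per_strip_mm: float = 0.05) -> float:
    """Estimate defocus (mm) from an image of a tilted target.

    depth_per_strip_mm is a hypothetical calibration constant mapping
    strip offset to effective depth along the tilted target.
    """
    strips = np.array_split(image, n_strips, axis=1)
    scores = np.array([tenengrad(s) for s in strips])
    peak = int(np.argmax(scores))      # strip where the camera is sharpest
    center = (n_strips - 1) / 2.0      # strip corresponding to zero defocus
    return (peak - center) * depth_per_strip_mm
```

Under these assumptions, a well-focused camera peaks near the center strip, and the sign of the returned value indicates whether focus falls in front of or behind the target's center plane.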
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle vision system and/or driver assist system and/or object detection system and/or alert system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and provide an output to a display device for displaying images representative of the captured image data. Optionally, the vision system may provide a display, such as a rearview display or a top down or bird's eye or surround view display or the like.
Referring now to the drawings and the illustrative embodiments depicted therein, a vision system 10 for a vehicle 12 includes at least one exterior viewing imaging sensor or camera, such as a forward viewing imaging sensor or camera, which may be disposed at and behind the windshield 14 of the vehicle and viewing forward through the windshield so as to capture image data representative of the scene occurring forward of the vehicle.
The system or method of the present invention estimates or measures the defocus of a vehicular camera. Typically, the ability to determine the position of a lens (i.e., an optic) in relation to an imager is accomplished by one of three conventional methods, each illustrated in the drawings.
The system or method of the present invention, in contrast, measures or estimates the defocus of the camera using a single collimator with no moving parts and one or more fixed targets disposed within the field of view of the camera.
In another implementation, a camera 610 has a field of view that includes a collimator 620.
In some examples, the system includes multiple targets 710 to increase the measurement window. That is, additional targets may be added before and/or after a center target, as sketched below.
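One way such added targets could widen the measurement window, sketched here under illustrative assumptions (the depth offsets, sharpness scores, and function name are hypothetical, not taken from the patent): each target sits at a known effective depth, so fitting a curve through the per-target sharpness scores locates best focus even when it falls between individual targets.

```python
import numpy as np

def best_focus_mm(target_depths_mm, sharpness_scores):
    """Fit a parabola to (depth, sharpness) samples; return its vertex."""
    a, b, _ = np.polyfit(target_depths_mm, sharpness_scores, 2)
    return -b / (2.0 * a)  # vertex of the fitted focus-response curve

# Three targets: one before, one at, and one after nominal focus.
depths = [-0.5, 0.0, +0.5]   # assumed effective depth offsets in mm
scores = [0.62, 0.95, 0.71]  # measured sharpness at each target
print(f"estimated defocus: {best_focus_mm(depths, scores):+.3f} mm")
```

A parabola is a natural choice here because a focus-response curve is approximately quadratic near its peak; any smooth peak model would serve equally well.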
Thus, the system of the present invention includes a single collimator, without moving parts, with one or more targets for measuring the defocus of a camera. The collimator may include a light emitting diode (LED) light source. High-precision tooling (e.g., Fullcut Mill (FCM) 3) may be used to improve stability. Image data captured by the camera with the target or targets present in the field of view of the camera is processed (via an image processor) to determine the degree of focus or defocus of the camera at the target.
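For the stepped-target variant, a similar sketch (again illustrative assumptions rather than the patent's implementation; the step depths are taken as calibrated and uniformly spaced): each step presents a discrete effective depth, so the sharpest step gives a coarse defocus reading that interpolation between neighboring steps can refine.

```python
import numpy as np

def defocus_from_steps(step_rois, step_depths_mm):
    """step_rois: one grayscale array per target step.
    step_depths_mm: calibrated effective depth of each step (uniform spacing)."""
    # Variance of horizontal gradients as a simple per-step sharpness score.
    scores = [float(np.var(np.diff(r.astype(float), axis=1))) for r in step_rois]
    k = int(np.argmax(scores))
    # Refine with three-point parabolic interpolation when neighbors exist.
    if 0 < k < len(scores) - 1:
        denom = scores[k - 1] - 2 * scores[k] + scores[k + 1]
        if denom != 0:
            frac = 0.5 * (scores[k - 1] - scores[k + 1]) / denom
            spacing = step_depths_mm[1] - step_depths_mm[0]
            return step_depths_mm[k] + frac * spacing
    return step_depths_mm[k]
```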
The system includes an image processor operable to process image data captured by the camera or cameras. For example, the image processor may comprise an image processing chip selected from the EYEQ family of image processing chips available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. The system may utilize aspects of the systems described in U.S. Publication No. US-2018-0373944, which is hereby incorporated herein by reference in its entirety.
For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 9,233,641; 9,146,898; 9,174,574; 9,090,234; 9,077,098; 8,818,042; 8,886,401; 9,077,962; 9,068,390; 9,140,789; 9,092,986; 9,205,776; 8,917,169; 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or U.S. Publication Nos. US-2014-0340510; US-2014-0313339; US-2014-0347486; US-2014-0320658; US-2014-0336876; US-2014-0307095; US-2014-0327774; US-2014-0327772; US-2014-0320636; US-2014-0293057; US-2014-0309884; US-2014-0226012; US-2014-0293042; US-2014-0218535; US-2014-0218535; US-2014-0247354; US-2014-0247355; US-2014-0247352; US-2014-0232869; US-2014-0211009; US-2014-0160276; US-2014-0168437; US-2014-0168415; US-2014-0160291; US-2014-0152825; US-2014-0139676; US-2014-0138140; US-2014-0104426; US-2014-0098229; US-2014-0085472; US-2014-0067206; US-2014-0049646; US-2014-0052340; US-2014-0025240; US-2014-0028852; US-2014-005907; US-2013-0314503; US-2013-0298866; US-2013-0222593; US-2013-0300869; US-2013-0278769; US-2013-0258077; US-2013-0258077; US-2013-0242099; US-2013-0215271; US-2013-0141578 and/or US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO 2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application is a continuation of U.S. patent application Ser. No. 16/718,823, filed Dec. 18, 2019, now U.S. Pat. No. 11,012,684, which claims priority of U.S. provisional application, Ser. No. 62/878,945, filed Jul. 26, 2019, and U.S. provisional application Ser. No. 62/781,791, filed Dec. 19, 2018, which are hereby incorporated herein by reference in their entireties.
U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---
5550677 | Schofield et al. | Aug 1996 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5949331 | Schofield et al. | Sep 1999 | A |
7038577 | Pawlicki et al. | May 2006 | B2 |
7720580 | Higgins-Luthman | May 2010 | B2 |
7855755 | Weller et al. | Dec 2010 | B2 |
11012684 | Sesti et al. | May 2021 | B2 |
20020050988 | Petrov | May 2002 | A1 |
20050089208 | Dong et al. | Apr 2005 | A1 |
20060038910 | Knoedgen | Feb 2006 | A1 |
20060038976 | Knoedgen | Feb 2006 | A1 |
20070211240 | Matsumoto et al. | Sep 2007 | A1 |
20140152845 | Seger | Jun 2014 | A1 |
20150138372 | Apel | May 2015 | A1 |
20150277135 | Johnson | Oct 2015 | A1 |
20170006282 | Sigle | Jan 2017 | A1 |
20170132774 | Ruprecht | May 2017 | A1 |
20170234923 | Douglas et al. | Aug 2017 | A1 |
20170287166 | Claveau | Oct 2017 | A1 |
20180113321 | Heshmat Dehkordi | Apr 2018 | A1 |
20180302615 | Lehmann et al. | Oct 2018 | A1 |
20180373944 | Sesti et al. | Dec 2018 | A1 |
20200084436 | Patterson | Mar 2020 | A1 |
Prior Publication Data

Number | Date | Country
---|---|---
20210274160 A1 | Sep 2021 | US
Provisional Applications

Number | Date | Country
---|---|---
62878945 | Jul 2019 | US
62781791 | Dec 2018 | US
Related U.S. Application Data

Relation | Number | Date | Country
---|---|---|---
Parent | 16718823 | Dec 2019 | US
Child | 17302934 | | US