The present invention relates generally to a vehicle vision system for a vehicle and, more particularly, to a vehicle vision system that utilizes one or more cameras at a vehicle.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.
The present invention provides a driver assistance system or vision system or imaging system for a vehicle that utilizes one or more cameras (preferably one or more CMOS cameras) to capture image data representative of images exterior of the vehicle, and provides a brightness control that controls brightness parameters of the camera or image sensor. An image processor is operable to process image data captured by the camera. The brightness control, responsive to brightness values of a previous frame of image data, interpolates towards an expected brightness by calculating a set of at least three brightness values from the previous frame of image data with three different control coefficients, from which two are chosen in between set points (of expected or desired brightnesses) for further processing. The system repeats this process over multiple frames of captured image data, adjusting the coefficients based on the previous frame and on an error between the previous frame brightness and the expected brightness, to adjust the camera brightness until it is within a threshold level of the expected brightness.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle vision system and/or driver assist system and/or object detection system and/or alert system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and provide an output to a display device for displaying images representative of the captured image data. Optionally, the vision system may provide a display, such as a rearview display or a top down or bird's eye or surround view display or the like.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes at least one exterior viewing imaging sensor or camera, such as a rearward facing imaging sensor or camera 14a (and the system may optionally include multiple exterior viewing imaging sensors or cameras, such as a forward viewing camera 14b at the front (or at the windshield) of the vehicle, and a sideward/rearward viewing camera 14c, 14d at respective sides of the vehicle), which captures images exterior of the vehicle, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera.
Conventional vehicle camera brightness controls typically employ a loop control. The brightness output is the feedback for the deviation from the desired brightness set point. The brightness can be calculated using a cumulative histogram, where, for example, the 80 percent bin may be the indicator of the brightness. To decrease or increase the brightness, there may be a coefficient that acts as a gain factor on the pixels before running the cumulative histogram. The coefficient can be a controlling parameter in a non-linear function, for example, a hyperbolic tangent function. In this way the coefficient can be used to control the brightness. Since the histogram's profile varies, the histogram cannot be expressed as a continuous mathematical function, and a single pixel's brightness is not linearly linked to the cumulative brightness result, the optimal coefficient cannot be calculated a priori, but must be estimated as well as possible in advance of each frame. A typical way is to predict the best coefficient for the next frame from the knowledge of the previous frames, i.e., by loop (or feedback) control.
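As an illustration only (not the specific implementation of any referenced system), the following Python sketch shows how a coefficient-controlled gain and a cumulative histogram can yield such a brightness indicator; the function name measure_brightness, the tanh mapping, and the 80 percent bin choice are assumptions taken from the description above.

```python
import numpy as np

def measure_brightness(pixels, coef, percentile=0.80, bins=256):
    """Brightness indicator: gray level below which `percentile` of the gained pixels fall."""
    x = pixels.astype(np.float64) / 255.0            # assume 8-bit input, normalize to [0, 1]
    gained = np.tanh(coef * x) / np.tanh(coef)       # coefficient controls a tanh gain curve (coef > 0)
    hist, edges = np.histogram(gained, bins=bins, range=(0.0, 1.0))
    cum = np.cumsum(hist) / hist.sum()               # cumulative histogram
    idx = np.searchsorted(cum, percentile)           # e.g. the "80 percent bin"
    return float(edges[min(idx, bins - 1)]) * 255.0  # report on the original 0..255 scale
```

A frame captured as a NumPy array could then be scored with, for example, measure_brightness(frame, coef=2.0).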
The brightness (loop) control is typically a PID-, PD-, DI-, PI-, or just a P-type. The parameters of the proportional, the integral, and the differential control parts are typically fixedly set. In case the overall damping is below one, the system adapts to the (user desired) set point faster, but overshoots more and oscillates longer; in case the overall system's damping is above one, there is no or less overshoot, but the desired set point or image brightness change is reached later. Overshoot, oscillation, and slow brightness adaption are not desired in brightness control. The parameters of proportional-integral-derivative (PID) controls are typically set (optimized) so that the system damping is close to 1, but since the control system is not continuous, there is no optimal PID setting. A PID controlled brightness setting is thus a compromise between the shortcomings of being slow and of overshooting. The brightness always needs to be adapted when the light condition in the captured image changes and the dynamic range of the camera, display and/or processing system is not wide enough.
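For comparison, a minimal sketch of such a conventional PID-type loop, assuming fixed gains kp, ki, kd and reusing the hypothetical measure_brightness() helper from the previous sketch, could look as follows; tuning these three fixed gains is exactly the damping compromise discussed above.

```python
class PidBrightnessControl:
    """Conventional fixed-gain PID loop on the brightness coefficient (illustrative only)."""

    def __init__(self, kp=0.05, ki=0.01, kd=0.02, b_exp=128.0):
        self.kp, self.ki, self.kd = kp, ki, kd   # fixed proportional/integral/differential gains
        self.b_exp = b_exp                       # desired brightness set point
        self.integral = 0.0
        self.prev_error = 0.0

    def next_coef(self, coef, frame):
        b = measure_brightness(frame, coef)      # brightness result of the previous frame
        e = self.b_exp - b                       # deviation from the set point
        self.integral += e
        derivative = e - self.prev_error
        self.prev_error = e
        # the corrected coefficient is applied to the next captured frame
        return coef + self.kp * e + self.ki * self.integral + self.kd * derivative
```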
Brightness Control in Practice:
Brightness control approaches mainly fall into 2 categories:
For the second approach, PID control is popular.
Instead of using a comparably slow PID control for adapting the output brightness to a varying input brightness, the camera brightness control of the present invention interpolates towards the set point (expected brightness Bexp) by calculating a set of three (or five or optionally more) brightnesses from three (or five or optionally more) different control coefficients, from which two are chosen in between the set points for further processing.
For example, for every current frame of image data, there are two hidden frames running in parallel (instantaneously) with it. These three frames of image data have their own brightness coefficients and brightness results. Due to the nonlinearity between the coefficient and the brightness, the next coefficient is found by interpolating among the three results rather than by direct calculation.
The present invention provides enhanced fast online brightness control through coefficient interpolation, using the minimal brightness deviation e:

min(e) = min(|B(Ci) − Bexp|), where B− ≤ B ≤ B+.
In the solution of the present invention, image brightness (B) is controlled by a coefficient (C).
Conventionally, a brightness control requirement may be fulfilled by PID control. But for real-time purposes, the PID control introduces the complexity of parameter selection: it has at least three parameters to adjust, and it is difficult to find an optimal combination for all lighting conditions.
The system or control of the present invention takes advantage of field-programmable gate array (FPGA) parallel processing on the original image and produces 3 versions of brightness values out of 3 different previous coefficients simultaneously. The 3 coefficients are chosen as the previous coefficient (Coefi-1), a larger coefficient (+) and a smaller coefficient (−), so that there are correspondingly 3 brightness value results (B, B+ and B−). The next coefficient (Coefi) is calculated by interpolation among these 3 coefficients with respect to the smallest error of the 3 brightness values. Since the FPGA cannot hold the current frame in memory, the new coefficient may be used for the next frame's brightness determination. Optionally, with very fast systems or systems with a long frame pause time and a frame memory, the brightness of the current frame may be calculated from the newly found coefficient instantaneously.
The next coefficient is obtained by interpolating between the two of the three results whose brightness values lie closest to the expected brightness.
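Since the exact interpolation formula is not reproduced here, the following sketch assumes simple linear interpolation between the two candidate coefficients whose brightness results have the smallest error to the expected brightness Bexp; the function name, the step size `delta`, and the reuse of measure_brightness() are illustrative assumptions.

```python
def next_coef_by_interpolation(frame, coef_prev, delta, b_exp):
    """One control step: evaluate C-, C and C+ 'in parallel' and interpolate the next coefficient."""
    candidates = [coef_prev - delta, coef_prev, coef_prev + delta]       # C-, Coef(i-1), C+
    brightness = [measure_brightness(frame, c) for c in candidates]      # B-, B, B+ (parallel paths on the FPGA)

    # keep the two candidates with the smallest error |B - Bexp|
    order = sorted(range(3), key=lambda i: abs(brightness[i] - b_exp))
    i, j = order[0], order[1]
    b_i, b_j = brightness[i], brightness[j]
    c_i, c_j = candidates[i], candidates[j]

    if b_j == b_i:                       # degenerate case: no usable slope, keep the best candidate
        return c_i
    # interpolate along the local coefficient/brightness slope towards Bexp
    return c_i + (b_exp - b_i) * (c_j - c_i) / (b_j - b_i)
```

In a frame-by-frame loop the returned value becomes the coefficient Coefi applied to the next frame, matching the behavior described for systems without a frame memory.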
A dead zone ε is needed to have a stable video (no pumping or flickering). The dead zone ε may be a tolerance band above and below the error e.
In case the calculated coefficient correction (as a function of the current difference or error e) is within the dead zone boundary, the correction is ignored and the new brightness coefficient remains identical to the former.
When an algorithm according to the invention uses a function of the difference e that is linear, such as ƒ(e)=k*e (with k being a scaling factor coefficient), or non-linear, such as ƒ(e)=2^(k*e) (with k being a scaling factor coefficient in the exponent), the convergence to the actual brightness may be significantly faster.
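A minimal sketch of the resulting coefficient update follows, assuming the dead zone is applied to the correction magnitude and that the non-linear variant is applied symmetrically around zero error; both of these details, as well as the parameters `epsilon`, `k` and `coef_max`, are assumptions for illustration.

```python
import math

def update_coefficient(coef_prev, b_prev, b_exp, epsilon=0.05, k=0.05,
                       exponential=False, coef_max=16.0):
    """Apply the error-scaled correction with a dead zone and an optional upper limit."""
    e = b_exp - b_prev                                    # current brightness error
    if exponential:
        # non-linear variant f(e) = 2**(k*e): magnitude grows exponentially, sign follows the error
        correction = math.copysign(2.0 ** (k * abs(e)) - 1.0, e)
    else:
        correction = k * e                                # linear variant f(e) = k*e
    if abs(correction) <= epsilon:                        # within the dead zone: keep the former coefficient
        return coef_prev
    return min(coef_prev + correction, coef_max)          # optionally limit the coefficient to a maximal value
```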
Optionally, the coefficient may be limited to a maximal value.
With extreme lighting conditions (too bright or too dark), the control module can additionally change the global gain or integration time of the image sensor, although this is done only rarely.
This brightness control can also target a sub-region, such as a region of interest (ROI).
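As a sketch only, restricting the measurement to an ROI amounts to running the histogram over a sub-array of the frame; the ROI bounds and helper name below are hypothetical.

```python
def measure_brightness_roi(frame, coef, roi):
    """Measure brightness only inside roi = (top, bottom, left, right)."""
    top, bottom, left, right = roi
    return measure_brightness(frame[top:bottom, left:right], coef)
```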
Optionally, the system of the present invention may incorporate image noise filtering methods and algorithms such as by utilizing aspects of the vision systems described in U.S. Publication No. US-2015-0042806, which is hereby incorporated herein by reference in its entirety.
Optionally, the system of the present invention may incorporate enhanced low light capability methods and algorithms such as by utilizing aspects of the vision systems described in U.S. Publication No. US-2014-0354811, which is hereby incorporated herein by reference in its entirety.
Optionally, the system of the present invention may incorporate shading correction methods and algorithms such as by utilizing aspects of the vision systems described in U.S. provisional application Ser. No. 62/448,091, filed Jan. 19, 2017, which is hereby incorporated herein by reference in its entirety.
Optionally, the system of the present invention may incorporate test methods and devices such as by utilizing aspects of the vision systems described in U.S. provisional application Ser. No. 62/486,072, filed Apr. 17, 2017, which is hereby incorporated herein by reference in its entirety.
The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an image processing chip selected from the EyeQ family of image processing chips available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. Preferably, the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 9,233,641; 9,146,898; 9,174,574; 9,090,234; 9,077,098; 8,818,042; 8,886,401; 9,077,962; 9,068,390; 9,140,789; 9,092,986; 9,205,776; 8,917,169; 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or U.S. Publication Nos. US-2014-0340510; US-2014-0313339; US-2014-0347486; US-2014-0320658; US-2014-0336876; US-2014-0307095; US-2014-0327774; US-2014-0327772; US-2014-0320636; US-2014-0293057; US-2014-0309884; US-2014-0226012; US-2014-0293042; US-2014-0218535; US-2014-0218535; US-2014-0247354; US-2014-0247355; US-2014-0247352; US-2014-0232869; US-2014-0211009; US-2014-0160276; US-2014-0168437; US-2014-0168415; US-2014-0160291; US-2014-0152825; US-2014-0139676; US-2014-0138140; US-2014-0104426; US-2014-0098229; US-2014-0085472; US-2014-0067206; US-2014-0049646; US-2014-0052340; US-2014-0025240; US-2014-0028852; US-2014-005907; US-2013-0314503; US-2013-0298866; US-2013-0222593; US-2013-0300869; US-2013-0278769; US-2013-0258077; US-2013-0258077; US-2013-0242099; US-2013-0215271; US-2013-0141578 and/or US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO 2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.
Optionally, the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device, such as by utilizing aspects of the video display systems described in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187; 6,690,268; 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,677,851; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,508; 6,222,460; 6,513,252 and/or 6,642,851, and/or U.S. Publication Nos. US-2012-0162427; US-2006-0050018 and/or US-2006-0061008, which are all hereby incorporated herein by reference in their entireties. Optionally, the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2010/099416; WO 2011/028686; WO 2012/075250; WO 2013/019795; WO 2012/075250; WO 2012/145822; WO 2013/081985; WO 2013/086249 and/or WO 2013/109869, and/or U.S. Publication No. US-2012-0162427, which are hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application claims the filing benefits of U.S. provisional application Ser. No. 62/369,775, filed Aug. 2, 2016, which is hereby incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5530240 | Larson et al. | Jun 1996 | A |
5535314 | Alves et al. | Jul 1996 | A |
5550677 | Schofield et al. | Aug 1996 | A |
5668663 | Varaprasad et al. | Sep 1997 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5724187 | Varaprasad et al. | Mar 1998 | A |
5949331 | Schofield et al. | Sep 1999 | A |
6329925 | Skiver et al. | Dec 2001 | B1 |
6690268 | Schofield et al. | Feb 2004 | B2 |
7038577 | Pawlicki et al. | May 2006 | B2 |
7184190 | McCabe et al. | Feb 2007 | B2 |
7195381 | Lynam et al. | Mar 2007 | B2 |
7255451 | McCabe et al. | Aug 2007 | B2 |
7274501 | McCabe et al. | Sep 2007 | B2 |
7338177 | Lynam | Mar 2008 | B2 |
7370983 | DeWind et al. | May 2008 | B2 |
7375803 | Bamji | May 2008 | B1 |
7446650 | Scholfield et al. | Nov 2008 | B2 |
7581859 | Lynam | Sep 2009 | B2 |
7626749 | Baur et al. | Dec 2009 | B2 |
7720580 | Higgins-Luthman | May 2010 | B2 |
7855755 | Weller et al. | Dec 2010 | B2 |
8446470 | Lu et al. | May 2013 | B2 |
9041806 | Baur et al. | May 2015 | B2 |
9126525 | Lynam et al. | Sep 2015 | B2 |
9264672 | Lynam | Feb 2016 | B2 |
9800794 | Weber | Oct 2017 | B2 |
20020135743 | Gindele | Sep 2002 | A1 |
20020163489 | Kuo | Nov 2002 | A1 |
20030103141 | Bechtel et al. | Jun 2003 | A1 |
20050206745 | Daiku | Sep 2005 | A1 |
20060050018 | Hutzel et al. | Mar 2006 | A1 |
20060061008 | Karner et al. | Mar 2006 | A1 |
20060164533 | Hsieh et al. | Jul 2006 | A1 |
20070097260 | Takeuchi | May 2007 | A1 |
20110080421 | Capener | Apr 2011 | A1 |
20110206280 | Lee | Aug 2011 | A1 |
20110292241 | Segapelli et al. | Dec 2011 | A1 |
20130250114 | Lu | Sep 2013 | A1 |
20130321476 | Botzas | Dec 2013 | A1 |
20140160284 | Achenbach et al. | Jun 2014 | A1 |
20140333729 | Pflug | Nov 2014 | A1 |
20140340510 | Ihlenburg et al. | Nov 2014 | A1 |
20140354811 | Weber | Dec 2014 | A1 |
20150022664 | Pflug et al. | Jan 2015 | A1 |
20150042806 | Wierich | Feb 2015 | A1 |
20150049193 | Gupta et al. | Feb 2015 | A1 |
20150245043 | Greenebaum | Aug 2015 | A1 |
20160119600 | Kusuda | Apr 2016 | A1 |
20170122244 | Dufford | May 2017 | A1 |
20180139368 | Nakayama | May 2018 | A1 |
20180204310 | Junglas et al. | Jul 2018 | A1 |
20180302615 | Lehmann et al. | Oct 2018 | A1 |
Number | Date | Country | |
---|---|---|---|
20180041713 A1 | Feb 2018 | US |
Number | Date | Country | |
---|---|---|---|
62369775 | Aug 2016 | US |