Method for enhancing vehicle camera image quality

Information

  • Patent Grant
  • Patent Number
    10,257,432
  • Date Filed
    Monday, September 25, 2017
  • Date Issued
    Tuesday, April 9, 2019
Abstract
A method of image enhancement for a vehicle vision system includes providing a camera at the vehicle and providing a processor operable to process image data. Multiple frames of image data are captured with the camera, and contrast is enhanced in image data by tone mapping. As the vehicle moves, contrast thresholds are tracked within the captured frames of image data with respect to image flow caused by the vehicle's movement. Image data of a first frame of captured image data may be passed through two individual image transfer functions to generate a first transferred frame of image data. The first transferred frame may be blended with a second frame of image data. Presence of an object is detected in the field of view of the camera, and an output is generated responsive to detection of the object present in the field of view of the camera.
Description
FIELD OF THE INVENTION

The present invention relates to imaging systems or vision systems for vehicles.


BACKGROUND OF THE INVENTION

Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,877,897; 5,796,094; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.


SUMMARY OF THE INVENTION

The present invention provides a vision system or imaging system for a vehicle that utilizes one or more cameras to capture images exterior of the vehicle, such as forwardly or rearwardly of the vehicle, and provides for enhanced image processing to detect objects in poor visibility conditions, such as in dense fog or the like.


The vision system may enhance the image processing by amplifying the contrast in the captured images by brightness transfer function filtering and exposure stacking, and by tracking contrast thresholds or features within the captured images, such as on a frame-by-frame basis as the vehicle travels along a road.


These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a plan view of a vehicle with a vision system and forward facing imaging sensor or camera that provides a forward exterior field of view in accordance with the present invention;



FIG. 2 shows images captured by the forward facing camera and processed by the vision system of the present invention;



FIGS. 3(a) and 3(b) show graphs of histograms of luminance distribution for the vision system, with FIG. 3(b) showing the histogram of the original image of FIG. 3(a) spread into the highest possible dynamic range of the target system, which equates to the contrast amplification of the present invention;



FIGS. 4(a) and 4(b) show graphs of brightness transfer functions A (FIG. 4(a)) and B (FIG. 4(b)), which find use in the flow charts of FIGS. 5 and 6, with brightness transfer function A enhancing the brighter areas and dampening the darker ones, and with brightness transfer function B decreasing the medium illuminated areas, with its upper end at less than 100 percent, whereby the overall illumination is decreased by this transfer function;



FIG. 5 shows a flow chart of the image enhancing and processing steps according to the invention; by mapping/stacking an illumination reduced image scene on top of a contrast enhanced image, the dynamic range of the image increases: overexposed areas appear less bright and underexposed areas brighter, which makes details in the scene easier to discern;



FIG. 6 shows a flow chart of the image enhancing and processing steps according to the invention as used in a vehicle vision system, supporting machine and human vision driver assistance algorithms; and



FIG. 7 shows an example of how images may be altered when processed according to the flow chart of FIG. 5, whereby it becomes apparent that the process brings out more contrast of possible objects in foggy weather conditions when comparing Image(t0) and Imageh(t0).





DESCRIPTION OF THE PREFERRED EMBODIMENTS

Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes at least one imaging sensor or camera 14 (such as a forward facing camera at the front (or at the windshield) of the vehicle), which captures images exterior of and forwardly of the vehicle (FIG. 1). The imaging system 12 is operable to process (such as via an image processor) image data captured by the camera 14 to present the captured images on a display or to detect objects and/or headlights of approaching vehicles and/or taillights of leading vehicles in the field of view of the camera (such as for use in an object detection system, collision avoidance system, headlamp control system, adaptive cruise control system, lane change or lane departure warning system, traffic sign recognition system or driver assistance system of the vehicle or the like).


The image processor of the vision system 12 is operable to process captured image data, such as to detect and identify objects forward (and optionally sideward and/or rearward) of the vehicle during normal operation of the vehicle. In poor visibility conditions, such as foggy conditions and/or heavy snow fall conditions or the like, objects may be difficult for the driver to see and may be difficult even for the image processor to detect, even when image processing algorithms for lens pollution detection (such as similar to that described in U.S. provisional application Ser. No. 61/616,126, filed Mar. 27, 2012, which is hereby incorporated herein by reference in its entirety) come into use. For example, and with reference to image “A” in FIG. 2, during low visibility conditions, such as the fog conditions shown, it is difficult for the driver of the vehicle to detect the person and dog at the side of the road ahead of the vehicle and beyond the principal illumination area of the vehicle headlamps (set at low beams for seeing in the fog). The image processor may process the image to detect objects, but, with reference to image “B” in FIG. 2, normal image processing may not detect the object of interest (the person and dog in this example) due to the poor visibility conditions. Typically, the object detection may not work reliably when the image contrast falls below a certain level. In order to increase the detectability of such objects in poor visibility conditions, the vision system of the present invention is operable to enhance or increase the contrast of the captured images so that any objects in the field of view of the camera are darkened to enhance the detectability of the objects by the image processor or to ease the visibility of the objects to the driver of the vehicle.


As can be seen with reference to images “C” through “F” in FIG. 2, as the contrast is increased, the side markers or posts along the side of the road and the object of interest (the person and dog in this example) become darker and, in this example, the object moves relative to other fixed objects in the captured images (see images B-F in FIG. 2 and note that the person and dog approach the fixed road marker in the captured images), and thus the image processor can detect the presence of the fixed and moving objects and determine if they are objects of interest to the driver of the vehicle and generate the appropriate signal responsive to such detection and determination or identification. For example, the system, responsive to such an object detection, may generate an alert to the driver or may adjust the headlamps accordingly or may display the detected object on a display screen for viewing by the driver (particularly for backup assist systems where the object is detected rearward of the vehicle during a reversing maneuver). Thus, by increasing the contrast in captured images, the vision system can enhance detection of objects in the camera's field of view that may otherwise go undetected. The system may be operable to increase the contrast in the captured images responsive to a user input or to a detection or determination of a low visibility condition, such as responsive to a signal from a rain sensor or the like that is indicative of detection of a foggy condition or such as responsive to image processing of the captured images to determine that the vehicle is in foggy driving conditions (such as by utilizing aspects of the vision systems described in U.S. Pat. Nos. 4,973,844; 5,796,094; 5,877,897 and 6,353,392, which are hereby incorporated herein by reference in their entireties).


It is known to provide image contrast enhancement for photographs (such as photographs taken by hand held digital cameras or astronomical telescopes or the like), and such enhancements may be done by known computer based tools for editing images. Today, nearly every operating system, library, presentation program and/or the like provides at least basic image editing functions. Professional photo editing programs like CorelDRAW®, Gimp® or Adobe Photoshop® provide a wide range of image editing and enhancing features. A technique typically used for contrast enhancement is editing of the contrast histogram, which can be used to expose objects more strongly. A function used especially for this is “Contrast Enhancement through Localized Histogram Equalization” (see Cromwell-intl.com: http://www.cromwell-intl.com/3d/histogram/, which is hereby incorporated herein by reference in its entirety). Even night images can be contrast enhanced in a way that dimly illuminated objects become more visible. Such algorithms used in consumer computer programs for image enhancing are typically applied to individual pictures, and are not meant to be used in real time applications.
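The histogram-based contrast enhancement referenced above can be illustrated with a minimal global histogram equalization; the localized variant applies the same kind of mapping per image region. The function name and the 8-bit grayscale assumption are illustrative, not taken from the patent:

```python
import numpy as np

def equalize_histogram(img):
    """Global histogram equalization of an 8-bit grayscale image.

    Maps each gray level through the normalized cumulative histogram
    so that the output levels spread over the full 0..255 range.
    """
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]  # CDF value at the darkest occurring level
    scale = 255.0 / max(cdf[-1] - cdf_min, 1)
    # Classic equalization look-up table; levels not present in the
    # image may map to clipped values, which is harmless
    lut = np.clip(np.round((cdf - cdf_min) * scale), 0, 255).astype(np.uint8)
    return lut[img]
```

A low-contrast input whose levels occupy only a narrow band is stretched so that its darkest occurring level maps to 0 and its brightest to 255.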


Image quality improvement in poor visibility conditions is known from airborne weather surveillance pictures, for reworking pictures taken in cloudy (foggy) situations. The best results were achieved by Oakley et al. when a contrast enhancement algorithm was used in conjunction with a temporal filter (see Image Processing, IEEE; “Improving Image Quality in Poor Visibility Conditions Using a Physical Model for Contrast Degradation,” http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=660994, by Oakley, J. P. and Satherley, B. L., February 1998, which is hereby incorporated herein by reference in its entirety). The basis was a physical model of fog reflection.


Attempts have been made to do video contrast enhancement, such as in “Contrast Enhancement Using Brightness Preserving Bi-Histogram Equalization” by Yeong-Taeg Kim (Consumer Electronics, IEEE: “Contrast Enhancement Using Brightness Preserving Bi-Histogram Equalization,” by Yeong-Taeg Kim, February 1997, which is hereby incorporated herein by reference in its entirety). This requires real time processing. Demand for this was and is in applications such as television images, images provided by medical devices, military engineering and/or the like, and Kim et al. suggested “Partially Overlapped Sub-Block Histogram Equalization” to be used in cameras (Circuits and Systems for Video Technology, IEEE: “Partially Overlapped Sub-Block Histogram Equalization,” http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=915354, by Joung-Youn Kim, Lee-Sup Kim and Seung-Ho Hwang, April 2001, which is hereby incorporated herein by reference in its entirety). Also, Marsi et al. were able to simplify the algorithms by using recursive rational filters (Imaging Systems and Techniques, 2004, IEEE International Workshop: “Real Time Video Contrast Enhancement by Using Recursive Rational Filter,” http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=1397276, by Marsi, S., Ramponi, G. and Carrato, S., May 14, 2004, which is hereby incorporated herein by reference in its entirety), and Wang et al. suggested the use of weighted thresholded histogram equalization for fast processing (Consumer Electronics, IEEE: “Real Time Video Contrast Enhancement by using Weighted Thresholded Histogram Equalization,” http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=4266969, by Qing Wang and Ward, R. K., May 2007, which is hereby incorporated herein by reference in its entirety). Another challenge is noise, a common problem with electronic cameras; Starck et al. published a procedure for noise reduction by curvelet transforms in 2003 (Image Processing, IEEE: “Gray and Color Image Contrast Enhancement by the Curvelet Transform,” http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=1208320, by Starck, J.-L., Murtagh, F., Candes, E. J. and Donoho, D. L., June 2003, which is hereby incorporated herein by reference in its entirety).


It is also known to use infrared systems or low light amplifying systems in vehicles. Earlier systems used infrared cameras alone, and some systems additionally use infrared headlights to light up the area in front of the vehicle (invisible to the human eye), which makes that area easier to detect with the infrared camera. Infrared cameras may provide enhanced performance in object detection in dense fog conditions due to their physical principle: the detected wavelengths have the intrinsic property of penetrating fog, so objects in fog can be detected and/or visualized.


State of the art automotive driver assistance systems typically provide the driver with useful information about the vehicle's environment, including the traffic or objects in front of, to the side of and rearward of the vehicle. Typically, there are additional warnings or image overlays for highlighting hazards, especially those in the driving direction of the vehicle and in the anticipated path of travel of the vehicle. Obstacles or pedestrians that are in the path of the vehicle or tend to step into the path of the vehicle may be highlighted. Systems that also perform active interventions, such as braking or collision avoidance maneuvers, are also known. For distinguishing pedestrians from other objects and for predicting their walking direction and speed, the detected objects need to be tracked over a certain time. Also, analyzing shapes or markers of walking or standing pedestrians is known in the field of automotive vision systems and image processing. Due to the vehicle's own movement, the objects in the captured images flow or move over successively captured images (optical flow). For example, external or outside objects (even stationary objects) move through the images taken from a front facing vehicle camera as the vehicle travels along the road. Algorithms for tracking objects under driving conditions are also known. When a vehicle drives through a turn, the optical flow also behaves in a turned manner. That turn can be anticipated from knowledge of the steering wheel's angle and a kinematic model of the vehicle's cornering behavior. The optical flow speed translates directly from the vehicle's ground speed given by the odometer. Alternatively, known algorithms may determine the optical flow directly from the image stream without the previously mentioned inputs from the vehicle.
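The anticipation of optical flow from the steering wheel angle and vehicle speed described above might be sketched as follows; the bicycle model, the pinhole-camera geometry and all parameter values are illustrative assumptions, not taken from the patent:

```python
import math

def predicted_flow(speed_mps, steering_angle_rad, wheelbase_m, dt,
                   depth_m, focal_px, x_px):
    """Predict the horizontal image flow (pixels per frame) of a static
    point at a given depth, from the vehicle's ego-motion.

    Assumes a pinhole camera aligned with the driving direction and a
    simple bicycle model relating steering angle to yaw rate.
    """
    # Bicycle model: yaw rate from ground speed and steering angle
    yaw_rate = speed_mps * math.tan(steering_angle_rad) / wheelbase_m
    # Rotation shifts every pixel laterally by roughly f * yaw * dt
    rot_flow = focal_px * yaw_rate * dt
    # Forward translation makes points diverge from the image center
    trans_flow = x_px * speed_mps * dt / depth_m
    return rot_flow + trans_flow
```

Driving straight ahead, a point at the image center shows zero predicted flow, while any steering input adds a lateral flow component shared by the whole image.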


To enable the above mentioned pedestrian and obstacle recognition and tracking algorithms to work properly, especially to be able to highlight a hazard or warn the driver or intervene (such as via braking or cruise control adjustment or the like), it is necessary to receive sufficient images. In foggy driving conditions or during heavy snow fall driving conditions, cameras in the visible spectrum deliver images of insufficient quality. The present invention enhances the image quality of visible spectrum cameras, especially the dynamic range of the resulting image, so that the driver assist system algorithms can work properly and/or the processed image can be displayed to the driver as an improvement to his or her view in such limited visibility conditions. This is achieved without the need for additional cameras using different light spectrums (such as infrared sensitive cameras or the like) or other sensors for the same purpose or high dynamic range (HDR) cameras.


The present invention thus provides enhanced image quality in poor visibility conditions captured by a non HDR camera by amplifying the contrast details in the captured images, generating a pseudo HDR image out of current and historical image components by tone mapping. The system then tracks the contrast thresholds/features within the captured images with respect to the image flow caused by the vehicle's movement. This process is repeated on a frame-by-frame basis to detect and identify objects in the camera's forward field of view, as can be seen in FIGS. 5 and 6. At every loop, the historical (previously enhanced) image (Imageh(t0-n)) passes two individual image transfer functions and then becomes superimposed (or mapped, merged, blended or stacked) with the currently captured frame (Image(t0)). This tone mapping method is called image stacking, exposure fusion or exposure blending. The mapping ratio of how much of the historical image (Imageh(t0-n)) is kept and how much of the current image (Image(t0)) is mapped in is freely selectable between 0 and 1. In the example in FIG. 5, 20%/80% was chosen for a data frame rate of 30 frames per second. Slower frame rates might require a shift toward a stronger influence of (Image(t0)). The image enhancements used shall not be limited to those shown in the example of FIGS. 5 and 6.
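One loop of the stacking process just described might be sketched as follows; the blend ratio, the use of 256-entry look-up tables for the two transfer functions, and the function names are illustrative assumptions rather than the patent's implementation:

```python
import numpy as np

ALPHA = 0.8  # assumed share of the historical image kept per loop

def enhance_loop(history, current, transfer_a, transfer_b):
    """One iteration of the pseudo-HDR stacking loop (sketch of FIG. 5).

    history, current: 8-bit grayscale frames as uint8 arrays.
    transfer_a, transfer_b: 256-entry look-up tables standing in for
    brightness transfer functions A and B.
    """
    enhanced_history = transfer_a[history]  # function A: spread dynamic range
    dimmed_current = transfer_b[current]    # function B: damp mid-tones
    # Exposure blending / image stacking of the two components
    blended = ALPHA * enhanced_history.astype(np.float32) \
        + (1.0 - ALPHA) * dimmed_current.astype(np.float32)
    return np.clip(np.round(blended), 0, 255).astype(np.uint8)
```

In use, the returned frame becomes the historical image for the next loop, so the enhancement accumulates over successive frames.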


The brightness transfer function A (FIG. 4(a)) enhances the brighter areas and dampens the darker ones. This equates to a histogram spread (dynamic range increase), such as shown in FIGS. 3(a) and 3(b), of the historical image (Imageh(t0-n)). The brightness transfer function B (FIG. 4(b)) decreases the medium illuminated areas of the currently captured image (Image(t0)). Its upper end is at less than 100%, so the overall illumination is decreased by this transfer function. FIG. 7 shows that already after this step the discrimination between the object (person with dog) and the surrounding (foggy) area is improved. By mapping/stacking an illumination reduced image scene (the currently captured image) on top of a contrast enhanced image (the historical image), the dynamic range of the image increases, as can be seen in FIGS. 3(a) and 3(b). Overexposed areas appear less bright and underexposed areas brighter, which makes details in the scene easier to discern (see FIG. 7). After consecutive loops, a blooming effect or halo may appear at the borders of areas with high contrast. This effect may be enhanced by some blurring caused by the unavoidable inaccuracy of distorting, turning, cropping and shifting the currently captured image onto the historical scene.
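Since the exact curves of FIGS. 4(a) and 4(b) are given only graphically, the two transfer functions might be approximated as look-up tables along the following lines; the S-curve shape, the sinusoidal mid-tone dip and all parameter values are assumptions for illustration only:

```python
import numpy as np

def transfer_a():
    """Sketch of brightness transfer function A: enhances brighter areas
    and dampens darker ones (an assumed smoothstep S-curve)."""
    x = np.arange(256) / 255.0
    y = 3 * x**2 - 2 * x**3  # S-curve: pushes darks down, brights up
    return np.round(255 * y).astype(np.uint8)

def transfer_b(mid_drop=0.35, ceiling=0.9):
    """Sketch of brightness transfer function B: decreases the medium
    illuminated areas and caps the upper end below 100%, reducing the
    overall illumination (parameters are illustrative)."""
    x = np.arange(256) / 255.0
    dip = mid_drop * np.sin(np.pi * x)  # strongest reduction at mid-gray
    y = np.clip(x - dip, 0.0, ceiling) * 255
    return np.round(y).astype(np.uint8)
```

Applied as look-up tables (e.g. `transfer_a()[image]`), function A spreads the histogram while function B lowers the mid-tones and the overall illumination, matching the qualitative behavior described for FIGS. 4(a) and 4(b).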


The result of this image processing and tracking of the features with respect to the optical flow and the vehicle movement is shown in principle in FIG. 2 (and discussed above). The algorithm is based on already established image processing procedures (non-automotive image enhancement of photographs, ‘image registration’ and the like), such as tonal value splitting/buckling/limiting, histogram equalization and the like, as can be seen in simplified form with reference to FIGS. 3(a) and 3(b).


Because the yet to be processed images are captured by a camera on a moving vehicle, it is necessary that the optical flow and the corresponding information or data of objects (whether steady or moving) moving through the images, including the vehicle speed, the steering angle of the vehicle and the like, be taken into account. There may be a model of the vehicle's kinematic mathematical equations, and its results may be stored in a look up table. The camera's or cameras' parameters, such as mounting position, viewing angle and optical properties, may be reflected in that (combined) look up table or in another mathematical model or table. The moving objects/obstacles can thus be distinguished from steady objects relative to the movement of the vehicle that is equipped with the camera system or vision system of the present invention. Object classification may work at further distances when fed enhanced image data. Further algorithms may process the image data and may indicate hazards or the like, and/or may actively intervene to avoid collisions and the like. The image enhancing algorithm may find use in processing multiple camera images separately or in processing a stitched image, which may be arranged as a vehicle top view image or the like.
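The look up table of ego-motion predictions and the separation of moving objects from steady ones might be sketched as follows; the bicycle model, the grid of speeds and steering angles, and the residual threshold are illustrative assumptions, not details given in the patent:

```python
import math

def build_flow_lut(wheelbase_m, focal_px, dt, speeds, steer_angles):
    """Precompute predicted rotational image flow (pixels per frame)
    for a grid of vehicle speeds (m/s) and steering angles (rad), as a
    stand-in for the kinematic-model look up table mentioned above."""
    lut = {}
    for v in speeds:
        for a in steer_angles:
            yaw_rate = v * math.tan(a) / wheelbase_m  # bicycle model
            lut[(v, a)] = focal_px * yaw_rate * dt
    return lut

def is_moving(measured_flow_px, predicted_flow_px, tol_px=2.0):
    """Classify a tracked feature as a moving object when its measured
    flow deviates from the ego-motion prediction by more than tol_px."""
    return abs(measured_flow_px - predicted_flow_px) > tol_px
```

A feature whose tracked flow matches the table entry for the current speed and steering angle is treated as part of the static scene; a large residual flags a moving object or obstacle.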


The imaging sensor and its photosensor array may comprise any suitable camera or sensing device, such as, for example, an array of a plurality of photosensor elements arranged in 640 columns and 480 rows (a 640×480 imaging array), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. The logic and control circuit of the imaging sensor may function in any known manner, such as in the manner described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094 and/or 6,396,397, and/or U.S. provisional applications, Ser. No. 61/696,416, filed Sep. 4, 2012; Ser. No. 61/682,995, filed Aug. 14, 2012; Ser. No. 61/682,486, filed Aug. 13, 2012; Ser. No. 61/680,883, filed Aug. 8, 2012; Ser. No. 61/678,375, filed Aug. 1, 2012; Ser. No. 61/676,405, filed Jul. 27, 2012; Ser. No. 61/666,146, filed Jun. 29, 2012; Ser. No. 61/653,665, filed May 31, 2012; Ser. No. 61/653,664, filed May 31, 2012; Ser. No. 61/648,744, filed May 18, 2012; Ser. No. 61/624,507, filed Apr. 16, 2012; Ser. No. 61/616,126, filed Mar. 27, 2012; Ser. No. 61/615,410, filed Mar. 26, 2012; Ser. No. 61/613,651, filed Mar. 21, 2012; Ser. No. 61/607,229, filed Mar. 6, 2012; Ser. No. 61/605,409, filed Mar. 1, 2012; Ser. No. 61/602,878, filed Feb. 24, 2012; Ser. No. 61/602,876, filed Feb. 24, 2012; Ser. No. 61/600,205, filed Feb. 17, 2012; Ser. No. 61/588,833, filed Jan. 20, 2012; Ser. No. 61/583,381, filed Jan. 5, 2012; Ser. No. 61/579,682, filed Dec. 23, 2011; Ser. No. 61/570,017, filed Dec. 13, 2011; Ser. No. 61/568,791, filed Dec. 9, 2011; Ser. No. 61/567,446, filed Dec. 6, 2011; Ser. No. 61/559,970, filed Nov. 15, 2011; and/or Ser. No. 61/552,167, filed Oct. 27, 2011, and/or PCT Application No. PCT/CA2012/000378, filed Apr. 25, 2012, and published Nov. 1, 2012 as International Publication No. WO 2012/145822, and/or PCT Application No. 
PCT/US2012/056014, filed Sep. 19, 2012, and published Mar. 28, 2013 as International Publication No. WO 2013/043661, and/or PCT Application No. PCT/US2012/048800, filed Jul. 30, 2012, and published Feb. 7, 2013 as International Publication No. WO 2013/019707, and/or PCT Application No. PCT/US2012/048110, filed Jul. 25, 2012, and published Jan. 31, 2013 as International Publication No. WO 2013/016409, and/or U.S. patent application Ser. No. 13/534,657, filed Jun. 27, 2012, and published Jan. 3, 2013 as U.S. Publication No. US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in PCT Application No. PCT/US10/038477, filed Jun. 14, 2010, and/or U.S. patent application Ser. No. 13/202,005, filed Aug. 17, 2011, now U.S. Pat. No. 9,126,525, and/or U.S. provisional applications, Ser. No. 61/650,667, filed May 23, 2012; Ser. No. 61/579,682, filed Dec. 23, 2011; Ser. No. 61/565,713, filed Dec. 1, 2011, which are hereby incorporated herein by reference in their entireties.


The imaging device and control and image processor and any associated illumination source, if applicable, may comprise any suitable components, and may utilize aspects of the cameras and vision systems described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454 and 6,824,281, and/or International Publication No. WO 2010/099416, published Sep. 2, 2010, and/or PCT Application No. PCT/US10/47256, filed Aug. 31, 2010, and/or U.S. patent application Ser. No. 12/508,840, filed Jul. 24, 2009, and published Jan. 28, 2010 as U.S. Pat. Publication No. US 2010-0020170; and/or PCT Application No. PCT/US2012/048110, filed Jul. 25, 2012, and published Jan. 31, 2013 as International Publication No. WO 2013/016409, and/or U.S. patent application Ser. No. 13/534,657, filed Jun. 27, 2012, and published Jan. 3, 2013 as U.S. Publication No. US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The camera or cameras may comprise any suitable cameras or imaging sensors or camera modules, and may utilize aspects of the cameras or sensors described in U.S. patent application Ser. No. 12/091,359, filed Apr. 24, 2008 and published Oct. 1, 2009 as U.S. Publication No. US-2009-0244361; and/or U.S. patent application Ser. No. 13/260,400, filed Sep. 26, 2011, now U.S. Pat. No. 8,542,451, and/or U.S. Pat. Nos. 7,965,336 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties. The imaging array sensor may comprise any suitable sensor, and may utilize various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like, such as the types described in U.S. Pat. Nos. 
5,550,677; 5,670,935; 5,760,962; 5,715,093; 5,877,897; 6,922,292; 6,757,109; 6,717,610; 6,590,719; 6,201,642; 6,498,620; 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 6,806,452; 6,396,397; 6,822,563; 6,946,978; 7,339,149; 7,038,577; 7,004,606; 7,720,580 and/or 7,965,336, and/or PCT Application No. PCT/US2008/076022, filed Sep. 11, 2008 and published Mar. 19, 2009 as International Publication No. WO 2009/036176, and/or PCT Application No. PCT/US2008/078700, filed Oct. 3, 2008 and published Apr. 9, 2009 as International Publication No. WO 2009/046268, which are all hereby incorporated herein by reference in their entireties.


The camera module and circuit chip or board and imaging sensor may be implemented and operated in connection with various vehicular vision-based systems, and/or may be operable utilizing the principles of such other vehicular systems, such as a vehicle headlamp control system, such as the type disclosed in U.S. Pat. Nos. 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 7,004,606; 7,339,149 and/or 7,526,103, which are all hereby incorporated herein by reference in their entireties, a rain sensor, such as the types disclosed in commonly assigned U.S. Pat. Nos. 6,353,392; 6,313,454; 6,320,176 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties, a vehicle vision system, such as a forwardly, sidewardly or rearwardly directed vehicle vision system utilizing principles disclosed in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,877,897; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978 and/or 7,859,565, which are all hereby incorporated herein by reference in their entireties, a trailer hitching aid or tow check system, such as the type disclosed in U.S. Pat. No. 7,005,974, which is hereby incorporated herein by reference in its entirety, a reverse or sideward imaging system, such as for a lane change assistance system or lane departure warning system or for a blind spot or object detection system, such as imaging or detection systems of the types disclosed in U.S. Pat. Nos. 7,881,496; 7,720,580; 7,038,577; 5,929,786 and/or 5,786,772, and/or U.S. provisional applications, Ser. No. 60/628,709, filed Nov. 17, 2004; Ser. No. 60/614,644, filed Sep. 30, 2004; Ser. No. 60/618,686, filed Oct. 14, 2004; Ser. No. 60/638,687, filed Dec. 
23, 2004, which are hereby incorporated herein by reference in their entireties, a video device for internal cabin surveillance and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962; 5,877,897; 6,690,268 and/or 7,370,983, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties, a traffic sign recognition system, a system for determining a distance to a leading or trailing vehicle or object, such as a system utilizing the principles disclosed in U.S. Pat. Nos. 6,396,397 and/or 7,123,168, which are hereby incorporated herein by reference in their entireties, and/or the like.


Optionally, the circuit board or chip may include circuitry for the imaging array sensor and/or other electronic accessories or features, such as by utilizing compass-on-a-chip or EC driver-on-a-chip technology and aspects such as described in U.S. Pat. Nos. 7,255,451 and/or 7,480,149; and/or U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, and/or U.S. patent application Ser. No. 12/578,732, filed Oct. 14, 2009, now U.S. Pat. No. 9,487,144, which are hereby incorporated herein by reference in their entireties.


Optionally, the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device disposed at or in the interior rearview mirror assembly of the vehicle, such as by utilizing aspects of the video mirror display systems described in U.S. Pat. No. 6,690,268 and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011, now U.S. Pat. No. 9,264,672, which are hereby incorporated herein by reference in their entireties. The video mirror display may comprise any suitable devices and systems and optionally may utilize aspects of the compass display systems described in U.S. Pat. Nos. 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,677,851; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,508; 6,222,460; 6,513,252 and/or 6,642,851, and/or European patent application, published Oct. 11, 2000 under Publication No. EP 0 1043566, and/or U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, which are all hereby incorporated herein by reference in their entireties. 
Optionally, the video mirror display screen or device may be operable to display images captured by a rearward viewing camera of the vehicle during a reversing maneuver of the vehicle (such as responsive to the vehicle gear actuator being placed in a reverse gear position or the like) to assist the driver in backing up the vehicle, and optionally may be operable to display the compass heading or directional heading character or icon when the vehicle is not undertaking a reversing maneuver, such as when the vehicle is being driven in a forward direction along a road (such as by utilizing aspects of the display system described in PCT Application No. PCT/US2011/056295, filed Oct. 14, 2011 and published Apr. 19, 2012 as International Publication No. WO 2012/051500, which is hereby incorporated herein by reference in its entirety).


Optionally, the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in PCT Application No. PCT/US10/25545, filed Feb. 26, 2010 and published on Sep. 2, 2010 as International Publication No. WO 2010/099416, and/or PCT Application No. PCT/US10/47256, filed Aug. 31, 2010 and published Mar. 10, 2011 as International Publication No. WO 2011/028686, and/or PCT Application No. PCT/US11/62755, filed Dec. 1, 2011 and published Jun. 7, 2012 as International Publication No. WO 2012-075250, and/or PCT Application No. PCT/US2012/048993, filed Jul. 31, 2012, and published Feb. 7, 2013 as International Publication No. WO 2013/019795, and/or PCT Application No. PCT/CA2012/000378, filed Apr. 25, 2012, and published Nov. 1, 2012 as International Publication No. WO 2012/145822, and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011, now U.S. Pat. No. 9,264,672, and/or U.S. provisional applications, Ser. No. 61/615,410, filed Mar. 26, 2012; Ser. No. 61/588,833, filed Jan. 20, 2012; Ser. No. 61/570,017, filed Dec. 13, 2011; Ser. No. 61/568,791, filed Dec. 9, 2011; Ser. No. 61/559,970, filed Nov. 15, 2011; Ser. No. 61/540,256, filed Sep. 28, 2011, which are hereby incorporated herein by reference in their entireties.


Optionally, the video mirror display may be disposed rearward of and behind the reflective element assembly and may comprise a display such as the types disclosed in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187 and/or 6,690,268, and/or in U.S. patent applications, Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008; and/or Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are all hereby incorporated herein by reference in their entireties. The display is viewable through the reflective element when the display is activated to display information. The display element may be any type of display element, such as a vacuum fluorescent (VF) display element, a light emitting diode (LED) display element, such as an organic light emitting diode (OLED) or an inorganic light emitting diode, an electroluminescent (EL) display element, a liquid crystal display (LCD) element, a video screen display element or backlit thin film transistor (TFT) display element or the like, and may be operable to display various information (as discrete characters, icons or the like, or in a multi-pixel manner) to the driver of the vehicle, such as passenger side inflatable restraint (PSIR) information, tire pressure status, and/or the like. The mirror assembly and/or display may utilize aspects described in U.S. Pat. Nos. 7,184,190; 7,255,451; 7,446,924 and/or 7,338,177, which are all hereby incorporated herein by reference in their entireties. The thicknesses and materials of the coatings on the substrates of the reflective element may be selected to provide a desired color or tint to the mirror reflective element, such as a blue colored reflector, such as is known in the art and such as described in U.S. Pat. Nos. 
5,910,854; 6,420,036 and/or 7,274,501, which are hereby incorporated herein by reference in their entireties.


Optionally, the display or displays and any associated user inputs may be associated with various accessories or systems, such as, for example, a tire pressure monitoring system or a passenger air bag status or a garage door opening system or a telematics system or any other accessory or system of the mirror assembly or of the vehicle or of an accessory module or console of the vehicle, such as an accessory module or console of the types described in U.S. Pat. Nos. 7,289,037; 6,877,888; 6,824,281; 6,690,268; 6,672,744; 6,386,742 and 6,124,886, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties.


The display or displays may comprise a video display and may utilize aspects of the video display devices or modules described in U.S. Pat. Nos. 6,690,268; 7,184,190; 7,274,501; 7,370,983; 7,446,650 and/or 7,855,755, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are all hereby incorporated herein by reference in their entireties. The video display may be operable to display images captured by one or more imaging sensors or cameras at the vehicle.


Changes and modifications to the specifically described embodiments may be carried out without departing from the principles of the present invention, which is intended to be limited only by the scope of the appended claims as interpreted according to the principles of patent law.

Claims
  • 1. A method of image enhancement for a vehicle vision system, said method comprising: (a) providing a camera at a vehicle so that the camera has an exterior field of view;(b) providing a processor operable to process image data;(c) capturing multiple frames of image data with the camera;(d) enhancing contrast in image data by tone mapping;(e) tracking, as the vehicle moves, contrast thresholds within the captured frames of image data with respect to image flow caused by the vehicle's movement;(f) passing image data of a first frame of captured image data through two individual image transfer functions to generate a first transferred frame of image data;(g) blending the first transferred frame with a second frame of image data;(h) detecting, at least via processing of image data by the processor, presence of an object in the field of view of the camera; and(i) generating an output responsive to detection of the object present in the field of view of the camera.
  • 2. The method of claim 1, comprising executing a brightness transfer function to enhance contrast of image data.
  • 3. The method of claim 1, comprising executing tone mapping of multiple frames of image data to enhance detection of the object present in the field of view of the camera.
  • 4. The method of claim 1, comprising classifying the detected object present in the field of view of the camera.
  • 5. The method of claim 4, comprising generating an output responsive to classification of the detected object.
  • 6. The method of claim 1, comprising determining a low visibility driving condition and, responsive to determination of the low visibility driving condition, increasing contrast of features in captured image data by brightening brighter areas of image data and dampening darker areas of image data.
  • 7. The method of claim 6, comprising increasing contrast of features over multiple successive frames of captured image data.
  • 8. The method of claim 7, comprising tracking, via processing by the processor of multiple successive frames of captured image data during the determined low visibility driving condition, image flow caused by movement of the vehicle to enhance detection and identification of objects present in the field of view of the camera.
  • 9. The method of claim 1, wherein at least one previously captured frame of image data is retrieved from memory.
  • 10. The method of claim 1, comprising determining a low visibility driving condition via processing by the processor of image data.
  • 11. The method of claim 1, comprising determining that fog is present in the field of view of the camera via processing by the processor of image data.
  • 12. The method of claim 1, comprising providing blended frames of image data to a video display screen that is disposed in the vehicle at a location viewable by a driver of the vehicle when operating the vehicle.
  • 13. The method of claim 1, wherein capturing multiple frames of image data with the camera comprises capturing multiple frames of image data with the camera at a frame rate of at least 30 frames per second.
  • 14. The method of claim 1, wherein blending the first transferred frame with the second frame of image data generates a blended image frame of image data that is up to 20 percent derived from the first frame of image data.
  • 15. The method of claim 1, comprising executing a brightness transfer function on at least one previously captured frame of image data.
  • 16. The method of claim 1, comprising providing the generated output to a driver assistance system of the vehicle.
  • 17. The method of claim 16, wherein the driver assistance system of the vehicle comprises a system selected from the group consisting of (i) a lane change assist system of the vehicle, (ii) a lane departure warning system of the vehicle, (iii) a blind spot detection system of the vehicle, (iv) an adaptive cruise control system of the vehicle, (v) a collision avoidance system of the vehicle, (vi) a traffic sign recognition system of the vehicle and (vii) a vehicle headlamp control system of the vehicle.
  • 18. The method of claim 1, comprising tracking the detected object over successive frames of image data to determine if the detected object is an object of interest in the field of view of the camera.
  • 19. The method of claim 1, wherein processing of image data by the processor is responsive at least in part to steering of the vehicle.
  • 20. The method of claim 1, comprising distinguishing, via processing by the processor of image data, moving objects from non-moving objects.
  • 21. The method of claim 20, wherein distinguishing moving objects comprises distinguishing moving objects responsive at least in part to at least one of (i) speed of the vehicle and (ii) steering of the vehicle.
  • 22. The method of claim 1, wherein providing the camera at the vehicle comprises disposing the camera at a rear portion of the vehicle with an exterior field of view at least rearward of the vehicle, and wherein said method comprises providing a plurality of cameras at the vehicle so as to have respective exterior fields of view, and wherein the plurality of cameras comprises the camera at the rear portion of the vehicle.
  • 23. The method of claim 22, comprising providing a display for displaying images derived, at least in part, from image data captured by the camera at the rear portion of the vehicle and derived, at least in part, from image data captured by other cameras of the plurality of cameras.
  • 24. The method of claim 1, wherein providing the camera at the vehicle comprises disposing the camera at a rear portion of the vehicle with an exterior field of view at least rearward of the vehicle, and wherein said method comprises providing a display for displaying images derived, at least in part, from image data captured by the camera during a reversing maneuver of the vehicle.
  • 25. A method of image enhancement for a vehicle vision system, said method comprising: (a) disposing a camera at a front portion of a vehicle with an exterior field of view at least forward of the vehicle;(b) providing a processor operable to process image data;(c) capturing multiple frames of image data with the camera;(d) executing a brightness transfer function on at least one frame of image data;(e) tracking, as the vehicle moves, image flow caused by the vehicle's movement;(f) detecting, at least via processing of image data by the processor, presence of an object in the field of view of the camera;(g) tracking the detected object over successive frames of image data to determine if the detected object is an object of interest in the field of view of the camera;(h) generating an output responsive to detection of the object present in the field of view of the camera;(i) providing the generated output to a driver assistance system of the vehicle, wherein the driver assistance system of the vehicle comprises a system selected from the group consisting of (i) a lane change assist system of the vehicle, (ii) a lane departure warning system of the vehicle, (iii) a blind spot detection system of the vehicle, (iv) an adaptive cruise control system of the vehicle, (v) a collision avoidance system of the vehicle, (vi) a traffic sign recognition system of the vehicle and (vii) a vehicle headlamp control system of the vehicle;(j) determining a low visibility driving condition via processing of image data by the processor;(k) responsive to determination of the low visibility driving condition, increasing contrast of features in captured image data by brightening brighter areas of image data and dampening darker areas of image data; and(l) distinguishing, at least via processing of image data by the processor, moving objects from non-moving objects, and wherein distinguishing moving objects comprises distinguishing moving objects responsive at least in part to at least 
one of (i) speed of the vehicle and (ii) steering of the vehicle.
  • 26. A method of image enhancement for a vehicle vision system, said method comprising: (a) disposing a plurality of cameras at the vehicle so as to have respective exterior fields of view, wherein the plurality of cameras comprises a rear camera at a rear portion of a vehicle with an exterior field of view at least rearward of the vehicle;(b) providing a processor operable to process image data;(c) capturing multiple frames of image data with the rear camera;(d) executing tone mapping of multiple frames of image data to enhance contrast;(e) detecting presence of an object in the field of view of the rear camera;(f) generating an output responsive to detection of the object present in the field of view of the rear camera;(g) providing a display for displaying images derived, at least in part, from image data captured by the rear camera at the rear portion of the vehicle and derived, at least in part, from image data captured by other cameras of the plurality of cameras;(h) determining a low visibility driving condition via processing by the processor of image data captured by at least one of the plurality of cameras;(i) responsive to determination of the low visibility driving condition, increasing contrast of features in captured image data by brightening brighter areas of image data and dampening darker areas of image data; and(j) distinguishing, at least via processing of image data by the processor, moving objects from non-moving objects, and wherein distinguishing moving objects comprises distinguishing moving objects responsive at least in part to at least one of (i) speed of the vehicle and (ii) steering of the vehicle.
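The contrast-enhancement steps recited in the claims above (passing a frame through two individual image transfer functions, brightening brighter areas while dampening darker areas, and blending the transferred frame into a second frame that is up to 20 percent derived from the first) can be illustrated with a minimal sketch. This is not the patented implementation: the claims do not specify the transfer functions, so simple gamma (power-law) curves stand in for them here, and the `pivot`, `gamma_bright`, and `gamma_dark` parameters are illustrative assumptions.

```python
import numpy as np

def brightness_transfer(frame, gamma):
    """A simple power-law brightness transfer function.
    `frame` is a grayscale image normalized to [0, 1]."""
    return np.clip(frame, 0.0, 1.0) ** gamma

def enhance_contrast(frame, gamma_bright=0.6, gamma_dark=1.6, pivot=0.5):
    """Pass the frame through two individual transfer functions and
    combine them: pixels above the pivot take the brightening curve
    (gamma < 1 lifts values), pixels below take the dampening curve
    (gamma > 1 suppresses values), increasing feature contrast."""
    brightened = brightness_transfer(frame, gamma_bright)
    dampened = brightness_transfer(frame, gamma_dark)
    return np.where(frame >= pivot, brightened, dampened)

def blend_frames(transferred_first, second, weight_first=0.2):
    """Blend the transferred first frame with a second frame; with
    weight_first = 0.2 the result is 20 percent derived from the
    first frame, matching the upper bound in claim 14."""
    return weight_first * transferred_first + (1.0 - weight_first) * second
```

Applied frame-by-frame at 30 frames per second or more, repeating this transfer-and-blend over successive frames progressively amplifies contrast of tracked features while the blend keeps temporal noise from any single frame bounded.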
CROSS REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of U.S. patent application Ser. No. 15/619,630, filed Jun. 12, 2017, now U.S. Pat. No. 9,774,790, which is a continuation of U.S. patent application Ser. No. 14/343,937, filed Mar. 10, 2014, now U.S. Pat. No. 9,681,062, which is a 371 national phase filing of PCT Application No. PCT/US2012/057007, filed Sep. 25, 2012, which claims the filing benefit of U.S. provisional application Ser. No. 61/539,049, filed Sep. 26, 2011, which is hereby incorporated herein by reference in its entirety.

US Referenced Citations (261)
Number Name Date Kind
4973844 O'Farrell et al. Nov 1990 A
4982287 Lagoni Jan 1991 A
4987357 Masaki Jan 1991 A
5001558 Burley et al. Mar 1991 A
5003288 Wilhelm Mar 1991 A
5012082 Watanabe Apr 1991 A
5016977 Baude et al. May 1991 A
5027001 Torbert Jun 1991 A
5027200 Petrossian et al. Jun 1991 A
5059877 Teder Oct 1991 A
5086253 Lawler Feb 1992 A
5096287 Kakinami et al. Mar 1992 A
5097362 Lynas Mar 1992 A
5121200 Choi Jun 1992 A
5130709 Toyama et al. Jul 1992 A
5170374 Shimohigashi et al. Dec 1992 A
5172235 Wilm et al. Dec 1992 A
5177685 Davis et al. Jan 1993 A
5182502 Slotkowski et al. Jan 1993 A
5184956 Langlais et al. Feb 1993 A
5189561 Hong Feb 1993 A
5193000 Lipton et al. Mar 1993 A
5204778 Bechtel Apr 1993 A
5208701 Maeda May 1993 A
5245422 Borcherts et al. Sep 1993 A
5276389 Levers Jan 1994 A
5285060 Larson et al. Feb 1994 A
5289182 Brillard et al. Feb 1994 A
5289321 Secor Feb 1994 A
5307136 Saneyoshi Apr 1994 A
5309137 Kajiwara May 1994 A
5313072 Vachss May 1994 A
5325096 Pakett Jun 1994 A
5325386 Jewell et al. Jun 1994 A
5329206 Slotkowski et al. Jul 1994 A
5331312 Kudoh Jul 1994 A
5336980 Levers Aug 1994 A
5341437 Nakayama Aug 1994 A
5351044 Mathur et al. Sep 1994 A
5355118 Fukuhara Oct 1994 A
5374852 Parkes Dec 1994 A
5386285 Asayama Jan 1995 A
5394333 Kao Feb 1995 A
5406395 Wilson et al. Apr 1995 A
5410346 Saneyoshi et al. Apr 1995 A
5414257 Stanton May 1995 A
5414461 Kishi et al. May 1995 A
5416313 Larson et al. May 1995 A
5416318 Hegyi May 1995 A
5416478 Morinaga May 1995 A
5424952 Asayama Jun 1995 A
5426294 Kobayashi et al. Jun 1995 A
5430431 Nelson Jul 1995 A
5434407 Bauer et al. Jul 1995 A
5440428 Hegg et al. Aug 1995 A
5444478 Lelong et al. Aug 1995 A
5451822 Bechtel et al. Sep 1995 A
5461357 Yoshioka et al. Oct 1995 A
5469298 Suman et al. Nov 1995 A
5471515 Fossum et al. Nov 1995 A
5475494 Nishida et al. Dec 1995 A
5498866 Bendicks et al. Mar 1996 A
5500766 Stonecypher Mar 1996 A
5510983 Iino Apr 1996 A
5515448 Nishitani May 1996 A
5521633 Nakajima et al. May 1996 A
5528698 Kamei et al. Jun 1996 A
5529138 Shaw et al. Jun 1996 A
5530240 Larson et al. Jun 1996 A
5530420 Tsuchiya et al. Jun 1996 A
5535314 Alves et al. Jul 1996 A
5537003 Bechtel et al. Jul 1996 A
5539397 Asanuma et al. Jul 1996 A
5541590 Nishio Jul 1996 A
5550677 Schofield et al. Aug 1996 A
5555555 Sato et al. Sep 1996 A
5568027 Teder Oct 1996 A
5574443 Hsieh Nov 1996 A
5581464 Woll et al. Dec 1996 A
5614788 Mullins Mar 1997 A
5634709 Iwama Jun 1997 A
5642299 Hardin et al. Jun 1997 A
5648835 Uzawa Jul 1997 A
5650944 Kise Jul 1997 A
5660454 Mori et al. Aug 1997 A
5661303 Teder Aug 1997 A
5666028 Bechtel et al. Sep 1997 A
5670935 Schofield et al. Sep 1997 A
5677851 Kingdon et al. Oct 1997 A
5699044 Van Lente et al. Dec 1997 A
5724316 Brunts Mar 1998 A
5732379 Eckert et al. Mar 1998 A
5737226 Olson et al. Apr 1998 A
5760828 Cortes Jun 1998 A
5760931 Saburi et al. Jun 1998 A
5761094 Olson et al. Jun 1998 A
5765116 Wilson-Jones et al. Jun 1998 A
5765118 Fukatani Jun 1998 A
5781437 Wiemer et al. Jul 1998 A
5786772 Schofield et al. Jul 1998 A
5790403 Nakayama Aug 1998 A
5790973 Blaker et al. Aug 1998 A
5793308 Rosinski et al. Aug 1998 A
5793420 Schmidt Aug 1998 A
5796094 Schofield et al. Aug 1998 A
5837994 Stam et al. Nov 1998 A
5844505 Van Ryzin Dec 1998 A
5844682 Kiyomoto et al. Dec 1998 A
5845000 Breed et al. Dec 1998 A
5848802 Breed et al. Dec 1998 A
5850176 Kinoshita et al. Dec 1998 A
5850254 Takano et al. Dec 1998 A
5867591 Onda Feb 1999 A
5877707 Kowalick Mar 1999 A
5877897 Schofield et al. Mar 1999 A
5878357 Sivashankar et al. Mar 1999 A
5878370 Olson Mar 1999 A
5883739 Ashihara et al. Mar 1999 A
5884212 Lion Mar 1999 A
5890021 Onoda Mar 1999 A
5896085 Mori et al. Apr 1999 A
5899956 Chan May 1999 A
5915800 Hiwatashi et al. Jun 1999 A
5923027 Stam et al. Jul 1999 A
5924212 Domanski Jul 1999 A
5959555 Furuta Sep 1999 A
5963247 Banitt Oct 1999 A
5990469 Bechtel et al. Nov 1999 A
5990649 Nagao et al. Nov 1999 A
6020704 Buschur Feb 2000 A
6049171 Stam et al. Apr 2000 A
6066933 Ponziana May 2000 A
6084519 Coulling et al. Jul 2000 A
6097024 Stam et al. Aug 2000 A
6100799 Fenk Aug 2000 A
6144022 Tenenbaum et al. Nov 2000 A
6175300 Kendrick Jan 2001 B1
6201642 Bos et al. Mar 2001 B1
6223114 Boros et al. Apr 2001 B1
6266082 Yonezawa et al. Jul 2001 B1
6266442 Laumeyer et al. Jul 2001 B1
6285393 Shimoura et al. Sep 2001 B1
6294989 Schofield et al. Sep 2001 B1
6297781 Turnbull et al. Oct 2001 B1
6310611 Caldwell Oct 2001 B1
6317057 Lee Nov 2001 B1
6320282 Caldwell Nov 2001 B1
6333759 Mazzilli Dec 2001 B1
6353392 Schofield et al. Mar 2002 B1
6370329 Teuchert Apr 2002 B1
6392315 Jones et al. May 2002 B1
6396397 Bos et al. May 2002 B1
6411204 Bloomfield et al. Jun 2002 B1
6424273 Gutta et al. Jul 2002 B1
6430303 Naoi et al. Aug 2002 B1
6442465 Breed et al. Aug 2002 B2
6477464 McCarthy et al. Nov 2002 B2
6497503 Dassanayake et al. Dec 2002 B1
6498620 Schofield et al. Dec 2002 B2
6534884 Marcus et al. Mar 2003 B2
6539306 Turnbull Mar 2003 B2
6553130 Lemelson et al. Apr 2003 B1
6594583 Ogura et al. Jul 2003 B2
6611610 Stam et al. Aug 2003 B1
6636258 Strumolo Oct 2003 B2
6672731 Schnell et al. Jan 2004 B2
6690268 Schofield et al. Feb 2004 B2
6704621 Stein et al. Mar 2004 B1
6711474 Treyz et al. Mar 2004 B1
6730913 Remillard et al. May 2004 B2
6735506 Breed et al. May 2004 B2
6744353 Sjönell Jun 2004 B2
6795221 Urey Sep 2004 B1
6806452 Bos et al. Oct 2004 B2
6819231 Berberich et al. Nov 2004 B2
6823241 Shirato et al. Nov 2004 B2
6824281 Schofield et al. Nov 2004 B2
6850156 Bloomfield et al. Feb 2005 B2
6889161 Winner et al. May 2005 B2
6909753 Meehan et al. Jun 2005 B2
6946978 Schofield Sep 2005 B2
6975775 Rykowski et al. Dec 2005 B2
6989736 Berberich et al. Jan 2006 B2
7004606 Schofield Feb 2006 B2
7038577 Pawlicki et al. May 2006 B2
7062300 Kim Jun 2006 B1
7065432 Moisel et al. Jun 2006 B2
7079017 Lang et al. Jul 2006 B2
7085637 Breed et al. Aug 2006 B2
7092548 Laumeyer et al. Aug 2006 B2
7111968 Bauer et al. Sep 2006 B2
7116246 Winter et al. Oct 2006 B2
7123168 Schofield Oct 2006 B2
7136753 Samukawa et al. Nov 2006 B2
7145519 Takahashi et al. Dec 2006 B2
7149613 Stam et al. Dec 2006 B2
7161616 Okamoto et al. Jan 2007 B1
7195381 Lynam et al. Mar 2007 B2
7202776 Breed Apr 2007 B2
7227611 Hull et al. Jun 2007 B2
7365769 Mager Apr 2008 B1
7425076 Schofield et al. Sep 2008 B2
7460951 Altan Dec 2008 B2
7490007 Taylor et al. Feb 2009 B2
7526103 Schofield et al. Apr 2009 B2
7543946 Ockerse et al. Jun 2009 B2
7592928 Chinomi et al. Sep 2009 B2
7639149 Katoh Dec 2009 B2
7681960 Wanke et al. Mar 2010 B2
7720580 Higgins-Luthman May 2010 B2
7724962 Zhu et al. May 2010 B2
7733464 David et al. Jun 2010 B2
7855755 Weller et al. Dec 2010 B2
7881496 Camilleri et al. Feb 2011 B2
7952490 Fechner et al. May 2011 B2
8013780 Lynam et al. Sep 2011 B2
8027029 Lu et al. Sep 2011 B2
8577169 Andrus et al. Nov 2013 B2
8849495 Chundrlik, Jr. et al. Sep 2014 B2
9681062 Kussel Jun 2017 B2
9774790 Kussel Sep 2017 B1
20020015153 Downs Feb 2002 A1
20020113873 Williams Aug 2002 A1
20030137586 Lewellen Jul 2003 A1
20030222982 Hamdan et al. Dec 2003 A1
20040016870 Pawlicki Jan 2004 A1
20040114381 Salmeen et al. Jun 2004 A1
20060017807 Lee Jan 2006 A1
20060018511 Stam et al. Jan 2006 A1
20060018512 Stam et al. Jan 2006 A1
20060091813 Stam et al. May 2006 A1
20060103727 Tseng May 2006 A1
20060164221 Jensen Jul 2006 A1
20060171704 Bingle Aug 2006 A1
20060250501 Wildmann et al. Nov 2006 A1
20060290479 Akatsuka et al. Dec 2006 A1
20070047833 Zhang Mar 2007 A1
20070104476 Yasutomi et al. May 2007 A1
20080094715 Schofield Apr 2008 A1
20080197997 Vitito Aug 2008 A1
20090093938 Isaji et al. Apr 2009 A1
20090113509 Tseng et al. Apr 2009 A1
20090177347 Breuer et al. Jul 2009 A1
20090243824 Peterson et al. Oct 2009 A1
20090244361 Gebauer et al. Oct 2009 A1
20090265069 Desbrunes Oct 2009 A1
20100020170 Higgins-Luthman et al. Jan 2010 A1
20100195901 Andrus Aug 2010 A1
20100228437 Hanzawa et al. Sep 2010 A1
20120044066 Mauderer et al. Feb 2012 A1
20120062743 Lynam et al. Mar 2012 A1
20120218412 Dellantoni et al. Aug 2012 A1
20120262340 Hassan et al. Oct 2012 A1
20130124052 Hahne May 2013 A1
20130129150 Saito May 2013 A1
20130131918 Hahne May 2013 A1
20140067206 Pflug Mar 2014 A1
20140156157 Johnson et al. Jun 2014 A1
20140222280 Salomonsson Aug 2014 A1
20140313339 Diessner et al. Oct 2014 A1
20140379233 Chundrlik, Jr. et al. Dec 2014 A1
Foreign Referenced Citations (1)
Number Date Country
0353200 Jan 1990 EP
Non-Patent Literature Citations (7)
Entry
Cromwell et al., “Contrast Enhancement through Localized Histogram Equalization”, webpage located at: http://www.cromwell-intl.com/3d/histogram/.
International Search Report and Written Opinion dated Jan. 28, 2013 from corresponding PCT Application No. PCT/US2012/057007.
Kim et al., “An advanced contrast enhancement using partially overlapped sub-block histogram equalization,” IEEE Transactions on Circuits and Systems for Video Technology, vol. 11, No. 4, Apr. 2001.
Marsi, “Real Time Video Contrast Enhancement by Using Recursive Rational Filter,” Imaging Systems and Techniques, 2004, Abstract.
Oakley et al., “Improving Image Quality in Poor Visibility Conditions Using a Physical Model for Contrast Degradation,” IEEE Transactions on Image Processing, Feb. 1998, Abstract.
Qing Wang, “Real Time Video Contrast Enhancement by Using Recursive Rational Filter,” IEEE Transactions on Consumer Electronics, May 2007, Abstract.
Stark et al., “Gray and Color Image Contrast Enhancement by the Curvelet Transform,” IEEE Transactions on Image Processing, vol. 12, No. 6, Jun. 2003.
Related Publications (1)
Number Date Country
20180027188 A1 Jan 2018 US
Provisional Applications (1)
Number Date Country
61539049 Sep 2011 US
Continuations (2)
Number Date Country
Parent 15619630 Jun 2017 US
Child 15713814 US
Parent 14343937 US
Child 15619630 US