Vehicle vision system

Information

  • Patent Grant
  • Patent Number
    8,629,768
  • Date Filed
    Monday, June 18, 2012
  • Date Issued
    Tuesday, January 14, 2014
Abstract
A vision system for a vehicle includes at least one imaging sensor having a forward field of view with respect to a forward direction of travel of the vehicle. The imaging sensor captures image data representative of objects present in the forward field of view. A control is responsive to the imaging sensor and processes image data captured by the imaging sensor to determine an object present in the forward field of view. The control processes image data to determine a distance between the vehicle and the object determined present in the forward field of view. The control may determine the distance between the vehicle and the determined object at least in part in response to at least one of (i) size of the determined object, (ii) position of the determined object, (iii) light intensity of the determined object, and (iv) rate of approach of the determined object.
Description
BACKGROUND OF THE INVENTION

This invention relates generally to vehicular vision systems and, more particularly, to a vehicular vision system which is operable to determine a distance from the vehicle to an object or light source remote from the vehicle whose image is captured by an image capture device. One application for the imaging system of the present invention is with a vehicle headlamp control, which may identify particular light sources of interest and adjust the vehicle's headlamps according to the distance between the vehicle and those light sources.


Vehicle camera or vision systems have been proposed for various applications, such as rear and/or side view vision systems, back up aids, collision avoidance systems, rain sensor systems, headlamp control systems and the like. These systems may include a camera or sensor positioned on the vehicle for capturing an image of a scene exteriorly of the vehicle. The vision systems may also include a display for displaying a captured image, or may control an associated accessory on the vehicle, such as windshield wipers, headlamps or even the brake system, in response to one or more characteristics of the captured image. In some applications, it has been recognized that distance information between the vehicle and an object in the captured scene may be helpful. In such applications, a ranging device may also be included to provide this information. Various ranging devices have been proposed, such as radar, ultrasonic, sonar, infrared beam/detector devices or similar proximity sensing devices. While such devices provide distance information to the associated vehicular system, they require an additional sensing device separate from the vehicular vision or camera system, which adds to the bulk and cost of the system.


One vehicle system in which distance information may be particularly useful is a vehicle headlamp control system for adjusting a vehicle headlamp in response to a detection of oncoming headlamps or leading taillights of other vehicles. To date, there have been many proposed headlight dimmer control systems. Many of the prior attempts at vehicle headlight dimming controls include a single light sensor which integrates light from a scene remote from the vehicle. The vehicle headlights are then dimmed when the integrated light exceeds a predetermined threshold. However, these systems typically require a sufficiently low threshold of detection that many other lower intensity light sources may also be interpreted as headlights or taillights. These systems also have difficulty reliably detecting taillights of other vehicles traveling ahead of the operative vehicle, since the intensity of taillights is typically substantially less than the intensity of oncoming headlights.


Other proposed headlight dimming controls implement an imaging array sensor which not only senses the light originating from both headlights and taillights, but may further determine the color and intensity of the light, thereby determining whether the light source is a headlight or a taillight. Such systems are deficient in determining the distance between the sensed light source and the subject vehicle, which would be helpful in modulating the headlamps in response to both the sensed light and the distance to the light. One proposed solution is to estimate the distance between the vehicle and the light source in response to the brightness or intensity of the sensed light source, since the detected signal from the light source may at times vary with the square of the distance to the light source. However, such a calculation is only accurate when the sensed light source intensity is within a predetermined level corresponding to a known or assumed intensity of headlamps and the light source is at certain distances. Because the intensity of headlamps and taillamps varies between vehicles, and may further vary as the headlamps are modulated between high and low beams and as the brake lights are activated or deactivated, such an estimation of distance may be inaccurate in many cases.


SUMMARY OF THE INVENTION

The present invention provides a vehicular imaging system which is capable of accurately determining the distance from the subject vehicle to an object or light source sensed by the sensors of the imaging system. The distance sensor accurately estimates the distance between the sensed object and the vehicle, while avoiding adding excessive cost and bulk to the vehicle vision and/or control system. In one aspect, the present invention is intended to provide a vehicular headlamp control system which senses oncoming headlights and leading taillights of other vehicles and controls the headlamps of the subject vehicle in response to the sensed light sources and the distance between the vehicle and the sensed light sources. The control system preferably includes ranging capability for determining the distance between the sensed objects and the vehicle. The device preferably is adaptable for use in other vehicular imaging systems associated with the vehicle, which may display a distance readout to an operator of the vehicle or may control a vehicle accessory in response to the distance.


According to an aspect of the present invention, a vehicular imaging system comprises at least one imaging array sensor and a control. The imaging sensor is mounted at a vehicle and has stereoscopic distance-sensing capability. The control is responsive to an output of the imaging array sensor in order to capture an image of at least one object external of the vehicle and determine a distance between the imaging array sensor and the object.


Preferably, the imaging array sensor receives a stereoscopic image of a scene remote from the imaging array sensor. The stereoscopic image includes a first image of an object in the scene on a first portion of the imaging array sensor and a second image of the object on a second portion of the imaging array sensor. The control is responsive to the imaging array sensor in order to determine a distance between the imaging array sensor and the object.


In one form, the vehicular imaging system is implemented in a vehicular headlamp control system, such that the headlamps are modulated between high and low beams in response to the distance between the sensed object or light source, which may be representative of an oncoming headlight or leading taillight, and the imaging array sensor.


In another form, the vehicular imaging system includes first and second imaging array sensors such that the first image of the object is received by the first imaging array sensor and the second image of the object is received by the second imaging array sensor. Preferably, first and second optic elements are included along the respective optic paths between the first and second imaging array sensors and the scene. The distance between the object and the sensors may then be determined as a function of a relative position of the image of the object as received on the first and second imaging array sensors and the focal lengths of the first and second optic elements.


According to another aspect of the present invention, a vehicular headlamp control for modulating a headlamp of a vehicle comprises at least one imaging array sensor adaptable to receive a stereoscopic image of a scene remote from the vehicle and a control responsive to the imaging array sensor. The imaging array sensor receives a plurality of images associated with a plurality of light sources associated with the scene. The control identifies light sources of interest and provides a control output to the vehicle. The control calculates a distance between at least one of the light sources and the imaging array sensor and provides the control output in response to the distance. The headlamp control modulates the headlamps of the vehicle in response to the control output.


According to another aspect of the present invention, a rearview vision system for a vehicle comprises at least one imaging array sensor and a control. The imaging array sensor is positioned on the vehicle and is directed outwardly from the vehicle. The imaging array sensor has stereoscopic distance-sensing capability. The control is operable to determine a distance from at least one object exteriorly of the vehicle in response to an output of the imaging array sensor.


These and other objects, advantages, purposes and features of this invention will become apparent upon review of the following specification in conjunction with the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a plan view of a vehicle incorporating the present invention;



FIG. 2 is a block diagram of the imaging system of the present invention;



FIG. 3 is a block diagram of an imaging sensor useful with the present invention;



FIG. 4 is a schematic diagram of a light-sensing array useful with the present invention;



FIG. 5 is the same view as FIG. 3 illustrating the geometric relationship between an object and the imaging sensor useful with the present invention;



FIG. 6 is the same view as FIG. 4, with shading of the pixels indicating pixels sensing an object or light source;



FIG. 7 is the same view as FIG. 6 with similarly illuminated pixels being designated as groups of pixels or segments;



FIG. 7A is a schematic of a three-pixel sub-array useful for identifying and labeling the segments illustrated in FIG. 7;



FIGS. 8A and 8B are the same view as FIG. 6 of first and second imaging arrays useful with the present invention, with the similarly illuminated groups of pixels being labeled as discrete groups or segments;



FIG. 9 is a flow-chart of a segment labeling process useful with the present invention;



FIG. 10 is a flow-chart of a process for determining the position and intensity of the segments;



FIG. 11 is a flow-chart of a process for determining whether a particular segment on a first imaging array sensor is an image of the same object as a corresponding segment on a second imaging array sensor;



FIG. 12 is a flow-chart of the stereoscopic distance determination function of the present invention;



FIGS. 13A-C are schematics of various embodiments of a stereoscopic imaging sensor with distance determining capability within a housing, such as an interior rearview mirror assembly housing;



FIG. 14 is a side elevation of a portion of a vehicle embodying a headlamp dimmer control in accordance with the present invention;



FIG. 15 is a partial side elevation view and block diagram of the vehicle headlight dimming control of FIG. 14;



FIGS. 16A and 16B are flow-charts of the stereoscopic headlamp control processes in accordance with the present invention; and



FIGS. 17A-C are curves of segment intensity versus distance useful in determining whether to activate or deactivate the high or low beams of the headlamps.





DESCRIPTION OF THE PREFERRED EMBODIMENTS

Referring now specifically to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes a vehicle imaging system 12 which includes an imaging sensor module 14 and an imaging control 16, as shown in FIGS. 1, 2 and 3. Vehicle imaging system 12 may be a rearview vision system of the type disclosed in commonly assigned U.S. Pat. No. 5,670,935, a rearview vision system of the type disclosed in commonly assigned published PCT Application, International Publication No. WO 96/38319, published Dec. 5, 1996, a wide angle image capture system of the type disclosed in commonly assigned co-pending U.S. patent application Ser. No. 09/199,907, filed Nov. 25, 1998, now U.S. Pat. No. 6,717,610, a rain sensor and the like of the type disclosed in commonly assigned published PCT application, International Publication No. WO 99/23828, published May 14, 1999, or a headlamp dimming control of the type disclosed in U.S. Pat. No. 5,796,094, the disclosures of which are hereby incorporated herein by reference. Imaging sensor module 14 senses light from a scene outside of vehicle 10 and imaging control 16 receives an output from sensor module 14. Imaging sensor module 14 is operable to facilitate determination of a distance between the module 14 and an object, such as a light source, in the target scene by receiving a stereoscopic image of the object on a pair of imaging sensors 34a and 34b or a divided sensor. By comparing the relative locations or registrations of a particular object or light source in the target scene on each of the imaging sensors 34a and 34b, the distance to the object may be determined as discussed below. Vehicle imaging system 12 may include a display 13 or other means for conveying the distance to an operator of vehicle 10, or may respond to the distance determination by controlling an accessory or device, such as a warning indicator or signaling device or even the brake system of the vehicle if the control is associated with a collision avoidance system, or the windshield wipers and/or headlamps if the control is associated with a rain sensor and/or headlamp control, respectively. If associated with a headlamp control, the distance is used to detect when headlamps or taillamps are at a distance where the headlamps of the controlled vehicle should be dimmed.


As shown in FIG. 1, a backup aid or rear view vision system 70 may be positioned on a rearward portion 72 of vehicle 10 and may comprise a stereoscopic imaging system. Rear view vision system 70 may alternatively be positioned on side rearview mirrors 70a or on the rear view mirror 30 within the vehicle. It is further envisioned that the imaging sensors 34a and 34b may be integrally constructed with a housing or fixed portion of the bracket of the exterior mirror, thereby combining the sensors or cameras within the mirror to form a single unit. The stereoscopic vision system may then determine the distance from the vehicle to an object rearward of the vehicle and provide a distance output to an operator of vehicle 10. The vision system may include a display 13 which provides an operator of the vehicle with an image of the scene remote from the vehicle and a distance readout to an object or objects in the scene.


Preferably, the image may be displayed as a unitary image synthesized from outputs of two or more imaging sensors. Image enhancements may also be provided in the displayed image to further enhance the driver's understanding of the area immediately surrounding vehicle 10. For example, graphic overlays, such as distance indicia in the form of horizontal grid markings or the like, may be provided to indicate distances between the vehicle and objects displayed in display 13. These graphic overlays may be superimposed on display 13 and thus are visible to the operator of vehicle 10. The grid markings may be moved, curved or otherwise adjusted in response to a change in the vehicle's direction of travel, which may be determined by a change in the vehicle's steering system, the vehicle's differential system or a compass heading. Additionally, images of objects or other vehicles may be adjusted or enhanced in response to the distance between vehicle 10 and the other vehicles, such as by flashing or changing the color of images of objects within a threshold distance of vehicle 10. Alternatively, the distance to multiple objects or a distance to a closest object may be displayed on display 13 or otherwise communicated to the vehicle operator. The distance to several objects may be displayed or the operator may select one or more particular objects in the display for which the distance is determined. The selection may be made by a mouse, keypad, joystick or the like.


Alternatively, the stereoscopic vision system may be implemented with a rain sensor 80, which may be placed inside the vehicle passenger compartment and directed toward a window or windshield 26. Rain sensor 80 may then be operable to determine a distance from the sensor to the sensed droplets, in order to ensure that the sensed droplets are positioned on the windshield 26 of vehicle 10 and not remotely positioned therefrom, thereby reducing the possibility of a false detection of rain on the windshield.


As mentioned above, the stereoscopic imaging system is also useful with a vehicle headlamp dimming control 12′. The headlamp control 12′ may be implemented in a rearview mirror assembly 30 and directed forwardly of vehicle 10 (FIG. 14). Headlamp control 12′ may then adjust or modulate the headlamps 20 of vehicle 10 in response to a distance between the vehicle and oncoming headlamps or leading taillights of other vehicles. This substantially reduces the possibility of modulating the headlamps when the detected vehicle is substantially distant from vehicle 10.


Referring now to FIG. 3, imaging sensor module 14 preferably includes a pair of imaging array sensors 34a and 34b, each of which receives an image of the target scene via a pair of focusing lenses 36a and 36b and a pair of color filters 38a and 38b, respectively, all of which are positionable along respective optic paths between the target scene and imaging array sensors 34a and 34b. Control 16 receives an output from each imaging array sensor 34a and 34b and converts the output to digital values via an analog to digital converter (not shown) and communicates the values to an appropriate control logic, such as a vehicle lighting control logic module 18 (FIG. 15). Control 16 further functions to at least occasionally activate each imaging array sensor 34a and 34b and analyze the output of each to determine the type of light source sensed and a distance from the vehicle to the light source.


Preferably, imaging arrays 34a and 34b are pixelated imaging array sensors, such as CCD or CMOS sensors, although other array sensors may be implemented without affecting the scope of the present invention. As shown in FIG. 4, each of the imaging array sensors 34a and 34b is preferably similar to the type disclosed in commonly assigned U.S. Pat. No. 5,550,677 issued to Kenneth Schofield and Mark Larson, the disclosure of which is hereby incorporated herein by reference. Because the imaging array sensors are described in detail in the Schofield '677 patent, the specific details will not be further discussed herein. Briefly, each of the imaging array sensors 34a and 34b preferably comprises a plurality of photon accumulating light sensors or pixels 42. The array of photo-sensors 42 is interconnected to a vertical shift register 46 and a horizontal shift register 52 via a common word line 44 and a common bit line 48, respectively. The bit lines 48 are also interconnected with amplifiers 50. The registers 46 and 52 function to individually access each photo-sensor pixel or element 42 and provide an output 56 associated with the individual signals to the analog to digital converter of control 16.


As imaging array sensors 34a and 34b receive light from objects and/or light sources in the target scene, control 16 may then be operable to determine a color or other characteristic, such as intensity or size, being communicated by the sensed light sources, which may further be determined to be a desired target object, such as a headlamp or taillight, as disclosed in the Schofield '094 patent. Color filters 38a and 38b may also be used to determine the color of other light sources. The color filters may be conventional mosaic filters or the like, or may be electro-optic filters of the type disclosed in commonly assigned and co-pending U.S. provisional patent application Ser. No. 60/135,657, filed on May 24, 1999, the disclosure of which is hereby incorporated herein by reference. By receiving a stereoscopic image on sensors 34 such that one image is received on one array 34a while a corresponding image is received on the second array 34b, the distance to an object in the target scene may then be determined as a function of the locations of each sensed image relative to a respective reference location, such as a center point or axis, of the corresponding imaging array sensors, the separation distance of the two arrays and the focal length of the focusing lenses or optics. This distance may be calculated according to the following equation:










D = Δ·f1·f2 / (f1·xD2 - f2·xD1);          (1)

where, as represented in FIG. 5, D is the straight-line distance from the sensed object to a forward surface 36c of optics 36a and 36b, Δ is the lateral separation distance between a mid-point, axis or other reference point associated with each sensor 34a and 34b, f1 is a focal length of the first optic 36a, f2 is a focal length of the second optic 36b, xD1 is a directed distance from a center axis 34c of the first sensor 34a to the sensed image 34d of the object O on sensor 34a and xD2 is a corresponding directed distance from a center axis 34f of the second sensor 34b to the sensed image 34e of the object O on sensor 34b. The directed distances xD1 and xD2 may be positive or negative values in accordance with the location where the sensed images 34d and 34e are detected by sensors 34a and 34b, respectively. For example, xD1 and xD2 may both be positive in FIG. 5, but one or both may be a negative value if the object O is positioned relative to the optics and sensors such that one or both sensed images 34d and 34e are received by sensors 34a and 34b on the other side of the center axes 34c and 34f, respectively.


Once the distance D is known, the lateral distance X to the object O may also be determined by the equation:










X = D·xD2 / f2,          (2)

Similarly, the angle from the vehicle to the object O may easily be calculated by taking the inverse tangent of the lateral distance X divided by the longitudinal distance D, or of the image position xD2 divided by the focal length f2. Control 16 may then determine if the sensed object or light source is within a predetermined tolerance band of a targeted object or light source, such as a typical headlamp or taillight, both in intensity and in location (lateral and longitudinal distance) relative to vehicle 10. If the intensity and distance of the signal are within the tolerance or threshold levels, the signal may be determined to be one of the targeted objects and imaging system 12 may respond accordingly. For example, if imaging system 12 is associated with a vehicle headlamp control, imaging system 12 may adjust the headlamps 20 of vehicle 10 in response to a distance and angle between vehicle 10 and the detected headlamps and/or taillights of other vehicles.
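

To make the geometry concrete, the following is a minimal Python sketch of equations (1) and (2) and the bearing calculation. It is an illustration, not the patent's implementation; the function names and numeric values are invented, and all lengths are assumed to share one unit (here millimeters).

    import math

    def stereo_distance(delta, f1, f2, xd1, xd2):
        # Equation (1): straight-line distance D from the object to the optics.
        # delta -- lateral separation of the two sensors' reference axes
        # f1, f2 -- focal lengths of the first and second optics
        # xd1, xd2 -- signed image offsets from each sensor's center axis
        return (delta * f1 * f2) / (f1 * xd2 - f2 * xd1)

    def lateral_offset(d, f2, xd2):
        # Equation (2): lateral distance X to the object.
        return d * xd2 / f2

    def bearing_deg(x, d):
        # Angle to the object: inverse tangent of lateral over longitudinal distance.
        return math.degrees(math.atan2(x, d))

    # Hypothetical values: 100 mm baseline, 10 mm focal lengths, image
    # offsets of +0.20 mm and +0.25 mm on the two sensors.
    d = stereo_distance(100.0, 10.0, 10.0, 0.20, 0.25)  # 20,000 mm (20 m)
    x = lateral_offset(d, 10.0, 0.25)                   # 500 mm (0.5 m)
    print(d, x, bearing_deg(x, d))                      # bearing is about 1.4 degrees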


Referring now to FIGS. 6 through 8, the following illustrates and describes the processes through which control 16 may determine the distance between a light source or other sensed object and the vehicle 10. As shown in FIG. 6, the arrays 35a and 35b of the respective imaging array sensors 34a and 34b include pixels 42, which sense light values representative of light sources and other objects present in the target scene. Although shown as an 8×8 array of pixels for purposes of clarity, typical imaging array sensors useful with the present invention may comprise approximately 512×512 pixel arrays or more. The pixels 42 are shown with shaded pixels 42a representing sensed light values which are greater than a pre-determined noise level associated with the array sensors 34a and 34b.


When operable, control 16 may shutter or open each of the imaging array sensors 34a and 34b to collect the signals from the target scene on each array 35a and 35b. After the signal has been received and communicated to control 16, control 16 may function to identify and classify each of the pixels in accordance with their intensity and color, as determined by control 16 and pixel assignment with respect to color filters 38a and 38b. For example, white pixels may be identified and analyzed to determine whether the white pixels are headlamps of oncoming vehicles, and then red pixels may be identified and analyzed to determine whether the red pixels are taillights of leading vehicles traveling in the same direction ahead of the subject vehicle 10. Clearly, however, the pixels may be classified and analyzed according to other colors or intensities for determining the distance to other objects or light sources within the targeted scene, without affecting the scope of the present invention.
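

As a rough sketch of this classification step (the patent does not give numeric thresholds, so the cutoff values and function names below are assumptions), each pixel can be binned by color and intensity and the result reduced to the one/zero map used for segmentation:

    def classify_pixel(red, green, blue, noise=16, white_min=200):
        # Illustrative color/intensity binning; all thresholds are assumed.
        if red <= noise and green <= noise and blue <= noise:
            return "black"   # below the sensor noise floor
        if red >= white_min and green >= white_min and blue >= white_min:
            return "white"   # headlamp candidate
        if red > 2 * max(green, blue):
            return "red"     # taillight candidate
        return "other"

    def binary_map(frame, wanted):
        # Reduce an RGB frame (a 2-D list of (R, G, B) tuples) to the patent's
        # one/zero convention: 1 for pixels of the wanted class, 0 otherwise.
        return [[1 if classify_pixel(*px) == wanted else 0 for px in row]
                for row in frame]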


As shown in FIG. 7, similarly illuminated pixels, having a similar color and/or intensity, are similarly classified, such as red or white, and are shown as pixels 42b with an “x” through them. Not all of the shaded pixels 42a in FIG. 6 are similarly classified in FIG. 7 because some of the shaded pixels 42a may represent a light value above the noise threshold but from a different colored light source. The similarly classified pixels 42b may then be assigned a value of one or otherwise labeled, while the other blank pixels 42 may be assigned a value of zero, for the purpose of determining connected segments or groups of pixels corresponding to each particular light source in the target scene. This is preferably accomplished by activating a segmentation and labeling algorithm or process 100 which determines which of the classified pixels 42b belongs to each particular segment or light source and labels each segment in numeric order. Each segment of pixels within a particular classification, such as white, red or other color, is thus labeled as a discrete segment, separate from the other pixels or segments of pixels with the same classification. Labeling algorithm 100 preferably analyzes each pixel and compares the assigned value (such as one or zero) of each pixel to one or more neighboring pixels. A set of neighboring pixels is represented by a three-pixel window or sub-array 43 (FIG. 7A) which may be applied to each of the imaging arrays 35a and 35b. The sub-array 43 is preferably moved through the array, starting at an upper left corner and proceeding left to right and then downward until each pixel in the array has been analyzed and compared to its neighboring pixels.


As sub-array 43 moves through arrays 35, each pixel 42 and 42b is individually analyzed by a leading pixel window 43a to determine if the individual pixel has been assigned a value of one. If the pixel is assigned a value of one, each of the neighboring upper and left pixels is also analyzed, by an upper and a left pixel window 43b and 43c, respectively, in order to determine if an individual pixel that is assigned a value of one is connected with one or more previously analyzed pixels similarly assigned a value of one. A labeling window or sub-array 44 then further analyzes the individual pixel with a labeling pixel window 44a and the upper and left adjacent pixels with labeling pixel windows 44b and 44c, respectively. Labeling sub-array 44 determines and compares the designated segment number for each of the previously analyzed neighboring or adjacent pixels and labels the subject individual pixel accordingly. For example, if either the upper or left pixel were also assigned a value of one, then that particular pixel would already be labeled as a segment by labeling sub-array 44. Accordingly, labeling sub-array 44 would label the subject pixel with the same segment number as already applied to its neighboring pixel. If the upper and left pixels are labeled differently, the left pixel is then re-labeled to match the upper, or first labeled, pixel. Pixels within a connected segment are thus labeled in accordance with that particular segment number by labeling sub-array 44. This process is continued for each pixel in array 35. Clearly, however, other processes for analyzing and labeling neighboring pixels may be performed without affecting the scope of the present invention. Furthermore, although labeling algorithm 100 is described as analyzing and labeling segments which include only pixels which have adjacent or connected sides, other algorithms may be implemented which label segments which have pixels adjacent at their corners or within a predetermined range and/or intensity of each other.


After the three-pixel windows 43 and 44 have completed analyzing and labeling each of the pixels 42 within the imaging arrays, each of the discrete segments is grouped together and labeled numerically, as shown in FIGS. 8A and 8B for imaging array sensors 34a and 34b, respectively. The average pixel location and maximum intensity of each segment may then be determined in order to facilitate a comparison of the segments on their respective sensors. This is accomplished by summing the x and y pixel coordinates for the pixels within each segment and dividing each sum by the number of pixels within the segment. For example, segment number (2) in FIG. 8A would have an average x position of 5.67 ((5+6+6)/3) from a left edge 35c of array 35a and an average y position of 2.67 ((2+3+3)/3) from an upper edge 35d of array 35a. Because the two imaging sensors 34a and 34b are separated by a predetermined distance, each of the particular segments representing a particular light source may be positioned differently on imaging array sensor 34b as compared to a corresponding segment on the other imaging array sensor 34a, depending on the distance and lateral orientation between the sensors and the light source in the targeted scene. This is represented in FIG. 8B, where segment number (2) is received by sensor 34b such that it has an average x position of 6.67 ((6+7+7)/3) and the same average y position as the segment had on the sensor 34a in FIG. 8A. The distance may then be calculated using equation (1) above, where xD1 and xD2 are the directed distances from a reference point or center axis 34c and 34f of each sensor 34a and 34b to the average position of the particular segment on each sensor. In this example, xD1 may be a distance corresponding to a separation of 1.67 pixels while xD2 may be a distance corresponding to a separation of 2.67 pixels, with the center axes 34c and 34f being at the center of the depicted arrays. Vehicle imaging system 12 may then determine if the intensity and location of the segments are consistent with the relevant or targeted images or light sources, such as headlamps or taillights, and may display an image or readout or adjust an associated accessory of vehicle 10 accordingly.


Although described as preferably utilizing segmentation and averaging algorithms, the present invention may alternatively compare individual pixels on one array to similarly illuminated individual pixels on the other array. Because the preferred embodiment groups similarly classified and positioned pixels together into segments and determines a maximum intensity and average location of the segment, the preferred system provides improved accuracy for distance calculation over a comparison of individual pixels. This is because the measurement resolution is then not limited to a pixel separation distance, since the average or center location of the sensed light source may be somewhere between two or more pixels. Accordingly, the preferred control of the present invention provides sub-pixel resolution in the distance calculation.


Referring now to FIG. 9, labeling algorithm or process 100 determines and labels the segments of similarly classified pixels on each imaging array sensor. Process 100 starts at 110 and compares each individual pixel to at least two neighboring pixels. If it is determined at 120 that the target pixel has not been assigned a value of one, or is not above a threshold value, then process 100 moves to the next pixel at 125 and continues at 115. If it is determined at 120 that the target pixel value is greater than the threshold value or, in other words, has been assigned a value of one, then it is further determined at 130 whether the pixel value is greater than the values associated with both an upper adjacent pixel and left adjacent pixel. If it is determined at 130 that the pixel value is greater than both of the upper and left pixels, then that particular pixel is assigned a new segment number at 135 and process 100 moves to the next pixel at 125 and continues at 115. If it is determined at 130 that the pixel value is not greater than both the upper and left pixel, then it is further determined at 140 whether the pixel value is equal to the upper pixel and not equal to the left value. If the pixel value is equal to the upper pixel and is not equal to or is greater than the left pixel, then the particular pixel is assigned the same segment number as the upper pixel at 145 and the process 100 moves to the next pixel at 125 and continues at 115.


If it is determined at 140 that the pixel value is not equal to the upper pixel or is equal to the left pixel, then it is further determined at 150 whether the pixel value is both equal to the left pixel and is not equal to or is greater than the upper pixel. If it is determined at 150 that the pixel value is equal to the left pixel and is not equal to the upper pixel, then the particular pixel is assigned the same segment number as the left pixel at 155, and process 100 moves to the next pixel at 125 and continues at 115. If it is determined at 150 that the pixel value is not equal to the left pixel value or is equal to the upper pixel value, then it is further determined at 160 whether the pixel value is equal to both the left and upper pixels and the left and upper pixels are labeled the same. If it is determined at 160 that the pixel value is equal to the left and upper assigned values and the left and upper pixels are labeled the same, then the particular pixel is labeled the same as the upper pixel at 165. Process 100 then moves to the next pixel at 125 and continues at 115. If, however, the left label is not equal to the upper label at 160, then the particular pixel is labeled the same as the upper pixel and the left pixel is correspondingly relabeled to match the upper pixel at 170, since the target pixel now joins the left and upper pixels within the same segment. Process 100 then moves to the next pixel at 125 and continues at 115 until each pixel within each imaging array sensor has been analyzed and labeled accordingly. Process 100 may be performed one or more times on each of the pixelated imaging array sensors in order to provide optimal results.
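

The flow-chart rules of process 100 translate fairly directly into code. The Python sketch below follows the described upper/left neighbor tests; resolving the step-170 merge by relabeling the whole array is one simple, if inefficient, way to realize the relabeling the patent describes:

    def label_segments(binary):
        # Scan left to right, top to bottom, as sub-array 43 does; binary is
        # the one/zero map produced by a classification step such as binary_map.
        rows, cols = len(binary), len(binary[0])
        labels = [[0] * cols for _ in range(rows)]
        next_label = 0
        for y in range(rows):
            for x in range(cols):
                if binary[y][x] != 1:
                    continue
                up = labels[y - 1][x] if y > 0 else 0
                left = labels[y][x - 1] if x > 0 else 0
                if not up and not left:
                    next_label += 1          # step 135: issue a new segment number
                    labels[y][x] = next_label
                elif up and not left:
                    labels[y][x] = up        # step 145: inherit the upper label
                elif left and not up:
                    labels[y][x] = left      # step 155: inherit the left label
                elif up == left:
                    labels[y][x] = up        # step 165: neighbors already agree
                else:
                    labels[y][x] = up        # step 170: merge left segment into upper
                    old = left
                    for yy in range(rows):
                        for xx in range(cols):
                            if labels[yy][xx] == old:
                                labels[yy][xx] = up
        return labels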


After labeling process 100 has been performed on each of the pixelated imaging array sensors 34a and 34b, the pixels are labeled according to the segments or groups of pixels associated with particularly classified light sources. Once each particular segment is labeled on each sensor, additional algorithms or processes may be performed by control 16, in order to determine a location and intensity of each segment with respect to the particular sensor. As shown in FIG. 10, a position and intensity process 200 determines an average x and y position of each segment relative to its respective sensor and a maximum intensity associated with each segment. Process 200 analyzes each pixel in each array and starts at 210. Process 200 sets each position and intensity value for each segment to zero at 220. If it is determined at 230 that the label for the pixel being analyzed is not equal to one of the previously designated segment numbers, then process 200 moves to the next pixel at 235 and continues at 237. If, on the other hand, the label associated with the particular pixel is equal to one of the segment numbers, then the x position and y position values for that segment are summed at 240. The x position value for the particular segment is the sum of the previously calculated x position value for that segment plus the x ordinate for the particular pixel relative to the sensor array. The y position value for that segment is similarly calculated and a counter value is increased by one at 240.


It is then determined at 250 whether an image intensity value for that pixel is greater than the maximum intensity value associated with that particular segment. If the pixel intensity value is greater than the maximum intensity for that segment, then the maximum intensity value for that segment is set to the sensed image intensity value for the particular pixel at 260. It is then determined at 270 whether all the pixels on each array have been analyzed. If it is determined at 270 that not all the pixels have been analyzed, then process 200 moves to the next pixel at 235 and continues at 237. If it is determined at 270 that the pixels have all been analyzed, then an average x position and y position associated with each segment is then calculated at 280 by dividing the summed x and y position values for each segment by the corresponding count value for each particular segment. The process ends at 290. Upon completion of process 200, an average x and y position and a maximum intensity associated with each segment is stored for comparison with the positions and intensities sensed by the other array sensor. The positional values may be converted to conventional units of measurement for use in the distance calculations of equation (1).
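

Process 200 likewise reduces to a single accumulation pass. The sketch below (illustrative names, same caveats as the earlier sketches) gathers per-segment coordinate sums and a running maximum, then divides at the end; the fractional averages are what give the sub-pixel resolution discussed below:

    def segment_stats(labels, intensity):
        # labels: output of label_segments; intensity: same-shape 2-D list.
        sums = {}   # label -> [sum_x, sum_y, count, max_intensity]
        for y, row in enumerate(labels):
            for x, lab in enumerate(row):
                if lab == 0:
                    continue
                s = sums.setdefault(lab, [0, 0, 0, 0])
                s[0] += x                           # step 240: sum x ordinates
                s[1] += y                           # step 240: sum y ordinates
                s[2] += 1                           # step 240: bump the counter
                s[3] = max(s[3], intensity[y][x])   # steps 250/260: track maximum
        # Step 280: dividing sums by counts yields the sub-pixel average positions.
        return {lab: {"x": sx / n, "y": sy / n, "max_i": mi}
                for lab, (sx, sy, n, mi) in sums.items()}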


Referring now to FIG. 11, a distance algorithm or process 300 compares the average positions and intensities of each segment to corresponding segments on the other sensor 34b in order to determine whether a segment on the first sensor 34a represents the same object or light source as a corresponding segment on the second sensor 34b. Process 300 begins at 310 and selects a first segment at 320. If it is determined at 330 that an average x position and y position of the segment on the first sensor is within a predetermined position threshold of the average x position and y position of a segment on the second sensor, then it is further determined at 340 whether the maximum intensities associated with each segment on each sensor are within a maximum intensity threshold. If the average x and y positions are not within the position threshold at 330, then the process 300 moves to the next segment at 333 and continues at 335. Likewise, if the maximum intensities are not within the maximum intensity threshold at 340, the process moves to the next segment at 333 and continues at 335. If the average x and y positions are within the position threshold at 330 and the maximum intensities are within the maximum intensity threshold at 340, a distance to that object or light source is calculated at 350, preferably as a function of the x positions of the sensed light source on both sensors according to equation (1), discussed above.


Because the vehicle imaging system 12 of the present invention preferably adjusts or controls an accessory of vehicle 10 in response to the closest object or light source sensed by sensors 34a and 34b, it may also be determined at 360 whether the calculated distance is less than a lowest distance for all segments. This provides the system with the distance to the closest object or light source that has been classified by control 16. If it is determined at 360 that the distance is less than a lowest distance value, then the lowest distance value is set to the newly calculated distance value at 370. It is then determined at 380 whether all the segments have been accounted for. If it is determined at 380 that not all the segments have been accounted for, the process moves to the next segment at 333 and continues at 335. If, on the other hand, it is determined at 380 that all the segments have been accounted for, the process ends at 390. Upon completion of process 300, the least distance from the vehicle 10 to a sensed object or light source which is in a selected classification and within a position and maximum intensity threshold is stored for use by the imaging control 16. Control 16 may then function to display a distance readout or adjust the appropriate accessory of vehicle 10 in response to the intensity of the light source sensed and/or the calculated distance to that light source. Algorithms 100, 200 and 300 may then be repeated for different classifications of light sources. For example, segments may be classified as white or red light sources for headlamps or taillights, or any other color which may be of interest to an operator of the vehicle.
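

Process 300 can then be sketched as a pairwise comparison of the two sensors' segment statistics. The tolerance parameters, the pixel pitch used to convert pixel offsets into the physical units equation (1) expects, and the exhaustive pairing are all illustrative assumptions:

    def match_and_range(stats_a, stats_b, pos_tol, int_tol,
                        delta, f1, f2, cx_a, cx_b, pitch):
        # Pair segments whose average positions (step 330) and maximum
        # intensities (step 340) agree within tolerances, range each pair
        # with equation (1) (step 350), and keep the smallest distance
        # (steps 360/370). Returns None if no segment pair matches.
        lowest = None
        for sa in stats_a.values():
            for sb in stats_b.values():
                if abs(sa["x"] - sb["x"]) > pos_tol:
                    continue
                if abs(sa["y"] - sb["y"]) > pos_tol:
                    continue
                if abs(sa["max_i"] - sb["max_i"]) > int_tol:
                    continue
                xd1 = (sa["x"] - cx_a) * pitch   # pixel offset -> physical units
                xd2 = (sb["x"] - cx_b) * pitch
                d = stereo_distance(delta, f1, f2, xd1, xd2)
                if lowest is None or d < lowest:
                    lowest = d
        return lowest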


Referring now to FIG. 12, a process 500 is shown which calculates a distance from an imaging array sensor or sensors to an object or light source sensed by the sensors and provides an output signal in response to the distance and intensity of the light source. The output signal may be in the form of a distance display or may provide an activation signal to a control, depending on the particular application of the stereoscopic imaging process 500. Process 500 begins at 505 and grabs a color frame in each sensor or camera at 510 and 512. Each pixel is then classified according to a desired color or other characteristic at 520 and 522. The classified pixels are assigned a value of one, while the remaining pixels are assigned a value of zero, and a segment labeling algorithm similar to process 100 discussed above is performed at 530 and 532 for the respective sensors. Clearly, however, the classified pixels may be designated in other manners, without affecting the scope of the present invention. The average x and y pixel locations and maximum intensity of each segment are then determined at 540 and 542. Process 500 then compares the segmented images from both sensors at 550 and calculates the distance to the light source corresponding to the similar segments in both sensors at 560. The angular or lateral position of the object or light source may also be determined at 560. It may then be determined at 570 if the distance and maximum intensity of a particular segment are within a predetermined threshold. If the distance and maximum intensity are within the threshold levels, then an appropriate output signal is sent at 580 and the process continues at 590. If, on the other hand, the distance and/or maximum intensity are not within the threshold at 570, then the process may continue at 590.


Although shown in FIG. 3 as having sensors 34a and 34b and lenses 36a and 36b positioned such that their optic paths are substantially parallel, clearly other orientations are within the scope of the present invention. For example, as shown in FIG. 13A, two oppositely facing sensors 34a and 34b may be implemented within a housing 29 or the like such that a pair of flat reflective surfaces or mirrors 37a and 37b are positioned along the respective optic paths between the lenses 36a and 36b and the sensors 34a and 34b. Alternatively, a pair of openings 39a and 39b may be provided in the housing 29 to allow light to pass therethrough such that it is redirected by the flat reflective surfaces 37a and 37b toward the respective sensors 34a and 34b. The focusing lenses 36a and 36b may then be positioned along the respective optic paths between the flat reflective surfaces 37a and 37b and the sensors 34a and 34b (FIG. 13B). In another alternate orientation, a single imaging array sensor 34 may be implemented within housing 29 to receive a stereoscopic image of the scene remote from the vehicle. A divider 41 may be implemented substantially adjacent to sensor 34 to divide sensor 34 into separate and distinct sensing arrays 34a′ and 34b′ (FIG. 13C). An additional pair of flat reflective surfaces or mirrors 42a and 42b may also be included to redirect the image rays toward sensor 34 via focusing lenses 36a and 36b. Clearly, however, the scope of the present invention includes other orientations where the lenses and one or more reflective surfaces may be implemented along an optic path between one or more sensors and the target scene.


Although vehicle imaging system 12 is useful in various imaging system applications, the control is particularly useful with a vehicle headlamp dimming control 12′ (FIGS. 14 and 15). Vehicle headlamp control 12′ may then classify the pixels as red, white or black and correspondingly identify the light sources as taillights or headlamps, using the principles disclosed in commonly assigned U.S. Pat. No. 5,796,094, referenced above. Headlamp control 12′ may determine the distances between vehicle 10 and the identified taillights and headlamps and communicate this information to a vehicle lighting control logic module 18 (FIG. 15). Vehicle lighting control logic module 18 may then exchange data with control 16 to control headlamps 20 of vehicle 10 in response to the output of sensor module 14 as received by imaging control 16. Imaging control 16 may analyze detected light sources to determine a color and/or intensity of the light sources and to determine a distance between the light sources and vehicle 10. This information may then be communicated to lighting control logic module 18 for dimming of headlamps 20. Dimmer control 12′ thus may correspondingly control the headlamps 20 in response to the color or intensity of the light sources as well as the distance to the light sources. Additional criteria may also be considered, such as the lateral position of the sensed light sources with respect to the vehicle or other criteria associated with size, color, position, intensity or rate of approach of the light source.


Preferably, as shown in FIG. 14, imaging sensor module 14 may be fixedly mounted in housing 28 by a bracket 24 mounted to, or near, the vehicle's windshield 26. Sensor module 14 may be mounted within housing 28 in various orientations, as discussed above with respect to FIGS. 13A-13C. Bracket 24 may also mount an interior rear-view mirror 30. However, imaging sensor module 14 may be mounted elsewhere on the vehicle without affecting the scope of the present invention.


Referring now to FIGS. 16A and 16B, a headlamp control process 400 is shown which starts at 405 by determining whether the ambient light level is below a predetermined threshold. If the light level is below the threshold, then process 400 grabs a color frame at a headlamp shutter setting for both cameras or sensors 34a and 34b at 410 and 412, respectively. Process 400 then classifies each pixel as white or black at 415 and 417 and assigns a value of one to white pixels and a value of zero to black pixels, or otherwise designates the pixels, at 420 and 422. The segment labeling algorithm 100 is performed at 420 and 422 for the two sensors 34a and 34b, respectively. An average x and y pixel location and maximum intensity are then calculated according to process 200 at 425 and 427 for each segment on the respective sensors. Headlamp control process 400 then compares the location and intensity of the segmented images from both sensors at 430 in order to determine segments on each sensor which correspond to a particular light source. Control process 400 determines that the segments correspond to a particular light source if the compared segments on both sensors are within an x-y pixel space threshold and intensity threshold, in accordance with process 300, discussed above. The distance to the light source corresponding to the similar segments is then calculated at 440. The angular and/or lateral position of the light source relative to vehicle 10 may also be calculated at 440. It is then determined at 450 whether the distance and maximum intensity of corresponding segments are consistent with a headlamp of an oncoming vehicle and within a predetermined threshold level. The consistency criteria may include a forward and lateral position relative to vehicle 10, intensity, size, or any other criteria which may discern a headlamp from other light sources, such as rate of approach relative to vehicle 10 or the like. If it is determined at 450 that the distance, intensity and/or any other selected criteria are within the threshold levels, the headlamps are set to a low beam setting at 452 and the process returns at 455.


If it is determined at 450 that the distance, maximum intensity or other characteristics of the segment are not consistent with a headlamp or within the threshold level, then process 400 grabs color frames at a taillamp shutter setting in camera sensors 34a and 34b at 460 and 462, respectively, using the principles disclosed in U.S. Pat. No. 5,796,094, referenced above. Each pixel is then classified as red or black at 465 and 467. The red pixels are then assigned a value of one or otherwise designated, while the black pixels are assigned a value of zero or otherwise designated, at 470 and 472. The segment labeling algorithm 100 is again performed on each of the respective sensors at 470 and 472. An average x and y pixel location and maximum intensity are then calculated according to process 200 at 475 and 477 for each segment on the respective sensors. The segmented images from both cameras are then compared at 480 to determine which segments are close in x-y pixel positioning and similar in maximum intensity between the two sensors. The distance to a light source corresponding to the similar segments in both sensors is then calculated at 485. The lateral position of the light sources may also be determined at 485. It is then determined at 490 if the distance and maximum intensity of the segment are consistent with a taillamp and within a predetermined threshold. Similar to the consistency criteria above with respect to headlamps, the light sources may be analyzed to determine whether their size, intensity, lateral and vertical position relative to vehicle 10 and/or rate of approach to vehicle 10 are consistent with known or assumed values associated with vehicle taillights. If the distance, maximum intensity and the like are within the threshold levels, the headlamps are set to a low beam at 492 and the process returns to 405 at 455. If, on the other hand, the distance, maximum intensity and/or other selected criteria are not consistent with taillamps or are not within the threshold levels, the headlamps are set to a high beam setting at 495 and the process again returns at 455. Process 400 thus adjusts the headlamp setting in response to the distance and maximum intensity of light sources sensed by both of the sensors 34a and 34b.
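

Tying the pieces together, one dimming cycle of process 400 might look like the sketch below. Two simplifications are worth flagging: the patent grabs separate frame pairs at headlamp and taillamp shutter settings, whereas this sketch reuses a single pair, and the params dictionary and dim_range threshold are hypothetical stand-ins for the calibration values discussed above.

    def process_400_cycle(frame_a, frame_b, intensity_a, intensity_b, params):
        # Look first for headlamp-consistent (white) and then taillamp-
        # consistent (red) sources; dim to low beam if either is ranged
        # inside the assumed params["dim_range"], else select high beam.
        for color in ("white", "red"):
            stats_a = segment_stats(label_segments(binary_map(frame_a, color)),
                                    intensity_a)
            stats_b = segment_stats(label_segments(binary_map(frame_b, color)),
                                    intensity_b)
            d = match_and_range(stats_a, stats_b,
                                params["pos_tol"], params["int_tol"],
                                params["delta"], params["f1"], params["f2"],
                                params["cx_a"], params["cx_b"], params["pitch"])
            if d is not None and d < params["dim_range"]:
                return "low"
        return "high"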


The present invention thus accounts for both the intensity of light sensed by the sensors and the distance to the light source from the vehicle 10 before adjusting the headlamp setting for the vehicle. This allows the vehicle headlamps to remain in a high beam setting until vehicle 10 is within a predetermined range of a sensed headlamp or taillight and, conversely, the headlamps may be set to a high beam setting once a sensed headlamp or taillight moves beyond that predetermined range. By sampling real world data or simulating various driving conditions, a pixel intensity versus distance curve may be created which is typical of headlamps and taillamps for various driving conditions. Such a curve is shown in FIG. 17A, where a segment intensity and corresponding distance at point A below the curve would not be classified as a headlamp, while a signal B, which has similar intensity but greater distance than point A, may be classified as a headlamp. Headlamp control process 400 is then further optimized, since certain segments which are not within a range of the real world data curve would not be included in the headlamp analysis. Similarly, as shown in FIG. 17B, real world data may be used to modify the curve such that an angular position of the light source relative to vehicle 10 is further included in the analysis in order to determine whether or not the segment should be classified as a headlamp or taillight. For example, the signal C in FIG. 17B would be classified as a headlamp if it is determined to be at approximately a 15° angle relative to vehicle 10, but may not be classified as a headlamp if it is only approximately 0°-5° off of the axis of the sensors 34a and 34b in vehicle 10. The system may be otherwise optimized as shown in FIG. 17C, where a minimum and maximum pixel intensity band 60 versus distance is implemented. With such a band, segments which fall within the shaded area or band 60, such as point D, may be classified as headlamps, while segments falling outside of the band 60, such as points E and F, may not be classified as headlamps by headlamp control process 400. Clearly, the scope of the present invention further includes other thresholds and criteria for determining whether a particular segment should be classified as a headlamp or taillight, with respect to its intensity and distance and/or angle or lateral position relative to vehicle 10.
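

The band test of FIG. 17C reduces to a pair of curves bracketing the admissible intensity at each distance. In the sketch below the inverse-square curves are invented placeholders for curves fitted to real world data:

    def in_headlamp_band(distance, intensity, lower_curve, upper_curve):
        # Accept a segment as a headlamp only when its maximum intensity at
        # the ranged distance falls inside the min/max band 60 of FIG. 17C.
        return lower_curve(distance) <= intensity <= upper_curve(distance)

    # Placeholder band; the constants are made up for illustration.
    low_curve = lambda d: 5.0e6 / d ** 2
    high_curve = lambda d: 2.0e7 / d ** 2
    print(in_headlamp_band(100.0, 900.0, low_curve, high_curve))  # True: inside band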


Therefore, the present invention provides a stereoscopic imaging system, useful with various accessory controls or displays, which is operable to determine a distance from one or more imaging array sensors to an object or light source remote from the sensors. The stereoscopic imaging system may determine a distance to any object or light source in a targeted scene, without requiring additional equipment or ranging devices. Furthermore, the system may provide a distance determination to a headlamp control without having to assume that the light source is within a predetermined range of intensities corresponding to a typical intensity of a headlamp or taillight and calculating the distance based on the intensity alone. Accordingly, the imaging system provides a more accurate distance calculation, since it is not affected by variations in the intensity of the light source that is being sensed. The accuracy of the distance calculations may be further enhanced by implementing a segmentation algorithm which determines the average position of the light source as received by the sensor, thereby facilitating sub-pixel resolution for the distance calculations. Furthermore, the distance calculation may be applied equally well to other images that are not associated with headlamps or taillights of other vehicles. Accordingly, the stereoscopic imaging system described herein may be useful with other vehicular imaging systems, such as rearview vision systems, backup aids, rain sensors or the like.


Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law.

Claims
  • 1. A vision system for a vehicle, said vision system comprising: at least one imaging sensor having a forward field of view with respect to a forward direction of travel of a vehicle equipped with said vision system, wherein said at least one imaging sensor comprises a lens and a CMOS imaging array comprising a two-dimensional array of photosensing elements, and wherein said at least one imaging sensor captures image data representative of objects present in said forward field of view; at least one register operable to access photosensing elements of said array of photosensing elements; wherein said array of photosensing elements comprises a plurality of sub-arrays, said sub-arrays each having at least three neighboring photosensing elements; wherein at least one photosensing element of each sub-array comprises a red light sensitive photosensing element and at least one other photosensing element of each sub-array comprises a white light sensitive photosensing element; a control responsive to said at least one imaging sensor; wherein said control processes image data captured by said at least one imaging sensor to determine an object present in said forward field of view; wherein said control processes image data captured by said at least one imaging sensor to determine a distance between the equipped vehicle and the object determined present in said forward field of view; and wherein said control discerns if the determined object is an oncoming vehicle or a leading vehicle at least in part in response to rate of approach of the determined object to the equipped vehicle.
  • 2. The vision system of claim 1, wherein said control is operable to adjust a light beam emitted by at least one headlamp of the equipped vehicle responsive to said processing of image data captured by said at least one imaging sensor.
  • 3. The vision system of claim 1, wherein said array of photosensing elements comprises at least 512 rows of photosensing elements and at least 512 columns of photosensing elements.
  • 4. The vision system of claim 1, wherein said array of photosensing elements comprises at least 262,144 photosensing elements.
  • 5. The vision system of claim 1, wherein the object determined present in said forward field of view comprises a headlight or a taillight of another vehicle and wherein said control is operable to adjust a light beam emitted by at least one headlamp of the equipped vehicle responsive to said processing of said image data.
  • 6. The vision system of claim 1, wherein said control processes image data captured by said at least one imaging sensor to identify a headlamp or taillight of another vehicle present in said forward field of view.
  • 7. The vision system of claim 6, wherein said control processes image data captured by said at least one imaging sensor and determines a distance between the equipped vehicle and the other vehicle.
  • 8. The vision system of claim 7, wherein said control determines the distance between the equipped vehicle and the other vehicle at least in part in response to at least one of (i) size of the determined headlamp or taillight, (ii) position of the determined headlamp or taillight, (iii) light intensity of the determined headlamp or taillight, and (iv) rate of approach of the determined headlamp or taillight.
  • 9. The vision system of claim 1, wherein red filtering is used with at least some of said photosensing elements, and wherein, responsive at least to processing of image data captured by said red filtered photosensing elements, said control is operable to at least partially distinguish the object from other objects present in said forward field of view.
  • 10. The vision system of claim 9, wherein, responsive at least in part to processing of image data captured by said red filtered photosensing elements, said control is operable to distinguish taillights of other vehicles ahead of the equipped vehicle.
  • 11. The vision system of claim 1, wherein said control adjusts a light beam emitted by at least one headlamp of the equipped vehicle responsive to said control (i) processing image data captured by said at least one imaging sensor and (ii) identifying a headlamp or taillight of another vehicle present in said forward field of view.
  • 12. The vision system of claim 11, wherein said control adjusts the light beam emitted by the at least one headlamp of the equipped vehicle by adjusting the at least one headlamp between a first beam and a second beam.
  • 13. The vision system of claim 12, wherein said first beam comprises a lower beam and said second beam comprises a higher beam.
  • 14. The vision system of claim 1, wherein said control is operable to control a system of the equipped vehicle responsive to said processing of image data captured by said at least one imaging sensor.
  • 15. The vision system of claim 14, wherein said controlled system comprises at least one of (i) a lighting system of the equipped vehicle, (ii) a brake system of the equipped vehicle and (iii) a driver assistance system of the equipped vehicle.
  • 16. The vision system of claim 1, wherein said control controls a brake system of the equipped vehicle responsive to determining that the object determined present in said forward field of view is within a threshold distance of the equipped vehicle.
  • 17. The vision system of claim 1, wherein said at least one imaging sensor is disposed behind a windshield of the equipped vehicle and said at least one imaging sensor views through the windshield of the equipped vehicle.
  • 18. The vision system of claim 1, wherein said at least one imaging sensor comprises at least a first imaging sensor and a second imaging sensor, and wherein said first imaging sensor is spaced from said second imaging sensor.
  • 19. The vision system of claim 18, wherein said first and second imaging sensors comprise imaging sensors of a stereoscopic imaging system of the equipped vehicle.
  • 20. A vision system for a vehicle, said vision system comprising:
at least one imaging sensor having a forward field of view with respect to a forward direction of travel of a vehicle equipped with said vision system, wherein said at least one imaging sensor comprises a lens and a CMOS imaging array comprising a two-dimensional array of photosensing elements, and wherein said at least one imaging sensor captures image data representative of objects present in said forward field of view;
at least one register operable to access photosensing elements of said array of photosensing elements;
wherein said array of photosensing elements comprises a plurality of sub-arrays, said sub-arrays each having at least three neighboring photosensing elements;
wherein at least one photosensing element of each sub-array comprises a red light sensitive photosensing element and at least one other photosensing element of each sub-array comprises a white light sensitive photosensing element;
a control responsive to said at least one imaging sensor;
wherein said control processes image data captured by said at least one imaging sensor to determine a headlamp or taillight of another vehicle present in said forward field of view;
wherein said control processes image data captured by said at least one imaging sensor to determine a distance between the equipped vehicle and the other vehicle; and
wherein said control discerns if the other vehicle is an oncoming vehicle or a leading vehicle at least in part in response to rate of approach of the determined headlamp or taillight to the equipped vehicle.
  • 21. The vision system of claim 20, wherein said control is operable to adjust a light beam emitted by at least one headlamp of the equipped vehicle responsive to said processing of image data captured by said at least one imaging sensor.
  • 22. The vision system of claim 20, wherein said array of photosensing elements comprises at least 262,144 photosensing elements.
  • 23. The vision system of claim 20, wherein red filtering is used with at least some of said photosensing elements, and wherein, responsive at least to processing of image data captured by said red filtered photosensing elements, said control is operable to at least partially distinguish the headlamp or taillight of the other vehicle from other objects present in said forward field of view.
  • 24. A vision system for a vehicle, said vision system comprising:
at least one imaging sensor having a forward field of view with respect to a forward direction of travel of a vehicle equipped with said vision system, wherein said at least one imaging sensor comprises a lens and a CMOS imaging array comprising a two-dimensional array of at least 262,144 photosensing elements, and wherein said at least one imaging sensor captures image data representative of objects present in said forward field of view;
at least one register operable to access photosensing elements of said array of photosensing elements;
wherein said array of photosensing elements comprises a plurality of sub-arrays, said sub-arrays each having at least three neighboring photosensing elements;
wherein at least one photosensing element of each sub-array comprises a red light sensitive photosensing element and at least one other photosensing element of each sub-array comprises a white light sensitive photosensing element;
a control responsive to said at least one imaging sensor;
wherein said control processes image data captured by said at least one imaging sensor to determine a headlamp or taillight of another vehicle present in said forward field of view;
wherein said control processes image data captured by said at least one imaging sensor to determine a distance between the equipped vehicle and the other vehicle;
wherein red filtering is used with at least some of said photosensing elements, and wherein, responsive at least to processing of image data captured by said red filtered photosensing elements, said control is operable to at least partially distinguish the determined headlamp or taillight of the other vehicle from other objects present in said forward field of view; and
wherein said control discerns if the other vehicle is an oncoming vehicle or a leading vehicle at least in part in response to rate of approach of the determined headlamp or taillight to the equipped vehicle.
  • 25. The vision system of claim 24, wherein said control is operable to adjust a light beam emitted by at least one headlamp of the equipped vehicle responsive to said processing of image data captured by said at least one imaging sensor.
  • 26. A vision system for a vehicle, said vision system comprising:
at least one imaging sensor having a forward field of view with respect to a forward direction of travel of a vehicle equipped with said vision system, wherein said at least one imaging sensor comprises a lens and a CMOS imaging array comprising a two-dimensional array of photosensing elements, and wherein said at least one imaging sensor captures image data representative of objects present in said forward field of view;
at least one register operable to access photosensing elements of said array of photosensing elements;
wherein said array of photosensing elements comprises a plurality of sub-arrays, said sub-arrays each having at least three neighboring photosensing elements;
wherein at least one photosensing element of each sub-array comprises a red light sensitive photosensing element and at least one other photosensing element of each sub-array comprises a white light sensitive photosensing element;
wherein said at least one imaging sensor is disposed behind a windshield of the equipped vehicle and said at least one imaging sensor views through the windshield of the equipped vehicle;
a control responsive to said at least one imaging sensor;
wherein said control processes image data captured by said at least one imaging sensor to determine a headlamp or taillight of another vehicle present in said forward field of view;
wherein said control processes image data captured by said at least one imaging sensor to determine a distance between the equipped vehicle and the other vehicle; and
wherein said control discerns if the other vehicle is an oncoming vehicle or a leading vehicle at least in part in response to rate of approach of the determined headlamp or taillight to the equipped vehicle.
  • 27. The vision system of claim 26, wherein said array of photosensing elements comprises at least 262,144 photosensing elements.
  • 28. The vision system of claim 26, wherein said control is operable to adjust a light beam emitted by at least one headlamp of the equipped vehicle responsive to said processing of said image data.
  • 29. The vision system of claim 26, wherein red filtering is used with at least some of said photosensing elements, and wherein, responsive at least to processing of image data captured by said red filtered photosensing elements, said control is operable to at least partially distinguish the determined headlamp or taillight from other objects present in said forward field of view.
  • 30. The vision system of claim 29, wherein, responsive at least in part to processing of image data captured by said red filtered photosensing elements, said control is operable to distinguish taillights of other vehicles ahead of the equipped vehicle.
  • 31. The vision system of claim 26, wherein said control adjusts a light beam emitted by at least one headlamp of the equipped vehicle responsive to said control (i) processing image data captured by said at least one imaging sensor and (ii) identifying a headlamp or taillight of another vehicle present in said forward field of view.
  • 32. The vision system of claim 26, wherein said control is operable to control a system of the equipped vehicle responsive to said processing of image data captured by said at least one imaging sensor, and wherein said controlled system comprises at least one of (i) a lighting system of the equipped vehicle, (ii) a brake system of the equipped vehicle and (iii) a driver assistance system of the equipped vehicle.
CROSS REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of U.S. patent application Ser. No. 13/292,119, filed Nov. 9, 2011, now U.S. Pat. No. 8,203,443, which is a continuation of U.S. patent application Ser. No. 11/810,164, filed Jun. 5, 2007, now U.S. Pat. No. 8,063,759, which is a continuation of U.S. patent application Ser. No. 10/984,403, filed Nov. 9, 2004, now U.S. Pat. No. 7,227,459, which is a continuation of U.S. patent application Ser. No. 10/047,901, filed Jan. 14, 2002, now U.S. Pat. No. 6,822,563, which is a continuation of U.S. patent application Ser. No. 09/372,915, filed Aug. 12, 1999, now U.S. Pat. No. 6,396,397.

US Referenced Citations (489)
Number Name Date Kind
2632040 Rabinow Mar 1953 A
2827594 Rabinow Mar 1958 A
3141393 Platt Jul 1964 A
3601614 Platzer Aug 1971 A
3612666 Rabinow Oct 1971 A
3665224 Kelsey May 1972 A
3680951 Jordan Aug 1972 A
3689695 Rosenfield et al. Sep 1972 A
3708231 Walters Jan 1973 A
3746430 Brean Jul 1973 A
3807832 Castellion Apr 1974 A
3811046 Levick May 1974 A
3813540 Albrecht May 1974 A
3862798 Hopkins Jan 1975 A
3947095 Moultrie Mar 1976 A
3962600 Pittman Jun 1976 A
3985424 Steinacher Oct 1976 A
3986022 Hyatt Oct 1976 A
4037134 Löper Jul 1977 A
4052712 Ohama et al. Oct 1977 A
4093364 Miller Jun 1978 A
4111720 Michel et al. Sep 1978 A
4161653 Bedini Jul 1979 A
4200361 Malvano Apr 1980 A
4214266 Myers Jul 1980 A
4218698 Bart et al. Aug 1980 A
4236099 Rosenblum Nov 1980 A
4247870 Gabel et al. Jan 1981 A
4249160 Chilvers Feb 1981 A
4266856 Wainwright May 1981 A
4277804 Robison Jul 1981 A
4281898 Ochiai Aug 1981 A
4288814 Talley et al. Sep 1981 A
4355271 Noack Oct 1982 A
4357558 Massoni et al. Nov 1982 A
4381888 Momiyama May 1983 A
4420238 Felix Dec 1983 A
4431896 Lodetti Feb 1984 A
4443057 Bauer Apr 1984 A
4460831 Oettinger et al. Jul 1984 A
4481450 Watanabe et al. Nov 1984 A
4491390 Tong-Shen Jan 1985 A
4512637 Ballmer Apr 1985 A
4529275 Ballmer Jul 1985 A
4529873 Ballmer Jul 1985 A
4546551 Franks Oct 1985 A
4549208 Kamejima et al. Oct 1985 A
4571082 Downs Feb 1986 A
4572619 Reininger Feb 1986 A
4580875 Bechtel Apr 1986 A
4600913 Caine Jul 1986 A
4603946 Kato Aug 1986 A
4614415 Hyatt Sep 1986 A
4620141 McCumber et al. Oct 1986 A
4623222 Itoh Nov 1986 A
4626850 Chey Dec 1986 A
4629941 Ellis Dec 1986 A
4630109 Barton Dec 1986 A
4632509 Ohmi Dec 1986 A
4638287 Umebayashi et al. Jan 1987 A
4647161 Müller Mar 1987 A
4653316 Fukuhara Mar 1987 A
4669825 Itoh Jun 1987 A
4669826 Itoh Jun 1987 A
4671615 Fukada Jun 1987 A
4672457 Hyatt Jun 1987 A
4676601 Itoh Jun 1987 A
4690508 Jacob Sep 1987 A
4692798 Seko et al. Sep 1987 A
4697883 Suzuki Oct 1987 A
4701022 Jacob Oct 1987 A
4713685 Nishimura et al. Dec 1987 A
4717830 Botts Jan 1988 A
4727290 Smith Feb 1988 A
4731669 Hayashi et al. Mar 1988 A
4741603 Miyagi May 1988 A
4768135 Kretschmer et al. Aug 1988 A
4772942 Tuck Sep 1988 A
4789904 Peterson Dec 1988 A
4793690 Gahan Dec 1988 A
4817948 Simonelli Apr 1989 A
4820933 Hong Apr 1989 A
4825232 Howdle Apr 1989 A
4838650 Stewart Jun 1989 A
4847772 Michalopoulos et al. Jul 1989 A
4855822 Narendra et al. Aug 1989 A
4859031 Berman et al. Aug 1989 A
4862037 Farber et al. Aug 1989 A
4867561 Fujii et al. Sep 1989 A
4871917 O'Farrell et al. Oct 1989 A
4872051 Dye Oct 1989 A
4881019 Shiraishi et al. Nov 1989 A
4882565 Gallmeyer Nov 1989 A
4886960 Molyneux Dec 1989 A
4891559 Matsumoto et al. Jan 1990 A
4892345 Rachael, III Jan 1990 A
4895790 Swanson et al. Jan 1990 A
4896030 Miyaji Jan 1990 A
4900133 Berman Feb 1990 A
4907870 Brucker Mar 1990 A
4910591 Petrossian et al. Mar 1990 A
4916374 Schierbeek Apr 1990 A
4917477 Bechtel et al. Apr 1990 A
4937796 Tendler Jun 1990 A
4953305 Van Lente et al. Sep 1990 A
4956591 Schierbeek Sep 1990 A
4961625 Wood et al. Oct 1990 A
4967319 Seko Oct 1990 A
4970653 Kenue Nov 1990 A
4971430 Lynas Nov 1990 A
4974078 Tsai Nov 1990 A
4987357 Masaki Jan 1991 A
4987410 Berman et al. Jan 1991 A
4991054 Walters Feb 1991 A
5001558 Burley et al. Mar 1991 A
5003288 Wilhelm Mar 1991 A
5012082 Watanabe Apr 1991 A
5016977 Baude et al. May 1991 A
5027001 Torbert Jun 1991 A
5027200 Petrossian et al. Jun 1991 A
5044706 Chen Sep 1991 A
5050966 Berman Sep 1991 A
5055668 French Oct 1991 A
5059877 Teder Oct 1991 A
5064274 Alten Nov 1991 A
5072154 Chen Dec 1991 A
5086253 Lawler Feb 1992 A
5096287 Kakinami et al. Mar 1992 A
5097362 Lynas Mar 1992 A
5121200 Choi Jun 1992 A
5124549 Michaels et al. Jun 1992 A
5130709 Toyama et al. Jul 1992 A
5148014 Lynam Sep 1992 A
5168378 Black Dec 1992 A
5170374 Shimohigashi et al. Dec 1992 A
5172235 Wilm et al. Dec 1992 A
5177685 Davis et al. Jan 1993 A
5182502 Slotkowski et al. Jan 1993 A
5184956 Langlais et al. Feb 1993 A
5189561 Hong Feb 1993 A
5193000 Lipton et al. Mar 1993 A
5193029 Schofield Mar 1993 A
5204778 Bechtel Apr 1993 A
5208701 Maeda May 1993 A
5214408 Asayama May 1993 A
5245422 Borcherts et al. Sep 1993 A
5253109 O'Farrell Oct 1993 A
5276389 Levers Jan 1994 A
5285060 Larson et al. Feb 1994 A
5289182 Brillard et al. Feb 1994 A
5289321 Secor Feb 1994 A
5305012 Faris Apr 1994 A
5307136 Saneyoshi Apr 1994 A
5309137 Kajiwara May 1994 A
5313072 Vachss May 1994 A
5325096 Pakett Jun 1994 A
5325386 Jewell et al. Jun 1994 A
5329206 Slotkowski et al. Jul 1994 A
5331312 Kudoh Jul 1994 A
5336980 Levers Aug 1994 A
5341437 Nakayama Aug 1994 A
5351044 Mathur et al. Sep 1994 A
5355118 Fukuhara Oct 1994 A
5374852 Parkes Dec 1994 A
5386285 Asayama Jan 1995 A
5394333 Kao Feb 1995 A
5406395 Wilson et al. Apr 1995 A
5408346 Trissel et al. Apr 1995 A
5410346 Saneyoshi et al. Apr 1995 A
5414257 Stanton May 1995 A
5414461 Kishi et al. May 1995 A
5416313 Larson et al. May 1995 A
5416318 Hegyi May 1995 A
5416478 Morinaga May 1995 A
5424952 Asayama Jun 1995 A
5426294 Kobayashi et al. Jun 1995 A
5430431 Nelson Jul 1995 A
5434407 Bauer et al. Jul 1995 A
5440428 Hegg et al. Aug 1995 A
5444478 Lelong et al. Aug 1995 A
5451822 Bechtel et al. Sep 1995 A
5457493 Leddy et al. Oct 1995 A
5461357 Yoshioka et al. Oct 1995 A
5461361 Moore Oct 1995 A
5469298 Suman et al. Nov 1995 A
5471515 Fossum et al. Nov 1995 A
5475494 Nishida et al. Dec 1995 A
5487116 Nakano et al. Jan 1996 A
5493392 Blackmon et al. Feb 1996 A
5498866 Bendicks et al. Mar 1996 A
5500766 Stonecypher Mar 1996 A
5510983 Iino Apr 1996 A
5515448 Nishitani May 1996 A
5521633 Nakajima et al. May 1996 A
5528698 Kamei et al. Jun 1996 A
5529138 Shaw et al. Jun 1996 A
5530240 Larson et al. Jun 1996 A
5530420 Tsuchiya et al. Jun 1996 A
5535144 Kise Jul 1996 A
5535314 Alves et al. Jul 1996 A
5537003 Bechtel et al. Jul 1996 A
5539397 Asanuma et al. Jul 1996 A
5541590 Nishio Jul 1996 A
5550677 Schofield et al. Aug 1996 A
5555312 Shima et al. Sep 1996 A
5555555 Sato et al. Sep 1996 A
5568027 Teder Oct 1996 A
5574443 Hsieh Nov 1996 A
5581464 Woll et al. Dec 1996 A
5594222 Caldwell Jan 1997 A
5614788 Mullins Mar 1997 A
5619370 Guinosso Apr 1997 A
5634709 Iwama Jun 1997 A
5642299 Hardin et al. Jun 1997 A
5648835 Uzawa Jul 1997 A
5650944 Kise Jul 1997 A
5660454 Mori et al. Aug 1997 A
5661303 Teder Aug 1997 A
5666028 Bechtel et al. Sep 1997 A
5668663 Varaprasad et al. Sep 1997 A
5670935 Schofield et al. Sep 1997 A
5675489 Pomerleau Oct 1997 A
5677851 Kingdon et al. Oct 1997 A
5699044 Van Lente et al. Dec 1997 A
5715093 Schierbeek et al. Feb 1998 A
5724187 Varaprasad et al. Mar 1998 A
5724316 Brunts Mar 1998 A
5737226 Olson et al. Apr 1998 A
5757949 Kinoshita et al. May 1998 A
5760826 Nayer Jun 1998 A
5760828 Cortes Jun 1998 A
5760931 Saburi et al. Jun 1998 A
5760962 Schofield et al. Jun 1998 A
5761094 Olson et al. Jun 1998 A
5765116 Wilson-Jones et al. Jun 1998 A
5781437 Wiemer et al. Jul 1998 A
5786772 Schofield et al. Jul 1998 A
5790403 Nakayama Aug 1998 A
5790973 Blaker et al. Aug 1998 A
5793308 Rosinski et al. Aug 1998 A
5793420 Schmidt Aug 1998 A
5796094 Schofield et al. Aug 1998 A
5798575 O'Farrell et al. Aug 1998 A
5835255 Miles Nov 1998 A
5837994 Stam et al. Nov 1998 A
5844505 Van Ryzin Dec 1998 A
5844682 Kiyomoto et al. Dec 1998 A
5845000 Breed et al. Dec 1998 A
5848802 Breed et al. Dec 1998 A
5850176 Kinoshita et al. Dec 1998 A
5850254 Takano et al. Dec 1998 A
5867591 Onda Feb 1999 A
5877707 Kowalick Mar 1999 A
5877897 Schofield et al. Mar 1999 A
5878370 Olson Mar 1999 A
5883739 Ashihara et al. Mar 1999 A
5884212 Lion Mar 1999 A
5890021 Onoda Mar 1999 A
5896085 Mori et al. Apr 1999 A
5899956 Chan May 1999 A
5914815 Bos Jun 1999 A
5923027 Stam et al. Jul 1999 A
5929786 Schofield et al. Jul 1999 A
5940120 Frankhouse et al. Aug 1999 A
5949331 Schofield et al. Sep 1999 A
5956181 Lin Sep 1999 A
5959367 O'Farrell et al. Sep 1999 A
5959555 Furuta Sep 1999 A
5963247 Banitt Oct 1999 A
5964822 Alland et al. Oct 1999 A
5971552 O'Farrell et al. Oct 1999 A
5986796 Miles Nov 1999 A
5990469 Bechtel et al. Nov 1999 A
5990649 Nagao et al. Nov 1999 A
6001486 Varaprasad et al. Dec 1999 A
6009336 Harris et al. Dec 1999 A
6020704 Buschur Feb 2000 A
6049171 Stam et al. Apr 2000 A
6066933 Ponziana May 2000 A
6084519 Coulling et al. Jul 2000 A
6087953 DeLine et al. Jul 2000 A
6097023 Schofield et al. Aug 2000 A
6097024 Stam et al. Aug 2000 A
6116743 Hoek Sep 2000 A
6124647 Marcus et al. Sep 2000 A
6124886 DeLine et al. Sep 2000 A
6139172 Bos et al. Oct 2000 A
6144022 Tenenbaum et al. Nov 2000 A
6172613 DeLine et al. Jan 2001 B1
6175164 O'Farrell et al. Jan 2001 B1
6175300 Kendrick Jan 2001 B1
6198409 Schofield et al. Mar 2001 B1
6201642 Bos Mar 2001 B1
6222447 Schofield et al. Apr 2001 B1
6222460 DeLine et al. Apr 2001 B1
6243003 DeLine et al. Jun 2001 B1
6250148 Lynam Jun 2001 B1
6259412 Duroux Jul 2001 B1
6266082 Yonezawa et al. Jul 2001 B1
6266442 Laumeyer et al. Jul 2001 B1
6285393 Shimoura et al. Sep 2001 B1
6291906 Marcus et al. Sep 2001 B1
6294989 Schofield et al. Sep 2001 B1
6297781 Turnbull et al. Oct 2001 B1
6302545 Schofield et al. Oct 2001 B1
6310611 Caldwell Oct 2001 B1
6313454 Bos et al. Nov 2001 B1
6317057 Lee Nov 2001 B1
6320176 Schofield et al. Nov 2001 B1
6320282 Caldwell Nov 2001 B1
6326613 Heslin et al. Dec 2001 B1
6329925 Skiver et al. Dec 2001 B1
6333759 Mazzilli Dec 2001 B1
6341523 Lynam Jan 2002 B2
6353392 Schofield et al. Mar 2002 B1
6366213 DeLine et al. Apr 2002 B2
6370329 Teuchert Apr 2002 B1
6396397 Bos et al. May 2002 B1
6411204 Bloomfield et al. Jun 2002 B1
6411328 Franke et al. Jun 2002 B1
6420975 DeLine et al. Jul 2002 B1
6424273 Gutta et al. Jul 2002 B1
6428172 Hutzel et al. Aug 2002 B1
6429594 Stam et al. Aug 2002 B1
6430303 Naoi et al. Aug 2002 B1
6433676 DeLine et al. Aug 2002 B2
6433817 Guerra Aug 2002 B1
6442465 Breed et al. Aug 2002 B2
6477464 McCarthy et al. Nov 2002 B2
6485155 Duroux et al. Nov 2002 B1
6497503 Dassanayake et al. Dec 2002 B1
6498620 Schofield et al. Dec 2002 B2
6513252 Schierbeek et al. Feb 2003 B1
6516664 Lynam Feb 2003 B2
6523964 Schofield et al. Feb 2003 B2
6534884 Marcus et al. Mar 2003 B2
6539306 Turnbull Mar 2003 B2
6547133 DeVries, Jr. et al. Apr 2003 B1
6553130 Lemelson et al. Apr 2003 B1
6559435 Schofield et al. May 2003 B2
6574033 Chui et al. Jun 2003 B1
6578017 Ebersole et al. Jun 2003 B1
6587573 Stam et al. Jul 2003 B1
6589625 Kothari et al. Jul 2003 B1
6593565 Heslin et al. Jul 2003 B2
6594583 Ogura et al. Jul 2003 B2
6611202 Schofield et al. Aug 2003 B2
6611610 Stam et al. Aug 2003 B1
6627918 Getz et al. Sep 2003 B2
6631994 Suzuki et al. Oct 2003 B2
6636258 Strumolo Oct 2003 B2
6648477 Hutzel et al. Nov 2003 B2
6650233 DeLine et al. Nov 2003 B2
6650455 Miles Nov 2003 B2
6672731 Schnell et al. Jan 2004 B2
6674562 Miles Jan 2004 B1
6678056 Downs Jan 2004 B2
6678614 McCarthy et al. Jan 2004 B2
6680792 Miles Jan 2004 B2
6690268 Schofield et al. Feb 2004 B2
6700605 Toyoda et al. Mar 2004 B1
6703925 Steffel Mar 2004 B2
6704621 Stein et al. Mar 2004 B1
6710908 Miles et al. Mar 2004 B2
6711474 Treyz et al. Mar 2004 B1
6714331 Lewis et al. Mar 2004 B2
6717610 Bos et al. Apr 2004 B1
6728393 Stam et al. Apr 2004 B2
6735506 Breed et al. May 2004 B2
6741377 Miles May 2004 B2
6744353 Sjönell Jun 2004 B2
6757109 Bos Jun 2004 B2
6762867 Lippert et al. Jul 2004 B2
6794119 Miles Sep 2004 B2
6795221 Urey Sep 2004 B1
6802617 Schofield et al. Oct 2004 B2
6806452 Bos et al. Oct 2004 B2
6822563 Bos et al. Nov 2004 B2
6823241 Shirato et al. Nov 2004 B2
6824281 Schofield et al. Nov 2004 B2
6831261 Schofield et al. Dec 2004 B2
6847487 Burgner Jan 2005 B2
6861809 Stam Mar 2005 B2
6882287 Schofield Apr 2005 B2
6889161 Winner et al. May 2005 B2
6891563 Schofield et al. May 2005 B2
6909753 Meehan et al. Jun 2005 B2
6946978 Schofield Sep 2005 B2
6947577 Stam et al. Sep 2005 B2
6953253 Schofield et al. Oct 2005 B2
6968736 Lynam Nov 2005 B2
6975775 Rykowski et al. Dec 2005 B2
7004593 Weller et al. Feb 2006 B2
7004606 Schofield Feb 2006 B2
7005974 McMahon et al. Feb 2006 B2
7038577 Pawlicki et al. May 2006 B2
7046448 Burgner May 2006 B2
7062300 Kim Jun 2006 B1
7065432 Moisel et al. Jun 2006 B2
7085637 Breed et al. Aug 2006 B2
7092548 Laumeyer et al. Aug 2006 B2
7116246 Winter et al. Oct 2006 B2
7123168 Schofield Oct 2006 B2
7133661 Hatae et al. Nov 2006 B2
7149613 Stam et al. Dec 2006 B2
7167796 Taylor et al. Jan 2007 B2
7195381 Lynam et al. Mar 2007 B2
7202776 Breed Apr 2007 B2
7224324 Quist et al. May 2007 B2
7227459 Bos et al. Jun 2007 B2
7227611 Hull et al. Jun 2007 B2
7249860 Kulas et al. Jul 2007 B2
7253723 Lindahl et al. Aug 2007 B2
7255451 McCabe et al. Aug 2007 B2
7311406 Schofield et al. Dec 2007 B2
7325934 Schofield et al. Feb 2008 B2
7325935 Schofield et al. Feb 2008 B2
7338177 Lynam Mar 2008 B2
7339149 Schofield et al. Mar 2008 B1
7344261 Schofield et al. Mar 2008 B2
7360932 Uken et al. Apr 2008 B2
7370983 DeWind et al. May 2008 B2
7375803 Bamji May 2008 B1
7380948 Schofield et al. Jun 2008 B2
7388182 Schofield et al. Jun 2008 B2
7402786 Schofield et al. Jul 2008 B2
7423248 Schofield et al. Sep 2008 B2
7423821 Bechtel et al. Sep 2008 B2
7425076 Schofield et al. Sep 2008 B2
7432967 Bechtel et al. Oct 2008 B2
7459664 Schofield et al. Dec 2008 B2
7526103 Schofield et al. Apr 2009 B2
7541743 Salmeen et al. Jun 2009 B2
7561181 Schofield et al. Jul 2009 B2
7565006 Stam et al. Jul 2009 B2
7616781 Schofield et al. Nov 2009 B2
7619508 Lynam et al. Nov 2009 B2
7633383 Dunsmoir et al. Dec 2009 B2
7639149 Katoh Dec 2009 B2
7676087 Dhua et al. Mar 2010 B2
7720580 Higgins-Luthman May 2010 B2
7792329 Schofield et al. Sep 2010 B2
7843451 Lafon Nov 2010 B2
7855778 Yung et al. Dec 2010 B2
7859565 Schofield et al. Dec 2010 B2
7881496 Camilleri Feb 2011 B2
7914187 Higgins-Luthman et al. Mar 2011 B2
7930160 Hosagrahara et al. Apr 2011 B1
7994462 Schofield et al. Aug 2011 B2
8017898 Lu et al. Sep 2011 B2
8063759 Bos et al. Nov 2011 B2
8095310 Taylor et al. Jan 2012 B2
8098142 Schofield et al. Jan 2012 B2
8203443 Bos et al. Jun 2012 B2
8224031 Saito Jul 2012 B2
20020015153 Downs Feb 2002 A1
20020113873 Williams Aug 2002 A1
20030137586 Lewellen Jul 2003 A1
20030222982 Hamdan et al. Dec 2003 A1
20040051634 Schofield et al. Mar 2004 A1
20040200948 Bos et al. Oct 2004 A1
20050073853 Stam Apr 2005 A1
20050146792 Schofield et al. Jul 2005 A1
20050200700 Schofield et al. Sep 2005 A1
20050219852 Stam et al. Oct 2005 A1
20050237385 Kosaka et al. Oct 2005 A1
20060018511 Stam et al. Jan 2006 A1
20060018512 Stam et al. Jan 2006 A1
20060028731 Schofield et al. Feb 2006 A1
20060050018 Hutzel et al. Mar 2006 A1
20060091813 Stam et al. May 2006 A1
20060103727 Tseng May 2006 A1
20060250501 Wildmann et al. Nov 2006 A1
20070023613 Schofield et al. Feb 2007 A1
20070104476 Yasutomi et al. May 2007 A1
20070109406 Schofield et al. May 2007 A1
20070109651 Schofield et al. May 2007 A1
20070109652 Schofield et al. May 2007 A1
20070109653 Schofield et al. May 2007 A1
20070120657 Schofield et al. May 2007 A1
20070176080 Schofield et al. Aug 2007 A1
20070242339 Bradley Oct 2007 A1
20080147321 Howard et al. Jun 2008 A1
20080192132 Bechtel et al. Aug 2008 A1
20090113509 Tseng et al. Apr 2009 A1
20090160987 Bechtel et al. Jun 2009 A1
20090190015 Bechtel et al. Jul 2009 A1
20090256938 Bechtel et al. Oct 2009 A1
20120045112 Lundblad et al. Feb 2012 A1
Foreign Referenced Citations (98)
Number Date Country
2133182 Jan 1973 DE
2808260 Aug 1979 DE
2931368 Feb 1981 DE
2946561 May 1981 DE
3041692 May 1981 DE
3248511 Jul 1984 DE
3041612 Aug 1986 DE
4107965 Sep 1991 DE
4118208 Nov 1991 DE
4139515 Jun 1992 DE
4123641 Jan 1993 DE
48506 May 1985 EP
0202460 Nov 1986 EP
48810 Oct 1988 EP
0416222 Mar 1991 EP
0426503 May 1991 EP
0450553 Oct 1991 EP
0492591 Jul 1992 EP
0513476 Nov 1992 EP
0605045 Jul 1994 EP
0788947 Aug 1997 EP
0830267 Dec 2001 EP
2241085 Mar 1975 FR
2513198 Mar 1983 FR
2585991 Feb 1987 FR
2641237 Jul 1990 FR
2672857 Aug 1992 FR
2673499 Sep 1992 FR
2726144 Apr 1996 FR
934037 Aug 1963 GB
1535182 Dec 1978 GB
2029343 Mar 1980 GB
2119087 Nov 1983 GB
2137373 Oct 1984 GB
2137573 Oct 1984 GB
2156295 Oct 1985 GB
2244187 Nov 1991 GB
2255539 Nov 1992 GB
2327823 Feb 1999 GB
55-039843 Mar 1980 JP
56-30305 Mar 1981 JP
57-173801 Oct 1982 JP
57-208530 Dec 1982 JP
57-208531 Dec 1982 JP
58-19941 Feb 1983 JP
58-110334 Jun 1983 JP
58209635 Dec 1983 JP
59-51325 Mar 1984 JP
5951301 Mar 1984 JP
59114139 Jul 1984 JP
59133336 Sep 1984 JP
6079889 May 1985 JP
6080953 May 1985 JP
60-212730 Oct 1985 JP
60166651 Nov 1985 JP
60-261275 Dec 1985 JP
6154942 Mar 1986 JP
61-56638 Jul 1986 JP
62-43543 Feb 1987 JP
6272245 May 1987 JP
62122487 Jun 1987 JP
62122844 Jun 1987 JP
62131837 Jun 1987 JP
6414700 Jan 1989 JP
01-123587 May 1989 JP
03-061192 Mar 1991 JP
03-099952 Apr 1991 JP
042394 Nov 1991 JP
03-284413 Dec 1991 JP
417386 Jan 1992 JP
4114587 Apr 1992 JP
05-245886 Sep 1992 JP
05000638 Jan 1993 JP
05-050883 Mar 1993 JP
05-077657 Mar 1993 JP
5213113 Aug 1993 JP
05213113 Aug 1993 JP
06-107035 Apr 1994 JP
6227318 Aug 1994 JP
06267304 Sep 1994 JP
06276524 Sep 1994 JP
06295601 Oct 1994 JP
074170 Jan 1995 JP
0732936 Feb 1995 JP
0747878 Feb 1995 JP
07052706 Feb 1995 JP
0769125 Mar 1995 JP
07-105496 Apr 1995 JP
07105496 Apr 1995 JP
08166221 Jun 1996 JP
02-630604 Jul 1997 JP
2003-083742 Mar 2003 JP
WO 8605147 Sep 1986 WO
WO 9427262 Nov 1994 WO
WO 9621581 Jul 1996 WO
WO 9814974 Apr 1998 WO
WO 9914088 Mar 1999 WO
WO 9923828 May 1999 WO
Non-Patent Literature Citations (24)
Entry
G. Wang, D. Renshaw, P.B. Denyer and M. Lu, CMOS Video Cameras, article, 1991, 4 pages, University of Edinburgh, UK.
Dana H. Ballard and Christopher M. Brown, Computer Vision, Prentice-Hall, Englewood Cliffs, New Jersey, 5 pages, 1982.
Kobe, Gerry, “Hypnotic Wizardry! (interior electronics),” Automotive Industries, vol. 169, No. 5, p. 60, published May 1989. Relevant section is entitled Instrumentation.
SAE Information Report, “Vision Factors Considerations in Rear View Mirror Design—SAE J985 OCT88,” approved Oct. 1988, and located in 1995 SAE Handbook, vol. 3.
Hamit, Francis, “360-Degree Interactivity: New Video and Still Cameras Provide a Global Roaming Viewpoint”, Advanced Imaging, Mar. 1997, p. 50.
Johannes, Laura “A New Microchip Ushers in Cheaper Digital Cameras”, The Wall Street Journal, Aug. 21, 1998, p. B1.
Article entitled “Generation of Vision Technology,” published by VLSI Vision Limited, publication date unknown.
Article entitled “On-chip CMOS Sensors for VLSI Imaging Systems,” published by VLSI Vision Limited, 1991.
Decision—Motions—Bd. R. 125(a), issued Aug. 29, 2006 in connection with Interference No. 105,325, which involved U.S. Appl. No. 09/441,341, filed Nov. 16, 1999, by Schofield et al. and U.S. Patent No. 5,837,994, issued to Stam et al.
Reexamination Control No. 90/007,519, Reexamination of U.S. Patent No. 6,222,447, issued to Schofield et al.
Reexamination Control No. 90/007,520, Reexamination of U.S. Patent No. 5,949,331, issued to Schofield et al.
Reexamination Control No. 90/011,478, Reexamination of U.S. Patent No. 6,222,447, issued to Schofield et al.
Reexamination Control No. 90/011,477, Reexamination of U.S. Patent No. 5,949,331, issued to Schofield et al.
Search Report from European Patent Application No. EP 96916533.
Tokimaru et al., “CMOS Rear-View TV System with CCD Camera”, National Technical Report vol. 34, No. 3, pp. 329-336, Jun. 1988 (Japan).
J. Borenstein et al., “Where am I? Sensors and Method for Mobile Robot Positioning”, University of Michigan, Apr. 1996, pp. 2, 125-128.
Bow, Sing T., “Pattern Recognition and Image Preprocessing (Signal Processing and Communications)”, CRC Press, Jan. 15, 2002, pp. 557-559.
Vlacic et al., (Eds), “Intelligent Vehicle Technologies, Theory and Applications”, Society of Automotive Engineers Inc., edited by SAE International, 2001.
Van Leeuwen et al., “Real-Time Vehicle Tracking in Image Sequences”, IEEE, US, vol. 3, May 21, 2001, pp. 2049-2054, XP010547308.
Van Leeuwen et al., “Requirements for Motion Estimation in Image Sequences for Traffic Applications”, IEEE, US, vol. 1, May 24, 1999, pp. 145-150, XP010340272.
Van Leeuwen et al., “Motion Estimation with a Mobile Camera for Traffic Applications”, IEEE, US, vol. 1, Oct. 3, 2000, pp. 58-63.
Van Leeuwen et al., “Motion Interpretation for In-Car Vision Systems”, IEEE, US, vol. 1, Sep. 30, 2002, pp. 135-140.
Pratt, “Digital Image Processing, Passage—ED.3”, John Wiley & Sons, US, Jan. 1, 2001, pp. 657-659, XP002529771.
Donnelly Panoramic Vision™ on Renault Talisman Concept Car At Frankfort Motor Show, PR Newswire, Frankfort, Germany Sep. 10, 2001.
Related Publications (1)
Number Date Country
20120257060 A1 Oct 2012 US
Continuations (5)
Number Date Country
Parent 13292119 Nov 2011 US
Child 13525767 US
Parent 11810164 Jun 2007 US
Child 13292119 US
Parent 10984403 Nov 2004 US
Child 11810164 US
Parent 10047901 Jan 2002 US
Child 10984403 US
Parent 09372915 Aug 1999 US
Child 10047901 US