Vehicular imaging system with misalignment correction of camera

Information

  • Patent Grant
  • Patent Number
    11,908,166
  • Date Filed
    Monday, May 9, 2022
  • Date Issued
    Tuesday, February 20, 2024
Abstract
A vehicular imaging system includes a camera disposed behind a windshield of a vehicle and viewing through a portion of the windshield. Image data captured by the camera is provided to a control. The control receives, via a communication bus of the vehicle, at least one selected from the group consisting of (i) vehicle pitch information relating to pitch of the vehicle, (ii) vehicle yaw information relating to yaw of the vehicle and (iii) vehicle steering information relating to steering of the vehicle. The system automatically corrects for misalignment of the camera. Image data captured by the camera is processed at the control for a lane departure warning system of the vehicle and for at least one selected from the group consisting of (i) an automatic headlamp control system of the vehicle, (ii) a collision avoidance system of the vehicle and (iii) an adaptive front lighting system of the vehicle.
Description
FIELD OF THE INVENTION

The present invention relates to automatic headlamp control systems for vehicles and, more particularly, to automatic headlamp control systems that automatically adjust the high and low beam states of a vehicle headlamp.


BACKGROUND OF THE INVENTION

Automotive forward lighting systems are evolving in several areas including the use of image-based sensors, typically referred to as Automatic High Beam (AHB) control systems, to maximize the use of high beam road illumination when appropriate, the use of steerable beam systems, typically referred to as Adaptive Front Lighting (AFL) systems, to provide a greater range of beam pattern options particularly for driving on curved roads or during turn maneuvers wherein the beam pattern may be biased or supplemented in the direction of the curve or turn, and the combination of such AHB and AFL systems.


Automatic high beam control systems are known that utilize an optical system, an image sensor, and signal processing including spectral, spatial and temporal techniques to determine ambient lighting conditions, the road environment, and the presence of other road users in order to automatically control the selection of the appropriate forward lighting state such that user forward vision is optimized while minimizing the impact of headlamp caused glare on other road users in all lighting conditions. Examples of such systems are described in U.S. Pat. Nos. 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 6,396,397; 6,822,563 and 7,004,606, which are hereby incorporated herein by reference in their entireties.


While AHB systems that utilize the features and concepts described within the above identified U.S. patents have achieved performance levels that have resulted in considerable commercial success, it is desired to provide additional features and techniques, which may increase the utility, improve the performance, facilitate the manufacture, and simplify the installation of such systems.


SUMMARY OF THE INVENTION

The present invention provides an automatic headlamp control system that is operable to automatically control or adjust the high beam state of a vehicle's headlamps. The headlamp control system is operable to determine a focus of expansion pixel or pixels in the captured image and adjust the image processing in response to the location or pixel/pixels of the focus of expansion and the tracking of movement of detected light sources and other objects as the vehicle travels along the road. The headlamp control system of the present invention may provide reduced processing of image data to provide a low cost system.


According to an aspect of the present invention, a vehicular imaging system comprises a photosensor array comprising a plurality of photosensor elements and a control responsive to an output of the photosensor array. The photosensor array has a field of view forward of the vehicle that is generally in line with the vehicle's primary direction of forward travel. The photosensor array captures images of an area encompassed by the forward field of view. The control processes an image data set indicative of captured images. The control processes a reduced image data set of the image data set to determine whether an object of interest is within a target zone of the captured images. The reduced image data set is representative of a portion of the captured images as captured by a particular grouping of the photosensor elements. Responsive to a determination of a change in a focus of expansion of the captured images, the control adjusts the reduced image data set so as to be representative of a portion of the captured images as captured by a different particular grouping of the photosensor elements.


The control may be operable to adjust a state of a headlamp beam in response to the image processing. The focus of expansion comprises at least one photosensor element that initially detects a new light source in the field of view. The control may track the new light source as it expands in the captured images (such as while the relative distance between the controlled vehicle and the new light source decreases) to confirm that the new light source is indicative of an object of interest. The control may determine that the new light source is representative of a light source of a leading or approaching vehicle and the controlled vehicle and approaching vehicle are traveling along a substantially flat and substantially straight road, and the control may compare a location of the new light source (such as when it is at or near the targeted zone of interest) to an expected location of the light source to determine if there is an offset. The control may process many samples of new light sources to arrive at an optimal or enhanced offset. The control adjusts the reduced data set in response to determination of such an offset.
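The expansion check and offset comparison described above can be sketched as follows. This is a minimal illustration, not the patented implementation; the pixel coordinates, the minimum track length, and the monotonic-growth test are all illustrative assumptions.

```python
# Hedged sketch of the FOE offset determination described above.
# All names and thresholds are illustrative assumptions.

def confirm_vehicle_lamp(track):
    """A new light source is treated as a vehicle lamp only if its imaged
    size grows as the range to it closes, i.e. it expands in the image.
    track: list of (row, col, size) observations over successive frames."""
    sizes = [size for (_, _, size) in track]
    return len(sizes) >= 3 and all(b > a for a, b in zip(sizes, sizes[1:]))

def foe_offset(detected_foe, expected_foe):
    """Offset between the pixel where a confirmed vehicle lamp first
    appeared and the expected focus-of-expansion pixel, as
    (row_offset, col_offset)."""
    det_row, det_col = detected_foe
    exp_row, exp_col = expected_foe
    return (det_row - exp_row, det_col - exp_col)
```

For example, a lamp tracked as `[(240, 320, 2), (241, 321, 4), (241, 322, 7)]` expands frame over frame and is confirmed; if it first appeared at pixel (240, 320) while the expected FOE pixel is (245, 325), the detected offset is (-5, -5).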


According to another aspect of the present invention, a vehicular imaging system includes a photosensor array having a plurality of photosensor elements and a control responsive to an output of the photosensor array. The photosensor array has a field of view forward of the vehicle that is generally in line with the vehicle's primary direction of forward travel. The photosensor array captures images of an area encompassed by the forward field of view. The control processes image data indicative of captured images, and is operable to selectively process the output of the photosensor array at two or more different resolutions. The control utilizes a single classifying parameter for identifying a particular object of interest in the forward field of view for all of the at least two resolutions.


Optionally, for example, the at least two different resolutions may comprise (a) an output of a higher resolution photosensor array, (b) an output of a medium resolution photosensor array, and (c) an output of a lower resolution photosensor array. The control may process the output of the photosensor array at different resolutions in response to one of (a) a location of a detected light source and (b) a distance between the subject vehicle and a detected light source.
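The resolution selection described above can be sketched as follows. The distance thresholds and the block-averaging downsampler are illustrative assumptions; the patent does not specify particular values or a downsampling method.

```python
# Hedged sketch of multi-resolution processing: the same array output is
# processed at a resolution chosen from the detection distance (or location).
# Thresholds are illustrative assumptions, not values from the patent.

def select_resolution(distance_m):
    """Pick a processing resolution from the range to a detected source."""
    if distance_m > 200:
        return "high"    # distant sources are small; fine pixel detail needed
    if distance_m > 60:
        return "medium"
    return "low"         # nearby sources are large; coarse sampling suffices

def downsample(image, factor):
    """Reduce resolution by averaging factor x factor pixel blocks.
    image: list of equal-length rows of pixel intensities."""
    rows = len(image) // factor
    cols = len(image[0]) // factor
    out = []
    for r in range(rows):
        row = []
        for c in range(cols):
            block = [image[r * factor + i][c * factor + j]
                     for i in range(factor) for j in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out
```

Because the same classifying parameter is applied at every resolution, the control can switch between the full-resolution output and a downsampled copy without retuning its object-of-interest criteria.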


These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a side elevation of a portion of a vehicle embodying the present invention;



FIG. 2 is a partial side elevation view and block diagram of a vehicle headlight dimming control system according to the present invention;



FIG. 3 is a schematic of an imaging array suitable for use with the control system of the present invention;



FIG. 4 is a schematic of a determination of an offset of a focus of expansion for the control system of the present invention; and



FIG. 5 is a schematic of a headlamp control system utilizing a cell phone camera for capturing images of a forward field of view.





DESCRIPTION OF THE PREFERRED EMBODIMENTS

Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an automatic vehicle headlamp control system or vehicle headlamp dimming control system 12, which includes an image sensor 14 which senses light from a scene forward of vehicle 10, an imaging processor or control circuit 13 which receives data from image sensor 14 and processes the image data, and a vehicle lighting control logic module 16 which exchanges data with control circuit 13 and controls the headlamps 18 (such as by changing or retaining the state of the headlamps, such as between a higher beam state and a lower beam state) of vehicle 10 for the purpose of modifying the beam illumination state of the headlamps of the vehicle (FIGS. 1 and 2). The headlamps are operable to selectively emit a light output via a high beam lighting element and a lower beam or low beam lighting element. Headlamp dimming control 12 is operable to determine whether light sources in the image captured by the image sensor are or may be indicative of headlamps of oncoming vehicles or taillights of leading vehicles and is operable to adjust the headlamps of the controlled vehicle between a high beam state and a lower beam state or low beam state in response to such a determination. Headlamp dimming control 12 may utilize the principles disclosed in U.S. Pat. Nos. 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 6,396,397; 6,822,563 and/or 7,004,606, which are hereby incorporated herein by reference in their entireties. Headlamp control 12 is operable to distinguish the light sources captured in the image between light sources representing headlamps and/or taillights of other vehicles, as discussed below.


The imaging sensor for the headlamp control of the present invention may comprise any suitable sensor, and may utilize various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like, such as the types described in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,715,093; 5,877,897; 6,498,620; 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 6,806,452; 6,396,397; 6,822,563; 6,946,978; 7,038,577 and/or 7,004,606; and/or U.S. patent application Ser. No. 11/315,675, filed Dec. 22, 2005 and published Aug. 17, 2006 as U.S. Patent Publication No. US-2006-0184297; and/or U.S. provisional applications, Ser. No. 60/845,381, filed Sep. 18, 2006; and Ser. No. 60/837,408, filed Aug. 11, 2006; and/or PCT Application No. PCT/US2007/075702, filed Aug. 10, 2007, and published Feb. 28, 2008 as PCT Publication No. WO 2008/024639, and/or PCT Application No. PCT/US2003/036177, filed Nov. 14, 2003, and published Jun. 3, 2004 as PCT Publication No. WO 2004/047421, which are all hereby incorporated herein by reference in their entireties. The control 12 may include a lens element or optic 20 between the image sensor and the forward scene to substantially focus the scene at an image plane of the image sensor. Optionally, the optic may comprise an asymmetric optic, which focuses a generally central portion of the scene onto the image sensor, while providing classical distortion on the periphery of the scene or field of view.


Such imaging sensors or cameras are pixelated imaging array sensors having a photosensing array 15 of a plurality of photon accumulating or photosensing light sensors or pixels 15a (FIG. 3), which are arranged in a two-dimensional array of rows and columns on a semiconductor substrate. The camera established on the substrate or circuit board includes circuitry which is operable to individually access each photosensor pixel or element of the array of photosensor pixels and to provide an output or image data set associated with the individual signals to the control circuit 13, such as via an analog to digital converter (not shown). As camera 14 receives light from objects and/or light sources in the target scene, the control circuit 13 may then be operable to process the signal from at least some of the pixels to analyze the image data of the captured image, as discussed below.


As shown in FIG. 3, the control may process one or more sub-arrays 15b of the photosensor array 15, where a particular sub-array may be representative of a zone or region of interest in the forward field of view of the camera. The control may process the sub-array of pixels while ignoring other pixels or processing other pixels at a reduced level (such as by utilizing aspects of the systems described in U.S. Pat. No. 7,038,577, which is hereby incorporated herein by reference in its entirety), and/or the control may process the sub-array of pixels in a particular manner (such as to determine if a light source is a vehicle lamp in the regions forward of the vehicle and near the host vehicle's path of travel, such as a headlamp of an oncoming vehicle in a lane adjacent to (such as to the left of) the host vehicle or other vehicle lamp forward and/or to the left or right of the host vehicle) while processing other sub-arrays or pixels in a different manner.
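The sub-array processing described above amounts to slicing a rectangular zone of interest out of the full photosensor array. A minimal sketch, with the zone geometry as an illustrative assumption:

```python
# Hedged sketch of sub-array (zone of interest) extraction from the full
# photosensor array, per the description above. The control can then process
# this zone fully while ignoring other pixels or processing them at a
# reduced level. Zone coordinates are illustrative assumptions.

def subarray(pixels, top, left, height, width):
    """Extract a rectangular sub-array from a 2-D pixel array given as a
    list of rows; returns the rows [top, top+height) sliced to the columns
    [left, left+width)."""
    return [row[left:left + width] for row in pixels[top:top + height]]
```

For instance, on a 4x4 test frame, `subarray(frame, 1, 1, 2, 2)` returns the central 2x2 block of pixel values.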


In order to take advantage of the environmental protection offered by the vehicle cabin, the frequently cleaned optically clear path offered by the vehicle windshield (which is cleaned or wiped by the windshield wipers when the wipers are activated), and the relatively high vantage point offered at the upper region or top of the windshield, the headlamp control system or at least the imaging device or camera is preferably mounted centrally at or near the upper inside surface of the front windshield of a vehicle and with a forward field of view through the region cleaned or wiped by the windshield wipers. The imaging device may be mounted at an interior rearview mirror assembly (such as at a mounting bracket or base of the mirror assembly) or at an accessory module or windshield electronics module disposed at or near the interior rearview mirror assembly and at or near the interior surface of the vehicle windshield.


Automatic image-based high beam control systems, in which an image of the scene forward of the vehicle is focused by an optical system, may have a horizontal field of view equal to, but not limited to, approximately +/−24 degrees about the imaging system centerline. This horizontal field of view may be larger than (and may be substantially larger than) the horizontal extent of the high beam pattern, but optionally the high beam pattern itself may be moved left and right up to approximately 15 degrees in either direction by an adaptive front lighting (AFL) system. The image may be focused or imaged onto a rectangular array image capture device, such as, but not limited to, onto a 640×480 CMOS color imager, which captures image data and provides sequential frames of data indicative of the light energy reflected or emitted by objects in the region subtended by each element of the array. The image capture rate may be at a rate in the range of about 5 to 120 times per second or more, with processing being performed on the data to determine the presence, location and characteristics of objects and/or light sources within the monitored scene and to determine characteristics of the monitored scene, such as general illumination level, and to utilize several defined regions or zones of the monitored scene for various purposes. For example, the region of the scene that generally corresponds to the region of influence of the vehicle high beam pattern may be used to determine the appropriate high beam state of the headlamps depending on whether or not other road users are detected within that region. Optionally, the regions to the left and right of the first region may be used to anticipate the upcoming entry of other road users into the first region in order to facilitate a rapid and appropriate response upon entry or just prior to entry of the first region. 
The upper central region of the monitored scene may be used to determine ambient lighting conditions such that a first threshold may be established below which low beam headlights are activated, and a second threshold may be established above which high beam activation may be inhibited, while the lower horizontal portion of the ambient lighting condition detection region may be used to detect urban lighting conditions or the like. Other processing of the captured image data may be implemented depending on the particular application of the image sensor and processor, while remaining within the spirit and scope of the present invention.
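The two-threshold ambient-light logic described above can be sketched directly. The threshold values and units are illustrative assumptions; the patent specifies only that a first threshold triggers low beam activation and a second, higher threshold inhibits high beam activation.

```python
# Hedged sketch of the dual ambient-light thresholds described above.
# Values are illustrative assumptions in arbitrary luminance units.

LOW_BEAM_ON_BELOW = 10.0    # first threshold: activate low beams below this
INHIBIT_HIGH_ABOVE = 50.0   # second threshold: inhibit high beams above this

def ambient_decision(ambient_level):
    """Evaluate the ambient level from the upper central scene region
    against both thresholds."""
    return {
        "low_beams_on": ambient_level < LOW_BEAM_ON_BELOW,
        "high_beams_inhibited": ambient_level > INHIBIT_HIGH_ABOVE,
    }
```

Between the two thresholds (dusk, for example) neither rule fires, and the beam state is left to the light-source detection logic.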


The control system of the present invention thus captures images or generates image data indicative of the scene occurring forwardly of the vehicle and processes the image data to determine whether or not a headlamp or taillight of another vehicle is present, whereby the headlamps of the controlled vehicle may be adjusted between their high and low beams accordingly. The image processor processes one or more zones of interest or regions of interest to determine whether the detected light source is a headlamp or taillight of another vehicle traveling on the same road as the controlled vehicle (since a light detected that is remote from the particular or appropriate region of interest is not likely another vehicle light or is not likely relevant to the decision as to whether or not the controlled vehicle's headlamps should be adjusted). The control system thus may utilize aspects of the image systems and/or headlamp control systems described above to process different zones or regions of interest, and may ignore other regions of the captured images or process other regions at a reduced level (such as by utilizing aspects of the systems described in U.S. Pat. Nos. 5,550,677; 5,877,897 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties).


In order to ensure that the region of interest or regions of interest being processed are representative of the appropriate region relative to the controlled vehicle and direction of travel thereof, the control system of the present invention is operable to provide an automatic alignment or correction factor of the image data captured by the image sensor. Thus, the various regions of interest within the scene monitored by the sensor are optimally maintained regardless of vehicle and high beam control system module geometric manufacturing and assembly tolerances, and other sources of misalignment, such as vehicle pitch and yaw variations due to a wide range of possible vehicle loading conditions.


Typical vehicle body structures, windshields and assembly systems of vehicles may contribute to geometric tolerances associated with the surface to which the headlamp control system module is attached. It is not unusual to encounter a total stack up of tolerances which result in a potential vertical and horizontal misalignment of approximately +/−4 degrees from the theoretically ideal condition. This is a significant value and may result in errors in processing the appropriate region of interest and/or determining lane widths and object sizes and distances and the like.


It is known to provide a mechanical adjustment means to allow for the correction of such a misalignment at the installation of a headlamp control system to the vehicle. Such mechanical adjustments are, however, often undesirable since it is often expensive to apply manual labor to the alignment of components on each vehicle equipped with a headlamp control system at the vehicle assembly plant or facility. Such adjustments are additionally undesirable since the alignment procedure is then subject to operator error.


Also, such adjustment will only correct for misalignment of the imaging device and system at the time of manufacturing of the vehicle, and will not correct or account for or adapt the system for misalignment that may occur during use, such as due to a physical or mechanical misalignment of the imaging device or due to different load balancing of the vehicle or replacement of the camera or mirror assembly or assembly module or windshield and/or the like. For example, in normal use, a typical vehicle experiences many different loading conditions which cause it to adopt a wide range of pitch and roll attitudes, causing an automatic headlamp control system of the vehicle to view the forward scene from perspectives different from the ideal, or initially considered design conditions, and thereby potentially resulting in different headlight actuation decisions than contemplated by the original system specification.


Thus, it is beneficial for the headlamp control system to include a feature which automatically compensates for an initial misalignment condition and additionally is capable of correcting for temporary vehicle conditions and re-installation misalignments which may occur during the use of the vehicle. In order to achieve optimum performance of the headlamp control system, it is desirable to determine which of the array elements of the image capture device fall into each of the defined regions of interest. Since the regions are defined relative to the forward scene, it is desirable to determine a particular point or area within the forward scene and to relate that point or area to a particular array element or photosensor or group of photosensors of the image capture device.


The particular point in the forward scene may be defined as a particular distant point or area which lies on the forward extended vehicle centerline on the horizontal plane which passes generally through the center of the optical system associated with the image capture device. When driving on a substantially flat and substantially straight road, the distant point may be the point within the forward scene at which the headlights of an oncoming vehicle or the tail lamps of a slower leading vehicle are first detected. As the distance between the controlled vehicle and target vehicle decreases, the image of the target vehicle expands within the imaged scene, towards the left if traveling in a leftward lane, centrally if in the same lane, and towards the right if traveling in a rightward lane. Thus, the described distant point may be called the focus of expansion or FOE.


In order to determine the imaging array element or pixel which subtends the FOE in the as assembled and as loaded vehicle, it is necessary to identify the array element or pixel or pixels which first detects a new light source (which has the potential to be a vehicular light source or headlamp or taillight) within that region of the monitored scene which could potentially contain the FOE, and to continue to track the detected light source as it expands in the image as the distance between the detected source and the controlled vehicle decreases until it is confirmed that the source is a headlamp or taillight of another vehicle (such as by utilizing aspects of the systems described in U.S. provisional applications, Ser. No. 60/845,381, filed Sep. 18, 2006; and Ser. No. 60/837,408, filed Aug. 11, 2006, and/or PCT Application No. PCT/US2007/075702, filed Aug. 10, 2007, and published Feb. 28, 2008 as PCT Publication No. WO 2008/024639, which are hereby incorporated herein by reference in their entireties). The control system may monitor the controlled vehicle trajectory until it reaches the point in the road where the new light source would have been initially detected in order to confirm that the road traveled for the duration of the monitoring period was substantially flat and substantially straight. If it is determined that the point or light source was a light source of a leading or approaching vehicle and the controlled vehicle and approaching vehicle are traveling along a substantially flat and substantially straight road, the location of the initial distant point or FOE may be compared to an expected location (the location of the pixel corresponding to the preset or expected FOE) to determine if there is an offset or error in the imaging device's or system's calibration. 
The control system optionally, and preferably, collects or processes or analyzes many new light sources and picks the best samples and averages them to arrive at the best or optimal or enhanced FOE.
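The sample-selection step above, collecting many candidate FOE detections, rejecting poor samples, and averaging the rest, can be sketched as follows. The median-based rejection and the deviation radius are illustrative assumptions; the patent does not specify how the "best samples" are chosen.

```python
# Hedged sketch of robust FOE estimation from many samples, per the
# description above: reject candidates far from the median, average the
# survivors. The rejection radius is an illustrative assumption.

def robust_foe(samples, max_dev=5.0):
    """samples: list of (row, col) candidate FOE pixels from confirmed
    vehicle lamps. Returns the averaged (row, col) of inlier samples."""
    rows = sorted(r for r, _ in samples)
    cols = sorted(c for _, c in samples)
    med = (rows[len(rows) // 2], cols[len(cols) // 2])
    kept = [(r, c) for r, c in samples
            if abs(r - med[0]) <= max_dev and abs(c - med[1]) <= max_dev]
    n = len(kept)
    return (sum(r for r, _ in kept) / n, sum(c for _, c in kept) / n)
```

A single spurious detection (a streetlamp mistaken for a headlamp, say) is discarded by the median test rather than dragging the averaged FOE off target.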


If an offset between the actual or detected FOE and the expected or preset FOE is detected, the image processor determines the degree of offset and adjusts or shifts the regions of interest parameters or coordinates or targeted pixels to accommodate for the offset, such that the processor processes the image data captured by the pixels representative of the appropriate zones or regions of interest forwardly of the controlled vehicle for detecting headlamps of approaching vehicles and taillights of leading vehicles. For example, if the detected FOE is ten pixels to the left and five pixels down from the expected FOE, the processor may readily adjust the parameters or coordinates of the regions of interest by that amount (or by a scaled value based on the detected offset). Thus, the headlamp control system may adjust the processing to adapt to shifts or changes in the FOE of the imaging device and thus may do so electronically and without physical or mechanical adjustment of the imaging device relative to the vehicle.
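The region-of-interest shift described above is a pure coordinate translation, which is why no mechanical re-aiming is needed. A minimal sketch, with the zone layout and the image convention (row increases downward, column increases rightward) as illustrative assumptions:

```python
# Hedged sketch of shifting all zones of interest by a detected FOE offset,
# per the description above. Zone geometry and names are illustrative
# assumptions; rows increase downward, columns increase rightward.

def shift_zone(zone, offset):
    """zone: (top, left, height, width); offset: (d_row, d_col)."""
    top, left, height, width = zone
    d_row, d_col = offset
    return (top + d_row, left + d_col, height, width)

def shift_zones(zones, offset):
    """Translate every named zone by the same detected FOE offset."""
    return {name: shift_zone(zone, offset) for name, zone in zones.items()}
```

Using the example in the text, a detected FOE ten pixels to the left and five pixels down corresponds to an offset of (5, -10) under this convention, so a zone at (100, 200) moves to (105, 190) while keeping its size.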


The headlamp control system of the present invention thus provides a low cost processing adjustment to maintain processing of the appropriate regions of interest when detecting light sources or objects forwardly of the vehicle and determining whether or not the detected light sources or objects are leading vehicles or approaching vehicles along the road on which the controlled vehicle is traveling. The control system thus calibrates or adapts the image data or image processing to accommodate for manufacturing tolerances and/or physical misalignment that may occur during the camera and/or mirror or accessory module manufacture or during the vehicle manufacture, and to accommodate for misalignment or shifts in the principal viewing axis of the camera or imaging device due to different load balancing of the vehicle or distortion in shape of the headlamp control system assembly due to heating and/or other situations where the vehicle encounters or experiences a change in pitch or tilt or yaw of the vehicle.


Optionally, the control system may be adjusted in response to a detection of lane markers, such as along a straight and/or flat road (or optionally along a curved road and optionally in conjunction with a steering angle of the vehicle). For example, and with reference to FIG. 4, the system may detect lane markers 22 along the lane in which the controlled vehicle is traveling and, if the lane markers are substantially straight, may determine an intersection 22a of the lane markers. The control system may detect headlamps in front of the vehicle and may monitor the appearance point 24 of the detected headlamps. The system may monitor the appearance point and the intersection point as it travels toward and past the physical locations corresponding to the monitored points in the captured images, and if it is determined that the road or lane was substantially straight, the offset between the actual FOE and the expected FOE may be determined. The control system collects the determined lane-marker-based FOE samples, rejects erroneous samples, and averages the remainder to arrive at a best or optimal or enhanced lane-marker-based FOE.
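The intersection point of two substantially straight lane markers can be computed by fitting each marker as a line in image coordinates and intersecting the lines. A minimal sketch, with the line parameterization as an illustrative assumption:

```python
# Hedged sketch of the lane-marker intersection step described above: two
# straight lane markers, each fit as a line row = m*col + b in image
# coordinates, intersect at a candidate FOE. Parameterization is an
# illustrative assumption.

def marker_intersection(m1, b1, m2, b2):
    """Intersection of row = m1*col + b1 and row = m2*col + b2, returned
    as (row, col); None if the markers are parallel in the image."""
    if m1 == m2:
        return None   # parallel lines never intersect (degenerate fit)
    col = (b2 - b1) / (m1 - m2)
    row = m1 * col + b1
    return (row, col)
```

For example, left and right markers fit as `row = col` and `row = -col + 640` intersect at pixel (320, 320), which becomes one lane-marker-based FOE sample.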


Optionally, the control system may utilize a weighted sum calculation of data representative of the intersection point and the (appearance point plus offset) to determine the actual FOE, depending on the particular application. For example, the adaptive FOE may be based on a detection of the appearance (initial detection) of lights and disappearance of lights (when the lights are out of the range of the sensor and are no longer detected) in front of the vehicle and a detection of the lane markers along the road in front of the vehicle, and may be calculated, for example, via the following equations:

AFOE_ROW=(a*[LaneMark Row+Offset_u]+b*[Headlight Appear Row+Offset_v]+c*[Taillight Disappear Row+Offset_w])/(a+b+c); and  (1)
AFOE_COLUMN=(d*[LaneMark Column+Offset_x]+e*[Headlight Appear Column+Offset_y]+f*[Taillight Disappear Column+Offset_z])/(d+e+f);  (2)

where a, b, c, d, e and f are parameter weights that depend on the particular application. Other equations may be utilized to substantially estimate or calculate the present FOE of the imaging device, such as based on the detection of lane markers and/or light sources and/or the like in the forward field of view. Since this method uses either or both of lane marker detections and vehicle appearance/disappearance detections, the system can work in environments without lane markers or in environments without other vehicles initially present.
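Equations (1) and (2) above can be transcribed directly. The offset labels and data layout are illustrative assumptions; setting a cue's weight to zero drops it from the estimate, which is how the method degrades gracefully when a cue is unavailable.

```python
# Direct transcription of equations (1) and (2) above. Each term is a
# measured row/column plus its calibration offset; a..f are the
# application-dependent weights. Names are illustrative assumptions.

def adaptive_foe(lane, headlight_appear, taillight_disappear,
                 offsets, weights):
    """lane, headlight_appear, taillight_disappear and each offsets[...]
    entry are (row, col) pairs; weights is (a, b, c, d, e, f).
    Weight sums must be nonzero (at least one cue active per axis)."""
    a, b, c, d, e, f = weights
    row = (a * (lane[0] + offsets["lane"][0])
           + b * (headlight_appear[0] + offsets["head"][0])
           + c * (taillight_disappear[0] + offsets["tail"][0])) / (a + b + c)
    col = (d * (lane[1] + offsets["lane"][1])
           + e * (headlight_appear[1] + offsets["head"][1])
           + f * (taillight_disappear[1] + offsets["tail"][1])) / (d + e + f)
    return (row, col)
```

With equal weights and zero offsets, three cues that agree on pixel (240, 320) yield an adaptive FOE of exactly (240.0, 320.0), as expected of a weighted mean.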


Optionally, the control system may be adjusted in response to vehicle pitch information from a bus or accelerometer, and/or vehicle roll information from an accelerometer or bus information of the vehicle, and/or vehicle yaw information from an accelerometer or bus information of the vehicle. Optionally, the system may only monitor for new light sources when the vehicle is traveling in a substantially straight line (such as when the steering wheel angle is between, for example, about 0+/−10 degrees for a vehicle with steering ratio of about 17, or at or between any other suitable or selected threshold angle or angles depending on the particular application of the control system). Thus, adjustment and/or alignment of the image sensor may occur by tracking movement of light sources through the images when the vehicle is traveling substantially straight, so that the control may compare the tracked light sources to expected locations and paths through the captured images as the vehicle moves along the substantially straight path and may adjust the processing parameters of the image processor and imaging sensor accordingly.
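The straight-travel gate described above can be sketched as a simple threshold on the steering wheel angle. The limit and steering ratio figures come from the example in the text; the function name is an illustrative assumption.

```python
# Hedged sketch of the straight-travel gate described above: new light
# sources are sampled for FOE calibration only while the steering wheel
# angle is within about +/-10 degrees, per the example in the text.

def is_traveling_straight(steering_wheel_deg, limit_deg=10.0):
    """True when the steering wheel angle is within +/-limit_deg. With a
    steering ratio of about 17, a 10-degree wheel angle is under ~0.6
    degrees at the road wheels, i.e. substantially straight travel."""
    return abs(steering_wheel_deg) <= limit_deg
```

Gating the calibration on straight travel keeps curved-road detections, whose appearance points shift laterally with the curve, from polluting the FOE samples.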


Optionally, the control system may determine the actual FOE and offset along curved road sections in response to the lane marker detection and/or a steering angle input, whereby the system may monitor the detected appearing light source and monitor its initial or appearance location as the controlled vehicle approaches the initial location. By taking into account the steering angle of the vehicle as the vehicle travels toward the initial or appearance location of the light source, the control system may monitor or track the initial location to determine if the controlled vehicle approaches or arrives at or near that location. The control system may also determine if the detected light source was a headlamp of an approaching vehicle or taillight of a leading vehicle and, if so, may determine the offset and adjust or adapt the image processing accordingly.


The automatic adjustment or correction or adaptation of the image processor in response to a detected offset between a detected FOE and an expected FOE allows the control system of the present invention to utilize various cameras or imaging devices, such as aftermarket devices or cell phone cameras or the like. For example, an aftermarket camera may be installed in the vehicle with a generally forward field of view in the direction of travel of the vehicle, and the system may, as the vehicle is then driven, determine an offset or error in the expected FOE and readily compensate for such offset, without requiring any further manual input or physical adjustments.


Thus, it is envisioned that any imaging device (such as, for example, a cell phone camera) may be utilized for the imaging system or headlamp control system of the present invention. For example, and with reference to FIG. 5, a cell phone 26 may be docked at (such as at a phone connector mount or port 28 or the like at the vehicle instrument panel or dashboard 29 or mirror assembly or accessory module or console or the like) or in communication with an image processor that processes the images captured by the cell phone camera 26a in a similar manner as described above, and determines the current FOE for the cell phone camera at its present orientation relative to the vehicle and determines the appropriate zones of interest or regions of interest for processing the image data to determine if detected light sources in the forward field of view are representative of a headlamp of an approaching vehicle or taillight of a leading vehicle. The cell phone may transmit a compressed video stream (such as, for example, an H.264 compressed video stream) to a cell phone network, and/or may communicate video signals to an on-board or vehicle-based processor.


It is further envisioned that the adaptive FOE process of the present invention allows for the use of various aftermarket cameras and/or cell phone cameras for various imaging systems or applications, such as adaptive front lighting systems or lane departure warning systems or object detection systems or collision avoidance systems or the like, since the camera (such as a cell phone and camera) may be located at or in or mounted at the vehicle and the processing of the image data may be adapted to automatically accommodate for and correct for any misalignment or mis-mounting or mis-positioning of the camera. Optionally, aspects of the adaptive FOE system described above may be utilized for cameras (such as OEM cameras or aftermarket cameras or cell phone cameras or the like) having a rearward field of view so that the processing of the captured images is corrected or adapted and the images are processed accordingly, such as for a rear vision system or backup aid or the like, and/or may be utilized for cameras having a field of view directed inside the vehicle, such as for interior cabin monitoring systems or the like (such as utilizing aspects of the systems described in U.S. Pat. Nos. 5,760,962; 5,877,897 and/or 6,690,268, which are hereby incorporated herein by reference in their entireties). The image data from the cell phone camera (or other camera) may be communicated wirelessly (such as via a short-range radio frequency communication, such as via a BLUETOOTH® communication protocol or the like) or via a wired connection (such as via a docking port or USB port or the like at the vehicle) to a vehicle-based or onboard processor (such as processor 13 described above), or compressed video data or image output of the camera may be streamed to a cell phone network or the like.


Optionally, the control system may adjust the zones of interest or regions of interest in the captured images in response to an input representative of the vehicle trajectory, such as in response to a steering angle of the vehicle or steering wheel angle of the vehicle, such as by utilizing aspects of the systems described in U.S. patent application Ser. No. 11/315,675, filed Dec. 22, 2005, and published Aug. 17, 2006 as U.S. Publication No. US-2006-0184297; and/or U.S. provisional applications, Ser. No. 60/845,381, filed Sep. 18, 2006; and Ser. No. 60/837,408, filed Aug. 11, 2006, and/or PCT Application No. PCT/US2007/075702, filed Aug. 10, 2007, and published Feb. 28, 2008 as PCT Publication No. WO 2008/024639, which are all hereby incorporated herein by reference in their entireties. For example, when the controlled vehicle is traveling along a curved road, the zones of interest or regions of interest may be adjusted or offset (i.e., the image processor may process different groups of pixels corresponding to the different zones or regions of interest) so that the image processor processes the adjusted or offset zones or regions of interest to detect vehicles along a curved road. Such an adjustment of the zones of interest parameters or pixel locations may occur when the control system determines that the controlled vehicle is traveling along a curved road, such as in response to a steering angle input or lane detection input or the like. Optionally, the regions of interest may be reduced or shrunk (or optionally extended) at the corresponding side regions depending on the direction of the curve along which the vehicle is traveling.
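For example, the lateral shift of a zone of interest could be made proportional to the steering angle. The sketch below is illustrative only; the gain in pixels per degree is a placeholder, not a value from the patent:

```python
def adjust_zone_for_curve(zone, steering_angle_deg, px_per_deg=6.0):
    # Shift the zone of interest (x0, y0, x1, y1) laterally in proportion to
    # the steering angle: a positive (rightward) angle moves the processed
    # pixel group to the right, so vehicles along the curve stay in the zone.
    x0, y0, x1, y1 = zone
    shift = int(round(steering_angle_deg * px_per_deg))
    return (x0 + shift, y0, x1 + shift, y1)
```

The same shift could equally be driven by a lane-detection input, and the side regions could additionally be shrunk or extended depending on the curve direction.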


Optionally, the control system of the present invention may adjust or adapt other processing parameters based on previous image processing. For example, the control system may process a large window or region of interest and may adapt the region of interest to a smaller region or window if a light source is detected. For example, if a detected light source is identified as the headlamps of an approaching vehicle, the region of interest may be adapted from a large region or zone or window to a smaller or predicted region that is representative of where the headlamps of an approaching vehicle should be located relative to the controlled vehicle, such as down and to the left in the captured image (for a detected taillight, the adapted region or window may be generally downward at or near a center line of the image as the controlled vehicle approaches a leading vehicle). If the adapted or smaller or inner or predicted region or window no longer detects a light source or object, the control system may resume processing of the larger region of interest or window to determine if other light sources or objects are present. Optionally, the exposure may be adjusted or adapted from one frame to the next, such that, if an approaching headlamp is detected, for example, the exposure may be reduced for subsequent frames as the headlamp moves closer to the controlled vehicle. For detected taillights, the change in exposure may be reduced or inhibited, since the taillights typically move generally with the controlled vehicle and do not approach the controlled vehicle as rapidly as the headlamps of approaching vehicles do.
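A minimal sketch of this window and exposure adaptation follows. The window geometry, image size, and exposure step are illustrative assumptions, not values from the patent:

```python
# Full scanning window (x0, y0, x1, y1); the 640x240 extent is illustrative.
FULL = (0, 0, 640, 240)

def predicted_window(detection, kind):
    # On a detection at image point (x, y), narrow processing to a predicted
    # window: approaching headlamps drift down and to the left in the image,
    # while taillights of a leading vehicle stay generally downward near the
    # image center line.
    if detection is None:
        return FULL  # source lost: resume processing the large window
    x, y = detection
    if kind == "headlamp":
        return (max(0, x - 60), y, x + 20, min(240, y + 60))
    return (x - 40, y, x + 40, min(240, y + 60))

def next_exposure(exposure, kind, approaching):
    # Step exposure down frame-to-frame for an approaching headlamp; hold it
    # for taillights, which close distance far more slowly.
    if kind == "headlamp" and approaching:
        return exposure * 0.8
    return exposure
```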


The driving side of the road varies by country (for example, in the United States, vehicles are driven on the right side of the road). Supporting both left and right driving sides of the road reduces engineering and manufacturing cost, so it is beneficial for the headlamp control system to include a feature that automatically detects the driving side of the road. The control system thus may be operable to process the image data set to detect a new light source and identify it as the headlight of an oncoming vehicle. The control system tracks the headlight of the detected oncoming vehicle and stores its trajectory. The driving side of the vehicle (the side of the road along which the vehicle is traveling) is identified by analyzing a predetermined number of trajectories of oncoming vehicles. If the driving side is the right side of the road, the oncoming vehicles will pass the host vehicle on the left, and vice versa. Optionally, the control system may detect the driving side by analyzing only the appearance locations of new headlights in the image data, since the appearance of new headlight sources is biased toward one side of the captured image or image data according to the driving side of the road when the vehicle is traveling along a substantially flat and straight road. Optionally, the control system may be responsive to a global positioning system input and may determine the driving side of the road on which the vehicle is traveling based on the geographical location of the vehicle and the driving rules and regulations of that geographical location or region.
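The appearance-location variant described above might be sketched as follows, assuming only the horizontal image coordinate of each newly appearing headlight is recorded (the image width, sample minimum, and function name are illustrative):

```python
def detect_driving_side(appearance_xs, image_width=640, min_samples=20):
    # Classify the driving side from where new oncoming headlights first
    # appear in the image: on right-side-drive roads oncoming traffic
    # appears biased to the left of image center, and vice versa.
    if len(appearance_xs) < min_samples:
        return None  # not enough observed appearances yet
    left = sum(1 for x in appearance_xs if x < image_width / 2)
    return "right" if left > len(appearance_xs) - left else "left"
```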


The automatic high beam system or automatic headlamp control system may be optimized to adapt the vehicle for enhanced performance for the particular road (and side of the road) along which the vehicle is being driven. For example, the control system may modify many calibrations or parameters such as, but not limited to, different zones or sub-arrays of image data, weighting factors of different zones of image data, offset of the FOE, automotive light sources acceptance parameters, lane marker detection and tracking of objects and light sources and/or the like, in order to adapt the system for enhanced performance depending on which side of the road the host vehicle is driven on.


Optionally, the control system of the present invention may be operable to provide a low-cost processing of the image data via processing captured frames of image data at different resolution levels, such as at least two different resolutions or resolution levels. For example, the control system may process images at a higher resolution level (where the imaging device may be processed, for example, as a pixelated array of 640×480 pixels), at a medium or intermediate resolution (where the imaging device may be processed, for example, as a pixelated array of 320×240 pixels), and at a lower resolution (where the imaging device may be processed, for example, as a pixelated array of 160×120 pixels). Such a processing technique allows the processor to use the same classifier (such as the same window size or mask size, such as about a 2×3 pixel mask for detecting a distant taillight) for detecting and identifying taillights (or other light sources) at each distance or range, and thus may substantially reduce the memory requirements of the processor.


Typically, if a processor is to identify a taillight that is about 200 meters in front of the controlled vehicle, the processor may utilize a 2×3 pixel mask to determine if the detected light source is about the expected size of a taillight, to assist in correctly identifying taillights. However, if the light source is closer to the vehicle, such as about 100 meters in front of the controlled vehicle, the processor would process the image data with a larger mask or window because the light source appears larger when it is closer to the controlled vehicle. When the light source is closer still, such as about 50 meters or less from the controlled vehicle, an even larger window or mask is utilized to identify the detected light source. Thus, the control system requires sufficient memory capability to store the different window sizes for detecting the various light sources at various distances in front of the controlled vehicle. Such memory or data storage can be costly and thus may add to the cost of the headlamp control system (or other vision-based system).


However, by processing the captured images at different resolutions (such as a higher resolution, a medium resolution and a lower resolution), the system may generally equalize the sizes of the imaged objects or light sources for the various distances from the controlled vehicle, so that only a single sized mask or window need be utilized for identifying a particular light source, such as a taillight of a leading vehicle. This is because a taillight at about 200 meters may take up a window of about 2×3 pixels of a 640×480 higher resolution image, while a taillight at about 100 meters or thereabouts may take up a window of about 2×3 pixels of a 320×240 medium resolution image (which would be about a 4×6 pixel window if it were a higher resolution image), and a taillight at about 50 meters or less may take up a window of about 2×3 pixels of a 160×120 lower resolution image (which would be about a 8×12 pixel window if it were a higher resolution image).


Thus, by processing the different resolution images, the control system may utilize the same mask or window or classifier for identifying a detected light source. Although the intensity of the detected light sources would be different (such as, for example, the intensity of the light source of the medium resolution image may be eight times the intensity of the light source of the low resolution image and the intensity of the light source of the high resolution image may be 64 times the intensity of the light source of the low resolution image), this can be readily accounted for when detecting the light source and identifying the detected light source. Thus, when a light source is detected that may be a taillight of a leading vehicle, the processor may process the image data with a single classifier (such as for classifying the light source as a taillight) regardless of the distance to the detected light source from the controlled vehicle, and thus, the memory requirements of the system for multiple classifiers may be substantially reduced to reduce the cost of the control system.
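The multi-resolution approach above might be sketched as follows. This is a minimal example assuming simple 2×2 pixel binning between resolution levels and a mean-intensity test standing in for the taillight classifier; the names and thresholds are illustrative only:

```python
import numpy as np

def downsample(img):
    # 2x2 binning: each lower-resolution pixel sums four higher-resolution
    # pixels, so 640x480 -> 320x240 -> 160x120 as in the text.
    h, w = img.shape
    return img[:h // 2 * 2, :w // 2 * 2].reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

def taillight_candidates(img, threshold):
    # Slide one fixed 2x3 mask over the image; a candidate is any window whose
    # mean intensity exceeds the threshold. Because each pyramid level roughly
    # halves the apparent size of a source, the same single classifier serves
    # the ~200 m (full-res), ~100 m (half-res), and ~50 m (quarter-res) ranges.
    h, w = img.shape
    hits = []
    for r in range(h - 1):
        for c in range(w - 2):
            if img[r:r + 2, c:c + 3].mean() > threshold:
                hits.append((r, c))
    return hits
```

In a full system the threshold would be scaled per level to account for the intensity differences between resolutions noted above.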


Optionally, the control system may be operable to superimpose a code or flicker on the headlight beams to communicate a code or message to control systems of other vehicles or of roadside monitors or stations, such as by utilizing aspects of the systems described in U.S. Pat. No. 7,004,606, which is hereby incorporated herein by reference in its entirety. For example, the headlamps (or other vehicle lights) could be used to signal other drivers with “messages” which other vehicle's machine vision systems could decode, while typical drivers without such systems are unaware of the communication system. Such a code would be camouflaged to people viewing the headlamps or other lights, but visible to the machine vision systems of the other vehicles. Different flicker rates or different color combinations or spectral signature of the lights may communicate different codes, and the codes may be preset codes (such as, for example, a code that communicates to the driver of the other vehicle or vehicles that there is an accident ahead or the like), or may be entered by the driver of the controlled vehicle (such as via a voice input or manual input or the like).


Thus, for vehicles within line-of-sight distances, messages may be sent from the controlled vehicle to other vehicles via the code embedded in or superimposed on the output signal or illumination signal of the vehicle lights. For example, a code or message may be communicated from a controlled vehicle passing an accident to all similarly equipped oncoming traffic to alert the oncoming traffic of the accident. The code may comprise a color change (such as a subtle color change) in the color of the light emitted by the vehicle light source or a flicker (such as a high frequency flicker that is not readily noticeable or discernible to a human observer) or the like, and may be readily detected and identified or decoded by a similar control system of another vehicle. For example, the vehicle light source may comprise yellow and blue LEDs flickering at a predetermined rate and pattern, and can thus encode information or data or messages while looking like a typical white HID or halogen headlight to human observers. Human perception of flickering is worse for this color pair than for other color pairs that could also produce white, such as those nearer the red and green colors. The flicker rate of the yellow and blue LEDs thus may be lower than that required for other color combinations (while still avoiding detection by human observers), such as less than about 60 Hz.
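A toy sketch of such a flicker code follows, mapping each message bit to a run of yellow/blue LED states that a camera sampling above the flicker rate could recover; the bit-to-color mapping and run length are illustrative assumptions, not the patent's scheme:

```python
def encode_message(bits, frames_per_bit=2):
    # Each bit becomes a short run of LED color states; the yellow/blue pair
    # blends to white for a human observer, while a camera sampling faster
    # than the flicker rate sees the discrete sequence.
    pattern = []
    for b in bits:
        pattern += ["yellow" if b else "blue"] * frames_per_bit
    return pattern

def decode_message(pattern, frames_per_bit=2):
    # Sample one frame per bit period to recover the original bit sequence.
    return [1 if pattern[i] == "yellow" else 0
            for i in range(0, len(pattern), frames_per_bit)]
```

A real system would add synchronization and error checking, and could instead modulate flicker rate or a near-infrared component as the text describes.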


For communicating messages or codes rearwardly, the taillights may also or otherwise be flickered or adjusted or coded to communicate a message or data. Red taillight location in color space may not be optimal for flickering different colors, but using flicker rates above about 60 Hz can provide the desired communication means while limiting or substantially avoiding human detection. Optionally, the light sources may flicker or may have superimposed thereon an illumination output in the infrared or near infrared range of the spectrum, where humans have poor sensitivity, and where the imaging devices may be highly sensitive.


Optionally, the control system of the present invention may be operable to determine if a pixel or pixels of the imaging array is either inoperable or “bad” or blocked, so that the control system may ignore the bad/blocked pixel output to avoid adversely affecting averages of pixel output intensities during the image processing. The bad pixel detection process or algorithm may be performed periodically when the system is operating. For example, a captured frame or image may be dedicated to bad pixel detection. If a bad pixel or pixels is/are detected, averaging of the output intensities of the pixels surrounding the bad pixel may be performed to accommodate or replace the bad or inoperable pixel.
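The neighbor-averaging replacement described above might be sketched as follows (a minimal example; the mask representation is an illustrative choice):

```python
import numpy as np

def replace_bad_pixels(img, bad_mask):
    # Replace each flagged (inoperable or blocked) pixel with the mean of its
    # valid 8-neighbors, so downstream intensity averages are not skewed by a
    # stuck or blocked pixel.
    out = img.astype(float).copy()
    h, w = img.shape
    for r, c in zip(*np.nonzero(bad_mask)):
        neigh = [out[rr, cc]
                 for rr in range(max(0, r - 1), min(h, r + 2))
                 for cc in range(max(0, c - 1), min(w, c + 2))
                 if (rr, cc) != (r, c) and not bad_mask[rr, cc]]
        if neigh:
            out[r, c] = sum(neigh) / len(neigh)
    return out
```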


It is further envisioned that the control system may be operable to determine if some or all of the pixels of the imaging array are blocked (such as via an object or dirt or debris at the vehicle windshield or the like) and to adapt the image processing accordingly or notify or alert the driver of the vehicle that such blockage has occurred. For example, a partial or total day blockage algorithm may be run during daytime lighting conditions, such as in response to a user input or on demand, while a partial or total night blockage algorithm may be run when the ambient condition is indicative of nighttime lighting conditions. When the total blockage algorithm is run, the number of pixels above an intensity threshold may be counted for a captured image or frame, and if, over a number of frames, the count of bright pixels remains continuously below a threshold, the control system may conclude that the imaging device is substantially or totally blocked. When the partial blockage algorithm is run, the control system may perform region-based processing to take into account intensity variations in different regions of the pixelated imaging array. Based on intensity variations between neighboring or adjacent regions and the continuity of the variations over time, the control may determine that the imaging array is partially blocked. The control system may process the blocked pixel region in a night mode to reduce or substantially preclude the possibility of a false blockage detection.
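The total-blockage check described above might be sketched as follows; the intensity threshold, bright-pixel count threshold, and run length are illustrative placeholders:

```python
def total_blockage(frames, intensity_thresh=40, count_thresh=50, run_length=30):
    # Declare total blockage only when the number of bright pixels stays
    # below count_thresh for run_length consecutive frames; any sufficiently
    # bright frame resets the run.
    run = 0
    for frame in frames:
        bright = sum(1 for row in frame for px in row if px > intensity_thresh)
        run = run + 1 if bright < count_thresh else 0
        if run >= run_length:
            return True
    return False
```

The partial-blockage case would apply the same idea per region, comparing each region's intensity statistics against its neighbors over time.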


If either partial or total blockage is detected or determined, the system may adapt the image processing to accommodate the blocked pixels, or the system may alert the driver of the vehicle that the pixels are blocked so that the driver or user may unblock the imaging device (such as via cleaning the windshield of the vehicle), or the system may actuate the vehicle windshield wipers to clear the windshield at the imaging device or the like, or the system may actuate a blower system (such as a defogger system or the like) of the vehicle to direct or force or blow air toward the detected blockage to clear the windshield or window or area in the forward field of view of the imaging device. Optionally, the control thus may detect that at least a portion of the imaging device or photosensor array is blocked and may switch to a low beam mode in response to the detection (so as to allow the system to confirm the existence of the blockage without the high beams on during this period of time), and the system may at least one of (a) alert the driver of the subject vehicle of the detected blockage so that he or she can clean the windshield or sensor or otherwise remove the blockage or actuate the wipers and/or related system of the vehicle to remove the blockage; (b) automatically actuate a wiper (such as the windshield wipers) of the vehicle to remove the blockage from the forward field of view of the imaging device; and (c) automatically actuate a blower system of the vehicle to remove or dissipate the blockage from the forward field of view. The system or control may also detect that the blockage has been removed from the forward field of view and may resume the normal functionality of the headlamp control system and/or the wiper system of the vehicle and/or the blower system of the vehicle.


Optionally, the imaging sensor (and/or aspects of the control system described above) may be suitable for use in connection with other vehicle imaging systems, such as, for example, a blind spot detection system, where a blind spot indicator may be operable to provide an indication to the driver of the host vehicle that an object or other vehicle has been detected in the lane or area adjacent to the side of the host vehicle. In such a blind spot detector/indicator system, the blind spot detection system may include an imaging sensor or sensors, or ultrasonic sensor or sensors, or sonar sensor or sensors or the like. For example, the blind spot detection system may utilize aspects of the blind spot detection and/or imaging systems described in U.S. Pat. Nos. 7,038,577; 6,882,287; 6,198,409; 5,929,786 and/or 5,786,772, and/or U.S. patent application Ser. No. 11/315,675, filed Dec. 22, 2005, and published Aug. 17, 2006 as U.S. Publication No. US-2006-0184297; and/or Ser. No. 11/239,980, filed Sep. 30, 2005, and/or U.S. provisional applications, Ser. No. 60/696,953, filed Jul. 6, 2005; Ser. No. 60/628,709, filed Nov. 17, 2004; Ser. No. 60/614,644, filed Sep. 30, 2004; and/or Ser. No. 60/618,686, filed Oct. 14, 2004, and/or PCT Application No. PCT/US2006/026148, filed Jul. 5, 2006, and published Jan. 11, 2007 as PCT Publication No. WO 2007/005942, and/or of the reverse or backup aid systems, such as the rearwardly directed vehicle vision systems described in U.S. Pat. Nos. 5,550,677; 5,760,962; 5,670,935; 6,201,642; 6,396,397; 6,498,620; 6,717,610; 6,757,109 and/or 7,005,974, and/or of the rain sensors described in U.S. Pat. Nos. 6,250,148 and 6,341,523, and/or of other imaging systems, such as the types described in U.S. Pat. Nos. 7,123,168; 6,353,392 and/or 6,313,454, with all of the above referenced U.S. 
patents, patent applications and provisional applications and PCT applications being commonly assigned and being hereby incorporated herein by reference in their entireties.


Optionally, the optical system may be held by features of a housing assembly of an interior rearview mirror assembly or of an accessory module or the like. The housing assembly may utilize aspects of the modules or assemblies described in U.S. Pat. Nos. 7,004,593; 6,968,736; 6,877,888; 6,824,281; 6,690,268; 6,672,744; 6,593,565; 6,516,664; 6,501,387; 6,428,172; 6,386,742; 6,341,523; 6,329,925; 6,326,613; 6,250,148 and 6,124,886, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. patent publication No. US2006-0050018, and/or Ser. No. 11/201,661, filed Aug. 11, 2005, now U.S. Pat. No. 7,480,149, and/or PCT Application No. PCT/US03/40611, filed Dec. 19, 2003; PCT Application No. PCT/US03/03012, filed Jan. 31, 2003, and/or PCT Application No. PCT/US04/15424, filed May 18, 2004, and/or Ireland pat. applications, Ser. No. S2004/0614, filed Sep. 15, 2004; Ser. No. S2004/0838, filed Dec. 14, 2004; and Ser. No. S2004/0840, filed Dec. 15, 2004, which are all hereby incorporated herein by reference in their entireties.


Optionally, the mirror assembly and/or accessory module or windshield electronics module may include one or more displays, such as for displaying the captured images or video images captured by the imaging sensor or sensors of the vehicle, such as the displays of the types disclosed in U.S. Pat. Nos. 7,004,593; 5,530,240 and/or 6,329,925, which are hereby incorporated herein by reference, and/or display-on-demand or transflective type displays, such as the types disclosed in U.S. Pat. Nos. 7,195,381; 6,690,268; 5,668,663 and/or 5,724,187, and/or in U.S. patent application Ser. No. 11/021,065, filed Dec. 23, 2004, now U.S. Pat. No. 7,255,451; Ser. No. 10/528,269, filed Mar. 17, 2005, now U.S. Pat. No. 7,274,501; Ser. No. 10/533,762, filed May 4, 2005, now U.S. Pat. No. 7,184,190; Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. patent publication No. US2006-0050018; Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. patent publication No. US 2006-0061008; Ser. No. 10/993,302, filed Nov. 19, 2004, now U.S. Pat. No. 7,338,177; and/or Ser. No. 11/284,543, filed Nov. 22, 2005, now U.S. Pat. No. 7,370,983, and/or PCT Application No. PCT/US03/29776, filed Sep. 9, 2003; and/or PCT Application No. PCT/US03/35381, filed Nov. 5, 2003, and/or PCT Application No. PCT/US03/40611, filed Dec. 19, 2003, and/or PCT Application No. PCT/US2006/018567, filed May 15, 2006, which are all hereby incorporated herein by reference, or may include or incorporate video displays or the like, such as the types described in PCT Application No. PCT/US03/40611, filed Dec. 19, 2003, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. patent publication No. US2006-0050018; and/or Ser. No. 11/284,543, filed Nov. 22, 2005, now U.S. Pat. No. 7,370,983, and/or U.S. provisional applications, Ser. No. 60/732,245, filed Nov. 1, 2005; Ser. No. 60/759,992, filed Jan. 18, 2006; and/or Ser. No. 
60/836,219, filed Aug. 8, 2006, which are hereby incorporated herein by reference.


The imaging sensor may be incorporated at or in an accessory module or windshield electronics module (such as described above), or may be incorporated at or in an interior rearview mirror assembly of the vehicle, while remaining within the spirit and scope of the present invention. Optionally, the mirror assembly and/or module may support one or more other accessories or features, such as one or more electrical or electronic devices or accessories. For example, illumination sources or lights, such as map reading lights or one or more other lights or illumination sources, such as illumination sources of the types disclosed in U.S. Pat. Nos. 7,195,381; 6,690,268; 5,938,321; 5,813,745; 5,820,245; 5,673,994; 5,649,756; 5,178,448; 5,671,996; 4,646,210; 4,733,336; 4,807,096; 6,042,253; 6,971,775 and/or 5,669,698, and/or U.S. patent application Ser. No. 10/933,842, filed Sep. 3, 2004, now U.S. Pat. No. 7,249,860, which are hereby incorporated herein by reference, may be included in the mirror assembly or module. The illumination sources and/or the circuit board may be connected to one or more buttons or inputs for activating and deactivating the illumination sources. Optionally, the mirror assembly or module may also or otherwise include other accessories, such as microphones, such as analog microphones or digital microphones or the like, such as microphones of the types disclosed in U.S. Pat. Nos. 6,243,003; 6,278,377 and/or 6,420,975, and/or in PCT Application No. PCT/US03/308877, filed Oct. 1, 2003. Optionally, the mirror assembly may also or otherwise include other accessories, such as a telematics system, speakers, antennas, including global positioning system (GPS) or cellular phone antennas, such as disclosed in U.S. Pat. No. 5,971,552, a communication module, such as disclosed in U.S. Pat. No. 
5,798,688, a voice recorder, transmitters and/or receivers, such as for a garage door opener or a vehicle door unlocking system or the like (such as a remote keyless entry system), a digital network, such as described in U.S. Pat. No. 5,798,575, a memory mirror system, such as disclosed in U.S. Pat. No. 5,796,176, a hands-free phone attachment, a video device for internal cabin surveillance (such as for sleep detection or driver drowsiness detection or the like) and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962 and/or 5,877,897, a remote keyless entry receiver, a seat occupancy detector, a remote starter control, a yaw sensor, a clock, a carbon monoxide detector, status displays, such as displays that display a status of a door of the vehicle, a transmission selection (4 wd/2 wd or traction control (TCS) or the like), an antilock braking system, a road condition (that may warn the driver of icy road conditions) and/or the like, a trip computer, a tire pressure monitoring system (TPMS) receiver (such as described in U.S. Pat. Nos. 6,124,647; 6,294,989; 6,445,287; 6,472,979 and/or 6,731,205; and/or U.S. patent application Ser. No. 11/232,324, filed Sep. 21, 2005, now U.S. Pat. No. 7,423,522, and/or an ONSTAR® system and/or any other accessory or circuitry or the like (with all of the above-referenced patents and PCT and U.S. patent applications being commonly assigned, and with the disclosures of the referenced patents and patent applications being hereby incorporated herein by reference in their entireties).


Optionally, the mirror assembly or module may include one or more user inputs for controlling or activating/deactivating one or more electrical accessories or devices of or associated with the mirror assembly or module or vehicle. The mirror assembly or module may comprise any type of switches or buttons, such as touch or proximity sensing switches, such as touch or proximity switches of the types described in PCT Application No. PCT/US03/40611, filed Dec. 19, 2003; and/or U.S. Pat. Nos. 6,001,486; 6,310,611; 6,320,282 and 6,627,918; and/or U.S. patent application Ser. No. 09/817,874, filed Mar. 26, 2001, now U.S. Pat. No. 7,224,324; Ser. No. 10/956,749, filed Oct. 1, 2004, now U.S. Pat. No. 7,446,924; Ser. No. 10/933,842, filed Sep. 3, 2004, now U.S. Pat. No. 7,249,860; Ser. No. 11/021,065, filed Dec. 23, 2004, now U.S. Pat. No. 7,255,451; and/or Ser. No. 11/140,396, filed May 27, 2005, now U.S. Pat. No. 7,360,932, which are hereby incorporated herein by reference, or the inputs may comprise other types of buttons or switches, such as those described in U.S. patent application Ser. No. 11/029,695, filed Jan. 5, 2005, now U.S. Pat. No. 7,253,723; and/or Ser. No. 11/451,639, filed Jun. 13, 2006, now U.S. Pat. No. 7,527,403, which are hereby incorporated herein by reference, or such as fabric-made position detectors, such as those described in U.S. Pat. Nos. 6,504,531; 6,501,465; 6,492,980; 6,452,479; 6,437,258 and 6,369,804, which are hereby incorporated herein by reference. Other types of switches or buttons or inputs or sensors may be incorporated to provide the desired function, without affecting the scope of the present invention.


Optionally, any such user inputs or buttons may comprise user inputs for a garage door opening system, such as a vehicle based garage door opening system of the types described in U.S. Pat. Nos. 6,396,408; 6,362,771 and 5,798,688, and/or U.S. patent application Ser. No. 10/770,736, filed Feb. 3, 2004, now U.S. Pat. No. 7,023,322; and/or U.S. provisional applications, Ser. No. 60/502,806, filed Sep. 12, 2003; and Ser. No. 60/444,726, filed Feb. 4, 2003, which are hereby incorporated herein by reference. The user inputs may also or otherwise function to activate and deactivate a display or function or accessory, and/or may activate/deactivate and/or commence a calibration of a compass system of the mirror assembly and/or vehicle. The compass system may include compass sensors and circuitry within the mirror assembly or within a compass pod or module at or near or associated with the mirror assembly. Optionally, the user inputs may also or otherwise comprise user inputs for a telematics system of the vehicle, such as, for example, an ONSTAR® system as found in General Motors vehicles and/or such as described in U.S. Pat. Nos. 4,862,594; 4,937,945; 5,131,154; 5,255,442; 5,632,092; 5,798,688; 5,971,552; 5,924,212; 6,243,003; 6,278,377; 6,420,975; 6,946,978; 6,477,464; 6,678,614 and/or 7,004,593, and/or U.S. patent application Ser. No. 10/645,762, filed Aug. 20, 2003, now U.S. Pat. No. 7,167,796; and Ser. No. 10/964,512, filed Oct. 13, 2004, now U.S. Pat. No. 7,308,341; and/or PCT Application No. PCT/US03/40611, filed Dec. 19, 2003, and/or PCT Application No. PCT/US03/308877, filed Oct. 1, 2003, which are all hereby incorporated herein by reference.


Optionally, the accessory module may utilize aspects of other accessory modules or windshield electronics modules or the like, such as the types described in U.S. patent application Ser. No. 10/958,087, filed Oct. 4, 2004, now U.S. Pat. No. 7,188,963; and/or Ser. No. 11/201,661, filed Aug. 11, 2005, now U.S. Pat. No. 7,480,149, and/or U.S. Pat. Nos. 7,004,593; 6,824,281; 6,690,268; 6,250,148; 6,341,523; 6,593,565; 6,428,172; 6,501,387; 6,329,925 and 6,326,613, and/or in PCT Application No. PCT/US03/40611, filed Dec. 19, 2003, and/or Ireland pat. applications, Ser. No. S2004/0614, filed Sep. 15, 2004; Ser. No. S2004/0838, filed Dec. 14, 2004; and Ser. No. S2004/0840, filed Dec. 15, 2004, which are all hereby incorporated herein by reference.


The reflective element of the rearview mirror assembly of the vehicle may comprise an electro-optic or electrochromic reflective element or cell, such as an electrochromic mirror assembly and electrochromic reflective element utilizing principles disclosed in commonly assigned U.S. Pat. Nos. 7,195,381; 6,690,268; 5,140,455; 5,151,816; 6,178,034; 6,154,306; 6,002,544; 5,567,360; 5,525,264; 5,610,756; 5,406,414; 5,253,109; 5,076,673; 5,073,012; 5,117,346; 5,724,187; 5,668,663; 5,910,854; 5,142,407 and/or 4,712,879, and/or U.S. patent application Ser. No. 11/021,065, filed Dec. 23, 2004, now U.S. Pat. No. 7,255,451; Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. patent publication No. US 2006-0061008, and/or PCT Patent Application No. PCT/US2006/018567, filed May 15, 2006, which are all hereby incorporated herein by reference, and/or as disclosed in the following publications: N. R. Lynam, “Electrochromic Automotive Day/Night Mirrors”, SAE Technical Paper Series 870636 (1987); N. R. Lynam, “Smart Windows for Automobiles”, SAE Technical Paper Series 900419 (1990); N. R. Lynam and A. Agrawal, “Automotive Applications of Chromogenic Materials”, Large Area Chromogenics: Materials and Devices for Transmittance Control, C. M. Lampert and C. G. Granquist, EDS., Optical Engineering Press, Wash. (1990), which are hereby incorporated by reference herein. The thicknesses and materials of the coatings on the substrates of the electrochromic reflective element, such as on the third surface of the reflective element assembly, may be selected to provide a desired color or tint to the mirror reflective element, such as a blue colored reflector, such as is known in the art and/or such as described in U.S. Pat. Nos. 5,910,854 and 6,420,036, and in PCT Application No. PCT/US03/29776, filed Sep. 9, 2003, which are all hereby incorporated herein by reference.


Optionally, use of an elemental semiconductor mirror, such as a silicon metal mirror, such as disclosed in U.S. Pat. Nos. 6,286,965; 6,196,688; 5,535,056; 5,751,489 and 6,065,840, and/or in U.S. patent application Ser. No. 10/993,302, filed Nov. 19, 2004, now U.S. Pat. No. 7,338,177, which are all hereby incorporated herein by reference, can be advantageous because such elemental semiconductor mirrors (such as can be formed by depositing a thin film of silicon) can be greater than 50 percent reflecting in the photopic (SAE J964a measured), while being also substantially transmitting of light (up to 20 percent or even more). Such silicon mirrors also have the advantage of being able to be deposited onto a flat glass substrate and to be bent into a curved (such as a convex or aspheric) curvature, which is also advantageous since many passenger-side exterior rearview mirrors are bent or curved.


Optionally, the reflective element may include a perimeter metallic band, such as the types described in PCT Application No. PCT/US03/29776, filed Sep. 19, 2003; and/or PCT Application No. PCT/US03/35381, filed Nov. 5, 2003; and/or U.S. patent application Ser. No. 11/021,065, filed Dec. 23, 2004, now U.S. Pat. No. 7,255,451; and/or Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. patent publication No. US 2006-0061008, which are hereby incorporated herein by reference. Optionally, the reflective element may include indicia formed at and viewable at the reflective element, such as by utilizing aspects of the reflective elements described in PCT Patent Application No. PCT/US2006/018567, filed May 15, 2006, which are hereby incorporated herein by reference.


Optionally, the reflective element of the mirror assembly may comprise a single substrate with a reflective coating at its rear surface, without affecting the scope of the present invention. The mirror assembly thus may comprise a prismatic mirror assembly or other mirror having a single substrate reflective element, such as a mirror assembly utilizing aspects described in U.S. Pat. Nos. 6,318,870; 6,598,980; 5,327,288; 4,948,242; 4,826,289; 4,436,371 and 4,435,042; and PCT Application No. PCT/US04/015424, filed May 18, 2004; and U.S. patent application Ser. No. 10/933,842, filed Sep. 3, 2004, now U.S. Pat. No. 7,249,860, which are hereby incorporated herein by reference. Optionally, the reflective element may comprise a conventional prismatic or flat reflective element or prism, or may comprise a prismatic or flat reflective element of the types described in PCT Application No. PCT/US03/29776, filed Sep. 19, 2003; U.S. patent application Ser. No. 10/709,434, filed May 5, 2004, now U.S. Pat. No. 7,420,756; Ser. No. 10/933,842, filed Sep. 3, 2004, now U.S. Pat. No. 7,249,860; Ser. No. 11/021,065, filed Dec. 23, 2004, now U.S. Pat. No. 7,255,451; and/or Ser. No. 10/993,302, filed Nov. 19, 2004, now U.S. Pat. No. 7,338,177, and/or PCT Application No. PCT/US2004/015424, filed May 18, 2004, which are all hereby incorporated herein by reference, without affecting the scope of the present invention.


Changes and modifications to the specifically described embodiments may be carried out without departing from the principles of the present invention, which is intended to be limited by the scope of the appended claims as interpreted according to the principles of patent law including the doctrine of equivalents.

Claims
  • 1. A vehicular imaging system, said vehicular imaging system comprising: a camera comprising a CMOS imaging sensor;said imaging sensor comprising a photosensor array having a plurality of photosensor elements;wherein said camera is disposed in a vehicle equipped with said vehicular imaging system, and wherein said camera is disposed behind a windshield of the equipped vehicle and views through a portion of the windshield;wherein said camera views a scene forward of the equipped vehicle;wherein the portion of the windshield comprises a portion cleaned by a windshield wiper of the equipped vehicle;wherein said camera is operable to capture frames of image data at an image frame capture rate that is between 5 times per second and 120 times per second;wherein image data captured by said camera is provided to a control, said control comprising an image processor;wherein said control receives, via a communication bus of the equipped vehicle, at least one selected from the group consisting of (i) vehicle pitch information relating to pitch of the equipped vehicle, (ii) vehicle yaw information relating to yaw of the equipped vehicle and (iii) vehicle steering information relating to steering of the equipped vehicle;wherein said vehicular imaging system automatically corrects for misalignment of said camera disposed behind the windshield of the equipped vehicle and viewing through the portion of the windshield; andwherein image data captured by said camera is processed at said control for a lane departure warning system of the equipped vehicle and for at least one selected from the group consisting of (i) an automatic headlamp control system of the equipped vehicle, (ii) a collision avoidance system of the equipped vehicle and (iii) an adaptive front lighting system of the equipped vehicle.
  • 2. The vehicular imaging system of claim 1, wherein said vehicular imaging system automatically corrects for misalignment of said camera of up to +/−4 degrees from an aligned condition of said camera.
  • 3. The vehicular imaging system of claim 2, wherein said control corrects a misalignment of said camera that occurs due to use of the equipped vehicle.
  • 4. The vehicular imaging system of claim 3, wherein the misalignment that occurs due to use of the equipped vehicle arises from at least one selected from the group consisting of (i) a change in pitch of the equipped vehicle, (ii) a change in tilt of the equipped vehicle and (iii) a change in yaw of the equipped vehicle.
  • 5. The vehicular imaging system of claim 3, wherein the misalignment arises from a load condition of the equipped vehicle.
  • 6. The vehicular imaging system of claim 1, wherein said control is operable to process a reduced set of image data more than other image data.
  • 7. The vehicular imaging system of claim 1, wherein said control processes captured image data at a higher resolution level of processing of captured image data and at a lower resolution level of processing of captured image data.
  • 8. The vehicular imaging system of claim 7, wherein said control processes captured image data at the lower resolution level responsive to distance between the equipped vehicle and an object viewed by said camera being below a threshold level for the object, and wherein said control processes captured image data at the higher resolution level responsive to distance between the equipped vehicle and the object viewed by said camera being above the threshold level for the object.
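For illustration, the two-level processing of claims 7 and 8 can be sketched as follows. This is a minimal sketch, not the claimed implementation; the function name, the string return values and the metric units are illustrative assumptions:

```python
def select_processing_resolution(distance_m: float, threshold_m: float) -> str:
    """Pick the image-processing resolution for a detected object.

    Per claims 7 and 8: an object nearer than its threshold distance is
    processed at the lower resolution level, while an object beyond the
    threshold is processed at the higher resolution level (a distant
    object spans few photosensor elements, so finer processing is needed
    to resolve it).
    """
    return "low" if distance_m < threshold_m else "high"
```

The per-object threshold lets the control spend processing effort where it matters, e.g. resolving a distant taillight while handling nearby, already well-resolved objects cheaply.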
  • 9. The vehicular imaging system of claim 1, wherein said vehicular imaging system detects when at least a portion of said photosensor array of said imaging sensor of said camera is blocked.
  • 10. The vehicular imaging system of claim 9, wherein said vehicular imaging system determines that said photosensor array of said camera is totally blocked when, over a plurality of frames of image data captured by said camera, processing of image data at said control determines that a count of bright pixels remains below a threshold.
  • 11. The vehicular imaging system of claim 9, wherein said vehicular imaging system determines that said photosensor array of said camera is partially blocked via said control performing region-based image processing to take into account intensity variations in different regions of said photosensor array of said camera.
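The blockage tests of claims 10 and 11 can be illustrated with a minimal Python sketch; the bright-pixel level, the count threshold and the 4x4 tiling are assumed parameters chosen for illustration, not values taken from the claims:

```python
BRIGHT_LEVEL = 200      # pixel intensity treated as "bright" (assumed)
TOTAL_BLOCK_COUNT = 50  # bright-pixel count below which a frame looks blocked
GRID = 4                # tiling used for the region-based partial check

def totally_blocked(frames) -> bool:
    """Claim 10: the photosensor array is deemed totally blocked when,
    over a plurality of captured frames, the count of bright pixels
    remains below a threshold."""
    return all(
        sum(px >= BRIGHT_LEVEL for row in f for px in row) < TOTAL_BLOCK_COUNT
        for f in frames
    )

def partially_blocked(frame, ratio=0.25) -> bool:
    """Claim 11: region-based processing - tile the array and compare
    each tile's mean intensity against the whole-frame mean, taking
    into account intensity variations in different regions; a tile far
    dimmer than the frame average suggests a partial obstruction."""
    h, w = len(frame), len(frame[0])
    overall = sum(px for row in frame for px in row) / (h * w)
    th, tw = h // GRID, w // GRID
    for gy in range(GRID):
        for gx in range(GRID):
            tile = [frame[y][x]
                    for y in range(gy * th, (gy + 1) * th)
                    for x in range(gx * tw, (gx + 1) * tw)]
            if sum(tile) / len(tile) < ratio * overall:
                return True
    return False
```

Requiring the bright-pixel count to stay low over several frames, rather than in a single frame, avoids declaring a blockage during a momentarily dark scene such as a tunnel entrance.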
  • 12. The vehicular imaging system of claim 1, wherein said vehicular imaging system automatically detects a driving side of a road being traveled along by the equipped vehicle.
  • 13. The vehicular imaging system of claim 12, wherein image processing at said control of image data captured by said camera identifies a taillight of another vehicle that is 200 meters in front of the equipped vehicle.
  • 14. The vehicular imaging system of claim 1, wherein said vehicular imaging system is operable to encode signals into a light output of a light of the equipped vehicle to allow an imaging system of another vehicle to detect an encoded signal.
  • 15. The vehicular imaging system of claim 1, wherein said vehicular imaging system is operable to detect and ameliorate a bad photosensor element of said plurality of photosensor elements of said imaging sensor of said camera.
  • 16. The vehicular imaging system of claim 1, wherein a reduced image data set of image data captured by said camera is processed at said control, the reduced image data set being representative of a portion of the captured image data as captured by a particular grouping of said photosensor elements of said imaging sensor of said camera.
  • 17. The vehicular imaging system of claim 1, wherein the automatic correction for misalignment of said camera mounted at the equipped vehicle comprises image processing at said control of image data captured by said camera determining where converging road features of a road along which the equipped vehicle is traveling would converge.
  • 18. The vehicular imaging system of claim 1, wherein the automatic correction for misalignment of said camera comprises comparing an imaged location of an object present in a field of view of said imaging sensor to an expected location of the object.
  • 19. The vehicular imaging system of claim 1, wherein the automatic correction for misalignment of said camera mounted at the equipped vehicle comprises image processing at said control of image data captured by said camera comparing an imaged location of an object viewed by said camera to an expected location of the object, wherein the imaged location is determined by image processing at said control of image data captured by said camera and wherein the expected location is determined responsive to vehicle data carried to said control via said communication bus.
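Claims 17 through 19 describe two complementary alignment checks: locating the point where converging road features (such as lane markings) would converge, and comparing an object's imaged location to its expected location. A minimal sketch of the vanishing-point variant follows, assuming the road features have already been fit as lines in pixel coordinates; the line-fitting step and the focal-length value are assumptions for illustration, not part of the claims:

```python
import math

def vanishing_point(line_a, line_b):
    """Intersect two converging road-feature lines, each given as
    (slope, intercept) in pixel coordinates, to estimate where the
    features would converge in the image."""
    (m1, b1), (m2, b2) = line_a, line_b
    x = (b2 - b1) / (m1 - m2)
    return x, m1 * x + b1

def yaw_misalignment_deg(vp_x, expected_x, focal_px):
    """Convert the horizontal offset of the observed convergence point
    from its expected column into a yaw misalignment angle, using the
    camera focal length expressed in pixels."""
    return math.degrees(math.atan2(vp_x - expected_x, focal_px))
```

With a focal length of roughly 800 pixels, a 56-pixel offset of the convergence point corresponds to about 4 degrees of yaw, matching the +/-4 degree correction range recited in claim 2.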
  • 20. A vehicular imaging system, said vehicular imaging system comprising: a camera comprising a CMOS imaging sensor;said imaging sensor comprising a photosensor array having a plurality of photosensor elements;wherein said camera is disposed in a vehicle equipped with said vehicular imaging system, and wherein said camera is disposed behind a windshield of the equipped vehicle and views through a portion of the windshield;wherein said camera views a scene forward of the equipped vehicle;wherein the portion of the windshield comprises a portion cleaned by a windshield wiper of the equipped vehicle;wherein said camera is operable to capture frames of image data at an image frame capture rate that is between 5 times per second and 120 times per second;wherein image data captured by said camera is provided to a control, said control comprising an image processor;wherein said control receives, via a communication bus of the equipped vehicle, at least one selected from the group consisting of (i) vehicle pitch information relating to pitch of the equipped vehicle, (ii) vehicle yaw information relating to yaw of the equipped vehicle and (iii) vehicle steering information relating to steering of the equipped vehicle;wherein said vehicular imaging system automatically corrects for misalignment of said camera disposed behind the windshield of the equipped vehicle and viewing through the portion of the windshield;wherein said vehicular imaging system automatically corrects for misalignment of said camera of up to +/−4 degrees from an aligned condition of said camera;wherein the automatic correction for misalignment of said camera mounted at the equipped vehicle comprises image processing at said control of image data captured by said camera determining where converging road features of a road along which the equipped vehicle is traveling would converge; andwherein image data captured by said camera is processed at said control for a lane departure warning system of the equipped vehicle and for at least one selected from the group consisting of (i) an automatic headlamp control system of the equipped vehicle, (ii) a collision avoidance system of the equipped vehicle and (iii) an adaptive front lighting system of the equipped vehicle.
  • 21. The vehicular imaging system of claim 20, wherein image processing at said control of image data captured by said camera identifies a taillight of another vehicle that is 200 meters in front of the equipped vehicle.
  • 22. The vehicular imaging system of claim 21, wherein said control corrects a misalignment of said camera that occurs due to use of the equipped vehicle.
  • 23. The vehicular imaging system of claim 22, wherein the misalignment that occurs due to use of the equipped vehicle arises from at least one selected from the group consisting of (i) a change in pitch of the equipped vehicle, (ii) a change in tilt of the equipped vehicle and (iii) a change in yaw of the equipped vehicle.
  • 24. The vehicular imaging system of claim 20, wherein said control is operable to process a reduced set of image data more than other image data.
  • 25. The vehicular imaging system of claim 24, wherein said control processes captured image data at a higher resolution level of processing of captured image data and at a lower resolution level of processing of captured image data.
  • 26. The vehicular imaging system of claim 25, wherein said control processes captured image data at the lower resolution level responsive to distance between the equipped vehicle and an object viewed by said camera being below a threshold level for the object, and wherein said control processes captured image data at the higher resolution level responsive to distance between the equipped vehicle and the object viewed by said camera being above the threshold level for the object.
  • 27. The vehicular imaging system of claim 20, wherein said vehicular imaging system detects when at least a portion of said photosensor array of said imaging sensor of said camera is blocked.
  • 28. The vehicular imaging system of claim 27, wherein said vehicular imaging system determines that said photosensor array of said camera is totally blocked when, over a plurality of frames of image data captured by said camera, processing of image data at said control determines that a count of bright pixels remains below a threshold.
  • 29. The vehicular imaging system of claim 27, wherein said vehicular imaging system determines that said photosensor array of said camera is partially blocked via said control performing region-based image processing to take into account intensity variations in different regions of said photosensor array of said camera.
  • 30. The vehicular imaging system of claim 20, wherein said vehicular imaging system automatically detects a driving side of the road being traveled along by the equipped vehicle.
  • 31. The vehicular imaging system of claim 30, wherein a reduced image data set of image data captured by said camera is processed at said control, the reduced image data set being representative of a portion of the captured image data as captured by a particular grouping of said photosensor elements of said imaging sensor of said camera.
  • 32. A vehicular imaging system, said vehicular imaging system comprising: a camera comprising a CMOS imaging sensor;said imaging sensor comprising a photosensor array having a plurality of photosensor elements;wherein said camera is disposed in a vehicle equipped with said vehicular imaging system, and wherein said camera is disposed behind a windshield of the equipped vehicle and views through a portion of the windshield;wherein said camera views a scene forward of the equipped vehicle;wherein the portion of the windshield comprises a portion cleaned by a windshield wiper of the equipped vehicle;wherein said camera is operable to capture frames of image data at an image frame capture rate that is between 5 times per second and 120 times per second;wherein image data captured by said camera is provided to a control, said control comprising an image processor;wherein said control receives, via a communication bus of the equipped vehicle, at least one selected from the group consisting of (i) vehicle pitch information relating to pitch of the equipped vehicle, (ii) vehicle yaw information relating to yaw of the equipped vehicle and (iii) vehicle steering information relating to steering of the equipped vehicle;wherein said vehicular imaging system automatically corrects for misalignment of said camera disposed behind the windshield of the equipped vehicle and viewing through the portion of the windshield;wherein said vehicular imaging system automatically corrects for misalignment of said camera of up to +/−4 degrees from an aligned condition of said camera;wherein the automatic correction for misalignment of said camera mounted at the equipped vehicle comprises image processing at said control of image data captured by said camera comparing an imaged location of an object viewed by said camera to an expected location of the object;wherein the imaged location is determined by image processing at said control of image data captured by said camera;wherein the expected location is determined responsive to vehicle data carried to said control via said communication bus; andwherein image data captured by said camera is processed at said control for a lane departure warning system of the equipped vehicle and for at least one selected from the group consisting of (i) an automatic headlamp control system of the equipped vehicle, (ii) a collision avoidance system of the equipped vehicle and (iii) an adaptive front lighting system of the equipped vehicle.
  • 33. The vehicular imaging system of claim 32, wherein image processing at said control of image data captured by said camera identifies a taillight of another vehicle that is 200 meters in front of the equipped vehicle.
  • 34. The vehicular imaging system of claim 33, wherein said vehicular imaging system automatically detects a driving side of a road being traveled along by the equipped vehicle.
  • 35. The vehicular imaging system of claim 34, wherein a reduced image data set of image data captured by said camera is processed at said control, the reduced image data set being representative of a portion of the captured image data as captured by a particular grouping of said photosensor elements of said imaging sensor of said camera.
  • 36. The vehicular imaging system of claim 33, wherein said control is operable to process a reduced set of image data more than other image data.
  • 37. The vehicular imaging system of claim 32, wherein said control corrects a misalignment of said camera that occurs due to use of the equipped vehicle.
  • 38. The vehicular imaging system of claim 37, wherein the misalignment that occurs due to use of the equipped vehicle arises from at least one selected from the group consisting of (i) a change in pitch of the equipped vehicle, (ii) a change in tilt of the equipped vehicle and (iii) a change in yaw of the equipped vehicle.
  • 39. The vehicular imaging system of claim 32, wherein said control processes captured image data at a higher resolution level of processing of captured image data and at a lower resolution level of processing of captured image data.
  • 40. The vehicular imaging system of claim 39, wherein said control processes captured image data at the lower resolution level responsive to distance between the equipped vehicle and an object viewed by said camera being below a threshold level for the object, and wherein said control processes captured image data at the higher resolution level responsive to distance between the equipped vehicle and the object viewed by said camera being above the threshold level for the object.
  • 41. The vehicular imaging system of claim 32, wherein said vehicular imaging system detects when at least a portion of said photosensor array of said imaging sensor of said camera is blocked.
  • 42. The vehicular imaging system of claim 41, wherein said vehicular imaging system determines that said photosensor array of said camera is totally blocked when, over a plurality of frames of image data captured by said camera, processing of image data at said control determines that a count of bright pixels remains below a threshold.
  • 43. The vehicular imaging system of claim 41, wherein said vehicular imaging system determines that said photosensor array of said camera is partially blocked via said control performing region-based image processing to take into account intensity variations in different regions of said photosensor array of said camera.
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 16/947,270, filed Jul. 27, 2020, now U.S. Pat. No. 11,328,447, which is a continuation of U.S. patent application Ser. No. 15/978,435, filed May 14, 2018, now U.S. Pat. No. 10,726,578, which is a divisional of U.S. patent application Ser. No. 14/694,226, filed Apr. 23, 2015, now U.S. Pat. No. 9,972,100, which is a continuation of U.S. patent application Ser. No. 13/776,094, filed Feb. 25, 2013, now U.S. Pat. No. 9,018,577, which is a continuation of U.S. patent application Ser. No. 13/204,791, filed Aug. 8, 2011, which is a continuation of U.S. patent application Ser. No. 12/190,698, filed Aug. 13, 2008, now U.S. Pat. No. 8,017,898, which claims the benefit of U.S. provisional application Ser. No. 60/956,633, filed Aug. 17, 2007, which are incorporated herein by reference for all purposes.

US Referenced Citations (556)
Number Name Date Kind
2632040 Rabinow Mar 1953 A
2827594 Rabinow Mar 1958 A
3349394 Carver Oct 1967 A
3601614 Platzer Aug 1971 A
3612666 Rabinow Oct 1971 A
3665224 Kelsey May 1972 A
3680951 Jordan et al. Aug 1972 A
3689695 Rosenfield et al. Sep 1972 A
3708231 Walters Jan 1973 A
3746430 Brean et al. Jul 1973 A
3807832 Castellion Apr 1974 A
3811046 Levick May 1974 A
3813540 Albrecht May 1974 A
3862798 Hopkins Jan 1975 A
3947095 Moultrie Mar 1976 A
3962600 Pittman Jun 1976 A
3985424 Steinacher Oct 1976 A
3986022 Hyatt Oct 1976 A
4037134 Loper Jul 1977 A
4052712 Ohama et al. Oct 1977 A
4093364 Miller Jun 1978 A
4111720 Michel et al. Sep 1978 A
4161653 Bedini et al. Jul 1979 A
4200361 Malvano et al. Apr 1980 A
4214266 Myers Jul 1980 A
4218698 Bart et al. Aug 1980 A
4236099 Rosenblum Nov 1980 A
4247870 Gabel et al. Jan 1981 A
4249160 Chilvers Feb 1981 A
4266856 Wainwright May 1981 A
4277804 Robison Jul 1981 A
4281898 Ochiai et al. Aug 1981 A
4288814 Talley et al. Sep 1981 A
4355271 Noack Oct 1982 A
4357558 Massoni et al. Nov 1982 A
4381888 Momiyama May 1983 A
4420238 Felix Dec 1983 A
4431896 Lodetti Feb 1984 A
4443057 Bauer et al. Apr 1984 A
4460831 Oettinger et al. Jul 1984 A
4481450 Watanabe et al. Nov 1984 A
4491390 Tong-Shen Jan 1985 A
4512637 Ballmer Apr 1985 A
4529275 Ballmer Jul 1985 A
4529873 Ballmer et al. Jul 1985 A
4532550 Bendell et al. Jul 1985 A
4546551 Franks Oct 1985 A
4549208 Kamejima et al. Oct 1985 A
4571082 Downs Feb 1986 A
4572619 Reininger et al. Feb 1986 A
4580875 Bechtel et al. Apr 1986 A
4600913 Caine Jul 1986 A
4603946 Kato et al. Aug 1986 A
4614415 Hyatt Sep 1986 A
4620141 McCumber et al. Oct 1986 A
4623222 Itoh et al. Nov 1986 A
4626850 Chey Dec 1986 A
4629941 Ellis et al. Dec 1986 A
4630109 Barton Dec 1986 A
4632509 Ohmi et al. Dec 1986 A
4638287 Umebayashi et al. Jan 1987 A
4647161 Muller Mar 1987 A
4653316 Fukuhara Mar 1987 A
4669825 Itoh et al. Jun 1987 A
4669826 Itoh et al. Jun 1987 A
4671615 Fukada et al. Jun 1987 A
4672457 Hyatt Jun 1987 A
4676601 Itoh et al. Jun 1987 A
4690508 Jacob Sep 1987 A
4692798 Seko et al. Sep 1987 A
4697883 Suzuki et al. Oct 1987 A
4701022 Jacob Oct 1987 A
4713685 Nishimura et al. Dec 1987 A
4717830 Botts Jan 1988 A
4727290 Smith et al. Feb 1988 A
4731669 Hayashi et al. Mar 1988 A
4741603 Miyagi et al. May 1988 A
4768135 Kretschmer et al. Aug 1988 A
4772942 Tuck Sep 1988 A
4789904 Peterson Dec 1988 A
4793690 Gahan et al. Dec 1988 A
4817948 Simonelli Apr 1989 A
4820933 Hong et al. Apr 1989 A
4825232 Howdle Apr 1989 A
4838650 Stewart et al. Jun 1989 A
4847772 Michalopoulos et al. Jul 1989 A
4855822 Narendra et al. Aug 1989 A
4862037 Farber et al. Aug 1989 A
4867561 Fujii et al. Sep 1989 A
4871917 O'Farrell et al. Oct 1989 A
4872051 Dye Oct 1989 A
4881019 Shiraishi et al. Nov 1989 A
4882565 Gallmeyer Nov 1989 A
4886960 Molyneux et al. Dec 1989 A
4891559 Matsumoto et al. Jan 1990 A
4892345 Rachael, III Jan 1990 A
4895790 Swanson et al. Jan 1990 A
4896030 Miyaji Jan 1990 A
4907870 Brucker Mar 1990 A
4910591 Petrossian et al. Mar 1990 A
4916374 Schierbeek et al. Apr 1990 A
4917477 Bechtel et al. Apr 1990 A
4937796 Tendler Jun 1990 A
4953305 Van Lente et al. Sep 1990 A
4956591 Schierbeek et al. Sep 1990 A
4961625 Wood et al. Oct 1990 A
4966441 Conner Oct 1990 A
4967319 Seko Oct 1990 A
4970653 Kenue Nov 1990 A
4971430 Lynas Nov 1990 A
4974078 Tsai Nov 1990 A
4987357 Masaki Jan 1991 A
4991054 Walters Feb 1991 A
5001558 Burley et al. Mar 1991 A
5003288 Wilhelm Mar 1991 A
5012082 Watanabe Apr 1991 A
5016977 Baude et al. May 1991 A
5027001 Torbert Jun 1991 A
5027200 Petrossian et al. Jun 1991 A
5044706 Chen Sep 1991 A
5055668 French Oct 1991 A
5059877 Teder Oct 1991 A
5064274 Alten Nov 1991 A
5072154 Chen Dec 1991 A
5086253 Lawler Feb 1992 A
5096287 Kakinami et al. Mar 1992 A
5097362 Lynas Mar 1992 A
5121200 Choi Jun 1992 A
5124549 Michaels et al. Jun 1992 A
5130709 Toyama et al. Jul 1992 A
5148014 Lynam et al. Sep 1992 A
5166681 Bottesch et al. Nov 1992 A
5168378 Black Dec 1992 A
5170374 Shimohigashi et al. Dec 1992 A
5172235 Wilm et al. Dec 1992 A
5177606 Koshizawa Jan 1993 A
5177685 Davis et al. Jan 1993 A
5182502 Slotkowski et al. Jan 1993 A
5184956 Langlais et al. Feb 1993 A
5189561 Hong Feb 1993 A
5193000 Lipton et al. Mar 1993 A
5193029 Schofield et al. Mar 1993 A
5204778 Bechtel Apr 1993 A
5208701 Maeda May 1993 A
5208750 Kurami et al. May 1993 A
5214408 Asayama May 1993 A
5243524 Ishida et al. Sep 1993 A
5245422 Borcherts et al. Sep 1993 A
5253109 O'Farrell et al. Oct 1993 A
5276389 Levers Jan 1994 A
5285060 Larson et al. Feb 1994 A
5289182 Brillard et al. Feb 1994 A
5289321 Secor Feb 1994 A
5305012 Faris Apr 1994 A
5307136 Saneyoshi Apr 1994 A
5313072 Vachss May 1994 A
5325096 Pakett Jun 1994 A
5325386 Jewell et al. Jun 1994 A
5329206 Slotkowski et al. Jul 1994 A
5331312 Kudoh Jul 1994 A
5336980 Levers Aug 1994 A
5341437 Nakayama Aug 1994 A
5351044 Mathur et al. Sep 1994 A
5355118 Fukuhara Oct 1994 A
5374852 Parkes Dec 1994 A
5386285 Asayama Jan 1995 A
5394333 Kao Feb 1995 A
5406395 Wilson et al. Apr 1995 A
5408346 Trissel et al. Apr 1995 A
5410346 Saneyoshi et al. Apr 1995 A
5414257 Stanton May 1995 A
5414461 Kishi et al. May 1995 A
5416313 Larson et al. May 1995 A
5416318 Hegyi May 1995 A
5416478 Morinaga May 1995 A
5424952 Asayama Jun 1995 A
5426294 Kobayashi et al. Jun 1995 A
5430431 Nelson Jul 1995 A
5434407 Bauer et al. Jul 1995 A
5440428 Hegg et al. Aug 1995 A
5444478 Lelong et al. Aug 1995 A
5451822 Bechtel et al. Sep 1995 A
5457493 Leddy et al. Oct 1995 A
5461357 Yoshioka et al. Oct 1995 A
5461361 Moore Oct 1995 A
5469298 Suman et al. Nov 1995 A
5471515 Fossum et al. Nov 1995 A
5475494 Nishida et al. Dec 1995 A
5487116 Nakano et al. Jan 1996 A
5498866 Bendicks et al. Mar 1996 A
5500766 Stonecypher Mar 1996 A
5510983 Lino Apr 1996 A
5515448 Nishitani May 1996 A
5521633 Nakajima et al. May 1996 A
5528698 Kamei et al. Jun 1996 A
5529138 Shaw et al. Jun 1996 A
5530240 Larson et al. Jun 1996 A
5530420 Tsuchiya et al. Jun 1996 A
5535144 Kise Jul 1996 A
5535314 Alves et al. Jul 1996 A
5537003 Bechtel et al. Jul 1996 A
5539397 Asanuma et al. Jul 1996 A
5541590 Nishio Jul 1996 A
5550677 Schofield et al. Aug 1996 A
5555312 Shima et al. Sep 1996 A
5555555 Sato et al. Sep 1996 A
5568027 Teder Oct 1996 A
5574443 Hsieh Nov 1996 A
5581464 Woll et al. Dec 1996 A
5594222 Caldwell Jan 1997 A
5614788 Mullins Mar 1997 A
5619370 Guinosso Apr 1997 A
5634709 Iwama Jun 1997 A
5642299 Hardin et al. Jun 1997 A
5648835 Uzawa Jul 1997 A
5650944 Kise Jul 1997 A
5660454 Mori et al. Aug 1997 A
5661303 Teder Aug 1997 A
5666028 Bechtel et al. Sep 1997 A
5668663 Varaprasad et al. Sep 1997 A
5670935 Schofield et al. Sep 1997 A
5675489 Pomerleau Oct 1997 A
5677851 Kingdon et al. Oct 1997 A
5699044 Van Lente et al. Dec 1997 A
5724187 Varaprasad et al. Mar 1998 A
5724316 Brunts Mar 1998 A
5737226 Olson et al. Apr 1998 A
5757949 Kinoshita et al. May 1998 A
5760826 Nayar Jun 1998 A
5760828 Cortes Jun 1998 A
5760931 Saburi et al. Jun 1998 A
5760962 Schofield et al. Jun 1998 A
5761094 Olson et al. Jun 1998 A
5765116 Wilson-Jones et al. Jun 1998 A
5781437 Wiemer et al. Jul 1998 A
5786772 Schofield et al. Jul 1998 A
5790403 Nakayama Aug 1998 A
5790973 Blaker et al. Aug 1998 A
5793308 Rosinski et al. Aug 1998 A
5793420 Schmidt Aug 1998 A
5796094 Schofield et al. Aug 1998 A
5798575 O'Farrell et al. Aug 1998 A
5835255 Miles Nov 1998 A
5837994 Stam et al. Nov 1998 A
5844505 Van Ryzin Dec 1998 A
5844682 Kiyomoto et al. Dec 1998 A
5845000 Breed et al. Dec 1998 A
5848802 Breed et al. Dec 1998 A
5850176 Kinoshita et al. Dec 1998 A
5850254 Takano et al. Dec 1998 A
5867591 Onda Feb 1999 A
5877707 Kowalick Mar 1999 A
5877897 Schofield et al. Mar 1999 A
5878370 Olson Mar 1999 A
5883739 Ashihara et al. Mar 1999 A
5884212 Lion Mar 1999 A
5890021 Onoda Mar 1999 A
5896085 Mori et al. Apr 1999 A
5899956 Chan May 1999 A
5914815 Bos Jun 1999 A
5920367 Kajimoto et al. Jul 1999 A
5923027 Stam et al. Jul 1999 A
5929786 Schofield et al. Jul 1999 A
5940120 Frankhouse et al. Aug 1999 A
5949331 Schofield et al. Sep 1999 A
5956181 Lin Sep 1999 A
5959367 O'Farrell et al. Sep 1999 A
5959555 Furuta Sep 1999 A
5963247 Banitt Oct 1999 A
5964822 Alland et al. Oct 1999 A
5971552 O'Farrell et al. Oct 1999 A
5986796 Miles Nov 1999 A
5990469 Bechtel et al. Nov 1999 A
5990649 Nagao et al. Nov 1999 A
6001486 Varaprasad et al. Dec 1999 A
6020704 Buschur Feb 2000 A
6049171 Stam et al. Apr 2000 A
6052124 Stein et al. Apr 2000 A
6066933 Ponziana May 2000 A
6084519 Coulling et al. Jul 2000 A
6087953 DeLine et al. Jul 2000 A
6091833 Yasui et al. Jul 2000 A
6097023 Schofield et al. Aug 2000 A
6097024 Stam et al. Aug 2000 A
6100811 Hsu et al. Aug 2000 A
6116743 Hoek Sep 2000 A
6124647 Marcus et al. Sep 2000 A
6124886 DeLine et al. Sep 2000 A
6139172 Bos et al. Oct 2000 A
6144022 Tenenbaum et al. Nov 2000 A
6160369 Chen Dec 2000 A
6172613 DeLine et al. Jan 2001 B1
6175164 O'Farrell et al. Jan 2001 B1
6175300 Kendrick Jan 2001 B1
6198409 Schofield et al. Mar 2001 B1
6201642 Bos Mar 2001 B1
6222447 Schofield et al. Apr 2001 B1
6222460 DeLine et al. Apr 2001 B1
6226061 Tagusa May 2001 B1
6243003 DeLine et al. Jun 2001 B1
6250148 Lynam Jun 2001 B1
6259412 Duroux Jul 2001 B1
6259423 Tokito et al. Jul 2001 B1
6266082 Yonezawa et al. Jul 2001 B1
6266442 Laumeyer et al. Jul 2001 B1
6285393 Shimoura et al. Sep 2001 B1
6285778 Nakajima et al. Sep 2001 B1
6291906 Marcus et al. Sep 2001 B1
6294989 Schofield et al. Sep 2001 B1
6297781 Turnbull et al. Oct 2001 B1
6302545 Schofield et al. Oct 2001 B1
6310611 Caldwell Oct 2001 B1
6313454 Bos et al. Nov 2001 B1
6317057 Lee Nov 2001 B1
6320176 Schofield et al. Nov 2001 B1
6320282 Caldwell Nov 2001 B1
6326613 Heslin et al. Dec 2001 B1
6329925 Skiver et al. Dec 2001 B1
6333759 Mazzilli Dec 2001 B1
6341523 Lynam Jan 2002 B2
6353392 Schofield et al. Mar 2002 B1
6366213 DeLine et al. Apr 2002 B2
6370329 Teuchert Apr 2002 B1
6396397 Bos et al. May 2002 B1
6411204 Bloomfield et al. Jun 2002 B1
6411328 Franke et al. Jun 2002 B1
6420975 DeLine et al. Jul 2002 B1
6424273 Gutta et al. Jul 2002 B1
6428172 Hutzel et al. Aug 2002 B1
6430303 Naoi et al. Aug 2002 B1
6433676 DeLine et al. Aug 2002 B2
6442465 Breed et al. Aug 2002 B2
6445287 Schofield et al. Sep 2002 B1
6477464 McCarthy et al. Nov 2002 B2
6485155 Duroux et al. Nov 2002 B1
6497503 Dassanayake et al. Dec 2002 B1
6498620 Schofield et al. Dec 2002 B2
6513252 Schierbeek et al. Feb 2003 B1
6515378 Drummond et al. Feb 2003 B2
6516664 Lynam Feb 2003 B2
6523964 Schofield et al. Feb 2003 B2
6534884 Marcus et al. Mar 2003 B2
6539306 Turnbull Mar 2003 B2
6547133 Devries, Jr. et al. Apr 2003 B1
6553130 Lemelson et al. Apr 2003 B1
6559435 Schofield et al. May 2003 B2
6570998 Ohtsuka et al. May 2003 B1
6574033 Chui et al. Jun 2003 B1
6587573 Stam et al. Jul 2003 B1
6589625 Kothari et al. Jul 2003 B1
6593565 Heslin et al. Jul 2003 B2
6594583 Ogura et al. Jul 2003 B2
6611202 Schofield et al. Aug 2003 B2
6611610 Stam et al. Aug 2003 B1
6627918 Getz et al. Sep 2003 B2
6636258 Strumolo Oct 2003 B2
6648477 Hutzel et al. Nov 2003 B2
6650233 DeLine et al. Nov 2003 B2
6650455 Miles Nov 2003 B2
6672731 Schnell et al. Jan 2004 B2
6674562 Miles Jan 2004 B1
6678614 McCarthy et al. Jan 2004 B2
6680792 Miles Jan 2004 B2
6690268 Schofield et al. Feb 2004 B2
6700605 Toyoda et al. Mar 2004 B1
6703925 Steffel Mar 2004 B2
6704621 Stein et al. Mar 2004 B1
6710908 Miles et al. Mar 2004 B2
6711474 Treyz et al. Mar 2004 B1
6714331 Lewis et al. Mar 2004 B2
6717610 Bos et al. Apr 2004 B1
6735506 Breed et al. May 2004 B2
6741377 Miles May 2004 B2
6744353 Sjonell Jun 2004 B2
6757109 Bos Jun 2004 B2
6762867 Lippert et al. Jul 2004 B2
6794119 Miles Sep 2004 B2
6795221 Urey Sep 2004 B1
6802617 Schofield et al. Oct 2004 B2
6806452 Bos et al. Oct 2004 B2
6807287 Hermans Oct 2004 B1
6822563 Bos et al. Nov 2004 B2
6823241 Shirato et al. Nov 2004 B2
6824281 Schofield et al. Nov 2004 B2
6831261 Schofield et al. Dec 2004 B2
6831591 Horibe Dec 2004 B2
6847487 Burgner Jan 2005 B2
6864930 Matsushita et al. Mar 2005 B2
6882287 Schofield Apr 2005 B2
6889161 Winner et al. May 2005 B2
6891563 Schofield et al. May 2005 B2
6909753 Meehan et al. Jun 2005 B2
6946978 Schofield Sep 2005 B2
6953253 Schofield et al. Oct 2005 B2
6968736 Lynam Nov 2005 B2
6975775 Rykowski et al. Dec 2005 B2
7004593 Weller et al. Feb 2006 B2
7004606 Schofield Feb 2006 B2
7005974 McMahon et al. Feb 2006 B2
7038577 Pawlicki et al. May 2006 B2
7062300 Kim Jun 2006 B1
7065432 Moisel et al. Jun 2006 B2
7085637 Breed et al. Aug 2006 B2
7092548 Laumeyer et al. Aug 2006 B2
7113867 Stein Sep 2006 B1
7116246 Winter et al. Oct 2006 B2
7123168 Schofield Oct 2006 B2
7133661 Hatae et al. Nov 2006 B2
7149613 Stam et al. Dec 2006 B2
7151996 Stein Dec 2006 B2
7167796 Taylor et al. Jan 2007 B2
7195381 Lynam et al. Mar 2007 B2
7202776 Breed Apr 2007 B2
7227459 Bos et al. Jun 2007 B2
7227611 Hull et al. Jun 2007 B2
7311406 Schofield et al. Dec 2007 B2
7325934 Schofield et al. Feb 2008 B2
7325935 Schofield et al. Feb 2008 B2
7338177 Lynam Mar 2008 B2
7339149 Schofield et al. Mar 2008 B1
7344261 Schofield et al. Mar 2008 B2
7375803 Bamji May 2008 B1
7380948 Schofield et al. Jun 2008 B2
7388182 Schofield et al. Jun 2008 B2
7402786 Schofield et al. Jul 2008 B2
7423248 Schofield et al. Sep 2008 B2
7423821 Bechtel et al. Sep 2008 B2
7425076 Schofield et al. Sep 2008 B2
7459664 Schofield et al. Dec 2008 B2
7526103 Schofield et al. Apr 2009 B2
7541743 Salmeen et al. Jun 2009 B2
7561181 Schofield et al. Jul 2009 B2
7565006 Stam et al. Jul 2009 B2
7566851 Stein et al. Jul 2009 B2
7605856 Imoto Oct 2009 B2
7616781 Schofield et al. Nov 2009 B2
7619508 Lynam et al. Nov 2009 B2
7639149 Katoh Dec 2009 B2
7676087 Dhua et al. Mar 2010 B2
7720580 Higgins-Luthman May 2010 B2
7786898 Stein et al. Aug 2010 B2
7792329 Schofield et al. Sep 2010 B2
7843451 Lafon Nov 2010 B2
7855778 Yung et al. Dec 2010 B2
7859565 Schofield et al. Dec 2010 B2
7877175 Higgins-Luthman Jan 2011 B2
7881496 Camilleri et al. Feb 2011 B2
7914187 Higgins-Luthman et al. Mar 2011 B2
7914188 DeLine et al. Mar 2011 B2
7930160 Hosagrahara et al. Apr 2011 B1
7949486 Denny et al. May 2011 B2
7991522 Higgins-Luthman Aug 2011 B2
8017898 Lu et al. Sep 2011 B2
8064643 Stein et al. Nov 2011 B2
8082101 Stein et al. Dec 2011 B2
8100568 DeLine et al. Jan 2012 B2
8164628 Stein et al. Apr 2012 B2
8224031 Saito Jul 2012 B2
8233045 Luo et al. Jul 2012 B2
8254635 Stein et al. Aug 2012 B2
8300886 Hoffmann Oct 2012 B2
8378851 Stein et al. Feb 2013 B2
8386114 Higgins-Luthman Feb 2013 B2
8421865 Euler et al. Apr 2013 B2
8452055 Stein et al. May 2013 B2
8534887 DeLine et al. Sep 2013 B2
8553088 Stein et al. Oct 2013 B2
9018577 Lu Apr 2015 B2
9972100 Lu May 2018 B2
10726578 Lu Jul 2020 B2
11328447 Lu May 2022 B2
20020005778 Breed et al. Jan 2002 A1
20020011611 Huang et al. Jan 2002 A1
20020015153 Downs Feb 2002 A1
20020044065 Quist et al. Apr 2002 A1
20020113873 Williams Aug 2002 A1
20020159270 Lynam et al. Oct 2002 A1
20030103142 Hitomi et al. Jun 2003 A1
20030137586 Lewellen Jul 2003 A1
20030222982 Hamdan et al. Dec 2003 A1
20030227777 Schofield Dec 2003 A1
20040008410 Stam et al. Jan 2004 A1
20040012488 Schofield Jan 2004 A1
20040016870 Pawlicki et al. Jan 2004 A1
20040032321 McMahon et al. Feb 2004 A1
20040051634 Schofield et al. Mar 2004 A1
20040114381 Salmeen et al. Jun 2004 A1
20040128065 Taylor et al. Jul 2004 A1
20040164228 Fogg et al. Aug 2004 A1
20040200948 Bos et al. Oct 2004 A1
20050078389 Kulas et al. Apr 2005 A1
20050134966 Burgner Jun 2005 A1
20050134983 Lynam Jun 2005 A1
20050146792 Schofield et al. Jul 2005 A1
20050169003 Lindahl et al. Aug 2005 A1
20050195488 McCabe et al. Sep 2005 A1
20050200700 Schofield et al. Sep 2005 A1
20050219852 Stam et al. Oct 2005 A1
20050232469 Schofield et al. Oct 2005 A1
20050237385 Kosaka et al. Oct 2005 A1
20050254688 Franz Nov 2005 A1
20050264891 Uken et al. Dec 2005 A1
20060018511 Stam et al. Jan 2006 A1
20060018512 Stam et al. Jan 2006 A1
20060028731 Schofield et al. Feb 2006 A1
20060050018 Hutzel et al. Mar 2006 A1
20060091813 Stam et al. May 2006 A1
20060103727 Tseng May 2006 A1
20060106518 Stam May 2006 A1
20060157639 Shaffer et al. Jul 2006 A1
20060164230 DeWind et al. Jul 2006 A1
20060250501 Wildmann et al. Nov 2006 A1
20070023613 Schofield et al. Feb 2007 A1
20070024724 Stein et al. Feb 2007 A1
20070104476 Yasutomi et al. May 2007 A1
20070109406 Schofield et al. May 2007 A1
20070109651 Schofield et al. May 2007 A1
20070109652 Schofield et al. May 2007 A1
20070109653 Schofield et al. May 2007 A1
20070109654 Schofield et al. May 2007 A1
20070115357 Stein et al. May 2007 A1
20070120657 Schofield et al. May 2007 A1
20070176080 Schofield et al. Aug 2007 A1
20070242339 Bradley Oct 2007 A1
20080043099 Stein et al. Feb 2008 A1
20080147321 Howard et al. Jun 2008 A1
20080180529 Taylor et al. Jul 2008 A1
20080192132 Bechtel et al. Aug 2008 A1
20080266396 Stein Oct 2008 A1
20090113509 Tseng et al. Apr 2009 A1
20090160987 Bechtel et al. Jun 2009 A1
20090190015 Bechtel et al. Jul 2009 A1
20090256938 Bechtel et al. Oct 2009 A1
20090290032 Zhang et al. Nov 2009 A1
20100045797 Schofield et al. Feb 2010 A1
20110216201 McAndrew et al. Sep 2011 A1
20110285850 Lu et al. Nov 2011 A1
20110292668 Schofield Dec 2011 A1
20120045112 Lundblad et al. Feb 2012 A1
20120069185 Stein Mar 2012 A1
20120200707 Stein et al. Aug 2012 A1
20120314071 Rosenbaum et al. Dec 2012 A1
20120320209 Vico et al. Dec 2012 A1
20130141580 Stein et al. Jun 2013 A1
20130147957 Stein Jun 2013 A1
20130169812 Lu et al. Jul 2013 A1
20130286193 Pflug Oct 2013 A1
20140043473 Gupta et al. Feb 2014 A1
20140063254 Shi et al. Mar 2014 A1
20140098229 Lu et al. Apr 2014 A1
20140247352 Rathi et al. Sep 2014 A1
20140247354 Knudsen Sep 2014 A1
20140320658 Pliefke Oct 2014 A1
20140333729 Pflug Nov 2014 A1
20140347486 Okouneva Nov 2014 A1
20140350834 Turk Nov 2014 A1
Foreign Referenced Citations (107)
Number Date Country
2133182 Jan 1973 DE
2808260 Aug 1979 DE
2931368 Feb 1981 DE
2946561 May 1981 DE
3041692 May 1981 DE
3248511 Jul 1984 DE
4107965 Sep 1991 DE
4118208 Nov 1991 DE
4139515 Jun 1992 DE
4123641 Jan 1993 DE
102004048400 Apr 2006 DE
0048506 Mar 1982 EP
0048810 Apr 1982 EP
0202460 Nov 1986 EP
0353200 Jan 1990 EP
0361914 Apr 1990 EP
0416222 Mar 1991 EP
0426503 May 1991 EP
0450553 Oct 1991 EP
0492591 Jul 1992 EP
0513476 Nov 1992 EP
0640903 Mar 1995 EP
0697641 Feb 1996 EP
0788947 Aug 1997 EP
0830267 Mar 1998 EP
1074430 Feb 2001 EP
1115250 Jul 2001 EP
2377094 Oct 2011 EP
2667325 Nov 2013 EP
2241085 Mar 1975 FR
2513198 Mar 1983 FR
2585991 Feb 1987 FR
2672857 Aug 1992 FR
2673499 Sep 1992 FR
2726144 Apr 1996 FR
934037 Aug 1963 GB
1535182 Dec 1978 GB
2029343 Mar 1980 GB
2119087 Nov 1983 GB
2137373 Oct 1984 GB
2137573 Oct 1984 GB
2156295 Oct 1985 GB
2233530 Jan 1991 GB
2244187 Nov 1991 GB
2255539 Nov 1992 GB
2267341 Dec 1993 GB
2327823 Feb 1999 GB
S5539843 Mar 1980 JP
56-030305 Mar 1981 JP
57173801 Oct 1982 JP
57208530 Dec 1982 JP
57208531 Dec 1982 JP
58-019941 Feb 1983 JP
S58110334 Jun 1983 JP
58209635 Dec 1983 JP
59-051301 Mar 1984 JP
59-051325 Mar 1984 JP
59114139 Jul 1984 JP
59133336 Jul 1984 JP
60-080953 May 1985 JP
60-166651 Aug 1985 JP
60212730 Oct 1985 JP
60261275 Nov 1985 JP
61-054942 Mar 1986 JP
6079889 Oct 1986 JP
62043543 Feb 1987 JP
S6216073 Apr 1987 JP
6272245 May 1987 JP
62122487 Jun 1987 JP
62122844 Jun 1987 JP
S62131837 Jun 1987 JP
6414700 Jan 1989 JP
01123587 May 1989 JP
H1168538 Jul 1989 JP
H236417 Aug 1990 JP
03-061192 Mar 1991 JP
3099952 Apr 1991 JP
04-002397 Oct 1991 JP
03284413 Dec 1991 JP
04114587 Apr 1992 JP
04-245866 Sep 1992 JP
05-000638 Jan 1993 JP
05050883 Mar 1993 JP
52-013113 Aug 1993 JP
61-007035 Apr 1994 JP
61-056638 Jun 1994 JP
6227318 Aug 1994 JP
07-04170 Jan 1995 JP
07105496 Apr 1995 JP
08166221 Jun 1996 JP
05-077657 Jul 1997 JP
2630604 Jul 1997 JP
06-069559 Aug 1999 JP
200274339 Mar 2002 JP
200383742 Mar 2003 JP
2003083742 Mar 2003 JP
20041658 Jan 2004 JP
1994019212 Sep 1994 WO
199427262 Nov 1994 WO
1996021581 Jul 1996 WO
1996038319 Dec 1996 WO
1997035743 Oct 1997 WO
1998014974 Apr 1998 WO
1998058450 Dec 1998 WO
1999014088 Mar 1999 WO
1999023828 May 1999 WO
2012143036 Oct 2012 WO
Non-Patent Literature Citations (34)
Entry
Achler et al., “Vehicle Wheel Detector using 2D Filter Banks,” IEEE Intelligent Vehicles Symposium of Jun. 2004.
Article entitled “Generation of Vision Technology,” published by VLSI Vision Limited, publication date unknown.
Article entitled "On-Chip CMOS Sensors for VLSI Imaging Systems," published by VLSI Vision Limited, 1991.
Behringer et al., “Simultaneous Estimation of Pitch Angle and Lane Width from the Video Image of a Marked Road,” pp. 966-973, Sep. 12-16, 1994.
Borenstein et al., “Where am I? Sensors and Method for Mobile Robot Positioning”, University of Michigan, Apr. 1996, pp. 2, 125-128.
Bow, Sing T., “Pattern Recognition and Image Preprocessing (Signal Processing and Communications)”, CRC Press, Jan. 15, 2002, pp. 557-559.
Broggi et al., “Automatic Vehicle Guidance: The Experience of the ARGO Vehicle”, World Scientific Publishing Co., 1999.
Broggi et al., “Multi-Resolution Vehicle Detection using Artificial Vision,” IEEE Intelligent Vehicles Symposium of Jun. 2004.
Dana H. Ballard and Christopher M. Brown, Computer Vision, Prentice-Hall, Englewood Cliffs, New Jersey, 5 pages, 1982.
Decision—Motions—Bd. R. 125(a), issued Aug. 29, 2006 in connection with Interference No. 105,325, which Involved U.S. Appl. No. 09/441,341, filed Nov. 16, 1999 by Schofield et al. and U.S. Pat. No. 5,837,994, issued to Stam et al.
Franke et al., “Autonomous driving approaches downtown”, Intelligent Systems and Their Applications, IEEE 13 (6), 40-48, Nov./Dec. 1999.
G. Wang, D. Renshaw, P.B. Denyer and M. Lu, CMOS Video Cameras, article, 1991, 4 pages, University of Edinburgh, UK.
Hamit, Francis “360-Degree Interactivity: New Video and Still Cameras Provide a Global Roaming Viewpoint,” Advanced Imaging, Mar. 1997, p. 50.
IEEE 100—The Authoritative Dictionary of IEEE Standards Terms, 7th Ed. (2000).
Japanese Article “Television Image Engineering Handbook, The Institute of Television Engineers of Japan”, Jan. 17, 1981.
Johannas, Laura “A New Microchip Ushers in Cheaper Digital Cameras,” The Wall Street Journal, Aug. 21, 1998, p. B1.
Kastrinaki et al., “A survey of video processing techniques for traffic applications”.
Philomin et al., "Pedestrian Tracking from a Moving Vehicle".
Pollastri, F., "Projection Center Calibration by Motion", Pattern Recognition Letters, Elsevier, Amsterdam, NL, vol. 14, No. 12, Dec. 1, 1993, pp. 975-983, XP002363595, ISSN: 0167-8655.
Reexamination Control No. 90/007,519, Reexamination of U.S. Pat. No. 6,222,447, issued to Schofield et al.
Reexamination Control No. 90/007,520, Reexamination of U.S. Pat. No. 5,949,331, issued to Schofield et al.
Reexamination Control No. 90/011,477, Reexamination of U.S. Pat. No. 5,949,331, issued to Schofield et al.
Reexamination Control No. 90/011,478, Reexamination of U.S. Pat. No. 6,222,447, issued to Schofield et al.
Sahli et al., “A Kalman Filter-Based Update Scheme for Road Following,” IAPR Workshop on Machine Vision Applications, pp. 5-9, Nov. 12-14, 1996.
Search Report from European Patent Application No. EP96916533.
Sun et al., “On-road vehicle detection using optical sensors: a review”.
Tokimaru et al., “CMOS Rear-View TV System with CCD Camera”, National Technical Report vol. 34, No. 3, pp. 329-336, Jun. 1988 (Japan).
Van Leeuwen et al., “Motion Estimation with a Mobile Camera for Traffic Applications”, IEEE, US, vol. 1, Oct. 3, 2000, pp. 58-63.
Van Leeuwen et al., "Motion Interpretation for In-Car Vision Systems", IEEE, US, vol. 1, Sep. 30, 2002, pp. 135-140.
Van Leeuwen et al., “Real-Time Vehicle Tracking in Image Sequences”, IEEE, US, vol. 3, May 21, 2001, pp. 2049-2054, XP010547308.
Van Leeuwen et al., “Requirements for Motion Estimation in Image Sequences for Traffic Applications”, IEEE, US, vol. 1, May 24, 1999, pp. 145-150, XP010340272.
Vlacic et al. (Eds.), "Intelligent Vehicle Technologies, Theory and Applications", Society of Automotive Engineers Inc., edited by SAE International, 2001.
Wang et al., CMOS Video Cameras, article, 1991, 4 pages, University of Edinburgh, UK.
Zheng et al., “An Adaptive System for Traffic Sign Recognition,” IEEE Proceedings of the Intelligent Vehicles '94 Symposium, pp. 165-170 (Oct. 1994).
Related Publications (1)
Number Date Country
20220262038 A1 Aug 2022 US
Provisional Applications (1)
Number Date Country
60956633 Aug 2007 US
Divisions (2)
Number Date Country
Parent 15978435 May 2018 US
Child 16947270 US
Parent 14694226 Apr 2015 US
Child 15978435 US
Continuations (4)
Number Date Country
Parent 16947270 Jul 2020 US
Child 17662447 US
Parent 13776094 Feb 2013 US
Child 14694226 US
Parent 13204791 Aug 2011 US
Child 13776094 US
Parent 12190698 Aug 2008 US
Child 13204791 US