The present invention relates to automatic headlamp control systems for vehicles and, more particularly, to automatic headlamp control systems that automatically adjust the beam illumination state of a vehicle headlamp, such as between higher and lower beam illumination states of the vehicle headlamps.
Automotive forward lighting systems are evolving in several areas. These include the use of image-based sensors, typically referred to as Automatic High Beam (AHB) control systems, to maximize the use of high beam road illumination when appropriate; the use of steerable beam systems, typically referred to as Adaptive Front Lighting (AFL) systems, to provide a greater range of beam pattern options, particularly for driving on curved roads or during turn maneuvers, wherein the beam pattern may be biased or supplemented in the direction of the curve or turn; and the combination of such AHB and AFL systems.
Automatic high beam control systems are known that utilize an optical system, an image sensor, and signal processing including spectral, spatial and temporal techniques to determine ambient lighting conditions, the road environment, and the presence of other road users in order to automatically control the selection of the appropriate forward lighting state such that user forward vision is optimized while minimizing the impact of headlamp-caused glare on other road users in all lighting conditions. Examples of such systems are described in U.S. Pat. Nos. 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 6,396,397; 6,822,563 and 7,004,606, which are hereby incorporated herein by reference in their entireties.
While AHB systems that utilize the features and concepts described within the above identified U.S. patents have achieved performance levels that have resulted in considerable commercial success, it is desired to provide additional features and techniques, which may increase the utility, improve the performance, facilitate the manufacture, and simplify the installation of such systems.
The present invention provides an automatic headlamp control system that is operable to automatically control or adjust the beam illumination state of a vehicle's headlamps, such as from one beam illumination state (such as a lower beam illumination state) to another or different beam illumination state (such as a higher beam illumination state). The headlamp control system is operable to determine when the vehicle is traveling along a substantially curved section of road, such as an on-ramp or off-ramp of an expressway or the like, and may adjust the image processing and/or headlamp beam illumination state decision responsive to such a determination. Optionally, the system may be operable to detect when the vehicle is approaching or entering or driving along a construction zone, and may adjust the headlamp beam illumination state decision or trigger/switch threshold responsive to such detection. Optionally, the system may be adjustable to tailor the image processing (such as by adjusting the algorithm or decision thresholds or the like) to the particular vehicle equipped with the headlamp control system and/or to the particular type of headlamp of the equipped vehicle, such as to more readily discern or discriminate between detected oncoming headlamps of approaching vehicles and reflections of light emitted by the headlamps of the equipped vehicle. Optionally, the system may be operable to determine if the camera or image sensor is blocked or partially blocked (such as by debris or dirt or ice or the like at the vehicle windshield), and may adjust the determination parameters depending on the location and/or driving conditions of the vehicle.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an automatic vehicle headlamp control system or vehicle headlamp dimming control system 12, which includes a forward facing camera or image sensor 14 that senses light from a scene forward of vehicle 10, an imaging processor or control circuit 13 that receives data from image sensor 14 and processes the image data, and a vehicle lighting control logic module 16 that exchanges data with control circuit 13 and controls the headlamps 18 of vehicle 10 (such as by changing or retaining the beam illumination state of the headlamps, such as between a higher beam state and a lower beam state) for the purpose of modifying the beam illumination state of the headlamps of the vehicle.
The imaging sensor for the headlamp control of the present invention may comprise any suitable sensor, and may utilize various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like, such as the types described in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,715,093; 5,877,897; 6,498,620; 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 6,806,452; 6,396,397; 6,822,563; 6,946,978; 7,038,577 and/or 7,004,606, and/or U.S. patent application Ser. No. 12/190,698, filed Aug. 13, 2008 and published Feb. 19, 2009 as U.S. Patent Publication No. US-2009-0045323, and/or U.S. patent application Ser. No. 11/315,675, filed Dec. 22, 2005 and published Aug. 17, 2006 as U.S. Patent Publication No. US-2006-0184297A1, and/or U.S. provisional application, Ser. No. 61/083,222, filed Jul. 24, 2008, and/or PCT Application No. PCT/US2008/076022, filed Sep. 11, 2008, and published Mar. 19, 2009 as International Publication No. WO 2009036176, and/or PCT Application No. PCT/US2008/078700, filed Oct. 3, 2008, and published Apr. 9, 2009 as International Publication No. WO 2009/046268, and/or PCT Application No. PCT/US2007/075702, filed Aug. 10, 2007, and published Feb. 28, 2008 as PCT Publication No. WO 2008/024639, and/or PCT Application No. PCT/US2003/036177, filed Nov. 14, 2003, and published Jun. 3, 2004 as PCT Publication No. WO 2004/047421 A3, which are all hereby incorporated herein by reference in their entireties. The control 12 may include a lens element or optic 20 between the image sensor and the forward scene to substantially focus the scene at an image plane of the image sensor. Optionally, the optic may comprise an asymmetric optic, which focuses a generally central portion of the scene onto the image sensor, while providing classical distortion on the periphery of the scene or field of view. 
The imaging device and control and image processor may comprise any suitable components, and may utilize aspects of the vision systems described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454 and 6,824,281, which are all hereby incorporated herein by reference in their entireties. The imaging device and/or control may be part of or share components or circuitry with other image or imaging or vision systems of the vehicle, such as headlamp control systems and/or rain sensing systems and/or cabin monitoring systems and/or the like.
Such imaging sensors or cameras are pixelated imaging array sensors having a photosensing array 15 of a plurality of photon accumulating or photosensing light sensors or pixels 15a.
In order to take advantage of the environmental protection offered by the vehicle cabin, the frequently cleaned optically clear path offered by the vehicle windshield (which is cleaned or wiped by the windshield wipers when the wipers are activated), and the relatively high vantage point offered at the upper region or top of the windshield, the headlamp control system 12 or at least the imaging device or camera 14 is preferably mounted centrally at or near the upper inside surface of the front windshield of a vehicle and with a forward field of view through the region cleaned or wiped by the windshield wipers.
Optionally, and desirably, the control system may be operable to determine when there is a blockage or partial blockage in front of the forward facing camera or image sensor, such as when dirt or ice or snow or debris accumulates on the windshield in the area in front of the camera. The control system may be operable to determine if some or all of the pixels of the imaging array are blocked (such as via an object or dirt or debris at the vehicle windshield or the like) and may adapt the image processing accordingly or notify or alert the driver of the vehicle that such blockage has been detected. The blockage or partial blockage detection algorithm or algorithms may vary depending on the driving conditions or the like. For example, a partial or total daytime blockage algorithm may be run during daytime lighting conditions, such as in response to an ambient light sensor or a user input or on demand, while a partial or total nighttime blockage algorithm may be run when the ambient condition is indicative of nighttime lighting conditions, such as by utilizing aspects of the systems described in U.S. patent application Ser. No. 12/190,698, filed Aug. 13, 2008, now U.S. Pat. No. 8,017,898, which is hereby incorporated herein by reference in its entirety.
When the total blockage algorithm is run, the number of pixels above an intensity threshold may be counted for a captured image or frame, and if, over a number of captured frames, the count of the bright pixels is continuously below a threshold level, the control system may conclude that the imaging device is substantially or totally blocked. When the partial blockage algorithm is run, the control system may perform region-based processing to take into account intensity variations in different regions of the pixelated imaging array. Based on intensity variations with neighboring or adjacent regions and the continuity of the variations over time, the control may determine that the imaging array is partially blocked. The control system may process the blocked pixel region in a night mode to reduce or substantially preclude the possibility of a false blockage detection.
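The total blockage counting logic described above can be sketched as follows. This is a minimal illustrative sketch, not the production algorithm: the function name, the frame representation (an iterable of pixel intensities), and all threshold values are assumptions for illustration only.

```python
def is_totally_blocked(frames, intensity_threshold, min_bright_pixels, num_frames):
    """Declare total blockage when the count of pixels above an intensity
    threshold stays below min_bright_pixels for num_frames consecutive
    captured frames. All parameters are illustrative assumptions."""
    consecutive = 0
    for frame in frames:  # each frame: an iterable of pixel intensities
        bright = sum(1 for p in frame if p > intensity_threshold)
        if bright < min_bright_pixels:
            consecutive += 1
            if consecutive >= num_frames:
                return True
        else:
            # A sufficiently bright frame resets the counter.
            consecutive = 0
    return False
```

A partial blockage check would instead apply region-based processing, comparing intensity variations between neighboring regions over time, as described above.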
If either partial or total blockage is detected or determined, the system may adapt the image processing to accommodate the blocked pixels, or the system may alert the driver of the vehicle that the pixels are blocked so that the driver or user may unblock the imaging device (such as via cleaning the windshield of the vehicle), or the system may actuate the vehicle windshield wipers to clear the windshield at the imaging device or the like, or the system may actuate a blower system (such as a defogger system or the like) of the vehicle to direct or force or blow air toward the detected blockage to clear the windshield or window or area in the forward field of view of the imaging device. Optionally, the control system may detect that at least a portion of the imaging device or photosensor array is blocked and may switch to a lower or low beam mode in response to the blockage detection (so as to allow the system to confirm the existence of the blockage without the high beams on during this period of time), and the system may at least one of (a) alert the driver of the subject vehicle of the detected blockage so that he or she can clean the windshield or sensor or otherwise remove the blockage or actuate the wipers and/or related system of the vehicle to remove the blockage; (b) automatically actuate a wiper (such as the windshield wipers) of the vehicle to remove the blockage from the forward field of view of the imaging device; and (c) automatically actuate a blower system of the vehicle to remove or dissipate the blockage from the forward field of view. The control system may also detect that the blockage has been removed from the forward field of view and may resume the normal functionality of the headlamp control system and/or the wiper system of the vehicle and/or the blower system of the vehicle.
Optionally, the control system of the present invention may be operable to adjust or reconfigure the processing or algorithms for detecting sensor blockage in response to a low temperature detection, because ice or snow may linger at the camera location, where the defrosting blower may not provide sufficient air flow to melt such ice and snow on the windshield. This provides enhanced blockage detection during cold ambient temperature conditions where ice or snow may accumulate on the windshield in front of the sensor, and limits high or higher beam actuation during conditions where the camera may be blocked and thus not detecting leading or oncoming vehicles. Such cold weather blockage of the sensor may otherwise result in high beam flashing of other drivers, as the camera or sensor detects nothing and the system concludes that no vehicles are present in front of the equipped vehicle.
Thus, the control system of the present invention may use an outside air temperature input and may switch to cold weather processing or a cold weather algorithm when the air temperature is detected to be below a threshold temperature (such as near 32 degrees F. or thereabouts), in order to enhance blockage detection of snow or ice that typically occurs in cold weather conditions. For example, the windshield may be blocked by frost or snow or ice from cold night temperatures, and although the morning temperatures may be warmer than 32 degrees F., the blockage may still remain; thus, the threshold temperature may be close to, but above, 32 degrees F. The control system may also monitor detection behavior and switch to a constant lower beam illumination state when the system detects evidence of poor or erroneous detection behavior. Optionally, for example, the control system may switch to an initial lower beam state when the temperature is below the threshold temperature level and may exit the initial lower beam state (or other situations where the system is operating at a constant lower beam illumination state) in response to monitoring of the detections when the monitoring evidences good or accurate detection behavior (and/or optionally following a period of time sufficient for the vehicle climate control system to have melted any ice or snow that may be or may have been present on the windshield), whereby the system may return to normal automatic behavior or performance.
For example, the control system may be responsive to a temperature input that is indicative of the outside or ambient temperature at the vehicle. When the outside temperature falls below a specified or threshold level, the system may enable enhancements to the current blockage detection algorithms. For example, the system may enable more-aggressive blockage detection parameters (to transition to a ‘blocked’ state or mode quicker), or a delay time may be added to the ‘blocked’ state prior to enabling transition to the ‘unblocked/ready’ state. The minimum blockage time (the time a blockage or low light is detected by the sensor before the system recognizes a blocked or partially blocked condition) may be increased by an incremental amount each time the system transitions from its ‘waiting to clear’ state back to its ‘blocked’ state. Optionally, the system may count the number of blockage or weather mode events that occur over a period of time and may adjust the image processing and/or control of the headlamps in response to such counting.
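The temperature-dependent parameter selection described above might be sketched as follows. The function name, the parameter dictionary, and every numeric value here are hypothetical placeholders, not calibrations from the specification; note that a missing temperature signal is treated the same as a warm reading, per the default behavior described below.

```python
FREEZING_THRESHOLD_F = 33.0  # hypothetical threshold just above 32 F

def blockage_params(outside_temp_f):
    """Select blockage-detection parameters from the ambient temperature.
    Returns more-aggressive blockage entry and a delayed exit in cold
    weather. All values are illustrative assumptions."""
    if outside_temp_f is None or outside_temp_f >= FREEZING_THRESHOLD_F:
        # No temperature signal defaults to normal (warm-weather) operation.
        return {"min_blockage_time_s": 10.0, "unblock_delay_s": 0.0}
    # Cold weather: recognize a blocked state sooner, leave it more slowly.
    return {"min_blockage_time_s": 5.0, "unblock_delay_s": 30.0}
```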
When the temperature is above a specified or threshold level, the system may revert to standard or higher temperature blockage detection parameters, revert to standard “blocked to unblocked/ready” transition delay, and/or suspend counting of the number of blockage or weather mode events that occur over a period of time (but the system may not clear the count). If a temperature signal is not received by the control system, the control system or algorithms may default to their normal operation (in other words, the system may interpret a “no temperature signal” in the same manner as a temperature signal that is indicative of a detected temperature that is greater than the low temperature threshold).
When a specified or threshold number of blockage or weather mode events occur, the control system may adjust or configure the transition to the ‘blocked’ state or mode for the remainder of the ignition cycle of the equipped vehicle (if the duration threshold or time period is set to the vehicle ignition cycle) or until the detected ambient temperature is greater than a high temperature threshold (if the duration threshold is set to a temperature threshold). The number of blockages and weather events may be reset with every power-on reset of the vehicle and/or control system.
If the temperature blockage detection configuration is enabled, the control system will not perform automatic high beam activations when the temperature falls below a specified or threshold level. In such applications, the system may return or be switched to automatic control of the headlamps when the detected temperature is greater than a higher temperature threshold (a temperature threshold that is greater than the low temperature threshold that triggers a switch to a constant lower beam illumination state). The temperature shutdown status may be reset with each power-on reset of the vehicle and/or system.
Optionally, the system may include a blockage detection ‘supervisory’ algorithm (configurable on/off) that is operable to monitor a degradation in detection distances and transition to the ‘blocked’ state after a specified configurable number of ‘short’ detections. While in the ‘blocked’ state, the system may continue to monitor detection distances and transition to the ‘unblocked’ state after a specified configurable number of ‘long’ detections. When the supervisory algorithm is configured or activated or on, the supervisory algorithm may run continuously, independent of the “outside air temperature” signal and the “temperature blockage duration” setting. Optionally, additional supplier-range DIDs may be added for the threshold number of ‘short’ detections required prior to transitioning to the ‘blocked’ state and to define the parametric ‘short’ detection threshold, and similar DIDs may be added for the threshold number of ‘long’ detections required prior to transitioning out of the ‘blocked’ state and to define the parametric ‘long’ detection threshold.
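The supervisory state machine described above can be sketched as a small class. The class name, the consecutive-count behavior, and the default distances and counts are illustrative assumptions standing in for the configurable DID values mentioned in the text.

```python
class BlockageSupervisor:
    """Sketch of the supervisory algorithm: enter 'blocked' after a
    configurable number of consecutive 'short' detections, and leave it
    after a configurable number of consecutive 'long' detections.
    Thresholds model the supplier-configurable DIDs; values are assumed."""

    def __init__(self, short_limit=5, long_limit=3,
                 short_dist_m=40.0, long_dist_m=120.0):
        self.short_limit = short_limit    # short detections to enter 'blocked'
        self.long_limit = long_limit      # long detections to exit 'blocked'
        self.short_dist_m = short_dist_m  # parametric 'short' threshold
        self.long_dist_m = long_dist_m    # parametric 'long' threshold
        self.blocked = False
        self._shorts = 0
        self._longs = 0

    def observe(self, detection_distance_m):
        """Feed one detection distance; returns the current blocked state."""
        if not self.blocked:
            if detection_distance_m < self.short_dist_m:
                self._shorts += 1
                if self._shorts >= self.short_limit:
                    self.blocked = True
                    self._longs = 0
            else:
                self._shorts = 0
        else:
            if detection_distance_m > self.long_dist_m:
                self._longs += 1
                if self._longs >= self.long_limit:
                    self.blocked = False
                    self._shorts = 0
            else:
                self._longs = 0
        return self.blocked
```

Because the counts are consecutive, an occasional normal-range detection resets progress toward either transition, which keeps the state from toggling on isolated outliers.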
Optionally, the control system may include other blockage detection algorithms or parameters depending on the driving conditions of the vehicle. For example, during end of line vehicle testing at a vehicle assembly plant, the vehicle is “driven” on rollers to check engine performance and the like. When a vehicle equipped with a headlamp control system of the types described herein is so tested, the control system may falsely or erroneously detect a partial blockage condition due to the low lighting conditions at the assembly plant and the “driving” of the vehicle on the rollers (which the system may detect as the vehicle being driven along a road while the image sensor detects the same low light pattern, which may be indicative of a blocked or partially blocked sensor), whereby the system may switch to the lower beam illumination state. Such false blockage detections may result in warranty issues and/or further testing and evaluation of the vehicle and/or headlamp control system.
Thus, the automatic headlamp control system of the present invention may utilize a blockage detection algorithm that is tuned to recognize such a condition and not generate a false blockage detection during such end of line testing. For example, the control system may process the captured image data for different parameters to reduce the possibility of a false blockage detection, such as by altering the comparison of different regions or areas of the captured image data during such testing. Optionally, the system may function to limit blockage testing during an initial start-up period or the like.
As discussed above, during normal vehicle operation, the headlamp control system is operable to adjust the beam illumination state of the vehicle headlamps responsive to processing of the image data captured by the forward facing camera. Optionally, the control system may adjust the decision thresholds or parameters responsive to the image processing or responsive to other inputs. For example, the control system may adjust the image processing responsive to a determination that the equipped vehicle is traveling along a curved road or section of road that is indicative of or representative of an on-ramp or off-ramp of an expressway or freeway or the like.
During normal driving conditions (such as along a road having leading traffic ahead of the equipped vehicle and oncoming traffic ahead of and in a lane adjacent to the lane traveled by the equipped vehicle), the control system may switch to the higher beam state responsive to a determination that there are no leading or oncoming vehicles ahead of the equipped vehicle (by determining if detected light sources in the field of view ahead of the equipped vehicle are headlamps or taillights of other vehicles). The switch to the higher beam state may occur following a period of time during which no oncoming headlamps or leading taillights are detected by the control system (to reduce the possibility of rapid switching of the headlamp state between the higher and lower beam states).
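The delayed switching behavior described above is a form of hysteresis, which might be sketched as follows. The class, its method names, and the delay value are illustrative assumptions; the real system's timing and state handling are not specified here.

```python
class BeamController:
    """Sketch of delayed high-beam switching: drop to the lower beam state
    immediately when lamps are detected, and return to the higher beam
    state only after clear_delay_s seconds with no detections. The delay
    value and state names are illustrative assumptions."""

    def __init__(self, clear_delay_s=3.0):
        self.clear_delay_s = clear_delay_s
        self.state = "low"
        self._last_detection_s = None

    def update(self, lamps_detected, now_s):
        """Call once per processed frame with the detection result."""
        if lamps_detected:
            # Oncoming headlamps or leading taillights: lower beam at once.
            self._last_detection_s = now_s
            self.state = "low"
        elif (self._last_detection_s is None
              or now_s - self._last_detection_s >= self.clear_delay_s):
            # The scene has stayed clear long enough to raise the beams.
            self.state = "high"
        return self.state
```

The asymmetric timing (instant lowering, delayed raising) reduces rapid toggling between beam states when traffic appears and disappears at the edge of the camera's range.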
When the vehicle equipped with headlamp control system 12 is driven along a curved section of road, such as a curved on-ramp or off-ramp of a freeway or expressway, taillights of leading vehicles may not be detected by the forward facing camera 14 of the equipped vehicle 10, because the leading vehicle may be far enough ahead of the equipped vehicle along the curved road so that the leading taillights are outside of the field of view of the forward facing camera. In such a driving condition or situation, it is desirable to limit switching to the higher beam illumination state of the headlamps of the equipped vehicle because the higher beam illumination may be bothersome to a driver of an undetected leading vehicle that is ahead of the equipped vehicle along the curved road.
Thus, the automatic headlamp control system of the present invention is operable to detect the driving conditions (such as the road curvature and/or steering wheel angle) and, responsive to a detection of a driving condition representative of a substantial curve in the road (such as a curve that results in the vehicle changing direction by about 270 degrees or thereabouts, which may be indicative of a freeway on-ramp or off-ramp or the like), may adjust the decision threshold to limit or delay switching to a different beam illumination state, such as from a lower beam state to a higher beam state. For example, the system may determine the road curvature responsive to image processing (where the camera may capture lane markers or the like along the center or side of the road surface) or a steering wheel sensor (that detects the steering wheel angle and/or turning angle of the vehicle and/or that may detect or determine that the vehicle is turning at a substantial turning angle or maintains a similar turning angle for a substantial distance or period of time) or a global positioning system (GPS) or navigational system or the like (that detects a geographical location of the vehicle and can determine if the vehicle is on an on-ramp or off-ramp or the like and/or that may determine the turning radius of the vehicle and/or the distance or period of time during which the vehicle is traveling at such a turning radius). 
When a threshold degree of road curvature is detected (such as a threshold turning radius and/or threshold distance along which the vehicle travels along a detected curved section of road), the control system may limit or delay switching to a higher beam state such that the headlamps of the equipped vehicle remain in the lower beam state during the turn (such as until the vehicle completes the turn or the steering wheel angle is returned toward a zero angle or straight path of travel) to limit or substantially preclude glare to a driver of a vehicle that may be in front of the equipped vehicle (yet outside of the field of view of the forward facing camera of the equipped vehicle) along the curved road or section of road.
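The curve-based inhibit decision above can be sketched as a simple predicate. The function name, the inputs (accumulated heading change and turn radius, however derived from steering angle, image processing, or GPS), and both threshold values are illustrative assumptions.

```python
def should_inhibit_high_beam(heading_change_deg, turn_radius_m,
                             radius_threshold_m=100.0,
                             heading_threshold_deg=45.0):
    """Hold the lower beam state while the vehicle follows a tight,
    sustained curve (such as a freeway on-ramp or off-ramp), where a
    leading vehicle may be outside the camera's field of view.
    Thresholds are illustrative assumptions, not calibrated values."""
    return (turn_radius_m < radius_threshold_m
            and abs(heading_change_deg) > heading_threshold_deg)
```

Requiring both a tight radius and a sustained heading change filters out brief lane changes or gentle bends, which should not suppress the normal high-beam decision.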
Optionally, the headlamp control system may be operable to adjust the threshold switching parameter responsive to a situation where a detected taillight moves to one side of the equipped vehicle (such as to the right side as the leading vehicle travels along a curve to the right) and when such detected movement and subsequent non-detection of the leading vehicle taillights is followed by a detection or determination that the equipped vehicle is traveling along a curvature in the road (such as a curve to the right and thus following the direction that the leading vehicle was last headed when still detected by the system), whereby the control system may limit the switching to the higher beam illumination state due to the likelihood that the leading vehicle is still ahead of the equipped vehicle, but is not detectable along the road curvature. Such a delayed or limited high beam switching function provides enhanced performance of the headlamp control system and limits activating the higher beam illumination state in situations where such activation would lead to glare to the driver of a leading vehicle.
Optionally, the automatic headlamp control system may be operable to detect when the equipped vehicle is approaching or entering or driving through a construction zone, such as responsive to traffic sign recognition (such as by detecting orange signs or the like) or character recognition (such as by determining that a detected sign includes characters or indicia or text that is/are indicative of the vehicle approaching or driving through or along a construction zone) or object detection and recognition (such as detection and identification of barrels or cones or the like that are typically disposed at construction zones) or spectral recognition (such as by recognizing or discriminating between orange and red) or spatial recognition (such as by recognizing or discerning construction zone signs by the location and/or number of signs along the side of the road being traveled) or the like. If the system detects that the equipped vehicle is at or in a construction zone and does not detect taillights of leading vehicles ahead of the equipped vehicle, the system may switch the headlamps to a different beam illumination state, such as to a higher beam illumination state.
Because construction zone signs are typically orange, they present potential difficulties to the system in discriminating between the construction zone signs and taillights of leading vehicles (which are red) when reflection off of the signs is detected at a distance ahead of the equipped vehicle. Thus, the image processing may be operable to discriminate between reflection of light off an orange sign and red light emitted by a vehicle taillight, and may make such a determination based on the color of the detected light source or object, the location of the detected light source or object relative to the equipped vehicle, a recognition of a reflection of the headlamps of the equipped vehicle (such as by superimposing a signature or code or pattern on an output of the headlamps of the equipped vehicle such as described in U.S. Pat. No. 7,004,606, which is hereby incorporated herein by reference in its entirety), or the like. Optionally, for example, the headlamp control system may detect a construction zone by any of the above approaches and/or via an output of a traffic sign recognition (TSR) system (that identifies construction zone signs), a lane departure warning (LDW) system (that may identify traffic shifts or lane changes and/or the like along a construction zone), a forward collision warning system or object detection system (that may identify objects of interest ahead of the equipped vehicle, such as traffic cones and/or barrels and/or the like that are typically disposed at construction zones), a GPS and/or navigating system (that may identify when the detected geographic location of the vehicle corresponds to a construction zone) and/or the like. 
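One ingredient of the color-based discrimination described above, separating red taillight emission from orange sign reflection, might be sketched with a crude green-to-red ratio test. This is only an illustration of the spectral idea: the function name, the ratio boundaries, and the use of raw RGB values are all assumptions, and a real system would combine color with the location, reflection-signature, and other cues listed above.

```python
def classify_light(r, g, b):
    """Rough spectral sketch: taillight red carries little green energy,
    while orange sign reflections carry noticeably more. The ratio
    boundaries are illustrative assumptions, not calibrated values."""
    if r == 0:
        return "other"  # no red component: not a taillight or orange sign
    green_to_red = g / r
    if green_to_red < 0.25:
        return "taillight_red"
    if green_to_red < 0.6:
        return "sign_orange"
    return "other"
```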
When it is detected or determined that the vehicle is approaching or at or in a construction zone, the system may discriminate between orange construction zone signs and leading taillights and may readily switch (so long as no leading taillights or oncoming headlamps are detected by the control system) to the higher beam illumination state (or remain at the higher beam illumination state) to provide enhanced illumination to the driver of the equipped vehicle while the equipped vehicle is driving through or along the construction zone. When the system subsequently detects that the equipped vehicle is exiting the construction zone, the headlamp control system may return to its normal operation and function to switch the beam illumination state between higher and lower beam illumination states responsive to detection of oncoming headlamps and/or leading taillights.
Optionally, automatic headlamp control system 12 may be operable to adjust the image processing to tailor or tune the image processing to the particular vehicle that is equipped with the automatic headlamp control system. For example, a common or universal automatic headlamp control system may be provided by a headlamp control system manufacturer or supplier (and such a common headlamp control system may be provided or supplied to one or more vehicle manufacturers for implementation on two or more vehicle programs), and the system may be adjusted or preset to a particular processing level or regime, depending on the type of headlamps used by the vehicle that is eventually equipped with the headlamp control system. The control system thus may be programmed or configured or adjusted, such as at a vehicle assembly plant or such as at a dealership or at an aftermarket installation facility or the like, to correlate the system with the type of headlamps (such as halogen headlamps, HID headlamps, light emitting diodes, and/or the like) of the equipped vehicle. For example, the calibration parameters for the control system (such as the decision making or switching parameters) may be selected depending on the type of headlamps of the vehicle.
Such an adjustment or configuration of the image processor and/or control enhances the system's ability to recognize reflections of light emitted by the equipped vehicle's headlamps and to discern such reflections from headlamps of other vehicles and/or taillights of other vehicles. For example, for headlamps that have a particular spectral signature, the control system or image processor may be configured (such as during an end of line calibration at the vehicle assembly plant) to tune the image processing to the equipped vehicle's headlamp color spectrum. The control system thus may be more sensitive (or less sensitive) to particular spectral regions and may be tuned or configured to recognize the equipped vehicle's headlamp spectral signature to enhance recognition of reflections of the equipped vehicle's headlamps off signs and the like. The system thus may be adjusted or configured to better detect red taillights by adjusting or tuning the system for the particular headlamps (since some types of headlamps may output more or less light in the red spectrum range than other types of headlamps).
Changes and modifications to the specifically described embodiments may be carried out without departing from the principles of the present invention, which is intended to be limited only by the scope of the appended claims as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application is a continuation of U.S. patent application Ser. No. 16/947,774, filed Aug. 17, 2020, now U.S. Pat. No. 11,511,668, which is a continuation of U.S. patent application Ser. No. 16/016,815, filed Jun. 25, 2018, now U.S. Pat. No. 10,744,940, which is a continuation of U.S. patent application Ser. No. 14/942,088, filed Nov. 16, 2015, now U.S. Pat. No. 10,005,394, which is a continuation of U.S. patent application Ser. No. 13/767,208, filed Feb. 14, 2013, now U.S. Pat. No. 9,187,028, which is a continuation of U.S. patent application Ser. No. 12/781,119, filed May 17, 2010, now U.S. Pat. No. 8,376,595, which claims the benefits of U.S. provisional application Ser. No. 61/178,565, filed May 15, 2009.
U.S. Patent Documents

Number | Name | Date | Kind |
---|---|---|---|
5182502 | Slotkowski et al. | Jan 1993 | A |
5184956 | Langlais et al. | Feb 1993 | A |
5189561 | Hong | Feb 1993 | A |
5193000 | Lipton et al. | Mar 1993 | A |
5193029 | Schofield et al. | Mar 1993 | A |
5204778 | Bechtel | Apr 1993 | A |
5208701 | Maeda | May 1993 | A |
5245422 | Borcherts et al. | Sep 1993 | A |
5253109 | O'Farrell et al. | Oct 1993 | A |
5255442 | Schierbeek et al. | Oct 1993 | A |
5276389 | Levers | Jan 1994 | A |
5285060 | Larson et al. | Feb 1994 | A |
5289182 | Brillard et al. | Feb 1994 | A |
5289321 | Secor | Feb 1994 | A |
5305012 | Faris | Apr 1994 | A |
5307136 | Saneyoshi | Apr 1994 | A |
5309137 | Kajiwara | May 1994 | A |
5313072 | Vachss | May 1994 | A |
5325096 | Pakett | Jun 1994 | A |
5325386 | Jewell et al. | Jun 1994 | A |
5329206 | Slotkowski et al. | Jul 1994 | A |
5331312 | Kudoh | Jul 1994 | A |
5336980 | Levers | Aug 1994 | A |
5341437 | Nakayama | Aug 1994 | A |
5351044 | Mathur et al. | Sep 1994 | A |
5355118 | Fukuhara | Oct 1994 | A |
5374852 | Parkes | Dec 1994 | A |
5386285 | Asayama | Jan 1995 | A |
5394333 | Kao | Feb 1995 | A |
5406395 | Wilson et al. | Apr 1995 | A |
5410346 | Saneyoshi et al. | Apr 1995 | A |
5414257 | Stanton | May 1995 | A |
5414461 | Kishi et al. | May 1995 | A |
5416313 | Larson et al. | May 1995 | A |
5416318 | Hegyi | May 1995 | A |
5416478 | Morinaga | May 1995 | A |
5424952 | Asayama | Jun 1995 | A |
5426294 | Kobayashi et al. | Jun 1995 | A |
5430431 | Nelson | Jul 1995 | A |
5434407 | Bauer et al. | Jul 1995 | A |
5440428 | Hegg et al. | Aug 1995 | A |
5444478 | Lelong et al. | Aug 1995 | A |
5451822 | Bechtel et al. | Sep 1995 | A |
5457493 | Leddy et al. | Oct 1995 | A |
5461357 | Yoshioka et al. | Oct 1995 | A |
5461361 | Moore | Oct 1995 | A |
5469298 | Suman et al. | Nov 1995 | A |
5471515 | Fossum et al. | Nov 1995 | A |
5475494 | Nishida et al. | Dec 1995 | A |
5497306 | Pastrick | Mar 1996 | A |
5498866 | Bendicks et al. | Mar 1996 | A |
5500766 | Stonecypher | Mar 1996 | A |
5510983 | Lino | Apr 1996 | A |
5515448 | Nishitani | May 1996 | A |
5521633 | Nakajima et al. | May 1996 | A |
5528698 | Kamei et al. | Jun 1996 | A |
5529138 | Shaw et al. | Jun 1996 | A |
5530240 | Larson et al. | Jun 1996 | A |
5530420 | Tsuchiya et al. | Jun 1996 | A |
5535314 | Alves et al. | Jul 1996 | A |
5537003 | Bechtel et al. | Jul 1996 | A |
5539397 | Asanuma et al. | Jul 1996 | A |
5541590 | Nishio | Jul 1996 | A |
5550677 | Schofield et al. | Aug 1996 | A |
5555555 | Sato et al. | Sep 1996 | A |
5567360 | Varaprasad et al. | Oct 1996 | A |
5568027 | Teder | Oct 1996 | A |
5574443 | Hsieh | Nov 1996 | A |
5581464 | Woll et al. | Dec 1996 | A |
5594222 | Caldwell | Jan 1997 | A |
5610756 | Lynam et al. | Mar 1997 | A |
5614788 | Mullins | Mar 1997 | A |
5619370 | Guinosso | Apr 1997 | A |
5632092 | Blank et al. | May 1997 | A |
5634709 | Iwama | Jun 1997 | A |
5642299 | Hardin et al. | Jun 1997 | A |
5648835 | Uzawa | Jul 1997 | A |
5650944 | Kise | Jul 1997 | A |
5660454 | Mori et al. | Aug 1997 | A |
5661303 | Teder | Aug 1997 | A |
5666028 | Bechtel et al. | Sep 1997 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5677851 | Kingdon et al. | Oct 1997 | A |
5699044 | Van Lente et al. | Dec 1997 | A |
5715093 | Schierbeek et al. | Feb 1998 | A |
5724316 | Brunts | Mar 1998 | A |
5737226 | Olson et al. | Apr 1998 | A |
5760826 | Nayar | Jun 1998 | A |
5760828 | Cortes | Jun 1998 | A |
5760931 | Saburi et al. | Jun 1998 | A |
5760962 | Schofield et al. | Jun 1998 | A |
5761094 | Olson et al. | Jun 1998 | A |
5765116 | Wilson-Jones et al. | Jun 1998 | A |
5781437 | Wiemer et al. | Jul 1998 | A |
5786772 | Schofield et al. | Jul 1998 | A |
5790403 | Nakayama | Aug 1998 | A |
5790973 | Blaker et al. | Aug 1998 | A |
5793308 | Rosinski et al. | Aug 1998 | A |
5793420 | Schmidt | Aug 1998 | A |
5796094 | Schofield et al. | Aug 1998 | A |
5798575 | O'Farrell et al. | Aug 1998 | A |
5823654 | Pastrick et al. | Oct 1998 | A |
5835255 | Miles | Nov 1998 | A |
5844505 | Van Ryzin | Dec 1998 | A |
5844682 | Kiyomoto et al. | Dec 1998 | A |
5845000 | Breed et al. | Dec 1998 | A |
5848802 | Breed et al. | Dec 1998 | A |
5850176 | Kinoshita et al. | Dec 1998 | A |
5850254 | Takano et al. | Dec 1998 | A |
5867591 | Onda | Feb 1999 | A |
5877707 | Kowalick | Mar 1999 | A |
5877897 | Schofield et al. | Mar 1999 | A |
5878370 | Olson | Mar 1999 | A |
5883739 | Ashihara et al. | Mar 1999 | A |
5884212 | Lion | Mar 1999 | A |
5890021 | Onoda | Mar 1999 | A |
5896085 | Mori et al. | Apr 1999 | A |
5899956 | Chan | May 1999 | A |
5910854 | Varaprasad et al. | Jun 1999 | A |
5914815 | Bos | Jun 1999 | A |
5923027 | Stam et al. | Jul 1999 | A |
5924212 | Domanski | Jul 1999 | A |
5929786 | Schofield et al. | Jul 1999 | A |
5940120 | Frankhouse et al. | Aug 1999 | A |
5949331 | Schofield et al. | Sep 1999 | A |
5956181 | Lin | Sep 1999 | A |
5959367 | O'Farrell et al. | Sep 1999 | A |
5959555 | Furuta | Sep 1999 | A |
5963247 | Banitt | Oct 1999 | A |
5971552 | O'Farrell et al. | Oct 1999 | A |
5986796 | Miles | Nov 1999 | A |
5990469 | Bechtel et al. | Nov 1999 | A |
5990649 | Nagao et al. | Nov 1999 | A |
6020704 | Buschur | Feb 2000 | A |
6049171 | Stam | Apr 2000 | A |
6066933 | Ponziana | May 2000 | A |
6084519 | Coulling et al. | Jul 2000 | A |
6087953 | DeLine et al. | Jul 2000 | A |
6097023 | Schofield et al. | Aug 2000 | A |
6097024 | Stam et al. | Aug 2000 | A |
6116743 | Hoek | Sep 2000 | A |
6139172 | Bos et al. | Oct 2000 | A |
6144022 | Tenenbaum et al. | Nov 2000 | A |
6154306 | Varaprasad et al. | Nov 2000 | A |
6172613 | DeLine et al. | Jan 2001 | B1 |
6175164 | O'Farrell et al. | Jan 2001 | B1 |
6175300 | Kendrick | Jan 2001 | B1 |
6178034 | Allemand et al. | Jan 2001 | B1 |
6198409 | Schofield et al. | Mar 2001 | B1 |
6201642 | Bos | Mar 2001 | B1 |
6222447 | Schofield et al. | Apr 2001 | B1 |
6227689 | Miller | May 2001 | B1 |
6250148 | Lynam | Jun 2001 | B1 |
6259412 | Duroux | Jul 2001 | B1 |
6266082 | Yonezawa et al. | Jul 2001 | B1 |
6266442 | Laumeyer et al. | Jul 2001 | B1 |
6285393 | Shimoura et al. | Sep 2001 | B1 |
6294989 | Schofield et al. | Sep 2001 | B1 |
6297781 | Turnbull et al. | Oct 2001 | B1 |
6302545 | Schofield et al. | Oct 2001 | B1 |
6310611 | Caldwell | Oct 2001 | B1 |
6313454 | Bos et al. | Nov 2001 | B1 |
6317057 | Lee | Nov 2001 | B1 |
6320176 | Schofield et al. | Nov 2001 | B1 |
6320282 | Caldwell | Nov 2001 | B1 |
6323942 | Bamji | Nov 2001 | B1 |
6326613 | Heslin et al. | Dec 2001 | B1 |
6333759 | Mazzilli | Dec 2001 | B1 |
6341523 | Lynam | Jan 2002 | B2 |
6353392 | Schofield et al. | Mar 2002 | B1 |
6370329 | Teuchert | Apr 2002 | B1 |
6392315 | Jones et al. | May 2002 | B1 |
6411204 | Bloomfield et al. | Jun 2002 | B1 |
6411328 | Franke et al. | Jun 2002 | B1 |
6420975 | DeLine et al. | Jul 2002 | B1 |
6424273 | Gutta et al. | Jul 2002 | B1 |
6428172 | Hutzel et al. | Aug 2002 | B1 |
6430303 | Naoi et al. | Aug 2002 | B1 |
6433676 | DeLine et al. | Aug 2002 | B2 |
6442465 | Breed et al. | Aug 2002 | B2 |
6477464 | McCarthy et al. | Nov 2002 | B2 |
6485155 | Duroux et al. | Nov 2002 | B1 |
6497503 | Dassanayake et al. | Dec 2002 | B1 |
6498620 | Schofield et al. | Dec 2002 | B2 |
6513252 | Schierbeek et al. | Feb 2003 | B1 |
6516664 | Lynam | Feb 2003 | B2 |
6523964 | Schofield et al. | Feb 2003 | B2 |
6539306 | Turnbull | Mar 2003 | B2 |
6547133 | Devries, Jr. et al. | Apr 2003 | B1 |
6553130 | Lemelson et al. | Apr 2003 | B1 |
6559435 | Schofield et al. | May 2003 | B2 |
6574033 | Chui et al. | Jun 2003 | B1 |
6580496 | Bamji et al. | Jun 2003 | B2 |
6589625 | Kothari et al. | Jul 2003 | B1 |
6590719 | Bos | Jul 2003 | B2 |
6594583 | Ogura et al. | Jul 2003 | B2 |
6611202 | Schofield et al. | Aug 2003 | B2 |
6611610 | Stam et al. | Aug 2003 | B1 |
6627918 | Getz et al. | Sep 2003 | B2 |
6636258 | Strumolo | Oct 2003 | B2 |
6648477 | Hutzel et al. | Nov 2003 | B2 |
6650455 | Miles | Nov 2003 | B2 |
6657410 | Berger et al. | Dec 2003 | B1 |
6672731 | Schnell et al. | Jan 2004 | B2 |
6674562 | Miles | Jan 2004 | B1 |
6678614 | McCarthy et al. | Jan 2004 | B2 |
6680792 | Miles | Jan 2004 | B2 |
6690268 | Schofield et al. | Feb 2004 | B2 |
6700605 | Toyoda et al. | Mar 2004 | B1 |
6703925 | Steffel | Mar 2004 | B2 |
6704621 | Stein et al. | Mar 2004 | B1 |
6710908 | Miles et al. | Mar 2004 | B2 |
6711474 | Treyz et al. | Mar 2004 | B1 |
6714331 | Lewis et al. | Mar 2004 | B2 |
6717610 | Bos et al. | Apr 2004 | B1 |
6735506 | Breed et al. | May 2004 | B2 |
6741377 | Miles | May 2004 | B2 |
6744353 | Sjonell | Jun 2004 | B2 |
6757109 | Bos | Jun 2004 | B2 |
6762867 | Lippert et al. | Jul 2004 | B2 |
6794119 | Miles | Sep 2004 | B2 |
6795221 | Urey | Sep 2004 | B1 |
6802617 | Schofield et al. | Oct 2004 | B2 |
6806452 | Bos et al. | Oct 2004 | B2 |
6819231 | Berberich et al. | Nov 2004 | B2 |
6822563 | Bos et al. | Nov 2004 | B2 |
6823241 | Shirato et al. | Nov 2004 | B2 |
6824281 | Schofield et al. | Nov 2004 | B2 |
6831261 | Schofield et al. | Dec 2004 | B2 |
6882287 | Schofield | Apr 2005 | B2 |
6889161 | Winner et al. | May 2005 | B2 |
6891563 | Schofield et al. | May 2005 | B2 |
6902284 | Hutzel et al. | Jun 2005 | B2 |
6909753 | Meehan et al. | Jun 2005 | B2 |
6946978 | Schofield | Sep 2005 | B2 |
6953253 | Schofield et al. | Oct 2005 | B2 |
6968736 | Lynam | Nov 2005 | B2 |
6975775 | Rykowski et al. | Dec 2005 | B2 |
6989736 | Berberich et al. | Jan 2006 | B2 |
7004593 | Weller et al. | Feb 2006 | B2 |
7004606 | Schofield | Feb 2006 | B2 |
7005974 | McMahon et al. | Feb 2006 | B2 |
7012727 | Hutzel et al. | Mar 2006 | B2 |
7030738 | Ishii | Apr 2006 | B2 |
7038577 | Pawlicki et al. | May 2006 | B2 |
7062300 | Kim | Jun 2006 | B1 |
7065432 | Moisel et al. | Jun 2006 | B2 |
7079017 | Lang et al. | Jul 2006 | B2 |
7085637 | Breed et al. | Aug 2006 | B2 |
7092548 | Laumeyer et al. | Aug 2006 | B2 |
7111968 | Bauer et al. | Sep 2006 | B2 |
7116246 | Winter et al. | Oct 2006 | B2 |
7123168 | Schofield | Oct 2006 | B2 |
7145519 | Takahashi et al. | Dec 2006 | B2 |
7149613 | Stam et al. | Dec 2006 | B2 |
7161616 | Okamoto et al. | Jan 2007 | B1 |
7167796 | Taylor et al. | Jan 2007 | B2 |
7195381 | Lynam et al. | Mar 2007 | B2 |
7202776 | Breed | Apr 2007 | B2 |
7205904 | Schofield | Apr 2007 | B2 |
7227459 | Bos et al. | Jun 2007 | B2 |
7227611 | Hull et al. | Jun 2007 | B2 |
7311406 | Schofield et al. | Dec 2007 | B2 |
7325934 | Schofield et al. | Feb 2008 | B2 |
7325935 | Schofield et al. | Feb 2008 | B2 |
7338177 | Lynam | Mar 2008 | B2 |
7339149 | Schofield et al. | Mar 2008 | B1 |
7344261 | Schofield et al. | Mar 2008 | B2 |
7355524 | Schofield | Apr 2008 | B2 |
7380948 | Schofield et al. | Jun 2008 | B2 |
7388182 | Schofield et al. | Jun 2008 | B2 |
7402786 | Schofield et al. | Jul 2008 | B2 |
7423248 | Schofield et al. | Sep 2008 | B2 |
7425076 | Schofield et al. | Sep 2008 | B2 |
7446650 | Scholfield et al. | Nov 2008 | B2 |
7459664 | Schofield et al. | Dec 2008 | B2 |
7460951 | Altan | Dec 2008 | B2 |
7480149 | DeWard et al. | Jan 2009 | B2 |
7490007 | Taylor et al. | Feb 2009 | B2 |
7492281 | Lynam et al. | Feb 2009 | B2 |
7526103 | Schofield et al. | Apr 2009 | B2 |
7561181 | Schofield et al. | Jul 2009 | B2 |
7565006 | Stam et al. | Jul 2009 | B2 |
7581859 | Lynam | Sep 2009 | B2 |
7592928 | Chinomi et al. | Sep 2009 | B2 |
7613327 | Stam et al. | Nov 2009 | B2 |
7616781 | Schofield et al. | Nov 2009 | B2 |
7619508 | Lynam et al. | Nov 2009 | B2 |
7639149 | Katoh | Dec 2009 | B2 |
7650864 | Hassan et al. | Jan 2010 | B2 |
7720580 | Higgins-Luthman | May 2010 | B2 |
7855755 | Weller et al. | Dec 2010 | B2 |
7881496 | Camilleri et al. | Feb 2011 | B2 |
7914187 | Higgins-Luthman et al. | Mar 2011 | B2 |
7965336 | Bingle et al. | Jun 2011 | B2 |
8027029 | Lu et al. | Sep 2011 | B2 |
8045760 | Stam et al. | Oct 2011 | B2 |
8058977 | Lynam | Nov 2011 | B2 |
8063759 | Bos et al. | Nov 2011 | B2 |
8184159 | Luo | May 2012 | B2 |
8217830 | Lynam | Jul 2012 | B2 |
8254635 | Stein et al. | Aug 2012 | B2 |
8314689 | Schofield et al. | Nov 2012 | B2 |
8325986 | Schofield et al. | Dec 2012 | B2 |
8376595 | Higgins-Luthman | Feb 2013 | B2 |
8379923 | Ishikawa | Feb 2013 | B2 |
8605947 | Zhang et al. | Dec 2013 | B2 |
9187028 | Higgins-Luthman | Nov 2015 | B2 |
10005394 | Higgins-Luthman | Jun 2018 | B2 |
10744940 | Higgins-Luthman | Aug 2020 | B2 |
11511668 | Higgins-Luthman | Nov 2022 | B2 |
20020015153 | Downs | Feb 2002 | A1 |
20020044065 | Quist et al. | Apr 2002 | A1 |
20020113873 | Williams | Aug 2002 | A1 |
20020159270 | Lynam et al. | Oct 2002 | A1 |
20030072471 | Otsuka | Apr 2003 | A1 |
20030137586 | Lewellen | Jul 2003 | A1 |
20030222982 | Hamdan et al. | Dec 2003 | A1 |
20030227777 | Schofield | Dec 2003 | A1 |
20040012488 | Schofield | Jan 2004 | A1 |
20040016870 | Pawlicki et al. | Jan 2004 | A1 |
20040032321 | McMahon et al. | Feb 2004 | A1 |
20040051634 | Schofield et al. | Mar 2004 | A1 |
20040086153 | Tsai et al. | May 2004 | A1 |
20040114381 | Salmeen et al. | Jun 2004 | A1 |
20040128065 | Taylor et al. | Jul 2004 | A1 |
20040143380 | Stam | Jul 2004 | A1 |
20040164228 | Fogg et al. | Aug 2004 | A1 |
20040200948 | Bos et al. | Oct 2004 | A1 |
20050035926 | Takenaga | Feb 2005 | A1 |
20050078389 | Kulas et al. | Apr 2005 | A1 |
20050134966 | Burgner | Jun 2005 | A1 |
20050134983 | Lynam | Jun 2005 | A1 |
20050146792 | Schofield et al. | Jul 2005 | A1 |
20050169003 | Lindahl et al. | Aug 2005 | A1 |
20050189493 | Bagley et al. | Sep 2005 | A1 |
20050195488 | McCabe et al. | Sep 2005 | A1 |
20050200700 | Schofield et al. | Sep 2005 | A1 |
20050232469 | Schofield et al. | Oct 2005 | A1 |
20050264891 | Uken et al. | Dec 2005 | A1 |
20060018511 | Stam et al. | Jan 2006 | A1 |
20060018512 | Stam et al. | Jan 2006 | A1 |
20060028731 | Schofield et al. | Feb 2006 | A1 |
20060050018 | Hutzel et al. | Mar 2006 | A1 |
20060061008 | Karner et al. | Mar 2006 | A1 |
20060091813 | Stam et al. | May 2006 | A1 |
20060103727 | Tseng | May 2006 | A1 |
20060157639 | Shaffer et al. | Jul 2006 | A1 |
20060164230 | DeWind et al. | Jul 2006 | A1 |
20060171704 | Bingle et al. | Aug 2006 | A1 |
20060228001 | Tsukamoto | Oct 2006 | A1 |
20060250501 | Wildmann et al. | Nov 2006 | A1 |
20070013495 | Suzuki et al. | Jan 2007 | A1 |
20070023613 | Schofield et al. | Feb 2007 | A1 |
20070024724 | Stein et al. | Feb 2007 | A1 |
20070102214 | Wittorf | May 2007 | A1 |
20070104476 | Yasutomi et al. | May 2007 | A1 |
20070109406 | Schofield et al. | May 2007 | A1 |
20070109651 | Schofield et al. | May 2007 | A1 |
20070109652 | Schofield et al. | May 2007 | A1 |
20070109653 | Schofield et al. | May 2007 | A1 |
20070109654 | Schofield et al. | May 2007 | A1 |
20070115357 | Stein et al. | May 2007 | A1 |
20070120657 | Schofield et al. | May 2007 | A1 |
20070150196 | Grimm | Jun 2007 | A1 |
20070176080 | Schofield et al. | Aug 2007 | A1 |
20070242339 | Bradley | Oct 2007 | A1 |
20070293996 | Mori | Dec 2007 | A1 |
20080043099 | Stein et al. | Feb 2008 | A1 |
20080137908 | Stein et al. | Jun 2008 | A1 |
20080180529 | Taylor et al. | Jul 2008 | A1 |
20080221767 | Ikeda et al. | Sep 2008 | A1 |
20090109061 | McNew | Apr 2009 | A1 |
20090113509 | Tseng et al. | Apr 2009 | A1 |
20090182690 | Stein | Jul 2009 | A1 |
20090185716 | Kato et al. | Jul 2009 | A1 |
20090208058 | Schofield et al. | Aug 2009 | A1 |
20090243824 | Peterson et al. | Oct 2009 | A1 |
20090244361 | Gebauer et al. | Oct 2009 | A1 |
20090254247 | Osanai | Oct 2009 | A1 |
20090295181 | Lawlor et al. | Dec 2009 | A1 |
20100020170 | Higgins-Luthman et al. | Jan 2010 | A1 |
20100045797 | Schofield et al. | Feb 2010 | A1 |
20100097469 | Blank et al. | Apr 2010 | A1 |
20100156617 | Nakada | Jun 2010 | A1 |
20100171642 | Hassan et al. | Jul 2010 | A1 |
20100172542 | Stein et al. | Jul 2010 | A1 |
20110043624 | Haug | Feb 2011 | A1 |
20110273582 | Gayko et al. | Nov 2011 | A1 |
20120019940 | Lu et al. | Jan 2012 | A1 |
20120062743 | Lynam et al. | Mar 2012 | A1 |
20120116632 | Bechtel | May 2012 | A1 |
20120162427 | Lynam | Jun 2012 | A1 |
20120233841 | Stein | Sep 2012 | A1 |
20130124038 | Naboulsi | May 2013 | A1 |
20140211013 | Drummond et al. | Jul 2014 | A1 |
Foreign Patent Documents

Number | Date | Country |
---|---|---|
2008127752 | Oct 2008 | WO |
2009073054 | Jun 2009 | WO |
2010099416 | Sep 2010 | WO |
2011014497 | Feb 2011 | WO |
Related Publications

Number | Date | Country |
---|---|---|
20230100684 A1 | Mar 2023 | US |
Provisional Applications

Number | Date | Country |
---|---|---|
61178565 | May 2009 | US |
Continuations

Relation | Number | Date | Country |
---|---|---|---|
Parent | 16947774 | Aug 2020 | US |
Child | 18058284 | | US |
Parent | 16016815 | Jun 2018 | US |
Child | 16947774 | | US |
Parent | 14942088 | Nov 2015 | US |
Child | 16016815 | | US |
Parent | 13767208 | Feb 2013 | US |
Child | 14942088 | | US |
Parent | 12781119 | May 2010 | US |
Child | 13767208 | | US |