Vehicle vision system with collision mitigation

Information

  • Patent Grant
  • Patent Number: 10,692,380
  • Date Filed: Monday, November 20, 2017
  • Date Issued: Tuesday, June 23, 2020
Abstract
A method for determining potential collision with another vehicle by a vehicle equipped with a vision system includes providing, at the equipped vehicle, a vision sensor including a camera and at least one non-vision sensor, and determining presence of a leading vehicle ahead of the equipped vehicle. A time to collision to the leading vehicle is determined based at least in part on a determined distance to the leading vehicle and a determined relative velocity between the equipped vehicle and the leading vehicle. A braking level of the equipped vehicle is determined to mitigate collision of the equipped vehicle with the leading vehicle. A weighting factor is employed when determining the braking level, and the weighting factor is adjusted responsive at least in part to determining, via processing of image data captured by the camera, that a brake light of the leading vehicle ceases to be illuminated.
Description
FIELD OF THE INVENTION

The present invention relates generally to a vehicle vision system for a vehicle and, more particularly, to a vehicle vision system that utilizes one or more cameras at a vehicle.


BACKGROUND OF THE INVENTION

Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.


SUMMARY OF THE INVENTION

The present invention provides a collision avoidance system and/or collision warning system or vision system or imaging system for a vehicle that utilizes one or more cameras (such as one or more CMOS cameras) to capture image data representative of images exterior of the vehicle, and determines an appropriate warning or alert and/or an appropriate or required braking level or condition for the vehicle to avoid or mitigate a collision with a leading vehicle in the path of travel ahead of the equipped vehicle. The vision system determines an appropriate warning or alert level or timing and/or an appropriate braking level and, responsive to a determination of whether or not the taillight or taillights or brake lights of the leading vehicle are actuated, the vision system may adjust or weight one or more parameters to increase the valuation or emphasis on that parameter or parameters when the vision system determines that the leading vehicle is braking.


For example, the vision system may determine a relative acceleration between the equipped vehicle and the leading vehicle and, responsive to a determination that the leading vehicle is braking (such as via a determination that the brake lights of the leading vehicle are actuated), the system may increase the emphasis or weight of the relative acceleration in the determination or calculation, in order to provide an earlier warning (or louder or more intense warning) and/or to increase the braking level that is appropriate or required to avoid or mitigate the collision. Similarly, responsive to a determination that the leading vehicle is not braking (such as via a determination that the brake lights of the leading vehicle are not actuated), the system may decrease the emphasis or weight of the relative acceleration in the determination or calculation, in order to provide a later warning or alert (or softer or less intense warning) and/or decrease the braking level that is appropriate or required to avoid or mitigate the collision.
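As a rough illustration, this brake-light-conditioned weighting reduces to a small helper function. The following Python sketch is illustrative only; the names and gain values (k_braking, k_coasting) are assumptions, not figures from the patent:

    def accel_weight(brake_lights_on: bool,
                     k_braking: float = 1.5,
                     k_coasting: float = 0.5) -> float:
        # Weighting factor applied to the sensed relative acceleration:
        # emphasis is increased when the leading vehicle's brake lights
        # are detected as illuminated, and decreased when they are not.
        return k_braking if brake_lights_on else k_coasting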


Therefore, the system of the present invention detects and utilizes the preceding or leading vehicle brake light illumination to independently determine that the leading or preceding vehicle is decelerating. This determination can be used to apply weighting to the determined relative acceleration value to increase the emphasis of that data. The weighted acceleration data is used when calculating the time to collision and the required vehicle deceleration to avoid or mitigate a potential collision with the leading vehicle, and/or may be used to determine the timing of an alert or warning and the degree of that alert or warning.


These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a plan view of a vehicle with a vision system that incorporates cameras in accordance with the present invention; and



FIG. 2 is a flow chart of a process of determining the time to collision and required vehicle deceleration of the subject vehicle in accordance with the present invention.





DESCRIPTION OF THE PREFERRED EMBODIMENTS

A vehicle vision system and/or driver assist system and/or object detection system and/or alert system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a forward or rearward direction. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and provide an output to a display device for displaying images representative of the captured image data. Optionally, the vision system may provide a top down or bird's eye or surround view display and may provide a displayed image that is representative of the subject vehicle, and optionally with the displayed image being customized to at least partially correspond to the actual subject vehicle.


Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes at least one exterior facing imaging sensor or camera, such as a rearward facing imaging sensor or camera 14a (and the system may optionally include multiple exterior facing imaging sensors or cameras, such as a forwardly facing camera 14b at the front (or at the windshield) of the vehicle, and a sidewardly/rearwardly facing camera 14c, 14d at respective sides of the vehicle), which captures images exterior of the vehicle, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera (FIG. 1). The vision system 12 includes a control or electronic control unit (ECU) or processor 18 that is operable to process image data captured by the cameras and may provide displayed images at a display device 16 for viewing by the driver of the vehicle (although shown in FIG. 1 as being part of or incorporated in or at an interior rearview mirror assembly 20 of the vehicle, the control and/or the display device may be disposed elsewhere at or in the vehicle). The data transfer or signal communication from the camera to the ECU may comprise any suitable data or communication link, such as a vehicle network bus or the like of the equipped vehicle.


Collision avoidance systems and collision mitigation systems typically require information about the location and motion of preceding vehicles (vehicles ahead of the subject or equipped vehicle). Calculating whether a collision may occur is based on data associated with the subject vehicle and the preceding vehicle. This data is used to determine whether a collision will occur, the estimated or calculated time to the collision, and the magnitude of the subject vehicle braking needed to avoid the collision. When the time to collision is below a predefined value or threshold level, vehicle avoidance/mitigation functions utilizing automatic braking are activated. The time to collision determination requires accurate measurements of the distance, relative velocity and relative acceleration between the subject vehicle and the leading or preceding vehicle.
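For example, the activation gate might look like the following minimal sketch (the threshold value here is an assumed placeholder, not a figure from the patent):

    TTC_THRESHOLD_S = 2.0  # assumed placeholder threshold, in seconds

    def should_engage_mitigation(ttc_s: float) -> bool:
        # Automatic avoidance/mitigation braking engages once the
        # predicted time to collision drops below the threshold.
        return ttc_s < TTC_THRESHOLD_S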


Object detection sensors typically determine distance and velocity accurately enough. The determination of acceleration, however, is susceptible to error, which can cause an incorrect determination of the time to collision. For example, an error in the determination or estimation of the preceding vehicle acceleration could cause a collision mitigation system to mistakenly determine that the vehicle is going to collide with another vehicle. This acceleration error may also generate an incorrect magnitude of emergency braking to avoid the collision. Such braking could be dangerous if following vehicles are close behind the subject (braking) vehicle and emergency braking is not warranted. Perhaps worse, the system may mistakenly determine that the vehicle is not about to be in a collision and thus fail to take action to prevent an otherwise avoidable collision.


The present invention provides an enhanced system that incorporates a relative acceleration weighting factor to reduce errors associated with the calculation of the collision avoidance/mitigation/warning variables. The preceding vehicle acceleration calculated from the object detection sensors is inherently noisy or latent due to the method by which acceleration is determined. Acceleration is typically determined utilizing either a first/second derivative of radar/lidar sensor data or image inflation of object data. To reduce performance impacts associated with noisy acceleration data, various filtering techniques may be applied. Filtering, however, typically makes the acceleration data latent, thereby potentially reducing the performance and effectiveness of the collision avoidance/mitigation feature. It may also complicate the time relationship between the filtered and non-filtered variables used to determine whether avoidance/mitigation actions are required.
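The noise/latency trade-off can be seen in a minimal sketch: differentiating sampled velocity amplifies measurement noise, and the usual remedy, a first-order low-pass filter, introduces lag. The function names and the filter coefficient below are illustrative assumptions:

    def finite_diff_accel(v_prev: float, v_now: float, dt: float) -> float:
        # First derivative of sampled relative velocity: small errors in
        # v are divided by a small dt, so measurement noise is amplified.
        return (v_now - v_prev) / dt

    def low_pass(a_filtered: float, a_raw: float, alpha: float = 0.1) -> float:
        # First-order discrete low-pass filter: a small alpha suppresses
        # the noise but makes the estimate lag (become latent) behind
        # real changes in acceleration.
        return alpha * a_raw + (1.0 - alpha) * a_filtered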


The system of the present invention utilizes the preceding vehicle brake light illumination to independently determine that the leading or preceding vehicle is decelerating. This knowledge or determination can be used to apply weighting to the sensor-determined acceleration to increase the emphasis of that data. For example, if the system determines that the preceding vehicle (the vehicle that is ahead of the equipped or subject vehicle and in the same lane of travel as the equipped or subject vehicle and thus is in the path of travel of the equipped or subject vehicle) brake lights are not illuminated, the system may apply a weighting that decreases the emphasis on the acceleration data. The weighted acceleration data is used when calculating the time to collision and the required vehicle deceleration to avoid the collision.


In a preferred embodiment, the system may apply weighting of the preceding vehicle acceleration data based on detecting the preceding vehicle brake light illumination. Techniques other than weighting can also be utilized; for example, a discrete low pass filter (proportion of new versus old acceleration) may be used in determining the magnitude of acceleration. The illumination of the preceding vehicle brake lights provides a higher level of confidence that the preceding vehicle is actually decelerating and that the measured deceleration is not an artifact of sensor data noise. This information is used in an algorithm to apply a larger magnitude of sensor-determined acceleration when determining when to provide a warning and/or initiate an avoidance/mitigation action, and when determining the amount of deceleration to command. The avoidance/mitigation action and/or timing of a warning or alert (such as an audible or visual or haptic warning or alert) is based on the magnitudes of the Time to Collision (TTC) and the subject vehicle (SV) required deceleration; see the calculations below.
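The low-pass alternative can be sketched as a single blend whose coefficient depends on the brake-light observation; the coefficient values below are assumed for illustration only:

    def blend_accel(a_old: float, a_new: float, brake_lights_on: bool) -> float:
        # Discrete low-pass filter: the coefficient is the proportion of
        # new versus old acceleration. Brake-light confirmation justifies
        # trusting the new sample more (values are illustrative).
        alpha = 0.6 if brake_lights_on else 0.15
        return alpha * a_new + (1.0 - alpha) * a_old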







$$\text{Time to Collision} = \frac{2\,d_x}{d_v \pm \sqrt{d_v(t_0)^2 + 2\,d_x\,K_{\text{accelweight}}\,d_a(t_0)}}$$

$$\text{SV Required Decel} = \operatorname{sgn}(v)\cdot\frac{V_{\text{rel}}^2}{2\left(K_{\text{vehgapstopped}} + V_{\text{rel}}\,t_{\text{brakereaction}}\right)} + K_{\text{accelweight}}\left(a_{\text{SV}} + a_{\text{rel}}\right)$$

where $d_x$ is the distance to the preceding vehicle, $d_v(t_0)$ and $d_a(t_0)$ are the relative velocity and relative acceleration at time $t_0$, $V_{\text{rel}}$ is the relative velocity, $a_{\text{SV}}$ and $a_{\text{rel}}$ are the subject vehicle acceleration and the relative acceleration, $K_{\text{accelweight}}$ is the brake-light-dependent acceleration weighting factor, $K_{\text{vehgapstopped}}$ is a constant for the vehicle gap when stopped, and $t_{\text{brakereaction}}$ is the brake reaction time.

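To make the two calculations concrete, here is a minimal Python sketch of both formulas. The function and parameter names mirror the symbols above; the root selection and guard values are illustrative assumptions, not prescriptions from the patent:

    import math

    def time_to_collision(d_x, d_v, d_a, k_accel_weight):
        # TTC per the first formula: d_x is the distance to the leading
        # vehicle; d_v and d_a are the relative velocity and relative
        # acceleration at time t0.
        disc = d_v * d_v + 2.0 * d_x * k_accel_weight * d_a
        if disc < 0.0:
            return math.inf  # no real solution: the gap is never closed
        root = math.sqrt(disc)
        times = [2.0 * d_x / d for d in (d_v + root, d_v - root)
                 if abs(d) > 1e-9]
        positive = [t for t in times if t > 0.0]
        return min(positive) if positive else math.inf

    def sv_required_decel(v, v_rel, a_sv, a_rel,
                          k_veh_gap_stopped, t_brake_reaction,
                          k_accel_weight):
        # Required subject-vehicle deceleration per the second formula;
        # copysign(1.0, v) stands in for sgn(v).
        return (math.copysign(1.0, v) * v_rel * v_rel
                / (2.0 * (k_veh_gap_stopped + v_rel * t_brake_reaction))
                + k_accel_weight * (a_sv + a_rel))

For example, with d_x = 30 m, d_v = 10 m/s and d_a = 0, time_to_collision returns 3.0 s for any weighting, matching the constant-velocity case d_x / d_v.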
In this particular embodiment, the system of the present invention is directed to a method of determining the weighting of the sensor-determined acceleration, which is utilized in determining the collision avoidance/mitigation variables, namely the time to collision and the required vehicle deceleration (see FIG. 2). This method comprises the following steps (a minimal end-to-end sketch follows the list):

    • Capturing images of the environment ahead of the vehicle;
    • Determining if there is a preceding vehicle and, if there is a preceding vehicle ahead of the subject vehicle, determining if the brake lights of the preceding vehicle are illuminated based on the captured images;
    • Determining or adjusting the acceleration weighting based on the determined illumination of the preceding vehicle brake lights;
    • Obtaining measurements of the distance, velocity and acceleration to the preceding vehicle based on the captured images;
    • Determining the subject vehicle velocity and acceleration utilizing on-board sensors;
    • Determining the relative velocity and acceleration to the preceding vehicle; and
    • Determining the time to collision and required vehicle deceleration (and/or the degree or timing of a warning to the driver of the subject vehicle to alert the driver of the hazardous condition and the degree of the hazard) based on the distance, relative velocity and weighted acceleration between the subject and preceding vehicles.
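A minimal sketch combining these steps into one pass of the FIG. 2 flow (the sign conventions, gains and constants are illustrative assumptions, not values from the patent):

    import math

    def assess_threat(d_x, v_sv, a_sv, v_lead, a_lead, brake_lights_on):
        # Weight the relative acceleration according to the brake-light
        # observation, then compute the TTC and the required deceleration.
        d_v = v_sv - v_lead                          # closing velocity
        d_a = a_sv - a_lead                          # relative acceleration
        k_weight = 1.5 if brake_lights_on else 0.5   # assumed gains
        disc = d_v * d_v + 2.0 * d_x * k_weight * d_a
        if d_v <= 0.0 or disc < 0.0:
            return None                              # not closing on the lead
        ttc = 2.0 * d_x / (d_v + math.sqrt(disc))    # positive root
        k_gap_stopped, t_brake_reaction = 2.0, 0.3   # assumed constants
        req_decel = (math.copysign(1.0, v_sv) * d_v * d_v
                     / (2.0 * (k_gap_stopped + d_v * t_brake_reaction))
                     + k_weight * (a_sv + d_a))
        return ttc, req_decel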


Thus, the present invention provides a calculation or determination of the time to collision with a preceding vehicle and of the required or appropriate deceleration of the subject vehicle that may avoid the collision with the preceding vehicle. The system determines the relative velocity and relative acceleration between the subject vehicle and the preceding vehicle, and these values may be determined or adjusted based on the brake lights of the preceding vehicle, the distance to the preceding vehicle, the preceding vehicle velocity and acceleration (based on image processing of image data captured by a forward facing camera of the subject vehicle), and the subject vehicle velocity and acceleration (based on vehicle sensors or accessories). Responsive to such determinations, an alert may be generated to alert the driver of a potential or imminent collision, and/or the brake system of the subject vehicle may be controlled to slow or stop the subject vehicle to avoid or mitigate the collision with the preceding vehicle.


Thus, the system or control of the present invention provides enhanced control and adjustment of the vehicle brake system following the initial determination that the brakes should be applied to mitigate collision (such as when the system determines that a collision with a detected target vehicle is likely or imminent). The system of the present invention preferably includes a forward facing (and/or rearward facing) machine vision camera and a forward facing (and/or rearward facing) radar device or sensor (preferably such as described in U.S. Pat. No. 8,013,780, which is hereby incorporated herein by reference in its entirety). As described in U.S. Pat. No. 8,013,780, image data captured by the camera and as processed by an image processor may be fused with radar data for the overall processing and in making the determination of whether to apply the vehicle brakes and/or how much to apply the vehicle brakes.


The system uses a forward facing camera or sensor, which may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.


The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an EyeQ2 or EyeQ3 image processing chip available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.


The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. Preferably, the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.


For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or International Publication Nos. WO 2011/028686; WO 2010/099416; WO 2012/061567; WO 2012/068331; WO 2012/075250; WO 2012/103193; WO 2012/0116043; WO 2012/0145313; WO 2012/0145501; WO 2012/145818; WO 2012/145822; WO 2012/158167; WO 2012/075250; WO 2012/0116043; WO 2012/0145501; WO 2012/154919; WO 2013/019707; WO 2013/016409; WO 2013/019795; WO 2013/067083; WO 2013/070539; WO 2013/043661; WO 2013/048994; WO 2013/063014, WO 2013/081984; WO 2013/081985; WO 2013/074604; WO 2013/086249; WO 2013/103548; WO 2013/109869; WO 2013/123161; WO 2013/126715; WO 2013/043661 and/or WO 2013/158592, and/or U.S. patent application Ser. No. 14/359,341, filed May 20, 2014 and published Nov. 20, 2014 as U.S. Publication No. 2014/0340510; Ser. No. 14/359,340, filed May 20, 2014 and published Oct. 23, 2014 as U.S. Publication No. 2014/0313339; Ser. No. 14/282,029, filed May 20, 02014, now U.S. Pat. No. 9,205,776; Ser. No. 14/282,028, filed May 20, 2014 and published Nov. 27, 2014 as U.S. Publication No. 2014/0347486; Ser. No. 14/358,232, filed May 15, 2014 and published Oct. 30, 2014 as U.S. Publication No. 2014/0320658; Ser. No. 14/272,834, filed May 8, 2014 and published Nov. 13, 2014 as U.S. Publication No. 2014/0336876; Ser. No. 14/356,330, filed May 5, 2014 and published Oct. 16, 2014 as U.S. Publication No. 2014/0307095; Ser. No. 14/269,788, filed May 5, 2014 and published Nov. 6, 2014 as U.S. Publication No. 2014/0327774; Ser. No. 14/268,169, filed May 2, 2014 and published Nov. 6, 2014 as U.S. Publication No. 2014/0327772; Ser. No. 14/264,443, filed Apr. 29, 2014 and published Oct. 30, 2014 as U.S. Publication No. 2014/0320636; Ser. No. 14/354,675, filed Apr. 28, 2014 and published Oct. 2, 2014 as U.S. Publication No. 2014/0293057; Ser. No. 14/248,602, filed Apr. 9, 2014 and published Oct. 16, 2014 as U.S. Publication No. 2014/0309884; Ser. No. 14/242,038, filed Apr. 1, 2014 and published Aug. 14, 2014 as U.S. Publication No. 2014/0226012; Ser. No. 14/229,061, filed Mar. 28, 2014 and published 2014/0293042 as U.S. Publication No. Oct. 2, 2014; Ser. No. 14/343,937, filed Mar. 10, 2014 and published Aug. 7, 2014 as U.S. Publication No. 2014/0218535; Ser. No. 14/343,936, filed Mar. 10, 2014 and published Aug. 7, 2014 as U.S. Publication No. 2014/0218535; Ser. No. 14/195,135, filed Mar. 3, 2014 and published Sep. 4, 2014 as U.S. Publication No. 2014/0247354; Ser. No. 14/195,136, filed Mar. 3, 2014 and published Sep. 4, 2014 as U.S. Publication No. 2014/0247355; Ser. No. 14/191,512, filed Feb. 27, 2014 and published Sep. 4, 2014 as U.S. Publication No. 2014/0247352; Ser. No. 14/183,613, filed Feb. 19, 2014 and published Aug. 21, 2014 as U.S. Publication No. 2014/0232869; Ser. No. 14/169,329, filed Jan. 31, 2014 and published Aug. 7, 2014 as/U.S. Publication No. 2014/0218529; Ser. No. 14/169,328, filed Jan. 31, 2014, now U.S. Pat. No. 9,092,986; Ser. No. 14/163,325, filed Jan. 24, 2014 and published Jul. 31, 2014 as U.S. Publication No. 2014/0211009; Ser. No. 
14/159,772, filed Jan. 21, 2014, now U.S. Pat. No. 9,068,390; Ser. No. 14/107,624, filed Dec. 16, 2013, now U.S. Pat. No. 9,140,789; Ser. No. 14/102,981, filed Dec. 11, 2013 and published Jun. 12, 2014 as U.S. Publication No. 2014/0160276; Ser. No. 14/102,980, filed Dec. 11, 2013 and published Jun. 19, 2014 as U.S. Publication No. 2014/0168437; Ser. No. 14/098,817, filed Dec. 6, 2013 and published Jun. 19, 2014 as U.S. Publication No. 2014/0168415; Ser. No. 14/097,581, filed Dec. 5, 2013 and published Jun. 12, 2014 as U.S. Publication No. 2014/0160291; Ser. No. 14/093,981, filed Dec. 2, 2013, now U.S. Pat. No. 8,917,169; Ser. No. 14/093,980, filed Dec. 2, 2013 and published Jun. 5, 2014 as U.S. Publication No. 2014/0152825; Ser. No. 14/082,573, filed Nov. 18, 2013 and published May 22, 2014 as U.S. Publication No. 2014/0139676; Ser. No. 14/082,574, filed Nov. 18, 2013 and published May 22, 2014 as U.S. Publication No. 2014/0138140; Ser. No. 14/082,575, filed Nov. 18, 2013, now U.S. Pat. No. 9,090,234; Ser. No. 14/082,577, filed Nov. 18, 2013, now U.S. Pat. No. 8,818,042; Ser. No. 14/071,086, filed Nov. 4, 2013, now U.S. Pat. No. 8,886,401; Ser. No. 14/076,524, filed Nov. 11, 2013, now U.S. Pat. No. 9,077,962; Ser. No. 14/052,945, filed Oct. 14, 2013 and published Apr. 17, 2014 as U.S. Publication No. 2014/0104426; Ser. No. 14/046,174, filed Oct. 4, 2013 and published Apr. 10, 2014 as U.S. Publication No. 2014/0098229; Ser. No. 14/036,723, filed Sep. 25, 2013 and published Mar. 27, 2014 as U.S. Publication No. 2014/0085472; Ser. No. 14/016,790, filed Sep. 3, 2013 and published Mar. 6, 2014 as U.S. Publication No. 2014/0067206; Ser. No. 14/001,272, filed Aug. 23, 2013, now U.S. Pat. No. 9,233,641; Ser. No. 13/970,868, filed Aug. 20, 2013 and published Feb. 20, 2014 as U.S. Publication No. 2014/0049646; Ser. No. 13/964,134, filed Aug. 12, 2013 and published Feb. 20, 2014 as U.S. Publication No. 2014/0052340; Ser. No. 13/942,758, filed Jul. 16, 2013 and published Jan. 23, 2014 as U.S. Publication No. 2014/0025240; Ser. No. 13/942,753, filed Jul. 16, 2013 and published Jan. 30, 2014 as U.S. Publication No. 2014/0028852; Ser. No. 13/927,680, filed Jun. 26, 2013 and published Jan. 2, 2014 as U.S. Publication No. 2014/005907; Ser. No. 13/916,051, filed Jun. 12, 2013, now U.S. Pat. No. 9,077,098; Ser. No. 13/894,870, filed May 15, 2013 and published Nov. 28, 2013 as U.S. Publication No. 2013/0314503; Ser. No. 13/887,724, filed May 6, 2013 and published Nov. 14, 2013 as U.S. Publication No. 2013/0298866; Ser. No. 13/852,190, filed Mar. 28, 2013 and published Aug. 29, 2013 as U.S. Publication No. 2013/0222593; Ser. No. 13/851,378, filed Mar. 27, 2013 and published Nov. 14, 2013 as U.S. Publication No. 2013/0300869; Ser. No. 13/848,796, filed Mar. 22, 2012 and published Oct. 24, 2013 as U.S. Publication No. 2013/0278769; Ser. No. 13/847,815, filed Mar. 20, 2013 and published Oct. 3, 2013 as U.S. Publication No. 2013/0258077; Ser. No. 13/800,697, filed Mar. 13, 2013 and published Oct. 3, 2013 as U.S. Publication No. 2013/0258077; Ser. No. 13/785,099, filed Mar. 5, 2013 and published Sep. 19, 2013 as U.S. Publication No. 2013/0242099; Ser. No. 13/779,881, filed Feb. 28, 2013, now U.S. Pat. No. 8,694,224; Ser. No. 13/774,317, filed Feb. 22, 2013 and published Aug. 29, 2013 as U.S. Publication No. 2013/0222592; Ser. No. 13/774,315, filed Feb. 22, 2013 and published Aug. 22, 2013 as U.S. Publication No. 2013/0215271; Ser. No. 13/681,963, filed Nov. 20, 2012 and published Jun. 6, 2013 as U.S. 
Publication No. 2013/0141578; Ser. No. 13/660,306, filed Oct. 25, 2012, now U.S. Pat. No. 9,146,898; Ser. No. 13/653,577, filed Oct. 17, 2012, now U.S. Pat. No. 9,174,574; and/or Ser. No. 13/534,657, filed Jun. 27, 2012 and published Jan. 3, 2013 as U.S. Publication No. 2013/0002873, and/or U.S. provisional applications, Ser. No. 61/993,736, filed May 15, 2014; Ser. 61/991,810, filed May 12, 2014; Ser. No. 61/991,809, filed May 12, 2014; Ser. No. 61/990,927, filed May 9, 2014; Ser. No. 61/989,652, filed May 7, 2014; Ser. No. 61/981,938, filed Apr. 21, 2014; Ser. No. 61/981,937, filed Apr. 21, 2014; Ser. No. 61/977,941, filed Apr. 10, 2014; Ser. No. 61/977,940, filed Apr. 10, 2014; Ser. No. 61/977,929, filed Apr. 10, 2014; Ser. No. 61/977,928, filed Apr. 10, 2014; Ser. No. 61/973,922, filed Apr. 2, 2014; Ser. No. 61/972,708, filed Mar. 31, 2014; Ser. No. 61/972,707, filed Mar. 31, 2014; Ser. No. 61/969,474, filed Mar. 24, 2014; Ser. No. 61/955,831, filed Mar. 20, 2014; Ser. No. 61/953,970, filed Mar. 17, 2014; Ser. No. 61/952,335, filed Mar. 13, 2014; Ser. No. 61/952,334, filed Mar. 13, 2014; Ser. No. 61/950,261, filed Mar. 10, 2014; Ser. No. 61/950,261, filed Mar. 10, 2014; Ser. No. 61/947,638, filed Mar. 4, 2014; Ser. No. 61/947,053, filed Mar. 3, 2014; Ser. No. 61/941,568, filed Feb. 19, 2014; Ser. No. 61/935,485, filed Feb. 4, 2014; Ser. No. 61/935,057, filed Feb. 3, 2014; Ser. No. 61/935,056, filed Feb. 3, 2014; Ser. No. 61/935,055, filed Feb. 3, 2014; Ser. 61/931,811, filed Jan. 27, 2014; Ser. No. 61/919,129, filed Dec. 20, 2013; Ser. No. 61/919,130, filed Dec. 20, 2013; Ser. No. 61/919,131, filed Dec. 20, 2013; Ser. No. 61/919,147, filed Dec. 20, 2013; Ser. No. 61/919,138, filed Dec. 20, 2013, Ser. No. 61/919,133, filed Dec. 20, 2013; Ser. No. 61/918,290, filed Dec. 19, 2013; Ser. No. 61/915,218, filed Dec. 12, 2013; Ser. No. 61/912,146, filed Dec. 5, 2013; Ser. No. 61/911,666, filed Dec. 4, 2013; Ser. No. 61/911,665, filed Dec. 4, 2013; Ser. No. 61/905,461, filed Nov. 18, 2013; Ser. No. 61/905,462, filed Nov. 18, 2013; Ser. No. 61/901,127, filed Nov. 7, 2013; Ser. No. 61/895,610, filed Oct. 25, 2013; Ser. No. 61/895,609, filed Oct. 25, 2013; Ser. No. 61/879,837, filed Sep. 19, 2013; Ser. No. 61/879,835, filed Sep. 19, 2013; Ser. No. 61/875,351, filed Sep. 9, 2013; Ser. No. 61/869,195, filed. Aug. 23, 2013; Ser. No. 61/864,835, filed Aug. 12, 2013; Ser. No. 61/864,836, filed Aug. 12, 2013; Ser. No. 61/864,837, filed Aug. 12, 2013; Ser. No. 61/864,838, filed Aug. 12, 2013; Ser. No. 61/856,843, filed Jul. 22, 2013, Ser. No. 61/845,061, filed Jul. 11, 2013; Ser. No. 61/844,630, filed Jul. 10, 2013; Ser. No. 61/844,173, filed Jul. 9, 2013; Ser. No. 61/844,171, filed Jul. 9, 2013; Ser. No. 61/842,644, filed Jul. 3, 2013; Ser. No. 61/840,542, filed Jun. 28, 2013; Ser. No. 61/838,619, filed Jun. 24, 2013; Ser. No. 61/838,621, filed Jun. 24, 2013; Ser. No. 61/837,955, filed Jun. 21, 2013; Ser. No. 61/836,900, filed Jun. 19, 2013; Ser. No. 61/836,380, filed Jun. 18, 2013; Ser. No. 61/833,080, filed Jun. 10, 2013; Ser. No. 61/830,375, filed Jun. 3, 2013; and/or Ser. No. 61/830,377, filed Jun. 3, 2013; which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos.
WO/2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. patent application Ser. No. 13/202,005, filed Aug. 17, 2011, now U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.


The imaging device and control and image processor and any associated illumination source, if applicable, may comprise any suitable components, and may utilize aspects of the cameras and vision systems described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,937,667; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454 and/or 6,824,281, and/or International Publication Nos. WO 2010/099416; WO 2011/028686 and/or WO 2013/016409, and/or U.S. Pat. Publication No. US 2010-0020170, and/or U.S. patent application Ser. No. 13/534,657, filed Jun. 27, 2012 and published Jan. 3, 2013 as U.S. Publication No. 2013/0002873, which are all hereby incorporated herein by reference in their entireties. The camera or cameras may comprise any suitable cameras or imaging sensors or camera modules, and may utilize aspects of the cameras or sensors described in U.S. Publication No. US-2009-0244361 and/or U.S. patent application Ser. No. 13/260,400, filed Sep. 26, 2011, now U.S. Pat. No. 8,542,451, and/or U.S. Pat. Nos. 7,965,336 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties. The imaging array sensor may comprise any suitable sensor, and may utilize various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like, such as the types described in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,715,093; 5,877,897; 6,922,292; 6,757,109; 6,717,610; 6,590,719; 6,201,642; 6,498,620; 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 6,806,452; 6,396,397; 6,822,563; 6,946,978; 7,339,149; 7,038,577; 7,004,606; 7,720,580 and/or 7,965,336, and/or International Publication Nos. WO/2009/036176 and/or WO/2009/046268, which are all hereby incorporated herein by reference in their entireties.


The camera module and circuit chip or board and imaging sensor may be implemented and operated in connection with various vehicular vision-based systems, and/or may be operable utilizing the principles of such other vehicular systems, such as a vehicle headlamp control system, such as the type disclosed in U.S. Pat. Nos. 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 7,004,606; 7,339,149 and/or 7,526,103, which are all hereby incorporated herein by reference in their entireties, a rain sensor, such as the types disclosed in commonly assigned U.S. Pat. Nos. 6,353,392; 6,313,454; 6,320,176 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties, a vehicle vision system, such as a forwardly, sidewardly or rearwardly directed vehicle vision system utilizing principles disclosed in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,877,897; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978 and/or 7,859,565, which are all hereby incorporated herein by reference in their entireties, a trailer hitching aid or tow check system, such as the type disclosed in U.S. Pat. No. 7,005,974, which is hereby incorporated herein by reference in its entirety, a reverse or sideward imaging system, such as for a lane change assistance system or lane departure warning system or for a blind spot or object detection system, such as imaging or detection systems of the types disclosed in U.S. Pat. Nos. 7,881,496; 7,720,580; 7,038,577; 5,929,786 and/or 5,786,772, and/or U.S. provisional applications, Ser. No. 60/628,709, filed Nov. 17, 2004; Ser. No. 60/614,644, filed Sep. 30, 2004; Ser. No. 60/618,686, filed Oct. 14, 2004; Ser. No. 60/638,687, filed Dec. 23, 2004, which are hereby incorporated herein by reference in their entireties, a video device for internal cabin surveillance and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962; 5,877,897; 6,690,268 and/or 7,370,983, and/or U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties, a traffic sign recognition system, a system for determining a distance to a leading or trailing vehicle or object, such as a system utilizing the principles disclosed in U.S. Pat. Nos. 6,396,397 and/or 7,123,168, which are hereby incorporated herein by reference in their entireties, and/or the like.


Optionally, the circuit board or chip may include circuitry for the imaging array sensor and/or other electronic accessories or features, such as by utilizing compass-on-a-chip or EC driver-on-a-chip technology and aspects such as described in U.S. Pat. No. 7,255,451 and/or U.S. Pat. No. 7,480,149, and/or U.S. Publication No. US-2006-0061008 and/or U.S. patent application Ser. No. 12/578,732, filed Oct. 14, 2009 and published Apr. 22, 2010 as U.S. Publication No. 2010/0097469, which are hereby incorporated herein by reference in their entireties.


Optionally, the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device disposed at or in the interior rearview mirror assembly of the vehicle, such as by utilizing aspects of the video mirror display systems described in U.S. Pat. No. 6,690,268 and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011, now U.S. Pat. No. 9,264,672, which are hereby incorporated herein by reference in their entireties. The video mirror display may comprise any suitable devices and systems and optionally may utilize aspects of the compass display systems described in U.S. Pat. Nos. 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,677,851; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,508; 6,222,460; 6,513,252 and/or 6,642,851, and/or European patent application, published Oct. 11, 2000 under Publication No. EP 0 1043566, and/or U.S. Publication No. US-2006-0061008, which are all hereby incorporated herein by reference in their entireties. Optionally, the video mirror display screen or device may be operable to display images captured by a rearward viewing camera of the vehicle during a reversing maneuver of the vehicle (such as responsive to the vehicle gear actuator being placed in a reverse gear position or the like) to assist the driver in backing up the vehicle, and optionally may be operable to display the compass heading or directional heading character or icon when the vehicle is not undertaking a reversing maneuver, such as when the vehicle is being driven in a forward direction along a road (such as by utilizing aspects of the display system described in International Publication No. WO 2012/051500, which is hereby incorporated herein by reference in its entirety).


Optionally, the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or bird's-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2010/099416; WO 2011/028686; WO 2012/075250; WO 2013/019795; WO 2012/145822; WO 2013/081985; WO 2013/086249 and/or WO 2013/109869, and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011, now U.S. Pat. No. 9,264,672, which are hereby incorporated herein by reference in their entireties.


Optionally, a video mirror display may be disposed rearward of and behind the reflective element assembly and may comprise a display such as the types disclosed in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,370,983; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187 and/or 6,690,268, and/or in U.S. Publication Nos. US-2006-0061008 and/or US-2006-0050018, which are all hereby incorporated herein by reference in their entireties. The display is viewable through the reflective element when the display is activated to display information. The display element may be any type of display element, such as a vacuum fluorescent (VF) display element, a light emitting diode (LED) display element, such as an organic light emitting diode (OLED) or an inorganic light emitting diode, an electroluminescent (EL) display element, a liquid crystal display (LCD) element, a video screen display element or backlit thin film transistor (TFT) display element or the like, and may be operable to display various information (as discrete characters, icons or the like, or in a multi-pixel manner) to the driver of the vehicle, such as passenger side inflatable restraint (PSIR) information, tire pressure status, and/or the like. The mirror assembly and/or display may utilize aspects described in U.S. Pat. Nos. 7,184,190; 7,255,451; 7,446,924 and/or 7,338,177, which are all hereby incorporated herein by reference in their entireties. The thicknesses and materials of the coatings on the substrates of the reflective element may be selected to provide a desired color or tint to the mirror reflective element, such as a blue colored reflector, such as is known in the art and such as described in U.S. Pat. Nos. 5,910,854; 6,420,036 and/or 7,274,501, which are hereby incorporated herein by reference in their entireties.


Optionally, the display or displays and any associated user inputs may be associated with various accessories or systems, such as, for example, a tire pressure monitoring system or a passenger air bag status or a garage door opening system or a telematics system or any other accessory or system of the mirror assembly or of the vehicle or of an accessory module or console of the vehicle, such as an accessory module or console of the types described in U.S. Pat. Nos. 7,289,037; 6,877,888; 6,824,281; 6,690,268; 6,672,744; 6,386,742 and/or 6,124,886, and/or U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties.


Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.

Claims
  • 1. A method for determining potential collision with another vehicle by a vehicle equipped with a vision system, said method comprising: providing, at the equipped vehicle, a vision sensor comprising a camera having at least one million photosensing elements; providing, at the equipped vehicle, at least one non-vision sensor comprising at least one radar sensor or at least one lidar sensor; capturing, using said camera, image data of an exterior environment ahead of the equipped vehicle; determining presence of a leading vehicle ahead of the equipped vehicle; when the leading vehicle is determined to be present ahead of the equipped vehicle, determining, via processing by an image processor of image data captured by said camera, that a brake light of the leading vehicle is illuminated; determining distance to the leading vehicle; determining relative velocity between the equipped vehicle and the leading vehicle; determining a time to collision to the leading vehicle based at least in part on the determined distance to the leading vehicle and the determined relative velocity between the equipped vehicle and the leading vehicle; determining a braking level of the equipped vehicle to mitigate collision of the equipped vehicle with the leading vehicle; employing a weighting factor when determining the braking level of the equipped vehicle to mitigate collision of the equipped vehicle with the leading vehicle; and adjusting the weighting factor responsive at least in part to determining, via processing of image data captured by said camera, that the brake light of the leading vehicle ceases to be illuminated.
  • 2. The method of claim 1, wherein adjusting the weighting factor comprises decreasing weight given to the determined time to collision to the leading vehicle responsive at least in part to determining, via processing of image data captured by said camera, that the brake light of the leading vehicle ceases to be illuminated.
  • 3. The method of claim 1, comprising, responsive at least in part to the determined time to collision being below a threshold value of time, and responsive at least in part to determining, via image processing by said image processor of image data captured by said camera, that the brake light of the leading vehicle is illuminated, determining a degree of warning to a driver of the equipped vehicle.
  • 4. The method of claim 3, comprising, after determining the degree of warning to the driver of the equipped vehicle, and responsive at least in part to determining, via image processing by said image processor of image data captured by said camera, that the brake light of the leading vehicle ceases to be illuminated, reducing the determined degree of warning to the driver of the equipped vehicle.
  • 5. The method of claim 3, comprising providing a warning in accordance with the determined degree of warning.
  • 6. The method of claim 5, comprising, after providing the warning in accordance with the determined degree of warning, and responsive at least in part to determining, via processing of image data captured by said camera, that the brake light of the leading vehicle ceases to be illuminated, reducing the degree of warning provided to the driver of the equipped vehicle.
  • 7. The method of claim 1, wherein determining distance to the leading vehicle comprises determining distance to the leading vehicle at least in part responsive to image processing by said image processor of image data captured by said camera.
  • 8. The method of claim 1, wherein determining relative velocity between the equipped vehicle and the leading vehicle comprises determining relative velocity between the equipped vehicle and the leading vehicle responsive at least in part to image processing by said image processor of image data captured by said camera.
  • 9. The method of claim 1, comprising determining velocity of the equipped vehicle at least in part via at least one sensor of the equipped vehicle.
  • 10. The method of claim 1, wherein providing, at the equipped vehicle, at least one non-vision sensor comprises providing, at the equipped vehicle, at least one radar sensor.
  • 11. The method of claim 1, wherein providing, at the equipped vehicle, at least one non-vision sensor comprises providing, at the equipped vehicle, at least one lidar sensor.
  • 12. A method for determining potential collision with another vehicle by a vehicle equipped with a vision system, said method comprising: providing, at the equipped vehicle, a vision sensor comprising a camera having at least one million photosensing elements; providing, at the equipped vehicle, at least one non-vision sensor comprising at least one radar sensor or at least one lidar sensor; capturing, using said camera, image data of an exterior environment ahead of the equipped vehicle; determining presence of a leading vehicle ahead of the equipped vehicle; when the leading vehicle is determined to be present ahead of the equipped vehicle, determining, via processing by an image processor of image data captured by said camera, that a brake light of the leading vehicle is illuminated; determining, at least in part via image processing by said image processor of image data captured by said camera, distance to the leading vehicle; determining relative velocity between the equipped vehicle and the leading vehicle; determining a time to collision to the leading vehicle based at least in part on the determined distance to the leading vehicle and the determined relative velocity between the equipped vehicle and the leading vehicle; determining a braking level of the equipped vehicle to mitigate collision of the equipped vehicle with the leading vehicle; employing a weighting factor when determining the braking level of the equipped vehicle to mitigate collision of the equipped vehicle with the leading vehicle; adjusting the weighting factor responsive at least in part to determining, via processing of image data captured by said camera, that the brake light of the leading vehicle ceases to be illuminated; and wherein adjusting the weighting factor comprises decreasing weight given to the determined time to collision to the leading vehicle responsive at least in part to determining, via processing of image data captured by said camera, that the brake light of the leading vehicle ceases to be illuminated.
  • 13. The method of claim 12, comprising, responsive at least in part to the determined time to collision being below a threshold value of time, and responsive at least in part to determining, via image processing by said image processor of image data captured by said camera, that the brake light of the leading vehicle is illuminated, determining a degree of warning to a driver of the equipped vehicle.
  • 14. The method of claim 13, comprising, after determining the degree of warning to the driver of the equipped vehicle, and responsive at least in part to determining, via image processing by said image processor of image data captured by said camera, that the brake light of the leading vehicle ceases to be illuminated, reducing the determined degree of warning to the driver of the equipped vehicle.
  • 15. The method of claim 13, comprising providing a warning in accordance with the determined degree of warning.
  • 16. The method of claim 15, comprising, after providing the warning in accordance with the determined degree of warning, and responsive at least in part to determining, via processing of image data captured by said camera, that the brake light of the leading vehicle ceases to be illuminated, reducing the degree of warning provided to the driver of the equipped vehicle.
  • 17. A method for determining potential collision with another vehicle by a vehicle equipped with a vision system, said method comprising: providing, at the equipped vehicle, a vision sensor comprising a camera having at least one million photosensing elements; providing, at the equipped vehicle, at least one non-vision sensor comprising at least one radar sensor; capturing, using said camera, image data of an exterior environment ahead of the equipped vehicle; determining presence of a leading vehicle ahead of the equipped vehicle; when the leading vehicle is determined to be present ahead of the equipped vehicle, determining, via processing by an image processor of image data captured by said camera, that a brake light of the leading vehicle is illuminated; determining distance to the leading vehicle; determining relative velocity between the equipped vehicle and the leading vehicle; wherein determining relative velocity between the equipped vehicle and the leading vehicle is responsive at least in part to at least one of (i) image processing by said image processor of image data captured by said camera and (ii) processing of radar data sensed by said at least one radar sensor of the equipped vehicle; determining a time to collision to the leading vehicle based at least in part on the determined distance to the leading vehicle and the determined relative velocity between the equipped vehicle and the leading vehicle; determining a braking level of the equipped vehicle to mitigate collision of the equipped vehicle with the leading vehicle; employing a weighting factor when determining the braking level of the equipped vehicle to mitigate collision of the equipped vehicle with the leading vehicle; and adjusting the weighting factor responsive at least in part to determining, via processing of image data captured by said camera, that the brake light of the leading vehicle ceases to be illuminated.
  • 18. The method of claim 17, comprising, responsive at least in part to the determined time to collision being below a threshold value of time, and responsive at least in part to determining, via image processing by said image processor of image data captured by said camera, that the brake light of the leading vehicle is illuminated, determining a degree of warning to a driver of the equipped vehicle.
  • 19. The method of claim 18, comprising, after determining the degree of warning to the driver of the equipped vehicle, and responsive at least in part to determining, via image processing by said image processor of image data captured by said camera, that the brake light of the leading vehicle ceases to be illuminated, reducing the determined degree of warning to the driver of the equipped vehicle.
  • 20. The method of claim 18, comprising providing a warning in accordance with the determined degree of warning, and, after providing the warning in accordance with the determined degree of warning, and responsive at least in part to determining, via processing of image data captured by said camera, that the brake light of the leading vehicle ceases to be illuminated, reducing the degree of warning provided to the driver of the equipped vehicle.
CROSS REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of U.S. patent application Ser. No. 15/042,666, filed Feb. 12, 2016, now U.S. Pat. No. 9,824,587, which is a continuation of U.S. patent application Ser. No. 14/303,694, filed Jun. 13, 2014, now U.S. Pat. No. 9,260,095, which claims the filing benefits of U.S. provisional application Ser. No. 61/836,900, filed Jun. 19, 2013, which is hereby incorporated herein by reference in its entirety.

US Referenced Citations (433)
Number Name Date Kind
4720790 Miki et al. Jan 1988 A
4987357 Masaki Jan 1991 A
4991054 Walters Feb 1991 A
5001558 Burley et al. Mar 1991 A
5003288 Wilhelm Mar 1991 A
5012082 Watanabe Apr 1991 A
5016977 Baude et al. May 1991 A
5027001 Torbert Jun 1991 A
5027200 Petrossian et al. Jun 1991 A
5044706 Chen Sep 1991 A
5055668 French Oct 1991 A
5059877 Teder Oct 1991 A
5064274 Alten Nov 1991 A
5072154 Chen Dec 1991 A
5073012 Lynam Dec 1991 A
5076673 Lynam et al. Dec 1991 A
5086253 Lawler Feb 1992 A
5096287 Kakinami et al. Mar 1992 A
5097362 Lynas Mar 1992 A
5115346 Lynam May 1992 A
5121200 Choi Jun 1992 A
5124549 Michaels et al. Jun 1992 A
5130709 Toyama et al. Jul 1992 A
5148014 Lynam Sep 1992 A
5151816 Varaprasad et al. Sep 1992 A
5168378 Black Dec 1992 A
5170374 Shimohigashi et al. Dec 1992 A
5172235 Wilm et al. Dec 1992 A
5177685 Davis et al. Jan 1993 A
5182502 Slotkowski et al. Jan 1993 A
5184956 Langlais et al. Feb 1993 A
5189561 Hong Feb 1993 A
5193000 Lipton et al. Mar 1993 A
5193029 Schofield Mar 1993 A
5204778 Bechtel Apr 1993 A
5208701 Maeda May 1993 A
5245422 Borcherts et al. Sep 1993 A
5253109 O'Farrell Oct 1993 A
5255442 Schierbeek et al. Oct 1993 A
5276389 Levers Jan 1994 A
5285060 Larson et al. Feb 1994 A
5289182 Brillard et al. Feb 1994 A
5289321 Secor Feb 1994 A
5305012 Faris Apr 1994 A
5307136 Saneyoshi Apr 1994 A
5309137 Kajiwara May 1994 A
5313072 Vachss May 1994 A
5325096 Pakett Jun 1994 A
5325386 Jewell et al. Jun 1994 A
5329206 Slotkowski et al. Jul 1994 A
5331312 Kudoh Jul 1994 A
5336980 Levers Aug 1994 A
5341437 Nakayama Aug 1994 A
5351044 Mathur et al. Sep 1994 A
5355118 Fukuhara Oct 1994 A
5357438 Davidian Oct 1994 A
5374852 Parkes Dec 1994 A
5386285 Asayama Jan 1995 A
5394333 Kao Feb 1995 A
5406395 Wilson et al. Apr 1995 A
5406414 O'Farrell et al. Apr 1995 A
5410346 Saneyoshi et al. Apr 1995 A
5414257 Stanton May 1995 A
5414461 Kishi et al. May 1995 A
5416313 Larson et al. May 1995 A
5416318 Hegyi May 1995 A
5416478 Morinaga May 1995 A
5424952 Asayama Jun 1995 A
5426294 Kobayashi et al. Jun 1995 A
5430431 Nelson Jul 1995 A
5434407 Bauer et al. Jul 1995 A
5440428 Hegg et al. Aug 1995 A
5444478 Lelong et al. Aug 1995 A
5451822 Bechtel et al. Sep 1995 A
5457493 Leddy et al. Oct 1995 A
5461357 Yoshioka et al. Oct 1995 A
5461361 Moore Oct 1995 A
5469298 Suman et al. Nov 1995 A
5471515 Fossum et al. Nov 1995 A
5475494 Nishida et al. Dec 1995 A
5497306 Pastrick Mar 1996 A
5498866 Bendicks et al. Mar 1996 A
5500766 Stonecypher Mar 1996 A
5510983 Iino Apr 1996 A
5515448 Nishitani May 1996 A
5521633 Nakajima et al. May 1996 A
5528698 Kamei et al. Jun 1996 A
5529138 Shaw et al. Jun 1996 A
5530240 Larson et al. Jun 1996 A
5530420 Tsuchiya et al. Jun 1996 A
5535314 Alves et al. Jul 1996 A
5537003 Bechtel et al. Jul 1996 A
5539397 Asanuma et al. Jul 1996 A
5541590 Nishio Jul 1996 A
5550677 Schofield et al. Aug 1996 A
5555555 Sato et al. Sep 1996 A
5568027 Teder Oct 1996 A
5574443 Hsieh Nov 1996 A
5581464 Woll et al. Dec 1996 A
5594222 Caldwell Jan 1997 A
5610756 Lynam et al. Mar 1997 A
5614788 Mullins Mar 1997 A
5619370 Guinosso Apr 1997 A
5632092 Blank et al. May 1997 A
5634709 Iwama Jun 1997 A
5642299 Hardin et al. Jun 1997 A
5648835 Uzawa Jul 1997 A
5650944 Kise Jul 1997 A
5660454 Mori et al. Aug 1997 A
5661303 Teder Aug 1997 A
5666028 Bechtel et al. Sep 1997 A
5670935 Schofield et al. Sep 1997 A
5677851 Kingdon et al. Oct 1997 A
5699044 Van Lente et al. Dec 1997 A
5724316 Brunts Mar 1998 A
5732379 Eckert et al. Mar 1998 A
5737226 Olson et al. Apr 1998 A
5760828 Cortes Jun 1998 A
5760931 Saburi et al. Jun 1998 A
5760962 Schofield et al. Jun 1998 A
5761094 Olson et al. Jun 1998 A
5765116 Wilson-Jones et al. Jun 1998 A
5765118 Fukatani Jun 1998 A
5781437 Wiemer et al. Jul 1998 A
5786772 Schofield et al. Jul 1998 A
5790403 Nakayama Aug 1998 A
5790973 Blaker et al. Aug 1998 A
5793308 Rosinski et al. Aug 1998 A
5793420 Schmidt Aug 1998 A
5796094 Schofield et al. Aug 1998 A
5835255 Miles Nov 1998 A
5837994 Stam et al. Nov 1998 A
5844505 Van Ryzin Dec 1998 A
5844682 Kiyomoto et al. Dec 1998 A
5845000 Breed et al. Dec 1998 A
5848802 Breed et al. Dec 1998 A
5850176 Kinoshita et al. Dec 1998 A
5850254 Takano et al. Dec 1998 A
5867591 Onda Feb 1999 A
5877707 Kowalick Mar 1999 A
5877897 Schofield et al. Mar 1999 A
5878357 Sivashankar et al. Mar 1999 A
5878370 Olson Mar 1999 A
5883739 Ashihara et al. Mar 1999 A
5884212 Lion Mar 1999 A
5890021 Onoda Mar 1999 A
5896085 Mori et al. Apr 1999 A
5899956 Chan May 1999 A
5915800 Hiwatashi et al. Jun 1999 A
5923027 Stam et al. Jul 1999 A
5924212 Domanski Jul 1999 A
5929786 Schofield et al. Jul 1999 A
5949331 Schofield et al. Sep 1999 A
5959555 Furuta Sep 1999 A
5963247 Banitt Oct 1999 A
5986796 Miles Nov 1999 A
5990469 Bechtel et al. Nov 1999 A
5990649 Nagao et al. Nov 1999 A
6020704 Buschur Feb 2000 A
6049171 Stam et al. Apr 2000 A
6066933 Ponziana May 2000 A
6084519 Coulling et al. Jul 2000 A
6097023 Schofield et al. Aug 2000 A
6097024 Stam et al. Aug 2000 A
6100799 Fenk Aug 2000 A
6144022 Tenenbaum et al. Nov 2000 A
6175300 Kendrick Jan 2001 B1
6178034 Allemand et al. Jan 2001 B1
6185499 Kinoshita et al. Feb 2001 B1
6198409 Schofield et al. Mar 2001 B1
6201642 Bos et al. Mar 2001 B1
6222447 Schofield et al. Apr 2001 B1
6223114 Boros et al. Apr 2001 B1
6227689 Miller May 2001 B1
6250148 Lynam Jun 2001 B1
6266082 Yonezawa et al. Jul 2001 B1
6266442 Laumeyer et al. Jul 2001 B1
6285393 Shimoura et al. Sep 2001 B1
6294989 Schofield et al. Sep 2001 B1
6297781 Turnbull et al. Oct 2001 B1
6302545 Schofield et al. Oct 2001 B1
6310611 Caldwell Oct 2001 B1
6313454 Bos et al. Nov 2001 B1
6317057 Lee Nov 2001 B1
6320176 Schofield et al. Nov 2001 B1
6320282 Caldwell Nov 2001 B1
6333759 Mazzilli Dec 2001 B1
6341523 Lynam Jan 2002 B2
6353392 Schofield et al. Mar 2002 B1
6370329 Teuchert Apr 2002 B1
6392315 Jones et al. May 2002 B1
6396397 Bos et al. May 2002 B1
6411204 Bloomfield et al. Jun 2002 B1
6420975 DeLine et al. Jul 2002 B1
6424273 Gutta et al. Jul 2002 B1
6430303 Naoi et al. Aug 2002 B1
6442465 Breed et al. Aug 2002 B2
6477464 McCarthy et al. Nov 2002 B2
6497503 Dassanayake et al. Dec 2002 B1
6498620 Schofield et al. Dec 2002 B2
6516664 Lynam Feb 2003 B2
6523964 Schofield et al. Feb 2003 B2
6534884 Marcus et al. Mar 2003 B2
6539306 Turnbull Mar 2003 B2
6547133 DeVries, Jr. et al. Apr 2003 B1
6553130 Lemelson et al. Apr 2003 B1
6559435 Schofield et al. May 2003 B2
6574033 Chui et al. Jun 2003 B1
6589625 Kothari et al. Jul 2003 B1
6594583 Ogura et al. Jul 2003 B2
6611202 Schofield et al. Aug 2003 B2
6611610 Stam et al. Aug 2003 B1
6636258 Strumolo Oct 2003 B2
6650455 Miles Nov 2003 B2
6672731 Schnell et al. Jan 2004 B2
6674562 Miles Jan 2004 B1
6678614 McCarthy et al. Jan 2004 B2
6680792 Miles Jan 2004 B2
6684143 Graf et al. Jan 2004 B2
6690268 Schofield et al. Feb 2004 B2
6700605 Toyoda et al. Mar 2004 B1
6704621 Stein et al. Mar 2004 B1
6710908 Miles et al. Mar 2004 B2
6711474 Treyz et al. Mar 2004 B1
6714331 Lewis et al. Mar 2004 B2
6717610 Bos et al. Apr 2004 B1
6735506 Breed et al. May 2004 B2
6741377 Miles May 2004 B2
6744353 Sjönell Jun 2004 B2
6757109 Bos Jun 2004 B2
6762867 Lippert et al. Jul 2004 B2
6794119 Miles Sep 2004 B2
6795221 Urey Sep 2004 B1
6802617 Schofield et al. Oct 2004 B2
6806452 Bos et al. Oct 2004 B2
6819231 Berberich et al. Nov 2004 B2
6822563 Bos et al. Nov 2004 B2
6823241 Shirato et al. Nov 2004 B2
6824281 Schofield et al. Nov 2004 B2
6831261 Schofield et al. Dec 2004 B2
6850156 Bloomfield et al. Feb 2005 B2
6882287 Schofield Apr 2005 B2
6889161 Winner et al. May 2005 B2
6891563 Schofield et al. May 2005 B2
6909753 Meehan et al. Jun 2005 B2
6946978 Schofield Sep 2005 B2
6953253 Schofield et al. Oct 2005 B2
6968736 Lynam Nov 2005 B2
6975775 Rykowski et al. Dec 2005 B2
6989736 Berberich et al. Jan 2006 B2
7004606 Schofield Feb 2006 B2
7005974 McMahon et al. Feb 2006 B2
7038577 Pawlicki et al. May 2006 B2
7062300 Kim Jun 2006 B1
7065432 Moisel et al. Jun 2006 B2
7077549 Corliss Jul 2006 B1
7079017 Lang et al. Jul 2006 B2
7085637 Breed et al. Aug 2006 B2
7092548 Laumeyer et al. Aug 2006 B2
7111968 Bauer et al. Sep 2006 B2
7116246 Winter et al. Oct 2006 B2
7123168 Schofield Oct 2006 B2
7136753 Samukawa Nov 2006 B2
7145519 Takahashi et al. Dec 2006 B2
7149613 Stam et al. Dec 2006 B2
7161616 Okamoto et al. Jan 2007 B1
7167796 Taylor et al. Jan 2007 B2
7195381 Lynam et al. Mar 2007 B2
7202776 Breed Apr 2007 B2
7205904 Schofield Apr 2007 B2
7227459 Bos et al. Jun 2007 B2
7227611 Hull et al. Jun 2007 B2
7301478 Chinn et al. Nov 2007 B1
7311406 Schofield et al. Dec 2007 B2
7325934 Schofield et al. Feb 2008 B2
7325935 Schofield et al. Feb 2008 B2
7338177 Lynam Mar 2008 B2
7339149 Schofield et al. Mar 2008 B1
7344261 Schofield et al. Mar 2008 B2
7355524 Schofield Apr 2008 B2
7365769 Mager Apr 2008 B1
7370983 De Wind et al. May 2008 B2
7380948 Schofield et al. Jun 2008 B2
7388182 Schofield et al. Jun 2008 B2
7402786 Schofield et al. Jul 2008 B2
7423248 Schofield et al. Sep 2008 B2
7425076 Schofield et al. Sep 2008 B2
7446650 Schofield et al. Nov 2008 B2
7459664 Schofield et al. Dec 2008 B2
7460951 Altan Dec 2008 B2
7480149 DeWard et al. Jan 2009 B2
7490007 Taylor et al. Feb 2009 B2
7492281 Lynam et al. Feb 2009 B2
7526103 Schofield et al. Apr 2009 B2
7561181 Schofield et al. Jul 2009 B2
7566851 Stein Jul 2009 B2
7581859 Lynam Sep 2009 B2
7592928 Chinomi et al. Sep 2009 B2
7616781 Schofield et al. Nov 2009 B2
7619508 Lynam et al. Nov 2009 B2
7639149 Katoh Dec 2009 B2
7681960 Wanke et al. Mar 2010 B2
7720580 Higgins-Luthman May 2010 B2
7724962 Zhu et al. May 2010 B2
7777611 Desai Aug 2010 B2
7855755 Weller et al. Dec 2010 B2
7859565 Schofield et al. Dec 2010 B2
7881496 Camilleri et al. Feb 2011 B2
7914187 Higgins-Luthman et al. Mar 2011 B2
7952490 Fechner et al. May 2011 B2
7965336 Bingle et al. Jun 2011 B2
8013780 Lynam et al. Sep 2011 B2
8027029 Lu et al. Sep 2011 B2
8031062 Smith Oct 2011 B2
8058977 Lynam Nov 2011 B2
8078356 Kano et al. Dec 2011 B2
8144002 Kiuchi Mar 2012 B2
8340866 Hanzawa et al. Dec 2012 B2
8473171 Zagorski Jun 2013 B2
8588997 Pribula et al. Nov 2013 B2
8637801 Schofield et al. Jan 2014 B2
8694224 Chundrlik, Jr. et al. Apr 2014 B2
8849495 Chundrlik, Jr. et al. Sep 2014 B2
8855844 Schwindt Oct 2014 B2
8996224 Herbach Mar 2015 B1
9042600 Endo May 2015 B2
9092986 Salomonsson et al. Jul 2015 B2
9260095 Chundrlik, Jr. Feb 2016 B2
9264673 Chundrlik, Jr. et al. Feb 2016 B2
9286520 Lo Mar 2016 B1
9317757 Winter et al. Apr 2016 B2
9318020 Salomonsson et al. Apr 2016 B2
9327693 Wolf May 2016 B2
9436880 Bos et al. Sep 2016 B2
9509957 Higgins-Luthman et al. Nov 2016 B2
9545921 Wolf Jan 2017 B2
9563809 Salomonsson et al. Feb 2017 B2
9643605 Pawlicki et al. May 2017 B2
9666067 Nagpal et al. May 2017 B1
9701307 Newman Jul 2017 B1
9738125 Brickley Aug 2017 B1
9769381 Lu et al. Sep 2017 B2
9824587 Chundrlik, Jr. Nov 2017 B2
20020015153 Downs Feb 2002 A1
20020044065 Quist et al. Apr 2002 A1
20020113873 Williams Aug 2002 A1
20020115423 Hatae Aug 2002 A1
20020159270 Lynam et al. Oct 2002 A1
20030137586 Lewellen Jul 2003 A1
20030222982 Hamdan et al. Dec 2003 A1
20030227777 Schofield Dec 2003 A1
20040012488 Schofield Jan 2004 A1
20040016870 Pawlicki et al. Jan 2004 A1
20040032321 McMahon et al. Feb 2004 A1
20040051634 Schofield et al. Mar 2004 A1
20040114381 Salmeen et al. Jun 2004 A1
20040128065 Taylor et al. Jul 2004 A1
20040200948 Bos et al. Oct 2004 A1
20050078389 Kulas et al. Apr 2005 A1
20050134966 Burgner Jun 2005 A1
20050134983 Lynam Jun 2005 A1
20050146792 Schofield et al. Jul 2005 A1
20050169003 Lindahl et al. Aug 2005 A1
20050195488 McCabe et al. Sep 2005 A1
20050200700 Schofield et al. Sep 2005 A1
20050232469 Schofield et al. Oct 2005 A1
20050237172 Boomershine, III Oct 2005 A1
20050264891 Uken et al. Dec 2005 A1
20060018511 Stam et al. Jan 2006 A1
20060018512 Stam et al. Jan 2006 A1
20060028731 Schofield et al. Feb 2006 A1
20060050018 Hutzel et al. Mar 2006 A1
20060061008 Karner et al. Mar 2006 A1
20060091813 Stam et al. May 2006 A1
20060103727 Tseng May 2006 A1
20060155469 Kawasaki Jul 2006 A1
20060164221 Jensen Jul 2006 A1
20060164230 DeWind et al. Jul 2006 A1
20060250501 Wildmann et al. Nov 2006 A1
20060290479 Akatsuka et al. Dec 2006 A1
20070023613 Schofield et al. Feb 2007 A1
20070104476 Yasutomi et al. May 2007 A1
20070109406 Schofield et al. May 2007 A1
20070109651 Schofield et al. May 2007 A1
20070109652 Schofield et al. May 2007 A1
20070109653 Schofield et al. May 2007 A1
20070109654 Schofield et al. May 2007 A1
20070120657 Schofield et al. May 2007 A1
20070152803 Huang et al. Jul 2007 A1
20070176080 Schofield et al. Aug 2007 A1
20080180529 Taylor et al. Jul 2008 A1
20080189000 Duong Aug 2008 A1
20090093938 Isaji et al. Apr 2009 A1
20090113509 Tseng et al. Apr 2009 A1
20090143986 Stein et al. Jun 2009 A1
20090177347 Breuer et al. Jul 2009 A1
20090243824 Peterson et al. Oct 2009 A1
20090244361 Gebauer et al. Oct 2009 A1
20090265069 Desbrunes Oct 2009 A1
20090295181 Lawlor et al. Dec 2009 A1
20100020170 Higgins-Luthman et al. Jan 2010 A1
20100045797 Schofield et al. Feb 2010 A1
20100090863 Chen Apr 2010 A1
20100097469 Blank et al. Apr 2010 A1
20100228437 Hanzawa et al. Sep 2010 A1
20120013459 Giangrande Jan 2012 A1
20120044066 Mauderer et al. Feb 2012 A1
20120062743 Lynam et al. Mar 2012 A1
20120218412 Dellantoni et al. Aug 2012 A1
20120245817 Cooprider et al. Sep 2012 A1
20120262340 Hassan et al. Oct 2012 A1
20120287276 Dwivedi et al. Nov 2012 A1
20120303222 Cooprider et al. Nov 2012 A1
20130124052 Hahne May 2013 A1
20130129150 Saito May 2013 A1
20130131918 Hahne May 2013 A1
20130190972 Pribula et al. Jul 2013 A1
20130231825 Chundrlik, Jr. Sep 2013 A1
20140067206 Pflug Mar 2014 A1
20140156157 Johnson et al. Jun 2014 A1
20140222280 Salomonsson Aug 2014 A1
20140309884 Wolf Oct 2014 A1
20140313339 Diessner et al. Oct 2014 A1
20150085119 Dagan et al. Mar 2015 A1
20160016560 Parker et al. Jan 2016 A1
20160110620 Botusescu et al. Apr 2016 A1
20160318490 Ben Shalom Nov 2016 A1
20170108863 Chundrlik, Jr. Apr 2017 A1
20170243071 Stein et al. Aug 2017 A1
20170259815 Shaker Sep 2017 A1
20170262712 Chundrlik, Jr. et al. Sep 2017 A1
20190206260 Pilkington Jul 2019 A1
20200005646 Wong Jan 2020 A1
Related Publications (1)

    Number            Date       Country
    20180075752 A1    Mar 2018   US

Provisional Applications (1)

    Number      Date       Country
    61836900    Jun 2013   US

Continuations (2)

    Number              Date       Country
    Parent 15042666     Feb 2016   US
    Child  15817611                US
    Parent 14303694     Jun 2014   US
    Child  15042666                US