The present invention relates to imaging systems or vision systems for vehicles.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.
The present invention provides a vision system or imaging system for a vehicle that utilizes one or more cameras to capture images exterior of the vehicle, and provides the communication/data signals, including camera data or image data that may be displayed or processed to provide the desired display images and/or processing and control, depending on the particular application of the camera and vision or imaging system.
According to an aspect of the present invention, a vision system or alert system is operable, based on image processing of image data captured by a forward facing camera of the vehicle, to determine when the vehicle is stopped at a traffic light and another vehicle is ahead of the equipped vehicle at the traffic light, and is further operable to determine when the traffic light changes to green and the vehicle in front of the equipped or subject vehicle begins to move forward away from the equipped vehicle. At least in part responsive to such detection or determination, the system is operable to generate an alert or notification to the driver of the equipped vehicle and/or the system may govern or control forward movement of the equipped vehicle. The system thus alerts the driver of the possibility or likelihood that a traffic light has changed to green or the like, whereby the driver may, if appropriate, proceed forward to follow the leading vehicle into or through the intersection or the like.
According to another aspect of the present invention, an automatic braking system for a vehicle comprises a forward viewing camera and a rearward viewing camera disposed at a vehicle and an image processor operable to process image data captured by the forward viewing camera and the rearward viewing camera. Responsive at least in part to a determination that the equipped vehicle is approaching an object (such as a leading vehicle or other object) present forwardly of the equipped vehicle (such as in the lane being traveled by the equipped vehicle and/or in the forward path of travel of the equipped vehicle), the automatic braking system is operable to apply a vehicle brake of the equipped vehicle to mitigate or reduce the likelihood of collision with the determined object. Responsive at least in part to a determination that another vehicle is following the equipped vehicle (such as in the lane being traveled by the equipped vehicle and/or otherwise trailing or following the equipped vehicle) and the determined following vehicle is at least one of (i) within a threshold distance from the equipped vehicle and (ii) approaching the equipped vehicle at a threshold rate, the automatic braking system adjusts or reduces application of the vehicle brakes to mitigate or reduce the likelihood of a rear collision by the determined following vehicle. For example, the system may reduce the degree of braking responsive to a determination that a following vehicle is too close or within a threshold distance and/or is approaching too fast or above a threshold rate of approach, in order to mitigate the potential rear collision with the following vehicle upon application of the brakes to mitigate or avoid a front collision with the leading vehicle.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A driver assist system and/or vision system and/or object detection system and/or alert system may operate to capture images exterior of the vehicle and process the captured image data to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction. The object detection may utilize detection and analysis of moving vectors representative of objects detected in the field of view of the vehicle camera, in order to determine which detected objects are objects of interest to the driver of the vehicle, such as when the driver of the vehicle undertakes a reversing maneuver.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes one or more imaging sensors or cameras (such as a rearward facing imaging sensor or camera 14a and/or a forwardly facing camera 14b at the front (or at the windshield) of the vehicle, and/or a sidewardly/rearwardly facing camera 14c, 14d at the sides of the vehicle), which capture images exterior of the vehicle, with the cameras having a lens for focusing images at or onto an imaging array or imaging plane of the camera.
When a driver of a vehicle is stopped at a red traffic light and waiting for it to change to green, the driver may not pay attention to the traffic light status, and may concentrate on other items or the like. For example, the driver of the stopped vehicle may check text messages or emails or the like while waiting for the traffic light to change to green. When not paying attention to the traffic light or to the traffic or vehicle ahead of the driver's vehicle, the driver may not notice when the traffic light turns green and when the vehicle or vehicles ahead have proceeded into the intersection or the like. Often, drivers that are slow to start moving when a traffic signal changes to green are slow to respond due to inattention, or rather due to attention misdirected to emails or texts or the like instead of to the traffic light and/or traffic ahead of the equipped vehicle, and thus do not notice that the vehicle in front of them has just moved forward and away from the equipped vehicle.
The present invention provides a feature or alert system which, based on the vehicle's forward viewing camera or forward camera module, and using the camera's or vision system's vehicle detection algorithms, detects that the vehicle in front of the equipped vehicle is starting to move forward and, responsive to such detection, alerts the driver of the equipped vehicle to look up and, if appropriate, to also start driving the vehicle forward. The system is operable to determine when the equipped vehicle is at a traffic light, when the traffic light is red or green, and when another vehicle is ahead of the equipped vehicle. Thus, when the system determines that the equipped vehicle is stopped behind another vehicle at a red light, and then determines that the light changes to green and the leading vehicle moves away and into the intersection, the system may generate an alert or control signal, such as after the leading vehicle moves a threshold distance ahead of the equipped vehicle with the equipped vehicle still not moving. The threshold value for the distance between the departing leading vehicle and the equipped vehicle at which the alert is generated may be any suitable distance, such as, for example, at least about two meters or at least about three meters or more, in order to make sure that the alert is a valid notification.
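By way of illustration, the following is a minimal Python sketch of the departure-distance check just described. The signal names (current gap, gap when stopped, ego speed) and the three meter value are illustrative assumptions drawn from the example thresholds above, not a definitive implementation:

```python
# Minimal sketch of the departure-distance check, assuming the vision
# system already reports the gap to the leading vehicle (e.g., from the
# forward camera's vehicle-detection algorithms) and the ego speed.
DEPART_THRESHOLD_M = 3.0  # e.g., about two to three meters, per the text


def lead_vehicle_departing(current_gap_m: float,
                           gap_when_stopped_m: float,
                           ego_speed_mps: float) -> bool:
    """True when the leading vehicle has moved at least the threshold
    distance away while the equipped vehicle is still stationary."""
    ego_stopped = ego_speed_mps < 0.1
    return ego_stopped and (current_gap_m - gap_when_stopped_m) >= DEPART_THRESHOLD_M
```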
The present invention thus provides a new function for a front camera 114b of a vehicle vision system of a vehicle 110.
Preferably, the alert system will operate in a manner that will not annoy the driver with warnings if the driver is already aware of what is going on ahead of the equipped vehicle. For example, in addition to the detection of the movement of the vehicle in front of the equipped vehicle, the alert system of the present invention may not provide an alert when the driver of the equipped vehicle takes actions to initiate movement of the equipped vehicle, such as applying the accelerator or turning the steering wheel or engaging or disengaging the clutch or any other suitable actions or parameters that indicate that the driver is alert and is driving the vehicle or is about to drive the vehicle.
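A minimal sketch of such a driver-activity gate is shown below, assuming hypothetical accelerator, steering-rate and clutch signals; the specific thresholds are illustrative only:

```python
# Hypothetical driver-activity gate: suppress the alert when recent
# driver inputs indicate the driver is already attentive and is driving
# or about to drive the vehicle.
def driver_appears_attentive(accelerator_pct: float,
                             steering_rate_dps: float,
                             clutch_state_changed: bool) -> bool:
    return (accelerator_pct > 2.0            # accelerator being applied
            or abs(steering_rate_dps) > 5.0  # steering wheel being turned
            or clutch_state_changed)         # clutch engaged/disengaged


def should_issue_alert(lead_departing: bool, *,
                       accelerator_pct: float,
                       steering_rate_dps: float,
                       clutch_state_changed: bool) -> bool:
    return lead_departing and not driver_appears_attentive(
        accelerator_pct, steering_rate_dps, clutch_state_changed)
```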
Thus, the alert system of the present invention provides reliable detection of the target or leading vehicle speed and/or the distance between the target or leading vehicle and the equipped vehicle in order to avoid warning or alerting the driver of the equipped vehicle every time the target vehicle advances slightly to adjust within a line of vehicles at a traffic light or intersection or the like. The notification or alert to the driver of the equipped vehicle also should be provided early enough to be useful to the driver. For example, the threshold setting may be set low enough to provide an alert when it is highly likely that the driver is inattentive but high enough to avoid false alerts. Optionally, the threshold setting may be adjustable or adaptive for different drivers.
Optionally, with the increased proliferation of start-stop technology, it is envisioned that the present invention may be operable to start the engine of the equipped vehicle even before the driver presses the gas pedal based on the intersection alert (such as when the system determines that the light changes to green and/or when the system determines that the leading vehicle starts to move away from the stopped equipped vehicle) to save time in moving the vehicle forward. The starting of the engine would also provide an alert or indication to the driver that he or she should pay attention to the traffic and/or traffic light ahead of the equipped vehicle.
Optionally, the alert system of the present invention may be responsive to detection of other items or events in addition to or instead of detection of the forward movement of the vehicle ahead of the equipped vehicle. For example, the system may process image data captured by the forward viewing camera to detect when the traffic or intersection light changes from red to green, whereby the system may generate the alert when forward movement of the leading vehicle is detected and when the system detects that the traffic light is green. Such additional sensing may be implemented when the traffic light is in clear view of the camera, and thus may comprise a complementary or auxiliary sensing, but not the main sensing parameter or input for the system. Such sensing of the state of the traffic light may allow the alert system to operate in situations where the equipped vehicle is at an intersection with no vehicles ahead of it, whereby an alert may be generated responsive to a detection of the traffic light changing to green and no indication that the driver of the equipped vehicle is aware of the change.
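For illustration, the sketch below shows one hypothetical way the traffic-light state could gate the alert, treating the light as a complementary input when a leading vehicle is present and as the sole input when none is; all names and state strings are assumptions:

```python
# Sketch of the complementary traffic-light gate. When a leading vehicle
# is present, its departure remains the primary trigger and the light
# state merely corroborates it; with no vehicle ahead, the detected
# red-to-green change alone may trigger the alert.
def alert_condition(lead_present: bool, lead_departing: bool,
                    light_state: str) -> bool:
    if lead_present:
        # A green (or undetermined) light corroborates the departure cue.
        return lead_departing and light_state in ("green", "unknown")
    # No vehicle ahead: rely on the detected light change itself.
    return light_state == "green"
```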
Optionally, the alert system may only operate to detect the movement of the leading vehicle and alert the driver of the equipped vehicle accordingly, if the system first detects or determines that the equipped vehicle is at a traffic light intersection, or optionally if the system detects or determines that the equipped vehicle is in a line of at least two vehicles at a stop sign or the like. Optionally, the alert system may be operable to determine movement of the leading vehicle and to generate the alert in response to first determining, such as via a GPS system of the vehicle or the like, that the equipped vehicle is stopped at or near an intersection.
Optionally, by using vehicle-to-roadside bidirectional communication or roadside-to-vehicle communication from the intersection light or signal to the camera, the alert system may receive a signal or communication that is indicative of when the traffic light switches to green (such as by utilizing aspects of V2V communications or X2V communications or the like). Such a communication may also augment or supplement the sensing of forward movement of a leading vehicle.
Optionally, the alert system may link the knowledge of the distance to the target or leading vehicle (when the leading vehicle is stationary or when starting to move forward) to an overall vehicle safety system, whereby the system may use such information during a rear end collision event at the equipped vehicle in order to mitigate the impact or collision.
Optionally, the alert system may be operable to provide a start notification using an in-vehicle telematics system or communication protocol, such as an in-vehicle BLUETOOTH® system or the like. For example, responsive to detection of the leading vehicle moving forward (indicative of, for example, the traffic light changing to a green light), an alert or notification or output from the vehicle alert system may be communicated to the driver's PDA or cell phone or smartphone or communication device or the like as a means for the start notification. Such an alert may be useful since the driver, who is not moving the vehicle forward with the vehicle ahead of the equipped vehicle, may already be looking at his or her PDA or cell phone or smartphone display or the like, and even if not looking at the PDA or cell phone or smartphone or the like, will not be annoyed by any audible (such as a chime or voice message) notification or visual notification from the vehicle. For example, the driver's PDA or cell phone or smartphone or communication device may display “get moving” or any similar text or message or icon or the like (and such a visual message may be coupled with any audible alert or chime if the user so chooses). The alert may be selectable by the driver so that the alert that is provided is acceptable to and preferred by the driver of the equipped vehicle.
Optionally, such an alert may be generated by the cell phone or smartphone or PDA or communication device or the like only when the cell phone or smartphone or PDA or communication device is in a certain type of application that likely has the driver's attention at that time (such as, for example, when the driver's phone is in an email mode or text messaging mode or internet browsing mode or any active mode or app or game that would typically require the user's attention), and the system may limit such notification frequency even further to keep annoyance to a reduced level or minimum level. The alert system may communicate with the driver's communication device and may receive a signal or output therefrom that is indicative of the current state of the device, such as what app or function the device is currently operating, and the alert system may, responsive at least in part to such a determination, communicate an alert to the communication device to alert the driver of the change in traffic light or movement of the vehicle or vehicles ahead of the equipped vehicle.
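A minimal sketch of such device-state gating follows, assuming a hypothetical paired-phone interface; the mode names, rate limit and message text are illustrative (the "get moving" text follows the example above):

```python
import time

# Hypothetical phone-notification gate: alert only when the paired device
# is in an attention-demanding mode, and rate-limit to reduce annoyance.
ATTENTION_MODES = {"email", "messaging", "browsing", "game"}
MIN_INTERVAL_S = 30.0


class PairedPhone:
    """Stand-in for an in-vehicle BLUETOOTH link to the driver's device."""

    def current_mode(self) -> str:
        return "messaging"  # placeholder for a real device-state query

    def show_message(self, text: str) -> None:
        print(f"phone alert: {text}")


def notify_if_appropriate(phone: PairedPhone, last_alert_s: float) -> float:
    """Send the start notification if warranted; return last-alert time."""
    now = time.monotonic()
    if phone.current_mode() in ATTENTION_MODES and now - last_alert_s >= MIN_INTERVAL_S:
        phone.show_message("get moving")  # message text suggested above
        return now
    return last_alert_s
```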
Optionally, and alternatively (or complementarily) to any of the above notifications or alerts, the start notification may comprise an alert or change at the radio/NAV display or the like. Such an alert may be provided in cases where the driver is entering GPS data or adjusting the controls or the like of the radio and/or navigation and/or telematics system of the vehicle (or any other system or accessory of the vehicle that utilizes user inputs). For example, a big red symbol (or green symbol) on the NAV screen would be an attention grabber to alert the driver that it may be time to drive the equipped vehicle forward.
Thus, the present invention provides an alert system that is operable to determine when a vehicle ahead of the equipped vehicle moves forward from a stopped position, such as when a traffic light changes to green, and may determine that the leading vehicle moves a threshold distance ahead of the equipped vehicle and/or the equipped vehicle does not follow within a threshold time period or the like, and, responsive to such determination or determinations, alerts the driver to pay attention and, if appropriate, drive the equipped vehicle forward and into and/or through the intersection. The alert system may only generate the alert responsive to other detections or determinations or parameters, such as a detection that the driver is using a vehicle accessory or system or using a cell phone or PDA or communication device or the like, and thus is not likely paying attention to the current driving situation. Optionally, the alert system may only generate the alert if the detection of movement of the leading vehicle occurs after the equipped vehicle has stopped behind the leading vehicle or when the system determines that the equipped vehicle is at an intersection or the like.
Thus, the present invention may provide an alert system that utilizes a standard or existing front camera of a vehicle with existing forward field of view and vehicle detection algorithms. The present invention, by alerting the driver when it is time to commence or recommence driving, allows drivers to read incoming texts or emails or the like when the vehicle is stopped at an intersection, without concerns about sitting through a green light or holding up following traffic or the like. Such an alert system thus may encourage drivers not to text or check emails or the like while moving/driving the vehicle along the road, since they will be able to focus on the texts and emails at the next stop light, without worrying about not paying attention to the traffic light or vehicles ahead of the equipped vehicle. The present invention may also increase the efficiency of traffic flow at intersections by limiting or reducing the time that a vehicle may sit after the vehicle in front of it has moved forward. Such increased efficiency may also reduce irritation and possible road rage between drivers, and may reduce the number of rear end collisions at intersections based on less uncertainty, less driver hesitation, and better flow in the line of vehicles at the intersection. The present invention also increases the benefits from front cameras at little or close to no added cost, driving the acceptance and implementation of such forward viewing or front cameras in the marketplace.
The alert system utilizes a forward viewing camera that may be disposed at a forward region of the vehicle and/or at or behind the windshield of the vehicle, such as at a windshield electronics module or forward camera module or interior rearview mirror assembly or the like. The camera may comprise any suitable camera or imaging sensor, such as discussed below. The system includes an image processor for processing image data captured by the forward facing camera to determine that (i) the equipped vehicle is stopped at a red traffic light, (ii) another vehicle is ahead of the equipped vehicle at the traffic light, (iii) the traffic light changes to a green light, (iv) the leading vehicle moves away from the equipped vehicle and (v) the equipped vehicle does not move. When the equipped vehicle does not move for a threshold period of time after the leading vehicle moves (or after the traffic light changes to a green light), or when the leading vehicle moves a threshold distance away from the non-moving equipped vehicle, the system may generate an alert or control a vehicle system (such as the ignition system to restart a shut-off engine, or a display system to display a message, or a control system that provides a haptic signal, such as by vibrating the steering wheel, or the like) to alert the driver that it is time to drive the vehicle forward into the intersection.
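The determinations (i)-(v) above can be read as a small state machine. The following Python sketch is one hypothetical arrangement; the phase names, observation fields and threshold values are assumptions for illustration, not the claimed implementation:

```python
from dataclasses import dataclass
from enum import Enum, auto


class Phase(Enum):
    IDLE = auto()
    STOPPED_AT_RED = auto()  # (i) + (ii): stopped at red behind a vehicle
    LIGHT_GREEN = auto()     # (iii): light has changed to green


@dataclass
class Observation:
    ego_stopped: bool
    light_state: str          # "red", "green", "unknown"
    lead_present: bool
    lead_gap_m: float
    seconds_since_green: float


def step(phase: Phase, obs: Observation, stopped_gap_m: float,
         gap_threshold_m: float = 3.0,
         time_threshold_s: float = 2.0) -> tuple[Phase, bool]:
    """Advance the alert state machine; return (new_phase, alert)."""
    if phase is Phase.IDLE:
        if obs.ego_stopped and obs.light_state == "red" and obs.lead_present:
            return Phase.STOPPED_AT_RED, False
    elif phase is Phase.STOPPED_AT_RED:
        if obs.light_state == "green":
            return Phase.LIGHT_GREEN, False
        if not obs.ego_stopped:
            return Phase.IDLE, False
    elif phase is Phase.LIGHT_GREEN:
        if not obs.ego_stopped:
            return Phase.IDLE, False  # driver already moving: no alert
        moved_away = obs.lead_gap_m - stopped_gap_m >= gap_threshold_m
        waited = obs.seconds_since_green >= time_threshold_s
        if moved_away or waited:      # (iv)/(v): lead gone, ego still stopped
            return Phase.IDLE, True
    return phase, False
```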
Optionally, the system of the present invention may utilize a rear video camera that provides vehicle detection output to a front camera of the vehicle. For example, the system of the present invention may avoid rear end collisions (by a vehicle rearward of and following the equipped lead vehicle) by deliberately activating the rear brake lights of the equipped vehicle earlier, such as when a Forward Collision Alert (FCA) system determines that the time to collision (TTC) is approaching the trigger point for activation of the braking function of the lead (ego) vehicle (such as low speed collision mitigation by braking (LSCMB), pedestrian collision mitigation by braking (PedCMB) or the like). This may provide a useful function even without automatic braking, as a warning to drivers close behind an FCA equipped vehicle. The system basically provides a heads up or alert to the driver of a following vehicle or vehicles that the driver of the leading vehicle ahead may brake soon.
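A minimal sketch of such early brake-light activation follows, assuming a hypothetical TTC signal and trigger setting; the lead margin value is illustrative only:

```python
# Illuminate the brake lamps a short margin before the collision-
# mitigation braking trigger when a follower is detected behind the
# equipped vehicle (margin value is an assumption for the sketch).
BRAKE_LIGHT_LEAD_S = 0.5


def preactivate_brake_lights(ttc_s: float, cmb_trigger_ttc_s: float,
                             follower_detected: bool) -> bool:
    """True when the rear brake lights should be lit ahead of braking."""
    return follower_detected and ttc_s <= cmb_trigger_ttc_s + BRAKE_LIGHT_LEAD_S
```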
Optionally, the system may delay or minimize the CMB braking force of the lead equipped (ego) vehicle when the TTC trigger is reached if a vehicle is following close behind the lead vehicle, in order to limit or mitigate or avoid a rear end collision. This may conflict with the objective of braking for the obstacle ahead of the equipped vehicle, but if the distance to a detected object or obstacle in front of the equipped vehicle is known and the distance to the following vehicle is known, there is an optimal compromise available to enhance limiting or mitigation of or avoidance of collision with one or both objects/vehicles. This could also limit liabilities or required safety levels (including Automotive Safety Integrity Levels or ASILs) and may reduce or eliminate the need for a “perfect” front camera system that operates without false positives.
Optionally, the AEB system may be responsive to an output of a rear camera of the vehicle, and may adjust the braking responsive to a determination that another vehicle is following the equipped vehicle. For example, the system may provide a decreasing level of braking if a vehicle is determined to be closely behind the equipped vehicle and may only initiate a maximum braking or high braking if there is no vehicle detected behind the equipped vehicle or if there is a vehicle detected behind the equipped vehicle but a rear end collision is unavoidable and the maximum braking is needed to avoid or reduce or mitigate a collision with a detected object or vehicle ahead of the equipped vehicle.
Thus, the automatic braking system for a vehicle includes an image processor operable to process image data captured by front and rear cameras of the equipped vehicle. Responsive at least in part to a determination that the equipped vehicle is approaching an object determined to be present forwardly of the equipped vehicle, the system is operable to apply the vehicle brakes to reduce the likelihood of collision with the determined object. Also, responsive at least in part to a determination that another vehicle is following the equipped vehicle and within at least one of (i) a threshold distance from the equipped vehicle and (ii) a threshold rate of approach to the equipped vehicle, the system is operable to adjust the application of the vehicle brakes to reduce the likelihood of a rear collision by the determined following vehicle.
The braking system may be operable to determine a degree of application of the vehicle brakes to mitigate collision with the determined forward object and the determined following vehicle. The braking system may be operable to apply a maximum degree of braking only when the system determines that there is no vehicle following the equipped vehicle within a threshold distance from the equipped vehicle. Optionally, the braking system may be operable to apply a maximum or high degree of braking even when the system determines that there is a vehicle following the equipped vehicle but also determines that the rate of approach to the leading vehicle requires a high degree of braking to mitigate an imminent collision. The system may consider the rate of approach and distance to the leading vehicle and compare that to the rate of approach and distance to the trailing vehicle in determining the best degree of braking of the subject vehicle to mitigate either or both potential or imminent collisions.
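One hypothetical way to realize such an arbitration is sketched below. The thresholds, the softening rule and the up-to-about-1-g ceiling (echoing the discussion that follows) are assumptions for illustration, not values from the described system:

```python
from typing import Optional

# Illustrative arbitration between front-collision mitigation and rear-
# collision risk, assuming known front TTC and rear gap/closing speed.
MAX_DECEL_G = 1.0  # maximum braking, applied only when safe to do so


def commanded_decel_g(front_ttc_s: float,
                      rear_gap_m: Optional[float],
                      rear_closing_mps: float) -> float:
    imminent = front_ttc_s < 0.8            # front collision nearly unavoidable
    front_demand = MAX_DECEL_G if imminent else 0.6
    if rear_gap_m is None:
        return front_demand                 # no follower detected: brake fully
    follower_risk = rear_gap_m < 10.0 or rear_closing_mps > 3.0
    if follower_risk and not imminent:
        return min(front_demand, 0.4)       # soften braking for the follower
    return front_demand                     # imminent front threat outranks rear risk
```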
Thus, an AEB system is provided that uses input from a rearward facing camera and adjusts the braking level accordingly. Such a system allows for a lower ASIL for a front camera system. If the system detects that there is no vehicle within some distance behind the AEB vehicle, then the system can brake at a high level (such as, for example, up to about 1 g or thereabouts) without consequences (such as rear end collision) even for situations when the AEB system is triggered by a false positive. In other words, by only max-braking when there is no vehicle following the equipped vehicle, the risk of harming someone is reduced (which may translate to a lower required ASIL). In the fairly low occurrence case that someone is really close behind the equipped vehicle, the system may limit or reduce the applied braking responsive to a rear camera data input. Thus, such a system utilizes vehicle or object detection capability in the rear camera or multi-camera system of the vehicle.
Another benefit of the reduced risk of causing inadvertent rear end collisions (such as when the AEB system, responsive to a front camera output, issues the brake command without regard to what is following behind the equipped vehicle) is that the TTC (time to collision) brake trigger settings in the front camera could, optionally, be extended to allow for more false positives, while also ensuring that no proper (true positive) braking for a vehicle/pedestrian is ever missed.
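For reference, the TTC used in such trigger settings is commonly the constant-velocity estimate, the current gap divided by the closing speed; a minimal computation:

```python
def time_to_collision_s(range_m: float, closing_speed_mps: float) -> float:
    """Constant-velocity TTC: gap divided by closing speed. An opening or
    constant gap yields an effectively infinite TTC (no predicted collision)."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return range_m / closing_speed_mps
```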
Optionally, the system of the present invention may include or utilize additional lane departure warning (LDW) support data from lane markings detected behind the vehicle. Thus, the rear camera may augment the lane marking detection by the forward facing camera and associated processor. The image data captured by the rear facing camera may be processed by the same image processor as the front camera image data or a separate image processor may be used to process image data captured by the rearward facing imager or camera.
The system of the present invention thus uses the camera or cameras already present on the vehicle. For example, the camera or cameras used by the alert and/or braking system may be part of a multi-camera vision system or surround view system or rear backup aid system or forward facing camera system of the vehicle (and may utilize aspects of the systems described in U.S. Pat. No. 7,855,755, which is hereby incorporated herein by reference in its entirety). Such use of cameras already present on the vehicle for other purposes reduces the cost of the alert and/or braking system, since no dedicated cameras are needed when the system is added to the vehicle.
Optionally, the vision and/or alert system may utilize other types of forward facing or forward viewing sensors, such as a radar sensor or lidar sensor or the like, either instead of a camera or in conjunction with a camera. Optionally, the alert system may utilize a ladar sensor (a radar type sensor that uses lasers instead of radio frequencies), such as a ladar sensor that comprises a two dimensional (2D) optical phased array. The ladar sensor uses lasers instead of radio waves to scan a given area, emitting optical beams and returning information that is more detailed than radar. A 2D laser phased array developed by the Defense Advanced Research Projects Agency (DARPA) of Virginia is around the size of the head of a pin (about 576 μm×576 μm), and all of the required circuitry and/or components, such as 4,096 nanoantennas arranged in a 64×64 fashion, may be incorporated onto a single silicon chip. The ladar chip may provide dynamic beam steering via an 8×8 array.
The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in PCT Application No. PCT/US2012/066570, filed Nov. 27, 2012 and published as International Publication No. WO 2013/081984, and/or PCT Application No. PCT/US2012/066571, filed Nov. 27, 2012 and published as International Publication No. WO 2013/081985, which are hereby incorporated herein by reference in their entireties.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an EYEQ2 or EYEQ3 image processing chip available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. Preferably, the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or International Publication Nos. WO 2011/028686; WO 2010/099416; WO 2012/061567; WO 2012/068331; WO 2012/075250; WO 2012/103193; WO 2012/0116043; WO 2012/0145313; WO 2012/0145501; WO 2012/145818; WO 2012/145822; WO 2012/158167; WO 2012/075250; WO 2012/0116043; WO 2012/0145501; WO 2012/154919; WO 2013/019707; WO 2013/016409; WO 2013/019795; WO 2013/067083; WO 2013/070539; WO 2013/043661; WO 2013/048994; WO 2013/063014, WO 2013/081984; WO 2013/081985; WO 2013/074604; WO 2013/086249; WO 2013/103548; WO 2013/109869; WO 2013/123161; WO 2013/126715; WO 2013/043661 and/or WO 2013/158592 and/or U.S. patent applications, Ser. No. 14/107,624, filed Dec. 16, 2013; Ser. No. 14/102,981, filed Dec. 11, 2013; Ser. No. 14/102,980, filed Dec. 11, 2013; Ser. No. 14/098,817, filed Dec. 6, 2013; Ser. No. 14/097,581, filed Dec. 5, 2013; Ser. No. 14/093,981, filed Dec. 2, 2013; Ser. No. 14/093,980, filed Dec. 2, 2013; Ser. No. 14/082,573, filed Nov. 18, 2013; Ser. No. 14/082,574, filed Nov. 18, 2013; Ser. No. 14/082,575, filed Nov. 18, 2013; Ser. No. 14/082,577, filed Nov. 18, 2013; Ser. No. 14/071,086, filed Nov. 4, 2013; Ser. No. 14/076,524, filed Nov. 11, 2013; Ser. No. 14/052,945, filed Oct. 14, 2013; Ser. No. 14/046,174, filed Oct. 4, 2013; Ser. No. 14/016,790, filed Oct. 3, 2013; Ser. No. 14/036,723, filed Sep. 25, 2013; Ser. No. 14/016,790, filed Sep. 3, 2013; Ser. No. 14/001,272, filed Aug. 23, 2013; Ser. No. 13/970,868, filed Aug. 20, 2013; Ser. No. 13/964,134, filed Aug. 12, 2013; Ser. No. 13/942,758, filed Jul. 16, 2013; Ser. No. 13/942,753, filed Jul. 16, 2013; Ser. No. 13/927,680, filed Jun. 26, 2013; Ser. No. 13/916,051, filed Jun. 12, 2013; Ser. No. 13/894,870, filed May 15, 2013; Ser. No. 13/887,724, filed May 6, 2013; Ser. No. 13/852,190, filed Mar. 28, 2013; Ser. No. 13/851,378, filed Mar. 27, 2013; Ser. No. 13/848,796, filed Mar. 22, 2012; Ser. No. 13/847,815, filed Mar. 20, 2013; Ser. No. 13/800,697, filed Mar. 13, 2013; Ser. No. 13/785,099, filed Mar. 5, 2013; Ser. No. 13/779,881, filed Feb. 28, 2013; Ser. No. 13/774,317, filed Feb. 22, 2013; Ser. No. 13/774,315, filed Feb. 22, 2013; Ser. No. 13/681,963, filed Nov. 20, 2012; Ser. No. 13/660,306, filed Oct. 25, 2012; Ser. No. 13/653,577, filed Oct. 17, 2012; and/or Ser. No. 13/534,657, filed Jun. 27, 2012, and/or U.S. provisional applications, Ser. No. 61/919,129, filed Dec. 20, 2013; Ser. No. 61/919,130, filed Dec. 20, 2013; Ser. No. 61/919,131, filed Dec. 20, 2013; Ser. No. 61/919,147, filed Dec. 20, 2013; Ser. No. 61/919,138, filed Dec. 20, 2013, Ser. No. 61/919,133, filed Dec. 20, 2013; Ser. No. 61/918,290, filed Dec. 19, 2013; Ser. No. 61/915,218, filed Dec. 12, 2013; Ser. No. 61/912,146, filed Dec. 5, 2013; Ser. No. 61/911,666, filed Dec. 4, 2013; Ser. No. 61/911,665, filed Dec. 4, 2013; Ser. No. 61/905,461, filed Nov. 18, 2013; Ser. No. 61/905,462, filed Nov. 18, 2013; Ser. No. 61/901,127, filed Nov. 7, 2013; Ser. No. 61/895,610, filed Oct. 25, 2013; Ser. No. 61/895,609, filed Oct. 25, 2013; Ser. 
No. 61/893,489, filed Oct. 21, 2013; Ser. No. 61/879,837, filed Sep. 19, 2013; Ser. No. 61/879,835, filed Sep. 19, 2013; Ser. No. 61/878,877, filed Sep. 17, 2013; Ser. No. 61/875,351, filed Sep. 9, 2013; Ser. No. 61/869,195, filed Aug. 23, 2013; Ser. No. 61/864,835, filed Aug. 12, 2013; Ser. No. 61/864,836, filed Aug. 12, 2013; Ser. No. 61/864,837, filed Aug. 12, 2013; Ser. No. 61/864,838, filed Aug. 12, 2013; Ser. No. 61/856,843, filed Jul. 22, 2013, Ser. No. 61/845,061, filed Jul. 11, 2013; Ser. No. 61/844,630, filed Jul. 10, 2013; Ser. No. 61/844,173, filed Jul. 9, 2013; Ser. No. 61/844,171, filed Jul. 9, 2013; Ser. No. 61/842,644, filed Jul. 3, 2013; Ser. No. 61/840,542, filed Jun. 28, 2013; Ser. No. 61/838,619, filed Jun. 24, 2013; Ser. No. 61/838,621, filed Jun. 24, 2013; Ser. No. 61/837,955, filed Jun. 21, 2013; Ser. No. 61/836,900, filed Jun. 19, 2013; Ser. No. 61/836,380, filed Jun. 18, 2013; Ser. No. 61/833,080, filed Jun. 10, 2013; Ser. No. 61/830,375, filed Jun. 3, 2013; Ser. No. 61/830,377, filed Jun. 3, 2013; Ser. No. 61/825,752, filed May 21, 2013; Ser. No. 61/825,753, filed May 21, 2013; Ser. No. 61/823,648, filed May 15, 2013; Ser. No. 61/823,644, filed May 15, 2013; Ser. No. 61/821,922, filed May 10, 2013; Ser. No. 61/819,835, filed May 6, 2013; Ser. No. 61/819,033, filed May 3, 2013; Ser. No. 61/816,956, filed Apr. 29, 2013; Ser. No. 61/815,044, filed Apr. 23, 2013; Ser. No. 61/814,533, filed Apr. 22, 2013; Ser. No. 61/813,361, filed Apr. 18, 2013; Ser. No. 61/810,407, filed Apr. 10, 2013; Ser. No. 61/808,930, filed Apr. 5, 2013; Ser. No. 61/807,050, filed Apr. 1, 2013; Ser. No. 61/806,674, filed Mar. 29, 2013; Ser. No. 61/793,592, filed Mar. 15, 2013; Ser. No. 61/772,015, filed Mar. 4, 2013; Ser. No. 61/772,014, filed Mar. 4, 2013; Ser. No. 61/770,051, filed Feb. 27, 2013; Ser. No. 61/766,883, filed Feb. 20, 2013; Ser. No. 61/760,364, filed Feb. 4, 2013; Ser. No. 61/756,832, filed Jan. 25, 2013; and/or Ser. No. 61/754,804, filed Jan. 21, 2013, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO 2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. patent application Ser. No. 13/202,005, filed Aug. 17, 2011 and published as U.S. Publication No. US-2012-0062743, which are hereby incorporated herein by reference in their entireties.
The imaging device and control and image processor and any associated illumination source, if applicable, may comprise any suitable components, and may utilize aspects of the cameras and vision systems described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,937,667; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454 and 6,824,281, and/or International Publication Nos. WO 2010/099416; WO 2011/028686 and/or WO 2013/016409, and/or U.S. patent application Ser. No. 12/508,840, filed Jul. 24, 2009, and published Jan. 28, 2010 as U.S. Pat. Publication No. US 2010-0020170, and/or U.S. patent application Ser. No. 13/534,657, filed Jun. 27, 2012, which are all hereby incorporated herein by reference in their entireties. The camera or cameras may comprise any suitable cameras or imaging sensors or camera modules, and may utilize aspects of the cameras or sensors described in U.S. patent application Ser. No. 12/091,359, filed Apr. 24, 2008 and published Oct. 1, 2009 as U.S. Publication No. US-2009-0244361; and/or Ser. No. 13/260,400, filed Sep. 26, 2011, now U.S. Pat. No. 8,542,451, and/or U.S. Pat. Nos. 7,965,336 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties. The imaging array sensor may comprise any suitable sensor, and may utilize various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like, such as the types described in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,715,093; 5,877,897; 6,922,292; 6,757,109; 6,717,610; 6,590,719; 6,201,642; 6,498,620; 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 6,806,452; 6,396,397; 6,822,563; 6,946,978; 7,339,149; 7,038,577; 7,004,606; 7,720,580 and/or 7,965,336, and/or International Publication Nos. WO 2009/036176 and/or WO 2009/046268, which are all hereby incorporated herein by reference in their entireties.
The camera module and circuit chip or board and imaging sensor may be implemented and operated in connection with various vehicular vision-based systems, and/or may be operable utilizing the principles of such other vehicular systems, such as a vehicle headlamp control system, such as the type disclosed in U.S. Pat. Nos. 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 7,004,606; 7,339,149 and/or 7,526,103, which are all hereby incorporated herein by reference in their entireties, a rain sensor, such as the types disclosed in commonly assigned U.S. Pat. Nos. 6,353,392; 6,313,454; 6,320,176 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties, a vehicle vision system, such as a forwardly, sidewardly or rearwardly directed vehicle vision system utilizing principles disclosed in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,877,897; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978 and/or 7,859,565, which are all hereby incorporated herein by reference in their entireties, a trailer hitching aid or tow check system, such as the type disclosed in U.S. Pat. No. 7,005,974, which is hereby incorporated herein by reference in its entirety, a reverse or sideward imaging system, such as for a lane change assistance system or lane departure warning system or for a blind spot or object detection system, such as imaging or detection systems of the types disclosed in U.S. Pat. Nos. 7,720,580; 7,038,577; 5,929,786 and/or 5,786,772, and/or U.S. patent application Ser. No. 11/239,980, filed Sep. 30, 2005, now U.S. Pat. No. 7,881,496, and/or U.S. provisional applications Ser. No. 60/628,709, filed Nov. 17, 2004; Ser. No. 60/614,644, filed Sep. 30, 2004; Ser. No. 60/618,686, filed Oct. 14, 2004; Ser. No. 60/638,687, filed Dec. 23, 2004, which are hereby incorporated herein by reference in their entireties, a video device for internal cabin surveillance and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962; 5,877,897; 6,690,268 and/or 7,370,983, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties, a traffic sign recognition system, a system for determining a distance to a leading or trailing vehicle or object, such as a system utilizing the principles disclosed in U.S. Pat. Nos. 6,396,397 and/or 7,123,168, which are hereby incorporated herein by reference in their entireties, and/or the like.
Optionally, the circuit board or chip may include circuitry for the imaging array sensor and/or other electronic accessories or features, such as by utilizing compass-on-a-chip or EC driver-on-a-chip technology and aspects such as described in U.S. Pat. Nos. 7,255,451 and/or 7,480,149, and/or U.S. patent applications, Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, and/or Ser. No. 12/578,732, filed Oct. 14, 2009 and published Apr. 22, 2010 as U.S. Publication No. US-2010-0097469, which are hereby incorporated herein by reference in their entireties.
Optionally, the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device disposed at or in the interior rearview mirror assembly of the vehicle, such as by utilizing aspects of the video mirror display systems described in U.S. Pat. No. 6,690,268 and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011 and published as U.S. Publication No. US-2012-0162427, which are hereby incorporated herein by reference in their entireties. The video mirror display may comprise any suitable devices and systems and optionally may utilize aspects of the compass display systems described in U.S. Pat. Nos. 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,677,851; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,508; 6,222,460; 6,513,252 and/or 6,642,851, and/or European patent application, published Oct. 11, 2000 under Publication No. EP 0 1043566, and/or U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, which are all hereby incorporated herein by reference in their entireties. Optionally, the video mirror display screen or device may be operable to display images captured by a rearward viewing camera of the vehicle during a reversing maneuver of the vehicle (such as responsive to the vehicle gear actuator being placed in a reverse gear position or the like) to assist the driver in backing up the vehicle, and optionally may be operable to display the compass heading or directional heading character or icon when the vehicle is not undertaking a reversing maneuver, such as when the vehicle is being driven in a forward direction along a road (such as by utilizing aspects of the display system described in PCT Application No. PCT/US2011/056295, filed Oct. 14, 2011 and published Apr. 19, 2012 as International Publication No. WO 2012/051500, which is hereby incorporated herein by reference in its entirety).
Optionally, the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2010/099416; WO 2011/028686; WO 2012/075250; WO 2013/019795; WO 2012/075250; WO 2012/145822; WO 2013/081985; WO 2013/086249 and/or WO 2013/109869, and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011 and published as U.S. Publication No. US-2012-0162427, which are hereby incorporated herein by reference in their entireties.
Optionally, a video mirror display may be disposed rearward of and behind the reflective element assembly and may comprise a display such as the types disclosed in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,370,983; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187 and/or 6,690,268, and/or in U.S. patent applications, Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008; and/or Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are all hereby incorporated herein by reference in their entireties. The display is viewable through the reflective element when the display is activated to display information. The display element may be any type of display element, such as a vacuum fluorescent (VF) display element, a light emitting diode (LED) display element, such as an organic light emitting diode (OLED) or an inorganic light emitting diode, an electroluminescent (EL) display element, a liquid crystal display (LCD) element, a video screen display element or backlit thin film transistor (TFT) display element or the like, and may be operable to display various information (as discrete characters, icons or the like, or in a multi-pixel manner) to the driver of the vehicle, such as passenger side inflatable restraint (PSIR) information, tire pressure status, and/or the like. The mirror assembly and/or display may utilize aspects described in U.S. Pat. Nos. 7,184,190; 7,255,451; 7,446,924 and/or 7,338,177, which are all hereby incorporated herein by reference in their entireties. The thicknesses and materials of the coatings on the substrates of the reflective element may be selected to provide a desired color or tint to the mirror reflective element, such as a blue colored reflector, such as is known in the art and such as described in U.S. Pat. Nos. 5,910,854; 6,420,036 and/or 7,274,501, which are hereby incorporated herein by reference in their entireties.
Optionally, the display or displays and any associated user inputs may be associated with various accessories or systems, such as, for example, a tire pressure monitoring system or a passenger air bag status or a garage door opening system or a telematics system or any other accessory or system of the mirror assembly or of the vehicle or of an accessory module or console of the vehicle, such as an accessory module or console of the types described in U.S. Pat. Nos. 7,289,037; 6,877,888; 6,824,281; 6,690,268; 6,672,744; 6,386,742 and 6,124,886, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties.
Changes and modifications to the specifically described embodiments may be carried out without departing from the principles of the present invention, which is intended to be limited only by the scope of the appended claims as interpreted according to the principles of patent law.
The present application is a continuation of U.S. patent application Ser. No. 15/817,612, filed Nov. 20, 2017, now U.S. Pat. No. 10,497,262, which is a continuation of U.S. patent application Ser. No. 15/416,217, filed Jan. 26, 2017, now U.S. Pat. No. 9,824,285, which is a continuation of U.S. patent application Ser. No. 15/131,593, filed Apr. 18, 2016, now U.S. Pat. No. 9,563,809, which is a continuation of U.S. patent application Ser. No. 14/809,541, filed Jul. 27, 2015, now U.S. Pat. No. 9,318,020, which is a continuation of U.S. patent application Ser. No. 14/169,328, filed Jan. 31, 2014, now U.S. Pat. No. 9,092,986, which claims the filing benefits of U.S. provisional applications, Ser. No. 61/886,883, filed Oct. 4, 2013, Ser. No. 61/834,129, filed Jun. 12, 2013, and Ser. No. 61/760,366, filed Feb. 4, 2013, which are hereby incorporated herein by reference in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
4720790 | Miki et al. | Jan 1988 | A |
4987357 | Masaki | Jan 1991 | A |
4991054 | Walters | Feb 1991 | A |
5001558 | Burley et al. | Mar 1991 | A |
5003288 | Wilhelm | Mar 1991 | A |
5012082 | Watanabe | Apr 1991 | A |
5016977 | Baude et al. | May 1991 | A |
5027001 | Torbert | Jun 1991 | A |
5027200 | Petrossian et al. | Jun 1991 | A |
5044706 | Chen | Sep 1991 | A |
5055668 | French | Oct 1991 | A |
5059877 | Teder | Oct 1991 | A |
5064274 | Alten | Nov 1991 | A |
5072154 | Chen | Dec 1991 | A |
5073012 | Lynam | Dec 1991 | A |
5076673 | Lynam et al. | Dec 1991 | A |
5086253 | Lawler | Feb 1992 | A |
5096287 | Kakinami et al. | Mar 1992 | A |
5097362 | Lynas | Mar 1992 | A |
5115346 | Lynam | May 1992 | A |
5121200 | Choi | Jun 1992 | A |
5124549 | Michaels et al. | Jun 1992 | A |
5130709 | Toyama et al. | Jul 1992 | A |
5148014 | Lynam et al. | Sep 1992 | A |
5151816 | Varaprasad et al. | Sep 1992 | A |
5161107 | Mayeaux | Nov 1992 | A |
5168378 | Black | Dec 1992 | A |
5170374 | Shimohigashi et al. | Dec 1992 | A |
5172235 | Wilm et al. | Dec 1992 | A |
5177685 | Davis et al. | Jan 1993 | A |
5182502 | Slotkowski et al. | Jan 1993 | A |
5184956 | Langlais et al. | Feb 1993 | A |
5189561 | Hong | Feb 1993 | A |
5193000 | Lipton et al. | Mar 1993 | A |
5193029 | Schofield et al. | Mar 1993 | A |
5204778 | Bechtel | Apr 1993 | A |
5208701 | Maeda | May 1993 | A |
5245422 | Borcherts et al. | Sep 1993 | A |
5253109 | O'Farrell et al. | Oct 1993 | A |
5255442 | Schierbeek et al. | Oct 1993 | A |
5276389 | Levers | Jan 1994 | A |
5285060 | Larson et al. | Feb 1994 | A |
5289182 | Brillard et al. | Feb 1994 | A |
5289321 | Secor | Feb 1994 | A |
5305012 | Faris | Apr 1994 | A |
5307136 | Saneyoshi | Apr 1994 | A |
5309137 | Kajiwara | May 1994 | A |
5313072 | Vachss | May 1994 | A |
5325096 | Pakett | Jun 1994 | A |
5325386 | Jewell et al. | Jun 1994 | A |
5329206 | Slotkowski et al. | Jul 1994 | A |
5331312 | Kudoh | Jul 1994 | A |
5336980 | Levers | Aug 1994 | A |
5341437 | Nakayama | Aug 1994 | A |
5351044 | Mathur et al. | Sep 1994 | A |
5355118 | Fukuhara | Oct 1994 | A |
5374852 | Parkes | Dec 1994 | A |
5386285 | Asayama | Jan 1995 | A |
5394333 | Kao | Feb 1995 | A |
5406395 | Wilson et al. | Apr 1995 | A |
5406414 | O'Farrell et al. | Apr 1995 | A |
5410346 | Saneyoshi et al. | Apr 1995 | A |
5414257 | Stanton | May 1995 | A |
5414461 | Kishi et al. | May 1995 | A |
5416313 | Larson et al. | May 1995 | A |
5416318 | Hegyi | May 1995 | A |
5416478 | Morinaga | May 1995 | A |
5424952 | Asayama | Jun 1995 | A |
5426294 | Kobayashi et al. | Jun 1995 | A |
5430431 | Nelson | Jul 1995 | A |
5434407 | Bauer et al. | Jul 1995 | A |
5440428 | Hegg et al. | Aug 1995 | A |
5444478 | Lelong et al. | Aug 1995 | A |
5451822 | Bechtel et al. | Sep 1995 | A |
5457493 | Leddy et al. | Oct 1995 | A |
5461357 | Yoshioka et al. | Oct 1995 | A |
5461361 | Moore | Oct 1995 | A |
5469298 | Suman et al. | Nov 1995 | A |
5471515 | Fossum et al. | Nov 1995 | A |
5475494 | Nishida et al. | Dec 1995 | A |
5497306 | Pastrick | Mar 1996 | A |
5498866 | Bendicks et al. | Mar 1996 | A |
5500766 | Stonecypher | Mar 1996 | A |
5510983 | Lino | Apr 1996 | A |
5515448 | Nishitani | May 1996 | A |
5521633 | Nakajima et al. | May 1996 | A |
5528698 | Kamei et al. | Jun 1996 | A |
5529138 | Shaw et al. | Jun 1996 | A |
5530240 | Larson et al. | Jun 1996 | A |
5530420 | Tsuchiya et al. | Jun 1996 | A |
5535314 | Alves et al. | Jul 1996 | A |
5537003 | Bechtel et al. | Jul 1996 | A |
5539397 | Asanuma et al. | Jul 1996 | A |
5541590 | Nishio | Jul 1996 | A |
5550677 | Schofield et al. | Aug 1996 | A |
5555555 | Sato et al. | Sep 1996 | A |
5568027 | Teder | Oct 1996 | A |
5574443 | Hsieh | Nov 1996 | A |
5581464 | Woll et al. | Dec 1996 | A |
5594222 | Caldwell | Jan 1997 | A |
5610756 | Lynam et al. | Mar 1997 | A |
5614788 | Mullins | Mar 1997 | A |
5619370 | Guinosso | Apr 1997 | A |
5632092 | Blank et al. | May 1997 | A |
5634709 | Iwama | Jun 1997 | A |
5642299 | Hardin et al. | Jun 1997 | A |
5648835 | Uzawa | Jul 1997 | A |
5650944 | Kise | Jul 1997 | A |
5660454 | Mori et al. | Aug 1997 | A |
5661303 | Teder | Aug 1997 | A |
5666028 | Bechtel et al. | Sep 1997 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5677851 | Kingdon et al. | Oct 1997 | A |
5699044 | Van Lente et al. | Dec 1997 | A |
5724316 | Brunts | Mar 1998 | A |
5732379 | Eckert et al. | Mar 1998 | A |
5737226 | Olson et al. | Apr 1998 | A |
5760828 | Cortes | Jun 1998 | A |
5760931 | Saburi et al. | Jun 1998 | A |
5760962 | Schofield et al. | Jun 1998 | A |
5761094 | Olson et al. | Jun 1998 | A |
5765116 | Wilson-Jones et al. | Jun 1998 | A |
5765118 | Fukatani | Jun 1998 | A |
5781437 | Wiemer et al. | Jul 1998 | A |
5786772 | Schofield et al. | Jul 1998 | A |
5790403 | Nakayama | Aug 1998 | A |
5790973 | Blaker et al. | Aug 1998 | A |
5793308 | Rosinski et al. | Aug 1998 | A |
5793420 | Schmidt | Aug 1998 | A |
5796094 | Schofield et al. | Aug 1998 | A |
5835255 | Miles | Nov 1998 | A |
5837994 | Stam et al. | Nov 1998 | A |
5844505 | Van Ryzin | Dec 1998 | A |
5844682 | Kiyomoto et al. | Dec 1998 | A |
5845000 | Breed et al. | Dec 1998 | A |
5848802 | Breed et al. | Dec 1998 | A |
5850176 | Kinoshita et al. | Dec 1998 | A |
5850254 | Takano et al. | Dec 1998 | A |
5867591 | Onda | Feb 1999 | A |
5877707 | Kowalick | Mar 1999 | A |
5877897 | Schofield et al. | Mar 1999 | A |
5878357 | Sivashankar et al. | Mar 1999 | A |
5878370 | Olson | Mar 1999 | A |
5883739 | Ashihara et al. | Mar 1999 | A |
5884212 | Lion | Mar 1999 | A |
5890021 | Onoda | Mar 1999 | A |
5896085 | Mori et al. | Apr 1999 | A |
5899956 | Chan | May 1999 | A |
5915800 | Hiwatashi et al. | Jun 1999 | A |
5923027 | Stam et al. | Jul 1999 | A |
5924212 | Domanski | Jul 1999 | A |
5929786 | Schofield et al. | Jul 1999 | A |
5949331 | Schofield et al. | Sep 1999 | A |
5959555 | Furuta | Sep 1999 | A |
5963247 | Banitt | Oct 1999 | A |
5986796 | Miles | Nov 1999 | A |
5990469 | Bechtel et al. | Nov 1999 | A |
5990649 | Nagao et al. | Nov 1999 | A |
6020704 | Buschur | Feb 2000 | A |
6049171 | Stam et al. | Apr 2000 | A |
6066933 | Ponziana | May 2000 | A |
6084519 | Coulling et al. | Jul 2000 | A |
6097023 | Schofield et al. | Aug 2000 | A |
6097024 | Stam et al. | Aug 2000 | A |
6144022 | Tenenbaum et al. | Nov 2000 | A |
6175300 | Kendrick | Jan 2001 | B1 |
6178034 | Allemand et al. | Jan 2001 | B1 |
6198409 | Schofield et al. | Mar 2001 | B1 |
6201642 | Bos | Mar 2001 | B1 |
6222447 | Schofield et al. | Apr 2001 | B1 |
6223114 | Boros et al. | Apr 2001 | B1 |
6227689 | Miller | May 2001 | B1 |
6250148 | Lynam | Jun 2001 | B1 |
6266082 | Yonezawa et al. | Jul 2001 | B1 |
6266442 | Laumeyer et al. | Jul 2001 | B1 |
6285393 | Shimoura et al. | Sep 2001 | B1 |
6294989 | Schofield et al. | Sep 2001 | B1 |
6297781 | Turnbull et al. | Oct 2001 | B1 |
6302545 | Schofield et al. | Oct 2001 | B1 |
6310611 | Caldwell | Oct 2001 | B1 |
6313454 | Bos et al. | Nov 2001 | B1 |
6317057 | Lee | Nov 2001 | B1 |
6320176 | Schofield et al. | Nov 2001 | B1 |
6320282 | Caldwell | Nov 2001 | B1 |
6333759 | Mazzilli | Dec 2001 | B1 |
6341523 | Lynam | Jan 2002 | B2 |
6353392 | Schofield et al. | Mar 2002 | B1 |
6370329 | Teuchert | Apr 2002 | B1 |
6392315 | Jones et al. | May 2002 | B1 |
6396397 | Bos et al. | May 2002 | B1 |
6411204 | Bloomfield et al. | Jun 2002 | B1 |
6420975 | DeLine et al. | Jul 2002 | B1 |
6424273 | Gutta et al. | Jul 2002 | B1 |
6430303 | Naoi et al. | Aug 2002 | B1 |
6442465 | Breed et al. | Aug 2002 | B2 |
6477464 | McCarthy et al. | Nov 2002 | B2 |
6497503 | Dassanayake et al. | Dec 2002 | B1 |
6498620 | Schofield et al. | Dec 2002 | B2 |
6516262 | Takenaga | Feb 2003 | B2 |
6516664 | Lynam | Feb 2003 | B2 |
6523964 | Schofield et al. | Feb 2003 | B2 |
6534884 | Marcus et al. | Mar 2003 | B2 |
6539306 | Turnbull | Mar 2003 | B2 |
6547133 | Devries, Jr. et al. | Apr 2003 | B1 |
6553130 | Lemelson et al. | Apr 2003 | B1 |
6559435 | Schofield et al. | May 2003 | B2 |
6574033 | Chui et al. | Jun 2003 | B1 |
6589625 | Kothari et al. | Jul 2003 | B1 |
6594583 | Ogura et al. | Jul 2003 | B2 |
6611202 | Schofield et al. | Aug 2003 | B2 |
6611610 | Stam et al. | Aug 2003 | B1 |
6636258 | Strumolo | Oct 2003 | B2 |
6650455 | Miles | Nov 2003 | B2 |
6672731 | Schnell et al. | Jan 2004 | B2 |
6674562 | Miles | Jan 2004 | B1 |
6678614 | McCarthy et al. | Jan 2004 | B2 |
6680792 | Miles | Jan 2004 | B2 |
6690268 | Schofield et al. | Feb 2004 | B2 |
6700605 | Toyoda et al. | Mar 2004 | B1 |
6704621 | Stein et al. | Mar 2004 | B1 |
6710908 | Miles et al. | Mar 2004 | B2 |
6711474 | Treyz et al. | Mar 2004 | B1 |
6714331 | Lewis et al. | Mar 2004 | B2 |
6717610 | Bos et al. | Apr 2004 | B1 |
6728623 | Takenaga | Apr 2004 | B2 |
6735506 | Breed et al. | May 2004 | B2 |
6741377 | Miles | May 2004 | B2 |
6744353 | Sjonell | Jun 2004 | B2 |
6757109 | Bos | Jun 2004 | B2 |
6762867 | Lippert et al. | Jul 2004 | B2 |
6794119 | Miles | Sep 2004 | B2 |
6795221 | Urey | Sep 2004 | B1 |
6802617 | Schofield et al. | Oct 2004 | B2 |
6806452 | Bos et al. | Oct 2004 | B2 |
6819231 | Berberich et al. | Nov 2004 | B2 |
6822563 | Bos et al. | Nov 2004 | B2 |
6823241 | Shirato et al. | Nov 2004 | B2 |
6824281 | Schofield et al. | Nov 2004 | B2 |
6831261 | Schofield et al. | Dec 2004 | B2 |
6850156 | Bloomfield et al. | Feb 2005 | B2 |
6882287 | Schofield | Apr 2005 | B2 |
6889161 | Winner et al. | May 2005 | B2 |
6891563 | Schofield et al. | May 2005 | B2 |
6909753 | Meehan et al. | Jun 2005 | B2 |
6946978 | Schofield | Sep 2005 | B2 |
6953253 | Schofield et al. | Oct 2005 | B2 |
6968736 | Lynam | Nov 2005 | B2 |
6975775 | Rykowski et al. | Dec 2005 | B2 |
6989736 | Berberich et al. | Jan 2006 | B2 |
7004606 | Schofield | Feb 2006 | B2 |
7005974 | McMahon et al. | Feb 2006 | B2 |
7038577 | Pawlicki et al. | May 2006 | B2 |
7062300 | Kim | Jun 2006 | B1 |
7065432 | Moisel et al. | Jun 2006 | B2 |
7079017 | Lang et al. | Jul 2006 | B2 |
7085637 | Breed et al. | Aug 2006 | B2 |
7092548 | Laumeyer et al. | Aug 2006 | B2 |
7111968 | Bauer et al. | Sep 2006 | B2 |
7116246 | Winter et al. | Oct 2006 | B2 |
7123168 | Schofield | Oct 2006 | B2 |
7145519 | Takahashi et al. | Dec 2006 | B2 |
7149613 | Stam et al. | Dec 2006 | B2 |
7161616 | Okamoto et al. | Jan 2007 | B1 |
7167796 | Taylor et al. | Jan 2007 | B2 |
7195381 | Lynam et al. | Mar 2007 | B2 |
7202776 | Breed | Apr 2007 | B2 |
7205904 | Schofield | Apr 2007 | B2 |
7227459 | Bos et al. | Jun 2007 | B2 |
7227611 | Hull et al. | Jun 2007 | B2 |
7311406 | Schofield et al. | Dec 2007 | B2 |
7325934 | Schofield et al. | Feb 2008 | B2 |
7325935 | Schofield et al. | Feb 2008 | B2 |
7338177 | Lynam | Mar 2008 | B2 |
7339149 | Schofield et al. | Mar 2008 | B1 |
7344261 | Schofield et al. | Mar 2008 | B2 |
7355524 | Schofield | Apr 2008 | B2 |
7370983 | DeWind et al. | May 2008 | B2 |
7380948 | Schofield et al. | Jun 2008 | B2 |
7388182 | Schofield et al. | Jun 2008 | B2 |
7398076 | Kubota | Jul 2008 | B2 |
7402786 | Schofield et al. | Jul 2008 | B2 |
7423248 | Schofield et al. | Sep 2008 | B2 |
7425076 | Schofield et al. | Sep 2008 | B2 |
7446650 | Schofield et al. | Nov 2008 | B2 |
7459664 | Schofield et al. | Dec 2008 | B2 |
7460951 | Altan | Dec 2008 | B2 |
7480149 | DeWard et al. | Jan 2009 | B2 |
7490007 | Taylor et al. | Feb 2009 | B2 |
7492281 | Lynam et al. | Feb 2009 | B2 |
7526103 | Schofield et al. | Apr 2009 | B2 |
7561181 | Schofield et al. | Jul 2009 | B2 |
7581859 | Lynam | Sep 2009 | B2 |
7592928 | Chinomi et al. | Sep 2009 | B2 |
7616781 | Schofield et al. | Nov 2009 | B2 |
7619508 | Lynam et al. | Nov 2009 | B2 |
7639149 | Katoh | Dec 2009 | B2 |
7676324 | Bae | Mar 2010 | B2 |
7681960 | Wanke et al. | Mar 2010 | B2 |
7720580 | Higgins-Luthman | May 2010 | B2 |
7777611 | Desai | Aug 2010 | B2 |
7855755 | Weller et al. | Dec 2010 | B2 |
7859565 | Schofield et al. | Dec 2010 | B2 |
7881496 | Camilleri et al. | Feb 2011 | B2 |
7914187 | Higgins-Luthman et al. | Mar 2011 | B2 |
7965336 | Bingle et al. | Jun 2011 | B2 |
8013780 | Lynam | Sep 2011 | B2 |
8027029 | Lu et al. | Sep 2011 | B2 |
8031062 | Smith | Oct 2011 | B2 |
8058977 | Lynam | Nov 2011 | B2 |
8078379 | Lu | Dec 2011 | B2 |
8340866 | Hanzawa et al. | Dec 2012 | B2 |
8606455 | Boehringer | Dec 2013 | B2 |
8694192 | Cullinane | Apr 2014 | B2 |
8694224 | Chundrlik, Jr. et al. | Apr 2014 | B2 |
8849495 | Chundrlik, Jr. et al. | Sep 2014 | B2 |
9092986 | Salomonsson | Jul 2015 | B2 |
9318020 | Salomonsson | Apr 2016 | B2 |
9563809 | Salomonsson | Feb 2017 | B2 |
9580013 | Wierich | Feb 2017 | B2 |
9604581 | Wierich | Mar 2017 | B2 |
9824285 | Salomonsson | Nov 2017 | B2 |
10497262 | Salomonsson et al. | Dec 2019 | B2 |
20020015153 | Downs | Feb 2002 | A1 |
20020044065 | Quist et al. | Apr 2002 | A1 |
20020113873 | Williams | Aug 2002 | A1 |
20020159270 | Lynam et al. | Oct 2002 | A1 |
20030137586 | Lewellen | Jul 2003 | A1 |
20030222982 | Hamdan et al. | Dec 2003 | A1 |
20030227777 | Schofield | Dec 2003 | A1 |
20040012488 | Schofield | Jan 2004 | A1 |
20040016870 | Pawlicki et al. | Jan 2004 | A1 |
20040032321 | McMahon et al. | Feb 2004 | A1 |
20040051634 | Schofield et al. | Mar 2004 | A1 |
20040114381 | Salmeen et al. | Jun 2004 | A1 |
20040128065 | Taylor et al. | Jul 2004 | A1 |
20040200948 | Bos et al. | Oct 2004 | A1 |
20050078389 | Kulas et al. | Apr 2005 | A1 |
20050134966 | Burgner | Jun 2005 | A1 |
20050134983 | Lynam | Jun 2005 | A1 |
20050146792 | Schofield et al. | Jul 2005 | A1 |
20050169003 | Lindahl et al. | Aug 2005 | A1 |
20050195488 | McCabe et al. | Sep 2005 | A1 |
20050200700 | Schofield et al. | Sep 2005 | A1 |
20050232469 | Schofield et al. | Oct 2005 | A1 |
20050264891 | Uken et al. | Dec 2005 | A1 |
20060018511 | Stam et al. | Jan 2006 | A1 |
20060018512 | Stam et al. | Jan 2006 | A1 |
20060028731 | Schofield et al. | Feb 2006 | A1 |
20060050018 | Hutzel et al. | Mar 2006 | A1 |
20060061008 | Kamer et al. | Mar 2006 | A1 |
20060091813 | Stam et al. | May 2006 | A1 |
20060103727 | Tseng | May 2006 | A1 |
20060164230 | DeWind et al. | Jul 2006 | A1 |
20060250501 | Wildmann et al. | Nov 2006 | A1 |
20060290479 | Akatsuka et al. | Dec 2006 | A1 |
20070023613 | Schofield et al. | Feb 2007 | A1 |
20070104476 | Yasutomi et al. | May 2007 | A1 |
20070109406 | Schofield et al. | May 2007 | A1 |
20070109651 | Schofield et al. | May 2007 | A1 |
20070109652 | Schofield et al. | May 2007 | A1 |
20070109653 | Schofield et al. | May 2007 | A1 |
20070109654 | Schofield et al. | May 2007 | A1 |
20070120657 | Schofield et al. | May 2007 | A1 |
20070176080 | Schofield et al. | Aug 2007 | A1 |
20080122597 | Englander | May 2008 | A1 |
20080180529 | Taylor et al. | Jul 2008 | A1 |
20090113509 | Tseng et al. | Apr 2009 | A1 |
20090177347 | Breuer et al. | Jul 2009 | A1 |
20090243824 | Peterson et al. | Oct 2009 | A1 |
20090244361 | Gebauer et al. | Oct 2009 | A1 |
20090295181 | Lawlor et al. | Dec 2009 | A1 |
20100020170 | Higgins-Luthman et al. | Jan 2010 | A1 |
20100045797 | Schofield et al. | Feb 2010 | A1 |
20100070172 | Kumar | Mar 2010 | A1 |
20100097469 | Blank et al. | Apr 2010 | A1 |
20100228437 | Hanzawa et al. | Sep 2010 | A1 |
20100292886 | Szczerba | Nov 2010 | A1 |
20120062743 | Lynam | Mar 2012 | A1 |
20120116632 | Bechtel | May 2012 | A1 |
20120218412 | Dellantoni et al. | Aug 2012 | A1 |
20120245817 | Cooprider et al. | Sep 2012 | A1 |
20120262340 | Hassan et al. | Oct 2012 | A1 |
20120277947 | Boehringer | Nov 2012 | A1 |
20120303222 | Cooprider et al. | Nov 2012 | A1 |
20120312645 | Frashure | Dec 2012 | A1 |
20130116915 | Ferreira | May 2013 | A1 |
20130124052 | Hahne | May 2013 | A1 |
20130131918 | Hahne | May 2013 | A1 |
20130141582 | Reilhac | Jun 2013 | A1 |
20130158800 | Trageser | Jun 2013 | A1 |
20130231825 | Chundrlik, Jr. et al. | Sep 2013 | A1 |
20140067206 | Pflug | Mar 2014 | A1 |
20140277901 | Ferguson | Sep 2014 | A1 |
20140309884 | Wolf | Oct 2014 | A1 |
Foreign Patent Documents

Number | Date | Country
---|---|---
2013081984 | Jun 2013 | WO
Related Publications

Number | Date | Country
---|---|---
20200111356 A1 | Apr 2020 | US
Provisional Applications

Number | Date | Country
---|---|---
61886883 | Oct 2013 | US
61834129 | Jun 2013 | US
61760366 | Feb 2013 | US
Parent Applications (continuation chain)

Relation | Number | Date | Country
---|---|---|---
Parent | 15817612 | Nov 2017 | US
Child | 16699915 | | US
Parent | 15416217 | Jan 2017 | US
Child | 15817612 | | US
Parent | 15131593 | Apr 2016 | US
Child | 15416217 | | US
Parent | 14809541 | Jul 2015 | US
Child | 15131593 | | US
Parent | 14169328 | Jan 2014 | US
Child | 14809541 | | US