The present invention relates generally to a collision avoidance system for a vehicle and, more particularly, to a collision avoidance system that detects pedestrians in or approaching the path of travel of the vehicle.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.
The present invention provides a collision avoidance system or vision system or imaging system for a vehicle that utilizes one or more sensors, such as one or more cameras (preferably one or more CMOS cameras) to sense regions exterior (such as forward) of the vehicle and/or to capture image data representative of images exterior of the vehicle, and provides a pedestrian collision warning system that is operable to generate an alert or warning to a driver of the vehicle and/or to control the brake system of the vehicle responsive to a determination that the vehicle may collide with a pedestrian approaching the path of travel of the vehicle ahead of (or behind) the vehicle. The system may determine a baseline time to collision (TTC) based on vehicle speed and pedestrian speed and distance between the vehicle and pedestrian, and the system adjusts the TTC responsive to various parameters, including vehicle parameters (pertaining to traction or braking ability of the vehicle at that time), environmental parameters, location parameters (such as the location of the vehicle being at or near where a pedestrian is more likely to be found), condition/time/place parameters (such as the location of the vehicle being at or near a location and at a particular time where a pedestrian is more likely to be found at that location) and/or driver parameters (attentiveness of driver, distractions and/or the like). For example, when the vehicle is at a location near a bus stop when the bus is at the bus stop (thus a high likelihood that pedestrians will be present), the system may increase the sensitivity and provide an earlier warning to the driver of the vehicle or may control the vehicle (such as apply the vehicle brakes) at an earlier time, when it is determined that a pedestrian may be moving in or towards the path of travel of the vehicle.
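By way of a minimal sketch (assuming straight-line closing geometry; the function names, thresholds and example values here are illustrative assumptions, not taken from the specification), the baseline TTC and its parameter-based adjustment may be expressed as:

```python
# Minimal sketch: compute the current TTC from range and closing speed, and
# raise the warning threshold by the time additions of the active parameters
# (vehicle, environmental, location, condition/time/place, driver).

def time_to_collision(distance_m: float, vehicle_speed_mps: float,
                      ped_closing_speed_mps: float = 0.0) -> float:
    closing_speed = vehicle_speed_mps + ped_closing_speed_mps
    if closing_speed <= 0.0:
        return float("inf")  # not on a closing course
    return distance_m / closing_speed

def warning_threshold_s(base_ttc_s: float, additions_s: list[float]) -> float:
    # Each active parameter adds time, so the warning fires earlier.
    return base_ttc_s + sum(additions_s)

# Example (assumed values): base warning at a TTC of 2.0 s, raised by heavy
# rain (+0.6 s) and a nearby bus stop (+0.5 s); alert once TTC <= 3.1 s.
ttc = time_to_collision(distance_m=30.0, vehicle_speed_mps=10.0)
if ttc <= warning_threshold_s(2.0, [0.6, 0.5]):
    print("issue early warning")
```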
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle vision system and/or driver assist system and/or object detection system and/or alert system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and provide an output to a display device for displaying images representative of the captured image data. Optionally, the vision system may provide a top down or bird's eye or surround view display and may provide a displayed image that is representative of the subject vehicle, and optionally with the displayed image being customized to at least partially correspond to the actual subject vehicle.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes at least one exterior facing imaging sensor or camera, such as a rearward facing imaging sensor or camera 14a (and the system may optionally include multiple exterior facing imaging sensors or cameras, such as a forwardly facing camera 14b at the front (or at the windshield) of the vehicle, and a sidewardly/rearwardly facing camera 14c, 14d at respective sides of the vehicle), which captures images exterior of the vehicle, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera.
ASPECSS (Assessment methodologies for forward looking Integrated Pedestrian and further extension to Cyclist Safety Systems) is a project to develop harmonized test and assessment procedures for forward looking integrated pedestrian safety systems. See, for example, ‘ASPECSS-D1.1-FINAL-Scenariosweighting-BASt-2013-02-17-PUBLIC’, which is hereby incorporated herein by reference in its entirety.
As disclosed in the ASPECSS document (incorporated above), it may be justified to adjust the size of a safety zone depending on the pedestrian's walking speed. Therefore, the quantity safe lateral time-gap (SLT) is introduced. The conversion of safe lateral distance (SLD) to safe lateral time (SLT) is:

SLT = SLD / vPed

where vPed is the speed component of the pedestrian lateral to the heading of the ego vehicle. This is simple linear vector algebra. ASPECSS shows that pedestrians at greater lateral distances have to be considered when they are approaching faster, and only those at smaller distances when they are approaching slower.
ASPECSS describes a safety zone that expands in a cone shape in front of the vehicle. The faster a potentially endangered pedestrian is moving, the more time he or she may have to walk in front of the approaching vehicle.
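A short sketch of this geometry, using the SLT = SLD/vPed conversion above (the path half-width and example values are assumptions):

```python
# Sketch of the SLD -> SLT conversion and the cone-shaped safety zone: the
# zone's lateral half-width grows with the time the vehicle needs to reach a
# point ahead, scaled by the pedestrian's lateral speed.

def safe_lateral_time_s(safe_lateral_distance_m: float,
                        ped_lateral_speed_mps: float) -> float:
    # SLT = SLD / vPed; a pedestrian with no lateral speed never enters.
    if ped_lateral_speed_mps <= 0.0:
        return float("inf")
    return safe_lateral_distance_m / ped_lateral_speed_mps

def zone_half_width_m(time_ahead_s: float, ped_lateral_speed_mps: float,
                      path_half_width_m: float = 1.0) -> float:
    # Lateral reach at a point the vehicle reaches in time_ahead_s; linear
    # growth with time ahead yields the cone shape.
    return path_half_width_m + ped_lateral_speed_mps * time_ahead_s

# A 2 m/s pedestrian is relevant up to 4 m to the side of a point 1.5 s ahead.
print(zone_half_width_m(1.5, 2.0))  # 1.0 + 3.0 = 4.0
```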
The diagrams in the drawings illustrate this expanding safety zone.
Since abrupt semi-automated acceleration to pass an approaching pedestrian before he or she is able to enter the path of travel of the vehicle may be disturbing to a driver, only deceleration may be acceptable as an automated measure. Case A in the diagrams illustrates such a deceleration scenario.
As another aspect of the invention, for implementation in active pedestrian collision avoidance or collision warning systems, it is preferred to engage actions stepwise depending on the remaining time to collision (TTC). Typically, in a first stage, actuation or warning levels become elevated. Audible, visual and/or haptic measures may be actuated to draw the driver's attention to a potentially collision-endangered pedestrian (assuming the pedestrian continues approaching the driving path of the vehicle). The systems are often not developed enough to avoid a collision by steering (in combination with braking), so they are meant to brake only. At higher actuation or warning levels, when the TTC is shorter, the system may prefill the brake pressure and may actively lower the torque demand of the engine. The warnings may be switched to become more obtrusive, such as sounding a beep and flickering warning lights. At a TTC at which a collision appears to become unavoidable without braking, the system may start full braking of the vehicle.
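A minimal sketch of such stepwise engagement (the TTC thresholds and the exact actions per stage are assumptions for illustration):

```python
# Illustrative stepwise escalation over decreasing TTC; thresholds assumed.

def staged_actions(ttc_s: float) -> list[str]:
    if ttc_s > 2.6:
        return []                                  # no intervention yet
    if ttc_s > 1.8:
        # first stage: draw the driver's attention
        return ["visual_warning", "haptic_pulse"]
    if ttc_s > 1.0:
        # higher stage: prepare the vehicle and become more obtrusive
        return ["audible_beep", "flicker_warning_lights",
                "prefill_brake_pressure", "reduce_engine_torque"]
    # collision unavoidable without braking: brake fully
    return ["full_braking"]
```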
While the last stages of actuation or warning levels may be reached very seldom, the lower levels may be reached quite often. To limit or avoid disturbing the driver too often with false positives or warnings of obviously easy-to-avoid conflicts, while still braking sufficiently when necessary for pedestrians' safety, OEMs aspire to optimize the conditions that lead to the early warning or warning levels. This optimization is done via parameters.
There is a base TTC at which a system may actuate early warnings. Most of the parameters lead to earlier warnings, which equates to a higher TTC value. To simplify the concept, some OEMs quantize the parameters into levels, each of which adds a fixed amount of time to the base TTC. Any number of levels may be used. OEMs typically use three levels, such as for a ‘wiper status’ parameter: a level of 3 (such as engaged in heavy rain) leads to an addition of 0.6 seconds to the TTC, while a level of 1 (such as ‘Interval,’ engaged in slight rain) leads to an addition of 0.2 seconds to the TTC.
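Such quantized levels may be held in a simple lookup table, sketched below with the wiper-status values from the example above (the second parameter and all its values are assumptions):

```python
# Parameter levels mapped to fixed TTC additions in seconds. Wiper-status
# values follow the example above; the location entry is an assumed analog.
TTC_ADDITIONS_S = {
    "wiper_status":     {1: 0.2, 2: 0.4, 3: 0.6},  # interval .. heavy rain
    "location_context": {1: 0.1, 2: 0.3, 3: 0.5},  # e.g., rural .. school zone
}

def total_addition_s(active_levels: dict[str, int]) -> float:
    return sum(TTC_ADDITIONS_S[name][level]
               for name, level in active_levels.items())

# Heavy rain (level 3) alone raises the base TTC by 0.6 s.
print(total_addition_s({"wiper_status": 3}))  # 0.6
```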
Other known parameters include:
The system of the present invention uses additional vehicle-inherent parameters in determining an adjustment of the TTC (where the system may reduce the alert time if the system determines excessive tire wear or excessive brake temperature or wear or the like, whereby the time to stop the vehicle may be increased), such as:
Additionally, the system of the present invention may also take into account environmental and/or temporal parameters (where the system may reduce the alert time if conditions are such that the time to stop the vehicle may be increased), such as:
A more sophisticated system may be able to detect ground or road or vehicle tire-road interface conditions. This may be done by assessing the tire slip (where the system may reduce the alert time if conditions are such that the time to stop the vehicle may be increased). Such information may be generated by the ABS and TCS (traction control system) of the vehicle. Alternatively, or additionally, the system may assume a road condition based on the weather forecast or may receive the road condition from a data server, specific to the position of the vehicle at that moment.
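A sketch of such a tire-slip-based adjustment (the slip computation is the standard braking-slip ratio; the friction classes and addition values are assumptions):

```python
# Estimate braking slip from ABS/TCS wheel data and map the assumed road
# condition to a TTC addition; low friction lengthens the stopping distance,
# so the warning should come earlier.

def braking_slip(wheel_speed_mps: float, vehicle_speed_mps: float) -> float:
    if vehicle_speed_mps <= 0.0:
        return 0.0
    return max(0.0, (vehicle_speed_mps - wheel_speed_mps) / vehicle_speed_mps)

def road_condition_addition_s(slip: float) -> float:
    if slip < 0.05:
        return 0.0   # dry road, good traction (assumed class)
    if slip < 0.15:
        return 0.3   # wet road (assumed class and value)
    return 0.7       # snow or ice (assumed class and value)
```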
The geographical location or position may come from a navigation system with GPS. Additionally or alternatively, the system may have parameters according to position-dependent context information. The vehicle may use inherently present information or may receive information from a context server. For example, when the navigation system indicates the vehicle is approaching a school, the context parameter may add a higher value to the base TTC than when driving on a highway (where normally no pedestrians are to be expected).
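Sketched as a lookup over map or context-server features near the current GPS position (the feature names and addition values are assumptions):

```python
# Position-dependent context parameter: the most pedestrian-likely feature
# near the current position determines the TTC addition (seconds).
CONTEXT_ADDITION_S = {"school": 0.8, "bus_stop": 0.5, "crosswalk": 0.4,
                      "residential": 0.3, "highway": 0.0}

def location_addition_s(nearby_features: list[str]) -> float:
    return max((CONTEXT_ADDITION_S.get(f, 0.0) for f in nearby_features),
               default=0.0)

print(location_addition_s(["school", "crosswalk"]))  # 0.8
```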
Additional contexts at which increased or decreased parameter levels may be engaged may pertain to the current geographical location of the vehicle (where the system may reduce the alert time if the geographical location of the vehicle is at or near a location where more pedestrians are expected to be), and may include, for example:
Some contexts may be engaged in combination, such as condition, time and place together (where the timing of an event that occurs at a particular location at a particular time may be considered when the vehicle is determined to be at or near that location at or near the time of the event, such that the alert time may be reduced if the system determines that the vehicle is at or near such a location at such a time), such as:
There may be offline data involved, such as the map information or the bus schedule, as well as online data, such as a fire alert event. Sophisticated systems may keep the bus schedule updated online, so that a bus that is a minute late can be reflected in the TTC parameters correctly when the bus is actually present (rather than when it was scheduled).
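One way to sketch such a combined condition/time/place rule, with a live feed overriding the offline schedule (the data sources, time window and addition value are assumptions):

```python
# Combined condition/time/place context: add warning time for a bus stop
# only around the time a bus is actually there, preferring live data.
from datetime import datetime, timedelta

def bus_stop_addition_s(now: datetime, scheduled: datetime,
                        live_eta: datetime | None = None,
                        window: timedelta = timedelta(minutes=2)) -> float:
    arrival = live_eta if live_eta is not None else scheduled
    if abs(now - arrival) <= window:
        return 0.5  # assumed addition while passengers may (un)board
    return 0.0

# A bus one minute late still raises the TTC addition at its actual arrival.
now = datetime(2015, 9, 17, 8, 1)
print(bus_stop_addition_s(now, scheduled=datetime(2015, 9, 17, 8, 0),
                          live_eta=datetime(2015, 9, 17, 8, 1)))  # 0.5
```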
As another aspect of the present invention, the system may also take the condition of the driver and/or driver distractions into account as a parameter (where the system may reduce the alert time if it is determined that the driver may be distracted or inattentive), such as:
Optionally, the same procedure may be used, in the same manner and simultaneously, for setting (increasing) the parameters pertaining to the present position of the pedestrian.
This procedure may also be used for TTC parameters of cyclists, motorcyclists, rickshaws, horse riders (vulnerable road users or VRUs) or other vehicles or animals or other (potentially moving) obstacles, such as dropped cargo (rolling around), rolling bushes or toys (e.g., balls, RC or autonomous craft or drones); that is, all AEB (automatic emergency braking) features can take advantage of the adaptation of the thresholds for warnings or braking maneuvers.
Thus, the system of the present invention is operable to adjust or weight the processing of data associated with the vehicle traveling along a road to optimize the system's ability to warn against or avoid collision with a pedestrian. The system may increase the sensitivity of the alert (to effectively widen the vehicle path corridor) responsive to the various parameters discussed above.
Thus, the system of the present invention uses vehicle inherent parameters to influence the TTC warning time (at which the driver will be alerted to a potential collision with a pedestrian). The system may also or otherwise use environmental parameters and may generate context information from several input conditions, which influence the various parameters and the TTC warning time. The system may utilize one or more cameras of the vehicle to assist in determining the presence of pedestrians and may be responsive to an output of a GPS system of the vehicle (that indicates the current geographical location of the vehicle) and/or may be responsive to an external service provider or communication system (that may provide data pertaining to bus schedules or real time bus locations and/or school crossing information and/or weather details and/or the like). The system may be responsive to the various parameters (as provided or determined or as adjusted in response to other inputs or data) to determine a time at which the system may warn the driver of the vehicle of a potential hazard (collision with pedestrian) as the vehicle is driven along a road.
Thus, the system may initially determine a potential hazard or collision with a pedestrian and generate an alert to the driver of the vehicle that the hazardous condition has been determined. If the pedestrian continues on his or her path and the driver of the vehicle does not alter the vehicle's path or speed, the system may then control the vehicle and/or generate a pedestrian alert to alert the pedestrian of the potentially hazardous condition. For example, responsive to an initial determination that a detected pedestrian is moving towards the path of travel of the vehicle, the system may generate a pedestrian alert (such as actuating the vehicle's horn or flashing the vehicle's headlights) to alert the pedestrian of a potential hazard. If the pedestrian does not alter course, the system may (if a determination is made that the vehicle may collide with the pedestrian) apply the vehicle brakes to slow down or stop the vehicle before arriving at the location where the pedestrian crosses the vehicle's path. This may be done after the processor determines a time to collision based on a determined distance to the pedestrian and determined speed of the pedestrian and speed of the vehicle, and after the collision avoidance system generates an alert to the driver of the vehicle at a threshold time before the determined collision with the pedestrian.
The collision avoidance system may be operable to apply the brakes of the vehicle to avoid collision with a determined pedestrian. Optionally, the system may adjust the degree of braking responsive to the predicted location of the pedestrian at the time that the vehicle arrives at the pedestrian's path. For example, the system may gently or lightly apply the brakes to slow the vehicle's speed responsive to a determination that the pedestrian will be exiting the path of travel of the vehicle towards the end of the determined time to collision (i.e., the pedestrian is fully or almost fully across the vehicle path by the time the vehicle arrives at the pedestrian's path). Optionally, the collision avoidance system may apply the brakes of the vehicle to stop the vehicle responsive to a determination that the pedestrian will be entering the path of travel of the vehicle towards the end of the determined time to collision (i.e., the pedestrian will likely be in the path of travel of the vehicle at the time that the vehicle arrives at the pedestrian's path). Optionally, the collision avoidance system may generate a pedestrian alert to the pedestrian responsive to a determination that the pedestrian will be entering the path of travel of the vehicle towards the end of the determined time to collision (i.e., at or before the time at which the vehicle arrives at the pedestrian's path).
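A simplified sketch of this graded braking decision, with the pedestrian's motion reduced to one lateral axis (the geometry and thresholds are assumptions, not values from the specification):

```python
# Predict where the pedestrian will be, laterally, when the vehicle arrives,
# then grade the response: light braking if the path will be clear by then,
# otherwise full braking (optionally with a pedestrian-directed alert).

def lateral_pos_at_arrival_m(ped_lateral_pos_m: float,
                             ped_lateral_speed_mps: float,
                             ttc_s: float) -> float:
    return ped_lateral_pos_m + ped_lateral_speed_mps * ttc_s

def graded_response(ped_lateral_pos_m: float, ped_lateral_speed_mps: float,
                    ttc_s: float, path_half_width_m: float = 1.0) -> str:
    pos = lateral_pos_at_arrival_m(ped_lateral_pos_m,
                                   ped_lateral_speed_mps, ttc_s)
    moving_out = ped_lateral_pos_m * ped_lateral_speed_mps > 0.0
    if abs(pos) > path_half_width_m and moving_out:
        return "light_braking"   # pedestrian will have exited the path
    return "full_braking_and_pedestrian_alert"

# Pedestrian 0.5 m into the path, walking out at 1.5 m/s, vehicle 1.2 s away:
print(graded_response(0.5, 1.5, 1.2))  # light_braking (predicted at 2.3 m)
```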
The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an EYEQ2 or EYEQ3 image processing chip available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. Preferably, the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO 2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. Publication No. US-2012-0062743, which are hereby incorporated herein by reference in their entireties.
Optionally, the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device disposed at or in the interior rearview mirror assembly of the vehicle, such as by utilizing aspects of the video mirror display systems described in U.S. Pat. No. 6,690,268 and/or U.S. Publication No. US-2012-0162427, which are hereby incorporated herein by reference in their entireties. The video mirror display may comprise any suitable devices and systems and optionally may utilize aspects of the compass display systems described in U.S. Pat. Nos. 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,677,851; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,508; 6,222,460; 6,513,252 and/or 6,642,851, and/or European patent application, published Oct. 11, 2000 under Publication No. EP 0 1043566, and/or U.S. Publication No. US-2006-0061008, which are all hereby incorporated herein by reference in their entireties. Optionally, the video mirror display screen or device may be operable to display images captured by a rearward viewing camera of the vehicle during a reversing maneuver of the vehicle (such as responsive to the vehicle gear actuator being placed in a reverse gear position or the like) to assist the driver in backing up the vehicle, and optionally may be operable to display the compass heading or directional heading character or icon when the vehicle is not undertaking a reversing maneuver, such as when the vehicle is being driven in a forward direction along a road (such as by utilizing aspects of the display system described in International Publication No. WO 2012/051500, which is hereby incorporated herein by reference in its entirety).
Optionally, the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2010/099416; WO 2011/028686; WO 2012/075250; WO 2013/019795; WO 2012/075250; WO 2012/145822; WO 2013/081985; WO 2013/086249 and/or WO 2013/109869, and/or U.S. Publication No. US-2012-0162427, which are hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application is a continuation of U.S. patent application Ser. No. 17/643,880, filed Dec. 13, 2021, now U.S. Pat. No. 11,572,065, which is a continuation of U.S. patent application Ser. No. 15/935,545, filed Mar. 26, 2018, now U.S. Pat. No. 11,198,432, which is a continuation of U.S. patent application Ser. No. 14/854,376, filed Sep. 15, 2015, now U.S. Pat. No. 9,925,980, which claims the filing benefits of U.S. provisional applications, Ser. No. 62/129,285, filed Mar. 6, 2015, and Ser. No. 62/051,446, filed Sep. 17, 2014, which are hereby incorporated herein by reference in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
5432509 | Kajiwara | Jul 1995 | A |
5541590 | Nishio | Jul 1996 | A |
5550677 | Schofield et al. | Aug 1996 | A |
5555555 | Sato et al. | Sep 1996 | A |
5568027 | Teder | Oct 1996 | A |
5574443 | Hsieh | Nov 1996 | A |
5581464 | Woll et al. | Dec 1996 | A |
5614788 | Mullins | Mar 1997 | A |
5619370 | Guinosso | Apr 1997 | A |
5632092 | Blank et al. | May 1997 | A |
5634709 | Iwama | Jun 1997 | A |
5642299 | Hardin et al. | Jun 1997 | A |
5648835 | Uzawa | Jul 1997 | A |
5650944 | Kise | Jul 1997 | A |
5660454 | Mori et al. | Aug 1997 | A |
5661303 | Teder | Aug 1997 | A |
5666028 | Bechtel et al. | Sep 1997 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5677851 | Kingdon et al. | Oct 1997 | A |
5699044 | Van Lente et al. | Dec 1997 | A |
5724316 | Brunts | Mar 1998 | A |
5732379 | Eckert et al. | Mar 1998 | A |
5760828 | Cortes | Jun 1998 | A |
5760931 | Saburi et al. | Jun 1998 | A |
5761094 | Olson et al. | Jun 1998 | A |
5765116 | Wilson-Jones et al. | Jun 1998 | A |
5765118 | Fukatani | Jun 1998 | A |
5781437 | Wiemer et al. | Jul 1998 | A |
5790403 | Nakayama | Aug 1998 | A |
5790973 | Blaker et al. | Aug 1998 | A |
5793308 | Rosinski et al. | Aug 1998 | A |
5793420 | Schmidt | Aug 1998 | A |
5796094 | Schofield et al. | Aug 1998 | A |
5837994 | Stam et al. | Nov 1998 | A |
5844505 | Van Ryzin | Dec 1998 | A |
5844682 | Kiyomoto et al. | Dec 1998 | A |
5845000 | Breed et al. | Dec 1998 | A |
5848802 | Breed et al. | Dec 1998 | A |
5850176 | Kinoshita et al. | Dec 1998 | A |
5850254 | Takano et al. | Dec 1998 | A |
5867591 | Onda | Feb 1999 | A |
5877707 | Kowalick | Mar 1999 | A |
5877897 | Schofield et al. | Mar 1999 | A |
5878357 | Sivashankar et al. | Mar 1999 | A |
5878370 | Olson | Mar 1999 | A |
5883739 | Ashihara et al. | Mar 1999 | A |
5884212 | Lion | Mar 1999 | A |
5890021 | Onoda | Mar 1999 | A |
5896085 | Mori et al. | Apr 1999 | A |
5899956 | Chan | May 1999 | A |
5915800 | Hiwatashi et al. | Jun 1999 | A |
5923027 | Stam et al. | Jul 1999 | A |
5924212 | Domanski | Jul 1999 | A |
5959555 | Furuta | Sep 1999 | A |
5963247 | Banitt | Oct 1999 | A |
5990469 | Bechtel et al. | Nov 1999 | A |
5990649 | Nagao et al. | Nov 1999 | A |
6020704 | Buschur | Feb 2000 | A |
6049171 | Stam et al. | Apr 2000 | A |
6084519 | Coulling et al. | Jul 2000 | A |
6097024 | Stam et al. | Aug 2000 | A |
6100799 | Fenk | Aug 2000 | A |
6144022 | Tenenbaum et al. | Nov 2000 | A |
6175300 | Kendrick | Jan 2001 | B1 |
6178034 | Allemand et al. | Jan 2001 | B1 |
6223114 | Boros et al. | Apr 2001 | B1 |
6227689 | Miller | May 2001 | B1 |
6266082 | Yonezawa et al. | Jul 2001 | B1 |
6266442 | Laumeyer et al. | Jul 2001 | B1 |
6285393 | Shimoura et al. | Sep 2001 | B1 |
6317057 | Lee | Nov 2001 | B1 |
6320282 | Caldwell | Nov 2001 | B1 |
6333759 | Mazzilli | Dec 2001 | B1 |
6370329 | Teuchert | Apr 2002 | B1 |
6392315 | Jones et al. | May 2002 | B1 |
6430303 | Naoi et al. | Aug 2002 | B1 |
6442465 | Breed et al. | Aug 2002 | B2 |
6547133 | Devries, Jr. et al. | Apr 2003 | B1 |
6553130 | Lemelson et al. | Apr 2003 | B1 |
6574033 | Chui et al. | Jun 2003 | B1 |
6589625 | Kothari et al. | Jul 2003 | B1 |
6594583 | Ogura et al. | Jul 2003 | B2 |
6611610 | Stam et al. | Aug 2003 | B1 |
6636258 | Strumolo | Oct 2003 | B2 |
6690268 | Schofield et al. | Feb 2004 | B2 |
6700605 | Toyoda et al. | Mar 2004 | B1 |
6704621 | Stein et al. | Mar 2004 | B1 |
6735506 | Breed et al. | May 2004 | B2 |
6744353 | Sjonell | Jun 2004 | B2 |
6762867 | Lippert et al. | Jul 2004 | B2 |
6795221 | Urey | Sep 2004 | B1 |
6819231 | Berberich et al. | Nov 2004 | B2 |
6823241 | Shirato et al. | Nov 2004 | B2 |
6889161 | Winner et al. | May 2005 | B2 |
6909753 | Meehan et al. | Jun 2005 | B2 |
6975775 | Rykowski et al. | Dec 2005 | B2 |
6989736 | Berberich et al. | Jan 2006 | B2 |
7038577 | Pawlicki et al. | May 2006 | B2 |
7062300 | Kim | Jun 2006 | B1 |
7065432 | Moisel et al. | Jun 2006 | B2 |
7079017 | Lang et al. | Jul 2006 | B2 |
7085637 | Breed et al. | Aug 2006 | B2 |
7092548 | Laumeyer et al. | Aug 2006 | B2 |
7111968 | Bauer et al. | Sep 2006 | B2 |
7116246 | Winter et al. | Oct 2006 | B2 |
7136753 | Samukawa et al. | Nov 2006 | B2 |
7145519 | Takahashi et al. | Dec 2006 | B2 |
7149613 | Stam et al. | Dec 2006 | B2 |
7161616 | Okamoto et al. | Jan 2007 | B1 |
7202776 | Breed | Apr 2007 | B2 |
7227611 | Hull et al. | Jun 2007 | B2 |
7365769 | Mager | Apr 2008 | B1 |
7460951 | Altan | Dec 2008 | B2 |
7526103 | Schofield et al. | Apr 2009 | B2 |
7592928 | Chinomi et al. | Sep 2009 | B2 |
7639149 | Katoh | Dec 2009 | B2 |
7681960 | Wanke et al. | Mar 2010 | B2 |
7720580 | Diggins-Luthman | May 2010 | B2 |
7724962 | Zhu et al. | May 2010 | B2 |
7952490 | Fechner et al. | May 2011 | B2 |
8027029 | Lu et al. | Sep 2011 | B2 |
8340866 | Hanzawa et al. | Dec 2012 | B2 |
8788176 | Yopp | Jul 2014 | B1 |
8849495 | Chundrlik, Jr. et al. | Sep 2014 | B2 |
9090234 | Johnson et al. | Jul 2015 | B2 |
9092986 | Salomonsson et al. | Jul 2015 | B2 |
9196164 | Urmson | Nov 2015 | B1 |
9925980 | Edo Ros | Mar 2018 | B2 |
11198432 | Edo Ros | Dec 2021 | B2 |
11572065 | Edo Ros | Feb 2023 | B2 |
20020113873 | Williams | Aug 2002 | A1 |
20020118862 | Sugimoto et al. | Aug 2002 | A1 |
20030137586 | Lewellen | Jul 2003 | A1 |
20030222982 | Hamdan et al. | Dec 2003 | A1 |
20040022416 | Lemelson et al. | Feb 2004 | A1 |
20040114381 | Salmeen et al. | Jun 2004 | A1 |
20060018511 | Stam et al. | Jan 2006 | A1 |
20060018512 | Stam et al. | Jan 2006 | A1 |
20060091813 | Stam et al. | May 2006 | A1 |
20060103727 | Tseng | May 2006 | A1 |
20060164221 | Jensen | Jul 2006 | A1 |
20060250501 | Wildmann et al. | Nov 2006 | A1 |
20060255920 | Maeda et al. | Nov 2006 | A1 |
20060290479 | Akatsuka et al. | Dec 2006 | A1 |
20070104476 | Yasutomi et al. | May 2007 | A1 |
20080243389 | Inoue et al. | Oct 2008 | A1 |
20090093938 | Isaji et al. | Apr 2009 | A1 |
20090113509 | Tseng et al. | Apr 2009 | A1 |
20090171559 | Lehtiniemi et al. | Jul 2009 | A1 |
20090177347 | Breuer et al. | Jul 2009 | A1 |
20090244361 | Gebauer et al. | Oct 2009 | A1 |
20090265069 | Desbrunes | Oct 2009 | A1 |
20100020170 | Higgins-Luthman et al. | Jan 2010 | A1 |
20100228437 | Hanzawa et al. | Sep 2010 | A1 |
20110115615 | Luo et al. | May 2011 | A1 |
20110157309 | Bennett et al. | Jun 2011 | A1 |
20110224978 | Sawada | Sep 2011 | A1 |
20120035846 | Sakamoto et al. | Feb 2012 | A1 |
20120044066 | Mauderer et al. | Feb 2012 | A1 |
20120218412 | Dellantoni et al. | Aug 2012 | A1 |
20120262340 | Hassan et al. | Oct 2012 | A1 |
20130002873 | Hess | Jan 2013 | A1 |
20130116859 | Ihlenburg et al. | May 2013 | A1 |
20130124052 | Hahne | May 2013 | A1 |
20130129150 | Saito | May 2013 | A1 |
20130131918 | Hahne | May 2013 | A1 |
20130141578 | Chundrlik, Jr. et al. | Jun 2013 | A1 |
20130222593 | Byrne et al. | Aug 2013 | A1 |
20130278769 | Nix et al. | Oct 2013 | A1 |
20130314503 | Nix et al. | Nov 2013 | A1 |
20140044310 | Schamp et al. | Feb 2014 | A1 |
20140067206 | Pflug | Mar 2014 | A1 |
20140156157 | Johnson et al. | Jun 2014 | A1 |
20140222280 | Salomonsson et al. | Aug 2014 | A1 |
20140313339 | Diessner | Oct 2014 | A1 |
20140324330 | Minemura | Oct 2014 | A1 |
20140379233 | Chundrlik, Jr. et al. | Dec 2014 | A1 |
20150166062 | Johnson et al. | Jun 2015 | A1 |
20150291159 | Sasabuchi | Oct 2015 | A1 |
Number | Date | Country | |
---|---|---|---|
20230182727 A1 | Jun 2023 | US |
Number | Date | Country | |
---|---|---|---|
62129285 | Mar 2015 | US | |
62051446 | Sep 2014 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17643880 | Dec 2021 | US |
Child | 18164789 | US | |
Parent | 15935545 | Mar 2018 | US |
Child | 17643880 | US | |
Parent | 14854376 | Sep 2015 | US |
Child | 15935545 | US |