The present invention relates generally to a vision and control system for a vehicle and, more particularly, to a vehicle vision and control system that utilizes one or more cameras at a vehicle.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.
The present invention provides a driver assistance system or vision system or imaging system for a vehicle that utilizes one or more cameras (preferably one or more CMOS cameras) to capture image data representative of images exterior of the vehicle, and provides a control that, responsive to a determination of an emergency driving condition, controls steering and braking of the vehicle to guide the vehicle to a targeted stopping location.
According to an aspect of the present invention, a driver assistance system of a vehicle comprises a control operable to control steering of the vehicle and braking of the vehicle responsive to a determination of an emergency driving event. The control, responsive to a determination of a lane in which the vehicle is traveling, and responsive to the determination of an emergency driving event, controls the steering of the vehicle to steer the vehicle along the determined lane. Responsive to an input from the driver indicative of the driver not wanting to stop the vehicle, the control does not control braking of the vehicle to quickly stop the vehicle.
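By way of illustration, the control behavior described above may be sketched roughly as follows (a minimal Python sketch; the interfaces `determine_lane`, `emergency_event`, `steer_along`, `driver_wants_to_continue` and `brake_to_stop` are hypothetical placeholders, not the disclosed implementation):

```python
# Minimal sketch of one control cycle of the described behavior
# (all sensor/actuator interfaces are hypothetical placeholders).

def control_step(sensors, actuators):
    lane = sensors.determine_lane()        # e.g., from forward camera image data
    emergency = sensors.emergency_event()  # e.g., blown tire, loss of traction

    if emergency and lane is not None:
        # Steer to keep the vehicle traveling along the determined lane.
        actuators.steer_along(lane)

        # Brake to quickly stop the vehicle only if the driver has not
        # indicated (such as by accelerating) a desire to keep moving.
        if not sensors.driver_wants_to_continue():
            actuators.brake_to_stop()
```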
The control may be operable to determine a targeted stopping location and may control steering of the vehicle and braking of the vehicle to guide the vehicle to and stop the vehicle at the targeted stopping location. The control may determine the targeted stopping location substantially ahead of the vehicle and after a road condition changes. The determined road condition change may comprise one of (i) a construction zone ending, (ii) a narrow road widening and (iii) a shoulder of the road beginning.
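For example, the selection of a targeted stopping location after a detected road condition change might be sketched as follows (the feature labels are illustrative assumptions, not the disclosed implementation):

```python
# Hypothetical sketch: pick the first location ahead of the vehicle that
# lies past a road condition change that permits a safe stop.

ROAD_CHANGES_ALLOWING_STOP = {
    "construction_zone_ends",
    "narrow_road_widens",
    "shoulder_begins",
}

def targeted_stopping_location(road_features_ahead):
    """road_features_ahead: list of (distance_m, feature) sorted by distance."""
    for distance_m, feature in road_features_ahead:
        if feature in ROAD_CHANGES_ALLOWING_STOP:
            return distance_m  # first candidate past the condition change
    return None  # no suitable stopping location within sensing range
```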
The input from the driver may comprise the driver accelerating the vehicle, which indicates that the vehicle is not to be stopped at that location, even though an emergency driving event (such as a blown tire) may have been determined by the control. The control may determine the emergency driving condition responsive to one or more sensors of the vehicle and may switch the system into an emergency steering mode. The lane may be determined responsive to image processing of image data captured by a camera disposed at the vehicle and having a field of view exterior and forward of the vehicle.
Responsive to a determination of an emergency driving event, the control switches to operate under the emergency steering mode and steers the vehicle to maintain the vehicle moving along the determined path of travel. The control (while operating in the emergency steering mode) may also control braking at individual ones of the wheels of the vehicle to assist in guiding the vehicle along the determined path of travel. The control (while operating in the emergency steering mode) may also control driving (such as accelerating or decelerating) of individual ones of the wheels of the vehicle to assist in guiding the vehicle along the determined path of travel.
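One common way to realize such per-wheel assistance is differential braking/driving that produces a corrective yaw moment; the following is a minimal sketch under that assumption (the gain and sign conventions are illustrative, not taken from the disclosure):

```python
# Hypothetical sketch: map a yaw-moment request to individual wheel torque
# commands (negative = braking, positive = driving), as may be used while
# operating in the emergency steering mode.

WHEELS = ("front_left", "front_right", "rear_left", "rear_right")

def wheel_commands(yaw_error, gain=1.0):
    """Positive yaw_error means the vehicle should rotate left toward the path."""
    moment = gain * yaw_error
    cmd = dict.fromkeys(WHEELS, 0.0)
    if moment > 0:
        cmd["front_left"] = cmd["rear_left"] = -moment    # brake left side
        cmd["front_right"] = cmd["rear_right"] = moment   # drive right side
    elif moment < 0:
        cmd["front_right"] = cmd["rear_right"] = moment   # brake right side
        cmd["front_left"] = cmd["rear_left"] = -moment    # drive left side
    return cmd
```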
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle vision system and/or driver assist system and/or object detection system and/or alert system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and provide an output to a display device for displaying images representative of the captured image data. Optionally, the vision system may provide a display, such as a rearview display or a top down or bird's eye or surround view display or the like.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes at least one exterior facing imaging sensor or camera, such as a rearward facing imaging sensor or camera 14a (and the system may optionally include multiple exterior facing imaging sensors or cameras, such as a forwardly facing camera 14b at the front (or at the windshield) of the vehicle, and sidewardly/rearwardly facing cameras 14c, 14d at respective sides of the vehicle), which captures images exterior of the vehicle, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera.
The present invention combines the perception systems of advanced driver assistance and safety systems (or optionally similar detection systems provided just for that purpose instead of sensors native to the driver assistance system) with vehicle torque, traction, stability and steering control systems to add value to the whole, especially in critical maneuvering situations. The combined control 19 may be located anywhere in the vehicle, and may be linked to vehicle sensors and sensor processing systems, including the vision ECU 12.
The system may exceed known electronic stability program (ESP) or electronic stability control intervention for lane keeping assist (LKA) in functionality, effectiveness and comfort by using more input data and more controlled drivetrain areas. In contrast to known systems, the system of the present invention may be activated or stay activated even when ABS, ESP®, Brake Assist or another active safety system intervenes, when a tire runs flat or bursts, or after a collision event.
For example, at least the active lane keep assist (LKA) may be combined or fused with at least the electronic stability control (ESC) (also called VSC or ESP) such that the LKA detects and provides the desired boundaries of a safe vehicle path or 'free space' (thereby establishing the set point) in which the vehicle is supposed to remain, supported by the ESC control interventions. The focus may be set on controlling the vehicle safely on a non-collision path when in an emergency situation. Such a situation may occur when a tire bursts abruptly or the vehicle loses traction on a slippery road, such as due to hydroplaning, ice conditions or snow conditions. Typically, staying within the road's boundaries or within the current lane markings is safer than departing from the road or lane unintentionally. The system of the present invention may control the vehicle within the road's or lane's boundaries also when traveling in or along or just entering a curve in the road.
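To make the fusion concrete, a minimal sketch of an ESC intervention test against LKA-supplied free-space boundaries might look as follows (the prediction horizon and the boundary representation are assumptions for illustration):

```python
# Hypothetical sketch: the LKA supplies the lateral boundaries of the safe
# path (the set point region); the ESC intervenes when the predicted lateral
# position would leave that region.

def esc_intervention_needed(lateral_offset_m, lateral_speed_mps,
                            boundaries_m, horizon_s=0.5):
    """boundaries_m: (left_limit, right_limit) relative to the lane center,
    with left_limit negative and right_limit positive."""
    predicted = lateral_offset_m + lateral_speed_mps * horizon_s
    left_limit, right_limit = boundaries_m
    return not (left_limit < predicted < right_limit)
```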
The system of the present invention may be further improved by also taking collision mitigation steering and braking functions into account, whereby the system may set a desired collision avoidance or mitigation path.
Presumably, the driver is interested in bringing the vehicle to a halt in emergency situations (during the emergency steering mode), or the collision mitigation braking (CMB) system is signaling a collision hazard, and the system may thus assist by slowing down the vehicle in a controlled manner. CMB systems are typically made in a way that allows the driver to override the function. For example, when the driver steps onto the accelerator while the CMB system is controlling the brakes of the vehicle, the CMB system will switch off so as not to interfere with the driver's intention to accelerate. A full shutoff of the assistance system may be suboptimal, though. There may be situations in which an automatically induced stopping is not desired, such as, for example, a tire burst emergency occurring while crossing a construction site, where the driver does not want to stop his or her vehicle but rather to continue driving until exiting the construction site. Thus, the driver may remain free in his or her decision when to brake to slow down the vehicle, even when in a mode of emergency such as having a blown tire. The system of the present invention may stay activated, aiding in steering, accelerating and decelerating, while neither braking to stop the vehicle nor switching off the assistance system.
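The difference from a conventional full CMB shutoff can be sketched as a small decision function (the state flags are hypothetical, for illustration only):

```python
# Hypothetical sketch: accelerating suppresses only the automatic
# stop-braking; steering and speed assistance remain active rather than
# the whole assistance system switching off.

def assistance_outputs(emergency_mode, driver_accelerating):
    return {
        "steer_assist": emergency_mode,   # stays active during the emergency
        "speed_assist": emergency_mode,   # controlled accel/decel stays active
        "auto_stop_brake": emergency_mode and not driver_accelerating,
    }
```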
To achieve this, the vehicle's vision and environmental sensor system may optionally fuse and process the scene data to generate an environmental scene map, which then may optionally be processed by a scene understanding and context classification algorithm, such as by utilizing aspects of the systems and algorithms described in U.S. Publication No. US-2015-0344028, which is hereby incorporated herein by reference in its entirety, for the purpose of finding proper emergency stopping points or locations and safe paths to the determined stopping point or points or locations, including executing lane changes and turning maneuvers as well as acceleration and deceleration maneuvers within the remaining extent of travel to the determined stopping point or location (this may be desired by the driver anyway, or may be necessary in situations where the driver is unable to drive due to an emergency situation occurring, such as if the driver becomes unconscious or non-responsive). Optionally, the system may determine a targeted stopping location ahead of the vehicle and may control the vehicle to drive the vehicle to and stop the vehicle at the targeted location (a safe harbor maneuver). Optionally, the system may, responsive to, for example, a determination that the vehicle is at a construction zone or otherwise at an area where the vehicle cannot be pulled over (such as a narrowed road before it widens, or a road without a shoulder before a shoulder of the road starts or appears ahead), continue to control the vehicle steering and acceleration/deceleration to guide the vehicle through the determined construction zone to a targeted location where it is safe to steer the vehicle to the side of the road and stop.
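The remaining travel to a determined stopping point, including lane changes and a controlled deceleration, might be decomposed as in the following sketch (the names and the simple constant-deceleration model are assumptions for illustration):

```python
# Hypothetical sketch: break the remaining travel to the targeted stopping
# location into lane-change and deceleration maneuvers.

def plan_to_stop(current_lane, target_lane, distance_to_stop_m, speed_mps):
    maneuvers = []
    # Lane changes toward the lane adjacent to the stopping location.
    step = 1 if target_lane > current_lane else -1
    for lane in range(current_lane, target_lane, step):
        maneuvers.append(("lane_change", lane + step))
    # Constant deceleration bringing the vehicle to rest at the target:
    # v^2 = 2*a*d  =>  a = v^2 / (2*d)   (guard against a zero distance)
    decel = speed_mps ** 2 / (2.0 * max(distance_to_stop_m, 0.1))
    maneuvers.append(("decelerate_mps2", decel))
    return maneuvers
```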
The system of the present invention may also be able to control the vehicle along the planned (emergency) path when the vehicle is not in a fully proper technical state, such as when a tire is blown, or when the road conditions are bad, such as on a snowy road. The scene understanding (or context classification) system may additionally process and provide the road condition status, such as dry road, snowy road, icy road, muddy road, wet road or gravel road, for consideration in the vehicle control algorithm. For that, drivetrain sensors or devices may be linked or fused into the scene understanding system.
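For illustration, the road condition status could feed the control algorithm as a simple friction-based scaling (the friction values below are rough illustrative assumptions, not data from the disclosure):

```python
# Hypothetical sketch: scale steering and braking demands by an assumed
# friction estimate for the classified road condition, so the commands stay
# within the available grip.

FRICTION_ESTIMATES = {
    "dry": 1.0, "wet": 0.7, "gravel": 0.6,
    "snowy": 0.3, "muddy": 0.3, "icy": 0.1,
}

def limit_commands(steer_cmd, brake_cmd, road_condition):
    mu = FRICTION_ESTIMATES.get(road_condition, 0.5)  # conservative default
    return steer_cmd * mu, brake_cmd * mu
```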
Optionally, a blown tire may be detected by a direct or indirect tire pressure monitoring system and may also be provided as an input to or fused to the system of the present invention. Optionally, the system may incorporate or activate an emergency support system (ESS) after an occurrence such as an emergency stop after a blown tire while operating in the emergency steering mode.
Optionally, all-wheel drive (AWD) (such as Flex4™ and Actimax™) torque distribution systems may also be provided as an input to or fused to the system of the present invention.
Optionally, the acceleration sensors may also be provided as an input to or fused to the system of the present invention.
Optionally, the anti-lock braking system (ABS) may also be provided as an input to or fused to the system of the present invention.
Optionally, the anti-slip regulation (ASR) may also be provided as an input to or fused to the system of the present invention. Optionally, the traction control (TRC) may also be provided as an input to or fused to the system of the present invention. Optionally, the lateral stability control (LSC) may also be provided as an input to or fused to the system of the present invention. Optionally, the roll stability control (RSC) may also be provided as an input to or fused to the system of the present invention.
To follow the path desired by the driver or set by the scene understanding system, the vehicle powertrain may have to be controlled in an abnormal manner or mode or emergency steering mode. That abnormal or emergency mode may be triggered by less severe events, such as the vehicle not fully following its steering direction any more, such as when driving in deep snow, sand or gravel, so that the vehicle understeers due to some floating, or sways and slings due to rapid changes in turning. The abnormal or emergency mode may also be triggered by severe or emergency events, such as bouncing off after a (light) collision or mostly losing road contact. For example, the abnormal or emergency mode may be triggered during a jump (when one or more of the wheels lose or substantially lose contact with the ground), or when the road is very slippery due to ice or mud or snow, or when any vehicle powertrain part is substantially malfunctioning, such as when a tire runs flat, a tire is blown, a wheel has been lost, a wheel is wobbling (such as due to lost lug nuts), a tire sticks, the gearbox sticks, the engine sticks, the front wheel steering is blocked or is limited to one direction, the rear wheel steering (if present) is blocked, the vehicle is not balanced (or is greatly imbalanced) due to lost or shifted cargo or vehicle parts or, in the case of liquid cargo, due to sloshing cargo, or the rear axle is not aligned any more (possibly after a collision). A wobbling or lost tire may optionally be detected by using a system such as described in U.S. provisional application No. 62/347,836, filed Jun. 9, 2016, which is hereby incorporated herein by reference in its entirety.
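The trigger conditions enumerated above might be grouped as in the following sketch (the event labels are illustrative placeholders):

```python
# Hypothetical sketch: events that may switch the control into the abnormal
# (emergency steering) mode, grouped roughly by severity as described above.

LESS_SEVERE_TRIGGERS = {
    "understeer_on_loose_surface",   # deep snow, sand or gravel
    "sway_from_rapid_turn_changes",
}
SEVERE_TRIGGERS = {
    "post_collision_bounce", "wheels_airborne", "very_slippery_surface",
    "tire_flat_or_blown", "wheel_lost_or_wobbling", "steering_blocked",
    "drivetrain_stuck", "cargo_shifted_or_sloshing", "rear_axle_misaligned",
}

def emergency_mode_triggered(active_events):
    """active_events: set of currently detected event labels."""
    return bool(active_events & (LESS_SEVERE_TRIGGERS | SEVERE_TRIGGERS))
```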
In the abnormal mode (or emergency steering mode), the direct or scaled or incrementally scaled linkage of the steering wheel to the normally steered tires may optionally be disconnected and fully machine controlled, or replaced by a behavior different from normal, such as in the lost tire example illustrated in the drawings.
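A machine-controlled replacement for the normal steering linkage might, for example, treat the steering wheel position as a path intent and cancel a constant pull caused by the fault; the sketch below is one such assumed mapping, not the disclosed implementation:

```python
# Hypothetical sketch: in the emergency steering mode, the fixed
# steering-wheel-to-road-wheel ratio is replaced by a machine-controlled
# mapping that compensates for the fault (e.g., a blown tire pulling the
# vehicle to one side).

def emergency_steering_angle(driver_wheel_angle, fault_pull_bias, k_intent=0.5):
    # Interpret the driver's input as a (scaled) desired heading change and
    # subtract the steady pull induced by the fault.
    return k_intent * driver_wheel_angle - fault_pull_bias
```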
Thus, the present invention provides advanced control of the vehicle steering and braking and accelerating/decelerating, and may provide such braking and accelerating/decelerating at the individual wheels, responsive to a determination that the vehicle is undergoing an emergency driving condition, such as a blown tire or a slide or skid on slippery surfaces (such as ice, snow, mud, or the like). The system of the present invention may control the steering and wheels of the vehicle to maintain the vehicle moving in the desired direction without immediately stopping the vehicle, such that the system and the driver can maneuver the vehicle to a targeted safe stopping location. The system may determine the path of travel of the vehicle responsive to image processing of image data captured by a forward viewing camera of the vehicle (such as to detect lane markers on the road ahead of the vehicle or to detect the boundaries of the road on which the vehicle is traveling), and the system may control the steering and wheels of the vehicle responsive to the determined lane markers and responsive to a determination of the type of emergency situation experienced by the vehicle and driver.
The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an EYEQ2™ or EYEQ3™ image processing chip available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. Preferably, the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 9,233,641; 9,146,898; 9,174,574; 9,090,234; 9,077,098; 8,818,042; 8,886,401; 9,077,962; 9,068,390; 9,140,789; 9,092,986; 9,205,776; 8,917,169; 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or U.S. Publication Nos. US-2014-0340510; US-2014-0313339; US-2014-0347486; US-2014-0320658; US-2014-0336876; US-2014-0307095; US-2014-0327774; US-2014-0327772; US-2014-0320636; US-2014-0293057; US-2014-0309884; US-2014-0226012; US-2014-0293042; US-2014-0218535; US-2014-0218535; US-2014-0247354; US-2014-0247355; US-2014-0247352; US-2014-0232869; US-2014-0211009; US-2014-0160276; US-2014-0168437; US-2014-0168415; US-2014-0160291; US-2014-0152825; US-2014-0139676; US-2014-0138140; US-2014-0104426; US-2014-0098229; US-2014-0085472; US-2014-0067206; US-2014-0049646; US-2014-0052340; US-2014-0025240; US-2014-0028852; US-2014-005907; US-2013-0314503; US-2013-0298866; US-2013-0222593; US-2013-0300869; US-2013-0278769; US-2013-0258077; US-2013-0258077; US-2013-0242099; US-2013-0215271; US-2013-0141578 and/or US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO 2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.
Optionally, the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device utilizing aspects of the video display systems described in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,370,983; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187 and/or 6,690,268, and/or in U.S. Publication Nos. US-2006-0061008; US-2006-0050018 and/or US-2012-0162427, which are hereby incorporated herein by reference in their entireties.
Optionally, the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2010/099416; WO 2011/028686; WO 2012/075250; WO 2013/019795; WO 2012/075250; WO 2012/145822; WO 2013/081985; WO 2013/086249 and/or WO 2013/109869, and/or U.S. Publication No. US-2012-0162427, which are hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application is a continuation of U.S. patent application Ser. No. 17/248,116, filed Jan. 11, 2021, now U.S. Pat. No. 11,618,442, which is a continuation of U.S. patent application Ser. No. 16/203,976, filed Nov. 29, 2018, now U.S. Pat. No. 10,889,293, which is a continuation of U.S. patent application Ser. No. 15/358,166, filed Nov. 22, 2016, now U.S. Pat. No. 10,144,419, which claims the filing benefits of U.S. provisional application Ser. No. 62/398,091, filed Sep. 22, 2016, and U.S. provisional application Ser. No. 62/258,722, filed Nov. 23, 2015, which are hereby incorporated herein by reference in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
5432509 | Kajiwara | Jul 1995 | A |
5541590 | Nishio | Jul 1996 | A |
5550677 | Schofield et al. | Aug 1996 | A |
5555555 | Sato et al. | Sep 1996 | A |
5568027 | Teder | Oct 1996 | A |
5574443 | Hsieh | Nov 1996 | A |
5581464 | Woll et al. | Dec 1996 | A |
5614788 | Mullins | Mar 1997 | A |
5619370 | Guinosso | Apr 1997 | A |
5632092 | Blank et al. | May 1997 | A |
5634709 | Iwama | Jun 1997 | A |
5642299 | Hardin et al. | Jun 1997 | A |
5648835 | Uzawa | Jul 1997 | A |
5650944 | Kise | Jul 1997 | A |
5660454 | Mori et al. | Aug 1997 | A |
5661303 | Teder | Aug 1997 | A |
5666028 | Bechtel et al. | Sep 1997 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5677851 | Kingdon et al. | Oct 1997 | A |
5699044 | Van Lente et al. | Dec 1997 | A |
5724316 | Brunts | Mar 1998 | A |
5732379 | Eckert et al. | Mar 1998 | A |
5760828 | Cortes | Jun 1998 | A |
5760931 | Saburi et al. | Jun 1998 | A |
5761094 | Olson et al. | Jun 1998 | A |
5765116 | Wilson-Jones et al. | Jun 1998 | A |
5765118 | Fukatani | Jun 1998 | A |
5781437 | Wiemer et al. | Jul 1998 | A |
5790403 | Nakayama | Aug 1998 | A |
5790973 | Blaker et al. | Aug 1998 | A |
5793308 | Rosinski et al. | Aug 1998 | A |
5793420 | Schmidt | Aug 1998 | A |
5796094 | Schofield et al. | Aug 1998 | A |
5837994 | Stam et al. | Nov 1998 | A |
5844505 | Van Ryzin | Dec 1998 | A |
5844682 | Kiyomoto et al. | Dec 1998 | A |
5845000 | Breed et al. | Dec 1998 | A |
5848802 | Breed et al. | Dec 1998 | A |
5850176 | Kinoshita et al. | Dec 1998 | A |
5850254 | Takano et al. | Dec 1998 | A |
5867591 | Onda | Feb 1999 | A |
5877707 | Kowalick | Mar 1999 | A |
5877897 | Schofield et al. | Mar 1999 | A |
5878357 | Sivashankar et al. | Mar 1999 | A |
5878370 | Olson | Mar 1999 | A |
5883739 | Ashihara et al. | Mar 1999 | A |
5884212 | Lion | Mar 1999 | A |
5890021 | Onoda | Mar 1999 | A |
5896085 | Mori et al. | Apr 1999 | A |
5899956 | Chan | May 1999 | A |
5915800 | Hiwatashi et al. | Jun 1999 | A |
5923027 | Stam et al. | Jul 1999 | A |
5924212 | Domanski | Jul 1999 | A |
5949331 | Schofield et al. | Sep 1999 | A |
5959555 | Furuta | Sep 1999 | A |
5963247 | Banitt | Oct 1999 | A |
5990469 | Bechtel et al. | Nov 1999 | A |
5990649 | Nagao et al. | Nov 1999 | A |
6020704 | Buschur | Feb 2000 | A |
6049171 | Stam et al. | Apr 2000 | A |
6084519 | Coulling et al. | Jul 2000 | A |
6097024 | Stam et al. | Aug 2000 | A |
6100799 | Fenk | Aug 2000 | A |
6144022 | Tenenbaum et al. | Nov 2000 | A |
6175300 | Kendrick | Jan 2001 | B1 |
6178034 | Allemand et al. | Jan 2001 | B1 |
6223114 | Boros et al. | Apr 2001 | B1 |
6227689 | Miller | May 2001 | B1 |
6266082 | Yonezawa et al. | Jul 2001 | B1 |
6266442 | Laumeyer et al. | Jul 2001 | B1 |
6285393 | Shimoura et al. | Sep 2001 | B1 |
6317057 | Lee | Nov 2001 | B1 |
6320282 | Caldwell | Nov 2001 | B1 |
6333759 | Mazzilli | Dec 2001 | B1 |
6370329 | Teuchert | Apr 2002 | B1 |
6392315 | Jones et al. | May 2002 | B1 |
6430303 | Naoi et al. | Aug 2002 | B1 |
6442465 | Breed et al. | Aug 2002 | B2 |
6523912 | Bond, III | Feb 2003 | B1 |
6547133 | Devries, Jr. et al. | Apr 2003 | B1 |
6553130 | Lemelson et al. | Apr 2003 | B1 |
6574033 | Chui et al. | Jun 2003 | B1 |
6589625 | Kothari et al. | Jul 2003 | B1 |
6594583 | Ogura et al. | Jul 2003 | B2 |
6611610 | Stam et al. | Aug 2003 | B1 |
6636258 | Strumolo | Oct 2003 | B2 |
6690268 | Schofield et al. | Feb 2004 | B2 |
6700605 | Toyoda et al. | Mar 2004 | B1 |
6704621 | Stein et al. | Mar 2004 | B1 |
6735506 | Breed et al. | May 2004 | B2 |
6744353 | Sjonell | Jun 2004 | B2 |
6762867 | Lippert et al. | Jul 2004 | B2 |
6795221 | Urey | Sep 2004 | B1 |
6819231 | Berberich et al. | Nov 2004 | B2 |
6823241 | Shirato et al. | Nov 2004 | B2 |
6879890 | Matsumoto et al. | Apr 2005 | B2 |
6889161 | Winner et al. | May 2005 | B2 |
6909753 | Meehan et al. | Jun 2005 | B2 |
6975775 | Rykowski et al. | Dec 2005 | B2 |
6989736 | Berberich et al. | Jan 2006 | B2 |
7038577 | Pawlicki et al. | May 2006 | B2 |
7062300 | Kim | Jun 2006 | B1 |
7065432 | Moisel et al. | Jun 2006 | B2 |
7079017 | Lang et al. | Jul 2006 | B2 |
7085637 | Breed et al. | Aug 2006 | B2 |
7092548 | Aumeyer et al. | Aug 2006 | B2 |
7111968 | Bauer et al. | Sep 2006 | B2 |
7116246 | Winter et al. | Oct 2006 | B2 |
7136753 | Samukawa et al. | Nov 2006 | B2 |
7145519 | Takahashi et al. | Dec 2006 | B2 |
7149613 | Stam et al. | Dec 2006 | B2 |
7161616 | Okamoto et al. | Jan 2007 | B1 |
7202776 | Breed | Apr 2007 | B2 |
7227611 | Hull et al. | Jun 2007 | B2 |
7365769 | Mager | Apr 2008 | B1 |
7460951 | Altan | Dec 2008 | B2 |
7526103 | Schofield et al. | Apr 2009 | B2 |
7592928 | Chinomi et al. | Sep 2009 | B2 |
7639149 | Katoh | Dec 2009 | B2 |
7681960 | Wanke et al. | Mar 2010 | B2 |
7720580 | Higgins-Luthman | May 2010 | B2 |
7724962 | Zhu et al. | May 2010 | B2 |
7952490 | Fechner et al. | May 2011 | B2 |
8027029 | Lu et al. | Sep 2011 | B2 |
8340866 | Hanzawa et al. | Dec 2012 | B2 |
8788176 | Yopp | Jul 2014 | B1 |
8849495 | Chundrik, Jr. et al. | Sep 2014 | B2 |
8935088 | Matsubara | Jan 2015 | B2 |
9008369 | Schofield et al. | Apr 2015 | B2 |
9082239 | Ricci | Jul 2015 | B2 |
9090234 | Johnson et al. | Jul 2015 | B2 |
9092986 | Salomonsson et al. | Jul 2015 | B2 |
9176924 | Ricci | Nov 2015 | B2 |
9205864 | Matsubara | Dec 2015 | B2 |
9384609 | Ricci | Jul 2016 | B2 |
9466161 | Ricci | Oct 2016 | B2 |
9524597 | Ricci | Dec 2016 | B2 |
9545930 | Ricci | Jan 2017 | B2 |
9751534 | Fung et al. | Sep 2017 | B2 |
9862380 | Minoiu Enache | Jan 2018 | B2 |
9925980 | Edo Ros | Mar 2018 | B2 |
10023161 | Johnson | Jul 2018 | B2 |
10144419 | Viehmann | Dec 2018 | B2 |
10889293 | Viehmann | Jan 2021 | B2 |
11618442 | Viehmann | Apr 2023 | B2 |
20020113873 | Williams | Aug 2002 | A1 |
20020118862 | Sugimoto et al. | Aug 2002 | A1 |
20030137586 | Lewellen | Jul 2003 | A1 |
20030222982 | Hamdan et al. | Dec 2003 | A1 |
20040022416 | Lemelson et al. | Feb 2004 | A1 |
20040114381 | Salmeen et al. | Jun 2004 | A1 |
20060018511 | Stam et al. | Jan 2006 | A1 |
20060018512 | Stam et al. | Jan 2006 | A1 |
20060091813 | Stam et al. | May 2006 | A1 |
20060103727 | Tseng | May 2006 | A1 |
20060164221 | Jensen | Jul 2006 | A1 |
20060250501 | Wildmann et al. | Nov 2006 | A1 |
20060255920 | Maeda et al. | Nov 2006 | A1 |
20060290479 | Akatsuka et al. | Dec 2006 | A1 |
20070104476 | Yasutomi et al. | May 2007 | A1 |
20080133136 | Breed et al. | Jun 2008 | A1 |
20080150786 | Breed | Jun 2008 | A1 |
20080154629 | Breed et al. | Jun 2008 | A1 |
20080243389 | Inoue et al. | Oct 2008 | A1 |
20090093938 | Isaji et al. | Apr 2009 | A1 |
20090113509 | Tseng et al. | Apr 2009 | A1 |
20090171559 | Lehtiniemi et al. | Jul 2009 | A1 |
20090177347 | Breuer et al. | Jul 2009 | A1 |
20090228174 | Takagi et al. | Sep 2009 | A1 |
20090244361 | Gebauer et al. | Oct 2009 | A1 |
20090265069 | Desbrunes | Oct 2009 | A1 |
20100020170 | Higgins-Luthman et al. | Jan 2010 | A1 |
20100228437 | Hanzawa et al. | Sep 2010 | A1 |
20110115615 | Luo et al. | May 2011 | A1 |
20110157309 | Bennett et al. | Jun 2011 | A1 |
20110169625 | James | Jul 2011 | A1 |
20110190972 | Timmons | Aug 2011 | A1 |
20110224978 | Sawada | Sep 2011 | A1 |
20110282516 | Lich et al. | Nov 2011 | A1 |
20120035846 | Sakamoto et al. | Feb 2012 | A1 |
20120044066 | Mauderer et al. | Feb 2012 | A1 |
20120062743 | Lynam et al. | Mar 2012 | A1 |
20120083960 | Zhu et al. | Apr 2012 | A1 |
20120218412 | Dellantoni et al. | Aug 2012 | A1 |
20120262340 | Hassan et al. | Oct 2012 | A1 |
20130002873 | Hess | Jan 2013 | A1 |
20130116859 | Ihlenburg et al. | May 2013 | A1 |
20130124052 | Hahne | May 2013 | A1 |
20130129150 | Saito | May 2013 | A1 |
20130131905 | Green et al. | May 2013 | A1 |
20130131907 | Green et al. | May 2013 | A1 |
20130131918 | Hahne | May 2013 | A1 |
20130141578 | Chundrlik, Jr. et al. | Jun 2013 | A1 |
20130222593 | Byrne et al. | Aug 2013 | A1 |
20130278769 | Nix et al. | Oct 2013 | A1 |
20130314503 | Nix et al. | Nov 2013 | A1 |
20140044310 | Schamp et al. | Feb 2014 | A1 |
20140067206 | Pflug | Mar 2014 | A1 |
20140156157 | Johnson et al. | Jun 2014 | A1 |
20140222280 | Salomonsson et al. | Aug 2014 | A1 |
20140267689 | Lavoie | Sep 2014 | A1 |
20140303845 | Hartmann et al. | Oct 2014 | A1 |
20140313335 | Koravadi | Oct 2014 | A1 |
20140313339 | Diessner | Oct 2014 | A1 |
20140379233 | Chundrlik, Jr. et al. | Dec 2014 | A1 |
20150046038 | Kawamata | Feb 2015 | A1 |
20150153735 | Clarke | Jun 2015 | A1 |
20150158499 | Koravadi | Jun 2015 | A1 |
20150158527 | Hafner et al. | Jun 2015 | A1 |
20150166062 | Johnson | Jun 2015 | A1 |
20150177007 | Su | Jun 2015 | A1 |
20150197248 | Breed | Jul 2015 | A1 |
20150203109 | McClain et al. | Jul 2015 | A1 |
20150203156 | Hafner et al. | Jul 2015 | A1 |
20150284008 | Tan et al. | Oct 2015 | A1 |
20150344028 | Gieseke et al. | Dec 2015 | A1 |
20160090100 | Oyama | Mar 2016 | A1 |
20160121906 | Matsuno et al. | May 2016 | A1 |
20160133130 | Grimm et al. | May 2016 | A1 |
20160133131 | Grimm et al. | May 2016 | A1 |
20160339959 | Lee | Nov 2016 | A1 |
20160362050 | Lee et al. | Dec 2016 | A1 |
20160362118 | Mollicone et al. | Dec 2016 | A1 |
20160375766 | Konet et al. | Dec 2016 | A1 |
20160375767 | Konet et al. | Dec 2016 | A1 |
20160375768 | Konet et al. | Dec 2016 | A1 |
20170036673 | Lee | Feb 2017 | A1 |
20170060234 | Sung | Mar 2017 | A1 |
20170124987 | Kim et al. | May 2017 | A1 |
20170358155 | Krapf et al. | Dec 2017 | A1 |
20180299887 | Cashler et al. | Oct 2018 | A1 |
20180364700 | Liu et al. | Dec 2018 | A1 |
20180364701 | Liu et al. | Dec 2018 | A1 |
20180364702 | Liu et al. | Dec 2018 | A1 |
20180364703 | Liu et al. | Dec 2018 | A1 |
20180364704 | Liu et al. | Dec 2018 | A1 |
20180365908 | Liu et al. | Dec 2018 | A1 |
20190082377 | Silver | Mar 2019 | A1 |
20190302781 | Tao et al. | Oct 2019 | A1 |
20190315345 | Newman | Oct 2019 | A1 |
20190361439 | Zeng et al. | Nov 2019 | A1 |
20190361454 | Zeng et al. | Nov 2019 | A1 |
20190361456 | Zeng et al. | Nov 2019 | A1 |
20190367021 | Zhao et al. | Dec 2019 | A1 |
20190369623 | Sadakiyo et al. | Dec 2019 | A1 |
20200180612 | Finelt et al. | Jun 2020 | A1 |
20210035442 | Baig et al. | Feb 2021 | A1 |
20210197858 | Zhang et al. | Jul 2021 | A1 |
20210221389 | Long et al. | Jul 2021 | A1 |
20220063678 | McPeek-Bechtold et al. | Mar 2022 | A1 |
20220119011 | Li et al. | Apr 2022 | A1 |
20220176987 | Russell et al. | Jun 2022 | A1 |
20220185313 | Wang et al. | Jun 2022 | A1 |
20220222597 | Neese | Jul 2022 | A1 |
20220230537 | Whyte et al. | Jul 2022 | A1 |
20230140569 | Foster | May 2023 | A1 |
20230294682 | Kim | Sep 2023 | A1 |
20230311656 | Yasui | Oct 2023 | A1 |
20230311864 | Iwase | Oct 2023 | A1 |
20230311891 | Sekijima | Oct 2023 | A1 |
20230311918 | Yasui | Oct 2023 | A1 |
20230347903 | Katz | Nov 2023 | A1 |
20240017738 | Vozar | Jan 2024 | A1 |
20240078363 | Nassar | Mar 2024 | A1 |
Number | Date | Country |
---|---|---|
2011244950 | Jun 2012 | AU |
114763167 | Jul 2022 | CN |
102009057836 | Jun 2011 | DE |
102016119265 | Apr 2017 | DE |
112017003968 | May 2019 | DE |
4137371 | Feb 2023 | EP |
2524393 | Sep 2015 | GB |
20230091210 | Jun 2023 | KR |
102612925 | Dec 2023 | KR |
WO-2015134840 | Sep 2015 | WO |
2016130719 | Aug 2016 | WO |
WO-2017154070 | Sep 2017 | WO |
WO-2021023463 | Feb 2021 | WO |
2022009900 | Jan 2022 | WO |
2022159173 | Jul 2022 | WO |
Number | Date | Country | |
---|---|---|---|
20230256963 A1 | Aug 2023 | US |
Number | Date | Country | |
---|---|---|---|
62398091 | Sep 2016 | US | |
62258722 | Nov 2015 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17248116 | Jan 2021 | US |
Child | 18194707 | US | |
Parent | 16203976 | Nov 2018 | US |
Child | 17248116 | US | |
Parent | 15358166 | Nov 2016 | US |
Child | 16203976 | US |