The present invention relates generally to a vehicle vision system for a vehicle and, more particularly, to a vehicle vision system that utilizes one or more cameras at a vehicle.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.
A vehicular vision system includes a front camera disposed at a windshield of a vehicle equipped with the vehicular vision system that views at least forward of the equipped vehicle through the windshield and captures image data. The system also includes a side camera disposed at a side mirror of the equipped vehicle that views at least forward and sideward of the equipped vehicle and captures image data. The front camera includes a CMOS imaging array with at least one million photosensors arranged in rows and columns. The side camera includes a CMOS imaging array with at least one million photosensors arranged in rows and columns. The system includes an electronic control unit (ECU) with electronic circuitry and associated software. The electronic circuitry of the ECU includes an image processor for processing image data captured by the front camera and image data captured by the side camera. The vehicular vision system, via processing at the ECU of image data captured by the front camera, determines a traffic lane along which the equipped vehicle is traveling. The vehicular vision system detects a leading vehicle traveling in front of the equipped vehicle in the traffic lane along which the equipped vehicle is traveling. Responsive to determination of an intent of the equipped vehicle to change from the traffic lane along which the equipped vehicle is traveling to an adjacent traffic lane that is adjacent to the traffic lane along which the equipped vehicle is traveling, the vehicular vision system, via processing at the ECU of image data captured by the side camera, detects an oncoming vehicle traveling in the adjacent traffic lane.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle vision system and/or driver or driving assist system and/or object detection system and/or alert system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and provide an output to a display device for displaying images representative of the captured image data. Optionally, the vision system may provide a display, such as a rearview display or a top down or bird's eye or surround view display or the like.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes at least one exterior viewing imaging sensor or camera, such as a rearward viewing imaging sensor or camera 14a (and the system may optionally include multiple exterior viewing imaging sensors or cameras, such as a forward viewing camera 14b at the front (or at the windshield) of the vehicle, and a sideward/rearward viewing camera 14c, 14d at respective sides of the vehicle), which captures images exterior of the vehicle, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera.
An overtaking maneuver is commonly defined as a traffic maneuver in which a vehicle passes another vehicle traveling in the same direction. On a single-lane road (i.e., a road with one traffic lane in each direction of travel), an overtaking maneuver involves temporarily driving in the lane of oncoming traffic, which increases the risk of the maneuver.
Implementations herein include an overtaking anti-collision assist system that assists in avoiding collisions with oncoming traffic during overtaking maneuvers.
Based on sensor data captured by an imaging or non-imaging sensor (e.g., a camera, a radar sensor, etc.), the ADAS may provide a message to the driver of the equipped vehicle and a crash avoidance routine (active or passive) may be started. For example, a passive crash avoidance routine may include generating a visual warning (e.g., on a display disposed within the vehicle, on a head-up display (HUD), etc.), generating an acoustic or audible warning, generating a haptic warning (e.g., vibration of the steering wheel or seat), etc.
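As an illustration only, the escalation of such passive warnings might be staged as in the following sketch (the ThreatLevel names and the mapping below are hypothetical assumptions, not part of the system described above):

```python
from enum import IntEnum

class ThreatLevel(IntEnum):
    NONE = 0
    ADVISORY = 1   # visual alert only
    WARNING = 2    # add an audible alert
    CRITICAL = 3   # add a haptic alert

def passive_warnings(threat: ThreatLevel) -> list[str]:
    """Map a threat level to the passive alerts described above:
    visual (display/HUD), acoustic, and haptic (wheel/seat vibration)."""
    actions = []
    if threat >= ThreatLevel.ADVISORY:
        actions.append("show visual warning on in-vehicle display or HUD")
    if threat >= ThreatLevel.WARNING:
        actions.append("sound acoustic warning")
    if threat >= ThreatLevel.CRITICAL:
        actions.append("vibrate steering wheel and/or seat")
    return actions

print(passive_warnings(ThreatLevel.WARNING))
# ['show visual warning on in-vehicle display or HUD', 'sound acoustic warning']
```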
Additionally or alternatively, the system may include an active crash avoidance routine that, for example, disables or resists turning of the steering wheel in the direction of oncoming traffic (to help stop the overtaking maneuver from beginning), slows the vehicle or limits a speed or acceleration of the vehicle, prefills a brake (e.g., to prepare for emergency braking), tensions a seat belt of one or more occupants of the vehicle, prepares airbags, etc.
Thus, the system can provide a passive warning and/or actively attempt to reduce the likelihood of a collision or mitigate the consequences of a collision by controlling one or more systems of the vehicle (e.g., braking, steering, acceleration, seat belts, air bags, etc.).
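A matching sketch of the active side (the thresholds and function name are assumptions for illustration): each intervention from the list above is gated on an escalating collision-risk estimate, so inexpensive preparations such as brake prefill and belt tensioning can begin before any hard braking or steering resistance.

```python
def active_interventions(collision_risk: float) -> list[str]:
    """Select active crash-avoidance actions for a risk value in [0, 1].
    The 0.3/0.5/0.7 thresholds are illustrative assumptions."""
    actions = []
    if collision_risk > 0.3:
        actions.append("resist or disable steering toward the oncoming lane")
    if collision_risk > 0.5:
        actions.append("prefill brakes for emergency braking")
        actions.append("pretension seat belts")
        actions.append("prepare airbags")
    if collision_risk > 0.7:
        actions.append("slow the vehicle / limit acceleration")
    return actions

print(active_interventions(0.6))
```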
The system may be used for left-hand and right-hand traffic (e.g., configurable via software). The forward sensing/imaging sensor can be mounted in different locations. For example, the sensor(s) may be mounted at or near the side mirrors.
Mounting the sensors toward a side of the vehicle (e.g., at the side mirrors, at a corner of a bumper, etc.) may increase the field of view of the oncoming traffic lane, as a laterally offset sensor can see past a leading vehicle and farther into the oncoming lane.
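The field-of-view benefit can be made concrete with a simple occlusion model (illustrative geometry and numbers, not taken from the description): by similar triangles, a sensor offset laterally toward the lane divider sees past the leading vehicle much farther down the oncoming lane.

```python
def max_visible_range(d_lead: float, x_corner: float,
                      x_oncoming: float, x_sensor: float) -> float:
    """Farthest point (m) in the oncoming lane visible past a leading
    vehicle. Lateral positions are measured from the equipped vehicle's
    centerline, positive toward the oncoming lane. d_lead is the distance
    to the leading vehicle's occluding rear corner."""
    if x_sensor >= x_corner:
        return float("inf")  # the sensor clears the occluding corner
    return d_lead * (x_oncoming - x_sensor) / (x_corner - x_sensor)

# Leading vehicle 30 m ahead, its near corner 0.9 m off-center,
# oncoming traffic centered 3.5 m off-center (one lane over):
print(max_visible_range(30.0, 0.9, 3.5, 0.0))  # centrally mounted: ~117 m
print(max_visible_range(30.0, 0.9, 3.5, 0.8))  # near the side mirror: 810 m
```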
The system may be used when traffic ahead of the equipped vehicle is slower than the equipped vehicle and the equipped vehicle desires to overtake.
During operation, the system may detect at least two targets. For example, the system detects a first object moving in the same direction as the equipped vehicle (i.e., the leading vehicle) and a second object traveling in the opposite direction (i.e., the oncoming vehicle). The sensor(s) (e.g., one or more cameras, one or more radar sensors, one or more LIDAR sensors, etc.) may work independently of other ADAS, such as lane-keep assist systems, and operate regardless of current lane markings and turn signals (i.e., without the driver indicating they are preparing to overtake).
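One plausible way to separate the two targets (the Track type and the 1 m/s threshold below are assumptions) is by each track's speed over ground along the equipped vehicle's direction of travel, recovered from the measured closing rate:

```python
from dataclasses import dataclass

@dataclass
class Track:
    distance_m: float   # longitudinal distance ahead of the equipped vehicle
    closing_mps: float  # rate at which the gap shrinks (negative = opening)
    ego_mps: float      # equipped vehicle's own speed

def ground_speed(t: Track) -> float:
    """Target speed along the equipped vehicle's direction of travel.
    Same-direction target: closing rate = ego speed - target speed.
    Oncoming target: the speeds add, so this comes out negative."""
    return t.ego_mps - t.closing_mps

def classify(t: Track) -> str:
    v = ground_speed(t)
    if v > 1.0:
        return "leading"    # first target: same direction
    if v < -1.0:
        return "oncoming"   # second target: opposite direction
    return "stationary"

print(classify(Track(40.0, 5.0, 25.0)))    # leading: 20 m/s, same direction
print(classify(Track(150.0, 50.0, 25.0)))  # oncoming: 25 m/s toward us
```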
The system may use multiple sensors. For example, the system may use a camera mounted at or behind a windshield of the vehicle in conjunction with a camera mounted at a side mirror of the vehicle and/or a radar sensor disposed at the front and/or at a front corner and/or at a side mirror of the vehicle. The system may use any form of sensor fusion to evaluate the sensor data from multiple sensors. The passive warning and/or the active interventions may be user-configurable. For example, the user may enable passive warnings (e.g., visual and/or audible alerts) and disable active interventions (e.g., steering and/or brake control). The system may detect the current traffic lane (e.g., lane markers) using a first sensor (e.g., a windshield-mounted camera), may detect a leading vehicle in the current traffic lane via the first sensor and/or via a forward-sensing radar sensor, and may detect an oncoming vehicle in an adjacent traffic lane using a second sensor (e.g., a side-mounted camera, such as a camera mounted at a side mirror of the vehicle, or a sideward and forward sensing radar sensor, such as a radar sensor mounted at a front corner of the vehicle or at a side mirror of the equipped vehicle).
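A minimal sketch of that division of labor (the Scene type and dictionary keys are invented for illustration; a production system would use a full sensor-fusion framework): the forward sensor contributes the lane and the leading vehicle, the side sensor contributes oncoming traffic, and the result is one fused picture for the overtaking-assist logic.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Scene:
    lane_found: bool = False
    leading_vehicle: Optional[dict] = None
    oncoming_vehicles: list = field(default_factory=list)

def fuse(front_sensor_out: dict, side_sensor_out: dict) -> Scene:
    """Combine per-sensor outputs: windshield camera and/or forward radar
    supply the lane and leading vehicle; mirror camera and/or corner radar
    supply oncoming detections."""
    return Scene(
        lane_found=front_sensor_out.get("lane_found", False),
        leading_vehicle=front_sensor_out.get("leading_vehicle"),
        oncoming_vehicles=side_sensor_out.get("oncoming", []),
    )

scene = fuse({"lane_found": True, "leading_vehicle": {"gap_m": 40.0}},
             {"oncoming": [{"gap_m": 200.0, "speed_mps": 25.0}]})
print(scene)
```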
The system may always be engaged. That is, the passive warning and/or active interventions may always be active whenever the system determines an oncoming vehicle poses a collision risk during any potential overtake maneuver. In other examples, the system predicts or determines an upcoming overtake maneuver and only engages the passive warning and/or active interventions when the likelihood of an imminent overtake maneuver satisfies a threshold. For example, the system may determine an overtake maneuver based on an increase in speed of the equipped vehicle, a turn signal, lateral movement of the equipped vehicle, etc. The system may enable the passive warnings of the system whenever an oncoming vehicle is detected and enable active interventions only when the system determines the likelihood of a collision exceeds a threshold value.
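The staged engagement described above reduces to two gates, sketched below with assumed weights and thresholds (none of the numeric values come from the description): an intent score built from the named cues (speed increase, turn signal, lateral movement) arms the passive warnings, and a time-to-collision test additionally enables active interventions.

```python
def overtake_intent(accel_mps2: float, turn_signal_on: bool,
                    lateral_speed_mps: float) -> float:
    """Heuristic intent score in [0, 1]; weights are illustrative."""
    score = min(max(accel_mps2, 0.0) / 3.0, 1.0) * 0.4       # speeding up
    score += 0.3 if turn_signal_on else 0.0                  # signaling
    score += min(abs(lateral_speed_mps) / 1.0, 1.0) * 0.3    # drifting over
    return score

def time_to_collision(gap_m: float, ego_mps: float, oncoming_mps: float) -> float:
    """Seconds until the vehicles meet; opposite directions, so speeds add."""
    closing = ego_mps + oncoming_mps
    return gap_m / closing if closing > 0 else float("inf")

INTENT_THRESHOLD = 0.5  # arm passive warnings (assumed value)
TTC_THRESHOLD_S = 6.0   # enable active interventions (assumed value)

def engage(intent: float, ttc_s: float) -> tuple[bool, bool]:
    passive = intent >= INTENT_THRESHOLD
    active = passive and ttc_s <= TTC_THRESHOLD_S
    return passive, active

# 200 m gap, both vehicles near 25 m/s: TTC = 4 s, so both gates open.
print(engage(overtake_intent(2.0, True, 0.6), time_to_collision(200, 25, 25)))
```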
The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in U.S. Pat. Nos. 10,099,614 and/or 10,071,687, which are hereby incorporated herein by reference in their entireties.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an image processing chip selected from the EYEQ family of image processing chips available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. The imaging array may comprise a CMOS imaging array having at least 300,000 photosensor elements or pixels, preferably at least 500,000 photosensor elements or pixels and more preferably at least one million photosensor elements or pixels or at least three million photosensor elements or pixels or at least five million photosensor elements or pixels arranged in rows and columns. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 9,233,641; 9,146,898; 9,174,574; 9,090,234; 9,077,098; 8,818,042; 8,886,401; 9,077,962; 9,068,390; 9,140,789; 9,092,986; 9,205,776; 8,917,169; 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or U.S. Publication Nos. US-2014-0340510; US-2014-0313339; US-2014-0347486; US-2014-0320658; US-2014-0336876; US-2014-0307095; US-2014-0327774; US-2014-0327772; US-2014-0320636; US-2014-0293057; US-2014-0309884; US-2014-0226012; US-2014-0293042; US-2014-0218535; US-2014-0247354; US-2014-0247355; US-2014-0247352; US-2014-0232869; US-2014-0211009; US-2014-0160276; US-2014-0168437; US-2014-0168415; US-2014-0160291; US-2014-0152825; US-2014-0139676; US-2014-0138140; US-2014-0104426; US-2014-0098229; US-2014-0085472; US-2014-0067206; US-2014-0049646; US-2014-0052340; US-2014-0025240; US-2014-0028852; US-2014-0005907; US-2013-0314503; US-2013-0298866; US-2013-0222593; US-2013-0300869; US-2013-0278769; US-2013-0258077; US-2013-0242099; US-2013-0215271; US-2013-0141578 and/or US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in U.S. Pat. Nos. 10,071,687; 9,900,490; 9,126,525 and/or 9,036,026, which are hereby incorporated herein by reference in their entireties.
The system may utilize other sensors, such as radar sensors or imaging radar sensors or lidar sensors or the like, to detect presence of and/or range to objects and/or other vehicles and/or pedestrians. The sensing system may utilize aspects of the systems described in U.S. Pat. Nos. 10,866,306; 9,954,955; 9,869,762; 9,753,121; 9,689,967; 9,599,702; 9,575,160; 9,146,898; 9,036,026; 8,027,029; 8,013,780; 7,408,627; 7,405,812; 7,379,163; 7,379,100; 7,375,803; 7,352,454; 7,340,077; 7,321,111; 7,310,431; 7,283,213; 7,212,663; 7,203,356; 7,176,438; 7,157,685; 7,053,357; 6,919,549; 6,906,793; 6,876,775; 6,710,770; 6,690,354; 6,678,039; 6,674,895 and/or 6,587,186, and/or U.S. Publication Nos. US-2019-0339382; US-2018-0231635; US-2018-0045812; US-2018-0015875; US-2017-0356994; US-2017-0315231; US-2017-0276788; US-2017-0254873; US-2017-0222311 and/or US-2010-0245066, which are hereby incorporated herein by reference in their entireties.
The radar sensors of the sensing system each comprise a plurality of transmitters that transmit radio signals via a plurality of antennas and a plurality of receivers that receive radio signals via the plurality of antennas, with the received radio signals being transmitted radio signals that are reflected from an object present in the field of sensing of the respective radar sensor. The system includes an ECU or control that includes a data processor for processing sensor data captured by the radar sensors. The ECU or sensing system may be part of a driving assist system of the vehicle, with the driving assist system controlling at least one function or feature of the vehicle (such as to provide autonomous driving control of the vehicle) responsive to processing of the data captured by the radar sensors.
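Because detection here rests on the delay between a transmitted radio signal and its reflection, the basic range relation is worth a one-line worked example (standard pulse-radar physics, not specific to this system): range equals half the round-trip time multiplied by the speed of light.

```python
C_MPS = 299_792_458.0  # speed of light in m/s

def range_from_delay(round_trip_s: float) -> float:
    """Target range from a measured round-trip delay; the factor of two
    accounts for the signal traveling out and back."""
    return C_MPS * round_trip_s / 2.0

print(range_from_delay(1.0e-6))  # ~150 m for a 1 microsecond round trip
```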
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application claims the filing benefits of U.S. provisional application Ser. No. 63/363,651, filed Apr. 27, 2022, which is hereby incorporated herein by reference in its entirety.
U.S. Patent Documents

| Number | Name | Date | Kind |
|---|---|---|---|
| 5550677 | Schofield et al. | Aug 1996 | A |
| 5670935 | Schofield et al. | Sep 1997 | A |
| 5760962 | Schofield et al. | Jun 1998 | A |
| 5786772 | Schofield et al. | Jul 1998 | A |
| 5796094 | Schofield et al. | Aug 1998 | A |
| 5877897 | Schofield et al. | Mar 1999 | A |
| 5929786 | Schofield et al. | Jul 1999 | A |
| 5949331 | Schofield et al. | Sep 1999 | A |
| 6201642 | Bos | Mar 2001 | B1 |
| 6222447 | Schofield et al. | Apr 2001 | B1 |
| 6302545 | Schofield et al. | Oct 2001 | B1 |
| 6396397 | Bos et al. | May 2002 | B1 |
| 6498620 | Schofield et al. | Dec 2002 | B2 |
| 6523964 | Schofield et al. | Feb 2003 | B2 |
| 6587186 | Bamji et al. | Jul 2003 | B2 |
| 6611202 | Schofield et al. | Aug 2003 | B2 |
| 6636258 | Strumolo | Oct 2003 | B2 |
| 6674895 | Rafii et al. | Jan 2004 | B2 |
| 6678039 | Charbon | Jan 2004 | B2 |
| 6690268 | Schofield et al. | Feb 2004 | B2 |
| 6690354 | Sze | Feb 2004 | B2 |
| 6693517 | McCarthy et al. | Feb 2004 | B2 |
| 6710770 | Tomasi et al. | Mar 2004 | B2 |
| 6717610 | Bos et al. | Apr 2004 | B1 |
| 6757109 | Bos | Jun 2004 | B2 |
| 6802617 | Schofield et al. | Oct 2004 | B2 |
| 6806452 | Bos et al. | Oct 2004 | B2 |
| 6822563 | Bos et al. | Nov 2004 | B2 |
| 6842687 | Winner et al. | Jan 2005 | B2 |
| 6876775 | Torunoglu | Apr 2005 | B2 |
| 6882287 | Schofield | Apr 2005 | B2 |
| 6891563 | Schofield et al. | May 2005 | B2 |
| 6906793 | Bamji et al. | Jun 2005 | B2 |
| 6919549 | Bamji et al. | Jul 2005 | B2 |
| 6946978 | Schofield | Sep 2005 | B2 |
| 7005974 | McMahon et al. | Feb 2006 | B2 |
| 7038577 | Pawlicki et al. | May 2006 | B2 |
| 7053357 | Schwarte | May 2006 | B2 |
| 7145519 | Takahashi et al. | Dec 2006 | B2 |
| 7157685 | Bamji et al. | Jan 2007 | B2 |
| 7161616 | Okamoto et al. | Jan 2007 | B1 |
| 7176438 | Bamji et al. | Feb 2007 | B2 |
| 7203356 | Gokturk et al. | Apr 2007 | B2 |
| 7205904 | Schofield | Apr 2007 | B2 |
| 7212663 | Tomasi | May 2007 | B2 |
| 7230640 | Regensburger et al. | Jun 2007 | B2 |
| 7248283 | Takagi et al. | Jul 2007 | B2 |
| 7283213 | O'Connor et al. | Oct 2007 | B2 |
| 7295229 | Kumata et al. | Nov 2007 | B2 |
| 7301466 | Asai | Nov 2007 | B2 |
| 7310431 | Gokturk et al. | Dec 2007 | B2 |
| 7321111 | Bamji et al. | Jan 2008 | B2 |
| 7340077 | Gokturk et al. | Mar 2008 | B2 |
| 7352454 | Bamji et al. | Apr 2008 | B2 |
| 7375803 | Bamji | May 2008 | B1 |
| 7379100 | Gokturk et al. | May 2008 | B2 |
| 7379163 | Rafii et al. | May 2008 | B2 |
| 7405812 | Bamji | Jul 2008 | B1 |
| 7408627 | Bamji et al. | Aug 2008 | B2 |
| 7580795 | McCarthy et al. | Aug 2009 | B2 |
| 7592928 | Chinomi et al. | Sep 2009 | B2 |
| 7720580 | Higgins-Luthman | May 2010 | B2 |
| 7855755 | Weller et al. | Dec 2010 | B2 |
| 7859565 | Schofield et al. | Dec 2010 | B2 |
| 7881496 | Camilleri et al. | Feb 2011 | B2 |
| 8013780 | Lynam | Sep 2011 | B2 |
| 8027029 | Lu et al. | Sep 2011 | B2 |
| 8694224 | Chundrlik, Jr. et al. | Apr 2014 | B2 |
| 8818042 | Schofield et al. | Aug 2014 | B2 |
| 8886401 | Schofield et al. | Nov 2014 | B2 |
| 8917169 | Schofield et al. | Dec 2014 | B2 |
| 9036026 | Dellantoni et al. | May 2015 | B2 |
| 9068390 | Ihlenburg et al. | Jun 2015 | B2 |
| 9077098 | Latunski | Jul 2015 | B2 |
| 9077962 | Shi et al. | Jul 2015 | B2 |
| 9090234 | Johnson et al. | Jul 2015 | B2 |
| 9092986 | Salomonsson et al. | Jul 2015 | B2 |
| 9126525 | Lynam et al. | Sep 2015 | B2 |
| 9140789 | Lynam | Sep 2015 | B2 |
| 9146898 | Ihlenburg et al. | Sep 2015 | B2 |
| 9174574 | Salomonsson | Nov 2015 | B2 |
| 9205776 | Turk | Dec 2015 | B2 |
| 9233641 | Sesti et al. | Jan 2016 | B2 |
| 9575160 | Davis et al. | Feb 2017 | B1 |
| 9599702 | Bordes et al. | Mar 2017 | B1 |
| 9682712 | Kubo | Jun 2017 | B2 |
| 9689967 | Stark et al. | Jun 2017 | B1 |
| 9753121 | Davis et al. | Sep 2017 | B1 |
| 9869762 | Alland et al. | Jan 2018 | B1 |
| 9900490 | Ihlenburg et al. | Feb 2018 | B2 |
| 9954955 | Davis et al. | Apr 2018 | B2 |
| 10071687 | Ihlenburg et al. | Sep 2018 | B2 |
| 10099614 | Diessner | Oct 2018 | B2 |
| 10115314 | Boegel | Oct 2018 | B2 |
| 10214157 | Achenbach et al. | Feb 2019 | B2 |
| 10222224 | Johnson et al. | Mar 2019 | B2 |
| 10406981 | Chundrlik, Jr. et al. | Sep 2019 | B2 |
| 10457209 | Byrne et al. | Oct 2019 | B2 |
| 10787125 | Achenbach et al. | Sep 2020 | B2 |
| 10812992 | Tran et al. | Oct 2020 | B1 |
| 10866306 | Maher et al. | Dec 2020 | B2 |
| 11017665 | Roy | May 2021 | B1 |
| 11454719 | Hess et al. | Sep 2022 | B2 |
| 11763410 | Roy | Sep 2023 | B1 |
| 12030501 | Solar et al. | Jul 2024 | B2 |
| 12100225 | Nix et al. | Sep 2024 | B2 |
| 20050179527 | Schofield | Aug 2005 | A1 |
| 20080192984 | Higuchi et al. | Aug 2008 | A1 |
| 20100245066 | Sarioglu et al. | Sep 2010 | A1 |
| 20120062743 | Lynam et al. | Mar 2012 | A1 |
| 20120218412 | Dellantoni et al. | Aug 2012 | A1 |
| 20130002873 | Hess | Jan 2013 | A1 |
| 20130141578 | Chundrlik, Jr. et al. | Jun 2013 | A1 |
| 20130215271 | Lu | Aug 2013 | A1 |
| 20130222592 | Gieseke | Aug 2013 | A1 |
| 20130222593 | Byrne et al. | Aug 2013 | A1 |
| 20130242099 | Sauer et al. | Sep 2013 | A1 |
| 20130258077 | Bally et al. | Oct 2013 | A1 |
| 20130278769 | Nix et al. | Oct 2013 | A1 |
| 20130297387 | Michael | Nov 2013 | A1 |
| 20130298866 | Vogelbacher | Nov 2013 | A1 |
| 20130300869 | Lu et al. | Nov 2013 | A1 |
| 20130314503 | Nix et al. | Nov 2013 | A1 |
| 20140005907 | Bajpai | Jan 2014 | A1 |
| 20140025240 | Steigerwald et al. | Jan 2014 | A1 |
| 20140028852 | Rathi | Jan 2014 | A1 |
| 20140049646 | Nix | Feb 2014 | A1 |
| 20140052340 | Bajpai | Feb 2014 | A1 |
| 20140067206 | Pflug | Mar 2014 | A1 |
| 20140085472 | Lu et al. | Mar 2014 | A1 |
| 20140098229 | Lu et al. | Apr 2014 | A1 |
| 20140104426 | Boegel et al. | Apr 2014 | A1 |
| 20140138140 | Sigle | May 2014 | A1 |
| 20140139676 | Wierich | May 2014 | A1 |
| 20140152825 | Schaffner | Jun 2014 | A1 |
| 20140160276 | Pliefke et al. | Jun 2014 | A1 |
| 20140160291 | Schaffner | Jun 2014 | A1 |
| 20140168415 | Ihlenburg et al. | Jun 2014 | A1 |
| 20140168437 | Rother et al. | Jun 2014 | A1 |
| 20140211009 | Fursich | Jul 2014 | A1 |
| 20140218529 | Mahmoud et al. | Aug 2014 | A1 |
| 20140218535 | Ihlenburg et al. | Aug 2014 | A1 |
| 20140226012 | Achenbach | Aug 2014 | A1 |
| 20140232869 | May et al. | Aug 2014 | A1 |
| 20140247352 | Rathi et al. | Sep 2014 | A1 |
| 20140247354 | Knudsen | Sep 2014 | A1 |
| 20140247355 | Ihlenburg | Sep 2014 | A1 |
| 20140293042 | Lynam | Oct 2014 | A1 |
| 20140293057 | Wierich | Oct 2014 | A1 |
| 20140307095 | Wierich | Oct 2014 | A1 |
| 20140309884 | Wolf | Oct 2014 | A1 |
| 20140313339 | Diessner | Oct 2014 | A1 |
| 20140320636 | Bally et al. | Oct 2014 | A1 |
| 20140320658 | Pliefke | Oct 2014 | A1 |
| 20140327772 | Sahba | Nov 2014 | A1 |
| 20140327774 | Lu et al. | Nov 2014 | A1 |
| 20140336876 | Gieseke et al. | Nov 2014 | A1 |
| 20140340510 | Ihlenburg et al. | Nov 2014 | A1 |
| 20140347486 | Okouneva | Nov 2014 | A1 |
| 20140375476 | Johnson et al. | Dec 2014 | A1 |
| 20150124096 | Koravadi | May 2015 | A1 |
| 20150158499 | Koravadi | Jun 2015 | A1 |
| 20150251599 | Koravadi | Sep 2015 | A1 |
| 20150352953 | Koravadi | Dec 2015 | A1 |
| 20160036917 | Koravadi et al. | Feb 2016 | A1 |
| 20160159394 | Ryu et al. | Jun 2016 | A1 |
| 20160210853 | Koravadi | Jul 2016 | A1 |
| 20170222311 | Hess et al. | Aug 2017 | A1 |
| 20170254873 | Koravadi | Sep 2017 | A1 |
| 20170276788 | Wodrich | Sep 2017 | A1 |
| 20170315231 | Wodrich | Nov 2017 | A1 |
| 20170356994 | Wodrich et al. | Dec 2017 | A1 |
| 20180015875 | May et al. | Jan 2018 | A1 |
| 20180045812 | Hess | Feb 2018 | A1 |
| 20180173239 | Yoon et al. | Jun 2018 | A1 |
| 20180231635 | Woehlte | Aug 2018 | A1 |
| 20190329627 | Chundrlik, Jr. | Oct 2019 | A1 |
| 20190339382 | Hess et al. | Nov 2019 | A1 |
| 20200327343 | Lund et al. | Oct 2020 | A1 |
| 20210061276 | Zhang | Mar 2021 | A1 |
| 20210221390 | Slobodyanyuk et al. | Jul 2021 | A1 |
| 20210385865 | Mueck et al. | Dec 2021 | A1 |
| 20210392454 | Choi et al. | Dec 2021 | A1 |
| 20220024485 | Theverapperuma et al. | Jan 2022 | A1 |
| 20220097625 | Russell et al. | Mar 2022 | A1 |
| 20220255223 | Tran et al. | Aug 2022 | A1 |
| 20230286439 | Gali | Sep 2023 | A1 |
| 20240278736 | Lynam et al. | Aug 2024 | A1 |
| 20240359691 | Solar et al. | Oct 2024 | A1 |
| 20250014356 | Nix et al. | Jan 2025 | A1 |
Foreign Patent Documents

| Number | Date | Country |
|---|---|---|
| 201741056955 | May 2019 | IN |
Related Publications

| Number | Date | Country |
|---|---|---|
| 20230347878 A1 | Nov 2023 | US |
Provisional Applications

| Number | Date | Country |
|---|---|---|
| 63363651 | Apr 2022 | US |