This application is related to the subject matter of: U.S. patent application Ser. No. 17/305,701 filed Jun. 30, 2021 and entitled SYSTEM AND METHOD IN THE PREDICTION OF TARGET VEHICLE BEHAVIOR BASED ON IMAGE FRAME AND NORMALIZATION; U.S. patent application Ser. No. 17/305,702 filed Jul. 13, 2021 and entitled SYSTEM AND METHOD IN DATA-DRIVEN VEHICLE DYNAMIC MODELING FOR PATH-PLANNING AND CONTROL; U.S. patent application Ser. No. 17/305,703 filed Jul. 13, 2021 and entitled SYSTEM AND METHODS OF INTEGRATING VEHICLE KINEMATICS AND DYNAMICS FOR LATERAL CONTROL FEATURE AT AUTONOMOUS DRIVING; U.S. patent application Ser. No. 17/305,704 filed Jul. 13, 2021 and entitled SYSTEM AND METHOD IN VEHICLE PATH PREDICTION BASED ON FULL NONLINEAR KINEMATICS; U.S. patent application Ser. No. 17/305,706 filed Jul. 13, 2021 and entitled SYSTEM AND METHOD FOR LANE DEPARTURE WARNING WITH EGO MOTION AND VISION. The content of the above-identified patent documents is incorporated herein by reference.
This disclosure relates generally to vehicle driver assist or autonomous driving systems. More specifically, this disclosure relates to lane departure warning.
Advanced driving assist system (ADAS) features, which use automated technology to assist the vehicle operator in driving and parking, form a foundation for autonomous driving (AD). Determination of vehicle position information and/or detection of nearby objects enables features such as: collision detection and avoidance for adaptive cruise control (ACC) and emergency braking; blind spot detection for collision warning and/or evasive steering; lane detection for lane keeping and/or centering, lane changing, or lane departure warning; and path planning and control. Other ADAS and AD features may also be implemented using the same sensor set(s).
Electric vehicles (EVs) are often capable of higher driving and handling performance relative to conventional vehicles. EV designs can include low centers of gravity, independent steering, and immediate, quick, and smooth acceleration. As a result, ADAS and AD features for EVs can involve different considerations than those for conventional vehicles.
Vehicle lane departure prediction for warning and/or collision avoidance, within the vehicle's ADAS or AD features, is improved in ways suitable to EVs having higher driving and handling performance. The vehicle rate of departure from an occupied lane is determined even for road segments having a small radius of curvature. The improved lane departure warning helps predict and warn of lane departure even during high performance maneuvers so that vehicle planning and control may optionally take control of at least vehicle steering and/or braking for a corrective action.
In one embodiment, an apparatus includes at least one camera configured to capture an image of a traffic lane in front of a vehicle. The apparatus also includes a vehicle behavior prediction controller configured to determine lane boundaries and road curvature for a segment of a traffic lane occupied by the vehicle from the captured image and prior captured images; determine lateral distances of the vehicle from the lane boundaries and a rate of departure of the vehicle from the occupied traffic lane that is accurate for the determined road curvature; determine a time to line crossing for the vehicle from the lateral distances and the rate of departure; and activate a lane departure warning indicator based on the determined time to line crossing.
The rate of departure may be determined using first and second terms for effect of road curvature on lane departure for different radii of curvature. The rate of departure Vdepart may be determined from:
Vdepart=v·sin(θ)+cl·θ̇,
where v is a speed of the vehicle, θ is a heading offset for the vehicle, θ̇ is a first derivative of θ with respect to traveled distance, cl is a length of the vehicle, v·sin(θ) represents effect of road curvature on lane departure for a first range of radii of curvature, and cl·θ̇ represents effect of road curvature on lane departure for a second range of radii of curvature. θ̇ may be determined from:
where ω is vehicle yaw rate, κ is the road curvature, and r is a lateral offset of the vehicle to a lane boundary. The lane departure warning indicator may be activated based on the lateral distances, the rate of departure, and a curvature of the road. The lane departure warning indicator may be activated based on:
LDW=function(Dthreshold, Vthreshold, κthreshold),
where Dthreshold is a threshold for distance of the vehicle from a lane boundary being approached, Vthreshold is a threshold for vehicle speed, and κthreshold is a threshold for road curvature. A time to line crossing TTLC may be determined from a lateral distance dL to a left lane boundary, a lateral distance dR to a right lane boundary, and the rate of departure Vdepart based on:
The apparatus may also include a vehicle motion controller configured to activate at least one of a braking control and a steering control based on the lane departure warning indicator.
In another embodiment, a vehicle includes the apparatus and a motor configured to drive wheels of the vehicle. The vehicle also includes a chassis supporting axles on which the wheels are mounted and a steering control configured to generate a steering command to control the wheels based on activation of the lane departure warning indicator. The vehicle further includes a brake actuator configured to actuate brakes for one or more of the wheels and a braking control configured to generate a braking command to control the brake actuator based on activation of the lane departure warning indicator. The vehicle may be an electric vehicle.
In still another embodiment, a method includes capturing an image of a traffic lane in front of a vehicle, determining lane boundaries and road curvature for a segment of a traffic lane occupied by the vehicle from the captured image and prior captured images, determining lateral distances of the vehicle from the lane boundaries and a rate of departure of the vehicle from the occupied traffic lane that is accurate for the determined road curvature, determining a time to line crossing for the vehicle from the lateral distances and the rate of departure, and activating a lane departure warning indicator based on the determined time to line crossing.
The method may further include determining the rate of departure using first and second terms for effect of road curvature on lane departure for different radii of curvature. The rate of departure Vdepart may be determined from:
Vdepart=v·sin(θ)+cl·θ̇,
where v is a speed of the vehicle, θ is a heading offset for the vehicle, θ̇ is a first derivative of θ with respect to traveled distance, cl is a length of the vehicle, v·sin(θ) represents effect of road curvature on lane departure for a first range of radii of curvature, and cl·θ̇ represents effect of road curvature on lane departure for a second range of radii of curvature. θ̇ may be determined from:
where ω is vehicle yaw rate, κ is the road curvature, and r is a lateral offset of the vehicle to a lane boundary. The lane departure warning indicator may be activated based on the lateral distances, the rate of departure, and a curvature of the road. The lane departure warning indicator may be activated based on:
LDW=function(Dthreshold, Vthreshold, κthreshold),
where Dthreshold is a threshold for distance of the vehicle from a lane boundary being approached, Vthreshold is a threshold for vehicle speed, and κthreshold is a threshold for road curvature. The time to line crossing TTLC may be determined from a lateral distance dL to a left lane boundary, a lateral distance dR to a right lane boundary, and the rate of departure Vdepart based on:
The method may also include controlling vehicle motion by activating at least one of a braking control and a steering control based on the lane departure warning indicator.
Other technical features may be readily apparent to one skilled in the art from the following figures, descriptions, and claims.
For a more complete understanding of this disclosure and its advantages, reference is now made to the following description taken in conjunction with the accompanying drawings, in which like reference numerals represent like parts:
In ADAS, lateral warning features alert a human driver (or "operator"). One such lateral warning feature is a lane departure warning (LDW), which warns the driver when the subject vehicle starts to drift toward or over a traffic lane boundary. However, LDW is often very limited because it is only activated when perception of the lane is available, which may not be the case for a road with high curvature (a small radius of curvature, e.g., less than about 250 meters). As a result, current LDW only covers road segments with a limited radius of curvature and/or vehicle speed. In addition, only limited vision information (lateral offset and heading offset) may be available for use by LDW. LDW according to the present disclosure uses full nonlinear kinematics with full vision information of lateral offset, heading offset, curvature, and rate of curvature. Among other things, this allows LDW according to the present disclosure to cope with road segments having a small radius of curvature.
The vehicle 100 of
Passengers may enter and exit the cabin 101 through at least one door 102 forming part of the cabin 101. A transparent windshield 103 and other transparent panels mounted within and forming part of the cabin 101 allow at least one passenger (referred to as the "operator," even when the vehicle 100 is operating in an AD mode) to see outside the cabin 101. Rear view mirrors 104 mounted to sides of the cabin 101 enable the operator to see objects to the sides and rear of the cabin 101 and may include warning indicators (e.g., selectively illuminated warning lights) for ADAS features such as blind spot warning (indicating that another vehicle is in the operator's blind spot) and/or lane departure warning.
Wheels 105 mounted on axles that are supported by the chassis and driven by the motor(s) (all not visible in
In the present disclosure, the vehicle 100 includes a vision system including at least a front camera 106, side cameras 107 (mounted on the bottoms of the rear view mirrors 104 in the example depicted), and a rear camera. The cameras 106, 107 provide images to the vehicle control system for use as part of ADAS and AD features as described below, and the images may optionally be displayed to the operator.
Although
By way of example, power doors on a vehicle may be operated by an ECU called the body control module (not shown in
Notably, vehicle control systems are migrating to higher-speed networks with an Ethernet-like bus for which each ECU is assigned an Internet protocol (IP) address. Among other things, this may allow both centralized vehicle ECUs and remote computers to exchange large amounts of information and to participate in the Internet of Things (IoT).
In the example shown in
For the present disclosure, the vehicle control system 200 includes an image processing module (IPM) CAN 211 to which the front camera ECU 216, side camera ECU 217, and rear camera ECU 218 are connected. The front camera ECU 216 receives image data from the front camera 106 on the vehicle 100, while the side camera ECU 217 receives image data from each of the side cameras 107 and the rear camera ECU 218 receives image data from the rear camera. In some embodiments, a separate ECU may be used for each camera, such that two side camera ECUs may be employed. The IPM CAN 211 and the front camera ECU 216, side camera ECU 217, and rear camera ECU 218 process image data for use in vision-based ADAS features, such as providing a rear back-up camera display and/or stitching together the images to create a “bird's eye” view of the vehicle's surroundings.
Although
To support various ADAS functions such as lane departure warning and collision avoidance during high performance operation or at other times, the IPM CAN 211 for the vehicle 100 can accurately detect a lane boundary even for a road segment with a small radius of curvature.
To support ADAS and AD features, the system 300 includes the functions of camera perception 301, vehicle behavior prediction 302, decision and motion planning 303, and motion control 304. Camera perception 301 detects a traffic lane ahead and the relative position of the vehicle within that lane. Vehicle behavior prediction 302 determines whether the vehicle could potentially cross or is crossing a lane boundary, risking collision with another vehicle in an adjacent lane. Decision and motion planning 303 and motion control 304 respectively determine and, if necessary, implement reactive responses to the vehicle's possible lane departure, such as activation of an indicator or steering assistance and/or emergency braking.
Camera perception 301 determines, from captured images using any suitable techniques, detection/tracking parameters 305 including lateral offset of the vehicle from a reference path (e.g., the lane centerline), a heading offset from the lane direction, curvature of the lane, and a rate of curvature of the lane. Detection/tracking parameters 305 are used by behavior prediction 302 to determine lane departure warning parameters 306, which include lateral distance of the vehicle to a lane boundary, rate of departure of the vehicle from the lane, curvature of the vehicle's predicted path, and time to line crossing (TTLC).
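For reference, the data exchanged between camera perception 301 and vehicle behavior prediction 302 can be sketched as two simple containers. This is a minimal illustration in Python, with field names chosen to match the parameters 305 and 306 described above; it is an assumption-level sketch, not an interface of the disclosed system.

```python
from dataclasses import dataclass


@dataclass
class DetectionTrackingParams:
    """Detection/tracking parameters 305 produced by camera perception 301."""
    lateral_offset: float   # offset from the reference path (e.g., lane centerline), meters
    heading_offset: float   # angle between vehicle heading and lane direction, radians
    curvature: float        # lane curvature at the projection point, 1/meters
    curvature_rate: float   # rate of change of lane curvature along the path, 1/meters^2


@dataclass
class LaneDepartureParams:
    """Lane departure warning parameters 306 produced by behavior prediction 302."""
    dist_left: float        # dL, lateral distance to the left lane boundary, meters
    dist_right: float       # dR, lateral distance to the right lane boundary, meters
    departure_rate: float   # Vdepart, rate of departure from the occupied lane, meters/second
    path_curvature: float   # curvature of the vehicle's predicted path, 1/meters
    ttlc: float             # time to line crossing, seconds
```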
The vehicle 100 travels exactly along the reference path if the reference point lies on the reference path and the velocity vector is tangent to the path. Therefore, the relative course angle θ=ψ−ψp is introduced, where ψp is the orientation of the path at the projection point P. The vehicle kinematics can be described by:
where s is traveled distance, v is the vehicle's longitudinal speed, ω is yaw rate, r is the vehicle's lateral offset from the reference path, θ is heading offset, and κ is the curvature of the reference path at the projection point. The curvature is defined as the derivative of the orientation with respect to the travelled distance along the path and may be interpreted as the reciprocal of the local curve radius.
The first differential equation above describes how fast the vehicle moves along the reference path. The equation is derived by first taking the fraction of the velocity tangent to the path, i.e., v cos(θ), and applying the rule of three. This equation plays a useful role in deriving the dynamics but is usually ignored in lateral motion control. That fact indicates a benefit of using Frenet coordinates in lateral control, namely that the number of relevant differential equations can be reduced.
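The differential equations themselves appear only as images in the published document and are not reproduced above. The following sketch implements the standard Frenet-frame kinematic model that is consistent with the variable definitions and with the derivation described here (the tangential velocity component v·cos(θ) scaled by the rule of three); it is offered as an assumption-level illustration rather than a verbatim copy of the disclosed equations.

```python
import math


def frenet_kinematics(v, omega, r, theta, kappa):
    """Time derivatives (s_dot, r_dot, theta_dot) of the traveled distance s,
    lateral offset r, and heading offset theta under the standard Frenet-frame
    kinematic model (assumed here; requires |kappa * r| < 1)."""
    # Tangential velocity component scaled by the rule of three to account for
    # the difference between the local curve radius 1/kappa and 1/kappa - r.
    s_dot = v * math.cos(theta) / (1.0 - kappa * r)
    # The lateral offset grows with the velocity component normal to the path.
    r_dot = v * math.sin(theta)
    # The heading offset changes with the vehicle yaw rate minus the rotation
    # of the path orientation as the projection point advances.
    theta_dot = omega - kappa * s_dot
    return s_dot, r_dot, theta_dot
```

Only the last two derivatives are needed for lateral control, which reflects the reduction in relevant differential equations noted above.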
Some parameters, namely lateral distances (dL, dR) from the front corners of the vehicle 100 to the closest lane boundary 403, 401 illustrated by
The lateral distances dL, dR to each lane boundary 403, 401 can be calculated as:
such that
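The dL and dR equations are likewise rendered as images in the published document. As a placeholder, the sketch below shows one plausible geometric formulation, assuming the perception output provides the vehicle's lateral offset from the lane centerline together with the lane width, and assuming a known vehicle width; the variable names and the exact geometry are illustrative assumptions and may differ from the disclosed equations.

```python
def lateral_distances(lane_width, lateral_offset, vehicle_width):
    """Illustrative estimate of the distances dL and dR from the sides of the
    vehicle to the left and right lane boundaries.  Assumes lateral_offset is
    measured from the lane centerline, positive toward the left boundary."""
    half_lane = lane_width / 2.0
    half_vehicle = vehicle_width / 2.0
    d_left = half_lane - lateral_offset - half_vehicle
    d_right = half_lane + lateral_offset - half_vehicle
    # Clamp at zero once a boundary has been reached or crossed.
    return max(d_left, 0.0), max(d_right, 0.0)
```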
Another parameter, namely rate of departure (Vdepart) illustrated by
where r is the lateral offset of the vehicle to a lane boundary along the vehicle heading. The rate of departure of the vehicle 100 can be calculated as:
Vdepart=v·sin(θ)+cl·θ̇,
where Vdepart is defined at the center of the front edge of the vehicle 100 so that both the translational and rotational components of the rate of departure can be considered. The first term, v·sin(θ), indicates the effect of the translational departure rate on a road segment with a large radius of curvature. The second term, cl·θ̇, denotes the rotational departure rate effects from both yaw motion and the rate of curvature of a road segment with a small radius of curvature. Accordingly, the rate of departure Vdepart can be captured more precisely for road segments across a larger range of radii of curvature.
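As a concrete illustration, the departure-rate computation can be sketched as follows. The θ̇ term is taken from the Frenet-frame relation sketched earlier (θ̇ = ω − κ·ṡ); that substitution is an assumption consistent with the variables ω, κ, and r listed above, not a reproduction of the equation omitted from this text.

```python
import math


def departure_rate(v, omega, r, theta, kappa, vehicle_length):
    """Rate of departure Vdepart at the center of the vehicle's front edge:
    the translational term v*sin(theta) plus the rotational term c_l*theta_dot."""
    # Assumed Frenet-frame relations (see the kinematics sketch above).
    s_dot = v * math.cos(theta) / (1.0 - kappa * r)
    theta_dot = omega - kappa * s_dot
    return v * math.sin(theta) + vehicle_length * theta_dot
```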
Finally, the time to line crossing (TTLC) can be expressed with the two parameters discussed above:
Note that when Vdepart≈0, the value may be saturated to prevent TTLC from diverging.
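A minimal sketch of the TTLC computation, including the saturation noted above, might look like the following. The choice of boundary (selected by the sign of Vdepart), the small-value cutoff, and the saturation limit are assumptions for illustration.

```python
def time_to_line_crossing(d_left, d_right, v_depart, eps=1e-3, ttlc_max=10.0):
    """Time to line crossing, saturated when the departure rate is near zero.
    Assumes a positive v_depart means drift toward the left boundary and a
    negative v_depart means drift toward the right boundary."""
    if abs(v_depart) < eps:
        return ttlc_max  # saturate so TTLC does not diverge
    distance = d_left if v_depart > 0.0 else d_right
    return min(distance / abs(v_depart), ttlc_max)
```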
A lane departure warning bit LDW can be set or cleared based on a combination of the example parameters discussed above: lateral distances dL, dR and the rate of departure Vdepart, together with the road curvature κ (the derivative of the path orientation with respect to traveled distance, interpreted as the reciprocal of the local curve radius as discussed above). One way to do this is to use TTLC with a fixed threshold: LDW=1 (on) if TTLC<TTLCthreshold. Another way to do this is to take advantage of the speed-dependent LDW defined by the International Organization for Standardization in ISO 17361:2017 depicted in
LDW=function(Dthreshold, Vthreshold, κthreshold),
where Dthreshold is a threshold for distance of the vehicle from a lane boundary being approached, Vthreshold is a threshold for vehicle speed, and κthreshold is a threshold for road curvature.
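Both decision strategies can be sketched together, with the thresholds treated as tunable parameters. The default values (0.3 m, 16.7 m/s ≈ 60 km/h, 1/250 m curvature) and the way the distance, speed, and curvature thresholds are combined are illustrative assumptions; the speed-dependent behavior of ISO 17361:2017 is only approximated here.

```python
def ldw_fixed_threshold(ttlc, ttlc_threshold=1.0):
    """Strategy 1: warn when the time to line crossing drops below a fixed threshold."""
    return ttlc < ttlc_threshold


def ldw_threshold_function(d_to_boundary, v, kappa,
                           d_threshold=0.3, v_threshold=16.7, kappa_threshold=1.0 / 250.0):
    """Strategy 2: LDW = function(Dthreshold, Vthreshold, kappa_threshold).
    Illustrative combination: warn when the vehicle is close to the boundary
    being approached while moving above the speed threshold, and widen the
    distance threshold on high-curvature segments so the warning is issued
    earlier in tight curves."""
    effective_d = d_threshold * (2.0 if abs(kappa) > kappa_threshold else 1.0)
    return d_to_boundary < effective_d and v > v_threshold
```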
The example process 700 illustrated in
A control (e.g., behavior prediction 302) for lane departure determination determines lateral distances dL, dR of the vehicle from the lane boundaries and the vehicle rate of departure Vdepart from the lane (step 704), with the calculation of the vehicle's rate of departure being accurate even for a small radius of curvature. Time to line crossing TTLC is determined from the lateral distances dL, dR, the accurate vehicle rate of departure Vdepart, and the road curvature κ (step 705). Based on the time to line crossing, a check is made whether vehicle lane departure is imminent (determined, for example, by fixed or variable thresholding of TTLC) (step 706). If not, another iteration of the process is started. If so, the process activates visible, audible, and/or haptic warning indicator(s) (step 707), and another iteration of the process is started. The warning indicators may be deactivated when a subsequent iteration of the process determines that vehicle lane departure is no longer imminent. The lane departure warning control signal may also be employed by a collision avoidance control.
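The iterative process can be tied together in a short control-loop sketch. The callables perception, behavior_prediction, and warn are hypothetical stand-ins for camera perception 301, behavior prediction 302, and the warning indicators; the fixed TTLC threshold mirrors the thresholding options discussed above.

```python
def lane_departure_warning_iteration(perception, behavior_prediction, warn, ttlc_threshold=1.0):
    """One illustrative iteration of the lane departure warning process."""
    lane = perception()                        # lane boundaries, curvature, offsets (steps 701-703)
    params = behavior_prediction(lane)         # dL, dR, Vdepart, TTLC (steps 704-705)
    imminent = params.ttlc < ttlc_threshold    # imminence check (step 706)
    warn(imminent)                             # activate or deactivate indicators (step 707)
    return imminent
```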
The improved lane departure warning of the present disclosure helps predict and warn of the lane departure of the subject vehicle, even during high performance maneuvers. In some cases, this may allow a planning and control module to take control of at least vehicle steering and/or braking for a corrective action.
It may be advantageous to set forth definitions of certain words and phrases used throughout this patent document. The terms “include” and “comprise,” as well as derivatives thereof, mean inclusion without limitation. The term “or” is inclusive, meaning and/or. The phrase “associated with,” as well as derivatives thereof, may mean to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have, have a property of, have a relationship to or with, or the like. The phrase “at least one of,” when used with a list of items, means that different combinations of one or more of the listed items may be used, and only one item in the list may be needed. For example, “at least one of: A, B, and C” includes any of the following combinations: A, B, C, A and B, A and C, B and C, and A and B and C.
The description in this patent document should not be read as implying that any particular element, step, or function is an essential or critical element that must be included in the claim scope. Also, none of the claims is intended to invoke 35 U.S.C. § 112(f) with respect to any of the appended claims or claim elements unless the exact words "means for" or "step for" are explicitly used in the particular claim, followed by a participle phrase identifying a function. Use of terms such as (but not limited to) "mechanism," "module," "device," "unit," "component," "element," "member," "apparatus," "machine," "system," "processor," "processing device," or "controller" within a claim is understood and intended to refer to structures known to those skilled in the relevant art, as further modified or enhanced by the features of the claims themselves, and is not intended to invoke 35 U.S.C. § 112(f).
While this disclosure has described certain embodiments and generally associated methods, alterations and permutations of these embodiments and methods will be apparent to those skilled in the art. Accordingly, the above description of example embodiments does not define or constrain this disclosure. Other changes, substitutions, and alterations are also possible without departing from the spirit and scope of this disclosure, as defined by the following claims.