This application is related to the subject matter of: U.S. patent application Ser. No. 17/305,701 filed Jul. 13, 2021 and entitled SYSTEM AND METHOD IN THE PREDICTION OF TARGET VEHICLE BEHAVIOR BASED ON IMAGE FRAME AND NORMALIZATION; U.S. patent application Ser. No. 17/305,702 filed Jul. 13, 2021 and entitled SYSTEM AND METHOD IN DATA-DRIVEN VEHICLE DYNAMIC MODELING FOR PATH-PLANNING AND CONTROL; U.S. patent application Ser. No. 17/305,703 filed Jul. 13, 2021 and entitled SYSTEM AND METHODS OF INTEGRATING VEHICLE KINEMATICS AND DYNAMICS FOR LATERAL CONTROL FEATURE AT AUTONOMOUS DRIVING; U.S. patent application Ser. No. 17/305,704 filed Jul. 13, 2021 and entitled SYSTEM AND METHOD IN VEHICLE PATH PREDICTION BASED ON FULL NONLINEAR KINEMATICS; and U.S. patent application Ser. No. 17/305,705 filed Jul. 13, 2021 and entitled SYSTEM AND METHOD IN LANE DEPARTURE WARNING WITH FULL NONLINEAR KINEMATICS AND CURVATURE. The content of the above-identified patent documents is incorporated herein by reference.
This disclosure relates generally to vehicle driver assist or autonomous driving systems. More specifically, this disclosure relates to lane departure detection and warning with ego motion and vision.
Advanced driving assist system (ADAS) features, which use automated technology to assist the vehicle operator in driving and parking, form a foundation for autonomous driving (AD). Determination of vehicle position information and/or detection of nearby objects enables features such as: collision detection and avoidance for adaptive cruise control (ACC), emergency braking; blind spot detection for collision warning and/or evasive steering; lane detection for lane keeping and/or centering, lane changing, or lane departure warning; and path planning and control. Other ADAS and AD features may also be implemented using the same sensor set(s).
Electric vehicles (EVs) are often capable of higher driving and handling performance relative to conventional vehicles. EV designs can include low centers of gravity, independent steering, and immediate, quick, and smooth acceleration. As a result, ADAS and AD features for EVs can involve different considerations than those for conventional vehicles.
Vehicle lane departure detection and lane departure warning (LDW), within the vehicle's ADAS or AD features, are improved in ways suited to EVs having higher driving and handling performance. Predicted path curvature is calculated under both a low-vehicle-speed assumption and a high-vehicle-speed assumption, and the two path curvatures and the corresponding derived rates of curvature are combined in a weighted manner based on the vehicle's current speed. The weighted combinations of predicted path curvature and rate of curvature more accurately predict the vehicle's path. Combining vehicle path predictions with vehicle vision provides an improved and more accurate lane departure warning. The improved lane departure warning helps accurately predict and warn of lane departure without false positives, even during high-performance maneuvers, so that vehicle planning and control may optionally take control of at least vehicle steering and/or braking for a corrective action.
In one aspect, an apparatus comprises at least one camera configured to capture at least one image of a traffic lane in front of a vehicle, an inertial measurement unit (IMU) configured to detect motion characteristics of the vehicle, and at least one processor. The at least one processor is configured to obtain a vehicle motion trajectory using the IMU and based on one or more vehicle path prediction parameters, obtain a vehicle vision trajectory based on the at least one image, wherein the vehicle vision trajectory includes at least one lane boundary for a segment of the traffic lane occupied by the vehicle, determine distances between one or more points on the vehicle and one or more intersection points of the at least one lane boundary based on the obtained vehicle motion trajectory, determine at least one time to line crossing (TTLC) based on the determined distances and a speed of the vehicle, and activate a lane departure warning indicator based on the determined at least one TTLC.
In some embodiments, to obtain the vehicle motion trajectory, the at least one processor is further configured to determine first parameters for predicting a path of the vehicle, determine second parameters for predicting the path of the vehicle, and predict the path of the vehicle using a combination of the first parameters and the second parameters, wherein the combination is weighted based on the speed of the vehicle.
In some embodiments, the first parameters comprise a first path curvature for predicting the path of the vehicle and a first rate of the first path curvature and the second parameters comprise a second path curvature for predicting the path of the vehicle and a second rate of the second path curvature.
In some embodiments, the weighted combination weights the first parameters using a weight α and weights the second parameters using a weight 1−α.
In some embodiments, the weight α is applied to the first parameters for vehicle speeds below a first threshold and the weight 1−α is applied to the second parameters for vehicle speeds above a second threshold.
In some embodiments, the first parameters comprise a first path curvature κL and a first rate κ′L, the second parameters comprise a second path curvature κH and a second rate κ′H, the first path curvature κL and the second path curvature κH are combined according to κ=α·κL+(1−α)·κH, and the first rate κ′L and the second rate κ′H are combined according to κ′=α·κ′L+(1−α)·κ′H.
In some embodiments, to determine the at least one TTLC, the at least one processor is further configured to determine a plurality of TTLCs each based on a distance between one of the one or more points on the vehicle and one of the one or more intersection points of the at least one lane boundary, and to activate the lane departure warning indicator, the at least one processor is further configured to compare a threshold with a combination of the plurality of TTLCs with applied weighting factors.
In some embodiments, the one or more points on the vehicle include a left corner point, a center point, and a right corner point.
In some embodiments, a vehicle comprising the apparatus comprises a motor configured to drive wheels of the vehicle, a chassis supporting axles on which the wheels are mounted, a steering control configured to generate a steering command to control the wheels based on activation of the lane departure warning indicator, a brake actuator configured to actuate brakes for one or more of the wheels, and a braking control configured to generate a braking command to control the brake actuator based on activation of the lane departure warning indicator.
In some embodiments, the vehicle is an electric vehicle.
In another aspect a method comprises capturing at least one image of a traffic lane in front of a vehicle using at least one camera, detecting motion characteristics of the vehicle using an inertial measurement unit (IMU), obtaining a vehicle motion trajectory using the IMU and based on one or more vehicle path prediction parameters, obtaining a vehicle vision trajectory based on the at least one image, wherein the vehicle vision trajectory includes at least one lane boundary for a segment of the traffic lane occupied by the vehicle, determining distances between one or more points on the vehicle and one or more intersection points of the at least one lane boundary based on the obtained vehicle motion trajectory, determining at least one time to line crossing (TTLC) based on the determined distances and a speed of the vehicle, and activating a lane departure warning indicator based on the determined at least one TTLC.
In some embodiments, obtaining the vehicle motion trajectory includes determining first parameters for predicting a path of the vehicle, determining second parameters for predicting the path of the vehicle, and predicting the path of the vehicle using a combination of the first parameters and the second parameters, wherein the combination is weighted based on the speed of the vehicle.
In some embodiments, the first parameters comprise a first path curvature for predicting the path of the vehicle and a first rate of the first path curvature and the second parameters comprise a second path curvature for predicting the path of the vehicle and a second rate of the second path curvature.
In some embodiments, the weighted combination weights the first parameters using a weight α and weights the second parameters using a weight 1−α.
In some embodiments, the weight α is applied to the first parameters for vehicle speeds below a first threshold, and the weight 1−α is applied to the second parameters for vehicle speeds above a second threshold.
In some embodiments, the first parameters comprise a first path curvature κL and a first rate κ′L, the second parameters comprise a second path curvature κH and a second rate κ′H, the first path curvature κL and the second path curvature κH are combined according to κ=α·κL+(1−α)·κH, and the first rate κ′L and the second rate κ′H are combined according to κ′=α·κ′L+(1−α)·κ′H.
In some embodiments, determining the at least one TTLC includes determining a plurality of TTLCs each based on a distance between one of the one or more points on the vehicle and one of the one or more intersection points of the at least one lane boundary and activating the lane departure warning indicator includes comparing a threshold with a combination of the plurality of TTLCs with applied weighting factors.
In some embodiments, the one or more points on the vehicle include a left corner point, a center point, and a right corner point.
In some embodiments, the method further comprises driving wheels of the vehicle with a motor, generating a steering command to control the wheels based on activation of the lane departure warning indicator, and generating a braking command to control a brake actuator based on activation of the lane departure warning indicator.
In some embodiments, the vehicle is an electric vehicle.
Other technical features may be readily apparent to one skilled in the art from the following figures, descriptions, and claims.
For a more complete understanding of this disclosure and its advantages, reference is now made to the following description taken in conjunction with the accompanying drawings, in which like reference numerals represent like parts:
ADAS features often include lateral warning features that alert the driver, such as a lane departure warning (LDW) that warns the driver when the vehicle begins to drift toward the boundary of the lane. Existing LDW features, however, are limited in operating range because the LDW is only activated when a perception of the lane is available on roads with low curvature. Existing LDW features also only cover road segments with limited curvature, using limited vision information. In such systems, parameters such as distance to lane boundary, rate of departure, and time to line crossing (TTLC) can be calculated using vision information, without—or at least with very little—prediction information of ego vehicle motion. Additionally, the vehicle's LDW feature is often turned off by the driver because inaccurate lane departure determination makes the warning frequent and disturbing.
The present disclosure provides a vehicle with an improved lane departure detection and warning system. The improved system combines ego motion of the vehicle with vehicle vision to accurately determine the TTLC and predict lane departure. The vehicle trajectory is predicted from both ego motion prediction and camera lane detection, using parameters including lateral offset, heading offset, path curvature, and rate of curvature, without requiring additional calculations of distance to lane boundary and rate of departure. Using the accurately determined TTLC, the improved vehicle of the present disclosure provides the driver with more precise and less intrusive warnings.
With kinematics, the curvature of the vehicle's predicted path (e.g., occupied traffic lane) can be obtained, together with the vehicle's lateral acceleration and speed. For example, lateral vehicle acceleration Ay (in units of meters per second squared, [m/s2]), path curvature κ (in units of [m−1]), and vehicle speed Vx (in units of [m/s]) are related as:
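Ay=κ·Vx²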
However, this information must be filtered to handle the noisy lateral acceleration signal, and the relation does not consider the vehicle's dynamic effects.
The Ackerman steer angle relation gives path curvature from the road wheel angle δrwa (in units of radians [rad]) and the wheelbase length L (in units of [m]) as follows:
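κ=δrwa/L (assuming the standard small-angle form)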
However, the relationship is not considered accurate for high-speed maneuvers.
The vehicle 100 of
Passengers may enter and exit the cabin 101 through at least one door 102 forming part of the cabin 101. A transparent windshield 103 and other transparent panels mounted within and forming part of the cabin 101 allow at least one passenger (referred to as the “operator,” even when the vehicle 100 is operating in an AD mode) to see outside the cabin 101. Rear view mirrors 104 mounted to sides of the cabin 101 enable the operator to see objects to the sides and rear of the cabin 101 and may include warning indicators (e.g., selectively illuminated warning lights) for ADAS features such as blind spot warning (indicating that another vehicle is in the operator's blind spot) and/or lane departure warning.
Wheels 105 mounted on axles that are supported by the chassis and driven by the motor(s) (all not visible in
In the present disclosure, the vehicle 100 includes a vision system including at least a front camera 106, side cameras 107 (mounted on the bottoms of the rear view mirrors 104 in the example depicted), and a rear camera. The cameras 106, 107 provide images to the vehicle control system for use as part of ADAS and AD features as described below, and the images may optionally be displayed to the operator. In addition, the vehicle 100 includes an inertial measurement unit (IMU) 120 (shown in phantom in
Although
By way of example, power doors on a vehicle may be operated by an ECU called the body control module (not shown in
Notably, vehicle control systems are migrating to higher-speed networks with an Ethernet-like bus for which each ECU is assigned an Internet protocol (IP) address. Among other things, this may allow both centralized vehicle ECUs and remote computers to pass around huge amounts of information and participate in the Internet of Things (IoT).
In the example shown in
For the present disclosure, the vehicle control system 200 includes an image processing module (IPM) CAN 211 to which the front camera ECU 216, side camera ECU 217, and rear camera ECU 218 are connected. The front camera ECU 216 receives image data from the front camera 106 on the vehicle 100, while the side camera ECU 217 receives image data from each of the side cameras 107, and the rear camera ECU 218 receives image data from the rear camera. In some embodiments, a separate ECU may be used for each camera, such that two side camera ECUs may be employed. The IPM CAN 211 and the front camera ECU 216, side camera ECU 217, and rear camera ECU 218 process image data for use in vision-based ADAS features, such as providing a rear back-up camera display and/or stitching together the images to create a “bird's eye” view of the vehicle's surroundings.
For the present disclosure, the vehicle control system 200 also includes an IMU CAN 220 to which an IMU ECU 221 having an IMU 222 is connected. The IMU CAN 220, IMU ECU 221, and IMU 222 are used to detect vehicle motion such as yaw, pitch, and roll of the vehicle 100.
Although
To support various ADAS functions such as collision avoidance during high performance operation, the IPM CAN 211 for the vehicle 100 can accurately predict the vehicle path, and the IMU CAN 220 can detect vehicle motion. In the present disclosure, a combination of vehicle motion and vision (with optional input from other sensors) is used to predict lane boundary locations within the ego vehicle path (where “ego” refers to the vehicle implementing the ADAS and/or AD feature(s)).
To support ADAS and AD features, the system 300 includes the functions of camera perception and IMU vehicle motion 301, behavior prediction 302, decision and motion planning 303, and motion control 304. In various embodiments, at least the behavior prediction 302, the decision and motion planning 303, and the motion control 304 are performed by one or more processors, such as the CAN processor/controller 252. Camera perception and IMU vehicle motion 301 can detect a traffic lane ahead, the relative position of the traffic lane boundaries and the vehicle within the boundaries, the relative position and velocity of other vehicles, and the motion of the vehicle. The vehicle behavior prediction 302 determines whether the ego vehicle could potentially cross the lane boundary, risking collision, based on the ego vehicle's speed and detected motion, the predicted path, and the relative position of the lane boundaries. Decision and motion planning 303 and motion control 304 respectively determine and, if necessary, issue one or more lane departure warnings or indicators, such as an audio, visual, or haptic feedback warning, or other responses such as steering assistance and/or emergency braking.
The camera perception and IMU vehicle motion 301 are used to perform detection and tracking 305 to determine parameters for lane departure detection such as lateral offset, heading offset, path curvature of a predicted path, and rate of curvature of the predicted path. In the various embodiments of this disclosure, camera vision of the vehicle provides separate lateral offset, heading offset, curvature, and rate of curvature based on the camera vision, while the IMU provides separate lateral offset, heading offset, curvature, and rate of curvature based on ego vehicle motion. The ego vehicle motion can include ego vehicle speed, yaw rate, lateral offset from a reference path, longitudinal acceleration, and steering angle. In some embodiments, in addition to the IMU, various ego vehicle parameters may also be determined from vision, radar, other sensors, or some combination thereof.
Based on the parameters and ego vehicle motion determined from the detection and tracking 305, the behavior prediction 302 performs lane departure warning 306, such as by determining a lateral distance to the lane boundary, a rate of departure, and/or a TTLC by fusing or combining predicted trajectories from the detected vision and ego vehicle motion. In various embodiments of the present disclosure, the behavior prediction 302 uses the TTLC determined from the predicted vision and vehicle motion trajectories to predict a lane boundary crossing and, in response to the prediction, issue a lane departure warning to the driver.
ym=a0+a1x+a2x²+a3x³
where x is distance along the longitudinal direction of the ego vehicle, y is distance along the lateral direction, a0 is the ego vehicle lateral offset from the reference path, a1 is the ego vehicle heading offset from the reference path, a2 is curvature of the predicted (and reference) path to be found, and a3 is the rate of curvature to be found. When the ego vehicle 100 travels along the reference path (the lane centerline), the above polynomial (with a0=0) represents the predicted lane centerline 403 shown in
The path prediction represented by the above polynomial is used in predicting lane departure of the ego vehicle, such as predicting whether the ego vehicle is or will soon be passing over the left lane boundary 401 or the right lane boundary 402. In some embodiments, the path prediction represented by the above polynomial can also be used to identify the closest in path vehicle as target vehicle 405 from among target vehicles 404, 405 and 406.
Vehicle kinematics 501, vehicle dynamics 502, and weighting 503 may be implemented as part of behavior prediction 302 and/or decision and motion planning 303. Vehicle kinematics 501 receive as inputs 504 the ego vehicle steering angle, speed, and yaw rate. Vehicle dynamics 502 receive as inputs 505 the ego vehicle steering angle and speed. Weighting 503 receives as an input 506 the ego vehicle speed.
In some embodiments, the ego vehicle path prediction is made with a third-order polynomial including both curvature and rate of curvature. Two types of curvature and rate of curvature may be obtained by using the Ackerman angle, kinematics, and vehicle dynamics. The final curvature and rate of curvature may be determined by fusing these two initial curvatures and the associated rates based on the vehicle speed.
The path curvature κL can be expressed from the Ackerman angle as:
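κL=δrwa/L (assuming the standard small-angle Ackerman form)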
The rate of that path curvature κ′L can be derived as:
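Taking the rate with respect to traveled distance, consistent with the third-order path polynomial above, one form is:
κ′L=δ̇rwa/(L·Vx)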
where the derivative of road wheel angle δ̇rwa can be obtained from the first-order delay between road wheel angle (δrwa) and steering wheel angle (δswa) by a time delay τ and a ratio (κ):
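One standard unity-lag form, assumed here (κ in this expression denotes the steering ratio named above, not the path curvature), is:
δrwa(s)=δswa(s)/(κ·(τ·s+1))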
which can be written in the time domain as:
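τ·δ̇rwa(t)+δrwa(t)=δswa(t)/κ, i.e., δ̇rwa(t)=(δswa(t)/κ−δrwa(t))/τ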
The path curvature κH can also be expressed from kinematics using the yaw rate ω and the vehicle speed Vx:
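κH=ω/Vx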
The rate of path curvature κ′H can be derived as:
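Assuming approximately constant speed over the prediction horizon and taking the rate with respect to traveled distance:
κ′H=ω̇/Vx²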
where ω̇ can be obtained from bicycle dynamics. The integrated system model with first-order delay and bicycle dynamics can be expressed as:
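One common linear form of this model, taking the side slip angle β and the yaw rate ω as states driven by the road wheel angle δrwa (the exact arrangement used here is an assumption), is:
β̇=−((Cf+Cr)/(m·Vx))·β+((Cr·lr−Cf·lf)/(m·Vx²)−1)·ω+(Cf/(m·Vx))·δrwa
ω̇=((Cr·lr−Cf·lf)/Iz)·β−((Cf·lf²+Cr·lr²)/(Iz·Vx))·ω+((Cf·lf)/Iz)·δrwa
with δrwa driven by δswa through the first-order delay above,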
where β is side slip angle, ω is yaw rate, Cf and Cr are respectively front/rear cornering stiffness, lf and lr are respectively front/rear axle distance from the vehicle center of gravity, m is vehicle mass, and Iz is yaw rotational inertia.
Accordingly, vehicle kinematics 501 output two curvatures 507, κL and κH. Vehicle dynamics 502 employ those two curvatures to derive two rates of curvature 508, κ̇L and κ̇H (or alternatively κ′L and κ′H). The relationship among vehicle speed Vx, steering wheel angle δswa, and road wheel angle δrwa may be provided in a mapping table 509 as illustrated in
The final curvature used for ego vehicle path prediction can be determined from those calculated from Ackerman angle and kinematics, and the final rate of curvature used for ego vehicle path prediction can be derived from the curvatures calculated from Ackerman angle and kinematics. For example, this may be accomplished by applying weights α and 1−α as follows:
y=κx²+κ′x³
κ=α·κL+(1−α)·κH
κ′=α·κ′L+(1−α)·κ′H
The weights α and 1−α can be applied by weighting 503 based on vehicle speed according to tuning parameters νfading,start and νfading,width as shown in
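As a concrete illustration of this weighted fusion, the following sketch assumes a simple linear fade of α from 1 to 0 between νfading,start and νfading,start + νfading,width (the exact fade shape shown in the referenced figure is not reproduced here); the function and parameter names are illustrative only, not taken from this disclosure.

```python
import numpy as np

def blend_weight(v, v_fading_start, v_fading_width):
    """Speed-dependent weight alpha in [0, 1]: alpha = 1 below
    v_fading_start (low-speed/Ackerman curvature dominates) and fades
    linearly to 0 over v_fading_width (high-speed/yaw-rate curvature
    dominates). The linear fade shape is an assumption."""
    return float(np.clip(1.0 - (v - v_fading_start) / v_fading_width, 0.0, 1.0))

def fuse_curvature(kappa_l, kappa_l_rate, kappa_h, kappa_h_rate, v,
                   v_fading_start=5.0, v_fading_width=10.0):
    """Weighted combination kappa = a*kappa_L + (1-a)*kappa_H and
    kappa' = a*kappa'_L + (1-a)*kappa'_H, with a based on vehicle speed."""
    a = blend_weight(v, v_fading_start, v_fading_width)
    kappa = a * kappa_l + (1.0 - a) * kappa_h
    kappa_rate = a * kappa_l_rate + (1.0 - a) * kappa_h_rate
    return kappa, kappa_rate

def predicted_lateral_offset(x, kappa, kappa_rate, a0=0.0, a1=0.0):
    """Third-order path polynomial y = a0 + a1*x + a2*x^2 + a3*x^3
    with a2 = kappa and a3 = kappa' as used above."""
    return a0 + a1 * x + kappa * x**2 + kappa_rate * x**3
```

For example, fuse_curvature(0.02, 0.0, 0.015, 0.0, v=12.0) blends the two curvature estimates at 12 m/s, and the result can be substituted into predicted_lateral_offset to evaluate the predicted path at a given lookahead distance x.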
The third-order polynomial above for ego vehicle path prediction can be completed with the following coefficients:
a0=0
a1=0
a2=κ
a3=κ′
Note that a0=0 and a1=0 when the ego vehicle follows the reference path (the predicted lane centerline 403). The left and right lane boundaries 401, 402 have the same coefficients (a1, a2, a3) but different lateral offsets
In various embodiments, the above ego vehicle motion trajectory prediction is determined using, or by connection to, the IMU CAN 220, the IMU ECU 221, and the CAN processor/controller 252 depicted in
The example process 600 illustrated in
At step 604, the processor uses a kinematics control (e.g., kinematics 501) to determine path curvatures according to kinematics and Ackerman steering angle, and a dynamics control (e.g., dynamics 502) to determine rates of the two curvatures. At step 606, the processor applies vehicle speed-dependent weighting (e.g., by weighting 503) to determine a final predicted path curvature and rate of curvature. At step 608, the processor predicts the ego vehicle motion path or trajectory using at least the lateral offset, heading offset, path curvature and rate of curvature.
The two curvatures determined from kinematics and the Ackerman steering angle, and the two corresponding rates of curvature, are combined in a weighted manner and used for vehicle motion path prediction to assist with detecting lane departure and issuing an LDW.
In various embodiments of this disclosure, vehicle trajectory is also predicted from vehicle vision using, or by connection to, the front camera 106 and the side cameras 107 in
yν=b0+b1x+b2x²+b3x³
where b0, b1, b2, and b3 respectively represent the lateral offset, the heading offset, ½ times the curvature, and ⅙ times the rate of curvature from vision.
In the example process 700 illustrated in
ym=a0+a1x+a2x²+a3x³
yν=b0+b1x+b2x²+b3x³
where ym represents ego vehicle motion trajectory and yν represents vehicle vision trajectory.
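As one hedged illustration of how the two cubic trajectories can be used together, the sketch below finds the nearest longitudinal distance at which the predicted motion trajectory meets a vision-detected lane boundary polynomial; the helper name and the root-finding approach are assumptions for illustration, not the specific construction of process 700.

```python
import numpy as np

def first_crossing_distance(a_coeffs, b_coeffs, x_max=100.0):
    """Return the smallest x in (0, x_max] where the motion trajectory
    ym(x) = a0 + a1*x + a2*x^2 + a3*x^3 meets the vision lane-boundary
    polynomial yv(x) = b0 + b1*x + b2*x^2 + b3*x^3, or None if they do
    not cross within x_max. Coefficients are ordered (c0, c1, c2, c3)."""
    diff = np.asarray(a_coeffs, dtype=float) - np.asarray(b_coeffs, dtype=float)
    roots = np.roots(diff[::-1])  # np.roots expects highest-order coefficient first
    real = [r.real for r in roots if abs(r.imag) < 1e-9 and 0.0 < r.real <= x_max]
    return min(real) if real else None
```

Crossing points found in this way correspond conceptually to the intersection points of the predicted paths with the lane boundary that are used in the distance and TTLC determinations described below.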
As illustrated in
At step 902, the processor obtains ego vehicle motion trajectory information from the IMU and/or other vehicle sensors, such as described with respect to
At step 906, using the obtained vehicle motion trajectory and vehicle vision trajectory, the processor determines distances between one or more points on the vehicle, such as the vehicle points P, P′, P″ representing points at the right corner, the front center, and the left corner of the vehicle, respectively, and one or more intersection points, such as the intersection points Q, R, S of at least one lane boundary defined by the vision trajectory, as also described with respect to
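For example, assuming each distance is measured as arc length along the predicted trajectory y(x), it can be computed as:
d=∫[xs, xe] √(1+(dy/dx)²) dx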
where xs and xe represent the starting and ending points.
At step 908, the processor determines one or more TTLCs. For example, with the distances determined at step 906, and with a detected vehicle speed Vego, three time to line crossing (TTLC) values can be calculated as:
TTLCQ=dLookahead/Vego
TTLCR=dP′Q/Vego
TTLCS=dP″Q/Vego
where Vego>0 and TTLCQ is the earliest of the three TTLCs. These TTLCs include not only vision information but also prediction information of the ego vehicle motion. The TTLC determination described above does not require additional calculation of the distance to the lane boundary or the rate of departure.
At decision step 910, the processor determines if a calculated TTLC parameter is below a threshold, where the threshold is a predetermined amount of time. For example, TTLCQ as defined above can be used with a threshold as follows:
LDW=1(on) if TTLCQ<TTLCthreshold
As another example, the processor can combine the three TTLCs above with weighting factors, such as follows:
LDW=1(on) if wQ·TTLCQ+wR·TTLCR+wS·TTLCS<TTLCthreshold
where the tunable weights satisfy 0<wQ, wR, wS<1. More weight on wQ yields an earlier warning, and more weight on wS yields a later warning.
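A minimal sketch of this TTLC-based decision, assuming the three distances have already been determined as in step 906 and using illustrative threshold and weight values (none of the numeric defaults below come from this disclosure), could look like the following:

```python
def lane_departure_warning(d_q, d_r, d_s, v_ego,
                           ttlc_threshold=1.0, w_q=0.5, w_r=0.3, w_s=0.2):
    """Compute TTLCQ, TTLCR, TTLCS from the distances (m) to the
    intersection points and the ego speed (m/s), then set the LDW flag
    when the weighted combination falls below the threshold."""
    if v_ego <= 0.0:
        return False, (None, None, None)
    ttlc_q, ttlc_r, ttlc_s = d_q / v_ego, d_r / v_ego, d_s / v_ego
    # LDW = 1 (on) if wQ*TTLCQ + wR*TTLCR + wS*TTLCS < TTLCthreshold
    combined = w_q * ttlc_q + w_r * ttlc_r + w_s * ttlc_s
    return combined < ttlc_threshold, (ttlc_q, ttlc_r, ttlc_s)
```

Raising w_q relative to w_s makes the warning trigger earlier, consistent with the tuning guidance above.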
If, at decision step 910, the processor determines the TTLC parameter is not below the threshold, the process 900 loops back to step 902. If, at decision step 910, the processor determines the TTLC parameter is below the threshold, the process 900 moves to step 912. At step 912, the processor activates an LDW indicator such as an audible, visual or haptic warning indicator. The process 900 then loops back to step 902 to perform another iteration of the process to provide continuous lane departure detection and warning services. The warning indicators may be deactivated when a subsequent iteration of the process determines that vehicle lane departure is no longer imminent. The lane departure warning control signal may also be employed by a collision avoidance control.
The improved lane departure warning of the present disclosure helps predict and warn of the lane departure of the subject vehicle, even during high performance maneuvers. In some cases, this may allow a planning and control module to take control of at least vehicle steering and/or braking for a corrective action.
It may be advantageous to set forth definitions of certain words and phrases used throughout this patent document. The terms “include” and “comprise,” as well as derivatives thereof, mean inclusion without limitation. The term “or” is inclusive, meaning and/or. The phrase “associated with,” as well as derivatives thereof, may mean to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have, have a property of, have a relationship to or with, or the like. The phrase “at least one of,” when used with a list of items, means that different combinations of one or more of the listed items may be used, and only one item in the list may be needed. For example, “at least one of: A, B, and C” includes any of the following combinations: A, B, C, A and B, A and C, B and C, and A and B and C.
The description in this patent document should not be read as implying that any particular element, step, or function is an essential or critical element that must be included in the claim scope. Also, none of the claims is intended to invoke 35 U.S.C. § 112(f) with respect to any of the appended claims or claim elements unless the exact words “means for” or “step for” are explicitly used in the particular claim, followed by a participle phrase identifying a function. Use of terms such as (but not limited to) “mechanism,” “module,” “device,” “unit,” “component,” “element,” “member,” “apparatus,” “machine,” “system,” “processor,” “processing device,” or “controller” within a claim is understood and intended to refer to structures known to those skilled in the relevant art, as further modified or enhanced by the features of the claims themselves, and is not intended to invoke 35 U.S.C. § 112(f).
While this disclosure has described certain embodiments and generally associated methods, alterations and permutations of these embodiments and methods will be apparent to those skilled in the art. Accordingly, the above description of example embodiments does not define or constrain this disclosure. Other changes, substitutions, and alterations are also possible without departing from the spirit and scope of this disclosure, as defined by the following claims.