Sensor fusion for autonomous driving transition control

Abstract
A system for sensor fusion for autonomous driving transition control includes a sensor fusion module and a decision making module. The sensor fusion module fuses a plurality of steering sensor data from one or more sensors of a steering system with a plurality of driver state data from a plurality of driver state sensors as a plurality of fused state data aligned in time. The decision making module determines whether to transition from an autonomous driving mode to a manual driving mode based on the fused state data.
Description
BACKGROUND OF THE INVENTION

Advanced driver assistance systems (ADAS) and automated driving systems are beginning to use sensing systems to monitor driver state when a vehicle is being driven autonomously or near autonomously (semi-autonomously). These systems need to monitor the driver to ensure that the driver state is appropriate for the driving mode. Examples include built-in steering torque and position sensors to estimate driver input, steering wheel touch sensors to check for the presence of a driver's hands on the steering wheel for lane keeping assistance and similar functions, and camera monitoring to ensure the driver is sufficiently attentive for a hands-free driving condition, i.e., that the driver is not sleeping or taking his or her eyes off the road for more than a stipulated interval.


However, each of these sensing systems has limitations when transitioning from automated driving to manual driving. If the transition was not the intent of the driver, a safety hazard is created because the system would have relinquished control to a human who is not ready to take over. Consider an application in which built-in steering system torque and position sensors are used to detect driver input as a signal to override autonomous control and transition to manual control. If, in this situation, something other than the driver's hands were the source of the steering input (e.g., the driver's knee contacted the steering wheel), the system could end up transitioning to manual driving contrary to the driver's intent, thus creating a hazard.


SUMMARY OF THE INVENTION

A system for sensor fusion for autonomous driving transition control includes a sensor fusion module and a decision making module. The sensor fusion module fuses a plurality of steering sensor data from one or more sensors of a steering system with a plurality of driver state data from a plurality of driver state sensors as a plurality of fused state data aligned in time. The decision making module determines whether to transition from an autonomous driving mode to a manual driving mode based on the fused state data.


A steering system includes one or more sensors operable to produce a plurality of steering sensor data, a plurality of driver state sensors operable to produce a plurality of driver state data, a steering actuator motor, and a control module. The control module is operable to operate the steering actuator motor in an autonomous driving mode and in a manual driving mode. The control module is further operable to fuse the steering sensor data with the driver state data as a plurality of fused state data aligned in time and determine whether to transition from the autonomous driving mode to the manual driving mode based on the fused state data.


A method for sensor fusion for autonomous driving transition control includes acquiring, by a control module, a plurality of steering sensor data from one or more sensors of a steering system and acquiring a plurality of driver state data from a plurality of driver state sensors. The steering sensor data are fused with the driver state data as a plurality of fused state data aligned in time. The control module determines whether to transition from an autonomous driving mode to a manual driving mode based on the fused state data.


These and other advantages and features will become more apparent from the following description taken in conjunction with the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS

The subject matter which is regarded as the invention is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features and advantages of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:



FIG. 1 illustrates a functional block diagram of a vehicle including a steering system in accordance with some embodiments;



FIG. 2 illustrates a system for sensor fusion for autonomous driving transition control in accordance with some embodiments;



FIG. 3 illustrates a process for sensor fusion for autonomous driving transition control in accordance with some embodiments; and



FIG. 4 illustrates a process for autonomous driving transition confirmation in accordance with some embodiments.





DETAILED DESCRIPTION

Referring now to the Figures, where the invention will be described with reference to specific embodiments without limiting the same, FIG. 1 illustrates an exemplary embodiment of a vehicle 10 including a steering system 12. In various embodiments, the steering system 12 includes a handwheel 14 coupled to a steering shaft 16. In the exemplary embodiment shown, the steering system 12 is an electric power steering (EPS) system that further includes a steering assist unit 18 coupled to the steering shaft 16 of the steering system 12 and to a left tie rod 20 and a right tie rod 22 of the vehicle 10. It should be noted that the steering system 12 may also be a rack assist EPS (REPS). The steering assist unit 18 includes, for example, a rack and pinion steering mechanism (not shown) that may be coupled through the steering shaft 16 to a steering actuator motor 19 and gearing. During operation, as the handwheel 14 is turned by a vehicle operator, the steering actuator motor 19 provides assistance to move the left tie rod 20 and the right tie rod 22, which in turn move left and right steering knuckles 24, 26, respectively. The left knuckle 24 is coupled to a left roadway wheel 28, and the right knuckle 26 is coupled to a right roadway wheel 30 of the vehicle 10.


As shown in FIG. 1, the vehicle 10 further includes various sensors 31-36 that detect and measure signals of the steering system 12, of the vehicle 10, and of driver attentiveness. The sensors 31-36 generate sensor signals based on the measured/observed signals. In one embodiment, a handwheel torque sensor 31 is provided for sensing a torque placed on the handwheel 14. In the exemplary embodiment as shown, the handwheel torque sensor 31 is placed on the handwheel 14; however, it is to be understood that the handwheel torque sensor 31 may not always be placed near or on the handwheel 14. In one embodiment, a motor position/velocity sensor 32 senses motor position and/or velocity, and a handwheel position/velocity sensor 33 senses handwheel position and/or velocity. In addition, the vehicle 10 may include a wheel speed sensor 34 to assist in measuring vehicle speed. In some embodiments, one or more handwheel touch sensors 35 measure a grip force or pressure on the handwheel 14 at various locations that can be detected as an area of contact with the handwheel 14, a width of contact with the handwheel 14, a force of contact with the handwheel 14, and/or a position of contact with the handwheel 14. Data from the one or more handwheel touch sensors 35 can include magnitude in combination with angular position. A camera 36 can detect one or more of: a driver body posture, a driver head pose, a driver eye gaze, and a driver hand position. The camera 36 can be mounted in any suitable location to monitor the driver's body and/or face and/or eyes and/or hands. For instance, the camera 36 can be mounted in a steering column, an instrument panel, an A-pillar, or an overhead console. In some embodiments, multiple cameras 36 are utilized to collect image data from various angles/locations. The one or more handwheel touch sensors 35 and the camera 36 are also referred to as driver state sensors, which collect information about the attentiveness of the driver.


A control module 40 controls the operation of the steering system 12 based on one or more of the sensor signals and further based on the steering control systems and methods of the present disclosure. The control module 40 generates a command signal to control the steering actuator motor 19 of the steering system 12 based on one or more of the inputs and further based on the steering control systems and methods of the present disclosure. The steering control systems and methods of the present disclosure fuse state data aligned in time from two or more of the sensors 31-36 to determine whether to transition from an autonomous driving mode to a manual driving mode based on the fused state data. The control module 40 can be embodied in one or more controllers. It will be appreciated that such a system is capable of operating in both an autonomous driving mode and a manual driving mode; a vehicle system capable of operating in both modes is a semi-autonomous vehicle.



FIG. 2 illustrates a system 100 for sensor fusion for autonomous driving transition control according to an embodiment. The system 100 includes the control module 40 and receives data from two or more of the sensors 31-36 of FIG. 1. In various embodiments, the control module 40 can include one or more sub-modules and datastores, such as a sensor fusion module 102, a decision making module 104, a recent data store 106, and a history data store 108. As used herein, the terms module and sub-module refer to an application specific integrated circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and memory that executes one or more software or firmware programs, a combinational logic circuit, or other suitable components that provide the described functionality. As can be appreciated, the control module 40 shown in FIG. 2 may be further partitioned and include additional control elements known in the art of steering control systems. For instance, the control module 40 or another module (not depicted) of the vehicle 10 of FIG. 1 can implement known techniques for automated steering control of the steering system 12 of FIG. 1.


Inputs to the control module 40 may be generated from the sensors 31-36 (FIG. 1) of the vehicle 10 (FIG. 1) as well as other sensors (not depicted). In addition, the inputs may be received from other control modules (not shown) within the vehicle 10 (FIG. 1), and may be modeled or predefined. Steering sensor data 110 from one or more of the sensors 31, 32, and 33 (FIG. 1) of the steering system 12 and driver state data 112 from one or more of driver state sensors 35 and 36 (FIG. 1) can be provided to the sensor fusion module 102 to fuse the steering sensor data 110 and the driver state data 112 as a plurality of fused state data 114 aligned in time. For example, the steering sensor data 110 can include sensed or derived data from one or more of the handwheel torque sensor 31 as EPS torque 116 and the handwheel position/velocity sensor 33 as EPS position 118. The driver state data 112 can include sensed or derived data from one or more of the handwheel touch sensors 35 as touch/proximity data 120 and the camera 36 as image data 122. The touch/proximity data 120 can include a variety of sensed or derived data such as an area of contact with the handwheel 14 (FIG. 1), a width of contact with the handwheel 14, a force of contact with the handwheel 14, and/or a position of contact with the handwheel 14. The image data 122 can be used to determine various driver engagement indicators such as a driver body posture (e.g., straight, left, right, bent down, obstructed), a driver head pose (e.g., left, right, up, down, obstructed), a driver eye gaze (e.g., direction such as left/right/up/down or targeting specific objects such as road, instrument cluster, center stack, rear-view mirror, side-view mirror, with a level of obstruction determined for either or both eyes), and/or a driver hand position (e.g., both hands visible and off handwheel 14, one hand at an observed position on handwheel 14, both hands at observed positions on handwheel 14, both hands not visible or obstructed, etc.).
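As a concrete illustration of the data flow just described, the following Python sketch fuses one steering sample with one driver state sample into a single time-aligned fused record. The class names, fields, and the 20 ms alignment tolerance are illustrative assumptions rather than part of the disclosure.

```python
# Minimal sketch of time-aligned fusion of steering sensor data (110) and
# driver state data (112) into fused state data (114). All names are assumed.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SteeringSample:
    t: float             # timestamp, seconds
    eps_torque: float    # handwheel torque (116), Nm
    eps_position: float  # handwheel angle (118), rad

@dataclass
class DriverStateSample:
    t: float
    touch_area: float      # contact area from touch/proximity data (120)
    touch_force: float     # contact force on the handwheel
    eyes_on_road: bool     # derived from camera image data (122)
    hands_on_wheel: bool   # derived from camera image data (122)

@dataclass
class FusedState:
    t: float
    eps_torque: float
    eps_position: float
    touch_area: float
    touch_force: float
    eyes_on_road: bool
    hands_on_wheel: bool

def fuse(steering: SteeringSample,
         driver: DriverStateSample,
         max_skew_s: float = 0.02) -> Optional[FusedState]:
    """Combine one steering sample and one driver state sample into a fused
    record, only if their timestamps are close enough to be treated as
    aligned in time (the 20 ms tolerance is an assumed value)."""
    if abs(steering.t - driver.t) > max_skew_s:
        return None  # not aligned; caller would re-sample or interpolate
    t = 0.5 * (steering.t + driver.t)
    return FusedState(t, steering.eps_torque, steering.eps_position,
                      driver.touch_area, driver.touch_force,
                      driver.eyes_on_road, driver.hands_on_wheel)
```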


The recent data store 106 can hold a buffer of recently acquired values of the fused state data 114, for instance, a most recent 200 millisecond data fusion period. The history data store 108 collects older values of the fused state data 114 than collected in the recent data store 106, for instance, a 200 millisecond to a five second data fusion period. In some embodiments, the recent data store 106 can hold values of the fused state data 114 for a most recent 100 millisecond data fusion period, and the history data store 108 collects values of the fused state data 114 older than 100 milliseconds. Alternate data timing splits between the recent data store 106 and the history data store 108 are contemplated in other embodiments. In some embodiments, the recent data store 106 and the history data store 108 are combined in a single shared buffer, such as a circular buffer, where a first set of pointers identifies the locations of the recent data store 106 and a second set of pointers identifies the locations of the history data store 108.
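The shared-buffer arrangement described above can be illustrated with a short sketch in which a single bounded buffer is partitioned by sample age into a recent window and a history window. The window lengths follow the example periods given in the text; the class and method names are assumptions.

```python
# Minimal sketch of one buffer shared by the recent data store (106) and the
# history data store (108), split by sample age rather than by pointer sets.
from collections import deque

class FusedStateBuffer:
    def __init__(self, recent_window_s: float = 0.2, history_window_s: float = 5.0):
        self.recent_window_s = recent_window_s
        self.history_window_s = history_window_s
        self._samples = deque()  # (timestamp, FusedState) pairs, oldest first

    def push(self, fused_state, now: float) -> None:
        """Append the newest fused state and drop anything older than the
        history window, so the deque behaves like a bounded circular buffer."""
        self._samples.append((now, fused_state))
        while self._samples and now - self._samples[0][0] > self.history_window_s:
            self._samples.popleft()

    def recent(self, now: float):
        """Fused states acquired within the most recent data fusion period."""
        return [s for t, s in self._samples if now - t <= self.recent_window_s]

    def history(self, now: float):
        """Older fused states, between the recent window and the history window."""
        return [s for t, s in self._samples
                if self.recent_window_s < now - t <= self.history_window_s]
```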


The decision making module 104 determines whether to transition from an autonomous driving mode to a manual driving mode, expressed as a mode command 124, based on the fused state data 114. In embodiments, the decision making module 104 can average multiple samples of the fused state data 114 from the history data store 108 as time-averaged fused state data and determine whether to transition from the autonomous driving mode to the manual driving mode based on the fused state data 114 from the recent data store 106 in combination with the time-averaged fused state data. For example, transition criteria 126 can define a truth table covering use cases for the recent data store 106 in combination with the history data store 108 to determine an action to be taken, such as a change of state of the mode command 124 (e.g., transition to manual driving mode). Time-averaged fused state data of the history data store 108 can be used as a baseline state to filter noise, and relative changes of the fused state data 114 from the recent data store 106 can indicate a likely transition request. For instance, a combination of state values of body position, face/head position, eye gaze, hand position, and EPS input sensed from EPS torque 116 and/or EPS position 118 can be determined from both the recent data store 106 and the history data store 108 to interpret driver intent in making the decision to relinquish automatic control and transition to manual steering. A combination of a straight body position and/or a straight face/head position and/or an on-road eye gaze and/or an on-wheel hand position, together with a sensed EPS input, can confirm that the driver is ready to transition from automatic control to manual control. In another embodiment, a weighted-average alert index can be computed as: Alert index = w1 × (body boolean) + w2 × (face/head position boolean) + w3 × (eye gaze boolean) + w4 × (hands-on-wheel boolean), and the alert index can be passed through a low-pass filter. If automatic control is not relinquished soon enough, the driver may be opposed by excess steering efforts from the steering system 12, resulting in increased difficulty in manually controlling steering of the vehicle 10. To ensure that control is not erroneously relinquished, an autonomous driving transition confirmation process can also be performed as further described herein.
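A minimal sketch of the weighted alert index and low-pass filter described above follows; the weights, filter coefficient, and readiness threshold are assumed tuning values only, and the threshold test stands in for the transition criteria 126.

```python
# Minimal sketch of the weighted Alert index with an exponential low-pass
# filter. Weights, alpha, and threshold are assumptions, not disclosed values.
class AlertIndexFilter:
    def __init__(self, w_body=0.2, w_head=0.2, w_gaze=0.3, w_hands=0.3,
                 alpha=0.1, threshold=0.7):
        self.w = (w_body, w_head, w_gaze, w_hands)
        self.alpha = alpha          # low-pass filter coefficient (0..1)
        self.threshold = threshold  # minimum filtered index to allow manual mode
        self.filtered = 0.0

    def update(self, body_ok: bool, head_ok: bool,
               gaze_ok: bool, hands_ok: bool) -> float:
        """Alert index = w1*body + w2*head + w3*gaze + w4*hands, then low-pass."""
        raw = (self.w[0] * body_ok + self.w[1] * head_ok +
               self.w[2] * gaze_ok + self.w[3] * hands_ok)
        self.filtered += self.alpha * (raw - self.filtered)
        return self.filtered

    def driver_ready(self, eps_input_sensed: bool) -> bool:
        """Allow the transition to manual driving only when the driver appears
        attentive and a steering (EPS) input has actually been sensed."""
        return eps_input_sensed and self.filtered >= self.threshold
```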



FIG. 3 illustrates a process 200 for sensor fusion for autonomous driving transition control. Process 200 is described in further reference to FIGS. 1 and 2. The process 200 can be performed by the control module 40 of FIGS. 1 and 2. At block 202, the control module 40 acquires a plurality of steering sensor data 110 from one or more sensors 31-33 of the steering system 12. At block 204, the control module 40 acquires a plurality of driver state data 112 from a plurality of driver state sensors 35, 36. At block 206, the sensor fusion module 102 fuses the steering sensor data 110 with the driver state data 112 as a plurality of fused state data 114 aligned in time. At block 208, the decision making module 104 of the control module 40 determines whether to transition from an autonomous driving mode to a manual driving mode based on the fused state data 114. The decision making module 104 can access the fused state data 114 stored in the recent data store 106 and in the history data store 108 (e.g., as time-averaged fused state data) for comparison against the transition criteria 126 to determine the mode command 124.
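One pass through process 200 can be sketched by combining the illustrative helpers from the earlier sketches; the steering and driver state samples are assumed to have already been acquired at blocks 202 and 204, and the 1 Nm torque threshold and hard-coded posture booleans are assumptions made only to keep the sketch short.

```python
# Minimal sketch of one iteration of process 200, reusing the illustrative
# FusedStateBuffer, AlertIndexFilter, fuse(), and sample classes from above.
def process_200_step(buffer: FusedStateBuffer,
                     alert: AlertIndexFilter,
                     steering: SteeringSample,     # block 202: already acquired
                     driver: DriverStateSample,    # block 204: already acquired
                     now: float) -> str:
    fused = fuse(steering, driver)                 # block 206: time-aligned fusion
    if fused is None:
        return "AUTONOMOUS"                        # not aligned; keep current mode
    buffer.push(fused, now)                        # feed recent/history stores (106, 108)

    # Block 208: update the attentiveness estimate and decide the mode
    # command (124). Body/head booleans would come from image data (122);
    # they are fixed True here only for brevity.
    alert.update(body_ok=True, head_ok=True,
                 gaze_ok=fused.eyes_on_road,
                 hands_ok=fused.hands_on_wheel)
    eps_input = abs(fused.eps_torque) > 1.0        # assumed 1 Nm torque threshold
    return "MANUAL" if alert.driver_ready(eps_input) else "AUTONOMOUS"
```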



FIG. 4 illustrates a process 300 for autonomous driving transition confirmation. Process 300 is described in further reference to FIGS. 1-3. Process 300 can be performed in combination with process 200 of FIG. 3. At block 302, the fused state data 114 is monitored by the decision making module 104 to determine whether a change to the mode command 124 is intended by the driver. At block 304, if the decision making module 104 determines that no switch to manual mode should be performed based on comparing the fused state data 114 from the recent data store 106 in combination with the history data store 108 against the transition criteria 126, the process 300 returns to block 302 to continue monitoring the fused state data 114. If a switch to manual mode is determined at block 304, then a transition to manual driving is initiated at block 306, for instance, by reducing automated steering control inputs applied to the steering actuator motor 19. At block 308, the decision making module 104 continues to monitor the fused state data 114 after the transition from the autonomous driving mode to the manual driving mode has been initiated. At block 310, the decision making module 104 confirms whether to revert to the autonomous driving mode based on transition validity criteria in the transition criteria 126. For instance, if the fused state data 114 in the recent data store 106 and/or the history data store 108 indicates that the driver is not in an attentive state or that inputs have become obstructed, the decision making module 104 can restore autonomous control and transition back to the autonomous driving mode at block 312. If the driver remains attentive for a predetermined period of time after the transition occurs (e.g., a period of time greater than or equal to the time storage capacity of the history data store 108), then the decision is confirmed and the transition to manual driving is finalized at block 314.
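The confirmation flow of process 300 can be sketched as a small state machine that initiates the transition, keeps monitoring, reverts on loss of attentiveness or obstruction, and finalizes after a hold time; the state names and the five second hold time (chosen to match the example history window) are assumptions.

```python
# Minimal sketch of the process 300 confirmation logic as a state machine.
class TransitionConfirmation:
    AUTONOMOUS, PENDING_MANUAL, MANUAL = "AUTONOMOUS", "PENDING_MANUAL", "MANUAL"

    def __init__(self, hold_time_s: float = 5.0):
        self.state = self.AUTONOMOUS
        self.hold_time_s = hold_time_s   # >= history data store capacity (assumed)
        self._t_initiated = None

    def step(self, now: float, switch_requested: bool,
             driver_attentive: bool, inputs_obstructed: bool) -> str:
        if self.state == self.AUTONOMOUS:
            if switch_requested:                        # blocks 304/306: initiate
                self.state = self.PENDING_MANUAL
                self._t_initiated = now
        elif self.state == self.PENDING_MANUAL:         # blocks 308/310: monitor
            if (not driver_attentive) or inputs_obstructed:
                self.state = self.AUTONOMOUS            # block 312: revert
            elif now - self._t_initiated >= self.hold_time_s:
                self.state = self.MANUAL                # block 314: finalize
        return self.state
```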


While the invention has been described in detail in connection with only a limited number of embodiments, it should be readily understood that the invention is not limited to such disclosed embodiments. Rather, the invention can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the invention. Additionally, while various embodiments of the invention have been described, it is to be understood that aspects of the invention may include only some of the described embodiments. Accordingly, the invention is not to be seen as limited by the foregoing description.

Claims
  • 1. A system for sensor fusion for autonomous driving transition control, the system comprising: a processor operable to receive a plurality of steering sensor data from one or more sensors of a steering system and receive a plurality of driver state data from a plurality of driver state sensors, the processor configured to: fuse the steering sensor data with the driver state data as a plurality of fused state data aligned in time; determine whether to transition from an autonomous driving mode to a manual driving mode based on the fused state data; command a steering motor actuator of the steering system autonomously in the autonomous driving mode; command the steering motor actuator responsive to a driver input in the manual driving mode; determine an attentive state of a driver based on time-averaging of the fused state data; continue monitoring the fused state data after the transition from the autonomous driving mode to the manual driving mode has been initiated to determine an attentive state of a driver based on time-averaging of the fused state data; and revert from the manual driving mode to the autonomous driving mode based on determining that the attentive state of the driver no longer meets one or more transition validity criteria.
  • 2. The system of claim 1, wherein the steering sensor data comprise sensed or derived data from one or more of a handwheel torque sensor and a handwheel position/velocity sensor.
  • 3. The system of claim 1, wherein the driver state data comprise sensed or derived data from one or more of a handwheel touch sensor and a camera.
  • 4. The system of claim 3, wherein the driver state data sensed or derived from the handwheel touch sensor comprises one or more of: an area of contact with a handwheel, a width of contact with the handwheel, a force of contact with the handwheel, and a position of contact with the handwheel, and wherein the driver state data sensed or derived from the camera comprises one or more of: a driver body posture, a driver head pose, a driver eye gaze, and a driver hand position.
  • 5. The system of claim 1, further comprising a recent data store of the fused state data and a history data store of the fused state data, wherein the history data store is operable to collect older values of the fused state data than collected in the recent data store.
  • 6. The system of claim 5, wherein the processor is configured to average multiple samples of the fused state data from the history data store as time-averaged fused state data and determine whether to transition from the autonomous driving mode to the manual driving mode based on the fused state data from the recent data store in combination with the time-averaged fused state data.
  • 7. The system of claim 1, wherein the processor is further configured to revert from the manual driving mode to the autonomous driving mode based on determining that an input from one or more of the driver state sensors is obstructed while monitoring the fused state data after the transition from the autonomous driving mode to the manual driving mode has been initiated.
  • 8. A steering system comprising: one or more sensors operable to produce a plurality of steering sensor data; a plurality of driver state sensors operable to produce a plurality of driver state data; a steering actuator motor; and a control module comprising a processor operable to command the steering actuator motor in an autonomous driving mode and in a manual driving mode, the control module further operable to fuse the steering sensor data with the driver state data as a plurality of fused state data aligned in time, determine whether to transition from the autonomous driving mode to the manual driving mode based on the fused state data, determine an attentive state of a driver based on time-averaging of the fused state data, continue monitoring the fused state data after the transition from the autonomous driving mode to the manual driving mode has been initiated to determine an attentive state of a driver based on time-averaging of the fused state data, and revert from the manual driving mode to the autonomous driving mode based on determining that the attentive state of the driver no longer meets one or more transition validity criteria.
  • 9. The steering system of claim 8, wherein the steering sensor data comprise sensed or derived data from one or more of a handwheel torque sensor and a handwheel position/velocity sensor.
  • 10. The steering system of claim 8, wherein the driver state data comprise sensed or derived data from one or more of a handwheel touch sensor and a camera.
  • 11. The steering system of claim 10, wherein the driver state data sensed or derived from the handwheel touch sensor comprises one or more of: an area of contact with a handwheel, a width of contact with the handwheel, a force of contact with the handwheel, and a position of contact with the handwheel, and wherein the driver state data sensed or derived from the camera comprises one or more of: a driver body posture, a driver head pose, a driver eye gaze, and a driver hand position.
  • 12. The steering system of claim 8, wherein the control module comprises a recent data store of the fused state data and a history data store of the fused state data, wherein the history data store is operable to collect older values of the fused state data than collected in the recent data store.
  • 13. The steering system of claim 12, wherein the control module is operable to average multiple samples of the fused state data from the history data store as time-averaged fused state data and determine whether to transition from the autonomous driving mode to the manual driving mode based on the fused state data from the recent data store in combination with the time-averaged fused state data.
  • 14. The steering system of claim 8, wherein the processor is further configured to revert from the manual driving mode to the autonomous driving mode based on determining that an input from one or more of the driver state sensors is obstructed while monitoring the fused state data after the transition from the autonomous driving mode to the manual driving mode has been initiated.
  • 15. A method for sensor fusion for autonomous driving transition control, the method comprising: acquiring, by a processor of a control module, a plurality of steering sensor data from one or more sensors of a steering system; acquiring, by the processor of the control module, a plurality of driver state data from a plurality of driver state sensors; fusing, by the processor of the control module, the steering sensor data with the driver state data as a plurality of fused state data aligned in time; determining, by the processor of the control module, whether to transition from an autonomous driving mode to a manual driving mode based on the fused state data; commanding, by the processor of the control module, a steering motor actuator of the steering system autonomously in the autonomous driving mode; commanding, by the processor of the control module, the steering motor actuator responsive to a driver input in the manual driving mode; determining, by the processor of the control module, an attentive state of a driver based on time-averaging of the fused state data; continuing monitoring, by the processor of the control module, the fused state data after the transition from the autonomous driving mode to the manual driving mode has been initiated to determine an attentive state of a driver based on time-averaging of the fused state data; and reverting, by the processor of the control module, from the manual driving mode to the autonomous driving mode based on determining that the attentive state of the driver no longer meets one or more transition validity criteria.
  • 16. The method of claim 15, wherein the steering sensor data comprise sensed or derived data from one or more of a handwheel torque sensor and a handwheel position/velocity sensor.
  • 17. The method of claim 15, wherein the driver state data comprise sensed or derived data from one or more of a handwheel touch sensor and a camera, wherein the driver state data sensed or derived from the handwheel touch sensor comprises one or more of: an area of contact with a handwheel, a width of contact with the handwheel, a force of contact with the handwheel, and a position of contact with the handwheel, and wherein the driver state data sensed or derived from the camera comprises one or more of: a driver body posture, a driver head pose, a driver eye gaze, and a driver hand position.
  • 18. The method of claim 15, further comprising: storing the fused state data in a recent data store; and storing the fused state data in a history data store, wherein the history data store collects older values of the fused state data than collected in the recent data store.
  • 19. The method of claim 18, further comprising: averaging multiple samples of the fused state data from the history data store as time-averaged fused state data; and determining whether to transition from the autonomous driving mode to the manual driving mode based on the fused state data from the recent data store in combination with the time-averaged fused state data.
  • 20. The method of claim 15, further comprising: reverting from the manual driving mode to the autonomous driving mode based on determining that an input from one or more of the driver state sensors is obstructed while monitoring the fused state data after the transition from the autonomous driving mode to the manual driving mode has been initiated.