Retractable handwheel gesture control

Information

  • Patent Grant
  • Patent Number
    10,442,441
  • Date Filed
    Tuesday, June 14, 2016
  • Date Issued
    Tuesday, October 15, 2019
Abstract
An embodiment of a control system includes a sensing element configured to detect a gesture from at least one user in a vehicle including a handwheel, and a gesture control module configured to receive gesture information from the sensing element and control at least one of the handwheel and the vehicle based on the gesture information.
Description
BACKGROUND OF THE INVENTION

Retractable and/or stowable handwheels may be available with the introduction of advanced driver assistance systems (ADAS) and autonomous vehicle systems. Drivers may want a handwheel to be retracted or moved (e.g., upon request) in situations such as when a vehicle is in an autonomous mode. The handwheel may also be brought into a standard driving position when the driver wishes to steer the vehicle.


SUMMARY OF THE INVENTION

In accordance with one aspect of the invention, an embodiment of a control system includes a sensing element configured to detect a gesture from at least one user in a vehicle including a handwheel, and a gesture control module configured to receive gesture information from the sensing element and control at least one of the handwheel and the vehicle based on the gesture information.


In accordance with another aspect of the invention, an embodiment of a method of controlling an aspect of a vehicle includes detecting, by a sensing element, a gesture from at least one user in a vehicle including a handwheel, receiving gesture information from the sensing element by a gesture control module, and controlling at least one of the handwheel and the vehicle based on the gesture information.


These and other advantages and features will become more apparent from the following description taken in conjunction with the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS

The subject matter which is regarded as the invention is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features and advantages of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:



FIG. 1 depicts aspects of a vehicle and vehicle steering assist and/or control system;



FIG. 2 depicts a steering gesture control system in accordance with one aspect of the invention;



FIG. 3 depicts an embodiment of a three-dimensional holographic display and aspects of a steering gesture control system; and



FIG. 4 is a flow diagram depicting a method of controlling a handwheel and/or a vehicle based on gesture recognition.





DETAILED DESCRIPTION

Referring now to FIG. 1, where the invention will be described with reference to specific embodiments without limiting same, an exemplary embodiment of a vehicle 10 including a steering system 12 such as an electric power steering (EPS) system, an automated driver assist system and/or an automated vehicle control system is illustrated. In various embodiments, the steering system 12 includes a handwheel 14 coupled to a steering shaft 16. In the exemplary embodiment shown, the steering system 12 is an EPS system and/or automated driver assist system that further includes a steering assist unit 18 that couples to the steering shaft 16 of the steering system 12 and to tie rods 20, 22 of the vehicle 10. The steering assist unit 18 includes, for example, a steering actuator motor (e.g., an electric motor) and a rack and pinion steering mechanism (not shown) that may be coupled through the steering shaft 16 to the steering actuator motor and gearing. During operation, as the handwheel 14 is turned by a vehicle operator, the motor of the steering assist unit 18 provides assistance to move the tie rods 20, 22, which in turn move steering knuckles 24, 26, respectively, coupled to roadway wheels 28, 30, respectively, of the vehicle 10. In one embodiment, the steering assist unit 18 can autonomously control steering using the actuator motor, for example, for parking assist or autonomous driving.


It is noted that the embodiments described herein are not limited to the steering system shown in FIG. 1, and may be used in conjunction with any suitable steering or control system. Examples of control systems that can be used with the embodiments described herein include Active Front Steering (AFS) systems and hydraulic steering systems. Other control systems that can be used with the embodiments include systems configured to control vehicle functions without a handwheel. An example of such a system is a by-wire system such as a Drive by Wire, Steer by Wire or X-by-Wire system.


In one embodiment, the handwheel 14 is moveable to allow a controller and/or user to move the handwheel 14 between various positions. For example, the handwheel 14 may be moveable to one or more positions at which a driver can operate the handwheel 14 and steer the vehicle (referred to as “driving positions”). The handwheel 14 may also be retracted or moved to a stowed position (e.g., during autonomous operation). The handwheel 14 can be moved using a variety of mechanisms, such as a hinged or pivoting portion of the steering shaft 16, a telescoping or retractable portion of the steering shaft 16, or any other suitable mechanism.


As shown in FIG. 1, the vehicle 10 further includes various sensors that detect and measure observable conditions of the steering system 12 and/or of the vehicle 10. The sensors generate sensor signals based on the observable conditions. In the example shown, sensors 32 are wheel speed sensors that sense a rotational speed of the wheels 28 and 30, respectively, and generate wheel speed signals based thereon. In other examples, other wheel speed sensors can be provided in addition to, or as an alternative to, the sensors 32. The other wheel speed sensors may sense a rotational speed of rear wheels 36 and generate sensor signals based thereon. As can be appreciated, other wheel sensors that sense wheel movement, such as wheel position sensors, may be used in place of the wheel speed sensors. In such a case, a wheel velocity and/or vehicle velocity or speed may be calculated based on the wheel sensor signal.
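
As a rough illustration of that last computation, the sketch below (not from the patent; the rolling radius, sample rate, and function names are assumptions) differentiates wheel position samples into wheel speed and scales by the rolling radius to estimate vehicle speed:

```python
# Illustrative only: estimating wheel and vehicle speed from wheel position
# sensors, per the paragraph above. Radius and sample rate are assumed values.
ROLLING_RADIUS_M = 0.33   # assumed tire rolling radius
SAMPLE_PERIOD_S = 0.01    # assumed 100 Hz sensor sampling

def wheel_speed(theta_prev_rad: float, theta_now_rad: float) -> float:
    """Angular speed (rad/s) from two successive wheel position samples."""
    return (theta_now_rad - theta_prev_rad) / SAMPLE_PERIOD_S

def vehicle_speed(wheel_speeds_rad_s: list[float]) -> float:
    """Vehicle speed (m/s) as the average wheel speed times rolling radius."""
    return ROLLING_RADIUS_M * sum(wheel_speeds_rad_s) / len(wheel_speeds_rad_s)

# Wheels 28 and 30 each advancing ~0.303 rad per 10 ms sample -> ~10 m/s.
print(vehicle_speed([wheel_speed(0.0, 0.303), wheel_speed(0.1, 0.403)]))
```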


The vehicle also includes sensors for detecting the position and/or movement of the handwheel 14. In one embodiment, the handwheel includes a torque sensor and/or a position sensor (collectively referred to as handwheel sensor 34). The handwheel sensor 34 can sense a torque placed on the handwheel 14 and/or sense the angular position of the handwheel 14. Other sensors include sensors for detecting the position (motor position) and rotational speed (motor velocity or motor speed) of the steering actuator motor or other motor associated with the steering assist unit 18.


A control module 40 controls the operation of the steering system 12 based on one or more sensor signals and further based on the steering control systems and methods of the present disclosure. The control module may be used as part of an EPS system to provide steering assist torque and/or may be used as a driver assistance system that can control steering of the vehicle (e.g., for parking assist, emergency steering control and/or autonomous or semi-autonomous steering control). An example of a driver assistance system is an advanced driver assistance system (ADAS) that, instead of or in addition to directly assisting the driver (by reducing steering effort), can also accept a position command from another control system to achieve directional control of the vehicle in certain conditions.
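
A minimal sketch of the two roles described for the control module 40, under stated assumptions: the simple proportional tracking law and both gains are placeholders, not values or methods from the patent. In assist mode it scales driver torque; in position-command mode it tracks an angle commanded by another controller:

```python
from enum import Enum, auto

class Mode(Enum):
    ASSIST = auto()            # EPS-style torque assist
    POSITION_COMMAND = auto()  # ADAS-style directional control

def motor_torque(mode: Mode, driver_torque_nm: float,
                 angle_rad: float = 0.0, commanded_angle_rad: float = 0.0) -> float:
    ASSIST_GAIN = 2.0   # assumed assist gain
    KP = 5.0            # assumed proportional tracking gain
    if mode is Mode.ASSIST:
        return ASSIST_GAIN * driver_torque_nm        # reduce driver effort
    return KP * (commanded_angle_rad - angle_rad)    # track external command

print(motor_torque(Mode.ASSIST, 1.5))                       # -> 3.0
print(motor_torque(Mode.POSITION_COMMAND, 0.0, 0.1, 0.25))  # -> 0.75
```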


Generally speaking, the steering control systems and methods of the present disclosure can be used to control the position of a handwheel or steering wheel and/or provide directional control of a vehicle (either autonomously, semi-autonomously or by providing torque or steering assist) based on one or more gestures performed by a user (e.g., driver or passenger). The control module 40 or other suitable processing device or system senses user gestures and performs various control actions. In one embodiment, the control module responds to appropriate gestures to move the handwheel 14 between one or more driving positions and/or between a driving position and a retracted or stowed position. The control module 40 may also respond to gestures to control steering of the vehicle to allow a user to control the vehicle without physically engaging the handwheel 14.


Aspects of embodiments described herein may be performed by any suitable control system and/or processing device, such as the steering assist unit 18 and/or the control module 40. In one embodiment, the control module 40 is or is included as part of an autonomous driving system.



FIG. 2 shows an embodiment of a control system 50 that includes a gesture control module 52 and a sensing element 54. The control system 50 is configured to control aspects of a vehicle such as vehicle speed, vehicle steering and/or positioning of a handwheel. The control system 50 also includes an interface 56 that includes a display that provides control information to a user. The control system 50 may be incorporated as part of the control module 40 or any other suitable component of a vehicle. In addition, aspects of the control system 50 can be incorporated into a device or system that is separate from the vehicle, such as a smartphone or other portable device, which can communicate with the vehicle (e.g., by a plug-in connection or wireless connection).


The gesture control module 52 may control steering and/or handwheel functions, such as retracting the handwheel 14 to a retracted or stowed position, redeploying the handwheel 14 to a driving position, and moving the handwheel 14 between multiple driving positions. For example, a driver waving both hands with palms toward the dash (or other location of the sensing element 54) may indicate to the control system 50 to retract the handwheel 14. Similarly, a driver may wave both hands toward the dash (or other location of the sensing element 54) to deploy the handwheel 14 into the driving position. The sensing element 54 is positioned so as to view the driver's hand or hands, and may be a video camera, light sensor, motion sensor or other type of sensor sufficient to recognize driver gestures. A location for the camera may be in the center of the handwheel 14, or to one or more sides of the interface 56 located on the dashboard or on the center of the handwheel 14, although the location is not limited to any specific embodiments described herein.
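
One plausible reading of the two wave gestures above, written as a tiny classifier. The observation fields (hand visibility, palm orientation, wave direction) are assumptions about what a camera pipeline would report, not an interface from the patent:

```python
from dataclasses import dataclass

@dataclass
class HandsObservation:
    both_hands_visible: bool
    palms_toward_sensor: bool    # palms facing the dash / sensing element 54
    waving_toward_sensor: bool   # wave motion directed at the dash

def handwheel_gesture(obs: HandsObservation) -> str | None:
    """Map a two-hand wave to a handwheel command, per the description above."""
    if not obs.both_hands_visible:
        return None
    if obs.waving_toward_sensor:
        return "DEPLOY_HANDWHEEL"    # wave toward the dash -> driving position
    if obs.palms_toward_sensor:
        return "RETRACT_HANDWHEEL"   # palms-out wave -> stow the handwheel
    return None

print(handwheel_gesture(HandsObservation(True, True, False)))  # RETRACT_HANDWHEEL
```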


As described herein, a “gesture” refers to any movement by a user, driver or operator that can be recognized by the gesture control module 52 and used to control an aspect of a handwheel and/or vehicle. Gestures may include directional gestures, waves, hand signals and the like, and can correspond to a command such as a steering command or a handwheel position control command, e.g., to retract or stow the handwheel, return the handwheel to a driving position and/or move the handwheel to different driving positions. The number and type of gestures recognized by the gesture control module may be pre-selected (e.g., default) or customizable based on user input.
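
The pre-selected-or-customizable binding described above suggests a simple registry in which user overrides take precedence over defaults. This is a sketch only; all gesture and command names are illustrative:

```python
# Default (pre-selected) gesture bindings; names here are made up.
DEFAULT_BINDINGS = {
    "two_hand_wave_palms_out": "RETRACT_HANDWHEEL",
    "two_hand_wave_toward_dash": "MOVE_HANDWHEEL_TO_DRIVING_POSITION",
    "swipe_left": "MENU_PREVIOUS",
    "swipe_right": "MENU_NEXT",
}

class GestureRegistry:
    def __init__(self, user_bindings: dict[str, str] | None = None):
        # User customizations override the defaults where they collide.
        self.bindings = {**DEFAULT_BINDINGS, **(user_bindings or {})}

    def command_for(self, gesture: str) -> str | None:
        return self.bindings.get(gesture)

registry = GestureRegistry({"swipe_left": "VOLUME_DOWN"})  # a user override
print(registry.command_for("swipe_left"))    # -> VOLUME_DOWN
print(registry.command_for("swipe_right"))   # -> MENU_NEXT
```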


In one embodiment, the interface 56 includes a display area such as a two-dimensional screen display and/or a three-dimensional holographic display. The display may be positioned at any suitable location, such as at the dashboard or at the handwheel 14. For example, once the handwheel 14 is retracted, the area that the handwheel 14 has vacated may be used as a holographic display area for other gesture functions or entertainment. For gesture functions, a driver could page through a menu of vehicle functions by waving a hand left or right within range of the sensing element 54, for example. A driver could also raise both hands, palms up, within range of the sensing element 54 to indicate a change request (e.g., increasing the volume of the audio system). Similarly, to lower the volume, the driver may lower both hands, palms down.
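
The palms-up/palms-down volume gesture reads naturally as a small rule over palm orientation and vertical hand motion. The deadband threshold and step size below are assumptions, not values from the patent:

```python
def volume_step(palms_up: bool, vertical_velocity_mps: float) -> int:
    """Return a volume increment for one recognized gesture frame."""
    DEADBAND = 0.1  # assumed minimum hand speed (m/s) to count as a gesture
    STEP = 2        # assumed volume units per recognized frame
    if palms_up and vertical_velocity_mps > DEADBAND:
        return +STEP     # both hands raised, palms up -> louder
    if not palms_up and vertical_velocity_mps < -DEADBAND:
        return -STEP     # both hands lowered, palms down -> quieter
    return 0

print(volume_step(True, 0.3))    # -> 2
print(volume_step(False, -0.3))  # -> -2
```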


In one embodiment, the display is disposed at a central location or other location on the handwheel that is visible to a user. For example, the display is located on the center hub of the handwheel 14. The center hub may be stationary (i.e., the hub does not rotate when the handwheel is rotated) or may rotate with the handwheel 14. If the center hub rotates with the handwheel 14, the display can be positioned or configured so that viewing is best when the handwheel 14 is stationary. In another example, if the handwheel 14 is stowable, the display can be located so that viewing is best when the handwheel 14 is stowed. The display may also be located in the dash of the vehicle, e.g., if the handwheel 14 is completely stowed in the dash, so that an unobstructed view of the display area is ensured.


Various gestures can be used to steer the vehicle via a holographic or virtual steering wheel. An example of a three-dimensional holographic display is shown in FIG. 3. The control system 50 includes sensing elements in the form of cameras 60 located on either side of an interface, which in this example is a holographic display 62. The cameras 60 and/or holographic display 62 may be positioned at any suitable location in the vehicle. For example, the cameras and/or the display 62 are positioned on a panel 64 that may be lowered or positioned over the handwheel 14 when the handwheel 14 is in the stowed position.


In the embodiment of FIG. 3, the holographic display 62 includes illumination sources 66 and a display screen 68 that project a holographic image 70 above the screen 68 or otherwise in view of the user. The holographic display 62 may be of any suitable type and is not limited to the embodiments described herein.


An example of the holographic image 70 is an image of a steering wheel (a “virtual steering wheel”). The virtual steering wheel can be projected, and hand motions around the virtual steering wheel can be tracked to provide steering signals to the vehicle. This capability can be used while the handwheel is retracted, or can allow the actual steering wheel to be removed completely. Gestures can also select the placement of the virtual steering wheel in the vehicle if multiple cameras are installed. Finally, the physical steering wheel may be replaced with a holographic steering wheel, and the holographic steering wheel may be controlled with gestures in the manner described herein.
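
A hypothetical mapping from tracked hand positions around the virtual steering wheel to a steering signal: take the angle of the line through the two hands in the wheel plane and scale it into a command. The coordinate convention and scaling ratio are assumptions:

```python
import math

def virtual_wheel_angle(left_xy: tuple[float, float],
                        right_xy: tuple[float, float]) -> float:
    """Angle (rad) of the line through both hands; 0 when hands are level."""
    dx = right_xy[0] - left_xy[0]
    dy = right_xy[1] - left_xy[1]
    return math.atan2(dy, dx)

def steering_signal(left_xy, right_xy, ratio: float = 1.0) -> float:
    """Scale the virtual-wheel angle into a steering command."""
    return ratio * virtual_wheel_angle(left_xy, right_xy)

# Hands level -> 0.0; right hand raised 5 cm over a 30 cm grip -> ~0.165 rad.
print(steering_signal((-0.15, 0.0), (0.15, 0.0)))
print(steering_signal((-0.15, 0.0), (0.15, 0.05)))
```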



FIG. 4 illustrates a method 80 of controlling a handwheel and/or a vehicle based on gesture recognition. The method 80 is used in conjunction with the vehicle 10 and/or the control system 50, although the method 80 may be utilized in conjunction with any suitable combination of sensing devices and processors. The method 80 includes one or more stages 81-84. In one embodiment, the method 80 includes the execution of all of stages 81-84 in the order described. However, certain stages may be omitted, stages may be added, or the order of the stages may be changed.


In the first stage 81, a user (e.g., a driver or passenger) performs a gesture, such as a hand wave or a display of a hand or hands in a selected configuration. The gesture is performed by the user to cause an action to be performed by a control system such as the control system 50.


In the second stage 82, the gesture is detected by a sensing element, such as a video camera.


In the third stage 83, the gesture control module receives gesture detection data from the sensing element and determines the action to be performed based on the gesture. Examples of actions include retracting the handwheel to a retracted or stowed position, moving the handwheel from the retracted or stowed position to a driving position, and moving the handwheel between different driving positions. Other examples include controlling other vehicle systems, such as video, display options, radio station and volume, wipers, etc. Further examples of actions include vehicle control actions such as steering the vehicle.


In the fourth stage 84, the gesture control module generates a message or command to perform the action, sending the command to an appropriate vehicle system or component to realize the action. For example, if the gesture is for moving the handwheel, the gesture control module sends a command to operate an internal motor to move the handwheel. If the gesture is to steer the vehicle (e.g., in conjunction with a holographic image of the handwheel), the gesture control module sends a command to a motor (e.g., in the steering assist unit 18) or to a steering assist or control system (e.g., the control module 40).
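
Putting stages 81-84 together, a minimal end-to-end sketch: the sensing element reports a recognized gesture (stages 81-82), the module resolves it to an action (stage 83), and a command is dispatched (stage 84). The dispatcher below just prints, standing in for a command to a motor or steering control system; the class and function names are illustrative:

```python
class GestureControlModule:
    def __init__(self, bindings: dict[str, str], dispatch) -> None:
        self.bindings = bindings   # gesture name -> action (stage 83)
        self.dispatch = dispatch   # command sink (stage 84)

    def on_gesture(self, gesture_name: str) -> None:
        """Called when the sensing element reports a gesture (stages 81-82)."""
        action = self.bindings.get(gesture_name)
        if action is not None:
            self.dispatch(action)

def send_command(action: str) -> None:
    print(f"command -> {action}")  # stand-in for commanding a motor/controller

module = GestureControlModule(
    {"two_hand_wave_palms_out": "RETRACT_HANDWHEEL"}, send_command
)
module.on_gesture("two_hand_wave_palms_out")  # -> command -> RETRACT_HANDWHEEL
```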


Embodiments described herein provide numerous advantages. The control systems described herein can allow a user to position a handwheel and/or steer a vehicle without requiring the use of buttons or other mechanical devices. This is advantageous over processes that retract the handwheel by pressing a button or by grabbing the handwheel and moving it to a desired location. Such processes can present challenges: the user may have to search for the button, which adds to driver workload and may draw the driver's attention off the road; the button may take up dashboard space; and grabbing and moving the handwheel could be difficult for the driver. Embodiments described herein address such challenges and can increase driver safety.


While the invention has been described in detail in connection with only a limited number of embodiments, it should be readily understood that the invention is not limited to such disclosed embodiments. Rather, the invention can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the invention. Additionally, while various embodiments of the invention have been described, it is to be understood that aspects of the invention may include only some of the described embodiments. Accordingly, the invention is not to be seen as limited by the foregoing description.

Claims
  • 1. A control system comprising: a panel comprising a sensing element and an interface, the sensing element configured to detect a gesture from at least one user in a vehicle including a handwheel, and the interface configured to display information related to control of at least one of the handwheel and the vehicle; and a gesture control module configured to receive gesture information from the sensing element and control at least one of the handwheel and the vehicle based on the gesture information, the gesture being performed to cause the gesture control module to move the handwheel between a driving position and a retracted or stowed position.
  • 2. The system of claim 1, wherein the gesture is performed to cause the gesture control module to steer the vehicle.
  • 3. The system of claim 1, wherein the interface is configured to generate a holographic image, the holographic image configured to be interacted with by the user to control at least one of the handwheel and the vehicle.
  • 4. The system of claim 3, wherein the holographic image is an image of the handwheel.
  • 5. The system of claim 3, wherein the sensing element is configured to detect the gesture in relation to the holographic image and steer the vehicle based on the gesture.
  • 6. The system of claim 1, wherein the sensing element includes a camera configured to record the gesture.
  • 7. The system of claim 1, wherein the interface is disposed at a center hub of the handwheel.
  • 8. The system of claim 1, wherein the interface is disposed at a location in the vehicle that is available to the user upon stowing the handwheel.
  • 9. The system of claim 1, wherein the panel is positioned over the handwheel when the handwheel is in the retracted or stowed position.
  • 10. A method of controlling an aspect of a vehicle, comprising: detecting, by a sensing element, a first gesture from at least one user in a vehicle including a handwheel; receiving first gesture information from the sensing element by a gesture control module; controlling at least one of the handwheel and the vehicle based on the first gesture information, the first gesture being performed to cause the gesture control module to move the handwheel from a driving position to a retracted or stowed position; detecting, by the sensing element, a second gesture from the at least one user in the vehicle; receiving second gesture information from the sensing element by the gesture control module; and controlling at least one of the handwheel and the vehicle based on the second gesture information, the second gesture being performed to cause the gesture control module to move the handwheel from the retracted or stowed position to the driving position, the first gesture and the second gesture being the same gesture.
  • 11. The method of claim 10, wherein one of the first gesture and the second gesture is performed to cause the gesture control module to steer the vehicle.
  • 12. The method of claim 10, wherein the sensing element is connected to an interface configured to display information related to control of at least one of the handwheel and the vehicle.
  • 13. The method of claim 12, wherein the interface is configured to generate a holographic image, the holographic image configured to be interacted with by the user to control at least one of the handwheel and the vehicle.
  • 14. The method of claim 13, wherein the holographic image is configured to be displayed when the handwheel is in a stowed position and interacted with by the user to steer the vehicle.
  • 15. The method of claim 13, wherein the sensing element is configured to detect at least one of the first gesture and the second gesture in relation to the holographic image and steer the vehicle based on the at least one of the first gesture and the second gesture.
  • 16. The method of claim 12, wherein the interface is disposed at a center hub of the handwheel.
  • 17. The method of claim 12, wherein the interface is disposed at a location in the vehicle that is available to the user upon stowing the handwheel.
  • 18. The method of claim 10, wherein the sensing element includes a camera configured to record the first gesture and the second gesture.
  • 19. The method of claim 10, wherein the first gesture and the second gesture comprise a user waving both hands of the user with palms toward the sensing element.
  • 20. A control system comprising: a sensing element configured to detect a gesture from at least one user in a vehicle including a handwheel, the sensing element being disposed at a center hub of the handwheel; an interface configured to display information related to control of at least one of the handwheel and the vehicle, the interface being disposed at the center hub of the handwheel; and a gesture control module configured to receive gesture information from the sensing element and control at least one of the handwheel and the vehicle based on the gesture information, the gesture being performed to cause the gesture control module to move the handwheel between a driving position and a retracted or stowed position.
CROSS-REFERENCES TO RELATED APPLICATIONS

This patent application claims priority to U.S. Provisional Patent Application Ser. No. 62/175,777, filed Jun. 15, 2015, which is incorporated herein by reference in its entirety.

US Referenced Citations (304)
Number Name Date Kind
4315117 Kokubo et al. Feb 1982 A
4337967 Yoshida et al. Jul 1982 A
4503300 Lane, Jr. Mar 1985 A
4503504 Suzumura et al. Mar 1985 A
4561323 Stromberg Dec 1985 A
4691587 Farrand et al. Sep 1987 A
4836566 Birsching Jun 1989 A
4921066 Conley May 1990 A
4962570 Hosaka et al. Oct 1990 A
4967618 Matsumoto et al. Nov 1990 A
4976239 Hosaka Dec 1990 A
5240284 Takada et al. Aug 1993 A
5295712 Omura Mar 1994 A
5319803 Allen Jun 1994 A
5469356 Hawkins et al. Nov 1995 A
5488555 Asgari et al. Jan 1996 A
5618058 Byon Apr 1997 A
5668721 Chandy Sep 1997 A
5690362 Peitsmeier et al. Nov 1997 A
5765116 Wilson-Jones et al. Jun 1998 A
5893580 Hoagland et al. Apr 1999 A
5911789 Keipert et al. Jun 1999 A
6070686 Pollmann Jun 2000 A
6138788 Bohner et al. Oct 2000 A
6170862 Hoagland et al. Jan 2001 B1
6212453 Kawagoe et al. Apr 2001 B1
6227571 Sheng et al. May 2001 B1
6256561 Asanuma Jul 2001 B1
6301534 McDermott, Jr. et al. Oct 2001 B1
6354622 Ulbrich et al. Mar 2002 B1
6360149 Kwon et al. Mar 2002 B1
6373472 Palalau et al. Apr 2002 B1
6381526 Higashi et al. Apr 2002 B1
6390505 Wilson May 2002 B1
6481526 Millsap et al. Nov 2002 B1
6575263 Hjelsand et al. Jun 2003 B2
6578449 Anspaugh et al. Jun 2003 B1
6598695 Menjak et al. Jul 2003 B1
6612392 Park et al. Sep 2003 B2
6612393 Bohner et al. Sep 2003 B2
6778890 Shimakage et al. Aug 2004 B2
6799654 Menjak et al. Oct 2004 B2
6817437 Magnus et al. Nov 2004 B2
6819990 Ichinose Nov 2004 B2
6820713 Menjak et al. Nov 2004 B2
6889792 Fardoun et al. May 2005 B1
7021416 Kapaan et al. Apr 2006 B2
7048305 Muller May 2006 B2
7062365 Fei Jun 2006 B1
7295904 Kanevsky et al. Nov 2007 B2
7308964 Hara et al. Dec 2007 B2
7428944 Gerum Sep 2008 B2
7461863 Muller Dec 2008 B2
7495584 Sorensen Feb 2009 B1
7628244 Chino et al. Dec 2009 B2
7719431 Bolourchi May 2010 B2
7735405 Parks Jun 2010 B2
7793980 Fong Sep 2010 B2
7862079 Fukawatase et al. Jan 2011 B2
7894951 Norris et al. Feb 2011 B2
7909361 Oblizajek et al. Mar 2011 B2
8002075 Markfort Aug 2011 B2
8027767 Klein et al. Sep 2011 B2
8055409 Tsuchiya Nov 2011 B2
8069745 Strieter et al. Dec 2011 B2
8079312 Long Dec 2011 B2
8146945 Born et al. Apr 2012 B2
8150581 Iwazaki et al. Apr 2012 B2
8170725 Chin et al. May 2012 B2
8170751 Lee et al. May 2012 B2
8260482 Szybalski et al. Sep 2012 B1
8352110 Szybalski et al. Jan 2013 B1
8452492 Buerkle et al. May 2013 B2
8479605 Shavrnoch et al. Jul 2013 B2
8548667 Kaufmann Oct 2013 B2
8606455 Boehringer et al. Dec 2013 B2
8632096 Quinn et al. Jan 2014 B1
8634980 Urmson et al. Jan 2014 B1
8650982 Matsuno et al. Feb 2014 B2
8670891 Szybalski et al. Mar 2014 B1
8695750 Hammond et al. Apr 2014 B1
8725230 Lisseman et al. May 2014 B2
8798852 Chen et al. Aug 2014 B1
8818608 Cullinane et al. Aug 2014 B2
8825258 Cullinane et al. Sep 2014 B2
8825261 Szybalski et al. Sep 2014 B1
8843268 Lathrop et al. Sep 2014 B2
8874301 Rao et al. Oct 2014 B1
8880287 Lee et al. Nov 2014 B2
8881861 Tojo Nov 2014 B2
8899623 Stadler et al. Dec 2014 B2
8909428 Lombrozo Dec 2014 B1
8915164 Moriyama Dec 2014 B2
8948993 Schulman et al. Feb 2015 B2
8950543 Heo et al. Feb 2015 B2
8994521 Gazit Mar 2015 B2
9002563 Green et al. Apr 2015 B2
9031729 Lathrop et al. May 2015 B2
9032835 Davies et al. May 2015 B2
9045078 Tovar et al. Jun 2015 B2
9073574 Cuddihy et al. Jul 2015 B2
9092093 Jubner et al. Jul 2015 B2
9108584 Rao et al. Aug 2015 B2
9134729 Szybalski et al. Sep 2015 B1
9150200 Urhahne Oct 2015 B2
9150224 Yopp Oct 2015 B2
9150238 Alcazar et al. Oct 2015 B2
9159221 Stantchev Oct 2015 B1
9164619 Goodlein Oct 2015 B2
9174642 Wimmer et al. Nov 2015 B2
9186994 Okuyama et al. Nov 2015 B2
9193375 Schramm et al. Nov 2015 B2
9199553 Cuddihy et al. Dec 2015 B2
9227531 Cuddihy et al. Jan 2016 B2
9233638 Lisseman et al. Jan 2016 B2
9235111 Davidsson et al. Jan 2016 B2
9235211 Davidsson et al. Jan 2016 B2
9235987 Green et al. Jan 2016 B2
9238409 Lathrop et al. Jan 2016 B2
9248743 Enthaler et al. Feb 2016 B2
9260130 Mizuno Feb 2016 B2
9290174 Zagorski Mar 2016 B1
9290201 Lombrozo Mar 2016 B1
9298184 Bartels et al. Mar 2016 B2
9308857 Lisseman et al. Apr 2016 B2
9308891 Cudak et al. Apr 2016 B2
9315210 Sears et al. Apr 2016 B2
9333983 Lathrop et al. May 2016 B2
9360865 Yopp Jun 2016 B2
9714036 Yamaoka et al. Jul 2017 B2
9725098 Abou-Nasr Aug 2017 B2
9810727 Kandler et al. Nov 2017 B2
9845109 George et al. Dec 2017 B2
9852752 Chou et al. Dec 2017 B1
9868449 Holz Jan 2018 B1
10040330 Anderson Aug 2018 B2
20020016661 Frediani et al. Feb 2002 A1
20030046012 Yamaguchi Mar 2003 A1
20030094330 Boloorchi et al. May 2003 A1
20030227159 Muller Dec 2003 A1
20040016588 Vitale et al. Jan 2004 A1
20040046346 Eki et al. Mar 2004 A1
20040099468 Chernoff et al. May 2004 A1
20040129098 Gayer et al. Jul 2004 A1
20040182640 Katou et al. Sep 2004 A1
20040204808 Satoh et al. Oct 2004 A1
20040262063 Kaufmann et al. Dec 2004 A1
20050001445 Ercolano Jan 2005 A1
20050081675 Oshita et al. Apr 2005 A1
20050155809 Krzesicki et al. Jul 2005 A1
20050197746 Pelchen et al. Sep 2005 A1
20050205344 Uryu Sep 2005 A1
20050275205 Ahnafield Dec 2005 A1
20060224287 Izawa et al. Oct 2006 A1
20060244251 Muller Nov 2006 A1
20060271348 Rossow et al. Nov 2006 A1
20070021889 Tsuchiya Jan 2007 A1
20070029771 Haglund et al. Feb 2007 A1
20070046003 Mori et al. Mar 2007 A1
20070046013 Bito Mar 2007 A1
20070241548 Fong Oct 2007 A1
20070284867 Cymbal et al. Dec 2007 A1
20080009986 Lu et al. Jan 2008 A1
20080238068 Kumar et al. Oct 2008 A1
20090024278 Kondo et al. Jan 2009 A1
20090112406 Fujii et al. Apr 2009 A1
20090189373 Schramm Jul 2009 A1
20090256342 Cymbal et al. Oct 2009 A1
20090276111 Wang et al. Nov 2009 A1
20090292466 McCarthy et al. Nov 2009 A1
20100152952 Lee et al. Jun 2010 A1
20100222976 Haug Sep 2010 A1
20100228417 Lee et al. Sep 2010 A1
20100228438 Buerkle Sep 2010 A1
20100250081 Kinser et al. Sep 2010 A1
20100280713 Stahlin et al. Nov 2010 A1
20100286869 Katch et al. Nov 2010 A1
20100288567 Bonne Nov 2010 A1
20110098922 Ibrahim Apr 2011 A1
20110153160 Hesseling et al. Jun 2011 A1
20110167940 Shavrnoch et al. Jul 2011 A1
20110187518 Strumolo et al. Aug 2011 A1
20110224876 Paholics et al. Sep 2011 A1
20110266396 Abildgaard et al. Nov 2011 A1
20110282550 Tada et al. Nov 2011 A1
20120136540 Miller May 2012 A1
20120150388 Boissonnier et al. Jun 2012 A1
20120197496 Limpibunterng et al. Aug 2012 A1
20120205183 Rombold Aug 2012 A1
20120209473 Birsching et al. Aug 2012 A1
20120215377 Takemura et al. Aug 2012 A1
20120296525 Endo et al. Nov 2012 A1
20130002416 Gazit Jan 2013 A1
20130087006 Ohtsubo et al. Apr 2013 A1
20130158771 Kaufmann Jun 2013 A1
20130218396 Moshchuk et al. Aug 2013 A1
20130233117 Read et al. Sep 2013 A1
20130253765 Bolourchi et al. Sep 2013 A1
20130292955 Higgins et al. Nov 2013 A1
20130325202 Howard et al. Dec 2013 A1
20140012469 Kunihiro et al. Jan 2014 A1
20140028008 Stadler et al. Jan 2014 A1
20140046542 Kauffman et al. Feb 2014 A1
20140046547 Kauffman et al. Feb 2014 A1
20140070933 Gautama et al. Mar 2014 A1
20140111324 Lisseman et al. Apr 2014 A1
20140152551 Mueller Jun 2014 A1
20140156107 Karasawa et al. Jun 2014 A1
20140168061 Kim Jun 2014 A1
20140172231 Terada Jun 2014 A1
20140277896 Lathrop et al. Sep 2014 A1
20140277945 Chandy Sep 2014 A1
20140300479 Wolter et al. Oct 2014 A1
20140303827 Dolgov et al. Oct 2014 A1
20140306799 Ricci Oct 2014 A1
20140309816 Stefan et al. Oct 2014 A1
20140354568 Andrews et al. Dec 2014 A1
20150002404 Hooton Jan 2015 A1
20150006033 Sekiya Jan 2015 A1
20150014086 Eisenbarth Jan 2015 A1
20150032322 Wimmer Jan 2015 A1
20150032334 Jang Jan 2015 A1
20150051780 Hahne Feb 2015 A1
20150060185 Feguri Mar 2015 A1
20150120124 Bartels et al. Apr 2015 A1
20150120141 Lavoie et al. Apr 2015 A1
20150120142 Park et al. Apr 2015 A1
20150123947 Jubner May 2015 A1
20150210273 Kaufmann et al. Jul 2015 A1
20150246673 Tseng et al. Sep 2015 A1
20150251666 Attard et al. Sep 2015 A1
20150283998 Lind et al. Oct 2015 A1
20150314804 Aoki et al. Nov 2015 A1
20150324111 Jubner et al. Nov 2015 A1
20150338849 Nemec et al. Nov 2015 A1
20160001781 Fung et al. Jan 2016 A1
20160009332 Sirbu Jan 2016 A1
20160071418 Oshida et al. Mar 2016 A1
20160075371 Varunjikar et al. Mar 2016 A1
20160082867 Sugioka et al. Mar 2016 A1
20160185387 Kuoch Jun 2016 A1
20160200246 Lisseman et al. Jul 2016 A1
20160200343 Lisseman et al. Jul 2016 A1
20160200344 Sugioka et al. Jul 2016 A1
20160207536 Yamaoka et al. Jul 2016 A1
20160207538 Urano et al. Jul 2016 A1
20160209841 Yamaoka et al. Jul 2016 A1
20160229450 Basting et al. Aug 2016 A1
20160231743 Bendewald et al. Aug 2016 A1
20160244070 Bendewald et al. Aug 2016 A1
20160280251 George et al. Sep 2016 A1
20160288825 Varunjikar et al. Oct 2016 A1
20160291862 Yaron et al. Oct 2016 A1
20160318540 King Nov 2016 A1
20160318542 Pattok et al. Nov 2016 A1
20160347347 Lubischer Dec 2016 A1
20160347348 Lubischer Dec 2016 A1
20160355207 Urushibata Dec 2016 A1
20160362084 Martin et al. Dec 2016 A1
20160362126 Lubischer Dec 2016 A1
20160364003 O'Brien Dec 2016 A1
20160368522 Lubischer Dec 2016 A1
20160375860 Lubischer Dec 2016 A1
20160375923 Schulz Dec 2016 A1
20160375925 Lubischer et al. Dec 2016 A1
20160375926 Lubischer et al. Dec 2016 A1
20160375927 Schulz et al. Dec 2016 A1
20160375928 Magnus Dec 2016 A1
20160375929 Rouleau Dec 2016 A1
20160375931 Lubischer Dec 2016 A1
20170029009 Rouleau Feb 2017 A1
20170029018 Lubischer Feb 2017 A1
20170066473 Yu et al. Mar 2017 A1
20170101032 Sugioka et al. Apr 2017 A1
20170101127 Varunjikar et al. Apr 2017 A1
20170113712 Watz Apr 2017 A1
20170151950 Lien Jun 2017 A1
20170151977 Varunjikar et al. Jun 2017 A1
20170151978 Oya et al. Jun 2017 A1
20170158055 Kim et al. Jun 2017 A1
20170158222 Schulz et al. Jun 2017 A1
20170166222 James Jun 2017 A1
20170203785 Naik et al. Jul 2017 A1
20170225704 Urushibata Aug 2017 A1
20170232998 Ramanujam et al. Aug 2017 A1
20170240204 Raad et al. Aug 2017 A1
20170242428 Pal et al. Aug 2017 A1
20170274929 Sasaki et al. Sep 2017 A1
20170293306 Riefe et al. Oct 2017 A1
20170297606 Kim et al. Oct 2017 A1
20170305425 Xing Oct 2017 A1
20170305458 Wang et al. Oct 2017 A1
20170334458 Sato et al. Nov 2017 A1
20180015948 Varunjikar et al. Jan 2018 A1
20180017968 Zhu et al. Jan 2018 A1
20180029632 Bodtker et al. Feb 2018 A1
20180059661 Sato et al. Mar 2018 A1
20180059662 Sato et al. Mar 2018 A1
20180072341 Schulz et al. Mar 2018 A1
20180093700 Chandy Apr 2018 A1
20180105198 Bodtker et al. Apr 2018 A1
20180107214 Chandy Apr 2018 A1
20180136727 Chandy May 2018 A1
20180148087 Wang et al. May 2018 A1
Foreign Referenced Citations (54)
Number Date Country
1722030 Jan 2006 CN
1736786 Feb 2006 CN
101037117 Sep 2007 CN
101041355 Sep 2007 CN
101596903 Dec 2009 CN
102027458 Apr 2011 CN
102320324 Jan 2012 CN
102452391 May 2012 CN
202563346 Nov 2012 CN
102939474 Feb 2013 CN
103158699 Jun 2013 CN
103419840 Dec 2013 CN
103448785 Dec 2013 CN
103677253 Mar 2014 CN
103777632 May 2014 CN
103818386 May 2014 CN
104024084 Sep 2014 CN
102939474 Aug 2015 CN
104936850 Sep 2015 CN
104968554 Oct 2015 CN
19523214 Jan 1997 DE
19923012 Nov 2000 DE
10212782 Oct 2003 DE
102005032528 Jan 2007 DE
102005056438 Jun 2007 DE
102006025254 Dec 2007 DE
102008057313 Oct 2009 DE
102010025197 Dec 2011 DE
102012010887 Dec 2013 DE
1559630 Aug 2005 EP
1783719 May 2007 EP
1932745 Jun 2008 EP
2384946 Nov 2011 EP
2426030 Mar 2012 EP
2489577 Aug 2012 EP
2604487 Jun 2013 EP
1606149 May 2014 EP
2862595 May 2005 FR
3016327 Jul 2015 FR
S60157963 Aug 1985 JP
S60164629 Aug 1985 JP
H05162652 Jun 1993 JP
2768034 Jun 1998 JP
2004074845 Mar 2004 JP
2007253809 Oct 2007 JP
2011043884 Mar 2011 JP
20174099 Jan 2017 JP
20100063433 Jun 2010 KR
2006099483 Sep 2006 WO
2007034567 Mar 2007 WO
2010082394 Jul 2010 WO
2010116518 Oct 2010 WO
2013080774 Jun 2013 WO
2013101058 Jul 2013 WO
Non-Patent Literature Citations (26)
Entry
China Patent Application No. 201510204221.5 Second Office Action dated Mar. 10, 2017, 8 pages.
CN Patent Application No. 201210599006.6 First Office Action dated Jan. 27, 2015, 9 pages.
CN Patent Application No. 201210599006.6 Second Office Action dated Aug. 5, 2015, 5 pages.
CN Patent Application No. 201310178012.9 First Office Action dated Apr. 13, 2015, 13 pages.
CN Patent Application No. 201310178012.9 Second Office Action dated Dec. 28, 2015, 11 pages.
CN Patent Application No. 201410089167 First Office Action and Search Report dated Feb. 3, 2016, 9 pages.
EP Application No. 14156903.8 Extended European Search Report, dated Jan. 27, 2015, 10 pages.
EP Application No. 14156903.8 Office Action dated May 31, 2016, 5 pages.
EP Application No. 14156903.8 Partial European Search Report dated Sep. 23, 2014, 6 pages.
EP Application No. 15152834.6 Extended European Search Report dated Oct. 8, 2015, 7 pages.
European Application No. 12196665.9 Extended European Search Report dated Mar. 6, 2013, 7 pages.
European Search Report for European Application No. 13159950.8; dated Jun. 6, 2013; 7 pages.
European Search Report for related European Application No. 15152834.6, dated Oct. 8, 2015; 7 pages.
Gillespie, Thomas D.; “Fundamentals of Vehicle Dynamics”; Society of Automotive Engineers, Inc.; published 1992; 294 pages.
Kichun, et al.; “Development of Autonomous Car-Part II: A Case Study on the Implementation of an Autonomous Driving System Based on Distributed Architecture”; IEEE Transactions on Industrial Electronics, vol. 62, No. 8, Aug. 2015; 14 pages.
Van der Jagt, Pim; “Prediction of Steering Efforts During Stationary or Slow Rolling Parking Maneuvers”; Ford Forschungszentrum Aachen GmbH.; Oct. 27, 1999; 20 pages.
Van Der Jagt, Pim; “Prediction of steering efforts during stationary or slow rolling parking maneuvers”; Jul. 2013, 20 pages.
Varunjikar, Tejas; Design of Horizontal Curves With DownGrades Using Low-Order Vehicle Dynamics Models; A Thesis by T. Varunjikar; 2011; 141 pages.
Chinese First Office Action and Search Report dated Dec. 20, 2017 cited in Application No. 2016103666609.X, (w/ English language translation), 16 pgs.
Chinese First Office Action dated Jan. 22, 2018 cited in Application No. 201610575225.9, (w/ English language translation), 16 pgs.
Chinese Office Action and Search Report dated Mar. 22, 2018 cited in Application No. 201610832736.4, (w/ English language translation) 12 pgs.
Chinese Office Action and Search Report from the Chinese Patent Office for CN Application No. 201610575225.9 dated Oct. 16, 2018, 19 pages, English Translation Included.
Chinese Office Action from the Chinese Patent Office for CN Application No. 2017102318205 dated Oct. 12, 2018, 7 pages, English Translation Only.
Chinese Office Action from the CN Patent Office for CN Application No. 201610832736.4 dated Oct. 16, 2018, 18 pages, English Translation Included.
Chinese Office Action & Search Report for Chinese Application No. 201601575225.9 dated Oct. 16, 2018, English translation included, 19 pages.
Yan, et al., “EPS Control Technology Based on Road Surface Conditions,” Jun. 22-25, 2009, pp. 933-938, 2009 IEEE International Conference on Information and Automation.
Related Publications (1)
Number Date Country
20160362117 A1 Dec 2016 US
Provisional Applications (1)
Number Date Country
62175777 Jun 2015 US