Vehicle user interface

Information

  • Patent Grant
  • Patent Number
    10,719,218
  • Date Filed
    Tuesday, March 26, 2019
  • Date Issued
    Tuesday, July 21, 2020
Abstract
A vehicle user interface including a vehicle steering wheel including a grip, a sensor mounted in the steering wheel grip detecting objects touching the steering wheel grip, a plurality of individually activatable illumination units illuminating respective locations on the steering wheel grip, and a processor receiving outputs from the sensor, selectively activating a subset of the illumination units adjacent to a detected object, and controlling a plurality of vehicle functions in response to outputs of the sensor.
Description
FIELD OF THE INVENTION

The field of the present invention is steering wheel user interfaces for vehicles.


BACKGROUND OF THE INVENTION

Reference is made to FIG. 1, which is a simplified illustration of a prior art steering wheel. A steering wheel 400, shown in FIG. 1, includes a circular gripping member 401, one or more connecting members 402-404 that connect the gripping member 401 to steering column 407, and buttons 405 and 406 on connecting members 402 and 403 for controlling various devices in the vehicle. Connecting members 402-404 are also referred to as spokes. In FIG. 1, button 405 is used to answer an incoming phone call on the vehicle's BLUETOOTH® speaker phone and button 406 hangs up the call. BLUETOOTH is a trademark owned by the Bluetooth SIG of Kirkland, Wash., USA. Controls mounted in a steering wheel can be operated comfortably and safely, since the driver can operate them without taking his hands off the wheel or his eyes off the road.


Historically, the first button added to a steering wheel was a switch to activate the car's electric horn. When cruise control systems were introduced, some automakers located the operating switches for this feature on the steering wheel as well. Today additional button controls for an audio system, a telephone and voice control system, a navigation system, a stereo system, and on-board computer functions are commonly placed on the steering wheel.


U.S. Patent Publication No. 2012/0232751 A1 for PRESSURE SENSITIVE STEERING WHEEL CONTROLS describes adding pressure-sensitive controls to the circular gripping member of the steering wheel. Pressure sensors are located at various locations along the perimeter of the gripping member, and different locations correspond to different controls. A control is actuated in response to application of pressure at a sensor location, e.g., by the user tightening his grip.


Prior art user interfaces associated with steering wheels, such as the buttons and grips discussed hereinabove, associate a function with an absolute position on the steering wheel. This is conceptually analogous to a touch-sensitive screen displaying icons where the user touches the location on the screen at which the icon is located to activate the icon. The concept of absolute positioning for user input goes back even further: each key on a keyboard is positioned at an absolute position within the keyboard. Similarly, early graphical user interfaces using light pens required the user to place the light pen onto a graphic displayed on the screen in order to activate a corresponding function.


In contrast to these user interfaces based on absolute positioning, the computer mouse introduced a user interface for controlling a cursor based on relative positioning. Namely, the mouse cursor moves on the screen in the direction in which the mouse moves from point A to point B, but this movement is not contingent on the actual coordinates—the absolute positions—of points A and B. This shift from absolute positioning to relative positioning frees the user from having to look at, or be aware of, the location of the mouse on the table. The user only has to control the direction in which the mouse moves on the table, which he can do without looking at the mouse. One of the objectives of the present invention is to provide a user interface for a driver based on the relative positioning user interface model.


SUMMARY

The present disclosure relates to user interfaces for on board vehicle systems, and teaches a user interface that does not require the user to be aware of the location at which he is touching the steering wheel in order to activate a function. The present disclosure teaches a robust vocabulary of user gestures that can be mapped to a wide variety of applications. The user gestures of the present disclosure are performed with absolute confidence by a user, without the user looking at the surface on which the gestures are performed. In certain embodiments of the invention the gestures are performed on the rim of a steering wheel. The nature of these gestures and the underlying hardware provided for detecting these gestures enables each user interface gesture to be performed by the user without any need for looking at the steering wheel. Furthermore these gestures are entirely independent of how the steering wheel is rotated at the time the gestures are performed.


There is thus provided in accordance with an embodiment of the present invention a steering wheel that identifies gestures performed on its surface, including a circular gripping element including a thumb-receiving notch disposed along its circumference, an array of light-based proximity sensors, mounted in the gripping element, that projects light beams through the notch radially outward from the gripping element, and detects light beams reflected back into the gripping element by a moving object at or near the notch, and a processor, coupled with the proximity sensor array, for determining polar angles along the circumference of the gripping element occupied by the object, responsive to light beams projected by the proximity sensor array and reflected back by the object being detected by the proximity sensor array.





BRIEF DESCRIPTION OF THE DRAWINGS

The present invention will be more fully understood and appreciated from the following detailed description, taken in conjunction with the drawings in which:



FIG. 1 is a simplified illustration of a prior art steering wheel;



FIG. 2 is an exploded view of a steering wheel, in accordance with a first embodiment of the present invention;



FIG. 3 is a cutaway view of a segment of the steering wheel of FIG. 2, in accordance with an embodiment of the present invention;



FIGS. 4 and 5 are exploded views of the steering wheel segment illustrated in FIG. 3, in accordance with an embodiment of the present invention;



FIG. 6 is a simplified illustration of electronic components in the steering wheel segment of FIG. 2 connected to a processor, in accordance with an embodiment of the present invention;



FIG. 7 is a simplified illustration of a structure of light baffles placed upon the electronic components in FIG. 6, in accordance with an embodiment of the present invention;



FIGS. 8 and 9 are simplified illustrations of light beams detecting an object, in accordance with an embodiment of the present invention;



FIG. 10 is a simplified side view illustration of light beams projected radially outward from a steering wheel, in accordance with an embodiment of the present invention;



FIG. 11 is a simplified illustration of communication between touch detection firmware and multiple clients over a network, in accordance with an embodiment of the present invention;



FIG. 12 is a simplified illustration of five basic gesture components used in a steering wheel user interface, in accordance with an embodiment of the present invention;



FIG. 13 is a flowchart of an exemplary vehicle user interface, in accordance with an embodiment of the present invention;



FIG. 14 is a simplified illustration of user interface gestures performed on a steering wheel for an exemplary adaptive cruise control function, in accordance with an embodiment of the present invention;



FIG. 15 is a simplified illustration of a multi-touch double-tap gesture and an exemplary user interface to activate an autonomous drive mode, in accordance with an embodiment of the present invention;



FIG. 16 is a simplified illustration of a gesture and an exemplary user interface for exiting the autonomous drive mode, in accordance with an embodiment of the present invention;



FIG. 17 is a simplified illustration showing how an incoming call is received, in accordance with an embodiment of the present invention;



FIG. 18 is a simplified illustration showing how to hang up a call, in accordance with an embodiment of the present invention; and



FIG. 19 is a simplified illustration of a user interface for a park assist function, in accordance with an embodiment of the present invention.





In the disclosure and figures, the following numbering scheme is used. Light transmitters are numbered in the 100's, light detectors are numbered in the 200's, light guides and lenses are numbered in the 300's, miscellaneous items are numbered in the 400's, light beams are numbered in the 600's, and flow chart elements are numbered 1000-1100. Like numbered elements are similar but not necessarily identical.


The following tables catalog the numbered elements and list the figures in which each numbered element appears.












Light Transmitters

Element   FIGS.
100       4, 5
101       6
102       6
105       8-10
106       6, 7
107       6, 7
108       6, 7
109       6, 7


Light Detectors

Element   FIGS.
200       4
201       6, 8, 9
202       6, 8, 9
203       8, 9
205       8, 9
206       8


Light Guides and Lenses

Element   FIGS.
300       2-7
301       8, 9
302       8, 9
303       8, 9
304       8, 9
305       8-10


Light Beams

Element   FIGS.     Description
601       10        light beam
602       3, 8-10   light beam
603       8, 9      light beam
604       8, 9      light beam


Flow Chart Stages

Element     FIGS.   Description
1001-1005   13      vehicle application state
1010-1019   13      vehicle application action


Miscellaneous Items

Element                 FIGS.                Description
400                     1                    steering wheel
401                     1                    grip
402                     1                    right spoke
403                     1                    left spoke
404                     1                    bottom spoke
405                     1                    answer button
406                     1                    reject button
410                     12, 14-19            steering wheel
411                     2-5                  steering wheel frame
412                     2-5                  top cover
413                     2-5                  thumb notch
414                     2-7                  PCB
415                     2-5, 7               light baffle
416                     3, 5                 transparent cover section
417                     3, 5                 transparent cover section
418                     12, 14, 15, 17, 18   finger
419                     12, 14, 16-19        hand
420                     12, 15               steering wheel surface
421-424, 428, 431-434   12, 15               hand/finger movement directions
425                     12                   clock icon
426                     12                   finger
430                     14                   double-tap gesture
436                     14-16                illumination
437                     14                   movement of illumination
438                     14, 17               tap gesture
440                     6, 11                processor
441-443                 11                   network client
444                     11                   message bus









DETAILED DESCRIPTION

Aspects of the present disclosure relate to light-based touch controls that allow a driver to keep his hands on a steering element while operating peripheral electronic devices and automated features in a vehicle.


According to a first embodiment of the invention, a steering wheel is provided with a touch sensitive strip disposed along the entire circumference of the steering wheel. In order to facilitate locating the strip, it is disposed in a thumb receiving notch or groove that is etched or otherwise formed along the circumference of the steering wheel. In addition to a touch sensor, there is also a visible-light illuminator behind or around the touch sensitive strip that is used to indicate the state of the user interface to the user, and also indicate where certain tap gestures should be performed.


A user interface for this steering wheel is designed to be independent of the rotation of the steering wheel. Sweep gestures are clockwise and counter-clockwise so that they are independent of rotation of the wheel. A function is activated in response to a gesture, such as a double-tap, performed anywhere along the circumference of the wheel. The activation of some functions places the user interface into a state in which one or more additional functions can be selectively activated. In order to activate these additional functions, the touch location at which the initial gesture was performed is illuminated and subsequent gestures are performed in relation to the illuminated portion of the wheel. When a portion of the wheel is thus illuminated, and the driver slides his hand along the steering wheel grip, the illuminated portion of the steering wheel follows the hand so that the hand is always next to the location for performing subsequent gestures. Similarly, when the user switches hands gripping the steering wheel, the illumination jumps to the newly gripped part of the wheel.
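As a rough illustration of this follow-the-hand behavior, the sketch below (Python, for illustration only) re-centers an illuminated arc on the polar angle at which the gripping hand is detected; the segment width and the set_illumination() call are assumptions, since the illumination driver interface is not specified here.

```python
# Minimal sketch of the "illumination follows the hand" behavior.
# SEGMENT_WIDTH_DEG and set_illumination() are illustrative assumptions;
# the disclosure does not specify the illumination driver interface.

SEGMENT_WIDTH_DEG = 30.0

def recenter_illumination(hand_center_angle_deg, set_illumination):
    """Light the arc of the grip centered on the detected hand.

    hand_center_angle_deg -- polar angle of the gripping hand along the grip
    set_illumination      -- callable(start_deg, end_deg) that lights that arc
    """
    start = (hand_center_angle_deg - SEGMENT_WIDTH_DEG / 2) % 360.0
    end = (hand_center_angle_deg + SEGMENT_WIDTH_DEG / 2) % 360.0
    set_illumination(start, end)
```

Whether the hand slides along the grip or the driver switches hands, the same call re-centers the illuminated segment next to the newly detected hand position.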


Reference is made to FIG. 2, which is an exploded view of a steering wheel, in accordance with a first embodiment of the present invention. Elements of this steering wheel include steering wheel frame 411, PCB 414, an array of lenses 300, a light baffle structure 415, and a steering wheel top cover 412. A thumb-receiving notch 413 is disposed within steering wheel cover 412.


Reference is made to FIG. 3, which is a cutaway view of a segment of the steering wheel of FIG. 2, in accordance with an embodiment of the present invention. Thumb-receiving notch 413 is illustrated in FIG. 3. Two light transmissive portions of cover 412 are also shown in FIG. 3. A first light transmissive portion 417 forms the side wall of thumb-receiving notch 413. Light beams traveling into and out of this portion provide touch detection and proximity detection, as explained below. Three touch detection light beams 602 are shown directed radially outward from the steering wheel gripping element. The second light transmissive portion 416 forms a floor of thumb-receiving notch 413, and is used for visible illumination indicating a state of the user interface to the driver, and at which location the driver should perform additional user interface gestures.


Reference is made to FIGS. 4 and 5, which are exploded views of the steering wheel segment illustrated in FIG. 3, in accordance with an embodiment of the present invention. As shown in FIG. 4, two concentric rows of elements are mounted on PCB 414. Namely, an inner row of light detectors 200 and an outer row of light emitters 100. Light from the emitters enters lenses 300 through which it is re-directed out of the steering wheel through light transmissive portion 417 as light beams 602, illustrated in FIGS. 3 and 8-10. An object such as a thumb placed in notch 413 reflects the light back through portion 417 and lenses 300 onto one or more of the light detectors 200, thereby providing touch detection, as illustrated in FIGS. 8 and 9. Similarly, an object such as a user's hand placed along the outer rim of the steering wheel outside notch 413 and opposite light transmissive portion 417 also reflects the light back through portion 417 and lenses 300 onto one or more of the light detectors 200, thereby providing proximity detection.



FIG. 5 shows an exploded view from below of the steering wheel segment illustrated in FIG. 3, in accordance with an embodiment of the present invention.


Reference is made to FIG. 6, which is a simplified illustration of electronic components in the steering wheel segment of FIG. 3 connected to a processor, in accordance with an embodiment of the present invention. FIG. 6 shows processor 440 connected to PCB 414 on which three concentric arrangements of light elements are mounted, namely, an inner circular arrangement of inward-facing light detectors, including detectors 201 and 202; a middle circular arrangement of inward-facing light emitters, including emitters 101 and 102; and an outer circular arrangement of outward-facing light emitters 105-108. The inward facing light emitters are used for touch and proximity detection and typically emit light in the near-infrared range. Processor 440 controls activation of the emitters and detectors, and detects gestures performed on the steering wheel based on these activations and based on the outputs of the detectors.


The outward-facing light emitters are used to provide visual indications to the user by illuminating light-transmissive portion 416 of the steering wheel cover, and emit light in the visible range. Lenses 300 are described in assignee's U.S. application Ser. No. 14/555,731, entitled DOOR HANDLE WITH OPTICAL PROXIMITY SENSORS.


Reference is made to FIG. 7, which is a simplified illustration of a structure of light baffles placed upon the electronic components in FIG. 6, in accordance with an embodiment of the present invention. FIG. 7 shows PCB 414 and lenses 300 of FIG. 6, but with baffle structure 415 placed above the mounted light elements.


Reference is made to FIGS. 8 and 9, which are simplified illustrations of light beams detecting an object, in accordance with an embodiment of the present invention. FIGS. 8 and 9 show a light path used to detect an object. Shown in FIGS. 8 and 9 are individual lens structures 301-305. Each lens structure serves a respective opposite emitter and two detectors, one to the left of the emitter and one to the right of the emitter. Thus, for example, lens structure 305 serves emitter 105 and detectors 205 and 206. In addition each detector is served by two lens structures; e.g., detector 205 receives reflected light from lens structures 304 and 305. In the example shown in FIGS. 8 and 9, light from emitter 105 is reflected by an object (not shown) into lens structure 303 and onto detector 203. Three segments of the detected light are indicated in FIGS. 8 and 9; namely, light beam 602 projected outward from lens structure 305 and radially outward of the steering wheel, light beam 603 reflected by the object into lens structure 303, and light beam 604 directed by lens structure 303 onto detector 203.
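The count of responsive emitter-detector pairs is what later distinguishes a thumb-sized object from a hand-sized one (see claims 5 and 9 below). The following sketch illustrates the idea; activate_pair(), read_detector() and both thresholds are illustrative assumptions rather than the actual firmware interface.

```python
# Sketch of scanning emitter-detector pairs and classifying the touching
# object by how many pairs see a reflection. For a single object the
# responsive pairs form a contiguous neighborhood along the grip.
# activate_pair(), read_detector() and the thresholds are assumptions.

DETECTION_THRESHOLD = 50   # assumed detector reading that counts as a reflection
HAND_SIZE_MIN_PAIRS = 4    # assumed: this many responsive pairs means hand-sized

def scan_responsive_pairs(pairs, activate_pair, read_detector):
    """Return indices of emitter-detector pairs whose detector receives
    light reflected back by an object at or near the notch."""
    responsive = []
    for index, (emitter, detector) in enumerate(pairs):
        activate_pair(emitter, detector)            # synchronous activation
        if read_detector(detector) > DETECTION_THRESHOLD:
            responsive.append(index)
    return responsive

def classify_object(responsive_indices):
    """Thumb-sized vs hand-sized, based on the number of responsive pairs."""
    if not responsive_indices:
        return None
    return "hand" if len(responsive_indices) >= HAND_SIZE_MIN_PAIRS else "thumb"
```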


Reference is made to FIG. 10, which is a simplified side view illustration of light beams projected radially outward from a steering wheel, in accordance with an embodiment of the present invention. FIG. 10 shows a cutaway side view of the light path illustrated in FIGS. 8 and 9. FIG. 10 shows light beam 601 from emitter 105 entering lens structure 305, where it is redirected outward as light beam 602.


Methods for determining two-dimensional coordinates of an object detected by the disclosed proximity sensor array are described in assignee's U.S. application Ser. No. 14/312,787, entitled OPTICAL PROXIMITY SENSORS, and U.S. application Ser. No. 14/555,731, entitled DOOR HANDLE WITH OPTICAL PROXIMITY SENSORS. Because the present application is for a steering wheel and the proximity sensor array is arranged along an arc-shaped grip of the steering wheel, the determined coordinates are polar coordinates, including a polar angle and a radial coordinate. The polar angle corresponds to a coordinate along the proximity sensor array, which in U.S. application Ser. Nos. 14/312,787 and 14/555,731 is described as an x-axis coordinate. The radial coordinate corresponds to a distance from the proximity sensor array, which in U.S. application Ser. Nos. 14/312,787 and 14/555,731 is described as a y-axis coordinate.
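Concretely, the coordinate along the sensor array maps to a polar angle and the distance from the array maps to a radial coordinate. A minimal sketch of that conversion, in which the calibration constants are assumptions:

```python
# Sketch of converting the proximity-sensor array's (x, y) output into the
# polar coordinates used by the steering wheel interface. The calibration
# constants are assumptions; real values depend on the sensor geometry.

def to_polar(x_along_array, y_from_array,
             array_length=1.0, arc_start_deg=0.0, arc_span_deg=360.0):
    """x_along_array -- coordinate along the sensor array (0..array_length)
    y_from_array  -- distance of the object from the array
    Returns (polar_angle_deg, radial_coordinate)."""
    polar_angle = arc_start_deg + (x_along_array / array_length) * arc_span_deg
    return polar_angle % 360.0, y_from_array
```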


Discussion now turns to the firmware and software used to detect and interpret user gestures. There are five basic gesture components that are detected by the hardware and low-level drivers: (i) Thumb-Tap, (ii) Thumb-Glide, (iii) Thumb-Long-Press, (iv) Grab and (v) Rim-Tap. These components are emitted on the network as they are detected, and are used by higher level software to assemble more complex gestures such as double-taps. Application software interprets these gestures as input commands. In some embodiments of the invention multiple client applications are connected via a network to the detector firmware. The firmware sends information for each detected gesture component over the network, and a client application translates that information into commands and/or constructs compound gestures from multiple gesture components.


Reference is made to FIG. 11, which is a simplified illustration of communication between touch detection firmware and multiple clients over a network, in accordance with an embodiment of the present invention. FIG. 11 shows an exemplary network architecture in which processor 440 sends detected gesture components over message bus 444, e.g., using the Message Queue Telemetry Transport (MQTT) messaging protocol on top of the TCP/IP protocol, to connected clients 441-443.
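A minimal sketch of how the firmware side might publish gesture components over such a message bus, using the paho-mqtt client; the broker address, topic name and JSON payload layout are assumptions and not part of this disclosure.

```python
# Sketch of publishing detected gesture components on an MQTT message bus.
# Requires the paho-mqtt package (constructor call follows the 1.x API).
# Broker address, topic and payload fields are illustrative assumptions.

import json
import time
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("localhost", 1883)   # assumed broker on the vehicle network
client.loop_start()

def publish_component(component, state, min_angle, max_angle):
    payload = {
        "component": component,            # e.g. "Thumb-Glide"
        "state": state,                    # RECOGNIZED / UPDATED / ENDED
        "timestamp": time.time(),
        "min_angle": min_angle,
        "max_angle": max_angle,
        "angle": (min_angle + max_angle) / 2.0,
    }
    client.publish("steeringwheel/gestures", json.dumps(payload))

publish_component("Thumb-Tap", "ENDED", 112.0, 118.0)
```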


The five basic gesture components are categorized according to whether they are performed by a large object (hand) or small object (thumb), and whether the nature of the gesture component is discrete or continuous, as presented in the table below.















Component          Description                            Object   Type
Thumb-Tap          Tap thumb on steering wheel rim        Small    Discrete
Thumb-Glide        Glide thumb along steering wheel rim   Small    Continuous
Thumb-Long-Press   Hold thumb on steering wheel rim       Small    Continuous
Grab               Grab hold of steering wheel rim        Large    Continuous
Rim-Tap            Tap hand on steering wheel rim         Large    Discrete










These gesture components are alternatively referred to as follows.















Component          Alternative Name
Thumb-Tap          small-object tap
Thumb-Glide        small-object glide
Thumb-Long-Press   small-object touch-and-hold
Grab               large-object grab
Rim-Tap            large-object tap









The parameters are the same for all gesture components; namely, time stamp, start angle (min_angle), end angle (max_angle), center angle (angle) and state.


The angle parameters refer to a polar angle along the steering wheel at which the object is detected. Because of the object's size, there is a first polar angle at which the object begins (start angle) and a last polar angle at which the object ends (end angle). The midpoint between the start and end angles (center angle) is used as the object's polar angle. The start and end angles are useful for determining the size of a detected object.


The state parameter takes on three values: RECOGNIZED, UPDATED and ENDED. The ENDED state is applied to all discrete gesture components, and also when a continuous gesture component ends. The RECOGNIZED and UPDATED states are only applied to continuous gesture components. The RECOGNIZED state is applied when a continuous gesture component is first detected. The UPDATED state is applied during the course of a continuous gesture component.
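For illustration, the gesture-component record and its three states can be represented directly as a data structure; the sketch below uses the parameter names listed above.

```python
# Sketch of the gesture-component record described above. Field names follow
# the parameter list (time stamp, min_angle, max_angle, angle, state).

from dataclasses import dataclass
from enum import Enum

class State(Enum):
    RECOGNIZED = "RECOGNIZED"   # continuous component first detected
    UPDATED = "UPDATED"         # continuous component still in progress
    ENDED = "ENDED"             # discrete component, or continuous one ending

@dataclass
class GestureComponent:
    component: str        # "Thumb-Tap", "Thumb-Glide", "Thumb-Long-Press",
                          # "Grab" or "Rim-Tap"
    timestamp: float
    min_angle: float      # start angle (first polar angle occupied by the object)
    max_angle: float      # end angle (last polar angle occupied by the object)
    state: State

    @property
    def angle(self) -> float:
        """Center angle: midpoint of the start and end angles."""
        return (self.min_angle + self.max_angle) / 2.0

    @property
    def size(self) -> float:
        """Angular extent, useful for distinguishing a thumb from a hand."""
        return self.max_angle - self.min_angle
```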


The discrete gesture components, Thumb-Tap and Rim-Tap, are emitted to the clients after they happen, and then only one message is sent for the gesture component. They are only sent with the state ENDED.


The continuous gesture components, Thumb-Glide, Thumb-Long-Press and Grab, are emitted to the clients intermittently from the instant that they are recognized until they end when the hand or finger leaves the rim. When they are first recognized, they are sent to the network with the state RECOGNIZED. With a configurable interval, the gesture component is reported to the network with new parameters and the state UPDATED. When the gesture component ends, the gesture component is sent with the state ENDED.
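The reporting cadence for continuous components might be implemented along the following lines; the report interval, sample() and publish() callables are assumptions used only to illustrate the RECOGNIZED/UPDATED/ENDED sequence.

```python
# Sketch of the reporting cadence for continuous gesture components:
# RECOGNIZED once, UPDATED at a configurable interval, ENDED on release.
# sample() and publish() are assumed interfaces to the sensor and the bus.

import time

def track_continuous_component(component, sample, publish, report_interval=0.1):
    """sample() returns (min_angle, max_angle) while the object is present,
    or None once the hand or finger leaves the rim."""
    angles = sample()
    if angles is None:
        return
    publish(component, "RECOGNIZED", *angles)
    last_report = time.monotonic()
    while True:
        latest = sample()
        if latest is None:                        # object left the rim
            publish(component, "ENDED", *angles)  # last known angles
            return
        angles = latest
        if time.monotonic() - last_report >= report_interval:
            publish(component, "UPDATED", *angles)
            last_report = time.monotonic()
```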


Reference is made to FIG. 12, which is a simplified illustration of five basic gesture components used in a steering wheel user interface, in accordance with an embodiment of the present invention. FIG. 12 shows the five gesture components performed by thumb 418 and hand 419 on steering wheel 410. Some gesture components are illustrated both from above and from the side. When illustrated from the side, thumb 418 is shown interacting with steering wheel surface 420.


A Thumb-Tap gesture component is generated when a small object touches the rim (or gets very close) and then is lifted from the rim within a short period. This period is configurable, but typically it is 100-200 ms. FIG. 12 shows the Thumb-Tap gesture component from above and from the side, and illustrates the movement of thumb 418 by arrows 421.
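A sketch of the touch-then-lift test behind the two tap components; the 150 ms window sits inside the configurable 100-200 ms range mentioned above, while the angular-extent threshold separating a thumb from a hand is an assumption.

```python
# Sketch of tap detection: a touch followed by a lift within a short,
# configurable window becomes a Thumb-Tap (small object) or Rim-Tap (large
# object). The thresholds are illustrative assumptions.

TAP_WINDOW_S = 0.150          # within the 100-200 ms range described above
THUMB_MAX_EXTENT_DEG = 15.0   # assumed angular extent separating thumb/hand

def classify_tap(touch_down_time, lift_time, min_angle, max_angle):
    """Return "Thumb-Tap", "Rim-Tap", or None if the touch lasted too long."""
    if lift_time - touch_down_time > TAP_WINDOW_S:
        return None                       # too slow: not a tap component
    extent = max_angle - min_angle
    return "Thumb-Tap" if extent <= THUMB_MAX_EXTENT_DEG else "Rim-Tap"
```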


A Rim-Tap gesture component is the same as a Thumb-Tap, but for a large object such as a hand. FIG. 12 shows the Rim-Tap gesture component from the side and illustrates the movement of hand 419 by arrows 424.


A Thumb-Glide gesture component is generated when a small object touches the rim and moves at least a certain threshold distance along the rim. That distance is configurable. When it continues to move, UPDATE messages are sent when the object has moved a certain distance, also configurable. FIG. 12 shows the Thumb-Glide gesture component from above and from the side, and illustrates the movement of thumb 418 by arrows 422 and 423.


A Grab gesture component is the same as a Thumb-Glide gesture component, but for a large object touching the rim, and with the difference that the Grab gesture component does not have to move to be reported on the network. When the hand has been on the rim for a certain time threshold, the Grab gesture component is recognized and messages are intermittently sent to the network. FIG. 12 shows the Grab gesture component from above by showing hand 419 gripping steering wheel 410.


A Thumb-Long-Press gesture component is generated when a small object is present, and not moving, on the rim. When the small object has been present for a certain time, messages are sent intermittently to the network about the gesture component. If the object starts moving, the Thumb-Long-Press gesture component is ended and a Thumb-Glide gesture component is started instead. FIG. 12 shows the Thumb-Long-Press gesture component from above and from the side. Clock icon 425 indicates the time threshold required to distinguish this gesture component from a Thumb-Tap.
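The hand-off from Thumb-Long-Press to Thumb-Glide when the thumb starts moving can be expressed as a small state machine; the hold and movement thresholds below are assumptions, since the disclosure only states that they are configurable.

```python
# Sketch of the small-object tracker: a stationary thumb becomes a
# Thumb-Long-Press after a hold threshold, and transitions to a Thumb-Glide
# once it moves more than a movement threshold. Thresholds are assumptions.

HOLD_THRESHOLD_S = 0.5      # assumed time before a Long-Press is recognized
MOVE_THRESHOLD_DEG = 5.0    # assumed movement that turns it into a glide

class SmallObjectTracker:
    def __init__(self, start_time, start_angle):
        self.start_time = start_time
        self.start_angle = start_angle
        self.component = None   # None, "Thumb-Long-Press" or "Thumb-Glide"

    def update(self, now, angle):
        """Return the component recognized for the current sample, if any."""
        if abs(angle - self.start_angle) >= MOVE_THRESHOLD_DEG:
            self.component = "Thumb-Glide"       # movement ends a long-press
        elif self.component is None and now - self.start_time >= HOLD_THRESHOLD_S:
            self.component = "Thumb-Long-Press"
        return self.component
```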


As mentioned above, gesture components are combined into compound user interface gestures. In some cases, environment conditions at the gesture location are combined with the gesture component to define a gesture. For example, a Thumb-Tap gesture performed at one end of an illuminated portion of the rim is translated into a first command, and a Thumb-Tap gesture performed at the other end of the illuminated portion of the rim is translated into a second command. The following table lists the different gestures and compound gestures in the steering wheel user interface, the gesture components that make up each gesture, additional gesture parameters, and example context and commands for each gesture.

















Gesture: Tap inside notch
  Gesture Components: Thumb-Tap
  Additional Parameters: Thumb-tap performed at top or bottom of illuminated portion of illuminated segment of steering wheel
  Example Context: Cruise control is active
  Example Command: Increase or decrease cruise control speed in 5 mph increments

Gesture: Tap on steering wheel outer rim
  Gesture Components: Rim-Tap
  Additional Parameters: (none)
  Example Context: During phone call
  Example Command: Reactivate phone interaction, e.g., when phone call is active for set period of time. Enables hanging up the phone call with a clockwise swipe gesture

Gesture: Single object double-tap inside notch
  Gesture Components: Two Thumb-Taps
  Additional Parameters: Thumb-taps have different time stamps, similar center angles
  Example Context: Vehicle is in motion
  Example Command: Activate cruise control and illuminate location of double-tap

Gesture: Single object double-tap on steering wheel outer rim
  Gesture Components: Two Rim-Taps
  Additional Parameters: Side of steering wheel rim (left or right) at which double-tap is performed
  Example Context: Car is not moving, and Park Assist icon is displayed on HUD
  Example Command: Activate Park Assist to park on left or right side of car, based on tapped side of rim

Gesture: Multi-touch double-tap inside notch
  Gesture Components: Two Thumb-Taps
  Additional Parameters: Thumb-taps have similar time stamps, different center angles
  Example Context: Autonomous drive is not active
  Example Command: Activate autonomous drive

Gesture: Extended touch inside notch
  Gesture Components: Thumb-Long-Press
  Additional Parameters: Thumb-long-press performed at top or bottom of illuminated portion of illuminated segment of steering wheel
  Example Context: Cruise control is active
  Example Command: Increase or decrease cruise control speed in 1 mph increments

Gesture: Grab
  Gesture Components: Grab
  Additional Parameters: (none)
  Example Context: Autonomous drive is active
  Example Command: Deactivate autonomous drive, and enter cruise control mode

Gesture: Swipe
  Gesture Components: Thumb-Glide
  Additional Parameters: Clockwise/counter-clockwise
  Example Context: Cruise control is active
  Example Command: Increase or decrease distance from forward car in cruise control mode

Gesture: Radial swipe
  Gesture Components: Thumb-Glide
  Additional Parameters: Thumb-glide data structures have similar center angles and different radial coordinates
  Example Context: Cruise control is active
  Example Command: Open cruise control menu on HUD

Gesture: Slide
  Gesture Components: Grab
  Additional Parameters: Grab data structures have different time stamps and different center angles
  Example Context: Portion of steering wheel is selectively illuminated
  Example Command: Move illumination to new hand location (follow slide movement)

Gesture: Switch hands
  Gesture Components: Grab
  Additional Parameters: Grab data structures have different time stamps and different center angles
  Example Context: Portion of steering wheel is selectively illuminated
  Example Command: Move illumination to new hand location (jump to other side of wheel)
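As a rough illustration of how a client assembles compound gestures from the component stream, the sketch below pairs two Thumb-Taps into either a single-object double-tap (different time stamps, similar center angles) or a multi-touch double-tap (similar time stamps, different center angles); the temporal and angular tolerances are assumptions.

```python
# Sketch of assembling compound gestures from two Thumb-Tap components, per
# the table above. The tolerances are illustrative assumptions.

SAME_TIME_S = 0.05          # taps closer than this count as simultaneous
DOUBLE_TAP_WINDOW_S = 0.5   # taps farther apart than this are unrelated
SAME_ANGLE_DEG = 10.0       # center angles closer than this count as similar

def combine_thumb_taps(tap_a, tap_b):
    """tap_a, tap_b: (timestamp, center_angle) tuples for two Thumb-Taps.
    Returns the compound gesture name, or None if they do not combine."""
    dt = abs(tap_a[0] - tap_b[0])
    da = abs(tap_a[1] - tap_b[1])
    if dt <= SAME_TIME_S and da > SAME_ANGLE_DEG:
        return "multi-touch double-tap"
    if SAME_TIME_S < dt <= DOUBLE_TAP_WINDOW_S and da <= SAME_ANGLE_DEG:
        return "single-object double-tap"
    return None
```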









Reference is made to FIG. 13, which is a flowchart of an exemplary vehicle user interface, in accordance with an embodiment of the present invention. The flowchart illustrates the different application states, different commands within each state, and the gestures used to issue those commands. The details of the gestures are illustrated in FIGS. 14-19. In some embodiments a heads-up display (HUD) is provided.


The flowchart of FIG. 13 illustrates a highway scenario that includes three driving modes: Normal Drive 1001, Adaptive Cruise Control 1002 and Autonomous Drive 1003. In Normal Drive mode, the driver steers the vehicle and controls its speed. In Adaptive Cruise Control mode the driver steers the vehicle but the vehicle's speed, its distance from the next vehicle on the road, and other parameters are controlled by the vehicle. In Autonomous Drive mode the vehicle is driven and steered automatically without driver input.


The user enters Adaptive Cruise Control mode from Normal Drive mode by performing a double-tap gesture. The user enters Autonomous Drive mode from Normal Drive mode and from Adaptive Cruise Control mode by performing a multi-touch double-tap gesture. These gestures are described below. In order to alert the driver that Autonomous Drive mode will begin shortly, the steering wheel is illuminated with an illumination pattern that indicates a countdown until Autonomous Drive is activated.


The user exits Adaptive Cruise Control mode by performing a double-tap gesture that opens a menu on the HUD for changing the mode 1015 of cruise control. The user performs clockwise or counter-clockwise swipe gestures to scroll through the different modes on the HUD, and performs a single-tap gesture to select the displayed mode. One of the modes is Exit ACC 1018, and selecting this mode exits Adaptive Cruise Control. Another mode configures the cruise control application to follow the road signage 1019.


The user exits Autonomous Drive mode 1013 by grabbing the rim of the steering wheel. In order to alert the driver that Autonomous Drive mode is about to exit, the steering wheel is illuminated with an illumination pattern that indicates a countdown until Autonomous Drive is deactivated. Upon exiting Autonomous Drive mode, the vehicle enters Adaptive Cruise Control mode.


In Adaptive Cruise Control mode 1002 the user adjusts a distance 1016 between the vehicle and the vehicle directly in front of it, by performing a clockwise or counter-clockwise swipe gesture. The user adjusts the speed of the vehicle by performing either a tap gesture or an extended touch gesture. When the vehicle enters Adaptive Cruise Control mode 1002 a segment of the steering wheel is illuminated. A tap gesture or extended touch gesture at one end of the illuminated segment increases the vehicle speed, and a tap gesture or extended touch gesture at the other end of the illuminated segment decreases the vehicle speed.


A voice control state 1004 can be entered from Normal Drive mode and Adaptive Cruise Control mode. In this state, the user can initiate a phone call by saying “call” and the name of a contact from his phone's contact list. Once the call has been connected, the user can hang up 1010 by performing a clockwise swipe gesture. The user can also adjust the volume 1011 by saying the word “volume” and then performing a counter-clockwise swipe gesture to raise the volume, or a clockwise swipe gesture to lower the volume.


When an incoming phone call 1005 is received, the user can answer the call 1012 by performing a counter-clockwise swipe gesture, or decline the call 1012 by performing a clockwise swipe gesture.


Reference is made to FIG. 14, which is a simplified illustration of user interface gestures performed on a steering wheel for an exemplary adaptive cruise control function, in accordance with an embodiment of the present invention. In order to enter Adaptive Cruise Control mode from Normal Drive mode the user performs a single-object double-tap gesture. Namely, the user taps twice with his thumb on the thumb notch in the steering wheel. This gesture is illustrated in drawing (a) in FIG. 14, showing steering wheel 410, hand 419 gripping steering wheel 410, and double-tap gesture 430. The present invention enables the user to perform the double-tap gesture 430 at any location along the perimeter of steering wheel 410.


When Adaptive Cruise Control is active the user has four options; namely, adjust cruise control speed, adjust the distance between the vehicle and the vehicle ahead, open an adaptive cruise control menu, and activate Autonomous Drive mode. As mentioned above Adaptive Cruise Control is activated when the user taps twice with his thumb in the steering wheel thumb notch. The location of these taps is subsequently illuminated to indicate to the user where to perform future gestures. This is illustrated in drawing (b) in FIG. 14, showing illuminated segment 436 of the steering wheel 410 at the location at which double-tap 430 was performed. Thus, to increase the cruise control speed the user performs a gesture, e.g. a single-tap, above the illuminated portion. This is illustrated in drawing (d) in FIG. 14 showing tap gesture 438 at the counter-clockwise edge of illuminated portion 436. The “+” indicates that this gesture increases the speed of the vehicle. Drawing (e) in FIG. 14 shows gesture 438 performed at the clockwise end of illuminated portion 436, and the “−” indicates that the gesture decreases the speed of the vehicle.


If the user slides his hand 419 along steering wheel 410, the illuminated portion 436 moves with the hand so that the user's thumb is always next to the illuminated portion of the steering wheel. This is illustrated in drawing (c) in FIG. 14, in which hand 419 gripping steering wheel 410 slides clockwise as indicated by arrow 428, and illuminated portion 436 also slides in the same direction as indicated by arrow 437.


In some embodiments the cruise control speed is also adjusted in response to extended touch gestures above and below the illuminated portion of the steering wheel. For example, the speed is adjusted by 5 km/h in response to a tap gesture, and is adjusted by 1 km/h in response to an extended touch gesture.


In order to increase or decrease the distance between the vehicle and the vehicle in front of it on the road, the user performs clockwise and counter-clockwise swipe gestures. These are illustrated in drawings (f) and (g) in FIG. 14. Drawing (f) illustrates a counter-clockwise gesture 431 to increase the distance between vehicles, and drawing (g) illustrates a clockwise gesture 432 to decrease the distance between vehicles.


In order to change the mode of Adaptive Cruise Control the user performs a radial swipe gesture with his thumb across the width of the steering wheel thumb notch. This is illustrated in drawings (h) and (i) in FIG. 14. Drawing (h) illustrates swipe gesture 433 that moves outward across the width of illuminated portion 436. Drawing (i) illustrates swipe gesture 434 that moves inward across the width of illuminated portion 436. Either gesture causes the HUD to present a mode option for selection. The user performs a single-tap gesture with his thumb in the steering wheel notch to accept the displayed mode. The mode displayed in the HUD is changed in response to a swipe gesture. For example, a first mode is to follow road signage. If the user performs a single-tap when this mode is displayed on the HUD, a Follow Road Signage mode is activated. If the user swipes clockwise or counter-clockwise, a next or previous mode is displayed such as exit Adaptive Cruise Control. The user performs a single-tap to activate this mode. If no interaction from the user is received within a fixed amount of time, such as 5 seconds, then the change mode user interface is deactivated.
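A sketch of the mode-selection loop opened by the radial swipe: swipe gestures scroll through the modes shown on the HUD, a single tap selects the displayed mode, and the menu closes after 5 seconds without interaction. The next_event() and show_on_hud() interfaces are assumptions.

```python
# Sketch of the cruise-control mode menu opened by a radial swipe.
# next_event() and show_on_hud() are assumed interfaces to the gesture
# stream and the heads-up display.

MODES = ["Follow Road Signage", "Exit ACC"]
MENU_TIMEOUT_S = 5.0

def run_mode_menu(next_event, show_on_hud):
    """next_event(timeout) returns "swipe_cw", "swipe_ccw", "tap", or None on
    timeout. Returns the selected mode, or None if the menu timed out."""
    index = 0
    show_on_hud(MODES[index])
    while True:
        event = next_event(MENU_TIMEOUT_S)
        if event is None:                # no interaction: close the menu
            return None
        if event == "swipe_cw":
            index = (index + 1) % len(MODES)
        elif event == "swipe_ccw":
            index = (index - 1) % len(MODES)
        elif event == "tap":
            return MODES[index]
        show_on_hud(MODES[index])
```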


Reference is made to FIG. 15, which is a simplified illustration of a multi-touch double-tap gesture and an exemplary user interface to activate an autonomous drive mode, in accordance with an embodiment of the present invention. Drawing (a) in FIG. 15 illustrates two fingers, 418 and 426, simultaneously tapping at two locations on steering wheel 410. The upper part of this drawing is a view from above, and the lower part of this drawing is a view from the side of each of the fingers 418 and 426. The tap gesture is a brief down and up gesture illustrated by arrows 421 and 428 touching surface 420 of the steering wheel.


Once the user performs this multi-touch double-tap gesture, a series of locations on the steering wheel is sequentially illuminated over time to indicate a countdown until Autonomous Drive is activated, as illustrated in drawings (b) and (c). For example, viewing the upright steering wheel as a clock, drawing (b) illustrates a sequence of illuminations that begins with (i) the 2:30 and 9:30 clock positions indicated by a 1; followed by (ii) the 1:30 and 10:30 clock positions indicated by 2; followed by (iii) the 12:30 and 11:30 clock positions indicated by 3. Drawing (c) illustrates finally illuminating the 12 o'clock position indicated by the word “Go” to inform the user that Autonomous Drive is activated and the user can safely take his hands off the wheel.
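The countdown is a timed sequence of illuminated clock positions converging on 12 o'clock; the sketch below assumes a hypothetical light_positions() driver call that takes clock-face positions, and an arbitrary step duration.

```python
# Sketch of the activation countdown: pairs of clock positions are lit in
# sequence, converging on 12 o'clock ("Go"). light_positions() and the step
# duration are illustrative assumptions.

import time

COUNTDOWN_STEPS = [(2.5, 9.5), (1.5, 10.5), (0.5, 11.5)]  # clock positions

def run_activation_countdown(light_positions, step_duration=1.0):
    for pair in COUNTDOWN_STEPS:
        light_positions(pair)
        time.sleep(step_duration)
    light_positions((12.0,))      # "Go": autonomous drive is now active
```

For the exit sequence of FIG. 16, the same steps would run in the opposite order, moving outward from the 12 o'clock position.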


In order to exit Autonomous Drive mode and enter Adaptive Cruise Control mode, the user grabs the steering wheel. Reference is made to FIG. 16, which is a simplified illustration of a gesture and an exemplary user interface for exiting Autonomous Drive mode, in accordance with an embodiment of the present invention. FIG. 16 shows two hands 419 gripping steering wheel 410. A series of locations on the steering wheel is then sequentially illuminated to indicate that Autonomous Drive mode is about to be deactivated. For example, drawing (a) illustrates a sequence of illuminations that begins with (i) the 11:30 and 12:30 clock positions indicated by a 1; followed by (ii) the 10:30 and 1:30 clock positions indicated by 2; followed by (iii) the 9:30 and 2:30 clock positions indicated by 3. When Autonomous Drive mode is deactivated the vehicle enters Adaptive Cruise Control mode, and a portion 436 of steering wheel 410 next to one of the hands 419 gripping the steering wheel is illuminated, as illustrated in drawing (b) of FIG. 16, and as discussed above with reference to FIG. 14.


In both Normal Drive mode and Adaptive Cruise Control mode the user can enable voice-activated controls by tapping twice on the outer rim of the steering wheel. When voice-activated controls are enabled the user disables these controls by repeating the same double-tap gesture.


Two voice-activated controls are illustrated in FIG. 13: placing a phone call and enabling volume adjustments. To place a phone call the user says “call” and the name of the person to call, e.g., “call Mom”. In order to hang up the call the user performs a swipe gesture along the thumb notch in the steering wheel. To adjust the volume of a call or the stereo system, the user says the word “volume” and then adjusts the volume up or down by swiping clockwise or counter-clockwise along the thumb notch in the steering wheel.


Reference is made to FIG. 17, which is a simplified illustration showing how an incoming call is received, in accordance with an embodiment of the present invention. FIG. 17 shows that when an incoming call is received, the user answers or declines the call by swiping finger 418 clockwise or counter-clockwise along the thumb notch of the steering wheel, e.g. swipe counter-clockwise to accept the call and swipe clockwise to reject the call.


Reference is made to FIG. 18, which is a simplified illustration showing how to hang up a call, in accordance with an embodiment of the present invention. The gesture to hang up a call is a clockwise swipe gesture. However, when a call has been active for a certain amount of time, the system ignores clockwise swipe gestures so that the user does not inadvertently hang up the call. In order to hang up the call, the user first taps the outer rim of the steering wheel, as shown by hand 419, to indicate that the system should respond to the next swipe gesture, followed by a clockwise swipe gesture by finger 418 to hang up the call.
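This guard against accidental hang-ups can be modeled as a small state machine: once the call has been active past a threshold, a clockwise swipe hangs up only if a Rim-Tap re-armed it shortly beforehand. Both time thresholds below are assumptions.

```python
# Sketch of the hang-up guard described above: after the call has been active
# for a while, a clockwise swipe hangs up only if a rim tap re-armed the
# gesture shortly beforehand. The two time thresholds are assumptions.

GUARD_AFTER_S = 10.0     # assumed: ignore lone swipes after this much call time
REARM_WINDOW_S = 3.0     # assumed: a rim tap re-arms swipes for this long

class HangUpGuard:
    def __init__(self, call_start_time):
        self.call_start = call_start_time
        self.last_rim_tap = None

    def on_rim_tap(self, now):
        self.last_rim_tap = now

    def should_hang_up(self, now):
        """Called when a clockwise swipe is detected during a call."""
        if now - self.call_start < GUARD_AFTER_S:
            return True                      # early in the call: swipe works
        return (self.last_rim_tap is not None
                and now - self.last_rim_tap <= REARM_WINDOW_S)
```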


In a city scenario the user interface provides a park assist function that automatically parks the car without the user's intervention. Reference is made to FIG. 19, which is a simplified illustration of a user interface for a park assist function, in accordance with an embodiment of the present invention. When the vehicle is moving at less than 30 km/h, the Park Assist function begins automatically scanning for available parking spaces. In addition, a faded Park Assist icon appears on the HUD, as illustrated in drawing (a) of FIG. 19. As the car further slows down, this icon becomes bolder until the car has stopped moving, as illustrated in drawings (b) and (c) of FIG. 19. The HUD then presents information about available parking spots, e.g., whether the vehicle can fit into a spot. The user performs a double-tap with hand 419 on the outer rim of the steering wheel, as illustrated in drawing (d), to begin the automated parking. To indicate to the Park Assist function that the parking space is on the left side of the car, the user performs this double-tap on the left half of the steering wheel rim. To indicate to the Park Assist function that the parking space is on the right side of the car, the user performs this double-tap on the right half of the steering wheel rim.


In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made to the specific exemplary embodiments without departing from the broader spirit and scope of the invention. In particular, sensors other than optical sensors may be used to implement the user interface, inter alia capacitive sensors disposed along the circumference of the steering wheel, and cameras that capture images of the steering wheel. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.

Claims
  • 1. A vehicle user interface comprising: a vehicle steering wheel comprising a grip; a sensor detecting objects touching said steering wheel grip; a set of individually activatable illumination units mounted in said steering wheel grip illuminating respective locations on said steering wheel grip; and a processor receiving outputs from said sensor, selectively activating a subset of said illumination units adjacent to a driver's hand gripping said steering wheel grip, detected by said sensor, and activating a vehicle function in response to said sensor further detecting a thumb or finger of the driver's hand touching said steering wheel grip at an illuminated location.
  • 2. The vehicle user interface of claim 1, wherein the activated subset of illumination units together illuminates a contiguous segment of said steering wheel grip, and wherein said processor activates a first vehicle function in response to said sensor detecting the thumb or finger touching one end of the illuminated segment, and a second vehicle function, different than the first vehicle function, in response to said sensor detecting the thumb or finger touching the opposite end of the illuminated segment.
  • 3. The vehicle user interface of claim 2, wherein the first vehicle function increases a parameter for a vehicle system, and the second vehicle function decreases that parameter.
  • 4. The vehicle user interface of claim 1, wherein said sensor outputs enable said processor to determine whether an object touching said steering wheel grip is thumb-sized or hand-sized, and wherein said processor activates vehicle functions in response to said sensor detecting an object touching said steering wheel grip, in accordance with the size of the object determined by said processor.
  • 5. The vehicle user interface of claim 4, wherein said sensor comprises a plurality of light emitters and a plurality of light detectors mounted in said steering wheel grip, said processor determining whether an object touching said steering wheel grip is thumb-sized or hand-sized based on the number of neighboring activated emitter-detector pairs that are responsive to the object, an activated emitter-detector pair comprising an activated one of said emitters and a synchronously activated one of said detectors, and a responsive emitter-detector pair being characterized by light emitted by the light emitter of the pair out of said steering wheel grip being reflected by the object back to the light detector of the pair.
  • 6. The vehicle user interface of claim 4, wherein autonomous drive functionality is provided in the vehicle and controlled by said processor, and wherein said processor deactivates a previously activated autonomous drive function in response to sensor outputs indicating that a hand-sized object is taking hold of said steering wheel grip.
  • 7. The vehicle user interface of claim 1, wherein autonomous drive functionality is provided in the vehicle, and wherein prior to deactivating a previously activated autonomous drive function, said processor informs the driver that the previously activated autonomous drive function will be deactivated, by selectively activating a subset of said illumination units.
  • 8. The vehicle user interface of claim 7, wherein said processor informs the driver that an autonomous drive function will be activated, via an activation sequence of said illumination units, and further informs the driver that the activated autonomous drive function will be deactivated, via an inversion of the activation sequence of said illumination units.
  • 9. A vehicle user interface comprising: a vehicle steering wheel comprising a grip; a sensor comprising a plurality of light emitters and a plurality of light detectors mounted in said steering wheel grip; and a processor controlling a plurality of vehicle functions, said processor (i) activating a plurality of emitter-detector pairs, an activated emitter-detector pair comprising an activated one of said emitters and a synchronously activated one of said detectors, (ii) receiving detector outputs for the activated emitter-detector pairs, (iii) determining that an object is touching said steering wheel grip, and determining whether the object is thumb-sized or hand-sized, based on the number of activated emitter-detector pairs that are responsive to the object as determined by the received detector outputs, a responsive emitter-detector pair being characterized by light emitted by the light emitter of the pair out of said steering wheel grip being reflected by the object back to the light detector of the pair, and (iv) activating different vehicle functions in accordance with the determined size of the object.
  • 10. The vehicle user interface of claim 9, wherein autonomous drive functionality is provided in the vehicle and controlled by said processor, and wherein said processor deactivates a previously activated autonomous drive function in response to said processor determining that a hand-sized object is taking hold of said steering wheel grip.
CROSS REFERENCES TO RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 15/647,693, entitled VEHICLE USER INTERFACE and filed on Jul. 12, 2017, by inventors Alexander Jubner, Thomas Eriksson, Gunnar Martin Fröjdh, Simon Fellin and Stefan Holmgren, the contents of which are hereby incorporated herein in their entirety. U.S. application Ser. No. 15/647,693 is a continuation of U.S. application Ser. No. 14/805,445, now U.S. Pat. No. 9,710,144, entitled STEERING WHEEL USER INTERFACE and filed on Jul. 21, 2015, by inventors Alexander Jubner, Thomas Eriksson, Gunnar Martin Fröjdh, Simon Fellin and Stefan Holmgren, the contents of which are hereby incorporated herein in their entirety. U.S. application Ser. No. 14/805,445 is a continuation of U.S. application Ser. No. 14/590,010, now U.S. Pat. No. 9,092,093, entitled STEERING WHEEL USER INTERFACE and filed on Jan. 6, 2015, by inventors Alexander Jubner, Thomas Eriksson, Gunnar Martin Fröjdh, Simon Fellin and Stefan Holmgren, the contents of which are hereby incorporated herein in their entirety. U.S. application Ser. No. 14/590,010 is a continuation-in-part of U.S. application Ser. No. 14/551,096, entitled LIGHT-BASED CONTROLS IN A TOROIDAL STEERING WHEEL and filed on Nov. 24, 2014, by inventors Gunnar Martin Fröjdh, Simon Fellin, Thomas Eriksson, John Karlsson, Maria Hedin and Richard Berglind, the contents of which are hereby incorporated herein in their entirety. U.S. application Ser. No. 14/590,010 is also a continuation-in-part of U.S. application Ser. No. 14/555,731, now U.S. Pat. No. 9,741,184, entitled DOOR HANDLE WITH OPTICAL PROXIMITY SENSORS and filed on Nov. 28, 2014, by inventors Sairam Iyer, Stefan Holmgren and Per Rosengren, the contents of which are hereby incorporated herein in their entirety. U.S. application Ser. No. 14/551,096 is a continuation of U.S. application Ser. No. 14/312,711, now U.S. Pat. No. 8,918,252, entitled LIGHT-BASED TOUCH CONTROLS ON A STEERING WHEEL and filed on Jun. 24, 2014 by inventors Gunnar Martin Fröjdh, Simon Fellin, Thomas Eriksson, John Karlsson, Maria Hedin and Richard Berglind, the contents of which are hereby incorporated herein in their entirety. U.S. application Ser. No. 14/312,711 is a continuation of U.S. application Ser. No. 14/088,458, now U.S. Pat. No. 8,775,023, entitled LIGHT-BASED TOUCH CONTROLS ON A STEERING WHEEL AND DASHBOARD and filed on Nov. 25, 2013 by inventors Gunnar Martin Fröjdh, Simon Fellin, Thomas Eriksson, John Karlsson, Maria Hedin and Richard Berglind, the contents of which are hereby incorporated herein in their entirety. U.S. application Ser. No. 14/088,458 is a non-provisional of U.S. Provisional Application No. 61/730,139 entitled LIGHT-BASED TOUCH CONTROLS ON A STEERING WHEEL AND DASHBOARD and filed on Nov. 27, 2012 by inventors Gunnar Martin Fröjdh, Thomas Eriksson, John Karlsson, Maria Hedin and Richard Berglind, the contents of which are hereby incorporated herein in their entirety. U.S. application Ser. No. 14/555,731 is a continuation-in-part of U.S. application Ser. No. 14/312,787, now U.S. Pat. No. 9,164,625, entitled OPTICAL PROXIMITY SENSORS and filed on Jun. 24, 2014 by inventors Stefan Holmgren, Sairam Iyer, Richard Berglind, Karl Erik Patrik Nordström, Lars Sparf, Per Rosengren, Erik Rosengren, John Karlsson, Thomas Eriksson, Alexander Jubner, Remo Behdasht, Simon Fellin, Robin Åman and Joseph Shain, the contents of which are hereby incorporated herein in their entirety. U.S. application Ser. No. 14/555,731 is also a continuation-in-part of U.S. application Ser. 
No. 14/311,366, now U.S. Pat. No. 9,063,614, entitled OPTICAL TOUCH SCREENS and filed on Jun. 23, 2014 by inventors Robert Pettersson, Per Rosengren, Erik Rosengren, Stefan Holmgren, Lars Sparf, Richard Berglind, Thomas Eriksson, Karl Erik Patrik Nordström, Gunnar Martin Fröjdh, Xiatao Wang and Remo Behdasht, the contents of which are hereby incorporated herein in their entirety. U.S. application Ser. No. 14/555,731 is also a continuation-in-part of U.S. application Ser. No. 14/140,635, now U.S. Pat. No. 9,001,087, entitled LIGHT-BASED PROXIMITY DETECTION SYSTEM AND USER INTERFACE and filed on Dec. 26, 2013 by inventors Thomas Eriksson and Stefan Holmgren, the contents of which are hereby incorporated herein in their entirety. U.S. application Ser. No. 14/312,787 is a continuation of International Application No. PCT/US14/40112 entitled OPTICAL PROXIMITY SENSORS and filed on May 30, 2014 by inventors Stefan Holmgren, Sairam Iyer, Richard Berglind, Karl Erik Patrik Nordström, Lars Sparf, Per Rosengren, Erik Rosengren, John Karlsson, Thomas Eriksson, Alexander Jubner, Remo Behdasht, Simon Fellin, Robin Åman and Joseph Shain, the contents of which are hereby incorporated herein in their entirety. International Application No. PCT/US14/40112 claims priority benefit of U.S. Provisional Application No. 61/828,713 entitled OPTICAL TOUCH SCREEN SYSTEMS USING REFLECTED LIGHT and filed on May 30, 2013 by inventors Per Rosengren, Lars Sparf, Erik Rosengren and Thomas Eriksson; of U.S. Provisional Application No. 61/838,296 entitled OPTICAL GAME ACCESSORIES USING REFLECTED LIGHT and filed on Jun. 23, 2013 by inventors Per Rosengren, Lars Sparf, Erik Rosengren, Thomas Eriksson, Joseph Shain, Stefan Holmgren, John Karlsson and Remo Behdasht; of U.S. Provisional Application No. 61/846,089 entitled PROXIMITY SENSOR FOR LAPTOP COMPUTER AND ASSOCIATED USER INTERFACE and filed on Jul. 15, 2013 by inventors Richard Berglind, Thomas Eriksson, Simon Fellin, Per Rosengren, Lars Sparf, Erik Rosengren, Joseph Shain, Stefan Holmgren, John Karlsson and Remo Behdasht; of U.S. Provisional Application No. 61/929,992 entitled CLOUD GAMING USER INTERFACE filed on Jan. 22, 2014 by inventors Thomas Eriksson, Stefan Holmgren, John Karlsson, Remo Behdasht, Erik Rosengren, Lars Sparf and Alexander Jubner; of U.S. Provisional Application No. 61/972,435 entitled OPTICAL TOUCH SCREEN SYSTEMS and filed on Mar. 31, 2014 by inventors Sairam Iyer, Karl Erik Patrik Nordström, Lars Sparf, Per Rosengren, Erik Rosengren, Thomas Eriksson, Alexander Jubner and Joseph Shain; and of U.S. Provisional Application No. 61/986,341 entitled OPTICAL TOUCH SCREEN SYSTEMS and filed on Apr. 30, 2014 by inventors Sairam Iyer, Karl Erik Patrik Nordström, Lars Sparf, Per Rosengren, Erik Rosengren, Thomas Eriksson, Alexander Jubner and Joseph Shain, the contents of which are hereby incorporated herein in their entirety. U.S. application Ser. No. 14/311,366 is a continuation of International Application No. PCT/US14/40579 entitled OPTICAL TOUCH SCREENS and filed on Jun. 3, 2014 by inventors Robert Pettersson, Per Rosengren, Erik Rosengren, Stefan Holmgren, Lars Sparf, Richard Berglind, Thomas Eriksson, Karl Erik Patrik Nordström, Gunnar Martin Fröjdh, Xiatao Wang and Remo Behdasht, the contents of which are hereby incorporated herein in their entirety. International Application No. PCT/US14/40579 claims priority benefit of U.S. Provisional Application No. 61/830,671 entitled MULTI-TOUCH OPTICAL TOUCH SCREENS WITHOUT GHOST POINTS and filed on Jun.
4, 2013 by inventors Erik Rosengren, Robert Pettersson, Lars Sparf and Thomas Eriksson; of U.S. Provisional Application No. 61/833,161 entitled CIRCULAR MULTI-TOUCH OPTICAL TOUCH SCREENS and filed on Jun. 10, 2013 by inventors Richard Berglind, Erik Rosengren, Robert Pettersson, Lars Sparf, Thomas Eriksson, Gunnar Martin Fröjdh and Xiatao Wang; of U.S. Provisional Application No. 61/911,915 entitled CIRCULAR MULTI-TOUCH OPTICAL TOUCH SCREENS and filed on Dec. 4, 2013 by inventors Richard Berglind, Erik Rosengren, Robert Pettersson, Lars Sparf, Thomas Eriksson, Gunnar Martin Fröjdh and Xiatao Wang; of U.S. Provisional Application No. 61/919,759 entitled OPTICAL TOUCH SCREENS WITH TOUCH-SENSITIVE BORDERS and filed on Dec. 22, 2013 by inventors Remo Behdasht, Erik Rosengren, Robert Pettersson, Lars Sparf and Thomas Eriksson; of U.S. Provisional Application No. 61/923,775 entitled MULTI-TOUCH OPTICAL TOUCH SCREENS WITHOUT GHOST POINTS and filed on Jan. 6, 2014 by inventors Per Rosengren, Stefan Holmgren, Erik Rosengren, Robert Pettersson, Lars Sparf and Thomas Eriksson; and of U.S. Provisional Application No. 61/950,868 entitled OPTICAL TOUCH SCREENS and filed on Mar. 11, 2014 by inventors Karl Erik Patrik Nordström, Per Rosengren, Stefan Holmgren, Erik Rosengren, Robert Pettersson, Lars Sparf and Thomas Eriksson, the contents of which are hereby incorporated herein in their entirety.

US Referenced Citations (192)
Number Name Date Kind
4243879 Carroll et al. Jan 1981 A
4267443 Carroll et al. May 1981 A
4301447 Funk et al. Nov 1981 A
4518249 Murata et al. May 1985 A
4550250 Mueller et al. Oct 1985 A
4703316 Sherbeck Oct 1987 A
4710760 Kasday Dec 1987 A
4782328 Denlinger Nov 1988 A
4790028 Ramage Dec 1988 A
4847606 Beiswenger Jul 1989 A
4880969 Lawrie Nov 1989 A
4928094 Smith May 1990 A
5003505 McClelland Mar 1991 A
5016008 Gruaz et al. May 1991 A
5036187 Yoshida et al. Jul 1991 A
5053758 Cornett et al. Oct 1991 A
5103085 Zimmerman Apr 1992 A
5119079 Hube et al. Jun 1992 A
5162783 Moreno Nov 1992 A
5179369 Person et al. Jan 1993 A
5194863 Barker et al. Mar 1993 A
5220409 Bures Jun 1993 A
5283558 Chan Feb 1994 A
5406307 Hirayama et al. Apr 1995 A
5414413 Tamaru et al. May 1995 A
5422494 West et al. Jun 1995 A
5463725 Henckel et al. Oct 1995 A
5559727 Deley et al. Sep 1996 A
5577733 Downing Nov 1996 A
5579035 Beiswenger Nov 1996 A
5603053 Gough et al. Feb 1997 A
5612719 Beernink et al. Mar 1997 A
5618232 Martin Apr 1997 A
5729250 Bishop et al. Mar 1998 A
5748185 Stephan et al. May 1998 A
5785439 Bowen Jul 1998 A
5825352 Bisset et al. Oct 1998 A
5838308 Knapp et al. Nov 1998 A
5880743 Moran et al. Mar 1999 A
5886697 Naughton et al. Mar 1999 A
5889236 Gillespie et al. Mar 1999 A
5900875 Haitani et al. May 1999 A
5914709 Graham et al. Jun 1999 A
5936615 Waters Aug 1999 A
5943043 Furuhata et al. Aug 1999 A
5943044 Martinelli et al. Aug 1999 A
5956030 Conrad et al. Sep 1999 A
5988645 Downing Nov 1999 A
6010061 Howell Jan 2000 A
6023265 Lee Feb 2000 A
6031989 Cordell Feb 2000 A
6052279 Friend et al. Apr 2000 A
6073036 Heikkinen et al. Jun 2000 A
6085204 Chijiwa et al. Jul 2000 A
6091405 Lowe et al. Jul 2000 A
6114949 Schmitz et al. Sep 2000 A
6135494 Lotito et al. Oct 2000 A
6246395 Goyins et al. Jun 2001 B1
6259436 Moon et al. Jul 2001 B1
6292179 Lee Sep 2001 B1
6310609 Morgenthaler Oct 2001 B1
6323846 Westerman et al. Nov 2001 B1
6340979 Beaton et al. Jan 2002 B1
6346935 Nakajima et al. Feb 2002 B1
6356287 Ruberry et al. Mar 2002 B1
6359632 Eastty et al. Mar 2002 B1
6362468 Murakami et al. Mar 2002 B1
6411283 Murphy Jun 2002 B1
6421042 Omura et al. Jul 2002 B1
6429857 Masters et al. Aug 2002 B1
6456952 Nathan Sep 2002 B1
6529920 Arons et al. Mar 2003 B1
6542191 Yonezawa Apr 2003 B1
6549217 De Greef et al. Apr 2003 B1
6570557 Westerman et al. May 2003 B1
6597345 Hirshberg Jul 2003 B2
6628268 Harada et al. Sep 2003 B1
6639584 Li Oct 2003 B1
6646633 Nicolas Nov 2003 B1
6677932 Westerman Jan 2004 B1
6690365 Hinckley et al. Feb 2004 B2
6690387 Zimmerman et al. Feb 2004 B2
6703999 Iwanami et al. Mar 2004 B1
6707449 Hinckley et al. Mar 2004 B2
6727917 Chew et al. Apr 2004 B1
6734883 Wynn et al. May 2004 B1
6757002 Oross et al. Jun 2004 B1
6788292 Nako et al. Sep 2004 B1
6803906 Morrison et al. Oct 2004 B1
6833827 Lui et al. Dec 2004 B2
6836367 Seino et al. Dec 2004 B2
6857746 Dyner Feb 2005 B2
6864882 Newton Mar 2005 B2
6874683 Keronen et al. Apr 2005 B2
6888536 Westerman et al. May 2005 B2
6944557 Hama et al. Sep 2005 B2
6947032 Morrison et al. Sep 2005 B2
6954197 Morrison et al. Oct 2005 B2
6958749 Matsushita et al. Oct 2005 B1
6972401 Akitt et al. Dec 2005 B2
6988246 Kopitzke et al. Jan 2006 B2
6992660 Kawano et al. Jan 2006 B2
7006077 Uusimaki Feb 2006 B1
7007239 Hawkins et al. Feb 2006 B1
7030861 Westerman et al. Apr 2006 B1
7046232 Inagaki et al. May 2006 B2
7126583 Breed Oct 2006 B1
7133032 Cok Nov 2006 B2
7155683 Williams Dec 2006 B1
7159763 Yap et al. Jan 2007 B2
7176905 Baharav et al. Feb 2007 B2
7184030 McCharles et al. Feb 2007 B2
7221462 Cavallucci May 2007 B2
7225408 O'Rourke May 2007 B2
7232986 Worthington et al. Jun 2007 B2
7254775 Geaghan et al. Aug 2007 B2
7265748 Ryynanen Sep 2007 B2
7283845 De Bast Oct 2007 B2
7286063 Gauthey et al. Oct 2007 B2
7339580 Westerman et al. Mar 2008 B2
7352940 Charters et al. Apr 2008 B2
7355594 Barkan Apr 2008 B2
7369724 Deane May 2008 B2
7372456 McLintock May 2008 B2
7429706 Ho Sep 2008 B2
7435940 Eliasson et al. Oct 2008 B2
7441196 Gottfurcht et al. Oct 2008 B2
7441800 Weber et al. Oct 2008 B2
7442914 Eliasson et al. Oct 2008 B2
7464110 Pyhalammi et al. Dec 2008 B2
7465914 Eliasson et al. Dec 2008 B2
7469381 Ording Dec 2008 B2
7474772 Russo Jan 2009 B2
7479949 Jobs et al. Jan 2009 B2
7518738 Cavallucci et al. Apr 2009 B2
7587072 Russo et al. Sep 2009 B2
7633300 Keroe et al. Dec 2009 B2
7663607 Hotelling et al. Feb 2010 B2
7705835 Eikman Apr 2010 B2
7742290 Kaya Jun 2010 B1
7782296 Kong et al. Aug 2010 B2
7812828 Westerman et al. Oct 2010 B2
7855716 McCreary et al. Dec 2010 B2
7880724 Nguyen et al. Feb 2011 B2
7880732 Goertz Feb 2011 B2
8022941 Smoot Sep 2011 B2
8026798 Makinen et al. Sep 2011 B2
8068101 Goertz Nov 2011 B2
8089299 Rahman et al. Jan 2012 B1
8095879 Goertz Jan 2012 B2
8120625 Hinckley Feb 2012 B2
8193498 Cavallucci et al. Jun 2012 B2
8289299 Newton Oct 2012 B2
8564424 Evarts et al. Oct 2013 B2
8775023 Frojdh et al. Jul 2014 B2
8918252 Frojdh et al. Dec 2014 B2
8933876 Galor et al. Jan 2015 B2
9092093 Jubner et al. Jul 2015 B2
9770986 Sannomiya et al. Sep 2017 B2
10254943 Jubner Apr 2019 B2
20020152010 Colmenarez et al. Oct 2002 A1
20020158453 Levine Oct 2002 A1
20030086588 Shinada et al. May 2003 A1
20040044293 Burton Mar 2004 A1
20040199309 Hayashi et al. Oct 2004 A1
20050021190 Worrell et al. Jan 2005 A1
20050052426 Hagermoser et al. Mar 2005 A1
20060047386 Kanevsky et al. Mar 2006 A1
20080211779 Pryor Sep 2008 A1
20090139778 Butler et al. Jun 2009 A1
20090166098 Sunder Jul 2009 A1
20090278915 Kramer et al. Nov 2009 A1
20090322673 Cherradi El Fadili Dec 2009 A1
20100185341 Wilson et al. Jul 2010 A1
20110030502 Lathrop Feb 2011 A1
20110032214 Gruhlke et al. Feb 2011 A1
20110050589 Yan et al. Mar 2011 A1
20110087963 Brisebois et al. Apr 2011 A1
20110241850 Bosch et al. Oct 2011 A1
20110310005 Chen et al. Dec 2011 A1
20120109455 Newman et al. May 2012 A1
20120179328 Goldman-Shenhar Jul 2012 A1
20120232751 Guspan Sep 2012 A1
20120283894 Naboulsi Nov 2012 A1
20120326735 Bennett et al. Dec 2012 A1
20130024071 Sivertsen Jan 2013 A1
20130063336 Sugimoto et al. Mar 2013 A1
20130204457 King et al. Aug 2013 A1
20140081521 Frojdh et al. Mar 2014 A1
20140292665 Lathrop et al. Oct 2014 A1
20150100204 Gondo Apr 2015 A1
20180105185 Watanabe et al. Apr 2018 A1
Foreign Referenced Citations (36)
Number Date Country
4423744 Apr 1995 DE
0330767 Sep 1989 EP
0513694 Nov 1992 EP
0601651 Jun 1994 EP
0618528 Oct 1994 EP
0703525 Mar 1996 EP
1059603 Dec 2000 EP
1107666 Mar 1968 GB
2319997 Jun 1998 GB
2423808 Sep 2006 GB
03216719 Sep 1991 JP
5-173699 Jul 1993 JP
6-39621 May 1994 JP
3240941 Apr 1995 JP
10-148640 Jun 1998 JP
10-269012 Oct 1998 JP
11-232024 Aug 1999 JP
2001-216069 Aug 2001 JP
2009-248629 Oct 2009 JP
2011-254957 Dec 2011 JP
2012-181639 Sep 2012 JP
2014-225145 Dec 2014 JP
8600446 Jan 1986 WO
8600447 Jan 1986 WO
9615464 May 1996 WO
0102949 Jan 2001 WO
0140922 Jun 2001 WO
02095668 Nov 2002 WO
03038592 May 2003 WO
03083767 Oct 2003 WO
2005026938 Mar 2005 WO
2008147266 Dec 2008 WO
2009008786 Jan 2009 WO
2010093570 Aug 2010 WO
2010121031 Oct 2010 WO
2011119483 Sep 2011 WO
Non-Patent Literature Citations (26)
Entry
Moeller et al., “ZeroTouch: An Optical Multi-Touch and Free-Air Interaction Architecture”, Proc. CHI 2012, Proceedings of the 2012 Annual Conference Extended Abstracts on Human Factors in Computing Systems, May 5, 2012, pp. 2165-2174, ACM, New York, NY, USA.
Moeller et al., “ZeroTouch: A Zero-Thickness Optical Multi-Touch Force Field”, CHI EA '11, Proceedings of the 2011 Annual Conference Extended Abstracts on Human Factors in Computing Systems, May 2011, pp. 1165-1170, ACM, New York, NY, USA.
Moeller et al., “IntangibleCanvas: Free-Air Finger Painting on a Projected Canvas”, CHI EA '11 Proceedings of the 2011 Annual Conference Extended Abstracts on Human Factors in Computing Systems, May 2011, pp. 1615-1620, ACM, New York, NY, USA.
Moeller et al., “Scanning FTIR: Unobtrusive Optoelectronic Multi-Touch Sensing through Waveguide Transmissivity Imaging”, TEI '10 Proceedings of the Fourth International Conference on Tangible, Embedded, and Embodied Interaction, Jan. 2010, pp. 73-76, ACM, New York, NY, USA.
Myers, Brad A., “Mobile Devices for Control”, Mobile HCI 2002, LNCS 2411, pp. 1-8, 2002, Springer-Verlag, Berlin Heidelberg.
Myers et al., “Two-Handed Input Using a PDA and a Mouse”, CHI Letters, vol. 2—Issue 1, CHI 2000, Apr. 1-6, 2000.
Myers, Brad A., “Using Handhelds and PCs Together”, Communications of the ACM, Nov. 2001/vol. 44, No. 11, ACM.
Pfleging et al., “Multimodal Interaction in the Car—Combining Speech and Gestures on the Steering Wheel”, Proceedings of the 4th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '12), Oct. 17-19, 2012, Portsmouth, NH, USA.
Pfeiffer et al., A Multi-Touch Enabled Steering Wheel—Exploring the Design Space, CHI 2010, Apr. 10-15, 2010, Atlanta, Georgia, USA.
Mahr et al., Determining Human-Centered Parameters of Ergonomic Micro-Gesture Interaction for Drivers Using the Theater Approach, AutomotiveUI 2011, Nov. 30-Dec. 2, 2011, Salzburg, Austria.
Döring et al., Gestural Interaction on the Steering Wheel—Reducing the Visual Demand, CHI 2011, May 7-12, 2011, Vancouver, BC, Canada.
Navarro et al., “Lateral Control Support for Car Drivers: a Human-Machine Cooperation Approach”, Proceedings of the ECCE 2007 Conference, Aug. 28-31, 2007, London, UK.
Angelini et al., “Gesturing on the Steering Wheel: a User-Elicited Taxonomy”, AutomotiveUI '14, Sep. 17-19, 2014, Seattle, WA, USA.
Werner, Steffen, “The Steering Wheel as a Touch Interface: Using Thumb-Based Gesture Interfaces as Control Inputs While Driving”, AutomotiveUI '14, Sep. 17-19, 2014, Seattle, WA, USA.
González et al., “Eyes on the Road, Hands on the Wheel: Thumb-based Interaction Techniques for Input on Steering Wheels”, Graphics Interface Conference 2007, May 28-30, 2007, Montreal, Canada.
Murer et al., “Exploring the Back of the Steering Wheel: Text Input with Hands on the Wheel and Eyes on the Road”, AutomotiveUI'12, Oct. 17-19, 2012, Portsmouth, NH, USA.
Koyama et al., “Multi-Touch Steering Wheel for In-Car Tertiary Applications Using Infrared Sensors”, AH '14, Mar. 7-9, 2014, Kobe, Japan.
U.S. Appl. No. 14/088,458, Non-final Office action, dated Feb. 7, 2014, 8 pages.
U.S. Appl. No. 14/088,458, Notice of Allowance, dated Mar. 6, 2014, 8 pages.
PCT Application No. PCT/US13/71557, Search Report and Written Opinion, dated Apr. 25, 2014, 25 pages.
Australian Patent Application No. 2013352456, Examination Report No. 1, dated Dec. 23, 2014, 9 pages.
Chinese Patent Application No. 201380021907.X, First Office Action, dated Mar. 28, 2016, 12 pages.
European Patent Application No. 13 859 391.8, Mar. 18, 2016, 8 pages.
European Patent Application No. 17 184 782.5, Search Report, dated Jul. 9, 2018, 10 pages.
Japanese Patent Application No. 2015-530174, Office Action, dated Aug. 6, 2015, 7 pages.
Korean Patent Application No. 10-2015-7001419, First Office action, dated May 20, 2015, 3 pages.
Related Publications (1)
Number Date Country
20190258389 A1 Aug 2019 US
Provisional Applications (13)
Number Date Country
61730139 Nov 2012 US
61828713 May 2013 US
61838296 Jun 2013 US
61846089 Jul 2013 US
61929992 Jan 2014 US
61972435 Mar 2014 US
61986341 Apr 2014 US
61830671 Jun 2013 US
61833161 Jun 2013 US
61911915 Dec 2013 US
61919759 Dec 2013 US
61923775 Jan 2014 US
61950868 Mar 2014 US
Continuations (7)
Number Date Country
Parent 15647693 Jul 2017 US
Child 16365657 US
Parent 14805445 Jul 2015 US
Child 15647693 US
Parent 14590010 Jan 2015 US
Child 14805445 US
Parent 14312711 Jun 2014 US
Child 14551096 US
Parent 14088458 Nov 2013 US
Child 14312711 US
Parent PCT/US2014/040112 May 2014 US
Child 14312787 US
Parent PCT/US2014/040579 Jun 2014 US
Child 14311366 US
Continuation in Parts (5)
Number Date Country
Parent 14551096 Nov 2014 US
Child 14590010 US
Parent 14555731 Nov 2014 US
Child 14551096 US
Parent 14312787 Jun 2014 US
Child 14555731 US
Parent 14311366 Jun 2014 US
Child 14312787 US
Parent 14140635 Dec 2013 US
Child 14311366 US