The present disclosure relates generally to computer and consumer product user interface pointing systems. More particularly, the disclosure relates to a pointing apparatus and method that responds to multiple input sources simultaneously and collaboratively to control the position and appearance of a cursor or pointer on an electronic display.
This section provides background information related to the present disclosure which is not necessarily prior art.
Pointing is a fundamental operation found in most graphical user interface (GUI) systems used by computers and many consumer electronics products. Typically, the user manipulates a controller, such as a mouse, which in turn moves a computer-generated cursor on a display. The user then moves the cursor to select items of interest, navigate through multiple screens, explore content and the like. While use of a mouse is typical, some applications prefer gestural control, where the user performs pointing using multiple body parts simultaneously and collaboratively, such as moving elbow, hand and finger to reach an object. For example, while holding a remote controller, the user may make an in-the-air gesture that is translated into cursor movement on the display screen.
Current remote pointing methods suffer from the conflicting constraints of range and precision. A remote pointing device that has sufficient range to reach all parts of the display tends to be difficult to control precisely, because the user has difficulty holding his or her hand steady once the desired cursor position is reached. Conversely, a remote pointing device that offers precise control within a predefined region of the display may not easily reach other regions of the display. For example, a touch-sensitive controller may allow accurate pointing within a limited range but requires repeated swipes to move to a different region on the display, causing user fatigue.
The hybrid pointing apparatus and method disclosed here overcome the aforementioned difficulties by allowing multiple input sources, such as in-the-air hand movement and finger-pointing on a touchpad surface, to work together in a collaborative fashion.
The disclosed pointing apparatus or controller facilitates user interaction with displayed elements on an electronic display of the type having a cursor generation and display system that displays a graphical cursor at a user-controllable position on the display. The controller includes a first sensor, such as a motion sensor, responsive to user movement of a first type and producing first sensor data. The controller also includes a second sensor, such as a touch-responsive touchpad sensor, responsive to user movement of a second type, different from the first type, and producing second sensor data.
The controller and/or the electronics product coupled to the display further includes at least one processor that calculates a hybrid cursor movement signal having a large scale movement component and a fine scale movement component. The processor or processors calculate the large scale movement component based on the first sensor data to which is applied a sensitivity parameter based on the second sensor data. The processor or processors also calculate the fine scale movement component based on the second sensor data to which is applied a sensitivity parameter based on the first sensor data.
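As a non-authoritative sketch of this cross-coupling: the function and variable names below are illustrative assumptions, and the reciprocal-decay sensitivity function is likewise assumed, since the summary specifies only that each component's sensitivity parameter is derived from the other sensor's data.

```python
# Illustrative sketch only: each movement component is attenuated by a
# sensitivity factor computed from the *other* sensor's recent activity.

def sensitivity(other_delta: float, gain: float = 10.0) -> float:
    """Near 1.0 when the other sensor is quiet; approaches 0 as it moves."""
    return 1.0 / (1.0 + gain * abs(other_delta))

def hybrid_movement(motion_delta: float, touch_delta: float) -> float:
    """Combine a large scale (in-air) and a fine scale (touchpad) component."""
    large_scale = sensitivity(touch_delta) * motion_delta
    fine_scale = sensitivity(motion_delta) * touch_delta
    return large_scale + fine_scale
```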
The controller further includes a transmitter for wirelessly communicating the hybrid cursor movement signal to the cursor generation and display system.
Further areas of applicability will become apparent from the description provided herein. The description and specific examples in this summary are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.
The drawings described herein are for illustrative purposes only of selected embodiments and not all possible implementations, and are not intended to limit the scope of the present disclosure.
FIGS. 3a and 3b illustrate an alternative embodiment featuring gesture sensors disposed about the periphery of a touch-sensitive controller;
FIGS. 10a and 10b graphically illustrate how the motion sensitivity parameters perform under exemplary use cases;
Corresponding reference numerals indicate corresponding parts throughout the several views of the drawings.
Example embodiments will now be described more fully with reference to the accompanying drawings.
Referring first to the drawings, an exemplary embodiment is shown in which a handheld controller 10 controls a cursor 12 displayed on an electronic display 14.
In the illustrated embodiment, the handheld controller includes at least one touchpad 16 and also includes embedded motion sensing circuitry (discussed below) to detect in-air movement of the controller 10. The handheld controller thus provides two sources of sensor data: in-air motion data and touchpad data that are combined in a hybrid fashion to control how the cursor 12 appears on display 14 and how it moves. More specifically, the cursor 12 defines a rough pointer region 18, centered at 20, and a precise pointer 22 disposed within rough pointer region 18. As will be more fully described, in-air movement of the controller 10, such as movement from side to side, causes the rough pointer region (and precise pointer 22 within) to move about the screen, generally tracking with how the user waves the controller in the air. Thus the cursor, and specifically the rough pointer region of the cursor, effects large scale movement about the screen. By touching the touchpad 16 with a thumb or finger, the user controls the precise pointer 22 within the rough pointer region 18. Thus the cursor, and specifically the precise pointer of the cursor, effects fine scale movement within the rough pointer region. As will be described, the overall size of the rough pointer region varies depending on how the user manipulates the controller.
Referring now to the internal circuitry of the handheld controller, the controller includes a processor with associated memory and an input/output (I/O) interface 28.
Coupled to the I/O interface 28 is a wireless transceiver 30 used to communicate by radio frequency with electronic circuitry associated with display 14.
To sense in-air motion, the controller includes one or more inertial sensors, such as accelerometer 36, magnetometer 37 and gyroscope 38. The accelerometer produces a signal indicative of linear acceleration, the second derivative of position. The accelerometer 36 of the preferred embodiment is a three-axis accelerometer that measures linear acceleration in a three-dimensional reference frame. Gyroscope 38 produces a signal indicative of rotation and thus has the ability to measure the rate of rotation about a particular axis. The magnetometer 37 produces a signal indicative of the compass pointing direction.
A first embodiment uses only the gyroscope 38. This embodiment has the advantage of low cost; however, a gyroscope-only solution may experience some drift. To reduce the effect of this drift, a second embodiment combines the accelerometer 36 with the gyroscope. A third embodiment adds the magnetometer 37 to the accelerometer and gyroscope. Adding the magnetometer further reduces drift and gives the controller knowledge of its actual pointing direction with respect to a geographic reference frame, such as true North-South-East-West, as opposed to merely relative motion information.
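For background, one widely used way to blend a drifting gyroscope with an accelerometer's gravity reference is a complementary filter. The sketch below is a generic illustration under that assumption, not necessarily the fusion method contemplated by this disclosure:

```python
import math

def complementary_pitch(pitch_prev: float, gyro_rate: float,
                        accel_y: float, accel_z: float,
                        dt: float, k: float = 0.98) -> float:
    """Drift-corrected pitch estimate (radians).

    Gyroscope integration tracks fast motion but drifts over time; the
    accelerometer's gravity direction is noisy but drift-free, so a small
    fraction (1 - k) of it continuously pulls the estimate back.
    """
    gyro_pitch = pitch_prev + gyro_rate * dt      # fast, drifting term
    accel_pitch = math.atan2(accel_y, accel_z)    # slow, absolute gravity term
    return k * gyro_pitch + (1.0 - k) * accel_pitch
```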
As an alternative or addition to inertial sensors, an optical tracking system may also be used. The optical tracking system uses an infrared camera on the remote controller, which tracks infrared light-emitting diodes disposed along the top or bottom of the display.
As previously noted, the processor and electronics for implementing the hybrid pointing system and method can be deployed in the handheld controller alone, or distributed across other components, such as a receiver, Blu-ray disc player, television receiver, audio-video processor and the like.
Referring now to a distributed embodiment, the hybrid pointing calculations are performed within a consumer electronics product 15 coupled to the display.
The handheld controller includes a processor 26c with associated memory 24c that communicates through its input/output circuit 28 to supply sensor movement data (obtained from the touchpads 16 and from the motion sensors: accelerometer 36, magnetometer 37 and gyroscope 38) to the consumer electronics product 15 via the wireless transceiver pair: transceiver 30c (on the handheld controller) and transceiver 30 (on the consumer electronics product).
The processor 24 within the consumer electronics product then uses this sensor movement data to calculate the hybrid cursor movement signal and the cursor size data.
In general, the hybrid pointing system and method can be implemented using any plural number of sensors (two or more). The controller 10 described above provides two such sources: the embedded motion sensors and the touchpad 16.
The system and method take signals from disparate sensors and combine them in a unique, collaborative way. In this regard, the illustrated embodiment supplies the in-air motion sensor data and the touchpad data 52 to a hybrid pointing/selection method 54.
The method 54 then supplies its results to an application 56, which may handle the actual on-screen cursor generation. In other words, application 56 is responsible for generating the graphical appearance of the cursor and places that cursor at a location on the display based on the results of the hybrid pointing/selection method 54. Of course, application 56 is not necessarily limited to generating the on-screen cursor; additional application functionality can also be implemented. In this regard, the hybrid pointing/selection method 54 provides the raw cursor location and cursor size information that an application can use to achieve its desired goals. Thus, a video game application, for example, might use the location and cursor size information from method 54 to control movement of a character or player in the game. In such an implementation, the rough pointer movement data might be used to control the position of the character's body, while the precise pointer movement data might be used to control the character's arms.
The hybrid pointing/selection method 54 offers several advantages. One advantage is to enable cascading control, where each sensor controls a different range of precision. This has been illustrated above: in-air hand movement positions the rough pointer region 18 anywhere on the display, while finger movement on the touchpad positions the precise pointer 22 within that region.
By virtue of the hybrid manner in which the respective sensor data are combined, the output of one stage is also related to the sensor signals of other stages. Each sensor's range may thus be used in a cascading manner, where the output of a first stage constrains the search space of the next stage. In other words, signal processing at any sensor source level depends on the signals from the other sensor sources. The result is an accurate, stable and responsive cursor position and cursor size that dynamically adapts to the user's intentions.
The hybrid pointing/selection method 54 is preferably implemented by programming processor 24. The in-air motion sensor data are fed to the hand motion intention calculation processor 72, and also to the finger motion intention calculation processor 74.
In a similar fashion, touchpad data 52 are fed to the finger motion intention calculation processor 74, and also to the hand motion intention calculation processor 72. Thus, the hand motion intention calculation results are somewhat dependent on what the user is doing with the touchpad while the in-air gestural motions are being performed. The respective motion intention calculation processors generate motion sensitivity values, represented herein by $\alpha$. More specifically, hand motion intention calculation processor 72 computes $\alpha_{hand}$ as a function of the touchpad activity (Equation 1).
Similarly, finger motion intention calculation processor 74 generates the motion sensitivity value $\alpha_{finger}$ as a function of the in-air motion (Equation 2), where:
$B_0$ is a constant; e.g., $B_0 = 10$
=> $\alpha_{finger}$ is high when the user is not performing in-the-air motion, and lower when the user is performing intensive in-the-air motion.
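Since the bodies of Equations 1 and 2 are not reproduced here, the reciprocal-decay form in the sketch below is an assumption; only the qualitative behavior just stated and the example constant $B_0 = 10$ come from the description above.

```python
# Assumed functional form for the sensitivity parameters. Only the behavior
# (alpha_finger high when the hand is quiet, and symmetrically for
# alpha_hand) and the example constant B0 = 10 are taken from the text.

def alpha_finger(hand_motion: float, b0: float = 10.0) -> float:
    """High when the user is not performing in-the-air motion."""
    return 1.0 / (1.0 + b0 * abs(hand_motion))

def alpha_hand(finger_motion: float, a0: float = 10.0) -> float:
    """High when the touchpad is quiet (assumed symmetric form)."""
    return 1.0 / (1.0 + a0 * abs(finger_motion))
```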
These motion sensitivity parameters serve as contribution factors for the associated sensor. The rough pointer movement processor 76 calculates the rough pointer movement variable $\Delta\vec{P}_{hand}$ as follows:
Equation 3: $\Delta\vec{P}_{hand,t} = \alpha_{hand,t}\,(\vec{P}^{*}_{hand,t} - \vec{P}^{*}_{hand,t-1})$
Note: $\Delta\vec{P}_{hand,t}$ is a vector quantity that changes as a function of time. $\vec{P}^{*}_{hand,t}$ and $\vec{P}^{*}_{hand,t-1}$ are raw input values from the sensor.
Similarly, precise pointer movement processor 78 calculates the precise pointer value $\Delta\vec{P}_{finger}$ as follows:
Equation 4: $\Delta\vec{P}_{finger,t} = \alpha_{finger,t}\,(\vec{P}^{*}_{finger,t} - \vec{P}^{*}_{finger,t-1})$
Note: $\Delta\vec{P}_{finger,t}$ is a vector quantity that changes as a function of time. $\vec{P}^{*}_{finger,t}$ and $\vec{P}^{*}_{finger,t-1}$ are raw input values from the sensor.
The resultant values from processors 76 and 78 are then combined by processor 80 to generate the vector $\vec{D}$ representing the cursor position as follows:
Equation 5 (for the hand/finger embodiment): $\vec{D}_t = \vec{D}_{t-1} + (S_{hand}\cdot\Delta\vec{P}_{hand,t} + S_{finger}\cdot\Delta\vec{P}_{finger,t})$
=> Movement of cursor is a combination of movement data from all sensors.
Scale Factor S: $S_{hand}$ and $S_{finger}$ are constant scale factors applied to the respective sensor movement terms, both when combining them into the cursor position (Equation 5) and when sizing the cursor (Equation 6 below).
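Equations 3 through 5 translate directly into code. The sketch below is a straightforward transcription using two-dimensional tuples; the helper names are hypothetical:

```python
from typing import Tuple

Vec = Tuple[float, float]

def scaled_delta(alpha: float, p_now: Vec, p_prev: Vec) -> Vec:
    """Equations 3 and 4: per-sensor movement scaled by its sensitivity."""
    return (alpha * (p_now[0] - p_prev[0]),
            alpha * (p_now[1] - p_prev[1]))

def combine(d_prev: Vec, dp_hand: Vec, dp_finger: Vec,
            s_hand: float, s_finger: float) -> Vec:
    """Equation 5: add both weighted movement terms to the prior position."""
    return (d_prev[0] + s_hand * dp_hand[0] + s_finger * dp_finger[0],
            d_prev[1] + s_hand * dp_hand[1] + s_finger * dp_finger[1])
```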
In addition to providing cursor position information, the hybrid pointing/selection method 54 can also control the visual appearance of the cursor, such as controlling the rough pointer region illustrated at 18 above. In the illustrated embodiment, the cursor diameter is calculated as follows:
Equation 6: $C_{diameter} = S_{finger}\cdot\alpha_{finger,t}$
=> Cursor size is defined by the scale factor of finger motion and is reduced when $\alpha_{finger}$ is small (i.e., the cursor is small while the user is performing intensive in-the-air hand motion, and becomes larger when the user's hand is steady in the air).
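In code, Equation 6 is a one-line product; the function name is hypothetical:

```python
def cursor_diameter(s_finger: float, alpha_finger_t: float) -> float:
    """Equation 6: the rough pointer region shrinks during intensive
    in-the-air motion (small alpha_finger) and grows as the hand steadies."""
    return s_finger * alpha_finger_t
```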
For a more detailed understanding of the hybrid pointing/selection method, the processing steps and the associated data structure will now be described.
At step 100, the processor captures raw sensor data for time t−1. In other words, at a designated starting time (t−1), the raw sensor data are captured and stored in the data structure at 102; this data structure also appears in the data structure diagram referenced below. The processor then captures raw sensor data again for the next sample time t.
Once two raw sensor data values have been obtained for the different times t−1 and t, a difference calculation is performed at 108 and the result is temporarily stored at 110. The sensitivity parameters are then calculated at 112, each based on the difference data from the other sensor sources, as described above.
Next, at step 114 the sensor movement value is calculated. There are two methods to perform this step. The first method calculates the sensor movement as a relative value by multiplying the sensitivity parameter with the calculated difference stored at 110; see Equation 7 below. The second method performs an absolute movement calculation in which a position value is calculated and stored at 116; see Equation 8 below.
Equation 7: $\Delta\vec{P}_{n,t} = \alpha_{n,t}\,(\vec{P}^{*}_{n,t} - \vec{P}^{*}_{n,t-1})$
Equation 8: $\Delta\vec{P}_{n,t} = \alpha_{n,t}\,(\vec{P}^{*}_{n,t} - \vec{P}_{n,t-1})$, and update: $\vec{P}_{n,t} = \vec{P}_{n,t-1} + \Delta\vec{P}_{n,t}$
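The two methods of step 114 can be sketched for an arbitrary sensor n; scalar values stand in for the disclosure's vector quantities, and the function names are hypothetical:

```python
def relative_update(alpha: float, raw_now: float, raw_prev: float) -> float:
    """Equation 7: movement from the difference of successive raw samples."""
    return alpha * (raw_now - raw_prev)

def absolute_update(alpha: float, raw_now: float, pos_prev: float):
    """Equation 8: movement measured against the stored position, which is
    then updated and carried into the next iteration."""
    dp = alpha * (raw_now - pos_prev)
    return dp, pos_prev + dp   # (movement, updated stored position)
```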
Then at step 118, the resultant cursor coordinate value $\vec{D}$ is calculated by combining the movement values from all sensors (Equation 5) and stored.
Meanwhile, the cursor size calculation is performed at 122, using the sensitivity parameters calculated at 112 together with the constant S at 120. In the illustrated embodiment, the cursor is presented as a circle whose diameter dynamically changes. It will be appreciated that cursors of other shapes and configurations are also possible, in which case the value calculated at 122 might represent a parameter other than diameter.
The calculated position $\vec{D}$ and cursor size C are then output at 124. This output is fed to the application 56, which generates the cursor and places it on the display.
The procedure thus described is iteratively repeated for subsequent time intervals, so that the value at time t for the current iteration becomes the value for t−1 in the next iteration. In the data structure, each stored value for time t is thus shifted into the corresponding t−1 slot as each new sample arrives.
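A minimal sketch of this bookkeeping, with hypothetical state names, shows the t-to-(t−1) rollover for one sensor:

```python
# Hypothetical per-sensor state: each iteration's value for time t becomes
# the value for time t-1 in the next iteration, as described above.
state = {"raw_prev": None, "cursor": 0.0}

def iterate(raw_now: float, alpha: float, scale: float) -> float:
    if state["raw_prev"] is not None:
        dp = alpha * (raw_now - state["raw_prev"])   # Equation 7
        state["cursor"] += scale * dp                # accumulate position D
    state["raw_prev"] = raw_now                      # shift t into the t-1 slot
    return state["cursor"]
```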
Referring now to FIGS. 10a and 10b, the motion sensitivity parameters are graphically illustrated under exemplary use cases: $\alpha_{finger}$ is high while the user's hand is steady in the air and drops during intensive in-the-air motion, with $\alpha_{hand}$ behaving in the complementary fashion.
The foregoing description of the embodiments has been provided for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure. Individual elements or features of a particular embodiment are generally not limited to that particular embodiment, but, where applicable, are interchangeable and can be used in a selected embodiment, even if not specifically shown or described. The same may also be varied in many ways. Such variations are not to be regarded as a departure from the disclosure, and all such modifications are intended to be included within the scope of the disclosure.