Portable electronic devices equipped with touch screens enable users to directly interact with graphical user interface elements displayed on the screen via touch input sensed by a touch screen sensor. The user visually examines the screen, and touches the screen in a location at which a graphical user interface element is displayed. The touch input is sensed by the device as occurring at the location of the graphical user interface element, triggering appropriate functionality on the portable electronic device.
One drawback of such devices is that they are difficult to interact with when the user cannot, or prefers not to, visually examine the screen. For example, when a user is exercising, riding a subway train, etc., the user may find it inconvenient or undesirable to look at the screen for extended periods of time. This may result in input errors, or cause the user to look at the screen at an undesirable time, generally frustrating the user experience.
A computer program executable on a portable electronic device having a touch screen sensor is provided. The computer program may include an input mode switching module configured to receive a mode switch user input and, in response, to switch between a direct input mode and a relative gesture recognition mode. In the direct input mode, one or more graphical user interface elements of a graphical user interface of the portable electronic device are selectable via touch input of the user. In the relative gesture recognition mode, the graphical user interface elements in at least a defined region of the graphical user interface are made to be unselectable. The computer program may further include a gesture-based control module configured, in the relative gesture recognition mode, to recognize a contact point on the touch screen sensor between a digit of a user and a surface of the touch screen sensor in the defined region in which the graphical user interface elements are unselectable, and to present in the defined region a gesture control proximate to the contact point. The gesture-based control module may further be configured to identify a detected gesture based on user touch input originating from the contact point, and to send a message to an application program to adjust an operation of the portable electronic device based on the detected gesture.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
The graphical user interface 164 may be configured with a direct input mode in which one or more graphical user interface elements 165 of the graphical user interface are selectable graphical user interface elements 166, which are selectable via touch input of the user sensed by the touch screen sensor 162 at a location of the selectable graphical user interface element 166 on the display 160. Examples of selectable graphical user interface elements 166 include buttons, sliders, scroll bars, hyperlinks, pull down menus, icons, etc. The behavior of these various selectable graphical user interface elements 166 may be programmed, for example, via a computer program 130, which may be an application programming interface. Thus, in response to a user touch input selecting a selectable graphical user interface element 166, the portable electronic device may exhibit a programmed behavior associated with the selectable graphical user interface element 166, such as selecting a pull down menu option, scrolling a window, etc.
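By way of a non-limiting illustration, the direct input mode amounts to a hit test: the sensed touch location is compared against the bounds of each selectable element, and the programmed behavior of the matching element is executed. The following Kotlin sketch shows one possible shape of this logic; the names (Bounds, SelectableElement, onDirectTouch) and the sample behavior are hypothetical and not drawn from the disclosure.

```kotlin
// Hypothetical sketch of the direct input mode: hit-test the sensed touch
// location against each selectable element and run its programmed behavior.
data class Bounds(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

class SelectableElement(val bounds: Bounds, val onSelect: () -> Unit)

fun onDirectTouch(x: Float, y: Float, elements: List<SelectableElement>) {
    // Execute the behavior of the first element displayed under the touch, if any.
    elements.firstOrNull { it.bounds.contains(x, y) }?.onSelect?.invoke()
}

fun main() {
    val menuButton = SelectableElement(Bounds(0f, 0f, 100f, 40f)) { println("menu opened") }
    onDirectTouch(50f, 20f, listOf(menuButton)) // prints "menu opened"
}
```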
To enable a user to switch input modes, portable electronic device 100 may include a computer program 130, such as an application programming interface, which includes an input mode switch module 135 configured to receive mode switch user input 152 and, in response, to switch between the direct input mode and a relative gesture recognition mode. In the relative gesture recognition mode, one or more graphical user interface elements 165 in at least a defined region 170 of the graphical user interface 164 are made to be unselectable graphical user interface elements 168. In other words, an input received at a location adjacent a particular unselectable graphical user interface element 168 in the relative gesture recognition mode will not cause portable electronic device 100 to execute the programmed functionality associated with that user interface element in the direct input mode. Rather, touch input 156 in the relative gesture recognition mode will be processed as relative gesture input, irrespective of underlying graphical user interface elements 165, as described below.
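As a rough sketch of how such a module might be organized, assuming a simple two-state toggle (the names InputMode and InputModeSwitchModule are hypothetical):

```kotlin
// Hypothetical sketch of input mode switching: one piece of state decides
// whether touches select GUI elements or are processed as relative gestures.
enum class InputMode { DIRECT, RELATIVE_GESTURE }

class InputModeSwitchModule {
    var mode: InputMode = InputMode.DIRECT
        private set

    // Called on receipt of the mode switch user input, e.g. a clutch key press
    // or a designated touch; toggles between the two modes.
    fun onModeSwitchInput() {
        mode = if (mode == InputMode.DIRECT) InputMode.RELATIVE_GESTURE
               else InputMode.DIRECT
    }
}
```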
In the relative gesture recognition mode, a gesture-based control module 140 within computer program 130 is configured to recognize a contact point 174 on touch screen sensor 162 between a digit of a user and the surface of touch screen sensor 162 in the defined region 170 in which the graphical user interface elements 168 are unselectable, and to present in the defined region 170 a gesture control 172 proximate to contact point 174. The gesture-based control module 140 is further configured to identify a detected gesture 158 based on user touch input 156 originating from contact point 174, and to send a message to application program 110 to adjust an operation of portable electronic device 100 based on the detected gesture 158.
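A minimal sketch of the contact point recognition step might look as follows, modeling the defined region as a predicate over touch coordinates (ContactPointRecognizer and GestureControl are hypothetical names):

```kotlin
// Hypothetical sketch: on a touch-down inside the defined region while in the
// relative gesture recognition mode, record the contact point and present a
// gesture control centered on it.
data class Point(val x: Float, val y: Float)

class GestureControl(val origin: Point) // presented proximate to the contact point

class ContactPointRecognizer(private val inDefinedRegion: (Point) -> Boolean) {
    var activeControl: GestureControl? = null
        private set

    fun onTouchDown(p: Point) {
        if (inDefinedRegion(p)) activeControl = GestureControl(origin = p)
    }
}
```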
Computer program 130 may also be configured to enable gesture-based control module 140 to access developer specified control parameters 149 by which gesture control 172 is configured to operate. Developer specified control parameters 149 may be received by gesture-based control module 140 from developer specified control parameter interface 180. Developer specified control parameters 149 may be specified, for example, by an application program developer via a software development kit (SDK) and may include parameters to customize the features and the functionality of gesture control 172. For example, developer specified control parameters 149 may include a volume parameter, a playback speed parameter, a playback direction parameter, a control perimeter definition parameter, and a defined region definition parameter. In this manner, a developer may define the gesture control 172 to be a volume control or a playback control, and may specify the control perimeter or other geometric properties of the control, as well as the defined region of the display that will be configured to receive gesture input.
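One way to model such parameters is a plain value type handed from the SDK to the gesture-based control module; the field names and default values below are hypothetical, chosen only to mirror the parameters enumerated above:

```kotlin
// Hypothetical sketch of developer specified control parameters; each field
// corresponds to one of the parameters enumerated above.
data class Region(val left: Float, val top: Float, val right: Float, val bottom: Float)

data class DeveloperControlParams(
    val volumeUnitsPerDistanceUnit: Float = 1f,             // volume parameter
    val playbackSpeed: Float = 1f,                          // playback speed parameter
    val playbackDirection: Int = 1,                         // playback direction: 1 forward, -1 reverse
    val controlPerimeterRadius: Float = 80f,                // control perimeter definition parameter
    val definedRegion: Region = Region(0f, 0f, 480f, 320f)  // defined region definition parameter
)
```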
According to these developer specified control parameters, or alternatively according to other pre-defined parameters specified by computer program 130, in the relative gesture recognition mode, gesture-based control module 140 is configured to present gesture control 172 within defined region 170, which is configured to receive touch input 156. By identifying detected gesture 158 within defined region 170, gesture-based control module 140, in the relative gesture recognition mode, functions as a front-end processor that receives input which would otherwise, in the direct input mode, be directed to graphical user interface 164. Acting as a front-end processor, gesture-based control module 140 may be configured to position defined region 170 independent of the various elements displayed on graphical user interface 164 of portable electronic device 100, such that defined region 170 floats over a portion of, or the entirety of, graphical user interface 164.
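In code, this front-end behavior reduces to a dispatch decision made before ordinary GUI hit testing. The sketch below is illustrative only; the function and parameter names are hypothetical:

```kotlin
// Hypothetical sketch of the front-end dispatch: in the relative gesture
// recognition mode, touches inside the defined region bypass the GUI entirely.
enum class InputMode { DIRECT, RELATIVE_GESTURE }
data class Point(val x: Float, val y: Float)

fun dispatchTouch(
    mode: InputMode,
    p: Point,
    inDefinedRegion: (Point) -> Boolean,
    toGui: (Point) -> Unit,            // direct input: normal hit testing
    toGestureModule: (Point) -> Unit   // relative input: gesture recognition
) {
    if (mode == InputMode.RELATIVE_GESTURE && inDefinedRegion(p)) {
        toGestureModule(p) // processed irrespective of underlying GUI elements
    } else {
        toGui(p)
    }
}
```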
The relative gesture recognition mode may be initiated by receipt of mode switch user input 152 by input mode switch module 135 of computer program 130. Mode switch user input 152 may be, for example, a user input via a clutch key associated with portable electronic device 100, or a contact between a digit of the user and the surface of touch screen sensor 162.
Having received mode switch user input 152, input mode switch module 135 initiates the relative gesture recognition mode, and outputs a message to gesture-based control module 140. Specifically, input mode switch module 135 sends a request message to contact point recognizer 142 within gesture-based control module 140, indicating that the relative gesture recognition mode has been initiated and requesting that the contact point recognizer 142 return a contact point 174 in defined region 170, within which the graphical user interface elements 168 are unselectable.
Upon receiving the request message, contact point recognizer 142 recognizes contact point 174 within defined region 170 on the surface of touch screen sensor 162. Contact point 174 is formed by contact between a digit of the user and the surface of touch screen sensor 162.
Upon recognition of contact point 174, contact point recognizer 142 may be configured to present a gesture control 172 having a defined control perimeter 176 proximate to the recognized contact point 174 in defined region 170. Contact point recognizer 142 may receive a control perimeter definition parameter from control perimeter definer 144, specifying the parameters of control perimeter 176. The control perimeter definition parameter may specify a formula, for example, for computing the control perimeter, which may be based on distance D from contact point 174. In one example, the control perimeter may be a preset control perimeter from a set of standard control definitions accessible via computer program 130. In another example, control perimeter definer 144 may receive a control perimeter definition parameter included in a set of developer specified control parameters 149 from developer specified parameter module 148, thus enabling a developer to specify the size and shape of the control perimeter.
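A minimal sketch of this step, assuming a circular perimeter whose radius is produced either by a preset constant or by a developer-supplied formula (all names hypothetical):

```kotlin
import kotlin.math.hypot

// Hypothetical sketch: a circular control perimeter placed around the
// recognized contact point, with a radius produced by a preset constant or
// by a developer-supplied formula.
data class Point(val x: Float, val y: Float)

class ControlPerimeter(val center: Point, val radius: Float) {
    fun contains(p: Point) = hypot(p.x - center.x, p.y - center.y) <= radius
}

fun presentGestureControl(contact: Point, radiusFormula: () -> Float) =
    ControlPerimeter(center = contact, radius = radiusFormula())

// A preset perimeter is then simply a constant formula:
val presetRadius = { 80f }
```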
It will be appreciated that gesture control 172 may include an associated icon, which may be partially translucent, although in other embodiments gesture control 172 may not be visually perceptible. The icon, if present, may visually indicate the control perimeter and/or the contact point, or may provide the user with other iconographic information. This other iconographic information may, for example, include an angle and degree of deflection in the case of a virtual control stick control, or a degree of deflection in the case of a linear slider control. In some embodiments, the icons may respond to tapping inputs in addition to accepting gestures as described herein.
Having presented gesture control 172, contact point recognizer 142 is configured to send a message to identifier 146 requesting identification of detected gesture 158, which originates at contact point 174.
Based on the inputs it receives, identifier 146 is configured to identify touch input received via touch screen sensor 162 as a detected gesture 158 originating from contact point 174.
It will be appreciated that the interpretation of detected gesture 158 may be based on one or more developer specified control parameters 149, as included in developer specified parameter module 148 and received from developer specified control parameter interface 180. In this way, a developer for application program 110 may specify the interpretation of detected gesture 158. For example, a developer may indicate domains within defined region 170 in which detected gesture 158 may be ignored (e.g., a “dead zone”), discrimination parameters that interpret detected gesture 158 according to developer specified rules, logic configured to discriminate between actual detected gestures and spurious detected gestures, etc. In this way, a developer may tailor the operation of identifier 146 according to a particular application program 110.
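A sketch of how such an identifier might classify a displacement against a small library of pre-defined gestures, honoring a developer-specified dead zone in which movement is ignored (the gesture set, names, and thresholds are hypothetical):

```kotlin
import kotlin.math.abs
import kotlin.math.hypot

// Hypothetical sketch of gesture identification: compare the displacement
// from the contact point against pre-defined gesture definitions, ignoring
// movement inside a developer-specified dead zone.
enum class Gesture { VOLUME_UP, VOLUME_DOWN, SEEK_FORWARD, SEEK_BACK, NONE }

fun identify(dx: Float, dy: Float, deadZoneRadius: Float): Gesture {
    if (hypot(dx, dy) < deadZoneRadius) return Gesture.NONE // spurious / ignored
    return if (abs(dy) >= abs(dx)) {
        if (dy < 0) Gesture.VOLUME_UP else Gesture.VOLUME_DOWN // screen y grows downward
    } else {
        if (dx > 0) Gesture.SEEK_FORWARD else Gesture.SEEK_BACK
    }
}
```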
Having interpreted detected gesture 158, identifier 146 sends a message to application program 110 via a communication module 150 of gesture-based control module 140. The message informs application program 110 of the detected gesture 158, and may cause the application program to adjust an operation of portable electronic device 100 based on detected gesture 158.
For example, the identifier 146 may be configured to instruct the application program 110 to adjust an operation of portable electronic device 100 based on a relative distance from contact point 174 to the detected gesture 158 that has been identified. One example of this is the volume control described below.
Continuing with this example, the substantially positive vertical direction of detected gesture 158 may be interpreted as corresponding to a pre-defined gesture 192 within library 190 for increasing the volume of media playback. Further, a volume intensity of the media playback may be determined according to distance B, the distance between contact point 174 and an endpoint of detected gesture 158. For example, the volume intensity may be determined by an absolute measure of distance B. Thus, if B is determined to be five measured distance units, the volume intensity may be changed by five volume units, for example. In another example, the volume intensity may be determined by distance B relative to a particular volume level, which may be specified among a set of developer specified control parameters 149.
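The two volume mappings just described can be expressed directly; the function names and the full-scale distance below are hypothetical:

```kotlin
// Hypothetical sketch of the two volume intensity mappings described above.

// Absolute mapping: B measured distance units map, by a developer-specified
// factor, onto volume units; with a factor of 1, B = 5 yields +5 volume units.
fun volumeChangeAbsolute(distanceB: Float, unitsPerDistance: Float = 1f): Float =
    distanceB * unitsPerDistance

// Relative mapping: B is scaled against a particular reference volume level,
// reached at a hypothetical full-scale distance.
fun volumeChangeRelative(distanceB: Float, referenceLevel: Float, fullScaleDistance: Float): Float =
    referenceLevel * (distanceB / fullScaleDistance)
```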
To implement a pause control, for example, the detected gesture 158 may be identified based on detection of a tapping movement of the digit of the user relative to the frame of reference 210. In response, the gesture-based control module may send the tapping input to the application program, which may interpret the tapping input to change a pause status of media playback. To implement fast forward and/or rewind controls, the detected gesture 158 may be identified based on detection of a substantially horizontal direction of movement of the digit of a user relative to frame of reference 210, and in response the gesture-based control module may send the detected gesture 158 to the application program, which in turn may adjust a temporal position of a media playback. It will be appreciated that media playback may be audio or visual media stored on portable electronic device 100 or may be media received by portable electronic device 100 from a network. Further, transport control 200 may be configured according to the type of media played back. For example, if the media playback is a broadcast stream from a radio station, the fast forward and/or rewind controls described above may instead control scanning forward or backward across radio frequencies, the above-described tapping input may activate a station preset, etc.
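By way of illustration, the same small gesture set can drive different behavior depending on the media type, as described above. Everything in the sketch below (the gesture set, the seek step, the frequency step, the preset) is hypothetical:

```kotlin
// Hypothetical sketch of a transport control whose gesture handling depends
// on the type of media being played back.
enum class TransportGesture { TAP, SWIPE_LEFT, SWIPE_RIGHT }

interface Playback { fun onGesture(g: TransportGesture) }

class StoredMediaPlayback : Playback {
    var paused = false
    var positionSec = 0
    override fun onGesture(g: TransportGesture) {
        when (g) {
            TransportGesture.TAP -> paused = !paused          // change pause status
            TransportGesture.SWIPE_RIGHT -> positionSec += 10 // fast forward
            TransportGesture.SWIPE_LEFT -> positionSec -= 10  // rewind
        }
    }
}

class RadioPlayback : Playback {
    var frequencyKhz = 98_500
    override fun onGesture(g: TransportGesture) {
        when (g) {
            TransportGesture.TAP -> frequencyKhz = 101_100      // activate a station preset
            TransportGesture.SWIPE_RIGHT -> frequencyKhz += 200 // scan up the band
            TransportGesture.SWIPE_LEFT -> frequencyKhz -= 200  // scan down the band
        }
    }
}
```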
It will be further appreciated that transport control 200 may present control options according to the context of an application program, in addition to the gesture based transport controls, and these options may be gesture based or non-gesture based. For example, if transport control 200 is presented in the context of a web browser application program, controls relevant to the web browser may be presented in addition to transport controls for controlling media playback. In another example, if transport control 200 is presented in the context of a computer game application program, controls relevant to the computer game may be presented, for example, transport controls controlling game music and a gesture based menu for pausing the game and selecting game options. In this way, a developer may harmonize transport control 200 with an application program.
In addition, the gesture-based control module may be configured to instruct the application program to adjust an operation of portable electronic device 100 based on a relative distance from a pre-determined location 178 on defined control perimeter 176 to the detected gesture 158.
In the various embodiments described above, it will be appreciated that when the contact point recognizer detects that contact between a digit of the user and the touch screen sensor has been terminated, for example, for a predetermined period of time, gesture-based control module 140 ceases identifying the gesture via identifier 146 and begins attempting to detect a new contact point. When a new contact point 174 is detected, a new gesture control 172 will be instantiated, and as a result the frame of reference 210 will effectively snap to the location of the new contact point 174. In this manner, wherever the user chooses to contact touch screen sensor 162 within defined region 170, a gesture control will be spawned in that location, thus enabling user input at various locations on display 160. With such flexible input, the user can easily control portable electronic device 100 without visually examining display 160, and without unintentionally activating unselectable graphical user interface elements.
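A sketch of this lifecycle, assuming a simple timeout after the digit lifts (the names and the timeout value are hypothetical):

```kotlin
// Hypothetical sketch of the respawn behavior: lifting the digit for a
// predetermined period tears down the control; the next touch-down spawns a
// new control, snapping the frame of reference to the new contact point.
data class Point(val x: Float, val y: Float)

class GestureSession(private val timeoutMs: Long = 300) {
    var origin: Point? = null // frame of reference of the active gesture control
        private set
    private var liftedAtMs: Long? = null

    fun onTouchDown(p: Point) {
        origin = p        // spawn a gesture control at the new contact point
        liftedAtMs = null
    }

    fun onTouchUp(nowMs: Long) { liftedAtMs = nowMs }

    fun onTick(nowMs: Long) {
        // After the timeout, stop identifying gestures and await a new contact.
        val lifted = liftedAtMs
        if (lifted != null && nowMs - lifted >= timeoutMs) origin = null
    }
}
```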
Method 400 includes, at 402, initiating a relative gesture recognition mode responsive to a mode switch user input, wherein in the relative gesture recognition mode one or more graphical user interface elements in a defined region of a graphical user interface are made to be unselectable. The mode switch user input may be selected from the group consisting of a user input via a clutch key associated with the portable electronic device and a user input via a contact between the digit of a user and the surface of the touch screen sensor. In some examples, initiating the relative gesture recognition mode at 402 may further include positioning the defined region, in which the graphical user interface elements are unselectable, independent of the graphical user interface. In other words, once the relative gesture recognition mode is activated, the defined region may be positioned anywhere on the touch screen sensor, and may include a subregion of the touch screen sensor or the entire touch screen sensor.
Method 400 further includes, at 404, recognizing a contact point on the touch screen sensor between a digit of a user and a surface of the touch screen sensor in the defined region in which the graphical user interface elements are unselectable in the relative gesture recognition mode. Next, method 400 includes, at 406, presenting a gesture control having a defined control perimeter proximate to the contact point in the defined region in which the graphical user interface elements are unselectable in the relative gesture recognition mode.
It will be appreciated that presenting a gesture control at 406 may include presenting a transport control, as described above. In addition, presenting a gesture control may further include snapping a frame of reference for the transport control to the contact point within the defined region. Further, presenting a gesture control having a defined control perimeter proximate to the contact point may include spawning a virtual control stick control for a virtual game control at the contact point, wherein full-scale deflection of the virtual control stick control occurs at the defined control perimeter. With the device in the relative gesture recognition mode and with the gesture control presented in this manner, a detected gesture may be received within the defined region of the touch screen sensor, and identified.
Method 400 further includes, at 408, identifying a detected gesture based on a user touch input originating from the contact point, received by the gesture control via the touch screen sensor in the defined region in which the graphical user interface elements are unselectable. In one example, identifying a detected gesture further includes interpreting the detected gesture based at least in part on a comparison of the detected gesture so received to a definition corresponding to one of a set of one or more pre-defined gestures within a library of pre-defined gestures.
Method 400 may further include, at 408, enabling the gesture-based control module to access developer specified control parameters by which the gesture control is configured to operate. In one example, the developer specified control parameters may be selected from the group consisting of a volume parameter, a playback speed parameter, a playback direction parameter, a control perimeter definition parameter, and a defined region definition parameter. In this way, a developer may specify, via a software development kit, for example, control parameters for the portable electronic device that are peculiar to a particular application program.
Finally, method 400 further includes, at 410, adjusting an operation of the portable electronic device based on a relative distance from a pre-determined location on the defined control perimeter to the detected gesture so identified, or based on a relative distance from the contact point to the detected gesture so identified. In one example, adjusting the operation of the portable electronic device includes adjusting a temporal position of a media playback responsive to a detected gesture identified by a substantially horizontal direction of movement of the digit of a user relative to the frame of reference. In another example, adjusting the operation of the device includes adjusting a volume of the media playback responsive to a detected gesture identified by a substantially vertical direction of movement of the digit of a user relative to the frame of reference. In yet another example, adjusting the operation of the device includes adjusting a pause status of the media playback responsive to a detected gesture identified by a tapping movement of the digit of a user relative to the frame of reference.
In addition, adjusting an operation of the portable electronic device may include outputting a response from the virtual game control that is in proportion to a measured deflection of the virtual control stick control with respect to the full-scale deflection of the virtual control stick control, when the detected gesture is received within the defined control perimeter. Adjusting an operation of the portable electronic device may further include outputting a response from the virtual game control that is substantially the same as the full-scale deflection of the virtual control stick control, when the detected gesture is received outside of the defined control perimeter.
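This deflection rule amounts to a proportional response clamped at full scale. A minimal sketch (the function name is hypothetical):

```kotlin
import kotlin.math.hypot
import kotlin.math.min

// Hypothetical sketch of the virtual control stick response: proportional to
// the deflection inside the control perimeter, clamped to full scale outside.
fun stickResponse(dx: Float, dy: Float, perimeterRadius: Float): Float {
    val deflection = hypot(dx, dy)               // distance from the contact point
    return min(deflection / perimeterRadius, 1f) // 1f represents full-scale deflection
}
```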
The above described method, like the systems described herein, may be used to facilitate user control of a portable electronic device in situations in which the user does not visually examine the device, and without unintentional selection of graphical user interface elements that have been made unselectable.
It will be appreciated that the method described above may be performed using the hardware and software components of portable electronic device 100 described herein, or via other suitable hardware and software components.
It should be understood that the embodiments herein are illustrative and not restrictive, since the scope of the invention is defined by the appended claims rather than by the description preceding them, and all changes that fall within metes and bounds of the claims, or equivalence of such metes and bounds thereof are therefore intended to be embraced by the claims.