The present invention relates generally to computing devices and more particularly to screen inputs for computing devices.
As computing devices with touchscreens become increasingly commonplace, new modes of user-device interaction become possible, including, for example, zooming and resizing objects with multi-touch gestures. New shapes or characters can also be input directly.
Typically, however, the range of possible control gestures is limited by the requirement to decide immediately what kind of action the user wants to take; in most cases, the system's behavior is determined by the specific area that the user touches and the general direction of the user's gesture. By contrast, conventional desktop computer systems typically include a pointing device that enables at least one alternative input mode (e.g., via a mouse right-click).
The present disclosure is illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:
The description that follows includes illustrative systems, methods, techniques, instruction sequences, and computer program products that embody the present invention. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide an understanding of various embodiments of the inventive subject matter. It will be evident, however, to those skilled in the art that embodiments of the inventive subject matter may be practiced without these specific details. In general, well-known instruction instances, protocols, structures, and techniques have not been shown in detail.
Example embodiments allow a multitude of more complex gestures by a user on a screen of a display device. For example, the user can draw a gesture, e.g., in the form of a symbol, on the screen (e.g., using a finger or a pointing device such as a mouse), and the device may determine a corresponding action when the gesture is complete.
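The mapping from a completed gesture to a device action can be sketched as a simple dispatch table. This is a minimal illustration, not the patented implementation; the symbol names and actions are assumptions for the example.

```python
# Illustrative sketch: once a drawn gesture has been recognized as a
# symbol, look up and invoke the action registered for that symbol.
def dispatch_gesture(symbol, actions):
    """Return the result of the action registered for a completed
    gesture symbol, or None if the symbol is not recognized."""
    action = actions.get(symbol)
    return action() if action else None

# Hypothetical symbol-to-action registry for the example.
actions = {
    "circle": lambda: "zoom",
    "check": lambda: "confirm",
}
```

A recognized "circle" symbol would thus trigger the registered zoom action, while an unrecognized symbol yields no action.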
Next, as shown at operation 106, an image marker 204 (see
As shown at operation 108, the method 102 further includes receiving a second screen input that identifies a second location or position on the display screen 201 during the response-time period after the first screen input. The response-time period allows a user to provide a second and subsequent input relative to the first input. The second input may be used to define a gesture associated with a predefined function of the device (e.g., a portable computer, mobile device such as a cellular telephone, or the like device having a display screen providing a user interface).
As time passes, the persistence of the image marker 204 during the response-time period can be indicated by changing the color or intensity of the image marker. For example, the image marker may fade so that it disappears from the display screen 201 by the end of the response-time period.
The method 102 next changes the input mode to an alternative input mode for the display screen 201 based on the position of the second location relative to the image marker 110. The method 102 then includes receiving a gesture as a screen input in the alternative input mode, where the gesture includes a motion along a path on the display screen 201 starting from the second location 112. In general, this path starts at the second location and ends at some completion of the gesture, either by explicitly terminating the input (e.g., releasing the computer pointing device or pointer) or by satisfying some completion condition (e.g., completing a closed curve as in
For example, in
More generally, this alternative input mode may use the completed gesture to define a screen input including, for example, generating a new shape (e.g., as in
Alternative input modes may also enable the manipulation of existing screen objects (e.g., by zooming or re-sizing), either based on a completed gesture or operating continuously as the corresponding path is traced.
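One completion condition mentioned above is completing a closed curve. A minimal sketch of such a check, under the assumption that a path counts as closed when its end point returns near its start point, follows; the tolerance and minimum point count are illustrative parameters, not values from the description.

```python
import math

def is_closed_curve(path, tolerance=10.0, min_points=3):
    """Treat a traced path (a list of (x, y) points) as a completed
    closed curve when its end point returns to within `tolerance`
    pixels of its start point. Short paths are never closed."""
    if len(path) < min_points:
        return False
    (x0, y0), (xn, yn) = path[0], path[-1]
    return math.hypot(xn - x0, yn - y0) <= tolerance
```

In a continuous-manipulation mode, the same path could instead be consumed point by point as it is traced, with no completion test at all.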
The example resizing operations described above (e.g.,
The updated sizes (e.g., line lengths) and positions (e.g., (x, y) coordinates) are then calculated by the following formulas:
Size_x′ = Size_x · S_x,  Size_y′ = Size_y · S_y  (2)

x′ = x + x_0 · (1 − S_x),  y′ = y + y_0 · (1 − S_y)  (3)
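Formulas (2) and (3) can be applied directly in code. The sketch below implements the updates exactly as written, with (S_x, S_y) the scale factors and (x_0, y_0) the reference point; the function name and tuple-based interface are illustrative assumptions.

```python
def resize(size, pos, scale, anchor):
    """Apply the size update of formula (2) and the position update of
    formula (3): sizes are scaled by (Sx, Sy), and positions are
    shifted by the reference point (x0, y0) weighted by (1 - S)."""
    sx, sy = scale
    x0, y0 = anchor
    new_size = (size[0] * sx, size[1] * sy)            # formula (2)
    new_pos = (pos[0] + x0 * (1 - sx),                 # formula (3)
               pos[1] + y0 * (1 - sy))
    return new_size, new_pos
```

For example, doubling the width (S_x = 2.0) and halving the height (S_y = 0.5) of a 100 × 50 object at (10, 20) with reference point (5, 4) yields a 200 × 25 object at (5, 22).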
Additional example embodiments relate to an apparatus for carrying out any one of the above-described methods. The apparatus may include a computer for executing computer instructions related to the methods described herein by way of example. In this example context the computer may be a general-purpose computer including, for example, a processor, memory, storage, and input/output devices (e.g., keyboard, display, disk drive, Internet connection, etc.). However, the computer may include circuitry or other specialized hardware for carrying out some or all aspects of the method. In some operational settings, the apparatus or computer may be configured as a system that includes one or more units, each of which is configured to carry out some aspects of the method either in software, in hardware or in some combination thereof. For example, the system may be configured as part of a computer network that includes the Internet. At least some values for the results of the method can be saved for later use in a computer-readable medium, including memory units (e.g., RAM (Random Access Memory), ROM (Read Only Memory)) and storage devices (e.g., hard-disk systems, optical storage systems).
Additional embodiments also relate to a computer-readable medium that stores (e.g., tangibly embodies) a computer program for carrying out any one of the above-described methods by means of a computer. The computer program may be written, for example, in a general-purpose programming language (e.g., C, C++) or some specialized application-specific language. The computer program may be stored as an encoded file in some useful format (e.g., binary, ASCII). In some contexts, the computer-readable medium may be alternatively described as a computer-useable medium, a computer-storage medium, a computer-program medium, or some alternative non-transitory storage medium. Depending on the operational setting, specified values for the above-described methods may correspond to input files for the computer program or computer.
As described above, certain embodiments of the present invention can be implemented using standard computers and networks including the Internet.
In accordance with an example embodiment, the apparatus 502 includes a multi-input gesture control module 508 that includes a first location-receiving module 510, a marker module 512, a second location-receiving module 514, a mode-changing module 516, a gesture-receiving module 518, and a storage module 520. The first location-receiving module 510 operates to receive a first screen input that identifies a first location on the display screen. The marker module 512 operates to provide an image marker at the first location in response to the first screen input, where the image marker persists on the display screen for a response-time period after the first screen input. The second location-receiving module 514 operates to receive a second screen input that identifies a second location on the display screen during the response-time period after the first screen input. The mode-changing module 516 operates to change to an alternative input mode for the display screen based on a position of the second location relative to the image marker. The gesture-receiving module 518 operates to receive a gesture as a screen input in the alternative input mode, where the gesture includes a motion along a path on the display screen starting from the second location. The storage module 520 operates to persistently store display screen data that identifies the first location, the image marker, the second location, and the gesture.
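The flow through the modules described above (first input, marker, second input within the response-time window, mode change) can be sketched as a small stateful controller. This is an illustrative sketch only; the class and method names mirror the description, but the signatures and the timing interface are assumptions.

```python
class MultiInputGestureControl:
    """Illustrative controller: accepts a first screen input, then a
    second input only within the response-time window, at which point
    the alternative input mode is enabled."""

    def __init__(self, response_time=2.0):
        self.response_time = response_time
        self.state = {}

    def receive_first_input(self, location, t):
        # Record the first location and start the response-time window.
        self.state = {"first": location, "marker_time": t}

    def receive_second_input(self, location, t):
        # Accept the second input only inside the response-time window;
        # on success, switch to the alternative input mode.
        started = self.state.get("marker_time")
        if started is not None and t - started <= self.response_time:
            self.state["second"] = location
            self.state["alt_mode"] = True
            return True
        return False
```

A second input arriving after the window has elapsed (e.g., after the marker has faded) leaves the device in its ordinary input mode.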
In addition, a graphics module 522 operates to render images on the display screen and a database interface 524 operates to enable access to remote data storage. The database interface 524 may provide database management functionality including a database application, a database management system (DBMS), one or more databases (local and/or remote), input/output (I/O) buffer caches, and the like. The database application may provide order fulfillment, business monitoring, inventory control, online shopping, and/or any other suitable functions by way of interactions with other elements of the processing system 504. According to some example embodiments, the database application communicates with the DBMS over one or more interfaces provided by the DBMS. The database application may, in turn, support client applications executed by client devices.
The DBMS may comprise any suitable system for managing a database instance. Generally, the DBMS may receive requests for data (e.g., Structured Query Language (SQL) requests from the database application), may retrieve requested data from the database and may return the requested data to a requestor. The DBMS may also perform start-up, logging, recovery, management, optimization, monitoring, and other database-related tasks.
Although only certain example embodiments of this invention have been described in detail above, those skilled in the art will readily appreciate that many modifications are possible in the example embodiments without materially departing from the novel teachings and advantages of this invention. For example, aspects of embodiments disclosed above can be combined in other combinations to form additional embodiments. Accordingly, all such modifications are intended to be included within the scope of this invention.
Number | Name | Date | Kind
---|---|---|---
5596656 | Goldberg | Jan 1997 | A
6057845 | Dupouy | May 2000 | A
7812828 | Westerman et al. | Oct 2010 | B2
20100013780 | Ikeda et al. | Jan 2010 | A1
20100079388 | Ohnishi et al. | Apr 2010 | A1
20100149109 | Elias | Jun 2010 | A1
20100295796 | Roberts et al. | Nov 2010 | A1
20100302205 | Noma | Dec 2010 | A1
20110050562 | Schoen et al. | Mar 2011 | A1
20110074710 | Weeldreyer et al. | Mar 2011 | A1
20110080341 | Helmes et al. | Apr 2011 | A1
20110093778 | Kim et al. | Apr 2011 | A1
20110119638 | Forutanpour | May 2011 | A1
20110122080 | Kanjiya | May 2011 | A1
20110175821 | King | Jul 2011 | A1
20110193795 | Seidman et al. | Aug 2011 | A1
20120084662 | Navarro et al. | Apr 2012 | A1
20120110519 | Werner et al. | May 2012 | A1
Number | Date | Country
---|---|---
20120113015 A1 | May 2012 | US