The present invention generally relates to systems and methods for interacting with user interfaces, and more particularly to systems and methods for making selections in context menus, dropdown menus and dialog boxes.
In most operating systems, performing an action on book covers, text, and other objects requires quite a few taps, typically on a touch screen, through a sequence of menus, dropdown boxes and dialog boxes in order to execute the desired action. For example, posting a quote from an eBook to the user's Facebook™ wall, or recommending an eBook from the user's library, requires a series of taps through a sequence of menus. Although the conventional method of navigating these menus is self-explanatory, the user's actions are not fluid or quick.
In other conventional menu systems, the menu options expand sideways and downward as users make selections. Still other menu systems are "flower" shaped or circular multi-level structures.
The system and method of the present invention is an alternative to these conventional systems and methods and is more akin to a drag-and-drop operation, allowing swifter actions with less motion and effort. The user can either drag through options or tap through options. In a preferred embodiment, the present invention is operated on a mobile device with a touch screen.
A user initiates the process of performing an action with respect to a book cover or other item (e.g., a selection of text or some other object) by touching a thumb or finger down on the item. The user pauses briefly before dragging the item (the pause can be as short as 1/10th second, but should be long enough for the system to recognize that the gesture is not merely a tap). After the pause, a small icon representing the selected item appears ("shrinking" down from the selected item) and a first row of options appears, preferably above the item. This first row of options contains the various functions that can be performed with the item (e.g., recommend an eBook to a friend). The user then drags the item to the option which she would like to perform. As the icon of the item being dragged by the user reaches the desired option, a second set of sub options, if any, associated with the first option appears. As the second set of sub options appears, the first set of options shrinks, but is still visible so that the user can determine how she has navigated to the second set of options. This process can be repeated for as many levels of sub options as exist for the particular action to be performed on the selected item.
For the purposes of illustrating the present invention, there is shown in the drawings a form which is presently preferred, it being understood, however, that the invention is not limited to the precise forms shown in the drawings.
Drag-Through-Options Mode
As shown in the drawings, a user 5 initiates the process by pressing a finger or thumb down on an item 20 displayed on the touch screen of the device, e.g., a portion of selected text. After the user 5 briefly pauses on the selected item 20, a small icon representing the item 20 appears under the user's finger, and a first row 40 of menu options appears above the item 20.
In a preferred embodiment, the first row 40 of menu options appears just a little above the point at which the user's finger was pressed down on the touch sensitive input device, e.g., touch screen. In an alternative embodiment, the first row 40 of menu options appears near or just a little above the top of the selected item 20. Alternative embodiments may position the first row 40 of menu options below the point at which the finger pressed down. However, in this alternate embodiment, the user's hand 5 or fingers may obscure the menu options. Accordingly, this embodiment is not recommended for cases where there is room available above the point at which the user's finger pressed down.
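For purposes of illustration only, the following is a minimal Kotlin sketch of the row-placement choice described above, assuming screen coordinates in which y increases downward; the names Point, Rect, MenuLayout, rowHeight, and margin are hypothetical and not taken from the disclosure.

```kotlin
data class Point(val x: Float, val y: Float)
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

class MenuLayout(private val rowHeight: Float = 44f, private val margin: Float = 8f) {
    // Preferred embodiment: place the row just above the touch point.
    // Alternative embodiment: place it just above the top of the selected item.
    // Fall back to below the finger only when there is no room above,
    // accepting that the user's hand may then obscure the options.
    fun firstRowTop(touch: Point, item: Rect, screen: Rect, aboveItem: Boolean): Float {
        val anchorY = if (aboveItem) item.top else touch.y
        val candidate = anchorY - margin - rowHeight
        return if (candidate >= screen.top) candidate
        else touch.y + margin // below the touch point; not recommended when room exists above
    }
}
```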
In the preferred embodiment, the menu 40 is brought up after the user 5 presses the selected item 20 and briefly keeps pressing the item 20 without sliding or lifting her finger. If the user just quickly taps, without keeping her finger down and fairly still for some minimal length of time, which can be as little as 1/10th second, then the menu 40 preferably does not appear. If the user immediately starts dragging the item without first holding her finger fairly still for that minimal hold time, then the menu preferably does not appear. Once the menu 40 appears, the user can drag the item 20 to the options as described below.
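The hold-time test described above can be sketched as follows. The 1/10th second value comes from the text; the "fairly still" jitter tolerance and all names are assumptions.

```kotlin
const val MIN_HOLD_MS = 100L      // minimal hold time: "as little as 1/10th second"
const val HOLD_JITTER_IN = 0.05f  // assumed tolerance for a finger held "fairly still"

// The menu appears only if the finger has been held fairly still for the
// minimal hold time; a quick tap or an immediate drag suppresses it.
fun shouldShowMenu(heldMs: Long, movedInches: Float): Boolean =
    heldMs >= MIN_HOLD_MS && movedInches <= HOLD_JITTER_IN
```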
The row 40 of menu options is the first level of actions that can be taken with respect to the selected item 20. In the present example, as shown in row 40, the user 5 can choose to "highlight" the selected text, add "notes" at the position in the eBook of the selected text, "share" the selected text, or perform a "look up" with respect to the selected text, e.g., a dictionary definition look up of a selected word. In the present example, the user is going to "share" the selected text 20.
As seen in the drawings, when the user 5 drags the selected item 20 onto the "share" option, a second row of sub options appears above the first row 40, and the first row 40 shrinks in size while remaining visible. In the particular sharing example shown, the second row contains the destinations to which the selected text 20 can be shared, such as the user's Facebook™ wall. In the present example, the user drags the item 20 up to the "Facebook™" sub option and drops it there, causing a dialog to be displayed through which the user can provide any further information needed to complete the share.
To enter the drag-through-options mode embodiment of the present invention, the user presses-and-briefly-pauses on an item, e.g., an area of selected text, a word, a book cover, a file icon, to bring up the first row 40 of menu items, then drags the selected item more than some minimal distance, e.g., 1/10th inch, in any direction. The system of the present invention interprets this user action as an intention to enter the drag-through-options mode. Once the system has detected the user action, a flag can be set indicating the user is in the drag-through-options mode, as opposed to the tap-through-options mode described below.
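A minimal sketch of how the system might set the mode flag described above (and resolve the tap-through-options mode described below) follows; the 1/10th inch threshold is from the text, while the enum and function names are hypothetical.

```kotlin
enum class MenuMode { NONE, DRAG_THROUGH, TAP_THROUGH }

const val DRAG_THRESHOLD_IN = 0.1f // "more than some minimal distance, e.g., 1/10th inch"

fun resolveMode(menuVisible: Boolean, movedInches: Float, lifted: Boolean): MenuMode = when {
    !menuVisible -> MenuMode.NONE                             // no press-and-pause yet
    movedInches > DRAG_THRESHOLD_IN -> MenuMode.DRAG_THROUGH  // flag: drag-through mode
    lifted -> MenuMode.TAP_THROUGH                            // lifted without dragging
    else -> MenuMode.NONE                                     // still holding; undecided
}
```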
In a preferred embodiment, once the system is in drag-through-options mode, if the user "drops" the item on an option that is not executable because that option leads to an additional row of bubbles (such as the "share" option in the example above), or if the user "drops" the item outside of any of the bubble options, then the layers of bubbles simply disappear ("pop") without any action being taken. If the user drops the item on an option that does not lead to an additional row of bubbles, then it is an executable option, which can also be referred to as a "leaf" bubble. Leaf bubbles generally represent an action to be taken on the object being dragged. Dropping an item on a leaf bubble will initiate taking that action. In some cases, this may involve displaying a dialog or other UI element that the user can use to provide more information to proceed with performing that action, such as in the "Facebook™" share action described above.
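The leaf/non-leaf drop semantics described above can be sketched as follows; the Bubble type and all names are illustrative assumptions, not part of the disclosure.

```kotlin
data class Bubble(
    val label: String,
    val children: List<Bubble> = emptyList(), // non-empty: leads to another row
    val action: (() -> Unit)? = null          // leaf bubbles carry an action
) {
    val isLeaf get() = children.isEmpty()
}

fun onDrop(target: Bubble?, dismissAll: () -> Unit) {
    when {
        target == null -> dismissAll()   // dropped outside any bubble: rows "pop", no action
        !target.isLeaf -> dismissAll()   // dropped on a non-leaf (e.g., "share"): no action
        else -> {
            target.action?.invoke()      // leaf bubble: initiate the action, which may
            dismissAll()                 // itself display a dialog to gather more input
        }
    }
}
```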
Once the system is in the drag-through-options mode, the user can drag the item to any option in the initial row 40 of selectable options/actions/functions. These options are also known as "action bubbles," and represent actions that can be invoked with respect to the item that is being dragged. For example, if the item is a section of text in an eBook, the first level 40 of action bubbles would represent actions that can be executed with respect to the selected text, such as sharing the text, as illustrated in the example above.
In preferred embodiments, if a row of action bubbles will not fit within the width of the device, given the preferred font size and action-bubble sizing and the device's size and orientation, the system will automatically move the options that do not fit in this level of action bubbles to a higher level. If this occurs, the system introduces a "more . . . " action bubble into this level of action bubbles that, when invoked, opens the higher level of action bubbles containing the options that did not fit.
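One possible sketch of this overflow rule follows. Measuring bubble widths is platform-specific, so it is abstracted here as a hypothetical bubbleWidth function; all names are assumptions.

```kotlin
fun layoutRow(options: List<String>, bubbleWidth: (String) -> Float, deviceWidth: Float): List<String> {
    val moreLabel = "more ..."
    val fits = mutableListOf<String>()
    var used = 0f
    for (opt in options) {             // greedily keep the options that fit the width
        if (used + bubbleWidth(opt) > deviceWidth) break
        fits += opt
        used += bubbleWidth(opt)
    }
    if (fits.size == options.size) return fits // everything fits; no "more ..." bubble
    // Otherwise reserve room for the "more ..." bubble, trimming fitted options if needed.
    while (fits.isNotEmpty() && used + bubbleWidth(moreLabel) > deviceWidth) {
        used -= bubbleWidth(fits.removeAt(fits.lastIndex))
    }
    return fits + moreLabel // overflow options open from the "more ..." bubble's higher level
}
```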
As described above, in the preferred embodiment of the drag-through-options mode, as the user drags up to a second or third row of options, the previous row remains visible, but shrinks in size. In the preferred embodiment, if the user is on or above any row except the first row, and then drags her finger (dragging the item) back down below that row, then that row disappears and the previous row enlarges again. In the preferred embodiment, if the user drags the item below the first row, the first row remains visible and in its original size, until the user lifts her finger. In this preferred embodiment, if the user lifts her finger while not touching any item in any row, all existing bubbles/rows disappear. In an alternative embodiment, a "pop" visual and/or audio effect can be executed when the bubbles disappear.
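The shrink-and-pop behavior of the rows can be modeled as a simple stack, sketched below under the assumption that hit-testing the drag position against row extents is handled elsewhere; all names are illustrative.

```kotlin
class RowStack {
    private val rows = ArrayDeque<String>() // bottom (first row) .. top (current row)

    fun push(row: String) = rows.addLast(row) // previous rows shrink but stay visible

    // Called when the drag moves back down below the top row's vertical extent.
    fun onDragBelowTopRow() {
        if (rows.size > 1) rows.removeLast() // top row disappears, previous re-enlarges;
        // the first row remains visible at original size until the finger lifts
    }

    fun onFingerLiftedOutsideAnyBubble() = rows.clear() // all rows/bubbles "pop"
}
```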
Shrinking the menu levels as described above is an element that helps conserve screen space on tablets. In alternative embodiments, the first level can be shrunk once the user has reached the second menu level. However, the second level is not necessarily shrunk when the user gets to the third level.
As described above, in the preferred embodiment, the rows of menu items "grow" upward from the user's touch on the screen. In the prior art, menus typically grew sideways and/or downward. Most user interface designers find it counter-intuitive to grow upward, in part because people read side to side and top to bottom in all languages; no language is read bottom to top. The circular or flower shaped menu structures described above are awkward to read and use. On mouse-based systems, growing sideways or down is acceptable, which is why it is the standard for desktop/laptop OSs. However, in the context of a touch-based tablet or handset, if a multi-level menu grows sideways or down, the user's hand would often obscure each new layer of menu options that appears.
By growing the bubbles upward in the preferred embodiment, the present invention solves the problem of the user's hand obscuring the next level of options. This upward growth allows the user to slide through the multi-level menu options. It has been found that having the user slide the selected item through the menu options is quite a bit quicker and easier than tapping through sequences of menus and panels/dialogs.
Although the present invention can be used in a tap-through mode, as described below, in the context of a touch screen device, a user can slide a finger or thumb much more accurately than he or she can tap. One way to see this is to open an email or note full of text on a device with an insertion point magnifier, in which a user slides her finger left or right to position the insertion cursor at any letter she wants. The user's finger can precisely move the cursor tiny distances, well below 1/20th of an inch, left or right over the letters.
In contrast, if the user tries tapping the screen on the device to insert the cursor at a specific spot in the text, the user will most often miss the spot. In general, users can only accurately tap with about ¼th inch resolution with an index finger, due to the need to use the entire arm to position the finger. Using a thumb, the user can only position a tap to within about ⅜th inch, due to the size and shape of thumbs and the necessity of using the user's other fingers to hold the back of the device stable.
The preferred drag-through embodiment of the present invention thus allows users to employ the more accurate positioning technique of sliding rather than tapping. Further, the sliding motion contributes to the effortless feel of the gesture. This embodiment takes less work to accurately choose the menu options.
Tap-Through-Options Mode
In addition to the drag-through-options embodiment described above, the present invention also provides a tap-through-options mode in which a user can tap her way through the options, rather than dragging the selected item. Again, as appreciated by those skilled in the art, the tapping actions described herein can be accomplished by other input mechanisms, such as with a mouse. To enter the tap-through-options mode, the user selects an object, e.g., by tapping three times on a portion of text as described above, and then presses her finger briefly on the item. In response to this selection of an item, the system displays the first row of options/bubbles. The user then lifts her finger off of the selected item without dragging (e.g., without moving more than, say, 1/10th inch). After the user lifts her finger, the system continues to display the first row of options. The user can then tap on one of the options on the visible row in order to execute the action associated with the option, or bring up a second row of options as described above. Visually, the rows/bubbles look the same as they do in the drag-through-options mode described above; however, the user taps through the options instead of dragging the selected item.
Once the user has entered the tap-through-options mode, by pressing and briefly pausing on an item, the user can cancel the operation by "tapping out," i.e., by tapping anywhere outside of the set of bubbles or the original item. This will cause the system to exit the tap-through-options mode.
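Tap handling in the tap-through-options mode can be sketched as follows, reusing the hypothetical Bubble type from the earlier sketch; the hit-tested bubble (or null for a tap outside everything) is assumed to be supplied by the caller.

```kotlin
fun onTap(hit: Bubble?, pushRow: (List<Bubble>) -> Unit, dismissAll: () -> Unit) {
    when {
        hit == null -> dismissAll()                          // "tapping out" exits the mode
        hit.isLeaf -> { hit.action?.invoke(); dismissAll() } // execute the option's action
        else -> pushRow(hit.children)                        // bring up the next row of options
    }
}
```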
Some users may prefer to tap their way through the options. However, the drag-through-options mode allows the user to slide the item through the options. Since the dragging motion does not require lifting and carefully repositioning a finger many times, it may be preferred by most users. Poking at items on a screen (e.g., a tablet) that is too big to operate exclusively with thumbs requires a surprising amount of muscle strength and control. Almost every joint and muscle from the shoulder to the fingertip is involved in each poke. Further, users are generally more precise when sliding a finger or thumb between two nearby items than they are when tapping one item and then the other.
The operation of both the drag-through-options mode and the tap-through-options mode described in the examples above can be summarized with reference to the flow of acts illustrated in the drawings. In act 300, the system monitors for user input, detecting the user's selection of an item as described above.
In act 315, the system determines the type of item that the user has selected. This determination is performed so that in act 320, the system can display the types of options in a menu row that are appropriate to the type of selected item. In the preferred embodiment, the correlation between the options that are displayed and the type of item is predetermined. In act 325, the system monitors for further user input, specifically the selection of one of the options in the displayed menu row. If the user does not make a selection of one of the options, e.g., lifts her finger off the screen in the drag-through mode or taps elsewhere in the tap-through mode, the system interprets this as an intent by the user to abandon the operation with respect to the selected item and returns to the monitoring in act 300.
If the user selects an option on the current level, or row, of menu options, e.g., by "dropping" the selected item on the option, pausing on the option or tapping on the option, the system then determines in act 330 if the option represents an executable function with respect to the selected item. If the option is an executable function, e.g., 'print this file', the system executes the function in act 335. If the option is not executable by itself, it is because further information about the action the user wants to perform must be gathered, which the system does in act 340 by displaying a further row, or menu, of options to the user.
As with act 325, in act 345, the system looks to see if the user selects one of the options in the higher level menu row. If the user does not make a selection of one of the options, e.g., lifts her finger off the screen in the drag-through mode or taps elsewhere in the tap-through mode, the system in act 350 interprets this as an intent by the user to abandon the operation with respect to the selected item and returns to the monitoring in act 300. If the user does not select an option on the higher level row, but instead moves back down toward, or taps on, the lower level row, the system interprets this as an intent by the user to re-think her selection in the lower level row. In this case, the higher level row is no longer displayed and the user can select another option on the lower level row. In the embodiment of the present invention described above where the lower level row had been shrunken in size, it is returned to its original size in act 320.
If the user does select an option on the higher level row of menu options, e.g., by "dropping" the selected item on the option, pausing on the option or tapping on the option, the system in act 355 determines if the option represents an executable function with respect to the selected item. If the option is an executable function, e.g., 'share this selected text to my Facebook™ page', the system in act 360 executes the function. If there is still more information that the system needs to gather in order to determine the executable function the user wants to perform with respect to the selected item, the system can iteratively display additional levels of menus of options in acts 340-360.
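The flow of acts 300-360 described above can be condensed into the following sketch, again reusing the hypothetical Bubble type from the earlier sketch. The act numbering in the comments follows the text; everything else is an assumption, with option selection abstracted as a hypothetical select callback that returns null when the user abandons the operation.

```kotlin
fun runMenuFlow(firstRow: List<Bubble>, select: (List<Bubble>) -> Bubble?) {
    var row = firstRow                     // act 320: options appropriate to the item type
    while (true) {
        val choice = select(row) ?: return // acts 325/345/350: no selection abandons the flow
        if (choice.isLeaf) {               // acts 330/355: is the option executable?
            choice.action?.invoke()        // acts 335/360: execute the function
            return
        }
        row = choice.children              // act 340: display a further row of options
    }
}
```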
Electronic device 100 can include any suitable type of electronic device. For example, electronic device 100 can include a portable electronic device that the user may hold in his or her hand, such as a digital media player, a personal e-mail device, a personal data assistant (“PDA”), a cellular telephone, a handheld gaming device, a tablet device or an eBook reader. As another example, electronic device 100 can include a larger portable electronic device, such as a laptop computer. As yet another example, electronic device 100 can include a substantially fixed electronic device, such as a desktop computer.
Control circuitry 400 can include any processing circuitry or processor operative to control the operations and performance of electronic device 100. For example, control circuitry 400 can be used to run operating system applications, firmware applications, media playback applications, media editing applications, or any other application. Control circuitry 400 can drive the display 450 and process inputs received from a user interface, e.g., the touch screen portion of display 450.
Storage 410 can include, for example, one or more computer readable storage mediums including a hard-drive, solid state drive, flash memory, permanent memory such as ROM, magnetic, optical, semiconductor, paper, or any other suitable type of storage component, or any combination thereof. Storage 410 can store, for example, media content, eBooks, music and video files, application data, e.g., software for implementing functions on electronic device 100, firmware, user preference information data, e.g., content preferences, authentication information, e.g., libraries of data associated with authorized users, transaction information data, e.g., information such as credit card information, wireless connection information data, e.g., information that can enable electronic device 100 to establish a wireless connection, subscription information data, e.g., information that keeps track of podcasts or television shows or other media a user subscribes to, contact information data, e.g., telephone numbers and email addresses, calendar information data, and any other suitable data or any combination thereof. The instructions for implementing the functions of the present invention may, as non-limiting examples, comprise software and/or scripts stored in the computer-readable media 410.
Memory 420 can include cache memory, semi-permanent memory such as RAM, and/or one or more different types of memory used for temporarily storing data. In some embodiments, memory 420 can also be used for storing data used to operate electronic device applications, or any other type of data that can be stored in storage 410. In some embodiments, memory 420 and storage 410 can be combined as a single storage medium.
I/O circuitry 430 can be operative to convert, and encode/decode if necessary, analog signals and other signals into digital data. In some embodiments, I/O circuitry 430 can also convert digital data into any other type of signal, and vice-versa. For example, I/O circuitry 430 can receive and convert physical contact inputs, e.g., from a multi-touch screen, i.e., display 450, physical movements, e.g., from a mouse or sensor, analog audio signals, e.g., from a microphone, or any other input. The digital data can be provided to and received from control circuitry 400, storage 410, memory 420, or any other component of electronic device 100. Although I/O circuitry 430 is illustrated in this Figure as a single component of electronic device 100, several instances of I/O circuitry 430 can be included in electronic device 100.
Electronic device 100 can include any suitable interface or component for allowing a user to provide inputs to I/O circuitry 430. For example, electronic device 100 can include any suitable input mechanism, such as a button, keypad, dial, a click wheel, or a touch screen, e.g., display 450. In some embodiments, electronic device 100 can include a capacitive sensing mechanism, or a multi-touch capacitive sensing mechanism.
In some embodiments, electronic device 100 can include specialized output circuitry associated with output devices such as, for example, one or more audio outputs. The audio output can include one or more speakers, e.g., mono or stereo speakers, built into electronic device 100, or an audio component that is remotely coupled to electronic device 100, e.g., a headset, headphones or earbuds that can be coupled to device 100 with a wire or wirelessly.
Display 450 includes the display and display circuitry for providing a display visible to the user. For example, the display circuitry can include a screen, e.g., an LCD screen, that is incorporated in electronic device 100. In some embodiments, the display circuitry can include a coder/decoder (Codec) to convert digital media data into analog signals. For example, the display circuitry or other appropriate circuitry within electronic device 100 can include video Codecs, audio Codecs, or any other suitable type of Codec.
The display circuitry also can include display driver circuitry, circuitry for driving display drivers, or both. The display circuitry can be operative to display content, e.g., media playback information, application screens for applications implemented on the electronic device 100, information regarding ongoing communications operations, information regarding incoming communications requests, or device operation screens, under the direction of control circuitry 400. Alternatively, the display circuitry can be operative to provide instructions to a remote display.
Communications circuitry 440 can include any suitable communications circuitry operative to connect to a communications network and to transmit communications, e.g., data, from electronic device 100 to other devices within the communications network. Communications circuitry 440 can be operative to interface with the communications network using any suitable communications protocol such as, for example, Wi-Fi, e.g., an 802.11 protocol, Bluetooth, radio frequency systems, e.g., 900 MHz, 2.4 GHz, and 5.6 GHz communication systems, infrared, GSM, GSM plus EDGE, CDMA, quadband, and other cellular protocols, VOIP, or any other suitable protocol.
Electronic device 100 can include one or more instances of communications circuitry 440 for simultaneously performing several communications operations using different communications networks, although only one is shown in this Figure to avoid overcomplicating the drawing. For example, electronic device 100 can include a first instance of communications circuitry 440 for communicating over a cellular network, and a second instance of communications circuitry 440 for communicating over Wi-Fi or using Bluetooth. In some embodiments, the same instance of communications circuitry 440 can be operative to provide for communications over several communications networks.
In some embodiments, electronic device 100 can be coupled to a host device such as remote servers for data transfers, synching the communications device, software or firmware updates, providing performance information to a remote source, e.g., providing reading characteristics to a remote server, or performing any other suitable operation that can require electronic device 100 to be coupled to a host device. Several electronic devices 100 can be coupled to a single host device using the host device as a server. Alternatively or additionally, electronic device 100 can be coupled to several host devices, e.g., for each of the plurality of the host devices to serve as a backup for data stored in electronic device 100.
Although the present invention has been described in relation to particular embodiments thereof, many other variations and other uses will be apparent to those skilled in the art. It is preferred, therefore, that the present invention be limited not by the specific disclosure herein, but only by the gist and scope of the disclosure.
This application claims benefit of U.S. Provisional Application No. 61/545,074, filed Oct. 7, 2011, which is hereby incorporated by reference.