As people continue to use their hand-held mobile devices as phones for telecommunication, they are increasingly using those same devices as content consumption devices. Through their mobile devices, people can “consume” (i.e., view and interact with) content such as maps, images, videos, web content, email, text messages, and the like. Additionally, a growing percentage of these mobile devices are touch-sensitive: a user interacts with the device, as well as with content presented on the device, through the device's touch-sensitive display surface.
Quite often, the content that a user wishes to display on a mobile device is substantially larger than the device's available display surface, especially when the content is displayed at full zoom. When this is the case, the user must decrease the zoom level of the displayed content (shrinking the size of the content), reposition the device's viewport with respect to the displayable content, or both. While there are user-interface techniques for modifying the zoom level of content (e.g., pinching or spreading one's fingers on a touch-sensitive surface) or repositioning the content relative to the display surface (via pan or swipe gestures), these techniques are generally considered two-handed: one hand holds the mobile device while the other interacts with the touch-sensitive display surface. However, there are many occasions in which the user has only one free hand with which to both hold the device and interact with the display surface. In such situations, fully interacting with content displayed on the mobile device is difficult, if not impossible. On wall-mounted or tabletop displays with direct touch, there is no issue of holding the device; however, on such large form factors pinch and swipe gestures can be very tiring, and zooming might require two hands.
The following Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. The Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
According to aspects of the disclosed subject matter, a dynamic user-interaction control is presented that enables a person to interact with a touch-sensitive device in a single-handed manner. A triggering event causes the dynamic user-interaction control to be temporarily presented on a display screen. Generally, the dynamic user-interaction control is presented on the display window of the display screen. In one embodiment, the triggering event occurs when the device user touches a touch-sensitive input device and holds that touch for a predetermined amount of time. Typically, the dynamic user-interaction control is presented at the location corresponding to the triggering event (i.e., the location of the device user's touch). The dynamic user-interaction control remains present on the display screen, and the device user can interact with the control, until a dismissal event is encountered. A dismissal event occurs under multiple conditions, including when the device user breaks touch contact with the dynamic user-interaction control for a predetermined amount of time.
According to additional aspects of the disclosed subject matter, a method for interacting with content displayed in a display window is presented. A triggering event for interacting with content displayed in a display window is detected. Upon detection of the triggering event, a dynamic user-interaction control is displayed on the display window. User activity in regard to the dynamic user-interaction control is detected and a determination is made as to whether the detected user activity corresponds to a panning activity or a zooming activity. The detected user activity is implemented with regard to the display of the content in the display window.
The foregoing aspects and many of the attendant advantages of the disclosed subject matter will become more readily appreciated as they are better understood by reference to the following description when taken in conjunction with the following drawings, wherein:
For purposes of clarity, the term “exemplary” in this document should be interpreted as serving as an illustration or example of something; it should not be interpreted as an ideal and/or leading illustration of that thing. A display window refers to the area of a display screen that is available for displaying content. The display window may comprise the entirety of a display screen, but that is not required.
The term panning refers to the act of changing the content that can be viewed through a display window such that a portion of the content that was previously displayed in the display window is no longer visible while a portion of the content that was not previously displayed becomes visible. Similar to panning, “flicking” involves quickly dragging the point of contact (such as the touch location of a finger) across an area of the screen and releasing contact. Flicking causes a panning/scrolling action to continue for a period of time, as though momentum were provided by the flicking gesture, along the vector defined by the original contact location and the release location. The speed of the flicking gesture determines the speed of scrolling and the momentum imparted and, therefore, the continued scrolling after contact is released. Panning and flicking typically involve content that cannot be fully displayed at the current resolution within a display window, i.e., there is more content than can be displayed by the display window. Conceptually, one may think of moving the display window over the content. Alternatively, one may think of a fixed display window with the content moved underneath it. The following discussion is made in the context of the former, that of moving the display window over the content, but this is for simplicity and consistency in description and is not limiting upon the disclosed subject matter. Panning typically involves a smooth transition in the content (based on the speed of panning), but this is not a requirement. Panning and scrolling (with regard to the repositioning of the display window relative to the content) are used synonymously.
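The flick behavior described above can be sketched as follows. This is an illustrative fragment only: the function name, the geometric decay model, and the decay and cutoff constants are assumptions for illustration and are not part of the described subject matter, which states only that scrolling continues with imparted momentum along the contact-to-release vector.

```python
def flick_offsets(contact, release, speed, decay=0.85, min_speed=1.0):
    """Yield successive (dx, dy) scroll offsets after a flick gesture.

    The direction is fixed by the contact->release vector; the initial
    speed (pixels per frame) decays geometrically until it falls below
    min_speed, at which point the continued scrolling stops.
    """
    dx = release[0] - contact[0]
    dy = release[1] - contact[1]
    length = (dx * dx + dy * dy) ** 0.5
    if length == 0:
        return
    ux, uy = dx / length, dy / length  # unit direction of the flick
    while speed >= min_speed:
        yield (ux * speed, uy * speed)
        speed *= decay  # momentum bleeds off each frame

# A vertical flick: the content keeps scrolling downward, slowing each frame.
offsets = list(flick_offsets((10, 10), (10, 110), speed=16.0))
```

A faster initial speed yields both larger per-frame offsets and more frames before the motion decays below the cutoff, matching the described relationship between gesture speed and continued scrolling.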
The term zoom refers to the resolution of the displayed content through a display window. Conceptually, one may think of zoom as referring to the distance of the display window to the content: the further away the display window is from the content the less resolution and/or detail of the content can be displayed, but more of the content can be displayed within the display window. Conversely, the closer the display window is “zoomed in” to the content, the greater the resolution and/or detail of the content can be displayed, but the amount (overall area) of content that can be displayed in the display window is reduced.
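The zoom trade-off described above can be expressed numerically. The sketch below is illustrative (the function name and the linear zoom-factor model are assumptions): at higher zoom factors the content is shown in greater detail, but the visible region of the content shrinks proportionally.

```python
def visible_region(window_w, window_h, zoom):
    """Return the (width, height) of content visible at a zoom factor.

    zoom > 1 means "zoomed in" (more detail, less content visible);
    zoom < 1 means "zoomed out" (less detail, more content visible).
    """
    if zoom <= 0:
        raise ValueError("zoom factor must be positive")
    return (window_w / zoom, window_h / zoom)

# An 800x600 display window at 2x zoom shows only a 400x300 region of
# the content; at 0.5x zoom it shows a 1600x1200 region.
zoomed_in = visible_region(800, 600, 2.0)
zoomed_out = visible_region(800, 600, 0.5)
```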
According to aspects of the disclosed subject matter, a dynamic user-interaction control is presented that enables a person to interact with a touch-sensitive device in a single-handed manner. A triggering event causes the dynamic user-interaction control to be temporarily presented on a display screen. Generally, the dynamic user-interaction control is presented on the display window of the display screen. In one embodiment, the triggering event occurs when the device user touches a touch-sensitive input device and holds that touch for a predetermined amount of time. Typically, the dynamic user-interaction control is presented at the location corresponding to the triggering event (i.e., the location of the device user's touch). The dynamic user-interaction control remains present on the display screen, and the device user can interact with the control, until a dismissal event is encountered. A dismissal event occurs under multiple conditions, including when the device user breaks touch contact with the dynamic user-interaction control for a predetermined amount of time.
Turning now to the figures,
As shown in
According to aspects of the disclosed subject matter, a triggering event may be caused by the device user touching, and remaining in contact with, a location on a touch-sensitive surface (e.g., the touch-sensitive display window 102) for a predetermined amount of time. In a non-limiting example, the predetermined amount of time is 1 second. As will be appreciated, touching and maintaining contact on the touch-sensitive display window 102 may be readily accomplished with one hand, such as pressing and touching the touch-sensitive display window with a thumb as shown in
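The press-and-hold test described above can be sketched as follows. The threshold constants and the sample format are assumptions for illustration; the text states only that the touch must be held for a predetermined amount of time (e.g., 1 second) without substantial movement from the original touch location.

```python
HOLD_TIME = 1.0        # seconds the touch must be maintained (assumed)
MOVE_THRESHOLD = 10.0  # max drift (pixels) still counted as "holding" (assumed)

def is_triggering_event(samples):
    """Return True if the touch samples represent a press-and-hold.

    samples is a list of (timestamp, x, y) tuples for one continuous
    touch, ordered by time.
    """
    if not samples:
        return False
    t0, x0, y0 = samples[0]
    for t, x, y in samples:
        if ((x - x0) ** 2 + (y - y0) ** 2) ** 0.5 > MOVE_THRESHOLD:
            return False          # moved too far: a drag, not a hold
        if t - t0 >= HOLD_TIME:
            return True           # held long enough near the origin
    return False                  # touch ended before the hold time
```

Once this returns True, the dynamic user-interaction control would be presented centered at the original touch location.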
Turning now to
In addition to panning, the dynamic user-interaction control 104 also enables the device user to alter the resolution/zoom of the content (i.e., simulate movement toward or away from the content such that the content may be viewed in differing resolutions and sizes).
While both panning and zooming are initiated within the dynamic user-interaction control 104, it should be appreciated that the user interaction need not be contained within the limits of the control. Indeed, the user interaction for panning will often exit the extent of the dynamic user-interaction control 104. Similarly, while the zooming interaction is determined according to rotation around an origin, the rotation may occur outside of the displayed limits of the dynamic user-interaction control 104.
Regarding the origin around which the rotation (and therefore the zoom) is determined, the above description has been made in regard to an origin corresponding to the original touch location, which also corresponds to the center of the dynamic user-interaction control 104. However, this is an example of only one embodiment of the disclosed subject matter. In alternative embodiments, the origin may correspond to the center of the touch-sensitive surface and/or the center of the display screen. Alternatively still, the origin may be dynamically established to correspond to the location of the beginning of the zoom activity/interaction. Still further, the origin may be dynamically determined based on the circular motion of the user's interaction. Of course, the center of zoom may be determined by any number of other methods, including being established by another touch with a finger or stylus.
Regarding the circular motions that control zooming, while the above discussion is made in regard to clockwise corresponding to zooming in and counter-clockwise zooming out, this is illustrative of one embodiment and should not be construed as limiting upon the disclosed subject matter. While the discussed arrangement may work well for some, an alternative arrangement may be similarly utilized: where counter-clockwise motions correspond to zooming in and clockwise motions correspond to zooming out.
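One way to detect the rotation sense for one step of circular motion is the 2-D cross product of successive vectors from the origin; this sketch is an illustrative assumption, not the original's stated method. It assumes screen coordinates (y increasing downward), in which a visually clockwise rotation yields a positive cross product, and it follows the embodiment above (clockwise corresponds to zooming in); the alternative arrangement simply flips the sign test.

```python
def zoom_direction(origin, prev_pt, cur_pt):
    """Return 'in', 'out', or None for one step of circular motion.

    Uses the 2-D cross product of the origin->prev and origin->cur
    vectors; in screen coordinates (y down), a positive cross product
    corresponds to visually clockwise motion.
    """
    v1x, v1y = prev_pt[0] - origin[0], prev_pt[1] - origin[1]
    v2x, v2y = cur_pt[0] - origin[0], cur_pt[1] - origin[1]
    cross = v1x * v2y - v1y * v2x
    if cross > 0:
        return "in"    # clockwise on screen: zoom in
    if cross < 0:
        return "out"   # counter-clockwise on screen: zoom out
    return None        # no rotation about the origin
```

The magnitude of the cross product (or the swept angle) could likewise scale how much zoom each step applies.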
The dynamic user-interaction control 104 may be dismissed via a dismissal event initiated in any number of ways. According to one embodiment, the dynamic user-interaction control 104 is dismissed from the display window 102 by a dismissal event caused by breaking contact with the control for a predetermined amount of time. For example, 2 seconds after the device user breaks contact (and does not re-initiate contact with the dynamic user-interaction control 104 on the touch-sensitive surface), a dismissal event is triggered. Alternatively, a dismissal event is triggered when the device user breaks contact with the dynamic user-interaction control 104 and/or interacts with the touch-sensitive surface (e.g., the touch-sensitive display window 102) outside of the control.
Advantageously, by providing a predetermined amount of time after breaking contact with the touch-sensitive surface, the device user can resume activity within that time by touching within the dynamic user-interaction control 104 and either panning or zooming (as described above). In this way, the device user can both pan and zoom without bringing the dynamic user-interaction control 104 up twice. For example, the device user may trigger the display of the dynamic user-interaction control 104 and start with a zoom, break contact for less than the predetermined amount of time it takes to trigger a dismissal event, then touch again within the control to perform a pan or zoom action.
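The grace-period behavior above can be sketched as a small state machine. The class name, the explicit timestamps, and the 2-second constant are illustrative assumptions; the point shown is only that re-engaging the control within the grace period cancels the pending dismissal, while letting it elapse dismisses the control.

```python
DISMISS_DELAY = 2.0  # seconds after release before dismissal (assumed)

class DynamicControl:
    """Tracks visibility of the control across release and re-engagement."""

    def __init__(self):
        self.visible = True
        self.release_time = None  # None while the user is touching

    def on_release(self, now):
        self.release_time = now

    def on_touch_inside(self, now):
        # Re-engaging within the grace period cancels pending dismissal.
        if self.visible and (self.release_time is None or
                             now - self.release_time < DISMISS_DELAY):
            self.release_time = None
            return True   # interaction resumes (pan or zoom)
        return False

    def tick(self, now):
        # Called periodically; dismisses the control once the grace
        # period elapses with no re-engagement.
        if (self.visible and self.release_time is not None
                and now - self.release_time >= DISMISS_DELAY):
            self.visible = False
```

For instance, releasing at t=0, touching again at t=1.5 resumes interaction; releasing at t=2.0 and remaining away through t=4.5 dismisses the control.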
Turning now to
To illustrate how the disclosed subject matter may work, the following is provided by way of example. On a touch-sensitive screen, the user touches and holds the touch for a predetermined amount of time (such as 0.5 seconds). Holding the touch means that the user maintains contact with the touch-sensitive surface and moves from the original touch location less than some threshold value for the predetermined amount of time. Holding the touch for that predetermined amount of time is recognized as a triggering event and causes a dynamic user interface control (such as user interface control 502 of
Continuing the example from above, the user may release the touch (after panning) and, if the user initiates another touch within the dynamic user-interaction control 502 within another predetermined threshold amount of time (e.g., 2 seconds), then another interaction with the control is interpreted. Assume this time that the user initiates another interaction within the outer area 504 of the dynamic user-interaction control 502 within the second predetermined threshold. Now the system interprets the interaction as a zoom because the user is touching within the outer area 504. As the user rotates around the origin of the control 502, a corresponding zooming action is made with regard to the underlying content 106. After the user releases the touch and the second time period (the second predetermined amount of time) expires without the user interacting within the dynamic user-interaction control 502, the control is dismissed. In various embodiments, the zoom interaction may exceed the bounds of the outer area 504, even extending outside of the control 502, so long as it was initiated within the control 502 (i.e., within the outer area 504).
While the disclosed subject matter has been described in regard to a mobile device 100 having a touch-sensitive display window 102, the disclosed subject matter is not limited to operating on this type of device. Indeed, the disclosed subject matter may be suitably applied to any number of other computing devices, including those that are typically not considered mobile devices. These other devices upon which the disclosed subject matter may operate include, by way of illustration and not limitation: tablet computers; laptop computers; all-in-one desktop computers; desktop computers; television remote controls; computers having wall-mounted displays; tabletop computers; and the like. Each of these may have an integral or external touch-sensitive input area that may or may not correspond to the display window. For example, aspects of the disclosed subject matter may be implemented on a laptop having a touchpad. As suggested by the non-exclusive list of devices that may take advantage of the disclosed subject matter, while a suitable device receives input via a touch-sensitive surface for interacting with displayed content, the touch-sensitive surface need not be the display window 102. Of course, when the input device and the display device are not the same, suitable indicators may be displayed on the dynamic user-interaction control 104 indicating the origin location as well as the current location.
Turning now to
At decision block 608, a determination is made as to whether the activity was a pan or a zoom. This determination may be based on the particular nature of the user interaction (i.e., if the user traces an arc, that may be indicative of a zoom; if the user moves away from the initial interaction point, that may be indicative of a pan) or on the location of the user interaction: whether the user interacts (and/or initiates the interaction) within an area designated for panning or within an area designated for zooming. If the activity was a zoom, the routine 600 proceeds to label B (
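The location-based form of the disambiguation at decision block 608 can be sketched as a radial test: a touch that begins in an inner region of the control is treated as a pan, and one that begins in the outer ring as a zoom. The radii below are illustrative assumptions, not values stated in the description.

```python
INNER_RADIUS = 40.0  # pan region radius in pixels (assumed)
OUTER_RADIUS = 90.0  # outer edge of the zoom ring in pixels (assumed)

def classify_activity(center, touch):
    """Classify a touch start point as 'pan', 'zoom', or None.

    center is the origin of the dynamic user-interaction control;
    touch is the location where the user's interaction begins.
    """
    dx = touch[0] - center[0]
    dy = touch[1] - center[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= INNER_RADIUS:
        return "pan"
    if dist <= OUTER_RADIUS:
        return "zoom"
    return None  # outside the control entirely
```

Only the starting location matters here; as noted above, the interaction itself may subsequently leave the displayed limits of the control.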
At block 616, a determination is made as to whether there has been a change in the current location. If there has been a change, the routine 600 returns to block 610 to re-determine the direction and magnitude for continuous panning. Alternatively, if there has not been a change, the routine 600 proceeds to block 618, where a further determination is made as to whether the device user has released contact with the input device. If the device user has not released contact, the routine 600 returns to block 614 to continue the continuous panning.
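The direction-and-magnitude computation for continuous panning can be sketched as below. This is an assumed model (a linear gain on the offset vector from the original touch point); the description itself specifies only that direction and magnitude are re-determined from the current location while contact is held.

```python
PAN_GAIN = 0.2  # pixels of pan per frame, per pixel of offset (assumed)

def pan_step(origin, current):
    """Return one frame's (dx, dy) pan from the held-touch offset.

    The pan direction follows the vector from the original touch
    location (origin) to the current touch location; its magnitude
    grows with the length of that vector, so holding the touch
    farther from the origin pans faster.
    """
    dx = current[0] - origin[0]
    dy = current[1] - origin[1]
    return (dx * PAN_GAIN, dy * PAN_GAIN)
```

Repeatedly applying this step while contact is held yields the continuous panning of blocks 610 through 616; when the touch returns to the origin the step is zero and panning stops.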
If, at block 618, the device user has released contact (a release event), the routine 600 proceeds to decision block 620. At decision block 620, a determination is made as to whether the device user has re-established contact with the dynamic user-interaction control 104 within the predetermined amount of time. If yes, the routine 600 returns to block 606 where a determination as to the device user's new user activity with the dynamic user-interaction control 104 is made. However, if not, the routine 600 proceeds to block 624 where the dynamic user-interaction control 104 is removed from display. Thereafter, the routine 600 terminates.
With regard to zooming, if at decision block 608 the user activity is in regard to zooming, the routine 600 proceeds through label B (
While many novel aspects of the disclosed subject matter are expressed in routines (such as routine 600 of
Turning now to
The processor 702 executes instructions retrieved from the memory 704 in carrying out various functions, particularly in regard to presenting a dynamic user-interaction control. The processor 702 may be comprised of any of various commercially available processors, including single-processor, multi-processor, single-core, and multi-core units. Moreover, those skilled in the art will appreciate that the novel aspects of the disclosed subject matter may be practiced with other computer system configurations, including but not limited to: mini-computers; mainframe computers; personal computers (e.g., desktop computers, laptop computers, tablet computers, etc.); handheld computing devices such as smartphones, personal digital assistants, and the like; microprocessor-based or programmable consumer electronics; game consoles; and the like.
The system bus 710 provides an interface for the various components to inter-communicate. The system bus 710 can be of any of several types of bus structures that can interconnect the various components (including both internal and external components). The exemplary computing device 700 may optionally include a network communication component 712 for interconnecting the computing device 700 with other computers, devices and services on a computer network. The network communication component 712 may be configured to communicate with these other, external devices and services via a wired connection, a wireless connection, or both.
The exemplary computing device 700 also includes a display subsystem 714. It is through the display subsystem 714 that the display window 102 displays content 106 to the device user and further presents the dynamic user-interaction control. The display subsystem 714 may be entirely integrated or may include external components (such as a display monitor, not shown, of a desktop computing system). Also included in the exemplary computing device 700 is an input subsystem 728. The input subsystem 728 provides the device user the ability to interact with the computing system 700, including interaction with a dynamic user-interaction control 104. In one embodiment, the input subsystem 728 includes (either as an integrated device or an external device) a touch-sensitive device. Further, in one embodiment, the display window of the display subsystem 714 and the input device of the input subsystem 728 are the same device (and are touch-sensitive).
Still further included in the exemplary computing device 700 is a dynamic user-interaction component 720. The dynamic user-interaction component 720 interacts with the input subsystem 728 and the display subsystem 714 to present a dynamic user-interaction control 104 for interaction by a device user. The dynamic user-interaction component 720 includes a continuous panning component 722 that implements the continuous panning features of a dynamic user-interaction control 104 described above. Similarly, the dynamic user-interaction component 720 includes a zoom component 724 that implements the various aspects of the zooming features of a dynamic user-interaction control 104 described above. The presentation component 726 presents a dynamic user-interaction control 104 upon the dynamic user-interaction component 720 detecting a triggering event, and may also be responsible for dismissing the dynamic user-interaction control upon a dismissal event.
Those skilled in the art will appreciate that the various components of the exemplary computing device 700 of
As mentioned above, aspects of the disclosed subject matter may be implemented on a variety of computing devices, including computing devices that do not have a touch-sensitive input device. Indeed, aspects of the disclosed subject matter may be implemented on computing devices through stylus, mouse, or joystick input devices. Similarly, aspects of the disclosed subject matter may also work with pen-and-touch input (on suitable surfaces), where the non-dominant hand uses the dynamic user-interaction control with touch while the dominant hand uses the stylus. Accordingly, the disclosed subject matter should not be viewed as limited to touch-sensitive input devices.
It should be appreciated that the panning and zooming activities/interaction described above may be combined with other user interactions. For example, as a user is panning or zooming the displayed content 106, the user may finish the panning with a flick gesture.
While various novel aspects of the disclosed subject matter have been described, it should be appreciated that these aspects are exemplary and should not be construed as limiting. Variations and alterations to the various aspects may be made without departing from the scope of the disclosed subject matter.