Touch-sensitive graphical user interfaces of computing devices are capable of presenting graphical content and receiving one or more touch inputs from fingers, styluses, and/or other suitable objects in order to manipulate the graphical content. Such touch-sensitive graphical user interfaces may include a display system that is configured to display the graphical content to a user, and a touch input device that is configured to detect one or more touch inputs on a display surface. Various types of touch input devices are known, including but not limited to capacitive, resistive and optical mechanisms.
The use of a touch-sensitive graphical user interface may enable the utilization of a broader range of touch-based inputs than other user input devices. However, current pointer-based graphical user interfaces designed for use with a mouse or other cursor control device may not be configured to utilize the capabilities of modern touch-sensitive devices.
Accordingly, various embodiments related to the manipulation of content items on a touch-sensitive graphical user interface are disclosed herein. For example, one disclosed embodiment provides a method of organizing content items presented on a touch-sensitive graphical user interface. The method comprises receiving a touch gesture at the touch-sensitive graphical user interface, the touch gesture defining a set of zero or more content items to be grouped together and further defining a region of the touch-sensitive graphical user interface. The method further comprises forming an organizational container responsive to receiving the touch gesture at the touch-sensitive graphical user interface and presenting a boundary that defines the organizational container. The method further comprises moving the set of content items into the organizational container and presenting the set of content items on the touch-sensitive graphical user interface within the boundary defining the organizational container. The set of content items may be arranged within the boundary according to an organized view.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
Various embodiments are disclosed herein that relate to the operation of a touch-sensitive graphical user interface. As mentioned above, many graphical user interfaces for computing devices may not be configured to exploit the capabilities offered by a touch-sensitive use environment that may allow for a richer user experience. Before discussing these embodiments, an example touch-sensitive graphical user interface environment is described.
Touch-sensitive graphical user interface 102 includes a display system 120 configured to present graphical content. Display system 120 includes a display surface 106 and an image source 104. As a non-limiting example, image source 104 may include a projection device configured to present an image (e.g., graphical content) on display surface 106.
Touch-sensitive graphical user interface 102 further includes a touch input device 118 configured to receive a touch gesture responsive to an object contacting display surface 106 of display system 120. Touch input device 118 may include an image sensor 108 for acquiring an infrared image of the display surface 106 to detect objects, such as fingers, touching or contacting the display surface 106. The display surface 106 may comprise various structures such as diffuser layers, anti-glare layers, etc. not shown in detail herein. The touch input device may further include an illuminant 110, depicted herein as an infrared light source, configured to illuminate a backside of the display surface 106 with infrared light.
Through operation of one or more of the image source 104, the image sensor 108, and the illuminant 110, the touch-sensitive graphical user interface may be configured to detect one or more touches contacting display surface 106. In some embodiments, touch input device 118 may be configured to detect and distinguish multiple temporally overlapping touches on display surface 106, herein referred to as a multi-touch input (e.g., a multi-touch gesture). For example, infrared light from the illuminant 110 may be reflected by objects contacting display surface 106, and then detected by image sensor 108 to allow detection of one or more objects on display surface 106. An optical filter (not shown) may be used to reduce or prevent unwanted wavelengths of light from reaching image sensor 108. While the depicted embodiment comprises a single image sensor 108, it will be understood that a touch-sensitive graphical user interface may have any suitable number of image sensors which each may detect a portion of the display surface 106, or an entire area of the display surface 106.
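For illustration, the following is a minimal sketch, in Python, of one plausible way such touch detection might be performed; the 2D-array input, intensity threshold, and minimum blob area are assumptions made for the purpose of example and are not prescribed by this disclosure.

```python
# Hedged sketch: detect touch points in an infrared image of the display
# surface by thresholding and connected-component labeling. The array
# input, threshold, and minimum blob area are illustrative assumptions.
import numpy as np
from scipy import ndimage

def detect_touches(ir_image, intensity_threshold=0.6, min_area=20):
    """Return (x, y) centroids of bright regions likely to be touches."""
    mask = ir_image > intensity_threshold      # touching objects reflect IR brightly
    labels, count = ndimage.label(mask)        # group bright pixels into blobs
    touches = []
    for i in range(1, count + 1):
        ys, xs = np.nonzero(labels == i)
        if xs.size >= min_area:                # ignore small specks of noise
            touches.append((xs.mean(), ys.mean()))
    return touches
```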
Computing device 100 further comprises a controller 112 having memory 114 and a logic subsystem 116. Logic subsystem 116 may include one or more processors. Memory 114 may comprise instructions (e.g., one or more programs) executable by the logic subsystem 116 to operate the various components of computing device 100. For example, memory 114 may comprise instructions executable by the logic subsystem 116 to operate display system 120 and the touch input device 118 to receive a touch gesture at the touch input device.
As will be described in greater detail with reference to the following figures, the touch gesture may define a set of content items to be grouped together within an organizational container and may further define a region of the display surface where the organizational container may be formed. The term “content items” as used herein refers to the representation of a content item on a graphical user display, and may include representations of any suitable type of content, including but not limited to electronic files, documents, images, audio, video, software applications, etc.
Memory 114 may further comprise instructions executable by the logic subsystem 116 to operate display system 120 and the touch input device 118 to form an organizational container responsive to receiving the touch gesture at the touch input device. The term “organizational container” as used herein signifies a dynamic grouping mechanism where content (such as cards, photos, videos, albums, etc.) is added to the container and organized within the container. Unlike folders, organizational containers allow a user to view the content and manipulate the content and the containers in various interactive ways.
For example, where a set of content items is associated with an organizational container (e.g., by moving the set of content items into the organizational container), the set of content items may be controlled or navigated as a group or individually, depending upon the input gestures used. As another example, if an action is applied to the organizational container by a user, the action may be applied to each content item within that organizational container. As yet another example, a user may move the set of content items to a different location of the display surface by dragging and dropping the organizational container.
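As a non-limiting illustration, an organizational container might be modeled as a simple data structure such as the following Python sketch; all names and methods here are hypothetical.

```python
# Hedged sketch of an organizational container as a dynamic grouping
# mechanism; all names and methods are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ContentItem:
    name: str
    x: float = 0.0
    y: float = 0.0

@dataclass
class OrganizationalContainer:
    boundary: list                             # polygon presented to the user
    items: list = field(default_factory=list)

    def add(self, item):
        """Move a content item into the container."""
        self.items.append(item)

    def apply_to_all(self, action):
        """An action applied to the container applies to each item in it."""
        for item in self.items:
            action(item)

    def move_by(self, dx, dy):
        """Dragging the container navigates the whole set as a group."""
        self.boundary = [(x + dx, y + dy) for x, y in self.boundary]
        for item in self.items:
            item.x += dx
            item.y += dy
```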
At 210, the method includes receiving a touch gesture at the touch-sensitive graphical user interface. Next, at 212, the method comprises forming an organizational container in response to the receipt of the touch gesture and, at 214, presenting a boundary on the touch-sensitive graphical user interface at the region defined by the touch gesture, wherein the boundary defines the organizational container. The method next comprises, at 216, moving a set of content items into the organizational container, and then, at 218, presenting the set of content items on the graphical user interface within the organizational container in an organized view. In this manner, a user may organize content (e.g., represented as content items) displayed on a graphical user interface with simple, intuitive gestures. The content may then be manipulated in other manners via manipulation of the organizational container. For example, a user may use the organizational container to present a slideshow of images and/or videos contained within the organizational container. It will be understood that this use of an organizational container is presented for the purpose of illustration, and is not intended to be limiting in any manner.
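For the purpose of example, the sequence at 210-218 might be expressed in code along the following lines; the gesture and display helpers are assumed, and the container type reuses the sketch above.

```python
# Hedged sketch of steps 210-218; `gesture` and `display` are hypothetical
# objects, and OrganizationalContainer is the illustrative type sketched above.
def handle_touch_gesture(gesture, display):
    region = gesture.enclosed_region()                    # 210: gesture defines a region
    container = OrganizationalContainer(boundary=region)  # 212: form the container
    display.present_boundary(container.boundary)          # 214: present the boundary
    for item in gesture.selected_items():                 # 216: move the defined set in
        container.add(item)
    display.present_organized(container.items, container.boundary)  # 218: organized view
    return container
```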
The touch gesture received at 210 may be defined by a path of travel of an object contacting the touch-sensitive graphical user interface (e.g., display surface 106). In some embodiments, the touch gesture defines a set of zero or more content items to be grouped together in the organizational container. For example, referring to
The touch gesture received at 210 also may define a region of the touch-sensitive graphical user interface (e.g., a region of display surface 106) at or near which the organizational container is to be formed. For example, such a region is shown at 452 in
As mentioned above, the touch gestures described herein for forming an organizational container may be configured to be intuitive gestures that mimic the physical gestures used to perform analogous physical tasks. For example, referring to
The organizational container formed at process 212 of
As described above, a boundary may be displayed around a perimeter of an organizational container to illustrate the location and shape of the container to a user more clearly. Such a boundary may have any suitable appearance. For example, the boundary may be displayed as a sharp line, a diffuse aura, or in any other suitable form. Further, the boundary may extend around the entire perimeter of an organizational container, or only a portion of the container. Furthermore, in some embodiments, a background canvas 420 presented on the graphical user interface may be exposed to a user in an internal region of the boundary such that the canvas is visible within the organizational container.
The organizational containers shown in
In other embodiments, instead of defining a set of content items and forming an organizational container with those items by substantially surrounding the items with a touch gesture, a set of content items may be defined and an organizational container may be formed by defining a path of travel between two or more content items on the touch-sensitive display.
In yet other embodiments, an organizational container may be formed by making a touch input that defines a path of travel that corresponds to a recognized gesture. A recognized gesture may include a symbol, a geometric shape, an alphanumeric character, or a gesture defined by a specified action. For example, an alphanumeric character may include an alphabetic character (e.g., a letter), a numerical character (e.g., a digit), or any other suitable character. A geometric shape may include a line, a circle, a semi-circle, an ellipse, a polygon (e.g., a triangle, square, rectangle, etc.), or other suitable geometric shape. It should be appreciated that a geometric shape may include closed, open, or substantially closed forms that are defined by the path of travel of an object contacting the display surface. A symbol may include a swirl, an arrow, or other suitable symbol. Likewise, an action may include a characteristic rubbing action of the touch-sensitive graphical user interface or a tapping of the touch-sensitive graphical user interface, or other suitable action.
As examples,
Each of these methods of forming an organizational container may involve comparing a received touch input gesture to one or more expected touch input gestures, and then determining whether the path of travel of the received touch input gesture matches an expected touch input gesture.
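As a non-limiting example, such a comparison might resemble the following Python sketch, which resamples a path of travel and compares it point-by-point to an expected template (a simplified "$1 recognizer"-style test; this disclosure does not prescribe a particular matching algorithm, and the paths are assumed pre-normalized to a common scale and origin).

```python
# Hedged sketch of gesture matching: resample both paths to a fixed number
# of points and compare them point-by-point.
import math

def path_length(path):
    """Total length of a polyline given as a list of (x, y) points."""
    return sum(math.dist(a, b) for a, b in zip(path, path[1:]))

def resample(path, n=32):
    """Resample a polyline to n evenly spaced points."""
    step = path_length(path) / (n - 1)
    if step == 0:                        # degenerate path (e.g., a tap)
        return [path[0]] * n
    pts, out, acc, i = list(path), [path[0]], 0.0, 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if acc + d >= step:              # the next sample falls on this segment
            t = (step - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)             # continue measuring from the new sample
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:                  # guard against floating-point shortfall
        out.append(pts[-1])
    return out[:n]

def matches(received, template, tolerance=0.25):
    """True if the mean point-to-point distance is within the tolerance."""
    a, b = resample(received), resample(template)
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a) <= tolerance
```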
The method comprises, at 310, determining whether the path of travel of the object corresponds to a recognized gesture. If the answer at 310 is judged yes, the process flow may proceed to 318, where the organizational container may be formed.
Alternatively, if the answer at 310 is judged no, the process flow may instead proceed to 312 where it is determined whether the path of travel of the object contacts one or more content items displayed on the graphical user interface. If the answer at 312 is judged yes, the process flow may proceed to 318 where the organizational container may be formed.
Alternatively, if the answer at 312 is judged no, the process flow may instead proceed to 314 where it may be judged whether the path of travel of the object is within a threshold proximity to one or more content items of the set of content items. If the answer at 314 is judged yes, the process flow may proceed to 318 where the organizational container may be formed.
Alternatively, if the answer at 314 is judged no, the process flow may instead proceed to 316 where it may be judged whether the path of travel substantially surrounds the set of content items. For example, referring again to
Alternatively, if the answer at 316 is judged no, then the method proceeds to 317, where it is determined whether the path of travel of the touch gesture causes a movement of two or more content items into an overlapping arrangement on the graphical user interface. If so, then an organizational container is formed if the number of content items in the overlapping arrangement exceeds a threshold number of overlapping content items. Otherwise, if the path of travel does not cause such an overlapping arrangement, or if the number of overlapping content items does not exceed the threshold, then the process flow may return or end.
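For illustration, the decision cascade at 310-318 might be sketched as follows; the helper predicates and the threshold value are assumptions made for the purpose of example.

```python
# Hedged sketch of the cascade at 310-318; each predicate below is an assumed
# helper corresponding to one decision, and the threshold is illustrative.
OVERLAP_THRESHOLD = 3  # hypothetical minimum number of overlapping items

def should_form_container(path, items):
    if is_recognized_gesture(path):               # 310: recognized gesture?
        return True                               # -> 318: form container
    if contacts_any_item(path, items):            # 312: path contacts an item?
        return True
    if within_threshold_proximity(path, items):   # 314: path near an item?
        return True
    if substantially_surrounds(path, items):      # 316: path surrounds the set?
        return True
    moved = items_moved_into_overlap(path, items) # 317: items pushed together?
    return len(moved) >= OVERLAP_THRESHOLD        # form only above the threshold
```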
Any suitable value may be used for the threshold number of overlapping content items to form an organizational container. For example,
Next referring to
In some embodiments, a “scooping” gesture also may be used to form an overlapping arrangement of content items.
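As a non-limiting example, one way to test for an overlapping arrangement, and to count the overlapping items against the threshold from the cascade sketch above, is an axis-aligned bounding-box test such as the following; the box representation is an assumption for illustration.

```python
# Hedged sketch of an overlap test; boxes are (x0, y0, x1, y1) tuples
# (this disclosure does not specify a particular overlap test).
def rects_overlap(a, b):
    """Axis-aligned bounding-box intersection test."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def count_overlapping(boxes):
    """Count the boxes that overlap at least one other box."""
    return sum(
        any(rects_overlap(a, b) for j, b in enumerate(boxes) if i != j)
        for i, a in enumerate(boxes)
    )

# e.g., an organizational container might be formed when
# count_overlapping(item_boxes) >= OVERLAP_THRESHOLD
```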
In the above-described embodiments, it can be seen that a set of content items may be defined and then moved into an organizational container in various manners. As a more specific example, in some embodiments, content items are moved into the organizational container responsive to formation of the organizational container (e.g., at the time of formation of the organizational container). For example, as shown in
In other embodiments, the set of content items may be moved into the organizational container after formation of the organizational container and responsive to receiving at least a second touch gesture at the touch-sensitive graphical user interface after receiving the gesture that forms the organizational container. For example, the embodiments of
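For the purpose of example, the two timing variants might be sketched as follows, reusing the hypothetical container type from above; the gesture objects and their methods are likewise assumed.

```python
# Hedged sketch of the two timing variants: items captured at formation
# time, or added later in response to a second gesture.
def form_container_and_capture(gesture):
    """Items are moved in at the time the container is formed."""
    container = OrganizationalContainer(boundary=gesture.enclosed_region())
    for item in gesture.selected_items():
        container.add(item)
    return container

def add_items_on_second_gesture(container, second_gesture):
    """Items are added later, e.g., by dragging them over the boundary."""
    for item in second_gesture.selected_items():
        container.add(item)
```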
It will be appreciated that the computing devices described herein may be any suitable computing device configured to execute the programs described herein, and are not limited to the disclosed surface computing device. For example, a computing device may be a mainframe computer, personal computer, laptop computer, portable data assistant (PDA), computer-enabled wireless telephone, networked computing device, or other suitable computing device, and such devices may be connected to each other via computer networks, such as the Internet. These computing devices typically include a processor and associated volatile and non-volatile memory, and are configured to execute programs stored in non-volatile memory using portions of volatile memory and the processor. As used herein, the term “program” refers to software or firmware components that may be executed by, or utilized by, one or more computing devices described herein, and is meant to encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc. It will be appreciated that computer-readable media may be provided having program instructions stored thereon, which, upon execution by a computing device, cause the computing device to execute the methods described above and cause operation of the systems described above.
It should be understood that the embodiments herein are illustrative and not restrictive, since the scope of the invention is defined by the appended claims rather than by the description preceding them, and all changes that fall within metes and bounds of the claims, or equivalence of such metes and bounds thereof are therefore intended to be embraced by the claims.