Graphical user interfaces for computing devices are increasingly being utilized to provide more natural, intuitive interactions with content. For example, some graphical user interfaces configured to be used with a touch-sensitive display input device may allow a user to move a virtual object by touching the display over the virtual object and then moving the touch to drag the object across the display, and/or to scroll through a list displayed on the display by flicking an item located on the display to cause a similar inertial motion as would occur if a physical object were flicked in a similar manner. Likewise, content may be displayed in a similarly natural, real-world manner. For example, a collection of photographs may be displayed as a pile or scattering of larger images, instead of as a grid or list of icons or thumbnails.
The use of modern touch-sensitive displays for interaction with a graphical user interface has allowed the development of intuitive gestures to be used to interact with an interface. However, current methods to organize, display and manipulate content on such touch-sensitive displays may use organizational techniques developed for pointer-based graphical user interfaces, and may not fully utilize the capabilities of modern touch-sensitive display technology. Further, creating advanced Natural User Interfaces (NUIs) for such graphical user interfaces may pose daunting programming challenges.
Accordingly, various embodiments related to the manipulation of content items on a touch-sensitive display are disclosed. For example, one disclosed embodiment provides a method for operating a graphical user interface on a computing device comprising a touch-sensitive display. The method comprises displaying a content container on the touch-sensitive display, the content container being configured to arrange one or more content items in the content container as a grouped set of content items and to allow a user to selectively move content items into and out of the content container. The method further comprises displaying an ungrouped set of content items on the touch-sensitive display outside of the content container, receiving a user input via a user interface associated with the content container, and in response to the user input, highlighting a content item in the ungrouped set of content items.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
Prior to discussing the organization and manipulation of content items on a touch-sensitive display, an embodiment of an example computing device including a touch-sensitive display is described.
The image source 104 includes a light source 108 such as a lamp (depicted), an LED array, or other suitable light source. The image source 104 also includes an image-producing element 110 such as the depicted LCD (liquid crystal display), an LCOS (liquid crystal on silicon) display, a DLP (digital light processing) display, or any other suitable image-producing element.
The display screen 106 includes a clear, transparent portion 112, such as a sheet of glass, and a diffuser screen layer 114 disposed on top of the clear, transparent portion 112. As depicted, the diffuser screen layer 114 acts as a touch surface. In other embodiments, an additional transparent layer (not shown) may be disposed over diffuser screen layer 114 as a touch surface to provide a smooth look and feel to the display surface. Further, in embodiments that utilize an LCD panel rather than a projection image source to display images on display screen 106, the diffuser screen layer 114 may be omitted.
Continuing with
To sense objects placed on display screen 106, the touch-sensitive display 102 includes an image sensor 124 configured to capture an image of the entire backside of display screen 106, and to provide the image to electronic controller 116 for the detection of objects appearing in the image. The diffuser screen layer 114 helps to avoid the imaging of objects that are not in contact with or positioned within a few millimeters of display screen 106. Because objects that are close to but not touching the display screen 106 may be detected by image sensor 124, it will be understood that the term “touch” as used herein also may comprise near-touch inputs.
The image sensor 124 may include any suitable image sensing mechanism. Examples of suitable image sensing mechanisms include but are not limited to CCD and CMOS image sensors. Further, the image sensing mechanisms may capture images of display screen 106 at a sufficient frequency to detect motion of an object across display screen 106. While the embodiment of
The image sensor 124 may be configured to detect light of any suitable wavelength, including but not limited to infrared and visible wavelengths. To assist in detecting objects placed on display screen 106, the image sensor 124 may further include an illuminant 126 such as one or more light emitting diodes (LEDs) configured to produce infrared or visible light to illuminate a backside of display screen 106. Light from illuminant 126 may be reflected by objects placed on display screen 106 and then detected by image sensor 124. Further, an infrared band pass filter 127 may be utilized to pass light of the frequency emitted by the illuminant 126 but prevent light at frequencies outside of the band pass frequencies from reaching the image sensor 124, thereby reducing the amount of ambient light that reaches the image sensor 124.
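The detection scheme described above may be sketched in code as follows. This is a minimal illustration only, not part of the disclosed embodiments: it assumes the captured backside image arrives as a 2-D grid of intensity values, and that touches appear as bright regions of reflected illuminant light. The function name and threshold value are hypothetical.

```python
# Hypothetical sketch: detect touch points in a captured backside image by
# thresholding bright (illuminant-reflecting) pixels and grouping adjacent
# bright pixels into blobs with a flood fill. Names and values are invented
# for illustration.

def detect_touch_blobs(image, threshold=128):
    """Return a list of blobs; each blob is a set of (row, col) pixels
    whose intensity exceeds the threshold."""
    rows, cols = len(image), len(image[0])
    seen = set()
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] > threshold and (r, c) not in seen:
                # Flood-fill the 4-connected region of bright pixels.
                blob, stack = set(), [(r, c)]
                while stack:
                    y, x = stack.pop()
                    if (y, x) in seen or not (0 <= y < rows and 0 <= x < cols):
                        continue
                    if image[y][x] <= threshold:
                        continue
                    seen.add((y, x))
                    blob.add((y, x))
                    stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
                blobs.append(blob)
    return blobs
```

Each returned blob could then be reported to the electronic controller as a distinct touch (or near-touch) location, e.g. by taking its centroid.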
While described herein in the context of an optical touch-sensitive system, the embodiments described herein also may be used with any other suitable type of touch-sensitive input system and with any suitable type of computing device. Examples of other such systems include, but are not limited to, capacitive and resistive touch-sensitive inputs. Further, while depicted schematically as a single device that incorporates the various components described above into a single unit, it will be understood that the touch-sensitive display 102 also may comprise a plurality of discrete physical parts or units connected as a system by cables, wireless connections, network connections, etc. It will be understood that the term “computing device” may include any device that electronically executes one or more programs, such as a user interface program. Such devices may include, but are not limited to, personal computers, laptop computers, servers, portable media players, hand-held devices, cellular phones, and microprocessor-based programmable consumer electronics and/or appliances.
Upon insertion of the data storage device 204 a content container 206 may be generated, as illustrated in
The content items may be arranged in any suitable manner in the content container 206, including but not limited to a stacked arrangement and a grid arrangement.
Content items arranged in the stacked configuration may be scrolled via a touch input (not shown), wherein scrolling comprises revealing a next-lowest content item in a stack by adjusting a z-order of the stack. Examples of suitable touch inputs include, but are not limited to, a tapping type touch input. The tapping type touch input may comprise touching the touch-sensitive display, via a digit or other manipulator, for a brief period of time after which the digit or manipulator is removed from the touch-sensitive display. However, in other embodiments, alternate approaches may be used to scroll through the grouped set of content items 208, such as a flicking type touch gesture, adjustment of a scrollbar, etc.
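The z-order scrolling described above may be sketched as follows. This is an illustrative sketch only, not a disclosed implementation: a list stands in for the stack's z-order (index 0 being the topmost item), and a tap rotates the topmost item to the bottom, revealing the next-lowest item.

```python
# Illustrative sketch: scroll a stacked content container by adjusting
# z-order. The list order represents z-order; index 0 is the top of the
# stack. A tapping type touch input triggers one rotation.

def scroll_stack(stack):
    """Reveal the next-lowest content item by moving the topmost item
    to the bottom of the z-order."""
    if len(stack) < 2:
        return stack
    return stack[1:] + stack[:1]
```

Repeated taps thus cycle through every item in the grouped set without removing any item from the container.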
After the touch gesture has been performed, the content container 206 is generated, as shown in
Additionally, in some embodiments a user may toggle between the various arrangements (e.g. grid configuration, stacked configuration), allowing the content container to be easily adapted. Toggling may be initiated via a touch input, touch gesture, or in any other suitable manner.
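The toggling behavior described above may be sketched as follows. This is a hypothetical illustration, not part of the disclosed embodiments: the container cycles through the arrangements it supports each time a toggle input is received, and the class and attribute names are invented.

```python
# Illustrative sketch: toggle a content container between its supported
# arrangements (e.g. stacked and grid) in response to a touch input.
# All names are hypothetical.

class ContentContainer:
    ARRANGEMENTS = ("stacked", "grid")

    def __init__(self):
        self.arrangement = "stacked"

    def toggle_arrangement(self):
        """Advance to the next supported arrangement, wrapping around."""
        i = self.ARRANGEMENTS.index(self.arrangement)
        self.arrangement = self.ARRANGEMENTS[(i + 1) % len(self.ARRANGEMENTS)]
        return self.arrangement
```

Because the cycle wraps, the same toggle input returns the container to its original arrangement, so the container is easily adapted without a separate input per arrangement.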
A user may want to move one or more content items outside of the content container 206, for example, to edit, manipulate, resize, etc. a content item. Therefore, as illustrated in
In response to the touch gesture, the selected content item 220 may be moved to a location outside of the content container 206, as illustrated in
First,
Next,
When a content item is located outside of content container 206, the specific location of the content item on the graphical user interface may be determined via interaction with the proxy view 226, a context menu associated with the content container 206, or other suitable interactions with the graphical user interface.
Next referring to
In response to the touch input over the proxy view 226 of the selected content item 220, the selected content item 220 is highlighted. Highlighting may comprise any visual response configured to distinguish the selected content item 220 from other ungrouped content items. In the depicted embodiment, highlighting is represented schematically via a hatched border 232 surrounding the selected content item 220 in
In some embodiments, highlighting also may comprise an animated movement of the selected content item 220, for example, via vibration, movement to an unoccupied portion of the user interface, etc. Further, in some embodiments, a user may move the proxy view 226 to cause movement of the selected content item 220 to help locate the selected content item 220.
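The highlighting responses described above may be sketched as follows. This is an illustrative sketch under assumed names: the `ContentItem` type and its fields are invented, and the specific values (border style, brightness factor) are arbitrary examples of the visual responses mentioned above.

```python
# Illustrative sketch: highlight a selected ungrouped content item by
# applying several of the visual responses described above (a distinguishing
# border, a raised z-order, an adjusted image characteristic, and an
# animated movement). All names and values are hypothetical.

class ContentItem:
    def __init__(self, name):
        self.name = name
        self.border = None
        self.z_order = 0
        self.brightness = 1.0
        self.animating = False

def highlight(item, top_z):
    """Visually distinguish the selected item from other ungrouped items."""
    item.border = "hatched"   # e.g. a hatched border around the item
    item.z_order = top_z + 1  # raise above other ungrouped items
    item.brightness = 1.2     # adjust an image characteristic
    item.animating = True     # e.g. a brief vibration animation
    return item
```

Any subset of these responses may suffice; the sketch simply shows that each response is an independent change to the item's display state.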
As depicted, a user first touches an area over or proximate to proxy view 226. In response to the touch gesture, both the proxy view 226 of the selected content item and the selected content item 220 move, as illustrated in
In some embodiments, the highlighting of the selected content item 220, illustrated in
A content category 244 may be selected via a touch input, or other suitable user input. The touch input may be performed above or proximate to the displayed content category 244, as illustrated in
Method 1900 comprises, at 1902, displaying a content container on the touch-sensitive display, the content container being configured to arrange one or more content items in the content container as a grouped set of content items and to allow a user to selectively move content items into and out of the content container. In some embodiments, the content items comprise one or more of image content, video content, music content, documents, spreadsheets, text files, programs, and/or any other suitable type of content, and may have any suitable representation and/or appearance.
The grouped set of items may be arranged in various configurations within the content container. One non-limiting example configuration includes a stacked configuration. A stacked configuration may comprise two or more content items arranged according to an assigned z-order. Additionally, each content item included in the stack may be offset according to a pre-determined geometry, facilitating easy viewing of the content items contained within the stack.
Additionally, the grouped set of content items may be displayed in a grid configuration. The grid configuration may comprise two or more content items arranged in axial alignment, which may be horizontal and/or vertical. It will be appreciated that a multitude of configurations may be used and that the aforementioned configurations are exemplary in nature.
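The two example configurations may be sketched as position computations. This is an illustration only: the offset geometry and cell dimensions are arbitrary values standing in for the pre-determined geometry and axial alignment described above.

```python
# Illustrative sketch: compute display positions for the two example
# configurations. A stacked configuration offsets each item by a fixed
# (dx, dy) geometry according to its z-order, so every item in the stack
# remains partially visible; a grid configuration aligns items along
# horizontal and vertical axes. All numeric values are hypothetical.

def stacked_positions(n, dx=4, dy=4):
    """Offset each of n items in z-order by a pre-determined geometry."""
    return [(i * dx, i * dy) for i in range(n)]

def grid_positions(n, columns=3, cell_w=100, cell_h=80):
    """Place n items in axial alignment, filling rows left to right."""
    return [((i % columns) * cell_w, (i // columns) * cell_h) for i in range(n)]
```

Toggling between configurations then amounts to recomputing positions with the other function and moving each item to its new position.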
Method 1900 next comprises, at 1904, displaying an ungrouped set of content items on the touch-sensitive display outside of the content container and, at 1906, receiving a user input via a user interface associated with the content container. In some embodiments, the user input may include a touch gesture performed over or proximate to a selected content item. However, in other embodiments, the user input may be received via a contextual menu associated with (e.g. displayed within) the content container. Therefore, the selection of the content category may be received from the contextual menu. Additionally, the highlighted ungrouped content item may be a member of the content category.
Method 1900 next comprises, at 1908, highlighting a content item in the ungrouped set of content items to form a highlighted ungrouped content item in response to the user input. In some embodiments highlighting the content item in the ungrouped set of content items comprises one or more of applying an effect on the content item, adjusting a z-order of the content item, adjusting one or more image characteristics of the content item, and moving the content item, either via animation or via user-controlled movement.
In one example embodiment, the grouped set of content items may include a proxy view of the highlighted ungrouped content item. The proxy view of the content item may have different image characteristics than the ungrouped view of the content item. The image characteristics may include opacity, saturation, and brightness. Additionally, the user input may include a touch input above the proxy view of the highlighted ungrouped content item.
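The proxy view relationship may be sketched as follows. This is a hypothetical illustration: the dictionary representation, field names, and attenuation factors are invented, and merely show a proxy sharing the item's identity while carrying adjusted image characteristics.

```python
# Illustrative sketch: derive a proxy view from a content item. The proxy
# keeps the item's identifying data but has adjusted image characteristics
# (here, reduced opacity and saturation) so it reads as a stand-in within
# the content container. Field names and factors are hypothetical.

def make_proxy_view(item):
    """Return a proxy view of `item` with adjusted image characteristics."""
    proxy = dict(item)  # shallow copy; the original item is unmodified
    proxy["opacity"] = item.get("opacity", 1.0) * 0.5
    proxy["saturation"] = item.get("saturation", 1.0) * 0.6
    proxy["is_proxy"] = True
    return proxy
```

A touch input above such a proxy view can then be resolved back to the ungrouped item it represents, e.g. via the shared identifying data.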
In some embodiments, as shown at 1910, the method may comprise maintaining highlighting of the highlighted ungrouped content item for a duration after cessation of the touch input. In some embodiments the duration may be pre-determined. After 1910 the method ends.
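The persistence behavior at 1910 may be sketched as follows. This is an illustrative sketch only: time values are passed in explicitly so the logic is self-contained, and the duration constant is a hypothetical pre-determined value.

```python
# Illustrative sketch: maintain highlighting for a pre-determined duration
# after cessation of the touch input. A timestamp recorded when the touch
# ends drives the decision. The duration value is hypothetical.

HIGHLIGHT_DURATION = 2.0  # seconds; an assumed pre-determined duration

def is_highlighted(touch_end_time, now, duration=HIGHLIGHT_DURATION):
    """True while the touch persists (touch_end_time is None) or for
    `duration` seconds after the touch input ceased."""
    if touch_end_time is None:
        return True
    return (now - touch_end_time) < duration
```

Polling this predicate each frame (with the current time as `now`) yields a highlight that lingers briefly after the user lifts the touch, helping the user locate the item.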
The above-described embodiments further allow a user to efficiently utilize inputs on a touch-sensitive display to manage, organize, and manipulate content items. It will be understood that the term “computing device” as used herein may refer to any suitable type of computing device configured to execute programs. Such computing devices may include, but are not limited to, the illustrated surface computing device, a mainframe computer, personal computer, laptop computer, portable data assistant (PDA), computer-enabled wireless telephone, networked computing device, combinations of two or more thereof, etc. As used herein, the term “program” refers to software or firmware components that may be executed by, or utilized by, one or more computing devices described herein, and is meant to encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc. It will be appreciated that a computer-readable storage medium may be provided having program instructions stored thereon, which upon execution by a computing device, cause the computing device to execute the methods described above and cause operation of the systems described above.
It will further be understood that the embodiments of touch-sensitive displays depicted herein are shown for the purpose of example, and that other embodiments are not so limited. Furthermore, the specific routines or methods described herein may represent one or more of any number of processing strategies such as event-driven, interrupt-driven, multi-tasking, multi-threading, and the like. As such, various acts illustrated may be performed in the sequence illustrated, in parallel, or in some cases omitted. Likewise, the order of any of the above-described processes is not necessarily required to achieve the features and/or results of the example embodiments described herein, but is provided for ease of illustration and description. The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.