In some situations, a user may desire to simultaneously display multiple documents or multiple programs running on a computing device. However, the user's display device may not have a sufficiently large screen or a sufficient resolution to effectively provide the desired display. Thus, the conventional strategy to overcome this deficiency is to couple multiple display devices to one computing device and configure each display device to represent a respective user interface portion. This is often problematic because the conventional strategy requires additional space, time, and resources to purchase, install, configure, and operate the multiple display devices. Furthermore, the conventional strategy results in a fragmented user interface with dead space between the display devices.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter. Nor is this Summary intended to be used to limit the claimed subject matter's scope.
Canvas manipulation using three-dimensional (3D) spatial gestures may be provided. A two-dimensional (2D) user interface (UI) representation may be displayed. A first gesture may be performed, and, in response to the first gesture's detection, the 2D UI representation may be converted into a 3D UI representation. A second gesture may then be performed, and, in response to the second gesture's detection, the 3D UI representation may be manipulated. Finally, a third gesture may be performed, and, in response to the third gesture's detection, the 3D UI representation may be converted back into the 2D UI representation.
Both the foregoing general description and the following detailed description provide examples and are explanatory only. Accordingly, the foregoing general description and the following detailed description should not be considered to be restrictive. Further, features or variations may be provided in addition to those set forth herein. For example, embodiments may be directed to various feature combinations and sub-combinations described in the detailed description.
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate various embodiments of the present invention. In the drawings:
The following detailed description refers to the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the following description to refer to the same or similar elements. While embodiments of the invention may be described, modifications, adaptations, and other implementations are possible. For example, substitutions, additions, or modifications may be made to the elements illustrated in the drawings, and the methods described herein may be modified by substituting, reordering, or adding stages to the disclosed methods. Accordingly, the following detailed description does not limit the invention. Instead, the proper scope of the invention is defined by the appended claims.
Embodiments of the invention may allow the user to manipulate the 3D user interface representation. For example, while the 3D user interface representation is being displayed, the user may perform hand gestures indicating user interface rotation, propagation, or any other user interface manipulations. In this way, the user's gestures may be correlated with respective manipulations in the 3D user interface. For instance, the user may have their hand 115 initially positioned perpendicular (or approximately perpendicular) to the display device with, for example, their fingers pointing towards display device 100. From the perpendicular position, by way of example, the user may angle their hand 115 towards the right to indicate a desired user interface rotation towards the right. Accordingly, the user may angle their hand 115 in any direction to indicate a desired user interface rotation in the corresponding angled direction.
Similarly, the user may perform gestures to zoom into or out of the 3D representation of the user interface. In this way, combining the angled and zooming hand gestures, the user may simulate ‘propagation’ through the user interface as though their hand gestures were controlling the roll, pitch, and yaw of the simulated propagation. This may be done, for example, by moving the user's hand 115 towards display device 100 to indicate a zoom, or propagation, into the user interface, and by moving the user's hand 115 away from display device 100 to indicate a zoom, or propagation, out of the user interface. In addition, the user may perform gestures with both of their hands in unison. For example, the user's left hand may indicate a rate of user interface manipulation, while the user's right hand may indicate a type of user interface manipulation.
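By way of illustration only, the correlation between hand gestures and 3D manipulations described above may be sketched as follows. This sketch is not part of the disclosed embodiments; the names (`HandState`, `Camera3D`, `apply_gesture`) and the scaling constants are hypothetical assumptions chosen for the example.

```python
# Illustrative sketch: mapping detected hand-gesture readings to updates of a
# 3D user interface view. All names and constants here are hypothetical.
from dataclasses import dataclass

@dataclass
class HandState:
    pitch_deg: float    # upward/downward angle of the hand at the wrist
    yaw_deg: float      # left/right angle of the hand at the wrist
    depth_delta: float  # hand movement toward (+) or away from (-) the display

@dataclass
class Camera3D:
    yaw: float = 0.0
    pitch: float = 0.0
    zoom: float = 1.0

def apply_gesture(camera: Camera3D, right: HandState, rate: float = 1.0) -> Camera3D:
    """The hand's angle steers rotation; its depth displacement steers zoom.

    `rate` could be supplied by the user's left hand, per the two-handed
    example above, while the right hand supplies the type of manipulation.
    """
    camera.yaw += right.yaw_deg * rate      # angle right -> rotate right
    camera.pitch += right.pitch_deg * rate  # angle up -> rotate up
    camera.zoom *= 1.0 + 0.1 * right.depth_delta * rate  # toward display -> zoom in
    return camera
```

Under this sketch, angling the hand to the right increases the view's yaw, angling it upward increases pitch, and moving the hand toward the display increases zoom, approximating the roll/pitch/yaw ‘propagation’ described above.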
Moreover, embodiments of the invention may use a gesture detection device, for example, a detection device 315 as described in more detail below with respect to
In addition, embodiments of the invention may provide a system for manipulating a user interface using spatial gestures. For example, display device 100 may display a user interface as either a 3D representation or a 2D representation. The user interface may comprise displayed user interface portions 105 and hidden user interface portions 110 (shown in
Method 200 may begin at starting block 205 and proceed to stage 210 where computing device 300 may display a first user interface representation. For example, display device 100 coupled to computing device 300 may present the user interface in a 2D representation. This 2D representation may comprise displayable elements that may not be displayed at the display device. For instance, the display device may not have a sufficient resolution or a large enough screen to display the entirety of the user interface. Consequently, the display device may only display a first user interface portion.
From stage 210, where computing device 300 displays the first user interface representation, method 200 may advance to stage 220 where computing device 300 may receive a first user gesture detection. For example, a detection device, as detailed above, may detect a first hand gesture by a user of computing device 300. The first hand gesture may indicate that the user would like to change a representation of the user interface from the first representation to a second representation. With the second user interface representation, the user may view additional user interface portions not displayable by the first user interface representation, as described in greater detail below.
Once computing device 300 receives the first user gesture detection in stage 220, method 200 may continue to decision block 225 where computing device 300 may determine if the received first gesture corresponds to a requested change in user interface representation. For example, in order to indicate a request to change the user interface representation from the first representation to the second representation, the user may perform a first hand gesture. The first hand gesture may comprise a motion of the user's hand from an initial position with the palm approximately parallel to the display device and the fingers pointing upward, to a subsequent position with the palm approximately perpendicular to the display device and the fingers pointing towards the display device. In other embodiments of the invention, the first hand gesture may comprise a displacement of the user's hand towards the display device.
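As an illustrative sketch only, the determination at decision block 225 might be approximated by comparing the palm's orientation relative to the display at the start and end of the detected motion. The function name and tolerance below are hypothetical and not part of the disclosure.

```python
# Hypothetical sketch of decision block 225: does a detected motion match the
# first hand gesture (palm parallel to display -> palm perpendicular, fingers
# toward display)? Angles are relative to the display plane; 0 degrees means
# the palm is parallel to the display, 90 degrees means perpendicular.
def classify_first_gesture(start_deg: float, end_deg: float, tol: float = 20.0) -> bool:
    """Return True if the motion starts near parallel (~0 deg) and ends
    near perpendicular (~90 deg), within a hypothetical tolerance."""
    return abs(start_deg) <= tol and abs(end_deg - 90.0) <= tol
```

A motion that both starts and ends perpendicular to the display would not qualify, so method 200 would return to stage 210 and keep displaying the first representation.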
If computing device 300 determines that the received first gesture does not correspond to a requested change in user interface, method 200 may proceed to stage 210 where computing device 300 may continue to display the first representation of the user interface. Otherwise, after computing device 300 determines that the received first gesture corresponds to the requested change in the user interface, method 200 may continue to stage 230 where computing device 300 may display a second user interface representation. For example, the second user interface representation may be a 3D user interface representation. In this way, the second user interface representation may represent the first user interface portion, displayed initially in the first user interface representation, in a 3D perspective. This second, 3D representation may also display user interface portions that were not displayable in the first representation. For instance, the first, 2D user interface representation may be converted into the second, 3D user interface representation, exposing previously hidden user interface portions. This conversion may be portrayed at the display device by having the 2D user interface representation pivot, along a horizontal axis of the 2D representation, into the display device, thereby shifting the perspective of the upper portion of the 2D representation towards a ‘horizon’ of the 3D representation. Consequently, user interface portions that were previously out of view in the first representation may now be viewed in the second representation.
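For illustration only, the pivot of the 2D representation about its horizontal axis into a 3D perspective, described above, may be sketched as a rotation followed by a simple perspective projection. The function and the viewer-distance parameter are hypothetical assumptions, not the disclosed implementation.

```python
# Illustrative sketch of the 2D -> 3D conversion: pivot the 2D plane about its
# horizontal (x) axis by theta degrees into the screen, then apply a simple
# perspective divide so upper points recede toward the 'horizon'.
import math

def pivot_point(x: float, y: float, theta_deg: float, viewer_dist: float = 2.0):
    """Project a 2D UI point (x, y) after the pivot. viewer_dist is a
    hypothetical distance from the viewer to the pivot axis."""
    theta = math.radians(theta_deg)
    depth = y * math.sin(theta)   # how far the point recedes into the screen
    y3d = y * math.cos(theta)     # foreshortened height after the pivot
    scale = viewer_dist / (viewer_dist + depth)  # perspective divide
    return x * scale, y3d * scale
```

At zero degrees the point is unchanged (the 2D representation); as the angle grows, points higher on the canvas shrink and flatten toward the horizon, which is the shift in perspective the paragraph above describes.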
While computing device 300 displays the second representation of the user interface in stage 230, method 200 may proceed to stage 240 where computing device 300 may receive a second user gesture detection. For example, by performing hand gestures associated with user interface manipulation, the user may navigate through the second user interface representation by rotating the user interface, zooming into or out of the user interface, or otherwise manipulating the second user interface representation. In this way, the user may expose undisplayed user interface portions in either the first representation or initial second representation.
Once computing device 300 receives the second user gesture detection in stage 240, method 200 may continue to decision block 245 where computing device 300 may determine if the received second user gesture corresponds to a requested user interface manipulation. For example, in order to indicate a requested user interface manipulation, the user may perform a second hand gesture. The second hand gesture may comprise a motion of the user's hand from an initial position, with the user's palm approximately perpendicular to the display device and the fingers pointing towards the display device, to a subsequent angle at the user's wrist in an upward, downward, or side-to-side motion. In this way, the angle of the user's hand may correspond to a direction of user interface rotation. In various embodiments of the invention, the second hand gesture may comprise a displacement of the user's hand toward or away from the display device, resulting in a respective zoom in or zoom out of the user interface.
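As a hypothetical sketch of decision block 245, a detected wrist angle and depth displacement might be classified into a requested manipulation as follows. All thresholds and names are illustrative assumptions.

```python
# Illustrative sketch of decision block 245: classify a detected second
# gesture as a zoom, a rotation, or no recognized manipulation (None).
def classify_second_gesture(wrist_angle_deg: float, depth_delta: float,
                            angle_tol: float = 15.0, depth_tol: float = 0.05):
    """wrist_angle_deg: signed up/down or side-to-side angle at the wrist.
    depth_delta: hand displacement toward (+) / away from (-) the display.
    Thresholds are hypothetical noise gates, not disclosed values."""
    if abs(depth_delta) > depth_tol:
        # displacement toward the display -> zoom in; away -> zoom out
        return "zoom_in" if depth_delta > 0 else "zoom_out"
    if abs(wrist_angle_deg) > angle_tol:
        # angle at the wrist -> rotation in the corresponding direction
        return "rotate"
    return None  # no manipulation requested; fall through to decision 265
```

Returning `None` corresponds to the "otherwise" branch: the gesture is then tested at decision block 265 for a requested change in representation instead.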
If computing device 300 determines that the received second user gesture does not correspond to a requested user interface manipulation, method 200 may proceed to decision block 265 where computing device 300 may determine if the received second gesture corresponds to a requested change in user interface representation. Otherwise, after computing device 300 determines that the received second gesture corresponds to the requested user interface manipulation, method 200 may continue to stage 250 where computing device 300 may manipulate the second representation of the user interface. For example, if the user has angled their hand to the right, the second user interface representation may be rotated towards the right about a vertical axis. Similarly, if the user has angled their hand upward, the second user interface representation may be rotated upwards about a horizontal axis. In this way, the user may expose user interface portions not previously displayed. Moreover, the user may use both hands to manipulate the user interface. For example, the user's right hand gestures may control a direction of propagation through the 3D user interface representation, while the user's left hand may control a rate of propagation through the user interface. With these user interface manipulations, the user may navigate to previously undisplayable user interface portions.
Once computing device 300 manipulates the second user interface representation in accordance with the second received user gesture in stage 250, method 200 may proceed to stage 260 where computing device 300 may receive a third user gesture. For example, the user may have navigated to a desired user interface portion and may like to see the desired user interface portion in the initial, first user interface representation. Accordingly, the user may perform a third gesture to indicate a request to display the desired user interface portion in the first representation.
Upon receipt of the third gesture by computing device 300, method 200 may then proceed to decision block 265 where computing device 300 may determine if the received third gesture corresponds to a requested change in user interface representation. For example, in order to indicate a request to change the user interface representation from the second representation to the first representation, the user may perform a third hand gesture. The third hand gesture may comprise a motion of the user's hand from an initial position with the palm approximately perpendicular to the display device and the fingers pointing towards the display device, to a subsequent position with the palm approximately parallel to the display device and the fingers pointing upwards. In other embodiments of the invention, the third hand gesture may comprise a displacement of the user's hand away from the display device.
If computing device 300 determines that the received third gesture does not correspond to a requested change in user interface, method 200 may proceed to stage 230 where computing device 300 may continue to display the second representation of the user interface. Otherwise, after computing device 300 determines that the received third gesture corresponds to the requested change in the user interface, method 200 may continue to stage 270 where computing device 300 may display the first user interface representation. For example, the first user interface representation may now include the desired user interface portion to which the user has navigated. In this way, where the display device may have initially displayed the first user interface portion in stage 210 of method 200, the display device may now display a second user interface portion corresponding to the user's interface navigation. After computing device 300 has restored the first representation of the user interface, method 200 may either end at stage 280 or return to stage 220 where method 200 may be repeated.
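The overall control flow of method 200, from stage 210 through stage 280, may be summarized for illustration only as a small state machine. The state and gesture labels below are hypothetical shorthand for the stages and gestures described above.

```python
# Hypothetical sketch of method 200's control flow. "2D" stands for the first
# (stage 210) representation and "3D" for the second (stage 230) representation.
def method_200_step(state: str, gesture: str) -> str:
    if state == "2D":
        # Decision block 225: does the gesture request a representation change?
        return "3D" if gesture == "enter_3d" else "2D"  # stage 230 or stay at 210
    if state == "3D":
        if gesture == "manipulate":   # decision 245 -> stage 250
            return "3D"               # manipulated, still the 3D representation
        if gesture == "exit_3d":      # decision 265 -> stage 270
            return "2D"               # restore the first representation
        return "3D"                   # unrecognized: keep displaying 3D (stage 230)
    return state
```

From the restored "2D" state the loop may simply continue, matching the option of returning to stage 220 rather than ending at stage 280.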
Embodiments consistent with the invention may comprise a system for displaying information based on gesture detection. The system may comprise a memory storage and a processing unit coupled to the memory storage. The processing unit may be operative to display a first, 2D user interface representation. While displaying the 2D user interface representation, the processing unit may be further operative to receive user gesture detection, and, in response to the detection, display a 3D user interface representation.
Other embodiments consistent with the invention may comprise a system for providing multi-dimensional user interface navigation based on gesture detection. The system may comprise a memory storage and a processing unit coupled to the memory storage. The processing unit may be operative to display a first, 2D user interface representation. While displaying the 2D user interface representation, the processing unit may be further operative to receive a first user gesture detection, and, in response to the detection, display a 3D user interface representation. Furthermore, the processing unit may receive a second user gesture detection, and, in response to the detection, manipulate the 3D user interface representation.
Various additional embodiments consistent with the invention may comprise a system for displaying information based on gesture detection. The system may comprise a display device operative to display a 2D representation of a user interface and a 3D representation of the user interface; a gesture detection device operative to detect hand gestures and send signals corresponding to the detected hand gestures; a memory storage for storing a plurality of instructions associated with the detected hand gestures; and a processing unit coupled to the display device, the gesture detection device, and the memory storage. The processing unit may be operative to cause a display of a first user interface representation or a second user interface representation; receive signals indicative of a detected hand gesture; determine instructions associated with detected hand gestures; and cause a display of the user interface in accordance with the determined instructions.
With reference to
Computing device 300 may have additional features or functionality. For example, computing device 300 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Such additional storage is illustrated in
Computing device 300 may also contain a communication connection 316 that may allow device 300 to communicate with other computing devices 318, such as over a network in a distributed computing environment, for example, an intranet or the Internet. Communication connection 316 is one example of communication media. Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” may describe a signal that has one or more characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media. The term computer readable media as used herein may include both storage media and communication media.
As stated above, a number of program modules and data files may be stored in system memory 304, including operating system 305. While executing on processing unit 302, programming modules 306, such as detection analysis application 320 and user interface manipulation application 321, may perform processes including, for example, one or more of method 200's stages as described above. The aforementioned process is an example, and processing unit 302 may perform other processes. Other programming modules that may be used in accordance with embodiments of the present invention may include electronic mail and contacts applications, word processing applications, spreadsheet applications, database applications, slide presentation applications, drawing or computer-aided application programs, etc.
Generally, consistent with embodiments of the invention, program modules may include routines, programs, components, data structures, and other types of structures that may perform particular tasks or that may implement particular abstract data types. Moreover, embodiments of the invention may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like. Embodiments of the invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
Furthermore, embodiments of the invention may be practiced in an electrical circuit comprising discrete electronic elements, packaged or integrated electronic chips containing logic gates, a circuit utilizing a microprocessor, or on a single chip containing electronic elements or microprocessors. Embodiments of the invention may also be practiced using other technologies capable of performing logical operations such as, for example, AND, OR, and NOT, including but not limited to mechanical, optical, fluidic, and quantum technologies. In addition, embodiments of the invention may be practiced within a general purpose computer or in any other circuits or systems.
Embodiments of the invention, for example, may be implemented as a computer process (method), a computing system, or as an article of manufacture, such as a computer program product or computer readable media. The computer program product may be a computer storage media readable by a computer system and encoding a computer program of instructions for executing a computer process. The computer program product may also be a propagated signal on a carrier readable by a computing system and encoding a computer program of instructions for executing a computer process. Accordingly, the present invention may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.). In other words, embodiments of the present invention may take the form of a computer program product on a computer-usable or computer-readable storage medium having computer-usable or computer-readable program code embodied in the medium for use by or in connection with an instruction execution system. A computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
The computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples (a non-exhaustive list) of the computer-readable medium may include the following: an electrical connection having one or more wires, a portable computer diskette, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, and a portable compact disc read-only memory (CD-ROM). Note that the computer-usable or computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory.
Embodiments of the present invention, for example, are described above with reference to block diagrams and/or operational illustrations of methods, systems, and computer program products according to embodiments of the invention. The functions/acts noted in the blocks may occur out of the order shown in any flowchart. For example, two blocks shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
While certain embodiments of the invention have been described, other embodiments may exist. Furthermore, although embodiments of the present invention have been described as being associated with data stored in memory and other storage mediums, data can also be stored on or read from other types of computer-readable media, such as secondary storage devices (for example, hard disks, floppy disks, or a CD-ROM), a carrier wave from the Internet, or other forms of RAM or ROM. Further, the disclosed methods' stages may be modified in any manner, including by reordering stages and/or inserting or deleting stages, without departing from the invention.
All rights including copyrights in the code included herein are vested in and the property of the Applicant. The Applicant retains and reserves all rights in the code included herein, and grants permission to reproduce the material only in connection with reproduction of the granted patent and for no other purpose.
While the specification includes examples, the invention's scope is indicated by the following claims. Furthermore, while the specification has been described in language specific to structural features and/or methodological acts, the claims are not limited to the features or acts described above. Rather, the specific features and acts described above are disclosed as examples of embodiments of the invention.
Related U.S. application No. ______, entitled “Tear-Drop Object Indication” (14917.1222US01), related U.S. application No. ______, entitled “Dual Module Portable Device” (14917.1224US01), and U.S. application No. ______, entitled “Projected Way-Finding” (14917.1223US01), filed on even date herewith, assigned to the assignee of the present application, are hereby incorporated by reference.