In today's digital world, the use of graphical user interfaces (GUIs) to display and manage computer information has become ubiquitous. For example, the WINDOWS™ (Microsoft Corporation, Redmond, Wash.) operating systems used in many personal computers employ a GUI desktop that displays information (e.g., text, images, etc.) for a user to view, and provides various icons or indicia with which the user may interact (e.g., a button, an Internet link, etc.). Software applications in these systems count on knowing, in advance, the position from which the user will be viewing the display screen, and will arrange and orient their graphical elements accordingly. For example, for a typical desktop computer, the applications assume that the user will be viewing the display with one particular edge being at the “top,” and will orient text and images with that orientation in mind.
New applications opened on the system assume the same orientation for the display, and present their information aligned in the same manner as other applications (e.g., they share the same understanding of “up” and “down” as other applications on the system). This allows the same user to easily and accurately view the information for the various applications. However, such arrangements may have drawbacks.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
A display may be provided with a first graphical user interface in a first orientation, and in response to receiving a user request to open a second graphical user interface on the display, the second user interface can be displayed with a different, and variable, orientation. In some configurations, the orientation of the second interface can be automatically determined based on a touch input used to request the interface. For example, the touch could be a gesture, and the orientation could be based on the direction of movement in the gesture. The location of the second interface can also be based on the touch.
The interfaces can be polygonal panels, such as rectangles, or non-polygonal shapes, such as ovals or circles. An interface can be a radial interface, with elements that can be read by users viewing from different directions. A radial interface can be circular, or can be an arc centered at a corner of the display (e.g., the display screen or a display area). Elements on radial interfaces, such as buttons, labels, etc., may be oriented with respect to a central point at the corner (e.g., rotated such that they appear upright when viewed from the central point). Elements of an interface may also be rearranged and moved. For example, buttons on a circular interface may “slide” around the interface, rotating to maintain their orientation with respect to the center origin point of the interface.
Different areas of the display may be assigned predetermined orientations and/or types of interfaces. For example, corners of the display may default to radial interfaces centered at the corner, while other areas, such as the edges of the display, may default to other orientations. The display may be divided into various such areas, for example, into quadrants.
The various interfaces may be independently rotated to have individual orientations, so that users may freely move about the perimeter of the display and still interact with their application interfaces. For example, some interfaces may remain at their current orientations while other interfaces are rotated to new orientations. The actual rotation can be accomplished using a recursive method involving parent-child data structures, where the child data structures identify the locations of interface elements using the interface itself as a frame of reference.
The addition of a new interface may cause existing interfaces to rotate and/or move to avoid overlapping the new interface.
These and other features will be described in greater detail below.
a illustrates an example data structure for an interface and its subcomponents, and
a-c illustrate examples of a corner radial interface that may be opened on the display shown in
a-b illustrate examples of a circular radial interface that may be opened on the display shown in
The features herein are described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that can perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that the features may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like. The features may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
With reference to
Computer 110 may include a variety of computer readable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. The system memory 130 may include computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 131 and random access memory (RAM) 132. A basic input/output system 133 (BIOS), containing the basic routines that help to transfer information between elements within computer 110, such as during start-up, may be stored in ROM 131. RAM 132 may contain data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120. By way of example, and not limitation,
The computer 110 may also include other removable/nonremovable, volatile/nonvolatile computer storage media. By way of example only,
The drives and their associated computer storage media discussed above and illustrated in
The computer 110 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 180. The remote computer 180 may be a personal computer, and typically includes many or all of the elements described above relative to the computer 110, although only a memory storage device 181 has been illustrated in
When used in a LAN networking environment, the computer 110 may be connected to the LAN 171 through a network interface or adapter 170. When used in a WAN networking environment, the computer 110 may include a modem 172 or other means for establishing communications over the WAN 173, such as the Internet. The modem 172, which may be internal or external, may be connected to the system bus 121 via the user input interface 160, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 110, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation,
The computing device shown in
The display device 200 may display a computer-generated image on its display surface 201, which allows the device 200 to be used as a display monitor for computing processes, displaying television or other visual images, video games, and the like. The display may be projection-based, and may use a digital light processing (DLP) technique, or it may be based on other display technologies, such as liquid crystal display (LCD) technology. A projector 202 may be used to project light onto the underside of the display surface 201. It may do so directly, or may do so using one or more mirrors. As shown in
In addition to being used as an output display for displaying images, the device 200 may also be used as an input-receiving device. As illustrated in
The device 200 shown in
The device 200 is also shown in a substantially horizontal orientation, with the display surface 201 acting as a tabletop. Other orientations may also be used. For example, the device 200 may be oriented to project a display onto any desired surface, such as a vertical wall. Reflective IR light may also be received from any such oriented surface.
The actual input command used by the second user 602 can be of any desired type. For example, a gesture, such as touching a location on the display with the user 602's finger and dragging it in a direction, such as towards the second user, may identify the orientation to use for the second interface 601. The gesture may include taps, swipes, holding in one place, or any other movement of an object that may be detected by the display. Keyboard and/or mouse input devices may also be used to select a new orientation. Alternatively, the display surface 502 itself may be configured to determine the direction from which the user approached, the angle of the user's finger (or other object detected by the display), etc., to automatically determine the appropriate orientation for the second interface 601. The input may also include a button press, such as on a keyboard, a displayed graphical button on the display 502, a mouse click, etc., or any other desired input command. In some situations, physical pieces may be placed on the display surface and recognized (e.g., by reading a pattern on the bottom of a piece, or by detecting the shape of the piece), and the location, placement, orientation and/or type of the piece may be interpreted as an input requesting a particular interface and its display characteristics.
The second interface 601 may appear at a location on the display 502 based on the location of the user input, or at a predetermined location, and may have an orientation that is also based on the user input. For example, the second interface 601 may be placed at the location of a user's swipe, and oriented in a direction of the swipe gesture (e.g., if user 602 swiped a finger from right to left, that directional swipe may determine the “up” orientation for interface 601). Orientation arrow 603 illustrates an example of such an orientation. Alternatively, the second interface 601 may have a predetermined orientation based on location. For example, second interface 601 is shown as being placed near the right edge of display 502, and the interface 601 may automatically be oriented such that it is perpendicular to the right edge, as shown. Such an automatic orientation may occur, for example, whenever the interface 601 is placed within a predetermined distance (e.g., an inch) of an edge of the display.
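As a rough sketch of how the placement and orientation behavior described above might be computed, the following Python fragment derives an “up” direction from a swipe gesture and snaps the orientation when the touch falls within a fixed distance of a display edge. The function names, display dimensions, and snap distance are illustrative assumptions, not part of the description.

```python
import math

# Illustrative constants only; the description does not fix display
# dimensions, and gives the snap distance only as an example ("an inch").
DISPLAY_W, DISPLAY_H = 1024, 768
EDGE_SNAP_PX = 96

def orientation_from_swipe(start, end):
    """Derive an 'up' direction (unit vector) pointing along the swipe movement."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    length = math.hypot(dx, dy) or 1.0
    return (dx / length, dy / length)

def place_interface(touch_point, swipe_up):
    """Choose a location and orientation for a new interface.

    The interface appears at the touch point; if that point is within
    EDGE_SNAP_PX of a display edge, the orientation snaps so that 'up'
    points away from that edge (perpendicular to it); otherwise the
    swipe-derived orientation is used.
    """
    x, y = touch_point
    if x <= EDGE_SNAP_PX:                    # near left edge
        up = (1.0, 0.0)
    elif x >= DISPLAY_W - EDGE_SNAP_PX:      # near right edge
        up = (-1.0, 0.0)
    elif y <= EDGE_SNAP_PX:                  # near top edge
        up = (0.0, 1.0)
    elif y >= DISPLAY_H - EDGE_SNAP_PX:      # near bottom edge
        up = (0.0, -1.0)
    else:
        up = swipe_up
    return {"origin": touch_point, "up": up}

# Example: a right-to-left swipe near the middle of the display.
new_panel = place_interface((500, 380), orientation_from_swipe((520, 380), (420, 380)))
```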
The orientation of various application interfaces may be changed by the user.
Changing the orientation of the displayed interface 601 may be accomplished in a variety of ways. For example, for graphics display systems that support use of a transformation matrix, the displayed image of the interface 601 may automatically be rotated using such a matrix. Alternatively, a recursive approach may be used in which individual elements of the interface 601 are rotated. For example, the interface 601 may be viewed as a collection of individual objects, where some objects (child objects) are oriented and positioned based on the orientation and position of a parent object, such that repositioning/reorienting the parent object automatically causes the child objects to be repositioned/reoriented as well. Using the interface 601 as an example, the parent object might be a panel formed by the outermost border of the interface 601, while the individual buttons may be the child objects. As a further alternative, applications may have different layouts of objects for different orientations.
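For the transformation-matrix alternative mentioned above, rotating an interface image typically reduces to composing a translate-rotate-translate transform about the interface's origin. The following is a minimal sketch in plain Python, assuming standard homogeneous 2-D coordinates; it is not tied to any particular graphics API.

```python
import math

def rotation_about(cx, cy, degrees):
    """Build a 3x3 homogeneous 2-D transform that rotates about (cx, cy)."""
    a = math.radians(degrees)
    cos_a, sin_a = math.cos(a), math.sin(a)
    # translate(-cx,-cy), rotate, translate(cx,cy), composed into one matrix
    return [
        [cos_a, -sin_a, cx - cos_a * cx + sin_a * cy],
        [sin_a,  cos_a, cy - sin_a * cx - cos_a * cy],
        [0.0,    0.0,   1.0],
    ]

def apply(matrix, point):
    """Apply the transform to a 2-D point."""
    x, y = point
    return (
        matrix[0][0] * x + matrix[0][1] * y + matrix[0][2],
        matrix[1][0] * x + matrix[1][1] * y + matrix[1][2],
    )

# Rotating a point of interface 601 by 90 degrees about its origin (200, 300).
m = rotation_about(200, 300, 90)
print(apply(m, (260, 300)))  # -> approximately (200, 360)
```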
a illustrates an example data object 901 that can be used for identifying positional information for such an interface. As shown in
The parent object 902 may also include orientation data, such as an angular value, second point, etc., that identifies the orientation for the interface 601. This angle may define the direction in which the Y axis, or “up,” extends for the parent object. So, for example, the parent data object 901 for interface 601 shown in
The data 903 for the child objects, however, may use a different frame of reference. Rather than using an absolute frame of reference, the child objects may have their position and orientation identified as an offset from the parent object. So, for example, the (0,0) position for a child object may refer to the origin point of the parent object, regardless of where the parent is located on the actual screen. Similarly, the X and Y axes 952 used for the child object may simply be aligned with the axes of the parent, regardless of whether the parent is vertical or rotated at some angle. By using such a dependency, changes in orientation and placement made to the parent may automatically be carried through to the children, and the children's data objects 903 need not be revised for every change (e.g., a child object, once created, need not be changed). So, if interface 601 is to be rotated, the system may rotate the parent object first, and then recursively apply the rotation to the child objects using the new frame of reference.
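A minimal sketch of this recursive parent-child arrangement is given below, assuming hypothetical Element and Panel structures: child offsets and rotations are stored relative to the parent, so rotating the parent requires no change to the child data, and a single recursive pass resolves absolute screen poses.

```python
import math
from dataclasses import dataclass, field

@dataclass
class Element:
    """A child element positioned in its parent's frame of reference."""
    offset: tuple      # (x, y) relative to the parent's origin
    rotation: float    # degrees relative to the parent's "up"
    children: list = field(default_factory=list)

@dataclass
class Panel(Element):
    """A parent panel; its offset/rotation are absolute screen values."""
    pass

def absolute_pose(element, parent_origin=(0.0, 0.0), parent_angle=0.0):
    """Recursively resolve absolute screen positions and angles.

    Rotating or moving the parent only changes the frame of reference passed
    down here; the child data objects themselves are never rewritten.
    """
    a = math.radians(parent_angle)
    ox, oy = element.offset
    x = parent_origin[0] + ox * math.cos(a) - oy * math.sin(a)
    y = parent_origin[1] + ox * math.sin(a) + oy * math.cos(a)
    angle = parent_angle + element.rotation
    poses = [(element, (x, y), angle)]
    for child in element.children:
        poses.extend(absolute_pose(child, (x, y), angle))
    return poses

# A panel with one button defined once in the panel's own frame of reference.
button = Element(offset=(40, 10), rotation=0)
panel = Panel(offset=(300, 200), rotation=0, children=[button])

# Rotating the whole interface only means changing the panel's rotation; the
# recursion carries the new frame of reference down to the button.
panel.rotation = 90
print(absolute_pose(panel))
```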
The example interface 601 is depicted as originating along an edge of a display. Alternatively, the interface may be radial in nature, such that various elements on the interface have a point origin for their orientations.
The radial application interface 1001 features shown in
Although the example radial application interfaces are shown with an overall appearance of a fixed-radius arc, the interfaces need not have an arcuate overall appearance, and may be of any desired shape, with their elements oriented to have a point origin.
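The following sketch illustrates one possible way to lay out elements that share a point origin for their orientations, placing buttons along an arc and rotating each so it appears upright when viewed from the arc's center. The helper name and angle conventions are assumptions for illustration only.

```python
import math

def layout_radial(center, radius, start_deg, end_deg, labels):
    """Place labeled buttons along an arc centered at `center`.

    Each button is rotated so that its own 'up' direction points away from
    the center point, i.e., it reads correctly when viewed from that point.
    """
    placements = []
    n = len(labels)
    for i, label in enumerate(labels):
        t = i / max(n - 1, 1)
        angle = math.radians(start_deg + t * (end_deg - start_deg))
        x = center[0] + radius * math.cos(angle)
        y = center[1] + radius * math.sin(angle)
        # Rotation of the button itself, relative to the screen's default "up".
        upright_rotation = math.degrees(angle) - 90.0
        placements.append({"label": label, "position": (x, y),
                           "rotation": upright_rotation})
    return placements

# A quarter-circle interface anchored at a display corner at (0, 0), e.g., for
# buttons that "slide" around the arc while staying oriented toward the corner.
for p in layout_radial((0, 0), 150, 0, 90, ["Open", "Save", "Close"]):
    print(p)
```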
A circular radial interface 1101 may allow multiple users to share the interface 1101, conserving display screen real estate. To allow for different viewing angles, the various elements on the interface, such as button 1102, may be repeated at one or more positions around the interface. So, for example, button 1102 may appear again at the bottom of the interface shown in
Application interfaces may have predetermined orientations depending on how they were originally requested. For example, as shown in
Modifications to the edge regions may also be made. For example, the edge regions may have different placement and/or appearances, and need not be limited to edges at all.
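As an illustration of how display regions might map to default interface types and orientations (corners defaulting to radial interfaces, edges to panels facing away from the edge), consider the hypothetical sketch below. The region sizes and defaults are assumptions rather than values from the description.

```python
# Illustrative region defaults; the actual regions, shapes, and orientations
# are configurable as described above (corners, edges, quadrants, etc.).
DISPLAY_W, DISPLAY_H = 1024, 768
CORNER_SIZE = 120
EDGE_SIZE = 80

def default_interface_for(point):
    """Map a requested location to a default interface type and orientation."""
    x, y = point
    near_left, near_right = x < CORNER_SIZE, x > DISPLAY_W - CORNER_SIZE
    near_top, near_bottom = y < CORNER_SIZE, y > DISPLAY_H - CORNER_SIZE
    if (near_left or near_right) and (near_top or near_bottom):
        corner = (0 if near_left else DISPLAY_W, 0 if near_top else DISPLAY_H)
        return {"type": "radial", "center": corner}
    if x < EDGE_SIZE:
        return {"type": "panel", "up": (1, 0)}    # faces away from left edge
    if x > DISPLAY_W - EDGE_SIZE:
        return {"type": "panel", "up": (-1, 0)}   # faces away from right edge
    if y < EDGE_SIZE:
        return {"type": "panel", "up": (0, 1)}    # faces away from top edge
    if y > DISPLAY_H - EDGE_SIZE:
        return {"type": "panel", "up": (0, -1)}   # faces away from bottom edge
    return {"type": "panel", "up": (0, -1)}       # interior: screen default

print(default_interface_for((10, 15)))     # corner -> radial
print(default_interface_for((1000, 400)))  # right edge -> panel facing left
```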
As noted above, the request may include an identification of a location for the new interface (such as by a user tapping on an origin location on the screen, or by selecting a region having a predetermined location, such as the corners shown in
In step 1403, the new interface may be generated and displayed if it was requested in step 1402, having the particular location and new orientation requested by the user (or as established by a predetermined default). Adding the new interface to the display may result in an overlap with existing interfaces, or may cause existing interfaces to automatically move and/or change size to accommodate the new interface, as illustrated in
If no request was received, then in step 1404 the process may check to determine whether an existing interface is to be rotated to a new orientation. Step 1405 illustrates the rotation of one interface to have a different orientation while a first interface remains at a current orientation, such as that shown in
If no rotation request is received in step 1404, the system may check in step 1406 to determine whether the user (or another application) has requested movement of an existing interface. Step 1407 illustrates the movement of one or more interface elements, with a corresponding change in orientation of the moved elements, as shown in
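One way to picture the flow of steps 1402 through 1407 is as a simple dispatch over incoming requests, sketched below. The handlers are placeholders standing in for the behavior described above (including moving existing interfaces to resolve overlaps), not a definitive implementation; all names here are assumptions.

```python
def process_request(state, request):
    """Dispatch loop corresponding roughly to steps 1402-1407 above.

    `state` is a list of currently displayed interfaces; each branch is a
    placeholder for the behavior described in the text.
    """
    if request["kind"] == "open":          # step 1402 -> 1403
        new_iface = {"location": request["location"],
                     "orientation": request.get("orientation", 0.0)}
        resolve_overlaps(state, new_iface)  # existing interfaces may move/resize
        state.append(new_iface)
    elif request["kind"] == "rotate":      # step 1404 -> 1405
        request["target"]["orientation"] = request["angle"]
    elif request["kind"] == "move":        # step 1406 -> 1407
        request["target"]["location"] = request["location"]
        if "angle" in request:             # movement may also reorient
            request["target"]["orientation"] = request["angle"]
    return state

def resolve_overlaps(existing, new_iface):
    # Placeholder: shift or resize existing interfaces so the new one fits.
    pass

# Example: open a panel, then rotate it to a new orientation.
interfaces = []
process_request(interfaces, {"kind": "open", "location": (300, 200)})
process_request(interfaces, {"kind": "rotate", "target": interfaces[0], "angle": 90.0})
print(interfaces)
```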
Using one or more of the features and approaches described above, users' experiences with various orientations can be improved. Although the description above provides illustrative examples and sequences of actions, it should be understood that the various examples and sequences may be rearranged, divided, combined and subcombined as desired. For example, steps and features described may be omitted, or additional steps and features may be added. Accordingly, although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.