A computing system often displays user interface elements called controls that the user may interact with to cause the computing system to execute respective operations, or through which the user may visualize underlying information. A common form of a user interface element is a tile or icon. For instance, in a desktop area or start area, various tiles or icons may be laid out, awaiting selection by a user, or displaying information to a user.
During normal mode, such controls may be invoked to execute underlying operations. However, many systems allow the user to select an organization mode, in which typically the user interface elements are no longer selectable to perform underlying operations. Instead, the user interface elements may be moved around, repositioned, deleted, resized, and so forth, allowing the user to organize his or her workspace.
The subject matter claimed herein is not limited to embodiments that solve any disadvantages or that operate only in environments such as those described above. Rather, this background is only provided to illustrate one exemplary technology area where some embodiments described herein may be practiced.
At least some embodiments described herein provide for the supporting of an organization mode in a user interface that displays multiple user interface elements.
In accordance with a first aspect described herein, the multiple user interface elements fit over grid positions that are at least conceptually imposed over a canvas. Each of the user interface elements occupies one or more of the grid positions and has boundaries corresponding to boundaries between grid positions. The system detects that the user interface is to enter an organization mode in which one or more of the user interface elements may be organized on the user interface. For instance, the user might make an explicit gesture recognized by the system as a user intent to enter organization mode. In response, the grid positions are displayed on the canvas. This allows the user to more easily see, during organization mode, where user interface elements may be placed. This is especially useful when the user interface elements may take on a predetermined number of combinations of shapes and sizes, each combination fittable over one or more grid positions. In some embodiments, as the user moves and/or resizes a user interface element, one or more corresponding grid positions are highlighted to show where the user interface element would be placed if the move or resize operation were to conclude at that moment.
In accordance with a second aspect described herein, while in organization mode, a contextual actions menu is caused to appear with respect to a particular user interface element. The contextual actions menu includes multiple organization mode commands, including one or more that may be directly selected from the contextual actions menu to invoke the corresponding command. For instance, perhaps those organization mode commands that are more frequently performed in organization mode (such as pin or unpin) might be directly invoked from the contextual actions menu, while other less frequently invoked organization mode commands may be reachable from the contextual actions menu, but not directly invoked.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
In order to describe the manner in which the above-recited and other advantages and features of the invention can be obtained, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered to be limiting of its scope, the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
At least some embodiments described herein provide for the supporting of an organization mode in a user interface that displays multiple user interface elements.
In accordance with a first aspect described herein, the multiple user interface elements fit over grid positions that are at least conceptually imposed over a canvas. Each of the user interface elements occupies one or more of the grid positions and has boundaries corresponding to boundaries between grid positions. The system detects that the user interface is to enter an organization mode in which one or more of the user interface elements may be organized on the user interface. For instance, the user might make an explicit gesture recognized by the system as a user intent to enter organization mode. In response, the grid positions are displayed on the canvas. This allows the user to more easily see, during organization mode, where user interface elements may be placed. This is especially useful when the user interface elements may take on a predetermined number of combinations of shapes and sizes, each combination fittable over one or more grid positions. In some embodiments, as the user moves and/or resizes a user interface element, one or more corresponding grid positions are highlighted to show where the user interface element would be placed if the move or resize operation were to conclude at that moment.
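The grid-fitting behavior described above can be sketched in code. The following is an illustrative sketch only, not an implementation from the specification; the class, the function names, and the cell size are all assumptions introduced here for clarity:

```python
# Hypothetical sketch: each user interface element occupies one or more grid
# positions, with boundaries falling on boundaries between grid positions.

from dataclasses import dataclass

CELL = 64  # assumed size of one grid position, in pixels


@dataclass
class GridElement:
    col: int   # leftmost occupied column
    row: int   # topmost occupied row
    cols: int  # width, in grid positions
    rows: int  # height, in grid positions


def occupied_cells(e: GridElement) -> set:
    """The set of grid positions covered by an element."""
    return {(c, r)
            for c in range(e.col, e.col + e.cols)
            for r in range(e.row, e.row + e.rows)}


def snap_to_grid(x_px: float, y_px: float) -> tuple:
    """Map a pixel coordinate to the nearest grid position."""
    return round(x_px / CELL), round(y_px / CELL)


tile = GridElement(col=1, row=0, cols=2, rows=1)  # a 2x1 "wide" tile
print(sorted(occupied_cells(tile)))   # [(1, 0), (2, 0)]
print(snap_to_grid(70.0, 10.0))       # (1, 0)
```

A predetermined catalog of shapes and sizes (e.g., 1x1, 2x1, 2x2) would then simply constrain the permissible `cols`/`rows` combinations.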
In accordance with a second aspect described herein, while in organization mode, a contextual actions menu is caused to appear with respect to a particular user interface element. The contextual actions menu includes multiple organization mode commands, including one or more that may be directly selected from the contextual actions menu to invoke the corresponding command. For instance, perhaps those organization mode commands that are more frequently performed in organization mode (such as pin or unpin) might be directly invoked from the contextual actions menu, while other less frequently invoked organization mode commands may be reachable from the contextual actions menu, but not directly invoked.
Some introductory discussion of a computing system will be described with respect to
Computing systems are now increasingly taking a wide variety of forms. Computing systems may, for example, be handheld devices, appliances, laptop computers, desktop computers, mainframes, distributed computing systems, or even devices that have not conventionally been considered a computing system. In this description and in the claims, the term “computing system” is defined broadly as including any device or system (or combination thereof) that includes at least one physical and tangible processor, and a physical and tangible memory capable of having thereon computer-executable instructions that may be executed by the processor. The memory may take any form and may depend on the nature and form of the computing system. A computing system may be distributed over a network environment and may include multiple constituent computing systems.
As illustrated in
In the description that follows, embodiments are described with reference to acts that are performed by one or more computing systems. If such acts are implemented in software, one or more processors of the associated computing system that performs the act direct the operation of the computing system in response to having executed computer-executable instructions. For example, such computer-executable instructions may be embodied on one or more computer-readable media that form a computer program product. An example of such an operation involves the manipulation of data. The computer-executable instructions (and the manipulated data) may be stored in the memory 104 of the computing system 100. Computing system 100 may also contain communication channels 108 that allow the computing system 100 to communicate with other message processors over, for example, network 110. The computing system 100 also includes a display 112 for displaying user interfaces such as those described herein.
Embodiments described herein may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below. Embodiments described herein also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are physical storage media. Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: computer storage media and transmission media.
Computer storage media includes RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other tangible medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired and wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
Further, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to computer storage media (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media at a computer system. Thus, it should be understood that computer storage media can be included in computer system components that also (or even primarily) utilize transmission media.
Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the described features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
Those skilled in the art will appreciate that the invention may be practiced in network computing environments with many types of computer system configurations, including, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like. The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
For instance,
The user interface 300 also shows a user interface element 330 that is outside of the start board 310. In one embodiment, the user interface 300 shows a portion of an extensible canvas that is extendable in an extendable dimension (e.g., horizontally). As new user interface elements are added to the canvas by selection of a current user interface element in the canvas, the canvas may extend in an extendable direction (e.g., rightward) in the extendable dimension. Accordingly, the user interface element 330 might be caused to appear when the user selects one of the user interface elements in
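The canvas-extension behavior can be sketched as follows. This is an assumed, minimal model (the class and its members are hypothetical, not taken from the specification): whenever a newly added element overhangs the current right edge, the canvas grows in its extendable dimension to accommodate it.

```python
# Illustrative sketch of a canvas that extends rightward (the assumed
# extendable direction) as elements are added past its current extent.

class ExtensibleCanvas:
    def __init__(self, cols: int, rows: int):
        self.cols = cols      # current extent in the extendable dimension
        self.rows = rows      # fixed extent in the other dimension
        self.elements = []    # (col, row, width, height) tuples

    def add_element(self, col: int, row: int, w: int = 1, h: int = 1):
        # Grow the canvas if the new element overhangs the right edge.
        self.cols = max(self.cols, col + w)
        self.elements.append((col, row, w, h))


canvas = ExtensibleCanvas(cols=6, rows=4)
canvas.add_element(col=7, row=1, w=2)   # lands beyond the right edge...
print(canvas.cols)                      # ...so the canvas extends to 9 columns
```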
Referring back to
In response to this user instruction, the user interface shows at least some of the grid positions to be displayed (act 203). For instance,
The analysis of which of the unoccupied grid positions should be highlighted may be repeatedly performed as the move operation continues, thus giving the user a real-time impression of the effect of dropping the user interface element at any given point in time. In this example, only the user interface element 316 is being moved, but the principles may also apply if multiple user interface elements are moved in a group.
In
The visualized unoccupied grid positions may also be helpful when resizing one of the user interface elements. When resizing, the unoccupied grid positions that would be occupied by the resized user interface element may also be highlighted. The analysis of which of the unoccupied grid positions should be highlighted may be repeatedly performed as the resize operation continues, thus giving the user a real-time impression of the effect of resizing the user interface element at any given point in time.
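The repeated highlight analysis for both move and resize operations reduces to the same set computation, sketched below. The function names are hypothetical and the sketch makes one simplifying assumption not stated in the specification, namely that only unoccupied grid positions are highlighted:

```python
# Illustrative sketch: given the grid positions the moved or resized element
# would cover at this instant, highlight those not occupied by other elements.
# Intended to be re-run on every pointer-move event, so the highlighted
# region tracks the element in real time.

def cells_covered(col: int, row: int, cols: int, rows: int) -> set:
    """Grid positions covered by an element at (col, row) of size cols x rows."""
    return {(c, r) for c in range(col, col + cols)
                   for r in range(row, row + rows)}


def highlight_targets(candidate_cells: set, occupied_by_others: set) -> set:
    """Positions to highlight: where the element would land, minus occupied ones."""
    return candidate_cells - occupied_by_others


occupied = cells_covered(0, 0, 1, 1) | cells_covered(3, 0, 1, 1)
drag_pos = cells_covered(2, 0, 2, 1)   # a 2x1 tile currently hovering at column 2
print(sorted(highlight_targets(drag_pos, occupied)))   # [(2, 0)]
```

The same call works for a resize in progress: only `candidate_cells` changes (the element's size rather than its position varies between calls).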
Again, the user interface enters organization mode (act 702) while displaying the user interface elements.
A contextual actions menu is then caused to appear with respect to a particular user interface element (act 703). For instance, referring to
The contextual actions menu 610 is illustrated as having three organization mode commands. Two of the organization mode commands may be directly selected from the contextual actions menu. For instance, the user might directly select the pin command 611 to pin the user interface element to a portion of the user interface (such as the start board 310). The user might also directly select the unpin command 612 to unpin the user interface element from a portion of the user interface. Direct commands may be disabled depending on the circumstances. For instance, the pin command 611 may be disabled and not visualized if the user interface element 620 is already pinned to the start board. Likewise, the unpin command 612 may be disabled and not visualized if the user interface element 620 is not pinned to the start board.
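The conditional enablement of the pin and unpin commands can be sketched as a simple visibility rule. This is an assumed illustration (the function and command names are hypothetical), not the specification's implementation:

```python
# Hypothetical sketch: which commands the contextual actions menu offers for
# direct selection, given whether the element is already pinned. The pin
# command is hidden for pinned elements; the unpin command for unpinned ones.

def visible_commands(element_is_pinned: bool) -> list:
    commands = []
    if element_is_pinned:
        commands.append("unpin")    # only meaningful for pinned elements
    else:
        commands.append("pin")      # only meaningful for unpinned elements
    commands.append("more...")      # entry point to less frequent commands
    return commands


print(visible_commands(element_is_pinned=False))  # ['pin', 'more...']
print(visible_commands(element_is_pinned=True))   # ['unpin', 'more...']
```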
The commands that may be directly invoked from the contextual actions menu 610 may, for instance, be the commands that are more commonly used. For instance, the system may cause the commands that are more commonly used in general to be displayed for direct selection. Alternatively, the system may more proactively monitor the usage of a given user, and provide direct commands on the contextual actions menu for those commands that are more commonly performed in organization mode by that user. For instance, if a given user more often resizes user interface elements to a particular size, a resize control may be added to the contextual actions menu for direct selection.
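Per-user promotion of frequent commands can be sketched with a simple usage counter. This is an illustrative sketch under assumed names (the class, the slot count, and the command strings are all hypothetical):

```python
# Hypothetical sketch: track how often a given user invokes each organization
# mode command, and promote the most frequent ones to the direct-selection
# slots of the contextual actions menu.

from collections import Counter


class CommandUsageTracker:
    def __init__(self, direct_slots: int = 2):
        self.counts = Counter()
        self.direct_slots = direct_slots

    def record(self, command: str):
        """Called each time the user performs an organization mode command."""
        self.counts[command] += 1

    def direct_commands(self) -> list:
        """The most frequently used commands, offered for direct selection."""
        return [cmd for cmd, _ in self.counts.most_common(self.direct_slots)]


tracker = CommandUsageTracker()
for cmd in ["pin", "resize:wide", "pin", "unpin", "resize:wide", "resize:wide"]:
    tracker.record(cmd)
print(tracker.direct_commands())   # ['resize:wide', 'pin']
```

Here the user's frequent `resize:wide` command earns a direct slot, mirroring the resize example in the text above.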
Accordingly, the principles described herein provide support for organization mode of operation in systems that display multiple user interface elements that are to be organized.
The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.
This application claims the benefit of U.S. Provisional Application Ser. No. 61/974,191, filed Apr. 2, 2014, which provisional patent application is incorporated herein by reference in its entirety.
Number | Date | Country | |
---|---|---|---|
20150286343 A1 | Oct 2015 | US |