The present invention is directed to user interfaces and, more particularly, to a method of navigation and interaction with a user interface. Such interaction and navigation involve operating an input device such as a mouse, a trackball or a three-dimensional (hereinafter “3D”) pointing remote device. The operation of the input device includes at least one of pointing, clicking, scrolling, etc.
User interfaces, such as graphical user interfaces (GUIs), are well known. Virtually all computers include (or enable) a graphical user interface in order to make interaction with the computer more “user friendly”. This is accomplished by reducing, if not eliminating, the number of keystrokes a user is required to enter in order to perform a function such as launching an application residing on the computer. An increasing number of other electronic devices, from cell phones to user controls on appliances, rely on various graphical user interfaces that facilitate user interaction with the particular device.
Traditional methods of using a GUI (on a computer, for example) include the use of an input device such as a mouse or a trackball. A movement of the mouse or the trackball results in a corresponding pointer moving on the graphical user interface. The pointer can thus be navigated to an object (represented by an icon) on the GUI that corresponds to an executable task, such as launching a software application. Once the pointer is navigated to an icon, the corresponding task can be executed by clicking (depressing) an actuating button that is integrated within the input device. For example, if the icon on the GUI corresponds to a word processing application, clicking on the icon results in launching the word processing application. The pointer can also be used to rapidly scroll through pages of text within a word processing document, for example.
The input device can be used to perform various tasks depending on the application being executed on the computer. In a map software program (generically referred to as a location information program), for example, maps of a geographic area can be displayed. While a map is displayed, the actuating button of the input device can be depressed with the pointer on a particular area of the map to zoom in on the selected area, providing additional detail while displaying a smaller overall geographic area. The actuating button therefore can be used to zoom in on an area of interest.
Referring to
The zooming in on the point of interest (i.e. New York in this example) results in map 314 of
Pointer 104, previously pointing to the point of interest 105 (in
If a user now wishes to zoom in further on New York, the user has to move the mouse until pointer 104 points to the point of interest 105 (or center point 108) prior to depressing the actuating button. The pointer location does not coincide with, or is not synchronized to, the point of interest when zooming occurs according to current implementations. If a user wishes to zoom in a few times (to a number of zooming levels), a cumulative delay factor is introduced into the process, as the pointer has to be located and moved to the point of interest for each desired zooming level. The cumulative delay factor is a sum of delays, each of which is associated with having to re-center the pointer after each zooming function.
Some embodiments provide a synchronization (or coordination) between the zooming function and the pointer location on a user interface.
Methods according to the present invention address these needs and others by providing a method for maintaining the position of a pointer on the center of a graphical user interface.
According to one exemplary embodiment of the present invention, a method for navigating a pointer on a graphical user interface (GUI) includes the steps of: scrolling an input device to locate the pointer corresponding to the input device on a point of interest within the GUI, depressing an actuating button associated with the input device on the point of interest, obtaining a detailed view of the point of interest while centering the point of interest on the GUI and maintaining a position of the pointer on the point of interest.
According to another exemplary embodiment of the present invention, a method of centering a pointer on a graphical user interface (GUI) includes navigating the pointer to a point of interest away from a center of the GUI, actuating a mechanism for obtaining a magnified view of the point of interest, computing a distance between a center of the GUI and a location of the point of interest, generating a detailed view of the point of interest, displaying the detailed view with the point of interest centered on the GUI and animating a movement of the pointer from the position away from the center of the GUI to the point of interest.
The accompanying drawings illustrate exemplary embodiments of the present invention, wherein:
The following detailed description of the invention refers to the accompanying drawings. The same reference numbers in different drawings identify the same or similar elements. Also, the following detailed description does not limit the invention. Instead, the scope of the invention is defined by the appended claims.
In exemplary embodiments, incremental zooming may be utilized to coordinate pointer location with a point of interest on a user interface. Referring back to
By navigating pointer 104 and depressing the actuating button at the point of interest 105 on
A distance 106 between center 108 and point of interest 105 may be computed. A virtual line (representing 106) may be drawn between starting point 108 and point of interest 105. The virtual line may represent the linear distance between points 108 and 105 of
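The distance computation described here is a straightforward application of the Euclidean distance formula. The following is a minimal sketch in Python; the tuple-based coordinate representation and the function name are illustrative assumptions, not part of the description:

```python
import math

def distance_to_center(center, point_of_interest):
    """Length of the 'virtual line' drawn between the GUI center
    (e.g. point 108) and the selected point of interest (e.g. point 105).

    Both arguments are assumed to be (x, y) tuples in interface
    coordinates.
    """
    dx = point_of_interest[0] - center[0]
    dy = point_of_interest[1] - center[1]
    # math.hypot returns the linear (Euclidean) distance sqrt(dx^2 + dy^2).
    return math.hypot(dx, dy)
```

For instance, a point of interest at (3, 4) with the center at the origin yields a distance of 5.0.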
The center 108 of the user interface 102 remains fixed at one physical location on the interface as long as the size of the interface (represented by the window) remains constant; the geographic point represented by the center may vary based on the zooming level. For example, in
A detailed view desired by zooming in on point of interest 105 of
In some embodiments, the intermediate zooming level, the results of which are illustrated in
Centering the point of interest 105 within user interface 102 while zooming in may be achieved by combining the zooming function with a simultaneous panning function. Panning refers to translating the view in the vertical and/or horizontal dimensions. As a result of the panning process, the point of interest 105 coincides with center 108 of the interface 102.
As the actuating button of the input device is depressed to achieve zooming, the point of interest 105 may move along the dotted distance line 106 to center 108 of the interface 102. The progress of this movement along the line may be illustrated in an animated manner. In preferred embodiments, panning in order to make the point of interest 105 coincide with center 108 of user interface 102 may be completed at the same time the desired zooming level is achieved. The amount of movement (or displacement) the point of interest 105 undergoes for each zooming step may be computed. As described above, center 108 of the interface represents the point of interest as a result of this movement.
The final level of detail available for zooming in may be determined by a designer of the particular software program being used. For example, a designer of a map software program might choose to facilitate zooming in to a block level or a street level, etc. This may assist in determining the number of available zooming levels between a starting point 108 of
While two zooming levels and one intermediate frame are illustrated in this example, a higher number of zooming levels will result in more intermediate frames being shown. If four levels are available in an embodiment, then the number of intermediate frames may be three. That is, a first intermediate frame may depict point 108 being located between point 108 of
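The simultaneous pan-and-zoom described above can be sketched as a linear interpolation of the view center along distance line 106, paired with an interpolation of the zoom scale, one step per zooming level, so that panning completes exactly when the final level is reached. This is an illustrative sketch under assumed names and a simple linear scale progression; an actual implementation would use the host program's rendering interfaces:

```python
def intermediate_frames(center, point_of_interest, start_scale, end_scale, levels):
    """Generate (view_center, scale) pairs, one per zooming level.

    The view center moves along the line from `center` (e.g. point 108)
    to `point_of_interest` (e.g. point 105), so the point of interest
    coincides with the center when the last frame is shown.
    """
    frames = []
    for step in range(1, levels + 1):
        t = step / levels  # fraction of the pan/zoom completed so far
        cx = center[0] + t * (point_of_interest[0] - center[0])
        cy = center[1] + t * (point_of_interest[1] - center[1])
        scale = start_scale + t * (end_scale - start_scale)
        frames.append(((cx, cy), scale))
    return frames
```

With two zooming levels, this yields one intermediate frame followed by the final frame, matching the example above; four levels would yield three intermediate frames.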
Exemplary methods may also facilitate zooming out from a point of interest. In zooming out, the pointer may remain on the point of interest but the center may no longer coincide with the pointer. In
Each of the figures also shows a scale (designated by 112, 212 and 312) to depict what one unit may represent (such as distance for example) in the corresponding figure. In some embodiments, a history of zooming levels that were illustrated (frames) may be maintained in order to enable a user to visit previous frames.
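The history of zooming levels mentioned here behaves like a simple stack of previously displayed frames. A minimal sketch follows; the class and method names, and the frame representation, are illustrative assumptions:

```python
class ZoomHistory:
    """Keeps previously displayed frames (e.g. view center and scale)
    so that earlier zooming levels can be revisited."""

    def __init__(self):
        self._frames = []

    def push(self, frame):
        # Record the current frame before a new zooming step is applied.
        self._frames.append(frame)

    def back(self):
        # Return the most recently recorded frame, or None if the
        # history is empty.
        return self._frames.pop() if self._frames else None
```

Each zoom-in would push the outgoing frame; stepping back pops frames in reverse order, returning the user to prior views.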
In some embodiments as described above, the animation or transition between a starting point (such as
In other embodiments, the animation may take place at a different rate (or at a varying rate). The first few intermediate frames may be shown slowly, the next several intermediate frames may be shown at a faster rate and the last few intermediate frames may be shown slowly for example.
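The slow-fast-slow pacing described above corresponds to an ease-in/ease-out timing curve. The classic "smoothstep" polynomial is one common choice for such a curve; it is shown here as an illustration of the idea, not necessarily the curve an embodiment would use:

```python
def smoothstep(t):
    """Ease-in/ease-out curve for t in [0, 1]: progress changes slowly
    near t = 0 and t = 1 and fastest near the middle."""
    return t * t * (3.0 - 2.0 * t)

def frame_progress(frame_index, total_frames):
    """Eased animation progress for a given intermediate frame
    (frame_index ranges from 0 to total_frames)."""
    t = frame_index / total_frames
    return smoothstep(t)
```

Early and late frames then advance by small increments (the "slow" phases), while frames near the middle of the animation advance by larger increments.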
Exemplary embodiments may be implemented on a general purpose computer such as a desktop, a laptop, a pocket PC, a personal digital assistant (PDA) or other similar devices having the requisite processing capacity. Methods described may be encoded on a computer-readable medium as a series of executable instructions or implemented on an application-specific integrated circuit (ASIC).
Methods in accordance with exemplary embodiments as described above may be illustrated as process or flow charts 400 and 500 in
While the description has focused on zooming in on a map, exemplary methods may be equally applicable in other scenarios, such as virtual tour programs (e.g., real estate viewing) and gaming. The methods can also be used in menu selection within an entertainment/pay-per-view environment. For example, thumbnail images representing various movies available for viewing on a pay-per-view basis may be displayed to a user on a display or screen. The user may utilize a 3D pointing device such as that developed by Hillcrest Laboratories, Inc. of Rockville, Md. to select one of the images. As a result of this selection, more detailed information corresponding to the selected image may be displayed to the user. Input devices may also include a graphic tablet, a tracking surface such as a track pad or a 3D pointing device.
The above-described exemplary embodiments are intended to be illustrative in all respects, rather than restrictive, of the present invention. Thus the present invention is capable of many variations in detailed implementation that can be derived from the description contained herein by a person skilled in the art. All such variations and modifications are considered to be within the scope and spirit of the present invention as defined by the following claims. No element, act, or instruction used in the description of the present application should be construed as critical or essential to the invention unless explicitly described as such. Also, as used herein, the article “a” is intended to include one or more items.
This application claims priority from U.S. patent application Ser. No. 11/893,512, filed Aug. 16, 2007; which claims priority from U.S. patent application Ser. No. 11/064,310, filed Feb. 23, 2005, now U.S. Pat. No. 7,260,789 B2, which issued on Aug. 21, 2007; which claims priority from U.S. Provisional Patent Application No. 60/546,859, filed on Feb. 23, 2004, and entitled “A Method of Real-time Incremental Zooming using Pointing”. This application is also related to U.S. patent application Ser. No. 10/768,432. The subject matter of each of these applications is incorporated in its entirety herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
4745402 | Auerbach | May 1988 | A |
5045843 | Hansen | Sep 1991 | A |
5283562 | Kaneko et al. | Feb 1994 | A |
5341466 | Perlin et al. | Aug 1994 | A |
5359348 | Pilcher et al. | Oct 1994 | A |
5615384 | Allard et al. | Mar 1997 | A |
5638523 | Mullet et al. | Jun 1997 | A |
5671342 | Millier et al. | Sep 1997 | A |
5745710 | Clanton, III et al. | Apr 1998 | A |
5790121 | Sklar et al. | Aug 1998 | A |
5793438 | Bedard | Aug 1998 | A |
5796395 | de Hond | Aug 1998 | A |
5835156 | Blonstein et al. | Nov 1998 | A |
5912612 | DeVolpi | Jun 1999 | A |
5940072 | Jahanghir et al. | Aug 1999 | A |
5955988 | Blonstein et al. | Sep 1999 | A |
5978043 | Blonstein et al. | Nov 1999 | A |
5982369 | Sciammarella et al. | Nov 1999 | A |
6002394 | Schein et al. | Dec 1999 | A |
6005578 | Cole | Dec 1999 | A |
6016144 | Blonstein et al. | Jan 2000 | A |
6037933 | Blonstein et al. | Mar 2000 | A |
6049823 | Hwang | Apr 2000 | A |
6052110 | Sciammarella et al. | Apr 2000 | A |
6057831 | Harms et al. | May 2000 | A |
6088031 | Lee et al. | Jul 2000 | A |
6092076 | McDonough et al. | Jul 2000 | A |
6154723 | Cox et al. | Nov 2000 | A |
6175362 | Harms et al. | Jan 2001 | B1 |
6178264 | Kamatani | Jan 2001 | B1 |
6181333 | Chaney et al. | Jan 2001 | B1 |
6191781 | Chaney et al. | Feb 2001 | B1 |
6195089 | Chaney et al. | Feb 2001 | B1 |
6268849 | Boyer et al. | Jul 2001 | B1 |
6295646 | Goldschmidt Iki | Sep 2001 | B1 |
6314575 | Billock et al. | Nov 2001 | B1 |
6330858 | McDonough et al. | Dec 2001 | B1 |
6349257 | Liu et al. | Feb 2002 | B1 |
6385542 | Millington | May 2002 | B1 |
6397387 | Rosin et al. | May 2002 | B1 |
6407749 | Duke | Jun 2002 | B1 |
6411308 | Blonstein et al. | Jun 2002 | B1 |
6412110 | Schein et al. | Jun 2002 | B1 |
6415226 | Kozak | Jul 2002 | B1 |
6421067 | Kamen et al. | Jul 2002 | B1 |
6426761 | Kanevsky et al. | Jul 2002 | B1 |
6429813 | Feigen | Aug 2002 | B2 |
6452609 | Katinsky et al. | Sep 2002 | B1 |
6529218 | Ogawa et al. | Mar 2003 | B2 |
6577350 | Proehl et al. | Jun 2003 | B1 |
6621452 | Knockeart et al. | Sep 2003 | B2 |
6735777 | Kim | May 2004 | B1 |
6753849 | Curran et al. | Jun 2004 | B1 |
6765598 | Kim | Jul 2004 | B2 |
7142205 | Chithambaram et al. | Nov 2006 | B2 |
7446783 | Grossman | Nov 2008 | B2 |
20010030667 | Kelts | Oct 2001 | A1 |
20020039108 | Roy et al. | Apr 2002 | A1 |
20020112237 | Kelts | Aug 2002 | A1 |
20020129366 | Schein et al. | Sep 2002 | A1 |
20030018427 | Yokota et al. | Jan 2003 | A1 |
20030095135 | Kaasila et al. | May 2003 | A1 |
20050034075 | Riegelman et al. | Feb 2005 | A1 |
20050041044 | Gannon | Feb 2005 | A1 |
Number | Date | Country |
---|---|---|
1179769 | Feb 2002 | EP |
1 276 039 | Jan 2003 | EP |
2 237 911 | May 1991 | GB |
H11-134511 | May 1999 | JP |
2000-221874 | Aug 2000 | JP |
2001-229166 | Aug 2001 | JP |
2002-49454 | Feb 2002 | JP |
2002-98897 | Apr 2002 | JP |
2002-288647 | Oct 2002 | JP |
9843183 | Oct 1998 | WO |
03100573 | Dec 2003 | WO |
Entry |
---|
Summons to attend oral proceedings in corresponding European Application No. 05713848.9 dated Jun. 26, 2014. |
Anonymous: “Zooming”; Internet citation; [retrieved from the Internet: URL:http://www.glue.edu/ pkd/imagine/viewer/viewer—zooming.htm, retrieved on Jul. 14, 2005] XP002338343; Jun. 4, 2001; pp. 1-2. |
International Standard; “Ergonomic requirements for office work with visual display terminals (VDTs)” Part 10: dialogue principles; ISO 9241-10 First Edition May 1, 1996; pp. 1-17. |
Quesenbery, W., et al., “Designing for Interactive Television,” http://www.wqusability.com/articles/itv-design.html, 1996, pp. 1-6. |
Prasar, V. “Technology to the aid of science popularisation,” http://www.vigyanprasar.com/dream/jan99/janvpinsight.htm, Jan. 1999, pp. 1-2. |
Press Release, “NetTV Selected for 800 Kansas City Classrooms,” http://www.fno.org/mar98/NKCSDPR1.html, Mar. 23, 1998, pp. 1-2. |
Fuerst, J., et al., “Interactive Television: A Survey of the State of Research and the Proposal and Evaluation of a User Interface,” http://www.ai.wu-wien.ac.at/˜koch/stud/itv/paper.html, Jun. 1996, pp. 1-11. |
Bier, E., et al., “Toolglass and Magic Lenses: The See-Through Interface,” Proceedings of Siggraph 93, Computer Graphics Annual Conference Series, ACM, Aug. 1993, pp. 73-80. |
Stone, M., et al., “The Movable Filter as a User Interface Tool,” Proceedings of CHI '94, ACM, Apr. 24-28, 1994, pp. 306-312. |
Bier, E., et al., “A Taxonomy of See-Through Tools,” Proceedings of CHI '94, ACM, Apr. 24-28, 1994, pp. 358-364. |
Fishkin, K., et al., “Enhanced Dynamic Queries via Movable Filters,” May 7, 1995, pp. 1-6. |
Bederson, B., “Quantum Treemaps and Bubblemaps for a Zoomable Image Browser,” UIST 2001, ACM Symposium on User Interface Software and Technology, CHI Letters 3(2), pp. 71-80. |
International Search Report issued in corresponding International Application No. PCT/US04/14487, date of mailing Feb. 9, 2005. |
Written Opinion of the International Searching Authority issued in corresponding International application No. PCT/US04/14487, date of mailing Feb. 9, 2005. |
International Search Report issued in corresponding International Application No. PCT/US05/05376, date of mailing Jan. 23, 2007. |
Written Opinion issued in corresponding International Application No. PCT/US05/05376, date of mailing Jan. 23, 2007. |
Verhoeven, A., et al., “Hypermedia on the Map: Spatial Hypermedia in HyperMap,” International Conference on Information, Communications and Signal Processing, ICICS '97, Singapore, Sep. 9-12, 1997, pp. 589-592. |
International Search Report issued in corresponding International Application No. PCT/US01/08377, date of mailing Jul. 25, 2005. |
International Search Report issued in corresponding International Application No. PCT/US01/08331, date of mailing Nov. 13, 2002. |
International Search Report issued in corresponding International Application No. PCT/US01/08261, date of mailing Aug. 5, 2002. |
CN Office Action mailed Aug. 15, 2008 in related CN Application No. 2005800081099. |
CN Office Action mailed Apr. 10, 2009 in related CN Application No. 2005800081099. |
Decision to Refuse a European Patent Application mailed Jan. 21, 2015 in related EP Application No. 05 713 848.9. |
EP Office Action mailed Feb. 15, 2008 in related EP Application No. 05 713 848.9. |
EP Office Action mailed Jul. 8, 2008 in related EP Application No. 05 713 848.9. |
In Office Action mailed Apr. 23, 2013 in related IN Application No. 4852/DELNP/2006. |
JP Office Action mailed Feb. 1, 2011 in related JP Application No. 2006-554262. |
KR Office Action mailed Jul. 15, 2011 in related KR Application No. 10-2006-7019387. |
Supplementary European Search Report mailed Nov. 26, 2007 in related EP Application No. 05 713 848.9. |
Number | Date | Country |
---|---|---|
20140157208 A1 | Jun 2014 | US |
Number | Date | Country |
---|---|---|
60546859 | Feb 2004 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 11893512 | Aug 2007 | US |
Child | 14173116 | US | |
Parent | 11064310 | Feb 2005 | US |
Child | 11893512 | US |