User interface

Information

  • Patent Grant
  • Patent Number
    8,650,510
  • Date Filed
    Tuesday, March 27, 2012
  • Date Issued
    Tuesday, February 11, 2014
Abstract
A computer readable medium storing instructions which, when executed by a processor of a mobile handheld computer, cause the processor to present a user interface including a touch sensitive area in which a representation of a function is provided, wherein the function is activated in response to a multi-step operation including an object touching the touch sensitive area at a location where the representation is provided, and then the object gliding along the touch sensitive area away from the touched location irrespective of how far the object glides, wherein the function, when activated, causes the user interface to display user input controls at respective display locations that are determined prior to performance of the multi-step operation, and wherein any user gesture that activates a function and begins with an object touching the touch sensitive area at a location where the representation is provided causes the same response.
Description
TECHNICAL FIELD

The present invention relates to a user interface for a mobile handheld computer unit, which computer unit comprises a touch sensitive area, and which touch sensitive area is divided into a menu area and a display area.


The computer unit is adapted to run several applications simultaneously and to present any active application on top of any other application on the display area.


The present invention also relates to an enclosure for a handheld computer unit.


The present invention also relates to a computer readable medium. A computer program product with computer program code is stored within the computer readable medium, which code, when read by a computer, will make it possible for this computer to present a user interface according to the invention.


DESCRIPTION OF BACKGROUND ART

Mobile handheld computers are known in various embodiments. One kind of handheld computer is the personal digital assistant (PDA), which is getting more and more powerful.


Another kind of handheld computer unit is the mobile phone, which also is getting more and more powerful. There are also examples of where the mobile phone and the PDA are merging into one unit.


A third kind of handheld computer is the laptop computer, which is getting smaller and smaller, even competing in size with the PDAs.


The need to manage more information has led the development towards new solutions regarding user interfaces and navigation. PDAs and mobile phones are getting larger and larger in order to provide a user-friendly interface.


Since the users have gotten used to small handheld units, it is hard to move towards larger units. This has led to foldable keyboards, different kinds of joysticks, and different kinds of touch sensitive displays and pads intended to help in providing a user interface that is suitable for small handheld computer units.


SUMMARY OF THE PRESENT INVENTION
Technical Problems

It is a problem to provide a user-friendly interface that is adapted to handle a large amount of information and different kinds of traditional computer-related applications on a small handheld computer unit.


It is a problem to provide a user interface that is simple to use, even for inexperienced users of computers or handheld devices.


It is a problem to provide a small handheld computer unit with an easily accessible text input function.


It is also a problem to provide a simple way to make the most commonly used functions for navigation and management available in the environment of a small handheld computer unit.


Solution

Taking these problems into consideration, and with the starting point of a user interface for a mobile handheld computer unit, which computer unit comprises a touch sensitive area, which touch sensitive area is divided into a menu area and a display area, which computer unit is adapted to run several applications simultaneously and to present an active application on top of any other application on the display area, the present invention teaches that the menu area is adapted to present a representation of a first, a second and a third predefined function, where the first function is a general application dependent function, the second function is a keyboard function, and the third function is a task and file manager. The present invention also teaches that any one of these three functions can be activated when the touch sensitive area detects a movement of an object with its starting point within the representation of the function on the menu area and with a direction from the menu area to the display area.


With the purpose of providing a simple way of managing any application or the operating system, the present invention teaches that if the first function is activated, the display area is adapted to display icons representing services or settings, depending on the current active application. One of the icons always represents a “help”-service, regardless of application. If no application is currently active on the computer unit, the icons are adapted to represent services or settings of the operating system of said computer unit, such as background picture, clock, users, help, etc.


Selection of a preferred service or setting is done by tapping on the corresponding icon.


With the purpose of providing access to a text input function in any application in the computer unit, the present invention teaches that when the second function is activated, the display area is adapted to display a keyboard and a text field.


If a text passage in an active application is highlighted, then this text passage is displayed in the text field for editing through the keyboard, and the highlighted text passage is replaced by the edited text passage when the second function is deactivated.


If no text passage in an active application is highlighted, then the text field is available for inputting and editing of text through the keyboard.


In the latter case, the first function can be activated, or the second function can be closed, in which case a choice of saving or deleting the inputted text is given. The choice of saving the inputted text results in an activation of the first function. In this case the first function will present services or settings available for the inputted text, such as saving the inputted text for later use, using the inputted text as a telephone number in a telephone application, or sending the inputted text as a message in a communications application.


In order to provide task and file management in a user interface for a handheld mobile computer, the present invention teaches that, if the third function is activated, the display area is adapted to display a list with a library of available applications and files on the computer unit. A selection of an application will start the application, and a selection of a file will open the file in an application intended for the file.


A selection of an application or a file is done by moving the object so that the representation of the desired application or file is highlighted, removing the object from the touch sensitive area, and then tapping on the touch sensitive area.


According to the present invention, navigation in the list is performed by moving the object in a direction towards the top of the list or towards the bottom of the list. This will cause the marking to move in the same direction. The speed of the movement of the marking is lower than the speed of the movement of the object, with the purpose of making the navigation easier.


The user interface of the present invention is specifically adapted to be used with a small computer unit where the size of the touch sensitive area is in the order of 2-3 inches. The user interface is also adapted to be operated by one hand, where the object can be a finger, such as the thumb, of a user of the computer unit.


Advantages

Those advantages that can be primarily associated with a user interface or a computer readable medium according to the present invention reside in the ability to establish a user-friendly interface for small handheld computers, regarding general application set-up functions, text input functions, and file and task management.





BRIEF DESCRIPTION OF THE DRAWINGS

The present invention will now be described in more detail with reference to the accompanying drawings, in which



FIG. 1 is a schematic and highly simplified view of a touch sensitive area on a mobile handheld computer unit;



FIG. 2 is a schematic side view illustrating the activation of a function;



FIG. 3 is a schematic illustration of a first function;



FIG. 4 is a schematic side view illustrating the selection of a service or setting represented by an icon;



FIG. 5 is a schematic illustration of a second function;



FIG. 6 is a schematic side view illustrating the selection of a third function;



FIG. 7 is a schematic illustration of an application or file;



FIG. 8 is a schematic illustration on how navigation is performed;



FIG. 9 is a schematic illustration of how the content of the display area is changed;



FIG. 10 is a schematic side view further illustrating how navigation is performed;



FIG. 11 is a schematic illustration of moving forwards in an application;



FIG. 12 is a schematic illustration of moving backwards in, or closing, an application;



FIG. 13 is a schematic illustration of an enclosure; and



FIG. 14 is a schematic illustration of a computer readable medium.





DESCRIPTION OF PRESENTLY PREFERRED EMBODIMENTS


FIG. 1 illustrates a user interface for a mobile handheld computer unit. The user interface according to the present invention is specifically adapted to computer units comprising a touch sensitive area 1, which is divided into a menu area 2 and a display area 3. It should be understood that there are several different kinds of known touch sensitive displays and that the present invention does not depend on what kind of touch sensitive display is used in relation to the inventive user interface.


The computer unit is adapted to run several applications simultaneously and to present an active application on top of any other application on the display area 3. It should be understood that by “simultaneously” is meant any technology that will make it appear to a user of the computer unit that applications are run simultaneously, and that the present invention does not depend on how this is realised, whether it is through time-sharing of one processor, parallel use of several processors, or any other technique.


According to the present invention the menu area 2 is adapted to present a representation of a first 21, a second 22 and a third 23 predefined function.


The first function 21 is a general application dependent function, the second function 22 is a keyboard function, and the third function 23 is a task and file manager.



FIG. 2 shows that any one of these three functions 21, 22, 23 can be activated when the touch sensitive area 1 detects a movement of an object 4 with its starting point A within the representation of a function on the menu area 2 and with a direction B from the menu area 2 to the display area 3.
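

This activation rule can be expressed as a small state machine. Below is a minimal TypeScript sketch of such a recognizer; the class, its layout description, and all identifiers are illustrative assumptions rather than anything specified by the patent, which only requires that the gesture start within a function's representation in the menu area and glide toward the display area.

```typescript
// Hypothetical model of the touch sensitive area 1: the menu area 2 sits
// below menuTop, the display area 3 above it (y grows downward).
type Point = { x: number; y: number };

interface MenuLayout {
  menuTop: number; // y coordinate where the menu area begins
  functions: { id: "first" | "second" | "third"; xMin: number; xMax: number }[];
}

class MenuGestureRecognizer {
  private armedFunction: string | null = null;

  constructor(
    private layout: MenuLayout,
    private activate: (id: string) => void
  ) {}

  onTouchDown(p: Point): void {
    // The gesture only arms if the touch begins within the representation
    // of one of the three functions on the menu area (point A in FIG. 2).
    this.armedFunction =
      p.y >= this.layout.menuTop
        ? this.layout.functions.find(f => p.x >= f.xMin && p.x <= f.xMax)?.id ?? null
        : null;
  }

  onTouchMove(p: Point): void {
    // Gliding from the menu area into the display area (direction B)
    // activates the armed function, however far the glide continues.
    if (this.armedFunction !== null && p.y < this.layout.menuTop) {
      this.activate(this.armedFunction);
      this.armedFunction = null;
    }
  }

  onTouchUp(): void {
    this.armedFunction = null; // lifting without gliding activates nothing
  }
}
```

Because activation fires as soon as the glide crosses into the display area, the distance travelled afterwards is irrelevant, matching the “irrespective of how far the object glides” language of the abstract.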



FIG. 3 shows that if the first function 21 is activated, then the display area 3 is adapted to display icons 211, 212, 213, 214, 215, 216 representing services or settings, depending on the current active application. One of the icons, in the figure exemplified by icon 211, always represents a “help”-service, regardless of application. Any key that is not shown on the display area of an active application, whether because of lack of space on the display area, because the key should be hidden from the active application, or for any other reason, can be represented by one of the icons 212, 213, 214, 215, 216 that are shown when the first function 21 is activated.


If for instance the active application handles a picture, then the icons that are shown when the first function is activated can be services such as “save to disk”, “send as SMS”, or “delete” and they can be settings such as “resolution”, “colour”, or “brightness”.


If no application is currently active on the computer unit, then the icons 211, 212, 213, 214, 215, 216 are adapted to represent services or settings of the operating system of the computer unit, such as background picture, clock, alarm 215, users 213, help 211, etc.



FIG. 4 shows that selection of a preferred service or setting is done by tapping C, D on the corresponding icon 213.



FIG. 5 shows that if the second function 22 is activated, then the display area 3 is adapted to display a keyboard 221 and a text field 222.


Two different scenarios can occur when this function is activated. A first scenario can be that a text passage in the active application is highlighted as the second function is activated. If this is the case, then the highlighted text passage is displayed in the text field 222 for editing through the keyboard 221.


The highlighted text passage is replaced by the edited text passage when the second function 22 is deactivated.


A second scenario can be that no text passage in the active application is highlighted. If this is the case then the text field 222 is available for inputting and editing of text through the keyboard 221.


In the case of the latter scenario, the first function 21 can be activated, or the second function 22 can be closed. If the second function 22 is closed then a choice of saving or deleting the inputted text is given, where the choice of saving the inputted text results in an activation of the first function 21.


As the first function 21 is activated with the second function 22 as the currently active application, the first function 21 will present services or settings available for the inputted text, such as saving the inputted text for later use, using the inputted text as a telephone number in a telephone application, or sending the inputted text as a message in a communications application, such as e-mail, SMS, or fax.
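

The two scenarios and the closing choice can be summarized in code. The following is a hedged sketch; the TextTarget interface, the chooseToSave dialog callback, and the presentServices hand-off are hypothetical names introduced here for illustration, not part of the patent.

```typescript
// Hypothetical interface to the active application's text selection.
interface TextTarget {
  highlighted: string | null;               // selected passage, if any
  replaceHighlighted(edited: string): void; // write the edit back
}

class KeyboardFunction {
  private buffer = "";
  private editingSelection = false;

  open(target: TextTarget): void {
    this.editingSelection = target.highlighted !== null;
    // Scenario 1: edit the highlighted passage; scenario 2: start empty.
    this.buffer = target.highlighted ?? "";
  }

  type(text: string): void {
    this.buffer += text;
  }

  close(
    target: TextTarget,
    chooseToSave: () => boolean,             // the save-or-delete choice
    presentServices: (text: string) => void  // hands text to the first function
  ): void {
    if (this.editingSelection) {
      // Deactivating replaces the highlighted passage with the edited one.
      target.replaceHighlighted(this.buffer);
    } else if (chooseToSave()) {
      // Saving activates the first function, which offers services such as
      // "use as telephone number" or "send as message".
      presentServices(this.buffer);
    }
    // Choosing delete simply discards the buffer.
  }
}
```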



FIG. 6 shows that if the third function 23 is activated, then the display area 3 is adapted to display a list 231 with a library of available applications and files on the computer unit.


A selection of an application will start the application, and a selection of a file will open the file in an application intended for the file. The name of a selected file can be edited by activation of the second function 22 as the file is highlighted.



FIG. 7 shows that a selection of an application or a file is done by moving E the object 4 so that the representation of a desired application or file is highlighted, removing F the object 4 from the touch sensitive area 1, and then tapping G, H on the touch sensitive area 1.


An application or file is highlighted by placing some kind of marking 232 on the representation of the application or file. This marking can be done in different ways, for example by putting a frame around the representation of the application or file, as shown in the figure, or by inverting the representation of the application or file.
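

A minimal sketch of this highlight-then-tap selection sequence follows; the ListSelector class and its entry model are assumptions made for illustration.

```typescript
type Entry = { name: string; kind: "application" | "file" };

class ListSelector {
  private marked = 0; // index carrying the marking 232

  constructor(
    private entries: Entry[],
    private openEntry: (e: Entry) => void
  ) {}

  // Gliding E over the list moves the marking (frame or inverted row).
  onGlideOver(index: number): void {
    this.marked = index;
  }

  // After lifting F, a tap G, H on the touch sensitive area starts the
  // marked application or opens the marked file.
  onTap(): void {
    this.openEntry(this.entries[this.marked]);
  }
}
```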


It should be understood that all lists in the computer unit, such as a list of contact information in an address book, a list of e-mail messages in a mailbox, or a telephone log, can be managed in the above described manner.


The list 231 can be adapted to present only files or only applications. In this case, the top area of the list 231 can present a field 233 through which the content of the list 231 can be altered. If the list only presents files, then the field 233 can display a representation of a task manager, and a selection of the field 233 will cause the list 231 to alter to present only applications; if the list 231 only presents applications, then the field 233 displays a representation of a file manager, and a selection of the field 233 will cause the list 231 to alter and present only files.
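

The toggling behaviour of field 233 reduces to a two-state switch, sketched below; the labels paraphrase the figure, and the function name is an assumption.

```typescript
type ListMode = "files" | "applications";

// Given the list's current mode, report what field 233 shows and what
// mode selecting it switches the list to.
function fieldFor(mode: ListMode): { shows: string; selectingYields: ListMode } {
  return mode === "files"
    ? { shows: "task manager", selectingYields: "applications" }
    : { shows: "file manager", selectingYields: "files" };
}
```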



FIG. 8 shows that navigation in the list is performed by moving the object 4 in a direction I towards the top 231a of the list 231 or towards J the bottom 231b of the list 231. This movement I, J of the object 4 will cause the marking 232 to move K, L in the same direction. The speed of the movement K, L of the marking 232 is lower than the speed of the movement I, J of the object 4.
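

One way to realise a marking that moves slower than the object is to scale the object's displacement by a fixed ratio before converting it into row movements. The sketch below assumes a 0.5 ratio and a fixed row height; the patent only requires that the marking's speed be lower than the object's.

```typescript
class SlowMarking {
  private accumulated = 0;

  constructor(
    private rowHeight: number, // height of one list row, in pixels
    private ratio = 0.5        // marking speed relative to object speed
  ) {}

  // Convert one glide step of the object (I, J) into whole-row moves of
  // the marking (K, L); the fractional remainder carries over.
  step(objectDeltaY: number): number {
    this.accumulated += objectDeltaY * this.ratio;
    const rows = Math.trunc(this.accumulated / this.rowHeight);
    this.accumulated -= rows * this.rowHeight;
    return rows;
  }
}
```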



FIG. 9 shows what happens if the number of applications and/or files in the list 231 exceeds the number that can be presented on the display area 3. If the object 4 is moved to the top or bottom position of the display area, then lifted, replaced on the display area, and then again moved to the top or bottom of the display area, then the content of the display area will be replaced one whole page. This means that if the object 4 is positioned N at the bottom 3b of the display area 3, then lifted, replaced on the display area 3, and then again moved M to the bottom 3b of the display area 3, then the content 31 of the display area 3 will be replaced P by the following applications and/or files 32 in the list 231. In the same way, but not shown in the figure, if the object is positioned at the top of the display area, then lifted, replaced on the display area 3, and then again moved to the top of the display area, the content of the display area will be replaced by the preceding applications and/or files in the list.
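

This lift-and-repeat paging rule can be tracked with a small amount of state: remember the edge at which the object was lifted, and flip a page when the next glide reaches the same edge. The PageFlipper class below is an illustrative sketch under that assumption.

```typescript
type Edge = "top" | "bottom";

class PageFlipper {
  private liftedAt: Edge | null = null;

  constructor(private flipPage: (direction: 1 | -1) => void) {}

  // Called while gliding, whenever the object reaches an edge (M, N).
  onReachEdge(edge: Edge): void {
    if (this.liftedAt === edge) {
      // Second arrival at the same edge after a lift: replace one whole
      // page (P) with the following or preceding entries.
      this.flipPage(edge === "bottom" ? 1 : -1);
      this.liftedAt = null;
    }
  }

  // Called when the object is removed from the touch sensitive area.
  onLift(edgeWhenLifted: Edge | null): void {
    this.liftedAt = edgeWhenLifted;
  }
}
```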



FIG. 10 shows that if the object 4 is removed Q from a first position 33 on the display area 3 and then replaced R, S on a second position 34 on the display area 3, then the navigation can be continued T from the second position 34.



FIG. 11 shows that moving U the object 4 from the left of the display area 3 to the right of the display area 3 moves the active application, function, service or setting one step forwards. FIG. 12 shows that, in a similar manner, the active application, function, service or setting is closed or backed one step by moving V the object 4 from the right of the display area 3 to the left of the display area 3.
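

Dispatching these two horizontal glides comes down to the sign of the net horizontal displacement. A minimal sketch follows; the threshold value is an assumption introduced to filter out accidental slides and is not specified by the patent.

```typescript
const SWIPE_THRESHOLD = 40; // pixels; illustrative, not from the patent

function interpretHorizontalGlide(
  deltaX: number,             // net horizontal displacement of the glide
  stepForward: () => void,    // FIG. 11: advance one step (U)
  stepBackOrClose: () => void // FIG. 12: back one step or close (V)
): void {
  if (deltaX > SWIPE_THRESHOLD) stepForward();
  else if (deltaX < -SWIPE_THRESHOLD) stepBackOrClose();
}
```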


As shown in FIG. 1, the menu area 2 is positioned at the bottom of the touch sensitive area 1. The representation of the first function 21 is positioned at the left side of the menu area 2, the representation of the second function 22 is positioned at the middle of the menu area 2, and the representation of the third function 23 is positioned at the right side of the menu area 2.


As shown in FIG. 13, the present invention relates to a user interface for a handheld mobile unit that preferably can be managed with one hand. Hence the present invention teaches that the user interface is adapted to a touch sensitive area 1 with a size that is in the order of 2-3 inches, meaning the diagonal distance W between two corners of the touch sensitive area 1.


The user interface is adapted to be operated by one hand, where the object 4 can be a finger, such as the thumb shown in the figures, of a user of the computer unit. It should be understood though that the present invention might also be used with another object, such as a pen or other pointing device.


According to one preferred embodiment of the present invention the computer unit is covered with an enclosure 5, which is provided with an opening 51 for the display area 3, and where the representations of the menu area 2 are printed on top of the enclosure 5. It should be understood that the opening 51 might be a transparent part of the enclosure 5 or that it might be an open aperture, depending on, among other things, technical considerations pertaining to the touch sensitive area 1.


This makes it possible to allow the enclosure 5 to be removable and exchangeable.



FIG. 14 shows a computer readable medium, in the figure schematically shown as a solid-state memory 61. A computer program product is stored within the computer readable medium. This computer program product comprises computer readable code 62, which, when read by a computer 6, will make it possible for the computer 6 to present a user interface according to the present invention.


The present invention also teaches that the computer program product is adapted to function as a shell upon an operating system.


It will be understood that the invention is not restricted to the aforedescribed and illustrated exemplifying embodiments thereof, and that these embodiments can be modified within the scope of the inventive concept illustrated in the accompanying Claims.

Claims
  • 1. A non-transitory computer readable medium storing instructions which, when executed by a processor of a mobile handheld computer comprising a screen, cause the processor to present a user interface for the mobile handheld computer, the user interface comprising at least one touch sensitive strip along at least one edge of the screen, wherein the user interface is programmed to provide an information transmission pipeline in response to first and second multi-step input gestures, each gesture comprising (i) an object touching the screen within said at least one touch sensitive strip, and then (ii) the object gliding along the screen away from said at least one touch sensitive strip, whereby the user interface is responsive to the first multi-step user input gesture to activate a first graphical user interface (GUI) on the screen for providing information to be transmitted, and is responsive to the second multi-step input gesture to activate a second GUI providing a plurality of options for transmitting the information provided by the first GUI.
  • 2. The computer readable medium of claim 1, wherein at least one of the first and second GUI comprises a plurality of user interface controls.
  • 3. The computer readable medium of claim 2, wherein said plurality of user interface controls comprises icons.
  • 4. The computer readable medium of claim 2, wherein said plurality of user interface controls comprises virtual buttons.
  • 5. The computer readable medium of claim 2, wherein said plurality of user interface controls comprises a list of mobile handheld computer applications.
  • 6. The computer readable medium of claim 2, wherein the plurality of user interface controls is displayed outside of said at least one touch sensitive strip.
  • 7. The computer readable medium of claim 6, wherein the plurality of user interface controls is not displayed prior to the first GUI being activated.
  • 8. The computer readable medium of claim 7, wherein at least one of the plurality of user interface controls comprises a control for activating a mobile handheld computer function.
  • 9. The computer readable medium of claim 1, wherein at least one of the first and second GUI comprises a keyboard and a keyboard entry field.
  • 10. The computer readable medium of claim 1, wherein the entire screen is touch-sensitive, and wherein an active application, function, service or setting is advanced one step in response to a multi-step operation comprising (i) an object touching the screen outside of said at least one touch sensitive strip, and then (ii) the object gliding along the screen in a first direction away from the touched location.
  • 11. The computer readable medium of claim 10, wherein an active application, function, service or setting is closed or backed one step in response to a multi-step operation comprising (i) an object touching the screen outside of said at least one touch sensitive strip, and then (ii) the object gliding along the screen in a direction opposite to the first direction and away from the touched location.
  • 12. The computer readable medium of claim 1, wherein said instructions function as a layer upon an operating system.
  • 13. The computer readable medium of claim 1, wherein the user interface is programmed to provide for only one type of input gesture within the at least one touch sensitive strip, namely, the multi-step input gesture.
  • 14. The computer readable medium of claim 1, wherein no additional gesture is performed between the first and the second multi-step input gestures.
  • 15. The computer readable medium of claim 1, wherein each of said at least one touch sensitive strip is less than a thumb's width wide within the screen.
  • 16. The computer readable medium of claim 1, wherein the information provided by the first GUI comprises text.
  • 17. The computer readable medium of claim 1, wherein the second GUI replaces the first GUI.
PRIORITY REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 13/310,755, entitled USER INTERFACE, filed on Dec. 4, 2011 by inventor Magnus Goertz, which is a continuation of U.S. application Ser. No. 10/315,250, now U.S. Pat. No. 8,095,879, entitled USER INTERFACE FOR MOBILE HANDHELD COMPUTER UNIT, filed on Dec. 10, 2002 by inventor Magnus George Goertz.

US Referenced Citations (128)
Number Name Date Kind
4243879 Carroll et al. Jan 1981 A
4301447 Funk et al. Nov 1981 A
4790028 Ramage Dec 1988 A
5036187 Yoshida et al. Jul 1991 A
5053758 Cornett et al. Oct 1991 A
5194863 Barker et al. Mar 1993 A
5283558 Chan Feb 1994 A
5305435 Bronson Apr 1994 A
5406307 Hirayama et al. Apr 1995 A
5463725 Henckel et al. Oct 1995 A
5561758 Hocker et al. Oct 1996 A
5577733 Downing Nov 1996 A
5581681 Tchao et al. Dec 1996 A
5603053 Gough et al. Feb 1997 A
5612719 Beernink et al. Mar 1997 A
5618232 Martin Apr 1997 A
5644628 Schwarzer et al. Jul 1997 A
5677708 Matthews et al. Oct 1997 A
5745116 Pisutha-Arnond Apr 1998 A
5748185 Stephan et al. May 1998 A
5757368 Gerpheide et al. May 1998 A
5785439 Bowen Jul 1998 A
5821930 Hansen Oct 1998 A
5821933 Keller et al. Oct 1998 A
5825352 Bisset et al. Oct 1998 A
5889236 Gillespie et al. Mar 1999 A
5898434 Small et al. Apr 1999 A
5900875 Haitani et al. May 1999 A
5907327 Ogura et al. May 1999 A
5914709 Graham et al. Jun 1999 A
5943039 Anderson et al. Aug 1999 A
5943044 Martinelli et al. Aug 1999 A
5956030 Conrad et al. Sep 1999 A
5988645 Downing Nov 1999 A
6037937 Beaton et al. Mar 2000 A
6052279 Friend et al. Apr 2000 A
6085204 Chijiwa et al. Jul 2000 A
6088032 Mackinlay Jul 2000 A
6292179 Lee Sep 2001 B1
6304261 Shields et al. Oct 2001 B1
6310610 Beaton et al. Oct 2001 B1
6340979 Beaton et al. Jan 2002 B1
6346935 Nakajima et al. Feb 2002 B1
6356287 Ruberry et al. Mar 2002 B1
6529920 Arons et al. Mar 2003 B1
6542191 Yonezawa Apr 2003 B1
6549217 De Greef et al. Apr 2003 B1
6597345 Hirshberg Jul 2003 B2
6618063 Kurtenbach Sep 2003 B1
6639584 Li Oct 2003 B1
6646633 Nicolas Nov 2003 B1
6664983 Ludolph Dec 2003 B2
6690365 Hinckley et al. Feb 2004 B2
6690387 Zimmerman et al. Feb 2004 B2
6707449 Hinckley et al. Mar 2004 B2
6727917 Chew et al. Apr 2004 B1
6734883 Wynn et al. May 2004 B1
6757001 Allport Jun 2004 B2
6757002 Oross et al. Jun 2004 B1
6765559 Hayakawa Jul 2004 B2
6788292 Nako et al. Sep 2004 B1
6833827 Lui et al. Dec 2004 B2
6874683 Keronen et al. Apr 2005 B2
6925611 SanGiovanni Aug 2005 B2
6988246 Kopitzke et al. Jan 2006 B2
7006077 Uusimaki Feb 2006 B1
7007239 Hawkins et al. Feb 2006 B1
7030861 Westerman et al. Apr 2006 B1
7155683 Williams Dec 2006 B1
7159763 Yap et al. Jan 2007 B2
7199786 Suraqui Apr 2007 B2
7225408 O'Rourke May 2007 B2
7283845 De Bast Oct 2007 B2
7286063 Gauthey et al. Oct 2007 B2
7304638 Murphy Dec 2007 B2
7441196 Gottfurcht et al. Oct 2008 B2
7450114 Anwar Nov 2008 B2
7665043 Kho Feb 2010 B2
7818691 Irvine Oct 2010 B2
7880724 Nguyen et al. Feb 2011 B2
7996878 Basso et al. Aug 2011 B1
8120625 Hinckley Feb 2012 B2
8127141 Hypponen Feb 2012 B2
20010000668 Bodnar May 2001 A1
20010002694 Nakazawa et al. Jun 2001 A1
20010017934 Paloniemi et al. Aug 2001 A1
20010022579 Hirabayashi Sep 2001 A1
20010026268 Ito Oct 2001 A1
20010028344 Iwamoto et al. Oct 2001 A1
20010028365 Ludolph Oct 2001 A1
20010030641 Suzuki Oct 2001 A1
20010043189 Brisebois et al. Nov 2001 A1
20010043198 Ludtke Nov 2001 A1
20010055006 Sano et al. Dec 2001 A1
20020002326 Causey, III et al. Jan 2002 A1
20020015064 Robotham et al. Feb 2002 A1
20020027549 Hirshberg Mar 2002 A1
20020029341 Juels et al. Mar 2002 A1
20020046353 Kishimoto Apr 2002 A1
20020054153 Arnold May 2002 A1
20020060699 D'Agostini May 2002 A1
20020063696 Kubo et al. May 2002 A1
20020075244 Tani et al. Jun 2002 A1
20020080183 Johnson et al. Jun 2002 A1
20020109677 Taylor Aug 2002 A1
20020135619 Allport Sep 2002 A1
20020171691 Currans et al. Nov 2002 A1
20020173300 Shtivelman et al. Nov 2002 A1
20020191029 Gillespie et al. Dec 2002 A1
20030013483 Ausems et al. Jan 2003 A1
20030016253 Aoki et al. Jan 2003 A1
20030043207 Duarte Mar 2003 A1
20030076306 Zadesky et al. Apr 2003 A1
20030095102 Kraft et al. May 2003 A1
20030122787 Zimmerman et al. Jul 2003 A1
20040001101 Trajkovic et al. Jan 2004 A1
20040021643 Hoshino et al. Feb 2004 A1
20040021681 Liao Feb 2004 A1
20040100510 Milic-Frayling et al. May 2004 A1
20040119763 Mizobuchi et al. Jun 2004 A1
20040125143 Deaton et al. Jul 2004 A1
20040141010 Fitzmaurice et al. Jul 2004 A1
20050035956 Sinclair et al. Feb 2005 A1
20050253818 Nettamo Nov 2005 A1
20070263822 Chang et al. Nov 2007 A1
20080165190 Min et al. Jul 2008 A1
20090006418 O'Malley Jan 2009 A1
20090285383 Tsuei Nov 2009 A1
Foreign Referenced Citations (6)
Number Date Country
0330767 Sep 1989 EP
0513694 Nov 1992 EP
0618528 Oct 1994 EP
0703525 Mar 1996 EP
8600446 Jan 1986 WO
8600447 Jan 1986 WO
Non-Patent Literature Citations (33)
Entry
“Launch 'Em 3.02”, Oct. 8, 2001, Synergy Solutions, Inc., pp. 1-20.
Carlson, Jeff, Visual Quickstart Guide Palm Organizers, Peachpit Press, 2000, Berkeley, CA, pp. xiii, 12, 25, 26, 28-30, 40, 47, 246 and 253.
Dulberg, et al., An Imprecise Mouse Gesture for the Fast Activation of Controls, Interact 1999, pp. 1-8.
Pogue, David, Palm Pilot: The Ultimate Guide, 2nd Edition, 1998, O'Reilly and Associates, Inc., pp. 1-17.
Streitelmeier, Julie, “Palm m100”, The Gadgeteer, 2000, <http://www.the-gadgeteer.com/review/palm_m100_review>, pp. 1-8.
Venolia et al., “T-Cube: A Fast, Self-Disclosing Pen-Based Alphabet”, Apr. 24, 1994, pp. 265-270.
SwitchHack 1.62, Jan. 17, 2001, downloaded from http://web.archive.org/web/200101170650/http://www.deskfree.com/switchhack.html.
swipe_readme.txt, May 8, 2001, downloaded in swipe.zip from http://web.archive.org/web/20010508013439/http://www.samsungshop.co.yu/palm/swipe.zip.
“Desk Accessories”, Oct. 18, 2000, downloaded from http://web.archive.org/web/20001018025151/http://www.doublebang.com/resources/da.html.
“Launch 'em 1.53”, Dec. 11, 1998, downloaded from http://www.5star-shareware.com/PDA/Palm/Utilities/launchem.html.
Pogue, David, Palm Pilot: The Ultimate Guide, 2nd Edition, 1999, O'Reilly and Associates, Inc., pp. 19, 27-36, 38-47, 80, 113, 494, 507, 509, 519-520.
Harrison, B. L., Fishkin, K. P., Gujar. A., Mochon, C. and Want, R., Squeeze Me, Hold Me, Tilt Me! An Exploration of Manipulative User Interfaces, Proceeding of the CHI '98 Conference on Human Factors in Computing Systems, ACM, Los Angeles, CA, Apr. 18-23, 1998, pp. 17-24.
Kurtenbach, G. P., Sellen, A. J. and Buxton, W. A. S., An Empirical Evaluation of Some Articulatory and Cognitive Aspects of Marking Menus, Human-Computer Interaction, vol. 8, Issue 1, L. Erlbaum Associates Inc., Hillsdale, NJ, Mar. 1993, pp. 1-23.
MacKenzie, I. S. and Soukoreff, R W., Text Entry for Mobile Computing: Models and Methods, Theory and Practice, Human-Computer Interaction, L. Erlbaum Associates Inc. Hillsdale, NJ, 2002, vol. 17, pp. 147-198.
Mynatt, E. D., Igarashi, T., Edwards, W. K. and LaMarca, A., Flatland: New Dimensions in Office Whiteboards, Proceeding of the CHI'99 Conference on Human Factors in Computing Systems, ACM, New York, NY, May 15-20, 1999, pp. 346-353.
Bier, E. A., Fishkin, K., Pier, K. and Stone, M. C., A Taxonomy of See-Through Tools: The Video, Proceeding of the CHI '95 Mosaic of Creativity, ACM, New York, NY, May 7-11, 1995, pp. 411-412.
Bier, E. A., Stone, M. C., Fishkin, K., Buxton, W. and Baudel, T., A Taxonomy of See-Through Tools, Conference on Human Factors in Computing Systems, CHI '94, Boston, MA, ACM, New York, NY, Apr. 24-28, 1994, pp. 358-364.
Damm, C. H., Hansen, K. M. and Thomsen, M., Tool Support for Cooperative Object-Oriented Design: Gesture Based Modeling on an Electronic Whiteboard, Proceeding of the CHI '00 Conference on Human Factors in Computing Systems, ACM, New York, NY, Apr. 1-6, 2000, pp. 518-525.
Jermyn, I., Mayer, A., Monrose, F., Reiter, M. K. and Rubin, A. D., The Design and Analysis of Graphical Passwords, Proceedings of the 8th USENIX Security Symposium, Washington, D.C., USA, Aug. 23-26, 1999.
Kurtenbach, G. and Buxton, W., The Limits of Expert Performance Using Hierarchic Marking Menus, Proceedings from INTERCHI 93—Conference on Human Factors in Computing Systems, ACM, New York, NY, Apr. 24-29, 1993, pp. 482-487.
Pirhonen, A., Brewster, S. and Holguin, C., Gestural and Audio Metaphors as a Means of Control for Mobile Devices, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: Changing our World, Changing Ourselves, ACM, New York, NY, Apr. 20-25, 2002, pp. 291-298.
C. Plaisant and D. Wallace, “Touchscreen toggle design”, CHI '92 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, May 3-7, 1992, pp. 667-668, ACM New York, NY, USA.
Brad A. Myers, “Mobile Devices for Control”, Mobile HCI 2002, F. Paterno (Ed.), LNCS 2411, pp. 1-8, 2002. Springer-Verlag, Berlin, Heidelberg.
Brad A. Myers, Kin Pou Lie and Bo-Chieh Yang, “Two-Handed Input Using a PDA and a Mouse”, CHI '2000, Apr. 1-6, 2000, CHI Letters, vol. 2, Issue 1, pp. 41-48, 2000, ACM, New York, NY, USA.
Brad A. Myers, “Using Handhelds and PCs Together”, Communications of the ACM, Nov. 2001, vol. 44, No. 11, pp. 34-41, ACM, New York, NY, USA.
Gordon Kurtenbach and William Buxton, “User Learning and Performance with Marking Menus”, CHI '94 Proceedings of the SIGCHI Conference on Human factors in Computing Systems, pp. 258-264, ACM, New York, NY, USA, Apr. 24-28, 1994.
Gordon Paul Kurtenbach, “The Design and Evaluation of Marking Menus”, 1993. Doctoral Thesis, Graduate Department of Computer Science, University of Toronto.
David A. Carr, “Specification of Interface Interaction Objects”, CHI '94 Proceedings of the SIGCHI Conference on Human factors in Computing Systems, pp. 372-378, 1994, ACM, New York, NY, USA.
David A. Carr, Ninad Jog, Harsha Prem Kumar, Marko Teittinen and Christopher Ahlberg, “Using Interaction Object Graphs to Specify Graphical Widgets”, Tech. Rep. ISR-TR-94-69, Sep. 1994, Institute for Systems Research, University of Maryland, College Park, MD.
Catherine Plaisant and Daniel Wallace, “Touchscreen Toggle Switches: Push or Slide? Design Issues and Usability Study”, Tech. Rep. CAR-TR-521, Nov. 1990, Human-Computer Interaction Laboratory,Center for Automation Research, Dept. of Psychology, University of Maryland, College Park, MD.
Mark A. Tapia and Gordon Kurtenbach, “Some Design Refinements and Principles on the Appearance and Behavior of Marking Menus”, UIST '95, pp. 189-195, ACM New York, NY, USA, Nov. 14-17, 1995.
Michael McGuffin, Nicolas Burtnyk and Gordon Kurtenbach, “FaST Sliders: Integrating Marking Menus and the Adjustment of Continuous Values”, Proc. Graphics Interface, May 2002, pp. 35-42.
Kenwood KVT-911DVD Instruction Manual, 2001, Japan.
Related Publications (1)
Number Date Country
20120192094 A1 Jul 2012 US
Continuations (2)
Number Date Country
Parent 13310755 Dec 2011 US
Child 13430718 US
Parent 10315250 Dec 2002 US
Child 13310755 US