Touch screen devices often have cumbersome on-screen user interfaces. Various ways of interacting with touch screens are known in the art, such as using a stylus pen or one or more fingers as input devices. The user experience may be further complicated when one or more fingers are used as an input device. Placing a cursor and selecting text can be difficult using a finger because precision is much lower than with other input devices, such as a mouse. For example, placing a cursor at a precise point within a word can be difficult due to the size of the finger relative to the size of the word.
A user may want to place a cursor so that text being displayed by a computing device may be edited. Similarly, text may be selected so that it may be copied, cut, or overwritten by pasting text or entering new text. These operations, which are known in the art, have proven difficult to implement with touch screen devices due to the imprecision of using one or more fingers to interact with a touch screen. The speed and ease of selecting text is also reduced when the user interface requires the user to enter complicated commands such as pressing and holding the selected text. These operations pose an even greater problem for portable electronic devices.
User interfaces known in the art display a cursor that may make it difficult for a user to discern the exact location where text will be inserted when entered by the user. Furthermore, when selecting text, present user interfaces often require that the user's finger block the portion of text being selected. Thus, these user interfaces often utilize an offset representation of the text being selected which requires unintuitive and unnecessary hand-eye coordination.
Selecting text on multiple lines can be difficult because the lines of text typically occupy a small vertical space relative to the size of a user's finger. It is also very difficult for humans to move their finger in a straight line. This results in errors when a user attempts to select text on a single line but the user's finger moves just outside of the vertical space defined by the line of text causing the computing device to interpret the user's input as purposefully changing lines.
In modern touch screen devices, users expect an intuitive and simple user interface that allows efficient operation of the device. Described herein are techniques for implementing a user interface with simple cursor placement and occlusion-free text selection. The user interface is optimized such that users of mobile devices, e.g., handheld devices, laptops, or tablet computers, may quickly and efficiently perform these operations.
Cursor placement may be achieved with a simple tap input from a user. Initially, a cursor may be placed coarsely. Upon further input from the user, the cursor may be placed more precisely. A visual indication of a location on the screen that the user may interact with, referred to as a “gripper,” may be displayed below the line of text with which it is associated. The user interface may also implement “safety zones” that allow the user to more accurately select text on a single line of text.
In some embodiments, a cursor may be placed on a display screen of a computing device by receiving a location indication from a user, wherein the indication from the user indicates some text or character string. An initial cursor location is selected based on the location indication, in combination with other information about the displayed content, and the cursor is displayed at that location, wherein the initial cursor location is coarsely positioned relative to the location indicated by the user. The computing device is then placed in a state in which execution of a function is based on the initial cursor location. A second location indication may then be received. A more precise cursor location may be selected based on the second location indication, and the cursor is displayed in the more precise cursor location.
In some embodiments, a computer system with a display screen, a sensor, and a processor implements a user interface for selecting text. A string of characters, such as text, is displayed on the display screen along with a “gripper.” A user may drag the gripper, as determined by the sensor, from a first location associated with a first character of the string to a second character of the string. The text between the first and second characters is highlighted, a gripper is again displayed at the first location, and a second gripper is displayed at a location corresponding to the second character of the string.
In some embodiments, multiple lines of text may be displayed by a display screen. A portion of the text may be selected, which may be indicated by highlighting the text. A user may adjust the portion of text that is selected by dragging an end point of the selected text. As the drag input is being received, the selected text is updated based on the current location the user is indicating. In a first mode, the device allows a relatively large threshold for error in the vertical location indicated by the user such that the end point does not change lines unless the user moves past the threshold distance. Once the threshold distance is passed and the device continues to receive the drag input from the user, the device enters a second mode where a threshold distance smaller than the relatively large threshold is used. Thus, in the second mode, a user can move the end point of the selected text from line to line by simply passing the relatively small threshold distance.
Some methods for selecting and highlighting may be executed by a processor of a computing system executing instructions stored on a computer readable storage device.
The foregoing is a non-limiting summary of the invention, which is defined by the attached claims.
The accompanying drawings are not intended to be drawn to scale. In the drawings, each identical or nearly identical component that is illustrated in various figures is represented by a like numeral. For purposes of clarity, not every component may be labeled in every drawing. In the drawings:
The inventors have recognized and appreciated that, to provide an easy-to-use and efficient user interface for a touch screen device, it is desirable for cursor placement to be simple and straightforward and to take into account the imprecision that results from using a finger to input commands to the device. The inventors have further recognized and appreciated that providing a user interface that allows a user to interact directly with text being selected, without a finger occluding the text, results in an intuitive, efficient user experience.
Embodiments of the invention are operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with the invention include, but are not limited to, personal computers, server computers, hand-held or laptop devices, cellular phones, tablet computers, netbooks, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
The computing environment may execute computer-executable instructions, such as program modules. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Embodiments of the invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
With reference to
Computer 110 may include a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 110 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 110. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
The system memory 130 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 131 and random access memory (RAM) 132. A basic input/output system 133 (BIOS), containing the basic routines that help to transfer information between elements within computer 110, such as during start-up, is typically stored in ROM 131. RAM 132 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120. By way of example, and not limitation,
The computer 110 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only,
The drives and their associated computer storage media, discussed above and illustrated in
A user may enter commands and information into the computer 110 through input devices such as a keyboard 162 and a pointing device 161, commonly referred to as a mouse, trackball or touch pad. These input devices may be present in some embodiments, but are not required for operation of computer 110. In some embodiments, a display screen 191 includes a touch screen sensor 172 that may receive inputs from a user's one or more fingers or other input device, such as a stylus or pen. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 120 through a user input interface 160 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB). The monitor 191 or other type of display device is also connected to the system bus 121 via an interface, such as a video interface 190.
OS 134 may comprise a touch screen engine associated with the touch screen. The touch screen engine receives input from the touch sensor 172, processes the input and provides information pertaining to the input to other components, such as other components of the OS 134, application programs 135, or other program modules 136. Inputs from the touch screen sensor 172 may indicate a selection of characters or other items displayed on the screen 191. The inputs may also indicate a position of a cursor, as selected by the user. The touch screen engine may also receive information from the other components and render the information on display screen 191. For example, the OS 134 may provide information to the touch screen engine to display in a context menu on display 191. Embodiments of the invention may be implemented to alter the way components identify selections and cursor locations. Embodiments may also alter the way the user interface is presented to indicate character selection and cursor location.
Computing device 200 comprises a display screen 250 for displaying one or more strings of characters 260. The strings may comprise any characters, for example, letters, numbers, punctuation, and the space character, to name a few. A string may be a single word, a sentence, a paragraph or any other collection of characters. The example embodiment of
The display screen 250 of computing device 200 is associated with one or more touch screen sensors 240. The combination of sensors and display screen may be referred to as a touch screen. Thus, the computing device 200 may be referred to as a touch screen device. The computing device 200 may employ any type of touch screen technology. For example, the touch screen may be resistive, capacitive, acoustic, infrared or any other touch screen technology. Input may be received by the touch screen sensors 240 by a stylus, pen or a user's body, such as one or more fingers. Embodiments of the invention are not limited to any particular implementation of touch screens.
In addition to accepting input via the one or more touch screen sensors 240, the computing device 200 may have one or more buttons 230 for accepting input from a user. The buttons may be on the front, back or sides of the computing device 200. They may be mechanical buttons, rotary input devices, capacitive buttons or any other type of input device known in the art. As discussed in connection with
A user of computing device 200 may want to perform functions on text 260 displayed on the display screen 250. To perform a function, the user may use the touch screen to indicate the desired placement of a cursor and/or the desired text to be selected. The cursor may be any visual indicator of a location, such as a caret or an arrow.
Functions may be dependent on the location of a cursor or on text that has been selected by the user. Functions may also depend on other settings of the device. For example, if a cursor is placed at a particular location and additional text is entered by the user, the additional text may be inserted at the location of the cursor or the additional text may overwrite existing text following the cursor based on whether the device is in an insert mode or an overwrite mode. A paste function may also be performed based on the placement of the cursor such that text that has previously been cut or copied may be inserted at the cursor location. Another example is selecting a delete command, which may delete one or more characters immediately adjacent to the cursor, such as characters before the cursor or after the cursor.
The same functions listed above may be executed when text is selected, but the behavior will be different based on text being selected. For example, if additional text is entered or pasted when text is selected, the selected text will be overwritten with the additional text. Selecting a delete command will delete the selected text. There may be additional functions available to the user when text is selected. For example, the selected text may be copied or cut. Also, the style of the selected text may be changed by the user. For example, the selected text may be made bold or italic, the font may be changed, or the size of the font may be changed.
This different behavior of functions, based on whether a cursor is at a specific location or text is selected, may be described as the device being in either a first state or a second state.
It should be noted that the computing device 200 may perform corrections to the user's selections such that the indicated location determined by the computing device 200 may not correspond exactly to the physical location at which a user touch was detected on the touch screen. For example, it is known for a user of a touch screen device to touch the screen at a location that is slightly lower than the actual location they wish to indicate. This is merely an issue of perspective that the computing device can automatically correct. Thus, embodiments of the present invention may use locations that have already been corrected to account for this effect and other similar effects.

The input received by the user via the touch screen sensor 240 may be any suitable input command. In some embodiments, the input may be a “tap input,” indicating that the user touched the screen for only a brief moment. The tap may be detected by techniques known in the art. A brief moment is typically less than one second. In some embodiments, the input may be a drag input, wherein the user touches the screen with a finger at a beginning location, creates a path by dragging the finger across the screen (while maintaining contact with the screen), and terminates the path by lifting the finger at an ending location. In other embodiments, the input may be a press and hold input, wherein the user touches the screen at a location, holds it for a period of time, and then terminates contact with the screen. Each of these types of input may be detected using different techniques as is known in the art. Embodiments of the invention are not limited to any particular type of input.
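By way of illustration only, the distinction among tap, drag, and press and hold inputs might be implemented as sketched below; the duration and distance thresholds, and the function name, are assumptions chosen for this sketch rather than values drawn from the description above.

```python
# Hypothetical sketch of classifying a completed touch input; the
# threshold values here are illustrative assumptions.
TAP_MAX_SECONDS = 1.0      # a "brief moment" is typically under one second
DRAG_MIN_DISTANCE = 10.0   # pixels of movement before a touch counts as a drag

def classify_touch(duration_seconds, distance_pixels):
    """Classify a finished touch by how long it lasted and how far it moved."""
    if distance_pixels >= DRAG_MIN_DISTANCE:
        return "drag"
    if duration_seconds <= TAP_MAX_SECONDS:
        return "tap"
    return "press_and_hold"
```

In practice a touch screen engine would evaluate these conditions as touch events arrive rather than after the touch completes, but the thresholds play the same role.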
Due to the size of a user's finger, which may be, for example, 1-2 cm in width or height, location indications received via a touch screen may be imprecise relative to items displayed on the screen, which may be fractions of 1 mm in size. Therefore, the user interface of some embodiments of the invention will, upon a first location indication from a user, place the cursor in an approximate location relative to the string of characters being selected. Then, if the user wishes to place the cursor more precisely, a second indication may be input to the device and the cursor will be placed at a more precise location associated with the input. This approximate, or rough, placement of the cursor may be implemented in any way, and embodiments of the invention are not limited in this respect. If the computing device places the cursor using approximate placement, the computing device may be said to be in a first state. If the computing device places the cursor more precisely, the computing device may be said to be in a second state. In some embodiments, whether the device will use precise placement or approximate placement will depend on characteristics of the objects being displayed. For example, if the display includes text that is larger than a predetermined threshold, then the device may only implement precise placement of the cursor. In some embodiments, the predetermined text size threshold may be related to the approximate size of a user's finger. Thus, the device may not use approximate placement when the items displayed on the screen are approximately the same size as the user's finger.
Approximate placement of a cursor 420 may be implemented in any suitable way. In some embodiments, an input is received from a user that indicates a particular string of characters on the display screen. The computing device 200 determines the input to be a command to place the cursor at a location associated with that word. The number of possible locations at which a cursor may be placed may be reduced compared to when precise placement is being used. Fewer possible locations results in coarse cursor placement, whereas fine cursor placement has a larger number of possible locations. For example, approximate cursor placement may only allow placement of the cursor at the beginning of a string 300, as shown in
In the above example where the beginning and the end of a string are the only options for approximately placing the cursor, the computing device 200 may determine whether to place the cursor at the beginning or the end in any suitable way. In some embodiments, as illustrated in
The string may be split into portions in any suitable manner. For example,
Furthermore, embodiments of the invention are not limited to splitting the string 300 into two portions. For example, in
If the user's input indicates a location corresponding to the beginning portion 370 of string 300, then the cursor 420 will be placed before the beginning letter 310 of the string 300, as shown in
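By way of example, and not limitation, the choice between the beginning and the end of a string during approximate placement, when the string is split into two portions at its midpoint, might be sketched as follows; the function name and coordinate scheme are assumptions of this sketch.

```python
# Illustrative sketch of approximate (coarse) cursor placement for a
# string split into a beginning portion and an ending portion.
def coarse_cursor_index(string_start_x, string_end_x, tap_x, string_length):
    """Return index 0 (before the first character) when the tap falls in
    the beginning portion of the string, and the index after the last
    character when the tap falls in the ending portion."""
    midpoint = (string_start_x + string_end_x) / 2.0
    return 0 if tap_x < midpoint else string_length
```

Splitting the string into more than two portions, as also described above, would simply map additional regions of the string's extent to additional candidate cursor positions.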
Once computing device 200 coarsely places the cursor 420, the user may wish to place the cursor 420 at a more precise location within the same string 300. The number of possible locations to place the cursor is greater than the number of locations available when using coarse placement. Precise placement, or fine placement, of a cursor 420 may be implemented in any suitable way. For example, if the user indicates a second location within the same string 300, then the computing device 200 will select a location corresponding to the second location from a plurality of possible placement locations, wherein there is a greater number of possible placement locations when the computing device 200 performs precise cursor placement than the number of possible placement locations that were available during approximate cursor placement. In some embodiments, the plurality of possible placement locations may comprise each location adjacent to each character of the string 300. For example, a user's first input may correspond to a first location and the cursor may be approximately placed at the beginning of the string 300 (see
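Precise placement may similarly be sketched, again with hypothetical names, as choosing whichever location adjacent to a character lies closest to the second input:

```python
# Illustrative sketch of precise (fine) cursor placement: the cursor is
# placed at whichever inter-character position lies closest to the tap.
def precise_cursor_index(boundary_xs, tap_x):
    """boundary_xs: x-coordinates of every position adjacent to a
    character (len(string) + 1 entries for a string).  Returns the index
    of the boundary closest to the tap location."""
    return min(range(len(boundary_xs)),
               key=lambda i: abs(boundary_xs[i] - tap_x))
```

Note that the set of candidate positions here (one per character boundary) is strictly larger than the two or three candidates used during approximate placement, which is what makes this mode "precise."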
In some embodiments, both the first input, for approximate cursor placement, and the second input, for precise cursor placement, may be tap inputs. This allows the user to place the cursor at a desired location quickly and accurately without relying on inputs such as a drag input or a press and hold input. Tap inputs are particularly advantageous in portable devices. However, embodiments of the invention are not limited to any particular type of input.
In some embodiments, when the display screen 250 displays cursor 420, a “gripper” 410 is also displayed. A gripper is a graphical indication on the screen with which a user may interact and convey further input information to the computing device 200. A gripper may be implemented in any suitable way. For example, the gripper 410 may be displayed below the cursor 420, as illustrated in
The aforementioned grippers are for exemplary purposes and embodiments of the invention are not limited to any particular shape. For example, any image or icon may be selected by the user for the gripper image. In this way, grippers may be personalized to the user's preferred experience. In some embodiments the grippers are always displayed below the line of text associated with the selected characters. In the case that text is not selected and, instead, a cursor is displayed, a gripper may be displayed below the cursor. Further, one of ordinary skill in the art would realize that there are many other variations of gripper shape and placement not shown that may be used and are covered by embodiments of the invention claimed herein. For example, if a vertical language is being displayed, grippers may be displayed to the left of the text.
As discussed above, the computing device 110 may receive input from various devices other than the touch screen sensor 172. In some embodiments, if the cursor 420 is being placed using these other devices, the touch screen engine may not display gripper 410 on display screen 191. For example, if the arrow keys on keyboard 162 or buttons 230 are used to move the cursor, the gripper may not be displayed.
Using grippers to select text will now be discussed in conjunction with
In some embodiments, the characters of the string are highlighted as the path 525, indicating a selection, is received from the user. For example, as the user's finger follows the path beneath the letter “r,” the highlighted portion will grow to encompass the letter. The highlighting may be a shading 520 of the background behind the characters being selected and may indicate to the user that the character is being selected. As the user's finger continues, the other letters traversed by path 525 become part of the highlighted portion until the user completes the drag input by lifting the finger from the display screen 250. The ending location of the drag input corresponds to a character 530 and the text between character 530 and character 320, corresponding to the original cursor location, will be selected. As noted above, because the path is below the string, the user may maintain visual line of sight with the characters being selected.
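The growth of the highlighted portion as the path is traversed can be sketched, in terms of character indices rather than screen coordinates, as follows; the function names are illustrative assumptions.

```python
def selected_span(anchor_index, current_index):
    """Slice bounds of the highlighted characters, spanning inclusively
    from the character where the selection began to the character the
    drag input currently indicates (in either direction)."""
    lo, hi = sorted((anchor_index, current_index))
    return lo, hi + 1

def highlighted_text(text, anchor_index, current_index):
    """The substring currently highlighted during a drag."""
    start, end = selected_span(anchor_index, current_index)
    return text[start:end]
```

Because the span is recomputed from the current location along the path, the highlighting grows or shrinks as the finger moves, and a backwards drag selects the same characters as a forward one.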
In some embodiments the touch screen engine will not display gripper 410 and/or the cursor 420 while the drag input is being entered by the user. This removes unnecessary graphics that may distract or confuse the user and allows the user to simply concentrate on the characters being highlighted.
In some embodiments, touch screen sensor 172 may receive a drag input from the user corresponding to gripper 410, shown in
In some embodiments, the touch screen engine changes the shape and/or location of the grippers so that the two grippers do not overlap.
Once the user selects text and the touch screen engine highlights the selected text, as illustrated in
Similarly, even when text is not selected and only a single gripper is displayed, as illustrated in
The above discussion of character selection was limited to text on a single line. However, selecting a subset of characters from a text block that occupies multiple lines is also an aspect of some embodiments of the invention. Specifically, some embodiments determine when a user intends to select text on a different line than the line of text associated with the initiation of a drag input from the user. Input from a user's finger is imprecise, and humans tend to trace paths on touch screens that are not straight and deviate from the user's intended path. Thus, some embodiments of the invention provide a “safety zone” associated with each line of text. A safety zone is a region surrounding a line of text wherein, as long as the path associated with a drag input from a user remains within the safety zone, the computing device 200 will determine that the user intended for the selection to stay on the same line of text. Safety zones also ensure that a selection initiated by the user begins on the line with which the cursor was originally associated. In some embodiments, the threshold distance away from a line of text that defines the boundary of a safety zone may change based on the actions of the user.
As an example of some embodiments of the invention,
A user may enter a drag input beginning at the gripper 714, following a path 716, which terminates in a location associated with the second line 720. A safety zone boundary 740 is illustrated by a dashed line in
The second mode is implemented after the initial boundary 740 is traversed and while the drag input is still being input by the user. When the computing device 200 operates in the second mode, each line has a safety zone boundary that is a shorter distance away from the line of text than when the computing device 200 operates in the first mode. For example, the safety zone boundary could have the same reduced distance threshold as boundary 741.
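The two-mode behavior of the safety zones might be sketched as follows; the line height, both threshold distances, and the class name are illustrative assumptions, not values taken from the description above.

```python
# Illustrative sketch of the two-mode safety zone for multi-line selection.
LARGE_THRESHOLD = 40.0  # first mode: generous vertical tolerance (pixels)
SMALL_THRESHOLD = 10.0  # second mode: reduced tolerance after a line change

class SafetyZoneTracker:
    """Tracks which line a drag's end point is on.  The tracker starts in
    a first mode with a large threshold; once the drag crosses a safety
    zone boundary, it switches to a second mode with a small threshold
    for the remainder of the drag."""

    def __init__(self, line_height):
        self.line_height = line_height
        self.threshold = LARGE_THRESHOLD  # begin in the first mode
        self.line = 0                     # line offset from the starting line

    def update(self, y_offset_from_start_line):
        """y_offset_from_start_line: vertical distance (pixels) of the
        drag's current position from the center of the starting line.
        Returns the line the end point should be on."""
        rel = y_offset_from_start_line - self.line * self.line_height
        while abs(rel) > self.threshold:
            # Boundary crossed: move the end point one line and enter
            # the second mode with its smaller threshold.
            self.line += 1 if rel > 0 else -1
            self.threshold = SMALL_THRESHOLD
            rel = y_offset_from_start_line - self.line * self.line_height
        return self.line
```

Under these assumed values, a finger that wanders up to 40 pixels off the starting line still selects on that line, but after the first deliberate line change only a 10-pixel excursion is needed to move the end point to an adjacent line.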
A user may enter a third drag input using the touch screen of computing device 200.
In some embodiments, as drag inputs are received from the user, the highlighting 722 is updated based on the current location along the path associated with each drag input. In some embodiments, the grippers 714 and 724 may not be displayed while the drag input is being received.
The safety zones discussed above do not apply to the direction in which the lines of text are oriented. For example, in the above discussion, safety zones only applied to the direction perpendicular to the lines of text, e.g., the vertical direction. Determining the end points of the highlighted portion of text along the direction of the line of text, e.g., the horizontal direction, may be done in the same way as described in conjunction with
Having thus described several aspects of at least one embodiment of this invention, it is to be appreciated that various alterations, modifications, and improvements will readily occur to those skilled in the art.
Such alterations, modifications, and improvements are intended to be part of this disclosure, and are intended to be within the spirit and scope of the invention. Further, though advantages of the present invention are indicated, it should be appreciated that not every embodiment of the invention will include every described advantage. Some embodiments may not implement any features described as advantageous herein. Accordingly, the foregoing description and drawings are by way of example only.
The above-described embodiments of the present invention can be implemented in any of numerous ways. For example, the embodiments may be implemented using hardware, software or a combination thereof. When implemented in software, the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers. Such processors may be implemented as integrated circuits, with one or more processors in an integrated circuit component. Though, a processor may be implemented using circuitry in any suitable format.
The various methods or processes outlined herein may be coded as software that is executable on one or more processors that employ any one of a variety of operating systems or platforms. Additionally, such software may be written using any of a number of suitable programming languages and/or programming or scripting tools, and also may be compiled as executable machine language code or intermediate code that is executed on a framework or virtual machine.
The terms “program” or “software” are used herein in a generic sense to refer to any type of computer code or set of computer-executable instructions that can be employed to program a computer or other processor to implement various aspects of the present invention as discussed above. Additionally, it should be appreciated that according to one aspect of this embodiment, one or more computer programs that when executed perform methods of the present invention need not reside on a single computer or processor, but may be distributed in a modular fashion amongst a number of different computers or processors to implement various aspects of the present invention.
Computer-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Typically the functionality of the program modules may be combined or distributed as desired in various embodiments.
Also, data structures may be stored in computer-readable media in any suitable form. For simplicity of illustration, data structures may be shown to have fields that are related through location in the data structure. Such relationships may likewise be achieved by assigning storage for the fields with locations in a computer-readable medium that conveys relationship between the fields. However, any suitable mechanism may be used to establish a relationship between information in fields of a data structure, including through the use of pointers, tags or other mechanisms that establish relationship between data elements.
Various aspects of the present invention may be used alone, in combination, or in a variety of arrangements not specifically discussed in the embodiments described in the foregoing, and the invention is therefore not limited in its application to the details and arrangement of components set forth in the foregoing description or illustrated in the drawings. For example, aspects described in one embodiment may be combined in any manner with aspects described in other embodiments.
Also, the invention may be embodied as a method, of which an example has been provided. The acts performed as part of the method may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.
For example, one embodiment disclosed above is directed to approximate and precise placement of a cursor within a string of text. Another embodiment is directed to selection of a string of text using grippers. These embodiments may be combined such that approximate and precise placement are used while performing text selection for a gripper. For example, the ending location of a drag input from a user may initially be determined using approximate placement. Then, upon further input from the user, the gripper may be placed more precisely. Any suitable input from the user may be received. For example, the user may provide a tap input within the selected string of text or provide an additional drag input associated with the gripper to precisely place the gripper.
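The combined behavior described above can be sketched as follows. This is a hypothetical illustration, not the patented implementation: the function names, the choice of snapping to word boundaries for approximate placement, and the use of a character offset for precise placement are all assumptions made for the example.

```python
def snap_to_word_boundary(text, offset):
    """Approximate placement: snap a raw touch offset to the nearest word edge
    (a hypothetical policy; the specification does not mandate word snapping)."""
    offset = max(0, min(offset, len(text)))
    if offset == 0 or offset == len(text):
        return offset
    left = text.rfind(" ", 0, offset) + 1     # start of the current word
    right = text.find(" ", offset)            # end of the current word
    if right == -1:
        right = len(text)
    return left if offset - left <= right - offset else right

def place_gripper(text, drag_end_offset, refine_offset=None):
    """Place a selection gripper: approximately at the end of a drag input,
    then precisely if the user provides a further input (e.g. a tap)."""
    if refine_offset is not None:
        return max(0, min(refine_offset, len(text)))      # precise placement
    return snap_to_word_boundary(text, drag_end_offset)   # approximate placement

text = "select this text"
assert place_gripper(text, 9) == 7                    # drag end snaps to a word boundary
assert place_gripper(text, 9, refine_offset=9) == 9   # further input places it exactly
```

The two-stage design keeps the common case fast (a single drag lands on a plausible boundary) while still allowing character-level precision when the user asks for it.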
Use of ordinal terms such as “first,” “second,” “third,” etc., in the claims to modify a claim element does not by itself connote any priority, precedence, or order of one claim element over another or the temporal order in which acts of a method are performed; such terms are used merely as labels to distinguish one claim element having a certain name from another element having the same name (but for use of the ordinal term).
Also, the phraseology and terminology used herein are for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” “having,” “containing,” “involving,” and variations thereof herein, is meant to encompass the items listed thereafter and equivalents thereof as well as additional items.
Number | Name | Date | Kind |
---|---|---|---|
5040131 | Torres | Aug 1991 | A |
5345543 | Capps et al. | Sep 1994 | A |
5465325 | Capps et al. | Nov 1995 | A |
5513309 | Meier et al. | Apr 1996 | A |
5523775 | Capps | Jun 1996 | A |
5613019 | Altman et al. | Mar 1997 | A |
5649133 | Arquie | Jul 1997 | A |
5710831 | Beernink et al. | Jan 1998 | A |
5778404 | Capps et al. | Jul 1998 | A |
5867144 | Wyard | Feb 1999 | A |
6057844 | Strauss | May 2000 | A |
6336124 | Alam et al. | Jan 2002 | B1 |
6344865 | Matthews et al. | Feb 2002 | B1 |
6381593 | Yano et al. | Apr 2002 | B1 |
6525749 | Moran et al. | Feb 2003 | B1 |
6587132 | Smethers | Jul 2003 | B1 |
6628315 | Smith Dawkins | Sep 2003 | B1 |
6687875 | Suzuki | Feb 2004 | B1 |
7185291 | Wu et al. | Feb 2007 | B2 |
7574664 | Jaeger | Aug 2009 | B2 |
7617443 | Mills et al. | Nov 2009 | B2 |
7692629 | Baudisch et al. | Apr 2010 | B2 |
7778821 | Mowatt et al. | Aug 2010 | B2 |
8009146 | Pihlaja | Aug 2011 | B2 |
8042042 | Kim et al. | Oct 2011 | B2 |
8091045 | Christie et al. | Jan 2012 | B2 |
20020056575 | Keely | May 2002 | A1 |
20020067346 | Mouton | Jun 2002 | A1 |
20020097270 | Keely et al. | Jul 2002 | A1 |
20020122197 | Abir | Sep 2002 | A1 |
20040056875 | Jaeger | Mar 2004 | A1 |
20040080541 | Saiga et al. | Apr 2004 | A1 |
20040141009 | Hinckley et al. | Jul 2004 | A1 |
20040150664 | Baudisch | Aug 2004 | A1 |
20050083485 | Toshima et al. | Apr 2005 | A1 |
20050193321 | Iwema et al. | Sep 2005 | A1 |
20050198561 | McAuley | Sep 2005 | A1 |
20050210369 | Damm, Jr. | Sep 2005 | A1 |
20060005151 | Altman | Jan 2006 | A1 |
20060036945 | Radtke et al. | Feb 2006 | A1 |
20060136807 | Yalovsky et al. | Jun 2006 | A1 |
20060161846 | Van Leeuwen | Jul 2006 | A1 |
20060244735 | Wilson | Nov 2006 | A1 |
20060253803 | Backlund | Nov 2006 | A1 |
20070101292 | Kupka | May 2007 | A1 |
20070115264 | Yu et al. | May 2007 | A1 |
20070157085 | Peters | Jul 2007 | A1 |
20070260981 | Kim et al. | Nov 2007 | A1 |
20070294644 | Yost | Dec 2007 | A1 |
20080165136 | Christie et al. | Jul 2008 | A1 |
20080165142 | Kocienda et al. | Jul 2008 | A1 |
20080259040 | Ording et al. | Oct 2008 | A1 |
20090002326 | Pihlaja | Jan 2009 | A1 |
20090109182 | Fyke et al. | Apr 2009 | A1 |
20090128505 | Partridge et al. | May 2009 | A1 |
20090167700 | Westerman et al. | Jul 2009 | A1 |
20090213134 | Stephanick et al. | Aug 2009 | A1 |
20090228792 | van Os et al. | Sep 2009 | A1 |
20090228842 | Westerman et al. | Sep 2009 | A1 |
20090295826 | Good et al. | Dec 2009 | A1 |
20100042933 | Ragusa | Feb 2010 | A1 |
20100070281 | Conkie et al. | Mar 2010 | A1 |
20100134425 | Storrusten | Jun 2010 | A1 |
20100171713 | Kwok et al. | Jul 2010 | A1 |
20100235726 | Ording et al. | Sep 2010 | A1 |
20100235729 | Kocienda et al. | Sep 2010 | A1 |
20100235793 | Ording | Sep 2010 | A1 |
20100245261 | Karlsson | Sep 2010 | A1 |
20100269039 | Pahlavan et al. | Oct 2010 | A1 |
20100289752 | Birkler | Nov 2010 | A1 |
20100289757 | Budelli | Nov 2010 | A1 |
20100293460 | Budelli | Nov 2010 | A1 |
20100299587 | Swett | Nov 2010 | A1 |
20100309147 | Fleizach et al. | Dec 2010 | A1 |
20100328227 | Matejka et al. | Dec 2010 | A1 |
20110018812 | Baird | Jan 2011 | A1 |
20110081083 | Lee et al. | Apr 2011 | A1 |
20110107211 | Chu et al. | May 2011 | A1 |
20110131481 | Vronay et al. | Jun 2011 | A1 |
20110163968 | Hogan | Jul 2011 | A1 |
20110202835 | Jakobsson et al. | Aug 2011 | A1 |
20110225525 | Chasman et al. | Sep 2011 | A1 |
20110264993 | Leong et al. | Oct 2011 | A1 |
20120013539 | Hogan et al. | Jan 2012 | A1 |
20120013540 | Hogan | Jan 2012 | A1 |
20120030566 | Victor | Feb 2012 | A1 |
20120032979 | Blow et al. | Feb 2012 | A1 |
20120139844 | Ramstein et al. | Jun 2012 | A1 |
20120151394 | Locke | Jun 2012 | A1 |
20120185805 | Louch et al. | Jul 2012 | A1 |
20120272179 | Stafford | Oct 2012 | A1 |
20120306772 | Tan et al. | Dec 2012 | A1 |
20120311507 | Murrett et al. | Dec 2012 | A1 |
20130002719 | Ide | Jan 2013 | A1 |
20130024820 | Kirkpatrick | Jan 2013 | A1 |
20140002377 | Brauninger et al. | Jan 2014 | A1 |
Number | Date | Country |
---|---|---|
101068411 | Nov 2007 | CN |
101910988 | Dec 2010 | CN |
2098947 | Sep 2009 | EP |
2301758 | Dec 1996 | GB |
09244813 | Sep 1997 | JP |
2006185443 | Jul 2006 | JP |
2007299394 | Nov 2007 | JP |
2007097644 | Aug 2007 | WO |
2009085779 | Jul 2009 | WO |
2010040216 | Apr 2010 | WO |
2010107653 | Sep 2010 | WO |
Entry |
---|
“International Search Report”, Mailed Date: Sep. 27, 2012, Application No. PCT/US2011/055765, Filed Date: Oct. 11, 2011, pp. 9. |
Apple, Inc., iPhone User Guide for iOS 4.2 and 4.3 Software, Chapter 3 “Basics,” pp. 37-40, published Mar. 11, 2011. |
Benko et al., “Precise selection techniques for multi-touch screens,” Proc. CHI '06, published Apr. 22-28, 2006, downloaded Jun. 21, 2011 http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.67.6740&rep=rep1&type=pdf. |
Vogel et al., “Shift: a technique for operating pen-based interfaces using touch,” Proc. Chi '07, pp. 657-666, published Apr. 28, 2007, downloaded Jun. 21, 2011, http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.79.2710&rep=rep1&type=pdf. |
Int'l Preliminary Examination Report, including Written Opinion, mailed Mar. 12, 2014, in connection with PCT Appl. Ser. No. PCT/US2011/055765. |
First office action, mailed Jun. 5, 2013 in related application U.S. Appl. No. 13/683,244. |
Second office action, mailed Nov. 15, 2013 in related application U.S. Appl. No. 13/683,244. |
“Documents To Go for iPhone/iPad/iPod Touch Help”, Retrieved From <<www.dataviz.com/DTG_iphone_manual.html#_Working_with_Excel>>, Mar. 22, 2012, 13 Pages. |
Xu, et al., “Enabling Efficient Browsing and Manipulation of Web Tables on Smartphone”, In Proceedings of Human-Computer Interaction: Towards Mobile and Intelligent Interaction Environments, vol. 6763, 10 Pages. |
“How to Highlight, Cut, Copy and Paste Text Using a BlackBerry Smartphone”, Retrieved from <<http://helpblog.blackberry.com/2011110/edit-text-blackberry>>, Oct. 26, 2011, 6 Pages. |
“Numbers for iOS (iPhone, iPod touch): Add, Remove, Resize, and Rearrange Table Rows and Columns”, Retrieved from <<http://web.archive.org/web/20141012231847/http://support.apple.com/kb/PH3417?viewlocale=en_US>>, Mar. 23, 2012, 2 Pages. |
“Office Action Issued in European Patent Application No. 11872529.0”, Mailed Date: Jan. 5, 2016, 5 Pages. |
“Amendment with RCE Filed in U.S. Appl. No. 13/540,594”, filed Apr. 6, 2015, 14 Pages. |
“Final Office Action Issued in U.S. Appl. No. 13/540,594”, Mailed Date: Dec. 4, 2014 and Amendment Dated: Oct. 30, 2014, 42 Pages. |
“Non-Final Office Action Issued in U.S. Appl. No. 13/540,594”, Mailed Date: Jul. 31, 2014, and Application Filed: Jul. 2, 2012, 61 Pages. |
“Non-Final Office Action Issued in U.S. Appl. No. 13/540,594”, Mailed Date: Jun. 3, 2015, 23 Pages. |
“Final Office Action Issued in U.S. Appl. No. 13/557,212”, Mailed Date: Jan. 2, 2015, 25 Pages. |
“Non-Final Office Action Issued in U.S. Appl. No. 13/557,212”, Mailed Date: Apr. 25, 2014, 21 Pages. |
“Non-Final Office Action Issued in U.S. Appl. No. 13/557,212”, Mailed Date: Nov. 23, 2015, 37 Pages. |
“International Search Report & Written Opinion Issued in PCT Application No. PCT/US2013/048993”, Mailed Date: Sep. 30, 2013, 8 Pages. |
“International Preliminary Report on Patentability for International Application No. PCT/US2013/051749”, Mailed Date: Jan. 12, 2015, 8 Pages. |
“International Search Report & Written Opinion Issued in PCT Application No. PCT/US2013/051749”, Mailed Date: May 21, 2014, 17 Pages. |
“Second Written Opinion Issued in PCT Application No. PCT/US2013/051749”, Mailed Date: Sep. 25, 2014, 7 Pages. |
Walkenbach, John, “Excel 2007 Bible”, A Wiley Publication, Jan. 3, 2007, 9 Pages. |
“Supplementary Search Report Issued in European Patent Application No. 11872529.0”, Mailed Date: Mar. 18, 2015, 10 Pages. |
“Non-Final Office Action Issued in U.S. Appl. No. 13/683,244”, Mailed Date: Jan. 6, 2015, 18 Pages. |
“Notice of Allowance Issued in U.S. Appl. No. 13/683,244”, Mailed Date: Oct. 23, 2015, 9 Pages. |
“First Office Action and Search Report Issued in Chinese Patent Application No. 201210335577.9”, Mailed Date: Aug. 8, 2014, 12 Pages. |
“Second Office Action and Search Report Issued in Chinese Patent Application No. 201210335577.9”, Mailed Date: Apr. 14, 2015, 11 Pages. |
Number | Date | Country | |
---|---|---|---|
20130067373 A1 | Mar 2013 | US |