The disclosed embodiments relate generally to user interfaces, and more particularly to a zooming user interface.
With the increasing popularity of mobile devices, including cellphone devices, handheld devices, handheld computers, smartphones, PDAs, etc., there is a need for improving the user interface experience.
Mobile devices with capacitive or resistive touch capabilities are well known. Modern mobile phones have evolved over recent years to the point where they now possess a broad range of capabilities. They are not only capable of placing and receiving mobile phone calls, sending multimedia messages (MMS), and sending and receiving email; they can also access the Internet, are GPS-enabled, possess considerable processing power and large amounts of memory, and are equipped with high-resolution color liquid crystal displays capable of detecting touch input. As such, today's mobile phones are general purpose computing and telecommunication devices capable of running a multitude of applications. For example, modern mobile phones can run word processing, web browser, navigation system, media player, and gaming applications.
Along with these enhanced capabilities has come a demand for larger displays to provide a richer user experience. Mobile phone displays have increased in size to the point where they can now consume almost the entire viewing surface of a phone. To increase the size of displays any further would require an increase in the size of the phones themselves. Even with the display size being at its maximum, the content on the display remains relatively small. Due to the size of content in the display, a finger touching the display can obscure the very content being manipulated, making precise operations difficult. As a result, using touch screen user interfaces can often obscure text and provide inconsistent results.
Among other innovations described herein, various tools and techniques are disclosed for using a single-finger single touch to zoom content and interact with the zoomed content. According to one aspect of the techniques and tools described herein, a single-finger single touch on a touch screen displaying at least a page of content is detected. At least in response to the detecting of the single-finger single touch, a page zoom is performed.
According to another aspect of the techniques and tools described herein, text in a page of content displayed in a touch screen is selected. A single-finger single touch with the touch screen on the selected text is detected, and based on the detecting of the single-finger single touch, a page zoom is performed. A dragging movement of the single-finger single touch along the touch screen is detected, and based at least on the detecting of the dragging movement, revealed text is scrolled into display and at least a portion of the revealed text is selected. A removal of the single-finger single touch is detected, and based at least on the detecting of the removal, a page of content is zoomed out.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
The foregoing and other objects, features, and advantages of the invention will become more apparent from the following Detailed Description, which proceeds with reference to the accompanying figures.
In one implementation of a page zoom, when content in a page of content is being zoomed, the content can be displayed as expanding to various sizes until zoomed to a particular size for the zoomed page of content. In another implementation of a page zoom, when content is being zoomed, the content can be displayed without expanding through various sizes before reaching the particular size for the zoomed page of content. For example, the zoomed content can be displayed at the larger zoomed scale without transitioning through various displayed sizes.
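For illustration, the following is a minimal sketch of these two presentations in a browser-style environment, assuming the page of content is wrapped in a single container element; the element id, zoom factor, and duration are illustrative assumptions rather than details from the disclosure:

```typescript
// Sketch of the two page-zoom presentations described above; the element
// id, zoom factor, and duration are illustrative assumptions.
const page = document.querySelector<HTMLElement>('#page-content')!;

// Animated variant: the content is displayed expanding through various
// sizes until it reaches the particular zoomed size.
function pageZoomAnimated(scale: number, durationMs = 200): void {
  page.style.transition = `transform ${durationMs}ms ease-out`;
  page.style.transform = `scale(${scale})`;
}

// Instant variant: the content is displayed at the larger zoomed scale
// without transitioning through intermediate displayed sizes.
function pageZoomInstant(scale: number): void {
  page.style.transition = 'none';
  page.style.transform = `scale(${scale})`;
}
```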
In any of the examples herein, content can include, for example, application content, system content, editable text in a text editing field, un-editable text (e.g., text displayed in a webpage), icons, fields, application windows, images, pictures, multimedia objects, emoticons, a cursor, a cursor placement tool, or other graphics displayed within a touch screen. In one implementation of application content, application content is content rendered at least using a launched application or software executing on a computing device. For example, a web browser application can display a webpage as application content. In one implementation of system content, system content is content rendered or displayed by an operating system. For example, in a touch screen display for a mobile device, system content can include a cursor placement tool, a battery power indicator, a displayed time, a signal indicator, and other content displayed by a mobile device operating system.
For example, a text editing field can be a field in a display where text can be manipulated for editing. A text edit field can include a cursor, and can allow common text editing functions including but not limited to adding and deleting text in the field. In various implementations of a text editing field, text can be added to and deleted from a text editing field at a position indicated by a cursor in the field. A user can often manipulate the text in a text editing field using a physical keyboard, or a soft keyboard displayed on the touch screen.
The single-finger single touch can be detected by continuously detecting a continuous contact with the touch screen or by periodically checking whether a contact with the touch screen is maintained. For example, if a consecutive periodic check for a contact at or local to (e.g., near) a previously checked point of contact on the touch screen indicates that a contact is being made at the checked touch screen location, then a continuous contact can be detected. However, if the consecutive periodic check indicates that a contact is not being made at or local to the previously checked point of contact, then a break of contact with the touch screen can be detected. A point of contact can be contact with a small portion of the touch screen that is detected as a single point of contact. For example, the single point of contact can be derived from an area of contact, such as the area under a finger that comes in contact with a touch screen when pressed against it. In another embodiment, a point of contact can be a representative point of a larger continuous area contacted on the touch screen. A single-finger single touch can be made with a finger, stylus, or other tool capable of contacting a touch screen at a point.
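One possible sketch of the periodic-check approach, assuming DOM touch events; the polling interval and the radius counted as "local to" the last point are illustrative assumptions:

```typescript
// Sketch: touch events feed the latest reading, and a timer periodically
// verifies that contact is still being made at or local to the previously
// checked point. POLL_MS and NEAR_PX are illustrative assumptions.
const POLL_MS = 50;
const NEAR_PX = 24;

type Point = { x: number; y: number };
let reading: Point | null = null;     // latest raw touch reading
let lastChecked: Point | null = null; // point recorded at the last check
let continuousContact = false;        // result of the consecutive checks

document.addEventListener('touchstart', (e: TouchEvent) => {
  if (e.touches.length === 1) { // single-finger single touch only
    reading = { x: e.touches[0].clientX, y: e.touches[0].clientY };
  }
});
document.addEventListener('touchmove', (e: TouchEvent) => {
  if (reading !== null && e.touches.length === 1) {
    reading = { x: e.touches[0].clientX, y: e.touches[0].clientY };
  }
});
document.addEventListener('touchend', () => {
  reading = null; // nothing for the next periodic check to find
});

window.setInterval(() => {
  // Contact at or local to the previously checked point means the
  // continuous contact is maintained; otherwise a break is detected.
  continuousContact =
    reading !== null &&
    lastChecked !== null &&
    Math.hypot(reading.x - lastChecked.x, reading.y - lastChecked.y) <= NEAR_PX;
  lastChecked = reading;
}, POLL_MS);
```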
In one implementation of a page zoom, the page zoom scales or zooms content at a page level such that content in the visible area, and possibly content outside the visible area, is zoomed in. In another implementation, the content to be page zoomed is included in a page of content and the content is scaled together as a unit. Some of the content of the page of content can be displayed while other content of the page of content is not displayed. For example, a web page displayed in a web browser can be a page of content. A portion of the web page can be displayed while other portions of the web page are not displayed because such portions are outside of the viewable area. However, the portions of the web page that are not displayed can be brought into display, such as by scrolling the web page. In some implementations, a page zoom scales or zooms the displayed page of content at the page level or display level and does not display a zoomed copy of the content at the same time the original sized content is displayed. That is to say, a page zoom does not produce a magnifying glass or magnifying window in the display that shows a magnified copy of displayed content at the same time as the unmagnified content or a portion of the unmagnified content. In some implementations, a page of content can include a web page in a web browser, a document in a word processing application, a spreadsheet, the content displayed in a display, or another page of content displayed in a touch screen by an application or computing device. In some implementations of a page zoom, the system content around the page of content is also zoomed or scaled along with the page of content. For example, if the content of a single-line text edit box is zoomed as a page of content, the content around the text edit box, including system content, can also be zoomed proportionally so that the text edit box can be displayed at its zoomed scale without being obscured by system content. That is to say, the content (e.g., application content and system content) in the display can be zoomed at the display level or a level of the display. In some implementations of performing a page zoom at the display level, the content of the entire display is zoomed such that at least the portions of the content located at one or more edges of the display are no longer displayed in the touch screen when zoomed. In another implementation of a page zoom at the display level, a portion of the content in the display is zoomed such that one or more edge portions of that content are no longer displayed in the touch screen when zoomed.
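As one possible sketch of such a display-level zoom, assuming the whole display's content is rendered under one root element; the selector and the anchoring approach are illustrative assumptions:

```typescript
// Sketch: a display-level page zoom that scales the page of content and
// surrounding content together as a unit, expanding from the location of
// the touch so that edge portions move off screen rather than opening a
// separate magnifier window over unmagnified content.
const root = document.querySelector<HTMLElement>('#display-root')!;

function pageZoomAt(touchX: number, touchY: number, scale: number): void {
  // Anchoring the transform at the point of contact keeps the content
  // under the finger in place while everything expands around it.
  root.style.transformOrigin = `${touchX}px ${touchY}px`;
  root.style.transform = `scale(${scale})`;
}
```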
At block 630, a dragging movement of the single-finger single touch is detected. For example, a user pressing a finger against the touch screen can drag the finger along the touch screen surface to a new point while maintaining continuous contact, and the dragging of the finger can be detected. At block 640, in response to the detecting of the dragging movement of the single-finger single touch, additional text is selected. In one example implementation, text is selected based on a location of the single-finger single touch on the touch screen. Additional selectable text can be selected between the already selected text and a location in the selectable text relative to the location of the single-finger single touch due to the dragging movement. For example, that location can be the detected single point of contact of the single-finger single touch, or a location in the text displaced a predetermined distance from the detected single point of contact. The dragging movement can be detected as moving in one or more directions. For example, contact movement can be in an upward, downward, leftward, rightward, or diagonal direction, or a combination of directions. The selection of text can be at the character level, word level, line level, or other level of text. The selection of text can include a wrapping selection of text that selects consecutive lines of text up to a predetermined location in the text.
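A minimal sketch of extending the selection during the drag, assuming a browser environment; caretRangeFromPoint is a WebKit/Blink API for mapping a screen point to a caret position, and the offset that targets text displaced above the fingertip is an illustrative assumption:

```typescript
// Sketch: extend the selection between the already selected text and a
// location relative to the dragged single-finger single touch.
const FINGER_OFFSET_PX = 40; // select text displaced above the contact point

function extendSelectionTo(touchX: number, touchY: number): void {
  const caret = document.caretRangeFromPoint(touchX, touchY - FINGER_OFFSET_PX);
  if (caret === null) return;
  const selection = window.getSelection();
  if (selection === null || selection.rangeCount === 0) return;
  // Keep the original anchor and move the focus, so all text between the
  // already selected text and the drag location becomes selected.
  selection.extend(caret.startContainer, caret.startOffset);
}
```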
In some implementations, text can be selected by highlighting the text in the touch screen display, and/or changing the displayed color of the text in the display. For example, a word can be highlighted when an area of the display that contains the word is changed in color or designated visually in some other manner. In other implementations, selected text can be displayed with a different color than the color of unselected text, or displayed as underlined. In yet other implementations, selection of text can persist in the touch screen display until the selection is canceled, deleted, or removed. In some implementations, selected text can be marked for common editing functions, such as copying, deleting, or other functions.
At block 650, in response to the dragging movement of the single-finger single touch, text not in the previously displayed zoomed content is displayed when the zoomed page of content is scrolled in the touch screen display. For example, while selecting text, whenever the single-finger single touch reaches an area near (e.g., very close to) or around a predetermined boundary, automatic scrolling can occur, and text scrolled into view can be selected. A predetermined boundary can include the edge of a text box, the edge of a display, the edge of a display window, the edge of the touch screen, or another boundary in a touch screen. The automatic scrolling can scroll vertically (e.g., up/down), horizontally (e.g., right/left), diagonally, or in some other direction depending upon the direction of the dragging motion and one or more boundaries. Scrolling can move undisplayed text into a text editing field for display. The amount of scrolling can be relative to the distance of the single-finger single touch from the predetermined boundary. In one example, where the boundary is the edge of the display of the touch screen, the remainder of the page of content from the location of the single-finger single touch in the direction of the scrolling is mapped proportionally to the distance between the location of the single-finger single touch in the display and the boundary. As the single-finger single touch approaches the boundary, the page of content is proportionally scrolled to display content. When the single-finger single touch is at the boundary, the edge of the page of content is displayed. This allows a user to scroll to the edge of a zoomed page of content without breaking contact with the touch screen display.
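A sketch of that proportional mapping for the case where the boundary is the right edge of the display; baseScroll is assumed to be the container's scrollLeft captured when the finger first entered the scroll zone, and the zone width is an illustrative assumption:

```typescript
// Sketch: map the remaining undisplayed content proportionally onto the
// distance between the touch and the boundary, so a touch at the boundary
// displays the edge of the page of content without breaking contact.
const SCROLL_ZONE_PX = 80; // auto-scroll engages this close to the edge

function autoScrollX(
  container: HTMLElement,
  touchX: number,
  baseScroll: number
): void {
  const displayWidth = container.clientWidth;
  const zoneStart = displayWidth - SCROLL_ZONE_PX;
  if (touchX < zoneStart) return; // not yet near the boundary

  // 0 where the zone begins, 1 when the touch is at the boundary itself.
  const fraction = Math.min(1, (touchX - zoneStart) / SCROLL_ZONE_PX);

  // Proportional mapping of the remainder of the page of content: at
  // fraction = 1 the edge of the page is displayed.
  const maxScroll = container.scrollWidth - displayWidth;
  container.scrollLeft = baseScroll + fraction * (maxScroll - baseScroll);
}
```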
At block 660, a removal of the single-finger single touch is detected. In one example implementation, the single-finger single touch is removed when the continuous contact with the touch screen is broken. For example, to remove the single-finger single touch, a user performing a single-finger single touch by continuously contacting the touch screen with a finger can lift the finger off the touch screen to break contact with the touch screen. The removal of the single-finger single touch can be detected by detecting that the contact with the touch screen is broken and no longer maintained or continuous.
At block 670, at least in response to the detecting of the removal of the single-finger single touch, the page of content is zoomed out. For example, the zoomed page of content is returned to its original size or scale before being zoomed by the page zoom. In other implementations, zooming out a zoomed page of content reduces the scale of the zoomed page of content to a fraction of its zoomed scale.
At block 920, in response to the detecting of the single-finger single touch, a page zoom is performed. In one implementation, in addition to performing a page zoom, an element such as a cursor placement tool or cursor is activated and/or grabbed in response to detecting the single-finger single touch. An activated or grabbed element can be controlled functionally or can be moved in the display in response to the motion of the single-finger single touch on the touch screen. In one implementation of a page zoom, the single-finger single touch comprises a continuous contact that is moving along the touch screen, and because the moving continuous contact is detected, a page zoom is performed. In some implementations of a page zoom, the page zoom is performed such that a text edit box is displayed at a location based on the location of the single-finger single touch. For example, if a single-finger single touch is detected on a single-line or multi-line text edit or entry field, the text edit or entry field can be positioned in the display relative to the single-finger single touch. This allows for convenient placement of a cursor in the text edit field. In a single-line text edit field implementation, the text edit field can be located either under the single-finger single touch or displaced a predetermined distance from (e.g., above or below) the single-finger single touch. In one implementation, the single-line edit box is displaced slightly above the location of the single-finger single touch so that the user's finger does not obscure the edit box. In an example of a multi-line text edit field, the text edit field can be located such that the location of the single-finger single touch is on a portion of the text edit field or displaced a predetermined distance from the text edit field. In some implementations, a page zoom can be automatically performed in response to a single-finger single touch on a text edit field. In another implementation, after a single-finger single touch is performed on a text edit field, a page zoom can be automatically performed after a predetermined period of time has passed while the single-finger single touch continues. Pausing or waiting a predetermined period of time before performing a page zoom after a single-finger single touch has been performed can disambiguate the intent of the user in contacting the touch screen display. For example, a user's intent to select text in the display can be indicated by a sustained single-finger single touch longer than the period of time, as opposed to an intent merely to scroll the display or to tap on a button or link, which can be performed before the end of the predetermined amount of time or pause. In some examples of a page zoom, the page zoom is centered on, or zoomed as expanding out from, the location in the display that is under or a predetermined distance from the location of the single-finger single touch on the touch screen. In some implementations of a page zoom, content that is not part of the page of content, such as system content or a newly activated cursor or cursor placement tool, can be zoomed along with, and proportionally to, the page of content in response to the page of content being zoomed.
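A sketch of that pause-based disambiguation, assuming DOM touch events; the 500 ms hold threshold, the zoom factor, and the pageZoomAt helper (from the earlier sketch) are illustrative assumptions:

```typescript
// Sketch: a contact sustained past the hold threshold triggers the page
// zoom; a contact that ends or drags first is treated as a tap or scroll.
declare function pageZoomAt(x: number, y: number, scale: number): void;

const HOLD_MS = 500;
let holdTimer: number | undefined;

document.addEventListener('touchstart', (e: TouchEvent) => {
  if (e.touches.length !== 1) return;
  const { clientX, clientY } = e.touches[0]; // capture the contact point now
  holdTimer = window.setTimeout(() => {
    // Sustained single-finger single touch: the user intends to select
    // text, so perform the page zoom at the point of contact.
    pageZoomAt(clientX, clientY, 2.5);
  }, HOLD_MS);
});

// Breaking contact or starting a drag before the pause elapses indicates
// a tap on a button or link, or a scroll, so no page zoom is performed.
document.addEventListener('touchend', () => window.clearTimeout(holdTimer));
document.addEventListener('touchmove', () => window.clearTimeout(holdTimer));
```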
At block 930, a dragging movement of the single-finger single touch is detected. In some implementations, a drag or drag gesture can be detected when a user drags a finger that is pressed against the touch screen along the surface of the touch screen to a new location on the touch screen while maintaining contact with the touch screen. At block 940, at least in response to the dragging movement of the single-finger single touch, the cursor is relocated in editable text displayed in the touch screen. For example, the cursor can be activated outside of an area where editable text is located in the display and can be moved to a location where editable text is displayed. In some implementations, a user can relocate a grabbed element such as a cursor anywhere in editable text where the grabbed element is allowed. For example, the grabbed element can be placed between characters (e.g., characters of a word), or between words. When a grabbed element is moved, it can be displayed under or displaced a predetermined distance from the location of the single-finger single touch.
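One possible sketch of relocating the cursor during the drag, again assuming a contenteditable region and the WebKit/Blink caretRangeFromPoint API; the displacement that keeps the caret visible above the fingertip is an illustrative assumption:

```typescript
// Sketch: collapse the selection to a single caret at the drag location,
// e.g. between two characters of a word or between words.
const CARET_OFFSET_PX = 30;

function relocateCursor(touchX: number, touchY: number): void {
  const caret = document.caretRangeFromPoint(touchX, touchY - CARET_OFFSET_PX);
  if (caret === null) return;
  const selection = window.getSelection();
  if (selection === null) return;
  caret.collapse(true);            // a zero-length range is a bare caret
  selection.removeAllRanges();
  selection.addRange(caret);       // the cursor now sits at the drag point
}
```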
At block 950, in response to the dragging movement of the single-finger single touch, text not in the previously displayed zoomed content is displayed when the zoomed page of content is scrolled in the touch screen display. For example, a user can drag the single-finger single touch horizontally toward an edge or boundary of the display, and the display can be automatically scrolled to display content beyond the displayed content in the direction of the horizontal movement. In some implementations of scrolling zoomed content, scrolling can be allowed in one or more directions and not allowed in one or more other directions. For example, when a single-finger single touch is detected in a single-line text edit field and the text edit field and surrounding content are zoomed, scrolling in the vertical directions can be locked so that no vertical scrolling occurs, while scrolling in the horizontal directions is allowed. In other implementations, no locks are placed on scrolling, and scrolling can be allowed in horizontal, vertical, and diagonal directions.
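A minimal sketch of such an axis lock, under the assumption that drag deltas are applied to a scrollable container; the lock flag and helper names are illustrative:

```typescript
// Sketch: suppress vertical scrolling while a zoomed single-line text
// edit field is active, letting horizontal scrolling pass through.
let verticalScrollLocked = false;

function onFieldZoomed(isSingleLine: boolean): void {
  // A single-line field has no vertical content to reveal, so lock it.
  verticalScrollLocked = isSingleLine;
}

function applyDragScroll(container: HTMLElement, dx: number, dy: number): void {
  container.scrollLeft -= dx;   // horizontal scrolling remains allowed
  if (!verticalScrollLocked) {
    container.scrollTop -= dy;  // vertical scrolling only when unlocked
  }
}
```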
At block 960, a removal of the single-finger single touch is detected. For example, a user can remove a finger from contacting the touch screen. At block 970, at least in response to the detecting of the removal of the single-finger single touch, the page of content is zoomed out and the cursor is displayed in the relocated position. For example, because the user removed the finger from the touch screen, the content in the display is zoomed out to its original smaller scale and the cursor is displayed at the position to which it was relocated in the editable text. In some implementations, when the content is zoomed out, the location of the displayed content is based on the last position of the single-finger single touch. For example, the page can be zoomed out such that the displayed content is the content at, or a predetermined distance from, the last point of contact of the removed single-finger single touch with the touch screen.
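A sketch of completing the gesture on removal, reusing the pageZoomAt helper from the earlier sketch as an assumption; returning to scale 1 restores the original size while the relocated cursor stays where it was placed:

```typescript
// Sketch: zoom back out when the removal of the single-finger single
// touch is detected, keying the view to the last point of contact.
declare function pageZoomAt(x: number, y: number, scale: number): void;

document.addEventListener('touchend', (e: TouchEvent) => {
  const t = e.changedTouches[0]; // last point of contact of the removed touch
  pageZoomAt(t.clientX, t.clientY, 1); // scale 1 restores the original size
});
```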
The illustrated mobile device 1200 can include a controller or processor 1210 (e.g., signal processor, microprocessor, ASIC, or other control and processing logic circuitry) for performing such tasks as signal coding, data processing, input/output processing, power control, and/or other functions. An operating system 1212 can control the allocation and usage of the components 1202 and support for one or more application programs 1214. The applications 1214 can include the software for the technologies described herein, such as for zooming in response to a single-finger single touch. The application programs can include common mobile computing applications (e.g., email applications, text editing applications, calendars, contact managers, web browsers, messaging applications), or any other computing application.
The illustrated mobile device 1200 can include memory 1220. Memory 1220 can include non-removable memory 1222 and/or removable memory 1224. The non-removable memory 1222 can include RAM, ROM, flash memory, a hard disk, or other well-known memory storage technologies. The removable memory 1224 can include flash memory or a Subscriber Identity Module (SIM) card, which is well known in GSM communication systems, or other well-known memory storage technologies, such as “smart cards.” The memory 1220 can be used for storing data and/or code for running the operating system 1212 and the applications 1214. Example data can include web pages, text, images, sound files, video data, or other data sets to be sent to and/or received from one or more network servers or other devices via one or more wired or wireless networks. The memory 1220 can be used to store a subscriber identifier, such as an International Mobile Subscriber Identity (IMSI), and an equipment identifier, such as an International Mobile Equipment Identifier (IMEI). Such identifiers can be transmitted to a network server to identify users and equipment.
The mobile device 1200 can support one or more input devices 1230, such as a touch screen 1232, microphone 1234, camera 1236, physical keyboard 1238 and/or trackball 1240 and one or more output devices 1250, such as a speaker 1252 and a display 1254. Other possible output devices (not shown) can include piezoelectric or other haptic output devices. Some devices can serve more than one input/output function. For example, touch screen 1232 and display 1254 can be combined in a single input/output device.
A wireless modem 1260 can be coupled to an antenna (not shown) and can support two-way communications between the processor 1210 and external devices, as is well understood in the art. The modem 1260 is shown generically and can include a cellular modem for communicating with the mobile communication network 1204 and/or other radio-based modems (e.g., Bluetooth or Wi-Fi). The wireless modem 1260 is typically configured for communication with one or more cellular networks, such as a GSM network for data and voice communications within a single cellular network, between cellular networks, or between the mobile device and a public switched telephone network (PSTN).
The mobile device can further include at least one input/output port 1280, a power supply 1282, a satellite navigation system receiver 1284, such as a Global Positioning System (GPS) receiver, an accelerometer 1286, and/or a physical connector 1290, which can be a USB port, IEEE 1394 (FireWire) port, and/or RS-232 port. The illustrated components 1202 are not required or all-inclusive, as any of the components can be deleted and other components can be added.
In example environment 1300, various types of services (e.g., computing services) are provided by a cloud 1310. For example, the cloud 1310 can comprise a collection of computing devices, which may be located centrally or distributed, that provide cloud-based services to various types of users and devices connected via a network such as the Internet. The implementation environment 1300 can be used in different ways to accomplish computing tasks. For example, some tasks (e.g., processing user input and presenting a user interface) can be performed on local computing devices (e.g., connected devices 1330, 1340, 1350) while other tasks (e.g., storage of data to be used in subsequent processing) can be performed in the cloud 1310.
In example environment 1300, the cloud 1310 provides services for connected devices 1330, 1340, 1350 with a variety of screen capabilities. Connected device 1330 represents a device with a computer screen 1335 (e.g., a mid-size screen). For example, connected device 1330 could be a personal computer such as a desktop computer, laptop, notebook, netbook, or the like. Connected device 1340 represents a device with a mobile device screen 1345 (e.g., a small size screen). For example, connected device 1340 could be a mobile phone, smart phone, personal digital assistant, tablet computer, or the like. Connected device 1350 represents a device with a large screen 1355. For example, connected device 1350 could be a television screen (e.g., a smart television) or another device connected to a television (e.g., a set-top box or gaming console) or the like. One or more of the connected devices 1330, 1340, 1350 can include touch screen capabilities. Touch screens can accept input in different ways. For example, capacitive touch screens detect touch input when an object (e.g., a fingertip or stylus) distorts or interrupts an electrical current running across the surface. As another example, touch screens can use optical sensors to detect touch input when beams from the optical sensors are interrupted. Physical contact with the surface of the screen is not necessary for input to be detected by some touch screens. Devices without screen capabilities also can be used in example environment 1300. For example, the cloud 1310 can provide services for one or more computers (e.g., server computers) without displays.
Services can be provided by the cloud 1310 through service providers 1320, or through other providers of online services (not depicted). For example, cloud services can be customized to the screen size, display capability, and/or touch screen capability of a particular connected device (e.g., connected devices 1330, 1340, 1350).
In example environment 1300, the cloud 1310 provides the technologies and solutions described herein to the various connected devices 1330, 1340, 1350 using, at least in part, the service providers 1320. For example, the service providers 1320 can provide a centralized solution for various cloud-based services. The service providers 1320 can manage service subscriptions for users and/or devices (e.g., for the connected devices 1330, 1340, 1350 and/or their respective users).
Although the operations of some of the disclosed methods are described in a particular, sequential order for convenient presentation, it should be understood that this manner of description encompasses rearrangement, unless a particular ordering is required by specific language set forth below. For example, operations described sequentially may in some cases be rearranged or performed concurrently. Moreover, for the sake of simplicity, the attached figures may not show the various ways in which the disclosed methods can be used in conjunction with other methods.
Any of the disclosed methods can be implemented as computer-executable instructions stored on one or more computer-readable storage media (e.g., non-transitory computer-readable media, such as one or more optical media discs (e.g., a DVD, or CD), volatile memory components (such as DRAM or SRAM), or nonvolatile memory components (such as hard drives)) and executed on a computer (e.g., any commercially available computer, including smart phones or other mobile devices that include computing hardware). Any of the computer-executable instructions for implementing the disclosed techniques as well as any data created and used during implementation of the disclosed embodiments can be stored on one or more computer-readable media (e.g., non-transitory computer-readable media). The computer-executable instructions can be part of, for example, a dedicated software application or a software application that is accessed or downloaded via a web browser or other software application (such as a remote computing application). Such software can be executed, for example, on a single local computer (e.g., any suitable commercially available computer) or in a network environment (e.g., via the Internet, a wide-area network, a local-area network, a client-server network (such as a cloud computing network), or other such network) using one or more network computers.
For clarity, only certain selected aspects of the software-based implementations are described. Other details that are well known in the art are omitted. For example, it should be understood that the disclosed technology is not limited to any specific computer language or program. For instance, the disclosed technology can be implemented by software written in C++, Java, Perl, JavaScript, Adobe Flash, Media Center Markup Language or any other suitable programming language. Likewise, the disclosed technology is not limited to any particular computer or type of hardware. Certain details of suitable computers and hardware are well known and need not be set forth in detail in this disclosure.
Furthermore, any of the software-based embodiments (comprising, for example, computer-executable instructions for causing a computer to perform any of the disclosed methods) can be uploaded, downloaded, or remotely accessed through a suitable communication means. Such suitable communication means include, for example, the Internet, the World Wide Web, an intranet, software applications, cable (including fiber optic cable), magnetic communications, electromagnetic communications (including RF, microwave, and infrared communications), electronic communications, or other such communication means.
The disclosed methods, apparatus, and systems should not be construed as limiting in any way. Instead, the present disclosure is directed toward all novel and nonobvious features and aspects of the various disclosed embodiments, alone and in various combinations and subcombinations with one another. The disclosed methods, apparatus, and systems are not limited to any specific aspect or feature or combination thereof, nor do the disclosed embodiments require that any one or more specific advantages be present or problems be solved.
In view of the many possible embodiments to which the principles of the disclosed invention may be applied, it should be recognized that the illustrated embodiments are only preferred examples of the invention and should not be taken as limiting the scope of the invention. Rather, the scope of the invention is defined by the following claims. We therefore claim as our invention all that comes within the scope of these claims.