The disclosed embodiments relate generally to touch-sensitive user interfaces for electronic devices and, more particularly, to touch-sensitive interfaces for controlling playback of digital media, including fast scanning through media files.
Touch-sensitive interfaces are used in many devices to provide a user interface through which the user may interact with the device. Touch-sensitive interfaces include touch pads and touch screens.
Touch pads are typically flat, touch-sensitive surfaces that enable a user to interact with content displayed on an electronic device through finger gestures, such as taps or strokes, made on the touch pad. Touch pads are built using different technologies (e.g., capacitive sensing, membrane switches, and resistive sensors) and can be used in a wide range of electronic devices, including on laptops and keyboards connected to personal computers, and as stand-alone user interface devices, similar to a mouse. Among other things, a touch pad enables a user to move an on-screen cursor in directions that correspond to the motion of the user's finger on the touch pad.
Touch screens are used in electronic devices to display graphics and text, and to provide a user interface through which a user may interact with the device. A touch screen detects and responds to contact on the touch screen. A device may display one or more soft keys, menus, and other user-interface objects on the touch screen. A user may interact with the device by contacting the touch screen at locations corresponding to the user-interface objects with which she wishes to interact. Touch screens are becoming more popular for use as displays and as user input devices on portable devices, such as mobile telephones, personal digital assistants (PDAs), and cameras.
Touch-sensitive interfaces, when used to control media playback in a media player application, sometimes lack the intuitive feel and/or functionality of dedicated media controls. For example, when used to control playback of media content on an electronic device, a touchpad is commonly used to move an on-screen pointer to engage responsive regions displayed as part of a media player application's user interface (such as a fast forward region displayed as part of an on-screen control bar). This is generally an inconvenient interface when compared to the rich user interfaces provided to control media playback in dedicated media player devices. One example of such a rich user interface is the click wheel technology used in Apple's iPod® family of media player devices. As is well known, the click wheel interface enables the intuitive selection and control of media playback using, among other things, circular finger motions and clicks. Other physical interfaces for media playback control include the jog/shuttle interface of DVD players, and the button-based interfaces seen on CD and DVD players and TV remotes that allow users, among other functions, to play media at normal speed, fast forward, scan (play back media at varying speeds), reverse/rewind, and stop media playback.
Therefore, there is a need for touch-sensitive interfaces that provide a more intuitive experience for users when controlling media playback in a media player application, including specifying varying media playback scan rates and scan directions.
In some embodiments, a method for use in an electronic device with a touch interface and a media player includes: while a media file is playing in the media player, detecting a first user gesture on the touch interface, wherein the first user gesture is associated with a request to vary scan rate through the media file; in response to the first user gesture, changing playback speed of the media file by a first scan rate factor; detecting a second user gesture on the touch interface that is connected to the first user gesture, wherein the second user gesture is also associated with a request to vary scan rate through the media file; and in response to the second user gesture, changing playback speed of the media file by an additional second scan rate factor.
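By way of illustration only, the connected-gesture method above can be sketched in a few lines of Swift; the names used (MediaPlayer, ScanGesture, handleScanGesture) are hypothetical and not part of the disclosure:

```swift
// A minimal sketch, assuming hypothetical names: each gesture in a
// connected sequence compounds the playback rate by a further factor.
struct MediaPlayer {
    var playbackRate: Double = 1.0   // 1.0 == normal playback speed
}

enum ScanGesture {
    case first       // initial gesture requesting a scan rate change
    case connected   // subsequent gesture issued without breaking contact
}

func handleScanGesture(_ gesture: ScanGesture,
                       player: inout MediaPlayer,
                       factor: Double = 2.0) {
    // First and connected gestures alike apply an additional factor,
    // so rates compound: 2x, then 4x, then 8x, and so on.
    player.playbackRate *= factor
}

var player = MediaPlayer()
handleScanGesture(.first, player: &player)       // playbackRate == 2.0
handleScanGesture(.connected, player: &player)   // playbackRate == 4.0
handleScanGesture(.connected, player: &player)   // playbackRate == 8.0
```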
In other embodiments, a method for use in an electronic device with a touch interface and a media player includes: while a media file is playing in the media player, detecting a first user gesture on the touch interface, wherein the first user gesture is associated with a request to vary scan rate through the media file; detecting a distance, speed, and/or gesture duration on the touch interface for the first user gesture; and in response to the first user gesture, changing playback speed of the media file by a scan rate factor determined by the distance, speed and/or gesture duration.
In other embodiments, a method for use in an electronic device with a touch interface and a media player includes: while a media file is playing in the media player, detecting a first rotational user gesture on the touch interface, wherein the first rotational user gesture is associated with a request to vary scan rate through the media file; detecting an angular distance, speed and/or gesture duration on the touch interface for the first rotational user gesture; and in response to the first rotational user gesture, changing playback speed of the media file by a scan rate factor that is a function of the angular distance, speed and/or gesture duration.
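A minimal sketch of how either of the two preceding embodiments might map a measured gesture property (linear distance for a swipe, or angular distance for a rotational gesture) to a scan rate factor; the names and calibration constants below are assumptions for illustration:

```swift
// Hypothetical measurements for one gesture; `distance` can be linear
// (points traveled by a swipe) or angular (radians swept by a rotation).
struct GestureMetrics {
    var distance: Double
    var duration: Double   // seconds of contact for the gesture
}

// One possible mapping: the scan rate factor grows with gesture speed,
// clamped between 2x and a maximum (16x here). The unitSpeed divisor is
// an assumed calibration constant, not a value from the disclosure.
func scanRateFactor(for metrics: GestureMetrics,
                    unitSpeed: Double = 100.0,
                    maxFactor: Double = 16.0) -> Double {
    let speed = metrics.distance / max(metrics.duration, 0.001)
    return min(max(2.0 * speed / unitSpeed, 2.0), maxFactor)
}

print(scanRateFactor(for: GestureMetrics(distance: 150, duration: 0.3)))  // 10.0
```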
In yet another embodiment, a method for use in an electronic device with a touch interface and a media player includes: while a media file is playing in the media player, detecting a first user gesture on the touch interface, wherein the first user gesture is associated with a request to vary scan rate through the media file; and in response to the first user gesture, changing playback speed of the media file by a first predetermined scan rate factor independent of length or speed of the first user gesture. The method can further include: detecting one or more subsequent connected user gestures on the touch interface, wherein each of the subsequent connected user gestures is associated with a request to vary scan rate through the media file; and in response to each of the subsequent connected user gestures, changing playback speed of the media file by a respective additional predetermined scan rate factor independent of length or speed of the respective connected user gestures.
In any of these embodiments, the touch interface can be a touch screen or a touch pad, and the electronic devices in which these methods are implemented can include any sort of electronic device, ranging from a portable device with a touch screen, to a laptop with an integrated touch pad, to a personal computer with a separate touch pad interface.
For a better understanding of the aforementioned embodiments of the invention as well as additional embodiments thereof, reference should be made to the Description of Embodiments below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
The memory 102 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid state memory devices. In some embodiments, the memory 102 may further include storage remotely located from the one or more processors 106, for instance network attached storage accessed via the RF circuitry 112 or external port 148 and a communications network (not shown) such as the Internet, intranet(s), Local Area Networks (LANs), Wireless Local Area Networks (WLANs), Storage Area Networks (SANs) and the like, or any suitable combination thereof. Access to the memory 102 by other components of the device 100, such as the CPU 106 and the peripherals interface 108, may be controlled by the memory controller 104.
The peripherals interface 108 couples the input and output peripherals of the device to the CPU 106 and the memory 102. The one or more processors 106 run various software programs and/or sets of instructions stored in the memory 102 to perform various functions for the device 100 and to process data.
In some embodiments, the peripherals interface 108, the CPU 106, and the memory controller 104 may be implemented on a single chip, such as a chip 111. In some other embodiments, they may be implemented on separate chips.
The RF (radio frequency) circuitry 112 receives and sends electromagnetic waves. The RF circuitry 112 converts electrical signals to/from electromagnetic waves and communicates with communications networks and other communications devices via the electromagnetic waves. The RF circuitry 112 may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. The RF circuitry 112 may communicate with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. The wireless communication may use any of a plurality of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for email, instant messaging, and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
The audio circuitry 114, the speaker 116, and the microphone 118 provide an audio interface between a user and the device 100. The audio circuitry 114 receives audio data from the peripherals interface 108, converts the audio data to an electrical signal, and transmits the electrical signal to the speaker 116. The speaker converts the electrical signal to human-audible sound waves. The audio circuitry 114 also receives electrical signals converted by the microphone 118 from sound waves. The audio circuitry 114 converts the electrical signal to audio data and transmits the audio data to the peripherals interface 108 for processing. Audio data may be retrieved from and/or transmitted to the memory 102 and/or the RF circuitry 112 by the peripherals interface 108. In some embodiments, the audio circuitry 114 also includes a headset jack (not shown). The headset jack provides an interface between the audio circuitry 114 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (headphone for one or both ears) and input (microphone).
The I/O subsystem 120 provides the interface between input/output peripherals on the device 100, such as the touch screen 126 and other input/control devices 128, and the peripherals interface 108. The I/O subsystem 120 includes a touch-screen controller 122 and one or more input controllers 124 for other input or control devices. The one or more input controllers 124 receive/send electrical signals from/to other input or control devices 128. The other input/control devices 128 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, sticks, and so forth.
The touch screen 126 provides both an output interface and an input interface between the device and a user. The touch-screen controller 122 receives/sends electrical signals from/to the touch screen 126. The touch screen 126 displays visual output to the user. The visual output may include text, graphics, video, and any combination thereof. Some or all of the visual output may correspond to user-interface objects, further details of which are described below.
The touch screen 126 also accepts input from the user based on haptic and/or tactile contact. The touch screen 126 forms a touch-sensitive surface that accepts user input. The touch screen 126 and the touch screen controller 122 (along with any associated modules and/or sets of instructions in the memory 102) detect contact (and any movement or break of the contact) on the touch screen 126 and convert the detected contact into interaction with user-interface objects, such as one or more soft keys, that are displayed on the touch screen. In an exemplary embodiment, a point of contact between the touch screen 126 and the user corresponds to one or more digits of the user. The touch screen 126 may use LCD (liquid crystal display) technology or LPD (light emitting polymer display) technology, although other display technologies may be used in other embodiments. The touch screen 126 and touch screen controller 122 may detect contact and any movement or break thereof using any of a plurality of touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch screen 126. The touch-sensitive display may be analogous to the multi-touch sensitive tablets described in the following: U.S. Pat. No. 6,323,846 (Westerman et al.), U.S. Pat. No. 6,570,557 (Westerman et al.), U.S. Pat. No. 6,677,932 (Westerman), and/or U.S. Patent Publication 2002/0015024A1, each of which is hereby incorporated by reference in its entirety. However, the touch screen 126 displays visual output from the portable device, whereas touch-sensitive tablets do not provide visual output. The touch screen 126 may have a resolution in excess of 100 dpi. In an exemplary embodiment, the touch screen 126 may have a resolution of approximately 168 dpi. The user may make contact with the touch screen 126 using any suitable object or appendage, such as a stylus, finger, and so forth.
A touch-sensitive display 126 in some embodiments may be as described in the following applications: (1) U.S. patent application Ser. No. 11/381,313, “Multipoint Touch Surface Controller,” filed on May 2, 2006; (2) U.S. patent application Ser. No. 10/840,862, “Multipoint Touchscreen,” filed on May 6, 2004; (3) U.S. patent application Ser. No. 10/903,964, “Gestures For Touch Sensitive Input Devices,” filed on Jul. 30, 2004; (4) U.S. patent application Ser. No. 11/048,264, “Gestures For Touch Sensitive Input Devices,” filed on Jan. 31, 2005; (5) U.S. patent application Ser. No. 11/038,590, “Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices,” filed on Jan. 18, 2005; (6) U.S. patent application Ser. No. 11/228,758, “Virtual Input Device Placement On A Touch Screen User Interface,” filed on Sep. 16, 2005; (7) U.S. patent application Ser. No. 11/228,700, “Operation Of A Computer With A Touch Screen Interface,” filed on Sep. 16, 2005; (8) U.S. patent application Ser. No. 11/228,737, “Activating Virtual Keys Of A Touch-Screen Virtual Keyboard,” filed on Sep. 16, 2005; and (9) U.S. patent application Ser. No. 11/367,749, “Multi-Functional Hand-Held Device,” filed on Mar. 3, 2006. All of these applications are incorporated by reference herein in their entirety.
In some embodiments, in addition to the touch screen, the device 100 may include a touchpad (shown in
The device 100 also includes a power system 130 for powering the various components. The power system 130 may include a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.
In some embodiments, the software components include an operating system 132, a communication module (or set of instructions) 134, a contact/motion module (or set of instructions) 138, a graphics module (or set of instructions) 140, a power control module 154, and one or more applications (or sets of instructions) 146.
The operating system 132 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
The communication module 134 facilitates communication with other devices over one or more external ports 148 and also includes various software components for handling data received by the RF circuitry 112 and/or the external port 148. The external port 148 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.).
The contact/motion module 138 detects contact with the touch screen 126, in conjunction with the touch-screen controller 122. The contact/motion module 138 includes various software components for performing various operations related to detection of contact with the touch screen 126, such as determining if contact has occurred, determining if there is movement of the contact and tracking the movement across the touch screen, and determining if the contact has been broken (i.e., if the contact has ceased). Determining movement of the point of contact may include determining speed (magnitude), velocity (magnitude and direction), and/or acceleration (a change in magnitude and/or direction) of the point of contact. In some embodiments, the contact/motion module 138 and the touch-screen controller 122 also detect contact on the touchpad.
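For instance, the speed and velocity of the point of contact can be derived from successive contact samples. The following is a sketch under assumed data types, not the module's actual implementation:

```swift
import Foundation

// Hypothetical contact sample; the contact/motion module's real types
// are not described here.
struct ContactSample {
    var x: Double
    var y: Double
    var timestamp: TimeInterval
}

// Velocity (magnitude and direction) and speed (magnitude) between samples.
func velocity(from a: ContactSample,
              to b: ContactSample) -> (dx: Double, dy: Double, speed: Double) {
    let dt = max(b.timestamp - a.timestamp, 1e-6)   // guard against zero dt
    let dx = (b.x - a.x) / dt
    let dy = (b.y - a.y) / dt
    return (dx, dy, (dx * dx + dy * dy).squareRoot())
}
```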
The contact/motion module 138 includes a media playback gesture processing routine 150 (or instructions) to process user gestures related to controlling the speed and/or direction of media playback in a device 100. One media playback function enabled by the playback gesture processing routine 150 is variable rate scan, which allows a user to scan through a media file during playback (typically in a media player application 152, such as QuickTime, a trademark of Apple Inc.) at varying accelerated scan rates, either forward or backward. The variable rate scan is applicable to a wide range of media file types, including audio and video files.
In some embodiments described below, variable scan rates are chosen (in response to predefined user gestures, described below) from a predetermined/programmed collection of scan rate factors, such as twice (sometimes represented as “2×”), four times (“4×”), eight times (“8×”), and sixteen times (“16×”) normal playback speed for the media file, and corresponding scan rate factors in the reverse playback direction (represented as “−2×”, “−4×”, “−8×” and “−16×” scan rate factors). Of course, these values are only exemplary, and any combination of predetermined scan rate factors can be employed. In some embodiments, a user accelerates/varies the scan rate for a media file during playback by connecting multiple distinct media scan gestures, the scan rate increasing or decreasing in response to each one of the connected scan rate gestures. For example, given the predetermined scan rate factors of 2×, 4×, 8× and 16×, a user could accelerate playback of a media file from 2× to 4× to 8× normal playback speed by connecting three variable rate scan gestures while the media file is being played in the media player.
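A sketch of this predetermined scan rate ladder, with hypothetical names; three calls to advance() reproduce the 2x-to-4x-to-8x example above:

```swift
// Stepping through predetermined scan rate factors on connected gestures.
// The ScanState type and its members are illustrative names only.
let forwardFactors: [Double] = [2, 4, 8, 16]

struct ScanState {
    private var step = -1              // -1 == normal playback (1x)
    var rate: Double { step < 0 ? 1.0 : forwardFactors[step] }

    // Each connected variable-rate-scan gesture moves one step up the
    // ladder, saturating at the top factor (16x here).
    mutating func advance() {
        step = min(step + 1, forwardFactors.count - 1)
    }
}

var scan = ScanState()
scan.advance(); print(scan.rate)   // 2.0
scan.advance(); print(scan.rate)   // 4.0
scan.advance(); print(scan.rate)   // 8.0
```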
In some embodiments, a user connects gestures by issuing a series of discrete gestures without breaking contact with the touch sensitive interface. For example, given a variable-rate-scan gesture that is a short, approximately horizontal, two-fingered swipe (one embodiment described herein), a user would connect three such variable-rate-scan gestures by issuing three two-fingered swipe gestures separated by pauses, all while their fingertips remain in contact with the touch sensitive interface. This connection method is exemplary; other methods of connecting gestures can also be employed with embodiments of the present invention. Methods for connecting gestures to achieve variable rate scan through a media file are described in greater detail below, in reference to
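One way a recognizer might distinguish such connected gestures (discrete swipes separated by pauses, with contact never broken) is sketched below; the event model and thresholds are assumptions, not details from the disclosure:

```swift
import Foundation

// Hypothetical touch events: a discrete swipe (with its travel and time)
// or the fingers lifting off the interface.
enum TouchEvent {
    case swipe(distance: Double, at: TimeInterval)
    case lifted
}

final class ConnectedGestureRecognizer {
    private var lastSwipeTime: TimeInterval?
    private let pauseThreshold: TimeInterval = 0.2   // pause separating gestures
    private let swipeThreshold: Double = 20          // minimum travel for a swipe
    private var gestureCount = 0
    var onScanGesture: (_ isConnected: Bool) -> Void = { _ in }

    func handle(_ event: TouchEvent) {
        switch event {
        case .swipe(let distance, let t):
            // A swipe counts as a new gesture only after a pause, so a
            // series of pause-separated swipes forms a connected sequence.
            let pausedLongEnough = lastSwipeTime.map { t - $0 >= pauseThreshold } ?? true
            if pausedLongEnough && distance >= swipeThreshold {
                gestureCount += 1
                onScanGesture(gestureCount > 1)   // true for connected gestures
            }
            lastSwipeTime = t
        case .lifted:
            // Breaking contact with the touch interface ends the sequence.
            gestureCount = 0
            lastSwipeTime = nil
        }
    }
}
```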
A user can employ different gestures to initiate variable rate scan operations. These different gesture types can include, among others, a multi-touch linear swipe gesture made by contacting the touch sensitive interface with two closely-spaced fingertips, or a multi-touch rotational gesture made by contacting the touch sensitive interface with separated fingertips and then rotating one of the fingers around the contact point made by the other fingertip. The present invention can be employed with different gestures and gesture styles and is not limited to the gestures described herein, which are exemplary. In addition, the present invention is equally applicable to gestures or to connected sequences of gestures performed in different directions (e.g., leftward as opposed to rightward, or upward as opposed to downward), and is not to be limited to descriptions herein of gestures performed in particular directions, which are exemplary.
In some embodiments, variable scan rates can be determined by the media playback gesture processing module 150 as a function of some physical measure of a gesture, such as the speed and/or linear distance (and gesture direction) across the touch sensitive interface covered by the gesture; the rotational speed and/or angular distance covered by a rotational gesture; or the amount of time the user's fingers remain in contact with the touch sensitive interface after initiating the variable rate scan operation. In these embodiments, the scan rates can be computed using one or more predefined functions of the relevant physical measures, or they can be selected from predefined/programmed scan rate values, as described above. For example, a user can quickly accelerate the scan rate for playback of a media file from normal speed to 16× normal speed by issuing a long (or fast) two-fingered swipe gesture to the right on the touch sensitive interface, or by keeping their fingers in contact with the interface until the scan rate is accelerated to the desired speed. The use of this sort of gesture is described in greater detail below, in reference to
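As one illustration of the contact-duration variant, the following sketch (with assumed names and an assumed half-second step interval) climbs through the predefined factors while the fingers remain down:

```swift
// Hold-to-accelerate sketch: while contact continues after the initiating
// gesture, the rate climbs through the predefined factors. The step
// interval is an assumed value, not one from the disclosure.
func scanRate(afterHolding seconds: Double,
              factors: [Double] = [2, 4, 8, 16],
              stepEvery: Double = 0.5) -> Double {
    let steps = Int(seconds / stepEvery)
    return factors[min(steps, factors.count - 1)]
}

print(scanRate(afterHolding: 0.1))   // 2.0 right after the gesture begins
print(scanRate(afterHolding: 2.0))   // 16.0 once the fingers are held down
```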
Another variable scan rate feature supported by the media playback gesture processing module 150 is media scan with playback in either direction (forward or backward), with the ability to change scan direction and speed during a variable scan rate operation. For example, in some embodiments that provide this functionality, a user can initiate a forward scan at an accelerated rate by using a forward scan gesture (e.g., a two-fingered swipe gesture going approximately from left to right on the touch sensitive interface) and then, by using a reverse scan gesture (e.g., a two-fingered swipe gesture going approximately from right to left on the touch sensitive interface), immediately initiate variable rate scan in the reverse direction (i.e., in response to the leftward swipe, the media playback gesture processing module 150 would initiate reverse playback of the media file). This operation is described further in reference to
In other embodiments, when a user issues one or more scan gestures that are associated with a reversal of the current playback direction (e.g., one or more leftward two-fingered swipes issued by the user when the media file is currently being scanned in the forward direction), the media playback gesture processing module 150 first slows the scan (until normal playback speed is reached) and then reverses the scan direction of the media file. This operation is described further in reference to
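The immediate-reversal behavior and this slow-then-reverse behavior can be contrasted in a sketch using signed rates, where negative values denote reverse playback; the function names are illustrative:

```swift
// Signed playback rates: positive == forward, negative == reverse.
// Behavior one (previous paragraph): an opposite-direction gesture flips
// playback into an accelerated reverse scan immediately.
func reverseImmediately(currentRate: Double) -> Double {
    return currentRate > 0 ? -2.0 : 2.0
}

// Behavior two (this paragraph): each opposite-direction gesture first
// steps the forward scan down toward normal speed, then builds up reverse.
func stepTowardReverse(currentRate: Double,
                       factors: [Double] = [2, 4, 8, 16]) -> Double {
    if currentRate > 1.0 {
        // Slow down: drop to the next factor below the current one.
        return factors.reversed().first { $0 < currentRate } ?? 1.0
    }
    if currentRate == 1.0 {
        return -factors[0]              // begin scanning in reverse
    }
    // Already reversing: accelerate further in the reverse direction.
    return -(factors.first { $0 > -currentRate } ?? factors.last!)
}

print(stepTowardReverse(currentRate: 8))   // 4.0
print(stepTowardReverse(currentRate: 2))   // 1.0 (normal playback)
print(stepTowardReverse(currentRate: 1))   // -2.0 (reverse scan begins)
```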
The media playback gesture processing module 150 enables intuitive user interaction with the variable scan rate feature by providing different operations in response to cessation of user contact with the touch sensitive interface. In some embodiments, the media playback gesture processing module 150 reverts to normal playback upon any cessation of user contact with the touch sensitive interface. In other embodiments, following cessation of user contact, the media playback gesture processing module 150 holds the accelerated scan rate and does not return to normal playback speed until the user issues a third gesture associated with that purpose (such as a two-fingered tap on the touch screen interface). These operations are described further in reference to
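These two release behaviors reduce to a small policy choice, sketched below with hypothetical names:

```swift
// Sketch of the two release behaviors described above.
enum ReleaseBehavior {
    case revertToNormal   // lifting the fingers returns to normal playback
    case holdScanRate     // the rate persists until a dedicated gesture
}

func rateAfterContactEnds(currentRate: Double,
                          behavior: ReleaseBehavior) -> Double {
    switch behavior {
    case .revertToNormal:
        return 1.0
    case .holdScanRate:
        // A later gesture (e.g., a two-fingered tap) restores 1x playback.
        return currentRate
    }
}
```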
In some embodiments the contact/motion module 138 and the media scan processing module 150 support multi-touch user gestures, as described in some of the patent applications referenced above, which are incorporated herein. This application incorporates teachings from those references related to, among other things, resolving gestures that are approximately correct and in approximately the expected direction and/or orientation. Accordingly, any description or reference in this application to a gesture occurring in a particular direction and/or orientation, or having some other particular characteristic, shall be understood to mean that the gesture is at best approximately as described. In different embodiments, these modules 138, 150 can also implement the variable rate scan operations described herein using technologies other than multi-touch.
The graphics module 140 includes various known software components for rendering and displaying graphics on the touch screen 126. Note that the term “graphics” includes any object that can be displayed to a user, including without limitation text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations and the like.
The power control module 154 detects, mediates, and implements user power-off and power-on requests. It is responsive to inputs provided by the touch-screen controller 122 and the power system 130. It also issues control signals 131 to the power system 130 to implement user power-off requests.
The one or more applications 146 can include any applications installed on the device 100, including without limitation a browser, address book, contact list, email, instant messaging, word processing, keyboard emulation, widgets, JAVA-enabled applications, encryption, digital rights management, voice recognition, voice replication, location determination capability (such as that provided by the global positioning system (GPS)), a music player (which plays back recorded music stored in one or more files, such as MP3 or AAC files), a media player 152 (which plays recorded videos or music stored in one or more media files, such as QuickTime, DivX, or MPEG video files, or MP3 or AAC audio files, to name just a few possibilities), etc.
In some embodiments, the device 100 may include the functionality of an MP3 player, such as an iPod (trademark of Apple Inc.). The device 100 may, therefore, include a 30-pin connector that is compatible with the iPod. An example of such a device is the iPhone (trademark of Apple Inc.). In some embodiments, the device 100 may include one or more optional optical sensors (not shown), such as CMOS or CCD image sensors, for use in imaging applications.
In some embodiments, the device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through the touch screen 126 and, if included on the device 100, the touchpad. By using the touch screen and touchpad as the primary input/control device for operation of the device 100, the number of physical input/control devices (such as push buttons, dials, and the like) on the device 100 may be reduced. In some embodiments, the device 100 includes the touch screen 126, the touchpad, a power button 129 (which can be any manner of physical interface device, including, but not limited to, a push button, switch, dial, slider, rocker button or touchpad) for powering the device on/off and locking the device, a volume adjustment rocker button and a slider switch for toggling ringer profiles. In an alternative embodiment, the device 100 may also accept verbal input for activation or deactivation of some functions through the microphone 118.
The predefined set of functions that are performed exclusively through the touch screen and the touchpad includes navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates the device 100 to a main, home, or root menu from any user interface that may be displayed on the device 100. In such embodiments, the touchpad may be referred to as a "menu button" 410. In some other embodiments, the menu button 410 may be a physical push button or other physical input/control device instead of a touchpad.
Referring to
Referring to
In contrast, in the embodiment shown in
Referring to
The process 700B shown in
While the processes of
The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated.
This application claims priority to U.S. Provisional Patent Application No. 61/019,291, “Variable Rate Media Playback Methods for Electronic Devices with Touch Interfaces,” filed Jan. 6, 2008, which is incorporated herein by reference in its entirety. This application is related to the following applications: (1) U.S. patent application Ser. No. 10/188,182, “Touch Pad For Handheld Device,” filed on Jul. 1, 2002; (2) U.S. patent application Ser. No. 10/722,948, “Touch Pad For Handheld Device,” filed on Nov. 25, 2003; (3) U.S. patent application Ser. No. 10/643,256, “Movable Touch Pad With Added Functionality,” filed on Aug. 18, 2003; (4) U.S. patent application Ser. No. 10/654,108, “Ambidextrous Mouse,” filed on Sep. 2, 2003; (5) U.S. patent application Ser. No. 10/840,862, “Multipoint Touchscreen,” filed on May 6, 2004; (6) U.S. patent application Ser. No. 10/903,964, “Gestures For Touch Sensitive Input Devices,” filed on Jul. 30, 2004; (7) U.S. patent application Ser. No. 11/038,590, “Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices” filed on Jan. 18, 2005; (8) U.S. patent application Ser. No. 11/057,050, “Display Actuator,” filed on Feb. 11, 2005; (9) U.S. Provisional Patent Application No. 60/658,777, “Multi-Functional Hand-Held Device,” filed Mar. 4, 2005; (10) U.S. patent application Ser. No. 11/367,749, “Multi-Functional Hand-Held Device,” filed Mar. 3, 2006; (11) U.S. patent application Ser. No. 11/850,635, “Touch Screen Device, Method, and Graphical User Interface for Determining Commands by Applying Heuristics,” filed Sep. 5, 2007; (12) U.S. patent application Ser. No. 11/322,549, “Unlocking a Device by Performing Gestures on an Unlock Image,” filed Dec. 23, 2005; and (13) U.S. patent application Ser. No. 11/322,550, “Indication of Progress Towards Satisfaction of a User Input Condition,” filed Dec. 23, 2005. All of these applications are incorporated by reference herein in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
4837798 | Cohen et al. | Jun 1989 | A |
4935954 | Thompson et al. | Jun 1990 | A |
4972462 | Shibata | Nov 1990 | A |
5003577 | Ertz et al. | Mar 1991 | A |
5164982 | Davis et al. | Nov 1992 | A |
5202961 | Mills et al. | Apr 1993 | A |
5283818 | Klausner et al. | Feb 1994 | A |
5333266 | Boaz et al. | Jul 1994 | A |
5390236 | Klausner et al. | Feb 1995 | A |
5463725 | Henckel et al. | Oct 1995 | A |
5510808 | Cina et al. | Apr 1996 | A |
5524140 | Klausner et al. | Jun 1996 | A |
5550559 | Isensee et al. | Aug 1996 | A |
5559301 | Bryan et al. | Sep 1996 | A |
5572576 | Klausner et al. | Nov 1996 | A |
5745716 | Tchao et al. | Apr 1998 | A |
5809267 | Moran et al. | Sep 1998 | A |
5825308 | Rosenberg | Oct 1998 | A |
5844547 | Minakuchi et al. | Dec 1998 | A |
5859638 | Coleman et al. | Jan 1999 | A |
5936623 | Amro | Aug 1999 | A |
5943052 | Allen et al. | Aug 1999 | A |
5973676 | Kawakura | Oct 1999 | A |
6073036 | Heikkinen et al. | Jun 2000 | A |
6278443 | Amro et al. | Aug 2001 | B1 |
6323846 | Westerman et al. | Nov 2001 | B1 |
6335722 | Tani et al. | Jan 2002 | B1 |
6340979 | Beaton et al. | Jan 2002 | B1 |
6353442 | Masui | Mar 2002 | B1 |
6430574 | Stead | Aug 2002 | B1 |
6469695 | White | Oct 2002 | B1 |
6542171 | Satou et al. | Apr 2003 | B1 |
6570557 | Westerman et al. | May 2003 | B1 |
6677932 | Westerman | Jan 2004 | B1 |
6690387 | Zimmerman et al. | Feb 2004 | B2 |
6865718 | Montalcini | Mar 2005 | B2 |
6954899 | Anderson | Oct 2005 | B1 |
6966037 | Fredriksson et al. | Nov 2005 | B2 |
7007239 | Hawkins et al. | Feb 2006 | B1 |
7054965 | Bell et al. | May 2006 | B2 |
7082163 | Uenoyama et al. | Jul 2006 | B2 |
7152210 | Van Den Hoven et al. | Dec 2006 | B1 |
7312790 | Sato et al. | Dec 2007 | B2 |
7404152 | Zinn et al. | Jul 2008 | B2 |
7408538 | Hinckley et al. | Aug 2008 | B2 |
7436395 | Chiu et al. | Oct 2008 | B2 |
7479949 | Jobs et al. | Jan 2009 | B2 |
7786975 | Ording et al. | Aug 2010 | B2 |
7996792 | Anzures et al. | Aug 2011 | B2 |
8032298 | Han | Oct 2011 | B2 |
20020015024 | Westerman et al. | Feb 2002 | A1 |
20020080151 | Venolia | Jun 2002 | A1 |
20020122066 | Bates et al. | Sep 2002 | A1 |
20020143741 | Laiho et al. | Oct 2002 | A1 |
20020154173 | Etgen et al. | Oct 2002 | A1 |
20020186252 | Himmel et al. | Dec 2002 | A1 |
20020191029 | Gillespie et al. | Dec 2002 | A1 |
20030008679 | Iwata et al. | Jan 2003 | A1 |
20030076301 | Tsuk et al. | Apr 2003 | A1 |
20030076306 | Zadesky et al. | Apr 2003 | A1 |
20030122787 | Zimmerman et al. | Jul 2003 | A1 |
20030128192 | van Os | Jul 2003 | A1 |
20030131317 | Budka et al. | Jul 2003 | A1 |
20030226152 | Billmaier et al. | Dec 2003 | A1 |
20040100479 | Nakano et al. | May 2004 | A1 |
20040143796 | Lerner et al. | Jul 2004 | A1 |
20040252109 | Trent, Jr. et al. | Dec 2004 | A1 |
20050012723 | Pallakoff | Jan 2005 | A1 |
20050024345 | Eastty et al. | Feb 2005 | A1 |
20050071437 | Bear et al. | Mar 2005 | A1 |
20050177445 | Church | Aug 2005 | A1 |
20050210403 | Satanek | Sep 2005 | A1 |
20060007174 | Shen | Jan 2006 | A1 |
20060010400 | Dehlin et al. | Jan 2006 | A1 |
20060015819 | Hawkins et al. | Jan 2006 | A1 |
20060018446 | Schmandt et al. | Jan 2006 | A1 |
20060026521 | Hotelling et al. | Feb 2006 | A1 |
20060026535 | Hotelling et al. | Feb 2006 | A1 |
20060026536 | Hotelling et al. | Feb 2006 | A1 |
20060038796 | Hinckley et al. | Feb 2006 | A1 |
20060184901 | Dietz | Aug 2006 | A1 |
20060234680 | Doulton | Oct 2006 | A1 |
20060236262 | Bathiche et al. | Oct 2006 | A1 |
20060239419 | Joseph et al. | Oct 2006 | A1 |
20060253547 | Wood et al. | Nov 2006 | A1 |
20060268020 | Han | Nov 2006 | A1 |
20070002018 | Mori | Jan 2007 | A1 |
20070080936 | Tsuk et al. | Apr 2007 | A1 |
20070132789 | Ording et al. | Jun 2007 | A1 |
20070146337 | Ording et al. | Jun 2007 | A1 |
20070150830 | Ording et al. | Jun 2007 | A1 |
20070192744 | Reponen | Aug 2007 | A1 |
20070198111 | Oetzel et al. | Aug 2007 | A1 |
20080042984 | Lim et al. | Feb 2008 | A1 |
20080055264 | Anzures et al. | Mar 2008 | A1 |
20080056459 | Vallier et al. | Mar 2008 | A1 |
20080155417 | Vallone et al. | Jun 2008 | A1 |
20080163131 | Hirai et al. | Jul 2008 | A1 |
20080165141 | Christie | Jul 2008 | A1 |
20080168395 | Ording et al. | Jul 2008 | A1 |
20080207176 | Brackbill et al. | Aug 2008 | A1 |
20080259040 | Ording et al. | Oct 2008 | A1 |
20080320391 | Lemay et al. | Dec 2008 | A1 |
20090006958 | Pohjola et al. | Jan 2009 | A1 |
20090075694 | Kim et al. | Mar 2009 | A1 |
20090158149 | Ko | Jun 2009 | A1 |
20090174667 | Kocienda et al. | Jul 2009 | A1 |
20090178008 | Herz et al. | Jul 2009 | A1 |
20090199119 | Park et al. | Aug 2009 | A1 |
20100134425 | Storrusten | Jun 2010 | A1 |
20100162181 | Shiplacoff et al. | Jun 2010 | A1 |
20100231534 | Chaudhri et al. | Sep 2010 | A1 |