The present disclosure relates generally to computer user interfaces, and more specifically to techniques for displaying widgets.
An electronic device can have applications installed on the device, where the applications enable access to certain functions and application data. Techniques for displaying that application data are implemented on the electronic device.
Some techniques for displaying application data using electronic devices, however, are generally cumbersome and inefficient. For example, some existing techniques use a complex and time-consuming user interface, which may include multiple key presses or keystrokes. Existing techniques require more time than necessary, wasting user time and device energy. This latter consideration is particularly important in battery-operated devices.
Accordingly, the present technique provides electronic devices with faster, more efficient methods and interfaces for displaying application data using widgets. Such methods and interfaces optionally complement or replace other methods for displaying application data. Such methods and interfaces reduce the cognitive burden on a user and produce a more efficient human-machine interface. For battery-operated computing devices, such methods and interfaces conserve power and increase the time between battery charges.
In some embodiments, a method is described. The method comprises: at an electronic device with a display device: displaying, via the display device, a first plurality of application icons without displaying a first set of one or more user interface elements, wherein the application icons are selectable to display application user interfaces for corresponding applications; while displaying, via the display device, the first plurality of application icons, detecting a first user input; and in response to detecting the first user input: in accordance with a determination that the first user input includes movement in a first direction: ceasing display of the first plurality of application icons; and displaying, via the display device, a second plurality of application icons that are different from the first plurality of application icons, wherein the application icons are selectable to display application user interfaces for corresponding applications; and in accordance with a determination that the first user input includes movement in a second direction that is different from the first direction: modifying display of the first plurality of application icons to change a distance between a first application icon of the first plurality of application icons and a second application icon of the first plurality of application icons; and concurrently displaying, via the display device, the modified first plurality of application icons and the first set of one or more user interface elements.
In some embodiments, a non-transitory computer-readable storage medium is described. The non-transitory computer-readable storage medium stores one or more programs configured to be executed by one or more processors of an electronic device with a display device, the one or more programs including instructions for: displaying, via the display device, a first plurality of application icons without displaying a first set of one or more user interface elements, wherein the application icons are selectable to display application user interfaces for corresponding applications; while displaying, via the display device, the first plurality of application icons, detecting a first user input; and in response to detecting the first user input: in accordance with a determination that the first user input includes movement in a first direction: ceasing display of the first plurality of application icons; and displaying, via the display device, a second plurality of application icons that are different from the first plurality of application icons, wherein the application icons are selectable to display application user interfaces for corresponding applications; and in accordance with a determination that the first user input includes movement in a second direction that is different from the first direction: modifying display of the first plurality of application icons to change a distance between a first application icon of the first plurality of application icons and a second application icon of the first plurality of application icons; and concurrently displaying, via the display device, the modified first plurality of application icons and the first set of one or more user interface elements.
In some embodiments, a transitory computer-readable storage medium is described. The transitory computer-readable storage medium stores one or more programs configured to be executed by one or more processors of an electronic device with a display device, the one or more programs including instructions for: displaying, via the display device, a first plurality of application icons without displaying a first set of one or more user interface elements, wherein the application icons are selectable to display application user interfaces for corresponding applications; while displaying, via the display device, the first plurality of application icons, detecting a first user input; and in response to detecting the first user input: in accordance with a determination that the first user input includes movement in a first direction: ceasing display of the first plurality of application icons; and displaying, via the display device, a second plurality of application icons that are different from the first plurality of application icons, wherein the application icons are selectable to display application user interfaces for corresponding applications; and in accordance with a determination that the first user input includes movement in a second direction that is different from the first direction: modifying display of the first plurality of application icons to change a distance between a first application icon of the first plurality of application icons and a second application icon of the first plurality of application icons; and concurrently displaying, via the display device, the modified first plurality of application icons and the first set of one or more user interface elements.
In some embodiments, an electronic device is described. The electronic device comprises: a display device; one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: displaying, via the display device, a first plurality of application icons without displaying a first set of one or more user interface elements, wherein the application icons are selectable to display application user interfaces for corresponding applications; while displaying, via the display device, the first plurality of application icons, detecting a first user input; and in response to detecting the first user input: in accordance with a determination that the first user input includes movement in a first direction: ceasing display of the first plurality of application icons; and displaying, via the display device, a second plurality of application icons that are different from the first plurality of application icons, wherein the application icons are selectable to display application user interfaces for corresponding applications; and in accordance with a determination that the first user input includes movement in a second direction that is different from the first direction: modifying display of the first plurality of application icons to change a distance between a first application icon of the first plurality of application icons and a second application icon of the first plurality of application icons; and concurrently displaying, via the display device, the modified first plurality of application icons and the first set of one or more user interface elements.
In some embodiments, an electronic device is described. The electronic device comprises: a display device; means for displaying, via the display device, a first plurality of application icons without displaying a first set of one or more user interface elements, wherein the application icons are selectable to display application user interfaces for corresponding applications; means, while displaying, via the display device, the first plurality of application icons, for detecting a first user input; and means, in response to detecting the first user input: in accordance with a determination that the first user input includes movement in a first direction: for ceasing display of the first plurality of application icons; and for displaying, via the display device, a second plurality of application icons that are different from the first plurality of application icons, wherein the application icons are selectable to display application user interfaces for corresponding applications; and in accordance with a determination that the first user input includes movement in a second direction that is different from the first direction: for modifying display of the first plurality of application icons to change a distance between a first application icon of the first plurality of application icons and a second application icon of the first plurality of application icons; and for concurrently displaying, via the display device, the modified first plurality of application icons and the first set of one or more user interface elements.
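By way of a non-limiting illustration, the following Swift sketch models the direction-dependent branching recited in the embodiments above: movement in a first direction replaces one plurality of application icons with another, while movement in a second direction changes the distance between icons and concurrently reveals the first set of user interface elements (e.g., widgets). All type and member names are invented for exposition and do not correspond to any particular implementation described in this disclosure.

```swift
import CoreGraphics

// Hypothetical sketch of the input-direction branching described above.
enum SwipeDirection {
    case horizontal   // the "first direction": page between icon sets
    case vertical     // the "second direction": reveal widget elements

    init?(translation: CGVector, threshold: CGFloat = 20) {
        if abs(translation.dx) > abs(translation.dy), abs(translation.dx) > threshold {
            self = .horizontal
        } else if abs(translation.dy) > threshold {
            self = .vertical
        } else {
            return nil
        }
    }
}

final class HomeScreenController {
    var showingFirstIconSet = true
    var widgetColumnVisible = false
    var iconSpacing: CGFloat = 16

    func handleSwipe(translation: CGVector) {
        guard let direction = SwipeDirection(translation: translation) else { return }
        switch direction {
        case .horizontal:
            // Cease displaying the current plurality of application icons and
            // display a different plurality of application icons.
            showingFirstIconSet.toggle()
        case .vertical:
            // Change the distance between application icons and concurrently
            // display the modified icons with the set of widget elements.
            iconSpacing = 8
            widgetColumnVisible = true
        }
    }
}
```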
Executable instructions for performing these functions are, optionally, included in a non-transitory computer-readable storage medium or other computer program product configured for execution by one or more processors. Executable instructions for performing these functions are, optionally, included in a transitory computer-readable storage medium or other computer program product configured for execution by one or more processors.
Thus, devices are provided with faster, more efficient methods and interfaces for displaying widgets, thereby increasing the effectiveness, efficiency, and user satisfaction with such devices. Such methods and interfaces may complement or replace other methods for displaying widgets.
For a better understanding of the various described embodiments, reference should be made to the Description of Embodiments below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.
The following description sets forth exemplary methods, parameters, and the like. It should be recognized, however, that such description is not intended as a limitation on the scope of the present disclosure but is instead provided as a description of exemplary embodiments.
There is a need for electronic devices that provide efficient methods and interfaces for displaying widgets. For example, a user may find it difficult to easily access certain application data. The user may use widgets to access the application data, but it may be difficult to navigate to the widgets. Additionally, the user may not be able to access other desired functions of the device while widgets are being displayed. Accordingly, techniques are needed for displaying widgets in a readily accessible manner. Moreover, techniques are needed that allow the user to perform a variety of functions while having access to the application data provided by the widgets. Such techniques can reduce the cognitive burden on a user who accesses application data via widgets, thereby enhancing productivity. Further, such techniques can reduce processor and battery power otherwise wasted on redundant user inputs.
Although the following description uses terms “first,” “second,” etc. to describe various elements, these elements should not be limited by the terms. These terms are only used to distinguish one element from another. For example, a first touch could be termed a second touch, and, similarly, a second touch could be termed a first touch, without departing from the scope of the various described embodiments. The first touch and the second touch are both touches, but they are not the same touch.
The terminology used in the description of the various described embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “includes,” “including,” “comprises,” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The term “if” is, optionally, construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” is, optionally, construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context.
Embodiments of electronic devices, user interfaces for such devices, and associated processes for using such devices are described. In some embodiments, the device is a portable communications device, such as a mobile telephone, that also contains other functions, such as PDA and/or music player functions. Exemplary embodiments of portable multifunction devices include, without limitation, the iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, Calif. Other portable electronic devices, such as laptops or tablet computers with touch-sensitive surfaces (e.g., touch screen displays and/or touchpads), are, optionally, used. It should also be understood that, in some embodiments, the device is not a portable communications device, but is a desktop computer with a touch-sensitive surface (e.g., a touch screen display and/or a touchpad).
In the discussion that follows, an electronic device that includes a display and a touch-sensitive surface is described. It should be understood, however, that the electronic device optionally includes one or more other physical user-interface devices, such as a physical keyboard, a mouse, and/or a joystick.
The device typically supports a variety of applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disk authoring application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a workout support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
The various applications that are executed on the device optionally use at least one common physical user-interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface as well as corresponding information displayed on the device are, optionally, adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture (such as the touch-sensitive surface) of the device optionally supports the variety of applications with user interfaces that are intuitive and transparent to the user.
Attention is now directed toward embodiments of portable devices with touch-sensitive displays.
As used in the specification and claims, the term “intensity” of a contact on a touch-sensitive surface refers to the force or pressure (force per unit area) of a contact (e.g., a finger contact) on the touch-sensitive surface, or to a substitute (proxy) for the force or pressure of a contact on the touch-sensitive surface. The intensity of a contact has a range of values that includes at least four distinct values and more typically includes hundreds of distinct values (e.g., at least 256). Intensity of a contact is, optionally, determined (or measured) using various approaches and various sensors or combinations of sensors. For example, one or more force sensors underneath or adjacent to the touch-sensitive surface are, optionally, used to measure force at various points on the touch-sensitive surface. In some implementations, force measurements from multiple force sensors are combined (e.g., a weighted average) to determine an estimated force of a contact. Similarly, a pressure-sensitive tip of a stylus is, optionally, used to determine a pressure of the stylus on the touch-sensitive surface. Alternatively, the size of the contact area detected on the touch-sensitive surface and/or changes thereto, the capacitance of the touch-sensitive surface proximate to the contact and/or changes thereto, and/or the resistance of the touch-sensitive surface proximate to the contact and/or changes thereto are, optionally, used as a substitute for the force or pressure of the contact on the touch-sensitive surface. In some implementations, the substitute measurements for contact force or pressure are used directly to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is described in units corresponding to the substitute measurements). In some implementations, the substitute measurements for contact force or pressure are converted to an estimated force or pressure, and the estimated force or pressure is used to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is a pressure threshold measured in units of pressure). Using the intensity of a contact as an attribute of a user input allows for user access to additional device functionality that may otherwise not be accessible by the user on a reduced-size device with limited real estate for displaying affordances (e.g., on a touch-sensitive display) and/or receiving user input (e.g., via a touch-sensitive display, a touch-sensitive surface, or a physical/mechanical control such as a knob or a button).
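By way of a non-limiting illustration, the following Swift sketch shows one way the weighted-average force estimation described above might be expressed, with the estimate compared against an intensity threshold in the same substitute units. The types, weighting scheme, and threshold values are hypothetical.

```swift
import Foundation

// Hypothetical sketch: estimating contact intensity as a weighted average of
// readings from several force sensors near the touch-sensitive surface.
struct ForceSensorReading {
    let force: Double   // raw reading in sensor units
    let weight: Double  // e.g., proximity of the sensor to the contact point
}

func estimatedIntensity(from readings: [ForceSensorReading]) -> Double {
    let totalWeight = readings.reduce(0) { $0 + $1.weight }
    guard totalWeight > 0 else { return 0 }
    let weightedSum = readings.reduce(0) { $0 + $1.force * $1.weight }
    return weightedSum / totalWeight
}

func exceedsThreshold(_ readings: [ForceSensorReading],
                      threshold: Double) -> Bool {
    // The threshold is expressed in the same substitute units as the estimate.
    estimatedIntensity(from: readings) >= threshold
}
```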
As used in the specification and claims, the term “tactile output” refers to physical displacement of a device relative to a previous position of the device, physical displacement of a component (e.g., a touch-sensitive surface) of a device relative to another component (e.g., housing) of the device, or displacement of the component relative to a center of mass of the device that will be detected by a user with the user's sense of touch. For example, in situations where the device or the component of the device is in contact with a surface of a user that is sensitive to touch (e.g., a finger, palm, or other part of a user's hand), the tactile output generated by the physical displacement will be interpreted by the user as a tactile sensation corresponding to a perceived change in physical characteristics of the device or the component of the device. For example, movement of a touch-sensitive surface (e.g., a touch-sensitive display or trackpad) is, optionally, interpreted by the user as a “down click” or “up click” of a physical actuator button. In some cases, a user will feel a tactile sensation such as a “down click” or “up click” even when there is no movement of a physical actuator button associated with the touch-sensitive surface that is physically pressed (e.g., displaced) by the user's movements. As another example, movement of the touch-sensitive surface is, optionally, interpreted or sensed by the user as “roughness” of the touch-sensitive surface, even when there is no change in smoothness of the touch-sensitive surface. While such interpretations of touch by a user will be subject to the individualized sensory perceptions of the user, there are many sensory perceptions of touch that are common to a large majority of users. Thus, when a tactile output is described as corresponding to a particular sensory perception of a user (e.g., an “up click,” a “down click,” “roughness”), unless otherwise stated, the generated tactile output corresponds to physical displacement of the device or a component thereof that will generate the described sensory perception for a typical (or average) user.
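By way of a non-limiting illustration, the following Swift sketch shows how an application might request a tactile output resembling the “click” sensations described above using UIKit's public feedback-generator API. It illustrates the concept only; it does not depict the internal tactile output generators of device 100.

```swift
import UIKit

// Minimal sketch: requesting a haptic "click" via UIKit's feedback generator.
final class ClickFeedback {
    private let generator = UIImpactFeedbackGenerator(style: .medium)

    func prepareForInteraction() {
        // Priming reduces latency between the user action and the haptic.
        generator.prepare()
    }

    func playClick() {
        generator.impactOccurred()
    }
}
```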
It should be appreciated that device 100 is only one example of a portable multifunction device, and that device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of the components. The various components shown are implemented in hardware, software, or a combination of both hardware and software, including one or more signal processing and/or application-specific integrated circuits.
Memory 102 optionally includes high-speed random access memory and optionally also includes non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Memory controller 122 optionally controls access to memory 102 by other components of device 100.
Peripherals interface 118 can be used to couple input and output peripherals of the device to CPU 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for device 100 and to process data. In some embodiments, peripherals interface 118, CPU 120, and memory controller 122 are, optionally, implemented on a single chip, such as chip 104. In some other embodiments, they are, optionally, implemented on separate chips.
RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals. RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. RF circuitry 108 optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. RF circuitry 108 optionally communicates with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. The RF circuitry 108 optionally includes well-known circuitry for detecting near field communication (NFC) fields, such as by a short-range communication radio. The wireless communication optionally uses any of a plurality of communications standards, protocols, and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), Evolution-Data Optimized (EV-DO), HSPA, HSPA+, Dual-Cell HSPA (DC-HSPA), long term evolution (LTE), near field communication (NFC), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Bluetooth Low Energy (BTLE), Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, IEEE 802.11n, and/or IEEE 802.11ac), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for e-mail (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between a user and device 100. Audio circuitry 110 receives audio data from peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to speaker 111. Speaker 111 converts the electrical signal to human-audible sound waves. Audio circuitry 110 also receives electrical signals converted by microphone 113 from sound waves. Audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to peripherals interface 118 for processing. Audio data is, optionally, retrieved from and/or transmitted to memory 102 and/or RF circuitry 108 by peripherals interface 118. In some embodiments, audio circuitry 110 also includes a headset jack (e.g., 212). The headset jack provides an interface between audio circuitry 110 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (e.g., a headphone for one or both ears) and input (e.g., a microphone).
I/O subsystem 106 couples input/output peripherals on device 100, such as touch screen 112 and other input control devices 116, to peripherals interface 118. I/O subsystem 106 optionally includes display controller 156, optical sensor controller 158, depth camera controller 169, intensity sensor controller 159, haptic feedback controller 161, and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/send electrical signals from/to other input control devices 116. The other input control devices 116 optionally include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) 160 are, optionally, coupled to any (or none) of the following: a keyboard, an infrared port, a USB port, and a pointer device such as a mouse. The one or more buttons (e.g., 208) optionally include an up/down button for volume control of speaker 111 and/or microphone 113. The one or more buttons optionally include a push button (e.g., 206).
A quick press of the push button optionally disengages a lock of touch screen 112 or optionally begins a process that uses gestures on the touch screen to unlock the device, as described in U.S. patent application Ser. No. 11/322,549, “Unlocking a Device by Performing Gestures on an Unlock Image,” filed Dec. 23, 2005, U.S. Pat. No. 7,657,849, which is hereby incorporated by reference in its entirety. A longer press of the push button (e.g., 206) optionally turns power to device 100 on or off. The functionality of one or more of the buttons are, optionally, user-customizable. Touch screen 112 is used to implement virtual or soft buttons and one or more soft keyboards.
Touch-sensitive display 112 provides an input interface and an output interface between the device and a user. Display controller 156 receives and/or sends electrical signals from/to touch screen 112. Touch screen 112 displays visual output to the user. The visual output optionally includes graphics, text, icons, video, and any combination thereof (collectively termed “graphics”). In some embodiments, some or all of the visual output optionally corresponds to user-interface objects.
Touch screen 112 has a touch-sensitive surface, sensor, or set of sensors that accepts input from the user based on haptic and/or tactile contact. Touch screen 112 and display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on touch screen 112 and convert the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages, or images) that are displayed on touch screen 112. In an exemplary embodiment, a point of contact between touch screen 112 and the user corresponds to a finger of the user.
Touch screen 112 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments. Touch screen 112 and display controller 156 optionally detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch screen 112. In an exemplary embodiment, projected mutual capacitance sensing technology is used, such as that found in the iPhone® and iPod Touch® from Apple Inc. of Cupertino, Calif.
A touch-sensitive display in some embodiments of touch screen 112 is, optionally, analogous to the multi-touch sensitive touchpads described in the following: U.S. Pat. No. 6,323,846 (Westerman et al.), U.S. Pat. No. 6,570,557 (Westerman et al.), U.S. Pat. No. 6,677,932 (Westerman), and U.S. Patent Publication 2002/0015024A1, each of which is hereby incorporated by reference in its entirety. However, touch screen 112 displays visual output from device 100, whereas touch-sensitive touchpads do not provide visual output.
A touch-sensitive display in some embodiments of touch screen 112 is described in the following applications: (1) U.S. patent application Ser. No. 11/381,313, “Multipoint Touch Surface Controller,” filed May 2, 2006; (2) U.S. patent application Ser. No. 10/840,862, “Multipoint Touchscreen,” filed May 6, 2004; (3) U.S. patent application Ser. No. 10/903,964, “Gestures For Touch Sensitive Input Devices,” filed Jul. 30, 2004; (4) U.S. patent application Ser. No. 11/048,264, “Gestures For Touch Sensitive Input Devices,” filed Jan. 31, 2005; (5) U.S. patent application Ser. No. 11/038,590, “Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices,” filed Jan. 18, 2005; (6) U.S. patent application Ser. No. 11/228,758, “Virtual Input Device Placement On A Touch Screen User Interface,” filed Sep. 16, 2005; (7) U.S. patent application Ser. No. 11/228,700, “Operation Of A Computer With A Touch Screen Interface,” filed Sep. 16, 2005; (8) U.S. patent application Ser. No. 11/228,737, “Activating Virtual Keys Of A Touch-Screen Virtual Keyboard,” filed Sep. 16, 2005; and (9) U.S. patent application Ser. No. 11/367,749, “Multi-Functional Hand-Held Device,” filed Mar. 3, 2006. All of these applications are incorporated by reference herein in their entirety.
Touch screen 112 optionally has a video resolution in excess of 100 dpi. In some embodiments, the touch screen has a video resolution of approximately 160 dpi. The user optionally makes contact with touch screen 112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which can be less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user.
In some embodiments, in addition to the touch screen, device 100 optionally includes a touchpad for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad is, optionally, a touch-sensitive surface that is separate from touch screen 112 or an extension of the touch-sensitive surface formed by the touch screen.
Device 100 also includes power system 162 for powering the various components. Power system 162 optionally includes a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.
Device 100 optionally also includes one or more optical sensors 164.
Device 100 optionally also includes one or more depth camera sensors 175.
Device 100 optionally also includes one or more contact intensity sensors 165.
Device 100 optionally also includes one or more proximity sensors 166.
Device 100 optionally also includes one or more tactile output generators 167.
Device 100 optionally also includes one or more accelerometers 168.
In some embodiments, the software components stored in memory 102 include operating system 126, communication module (or set of instructions) 128, contact/motion module (or set of instructions) 130, graphics module (or set of instructions) 132, text input module (or set of instructions) 134, Global Positioning System (GPS) module (or set of instructions) 135, and applications (or sets of instructions) 136. Furthermore, in some embodiments, memory 102 stores device/global internal state 157. Device/global internal state 157 includes one or more of: active application state, indicating which applications, if any, are currently active; display state, indicating what applications, views, or other information occupy various regions of touch screen display 112; sensor state, including information obtained from the device's various sensors and input control devices 116; and location information concerning the device's location and/or attitude.
Operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, iOS, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
Communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by RF circuitry 108 and/or external port 124. External port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with, the 30-pin connector used on iPod® (trademark of Apple Inc.) devices.
Contact/motion module 130 optionally detects contact with touch screen 112 (in conjunction with display controller 156) and other touch-sensitive devices (e.g., a touchpad or physical click wheel). Contact/motion module 130 includes various software components for performing various operations related to detection of contact, such as determining if contact has occurred (e.g., detecting a finger-down event), determining an intensity of the contact (e.g., the force or pressure of the contact or a substitute for the force or pressure of the contact), determining if there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining if the contact has ceased (e.g., detecting a finger-up event or a break in contact). Contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact, which is represented by a series of contact data, optionally includes determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations are, optionally, applied to single contacts (e.g., one finger contacts) or to multiple simultaneous contacts (e.g., “multitouch”/multiple finger contacts). In some embodiments, contact/motion module 130 and display controller 156 detect contact on a touchpad.
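By way of a non-limiting illustration, the following Swift sketch shows one way the tracking computations described above (speed, velocity, and acceleration of a point of contact represented by a series of contact data) might be expressed. All types are hypothetical.

```swift
import CoreGraphics
import Foundation

// Hypothetical sketch: deriving motion of a point of contact from
// time-stamped contact samples.
struct ContactSample {
    let position: CGPoint
    let timestamp: TimeInterval
}

func velocity(from a: ContactSample, to b: ContactSample) -> CGVector {
    let dt = b.timestamp - a.timestamp
    guard dt > 0 else { return .zero }
    return CGVector(dx: (b.position.x - a.position.x) / CGFloat(dt),
                    dy: (b.position.y - a.position.y) / CGFloat(dt))
}

func speed(of v: CGVector) -> CGFloat {
    (v.dx * v.dx + v.dy * v.dy).squareRoot()  // magnitude of velocity
}

// Acceleration: change in velocity across three consecutive samples.
func acceleration(_ s0: ContactSample, _ s1: ContactSample,
                  _ s2: ContactSample) -> CGVector {
    let v0 = velocity(from: s0, to: s1)
    let v1 = velocity(from: s1, to: s2)
    let dt = s2.timestamp - s1.timestamp
    guard dt > 0 else { return .zero }
    return CGVector(dx: (v1.dx - v0.dx) / CGFloat(dt),
                    dy: (v1.dy - v0.dy) / CGFloat(dt))
}
```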
In some embodiments, contact/motion module 130 uses a set of one or more intensity thresholds to determine whether an operation has been performed by a user (e.g., to determine whether a user has “clicked” on an icon). In some embodiments, at least a subset of the intensity thresholds are determined in accordance with software parameters (e.g., the intensity thresholds are not determined by the activation thresholds of particular physical actuators and can be adjusted without changing the physical hardware of device 100). For example, a mouse “click” threshold of a trackpad or touch screen display can be set to any of a large range of predefined threshold values without changing the trackpad or touch screen display hardware. Additionally, in some implementations, a user of the device is provided with software settings for adjusting one or more of the set of intensity thresholds (e.g., by adjusting individual intensity thresholds and/or by adjusting a plurality of intensity thresholds at once with a system-level click “intensity” parameter).
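By way of a non-limiting illustration, the following Swift sketch shows how intensity thresholds maintained as software parameters might be adjusted individually or scaled together by a system-level setting, without any change to the physical hardware. The names and default values are hypothetical.

```swift
import Foundation

// Hypothetical sketch: intensity thresholds held as software parameters
// rather than fixed hardware activation points.
struct IntensityThresholds {
    var lightPress: Double = 0.35
    var deepPress: Double = 0.70

    // A single system-level "click intensity" parameter scales all
    // thresholds at once.
    mutating func applySystemClickIntensity(_ scale: Double) {
        lightPress *= scale
        deepPress *= scale
    }
}

func userHasClicked(intensity: Double,
                    thresholds: IntensityThresholds) -> Bool {
    intensity >= thresholds.lightPress
}
```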
Contact/motion module 130 optionally detects a gesture input by a user. Different gestures on the touch-sensitive surface have different contact patterns (e.g., different motions, timings, and/or intensities of detected contacts). Thus, a gesture is, optionally, detected by detecting a particular contact pattern. For example, detecting a finger tap gesture includes detecting a finger-down event followed by detecting a finger-up (liftoff) event at the same position (or substantially the same position) as the finger-down event (e.g., at the position of an icon). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event followed by detecting one or more finger-dragging events, and subsequently followed by detecting a finger-up (liftoff) event.
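By way of a non-limiting illustration, the following Swift sketch classifies a contact pattern as a tap or a swipe from the finger-down, finger-dragging, and finger-up events described above. The event model and slop distance are hypothetical.

```swift
import CoreGraphics
import Foundation

// Hypothetical sketch: a tap is finger-down then finger-up at substantially
// the same position; a swipe includes dragging or net movement before liftoff.
enum TouchEvent {
    case fingerDown(CGPoint, TimeInterval)
    case fingerDrag(CGPoint, TimeInterval)
    case fingerUp(CGPoint, TimeInterval)
}

enum Gesture {
    case tap
    case swipe
}

func classify(_ events: [TouchEvent], slop: CGFloat = 10) -> Gesture? {
    guard case .fingerDown(let start, _)? = events.first,
          case .fingerUp(let end, _)? = events.last else { return nil }
    let dx = end.x - start.x
    let dy = end.y - start.y
    let moved = (dx * dx + dy * dy).squareRoot()
    let dragged = events.contains { event in
        if case .fingerDrag = event { return true }
        return false
    }
    return (dragged || moved > slop) ? .swipe : .tap
}
```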
Graphics module 132 includes various known software components for rendering and displaying graphics on touch screen 112 or other display, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast, or other visual property) of graphics that are displayed. As used herein, the term “graphics” includes any object that can be displayed to a user, including, without limitation, text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations, and the like.
In some embodiments, graphics module 132 stores data representing graphics to be used. Each graphic is, optionally, assigned a corresponding code. Graphics module 132 receives, from applications etc., one or more codes specifying graphics to be displayed along with, if necessary, coordinate data and other graphic property data, and then generates screen image data to output to display controller 156.
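By way of a non-limiting illustration, the following Swift sketch shows a graphics store that assigns a code to each registered graphic and resolves codes plus coordinate data into drawing commands, as described above. The types are hypothetical.

```swift
import CoreGraphics

// Hypothetical sketch: graphics addressed by assigned codes, resolved with
// coordinate data into commands for the display pipeline.
typealias GraphicCode = Int

struct DrawCommand {
    let code: GraphicCode
    let origin: CGPoint
}

final class GraphicsStore {
    private var renderers: [GraphicCode: (CGPoint) -> DrawCommand] = [:]

    func register(code: GraphicCode) {
        renderers[code] = { origin in DrawCommand(code: code, origin: origin) }
    }

    // Build screen image data (here, a command list) from codes + coordinates.
    func screenCommands(
        for requests: [(code: GraphicCode, origin: CGPoint)]
    ) -> [DrawCommand] {
        requests.compactMap { request in
            renderers[request.code]?(request.origin)
        }
    }
}
```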
Haptic feedback module 133 includes various software components for generating instructions used by tactile output generator(s) 167 to produce tactile outputs at one or more locations on device 100 in response to user interactions with device 100.
Text input module 134, which is, optionally, a component of graphics module 132, provides soft keyboards for entering text in various applications (e.g., contacts 137, e-mail 140, IM 141, browser 147, and any other application that needs text input).
GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to telephone 138 for use in location-based dialing; to camera 143 as picture/video metadata; and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets).
Applications 136 optionally include the following modules (or sets of instructions), or a subset or superset thereof: contacts module 137 (sometimes called an address book or contact list); telephone module 138; video conference module 139; e-mail client module 140; instant messaging (IM) module 141; workout support module 142; camera module 143 for still and/or video images; image management module 144; browser module 147; calendar module 148; widget modules 149, which optionally include one or more of: weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, dictionary widget 149-5, and user-created widget 149-6; widget creator module 150 for making user-created widgets 149-6; search module 151; video and music player module 152; notes module 153; map module 154; and/or online video module 155.
Examples of other applications 136 that are, optionally, stored in memory 102 include other word processing applications, other image editing applications, drawing applications, presentation applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, contacts module 137 is, optionally, used to manage an address book or contact list (e.g., stored in application internal state 192 of contacts module 137 in memory 102 or memory 370), including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), e-mail address(es), physical address(es), or other information with a name; associating an image with a name; categorizing and sorting names; providing telephone numbers or e-mail addresses to initiate and/or facilitate communications by telephone 138, video conference module 139, e-mail 140, or IM 141; and so forth.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, telephone module 138 is, optionally, used to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in contacts module 137, modify a telephone number that has been entered, dial a respective telephone number, conduct a conversation, and disconnect or hang up when the conversation is completed. As noted above, the wireless communication optionally uses any of a plurality of communications standards, protocols, and technologies.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, optical sensor 164, optical sensor controller 158, contact/motion module 130, graphics module 132, text input module 134, contacts module 137, and telephone module 138, video conference module 139 includes executable instructions to initiate, conduct, and terminate a video conference between a user and one or more other participants in accordance with user instructions.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, e-mail client module 140 includes executable instructions to create, send, receive, and manage e-mail in response to user instructions. In conjunction with image management module 144, e-mail client module 140 makes it very easy to create and send e-mails with still or video images taken with camera module 143.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, the instant messaging module 141 includes executable instructions to enter a sequence of characters corresponding to an instant message, to modify previously entered characters, to transmit a respective instant message (for example, using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages or using XMPP, SIMPLE, or IMPS for Internet-based instant messages), to receive instant messages, and to view received instant messages. In some embodiments, transmitted and/or received instant messages optionally include graphics, photos, audio files, video files and/or other attachments as are supported in an MMS and/or an Enhanced Messaging Service (EMS). As used herein, “instant messaging” refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, or IMPS).
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, GPS module 135, map module 154, and music player module, workout support module 142 includes executable instructions to create workouts (e.g., with time, distance, and/or calorie burning goals); communicate with workout sensors (sports devices); receive workout sensor data; calibrate sensors used to monitor a workout; select and play music for a workout; and display, store, and transmit workout data.
In conjunction with touch screen 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact/motion module 130, graphics module 132, and image management module 144, camera module 143 includes executable instructions to capture still images or video (including a video stream) and store them into memory 102, modify characteristics of a still image or video, or delete a still image or video from memory 102.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and camera module 143, image management module 144 includes executable instructions to arrange, modify (e.g., edit), or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, browser module 147 includes executable instructions to browse the Internet in accordance with user instructions, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, e-mail client module 140, and browser module 147, calendar module 148 includes executable instructions to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to-do lists, etc.) in accordance with user instructions.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and browser module 147, widget modules 149 are mini-applications that are, optionally, downloaded and used by a user (e.g., weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, and dictionary widget 149-5) or created by the user (e.g., user-created widget 149-6). In some embodiments, a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! Widgets).
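By way of a non-limiting illustration, the following Swift sketch models a widget as the small bundle of files described above (HTML, CSS, and JavaScript files, or an XML file and a JavaScript file). The types are invented for exposition, and no widget-loading API from this disclosure is implied.

```swift
import Foundation

// Hypothetical sketch: the two widget file layouts described above.
enum WidgetBundle {
    case web(html: URL, css: URL, javascript: URL)   // e.g., weather widget
    case xmlBased(xml: URL, javascript: URL)         // e.g., Yahoo! Widgets

    var resourceURLs: [URL] {
        switch self {
        case let .web(html, css, javascript):
            return [html, css, javascript]
        case let .xmlBased(xml, javascript):
            return [xml, javascript]
        }
    }
}
```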
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and browser module 147, the widget creator module 150 is, optionally, used by a user to create widgets (e.g., turning a user-specified portion of a web page into a widget).
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, search module 151 includes executable instructions to search for text, music, sound, image, video, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms) in accordance with user instructions.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, and browser module 147, video and music player module 152 includes executable instructions that allow the user to download and play back recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files, and executable instructions to display, present, or otherwise play back videos (e.g., on touch screen 112 or on an external, connected display via external port 124). In some embodiments, device 100 optionally includes the functionality of an MP3 player, such as an iPod (trademark of Apple Inc.).
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, notes module 153 includes executable instructions to create and manage notes, to-do lists, and the like in accordance with user instructions.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, map module 154 is, optionally, used to receive, display, modify, and store maps and data associated with maps (e.g., driving directions, data on stores and other points of interest at or near a particular location, and other location-based data) in accordance with user instructions.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, text input module 134, e-mail client module 140, and browser module 147, online video module 155 includes instructions that allow the user to access, browse, receive (e.g., by streaming and/or download), play back (e.g., on the touch screen or on an external, connected display via external port 124), send an e-mail with a link to a particular online video, and otherwise manage online videos in one or more file formats, such as H.264. In some embodiments, instant messaging module 141, rather than e-mail client module 140, is used to send a link to a particular online video. Additional description of the online video application can be found in U.S. Provisional Patent Application No. 60/936,562, “Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos,” filed Jun. 20, 2007, and U.S. patent application Ser. No. 11/968,067, “Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos,” filed Dec. 31, 2007, the contents of which are hereby incorporated by reference in their entirety.
Each of the above-identified modules and applications corresponds to a set of executable instructions for performing one or more functions described above and the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (e.g., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules are, optionally, combined or otherwise rearranged in various embodiments. For example, video player module is, optionally, combined with music player module into a single module (e.g., video and music player module 152). In some embodiments, memory 102 optionally stores a subset of the modules and data structures identified above. Furthermore, memory 102 optionally stores additional modules and data structures not described above.
In some embodiments, device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through a touch screen and/or a touchpad. By using a touch screen and/or a touchpad as the primary input control device for operation of device 100, the number of physical input control devices (such as push buttons, dials, and the like) on device 100 is, optionally, reduced.
The predefined set of functions that are performed exclusively through a touch screen and/or a touchpad optionally include navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates device 100 to a main, home, or root menu from any user interface that is displayed on device 100. In such embodiments, a “menu button” is implemented using a touchpad. In some other embodiments, the menu button is a physical push button or other physical input control device instead of a touchpad.
Event sorter 170 receives event information and determines the application 136-1 and application view 191 of application 136-1 to which to deliver the event information. Event sorter 170 includes event monitor 171 and event dispatcher module 174. In some embodiments, application 136-1 includes application internal state 192, which indicates the current application view(s) displayed on touch-sensitive display 112 when the application is active or executing. In some embodiments, device/global internal state 157 is used by event sorter 170 to determine which application(s) is (are) currently active, and application internal state 192 is used by event sorter 170 to determine application views 191 to which to deliver event information.
In some embodiments, application internal state 192 includes additional information, such as one or more of: resume information to be used when application 136-1 resumes execution, user interface state information that indicates information being displayed or that is ready for display by application 136-1, a state queue for enabling the user to go back to a prior state or view of application 136-1, and a redo/undo queue of previous actions taken by the user.
Event monitor 171 receives event information from peripherals interface 118. Event information includes information about a sub-event (e.g., a user touch on touch-sensitive display 112, as part of a multi-touch gesture). Peripherals interface 118 transmits information it receives from I/O subsystem 106 or a sensor, such as proximity sensor 166, accelerometer(s) 168, and/or microphone 113 (through audio circuitry 110). Information that peripherals interface 118 receives from I/O subsystem 106 includes information from touch-sensitive display 112 or a touch-sensitive surface.
In some embodiments, event monitor 171 sends requests to the peripherals interface 118 at predetermined intervals. In response, peripherals interface 118 transmits event information. In other embodiments, peripherals interface 118 transmits event information only when there is a significant event (e.g., receiving an input above a predetermined noise threshold and/or for more than a predetermined duration).
In some embodiments, event sorter 170 also includes a hit view determination module 172 and/or an active event recognizer determination module 173.
Hit view determination module 172 provides software procedures for determining where a sub-event has taken place within one or more views when touch-sensitive display 112 displays more than one view. Views are made up of controls and other elements that a user can see on the display.
Another aspect of the user interface associated with an application is a set of views, sometimes herein called application views or user interface windows, in which information is displayed and touch-based gestures occur. The application views (of a respective application) in which a touch is detected optionally correspond to programmatic levels within a programmatic or view hierarchy of the application. For example, the lowest level view in which a touch is detected is, optionally, called the hit view, and the set of events that are recognized as proper inputs are, optionally, determined based, at least in part, on the hit view of the initial touch that begins a touch-based gesture.
Hit view determination module 172 receives information related to sub-events of a touch-based gesture. When an application has multiple views organized in a hierarchy, hit view determination module 172 identifies a hit view as the lowest view in the hierarchy which should handle the sub-event. In most circumstances, the hit view is the lowest level view in which an initiating sub-event occurs (e.g., the first sub-event in the sequence of sub-events that form an event or potential event). Once the hit view is identified by the hit view determination module 172, the hit view typically receives all sub-events related to the same touch or input source for which it was identified as the hit view.
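By way of a non-limiting illustration, the following Swift sketch expresses the hit view determination described above as a depth-first search that returns the lowest view in the hierarchy containing the point of the initiating sub-event. The `View` type is hypothetical.

```swift
import CoreGraphics

// Hypothetical sketch: the hit view is the deepest view whose bounds
// contain the initiating touch point.
final class View {
    let frame: CGRect            // in window coordinates, for simplicity
    let subviews: [View]

    init(frame: CGRect, subviews: [View] = []) {
        self.frame = frame
        self.subviews = subviews
    }

    // Depth-first search: front-most subviews are checked first, and the
    // deepest containing view wins.
    func hitView(for point: CGPoint) -> View? {
        guard frame.contains(point) else { return nil }
        for subview in subviews.reversed() {
            if let hit = subview.hitView(for: point) { return hit }
        }
        return self
    }
}
```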
Active event recognizer determination module 173 determines which view or views within a view hierarchy should receive a particular sequence of sub-events. In some embodiments, active event recognizer determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, active event recognizer determination module 173 determines that all views that include the physical location of a sub-event are actively involved views, and therefore determines that all actively involved views should receive a particular sequence of sub-events. In other embodiments, even if touch sub-events were entirely confined to the area associated with one particular view, views higher in the hierarchy would still remain as actively involved views.
Event dispatcher module 174 dispatches the event information to an event recognizer (e.g., event recognizer 180). In embodiments including active event recognizer determination module 173, event dispatcher module 174 delivers the event information to an event recognizer determined by active event recognizer determination module 173. In some embodiments, event dispatcher module 174 stores in an event queue the event information, which is retrieved by a respective event receiver 182.
In some embodiments, operating system 126 includes event sorter 170. Alternatively, application 136-1 includes event sorter 170. In yet other embodiments, event sorter 170 is a stand-alone module, or a part of another module stored in memory 102, such as contact/motion module 130.
In some embodiments, application 136-1 includes a plurality of event handlers 190 and one or more application views 191, each of which includes instructions for handling touch events that occur within a respective view of the application's user interface. Each application view 191 of the application 136-1 includes one or more event recognizers 180. Typically, a respective application view 191 includes a plurality of event recognizers 180. In other embodiments, one or more of event recognizers 180 are part of a separate module, such as a user interface kit or a higher level object from which application 136-1 inherits methods and other properties. In some embodiments, a respective event handler 190 includes one or more of: data updater 176, object updater 177, GUI updater 178, and/or event data 179 received from event sorter 170. Event handler 190 optionally utilizes or calls data updater 176, object updater 177, or GUI updater 178 to update the application internal state 192. Alternatively, one or more of the application views 191 include one or more respective event handlers 190. Also, in some embodiments, one or more of data updater 176, object updater 177, and GUI updater 178 are included in a respective application view 191.
A respective event recognizer 180 receives event information (e.g., event data 179) from event sorter 170 and identifies an event from the event information. Event recognizer 180 includes event receiver 182 and event comparator 184. In some embodiments, event recognizer 180 also includes at least a subset of: metadata 183, and event delivery instructions 188 (which optionally include sub-event delivery instructions).
Event receiver 182 receives event information from event sorter 170. The event information includes information about a sub-event, for example, a touch or a touch movement. Depending on the sub-event, the event information also includes additional information, such as location of the sub-event. When the sub-event concerns motion of a touch, the event information optionally also includes speed and direction of the sub-event. In some embodiments, events include rotation of the device from one orientation to another (e.g., from a portrait orientation to a landscape orientation, or vice versa), and the event information includes corresponding information about the current orientation (also called device attitude) of the device.
Event comparator 184 compares the event information to predefined event or sub-event definitions and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event. In some embodiments, event comparator 184 includes event definitions 186. Event definitions 186 contain definitions of events (e.g., predefined sequences of sub-events), for example, event 1 (187-1), event 2 (187-2), and others. In some embodiments, sub-events in an event (187) include, for example, touch begin, touch end, touch movement, touch cancellation, and multiple touching. In one example, the definition for event 1 (187-1) is a double tap on a displayed object. The double tap, for example, comprises a first touch (touch begin) on the displayed object for a predetermined phase, a first liftoff (touch end) for a predetermined phase, a second touch (touch begin) on the displayed object for a predetermined phase, and a second liftoff (touch end) for a predetermined phase. In another example, the definition for event 2 (187-2) is a dragging on a displayed object. The dragging, for example, comprises a touch (or contact) on the displayed object for a predetermined phase, a movement of the touch across touch-sensitive display 112, and liftoff of the touch (touch end). In some embodiments, the event also includes information for one or more associated event handlers 190.
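As a concrete illustration of matching an event definition such as event 1 (187-1) against a sequence of sub-events, the following Swift sketch recognizes a double tap as the four-phase sequence described above. The SubEvent type, the state names, and the timing constant are assumptions for illustration, not the actual event definitions 186.

```swift
import Foundation

enum SubEvent {
    case touchBegin(time: TimeInterval)
    case touchEnd(time: TimeInterval)
    case touchMove
    case touchCancel
}

enum RecognizerState { case possible, recognized, failed }

// Matches the double-tap definition: touch begin, touch end, touch begin,
// touch end, with each phase completing within an assumed maximum duration.
struct DoubleTapRecognizer {
    let maxPhase: TimeInterval = 0.3   // illustrative phase duration
    private var expected = 0           // position in the begin/end sequence
    private var lastTime: TimeInterval?
    private(set) var state: RecognizerState = .possible

    mutating func consume(_ subEvent: SubEvent) {
        guard state == .possible else { return } // a failed recognizer disregards sub-events
        switch (subEvent, expected) {
        case (.touchBegin(let t), 0), (.touchBegin(let t), 2),
             (.touchEnd(let t), 1), (.touchEnd(let t), 3):
            if let last = lastTime, t - last > maxPhase { state = .failed; return }
            lastTime = t
            expected += 1
            if expected == 4 { state = .recognized }
        default:
            state = .failed   // e.g., touch movement breaks the double-tap sequence
        }
    }
}
```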
In some embodiments, event definition 187 includes a definition of an event for a respective user-interface object. In some embodiments, event comparator 184 performs a hit test to determine which user-interface object is associated with a sub-event. For example, in an application view in which three user-interface objects are displayed on touch-sensitive display 112, when a touch is detected on touch-sensitive display 112, event comparator 184 performs a hit test to determine which of the three user-interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190, the event comparator uses the result of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects an event handler associated with the sub-event and the object triggering the hit test.
In some embodiments, the definition for a respective event (187) also includes delayed actions that delay delivery of the event information until after it has been determined whether the sequence of sub-events does or does not correspond to the event recognizer's event type.
When a respective event recognizer 180 determines that the series of sub-events does not match any of the events in event definitions 186, the respective event recognizer 180 enters an event impossible, event failed, or event ended state, after which it disregards subsequent sub-events of the touch-based gesture. In this situation, other event recognizers, if any, that remain active for the hit view continue to track and process sub-events of an ongoing touch-based gesture.
In some embodiments, a respective event recognizer 180 includes metadata 183 with configurable properties, flags, and/or lists that indicate how the event delivery system should perform sub-event delivery to actively involved event recognizers. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate how event recognizers interact, or are enabled to interact, with one another. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate whether sub-events are delivered to varying levels in the view or programmatic hierarchy.
In some embodiments, a respective event recognizer 180 activates event handler 190 associated with an event when one or more particular sub-events of an event are recognized. In some embodiments, a respective event recognizer 180 delivers event information associated with the event to event handler 190. Activating an event handler 190 is distinct from sending (and deferring sending of) sub-events to a respective hit view. In some embodiments, event recognizer 180 throws a flag associated with the recognized event, and event handler 190 associated with the flag catches the flag and performs a predefined process.
In some embodiments, event delivery instructions 188 include sub-event delivery instructions that deliver event information about a sub-event without activating an event handler. Instead, the sub-event delivery instructions deliver event information to event handlers associated with the series of sub-events or to actively involved views. Event handlers associated with the series of sub-events or with actively involved views receive the event information and perform a predetermined process.
In some embodiments, data updater 176 creates and updates data used in application 136-1. For example, data updater 176 updates the telephone number used in contacts module 137, or stores a video file used in video player module. In some embodiments, object updater 177 creates and updates objects used in application 136-1. For example, object updater 177 creates a new user-interface object or updates the position of a user-interface object. GUI updater 178 updates the GUI. For example, GUI updater 178 prepares display information and sends it to graphics module 132 for display on a touch-sensitive display.
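The division of labor among the three updaters can be pictured as follows; the protocols, the handler type, and the call order shown are illustrative assumptions rather than the structure of event handler 190 itself.

```swift
// Hypothetical split of responsibilities inside an event handler: the data
// updater touches application data, the object updater touches
// user-interface objects, and the GUI updater prepares the display.
protocol DataUpdating   { func updateData(for event: String) }
protocol ObjectUpdating { func updateObjects(for event: String) }
protocol GUIUpdating    { func updateGUI() }

struct EventHandlerSketch {
    let data: DataUpdating
    let objects: ObjectUpdating
    let gui: GUIUpdating

    // Mirrors the flow described above: update the model, then the
    // user-interface objects, then send display information onward.
    func handle(event: String) {
        data.updateData(for: event)
        objects.updateObjects(for: event)
        gui.updateGUI()
    }
}
```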
In some embodiments, event handler(s) 190 includes or has access to data updater 176, object updater 177, and GUI updater 178. In some embodiments, data updater 176, object updater 177, and GUI updater 178 are included in a single module of a respective application 136-1 or application view 191. In other embodiments, they are included in two or more software modules.
It shall be understood that the foregoing discussion regarding event handling of user touches on touch-sensitive displays also applies to other forms of user inputs to operate multifunction devices 100 with input devices, not all of which are initiated on touch screens. For example, mouse movement and mouse button presses, optionally coordinated with single or multiple keyboard presses or holds; contact movements such as taps, drags, scrolls, etc. on touchpads; pen stylus inputs; movement of the device; oral instructions; detected eye movements; biometric inputs; and/or any combination thereof are optionally utilized as inputs corresponding to sub-events which define an event to be recognized.
Device 100 optionally also includes one or more physical buttons, such as “home” or menu button 204. As described previously, menu button 204 is, optionally, used to navigate to any application 136 in a set of applications that are, optionally, executed on device 100. Alternatively, in some embodiments, the menu button is implemented as a soft key in a GUI displayed on touch screen 112.
In some embodiments, device 100 includes touch screen 112, menu button 204, push button 206 for powering the device on/off and locking the device, volume adjustment button(s) 208, subscriber identity module (SIM) card slot 210, headset jack 212, and docking/charging external port 124. Push button 206 is, optionally, used to turn the power of the device on/off by depressing the button and holding the button in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process. In an alternative embodiment, device 100 also accepts verbal input for activation or deactivation of some functions through microphone 113. Device 100 also, optionally, includes one or more contact intensity sensors 165 for detecting intensity of contacts on touch screen 112 and/or one or more tactile output generators 167 for generating tactile outputs for a user of device 100.
Each of the above-identified elements in
Attention is now directed towards embodiments of user interfaces that are, optionally, implemented on, for example, portable multifunction device 100.
It should be noted that the icon labels illustrated in
Although some of the examples that follow will be given with reference to inputs on touch screen display 112 (where the touch-sensitive surface and the display are combined), in some embodiments, the device detects inputs on a touch-sensitive surface that is separate from the display, as shown in
Additionally, while the following examples are given primarily with reference to finger inputs (e.g., finger contacts, finger tap gestures, finger swipe gestures), it should be understood that, in some embodiments, one or more of the finger inputs are replaced with input from another input device (e.g., a mouse-based input or stylus input). For example, a swipe gesture is, optionally, replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact). As another example, a tap gesture is, optionally, replaced with a mouse click while the cursor is located over the location of the tap gesture (e.g., instead of detection of the contact followed by ceasing to detect the contact). Similarly, when multiple user inputs are simultaneously detected, it should be understood that multiple computer mice are, optionally, used simultaneously, or a mouse and finger contacts are, optionally, used simultaneously.
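One way to picture this substitution is to normalize heterogeneous inputs into a single stream of sub-events, so that a mouse click-drag-release and a finger swipe drive the same recognizers. The types below are assumptions for illustration only.

```swift
import Foundation

enum DeviceInput {
    case fingerDown(CGPoint), fingerMove(CGPoint), fingerUp(CGPoint)
    case mouseDown(CGPoint), mouseDrag(CGPoint), mouseUp(CGPoint)
}

enum NormalizedSubEvent { case begin(CGPoint), move(CGPoint), end(CGPoint) }

// A mouse click stands in for a contact and cursor movement stands in for
// movement of the contact, so a click-drag-release maps to a swipe.
func normalize(_ input: DeviceInput) -> NormalizedSubEvent {
    switch input {
    case .fingerDown(let p), .mouseDown(let p): return .begin(p)
    case .fingerMove(let p), .mouseDrag(let p): return .move(p)
    case .fingerUp(let p),   .mouseUp(let p):   return .end(p)
    }
}
```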
Exemplary techniques for detecting and processing touch intensity are found, for example, in related applications: International Patent Application Serial No. PCT/US2013/040061, titled “Device, Method, and Graphical User Interface for Displaying User Interface Objects Corresponding to an Application,” filed May 8, 2013, published as WIPO Publication No. WO/2013/169849, and International Patent Application Serial No. PCT/US2013/069483, titled “Device, Method, and Graphical User Interface for Transitioning Between Touch Input to Display Output Relationships,” filed Nov. 11, 2013, published as WIPO Publication No. WO/2014/105276, each of which is hereby incorporated by reference in its entirety.
In some embodiments, device 500 has one or more input mechanisms 506 and 508. Input mechanisms 506 and 508, if included, can be physical. Examples of physical input mechanisms include push buttons and rotatable mechanisms. In some embodiments, device 500 has one or more attachment mechanisms. Such attachment mechanisms, if included, can permit attachment of device 500 with, for example, hats, eyewear, earrings, necklaces, shirts, jackets, bracelets, watch straps, chains, trousers, belts, shoes, purses, backpacks, and so forth. These attachment mechanisms permit device 500 to be worn by a user.
In some examples, input mechanism 508 is a microphone. Personal electronic device 500 optionally includes various sensors, such as GPS sensor 532, accelerometer 534, directional sensor 540 (e.g., compass), gyroscope 536, motion sensor 538, and/or a combination thereof, all of which can be operatively connected to I/O section 514.
Memory 518 of personal electronic device 500 can include one or more non-transitory computer-readable storage mediums, for storing computer-executable instructions, which, when executed by one or more computer processors 516, for example, can cause the computer processors to perform the techniques described below, including method 700 (
As used here, the term “affordance” refers to a user-interactive graphical user interface object that is, optionally, displayed on the display screen of devices 100, 300, and/or 500 (
As used herein, the term “focus selector” refers to an input element that indicates a current part of a user interface with which a user is interacting. In some implementations that include a cursor or other location marker, the cursor acts as a “focus selector” so that when an input (e.g., a press input) is detected on a touch-sensitive surface (e.g., touchpad 355 in
As used in the specification and claims, the term “characteristic intensity” of a contact refers to a characteristic of the contact based on one or more intensities of the contact. In some embodiments, the characteristic intensity is based on multiple intensity samples. The characteristic intensity is, optionally, based on a predefined number of intensity samples, or a set of intensity samples collected during a predetermined time period (e.g., 0.05, 0.1, 0.2, 0.5, 1, 2, 5, 10 seconds) relative to a predefined event (e.g., after detecting the contact, prior to detecting liftoff of the contact, before or after detecting a start of movement of the contact, prior to detecting an end of the contact, before or after detecting an increase in intensity of the contact, and/or before or after detecting a decrease in intensity of the contact). A characteristic intensity of a contact is, optionally, based on one or more of: a maximum value of the intensities of the contact, a mean value of the intensities of the contact, an average value of the intensities of the contact, a top 10 percentile value of the intensities of the contact, a value at the half maximum of the intensities of the contact, a value at the 90 percent maximum of the intensities of the contact, or the like. In some embodiments, the duration of the contact is used in determining the characteristic intensity (e.g., when the characteristic intensity is an average of the intensity of the contact over time). In some embodiments, the characteristic intensity is compared to a set of one or more intensity thresholds to determine whether an operation has been performed by a user. For example, the set of one or more intensity thresholds optionally includes a first intensity threshold and a second intensity threshold. In this example, a contact with a characteristic intensity that does not exceed the first threshold results in a first operation, a contact with a characteristic intensity that exceeds the first intensity threshold and does not exceed the second intensity threshold results in a second operation, and a contact with a characteristic intensity that exceeds the second threshold results in a third operation. In some embodiments, a comparison between the characteristic intensity and one or more thresholds is used to determine whether or not to perform one or more operations (e.g., whether to perform a respective operation or forgo performing the respective operation), rather than being used to determine whether to perform a first operation or a second operation.
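For illustration, the sketch below computes one of the candidate statistics (the mean) over a buffer of intensity samples and applies the two-threshold comparison from the example above; the function names and the choice of the mean are assumptions, since the passage permits several statistics.

```swift
// One candidate characteristic intensity: the mean of the sampled
// intensities. The maximum, a top-10-percentile value, or a value at the
// half maximum would be equally valid choices per the description above.
func characteristicIntensity(of samples: [Double]) -> Double {
    guard !samples.isEmpty else { return 0 }
    return samples.reduce(0, +) / Double(samples.count)
}

// Three-way outcome from two thresholds, as in the example above.
func operation(for intensity: Double,
               firstThreshold: Double,
               secondThreshold: Double) -> String {
    if intensity > secondThreshold { return "third operation" }
    if intensity > firstThreshold  { return "second operation" }
    return "first operation"
}
```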
In some embodiments, a portion of a gesture is identified for purposes of determining a characteristic intensity. For example, a touch-sensitive surface optionally receives a continuous swipe contact transitioning from a start location and reaching an end location, at which point the intensity of the contact increases. In this example, the characteristic intensity of the contact at the end location is, optionally, based on only a portion of the continuous swipe contact, and not the entire swipe contact (e.g., only the portion of the swipe contact at the end location). In some embodiments, a smoothing algorithm is, optionally, applied to the intensities of the swipe contact prior to determining the characteristic intensity of the contact. For example, the smoothing algorithm optionally includes one or more of: an unweighted sliding-average smoothing algorithm, a triangular smoothing algorithm, a median filter smoothing algorithm, and/or an exponential smoothing algorithm. In some circumstances, these smoothing algorithms eliminate narrow spikes or dips in the intensities of the swipe contact for purposes of determining a characteristic intensity.
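As a sketch of the first algorithm named above, an unweighted sliding average replaces each sample with the mean of a small window around it, which removes narrow spikes or dips before the characteristic intensity is computed; the window size is an assumption.

```swift
// Unweighted sliding-average smoothing over the intensity samples.
func slidingAverage(_ samples: [Double], window: Int = 3) -> [Double] {
    guard window > 1, samples.count > 1 else { return samples }
    let half = window / 2
    return samples.indices.map { i in
        let lo = max(0, i - half)
        let hi = min(samples.count - 1, i + half)
        let slice = samples[lo...hi]
        return slice.reduce(0, +) / Double(slice.count)
    }
}
```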
The intensity of a contact on the touch-sensitive surface is, optionally, characterized relative to one or more intensity thresholds, such as a contact-detection intensity threshold, a light press intensity threshold, a deep press intensity threshold, and/or one or more other intensity thresholds. In some embodiments, the light press intensity threshold corresponds to an intensity at which the device will perform operations typically associated with clicking a button of a physical mouse or a trackpad. In some embodiments, the deep press intensity threshold corresponds to an intensity at which the device will perform operations that are different from operations typically associated with clicking a button of a physical mouse or a trackpad. In some embodiments, when a contact is detected with a characteristic intensity below the light press intensity threshold (e.g., and above a nominal contact-detection intensity threshold below which the contact is no longer detected), the device will move a focus selector in accordance with movement of the contact on the touch-sensitive surface without performing an operation associated with the light press intensity threshold or the deep press intensity threshold. Generally, unless otherwise stated, these intensity thresholds are consistent between different sets of user interface figures.
An increase of characteristic intensity of the contact from an intensity below the light press intensity threshold to an intensity between the light press intensity threshold and the deep press intensity threshold is sometimes referred to as a “light press” input. An increase of characteristic intensity of the contact from an intensity below the deep press intensity threshold to an intensity above the deep press intensity threshold is sometimes referred to as a “deep press” input. An increase of characteristic intensity of the contact from an intensity below the contact-detection intensity threshold to an intensity between the contact-detection intensity threshold and the light press intensity threshold is sometimes referred to as detecting the contact on the touch-surface. A decrease of characteristic intensity of the contact from an intensity above the contact-detection intensity threshold to an intensity below the contact-detection intensity threshold is sometimes referred to as detecting liftoff of the contact from the touch-surface. In some embodiments, the contact-detection intensity threshold is zero. In some embodiments, the contact-detection intensity threshold is greater than zero.
In some embodiments described herein, one or more operations are performed in response to detecting a gesture that includes a respective press input or in response to detecting the respective press input performed with a respective contact (or a plurality of contacts), where the respective press input is detected based at least in part on detecting an increase in intensity of the contact (or plurality of contacts) above a press-input intensity threshold. In some embodiments, the respective operation is performed in response to detecting the increase in intensity of the respective contact above the press-input intensity threshold (e.g., a “down stroke” of the respective press input). In some embodiments, the press input includes an increase in intensity of the respective contact above the press-input intensity threshold and a subsequent decrease in intensity of the contact below the press-input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in intensity of the respective contact below the press-input intensity threshold (e.g., an “up stroke” of the respective press input).
In some embodiments, the display of representations 578A-578C includes an animation. For example, representation 578A is initially displayed in proximity of application icon 572B, as shown in
In some embodiments, the device employs intensity hysteresis to avoid accidental inputs sometimes termed “jitter,” where the device defines or selects a hysteresis intensity threshold with a predefined relationship to the press-input intensity threshold (e.g., the hysteresis intensity threshold is X intensity units lower than the press-input intensity threshold or the hysteresis intensity threshold is 75%, 90%, or some reasonable proportion of the press-input intensity threshold). Thus, in some embodiments, the press input includes an increase in intensity of the respective contact above the press-input intensity threshold and a subsequent decrease in intensity of the contact below the hysteresis intensity threshold that corresponds to the press-input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in intensity of the respective contact below the hysteresis intensity threshold (e.g., an “up stroke” of the respective press input). Similarly, in some embodiments, the press input is detected only when the device detects an increase in intensity of the contact from an intensity at or below the hysteresis intensity threshold to an intensity at or above the press-input intensity threshold and, optionally, a subsequent decrease in intensity of the contact to an intensity at or below the hysteresis intensity, and the respective operation is performed in response to detecting the press input (e.g., the increase in intensity of the contact or the decrease in intensity of the contact, depending on the circumstances).
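The hysteresis rule can be sketched as a small state machine: the press is recognized when intensity rises to the press-input intensity threshold, and the corresponding "up stroke" is recognized only when intensity falls below the lower hysteresis threshold. The 75% proportion is one of the example values given above; everything else is illustrative.

```swift
// Press detection with hysteresis to suppress "jitter" around the
// press-input intensity threshold.
struct PressDetector {
    let pressThreshold: Double
    var hysteresisThreshold: Double { pressThreshold * 0.75 } // example proportion
    private(set) var pressed = false

    // Returns "down stroke" or "up stroke" when a transition is recognized,
    // and nil while intensity stays within the hysteresis band.
    mutating func consume(intensity: Double) -> String? {
        if !pressed && intensity >= pressThreshold {
            pressed = true
            return "down stroke"
        }
        if pressed && intensity < hysteresisThreshold {
            pressed = false
            return "up stroke"
        }
        return nil
    }
}
```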
For ease of explanation, the descriptions of operations performed in response to a press input associated with a press-input intensity threshold or in response to a gesture including the press input are, optionally, triggered in response to detecting either: an increase in intensity of a contact above the press-input intensity threshold, an increase in intensity of a contact from an intensity below the hysteresis intensity threshold to an intensity above the press-input intensity threshold, a decrease in intensity of the contact below the press-input intensity threshold, and/or a decrease in intensity of the contact below the hysteresis intensity threshold corresponding to the press-input intensity threshold. Additionally, in examples where an operation is described as being performed in response to detecting a decrease in intensity of a contact below the press-input intensity threshold, the operation is, optionally, performed in response to detecting a decrease in intensity of the contact below a hysteresis intensity threshold corresponding to, and lower than, the press-input intensity threshold.
Attention is now directed towards embodiments of user interfaces (“UI”) and associated processes that are implemented on an electronic device, such as portable multifunction device 100, device 300, or device 500.
At
In
As depicted in
At
As a result, while in a landscape orientation and while displaying app icons 612A-612Z, electronic device 600 detects swipe gesture 618 in a region corresponding to (e.g., occupied by) app icons 612A-612Z. In some embodiments, the region corresponding to app icons 612A-612Z does not include dock region 610 and/or the region occupied by status bar 620.
In some embodiments, upon detecting swipe gesture 618, electronic device 600 determines the direction of the movement in swipe gesture 618. As depicted in
Merging the page of widget(s) onto home screen 608 includes sliding widget 622A onto home screen 608 (e.g., from the left side of the display), and resizing the spacing between app icons 612A-612Z in order to accommodate the display of widget 622A on home screen 608. That is, the horizontal spacing between the app icons uniformly decreases. For example, the horizontal spacing between app icon 612A and app icon 612B decreases. Similarly, the horizontal spacing between app icon 612C and app icon 612D decreases by the same amount. In contrast, the vertical spacing between the app icons remains the same.
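The uniform horizontal compression can be expressed as a simple proportion: the width reclaimed for the widget is spread evenly across the gaps between app icon columns, while the vertical spacing is left alone. The grid model below is hypothetical.

```swift
// New horizontal gap between app icon columns after a widget column of
// the given width is merged onto the home screen; vertical spacing is
// unchanged, matching the behavior described above.
func compressedHorizontalSpacing(originalSpacing: Double,
                                 gapCount: Int,
                                 widgetWidth: Double) -> Double {
    guard gapCount > 0 else { return originalSpacing }
    let reclaimedPerGap = widgetWidth / Double(gapCount)
    return max(0, originalSpacing - reclaimedPerGap)
}
```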
Moreover, as depicted in
At
As a result of merging the page of widget(s) onto home screen 608, electronic device 600 updates page indicator 616 to reflect that the total number of pages has been reduced from three to two. Additionally, as a result of merging the page of widget(s) onto home screen 608, time indicator 624 and date indicator 626 move from status bar 620 to widget region 628 (further described below).
It is noted that the above described result in
As discussed above, upon determining that the direction of the movement in swipe gesture 618 is left-to-right, electronic device 600 concurrently displays widget 622A and app icons 612A-612Z on home screen 608. As depicted in
As shown in
At
At
At
It is noted that the above described result in
At
With reference to
At
In some embodiments, instead of detecting swipe gesture 656, electronic device 600 detects swipe gesture 660 at notification 654B, as shown in
After viewing widgets 622A-622E, the user navigates back to user interface 652 of
At
At
At
It is noted that electronic device 600 displays a single widget (e.g., 622A) in
At
At
At
At
Having reached the end of the scrolled widgets, the user performs a tap gesture on edit button 680 to enter an edit mode, as illustrated in
In the edit mode, the user can perform a variety of functions pertaining to the widgets or app icons. These functions can include reordering the widgets, moving widgets from one category (e.g., pinned, favorites, library) to another category, removing app icons from the home screen, and reorganizing app icons by changing the placement of app icons on the home screen. For example, in response to detecting a tap gesture at delete icon 673, electronic device 600 moves the widget (e.g., 622C) corresponding to delete icon 673 from the “favorites” category to the “library” category. As another example, in response to detecting a tap gesture at add icon 675, electronic device 600 moves the widget corresponding to add icon 675 from the “library” category to the “pinned” category or, in some embodiments, the “favorites” category. As yet another example, in response to detecting a drag gesture at icon 677, electronic device 600 moves the widget (e.g., 622B) corresponding to icon 677 to a different location in the list of widgets in accordance with the movement of the drag gesture. As yet another example, in response to detecting a tap gesture at icon 679, electronic device 600 initiates a process for deleting the application corresponding to app icon 612A, thereby resulting in removal of app icon 612A from home screen 608.
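For illustration, the category moves and the reorder described above can be modeled as operations on a small list; the category names mirror the pinned/favorites/library hierarchy, and everything else (types, defaults) is assumed for the sketch.

```swift
enum WidgetCategory { case pinned, favorites, library }

struct WidgetItem {
    let id: String
    var category: WidgetCategory
}

// Edit-mode operations mirroring the examples above: the delete icon moves
// a widget from "favorites" to "library", the add icon moves a widget from
// "library" to "pinned" (or "favorites"), and a drag reorders the list.
struct WidgetEditor {
    var widgets: [WidgetItem]

    mutating func tapDelete(id: String) {
        if let i = widgets.firstIndex(where: { $0.id == id }) {
            widgets[i].category = .library
        }
    }

    mutating func tapAdd(id: String, to category: WidgetCategory = .pinned) {
        if let i = widgets.firstIndex(where: { $0.id == id }) {
            widgets[i].category = category
        }
    }

    mutating func drag(id: String, toIndex newIndex: Int) {
        guard let i = widgets.firstIndex(where: { $0.id == id }),
              widgets.indices.contains(newIndex) else { return }
        let item = widgets.remove(at: i)
        widgets.insert(item, at: newIndex)
    }
}
```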
At
At
At
At
At
At
At
At
At
It is noted that the above described result in
At
As described below, method 700 provides an intuitive way for displaying widgets. The method reduces the cognitive burden on a user for displaying widgets, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling a user to display widgets faster and more efficiently conserves power and increases the time between battery charges.
An electronic device (e.g., 100, 300, 500, 600) displays (702), via the display device (e.g., 602), a first plurality of application icons (e.g., 612A-612Z) (e.g., a plurality of icons on a first page) without displaying a first set of one or more user interface elements (e.g., 622A) (e.g., without displaying widgets), wherein the application icons are selectable to display application user interfaces (e.g., 636) for corresponding applications (e.g., by the device detecting, on a touch-sensitive surface (e.g., of display 602), a tap gesture (e.g., 634) at a location (on the touch-sensitive surface) corresponding to a displayed location of the respective application icon (e.g., 612I)). In some embodiments, the first plurality of application icons (e.g., 612A-612Z) are displayed on a home screen (e.g., 608). In some embodiments, the first plurality of application icons are displayed without displaying a second plurality of application icons (e.g., 612AA-612AK) that are different from the first plurality of application icons.
While displaying, via the display device (e.g., 602), the first plurality of application icons (e.g., 612A-612Z), the electronic device detects (704) a first user input (e.g., 618, 692) (e.g., a user gesture, swipe, drag). In some embodiments, the first user input is detected in a region occupied by the first plurality of application icons.
In response (706) to detecting the first user input (e.g., 618, 692): in accordance (708) with a determination that the first user input includes movement in a first direction (e.g., left, right, up, down) (e.g., the direction is relative to the displayed user interface (e.g., 608)) (e.g., a determination that a first set of criteria is met, including a requirement that the first user input includes movement in a first direction and, optionally, that the orientation of the electronic device is in a first orientation): the electronic device ceases (710) display of the first plurality of application icons (e.g., 612A-612Z) (e.g., by sliding the first plurality of application icons off the display device (e.g., 602) in the first direction), and displays (712) (e.g., initially displays), via the display device, a second plurality of application icons (e.g., 612AA-612AK) that are different from the first plurality of application icons (e.g., without displaying widget(s) (e.g., 622A-622H) (e.g., user interface elements)), wherein the application icons are selectable to display application user interfaces for corresponding applications. Thus, the technique replaces display of the first plurality of application icons (e.g., 612A-612Z) with display of the second plurality of application icons (e.g., 612AA-612AK). In some embodiments, the determination is made by the electronic device. In some embodiments, the determination is made by a device external to the electronic device.
In response (706) to detecting the first user input (e.g., 618, 692): in accordance (714) with a determination that the first user input includes movement in a second direction (e.g., left, right, up, down) (e.g., the direction is relative to the displayed user interface (e.g., 608)) that is different from (e.g., substantially opposite to) the first direction (e.g., a determination that a second set of criteria is met, including a requirement that the first user input includes movement in a second direction and, optionally, that the orientation of the electronic device is in the first orientation): the electronic device modifies (716) (e.g., resizes) display of the first plurality of application icons (e.g., 612A-612Z) to change (e.g., to reduce) a distance between (e.g., the centers of, the edges of) a first application icon (e.g., 612A) of the first plurality of application icons and a second application icon (e.g., 612B) of the first plurality of application icons, and concurrently displays (718), via the display device, the modified first plurality of application icons (e.g., 612A-612Z) and the first set of one or more user interface elements (e.g., 622A) (e.g., without concurrently displaying the second plurality of application icons (e.g., 612AA-612AK)). Modifying display of the first plurality of application icons to change a distance between the first application icon and the second application icon enables concurrent display of the first plurality of application icons and the first set of one or more user interface elements. This is in contrast to ceasing display of the first plurality of application icons, which would necessitate additional user inputs to access the first plurality of application icons. Accordingly, modifying display of the first plurality of application icons to allow for concurrent display reduces the number of inputs needed for performing operations pertaining to the application icons and the user interface elements. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
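Blocks 706-718 reduce to a dispatch on the direction of the detected movement; the sketch below is schematic only and is not the claimed method.

```swift
enum SwipeDirection { case first, second }

enum HomeScreenEffect {
    case showNextPage   // cease the first icon grid, display the second (710, 712)
    case mergeWidgets   // compress icon spacing, display widgets alongside (716, 718)
}

// Dispatch on movement direction, following the two branches of the
// response to the first user input.
func respond(to direction: SwipeDirection) -> HomeScreenEffect {
    switch direction {
    case .first:  return .showNextPage
    case .second: return .mergeWidgets
    }
}
```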
In some embodiments, the distance that changes between the application icons (e.g., 612A-612Z) is a horizontal distance. In some embodiments, the vertical distance between the first application icon and the second application icon does not change. In some embodiments, horizontal and vertical is relative to the displayed user interface (e.g., 608) or the device itself. In some embodiments, a user interface element (e.g., 622A-622H) of the one or more user interface elements includes displayed content related to or corresponding to an application. In some embodiments, the modified first plurality of application icons (e.g., 612A-612Z) are displayed adjacent to the first set of one or more user interface elements (e.g., 622A). In some embodiments, the modified first plurality of application icons and the first set of one or more user interface elements are displayed on the home screen (e.g., 608). In some embodiments, the first set of one or more user interface elements includes only the pinned user interface elements (e.g., 622A) (e.g., user interface elements that are designated as a special category) regardless of the last state at which the user interface elements were displayed. In some embodiments, the first set of one or more user interface elements are selected for display based on the last state. That is, the device displays one or more of (e.g., all of) the user interface elements (e.g., 622A-622H) that were displayed during the last state. In some embodiments, the determination is made by the electronic device. In some embodiments, the determination is made by a device external to the electronic device.
In some embodiments, while concurrently displaying, via the display device (e.g., 602), the modified first plurality of application icons (e.g., 612A-612Z) and the first set of one or more user interface elements (e.g., 622A), the electronic device detects a second user input (e.g., 634, 638) (e.g., a user gesture, tap). In some embodiments, in response to detecting the second user input: in accordance with a determination that the second user input corresponds to selection of an application icon (e.g., 612I) of the modified first plurality of application icons, the electronic device displays (e.g., initially displays), via the display device, a user interface (e.g., 636) of an application corresponding to the selected application icon.
In some embodiments, displaying the user interface (e.g., 636) of an application corresponding to the selected application icon (e.g., 612I) includes replacing concurrent display of the modified first plurality of application icons (e.g., 612A-612Z) and the first set of one or more user interface elements (e.g., 622A) with display of the user interface of the application corresponding to the selected application icon. In some embodiments, displaying the user interface of the application occurs as a result of launching the application corresponding to the selected application icon.
In some embodiments, the selected application icon (e.g., 612I) and a user interface element (e.g., 622A) of the first set of one or more user interface elements correspond to the same application. In some embodiments, in response to detecting the second user input (e.g., 634, 638): in accordance with a determination that the second user input corresponds to selection of the user interface element of the first set of one or more user interface elements (e.g., an app icon (e.g., 640) corresponding to the user interface element), the electronic device displays (e.g., initially displays), via the display device (e.g., 602), a user interface (e.g., 636) of an application corresponding to the selected user interface element, wherein the application corresponding to the selected user interface element is the same as the application corresponding to the selected application icon.
In some embodiments, the user interface (e.g., 636) of the application corresponding to the selected user interface element (e.g., 622A) is the same user interface as the user interface of the application corresponding to the selected application icon (e.g., 612I). In some embodiments, displaying the user interface (e.g., 636) of the application corresponding to the selected user interface element (e.g., 622A) includes replacing concurrent display of the modified first plurality of application icons (e.g., 612A-612Z) and the first set of one or more user interface elements (e.g., 622A) with display of the user interface of the application corresponding to the selected user interface element. In some embodiments, displaying the user interface (e.g., 636) of the application occurs as a result of launching the application corresponding to the selected user interface element (e.g., 622A). In some embodiments, the user interface element (e.g., 622A) includes displayed information received from an application corresponding to the user interface element.
In some embodiments, while displaying, via the display device (e.g., 602), the user interface (e.g., 636) of the application corresponding to the selected application icon, the electronic device detects a user request (e.g., 642) to navigate to a home screen (e.g., 608) (e.g., a user input (e.g., swipe in the up direction) starting at a location proximate to the bottom edge of the device or activation of a physical or virtual home button). In some embodiments, the home screen (e.g., 608) includes the first plurality of application icons (e.g., 612A-612Z). In some embodiments, the user interface (e.g., 636) of the application corresponding to the selected application is displayed without displaying the modified first plurality of application icons (e.g., 612A-612Z) and/or the first set of one or more user interface elements (e.g., 622A).
In some embodiments, in response to detecting the user request (e.g., 642) to navigate to the home screen, the electronic device replaces display of the user interface (e.g., 636) of the application corresponding to the selected application with display of the home screen (e.g., 608), wherein displaying the home screen includes concurrently displaying, via the display device, the modified first plurality of application icons (e.g., 612A-612Z) and the first set of one or more user interface elements (e.g., 622A). Concurrently displaying the first plurality of application icons and the first set of one or more user interface elements after having displayed a user interface of an application enables a user to quickly regain access to the user interface elements. This is in contrast to ceasing display of the first set of one or more user interface elements after having displayed the user interface of the application, which would necessitate additional user inputs to access the user interface elements. Accordingly, concurrent display of the first plurality of application icons and the first set of one or more user interface elements in response to detecting the user request reduces the number of inputs needed for performing operations pertaining to the user interface elements. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, while concurrently displaying, via the display device (e.g., 602), the modified first plurality of application icons (e.g., 612A-612Z) and the first set of one or more user interface elements (e.g., 622A), the electronic device detects a first paging input (e.g., 644) (e.g., that includes movement in the first direction such as a swipe gesture in the first direction). In some embodiments, the first paging input (e.g., 644) is detected in a region (e.g., 630) occupied by the modified first plurality of application icons (e.g., 612A-612Z). In some embodiments, the first paging input (e.g., 644) is detected in a region (e.g., 630) that does not correspond to (e.g., is not occupied by) the first set of one or more user interface elements (e.g., 622A), the status bar (e.g., 620), and the dock region (e.g., 610).
In some embodiments, in response to detecting the paging input (e.g., 644), the electronic device displays, via the display device (e.g., 602), the second plurality of application icons (e.g., 612AA-612AK) that are different from the first plurality of application icons (and, optionally ceases to display the modified first plurality of application icons (e.g., 612A-612Z) and the first set of one or more user interface elements (e.g., 622A)).
In some embodiments, while displaying, via the display device (e.g., 602), the second plurality of application icons (e.g., 612AA-612AK), the electronic device detects a second paging input (e.g., 646) (e.g., that includes movement in the second direction such as a swipe gesture in the second direction). In some embodiments, the second plurality of application icons are displayed without displaying the modified first plurality of application icons (e.g., 612A-612Z) and/or the first set of one or more user interface elements (e.g., 622A). In some embodiments, the second paging input (e.g., 646) is detected in a region (e.g., 630) occupied by the second plurality of application icons (e.g., 612AA-612AK). In some embodiments, the second paging input is detected in a region (e.g., 630) that does not correspond to (e.g., is not occupied by) the first set of one or more user interface elements (e.g., 622A), the status bar (e.g., 620), and the dock region (e.g., 610).
In some embodiments, in response to detecting the second paging input (e.g., 646), the electronic device replaces display of the second plurality of application icons (e.g., 612AA-612AK) with display of the modified first plurality of application icons (e.g., 612A-612Z) and the first set of one or more user interface elements (e.g., 622A). Concurrently displaying the first plurality of application icons and the first set of one or more user interface elements after having displayed the second plurality of application icons enables a user to quickly regain access to the user interface elements. This is in contrast to ceasing display of the first set of one or more user interface elements after having displayed the second page of application icons, which would necessitate additional user inputs to access the user interface elements. Accordingly, concurrent display of the first plurality of application icons and the first set of one or more user interface elements in response to detecting the second paging input reduces the number of inputs needed for performing operations pertaining to the user interface elements. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, while concurrently displaying, via the display device (e.g., 602), the modified first plurality of application icons (e.g., 612A-612Z) and the first set of one or more user interface elements (e.g., 622A), the electronic device detects a user request (e.g., input 648 at hardware button 604) (while in an unlocked state) to transition the electronic device to a locked state. In some embodiments, in response to detecting the user request to transition the electronic device to the locked state: the electronic device transitions the electronic device from an unlocked state to the locked state, and ceases display of the modified first plurality of application icons (e.g., 612A-612Z) and the first set of one or more user interface elements (e.g., 622A) (e.g., including transitioning the display device to an inactive state (e.g., off, a state in which nothing is displayed) or displaying a wake screen user interface (e.g., 652)).
In some embodiments, while the electronic device is in the locked state, the electronic device detects one or more inputs (e.g., 662, biometric input) to transition the electronic device to an unlocked state and navigate to the home screen (e.g., 608) (e.g., detecting change in orientation of device, detecting user input (e.g., 662) at display device (e.g., 602) (e.g., tap gesture, swipe gesture starting near an edge of the display device, information corresponding to a biometric feature (e.g., fingerprint, face))). In some embodiments, in response to detecting the one or more inputs: the electronic device transitions the electronic device from the locked state to the unlocked state, and concurrently displays, via the display device (e.g., 602), the modified first plurality of application icons (e.g., 612A-612Z) and the first set of one or more user interface elements (e.g., 622A). In some embodiments, the electronic device maintains the state of what is displayed prior to locking of the device. Concurrently displaying the first plurality of application icons and the first set of one or more user interface elements after being in a locked state enables a user to quickly regain access to the user interface elements. This is in contrast to ceasing display of the first set of one or more user interface elements after being in a locked state, which would necessitate additional user inputs to access the user interface elements. Accordingly, concurrent display of the first plurality of application icons and the first set of one or more user interface elements in response to the one or more inputs reduces the number of inputs needed for performing operations pertaining to the user interface elements. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, the first user input (e.g., 618, 692) is detected while the electronic device (e.g., a displayed user interface (e.g., 608)) is in a first orientation (e.g., portrait orientation, landscape orientation). In some embodiments, ceasing display of the first plurality of application icons (e.g., 612A-612Z), displaying the second plurality of application icons (e.g., 612AA-612AK), modifying display of the first plurality of application icons, and/or concurrently displaying the modified first plurality of application icons and the first set of one or more user interface elements (e.g., 622A) occur while the electronic device is in the first orientation.
In some embodiments, while displaying, via the display device (e.g., 602), the first plurality of application icons (e.g., 612A-612Z), the electronic device detects a third user input (e.g., 689, 690) (e.g., a user gesture, swipe, drag) that is different from the first user input, wherein the third user input is detected while the electronic device (e.g., a displayed user interface) is in a second orientation (e.g., portrait orientation, landscape orientation) that is different from the first orientation.
In some embodiments, in response to detecting the third user input (e.g., 689, 690): in accordance with a determination that the detected third user input includes movement in a third direction (e.g., left, right, up, down) (e.g., the direction is relative to the displayed user interface (e.g., 608)) (e.g., a determination that a third set of criteria is met, including a requirement that the third user input includes movement in a third direction and, optionally, that the orientation of the electronic device is in the second orientation): the electronic device ceases display of the first plurality of application icons (e.g., 612A-612Z) (e.g., by sliding the first plurality of application icons off the display device (e.g., 602) in the third direction), and displays (e.g., initially displays), via the display device (e.g., 602), the second plurality of application icons (e.g., 612AA-612AK) that are different from the first plurality of application icons (e.g., without displaying widgets (e.g., 622A-622H)), wherein the application icons are selectable to display application user interfaces for corresponding applications. Thus, the technique replaces display of the first plurality of application icons (e.g., 612A-612Z) with display of the second plurality of application icons (e.g., 612AA-612AK). In some embodiments, the first direction and the third direction are in the same direction relative to the displayed user interface (e.g., 608).
In some embodiments, in response to detecting the third user input (e.g., 689, 690): in accordance with a determination that the detected third user input includes movement in a fourth direction (e.g., left, right, up, down) (e.g., the direction is relative to the displayed user interface (e.g., 608)) that is different from (e.g., substantially opposite to) the third direction (e.g., a determination that a fourth set of criteria is met, including a requirement that the third user input includes movement in a fourth direction and, optionally, that the orientation of the electronic device is in the second orientation): the electronic device ceases display of the first plurality of application icons (e.g., 612A-612Z) (e.g., by sliding the first plurality of application icons off the display device (e.g., 602) in the fourth direction), and displays (e.g., initially displays), via the display device (e.g., 602), the first set of one or more user interface elements (e.g., 622A) (e.g., without concurrently displaying the second plurality of application icons (e.g., 612AA-612AK) (or any application icons in the second plurality of application icons)). Thus, the technique replaces display of the first plurality of application icons (e.g., 612A-612Z) with display of the first set of one or more user interface elements (e.g., 622A). In some embodiments, the second direction and the fourth direction are in the same direction relative to the displayed user interface (e.g., 608).
In some embodiments, while displaying, via the display device (e.g., 602), the first plurality of application icons (e.g., 612A-612Z), the electronic device detects a third user input (e.g., 689, 690) (e.g., a user gesture, swipe, drag) that is different from the first user input, wherein the third user input is detected while the electronic device (e.g., a displayed user interface) is in a second orientation (e.g., portrait orientation, landscape orientation) that is different from the first orientation. In some embodiments, the third input is detected while the electronic device is in the first orientation. Thus, the user interface elements slide over the application icons regardless of the orientation of the electronic device.
In some embodiments, in response to detecting the third user input (e.g., 689, 690), the electronic device displays (e.g., initially displays), via the display device (e.g., 602), the first set of one or more user interface elements (e.g., 622A), wherein the first set of one or more user interface elements are overlaid on top of a portion of the first plurality of application icons (e.g., 612A-612Z). In some embodiments, displaying, via the display device (e.g., 602), the first set of one or more user interface elements (e.g., 622A) includes sliding the first set of one or more user interface elements onto the display device from an edge of the display device. In some embodiments, the edge of the display device (e.g., 602) is proximate to the location at which the third user input was detected. In some embodiments, maintaining display of the first plurality of application icons (e.g., 612A-612Z) includes reducing the size of the first plurality of application icons and/or blurring the first plurality of application icons.
In some embodiments, while concurrently displaying, via the display device (e.g., 602), the modified first plurality of application icons (e.g., 612A-612Z) and the first set of one or more user interface elements (e.g., 622A), the electronic device detects a user request (e.g., swipe, drag in the up/down direction) to vertically scroll the first set of one or more user interface elements. In some embodiments, the user request is a user request to scroll the user interface elements (e.g., 622A-622H). In some embodiments, the user request includes a gesture (e.g., swipe, drag) starting in a region (e.g., 628) corresponding to the user interface elements (e.g., 622A-622H) (e.g., a region that does not overlap with a region (e.g., 630) corresponding to the application icons (e.g., 612A-612AK)).
In some embodiments, in response to detecting the user request to vertically scroll the first set of one or more user interface elements (e.g., 622A), the electronic device vertically scrolls the first set of one or more user interface elements, wherein a portion of the first set of one or more user interface elements ceases to be displayed, and a portion of a second set of one or more user interface elements (e.g., 622B-622H) is displayed (e.g., initially displayed). In some embodiments, the first set of one or more user interface elements (e.g., 622A) ceases to be displayed at the same time that the portion of the second set of one or more user interface elements is displayed.
In some embodiments, while concurrently displaying, via the display device (e.g., 602), the modified first plurality of application icons (e.g., 612A-612Z) and the first set of one or more user interface elements (e.g., 622A), the electronic device detects (720) a fourth user input (e.g., 672) (e.g., user gesture, swipe, drag) in a predefined region (e.g., 628) corresponding to the first set of one or more user interface elements (e.g., the predefined region is occupied by the first set of one or more user interface elements, the predefined region is not occupied by the first plurality of application icons (e.g., 612A-612Z)), wherein the first set of one or more user interface elements includes a single user interface element (e.g., 622A) (e.g., displayed at a particular location on the display device (e.g., 602)).
In some embodiments, in response (722) to detecting the fourth user input (e.g., 672): in accordance (724) with a determination that the fourth user input includes movement in a fifth direction (e.g., up, down, left, right): the electronic device maintains (726) display of the single user interface element (e.g., 622A), and displays (728) (e.g., initially displays), via the display device (e.g., 602), a second set of one or more user interface elements (e.g., 622B-622C), wherein the single user interface element and the second set of one or more user interface elements are concurrently displayed. In some embodiments, the single user interface element (e.g., 622A) remains at the particular location on the display device (e.g., 602). In some embodiments, the second set of one or more user interface elements (e.g., 622B-622C) is displayed proximate to (e.g., below) the first set of one or more user interface elements (e.g., 622A). In some embodiments, in accordance with a determination that the fourth user input includes movement in a sixth direction that is different from (e.g., substantially opposite to) the fifth direction, the electronic device ceases display of the single user interface element (e.g., 622A) and displays a search bar (e.g., 658).
In some embodiments, while concurrently displaying, via the display device (e.g., 602), the single user interface element (e.g., 622A) and the second set of one or more user interface elements (e.g., 622B-622C), the electronic device detects a fifth user input (e.g., 688) (e.g., user gesture, swipe, drag (in an up/down direction)). In some embodiments, in response to detecting the fifth user input (e.g., 688): the electronic device ceases display of the second set of one or more user interface elements (e.g., 622B-622C), and maintains display of the single user interface element (e.g., 622A). In some embodiments, the fifth user input is detected in the predefined region (e.g., 628). In some embodiments, the single user interface element remains at the particular location on the display device (e.g., 602).
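By way of example and not limitation, the three display states described above (a single pinned element, the pinned element with additional elements below it, and a search bar) can be sketched as a small state machine in Swift; the names are hypothetical, and the mapping of the fifth and sixth directions to up/down is an assumption for illustration:

```swift
// Assumed directions: "fifth direction" == up, "sixth direction" == down.
enum VerticalDirection { case up, down }

enum WidgetColumnState {
    case singleElement            // e.g., 622A alone at its particular location
    case singleElementPlusMore    // e.g., 622A with 622B-622C displayed below
    case searchBar                // e.g., 658 replaces the single element
}

func nextState(from state: WidgetColumnState,
               direction: VerticalDirection) -> WidgetColumnState {
    switch (state, direction) {
    case (.singleElement, .up):
        // Fourth input, assumed fifth direction: keep 622A, reveal 622B-622C.
        return .singleElementPlusMore
    case (.singleElement, .down):
        // Fourth input, assumed sixth (opposite) direction: show the search bar.
        return .searchBar
    case (.singleElementPlusMore, .down):
        // Fifth input: collapse the second set, keep the single element.
        return .singleElement
    default:
        return state
    }
}
```

Note that in this sketch the single element is removed only on the assumed opposite-direction input, mirroring the "maintains display of the single user interface element" language above.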
In some embodiments, while concurrently displaying, via the display device (e.g., 602), the modified first plurality of application icons (e.g., 612A-612Z) and a third set of one or more user interface elements (e.g., 622F-622H), the electronic device detects a user request (e.g., 681, 683) to enter an edit mode (e.g., corresponding to 682) at a location corresponding to an application icon (e.g., 612H) of the modified first plurality of application icons (e.g., 612A-612Z) (e.g., long press on an application icon). In some embodiments, the user request is detected in the app region (e.g., 630). In some embodiments, the third set is the same as the first set of one or more user interface elements (e.g., 622A). In some embodiments, the edit mode enables editing (e.g., moving, reordering, reorganizing) of the application icons (e.g., 612A-612AK) and user interface elements (e.g., 622A-622H). For example, the user interface elements can have hierarchical categories (e.g., pinned, favorites, library). In some embodiments, the edit mode enables reorganizing of the user interface elements into the different hierarchical categories. In some embodiments, entering the edit mode includes displaying delete affordances (e.g., 673, 679) for application icons and user interface elements. In some embodiments, entering the edit mode includes enabling reorganizing (e.g., movement from an original location to a new location) of application icons and user interface elements via (e.g., in response to) gestures (e.g., drag, tap). In some embodiments, the user request to enter the edit mode is detected while concurrently displaying the modified first plurality of application icons (e.g., 612A-612Z) and the first set of one or more user interface elements (e.g., 622A).
In some embodiments, in response to detecting the user request (e.g., 681, 683) to enter the edit mode, the electronic device causes one or more user interface elements (e.g., 622A-622H) (e.g., first set, second set, third set, or a combination thereof) and application icons (e.g., 612A-612AK) to enter an edit mode. In some embodiments, detecting the user request at a location corresponding to a user interface element (e.g., long press on the user interface element) also causes the user interface elements and application icons to enter an edit mode.
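By way of example and not limitation, entry into the edit mode described above can be sketched as follows in Swift; `EditableItem`, `WidgetCategory`, and `enterEditMode` are hypothetical names, and the long-press trigger is assumed to have already been recognized:

```swift
// Items that can be edited: both application icons and widget elements.
struct EditableItem {
    var id: String
    var category: WidgetCategory? = nil  // nil for application icons
    var showsDeleteAffordance = false    // e.g., 673, 679
    var isDraggable = false              // enables drag-to-reorganize
}

// Hierarchical categories for user interface elements, per the example above.
enum WidgetCategory { case pinned, favorites, library }

// A long press on an icon (or on a widget element) puts *both* collections
// into edit mode: delete affordances appear and drag reordering is enabled.
func enterEditMode(icons: inout [EditableItem], widgets: inout [EditableItem]) {
    for i in icons.indices {
        icons[i].showsDeleteAffordance = true
        icons[i].isDraggable = true
    }
    for i in widgets.indices {
        widgets[i].showsDeleteAffordance = true
        widgets[i].isDraggable = true
    }
}
```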
In some embodiments, while concurrently displaying, via the display device (e.g., 602), the modified first plurality of application icons (e.g., 612A-612Z) and a third set of one or more user interface elements (e.g., 622F-622H), the electronic device detects a user request (e.g., 681, 683) (e.g., a tap gesture) to enter an edit mode at a location corresponding to an edit affordance (e.g., 680) that is displayed proximate to the third set of user interface elements. In some embodiments, the user request is detected in the widget region (e.g., 628). In response to detecting the user request to enter the edit mode, the electronic device causes one or more user interface elements (e.g., 622A-622H) (e.g., first set, second set, third set, or a combination thereof) and application icons (e.g., 612A-612AK) to enter an edit mode.
In some embodiments, while concurrently displaying, via the display device (e.g., 602), the modified first plurality of application icons (e.g., 612A-612Z) and the first set of one or more user interface elements (e.g., 622A), the electronic device detects (730) a change in orientation of the electronic device from a first orientation (e.g., landscape orientation, portrait orientation) to a second orientation (e.g., portrait orientation, landscape orientation) that is different from the first orientation. In some embodiments, in response (732) to detecting the change in orientation of the electronic device from the first orientation to the second orientation, the electronic device displays (734) the first plurality of application icons (e.g., 612A-612Z) without displaying the first set of one or more user interface elements (e.g., 622A). In some embodiments, in response to detecting the change in orientation of the electronic device from the first orientation to the second orientation, the electronic device changes (e.g., increases) the distance between a first application icon (e.g., 612A) and a second application icon (e.g., 612B) of the first plurality of application icons (e.g., 612A-612Z). Automatically ceasing to display the first set of one or more user interface elements in response to detecting a change in orientation allows the device to maintain display of the first plurality of application icons in the new orientation. Performing an operation when a set of conditions has been met without requiring further user input enhances the operability of the device (e.g., maintains accessibility of the first plurality of application icons) and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, while displaying the first plurality of application icons (e.g., 612A-612Z) without displaying the first set of one or more user interface elements (e.g., 622A), the electronic device detects a change in orientation of the electronic device from the second orientation to the first orientation. In some embodiments, in response to detecting the change in orientation of the electronic device from the second orientation to the first orientation, the electronic device concurrently displays, via the display device (e.g., 602), the modified first plurality of application icons (e.g., 612A-612Z) and the first set of one or more user interface elements (e.g., 622A). Concurrently displaying the first plurality of application icons and the first set of one or more user interface elements after reverting back to the device's previous orientation enables a user to quickly regain access to the user interface elements. This is in contrast to ceasing display of the first set of one or more user interface elements after a change in the orientation of the device, which would necessitate additional user inputs to access the user interface elements. Accordingly, concurrent display of the first plurality of application icons and the first set of one or more user interface elements in response to the change in orientation reduces the number of inputs needed for performing operations pertaining to the user interface elements. Reducing the number of inputs needed to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
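By way of example and not limitation, the orientation-dependent layout described above can be sketched as a pure function in Swift; the names are hypothetical, and which orientation shows the widgets is an assumption (the text allows either mapping):

```swift
import CoreGraphics

enum Orientation { case portrait, landscape }

struct HomeScreenLayout {
    var showsWidgets: Bool     // whether the first set (e.g., 622A) is displayed
    var iconSpacing: CGFloat   // distance between adjacent application icons
}

// Assumed mapping: widgets are shown alongside the icons in landscape;
// rotating to portrait hides them and spreads the icons back out. Rotating
// back to landscape restores the concurrent layout without further input.
func layout(for orientation: Orientation) -> HomeScreenLayout {
    switch orientation {
    case .landscape:
        return HomeScreenLayout(showsWidgets: true, iconSpacing: 8)
    case .portrait:
        return HomeScreenLayout(showsWidgets: false, iconSpacing: 16)
    }
}
```

Because the layout here is a function of orientation alone, reverting the device to its previous orientation deterministically restores the prior concurrent display, which is the behavior motivated above.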
In some embodiments, a user interface element (e.g., 622A) of the first set of user interface elements includes a first set of information from an application corresponding to the user interface element. In some embodiments, while concurrently displaying, via the display device (e.g., 602), the modified first plurality of application icons (e.g., 612A-612Z) and the first set of one or more user interface elements (e.g., 622A), the electronic device detects a user request (e.g., 666, 674) to expand the user interface element, wherein a portion (e.g., top portion, bottom portion) of the user interface element is displayed at a first location on the display device (e.g., 602). In some embodiments, in response to detecting the user request (e.g., 666, 674) to expand the user interface element: in accordance with a determination that expansion criteria are met (e.g., the expansion criteria include a requirement that the displayed user interface element (e.g., 622A) not be cropped due to expansion of the user interface element (e.g., no portion of the displayed user interface element ceases to be displayed as a result of expanding the user interface element)): the electronic device expands the size of the user interface element (e.g., 622A) while maintaining the portion of the user interface element (e.g., 622A) at the first location on the display device (e.g., 602), and displays (e.g., initially displays), via the display device (e.g., 602), a second set of information from the application corresponding to the user interface element, wherein the second set of information is different from the first set of information. In some embodiments, the second set of information is concurrently displayed with the first set of information.
In some embodiments, further in response to detecting the user request (e.g., 666, 674) to expand the user interface element (e.g., 622C): in accordance with a determination that the expansion criteria are not met: the electronic device expands the size of the user interface element and scrolls the user interface element, wherein scrolling the user interface element (e.g., 622C) causes the portion of the user interface element to be displayed at a second location on the display device (e.g., 602) that is different from the first location, and displays (e.g., initially displays), via the display device (e.g., 602), the second set of information from the application corresponding to the user interface element, wherein the second set of information is different from the first set of information. In some embodiments, the second set of information is concurrently displayed with the first set of information. Automatically scrolling the user interface element based on a determination that the expansion criteria are not met allows the user to view the full contents of the user interface element; otherwise, additional inputs would be required to further scroll the user interface element. Performing an operation when a set of conditions has been met without requiring further user input enhances the operability of the device (e.g., maintains visibility of the full contents of the user interface element) and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
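By way of example and not limitation, the expansion rule described above can be sketched as follows in Swift; the names and geometry are hypothetical, and the expansion criteria are reduced to a single no-cropping test for illustration:

```swift
import CoreGraphics

struct ExpansionResult {
    var newHeight: CGFloat
    var scrollAdjustment: CGFloat  // 0 when the expansion criteria are met
}

// If the element can grow to its expanded height without any portion being
// cropped by the visible bounds, expand in place (the anchored portion stays
// at the first location); otherwise expand and scroll by the overflow so the
// anchored portion moves to a second location and the full contents fit.
func expand(elementFrame: CGRect, expandedHeight: CGFloat,
            visibleBounds: CGRect) -> ExpansionResult {
    let bottomAfterExpansion = elementFrame.minY + expandedHeight
    let overflow = bottomAfterExpansion - visibleBounds.maxY
    if overflow <= 0 {
        // Expansion criteria met: nothing would be cropped.
        return ExpansionResult(newHeight: expandedHeight, scrollAdjustment: 0)
    } else {
        // Criteria not met: expand and scroll up by the overflow.
        return ExpansionResult(newHeight: expandedHeight, scrollAdjustment: -overflow)
    }
}
```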
In some embodiments, while concurrently displaying, via the display device (e.g., 602), the modified first plurality of application icons (e.g., 612A-612Z) and the first set of one or more user interface elements (e.g., 622A), the electronic device detects a sixth user input (e.g., 644, 691) (e.g., user gesture, swipe, drag) with movement in a seventh direction (e.g., left, right, up, down). In some embodiments, in response to detecting the sixth user input (e.g., 644, 691): in accordance with a determination that the sixth user input (e.g., the start of the user input, the end of the user input) with movement in the seventh direction is detected in a predefined region (e.g., 628) corresponding to the first set of one or more user interface elements (e.g., 622A) (e.g., the predefined region is occupied by the first set of one or more user interface elements, the predefined region is not occupied by the first plurality of application icons (e.g., 612A-612Z)): the electronic device displays, via the display device (e.g., 602), the first plurality of application icons (e.g., 612A-612Z), and ceases display of the first set of one or more user interface elements (e.g., 622A). In some embodiments, the distance between the first application icon and the second application icon of the first plurality of application icons is increased in response to detecting the sixth user input.
In some embodiments, in response to detecting the sixth user input (e.g., 644, 691): in accordance with a determination that the sixth user input (e.g., the start of the user input, the end of the user input) with movement in the seventh direction is detected in a predefined region (e.g., 630) corresponding to the modified first plurality of application icons (e.g., 612A-612Z) (e.g., the predefined region corresponding to the modified first plurality of application icons is adjacent to the predefined region (e.g., 628) corresponding to the first set of one or more user interface elements (e.g., 622A)), the electronic device replaces display of the modified first plurality of application icons (e.g., 612A-612Z) with display of the second plurality of application icons (e.g., 612AA-612AK), wherein the predefined region (e.g., 630) corresponding to the modified first plurality of application icons does not overlap with the predefined region (e.g., 628) corresponding to the first set of one or more user interface elements (e.g., 622A). Replacing display of the modified first plurality of application icons (e.g., 612A-612Z) includes ceasing display of the modified first plurality of application icons.
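By way of example and not limitation, the region-dependent handling of the sixth user input described above can be sketched as follows in Swift; the names are hypothetical:

```swift
import CoreGraphics

// The same directional movement produces different results depending on
// where it is detected: in the widget region (e.g., 628) it dismisses the
// widget column and restores the full icon grid; in the icon region
// (e.g., 630) it pages to the next set of application icons.
enum SwipeOutcome {
    case dismissWidgetsAndExpandIcons  // e.g., 612A-612Z shown without 622A
    case pageToNextIconSet             // e.g., 612AA-612AK replace 612A-612Z
    case ignored
}

func outcome(for location: CGPoint,
             widgetRegion: CGRect, iconRegion: CGRect) -> SwipeOutcome {
    if widgetRegion.contains(location) {
        return .dismissWidgetsAndExpandIcons
    } else if iconRegion.contains(location) {
        return .pageToNextIconSet
    }
    return .ignored
}
```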
The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the techniques and their practical applications. Others skilled in the art are thereby enabled to best utilize the techniques and various embodiments with various modifications as are suited to the particular use contemplated.
Although the disclosure and examples have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of the disclosure and examples as defined by the claims.
As described above, one aspect of the present technology is the gathering and use of data available from various sources for display in widgets. The present disclosure contemplates that in some instances, this gathered data may include personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include demographic data, location-based data, telephone numbers, email addresses, Twitter IDs, home addresses, data or records relating to a user's health or level of fitness (e.g., vital signs measurements, medication information, exercise information), date of birth, or any other identifying or personal information.
The present disclosure recognizes that the use of such personal information data, in the present technology, can be used to the benefit of users. For example, the personal information data can be used to provide useful, glanceable information in widgets. Further, other uses for personal information data that benefit the user are also contemplated by the present disclosure. For instance, health and fitness data may be used to provide insights into a user's general wellness, or may be used as positive feedback to individuals using technology to pursue wellness goals.
The present disclosure contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data private and secure. Such policies should be easily accessible by users, and should be updated as the collection and/or use of data changes. Personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection/sharing should occur after receiving the informed consent of the users. Additionally, such entities should consider taking any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices. In addition, policies and practices should be adapted for the particular types of personal information data being collected and/or accessed and adapted to applicable laws and standards, including jurisdiction-specific considerations. For instance, in the US, collection of or access to certain health data may be governed by federal and/or state laws, such as the Health Insurance Portability and Accountability Act (HIPAA); whereas health data in other countries may be subject to other regulations and policies and should be handled accordingly. Hence different privacy practices should be maintained for different personal data types in each country.
Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, in the case of displaying personal information data in widgets, the present technology can be configured to allow users to select to “opt in” or “opt out” of participation in the collection of personal information data during registration for services or anytime thereafter. In addition to providing “opt in” and “opt out” options, the present disclosure contemplates providing notifications relating to the access or use of personal information. For instance, a user may be notified upon downloading an app that their personal information data will be accessed and then reminded again just before personal information data is accessed by the app.
Moreover, it is the intent of the present disclosure that personal information data should be managed and handled in a way to minimize risks of unintentional or unauthorized access or use. Risk can be minimized by limiting the collection of data and deleting data once it is no longer needed. In addition, and when applicable, including in certain health related applications, data de-identification can be used to protect a user's privacy. De-identification may be facilitated, when appropriate, by removing specific identifiers (e.g., date of birth, etc.), controlling the amount or specificity of data stored (e.g., collecting location data at a city level rather than at an address level), controlling how data is stored (e.g., aggregating data across users), and/or other methods.
Therefore, although the present disclosure broadly covers use of personal information data to implement one or more various disclosed embodiments, the present disclosure also contemplates that the various embodiments can also be implemented without the need for accessing such personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data. For example, widgets can show glanceable information from applications based on non-personal information data or a bare minimum amount of personal information.
This application claims priority to U.S. Provisional Patent Application Ser. No. 62/843,507, titled “USER INTERFACES FOR WIDGETS,” filed May 5, 2019, the content of which is hereby incorporated by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
4355380 | Huguenin et al. | Oct 1982 | A |
4899136 | Beard et al. | Feb 1990 | A |
5051736 | Bennett et al. | Sep 1991 | A |
5124959 | Yamazaki et al. | Jun 1992 | A |
5146556 | Hullot et al. | Sep 1992 | A |
5196838 | Meier et al. | Mar 1993 | A |
5237679 | Wang et al. | Aug 1993 | A |
5312478 | Reed et al. | May 1994 | A |
5452414 | Rosendahl et al. | Sep 1995 | A |
5491778 | Gordon et al. | Feb 1996 | A |
5497454 | Bates et al. | Mar 1996 | A |
5515486 | Amro et al. | May 1996 | A |
5544295 | Capps | Aug 1996 | A |
5546529 | Bowers et al. | Aug 1996 | A |
5572238 | Krivacic | Nov 1996 | A |
5598524 | Johnston, Jr. et al. | Jan 1997 | A |
5610653 | Abecassis | Mar 1997 | A |
5612719 | Beernink et al. | Mar 1997 | A |
5621878 | Owens et al. | Apr 1997 | A |
5625818 | Zarmer et al. | Apr 1997 | A |
5642490 | Morgan et al. | Jun 1997 | A |
5644739 | Moursund | Jul 1997 | A |
5657049 | Ludolph et al. | Aug 1997 | A |
5671381 | Strasnick et al. | Sep 1997 | A |
5678015 | Goh | Oct 1997 | A |
5726687 | Belfiore et al. | Mar 1998 | A |
5736974 | Selker | Apr 1998 | A |
5745096 | Ludolph et al. | Apr 1998 | A |
5745116 | Pisutha-arnond | Apr 1998 | A |
5745718 | Cline et al. | Apr 1998 | A |
5745910 | Piersol et al. | Apr 1998 | A |
5754179 | Hocker et al. | May 1998 | A |
5754809 | Gandre | May 1998 | A |
5757371 | Oran et al. | May 1998 | A |
5760773 | Berman et al. | Jun 1998 | A |
5774119 | Alimpich et al. | Jun 1998 | A |
5796401 | Winer | Aug 1998 | A |
5801699 | Hocker et al. | Sep 1998 | A |
5801704 | Oohara et al. | Sep 1998 | A |
5812862 | Smith et al. | Sep 1998 | A |
5825352 | Bisset et al. | Oct 1998 | A |
5825357 | Malamud et al. | Oct 1998 | A |
5835079 | Shieh | Nov 1998 | A |
5835094 | Ermel et al. | Nov 1998 | A |
5838326 | Card et al. | Nov 1998 | A |
5861885 | Strasnick et al. | Jan 1999 | A |
5870683 | Wells et al. | Feb 1999 | A |
5870734 | Kao | Feb 1999 | A |
5877765 | Dickman et al. | Mar 1999 | A |
5877775 | Theisen et al. | Mar 1999 | A |
5880733 | Horvitz et al. | Mar 1999 | A |
5880743 | Moran et al. | Mar 1999 | A |
5900876 | Yagita et al. | May 1999 | A |
5914716 | Rubin et al. | Jun 1999 | A |
5914717 | Kleewein et al. | Jun 1999 | A |
5923327 | Smith et al. | Jul 1999 | A |
5923908 | Schrock et al. | Jul 1999 | A |
5934707 | Johnson | Aug 1999 | A |
5943679 | Niles et al. | Aug 1999 | A |
5956025 | Goulden et al. | Sep 1999 | A |
5963204 | Ikeda et al. | Oct 1999 | A |
5995106 | Naughton et al. | Nov 1999 | A |
6005579 | Sugiyama et al. | Dec 1999 | A |
6012072 | Lucas et al. | Jan 2000 | A |
6043818 | Nakano et al. | Mar 2000 | A |
6049336 | Liu et al. | Apr 2000 | A |
6054989 | Robertson et al. | Apr 2000 | A |
6072486 | Sheldon et al. | Jun 2000 | A |
6088032 | Mackinlay | Jul 2000 | A |
6111573 | McComb et al. | Aug 2000 | A |
6121969 | Jain et al. | Sep 2000 | A |
6133914 | Rogers et al. | Oct 2000 | A |
6166738 | Robertson et al. | Dec 2000 | A |
6188407 | Smith et al. | Feb 2001 | B1 |
6195094 | Celebiler | Feb 2001 | B1 |
6211858 | Moon et al. | Apr 2001 | B1 |
6222465 | Kumar et al. | Apr 2001 | B1 |
6229542 | Miller | May 2001 | B1 |
6253218 | Aoki et al. | Jun 2001 | B1 |
6275935 | Barlow et al. | Aug 2001 | B1 |
6278454 | Krishnan | Aug 2001 | B1 |
6313853 | Lamontagne et al. | Nov 2001 | B1 |
6317140 | Livingston | Nov 2001 | B1 |
6353451 | Teibel et al. | Mar 2002 | B1 |
6396520 | Ording | May 2002 | B1 |
6477117 | Narayanaswami et al. | Nov 2002 | B1 |
6486895 | Robertson et al. | Nov 2002 | B1 |
6496206 | Mernyk et al. | Dec 2002 | B1 |
6496209 | Horii | Dec 2002 | B2 |
6525997 | Narayanaswami et al. | Feb 2003 | B1 |
6545669 | Kinawi et al. | Apr 2003 | B1 |
6549218 | Gershony et al. | Apr 2003 | B1 |
6571245 | Huang et al. | May 2003 | B2 |
6590568 | Astala et al. | Jul 2003 | B1 |
6597378 | Shiraishi et al. | Jul 2003 | B1 |
6621509 | Eiref et al. | Sep 2003 | B1 |
6628309 | Dodson et al. | Sep 2003 | B1 |
6628310 | Hiura et al. | Sep 2003 | B1 |
6647534 | Graham | Nov 2003 | B1 |
6683628 | Nakagawa et al. | Jan 2004 | B1 |
6690623 | Maano | Feb 2004 | B1 |
6700612 | Anderson et al. | Mar 2004 | B1 |
6710788 | Freach et al. | Mar 2004 | B1 |
6714222 | Bjorn et al. | Mar 2004 | B1 |
6763388 | Tsimelzon | Jul 2004 | B1 |
6774914 | Benayoun | Aug 2004 | B1 |
6781575 | Hawkins et al. | Aug 2004 | B1 |
6798429 | Bradski | Sep 2004 | B2 |
6809724 | Shiraishi et al. | Oct 2004 | B1 |
6816175 | Hamp et al. | Nov 2004 | B1 |
6820111 | Rubin et al. | Nov 2004 | B1 |
6822638 | Dobies et al. | Nov 2004 | B2 |
6847387 | Roth | Jan 2005 | B2 |
6874128 | Moore et al. | Mar 2005 | B1 |
6880132 | Uemura | Apr 2005 | B2 |
6915490 | Ewing | Jul 2005 | B1 |
6931601 | Vronay et al. | Aug 2005 | B2 |
6934911 | Salmimaa et al. | Aug 2005 | B2 |
6940494 | Hoshino et al. | Sep 2005 | B2 |
6963349 | Nagasaki | Nov 2005 | B1 |
6970749 | Chinn et al. | Nov 2005 | B1 |
6976210 | Silva et al. | Dec 2005 | B1 |
6976228 | Bernhardson | Dec 2005 | B2 |
6978127 | Bulthuis et al. | Dec 2005 | B1 |
7003495 | Burger et al. | Feb 2006 | B1 |
7007239 | Hawkins et al. | Feb 2006 | B1 |
7010755 | Anderson et al. | Mar 2006 | B2 |
7017118 | Carroll | Mar 2006 | B1 |
7043701 | Gordon | May 2006 | B2 |
7071943 | Adler | Jul 2006 | B2 |
7075512 | Fabre et al. | Jul 2006 | B1 |
7080326 | Molander et al. | Jul 2006 | B2 |
7093201 | Duarte | Aug 2006 | B2 |
7107549 | Deaton et al. | Sep 2006 | B2 |
7117453 | Drucker et al. | Oct 2006 | B2 |
7119819 | Robertson et al. | Oct 2006 | B1 |
7126579 | Ritter | Oct 2006 | B2 |
7133859 | Wong | Nov 2006 | B1 |
7134092 | Fung et al. | Nov 2006 | B2 |
7134095 | Smith et al. | Nov 2006 | B1 |
7142210 | Schwuttke et al. | Nov 2006 | B2 |
7146576 | Chang et al. | Dec 2006 | B2 |
7155667 | Kotler et al. | Dec 2006 | B1 |
7173604 | Marvit et al. | Feb 2007 | B2 |
7178111 | Glein et al. | Feb 2007 | B2 |
7194527 | Drucker et al. | Mar 2007 | B2 |
7194698 | Gottfurcht et al. | Mar 2007 | B2 |
7215323 | Gombert et al. | May 2007 | B2 |
7216305 | Jaeger | May 2007 | B1 |
7231229 | Hawkins et al. | Jun 2007 | B1 |
7242406 | Robotham et al. | Jul 2007 | B2 |
7249327 | Nelson et al. | Jul 2007 | B2 |
7278115 | Conway et al. | Oct 2007 | B1 |
7283845 | De Bast | Oct 2007 | B2 |
7287232 | Tsuchimura et al. | Oct 2007 | B2 |
7292243 | Burke | Nov 2007 | B1 |
7310636 | Bodin et al. | Dec 2007 | B2 |
7340678 | Chiu et al. | Mar 2008 | B2 |
7355593 | Hill et al. | Apr 2008 | B2 |
7362331 | Ording | Apr 2008 | B2 |
7383497 | Glenner et al. | Jun 2008 | B2 |
7392488 | Card et al. | Jun 2008 | B2 |
7403211 | Sheasby et al. | Jul 2008 | B2 |
7403910 | Hastings et al. | Jul 2008 | B1 |
7404151 | Borchardt et al. | Jul 2008 | B2 |
7406666 | Davis et al. | Jul 2008 | B2 |
7412650 | Gallo | Aug 2008 | B2 |
7415677 | Arend et al. | Aug 2008 | B2 |
7417680 | Aoki et al. | Aug 2008 | B2 |
7432928 | Shaw et al. | Oct 2008 | B2 |
7433179 | Hisano et al. | Oct 2008 | B2 |
7434177 | Ording et al. | Oct 2008 | B1 |
7437005 | Drucker et al. | Oct 2008 | B2 |
7456823 | Poupyrev et al. | Nov 2008 | B2 |
7468742 | Ahn et al. | Dec 2008 | B2 |
7478437 | Hatanaka et al. | Jan 2009 | B2 |
7479948 | Kim et al. | Jan 2009 | B2 |
7480872 | Ubillos | Jan 2009 | B1 |
7480873 | Kawahara | Jan 2009 | B2 |
7487467 | Kawahara et al. | Feb 2009 | B1 |
7490295 | Chaudhri et al. | Feb 2009 | B2 |
7493573 | Wagner | Feb 2009 | B2 |
7496595 | Accapadi et al. | Feb 2009 | B2 |
7506268 | Jennings et al. | Mar 2009 | B2 |
7509321 | Wong et al. | Mar 2009 | B2 |
7509588 | Van Os et al. | Mar 2009 | B2 |
7511710 | Barrett | Mar 2009 | B2 |
7512898 | Jennings et al. | Mar 2009 | B2 |
7523414 | Schmidt et al. | Apr 2009 | B2 |
7526738 | Ording et al. | Apr 2009 | B2 |
7546548 | Chew et al. | Jun 2009 | B2 |
7546554 | Chiu et al. | Jun 2009 | B2 |
7552402 | Bilow | Jun 2009 | B2 |
7557804 | McDaniel | Jul 2009 | B1 |
7561874 | Wang et al. | Jul 2009 | B2 |
7584278 | Rajarajan et al. | Sep 2009 | B2 |
7587683 | Ito et al. | Sep 2009 | B2 |
7594185 | Anderson et al. | Sep 2009 | B2 |
7606819 | Audet et al. | Oct 2009 | B2 |
7607150 | Kobayashi et al. | Oct 2009 | B1 |
7620894 | Kahn | Nov 2009 | B1 |
7624357 | De Bast | Nov 2009 | B2 |
7642934 | Scott | Jan 2010 | B2 |
7650575 | Cummins et al. | Jan 2010 | B2 |
7657842 | Matthews et al. | Feb 2010 | B2 |
7657845 | Drucker et al. | Feb 2010 | B2 |
7663620 | Robertson et al. | Feb 2010 | B2 |
7665033 | Byrne et al. | Feb 2010 | B2 |
7667703 | Hong et al. | Feb 2010 | B2 |
7680817 | Audet et al. | Mar 2010 | B2 |
7683883 | Touma et al. | Mar 2010 | B2 |
7698658 | Ohwa et al. | Apr 2010 | B2 |
7710423 | Drucker et al. | May 2010 | B2 |
7716604 | Kataoka et al. | May 2010 | B2 |
7719523 | Hillis | May 2010 | B2 |
7719542 | Gough et al. | May 2010 | B1 |
7724242 | Hillis et al. | May 2010 | B2 |
7725839 | Michaels | May 2010 | B2 |
7728821 | Hillis et al. | Jun 2010 | B2 |
7730401 | Gillespie et al. | Jun 2010 | B2 |
7730423 | Graham | Jun 2010 | B2 |
7735021 | Padawer et al. | Jun 2010 | B2 |
7739604 | Lyons et al. | Jun 2010 | B1 |
7747289 | Wang et al. | Jun 2010 | B2 |
7761813 | Kim et al. | Jul 2010 | B2 |
7765266 | Kropivny | Jul 2010 | B2 |
7770125 | Young et al. | Aug 2010 | B1 |
7783990 | Amadio et al. | Aug 2010 | B2 |
7797637 | Marcjan | Sep 2010 | B2 |
7805684 | Arvilommi | Sep 2010 | B2 |
7810038 | Matsa et al. | Oct 2010 | B2 |
7840901 | Lacey et al. | Nov 2010 | B2 |
7840907 | Kikuchi et al. | Nov 2010 | B2 |
7840912 | Elias et al. | Nov 2010 | B2 |
7853972 | Brodersen et al. | Dec 2010 | B2 |
7856602 | Armstrong | Dec 2010 | B2 |
7873916 | Chaudhri | Jan 2011 | B1 |
7880726 | Nakadaira et al. | Feb 2011 | B2 |
7904832 | Ubillos | Mar 2011 | B2 |
7907124 | Hillis et al. | Mar 2011 | B2 |
7907476 | Lee | Mar 2011 | B2 |
7917869 | Anderson | Mar 2011 | B2 |
7924444 | Takahashi | Apr 2011 | B2 |
7940250 | Forstall | May 2011 | B2 |
7956869 | Gilra | Jun 2011 | B1 |
7958457 | Brandenberg et al. | Jun 2011 | B1 |
7979879 | Kazama et al. | Jul 2011 | B2 |
7986324 | Funaki et al. | Jul 2011 | B2 |
7995078 | Baar | Aug 2011 | B2 |
7996789 | Louch et al. | Aug 2011 | B2 |
8020110 | Hurst | Sep 2011 | B2 |
8024671 | Lee et al. | Sep 2011 | B2 |
8046714 | Yahiro et al. | Oct 2011 | B2 |
8059101 | Westerman et al. | Nov 2011 | B2 |
8064704 | Kim et al. | Nov 2011 | B2 |
8065618 | Kumar et al. | Nov 2011 | B2 |
8069404 | Audet | Nov 2011 | B2 |
8072439 | Hillis et al. | Dec 2011 | B2 |
8078966 | Audet | Dec 2011 | B2 |
8099441 | Surasinghe | Jan 2012 | B2 |
8103963 | Ikeda et al. | Jan 2012 | B2 |
8111255 | Park | Feb 2012 | B2 |
8125481 | Gossweiler et al. | Feb 2012 | B2 |
8130211 | Abernathy | Mar 2012 | B2 |
8139043 | Hillis | Mar 2012 | B2 |
8151185 | Audet | Apr 2012 | B2 |
8156175 | Hopkins | Apr 2012 | B2 |
8161419 | Palahnuk et al. | Apr 2012 | B2 |
8185842 | Chang et al. | May 2012 | B2 |
8188985 | Hillis et al. | May 2012 | B2 |
8205172 | Wong et al. | Jun 2012 | B2 |
8209628 | Davidson | Jun 2012 | B1 |
8214793 | Muthuswamy | Jul 2012 | B1 |
8230358 | Chaudhri | Jul 2012 | B1 |
8232990 | King et al. | Jul 2012 | B2 |
8255808 | Lindgren et al. | Aug 2012 | B2 |
8259163 | Bell | Sep 2012 | B2 |
8266550 | Cleron et al. | Sep 2012 | B1 |
8269729 | Han et al. | Sep 2012 | B2 |
8269739 | Hillis et al. | Sep 2012 | B2 |
8306515 | Ryu et al. | Nov 2012 | B2 |
8335784 | Gutt et al. | Dec 2012 | B2 |
8365084 | Lin et al. | Jan 2013 | B1 |
8423911 | Chaudhri | Apr 2013 | B2 |
8434027 | Jones | Apr 2013 | B2 |
8446371 | Fyke et al. | May 2013 | B2 |
8458615 | Chaudhri | Jun 2013 | B2 |
8519964 | Platzer et al. | Aug 2013 | B2 |
8519972 | Forstall et al. | Aug 2013 | B2 |
8525839 | Chaudhri | Sep 2013 | B2 |
8558808 | Forstall et al. | Oct 2013 | B2 |
8564544 | Jobs et al. | Oct 2013 | B2 |
8601370 | Chiang et al. | Dec 2013 | B2 |
8619038 | Chaudhri et al. | Dec 2013 | B2 |
8626762 | Seung et al. | Jan 2014 | B2 |
8672885 | Kriesel et al. | Mar 2014 | B2 |
8683349 | Roberts et al. | Mar 2014 | B2 |
8713011 | Asai et al. | Apr 2014 | B2 |
8713469 | Park et al. | Apr 2014 | B2 |
8730188 | Pasquero et al. | May 2014 | B2 |
8799777 | Lee et al. | Aug 2014 | B1 |
8799821 | De Rose et al. | Aug 2014 | B1 |
8826170 | Weber et al. | Sep 2014 | B1 |
8839128 | Krishnaraj et al. | Sep 2014 | B2 |
8881060 | Chaudhri et al. | Nov 2014 | B2 |
8881061 | Chaudhri et al. | Nov 2014 | B2 |
8957866 | Barnett et al. | Feb 2015 | B2 |
8972898 | Carter | Mar 2015 | B2 |
9026508 | Nagai | May 2015 | B2 |
9032438 | Ito et al. | May 2015 | B2 |
9053462 | Cadiz et al. | Jun 2015 | B2 |
9082314 | Tsai | Jul 2015 | B2 |
9152312 | Terleski et al. | Oct 2015 | B1 |
9170708 | Chaudhri et al. | Oct 2015 | B2 |
9237855 | Hong et al. | Jan 2016 | B2 |
9239673 | Shaffer et al. | Jan 2016 | B2 |
9256627 | Surasinghe | Feb 2016 | B2 |
9259615 | Weast et al. | Feb 2016 | B2 |
9367232 | Platzer et al. | Jun 2016 | B2 |
9377762 | Hoobler et al. | Jun 2016 | B2 |
9386432 | Chu et al. | Jul 2016 | B2 |
9417787 | Fong | Aug 2016 | B2 |
9448691 | Suda | Sep 2016 | B2 |
9619143 | Herz et al. | Apr 2017 | B2 |
9715277 | Lee | Jul 2017 | B2 |
9772749 | Chaudhri et al. | Sep 2017 | B2 |
9794397 | Min et al. | Oct 2017 | B2 |
9933913 | Van Os et al. | Apr 2018 | B2 |
9993913 | McCardle et al. | Jun 2018 | B2 |
10025458 | Chaudhri et al. | Jul 2018 | B2 |
10250735 | Butcher et al. | Apr 2019 | B2 |
10359907 | Van Os et al. | Jul 2019 | B2 |
10620780 | Chaudhri et al. | Apr 2020 | B2 |
10684592 | Chang et al. | Jun 2020 | B2 |
10788953 | Chaudhri et al. | Sep 2020 | B2 |
10788976 | Chaudhri et al. | Sep 2020 | B2 |
10884579 | Van Os et al. | Jan 2021 | B2 |
10915224 | Van Os et al. | Feb 2021 | B2 |
11009833 | Essery | May 2021 | B2 |
20010024195 | Hayakawa | Sep 2001 | A1 |
20010024212 | Ohnishi | Sep 2001 | A1 |
20010038394 | Tsuchimura et al. | Nov 2001 | A1 |
20020008691 | Hanajima et al. | Jan 2002 | A1 |
20020015024 | Westerman et al. | Feb 2002 | A1 |
20020015042 | Robotham et al. | Feb 2002 | A1 |
20020015064 | Robotham et al. | Feb 2002 | A1 |
20020018051 | Singh | Feb 2002 | A1 |
20020024540 | Mccarthy | Feb 2002 | A1 |
20020038299 | Zernik et al. | Mar 2002 | A1 |
20020054090 | Silva et al. | May 2002 | A1 |
20020057287 | Crow et al. | May 2002 | A1 |
20020067376 | Martin et al. | Jun 2002 | A1 |
20020078037 | Hatanaka et al. | Jun 2002 | A1 |
20020085037 | Leavitt et al. | Jul 2002 | A1 |
20020091697 | Huang et al. | Jul 2002 | A1 |
20020093531 | Barile | Jul 2002 | A1 |
20020097261 | Gottfurcht et al. | Jul 2002 | A1 |
20020104096 | Cramer et al. | Aug 2002 | A1 |
20020109721 | Konaka et al. | Aug 2002 | A1 |
20020140698 | Robertson et al. | Oct 2002 | A1 |
20020140736 | Chen | Oct 2002 | A1 |
20020143949 | Rajarajan et al. | Oct 2002 | A1 |
20020149561 | Fukumoto et al. | Oct 2002 | A1 |
20020152222 | Holbrook | Oct 2002 | A1 |
20020167683 | Hanamoto et al. | Nov 2002 | A1 |
20020191029 | Gillespie et al. | Dec 2002 | A1 |
20020196238 | Tsukada et al. | Dec 2002 | A1 |
20030001898 | Bernhardson | Jan 2003 | A1 |
20030007012 | Bate | Jan 2003 | A1 |
20030016241 | Burke | Jan 2003 | A1 |
20030030664 | Parry | Feb 2003 | A1 |
20030048295 | Lilleness et al. | Mar 2003 | A1 |
20030063072 | Brandenberg et al. | Apr 2003 | A1 |
20030080991 | Crow et al. | May 2003 | A1 |
20030085931 | Card et al. | May 2003 | A1 |
20030090572 | Belz et al. | May 2003 | A1 |
20030098894 | Sheldon et al. | May 2003 | A1 |
20030122787 | Zimmerman et al. | Jul 2003 | A1 |
20030128242 | Gordon | Jul 2003 | A1 |
20030142136 | Carter et al. | Jul 2003 | A1 |
20030156119 | Bonadio | Aug 2003 | A1 |
20030156140 | Watanabe | Aug 2003 | A1 |
20030156756 | Gokturk et al. | Aug 2003 | A1 |
20030160825 | Weber | Aug 2003 | A1 |
20030164827 | Gottesman et al. | Sep 2003 | A1 |
20030169298 | Ording | Sep 2003 | A1 |
20030169302 | Davidsson et al. | Sep 2003 | A1 |
20030174170 | Jung et al. | Sep 2003 | A1 |
20030174172 | Conrad et al. | Sep 2003 | A1 |
20030184552 | Chadha | Oct 2003 | A1 |
20030184587 | Ording et al. | Oct 2003 | A1 |
20030189597 | Anderson et al. | Oct 2003 | A1 |
20030195950 | Huang et al. | Oct 2003 | A1 |
20030200289 | Kemp et al. | Oct 2003 | A1 |
20030206195 | Matsa et al. | Nov 2003 | A1 |
20030206197 | Mcinerney | Nov 2003 | A1 |
20030210278 | Kyoya et al. | Nov 2003 | A1 |
20040008224 | Molander et al. | Jan 2004 | A1 |
20040021643 | Hoshino et al. | Feb 2004 | A1 |
20040027330 | Bradski | Feb 2004 | A1 |
20040056809 | Prassmayer et al. | Mar 2004 | A1 |
20040056839 | Yoshihara | Mar 2004 | A1 |
20040070608 | Saka | Apr 2004 | A1 |
20040103156 | Quillen et al. | May 2004 | A1 |
20040109013 | Goertz | Jun 2004 | A1 |
20040121823 | Noesgaard et al. | Jun 2004 | A1 |
20040125088 | Zimmerman et al. | Jul 2004 | A1 |
20040138569 | Grunwald et al. | Jul 2004 | A1 |
20040141011 | Smethers et al. | Jul 2004 | A1 |
20040143598 | Drucker et al. | Jul 2004 | A1 |
20040155909 | Wagner | Aug 2004 | A1 |
20040160462 | Sheasby et al. | Aug 2004 | A1 |
20040196267 | Kawai et al. | Oct 2004 | A1 |
20040215719 | Altshuler | Oct 2004 | A1 |
20040218104 | Smith et al. | Nov 2004 | A1 |
20040222975 | Nakano et al. | Nov 2004 | A1 |
20040236769 | Smith et al. | Nov 2004 | A1 |
20040257375 | Cowperthwaite | Dec 2004 | A1 |
20050005246 | Card et al. | Jan 2005 | A1 |
20050005248 | Rockey et al. | Jan 2005 | A1 |
20050010955 | Elia et al. | Jan 2005 | A1 |
20050012862 | Lee | Jan 2005 | A1 |
20050024341 | Gillespie et al. | Feb 2005 | A1 |
20050026644 | Lien | Feb 2005 | A1 |
20050039134 | Wiggeshoff et al. | Feb 2005 | A1 |
20050043987 | Kumar et al. | Feb 2005 | A1 |
20050052471 | Nagasaki | Mar 2005 | A1 |
20050057524 | Hill et al. | Mar 2005 | A1 |
20050057530 | Hinckley et al. | Mar 2005 | A1 |
20050057548 | Kim | Mar 2005 | A1 |
20050060653 | Fukase et al. | Mar 2005 | A1 |
20050060664 | Rogers | Mar 2005 | A1 |
20050060665 | Rekimoto | Mar 2005 | A1 |
20050088423 | Keely et al. | Apr 2005 | A1 |
20050091596 | Anthony et al. | Apr 2005 | A1 |
20050091609 | Matthews et al. | Apr 2005 | A1 |
20050097089 | Nielsen et al. | May 2005 | A1 |
20050116026 | Burger et al. | Jun 2005 | A1 |
20050120142 | Hall | Jun 2005 | A1 |
20050131924 | Jones | Jun 2005 | A1 |
20050134578 | Chambers et al. | Jun 2005 | A1 |
20050138570 | Good et al. | Jun 2005 | A1 |
20050151742 | Hong et al. | Jul 2005 | A1 |
20050177796 | Takahashi | Aug 2005 | A1 |
20050210410 | Ohwa et al. | Sep 2005 | A1 |
20050210412 | Matthews et al. | Sep 2005 | A1 |
20050216913 | Gemmell et al. | Sep 2005 | A1 |
20050227642 | Jensen | Oct 2005 | A1 |
20050229102 | Watson et al. | Oct 2005 | A1 |
20050246331 | De vorchik et al. | Nov 2005 | A1 |
20050251755 | Mullins et al. | Nov 2005 | A1 |
20050259087 | Hoshino et al. | Nov 2005 | A1 |
20050262448 | Vronay et al. | Nov 2005 | A1 |
20050270276 | Sugimoto et al. | Dec 2005 | A1 |
20050275636 | Dehlin et al. | Dec 2005 | A1 |
20050278757 | Grossman et al. | Dec 2005 | A1 |
20050283734 | Santoro et al. | Dec 2005 | A1 |
20050289476 | Tokkonen | Dec 2005 | A1 |
20050289482 | Anthony et al. | Dec 2005 | A1 |
20060004685 | Pyhalammi et al. | Jan 2006 | A1 |
20060005207 | Louch et al. | Jan 2006 | A1 |
20060007182 | Sato et al. | Jan 2006 | A1 |
20060020903 | Wang et al. | Jan 2006 | A1 |
20060022955 | Kennedy | Feb 2006 | A1 |
20060025110 | Liu | Feb 2006 | A1 |
20060026521 | Hotelling et al. | Feb 2006 | A1 |
20060026535 | Hotelling et al. | Feb 2006 | A1 |
20060026536 | Hotelling et al. | Feb 2006 | A1 |
20060031874 | Ok et al. | Feb 2006 | A1 |
20060033751 | Keely et al. | Feb 2006 | A1 |
20060035628 | Miller et al. | Feb 2006 | A1 |
20060036568 | Moore et al. | Feb 2006 | A1 |
20060048069 | Igeta | Mar 2006 | A1 |
20060051073 | Jung et al. | Mar 2006 | A1 |
20060053392 | Salmimaa et al. | Mar 2006 | A1 |
20060055700 | Niles et al. | Mar 2006 | A1 |
20060070007 | Cummins et al. | Mar 2006 | A1 |
20060075355 | Shiono et al. | Apr 2006 | A1 |
20060075396 | Surasinghe | Apr 2006 | A1 |
20060080386 | Roykkee et al. | Apr 2006 | A1 |
20060080616 | Vogel et al. | Apr 2006 | A1 |
20060080617 | Anderson et al. | Apr 2006 | A1 |
20060090022 | Flynn et al. | Apr 2006 | A1 |
20060092133 | Touma et al. | May 2006 | A1 |
20060092770 | Demas | May 2006 | A1 |
20060097991 | Hotelling et al. | May 2006 | A1 |
20060107231 | Matthews et al. | May 2006 | A1 |
20060112335 | Hofmeister et al. | May 2006 | A1 |
20060112347 | Baudisch | May 2006 | A1 |
20060116578 | Grunwald et al. | Jun 2006 | A1 |
20060117372 | Hopkins | Jun 2006 | A1 |
20060119619 | Fagans et al. | Jun 2006 | A1 |
20060123353 | Matthews et al. | Jun 2006 | A1 |
20060123359 | Schatzberger | Jun 2006 | A1 |
20060123360 | Anwar et al. | Jun 2006 | A1 |
20060125799 | Hillis et al. | Jun 2006 | A1 |
20060129586 | Arrouye et al. | Jun 2006 | A1 |
20060143574 | Ito et al. | Jun 2006 | A1 |
20060153531 | Kanegae et al. | Jul 2006 | A1 |
20060161863 | Gallo | Jul 2006 | A1 |
20060161871 | Hotelling et al. | Jul 2006 | A1 |
20060164418 | Hao et al. | Jul 2006 | A1 |
20060174211 | Hoellerer et al. | Aug 2006 | A1 |
20060187212 | Park et al. | Aug 2006 | A1 |
20060190833 | Sangiovanni et al. | Aug 2006 | A1 |
20060197752 | Hurst et al. | Sep 2006 | A1 |
20060197753 | Hotelling | Sep 2006 | A1 |
20060209035 | Jenkins et al. | Sep 2006 | A1 |
20060210958 | Rimas-ribikauskas et al. | Sep 2006 | A1 |
20060212828 | Yahiro et al. | Sep 2006 | A1 |
20060212833 | Gallagher et al. | Sep 2006 | A1 |
20060236266 | Majava | Oct 2006 | A1 |
20060242596 | Armstrong | Oct 2006 | A1 |
20060242604 | Wong et al. | Oct 2006 | A1 |
20060242607 | Hudson | Oct 2006 | A1 |
20060250578 | Pohl et al. | Nov 2006 | A1 |
20060253771 | Baschy | Nov 2006 | A1 |
20060262116 | Moshiri et al. | Nov 2006 | A1 |
20060267966 | Grossman et al. | Nov 2006 | A1 |
20060268100 | Karukka et al. | Nov 2006 | A1 |
20060271864 | Satterfield et al. | Nov 2006 | A1 |
20060271867 | Wang et al. | Nov 2006 | A1 |
20060271874 | Raiz et al. | Nov 2006 | A1 |
20060277460 | Forstall et al. | Dec 2006 | A1 |
20060277481 | Forstall et al. | Dec 2006 | A1 |
20060277486 | Skinner | Dec 2006 | A1 |
20060278692 | Matsumoto et al. | Dec 2006 | A1 |
20060282790 | Matthews et al. | Dec 2006 | A1 |
20060284852 | Hofmeister et al. | Dec 2006 | A1 |
20060290661 | Innanen et al. | Dec 2006 | A1 |
20070013665 | Vetelainen et al. | Jan 2007 | A1 |
20070016958 | Bodepudi et al. | Jan 2007 | A1 |
20070024468 | Quandel et al. | Feb 2007 | A1 |
20070028269 | Nezu et al. | Feb 2007 | A1 |
20070030362 | Ota et al. | Feb 2007 | A1 |
20070035513 | Sherrard et al. | Feb 2007 | A1 |
20070044029 | Fisher et al. | Feb 2007 | A1 |
20070050432 | Yoshizawa | Mar 2007 | A1 |
20070050726 | Wakai et al. | Mar 2007 | A1 |
20070055947 | Ostojic et al. | Mar 2007 | A1 |
20070061745 | Anthony et al. | Mar 2007 | A1 |
20070067272 | Flynt et al. | Mar 2007 | A1 |
20070070066 | Bakhash | Mar 2007 | A1 |
20070083827 | Scott et al. | Apr 2007 | A1 |
20070083911 | Madden et al. | Apr 2007 | A1 |
20070091068 | Liberty | Apr 2007 | A1 |
20070101292 | Kupka | May 2007 | A1 |
20070101297 | Forstall et al. | May 2007 | A1 |
20070106950 | Hutchinson et al. | May 2007 | A1 |
20070113207 | Gritton | May 2007 | A1 |
20070121869 | Gorti et al. | May 2007 | A1 |
20070123205 | Lee et al. | May 2007 | A1 |
20070124677 | de los reyes et al. | May 2007 | A1 |
20070126696 | Boillot | Jun 2007 | A1 |
20070126732 | Robertson et al. | Jun 2007 | A1 |
20070132789 | Ording et al. | Jun 2007 | A1 |
20070136351 | Dames et al. | Jun 2007 | A1 |
20070146325 | Poston et al. | Jun 2007 | A1 |
20070150810 | Katz et al. | Jun 2007 | A1 |
20070150834 | Muller et al. | Jun 2007 | A1 |
20070150835 | Muller | Jun 2007 | A1 |
20070152958 | Ahn et al. | Jul 2007 | A1 |
20070152980 | Kocienda et al. | Jul 2007 | A1 |
20070156697 | Tsarkova | Jul 2007 | A1 |
20070157089 | Van os et al. | Jul 2007 | A1 |
20070157094 | Lemay et al. | Jul 2007 | A1 |
20070157097 | Peters | Jul 2007 | A1 |
20070174785 | Perttula | Jul 2007 | A1 |
20070177803 | Elias et al. | Aug 2007 | A1 |
20070177804 | Elias et al. | Aug 2007 | A1 |
20070180395 | Yamashita et al. | Aug 2007 | A1 |
20070188518 | Vale et al. | Aug 2007 | A1 |
20070189737 | Chaudhri et al. | Aug 2007 | A1 |
20070192741 | Yoritate et al. | Aug 2007 | A1 |
20070226652 | Kikuchi et al. | Sep 2007 | A1 |
20070239760 | Simon | Oct 2007 | A1 |
20070240079 | Flynt et al. | Oct 2007 | A1 |
20070243862 | Coskun et al. | Oct 2007 | A1 |
20070243905 | Juh et al. | Oct 2007 | A1 |
20070245250 | Schechter et al. | Oct 2007 | A1 |
20070247425 | Liberty et al. | Oct 2007 | A1 |
20070250793 | Miura et al. | Oct 2007 | A1 |
20070250794 | Miura et al. | Oct 2007 | A1 |
20070266011 | Rohrs et al. | Nov 2007 | A1 |
20070271532 | Nguyen et al. | Nov 2007 | A1 |
20070288860 | Ording et al. | Dec 2007 | A1 |
20070288862 | Ording | Dec 2007 | A1 |
20070288868 | Rhee et al. | Dec 2007 | A1 |
20070294231 | Kaihotsu | Dec 2007 | A1 |
20080001924 | de los reyes et al. | Jan 2008 | A1 |
20080005702 | Skourup et al. | Jan 2008 | A1 |
20080005703 | Radivojevic et al. | Jan 2008 | A1 |
20080006762 | Fadell et al. | Jan 2008 | A1 |
20080016468 | Chambers et al. | Jan 2008 | A1 |
20080016471 | Park | Jan 2008 | A1 |
20080024454 | Everest | Jan 2008 | A1 |
20080034013 | Cisler et al. | Feb 2008 | A1 |
20080034309 | Louch et al. | Feb 2008 | A1 |
20080034317 | Fard et al. | Feb 2008 | A1 |
20080040668 | Ala-rantala | Feb 2008 | A1 |
20080059915 | Boillot | Mar 2008 | A1 |
20080062126 | Algreatly | Mar 2008 | A1 |
20080062141 | Chandhri | Mar 2008 | A1 |
20080062257 | Corson | Mar 2008 | A1 |
20080067626 | Hirler et al. | Mar 2008 | A1 |
20080082930 | Omernick et al. | Apr 2008 | A1 |
20080089587 | Kim et al. | Apr 2008 | A1 |
20080091763 | Devonshire et al. | Apr 2008 | A1 |
20080094369 | Ganatra et al. | Apr 2008 | A1 |
20080104515 | Dumitru et al. | May 2008 | A1 |
20080109408 | Choi et al. | May 2008 | A1 |
20080117461 | Mitsutake et al. | May 2008 | A1 |
20080120568 | Jian et al. | May 2008 | A1 |
20080122796 | Jobs et al. | May 2008 | A1 |
20080125180 | Hoffman et al. | May 2008 | A1 |
20080126971 | Kojima | May 2008 | A1 |
20080130421 | Akaiwa et al. | Jun 2008 | A1 |
20080134088 | Tse et al. | Jun 2008 | A1 |
20080136785 | Baudisch et al. | Jun 2008 | A1 |
20080151700 | Inoue et al. | Jun 2008 | A1 |
20080155453 | Othmer | Jun 2008 | A1 |
20080155617 | Angiolillo et al. | Jun 2008 | A1 |
20080158145 | Westerman | Jul 2008 | A1 |
20080158172 | Hotelling et al. | Jul 2008 | A1 |
20080161045 | Vuorenmaa | Jul 2008 | A1 |
20080164468 | Chen et al. | Jul 2008 | A1 |
20080165140 | Christie et al. | Jul 2008 | A1 |
20080168365 | Chaudhri | Jul 2008 | A1 |
20080168367 | Chaudhri et al. | Jul 2008 | A1 |
20080168368 | Louch et al. | Jul 2008 | A1 |
20080168382 | Louch et al. | Jul 2008 | A1 |
20080168401 | Boule et al. | Jul 2008 | A1 |
20080168478 | Platzer et al. | Jul 2008 | A1 |
20080180406 | Han et al. | Jul 2008 | A1 |
20080182628 | Lee et al. | Jul 2008 | A1 |
20080184112 | Chiang et al. | Jul 2008 | A1 |
20080204424 | Jin et al. | Aug 2008 | A1 |
20080216017 | Kurtenbach et al. | Sep 2008 | A1 |
20080222545 | Lemay et al. | Sep 2008 | A1 |
20080225007 | Nakadaira et al. | Sep 2008 | A1 |
20080229254 | Warner | Sep 2008 | A1 |
20080231610 | Hotelling et al. | Sep 2008 | A1 |
20080244119 | Tokuhara et al. | Oct 2008 | A1 |
20080259045 | Kim et al. | Oct 2008 | A1 |
20080259057 | Brons | Oct 2008 | A1 |
20080266407 | Battles et al. | Oct 2008 | A1 |
20080268948 | Boesen | Oct 2008 | A1 |
20080276201 | Risch et al. | Nov 2008 | A1 |
20080282202 | Sunday | Nov 2008 | A1 |
20080294981 | Balzano et al. | Nov 2008 | A1 |
20080300055 | Lutnick et al. | Dec 2008 | A1 |
20080300572 | Rankers et al. | Dec 2008 | A1 |
20080307361 | Louch et al. | Dec 2008 | A1 |
20080307362 | Chaudhri et al. | Dec 2008 | A1 |
20080309632 | Westerman et al. | Dec 2008 | A1 |
20080313110 | Kreamer et al. | Dec 2008 | A1 |
20080313596 | Kreamer et al. | Dec 2008 | A1 |
20090002335 | Chaudhri | Jan 2009 | A1 |
20090007017 | Anzures et al. | Jan 2009 | A1 |
20090019385 | Khatib et al. | Jan 2009 | A1 |
20090021488 | Kali et al. | Jan 2009 | A1 |
20090023433 | Walley et al. | Jan 2009 | A1 |
20090024946 | Gotz | Jan 2009 | A1 |
20090034805 | Perlmutter et al. | Feb 2009 | A1 |
20090058821 | Chaudhri et al. | Mar 2009 | A1 |
20090063971 | White et al. | Mar 2009 | A1 |
20090064055 | Chaudhri et al. | Mar 2009 | A1 |
20090070708 | Finkelstein | Mar 2009 | A1 |
20090077501 | Partridge et al. | Mar 2009 | A1 |
20090103780 | Nishihara et al. | Apr 2009 | A1 |
20090122018 | Vymenets et al. | May 2009 | A1 |
20090125842 | Nakayama | May 2009 | A1 |
20090132965 | Shimizu | May 2009 | A1 |
20090138194 | Geelen | May 2009 | A1 |
20090138827 | Van os et al. | May 2009 | A1 |
20090144653 | Ubillos | Jun 2009 | A1 |
20090150775 | Miyazaki et al. | Jun 2009 | A1 |
20090158200 | Palahnuk et al. | Jun 2009 | A1 |
20090163193 | Fyke et al. | Jun 2009 | A1 |
20090164936 | Kawaguchi | Jun 2009 | A1 |
20090178008 | Herz et al. | Jul 2009 | A1 |
20090183080 | Thakkar et al. | Jul 2009 | A1 |
20090183125 | Magal et al. | Jul 2009 | A1 |
20090184936 | Algreatly | Jul 2009 | A1 |
20090189911 | Ono | Jul 2009 | A1 |
20090199128 | Matthews et al. | Aug 2009 | A1 |
20090204920 | Beverley et al. | Aug 2009 | A1 |
20090204928 | Kallio et al. | Aug 2009 | A1 |
20090217187 | Kendall et al. | Aug 2009 | A1 |
20090217206 | Liu et al. | Aug 2009 | A1 |
20090217209 | Chen et al. | Aug 2009 | A1 |
20090222420 | Hirata | Sep 2009 | A1 |
20090222765 | Ekstrand | Sep 2009 | A1 |
20090228825 | Van os et al. | Sep 2009 | A1 |
20090237371 | Kim et al. | Sep 2009 | A1 |
20090237372 | Kim et al. | Sep 2009 | A1 |
20090254869 | Ludwig et al. | Oct 2009 | A1 |
20090265669 | Kida et al. | Oct 2009 | A1 |
20090271723 | Matsushima et al. | Oct 2009 | A1 |
20090278812 | Yasutake | Nov 2009 | A1 |
20090282369 | Jones | Nov 2009 | A1 |
20090303231 | Robinet et al. | Dec 2009 | A1 |
20090313567 | Kwon et al. | Dec 2009 | A1 |
20090313584 | Kerr et al. | Dec 2009 | A1 |
20090313585 | Hellinger et al. | Dec 2009 | A1 |
20090315848 | Ku et al. | Dec 2009 | A1 |
20090319928 | Alphin et al. | Dec 2009 | A1 |
20090319935 | Figura | Dec 2009 | A1 |
20090322676 | Kerr et al. | Dec 2009 | A1 |
20090327969 | Estrada | Dec 2009 | A1 |
20100011304 | Van Os | Jan 2010 | A1 |
20100013780 | Ikeda et al. | Jan 2010 | A1 |
20100031203 | Morris et al. | Feb 2010 | A1 |
20100050133 | Nishihara et al. | Feb 2010 | A1 |
20100053151 | Marti et al. | Mar 2010 | A1 |
20100058182 | Jung | Mar 2010 | A1 |
20100063813 | Richter et al. | Mar 2010 | A1 |
20100082661 | Beaudreau | Apr 2010 | A1 |
20100083165 | Andrews et al. | Apr 2010 | A1 |
20100095206 | Kim | Apr 2010 | A1 |
20100095238 | Baudet | Apr 2010 | A1 |
20100095248 | Karstens | Apr 2010 | A1 |
20100100841 | Shin | Apr 2010 | A1 |
20100105454 | Weber et al. | Apr 2010 | A1 |
20100107101 | Shaw et al. | Apr 2010 | A1 |
20100110025 | Lim | May 2010 | A1 |
20100115428 | Shuping et al. | May 2010 | A1 |
20100122195 | Hwang | May 2010 | A1 |
20100124152 | Lee | May 2010 | A1 |
20100153844 | Hwang et al. | Jun 2010 | A1 |
20100153878 | Lindgren et al. | Jun 2010 | A1 |
20100157742 | Relyea et al. | Jun 2010 | A1 |
20100159909 | Stifelman | Jun 2010 | A1 |
20100162170 | Johns et al. | Jun 2010 | A1 |
20100169357 | Ingrassia et al. | Jul 2010 | A1 |
20100199227 | Xiao et al. | Aug 2010 | A1 |
20100211872 | Rolston et al. | Aug 2010 | A1 |
20100223563 | Green | Sep 2010 | A1 |
20100223574 | Wang et al. | Sep 2010 | A1 |
20100229129 | Price et al. | Sep 2010 | A1 |
20100229130 | Edge et al. | Sep 2010 | A1 |
20100241955 | Price et al. | Sep 2010 | A1 |
20100241967 | Lee | Sep 2010 | A1 |
20100241999 | Russ et al. | Sep 2010 | A1 |
20100248788 | Yook et al. | Sep 2010 | A1 |
20100251085 | Zearing et al. | Sep 2010 | A1 |
20100257468 | Bernardo et al. | Oct 2010 | A1 |
20100281408 | Fujioka et al. | Nov 2010 | A1 |
20100315413 | Izadi et al. | Dec 2010 | A1 |
20100318709 | Bell et al. | Dec 2010 | A1 |
20100325529 | Sun | Dec 2010 | A1 |
20100332497 | Valliani et al. | Dec 2010 | A1 |
20100333017 | Ortiz | Dec 2010 | A1 |
20110004835 | Yanchar et al. | Jan 2011 | A1 |
20110007000 | Lim | Jan 2011 | A1 |
20110012921 | Cholewin et al. | Jan 2011 | A1 |
20110029934 | Locker et al. | Feb 2011 | A1 |
20110041098 | Kajiya et al. | Feb 2011 | A1 |
20110055722 | Ludwig | Mar 2011 | A1 |
20110059733 | Kim et al. | Mar 2011 | A1 |
20110061010 | Wasko | Mar 2011 | A1 |
20110078597 | Rapp et al. | Mar 2011 | A1 |
20110083104 | Minton | Apr 2011 | A1 |
20110093821 | Wigdor et al. | Apr 2011 | A1 |
20110119610 | Hackborn et al. | May 2011 | A1 |
20110119629 | Huotari et al. | May 2011 | A1 |
20110124376 | Kim | May 2011 | A1 |
20110131534 | Subramanian et al. | Jun 2011 | A1 |
20110145758 | Rosales et al. | Jun 2011 | A1 |
20110148786 | Day et al. | Jun 2011 | A1 |
20110148798 | Dahl | Jun 2011 | A1 |
20110167058 | Van Os | Jul 2011 | A1 |
20110167365 | Wingrove et al. | Jul 2011 | A1 |
20110173556 | Czerwinski et al. | Jul 2011 | A1 |
20110179368 | King et al. | Jul 2011 | A1 |
20110225549 | Kim | Sep 2011 | A1 |
20110246918 | Henderson | Oct 2011 | A1 |
20110252346 | Chaudhri | Oct 2011 | A1 |
20110252349 | Chaudhri | Oct 2011 | A1 |
20110252372 | Chaudhri | Oct 2011 | A1 |
20110252373 | Chaudhri | Oct 2011 | A1 |
20110283334 | Choi et al. | Nov 2011 | A1 |
20110285659 | Kuwabara et al. | Nov 2011 | A1 |
20110298723 | Fleizach et al. | Dec 2011 | A1 |
20110302513 | Ademar et al. | Dec 2011 | A1 |
20110310005 | Chen et al. | Dec 2011 | A1 |
20110310058 | Yamada et al. | Dec 2011 | A1 |
20110314098 | Farrell et al. | Dec 2011 | A1 |
20120023471 | Fischer et al. | Jan 2012 | A1 |
20120030623 | Hoellwarth | Feb 2012 | A1 |
20120066630 | Kim et al. | Mar 2012 | A1 |
20120084692 | Bae | Apr 2012 | A1 |
20120084694 | Sirpal et al. | Apr 2012 | A1 |
20120110031 | Lahcanski et al. | May 2012 | A1 |
20120151331 | Pallakoff et al. | Jun 2012 | A1 |
20120169617 | Maenpaa | Jul 2012 | A1 |
20120216146 | Korkonen | Aug 2012 | A1 |
20120304092 | Jarrett | Nov 2012 | A1 |
20120324390 | Tao | Dec 2012 | A1 |
20130019175 | Kotler et al. | Jan 2013 | A1 |
20130067411 | Kataoka et al. | Mar 2013 | A1 |
20130111400 | Miwa | May 2013 | A1 |
20130194066 | Rahman et al. | Aug 2013 | A1 |
20130205244 | Decker et al. | Aug 2013 | A1 |
20130234924 | Janefalkar et al. | Sep 2013 | A1 |
20130321340 | Seo et al. | Dec 2013 | A1 |
20130332886 | Cranfill et al. | Dec 2013 | A1 |
20140015786 | Honda | Jan 2014 | A1 |
20140068483 | Platzer et al. | Mar 2014 | A1 |
20140108978 | Yu et al. | Apr 2014 | A1 |
20140109024 | Miyazaki | Apr 2014 | A1 |
20140135631 | Brumback et al. | May 2014 | A1 |
20140139637 | Mistry et al. | May 2014 | A1 |
20140143784 | Mistry et al. | May 2014 | A1 |
20140165006 | Chaudhri et al. | Jun 2014 | A1 |
20140195972 | Lee et al. | Jul 2014 | A1 |
20140200742 | Mauti | Jul 2014 | A1 |
20140215457 | Gava et al. | Jul 2014 | A1 |
20140237360 | Chaudhri et al. | Aug 2014 | A1 |
20140276244 | Kamyar | Sep 2014 | A1 |
20140293755 | Geiser et al. | Oct 2014 | A1 |
20140317555 | Choi et al. | Oct 2014 | A1 |
20140328151 | Serber | Nov 2014 | A1 |
20140365126 | Vulcano et al. | Dec 2014 | A1 |
20150012853 | Chaudhri et al. | Jan 2015 | A1 |
20150015500 | Lee et al. | Jan 2015 | A1 |
20150089407 | Suzuki | Mar 2015 | A1 |
20150105125 | Min et al. | Apr 2015 | A1 |
20150112752 | Wagner et al. | Apr 2015 | A1 |
20150117162 | Tsai | Apr 2015 | A1 |
20150160812 | Yuan et al. | Jun 2015 | A1 |
20150172438 | Yang | Jun 2015 | A1 |
20150242092 | Van os et al. | Aug 2015 | A1 |
20150242989 | Mun et al. | Aug 2015 | A1 |
20150281945 | Seo et al. | Oct 2015 | A1 |
20150301506 | Koumaiha | Oct 2015 | A1 |
20150366518 | Sampson | Dec 2015 | A1 |
20150379476 | Chaudhri et al. | Dec 2015 | A1 |
20160034148 | Wilson et al. | Feb 2016 | A1 |
20160034167 | Wilson et al. | Feb 2016 | A1 |
20160048296 | Gan et al. | Feb 2016 | A1 |
20160054710 | Jo et al. | Feb 2016 | A1 |
20160058337 | Blahnik et al. | Mar 2016 | A1 |
20160062572 | Yang et al. | Mar 2016 | A1 |
20160077495 | Brown et al. | Mar 2016 | A1 |
20160117141 | Ro et al. | Apr 2016 | A1 |
20160124626 | Lee et al. | May 2016 | A1 |
20160139798 | Takikawa | May 2016 | A1 |
20160179310 | Chaudhri et al. | Jun 2016 | A1 |
20160182805 | Emmett et al. | Jun 2016 | A1 |
20160224211 | Xu et al. | Aug 2016 | A1 |
20160253065 | Platzer et al. | Sep 2016 | A1 |
20160269540 | Butcher et al. | Sep 2016 | A1 |
20160313913 | Leem | Oct 2016 | A1 |
20170039535 | Park et al. | Feb 2017 | A1 |
20170075305 | Ryu et al. | Mar 2017 | A1 |
20170147198 | Herz et al. | May 2017 | A1 |
20170255169 | Lee et al. | Sep 2017 | A1 |
20170344329 | Oh et al. | Nov 2017 | A1 |
20170357426 | Wilson et al. | Dec 2017 | A1 |
20170357427 | Wilson et al. | Dec 2017 | A1 |
20170357433 | Boule et al. | Dec 2017 | A1 |
20170374205 | Panda | Dec 2017 | A1 |
20180150216 | Choi | May 2018 | A1 |
20180307388 | Chaudhri et al. | Oct 2018 | A1 |
20190171349 | Van Os et al. | Jun 2019 | A1 |
20190173996 | Butcher et al. | Jun 2019 | A1 |
20190179514 | Van Os et al. | Jun 2019 | A1 |
20190235724 | Platzer et al. | Aug 2019 | A1 |
20190320057 | Omernick et al. | Oct 2019 | A1 |
20190369842 | Dolbakian | Dec 2019 | A1 |
20200000035 | Calmer | Jan 2020 | A1 |
20200054549 | Paufique | Feb 2020 | A1 |
20200142554 | Lin | May 2020 | A1 |
20200192683 | Lin | Jun 2020 | A1 |
20200225843 | Herz et al. | Jul 2020 | A1 |
20200333945 | Wilson et al. | Oct 2020 | A1 |
20200348814 | Platzer et al. | Nov 2020 | A1 |
20200356242 | Wilson et al. | Nov 2020 | A1 |
20200379615 | Chaudhri et al. | Dec 2020 | A1 |
20210109647 | Van Os et al. | Apr 2021 | A1 |
20210112152 | Omernick et al. | Apr 2021 | A1 |
20210132758 | Xu | May 2021 | A1 |
20210141506 | Chaudhri et al. | May 2021 | A1 |
20210195013 | Butcher et al. | Jun 2021 | A1 |
20210271374 | Chaudhri et al. | Sep 2021 | A1 |
20210311438 | Wilson et al. | Oct 2021 | A1 |
20220137765 | Platzer et al. | May 2022 | A1 |
20220202384 | Saiki et al. | Jun 2022 | A1 |
20220206649 | Chaudhri et al. | Jun 2022 | A1 |
20220377167 | Omernick et al. | Nov 2022 | A1 |
20220413684 | Van Os et al. | Dec 2022 | A1 |
20220417358 | Butcher et al. | Dec 2022 | A1 |
Number | Date | Country |
---|---|---|
2012202140 | May 2012 | AU |
2015100115 | Mar 2015 | AU |
2015101022 | Sep 2015 | AU |
2349649 | Jan 2002 | CA |
2800123 | Jul 2016 | CA |
700242 | Jul 2010 | CH |
1392977 | Jan 2003 | CN |
1464719 | Dec 2003 | CN |
1695105 | Nov 2005 | CN |
1773875 | May 2006 | CN |
1786906 | Jun 2006 | CN |
1940833 | Apr 2007 | CN |
1998150 | Jul 2007 | CN |
101072410 | Nov 2007 | CN |
101308443 | Nov 2008 | CN |
102081502 | Jun 2011 | CN |
102244676 | Nov 2011 | CN |
102446059 | May 2012 | CN |
102801649 | Nov 2012 | CN |
103210366 | Jul 2013 | CN |
103649897 | Mar 2014 | CN |
104281405 | Jan 2015 | CN |
104471532 | Mar 2015 | CN |
104580576 | Apr 2015 | CN |
105335087 | Feb 2016 | CN |
163032 | Dec 1985 | EP |
404373 | Dec 1990 | EP |
626635 | Nov 1994 | EP |
689134 | Dec 1995 | EP |
844553 | May 1998 | EP |
1003098 | May 2000 | EP |
1143334 | Oct 2001 | EP |
1186997 | Mar 2002 | EP |
1517228 | Mar 2005 | EP |
1674976 | Jun 2006 | EP |
1724996 | Nov 2006 | EP |
2150031 | Feb 2010 | EP |
2911377 | Aug 2015 | EP |
2993602 | Mar 2016 | EP |
2819675 | Jul 2002 | FR |
2329813 | Mar 1999 | GB |
2407900 | May 2005 | GB |
6-208446 | Jul 1994 | JP |
8-221203 | Aug 1996 | JP |
9-73381 | Mar 1997 | JP |
9-101874 | Apr 1997 | JP |
9-258971 | Oct 1997 | JP |
9-292262 | Nov 1997 | JP |
9-297750 | Nov 1997 | JP |
10-40067 | Feb 1998 | JP |
10-214350 | Aug 1998 | JP |
11-508116 | Jul 1999 | JP |
2000-20213 | Jan 2000 | JP |
2001-92430 | Apr 2001 | JP |
2001-92586 | Apr 2001 | JP |
2001-318751 | Nov 2001 | JP |
2002-41197 | Feb 2002 | JP |
2002-41206 | Feb 2002 | JP |
2002-132412 | May 2002 | JP |
2002-149312 | May 2002 | JP |
2002-189567 | Jul 2002 | JP |
2002-525705 | Aug 2002 | JP |
2002-297514 | Oct 2002 | JP |
2002-312105 | Oct 2002 | JP |
2003-66941 | Mar 2003 | JP |
2003-139546 | May 2003 | JP |
2003-198705 | Jul 2003 | JP |
2003-248538 | Sep 2003 | JP |
2003-256142 | Sep 2003 | JP |
2003-271310 | Sep 2003 | JP |
2003-295994 | Oct 2003 | JP |
2003-536125 | Dec 2003 | JP |
2004-38260 | Feb 2004 | JP |
2004-70492 | Mar 2004 | JP |
2004-132741 | Apr 2004 | JP |
2004-152075 | May 2004 | JP |
2004-208217 | Jul 2004 | JP |
2004-341892 | Dec 2004 | JP |
2005-4396 | Jan 2005 | JP |
2005-4419 | Jan 2005 | JP |
2005-515530 | May 2005 | JP |
2005-198064 | Jul 2005 | JP |
2005-202703 | Jul 2005 | JP |
2005-227826 | Aug 2005 | JP |
2005-227951 | Aug 2005 | JP |
2005-228088 | Aug 2005 | JP |
2005-228091 | Aug 2005 | JP |
2005-309933 | Nov 2005 | JP |
2005-321915 | Nov 2005 | JP |
2005-327064 | Nov 2005 | JP |
2006-99733 | Apr 2006 | JP |
2006-155232 | Jun 2006 | JP |
2006-259376 | Sep 2006 | JP |
2007-25998 | Feb 2007 | JP |
2007-124667 | May 2007 | JP |
2007-132676 | May 2007 | JP |
2007-512635 | May 2007 | JP |
2007-334984 | Dec 2007 | JP |
2008-15698 | Jan 2008 | JP |
2008-503007 | Jan 2008 | JP |
2008-52705 | Mar 2008 | JP |
2008-102860 | May 2008 | JP |
2008-262251 | Oct 2008 | JP |
2008-304959 | Dec 2008 | JP |
2008-306667 | Dec 2008 | JP |
2009-9350 | Jan 2009 | JP |
2009-508217 | Feb 2009 | JP |
2009-136456 | Jun 2009 | JP |
2009-265929 | Nov 2009 | JP |
2009-277192 | Nov 2009 | JP |
2010-61402 | Mar 2010 | JP |
2010-97552 | Apr 2010 | JP |
2010-187096 | Aug 2010 | JP |
2010-538394 | Dec 2010 | JP |
2012-208645 | Oct 2012 | JP |
2013-25357 | Feb 2013 | JP |
2013-25409 | Feb 2013 | JP |
2013-47919 | Mar 2013 | JP |
2013-106271 | May 2013 | JP |
2013-120468 | Jun 2013 | JP |
2013-191234 | Sep 2013 | JP |
2013-203283 | Oct 2013 | JP |
2013-206274 | Oct 2013 | JP |
2013-211055 | Oct 2013 | JP |
2013-218698 | Oct 2013 | JP |
2014-503891 | Feb 2014 | JP |
2002-0010863 | Feb 2002 | KR |
10-0490373 | May 2005 | KR |
10-2009-0035499 | Apr 2009 | KR |
10-2009-0100320 | Sep 2009 | KR |
10-2010-0019887 | Feb 2010 | KR |
10-2011-0078008 | Jul 2011 | KR |
10-2011-0093729 | Aug 2011 | KR |
10-2012-0057800 | Jun 2012 | KR |
10-2012-0091495 | Aug 2012 | KR |
10-2013-0016329 | Feb 2013 | KR |
10-2015-0022599 | Mar 2015 | KR |
199606401 | Feb 1996 | WO |
199844431 | Oct 1998 | WO |
199938149 | Jul 1999 | WO |
200016186 | Mar 2000 | WO |
200146790 | Jun 2001 | WO |
200213176 | Feb 2002 | WO |
2003060622 | Jul 2003 | WO |
2005041020 | May 2005 | WO |
2005055034 | Jun 2005 | WO |
2006012343 | Feb 2006 | WO |
2006020304 | Feb 2006 | WO |
2006020305 | Feb 2006 | WO |
2006092464 | Sep 2006 | WO |
2006117438 | Nov 2006 | WO |
2006119269 | Nov 2006 | WO |
2007031816 | Mar 2007 | WO |
2007032908 | Mar 2007 | WO |
2006020304 | May 2007 | WO |
2007069835 | Jun 2007 | WO |
2007094894 | Aug 2007 | WO |
2007142256 | Dec 2007 | WO |
2008017936 | Feb 2008 | WO |
2007100944 | Aug 2008 | WO |
2008114491 | Sep 2008 | WO |
2009032638 | Mar 2009 | WO |
2009032750 | Mar 2009 | WO |
2009089222 | Jul 2009 | WO |
2011126501 | Oct 2011 | WO |
2012078079 | Jun 2012 | WO |
2013017736 | Feb 2013 | WO |
2013149055 | Oct 2013 | WO |
2013157330 | Oct 2013 | WO |
2016025395 | Feb 2016 | WO |
Entry |
---|
Office Action received for Australian Patent Application No. 2019210673, dated Jul. 28, 2020, 4 pages. |
Communication of the Board of Appeal received for European Patent Application No. 09170697.8, dated Jan. 25, 2021, 6 pages. |
Office Action received for Chinese Patent Application No. 201780033621.1, dated Dec. 22, 2020, 30 pages (16 pages of English Translation and 14 pages of Official Copy). |
Applicant-Initiated Interview Summary received for U.S. Appl. No. 15/421,865, dated Dec. 15, 2020, 6 pages. |
Applicant-Initiated Interview Summary received for U.S. Appl. No. 16/267,817, dated Dec. 1, 2020, 5 pages. |
Minutes of the Oral Proceedings received for European Patent Application No. 12189764.9, mailed on Oct. 13, 2020, 6 pages. |
Office Action received for European Patent Application No. 17810739.7, dated Nov. 25, 2020, 4 pages. |
Extended European Search Report received for European Patent Application No. 20203888.1, dated Feb. 10, 2021, 8 pages. |
Office Action received for Australian Patent Application No. 2020201723, dated Feb. 4, 2021, 6 pages. |
Office Action received for Chinese Patent Application No. 201780033973.7, dated Jan. 22, 2021, 27 pages (11 pages of English Translation and 16 pages of Official Copy). |
Blickenstorfer, Conrad H., "Neonode N2 A new version of the phone that pioneered touchscreens", Pen Computing Magazine, Online Available at: http://www.pencomputing.com/WinCE/neonode-n2-review.html, Nov. 4, 2007, 9 pages. |
Feist, Jonathan, "Android customization—How to create a custom clock widget using zooper widget", Android Authority, Available Online at: https://www.androidauthority.com/zooper-widget-clock-366476/, May 15, 2014, pp. 1-13. |
Final Office Action received for U.S. Appl. No. 15/411,110, dated Mar. 15, 2021, 28 pages. |
Intention to Grant received for European Patent Application No. 12189764.9, dated Mar. 5, 2021, 14 pages. |
Applicant-Initiated Interview Summary received for U.S. Appl. No. 16/270,801, dated Mar. 11, 2020, 3 pages. |
Applicant-Initiated Interview Summary received for U.S. Appl. No. 16/270,902, dated Mar. 11, 2020, 3 pages. |
Final Office Action received for U.S. Appl. No. 15/411,110, dated Mar. 5, 2020, 30 pages. |
Notice of Allowance received for U.S. Appl. No. 14/142,648, dated Mar. 13, 2020, 5 pages. |
Record of Oral Hearing received for U.S. Appl. No. 14/142,648, mailed on Mar. 2, 2020, 13 pages. |
Summons to Attend Oral Proceedings received for European Patent Application No. 12189764.9, mailed on Mar. 12, 2020, 11 pages. |
Final Office Action received for U.S. Appl. No. 15/421,865, dated Mar. 19, 2021, 20 pages. |
Notice of Acceptance received for Australian Patent Application No. 2021200102, dated Mar. 16, 2021, 3 pages. |
Notice of Allowance received for U.S. Appl. No. 16/918,855, dated Apr. 6, 2021, 7 pages. |
Office Action received for Japanese Patent Application No. 2018-121118, dated Feb. 19, 2021, 17 pages (9 pages of English Translation and 8 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2019-024663, dated Feb. 19, 2021, 8 pages (4 pages of English Translation and 4 pages of Official Copy). |
Final Office Action received for U.S. Appl. No. 16/267,817, dated Aug. 24, 2020, 23 pages. |
Notice of Allowance received for U.S. Appl. No. 14/261,112, dated Apr. 9, 2021, 2 pages. |
Office Action received for Japanese Patent Application No. 2020-046707, dated Mar. 5, 2021, 8 pages (4 pages of English Translation and 4 pages of Official Copy). |
Non-Final Office Action received for U.S. Appl. No. 16/994,392, dated Jun. 9, 2021, 27 pages. |
Office Action received for Chinese Patent Application No. 201780033621.1, dated May 24, 2021, 18 pages (7 pages of English Translation and 11 pages of Official Copy). |
Applicant-Initiated Interview Summary received for U.S. Appl. No. 16/267,817, dated Jul. 14, 2020, 5 pages. |
Examiner's Pre-Review Report received for Japanese Patent Application No. 2018-121118, dated Jun. 2, 2020, 4 pages (2 pages of English Translation and 2 pages of Official Copy). |
Notice of Acceptance received for Australian Patent Application No. 2020201723, dated May 6, 2021, 3 pages. |
Third Party Proceedings received for European Patent Application No. 17210062.0, mailed on Apr. 23, 2020, 6 pages. |
Decision of Board of Appeal received for European Patent Application No. 09170697.8, mailed on Apr. 23, 2021, 17 pages. |
Office Action received for Korean Patent Application No. 10-2020-7018655, dated Apr. 26, 2021, 9 pages (4 pages of English Translation and 5 pages of Official Copy). |
Notice of Allowance received for U.S. Appl. No. 16/020,804, dated May 28, 2020, 18 pages. |
Office Action received for Japanese Patent Application No. 2019-024663, dated Apr. 27, 2020, 8 pages (4 pages of English Translation and 4 pages of Official Copy). |
Corrected Notice of Allowance received for U.S. Appl. No. 14/142,648, dated May 20, 2020, 2 pages. |
Notice of Allowance received for U.S. Appl. No. 16/428,634, dated May 8, 2020, 16 pages. |
Summons to Attend Oral Proceedings received for European Patent Application No. 12189764.9, mailed on May 20, 2020, 11 pages. |
Decision on Appeal received for U.S. Appl. No. 14/142,648, mailed on Feb. 28, 2020, 6 pages. |
Advisory Action received for U.S. Appl. No. 15/421,865, dated Apr. 16, 2020, 7 pages. |
Applicant-Initiated Interview Summary received for U.S. Appl. No. 15/411,110, dated Apr. 21, 2020, 5 pages. |
Applicant-Initiated Interview Summary received for U.S. Appl. No. 16/020,804, dated Apr. 13, 2020, 3 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/267,817, dated Apr. 15, 2020, 25 pages. |
Notice of Acceptance received for Australian Patent Application No. 2019200692, dated Apr. 7, 2020, 3 pages. |
Office Action received for Australian Patent Application No. 2019219816, dated Apr. 17, 2020, 3 pages. |
Result of Consultation received for European Patent Application No. 17210062.0, dated Apr. 20, 2020, 2 pages. |
Notice of Acceptance received for Australian Patent Application No. 2019204835, dated Dec. 7, 2020, 3 pages. |
Notice of Allowance received for Japanese Patent Application No. 2017-223021, dated Dec. 18, 2020, 3 pages (1 page of English Translation and 2 pages of Official Copy). |
Notice of Allowance received for U.S. Appl. No. 16/267,817, dated Dec. 18, 2020, 11 pages. |
Summons to Oral Proceedings received for European Patent Application No. 09170697.8, mailed on Dec. 17, 2020, 4 pages. |
Intention to Grant received for European Patent Application No. 17210062.0, dated Jun. 23, 2020, 8 pages. |
Minutes of Oral Proceedings received for European Patent Application No. 17210062.0, mailed on Jun. 17, 2020, 5 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/411,110, dated Jun. 26, 2020, 32 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/270,801, dated Mar. 27, 2020, 11 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/270,902, dated Mar. 27, 2020, 11 pages. |
Notice of Allowance received for Korean Patent Application No. 10-2019-7005262, dated Mar. 25, 2020, 5 pages (2 pages of English Translation and 3 pages of Official Copy). |
Notice of Allowance received for U.S. Appl. No. 15/418,537, dated Apr. 6, 2020, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 16/378,291, dated Mar. 25, 2020, 11 pages. |
Office Action received for Australian Patent Application No. 2017277813, dated Mar. 20, 2020, 4 pages. |
Advisory Action received for U.S. Appl. No. 12/689,834, dated Aug. 19, 2015, 3 pages. |
Advisory Action received for U.S. Appl. No. 12/242,851, dated Nov. 15, 2013, 4 pages. |
Advisory Action received for U.S. Appl. No. 12/888,362, dated Sep. 4, 2015, 3 pages. |
Advisory Action received for U.S. Appl. No. 12/888,362, dated May 7, 2013, 3 pages. |
Advisory Action received for U.S. Appl. No. 14/261,112, dated Apr. 23, 2015, 3 pages. |
Advisory Action received for U.S. Appl. No. 14/261,112, dated Nov. 30, 2017, 3 pages. |
Advisory Action received for U.S. Appl. No. 14/710,125, dated Mar. 14, 2017, 3 pages. |
Advisory Action received for U.S. Appl. No. 11/960,669, dated Nov. 3, 2011, 3 pages. |
Apple Iphone School, “Customize 1.19 Update for the iPhone”, 4:02 minutes video, available at <http://www.youtube.com/watch?v=5ogDzOM89oc>, uploaded on Dec. 8, 2007, 2 pages. |
Apple Iphone School, “SummerBoard 3.0a9 for iPhone”, 4:50 minutes video, available at <http://www.youtube.com/watch?v=s_P_9mrZTKs>, uploaded on Oct. 21, 2007, 2 pages. |
Apple, "iPhone User's Guide", iPhone first generation, Available at: <http://pocketpccentral.net/iphone/products/1g_iphone.htm>, Jun. 29, 2007, 124 pages. |
Apple, “iPhone User's Guide”, Available at http://mesnotices.20minutes.fr/manuel-notice-mode-emploi/APPLE/IPHONE%2D%5FE#a, Jun. 2007, 137 pages. |
Apple, “Keynote '08 User's Guide”, © Apple Inc., 2008, 204 pages. |
Applicant-Initiated Interview Summary received for U.S. Appl. No. 15/411,110, dated Oct. 28, 2019, 6 pages. |
Applicant-Initiated Interview Summary received for U.S. Appl. No. 15/418,537, dated Dec. 23, 2019, 6 pages. |
“Asus Eee News, Mods, and Hacks: Asus Eee PC Easy Mode Internet Tab Options Tour”, asuseeehacks.blogspot.com, Available online at <http://asuseeehacks.blogspot.com/2007/11/asus-eee-pc-user-interface-tour.html>, Nov. 10, 2007, 33 pages. |
Barsch, Bill, “3D Desktop! TouchScreen and XGL on Linux!”, 2:42 minutes video, available at <http://www.youtube.com/watch?v=Yx9FgLr9oTk>, uploaded on Aug. 15, 2006, 2 pages. |
Board Opinion received for Chinese Patent Application No. 200780041309.3, mailed on Apr. 1, 2016, 16 pages (9 pages of English Translation and 7 pages of Official Copy). |
Board Opinion received for Chinese Patent Application No. 201480001676.0, mailed on Oct. 21, 2019, 10 pages (1 page of English Translation and 9 pages of Official Copy). |
Bott, E., et al., "Table of Contents/Chapter 20: Putting Pictures on Folder Icons", Microsoft Windows XP Inside Out Deluxe, Second Edition, Available online at: http://proquest.safaribooksonline.com/book/operating-systems/9780735642171, Oct. 6, 2004, pp. 1-8 and 669. |
Cha, Bonnie, “HTC Touch Diamond (Sprint)”, CNET Reviews, available at <http://www.cnet.com/products/htc-touch/>, updated on Sep. 12, 2008, 8 pages. |
Clifton, Marc, “Detect if Another Process is Running and Bring it to the Foreground”, Online Available at: https://www.codeproject.com/Articles/2976/Detect-if-another-process-is-running-andbring-it, Sep. 30, 2002, 6 pages. |
cocoabuilder.com, “Single Instance of a Cocoa Application”, Available at: http://www.cocoabuilder.com/archive/cocoa/167892-single-instance-of-cocoa-application.html, Jul. 19, 2006, 4 pages. |
Collomb, M., et al., "Improving drag-and-drop on wall-size displays", Proceedings of Graphics Interface, May 9, 2005, pp. 25-32. |
Corrected Notice of Allowance received for U.S. Appl. No. 12/689,834, dated Feb. 8, 2018, 4 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 12/689,834, dated May 17, 2018, 2 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 12/888,362, dated Jun. 6, 2018, 3 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 12/888,362, dated Apr. 25, 2018, 3 pages. |
Deanhill1971, “Run a Program or Switch to an Already Running Instance”, Available Online at <https://autohotkey.com/board/topic/7129-run-a-program-or-switch-to-an-already-running-instance/>, Feb. 1, 2006, 16 pages. |
Decision of Board of Appeal received for European Patent Application No. 09170697.8, mailed on Oct. 24, 2016, 24 pages. |
Decision on Acceptance received for Australian Patent Application No. 2017202587, dated Oct. 8, 2019, 19 pages. |
Decision on Appeal received for U.S. Appl. No. 14/142,640, mailed on Oct. 7, 2019, 9 pages. |
Decision on Appeal received for U.S. Appl. No. 14/710,125, mailed on Mar. 11, 2019, 7 pages. |
Decision to Grant received for European Patent Application No. 09700333.9, dated Nov. 7, 2013, 2 pages. |
Decision to Grant received for European Patent Application No. 10762813.3, dated May 11, 2018, 3 pages. |
Decision to Grant received for European Patent Application No. 12177813.8, dated Nov. 24, 2016, 3 pages. |
Decision to Grant received for European Patent Application No. 12194312.0, dated Feb. 1, 2018, 2 pages. |
Decision to Grant received for European Patent Application No. 12194315.3, dated Oct. 12, 2017, 2 pages. |
Decision to Grant received for European Patent Application No. 17198398.4, dated Jun. 14, 2019, 3 pages. |
Decision to Refuse received for European Patent Application No. 06846840.4, dated Mar. 4, 2010, 10 pages. |
Decision to Refuse received for European Patent Application No. 09170697.8, dated Oct. 23, 2013, 12 pages. |
Decision to Refuse received for European Patent Application No. 07814689.1, dated May 11, 2012, 15 pages. |
Decision to Refuse received for European Patent Application No. 09170697.8, dated Jul. 10, 2018, 31 pages. |
Delltech, "Windows XP: The Complete Reference: Working with Graphics", Available online at: http://web.archive.org/web/20050405151925/http://delltech.150m.com/XP/graphics/3.htm, Chapter 18, Apr. 5, 2005, 4 pages. |
Examiner's Answer to Appeal Brief received for U.S. Appl. No. 14/142,648, dated Apr. 10, 2018, 15 pages. |
Examiner's Answer to Appeal Brief received for U.S. Appl. No. 14/261,112, dated Oct. 29, 2019, 10 pages. |
Examiner's Answer to Appeal Brief received for U.S. Appl. No. 14/710,125, dated Jan. 26, 2018, 6 pages. |
Examiner's Pre-Review Report received for Japanese Patent Application No. 2014-253365, dated Dec. 12, 2017, 7 pages (3 pages of English Translation and 4 pages of Official Copy). |
ExpansysTV, “HTC Touch Dual Demonstration by eXpansys”, 5:26 minutes video, available at <http://www.youtube.com/watch?v=Tupk8MYLhMk>, uploaded on Oct. 1, 2007, 2 pages. |
Extended European Search Report received for European Patent Application No. 17198398.4, dated Feb. 8, 2018, 8 pages. |
Extended European Search Report received for European Patent Application No. 09170697.8, dated Apr. 28, 2010, 3 pages. |
Extended European Search Report received for European Patent Application No. 12177813.8, dated Feb. 1, 2013, 6 pages. |
Extended European Search Report received for European Patent Application No. 12189764.9, dated Jan. 4, 2013, 6 pages. |
Extended European Search Report received for European Patent Application No. 12194312.0, dated Jan. 16, 2013, 7 pages. |
Extended European Search Report received for European Patent Application No. 12194315.3, dated Jan. 16, 2013, 7 pages. |
Extended European Search Report received for European Patent Application No. 17210062.0, dated Feb. 20, 2018, 12 pages. |
Extended European Search Report received for European Patent Application No. 17810723.1, dated Nov. 12, 2019, 9 pages. |
Extended European Search Report received for European Patent Application No. 17810739.7, dated Mar. 22, 2019, 9 pages. |
Extended European Search Report received for European Patent Application No. 17813879.8, dated Jan. 8, 2020, 8 pages. |
Extended European Search Report received for European Patent Application No. 19176224.4, dated Dec. 13, 2019, 7 pages. |
Fadhley, Mohd Nazley, "LauncherX", Online Available at <http://www.palmx.org/mambo/index2.php?option=com_content&task=view&id=65&Itemid>, Nov. 21, 2002, 3 pages. |
Feist, Jonathan, “Android customization—how to create a custom clock widget using Zooper Widget”, Available Online at: https://www.androidauthority.com/zooper-widget-clock-366476/, May 15, 2014, 10 pages. |
Final Office Action received for U.S. Appl. No. 14/261,112, dated Mar. 3, 2016, 31 pages. |
Final Office Action received for U.S. Appl. No. 11/960,669, dated Aug. 18, 2011, 13 pages. |
Final Office Action received for U.S. Appl. No. 11/620,686, dated Aug. 3, 2009, 11 pages. |
Final Office Action received for U.S. Appl. No. 11/620,686, dated Jul. 12, 2010, 10 pages. |
Final Office Action received for U.S. Appl. No. 11/620,687, dated Aug. 18, 2009, 7 pages. |
Final Office Action received for U.S. Appl. No. 11/849,938, dated Jan. 30, 2013, 31 pages. |
Final Office Action received for U.S. Appl. No. 11/849,938, dated May 27, 2011, 21 pages. |
Final Office Action received for U.S. Appl. No. 11/850,010, dated Oct. 17, 2011, 11 pages. |
Final Office Action received for U.S. Appl. No. 11/850,010, dated Apr. 18, 2016, 16 pages. |
Final Office Action received for U.S. Appl. No. 11/850,010, dated Aug. 14, 2018, 21 pages. |
Final Office Action received for U.S. Appl. No. 11/850,010, dated Feb. 15, 2013, 12 pages. |
Final Office Action received for U.S. Appl. No. 11/850,010, dated May 8, 2014, 11 pages. |
Final Office Action received for U.S. Appl. No. 11/850,010, dated May 11, 2018, 24 pages. |
Final Office Action received for U.S. Appl. No. 11/850,011, dated Dec. 1, 2010, 15 pages. |
Final Office Action received for U.S. Appl. No. 11/969,809, dated Jul. 14, 2011, 26 pages. |
Final Office Action received for U.S. Appl. No. 12/242,851, dated Dec. 12, 2011, 13 pages. |
Final Office Action received for U.S. Appl. No. 12/242,851, dated Jul. 1, 2016, 90 pages. |
Final Office Action received for U.S. Appl. No. 12/242,851, dated May 10, 2013, 20 pages. |
Final Office Action received for U.S. Appl. No. 12/364,470, dated May 5, 2010, 16 pages. |
Final Office Action received for U.S. Appl. No. 12/364,470, dated Oct. 19, 2011, 20 pages. |
Final Office Action received for U.S. Appl. No. 12/689,834, dated Mar. 26, 2015, 30 pages. |
Final Office Action received for U.S. Appl. No. 12/689,834, dated May 4, 2017, 41 pages. |
Final Office Action received for U.S. Appl. No. 12/689,834, dated Oct. 15, 2012, 22 pages. |
Final Office Action received for U.S. Appl. No. 12/888,362, dated Apr. 29, 2015, 12 pages. |
Final Office Action received for U.S. Appl. No. 12/888,362, dated Jan. 3, 2013, 13 pages. |
Final Office Action received for U.S. Appl. No. 12/888,375, dated Nov. 7, 2012, 14 pages. |
Final Office Action received for U.S. Appl. No. 12/888,376, dated Feb. 8, 2013, 11 pages. |
Final Office Action received for U.S. Appl. No. 14/142,648, dated Dec. 7, 2016, 12 pages. |
Final Office Action received for U.S. Appl. No. 14/261,112, dated Aug. 10, 2017, 35 pages. |
Final Office Action received for U.S. Appl. No. 14/261,112, dated Nov. 7, 2018, 34 pages. |
Final Office Action received for U.S. Appl. No. 14/261,112, dated Oct. 9, 2014, 29 pages. |
Final Office Action received for U.S. Appl. No. 14/710,125, dated Oct. 27, 2016, 13 pages. |
Final Office Action received for U.S. Appl. No. 15/418,537, dated Sep. 23, 2019, 53 pages. |
Final Office Action received for U.S. Appl. No. 15/421,865, dated Dec. 2, 2019, 19 pages. |
Final Office Action received for U.S. Appl. No. 15/426,836, dated Mar. 29, 2019, 49 pages. |
Final Office Action received for U.S. Appl. No. 14/142,640, dated Mar. 8, 2016, 35 pages. |
Fujitsu Ltd, “SX/G Manual of Icons on Desktop”, Edition 14/14A V14, 1st Edition, Mar. 27, 1998, 4 pages (Official Copy only) (See Communication under 37 CFR § 1.98(a) (3)). |
Gade, Lisa, “Sprint HTC Touch”, Smartphone Reviews by Mobile Tech Review, Available online at <http://www.mobiletechreview.com/phones/HTC-Touch.htm>, Nov. 2, 2007, 7 pages. |
“Qualcomm Toq—smartwatch—User Manual”, Available Online at: https://toq.qualcomm.com/sites/default/files/qualcomm_toq_user_manual.pdf, Nov. 27, 2013, pp. 1-38. |
Gsmarena Team, "HTC Touch review", Online Available at: <www.gsmarena.com/htc_touch-review-189p3.php>, Nov. 28, 2007, 5 pages. |
Gsmarena Team, "HTC Touch Review: Smart to Touch the Spot", available at <http://www.gsmarena.com/htc_touch-review-189.php>, Nov. 28, 2007, 18 pages. |
Hayama, H., et al., "To change images of scaled-down representation", Windows XP SP3 & SP2, Dec. 1, 2008, 6 pages (2 pages of English Translation and 4 pages of Official Copy). |
Higuchi, Tadahiro, “Try API!, Making a cool application with Visual Basic 6.0”, 1st edition, Japan, AI Publishing, AI Mook 221, Jul. 16, 1999, 23 pages (Official Copy Only) (See Communication under 37 CFR § 1.98(a) (3)). |
Huang et al., "Effects of Visual Vibratory Perception by Cross-Modal Matching with Tactile Sensation", Retrieved from the Internet: <URL:http://media.nuas.ac.jp/˜robin/Research/ADC99.html>, 1999, pp. 1-7. |
Intention to Grant received for European Patent Application No. 09700333.9, dated Jun. 20, 2013, 7 pages. |
Intention to Grant received for European Patent Application No. 10762813.3, dated Dec. 18, 2017, 11 pages. |
Intention to Grant received for European Patent Application No. 12177813.8, dated Jul. 6, 2016, 8 pages. |
Intention to Grant received for European Patent Application No. 12194312.0, dated Aug. 3, 2017, 8 pages. |
Intention to Grant received for European Patent Application No. 12194315.3, dated May 31, 2017, 8 pages. |
Intention to Grant received for European Patent Application No. 13795330.3, dated Aug. 9, 2019, 13 pages. |
Intention to Grant received for European Patent Application No. 17198398.4, dated Jan. 28, 2019, 8 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2008/050047, dated Sep. 15, 2009, 11 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2006/062685, dated Jul. 1, 2008, 6 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2007/077639, dated Mar. 10, 2009, 6 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2007/077643, dated Mar. 10, 2009, 7 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2008/050430, dated Jul. 7, 2009, 10 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2008/074625, dated Mar. 9, 2010, 6 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2009/030225, dated Jul. 15, 2010, 11 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2010/050056, dated Oct. 18, 2012, 21 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2013/067634, dated May 12, 2016, 9 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2014/040414, dated Dec. 23, 2015, 10 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2017/034834, dated Dec. 20, 2018, 9 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2017/035331, dated Dec. 20, 2018, 13 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2017/037057, dated Dec. 27, 2018, 24 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2007/077639, dated Jul. 8, 2008, 7 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2007/077643, dated May 8, 2008, 9 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2008/050047, dated Sep. 3, 2009, 15 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2008/050430, dated Sep. 1, 2008, 13 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2008/074625, dated Jan. 8, 2009, 8 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2009/030225, dated Feb. 25, 2010, 15 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2010/050056, dated May 13, 2011, 26 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2013/067634, dated Apr. 16, 2014, 11 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2014/040414, dated Sep. 16, 2014, 12 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2017/034834, dated Aug. 23, 2017, 10 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2017/035331, dated Oct. 6, 2017, 18 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2017/037057, dated Aug. 29, 2017, 26 pages. |
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2017/035331, dated Aug. 7, 2017, 4 pages. |
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2010/050056, dated Jan. 5, 2011, 5 pages. |
Jobs, Steve, “iPhone Introduction in 2007 (Complete)”, available at <https://www.youtube.com/watch?v=9hUlxyE2Ns8>, Jan. 10, 2013, 3 pages. |
Kondo, Daisuke, "Windows XP Tablet PC Edition Quick Review Challenging by Microsoft", PCfan No. 9, No. 28, Japan, Mainichi Communication, Oct. 15, 2002, pp. 12-17 (Official Copy only) (See Communication under 37 CFR § 1.98(a) (3)). |
“Launch 'Em Version 3.1”, Retrieved from the Internet: http://www.fladnag.net/downloads/telephone/palm/APPS/Inchem31/Documentation/LaunchEm.pdf, 2001, pp. 1-39. |
Mac People, "Useful Technique for Web Browser", Ascii Media Works Inc., Vol. 15, No. 6, Jun. 1, 2009, 17 pages (Official Copy only) (See Communication under 37 CFR § 1.98(a) (3)). |
“Macintosh Human Interface Guidelines”, chapter 1, 1995, pp. 3-14. |
McGuffin et al., "Acquisition of Expanding Targets", ACM, Apr. 20-25, 2002, 8 pages. |
Microsoft Help and Support, “How to Arrange or Move Icons on the Desktop”, Available online at: http://support.microsoft.com/kb/289587, 2007, 2 pages. |
Microsoft, “Working screenshot of Microsoft Office 2003”, Aug. 19, 2003, 14 pages. |
Miller, Matthew, “HTC Touch and TouchFLO Interface”, 7:53 minutes video, available at <http://www.youtube.com/watch?v=6oUp4wOcUc4>, uploaded on Jun. 6, 2007, 2 pages. |
Minutes of Meeting received for European Patent Application No. 09170697.8, mailed on Jul. 10, 2018, 6 pages. |
Minutes of the Oral Proceedings received for European Patent Application No. 13795330.3, mailed on Aug. 2, 2019, 7 pages. |
mobilissimo.ro, "HTC Touch—Touch FLO Demo", Online Available at: <https://www.youtube.com/watch?v=YQ8TQ9Rr_7E>, Jun. 5, 2007, 1 page. |
Nakata, Atsushi, "Tablet PC aiming at spread pen input by changing target user", Nikkei Windows for IT Professionals, Nikkei Business Publications, Inc., No. 69, Dec. 1, 2002, pp. 14-16 (Official Copy only) (See Communication under 37 CFR § 1.98(a) (3)). |
Naver Blog, "iPhone iOS 4 folder management", Jun. 27, 2010, 2 pages (Official Copy only) (See Communication under 37 CFR § 1.98(a) (3)). |
Nishida, T., et al., "Drag-and-Guess: Drag-and-Drop with Prediction", INTERACT'07 Proceedings of the 11th IFIP TC 13 International Conference on Human-Computer Interaction, Sep. 10, 2007, pp. 461-474. |
“Nokia 7710”, Available online at: https://www.nokia.com/en_int/phones/sites/default/files/user-guides/Nokia_7710_UG_en.pdf, 2005, pp. 1-153. |
Non-Final Office Action received for U.S. Appl. No. 11/620,687, dated Dec. 22, 2008, 9 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/620,687, dated Jan. 11, 2010, 9 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/850,010, dated Dec. 17, 2014, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/689,834, dated Aug. 26, 2016, 26 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/689,834, dated May 24, 2012, 21 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/142,640, dated Jun. 5, 2015, 29 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/033,551, dated May 24, 2018, 26 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/421,865, dated Mar. 21, 2019, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/960,669, dated Mar. 17, 2011, 23 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/459,602, dated Sep. 4, 2008, 13 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/620,686, dated Dec. 22, 2009, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/620,686, dated Dec. 31, 2008, 17 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/849,938, dated Dec. 14, 2011, 26 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/849,938, dated Oct. 12, 2010, 19 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/850,010, dated May 16, 2012, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/850,010, dated May 2, 2011, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/850,010, dated Oct. 24, 2013, 13 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/850,010, dated Jul. 24, 2017, 19 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/850,010, dated Jun. 25, 2015, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/850,011, dated Aug. 11, 2010, 19 pages. |
Non-Final Office Action received for U.S. Appl. No. 11/969,809, dated Mar. 14, 2011, 25 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/242,851, dated Apr. 15, 2011, 20 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/242,851, dated Jun. 26, 2015, 33 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/242,851, dated Oct. 6, 2014, 27 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/242,851, dated Sep. 20, 2012, 19 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/364,470, dated Mar. 4, 2011, 17 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/364,470, dated Nov. 13, 2009, 15 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/364,470, dated Sep. 2, 2010, 26 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/689,834, dated Jun. 10, 2014, 25 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/788,278, dated Oct. 16, 2012, 19 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/888,362, dated Sep. 4, 2014, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/888,366, dated Jul. 31, 2012, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/888,370, dated Aug. 22, 2012, 13 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/888,373, dated Sep. 10, 2012, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/888,375, dated Jun. 7, 2012, 11 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/888,375, dated Sep. 30, 2013, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/888,376, dated Aug. 29, 2014, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/888,376, dated Oct. 2, 2012, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/888,377, dated Sep. 13, 2012, 13 pages. |
Non-Final Office Action received for U.S. Appl. No. 13/104,903, dated Nov. 13, 2012, 9 pages. |
Non-Final Office Action received for U.S. Appl. No. 13/104,911, dated Feb. 20, 2013, 11 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/142,648, dated Apr. 12, 2016, 11 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/261,112, dated Apr. 5, 2018, 40 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/261,112, dated Jul. 8, 2015, 29 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/261,112, dated Jun. 18, 2014, 25 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/261,112, dated Nov. 29, 2016, 34 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/153,617, dated Apr. 2, 2018, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/411,110, dated Dec. 13, 2018, 23 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/411,110, dated Jul. 22, 2019, 29 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/418,537, dated Dec. 13, 2018, 53 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/426,836, dated Oct. 18, 2018, 40 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/020,804, dated Nov. 20, 2019, 13 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/888,362, dated Jul. 20, 2012, 15 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/710,125, dated Apr. 12, 2016, 12 pages. |
Notice of Acceptance received for Australian Patent Application No. 2012261534, dated Jan. 6, 2015, 2 pages. |
Notice of Acceptance received for Australian Patent Application No. 2012200475, dated Aug. 24, 2015, 2 pages. |
Notice of Acceptance received for Australian Patent Application No. 2012202140, dated May 28, 2014, 2 pages. |
Notice of Acceptance received for Australian Patent Application No. 2013404001, dated Nov. 21, 2017, 3 pages. |
Notice of Acceptance received for Australian Patent Application No. 2014204422, dated Apr. 28, 2016, 2 pages. |
Notice of Acceptance received for Australian Patent Application No. 2014274556, dated Jul. 27, 2016, 2 pages. |
Notice of Acceptance received for Australian Patent Application No. 2016203168, dated Feb. 14, 2018, 3 pages. |
Notice of Acceptance received for Australian Patent Application No. 2016203309, dated Feb. 14, 2018, 3 pages. |
Notice of Acceptance received for Australian Patent Application No. 2016213886, dated Feb. 9, 2018, 3 pages. |
Notice of Acceptance received for Australian Patent Application No. 2017201768, dated Nov. 21, 2018, 3 pages. |
Notice of Acceptance received for Australian Patent Application No. 2017202587, dated Nov. 6, 2019, 3 pages. |
Notice of Acceptance received for Australian Patent Application No. 2017276153, dated Feb. 19, 2018, 4 pages. |
Notice of Acceptance received for Australian Patent Application No. 2017276153, dated Jan. 17, 2018, 3 pages. |
Notice of Acceptance received for Australian Patent Application No. 2017277851, dated Dec. 9, 2019, 3 pages. |
Notice of Acceptance received for Australian Patent Application No. 2018200272, dated Apr. 23, 2019, 3 pages. |
Notice of Acceptance received for Australian Patent Application No. 2018203512, dated Jul. 26, 2019, 3 pages. |
Notice of Allowance received for Australian Patent Application No. 2010350739, dated Sep. 8, 2014, 2 pages. |
Notice of Allowance received for Australian Patent Application No. 2015202076, dated Apr. 5, 2017, 3 pages. |
Notice of Allowance received for Canadian Patent Application No. 2,845,297, dated Nov. 10, 2014, 1 page. |
Notice of Allowance received for Canadian Patent Application No. 2,890,778, dated Apr. 24, 2017, 1 page. |
Notice of Allowance received for Chinese Patent Application No. 200780041309.3, dated Jul. 31, 2017, 2 pages (Official Copy only) (See Communication under 37 CFR § 1.98(a) (3)). |
Notice of Allowance received for Chinese Patent Application No. 200980000229.2, dated Oct. 24, 2014, 4 pages (2 pages of English Translation and 2 pages of Official Copy). |
Notice of Allowance received for Chinese Patent Application No. 201010592864.9, dated Jan. 30, 2015, 4 pages (Official Copy Only) (See Communication under 37 CFR § 1.98(a) (3)). |
Notice of Allowance received for Chinese Patent Application No. 201310724733.5, dated Dec. 27, 2018, 2 pages (1 page of English Translation and 1 page of Official Copy). |
Notice of Allowance received for Chinese Patent Application No. 201380080659.6, dated Jul. 29, 2019, 2 pages (1 page of English Translation and 1 page of Official Copy). |
Notice of Allowance received for Chinese Patent Application No. 201410250648.4, dated Aug. 20, 2018, 2 pages (1 page of English Translation and 1 page of Official Copy). |
Notice of Allowance received for Chinese Patent Application No. 201410250688.9, dated May 21, 2018, 3 pages (1 page of English Translation and 2 pages of Official Copy). |
Notice of Allowance received for Chinese Patent Application No. 201410251370.2, dated Jul. 31, 2018, 2 pages (1 page of English Translation and 1 page of Official Copy). |
Notice of Allowance received for Chinese Patent Application No. 201410251400.X, dated Aug. 20, 2018, 3 pages (1 page of English Translation and 2 pages of Official Copy). |
Notice of Allowance received for Japanese Patent Application No. 2009-051921, dated Jan. 20, 2014, 2 pages (Official Copy only) (See Communication under 37 CFR § 1.98(a) (3)). |
Notice of Allowance received for Japanese Patent Application No. 2013-127963, dated Oct. 9, 2015, 3 pages (1 page of English Translation and 2 pages of Official Copy). |
Notice of Allowance received for Japanese Patent Application No. 2013-252338, dated Jun. 23, 2017, 3 pages (Official Copy Only) (See Communication under 37 CFR § 1.98(a) (3)). |
Notice of Allowance received for Japanese Patent Application No. 2014-139095, dated Apr. 1, 2016, 3 pages (Official Copy only) (See Communication under 37 CFR § 1.98(a) (3)). |
Notice of Allowance received for Japanese Patent Application No. 2014-253365, dated Nov. 26, 2018, 3 pages (1 page of English Translation and 2 pages of Official Copy). |
Notice of Allowance received for Japanese Patent Application No. 2015-532193, dated Jan. 23, 2017, 3 pages (Official Copy Only) (See Communication under 37 CFR § 1.98(a) (3)). |
Notice of Allowance received for Japanese Patent Application No. 2016-091460, dated Oct. 9, 2018, 3 pages (1 page of English Translation and 2 pages of Official Copy). |
Notice of Allowance received for Japanese Patent Application No. 2016-092789, dated Feb. 3, 2017, 3 pages (Official Copy Only) (See Communication under 37 CFR § 1.98(a) (3)). |
Notice of Allowance received for Japanese Patent Application No. 2016-527367, dated Jul. 30, 2018, 4 pages (1 page of English Translation and 3 pages of Official Copy). |
Notice of Allowance received for Japanese Patent Application No. 2017-042050, dated Apr. 24, 2017, 3 pages (Official Copy Only) (See Communication under 37 CFR § 1.98(a) (3)). |
Notice of Allowance received for Japanese Patent Application No. 2017-102031, dated Jun. 23, 2017, 3 pages (Official Copy Only) (See Communication under 37 CFR § 1.98(a) (3)). |
Notice of Allowance received for Japanese Patent Application No. 2017-142812, dated Jul. 19, 2019, 4 pages (1 page of English Translation and 3 pages of Official Copy). |
Notice of Allowance received for Japanese Patent Application No. 2017-204561, dated Mar. 12, 2019, 4 pages (1 page of English Translation and 3 pages of Official Copy). |
Notice of Allowance received for Japanese Patent Application No. 2013-011209, dated Jun. 13, 2016, 2 pages (Official Copy only) (See Communication under 37 CFR § 1.98(a) (3)). |
Notice of Allowance received for Korean Patent Application No. 10-2011-7026583, dated Apr. 29, 2015, 3 pages (1 page of English Translation and 2 pages of Official Copy). |
Notice of Allowance received for Korean Patent Application No. 10-2012-7029270, dated Sep. 23, 2014, 2 pages (Official Copy only) (See Communication under 37 CFR § 1.98(a) (3)). |
Notice of Allowance received for Korean Patent Application No. 10-2014-7011273, dated Apr. 28, 2015, 3 pages (1 page of English Translation and 2 pages of Official Copy). |
Notice of Allowance received for Korean Patent Application No. 10-2014-7036624, dated Sep. 26, 2016, 3 pages (1 page of English Translation and 2 pages of Official Copy). |
Notice of Allowance received for Korean Patent Application No. 10-2016-7014051, dated Nov. 27, 2018, 4 pages (1 page of English Translation and 3 pages of Official Copy). |
Notice of Allowance received for U.S. Appl. No. 12/788,278, dated May 1, 2013, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 14/011,639, dated Sep. 29, 2015, 11 pages. |
Notice of Allowance received for U.S. Appl. No. 11/459,602, dated Jan. 9, 2009, 6 pages. |
Notice of Allowance received for U.S. Appl. No. 11/849,938, dated Nov. 27, 2013, 2 pages. |
Notice of Allowance received for U.S. Appl. No. 11/849,938, dated Oct. 10, 2013, 28 pages. |
Notice of Allowance received for U.S. Appl. No. 11/850,011, dated Feb. 11, 2011, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 11/850,011, dated Feb. 18, 2011, 4 pages. |
Notice of Allowance received for U.S. Appl. No. 11/969,809, dated Apr. 26, 2013, 17 pages. |
Notice of Allowance received for U.S. Appl. No. 12/242,851, dated Dec. 27, 2016, 20 pages. |
Notice of Allowance received for U.S. Appl. No. 12/364,470, dated Nov. 24, 2017, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 12/689,834, dated Jan. 17, 2018, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 12/888,362, dated Apr. 11, 2018, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 12/888,366, dated Dec. 14, 2012, 11 pages. |
Notice of Allowance received for U.S. Appl. No. 12/888,370, dated Feb. 12, 2013, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 12/888,370, dated Jul. 1, 2014, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 12/888,373, dated Jul. 1, 2014, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 12/888,376, dated May 29, 2015, 14 pages. |
Notice of Allowance received for U.S. Appl. No. 13/104,903, dated Apr. 29, 2013, 6 pages. |
Notice of Allowance received for U.S. Appl. No. 13/104,911, dated Jun. 10, 2013, 6 pages. |
Notice of Allowance received for U.S. Appl. No. 14/011,639, dated Feb. 16, 2016, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 14/142,640, dated Dec. 11, 2019, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 14/710,125, dated Apr. 19, 2019, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 14/710,125, dated May 7, 2019, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 15/033,551, dated Nov. 14, 2018, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 15/056,913, dated May 24, 2017, 19 pages. |
Notice of Allowance received for U.S. Appl. No. 15/153,617, dated Nov. 23, 2018, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 15/426,836, dated Dec. 16, 2019, 16 pages. |
Notice of Allowance received for U.S. Appl. No. 11/850,010, dated Feb. 6, 2019, 25 pages. |
Notice of Allowance received for U.S. Appl. No. 12/888,373, dated Feb. 22, 2013, 12 pages. |
Notice of Allowance received for U.S. Appl. No. 12/888,377, dated Jan. 30, 2013, 12 pages. |
Office Action received for Danish Patent Application No. PA201670595, dated Nov. 25, 2016, 9 pages. |
Office Action received for European Patent Application No. 13795330.3, dated Oct. 9, 2017, 8 pages. |
Office Action received for Australian Patent Application No. 2014274556, dated Aug. 28, 2015, 2 pages. |
Office Action received for Australian Patent Application No. 2009204252, dated Sep. 16, 2014, 6 pages. |
Office Action received for Australian Patent Application No. 2012200475, dated Aug. 4, 2015, 3 pages. |
Office Action received for Australian Patent Application No. 2012200475, dated Jun. 29, 2015, 3 pages. |
Office Action received for Australian Patent Application No. 2012200475, dated Nov. 19, 2013, 4 pages. |
Office Action received for Australian Patent Application No. 2012202140, dated Aug. 12, 2013, 2 pages. |
Office Action received for Australian Patent Application No. 2012261534, dated Dec. 3, 2013, 3 pages. |
Office Action received for Australian Patent Application No. 2013404001, dated Aug. 3, 2017, 5 pages. |
Office Action received for Australian Patent Application No. 2013404001, dated Nov. 26, 2016, 3 pages. |
Office Action received for Australian Patent Application No. 2014100582, dated Aug. 7, 2014, 5 pages. |
Office Action received for Australian Patent Application No. 2014100582, dated Feb. 4, 2015, 3 pages. |
Office Action received for Australian Patent Application No. 2014204422, dated Aug. 7, 2015, 3 pages. |
Office Action received for Australian Patent Application No. 2014274537, dated Jul. 25, 2016, 3 pages. |
Office Action received for Australian Patent Application No. 2014274537, dated Aug. 14, 2015, 3 pages. |
Office Action received for Australian Patent Application No. 2015202076, dated May 5, 2016, 3 pages. |
Office Action received for Australian Patent Application No. 2015215876, dated Aug. 1, 2016, 4 pages. |
Office Action received for Australian Patent Application No. 2015215876, dated Jul. 26, 2017, 6 pages. |
Office Action received for Australian Patent Application No. 2015215876, dated Jun. 28, 2017, 4 pages. |
Office Action received for Australian Patent Application No. 2015215876, dated May 24, 2017, 4 pages. |
Office Action received for Australian Patent Application No. 2016203168, dated Feb. 8, 2017, 2 pages. |
Office Action received for Australian Patent Application No. 2016203309, dated Feb. 8, 2017, 11 pages. |
Office Action received for Australian Patent Application No. 2016213886, dated May 18, 2017, 2 pages. |
Office Action received for Australian Patent Application No. 2017201768, dated Feb. 28, 2018, 4 pages. |
Office Action received for Australian Patent Application No. 2017202587, dated Apr. 26, 2019, 4 pages. |
Office Action received for Australian Patent Application No. 2017202587, dated Jul. 4, 2018, 4 pages. |
Office Action received for Australian Patent Application No. 2017202587, dated Jul. 4, 2019, 4 pages. |
Office Action received for Australian Patent Application No. 2017277813, dated Jun. 11, 2019, 3 pages. |
Office Action received for Australian Patent Application No. 2017277851, dated Jul. 5, 2019, 3 pages. |
Office Action received for Australian Patent Application No. 2018200272, dated Jan. 17, 2019, 2 pages. |
Office Action received for Australian Patent Application No. 2018203512, dated Apr. 15, 2019, 4 pages. |
Office Action received for Australian Patent Application No. 2019200692, dated Dec. 24, 2019, 2 pages. |
Office Action received for Canadian Patent Application No. 2,845,297, dated Apr. 23, 2014, 2 pages. |
Office Action received for Canadian Patent Application No. 2,890,778, dated May 19, 2016, 6 pages. |
Office Action received for Canadian Patent Application No. 2,983,178, dated Aug. 16, 2018, 5 pages. |
Office Action received for Canadian Patent Application No. 2,983,178, dated Jul. 22, 2019, 6 pages. |
Office Action received for Chinese Patent Application No. 200780041309.3, dated Feb. 8, 2017, 4 pages (1 page of English Translation and 3 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 200980000229.2, dated Jan. 6, 2014, 6 pages (3 pages of English Translation and 3 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 200980000229.2, dated Jun. 27, 2014, 7 pages (4 pages of English Translation and 3 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201310724733.5, dated Apr. 9, 2018, 11 pages (2 pages of English Translation and 9 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201310724733.5, dated Aug. 15, 2018, 2 pages (1 page of English Translation and 1 page of Official Copy). |
Office Action received for Chinese Patent Application No. 201310724733.5, dated Aug. 28, 2018, 6 pages (3 pages of English Translation and 3 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201310724733.5, dated Oct. 30, 2017, 14 pages (3 pages of English Translation and 11 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201310724733.5, dated Apr. 12, 2016, 14 pages (3 pages of English Translation and 11 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201310724733.5, dated Apr. 21, 2017, 18 pages (5 pages of English Translation and 13 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201310724733.5, dated Dec. 30, 2016, 13 pages (3 pages of English Translation and 10 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201380080659.6, dated Apr. 4, 2018, 15 pages (5 pages of English Translation and 10 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201380080659.6, dated Mar. 4, 2019, 9 pages (5 pages of English Translation and 4 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201380080659.6, dated Oct. 26, 2018, 11 pages (3 pages of English Translation and 8 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201410250648.4, dated Feb. 14, 2018, 6 pages (3 pages of English Translation and 3 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201410250648.4, dated Jun. 29, 2017, 13 pages (5 pages of English Translation and 8 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201410250648.4, dated Oct. 9, 2016, 6 pages (3 pages of English Translation and 3 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201410250688.9, dated Nov. 16, 2017, 6 pages (3 pages of English Translation and 3 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201410250688.9, dated Jun. 1, 2017, 12 pages (5 pages of English Translation and 7 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201410250688.9, dated Sep. 28, 2016, 7 pages (3 pages of English Translation and 4 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201410251370.2, dated Feb. 11, 2018, 14 pages (5 pages of English Translation and 9 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201410251370.2, dated May 12, 2017, 8 pages (4 pages of English Translation and 4 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201410251370.2, dated Sep. 5, 2016, 7 pages (3 pages of English Translation and 4 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201410251400.X, dated Feb. 8, 2018, 6 pages (3 pages of English Translation and 3 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201410251400.X, dated Jul. 4, 2016, 8 pages (2 pages of English Translation and 6 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201410251400.X, dated May 26, 2017, 11 pages (3 pages of English Translation and 8 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201480001676.0, dated Mar. 20, 2018, 12 pages (3 pages of English Translation and 9 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201480001676.0, dated May 12, 2017, 15 pages (5 pages of English Translation and 10 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201480001676.0, dated Nov. 27, 2018, 8 pages (1 page of English Translation and 7 pages of Official Copy). |
Office Action received for Danish Patent Application No. PA201670595, dated Aug. 23, 2018, 5 pages. |
Office Action received for Danish Patent Application No. PA201670595, dated May 31, 2017, 3 pages. |
Office Action received for Danish Patent Application No. PA201670595, dated Nov. 30, 2017, 4 pages. |
Office Action received for European Patent Application No. 07814689.1, dated Mar. 4, 2011, 6 pages. |
Office Action received for European Patent Application No. 08705639.6, dated Dec. 19, 2013, 7 pages. |
Office Action received for European Patent Application No. 08829660.3, dated Aug. 2, 2013, 7 pages. |
Office Action received for European Patent Application No. 08829660.3, dated Jan. 3, 2020, 6 pages. |
Office Action received for European Patent Application No. 08829660.3, dated Jan. 11, 2019, 7 pages. |
Office Action received for European Patent Application No. 08829660.3, dated Jul. 5, 2016, 5 pages. |
Office Action received for European Patent Application No. 08829660.3, dated Oct. 15, 2010, 8 pages. |
Office Action received for European Patent Application No. 09170697.8 dated Dec. 13, 2011, 4 pages. |
Office Action received for European Patent Application No. 09170697.8, dated Mar. 3, 2017, 8 pages. |
Office Action received for European Patent Application No. 09700333.9, dated Jun. 10, 2011, 5 pages. |
Office Action received for European Patent Application No. 09700333.9, dated Nov. 26, 2010, 5 pages. |
Office Action received for European Patent Application No. 10762813.3, dated Mar. 21, 2016, 6 pages. |
Office Action received for European Patent Application No. 12189764.9, dated Jan. 21, 2019, 7 pages. |
Office Action received for European Patent Application No. 12189764.9, dated Mar. 1, 2016, 6 pages. |
Office Action received for European Patent Application No. 12194312.0, dated Jan. 13, 2014, 4 pages. |
Office Action received for European Patent Application No. 12194312.0, dated Oct. 8, 2013, 5 pages. |
Office Action received for European Patent Application No. 12194315.3, dated Jan. 13, 2014, 4 pages. |
Office Action received for European Patent Application No. 12194315.3, dated Oct. 8, 2013, 5 pages. |
Office Action received for European Patent Application No. 14734674.6, dated Aug. 30, 2019, 6 pages. |
Office Action received for European Patent Application No. 14734674.6, dated Oct. 5, 2017, 6 pages. |
Office Action received for European Patent Application No. 17210062.0, dated Jan. 3, 2019, 6 pages. |
Office Action received for Japanese Patent Application No. 2017-223021, dated Apr. 8, 2019, 7 pages (3 pages of English Translation and 4 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2013-011209, dated Feb. 7, 2014, 3 pages (Official copy only) (See Communication under 37 CFR § 1.98(a) (3)). |
Office Action received for Japanese Patent Application No. 2013-011209, dated Nov. 2, 2015, 9 pages (2 pages of English Translation and 7 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2013-011209, dated Oct. 27, 2014, 3 pages (Official Copy only) (See Communication under 37 CFR § 1.98(a) (3)). |
Office Action received for Japanese Patent Application No. 2013-127963, dated Aug. 15, 2014, 8 pages (6 pages of English Translation and 2 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2013-127963, dated Mar. 10, 2014, 7 pages (4 pages of English Translation and 3 pages of Official copy). |
Office Action received for Japanese Patent Application No. 2013-252338, dated Dec. 4, 2015, 4 pages (2 pages of English Translation and 2 pages of official copy). |
Office Action received for Japanese Patent Application No. 2013-252338, dated Jan. 27, 2017, 10 pages (5 pages of English Translation and 5 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2013-252338, dated Jan. 30, 2015, 4 pages (Official Copy only) (See Communication under 37 CFR § 1.98(a) (3)). |
Office Action received for Japanese Patent Application No. 2013-252338, dated Jun. 24, 2016, 4 pages (2 pages of English Translation and 2 pages of Official copy). |
Office Action received for Japanese Patent Application No. 2013-503721, dated Feb. 14, 2014, 8 pages (5 pages of English Translation and 3 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2013-503721, dated Jun. 6, 2014, 3 pages (Official Copy only) (See Communication under 37 CFR § 1.98(a) (3)). |
Office Action received for Japanese Patent Application No. 2014-139095, dated Aug. 17, 2015, 6 pages (3 pages of English Translation and 3 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2014-253365, dated Aug. 31, 2018, 6 pages (3 pages of English Translation and 3 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2014-253365, dated Dec. 14, 2015, 6 pages (3 pages of English Translation and 3 pages Official Copy). |
Office Action received for Japanese Patent Application No. 2014-253365, dated Jul. 18, 2017, 9 pages (4 page of English Translation and 5 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2014-253365, dated Oct. 17, 2016, 11 pages (5 pages of English Translation and 6 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2015-532193, dated Mar. 22, 2016, 7 pages (3 pages of English Translation and 4 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2016-042767, dated Mar. 3, 2017, 10 pages (6 pages of English Translation and 4 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2016-091460, dated Jun. 1, 2018, 3 pages (1 page of English Translation and 2 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2016-091460, dated Jun. 26, 2017, 6 pages (3 pages of English Translation and 3 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2016-091460, dated Nov. 4, 2016, 6 pages (3 pages of English Translation and 3 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2016-091460, dated Nov. 27, 2017, 7 pages (4 pages of English Translation and 3 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2016-527367, dated Feb. 26, 2018, 15 pages (8 pages of English Translation and 7 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2016-527367, dated Jul. 7, 2017, 16 pages (8 pages of English Translation and 8 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2017-142812, dated Nov. 2, 2018, 6 pages (3 pages of English Translation and 3 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2017-204561, dated Aug. 6, 2018, 7 pages (4 pages of English Translation and 3 pages of Official copy). |
Office Action received for Japanese Patent Application No. 2017-204561, dated Nov. 6, 2018, 8 pages (4 pages of English Translation and 4 pages of Official copy). |
Office Action received for Japanese Patent Application No. 2017-223021, dated Jul. 30, 2018, 12 pages (6 pages of English Translation and 6 pages of Official copy). |
Office Action received for Japanese Patent Application No. 2017-223021, dated Sep. 24, 2019, 6 pages (3 pages of English Translation and 3 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2018-121118, dated May 14, 2019, 10 pages (5 pages of English Translation and 5 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2018-121118, dated Nov. 18, 2019, 10 pages (5 pages of English Translation and 5 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2018-201088, dated Oct. 11, 2019, 9 pages (5 pages of English Translation and 4 pages of Official Copy). |
Office Action received for Korean Patent Application No. 10-2011-7026583, dated Aug. 14, 2014, 6 pages (2 pages of English Translation and 4 pages of Official Copy). |
Office Action received for Korean Patent Application No. 10-2011-7026583, dated Oct. 25, 2013, 4 pages (1 page of English Translation and 3 pages of Official Copy). |
Office Action received for Korean Patent Application No. 10-2012-7029270, dated Dec. 4, 2013, 4 pages (Official Copy only) (See Communication under 37 CFR § 1.98(a) (3)). |
Office Action received for Korean Patent Application No. 10-2014-7011273, dated Aug. 14, 2014, 5 pages (2 pages of English Translation and 3 pages of Official Copy). |
Office Action received for Korean Patent Application No. 10-2014-7036624, dated Jan. 29, 2016, 10 pages (5 pages of office action and 5 pages of English Translation). |
Office Action received for Korean Patent Application No. 10-2016-7014051, dated Apr. 30, 2018, 14 pages (7 pages of English Translation and 7 pages of Official Copy). |
Office Action received for Korean Patent Application No. 10-2016-7014051, dated Jun. 20, 2017, 16 pages (8 pages of English Translation and 8 pages of Official Copy). |
Office Action received for Korean Patent Application No. 10-2019-7005262, dated May 3, 2019, 5 pages (2 pages of English Translation and 3 pages of Official Copy). |
Office Action received from European Patent Application No. 06846840.4, dated Oct. 13, 2008, 3 pages. |
Park, Will, “Apple iPhone v1.1.1 SpringBoard Hacked to Display Multiple Pages”, available at <http://www.intomobile.com/2007/10/09/apple-iphone-v111-springboard-hacked-to-display-multiple-pages/>, Oct. 9, 2007, 5 pages. |
PCFan, “Boot Camp Introduction/Data Transition/Operability/Ability Truth Derived from Gap Mac&Win Dual Boot Hard Verification”, Daily Communications, vol. 13, No. 14, Jun. 15, 2006, 4 pages (Official copy only) (See Communication under 37 CFR § 1.98(a) (3)). |
Record of Oral Hearing received for U.S. Appl. No. 14/142,640, mailed on Nov. 20, 2019, 15 pages. |
Ren, X., et al., “The Adaptive Hybrid Cursor: A Pressure-Based Target Selection Technique for Pen-Based User interfaces”, INTERACT'07, Proceedings of the 11th IFIP TC 13 International Conference on Human-Computer Interaction, Sep. 10, 2007, 14 pages. |
Shima, Korekazu, et al., “Android Application-Development”, From basics of development to mashup/hardwareinteraction, a road to “takumi” of Android application-development, Section I, difference from prior platforms, things which can be done with Android, Mar. 18, 2009, pp. 58-65 (Official copy only) (See Communication under 37 CFR § 1.98(a) (3)). |
Shiota, Shinji, “Special Developer's Story, DOS / V magazine”, vol. 13, No. 10, Jun. 1, 2004, 12 pages (Official copy only) (See Communication under 37 CFR § 1.98(a) (3)). |
“SilverScreen Theme Library”, Online Available at: <https://web.archive.org/web/20061113121041/http://www.pocketsensei.com/ss_themes.htm>, Nov. 13, 2006, 3 pages. |
“SilverScreen User Guide”, Online Available at: <https://web.archive.org/web/20061113121032/http://www.pocketsensei.com/ss_guide.htm>, Nov. 13, 2006, 12 pages. |
Stinson, Craig, “Windows 95 Official Manual”, ASCII Ltd., Ver.1 , Mar. 1, 1996, 6 pages (Official copy only) (See Communication under 37 CFR § 1.98(a) (3)). |
Summons to Attend Oral proceedings received for European Application No. 09170697.8 mailed on Apr. 22, 2013, 6 pages. |
Summons to Attend Oral proceedings received for European Application No. 09170697.8, mailed on Jul. 29, 2016, 9 pages. |
Summons to Attend Oral Proceedings received for European Application No. 09170697.8, mailed on Oct. 19, 2017, 12 pages. |
Summons to Attend Oral proceedings received for European Patent Application No. 06846840.4, mailed on May 18, 2009, 7 pages. |
Summons to Attend Oral Proceedings received for European Patent Application No. 07814689.1, mailed on Dec. 1, 2011, 11 pages. |
Summons to Attend Oral Proceedings received for European Patent Application No. 09700333.9, mailed on Sep. 21, 2012, 4 pages. |
Summons to Attend Oral Proceedings received for European Patent Application No. 10762813.3, mailed on Nov. 9, 2016, 9 pages. |
Summons to Attend Oral Proceedings received for European Patent Application No. 13795330.3, mailed on Oct. 19, 2018, 13 pages. |
Summons to Attend Oral Proceedings received for European Patent Application No. 17210062.0, mailed on Oct. 30, 2019, 7 pages. |
Summons to Oral Proceedings received for European Patent Application No. 12194312.0, mailed on Dec. 8, 2016, 9 pages. |
Summons to Oral Proceedings received for European Patent Application No. 12194315.3, mailed on Dec. 8, 2016, 9 pages. |
Supplemental Notice of Allowance received for U.S. Appl. No. 11/850,011, dated Feb. 24, 2011, 6 pages. |
Takahashi, Masaaki, “Inside Macintosh”, Mystery of File V, Mystery of Drag & Drop, NikkeiMAC, Nikkei Business Publications Inc., vol. 17, Aug. 15, 1994, 9 pages (Official Copy only) (See Communication under 37 CFR § 1.98(a) (3)). |
“TH8000 Series Programmable Thermostats”, Retrieved from the Internet: URL:https://ia802507.us.archive.org/1/items/generalmanual_000075065/generalmanual000075065.pdf, 2004, 44 pages. |
TooEasyToForget, “iPhone—Demo of SummerBoard & Its Features”, 5:05 minutes video, available at <http://www.youtube.com/watch?v=CJOb3ftQLac>, uploaded on Sep. 24, 2007, 2 pages. |
Turetta, Jonathan, “Steve Jobs iPhone 2007 Presentation (HD)”, Retrieved from the Internet: https://www.youtube.com/watch?v=vN4U5FqrOdQ&feature=youtu.be, May 13, 2013, 2 pages. |
Windows XP, “Enable or disable AutoArrange desktop icons in Windows XP”, Windows Tutorials, http://www.freemailtutorials.com/microsoftWindows/autoArrangeIconsOnTheDesktop.php, Nov. 19, 2009, 3 pages. |
Wright, Ben, “Palm OS PDA Application Mini-Reviews”, Online Available at <http://library.indstate.edu/newsletter/feb04/palmmini.htm>, Feb. 3, 2015, 11 pages. |
Zhang et al., “An Ergonomics Study of Menu-Operation on Mobile Phone Interface”, In Proceedings of the workshop on Intelligent Information Technology Application, 2007, pp. 247-251. |
Notice of Allowance received for U.S. Appl. No. 16/270,801, dated Sep. 16, 2020, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 16/270,902, dated Sep. 22, 2020, 9 pages. |
Office Action received for Australian Patent Application No. 2019204835, dated Sep. 16, 2020, 6 pages. |
Notice of Acceptance received for Australian Patent Application No. 2017277813, dated Jun. 16, 2020, 3 pages. |
Notice of Allowance received for U.S. Appl. No. 14/142,648, dated Jul. 15, 2020, 6 pages. |
Office Action received for Australian Patent Application No. 2019210673, dated Sep. 28, 2020, 2 pages. |
Office Action received for Japanese Patent Application No. 2017-223021, dated Sep. 11, 2020, 20 pages (10 pages of English Translation and 10 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2019-144763, dated Oct. 2, 2020, 6 pages (3 pages of English Translation and 3 pages of Official Copy). |
Office Action received for Korean Patent Application No. 10-2020-7018655, dated Oct. 13, 2020, 5 pages (2 pages of English Translation and 3 pages of Official Copy). |
Decision to Grant received for European Patent Application No. 17210062.0, dated Oct. 1, 2020, 2 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/421,865, dated Oct. 7, 2020, 20 pages. |
Notice of Acceptance received for Australian Patent Application No. 2019219816, dated Sep. 23, 2020, 3 pages. |
Notice of Allowance received for Japanese Patent Application No. 2018-201088, dated Sep. 18, 2020, 4 pages (1 page of English Translation and 3 pages of Official Copy). |
Record of Oral Hearing received for U.S. Appl. No. 14/261,112, mailed on Sep. 28, 2020, 20 pages. |
Applicant Initiated Interview Summary received for U.S. Appl. No. 15/411,110, dated Nov. 17, 2020, 7 pages. |
Notice of Allowance received for Canadian Patent Application No. 2,983,178, dated Oct. 20, 2020, 1 page. |
Notice of Allowance received for U.S. Appl. No. 14/261,112, dated Nov. 18, 2020, 6 pages. |
Result of Consultation received for European Patent Application No. 08829660.3, dated Nov. 18, 2020, 5 pages. |
Applicant-Initiated Interview Summary received for U.S. Appl. No. 15/421,865, dated Feb. 3, 2020, 5 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 14/142,640, dated Feb. 5, 2020, 4 pages. |
Decision to Grant received for European Patent Application No. 13795330.3, dated Jan. 16, 2020, 2 pages. |
Pre-interview First Office Action received for U.S. Appl. No. 16/270,801, dated Feb. 10, 2020, 5 pages. |
Pre-interview First Office Action received for U.S. Appl. No. 16/270,902, dated Feb. 10, 2020, 5 pages. |
Decision on Appeal received for U.S. Appl. No. 14/261,112, mailed on Oct. 29, 2020, 20 pages. |
Notice of Acceptance received for Australian Patent Application No. 2019210673, dated Oct. 17, 2020, 3 pages. |
Office Action received for Japanese Patent Application No. 2019-024663, dated Oct. 5, 2020, 7 pages (4 pages of English Translation and 3 pages of Official Copy). |
Advisory Action received for U.S. Appl. No. 15/411,110, dated Jun. 29, 2021, 4 pages. |
Applicant-Initiated Interview Summary received for U.S. Appl. No. 15/421,865, dated Jun. 30, 2021, 6 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/002,622, dated Jul. 6, 2021, 14 pages. |
Notice of Allowance received for U.S. Appl. No. 16/926,530, dated Jun. 24, 2021, 10 pages. |
Office Action received for Australian Patent Application No. 2020239774, dated Jun. 28, 2021, 8 pages. |
Examiner's Pre-Review Report received for Japanese Patent Application No. 2019-024663, dated Aug. 31, 2021, 4 pages (2 pages of English Translation and 2 pages of Official Copy). |
Intention to Grant received for European Patent Application No. 12189764.9, dated Sep. 28, 2021, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/190,869, dated Sep. 27, 2021, 26 pages. |
Notice of Allowance received for Japanese Patent Application No. 2018-121118, dated Sep. 27, 2021, 3 pages (1 page of English Translation and 2 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2020-123882, dated Sep. 3, 2021, 8 pages (4 pages of English Translation and 4 pages of Official Copy). |
Notice of Allowance received for Chinese Patent Application No. 201780033973.7, dated Jul. 7, 2021, 5 pages (1 page of English Translation and 4 pages of Official Copy). |
Office Action received for European Patent Application No. 09170697.8, dated Jul. 6, 2021, 3 pages. |
Office Action received for European Patent Application No. 17810723.1, dated Jul. 9, 2021, 8 pages. |
Applicant-Initiated Interview Summary received for U.S. Appl. No. 16/832,285, dated Nov. 19, 2021, 19 pages. |
Applicant-Initiated Interview Summary received for U.S. Appl. No. 16/994,392, dated Dec. 3, 2021, 2 pages. |
Decision to Grant received for European Patent Application No. 12189764.9, dated Nov. 25, 2021, 2 pages. |
Notice of Allowance received for U.S. Appl. No. 17/002,622, dated Nov. 22, 2021, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 17/019,062, dated Nov. 24, 2021, 0 pages. |
Summons to Attend Oral Proceedings received for European Patent Application No. 14734674.6, mailed on Nov. 23, 2021, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 17/019,062, dated Aug. 10, 2021, 22 pages. |
Office Action received for Australian Patent Application No. 2020239774, dated Oct. 5, 2021, 3 pages. |
Office Action received for European Patent Application No. 17813879.8, dated Oct. 20, 2021, 7 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/832,285, dated Jul. 26, 2021, 62 pages. |
Office Action received for Japanese Patent Application No. 2019-144763, dated Jul. 2, 2021, 6 pages (3 pages of English Translation and 3 pages of Official Copy). |
Applicant-Initiated Interview Summary received for U.S. Appl. No. 17/190,869, dated Dec. 10, 2021, 2 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 17/002,622, dated Dec. 13, 2021, 2 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 17/019,062, dated Dec. 8, 2021, 2 pages. |
Intention to Grant received for European Patent Application No. 08829660.3, dated Dec. 17, 2021, 8 pages. |
Intention to Grant received for European Patent Application No. 09170697.8, dated Dec. 16, 2021, 8 pages. |
Notice of Allowance received for Japanese Patent Application No. 2019-144763, dated Nov. 29, 2021, 4 pages (1 page of English Translation and 3 pages of Official Copy). |
Non-Final Office Action received for U.S. Appl. No. 15/421,865, dated Dec. 29, 2021, 23 pages. |
Notice of Allowance received for U.S. Appl. No. 17/190,869, dated Jan. 10, 2022, 10 pages. |
Office Action received for Chinese Patent Application No. 201780033621.1, dated Dec. 14, 2021, 8 pages (4 pages of English Translation and 4 pages of Official Copy). |
Office Action received for Australian Patent Application No. 2021201687, dated Mar. 16, 2022, 5 pages. |
Applicant-Initiated Interview Summary received for U.S. Appl. No. 15/421,865, dated Feb. 28, 2022, 5 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 17/002,622, dated Feb. 16, 2022, 2 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 17/002,622, dated Jan. 25, 2022, 2 pages. |
Examiner's Answer to Appeal Brief received for U.S. Appl. No. 15/411,110, dated Feb. 1, 2022, 9 pages. |
Final Office Action received for U.S. Appl. No. 16/832,285, dated Jan. 19, 2022, 66 pages. |
Final Office Action received for U.S. Appl. No. 16/994,392, dated Jan. 18, 2022, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/130,674, dated Mar. 3, 2022, 8 pages. |
Notice of Acceptance received for Australian Patent Application No. 2020239774, dated Jan. 5, 2022, 3 pages. |
Office Action received for Canadian Patent Application No. 3,109,701, dated Feb. 7, 2022, 4 pages. |
Office Action received for European Patent Application No. 19176224.4, dated Jan. 18, 2022, 6 pages. |
Office Action received for Japanese Patent Application No. 2019-24663, dated Feb. 10, 2022, 4 pages (2 pages of English Translation and 2 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2020-046707, dated Jan. 7, 2022, 10 pages (5 pages of English Translation and 5 pages of Official Copy). |
Applicant-Initiated Interview Summary received for U.S. Appl. No. 16/994,392, dated Mar. 10, 2022, 2 pages. |
Notice of Allowance received for Chinese Patent Application No. 201780033621.1, dated Mar. 10, 2022, 2 pages (1 page of English Translation and 1 page of Official Copy). |
Notice of Allowance received for Korean Patent Application No. 10-2020-7018655, dated Feb. 25, 2022, 4 pages (1 page of English Translation and 3 pages of Official Copy). |
Office Action received for European Patent Application No. 20203888.1, dated Mar. 10, 2022, 6 pages. |
Minutes of the Oral Proceedings received for European Patent Application No. 14734674.6, mailed on Jun. 13, 2022, 9 pages. |
Decision to Grant received for European Patent Application No. 08829660.3, dated May 6, 2022, 2 pages. |
Decision to Grant received for European Patent Application No. 09170697.8, dated Apr. 29, 2022, 2 pages. |
Notice of Allowance received for Japanese Patent Application No. 2020-046707, dated Aug. 15, 2022, 4 pages (1 page of English Translation and 3 pages of Official Copy). |
Notice of Allowance received for Japanese Patent Application No. 2019-024663, dated Sep. 26, 2022, 23 pages (1 page of English Translation and 22 pages of Official Copy). |
Corrected Notice of Allowance received for U.S. Appl. No. 16/994,392, dated Aug. 4, 2022, 2 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 17/130,674, dated Jul. 29, 2022, 2 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 17/521,768, dated Jul. 29, 2022, 2 pages. |
Office Action received for Japanese Patent Application No. 2020-123882, dated Jul. 29, 2022, 8 pages (4 pages of English Translation and 4 pages of Official Copy). |
Corrected Notice of Allowance received for U.S. Appl. No. 16/994,392, dated Jul. 19, 2022, 2 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/411,110, dated Jul. 14, 2022, 28 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/321,313, dated Jul. 19, 2022, 18 pages. |
Notice of Allowance received for U.S. Appl. No. 17/521,768, dated Jul. 15, 2022, 10 pages. |
Brief Communication Regarding Oral Proceedings received for European Patent Application No. 17810723.1, mailed on Nov. 11, 2022, 11 pages. |
Minutes of the Oral Proceedings received for European Patent Application No. 17810723.1, mailed on Dec. 9, 2022, 7 pages. |
Result of Consultation received for European Patent Application No. 17810723.1, mailed on Nov. 30, 2022, 3 pages. |
Decision to Refuse received for European Patent Application No. 14734674.6, dated Jun. 29, 2022, 15 pages. |
Final Office Action received for U.S. Appl. No. 15/421,865, dated Jul. 12, 2022, 27 pages. |
Notice of Allowance received for U.S. Appl. No. 16/994,392, dated Jul. 11, 2022, 26 pages. |
Summons to Attend Oral Proceedings received for European Patent Application No. 17810723.1, mailed on Jul. 5, 2022, 8 pages. |
Summons to Attend Oral Proceedings received for German Patent Application No. 112006003600.9, mailed on Jun. 2, 2022, 33 pages (21 pages of English Translation and 12 pages of Official Copy). |
Examiner's Answer to Appeal Brief received for U.S. Appl. No. 16/832,285, dated Sep. 7, 2022, 30 pages. |
Notice of Allowance received for U.S. Appl. No. 17/349,226, dated Sep. 20, 2022, 7 pages. |
Notice of Acceptance received for Australian Patent Application No. 2021201687, dated Jun. 8, 2022, 3 pages. |
Notice of Allowance received for U.S. Appl. No. 17/130,674, dated Jun. 15, 2022, 9 pages. |
Result of Consultation received for European Patent Application No. 14734674.6, mailed on May 27, 2022, 3 pages. |
Intention to Grant received for European Patent Application No. 17810723.1, dated Dec. 16, 2022, 9 pages. |
Applicant-Initiated Interview Summary received for U.S. Appl. No. 15/411,110, dated Oct. 31, 2022, 6 pages. |
Notice of Allowance received for U.S. Appl. No. 17/321,313, dated Oct. 24, 2022, 9 pages. |
Office Action received for Chinese Patent Application No. 201780034059.4, dated Oct. 9, 2022, 11 pages (5 pages of English Translation and 6 pages of Official Copy). |
Prior Publication Data: US 2020/0348822 A1, Nov. 2020, US
Related U.S. Application Data: Provisional Application No. 62/843,507, May 2019, US