This relates generally to electronic devices with touch-sensitive surfaces, including but not limited to electronic devices with touch-sensitive surfaces that detect inputs for manipulating user interfaces.
The use of touch-sensitive surfaces as input devices for computers and other electronic computing devices has increased significantly in recent years. Exemplary touch-sensitive surfaces include touchpads and touch-screen displays. Such surfaces are widely used to manipulate user interfaces on a display.
Exemplary manipulations include adjusting the position and/or size of one or more user interface objects or activating buttons or opening files/applications represented by user interface objects, as well as associating metadata with one or more user interface objects or otherwise manipulating user interfaces. Exemplary user interface objects include digital images, video, text, icons, and control elements such as buttons and other graphics.
A user will, in some circumstances, need to perform such manipulations on user interface objects in a file management program (e.g., Finder from Apple Inc. of Cupertino, Calif.), a messaging application (e.g., Messages from Apple Inc. of Cupertino, Calif.), an image management application (e.g., Photos from Apple Inc. of Cupertino, Calif.), a camera application (e.g., Camera from Apple Inc. of Cupertino, Calif.), a map application (e.g., Maps from Apple Inc. of Cupertino, Calif.), a note taking application (e.g., Notes from Apple Inc. of Cupertino, Calif.), digital content (e.g., videos and music) management applications (e.g., Music and iTunes from Apple Inc. of Cupertino, Calif.), a news application (e.g., News from Apple Inc. of Cupertino, Calif.), a phone application (e.g., Phone from Apple Inc. of Cupertino, Calif.), an email application (e.g., Mail from Apple Inc. of Cupertino, Calif.), a browser application (e.g., Safari from Apple Inc. of Cupertino, Calif.), a drawing application, a presentation application (e.g., Keynote from Apple Inc. of Cupertino, Calif.), a word processing application (e.g., Pages from Apple Inc. of Cupertino, Calif.), a spreadsheet application (e.g., Numbers from Apple Inc. of Cupertino, Calif.), a reader application (e.g., iBooks from Apple Inc. of Cupertino, Calif.), a video making application (e.g., iMovie from Apple Inc. of Cupertino, Calif.), and/or geo location applications (e.g., Find Friends and Find iPhone from Apple Inc. of Cupertino, Calif.).
But existing methods for performing these manipulations are cumbersome and inefficient. In addition, existing methods take longer than necessary, thereby wasting energy. This latter consideration is particularly important in battery-operated devices.
Accordingly, there is a need for electronic devices with faster, more efficient methods and interfaces for manipulating user interfaces. Such methods and interfaces optionally complement or replace conventional methods for manipulating user interfaces. Such methods and interfaces reduce the number, extent, and/or nature of the inputs from a user and produce a more efficient human-machine interface. For battery-operated devices, such methods and interfaces conserve power and increase the time between battery charges.
The above deficiencies and other problems associated with user interfaces for electronic devices with touch-sensitive surfaces are reduced or eliminated by the disclosed devices. In some embodiments, the device is a desktop computer. In some embodiments, the device is portable (e.g., a notebook computer, tablet computer, or handheld device). In some embodiments, the device is a personal electronic device (e.g., a wearable electronic device, such as a watch). In some embodiments, the device has a touchpad. In some embodiments, the device has a touch-sensitive display (also known as a “touch screen” or “touch-screen display”). In some embodiments, the device has a graphical user interface (GUI), one or more processors, memory and one or more modules, programs or sets of instructions stored in the memory for performing multiple functions. In some embodiments, the user interacts with the GUI primarily through stylus and/or finger contacts and gestures on the touch-sensitive surface. In some embodiments, the functions optionally include image editing, drawing, presenting, word processing, spreadsheet making, game playing, telephoning, video conferencing, e-mailing, instant messaging, workout support, digital photographing, digital videoing, web browsing, digital music playing, note taking, and/or digital video playing. Executable instructions for performing these functions are, optionally, included in a non-transitory computer readable storage medium or other computer program product configured for execution by one or more processors.
In accordance with some embodiments, a method is performed at an electronic device with a touch-sensitive surface and a display. The device includes one or more sensors to detect intensity of contacts with the touch-sensitive surface. The device displays a plurality of user interface objects in a first user interface on the display. The device detects a contact at a location on the touch-sensitive surface while a focus selector is at a location of a first user interface object, in the plurality of user interface objects, on the display. While the focus selector is at the location of the first user interface object on the display, the device detects an increase in a characteristic intensity of the contact to a first intensity threshold; in response to detecting the increase in the characteristic intensity of the contact to the first intensity threshold, the device visually obscures the plurality of user interface objects, other than the first user interface object, in the first user interface while maintaining display of the first user interface object without visually obscuring the first user interface object; the device detects that the characteristic intensity of the contact continues to increase above the first intensity threshold; and, in response to detecting that the characteristic intensity of the contact continues to increase above the first intensity threshold, the device dynamically increases the amount of visual obscuring of the plurality of user interface objects, other than the first user interface object, in the first user interface while maintaining display of the first user interface object without visually obscuring the first user interface object.
In accordance with some embodiments, an electronic device includes a display unit configured to display user interface objects; a touch-sensitive surface unit configured to receive contacts; one or more sensor units configured to detect intensity of contacts with the touch-sensitive surface unit; and a processing unit coupled to the display unit, the touch-sensitive surface unit and the one or more sensor units. The processing unit is configured to enable display of a plurality of user interface objects in a first user interface on the display unit; detect a contact at a location on the touch-sensitive surface unit while a focus selector is at a location of a first user interface object, in the plurality of user interface objects, on the display unit; and, while the focus selector is at the location of the first user interface object on the display unit: detect an increase in a characteristic intensity of the contact to a first intensity threshold; in response to detecting the increase in the characteristic intensity of the contact to the first intensity threshold, visually obscure the plurality of user interface objects, other than the first user interface object, in the first user interface while maintaining display of the first user interface object without visually obscuring the first user interface object; detect that the characteristic intensity of the contact continues to increase above the first intensity threshold; and, in response to detecting that the characteristic intensity of the contact continues to increase above the first intensity threshold, dynamically increase the amount of visual obscuring of the plurality of user interface objects, other than the first user interface object, in the first user interface while maintaining display of the first user interface object without visually obscuring the first user interface object.
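The following is a minimal, non-limiting sketch, in Swift, of the intensity-to-obscuring relationship described in the two preceding paragraphs. The type names, threshold values, and the linear mapping are illustrative assumptions only and are not taken from the embodiments themselves.

```swift
// Illustrative sketch only: a hypothetical model of the intensity-driven obscuring
// behavior. blurRadius(forCharacteristicIntensity:) maps a characteristic contact
// intensity to an obscuring amount for the non-selected objects.
struct ObscuringModel {
    let firstIntensityThreshold: Double   // intensity at which obscuring begins
    let maximumIntensity: Double          // intensity at which obscuring saturates
    let maximumBlurRadius: Double         // blur applied at maximumIntensity

    /// Returns the blur radius to apply to every user interface object other
    /// than the selected one; the selected object itself is never obscured.
    func blurRadius(forCharacteristicIntensity intensity: Double) -> Double {
        guard intensity >= firstIntensityThreshold else { return 0 }
        // Dynamically increase obscuring as intensity rises above the threshold.
        let fraction = min(1.0, (intensity - firstIntensityThreshold) /
                                (maximumIntensity - firstIntensityThreshold))
        return fraction * maximumBlurRadius
    }
}

// Example: an intensity halfway between the threshold and the maximum
// yields half of the maximum blur for the non-selected objects.
let model = ObscuringModel(firstIntensityThreshold: 0.3,
                           maximumIntensity: 0.8,
                           maximumBlurRadius: 20)
print(model.blurRadius(forCharacteristicIntensity: 0.55)) // 10.0
```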
In accordance with some embodiments, a method is performed at an electronic device with a touch-sensitive surface and a display. The device includes one or more sensors to detect intensity of contacts with the touch-sensitive surface. The device displays a plurality of user interface objects in a first user interface on the display. The device detects an input by a contact while a focus selector is over a first user interface object, in the plurality of user interface objects, on the display. In accordance with a determination that the input meets selection criteria, the device displays a second user interface that is distinct from the first user interface in response to detecting the input. In accordance with a determination that a first portion of the input meets preview criteria, the device displays a preview area overlaid on at least some of the plurality of user interface objects in the first user interface in response to detecting the first portion of the input, wherein the preview area includes a reduced scale representation of the second user interface. In accordance with a determination that a second portion of the input by the contact, detected after the first portion of the input, meets user-interface-replacement criteria, the device replaces display of the first user interface and the overlaid preview area with display of the second user interface. In accordance with a determination that the second portion of the input by the contact meets preview-area-disappearance criteria, the device ceases to display the preview area and displays the first user interface after the input ends.
In accordance with some embodiments, an electronic device includes a display unit configured to display user interface objects; a touch-sensitive surface unit configured to receive contacts; one or more sensor units configured to detect intensity of contacts with the touch-sensitive surface unit; and a processing unit coupled to the display unit, the touch-sensitive surface unit and the one or more sensor units. The processing unit is configured to enable display of a plurality of user interface objects in a first user interface on the display unit. The processing unit is configured to detect an input by a contact while a focus selector is over a first user interface object, in the plurality of user interface objects, on the display unit. In accordance with a determination that the input meets selection criteria, the processing unit is configured to enable display of a second user interface that is distinct from the first user interface in response to detecting the input. In accordance with a determination that a first portion of the input meets preview criteria, the processing unit is configured to enable display of a preview area overlaid on at least some of the plurality of user interface objects in the first user interface in response to detecting the first portion of the input, wherein the preview area includes a reduced scale representation of the second user interface. In accordance with a determination that a second portion of the input by the contact, detected after the first portion of the input, meets user-interface-replacement criteria, the processing unit is configured to replace display of the first user interface and the overlaid preview area with display of the second user interface. In accordance with a determination that the second portion of the input by the contact meets preview-area-disappearance criteria, the processing unit is configured to cease to display the preview area and enable display of the first user interface after the input ends.
In accordance with some embodiments, a method is performed at an electronic device with a touch-sensitive surface and a display. The device includes one or more sensors to detect intensity of contacts with the touch-sensitive surface. The device displays a plurality of user interface objects in a first user interface on the display. The device detects a first portion of a press input by a contact at a location on the touch-sensitive surface that corresponds to a location of a first user interface object, in the plurality of user interface objects, on the display. While detecting the first portion of the press input by the contact at the location on the touch-sensitive surface that corresponds to the location of the first user interface object, in the plurality of user interface objects, on the display, the device selects the first user interface object and detects an increase in the intensity of the contact to a second intensity threshold. In response to detecting the increase in the intensity of the contact to the second intensity threshold, the device displays in the first user interface a preview area overlaid on at least some of the plurality of user interface objects. After detecting the first portion of the press input, the device detects a second portion of the press input by the contact. In response to detecting the second portion of the press input by the contact, in accordance with a determination that the second portion of the press input by the contact meets user-interface-replacement criteria, the device replaces display of the first user interface with a second user interface that is distinct from the first user interface. In accordance with a determination that the second portion of the press input by the contact meets preview-area-maintenance criteria, the device maintains display, after the press input ends, of the preview area overlaid on at least some of the plurality of user interface objects in the first user interface. In accordance with a determination that the second portion of the press input by the contact meets preview-area-disappearance criteria, the device ceases to display the preview area and maintains display, after the press input ends, of the first user interface.
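The following non-limiting sketch, in Swift, models the preview, user-interface-replacement, preview-area-maintenance, and preview-area-disappearance behavior described in the preceding paragraphs as a simple state machine. All names and threshold values are illustrative assumptions.

```swift
// Illustrative sketch only: a hypothetical state machine for the preview ("peek"),
// user-interface-replacement ("pop"), maintenance, and disappearance criteria.
enum PreviewState { case firstUserInterface, previewOverlay, secondUserInterface }

struct PreviewController {
    let previewThreshold: Double      // intensity that triggers the preview area
    let replacementThreshold: Double  // higher intensity that replaces the interface
    var state: PreviewState = .firstUserInterface

    mutating func update(characteristicIntensity: Double) {
        switch state {
        case .firstUserInterface where characteristicIntensity >= previewThreshold:
            state = .previewOverlay          // show preview area over the first UI
        case .previewOverlay where characteristicIntensity >= replacementThreshold:
            state = .secondUserInterface     // replace first UI with second UI
        default:
            break
        }
    }

    mutating func inputEnded(previewMaintenanceMet: Bool) {
        guard case .previewOverlay = state else { return }
        // Maintain the preview after the input ends only if the maintenance
        // criteria were met; otherwise the preview area disappears.
        state = previewMaintenanceMet ? .previewOverlay : .firstUserInterface
    }
}
```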
In accordance with some embodiments, a method is performed at an electronic device with a display, a touch-sensitive surface, and one or more sensors to detect intensity of contacts with the touch-sensitive surface. The method includes displaying, on the display, a first user interface that includes a plurality of selectable user interface objects, including one or more user interface objects of a first type and one or more user interface objects of a second type that is distinct from the first type. While displaying the first user interface on the display, the device detects a first portion of a first input that includes detecting an increase in a characteristic intensity of a first contact on the touch-sensitive surface above a first intensity threshold while a focus selector is over a respective user interface object of the plurality of selectable user interface objects. In response to detecting the first portion of the first input, the device displays supplemental information associated with the respective user interface object. While displaying the supplemental information associated with the respective user interface object, the device detects an end of the first input. In response to detecting the end of the first input: in accordance with a determination that the respective user interface object is the first type of user interface object, the device ceases to display the supplemental information associated with the respective user interface object; and, in accordance with a determination that the respective user interface object is the second type of user interface object, the device maintains display of the supplemental information associated with the respective user interface object after detecting the end of the first input.
In accordance with some embodiments, an electronic device includes a display unit configured to display content items, a touch-sensitive surface unit configured to receive user inputs, one or more sensor units configured to detect intensity of contacts with the touch-sensitive surface unit, and a processing unit coupled to the display unit, the touch-sensitive surface unit and the one or more sensor units. The processing unit is configured to: enable display, on the display unit, of a first user interface that includes a plurality of selectable user interface objects, including one or more user interface objects of a first type and one or more user interface objects of a second type that is distinct from the first type; while the first user interface is displayed on the display unit, detect a first portion of a first input that includes detecting an increase in a characteristic intensity of a first contact on the touch-sensitive surface above a first intensity threshold while a focus selector is over a respective user interface object of the plurality of selectable user interface objects; in response to detecting the first portion of the first input, enable display of supplemental information associated with the respective user interface object; while the supplemental information associated with the respective user interface object is displayed, detect an end of the first input; and, in response to detecting the end of the first input: in accordance with a determination that the respective user interface object is the first type of user interface object, cease to enable display of the supplemental information associated with the respective user interface object; and, in accordance with a determination that the respective user interface object is the second type of user interface object, maintain display of the supplemental information associated with the respective user interface object after detecting the end of the first input.
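The following non-limiting Swift sketch illustrates the object-type-dependent handling of the end of the input described in the preceding paragraphs; the types and method names are illustrative assumptions only.

```swift
// Illustrative sketch only: the end of the input dismisses the supplemental
// information for objects of the first type and keeps it for the second type.
enum ObjectType { case firstType, secondType }

struct SupplementalInfoPresenter {
    private(set) var isShowingSupplementalInfo = false

    mutating func intensityRoseAboveFirstThreshold() {
        isShowingSupplementalInfo = true      // show supplemental information
    }

    mutating func inputEnded(over objectType: ObjectType) {
        switch objectType {
        case .firstType:
            isShowingSupplementalInfo = false // transient: dismiss on release
        case .secondType:
            break                             // persistent: keep displaying
        }
    }
}
```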
In accordance with some embodiments, a method is performed at an electronic device with a display, a touch-sensitive surface, and one or more sensors to detect intensity of contacts with the touch-sensitive surface. The device displays a first user interface on the display, wherein the first user interface includes a background with a first appearance and one or more foreground objects. While displaying the first user interface on the display, the device detects a first input by a first contact on the touch-sensitive surface while a first focus selector is at a location in the first user interface that corresponds to the background of the first user interface. In response to detecting the first input by the first contact, in accordance with a determination that the first contact has a characteristic intensity above a first intensity threshold, the device dynamically changes the appearance of the background of the first user interface without changing the appearance of the one or more foreground objects in the first user interface, wherein the dynamic change in the appearance of the background of the first user interface is based at least in part on the characteristic intensity of the first contact. While dynamically changing the appearance of the background of the first user interface, the device detects termination of the first input by the first contact; and, in response to detecting termination of the first input by the first contact, the device reverts the background of the first user interface back to the first appearance of the background.
In accordance with some embodiments, an electronic device includes a display unit configured to display user interfaces, backgrounds and foreground objects, a touch-sensitive surface unit configured to receive user inputs, one or more sensor units configured to detect intensity of contacts with the touch-sensitive surface unit, and a processing unit coupled to the display unit, the touch-sensitive surface unit and the one or more sensor units. The processing unit is configured to enable display of a first user interface on the display unit, wherein the first user interface includes a background with a first appearance and one or more foreground objects. While displaying the first user interface on the display unit, the processing unit is configured to detect a first input by a first contact on the touch-sensitive surface unit while a first focus selector is at a location in the first user interface that corresponds to the background of the first user interface. In response to detecting the first input by the first contact, in accordance with a determination that the first contact has a characteristic intensity above a first intensity threshold, the processing unit is configured to dynamically change the appearance of the background of the first user interface without changing the appearance of the one or more foreground objects in the first user interface, wherein the dynamic change in the appearance of the background of the first user interface is based at least in part on the characteristic intensity of the first contact. While dynamically changing the appearance of the background of the first user interface, the processing unit is configured to detect termination of the first input by the first contact; and, in response to detecting termination of the first input by the first contact, the processing unit is configured to revert the background of the first user interface back to the first appearance of the background.
In accordance with some embodiments, a method is performed at an electronic device with a display, a touch-sensitive surface, and one or more sensors to detect intensity of contacts with the touch-sensitive surface. The device displays a first user interface on the display, wherein the first user interface includes a background with a first appearance and one or more foreground objects. While displaying the first user interface on the display, the device detects an input by a first contact on the touch-sensitive surface, the first contact having a characteristic intensity above a first intensity threshold. In response to detecting the input by the first contact, in accordance with a determination that, during the input, a focus selector is at a location in the first user interface that corresponds to the background of the user interface, the device dynamically changes the appearance of the background of the first user interface without changing the appearance of the one or more foreground objects in the first user interface, wherein the dynamic change in the appearance of the background of the first user interface is based at least in part on the characteristic intensity of the first contact; and, in accordance with a determination that a focus selector is at a location in the first user interface that corresponds to a respective foreground object of the one or more foreground objects in the first user interface, the device maintains the first appearance of the background of the first user interface.
In accordance with some embodiments, an electronic device includes a display unit configured to display user interfaces, backgrounds and foreground objects, a touch-sensitive surface unit configured to receive user inputs, one or more sensor units configured to detect intensity of contacts with the touch-sensitive surface unit, and a processing unit coupled to the display unit, the touch-sensitive surface unit and the one or more sensor units. The processing unit is configured to enable display of a first user interface on the display unit, wherein the first user interface includes a background with a first appearance and one or more foreground objects. While displaying the first user interface on the display unit, the processing unit is configured to detect an input by a first contact on the touch-sensitive surface unit, the first contact having a characteristic intensity above a first intensity threshold. In response to detecting the input by the first contact, in accordance with a determination that, during the input, a focus selector is at a location in the first user interface that corresponds to the background of the user interface, the processing unit is configured to dynamically change the appearance of the background of the first user interface without changing the appearance of the one or more foreground objects in the first user interface, wherein the dynamic change in the appearance of the background of the first user interface is based at least in part on the characteristic intensity of the first contact. In accordance with a determination that a focus selector is at a location in the first user interface that corresponds to a respective foreground object of the one or more foreground objects in the first user interface, the processing unit is configured to maintain the first appearance of the background of the first user interface.
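The following non-limiting Swift sketch models the background behavior described in the preceding paragraphs: the change applies only when the focus selector is over the background, scales with the characteristic intensity of the contact, and reverts when the input terminates. The names and the particular darkening mapping are illustrative assumptions.

```swift
// Illustrative sketch only: a hypothetical model of the background behavior.
struct BackgroundAppearance { var darkening: Double }  // 0 = first appearance

struct BackgroundModel {
    let firstIntensityThreshold: Double
    var appearance = BackgroundAppearance(darkening: 0)

    mutating func update(characteristicIntensity: Double, overBackground: Bool) {
        guard overBackground, characteristicIntensity > firstIntensityThreshold else {
            return  // presses over foreground objects leave the background unchanged
        }
        // Dynamic change based at least in part on the characteristic intensity.
        appearance.darkening = min(1.0, characteristicIntensity - firstIntensityThreshold)
    }

    mutating func inputTerminated() {
        appearance = BackgroundAppearance(darkening: 0)  // revert to first appearance
    }
}
```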
In accordance with some embodiments, a method is performed at an electronic device with a display, a touch-sensitive surface, and one or more sensors to detect intensity of contacts with the touch-sensitive surface. The device displays a first user interface on the display, wherein: the first user interface includes a background; the first user interface includes a foreground area overlaying a portion of the background; and the foreground area includes a plurality of user interface objects. The device detects an input by a contact on the touch-sensitive surface while a first focus selector is at a first user interface object in the plurality of user interface objects in the foreground area. In response to detecting the input by the contact, in accordance with a determination that the input by the contact meets one or more first press criteria, which include a criterion that is met when a characteristic intensity of the contact remains below a first intensity threshold during the input, the device performs a first predetermined action that corresponds to the first user interface object in the foreground area; and, in accordance with a determination that the input by the contact meets one or more second press criteria, which include a criterion that is met when the characteristic intensity of the contact increases above the first intensity threshold during the input, the device performs a second action, distinct from the first predetermined action, that corresponds to the first user interface object in the foreground area.
In accordance with some embodiments, an electronic device includes a display unit configured to display user interfaces and user interface objects, a touch-sensitive surface unit configured to receive user inputs, one or more sensor units configured to detect intensity of contacts with the touch-sensitive surface unit, and a processing unit coupled to the display unit, the touch-sensitive surface unit and the one or more sensor units. The processing unit is configured to enable display of a first user interface on the display unit, wherein: the first user interface includes a background; the first user interface includes a foreground area overlaying a portion of the background; and the foreground area includes a plurality of user interface objects. The processing unit is configured to detect an input by a contact on the touch-sensitive surface unit while a first focus selector is at a first user interface object in the plurality of user interface objects in the foreground area. In response to detecting the input by the contact, in accordance with a determination that the input by the contact meets one or more first press criteria, which include a criterion that is met when a characteristic intensity of the contact remains below a first intensity threshold during the input, the processing unit is configured to perform a first predetermined action that corresponds to the first user interface object in the foreground area; and, in accordance with a determination that the input by the contact meets one or more second press criteria, which include a criterion that is met when the characteristic intensity of the contact increases above the first intensity threshold during the input, the processing unit is configured to perform a second action, distinct from the first predetermined action, that corresponds to the first user interface object in the foreground area.
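The following non-limiting Swift sketch illustrates choosing between the first predetermined action and the second action based on whether the characteristic intensity of the contact crossed the first intensity threshold during the input; the names are illustrative assumptions.

```swift
// Illustrative sketch only: dispatch between the two actions that correspond to
// the pressed user interface object in the foreground area.
enum ForegroundAction { case firstPredeterminedAction, secondAction }

func action(forPeakIntensity peakIntensity: Double,
            firstIntensityThreshold: Double) -> ForegroundAction {
    // First press criteria: intensity remained below the threshold during the input.
    // Second press criteria: intensity increased above the threshold during the input.
    return peakIntensity < firstIntensityThreshold ? .firstPredeterminedAction
                                                   : .secondAction
}
```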
In accordance with some embodiments, a method is performed at an electronic device with a display, a touch-sensitive surface, and one or more sensors to detect intensity of contacts with the touch-sensitive surface. The device displays, on the display, an application launching user interface that includes a plurality of application icons for launching corresponding applications. While displaying the application launching user interface, the device detects a first touch input that includes detecting a first contact at a location on the touch-sensitive surface that corresponds to a first application icon of the plurality of application icons. The first application icon is an icon for launching a first application that is associated with one or more corresponding quick actions. In response to detecting the first touch input, in accordance with a determination that the first touch input meets one or more application-launch criteria, the device launches the first application. In accordance with a determination that the first touch input meets one or more quick-action-display criteria, which include a criterion that is met when the characteristic intensity of the first contact increases above a respective intensity threshold, the device concurrently displays one or more quick action objects associated with the first application along with the first application icon without launching the first application.
In accordance with some embodiments, an electronic device includes a display unit configured to display user interface objects, a touch-sensitive surface unit configured to receive user inputs, one or more sensor units configured to detect intensity of contacts with the touch-sensitive surface unit, and a processing unit coupled to the display unit, the touch-sensitive surface unit and the one or more sensor units. The processing unit is configured to enable display of, on the display unit, an application launching user interface that includes a plurality of application icons for launching corresponding applications. While displaying the application launching user interface, the processing unit is configured to detect a first touch input that includes detecting a first contact at a location on the touch-sensitive surface unit that corresponds to a first application icon of the plurality of application icons, wherein the first application icon is an icon for launching a first application that is associated with one or more corresponding quick actions. In response to detecting the first touch input, in accordance with a determination that the first touch input meets one or more application-launch criteria, the processing unit is configured to launch the first application. In accordance with a determination that the first touch input meets one or more quick-action-display criteria, which include a criterion that is met when the characteristic intensity of the first contact increases above a respective intensity threshold, the processing unit is configured to concurrently enable display of one or more quick action objects associated with the first application along with the first application icon without launching the first application.
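The following non-limiting Swift sketch illustrates the choice between launching the application and displaying its quick action objects. The particular application-launch criterion used here (a tap that lifts off without significant movement) and all names are illustrative assumptions.

```swift
// Illustrative sketch only: handling a touch input on an application icon.
enum IconResponse { case launchApplication, displayQuickActions }

func respond(toPeakIntensity peakIntensity: Double,
             liftedOffWithoutMovement: Bool,
             quickActionIntensityThreshold: Double) -> IconResponse? {
    if peakIntensity >= quickActionIntensityThreshold {
        // Quick-action-display criteria: show quick action objects with the icon,
        // without launching the application.
        return .displayQuickActions
    } else if liftedOffWithoutMovement {
        // Application-launch criteria (assumed here to be a simple tap).
        return .launchApplication
    }
    return nil
}
```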
In accordance with some embodiments, a method is performed at an electronic device with a display and one or more input devices. The electronic device displays, on the display, a first user interface that includes a plurality of user interface objects, wherein a respective user interface object is associated with a corresponding set of menu options. The device detects, via the one or more input devices, a first input that corresponds to a request to display menu options for a first user interface object of the plurality of user interface objects. In response to detecting the first input, the device displays menu items in a menu that corresponds to the first user interface object. Displaying the menu includes, in accordance with a determination that the first user interface object is at a first location in the first user interface, displaying the menu items in the menu that corresponds to the first user interface object in a first order; and in accordance with a determination that the first user interface object is at a second location in the first user interface that is different from the first location, displaying the menu items in the menu that corresponds to the first user interface object in a second order that is different from the first order.
In accordance with some embodiments, an electronic device includes a display unit configured to display content items, one or more input devices configured to receive user inputs, and a processing unit coupled to the display unit and the one or more input devices. The processing unit is configured to enable display of, on the display unit, a first user interface that includes a plurality of user interface objects, wherein a respective user interface object is associated with a corresponding set of menu options. The processing unit is configured to detect, via the one or more input devices, a first input that corresponds to a request to display menu options for a first user interface object of the plurality of user interface objects. In response to detecting the first input, the processing unit is configured to enable display of menu items in a menu that corresponds to the first user interface object. Displaying the menu includes, in accordance with a determination that the first user interface object is at a first location in the first user interface, displaying the menu items in the menu that corresponds to the first user interface object in a first order, and in accordance with a determination that the first user interface object is at a second location in the first user interface that is different from the first location, displaying the menu items in the menu that corresponds to the first user interface object in a second order that is different from the first order.
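The following non-limiting Swift sketch illustrates ordering the same menu items differently depending on the location of the pressed object; the split into upper and lower halves, the sample menu options, and all names are illustrative assumptions.

```swift
// Illustrative sketch only: the same menu items in a first or second order,
// selected by where the corresponding user interface object is located.
enum ObjectLocation { case upperHalf, lowerHalf }

func orderedMenuItems(_ items: [String], for location: ObjectLocation) -> [String] {
    switch location {
    case .upperHalf: return items                    // first order
    case .lowerHalf: return Array(items.reversed())  // second, different order
    }
}

// Example with hypothetical menu options:
let menuItems = ["Share", "Copy", "Delete"]
print(orderedMenuItems(menuItems, for: .lowerHalf)) // ["Delete", "Copy", "Share"]
```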
In accordance with some embodiments, a method is performed at an electronic device with a display, a touch-sensitive surface, and one or more sensors to detect intensity of contacts with the touch-sensitive surface. The device displays, on the display, a user interface that includes a selectable user interface object that is associated with a plurality of actions for interacting with the user interface, wherein the plurality of actions include a direct-selection action and one or more other actions. While displaying the user interface that includes the selectable user interface object, the device detects an input that includes detecting a contact on the touch-sensitive surface while a focus selector is over the selectable user interface object. In response to detecting the input that includes detecting the contact: in accordance with a determination that the input meets selection criteria, the device displays, on the display, a menu that includes graphical representations of the plurality of actions that include the direct-selection action and the one or more other actions; and in accordance with a determination that the input meets direct-selection criteria, wherein the direct-selection criteria include a criterion that is met when a characteristic intensity of the contact increases above a respective intensity threshold, the device performs the direct-selection action.
In accordance with some embodiments, an electronic device includes a display unit configured to display content items, a touch-sensitive surface unit configured to receive user inputs, one or more sensor units configured to detect intensity of contacts with the touch-sensitive surface unit, and a processing unit coupled to the display unit, the touch-sensitive surface unit and the one or more sensor units. The processing unit is configured to enable display of, on the display unit, a user interface that includes a selectable user interface object that is associated with a plurality of actions for interacting with the user interface, wherein the plurality of actions include a direct-selection action and one or more other actions. While displaying the user interface that includes the selectable user interface object, the processing unit is configured to detect an input that includes detecting a contact on the touch-sensitive surface unit while a focus selector is over the selectable user interface object. In response to detecting the input that includes detecting the contact: in accordance with a determination that the input meets selection criteria, the processing unit is configured to enable display of, on the display unit, a menu that includes graphical representations of the plurality of actions that include the direct-selection action and the one or more other actions; and in accordance with a determination that the input meets direct-selection criteria, wherein the direct-selection criteria include a criterion that is met when a characteristic intensity of the contact increases above a respective intensity threshold, the processing unit is configured to perform the direct-selection action.
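The following non-limiting Swift sketch illustrates dispatching between displaying the menu of actions and performing the direct-selection action; the names and the particular selection criterion are illustrative assumptions.

```swift
// Illustrative sketch only: choice keyed to whether the contact's characteristic
// intensity crossed the respective intensity threshold.
enum SelectableObjectResponse { case displayMenu, performDirectSelectionAction }

func respond(toPeakIntensity peakIntensity: Double,
             selectionCriteriaMet: Bool,
             directSelectionIntensityThreshold: Double) -> SelectableObjectResponse? {
    if peakIntensity >= directSelectionIntensityThreshold {
        return .performDirectSelectionAction   // direct-selection criteria met
    } else if selectionCriteriaMet {
        return .displayMenu                    // e.g., a lighter press, assumed here
    }
    return nil
}
```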
There is a need for electronic devices with improved methods and interfaces for teaching new user interface capabilities and features to the user, such as new contact-intensity based capabilities and features. Such methods and interfaces optionally complement or replace conventional methods for teaching new user interface capabilities and features to the user. Such methods reduce the number, extent, and/or nature of the inputs from a user and produce a more efficient human-machine interface. For battery-operated devices, such methods and interfaces conserve power and increase the time between battery charges.
In accordance with some embodiments, a method is performed at an electronic device with a display, a touch-sensitive surface, and one or more sensors to detect intensity of contacts with the touch-sensitive surface. The device displays, on the display, a user interface that includes a plurality of user interface objects that are associated with respective object-specific operations that are triggered by changes in contact intensity, wherein the plurality of user interface objects include a first object displayed at a first location in the user interface and a second object displayed at a second location in the user interface. While displaying the user interface that includes the plurality of user interface objects, the device detects a first input that includes detecting a first contact on the touch-sensitive surface and detecting an increase in a characteristic intensity of the first contact above a first intensity threshold. In response to detecting the first input: in accordance with a determination that a focus selector is at the first location in the user interface at which the first object is displayed, the device performs a first operation associated with the first object that includes displaying, on the display, additional information associated with the first object; in accordance with a determination that a focus selector is at the second location in the user interface at which the second object is displayed, the device performs a second operation associated with the second object that includes displaying, on the display, additional information associated with the second object, wherein the second operation associated with the second object is distinct from the first operation associated with the first object; and in accordance with a determination that a focus selector is at a location in the user interface that is away from any objects that are associated with object-specific operations that are triggered by changes in contact intensity, the device performs a third operation that includes updating the user interface on the display to concurrently visually distinguish the first and second objects in the user interface.
In accordance with some embodiments, an electronic device includes a display unit configured to display user interfaces and user interface objects, a touch-sensitive surface unit configured to receive user inputs, one or more sensor units configured to detect intensity of contacts with the touch-sensitive surface unit, and a processing unit coupled to the display unit, the touch-sensitive surface unit and the one or more sensor units. The processing unit is configured to: enable display of, on the display unit, a user interface that includes a plurality of user interface objects that are associated with respective object-specific operations that are triggered by changes in contact intensity, wherein the plurality of user interface objects include a first object displayed at a first location in the user interface and a second object displayed at a second location in the user interface; while displaying the user interface that includes the plurality of user interface objects, detect a first input that includes detecting a first contact on the touch-sensitive surface unit and detecting an increase in a characteristic intensity of the first contact above a first intensity threshold; and in response to detecting the first input: in accordance with a determination that a focus selector is at the first location in the user interface at which the first object is displayed, perform a first operation associated with the first object that includes displaying, on the display unit, additional information associated with the first object; in accordance with a determination that a focus selector is at the second location in the user interface at which the second object is displayed, perform a second operation associated with the second object that includes displaying, on the display unit, additional information associated with the second object, wherein the second operation associated with the second object is distinct from the first operation associated with the first object; and in accordance with a determination that a focus selector is at a location in the user interface that is away from any objects that are associated with object-specific operations that are triggered by changes in contact intensity, perform a third operation that includes updating the user interface on the display unit to concurrently visually distinguish the first and second objects in the user interface.
In accordance with some embodiments, a method is performed at an electronic device with a display, a touch-sensitive surface, and one or more sensors to detect intensity of contacts with the touch-sensitive surface. The device displays a user interface on the display, wherein the user interface includes a first set of user interface elements; for a respective user interface element in the first set of user interface elements, the device is configured to respond to user input of a first input type at a location that corresponds to the respective user interface element by performing a plurality of operations that correspond to the respective user interface element; and, for a remainder of the user interface, the device is not configured to respond to user input of the first input type at a location that corresponds to a user interface element in the remainder of the user interface by performing a plurality of operations that correspond to the user interface element in the remainder of the user interface. The device detects a first user input of the first input type while a focus selector is at a first location in the user interface. In response to detecting the first user input of the first input type while the focus selector is at the first location in the user interface, in accordance with a determination that the first location corresponds to a first user interface element in the first set of user interface elements, the device performs a plurality of operations that correspond to the first user interface element; and, in accordance with a determination that the first location does not correspond to any user interface elements in the first set of user interface elements, the device applies a visual effect to distinguish the first set of user interface elements from the remainder of the user interface on the display.
In accordance with some embodiments, an electronic device includes a display unit configured to display user interfaces and user interface elements, a touch-sensitive surface unit configured to receive user inputs, one or more sensor units configured to detect intensity of contacts with the touch-sensitive surface unit, and a processing unit coupled to the display unit, the touch-sensitive surface unit and the one or more sensor units. The processing unit is configured to enable display of a user interface on the display unit, wherein the user interface includes a first set of user interface elements; for a respective user interface element in the first set of user interface elements, the device is configured to respond to user input of a first input type at a location that corresponds to the respective user interface element by performing a plurality of operations that correspond to the respective user interface element; and, for a remainder of the user interface, the device is not configured to respond to user input of the first input type at a location that corresponds to a user interface element in the remainder of the user interface by performing a plurality of operations that correspond to the user interface element in the remainder of the user interface. The processing unit is configured to detect a first user input of the first input type while a focus selector is at a first location in the user interface; and in response to detecting the first user input of the first input type while the focus selector is at the first location in the user interface, in accordance with a determination that the first location corresponds to a first user interface element in the first set of user interface elements, perform a plurality of operations that correspond to the first user interface element, and in accordance with a determination that the first location does not correspond to any user interface elements in the first set of user interface elements, apply a visual effect to distinguish the first set of user interface elements from the remainder of the user interface on the display unit.
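The following non-limiting Swift sketch illustrates the discovery behavior described in the preceding paragraphs: a deep press on an intensity-capable element performs that element's object-specific operation, while a deep press elsewhere visually distinguishes the set of intensity-capable elements. All names are illustrative assumptions.

```swift
// Illustrative sketch only: a hypothetical dispatcher for deep presses on a user
// interface that contains both intensity-capable and ordinary elements.
struct Element { let identifier: String; let hasIntensityOperations: Bool }

enum PressResult {
    case performObjectSpecificOperation(onElement: String)
    case visuallyDistinguish(elements: [String])
}

func handleDeepPress(at elementUnderFocusSelector: Element?,
                     allElements: [Element]) -> PressResult {
    if let element = elementUnderFocusSelector, element.hasIntensityOperations {
        return .performObjectSpecificOperation(onElement: element.identifier)
    }
    // The focus selector is away from any intensity-capable element:
    // highlight the set of elements that do respond to this input type.
    let capable = allElements.filter { $0.hasIntensityOperations }.map { $0.identifier }
    return .visuallyDistinguish(elements: capable)
}
```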
Thus, electronic devices with displays, touch-sensitive surfaces and one or more sensors to detect intensity of contacts with the touch-sensitive surface are provided with fast, efficient methods and interfaces that indicate which user interface elements have contact intensity based capabilities and features, thereby increasing the effectiveness, efficiency, and user satisfaction with such devices. Such methods and interfaces may complement or replace conventional methods for teaching new capabilities and functionalities (e.g., force or pressure sensitive user interface elements) to the user.
There is a need for electronic devices with improved methods and interfaces for previewing media content. Such methods and interfaces optionally complement or replace conventional methods for previewing media content. Such methods reduce the number, extent, and/or nature of the inputs from a user and produce a more efficient human-machine interface. For battery-operated devices, such methods and interfaces conserve power and increase the time between battery charges.
In accordance with some embodiments, a method is performed at an electronic device with a display, a touch-sensitive surface, and one or more sensors for detecting intensity of contacts on the touch-sensitive surface. The method includes displaying, on the display, a user interface that includes a plurality of media objects that include a first media object that represents a first set of one or more media items and a second media object that represents a second set of one or more media items, wherein the first set of media items is different from the second set of media items. The method further includes, while a focus selector is over the first media object, detecting an input that includes movement of a contact on the touch-sensitive surface. The method further includes, in response to detecting the input that includes the movement of the contact on the touch-sensitive surface: in accordance with a determination that the input meets media preview criteria, wherein the media preview criteria includes a criterion that is met when the input includes an increase in a characteristic intensity of the contact above a media-preview intensity threshold while the focus selector is over the first media object, outputting a preview of a media item from the first set of media items and, in response to detecting the movement of the contact, ceasing to output the preview of the media item from the first set of media items, and outputting a preview of a media item from the second set of media items; and, in accordance with a determination that the input does not meet the media preview criteria, moving the first media object and the second media object on the display in accordance with the movement of the contact on the touch-sensitive surface.
In accordance with some embodiments, an electronic device includes a display unit configured to display a user interface, a touch-sensitive surface unit configured to receive contacts, one or more sensor units configured to detect intensity of contacts with the touch-sensitive surface unit, and a processing unit coupled to the display unit, the touch-sensitive surface unit, and the one or more sensor units. The processing unit is configured to enable display, on the display unit, of a user interface that includes a plurality of media objects that include a first media object that represents a first set of one or more media items and a second media object that represents a second set of one or more media items, wherein the first set of media items is different from the second set of media items. The processing unit is configured to, while a focus selector is over the first media object, detect an input that includes movement of a contact on the touch-sensitive surface unit; and in response to detecting the input that includes the movement of the contact on the touch-sensitive surface unit: in accordance with a determination that the input meets media preview criteria, wherein the media preview criteria includes a criterion that is met when the input includes an increase in a characteristic intensity of the contact above a media-preview intensity threshold while the focus selector is over the first media object, output a preview of a media item from the first set of media items, and, in response to detecting the movement of the contact, cease to output the preview of the media item from the first set of media items and output a preview of a media item from the second set of media items; and, in accordance with a determination that the input does not meet the media preview criteria, move the first media object and the second media object on the display unit in accordance with the movement of the contact on the touch-sensitive surface unit.
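The following non-limiting Swift sketch illustrates the media-preview dispatch described in the preceding paragraphs: when the media preview criteria are met, the preview follows whichever media object is under the contact as it moves; otherwise the movement scrolls the media objects. The names are illustrative assumptions.

```swift
// Illustrative sketch only: handling an input over a row of media objects.
enum MediaInputResult {
    case previewItem(fromSet: Int)   // index of the media object being previewed
    case scrollBy(offset: Double)    // move the media objects with the contact
}

func handleMediaInput(peakIntensity: Double,
                      mediaPreviewIntensityThreshold: Double,
                      mediaObjectIndexUnderFocusSelector: Int,
                      contactMovement: Double) -> MediaInputResult {
    if peakIntensity >= mediaPreviewIntensityThreshold {
        // Media preview criteria met: output a preview of an item from the set
        // represented by whichever media object is currently under the contact.
        return .previewItem(fromSet: mediaObjectIndexUnderFocusSelector)
    }
    // Criteria not met: the movement scrolls the media objects instead.
    return .scrollBy(offset: contactMovement)
}
```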
Thus, electronic devices with displays, touch-sensitive surfaces and one or more sensors to detect intensity of contacts with the touch-sensitive surface are provided with faster, more efficient methods and interfaces for previewing media content, thereby increasing the effectiveness, efficiency, and user satisfaction with such devices. Such methods and interfaces may complement or replace conventional methods for previewing media content.
In accordance with some embodiments, a method is performed at an electronic device with a display, a touch-sensitive surface, and one or more sensors to detect intensity of contacts with the touch-sensitive surface. The method includes: displaying, on the display, a first portion of paginated content in a user interface, wherein: the paginated content includes a plurality of sections; a respective section in the plurality of sections includes a respective plurality of pages; the first portion of the paginated content is part of a first section of the plurality of sections; and the first portion of the paginated content lies between a sequence of prior pages in the first section and a sequence of later pages in the first section; while a focus selector is within a first predefined region of the displayed first portion of the paginated content on the display, detecting a first portion of an input, wherein detecting the first portion of the input includes detecting a contact on the touch-sensitive surface; in response to detecting the first portion of the input: in accordance with a determination that the first portion of the input meets first content-navigation criteria, wherein the first content-navigation criteria include a criterion that is met when the device detects a lift-off of the contact from the touch-sensitive surface before a characteristic intensity of the contact reaches a first intensity threshold, replacing the displayed first portion of the paginated content with a second portion of the paginated content on the display, wherein the second portion of the paginated content includes a page that is sequentially adjacent to the first portion of the paginated content; and, in accordance with a determination that the first portion of the input meets second content-navigation criteria, wherein the second content-navigation criteria include a criterion that is met when the device detects an increase in the characteristic intensity of the contact above the first intensity threshold while the focus selector is within the first predefined region of the displayed first portion of the paginated content, displaying an indication of a quantity of pages within the sequence of later pages in the first section or displaying an indication of a quantity of pages within the sequence of prior pages in the first section.
In accordance with some embodiments, an electronic device includes a display unit configured to display content items, a touch-sensitive surface unit configured to receive user inputs, one or more sensor units configured to detect intensity of contacts with the touch-sensitive surface unit, and a processing unit coupled to the display unit, the touch-sensitive surface unit and the one or more sensor units. The processing unit is configured to: enable display, on the display unit, of a first portion of paginated content in a user interface, wherein: the paginated content includes a plurality of sections; a respective section in the plurality of sections includes a respective plurality of pages; the first portion of the paginated content is part of a first section of the plurality of sections; and the first portion of the paginated content lies between a sequence of prior pages in the first section and a sequence of later pages in the first section; while a focus selector is within a first predefined region of the displayed first portion of the paginated content on the display, detect a first portion of an input, wherein detecting the first portion of the input includes detecting a contact on the touch-sensitive surface; in response to detecting the first portion of the input: in accordance with a determination that the first portion of the input meets first content-navigation criteria, wherein the first content-navigation criteria include a criterion that is met when the device detects a lift-off of the contact from the touch-sensitive surface before a characteristic intensity of the contact reaches a first intensity threshold, replace the displayed first portion of the paginated content with a second portion of the paginated content on the display, wherein the second portion of the paginated content includes a page that is sequentially adjacent to the first portion of the paginated content; and, in accordance with a determination that the first portion of the input meets second content-navigation criteria, wherein the second content-navigation criteria include a criterion that is met when the device detects an increase in the characteristic intensity of the contact above the first intensity threshold while the focus selector is within the first predefined region of the displayed first portion of the paginated content, enable display of an indication of a quantity of pages within the sequence of later pages in the first section or enable display of an indication of a quantity of pages within the sequence of prior pages in the first section.
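The following non-limiting Swift sketch illustrates the two content-navigation outcomes described in the preceding paragraphs; the names and the threshold are illustrative assumptions.

```swift
// Illustrative sketch only: a light tap-and-release turns to the adjacent page,
// while a press above the first intensity threshold instead reveals how many
// pages remain in the current section.
enum PageNavigationResult {
    case showAdjacentPage
    case showRemainingPageCount(Int)
}

func handlePageInput(peakIntensity: Double,
                     liftedOff: Bool,
                     firstIntensityThreshold: Double,
                     laterPagesInSection: Int) -> PageNavigationResult? {
    if peakIntensity >= firstIntensityThreshold {
        // Second content-navigation criteria: indicate the quantity of pages.
        return .showRemainingPageCount(laterPagesInSection)
    } else if liftedOff {
        // First content-navigation criteria: lift-off before reaching the threshold.
        return .showAdjacentPage
    }
    return nil
}
```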
There is a need for electronic devices with improved methods and interfaces for displaying contextual information associated with a point of interest in a map. Such methods and interfaces optionally complement or replace conventional methods for displaying contextual information associated with a point of interest in a map. Such methods reduce the number, extent, and/or nature of the inputs from a user and produce a more efficient human-machine interface. For battery-operated devices, such methods and interfaces conserve power and increase the time between battery charges.
In accordance with some embodiments, a method is performed at an electronic device with a display, a touch-sensitive surface, and one or more sensors for detecting intensity of contacts on the touch-sensitive surface. The method includes, displaying, in a first user interface on the display, a view of a map that includes a plurality of points of interest. The method further includes, while displaying the view of the map that includes the plurality of points of interest, and while a focus selector is at a location of a respective point of interest, detecting an increase in a characteristic intensity of the contact on the touch-sensitive surface above a preview intensity threshold. The method further includes, in response to detecting the increase in the characteristic intensity of the contact above the preview intensity threshold, zooming the map to display contextual information near the respective point of interest. The method further includes, after zooming the map, detecting a respective input that includes detecting a decrease in the characteristic intensity of the contact on the touch-sensitive surface below a predefined intensity threshold; and in response to detecting the respective input that includes detecting the decrease in the characteristic intensity of the contact: in accordance with a determination that the characteristic intensity of the contact increased above a maintain-context intensity threshold before detecting the respective input, continuing to display the contextual information near the respective point of interest; and, in accordance with a determination that the characteristic intensity of the contact did not increase above the maintain-context intensity threshold before detecting the respective input, ceasing to display the contextual information near the point of interest and redisplaying the view of the map that includes the plurality of points of interest.
In accordance with some embodiments, an electronic device includes a display unit; a touch-sensitive surface unit; one or more sensor units for detecting intensity of contacts on the touch-sensitive surface unit; and a processing unit coupled to the display unit, the touch-sensitive surface unit, and the one or more sensor units. The processing unit is configured to: enable display, in a first user interface on the display unit, of a view of a map that includes a plurality of points of interest; while enabling display of the view of the map that includes the plurality of points of interest, and while a focus selector is at a location of a respective point of interest, detect an increase in a characteristic intensity of the contact on the touch-sensitive surface above a preview intensity threshold; in response to detecting the increase in the characteristic intensity of the contact above the preview intensity threshold, zoom the map to display contextual information near the respective point of interest; after zooming the map, detect a respective input that includes detecting a decrease in the characteristic intensity of the contact on the touch-sensitive surface below a predefined intensity threshold; and in response to detecting the respective input that includes detecting the decrease in the characteristic intensity of the contact: in accordance with a determination that the characteristic intensity of the contact increased above a maintain-context intensity threshold before detecting the respective input, continue to enable display of the contextual information near the respective point of interest; and in accordance with a determination that the characteristic intensity of the contact did not increase above the maintain-context intensity threshold before detecting the respective input, cease to enable display of the contextual information near the point of interest and redisplay the view of the map that includes the plurality of points of interest.
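By way of illustration only, the preview and maintain-context behavior described in the two preceding paragraphs can be sketched as a small state transition in Swift. The state names and the numeric threshold values below are hypothetical and are not drawn from this disclosure; they simply show that crossing the preview threshold zooms the map, and that whether the zoomed contextual information persists after the intensity decreases depends on whether the press ever exceeded the maintain-context threshold.

enum MapState { case overview, zoomedPreview, zoomedPersisted }

struct PressState {
    let peakIntensity: Double    // highest characteristic intensity observed for the contact
    let currentIntensity: Double
}

func updatedMapState(from state: MapState,
                     press: PressState,
                     previewThreshold: Double = 0.5,
                     maintainContextThreshold: Double = 0.8,
                     releaseThreshold: Double = 0.1) -> MapState {
    switch state {
    case .overview where press.currentIntensity > previewThreshold:
        // Crossing the preview intensity threshold zooms the map around the point of interest.
        return .zoomedPreview
    case .zoomedPreview where press.currentIntensity < releaseThreshold:
        // On the intensity decrease, keep the contextual information only if the press
        // exceeded the maintain-context intensity threshold at some point; otherwise
        // return to the original view of the map.
        return press.peakIntensity > maintainContextThreshold ? .zoomedPersisted : .overview
    default:
        return state
    }
}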
Thus, electronic devices with displays, touch-sensitive surfaces and one or more sensors to detect intensity of contacts with the touch-sensitive surface are provided with faster, more efficient methods and interfaces for displaying contextual information associated with a point of interest in a map, thereby increasing the effectiveness, efficiency, and user satisfaction with such devices. Such methods and interfaces may complement or replace conventional methods for displaying contextual information associated with a point of interest in a map.
There is a need for electronic devices with improved methods and interfaces for zooming a map to display contextual information near a point of interest. Such methods and interfaces optionally complement or replace conventional methods for zooming a map. Such methods reduce the number, extent, and/or nature of the inputs from a user and produce a more efficient human-machine interface. For battery-operated devices, such methods and interfaces conserve power and increase the time between battery charges.
In accordance with some embodiments, a method is performed at an electronic device with a display, a touch-sensitive surface, and one or more sensors for detecting intensity of contacts on the touch-sensitive surface. The method includes: concurrently displaying in a user interface on the display: a map view that includes a plurality of points of interest, and a context region that is distinct from the map view and includes a representation of a first point of interest from the plurality of points of interest and a representation of a second point of interest from the plurality of points of interest. The method further includes, while concurrently displaying the map view and the context region on the display, detecting an increase in a characteristic intensity of a contact on the touch-sensitive surface above a respective intensity threshold. The method further includes, in response to detecting the increase in the characteristic intensity of the contact above the respective intensity threshold: in accordance with a determination that a focus selector was at a location of the representation of the first point of interest in the context region when the increase in the characteristic intensity of the contact above the respective intensity threshold was detected, zooming the map view to display respective contextual information for the first point of interest around the first point of interest in the map view; and in accordance with a determination that the focus selector was at a location of the representation of the second point of interest in the context region when the increase in the characteristic intensity of the contact above the respective intensity threshold was detected, zooming the map view to display respective contextual information for the second point of interest around the second point of interest in the map view.
In accordance with some embodiments, an electronic device includes a display unit; a touch-sensitive surface unit; one or more sensor units for detecting intensity of contacts on the touch-sensitive surface; and a processing unit coupled to the display unit, the touch-sensitive surface unit, and the one or more sensor units, the processing unit configured to: enable concurrent display, in a user interface on the display unit, of: a map view that includes a plurality of points of interest, and a context region that is distinct from the map view and includes a representation of a first point of interest from the plurality of points of interest and a representation of a second point of interest from the plurality of points of interest; while enabling concurrent display of the map view and the context region on the display unit, detect an increase in a characteristic intensity of a contact on the touch-sensitive surface unit above a respective intensity threshold; and in response to detecting the increase in the characteristic intensity of the contact above the respective intensity threshold: in accordance with a determination that a focus selector was at a location of the representation of the first point of interest in the context region when the increase in the characteristic intensity of the contact above the respective intensity threshold was detected, zoom the map view to display respective contextual information for the first point of interest around the first point of interest in the map view; and in accordance with a determination that the focus selector was at a location of the representation of the second point of interest in the context region when the increase in the characteristic intensity of the contact above the respective intensity threshold was detected, zoom the map view to display respective contextual information for the second point of interest around the second point of interest in the map view.
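By way of illustration only, the selection logic described in the two preceding paragraphs can be sketched in Swift: when the characteristic intensity rises above the respective threshold, the map is zoomed around whichever representation in the context region the focus selector is over. The type names, the index-based focus representation, and the threshold value are hypothetical and are not drawn from this disclosure.

struct PointOfInterest { let name: String }

struct ContextRegion {
    let representations: [PointOfInterest]
    // Returns the representation under the focus selector, if any.
    func representation(atFocusIndex index: Int?) -> PointOfInterest? {
        guard let index = index, representations.indices.contains(index) else { return nil }
        return representations[index]
    }
}

func zoomTarget(characteristicIntensity: Double,
                respectiveIntensityThreshold: Double = 0.5,
                focusIndex: Int?,
                region: ContextRegion) -> PointOfInterest? {
    // Only an intensity increase above the respective threshold triggers zooming.
    guard characteristicIntensity > respectiveIntensityThreshold else { return nil }
    // Zoom around whichever representation the focus selector was over when the
    // threshold was crossed (the first, the second, or none).
    return region.representation(atFocusIndex: focusIndex)
}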
Thus, electronic devices with displays, touch-sensitive surfaces and one or more sensors to detect intensity of contacts with the touch-sensitive surface are provided with faster, more efficient methods and interfaces for zooming a map, thereby increasing the effectiveness, efficiency, and user satisfaction with such devices. Such methods and interfaces may complement or replace conventional methods for zooming a map.
There is a need for electronic devices with improved methods and interfaces for displaying and using a menu that includes contact information. Such methods and interfaces optionally complement or replace conventional methods for displaying and using a menu that includes contact information. Such methods reduce the number, extent, and/or nature of the inputs from a user and produce a more efficient human-machine interface. For battery-operated devices, such methods and interfaces conserve power and increase the time between battery charges.
In accordance with some embodiments, a method is performed at an electronic device with a display, a touch-sensitive surface, and one or more sensors to detect intensity of contacts with the touch-sensitive surface. The method includes: displaying, on the display, a first user interface that includes a plurality of selectable objects that are associated with contact information; while displaying the plurality of selectable objects and while a focus selector is at a location that corresponds to a respective selectable object, detecting an input that includes detecting a contact on the touch-sensitive surface; and in response to detecting the input: in accordance with a determination that detecting the input includes detecting an increase in intensity of the contact that meets intensity criteria, the intensity criteria including a criterion that is met when a characteristic intensity of the contact increases above a respective intensity threshold, displaying a menu for the respective selectable object that includes the contact information for the respective selectable object overlaid on top of the first user interface that includes the plurality of selectable objects; and in accordance with a determination that detecting the input includes detecting a liftoff of the contact without meeting the intensity criteria, replacing display of the first user interface that includes the plurality of selectable objects with display of a second user interface that is associated with the respective selectable object.
In accordance with some embodiments, an electronic device includes a display unit configured to display a user interface; a touch-sensitive surface unit configured to receive user inputs; one or more sensor units configured to detect intensity of contacts with the touch-sensitive surface unit; and a processing unit coupled to the display unit, the touch-sensitive surface unit and the one or more sensor units. The processing unit is configured to: enable display, on the display unit, of a first user interface that includes a plurality of selectable objects that are associated with contact information; while enabling display of the plurality of selectable objects and while a focus selector is at a location that corresponds to a respective selectable object, detect an input that includes detecting a contact on the touch-sensitive surface unit; and in response to detecting the input: in accordance with a determination that detecting the input includes detecting an increase in intensity of the contact that meets intensity criteria, the intensity criteria including a criterion that is met when a characteristic intensity of the contact increases above a respective intensity threshold, enable display of a menu for the respective selectable object that includes the contact information for the respective selectable object overlaid on top of the first user interface that includes the plurality of selectable objects; and in accordance with a determination that detecting the input includes detecting a liftoff of the contact without meeting the intensity criteria, replace display of the first user interface that includes the plurality of selectable objects with display of a second user interface that is associated with the respective selectable object.
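By way of illustration only, the press-versus-tap branch described in the two preceding paragraphs can be sketched in Swift. The names and the threshold value below are hypothetical and are not drawn from this disclosure; the sketch simply distinguishes an intensity increase that meets the intensity criteria (overlay a menu with the contact information) from a lift-off that does not (navigate to the second user interface associated with the object).

enum SelectableObjectResponse {
    case showContactMenuOverlay   // intensity criteria met: overlay the menu
    case openAssociatedInterface  // lift-off without meeting the intensity criteria
    case none
}

func respondToInput(peakCharacteristicIntensity: Double,
                    liftedOff: Bool,
                    respectiveIntensityThreshold: Double = 0.5) -> SelectableObjectResponse {
    if peakCharacteristicIntensity > respectiveIntensityThreshold {
        // Intensity criteria met: display the menu with the object's contact information
        // overlaid on the first user interface.
        return .showContactMenuOverlay
    } else if liftedOff {
        // Tap-like input: replace the first user interface with the second user interface
        // associated with the selectable object.
        return .openAssociatedInterface
    }
    return .none
}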
Thus, electronic devices with displays, touch-sensitive surfaces, and one or more sensors to detect intensity of contacts with the touch-sensitive surface are provided with faster, more efficient methods and interfaces for displaying a menu that includes contact information, thereby increasing the effectiveness, efficiency, and user satisfaction with such devices. Such methods and interfaces may complement or replace conventional methods for displaying a menu that includes contact information.
In accordance with some embodiments, an electronic device includes a display, a touch-sensitive surface, optionally one or more sensors to detect intensity of contacts with the touch-sensitive surface, one or more processors, memory, and one or more programs; the one or more programs are stored in the memory and configured to be executed by the one or more processors and the one or more programs include instructions for performing or causing performance of the operations of any of the methods described herein. In accordance with some embodiments, a computer readable storage medium has stored therein instructions which when executed by an electronic device with a display, a touch-sensitive surface, and optionally one or more sensors to detect intensity of contacts with the touch-sensitive surface, cause the device to perform or cause performance of the operations of any of the methods described herein. In accordance with some embodiments, a graphical user interface on an electronic device with a display, a touch-sensitive surface, optionally one or more sensors to detect intensity of contacts with the touch-sensitive surface, a memory, and one or more processors to execute one or more programs stored in the memory includes one or more of the elements displayed in any of the methods described herein, which are updated in response to inputs, as described in any of the methods described herein. In accordance with some embodiments, an electronic device includes: a display, a touch-sensitive surface, and optionally one or more sensors to detect intensity of contacts with the touch-sensitive surface; and means for performing or causing performance of the operations of any of the methods described herein. In accordance with some embodiments, an information processing apparatus, for use in an electronic device with a display and a touch-sensitive surface, and optionally one or more sensors to detect intensity of contacts with the touch-sensitive surface, includes means for performing or causing performance of the operations of any of the methods described herein.
Thus, electronic devices with displays, touch-sensitive surfaces and optionally one or more sensors to detect intensity of contacts with the touch-sensitive surface are provided with faster, more efficient methods and interfaces for manipulating user interfaces, thereby increasing the effectiveness, efficiency, and user satisfaction with such devices. Such methods and interfaces may complement or replace conventional methods for manipulating user interfaces.
For a better understanding of the various described embodiments, reference should be made to the Description of Embodiments below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.
The methods, devices and GUIs described herein provide visual and/or haptic feedback that makes manipulation of user interface objects more efficient and intuitive for a user.
In some embodiments, in a system where a trackpad or touch-screen display is sensitive to a range of contact intensity that includes more than one or two specific intensity values (e.g., more than a simple on/off, binary intensity determination), the user interface provides responses (e.g., visual and/or tactile cues) that are indicative of the intensity of the contact within the range. This provides a user with a continuous response to the force or pressure of a user's contact, which provides a user with visual and/or haptic feedback that is richer and more intuitive. For example, such continuous force responses give the user the experience of being able to press lightly to preview an operation and/or press deeply to push to a predefined user interface state corresponding to the operation.
In some embodiments, for a device with a touch-sensitive surface that is sensitive to a range of contact intensity, multiple contact intensity thresholds are monitored by the device and different responses are mapped to different contact intensity thresholds.
In some embodiments, for a device with a touch-sensitive surface that is sensitive to a range of contact intensity, the device provides additional functionality by allowing users to perform complex operations with a single continuous contact.
In some embodiments, for a device with a touch-sensitive surface that is sensitive to a range of contact intensity, the device provides additional functionality that complements conventional functionality. For example, additional functions provided by intensity-based inputs (e.g., user interface previews and/or navigation shortcuts provided by light-press and/or deep-press gestures) are seamlessly integrated with conventional functions provided by conventional tap and swipe gestures. A user can continue to use conventional gestures to perform conventional functions (e.g., tapping on an application icon on a home screen to launch the corresponding application), without accidentally activating the additional functions. Yet it is also simple for a user to discover, understand, and use the intensity-based inputs and their added functionality (e.g., pressing on an application icon on a home screen to bring up a quick action menu for the application and then lifting off on a menu item to perform an action within the application).
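By way of illustration only, the idea of mapping a monitored characteristic intensity onto escalating responses, while leaving inputs below all thresholds to conventional tap and swipe handling, can be sketched in Swift. The response names and threshold values below are hypothetical and are not drawn from this disclosure.

enum IntensityResponse { case conventional, hint, preview, commit }

func response(forCharacteristicIntensity intensity: Double) -> IntensityResponse {
    // Illustrative threshold values only; an actual device would calibrate these.
    let hintThreshold = 0.2
    let lightPressThreshold = 0.5
    let deepPressThreshold = 0.8

    if intensity >= deepPressThreshold { return .commit }    // deep press
    if intensity >= lightPressThreshold { return .preview }  // light press
    if intensity >= hintThreshold { return .hint }
    return .conventional  // below all thresholds: conventional tap/swipe handling applies
}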
A number of different approaches for manipulating user interfaces are described herein. Using one or more of these approaches (optionally in conjunction with each other) helps to provide a user interface that intuitively provides users with additional information and functionality. Using one or more of these approaches (optionally in conjunction with each other) reduces the number, extent, and/or nature of the inputs from a user and provides a more efficient human-machine interface. This enables users to use devices that have touch-sensitive surfaces faster and more efficiently. For battery-operated devices, these improvements conserve power and increase the time between battery charges.
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the various described embodiments. However, it will be apparent to one of ordinary skill in the art that the various described embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
It will also be understood that, although the terms first, second, etc. are, in some instances, used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first contact could be termed a second contact, and, similarly, a second contact could be termed a first contact, without departing from the scope of the various described embodiments. The first contact and the second contact are both contacts, but they are not the same contact, unless the context clearly indicates otherwise.
The terminology used in the description of the various described embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “includes,” “including,” “comprises,” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used herein, the term “if” is, optionally, construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” is, optionally, construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context.
Embodiments of electronic devices, user interfaces for such devices, and associated processes for using such devices are described. In some embodiments, the device is a portable communications device, such as a mobile telephone, that also contains other functions, such as PDA and/or music player functions. Exemplary embodiments of portable multifunction devices include, without limitation, the iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, Calif. Other portable electronic devices, such as laptops or tablet computers with touch-sensitive surfaces (e.g., touch-screen displays and/or touchpads), are, optionally, used. It should also be understood that, in some embodiments, the device is not a portable communications device, but is a desktop computer with a touch-sensitive surface (e.g., a touch-screen display and/or a touchpad).
In the discussion that follows, an electronic device that includes a display and a touch-sensitive surface is described. It should be understood, however, that the electronic device optionally includes one or more other physical user-interface devices, such as a physical keyboard, a mouse and/or a joystick.
The device typically supports a variety of applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disk authoring application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a workout support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
The various applications that are executed on the device optionally use at least one common physical user-interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface as well as corresponding information displayed on the device are, optionally, adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture (such as the touch-sensitive surface) of the device optionally supports the variety of applications with user interfaces that are intuitive and transparent to the user.
Attention is now directed toward embodiments of portable devices with touch-sensitive displays.
As used in the specification and claims, the term “tactile output” refers to physical displacement of a device relative to a previous position of the device, physical displacement of a component (e.g., a touch-sensitive surface) of a device relative to another component (e.g., housing) of the device, or displacement of the component relative to a center of mass of the device that will be detected by a user with the user's sense of touch. For example, in situations where the device or the component of the device is in contact with a surface of a user that is sensitive to touch (e.g., a finger, palm, or other part of a user's hand), the tactile output generated by the physical displacement will be interpreted by the user as a tactile sensation corresponding to a perceived change in physical characteristics of the device or the component of the device. For example, movement of a touch-sensitive surface (e.g., a touch-sensitive display or trackpad) is, optionally, interpreted by the user as a “down click” or “up click” of a physical actuator button. In some cases, a user will feel a tactile sensation such as a “down click” or “up click” even when there is no movement of a physical actuator button associated with the touch-sensitive surface that is physically pressed (e.g., displaced) by the user's movements. As another example, movement of the touch-sensitive surface is, optionally, interpreted or sensed by the user as “roughness” of the touch-sensitive surface, even when there is no change in smoothness of the touch-sensitive surface. While such interpretations of touch by a user will be subject to the individualized sensory perceptions of the user, there are many sensory perceptions of touch that are common to a large majority of users. Thus, when a tactile output is described as corresponding to a particular sensory perception of a user (e.g., an “up click,” a “down click,” “roughness”), unless otherwise stated, the generated tactile output corresponds to physical displacement of the device or a component thereof that will generate the described sensory perception for a typical (or average) user.
It should be appreciated that device 100 is only one example of a portable multifunction device, and that device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of the components. The various components shown in
Memory 102 optionally includes high-speed random access memory and optionally also includes non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 102 by other components of device 100, such as CPU(s) 120 and the peripherals interface 118, is, optionally, controlled by memory controller 122.
Peripherals interface 118 can be used to couple input and output peripherals of the device to CPU(s) 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for device 100 and to process data.
In some embodiments, peripherals interface 118, CPU(s) 120, and memory controller 122 are, optionally, implemented on a single chip, such as chip 104. In some other embodiments, they are, optionally, implemented on separate chips.
RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals. RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. RF circuitry 108 optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. RF circuitry 108 optionally communicates with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. The wireless communication optionally uses any of a plurality of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), Evolution, Data-Only (EV-DO), HSPA, HSPA+, Dual-Cell HSPA (DC-HSPA), long term evolution (LTE), near field communication (NFC), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11ac, IEEE 802.11ax, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for e-mail (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between a user and device 100. Audio circuitry 110 receives audio data from peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to speaker 111. Speaker 111 converts the electrical signal to human-audible sound waves. Audio circuitry 110 also receives electrical signals converted by microphone 113 from sound waves. Audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to peripherals interface 118 for processing. Audio data is, optionally, retrieved from and/or transmitted to memory 102 and/or RF circuitry 108 by peripherals interface 118. In some embodiments, audio circuitry 110 also includes a headset jack (e.g., 212,
I/O subsystem 106 couples input/output peripherals on device 100, such as touch-sensitive display system 112 and other input or control devices 116, with peripherals interface 118. I/O subsystem 106 optionally includes display controller 156, optical sensor controller 158, intensity sensor controller 159, haptic feedback controller 161, and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/send electrical signals from/to other input or control devices 116. The other input or control devices 116 optionally include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) 160 are, optionally, coupled with any (or none) of the following: a keyboard, infrared port, USB port, stylus, and/or a pointer device such as a mouse. The one or more buttons (e.g., 208,
Touch-sensitive display system 112 provides an input interface and an output interface between the device and a user. Display controller 156 receives and/or sends electrical signals from/to touch-sensitive display system 112. Touch-sensitive display system 112 displays visual output to the user. The visual output optionally includes graphics, text, icons, video, and any combination thereof (collectively termed “graphics”). In some embodiments, some or all of the visual output corresponds to user interface objects. As used herein, the term “affordance” refers to a user-interactive graphical user interface object (e.g., a graphical user interface object that is configured to respond to inputs directed toward the graphical user interface object). Examples of user-interactive graphical user interface objects include, without limitation, a button, slider, icon, selectable menu item, switch, or other user interface control.
Touch-sensitive display system 112 has a touch-sensitive surface, sensor or set of sensors that accepts input from the user based on haptic and/or tactile contact. Touch-sensitive display system 112 and display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on touch-sensitive display system 112 and convert the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages or images) that are displayed on touch-sensitive display system 112. In an exemplary embodiment, a point of contact between touch-sensitive display system 112 and the user corresponds to a finger of the user or a stylus.
Touch-sensitive display system 112 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments. Touch-sensitive display system 112 and display controller 156 optionally detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch-sensitive display system 112. In an exemplary embodiment, projected mutual capacitance sensing technology is used, such as that found in the iPhone®, iPod Touch®, and iPad® from Apple Inc. of Cupertino, Calif.
Touch-sensitive display system 112 optionally has a video resolution in excess of 100 dpi. In some embodiments, the touch screen video resolution is in excess of 400 dpi (e.g., 500 dpi, 800 dpi, or greater). The user optionally makes contact with touch-sensitive display system 112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work with finger-based contacts and gestures, which can be less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user.
In some embodiments, in addition to the touch screen, device 100 optionally includes a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad is, optionally, a touch-sensitive surface that is separate from touch-sensitive display system 112 or an extension of the touch-sensitive surface formed by the touch screen.
Device 100 also includes power system 162 for powering the various components. Power system 162 optionally includes a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.
Device 100 optionally also includes one or more optical sensors 164.
Device 100 optionally also includes one or more contact intensity sensors 165.
Device 100 optionally also includes one or more proximity sensors 166.
Device 100 optionally also includes one or more tactile output generators 167.
Device 100 optionally also includes one or more accelerometers 168.
In some embodiments, the software components stored in memory 102 include operating system 126, communication module (or set of instructions) 128, contact/motion module (or set of instructions) 130, graphics module (or set of instructions) 132, haptic feedback module (or set of instructions) 133, text input module (or set of instructions) 134, Global Positioning System (GPS) module (or set of instructions) 135, and applications (or sets of instructions) 136. Furthermore, in some embodiments, memory 102 stores device/global internal state 157, as shown in
Operating system 126 (e.g., iOS, Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
Communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by RF circuitry 108 and/or external port 124. External port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with the 30-pin connector used in some iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, Calif. In some embodiments, the external port is a Lightning connector that is the same as, or similar to and/or compatible with the Lightning connector used in some iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, Calif.
Contact/motion module 130 optionally detects contact with touch-sensitive display system 112 (in conjunction with display controller 156) and other touch-sensitive devices (e.g., a touchpad or physical click wheel). Contact/motion module 130 includes various software components for performing various operations related to detection of contact (e.g., by a finger or by a stylus), such as determining if contact has occurred (e.g., detecting a finger-down event), determining an intensity of the contact (e.g., the force or pressure of the contact or a substitute for the force or pressure of the contact), determining if there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining if the contact has ceased (e.g., detecting a finger-up event or a break in contact). Contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact, which is represented by a series of contact data, optionally includes determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations are, optionally, applied to single contacts (e.g., one finger contacts or stylus contacts) or to multiple simultaneous contacts (e.g., “multitouch”/multiple finger contacts). In some embodiments, contact/motion module 130 and display controller 156 detect contact on a touchpad.
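By way of illustration only, the derivation of speed and velocity of a point of contact from a series of contact data, as described in the preceding paragraph, can be sketched in Swift. The sample type, field names, and sampling details are hypothetical and are not drawn from this disclosure.

import Foundation

struct ContactSample {
    let x: Double          // position on the touch-sensitive surface
    let y: Double
    let timestamp: TimeInterval
}

struct ContactMotion {
    let speed: Double                        // magnitude
    let velocity: (dx: Double, dy: Double)   // magnitude and direction
}

func motion(from previous: ContactSample, to current: ContactSample) -> ContactMotion {
    // Guard against a zero time step between consecutive samples.
    let dt = max(current.timestamp - previous.timestamp, .ulpOfOne)
    let vx = (current.x - previous.x) / dt
    let vy = (current.y - previous.y) / dt
    return ContactMotion(speed: (vx * vx + vy * vy).squareRoot(),
                         velocity: (dx: vx, dy: vy))
}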
Contact/motion module 130 optionally detects a gesture input by a user. Different gestures on the touch-sensitive surface have different contact patterns (e.g., different motions, timings, and/or intensities of detected contacts). Thus, a gesture is, optionally, detected by detecting a particular contact pattern. For example, detecting a finger tap gesture includes detecting a finger-down event followed by detecting a finger-up (lift off) event at the same position (or substantially the same position) as the finger-down event (e.g., at the position of an icon). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event followed by detecting one or more finger-dragging events, and subsequently followed by detecting a finger-up (lift off) event. Similarly, tap, swipe, drag, and other gestures are optionally detected for a stylus by detecting a particular contact pattern for the stylus.
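By way of illustration only, classifying a completed sequence of contact events by its contact pattern, as described in the preceding paragraph, can be sketched in Swift. The event names and the position tolerance below are hypothetical and are not drawn from this disclosure.

enum ContactEvent {
    case fingerDown(x: Double, y: Double)
    case fingerDrag(x: Double, y: Double)
    case fingerUp(x: Double, y: Double)
}

enum Gesture { case tap, swipe, unknown }

func classify(_ events: [ContactEvent], positionTolerance: Double = 10.0) -> Gesture {
    guard case let .fingerDown(x0, y0)? = events.first,
          case let .fingerUp(x1, y1)? = events.last else { return .unknown }
    let distanceMoved = ((x1 - x0) * (x1 - x0) + (y1 - y0) * (y1 - y0)).squareRoot()
    let dragged = events.contains {
        if case .fingerDrag = $0 { return true } else { return false }
    }
    // Tap: finger-down followed by finger-up at (substantially) the same position.
    // Swipe: finger-down, one or more finger-dragging events, then finger-up.
    return (dragged || distanceMoved > positionTolerance) ? .swipe : .tap
}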
Graphics module 132 includes various known software components for rendering and displaying graphics on touch-sensitive display system 112 or other display, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast or other visual property) of graphics that are displayed. As used herein, the term “graphics” includes any object that can be displayed to a user, including without limitation text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations and the like.
In some embodiments, graphics module 132 stores data representing graphics to be used. Each graphic is, optionally, assigned a corresponding code. Graphics module 132 receives, from applications etc., one or more codes specifying graphics to be displayed along with, if necessary, coordinate data and other graphic property data, and then generates screen image data to output to display controller 156.
Haptic feedback module 133 includes various software components for generating instructions used by tactile output generator(s) 167 to produce tactile outputs at one or more locations on device 100 in response to user interactions with device 100.
Text input module 134, which is, optionally, a component of graphics module 132, provides soft keyboards for entering text in various applications (e.g., contacts 137, e-mail 140, IM 141, browser 147, and any other application that needs text input).
GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to telephone 138 for use in location-based dialing, to camera 143 as picture/video metadata, and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets).
Applications 136 optionally include the following modules (or sets of instructions), or a subset or superset thereof: contacts module 137, telephone module 138, videoconferencing module 139, e-mail client module 140, instant messaging (IM) module 141, workout support module 142, camera module 143, image management module 144, browser module 147, calendar module 148, widget modules 149, widget creator module 150, search module 151, video and music player module 152, notes module 153, map module 154, and online video module 155, each of which is described further below.
Examples of other applications 136 that are, optionally, stored in memory 102 include other word processing applications, other image editing applications, drawing applications, presentation applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, contacts module 137 includes executable instructions to manage an address book or contact list (e.g., stored in application internal state 192 of contacts module 137 in memory 102 or memory 370), including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), e-mail address(es), physical address(es) or other information with a name; associating an image with a name; categorizing and sorting names; providing telephone numbers and/or e-mail addresses to initiate and/or facilitate communications by telephone 138, video conference 139, e-mail 140, or IM 141; and so forth.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, telephone module 138 includes executable instructions to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in address book 137, modify a telephone number that has been entered, dial a respective telephone number, conduct a conversation and disconnect or hang up when the conversation is completed. As noted above, the wireless communication optionally uses any of a plurality of communications standards, protocols and technologies.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch-sensitive display system 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact module 130, graphics module 132, text input module 134, contact list 137, and telephone module 138, videoconferencing module 139 includes executable instructions to initiate, conduct, and terminate a video conference between a user and one or more other participants in accordance with user instructions.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, e-mail client module 140 includes executable instructions to create, send, receive, and manage e-mail in response to user instructions. In conjunction with image management module 144, e-mail client module 140 makes it very easy to create and send e-mails with still or video images taken with camera module 143.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the instant messaging module 141 includes executable instructions to enter a sequence of characters corresponding to an instant message, to modify previously entered characters, to transmit a respective instant message (for example, using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages or using XMPP, SIMPLE, Apple Push Notification Service (APNs) or IMPS for Internet-based instant messages), to receive instant messages and to view received instant messages. In some embodiments, transmitted and/or received instant messages optionally include graphics, photos, audio files, video files and/or other attachments as are supported in a MMS and/or an Enhanced Messaging Service (EMS). As used herein, “instant messaging” refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, APNs, or IMPS).
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, map module 154, and music player module 146, workout support module 142 includes executable instructions to create workouts (e.g., with time, distance, and/or calorie burning goals); communicate with workout sensors (in sports devices and smart watches); receive workout sensor data; calibrate sensors used to monitor a workout; select and play music for a workout; and display, store and transmit workout data.
In conjunction with touch-sensitive display system 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact module 130, graphics module 132, and image management module 144, camera module 143 includes executable instructions to capture still images or video (including a video stream) and store them into memory 102, modify characteristics of a still image or video, and/or delete a still image or video from memory 102.
In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, and camera module 143, image management module 144 includes executable instructions to arrange, modify (e.g., edit), or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, browser module 147 includes executable instructions to browse the Internet in accordance with user instructions, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, e-mail client module 140, and browser module 147, calendar module 148 includes executable instructions to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to do lists, etc.) in accordance with user instructions.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, widget modules 149 are mini-applications that are, optionally, downloaded and used by a user (e.g., weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, and dictionary widget 149-5) or created by the user (e.g., user-created widget 149-6). In some embodiments, a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! Widgets).
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, the widget creator module 150 includes executable instructions to create widgets (e.g., turning a user-specified portion of a web page into a widget).
In conjunction with touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, search module 151 includes executable instructions to search for text, music, sound, image, video, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms) in accordance with user instructions.
In conjunction with touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, and browser module 147, video and music player module 152 includes executable instructions that allow the user to download and play back recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files, and executable instructions to display, present or otherwise play back videos (e.g., on touch-sensitive display system 112, or on an external display connected wirelessly or via external port 124). In some embodiments, device 100 optionally includes the functionality of an MP3 player, such as an iPod (trademark of Apple Inc.).
In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, notes module 153 includes executable instructions to create and manage notes, to do lists, and the like in accordance with user instructions.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, map module 154 includes executable instructions to receive, display, modify, and store maps and data associated with maps (e.g., driving directions; data on stores and other points of interest at or near a particular location; and other location-based data) in accordance with user instructions.
In conjunction with touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, text input module 134, e-mail client module 140, and browser module 147, online video module 155 includes executable instructions that allow the user to access, browse, receive (e.g., by streaming and/or download), play back (e.g., on the touch screen 112, or on an external display connected wirelessly or via external port 124), send an e-mail with a link to a particular online video, and otherwise manage online videos in one or more file formats, such as H.264. In some embodiments, instant messaging module 141, rather than e-mail client module 140, is used to send a link to a particular online video.
Each of the above identified modules and applications correspond to a set of executable instructions for performing one or more functions described above and the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules are, optionally, combined or otherwise re-arranged in various embodiments. In some embodiments, memory 102 optionally stores a subset of the modules and data structures identified above. Furthermore, memory 102 optionally stores additional modules and data structures not described above.
In some embodiments, device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through a touch screen and/or a touchpad. By using a touch screen and/or a touchpad as the primary input control device for operation of device 100, the number of physical input control devices (such as push buttons, dials, and the like) on device 100 is, optionally, reduced.
The predefined set of functions that are performed exclusively through a touch screen and/or a touchpad optionally include navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates device 100 to a main, home, or root menu from any user interface that is displayed on device 100. In such embodiments, a “menu button” is implemented using a touchpad. In some other embodiments, the menu button is a physical push button or other physical input control device instead of a touchpad.
Event sorter 170 receives event information and determines the application 136-1 and application view 191 of application 136-1 to which to deliver the event information. Event sorter 170 includes event monitor 171 and event dispatcher module 174. In some embodiments, application 136-1 includes application internal state 192, which indicates the current application view(s) displayed on touch-sensitive display system 112 when the application is active or executing. In some embodiments, device/global internal state 157 is used by event sorter 170 to determine which application(s) is (are) currently active, and application internal state 192 is used by event sorter 170 to determine application views 191 to which to deliver event information.
In some embodiments, application internal state 192 includes additional information, such as one or more of: resume information to be used when application 136-1 resumes execution, user interface state information that indicates information being displayed or that is ready for display by application 136-1, a state queue for enabling the user to go back to a prior state or view of application 136-1, and a redo/undo queue of previous actions taken by the user.
Event monitor 171 receives event information from peripherals interface 118. Event information includes information about a sub-event (e.g., a user touch on touch-sensitive display system 112, as part of a multi-touch gesture). Peripherals interface 118 transmits information it receives from I/O subsystem 106 or a sensor, such as proximity sensor 166, accelerometer(s) 168, and/or microphone 113 (through audio circuitry 110). Information that peripherals interface 118 receives from I/O subsystem 106 includes information from touch-sensitive display system 112 or a touch-sensitive surface.
In some embodiments, event monitor 171 sends requests to the peripherals interface 118 at predetermined intervals. In response, peripherals interface 118 transmits event information. In other embodiments, peripherals interface 118 transmits event information only when there is a significant event (e.g., receiving an input above a predetermined noise threshold and/or for more than a predetermined duration).
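By way of illustration only, the push-style alternative described in the preceding paragraph, in which only significant events are transmitted, can be sketched in Swift. The protocol, type names, and threshold values are hypothetical and are not drawn from this disclosure.

import Foundation

struct RawEvent {
    let magnitude: Double        // e.g., contact intensity or motion magnitude
    let duration: TimeInterval
}

protocol PeripheralsSource {
    func pendingEvents() -> [RawEvent]
}

// Forward only events above a noise threshold or lasting longer than a minimum
// duration, rather than transmitting event information on every polling interval.
func significantEvents(from source: PeripheralsSource,
                       noiseThreshold: Double = 0.05,
                       minimumDuration: TimeInterval = 0.01) -> [RawEvent] {
    return source.pendingEvents().filter {
        $0.magnitude > noiseThreshold || $0.duration > minimumDuration
    }
}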
In some embodiments, event sorter 170 also includes a hit view determination module 172 and/or an active event recognizer determination module 173.
Hit view determination module 172 provides software procedures for determining where a sub-event has taken place within one or more views, when touch-sensitive display system 112 displays more than one view. Views are made up of controls and other elements that a user can see on the display.
Another aspect of the user interface associated with an application is a set of views, sometimes herein called application views or user interface windows, in which information is displayed and touch-based gestures occur. The application views (of a respective application) in which a touch is detected optionally correspond to programmatic levels within a programmatic or view hierarchy of the application. For example, the lowest level view in which a touch is detected is, optionally, called the hit view, and the set of events that are recognized as proper inputs are, optionally, determined based, at least in part, on the hit view of the initial touch that begins a touch-based gesture.
Hit view determination module 172 receives information related to sub-events of a touch-based gesture. When an application has multiple views organized in a hierarchy, hit view determination module 172 identifies a hit view as the lowest view in the hierarchy which should handle the sub-event. In most circumstances, the hit view is the lowest level view in which an initiating sub-event occurs (i.e., the first sub-event in the sequence of sub-events that form an event or potential event). Once the hit view is identified by the hit view determination module, the hit view typically receives all sub-events related to the same touch or input source for which it was identified as the hit view.
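By way of illustration only, a hit-view search of this kind can be sketched as a recursive descent through a view hierarchy that returns the deepest view whose frame contains the point of the initiating sub-event. The View type, the hitView(for:in:) function, and the example geometry below are assumptions introduced for the sketch and are not part of the embodiments described above.

```swift
import Foundation

// Hypothetical view node: a frame in window coordinates plus subviews,
// ordered so that later subviews are drawn (and hit-tested) on top.
final class View {
    let name: String
    let frame: CGRect
    let subviews: [View]
    init(name: String, frame: CGRect, subviews: [View] = []) {
        self.name = name
        self.frame = frame
        self.subviews = subviews
    }
}

// Return the lowest view in the hierarchy that contains `point`, mirroring the
// idea that the hit view is the deepest view in which the initiating sub-event
// (the first touch of the gesture) occurred.
func hitView(for point: CGPoint, in root: View) -> View? {
    guard root.frame.contains(point) else { return nil }
    // Prefer the topmost subview that also contains the point.
    for subview in root.subviews.reversed() {
        if let deeper = hitView(for: point, in: subview) {
            return deeper
        }
    }
    return root
}

// Example: a window containing a list whose row is the candidate hit view.
let row = View(name: "row-3", frame: CGRect(x: 0, y: 120, width: 320, height: 44))
let list = View(name: "list", frame: CGRect(x: 0, y: 0, width: 320, height: 480), subviews: [row])
let window = View(name: "window", frame: CGRect(x: 0, y: 0, width: 320, height: 480), subviews: [list])
print(hitView(for: CGPoint(x: 20, y: 130), in: window)?.name ?? "none")  // "row-3"
```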
Active event recognizer determination module 173 determines which view or views within a view hierarchy should receive a particular sequence of sub-events. In some embodiments, active event recognizer determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, active event recognizer determination module 173 determines that all views that include the physical location of a sub-event are actively involved views, and therefore determines that all actively involved views should receive a particular sequence of sub-events. In other embodiments, even if touch sub-events were entirely confined to the area associated with one particular view, views higher in the hierarchy would still remain as actively involved views.
Event dispatcher module 174 dispatches the event information to an event recognizer (e.g., event recognizer 180). In embodiments including active event recognizer determination module 173, event dispatcher module 174 delivers the event information to an event recognizer determined by active event recognizer determination module 173. In some embodiments, event dispatcher module 174 stores the event information in an event queue, from which it is retrieved by a respective event receiver module 182.
In some embodiments, operating system 126 includes event sorter 170. Alternatively, application 136-1 includes event sorter 170. In yet other embodiments, event sorter 170 is a stand-alone module, or a part of another module stored in memory 102, such as contact/motion module 130.
In some embodiments, application 136-1 includes a plurality of event handlers 190 and one or more application views 191, each of which includes instructions for handling touch events that occur within a respective view of the application's user interface. Each application view 191 of the application 136-1 includes one or more event recognizers 180. Typically, a respective application view 191 includes a plurality of event recognizers 180. In other embodiments, one or more of event recognizers 180 are part of a separate module, such as a user interface kit (not shown) or a higher level object from which application 136-1 inherits methods and other properties. In some embodiments, a respective event handler 190 includes one or more of: data updater 176, object updater 177, GUI updater 178, and/or event data 179 received from event sorter 170. Event handler 190 optionally utilizes or calls data updater 176, object updater 177 or GUI updater 178 to update the application internal state 192. Alternatively, one or more of the application views 191 includes one or more respective event handlers 190. Also, in some embodiments, one or more of data updater 176, object updater 177, and GUI updater 178 are included in a respective application view 191.
A respective event recognizer 180 receives event information (e.g., event data 179) from event sorter 170, and identifies an event from the event information. Event recognizer 180 includes event receiver 182 and event comparator 184. In some embodiments, event recognizer 180 also includes at least a subset of: metadata 183, and event delivery instructions 188 (which optionally include sub-event delivery instructions).
Event receiver 182 receives event information from event sorter 170. The event information includes information about a sub-event, for example, a touch or a touch movement. Depending on the sub-event, the event information also includes additional information, such as location of the sub-event. When the sub-event concerns motion of a touch, the event information optionally also includes speed and direction of the sub-event. In some embodiments, events include rotation of the device from one orientation to another (e.g., from a portrait orientation to a landscape orientation, or vice versa), and the event information includes corresponding information about the current orientation (also called device attitude) of the device.
Event comparator 184 compares the event information to predefined event or sub-event definitions and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event. In some embodiments, event comparator 184 includes event definitions 186. Event definitions 186 contain definitions of events (e.g., predefined sequences of sub-events), for example, event 1 (187-1), event 2 (187-2), and others. In some embodiments, sub-events in an event 187 include, for example, touch begin, touch end, touch movement, touch cancellation, and multiple touching. In one example, the definition for event 1 (187-1) is a double tap on a displayed object. The double tap, for example, comprises a first touch (touch begin) on the displayed object for a predetermined phase, a first lift-off (touch end) for a predetermined phase, a second touch (touch begin) on the displayed object for a predetermined phase, and a second lift-off (touch end) for a predetermined phase. In another example, the definition for event 2 (187-2) is a dragging on a displayed object. The dragging, for example, comprises a touch (or contact) on the displayed object for a predetermined phase, a movement of the touch across touch-sensitive display system 112, and lift-off of the touch (touch end). In some embodiments, the event also includes information for one or more associated event handlers 190.
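As a rough sketch of how such event definitions might be represented, the following compares received sub-events against a stored sub-event sequence and reports whether the gesture is still possible, has matched, or has failed. The type names and the simple prefix-matching strategy are illustrative assumptions, not the disclosed implementation.

```swift
// Hypothetical sub-event and definition types.
enum SubEvent: Equatable {
    case touchBegin, touchEnd, touchMove, touchCancel
}

struct EventDefinition {
    let name: String
    let sequence: [SubEvent]          // e.g., the sub-events that make up a double tap
}

let doubleTap = EventDefinition(
    name: "double tap",
    sequence: [.touchBegin, .touchEnd, .touchBegin, .touchEnd])
let drag = EventDefinition(
    name: "drag",
    sequence: [.touchBegin, .touchMove, .touchEnd])

// Incrementally compare received sub-events against a definition: the result is
// "matched" once the whole sequence has been seen, "possible" while a prefix
// matches, and "failed" as soon as the received sub-events diverge.
enum MatchState { case possible, matched, failed }

func match(_ received: [SubEvent], against definition: EventDefinition) -> MatchState {
    if received.count > definition.sequence.count { return .failed }
    if !definition.sequence.starts(with: received) { return .failed }
    return received.count == definition.sequence.count ? .matched : .possible
}

print(match([.touchBegin, .touchEnd], against: doubleTap))        // possible
print(match([.touchBegin, .touchMove], against: doubleTap))       // failed
print(match([.touchBegin, .touchMove, .touchEnd], against: drag)) // matched
```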
In some embodiments, event definition 187 includes a definition of an event for a respective user-interface object. In some embodiments, event comparator 184 performs a hit test to determine which user-interface object is associated with a sub-event. For example, in an application view in which three user-interface objects are displayed on touch-sensitive display system 112, when a touch is detected on touch-sensitive display system 112, event comparator 184 performs a hit test to determine which of the three user-interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190, the event comparator uses the result of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects an event handler associated with the sub-event and the object triggering the hit test.
In some embodiments, the definition for a respective event 187 also includes delayed actions that delay delivery of the event information until after it has been determined whether the sequence of sub-events does or does not correspond to the event recognizer's event type.
When a respective event recognizer 180 determines that the series of sub-events does not match any of the events in event definitions 186, the respective event recognizer 180 enters an event impossible, event failed, or event ended state, after which it disregards subsequent sub-events of the touch-based gesture. In this situation, other event recognizers, if any, that remain active for the hit view continue to track and process sub-events of an ongoing touch-based gesture.
In some embodiments, a respective event recognizer 180 includes metadata 183 with configurable properties, flags, and/or lists that indicate how the event delivery system should perform sub-event delivery to actively involved event recognizers. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate how event recognizers interact, or are enabled to interact, with one another. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate whether sub-events are delivered to varying levels in the view or programmatic hierarchy.
In some embodiments, a respective event recognizer 180 activates event handler 190 associated with an event when one or more particular sub-events of an event are recognized. In some embodiments, a respective event recognizer 180 delivers event information associated with the event to event handler 190. Activating an event handler 190 is distinct from sending (and deferring sending of) sub-events to a respective hit view. In some embodiments, event recognizer 180 throws a flag associated with the recognized event, and event handler 190 associated with the flag catches the flag and performs a predefined process.
In some embodiments, event delivery instructions 188 include sub-event delivery instructions that deliver event information about a sub-event without activating an event handler. Instead, the sub-event delivery instructions deliver event information to event handlers associated with the series of sub-events or to actively involved views. Event handlers associated with the series of sub-events or with actively involved views receive the event information and perform a predetermined process.
In some embodiments, data updater 176 creates and updates data used in application 136-1. For example, data updater 176 updates the telephone number used in contacts module 137, or stores a video file used in video player module 145. In some embodiments, object updater 177 creates and updates objects used in application 136-1. For example, object updater 177 creates a new user-interface object or updates the position of a user-interface object. GUI updater 178 updates the GUI. For example, GUI updater 178 prepares display information and sends it to graphics module 132 for display on a touch-sensitive display.
In some embodiments, event handler(s) 190 includes or has access to data updater 176, object updater 177, and GUI updater 178. In some embodiments, data updater 176, object updater 177, and GUI updater 178 are included in a single module of a respective application 136-1 or application view 191. In other embodiments, they are included in two or more software modules.
It shall be understood that the foregoing discussion regarding event handling of user touches on touch-sensitive displays also applies to other forms of user inputs to operate multifunction devices 100 with input devices, not all of which are initiated on touch screens. For example, mouse movement and mouse button presses, optionally coordinated with single or multiple keyboard presses or holds; contact movements such as taps, drags, scrolls, etc., on touch-pads; pen stylus inputs; movement of the device; oral instructions; detected eye movements; biometric inputs; and/or any combination thereof are optionally utilized as inputs corresponding to sub-events which define an event to be recognized.
Device 100 optionally also includes one or more physical buttons, such as “home” or menu button 204. As described previously, menu button 204 is, optionally, used to navigate to any application 136 in a set of applications that are, optionally, executed on device 100. Alternatively, in some embodiments, the menu button is implemented as a soft key in a GUI displayed on the touch-screen display.
In some embodiments, device 100 includes the touch-screen display, menu button 204, push button 206 for powering the device on/off and locking the device, volume adjustment button(s) 208, Subscriber Identity Module (SIM) card slot 210, headset jack 212, and docking/charging external port 124. Push button 206 is, optionally, used to turn the power on/off on the device by depressing the button and holding the button in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process. In some embodiments, device 100 also accepts verbal input for activation or deactivation of some functions through microphone 113. Device 100 also, optionally, includes one or more contact intensity sensors 165 for detecting intensity of contacts on touch-sensitive display system 112 and/or one or more tactile output generators 167 for generating tactile outputs for a user of device 100.
Each of the above identified elements in
Attention is now directed towards embodiments of user interfaces (“UI”) that are, optionally, implemented on portable multifunction device 100.
It should be noted that the icon labels illustrated in
Additionally, while the following examples are given primarily with reference to finger inputs (e.g., finger contacts, finger tap gestures, finger swipe gestures, etc.), it should be understood that, in some embodiments, one or more of the finger inputs are replaced with input from another input device (e.g., a mouse based input or a stylus input). For example, a swipe gesture is, optionally, replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact). As another example, a tap gesture is, optionally, replaced with a mouse click while the cursor is located over the location of the tap gesture (e.g., instead of detection of the contact followed by ceasing to detect the contact). Similarly, when multiple user inputs are simultaneously detected, it should be understood that multiple computer mice are, optionally, used simultaneously, or a mouse and finger contacts are, optionally, used simultaneously.
As used herein, the term “focus selector” refers to an input element that indicates a current part of a user interface with which a user is interacting. In some implementations that include a cursor or other location marker, the cursor acts as a “focus selector,” so that when an input (e.g., a press input) is detected on a touch-sensitive surface (e.g., touchpad 355 in
As used in the specification and claims, the term “intensity” of a contact on a touch-sensitive surface refers to the force or pressure (force per unit area) of a contact (e.g., a finger contact or a stylus contact) on the touch-sensitive surface, or to a substitute (proxy) for the force or pressure of a contact on the touch-sensitive surface. The intensity of a contact has a range of values that includes at least four distinct values and more typically includes hundreds of distinct values (e.g., at least 256). Intensity of a contact is, optionally, determined (or measured) using various approaches and various sensors or combinations of sensors. For example, one or more force sensors underneath or adjacent to the touch-sensitive surface are, optionally, used to measure force at various points on the touch-sensitive surface. In some implementations, force measurements from multiple force sensors are combined (e.g., a weighted average or a sum) to determine an estimated force of a contact. Similarly, a pressure-sensitive tip of a stylus is, optionally, used to determine a pressure of the stylus on the touch-sensitive surface. Alternatively, the size of the contact area detected on the touch-sensitive surface and/or changes thereto, the capacitance of the touch-sensitive surface proximate to the contact and/or changes thereto, and/or the resistance of the touch-sensitive surface proximate to the contact and/or changes thereto are, optionally, used as a substitute for the force or pressure of the contact on the touch-sensitive surface. In some implementations, the substitute measurements for contact force or pressure are used directly to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is described in units corresponding to the substitute measurements). In some implementations, the substitute measurements for contact force or pressure are converted to an estimated force or pressure and the estimated force or pressure is used to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is a pressure threshold measured in units of pressure). Using the intensity of a contact as an attribute of a user input allows for user access to additional device functionality that may otherwise not be readily accessible by the user on a reduced-size device with limited real estate for displaying affordances (e.g., on a touch-sensitive display) and/or receiving user input (e.g., via a touch-sensitive display, a touch-sensitive surface, or a physical/mechanical control such as a knob or a button).
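One way to picture the combination of multiple force measurements mentioned above is a weighted average in which each sensor's contribution falls off with its distance from the contact. The sensor layout, the weighting function, and the units in this sketch are purely illustrative assumptions and not part of the embodiments described above.

```swift
// Illustrative combined force estimate: readings from several force sensors under
// the surface are combined with a weighted average, where the weight of each
// sensor decreases with its distance from the contact location.
struct ForceSensor {
    let position: (x: Double, y: Double)
    let reading: Double          // force reported by this sensor (arbitrary units)
}

func estimatedForce(at contact: (x: Double, y: Double), sensors: [ForceSensor]) -> Double {
    let weighted = sensors.map { sensor -> (weight: Double, reading: Double) in
        let dx = sensor.position.x - contact.x
        let dy = sensor.position.y - contact.y
        let distance = (dx * dx + dy * dy).squareRoot()
        return (weight: 1.0 / (1.0 + distance), reading: sensor.reading)
    }
    let totalWeight = weighted.reduce(0) { $0 + $1.weight }
    return weighted.reduce(0) { $0 + $1.weight * $1.reading } / totalWeight
}

let sensors = [
    ForceSensor(position: (0, 0), reading: 0.2),
    ForceSensor(position: (10, 0), reading: 0.8),
    ForceSensor(position: (0, 10), reading: 0.4),
]
print(estimatedForce(at: (9, 1), sensors: sensors))  // dominated by the nearest sensor
```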
In some embodiments, contact/motion module 130 uses a set of one or more intensity thresholds to determine whether an operation has been performed by a user (e.g., to determine whether a user has “clicked” on an icon). In some embodiments, at least a subset of the intensity thresholds are determined in accordance with software parameters (e.g., the intensity thresholds are not determined by the activation thresholds of particular physical actuators and can be adjusted without changing the physical hardware of device 100). For example, a mouse “click” threshold of a trackpad or touch-screen display can be set to any of a large range of predefined threshold values without changing the trackpad or touch-screen display hardware. Additionally, in some implementations a user of the device is provided with software settings for adjusting one or more of the set of intensity thresholds (e.g., by adjusting individual intensity thresholds and/or by adjusting a plurality of intensity thresholds at once with a system-level click “intensity” parameter).
As used in the specification and claims, the term “characteristic intensity” of a contact refers to a characteristic of the contact based on one or more intensities of the contact. In some embodiments, the characteristic intensity is based on multiple intensity samples. The characteristic intensity is, optionally, based on a predefined number of intensity samples, or a set of intensity samples collected during a predetermined time period (e.g., 0.05, 0.1, 0.2, 0.5, 1, 2, 5, 10 seconds) relative to a predefined event (e.g., after detecting the contact, prior to detecting liftoff of the contact, before or after detecting a start of movement of the contact, prior to detecting an end of the contact, before or after detecting an increase in intensity of the contact, and/or before or after detecting a decrease in intensity of the contact). A characteristic intensity of a contact is, optionally, based on one or more of: a maximum value of the intensities of the contact, a mean value of the intensities of the contact, an average value of the intensities of the contact, a top 10 percentile value of the intensities of the contact, a value at the half maximum of the intensities of the contact, a value at the 90 percent maximum of the intensities of the contact, or the like. In some embodiments, the duration of the contact is used in determining the characteristic intensity (e.g., when the characteristic intensity is an average of the intensity of the contact over time). In some embodiments, the characteristic intensity is compared to a set of one or more intensity thresholds to determine whether an operation has been performed by a user. For example, the set of one or more intensity thresholds may include a first intensity threshold and a second intensity threshold. In this example, a contact with a characteristic intensity that does not exceed the first threshold results in a first operation, a contact with a characteristic intensity that exceeds the first intensity threshold and does not exceed the second intensity threshold results in a second operation, and a contact with a characteristic intensity that exceeds the second intensity threshold results in a third operation. In some embodiments, a comparison between the characteristic intensity and one or more intensity thresholds is used to determine whether or not to perform one or more operations (e.g., whether to perform a respective operation or forgo performing the respective operation) rather than being used to determine whether to perform a first operation or a second operation.
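A minimal sketch of a few of the statistics listed above, computed over a window of intensity samples, is shown below. The sample values, the hypothetical first and second thresholds, and the names are assumptions made for illustration only.

```swift
import Foundation

// Illustrative reductions over a window of intensity samples; the sample window
// and the choice of statistic are assumptions, not the disclosed method.
enum CharacteristicIntensity {
    case maximum, mean, top10Percentile
}

func characteristicIntensity(of samples: [Double],
                             using statistic: CharacteristicIntensity) -> Double {
    precondition(!samples.isEmpty, "need at least one intensity sample")
    switch statistic {
    case .maximum:
        return samples.max()!
    case .mean:
        return samples.reduce(0, +) / Double(samples.count)
    case .top10Percentile:
        // Approximately the value below which 90% of the samples fall.
        let sorted = samples.sorted()
        let index = Int((Double(sorted.count - 1) * 0.9).rounded())
        return sorted[index]
    }
}

// Example: samples collected over a short window around a press, compared against
// hypothetical first and second intensity thresholds.
let samples = [0.12, 0.35, 0.48, 0.61, 0.59, 0.44]
let intensity = characteristicIntensity(of: samples, using: .mean)
let firstThreshold = 0.4, secondThreshold = 0.6
if intensity > secondThreshold {
    print("third operation")
} else if intensity > firstThreshold {
    print("second operation")
} else {
    print("first operation")
}
```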
In some embodiments, a portion of a gesture is identified for purposes of determining a characteristic intensity. For example, a touch-sensitive surface may receive a continuous swipe contact transitioning from a start location and reaching an end location (e.g., a drag gesture), at which point the intensity of the contact increases. In this example, the characteristic intensity of the contact at the end location may be based on only a portion of the continuous swipe contact, and not the entire swipe contact (e.g., only the portion of the swipe contact at the end location). In some embodiments, a smoothing algorithm may be applied to the intensities of the swipe contact prior to determining the characteristic intensity of the contact. For example, the smoothing algorithm optionally includes one or more of: an unweighted sliding-average smoothing algorithm, a triangular smoothing algorithm, a median filter smoothing algorithm, and/or an exponential smoothing algorithm. In some circumstances, these smoothing algorithms eliminate narrow spikes or dips in the intensities of the swipe contact for purposes of determining a characteristic intensity.
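Two of the smoothing choices named above, an unweighted sliding average and exponential smoothing, can be sketched as follows; the window size and smoothing factor are illustrative assumptions.

```swift
// Unweighted sliding-average smoothing: each output sample is the mean of the
// samples in a small window centered on it (window size is an assumption).
func slidingAverage(_ samples: [Double], window: Int = 3) -> [Double] {
    guard window > 1, samples.count > 1 else { return samples }
    return samples.indices.map { i in
        let lo = max(0, i - window / 2)
        let hi = min(samples.count - 1, i + window / 2)
        let slice = samples[lo...hi]
        return slice.reduce(0, +) / Double(slice.count)
    }
}

// Exponential smoothing: each output blends the new sample with the previous
// smoothed value (the smoothing factor alpha is an assumption).
func exponentialSmoothing(_ samples: [Double], alpha: Double = 0.5) -> [Double] {
    guard var smoothed = samples.first.map({ [$0] }) else { return [] }
    for sample in samples.dropFirst() {
        smoothed.append(alpha * sample + (1 - alpha) * smoothed.last!)
    }
    return smoothed
}

// A narrow spike in the raw intensities is damped before the characteristic
// intensity is computed.
let raw = [0.20, 0.22, 0.90, 0.24, 0.25]
print(slidingAverage(raw))        // spike at index 2 is spread out and reduced
print(exponentialSmoothing(raw))
```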
The user interface figures described herein optionally include various intensity diagrams that show the current intensity of the contact on the touch-sensitive surface relative to one or more intensity thresholds (e.g., a contact detection intensity threshold IT0, a light press intensity threshold ITL, a deep press intensity threshold ITD (e.g., that is at least initially higher than ITL), and/or one or more other intensity thresholds (e.g., an intensity threshold ITH that is lower than ITL)). These intensity diagrams are typically not part of the displayed user interface, but are provided to aid in the interpretation of the figures. In some embodiments, the light press intensity threshold corresponds to an intensity at which the device will perform operations typically associated with clicking a button of a physical mouse or a trackpad. In some embodiments, the deep press intensity threshold corresponds to an intensity at which the device will perform operations that are different from operations typically associated with clicking a button of a physical mouse or a trackpad. In some embodiments, when a contact is detected with a characteristic intensity below the light press intensity threshold (e.g., and above a nominal contact-detection intensity threshold IT0 below which the contact is no longer detected), the device will move a focus selector in accordance with movement of the contact on the touch-sensitive surface without performing an operation associated with the light press intensity threshold or the deep press intensity threshold. Generally, unless otherwise stated, these intensity thresholds are consistent between different sets of user interface figures.
In some embodiments, the response of the device to inputs detected by the device depends on criteria based on the contact intensity during the input. For example, for some “light press” inputs, the intensity of a contact exceeding a first intensity threshold during the input triggers a first response. In some embodiments, the response of the device to inputs detected by the device depends on criteria that include both the contact intensity during the input and time-based criteria. For example, for some “deep press” inputs, the intensity of a contact exceeding a second intensity threshold during the input, greater than the first intensity threshold for a light press, triggers a second response only if a delay time has elapsed between meeting the first intensity threshold and meeting the second intensity threshold. This delay time is typically less than 200 ms in duration (e.g., 40, 100, or 120 ms, depending on the magnitude of the second intensity threshold, with the delay time increasing as the second intensity threshold increases). This delay time helps to avoid accidental deep press inputs. As another example, for some “deep press” inputs, there is a reduced-sensitivity time period that occurs after the time at which the first intensity threshold is met. During the reduced-sensitivity time period, the second intensity threshold is increased. This temporary increase in the second intensity threshold also helps to avoid accidental deep press inputs. For other deep press inputs, the response to detection of a deep press input does not depend on time-based criteria.
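The time-based criterion described above can be pictured as follows: a crossing of the second, higher threshold counts as a deep press only if the stated delay has elapsed since the first threshold was met. The threshold values and the 100 ms delay in this sketch are assumptions chosen from the example ranges given above.

```swift
import Foundation

// Illustrative time-based criterion for a deep press: crossing the second, higher
// threshold triggers the deep press response only if a delay has elapsed since the
// first threshold was met.
struct DeepPressDetector {
    let firstThreshold = 0.4
    let secondThreshold = 0.8
    let requiredDelay: TimeInterval = 0.100
    private(set) var firstThresholdMetAt: TimeInterval? = nil

    mutating func response(toIntensity intensity: Double, at time: TimeInterval) -> String {
        if firstThresholdMetAt == nil, intensity >= firstThreshold {
            firstThresholdMetAt = time                 // start of the delay window
        }
        guard let start = firstThresholdMetAt else { return "no response" }
        if intensity >= secondThreshold, time - start >= requiredDelay {
            return "deep press response"
        }
        // Either the intensity is below the second threshold, or the second threshold
        // was reached too quickly; in both cases only the first response applies.
        return "light press response"
    }
}

var detector = DeepPressDetector()
print(detector.response(toIntensity: 0.45, at: 0.000))  // light press response
print(detector.response(toIntensity: 0.85, at: 0.050))  // light press response (too fast for deep press)
print(detector.response(toIntensity: 0.85, at: 0.200))  // deep press response
```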
In some embodiments, one or more of the input intensity thresholds and/or the corresponding outputs vary based on one or more factors, such as user settings, contact motion, input timing, application running, rate at which the intensity is applied, number of concurrent inputs, user history, environmental factors (e.g., ambient noise), focus selector position, and the like. Exemplary factors are described in U.S. patent application Ser. Nos. 14/399,606 and 14/624,296, which are incorporated by reference herein in their entireties.
For example,
An increase of characteristic intensity of the contact from an intensity below the light press intensity threshold ITL to an intensity between the light press intensity threshold ITL and the deep press intensity threshold ITD is sometimes referred to as a “light press” input. An increase of characteristic intensity of the contact from an intensity below the deep press intensity threshold ITD to an intensity above the deep press intensity threshold ITD is sometimes referred to as a “deep press” input. An increase of characteristic intensity of the contact from an intensity below the contact-detection intensity threshold IT0 to an intensity between the contact-detection intensity threshold IT0 and the light press intensity threshold ITL is sometimes referred to as detecting the contact on the touch-surface. A decrease of characteristic intensity of the contact from an intensity above the contact-detection intensity threshold IT0 to an intensity below the contact-detection intensity threshold IT0 is sometimes referred to as detecting liftoff of the contact from the touch-surface. In some embodiments, IT0 is zero. In some embodiments, IT0 is greater than zero. In some illustrations a shaded circle or oval is used to represent intensity of a contact on the touch-sensitive surface. In some illustrations, a circle or oval without shading is used to represent a respective contact on the touch-sensitive surface without specifying the intensity of the respective contact.
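The four kinds of intensity transitions defined above can be summarized as a simple classification of a change in characteristic intensity against the three thresholds IT0, ITL, and ITD. The numeric threshold values in this sketch are placeholders.

```swift
// Illustrative classification of an intensity transition against the contact
// detection (IT0), light press (ITL), and deep press (ITD) thresholds.
struct IntensityThresholds {
    let contactDetection = 0.05   // IT0
    let lightPress = 0.4          // ITL
    let deepPress = 0.8           // ITD
}

func classifyTransition(from previous: Double, to current: Double,
                        thresholds t: IntensityThresholds) -> String? {
    switch (previous, current) {
    case let (p, c) where p < t.contactDetection && c >= t.contactDetection && c < t.lightPress:
        return "contact detected"
    case let (p, c) where p < t.lightPress && c >= t.lightPress && c < t.deepPress:
        return "light press input"
    case let (p, c) where p < t.deepPress && c >= t.deepPress:
        return "deep press input"
    case let (p, c) where p >= t.contactDetection && c < t.contactDetection:
        return "liftoff detected"
    default:
        return nil   // no threshold crossed by this transition
    }
}

let t = IntensityThresholds()
print(classifyTransition(from: 0.0, to: 0.1, thresholds: t) ?? "-")   // contact detected
print(classifyTransition(from: 0.3, to: 0.5, thresholds: t) ?? "-")   // light press input
print(classifyTransition(from: 0.5, to: 0.9, thresholds: t) ?? "-")   // deep press input
print(classifyTransition(from: 0.2, to: 0.0, thresholds: t) ?? "-")   // liftoff detected
```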
In some embodiments described herein, one or more operations are performed in response to detecting a gesture that includes a respective press input or in response to detecting the respective press input performed with a respective contact (or a plurality of contacts), where the respective press input is detected based at least in part on detecting an increase in intensity of the contact (or plurality of contacts) above a press-input intensity threshold. In some embodiments, the respective operation is performed in response to detecting the increase in intensity of the respective contact above the press-input intensity threshold (e.g., the respective operation is performed on a “down stroke” of the respective press input). In some embodiments, the press input includes an increase in intensity of the respective contact above the press-input intensity threshold and a subsequent decrease in intensity of the contact below the press-input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in intensity of the respective contact below the press-input intensity threshold (e.g., the respective operation is performed on an “up stroke” of the respective press input).
In some embodiments, the device employs intensity hysteresis to avoid accidental inputs sometimes termed “jitter,” where the device defines or selects a hysteresis intensity threshold with a predefined relationship to the press-input intensity threshold (e.g., the hysteresis intensity threshold is X intensity units lower than the press-input intensity threshold or the hysteresis intensity threshold is 75%, 90%, or some reasonable proportion of the press-input intensity threshold). Thus, in some embodiments, the press input includes an increase in intensity of the respective contact above the press-input intensity threshold and a subsequent decrease in intensity of the contact below the hysteresis intensity threshold that corresponds to the press-input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in intensity of the respective contact below the hysteresis intensity threshold (e.g., the respective operation is performed on an “up stroke” of the respective press input). Similarly, in some embodiments, the press input is detected only when the device detects an increase in intensity of the contact from an intensity at or below the hysteresis intensity threshold to an intensity at or above the press-input intensity threshold and, optionally, a subsequent decrease in intensity of the contact to an intensity at or below the hysteresis intensity threshold, and the respective operation is performed in response to detecting the press input (e.g., the increase in intensity of the contact or the decrease in intensity of the contact, depending on the circumstances).
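A minimal sketch of the hysteresis behavior described above, assuming a hysteresis threshold at 75% of the press-input intensity threshold (one of the stated examples), is given below; the intensity values are illustrative.

```swift
// Illustrative hysteresis on the "up stroke": the press is registered when the
// intensity rises above the press-input threshold, but the corresponding operation
// fires only when the intensity later falls below the lower hysteresis threshold,
// so small dips ("jitter") between the two thresholds do not trigger anything.
struct PressWithHysteresis {
    let pressInputThreshold = 0.6
    var hysteresisThreshold: Double { pressInputThreshold * 0.75 }

    private var pressed = false

    mutating func update(intensity: Double) -> Bool {
        if !pressed && intensity >= pressInputThreshold {
            pressed = true                       // down stroke recognized
        } else if pressed && intensity <= hysteresisThreshold {
            pressed = false
            return true                          // up stroke: perform the operation
        }
        return false
    }
}

var press = PressWithHysteresis()
// The dip to 0.55 stays above the hysteresis threshold (0.45) and is ignored;
// only the final decrease to 0.40 performs the operation.
for intensity in [0.2, 0.65, 0.55, 0.62, 0.40] {
    if press.update(intensity: intensity) {
        print("operation performed at intensity \(intensity)")
    }
}
```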
For ease of explanation, operations described as being performed in response to a press input associated with a press-input intensity threshold, or in response to a gesture including the press input, are, optionally, triggered in response to detecting: an increase in intensity of a contact above the press-input intensity threshold, an increase in intensity of a contact from an intensity below the hysteresis intensity threshold to an intensity above the press-input intensity threshold, a decrease in intensity of the contact below the press-input intensity threshold, or a decrease in intensity of the contact below the hysteresis intensity threshold corresponding to the press-input intensity threshold. Additionally, in examples where an operation is described as being performed in response to detecting a decrease in intensity of a contact below the press-input intensity threshold, the operation is, optionally, performed in response to detecting a decrease in intensity of the contact below a hysteresis intensity threshold corresponding to, and lower than, the press-input intensity threshold. As described above, in some embodiments, the triggering of these responses also depends on time-based criteria being met (e.g., a delay time has elapsed between a first intensity threshold being met and a second intensity threshold being met).
Attention is now directed towards embodiments of user interfaces (“UI”) and associated processes that may be implemented on an electronic device, such as portable multifunction device 100 or device 300, with a display, a touch-sensitive surface, and one or more sensors to detect intensity of contacts with the touch-sensitive surface.
In some embodiments, the device is an electronic device with a separate display (e.g., display 450) and a separate touch-sensitive surface (e.g., touch-sensitive surface 451). In some embodiments, the device is portable multifunction device 100, the display is touch-sensitive display system 112, and the touch-sensitive surface includes tactile output generators 167 on the display (
The user interfaces in these figures are used to illustrate the processes described below. Although some of the examples which follow will be given with reference to inputs on a touch-screen display (where the touch-sensitive surface and the display are combined), in some embodiments, the device detects inputs on a touch-sensitive surface 451 that is separate from the display 450, as shown in
The user interfaces in these figures are used to illustrate the processes described below. Although some of the examples which follow will be given with reference to inputs on a touch-screen display (where the touch-sensitive surface and the display are combined), in some embodiments, the device detects inputs on a touch-sensitive surface 451 that is separate from the display 450, as shown in
In
The user interfaces in these figures are used to illustrate the processes described below. Although some of the examples which follow will be given with reference to inputs on a touch-screen display (where the touch-sensitive surface and the display are combined), in some embodiments, the device detects inputs on a touch-sensitive surface 451 that is separate from the display 450, as shown in
In contrast,
In contrast,
The user interfaces in these figures are used to illustrate the processes described below. Although some of the examples which follow will be given with reference to inputs on a touch-screen display (where the touch-sensitive surface and the display are combined), in some embodiments, the device detects inputs on a touch-sensitive surface 451 that is separate from the display 450, as shown in
In
In accordance with some embodiments,
Portable multifunction device 100 displays, in user interface 860, a representative image 866-1 in a grouped sequence of images 866. In some embodiments, the sequence of images 866 is an enhanced photo that the user has chosen for her lock screen (e.g., chosen in a settings user interface). In the example shown in
In some embodiments, user interface 860 also includes quick access information 862, such as time and date information.
While displaying representative image 866-1 on touch screen 112, device 100 detects an input 864 (e.g., a press-and-hold gesture) for which a characteristic intensity of a contact on touch screen 112 exceeds an intensity threshold. In this example, the intensity threshold is the light press threshold ITL. As shown in intensity diagram 872 (
In response to detecting the increase in the characteristic intensity of the contact, the device advances in chronological order through the one or more images acquired after acquiring representative image 866-1 at a rate that is determined based at least in part on the characteristic intensity of the contact of input 864. So, for example, display of representative image 866-1 (
In
In some embodiments, the rate, indicated in rate diagrams 870 (
In some embodiments, the rate forward or backward is determined in real-time or near-real time, so that the user can speed up or slow down movement through the images (either in the forward or reverse direction) by changing the characteristic intensity of the contact. Thus, in some embodiments, the user can scrub forwards and backwards through sequence of images 866 (e.g., in between the initial and final images in the sequence of images) by increasing and decreasing the contact intensity of user input 864.
In accordance with some embodiments,
As shown in
In this example, device 100 has a maximum rate Vmax (e.g., plus or minus 2×) which is reached when input 864's current contact intensity reaches deep press threshold ITD (or any other upper threshold) and hint threshold ITH (or any other appropriate lower threshold), respectively. The rate of movement through the sequence of images is constrained by a maximum reverse rate while the contact is detected on the touch-sensitive surface
In accordance with some embodiments, certain circumstances optionally result in device 100 deviating from a rate of movement based solely on input 864's current contact intensity. For example, as device 100 nears a final image while advancing forward through sequence of images 866, device 100 slows the rate of movement as compared to what the rate of movement would be if it were based solely on input 864's current contact intensity (e.g., device 100 “brakes” slightly as it reaches the end of the sequence of images). Similarly, in some embodiments, as device 100 nears an initial image while advancing backwards through sequence of images 866, device 100 slows the rate of movement as compared to what the rate of movement would be if it were based solely on input 864's current contact intensity (e.g., device 100 “brakes” slightly as it reaches the beginning of the sequence of images going backwards).
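The rate behavior described in the preceding paragraphs can be sketched as a mapping from the current contact intensity to a signed rate that is clamped at plus or minus Vmax and attenuated near either end of the sequence. The threshold values, the maximum rate, and the braking curve below are illustrative assumptions rather than the behavior of any particular embodiment.

```swift
// Illustrative mapping from contact intensity to a rate of movement through the
// image sequence: forward at or above the light press threshold ITL, reaching
// +Vmax at the deep press threshold ITD; backward below ITL, reaching -Vmax at the
// hint threshold ITH; with a "braking" factor near either end of the sequence.
let hintThreshold = 0.25      // ITH
let lightPress = 0.5          // ITL
let deepPress = 1.0           // ITD
let maxRate = 2.0             // Vmax (e.g., 2x), in images advanced per animation frame

func rate(forIntensity intensity: Double, position: Int, count: Int) -> Double {
    let raw: Double
    if intensity >= lightPress {
        // Forward: +Vmax is reached as the intensity reaches the deep press threshold.
        raw = maxRate * min(1, (intensity - lightPress) / (deepPress - lightPress))
    } else {
        // Backward: -Vmax is reached as the intensity falls to the hint threshold.
        raw = -maxRate * min(1, (lightPress - intensity) / (lightPress - hintThreshold))
    }
    // Brake as the end being approached gets close.
    let fraction = Double(position) / Double(max(count - 1, 1))
    let distanceToEnd = raw >= 0 ? 1 - fraction : fraction
    let braking = min(1, distanceToEnd * 4)      // slow down over roughly the last quarter
    return raw * braking
}

print(rate(forIntensity: 0.9, position: 2, count: 20))   // fast forward, far from the final image
print(rate(forIntensity: 0.9, position: 19, count: 20))  // braked to 0 at the final image
print(rate(forIntensity: 0.3, position: 2, count: 20))   // backward, braking near the initial image
```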
The device detects a swipe gesture including movement of contact 1002, having an intensity below a predetermined intensity threshold (e.g., ITL), from position 1002-a over handle icon 806 in
The user interfaces in these figures are used to illustrate the processes described below. Although some of the examples which follow will be given with reference to inputs on a touch-screen display (where the touch-sensitive surface and the display are combined), in some embodiments, the device detects inputs on a touch-sensitive surface 451 that is separate from the display 450, as shown in
The Figures described below illustrate various embodiments where the device distinguishes between user inputs intended to call up a quick-action menu (e.g.,
In
The user interfaces in these figures are used to illustrate the processes described below. Although some of the examples which follow will be given with reference to inputs on a touch-screen display (where the touch-sensitive surface and the display are combined), in some embodiments, the device detects inputs on a touch-sensitive surface 451 that is separate from the display 450, as shown in
In response to the intensity of contact 1230 increasing above a second threshold (e.g., ITL), hint graphic 1232 morphs into action menu 1214, displaying action options 1216, 1218, 1220, 1222, and 1224 in
In response to the intensity of contact 1230 increasing above the direct-selection action intensity threshold (e.g., ITD), the device further highlights action option 1216 in
The device displays (1302) a plurality of user interface objects in a first user interface on the display (e.g., a plurality of application launch icons, a plurality of rows in a list, a plurality of email messages, or a plurality of instant messaging conversations). For example, user interface 500 displays application launch icons 480, 426, 428, 482, 432, 434, 436, 438, 440, 442, 444, 446, 484, 430, 486, 488, 416, 418, 420, and 424 in
The device detects (1304) a contact at a location on the touch-sensitive surface while a focus selector is at a location of a first user interface object, in the plurality of user interface objects, on the display (e.g., contact 502 is detected over messages launch icon 424 in
While the focus selector is (1306) at the location of the first user interface object on the display: the device detects an increase in a characteristic intensity of the contact to a first intensity threshold (e.g., a “hint” intensity threshold at which the device starts to display visual hints that pressing on a respective user interface object will provide a preview of another user interface that can be reached by pressing harder on the respective user interface object). In response to detecting the increase in the characteristic intensity of the contact to the first intensity threshold, the device visually obscures (e.g., blurs, darkens, and/or makes less legible) the plurality of user interface objects, other than the first user interface object, in the first user interface while maintaining display of the first user interface object without visually obscuring the first user interface object. For example, device 100 detects an increase in the intensity of contact 502 between
The device detects that the characteristic intensity of the contact continues to increase above the first intensity threshold. In response to detecting that the characteristic intensity of the contact continues to increase above the first intensity threshold, the device dynamically increases the amount of visual obscuring of the plurality of user interface objects, other than the first user interface object, in the first user interface while maintaining display of the first user interface object without visually obscuring the first user interface object. For example, device 100 detects a further increase in the intensity of contact 502 between
In some embodiments, in response to detecting the increase in the characteristic intensity of the contact to the first intensity threshold, the device decreases (1308) a size of the plurality of user interface objects (or obscured representations of the plurality of user interface objects), other than the first user interface object (e.g., without decreasing a size of the first user interface object), in the first user interface (e.g., visually pushing the plurality of user interface objects backward in a virtual z-direction). For example, device 100 detects an increase in the intensity of contact 502 between
In some embodiments, the device increases (1310) the size of the first user interface object in the first user interface when the characteristic intensity of the contact meets and/or exceeds the first intensity threshold. In some embodiments, a press input by the contact while the focus selector is on the first user interface object increases the size of the first user interface object (instead of visually pushing the first user interface object backward (in the z-layer direction) on the display) as the characteristic intensity of the contact increases. For example, device 100 detects contact 516 having an intensity above the “hint” threshold in
In some embodiments, in response to detecting that the characteristic intensity of the contact continues to increase above the first intensity threshold, the device dynamically decreases (1312) the size of the plurality of user interface objects, other than the first user interface object, in the first user interface (e.g., visually pushing the plurality of user interface objects further backward in a virtual z-direction). For example, device 100 detects a further increase in the intensity of contact 502 between
In some embodiments, visually obscuring the plurality of user interface objects includes blurring (1314) the plurality of user interface objects with a blurring effect that has a blur radius; and dynamically increasing the amount of visual obscuring of the plurality of user interface objects includes increasing the blur radius of the blurring effect in accordance with the change in the characteristic intensity of the contact.
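A minimal sketch of a blur radius that grows with the characteristic intensity, as described above, might look as follows; the threshold values and the maximum radius are assumptions made for the example.

```swift
// Illustrative mapping from the characteristic intensity to the blur radius used
// to obscure the other user interface objects: no blur below the "hint" threshold,
// then a radius that grows with intensity up to a maximum at the "peek" threshold.
let hintThreshold = 0.3      // start of visual obscuring
let peekThreshold = 0.6      // preview is shown at or above this intensity
let maxBlurRadius = 20.0     // points (placeholder value)

func blurRadius(forIntensity intensity: Double) -> Double {
    guard intensity > hintThreshold else { return 0 }
    let progress = min(1, (intensity - hintThreshold) / (peekThreshold - hintThreshold))
    return maxBlurRadius * progress
}

print(blurRadius(forIntensity: 0.2))   // 0.0  – no obscuring before the hint threshold
print(blurRadius(forIntensity: 0.45))  // 10.0 – obscuring increases with intensity
print(blurRadius(forIntensity: 0.9))   // 20.0 – clamped at the maximum radius
```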
In some embodiments, after dynamically increasing the amount of visual obscuring of the plurality of user interface objects, other than the first user interface object, in the first user interface while maintaining display of the first user interface object without visually obscuring the first user interface object and prior to detecting an increase in the characteristic intensity of the contact to a second intensity threshold, the device detects (1316) a decrease in the characteristic intensity of the contact; and, in response to detecting the decrease in the characteristic intensity of the contact, the device dynamically decreases the amount of visual obscuring of the plurality of user interface objects, other than the first user interface object, in the first user interface while maintaining display of the first user interface object without visually obscuring the first user interface object. For example, device 100 detects a decrease in the intensity of contact 518 between
In some embodiments, in response to detecting an increase in the characteristic intensity of the contact to a second intensity threshold (e.g., a “peek” intensity threshold at which the device starts to display a preview of another user interface that can be reached by pressing harder on the respective user interface object), greater than the first intensity threshold, the device displays (1318) a preview area overlaid on at least some of the plurality of user interface objects in the first user interface (e.g., a preview area overlaid on representations of the plurality of user interface objects other than the first user interface object that are obscured in accordance with the characteristic intensity of the contact). For example, device 100 detects an increase in the intensity of contact 610 over “peek” threshold (e.g., ITL) between
In some embodiments, the preview area displays (1320) a preview of a user interface that is displayed in response to detecting a tap gesture on the first user interface object. For example, preview area 612 in
In some embodiments, while displaying the preview area overlaid on at least some of the plurality of user interface objects in the first user interface, the device detects (1322) a decrease in the characteristic intensity of the contact. In response to detecting the decrease in the characteristic intensity of the contact, the device maintains display of the preview area overlaid on at least some of the plurality of user interface objects in the first user interface until liftoff of the contact is detected. For example, while displaying preview area 612 in
In some embodiments, in response to detecting an increase in the characteristic intensity of the contact to a third intensity threshold (e.g., a “pop” intensity threshold at which the device replaces display of the first user interface (with the overlaid preview area) with display of a second user interface), greater than the second intensity threshold, the device replaces (1324) display of the first user interface and the overlaid preview area with display of a second user interface that is distinct from the first user interface (e.g., a second user interface that is also displayed in response to detecting a tap gesture on the first user interface object). For example, while displaying preview area 612 in
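The progression through the informal “hint”, “peek”, and “pop” stages described in the preceding paragraphs can be sketched as a small state machine driven by the characteristic intensity, in which the preview, once shown, is maintained through decreases in intensity. The threshold values and names below are illustrative assumptions.

```swift
// Illustrative three-stage progression driven by the characteristic intensity:
// "hint" (visually obscure other objects), "peek" (overlay the preview area), and
// "pop" (replace the first user interface with the second). Threshold values are
// placeholders; dismissal of the preview on liftoff is not modeled in this sketch.
enum PressStage: Comparable { case none, hint, peek, pop }

struct PreviewInteraction {
    let hintThreshold = 0.3, peekThreshold = 0.6, popThreshold = 0.9
    private(set) var stage: PressStage = .none

    mutating func update(intensity: Double) -> PressStage {
        let newStage: PressStage
        switch intensity {
        case popThreshold...:  newStage = .pop
        case peekThreshold...: newStage = .peek
        case hintThreshold...: newStage = .hint
        default:               newStage = .none
        }
        // Once the preview ("peek") is shown, a later drop in intensity keeps the
        // preview displayed, so the stage never falls back below .peek here.
        stage = max(stage >= .peek ? PressStage.peek : newStage, newStage)
        return stage
    }
}

var interaction = PreviewInteraction()
for intensity in [0.2, 0.4, 0.7, 0.5, 0.95] {
    print(intensity, interaction.update(intensity: intensity))   // none, hint, peek, peek, pop
}
```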
In some embodiments, in response to detecting an increase in the characteristic intensity of the contact to a second intensity threshold (e.g., an intensity threshold which in some embodiments is the same as the “peek” intensity threshold for displaying previews), greater than the first intensity threshold, the device displays (1326) a menu overlaid on at least some of the plurality of user interface objects in the first user interface. The menu contains activateable menu items associated with the first user interface object. For example, as shown in
It should be understood that the particular order in which the operations in
In accordance with some embodiments,
As shown in
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in information processing apparatus such as general purpose processors (e.g., as described above with respect to
The device displays (1502) a plurality of user interface objects in a first user interface on the display (e.g., a plurality of application launch icons, a plurality of rows in a list, a plurality of email messages, or a plurality of instant messaging conversations). For example, user interface 600 displays email messages 602, 604, 606, and 608 in
The device detects (1504) an input by a contact while a focus selector is over a first user interface object, in the plurality of user interface objects, on the display (e.g., contacts 610, 616, 618, 630, 638, 642, 644, and 646 over partial view of email message 602 in
In accordance with a determination that the input meets selection criteria (e.g., the selection criteria are satisfied when the input is a tap gesture), the device displays (1506) a second user interface that is distinct from the first user interface in response to detecting the input (e.g., where contact 610 is terminated at an intensity below ITH in
In accordance with a determination that a first portion of the input meets preview criteria (e.g., the input is a press input with a characteristic intensity in the first portion of the input that meets preview criteria, such as a characteristic intensity that meets a “peek” intensity threshold), the device displays (1508) a preview area overlaid on at least some of the plurality of user interface objects in the first user interface in response to detecting the first portion of the input, wherein the preview area includes a reduced scale representation of the second user interface. For example, in response to detecting an increase in the intensity of contact 610 above threshold ITL, device 100 displays preview area 612 in
In some embodiments, determining that the first portion of the input meets preview criteria includes, while the focus selector is over the first user interface object, in the plurality of user interface objects, on the display, detecting (1510) the characteristic intensity of the contact increase to a second intensity threshold (e.g., a “peek” intensity threshold at which the device starts to display a preview of another user interface that can be reached by pressing harder on the respective user interface object, such as ITL illustrated in
In accordance with a determination that a second portion of the input by the contact, detected after the first portion of the input, meets user-interface-replacement criteria, the device replaces (1512) display of the first user interface and the overlaid preview area with display of the second user interface. For example, in response to detecting an increase in the intensity of contact 610 above threshold ITD, device 100 navigates to user interface 614 in
In some embodiments, the user-interface-replacement criteria include (1514) a requirement that the characteristic intensity of the contact increases to a third intensity threshold, greater than a second intensity threshold, during the second portion of the input (e.g., a “pop” intensity threshold, greater than a “peek” intensity threshold, at which the device replaces display of the first user interface (with the overlaid preview area) with display of a second user interface, such as ITD illustrated as a greater intensity than ITL in
In some embodiments, the user-interface-replacement criteria include (1516) a requirement that the characteristic intensity of the contact, during the second portion of the input, decreases below a second intensity threshold and then increases again to at least the second intensity threshold. For example, in
In some embodiments, the user-interface-replacement criteria include (1518) a requirement that the characteristic intensity of the contact increase at or above a predetermined rate during the second portion of the input. In some embodiments, a quick press (e.g., a jab) by the contact that increases the characteristic intensity of the contact at or above a predetermined rate satisfies the user-interface-replacement criteria. In some embodiments, user-interface-replacement criteria are satisfied by increasing the characteristic intensity of the contact above a third “pop” intensity threshold, by repeated presses by the contact that meet or exceed a second “peek” intensity threshold, or by a quick press (e.g., a jab) by the contact that increases the characteristic intensity of the contact at or above a predetermined rate.
In some embodiments, the user-interface-replacement criteria include (1520) a requirement that an increase in the characteristic intensity of the contact during the second portion of the input is not accompanied by a movement of the contact. In some embodiments, movement of the focus selector in any direction across the preview disables responses to an increase in contact intensity above the “pop” intensity threshold that may occur during the movement of the contact. For example, after sliding contact 638, and preview area 612, to the left in
In accordance with a determination that the second portion of the input by the contact meets preview-area-disappearance criteria, the device ceases (1522) to display the preview area and displays the first user interface after the input ends (e.g., by liftoff of the contact). In some embodiments, in response to detecting liftoff, the preview area ceases to be displayed and the first user interface returns to its original appearance when preview-area-disappearance criteria are met. For example, after displaying preview area 612 in
In some embodiments, the preview-area-disappearance criteria include (1524) a requirement that no action icons are displayed in the preview area during the second portion of the input. In some embodiments, the preview area ceases to be displayed after the input ends if there are no buttons or other icons displayed in the preview area that are responsive to user inputs. For example, device 100 restores the appearance of user interface 600 in
In some embodiments, the preview-area-disappearance criteria include (1526) a requirement that the user-interface-replacement criteria are not satisfied and a requirement that the preview-area-maintenance criteria are not satisfied. For example, device 100 restores the appearance of user interface 600 in
In some embodiments, in accordance with a determination that the second portion of the input by the contact meets preview-area-maintenance criteria, the device maintains (1528) display of the preview area overlaid on at least some of the plurality of user interface objects in the first user interface, after the input ends (e.g., by liftoff of the contact after swiping up to reveal additional options for interacting with the preview area, or the equivalent of liftoff of the contact). In some embodiments, in response to detecting liftoff, the preview area remains displayed over the first user interface when preview-area-maintenance criteria are met. For example, because action icons 624, 626, and 628 were revealed in
In some embodiments, the preview-area-maintenance criteria include (1530) a requirement that the second portion of the input include movement of the contact across the touch-sensitive surface that moves the focus selector in a predefined direction on the display. For example, device 100 maintains display of preview area 612 after liftoff of contact 618 in
In some embodiments, the preview-area-maintenance criteria include (1532) a requirement that action icons are displayed in the preview area during the second portion of the input. For example, because action icons 624, 626, and 628 were revealed in
In some embodiments, in accordance with a determination that the first portion of the input meets hint criteria prior to meeting the preview criteria (e.g., the input is a press input with a characteristic intensity in the first portion of the input that meets hint criteria, such as a characteristic intensity that meets a “hint” intensity threshold, prior to meeting preview criteria, such as a characteristic intensity that meets a “peek” intensity threshold), the device visually obscures (1534) (e.g., blurs, darkens, and/or makes less legible) the plurality of user interface objects other than the first user interface object in the first user interface. For example, device 100 detects an increase in the intensity of contact 610 between
In some embodiments, displaying the preview area overlaid on at least some of the plurality of user interface objects in the first user interface in response to detecting the first portion of the input includes displaying (1536) an animation in which the plurality of user interface objects other than the first user interface object in the first user interface are further obscured. For example, device 100 detects a further increase in the intensity of contact 610 between
In some embodiments, determining that the first portion of the input meets hint criteria includes, while the focus selector is over the first user interface object, in the plurality of user interface objects, on the display, detecting (1538) the characteristic intensity of the contact increase to a first intensity threshold (e.g., a “hint” intensity threshold at which the device starts to display visual hints that pressing on a respective user interface object will provide a preview of another user interface that can be reached by pressing harder on the respective user interface object). For example, device 100 detects an increase in the intensity of contact 610 between
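By way of illustration only, a minimal sketch of the hint-stage obscuring follows, assuming a hypothetical backgroundBlurRadius function and illustrative threshold and blur values: the blur applied to the non-selected user interface objects ramps up as the characteristic intensity rises from the "hint" threshold toward the "peek" threshold.

import Foundation

// Minimal sketch, assuming illustrative threshold values: the blur applied to the
// non-selected user interface objects ramps up as intensity rises from a "hint"
// threshold toward a "peek" threshold, so the objects become further obscured.
func backgroundBlurRadius(intensity: Double,
                          hintThreshold: Double = 0.25,
                          peekThreshold: Double = 0.5,
                          maxBlur: Double = 12.0) -> Double {
    guard intensity > hintThreshold else { return 0 }   // hint criteria not yet met
    let progress = min(1.0, (intensity - hintThreshold) / (peekThreshold - hintThreshold))
    return progress * maxBlur
}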
In some embodiments, while detecting the first portion of the input and displaying the preview area, the device detects (1540) the characteristic intensity of the contact changing over time (e.g., increasing above a second intensity threshold (a “peek” intensity threshold)). In response to detecting the characteristic intensity of the contact changing over time (e.g., increasing above the second intensity threshold), the device dynamically changes the size of the preview area in accordance with changes in the characteristic intensity of the contact. For example, device 100 detects an increase in the intensity of contact 610, above peek intensity threshold ITL, between
In some embodiments, the size of the preview area (and, optionally, the magnification of the content within the preview area) dynamically increases in accordance with the increase in the characteristic intensity of the contact above the second intensity threshold until the size of the preview area reaches a predefined maximum size (e.g., 80, 85, 90, 92, or 95% of the size of the first user interface). In some embodiments, the size of the preview area (and, optionally, the magnification of the content within the preview area) dynamically decreases in accordance with the decrease in the characteristic intensity of the contact (e.g., while above the second intensity threshold). In some embodiments, the size of the preview area dynamically decreases in accordance with the decrease in the characteristic intensity of the contact until the size of the preview area reaches a predefined minimum size (e.g., 70, 75, 80, 85, or 90% of the size of the first user interface). In some embodiments, the preview area is displayed at a predefined size (e.g., 80, 85, 90, 92, or 95% of the size of the first user interface) in response to detecting the characteristic intensity of the contact increase to the second intensity threshold.
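By way of illustration only, a minimal sketch of the dynamic resizing follows, assuming a hypothetical previewScale function and illustrative threshold and scale values: the preview scale grows with the characteristic intensity above the second ("peek") intensity threshold and is clamped between an assumed minimum and maximum fraction of the size of the first user interface.

import Foundation

// Minimal sketch, assuming illustrative thresholds: the preview scale grows with the
// characteristic intensity above the "peek" threshold and is clamped between an
// assumed minimum and maximum fraction of the first user interface's size.
func previewScale(intensity: Double,
                  peekThreshold: Double = 0.5,
                  popThreshold: Double = 1.0,
                  minScale: Double = 0.80,
                  maxScale: Double = 0.95) -> Double {
    let overshoot = max(0, intensity - peekThreshold)
    let fraction = overshoot / (popThreshold - peekThreshold)
    let scale = minScale + (maxScale - minScale) * fraction
    return min(maxScale, max(minScale, scale))
}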
In some embodiments, in accordance with a determination that the second portion of the input by the contact includes movement of the contact across the touch-sensitive surface, the device moves (1542) the preview area in accordance with the movement of the contact (e.g., slides the preview in a direction determined based on a direction of movement of the contact on the touch-sensitive surface and optionally revealing one or more actions associated with the preview that include selectable options or swipe options). For example, device 100 detects movement of contacts 618, 630, and 646 up, left, and right on touch screen 112 in
In some embodiments, in accordance with a determination that the second portion of the input by the contact includes movement of the contact across the touch-sensitive surface, the device moves (1544) the focus selector in accordance with the movement of the contact (e.g., the movement of the focus selector is an upward movement across the displayed preview); and displays one or more action items (e.g., displays a menu of actions that includes multiple action items, such as menu 622 including action items 624, 626, and 628 in
In some embodiments, the device provides (1546) (e.g., generates or outputs with one or more tactile output generators of the device) a tactile output (e.g., a second tactile output such as a click) indicative of display of the one or more action items, wherein the tactile output indicative of display of the one or more action items is different from the first tactile output indicative of displaying the preview area (e.g., tactile feedback 623 in
In some embodiments, while the preview area is displayed on the display and the one or more action items are not displayed, the device displays (1548) an indicator indicating that the one or more action items associated with the first user interface object are hidden (e.g., displays a caret at the top of the preview area, or at the top of the first user interface, e.g., caret 619 in
In some embodiments, the indicator is (1550) configured to represent a direction of movement of a focus selector that triggers display of the one or more action items associated with the first user interface object. For example, a caret at the top of the preview area or at the top of the first user interface indicates that a swipe by the contact that moves the focus selector upward will trigger the display of a menu of actions associated with the first user interface object (e.g., caret 619 in
In some embodiments, the movement of the contact across the touch-sensitive surface causes (1552) a movement of the focus selector on the display in a first direction (e.g., the first direction is approximately horizontal, from left to right or from right to left); and displaying the one or more action items that are associated with the first user interface object includes shifting the preview area in the first direction on the display and revealing the one or more action items (e.g., from behind the preview area or from an edge of the display) as the preview area is shifted in the first direction. For example, device 100 detects movement of contacts 630 and 646 to the left and right on touch screen 112 in
In some embodiments, after revealing the one or more action items, the device continues (1554) to shift the preview area in the first direction on the display in accordance with the movement of the contact (e.g., while maintaining a position of the one or more action items on the display). For example, movement of contact 630 from position 630-c to 630-d, and then 630-e, in
In some embodiments, displaying the one or more action items associated with the first user interface object includes displaying (1556) a first action item associated with the first user interface object. While displaying the first action item associated with the first user interface object, the device detects that the movement of the contact causes the focus selector to move at least a first threshold amount on the display before detecting lift-off of the contact (e.g., movement of contact 630 from position 630-a to 630-d in
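By way of illustration only, a minimal sketch of this swipe-and-liftoff behavior follows, assuming a hypothetical SwipeState type and an illustrative displacement threshold: once the focus selector has moved at least the threshold amount, the action item's visual appearance changes, and the associated action is performed on liftoff.

import Foundation

// Minimal sketch of the horizontal-swipe behavior, assuming an illustrative
// displacement threshold: once the focus selector has moved at least the threshold
// amount, the revealed action item changes appearance and is performed on liftoff.
struct SwipeState {
    var previewOffset: Double = 0         // how far the preview area has been shifted
    var actionHighlighted: Bool = false   // visual appearance changed (e.g., color inverted)
}

func updateSwipe(_ state: inout SwipeState, horizontalDelta: Double, revealThreshold: Double = 80) {
    state.previewOffset += horizontalDelta
    state.actionHighlighted = abs(state.previewOffset) >= revealThreshold
}

func handleLiftoff(_ state: SwipeState, performAction: () -> Void) {
    // If the threshold was crossed before liftoff, perform the associated action
    // (e.g., flag or delete); otherwise the preview simply slides back.
    if state.actionHighlighted { performAction() }
}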
In some embodiments, in accordance with a determination that the first portion of the input meets preview criteria, the device provides (1558) (e.g., generates or outputs with one or more tactile output generators of the device) a tactile output (e.g., a first tactile output such as a buzz or tap) indicative of display of the preview area, in conjunction with displaying the preview area (e.g., tactile feedback 61 in
In some embodiments, in accordance with a determination that the second portion of the input by the contact, detected after the first portion of the input, meets user-interface-replacement criteria, the device provides (1560) a tactile output (e.g., second tactile output such as a buzz or tap) indicative of replacement of the first user interface, wherein the tactile output is provided in conjunction with replacing display of the first user interface and the overlaid preview area with display of the second user interface (e.g., tactile feedback 615 in
In some embodiments, the first tactile output is different from the second tactile output based on differences in amplitudes of the tactile outputs. In some embodiments, the first type of tactile output is generated by movement of the touch-sensitive surface that includes a first dominant movement component. For example, the generated movement corresponds to an initial impulse of the first tactile output, ignoring any unintended resonance. In some embodiments, the second type of tactile output is generated by movement of the touch-sensitive surface that includes a second dominant movement component. For example, the generated movement corresponds to an initial impulse of the second tactile output, ignoring any unintended resonance. In some embodiments, the first dominant movement component and the second dominant movement component have a same movement profile and different amplitudes. For example, the first dominant movement component and the second dominant movement component have the same movement profile when the first dominant movement component and the second dominant movement component have a same waveform shape, such as square, sine, sawtooth, or triangle, and approximately the same period.
In some embodiments, the first tactile output is different from the second tactile output based on differences in movement profiles of the tactile outputs. In some embodiments, the first type of tactile output is generated by movement of the touch-sensitive surface that includes a first dominant movement component. For example, the generated movement corresponds to an initial impulse of the first tactile output, ignoring any unintended resonance. In some embodiments, the second type of tactile output is generated by movement of the touch-sensitive surface that includes a second dominant movement component. For example, the generated movement corresponds to an initial impulse of the second tactile output, ignoring any unintended resonance. In some embodiments, the first dominant movement component and the second dominant movement component have different movement profiles and a same amplitude. For example, the first dominant movement component and the second dominant movement component have different movement profiles when the first dominant movement component and the second dominant movement component have a different waveform shape, such as square, sine, sawtooth, or triangle, and/or a different period.
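By way of illustration only, a minimal sketch of two dominant movement components follows, assuming a hypothetical dominantMovementComponent function and illustrative amplitudes and periods: the two tactile outputs can share a waveform shape and differ only in amplitude, or share an amplitude and differ in waveform shape and/or period.

import Foundation

// Minimal sketch of two dominant movement components, with illustrative values.
enum Waveform { case sine, square }

func dominantMovementComponent(shape: Waveform,
                               amplitude: Double,
                               period: Double,
                               sampleCount: Int = 64) -> [Double] {
    return (0..<sampleCount).map { (i: Int) -> Double in
        let t = Double(i) / Double(sampleCount)
        let phase = 2 * Double.pi * t / period
        switch shape {
        case .sine:   return amplitude * sin(phase)
        case .square: return amplitude * (sin(phase) >= 0 ? 1.0 : -1.0)
        }
    }
}

// Same movement profile, different amplitudes (e.g., a softer preview tactile output
// and a stronger user-interface-replacement tactile output):
let firstTactileOutput  = dominantMovementComponent(shape: .sine, amplitude: 0.5, period: 1.0)
let secondTactileOutput = dominantMovementComponent(shape: .sine, amplitude: 1.0, period: 1.0)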
In some embodiments, in accordance with a determination that the second portion of the input by the contact includes movement of the contact across the touch-sensitive surface that moves the focus selector in a respective direction and that meets a respective movement threshold (e.g., a distance and/or speed threshold), the device performs (1562) an operation associated with movement in the respective direction (e.g., the action that is revealed when the preview area is moved to the left or right) in response to detecting the end of the input. For example, in response to moving contact 632 past a movement threshold, as indicated by the change in color of action icon 634 in
In some embodiments, in accordance with a determination that the second portion of the input by the contact includes movement of the contact across the touch-sensitive surface that moves the focus selector in the respective direction and that does not meet the respective movement threshold (e.g., a distance and/or speed threshold), the device foregoes performing the operation associated with movement in the respective direction in response to detecting the end of the input. For example, because contact 638 does not move past a movement threshold in
In some embodiments, movement of the focus selector in a first direction is (1564) associated with a first action and movement of the focus selector in a second direction is associated with a second action (e.g., movement to the left reveals the “delete” icon in
In some embodiments, movement of the focus selector in the first direction is (1566) associated with a first threshold and movement of the focus selector in the second direction is associated with a second threshold that is higher than the first threshold (e.g., because the second action associated with movement in the second direction is destructive such as deleting a message, while the first action associated with movement in the first direction is non-destructive such as flagging a message as read or unread). For example, contact 632 must move farther to the left to delete message 602 from user interface 600 in
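By way of illustration only, a minimal sketch of the direction-dependent thresholds follows, assuming a hypothetical outcomeForSwipe function and illustrative distances: the destructive action (leftward, e.g., delete) is associated with a larger movement threshold than the non-destructive action (rightward, e.g., flag).

import Foundation

// Minimal sketch, assuming illustrative distances: the destructive (leftward) action
// requires a larger movement than the non-destructive (rightward) action.
enum SwipeOutcome { case flag, delete }

func outcomeForSwipe(displacement: Double,      // positive = right, negative = left
                     flagThreshold: Double = 60,
                     deleteThreshold: Double = 120) -> SwipeOutcome? {
    if displacement >= flagThreshold { return .flag }
    if displacement <= -deleteThreshold { return .delete }
    return nil   // neither movement threshold met; no operation is performed on liftoff
}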
It should be understood that the particular order in which the operations in
In accordance with some embodiments, a method is performed at an electronic device with a touch-sensitive surface and a display. The device includes one or more sensors to detect intensity of contacts with the touch-sensitive surface. The device displays a plurality of user interface objects in a first user interface on the display. The device detects a first portion of a press input by a contact at a location on the touch-sensitive surface that corresponds to a location of a first user interface object, in the plurality of user interface objects, on the display. While detecting the first portion of the press input by the contact at the location on the touch-sensitive surface that corresponds to the location of the first user interface object, in the plurality of user interface objects, on the display, the device selects the first user interface object and detects the intensity of the contact increase to a second intensity threshold. In response to detecting the intensity of the contact increase to the second intensity threshold, the device displays in the first user interface a preview area overlaid on at least some of the plurality of user interface objects. After detecting the first portion of the press input, the device detects a second portion of the press input by the contact. In response to detecting the second portion of the press input by the contact, in accordance with a determination that the second portion of the press input by the contact meets user-interface-replacement criteria, the device replaces display of the first user interface with a second user interface that is distinct from the first user interface. In accordance with a determination that the second portion of the press input by the contact meets preview-area-maintenance criteria, the device maintains display, after the press input ends, of the preview area overlaid on at least some of the plurality of user interface objects in the first user interface. In accordance with a determination that the second portion of the press input by the contact meets preview-area-disappearance criteria, the device ceases to display the preview area and maintains display, after the press input ends, of the first user interface.
As noted just above, in some embodiments, the device displays a plurality of user interface objects in a first user interface on the display (e.g., a plurality of application launch icons, a plurality of rows in a list, a plurality of email messages, or a plurality of instant messaging conversations).
The device detects a first portion of a press input by a contact at a location on the touch-sensitive surface that corresponds to a location of a first user interface object, in the plurality of user interface objects, on the display. In some embodiments, the press input is made by a single contact on the touch-sensitive surface. In some embodiments, the press input is a stationary input. In some embodiments, the contact in the press input moves across the touch-sensitive surface during the press input.
While detecting the first portion of the press input by the contact at the location on the touch-sensitive surface that corresponds to the location of the first user interface object, in the plurality of user interface objects, on the display, the device selects the first user interface object. In some embodiments, a focus selector is placed over the first user interface object.
The device detects the intensity of the contact increase to a second intensity threshold (e.g., a “peek” intensity threshold at which the device starts to display a preview of another user interface that can be reached by pressing harder on the respective user interface object).
In response to detecting the intensity of the contact increase to the second intensity threshold, the device displays in the first user interface a preview area overlaid on at least some of the plurality of user interface objects, wherein the preview area is associated with the first user interface object.
After detecting the first portion of the press input, the device detects a second portion of the press input by the contact.
In response to detecting the second portion of the press input by the contact, in accordance with a determination that the second portion of the press input by the contact meets user-interface-replacement criteria, the device replaces display of the first user interface with a second user interface that is distinct from the first user interface.
In accordance with a determination that the second portion of the press input by the contact meets preview-area-maintenance criteria, the device maintains display, after the press input ends (e.g., by liftoff of the contact), of the preview area overlaid on at least some of the plurality of user interface objects in the first user interface.
In accordance with a determination that the second portion of the press input by the contact meets preview-area-disappearance criteria, the device ceases to display the preview area and maintains display, after the press input ends (e.g., by liftoff of the contact), of the first user interface.
In some embodiments, the preview area includes a reduced scale representation of the second user interface. In some embodiments, the second user interface is a user interface that is also displayed in response to detecting a tap gesture on the first user interface object, instead of the press input by the contact.
In some embodiments, while detecting the first portion of the press input by the contact at the location on the touch-sensitive surface that corresponds to the location of the first user interface object on the display, prior to detecting the intensity of the contact increase to the second intensity threshold, the device detects the intensity of the contact increase to a first intensity threshold (e.g., a "hint" intensity threshold at which the device starts to display visual hints that pressing on a respective user interface object will provide a preview of another user interface that can be reached by pressing harder on the respective user interface object). In some embodiments, in response to detecting the intensity of the contact increase to the first intensity threshold, the device visually obscures (e.g., blurs, darkens, and/or makes less legible) the plurality of user interface objects other than the first user interface object in the first user interface. In some embodiments, non-selected user interface objects are visually obscured and the selected first user interface object is not visually obscured. In some embodiments, additional objects besides the plurality of user interface objects are displayed (e.g., objects in a status bar or navigation icons within the user interface) and these additional objects are not visually obscured when the intensity of the contact increases to or exceeds the first intensity threshold. In some embodiments, these additional objects are also visually obscured when the intensity of the contact increases to or exceeds the first intensity threshold.
In some embodiments, while detecting the first portion of the press input by the contact at the location on the touch-sensitive surface that corresponds to the location of the first user interface object on the display, the device detects that the intensity of the contact continues to increase above the second intensity threshold. In some embodiments, in response to detecting that the intensity of the contact continues to increase above the second intensity threshold, the device dynamically increases the size of the preview area. In some embodiments, the size of the preview area dynamically increases in accordance with the increase in the intensity of the contact above the second intensity threshold. In some embodiments, the size of the preview area dynamically increases in accordance with the increase in the intensity of the contact above the second intensity threshold until the size of the preview area reaches a predefined maximum size (e.g., 80, 85, 90, 92, or 95% of the size of the first user interface). In some embodiments, the preview area is displayed at a predefined size (e.g., 80, 85, 90, 92, or 95% of the size of the first user interface) in response to detecting the intensity of the contact increase to the second intensity threshold.
In accordance with some embodiments,
As shown in
The processing unit 1608 is configured to detect an input by a contact while a focus selector is over a first user interface object, in the plurality of user interface objects, on the display unit 1602 (e.g., with detecting unit 1614).
In accordance with a determination that the input meets selection criteria, the processing unit 1608 is configured to enable display of a second user interface that is distinct from the first user interface in response to detecting the input (e.g., with display enabling unit 1612).
In accordance with a determination that a first portion of the input meets preview criteria, the processing unit 1608 is configured to enable display of a preview area overlaid on at least some of the plurality of user interface objects in the first user interface in response to detecting the first portion of the input (e.g., with display enabling unit 1612), wherein the preview area includes a reduced scale representation of the second user interface.
In accordance with a determination that a second portion of the input by the contact, detected after the first portion of the input, meets user-interface-replacement criteria, the processing unit 1608 is configured to replace display of the first user interface and the overlaid preview area with display of the second user interface (e.g., with replacing unit 1616).
In accordance with a determination that the second portion of the input by the contact meets preview-area-disappearance criteria, the processing unit 1608 is configured to cease to display the preview area (e.g., with ceasing unit 1618) and enable display of the first user interface after the input ends (e.g., with display enabling unit 1612).
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described above with respect to
The device displays (1702), on the display, a first user interface that includes a plurality of selectable user interface objects, including one or more user interface objects of a first type (e.g., user interface objects associated with “non-sticky” supplemental information (e.g., previews), such as date and time 704 in
While displaying the first user interface on the display, the device detects (1704) a first portion of a first input that includes detecting an increase in a characteristic intensity of a first contact on the touch-sensitive surface above a first intensity threshold (e.g., a “peek” intensity threshold, which may be the same as a threshold for a “light” press input) while a focus selector is over a respective user interface object of the plurality of selectable user interface objects (e.g., an increase in the intensity of contacts 706, 708, 722, 726, 728, 732, and 736 in
In response to detecting the first portion of the first input, the device displays (1706) supplemental information associated with the respective user interface object (e.g., preview area 707 in
While displaying the supplemental information associated with the respective user interface object, the device detects (1708) an end of the first input (e.g., detecting lift-off of the first contact, as illustrated with a broken-lined circle in
In response to detecting the end of the first input: in accordance with a determination that the respective user interface object is the first type of user interface object, the device ceases (1710) to display the supplemental information associated with the respective user interface object (e.g., when the respective user interface object has non-sticky supplemental information (e.g., a preview), the supplemental information is removed when the first input is terminated, as illustrated by removal of preview area 707 in
In some embodiments, when the respective user interface object is the first type of user interface object, the supplemental information includes (1712) a preview of a second user interface (e.g., preview area 707 displays a preview of calendar application user interface 724 in
In some embodiments, when the respective user interface object is the second type of user interface object, the supplemental information includes (1714) a first menu of actions that are associated with the respective user interface object (e.g., a quick action menu that includes a small number of most frequently used actions as its menu items, for example, quick action menu 710 in
In some embodiments, the device detects (1716) a second portion of the first input after the first portion of the first input and before the end of the first input, where detecting the second portion of the first input includes detecting a decrease in the characteristic intensity of the first contact below the first intensity threshold without detecting liftoff of the contact from the touch-sensitive surface. In response to detecting the second portion of the first input, the device maintains (1718) display of the supplemental information associated with the respective user interface object. For example, device 100 maintains display of preview area 707 and quick-action menu 710 after detecting decreases in contacts 706 and 708 in
In some embodiments, after detecting the end of the first input and ceasing to display the supplemental information associated with the respective user interface object (e.g., after the supplemental information is removed from the display (1) after the end of the first input and in accordance with the determination that the respective user interface object is the first type of user interface object, or (2) after detecting another dismissal input (e.g., a tap outside of the first menu of actions) and in accordance with the determination that the respective user interface object is the second type of user interface object): while displaying the first user interface on the display, the device detects (1720) a first portion of a second input that includes detecting an increase in a characteristic intensity of a second contact on the touch-sensitive surface above the first intensity threshold while the focus selector is over the respective user interface object. For example, after display of preview area 707 is ceased in user interface 700 in
In response to detecting the first portion of the second input, the device redisplays the supplemental information associated with the respective user interface object. The device detects a second portion of the second input that includes detecting an increase in the characteristic intensity of the second contact on the touch-sensitive surface above a second intensity threshold (e.g., the second intensity threshold is an intensity threshold that is higher than the first intensity threshold). In response to detecting the second portion of the second input: in accordance with a determination that the respective user interface object is the first type of user interface object, the device replaces display of the first user interface and the supplemental information with a second user interface (e.g., the second user interface is also displayed upon selection of the respective user interface object in the first user interface); and, in accordance with a determination that the respective user interface object is the second type of user interface object, the device maintains display of the supplemental information associated with the respective user interface object (e.g., without displaying an additional interface as the intensity increases above the first intensity threshold). For example, in response to the increase in intensity of contact 722 above intensity threshold ITD, the device replaces display of email message viewing user interface 700, associated with an email messaging application, with new event user interface 724, associated with a calendar application, in
In some embodiments, after detecting the end of the first input and ceasing to display the supplemental information associated with the respective user interface object (e.g., the supplemental information is removed from the display (1) after the end of the first input and in accordance with the determination that the respective user interface object is the first type of user interface object, or (2) after detecting another dismissal input (e.g., a tap outside of the first menu of actions) and in accordance with the determination that the respective user interface object is the second type of user interface object): while displaying the first user interface on the display, the device detects (1722) a first portion of a second input that includes detecting an increase in a characteristic intensity of a second contact on the touch-sensitive surface above the first intensity threshold while the focus selector is over the respective user interface object. In some embodiments, when the supplemental information is removed from the display, the first user interface is restored.
In response to detecting the first portion of the second input, the device redisplays the supplemental information associated with the respective user interface object. The device detects a second portion of the second input that includes detecting an increase in the characteristic intensity of the second contact on the touch-sensitive surface above a second intensity threshold (e.g., the second intensity threshold is an intensity threshold that is higher than the first intensity threshold). In response to detecting the second portion of the second input: in accordance with a determination that the respective user interface object is the first type of user interface object, the device replaces display of the first user interface and the supplemental information with a second user interface, wherein the second user interface is also displayed upon selection of the respective user interface object in the first user interface; and, in accordance with a determination that the respective user interface object is the second type of user interface object, the device replaces display of the first user interface and the supplemental information with a third user interface, wherein the third user interface is different from a respective user interface that is displayed upon selection of the respective user interface object in the first user interface. For example, in response to the increase in intensity of contact 722 above intensity threshold ITD, the device replaces display of email message viewing user interface 700, associated with an email messaging application, with new event user interface 724, associated with a calendar application, in
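By way of illustration only, a minimal sketch of the object-type-dependent handling described in this paragraph follows, assuming hypothetical type names and an illustrative threshold value: above the second intensity threshold, a first-type object leads to the second user interface, while a second-type object leads to a third user interface that differs from the interface reached by simply selecting the object.

import Foundation

// Minimal sketch; type names and the threshold value are assumptions.
enum ObjectKind { case firstType, secondType }
enum DisplayedInterface { case first, second, third }

func interfaceAfterSecondInput(kind: ObjectKind,
                               intensity: Double,
                               secondIntensityThreshold: Double = 1.0) -> DisplayedInterface {
    // Below the second intensity threshold, the first user interface (with the
    // redisplayed supplemental information) remains on the display.
    guard intensity >= secondIntensityThreshold else { return .first }
    switch kind {
    case .firstType:  return .second   // the interface also reached by selecting the object
    case .secondType: return .third    // different from the interface reached by selection
    }
}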
In some embodiments, in accordance with a determination that the increase in the characteristic intensity of the second contact is accompanied by a movement of the second contact, the device disables (1724) replacement of the first user interface and the supplemental information with the second user interface. In some embodiments, movement of the contact in any direction across the displayed/redisplayed supplemental information disables responses to an increase in contact intensity above the second intensity threshold that may occur during the movement of the contact. For example, in response to detecting an increase in the intensity of contact 728 above intensity threshold ITD in
In some embodiments, while displaying the supplemental information on the display and prior to detecting the end of the first input, the device detects (1726) a second portion of the first input that includes movement of the first contact on the touch-sensitive surface. In response to detecting the second portion of the first input that includes the movement of the first contact: in accordance with a determination that the respective user interface object is the first type of user interface object, the device moves the supplemental information in accordance with the movement of the first contact (e.g., the device slides the peek platter in a direction determined based on a direction of movement of the contact on the touch-sensitive surface and optionally reveals one or more actions associated with the peek platter including selectable options or swipe options); and in accordance with a determination that the respective user interface object is the second type of user interface object, the device maintains a position of the supplemental information and highlights a selectable object in the supplemental information in accordance with the movement of the first contact (e.g., highlights a menu option in the quick action menu when the contact slides over the menu option). For example, in response to detecting movement 730 of contact 728, the device moves preview area 707 to the right in
In some embodiments, after detecting the end of the first input and ceasing to display the supplemental information associated with the respective user interface object (e.g., the supplemental information is removed from the display (1) after the end of the first input and in accordance with the determination that the respective user interface object is the first type of user interface object, or (2) after detecting another dismissal input (e.g., a tap outside of the first menu of actions) and in accordance with the determination that the respective user interface object is the second type of user interface object): while displaying the first user interface on the display, the device detects (1728) a first portion of a second input that includes detecting an increase in a characteristic intensity of a second contact on the touch-sensitive surface above the first intensity threshold while the focus selector is over the respective user interface object of the plurality of user interface objects. In response to detecting the first portion of the second input, the device redisplays the supplemental information associated with the respective user interface object. The device detects a second portion of the second input that includes detecting a movement of the second contact on the touch-sensitive surface that corresponds to a movement of the focus selector on the display (e.g., the movement of the focus selector is an upward movement across the displayed preview, or a movement over one of the actions in the displayed first menu of actions). In response to detecting the second portion of the second input: in accordance with a determination that the respective user interface object is the first type of user interface object, the device displays one or more action items that are associated with the respective user interface object in the first user interface (e.g., displaying a second menu of actions that includes multiple action items, or displaying a single action item); and, in accordance with a determination that the respective user interface object is the second type of user interface object: the device maintains the redisplay of supplemental information associated with the respective user interface object (e.g., maintains display of the first menu of actions associated with the respective user interface object) and highlights a respective portion of the redisplayed supplemental information. For example, in response to detecting movement 730 of contact 728, the device moves preview area 707 to the right, revealing action icon 732 in
In some embodiments, in accordance with a determination that the respective user interface object is the first type of user interface object, the displayed one or more action items are included in a second menu of actions (e.g., an action platter), and each action item in the second menu of actions is individually selectable and would trigger performance of a corresponding action upon selection. In some embodiments, performance of a corresponding action is triggered by detecting lift off of the contact while the focus selector is over the action item. In some embodiments, performance of a corresponding action is triggered by detecting a press input (e.g., a deep press input) by the contact while the focus selector is over the action item. In some embodiments, performance of a corresponding action is triggered by detecting a tap gesture by another contact while the focus selector is over the action item. In some embodiments, an upward movement of the focus selector causes the preview to move up on the display to make room for the second menu of actions. In some embodiments, the second menu of actions has a different look and/or haptics from the first menu of actions. In some embodiments, a sideways movement (e.g., toward the left or the right side of the display) causes the preview to move left or right, and one or more action items (e.g., as represented by corresponding action icons) are revealed from behind the preview platter. In some embodiments, in accordance with a determination that the respective user interface object is the second type of user interface object, the displayed supplemental information is the first menu of actions associated with the respective user interface object, and movement of the contact causes a default action in the first menu of actions to become highlighted. Alternatively, the action that is under the focus selector after the movement of the focus selector is highlighted. In some embodiments, subsequent lift-off of the second contact while the focus selector is on a highlighted action item in the first menu of actions causes performance of the highlighted action, and display of the first menu of actions (and, in some cases, the first user interface) ceases upon detecting the lift-off of the second contact.
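By way of illustration only, a minimal sketch contrasting the two object types when the contact moves follows, assuming hypothetical type names and illustrative menu geometry: for the first type, the supplemental information (the preview) slides with the contact, revealing action items; for the second type, the menu keeps its position and the item under the focus selector is highlighted.

import Foundation

// Minimal sketch; the type names, layout values, and menu geometry are assumptions.
enum SupplementalKind { case preview, quickActionMenu }   // "first type" vs. "second type"

struct MovementEffect {
    var previewOffset: Double = 0         // the preview slides with the contact (first type)
    var highlightedMenuItem: Int? = nil   // the menu item under the focus selector (second type)
}

func effectOfMovement(kind: SupplementalKind,
                      horizontalDelta: Double,
                      focusSelectorY: Double,
                      menuTop: Double = 300,
                      menuItemHeight: Double = 44,
                      menuItemCount: Int = 4) -> MovementEffect {
    var effect = MovementEffect()
    switch kind {
    case .preview:
        // First type: the supplemental information moves, revealing action items behind it.
        effect.previewOffset = horizontalDelta
    case .quickActionMenu:
        // Second type: the menu keeps its position; the item under the focus selector
        // is highlighted instead.
        let index = Int((focusSelectorY - menuTop) / menuItemHeight)
        if index >= 0 && index < menuItemCount {
            effect.highlightedMenuItem = index
        }
    }
    return effect
}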
In some embodiments, in response to detecting the first portion of the first input: in accordance with the determination that the respective user interface object is the first type of user interface object, the device provides (1730) a first tactile output (e.g., a buzz, such as tactile feedback 705 in
In some embodiments, in accordance with the determination that the respective user interface object is the first type of user interface object, the device provides (1732) a third tactile output (e.g., a click, such as tactile feedback 733 in
In some embodiments, the respective user interface object is the first type of object. While the supplemental information associated with the respective user interface object is displayed on the display and the one or more action items are not displayed: in accordance with the determination that the respective user interface object is the first type of user interface object, the device displays (1734) an indicator indicating that the one or more action items associated with the respective user interface object are hidden (e.g., displays a caret at the top of the user interface area that displays the supplemental information, or at the top of the first user interface, such as caret 729 in
In some embodiments, the indicator is (1736) configured to represent a direction of movement of a contact that triggers display of the one or more action items associated with the respective user interface object. For example, a caret at the top of the user interface area that displays the supplemental information (e.g., the preview), or at the top of the first user interface indicates that a swipe upward by the second contact will trigger the display of the second menu of actions associated with the respective user interface object. In some embodiments, if the second menu of actions is triggered by a swipe to one or both sides (e.g., left or right) of a preview, an indicator is displayed on that side or sides of the preview (e.g., caret 729 displayed on the right side of preview area 707 in
In some embodiments, the respective user interface object is (1738) the first type of object. The movement of the second contact on the touch-sensitive surface corresponds to a movement of the focus selector on the display in a first direction (e.g., the first direction is approximately horizontal, from left to right or from right to left). Displaying the one or more action items that are associated with the respective user interface object in the first user interface includes: shifting the supplemental information in the first direction on the display; and revealing the one or more action items (e.g., from behind the supplemental information or from an edge of the display) as the supplemental information is shifted in the first direction. For example, in response to movement 730 of contact 728 to the right, preview area 707 moves to the right revealing action icon 732 in
In some embodiments, after revealing the one or more action items: the device continues (1740) to shift the supplemental information in the first direction on the display in accordance with the movement of the second contact (e.g., while maintaining a position of the first action item on the display, as illustrated in
In some embodiments, displaying the one or more action items associated with the respective user interface object includes (1742) displaying a first action item associated with the respective user interface object. After displaying the first action item associated with the respective user interface object, the device detects that the movement of the second contact corresponds to movement of the focus selector by at least a first threshold amount on the display before detecting lift-off of the second contact (e.g., the preview is dragged along by the focus selector on the user interface by at least the same threshold amount (e.g., an amount that causes the icon of the first action item to be displayed at the center of the space between the edge of the user interface and the edge of the preview platter)). In response to detecting that the movement of the second contact corresponds to movement of the focus selector by at least the first threshold amount on the display, the device changes a visual appearance of the first action item (e.g., by inverting the color of the first action item, as illustrated by the change in color of action icon 732 from
In some embodiments, the respective user interface object is (1744) the first type of object. The device detects a second portion of the first input that includes movement in a respective direction. In response to detecting the end of the first input: in accordance with a determination that the movement in the respective direction meets a respective movement threshold (e.g., a distance and/or speed threshold), the device performs an operation associated with movement in the respective direction (e.g., the action that is revealed when the preview platter is moved to the left or right); and in accordance with a determination that the movement in the respective direction does not meet the respective movement threshold (e.g., a distance and/or speed threshold), the device forgoes performance of the operation associated with movement in the respective direction. For example, in response to movement 730 of contact 728 far to the right, action icon 732 changes color and the device performs the associated action (e.g., creating a new calendar event) upon liftoff in
In some embodiments, movement of the focus selector in a first direction is (1746) associated with a first action and movement of the focus selector in a second direction is associated with a second action (e.g., movement to the left reveals the “delete” icon for deleting the content associated with the respective user interface object (e.g., an email message), while movement to the right reveals a “flag” icon for marking the content associated with the respective user interface object (e.g., an email message)). For example, as described with respect to
In some embodiments, movement of the focus selector in the first direction is (1748) associated with a first threshold and movement of the focus selector in the second direction is associated with a second threshold that is higher than the first threshold (e.g., because the second action associated with movement in the second direction is destructive such as deleting a message, while the first action associated with movement in the first direction is non-destructive such as flagging a message as read or unread). For example, as described with respect to
In some embodiments, after ceasing to display the supplemental information associated with the respective user interface object: while displaying the first user interface on the display (e.g., the supplemental information is removed from the display (1) after the end of the first input and in accordance with the determination that the respective user interface object is the first type of user interface object, or (2) after detecting another dismissal input (e.g., a tap outside of the first menu of actions) and in accordance with the determination that the respective user interface object is the second type of user interface object), the device detects (1750) a third input that includes detecting a third contact with the characteristic intensity below the first intensity threshold on the touch-sensitive surface and lift-off of the third contact while the focus selector is over the respective user interface object of the plurality of user interface objects (e.g., the third input is a tap gesture on the respective user interface object). In response to detecting the third input, the device replaces the first user interface with a second user interface associated with the respective user interface element (e.g., if the respective user interface element is a hyperlink, the second user interface that is displayed in response to the third input includes a webpage or document located at the address associated with the hyperlink. In another example, if the respective user interface element displays a representation (e.g., a name or avatar) of a contact, the second user interface that is displayed in response to the third input includes a contact card of the contact). For example, in response to detecting the tap gesture including contact 740 in
In some embodiments, the first type of user interface object includes (1752) a link to a webpage or document.
In some embodiments, the second type of user interface object includes (1754) a representation of a contactable entity (e.g., a friend, a social network entity, a business entity, etc.).
It should be understood that the particular order in which the operations in
In accordance with some embodiments,
As shown in
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described above with respect to
The device displays (1902) a first user interface on the display (e.g., user interface 800 in
In some embodiments, the background of the first user interface includes (1904) a geometric or abstract pattern (e.g., as seen in virtual mesh 810).
While displaying (1906) the first user interface on the display, the device detects a first input by a first contact on the touch-sensitive surface while a first focus selector is at a location in the first user interface that corresponds to the background of the first user interface (e.g., contact 812 in
In some embodiments, when the first input is (1908) detected, the electronic device is in a locked mode in which access to a plurality of different operations that are accessible when the device is in an unlocked state is prevented (e.g., the device is locked when the first input is detected and the first user interface is a lock screen user interface, as illustrated in lock screen user interface 800 in
In some embodiments, the background is (1910) used for both the locked state of the device and the unlocked state of the device (e.g., virtual mesh 810 is present in the background of lockscreen user interface 800 and home screen user interface 824, as illustrated in
In some embodiments, a respective foreground object of the one or more foreground objects responds (1912) to an input by a contact having a characteristic intensity below the first intensity threshold. For example, a light swipe gesture on a foreground object (e.g., “slide to unlock,” “Today” view handle, “control center” handle, or camera icon) causes display of a new user interface, as shown in
In response to detecting the first input by the first contact, in accordance with a determination that the first contact has a characteristic intensity above a first intensity threshold (e.g., “hint” threshold ITH, light press threshold ITL, or deep press threshold ITD), the device dynamically changes (1914) the appearance of the background of the first user interface without changing the appearance of the one or more foreground objects in the first user interface (e.g., by pushing back virtual mesh 810 in
In some embodiments, the dynamic change of the appearance of the background of the first user interface is (1916) based at least in part on a position of the first focus selector on the display (e.g., distortion of a background pattern is more pronounced for portions of the background pattern that are closer to the focus selector). For example, virtual mesh 810 is pushed back more at location near contact 812 than at locations near the edge of touch screen 112 in
In some embodiments, the first intensity threshold is associated with an operating system of the electronic device, and respective operations of respective applications on the electronic device are (1918) activated in response to detecting respective inputs that satisfy the first intensity threshold (e.g., a hint/reveal intensity threshold, as described with respect to methods 1300 and 1500 and
In some embodiments, the background of the first user interface includes (1920) a representative image in a sequence of images and dynamically changing the appearance of the background of the first user interface includes displaying in sequence at least some of the sequence of images based at least in part on the characteristic intensity of the first contact. For example, an enhanced photo dynamically animates as the intensity of the input changes, as described in U.S. Provisional Application Ser. No. 62/215,689, filed Sep. 8, 2015, entitled “Devices and Methods for Capturing and Interacting with Enhanced Digital Images,” which is incorporated by reference herein in its entirety.
In some embodiments, respective operations of respective applications on the electronic device are (1922) activated in response to detecting respective inputs that satisfy a second intensity threshold (e.g., a peek/preview intensity threshold that is higher than the first intensity threshold); the appearance of the background changes in a first manner (e.g., changing color and spacing of user interface objects) when the characteristic intensity of the contact is between the first intensity threshold and the second intensity threshold; and the appearance of the background changes in a second manner, different from the first manner (e.g., changing an orientation or size of the user interface objects), when the characteristic intensity of the contact is above the second intensity threshold (e.g., to provide the user with feedback as to how much pressure is required to reach a particular intensity threshold and thereby train the user in how to reach the first intensity threshold and the second intensity threshold).
In some embodiments, the change in the appearance of the background of the first user interface includes (1924): a change in the space between background objects; a change in the radial position of a background object with respect to a position of the first contact; a change in the opacity of a background object (e.g., change opacity of a portion of the lock screen generally (e.g., revealing a portion of a home screen through the lock screen) or of individual objects); a change in the color of a background object; a change in a simulated depth (e.g., z-depth) or focus of a background object; a change in the contrast of a background object; and/or a change in the brightness of a background object (e.g., background objects near the contact glow brighter with increasing contact intensity).
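By way of illustration only, a minimal sketch of intensity- and position-dependent background changes follows, assuming a hypothetical changeForBackgroundObject function with illustrative gains and falloff radius: the change grows with the characteristic intensity above the first intensity threshold and is more pronounced for background objects near the focus selector.

import Foundation

// Minimal sketch, with assumed gains and falloff radius: a background object's change
// (here, a simulated depth push and a brightness boost) grows with contact intensity
// above the first intensity threshold and is more pronounced near the focus selector.
struct BackgroundObjectChange {
    let zPush: Double            // simulated depth displacement
    let brightnessBoost: Double  // objects near the contact glow brighter
}

func changeForBackgroundObject(distanceFromContact: Double,
                               intensity: Double,
                               firstThreshold: Double = 0.25,
                               falloffRadius: Double = 200,
                               maxPush: Double = 30,
                               maxBrightness: Double = 0.4) -> BackgroundObjectChange {
    guard intensity > firstThreshold else {
        return BackgroundObjectChange(zPush: 0, brightnessBoost: 0)
    }
    let pressure = intensity - firstThreshold
    let falloff = exp(-distanceFromContact / falloffRadius)   // stronger near the contact
    return BackgroundObjectChange(zPush: min(maxPush, pressure * maxPush * falloff),
                                  brightnessBoost: min(maxBrightness, pressure * maxBrightness * falloff))
}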
In some embodiments, the change in the appearance of the background of the first user interface includes (1926) a rippling effect applied to a background object (e.g., a geometric shape or pattern) that emanates from the focus selector (e.g., like water ripples, for example, as illustrated in
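By way of illustration only, a minimal sketch of a ripple emanating from the focus selector follows, assuming a hypothetical rippleDisplacement function with illustrative wave speed, wavelength, amplitude, and damping: each background point is displaced by a damped wave whose front expands outward from the contact over time.

import Foundation

// Minimal sketch of a ripple emanating from the focus selector; all constants are
// illustrative assumptions.
func rippleDisplacement(distanceFromContact: Double,
                        timeSinceImpulse: Double,
                        waveSpeed: Double = 600,
                        wavelength: Double = 80,
                        amplitude: Double = 10,
                        damping: Double = 2.0) -> Double {
    let waveFront = waveSpeed * timeSinceImpulse
    let phase = 2 * Double.pi * (distanceFromContact - waveFront) / wavelength
    // The ripple decays over time and with distance from the contact.
    let envelope = amplitude * exp(-damping * timeSinceImpulse) * exp(-distanceFromContact / 400)
    return envelope * sin(phase)
}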
In some embodiments, the dynamic change in the appearance of the background of the first user interface is (1928) based in part on a positive rate of change in the characteristic intensity of the first contact.
In some embodiments, a magnitude of the dynamic change in the appearance of the background of the first user interface decays (1930) following detection of an impulse force by the first contact (e.g., as graphically illustrated in
While dynamically changing the appearance of the background of the first user interface, the device detects (1932) termination of the first input by the first contact; and, in response to detecting termination of the first input by the first contact, the device reverts the background of the first user interface (e.g., as illustrated in
In some embodiments, reverting the background of the first user interface back to the first appearance of the background includes (1934): moving display of an object (e.g., a geometric shape or pattern) of the background of the first user interface back to its first appearance in the background of the first user interface with a simulated inertia that is based on a rate of decrease in the characteristic intensity of the first contact detected immediately prior to detecting termination of the input by the first contact (e.g., a trampoline effect in which the background springs back towards, and past, the plane of the screen and then oscillates above and below the plane of the screen with a dampening amplitude, as illustrated in
In some embodiments, reverting the background of the first user interface back to the first appearance of the background is (1936) based on a rate of change of the decrease in the characteristic intensity of the first contact prior to termination of the first input. In some embodiments, the dynamic reversion of the change in the appearance of the background is retarded relative to a rate of change in characteristic intensity of the contact above a first rate of change threshold. For example, the rate at which the dynamic distortion of the display is reversed reaches a terminal rate that is less than the rate at which the intensity of the contact is released, creating a “memory foam” effect, as illustrated in
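By way of illustration only, a minimal sketch of the spring-back follows, assuming a hypothetical springBackDisplacement function with illustrative stiffness, damping, and terminal-rate values: the initial velocity reflects how quickly the intensity was released, the background oscillates about its rest position with a dampening amplitude (the trampoline effect), and the effective release rate is capped so that a very fast release still reads as a slower, memory-foam recovery.

import Foundation

// Minimal sketch of the post-liftoff spring-back; all constants are assumptions.
func springBackDisplacement(initialDisplacement: Double,
                            releaseRate: Double,        // rate of intensity decrease at liftoff
                            time: Double,
                            terminalRate: Double = 3.0,
                            stiffness: Double = 8.0,    // oscillation frequency (rad/s)
                            damping: Double = 2.5) -> Double {
    let effectiveRate = min(releaseRate, terminalRate)          // retard quick releases ("memory foam")
    let initialVelocity = -effectiveRate * initialDisplacement  // spring back toward the rest position
    let decay = exp(-damping * time)                            // dampening amplitude
    return decay * (initialDisplacement * cos(stiffness * time)
                  + (initialVelocity / stiffness) * sin(stiffness * time))
}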
In some embodiments, the device detects (1938) a second input by a second contact, the second input meeting criteria to exit the locked mode of the electronic device (e.g., a fingerprint input on a fingerprint sensor in home button 204 that matches a stored fingerprint for the user of the device, or a directional swipe gesture, optionally coupled to input of a password). In response to detecting the second input by the second contact, the device replaces display of the first user interface with display of a second user interface that is distinct from the first user interface on the display (e.g., upon exiting the locked mode of the electronic device, the device displays a second user interface (e.g., an application springboard) associated with an unlocked state of the electronic device that provides access to a plurality of different applications on the electronic device, which were locked when displaying the first user interface), wherein the second user interface includes a background of the second user interface with a first appearance and one or more foreground objects. For example, device 100 replaces display of lock screen user interface 800 with home screen user interface 824 in
In some embodiments, while displaying the second user interface on the display, the device detects (1940) a third input by a third contact on the touch-sensitive surface while a focus selector is at a location in the second user interface that corresponds to the background of the second user interface, wherein the third contact has a characteristic intensity above the first intensity threshold; and, in response to detecting the third input by the third contact, the device maintains the first appearance of the background of the second user interface (e.g., contact 826 does not change the appearance of the background in
In some embodiments, while displaying the second user interface on the display, the device detects (1942) a fourth input by a fourth contact on the touch-sensitive surface while a focus selector is at a location in the second user interface that corresponds to the background of the second user interface; and, in response to detecting the fourth input by the fourth contact, in accordance with a determination that the fourth contact has a characteristic intensity above the first intensity threshold, the device dynamically changes the appearance of the background of the second user interface without changing the appearance of the one or more foreground objects in the second user interface, wherein the dynamic change in the appearance of the background of the second user interface is based at least in part on the characteristic intensity of the fourth contact (e.g., directly, linearly, non-linearly proportional to, or at a rate determined based on the characteristic intensity of the contact). For example, contact 826 pushes virtual mesh 810 backwards in
In some embodiments, while dynamically changing the appearance of the background of the second user interface, the device detects (1944) termination of the fourth input by the fourth contact; and, in response to detecting termination of the fourth input by the fourth contact, the device reverts the background of the second user interface back to the first appearance of the background of the second user interface (e.g., liftoff of contact 826 reverses the change in the appearance of virtual mesh 810 in
In some embodiments, while detecting the first input by the first contact, after determining that the first contact has a characteristic intensity above the first intensity threshold: the device detects (1946) a decrease in the characteristic intensity of the first contact; and, in response to detecting the decrease in the characteristic intensity of the first contact: in accordance with a determination that a rate of change of the characteristic intensity of the first contact during the detected decrease in the characteristic intensity of the first contact does not exceed a first rate of change threshold, the device dynamically reverses the change of the appearance of the background of the first user interface based on the rate of change of the characteristic intensity of the first contact. In accordance with a determination that the rate of change of the characteristic intensity of the first contact during the detected decrease in the characteristic intensity of the first contact exceeds the first rate of change threshold, the device animates reversal of the change of the appearance of the background of the first user interface independent of the rate of change of the characteristic intensity of the first contact. In some embodiments, dynamic distortion of the display is retarded in response to a quick release of force. For example, the rate at which the dynamic distortion of the display is reversed reaches a terminal rate that is less than the rate at which the pressure of the contact is released, which results in the background displaying a “memory foam” effect, as illustrated in
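As a rough illustration of the branch in (1946) together with the "memory foam" lag, the reversal rate can simply track the intensity's rate of decrease for a slow release and fall back to a fixed terminal rate for a fast release. This is a minimal Swift sketch with invented names and values, not a definitive implementation of the method.

```swift
// Sketch of the reversal branch: a slow release lets the distortion track the
// falling intensity; a fast release (above the rate-of-change threshold) plays
// a reversal capped at a terminal rate, so the background lags the finger.
func reversalRate(intensityDropRate: Double,
                  rateOfChangeThreshold: Double,
                  terminalRate: Double) -> Double {
    if intensityDropRate <= rateOfChangeThreshold {
        return intensityDropRate   // dynamic reversal driven by the contact
    }
    return terminalRate            // animated reversal, independent of the contact
}

print(reversalRate(intensityDropRate: 0.4, rateOfChangeThreshold: 1.0, terminalRate: 0.8)) // 0.4
print(reversalRate(intensityDropRate: 5.0, rateOfChangeThreshold: 1.0, terminalRate: 0.8)) // 0.8
```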
In some embodiments, while detecting the first input by the first contact, after determining that the first contact has a characteristic intensity above the first intensity threshold: the device detects (1948) a decrease in the characteristic intensity of the first contact below the first intensity threshold; and, in response to detecting the decrease in the characteristic intensity of the first contact below the first intensity threshold, continues to dynamically change the appearance of the background of the first user interface based at least in part on the characteristic intensity of the first contact. In some embodiments, reversion of the background distortion is slower than the initial background distortion because the end point of the reversion is lift-off of the contact (e.g., zero intensity). For example, contact 852 continues to change the appearance of virtual mesh 810 in
In some embodiments, while continuing to detect the first input by the first contact, after determining that the first contact has a characteristic intensity above the first intensity threshold: the device detects (1950) movement of the first contact on the touch-sensitive surface; and, in response to detecting the movement of the first contact, dynamically updates the change in the appearance of the background of the first user interface based on the movement of the first contact on the touch-sensitive surface. For example, movement of contact 812 in
In some embodiments, after determining that the first contact has a characteristic intensity above the first intensity threshold, and prior to detecting movement of the first contact on the touch-sensitive surface: the device detects (1952) a decrease in the characteristic intensity of the contact below the first intensity threshold. In some embodiments, the background distortion moves with the contact even when the characteristic intensity of the contact falls below the first intensity threshold. For example, contact 852 continues to change the appearance of virtual mesh 810 in
In some embodiments, in response to detecting the input by the first contact, in accordance with the determination that the first contact has a characteristic intensity above the first intensity threshold, the device changes (1954) an aspect of the appearance of the background of the first user interface without changing the appearance of a respective foreground object of the one or more foreground objects in the first user interface, wherein the change of the aspect of the appearance of the background of the first user interface is independent of the position of the focus selector in the background (e.g., the color of the background changes ubiquitously). For example, in response to detecting an increase in the intensity of contact 830 above a first intensity threshold ITH, the appearance of virtual mesh changes ubiquitously in
In some embodiments, while detecting the first input by the first contact on the touch-sensitive surface, the device detects (1956) a second input by a second contact on the touch-sensitive surface while a second focus selector is at a location in the first user interface that corresponds to the background of the user interface. In response to detecting the second input by the second contact: in accordance with a determination that the second contact does not have a characteristic intensity above the first intensity threshold, the device dynamically changes the appearance of the background of the first user interface without changing the appearance of a respective foreground object of the one or more foreground objects in the first user interface, wherein the dynamic change in the appearance of the background of the first user interface is based at least in part on the characteristic intensity of the first contact; and, in accordance with a determination that the second contact has a characteristic intensity above the first intensity threshold, the device dynamically changes the appearance of the background of the first user interface without changing the appearance of a respective foreground object of the one or more foreground objects in the first user interface, wherein the dynamic change in the appearance of the background of the first user interface is based at least in part on the characteristic intensity of the first contact, the characteristic intensity of the second contact, and positions of the first and second focus selectors on the display. For example, as illustrated with respect to contacts 854 and 856 in
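To illustrate how a distortion could depend on both contacts' intensities and positions, the sketch below sums a per-contact contribution that scales with the contact's characteristic intensity and falls off with distance from its focus selector. The Gaussian falloff, the radius, and all identifiers are assumptions made for the example only.

```swift
import Foundation

// Each background point is displaced by every contact; each contribution is
// scaled by that contact's characteristic intensity and attenuated by the
// distance from its focus selector (assumed Gaussian falloff).
struct PressContact {
    var x: Double
    var y: Double
    var intensity: Double
}

func zDisplacement(atX px: Double, y py: Double,
                   contacts: [PressContact],
                   falloffRadius: Double = 120) -> Double {
    contacts.reduce(0) { total, c in
        let dx = px - c.x, dy = py - c.y
        let distanceSquared = dx * dx + dy * dy
        let falloff = exp(-distanceSquared / (2 * falloffRadius * falloffRadius))
        return total + c.intensity * falloff
    }
}

let contacts = [PressContact(x: 100, y: 200, intensity: 0.8),
                PressContact(x: 260, y: 240, intensity: 0.5)]
// A point between the two fingers is pushed back by both contributions.
print(zDisplacement(atX: 180, y: 220, contacts: contacts))
```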
In some embodiments, in response to detecting the first input by the first contact on the touch-sensitive surface, in accordance with a determination that the first input does not have a characteristic intensity above the first intensity threshold, the device maintains (1958) the first appearance of the background of the first user interface. In some embodiments, there is no change in the background while the characteristic intensity of the input is below the first intensity threshold (e.g., the device detects an increase in characteristic intensity without distorting the background). This helps to preserve battery life by not activating the dynamic behavior for low-intensity contacts that correspond to accidental or incidental touches. For example, as illustrated in
It should be understood that the particular order in which the operations in
In accordance with some embodiments,
As shown in
The device displays (2102) a first user interface on the display (e.g., user interface 800 in
While displaying the first user interface on the display, the device detects (2104) an input by a first contact on the touch-sensitive surface, the first contact having a characteristic intensity above a first intensity threshold (e.g., “hint” threshold ITH, light press threshold ITL, or deep press threshold ITD). For example, contacts 902 and 904 in
In some embodiments, when the input is detected, the electronic device is (2106) in a locked mode in which access to a plurality of different operations that are accessible when the device is in an unlocked state is prevented (e.g., the device is locked when the input is detected and the first user interface is a lock screen user interface, as illustrated by user interface 800).
In response to detecting the input by the first contact, in accordance with a determination that, during the input, a focus selector is at a location in the first user interface that corresponds to the background of the user interface, the device dynamically changes (2108) the appearance of the background of the first user interface without changing the appearance of the one or more foreground objects in the first user interface. For example, contact 902 appears to push virtual mesh 810 backwards (e.g., in a virtual z-space) in
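A minimal sketch, with invented types and values, of the determination just described: the background only distorts when the focus selector lands on the background itself (not on a foreground object) and the contact's intensity exceeds the first threshold. The proportional mapping below is one of the "directly or non-linearly proportional" options mentioned above, chosen arbitrarily for the example.

```swift
// Only distort the background when the press lands on the background itself;
// foreground objects (time/date, notifications, icons) are left unchanged.
struct ObjectFrame {
    var x, y, width, height: Double
    func contains(x px: Double, y py: Double) -> Bool {
        px >= x && px <= x + width && py >= y && py <= y + height
    }
}

func backgroundDistortion(selectorX: Double, selectorY: Double,
                          intensity: Double,
                          firstIntensityThreshold: Double,
                          foregroundObjects: [ObjectFrame]) -> Double {
    let overForeground = foregroundObjects.contains { $0.contains(x: selectorX, y: selectorY) }
    guard !overForeground, intensity > firstIntensityThreshold else { return 0 }
    // Distortion grows with the amount by which the intensity exceeds the threshold.
    return intensity - firstIntensityThreshold
}

let foreground = [ObjectFrame(x: 20, y: 40, width: 280, height: 60)]   // e.g., a notification banner
let amount = backgroundDistortion(selectorX: 150, selectorY: 400, intensity: 0.7,
                                  firstIntensityThreshold: 0.5, foregroundObjects: foreground)
print(amount)   // nonzero: the press is on the background, above the threshold
```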
In some embodiments, while dynamically changing the appearance of the background of the first user interface, the device detects (2110) termination of the input by the first contact; and, in response to detecting termination of the input by the first contact, the device reverts the background of the first user interface back to the first appearance of the background (e.g., restoring display of the first user interface to its appearance prior to the first input; animating the reversal of the changes in the background; and/or springing back to the first appearance with a dampening effect). For example, as illustrated by liftoff of contact 902 in
In some embodiments, the input by the first contact includes (2112) a first portion of the input, and detecting the input by the first contact on the touch-sensitive surface includes detecting the first portion of the first input. In response to detecting the first portion of the input, in accordance with a determination that, during the first portion of the input, the focus selector is at a location in the first user interface that corresponds to a first foreground object of the one or more foreground objects, and the first portion of the input meets preview criteria (e.g., the input is a press input with a characteristic intensity in the first portion of the input that meets preview criteria, such as a characteristic intensity that meets a “peek” intensity threshold), the device displays a preview area overlaid on at least some of the background of the first user interface (e.g., preview area 907 overlaid on the background in
In some embodiments, after detecting the first portion of the first input, the device detects a second portion of the input by the first contact; and, in response to detecting the second portion of the input by the first contact: in accordance with a determination that the second portion of the input by the first contact meets user-interface-replacement criteria, the device replaces (2114) display of the first user interface and the overlaid preview area with display of a second user interface associated with the first foreground object (e.g., as described in greater detail herein with reference to method [link claim sets JO1 and JO2]). For example, as illustrated by replacement of user interface 800 with user interface 909 in
In some embodiments, in response to detecting the input by the first contact: in accordance with a determination that the focus selector is at a location in the first user interface that corresponds to a second foreground object of the one or more foreground objects, the device displays (2116) additional information associated with the second foreground object (e.g., increasing the size (e.g., dynamically) of the second foreground object from a first size to a second size that is larger than the first size or displaying a preview area that displays an expanded preview of content corresponding to the second foreground object). For example, in response to the increasing intensity of contact 910 over notification 908, additional content associated with the notification is revealed in
In some embodiments, the second foreground object is (2118) a notification, and expanding the second foreground object includes displaying additional content associated with the notification (e.g., as illustrated in
In some embodiments, the second foreground object is (2120) a representation of a date and/or time, and expanding the second foreground object includes displaying information about expected activities of a user of the device that correspond to the date and/or time.
In some embodiments, in response to detecting the input by the first contact: in accordance with a determination that the focus selector is at a location in the first user interface that corresponds to a third foreground object of the one or more foreground objects, the device displays (2122) a menu area overlaid on at least some of the background of the first user interface (e.g., display a quick-action menu overlaid on part of the background, but not overlaid on the third foreground object), wherein the menu area displays a plurality of selectable actions that are performed by a first application that corresponds to the third foreground object. For example, pressing on the Camera icon in
In some embodiments, the third foreground object is (2124) a representation of a suggested application (e.g., that, when activated, such as by swiping upward, causes a corresponding application to be launched) and the menu area includes representations of additional suggested applications (e.g., that, when activated, cause a corresponding application to be launched).
In some embodiments, the third foreground object is (2126) a representation of a suggested application (e.g., that, when activated, such as by swiping upward, causes a corresponding application to be launched) and the menu area includes representations of actions associated with the suggested application (e.g., that, when activated, cause the corresponding actions to be performed, such as the quick actions described with reference to method [link back to JO7 and associated table]).
In some embodiments, the third foreground object is (2128) a representation of a media capture application (e.g., that, when activated, such as by swiping upward, causes the media capture application to be launched in a default mode of operation such as a still camera mode of operation or a last used mode of operation) and the menu area includes representations of additional modes of operation for the media capture application (e.g., that, when activated, cause the media capture application to be launched in a corresponding mode of operation (e.g., a video capture mode of operation or a panorama capture mode of operation)).
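The foreground-object responses described in (2116)-(2128) can be summarized as a dispatch on the type of object under the focus selector. The Swift sketch below is illustrative only; the enum cases, payloads, and names are invented for the example and do not come from the description above.

```swift
// Dispatch a press on a lock-screen foreground object to the behavior the
// object supports: a notification expands, a date/time object shows expected
// activities, and an application representation shows a menu of actions/modes.
enum LockScreenObject {
    case notification(additionalContent: String)
    case dateTime(expectedActivities: [String])
    case suggestedApplication(actions: [String])
    case mediaCaptureApplication(modes: [String])
}

enum PressResponse {
    case expandedContent(String)
    case agenda([String])
    case menu([String])
}

func respondToPress(on object: LockScreenObject) -> PressResponse {
    switch object {
    case let .notification(additionalContent):
        return .expandedContent(additionalContent)
    case let .dateTime(expectedActivities):
        return .agenda(expectedActivities)
    case let .suggestedApplication(actions):
        return .menu(actions)
    case let .mediaCaptureApplication(modes):
        return .menu(modes)   // e.g., video capture or panorama capture modes
    }
}

let response = respondToPress(on: .notification(additionalContent: "Full message text"))
print(response)
```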
In accordance with some embodiments,
As shown in
The device displays (2302) a first user interface on the display (e.g., lock screen user interface 800 in
The device detects (2304) an input by a contact on the touch-sensitive surface while a first focus selector is at a first user interface object in the plurality of user interface objects in the foreground area (e.g., contacts 1026, 1030, and 1034 in
In some embodiments, when the input is (2306) detected, the electronic device is in a locked mode in which access to a plurality of different operations that are accessible when the device is in an unlocked state is prevented (e.g., the device is locked when the input is detected and the first user interface is a lock screen user interface with an overlaid control center area). In some embodiments, while in the locked mode, access to sensitive information (e.g., previously captured images and videos, financial information, electronic communications, etc.) is protected by a passcode and/or biometric authentication.
In response to detecting the input by the contact, in accordance with a determination that the input by the contact meets one or more first press criteria, which include a criterion that is met when a characteristic intensity of the contact remains below a first intensity threshold during the input (e.g., “hint” threshold ITH, light press threshold ITL, or deep press threshold ITD), the device performs (2308) a first predetermined action that corresponds to the first user interface object in the foreground area. For example, in response to lift off of contact 1026 in
In some embodiments, the first predetermined action changes (e.g., toggles) (2310) a setting that corresponds to the first user interface object in the foreground area. In some embodiments, movement of the focus selector off of the first user interface object, followed by lift off of the contact, does not toggle or otherwise change the setting.
In some embodiments, the first predetermined action opens (2312) an application that corresponds to the first user interface object. In some embodiments, opening the application replaces display of the first user interface with a second user interface that corresponds to the opened application.
In some embodiments, the second predetermined action displays (2314) a menu area overlaying a portion of the foreground area, wherein the menu area displays one or more selectable actions that are performed by an application that corresponds to the first user interface object. For example, a deep press input on AirDrop opens a menu with options for making device files deliverable to nearby devices. In some embodiments, movement of the focus selector off of the first user interface object, followed by lift off of the contact, does not display the menu area.
In some embodiments, the foreground area is (2316) displayed overlaying the portion of the background in response to detecting a gesture (e.g., a swipe gesture including movement 1004 of contact 1002 in
In some embodiments, the first predetermined action includes (2318) toggling wireless connectivity (e.g., turning on/off WiFi), and the second predetermined action includes displaying a user interface for selecting a wireless network to join.
In some embodiments, the first predetermined action includes (2320) toggling a limited notification mode of operation (e.g., turning on/off a do not disturb mode of operation), and the second predetermined action includes displaying a user interface for setting a timer associated with the limited notification mode of operation (e.g., specifying a time to turn on or turn off the do not disturb mode of operation).
In some embodiments, the first predetermined action includes (2322) toggling a flashlight function (e.g., turning on/off a light on the device to serve as a flashlight), and the second predetermined action includes displaying a user interface for selecting a mode of operation for the flashlight function (e.g., selecting a brightness level, a strobe effect etc.).
In some embodiments, the first predetermined action includes (2324) launching a timer application (e.g., opening an application for starting or stopping a timer), and the second predetermined action includes displaying a user interface for performing timer management operations (e.g., starting, stopping, or pausing a timer) without launching the timer application.
In some embodiments, the first predetermined action includes (2326) launching an alarm application (e.g., opening an application for setting or disabling an alarm), and the second predetermined action includes displaying a user interface for performing alarm management operations (e.g., setting, disabling, or snoozing an alarm) without launching the alarm application.
In some embodiments, the first predetermined action includes (2328) launching a corresponding application, and the second predetermined action includes displaying a user interface for performing operations associated with the corresponding application without launching the corresponding application (e.g., such as the quick actions described with reference to method [link back to JO7 and associated table]). For example, in response to detecting an increase in the intensity of contact 1034 above predetermined intensity threshold ITL, the device displays quick action menu 1036 in
In some embodiments, in response to detecting the input by the contact: in accordance with a determination that the input by the contact meets one or more third press criteria, which include a criterion that is met when a characteristic intensity of the contact increases above a second intensity threshold (e.g., deep press threshold ITD), greater than the first intensity threshold (e.g., light press threshold ITL) during the input, the device performs (2330) a third predetermined action, distinct from the first predetermined action and the second predetermined action, that corresponds to the first user interface object in the foreground area.
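As an illustrative summary of the press tiers in (2308)-(2330), the dispatch below compares the contact's peak intensity against the two thresholds; the names and example values are assumptions made for the sketch and the real criteria are evaluated continuously during the input rather than from a single peak value.

```swift
// A contact that stays below the first threshold performs the primary action
// on liftoff (e.g., toggling Wi-Fi); crossing the first threshold shows an
// options interface (e.g., a network picker); crossing the second, higher
// threshold performs a third, distinct action.
enum ControlResponse { case primaryAction, optionsInterface, thirdAction }

func controlResponse(peakIntensity: Double,
                     firstThreshold: Double,    // e.g., light press threshold ITL
                     secondThreshold: Double    // e.g., deep press threshold ITD
) -> ControlResponse {
    if peakIntensity > secondThreshold { return .thirdAction }
    if peakIntensity > firstThreshold { return .optionsInterface }
    return .primaryAction
}

print(controlResponse(peakIntensity: 0.3, firstThreshold: 0.5, secondThreshold: 0.9)) // primaryAction
print(controlResponse(peakIntensity: 0.7, firstThreshold: 0.5, secondThreshold: 0.9)) // optionsInterface
```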
In some embodiments, prior to displaying the foreground area, the device displays (2332) the first user interface on the display, wherein the first user interface is a lock screen user interface that includes a background with a first appearance (e.g., a digital image, a pattern, or other wallpaper) and one or more foreground objects (e.g., time/date, camera icon, notifications, pull-down/up panel handles, or other user interface objects). While displaying the lock screen user interface on the display, the device detects an input by a second contact on the touch-sensitive surface while a focus selector is at a location in the lock screen user interface that corresponds to the background of the lock screen user interface; and, in response to detecting the input by the second contact, in accordance with a determination that the second contact has a characteristic intensity above the first intensity threshold (e.g., “hint” threshold ITH, light press threshold ITL, or deep press threshold ITD), the device dynamically changes the appearance of the background of the lock screen user interface without changing the appearance of the one or more foreground objects in the lock screen user interface. In some embodiments, the change includes animating a sequence of images in the background in accordance with the characteristic intensity of the second contact. In some embodiments, the change includes changing a Z-depth, focus, radial position relative to the contact, color, contrast, or brightness of one or more objects of the background, wherein the dynamic change in the appearance of the background of the lock screen user interface is based at least in part on the characteristic intensity of the second contact (e.g., directly, linearly, non-linearly proportional to, or at a rate determined based on the characteristic intensity of the contact).
In accordance with some embodiments,
As shown in
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in information processing apparatus such as general purpose processors (e.g., as described above with respect to
The device displays (2502), on the display, an application launching user interface that includes a plurality of application icons for launching corresponding applications. For example, user interface 500 displays application launch icons 480, 426, 428, 482, 432, 434, 436, 438, 440, 442, 444, 446, 484, 430, 486, 488, 416, 418, 420, and 424 in
While displaying the application launching user interface, the device detects (2504) a first touch input that includes detecting a first contact at a location on the touch-sensitive surface that corresponds to a first application icon (e.g., contact 1102 on messages launch icon 424 in
In response to detecting the first touch input, in accordance with a determination that the first touch input meets one or more application-launch criteria, the device launches (2506) (e.g., opens) the first application. For example, upon detecting liftoff of contact 1102, device 100 launches a messaging application associated with messaging launch icon 424, including display of default user interface 1104 in
In some embodiments, the application-launch criteria are (2508) criteria that are configured to be met when the characteristic intensity of the contact does not increase above the respective intensity threshold (e.g., the application-launch criteria are capable of being satisfied without the characteristic intensity of the contact increasing above the respective intensity threshold that is required to trigger display of the one or more quick action objects such as in the quick action menu). For example, the tap input illustrated in
In some embodiments, during the first touch input, the device detects (2510) changes in the characteristic intensity of the first contact before the quick-action-display criteria are met, and the device dynamically adjusts an appearance of the other application icons based on the characteristic intensity of the first contact to progressively deemphasize the plurality of application icons other than the first application icon as the characteristic intensity of the first contact increases. For example, hint graphic 1108 dynamically grows from under messaging launch icon 424 in response to increasing intensity of contact 1106 above hint threshold ITH in
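The progressive de-emphasis just described can be thought of as a blur (or dimming) amount that scales with how far the intensity has climbed from a hint threshold toward the quick-action threshold. The Swift sketch below is a simplified illustration; the threshold values, maximum blur radius, and function name are placeholders, not values from the description.

```swift
// Map the contact's intensity to a blur radius applied to the application
// icons other than the pressed one: no blur below the hint threshold, full
// blur once the quick-action threshold is reached, linear in between.
func blurRadiusForOtherIcons(intensity: Double,
                             hintThreshold: Double = 0.2,
                             quickActionThreshold: Double = 0.6,
                             maxBlurRadius: Double = 12) -> Double {
    let progress = (intensity - hintThreshold) / (quickActionThreshold - hintThreshold)
    return maxBlurRadius * min(max(progress, 0), 1)   // clamp to 0...1
}

print(blurRadiusForOtherIcons(intensity: 0.1))  // 0 — below the hint threshold
print(blurRadiusForOtherIcons(intensity: 0.4))  // about half of the maximum blur
print(blurRadiusForOtherIcons(intensity: 0.8))  // full blur — icons fully de-emphasized
```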
In some embodiments, concurrently displaying the one or more quick action objects with the first application icon includes (2512) displaying the one or more quick action objects in a menu that includes a plurality of quick action objects (e.g., next to or adjacent to the first application icon and, optionally overlaid on one or more of the other application icons). For example, quick action objects 1112, 1114, 1116, and 1118 are displayed in quick action menu 1110, adjacent to messages launch icon 424 and overlaying camera launch icon 430, voice memo launch icon 486, and networking folder launch icon 488, in
In some embodiments, the quick action objects within the menu are (2514) ordered within the menu based on the location of the icon within the application launch user interface. Additional details regarding displaying quick action objects in a quick action menu are provided with respect to method 2700, and corresponding user interfaces shown in
In some embodiments, the application icon includes (2516) an indication of a number of notifications (e.g., a notification badge) and the one or more quick action objects include a quick action object associated with one or more of the notifications (e.g., an option for replying to a most recent message, or listening to a most recent voicemail). For example, messages launch icon 424 in
In some embodiments, the one or more quick action objects include (2518) a respective quick action object that corresponds to a quick action selected based on recent activity within the first application (e.g., a recently played playlist, a recently viewed/edited document, a recent phone call, a recently received message, a recently received email). For example, quick action objects 1160, 1162, 1164, and 1166 in quick action menu 1158, illustrated in
In some embodiments, the one or more quick action objects include (2520) a respective quick action object that is dynamically determined based on a current location of the device (e.g., marking a current location, directions from the current location to the user's home or work, nearby users, recently used payment accounts, etc.).
In some embodiments, in response to detecting the first touch input, in accordance with the determination that the first touch input meets the quick-action-display criteria, the device deemphasizes (2522) a plurality of the application icons relative to the first application icon in conjunction with displaying the one or more quick action objects. For example, device 100 dynamically blurs unselected application launch icons in
In some embodiments, in response to detecting the first touch input, in accordance with a determination that the first touch input meets one or more interface-navigation criteria that include a criterion that is met when more than a threshold amount of movement of the first contact is detected before the characteristic intensity of the first contact increases above the respective intensity threshold, the device ceases (2524) to display at least a portion of the application launching user interface and displays at least a portion of a different user interface on a portion of the display that was previously occupied by the plurality of application icons in the application launching user interface immediately prior to detecting the first touch input (e.g., replace display of the home screen with a search user interface if the user swipes down or to the right, or replace display of the first page of the home screen with a second page of the home screen that includes different application icons if the user swipes to the left). For example, in response to detecting a swipe gesture including movement 1126 of contact 1124 in
In some embodiments, in response to detecting movement of the first contact before the characteristic intensity of the first contact increases above the respective intensity threshold, the device moves (2526) a plurality of application icons in accordance with the movement of the first contact (e.g., move the application launch icons a distance, direction, and/or speed that corresponds to the distance, direction and/or speed of the first contact on the touch-sensitive surface). For example, in response to detecting a swipe gesture including movement 1126 of contact 1124 in
In some embodiments, in response to detecting the first touch input, in accordance with a determination that the first touch input meets icon-reconfiguration criteria that include a criterion that is met when the first contact is detected on the touch-sensitive surface for more than a reconfiguration time threshold before the characteristic intensity of the first contact increases above the respective intensity threshold, the device enters (2528) an icon reconfiguration mode in which one or more application icons can be reorganized within the application launching interface (e.g., in response to movement of a contact that starts at a location that corresponds to an application icon, the device moves the icon around the user interface relative to other icons). For example, in response to a long-press gesture, including contact 1130 in
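Taken together, (2506)-(2528) distinguish four outcomes of a touch on an application icon: launch, quick-action menu, interface navigation, and icon reconfiguration. The sketch below summarizes that disambiguation from a few properties of the touch; all names and numeric values are assumptions, and the real criteria are evaluated continuously during the input rather than from a summary structure like this.

```swift
// Disambiguate a touch on an application icon based on whether the contact
// moved before crossing the intensity threshold, how long it was held, and
// whether its characteristic intensity crossed the respective threshold.
struct TouchSummary {
    var peakIntensity: Double
    var totalMovement: Double         // points of movement before any threshold crossing
    var duration: Double              // seconds the contact was down
    var crossedIntensityThreshold: Bool
}

enum IconResponse { case launchApplication, showQuickActions, navigateInterface, enterReconfigurationMode }

func respond(to touch: TouchSummary,
             movementThreshold: Double = 10,
             reconfigurationTime: Double = 0.5) -> IconResponse {
    if touch.totalMovement > movementThreshold && !touch.crossedIntensityThreshold {
        return .navigateInterface          // e.g., swipe to a search page or another home screen page
    }
    if touch.duration > reconfigurationTime && !touch.crossedIntensityThreshold {
        return .enterReconfigurationMode   // long press: icons become reorganizable
    }
    if touch.crossedIntensityThreshold {
        return .showQuickActions           // press: quick-action menu for the icon
    }
    return .launchApplication              // plain tap
}

let press = TouchSummary(peakIntensity: 0.7, totalMovement: 2, duration: 0.2, crossedIntensityThreshold: true)
print(respond(to: press))   // showQuickActions
```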
In some embodiments, while displaying the one or more quick action objects concurrently with the application icon, the device detects (2530) a second touch input (e.g., a tap gesture) that includes detecting a second contact at a location on the touch-sensitive surface that corresponds to the first application icon and meets the application launch criteria. In some embodiments, in response to detecting the second touch input, the device launches the first application (e.g., displays a default view of the first application). For example, in response to detecting a tap gesture, including contact 534 while quick action menu 528 is displayed in
In some embodiments, while displaying the one or more quick action objects concurrently with the application icon, the device detects (2532) a third touch input that includes detecting a third contact at a location on the touch-sensitive surface that corresponds to the first application icon, wherein the third touch input meets icon-reconfiguration criteria that include a criterion that is met when the third contact is detected on the touch-sensitive surface for more than a reconfiguration time threshold before the characteristic intensity of the third contact increases above the respective intensity threshold. In response to detecting the third touch input, the device enters an icon reconfiguration mode in which application icons can be reorganized within the application launching interface (e.g., in response to movement of the third contact that starts at a location that corresponds to an application icon, the device moves the icon around the user interface relative to other icons). In some embodiments, in the icon reconfiguration mode, one or more of the application icons include application icon removal affordances that, when selected, cause the application icon to be removed from the application launch interface and, optionally, cause the application to be deleted from the device. For example, device 100 enters icon-reconfiguration mode upon detection of a long-press gesture including contact 1136 while displaying quick-action menu 1110 in
In some embodiments, entering the icon reconfiguration mode in response to detecting the third touch input includes (2534) ceasing to display the one or more quick action objects (and, optionally, reversing a de-emphasis of application icons other than the first application icon). For example, device 100 terminates display of quick-action menu 1110, as illustrated in
In some embodiments, while displaying the quick action objects concurrently with the first application icon, the device detects (2536) a fourth touch input that includes detecting a fourth contact at a location on the touch-sensitive surface that is away from the quick action objects and the first application icon (e.g., at a location on the touch-sensitive surface that corresponds to one of the other application icons on the display). In response to detecting the fourth touch input, the device ceases to display the one or more quick action objects (and, optionally, reverses a de-emphasis of application icons other than the first application icon). For example, detection of a tap gesture, including contact 1140 while quick action menu 1110 is displayed in
In some embodiments, in response to determining that the quick-action-display criteria have been met, the device generates (2538) a first tactile output that is indicative of the satisfaction of the quick-action-display criteria (e.g., tactile feedback 1111 in
In some embodiments, while displaying the plurality of application icons on the application launching user interface, the device detects (2540) a fifth touch input that includes detecting a fifth contact at a location on the touch-sensitive surface that corresponds to a second application icon of the plurality of application icons, wherein the second application icon is an icon for launching a second application that is not associated with any corresponding quick actions (e.g., contact 1142 on settings launch icon 446 in
In some embodiments, when the first contact approaches the respective intensity threshold, the device displays (2542), on the display, a respective change in the appearance of a plurality of application icons (e.g., a third application icon and, optionally, one or more application icons other than the first application icon and the second application icon). In some embodiments, displaying the respective change includes displaying an animation that is adjusted dynamically in accordance with the change in intensity of the first contact, such as blurring application icons other than the first application icon. In some embodiments, when the fifth contact approaches the respective intensity threshold, the device displays, on the display, the respective change in the appearance of the plurality of application icons (e.g., the third application icon and, optionally, one or more application icons other than the first application icon and the second application icon). In some embodiments, displaying the respective change includes displaying an animation that is adjusted dynamically in accordance with the change in intensity of the fifth contact, such as blurring application icons other than the second application icon. For example, application launch icons other than messages launch icon 424 are dynamically blurred in response to detecting increasing intensity of contact 1106 above hint threshold ITH in
In some embodiments, when the fifth contact approaches the respective intensity threshold, the device displays (2544), on the display, a change in the appearance of the plurality of application icons other than the second application icon (e.g., as described in greater detail above with reference to method 1300, and corresponding user interfaces shown in
In accordance with a determination that the fifth touch input meets the quick-action-display criteria (for application icons that have corresponding quick actions), the device generates visual and/or tactile output indicating that the fifth touch input met the quick-action-display criteria but that the second application is not associated with any quick actions (e.g., blurring and then unblurring other application icons and/or generating a “negative” tactile output that is different from a “positive” tactile output that is generated when quick actions for an application icon are displayed). For example, in response to detecting increasing intensity of contact 1146 while over settings launch icon 446, the device blurs (e.g., dynamically) other launch icons in
In some embodiments, while displaying the application launching user interface, the device detects (2546) a sixth touch input that includes detecting a sixth contact at a location on the touch-sensitive surface that corresponds to a respective application icon, wherein the sixth contact meets the quick-action-display criteria. In response to detecting the sixth touch input, in accordance with a determination that the respective application icon is associated with one or more quick actions, the device displays quick action objects for the respective application icon and generates a first tactile output (e.g., a “positive” success tactile output) indicating that the sixth touch input met the quick-action-display criteria and that the respective application icon is associated with quick actions. For example, in response to detecting quick-action-display criteria when contact 1138 is over messages launch icon 424 in
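The feedback branch in (2540)-(2546) reduces to whether the pressed icon has quick actions: a "positive" tactile output and a menu when it does, a "negative" tactile output (with the blur/unblur visual) when it does not. This is a trivial Swift sketch with invented names, included only to make the branch explicit.

```swift
// A press that meets the quick-action-display criteria produces different
// feedback depending on whether the icon has any quick actions to show.
enum TactileOutput { case positive, negative }

func feedbackForQuickActionPress(hasQuickActions: Bool) -> (tactile: TactileOutput, showsMenu: Bool) {
    hasQuickActions ? (.positive, true) : (.negative, false)
}

print(feedbackForQuickActionPress(hasQuickActions: true))   // positive output, menu shown (e.g., Messages icon)
print(feedbackForQuickActionPress(hasQuickActions: false))  // negative output, no menu (e.g., Settings icon)
```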
In some embodiments, prior to displaying the menu, the device displays (2548) a layer under the application icon, and in response to detecting that the first input meets the quick-action-display criteria, the device expands the layer (and moves the layer across the display) to serve as a background for the menu.
In some embodiments, as the first contact approaches the respective intensity threshold, the device changes (2550) the size of the layer dynamically as the intensity of the first contact changes. For example, hint graphic 1108 grows out from under messages launch icon 424 in response to increasing intensity of contact 1106 in
In some embodiments, while displaying the one or more quick action objects, the device detects (2552) movement of the first contact to a respective location on the touch-sensitive surface that corresponds to a respective quick action object of the one or more quick action objects and detects liftoff of the first contact from the touch-sensitive surface while the first contact is at the respective location on the touch-sensitive surface. In response to detecting liftoff of the first contact, the device performs the respective quick action. For example, contact 1150 moves from over messages launch icon 424 in
In some embodiments, while displaying the one or more quick action objects, the device detects (2554) movement of the first contact to a respective location on the touch-sensitive surface that corresponds to a respective quick action object of the one or more quick action objects and detects an increase in the characteristic intensity of the contact that meets action-selection criteria (e.g., the contact is substantially stationary and the characteristic intensity of the contact increases over a threshold intensity) while the first contact is at the respective location on the touch-sensitive surface. In response to detecting that the first contact meets the action-selection criteria, the device performs the respective quick action. For example, contact 1154 decreases in intensity below intensity threshold ITL and moves from over music launch icon 480 in
In some embodiments, after displaying the one or more quick action objects, the device detects (2556) liftoff of the contact from the touch-sensitive surface and detects a subsequent touch input on the touch-sensitive surface at a location that corresponds to a respective quick action object of the one or more quick action objects (e.g., a tap gesture). In response to detecting the subsequent touch input on the touch-sensitive surface at a location that corresponds to the respective quick action object, the device performs the respective quick action. For example, in response to a tap gesture including contact 1120 on quick action object 1114 in
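Operations (2552)-(2556) describe three ways of invoking a quick action once the menu is shown: sliding onto the item and lifting off, pressing harder while over the item, or tapping the item after lift-off. The sketch below names those three gestures explicitly; the enum, the threshold comparison, and all identifiers are assumptions for illustration.

```swift
// Three ways a displayed quick action object can be selected.
enum QuickActionGesture {
    case slideAndLiftOff(overItem: Bool)                      // (2552)
    case pressOverItem(intensity: Double, threshold: Double)  // (2554) action-selection criteria
    case subsequentTap(onItem: Bool)                          // (2556)
}

func shouldPerformQuickAction(for gesture: QuickActionGesture) -> Bool {
    switch gesture {
    case let .slideAndLiftOff(overItem):
        return overItem
    case let .pressOverItem(intensity, threshold):
        return intensity > threshold
    case let .subsequentTap(onItem):
        return onItem
    }
}

print(shouldPerformQuickAction(for: .pressOverItem(intensity: 0.9, threshold: 0.6)))  // true
```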
In some embodiments, launching the first application in response to detecting the first touch input includes (2558) displaying a default view of the application. In some embodiments, the one or more quick action objects include a respective quick action object that is associated with a non-default view of the application (e.g., user interface 1122 for the messaging application in
In some embodiments, the one or more quick action objects include (2560) a quick action object that is associated with a function of the first application. In some embodiments, the device detects selection of the respective quick action object. In response to detecting selection of the respective quick action object, the device performs the function (e.g., takes a picture, starts to record audio or video, stops recording audio or video, starts/stops/pauses playback of media). In some embodiments, the function is performed without displaying a user interface of the first application (e.g., the device starts recording audio without displaying a user interface for the audio application and instead shows a status indicator in the application launch user interface indicating that audio is being recorded). For example, selection of quick action option 1162 in
In some embodiments, the one or more quick action objects include (2562) a quick action object that is associated with a function of an application other than the first application. In some embodiments, the device detects selection of the respective quick action object. In response to detecting selection of the respective quick action object, the device performs the function (e.g., launches a music recognition program from the music store app icon where the music recognition program is a system functionality that is not specific to the music store app).
In some embodiments, the first application is (2564) a content creation application and the one or more quick action objects include a respective quick action object that is associated with creating new content (e.g., a document, an email, a message, a video, etc.). For example, selection of quick action option 1118 in
In some embodiments, the first application is (2566) a content creation application and the one or more quick action objects include a respective quick action object that is associated with opening previously created content (e.g., a document, an email, a message, a video, etc.). In some embodiments, the device detects selection of the respective quick action object. In response to detecting selection of the respective quick action object, the device opens the application and displays the previously created content within the application (e.g., opens a most recent document, email, message, or video).
It should be understood that the particular order in which the operations in
In accordance with some embodiments,
As shown in
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in information processing apparatus such as general purpose processors (e.g., as described above with respect to
The device displays (2702), on the display, a first user interface (e.g., a home screen) that includes a plurality of user interface objects (e.g., application launch icons), wherein a respective user interface object is associated with a corresponding set of menu options (e.g., each application launch icon has a corresponding set of menu options that are displayed in a menu over a portion of the first user interface when the application icon is selected). For example, user interface 500 displays application launch icons 480, 426, 428, 482, 432, 434, 436, 438, 440, 442, 444, 446, 484, 430, 486, 488, 416, 418, 420, and 424 in
The device detects (2704), via the one or more input devices, a first input that corresponds to a request to display menu options for a first user interface object of the plurality of user interface objects (e.g., a long press or, for a device with one or more sensors for detecting intensity of contacts on a touch-sensitive surface, a press characterized by an increase in intensity of a contact above a first threshold while a focus selector is over the first user interface object). For example, device 100 detects an increase in the intensity of contact 502 above intensity threshold ITL while positioned over messages launch icon 424 in
In some embodiments, the first user interface object is (2706) an application icon that corresponds to a first application program (e.g., an application icon for an application program (e.g., “Mail”, “iTunes”, etc.) that is displayed on a home screen). For example, messages launch icon 424 displayed on home screen user interface 500 in
In some embodiments, while displaying the menu items in the menu that corresponds to the first user interface object (e.g., overlaid on top of the first user interface), the device detects (2708) a second input that corresponds to a request to select the first user interface object (e.g., detects a tap gesture on the first user interface object (e.g., the application icon for an application program (e.g., “Mail”, “iTunes”, etc.))). In some embodiments, detecting the tap gesture on the first user interface object includes detecting touch-down of a contact followed by lift-off of the contact on the touch-sensitive surface within a first threshold amount of time, and while a focus selector is at the location of the first user interface object on the first user interface. In some embodiments, during the first threshold amount of time, intensity of the contact is taken into consideration when responding to the second input. In response to detecting the second input that corresponds to the request to select the first user interface object, the device launches the first application program; and ceases to display the first user interface and the menu that corresponds to the first user interface object (e.g., the first user interface and the menu are replaced with a user interface of the first application program). For example, while displaying quick action menu 528 in
In some embodiments, while displaying the first user interface without displaying the menu that corresponds to the first user interface object, a respective input that corresponds to a request to select the first user interface object (e.g., a tap gesture on the first user interface object (e.g., the application icon for an application program (e.g., “Mail”, “iTunes”, etc.))) launches (2710) the first application program. For example, device 100 detects a tap gesture including contact 1102 on messages icon 424 in home screen user interface 1100, while no quick-action menu is displayed in
In some embodiments, while displaying the menu items in the menu that corresponds to the first user interface object (e.g., overlaid on top of the first user interface), the device detects (2712) a first portion of a third input that corresponds to a request to enter a user interface reconfiguration mode (e.g., detects a long press gesture on the first user interface object (e.g., the application icon for an application program (e.g., “Mail”, “iTunes”, etc.))). In some embodiments, detecting the long press gesture on the first user interface object includes detecting touch-down of a contact on the touch-sensitive surface followed by maintenance of a characteristic intensity of the contact below a respective intensity threshold for at least a second threshold amount of time (that is greater than the first threshold amount of time), and while a focus selector is at the location of any of the plurality of user interface objects on the first user interface (e.g., at the location of the first user interface object on the first user interface). In response to detecting the first portion of the third input that corresponds to the request to enter the user interface reconfiguration mode, the device enters the user interface reconfiguration mode; and ceases to display the menu that corresponds to the first user interface object. For example, while displaying quick-action menu 1110 in
In some embodiments, while in the user interface reconfiguration mode: the device detects (2714) a second portion of the third input that corresponds to a request to move the first user interface object from a first location in the first user interface to a second location in the first user interface (e.g., detects a drag gesture on the first user interface object (e.g., the application icon for an application program (e.g., “Mail”, “iTunes”, etc.))). In some embodiments, detecting the drag gesture on the first user interface object includes detecting movement of the contact (e.g., the same contact in the long press that triggered the user interface reconfiguration mode) that drags the first user interface object to a different location in the first user interface. In response to detecting the second portion of the third input that corresponds to the request to move the first user interface object from the first location in the first user interface to the second location in the first user interface, the device reconfigures the first user interface (e.g., moves the first user interface object from the first location to the second location in the first user interface, and optionally moves one or more other user interface objects in the first user interface to accommodate the first user interface object). For example, upon detecting movement 1170 of contact 1136 from position 1136-a in
In some embodiments, while displaying the first user interface without displaying the menu that corresponds to the first user interface object, a respective input that corresponds to a request to enter the user interface reconfiguration mode (e.g., detecting a long press gesture on the first user interface object (e.g., the application icon for an application program (e.g., “Mail”, “iTunes”, etc.))) causes (2716) the electronic device to enter the reconfiguration mode. For example, while not displaying a quick action menu, the device detects a long-press gesture, including contact 1130 in
In response to detecting the first input, the device displays (2718) menu items in a menu that corresponds to the first user interface object (e.g., a quick action menu with a small subset of the most frequently used or relevant menu options for the application that corresponds to the first user interface object is displayed over the first user interface). For example, in response to detecting the increase in the intensity of contact 502 above intensity threshold ITL while positioned over messages launch icon 424, device 100 displays a corresponding quick action menu in
In some embodiments, the second order is (2720) opposite to the first order. For example, the order of action items in quick-action menu 528 in
In some embodiments, the menu items in the menu that corresponds to the first user interface object have associated priorities relative to one another, and the highest priority menu item in the menu is (2722) displayed closest to the first user interface object. For example, as illustrated for quick action menu 504 in
In some embodiments, the first user interface object is (2724) an application launch icon, and the menu for the first user interface object includes a menu item that when activated initiates a process for sending to a second electronic device acquisition information for an application that corresponds to the application launch icon. For example, activating menu item 568 (“Share”) in quick-action menu 558, illustrated in
In some embodiments, in accordance with the determination that the first user interface object is at the first location in the first user interface (e.g., the upper left corner of the home screen), the device extends (2726) the menu that corresponds to the first user interface object away from the first user interface object in a first direction (e.g., vertically downward from the top to the bottom of the home screen). For example, quick-action menus 528 and 571 are displayed on the top half of user interface 500 in
In some embodiments, a plurality of menu items in the menu that corresponds to the first user interface object each includes (2728) a respective graphic and respective text, and a displayed arrangement of the respective graphics and the respective text of said plurality of menu items in the menu is determined based on the location of the first user interface object in the first user interface. For example, quick-action menus 504 and 528 are located on the right side of user interface 500 in
In some embodiments, in accordance with the determination that the first user interface object is at the first location (e.g., upper left corner of the home screen), the respective text of each menu item is (2730) arranged to the right of the respective graphic of the menu item in the menu that corresponds to the first user interface object (and the menu items are in the first order (e.g., with decreasing priority from top to bottom of the menu)). For example, quick-action menu 571 is displayed in the upper-left quadrant of user interface 500 in
In some embodiments, in accordance with the determination that the first user interface object is at the second location (e.g., lower right corner of the home screen), the respective text of each menu item is arranged (2732) to the left of the respective graphic of the menu item in the menu that corresponds to the first user interface object (and the menu items are in the second order (e.g., with decreasing priorities from bottom to top of the menu)). For example, quick-action menu 504 is displayed in the lower-right quadrant of user interface 500 in
In some embodiments, in accordance with the determination that the first user interface object is at a third location (e.g., upper right corner of the home screen), the respective text of each menu item is arranged (2734) to the left of the respective graphic of the menu item in the menu that corresponds to the first user interface object and the menu items in the menu are in the first order (e.g., with decreasing priorities from top to bottom of the menu). For example, quick-action menu 528 is displayed in the upper-right quadrant of user interface 500 in
In some embodiments, in accordance with the determination that the first user interface object is at a fourth location (e.g., lower left corner of the home screen), the respective text of each menu item is arranged (2736) to the right of the respective graphic of the menu item in the menu that corresponds to the first user interface object and the menu items in the menu are in the second order (e.g., with decreasing priorities from bottom to top of the menu). For example, quick-action menu 574 is displayed in the lower-left quadrant of user interface 500 in
In some embodiments, the first user interface object includes a respective icon graphic, and the respective icon graphic of the first user interface object is aligned (2738) with the respective graphics of the menu items in the menu that corresponds to the first user interface object. For example, quick-action menus 571 and 574 are aligned with the left edge of corresponding messages launch icon 424 in
In some embodiments, the plurality of user interface objects are arranged (2740) in a grid in the first user interface, the first user interface object is located at a first position in the grid, and the menu is extended in a respective direction vertically (e.g., above or below the first user interface object) and a respective direction horizontally (e.g., to the left or to the right of the first user interface object) relative to the first user interface object such that the menu covers a portion of the first user interface without covering the first user interface object at the first position. For example, as described for quick-action menus 504, 528, 571, and 574 above, and illustrated in
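The quadrant-dependent layout rules described above (extension direction, item ordering, and text/graphic arrangement) can be summarized in a short sketch. The following Swift fragment is illustrative only; the type and function names are hypothetical and do not correspond to any platform API.

```swift
// Hypothetical sketch of the quadrant-dependent quick-action menu layout.
enum Quadrant { case upperLeft, upperRight, lowerLeft, lowerRight }

struct MenuLayout {
    let extendsDownward: Bool  // true: menu extends below the icon; items run top-to-bottom in decreasing priority
    let textOnRight: Bool      // true: text placed to the right of each item's graphic
}

func layout(for quadrant: Quadrant) -> MenuLayout {
    switch quadrant {
    case .upperLeft:  return MenuLayout(extendsDownward: true,  textOnRight: true)
    case .upperRight: return MenuLayout(extendsDownward: true,  textOnRight: false)
    case .lowerLeft:  return MenuLayout(extendsDownward: false, textOnRight: true)
    case .lowerRight: return MenuLayout(extendsDownward: false, textOnRight: false)
    }
}

// Items are supplied in decreasing priority; when the menu extends upward,
// the displayed order is reversed so the highest-priority item stays nearest the icon.
func displayedOrder(_ itemsByPriority: [String], using layout: MenuLayout) -> [String] {
    layout.extendsDownward ? itemsByPriority : Array(itemsByPriority.reversed())
}
```

Under this sketch, an object in the lower-right quadrant yields a menu that extends upward with text to the left of each graphic and the highest-priority item adjacent to the object, consistent with the arrangement described above for quick-action menu 504.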
In some embodiments, while displaying the menu that corresponds to the first user interface object, the device visually emphasizes (2742) the first user interface object relative to other user interface objects in the plurality of user interface objects in the first user interface. In some embodiments, in response to the first input that corresponds to the request to display menu options that correspond to the first user interface object, the device highlights (e.g., enlarges, lifts up, brightens, etc.) the first user interface object and/or deemphasizes (e.g., blurs, dims, darkens, masks, etc.) the other user interface objects in the plurality of user interface objects in the first user interface. For example, launch icons other than messages launch icon 424 are blurred and displayed smaller than messages launch icon 424 in
In some embodiments, the device receives (2744), by an operating system of the electronic device, menu generation data from an application associated with the first user interface object, wherein the menu generation data includes the menu items to be included in the menu for the first user interface object and priority information associated with the menu items to be included in the menu for the first user interface object; and generates, by the operating system, the menu for the first user interface object for display on the first user interface, based on the menu generation data received from the application associated with the first user interface object. For example, the third-party application associated with workout launch icon 442 provides the operating system of device 100 with information to display menu items “Start Timer” 566, “Monitor Heartbeat” 564, “Start Workout” 562, and “Map New Run” 560 with corresponding priorities 1, 2, 3, and 4, respectively. As illustrated in
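As a sketch of the menu generation data exchange described above, an application might supply items and priorities and the operating system might assemble the menu by sorting on priority. This is illustrative only; the type and function names are hypothetical assumptions, not an actual operating-system API.

```swift
// Hypothetical menu generation data supplied by an application to the operating system.
struct MenuItemData {
    let title: String
    let priority: Int  // 1 = highest priority
}

struct MenuGenerationData {
    let items: [MenuItemData]
}

// The operating system orders the items by priority before laying out the menu.
func generateMenu(from data: MenuGenerationData) -> [String] {
    data.items.sorted { $0.priority < $1.priority }.map { $0.title }
}

let workoutMenu = MenuGenerationData(items: [
    MenuItemData(title: "Map New Run", priority: 4),
    MenuItemData(title: "Start Workout", priority: 3),
    MenuItemData(title: "Monitor Heartbeat", priority: 2),
    MenuItemData(title: "Start Timer", priority: 1),
])
print(generateMenu(from: workoutMenu))
// ["Start Timer", "Monitor Heartbeat", "Start Workout", "Map New Run"]
```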
In some embodiments, the device moves (2746) the first user interface object on the first user interface from the first location (or the second location) to a new location in the first user interface, different from the first location (or the second location), and after moving the first user interface object to the new location in the first user interface, the device detects, via the one or more input devices, a second input that corresponds to a second request to display the menu options for the first user interface object (e.g., a long press or, for a device with one or more sensors for detecting intensity of contacts on a touch-sensitive surface, a press characterized by an increase in intensity of a contact above a first threshold while a focus selector is over the first user interface object). In response to detecting the second input, the device displays the menu items in the menu that corresponds to the first user interface object in a new order that is different from the first order (or the second order) in accordance with the new location of the first user interface object. For example, after moving messages launch icon 424 from the lower right quadrant of user interface 500, as illustrated in
In some embodiments, the device applies (2748) a visual effect to obscure (e.g., blur, darken, mask, etc.) one or more user interface objects of the plurality of user interface objects other than the first user interface object while displaying the menu items in the menu that corresponds to the first user interface object. For example, launch icons other than messages launch icon 424 are blurred and displayed smaller than messages launch icon 424 in
It should be understood that the particular order in which the operations in
In accordance with some embodiments,
As shown in
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in information processing apparatus such as general purpose processors (e.g., as described above with respect to
The device displays (2902), on the display, a user interface that includes a selectable user interface object that is associated with a plurality of actions for interacting with the user interface, wherein the plurality of actions include a direct-selection action and one or more other actions (e.g., user interface objects 1202, 1204, 1206, 1208, and 1210 in user interface 1200 in
While displaying the user interface that includes the selectable user interface object, the device detects (2904) an input that includes detecting a contact on the touch-sensitive surface while a focus selector is over the selectable user interface object (e.g., contact 1212 over user interface object 1208 in
In response to detecting the input that includes detecting the contact, in accordance with a determination that the input meets selection criteria, the device displays (2906), on the display, a menu that includes graphical representations of the plurality of actions that include the direct-selection action and the one or more other actions. In some embodiments, the selection criteria includes a criterion that is met when lift-off of the contact is detected before a characteristic intensity of the contact increases above a respective intensity threshold (e.g., a deep press intensity threshold) used for direct-selection criteria. For example, because contact 1212 in
In some embodiments, each of the direct-selection action and the one or more other actions is (2908) individually selectable in the menu displayed on the user interface. For example, direct-selection action 1216 (reply to sender), action 1218 (reply to all), action 1220 (forward), action 1222 (print), and action 1224 (cancel) are all individually selectable in action menu 1214 illustrated in
In some embodiments, the menu is (2910) displayed after lift-off of the contact is detected (e.g., liftoff of contact 1212 in
In some embodiments, the menu is (2912) displayed when the characteristic intensity of the contact reaches a first intensity value (e.g., the light press intensity threshold) that is lower than the respective intensity threshold (e.g., the deep press intensity threshold) used in the direct-selection criteria (e.g., action menu 1214 is displayed in response to an increase in the intensity of contact 1230 above ITL in
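The two display criteria above can be expressed as a small intensity classifier. The following is an illustrative sketch only; the names and the relative threshold values are assumptions rather than platform constants.

```swift
// Hypothetical intensity thresholds; the light-press value is lower than the deep-press value.
struct Thresholds {
    let lightPress: Double  // menu-presentation threshold
    let deepPress: Double   // threshold used in the direct-selection criteria
}

enum Response { case none, displayMenu, performDirectSelection }

// Display the menu once the characteristic intensity reaches the light-press
// threshold; perform the direct-selection action if it goes on to reach the
// deep-press threshold.
func response(characteristicIntensity intensity: Double, thresholds: Thresholds) -> Response {
    if intensity >= thresholds.deepPress { return .performDirectSelection }
    if intensity >= thresholds.lightPress { return .displayMenu }
    return .none
}
```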
In some embodiments, displaying the menu that includes graphical representations of the plurality of actions that include the direct-selection action and the one or more other actions includes (2914) applying a visual effect (e.g., enlarging, highlighting, etc., the direct-selection action relative to the one or more other actions) to visually distinguish the direct-selection action from the one or more other actions in the menu (e.g., direct-selection action 1216 (reply to sender) is highlighted in
In some embodiments, displaying the menu that includes graphical representations of the plurality of actions that include the direct-selection action and the one or more other actions includes (2916) presenting the menu gradually (e.g., the menu grows larger (e.g., expands out from the selectable user interface object), becomes more clear, and/or becomes more complete) in accordance with the increase in intensity of the contact. In some embodiments, the size, clarity, and completeness (e.g., as reflected in the number of actions shown) of the menu are directly manipulated via the intensity of the contact before the characteristic intensity of the contact increases above the first intensity value (e.g., the light press intensity threshold). For example, in response to an increase in the intensity of contact 1230 above a "hint" threshold (e.g., ITH), action menu 1214 grows dynamically from user interface object 1208 in
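One way to realize the gradual presentation described above is to map the characteristic intensity between the hint threshold and the light-press threshold to a presentation progress. The sketch below is illustrative; the names and the linear mapping are assumptions chosen for clarity.

```swift
// Progress of 0 means no menu; 1 means the menu is fully presented.
// The value can drive the menu's scale, clarity, and the number of visible items.
func menuPresentationProgress(intensity: Double,
                              hintThreshold: Double,
                              lightPressThreshold: Double) -> Double {
    if intensity <= hintThreshold { return 0 }
    if intensity >= lightPressThreshold { return 1 }
    return (intensity - hintThreshold) / (lightPressThreshold - hintThreshold)
}
```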
In some embodiments, the menu is (2918) displayed overlaid over a portion of the user interface and adjacent to the selectable user interface object (e.g., action menu 1214 is displayed over a portion of the email viewed in user interface 1200 and above user interface object 1208 in
In some embodiments, performing the direct-selection action includes (2920) updating the user interface (e.g., display of email viewing user interface 1200 is replaced with display of message replying user interface 1234 in
In some embodiments, the selectable user interface object corresponds (2922) to a message interface (e.g., an email interface presenting an email message), and the menu includes a reply action as the direct-selection action, and a reply all action and a forward action as the other actions (e.g., as illustrated in
In some embodiments, the selectable user interface object corresponds (2924) to a camera icon (e.g., a camera icon in the home screen or within an application user interface (e.g., an instant messaging user interface)), and the menu includes a still camera mode as the direct-selection action, and a video camera mode and a panorama mode as the other actions. In some embodiments, the user interface object is an icon on the lock screen of the device (e.g., camera icon 808 on lock screen user interface 800 in
In some embodiments, in accordance with the determination that the input meets direct-selection criteria, the device applies (2926) a second visual effect (e.g., enlarges, highlights, lifts up, pushes back, etc.) to the direct-selection action to visually distinguish the direct-selection action from the one or more other actions in the menu (e.g., reply action option 1216 is highlighted and initially increases in size after being selected as the direct-selection action in
In some embodiments, in accordance with the determination that the input meets direct-selection criteria, the device gradually fades (2928) out the other actions to visually emphasize the direct-selection action in the menu. For example, in some embodiments, when the contact intensity reaches above the deep press intensity threshold, the other actions are optionally blurred out in the menu, while the direct-select action remains visible and clear. In some embodiments, the gradual fading progresses dynamically as the characteristic intensity of the contact changes (e.g., as the intensity of the contact increases, the other actions progressively fade relative to the direct-selection action). For example, unselected action options 1218, 1220, 1222, and 1224 are blurred upon selection of direct-selection action 1216 in
In some embodiments, in accordance with the determination that the input meets direct-selection criteria, the device gradually shrinks (2930) the menu to conceal the other actions in the menu while the direct-selection action remains displayed in the menu. For example, in some embodiments, when the contact intensity reaches above the deep press intensity threshold, the representations of the other actions collapse toward the representation of the direct-selection action in the menu and become concealed behind the representation of the direct-selection action. In some embodiments, the gradual shrinking progresses dynamically as the characteristic intensity of the contact changes (e.g., as the intensity of the contact increases, the other actions progressively get smaller relative to the direct-selection action). For example, the sizes of unselected action options 1218, 1220, 1222, and 1224 are decreased upon selection of direct-selection action 1216 in
In some embodiments, in accordance with the determination that the input meets direct-selection criteria, the device moves (2932) the direct-selection action closer to the focus selector. For example, in some embodiments, when the contact intensity reaches above the deep press intensity threshold, the representation of the direct-selection action moves toward the focus selector, while the other actions fade away or collapse toward the representation of the direct-selection action to eventually become concealed behind the representation of the direct-selection action when the direct-selection action arrives beneath the focus selector. In some embodiments, the movement of the direct-selection action closer to the focus selector progresses dynamically as the characteristic intensity of the contact changes (e.g., as the intensity of the contact increases, the direct-selection action progressively moves toward the detected contact). For example, the device animates the transition to a selected user interface, after selection of the direct-selection action 1216, in
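The emphasis effects in operations (2928)-(2932) can be sketched as a single appearance function driven by the intensity between the light-press and deep-press thresholds. All names and the particular curves below are illustrative assumptions.

```swift
struct ActionAppearance {
    var opacity: Double             // 1 = fully visible
    var scale: Double               // 1 = original size
    var offsetTowardContact: Double // points moved toward the focus selector
}

func appearance(isDirectSelection: Bool,
                intensity: Double,
                lightPress: Double,
                deepPress: Double,
                distanceToContact: Double) -> ActionAppearance {
    // Normalized progress between the light-press and deep-press thresholds.
    let t = min(max((intensity - lightPress) / (deepPress - lightPress), 0), 1)
    if isDirectSelection {
        // The direct-selection action stays visible and moves toward the contact.
        return ActionAppearance(opacity: 1, scale: 1, offsetTowardContact: t * distanceToContact)
    } else {
        // The other actions gradually fade and shrink as intensity increases.
        return ActionAppearance(opacity: 1 - t, scale: 1 - 0.5 * t, offsetTowardContact: 0)
    }
}
```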
In some embodiments, while displaying the menu in accordance with the determination that the input meets selection criteria, the device detects (2934) a termination of the input. Thus, in some embodiments, the menu persists even after the input is terminated (e.g., even after detecting liftoff of the contact). In addition, the device detects a second input including detecting a second contact on the touch-sensitive surface while the focus selector is outside of the displayed menu (e.g., the second input is optionally a tap input detected outside of the displayed menu, or a swipe input across the displayed menu that ends outside of the displayed menu). In response to detecting the second input, the device ceases to display the menu. For example, a tap gesture including contact 1238 outside of the action menu 1214 in
In some embodiments, while displaying the menu in accordance with the determination that the input meets selection criteria (e.g., when a characteristic intensity of the contact increases above a first intensity value (e.g., the light press threshold) below the respective intensity threshold used for the direct-selection criteria (e.g., the deep press intensity threshold)), the device detects (2936) a movement of the contact that corresponds to a movement of the focus selector over to a first action of the one or more other actions (e.g., movement 1242 of contact 1240 from position 1240-a in
It should be understood that the particular order in which the operations in
In accordance with some embodiments,
As shown in
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in information processing apparatus such as general purpose processors (e.g., as described above with respect to
As noted above, there is a need for electronic devices with improved methods and interfaces for teaching new user interface capabilities and features to the user, such as new contact-intensity based capabilities and features. In the embodiments described below, intensity sensitive user interface objects are revealed in response to a detected input at a location away from the intensity sensitive user interface objects. In this way, an electronic device provides information to a user about which user interface objects in a user interface will be responsive to contact intensity when input is provided at the user interface object. This approach allows a user interface to identify intensity sensitive user interface elements without consuming space in the interface on a dedicated user interface element that the user must select to reveal the intensity sensitive user interface elements.
Below,
In some embodiments, the device is an electronic device with a separate display (e.g., display 450) and a separate touch-sensitive surface (e.g., touch-sensitive surface 451). In some embodiments, the device is portable multifunction device 100, the display is touch-sensitive display system 112, and the touch-sensitive surface includes tactile output generators 167 on the display (
In
As illustrated in
As illustrated in
As illustrated in
As illustrated in
As illustrated in
As illustrated in
As illustrated in
In
In
In
In
As described below, the method 3200 provides an intuitive way to indicate intensity sensitive user interface objects in a user interface. The method reduces the number, extent, and/or nature of the inputs from a user and produces a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to learn about intensity sensitive user interface objects in the user interface faster and more efficiently conserves power and increases the time between battery charges.
The device displays (3202), on the display, a user interface (e.g., user interface 400 in
While displaying the user interface that includes the plurality of user interface elements, the device detects (3204) a first input that includes detecting a first contact (e.g., contact 3104 in
In some embodiments, the first operation associated with the first object includes (3208) emphasizing the first object relative to the second object. In some embodiments, the first operation associated with the first object also includes emphasizing the first object relative to one or more regions of the user interface that are separate from the first object and the second object, and are not associated with object-specific responses to changes in contact intensity. In some embodiments, emphasizing the first object relative to the second object includes enhancing the appearance of the first object by, e.g., highlighting, magnifying, lifting up from the user interface plane, and/or animating the first object to make the first object more distinct on the display than the second object, while maintaining the appearance of the second object (and optionally, the appearance of some or all other objects in the remainder of the user interface). In some embodiments, emphasizing the first object relative to the second object includes obscuring the second object (and optionally, some or all other objects in the remainder of the user interface) by, e.g., blurring, shrinking, and/or masking, to make the second object (and some or all other objects in the remainder of the user interface) less clear or distinct on the display, while maintaining the appearance of the first object in the user interface. In some embodiments, emphasizing the first object relative to the second object includes enhancing the appearance of the first object, while obscuring the second object (and optionally, some or all other objects in the remainder of the user interface). In some embodiments, emphasizing the first object relative to the second object includes providing a visual hint that the first object is an object that would respond to changes in contact intensity by producing an object-specific response (e.g., providing a preview or displaying a quick action menu that is specific to the first object).
In some embodiments, an amount of visual effect applied to emphasize the first object relative to the second object is dynamically varied in accordance with a current change in the characteristic intensity of the contact above the first intensity threshold. In some embodiments, an amount of visual effect applied to emphasize the second object relative to the first object, and an amount of visual effect applied to emphasize the first and second objects relative to other objects that do not have associated object-specific operations that are triggered by changes in contact intensity, are dynamically varied in accordance with a current change in the characteristic intensity of the contact.
In some embodiments, the second operation associated with the second object includes (3212) emphasizing the second object relative to the first object. In some embodiments, the second operation associated with the second object also includes emphasizing the second object relative to one or more regions of the user interface that are separate from the first object and the second object, and that are not associated with object-specific responses to changes in contact intensity. In some embodiments, emphasizing the second object relative to the first object includes enhancing the appearance of the second object by, e.g., highlighting, magnifying, lifting up from the user interface plane, and/or animating the second object to make the second object more distinct on the display than the first object, while maintaining the appearance of the first object (and optionally, the appearance of some or all other objects in the remainder of the user interface). In some embodiments, emphasizing the second object relative to the first object includes obscuring the first object (and optionally, some or all other objects in the remainder of the user interface) by, e.g., blurring, shrinking, and/or masking, to make the first object (and some or all other objects in the remainder of the user interface) less clear or distinct on the display, while maintaining the appearance of the second object in the user interface. In some embodiments, emphasizing the second object relative to the first object includes enhancing the appearance of the second object, while obscuring the first object (and optionally, some or all other objects in the remainder of the user interface). In some embodiments, emphasizing the second object relative to the first object includes providing a visual hint that the second object is an object that would respond to changes in contact intensity by producing an object-specific response (e.g., providing a preview or displaying a quick action menu that is specific to the second object).
In some embodiments, the third operation includes (3214) emphasizing the first object and the second object. In some embodiments, the third operation includes emphasizing the first object and the second object relative to one or more regions of the user interface that are separate from the first object and the second object and that are not associated with object-specific responses to changes in contact intensity.
In some embodiments, the emphasizing in the third operation includes (3216) emphasizing the first object in the same way that the first operation emphasizes the first object and emphasizing the second object in the same way that the second operation emphasizes the second object (e.g., by blurring all other objects (and optionally, background regions) that are not subject to the emphasizing in the user interface).
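Taken together, the preceding paragraphs describe a location-based dispatch when the characteristic intensity rises above the first intensity threshold. The sketch below is illustrative only; the Element type and the operations are hypothetical placeholders, not part of the described embodiments.

```swift
// Hypothetical sketch of the location-based dispatch on an intensity-based press.
struct Element {
    let frame: (x: Double, y: Double, width: Double, height: Double)
    let respondsToIntensity: Bool  // has an object-specific intensity-triggered operation

    func contains(_ point: (x: Double, y: Double)) -> Bool {
        point.x >= frame.x && point.x <= frame.x + frame.width &&
        point.y >= frame.y && point.y <= frame.y + frame.height
    }
}

enum PressResponse {
    case emphasize(Element)       // first or second operation: emphasize the pressed object
    case emphasizeAll([Element])  // third operation: emphasize every intensity-responsive object
}

func respond(toPressAt location: (x: Double, y: Double), elements: [Element]) -> PressResponse {
    if let pressed = elements.first(where: { $0.respondsToIntensity && $0.contains(location) }) {
        return .emphasize(pressed)
    }
    // The press is away from any object with an object-specific intensity response.
    return .emphasizeAll(elements.filter { $0.respondsToIntensity })
}
```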
In some embodiments, the first object is (3218) associated with a first type of intensity-triggered operation (e.g., providing a preview associated with the first object in response to contact intensity meeting a preview-presentation criterion (e.g., also referred to as a "peek" criterion), and providing content represented in the preview in response to contact intensity meeting a user interface transition criterion (e.g., also referred to as a "pop" criterion)) (e.g., when the first object is a first web link, the first type of intensity-triggered operation associated with the first object includes presenting a preview of a first webpage represented in the first web link, when the contact intensity reaches a preview-presentation intensity threshold (e.g., the "peek" intensity threshold), and/or presenting the first webpage when the contact intensity reaches a user interface transition intensity threshold (e.g., the "pop" intensity threshold)). This is illustrated in
In some embodiments, the second object is (3220) associated with a second type of intensity-triggered operation (e.g., providing a quick action menu associated with the second object in response to contact intensity meeting a menu-presentation criterion (e.g., as illustrated in
In some embodiments, the first object is (3222) associated with a first type of intensity-triggered operation for revealing first content associated with the first object (e.g., when the first object is a first web link, the first type of intensity-triggered operation associated with the first object includes presenting a preview of a first webpage represented in the first web link, when the contact intensity reaches a first intensity threshold (e.g., the “peek” intensity threshold), and presenting the first webpage when the contact intensity reaches a second intensity threshold (e.g., the “pop” intensity threshold)). This is illustrated in
In some embodiments, the second object is (3224) associated with the first type of intensity-triggered operation for revealing second content associated with the second object (e.g., when the second object is a second web link, the first type of intensity-triggered operation associated with the second object includes presenting a preview of a second webpage represented in the second web link, when the contact intensity reaches the first intensity threshold (e.g., the “peek” intensity threshold), and presenting the second webpage when the contact intensity reaches the second intensity threshold (e.g., the “pop” intensity threshold)).
In some embodiments, the first object is (3226) associated with a first type of action API associated with changes in contact intensity. In some embodiments, the device determines whether the first object is associated with a Peek-and-Pop API. In some embodiments, the device determines whether the first object is associated with a Quick Action Menu API. In some embodiments, if the electronic device determines that an object at the location of the focus selector is not associated with any action API that responds to changes in contact intensity, the device determines that an appropriate response is to visually distinguish/emphasize the objects that are associated with the Peek-and-Pop API or the Quick Action API in the user interface.
In some embodiments, performing the first operation associated with the first object includes (3228) presenting first information that corresponds to the first object (e.g., a "peek" operation for the first object) when the characteristic intensity of the contact increases above the first intensity threshold (e.g., a light press threshold); and presenting second information, distinct from the first information, that corresponds to the first object (e.g., a "pop" operation for the first object) when the characteristic intensity of the contact increases above a second intensity threshold (e.g., a deep press threshold) that is greater than the first intensity threshold. In some embodiments, the first intensity threshold is greater than a contact detection threshold. In some embodiments, the first intensity threshold is the "peek" intensity threshold.
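Operation (3228) describes a two-stage response. The following is an illustrative sketch with hypothetical names; only the relationship between the two thresholds is taken from the description above.

```swift
enum PeekPopStage { case none, peek, pop }

// firstThreshold (e.g., a light press) gates the "peek" preview; secondThreshold
// (e.g., a deep press) gates the "pop" that presents the second information.
func stage(forIntensity intensity: Double,
           firstThreshold: Double,
           secondThreshold: Double) -> PeekPopStage {
    precondition(secondThreshold > firstThreshold)
    if intensity > secondThreshold { return .pop }
    if intensity > firstThreshold { return .peek }
    return .none
}
```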
In some embodiments, the first information that corresponds to the first object is (3230) a preview associated with the first object (e.g., preview 3128 in
In some embodiments, performing the second operation associated with the second object includes (3232) presenting first information that corresponds to the second object (e.g., presenting a quick action menu for the second object) when the characteristic intensity of the contact increases above the first intensity threshold (e.g., a light press threshold); and performing an action represented in the first information that corresponds to the second object (e.g., performing a direct-selection action in the quick action menu for the second object) when the characteristic intensity of the contact increases above a second intensity threshold (e.g., a deep press threshold) that is greater than the first intensity threshold. In some embodiments, the first intensity threshold is greater than a contact detection threshold. In some embodiments, the first intensity threshold is the "peek" intensity threshold.
In some embodiments, the first information that corresponds to the second object is (3234) a menu of actions associated with the second object, and the action represented in the first information that corresponds to the second object is a direct-selection action represented in the menu of actions associated with the second object. For example, the second object is a representation of a contactable entity (e.g., a name or avatar of a user), and a quick action menu with actions (such as "call", "message", "FaceTime", "email", etc.) is presented in response to the contact intensity increasing above the first intensity threshold (e.g., a menu-presentation intensity threshold), and a default direct-selection action (e.g., "call") is selected and performed (e.g., a default phone number of the contact is dialed) when the contact intensity increases above the second intensity threshold (e.g., a direct-selection intensity threshold).
In some embodiments, while displaying the user interface on the display, the device detects (3236) a second input (e.g., a tap gesture) that includes detecting a second contact on the touch-sensitive surface followed by lift-off of the second contact without detecting an increase in a characteristic intensity of the second contact above the first intensity threshold; and, in response to detecting the second input, in accordance with a determination that a focus selector is at the first location in the user interface at which the first object is displayed, the device performs a second operation associated with the first object that is distinct from the first operation associated with the first object (e.g., the first operation associated with the first object includes displaying additional information (e.g., a preview or a quick action menu) associated with the first object, and the second operation associated with the first object includes displaying a second user interface associated with the first object) (e.g., as illustrated in 31K-31L). For example, if the first object is an application icon for an email program on the home screen, performing the first operation associated with the application icon includes displaying a menu of actions that are associated with the email program (e.g., compose, go to inbox, go to contact list, etc.), and performing the second operation associated with the application icon includes activating the email program. If the first object is a hyperlink in a document, performing the first operation associated with the hyperlink includes displaying a preview of a webpage associated with the hyperlink (e.g., as illustrated in 31G-31I), and performing the second operation associated with the hyperlink includes displaying the webpage associated with the hyperlink in a browser interface (e.g., as illustrated in 31K-31L). If the first object is an avatar of a user, the first operation associated with the avatar includes displaying a menu of actions that are associated with performing various communication functions in connection with the user, and the second operation associated with the avatar includes displaying a contact card for the user represented by the avatar. Further, in response to detecting the second input, in accordance with a determination that a focus selector is at the location in the user interface that is away from any objects that are associated with object-specific operations that are triggered by changes in contact intensity, the device performs a fourth operation that corresponds to a user interface element (e.g., the user interface element at which the focus selector is located at the time of lift-off of the second contact) in the remainder of the user interface (e.g., if the user interface element is a selectable button that is not associated with a Peek-and-Pop API or Quick Action API, performing the third operation includes visually distinguishing (e.g., highlighting) all objects in the user interface that are associated with respective object-specific operations that are triggered by changes in contact intensity in the user interface, and performing the fourth operation includes performing an operation associated with selecting/activating the selectable button.
If the user interface element is non-editable text, performing the third operation includes visually distinguishing (e.g., highlighting) all objects in the user interface that are associated with respective object-specific operations that are triggered by changes in contact intensity in the user interface, and performing the fourth operation includes selecting a portion of the text and optionally displaying a menu on the user interface (e.g., a menu showing actions such as “copy, select all, define”)). This is illustrated in
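The contrast between the intensity-based input and the tap input in the two preceding paragraphs can be summarized as a two-dimensional dispatch over gesture type and target. The sketch below is illustrative only; all names are hypothetical.

```swift
enum Gesture { case tap, deepPress }  // deepPress: characteristic intensity above the first intensity threshold
enum Target { case intensityResponsiveObject, other }

enum Operation {
    case showObjectSpecificResponse        // e.g., a preview or quick action menu for the object
    case activateObject                    // e.g., open the linked webpage or launch the application
    case revealIntensityResponsiveObjects  // visually distinguish all objects with intensity responses
    case performDefaultElementAction       // e.g., activate a plain button or select text
}

func operation(for gesture: Gesture, on target: Target) -> Operation {
    switch (gesture, target) {
    case (.deepPress, .intensityResponsiveObject): return .showObjectSpecificResponse
    case (.tap,       .intensityResponsiveObject): return .activateObject
    case (.deepPress, .other):                     return .revealIntensityResponsiveObjects
    case (.tap,       .other):                     return .performDefaultElementAction
    }
}
```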
It should be understood that the particular order in which the operations in
In accordance with some embodiments,
As shown in
As described below, the method 3400 provides an intuitive way to identify objects that are associated with object-specific intensity sensitive operations. The method reduces the cognitive burden on a user when learning about new capabilities of the user interface, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to learn about new capabilities of the user interface faster and more efficiently conserves power and increases the time between battery charges.
The device displays (3402) a user interface on the display, wherein the user interface includes a first set of user interface elements (e.g., icons, links, buttons, images, and/or other activatable user interface objects). For a respective user interface element in the first set of user interface elements, the device is configured to respond to user input of a first input type (e.g., a press input with contact intensity above a respective intensity threshold (e.g., a hint intensity threshold, a preview intensity threshold, etc.)) at a location that corresponds to the respective user interface element (e.g., a location that corresponds to a hit region of the respective user interface element) by performing a plurality of operations that correspond to the respective user interface element. For example, user interface objects 3108-3122 in
One of the benefits of this method is that it reveals the first set of user interface elements without requiring any additional user interface elements, which would take up valuable area in the user interface and increase the complexity of the user interface. For example, the user interface does not have a separate “show objects that are configured to respond to deep presses” icon that when activated results in the device visually distinguishing the first set of user interface elements from the remainder of the user interface.
In some embodiments, determining (3408) whether the first location corresponds to the first user interface element in the first set of user interface elements includes determining whether the first location corresponds to a user interface element that has a first type of action API associated with the first input type. In some embodiments, the device determines whether the first location corresponds to a user interface element associated with a Peek-and-Pop API. In some embodiments, the device determines whether the first location corresponds to a user interface element associated with a contact intensity-based input API that needs to be revealed/taught to the user.
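One way to read operation (3408) is as a membership test over action APIs. The sketch below is illustrative; the handlers are modeled as optional closures with hypothetical names and do not denote an actual platform API.

```swift
struct InterfaceElement {
    let identifier: String
    let peekAndPopHandler: (() -> Void)?   // present when the element supports preview/commit responses
    let quickActionHandler: (() -> Void)?  // present when the element supports a quick action menu
}

// An element belongs to the first set if it has any action API associated
// with the first input type (an intensity-based press).
func isIntensityResponsive(_ element: InterfaceElement) -> Bool {
    element.peekAndPopHandler != nil || element.quickActionHandler != nil
}

func firstSet(of elements: [InterfaceElement]) -> [InterfaceElement] {
    elements.filter(isIntensityResponsive)
}
```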
In some embodiments, the first input type is (3410) a press input by a contact on the touch-sensitive surface; the device is configured to respond to the press input by the contact at the location that corresponds to the respective user interface element by performing a first operation that corresponds to the respective user interface element (e.g., a “peek” operation for the respective user interface element, as described herein) when the intensity of the contact exceeds a first intensity threshold (e.g., a light press threshold). In some embodiments, the first intensity threshold is greater than a contact detection threshold. The device is configured to respond to the press input by the contact at the location that corresponds to the respective user interface element by performing a second operation, distinct from the first operation, that corresponds to the respective user interface element (e.g., a “pop” operation for the respective user interface element, as described herein) when the intensity of the contact exceeds a second intensity threshold that is greater than the first intensity threshold (e.g., a deep press threshold).
In some embodiments, the first operation displays (3412) a preview associated with the respective user interface element; and the second operation displays a second user interface associated with the respective user interface element. In some embodiments, the preview is a preview of the second user interface. This is illustrated in
In some embodiments, the first operation displays (3414) a menu of actions associated with the respective user interface element; and the second operation performs an action represented in the menu of actions associated with the respective user interface (e.g., and optionally displays a second user interface associated with the respective user interface element, such as a user interface associated with performance of the action). This is illustrated in
In some embodiments, applying the visual effect to distinguish the first set of user interface elements from the remainder of the user interface on the display includes (3416) enhancing appearances of the first set of user interface elements (e.g., highlighting, magnifying, lifting up from the user interface plane, and/or animating the first set of user interface elements to make the first set of user interface elements more distinct on the display) while maintaining appearances of user interface elements in the remainder of the user interface on the display.
In some embodiments, applying the visual effect to distinguish the first set of user interface elements from the remainder of the user interface on the display includes (3418) obscuring user interface elements in the remainder of the user interface on the display (e.g., blurring, shrinking, and/or masking to make user interface elements in the remainder of the user interface less clear or distinct on the display), while maintaining appearances of the first set of user interface elements on the display.
In some embodiments, applying the visual effect to distinguish the first set of user interface elements from the remainder of the user interface on the display includes (3420) enhancing appearances of the first set of user interface elements, and obscuring user interface elements in the remainder of the user interface on the display.
In some embodiments, while displaying the user interface on the display, the device detects (3422) a second user input of a second input type (e.g., a tap gesture), distinct from the first input type (e.g., a press input with contact intensity above a respective intensity threshold (e.g., a hint intensity threshold, a preview intensity threshold, etc.)), while a focus selector is at the first location in the user interface. In response to detecting the second user input of the second input type while the focus selector is at the first location in the user interface, in accordance with a determination that the first location corresponds to the first user interface element in the first set of user interface elements (e.g., the first location is within a hit region for the first user interface element in the first set of user interface elements), the device performs an operation that corresponds to the first user interface element (e.g., displaying a second user interface associated with the first user interface element). This is illustrated in
It should be understood that the particular order in which the operations in
In accordance with some embodiments,
As shown in
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in information processing apparatus such as general purpose processors (e.g., as described above with respect to
As noted above, there is a need for electronic devices with improved methods and interfaces for previewing media content. With existing methods, gestures used for playing media content are different from gestures used to move the media objects within a user interface. In the embodiments described below, a moving input may result in previews of content associated with different media objects or movement of the media objects on the display, depending on whether the input exceeds a threshold intensity level. Providing a user with the ability to provide input with or without an intensity component allows additional functionality to be associated with the input.
Below,
In some embodiments, the device is an electronic device with a separate display (e.g., display 450) and a separate touch-sensitive surface (e.g., touch-sensitive surface 451). In some embodiments, the device is portable multifunction device 100, the display is touch-sensitive display system 112, and the touch-sensitive surface includes tactile output generators 167 on the display (
A contact on touch screen 112 moves from a location indicated by focus selector 3604 along a path indicated by arrow 3606. A characteristic intensity of the contact is below a media-preview threshold intensity level (e.g., below a “hint” intensity threshold ITH as indicated at intensity meter 3602).
In
In some embodiments, enhanced media preview criteria include a criterion that is met when received input includes an increase in the characteristic intensity of a contact above an enhanced-preview intensity threshold (e.g., ITL). When enhanced media preview criteria are met while a preview of a media object is being output, an enhanced preview of the media object is displayed.
The user interface of
In response to detecting the movement of the contact (e.g., in response to detecting movement of the contact by a predefined distance), portable multifunction device 100 ceases to output the preview of media item 3664 and outputs a preview of a different media item (e.g., media item 3666, as indicated in
In some embodiments, the set of audio tracks of media object 3614 is automatically displayed after the album art is displayed in preview platter 3654 (e.g., after a predefined period of time). In some embodiments, the set of audio tracks of media object 3614 is displayed in response to the detection of the movement of the contact. In some embodiments, the set of audio tracks of media object 3614 is arranged in a loop, and continued upward movement of the contact detected when a preview of the first audio track in the set is being output would cause preview of the last audio track in the set to start. Similarly, continued downward movement of the contact detected when a preview of the last audio track in the set is being output would cause preview of the first audio track in the set to start.
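The looping arrangement described above amounts to modular index arithmetic over the set of audio tracks. A minimal sketch, with hypothetical names:

```swift
// Moving up from the first track wraps to the last; moving down from the last
// track wraps back to the first.
func nextTrackIndex(current: Int, movingUp: Bool, trackCount: Int) -> Int {
    precondition(trackCount > 0)
    let step = movingUp ? -1 : 1
    return ((current + step) % trackCount + trackCount) % trackCount
}
```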
The user interface of
In
The user interface of
In
In
As described below, the method 3700 provides an intuitive way to preview media content. The method reduces the cognitive burden on a user when previewing media content, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to preview media content faster and more efficiently conserves power and increases the time between battery charges.
The device displays (3702), on the display (e.g., touch screen 112), a user interface (e.g., a user interface as shown in any of
In some embodiments, the first media object (e.g., media object 3614) represents (3704) a first media collection (e.g., a music album, a playlist, etc.) that includes multiple media items (e.g., media items 3660-3672 of media object 3614) and the second media object (e.g., media object 3608) represents a second media collection that includes multiple media items. For example, a media object represents an album or playlist that includes multiple audio tracks, a media object represents multiple audio tracks for an artist or band, a media object represents a video series (such as a TV series) that includes multiple videos, a media object represents an image album that includes multiple animated images (e.g., animated .gif files), etc.
While a focus selector 3604 is over the first media object (e.g., media object 3612 in
In some embodiments, the device tilts (3708) the first media object (e.g., media object 3612) from a first orientation of the first media object (e.g., a default or initial orientation (e.g., parallel to the plane of the user interface)) to a second orientation (e.g., a tilted orientation relative to the plane of the user interface)) of the first media object in accordance with the movement of the contact. For example, as shown in
In response to detecting the input that includes the movement of the contact on the touch-sensitive surface, in accordance with a determination that the input meets media preview criteria, wherein the media preview criteria includes a criterion that is met when the input includes an increase in a characteristic intensity of the contact above a media-preview intensity threshold (e.g., a hint intensity threshold (ITH), a preview intensity threshold (ITL), or another static or dynamically determined media-preview intensity threshold) while the focus selector 3604 is over the first media object (e.g., media object 3612), the device outputs (3710) a preview of a media item. For example, in
In response to detecting the movement of the contact, the device ceases to output the preview of the media item from the first set of media items and outputs (3710) a preview of a media item from the second set of media items. For example, the movement moves the focus selector 3604 from over first media object 3612, along a path indicated by arrow 3634, to over second media object 3608, as indicated in
In accordance with a determination that the input does not meet the media preview criteria, the device moves (3710) the first media object and the second media object on the display in accordance with the movement of the contact on the touch-sensitive surface. For example, when an input includes a movement of a focus selector 3604 along a path indicated by arrow 3606 and media preview criteria are not met (e.g., the characteristic intensity of the contact does not reach an intensity threshold, such as ITL), as indicated at
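The determination in operation (3710) can be sketched as a simple router: if the media preview criteria are met while the focus selector is over a media object, movement of the contact changes which media item is previewed; otherwise the movement scrolls the media objects. The names below are illustrative assumptions.

```swift
enum MovementResponse {
    case switchPreview(toMediaObject: Int)  // preview a media item from the object now under the focus selector
    case scroll(by: Double)                 // move the media objects with the contact
}

func respond(toMovement delta: Double,
             previewCriteriaMet: Bool,
             mediaObjectUnderFocusSelector: Int?) -> MovementResponse {
    if previewCriteriaMet, let object = mediaObjectUnderFocusSelector {
        return .switchPreview(toMediaObject: object)
    }
    return .scroll(by: delta)
}
```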
In some embodiments, in response to detecting the input that includes the movement of the contact on the touch-sensitive surface, in accordance with the determination that the input meets the media preview criteria, the device maintains (3712) positions of the first media object and the second media object on the display during the movement of the contact on the touch-sensitive surface. (e.g., the first media object and the second media object are static or substantially static (e.g., do not scroll) during the movement of the contact/focus selector. For example, when movement of the contact (e.g., from a location indicated by focus selector 3604 along a path indicated by arrow 3606) occurs while or after an increase in the characteristic intensity of the contact above the media-preview intensity threshold is detected, and the preview of a media object is started in response to media preview criteria being met, the first media object and the second media object do not scroll with the movement of the contact. For example, as shown in
In some embodiments, the media preview criteria includes a criterion that is met (3714) when the increase in the characteristic intensity of the contact above the media-preview intensity threshold occurs before the focus selector 3604 has moved by more than a threshold distance. In some embodiments, the threshold distance is a distance selected based on average or maximum contact position variations found in a substantially static contact during a press input (e.g., a lateral range of less than 2 mm or 5 pixels). In some embodiments, the threshold distance is used to differentiate inadvertent movements of the contact while applying pressure to the touch-sensitive surface 112 from intentional movement/translation of the contact on the touch-sensitive surface 112. In some embodiments, the criterion associated with the threshold distance is used in addition to the criterion associated with the media preview intensity threshold when determining whether the input has met the media preview criteria.
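The distance criterion in (3714) can be sketched as follows; the names are hypothetical and the default of 5 points merely echoes the example range given above.

```swift
// The intensity criterion only counts if the increase above the media-preview
// threshold happens before the contact has drifted more than a small threshold
// distance, distinguishing a slightly wobbly stationary press from an intentional drag.
func meetsMediaPreviewCriteria(peakIntensity: Double,
                               mediaPreviewThreshold: Double,
                               distanceMovedBeforePeak: Double,
                               thresholdDistance: Double = 5.0) -> Bool {
    peakIntensity > mediaPreviewThreshold && distanceMovedBeforePeak <= thresholdDistance
}
```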
In some embodiments, in accordance with a determination that the input meets the media preview criteria, the device selects (3716) the media item from the first set of media items for outputting the preview of the media item from the first set of media items based on at least one selection criterion. For example, the selection criterion includes, e.g., most popular, trending, highest rated for the user, listed first (e.g., in an album or a playlist), etc. In some embodiments, the preview of the media item starts at the beginning of the media item. In some embodiments, the preview of the media item starts at a position other than the beginning of the media item (e.g., a preselected “interesting” portion of the media item).
In some embodiments, while outputting the preview of the media item from the first set of media items, the device visually distinguishes (3718) the first media object (e.g., media object 3612, as shown in
In some embodiments, in response to detecting the movement of the contact, the device ceases (3720) to visually distinguish the first media object from the one or more media objects of the plurality of media objects other than the first media object, while ceasing to output the preview of the media item from the first set of media items; and visually distinguishes the second media object from one or more media objects of the plurality of media objects other than the second media object, while outputting the preview of the media item from the second set of media items. For example,
In some embodiments, after the outputting of the preview of the media item from the second set of media items is started, the device ceases (3722) to output the preview of the media item from the second set of media items after a predetermined duration (e.g., until reaching the end of the media item (such as the end of a preview segment, the end of an audio track, the end of a video, etc.), until a predetermined preview playback duration has been reached, etc.). In some embodiments, the preview of the media object is completed before lift-off of the contact is detected. In some embodiments, the preview of the media object is interrupted when lift-off of the contact is detected. In some embodiments, the preview of the media object continues with a different media item selected from the set of media items, if no lift-off of the contact has been detected.
In some embodiments, while outputting the preview of the media item from one of the first set of media items or the second set of media items, the device detects (3724) a decrease in the characteristic intensity of the contact below a preview-termination intensity threshold (e.g., the contact detection intensity threshold (IT0), the hint intensity threshold (ITH), the preview intensity threshold (ITL), the media-preview intensity threshold, or another static or dynamically determined preview-termination intensity threshold). In response to detecting the decrease in the characteristic intensity of the contact below the preview-termination intensity threshold, the device ceases to output the preview of the media item from said one of the first set of media items or the second set of media items. In some embodiments, the preview ends immediately on the detected decrease in the characteristic intensity of the contact below the preview-termination threshold (e.g., the device ceases to display image/video, ends audio playback from speakers, etc.). In some embodiments, the preview is gradually faded out.
In some embodiments, the preview-termination intensity threshold (3726) is an intensity threshold that is lower than the media-preview intensity threshold. In such embodiments, preview of a media item can continue without the need to maintain the intensity of the contact above the media-preview intensity threshold all the time. For example, in
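Because the preview-termination intensity threshold (3726) is lower than the media-preview intensity threshold, the pair behaves like a hysteresis band: the preview starts only above the higher threshold but survives until intensity drops below the lower one. A minimal sketch, with hypothetical names:

```swift
struct PreviewState {
    private(set) var isPreviewing = false

    mutating func update(intensity: Double,
                         mediaPreviewThreshold: Double,
                         previewTerminationThreshold: Double) {
        if !isPreviewing, intensity > mediaPreviewThreshold {
            isPreviewing = true    // start the preview at the higher threshold
        } else if isPreviewing, intensity < previewTerminationThreshold {
            isPreviewing = false   // end the preview only below the lower threshold
        }
    }
}
```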
In some embodiments, while outputting the preview of the media item from one of the first set of media items or the second set of media items (e.g., while the focus selector 3604 is over media object 3612 as shown in
In some embodiments, the movement of the contact on the touch-sensitive surface 112 causes movement of the focus selector 3604 to a predefined region (e.g., within a threshold distance from an edge (e.g., upper edge 3640 or lower edge 3650) of the user interface displaying the plurality of media objects) of the user interface that includes the plurality of media objects, and, while the focus selector is within the predefined region of the user interface, the device moves (3730) the first media object and the second media object on the display (e.g., automatically scrolling the plurality of media objects in the user interface as the focus selector (e.g., the contact) is within the predefined region of the user interface). For example, when focus selector 3604c is within a predefined region of upper edge 3640 of the user interface, as shown in
In some embodiments, moving the first media object and the second media object on the display while the focus selector 3604 is within the predefined region of the user interface includes (3732) moving the first media object (e.g., media object 3612) and the second media object (e.g., media object 3608) while the focus selector 3604 is substantially stationary within the predefined region of the user interface (e.g., when the contact is substantially stationary on touch-sensitive surface 112).
In some embodiments, moving the first media object (e.g., media object 3612) and the second media object (e.g., media object 3608) on the display while the focus selector 3604 is within the predefined region of the user interface includes moving (3734) the first media object (3612) and the second media object (3608) at a rate in accordance with a current location of the focus selector within the predefined region of the user interface. For example, the scrolling speed is based on (e.g., directly proportional to or otherwise related to) a distance from the edge (e.g., upper edge 3640 or lower edge 3650) of the user interface rather than being dependent on the movement of the contact on the touch-sensitive surface. In some embodiments, the rate at which the media objects are scrolled on the display is determined based on a distance of the contact from the edge of the touch-sensitive surface (e.g., moving faster when the contact is near the edge of the touch-sensitive surface and moving slower when the contact is further away from the edge of the touch-sensitive surface) or a distance of a focus selector from an edge of a content region on the display that includes the media objects. In some embodiments, the rate at which the media objects are scrolled is dependent upon an intensity of the contact (e.g., scrolling faster when the intensity of the contact is higher and scrolling more slowly when the intensity of the contact is lower).
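The rate rule in (3734) can be sketched as a function of the focus selector's distance from the edge of the predefined region. The names, the linear relationship, and the default maximum rate below are illustrative assumptions; the description above also permits other relationships (including intensity-based ones).

```swift
// Returns points per second of automatic scrolling. The closer the focus
// selector is to the edge, the faster the media objects scroll; outside the
// predefined region the rate is zero.
func autoScrollRate(distanceFromEdge: Double,
                    regionHeight: Double,
                    maximumRate: Double = 600) -> Double {
    guard regionHeight > 0, distanceFromEdge >= 0, distanceFromEdge <= regionHeight else { return 0 }
    return maximumRate * (1 - distanceFromEdge / regionHeight)
}
```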
In some embodiments, moving the first media object and the second media object on the display while the focus selector 3604 is within the predefined region of the user interface includes moving (3736) the first media object (e.g., media object 3612) and the second media object (e.g., media object 3608) while outputting the preview of the media item from one of the first set of media items and the second set of media items. For example, after the preview of a media item from one of the first and second set of media items has been started in accordance with a determination that the input meets media preview criteria (e.g., a preview of a media item from media object 3608 being output as indicated by equalizer graphic 3636 in
In some embodiments, the movement of the contact on the touch-sensitive surface 112 causes movement of the focus selector 3604 from within the predefined region to a location outside of the predefined region of the user interface, and, in response to detecting that the movement of the contact has caused the movement of the focus selector from within the predefined region to a location outside of the predefined region of the user interface, the device ceases (3738) to move the first media object and the second media object on the display (e.g., the automatic scrolling of the plurality of media objects stops when the focus selector is moved out of the predefined region of the user interface). Subsequent movement of the focus selector 3604 caused by subsequent movement of the contact on the touch-sensitive surface 112 does not cause further scrolling of the media objects (e.g., media objects 3608, 3610, 3612, and 3614) on the user interface. Instead, when the focus selector 3604 is moved (through the subsequent movement of the contact) to a third media object on the user interface (e.g., media object 3642), a preview of a media item from the third media object is output, and the preview of the media item from the currently previewed media object (e.g., the first or second media object) is stopped.
In some embodiments, while outputting the preview of the media item from one of the first set of media items or the second set of media items (e.g., while the focus selector 3604 is over media object 3614 as shown in
In some embodiments, while displaying the enhanced preview of said one of the first or second media object corresponding to said one of the first or second set of media items, the device detects (3742) further movement of the contact on the touch-sensitive surface; and in response to detecting the further movement of the contact on the touch-sensitive surface 112 (e.g., movement of the contact that causes movement of the focus selector 3604 by more than a predefined distance or to a different media item in the set of media items, such as movement along the path indicated by arrow 3658 of
In some embodiments, outputting an enhanced preview (e.g., preview platter 3654) of one of the first or second media object corresponding to said one of the first or second set of media items includes displaying (3744) representations of said one of the first or second set of media items. For example, media items 3660-3672 are displayed in enhanced preview 3654 in
In some embodiments, while outputting the preview of a first respective media item from said one of the first set of media items or the second set of media items, the first respective media item is visually distinguished (3746) from one or more media items from said one of the first or second set of media items other than the first respective media item (e.g., the first respective media item is highlighted relative to other media items in the set of media items, and/or the first respective media item remains clear and visible while other media items fade away gradually over time on the preview platter). For example, in
In some embodiments, while outputting the preview of the first respective media item from said one of the first set of media items or the second set of media items, the device alters (3748) an appearance of respective representations of one or more media items from said one of the first or second set of media items other than the first respective media item. For example, while the preview of the first respective media item (e.g., media item 3666) from the set of media items for a media object (e.g., media object 3614) is being played and the enhanced preview 3654 for the media object is being displayed over the user interface, the representations of the media items in the listing of the media items are gradually faded out (e.g., as demonstrated by the representations of media items 3662, 3664, 3668, and 3670) leaving only the representation for the media item that is being previewed (e.g., media item 3666) visible/unchanged in the enhanced preview 3654 (e.g., as shown in
In some embodiments, the device detects (3750) movement of the contact that causes movement of the focus selector 3604 to a second respective media item (e.g., while the appearance of the second respective media item is unaltered (e.g., not yet faded) or while the second respective media item has already been altered (e.g., faded but not completely gone from the preview platter)) from said one of the first set of media items or the second set of media items, the second respective media item being distinct from the first respective media item; and in response to detecting the movement of the contact that causes the movement of the focus selector to the second respective media item (or, in some embodiments, in response to the focus selector moving to and remaining at the second respective media item for more than a threshold amount of time), the device alters the appearance of the second respective media item. For example, the representation of the second respective media item is highlighted, and the representation of the first respective media item is no longer highlighted, when the focus selector moves over to the second respective media item and, optionally, remains at the second respective media item for more than a threshold amount of time. If the second respective media item has already started to fade when the focus selector moves over it, the second respective media item is no longer faded, and the representation of the first respective media item is optionally faded. In some embodiments, as the focus selector traverses to the representation of the second respective media item, altering the appearance of the second respective media item optionally includes showing additional information associated with the second respective media item such as descriptions/labels, lifting the representation of the second respective media item in a virtual z direction, etc. In some embodiments, the alteration of the appearance is reversed in response to determining that the focus selector has moved away from the second respective media item.
In some embodiments, in response to detecting the movement of the contact that causes the movement of the focus selector to the second respective media item (or, in some embodiments, in response to the focus selector moving to and remaining at the second respective media item for more than a threshold amount of time), the device ceases (3752) to output the preview of the first respective media item from said one of the first set of media items or the second set of media items and the device outputs a preview of the second respective media item from said one of the first set of media items or the second set of media items. For example, when focus selector 3604 has moved to media item 3670, as indicated at 36Q, a preview of media item 3670 is output.
In some embodiments, while outputting a preview for a currently previewed media item, in accordance with a determination that the input meets media selection criteria (e.g., a characteristic intensity of a contact exceeds a “deep press” intensity threshold (ITD), or another static or dynamically determined media-selection intensity threshold), the device displays (3754) an indication that the representation of the currently previewed media item is selected. In some embodiments, the indication that the representation of the currently previewed media item is selected includes an altered appearance of the representation of the currently previewed media item, such as outline, further highlighting, bold text, etc. For example, as shown in
In some embodiments, while displaying the enhanced preview of said one of the first or second media object that corresponds to said one of the first or second set of media items: in accordance with a determination that a characteristic intensity of the contact has decreased below a respective intensity threshold (e.g., decreased below the enhanced-preview intensity threshold (e.g., ITL), such as below the enhanced-preview intensity threshold but above the media-preview intensity threshold (e.g., ITH)), the device maintains (3756) display of the enhanced preview 3654 of said one of the first or second media object that corresponds to said one of the first or second set of media items. In some embodiments, maintaining display of the enhanced preview of the currently previewed media item/media object enables a user to more easily scroll through the media item representations (and, optionally, scroll through the list of media items upon moving the focus selector to an edge of the set of media item representations, similar to the way that the media objects scroll (e.g., as discussed with regard to
In some embodiments, while displaying the enhanced preview (e.g., preview platter 3654) of said one of the first or second media object that corresponds to said one of the first or second set of media items, in accordance with a determination that lift-off of the contact has been detected, the device maintains (3758) display of the enhanced preview 3654 of said one of the first or second media object that corresponds to said one of the first or second set of media items. In some embodiments, maintaining display of the enhanced preview of the currently previewed media item/media object on liftoff of the contact enables a user to provide further input related to one or more media items, e.g., the user is enabled to select a media item representation (such as by tapping on the media item representation).
In some embodiments, while displaying the enhanced preview (e.g., preview platter 3654) of said one of the first or second media object that corresponds to said one of the first or second set of media items, in accordance with a determination that lift-off of the contact has been detected, the device ceases (3760) to display the enhanced preview (e.g., preview platter 3654) of said one of the first or second media object that corresponds to said one of the first or second set of media items.
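For illustration only, the following Swift sketch condenses the intensity ladder described above (media-preview, enhanced-preview, and media-selection thresholds) together with the two alternative lift-off behaviors of the preceding paragraphs; the type names and normalized threshold values are hypothetical and are not taken from this description.

    // Hypothetical sketch of the intensity-threshold ladder for media previews.
    enum MediaPreviewState {
        case none            // below the media-preview threshold (ITH-like)
        case preview         // a preview of a media item is being output
        case enhancedPreview // the preview platter with the set of media items is shown
        case selected        // the previewed item's representation is marked as selected
    }

    struct IntensityThresholds {
        var mediaPreview: Double = 0.25     // ITH-like hint threshold (normalized 0...1)
        var enhancedPreview: Double = 0.5   // ITL-like light-press threshold
        var selection: Double = 0.8         // ITD-like deep-press threshold
    }

    func previewState(forCharacteristicIntensity intensity: Double,
                      thresholds: IntensityThresholds = IntensityThresholds()) -> MediaPreviewState {
        switch intensity {
        case ..<thresholds.mediaPreview:    return .none
        case ..<thresholds.enhancedPreview: return .preview
        case ..<thresholds.selection:       return .enhancedPreview
        default:                            return .selected
        }
    }

    // The alternative embodiments above differ on lift-off: one maintains the
    // enhanced preview so a later tap can select an item, the other dismisses it.
    func stateAfterLiftOff(current: MediaPreviewState,
                           maintainEnhancedPreviewOnLiftOff: Bool) -> MediaPreviewState {
        guard current == .enhancedPreview || current == .selected else { return .none }
        return maintainEnhancedPreviewOnLiftOff ? current : .none
    }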
It should be understood that the particular order in which the operations in
In accordance with some embodiments,
As shown in
The processing unit 3808 is configured to enable display, on display unit 3802, of a user interface that includes a plurality of media objects that include a first media object that represents a first set of one or more media items and a second media object that represents a second set of one or more media items, wherein the first set of media items is different from the second set of media items. The processing unit 3808 is configured to, while a focus selector is over the first media object, detect an input that includes movement of a contact on the touch-sensitive surface 3804. The processing unit 3808 is configured to: in response to detecting the input that includes the movement of the contact on the touch-sensitive surface: in accordance with a determination that the input meets media preview criteria, wherein the media preview criteria include a criterion that is met when the input includes an increase in a characteristic intensity of the contact above a media-preview intensity threshold while the focus selector is over the first media object, output (e.g., with the outputting unit 3810) a preview of a media item from the first set of media items and, in response to detecting the movement of the contact, cease (e.g., with the ceasing unit 3812) to output the preview of the media item from the first set of media items and output (e.g., with the outputting unit 3810) a preview of a media item from the second set of media items; and, in accordance with a determination that the input does not meet the media preview criteria, move (e.g., with the moving unit 3810) the first media object and the second media object on the display in accordance with the movement of the contact on the touch-sensitive surface.
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in information processing apparatus such as general purpose processors (e.g., as described above with respect to
The operations described above with reference to
Many electronic devices have graphical user interfaces that display paginated content, such as pages of a book displayed in a reader application. With existing methods, tapping or swiping input is used to sequentially access the pages before and after a currently displayed page. In some embodiments described below, when an input meets respective content-navigation criteria (e.g., when a press input received at the edge of a page exceeds a threshold intensity level), an indication of a quantity of later pages or an indication of a quantity of prior pages is displayed. In some embodiments, when the input meets other content-navigation criteria (e.g., when the press input ends with a focus selector on a particular page in the prior or later pages, or when the press input exceeds a second threshold intensity level), the device jumps ahead or backward to a page that is in the later or prior pages or to a page in a later or prior section. Providing a user with the ability to provide input with or without an intensity component allows additional functionality to be associated with the input, thereby improving efficiency and ease of content navigation.
Below,
In some embodiments, the device is an electronic device with a separate display (e.g., display 450) and a separate touch-sensitive surface (e.g., touch-sensitive surface 451). In some embodiments, the device is portable multifunction device 100, the display is touch-sensitive display system 112, and the touch-sensitive surface includes tactile output generators 167 on the display (
In some embodiments, an existing bookmark 3922 is displayed (e.g., at the location of the bookmarked page 3918) when edges of pages 3912-3920 are revealed (e.g., in accordance with a determination that the characteristic intensity of the contact at the location within region 3908 indicated by focus selector 3904 exceeded the respective threshold intensity (e.g., ITL), as shown in
In user interface 3930, a portion (e.g., page 3910) of a section (e.g., Chapter 1) of paginated content is shown. A contact with touch screen 112 of portable multifunction device 100 is detected at a location within region 3908 indicated by focus selector 3904. As indicated by intensity meter 3902 shown adjacent to user interface 3930, the characteristic intensity of the contact is below threshold intensity ITL.
In accordance with a determination that the characteristic intensity of the contact at the location indicated by focus selector 3904 exceeded a threshold intensity ITL (as shown at intensity meter 3902 adjacent to user interface 3932), edge portions of pages 3912-3920 are revealed, as shown in user interface 3932.
In some embodiments, more (or less) of the edge portions of pages 3912-3920 are dynamically revealed as the characteristic intensity of the contact at the location indicated by focus selector 3904 increases (or decreases). In accordance with a determination that the characteristic intensity of the contact at the location within region 3908 indicated by focus selector 3904 continued to increase beyond intensity threshold ITL (without reaching intensity threshold ITD), as shown at intensity meter 3902 adjacent to user interface 3934, the size of the revealed edges of pages 3912-3920 increases (e.g., to a predetermined size), as shown in user interface 3934.
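For illustration only, a short Swift sketch of one way the size of the revealed page-edge portions could track the contact intensity between a light-press threshold and a deep-press threshold; the names and normalized values are hypothetical.

    // Hypothetical sketch: revealed page-edge size as a function of intensity.
    func revealedEdgeWidth(characteristicIntensity: Double,
                           lightPressThreshold: Double = 0.5,   // ITL-like
                           deepPressThreshold: Double = 0.8,    // ITD-like
                           maximumEdgeWidth: Double = 60.0) -> Double {
        guard characteristicIntensity >= lightPressThreshold else { return 0 }
        // Grow the revealed edges as intensity rises past the light-press
        // threshold, capping at a predetermined size before the deep press.
        let span = deepPressThreshold - lightPressThreshold
        let progress = min((characteristicIntensity - lightPressThreshold) / span, 1.0)
        return maximumEdgeWidth * progress
    }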
In accordance with a determination that the characteristic intensity of the contact at the location within region 3908 indicated by focus selector 3904 exceeded a threshold intensity ITD, as shown at intensity meter 3902 adjacent to user interface 3936, the display of pages 3910-3920 is replaced with beginning page 3920 of Chapter 2, as shown at user interface 3936. In some embodiments, beginning page 3920 continues to be shown when the characteristic intensity of the contact decreases below ITD (e.g., below IT0 upon lift-off of the contact).
In some embodiments, beginning page 3920 as shown in user interface 3936 is displayed in accordance with a determination that the characteristic intensity of the contact at the location within region 3908 indicated by focus selector 3904 (as shown in user interface 3934) fell below a respective threshold intensity (e.g., ITL) followed, within a predetermined time, by an increase in the characteristic intensity to a level above the respective threshold intensity (e.g., ITL).
User interface 3940 illustrates revealed page edges of a sequence of pages 3912-3920 that follow page 3910. For example, edges of pages 3912-3920 are revealed in accordance with a determination that the characteristic intensity of the contact at the location within region 3908 indicated by focus selector 3904 exceeded a respective threshold intensity ITL, as shown at intensity meter 3902 adjacent to user interface 3940.
When portable multifunction device 100 detects a movement of focus selector 3904 (in accordance with movement of the contact) (e.g., in a direction indicated by arrow 3948), as shown in user interfaces 3940-3946, edges of different pages from pages 3912-3920 are selectively enhanced (e.g., enlarged) to show more content on the enhanced pages. In various embodiments, the intensity of the contact is maintained (e.g., above ITL) or reduced (e.g., below ITL, as indicated at intensity meter 3902 adjacent to user interfaces 3942-3946) as the movement of focus selector 3904 occurs.
User interface 3942 illustrates that, as focus selector 3904 moves toward the edge of page 3918 (e.g., by a respective threshold distance), page 3918 is shifted toward focus selector 3904, while other pages on the user interface remain stationary. As a result, more of page 3918 becomes visible on the user interface (e.g., more content of page 3918 is shown on the user interface) (e.g., as shown in user interfaces 3944 and 3946). As movement of the focus selector (in accordance with movement of the contact) continues (e.g., in the direction indicated by arrow 3948), enhancement of the page immediately preceding page 3918 (e.g., page 3916) is triggered (not shown in
In some embodiments, analogous behaviors can be implemented when the focus selector is initially detected on the left edge of a currently displayed page. After a sequence of prior pages preceding the currently displayed page is presented in response to an increase in intensity of the contact, movement of the focus selector (in accordance with movement of the contact) toward the right causes edges of the prior pages to be enhanced (e.g., to be moved leftward toward the contact) one page at a time, such that the user can get a better glimpse of the content of the prior page one page at a time while the edge of the page is enhanced.
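For illustration only, a Swift sketch of one way to decide which revealed page edge is enhanced (shifted toward the focus selector so that more of its content becomes visible) as the focus selector moves across the revealed edges; the types, names, and distances are hypothetical.

    // Hypothetical sketch: pick the revealed page edge nearest the focus selector.
    struct RevealedPageEdge {
        var pageNumber: Int
        var leadingX: Double   // horizontal position of the revealed edge, in points
    }

    func pageToEnhance(focusSelectorX: Double,
                       revealedEdges: [RevealedPageEdge],
                       activationDistance: Double = 20.0) -> Int? {
        // Enhance the page whose revealed edge the focus selector is nearest to,
        // once the selector has come within the activation distance of that edge.
        let best = revealedEdges
            .map { (edge: $0, distance: abs($0.leadingX - focusSelectorX)) }
            .filter { $0.distance <= activationDistance }
            .min { $0.distance < $1.distance }
        return best?.edge.pageNumber
    }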
User interface 3950 illustrates revealed page edges of a sequence of later pages 3912-3920 that follow a page 3910. For example, edges of pages 3912-3920 are revealed in accordance with a determination that the characteristic intensity of the contact at the location within region 3908 indicated by focus selector 3904 exceeded a threshold intensity ITL, as shown at intensity meter 3902 adjacent to user interface 3950.
Portable multifunction device 100 detects a movement of focus selector 3904 (e.g., in a direction indicated by arrow 3958), as shown in user interface 3950. User interfaces 3952 and 3954 illustrate that page 3918 is being dynamically enhanced (e.g., the exposed portion of the page is increased) as focus selector 3904 moves toward the edge of page 3918. User interface 3954 illustrates that page 3916 moves toward focus selector 3904 and eventually reaches a location under focus selector 3904. While focus selector 3904 is over the edge of page 3916, as shown in user interface 3954, lift-off of the contact from touch screen 112 occurs, as indicated by intensity meter 3902 adjacent to user interface 3956. In response to lift-off of the contact from touch screen 112 while focus selector 3904 is over the edge of page 3916, the user interface ceases to display page 3910 and edge portions of pages 3912-3920, and the user interface displays page 3916, as shown in user interface 3956.
In some embodiments, while the device is displaying page x of section y of paginated content, the input is received (e.g., the contact is detected, and the characteristic intensity of the contact I>I0).
(A) If lift-off of the contact is detected before the characteristic intensity of the contact ever increased above a first intensity threshold I1 (e.g., I<I1, before lift-off), the device ceases to display the currently displayed page (e.g., page x), and displays the next page (e.g., page x+1) (or the previous page (e.g., x−1), e.g., depending on whether the location of the contact is on the right edge of the currently displayed page, or the left edge of the currently displayed page) in the user interface. This is illustrated in
(B) Alternatively, if lift-off of the contact is not yet detected, and the characteristic intensity of the contact increases above the first intensity threshold I1 (e.g., I>I1, before lift-off), a sequence of later pages (or a sequence of prior pages, e.g., depending on whether the location of the contact is on the right edge of the currently displayed page, or the left edge of the currently displayed page) in the current section (e.g., section y) is presented in the user interface. In some embodiments, the edges of the sequence of later pages (or the sequence of prior pages) are spread out dynamically (e.g., spread out by a larger or smaller amount) in accordance with the current characteristic intensity of the contact above I1. This is illustrated in
(C) If lift-off of the contact is detected after reaching I1, but before it reaches above a second intensity threshold I2 (e.g., I<I2, before lift-off), the device ceases to display the edges of the sequence of later pages (or the sequence of prior pages), and restores the display of page x in the user interface, upon lift-off of the contact.
(D) Alternatively, if lift-off of the contact is not yet detected, and the characteristic intensity of the contact increases above the second intensity threshold I2 (e.g., I>I2, before lift-off), a stable preview of the sequence of later pages (or the sequence of prior pages) is displayed (and, optionally, content of a respective one of the sequence of later pages or prior pages is enlarged for the user to preview). In addition, the stable preview optionally shows a preview of the content of the first page of the next (or previous) section (e.g., page 3920 in
(E) If lift-off of the contact is not yet detected, and the characteristic intensity of the contact increases above a third intensity threshold I3 (e.g., I>I3, before lift-off) while the contact is substantially stationary, the stable preview of the sequence of later pages (or the sequence of prior pages) is removed, and the device displays the first page of the next section (e.g., section y+1) (or the first page of the previous section (e.g., section y−1)) in the user interface. In other words, the device “pops” into the next section (or the previous section), skipping the pages in between. This is illustrated in
(F) If lift-off of the contact is not yet detected, and movement of the contact is detected, the device scans through the sequence of the later pages (or the sequence of prior pages) to present more content of each of the pages in accordance with the movement of the contact. This is illustrated in
(G) If lift-off is detected while the contact (focus selector) is over a respective page in the sequence of later pages (or the sequence of prior pages) during the scanning of the pages in (F), the device ceases to display the stable preview of the sequence of later pages (or the sequence of prior pages), and displays the page that is currently under the contact (focus selector) in the user interface. In other words, the device “pops” into the selected page in the current section, upon lift-off of the contact. This is illustrated in
(H) If lift-off is detected before the characteristic intensity of the contact ever increased above the third intensity threshold I3 (e.g., I<I3, before lift-off), the device maintains the stable preview of the sequence of later pages (or the sequence of prior pages) in the user interface, upon lift-off of the contact. When a subsequent input is detected, if the subsequent input is a selection input (e.g., a tap input) on one of the pages depicted in the preview, the device ceases to display the preview and displays the selected page in the user interface; if the subsequent input is a dismissal input (e.g., a swipe input or a tap input outside of the preview), the preview is removed, and the device restores the originally displayed page x in the user interface.
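For illustration only, the process flow in items (A)-(H) above can be condensed into a single decision routine. The Swift sketch below uses hypothetical normalized values for I1, I2, and I3, and simplifies the scanning of item (F) to an optional "page under the focus selector".

    // Hypothetical, simplified condensation of the flow in items (A)-(H).
    enum PageNavigationOutcome {
        case showAdjacentPage         // (A) released below I1: go to the next/previous page
        case spreadPageEdges          // (B) pressed past I1: edges of later/prior pages spread out
        case restoreOriginalPage      // (C) released after I1 but before I2
        case showStablePreview        // (D) pressed past I2: stable preview of the page edges
        case jumpToAdjacentSection    // (E) pressed past I3 while stationary: pop into the next/prior section
        case jumpToPage(Int)          // (G) released over a specific page while scanning
        case keepStablePreview        // (H) released after I2 but before I3: preview persists
        case awaitingFurtherInput     // contact still down, below I1: nothing committed yet
    }

    func outcome(maximumIntensity: Double,
                 liftedOff: Bool,
                 pageUnderFocusSelector: Int?,
                 i1: Double = 0.3, i2: Double = 0.6, i3: Double = 0.9) -> PageNavigationOutcome {
        if liftedOff {
            if maximumIntensity < i1 { return .showAdjacentPage }                // (A)
            if maximumIntensity < i2 { return .restoreOriginalPage }             // (C)
            if maximumIntensity >= i3 { return .jumpToAdjacentSection }          // (E), committed before lift-off
            if let page = pageUnderFocusSelector { return .jumpToPage(page) }    // (G)
            return .keepStablePreview                                            // (H)
        }
        if maximumIntensity >= i3 { return .jumpToAdjacentSection }              // (E)
        if maximumIntensity >= i2 { return .showStablePreview }                  // (D)
        if maximumIntensity >= i1 { return .spreadPageEdges }                    // (B)
        return .awaitingFurtherInput
    }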
It should be noted that the process flow in
As described below, the method 4000 provides an intuitive way to improve efficiency and ease of navigating paginated content. The method reduces the cognitive burden on a user when navigating paginated content, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to locate and navigate to desired portions of paginated content faster and more efficiently conserves power and increases the time between battery charges.
The device displays (4002), on the display, a first portion of paginated content (e.g., a currently displayed page or pages, such as one page in a single page mode (e.g., page 3910 in
While a focus selector is within a first predefined region (e.g., region 3908 in
In response to detecting the first portion of the input: in accordance with a determination that the first portion of the input meets first content-navigation criteria, where the first content-navigation criteria include a criterion that is met when the device detects a lift-off of the contact from the touch-sensitive surface before a characteristic intensity of the contact reaches a first threshold intensity (e.g., a tap or swipe gesture that does not reach a light press threshold intensity before lift-off of the contact in the tap or swipe gesture occurs), the device replaces (4006) the displayed first portion of the paginated content with a second portion of the paginated content (e.g., page 3912 in
In some embodiments, the device determines (4008) whether to display the indication of the quantity of pages within the sequence of later pages in the first section or to display the indication of the quantity of pages within the sequence of prior pages in the first section based on a location of the focus selector during the first portion of the input. For example, when a user presses above a light press threshold on the left edge of the displayed page, edges of the set of prior pages in the current chapter are revealed from behind the currently displayed page (e.g., as shown in
In some embodiments, displaying the indication of the quantity of pages within the sequence of later pages in the first section of the paginated content includes (4010) concurrently displaying, in the user interface, a respective edge portion for a plurality of respective pages in the sequence of later pages (e.g., as shown in
In some embodiments, in accordance with the determination that the first portion of the input meets the second content-navigation criteria, the device dynamically varies (4012) sizes of the respective edge portions of the sequence of later pages that are displayed in the user interface in accordance with a current intensity of the contact. For example, when the characteristic intensity of the contact varies between ITH and ITL, the sizes of the edge portions of the sequence of later pages shown in
In some embodiments, in accordance with the determination that the first portion of the input meets the second content-navigation criteria, the device sequentially displays (4014) respective edge portions of the sequence of later pages in accordance with a current intensity of the contact. For example, as the intensity of the contact increases, the edge portions of additional pages between the current page and the end of the chapter are displayed. In some embodiments, displaying the indication of the quantity of pages between the current page and the end of the document includes sequentially displaying the appearance of a number of page edges that corresponds to the number of pages between the current page and the end of the current chapter.
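For illustration only, a Swift sketch of sequentially revealing a number of page-edge portions that grows with the contact intensity, up to the number of pages remaining in the current chapter; names and normalized values are hypothetical.

    // Hypothetical sketch: how many page edges to reveal at a given intensity.
    func revealedEdgeCount(characteristicIntensity: Double,
                           lightPressThreshold: Double = 0.5,
                           deepPressThreshold: Double = 0.8,
                           pagesRemainingInChapter: Int) -> Int {
        guard characteristicIntensity > lightPressThreshold, pagesRemainingInChapter > 0 else { return 0 }
        let span = deepPressThreshold - lightPressThreshold
        let progress = min((characteristicIntensity - lightPressThreshold) / span, 1.0)
        // At least one edge appears once the threshold is crossed; all remaining
        // pages are indicated as the intensity approaches the deep-press level.
        return max(1, Int((Double(pagesRemainingInChapter) * progress).rounded()))
    }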
In some embodiments, in accordance with the determination that the first portion of the input meets the second content-navigation criteria, the device dynamically shifts (4016) the displayed first portion of the paginated content in the user interface to make room for the displayed respective edge portions of the sequence of later pages. Similarly, in some embodiments, in accordance with the determination that the first portion of the input meets the second content-navigation criteria, the device dynamically shifts the displayed first portion of the paginated content in the user interface to make room for the displayed respective edge portions of the sequence of prior pages. For example, as shown in
In some embodiments, while displaying the indication of the quantity of pages within the sequence of later pages in the first section or the indication of the quantity of pages within the sequence of prior pages in the first section and at least some of the first portion of the paginated content, the device detects (4018) a second portion of the input. In accordance with a determination that the second portion of the input meets third content-navigation criteria, the device replaces display of the indication of the quantity of pages within the sequence of later pages in the first section or the indication of the quantity of pages within the sequence of prior pages in the first section and the at least some of the first portion of the paginated content with display of a third portion of the paginated content, where the third portion of the paginated content includes a beginning page of a second section that is sequentially adjacent to (e.g., immediately follows or immediately precedes) the first section (e.g., as shown in
In some embodiments, the third content-navigation criteria include (4020) a criterion that is met when the device detects an increase in the characteristic intensity of the contact above a second intensity threshold (e.g., a deep press threshold) that is higher than the first intensity threshold (e.g., the light press threshold). In some embodiments, the third content-navigation criteria require detecting the increase in the characteristic intensity of the contact above the second intensity threshold while the focus selector is within the first predefined region of the displayed first portion of the paginated content on the display. In some embodiments, a swipe gesture with a characteristic intensity below an intensity threshold (e.g., below a deep press threshold) navigates through the content one page at a time, whereas a swipe gesture with a characteristic intensity above an intensity threshold (e.g., above a deep press threshold) navigates through the content by more than one page at a time (e.g., by one chapter or section at a time).
In some embodiments, the third content-navigation criteria include (4022) a criterion that is met when the device detects a decrease in the characteristic intensity of the contact below the first intensity threshold (e.g., the light press threshold) followed, within a predetermined time, by an increase in the characteristic intensity of the contact to a third intensity threshold that is above the first intensity threshold. For example, in some embodiments, after a light press displays the indication of the quantity of pages within the sequence of later pages in the first section or the indication of the quantity of pages within the sequence of prior pages in the first section (e.g., edges of later pages or edges of prior pages, respectively) and at least some of the first portion of the paginated content, a reduction in intensity followed, within a predetermined time, by an increase in intensity to a third intensity threshold results in display of the first page of the next chapter (e.g., if the focus selector is on the right edge of the displayed page) or results in display of the first page of the previous chapter (e.g., if the focus selector is on the left edge of the displayed page). In some embodiments, the third intensity threshold is below the second intensity threshold. In some embodiments, the third intensity threshold is the same as the second intensity threshold. In some embodiments, the third content-navigation criteria require detecting an increase in the characteristic intensity of the contact at or above the third intensity threshold while the focus selector is within the first predefined region of the displayed first portion of the paginated content on the display. In some embodiments, the criterion based on the second intensity threshold and the criterion based on the third intensity threshold are alternative criteria, and an input meeting either one of the two criteria is sufficient to meet the third content-navigation criteria.
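For illustration only, a Swift sketch of detecting the release-and-repress pattern described above: a dip below the first intensity threshold followed, within a predetermined time, by an increase to the third intensity threshold. The sample type, names, and values are hypothetical.

    // Hypothetical sketch: detect "reduce below the first threshold, then rise
    // to the third threshold within a predetermined time" over intensity samples.
    struct IntensitySample {
        var time: Double       // seconds since the start of the input
        var intensity: Double  // normalized characteristic intensity
    }

    func meetsRepressCriterion(samples: [IntensitySample],
                               firstThreshold: Double = 0.5,
                               thirdThreshold: Double = 0.7,
                               predeterminedTime: Double = 0.5) -> Bool {
        var timeOfLastDip: Double? = nil
        for sample in samples {
            if sample.intensity < firstThreshold {
                timeOfLastDip = sample.time
            } else if let dip = timeOfLastDip,
                      sample.intensity >= thirdThreshold,
                      sample.time - dip <= predeterminedTime {
                return true
            }
        }
        return false
    }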
In some embodiments, while displaying the indication of the quantity of pages within the sequence of later pages in the first section or the indication of the quantity of pages within the sequence of prior pages in the first section and at least some of the first portion of the paginated content, the device detects (4024) a second portion of the input. In accordance with a determination that the second portion of the input meets fourth content-navigation criteria, where the fourth content-navigation criteria include a criterion that is met when the device detects a decrease in the characteristic intensity of the contact below the first intensity threshold followed by a lift off of the contact: the device ceases to display the indication of the quantity of pages within the sequence of later pages in the first section or ceases to display the indication of the quantity of pages within the sequence of prior pages in the first section, and restores the display of the first portion of the paginated content in the user interface on the display to its appearance just prior to detecting the first portion of the input. In some embodiments, the fourth content-navigation criteria require detecting the decrease in the characteristic intensity of the contact below the first intensity threshold followed by a lift off of the contact while the focus selector is within the first predefined region of the displayed first portion of the paginated content on the display.
In some embodiments, while displaying respective edge portions of later pages that indicate the quantity of pages within the sequence of later pages in the first section or respective edge portions of prior pages that indicate the quantity of pages within the sequence of prior pages in the first section and at least some of the first portion of the paginated content, the device detects (4026) a second portion of the input. In accordance with a determination that the second portion of the input meets fifth content-navigation criteria, where the fifth content-navigation criteria include a criterion that is met when the device detects a movement of the focus selector on the display, the device dynamically enhances (e.g., by magnifying, enlarging, highlighting, lifting up, or otherwise visually distinguishing) a respective edge portion. This is illustrated in
In some embodiments, dynamically enhancing the respective edge portion occurs (4028) while the focus selector is over the respective edge portion. For example, as the focus selector moves over displayed edge portions of each of the later pages, the displayed edge portion of that later page is enlarged to show more of its content or its content is shown more prominently as compared to the other later pages in the current chapter. In some embodiments, dynamically enhancing a given edge portion requires detecting an increase in intensity of the contact in the second portion of the input (e.g., detecting a light press input) while the focus selector is over the given edge portion.
In some embodiments, when the focus selector moves by a predetermined amount, the dynamically enhanced respective edge portion is (4030) moved to under the focus selector. In some embodiments, an animation is shown to move the respective edge portion to under the focus selector (e.g., the finger contact). This is illustrated in
In some embodiments, after detecting the second portion of the input, the device detects (4032) a third portion of the input while the focus selector is on an edge portion of a second page in the first section. In accordance with a determination that the third portion of the input meets sixth content-navigation criteria: the device ceases (4032) to display the respective edge portions and the first portion of the paginated content and displays a third portion of the paginated content on the display, where the third portion of the paginated content includes the second page in the first section. This is illustrated in
In some embodiments, the sixth content-navigation criteria include (4034) a criterion that is met when the device detects an increase in the characteristic intensity of the contact above the second intensity threshold (e.g., the deep press threshold) (during the third portion of the input, while the focus selector is on the edge portion of the second page in the first section).
In some embodiments, the sixth content-navigation criteria include (4036) a criterion that is met when the device detects a decrease in the characteristic intensity of the contact below the first intensity threshold followed, within a predetermined time, by an increase in the characteristic intensity to a third intensity threshold that is above the first intensity threshold (during the third portion of the input, while the focus selector is on the edge portion of the second page in the first section). In some embodiments, the criterion based on the second intensity threshold and the criterion based on the third intensity threshold are alternative criteria, and an input meeting either one of the two criteria is sufficient to meet the sixth content-navigation criteria.
In some embodiments, the sixth content-navigation criteria include (4038) a criterion that is met when the device detects a lift off of the contact in the input from the touch-sensitive surface (during the third portion of the input, while the focus selector is on the edge portion of the second page in the first section). This is illustrated in
It should be understood that the particular order in which the operations in
In accordance with some embodiments,
As shown in
In some embodiments, the processing unit 4108 is configured to: enable display (e.g., with the display enabling unit 4110), on the display unit, of a first portion of paginated content in a user interface, where: the paginated content includes a plurality of sections; a respective section in the plurality of sections includes a respective plurality of pages; the first portion of the paginated content is part of a first section of the plurality of sections; and the first portion of the paginated content lies between a sequence of prior pages in the first section and a sequence of later pages in the first section; while a focus selector is within a first predefined region of the displayed first portion of the paginated content on the display, detect (e.g., with detecting unit 4112) a first portion of an input, where detecting the first portion of the input includes detecting a contact on the touch-sensitive surface; in response to detecting the first portion of the input: in accordance with a determination (e.g., with determining unit 4114) that the first portion of the input meets first content-navigation criteria, wherein the first content-navigation criteria include a criterion that is met when the device detects a lift-off of the contact from the touch-sensitive surface before a characteristic intensity of the contact reaches a first threshold intensity, replace the displayed first portion of the paginated content with a second portion of the paginated content on the display, wherein the second portion of the paginated content includes a page that is sequentially adjacent to the first portion of the paginated content; and, in accordance with a determination (e.g., with determining unit 4114) that the first portion of the input meets second content-navigation criteria, wherein the second content-navigation criteria include a criterion that is met when the device detects an increase in the characteristic intensity of the contact above the first intensity threshold while the focus selector is within the first predefined region of the displayed first portion of the paginated content, enable display (e.g., with display enabling unit 4110) of an indication of a quantity of pages within the sequence of later pages in the first section or enable display (e.g., with display enabling unit 4110) of an indication of a quantity of pages within the sequence of prior pages in the first section.
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in information processing apparatus such as general purpose processors (e.g., as described above with respect to
Many electronic devices have graphical user interfaces that display a map at various zoom levels. For example, a map view including multiple points of interest can be displayed and the zoom level of the map can be increased to show contextual information for a particular point of interest. As noted above, there is a need for electronic devices with improved methods and interfaces for displaying contextual information associated with a point of interest in a map. In the embodiments described below, a map is zoomed to show contextual information for a point of interest in response to input including an intensity component. The map view is maintained at the zoomed level or redisplayed at a previous zoom level depending on whether the input intensity reaches a threshold intensity level. The approach described in the embodiments below allows a user to display a map at a desired zoom level using input with an intensity component. Giving a user the ability to provide input with or without an intensity component allows additional functionality to be associated with the input.
Below,
In some embodiments, the device is an electronic device with a separate display (e.g., display 450) and a separate touch-sensitive surface (e.g., touch-sensitive surface 451). In some embodiments, the device is portable multifunction device 100, the display is touch-sensitive display system 112, and the touch-sensitive surface includes tactile output generators 167 on the display (
A contact is detected at touch screen 112 at a location indicated by focus selector 4204. Focus selector 4204 is at the location of point of interest 4212, corresponding to an Apple Store in San Francisco. A characteristic intensity of the contact is indicated by intensity meter 4202. In the illustrative example of 42A, the intensity of the contact is between a threshold intensity level IT0 and a threshold intensity level ITH (e.g., a “hint” intensity threshold). The intensity of the contact is below a threshold intensity level ITL (e.g., a “light press” intensity threshold) and below a threshold intensity level ITD (e.g., a “deep press” intensity threshold).
In user interface 4280, map pins representing points of interest 4212 and 4214 are displayed and a contact is received at a location indicated by focus selector 4204. Because focus selector 4204 is closer to point of interest 4212 than point of interest 4214, in user interface 4282, the view of the map 4206 is zoomed to display contextual information near point of interest 4212. In some embodiments, the view of the map 4206 is positioned in user interface 4282 such that point of interest 4212 is located at the position of focus selector 4204. In some embodiments, the zoom from the view of the map 4206 shown in user interface 4280 to the view of the map 4206 shown in user interface 4282 occurs in accordance with a determination that a characteristic intensity of the contact exceeds a threshold intensity level, such as a preview intensity threshold (e.g., ITL, as shown at intensity meter 4202 adjacent to user interface 4282) or another intensity threshold as described herein.
In user interface 4284, map pins representing points of interest 4212 and 4214 are displayed and a contact is received at a location indicated by focus selector 4204. Because focus selector 4204 is closer to point of interest 4214 than point of interest 4212, in user interface 4286, the view of the map 4206 is zoomed to display contextual information near point of interest 4214. In some embodiments, the view of the map 4206 is positioned in user interface 4286 such that point of interest 4214 is located at the position of focus selector 4204. In some embodiments, the zoom from the view of the map 4206 shown in user interface 4284 to the view of the map 4206 shown in user interface 4286 occurs in accordance with a determination that a characteristic intensity of the contact exceeds a threshold intensity level, such as a preview intensity threshold (e.g., ITL, as shown at intensity meter 4202 adjacent to user interface 4286) or another intensity threshold as described herein.
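For illustration only, a Swift sketch of selecting the point of interest nearest the focus selector (within a maximum distance) as the point around which the map is zoomed, as in the two examples above; the types, names, and distances are hypothetical.

    // Hypothetical sketch: zoom around the point of interest closest to the press.
    struct MapPoint { var x: Double; var y: Double }

    struct PointOfInterest {
        var name: String
        var location: MapPoint
    }

    func pointOfInterestToZoom(around focusSelector: MapPoint,
                               candidates: [PointOfInterest],
                               maximumDistance: Double = 44.0) -> PointOfInterest? {
        func distance(_ a: MapPoint, _ b: MapPoint) -> Double {
            let dx = a.x - b.x, dy = a.y - b.y
            return (dx * dx + dy * dy).squareRoot()
        }
        return candidates
            .filter { distance($0.location, focusSelector) <= maximumDistance }
            .min { distance($0.location, focusSelector) < distance($1.location, focusSelector) }
    }

    // Example: a press lands nearer one pin than the other, so the map zooms
    // around the nearer point of interest.
    let nearest = pointOfInterestToZoom(
        around: MapPoint(x: 120, y: 300),
        candidates: [PointOfInterest(name: "first point of interest", location: MapPoint(x: 125, y: 310)),
                     PointOfInterest(name: "second point of interest", location: MapPoint(x: 400, y: 90))])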
At 42M, a user interface displays a view of map 4206 that includes a plurality of points of interest 4208-4220. A contact is detected at touch screen 112 at a location indicated by focus selector 4204, which is positioned at point of interest 4212. The contact is a tap input. As a result of the received tap input, a different user interface from the interface of 42M is displayed, as indicated in
In some embodiments, the user interface of
As described below, the method 4300 provides an intuitive way to zoom a map. The method reduces the cognitive burden on a user when zooming a map around a point of interest, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to zoom a map faster and more efficiently conserves power and increases the time between battery charges.
The device displays (4302), in a first user interface on the display (e.g. touch screen 112), a view of a map that includes a plurality of points of interest (e.g., the points of interest are represented in the map by corresponding markers or icons (e.g., pins, avatars of users, logos of business entities, etc.) at their respective locations in the map). For example,
While displaying the view of the map that includes the plurality of points of interest (e.g., as shown in
In some embodiments, the respective point of interest (e.g., 4212 in
In some embodiments, the respective point of interest is a dynamic (e.g., mobile) point of interest (4308). In some embodiments, the respective point of interest is a location-sharing user (e.g., a person who has made the location of his/her portable device available to the electronic device, e.g., via a location-sharing application), a location-sharing device (e.g., a lost device with a homing function enabled to contact the electronic device with its own location, a peripheral device (e.g., a drone), or other devices that communicate with and report their locations to the electronic device, etc.).
In some embodiments, while displaying the view of the map that includes the plurality of points of interest (e.g., as shown in
In some embodiments, modifying the appearance of the respective point of interest includes displaying (4312) an animated transition from a first appearance of the respective point of interest to a second appearance of the respective point of interest (e.g., an animated transition between the respective point of interest 4212 as shown in
In some embodiments, displaying the animated transition from the first appearance of the respective point of interest to the second appearance of the respective point of interest includes dynamically displaying (4314) (and, optionally, generating) a series of intermediate appearances of the respective point of interest in accordance with a current intensity of the contact while the intensity of the contact varies between the hint intensity threshold (e.g., ITH) and the preview intensity threshold (e.g., ITL). For example, the size of the pin representing the respective point of interest is directly manipulated (e.g., increased and decreased) by changing the contact intensity between the hint intensity threshold and the preview intensity threshold.
In response to detecting the increase in the characteristic intensity of the contact above the preview intensity threshold (e.g., above ITL as indicated at intensity meter 4202 of
In some embodiments, zooming the map to display the contextual information near the respective point of interest includes displaying (4318) an animated transition from a first zoom level of the map to a second zoom level of the map (e.g., an animated transition from a first zoom level as shown in
In some embodiments, the animated transition from the first zoom level of the map to the second zoom level of the map includes (4320) a first portion showing an increase from the first zoom level of the map to a third zoom level of the map, followed by a second portion showing a decrease from the third zoom level of the map to the second zoom level of the map. For example, the animated transition may zoom in from an initial zoom level (e.g., as shown in
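For illustration only, a Swift sketch of the overshoot described above, in which the zoom level briefly passes beyond the target before settling at the second zoom level; the progress split and names are hypothetical.

    // Hypothetical sketch: zoom level over the course of the animated transition.
    func zoomLevel(at progress: Double,            // animation progress, 0...1
                   from firstLevel: Double,
                   to secondLevel: Double,
                   overshootLevel thirdLevel: Double,
                   overshootPoint: Double = 0.7) -> Double {
        let p = min(max(progress, 0), 1)
        if p < overshootPoint {
            // First portion: increase from the first zoom level to the third.
            return firstLevel + (thirdLevel - firstLevel) * (p / overshootPoint)
        } else {
            // Second portion: decrease from the third zoom level to the second.
            let q = (p - overshootPoint) / (1 - overshootPoint)
            return thirdLevel + (secondLevel - thirdLevel) * q
        }
    }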
In some embodiments, the plurality of points of interest includes (4322) a first point of interest and a second point of interest (e.g., both the first point of interest and the second point of interest are within a predetermined threshold map/screen distance from the focus selector). For example, first point of interest 4212 and second point of interest 4214 are shown in user interfaces 4280 and 4284 of
In some embodiments, zooming the map to display contextual information near the respective point of interest includes (4324) zooming the map to a predefined zoom level (e.g., such that the map view displays a predefined geographic range (e.g., 10-mile radius, 5-block radius, neighborhood, city, county, etc.)). In some embodiments, the map view is adjusted such that the respective point of interest is in the center of the zoomed map view. In some embodiments, the respective point of interest does not move as the zooming occurs. For example, point of interest 4212 does not change position within map view 4206 as zooming (from map view 4206 as shown in
In some embodiments, zooming the map to display contextual information near the respective point of interest includes (4326) zooming the map to a dynamically selected zoom level (e.g., a zoom level that is determined based on the current context). In some embodiments, the zoom level is dynamically selected to show meaningful information relevant to the current scenario (e.g., if the map and points of interest are displayed as a result of a restaurant search, this search context may warrant a zoom down to the street level near a restaurant of interest; if the map and points of interest are displayed as a result of a search for community parks, this search context and the user's current location 4222 may warrant a zoom down to a level that includes a meaningful number of community parks (e.g., five) near the user's current location, etc.). In some embodiments, determination of the dynamically selected zoom level includes determining an information density value at the respective point of interest or in an area of the map where the respective point of interest is located. For example, different information density values may be determined for each of a plurality of map views at different zoom levels for each point of interest, and an appropriate information density is used to select the appropriate zoom level for the respective point of interest.
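For illustration only, a Swift sketch of dynamically selecting a zoom level by evaluating an information-density value at candidate zoom levels; the density function here is a stand-in, and all names and values are hypothetical.

    // Hypothetical sketch: pick the most zoomed-in candidate level whose
    // information density around the point of interest is still meaningful.
    func dynamicallySelectedZoomLevel(candidateZoomLevels: [Double],
                                      minimumDensity: Double,
                                      densityAt: (Double) -> Double) -> Double? {
        return candidateZoomLevels
            .sorted(by: >)                               // most detail first
            .first { densityAt($0) >= minimumDensity }   // still enough relevant content
    }

    // Example with a stand-in density model (e.g., points of interest visible
    // at each zoom level); here the level-12 view is the deepest that still
    // shows at least five relevant items.
    let chosen = dynamicallySelectedZoomLevel(
        candidateZoomLevels: [10, 12, 14, 16],
        minimumDensity: 5,
        densityAt: { zoom in zoom >= 14 ? 3 : 8 })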
After zooming the map, the device detects (4328) a respective input that includes detecting a decrease in the characteristic intensity of the contact on the touch-sensitive surface below a predefined intensity threshold (e.g., detecting a decrease in intensity of the contact below the predefined intensity threshold or detecting liftoff of the contact from the touch-sensitive surface). For example, in
In response to detecting the respective input that includes detecting the decrease in the characteristic intensity of the contact: in accordance with a determination that the characteristic intensity of the contact increased above a maintain-context intensity threshold (e.g., a deep press intensity threshold (e.g., ITD), or another static or dynamically determined maintain-context intensity threshold) before detecting the respective input, the device continues (4330) to display the contextual information near the respective point of interest (e.g., the same zoomed view of the map is maintained on the display when the characteristic intensity of the contact increases above the maintain-context intensity threshold before easing off). For example, in
In accordance with a determination that the characteristic intensity of the contact did not increase above the maintain-context intensity threshold before detecting the respective input, the device ceases (4330) to display the contextual information near the point of interest and the device redisplays the view of the map that includes the plurality of points of interest. In some embodiments, if the device detects that the intensity of the contact decreases below the predefined intensity threshold or detects liftoff of the contact from the touch-sensitive surface without first detecting an increase above the maintain-context intensity threshold, the zoomed view of the map is replaced by the original view of the map that includes the plurality of points of interest, without the contextual information near the respective point of interest. For example, in
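For illustration only, a Swift sketch of the decision described in the two preceding paragraphs: when the contact intensity decreases below the predefined threshold (or the contact lifts off), the zoomed contextual view is kept only if the maintain-context intensity threshold was reached earlier in the input. Names and values are hypothetical.

    // Hypothetical sketch: keep or revert the zoomed view on intensity decrease.
    enum MapZoomResolution {
        case keepContextualView      // intensity exceeded the maintain-context threshold earlier
        case restoreOriginalMapView  // it did not; redisplay the multi-point map view
    }

    func resolveZoom(afterIntensityDecrease maximumIntensityDuringInput: Double,
                     maintainContextThreshold: Double = 0.8) -> MapZoomResolution {
        return maximumIntensityDuringInput >= maintainContextThreshold
            ? .keepContextualView
            : .restoreOriginalMapView
    }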
In some embodiments, after zooming the map (e.g., while displaying the zoomed view of the map with the contextual information), the device detects (4332) a movement of the contact on the touch-sensitive surface (e.g., after detecting the increase in intensity of the contact, the device detects a decrease in contact intensity below the preview intensity threshold or the maintain-context intensity threshold, followed by a movement of the contact while at the lower contact intensity). For example, after zooming the map to a map view 4206 as shown in
In some embodiments, zooming the map to display contextual information near the respective point of interest includes zooming the map to a first zoom level (e.g., a preview zoom level), and after zooming the map to the first zoom level (and, optionally, before detecting the respective input that includes detecting a decrease in intensity of the contact on the touch-sensitive surface), the device detects (4334) an increase in the characteristic intensity of the contact above the maintain-context intensity threshold. For example, as shown in
In some embodiments, in response to detecting the respective input that includes detecting the decrease in the characteristic intensity of the contact, the device maintains (4336) display of the map at a respective zoom level that is equal to or greater than the first zoom level. For example, after reaching above the maintain-context intensity threshold, on reduced intensity with or without liftoff, the zoom level of the map is locked in at (1) the preview zoom level (e.g., as shown at
In some embodiments, while maintaining the display of the map at the respective zoom level that is equal to or greater than the first zoom level, the device detects (4338) a predefined gesture directed to the zoomed map (e.g., the user can provide a predetermined gesture (e.g., a pinch gesture) to zoom back out). In response to detecting the predefined gesture directed to the zoomed map, the device ceases (4338) to display the map at the respective zoom level that is equal to or greater than the first zoom level and the device zooms the map to a fourth zoom level below the respective zoom level. In some embodiments, the fourth zoom level is the view of the map that includes the plurality of points of interest. In some embodiments, the amount of zoom from the respective zoom level to the fourth zoom level is based on a magnitude of the predetermined gesture (e.g., based on a distance traversed by the pinch gesture).
In some embodiments, in response to detecting the increase in the characteristic intensity of the contact above the maintain-context intensity threshold (e.g. ITD), zooming the map to the second zoom level above the first zoom level includes (4340) replacing the first user interface with a second user interface that includes the zoomed map at the second zoom level, and an affordance for returning to the first user interface (e.g., a “Back” button). For example, a second user interface is a user interface as illustrated at
In some embodiments, the first user interface is an interface that includes a map showing avatars of multiple location-sharing friends of the user. When the user places a contact (e.g., a finger contact) on a respective location-sharing friend's avatar in the map and increases the characteristic intensity of the contact above the preview intensity threshold (e.g. ITL), a preview showing a zoomed map around the respective location-sharing friend's location is displayed in a preview platter overlaid on top of the first user interface, or the map in the first user interface is zoomed around the respective location-sharing friend's location while other portions of the first user interface remain unchanged. When the contact intensity increases above the maintain-context intensity threshold (e.g., ITD), a new, second user interface is displayed to replace the first user interface. In the second user interface, the map is displayed in a zoomed state (e.g., at the same zoom level as in the preview or at a higher zoom level). The second user interface also includes additional information about the respective location-sharing friend and affordances for various functions (e.g., contact the friend, etc.) that are not available in the first user interface.
In some embodiments, while displaying the second user interface (e.g., as illustrated at
It should be understood that the particular order in which the operations in
In accordance with some embodiments,
As shown in
The processing unit 4408 is configured to: enable display, in a first user interface on the display unit 4402, of a view of a map that includes a plurality of points of interest; while enabling display (e.g., with display enabling unit 4414) of the view of the map that includes the plurality of points of interest and while a focus selector is at a location of a respective point of interest, detect (e.g., with detecting unit 4410) an increase in a characteristic intensity of the contact on the touch-sensitive surface unit 4404 above a preview intensity threshold; in response to detecting (e.g., with the detecting unit 4410) the increase in the characteristic intensity of the contact above the preview intensity threshold, zoom (e.g., with the zooming unit 4412) the map to enable display (e.g., with the display enabling unit 4414) of contextual information near the respective point of interest; after zooming (e.g., with the zooming unit 4412) the map, detect (e.g., with detecting unit 4410) a respective input that includes detecting a decrease in the characteristic intensity of the contact on the touch-sensitive surface below a predefined intensity threshold; and in response to detecting the respective input that includes detecting the decrease in the characteristic intensity of the contact: in accordance with a determination that the characteristic intensity of the contact increased above a maintain-context intensity threshold before detecting the respective input, continue to enable display (e.g., with the display enabling unit 4414) of the contextual information near the respective point of interest; and in accordance with a determination that the characteristic intensity of the contact did not increase above the maintain-context intensity threshold before detecting the respective input, cease to enable display (e.g., with the ceasing unit 4416) of the contextual information near the point of interest and redisplay the view of the map that includes the plurality of points of interest.
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in information processing apparatus such as general purpose processors (e.g., as described above with respect to
The operations described above with reference to
Many electronic devices have graphical user interfaces that display a map at various zoom levels. For example, a map view including multiple points of interest can be displayed and the zoom level of the map can be increased to show contextual information for a particular point of interest. In the embodiments described below, a user interface displays a region with a view of a map including multiple points of interest and another region including representations of the points of interest (e.g., a list including information about the points of interest). When input received at a representation of a point of interest reaches a threshold intensity level, the view of the map is zoomed to show contextual information for the point of interest. Giving a user the ability to provide input with or without an intensity component allows additional functionality to be associated with the input.
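The intensity thresholds referenced throughout the figures described below (e.g., IT0, ITH, ITL, and ITD) can be thought of as partitioning the contact's characteristic intensity into bands. The following Swift sketch is illustrative only; the band names and numeric default values are assumptions, and actual thresholds are, in some embodiments, static or dynamically determined.

```swift
import Foundation

// Intensity bands corresponding to the threshold labels used with the intensity meters below.
// The numeric defaults are placeholders, not calibrated values.
enum IntensityBand {
    case belowDetection   // below IT0
    case detected         // between IT0 and ITH
    case hint             // between ITH and ITL
    case preview          // between ITL and ITD
    case deepPress        // at or above ITD
}

func band(for intensity: Double,
          it0: Double = 0.05, itH: Double = 0.25,
          itL: Double = 0.50, itD: Double = 0.80) -> IntensityBand {
    switch intensity {
    case ..<it0: return .belowDetection
    case ..<itH: return .detected
    case ..<itL: return .hint
    case ..<itD: return .preview
    default:     return .deepPress
    }
}
```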
Below,
In some embodiments, the device is an electronic device with a separate display (e.g., display 450) and a separate touch-sensitive surface (e.g., touch-sensitive surface 451). In some embodiments, the device is portable multifunction device 100, the display is touch-sensitive display system 112, and the touch-sensitive surface includes tactile output generators 167 on the display (
A contact is detected on touch screen 112 at a location indicated by focus selector 4504 within context region 4508. Focus selector 4504 is at the location of representation 4518, corresponding to point of interest 4512. A characteristic intensity of the contact at the location indicated by focus selector 4504 is indicated by intensity meter 4502. In the illustrative example of
As shown in
As shown in
As shown in
User interface 4550 concurrently displays, on touch screen 112 of portable multifunction device 100, a view of a map (e.g., map view 4506) and a context region (e.g., context region 4508). Map view 4506 includes multiple points of interest 4510-4516 and location indicator 4554 indicating the location of portable multifunction device 100. A contact is detected on touch screen 112 at a location indicated by focus selector 4504. Focus selector 4504 is at the location of representation 4518, corresponding to point of interest 4512. A characteristic intensity of the contact at the location indicated by focus selector 4504 is between a threshold intensity level IT0 and a threshold intensity level ITH, as indicated by intensity meter 4502 adjacent to 4550.
In user interface 4552, map view 4506 is zoomed to display contextual information for point of interest 4512 in response to a detected increase in the characteristic intensity of a contact on touch screen 112 when a focus selector 4504 is located at representation 4518 (corresponding to point of interest 4512). The contact has an intensity level exceeding an intensity threshold, such as a preview intensity threshold (e.g., intensity threshold ITL, as illustrated at intensity meter 4502 adjacent to 4552). Map view 4506 includes point of interest 4512 and location indicator 4554 indicating the location of portable multifunction device 100. In some embodiments, a zoom level of map view 4506 in user interface 4552 is determined such that point of interest 4512 and location indicator 4554 are concurrently visible in map view 4506.
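Determining a zoom level such that point of interest 4512 and location indicator 4554 are concurrently visible suggests deriving the zoomed region from the two locations plus some padding. The sketch below uses made-up geometry types (MapPoint, MapRegion) rather than an actual map API; the padding fraction and minimum span are arbitrary placeholders, and a production implementation would also account for aspect ratio.

```swift
import Foundation

// Made-up geometry types for this sketch (not a real map API).
struct MapPoint { var x: Double; var y: Double }
struct MapRegion { var minX: Double; var minY: Double; var maxX: Double; var maxY: Double }

// Derive a region in which the point of interest and the device's location
// (e.g., location indicator 4554) are concurrently visible, with padding so that
// neither sits on the edge of the zoomed map view.
func zoomRegion(pointOfInterest poi: MapPoint,
                deviceLocation device: MapPoint,
                paddingFraction: Double = 0.2,
                minimumSpan: Double = 1.0) -> MapRegion {
    let minX = min(poi.x, device.x), maxX = max(poi.x, device.x)
    let minY = min(poi.y, device.y), maxY = max(poi.y, device.y)
    let padX = max((maxX - minX) * paddingFraction, minimumSpan / 2)
    let padY = max((maxY - minY) * paddingFraction, minimumSpan / 2)
    return MapRegion(minX: minX - padX, minY: minY - padY,
                     maxX: maxX + padX, maxY: maxY + padY)
}
```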
User interface 4560 concurrently displays, on touch screen 112 of portable multifunction device 100, a view of a map 4506 and a context region 4508. The view of the map 4506 includes multiple points of interest 4510-4516. A contact is detected on touch screen 112 at a location indicated by focus selector 4504. Focus selector 4504 is at the location of representation 4518, corresponding to point of interest 4512. A characteristic intensity of the contact at the location indicated by focus selector 4504 is between a threshold intensity level IT0 and a threshold intensity level ITH, as indicated by intensity meter 4502 adjacent to 4560.
In user interface 4562, the view of the map (e.g., map view 4506) is zoomed to display contextual information for point of interest 4512 in response to a detected increase in the characteristic intensity of a contact on touch screen 112 when a focus selector 4504 is located at representation 4518 corresponding to point of interest 4512. The contact has an intensity level exceeding an intensity threshold, such as a preview intensity threshold (e.g., above intensity threshold ITL, as illustrated at intensity meter 4502 adjacent to 4562).
In response to detecting a decrease in the intensity of the contact below the intensity threshold (e.g., below intensity threshold ITL, as illustrated at intensity meter 4502 adjacent to 4564), portable multifunction device 100 redisplays user interface 4564 with the view of the map (e.g., map view 4506, as shown in user interface 4560) that includes multiple points of interest 4510-4516. While the view of the map (e.g., map view 4506) that includes multiple points of interest 4510-4516 is redisplayed as indicated in user interface 4564, the contact moves across touch screen 112 of portable multifunction device 100 such that focus selector 4504 moves from a location over representation 4518 to a location over representation 4520 along a path indicated by arrow 4568.
After movement of the contact along the path indicated by arrow 4568, portable multifunction device 100 detects an increase in the intensity of the contact above the intensity threshold (e.g., above intensity threshold ITL, as illustrated at intensity meter 4502 adjacent to 4566). In response to detecting the increase in the intensity of the contact while focus selector 4504 is at a location over representation 4520 (which corresponds to point of interest 4514), the view of the map (e.g., map view 4506) is zoomed to display contextual information for point of interest 4514, as shown in user interface 4566.
As described below, the method 4600 provides an intuitive way to zoom a map. The method reduces the cognitive burden on a user when zooming a map, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to zoom a map faster and more efficiently conserves power and increases the time between battery charges.
The device concurrently displays (4602) in a user interface on the display: a map view (e.g., map view 4506 in
In some embodiments, the representations of the first and second points of interest in the context region (e.g., representations 4518 and 4520 in context region 4508 of points of interest 4512 and 4514, respectively, shown in map view 4506) include (4604) additional information (e.g., text description of the address, rating, number of reviews, name, hours of operation, one or more images associated with the point of interest, a category description of the point of interest, a cost indicator, a distance from current user location, etc.) about the first and second points of interest that is not displayed in the map view, as shown in
While concurrently displaying the map view and the context region on the display, the device detects (4606) an increase in a characteristic intensity of a contact on the touch-sensitive surface (e.g., touch screen 112) above a respective intensity threshold (e.g., a light press threshold (ITL), or a preview intensity threshold). For example, in
In response to detecting the increase in the characteristic intensity of the contact above the respective intensity threshold (e.g., the light press threshold (ITL), or a preview intensity threshold), in accordance with a determination that a focus selector (e.g., focus selector 4504 in
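Which point of interest the map is zoomed around depends on which representation in the context region contains the focus selector when the respective intensity threshold is crossed. A minimal, one-dimensional Swift sketch of that hit test follows; the Representation type and its fields are assumptions introduced for illustration.

```swift
import Foundation

// Hypothetical model of a row ("representation") in the context region.
// The vertical extent of each row is simplified to a one-dimensional range.
struct Representation {
    var pointOfInterestID: Int
    var verticalExtent: ClosedRange<Double>
}

// Return the point of interest whose representation contains the focus selector
// at the moment the characteristic intensity crosses the respective threshold.
func pointOfInterestToZoom(focusSelectorY: Double,
                           representations: [Representation]) -> Int? {
    representations
        .first(where: { $0.verticalExtent.contains(focusSelectorY) })?
        .pointOfInterestID
}
```

Under these assumptions, a press whose focus selector falls within the row for representation 4518 zooms around point of interest 4512, and one within the row for representation 4520 zooms around point of interest 4514.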
In some embodiments, when zooming the map view, the context region is not zoomed (4610). For example, when the map view 4506 is zoomed from the view shown in
In some embodiments, zooming the map view to display the respective contextual information for the first point of interest around the first point of interest (e.g., point of interest 4512) in the map view (e.g., map view 4506 in 45J) includes (4612) zooming the map to a first zoom level so as to concurrently display a location of the electronic device and the first point of interest. For example, as shown in
In some embodiments, zooming the map view to display the respective contextual information for the first point of interest around the first point of interest in the map view includes ceasing (4614) to display the second point of interest in the zoomed map view (e.g.,
In some embodiments, zooming the map view to display the respective contextual information for the second point of interest around the second point of interest in the map view includes ceasing (4616) to display the first point of interest in the zoomed map view (e.g.,
In some embodiments, the device detects (4618) a movement of the contact on the touch-sensitive surface (e.g., touch screen 112) that corresponds to a movement of the focus selector (e.g., focus selector 4504) in the map view (e.g., map view 4506) (e.g., a movement along a path indicated by arrow 4544 in
In some embodiments, while displaying the zoomed map view with the respective contextual information for one of the first or second point of interest, the device detects (4620) a decrease in intensity of the contact on the touch-sensitive surface below a second respective intensity threshold (e.g., a decrease in intensity of the contact below ITL, a decrease in intensity of the contact below ITH, a lift-off of the contact from the touch screen 112, etc.) while the focus selector is at the location of the representation of said one of the first or second point of interest. In response to detecting the decrease in the characteristic intensity of the contact below the second respective intensity threshold, the device reverses (4620) the zooming of the map view. For example, in
In some embodiments, after reversing the zooming of the map view, the device detects (4622) a movement of the contact on the touch-sensitive surface that corresponds to a movement of the focus selector from the location of the representation of said one of the first or second point of interest to a location of a representation of a different point of interest shown in the context region (e.g., a third point of interest shown in the context region, or the other one of the first and second point of interest) in the map view. For example, in
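The reverse-zoom, move, and re-press sequence of operations 4620 and 4622 can be modeled as a short event stream. The event names, the starting point-of-interest parameter, and the threshold value in the following Swift sketch are hypothetical and serve only to illustrate the described behavior.

```swift
import Foundation

// Hypothetical input events for the reverse-zoom / move / re-press sequence.
enum PressEvent {
    case focusSelectorMoved(toPointOfInterest: Int)
    case intensityRose(to: Double)
    case intensityFell(to: Double)
}

// Track which point of interest (if any) the map is currently zoomed around.
// A nil result means the un-zoomed map view with all points of interest is shown.
func zoomedPointOfInterest(after events: [PressEvent],
                           startingOver pointOfInterest: Int,
                           previewThreshold: Double = 0.5) -> Int? {
    var currentPOI = pointOfInterest
    var zoomedPOI: Int? = nil
    for event in events {
        switch event {
        case .focusSelectorMoved(let newPOI):
            currentPOI = newPOI                       // moving alone does not re-zoom
        case .intensityRose(let intensity) where intensity >= previewThreshold:
            zoomedPOI = currentPOI                    // zoom around the POI under the focus selector
        case .intensityFell(let intensity) where intensity < previewThreshold:
            zoomedPOI = nil                           // reverse the zoom; redisplay all POIs
        default:
            break
        }
    }
    return zoomedPOI
}

// Example mirroring user interfaces 4560-4566: press over point of interest 4512,
// release, move to point of interest 4514, press again.
print(zoomedPointOfInterest(
    after: [.intensityRose(to: 0.6), .intensityFell(to: 0.1),
            .focusSelectorMoved(toPointOfInterest: 4514), .intensityRose(to: 0.6)],
    startingOver: 4512) as Any)   // prints Optional(4514)
```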
In some embodiments, while the focus selector is at the location of the representation of one of the first or second point of interest: in response to detecting the increase in the characteristic intensity of the contact above the respective intensity threshold, the device changes (4624) an appearance of said one of the first or second point of interest in the context region (e.g., highlighting the text in the representation of said point of interest in the context region, as shown at representation 4518 in context region 4508 of
In some embodiments, prior to detecting the increase in characteristic intensity of the contact above the respective intensity threshold (e.g., ITL), the device detects (4626) movement of the contact on the touch-sensitive surface (e.g., touch screen 112) that corresponds to movement of the focus selector in the context region; and in response to detecting the movement of the contact on the touch-sensitive surface (e.g., touch screen 112) that corresponds to the movement of the focus selector in the context region, the device scrolls (4626) the context region in accordance with the corresponding movement of the focus selector in the context region (e.g., context region 4508 is scrolled to show additional entries in the list of entries in the context region 4508 in
In some embodiments, after zooming the map view to display the respective contextual information for one of the first or second point of interest in the map view, and while the focus selector is at the location of the representation of said one of the first or second point of interest, the device detects (4628) an increase in the characteristic intensity of the contact above a location card display intensity threshold (e.g., a deep press intensity threshold ITD, or a static or dynamically determined “pop” intensity threshold). In response to detecting the increase in the characteristic intensity of the contact above the location card display intensity threshold, the device displays (4628) a location card (e.g., location card 4526) for said one of the first or second point of interest. For example, in
In some embodiments, while the focus selector 4504 is at the location of the representation of one of the first or second point of interest: prior to detecting the increase in the characteristic intensity of the contact on the touch-sensitive surface above the respective intensity threshold (e.g., a light press threshold (ITL)), the device detects (4630) an increase in the characteristic intensity of the contact above a hint intensity threshold (e.g., ITH) below the respective intensity threshold. In response to detecting the increase in the characteristic intensity of the contact above the hint intensity threshold, the device changes (4630) an appearance of said one of the first or second point of interest in the context region in accordance with the intensity of the contact (e.g., highlighting the text in the representation of said point of interest in the context region, expanding the representation of said point of interest in the context region, or displaying additional information (e.g., additional text, image, etc.) describing said point of interest in the context region). In some embodiments, the appearance of said point of interest (e.g., point of interest 4512) is also changed (e.g., highlighted by changing color or size) in the map view in accordance with the intensity of the contact. For example, as shown in
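Changing the appearance of the representation "in accordance with the intensity of the contact" implies a mapping from intensity to a visual parameter between the hint threshold (ITH) and the respective intensity threshold (ITL). The linear ramp below is only one plausible choice; the function name and default values are assumptions.

```swift
import Foundation

// Map the contact's characteristic intensity to a highlight amount for the row in the
// context region: no change below the hint threshold (ITH), full highlight at the
// respective threshold (ITL), linear in between. Threshold defaults are placeholders.
func highlightFraction(for intensity: Double,
                       hintThreshold: Double = 0.25,
                       previewThreshold: Double = 0.50) -> Double {
    guard intensity > hintThreshold else { return 0 }
    guard intensity < previewThreshold else { return 1 }
    return (intensity - hintThreshold) / (previewThreshold - hintThreshold)
}
```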
It should be understood that the particular order in which the operations in
In accordance with some embodiments,
As shown in
The processing unit is configured to: enable concurrent display (e.g., with display enabling unit 4722), in a user interface on the display unit 4702, of: a map view that includes a plurality of points of interest, and a context region that is distinct from the map view and includes a representation of a first point of interest from the plurality of points of interest and a representation of a second point of interest from the plurality of points of interest; while enabling concurrent display of the map view and the context region on the display unit, detect (e.g., with detecting unit 4712) an increase in a characteristic intensity of a contact on the touch-sensitive surface unit above a respective intensity threshold; and in response to detecting the increase in the characteristic intensity of the contact above the respective intensity threshold: in accordance with a determination that a focus selector was at a location of the representation of the first point of interest in the context region when the increase in the characteristic intensity of the contact above the respective intensity threshold was detected, zoom (e.g., with the zooming unit 4710) the map view to display respective contextual information for the first point of interest around the first point of interest in the map view; and in accordance with a determination that the focus selector was at a location of the representation of the second point of interest in the context region when the increase in the characteristic intensity of the contact above the respective intensity threshold was detected, zoom (e.g., with the zooming unit 4710) the map view to display respective contextual information for the second point of interest around the second point of interest in the map view.
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in information processing apparatus such as general purpose processors (e.g., as described above with respect to
The operations described above with reference to
As noted above, there is a need for electronic devices with improved methods and interfaces for displaying and using a menu that includes contact information. Many electronic devices have applications that list objects that are associated with contact information (e.g., a list of search results in a map application, a list of friends in a messaging application, etc.). However, existing methods for accessing the associated contact information and initiating actions based on the contact information are slow and inefficient. For example, if a user is messaging with a friend in a messaging application and then wants to call that friend, the user may need to open a phone application, search for that friend in his/her contacts, and then select that friend from the contacts in order to place the call. The embodiments below address this problem by providing a menu (e.g., an action platter or quick action menu) for initiating one or more actions for a respective object that includes the contact information for the respective object. The menu provides a fast way to initiate actions (e.g., for a person, calling, messaging, or emailing the person, or for a business, getting directions to the business, calling the business, opening a web page for the business, etc.) without having to open a separate application or enter search terms and perform a search.
Below,
In some embodiments, the device is an electronic device with a separate display (e.g., display 450) and a separate touch-sensitive surface (e.g., touch-sensitive surface 451). In some embodiments, the device is portable multifunction device 100, the display is touch-sensitive display system 112, and the touch-sensitive surface includes tactile output generators 167 on the display (
As described below, method 4900 provides an efficient way to display a menu that includes contact information. The method provides a fast way to initiate actions (e.g., for a person, calling, messaging, or emailing the person, or for a business, getting directions to the business, calling the business, opening a web page for the business, etc.) without having to open a separate application or enter search terms and perform a search. The method reduces the cognitive burden on a user when displaying a menu, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to initiate actions faster and more efficiently conserves power and increases the time between battery charges.
The device displays (4902), on the display, a first user interface that includes a plurality of selectable objects that are associated with contact information. For example, the selectable objects include avatars, addresses, and/or telephone numbers of contactable entities (e.g., friends, social network contacts, business entities, points of interest, etc.) shown in a user interface of a messaging application (e.g., as shown in messages user interface 4830 of a messaging application,
In some embodiments, the plurality of selectable objects that are associated with contact information include (4904) representations of users associated with the contact information (e.g., images/avatars of other users).
In some embodiments, the plurality of selectable objects that are associated with contact information include (4906) representations of locations associated with the contact information (e.g., pins on a map or representations of restaurants, or data detected locations in the text of an electronic document or an electronic communication such as an email or other electronic message).
The device, while displaying the plurality of selectable objects and while a focus selector is at a location that corresponds to a respective selectable object (e.g., an avatar of a friend or a search result representation), detects (4908) an input that includes detecting a contact on the touch-sensitive surface.
The device, in response to detecting the input: in accordance with a determination that detecting the input includes detecting an increase in intensity of the contact that meets intensity criteria, the intensity criteria including a criterion that is met when a characteristic intensity of the contact increases above a respective intensity threshold (e.g., above a light press intensity threshold or a static or dynamically determined preview intensity threshold), displays (4910) a menu (e.g., an action platter or quick action menu for initiating one or more actions) for the respective selectable object that includes the contact information for the respective selectable object (e.g., available modes of contacting or communicating with the contactable entity represented by the respective selectable object and/or names, avatars, addresses, social network identities, telephone numbers, etc. associated with the respective selectable object) overlaid on top of the first user interface that includes the plurality of selectable objects. For example, for a respective selectable object that represents a restaurant, the one or more actions in the menu optionally include: getting directions to the restaurant, calling the restaurant, opening a web page for the restaurant, and sharing the location of the restaurant. For a respective selectable object that represents a business entity, the one or more actions in the menu optionally include: getting directions to the business, calling the business, opening a web page for the business, and sharing the location of the business, as shown in menu 4811 of
The device, in response to detecting the input: in accordance with a determination that detecting the input includes detecting a liftoff of the contact without meeting the intensity criteria (e.g., intensity of the contact does not reach the light press intensity threshold or the static or dynamically determined preview intensity threshold before lift-off of the contact (e.g., when the input is a tap gesture)), replaces display of the first user interface that includes the plurality of selectable objects with display of a second user interface that is associated with the respective selectable object. In some embodiments, the second user interface that is associated with the respective selectable object includes an information page for the respective selectable object (e.g., a web page for a restaurant, a full contact information sheet for a person, an information page for a business (e.g., information user interface 4820,
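Operation 4910 and the liftoff branch above amount to a decision based on whether the intensity criteria were met and whether the contact lifted off without meeting them. The Swift sketch below is hedged and illustrative; the enum cases, function name, and threshold value are hypothetical.

```swift
import Foundation

// Hypothetical outcomes for an input on a selectable object associated with contact information.
enum SelectableObjectResponse {
    case showMenuOverlaid        // intensity criteria met: quick-action menu over the first UI
    case replaceWithSecondUI     // tap (liftoff without meeting criteria): navigate to second UI
    case noAction                // input ended without liftoff and without meeting the criteria
}

// Decide the response from the input's peak characteristic intensity and whether liftoff occurred.
func response(peakIntensity: Double,
              liftedOff: Bool,
              menuIntensityThreshold: Double = 0.5) -> SelectableObjectResponse {
    if peakIntensity >= menuIntensityThreshold {
        return .showMenuOverlaid
    } else if liftedOff {
        return .replaceWithSecondUI
    }
    return .noAction
}
```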
In some embodiments, the contact information includes (4912) one or more of: one or more phone numbers (e.g., home, work, cell, etc.), one or more email addresses (e.g., home, work, etc.), one or more geographic addresses (e.g., different business locations), and one or more messaging contact addresses or identities (e.g., text messaging through a cell phone, text messaging through an email address, etc.).
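The contact information enumerated in operation 4912 could be modeled, for purposes of illustration only, as a simple value type; the field names below are assumptions rather than an API of the described device.

```swift
import Foundation

// Illustrative value type for the contact information of operation 4912;
// field names are assumptions, not an API of the described device.
struct ContactInformation {
    var phoneNumbers: [String: String]    // label ("home", "work", "cell", ...) -> number
    var emailAddresses: [String: String]  // label ("home", "work", ...) -> address
    var geographicAddresses: [String]     // e.g., different business locations
    var messagingIdentities: [String]     // e.g., phone-number- or email-based messaging
}
```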
In some embodiments, the menu includes (4914) a header, wherein the header includes additional information about the respective selectable object (e.g., for a restaurant: business hours, a rating, cost information, etc.; or for a person: full name, business affiliation, etc.).
In some embodiments, the device, in response to detecting the input: in accordance with the determination that detecting the input includes detecting an increase in intensity of the contact that meets the intensity criteria, displays (4916) additional descriptive information describing the respective selectable object. In some embodiments, the additional descriptive information is displayed in a header of the menu, as described above with respect to operation 4914. In some embodiments, the additional descriptive information includes business hours, a rating, and/or cost information for a restaurant. In some embodiments, the additional descriptive information includes a full address, business hours, and/or a rating (as shown in
In some embodiments, the respective selectable object is (4918) an avatar. In some embodiments, the device, in accordance with the determination that detecting the input includes detecting an increase in intensity of the contact that meets the intensity criteria, displays a magnified version of the avatar within the menu (e.g., overlaid on top of other portions of the user interface), as shown in
In some embodiments, the device applies (4920) a visual effect to obscure the first user interface that includes the plurality of selectable objects while displaying the menu. In some embodiments, the first user interface is blurred or masked when the menu is displayed on top of the first user interface. For example, in
In some embodiments, the device, while displaying the menu for the respective selectable object, detects (4922) a predefined dismissal gesture (e.g., detecting a tap gesture while the focus selector is located outside of the menu, or detecting a swipe gesture that causes a movement of the focus selector across the menu and ends outside of the menu) directed to a location outside of the menu on the first user interface; and in response to detecting the predefined dismissal gesture: ceases to display the menu for the respective selectable object (and ceases to display any additional descriptive information describing the respective selectable object that was displayed with the menu); and restores display of the first user interface that includes the plurality of selectable objects. In some embodiments, restoring display of the first user interface that includes the plurality of selectable objects includes removing the visual effect that was applied to the first user interface.
In some embodiments, the menu includes (4924) one or more communication objects (e.g., selectable user interface objects that represent available modes of contacting or communicating with the contactable entity represented by the respective selectable object and/or specific names, avatars, addresses, social network identities, telephone numbers, etc. associated with the respective selectable object).
In some embodiments, the portion of the input that meets the selection criteria is (4926) a terminal portion of the input (e.g., liftoff of the contact from the touch-sensitive surface). For example, as shown in
In some embodiments, the portion of the input that meets the selection criteria corresponds (4928) to a change in intensity of the contact. In some embodiments, the change in intensity of the contact includes a decrease in intensity of the contact followed by an increase in intensity of the contact over an intensity threshold that corresponds to selection of the respective communication object. In some embodiments, the change in intensity of the contact includes an increase in intensity of the contact to a second intensity threshold, greater than the respective intensity threshold at which the device displays the menu. For example, as shown in
In some embodiments, initiating the communication function corresponding to the respective communication object includes (4930) initiating a communication (e.g., a telephone call, an instant message, a draft email) corresponding to the respective communication object.
In some embodiments, initiating the communication function corresponding to the respective communication object in response to detecting the portion of the input that meets the selection criteria includes (4932): in response to detecting the portion of the input (e.g., the terminal portion of the input) that meets the selection criteria (e.g., liftoff of the contact): in accordance with a determination that the focus selector is located at a first portion (e.g., left side, as shown in
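Operation 4932 distinguishes where the focus selector is located within the respective communication object when the portion of the input that meets the selection criteria (e.g., liftoff) is detected. Because part of the description is truncated here, the sketch below assumes, purely for illustration, that one portion triggers a default communication and the other expands the plurality of options described in the following paragraphs; the split fraction, names, and types are hypothetical.

```swift
import Foundation

// Hypothetical outcome when the selection portion of the input (e.g., liftoff) occurs
// while the focus selector is over a communication object in the menu.
enum CommunicationObjectAction {
    case initiateDefault(String)   // e.g., initiate a communication with the primary number
    case expandOptions([String])   // e.g., show the plurality of options for this object
}

// The communication object is treated as having two portions; which one the focus selector
// is over at liftoff selects the action. The split fraction is an arbitrary placeholder.
func actionOnLiftoff(focusSelectorX: Double,
                     objectMinX: Double, objectMaxX: Double,
                     defaultOption: String, allOptions: [String],
                     firstPortionFraction: Double = 0.7) -> CommunicationObjectAction {
    let splitX = objectMinX + (objectMaxX - objectMinX) * firstPortionFraction
    return focusSelectorX < splitX
        ? .initiateDefault(defaultOption)
        : .expandOptions(allOptions)
}
```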
In some embodiments, the plurality of options associated with the respective communication object expand (4934) out from the respective communication object.
In some embodiments, the device detects (4936) selection of a respective option of the plurality of options (e.g., selection by a tap gesture on the respective option, as shown in
In some embodiments, the respective selectable object occupies (4938) a portion of a second selectable object. In some embodiments, the second selectable object is a row in a plurality of rows in a list, an instant message conversation in a listing of instant messaging conversations, an email message in a listing of email messages, etc. In some embodiments, the second selectable object includes two selectable portions. For example, for a selectable object representing an instant messaging conversation (e.g., a rectangular-shaped user interface item, such as 4834-a, 4834-b, 4834-c, and 4834-d,
In some embodiments, displaying content associated with the second selectable object that is different from the menu for the respective selectable object includes (4940): in accordance with a determination that a first portion of the second input meets preview criteria (e.g., the second input is a press input with a characteristic intensity in the first portion of the second input that meets preview criteria, such as a characteristic intensity that meets a “peek” intensity threshold at which the device starts to display a preview of another user interface that can be reached by pressing harder on the respective selectable object), displaying a preview area overlaid on at least some of the plurality of selectable objects in the first user interface, wherein the preview area includes a reduced scale representation of the second user interface (e.g., as shown in
In some embodiments, determining that the first portion of the second input meets preview criteria includes (4942) detecting that the characteristic intensity of the second contact during the first portion of the second input increases to a first intensity threshold (e.g., a “peek” intensity threshold at which the device starts to display a preview of another user interface that can be reached by pressing harder on the respective selectable object), as shown in
In some embodiments, determining that the second portion of the second input meets user-interface-replacement criteria includes (4944) detecting that the characteristic intensity of the second contact during the second portion of the second input increases to a second intensity threshold, greater than the first intensity threshold (e.g., a “pop” intensity threshold, greater than a “peek” intensity threshold, at which the device replaces display of the first user interface (with the overlaid preview area) with display of the second user interface), as shown in
In some embodiments, determining that the second portion of the second input meets preview-area-disappearance criteria includes (4946) detecting a liftoff of the second contact without meeting the user-interface-replacement criteria during the second portion of the second input. For example, in
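Operations 4940-4946 describe three outcomes for the second input: showing the preview area when the preview ("peek") criteria are met, replacing the user interface when the replacement ("pop") criteria are met, and dismissing the preview on liftoff otherwise. The classification below is a hedged Swift sketch with placeholder thresholds and names, not the device's actual logic.

```swift
import Foundation

// Hypothetical outcomes for the second input on the second selectable object.
enum PreviewOutcome {
    case showPreviewArea          // "peek": reduced-scale representation of the second UI
    case replaceUserInterface     // "pop": replace the first user interface with the second
    case dismissPreview           // liftoff without meeting the replacement criteria
    case noPreview                // preview criteria were never met
}

// Classify the input from its peak characteristic intensity and whether liftoff has occurred.
// The "peek" and "pop" thresholds below are placeholders.
func previewOutcome(peakIntensity: Double,
                    liftedOff: Bool,
                    peekThreshold: Double = 0.5,
                    popThreshold: Double = 0.8) -> PreviewOutcome {
    if peakIntensity >= popThreshold { return .replaceUserInterface }
    if peakIntensity >= peekThreshold { return liftedOff ? .dismissPreview : .showPreviewArea }
    return .noPreview
}
```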
In some embodiments, the device applies (4948) a visual effect to obscure the first user interface while displaying the preview area, as shown in
It should be understood that the particular order in which the operations in
In accordance with some embodiments,
As shown in
The processing unit 5008 is configured to: enable display, on the display unit 5002, of a first user interface that includes a plurality of selectable objects that are associated with contact information (e.g., with the display enabling unit 5010); while enabling display of the plurality of selectable objects and while a focus selector is at a location that corresponds to a respective selectable object, detect an input that includes detecting a contact on the touch-sensitive surface unit 5004 (e.g., with the detecting unit 5012); and in response to detecting the input: in accordance with a determination that detecting the input includes detecting an increase in intensity of the contact that meets intensity criteria, the intensity criteria including a criterion that is met when a characteristic intensity of the contact increases above a respective intensity threshold, enable display of a menu for the respective selectable object (e.g., with the display enabling unit 5010) that includes the contact information for the respective selectable object overlaid on top of the first user interface that includes the plurality of selectable objects; and in accordance with a determination that detecting the input includes detecting a liftoff of the contact without meeting the intensity criteria, replace display of the first user interface that includes the plurality of selectable objects with display of a second user interface that is associated with the respective selectable object (e.g., with the display enabling unit 5010).
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in information processing apparatus such as general purpose processors (e.g., as described above with respect to
The operations described above with reference to
The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best use the invention and various described embodiments with various modifications as are suited to the particular use contemplated.
This application is a continuation of U.S. application Ser. No. 14/870,988, filed Sep. 30, 2015, which is a continuation of U.S. application Ser. No. 14/869,899, filed Sep. 29, 2015, now U.S. Pat. No. 9,632,664, which claims priority to: (1) U.S. Provisional Application Ser. No. 62/215,722, filed Sep. 8, 2015; (2) U.S. Provisional Application Ser. No. 62/213,609, filed Sep. 2, 2015; (3) U.S. Provisional Application Ser. No. 62/203,387, filed Aug. 10, 2015; (4) U.S. Provisional Application Ser. No. 62/215,696, filed Sep. 8, 2015; (5) U.S. Provisional Application Ser. No. 62/213,606, filed Sep. 2, 2015; (6) U.S. Provisional Application Ser. No. 62/183,139, filed Jun. 22, 2015; (7) U.S. Provisional Application Ser. No. 62/172,226, filed Jun. 7, 2015; and (8) U.S. Provisional Application No. 62/129,954, filed Mar. 8, 2015, all of which are incorporated by reference herein in their entireties.
20110185299 | Hinckley et al. | Jul 2011 | A1 |
20110185300 | Hinckley et al. | Jul 2011 | A1 |
20110185316 | Reid et al. | Jul 2011 | A1 |
20110191675 | Kauranen | Aug 2011 | A1 |
20110193788 | King et al. | Aug 2011 | A1 |
20110193809 | Walley et al. | Aug 2011 | A1 |
20110193881 | Rydenhag | Aug 2011 | A1 |
20110197160 | Kim et al. | Aug 2011 | A1 |
20110201387 | Paek et al. | Aug 2011 | A1 |
20110202834 | Mandryk et al. | Aug 2011 | A1 |
20110202853 | Mujkic | Aug 2011 | A1 |
20110202879 | Stovicek et al. | Aug 2011 | A1 |
20110205163 | Hinckley et al. | Aug 2011 | A1 |
20110209088 | Hinckley et al. | Aug 2011 | A1 |
20110209093 | Hinckley et al. | Aug 2011 | A1 |
20110209097 | Hinckley et al. | Aug 2011 | A1 |
20110209099 | Hinckley et al. | Aug 2011 | A1 |
20110209104 | Hinckley et al. | Aug 2011 | A1 |
20110210926 | Pasquero et al. | Sep 2011 | A1 |
20110210931 | Shai | Sep 2011 | A1 |
20110215914 | Edwards | Sep 2011 | A1 |
20110221684 | Rydenhag | Sep 2011 | A1 |
20110221776 | Shimotani et al. | Sep 2011 | A1 |
20110231789 | Bukurak et al. | Sep 2011 | A1 |
20110234639 | Shimotani et al. | Sep 2011 | A1 |
20110238690 | Arrasvouri et al. | Sep 2011 | A1 |
20110239110 | Garrett et al. | Sep 2011 | A1 |
20110242029 | Kasahara et al. | Oct 2011 | A1 |
20110246877 | Kwak et al. | Oct 2011 | A1 |
20110248916 | Griffin et al. | Oct 2011 | A1 |
20110248948 | Griffin et al. | Oct 2011 | A1 |
20110252346 | Chaudhri | Oct 2011 | A1 |
20110252357 | Chaudhri | Oct 2011 | A1 |
20110252362 | Cho et al. | Oct 2011 | A1 |
20110258537 | Rives et al. | Oct 2011 | A1 |
20110260994 | Saynac et al. | Oct 2011 | A1 |
20110263298 | Park | Oct 2011 | A1 |
20110267530 | Chun | Nov 2011 | A1 |
20110279380 | Weber et al. | Nov 2011 | A1 |
20110279381 | Tong et al. | Nov 2011 | A1 |
20110279395 | Kuwabara et al. | Nov 2011 | A1 |
20110279852 | Oda et al. | Nov 2011 | A1 |
20110285656 | Yaksick et al. | Nov 2011 | A1 |
20110285659 | Kuwabara et al. | Nov 2011 | A1 |
20110291945 | Ewing, Jr. et al. | Dec 2011 | A1 |
20110291951 | Tong | Dec 2011 | A1 |
20110296334 | Ryu et al. | Dec 2011 | A1 |
20110296351 | Ewing, Jr. et al. | Dec 2011 | A1 |
20110304559 | Pasquero | Dec 2011 | A1 |
20110304577 | Brown et al. | Dec 2011 | A1 |
20110310049 | Homma et al. | Dec 2011 | A1 |
20110319136 | Labowicz et al. | Dec 2011 | A1 |
20120005622 | Park et al. | Jan 2012 | A1 |
20120007857 | Noda et al. | Jan 2012 | A1 |
20120011437 | James et al. | Jan 2012 | A1 |
20120013541 | Boka et al. | Jan 2012 | A1 |
20120013542 | Shenfield | Jan 2012 | A1 |
20120013607 | Lee | Jan 2012 | A1 |
20120019448 | Pitkanen et al. | Jan 2012 | A1 |
20120026110 | Yamano | Feb 2012 | A1 |
20120032979 | Blow et al. | Feb 2012 | A1 |
20120036441 | Basir et al. | Feb 2012 | A1 |
20120036556 | LeBeau et al. | Feb 2012 | A1 |
20120038580 | Sasaki | Feb 2012 | A1 |
20120044153 | Arrasvouri et al. | Feb 2012 | A1 |
20120047380 | Nurmi | Feb 2012 | A1 |
20120056837 | Park et al. | Mar 2012 | A1 |
20120056848 | Yamano et al. | Mar 2012 | A1 |
20120062564 | Miyashita et al. | Mar 2012 | A1 |
20120062604 | Lobo | Mar 2012 | A1 |
20120062732 | Marman et al. | Mar 2012 | A1 |
20120066630 | Kim et al. | Mar 2012 | A1 |
20120066648 | Rolleston et al. | Mar 2012 | A1 |
20120081326 | Heubel et al. | Apr 2012 | A1 |
20120081375 | Robert et al. | Apr 2012 | A1 |
20120084644 | Robert et al. | Apr 2012 | A1 |
20120084689 | Ledet et al. | Apr 2012 | A1 |
20120084713 | Desai et al. | Apr 2012 | A1 |
20120089932 | Kano et al. | Apr 2012 | A1 |
20120089942 | Gammon | Apr 2012 | A1 |
20120089951 | Cassidy | Apr 2012 | A1 |
20120096393 | Shim et al. | Apr 2012 | A1 |
20120096400 | Cho | Apr 2012 | A1 |
20120098780 | Fujisawa et al. | Apr 2012 | A1 |
20120102437 | Worley et al. | Apr 2012 | A1 |
20120105358 | Momeyer et al. | May 2012 | A1 |
20120105367 | Son et al. | May 2012 | A1 |
20120106852 | Khawand et al. | May 2012 | A1 |
20120113007 | Koch et al. | May 2012 | A1 |
20120113023 | Koch et al. | May 2012 | A1 |
20120126962 | Ujii et al. | May 2012 | A1 |
20120131495 | Goossens et al. | May 2012 | A1 |
20120139844 | Ramstein et al. | Jun 2012 | A1 |
20120139864 | Sleeman et al. | Jun 2012 | A1 |
20120144330 | Flint | Jun 2012 | A1 |
20120146945 | Miyazawa et al. | Jun 2012 | A1 |
20120147052 | Homma et al. | Jun 2012 | A1 |
20120154303 | Lazaridis et al. | Jun 2012 | A1 |
20120154328 | Kono | Jun 2012 | A1 |
20120158629 | Hinckley et al. | Jun 2012 | A1 |
20120159380 | Kocienda et al. | Jun 2012 | A1 |
20120169646 | Berkes et al. | Jul 2012 | A1 |
20120169716 | Mihara | Jul 2012 | A1 |
20120176403 | Cha et al. | Jul 2012 | A1 |
20120179967 | Hayes | Jul 2012 | A1 |
20120180001 | Griffin et al. | Jul 2012 | A1 |
20120182226 | Tuli | Jul 2012 | A1 |
20120183271 | Forutanpour et al. | Jul 2012 | A1 |
20120192108 | Kolb | Jul 2012 | A1 |
20120200528 | Ciesla et al. | Aug 2012 | A1 |
20120206393 | Hillis et al. | Aug 2012 | A1 |
20120216114 | Privault et al. | Aug 2012 | A1 |
20120218203 | Kanki | Aug 2012 | A1 |
20120235912 | Laubach | Sep 2012 | A1 |
20120240044 | Johnson et al. | Sep 2012 | A1 |
20120249575 | Krolczyk et al. | Oct 2012 | A1 |
20120249853 | Krolczyk et al. | Oct 2012 | A1 |
20120256829 | Dodge | Oct 2012 | A1 |
20120256846 | Mak | Oct 2012 | A1 |
20120256847 | Mak et al. | Oct 2012 | A1 |
20120256857 | Mak | Oct 2012 | A1 |
20120257071 | Prentice | Oct 2012 | A1 |
20120260219 | Piccolotto | Oct 2012 | A1 |
20120260220 | Griffin | Oct 2012 | A1 |
20120274578 | Snow et al. | Nov 2012 | A1 |
20120274591 | Rimas-Ribikauskas et al. | Nov 2012 | A1 |
20120274662 | Kim et al. | Nov 2012 | A1 |
20120284673 | Lamb et al. | Nov 2012 | A1 |
20120293449 | Dietz | Nov 2012 | A1 |
20120293551 | Momeyer et al. | Nov 2012 | A1 |
20120297041 | Momchilov | Nov 2012 | A1 |
20120304108 | Jarrett et al. | Nov 2012 | A1 |
20120304132 | Sareen et al. | Nov 2012 | A1 |
20120304133 | Nan et al. | Nov 2012 | A1 |
20120306632 | Fleizach et al. | Dec 2012 | A1 |
20120306748 | Fleizach et al. | Dec 2012 | A1 |
20120306764 | Kamibeppu | Dec 2012 | A1 |
20120306765 | Moore | Dec 2012 | A1 |
20120306766 | Moore | Dec 2012 | A1 |
20120306772 | Tan et al. | Dec 2012 | A1 |
20120306778 | Wheeldreyer et al. | Dec 2012 | A1 |
20120306927 | Lee et al. | Dec 2012 | A1 |
20120311429 | Decker et al. | Dec 2012 | A1 |
20120311437 | Weeldreyer et al. | Dec 2012 | A1 |
20120311498 | Kluttz et al. | Dec 2012 | A1 |
20130002561 | Wakasa | Jan 2013 | A1 |
20130014057 | Reinpoldt et al. | Jan 2013 | A1 |
20130016042 | Makinen et al. | Jan 2013 | A1 |
20130016122 | Bhatt et al. | Jan 2013 | A1 |
20130019158 | Watanabe | Jan 2013 | A1 |
20130019174 | Gil et al. | Jan 2013 | A1 |
20130031514 | Gabbert | Jan 2013 | A1 |
20130036386 | Park et al. | Feb 2013 | A1 |
20130042199 | Fong et al. | Feb 2013 | A1 |
20130044062 | Bose et al. | Feb 2013 | A1 |
20130047100 | Kroeger et al. | Feb 2013 | A1 |
20130050131 | Lee et al. | Feb 2013 | A1 |
20130050143 | Kim et al. | Feb 2013 | A1 |
20130061172 | Huang et al. | Mar 2013 | A1 |
20130063364 | Moore | Mar 2013 | A1 |
20130063389 | Moore | Mar 2013 | A1 |
20130067383 | Kataoka et al. | Mar 2013 | A1 |
20130067513 | Takami | Mar 2013 | A1 |
20130067527 | Ashbook et al. | Mar 2013 | A1 |
20130074003 | Dolenc | Mar 2013 | A1 |
20130076676 | Gan | Mar 2013 | A1 |
20130077804 | Glebe et al. | Mar 2013 | A1 |
20130082824 | Colley | Apr 2013 | A1 |
20130082937 | Liu et al. | Apr 2013 | A1 |
20130086056 | Dyor et al. | Apr 2013 | A1 |
20130093691 | Moosavi | Apr 2013 | A1 |
20130093764 | Andersson et al. | Apr 2013 | A1 |
20130097520 | Lewin et al. | Apr 2013 | A1 |
20130097521 | Lewin et al. | Apr 2013 | A1 |
20130097534 | Lewin et al. | Apr 2013 | A1 |
20130097539 | Mansson et al. | Apr 2013 | A1 |
20130097556 | Louch | Apr 2013 | A1 |
20130097562 | Kermoian et al. | Apr 2013 | A1 |
20130111345 | Newman et al. | May 2013 | A1 |
20130111378 | Newman et al. | May 2013 | A1 |
20130111398 | Lu et al. | May 2013 | A1 |
20130111415 | Newman et al. | May 2013 | A1 |
20130111579 | Newman et al. | May 2013 | A1 |
20130113715 | Grant et al. | May 2013 | A1 |
20130113720 | Van Eerd et al. | May 2013 | A1 |
20130113760 | Gossweiler, III et al. | May 2013 | A1 |
20130120278 | Cantrell | May 2013 | A1 |
20130120280 | Kukulski | May 2013 | A1 |
20130120295 | Kim et al. | May 2013 | A1 |
20130120306 | Furukawa | May 2013 | A1 |
20130125039 | Murata | May 2013 | A1 |
20130127755 | Lynn et al. | May 2013 | A1 |
20130135243 | Hirsch et al. | May 2013 | A1 |
20130135288 | King et al. | May 2013 | A1 |
20130135499 | Song | May 2013 | A1 |
20130141364 | Lynn et al. | Jun 2013 | A1 |
20130141396 | Lynn et al. | Jun 2013 | A1 |
20130145313 | Roh et al. | Jun 2013 | A1 |
20130154948 | Schediwy et al. | Jun 2013 | A1 |
20130154959 | Lindsay et al. | Jun 2013 | A1 |
20130155018 | Dagdeviren | Jun 2013 | A1 |
20130159893 | Lewis et al. | Jun 2013 | A1 |
20130162603 | Peng et al. | Jun 2013 | A1 |
20130162667 | Eskolin et al. | Jun 2013 | A1 |
20130169549 | Seymour et al. | Jul 2013 | A1 |
20130174049 | Townsend et al. | Jul 2013 | A1 |
20130174089 | Ki | Jul 2013 | A1 |
20130174094 | Heo et al. | Jul 2013 | A1 |
20130174179 | Park et al. | Jul 2013 | A1 |
20130179840 | Fisher et al. | Jul 2013 | A1 |
20130185642 | Gammons | Jul 2013 | A1 |
20130187869 | Rydenhag et al. | Jul 2013 | A1 |
20130191791 | Rydenhag et al. | Jul 2013 | A1 |
20130194217 | Lee et al. | Aug 2013 | A1 |
20130194480 | Fukata et al. | Aug 2013 | A1 |
20130198690 | Barsoum et al. | Aug 2013 | A1 |
20130201139 | Tanaka | Aug 2013 | A1 |
20130212515 | Eleftheriou | Aug 2013 | A1 |
20130212541 | Dolenc et al. | Aug 2013 | A1 |
20130215079 | Johnson et al. | Aug 2013 | A1 |
20130222274 | Mori et al. | Aug 2013 | A1 |
20130222323 | McKenzie | Aug 2013 | A1 |
20130222333 | Miles et al. | Aug 2013 | A1 |
20130222671 | Tseng et al. | Aug 2013 | A1 |
20130227413 | Thorsander et al. | Aug 2013 | A1 |
20130227419 | Lee et al. | Aug 2013 | A1 |
20130227450 | Na et al. | Aug 2013 | A1 |
20130228023 | Drasnin et al. | Sep 2013 | A1 |
20130232353 | Belesiu et al. | Sep 2013 | A1 |
20130232402 | Lu et al. | Sep 2013 | A1 |
20130234929 | Libin | Sep 2013 | A1 |
20130239057 | Ubillos et al. | Sep 2013 | A1 |
20130246954 | Gray et al. | Sep 2013 | A1 |
20130249814 | Zeng | Sep 2013 | A1 |
20130257793 | Zeliff et al. | Oct 2013 | A1 |
20130257817 | Yliaho | Oct 2013 | A1 |
20130265246 | Tae | Oct 2013 | A1 |
20130268875 | Han et al. | Oct 2013 | A1 |
20130271395 | Tsai et al. | Oct 2013 | A1 |
20130275422 | Silber et al. | Oct 2013 | A1 |
20130278520 | Weng et al. | Oct 2013 | A1 |
20130293496 | Takamoto | Nov 2013 | A1 |
20130305184 | Kim et al. | Nov 2013 | A1 |
20130307790 | Konttori et al. | Nov 2013 | A1 |
20130307792 | Andres et al. | Nov 2013 | A1 |
20130314359 | Sudou | Nov 2013 | A1 |
20130314434 | Shetterly et al. | Nov 2013 | A1 |
20130321340 | Seo et al. | Dec 2013 | A1 |
20130321457 | Bauermeister et al. | Dec 2013 | A1 |
20130325342 | Pylappan et al. | Dec 2013 | A1 |
20130326420 | Liu et al. | Dec 2013 | A1 |
20130326421 | Jo | Dec 2013 | A1 |
20130326583 | Freihold et al. | Dec 2013 | A1 |
20130328770 | Parham | Dec 2013 | A1 |
20130328793 | Chowdhury | Dec 2013 | A1 |
20130328796 | Al-Dahle et al. | Dec 2013 | A1 |
20130332836 | Cho | Dec 2013 | A1 |
20130332892 | Matsuki | Dec 2013 | A1 |
20130335373 | Tomiyasu | Dec 2013 | A1 |
20130338847 | Lisseman et al. | Dec 2013 | A1 |
20130339909 | Ha | Dec 2013 | A1 |
20140002355 | Lee et al. | Jan 2014 | A1 |
20140002374 | Hunt et al. | Jan 2014 | A1 |
20140002386 | Rosenberg et al. | Jan 2014 | A1 |
20140013271 | Moore et al. | Jan 2014 | A1 |
20140024414 | Fuji | Jan 2014 | A1 |
20140026098 | Gilman | Jan 2014 | A1 |
20140026099 | Andersson Reimer et al. | Jan 2014 | A1 |
20140028554 | De Los Reyes et al. | Jan 2014 | A1 |
20140028571 | St. Clair | Jan 2014 | A1 |
20140028601 | Moore | Jan 2014 | A1 |
20140028606 | Giannetta | Jan 2014 | A1 |
20140035826 | Frazier et al. | Feb 2014 | A1 |
20140049491 | Nagar et al. | Feb 2014 | A1 |
20140055367 | Dearman et al. | Feb 2014 | A1 |
20140055377 | Kim | Feb 2014 | A1 |
20140059460 | Ho | Feb 2014 | A1 |
20140059485 | Lehrian et al. | Feb 2014 | A1 |
20140062956 | Ishizone et al. | Mar 2014 | A1 |
20140063316 | Lee et al. | Mar 2014 | A1 |
20140063541 | Yamazaki | Mar 2014 | A1 |
20140071060 | Santos-Gomez | Mar 2014 | A1 |
20140072281 | Cho et al. | Mar 2014 | A1 |
20140072283 | Cho et al. | Mar 2014 | A1 |
20140078318 | Alameh | Mar 2014 | A1 |
20140078343 | Dai et al. | Mar 2014 | A1 |
20140082536 | Costa et al. | Mar 2014 | A1 |
20140092025 | Pala et al. | Apr 2014 | A1 |
20140092030 | Van der Velden | Apr 2014 | A1 |
20140092031 | Schwartz et al. | Apr 2014 | A1 |
20140108936 | Khosropour et al. | Apr 2014 | A1 |
20140109016 | Ouyang et al. | Apr 2014 | A1 |
20140111456 | Kashiwa et al. | Apr 2014 | A1 |
20140111480 | Kim et al. | Apr 2014 | A1 |
20140111670 | Lord et al. | Apr 2014 | A1 |
20140118268 | Kuscher | May 2014 | A1 |
20140123080 | Gan | May 2014 | A1 |
20140139456 | Wigdor et al. | May 2014 | A1 |
20140139471 | Matsuki | May 2014 | A1 |
20140145970 | Cho | May 2014 | A1 |
20140152581 | Case et al. | Jun 2014 | A1 |
20140157203 | Jeon et al. | Jun 2014 | A1 |
20140160063 | Yairi et al. | Jun 2014 | A1 |
20140160073 | Matsuki | Jun 2014 | A1 |
20140164955 | Thiruvidam et al. | Jun 2014 | A1 |
20140164966 | Kim et al. | Jun 2014 | A1 |
20140165006 | Chaudhri et al. | Jun 2014 | A1 |
20140168093 | Lawrence | Jun 2014 | A1 |
20140168153 | Deichmann et al. | Jun 2014 | A1 |
20140173517 | Chaudhri | Jun 2014 | A1 |
20140179377 | Song et al. | Jun 2014 | A1 |
20140184526 | Cho | Jul 2014 | A1 |
20140201660 | Clausen et al. | Jul 2014 | A1 |
20140208271 | Bell et al. | Jul 2014 | A1 |
20140210758 | Park et al. | Jul 2014 | A1 |
20140210760 | Aberg et al. | Jul 2014 | A1 |
20140210798 | Wilson | Jul 2014 | A1 |
20140223376 | Tarvainen et al. | Aug 2014 | A1 |
20140223381 | Huang et al. | Aug 2014 | A1 |
20140237408 | Ohlsson et al. | Aug 2014 | A1 |
20140245202 | Yoon et al. | Aug 2014 | A1 |
20140245367 | Sasaki et al. | Aug 2014 | A1 |
20140267114 | Lisseman et al. | Sep 2014 | A1 |
20140267135 | Chhabra | Sep 2014 | A1 |
20140267362 | Kocienda et al. | Sep 2014 | A1 |
20140282084 | Murarka et al. | Sep 2014 | A1 |
20140282211 | Ady et al. | Sep 2014 | A1 |
20140282214 | Shirzadi et al. | Sep 2014 | A1 |
20140300569 | Matsuki et al. | Oct 2014 | A1 |
20140304599 | Alexandersson | Oct 2014 | A1 |
20140304651 | Johansson et al. | Oct 2014 | A1 |
20140306897 | Cueto | Oct 2014 | A1 |
20140306899 | Hicks | Oct 2014 | A1 |
20140310638 | Lee et al. | Oct 2014 | A1 |
20140313130 | Yamano et al. | Oct 2014 | A1 |
20140333551 | Kim et al. | Nov 2014 | A1 |
20140333561 | Bull et al. | Nov 2014 | A1 |
20140344765 | Hicks et al. | Nov 2014 | A1 |
20140351744 | Jeon et al. | Nov 2014 | A1 |
20140354845 | Molgaard et al. | Dec 2014 | A1 |
20140354850 | Kosaka et al. | Dec 2014 | A1 |
20140359438 | Matsuki | Dec 2014 | A1 |
20140359528 | Murata | Dec 2014 | A1 |
20140365945 | Karunamuni et al. | Dec 2014 | A1 |
20140380247 | Tecarro et al. | Dec 2014 | A1 |
20150002664 | Eppinger et al. | Jan 2015 | A1 |
20150012861 | Loginov | Jan 2015 | A1 |
20150015763 | Lee et al. | Jan 2015 | A1 |
20150020032 | Chen | Jan 2015 | A1 |
20150020036 | Kim et al. | Jan 2015 | A1 |
20150026584 | Kobayakov et al. | Jan 2015 | A1 |
20150026592 | Mohammed et al. | Jan 2015 | A1 |
20150026642 | Wilson et al. | Jan 2015 | A1 |
20150029149 | Andersson et al. | Jan 2015 | A1 |
20150033184 | Kim et al. | Jan 2015 | A1 |
20150042588 | Park | Feb 2015 | A1 |
20150046876 | Goldenberg | Feb 2015 | A1 |
20150049033 | Kim et al. | Feb 2015 | A1 |
20150055890 | Lundin et al. | Feb 2015 | A1 |
20150058723 | Cieplinski et al. | Feb 2015 | A1 |
20150062046 | Cho et al. | Mar 2015 | A1 |
20150062052 | Bernstein et al. | Mar 2015 | A1 |
20150062068 | Shih et al. | Mar 2015 | A1 |
20150067495 | Bernstein et al. | Mar 2015 | A1 |
20150067496 | Missig et al. | Mar 2015 | A1 |
20150067497 | Cieplinski et al. | Mar 2015 | A1 |
20150067513 | Zambetti et al. | Mar 2015 | A1 |
20150067519 | Missig et al. | Mar 2015 | A1 |
20150067534 | Choi et al. | Mar 2015 | A1 |
20150067559 | Missig et al. | Mar 2015 | A1 |
20150067560 | Cieplinski et al. | Mar 2015 | A1 |
20150067563 | Bernstein et al. | Mar 2015 | A1 |
20150067596 | Brown et al. | Mar 2015 | A1 |
20150067601 | Bernstein et al. | Mar 2015 | A1 |
20150067602 | Bernstein et al. | Mar 2015 | A1 |
20150067605 | Zambetti et al. | Mar 2015 | A1 |
20150071547 | Keating et al. | Mar 2015 | A1 |
20150082238 | Meng | Mar 2015 | A1 |
20150116205 | Westerman et al. | Apr 2015 | A1 |
20150121218 | Kim et al. | Apr 2015 | A1 |
20150121225 | Somasundaram et al. | Apr 2015 | A1 |
20150128092 | Lee et al. | May 2015 | A1 |
20150135109 | Zambetti et al. | May 2015 | A1 |
20150138126 | Westerman | May 2015 | A1 |
20150138155 | Bernstein et al. | May 2015 | A1 |
20150139605 | Wiklof | May 2015 | A1 |
20150143273 | Bernstein et al. | May 2015 | A1 |
20150143284 | Bennett et al. | May 2015 | A1 |
20150149899 | Bernstein et al. | May 2015 | A1 |
20150149964 | Bernstein et al. | May 2015 | A1 |
20150149967 | Bernstein et al. | May 2015 | A1 |
20150153897 | Huang et al. | Jun 2015 | A1 |
20150153929 | Bernstein et al. | Jun 2015 | A1 |
20150160729 | Nakagawa | Jun 2015 | A1 |
20150169059 | Behles et al. | Jun 2015 | A1 |
20150185840 | Golyshko et al. | Jul 2015 | A1 |
20150193099 | Murphy | Jul 2015 | A1 |
20150193951 | Lee et al. | Jul 2015 | A1 |
20150205495 | Koide et al. | Jul 2015 | A1 |
20150234446 | Nathan et al. | Aug 2015 | A1 |
20150234493 | Parivar et al. | Aug 2015 | A1 |
20150253866 | Amm et al. | Sep 2015 | A1 |
20150268786 | Kitada | Sep 2015 | A1 |
20150268813 | Bos | Sep 2015 | A1 |
20150309573 | Brombach et al. | Oct 2015 | A1 |
20150321607 | Cho et al. | Nov 2015 | A1 |
20150332107 | Paniaras | Nov 2015 | A1 |
20150332607 | Gardner, Jr. et al. | Nov 2015 | A1 |
20150378519 | Brown et al. | Dec 2015 | A1 |
20150378982 | McKenzie et al. | Dec 2015 | A1 |
20150381931 | Uhma et al. | Dec 2015 | A1 |
20160004373 | Huang | Jan 2016 | A1 |
20160004393 | Faaborg et al. | Jan 2016 | A1 |
20160004427 | Zambetti et al. | Jan 2016 | A1 |
20160004428 | Bernstein et al. | Jan 2016 | A1 |
20160004430 | Missig et al. | Jan 2016 | A1 |
20160004431 | Bernstein et al. | Jan 2016 | A1 |
20160004432 | Bernstein et al. | Jan 2016 | A1 |
20160011771 | Cieplinski | Jan 2016 | A1 |
20160019718 | Mukkamala et al. | Jan 2016 | A1 |
20160021511 | Jin et al. | Jan 2016 | A1 |
20160041750 | Cieplinski et al. | Feb 2016 | A1 |
20160048326 | Kim et al. | Feb 2016 | A1 |
20160062466 | Moussette et al. | Mar 2016 | A1 |
20160062619 | Reeve et al. | Mar 2016 | A1 |
20160070401 | Kim et al. | Mar 2016 | A1 |
20160077721 | Laubach et al. | Mar 2016 | A1 |
20160085385 | Gao et al. | Mar 2016 | A1 |
20160125234 | Ota et al. | May 2016 | A1 |
20160132139 | Du et al. | May 2016 | A1 |
20160188181 | Smith | Jun 2016 | A1 |
20160196028 | Kenney et al. | Jul 2016 | A1 |
20160210025 | Bernstein et al. | Jul 2016 | A1 |
20160259412 | Flint et al. | Sep 2016 | A1 |
20160259413 | Anzures et al. | Sep 2016 | A1 |
20160259495 | Butcher et al. | Sep 2016 | A1 |
20160259496 | Butcher et al. | Sep 2016 | A1 |
20160259498 | Foss et al. | Sep 2016 | A1 |
20160259499 | Kocienda et al. | Sep 2016 | A1 |
20160259516 | Kudurshian et al. | Sep 2016 | A1 |
20160259517 | Butcher et al. | Sep 2016 | A1 |
20160259518 | King et al. | Sep 2016 | A1 |
20160259519 | Foss et al. | Sep 2016 | A1 |
20160259527 | Kocienda et al. | Sep 2016 | A1 |
20160259528 | Foss et al. | Sep 2016 | A1 |
20160259536 | Kudurshian et al. | Sep 2016 | A1 |
20160259548 | Ma | Sep 2016 | A1 |
20160274686 | Ruiz et al. | Sep 2016 | A1 |
20160274728 | Luo et al. | Sep 2016 | A1 |
20160274761 | Ruiz et al. | Sep 2016 | A1 |
20160283054 | Suzuki | Sep 2016 | A1 |
20160306507 | Defazio et al. | Oct 2016 | A1 |
20160320906 | Bokma et al. | Nov 2016 | A1 |
20160357368 | Federighi et al. | Dec 2016 | A1 |
20160357389 | Dakin et al. | Dec 2016 | A1 |
20160357390 | Federighi et al. | Dec 2016 | A1 |
20160357404 | Alonso Ruiz et al. | Dec 2016 | A1 |
20160360116 | Penha et al. | Dec 2016 | A1 |
20170045981 | Karunamuni et al. | Feb 2017 | A1 |
20170046039 | Karunamuni et al. | Feb 2017 | A1 |
20170046058 | Karunamuni et al. | Feb 2017 | A1 |
20170046059 | Karunamuni et al. | Feb 2017 | A1 |
20170046060 | Karunamuni et al. | Feb 2017 | A1 |
20170075520 | Bauer et al. | Mar 2017 | A1 |
20170075562 | Bauer et al. | Mar 2017 | A1 |
20170075563 | Bauer et al. | Mar 2017 | A1 |
20170090699 | Pennington et al. | Mar 2017 | A1 |
20170091153 | Thimbleby | Mar 2017 | A1 |
20170109011 | Jiang | Apr 2017 | A1 |
20170115867 | Bargmann | Apr 2017 | A1 |
20170123497 | Yonezawa | May 2017 | A1 |
20170124699 | Lane | May 2017 | A1 |
20170139565 | Choi | May 2017 | A1 |
20170315694 | Alonso Ruiz et al. | Nov 2017 | A1 |
20180024681 | Bernstein et al. | Jan 2018 | A1 |
20180082522 | Bartosik | Mar 2018 | A1 |
20180275862 | Khoe et al. | Sep 2018 | A1 |
20180342103 | Schwartz et al. | Nov 2018 | A1 |
20180349362 | Sharp et al. | Dec 2018 | A1 |
20180364883 | Khoe et al. | Dec 2018 | A1 |
20180364898 | Chen | Dec 2018 | A1 |
20180364904 | Bernstein et al. | Dec 2018 | A1 |
20190004605 | Flint et al. | Jan 2019 | A1 |
20190012059 | Kwon et al. | Jan 2019 | A1 |
20190018562 | Bernstein et al. | Jan 2019 | A1 |
20190042075 | Bernstein et al. | Feb 2019 | A1 |
20190042078 | Bernstein et al. | Feb 2019 | A1 |
20190065043 | Zambetti et al. | Feb 2019 | A1 |
20190121493 | Bernstein et al. | Apr 2019 | A1 |
20190121520 | Cieplinski et al. | Apr 2019 | A1 |
20190138101 | Bernstein | May 2019 | A1 |
20190138102 | Missig | May 2019 | A1 |
20190138189 | Missig | May 2019 | A1 |
20190155503 | Alonso Ruiz et al. | May 2019 | A1 |
20190158727 | Penha et al. | May 2019 | A1 |
20190163358 | Dascola et al. | May 2019 | A1 |
20190171353 | Missig et al. | Jun 2019 | A1 |
20190171354 | Dascola et al. | Jun 2019 | A1 |
20190212896 | Karunamuni et al. | Jul 2019 | A1 |
20190332257 | Kudurshian et al. | Oct 2019 | A1 |
20190364194 | Penha et al. | Nov 2019 | A1 |
20190391658 | Missig et al. | Dec 2019 | A1 |
20200081614 | Zambetti | Mar 2020 | A1 |
20200142548 | Karunamuni et al. | May 2020 | A1 |
20200210059 | Hu et al. | Jul 2020 | A1 |
20200301556 | Alonso Ruiz et al. | Sep 2020 | A1 |
Number | Date | Country |
---|---|---|
1356493 | Jul 2002 | CN |
1808362 | Jul 2006 | CN |
101118469 | Feb 2008 | CN |
101192097 | Jun 2008 | CN |
101202866 | Jun 2008 | CN |
101222704 | Jul 2008 | CN |
101241397 | Aug 2008 | CN |
101320303 | Dec 2008 | CN |
101384977 | Mar 2009 | CN |
101498979 | Aug 2009 | CN |
101526876 | Sep 2009 | CN |
101593077 | Dec 2009 | CN |
101604208 | Dec 2009 | CN |
101609380 | Dec 2009 | CN |
101627359 | Jan 2010 | CN |
101650615 | Feb 2010 | CN |
101727179 | Jun 2010 | CN |
101809526 | Aug 2010 | CN |
101965549 | Feb 2011 | CN |
101998052 | Mar 2011 | CN |
102004575 | Apr 2011 | CN |
102004576 | Apr 2011 | CN |
102004577 | Apr 2011 | CN |
102004593 | Apr 2011 | CN |
102004604 | Apr 2011 | CN |
102112946 | Jun 2011 | CN |
102160021 | Aug 2011 | CN |
102214038 | Oct 2011 | CN |
102243662 | Nov 2011 | CN |
102301322 | Dec 2011 | CN |
102349038 | Feb 2012 | CN |
102349040 | Feb 2012 | CN |
102385478 | Mar 2012 | CN |
102438092 | May 2012 | CN |
102460355 | May 2012 | CN |
102483677 | May 2012 | CN |
102646013 | Aug 2012 | CN |
102662571 | Sep 2012 | CN |
102662573 | Sep 2012 | CN |
102752441 | Oct 2012 | CN |
102792255 | Nov 2012 | CN |
102819331 | Dec 2012 | CN |
102819401 | Dec 2012 | CN |
102841677 | Dec 2012 | CN |
103019586 | Apr 2013 | CN |
103092386 | May 2013 | CN |
103097992 | May 2013 | CN |
103186345 | Jul 2013 | CN |
103201714 | Jul 2013 | CN |
103279295 | Sep 2013 | CN |
103518176 | Jan 2014 | CN |
103649885 | Mar 2014 | CN |
103699295 | Apr 2014 | CN |
103777850 | May 2014 | CN |
103777886 | May 2014 | CN |
103793134 | May 2014 | CN |
103838465 | Jun 2014 | CN |
103970474 | Aug 2014 | CN |
104011637 | Aug 2014 | CN |
104020955 | Sep 2014 | CN |
104021021 | Sep 2014 | CN |
104024985 | Sep 2014 | CN |
104077014 | Oct 2014 | CN |
104142798 | Nov 2014 | CN |
104160362 | Nov 2014 | CN |
104331239 | Feb 2015 | CN |
104392292 | Mar 2015 | CN |
104412201 | Mar 2015 | CN |
104471521 | Mar 2015 | CN |
104487928 | Apr 2015 | CN |
101527745 | Sep 2015 | CN |
105264476 | Jan 2016 | CN |
100 59 906 | Jun 2002 | DE |
0 859 307 | Mar 1998 | EP |
0 880 090 | Nov 1998 | EP |
1 028 583 | Aug 2000 | EP |
1 406 150 | Apr 2004 | EP |
1 674 977 | Jun 2006 | EP |
1 882 902 | Jan 2008 | EP |
2 000 896 | Dec 2008 | EP |
2 017 701 | Jan 2009 | EP |
2 028 583 | Feb 2009 | EP |
2 077 490 | Jul 2009 | EP |
2 141 574 | Jan 2010 | EP |
2 175 357 | Apr 2010 | EP |
2 196 893 | Jun 2010 | EP |
2 214 087 | Aug 2010 | EP |
2 226 715 | Sep 2010 | EP |
2 299 351 | Mar 2011 | EP |
2 302 496 | Mar 2011 | EP |
2 375 309 | Oct 2011 | EP |
2 375 314 | Oct 2011 | EP |
2 386 935 | Nov 2011 | EP |
2 407 868 | Jan 2012 | EP |
2 420 924 | Feb 2012 | EP |
2 426 580 | Mar 2012 | EP |
2 447 818 | May 2012 | EP |
2 527 966 | Nov 2012 | EP |
2 530 677 | Dec 2012 | EP |
2 541 376 | Jan 2013 | EP |
2 555 500 | Feb 2013 | EP |
2 615 535 | Jul 2013 | EP |
2 631 737 | Aug 2013 | EP |
2 674 846 | Dec 2013 | EP |
2 708 985 | Mar 2014 | EP |
2 733 578 | May 2014 | EP |
2 808 764 | Dec 2014 | EP |
2 809 058 | Dec 2014 | EP |
2 813 938 | Dec 2014 | EP |
2 402 105 | Dec 2004 | GB |
58-182746 | Oct 1983 | JP |
H06-161647 | Jun 1994 | JP |
H07-98769 | Apr 1995 | JP |
H07-098769 | Apr 1995 | JP |
H07-104915 | Apr 1995 | JP |
H07-151512 | Jun 1995 | JP |
H08-227341 | Sep 1996 | JP |
H09-269883 | Oct 1997 | JP |
H09-330175 | Dec 1997 | JP |
H11-203044 | Jul 1999 | JP |
2001-078137 | Mar 2001 | JP |
2001-202192 | Jul 2001 | JP |
2001-222355 | Aug 2001 | JP |
2001-306207 | Nov 2001 | JP |
2002-044536 | Feb 2002 | JP |
2002-149312 | May 2002 | JP |
3085481 | May 2002 | JP |
2003-157131 | May 2003 | JP |
2003-186597 | Jul 2003 | JP |
2004-054861 | Feb 2004 | JP |
2004-062648 | Feb 2004 | JP |
2004-070492 | Mar 2004 | JP |
2004-086733 | Mar 2004 | JP |
2004-152217 | May 2004 | JP |
2004-288208 | Oct 2004 | JP |
2005-031786 | Feb 2005 | JP |
2005-092386 | Apr 2005 | JP |
2005-102106 | Apr 2005 | JP |
2005-135106 | May 2005 | JP |
2005-157842 | Jun 2005 | JP |
2005-196810 | Jul 2005 | JP |
2005-352927 | Dec 2005 | JP |
2006-185443 | Jul 2006 | JP |
2009-545805 | Dec 2006 | JP |
2007-116384 | May 2007 | JP |
2007-148104 | Jun 2007 | JP |
2007-264808 | Oct 2007 | JP |
2008-009759 | Jan 2008 | JP |
2008-015890 | Jan 2008 | JP |
2008-033739 | Feb 2008 | JP |
2008-516348 | May 2008 | JP |
2008-146453 | Jun 2008 | JP |
2008-191086 | Aug 2008 | JP |
2008-537615 | Sep 2008 | JP |
2008-305174 | Dec 2008 | JP |
2009-500761 | Jan 2009 | JP |
2009-110243 | May 2009 | JP |
2009-129171 | Jun 2009 | JP |
2009-129443 | Jun 2009 | JP |
2009-169452 | Jul 2009 | JP |
2009-211704 | Sep 2009 | JP |
2009-217543 | Sep 2009 | JP |
2009-294688 | Dec 2009 | JP |
2010-009321 | Jan 2010 | JP |
2010-503126 | Jan 2010 | JP |
2010-503130 | Jan 2010 | JP |
2010-055274 | Mar 2010 | JP |
2010-097353 | Apr 2010 | JP |
2010-146507 | Jul 2010 | JP |
2010-152716 | Jul 2010 | JP |
2010-176174 | Aug 2010 | JP |
2010-176337 | Aug 2010 | JP |
2010-181934 | Aug 2010 | JP |
2010-181940 | Aug 2010 | JP |
2010-198385 | Sep 2010 | JP |
2010-541071 | Dec 2010 | JP |
2011-501307 | Jan 2011 | JP |
2011-028635 | Feb 2011 | JP |
2011-048666 | Mar 2011 | JP |
2011-048686 | Mar 2011 | JP |
2011-048762 | Mar 2011 | JP |
2011-048832 | Mar 2011 | JP |
2011-053831 | Mar 2011 | JP |
2011-053972 | Mar 2011 | JP |
2011-053973 | Mar 2011 | JP |
2011-053974 | Mar 2011 | JP |
2011-059821 | Mar 2011 | JP |
2011-070342 | Apr 2011 | JP |
2011-100290 | May 2011 | JP |
2011-107823 | Jun 2011 | JP |
2011-123773 | Jun 2011 | JP |
2011-141868 | Jul 2011 | JP |
2011-170538 | Sep 2011 | JP |
2011-192179 | Sep 2011 | JP |
2011-192215 | Sep 2011 | JP |
2011-197848 | Oct 2011 | JP |
2011-221640 | Nov 2011 | JP |
2011-232947 | Nov 2011 | JP |
2011-242386 | Dec 2011 | JP |
2011-250004 | Dec 2011 | JP |
2011-253556 | Dec 2011 | JP |
2011-257941 | Dec 2011 | JP |
2011-530101 | Dec 2011 | JP |
2012-027940 | Feb 2012 | JP |
2012-033061 | Feb 2012 | JP |
2012-043266 | Mar 2012 | JP |
2012-043267 | Mar 2012 | JP |
2012-053687 | Mar 2012 | JP |
2012-053754 | Mar 2012 | JP |
2012-053926 | Mar 2012 | JP |
2012-073785 | Apr 2012 | JP |
2012-073873 | Apr 2012 | JP |
2012-509605 | Apr 2012 | JP |
2012-093820 | May 2012 | JP |
2012-118825 | Jun 2012 | JP |
2012-118993 | Jun 2012 | JP |
2012-123564 | Jun 2012 | JP |
2012-128825 | Jul 2012 | JP |
2012-527685 | Nov 2012 | JP |
2013-025357 | Feb 2013 | JP |
2013-030050 | Feb 2013 | JP |
2013-058149 | Mar 2013 | JP |
2013-080521 | May 2013 | JP |
2013-101465 | May 2013 | JP |
2013-105410 | May 2013 | JP |
2013-529339 | Jul 2013 | JP |
2013-200879 | Oct 2013 | JP |
2013-542488 | Nov 2013 | JP |
2013-250602 | Dec 2013 | JP |
2014-504419 | Feb 2014 | JP |
2014-052852 | Mar 2014 | JP |
2014-130567 | Jul 2014 | JP |
2014-140112 | Jul 2014 | JP |
2014-149833 | Aug 2014 | JP |
2014-519109 | Aug 2014 | JP |
2014-529137 | Oct 2014 | JP |
2015-099555 | May 2015 | JP |
2015-521315 | Jul 2015 | JP |
2015-153420 | Aug 2015 | JP |
2015-185161 | Oct 2015 | JP |
2006-0071353 | Jun 2006 | KR |
2008-0045143 | Apr 2008 | KR |
100823871 | Apr 2008 | KR |
2008-0054346 | Jun 2008 | KR |
2010-0010860 | Feb 2010 | KR |
2010-0014095 | Feb 2010 | KR |
2010 0070841 | Jun 2010 | KR |
2010 0133246 | Dec 2010 | KR |
2011 0026176 | Mar 2011 | KR |
2011 0086501 | Jul 2011 | KR |
2012 0103670 | Sep 2012 | KR |
20120135723 | Dec 2012 | KR |
2013 0099647 | Sep 2013 | KR |
2014 0016495 | Feb 2014 | KR |
2014 0029720 | Mar 2014 | KR |
2014 0043760 | Apr 2014 | KR |
2014 0079110 | Jun 2014 | KR |
2014 0122000 | Oct 2014 | KR |
20150013263 | Feb 2015 | KR |
20150021977 | Mar 2015 | KR |
2007145218 | Jul 2009 | RU |
WO 2005106637 | Nov 2005 | WO |
WO 2006013485 | Feb 2006 | WO |
WO 2006042309 | Apr 2006 | WO |
WO 2006094308 | Sep 2006 | WO |
WO 2007121557 | Nov 2007 | WO |
WO 2008030976 | Mar 2008 | WO |
WO 2008064142 | May 2008 | WO |
WO 2009155981 | Dec 2009 | WO |
WO 2009158549 | Dec 2009 | WO |
WO 2010013876 | Feb 2010 | WO |
WO 2010032598 | Feb 2010 | WO |
WO 2010032598 | Mar 2010 | WO |
WO 2010090010 | Aug 2010 | WO |
WO 2010122813 | Oct 2010 | WO |
WO 2010134729 | Nov 2010 | WO |
WO 2011024389 | Mar 2011 | WO |
WO 2011024465 | Mar 2011 | WO |
WO 2011093045 | Aug 2011 | WO |
WO 2011105009 | Sep 2011 | WO |
WO 2011108190 | Sep 2011 | WO |
WO 2011115187 | Sep 2011 | WO |
WO 2011121375 | Oct 2011 | WO |
WO 2012021417 | Feb 2012 | WO |
WO 2012037664 | Mar 2012 | WO |
WO 2012096804 | Jul 2012 | WO |
WO 2012108213 | Aug 2012 | WO |
WO 2012114760 | Aug 2012 | WO |
WO 2012137946 | Oct 2012 | WO |
WO 2012150540 | Nov 2012 | WO |
WO 2012153555 | Nov 2012 | WO |
WO 2013022486 | Feb 2013 | WO |
WO 2013035725 | Mar 2013 | WO |
WO 2013112453 | Aug 2013 | WO |
WO 2013169302 | Nov 2013 | WO |
WO 2013169845 | Nov 2013 | WO |
WO 2013169846 | Nov 2013 | WO |
WO 2013169849 | Nov 2013 | WO |
WO 2013169851 | Nov 2013 | WO |
WO 2013169853 | Nov 2013 | WO |
WO 2013169854 | Nov 2013 | WO |
WO 2013169870 | Nov 2013 | WO |
WO 2013169875 | Nov 2013 | WO |
WO 2013169877 | Nov 2013 | WO |
WO 2013169882 | Nov 2013 | WO |
WO 2013173838 | Nov 2013 | WO |
WO 2014105275 | Jul 2014 | WO |
WO 2014105276 | Jul 2014 | WO |
WO 2014105277 | Jul 2014 | WO |
WO 2014105278 | Jul 2014 | WO |
WO 2014105279 | Jul 2014 | WO |
WO 2014129655 | Aug 2014 | WO |
WO 2014149473 | Sep 2014 | WO |
WO 2014200733 | Dec 2014 | WO |
Entry |
---|
Agarwal, “How to Copy and Paste Text on Windows Phone 8,” Guiding Tech, http://web.archive.org/web20130709204246/http://www.guidingtech.com/20280/copy-paste-text-windows-phone-8/, Jul. 9, 2013, 10 pages. |
Angelov, “Sponsor Flip Wall with Jquery & CSS”, Tutorialzine. N.p., Mar. 24, 2010. Web. http://tutorialzine.com/2010/03/sponsor-wall-slip-jquery-css/, Mar. 24, 2010, 8 pages. |
Anonymous, “1-Click Installer for Windows Media Taskbar Mini-Player for Windows 7, 8, 8.1 10”, http://metadataconsulting.blogspot.de/2014/05/installer-for-windows-media-taskbar.htm, May 5, 2014, 6 pages. |
Anonymous, “Android—What Should Status Bar Toggle Button Behavior Be?”, https://ux.stackechange.com/questions/34814, Jan. 15, 2015, 2 pages. |
Anonymous, “Event Handling Guide for iOS”, https://github.com/lonfee88/iOSDevelopeLibrary/raw/master/EventHandlingiPhoneOS.pdf, Mar. 9, 2015, 74 pages. |
Anonymous, “Event Handling Guide for iOS—GitHub”, https://github.com/lonfee88/iOSDevelopeLibrary/blob/master/EventHandlingiPhoneOS.pdf, Apr. 15, 2015, 3 pages. |
Anonymous, “Google Android 5.0 Release Date, Specs and Editors Hands on Review—CNET”, http://www.cnet.com/products/google-an-android-5-0-lollipop/, Mar. 12, 2015, 10 pages. |
Anonymous, “How Do I Add Contextual Menu to My Apple Watch App?”, http://www.tech-recipes.com/rx/52578/how-do-i-add-contextual-menu-to-my-apple-watch-app, Jan. 13, 2015, 3 pages. |
Anonymous, “[new] WMP12 with Taskbar Toolbar for Windows 7—Windows Customization—WinMatrix”, http://www.winmatrix.com/forums/index/php?/topic/25528-new-wmp12-with-taskbar-toolbar-for-windows-7, Jan. 27, 2013, 6 pages. |
Anonymous, “Nokia 808 PureView screenshots”, retrieved from Internet; no URL, Nov. 12, 2012, 8 pages. |
Anonymous, “Nokia 808 PureView User Guide,” http://download-fds.webapps.microsoft.com/supportFiles/phones/files/pdf_guides/devices/808/Nokia_808_UG_en_APAC.pdf, Jan. 1, 2012, 144 pages. |
Anonymous, “Notifications, Android 4.4 and Lower”, Android Developers, https://developer.android.com/design/patterns/notifications_k.html, May 24, 2015, 9 pages. |
Anonymous, “Taskbar Extensions”, https://web.archive.org/web/20141228124434/http://msdn.microsoft.com:80/en-us/library/windows/desktop/dd378460(v=vs.85).aspx, Dec. 28, 2014, 8 pages. |
Azundris, “A Fire in the Pie,” http://web.archive.org/web/20140722062639/http://blog.azundrix.com/archives/168-A-fire-in-the-sky.html, Jul. 22, 2014, 8 pages. |
Bilibili, “Android 5.0 Lollipop”, https://www.bilibili.com/video/av1636064?from=search&seid=3128140235778895126, Oct. 19, 2014, 3 pages. |
B-log—betriebsraum weblog, “Extremely Efficient Menu Selection: Marking Menus for the Flash Platform,” http://www.betriebsraum.de/blog/2009/12/11/extremely-efficient-menu-selection-marking -for-the-flash-platform, Dec. 11, 2009, 9 pages. |
Bolluyt, “5 Apple Watch Revelations from Apple's New WatchKit”, http://www.cheatsheet.com/tecnology/5-apple-watch-revelations-from-apples-new-watchkit.html/?a=viewall, Nov. 22, 2014, 3 pages. |
Brownlee, “Android 5.0 Lollipop Feature Review!”, https//www.youtube.com/watch?v=pEDQ1z1-PvU, Oct. 27, 2014, 5 pages. |
Clark, “Global Moxie, Touch Means a Renaissance for Radial Menus,” http://globalmoxie.com/blog/radial-menus-for-touch-ui˜print.shtml, Jul. 17, 2012, 7 pages. |
Cohen, Cinemagraphs are Animated Gifs for Adults, http://www.tubefilter.com/2011/07/10/cinemagraph, Jul. 10, 2011, 3 pages. |
CrackBerry Forums, Windows 8 Bezel Control and Gestures, http://wwwforums.crackberry.com/blackberry-playbook-f222/windows-8-bezel-control-gestures-705129/, Mar. 1, 2012, 8 pages. |
Crook, “Microsoft Patenting Multi-Screen, Multi-Touch Gestures,” http://techcrunch.com/2011/08/25/microsoft-awarded-patents-for-multi-screen-multi-touch-gestures/, Aug. 25, 2011, 8 pages. |
Cvil.ly—a design blog, Interesting Touch Interactions on Windows 8, http://cviliy/2011/06/04/interesting-touch-interactions-on-windows-8/, Jun. 4, 2011, 3 pages. |
Dachis, “All the Awesome Things You Can Do With a Long Press on Your iPhone, iPad, or iPad Touch”, www.lifehacker.com, Jan. 25, 2012, 4 pages. |
Davidson, et al., “Extending 2D Object Arrangement with Pressure-Sensitive Layering Cues”, Proceedings of the 21st Annual ACM Symposium on User Interface Software and Technology, Oct. 19, 2008, 4 pages. |
Dinwiddie, et al., “Combined-User Interface for Computers, Television, Video Recorders, and Telephone, Etc”, IP.com Journal, Aug. 1, 1990, 3 pages. |
Drinkwater, “Glossary: Pre/Post Alarm Image Buffer,” http://www.networkwebcams.com/ip-camera-learning-center/2008/07/17/glossary-prepost-alarm-image-buffed, Jul. 17, 2008, 1 page. |
Dzyre, “10 Android Notification Features You Can Fiddle With”, http://www.hongkiat.com/blog/android-notification-features, Mar. 10, 2014, 10 pages. |
Easton-Ellett, “Three Free Cydia Utilities to Remove iOS Notification Badges”, http://www.ijailbreak.com/cydia/three-free-cydia-utilies-to-remove-ios-notification-badges, Apr. 14, 2012, 2 pages. |
Elliot, “Mac System 7”, YouTube. Web. Mar. 8, 2017, http://www.youtube.com/watch?v=XLv22hfuuik, Aug. 3, 2011, 1 page. |
Farshad, “SageThumbs-Preview and Convert Pictures From Windows Context Menu”, https://web.addictivetips.com/windows-tips/sagethumbs-preview-and-convert-photos-from-windows-context-menu, Aug. 8, 2011, 5 pages. |
Fenlon, “The Case for Bezel Touch Gestures on Apple's iPad,” http://www.tested.com/tech/tablets/3104-the case-for-bezel-touch-gestures-on-apples-ipad/, Nov. 2, 2011, 6 pages. |
Flaherty, “Is Apple Watch's Pressure-Sensitive Screen a Bigger Deal Than the Gadget Itself?”, http://www.wired.com/2014/09/apple-watchs-pressure-sensitive-screen-bigger-deal-gadget, Sep. 15, 2014, 3 pages. |
Flixel, “Cinemagraph Pro for Mac”, https://flixel.com/products/mac/cinemagraph-pro, 2014, 7 pages. |
Flowplayer, “Slowmotion: Flowplayer,” https://web.archive.org/web/20150226191526/http://flash.flowplayer.org/plugins/streaming/slowmotion.html, Feb. 26, 2015, 4 pages. |
Forlines, et al., “Glimpse: a Novel Input Model for Multi-level Devices”, Chi '05 Extended Abstracts on Human Factors in Computing Systems, Apr. 2, 2005, 4 pages. |
Gardner, “Recenz—Recent Apps in One Tap”, You Tube, https://www.youtube.com/watch?v-qailSHRgsTo, May 15, 2015, 1 page. |
Gonzalo et al., “Zliding: Fluid Zooming and Sliding for High Precision Parameter Manipulation”, Department of Computer Science, University of Toronto, Seattle, Washington, Oct. 23, 2005, 10 pages. |
Google-Chrome, “Android 5.0 Lollipop”, http://androidlover.net/android-os/android-5-0-lollipop/android-5-0-lollipop-recent-apps-card-google-search.html, Oct. 19, 2014, 10 pages. |
Grant, “Android's Notification Center”, https://www.objc.io/issues/11-android/android-notifications, Apr. 30, 2014, 26 pages. |
IBM et al., “Pressure-Sensitive Icons”, IBM Technical Disclosure Bulletin, vol. 33, No. 1B, Jun. 1, 1990, 3 pages. |
ICIMS Recruiting Software, “Blackberry Playbook Review,” http://www.tested.com/tech.tablets/5749-blackberry-playbook-review/, 2015, 11 pages. |
IPhoneHacksTV, “Confero allows you to easily manage your Badge notifications—iPhone Hacks”, youtube, https://wwwyoutube.com/watch?v=JCk61pnL4SU, Dec. 26, 2014, 3 pages. |
IPhoneOperator, “Wasser Liveeffekt fur Homescreen & Lockscreen—Aquaboard (Cydia)”, http://www.youtube.com/watch?v=fG9YMF-mB0Q, Sep. 22, 2012, 3 pages. |
IPodHacks 142: “Water Ripple Effects on the Home and Lock Screen: AquaBoard Cydia Tweak Review”, YouTube, https://www.youtube.comwatch?v-Auu_uRaYHJs, Sep. 24, 2012, 3 pages. |
Kaaresoja, “Snap-Crackle-Pop: Tactile Feedback for Mobile Touch Screens,” Nokia Research Center, Helsinki, Finland, Proceedings of Eurohaptics vol. 2006, Jul. 3, 2006, 2 pages. |
Kiener, “Force Touch on iPhone”, https://www.youtube.com/watch?v=CEMmnsU5fC8, Aug. 4, 2015, 4 pages. |
Kleinman, “iPhone 6s Said to Sport Force Touch Display, 2GB of RAM”, https://www.technobuffalo.com/2015/01/15/iphone-6s-said-to-sport-force-touch-display-2gb-of-ram, Jan. 15, 2015, 2 pages. |
Kost, “LR3-Deselect All Images But One”, Julieanne Kost's Blog, blogs.adobe.com/jkost/2011/12/lr3-deselect-all-images-but-one.html, Dec. 22, 2011, 1 page. |
Kronfli, “HTC Zoe Comes to Google Play, Here's Everything You Need to Know,” Know Your Mobile, http://www.knowyourmobile.com/htc/htc-one/19550/what-htc-zoe, Aug. 14, 2014, 5 pages. |
Kumar, “How to Enable Ripple Effect on Lock Screen of Galaxy S2”, YouTube, http://www.youtube.com/watch?v+B9-4M5abLXA, Feb. 12, 2013, 3 pages. |
Kurdi, “XnView Shell Extension: A Powerful Image Utility Inside the Context Menu”, http://www.freewaregenius.com/xnview-shell-extension-a-powerful-image-utility-inside-the-context-menu, Jul. 30, 2008, 4 pages. |
Laurie, “The Power of the Right Click,” http://vlaurie.com/right-click/customize-context-menu.html, 2002-2016, 3 pages. |
MacKenzie et al., “The Tactile Touchpad”, Chi '97 Extended Abstracts on Human Factors in Computing Systems Looking to the Future, Chi '97, Mar. 22, 1997, 5 pages. |
Mandi, Confero now available in Cydia, brings a new way to manage Notification badges [Jailbreak Tweak], http://www.iphonehacks.com/2015/01/confero/tweak-manage-notification-badges.html, Jan. 1, 2015, 2 pages. |
Matthew, “How to Preview Photos and Images From Right-Click Context Menue in Windows [Tip]”, http://www.dottech.org/159009/add-image-preview-in-windows-context-menu-tip, Jul. 4, 2014, 5 pages. |
McGarry, “Everything You Can Do With Force Touch on Apple Watch”, Macworld, www.macworld.com, May 6, 2015, 4 pages. |
McRitchie, “Internet Explorer Right-Click Menus,” http://web.archive.org/web-201405020/http:/dmcritchie.mvps.org/ie/rightie6.htm, May 2, 2014, 10 pages. |
Microsoft, “Lumia—How to Personalize Your Start Screen”, https://www.youtube.com/watch?v=6GI5Z3TrSEs, Nov. 11, 2014, 3 pages. |
Microsoft, “Use Radial Menus to Display Commands in OneNote for Windows 8,” https://support.office.com/en-us/article/Use-radial-menues-to-display-OneNote-commands-Od75f03f-cde7-493a-a8a0b2ed6f99fbe2, 2016, 5 pages. |
Minsky, “Computational Haptics the Sandpaper System for Synthesizing Texture for a Force-Feedback Display,” Massachusetts Institute of Technology, Jun. 1978, 217 pages. |
Mitroff, “Google Android 5.0 Lollipop,” http://www.cnet.com/products/google-android-5-0-lollipop, Mar. 12, 2015, 5 pages. |
Mohr, “Do Not Disturb—The iPhone Feature You Should Be Using”, http.www.wonderoftech.com/do-not-disturb-iphone, Jul. 14, 2014, 30 pages. |
Nacca, “NiLS Lock Screen Notifications / Floating Panel—Review”, https://www.youtube.com/watch?v=McT4QnS9TDY, Feb. 3, 2014, 4 pages. |
Nickinson, “How to use Do Not Disturb on the HTC One M8”, Android Central (Year: 2014), Apr. 7, 2014, 9 pages. |
Nikon, “Scene Recognition System and Advanced Srs,” http://www.nikonusa.com/enlearn-And-Explore/Article/ftlzi4rr/Scene-Recognition-System.html, Jul. 22, 2015, 2 pages. |
Oh, et al., “Moving Objects with 2D Input Devices in CAD Systems and Desktop Virtual Environments”, Proceedings of Graphics Interface 2005, May 2005, 8 pages. |
O'Hara, et al., “Pressure-Sensitive Icons”, IP.com Journal, IP.com Inc., West Henrietta, NY, US, Jun. 1, 1990, 2 pages. |
Pallenberg, “Wow, the new iPad had gestures.” https://plus.google.com/+SaschaPallenberg/posts/aaJtJogu8ac, Mar. 7, 2012, 2 pages. |
Phonebuff, “How to Pair Bluetooth on the iPhone”, https://www.youtube.com/watch?v=LudNwEar9A8, Feb. 8, 2012, 3 pages. |
PoliceOne.com, “COBAN Technologies Pre-Event Buffer & Fail Safe Feature,” http://www.policeone.com/police-products/police-technology/mobile-computures/videos/5955587-COBAN-Technologies-Pre-Event, Nov. 11, 2010, 2 pages. |
Pradeep, “Android App Development—Microsoft Awarded With Patents on Gestures Supported on Windows 8,” http://mspoweruser.com/microsoft-awarded-with-patents-on-gestures-supported-on-windows-8/, Aug. 25, 2011, 16 pages. |
“Quickly Preview Songs in Windows Media Player 12 in Windows 7,” Quickly Preview Songs in Windows Media Player 12 in Windows 7. How-to Geek, Apr. 28, 2010, Web. May 8, 2010, http://web.archive.org/web/20100502013134/http://www.howtogeek.com/howto/16157/quickly-preview-songs-in-windows-media-center-12-in-windows-7>, 6 pages. |
Quinn, et al., “Zoofing! Faster List Selections with Pressure-Zoom-Flick-Scrolling”, Proceedings of the 21st Annual Conference of the Australian Computer-Human Interaction Special Interest Group on Design, Nov. 23, 2009, ACM Press, vol. 411, 8 pages. |
Rekimoto, et al., “PreSense: Interaction Techniques for Finger Sensing Input Devices”, Proceedings of the 16th Annual ACM Symposium on User Interface Software and Technology, Nov. 30, 2003, 10 pages. |
Rekimoto, et al., “PreSensell: Bi-directional Touch and Pressure Sensing Interactions with Tactile Feedback”, Conference on Human Factors in Computing Systems Archive, ACM, Apr. 22, 2006, 6 pages. |
Ritchie, “How to see all the unread message notifications on your iPhone, all at once, all in the same place | iMore”, https://www.imore.com/how-see-all-unread-message-notifications-your-iphone-all-once-all-same-place, Feb. 22, 2014, 2 pages. |
Sony, “Intelligent Scene Recognition,” https://www.sony-asia.com/article/252999/section/product/product/dsc-t77, downloaded on May 20, 2016, 5 pages. |
Sood, “MultitaskingGestures”, http://cydia.saurik.com/package/org.thebigboxx.multitaskinggestures/, Mar. 3, 2014, 2 pages. |
Stewart, et al., “Characteristics of Pressure-Based Input for Mobile Devices”, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Apr. 2010, 10 pages. |
Stross, “Wearing a Badge, and a Video Camera,” The New York Times, http://www.nytimes.com/2013/04/07/business/wearable-video-cameras-for-police-offers.html? R=0, Apr. 6, 2013, 4 pages. |
Taser, “Taser Axon Body Camera User Manual,” https://www.taser.com/images/support/downloads/product-resourses/axon_body_product_manual.pdf, Oct. 1, 2013, 24 pages. |
Tidwell, “Designing Interfaces,” O'Reilly Media, Inc., USA, Nov. 2005, 348 pages. |
VGJFelix, “How to Master Android Lollipop Notifications in Four Minutes!”, https://www.youtube.com/watch?v=S-zBRG7GJgs, Feb. 8, 2015, 5 pages. |
VisioGuy, “Getting a Handle on Selecting and Subselecting Visio Shapes”, http://www.visguy.com/2009/10/13/getting-a-handle-on-selecting-and-subselecting-visio-shapes/, Oct. 13, 2009, 18 pages. |
Wikipedia, “AirDrop,”, Wikipedia, the free encyclopedia, http://en.wikipedia.org/wiki/AirDrop, May 17, 2016, 5 pages. |
Wikipedia, “Cinemagraph,” Wikipedia, the free encyclopedia, http://en.wikipedia.org/wiki/Cinemagraph, Last Modified Mar. 16, 2016, 2 pages. |
Wikipedia, “Context Menu,” Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Context menu, Last Modified May 15, 2016, 4 pages. |
Wikipedia, “HTC One (M7),” Wikipedia, the free encyclopedia, https://en.wikipedia.org/wiki/HTC_One_(M7), Mar. 2013, 20 pages. |
Wikipedia, “Mobile Ad Hoc Network,” Wikipedia, the free encyclopedia, http://en.wikipedia.org/wiki/Mobile_ad_hoc_network, May 20, 2016, 4 pages. |
Wikipedia, “Pie Menu,” Wikipedia, the free encyclopedia, http://en.wikipedia.org/wiki/Pie_menu, Last Modified Jun. 4, 2016, 3 pages. |
Wikipedia, “Quick Look,” from Wikipedia, the free encyclopedia, https;//en.wikipedia.org/wiki/Quick_Look, Last Modified Jan. 15, 2016, 3 pages. |
Wikipedia, “Sony Xperia Z1”, Wikipedia, the free encyclopedia, https://enwikipedia.org/wiki/Sony_Experia_Z1, Sep. 2013, 10 pages. |
YouTube, “Android Lollipop Lock-Screen Notification Tips”, https://www.youtube.com/watch?v=LZTxHBOwzIU, Nov. 13, 2014, 3 pages. |
YouTube, “Blackberry Playbook bezel interaction,” https://www.youtube.com/watch?v=YGkzFqnOwXI, Jan. 10, 2011, 2 pages. |
YouTube, “How to Master Android Lollipop Notifications in Four Minutes!”, Video Gadgets Journal (VGJFelix), https://www.youtube.com/watch?v=S-zBRG7GGJgs, Feb. 8, 2015, 4 pages. |
YouTube, “HTC One Favorite Camera Features”, http://www.youtube.com/watch?v=sUYHfcjl4RU, Apr. 28, 2013, 3 pages. |
YouTube, “Multitasking Gestures: Zephyr Like Gestures on iOS”, https://www.youtube.com/watch?v=Jcod-f7Lw0l, Jan. 27, 2014, 3 pages. |
YouTube, “Recentz—Recent Apps in a Tap”, https://www.youtube.com/watch?v=qailSHRgsTo, May 15, 2015, 1 page. |
Office Action, dated Mar. 15, 2017, received in U.S. Appl. No. 14/535,671, 13 pages. |
Office Action, dated Nov. 30, 2017, received in U.S. Appl. No. 14/535,671, 21 pages. |
Notice of Allowance, dated Sep. 5, 2018, received in U.S. Appl. No. 14/535,671, 5 pages. |
Office Action, dated Jun. 29, 2017, received in U.S. Appl. No. 14/608,895, 30 pages. |
Final Office Action, dated Feb. 22, 2018, received in U.S. Appl. No. 14/608,895, 20 pages. |
Notice of Allowance, dated Jun. 26, 2018, received in U.S. Appl. No. 14/608,895, 9 pages. |
Office Action, dated Dec. 18, 2015, received in Australian Patent Application No. 2013368440, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Office Action, dated Oct. 18, 2016, received in Australian Patent Application No. 2013368440, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Notice of Allowance, dated Dec. 20, 2016, received in Australian Patent Application No. 2013368440, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Certificate of Grant, dated Apr. 29, 2017, received in Australian Patent Application No. 2013368440, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Office Action, dated Nov. 6, 2017, received in Chinese Patent Application No. 201380068493.6, which corresponds with U.S. Appl. No. 14/608,895, 5 pages. |
Office Action, dated Oct. 9, 2018, received in Chinese Patent Application No. 201380068493.6, which corresponds with U.S. Appl. No. 14/608,895, 3 pages. |
Office Action, dated Jul. 21, 2016, received in European Patent Application No. 13795391.5, which corresponds with U.S. Appl. No. 14/536,426, 9 pages. |
Office Action, dated Mar. 9, 2018, received in European Patent Application No. 13795391.5, which corresponds with U.S. Appl. No. 14/536,426, 4 pages. |
Intention to Grant, dated Jul. 6, 2018, received in European Patent Application No. 13795391.5, which corresponds with U.S. Appl. No. 14/536,426, 5 pages. |
Office Action, dated Sep. 13, 2016, received in Japanese Patent Application No. 2015-547948, which corresponds with U.S. Appl. No. 14/536,426, 5 pages. |
Patent, dated May 12, 2017, received in Japanese Patent Application No. 2015-547948, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Office Action, dated Apr. 5, 2016, received in Korean Patent Application No. 10-2015-7018851, which corresponds with U.S. Appl. No. 14/536,426, 7 pages. |
Office Action, dated Feb. 24, 2017, received in Korean Patent Application No. 10-2015-7018851, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Patent, dated May 26, 2017, received in Korean Patent Application No. 2015-7018851, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Office Action, dated Oct. 5, 2018, received in Korean Patent Application No. 2018-7028236, which corresponds with U.S. Appl. No. 14/608,895, 6 pages. |
Office Action, dated Jul. 26, 2017, received in U.S. Appl. No. 14/536,235, 14 pages. |
Final Office Action, dated Feb. 26, 2018, received in U.S. Appl. No. 14/536,235, 13 pages. |
Office Action, dated Apr. 5, 2017, received in U.S. Appl. No. 14/536,367, 16 pages. |
Notice of Allowance, dated Nov. 30, 2017, received in U.S. Appl. No. 14/536,367, 9 pages. |
Notice of Allowance, dated May 16, 2018, received in U.S. Appl. No. 14/536,367, 5 pages. |
Office Action, dated Dec. 17, 2015, received in U.S. Appl. No. 14/536,426, 28 pages. |
Final Office Action, dated May 6, 2016, received in U.S. Appl. No. 14/536,426, 23 pages. |
Office action, dated Aug. 3, 2017, received in U.S. Appl. No. 14/536,426, 10 pages. |
Office Action, dated Jul. 15, 2015, received in Australian Patent Application No. 2013259606, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Notice of Allowance, dated May 23, 2016, received in Australian Patent Application No. 2013259606, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Certificate of Grant, dated Sep. 15, 2016, received in Australian Patent Application No. 2013259606, which corresponds with U.S. Appl. No. 14/536,426, 1 page. |
Office Action, dated Nov. 18, 2015, received in Australian Patent Application No. 2015101231, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Office Action, dated May 15, 2017, received in Australian Patent Application No. 2016216580, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Office Action, dated May 8, 2018, received in Australian Patent Application No. 2016216580, which corresponds with U.S. Appl. No. 14/536,426, 5 pages. |
Notice of Allowance, dated May 17, 2018, received in Australian Patent Application No. 2016216580, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Certificate of Grant, dated Sep. 13, 2018, received in Australian Patent Application No. 2016216580, which corresponds with U.S. Appl. No. 14/536,426, 1 page. |
Office Action, dated Sep. 19, 2017, received in Chinese Patent Application No. 201380035982.1, which corresponds with U.S. Appl. No. 14/536,426, 5 pages. |
Notice of Allowance, dated May 10, 2018, received in Chinese Patent Application No. 201380035982.1, which corresponds with U.S. Appl. No. 14/536,426, 2 pages. |
Patent, dated Aug. 17, 2018, received in Chinese Patent Application No. 201380035982.1, which corresponds with U.S. Appl. No. 14/536,426, 4 pages. |
Office Action, dated Sep. 20, 2017, received in Chinese Patent Application No. 201510566550.4, which corresponds with U.S. Appl. No. 14/536,426, 11 pages. |
Decision to Grant, dated Jul. 14, 2016, received in European Patent Application No. 13724100.6, which corresponds with U.S. Appl. No. 14/536,426, 1 page. |
Letters Patent, dated Aug. 10, 2016, received in European Patent Application No. 13724100.6, which corresponds with U.S. Appl. No. 14/536,426, 1 page. |
Office Action, dated Jan. 20, 2017, received in European Patent Application No. 15183980.0, which corresponds with U.S. Appl. No. 14/536,426, 5 pages. |
Office Action, dated Aug. 21, 2017, received in European Patent Application No. 15183980.0, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Intention to Grant, dated Mar. 9, 2018, received in European Patent Application No. 15183980.0, which corresponds with U.S. Appl. No. 14/536,426, 5 pages. |
Intention to Grant, dated Aug. 14, 2018, received in European Patent Application No. 15183980.0, which corresponds with U.S. Appl. No. 14/536,426, 5 pages. |
Certificate of Grant, dated Nov. 10, 2017, received in Hong Kong Patent Application No. 15107535.0, which corresponds with U.S. Appl. No. 14/536,426, 2 pages. |
Office Action, dated Mar. 4, 2016, received in Japanese Patent Application No. 2015-511644, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Office Action, dated Feb. 6, 2017, received in Japanese Patent Application No. 2015-511644, which corresponds with U.S. Appl. No. 14/536,426, 6 pages. |
Notice of Allowance, dated Dec. 8, 2017, received in Japanese Patent Application No. 2015-511644, which corresponds with U.S. Appl. No. 14/536,426, 6 pages. |
Patent, dated Jan. 12, 2018, received in Japanese Patent Application No. 2015-511644, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Office Action, dated Nov. 6, 2018, received in Japanese Patent Application No. 2018-000753, which corresponds with U.S. Appl. No. 14/536,426, 8 pages. |
Office Action, dated Mar. 9, 2017, received in U.S. Appl. No. 14/536,464, 21 pages. |
Final Office Action, dated Aug. 25, 2017, received in U.S. Appl. No. 14/536,464, 30 pages. |
Office Action, dated Feb. 12, 2018, received in U.S. Appl. No. 14/536,464, 33 pages. |
Final Office Action, dated Jun. 22, 2018, received in U.S. Appl. No. 14/536,464, 32 pages. |
Office Action, dated Sep. 25, 2017, received in U.S. Appl. No. 14/536,644, 29 pages. |
Final Office Action, dated May 3, 2018, received in U.S. Appl. No. 14/536,644, 28 pages. |
Office Action, dated Oct. 19, 2017, received in U.S. Appl. No. 14/608,926, 14 pages. |
Final Office Action, dated Jun. 6, 2018, received in U.S. Appl. No. 14/608,926, 19 pages. |
Office Action, dated Feb. 1, 2016, received in Australian Patent Application No. 2013368441, which corresponds with U.S. Appl. No. 14/608,926, 3 pages. |
Notice of Allowance, dated Mar. 30, 2016, received in Australian Patent Application No. 2013368441, which corresponds with U.S. Appl. No. 14/608,926, 1 page. |
Certificate of Grant, dated Jul. 29, 2016, received in Australian Patent Application No. 2013368441, which corresponds with U.S. Appl. No. 14/608,926, 1 page. |
Office Action, dated Jan. 3, 2017, received in Australian Patent Application No. 2016201451, which corresponds with U.S. Appl. No. 14/608,926, 3 pages. |
Notice of Acceptance, dated Dec. 20, 2017, received in Australian Patent Application No. 2016201451, which corresponds with U.S. Appl. No. 14/608,926, 3 pages. |
Certificate of Grant, dated May 3, 2018, received in Australian Patent Application No. 2016201451, which corresponds with U.S. Appl. No. 14/608,926, 1 page. |
Office Action, dated May 4, 2017, received in Chinese Patent Application No. 201380068414.1, which corresponds with U.S. Appl. No. 14/608,926, 5 pages. |
Notice of Allowance, dated Feb. 8, 2018, received in Chinese Patent Application No. 201380068414.1, which corresponds with U.S. Appl. No. 14/608,926, 2 pages. |
Patent, dated May 4, 2018, received in Chinese Patent Application No. 201380068414.1, which corresponds with U.S. Appl. No. 14/608,926, 4 pages. |
Office Action, dated Apr. 21, 2016, received in European Patent Application No. 13795392.3, which corresponds with U.S. Appl. No. 14/608,926, 6 pages. |
Office Action, dated May 6, 2016, received in European Patent Application No. 13795392.3, which corresponds with U.S. Appl. No. 14/608,926, 6 pages. |
Office Action, dated Nov. 11, 2016, received in European Patent Application No. 13795392.3, which corresponds with U.S. Appl. No. 14/608,926, 6 pages. |
Office Action, dated Jul. 4, 2017, received in European Patent Application No. 13795392.3, which corresponds with U.S. Appl. No. 14/608,926, 4 pages. |
Oral Summons, dated Feb. 13, 2017, received in European Patent Application No. 13795392.3, which corresponds with U.S. Appl. No. 14/608,926, 11 pages. |
Office Action, dated Mar. 14, 2016, received in Japanese Patent Application No. 2015-549392, which corresponds with U.S. Appl. No. 14/608,926, 4 pages. |
Notice of Allowance, dated Jan. 17, 2017, received in Japanese Patent Application No. 2015-549392, which corresponds with U.S. Appl. No. 14/608,926, 2 pages. |
Patent, dated Feb. 17, 2017, received in Japanese Patent Application No. 2015-549392, which corresponds with U.S. Appl. No. 14/608,926, 3 pages. |
Patent, dated Apr. 27, 2018, received in Japanese Patent Application No. 2017-024234, which corresponds with U.S. Appl. No. 14/608,926, 3 pages. |
Office Action, dated May 12, 2016, received in Korean Patent Application No. 10-2015-7018853, which corresponds with U.S. Appl. No. 14/608,926, 4 pages. |
Notice of Allowance, dated Mar. 31, 2017, received in Korean Patent Application No. 2015-7018853, which corresponds with U.S. Appl. No. 14/608,926, 4 pages. |
Patent, dated Jun. 30, 2017, received in Korean Patent Application No. 2015-7018853, which corresponds with U.S. Appl. No. 14/608,926, 3 pages. |
Office Action, dated Aug. 22, 2017, received in Korean Patent Application No. 2017-7018250, which corresponds with U.S. Appl. No. 14/608,926, 2 pages. |
Notice of Allowance, dated Dec. 29, 2017, received in Korean Patent Application No. 2017-7018250, which corresponds with U.S. Appl. No. 14/608,926, 3 pages. |
Office Action, dated Jul. 17, 2015, received in Australian Patent Application No. 2013259613, which corresponds with U.S. Appl. No. 14/536,646, 5 pages. |
Office Action, dated May 31, 2016, received in Australian Patent Application No. 2013259613, which corresponds with U.S. Appl. No. 14/536,646, 4 pages. |
Notice of Allowance, dated Jul. 5, 2016, received in Australian Patent Application No. 2013259613, which corresponds with U.S. Appl. No. 14/536,646, 3 pages. |
Office Action, dated Dec. 1, 2016, received in Chinese Patent Application No. 201380036205.9, which corresponds with U.S. Appl. No. 14/536,646, 3 pages. |
Notice of Allowance, dated Oct. 9, 2017, received in Chinese Patent Application No. 201380036205.9, which corresponds with U.S. Appl. No. 14/536,646, 3 pages. |
Office Action, dated Oct. 19, 2017, received in U.S. Appl. No. 14/536,646, 21 pages. |
Notice of Allowance, dated Aug. 9, 2018, received in U.S. Appl. No. 14/536,646, 5 pages. |
Office Action, dated Nov. 12, 2015, received in European Patent Application No. 13724102.2, which corresponds with U.S. Appl. No. 14/536,646, 6 pages. |
Office Action, dated May 31, 2016, received in European Patent Application No. 13724102.2, which corresponds with U.S. Appl. No. 14/536,646, 5 pages. |
Notice of Allowance, dated Jan. 4, 2017, received in European Patent Application No. 13724102.2, which corresponds with U.S. Appl. No. 14/536,646, 5 pages. |
Patent, dated May 26, 2017, received in European Patent Application No. 13724102.2, which corresponds with U.S. Appl. No. 14/536,646, 1 page. |
Office Action, dated Feb. 29, 2016, received in Japanese Patent Application No. 2015-511645, which corresponds with U.S. Appl. No. 14/536,646, 5 pages. |
Notice of Allowance, dated Dec. 22, 2016, received in Japanese Patent Application No. 2015-511645, which corresponds with U.S. Appl. No. 14/536,646, 2 pages. |
Office Action, dated Apr. 3, 2017, received in U.S. Appl. No. 14/536,141, 11 pages. |
Notice of Allowance, dated Sep. 20, 2017, received in U.S. Appl. No. 14/536,141, 10 pages. |
Office Action, dated Aug. 27, 2015, received in Australian Patent Application No. 2013259614, which corresponds with U.S. Appl. No. 14/536,141, 4 pages. |
Notice of Allowance, dated Aug. 15, 2016, received in Australian Patent Application No. 2013259614, which corresponds with U.S. Appl. No. 14/536,141, 1 page. |
Office Action, dated Jul. 21, 2017, received in Australian Patent Application No. 2016262773, which corresponds with U.S. Appl. No. 14/536,141, 3 pages. |
Notice of Acceptance, dated Jul. 19, 2018, received in Australian Patent Application No. 2016262773, which corresponds with U.S. Appl. No. 14/536,141, 3 pages. |
Office Action, dated Mar. 3, 2017, received in Chinese Patent Application No. 201380035893.7, which corresponds with U.S. Appl. No. 14/536,141, 8 pages. |
Office Action, dated Feb. 2, 2018, received in Chinese Patent Application No. 201380035893.7, which corresponds with U.S. Appl. No. 14/536,141, 5 pages. |
Notice of Allowance, dated Aug. 31, 2018, received in Chinese Patent Application No. 201380035893.7, which corresponds with U.S. Appl. No. 14/536,141, 6 pages. |
Office Action, dated Jan. 7, 2016, received in European Patent Application No. 13726053.5, which corresponds with U.S. Appl. No. 14/536,141, 10 pages. |
Office Action, dated Aug. 31, 2016, received in European Patent Application No. 13726053.5, which corresponds with U.S. Appl. No. 14/536,141, 10 pages. |
Office Action, dated Apr. 9, 2018, received in European Patent Application No. 13726053.5, which corresponds with U.S. Appl. No. 14/536,141, 9 pages. |
Office Action, dated Feb. 29, 2016, received in Japanese Patent Application No. 2015-511646, which corresponds with U.S. Appl. No. 14/536,141, 3 pages. |
Office Action, dated Oct. 25, 2016, received in Japanese Patent Application No. 2015-511646, which corresponds with U.S. Appl. No. 14/536,141, 6 pages. |
Notice of Allowance, dated Jun. 30, 2017, received in Japanese Patent Application No. 2015-511646, which corresponds with U.S. Appl. No. 14/536,141, 5 pages. |
Patent, dated Jul. 28, 2017, received in Japanese Patent Application No. 2015-511646, which corresponds with U.S. Appl. No. 14/536,141, 3 pages. |
Office Action, dated Aug. 13, 2018, received in Japanese Patent Application No. 2017-141953, which corresponds with U.S. Appl. No. 14/536,141, 6 pages. |
Office Action, dated Dec. 8, 2016, received in U.S. Appl. No. 14/608,942, 9 pages. |
Notice of Allowance, dated May 12, 2017, received in U.S. Appl. No. 14/608,942, 10 pages. |
Office Action, dated Jan. 29, 2016, received in Australian Patent Application No. 2013368443, which corresponds with U.S. Appl. No. 14/608,942, 3 pages. |
Notice of Allowance, dated Mar. 11, 2016, received in Australian Patent Application No. 2013368443, which corresponds with U.S. Appl. No. 14/608,942, 2 pages. |
Certificate of Grant, dated Jul. 7, 2016, received in Australian Patent Application No. 2013368443, which corresponds with U.S. Appl. No. 14/608,942, 3 pages. |
Office Action, dated Mar. 29, 2017, received in Australian Patent Application No. 2016201303, which corresponds with U.S. Appl. No. 14/608,942, 3 pages. |
Notice of Acceptance, dated Mar. 7, 2018, received in Australian Patent Application No. 2016201303, which corresponds with U.S. Appl. No. 14/608,942, 3 pages. |
Certificate of Grant, dated Jul. 5, 2018, received in Australian Patent Application No. 2016201303, which corresponds with U.S. Appl. No. 14/608,942, 4 pages. |
Office Action, dated Jun. 16, 2017, received in Chinese Patent Application No. 201380068295.X, which corresponds with U.S. Appl. No. 14/608,942, 6 pages. |
Office Action, dated Mar. 28, 2018, received in Chinese Patent Application No. 201380068295.X, which corresponds with U.S. Appl. No. 14/608,942, 5 pages. |
Office Action, dated Oct. 8, 2018, received in Chinese Patent Application No. 201380068295.X, which corresponds with U.S. Appl. No. 14/608,942, 3 pages. |
Office Action, dated Oct. 7, 2016, received in European Patent Application No. 13798464.7, which corresponds with U.S. Appl. No. 14/608,942, 7 pages. |
Decision to Grant, dated Sep. 13, 2018, received in European Patent Application No. 13798464.7, which corresponds with U.S. Appl. No. 14/608,942, 2 pages. |
Office Action, dated Jul. 4, 2016, received in Japanese Patent Application No. 2015-549393, which corresponds with U.S. Appl. No. 14/608,942, 4 pages. |
Notice of Allowance, dated May 12, 2017, received in Japanese Patent Application No. 2015-549393, which corresponds with U.S. Appl. No. 14/608,942, 5 pages. |
Patent, dated Jun. 16, 2017, received in Japanese Patent Application No. 2015-549393, which corresponds with U.S. Appl. No. 14/608,942, 3 pages. |
Office Action, dated Apr. 5, 2016, received in Korean Patent Application No. 2015-7018448, which corresponds with U.S. Appl. No. 14/608,942, 6 pages. |
Office Action, dated Feb. 24, 2017, received in Korean Patent Application No. 2015-7018448, which corresponds with U.S. Appl. No. 14/608,942, 4 pages. |
Office Action, dated Jul. 17, 2017, received in U.S. Appl. No. 14/536,166, 19 pages. |
Notice of Allowance, dated Feb. 28, 2018, received in U.S. Appl. No. 14/536,166, 5 pages. |
Office Action, dated Aug. 1, 2016, received in U.S. Appl. No. 14/536,203, 14 pages. |
Notice of Allowance, dated Feb. 1, 2017, received in U.S. Appl. No. 14/536,203, 9 pages. |
Office Action, dated Jul. 9, 2015, received in Australian Patent Application No. 2013259630, which corresponds with U.S. Appl. No. 14/536,203, 3 pages. |
Notice of Allowance, dated Jun. 15, 2016, received in Australian Patent Application No. 2013259630, which corresponds with U.S. Appl. No. 14/536,203, 3 pages. |
Certificate of Grant, dated Oct. 21, 2016, received in Australian Patent Application No. 2013259630, which corresponds with U.S. Appl. No. 14/536,203, 3 pages. |
Office Action, dated Jul. 4, 2017, received in Australian Patent Application No. 2016238917, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Notice of Acceptance, dated Jul. 19, 2018, received in Australian Patent Application No. 2016238917, which corresponds with U.S. Appl. No. 14/536,203, 3 pages. |
Certificate of Grant, dated Nov. 1, 2018, received in Australian Patent Application No. 2016238917, which corresponds with U.S. Appl. No. 14/536,203, 1 page. |
Office Action, dated Oct. 25, 2017, received in Chinese Patent Application No. 201380035977.0, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Notice of Allowance, dated Apr. 4, 2018, received in Chinese Patent Application No. 201380035977.0, which corresponds with U.S. Appl. No. 14/536,203, 3 pages. |
Patent, dated Jul. 6, 2018, received in Chinese Patent Application No. 201380035977.0, which corresponds with U.S. Appl. No. 14/536,203, 4 pages. |
Office Action, dated Nov. 11, 2015, received in European Patent Application No. 13724104.8, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Office Action, dated May 31, 2016, received in European Patent Application No. 13724104.8, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Office Action, dated Dec. 6, 2017, received in European Patent Application No. 13724104.8, which corresponds with U.S. Appl. No. 14/536,203, 9 pages. |
Decision to Grant, dated Oct. 24, 2018, received in European Patent Application No. 13724104.8, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Office Action, dated Feb. 15, 2016, received in Japanese Patent Application No. 2015-511650, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Notice of Allowance, dated Aug. 5, 2016, received in Japanese Patent Application No. 2015-511650, which corresponds with U.S. Appl. No. 14/536,203, 4 pages. |
Certificate of Patent, dated Sep. 9, 2016, received in Japanese Patent Application No. 2015-511650, which corresponds with U.S. Appl. No. 14/536,203, 3 pages. |
Office Action, dated Jun. 23, 2017, received in Japanese Patent Application No. 2016-173113, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Notice of Allowance, dated Jan. 12, 2018, received in Japanese Patent Application No. 2016-173113, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Patent, dated Feb. 16, 2018, received in Japanese Patent Application No. 2016-173113, which corresponds with U.S. Appl. No. 14/536,203, 3 pages. |
Office Action, dated Oct. 19, 2018, received in Japanese Patent Application No. 2018-022394, which corresponds with U.S. Appl. No. 14/536,203, 4 pages. |
Office Action, dated Dec. 4, 2015, received in Korean Patent Application No. 2014-7034520, which corresponds with U.S. Appl. No. 14/536,203, 4 pages. |
Notice of Allowance, dated Sep. 1, 2016, received in Korean Patent Application No. 2014-7034520, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Office Action, dated Feb. 6, 2017, received in Korean Patent Application No. 2016-7033834, which corresponds with U.S. Appl. No. 14/536,203, 4 pages. |
Notice of Allowance, dated Oct. 30, 2017, received in Korean Patent Application No. 2016-7033834, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Patent, dated Jan. 23, 2018, received in Korean Patent Application No. 2016-7033834, which corresponds with U.S. Appl. No. 14/536,203, 4 pages. |
Office Action, dated Oct. 20, 2017, received in U.S. Appl. No. 14/608,965, 14 pages. |
Office Action, dated Jul. 2, 2018, received in U.S. Appl. No. 14/608,965, 16 pages. |
Office Action, dated Oct. 11, 2017, received in Chinese Patent Application No. 201380074060.1, which corresponds with U.S. Appl. No. 14/608,965, 5 pages. |
Office Action, dated Aug. 1, 2018, received in Chinese Patent Application No. 201380074060.1, which corresponds with U.S. Appl. No. 14/608,965, 5 pages. |
Office Action, dated Jul. 22, 2016, received in European Patent Application No. 13798465.4, which corresponds with U.S. Appl. No. 14/608,965, 8 pages. |
Oral Proceedings, dated Mar. 7, 2018, received in European Patent Application No. 13798465.4, which corresponds with U.S. Appl. No. 14/608,965, 5 pages. |
Decision to Grant, dated Sep. 6, 2018, received in European Patent Application No. 13798465.4, which corresponds with U.S. Appl. No. 14/608,965, 2 pages. |
Office Action, dated Oct. 20, 2016, received in U.S. Appl. No. 14/536,247, 10 pages. |
Final Office Action, dated Mar. 24, 2017, received in U.S. Appl. No. 14/536,247, 14 pages. |
Notice of Allowance, dated Nov. 22, 2017, received in U.S. Appl. No. 14/536,247, 6 pages. |
Office Action, dated Mar. 24, 2017, received in U.S. Appl. No. 14/536,267, 12 pages. |
Notice of Allowance, dated Nov. 9, 2017, received in U.S. Appl. No. 14/536,267, 8 pages. |
Notice of Allowance, dated Jun. 1, 2018, received in U.S. Appl. No. 14/536,267, 5 pages. |
Office Action, dated Aug. 10, 2015, received in Australian Patent Application No. 2013259637, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Notice of Allowance, dated Jun. 28, 2016, received in Australian Patent Application No. 2013259637, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Certificate of Grant, dated Oct. 21, 2016, received in Australian Patent Application No. 2013259637, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Office Action, dated Mar. 24, 2017, received in Australian Patent Application No. 2016204411, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Notice of Acceptance, dated Feb. 27, 2018, received in Australian Patent Application No. 2016204411, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Certificate of Grant, dated Jun. 28, 2018, received in Australian Patent Application No. 2016204411, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Office Action, dated Dec. 9, 2016, received in Chinese Patent Application No. 2016120601564130, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Notice of Allowance, dated Jan. 29, 2018, received in Chinese Patent Application No. 201380035968.1, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Patent, dated Apr. 20, 2018, received in Chinese Patent Application No. 201380035968.1, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Office Action, dated Jun. 13, 2018, received in Chinese Patent Application No. 201810332044.2, which corresponds with U.S. Appl. No. 14/536,267, 2 pages. |
Office Action, dated Jan. 25, 2018, received in European Patent Application No. 13724106.3, which corresponds with U.S. Appl. No. 14/536,267, 5 pages. |
Intention to Grant, dated Jun. 27, 2018, received in European Patent Application No. 13724106.3, which corresponds with U.S. Appl. No. 14/536,267, 5 pages. |
Decision to Grant, dated Oct. 18, 2018, received in European Patent Application No. 13724106.3, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Office Action, dated Sep. 13, 2017, received in European Patent Application No. 16177863.4, which corresponds with U.S. Appl. No. 14/536,267, 6 pages. |
Office Action, dated Jan. 29, 2016, received in Japanese Patent Application No. 2015-511652, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Notice of Allowance, dated Sep. 26, 2016, received in Japanese Patent Application No. 2015-511652, which corresponds with U.S. Appl. No. 14/536,267, 5 pages. |
Office Action, dated Mar. 3, 2017, received in Japanese Patent Application No. 2016-125839, which corresponds with U.S. Appl. No. 14/536,267, 6 pages. |
Notice of Allowance, dated Nov. 17, 2017, received in Japanese Patent Application No. 2016-125839, which corresponds with U.S. Appl. No. 14/536,267, 5 pages. |
Office Action, dated Dec. 4, 2015, received in Korean Patent Application No. 2014-7034530, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Notice of Allowance, dated Sep. 1, 2016, received in Korean Patent Application No. 2014-7034530, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Office Action, dated Jan. 5, 2017, received in Korean Patent Application No. 2016-7029533, which corresponds with U.S. Appl. No. 14/536,267, 2 pages. |
Notice of Allowance, dated Sep. 1, 2017, received in Korean Patent Application No. 2016-7029533, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Patent, dated Dec. 1, 2017, received in Korean Patent Application No. 2016-7029533, which corresponds with U.S. Appl. No. 14/536,267, 2 pages. |
Office Action, dated Apr. 7, 2017, received in U.S. Appl. No. 14/536,291, 11 pages. |
Notice of Allowance, dated Dec. 1, 2017, received in U.S. Appl. No. 14/536,291, 19 pages. |
Notice of Allowance, dated Mar. 20, 2018, received in U.S. Appl. No. 14/536,291, 5 pages. |
Office Action, dated Aug. 18, 2015, received in Australian Patent Application No. 2013259642, which corresponds with U.S. Appl. No. 14/536,291, 3 pages. |
Office Action, dated Jul. 25, 2016, received in Australian Patent Application No. 2013259642, which corresponds with U.S. Appl. No. 14/536,291, 3 pages. |
Office Action, dated Aug. 10, 2016, received in Australian Patent Application No. 2013259642, which corresponds with U.S. Appl. No. 14/536,291, 4 pages. |
Office Action, dated Jul. 21, 2017, received in Australian Patent Application No. 2016216658, which corresponds with U.S. Appl. No. 14/536,291, 3 pages. |
Notice of Acceptance, dated Jul. 19, 2018, received in Australian Patent Application No. 2016216658, which corresponds with U.S. Appl. No. 14/536,291, 3 pages. |
Innovation Patent, dated Sep. 1, 2016, received in Australian Patent Application No. 2016101481, which corresponds with U.S. Appl. No. 14/536,291, 1 page. |
Office Action, dated Sep. 29, 2016, received in Australian Patent Application No. 2016101481, which corresponds with U.S. Appl. No. 14/536,291, 3 pages. |
Office Action, dated Oct. 23, 2017, received in Chinese Patent Application No. 201380035986.X, which corresponds with U.S. Appl. No. 14/536,291, 9 pages. |
Office Action, dated Jan. 7, 2016, received in European Patent Application No. 13724107.1, which corresponds with U.S. Appl. No. 14/536,291, 11 pages. |
Office Action, dated Aug. 22, 2016, received in European Patent Application No. 13724107.1, which corresponds with U.S. Appl. No. 14/536,291, 7 pages. |
Office Action, dated Mar. 23, 2017, received in European Patent Application No. 13724107.1, which corresponds with U.S. Appl. No. 14/536,291, 8 pages. |
Office Action, dated Mar. 8, 2016, received in Japanese Patent Application No. 2015-511655, which corresponds with U.S. Appl. No. 14/536,291, 4 pages. |
Final Office Action, dated Dec. 22, 2016, received in Japanese Patent Application No. 2015-511655, which corresponds with U.S. Appl. No. 14/536,291, 3 pages. |
Office Action, dated Jun. 29, 2018, received in Japanese Patent Application No. 2017-083027, which corresponds with U.S. Appl. No. 14/536,291, 5 pages. |
Office Action, dated Oct. 19, 2017, received in U.S. Appl. No. 14/608,985, 13 pages. |
Notice of Allowance, dated Apr. 20, 2018, received in U.S. Appl. No. 14/608,985, 5 pages. |
Office Action, dated Jan. 15, 2016, received in Australian Patent Application No. 2013368445, which corresponds with U.S. Appl. No. 14/608,985, 3 pages. |
Notice of Allowance, dated Jan. 18, 2017, received in Australian Patent Application No. 2013368445, which corresponds with U.S. Appl. No. 14/608,985, 3 pages. |
Patent, dated May 18, 2017, received in Australian Patent Application No. 2013368445, which corresponds with U.S. Appl. No. 14/608,985, 1 page. |
Office Action, dated May 19, 2017, received in Chinese Patent Application No. 201380068399.0, which corresponds with U.S. Appl. No. 14/608,985, 5 pages. |
Notice of Allowance, dated Sep. 19, 2017, received in Chinese Patent Application No. 201380068399.0, which corresponds with U.S. Appl. No. 14/608,985, 3 pages. |
Patent, dated Dec. 8, 2017, received in Chinese Patent Application No. 201380068399.0, which corresponds with U.S. Appl. No. 14/608,985, 4 pages. |
Office Action, dated Jul. 25, 2016, received in European Patent Application No. 13811032.5, which corresponds with U.S. Appl. No. 14/608,985, 8 pages. |
Office Action, dated Feb. 27, 2017, received in European Patent Application No. 13811032.5, which corresponds with U.S. Appl. No. 14/608,985, 6 pages. |
Summons, dated Oct. 6, 2017, received in European Patent Application No. 13811032.5, which corresponds with U.S. Appl. No. 14/608,985, 6 pages. |
Certificate of Grant, dated Jun. 29, 2018, received in Hong Kong Patent Application No. 15112851.6, which corresponds with U.S. Appl. No. 14/608,985, 2 pages. |
Office Action, dated Apr. 25, 2016, received in Japanese Patent Application No. 2015-550384, which corresponds with U.S. Appl. No. 14/608,985, 4 pages. |
Notice of Allowance, dated Jan. 24, 2017, received in Japanese Patent Application No. 2015-550384, which corresponds with U.S. Appl. No. 14/608,985, 5 pages. |
Patent, dated Feb. 24, 2017, received in Japanese Patent Application No. 2015-550384, which corresponds with U.S. Appl. No. 14/608,985, 2 pages. |
Office Action, dated Nov. 4, 2016, received in Korean Patent Application No. 2015-7019984, which corresponds with U.S. Appl. No. 14/608,985, 8 pages. |
Notice of Allowance, dated Sep. 19, 2017, received in Korean Patent Application No. 2015-7019984, which corresponds with U.S. Appl. No. 14/608,985, 4 pages. |
Patent, dated Dec. 19, 2017, received in Korean Patent Application No. 2015-7019984, which corresponds with U.S. Appl. No. 14/608,985, 3 pages. |
Office Action, dated Mar. 24, 2017, received in U.S. Appl. No. 14/609,006, 13 pages. |
Final Office Action, dated Sep. 21, 2017, received in U.S. Appl. No. 14/609,006, 17 pages. |
Office Action, dated Mar. 20, 2018, received in U.S. Appl. No. 14/609,006, 13 pages. |
Office Action, dated Oct. 11, 2018, received in U.S. Appl. No. 14/609,006, 12 pages. |
Office Action, dated Apr. 19, 2017, received in U.S. Appl. No. 14/536,296, 12 pages. |
Final Office Action, dated Nov. 2, 2017, received in U.S. Appl. No. 14/536,296, 13 pages. |
Notice of Allowance, dated Mar. 14, 2018, received in U.S. Appl. No. 14/536,296, 8 pages. |
Office Action, dated Nov. 1, 2017, received in U.S. Appl. No. 14/536,648, 22 pages. |
Final Office Action, dated Aug. 7, 2018, received in U.S. Appl. No. 14/536,648, 14 pages. |
Office Action, dated Jul. 21, 2017, received in Australian Patent Application No. 2016247194, which corresponds with U.S. Appl. No. 14/536,648, 3 pages. |
Notice of Acceptance, dated Jul. 19, 2018, received in Australian Patent Application No. 2016247194, which corresponds with U.S. Appl. No. 14/536,648, 3 pages. |
Office Action, dated Apr. 27, 2018, received in Japanese Patent Application No. 2017-008764, which corresponds with U.S. Appl. No. 14/536,648, 5 pages. |
Office Action, dated Jan. 19, 2017, received in U.S. Appl. No. 14/609,042, 12 pages. |
Notice of Allowance, dated Jul. 10, 2017, received in U.S. Appl. No. 14/609,042, 8 pages. |
Office Action, dated Aug. 24, 2018, received in Japanese Patent Application No. 2017-113598, which corresponds with U.S. Appl. No. 14/609,042, 6 pages. |
Office Action, dated Mar. 31, 2016, received in U.S. Appl. No. 14/864,737, 17 pages. |
Notice of Allowance, dated Feb. 27, 2017, received in U.S. Appl. No. 14/864,737, 9 pages. |
Notice of Allowance, dated Jun. 19, 2017, received in U.S. Appl. No. 14/864,737, 8 pages. |
Office Action, dated Apr. 16, 2018, received in Australian Patent Application No. 2016233792, which corresponds with U.S. Appl. No. 14/864,737, 2 pages. |
Office Action, dated Sep. 11, 2018, received in Chinese Patent Application No. 201610159295.6, which corresponds with U.S. Appl. No. 14/864,737, 6 pages. |
Notice of Allowance, dated Jul. 1, 2016, received in Chinese Patent Application No. 201620214376.7, which corresponds with U.S. Appl. No. 14/864,737, 3 pages. |
Patent, dated Aug. 3, 2016, received in Chinese Patent Application No. 201620214376.7, which corresponds with U.S. Appl. No. 14/864,737, 5 pages. |
Certificate of Registration, dated Jun. 20, 2016, received in German Patent Application No. 202016001845.1, which corresponds with U.S. Appl. No. 14/864,737, 3 pages. |
Office Action, dated Apr. 5, 2016, received in Danish Patent Application No. 201500577, which corresponds with U.S. Appl. No. 14/864,737, 7 pages. |
Intention to Grant, dated Aug. 2, 2016, received in Danish Patent Application No. 201500577, which corresponds with U.S. Appl. No. 14/864,737, 2 pages. |
Decision to Grant, dated Mar. 29, 2018, received in European Patent Application No. 16710871.1, which corresponds with U.S. Appl. No. 14/864,737, 2 pages. |
Grant Certificate, dated Apr. 25, 2018, received in European Patent Application No. 16710871.1, which corresponds with U.S. Appl. No. 14/864,737, 2 pages. |
Office Action, dated May 15, 2017, received in Japanese Patent Application No. 2016-558331, which corresponds with U.S. Appl. No. 14/864,737, 5 pages. |
Notice of Allowance, dated Jun. 23, 2017, received in Japanese Patent Application No. 2016-558331, which corresponds with U.S. Appl. No. 14/864,737, 5 pages. |
Patent, dated Jul. 28, 2017, received in Japanese Patent Application No. 2016-558331, which corresponds with U.S. Appl. No. 14/864,737, 3 pages. |
Office Action, dated Feb. 14, 2018, received in Korean Patent Application No. 2017-7030129, which corresponds with U.S. Appl. No. 14/864,737, 17 pages. |
Patent, dated Jul. 12, 2017, received in Dutch Patent Application No. 2016452, which corresponds with U.S. Appl. No. 14/864,737, 2 pages. |
Office Action, dated Jun. 27, 2016, received in U.S. Appl. No. 14/866,981, 22 pages. |
Notice of Allowance, dated Oct. 24, 2016, received in U.S. Appl. No. 14/866,981, 7 pages. |
Notice of Allowance, dated Feb. 10, 2017, received in U.S. Appl. No. 14/866,981, 5 pages. |
Office Action, dated May 10, 2016, received in Australian Patent Application No. 2016100254, which corresponds with U.S. Appl. No. 14/866,981, 6 pages. |
Patent, dated Nov. 2, 2016, received in Australian Patent Application No. 2016100254, which corresponds with U.S. Appl. No. 14/866,981, 1 page. |
Notice of Allowance, dated Jul. 27, 2016, received in Chinese Patent Application No. 201620176169.7, which corresponds with U.S. Appl. No. 14/866,981, 3 pages. |
Patent, dated Sep. 28, 2016, received in Chinese Patent Application No. 201620176169.7, which corresponds with U.S. Appl. No. 14/866,981, 4 pages. |
Certificate of Registration, dated Jun. 20, 2016, received in German Patent Application No. 202016001514.2, which corresponds with U.S. Appl. No. 14/864,737, 3 pages. |
Office Action, dated Mar. 18, 2016, received in Danish Patent Application No. 201500575, which corresponds with U.S. Appl. No. 14/866,981, 9 pages. |
Office Action, dated Dec. 5, 2016, received in Danish Patent Application No. 201500575, which corresponds with U.S. Appl. No. 14/866,981, 3 pages. |
Office Action, dated Jul. 7, 2017, received in Danish Patent Application No. 201500575, which corresponds with U.S. Appl. No. 14/866,981, 4 pages. |
Patent, dated Nov. 16, 2017, received in Dutch Patent Application No. 2016375, which corresponds with U.S. Appl. No. 14/866,981, 2 pages. |
Office Action, dated Dec. 15, 2017, received in U.S. Appl. No. 14/866,159, 35 pages. |
Notice of Allowance, dated May 18, 2018, received in U.S. Appl. No. 14/866,159, 8 pages. |
Office Action, dated May 19, 2016, received in Australian Patent Application No. 2016100251, which corresponds with U.S. Appl. No. 14/866,159, 5 pages. |
Office Action, dated Jun. 5, 2018, received in Chinese Patent Application No. 201610137839.9, which corresponds with U.S. Appl. No. 14/866,159, 11 pages. |
Office Action, dated Jul. 5, 2016, received in Chinese Patent Application No. 201620186008.6, which corresponds with U.S. Appl. No. 14/866,159, 3 pages. |
Certificate of Registration, dated Jun. 16, 2016, received in German Patent Application No. 202016001483.9, which corresponds with U.S. Appl. No. 14/866,159, 3 pages. |
Office Action, dated Mar. 9, 2016, received in Danish Patent Application No. 201500574, which corresponds with U.S. Appl. No. 14/866,159, 11 pages. |
Office Action, dated Sep. 27, 2016, received in Danish Patent Application No. 201500574, which corresponds with U.S. Appl. No. 14/866,159, 4 pages. |
Office Action, dated Mar. 14, 2017, received in Danish Patent Application No. 201500574, which corresponds with U.S. Appl. No. 14/866,159, 5 pages. |
Office Action, dated Jul. 6, 2017, received in Danish Patent Application No. 201500574, which corresponds with U.S. Appl. No. 14/866,159, 3 pages. |
Office Action, dated Jan. 10, 2018, received in Danish Patent Application No. 201500574, which corresponds with U.S. Appl. No. 14/866,159, 2 pages. |
Notice of Allowance, dated Mar. 21, 2018, received in Danish Patent Application No. 201500574, which corresponds with U.S. Appl. No. 14/866,159, 2 pages. |
Patent, dated May 22, 2018, received in Danish Patent Application No. 201500574, which corresponds with U.S. Appl. No. 14/866,159, 2 pages. |
Patent, dated Sep. 7, 2017, received in Dutch Patent Application No. 2016377, which corresponds with U.S. Appl. No. 14/866,159, 4 pages. |
Office Action, dated Oct. 6, 2017, received in U.S. Appl. No. 14/868,078, 40 pages. |
Notice of Allowance, dated May 24, 2018, received in U.S. Appl. No. 14/868,078, 6 pages. |
Innovation Patent, dated Aug. 4, 2016, received in Australian Patent Application No. 2016101201, which corresponds with U.S. Appl. No. 14/868,078, 1 page. |
Office Action, dated Oct. 12, 2016, received in Australian Patent Application No. 2016101201, which corresponds with U.S. Appl. No. 14/868,078, 3 pages. |
Notice of Allowance, dated Sep. 1, 2017, received in Australian Patent Application No. 2016229421, which corresponds with U.S. Appl. No. 14/868,078, 3 pages. |
Certificate of Grant, dated Jan. 3, 2018, received in Australian Patent Application No. 2016229421, which corresponds with U.S. Appl. No. 14/868,078, 1 page. |
Office Action, dated Aug. 20, 2018, received in Chinese Patent Application No. 201610130348.1, which corresponds with U.S. Appl. No. 14/868,078, 6 pages. |
Notice of Allowance, dated Oct. 1, 2016, received in Chinese Patent Application No. 201620175847.8, which corresponds with U.S. Appl. No. 14/868,078, 1 page. |
Certificate of Registration, dated Jun. 30, 2016, received in German Patent Application No. 20201600156.9, which corresponds with U.S. Appl. No. 14/868,078, 3 pages. |
Office Action, dated Mar. 30, 2016, received in Danish Patent Application No. 201500588, which corresponds with U.S. Appl. No. 14/868,078, 9 pages. |
Office Action, dated Sep. 2, 2016, received in Danish Patent Application No. 201500588, which corresponds with U.S. Appl. No. 14/868,078, 4 pages. |
Notice of Allowance, dated Jan. 30, 2017, received in Danish Patent Application No. 201500588, which corresponds with U.S. Appl. No. 14/868,078, 2 pages. |
Notice of Allowance, dated May 2, 2017, received in Danish Patent Application No. 201500588, which corresponds with U.S. Appl. No. 14/868,078, 2 pages. |
Patent, dated Sep. 11, 2017, received in Danish Patent Application No. 201500588, which corresponds with U.S. Appl. No. 14/868,078, 5 pages. |
Office Action, dated Apr. 25, 2018, received in European Patent Application No. 16708916.8, which corresponds with U.S. Appl. No. 14/868,078, 6 pages. |
Office Action, dated Oct. 25, 2018, received in European Patent Application No. 17184437.6, which corresponds with U.S. Appl. No. 14/868,078, 6 pages. |
Patent, dated Jul. 12, 2017, received in Dutch Patent Application No. 2016376, which corresponds with U.S. Appl. No. 14/868,078, 2 pages. |
Office Action, dated May 9, 2016, received in U.S. Appl. No. 14/863,432, 26 pages. |
Notice of Allowance, dated Nov. 14, 2016, received in U.S. Appl. No. 14/863,432, 7 pages. |
Notice of Allowance, dated Apr. 27, 2017, received in U.S. Appl. No. 14/863,432, 7 pages. |
Notice of Allowance, dated Sep. 18, 2017, received in U.S. Appl. No. 14/863,432, 8 pages. |
Office Action, dated Aug. 19, 2016, received in Australian Patent Application No. 2016100647, which corresponds with U.S. Appl. No. 14/863,432, 5 pages. |
Notice of Allowance, dated Jan. 12, 2017, received in Chinese Patent Application No. 201620470063.8, which corresponds with U.S. Appl. No. 14/863,432, 1 page. |
Office Action, dated Apr. 4, 2016, received in Danish Patent Application No. 201500582, which corresponds with U.S. Appl. No. 14/863,432, 10 pages. |
Office Action, dated Oct. 7, 2016, received in Danish Patent Application No. 201500582, which corresponds with U.S. Appl. No. 14/863,432, 6 pages. |
Office Action, dated Jun. 12, 2017, received in Danish Patent Application No. 201500582, which corresponds with U.S. Appl. No. 14/863,432, 5 pages. |
Grant, dated Jul. 21, 2017, received in Dutch Patent Application No. 2016801, which corresponds with U.S. Appl. No. 14/871,227, 8 pages. |
Office Action, dated Oct. 13, 2016, received in U.S. Appl. No. 14/866,511, 27 pages. |
Final Office Action, dated Jan. 27, 2017, received in U.S. Appl. No. 14/866,511, 26 pages. |
Notice of Allowance, dated Oct. 4, 2017, received in U.S. Appl. No. 14/866,511, 37 pages. |
Patent, dated Aug. 8, 2016, received in Australian Patent Application 2016100653, which corresponds with U.S. Appl. No. 14/866,511, 1 page. |
Notice of Allowance, dated Jan. 12, 2017, received in Chinese Patent Application No. 201620470281.1, which corresponds with U.S. Appl. No. 14/866,511, 1 page. |
Office Action, dated Mar. 22, 2016, received in Danish Patent Application No. 201500576, which corresponds with U.S. Appl. No. 14/866,511, 10 pages. |
Intention to Grant, dated Jun. 8, 2016, received in Danish Patent Application No. 201500576, which corresponds with U.S. Appl. No. 14/866,511, 2 pages. |
Grant, dated Aug. 26, 2016, received in Danish Patent Application No. 201500576, which corresponds with U.S. Appl. No. 14/866,511, 2 pages. |
Patent, dated Jan. 23, 2017, received in Danish Patent Application No. 201500576, which corresponds with U.S. Appl. No. 14/866,511, 3 pages. |
Office Action, dated Nov. 24, 2017, received in European Patent Application No. 16727900.9, which corresponds with U.S. Appl. No. 14/866,511, 5 pages. |
Office Action, dated May 24, 2018, received in European Patent Application No. 16727900.9, which corresponds with U.S. Appl. No. 14/866,511, 7 pages. |
Office Action, dated Jun. 9, 2017, received in Japanese Patent Application No. 2016-558214, which corresponds with U.S. Appl. No. 14/866,511, 6 pages. |
Notice of Allowance, dated Jul. 14, 2017, received in Japanese Patent Application No. 2016-558214, which corresponds with U.S. Appl. No. 14/866,511, 5 pages. |
Patent, dated Aug. 18, 2017, received in Japanese Patent Application No. 2016-558214, which corresponds with U.S. Appl. No. 14/866,511, 3 pages. |
Office Action, dated May 10, 2016, received in U.S. Appl. No. 14/866,489, 15 pages. |
Final Office Action, dated Sep. 16, 2016, received in U.S. Appl. No. 14/866,489, 24 pages. |
Notice of Allowance, dated Apr. 27, 2017, received in U.S. Appl. No. 14/866,489, 27 pages. |
Notice of Allowance, dated Jul. 6, 2017, received in U.S. Appl. No. 14/866,489, 12 pages. |
Office Action, dated Mar. 28, 2016, received in U.S. Appl. No. 14/869,899, 17 pages. |
Office Action, dated Jun. 28, 2016, received in U.S. Appl. No. 14/869,899, 5 pages. |
Final Office Action, dated Sep. 2, 2016, received in U.S. Appl. No. 14/869,899, 22 pages. |
Notice of Allowance, dated Feb. 28, 2017, received in U.S. Appl. No. 14/869,899, 9 pages. |
Innovation Patent, dated Aug. 25, 2016, received in Australian Patent Application No. 2016101438, which corresponds with U.S. Appl. No. 14/869,899, 1 page. |
Certificate of Examination, dated Oct. 11, 2016, received in Australian Patent Application No. 2016101438, which corresponds with U.S. Appl. No. 14/869,899, 1 page. |
Notice of Acceptance, dated Aug. 23, 2018, received in Australian Patent Application No. 2018204611, which corresponds with U.S. Appl. No. 14/869,899, 3 pages. |
Office Action, dated Feb. 3, 2016, received in Danish Patent Application No. 201500592, which corresponds with U.S. Appl. No. 14/869,899, 9 pages. |
Office Action, dated Oct. 7, 2016, received in Danish Patent Application No. 201500592, which corresponds with U.S. Appl. No. 14/869,899, 6 pages. |
Office Action, dated Jul. 3, 2017, received in Danish Patent Application No. 201500592, which corresponds with U.S. Appl. No. 14/869,899, 5 pages. |
Office Action, dated Jan. 29, 2018, received in Danish Patent Application No. 201500592, which corresponds with U.S. Appl. No. 14/869,899, 2 pages. |
Notice of Allowance, dated Apr. 24, 2018, received in Danish Patent Application No. 201500592, which corresponds with U.S. Appl. No. 14/869,899, 2 pages. |
Patent, dated May 28, 2018, received in Danish Patent Application No. 201500592, which corresponds with U.S. Appl. No. 14/869,899, 2 pages. |
Office Action, dated Nov. 22, 2016, received in Danish Patent Application No. 201670594, which corresponds with U.S. Appl. No. 14/869,899, 9 pages. |
Office Action, dated Dec. 14, 2017, received in Danish Patent Application No. 201670594, which corresponds with U.S. Appl. No. 14/869,899, 3 pages. |
Office Action, dated May 1, 2018, received in Danish Patent Application No. 201670594, which corresponds with U.S. Appl. No. 14/869,899, 2 pages. |
Office Action, dated Oct. 9, 2018, received in Danish Patent Application No. 201670594, which corresponds with U.S. Appl. No. 14/869,899, 2 pages. |
Office Action, dated Sep. 21, 2018, received in Japanese Patent Application No. 2018-100827, which corresponds with U.S. Appl. No. 14/869,899, 4 pages. |
Office Action, dated Oct. 5, 2018, received in Korean Patent Application No. 2018-7017213, which corresponds with U.S. Appl. No. 14/869,899, 3 pages. |
Office Action, dated Mar. 4, 2016, received in U.S. Appl. No. 14/866,992, 30 pages. |
Final Office Action, dated Jul. 29, 2016, received in U.S. Appl. No. 14/866,992, 35 pages. |
Office Action, dated Apr. 13, 2017, received in U.S. Appl. No. 14/866,992, 34 pages. |
Final Office Action, dated Oct. 3, 2017, received in U.S. Appl. No. 14/866,992, 37 pages. |
Office Action, dated Jan. 29, 2018, received in U.S. Appl. No. 14/866,992, 44 pages. |
Final Office Action, dated Aug. 28, 2018, received in U.S. Appl. No. 14/866,992, 52 pages. |
Innovation Patent, dated Sep. 22, 2016, received in Australian Patent Application No. 2016101418, which corresponds with U.S. Appl. No. 14/866,992, 1 page. |
Office Action, dated Nov. 22, 2016, received in Australian Patent Application No. 2016101418, which corresponds with U.S. Appl. No. 14/866,992, 7 pages. |
Office Action, dated Feb. 7, 2017, received in Australian Patent Application No. 2016101418, which corresponds with U.S. Appl. No. 14/866,992, 5 pages. |
Office Action, dated Mar. 26, 2018, received in Australian Patent Application No. 2016304890, which corresponds with U.S. Appl. No. 14/866,992, 3 pages. |
Office Action, dated Jan. 19, 2018, received in Australian Patent Application No. 201761478, which corresponds with U.S. Appl. No. 14/866,992, 6 pages. |
Office Action, dated Mar. 18, 2016, received in Danish Patent Application No. 201500593, which corresponds with U.S. Appl. No. 14/866,992, 10 pages. |
Office Action, dated Jun. 27, 2016, received in Danish Patent Application No. 201500593, which corresponds with U.S. Appl. No. 14/866,992, 7 pages. |
Office Action, dated Feb. 6, 2017, received in Danish Patent Application No. 201500593, which corresponds with U.S. Appl. No. 14/866,992, 4 pages. |
Office Action, dated Sep. 5, 2017, received in Danish Patent Application No. 201500593, which corresponds with U.S. Appl. No. 14/866,992, 6 pages. |
Office Action, dated Oct. 12, 2018, received in European Patent Application No. 16758008.3, which corresponds with U.S. Appl. No. 14/866,992, 11 pages. |
Office Action, dated Feb. 12, 2018, received in U.S. Appl. No. 15/009,661, 36 pages. |
Final Office Action, dated Sep. 19, 2018, received in U.S. Appl. No. 15/009,661, 28 pages. |
Office Action, dated Jan. 18, 2018, received in U.S. Appl. No. 15/009,676, 21 pages. |
Notice of Allowance, dated Aug. 3, 2018, received in U.S. Appl. No. 15/009,676, 6 pages. |
Office Action, dated Mar. 13, 2018, received in U.S. Appl. No. 15/009,688, 10 pages. |
Notice of Allowance, dated Nov. 6, 2018, received in U.S. Appl. No. 15/009,688, 10 pages. |
Office Action, dated Nov. 30, 2015, received in U.S. Appl. No. 14/845,217, 24 pages. |
Final Office Action, dated Apr. 22, 2016, received in U.S. Appl. No. 14/845,217, 36 pages. |
Notice of Allowance, dated Aug. 26, 2016, received in U.S. Appl. No. 14/845,217, 5 pages. |
Notice of Allowance, dated Jan. 4, 2017, received in U.S. Appl. No. 14/845,217, 5 pages. |
Office Action, dated Feb. 3, 2016, received in U.S. Appl. No. 14/856,517, 36 pages. |
Final Office Action, dated Jul. 13, 2016, received in U.S. Appl. No. 14/856,517, 30 pages. |
Office Action, dated May 2, 2017, received in U.S. Appl. No. 14/856,517, 34 pages. |
Final Office Action, dated Oct. 4, 2017, received in U.S. Appl. No. 14/856,517, 33 pages. |
Notice of Allowance, dated Jun. 29, 2018, received in U.S. Appl. No. 14/856,517, 11 pages. |
Office Action, dated Feb. 11, 2016, received in U.S. Appl. No. 14/856,519, 34 pages. |
Final Office Action, dated Jul. 15, 2016, received in U.S. Appl. No. 14/856,519, 31 pages. |
Office Action, dated May 18, 2017, received in U.S. Appl. No. 14/856,519, 35 pages. |
Final Office Action, dated Nov. 15, 2017, received in U.S. Appl. No. 14/856,519, 31 pages. |
Notice of Allowance, dated Jan. 31, 2018, received in U.S. Appl. No. 14/856,519, 9 pages. |
Notice of Allowance, dated May 2, 2018, received in U.S. Appl. No. 14/856,519, 10 pages. |
Office Action, dated Jun. 9, 2017, received in U.S. Appl. No. 14/856,520, 36 pages. |
Final Office Action, dated Nov. 16, 2017, received in U.S. Appl. No. 14/856,520, 41 pages. |
Office Action, dated Jun. 30, 2017, received in U.S. Appl. No. 14/856,522, 22 pages. |
Notice of Allowance, dated Feb. 9, 2018, received in U.S. Appl. No. 14/856,522, 9 pages. |
Office Action, dated Feb. 1, 2016, received in U.S. Appl. No. 14/857,645, 15 pages. |
Final Office Action, dated Jun. 16, 2016, received in U.S. Appl. No. 14/857,645, 12 pages. |
Notice of Allowance, dated Oct. 24, 2016, received in U.S. Appl. No. 14/857,645, 6 pages. |
Notice of Allowance, dated Jun. 16, 2017, received in U.S. Appl. No. 14/857,645, 5 pages. |
Office Action, dated Nov. 30, 2017, received in U.S. Appl. No. 14/857,636, 19 pages. |
Office Action, dated Jan. 17, 2018, received in Australian Patent Application No. 2017202816, which corresponds with U.S. Appl. No. 14/857,636, 3 pages. |
Office Action, dated Sep. 22, 2017, received in Japanese Patent Application No. 2017-029201, which corresponds with U.S. Appl. No. 14/857,636, 8 pages. |
Office Action, dated Jun. 25, 2018, received in Japanese Patent Application No. 2017-029201, which corresponds with U.S. Appl. No. 14/857,636, 4 pages. |
Office Action, dated Dec. 1, 2017, received in U.S. Appl. No. 14/857,663, 15 pages. |
Office Action, dated Mar. 31, 2017, received in U.S. Appl. No. 14/857,700, 14 pages. |
Final Office Action, dated Oct. 11, 2017, received in U.S. Appl. No. 14/857,700, 13 pages. |
Notice of Allowance, dated Feb. 12, 2018, received in U.S. Appl. No. 14/857,700, 13 pages. |
Notice of Allowance, dated Apr. 9, 2018, received in U.S. Appl. No. 14/857,700, 7 pages. |
Notice of Allowance, dated Apr. 19, 2018, received in U.S. Appl. No. 14/864,529, 11 pages. |
Notice of Allowance, dated Oct. 9, 2018, received in U.S. Appl. No. 14/864,529, 11 pages. |
Grant of Patent, dated Apr. 16, 2018, received in Dutch Patent Application No. 2019215, 2 pages. |
Office Action, dated Jan. 25, 2016, received in U.S. Appl. No. 14/864,580, 29 pages. |
Notice of Allowance, dated May 23, 2016, received in U.S. Appl. No. 14/864,580, 9 pages. |
Notice of Allowance, dated Aug. 4, 2016, received in U.S. Appl. No. 14/864,580, 9 pages. |
Notice of Allowance, dated Dec. 28, 2016, received in U.S. Appl. No. 14/864,580, 8 pages. |
Office Action, dated Aug. 19, 2016, received in Australian Patent Application No. 2016100648, which corresponds with U.S. Appl. No. 14/864,580, 6 pages. |
Office Action, dated Nov. 7, 2018, received in Chinese Patent Application No. 201610342151.4, which corresponds with U.S. Appl. No. 14/864,580, 3 pages. |
Notice of Allowance, dated Nov. 8, 2016, received in Chinese Patent Application No. 201620470247.4, which corresponds with U.S. Appl. No. 14/864,580, 3 pages. |
Certificate of Registration, dated Oct. 14, 2016, received in German Patent Application No. 20201600003234.9, which corresponds with U.S. Appl. No. 14/864,580, 3 pages. |
Office Action, dated Apr. 8, 2016, received in Danish Patent Application No. 201500584, which corresponds with U.S. Appl. No. 14/864,580, 9 pages. |
Office Action, dated Oct. 7, 2016, received in Danish Patent Application No. 201500584, which corresponds with U.S. Appl. No. 14/864,580, 3 pages. |
Office Action, dated May 5, 2017, received in Danish Patent Application No. 201500584, which corresponds with U.S. Appl. No. 14/864,580, 3 pages. |
Office Action, dated Dec. 15, 2017, received in Danish Patent Application No. 201500584, which corresponds with U.S. Appl. No. 14/864,580, 4 pages. |
Notice of Allowance, dated Nov. 23, 2016, received in U.S. Appl. No. 14/864,601, 12 pages. |
Notice of Allowance, dated Apr. 20, 2017, received in U.S. Appl. No. 14/864,601, 13 pages. |
Office Action, dated Aug. 31, 2018, received in Australian Patent Application No. 2016276030, which corresponds with U.S. Appl. No. 14/864,601, 3 pages. |
Office Action, dated Apr. 19, 2016, received in U.S. Appl. No. 14/864,627, 9 pages. |
Notice of Allowance, dated Jan. 31, 2017, received in U.S. Appl. No. 14/864,627, 7 pages. |
Office Action, dated Apr. 8, 2016, received in Danish Patent Application No. 201500585, which corresponds with U.S. Appl. No. 14/864,627, 9 pages. |
Office Action, dated Oct. 7, 2016, received in Danish Patent Application No. 201500585, which corresponds with U.S. Appl. No. 14/864,627, 3 pages. |
Office Action, dated May 5, 2017, received in Danish Patent Application No. 201500585, which corresponds with U.S. Appl. No. 14/864,627, 4 pages. |
Office Action, dated Dec. 15, 2017, received in Danish Patent Application No. 201500585, which corresponds with U.S. Appl. No. 14/864,627, 5 pages. |
Office Action, dated Mar. 29, 2016, received in U.S. Appl. No. 14/866,361, 22 pages. |
Notice of Allowance, dated Jul. 19, 2016, received in U.S. Appl. No. 14/866,361, 8 pages. |
Office Action, dated Jun. 10, 2016, received in Australian Patent Application No. 2016100292, which corresponds with U.S. Appl. No. 14/866,361, 4 pages. |
Certificate of Examination, dated Dec. 8, 2016, received in Australian Patent Application No. 2016100292, which corresponds with U.S. Appl. No. 14/866,361, 1 page. |
Office Action, dated Oct. 19, 2018, received in Chinese Patent Application No. 201610189298.4, which corresponds with U.S. Appl. No. 14/866,361, 6 pages. |
Notice of Allowance/Grant, dated Jul. 1, 2016, received in Chinese Patent Application No. 201620251706.X, which corresponds with U.S. Appl. No. 14/866,361, 3 pages. |
Letters Patent, dated Aug. 3, 2016, received in Chinese Patent Application No. 201620251706.X, which corresponds with U.S. Appl. No. 14/866,361, 3 pages. |
Certificate of Registration, dated Jun. 24, 2016, received in German Patent Application No. 202016001819.2, which corresponds with U.S. Appl. No. 14/866,361, 3 pages. |
Office Action, dated Apr. 7, 2016, received in Danish Patent Application No. 201500579, which corresponds with U.S. Appl. No. 14/866,361, 10 pages. |
Office Action, dated Oct. 28, 2016, received in Danish Patent Application No. 201500579, which corresponds with U.S. Appl. No. 14/866,361, 3 pages. |
Office Action, dated Jun. 15, 2017, received in Danish Patent Application No. 201500579, which corresponds with U.S. Appl. No. 14/866,361, 2 pages. |
Office Action, dated Jan. 4, 2018, received in Danish Patent Application No. 201500579, which corresponds with U.S. Appl. No. 14/866,361, 2 pages. |
Notice of Allowance, dated Mar. 16, 2018, received in Danish Patent Application No. 201500579, which corresponds with U.S. Appl. No. 14/866,361, 2 pages. |
Patent, dated May 22, 2018, received in Danish Patent Application No. 201500579, which corresponds with U.S. Appl. No. 14/866,361, 2 pages. |
Office Action, dated Jun. 11, 2018, received in European Patent Application No. 17188507.2, which corresponds with U.S. Appl. No. 14/866,361, 10 pages. |
Office Action, dated Oct. 12, 2018, received in Japanese Patent Application No. 2017-141962, which corresponds with U.S. Appl. No. 14/866,361, 6 pages. |
Office Action, dated Sep. 14, 2018, received in Korean Patent Application No. 2018-7013039, which corresponds with U.S. Appl. No. 14/866,361, 2 pages. |
Office Action, dated Jan. 22, 2018, received in U.S. Appl. No. 14/866,987, 22 pages. |
Final Office Action, dated Oct. 11, 2018, received in U.S. Appl. No. 14/866,987, 20 pages. |
Patent, dated Aug. 8, 2016, received in Australian Patent Application No. 2016100649, which corresponds with U.S. Appl. No. 14/866,987, 1 page. |
Office Action, dated Oct. 19, 2016, received in Chinese Patent Application No. 201620470246.X, which corresponds with U.S. Appl. No. 14/866,987, 4 pages. |
Patent, dated May 3, 2017, received in Chinese Patent Application No. 201620470246.X, which corresponds with U.S. Appl. No. 14/866,987, 2 pages. |
Patent, dated Sep. 19, 2016, received in German Patent Application No. 202016002908.9, which corresponds with U.S. Appl. No. 14/866,987, 3 pages. |
Office Action, dated Mar. 22, 2016, received in Danish Patent Application No. 201500587, which corresponds with U.S. Appl. No. 14/866,987, 8 pages. |
Intention to Grant, dated Jun. 10, 2016, received in Danish Patent Application No. 201500587, which corresponds with U.S. Appl. No. 14/866,987, 2 pages. |
Notice of Allowance, dated Nov. 1, 2016, received in Danish Patent Application No. 201500587, which corresponds with U.S. Appl. No. 14/866,987, 2 pages. |
Office Action, dated Sep. 9, 2016, received in Danish Patent Application No. 201670463, which corresponds with U.S. Appl. No. 14/866,987, 7 pages. |
Notice of Allowance, dated Jan. 31, 2017, received in Danish Patent Application No. 201670463, which corresponds with U.S. Appl. No. 14/866,987, 3 pages. |
Office Action, dated Apr. 19, 2017, received in Danish Patent Application No. 201670463, which corresponds with U.S. Appl. No. 14/866,987, 3 pages. |
Notice of Allowance, dated Sep. 29, 2017, received in Danish Patent Application No. 201670463, which corresponds with U.S. Appl. No. 14/866,987, 2 pages. |
Patent, dated Nov. 6, 2017, received in Danish Patent Application No. 201670463, which corresponds with U.S. Appl. No. 14/866,987, 6 pages. |
Office Action, dated May 7, 2018, received in European Patent Application No. 16189421.7, which corresponds with U.S. Appl. No. 14/866,987, 5 pages. |
Notice of Allowance, dated Sep. 22, 2017, received in Japanese Patent Application No. 2016-233449, which corresponds with U.S. Appl. No. 14/866,987, 5 pages. |
Patent, dated Oct. 27, 2017, received in Japanese Patent Application No. 2016-233449, which corresponds with U.S. Appl. No. 14/866,987, 3 pages. |
Office Action, dated Jul. 31, 2017, received in Japanese Patent Application No. 2017126445, which corresponds with U.S. Appl. No. 14/866,987, 6 pages. |
Notice of Allowance, dated Mar. 6, 2018, received in Japanese Patent Application No. 2017-126445, which corresponds with U.S. Appl. No. 14/866,987, 5 pages. |
Patent, dated Apr. 6, 2018, received in Japanese Patent Application No. 2017-126445, which corresponds with U.S. Appl. No. 14/866,987, 3 pages. |
Office Action, dated Nov. 29, 2017, received in U.S. Appl. No. 14/866,989, 31 pages. |
Final Office Action, dated Jul. 3, 2018, received in U.S. Appl. No. 14/866,989, 17 pages. |
Certificate of Exam, dated Jul. 21, 2016, received in Australian Patent Application No. 2016100652, which corresponds with U.S. Appl. No. 14/866,989, 1 page. |
Office Action, dated Feb. 26, 2018, received in Australian Patent Application No. 2017201079, which corresponds with U.S. Appl. No. 14/866,989, 6 pages. |
Office Action, dated Sep. 19, 2018, received in Chinese Patent Application No. 201610342314.9, which corresponds with U.S. Appl. No. 14/866,989, 6 pages. |
Office Action, dated Jun. 16, 2017, received in Japanese Patent Application No. 2016-233450, which corresponds with U.S. Appl. No. 14/866,989, 6 pages. |
Patent, dated Mar. 9, 2018, received in Japanese Patent Application No. 2016-233450, which corresponds with U.S. Appl. No. 14/866,989, 4 pages. |
Office Action, dated Apr. 1, 2016, received in Danish Patent Application No. 201500589, which corresponds with U.S. Appl. No. 14/866,989, 8 pages. |
Intention to Grant, dated Jun. 10, 2016, received in Danish Patent Application No. 201500589, which corresponds with U.S. Appl. No. 14/866,989, 2 pages. |
Notice of Allowance, dated Nov. 1, 2016, received in Danish Patent Application No. 201500589, which corresponds with U.S. Appl. No. 14/866,989, 2 pages. |
Notice of Allowance, dated Feb. 5, 2018, received in Japanese Patent Application No. 2016-233450, which corresponds with U.S. Appl. No. 14/866,989, 5 pages. |
Office Action, dated Apr. 11, 2016, received in U.S. Appl. No. 14/871,236, 23 pages. |
Office Action, dated Jun. 28, 2016, received in U.S. Appl. No. 14/871,236, 21 pages. |
Final Office Action, dated Nov. 4, 2016, received in U.S. Appl. No. 14/871,236, 24 pages. |
Notice of Allowance, dated Feb. 28, 2017, received in U.S. Appl. No. 14/871,236, 9 pages. |
Innovation Patent, dated Aug. 25, 2016, received in Australian Patent Application No. 2016101433, which corresponds with U.S. Appl. No. 14/871,236, 1 page. |
Office Action, dated Oct. 14, 2016, received in Australian Patent Application No. 2016101433, which corresponds with U.S. Appl. No. 14/871,236, 3 pages. |
Office Action, dated Apr. 8, 2016, received in Danish Patent Application No. 201500595, which corresponds with U.S. Appl. No. 14/871,236, 12 pages. |
Office Action, dated May 26, 2016, received in Danish Patent Application No. 201500595, which corresponds with U.S. Appl. No. 14/871,236, 14 pages. |
Office Action, dated Sep. 30, 2016, received in Danish Patent Application No. 201500595, which corresponds with U.S. Appl. No. 14/871,236, 10 pages. |
Office Action, dated Jun. 15, 2017, received in Danish Patent Application No. 201500595, which corresponds with U.S. Appl. No. 14/871,236, 4 pages. |
Office Action, dated Jan. 29, 2018, received in Danish Patent Application No. 201500595, which corresponds with U.S. Appl. No. 14/871,236, 2 pages. |
Notice of Allowance, dated Apr. 26, 2018, received in Danish Patent Application No. 201500595, which corresponds with U.S. Appl. No. 14/871,236, 2 pages. |
Patent, dated Jun. 18, 2018, received in Danish Patent Application No. 201500595, which corresponds with U.S. Appl. No. 14/871,236, 3 pages. |
Office Action, dated Jul. 19, 2018, received in Russian Patent Application No. 2017131408, which corresponds with U.S. Appl. No. 14/871,236, 8 pages. |
Office Action, dated Sep. 1, 2017, received in U.S. Appl. No. 14/870,754, 22 pages. |
Final Office Action, dated Mar. 9, 2018, received in U.S. Appl. No. 14/870,754, 19 pages. |
Notice of Allowance, dated Jul. 2, 2018, received in U.S. Appl. No. 14/870,754, 9 pages. |
Office Action, dated Nov. 14, 2017, received in U.S. Appl. No. 14/870,882, 25 pages. |
Final Office Action, dated Apr. 20, 2018, received in U.S. Appl. No. 14/870,882, 7 pages. |
Notice of Allowance, dated Jul. 12, 2018, received in U.S. Appl. No. 14/870,882, 5 pages. |
Innovation Patent, dated Aug. 25, 2016, received in Australian Patent Application No. 2016101436, which corresponds with U.S. Appl. No. 14/871,236, 1 page. |
Office Action, dated Oct. 31, 2016, received in Australian Patent Application No. 2016101438, which corresponds with U.S. Appl. No. 14/871,236, 6 pages. |
Office Action, dated Apr. 6, 2016, received in Danish Patent Application No. 201500596, which corresponds with U.S. Appl. No. 14/870,882, 7 pages. |
Office Action, dated Jun. 9, 2016, received in Danish Patent Application No. 201500596, which corresponds with U.S. Appl. No. 14/870,882, 9 pages. |
Notice of Allowance, dated Oct. 31, 2017, received in Danish Patent Application No. 201500596, which corresponds with U.S. Appl. No. 14/870,882, 2 pages. |
Patent, dated Jan. 29, 2018, received in Danish Patent Application No. 201500596, which corresponds with U.S. Appl. No. 14/870,882, 4 pages. |
Office Action, dated Sep. 1, 2017, received in U.S. Appl. No. 14/870,988, 14 pages. |
Final Office Action, dated Feb. 16, 2018, received in U.S. Appl. No. 14/870,988, 18 pages. |
Office Action, dated Nov. 22, 2017, received in U.S. Appl. No. 14/871,227, 24 pages. |
Notice of Allowance, dated Jun. 11, 2018, received in U.S. Appl. No. 14/871,227, 11 pages. |
Office Action, dated Oct. 17, 2016, received in Australian Patent Application No. 2016203040, which corresponds with U.S. Appl. No. 14/871,227, 7 pages. |
Office Action, dated Oct. 16, 2017, received in Australian Patent Application No. 2016203040, which corresponds with U.S. Appl. No. 14/871,227, 5 pages. |
Notice of Acceptance, dated Oct. 30, 2018, received in Australian Patent Application No. 2016203040, which corresponds with U.S. Appl. No. 14/871,227, 4 pages. |
Office Action, dated Oct. 18, 2016, received in Australian Patent Application No. 2016101431, which corresponds with U.S. Appl. No. 14/871,227, 3 pages. |
Office Action, dated Apr. 13, 2017, received in Australian Patent Application No. 2016101431, which corresponds with U.S. Appl. No. 14/871,227, 4 pages. |
Office Action, dated Oct. 11, 2018, received in Australian Patent Application No. 2017245442, which corresponds with U.S. Appl. No. 14/871,227, 4 pages. |
Intention to Grant, dated Apr. 7, 2016, received in Danish Patent Application No. 201500597, which corresponds with U.S. Appl. No. 14/871,227, 7 pages. |
Grant, dated Jun. 21, 2016, received in Danish Patent Application No. 201500597, which corresponds with U.S. Appl. No. 14/871,227, 2 pages. |
Patent, dated Sep. 26, 2016, received in Danish Patent Application No. 201500597, which corresponds with U.S. Appl. No. 14/871,227, 7 pages. |
Intent to Grant, dated Sep. 17, 2018, received in European Patent Application No. 16711743.1, which corresponds with U.S. Appl. No. 14/871,227, 5 pages. |
Office Action, dated Mar. 24, 2017, received in Japanese Patent Application No. 2016-533201, which corresponds with U.S. Appl. No. 14/871,227, 6 pages. |
Office Action, dated Aug. 4, 2017, received in Japanese Patent Application No. 2016-533201, which corresponds with U.S. Appl. No. 14/871,227, 6 pages. |
Notice of Allowance, dated Jan. 4, 2018, received in Japanese Patent Application No. 2016-533201, which corresponds with U.S. Appl. No. 14/871,227, 4 pages. |
Patent, dated Feb. 9, 2018, received in Japanese Patent Application No. 2016-533201, which corresponds with U.S. Appl. No. 14/871,227, 4 pages. |
Office Action, dated Feb. 20, 2018, received in Korean Patent Application No. 2016-7019816, which corresponds with U.S. Appl. No. 14/871,227, 8 pages. |
Notice of Allowance, dated Oct. 1, 2018, received in Korean Patent Application No. 2016-7019816, which corresponds with U.S. Appl. No. 14/871,227, 6 pages. |
Office Action, dated Oct. 26, 2017, received in U.S. Appl. No. 14/871,336, 22 pages. |
Final Office Action, dated Mar. 15, 2018, received in U.S. Appl. No. 14/871,336, 23 pages. |
Office Action, dated Nov. 5, 2018, received in U.S. Appl. No. 14/871,336, 24 pages. |
Office Action, dated Oct. 14, 2016, received in Australian Patent Application No. 2016101437, which corresponds with U.S. Appl. No. 14/871,336, 2 pages. |
Office Action, dated Apr. 11, 2017, received in Australian Patent Application No. 2016101437, which corresponds with U.S. Appl. No. 14/871,336, 4 pages. |
Office Action, dated Apr. 18, 2016, received in Danish Patent Application No. 201500601, which corresponds with U.S. Appl. No. 14/871,336, 8 pages. |
Office Action, dated Oct. 18, 2016, received in Danish Patent Application No. 201500601, which corresponds with U.S. Appl. No. 14/871,336, 3 pages. |
Notice of Allowance, dated Mar. 23, 2017, received in Danish Patent Application No. 201500601, which corresponds with U.S. Appl. No. 14/871,336, 2 pages. |
Patent, dated Oct. 30, 2017, received in Danish Patent Application No. 201500601, which corresponds with U.S. Appl. No. 14/871,336, 5 pages. |
Office Action, dated Apr. 2, 2018, received in Japanese Patent Application No. 2018-020324, which corresponds with U.S. Appl. No. 14/871,336, 4 pages. |
Notice of Allowance, dated Oct. 12, 2018, received in Japanese Patent Application No. 2018-020324, which corresponds with U.S. Appl. No. 14/871,336, 5 pages. |
Office Action, dated Oct. 16, 2017, received in U.S. Appl. No. 14/871,462, 26 pages. |
Innovation Patent, dated Aug. 25, 2016, received in Australian Patent Application No. 2016101435, which corresponds with U.S. Appl. No. 14/871,462, 1 page. |
Office Action, dated Oct. 4, 2016, received in Australian Patent Application No. 2016101435, which corresponds with U.S. Appl. No. 14/871,462, 3 pages. |
Office Action, dated Oct. 4, 2016, received in Australian Patent Application No. 2016231505, which corresponds with U.S. Appl. No. 14/871,462, 3 pages. |
Office Action, dated Sep. 29, 2017, received in Australian Patent Application No. 2016231505, which corresponds with U.S. Appl. No. 14/871,462, 5 pages. |
Innovation Patent, dated Oct. 11, 2017, received in Australian Patent Application No. 2016231505, which corresponds with U.S. Appl. No. 14/871,462, 1 page. |
Office Action, dated Apr. 20, 2017, received in Chinese Patent Application No. 201621044346.2, which corresponds with U.S. Appl. No. 14/871,462, 3 pages. |
Intention to Grant, dated Apr. 18, 2016, received in Danish Patent Application No. 201500600, which corresponds with U.S. Appl. No. 14/871,462, 7 pages. |
Grant, dated Aug. 30, 2016, received in Danish Patent Application No. 201500600, which corresponds with U.S. Appl. No. 14/871,462, 2 pages. |
Office Action, dated Mar. 13, 2017, received in Japanese Patent Application No. 2016-183289, which corresponds with U.S. Appl. No. 14/871,462, 5 pages. |
Office Action, dated Nov. 13, 2017, received in Japanese Patent Application No. 2016-183289, which corresponds with U.S. Appl. No. 14/871,462, 5 pages. |
Office Action, dated Apr. 29, 2016, received in U.S. Appl. No. 14/867,823, 28 pages. |
Final Office Action, dated Sep. 28, 2016, received in U.S. Appl. No. 14/867,823, 31 pages. |
Office Action, dated May 11, 2017, received in U.S. Appl. No. 14/867,823, 42 pages. |
Final Office Action, dated Nov. 29, 2017, received in U.S. Appl. No. 14/867,823, 47 pages. |
Notice of Allowance, dated Apr. 18, 2018, received in U.S. Appl. No. 14/867,823, 10 pages. |
Notice of Allowance, dated Aug. 7, 2018, received in U.S. Appl. No. 14/867,823, 8 pages. |
Office Action, dated Mar. 18, 2016, received in Danish Patent Application No. 201500594, which corresponds with U.S. Appl. No. 14/867,823, 10 pages. |
Office Action, dated Sep. 7, 2016, received in Danish Patent Application No. 201500594, which corresponds with U.S. Appl. No. 14/867,823, 4 pages. |
Office Action, dated May 15, 2017, received in Danish Patent Application No. 201500594, which corresponds with U.S. Appl. No. 14/867,823, 4 pages. |
Office Action, dated Jan. 23, 2018, received in Danish Patent Application No. 201500594, which corresponds with U.S. Appl. No. 14/867,823, 8 pages. |
Office Action, dated May 10, 2016, received in U.S. Appl. No. 14/867,892, 28 pages. |
Final Office Action, dated Nov. 2, 2016, received in U.S. Appl. No. 14/867,892, 48 pages. |
Office Action, dated Jul. 6, 2017, received in U.S. Appl. No. 14/867,892, 55 pages. |
Final Office Action, dated Dec. 14, 2017, received in U.S. Appl. No. 14/867,892, 53 pages. |
Office Action, dated Apr. 24, 2018, received in U.S. Appl. No. 14/867,892, 63 pages. |
Office Action, dated Mar. 21, 2016, received in Danish Patent Application No. 201500598, which corresponds with U.S. Appl. No. 14/867,892, 9 pages. |
Office Action, dated Sep. 14, 2016, received in Danish Patent Application No. 201500598, which corresponds with U.S. Appl. No. 14/867,892, 4 pages. |
Office Action, dated May 4, 2017, received in Danish Patent Application No. 201500598, which corresponds with U.S. Appl. No. 14/867,892, 4 pages. |
Office Action, dated Oct. 31, 2017, received in Danish Patent Application No. 201500598, which corresponds with U.S. Appl. No. 14/867,892, 2 pages. |
Notice of Allowance, dated Jan. 26, 2018, received in Danish Patent Application No. 201500598, which corresponds with U.S. Appl. No. 14/867,892, 2 pages. |
Office Action, dated Feb. 28, 2018, received in U.S. Appl. No. 14/869,361, 26 pages. |
Final Office Action, dated Oct. 4, 2018, received in U.S. Appl. No. 14/869,361, 28 pages. |
Office Action, dated Mar. 1, 2017, received in U.S. Appl. No. 14/869,855, 14 pages. |
Final Office Action, dated Oct. 10, 2017, received in U.S. Appl. No. 14/869,855, 16 pages. |
Office Action, dated Jan. 23, 2018, received in U.S. Appl. No. 14/869,855, 24 pages. |
Notice of Allowance, dated May 31, 2018, received in U.S. Appl. No. 14/869,855, 10 pages. |
Office Action, dated Feb. 9, 2017, received in U.S. Appl. No. 14/869,873, 17 pages. |
Final Office Action, dated Aug. 18, 2017, received in U.S. Appl. No. 14/869,873, 20 pages. |
Office Action, dated Jan. 18, 2018, received in U.S. Appl. No. 14/869,873, 25 pages. |
Final Office Action, dated May 23, 2018, received in U.S. Appl. No. 14/869,873, 18 pages. |
Notice of Allowance, dated Jul. 30, 2018, received in U.S. Appl. No. 14/869,873, 8 pages. |
Office Action, dated Jan. 11, 2018, received in U.S. Appl. No. 14/869,997, 17 pages. |
Office Action, dated Sep. 7, 2018, received in U.S. Appl. No. 14/869,997, 23 pages. |
Notice of Allowance, dated Jan. 17, 2018, received in U.S. Appl. No. 14/867,990, 12 pages. |
Notice of Allowance, dated Mar. 30, 2018, received in U.S. Appl. No. 14/867,990, 5 pages. |
Office Action, dated May 23, 2016, received in Australian Patent Application No. 2016100253, which corresponds with U.S. Appl. No. 14/867,990, 5 pages. |
Office Action, dated Jul. 5, 2016, received in Chinese Patent Application No. 201620176221.9, which corresponds with U.S. Appl. No. 14/867,990, 4 pages. |
Office Action, dated Oct. 25, 2016, received in Chinese Patent Application No. 201620176221.9, which corresponds with U.S. Appl. No. 14/867,990, 7 pages. |
Certificate of Registration, dated Jun. 16, 2016, received in German Patent Application No. 202016001489.8, which corresponds with U.S. Appl. No. 14/867,990, 3 pages. |
Office Action, dated Mar. 18, 2016, received in Danish Patent Application No. 201500581, which corresponds with U.S. Appl. No. 14/867,990, 9 pages. |
Office Action, dated Sep. 26, 2016, received in Danish Patent Application No. 201500581, which corresponds with U.S. Appl. No. 14/867,990, 5 pages. |
Office Action, dated May 3, 2017, received in Danish Patent Application No. 201500581, which corresponds with U.S. Appl. No. 14/867,990, 5 pages. |
Office Action, dated Feb. 19, 2018, received in Danish Patent Application No. 201500581, which corresponds with U.S. Appl. No. 14/867,990, 4 pages. |
Office Action, dated Apr. 19, 2018, received in U.S. Appl. No. 14/869,703, 19 pages. |
Final Office Action, dated Oct. 26, 2018, received in U.S. Appl. No. 14/869,703, 19 pages. |
Office Action, dated Dec. 12, 2017, received in U.S. Appl. No. 15/009,668, 32 pages. |
Final Office Action, dated Jul. 3, 2018, received in U.S. Appl. No. 15/009,668, 19 pages. |
Office Action, dated Nov. 25, 2016, received in U.S. Appl. No. 15/081,771, 17 pages. |
Final Office Action, dated Jun. 2, 2017, received in U.S. Appl. No. 15/081,771, 17 pages. |
Notice of Allowance, dated Dec. 4, 2017, received in U.S. Appl. No. 15/081,771, 10 pages. |
Office Action, dated Feb. 1, 2018, received in Australian Patent Application No. 2017202058, which corresponds with U.S. Appl. No. 15/081,771, 4 pages. |
Office Action, dated Jan. 26, 2018, received in Japanese Patent Application No. 2017-086460, which corresponds with U.S. Appl. No. 15/081,771, 6 pages. |
Notice of Allowance, dated Oct. 12, 2018, received in Japanese Patent Application No. 2017-086460, which corresponds with U.S. Appl. No. 15/081,771, 5 pages. |
Office Action, dated Aug. 29, 2017, received in Korean Patent Application No. 2017-7014536, which corresponds with U.S. Appl. No. 15/081,771, 5 pages. |
Notice of Allowance, dated Jun. 28, 2018, received in Korean Patent Application No. 2017-7014536, which corresponds with U.S. Appl. No. 15/081,771, 4 pages. |
Patent, dated Sep. 28, 2018, received in Korean Patent Application No. 2017-7014536, which corresponds with U.S. Appl. No. 15/081,771, 3 pages. |
Final Office Action, dated May 1, 2017, received in U.S. Appl. No. 15/136,782, 18 pages. |
Notice of Allowance, dated Oct. 20, 2017, received in U.S. Appl. No. 15/136,782, 9 pages. |
Office Action, dated May 4, 2018, received in Australian Patent Application No. 2018202855, which corresponds with U.S. Appl. No. 15/136,782, 3 pages. |
Notice of Acceptance, dated Sep. 10, 2018, received in Australian Patent Application No. 2018202855, which corresponds with U.S. Appl. No. 15/136,782, 3 pages. |
Office Action, dated May 23, 2017, received in Danish Patent Application No. 201770190, which corresponds with U.S. Appl. No. 15/136,782, 7 pages. |
Office Action, dated Jan. 8, 2018, received in Danish Patent Application No. 201770190, which corresponds with U.S. Appl. No. 15/136,782, 2 pages. |
Notice of Allowance, dated Mar. 19, 2018, received in Danish Patent Application No. 201770190, which corresponds with U.S. Appl. No. 15/136,782, 2 pages. |
Patent, dated May 22, 2018, received in Danish Patent Application No. 201770190, which corresponds with U.S. Appl. No. 15/136,782, 2 pages. |
Office Action, dated Jun. 1, 2018, received in Japanese Patent Application No. 2018-062161, which corresponds with U.S. Appl. No. 15/136,782, 5 pages. |
Office Action, dated Oct. 31, 2018, received in Korean Patent Application No. 2018-7020659, which corresponds with U.S. Appl. No. 15/136,782, 5 pages. |
Office Action, dated Jan. 20, 2017, received in U.S. Appl. No. 15/231,745, 21 pages. |
Notice of Allowance, dated Jul. 6, 2017, received in U.S. Appl. No. 15/231,745, 18 pages. |
Office Action, dated Oct. 17, 2016, received in Danish Patent Application No. 201670587, which corresponds with U.S. Appl. No. 15/231,745, 9 pages. |
Office Action, dated Jun. 29, 2017, received in Danish Patent Application No. 201670587, which corresponds with U.S. Appl. No. 15/231,745, 4 pages. |
Office Action, dated Feb. 22, 2018, received in Danish Patent Application No. 201670587, which corresponds with U.S. Appl. No. 15/231,745, 4 pages. |
Office Action, dated Dec. 14, 2016, received in Danish Patent Application No. 201670590, which corresponds with U.S. Appl. No. 15/231,745, 9 pages. |
Office Action, dated Jul. 6, 2017, received in Danish Patent Application No. 201670590, which corresponds with U.S. Appl. No. 15/231,745, 3 pages. |
Office Action, dated Jan. 10, 2018, received in Danish Patent Application No. 201670590, which corresponds with U.S. Appl. No. 15/231,745, 2 pages. |
Patent, dated May 28, 2018, received in Danish Patent Application No. 201670590, which corresponds with U.S. Appl. No. 15/231,745, 2 pages. |
Office Action, dated Nov. 10, 2016, received in Danish Patent Application No. 201670591, which corresponds with U.S. Appl. No. 15/231,745, 12 pages. |
Office Action, dated Apr. 11, 2018, received in Danish Patent Application No. 201670591, which corresponds with U.S. Appl. No. 15/231,745, 3 pages. |
Office Action, dated Oct. 26, 2016, received in Danish Patent Application No. 201670592, which corresponds with U.S. Appl. No. 15/231,745, 8 pages. |
Office Action, dated Jan. 5, 2017, received in Danish Patent Application No. 201670592, which corresponds with U.S. Appl. No. 15/231,745, 3 pages. |
Office Action, dated Jan. 30, 2018, received in Danish Patent Application No. 201670592, which corresponds with U.S. Appl. No. 15/231,745, 2 pages. |
Notice of Allowance, dated Mar. 27, 2018, received in Danish Patent Application No. 201670592, which corresponds with U.S. Appl. No. 15/231,745, 2 pages. |
Patent, dated May 28, 2018, received in Danish Patent Application No. 201670592, which corresponds with U.S. Appl. No. 15/231,745, 2 pages. |
Office Action, dated Oct. 12, 2016, received in Danish Patent Application No. 201670593, which corresponds with U.S. Appl. No. 15/231,745, 7 pages. |
Patent, dated Oct. 30, 2017, received in Danish Patent Application No. 201670593, which corresponds with U.S. Appl. No. 15/231,745, 3 pages. |
Notice of Acceptance, dated Mar. 2, 2018, received in Australian Patent Application No. 2018200705, which corresponds with U.S. Appl. No. 15/272,327, 3 pages. |
Certificate of Grant, dated Jun. 28, 2018, received in Australian Patent Application No. 2018200705, which corresponds with U.S. Appl. No. 15/272,327, 4 pages. |
Office Action, dated Sep. 14, 2018, received in European Patent Application No. 15155939.4, which corresponds with U.S. Appl. No. 15/272,327, 5 pages. |
Notice of Allowance, dated Jul. 30, 2018, received in Japanese Patent Application No. 2018-506989, which corresponds with U.S. Appl. No. 15/272,327, 4 pages. |
Patent, dated Aug. 31, 2018, received in Japanese Patent Application No. 2018-506989, which corresponds with U.S. Appl. No. 15/272,327, 3 pages. |
Office Action, dated Oct. 26, 2018, received in U.S. Appl. No. 15/272,341, 22 pages. |
Office Action, dated Jul. 27, 2017, received in Australian Patent Application No. 2017100535, which corresponds with U.S. Appl. No. 15/272,341, 4 pages. |
Notice of Allowance, dated Sep. 20, 2018, received in U.S. Appl. No. 15/272,343, 44 pages. |
Office Action, dated Oct. 15, 2018, received in U.S. Appl. No. 15/272,345, 31 pages. |
Notice of Acceptance, dated Mar. 2, 2018, received in Australian Patent Application No. 2016304832, which corresponds with U.S. Appl. No. 15/272,345, 3 pages. |
Certificate of Grant, dated Jun. 28, 2018, received in Australian Patent Application No. 2016304832, which corresponds with U.S. Appl. No. 15/272,345, 4 pages. |
Office Action, dated Apr. 20, 2018, received in European Patent Application No. 16756862.5, which corresponds with U.S. Appl. No. 15/272,345, 15 pages. |
Office Action, dated Mar. 7, 2018, received in U.S. Appl. No. 15/482,618, 7 pages. |
Office Action, dated Apr. 23, 2018, received in U.S. Appl. No. 15/499,691, 29 pages. |
Notice of Allowance, dated Oct. 12, 2018, received in U.S. Appl. No. 15/499,693, 8 pages. |
Office Action, dated Aug. 30, 2017, received in U.S. Appl. No. 15/655,749, 22 pages. |
Final Office Action, dated May 10, 2018, received in U.S. Appl. No. 15/655,749, 19 pages. |
Office Action, dated Oct. 31, 2017, received in U.S. Appl. No. 15/723,069, 7 pages. |
Notice of Allowance, dated Dec. 21, 2017, received in U.S. Appl. No. 15/723,069, 7 pages. |
International Search Report and Written Opinion dated May 26, 2014, received in International Application No. PCT/US2013/040053, which corresponds with U.S. Appl. No. 14/535,671, 32 pages. |
International Preliminary Report on Patentability dated Nov. 20, 2014, received in International Application No. PCT/US2013/040053, which corresponds with U.S. Appl. No. 14/535,671, 26 pages. |
International Search Report and Written Opinion dated Apr. 7, 2014, received in International Application No. PCT/US2013/069472, which corresponds with U.S. Appl. No. 14/608,895, 24 pages. |
International Preliminary Report on Patentability, dated Jun. 30, 2015, received in International Patent Application No. PCT/US2013/069472, which corresponds with U.S. Appl. No. 14/608,895, 18 pages. |
International Search Report and Written Opinion dated Aug. 7, 2013, received in International Application No. PCT/US2013/040054, which corresponds with U.S. Appl. No. 14/536,235, 12 pages. |
International Preliminary Report on Patentability dated Nov. 20, 2014, received in International Application No. PCT/US2013/040054, which corresponds with U.S. Appl. No. 14/536,235, 11 pages. |
International Search Report and Written Opinion dated Aug. 7, 2013, received in International Application No. PCT/US2013/040056, which corresponds with U.S. Appl. No. 14/536,367, 12 pages. |
International Preliminary Report on Patentability dated Nov. 20, 2014, received in International Application No. PCT/US2013/040056, which corresponds with U.S. Appl. No. 14/536,367, 11 pages. |
Extended European Search Report, dated Nov. 6, 2015, received in European Patent Application No. 15183980.0, which corresponds with U.S. Appl. No. 14/536,426, 7 pages. |
Extended European Search Report, dated Jul. 30, 2018, received in European Patent Application No. 18180503.7, which corresponds with U.S. Appl. No. 14/536,426, 7 pages. |
International Search Report and Written Opinion dated Aug. 6, 2013, received in International Application No. PCT/US2013/040058, which corresponds with U.S. Appl. No. 14/536,426, 12 pages. |
International Preliminary Report on Patentability dated Nov. 20, 2014, received in International Application No. PCT/US2013/040058, which corresponds with U.S. Appl. No. 14/536,426, 11 pages. |
International Search Report and Written Opinion dated Feb. 5, 2014, received in International Application No. PCT/US2013/040061, which corresponds with U.S. Appl. No. 14/536,464, 30 pages. |
International Preliminary Report on Patentability dated Nov. 20, 2014, received in International Application No. PCT/US2013/040061, which corresponds with U.S. Appl. No. 14/536,464, 26 pages. |
International Search Report and Written Opinion dated May 8, 2014, received in International Application No. PCT/US2013/040067, which corresponds with U.S. Appl. No. 14/536,644, 45 pages. |
International Preliminary Report on Patentability dated Nov. 20, 2014, received in International Application No. PCT/US2013/040067, which corresponds with U.S. Appl. No. 14/536,644, 36 pages. |
International Search Report and Written Opinion dated Mar. 12, 2014, received in International Application No. PCT/US2013/069479, which corresponds with U.S. Appl. No. 14/608,926, 14 pages. |
International Preliminary Report on Patentability, dated Jun. 30, 2015, received in International Patent Application No. PCT/US2013/069479, which corresponds with U.S. Appl. No. 14/608,926, 11 pages. |
International Search Report and Written Opinion dated Aug. 7, 2013, received in International Application No. PCT/US2013/040070, which corresponds with U.S. Appl. No. 14/535,646, 12 pages. |
International Preliminary Report on Patentability dated Nov. 20, 2014, received in International Application No. PCT/US2013/040070, which corresponds with U.S. Appl. No. 14/535,646, 10 pages. |
International Search Report and Written Opinion dated Apr. 7, 2014, received in International Application No. PCT/US2013/040072, which corresponds with U.S. Appl. No. 14/536,141, 38 pages. |
International Preliminary Report on Patentability dated Nov. 20, 2014, received in International Application No. PCT/US2013/040072, which corresponds with U.S. Appl. No. 14/536,141, 32 pages. |
International Search Report and Written Opinion dated Apr. 7, 2014, received in International Application No. PCT/US2013/069483, which corresponds with U.S. Appl. No. 14/608,942, 18 pages. |
International Preliminary Report on Patentability, dated Jun. 30, 2015, received in International Application No. PCT/US2013/069483, which corresponds with U.S. Appl. No. 14/608,942, 13 pages. |
International Search Report and Written Opinion dated Mar. 3, 2014, received in International Application No. PCT/US2013/040087, which corresponds with U.S. Appl. No. 14/536,166, 35 pages. |
International Preliminary Report on Patentability dated Nov. 20, 2014, received in International Application No. PCT/US2013/040087, which corresponds with U.S. Appl. No. 14/536,166, 29 pages. |
International Search Report and Written Opinion dated Aug. 7, 2013, received in International Application No. PCT/US2013/040093, which corresponds with U.S. Appl. No. 14/536,203, 11 pages. |
International Preliminary Report on Patentability dated Nov. 20, 2014, received in International Application No. PCT/US2013/040093, which corresponds with U.S. Appl. No. 14/536,203, 9 pages. |
International Search Report and Written Opinion dated Jul. 9, 2014, received in International Application No. PCT/US2013/069484, which corresponds with U.S. Appl. No. 14/608,965, 17 pages. |
International Preliminary Report on Patentability, dated Jun. 30, 2015, received in International Patent Application No. PCT/US2013/069484, which corresponds with U.S. Appl. No. 14/608,965, 12 pages. |
International Search Report and Written Opinion dated Feb. 5, 2014, received in International Application No. PCT/US2013/040098, which corresponds with U.S. Appl. No. 14/536,247, 35 pages. |
International Preliminary Report on Patentability dated Nov. 20, 2014, received in International Application No. PCT/US2013/040098, which corresponds with U.S. Appl. No. 14/536,247, 27 pages. |
Extended European Search Report, dated Oct. 7, 2016, received in European Patent Application No. 16177863.4, which corresponds with U.S. Appl. No. 14/536,267, 12 pages. |
Extended European Search Report, dated Oct. 30, 2018, received in European Patent Application No. 18183789.9, which corresponds with U.S. Appl. No. 14/536,267, 11 pages. |
International Search Report and Written Opinion dated Jan. 27, 2014, received in International Application No. PCT/US2013/040101, which corresponds with U.S. Appl. No. 14/536,267, 30 pages. |
International Preliminary Report on Patentability dated Nov. 20, 2014, received in International Application No. PCT/US2013/040101, which corresponds with U.S. Appl. No. 14/536,267, 24 pages. |
Extended European Search Report, dated Nov. 24, 2017, received in European Patent Application No. 17186744.3, which corresponds with U.S. Appl. No. 14/536,291, 10 pages. |
International Search Report and Written Opinion dated Jan. 8, 2014, received in International Application No. PCT/US2013/040108, which corresponds with U.S. Appl. No. 14/536,291, 30 pages. |
International Preliminary Report on Patentability dated Nov. 20, 2014, received in International Application No. PCT/US2013/040108, which corresponds with U.S. Appl. No. 14/536,291, 25 pages. |
International Search Report and Written Opinion dated Jun. 2, 2014, received in International Application No. PCT/US2013/069486, which corresponds with U.S. Appl. No. 14/608,985, 7 pages. |
International Preliminary Report on Patentability, dated Jun. 30, 2015, received in International Patent Application No. PCT/US2013/069486, which corresponds with U.S. Appl. No. 14/608,985, 19 pages. |
International Search Report and Written Opinion dated Mar. 6, 2014, received in International Application No. PCT/US2013/069489, which corresponds with U.S. Appl. No. 14/609,006, 12 pages. |
International Preliminary Report on Patentability, dated Jun. 30, 2015, received in International Patent Application No. PCT/US2013/069489, which corresponds with U.S. Appl. No. 14/609,006, 10 pages. |
Extended European Search Report, dated Mar. 15, 2017, received in European Patent Application No. 17153418.3, which corresponds with U.S. Appl. No. 14/536,648, 7 pages. |
Search Report, dated Apr. 13, 2017, received in Dutch Patent Application No. 2016452, which corresponds with U.S. Appl. No. 14/864,737, 22 pages. |
Search Report, dated Jun. 22, 2017, received in Dutch Patent Application No. 2016375, which corresponds with U.S. Appl. No. 14/866,981, 17 pages. |
International Search Report and Written Opinion, dated Oct. 14, 2016, received in International Patent Application No. PCT/US2016/020697, which corresponds with U.S. Appl. No. 14/866,981, 21 pages. |
Search Report, dated Jun. 19, 2017, received in Dutch Patent Application No. 2016377, which corresponds with U.S. Appl. No. 14/866,159, 13 pages. |
International Search Report and Written Opinion, dated Apr. 25, 2016, received in International Patent Application No. PCT/US2016/018758, which corresponds with U.S. Appl. No. 14/866,159, 15 pages. |
Extended European Search Report, dated Oct. 17, 2017, received in European Patent Application No. 17184437.6, which corresponds with U.S. Appl. No. 14/868,078, 8 pages. |
Search Report, dated Apr. 13, 2017, received in Dutch Patent Application No. 2016376, which corresponds with U.S. Appl. No. 14/868,078, 15 pages. |
International Search Report and Written Opinion, dated Jul. 21, 2016, received in International Patent Application No. PCT/US2016/019913, which corresponds with U.S. Appl. No. 14/868,078, 16 pages. |
Search Report, dated Apr. 18, 2017, received in Dutch Patent Application No. 2016801, which corresponds with U.S. Appl. No. 14/863,432, 34 pages. |
International Search Report and Written Opinion, dated Oct. 31, 2016, received in International Patent Application No. PCT/US2016/033578, which corresponds with U.S. Appl. No. 14/863,432, 36 pages. |
International Search Report and Written Opinion, dated Nov. 14, 2016, received in International Patent Application No. PCT/US2016/033541, which corresponds with U.S. Appl. No. 14/866,511, 29 pages. |
Extended European Search Report, dated Aug. 17, 2018, received in European Patent Application No. 18175195.9, which corresponds with U.S. Appl. No. 14/869,899, 13 pages. |
International Search Report and Written Opinion, dated Aug. 29, 2016, received in International Patent Application No. PCT/US2016/021400, which corresponds with U.S. Appl. No. 14/869,899, 48 pages. |
International Preliminary Report on Patentability, dated Sep. 12, 2017, received in International Patent Application No. PCT/US2016/021400, which corresponds with U.S. Appl. No. 14/869,899, 39 pages. |
International Search Report and Written Opinion, dated Jan. 12, 2017, received in International Patent Application No. PCT/US2016/046419, which corresponds with U.S. Appl. No. 14/866,992, 23 pages. |
International Search Report and Written Opinion, dated Dec. 15, 2016, received in International Patent Application No. PCT/US2016/046403, which corresponds with U.S. Appl. No. 15/009,661, 17 pages. |
International Search Report and Written Opinion, dated Feb. 27, 2017, received in International Patent Application No. PCT/US2016/046407, which corresponds with U.S. Appl. No. 15/009,688, 30 pages. |
International Preliminary Report on Patentability, dated Feb. 13, 2018, received in International Patent Application No. PCT/US2016/046407, which corresponds with U.S. Appl. No. 15/009,688, 20 pages. |
Search Report, dated Feb. 15, 2018, received in Dutch Patent Application No. 2019215, which corresponds with U.S. Appl. No. 14/864,529, 13 pages. |
Search Report, dated Feb. 15, 2018, received in Dutch Patent Application No. 2019214, which corresponds with U.S. Appl. No. 14/864,601, 12 pages. |
Extended European Search Report, dated Oct. 10, 2017, received in European Patent Application No. 17188507.2, which corresponds with U.S. Appl. No. 14/866,361, 9 pages. |
Extended European Search Report, dated Jun. 22, 2017, received in European Patent Application No. 16189421.7, which corresponds with U.S. Appl. No. 14/866,987, 7 pages. |
Extended European Search Report, dated Sep. 11, 2017, received in European Patent Application No. 17163309.2, which corresponds with U.S. Appl. No. 14/866,987, 8 pages. |
Extended European Search Report, dated Jun. 8, 2017, received in European Patent Application No. 16189425.8, which corresponds with U.S. Appl. No. 14/866,989, 8 pages. |
Extended European Search Report, dated Aug. 2, 2018, received in European Patent Application No. 18168941.5, which corresponds with U.S. Appl. No. 14/871,236, 11 pages. |
Extended European Search Report, dated Jul. 25, 2017, received in European Patent Application No. 17171972.7, which corresponds with U.S. Appl. No. 14/870,882, 12 pages. |
Extended European Search Report, dated Jul. 25, 2017, received in European Patent Application No. 17172266.3, which corresponds with U.S. Appl. No. 14/871,336, 9 pages. |
Extended European Search Report, dated Dec. 21, 2016, received in European Patent Application No. 16189790.5, which corresponds with U.S. Appl. No. 14/871,462, 8 pages. |
International Search Report and Written Opinion, dated Jan. 3, 2017, received in International Patent Application No. PCT/US2016/046214, which corresponds with U.S. Appl. No. 15/231,745, 25 pages. |
Extended European Search Report, dated May 30, 2018, received in European Patent Application No. 18155939.4, which corresponds with U.S. Appl. No. 15/272,327, 8 pages. |
Extended European Search Report, dated Mar. 2, 2018, received in European Patent Application No. 17206374.5, which corresponds with U.S. Appl. No. 15/272,343, 11 pages. |
Apple, “Apple—September Event 2014”, https://www.youtube.com/watch?v=38lqQpqwPe7s, Sep. 10, 2014, 5 pages. |
Jauregui, “Design and Evaluation of 3D Cursors and Motion Parallax for the Exploration of Desktop Virtual Environments”, IEEE Symposium on 3D User Interfaces 2012, Mar. 4, 2012, 8 pages. |
Nickinson, “Inside Android 4.2: Notifications and Quick Settings”, https://www.androidcentral.com/inside-android-42-notifications-and-quick-settings, Nov. 3, 2012, 3 pages. |
Ogino, “iOS 7 Design Standard”, Japan, Impress Japan Corporation, 1st edition, Nov. 21, 2013, 2 pages. |
Tweak, “QuickCenter—Add 3D-Touch Shortcuts to Control Center”, https://www.youtube.com/watch?v=8rHOFpGvZFM, Mar. 22, 2016, 2 pages. |
Tweak, “iOS 10 Tweak on iOS 9.0.2 Jailbreak & 9.2.1-9.3 Support: QuickCenter 3D Touch Cydia Tweak!”, https://www.youtube.com/watch?v=opOBr30_Fkl, Mar. 6, 2016, 3 pages. |
Viticci, “Apple Watch: Our Complete Overview—MacStories”, https://www.macstories.net, Sep. 10, 2014, 21 pages. |
Yatani, et al., “SemFeel: A User Interface with Semantic Tactile Feedback for Mobile Touch-Screen Devices”, Proceedings of the 22nd annual ACM symposium on user interface software and technology (UIST '09), Oct. 2009, 10 pages. |
Notice of Allowance, dated May 24, 2019, received in Korean Patent Application No. 2018-7028236, which corresponds with U.S. Appl. No. 14/608,895, 4 pages. |
Office Action, dated Apr. 12, 2019, received in Australian Patent Application No. 2018223021, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Notice of Allowance, dated Jul. 2, 2019, received in U.S. Appl. No. 14/536,644, 5 pages. |
Notice of Allowance, dated Apr. 10, 2019, received in U.S. Appl. No. 14/608,926, 16 pages. |
Notice of Allowance, dated May 21, 2019, received in U.S. Appl. No. 14/608,926, 5 pages. |
Office Action, dated Jun. 6, 2019, received in Australian Patent Application No. 2018256626, which corresponds with U.S. Appl. No. 14/536,646, 3 pages. |
Certificate of Grant, dated Jan. 25, 2019, received in Japanese Patent Application No. 2015-511645, which corresponds with U.S. Appl. No. 14/536,646, 4 pages. |
Office Action, dated Jun. 5, 2019, received in Australian Patent Application No. 2018256616, which corresponds with U.S. Appl. No. 14/536,141, 3 pages. |
Office Action, dated Jul. 5, 2019, received in Japanese Patent Application No. 2017-141953, which corresponds with U.S. Appl. No. 14/536,141, 6 pages. |
Notice of Allowance, dated May 7, 2019, received in Chinese Patent Application No. 201380068295.X, which corresponds with U.S. Appl. No. 14/608,942, 3 pages. |
Office action, dated Apr. 3, 2019, received in Chinese Patent Application No. 201380074060.1, which corresponds with U.S. Appl. No. 14/608,965, 3 pages. |
Patent, dated May 17, 2019, received in Chinese Patent Application No. 201380074060.1, which corresponds with U.S. Appl. No. 14/608,965, 6 pages. |
Notice of Acceptance, dated Apr. 29, 2019, received in Australian Patent Application No. 2018204236, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Final Office Action, dated May 23, 2019, received in U.S. Appl. No. 14/609,006, 14 pages. |
Notice of Allowance, dated Jul. 2, 2019, received in U.S. Appl. No. 14/536,648, 5 pages. |
Intention to Grant, dated Apr. 1, 2019, received in European Patent Application No. 17153418.3, which corresponds with U.S. Appl. No. 14/536,648, 7 pages. |
Notice of Allowance, dated Apr. 9, 2019, received in Japanese Patent Application No. 2017-113598, which corresponds with U.S. Appl. No. 14/609,042, 5 pages. |
Patent, dated Apr. 19, 2019, received in Japanese Patent Application No. 2017-113598, which corresponds with U.S. Appl. No. 14/609,042, 2 pages. |
Notice of Allowance, dated Apr. 17, 2019, received in Chinese Patent Application No. 201610159295.6, which corresponds with U.S. Appl. No. 14/864,737, 3 pages. |
Patent, dated May 31, 2019, received in Chinese Patent Application No. 201610159295.6, which corresponds with U.S. Appl. No. 14/864,737, 7 pages. |
Notice of Acceptance, dated Jun. 21, 2019, received in Australian Patent Application No. 2017258967, which corresponds with U.S. Appl. No. 14/868,078, 3 pages. |
Notice of Allowance, dated May 6, 2019, received in Chinese Patent Application No. 201610130348.1, which corresponds with U.S. Appl. No. 14/868,078, 3 pages. |
Intention to Grant, dated May 10, 2019, received in European Patent Application No. 16708916.8, which corresponds with U.S. Appl. No. 14/868,078, 5 pages. |
Intention to Grant, dated May 22, 2019, received in European Patent Application No. 17184437.6, which corresponds with U.S. Appl. No. 14/868,078, 7 pages. |
Office Action, dated Jun. 17, 2019, received in Chinese Patent Application No. 201610342313.4, which corresponds with U.S. Appl. No. 14/863,432, 4 pages. |
Intention to Grant, dated Jul. 5, 2019, received in European Patent Application No. 16727900.9, which corresponds with U.S. Appl. No. 14/866,511, 5 pages. |
Office Action, dated May 8, 2019, received in European Patent Application No. 18168939.9, which corresponds with U.S. Appl. No. 14/869,899, 10 pages. |
Office Action, dated May 23, 2019, received in European Patent Application No. 18175195.9, which corresponds with U.S. Appl. No. 14/869,899, 10 pages. |
Patent, dated Apr. 5, 2019, received in Japanese Patent Application No. 2018-100827, which corresponds with U.S. Appl. No. 14/869,899, 5 pages. |
Office Action, dated Mar. 22, 2019, received in Korean Patent Application No. 2018-7017213, which corresponds with U.S. Appl. No. 14/869,899, 6 pages. |
Patent, dated May 10, 2019, received in Korean Patent Application No. 2018-7017213, which corresponds with U.S. Appl. No. 14/869,899, 8 pages. |
Examiner's Answer, dated May 9, 2019, received in U.S. Appl. No. 14/866,992, 26 pages. |
Certificate of Grant, dated May 9, 2019, received in Australian Patent Application No. 201761478, which corresponds with U.S. Appl. No. 14/866,992, 3 pages. |
Summons, dated May 8, 2019, received in European Patent Application No. 16758008.3, which corresponds with U.S. Appl. No. 14/866,992, 14 pages. |
Notice of Allowance, dated Jun. 18, 2019, received in Japanese Patent Application No. 2018-506425, which corresponds with U.S. Appl. No. 14/866,992, 5 pages. |
Office Action, dated Jun. 28, 2019, received in U.S. Appl. No. 15/009,661, 33 pages. |
Final Office Action, dated Apr. 17, 2019, received in U.S. Appl. No. 14/856,520, 38 pages. |
Certificate of Grant, dated May 16, 2019, received in Australian Patent Application No. 2017202816, which corresponds with U.S. Appl. No. 14/857,636, 4 pages. |
Notice of Allowance, dated May 10, 2019, received in Korean Patent Application No. 2017-7036645, which corresponds with U.S. Appl. No. 14/857,636, 4 pages. |
Office Action, dated Jul. 1, 2019, received in Australian Patent Application No. 2019200872, which corresponds with U.S. Appl. No. 14/864,580, 6 pages. |
Notice of Allowance, dated Jun. 14, 2019, received in Chinese Patent Application No. 201610342151.4, which corresponds with U.S. Appl. No. 14/864,580, 3 pages. |
Patent, dated Jun. 25, 2019, received in Korean Patent Application No. 2017-7033756, which corresponds with U.S. Appl. No. 14/864,601, 8 pages. |
Notice of Allowance, dated May 29, 2019, received in Korean Patent Application No. 2017-7033756, which corresponds with U.S. Appl. No. 14/864,601, 6 pages. |
Notice of Allowance, dated May 23, 2019, received in Chinese Patent Application No. 201610189298.4, which corresponds with U.S. Appl. No. 14/866,361, 3 pages. |
Office Action, dated Jun. 10, 2019, received in Japanese Patent Application No. 2017-141962, which corresponds with U.S. Appl. No. 14/866,361, 6 pages. |
Patent, dated Apr. 3, 2019, received in Korean Patent Application No. 2018-7013039, which corresponds with U.S. Appl. No. 14/866,361, 4 pages. |
Notice of Allowance, dated Apr. 4, 2019, received in U.S. Appl. No. 14/866,987, 5 pages. |
Rejection Decision, dated Apr. 28, 2019, received in Chinese Patent Application No. 201610342336.5, which corresponds with U.S. Appl. No. 14/866,987, 4 pages. |
Intention to Grant, dated Jun. 14, 2019, received in European Patent Application No. 16189421.7, which corresponds with U.S. Appl. No. 14/866,987, 7 pages. |
Certificate of Grant, dated Jun. 13, 2019, received in Australian Patent Application No. 2017201079, which corresponds with U.S. Appl. No. 14/866,989, 1 page. |
Rejection Decision, dated Apr. 24, 2019, received in Chinese Patent Application No. 201610342314.9, which corresponds with U.S. Appl. No. 14/866,989, 3 pages. |
Notice of Allowance, dated Jun. 5, 2019, received in Chinese Patent Application No. 201680000466.9, which corresponds with U.S. Appl. No. 14/871,227, 5 pages. |
Notice of Allowance, dated Apr. 4, 2019, received in U.S. Appl. No. 14/869,997, 9 pages. |
Notice of Allowance, dated May 21, 2019, received in Chinese Patent Application No. 201610131507.X, which corresponds with U.S. Appl. No. 14/867,990, 3 pages. |
Notice of Allowance, dated May 1, 2019, received in U.S. Appl. No. 15/009,668, 12 pages. |
Certificate of Grant, dated May 23, 2019, received in Australian Patent Application No. 2017202058, which corresponds with U.S. Appl. No. 15/081,771, 1 page. |
Office Action, dated Apr. 17, 2019, received in European Patent Application No. 18171453.6, which corresponds with U.S. Appl. No. 15/136,782, 4 pages. |
Patent, dated Mar. 22, 2019, received in Japanese Patent Application No. 2018-062161, which corresponds with U.S. Appl. No. 15/136,782, 5 pages. |
Patent, dated Apr. 3, 2019, received in Korean Patent Application No. 2018-7020659, which corresponds with U.S. Appl. No. 15/136,782, 5 pages. |
Decision to Grant, dated Apr. 26, 2019, received in European Patent Application No. 15155939.4, which corresponds with U.S. Appl. No. 15/272,327, 2 pages. |
Patent, dated May 22, 2019, received in European Patent Application No. 15155939.4, which corresponds with U.S. Appl. No. 15/272,327, 1 page. |
Office Action, dated Jun. 5, 2019, received in Chinese Patent Application No. 201810071627.4, which corresponds with U.S. Appl. No. 15/272,343, 6 pages. |
Intention to Grant, dated May 13, 2019, received in European Patent Application No. 17206374.5, which corresponds with U.S. Appl. No. 15/272,343, 7 pages. |
Final Office Action, dated Apr. 2, 2019, received in U.S. Appl. No. 15/272,345, 28 pages. |
Final Office Action, dated Jul. 1, 2019, received in U.S. Appl. No. 15/655,749, 24 pages. |
Notice of Allowance, dated Apr. 18, 2019, received in Korean Patent Application No. 2017-7034248, which corresponds with U.S. Appl. No. 15/655,749, 5 pages. |
Office Action, dated Apr. 11, 2019, received in U.S. Appl. No. 15/889,115, 9 pages. |
Office Action, dated May 31, 2019, received in Australian Patent Application No. 2018253539, which corresponds with U.S. Appl. No. 16/049,725, 3 pages. |
Office Action, dated May 22, 2019, received in U.S. Appl. No. 16/230,743, 7 pages. |
Notice of Allowance, dated Apr. 19, 2019, received in U.S. Appl. No. 16/252,478, 11 pages. |
Anonymous, “Acer Liquid Z5 Duo User's Manual”, https://global-download.acer.com, Feb. 21, 2014, 65 pages. |
Jauregui et al., “Design and Evaluation of 3D Cursors and Motion Parallax for the Exploration of Desktop Virtual Environments”, IEEE Symposium on 3D User Interface 2012, Mar. 4, 2012, 8 pages. |
Neuburg, “Detailed Explanation iOS SDK”, Oreilly Japan, Dec. 22, 2014, vol. 4, p. 175-186, 15 pages. |
Ogino, “iOS 7 Design Standard”, Japan, Impress Japan Corporation, 1st edition, Nov. 21, 2013, pp. 58-59. |
Plaisant et al., “Touchscreen Toggle Design”, Proceedings of CHI '92, pp. 667-668, May 3-7, 1992, 2 pages. |
Rubino et al., “How to Enable ‘Living Images’ on your Nokia Lumia with Windows Phone 8.1”, https://www.youtube.com/watch?v=RX7vpoFy1Dg, Jun. 6, 2014, 5 pages. |
Tweak, UltimateiDeviceVids, Cydia Tweak: Quick Center—Add 3D-Touch Shortcuts to ControlCenter, https://www.youtube.com/watch?v=6rHOFpGvZFM, Mar. 22, 2016, 2 pages. |
Tweak, “iCrackUriDevice, iOS 9.0.2 Jailbreak & 9.2.1-9.3 Support: QuickCenter 3D Touch Cydia Tweak!”, https://www.youtube.com/watch?v=op-OBr3O_Fkl, Mar. 6, 2016, 3 pages. |
UpDown-G, “Using Multiple Selection Mode in Android 4.0 / Getting Started”, https://techbooster.org/android/13946, Mar. 7, 2012, 7 pages. |
YouTube, “How to Use 3D Touch Multitasking on iPhone”, https://www.youtube.com/watch?v=kDq05uRdrCg, Sep. 29, 2015, 1 page. |
Patent, dated Dec. 25, 2018, received in Chinese Patent Application No. 201380068493.6, which corresponds with U.S. Appl. No. 14/608,895, 4 pages. |
Certificate of Grant, dated Dec. 26, 2018, received in European Patent Application No. 13795391.5, which corresponds with U.S. Appl. No. 14/536,426, 4 pages. |
Notice of Allowance, dated Aug. 15, 2018, received in U.S. Appl. No. 14/536,235, 5 pages. |
Notice of Allowance, dated Aug. 8, 2018, received in Chinese Patent Application No. 201510566550.4, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Patent, dated Oct. 23, 2018, received in Chinese Patent Application No. 201510566550.4, which corresponds with U.S. Appl. No. 14/536,426, 4 pages. |
Decision to Grant, dated Jan. 10, 2019, received in European Patent Application No. 15183980.0, which corresponds with U.S. Appl. No. 14/536,426, 4 pages. |
Patent, dated Feb. 6, 2019, received in European Patent Application No. 15183980.0, which corresponds with U.S. Appl. No. 14/536,426, 4 pages. |
Office Action, dated Nov. 2, 2018, received in U.S. Appl. No. 14/536,644, 24 pages. |
Office Action, dated Feb. 22, 2019, received in Japanese Patent Application No. 2018-079290, which corresponds with U.S. Appl. No. 14/608,926, 7 pages. |
Patent, dated Oct. 23, 2018, received in Chinese Patent Application No. 201380035893.7, which corresponds with U.S. Appl. No. 14/536,141, 4 pages. |
Office Action, dated Mar. 7, 2019, received in European Patent Application No. 13726053.5, which corresponds with U.S. Appl. No. 14/536,141, 5 pages. |
Notice of Allowance, dated Jan. 15, 2019, received in Korean Patent Application No. 2015-7018448, which corresponds with U.S. Appl. No. 14/608,942, 5 pages. |
Patent, dated Mar. 8, 2019, received in Korean Patent Application No. 2015-7018448, which corresponds with U.S. Appl. No. 14/608,942, 4 pages. |
Intention to Grant, dated Mar. 18, 2019, received in European Patent Application No. 13724104.8, which corresponds with U.S. Appl. No. 14/536,203, 9 pages. |
Final Office Action, dated Jan. 10, 2019, received in U.S. Appl. No. 14/608,965, 17 pages. |
Office action, dated Nov. 1, 2018, received in Chinese Patent Application No. 201380074060.1, which corresponds with U.S. Appl. No. 14/608,965, 3 pages. |
Office Action, dated Mar. 15, 2019, received in Australian Patent Application No. 2018204236, which corresponds with U.S. Appl. No. 14/536,267, 5 pages. |
Office Action, dated Nov. 28, 2018, received in Chinese Patent Application No. 201610537334.1, which corresponds with U.S. Appl. No. 14/536,267, 5 pages. |
Grant Certificate, dated Nov. 14, 2018, received in European Patent Application No. 13724106.3, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Decision to Grant, dated Nov. 29, 2018, received in European Patent Application No. 16177863.4, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Patent, dated Dec. 26, 2018, received in European Patent Application No. 16177863.4, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Office Action, dated Feb. 4, 2019, received in Japanese Patent Application No. 2017-237035, which corresponds with U.S. Appl. No. 14/536,267, 7 pages. |
Patent, dated Oct. 24, 2016, received in Korean Patent Application No. 2014-7034530, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Office Action, dated Jan. 29, 2018, received in Korean Patent Application No. 2017-7034838, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Notice of Allowance, dated Dec. 3, 2018, received in Korean Patent Application No. 2017-7034838, which corresponds with U.S. Appl. No. 14/536,267, 5 pages. |
Patent, dated Mar. 4, 2019, received in Korean Patent Application No. 2017-7034838, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Patent, dated Nov. 30, 2018, received in Australian Patent Application No. 2016216658, which corresponds with U.S. Appl. No. 14/536,291, 4 pages. |
Intention to Grant, dated Jan. 8, 2019, received in European Patent Application No. 17186744.3, which corresponds with U.S. Appl. No. 14/536,291, 7 pages. |
Patent, dated Feb. 22, 2019, received in Japanese Patent Application No. 2017-083027, which corresponds with U.S. Appl. No. 14/536,291, 3 pages. |
Notice of Allowance, dated Jan. 15, 2019, received in Japanese Patent Application No. 2017-083027, which corresponds with U.S. Appl. No. 14/536,291, 5 pages. |
Intention to Grant, dated Jan. 16, 2019, received in European Patent Application No. 13811032.5, which corresponds with U.S. Appl. No. 14/608,985, 9 pages. |
Office Action, dated Jan. 2, 2019, received in U.S. Appl. No. 14/536,648, 12 pages. |
Notice of Allowance, dated Feb. 4, 2019, received in Japanese Patent Application No. 2017-008764, which corresponds with U.S. Appl. No. 14/536,648, 5 pages. |
Patent, dated Mar. 1, 2019, received in Japanese Patent Application No. 2017-008764, which corresponds with U.S. Appl. No. 14/536,648, 3 pages. |
Notice of Allowance, dated Dec. 17, 2018, received in Korean Patent Application No. 2017-7008614, which corresponds with U.S. Appl. No. 14/609,042, 5 pages. |
Notice of Acceptance, dated Mar. 12, 2019, received in Australian Patent Application No. 2016233792, which corresponds with U.S. Appl. No. 14/864,737, 5 pages. |
Patent, dated Dec. 26, 2018, received in Korean Patent Application No. 2017-7030129, which corresponds with U.S. Appl. No. 14/864,737, 4 pages. |
Office Action, dated Nov. 5, 2018, received in Chinese Patent Application No. 201610131415.1, which corresponds with U.S. Appl. No. 14/866,981, 6 pages. |
Notice of Allowance, dated Dec. 6, 2018, received in Chinese Patent Application No. 201610137839.9, which corresponds with U.S. Appl. No. 14/866,159, 3 pages. |
Patent, dated Feb. 19, 2019, received in Chinese Patent Application No. 201610137839.9, which corresponds with U.S. Appl. No. 14/866,159, 6 pages. |
Office Action, dated Feb. 7, 2019, received in Australian Patent Application No. 2017258967, which corresponds with U.S. Appl. No. 14/868,078, 3 pages. |
Office Action, dated Feb. 26, 2019, received in Chinese Patent Application No. 201610130348.1, which corresponds with U.S. Appl. No. 14/868,078, 4 pages. |
Office Action, dated Dec. 4, 2018, received in Chinese Patent Application No. 201610342313.4, which corresponds with U.S. Appl. No. 14/863,432, 5 pages. |
Office Action, dated Dec. 5, 2018, received in Chinese Patent Application No. 201610342264.4, which corresponds with U.S. Appl. No. 14/866,511, 4 pages. |
Office Action, dated Jan. 2, 2019, received in European Patent Application No. 16727900.9, which corresponds with U.S. Appl. No. 14/866,511, 5 pages. |
Patent, dated Feb. 26, 2019, received in Danish Patent Application No. 201670594, which corresponds with U.S. Appl. No. 14/869,899, 3 pages. |
Notice of Allowance, dated Mar. 1, 2019, received in Japanese Patent Application No. 2018-100827, which corresponds with U.S. Appl. No. 14/869,899, 5 pages. |
Notice of Acceptance, dated Mar. 12, 2019, received in Australian Patent Application No. 2016304890, which corresponds with U.S. Appl. No. 14/866,992, 5 pages. |
Office Action, dated Jan. 11, 2019, received in Japanese Patent Application No. 2018-506425, which corresponds with U.S. Appl. No. 14/866,992, 6 pages. |
Notice of Allowance, dated Nov. 15, 2018, received in U.S. Appl. No. 15/009,676, 6 pages. |
Office Action, dated Nov. 20, 2018, received in U.S. Appl. No. 14/856,520, 36 pages. |
Notice of Allowance, dated Aug. 16, 2018, received in U.S. Appl. No. 14/857,636, 5 pages. |
Notice of Allowance, dated Jan. 15, 2019, received in Australian Patent Application No. 2017202816, which corresponds with U.S. Appl. No. 14/857,636, 3 pages. |
Office Action, dated Nov. 28, 2018, received in Korean Patent Application No. 2017-7036645, which corresponds with U.S. Appl. No. 14/857,636, 6 pages. |
Notice of Allowance, dated Aug. 16, 2018, received in U.S. Appl. No. 14/857,663, 5 pages. |
Certificate of Grant, dated Feb. 21, 2019, received in Australian Patent Application No. 2016276030, which corresponds with U.S. Appl. No. 14/864,601, 4 pages. |
Office Action, dated Feb. 4, 2019, received in European Patent Application No. 16730554.9, which corresponds with U.S. Appl. No. 14/864,601, 10 pages. |
Notice of Allowance, dated Dec. 10, 2018, received in Japanese Patent Application No. 2017-561375, which corresponds with U.S. Appl. No. 14/864,601, 5 pages. |
Patent, dated Jan. 11, 2019, received in Japanese Patent Application No. 2017-561375, which corresponds with U.S. Appl. No. 14/864,601, 3 pages. |
Office Action, dated Jan. 25, 2019, received in Korean Patent Application No. 2017-7033756, which corresponds with U.S. Appl. No. 14/864,601, 8 pages. |
Office Action, dated Jan. 30, 2019, received in European Patent Application No. 17188507.2, which corresponds with U.S. Appl. No. 14/866,361, 13 pages. |
Notice of Allowance, dated Jan. 30, 2019, received in Korean Patent Application No. 2018-7013039, which corresponds with U.S. Appl. No. 14/866,361, 5 pages. |
Office Action, dated Dec. 4, 2018, received in Chinese Patent Application No. 201610342336.5, which corresponds with U.S. Appl. No. 14/866,987, 5 pages. |
Office Action, dated Dec. 11, 2018, received in European Patent Application No. 16189421.7, which corresponds with U.S. Appl. No. 14/866,987, 6 pages. |
Notice of Allowance, dated Jan. 17, 2019, received in U.S. Appl. No. 14/866,989, 8 pages. |
Notice of Acceptance, dated Feb. 14, 2019, received in Australian Patent Application No. 2017201079, which corresponds with U.S. Appl. No. 14/866,989, 3 pages. |
Office Action, dated Feb. 25, 2019, received in Chinese Patent Application No. 201610342314.9, which corresponds with U.S. Appl. No. 14/866,989, 3 pages. |
Patent, dated Feb. 15, 2019, received in Russian Patent Application No. 2017131408, which corresponds with U.S. Appl. No. 14/871,236, 2 pages. |
Notice of Allowance, dated Dec. 3, 2018, received in U.S. Appl. No. 14/870,754, 8 pages. |
Notice of Allowance, dated Dec. 5, 2018, received in U.S. Appl. No. 14/870,882, 8 pages. |
Office Action, dated Feb. 11, 2019, received in European Patent Application No. 17171972.7, which corresponds with U.S. Appl. No. 14/870,882, 7 pages. |
Notice of Allowance, dated Aug. 27, 2018, received in U.S. Appl. No. 14/870,988, 11 pages. |
Certificate of Grant, dated Feb. 28, 2019, received in Australian Patent Application No. 2016203040, which corresponds with U.S. Appl. No. 14/871,227, 1 page. |
Office Action, dated Nov. 16, 2018, received in Chinese Patent Application No. 201680000466.9, which corresponds with U.S. Appl. No. 14/871,227, 5 pages. |
Patent, dated Nov. 28, 2018, received in European Patent Application No. 16711743.1, which corresponds with U.S. Appl. No. 14/871,227, 1 page. |
Patent, dated Dec. 28, 2018, received in Korean Patent Application No. 2016-7019816, which corresponds with U.S. Appl. No. 14/871,227, 8 pages. |
Notice of Allowance, dated Feb. 5, 2019, received in U.S. Appl. No. 14/871,336, 10 pages. |
Office Action, dated Feb. 12, 2019, received in European Patent Application No. 17172266.3, which corresponds with U.S. Appl. No. 14/871,336, 6 pages. |
Patent, dated Nov. 16, 2018, received in Japanese Patent Application No. 2018-020324, which corresponds with U.S. Appl. No. 14/871,336, 4 pages. |
Final Office Action, dated Oct. 17, 2018, received in U.S. Appl. No. 14/867,892, 48 pages. |
Office Action, dated Feb. 27, 2019, received in U.S. Appl. No. 14/869,361, 28 pages. |
Notice of Allowance, dated Mar. 12, 2019, received in U.S. Appl. No. 14/869,703, 6 pages. |
Office Action, dated Jan. 10, 2019, received in U.S. Appl. No. 15/009,668, 17 pages. |
Notice of Acceptance, dated Jan. 24, 2019, received in Australian Patent Application No. 2017202058, which corresponds with U.S. Appl. No. 15/081,771, 3 pages. |
Certificate of Grant, dated Jan. 17, 2019, received in Australian Patent Application No. 2018202855, which corresponds with U.S. Appl. No. 15/136,782, 4 pages. |
Office Action, dated Nov. 12, 2018, received in Japanese Patent Application No. 2018-062161, which corresponds with U.S. Appl. No. 15/136,782, 5 pages. |
Notice of Allowance, dated Feb. 18, 2019, received in Japanese Patent Application No. 2018-062161, which corresponds with U.S. Appl. No. 15/136,782, 5 pages. |
Notice of Allowance, dated Feb. 25, 2019, received in Korean Patent Application No. 2018-7020659, which corresponds with U.S. Appl. No. 15/136,782, 5 pages. |
Office Action, dated Dec. 18, 2018, received in Danish Patent Application No. 201670587, which corresponds with U.S. Appl. No. 15/231,745, 4 pages. |
Office Action, dated Nov. 23, 2018, received in Danish Patent Application No. 201670591, which corresponds with U.S. Appl. No. 15/231,745, 7 pages. |
Notice of Allowance, dated Oct. 4, 2018, received in U.S. Appl. No. 15/272,327, 46 pages. |
Office Action, dated Mar. 22, 2019, received in Australian Patent Application No. 2018204234, which corresponds with U.S. Appl. No. 15/272,327, 7 pages. |
Intention to Grant, dated Mar. 19, 2019, received in European Patent Application No. 15155939.4, which corresponds with U.S. Appl. No. 15/272,327, 6 pages. |
Final Office Action, dated Mar. 25, 2019, received in U.S. Appl. No. 15/272,341, 25 pages. |
Office Action, dated Jan. 8, 2019, received in European Patent Application No. 17206374.5, which corresponds with U.S. Appl. No. 15/272,343, 5 pages. |
Office Action, dated Nov. 13, 2018, received in European Patent Application No. 16756862.5, which corresponds with U.S. Appl. No. 15/272,345, 5 pages. |
Decision to Grant, dated Jan. 31, 2019, received in European Patent Application No. 16756862.5, which corresponds with U.S. Appl. No. 15/272,345, 5 pages. |
Patent, dated Feb. 27, 2019, received in European Patent Application No. 16756862.5, which corresponds with U.S. Appl. No. 15/272,345, 3 pages. |
Notice of Allowance, dated Aug. 15, 2018, received in U.S. Appl. No. 15/482,618, 7 pages. |
Office Action, dated Jan. 24, 2019, received in U.S. Appl. No. 15/655,749, 25 pages. |
Extended European Search Report, dated Dec. 5, 2018, received in European Application No. 18194127.9, which corresponds with U.S. Appl. No. 14/608,942, 8 pages. |
Extended European Search Report, dated Mar. 8, 2019, received in European Patent Application No. 18205283.7, which corresponds with U.S. Appl. No. 15/081,771, 15 pages. |
Extended European Search Report, dated Aug. 24, 2018, received in European Patent Application No. 18171453.6, which corresponds with U.S. Appl. No. 15/136,782, 9 pages. |
Patent, dated Jul. 9, 2019, received in Korean Patent Application No. 2018-7028236, which corresponds with U.S. Appl. No. 14/608,895, 4 pages. |
Certificate of Grant, dated Jul. 5, 2019, received in Hong Kong Patent Application No. 15108892.5, which corresponds with U.S. Appl. No. 14/536,426, 5 pages. |
Notice of Acceptance, dated Aug. 1, 2019, received in Australian Patent Application No. 2018256626, which corresponds with U.S. Appl. No. 14/536,646, 3 pages. |
Patent, dated Jul. 5, 2019, received in Chinese Patent Application No. 201380068295.X, which corresponds with U.S. Appl. No. 14/608,942, 8 pages. |
Certificate of Grant, dated Jul. 26, 2019, received in Hong Kong, which corresponds with U.S. Appl. No. 14/608,942, 4 pages. |
Office Action, dated Aug. 20, 2018, received in Australian Patent Application No. 2018250481, which corresponds with U.S. Appl. No. 14/536,203, 2 pages. |
Decision to Grant, dated Aug. 8, 2019, received in European Patent Application No. 13724104.8, which corresponds with U.S. Appl. No. 14/536,203, 1 page. |
Office Action, dated Jul. 11, 2019, received in Chinese Patent Application No. 201610537334.1, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Decision to Grant, dated Aug. 1, 2019, received in European Patent Application No. 13811032.5, which corresponds with U.S. Appl. No. 14/608,985, 2 pages. |
Decision to Grant, dated Aug. 16, 2019, received in European Patent Application No. 17153418.3, which corresponds with U.S. Appl. No. 14/536,648, 1 page. |
Certificate of Grant, dated Jul. 4, 2019, received in Australian Patent Application No. 2016233792, which corresponds with U.S. Appl. No. 14/864,737, 1 page. |
Office Action, dated Jul. 16, 2019, received in Chinese Patent Application No. 201610131415.1, which corresponds with U.S. Appl. No. 14/866,981, 4 pages. |
Patent, dated Jul. 5, 2019, received in Chinese Patent Application No. 201610130348.1, which corresponds with U.S. Appl. No. 14/868,078, 6 pages. |
Office Action, dated Jul. 11, 2019, received in Chinese Patent Application No. 201610342264.4, which corresponds with U.S. Appl. No. 14/866,511, 4 pages. |
Certificate of Grant, dated Jul. 4, 2019, received in Australian Patent Application No. 2016304890, which corresponds with U.S. Appl. No. 14/866,992, 1 page. |
Patent, dated Jul. 26, 2019, received in Japanese Patent Application No. 2018506425, which corresponds with U.S. Appl. No. 14/866,992, 3 pages. |
Patent, dated Jul. 11, 2019, received in Korean Patent Application No. 2017-7036645, which corresponds with U.S. Appl. No. 14/857,636, 8 pages. |
Patent, dated Jul. 30, 2019, received in Chinese Patent Application No. 201610342151.4, which corresponds with U.S. Appl. No. 14/864,580, 3 pages. |
Notice of Allowance, dated Aug. 14, 2019, received in Korean Patent Application No. 2019-7018317, which corresponds with U.S. Appl. No. 14/864,580, 6 pages. |
Intention to Grant, dated Jul. 18, 2019, received in European Patent Application No. 16730554.9, which corresponds with U.S. Appl. No. 14/864,601, 5 pages. |
Patent, dated Jul. 23, 2019, received in Chinese Patent Application No. 201610189298.4, which corresponds with U.S. Appl. No. 14/866,361, 7 pages. |
Office Action, dated Aug. 15, 2019, received in Chinese Patent Application No. 201610342336.5, which corresponds with U.S. Appl. No. 14/866,987, 3 pages. |
Patent, dated Aug. 9, 2019, received in Chinese Patent Application No. 201680000466.9, which corresponds with U.S. Appl. No. 14/871,227, 8 pages. |
Examiner's Answer, dated Jul. 18, 2019, received in U.S. Appl. No. 14/867,892, 17 pages. |
Patent, dated Jul. 19, 2019, received in Chinese Patent Application No. 201610131507.X, which corresponds with U.S. Appl. No. 14/867,990, 6 pages. |
Office Action, dated Aug. 2, 2019, received in Korean Patent Application No. 2019-7009439, which corresponds with U.S. Appl. No. 15/499,693, 3 pages. |
Patent, dated Jul. 3, 2019, received in Korean Patent Application No. 2017-7034248, which corresponds with U.S. Appl. No. 15/655,749, 5 pages. |
Office Action, dated Aug. 1, 2019, received in U.S. Appl. No. 15/785,372, 22 pages. |
Office Action, dated Jul. 25, 2019, received in U.S. Appl. No. 15/979,347, 14 pages. |
Office Action, dated Aug. 20, 2019, received in Korean Patent Application No. 2019-7019946, which corresponds with U.S. Appl. No. 16/154,591, 6 pages. |
Office Action, dated Jul. 5, 2019, received in Korean Patent Application No. 2018-7037896, which corresponds with U.S. Appl. No. 16/243,834, 2 pages. |
Office Action, dated Jul. 15, 2019, received in U.S. Appl. No. 16/258,394, 8 pages. |
Borowska, “6 Types of Digital Affordance that Impact Your Ux”, https://www.webdesignerdepot.com/2015/04/6-types-of-digital-affordance-that-implact-your-ux, Apr. 7, 2015, 6 pages. |
Yang, et al., “Affordance Application on Visual Interface Design of Desk-Top Virtual Experiments”, 2014 International Conference on Information Science, Electronics and Electrical Engineering, IEEE, vol. 1, Apr. 26, 2014, 5 pages. |
Office Action, dated Nov. 18, 2019, received in Australian Patent Application No. 2018223021, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Intention to Grant, dated Nov. 8, 2019, received in European Patent Application No. 18194127.9, which corresponds with U.S. Appl. No. 14/608,942, 7 pages. |
Patent, dated Sep. 27, 2019, received in Hong Kong Patent Application No. 15108904.1, which corresponds with U.S. Appl. No. 14/536,203, 6 pages. |
Notice of Allowance, dated Nov. 7, 2019, received in U.S. Appl. No. 14/608,965, 17 pages. |
Patent, dated Aug. 30, 2019, received in Hong Kong Patent Application No. 15107537.8, which corresponds with U.S. Appl. No. 14/536,267, 9 pages. |
Patent, dated Sep. 27, 2019, received in Japanese Patent Application No. 2017-237035, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Decision to Grant, dated Oct. 31, 2019, received in European Patent Application No. 17186744.3, which corresponds with U.S. Appl. No. 14/536,291, 3 pages. |
Intention to Grant, dated Oct. 28, 2019, received in European Patent Application No. 16707356.8, which corresponds with U.S. Appl. No. 14/866,159, 7 pages. |
Certificate of Grant, dated Oct. 17, 2019, received in Australian Patent Application No. 2017258967, which corresponds with U.S. Appl. No. 14/868,078, 4 pages. |
Patent, dated Oct. 9, 2019, received in European Patent Application No. 16708916.8, which corresponds with U.S. Appl. No. 14/868,078, 3 pages. |
Patent, dated Oct. 16, 2019, received in European Patent Application No. 17184437.6, which corresponds with U.S. Appl. No. 14/868,078, 3 pages. |
Office Action, dated Nov. 5, 2019, received in Chinese Patent Application No. 201610342313.4, which corresponds with U.S. Appl. No. 14/863,432, 4 pages. |
Intention to Grant, dated Oct. 25, 2019, received in European Patent Application No. 18168939.9, which corresponds with U.S. Appl. No. 14/869,899, 8 pages. |
Patent, dated Oct. 11, 2019, received in Korean Patent Application No. 2018-7003890, which corresponds with U.S. Appl. No. 14/866,992, 5 pages. |
Office Action, dated Nov. 11, 2019, received in Japanese Patent Application No. 2018-201076, which corresponds with U.S. Appl. No. 14/857,663, 7 pages. |
Patent, dated Oct. 9, 2019, received in European Patent Application No. 16730554.9, which corresponds with U.S. Appl. No. 14/864,601, 3 pages. |
Intention to Grant, dated Oct. 25, 2019, received in European Patent Application No. 16189421.7, which corresponds with U.S. Appl. No. 14/866,987, 7 pages. |
Decision to Grant, dated Nov. 14, 2019, received in European Patent Application No. 16189421.7, which corresponds with U.S. Appl. No. 14/866,987, 2 pages. |
Office Action, dated Nov. 4, 2019, received in Chinese Patent Application No. 201610871323.7, which corresponds with U.S. Appl. No. 14/871,336, 12 pages. |
Notice of Allowance, dated Nov. 1, 2019, received in Japanese Patent Application No. 2018-158502, which corresponds with U.S. Appl. No. 15/231,745, 5 pages. |
Patent, dated Oct. 9, 2019, received in European Patent Application No. 17206374.5, which corresponds with U.S. Appl. No. 15/272,343, 3 pages. |
Office Action, dated Oct. 22, 2019, received in Chinese Patent Application No. 201680022696.5, which corresponds with U.S. Appl. No. 15/272,345, 7 pages. |
Final Office Action, dated Oct. 28, 2019, received in U.S. Appl. No. 15/889,115, 12 pages. |
Notice of Allowance, dated Nov. 6, 2019, received in U.S. Appl. No. 16/258,394, 8 pages. |
Office Action, dated Oct. 11, 2019, received in Australian Patent Application No. 2019202417, 4 pages. |
Notice of Allowance, dated Nov. 1, 2019, received in Korean Patent Application No. 2019-7019100, 5 pages. |
Extended European Search Report, dated Nov. 14, 2019, received in European Patent Application No. 19194418.0, which corresponds with U.S. Appl. No. 14/864,580, 8 pages. |
Extended European Search Report, dated Oct. 28, 2019, received in European Patent Application No. 19195414.8, which corresponds with U.S. Appl. No. 16/240,672, 6 pages. |
Extended European Search Report, dated Nov. 13, 2019, received in European Patent Application No. 19194439.6, which corresponds with U.S. Appl. No. 16/262,800, 12 pages. |
Patent, dated Nov. 22, 2019, received in Hong Kong Patent Application No. 16107033.6, which corresponds with U.S. Appl. No. 14/536,426, 6 pages. |
Certificate of Grant, dated Dec. 5, 2019, received in Australian Patent Application No. 2018256626, which corresponds with U.S. Appl. No. 14/536,646, 3 pages. |
Notice of Allowance, dated Jan. 2, 2020, received in U.S. Appl. No. 14/608,965, 5 pages. |
Office Action, dated Dec. 20, 2019, received in Chinese Patent Application No. 201610537334.1, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Patent, dated Nov. 8, 2019, received in Hong Kong Patent Application No. 15108890.7, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Patent, dated Nov. 27, 2019, received in European Patent Application No. 17186744.3, which corresponds with U.S. Appl. No. 14/536,291, 4 pages. |
Office Action, dated Jan. 7, 2020, received in U.S. Appl. No. 14/609,006, 17 pages. |
Office Action, dated Nov. 21, 2019, received in Chinese Patent Application No. 201680011338.4, which corresponds with U.S. Appl. No. 14/868,078, 8 pages. |
Patent, dated Feb. 8, 2017, received in Chinese Patent Application No. 201620470063.8, which corresponds with U.S. Appl. No. 14/863,432, 5 pages. |
Office Action, dated Jan. 10, 2020, received in Japanese Patent Application No. 2018-243773, which corresponds with U.S. Appl. No. 14/863,432, 6 pages. |
Notice of Allowance, dated Nov. 28, 2019, received in Chinese Patent Application No. 201610342264.4, which corresponds with U.S. Appl. No. 14/866,511, 3 pages. |
Decision to Grant, dated Dec. 5, 2019, received in European Patent Application No. 16727900.9, which corresponds with U.S. Appl. No. 14/866,511, 2 pages. |
Oral Summons, dated Dec. 6, 2019, received in European Patent Application No. 18175195.9, which corresponds with U.S. Appl. No. 14/869,899, 9 pages. |
Office Action, dated Jan. 13, 2020, received in Chinese Patent Application No. 201610658351.8, which corresponds with U.S. Appl. No. 14/866,992, 3 pages. |
Final Office Action, dated Dec. 30, 2019, received in U.S. Appl. No. 15/009,661, 33 pages. |
Notice of Allowance, dated Jan. 6, 2020, received in U.S. Appl. No. 14/856,520, 5 pages. |
Patent, dated Nov. 12, 2019, received in Korean Patent Application No. 2019-7018317, which corresponds with U.S. Appl. No. 14/864,580, 6 pages. |
Patent, dated Nov. 8, 2019, received in Japanese Patent Application No. 2017-141962, which corresponds with U.S. Appl. No. 14/866,361, 4 pages. |
Notice of Allowance, dated Dec. 3, 2019, received in Chinese Patent Application No. 201610342336.5, which corresponds with U.S. Appl. No. 14/866,987, 3 pages. |
Patent, dated Dec. 11, 2019, received in European Patent Application No. 16189421.7, which corresponds with U.S. Appl. No. 14/866,987, 3 pages. |
Intention to Grant, dated Dec. 4, 2019, received in European Patent Application No. 18168941.5, which corresponds with U.S. Appl. No. 14/871,236, 8 pages. |
Office Action, dated Nov. 28, 2019, received in Chinese Patent Application No. 201610870912.3, which corresponds with U.S. Appl. No. 14/870,882, 10 pages. |
Patent, dated Nov. 29, 2019, received in Japanese Patent Application No. 2018-158502, which corresponds with U.S. Appl. No. 15/231,745, 3 pages. |
Notice of Acceptance, dated Dec. 10, 2019, received in Australian Patent Application No. 2018204234, which corresponds with U.S. Appl. No. 15/272,327, 3 pages. |
Notice of Allowance, dated Dec. 11, 2019, received in Chinese Patent Application No. 201810071627.4, which corresponds with U.S. Appl. No. 15/272,343, 4 pages. |
Notice of Allowance, dated Dec. 27, 2019, received in Korean Patent Application No. 2019-7009439, which corresponds with U.S. Appl. No. 15/499,693, 5 pages. |
Office Action, dated Nov. 25, 2019, received in U.S. Appl. No. 16/049,725, 9 pages. |
Office Action, dated Nov. 29, 2019, received in U.S. Appl. No. 16/136,163, 9 pages. |
Office Action, dated Dec. 2, 2019, received in Japanese Patent Application No. 2018-202048, which corresponds with U.S. Appl. No. 16/154,591, 6 pages. |
Office Action, dated Nov. 25, 2019, received in U.S. Appl. No. 16/174,170, 31 pages. |
Office Action, dated Dec. 18, 2019, received in Australian Patent Application No. 2018282409, which corresponds with U.S. Appl. No. 16/243,834, 3 pages. |
Office Action, dated Dec. 23, 2019, received in Korean Patent Application No. 2018-7037896, which corresponds with U.S. Appl. No. 16/243,834, 6 pages. |
Notice of Allowance, dated Dec. 13, 2019, received in Korean Patent Application No. 2019-7033444, which corresponds with U.S. Appl. No. 16/252,478, 6 pages. |
Office Action, dated Sep. 6, 2019, received in European Patent Application No. 18180503.7, which corresponds with U.S. Appl. No. 14/536,426, 5 pages. |
Office Action, dated Oct. 7, 2019, received in Japanese Patent Application No. 2018-000753, which corresponds with U.S. Appl. No. 14/536,426, 5 pages. |
Office Action, dated Sep. 30, 2019, received in Japanese Patent Application No. 2018-079290, which corresponds with U.S. Appl. No. 14/608,926, 5 pages. |
Intention to Grant, dated Sep. 6, 2019, received in European Patent Application No. 13726053.5, which corresponds with U.S. Appl. No. 14/536,141, 7 pages. |
Certificate of Grant, dated Sep. 4, 2019, received in European Patent Application No. 13724104.8, which corresponds with U.S. Appl. No. 14/536,203, 4 pages. |
Office Action, dated Sep. 30, 2019, received in Japanese Patent Application No. 2018-022394, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Certificate of Grant, dated Aug. 28, 2019, received in Australian Patent Application No. 2018204236, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Office Action, dated Sep. 30, 2019, received in Chinese Patent Application No. 201610537334.1, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Notice of Allowance, dated Sep. 9, 2019, received in Japanese Patent Application No. 2017-237035, which corresponds with U.S. Appl. No. 14/536,267, 5 pages. |
Certificate of Grant, dated Aug. 28, 2019, received in European Patent Application No. 13811032.5, which corresponds with U.S. Appl. No. 14/608,985, 4 pages. |
Grant Certificate, dated Sep. 11, 2019, received in European Patent Application No. 17153418.3, which corresponds with U.S. Appl. No. 14/536,648, 3 pages. |
Decision to Grant, dated Sep. 12, 2019, received in European Patent Application No. 16708916.8, which corresponds with U.S. Appl. No. 14/868,078, 2 pages. |
Decision to Grant, dated Sep. 19, 2019, received in European Patent Application No. 17184437.6, which corresponds with U.S. Appl. No. 14/868,078, 2 pages. |
Office Action, dated Sep. 17, 2019, received in Chinese Patent Application No. 201610342264.4, which corresponds with U.S. Appl. No. 14/866,511, 3 pages. |
Office Action, dated Sep. 12, 2019, received in Chinese Patent Application No. 201610658351.8, which corresponds with U.S. Appl. No. 14/866,992, 5 pages. |
Notice of Allowance, dated Sep. 10, 2019, received in Korean Patent Application No. 2018-7003890, which corresponds with U.S. Appl. No. 14/866,992, 5 pages. |
Notice of Acceptance, dated Sep. 19, 2019, received in Australian Patent Application No. 2019200872, which corresponds with U.S. Appl. No. 14/864,580, 3 pages. |
Decision to Grant, dated Sep. 12, 2019, received in European Patent Application No. 16730554.9, which corresponds with U.S. Appl. No. 14/864,601, 2 pages. |
Office Action, dated Oct. 8, 2019, received in European Patent Application No. 17188507.2, which corresponds with U.S. Appl. No. 14/866,361, 6 pages. |
Notice of Allowance, dated Oct. 7, 2019, received in Japanese Patent Application No. 2017-141962, which corresponds with U.S. Appl. No. 14/866,361, 5 pages. |
Office Action, dated Sep. 30, 2019, received in Chinese Patent Application No. 201610871466.8, which corresponds with U.S. Appl. No. 14/871,236, 4 pages. |
Office Action, dated Sep. 27, 2019, received in Chinese Patent Application No. 201810119007.3, which corresponds with U.S. Appl. No. 15/136,782, 6 pages. |
Office Action, dated Oct. 2, 2019, received in European Patent Application No. 18171453.6, which corresponds with U.S. Appl. No. 15/136,782, 5 pages. |
Decision to Grant, dated Sep. 12, 2019, received in European Patent Application No. 17206374.5, which corresponds with U.S. Appl. No. 15/272,343, 3 pages. |
Notice of Allowance, dated Oct. 10, 2019, received in U.S. Appl. No. 16/102,409, 9 pages. |
Notice of Allowance, dated Sep. 11, 2019, received in U.S. Appl. No. 16/230,743, 5 pages. |
Office Action, dated Aug. 30, 2019, received in Korean Patent Application No. 2019-7019100, 2 pages. |
Extended European Search Report, dated Oct. 9, 2019, received in European Patent Application No. 19181042.3, which corresponds with U.S. Appl. No. 15/272,343, 10 pages. |
Certificate of Grant, dated Jul. 23, 2020, received in Australian Patent Application No. 2018223021, which corresponds with U.S. Appl. No. 14/536,426, 4 pages. |
Office Action, dated Aug. 21, 2020, received in European Patent Application No. 18183789.9, which corresponds with U.S. Appl. No. 16/262,800, 9 pages. |
Patent, dated Jul. 31, 2020, received in Chinese Patent Application No. 201710781246.0, which corresponds with U.S. Appl. No. 14/536,291, 6 pages. |
Office Action, dated Jul. 24, 2020, received in Chinese Patent Application No. 201711422121.5, which corresponds with U.S. Appl. No. 14/536,648, 10 pages. |
Notice of Allowance, dated Jul. 29, 2020, received in Korean Patent Application No. 2020-7003065, which corresponds with U.S. Appl. No. 14/866,511, 5 pages. |
Office Action, dated Jul. 24, 2020, received in Chinese Patent Application No. 201680041559.6, which corresponds with U.S. Appl. No. 14/866,992, 13 pages. |
Office Action, dated Aug. 3, 2020, received in Chinese Patent Application No. 201610870912.3, which corresponds with U.S. Appl. No. 14/870,882, 4 pages. |
Office Action, dated Aug. 4, 2020, received in Chinese Patent Application No. 201610871323.7, which corresponds with U.S. Appl. No. 14/871,336, 18 pages. |
Notice of Acceptance, dated Jul. 22, 2020, received in Australian Patent Application No. 2019203776, which corresponds with U.S. Appl. No. 15/499,693, 3 pages. |
Office Action, dated Aug. 10, 2020, received in U.S. Appl. No. 16/240,672, 13 pages. |
Office Action, dated Aug. 7, 2020, received in Japanese Patent Application No. 2019-058800, which corresponds with U.S. Appl. No. 16/243,834, 8 pages. |
Notice of Allowance, dated Mar. 27, 2020, received in Australian Patent Application No. 2018223021, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Notice of Allowance, dated Apr. 3, 2020, received in Japanese Patent Application No. 2018-079290, which corresponds with U.S. Appl. No. 14/608,926, 5 pages. |
Patent, dated Apr. 14, 2020, received in Japanese Patent Application No. 2018-079290, which corresponds with U.S. Appl. No. 14/608,926, 5 pages. |
Certificate of Grant, dated May 21, 2020, received in Australian Patent Application No. 2018256616, which corresponds with U.S. Appl. No. 14/536,141, 3 pages. |
Patent, dated Feb. 19, 2020, received in European Patent Application No. 13726053.5, which corresponds with U.S. Appl. No. 14/536,141, 4 pages. |
Notice of Allowance, dated Apr. 29, 2020, received in Australian Patent Application No. 2018250481, which corresponds with U.S. Appl. No. 14/536,203, 3 pages. |
Office Action, dated Apr. 20, 2020, received in Chinese Patent Application No. 201610537334.1, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Office Action, dated Mar. 16, 2020, received in Chinese Patent Application No. 201610131415.1, which corresponds with U.S. Appl. No. 14/866,981, 3 pages. |
Decision to Grant, dated Mar. 5, 2020, received in European Patent Application No. 16707356.8, which corresponds with U.S. Appl. No. 14/866,159, 2 pages. |
Patent, dated Apr. 1, 2020, received in European Patent Application No. 16707356.8, which corresponds with U.S. Appl. No. 14/866,159, 3 pages. |
Office Action, dated May 19, 2020, received in Chinese Patent Application No. 201680011338.4, which corresponds with U.S. Appl. No. 14/868,078, 4 pages. |
Notice of Allowance, dated Mar. 20, 2020, received in Chinese Patent Application No. 201610342313.4, which corresponds with U.S. Appl. No. 14/863,432, 6 pages. |
Patent, dated May 12, 2020, received in Chinese Patent Application No. 201610342313.4, which corresponds with U.S. Appl. No. 14/863,432, 7 pages. |
Patent, dated Feb. 7, 2020, received in Chinese Patent Application No. 201610342264.4, which corresponds with U.S. Appl. No. 14/866,511, 7 pages. |
Office Action, dated Apr. 24, 2020, received in Korean Patent Application No. 2020-7003065, which corresponds with U.S. Appl. No. 14/866,511, 3 pages. |
Decision to Grant, dated Mar. 26, 2020, received in European Patent Application No. 18168939.9, which corresponds with U.S. Appl. No. 14/869,899, 3 pages. |
Patent, dated Apr. 22, 2020, received in European Patent Application No. 18168939.9, which corresponds with U.S. Appl. No. 14/869,899, 3 pages. |
Intention to Grant, dated Apr. 7, 2020, received in European Patent Application No. 16756866.6, which corresponds with U.S. Appl. No. 15/009,676, 8 pages. |
Intention to Grant, dated Mar. 16, 2020, received in European Patent Application No. 16753796.8, which corresponds with U.S. Appl. No. 15/009,688, 6 pages. |
Notice of Allowance, dated Mar. 4, 2020, received in U.S. Appl. No. 14/856,520, 6 pages. |
Intention to Grant, dated Apr. 14, 2020, received in European Patent Application No. 17188507.2, which corresponds with U.S. Appl. No. 14/866,361, 7 pages. |
Notice of Allowance, dated Mar. 24, 2020, received in Chinese Patent Application No. 201610871466.8, which corresponds with U.S. Appl. No. 14/871,236, 3 pages. |
Patent, dated May 19, 2020, received in Chinese Patent Application No. 201610871466.8, which corresponds with U.S. Appl. No. 14/871,236, 8 pages. |
Office Action, dated Mar. 17, 2020, received in MX/a/2017/011610, which corresponds with U.S. Appl. No. 14/871,236, 4 pages. |
Office Action, dated Jan. 31, 2020, received in European Patent Application No. 16753795.0, which corresponds with U.S. Appl. No. 15/009,668, 9 pages. |
Intention to Grant, dated Apr. 30, 2020, received in European Patent Application No. 18205283.7, which corresponds with U.S. Appl. No. 15/081,771, 7 pages. |
Notice of Allowance, dated Feb. 26, 2020, received in Chinese Patent Application No. 201810119007.3, which corresponds with U.S. Appl. No. 15/136,782, 3 pages. |
Patent, dated Apr. 7, 2020, received in Chinese Patent Application No. 201810119007.3, which corresponds with U.S. Appl. No. 15/136,782, 7 pages. |
Office Action, dated May 12, 2020, received in European Patent Application No. 18171453.6, which corresponds with U.S. Appl. No. 15/136,782, 5 pages. |
Certificate of Grant, dated Apr. 2, 2020, received in Australian Patent Application No. 2018204234, which corresponds with U.S. Appl. No. 15/272,327, 1 page. |
Patent, dated Mar. 3, 2020, received in Chinese Patent Application No. 201810071627.4, which corresponds with U.S. Appl. No. 15/272,343, 7 pages. |
Notice of Allowance, dated Apr. 22, 2020, received in U.S. Appl. No. 15/272,345, 12 pages. |
Patent, dated Feb. 7, 2020, received in Hong Kong Patent Application No. 18101477.0, which corresponds with U.S. Appl. No. 15/272,345, 6 pages. |
Office Action, dated May 11, 2020, received in Australian Patent Application No. 2019203776, which corresponds with U.S. Appl. No. 15/499,693, 4 pages. |
Patent, dated Mar. 27, 2020, received in Korean Patent Application No. 2019-7009439, which corresponds with U.S. Appl. No. 15/499,693, 4 pages. |
Notice of Allowance, dated May 19, 2020, received in U.S. Appl. No. 15/889,115, 9 pages. |
Final Office Action, dated Feb. 27, 2020, received in U.S. Appl. No. 15/979,347, 19 pages. |
Notice of Allowance, dated May 14, 2020, received in U.S. Appl. No. 16/049,725, 9 pages. |
Notice of Acceptance, dated Apr. 2, 2020, received in Australian Patent Application No. 2018253539, which corresponds with U.S. Appl. No. 16/049,725, 3 pages. |
Final Office Action, dated Jun. 9, 2020, received in U.S. Appl. No. 16/136,163, 10 pages. |
Office Action, dated Mar. 9, 2020, received in U.S. Appl. No. 16/145,954, 15 pages. |
Office Action, dated Mar. 6, 2020, received in U.S. Appl. No. 16/154,591, 16 pages. |
Office Action, dated May 4, 2020, received in Australian Patent Application No. 2019203175, which corresponds with U.S. Appl. No. 16/154,591, 4 pages. |
Notice of Allowance, dated Jun. 1, 2020, received in Japanese Patent Application No. 2018-202048, which corresponds with U.S. Appl. No. 16/154,591, 3 pages. |
Office Action, dated Feb. 27, 2020, received in Korean Patent Application No. 2019-7019946, which corresponds with U.S. Appl. No. 16/154,591, 5 pages. |
Final Office Action, dated Mar. 19, 2020, received in U.S. Appl. No. 16/174,170, 35 pages. |
Notice of Allowance, dated May 22, 2020, received in Japanese Patent Application No. 2019-027634, which corresponds with U.S. Appl. No. 16/240,672, 5 pages. |
Patent, dated Mar. 13, 2020, received in Korean Patent Application No. 2018-7037896, which corresponds with U.S. Appl. No. 16/243,834, 7 pages. |
Patent, dated Mar. 12, 2020, received in Korean Patent Application No. 2019-7033444, which corresponds with U.S. Appl. No. 16/252,478, 6 pages. |
Office Action, dated May 14, 2020, received in U.S. Appl. No. 16/354,035, 16 pages. |
Office Action, dated May 14, 2020, received in U.S. Appl. No. 16/509,438, 16 pages. |
Notice of Allowance, dated May 20, 2020, received in U.S. Appl. No. 16/534,214, 17 pages. |
Notice of Allowance, dated May 4, 2020, received in Korean Patent Application No. 2019-7033444, 5 pages. |
Geisler, “Enriched Links: A Framework for Improving Web Navigation Using Pop-Up Views”, Journal of the American Society for Information Science, Chapel Hill, NC, Jan. 1, 2000, 13 pages. |
Office Action, dated Feb. 18, 2020, received in Australian Patent Application No. 2018223021, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Notice of Acceptance, dated Jan. 22, 2020, received in Australian Patent Application No. 2018256616, which corresponds with U.S. Appl. No. 14/536,141, 3 pages. |
Decision to Grant, dated Jan. 23, 2020, received in European Patent Application No. 13726053.5, which corresponds with U.S. Appl. No. 14/536,141, 1 page. |
Patent, dated Jan. 1, 2020, received in European Patent Application No. 16727900.9, which corresponds with U.S. Appl. No. 14/866,511, 3 pages. |
Office Action, dated Jan. 20, 2020, received in Japanese Patent Application No. 2017-029201, which corresponds with U.S. Appl. No. 14/857,636, 21 pages. |
Certificate of Grant, dated Jan. 23, 2020, received in Australian Patent Application No. 2019200872, which corresponds with U.S. Appl. No. 14/864,580, 3 pages. |
Patent, dated Jan. 31, 2020, received in Chinese Patent Application No. 201610342336.5, which corresponds with U.S. Appl. No. 14/866,987, 7 pages. |
Office Action, dated Feb. 3, 2020, received in European Patent Application No. 17163309.2, which corresponds with U.S. Appl. No. 14/866,987, 6 pages. |
Office Action, dated Feb. 3, 2020, received in European Patent Application No. 16189425.8, which corresponds with U.S. Appl. No. 14/866,989, 6 pages. |
Office Action, dated Feb. 21, 2020, received in European Patent Application No. 16711725.8, which corresponds with U.S. Appl. No. 14/867,990, 13 pages. |
Office Action, dated Jan. 24, 2020, received in European Patent Application No. 18205283.7, which corresponds with U.S. Appl. No. 15/081,771, 4 pages. |
Notice of Allowance, dated Feb. 20, 2020, received in U.S. Appl. No. 15/272,341, 12 pages. |
Notice of Allowance, dated Feb. 20, 2020, received in U.S. Appl. No. 15/655,749, 10 pages. |
Office Action, dated Feb. 3, 2020, received in Chinese Patent Application No. 201710331254.5, which corresponds with U.S. Appl. No. 15/655,749, 8 pages. |
Final Office Action, dated Feb. 5, 2020, received in U.S. Appl. No. 15/785,372, 26 pages. |
Patent, dated Jan. 31, 2020, received in Korean Patent Application No. 2019-7019100, 5 pages. |
Decision to Grant, dated Aug. 20, 2020, received in European Patent Application No. 18194127.9, which corresponds with U.S. Appl. No. 14/608,942, 4 pages. |
Decision to Grant, dated Aug. 27, 2020, received in European Patent Application No. 16756866.6, which corresponds with U.S. Appl. No. 15/009,676, 4 pages. |
Office Action, dated Aug. 26, 2020, received in Indian Patent Application No. 201617032291, which corresponds with U.S. Appl. No. 14/866,987, 9 pages. |
Notice of Allowance, dated Sep. 7, 2020, received in MX/a/2017/011610, which corresponds with U.S. Appl. No. 14/871,236, 12 pages. |
Office Action, dated Aug. 20, 2020, received in Chinese Patent Application No. 201680046985.9, which corresponds with U.S. Appl. No. 15/009,668, 15 pages. |
Decision to Grant, dated Aug. 27, 2020, received in European Patent Application No. 18205283.7, which corresponds with U.S. Appl. No. 15/081,771, 4 pages. |
Office Action, dated Aug. 31, 2020, received in Chinese Patent Application No. 201810151593.X, which corresponds with U.S. Appl. No. 15/272,327, 10 pages. |
Notice of Allowance, dated Aug. 26, 2020, received in U.S. Appl. No. 16/240,669, 18 pages. |
Office Action, dated Aug. 27, 2020, received in U.S. Appl. No. 16/241,883, 11 pages. |
Notice of Allowance, dated Aug. 25, 2020, received in U.S. Appl. No. 16/354,035, 14 pages. |
Office Action, dated Aug. 21, 2020, received in Japanese Patent Application No. 2019-047319, which corresponds with U.S. Appl. No. 16/896,141, 6 pages. |
Brewster, “The Design and Evaluation of a Vibrotactile Progress Bar”, Glasgow Interactive Systems Group, University of Glasgow, Glasgow, G12 8QQ, UK, 2005, 2 pages. |
Jones, “Touch Screen with Feeling”, IEEE Spectrum, spectrum.ieee.org/commuting/hardware/touch-screens-with-feeling, May 1, 2009, 2 pages. |
Nishino, “A Touch Screen Interface Design with Tactile Feedback”, Computer Science, 2011 International Conference on Complex, Intelligent, and Software Intensive Systems, 2011, 4 pages. |
Office Action, dated Sep. 16, 2020, received in U.S. Appl. No. 15/009,661, 37 pages. |
Notice of Allowance, dated Oct. 1, 2020, received in U.S. Appl. No. 14/856,520, 5 pages. |
Office Action, dated Sep. 25, 2020, received in U.S. Appl. No. 15/994,843, 5 pages. |
Office Action, dated Sep. 17, 2020, received in U.S. Appl. No. 16/136,163, 13 pages. |
Final Office Action, dated Oct. 1, 2020, received in U.S. Appl. No. 16/154,591, 19 pages. |
Notice of Allowance, dated Sep. 28, 2020, received in U.S. Appl. No. 16/241,883, 10 pages. |
Office Action, dated Oct. 7, 2020, received in U.S. Appl. No. 16/563,505, 20 pages. |
Office Action, dated Oct. 19, 2020, received in U.S. Appl. No. 16/685,773, 15 pages. |
Office Action, dated Sep. 21, 2020, received in U.S. Appl. No. 16/803,904, 5 pages. |
Certificate of Grant, dated Sep. 3, 2020, received in Australian Patent Application No. 2018250481, which corresponds with U.S. Appl. No. 14/536,203, 4 pages. |
Patent, dated Sep. 29, 2020, received in Chinese Patent Application No. 201610537334.1, which corresponds with U.S. Appl. No. 14/536,267, 7 pages. |
Decision to Grant, dated Sep. 24, 2020, received in European Patent Application No. 16753796.8, which corresponds with U.S. Appl. No. 15/009,688, 4 pages. |
Notice of Allowance, dated Sep. 18, 2020, received in Japanese Patent Application No. 2018-201076, which corresponds with U.S. Appl. No. 14/857,663, 5 pages. |
Intention to Grant, dated Oct. 5, 2020, received in European Patent Application No. 18168941.5, which corresponds with U.S. Appl. No. 14/871,236, 8 pages. |
Patent, dated Sep. 18, 2020, received in Chinese Patent Application No. 201680022696.5, which corresponds with U.S. Appl. No. 15/272,345, 6 pages. |
Office Action, dated Sep. 24, 2020, received in Australian Patent Application No. 2019268116, which corresponds with U.S. Appl. No. 16/240,672, 4 pages. |
Office Action, dated Sep. 18, 2020, received in Australian Patent Application No. 2018282409, which corresponds with U.S. Appl. No. 16/243,834, 3 pages. |
Office Action, dated Sep. 15, 2020, received in European Patent Application No. 19194439.6, which corresponds with U.S. Appl. No. 16/262,800, 6 pages. |
Notice of Allowance, dated Sep. 15, 2020, received in Australian Patent Application No. 2019257437, which corresponds with U.S. Appl. No. 16/252,478, 3 pages. |
Extended European Search Report, dated Oct. 6, 2020, received in European Patent Application No. 20188553.0, which corresponds with U.S. Appl. No. 15/499,693, 11 pages. |
Number | Date | Country
---|---|---
20190146643 A1 | May 2019 | US
Number | Date | Country
---|---|---
62215722 | Sep 2015 | US
62213609 | Sep 2015 | US
62203387 | Aug 2015 | US
62215696 | Sep 2015 | US
62213606 | Sep 2015 | US
62183139 | Jun 2015 | US
62172226 | Jun 2015 | US
62129954 | Mar 2015 | US
 | Number | Date | Country
---|---|---|---
Parent | 14870988 | Sep 2015 | US
Child | 16243834 | | US
Parent | 14869899 | Sep 2015 | US
Child | 14870988 | | US