This relates generally to user interfaces that present information and selectable options related to items of content on an electronic device.
User interaction with electronic devices has increased significantly in recent years. These devices include computers, tablet computers, televisions, multimedia devices, mobile devices, and the like.
In some circumstances, such a device presents an item of content. In some circumstances, the electronic device presents information about the item of content in a user interface specific to the item of content. In some circumstances, the electronic device presents user interfaces for interacting with the electronic device. Enhancing the user's interactions with the device improves the user's experience with the device and decreases user interaction time, which is particularly important where input devices are battery-operated.
It is well understood that the use of personally identifiable information should follow privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining the privacy of users. In particular, personally identifiable information data should be managed and handled so as to minimize risks of unintentional or unauthorized access or use, and the nature of authorized use should be clearly indicated to users.
Some embodiments described in this disclosure are directed to presenting representations of items of content available for playback on the electronic device. Some embodiments described in this disclosure are directed to presenting selectable options for initiating a process to access an item of content based on the available ways of accessing the content. Some embodiments described in this disclosure are directed to presenting representations of episodes in a series of episodic content. Some embodiments described in this disclosure are directed to presenting an enhanced preview of content. Some embodiments described in this disclosure are directed to presenting a control panel. Some embodiments described in this disclosure are directed to switching the active user profile of a device. Some embodiments described in this disclosure are directed to a picture-in-picture mode. The full descriptions of the embodiments are provided in the Drawings and the Detailed Description, and it is understood that the Summary provided above does not limit the scope of the disclosure in any way.
For a better understanding of the various described embodiments, reference should be made to the Detailed Description below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.
In the following description of embodiments, reference is made to the accompanying drawings which form a part hereof, and in which it is shown by way of illustration specific embodiments that are optionally practiced. It is to be understood that other embodiments are optionally used and structural changes are optionally made without departing from the scope of the disclosed embodiments. Further, although the following description uses terms “first,” “second,” etc. to describe various elements, these elements should not be limited by the terms. These terms are only used to distinguish one element from another. For example, a first touch could be termed a second touch, and, similarly, a second touch could be termed a first touch, without departing from the scope of the various described embodiments. The first touch and the second touch are both touches, but they are not the same touch.
The terminology used in the description of the various described embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “includes,” “including,” “comprises,” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The term “if” is, optionally, construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” is, optionally, construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context.
Embodiments of electronic devices, user interfaces for such devices, and associated processes for using such devices are described. In some embodiments, the device is a portable communications device, such as a mobile telephone, that also contains other functions, such as PDA and/or music player functions. Exemplary embodiments of portable multifunction devices include, without limitation, the iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, Calif. Other portable electronic devices, such as laptops or tablet computers with touch-sensitive surfaces (e.g., touch screen displays and/or touch pads), are, optionally, used. It should also be understood that, in some embodiments, the device is not a portable communications device, but is a desktop computer or a television with a touch-sensitive surface (e.g., a touch screen display and/or a touch pad). In some embodiments, the device does not have a touch screen display and/or a touch pad, but rather is capable of outputting display information (such as the user interfaces of the disclosure) for display on a separate display device, and capable of receiving input information from a separate input device having one or more input mechanisms (such as one or more buttons, a touch screen display and/or a touch pad). In some embodiments, the device has a display, but is capable of receiving input information from a separate input device having one or more input mechanisms (such as one or more buttons, a touch screen display and/or a touch pad).
In the discussion that follows, an electronic device that includes a display and a touch-sensitive surface is described. It should be understood, however, that the electronic device optionally includes one or more other physical user-interface devices, such as a physical keyboard, a mouse and/or a joystick. Further, as described above, it should be understood that the described electronic device, display and touch-sensitive surface are optionally distributed amongst two or more devices. Therefore, as used in this disclosure, information displayed on the electronic device or by the electronic device is optionally used to describe information outputted by the electronic device for display on a separate display device (touch-sensitive or not). Similarly, as used in this disclosure, input received on the electronic device (e.g., touch input received on a touch-sensitive surface of the electronic device) is optionally used to describe input received on a separate input device, from which the electronic device receives input information.
The device typically supports a variety of applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disk authoring application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a workout support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, a television channel browsing application, and/or a digital video player application.
The various applications that are executed on the device optionally use at least one common physical user-interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface as well as corresponding information displayed on the device are, optionally, adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture (such as the touch-sensitive surface) of the device optionally supports the variety of applications with user interfaces that are intuitive and transparent to the user.
Attention is now directed toward embodiments of portable or non-portable devices with touch-sensitive displays, though the devices need not include touch-sensitive displays or displays in general, as described above.
As used in the specification and claims, the term “intensity” of a contact on a touch-sensitive surface refers to the force or pressure (force per unit area) of a contact (e.g., a finger contact) on the touch-sensitive surface, or to a substitute (proxy) for the force or pressure of a contact on the touch-sensitive surface. The intensity of a contact has a range of values that includes at least four distinct values and more typically includes hundreds of distinct values (e.g., at least 256). Intensity of a contact is, optionally, determined (or measured) using various approaches and various sensors or combinations of sensors. For example, one or more force sensors underneath or adjacent to the touch-sensitive surface are, optionally, used to measure force at various points on the touch-sensitive surface. In some implementations, force measurements from multiple force sensors are combined (e.g., a weighted average) to determine an estimated force of a contact. Similarly, a pressure-sensitive tip of a stylus is, optionally, used to determine a pressure of the stylus on the touch-sensitive surface. Alternatively, the size of the contact area detected on the touch-sensitive surface and/or changes thereto, the capacitance of the touch-sensitive surface proximate to the contact and/or changes thereto, and/or the resistance of the touch-sensitive surface proximate to the contact and/or changes thereto are, optionally, used as a substitute for the force or pressure of the contact on the touch-sensitive surface. In some implementations, the substitute measurements for contact force or pressure are used directly to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is described in units corresponding to the substitute measurements). In some implementations, the substitute measurements for contact force or pressure are converted to an estimated force or pressure and the estimated force or pressure is used to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is a pressure threshold measured in units of pressure). Using the intensity of a contact as an attribute of a user input allows for user access to additional device functionality that may otherwise not be accessible by the user on a reduced-size device with limited real estate for displaying affordances (e.g., on a touch-sensitive display) and/or receiving user input (e.g., via a touch-sensitive display, a touch-sensitive surface, or a physical/mechanical control such as a knob or a button).
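By way of non-limiting illustration only, the following Swift sketch shows one way force measurements from multiple force sensors could be combined (e.g., as a weighted average) into a substitute intensity measurement and compared against an intensity threshold expressed in the same units. The type and function names (ForceSensorReading, estimatedIntensity) are hypothetical and do not appear elsewhere in this disclosure.

```swift
// Hypothetical sketch: a weighted average of several force-sensor readings serves as
// the substitute (proxy) intensity measurement, and the threshold is expressed in the
// same units as that measurement.
struct ForceSensorReading {
    let force: Double    // force measured at one sensor, in arbitrary units
    let weight: Double   // weight reflecting the sensor's proximity to the contact
}

/// Combines per-sensor readings into a single estimated intensity.
func estimatedIntensity(from readings: [ForceSensorReading]) -> Double {
    let totalWeight = readings.reduce(0.0) { $0 + $1.weight }
    guard totalWeight > 0 else { return 0 }
    let weightedSum = readings.reduce(0.0) { $0 + $1.force * $1.weight }
    return weightedSum / totalWeight
}

/// True when the substitute measurement exceeds the intensity threshold.
func exceedsIntensityThreshold(_ readings: [ForceSensorReading], threshold: Double) -> Bool {
    estimatedIntensity(from: readings) > threshold
}

// Three sensors, weighted by proximity to the contact.
let readings = [
    ForceSensorReading(force: 1.2, weight: 0.5),
    ForceSensorReading(force: 1.0, weight: 0.3),
    ForceSensorReading(force: 0.4, weight: 0.2),
]
print(exceedsIntensityThreshold(readings, threshold: 0.8))   // true (estimate is 0.98)
```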
As used in the specification and claims, the term “tactile output” refers to physical displacement of a device relative to a previous position of the device, physical displacement of a component (e.g., a touch-sensitive surface) of a device relative to another component (e.g., housing) of the device, or displacement of the component relative to a center of mass of the device that will be detected by a user with the user's sense of touch. For example, in situations where the device or the component of the device is in contact with a surface of a user that is sensitive to touch (e.g., a finger, palm, or other part of a user's hand), the tactile output generated by the physical displacement will be interpreted by the user as a tactile sensation corresponding to a perceived change in physical characteristics of the device or the component of the device. For example, movement of a touch-sensitive surface (e.g., a touch-sensitive display or trackpad) is, optionally, interpreted by the user as a “down click” or “up click” of a physical actuator button. In some cases, a user will feel a tactile sensation such as a “down click” or “up click” even when there is no movement of a physical actuator button associated with the touch-sensitive surface that is physically pressed (e.g., displaced) by the user's movements. As another example, movement of the touch-sensitive surface is, optionally, interpreted or sensed by the user as “roughness” of the touch-sensitive surface, even when there is no change in smoothness of the touch-sensitive surface. While such interpretations of touch by a user will be subject to the individualized sensory perceptions of the user, there are many sensory perceptions of touch that are common to a large majority of users. Thus, when a tactile output is described as corresponding to a particular sensory perception of a user (e.g., an “up click,” a “down click,” “roughness”), unless otherwise stated, the generated tactile output corresponds to physical displacement of the device or a component thereof that will generate the described sensory perception for a typical (or average) user.
It should be appreciated that device 100 is only one example of a portable or non-portable multifunction device, and that device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of the components. The various components shown in
Memory 102 optionally includes high-speed random access memory and optionally also includes non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Memory controller 122 optionally controls access to memory 102 by other components of device 100.
Peripherals interface 118 can be used to couple input and output peripherals of the device to CPU 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for device 100 and to process data.
In some embodiments, peripherals interface 118, CPU 120, and memory controller 122 are, optionally, implemented on a single chip, such as chip 104. In some other embodiments, they are, optionally, implemented on separate chips.
RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals. RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. RF circuitry 108 optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. RF circuitry 108 optionally communicates with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. The RF circuitry 108 optionally includes well-known circuitry for detecting near field communication (NFC) fields, such as by a short-range communication radio. The wireless communication optionally uses any of a plurality of communications standards, protocols, and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), Evolution-Data Only (EV-DO), HSPA, HSPA+, Dual-Cell HSPA (DC-HSDPA), long term evolution (LTE), near field communication (NFC), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Bluetooth Low Energy (BTLE), Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, IEEE 802.11n, and/or IEEE 802.11ac), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for e-mail (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between a user and device 100. Audio circuitry 110 receives audio data from peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to speaker 111. Speaker 111 converts the electrical signal to human-audible sound waves. Audio circuitry 110 also receives electrical signals converted by microphone 113 from sound waves. Audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to peripherals interface 118 for processing. Audio data is, optionally, retrieved from and/or transmitted to memory 102 and/or RF circuitry 108 by peripherals interface 118. In some embodiments, audio circuitry 110 also includes a headset jack (e.g., 212,
I/O subsystem 106 couples input/output peripherals on device 100, such as touch screen 112 and other input control devices 116, to peripherals interface 118. I/O subsystem 106 optionally includes display controller 156, optical sensor controller 158, intensity sensor controller 159, haptic feedback controller 161 and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/send electrical signals from/to other input or control devices 116. The other input control devices 116 optionally include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) 160 are, optionally, coupled to any (or none) of the following: a keyboard, infrared port, USB port, and a pointer device such as a mouse. The one or more buttons (e.g., 208,
A quick press of the push button optionally disengages a lock of touch screen 112 or optionally begins a process that uses gestures on the touch screen to unlock the device, as described in U.S. patent application Ser. No. 11/322,549, “Unlocking a Device by Performing Gestures on an Unlock Image,” filed Dec. 23, 2005, U.S. Pat. No. 7,657,849, which is hereby incorporated by reference in its entirety. A longer press of the push button (e.g., 206) optionally turns power to device 100 on or off. The functionality of one or more of the buttons are, optionally, user-customizable. Touch screen 112 is used to implement virtual or soft buttons and one or more soft keyboards.
Touch-sensitive display 112 provides an input interface and an output interface between the device and a user. As described above, the touch-sensitive operation and the display operation of touch-sensitive display 112 are optionally separated from each other, such that a display device is used for display purposes and a touch-sensitive surface (whether display or not) is used for input detection purposes, and the described components and functions are modified accordingly. However, for simplicity, the following description is provided with reference to a touch-sensitive display. Display controller 156 receives and/or sends electrical signals from/to touch screen 112. Touch screen 112 displays visual output to the user. The visual output optionally includes graphics, text, icons, video, and any combination thereof (collectively termed “graphics”). In some embodiments, some or all of the visual output corresponds to user-interface objects.
Touch screen 112 has a touch-sensitive surface, sensor or set of sensors that accepts input from the user based on haptic and/or tactile contact. Touch screen 112 and display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on touch screen 112 and convert the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages or images) that are displayed on touch screen 112. In an exemplary embodiment, a point of contact between touch screen 112 and the user corresponds to a finger of the user.
Touch screen 112 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments. Touch screen 112 and display controller 156 optionally detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch screen 112. In an exemplary embodiment, projected mutual capacitance sensing technology is used, such as that found in the iPhone®, iPod Touch®, and iPad® from Apple Inc. of Cupertino, Calif.
A touch-sensitive display in some embodiments of touch screen 112 is, optionally, analogous to the multi-touch sensitive touchpads described in the following U.S. Pat. No. 6,323,846 (Westerman et al.), U.S. Pat. No. 6,570,557 (Westerman et al.), and/or U.S. Pat. No. 6,677,932 (Westerman), and/or U.S. Patent Publication 2002/0015024A1, each of which is hereby incorporated by reference in its entirety. However, touch screen 112 displays visual output from device 100, whereas touch-sensitive touchpads do not provide visual output.
A touch-sensitive display in some embodiments of touch screen 112 is described in the following applications: (1) U.S. patent application Ser. No. 11/381,313, “Multipoint Touch Surface Controller,” filed May 2, 2006; (2) U.S. patent application Ser. No. 10/840,862, “Multipoint Touchscreen,” filed May 6, 2004; (3) U.S. patent application Ser. No. 10/903,964, “Gestures For Touch Sensitive Input Devices,” filed Jul. 30, 2004; (4) U.S. patent application Ser. No. 11/048,264, “Gestures For Touch Sensitive Input Devices,” filed Jan. 31, 2005; (5) U.S. patent application Ser. No. 11/038,590, “Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices,” filed Jan. 18, 2005; (6) U.S. patent application Ser. No. 11/228,758, “Virtual Input Device Placement On A Touch Screen User Interface,” filed Sep. 16, 2005; (7) U.S. patent application Ser. No. 11/228,700, “Operation Of A Computer With A Touch Screen Interface,” filed Sep. 16, 2005; (8) U.S. patent application Ser. No. 11/228,737, “Activating Virtual Keys Of A Touch-Screen Virtual Keyboard,” filed Sep. 16, 2005; and (9) U.S. patent application Ser. No. 11/367,749, “Multi-Functional Hand-Held Device,” filed Mar. 3, 2006. All of these applications are incorporated by reference herein in their entirety.
Touch screen 112 optionally has a video resolution in excess of 100 dpi. In some embodiments, the touch screen has a video resolution of approximately 160 dpi. The user optionally makes contact with touch screen 112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which can be less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user.
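By way of non-limiting illustration only, the following Swift sketch shows one possible way a rough, finger-sized contact patch could be translated into a single, precise pointer position (here, the centroid of the touched points). The approach and all names are hypothetical; the disclosure does not prescribe a particular translation.

```swift
// Hypothetical sketch: a finger contact covers many points on the touch screen;
// one simple translation to a precise cursor position is the centroid of the patch.
struct Point { var x: Double; var y: Double }

func pointerPosition(forContactPatch patch: [Point]) -> Point? {
    guard !patch.isEmpty else { return nil }
    let sumX = patch.reduce(0.0) { $0 + $1.x }
    let sumY = patch.reduce(0.0) { $0 + $1.y }
    return Point(x: sumX / Double(patch.count), y: sumY / Double(patch.count))
}

let patch = [Point(x: 10, y: 20), Point(x: 14, y: 22), Point(x: 12, y: 18)]
if let cursor = pointerPosition(forContactPatch: patch) {
    print(cursor.x, cursor.y)   // 12.0 20.0
}
```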
In some embodiments, in addition to the touch screen, device 100 optionally includes a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad is, optionally, a touch-sensitive surface that is separate from touch screen 112 or an extension of the touch-sensitive surface formed by the touch screen.
Device 100 also includes power system 162 for powering the various components. Power system 162 optionally includes a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable or non-portable devices.
Device 100 optionally also includes one or more optical sensors 164.
Device 100 optionally also includes one or more contact intensity sensors 165.
Device 100 optionally also includes one or more proximity sensors 166.
Device 100 optionally also includes one or more tactile output generators 167.
Device 100 optionally also includes one or more accelerometers 168.
In some embodiments, the software components stored in memory 102 include operating system 126, communication module (or set of instructions) 128, contact/motion module (or set of instructions) 130, graphics module (or set of instructions) 132, text input module (or set of instructions) 134, Global Positioning System (GPS) module (or set of instructions) 135, and applications (or sets of instructions) 136. Furthermore, in some embodiments, memory 102 (
Operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, iOS, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
Communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by RF circuitry 108 and/or external port 124. External port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with, the 30-pin connector used on iPod (trademark of Apple Inc.) devices.
Contact/motion module 130 optionally detects contact with touch screen 112 (in conjunction with display controller 156) and other touch-sensitive devices (e.g., a touchpad or physical click wheel). Contact/motion module 130 includes various software components for performing various operations related to detection of contact, such as determining if contact has occurred (e.g., detecting a finger-down event), determining an intensity of the contact (e.g., the force or pressure of the contact or a substitute for the force or pressure of the contact), determining if there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining if the contact has ceased (e.g., detecting a finger-up event or a break in contact). Contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact, which is represented by a series of contact data, optionally includes determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations are, optionally, applied to single contacts (e.g., one-finger contacts) or to multiple simultaneous contacts (e.g., “multitouch”/multiple finger contacts). In some embodiments, contact/motion module 130 and display controller 156 detect contact on a touchpad.
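By way of non-limiting illustration only, the following Swift sketch shows one way speed, velocity, and acceleration of a point of contact could be derived from a series of timestamped contact data samples of the kind described above. The types (ContactSample, Motion) are hypothetical.

```swift
import Foundation

// Hypothetical sketch: contact data arrives as timestamped samples; movement of the
// point of contact is characterized by speed (magnitude), velocity (magnitude and
// direction), and acceleration (change in magnitude and/or direction).
struct ContactSample {
    let x: Double
    let y: Double
    let timestamp: TimeInterval
}

struct Motion {
    let speed: Double                        // magnitude only
    let velocity: (dx: Double, dy: Double)   // magnitude and direction
}

/// Velocity between two consecutive samples of the same contact.
func motion(from a: ContactSample, to b: ContactSample) -> Motion? {
    let dt = b.timestamp - a.timestamp
    guard dt > 0 else { return nil }
    let dx = (b.x - a.x) / dt
    let dy = (b.y - a.y) / dt
    return Motion(speed: (dx * dx + dy * dy).squareRoot(), velocity: (dx: dx, dy: dy))
}

/// Acceleration approximated as the change in velocity over an interval.
func acceleration(from m1: Motion, to m2: Motion, over dt: TimeInterval) -> (ax: Double, ay: Double)? {
    guard dt > 0 else { return nil }
    return (ax: (m2.velocity.dx - m1.velocity.dx) / dt,
            ay: (m2.velocity.dy - m1.velocity.dy) / dt)
}

let s1 = ContactSample(x: 0, y: 0, timestamp: 0.00)
let s2 = ContactSample(x: 3, y: 4, timestamp: 0.01)
print(motion(from: s1, to: s2)?.speed ?? 0)   // 500.0 points per second
```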
In some embodiments, contact/motion module 130 uses a set of one or more intensity thresholds to determine whether an operation has been performed by a user (e.g., to determine whether a user has “clicked” on an icon). In some embodiments, at least a subset of the intensity thresholds are determined in accordance with software parameters (e.g., the intensity thresholds are not determined by the activation thresholds of particular physical actuators and can be adjusted without changing the physical hardware of device 100). For example, a mouse “click” threshold of a trackpad or touch screen display can be set to any of a large range of predefined threshold values without changing the trackpad or touch screen display hardware. Additionally, in some implementations, a user of the device is provided with software settings for adjusting one or more of the set of intensity thresholds (e.g., by adjusting individual intensity thresholds and/or by adjusting a plurality of intensity thresholds at once with a system-level click “intensity” parameter).
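By way of non-limiting illustration only, the following Swift sketch treats intensity thresholds purely as software parameters: individual thresholds can be set to arbitrary values, and a system-level click “intensity” parameter rescales all of them at once. The names and values are hypothetical.

```swift
// Hypothetical sketch: nothing about the hardware fixes these values; they are
// ordinary software parameters that settings can change.
struct IntensityThresholds {
    var lightPress: Double = 0.3
    var deepPress: Double = 0.7

    // System-level parameter that adjusts every threshold at once.
    var systemClickIntensity: Double = 1.0

    var effectiveLightPress: Double { lightPress * systemClickIntensity }
    var effectiveDeepPress: Double { deepPress * systemClickIntensity }

    /// True when a measured (or substitute) intensity counts as a "click".
    func isClick(_ intensity: Double) -> Bool {
        intensity >= effectiveLightPress
    }
}

var thresholds = IntensityThresholds()
thresholds.systemClickIntensity = 1.5      // the user prefers firmer clicks
print(thresholds.isClick(0.4))             // false: 0.4 < 0.45
print(thresholds.isClick(0.5))             // true
```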
Contact/motion module 130 optionally detects a gesture input by a user. Different gestures on the touch-sensitive surface have different contact patterns (e.g., different motions, timings, and/or intensities of detected contacts). Thus, a gesture is, optionally, detected by detecting a particular contact pattern. For example, detecting a finger tap gesture includes detecting a finger-down event followed by detecting a finger-up (liftoff) event at the same position (or substantially the same position) as the finger-down event (e.g., at the position of an icon). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event followed by detecting one or more finger-dragging events, and subsequently followed by detecting a finger-up (liftoff) event.
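By way of non-limiting illustration only, the following Swift sketch classifies a completed contact pattern as a tap or a swipe based on whether the finger-up event occurs at substantially the same position as the finger-down event, as described above. The event names and the distance tolerance are hypothetical.

```swift
// Hypothetical sketch: a gesture is detected by examining the pattern of
// finger-down, finger-dragging, and finger-up events.
enum FingerEvent {
    case down(x: Double, y: Double)
    case drag(x: Double, y: Double)
    case up(x: Double, y: Double)
}

enum Gesture { case tap, swipe }

/// Classifies a complete down ... up sequence.
func recognizeGesture(_ events: [FingerEvent], tolerance: Double = 10) -> Gesture? {
    guard case let .down(x0, y0)? = events.first,
          case let .up(x1, y1)? = events.last else { return nil }
    let dx = x1 - x0, dy = y1 - y0
    // A tap lifts off at (substantially) the same position; a swipe does not.
    return (dx * dx + dy * dy).squareRoot() <= tolerance ? .tap : .swipe
}

let tap: [FingerEvent] = [.down(x: 100, y: 100), .up(x: 102, y: 101)]
let swipe: [FingerEvent] = [.down(x: 100, y: 100), .drag(x: 160, y: 102), .up(x: 220, y: 104)]
print(recognizeGesture(tap) == .tap, recognizeGesture(swipe) == .swipe)   // true true
```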
Graphics module 132 includes various known software components for rendering and displaying graphics on touch screen 112 or other display, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast or other visual property) of graphics that are displayed. As used herein, the term “graphics” includes any object that can be displayed to a user, including without limitation text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations and the like.
In some embodiments, graphics module 132 stores data representing graphics to be used. Each graphic is, optionally, assigned a corresponding code. Graphics module 132 receives, from applications etc., one or more codes specifying graphics to be displayed along with, if necessary, coordinate data and other graphic property data, and then generates screen image data to output to display controller 156.
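By way of non-limiting illustration only, the following Swift sketch shows graphics referenced by code: an application supplies one or more codes together with coordinate data, and the graphics module resolves them into draw commands for the display controller. The store contents and type names are hypothetical.

```swift
// Hypothetical sketch: each stored graphic is assigned a corresponding code, and
// screen image data is generated from the codes and coordinates an application supplies.
struct DrawCommand {
    let assetName: String
    let x: Double
    let y: Double
}

final class GraphicsStore {
    private let graphicsByCode: [Int: String] = [
        1: "soft-key",
        2: "app-icon",
        3: "status-bar",
    ]

    /// Resolves application-supplied codes (plus coordinate data) into draw commands.
    func screenImage(for requests: [(code: Int, x: Double, y: Double)]) -> [DrawCommand] {
        requests.compactMap { request in
            guard let asset = graphicsByCode[request.code] else { return nil }
            return DrawCommand(assetName: asset, x: request.x, y: request.y)
        }
    }
}

let store = GraphicsStore()
let commands = store.screenImage(for: [(code: 2, x: 40, y: 40), (code: 3, x: 0, y: 0)])
print(commands.map { $0.assetName })   // ["app-icon", "status-bar"]
```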
Haptic feedback module 133 includes various software components for generating instructions used by tactile output generator(s) 167 to produce tactile outputs at one or more locations on device 100 in response to user interactions with device 100.
Text input module 134, which is, optionally, a component of graphics module 132, provides soft keyboards for entering text in various applications (e.g., contacts 137, e-mail 140, IM 141, browser 147, and any other application that needs text input).
GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to telephone 138 for use in location-based dialing, to camera 143 as picture/video metadata, and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets).
Applications 136 optionally include the following modules (or sets of instructions), or a subset or superset thereof:
Examples of other applications 136 that are, optionally, stored in memory 102 include other word processing applications, other image editing applications, drawing applications, presentation applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, contacts module 137 is, optionally, used to manage an address book or contact list (e.g., stored in application internal state 192 of contacts module 137 in memory 102 or memory 370), including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), e-mail address(es), physical address(es) or other information with a name; associating an image with a name; categorizing and sorting names; providing telephone numbers or e-mail addresses to initiate and/or facilitate communications by telephone 138, video conference module 139, e-mail 140, or IM 141; and so forth.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, telephone module 138 is, optionally, used to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in contacts module 137, modify a telephone number that has been entered, dial a respective telephone number, conduct a conversation, and disconnect or hang up when the conversation is completed. As noted above, the wireless communication optionally uses any of a plurality of communications standards, protocols, and technologies.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, optical sensor 164, optical sensor controller 158, contact/motion module 130, graphics module 132, text input module 134, contacts module 137, and telephone module 138, video conference module 139 includes executable instructions to initiate, conduct, and terminate a video conference between a user and one or more other participants in accordance with user instructions.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, e-mail client module 140 includes executable instructions to create, send, receive, and manage e-mail in response to user instructions. In conjunction with image management module 144, e-mail client module 140 makes it very easy to create and send e-mails with still or video images taken with camera module 143.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, the instant messaging module 141 includes executable instructions to enter a sequence of characters corresponding to an instant message, to modify previously entered characters, to transmit a respective instant message (for example, using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages or using XMPP, SIMPLE, or IMPS for Internet-based instant messages), to receive instant messages, and to view received instant messages. In some embodiments, transmitted and/or received instant messages optionally include graphics, photos, audio files, video files and/or other attachments as are supported in an MMS and/or an Enhanced Messaging Service (EMS). As used herein, “instant messaging” refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, or IMPS).
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, GPS module 135, map module 154, and music player module, workout support module 142 includes executable instructions to create workouts (e.g., with time, distance, and/or calorie burning goals); communicate with workout sensors (sports devices); receive workout sensor data; calibrate sensors used to monitor a workout; select and play music for a workout; and display, store, and transmit workout data.
In conjunction with touch screen 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact/motion module 130, graphics module 132, and image management module 144, camera module 143 includes executable instructions to capture still images or video (including a video stream) and store them into memory 102, modify characteristics of a still image or video, or delete a still image or video from memory 102.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and camera module 143, image management module 144 includes executable instructions to arrange, modify (e.g., edit), or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, browser module 147 includes executable instructions to browse the Internet in accordance with user instructions, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, e-mail client module 140, and browser module 147, calendar module 148 includes executable instructions to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to-do lists, etc.) in accordance with user instructions.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and browser module 147, widget modules 149 are mini-applications that are, optionally, downloaded and used by a user (e.g., weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, and dictionary widget 149-5) or created by the user (e.g., user-created widget 149-6). In some embodiments, a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! Widgets).
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and browser module 147, the widget creator module 150 is, optionally, used by a user to create widgets (e.g., turning a user-specified portion of a web page into a widget).
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, search module 151 includes executable instructions to search for text, music, sound, image, video, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms) in accordance with user instructions.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, and browser module 147, video and music player module 152 includes executable instructions that allow the user to download and play back recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files, and executable instructions to display, present, or otherwise play back videos (e.g., on touch screen 112 or on an external, connected display via external port 124). In some embodiments, device 100 optionally includes the functionality of an MP3 player, such as an iPod (trademark of Apple Inc.).
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, notes module 153 includes executable instructions to create and manage notes, to-do lists, and the like in accordance with user instructions.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, map module 154 is, optionally, used to receive, display, modify, and store maps and data associated with maps (e.g., driving directions, data on stores and other points of interest at or near a particular location, and other location-based data) in accordance with user instructions.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, text input module 134, e-mail client module 140, and browser module 147, online video module 155 includes instructions that allow the user to access, browse, receive (e.g., by streaming and/or download), play back (e.g., on the touch screen or on an external, connected display via external port 124), send an e-mail with a link to a particular online video, and otherwise manage online videos in one or more file formats, such as H.264. In some embodiments, instant messaging module 141, rather than e-mail client module 140, is used to send a link to a particular online video. Additional description of the online video application can be found in U.S. Provisional Patent Application No. 60/936,562, “Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos,” filed Jun. 20, 2007, and U.S. patent application Ser. No. 11/968,067, “Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos,” filed Dec. 31, 2007, the contents of which are hereby incorporated by reference in their entirety.
Each of the above-identified modules and applications corresponds to a set of executable instructions for performing one or more functions described above and the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (e.g., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules are, optionally, combined or otherwise rearranged in various embodiments. For example, video player module is, optionally, combined with music player module into a single module (e.g., video and music player module 152,
In some embodiments, device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through a touch screen and/or a touchpad. By using a touch screen and/or a touchpad as the primary input control device for operation of device 100, the number of physical input control devices (such as push buttons, dials, and the like) on device 100 is, optionally, reduced.
The predefined set of functions that are performed exclusively through a touch screen and/or a touchpad optionally include navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates device 100 to a main, home, or root menu from any user interface that is displayed on device 100. In such embodiments, a “menu button” is implemented using a touchpad. In some other embodiments, the menu button is a physical push button or other physical input control device instead of a touchpad.
Event sorter 170 receives event information and determines the application 136-1 and application view 191 of application 136-1 to which to deliver the event information. Event sorter 170 includes event monitor 171 and event dispatcher module 174. In some embodiments, application 136-1 includes application internal state 192, which indicates the current application view(s) displayed on touch-sensitive display 112 when the application is active or executing. In some embodiments, device/global internal state 157 is used by event sorter 170 to determine which application(s) is (are) currently active, and application internal state 192 is used by event sorter 170 to determine application views 191 to which to deliver event information.
In some embodiments, application internal state 192 includes additional information, such as one or more of: resume information to be used when application 136-1 resumes execution, user interface state information that indicates information being displayed or that is ready for display by application 136-1, a state queue for enabling the user to go back to a prior state or view of application 136-1, and a redo/undo queue of previous actions taken by the user.
Event monitor 171 receives event information from peripherals interface 118. Event information includes information about a sub-event (e.g., a user touch on touch-sensitive display 112, as part of a multi-touch gesture). Peripherals interface 118 transmits information it receives from I/O subsystem 106 or a sensor, such as proximity sensor 166, accelerometer(s) 168, and/or microphone 113 (through audio circuitry 110). Information that peripherals interface 118 receives from I/O subsystem 106 includes information from touch-sensitive display 112 or a touch-sensitive surface.
In some embodiments, event monitor 171 sends requests to the peripherals interface 118 at predetermined intervals. In response, peripherals interface 118 transmits event information. In other embodiments, peripherals interface 118 transmits event information only when there is a significant event (e.g., receiving an input above a predetermined noise threshold and/or for more than a predetermined duration).
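By way of non-limiting illustration only, the following Swift sketch contrasts the two delivery styles described above: polling the peripherals interface at a predetermined interval versus forwarding only significant events (those above a noise threshold and lasting at least a minimum duration). The protocol, types, and thresholds are hypothetical.

```swift
import Foundation

// Hypothetical sketch of an event monitor's two delivery styles.
struct RawInput {
    let magnitude: Double
    let duration: TimeInterval
}

protocol PeripheralsInterface {
    func pendingInput() -> RawInput?
}

final class EventMonitor {
    let noiseThreshold: Double = 0.05
    let minimumDuration: TimeInterval = 0.01

    /// Polling style: called at a predetermined interval; forwards whatever is pending.
    func poll(_ peripherals: any PeripheralsInterface) -> RawInput? {
        peripherals.pendingInput()
    }

    /// Significant-events-only style: forwards input only when it rises above the
    /// noise threshold and lasts at least the minimum duration.
    func deliverIfSignificant(_ input: RawInput) -> RawInput? {
        (input.magnitude > noiseThreshold && input.duration >= minimumDuration) ? input : nil
    }
}
```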
In some embodiments, event sorter 170 also includes a hit view determination module 172 and/or an active event recognizer determination module 173.
Hit view determination module 172 provides software procedures for determining where a sub-event has taken place within one or more views when touch-sensitive display 112 displays more than one view. Views are made up of controls and other elements that a user can see on the display.
Another aspect of the user interface associated with an application is a set of views, sometimes herein called application views or user interface windows, in which information is displayed and touch-based gestures occur. The application views (of a respective application) in which a touch is detected optionally correspond to programmatic levels within a programmatic or view hierarchy of the application. For example, the lowest level view in which a touch is detected is, optionally, called the hit view, and the set of events that are recognized as proper inputs are, optionally, determined based, at least in part, on the hit view of the initial touch that begins a touch-based gesture.
Hit view determination module 172 receives information related to sub-events of a touch-based gesture. When an application has multiple views organized in a hierarchy, hit view determination module 172 identifies a hit view as the lowest view in the hierarchy which should handle the sub-event. In most circumstances, the hit view is the lowest level view in which an initiating sub-event occurs (e.g., the first sub-event in the sequence of sub-events that form an event or potential event). Once the hit view is identified by the hit view determination module 172, the hit view typically receives all sub-events related to the same touch or input source for which it was identified as the hit view.
Active event recognizer determination module 173 determines which view or views within a view hierarchy should receive a particular sequence of sub-events. In some embodiments, active event recognizer determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, active event recognizer determination module 173 determines that all views that include the physical location of a sub-event are actively involved views, and therefore determines that all actively involved views should receive a particular sequence of sub-events. In other embodiments, even if touch sub-events were entirely confined to the area associated with one particular view, views higher in the hierarchy would still remain as actively involved views.
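By way of non-limiting illustration only, the following Swift sketch determines, for a simple view hierarchy, the hit view (the lowest view containing the touch location) and the set of actively involved views (every view whose area contains that location), as described above. The view types and frames are hypothetical.

```swift
// Hypothetical sketch of hit-view determination over a small view hierarchy.
struct Frame {
    var x: Double, y: Double, width: Double, height: Double
    func contains(_ px: Double, _ py: Double) -> Bool {
        px >= x && px < x + width && py >= y && py < y + height
    }
}

final class View {
    let name: String
    let frame: Frame
    var subviews: [View] = []
    init(name: String, frame: Frame) { self.name = name; self.frame = frame }
}

/// Lowest view in the hierarchy containing the point: the hit view.
func hitView(in view: View, x: Double, y: Double) -> View? {
    guard view.frame.contains(x, y) else { return nil }
    for subview in view.subviews {
        if let deeper = hitView(in: subview, x: x, y: y) { return deeper }
    }
    return view
}

/// Every view containing the point, from root down to the hit view: the actively involved views.
func activelyInvolvedViews(in view: View, x: Double, y: Double) -> [View] {
    guard view.frame.contains(x, y) else { return [] }
    for subview in view.subviews {
        let deeper = activelyInvolvedViews(in: subview, x: x, y: y)
        if !deeper.isEmpty { return [view] + deeper }
    }
    return [view]
}

let window = View(name: "window", frame: Frame(x: 0, y: 0, width: 320, height: 480))
let panel = View(name: "panel", frame: Frame(x: 0, y: 100, width: 320, height: 200))
let button = View(name: "button", frame: Frame(x: 20, y: 120, width: 80, height: 40))
panel.subviews = [button]
window.subviews = [panel]
print(hitView(in: window, x: 30, y: 130)?.name ?? "none")                 // "button"
print(activelyInvolvedViews(in: window, x: 30, y: 130).map { $0.name })   // ["window", "panel", "button"]
```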
Event dispatcher module 174 dispatches the event information to an event recognizer (e.g., event recognizer 180). In embodiments including active event recognizer determination module 173, event dispatcher module 174 delivers the event information to an event recognizer determined by active event recognizer determination module 173. In some embodiments, event dispatcher module 174 stores in an event queue the event information, which is retrieved by a respective event receiver 182.
In some embodiments, operating system 126 includes event sorter 170. Alternatively, application 136-1 includes event sorter 170. In yet other embodiments, event sorter 170 is a stand-alone module, or a part of another module stored in memory 102, such as contact/motion module 130.
In some embodiments, application 136-1 includes a plurality of event handlers 190 and one or more application views 191, each of which includes instructions for handling touch events that occur within a respective view of the application's user interface. Each application view 191 of the application 136-1 includes one or more event recognizers 180. Typically, a respective application view 191 includes a plurality of event recognizers 180. In other embodiments, one or more of event recognizers 180 are part of a separate module, such as a user interface kit (not shown) or a higher level object from which application 136-1 inherits methods and other properties. In some embodiments, a respective event handler 190 includes one or more of: data updater 176, object updater 177, GUI updater 178, and/or event data 179 received from event sorter 170. Event handler 190 optionally utilizes or calls data updater 176, object updater 177, or GUI updater 178 to update the application internal state 192. Alternatively, one or more of the application views 191 include one or more respective event handlers 190. Also, in some embodiments, one or more of data updater 176, object updater 177, and GUI updater 178 are included in a respective application view 191.
A respective event recognizer 180 receives event information (e.g., event data 179) from event sorter 170 and identifies an event from the event information. Event recognizer 180 includes event receiver 182 and event comparator 184. In some embodiments, event recognizer 180 also includes at least a subset of: metadata 183, and event delivery instructions 188 (which optionally include sub-event delivery instructions).
Event receiver 182 receives event information from event sorter 170. The event information includes information about a sub-event, for example, a touch or a touch movement. Depending on the sub-event, the event information also includes additional information, such as location of the sub-event. When the sub-event concerns motion of a touch, the event information optionally also includes speed and direction of the sub-event. In some embodiments, events include rotation of the device from one orientation to another (e.g., from a portrait orientation to a landscape orientation, or vice versa), and the event information includes corresponding information about the current orientation (also called device attitude) of the device.
Event comparator 184 compares the event information to predefined event or sub-event definitions and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event. In some embodiments, event comparator 184 includes event definitions 186. Event definitions 186 contain definitions of events (e.g., predefined sequences of sub-events), for example, event 1 (187-1), event 2 (187-2), and others. In some embodiments, sub-events in an event (187) include, for example, touch begin, touch end, touch movement, touch cancellation, and multiple touching. In one example, the definition for event 1 (187-1) is a double tap on a displayed object. The double tap, for example, comprises a first touch (touch begin) on the displayed object for a predetermined phase, a first liftoff (touch end) for a predetermined phase, a second touch (touch begin) on the displayed object for a predetermined phase, and a second liftoff (touch end) for a predetermined phase. In another example, the definition for event 2 (187-2) is a dragging on a displayed object. The dragging, for example, comprises a touch (or contact) on the displayed object for a predetermined phase, a movement of the touch across touch-sensitive display 112, and liftoff of the touch (touch end). In some embodiments, the event also includes information for one or more associated event handlers 190.
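By way of non-limiting illustration only, the following Swift sketch represents event definitions as predefined sequences of sub-events and shows a comparator matching observed sub-events against the double-tap and drag examples above. The sub-event granularity and names are hypothetical (the predetermined phases, for instance, are not modeled).

```swift
// Hypothetical sketch: an event definition is a predefined sequence of sub-events,
// and the comparator looks for a definition matching what was observed.
enum SubEvent { case touchBegin, touchEnd, touchMove, touchCancel }

struct EventDefinition {
    let name: String
    let sequence: [SubEvent]
}

let doubleTap = EventDefinition(
    name: "double tap",
    sequence: [.touchBegin, .touchEnd, .touchBegin, .touchEnd])

let drag = EventDefinition(
    name: "drag",
    sequence: [.touchBegin, .touchMove, .touchEnd])

/// Returns the first definition whose sub-event sequence matches the observed events.
func matchEvent(_ observed: [SubEvent], against definitions: [EventDefinition]) -> EventDefinition? {
    definitions.first { $0.sequence == observed }
}

let observed: [SubEvent] = [.touchBegin, .touchEnd, .touchBegin, .touchEnd]
print(matchEvent(observed, against: [doubleTap, drag])?.name ?? "no match")   // "double tap"
```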
In some embodiments, event definition 187 includes a definition of an event for a respective user-interface object. In some embodiments, event comparator 184 performs a hit test to determine which user-interface object is associated with a sub-event. For example, in an application view in which three user-interface objects are displayed on touch-sensitive display 112, when a touch is detected on touch-sensitive display 112, event comparator 184 performs a hit test to determine which of the three user-interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190, the event comparator uses the result of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects an event handler associated with the sub-event and the object triggering the hit test.
In some embodiments, the definition for a respective event (187) also includes delayed actions that delay delivery of the event information until after it has been determined whether the sequence of sub-events does or does not correspond to the event recognizer's event type.
When a respective event recognizer 180 determines that the series of sub-events does not match any of the events in event definitions 186, the respective event recognizer 180 enters an event impossible, event failed, or event ended state, after which it disregards subsequent sub-events of the touch-based gesture. In this situation, other event recognizers, if any, that remain active for the hit view continue to track and process sub-events of an ongoing touch-based gesture.
In some embodiments, a respective event recognizer 180 includes metadata 183 with configurable properties, flags, and/or lists that indicate how the event delivery system should perform sub-event delivery to actively involved event recognizers. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate how event recognizers interact, or are enabled to interact, with one another. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate whether sub-events are delivered to varying levels in the view or programmatic hierarchy.
In some embodiments, a respective event recognizer 180 activates event handler 190 associated with an event when one or more particular sub-events of an event are recognized. In some embodiments, a respective event recognizer 180 delivers event information associated with the event to event handler 190. Activating an event handler 190 is distinct from sending (and deferring the sending of) sub-events to a respective hit view. In some embodiments, event recognizer 180 throws a flag associated with the recognized event, and event handler 190 associated with the flag catches the flag and performs a predefined process.
In some embodiments, event delivery instructions 188 include sub-event delivery instructions that deliver event information about a sub-event without activating an event handler. Instead, the sub-event delivery instructions deliver event information to event handlers associated with the series of sub-events or to actively involved views. Event handlers associated with the series of sub-events or with actively involved views receive the event information and perform a predetermined process.
In some embodiments, data updater 176 creates and updates data used in application 136-1. For example, data updater 176 updates the telephone number used in contacts module 137, or stores a video file used in video player module. In some embodiments, object updater 177 creates and updates objects used in application 136-1. For example, object updater 177 creates a new user-interface object or updates the position of a user-interface object. GUI updater 178 updates the GUI. For example, GUI updater 178 prepares display information and sends it to graphics module 132 for display on a touch-sensitive display.
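For illustration only, the separation of responsibilities among the three updaters might be sketched as follows in Swift; the protocol and method names are hypothetical and do not correspond to any actual module.

```swift
import CoreGraphics

// Hypothetical sketch: the three updater roles an event handler can call into.
protocol DataUpdating   { func update(phoneNumber: String, forContact contactID: String) }
protocol ObjectUpdating { func moveObject(withID objectID: String, to position: CGPoint) }
protocol GUIUpdating    { func redraw() }   // prepares display information for the graphics layer

struct SketchEventHandler {
    let dataUpdater: DataUpdating
    let objectUpdater: ObjectUpdating
    let guiUpdater: GUIUpdating

    // Example flow: update application data, reposition a user-interface object, then refresh the GUI.
    func handleEdit(contactID: String, newNumber: String, movedObjectID: String, to point: CGPoint) {
        dataUpdater.update(phoneNumber: newNumber, forContact: contactID)
        objectUpdater.moveObject(withID: movedObjectID, to: point)
        guiUpdater.redraw()
    }
}
```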
In some embodiments, event handler(s) 190 includes or has access to data updater 176, object updater 177, and GUI updater 178. In some embodiments, data updater 176, object updater 177, and GUI updater 178 are included in a single module of a respective application 136-1 or application view 191. In other embodiments, they are included in two or more software modules.
It shall be understood that the foregoing discussion regarding event handling of user touches on touch-sensitive displays also applies to other forms of user inputs to operate multifunction devices 100 with input devices, not all of which are initiated on touch screens. For example, mouse movement and mouse button presses, optionally coordinated with single or multiple keyboard presses or holds; contact movements such as taps, drags, scrolls, etc. on touchpads; pen stylus inputs; movement of the device; oral instructions; detected eye movements; biometric inputs; and/or any combination thereof are optionally utilized as inputs corresponding to sub-events which define an event to be recognized.
The touch screen 112 optionally displays one or more graphics within user interface (UI) 200. In this embodiment, as well as others described below, a user is enabled to select one or more of the graphics by making a gesture on the graphics, for example, with one or more fingers 202 (not drawn to scale in the figure) or one or more styluses 203 (not drawn to scale in the figure). In some embodiments, selection of one or more graphics occurs when the user breaks contact with the one or more graphics. In some embodiments, the gesture optionally includes one or more taps, one or more swipes (from left to right, right to left, upward and/or downward) and/or a rolling of a finger (from right to left, left to right, upward and/or downward) that has made contact with device 100. In some implementations or circumstances, inadvertent contact with a graphic does not select the graphic. For example, a swipe gesture that sweeps over an application icon optionally does not select the corresponding application when the gesture corresponding to selection is a tap.
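A minimal sketch of the selection rule described above (selection on liftoff, and no selection when a swipe merely sweeps over a graphic while the selection gesture is a tap) could look like the following; the Gesture and Graphic types are hypothetical.

```swift
import CoreGraphics

// Hypothetical sketch: a tap selects the graphic under the liftoff location;
// a swipe that sweeps over a graphic does not select it.
enum Gesture {
    case tap(liftoffAt: CGPoint)
    case swipe(from: CGPoint, to: CGPoint)
}

struct Graphic { let identifier: String; let frame: CGRect }

func selectedGraphic(for gesture: Gesture, among graphics: [Graphic]) -> Graphic? {
    switch gesture {
    case .tap(let liftoff):
        return graphics.first(where: { $0.frame.contains(liftoff) })   // selection occurs on liftoff
    case .swipe:
        return nil   // an inadvertent sweep over a graphic selects nothing
    }
}
```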
Device 100 optionally also includes one or more physical buttons, such as “home” or menu button 204. As previously described, menu button 204 is, optionally, used to navigate to any application 136 in a set of applications that are, optionally, executed on device 100. Alternatively, in some embodiments, the menu button is implemented as a soft key in a GUI displayed on touch screen 112.
In one embodiment, device 100 includes touch screen 112, menu button 204, push button 206 for powering the device on/off and locking the device, volume adjustment button(s) 208, Subscriber Identity Module (SIM) card slot 210, headset jack 212, and docking/charging external port 124. Push button 206 is, optionally, used to turn the power on/off on the device by depressing the button and holding the button in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process. In an alternative embodiment, device 100 also accepts verbal input for activation or deactivation of some functions through microphone 113. Device 100 also, optionally, includes one or more contact intensity sensors 165 for detecting intensity of contacts on touch screen 112 and/or one or more tactile output generators 167 for generating tactile outputs for a user of device 100.
Each of the above identified elements in
Although some of the examples that follow will be given with reference to inputs on touch screen display 112 (where the touch-sensitive surface and the display are combined), in some embodiments, the device detects inputs on a touch-sensitive surface that is separate from the display, as shown in
Additionally, while the following examples are given primarily with reference to finger inputs (e.g., finger contacts, finger tap gestures, finger swipe gestures), it should be understood that, in some embodiments, one or more of the finger inputs are replaced with input from another input device (e.g., a mouse-based input or a stylus input). For example, a swipe gesture is, optionally, replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact). As another example, a tap gesture is, optionally, replaced with a mouse click while the cursor is located over the location of the tap gesture (e.g., instead of detection of the contact followed by ceasing to detect the contact). Similarly, when multiple user inputs are simultaneously detected, it should be understood that multiple computer mice are, optionally, used simultaneously, or a mouse and finger contacts are, optionally, used simultaneously.
As used herein, the term “focus selector” refers to an input element that indicates a current part of a user interface with which a user is interacting. In some implementations that include a cursor or other location marker, the cursor acts as a “focus selector,” so that when an input (e.g., a press input) is detected on a touch-sensitive surface (e.g., touchpad 355 in
As used in the specification and claims, the term “characteristic intensity” of a contact refers to a characteristic of the contact based on one or more intensities of the contact. In some embodiments, the characteristic intensity is based on multiple intensity samples. The characteristic intensity is, optionally, based on a predefined number of intensity samples, or a set of intensity samples collected during a predetermined time period (e.g., 0.05, 0.1, 0.2, 0.5, 1, 2, 5, 10 seconds) relative to a predefined event (e.g., after detecting the contact, prior to detecting liftoff of the contact, before or after detecting a start of movement of the contact, prior to detecting an end of the contact, before or after detecting an increase in intensity of the contact, and/or before or after detecting a decrease in intensity of the contact). A characteristic intensity of a contact is, optionally, based on one or more of: a maximum value of the intensities of the contact, a mean value of the intensities of the contact, an average value of the intensities of the contact, a top 10 percentile value of the intensities of the contact, a value at the half maximum of the intensities of the contact, a value at the 90 percent maximum of the intensities of the contact, or the like. In some embodiments, the duration of the contact is used in determining the characteristic intensity (e.g., when the characteristic intensity is an average of the intensity of the contact over time). In some embodiments, the characteristic intensity is compared to a set of one or more intensity thresholds to determine whether an operation has been performed by a user. For example, the set of one or more intensity thresholds optionally includes a first intensity threshold and a second intensity threshold. In this example, a contact with a characteristic intensity that does not exceed the first threshold results in a first operation, a contact with a characteristic intensity that exceeds the first intensity threshold and does not exceed the second intensity threshold results in a second operation, and a contact with a characteristic intensity that exceeds the second threshold results in a third operation. In some embodiments, a comparison between the characteristic intensity and one or more thresholds is used to determine whether or not to perform one or more operations (e.g., whether to perform a respective operation or forgo performing the respective operation), rather than being used to determine whether to perform a first operation or a second operation.
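As a purely illustrative sketch of this idea (not a normative implementation), the following Swift code reduces a window of intensity samples to a characteristic intensity and maps it to one of three operations using two thresholds; the names and the particular reductions shown are hypothetical.

```swift
import Foundation

// Hypothetical sketch: reduce intensity samples from a time window to a single
// characteristic intensity, then compare it against two thresholds.
struct IntensitySample { let value: Double; let timestamp: TimeInterval }

enum CharacteristicIntensity {
    static func maximum(of samples: [IntensitySample]) -> Double {
        samples.map(\.value).max() ?? 0
    }
    static func mean(of samples: [IntensitySample]) -> Double {
        samples.isEmpty ? 0 : samples.map(\.value).reduce(0, +) / Double(samples.count)
    }
    // Value at the boundary of the top 10 percent of samples; other reductions
    // (half maximum, 90 percent maximum, ...) follow the same pattern.
    static func topTenPercentile(of samples: [IntensitySample]) -> Double {
        let sorted = samples.map(\.value).sorted(by: >)
        guard !sorted.isEmpty else { return 0 }
        return sorted[min(sorted.count - 1, sorted.count / 10)]
    }
}

enum Operation { case first, second, third }

// A characteristic intensity that does not exceed the first threshold yields the
// first operation, one between the thresholds yields the second, and one above
// the second threshold yields the third.
func operation(forCharacteristicIntensity intensity: Double,
               firstThreshold: Double,
               secondThreshold: Double) -> Operation {
    if intensity <= firstThreshold { return .first }
    if intensity <= secondThreshold { return .second }
    return .third
}
```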
In some embodiments described herein, one or more operations are performed in response to detecting a gesture that includes a respective press input or in response to detecting the respective press input performed with a respective contact (or a plurality of contacts), where the respective press input is detected based at least in part on detecting an increase in intensity of the contact (or plurality of contacts) above a press-input intensity threshold. In some embodiments, the respective operation is performed in response to detecting the increase in intensity of the respective contact above the press-input intensity threshold (e.g., a “down stroke” of the respective press input). In some embodiments, the press input includes an increase in intensity of the respective contact above the press-input intensity threshold and a subsequent decrease in intensity of the contact below the press-input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in intensity of the respective contact below the press-input threshold (e.g., an “up stroke” of the respective press input).
In some embodiments, the device employs intensity hysteresis to avoid accidental inputs sometimes termed “jitter,” where the device defines or selects a hysteresis intensity threshold with a predefined relationship to the press-input intensity threshold (e.g., the hysteresis intensity threshold is X intensity units lower than the press-input intensity threshold or the hysteresis intensity threshold is 75%, 90% or some reasonable proportion of the press-input intensity threshold). Thus, in some embodiments, the press input includes an increase in intensity of the respective contact above the press-input intensity threshold and a subsequent decrease in intensity of the contact below the hysteresis intensity threshold that corresponds to the press-input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in intensity of the respective contact below the hysteresis intensity threshold (e.g., an “up stroke” of the respective press input). Similarly, in some embodiments, the press input is detected only when the device detects an increase in intensity of the contact from an intensity at or below the hysteresis intensity threshold to an intensity at or above the press-input intensity threshold and, optionally, a subsequent decrease in intensity of the contact to an intensity at or below the hysteresis intensity, and the respective operation is performed in response to detecting the press input (e.g., the increase in intensity of the contact or the decrease in intensity of the contact, depending on the circumstances).
For ease of explanation, the operations described as being performed in response to a press input associated with a press-input intensity threshold, or in response to a gesture including the press input, are, optionally, triggered in response to detecting any of the following: an increase in intensity of a contact above the press-input intensity threshold, an increase in intensity of a contact from an intensity below the hysteresis intensity threshold to an intensity above the press-input intensity threshold, a decrease in intensity of the contact below the press-input intensity threshold, and/or a decrease in intensity of the contact below the hysteresis intensity threshold corresponding to the press-input intensity threshold. Additionally, in examples where an operation is described as being performed in response to detecting a decrease in intensity of a contact below the press-input intensity threshold, the operation is, optionally, performed in response to detecting a decrease in intensity of the contact below a hysteresis intensity threshold corresponding to, and lower than, the press-input intensity threshold.
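By way of a hedged illustration only, the press-input and hysteresis behavior described above might be modeled as a small state holder that reports a "down stroke" when intensity rises to the press-input threshold and an "up stroke" only when it falls to the hysteresis threshold; the PressDetector type and the example values are hypothetical.

```swift
// Hypothetical sketch: press detection with a hysteresis threshold below the
// press-input threshold, so jitter near the threshold does not end the press.
struct PressDetector {
    let pressThreshold: Double
    let hysteresisThreshold: Double   // e.g., 75% or 90% of pressThreshold
    var isPressed = false

    enum Stroke { case down, up }

    // Feed successive contact intensities; returns a stroke when one is detected.
    mutating func consume(intensity: Double) -> Stroke? {
        if !isPressed && intensity >= pressThreshold {
            isPressed = true
            return .down              // increase above the press-input intensity threshold
        }
        if isPressed && intensity <= hysteresisThreshold {
            isPressed = false
            return .up                // decrease below the corresponding hysteresis threshold
        }
        return nil
    }
}

// Example: with a press threshold of 1.0 and hysteresis at 0.75, a dip from 1.05
// to 0.95 does not end the press, but a dip to 0.7 does.
var detector = PressDetector(pressThreshold: 1.0, hysteresisThreshold: 0.75)
var strokes: [PressDetector.Stroke] = []
for intensity in [0.2, 1.05, 0.95, 1.1, 0.7] {
    if let stroke = detector.consume(intensity: intensity) { strokes.append(stroke) }
}
// strokes == [.down, .up]
```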
In some embodiments, display controller 508 causes the various user interfaces of the disclosure to be displayed on display 514. Further, input to device 500 is optionally provided by remote 510 via remote interface 512, which is optionally a wireless or a wired connection. In some embodiments, input to device 500 is provided by a multifunction device 511 (e.g., a smartphone) on which a remote control application is running that configures the multifunction device to simulate remote control functionality, as will be described in more detail below. In some embodiments, multifunction device 511 corresponds to one or more of device 100 in
Input mechanism 508 is, optionally, a microphone. Personal electronic device 500 optionally includes various sensors, such as GPS sensor 532, accelerometer 534, directional sensor 540 (e.g., compass), gyroscope 536, motion sensor 538, and/or a combination thereof, all of which can be operatively connected to I/O section 514.
Memory 518 of personal electronic device 500 can include one or more non-transitory computer-readable storage mediums, for storing computer-executable instructions, which, when executed by one or more computer processors 516, for example, can cause the computer processors to perform the techniques described below, including processes described with reference to
As used here, the term “affordance” refers to a user-interactive graphical user interface object that is, optionally, displayed on the display screen of devices 100, 300, and/or 500 (
As used herein, “installed application” refers to a software application that has been downloaded onto an electronic device (e.g., devices 100, 300, and/or 500) and is ready to be launched (e.g., become opened) on the device. In some embodiments, a downloaded application becomes an installed application by way of an installation program that extracts program portions from a downloaded package and integrates the extracted portions with the operating system of the computer system.
As used herein, the terms “open application” or “executing application” refer to a software application with retained state information (e.g., as part of device/global internal state 157 and/or application internal state 192). An open or executing application is, optionally, any one of the following types of applications:
As used herein, the term “closed application” refers to software applications without retained state information (e.g., state information for closed applications is not stored in a memory of the device). Accordingly, closing an application includes stopping and/or removing application processes for the application and removing state information for the application from the memory of the device. Generally, opening a second application while in a first application does not close the first application. When the second application is displayed and the first application ceases to be displayed, the first application becomes a background application.
One or more of the embodiments disclosed herein optionally include one or more of the features disclosed in the following patent applications: “User Interfaces For Interacting with Channels that Provide Content that Plays in a Media Browsing Application” (U.S. Patent Application No. 62/822,952, filed Mar. 24, 2019), “User Interfaces For a Media Browsing Application” (U.S. Patent Application No. 62/822,948, filed Mar. 24, 2019), and “User Interfaces Including Selectable Representations of Content Items” (U.S. Patent Application No. 62/822,942, filed Mar. 24, 2019), each of which is hereby incorporated by reference.
Attention is now directed towards embodiments of user interfaces (“UI”) and associated processes that are implemented on an electronic device, such as portable multifunction device 100, device 300, or device 500.
Users interact with electronic devices in many different manners, including using an electronic device to browse items of content available for playback on the electronic device. In some embodiments, an electronic device is able to present representations of items of content that are available for playback on the electronic device. The embodiments described below provide ways in which an electronic device presents first and second representations of items of content. Enhancing interactions with a device reduces the amount of time needed by a user to perform operations, and thus reduces the power usage of the device and increases battery life for battery-powered devices. It is understood that people use devices. When a person uses a device, that person is optionally referred to as a user of the device.
In
As shown in
The selectable option 606b for initiating playback of the content, when selected, causes the electronic device 500 to play the content if the electronic device 500 is entitled to the content (e.g., has purchased or rented the content from a content store, is subscribed to a channel or provider that provides access to the content, etc.), or, if not, initiates a process for gaining access to the content (e.g., purchasing or renting the content from the content store or subscribing to the channel or provider). The selectable option 606b is presented in accordance with one or more steps of method 900.
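A minimal sketch of this branching, assuming hypothetical Entitlement and PlayAction types (the full behavior is described with reference to method 900), might be:

```swift
// Hypothetical sketch: the play option either starts playback or starts a flow
// for gaining access, depending on whether the device is entitled to the content.
enum Entitlement { case purchased, rented, subscribed, notEntitled }
enum PlayAction { case playNow, beginAccessFlow }

func action(for entitlement: Entitlement) -> PlayAction {
    switch entitlement {
    case .purchased, .rented, .subscribed:
        return .playNow          // entitled: play the content immediately
    case .notEntitled:
        return .beginAccessFlow  // purchase, rent, or subscribe first
    }
}
```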
The selectable option 608b for adding the content to a playback queue of the electronic device 500 optionally includes an icon representing the playback queue. For example, the option 608b includes an icon representing adding an item to a list and the text “Up Next.”
The description 612b of the content optionally includes two columns of information. The first column includes information such as title, release date, rating, genre, language and accessibility information, and the like. The second column includes information about the cast and crew that created the content.
As shown in
As shown in
As shown in
As shown in
As shown in
As shown in
The product page user interface includes an indication 616 of the title of the content, a row 618 of representations of seasons of the content, a row 620 of representations of episodes of the content, a row 622 of descriptions of the episodes of the content, and a row 624 of bonus content. The descriptions of episodes in row 622 include the title of the episode, the text about the episode, the original air date of the episode, and other information about the episode. In some embodiments, rows 620 and 622 include a peek of the next representation of another episode and of the information about that episode. The remainder of the product page is presented in accordance with method 1100. While displaying the product page user interface, in response to detecting a horizontal swipe, the electronic device 500 moves the current focus (e.g., from one item in row 620 to another item in row 620) in accordance with the swipe input, as opposed to navigating to a representation or product page related to a different item of content. As shown in
After the user swipes up from the product page user interface, as shown in
As shown in
As shown in
As shown in
As shown in
As shown in
As shown in
As described below, the method 700 provides ways to present representations of items of content available for playback on the electronic device 500. The method reduces the cognitive burden on a user when interacting with a user interface of the device of the disclosure, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, increasing the efficiency of the user's interaction with the user interface conserves power and increases the time between battery charges.
In some embodiments, such as in
In some embodiments, such as in
In some embodiments, such as in
In some embodiments, such as in
In some embodiments, such as in
In some embodiments, such as in
In some embodiments, such as in
The above-described manner of presenting the second representation of the first content item including the second information and the first information allows the electronic device to present additional information about a content item while enabling the user to continue to browse content (e.g., by presenting the second representations of the second and third content items while presenting the second representation of the first content item), which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the user inputs needed to switch between a user interface for browsing content and a user interface for presenting the second information), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of presenting the video trailer in the background of the second representation allows the electronic device to refrain from playing the trailer while the first representations of the items of content are being presented, which reduces power usage and improves battery life of the electronic device by conserving computing resources while the user is browsing the items of content with the first representations of content.
In some embodiments, such as in
The above-described manner of presenting the video trailer in the full screen mode in response to an upward swipe that is detected while the second representation of the first content item is displayed allows the electronic device to conserve display area for the first and second information and selectable options of the second representation until the upward swipe is received, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by enabling the user to view the information, selectable options, and trailer at the same time until the user decides to enter an input to view the trailer full screen), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by reducing the number of inputs needed to view the first and second information and selectable options while viewing the trailer in the second representation of the first content item.
In some embodiments, playing the video trailer in the second representation 604d, such as in
The above-described manner of presenting the video trailer without audio in the second representation of the first content item and presenting the video trailer with audio in the full-screen mode allows the electronic device to conserve computing resources while presenting the second representation of the first content item by forgoing playing the audio of the video content in the second representation of the first content item, which reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of presenting the video trailer with a first aspect ratio in the second representation of the first content item and presenting the video trailer with a second aspect ratio in the full screen mode allows the electronic device to present the video trailer in a predetermined region of the second representation of the first content item that allows the electronic device to concurrently present the video trailer with the first and second information in the second representation of the first content item and the second representations of the second and third content items, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by enabling the user to concurrently view the trailer, the first and second information, and the second representations of the second and third content items without navigating between different user interfaces), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of presenting the video trailer in the second representation of the first content item in response to a downward swipe that is received while the video trailer is playing in the full screen mode allows the electronic device to continue playing the trailer while also presenting the first and second information and the second representations of the second and third content items, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by enabling the user to concurrently view the trailer, the first and second information, and the second representations of the second and third content items), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of forgoing playing the video trailer of the first content item if the video trailer has previously been presented allows the electronic device to conserve resources if the trailer has already been presented (e.g., by forgoing playing the video trailer if it has already been presented previously), which reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of pausing the video trailer in response to the pause input and playing the video trailer in response to the play input allows the electronic device to enable the user to pause and play the trailer regardless of the location of the current focus in the user interface, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of inputs needed to play or pause the trailer), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of presenting the second representation of the first content item while presenting the second representations of the second and third content items before the downward swipe is detected allows the electronic device to concurrently display the second representations of the first, second, and third content items until the user enters an input to view the second representation of the first content item in the full screen mode, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of inputs needed to view the second representation of the first content item and browse the other content items), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of revealing the second representation of the second or third content item in the primary position in response to the horizontal directional input allows the electronic device to present movement of the second representations of the content items in response to the user's input while moving fewer display pixels than would be required if the second representation of the second or third content item moved a distance equal to or greater than the width of the primary position, which reduces power usage and improves battery life of the electronic device by reducing the complexity of the movement animation.
In some embodiments, such as in
The above-described manner of playing the trailer of the second or third content item when the second representation of the second or third content item is presented allows the electronic device to reduce the number of inputs required to play the trailer of the second or third content item, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient, which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of moving the current focus to the selectable option to play the respective item of content when the second representation of the respective item of content is displayed allows the electronic device to reduce the number of inputs needed to play the respective item of content, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of inputs needed to move the current focus to the selectable option to play the respective content item), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of presenting the first plurality of representations of content items with the current focus on the first representation of the first content item in response to an input to navigate backward in a user interface that is received while presenting the second representations of the content items allows the electronic device to keep the current focus on a representation of the first content item, which enables the user to select the first representation of the first content item if the input to navigate backward was entered in error, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of inputs needed to go back to the second representation of the first content item), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of scrolling the second representations of content items until a second representation of a respective item of content is in the primary position and then presenting the first plurality of representations of content items with the current focus on the first representation of the respective content item in response to an input to navigate backward in a user interface that is received while presenting the second representations of the content items allows the electronic device to keep the current focus on a representation of the respective content item, which enables the user to select the first representation of the respective content item if the input to navigate backward was entered in error, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of inputs needed to go back to the second representation of the respective content item), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of not scrolling the first plurality of representations when the first representation of the respective content item was displayed on the display in the row of the first plurality of representations when the selection of the first representation of the first content item was received and scrolling the first plurality of representations when the first representation of the respective content item was not displayed on the display in the row of the first plurality of representations when the selection of the first representation of the first content item was received allows the electronic device to continue to present a representation of the respective content item when the input to navigate backwards in the user interface is received while presenting the second representation of the respective content item, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of inputs required to continue viewing a representation of the respective content item), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of presenting second representations of items of content in rows that only include content items of a type that is in the first set of one or more content types and presenting product pages corresponding to items of content presented in rows including content of types other than types in the first set of content types allows the electronic device to enable the user to continue to browse content items from the second representations when the selected representation of content is in a row of content in the first set of content types and enables the user to view information about content items that are presented in a row of content including representations of items of content that are not of a type in the first set of content types even if content of a type not in the first set of content types does not have a second representation, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by enabling the user to view information about items of content in response to selection even if the row in which the content is presented does not have second representations of the content items), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of moving the current focus to a selectable option that, when selected, causes the electronic device to play the content in response to the selection of the first representation of the first content item allows the electronic device to reduce the number of inputs needed to play the content, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of inputs to move the current focus to the selectable option that, when selected, causes the electronic device to play the content), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of moving the focus to a different second representation if the horizontal directional input is received while the current focus is in the first region and moving the current focus within the second representation of the first content item if the horizontal directional input is received while the current focus is outside of the first region allows the electronic device to present a plurality of selectable options in a horizontal layout within the second representation of the first content item outside of the first region, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by utilizing the horizontal space on the display to present more selectable options at once to reduce the number of inputs needed to see all the options), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
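As an illustration only (the region and target names below are hypothetical), the focus behavior described above amounts to routing a horizontal directional input based on which region currently has focus:

```swift
// Hypothetical sketch: a horizontal directional input either moves between content
// items or between selectable options within one item, depending on the focused region.
enum FocusRegion { case firstRegion, optionsRegion }
enum HorizontalFocusTarget { case adjacentContentItem, adjacentOptionInSameItem }

func horizontalFocusTarget(for region: FocusRegion) -> HorizontalFocusTarget {
    switch region {
    case .firstRegion:   return .adjacentContentItem        // reveal another item's second representation
    case .optionsRegion: return .adjacentOptionInSameItem   // move among the item's selectable options
    }
}
```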
It should be understood that the particular order in which the operations in
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described with respect to
Users interact with electronic devices in many different manners, including using an electronic device to play various media items. In some embodiments, an electronic device is able to access items of content in a media browsing application through several different ways. The embodiments described below provide ways in which an electronic device presents selectable options for accessing items of content that reflect the ways in which the respective item of content is available. Enhancing interactions with a device reduces the amount of time needed by a user to perform operations, and thus reduces the power usage of the device and increases battery life for battery-powered devices. It is understood that people use devices. When a person uses a device, that person is optionally referred to as a user of the device.
The information 810h about the item of content includes two columns of information. The first column includes information such as the content title, genre, runtime, format, languages, and accessibility options. The second column includes information about the cast and crew of the content.
The selectable option 808h for adding the content to a playback queue of the electronic device 500 is optionally presented with an icon that represents adding an item of content to the queue with the words “Up Next”.
The item of content represented by representation 802h is a movie that is available on the electronic device 500 by purchasing the movie from a content store. Thus, the electronic device 500 presents a selectable option 804h to initiate a process to purchase the content from the content store. As shown in
In
As shown in
As shown in
As shown in
A user interface similar to the user interface illustrated in
Returning to
Returning to
As shown in
In
As shown in
As described below, the method 900 provides ways to present selectable options for initiating a process to access an item of content based on the available ways of accessing the content. The method reduces the cognitive burden on a user when interacting with a user interface of the device of the disclosure, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, increasing the efficiency of the user's interaction with the user interface conserves power and increases the time between battery charges.
In some embodiments, such as in
In some embodiments, such as in
In some embodiments, such as in
In some embodiments, such as in
In some embodiments, such as in
The above-described manner of presenting a set of one or more selectable options that correspond to the sources of the content allows the electronic device to indicate to the user the ways in which the content is available and provide selectable options for gaining access to the content in the ways the content is available, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by informing the user how the content will be accessed before the user accesses the content), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by reducing user error of viewing content through a source the user does not intend to access (e.g., accidentally starting a subscription to a channel or accidentally purchasing content).
In some embodiments, such as in
The above-described manner of presenting no more than a maximum number of selectable options for viewing the content allows the electronic device to conserve display space for other information about the respective content item, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by increasing the amount of information that the user is able to view in the user interface that is specific to the respective content item, thereby reducing the number of user inputs needed to access the information), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of presenting a selectable option to play the content through a source the user is entitled to access and forgoing presenting a selectable option to play the content through a source the user is not entitled to access allows the electronic device to reduce the chance of the user making an error of selecting the selectable option associated with a source the user is not entitled to access when a source the user is entitled to access is available, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the chance for user error), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of presenting an indication of the first source of the content item allows the electronic device to communicate to the user the source of the content in the user interface including a selectable option to access the content, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of inputs needed to navigate between a user interface that includes a selectable option to play the content and a user interface that includes an indication of the source of the content), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of presenting an indication of how much time is remaining in the rental period of the respective content item allows the electronic device to communicate to the user how much time the user has to play the content item in a user interface that includes a selectable option to play the content item, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of inputs required to navigate between a user interface that includes an indication of the amount of time remaining in the rental period and a user interface that includes a selectable option to play the respective item of content), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of presenting a visual indication when the electronic device will open a different application to play the respective content item allows the electronic device to indicate to the user that a different application will be opened to play the respective content item, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by presenting the option to play the respective content item in the media browsing application rather than requiring the user to navigate to the other application to play the respective content item), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of combining two or more manners of accessing the content into one selectable option when the number of manners of accessing the content exceeds a predetermined threshold allows the electronic device to provide more manners of accessing the content than the predetermined threshold while presenting no more than the predetermined threshold number of selectable options, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of inputs needed to traverse all of the selectable options), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by reducing the number of inputs needed to access other information and content in the user interface.
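For illustration, assuming a hypothetical AccessOption type and a hard maximum on the number of displayed options, the collapsing behavior could be sketched as:

```swift
// Hypothetical sketch: when the number of ways to access an item exceeds the
// maximum, the overflow is folded into a single combined option.
enum AccessOption: Equatable {
    case source(String)      // a channel, store, or application that offers the item
    case combined(Int)       // one option standing in for this many additional sources
}

func displayedOptions(for sources: [String], maximumOptions: Int) -> [AccessOption] {
    precondition(maximumOptions >= 1)
    guard sources.count > maximumOptions else {
        return sources.map { .source($0) }
    }
    // Keep the first options and fold the remainder into one combined option.
    let shown = sources.prefix(maximumOptions - 1).map { AccessOption.source($0) }
    return shown + [AccessOption.combined(sources.count - (maximumOptions - 1))]
}

// Example: four sources with a maximum of three options yields two individual
// options plus one combined option covering the remaining two sources.
let options = displayedOptions(for: ["Channel A", "Channel B", "Store", "Other App"],
                               maximumOptions: 3)
```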
In some embodiments, such as in
The above-described manner of playing the content at the previous playback position within the series or within an episode of the episodic content in response to detecting selection of the selectable option allows the electronic device to present the episodic content at the playback position at which the user left off without requiring additional inputs from the user to select the playback position, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of inputs needed to play the content at the previous playback position), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of playing the content from the last playback position if the user has partially watched the content and playing the content from the beginning if the user has not watched the content allows the electronic device to play the content from the playback position at which the user left off without requiring additional user inputs to do so, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of user inputs needed to resume playing the content), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
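A minimal sketch of this resume logic, under the assumption of a hypothetical PlaybackState record, might be:

```swift
import Foundation

// Hypothetical sketch: resume partially watched content from its saved position;
// otherwise play from the beginning.
struct PlaybackState {
    let savedPosition: TimeInterval?   // nil if the content has never been started
    let finishedWatching: Bool
}

func startPosition(for state: PlaybackState) -> TimeInterval {
    if let position = state.savedPosition, !state.finishedWatching {
        return position   // partially watched: resume where the user left off
    }
    return 0              // not started (or already finished): play from the beginning
}
```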
In some embodiments, such as in
The above-described manner of presenting the selectable option to pre-purchase the content along with an indication of when the content will become available for viewing allows the electronic device to present information about when the content will become available while presenting the selectable option to pre-order the content, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of inputs needed to navigate between a user interface that includes information about when the content will become available and a user interface that includes the selectable option to pre-purchase the content), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
It should be understood that the particular order in which the operations in
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described with respect to
Users interact with electronic devices in many different manners, including using an electronic device to view information about a series of episodic content. In some embodiments, an electronic device is able to present representations of the episodes in the series of episodic content. Enhancing interactions with a device reduces the amount of time needed by a user to perform operations, and thus reduces the power usage of the device and increases battery life for battery-powered devices. It is understood that people use devices. When a person uses a device, that person is optionally referred to as a user of the device.
As shown in
As shown in
As shown in
As shown in
In
In
The selectable options 1026d-1032d for accessing the content include an option 1026d to access the content with a channel to which the electronic device 500 is subscribed, an option 1028d to access the content with a channel to which the electronic device 500 is not subscribed, an option 1030d to access the content with another application, and an option 1032d to purchase seasons of the content through the content store. While the current focus is on option 1026d, the electronic device 500 detects a horizontal rightward swipe (e.g., movement of contact 1003). In response to the swipe, the electronic device 500 moves the current focus to option 1032d, as shown in
In
In
As shown in
As shown in
As shown in FIG. 10P, the electronic device 500 presents the row 1014d of seasons, the row 1016d of episodes, the row 1018d of information about the episodes, and the row 1020d of bonus content. Although not shown in the figures, when the current focus is on an item in the row 1020d of bonus content and the electronic device 500 receives an input to move the current focus up, the electronic device 500 moves the current focus to an item in the row 1016d of episodes, skipping the row 1018d of information about the episodes, reducing the number of inputs needed to select a representation of an episode in row 1016d to play the episode.
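Purely as an illustration of this focus-skipping behavior (the row names are hypothetical), moving focus upward from the bonus-content row can be modeled as a lookup that bypasses the description row:

```swift
// Hypothetical sketch: moving focus up from the bonus-content row lands on the
// episode row, skipping the row of episode descriptions.
enum ProductPageRow {
    case seasons, episodes, episodeDescriptions, bonusContent
}

func rowAbove(_ row: ProductPageRow) -> ProductPageRow? {
    switch row {
    case .seasons:             return nil
    case .episodes:            return .seasons
    case .episodeDescriptions: return .episodes
    case .bonusContent:        return .episodes   // skip the description row on the way up
    }
}
```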
While the current focus is on a representation 1016 of an episode, the electronic device 500 detects a horizontally scrolling input (e.g., movement of contact 1003). In response to the user input, the electronic device 500 moves the current focus in accordance with movement of contact 1003, as shown in
In
As shown in
In
As shown in
As shown in
In
As shown in
In
As shown in
In
As shown in
As shown in
As shown in
In
As described below, the method 1100 provides ways to present representations of episodes in a series of episodic content. The method reduces the cognitive burden on a user when interacting with a user interface of the device of the disclosure, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, increasing the efficiency of the user's interaction with the user interface conserves power and increases the time between battery charges.
In some embodiments, such as in
In some embodiments, such as in
In some embodiments, such as in
In some embodiments, such as in
In some embodiments, such as in
In some embodiments, such as in
In some embodiments, such as in
The above-described manner of updating the current focus to the respective representation of the respective episode in response to an input to move the current focus to the second region allows the electronic device to directly move the current focus to the representation of the respective episode, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of directional inputs required to move the current focus to the representation of the respective episode), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by reducing the number of inputs required to initiate playback of an episode by selecting a representation of a respective episode.
In some embodiments, such as in
The above-described manner of moving the current focus from the respective representation of the respective episode to the respective selectable representation of a respective informational item associated with the respective episode and presenting the expanded representation of the respective informational item in response to selection of the respective representation of the respective informational item allows the electronic device to present a subset of the second information before the respective representation of the respective informational item is selected, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by conserving display area for content other than the second information until the user requests to view the second information), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of presenting the respective representation of the respective informational item with a first visual characteristic when the current focus is not on the respective representation of the respective episode or the respective representation of the respective informational item, with a second visual characteristic when the current focus is on the respective representation of the respective episode, and with a third visual characteristic when the current focus is on the respective representation of the respective informational item allows the electronic device to indicate to the user that the respective informational item is associated with the respective episode when the current focus is on the respective representation of the respective episode, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the amount of time it takes the user to identify which informational item is associated with the respective episode), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of moving the current focus to the respective season to which the respective episode belongs in response to an input to move the current focus from the representation of the respective episode to a representation of a season allows the electronic device to reduce the chances of the user selecting a different season in error, such as while scrolling past the representations of the seasons to a different part of the user interface, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of inputs needed to continue viewing the season to which the respective episode belongs), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently and by reducing user errors.
In some embodiments, such as in
The above-described manner of replacing the representations of episodes and information about episodes in the first season with representations of episodes and information about episodes in the second season in response to moving the current focus from the first selectable representation of the first season to the second selectable representation of the second season allows the electronic device to reduce the number of inputs needed to view episodes in the second season compared to requiring the user to scroll through a plurality of representations of episodes in the first or other seasons, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of inputs needed to view episodes from the second season), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
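To illustrate (with hypothetical types only), changing which season has the current focus simply swaps the data backing the episode and description rows, rather than navigating to a different page:

```swift
// Hypothetical sketch: focusing a different season replaces the episode and
// description rows with that season's episodes.
struct EpisodeSummary { let title: String; let episodeDescription: String }

struct SeasonBrowser {
    let episodesBySeason: [Int: [EpisodeSummary]]
    private(set) var focusedSeason: Int
    private(set) var visibleEpisodes: [EpisodeSummary]

    init(episodesBySeason: [Int: [EpisodeSummary]], focusedSeason: Int) {
        self.episodesBySeason = episodesBySeason
        self.focusedSeason = focusedSeason
        self.visibleEpisodes = episodesBySeason[focusedSeason] ?? []
    }

    // Called when the current focus moves to another season's selectable representation.
    mutating func focusSeason(_ season: Int) {
        focusedSeason = season
        visibleEpisodes = episodesBySeason[season] ?? []   // episode and description rows update together
    }
}
```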
In some embodiments, such as in
The above-described manner of scrolling the representations of the episodes and the representations of the descriptions of episodes together allows the electronic device to maintain the association of respective representations of respective episodes with the respective representations of descriptions of respective episodes while also allowing the representations of episodes and the representations of descriptions of episodes to be independently selectable to perform different actions, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by maintaining the visual association between the representation of each respective episode and the representation of information about it while also presenting a selectable option to initiate a process to view the episode and a selectable option to view more information about the episode proximate to one another), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by reducing the number of inputs needed to view the association of a representation of an episode to a description of the episode, to initiate a process to view an episode, and to view additional information about the episode.
In some embodiments, such as in
The above-described manner of presenting representations of the one or more manners of accessing the content allows the electronic device to reduce the number of inputs needed to view the different manners of accessing the content and to select one of the manners to play the content, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of inputs needed to view whether an item of content is available through each of a plurality of manners of accessing content and to play the content via one of the manners), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of presenting the representation that corresponds to the purchased content allows the electronic device to reduce the chances of a user error of selecting a different manner of viewing the content, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of inputs needed to correct an error by avoiding the error), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of presenting a selectable option to purchase one or more episodes of the collection of episodic content allows the electronic device to provide to the user a way of purchasing the content in a user interface that includes further information about the content, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of inputs needed to navigate between a user interface that includes information about the content and a user interface that includes the selectable option to purchase the content), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of presenting the season purchase representations in response to selection of the representation of the purchase allows the electronic device to conserve display area before selection of the representation of the purchase by presenting the single representation of the purchase rather than presenting each of the season purchase representations, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by enabling the user to view more information about the collection of episodic content prior to selecting the representation of the purchase), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of playing a respective episode of the one or more episodes upon successful purchase of the one or more episodes allows the electronic device to reduce the number of inputs needed to play the content, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient, which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of presenting a selectable option within the unified media browsing application that is selectable to view the content in the separate application allows the electronic device to present information about accessing the content through applications that are not the unified media browsing application, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of inputs needed to open the different application to see if the content is available via the different application), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of presenting the representation of the respective channel in the prioritized position allows the electronic device to reduce the number of inputs needed to navigate to the representation of the respective channel, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of inputs needed to watch the content within the unified browsing application), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
The above-described manner of presenting representations of information that are selectable to present additional information allows the electronic device to reduce the amount of screen area used for information before one of the representations is selected, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by reducing the number of inputs needed to view information other than the expanded information before one of the representations is selected), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, such as in
It should be understood that the particular order in which the operations in
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described with respect to
Users interact with electronic devices in many different manners, including using an electronic device to browse for items of content available for playback on the electronic device. In some embodiments, an electronic device is able to present a preview of items of content available via respective applications on the electronic device. The embodiments described below provide ways in which an electronic device presents enhanced previews of items of content. Enhancing interactions with a device reduces the amount of time needed by a user to perform operations, and thus reduces the power usage of the device and increases battery life for battery-powered devices. It is understood that people use devices. When a person uses a device, that person is optionally referred to as a user of the device.
In some embodiments, user interface 1200-1 includes a content preview region 1208. In some embodiments, content preview region 1208 displays a preview of content available on the electronic device via the applications installed on the device. In some embodiments, the content displayed in content preview region 1208 is content available from the application that currently has focus. In some embodiments, the content displayed in the content preview region 1208 is a still image, a slideshow of still images or videos, and/or a video. In some embodiments, when content is previewed in content preview region 1208, the device does not play the accompanying or corresponding audio of the content being previewed (e.g., if the preview is a video, then the accompanying audio is muted). In some embodiments, the content preview region 1208 encompasses the entire display and is displayed as a background beneath the other user interface elements on user interface 1200-1 (e.g., the rows of content are overlaid over the content preview region 1208). In some embodiments, user interface 1200-1 includes a prioritized row of applications (e.g., row 1202) at or near the bottom of the user interface (although it is understood that row 1202 can be displayed anywhere on the user interface). In some embodiments, the prioritized row of applications 1202 is visually indicated and/or separated from other rows of applications. For example, as shown in
In
In some embodiments, the unified media browsing application is an application that provides a centralized location for browsing, viewing, or otherwise accessing content on the electronic device. The unified media browsing application optionally receives content viewing information from multiple content providers and/or applications for viewing content from those content providers that are installed on the electronic device (e.g., the content providers that have enabled sharing of content viewing information with the unified media browsing application, such as a separate CBS application, a separate Fox application, a separate ESPN application, etc. (e.g., such as provider 1, provider 2, provider 3 discussed above with reference to
In some embodiments, when a content item is previewed in the content preview region 1208, then the user is able to perform a gesture to request display of an enhanced preview of the content item currently being displayed in the content preview region 1208. In some embodiments, an upward gesture (e.g., an upward navigational gesture performed on a touch-sensitive surface of a remote control device) corresponds to a request to display an enhanced preview of the content item. In some embodiments, user interface 1200-1 displays a hint 1210 at or near the top of the user interface (e.g., overlaid over content preview region 1208) that indicates to the user that performing an upward swipe gesture causes display of an enhanced preview of the content item.
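The gesture handling described above can be illustrated with a minimal Swift sketch. The snippet is illustrative only; the RemoteGesture and HomeAction names and the action(for:enhancedPreviewAvailable:) function are hypothetical and are not part of this disclosure.

```swift
// Minimal sketch: an upward navigational gesture requests the enhanced
// preview when one is available; other gestures move focus as usual.
// All names are hypothetical.
enum RemoteGesture { case up, down, left, right, select }

enum HomeAction {
    case showEnhancedPreview       // e.g., in response to an upward swipe while a hint is shown
    case moveFocus(RemoteGesture)  // ordinary navigation within the home user interface
}

func action(for gesture: RemoteGesture, enhancedPreviewAvailable: Bool) -> HomeAction {
    switch gesture {
    case .up where enhancedPreviewAvailable:
        return .showEnhancedPreview
    default:
        return .moveFocus(gesture)
    }
}

print(action(for: .up, enhancedPreviewAvailable: true))   // showEnhancedPreview
```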
In
In
In
In
In
As shown in
In
In
As shown in
In
In
In
It is understood that rightward and leftward navigational inputs are performable to cause navigation to other items associated with the arcade application, similarly to the processes described in
In
In
As shown in
In
In
In
In
In some embodiments, user interface 1200-10 includes the name of the playlist as well as the date when the playlist was most recently updated (e.g., “Monday”). In some embodiments, without user input, selectable options 1262 and 1264 are displayed. In some embodiments, selectable option 1262 is selectable to launch or display the music application and cause playback of the previewed playlist (e.g., playlist 1). In some embodiments, selectable option 1264 is selectable to launch or display the music application and display a browsing user interface to browse through the music videos in the respective playlist or browse through all available music videos.
In
In
In some embodiments, because representation 1266-1 and representation 1266-2 are now in prioritized row 1202, the representations have access to the content preview functions of the prioritized row. However, in some embodiments, not all applications are compatible with the full features of the prioritized row. Thus, for example, as shown in
In
In FIG. 12AAA, a user input 1203 corresponding to a rightward navigation is received. In some embodiments, in response to the user input, the focus is moved to representation 1266-2 corresponding to App 2. In some embodiments, even though App 2 is not originally in the prioritized row 1202 (e.g., when it is not a recently opened app), App 2 does support the features and functionalities of enhanced preview mode. In some embodiments, because App 2 supports the features and functionalities of enhanced preview mode, content preview region 1208 displays a preview (e.g., optionally the preview extends over the entire length and width of the user interface such that the user interface elements are overlaid over the preview) of an item associated with App 2 (e.g., Item B). In some embodiments, hint 1210 is displayed to indicate that enhanced preview mode is available and that an upward swipe gesture will cause the device to enter into an enhanced preview mode for App 2.
Thus, in some embodiments, one or more applications installed on device 500 support enhanced preview mode. In some embodiments, a user is able to move applications to different rows, including into and out of the prioritized row 1202. In some embodiments, if an application supports enhanced preview mode, then when the application is in the prioritized row and has a focus, content is displayed in content preview region 1208 as discussed above and the user is able to enter into enhanced preview mode. In some embodiments, if the application supports enhanced preview mode and is not in the prioritized row 1202, then when the application has a focus, content is not displayed in content preview region 1208 and the user is not able to enter into enhanced preview mode. In some embodiments, if an application does not support enhanced preview mode and is not in the prioritized row 1202, then when the application has a focus, content is not displayed in content preview region 1208 and the user is not able to enter into enhanced preview mode. In some embodiments, if an application does not support enhanced preview mode and is in the prioritized row 1202, then when the application has a focus, selectable representations of content are displayed in the content preview region (e.g., as individual icons rather than a preview), and the user is not able to enter into enhanced preview mode.
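One way to summarize the four cases above is as a small decision table keyed on whether an application supports enhanced preview mode and whether it is in the prioritized row. The Swift sketch below is illustrative only; the type and function names are hypothetical.

```swift
// Sketch of the four cases described above. Names are hypothetical.
enum PreviewRegionState {
    case preview        // content preview displayed in the content preview region
    case contentIcons   // selectable representations of content (individual icons) only
    case none           // nothing displayed in the content preview region
}

struct PreviewBehavior {
    let regionState: PreviewRegionState
    let canEnterEnhancedPreview: Bool
}

func behavior(supportsEnhancedPreview: Bool, inPrioritizedRow: Bool) -> PreviewBehavior {
    switch (supportsEnhancedPreview, inPrioritizedRow) {
    case (true, true):
        return PreviewBehavior(regionState: .preview, canEnterEnhancedPreview: true)
    case (false, true):
        return PreviewBehavior(regionState: .contentIcons, canEnterEnhancedPreview: false)
    case (true, false), (false, false):
        return PreviewBehavior(regionState: .none, canEnterEnhancedPreview: false)
    }
}
```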
As described below, the method 1300 provides ways to present enhanced previews of items of content available via respective applications on the electronic device 500. The method reduces the cognitive burden on a user when interacting with a user interface of the device of the disclosure, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, increasing the efficiency of the user's interaction with the user interface conserves power and increases the time between battery charges.
In some embodiments, such as in
In some embodiments, the icons of applications represent applications that are downloaded and/or installed on the electronic device. In some embodiments, the applications include a unified media browsing application, one or more content provider applications, a settings application, a music application, a podcast application, a photo gallery application, an application store application, etc. In some embodiments, the unified media browsing application provides a centralized location for browsing, viewing, or otherwise accessing content on the electronic device. The unified media browsing application optionally receives content viewing information from multiple content providers and/or applications for viewing content from those content providers that are installed on the electronic device (e.g., the content providers that have enabled sharing of content viewing information with the unified media browsing application, such as a separate CBS application, a separate Fox application, a separate HBO application, etc.) and aggregates the shared information into a catalog of available content. In some embodiments, the content provider applications have access to content from a specific provider, such as a primary or secondary content provider. In some embodiments, a primary content provider is a content provider (e.g., Comcast, Time Warner, etc.) that provides the user access to a plurality of secondary content providers (e.g., CBS, Fox, HBO, etc.). In some embodiments, the music application provides access to a plurality of music that the user is entitled to access. In some embodiments, the podcast application provides access to a plurality of podcasts that are available on the electronic device. In some embodiments, the photo gallery application provides access to a plurality of photographs, memories, collections, and/or albums that are associated with the user of the electronic device's account. In some embodiments, the home user interface includes a content preview region and an application icon region. In some embodiments, the content preview region displays content associated with the application that has a focus. In some embodiments, the first region of the home user interface is a prioritized row of icons. In some embodiments, when an icon in the prioritized row of icons receives a focus, the content preview region displays a preview of content associated with the application whose icon has focus. In some embodiments, not all applications have all the content preview features that are available. Thus, in some embodiments, some applications in the prioritized row of icons have limited content preview functionalities and other applications in the prioritized row of icons have full content preview functionalities. In some embodiments, the second region of the home user interface is a row of icons other than the prioritized row of icons. In some embodiments, the rows of icons other than the prioritized row of icons are displayed beneath the prioritized row of icons and are accessible by navigating the home user interface downwards. In some embodiments, when icons in rows other than the prioritized row of icons have a focus, the content preview region does not display content associated with the application that has focus. Thus, in some embodiments, only the applications in the prioritized row of icons cause content to be displayed in the content preview region when the respective application has a focus.
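The relationship between the prioritized row, the other rows, and the content preview region described above can be sketched as a simple model. The Swift snippet below is a minimal, hypothetical illustration; the HomeScreen and AppTile names are assumptions and do not come from this disclosure.

```swift
// Minimal sketch: the content preview region mirrors the focused tile in
// the prioritized row; icons in other rows do not drive the preview.
// All names are hypothetical.
struct AppTile {
    let name: String
    let previewContent: String?   // e.g., an artwork or trailer identifier
}

struct HomeScreen {
    var prioritizedRow: [AppTile]
    var otherRows: [[AppTile]]
    var focusedPrioritizedIndex: Int?   // nil when focus is outside the prioritized row

    // Preview content is shown only while a prioritized-row icon has focus.
    var previewRegionContent: String? {
        guard let index = focusedPrioritizedIndex,
              prioritizedRow.indices.contains(index) else { return nil }
        return prioritizedRow[index].previewContent
    }
}

let home = HomeScreen(
    prioritizedRow: [AppTile(name: "Unified media browsing", previewContent: "Trailer for Item A"),
                     AppTile(name: "Music", previewContent: "Playlist artwork")],
    otherRows: [[AppTile(name: "Settings", previewContent: nil)]],
    focusedPrioritizedIndex: 0
)
print(home.previewRegionContent ?? "no preview")   // "Trailer for Item A"
```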
In some embodiments, while displaying the home user interface for the electronic device in which a respective application icon has a current focus, the electronic device receives (1304), via the one or more input devices, an indication of a directional input in a respective direction, such as in
In some embodiments, in response to receiving the indication of the directional input in the respective direction (1306), such as in
In some embodiments, the content corresponding to the first application icon is the content that was displayed in the content preview region before receiving the upward swipe input. In some embodiments, the content is displayed in a full-screen mode (e.g., without displaying any other user interface elements and/or other content). In some embodiments, displaying the content includes playing audio associated with the content, where the audio was not played before entering full-screen mode. In some embodiments, the content is a still photograph, a slide show, a short clip, a trailer, or any other suitable promotional content. In some embodiments, the content is content that is available from the first application. Thus, in some embodiments, the upward swipe input causes an upward navigation beyond the top-most row of icons, thereby exiting the home user interface and entering into a content display user interface (e.g., a substantially full-screen content display user interface).
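A minimal sketch of this transition, in which audio accompanies the preview only once the full-screen content display user interface is entered, is shown below; the names are hypothetical.

```swift
// Sketch: the same preview is muted on the home user interface and plays
// audio only in the full-screen content display user interface. Illustrative only.
struct PreviewPlayback {
    var isFullScreen: Bool
    var audioEnabled: Bool { isFullScreen }   // audio only in the content display user interface
}

var playback = PreviewPlayback(isFullScreen: false)
print(playback.audioEnabled)    // false: muted preview behind the home user interface
playback.isFullScreen = true    // e.g., after an upward swipe past the top-most row
print(playback.audioEnabled)    // true: audio accompanies the full-screen preview
```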
In some embodiments, in accordance with a determination that the respective application icon is a second application icon in the second set of application icons, the electronic device moves (1314) the current focus from the second application icon to another application icon while maintaining display of the home user interface, such as in
The above-described manner of displaying content associated with an application when the application is in a prioritized region of the user interface allows the electronic device to provide the user with access to promotional content associated with the application, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by providing a mechanism for the user to preview content available from the application without requiring the user to launch the application or perform additional user inputs to preview the content), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of accessing content.
In some embodiments, while the first application icon is in the second region of the home user interface and has the current focus, the electronic device receives (1316), via the one or more input devices, an indication of a second directional input in the respective direction, such as in
In some embodiments, in response to receiving the indication of the second directional input in the respective direction (1318), such as in
In some embodiments, the second region is a second row of icons that is beneath the first row of icons (e.g., beneath the first region of the home user interface, beneath the prioritized row), and an upward navigation causes the focus to move from the first application in the second row to another application that is in the first row. In some embodiments, if the first application is in the second region (e.g., not in the prioritized row), then when focus is on the first application, the content preview region does not display content corresponding to the first application and optionally displays content corresponding to another application. In some embodiments, the upward swipe does not cause the device to enter into a content display user interface.
The above-described manner of displaying content associated with an application in a prioritized region (e.g., by providing content preview features to items in the prioritized region, but not providing content preview features to items that are not in the prioritized region) allows the electronic device to emphasize applications that are displayed in the prioritized region (e.g., by displaying content in the content preview region and providing the user with the ability to view the displayed content in a more immersive user interface for applications that are in the prioritized region, but not similarly displaying content for applications that are not in the prioritized region), which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by providing the user with a mechanism to quickly view content available from applications from which the user is more likely to want to view content and not providing the user with the same mechanism for applications from which the user is less likely to want to view content), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of accessing content for a subset of applications on the electronic device.
In some embodiments, while the second application icon is in the first region of the home user interface and has the current focus, the electronic device receives (1322), via the one or more input devices, an indication of a second directional input in the respective direction, such as in
In some embodiments, in response to receiving the indication of the second directional input in the respective direction (1324), such as in
In some embodiments, the user interface corresponding to the application is a full screen or substantially full screen display of the content that was displayed in the content preview region before receiving the upward swipe input. In some embodiments, if the second application is not compatible with the functionalities of the prioritized row of applications, then performing an upward swipe does not cause display of the content corresponding to the second application. In such embodiments, the content preview region displays one or more representations of content (e.g., icons of content rather than a preview image or preview video of the content), and an upward swipe causes the focus to move from the second application to the representations of content that are in the content preview region (e.g., the content preview region is displayed above the prioritized row of applications).
The above-described manner of displaying content associated with an application in a prioritized region (e.g., by providing content preview features to items in the prioritized region, but not providing content preview features to those same items when they are not in the prioritized region) allows the electronic device to emphasize the applications that are displayed in the prioritized region (e.g., by displaying content in the content preview region and providing the user with the ability to view the displayed content in a more immersive user interface for applications that are in the prioritized region, which the user has indicated that he or she is more likely to access due to their inclusion in the prioritized region), which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by providing the user with a mechanism to quickly view content available from applications that the user is more likely to access, without requiring the user to navigate into the respective application to browse for and view the same content), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of accessing content for a subset of applications on the electronic device.
In some embodiments, while displaying the home user interface for the electronic device in which the respective application icon has the current focus, the electronic device receives (1330), via the one or more input devices, an indication of a second directional input in a second respective direction, different than the respective direction, such as in
In some embodiments, in response to receiving the indication of the second directional input in the second respective direction, the electronic device reveals (1332), in the second region of the home user interface, additional application icons for additional applications on the electronic device, such as in
In some embodiments, scrolling the user interface downwards comprises moving the prioritized row of applications upwards and displaying another row of applications below the prioritized row of applications. In some embodiments, when any application from the prioritized row of applications has a focus, then the prioritized row of applications is displayed at or near the bottom of the display and is the only row that is displayed on the display (e.g., optionally the row below the prioritized row of applications is partially displayed beneath the prioritized row of applications as if “peeking” from the bottom of the display). Thus, in some embodiments, a downward navigation causes the row below the prioritized row of applications to be revealed and focus to be moved to that row.
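A brief sketch of this downward navigation is shown below; the RowBrowser type and its members are hypothetical and only illustrate revealing the next row as focus moves into it.

```swift
// Sketch: a downward navigation reveals the row beneath the prioritized row
// and moves focus into it. Names are hypothetical.
struct RowBrowser {
    var rows: [[String]]           // rows[0] is the prioritized row
    var focusedRow: Int = 0
    var visibleRowCount: Int = 1   // only the prioritized row is shown initially

    mutating func navigateDown() {
        guard focusedRow + 1 < rows.count else { return }
        focusedRow += 1
        visibleRowCount = max(visibleRowCount, focusedRow + 1)   // reveal the newly focused row
    }
}

var browser = RowBrowser(rows: [["TV", "Music", "Arcade"], ["Settings", "Photos"]])
browser.navigateDown()
print(browser.focusedRow, browser.visibleRowCount)   // 1 2
```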
The above-described manner of displaying other applications installed on the electronic device (e.g., by displaying other rows of content in response to a downward navigation) allows the electronic device to display only the applications that the user is more likely to access unless otherwise requested (e.g., by displaying only the prioritized row of icons until the user performs an input corresponding to a request to view other rows of applications), which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by displaying only those applications in the prioritized row of applications to minimize the items displayed on the user interface, unless or until the user requests display of other rows of applications by performing a downward navigation input), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of accessing content for a subset of applications on the electronic device.
In some embodiments, the application icons in the first region of the home user interface are displayed overlaid on a background, the background comprising a video preview (1334), such as in
In some embodiments, while displaying the home user interface for the electronic device in which the respective application icon has the current focus, the electronic device displays (1336), as the background, a video preview of content associated with the respective application icon, such as in
In some embodiments, if the user moves focus from one application icon in the prioritized row of content to another application icon in the prioritized row of content (e.g., due to a leftward or rightward navigation request), then the content preview region updates to display content from the newly-focused application. In some embodiments, the content preview region only displays content from the prioritized row of content. In some embodiments, if the content preview region is still displayed when an icon from a row of content beneath the prioritized row of content has a focus, then the content preview region does not display any content or displays content corresponding to the application from the prioritized row of content that previously had focus (e.g., if the user scrolled down from the prioritized row to a non-prioritized row, then the content preview region continues to display content from the prioritized row of content). In some embodiments, the content displayed in the content preview region is a video preview of the content associated with the respective application icon, such as a trailer or teaser video. In some embodiments, the content displayed in the content preview region is a still image or a slideshow. In some embodiments, the content associated with the respective application icon comprises content that is accessible via the respective application.
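The preview-updating behavior described above can be sketched as a small tracker that only updates when focus moves within the prioritized row; the names below are hypothetical.

```swift
// Sketch: left/right moves within the prioritized row update the preview,
// while moving focus to a non-prioritized row leaves the previous preview
// in place (or shows none). Names are hypothetical.
struct PreviewTracker {
    var previewSource: String? = nil

    mutating func focusChanged(to app: String, inPrioritizedRow: Bool) {
        if inPrioritizedRow {
            previewSource = app
        }
        // Otherwise, keep showing content from the previously focused prioritized app.
    }
}

var tracker = PreviewTracker()
tracker.focusChanged(to: "Unified media browsing", inPrioritizedRow: true)
tracker.focusChanged(to: "Settings", inPrioritizedRow: false)
print(tracker.previewSource ?? "none")   // "Unified media browsing"
```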
The above-described manner of displaying content associated with an application in a prioritized region (e.g., by displaying a video in the content preview region corresponding to the application that currently has focus) allows the electronic device to provide a more substantial preview of the content associated with the application that has focus, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by providing the user with a mechanism to meaningfully preview the content that is available from the application without requiring the user to view only still images or navigate to the respective application to view information about the content), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of previewing content.
In some embodiments, while displaying the first region of the home user interface for the electronic device, the electronic device displays (1338), overlaid on the background, a visual indication that a directional input in the respective direction will cause the home user interface to cease to be displayed and content corresponding to the respective application icon to be displayed, such as in
The above-described manner of displaying a hint of how to access the content display user interface (e.g., by displaying a visual indication that directional input will cause the device to enter into the content display user interface) allows the electronic device to ensure that the user knows how to access the more immersive user interface for previewing content associated with the application, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by displaying a visual indication that if the user performs a particular gesture while the respective application has a focus, then the user will be presented with a more immersive user experience to view the content that is currently being displayed in the content preview region), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of previewing content available from an application.
In some embodiments, while displaying the home user interface for the electronic device in which the respective application icon has the current focus, the video preview of the content associated with the respective application icon is displayed without corresponding audio of the video preview (1340), such as in
In some embodiments, while displaying the video preview of the content associated with the respective application icon without displaying the home user interface in response to a directional input in the respective direction received while the respective application icon had the current focus (e.g., the content display user interface (e.g., the full screen or substantially full screen display of content associated with the respective application)), the video preview of the content associated with the respective application icon is displayed with the corresponding audio of the video preview (1342), such as in
The above-described manner of displaying content associated with an application in a prioritized region (e.g., by playing audio when the user enters into the full-screen content display user interface, but not playing audio when the user is on the home screen user interface) allows the electronic device to provide the user with a more immersive experience when the user requests the more immersive experience, but otherwise to not play audio when the user is potentially browsing for media, when audio is potentially disruptive and distracting, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by providing the user with a mechanism to preview the content with audio only when the user performs a user input requesting a more immersive experience), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, the application icons in the first region of the home user interface are displayed overlaid on a background, the background comprising a still image (1344), such as in
The above-described manner of displaying content associated with an application in a prioritized region (e.g., by displaying a still image of content in the content preview region) allows the electronic device to provide the user with a preview of content without overly distracting the user (e.g., by displaying still images in the content preview region when the user has not yet performed an input indicating a request to view the displayed content), which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by providing the user with a mechanism to preview the content without overly crowding the user interface), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of previewing content available from applications on the electronic device.
In some embodiments, the displaying the content corresponding to the first application icon includes displaying, overlaid on the content corresponding to the first application icon, one or more visual indications that directional input in a second respective direction, different than the respective direction, will cause display of additional content corresponding to the first application icon without displaying the home user interface (1348), such as in
In some embodiments, the next or previous item is associated with the respective application that had focus when the device entered into the content display user interface (e.g., optionally corresponding to other items that are accessible from the respective application that had focus). In some embodiments, the visual indicators are left-facing and right-facing carets or chevrons (e.g., less-than or greater-than symbols). In some embodiments, the visual indicators are only displayed for a threshold amount of time (e.g., for 1 second, 2 seconds, 3 seconds after the initial display of the content display user interface or after content in the content display user interface is changed to another content). In some embodiments, the visual indicators are only displayed when a user input is detected (e.g., a touch-down on a touch-sensitive surface or a navigational input). In some embodiments, only one of the visual indicators is shown if navigation can only proceed in one direction (e.g., only the rightward indicator is shown if the user can only navigate in the rightward direction, and similarly for the leftward navigational direction). In some embodiments, the visual indicators include pagination markers at or near the bottom of the display. In some embodiments, the pagination markers include dots that correspond to the number of available “pages” corresponding to different content that can be navigated to. In some embodiments, the pagination markers include dashes that correspond to the number of available pages. In some embodiments, if there is only one content item for display in the content display user interface, then the visual indicators are not shown.
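The indicator rules described above (a directional indicator only when navigation in that direction is possible, and pagination markers only when more than one item is available) can be sketched as follows; the names are hypothetical.

```swift
// Sketch of the navigation indicators in the content display user interface.
// Names are hypothetical.
struct PreviewIndicators {
    let itemCount: Int      // number of previewable items for the focused application
    let currentIndex: Int   // index of the item currently displayed

    var showLeftIndicator: Bool { currentIndex > 0 }
    var showRightIndicator: Bool { currentIndex < itemCount - 1 }
    var paginationMarkerCount: Int { itemCount > 1 ? itemCount : 0 }   // hidden for a single item
}

let indicators = PreviewIndicators(itemCount: 3, currentIndex: 0)
print(indicators.showLeftIndicator, indicators.showRightIndicator, indicators.paginationMarkerCount)
// false true 3
```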
The above-described manner of displaying hints for displaying additional content (e.g., by displaying visual indications that swiping to the left or right will cause display of additional content associated with the first application in the content display user interface) allows the electronic device to ensure that the user knows that additional content is available to be previewed by the user, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by indicating to the user that the user can preview other content items associated with the first application that the user may be interested in), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of browsing for content from a particular application on the electronic device.
In some embodiments, while displaying the content corresponding to the first application icon, the electronic device receives (1350), via the one or more input devices, an indication of a second directional input in a second respective direction, different than the respective direction, such as in
In some embodiments, in response to receiving the indication of the second directional input in the second respective direction (1352), such as in
The above-described manner of displaying additional content associated with an application in a prioritized region (e.g., by displaying additional content in response to a leftward or rightward navigation) allows the electronic device to provide the user with previews of other content that is also associated with the application, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by providing the user with a mechanism to quickly preview a plurality of content available from the first application without requiring the user to navigate to the first application to browse for content), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, while displaying the content corresponding to the first application icon, the electronic device detects (1358), via a remote control device with a touch-sensitive surface, an input on the touch-sensitive surface, such as in
In some embodiments, in response to detecting the input on the touch-sensitive surface (1360), such as in
In some embodiments, the information includes a short synopsis or description of the content item, the title of the content item, the year of publication of the content item, the rating of the content item (reviews, maturity ratings, etc.), the duration of the content item, the cast and crew associated with the content item (e.g., actors, directors, producers, etc.), audio/visual characteristics of the content item (e.g., icons indicating whether the item is playable in HD quality, 4K quality, HDR quality, etc.), or any combination of the foregoing. In some embodiments, the one or more selectable options includes a selectable option that is selectable to cause playback of the content item (e.g., cause the display of the content item in an application for displaying the content item). In some embodiments, the one or more selectable options includes a selectable option to acquire access to the content item. In some embodiments, the one or more selectable options includes a selectable option to display a user interface corresponding to the content item (e.g., display the content item's product page). In some embodiments, the one or more selectable options includes a selectable option to add the content item to a playback queue (e.g., an “Up Next” queue). In some embodiments, other selectable options for performing other actions are possible. In some embodiments, the information and/or the one or more selectable options are always displayed on the content display user interface, without requiring that the user perform a user input that satisfies the first criteria.
In some embodiments, in accordance with a determination that the input does not satisfy the one or more first criteria, the electronic device forgoes (1364) displaying the information about the content corresponding to the first application icon and the one or more selectable options to perform the one or more actions with respect to the content corresponding to the first application icon, such as in FIG. 12E (e.g., if the user input does not correspond to a click input or a touch-down input, then do not display the information and/or the selectable options). For example, if the user input corresponds to a navigational swipe input, then optionally perform a navigation action rather than display the information and/or selectable options.
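A minimal sketch of this branching on the kind of input, assuming hypothetical input and response types, is shown below.

```swift
// Sketch: a click or touch-down satisfies the first criteria and reveals the
// information and selectable options; a navigational swipe navigates instead.
// Names are hypothetical.
enum TouchInput { case click, touchDown, swipe }

enum PreviewResponse { case showInfoAndOptions, navigate }

func respond(to input: TouchInput) -> PreviewResponse {
    switch input {
    case .click, .touchDown:
        return .showInfoAndOptions
    case .swipe:
        return .navigate
    }
}

print(respond(to: .click))   // showInfoAndOptions
```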
The above-described manner of receiving more information and performing actions with respect to the content displayed in the content display user interface (e.g., by displaying information and selectable options in response to a user input that satisfies the first criteria) allows the electronic device to display a clean user interface until the user requests display of other elements on the display (e.g., other information and/or selectable options for performing actions associated with the content), which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by providing the user with a simple preview user interface, but also providing the user with a mechanism to view more information and/or perform actions associated with the content), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of previewing and accessing content on the electronic device.
In some embodiments, the one or more selectable options to perform the one or more actions with respect to the content corresponding to the first application icon are arranged along the respective direction (1366), such as in
The above-described manner of displaying selectable options for performing actions associated with the displayed content (e.g., by displaying the selectable options arranged along a respective direction) allows the electronic device to provide the user with an easily navigable user interface (e.g., by displaying all selectable options along only one direction such that navigation in one direction will allow the user to access all of the selectable options), which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by providing the user with a mechanism to quickly navigate through the selectable options without requiring the user to perform different gestures to access all of the available selectable options), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, the one or more selectable options to perform the one or more actions with respect to the content corresponding to the first application icon includes a respective selectable option that is selectable to display a first application corresponding to the first application icon, and display, in the first application, respective content corresponding to the content corresponding to the first application icon (1368), such as in
The above-described manner of displaying the previewed content (e.g., by causing display of the content in response to a selection of a selectable option) allows the electronic device to provide the user with a method to cause playback of the content after previewing the content, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by providing the user with a mechanism to quickly cause full playback of the previewed content item without requiring the user to navigate away from the content display user interface, launch the respective application, and then browse to the previewed content item to cause full playback of the previewed content item), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of accessing content on the electronic device.
In some embodiments, in accordance with a determination that viewing activity of a user with respect to the respective content is first viewing activity, the respective selectable option is selectable to display, in the first application, first respective content corresponding to the respective content (1370), such as in
In some embodiments, in accordance with a determination that the viewing activity of the user with respect to the respective content is second viewing activity, the respective selectable option is selectable to display, in the first application, second respective content corresponding to the respective content (1372), such as in
In some embodiments, only one episode of the respective television series is displayed to the user. For example, the content displayed in the content user interface corresponds to a television series (e.g., rather than a respective episode of the television series), and selection of the selectable option causes display of a respective episode of the television series based on the user's viewing history. In some embodiments, the selectable option indicates which episode of the television series will be displayed in response to the user's selection (e.g., “Play S3 E3”). In some embodiments, the set of content items that are available to be viewed in the content display user interface comprise a movie, television series, miniseries, etc. In some embodiments, any or all of these content items are included in the set of content items based on the user's viewing history or based on the user adding these content items into a queue (e.g., “Up Next” queue).
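The viewing-activity-based selection described above can be sketched as choosing the earliest unwatched episode; the Episode type and episodeToPlay function below are hypothetical.

```swift
// Sketch: pick the first unwatched episode of a series (falling back to the
// first episode), matching a "Play S3 E3"-style option. Names are hypothetical.
struct Episode {
    let season: Int
    let number: Int
    let watched: Bool
}

func episodeToPlay(from orderedEpisodes: [Episode]) -> Episode? {
    orderedEpisodes.first(where: { !$0.watched }) ?? orderedEpisodes.first
}

let series = [
    Episode(season: 3, number: 2, watched: true),
    Episode(season: 3, number: 3, watched: false)
]
if let next = episodeToPlay(from: series) {
    print("Play S\(next.season) E\(next.number)")   // "Play S3 E3"
}
```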
The above-described manner of presenting content based on the user's viewing history (e.g., by causing playback of a particular content item that is based on the user's viewing history) allows the electronic device to customize the content that is displayed in response to the user's selection (e.g., by displaying different episodes of an episodic series based on whether the user has already watched certain episodes of the episodic series), which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by providing the user with an easy way to cause playback of the next episode of an episodic series that the user has not yet watched), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of accessing episodes of an episodic series on the electronic device.
In some embodiments, while displaying the content corresponding to the first application icon, the electronic device receives (1374), via the one or more input devices, an indication of a second directional input in a second respective direction, different than the respective direction, such as in
In some embodiments, in response to receiving the indication of the second directional input in the second respective direction (1376), such as in
In some embodiments, if the information about the respective content item and the one or more selectable options are currently being displayed, then preserve the display of the information and the selectable options but update the information and the selectable options to correspond to the content that is now being displayed. For example, the information now describes the new content item and the one or more selectable options now correspond to performing actions with respect to the new content item. In some embodiments, more or fewer selectable options are displayed based on the actions that are available with the new content item.
In some embodiments, in accordance with a determination that the information about the content corresponding to the first application icon and the one or more selectable options to perform one or more actions with respect to the content corresponding to the first application icon were not displayed overlaid on the content corresponding to the first application icon when the indication of the second directional input was received (1384), such as in
The above-described manner of browsing through previews of different content items (e.g., by navigating to a different content item in response to a user request to navigate to a different content item and preserving the display of information and selectable options if information and selectable options were displayed for the previous content item when the request to navigate to a different content item was received, but by continuing to not display information or selectable options if information and selectable options were not displayed for the previous content item when the request to navigate to a different content item was received) allows the electronic device to provide a consistent user interface for the user based on the user's previous requests (e.g., if the user has previously requested display of information and selectable options, then preserve the display of the information and selectable options, but if the user has not yet requested display of information and selectable options, or has dismissed display of information and selectable options, then do not display information or selectable options until the user performs an explicit request for them), which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., without requiring the user to perform additional inputs to display information and selectable options or to dismiss information or selectable options when the user has already shown a preference for whether to display information and selectable options), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
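The state-preservation behavior described above can be sketched as carrying the overlay's visibility across navigation unchanged; the names below are hypothetical.

```swift
// Sketch: navigating to another previewed item keeps the information and
// selectable options shown only if they were already shown, updating their
// contents for the new item. Names are hypothetical.
struct EnhancedPreviewState {
    var currentItem: String
    var overlayVisible: Bool   // whether information and selectable options are overlaid

    mutating func navigate(to newItem: String) {
        currentItem = newItem
        // overlayVisible is intentionally unchanged: shown stays shown
        // (refreshed for the new item), hidden stays hidden.
    }
}

var state = EnhancedPreviewState(currentItem: "Item A", overlayVisible: true)
state.navigate(to: "Item B")
print(state.currentItem, state.overlayVisible)   // Item B true
```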
In some embodiments, the first application icon is an application icon for a unified media browsing application (1390), such as in
In some embodiments, the content corresponding to the unified media browsing application is displayed with a first selectable option and a second selectable option overlaid on the content corresponding to the unified media browsing application (1392), such as in
In some embodiments, the first selectable option is selectable to (1394): in accordance with a determination that a user of the electronic device has entitlement to view respective content in the unified media browsing application that corresponds to the content corresponding to the unified media browsing application, display, in the unified media browsing application, the respective content (1396), such as in
In some embodiments, in accordance with a determination that the user of the electronic device does not have entitlement to view the respective content in the unified media browsing application, display, via the display device, a user interface for obtaining entitlement to view the respective content in the unified media browsing application (1398), such as in
In some embodiments, the second selectable option is selectable to display, in the unified media browsing application, a user interface dedicated to the respective content (1398-2), such as in
The above-described manner of dynamically presenting selectable options based on the user's entitlements (e.g., causing playback of the previewed content item if the user is entitled to view the content and by displaying a user interface for obtaining entitlement if the user is not yet entitled to view the content) allows the electronic device to perform the best course of action for accessing the previewed content item in response to the user expressing a desire to access the previewed content item, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., without requiring the user to separately determine whether the user is entitled to view the content item and navigate to a separate user interface to acquire entitlement to the content item before accessing the content item), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of accessing content on the electronic device.
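One way to express the entitlement-dependent behavior of the two options described above is as a small resolver; the names below are hypothetical and the sketch is not an implementation of the unified media browsing application.

```swift
// Sketch: the first option plays the content if the user is entitled to it and
// otherwise leads to an entitlement flow; the second option opens the content's
// dedicated user interface. Names are hypothetical.
enum PreviewDestination {
    case playContent            // entitled: display the respective content
    case entitlementFlow        // not entitled: user interface for obtaining entitlement
    case dedicatedContentPage   // the content's dedicated user interface
}

func firstOptionDestination(userIsEntitled: Bool) -> PreviewDestination {
    userIsEntitled ? .playContent : .entitlementFlow
}

func secondOptionDestination() -> PreviewDestination {
    .dedicatedContentPage
}

print(firstOptionDestination(userIsEntitled: false))   // entitlementFlow
```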
In some embodiments, the first application icon is an application icon for a respective application that provides content based on a subscription to a subscription service (1398-4), such as in
In some embodiments, the content corresponding to the respective application is displayed with a first selectable option overlaid on the content corresponding to the respective application (1398-6), such as in
In some embodiments, the first selectable option is selectable to (1398-8): in accordance with a determination that a user of the electronic device has a subscription to the subscription service, display, in the respective application, respective content from the respective application that corresponds to the content corresponding to the respective application (1398-10), such as in
In some embodiments, if the user has a subscription to the subscription service, but the respective application is not yet downloaded and/or installed on the electronic device, then selection of the first selectable option initiates a process for downloading and/or installing (or otherwise acquiring) the respective application. In some embodiments, if the user has not previously progressed in the respective application (e.g., has not played the game), then the first selectable option is selectable to begin the application from the beginning (e.g., start at the beginning of the game). In some embodiments, if the user has partially progressed in the respective application (e.g., has partially played through the game), then the first selectable option is selectable to continue at the current progress position of the respective application (e.g., continue playing the game at the previous playthrough position). In some embodiments, the selectable option indicates the action that is performed when the selectable option is selected (e.g., “Get”, “Play”, “Continue Play”, etc).
In some embodiments, in accordance with a determination that the user of the electronic device does not have a subscription to the subscription service, display, via the display device, a user interface from which the subscription to the subscription service can be obtained (1398-12), such as in
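The branching described above, on subscription status, install state, and playback progress, can be expressed compactly. The following is a minimal Swift sketch under those assumptions; the names (SubscriptionAppState, SubscriptionAction, firstOptionAction) are hypothetical and not part of this disclosure.

```swift
struct SubscriptionAppState {
    var isSubscribed: Bool
    var isInstalled: Bool
    var progress: Double?   // nil if the user has not started; otherwise 0...1
}

enum SubscriptionAction: Equatable {
    case getApp          // "Get": download and/or install the respective application
    case playFromStart   // "Play": begin from the beginning
    case continuePlay    // "Continue Play": resume at the current progress position
    case subscribe       // display the user interface for obtaining a subscription
}

func firstOptionAction(for state: SubscriptionAppState) -> SubscriptionAction {
    guard state.isSubscribed else { return .subscribe }
    guard state.isInstalled else { return .getApp }
    if let progress = state.progress, progress > 0 {
        return .continuePlay
    }
    return .playFromStart
}

// Example: a subscriber who has partially played through the content resumes it.
// firstOptionAction(for: SubscriptionAppState(isSubscribed: true, isInstalled: true, progress: 0.4))
// evaluates to .continuePlay
```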
The above-described manner of accessing subscription content (e.g., by causing display of the subscription application if the user has a subscription to the subscription service and by displaying a user interface for subscribing to the subscription service if the user does not have a subscription to the subscription service) allows the electronic device to perform the best course of action for accessing the previewed content item in response to the user expressing a desire to access the previewed content item, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., without requiring the user to separately determine whether the user has a subscription to the subscription service and navigate to a separate user interface to acquire a subscription to the subscription service before accessing the content item), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of accessing subscription content on the electronic device.
In some embodiments, the first application icon is an application icon for a photo and video browsing application (1398-14), such as in
In some embodiments, the content corresponding to the photo and video browsing application is displayed with a first selectable option and a second selectable option overlaid on the content corresponding to the photo and video browsing application (1398-16), such as in
In some embodiments, the content corresponding to the photo and video browsing application includes a subset of photos or videos of a given collection of photos or videos in the photo and video browsing application (1398-18), such as in
In some embodiments, the first selectable option is selectable to playback, in the photo and video browsing application, an arrangement of photos or videos from the given collection of photos or videos (1398-20), such as in
In some embodiments, the second selectable option is selectable to display, in the photo and video browsing application, a user interface for manually browsing photos or videos from the given collection of photos or videos (1398-22), such as in
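As a brief illustration of the two overlaid options described above for a previewed collection of photos or videos, the following minimal Swift sketch maps each option to its behavior; the types and names (PhotoCollection, PhotosOption, PhotosAction) are hypothetical and not part of this disclosure.

```swift
struct PhotoCollection {
    var title: String
    var assetIDs: [String]
}

enum PhotosOption {
    case playArrangement   // first option: play an arrangement of the collection
    case browseManually    // second option: user interface for manual browsing
}

enum PhotosAction {
    case playArrangement(of: PhotoCollection)
    case browse(PhotoCollection)
}

func action(for option: PhotosOption, collection: PhotoCollection) -> PhotosAction {
    switch option {
    case .playArrangement: return .playArrangement(of: collection)
    case .browseManually:  return .browse(collection)
    }
}
```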
The above-described manner of accessing photo and video content (e.g., by displaying a preview of a collection of photos and/or videos and causing display of the respective collection in the photo and video browsing application in response to a request to view the collection of photos and/or videos or causing display of a user interface for browsing photos and/or videos in response to a request to browse for photos and/or videos) allows the electronic device to provide the user with options for how to view the previewed content, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by providing multiple viewing options for the respective collection to the user without requiring the user to navigate to the photo and video application to access the same options), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of viewing a collection of photos and/or videos on the electronic device.
In some embodiments, the first application icon is an application icon for a podcast application (1398-24), such as in
In some embodiments, the content corresponding to the podcast application is displayed with a first selectable option and a second selectable option overlaid on the content corresponding to the podcast application (1398-26), such as in
In some embodiments, the content corresponding to the podcast application includes content corresponding to a given podcast in the podcast application (1398-28), such as in
In some embodiments, the first selectable option is selectable to play, in the podcast application, the given podcast (1398-30), such as in
The above-described manner of accessing podcasts (e.g., by displaying a preview of featured podcasts and causing playback of the previewed podcast in response to a user request to play back the previewed podcast or causing display of a user interface for viewing more information about the previewed podcast in response to a user request to view information about the previewed podcast) allows the electronic device to provide the user with multiple options for interacting with the previewed podcast (e.g., to cause playback if the user is interested in the podcast, or to display more information if the user wants to view more information before deciding whether to play the podcast), which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., without requiring the user to separately navigate to the podcast application and then browse for the previewed podcast to determine whether the user is interested in the podcast and to initiate playback of the podcast), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of accessing podcasts on the electronic device.
In some embodiments, the first application icon is an application icon for a music application (1398-34), such as in
In some embodiments, the content corresponding to the music application is displayed with a first selectable option and a second selectable option overlaid on the content corresponding to the music application (1398-36), such as in
In some embodiments, the content corresponding to the music application includes content from a given playlist in the music application (1398-38), such as in
In some embodiments, the first selectable option is selectable to play, in the music application, the given playlist (1398-40), such as in
In some embodiments, the second selectable option is selectable to display, in the music application, additional content from the given playlist (1398-42), such as in
The above-described manner of displaying music content (e.g., by causing playback of music videos from a featured playlist and causing playback of the playlist in response to the user request to view the playlist or by displaying a user interface for browsing through the featured playlist in response to the user request to view the items in the playlist) allows the electronic device to provide the user with multiple options for interacting with the previewed playlist, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., without requiring the user to separately navigate to the music application and then browse for the previewed playlist to determine whether the user is interested in the music videos in the playlist and then to initiate playback of the music videos in the playlist), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of accessing music content on the electronic device.
In some embodiments, the first region of the home user interface includes a third application icon (1398-44), such as in
In some embodiments, in response to receiving the indication of the second directional input in the respective direction (1398-48), such as in
In some embodiments, in accordance with a determination that the third application icon is not compatible with the display of content corresponding to the third application icon in response to a directional input in the respective direction, such as in
In some embodiments, if a respective application does not support the functionalities of the prioritized row of icons, then, instead of displaying content in the content preview region (from which an upward swipe navigation causes display of a content display user interface), the content preview region displays one or more icons of content that is available from the respective application, which are selectable to cause playback of the respective content.
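The fallback behavior described above, in which the content preview region either previews content or shows selectable content icons, can be sketched as follows in Swift; the types and names (AppIcon, PreviewRegionContent) are hypothetical and not part of this disclosure.

```swift
struct AppIcon {
    var bundleID: String
    var supportsContentPreview: Bool
    var availableContentIDs: [String]
}

enum PreviewRegionContent {
    /// Full preview: content is displayed, and an upward swipe opens the
    /// content display user interface.
    case contentPreview(appBundleID: String)
    /// Fallback: a row of content icons, each selectable to play that content.
    case contentIcons(ids: [String])
}

func previewRegionContent(for focusedIcon: AppIcon) -> PreviewRegionContent {
    focusedIcon.supportsContentPreview
        ? .contentPreview(appBundleID: focusedIcon.bundleID)
        : .contentIcons(ids: focusedIcon.availableContentIDs)
}
```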
The above-described manner of previewing content available in an application that does not support the prioritized region functionalities (e.g., by displaying, in the content preview region, representations of content available from the respective application, which are selectable to cause display of the respective content in the respective application) allows the electronic device to provide the user with the ability to move a preferred application to the prioritized region and still be able to quickly access content from the preferred application, even if the application does not support the full functionalities of the prioritized region, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by providing the user a mechanism to move a preferred application to the prioritized region while still providing some ability to quickly access certain content from the preferred application, without requiring the user to always navigate around the home user interface to find the preferred application and navigate into the preferred application to quickly find available content), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of accessing content from a particular user-preferred application on the electronic device.
In some embodiments, the first region of the home user interface includes a predetermined number (e.g., one, two, three) of most-recently accessed application icons and one or more application icons that are displayed in the first region of the home user interface independent of activity accessing the one or more application icons (1398-62), such as in
In some embodiments, if a recently accessed application already has a corresponding icon in the prioritized row of icons, then do not include a second icon of the application in the section for recently displayed applications. In some embodiments, the section for one or more applications is visually separated from the section for other icons (e.g., by a line or other visual divider or boundary). In some embodiments, if the icons of the recently displayed applications in the prioritized row of icons are compatible with the functionalities of the prioritized row of icons, then focus on the respective icon will cause display of content in the content preview region (e.g., and optionally the display of the content display user interface in response to an upward swipe input). In some embodiments, if the icons of the recently displayed applications in the prioritized row of icons are not compatible with the functionalities of the prioritized row of icons, then focus on the respective icon will not cause display of content in the content preview region and optionally causes display of one or more icons of content available from the respective application.
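One way to assemble the prioritized row described above, with a fixed number of recently accessed applications and no duplicate icons, is sketched below in Swift; the function name and the use of bundle-identifier strings are illustrative assumptions rather than part of this disclosure.

```swift
func prioritizedRow(pinned: [String],
                    recentlyAccessed: [String],
                    recentSlots: Int) -> [String] {
    let pinnedSet = Set(pinned)
    let recents = recentlyAccessed
        .filter { !pinnedSet.contains($0) }   // a recent app already pinned gets no second icon
        .prefix(recentSlots)                  // e.g., one, two, or three recent slots
    return pinned + recents
}

// Example: "tv" is both pinned and recently used, so it appears only once.
// prioritizedRow(pinned: ["tv", "music"],
//                recentlyAccessed: ["tv", "podcasts", "photos"],
//                recentSlots: 2)
// returns ["tv", "music", "podcasts", "photos"]
```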
The above-described manner of displaying recently accessed applications (e.g., by displaying a number of recently accessed applications in the prioritized region, which are selectable to cause display of the respective application) allows the electronic device to provide the user with a shortcut to access applications that the user has shown an interest in accessing (e.g., by recently accessing the respective applications), which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by providing a shortcut to recently accessed applications without requiring the user to separately navigate the home user interface to find and launch the recently accessed applications), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of accessing recently accessed applications on the electronic device.
In some embodiments, while a second respective application icon in the home user interface has a current focus, the electronic device receives (1398-64), via the one or more input devices, an indication of a second directional input in the respective direction, such as in
In some embodiments, in response to receiving the indication of the second directional input in the respective direction (1398-66), such as in
In some embodiments, in response to receiving the indication of the second directional input in the respective direction (1398-66): in accordance with a determination that the second respective application icon is compatible with display of content corresponding to the second respective application icon in response to a directional input in the respective direction (1398-68) (e.g., the second application is compatible with the functionalities of the prioritized row of icons): in accordance with a determination that the second respective application icon was in the second region of the home user interface when the indication of the second directional input was received, the electronic device forgoes ceasing display of the home user interface and forgoes displaying, via the display device, the content corresponding to the second respective application icon (1398-72), such as in
In some embodiments, in accordance with a determination that the second respective application icon is not compatible with display of content corresponding to the second respective application icon in response to a directional input in the respective direction (1398-74), such as in
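One plausible reading of the branches described above, in which only a compatible application icon in the first (prioritized) region expands into the content display user interface in response to the directional input while the home user interface is otherwise retained, is sketched below in Swift; the names (HomeRegion, SwipeResponse) are hypothetical and not part of this disclosure.

```swift
enum HomeRegion { case prioritized, other }

enum SwipeResponse {
    case showContentDisplayUI   // cease displaying the home user interface and show the content
    case stayOnHomeScreen       // forgo ceasing display of the home user interface
}

func response(toUpwardSwipeOn iconRegion: HomeRegion,
              supportsContentPreview: Bool) -> SwipeResponse {
    // Only a compatible icon in the prioritized (first) region expands into
    // the content display user interface; otherwise the home screen is kept.
    (iconRegion == .prioritized && supportsContentPreview)
        ? .showContentDisplayUI
        : .stayOnHomeScreen
}
```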
The above-described manner of interacting with applications on the electronic device (e.g., by displaying content in the content preview region if the respective application with focus is in the prioritized region and supports the functionalities of the prioritized region or by not displaying content in the content preview region if the respective application is not in the prioritized region (e.g., moving a focus) or if the respective application does not support the functionalities of the prioritized region (e.g., display icons of content items in the content preview region)) allows the electronic device to provide the user with the ability to move applications to different locations in the home user interface and adjust the functionality of the applications and the device based on the location and the functionalities supported by the applications, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., without fixing the positions of the respective applications in their respective regions, which potentially requires the user to perform excessive user inputs to navigate to an application that the user is potentially more interested in (e.g., applications not in the prioritized region) or to be presented with applications that the user is potentially not interested in (e.g., applications in the prioritized region)), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of accessing applications on the electronic device.
It should be understood that the particular order in which the operations in
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described with respect to
Users interact with electronic devices in many different manners, including using an electronic device to browse for and view items of content on the electronic device. In some embodiments, an electronic device is able to present a control center user interface including a plurality of options for controlling the operation of the electronic device. The embodiments described below provide ways in which an electronic device presents these options for controlling the operation of the electronic device in a control center user interface. Enhancing interactions with a device reduces the amount of time needed by a user to perform operations, and thus reduces the power usage of the device and increases battery life for battery-powered devices. It is understood that people use devices. When a person uses a device, that person is optionally referred to as a user of the device.
In
In
In some embodiments, control panel 1412 includes an indication 1414 of the current date and time (e.g., Monday April 4 at 8:30 PM). In some embodiments, control panel 1412 includes one or more selectable options for controlling the operation of device 500. For example, as shown in
In
In
It is understood that although the figures and description above describe the control of playback of a song, the above-described features apply similarly to the playback of video or multimedia content items being played by any application.
In
In
In
In some embodiments, user interface 1400-5 includes a text field in which the user enters text to be searched, a row of recent searches which the user is able to select to perform a search using the respective search string, and one or more rows of content items (e.g., such as a row of trending movies, a row of trending television shows, a row of popular content, etc.) from which the user can select to cause display of the respective content item.
In
As described below, the method 1500 provides ways to present a control center user interface. The method reduces the cognitive burden on a user when interacting with a user interface of the device of the disclosure, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, increasing the efficiency of the user's interaction with the user interface conserves power and increases the time between battery charges.
In some embodiments, such as in
In some embodiments, in response to receiving the input including the selection of the respective button on the remote control device (1504), such as in
In some embodiments, the control panel is displayed overlaid over the content or user interface that was displayed before the control panel was displayed. In some embodiments, the control panel is displayed along one side of the display (e.g., right side, left side, top side, bottom side, etc). In some embodiments, the control panel includes a selectable option for causing the electronic device to enter into a low power mode (e.g., sleep), a selectable option for controlling playback of media (e.g., music, videos, etc.) that is currently playing on the electronic device, a selectable option for controlling the audio and/or video output of the electronic device, selectable options to change the primary user profile of the electronic device, and/or a selectable option to display a search user interface on the electronic device. In some embodiments, the control panel displays the current date and time of the electronic device. In some embodiments, if the selection of the respective button does not satisfy the first criteria (e.g., the click or actuation is not longer than the time threshold), then the electronic device launches the unified media browsing application or performs another action corresponding to a short click or tap of the respective button (e.g., as opposed to a long-click or click-and-hold input).
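The press-duration criteria described above, in which a press-and-hold meeting the time threshold opens the control panel while a shorter press launches the unified media browsing application (or the button's other short-press action), can be illustrated with a minimal Swift sketch; the 0.5-second threshold and the names used here are assumptions for illustration only.

```swift
import Foundation

enum ButtonResponse {
    case showControlPanel
    case openUnifiedMediaBrowsingApp
}

/// `holdThreshold` is an assumed tunable value, not a constant defined in this disclosure.
func response(forPressDuration duration: TimeInterval,
              holdThreshold: TimeInterval = 0.5) -> ButtonResponse {
    // A click or actuation longer than the threshold satisfies the first criteria.
    duration >= holdThreshold ? .showControlPanel : .openUnifiedMediaBrowsingApp
}
```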
The above-described manner of displaying a control panel for controlling operation of the electronic device allows the electronic device to provide the user with a method to control the operation of the electronic device at any time, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by providing a mechanism for the user to display a control panel and control the operation of the electronic device without requiring the user to navigate to a separate user interface or interrupt the content being displayed by the electronic device to perform the same functions), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of controlling the electronic device.
In some embodiments, in response to receiving the input including the selection of the respective button on the remote control device (1508), such as in
The above-described manner of displaying either a control panel or a unified media browsing application allows the electronic device to provide the user with a method of using a single button on a remote control device to perform multiple functions (e.g., display the control center user interface or a unified media browsing application) based on the characteristic of the user input on the respective button, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by providing the user with a mechanism to display a control panel or launch the unified media browsing application without requiring the user to navigate through a menu or perform additional inputs to perform the same functions), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of interacting with the electronic device.
In some embodiments, the control center user interface includes one or more selectable options that are selectable to switch a user profile with which the electronic device is configured to respective user profiles associated with the respective selectable options (1512), such as in
The above-described manner of changing the active user profile of the device (e.g., by selecting a respective user profile on a control center user interface) allows the electronic device to provide the user with a shortcut method of selecting an active profile without requiring the user to navigate to a system settings user interface, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by displaying a control panel in response to the user input from which the user can change the user profile, without requiring the user to navigate through a settings menu system to change the active profile of the device), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of switching user profiles on the electronic device.
In some embodiments, the control center user interface includes a selectable option that is selectable to transition the electronic device to a standby state (1514), such as in
The above-described manner of transitioning the electronic device to a standby state (e.g., by providing a selectable option on the control center user interface that is selectable to place the electronic device in a standby state) allows the electronic device to provide the user with a quick shortcut method of placing the electronic device in a low power state, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., without requiring the user to navigate through a menu system to find a user interface for controlling the power states of the device and without requiring the remote control device to include a dedicated power button for controlling the power states of the device), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, the control center user interface includes (1516), such as in
The above-described manner of displaying information about content that is currently playing (e.g., by displaying, on the control center user interface, a representation of the content item that is currently playing at the electronic device) allows the electronic device to provide the user with a single interface from which the user can view information about the content item currently being played (e.g., without requiring the user to find the application that is playing the currently playing content and then navigate into the respective application to view information about the currently played content item), which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient, which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, while the application in which the content item is currently playing is not displayed (e.g., the application that is playing the currently playing content item is not currently displayed on the display (e.g., is running as a background process)) and while the representation of the content item that is currently playing has a current focus, the electronic device detects (1520), via a remote control device having a touch-sensitive surface, input including a contact having an intensity greater than an intensity threshold, such as in
In some embodiments, in response to detecting the input including the contact having the intensity greater than the intensity threshold (1522), such as in
For example, if the currently playing content item is a song that is being played by a music application, then display the music application (e.g., optionally the playback user interface of the music application). In some embodiments, if the currently playing content is a video (e.g., tv show, movie, etc.), then display the application that is playing the video (e.g., optionally the playback user interface of the application). In some embodiments, after displaying the application that is playing the currently playing content item, the control center user interface is dismissed (e.g., no longer displayed). In some embodiments, if the application that is playing the currently playing content item is already displayed on the display when the user performs the input, then merely dismiss the control panel. For example, if the user is in a music app and causes playback of a respective song, then causes display of the control panel (e.g., without navigating to another application or to another user interface), and selects the representation of the respective song, then only the control center user interface is dismissed, because the music application is already displayed.
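The two cases described above for selecting the now-playing representation (e.g., via the intensity-based input described above) can be sketched as follows in Swift; the types and identifiers are hypothetical and not part of this disclosure.

```swift
struct NowPlayingSelectionResult {
    var appToDisplay: String?     // nil when the playing application is already on screen
    var dismissControlPanel: Bool // the control center user interface is dismissed in both cases
}

func handleNowPlayingSelection(playingAppID: String,
                               displayedAppID: String?) -> NowPlayingSelectionResult {
    if displayedAppID == playingAppID {
        // The playing application (e.g., the music app) is already displayed:
        // only dismiss the control panel.
        return NowPlayingSelectionResult(appToDisplay: nil, dismissControlPanel: true)
    }
    // Otherwise bring the playing application (e.g., its playback user interface)
    // to the foreground and dismiss the control panel.
    return NowPlayingSelectionResult(appToDisplay: playingAppID, dismissControlPanel: true)
}
```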
The above-described manner of displaying the application that is currently playing content (e.g., by displaying the application that is currently playing content in response to the user selecting the representation of the currently playing content on the control center user interface) allows the electronic device to provide the user with a quick shortcut method of displaying the application that is currently playing content, without requiring the user to search for and navigate into the application that is currently playing content, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient, which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, while the representation of the content item that is currently playing has a current focus, the electronic device detects (1526), via a remote control device having a touch-sensitive surface, input including selection of a play/pause button on the remote control device, such as in
In some embodiments, in response to detecting the input including selection of the play/pause button on the remote control device, the electronic device pauses (1528) playback of the content item, such as in
The above-described manner of controlling playback of the currently playing content item (e.g., by playing or pausing the currently playing content item in response to a user input selecting the play/pause button while the representation of the currently playing content item on the control center user interface has a focus) allows the electronic device to provide the user with a quick shortcut method of controlling the playback of the content item without requiring the user to find and navigate into the application that is currently playing the content item to achieve the same playback control functions, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient, which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, the control center user interface includes a selectable option that is selectable to initiate a process to change an audio output destination for the electronic device (1530), such as in
The above-described manner of changing the audio output destination of the electronic device (e.g., by displaying a selectable option on the control center user interface that is selectable to display a user interface for changing the audio output destination of the electronic device) allows the electronic device to provide the user with a quick shortcut method of changing the audio output destination of the electronic device without requiring the user to navigate through a system settings menu system to find a setting for changing the audio output destination or find and navigate into the application that is currently playing content to change the audio output destination, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient, which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, the control center user interface includes a selectable option that is selectable to display, via the display device, a search user interface for searching content available on the electronic device (1532), such as in
The above-described manner of displaying a search user interface (e.g., by providing a selectable option on the control center user interface that is selectable to display the search user interface) allows the electronic device to provide the user with a quick shortcut method of displaying the search user interface without requiring the user to navigate to the home user interface and find the icon corresponding to the search feature, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient, which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, when the input including the selection of the respective button on the remote control device was received, a respective element in the user interface had a current focus (1536), such as in
In some embodiments, while the control center user interface is displayed and while the respective selectable option in the control center user interface has the current focus, the electronic device receives, via the one or more input devices, an input corresponding to a request to cease displaying the control center user interface (1540), such as in
In some embodiments, in response to receiving the input corresponding to the request to cease displaying the control center user interface (1542), such as in
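The focus handling described in these embodiments, in which the previously focused element regains focus when the control center user interface is dismissed, can be sketched as follows in Swift; the FocusCoordinator type and its string-based focus identifiers are illustrative assumptions and not part of this disclosure.

```swift
final class FocusCoordinator {
    private var savedFocusID: String?
    private(set) var currentFocusID: String?

    /// Normal focus movement in the underlying user interface.
    func focus(on elementID: String) {
        currentFocusID = elementID
    }

    /// Invoking the control center: remember the underlying element, then move
    /// focus to a selectable option in the control center user interface.
    func presentControlCenter(focusingOptionID: String) {
        savedFocusID = currentFocusID
        currentFocusID = focusingOptionID
    }

    /// Dismissing the control center: restore focus to the remembered element.
    func dismissControlCenter() {
        currentFocusID = savedFocusID
        savedFocusID = nil
    }
}

// Example: focus a content tile, open the panel, dismiss it; focus returns to
// the same tile without any additional navigation input from the user.
// let focus = FocusCoordinator()
// focus.focus(on: "tile-42")
// focus.presentControlCenter(focusingOptionID: "sleep-button")
// focus.dismissControlCenter()   // currentFocusID == "tile-42"
```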
The above-described manner of changing the item that has a focus (e.g., by moving the focus to a selectable option on the control center user interface when the control center user interface is displayed and moving the focus back to the item that had a focus before the control center user interface was displayed when the control center user interface is dismissed) allows the electronic device to provide the user with a method of displaying the control center user interface, performing the user's intended actions, then dismissing the control center user interface and resuming interaction with the user interface with very little interruption, without requiring the user to navigate the focus to the appropriate item when the control center user interface was displayed and dismissed, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient, which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
It should be understood that the particular order in which the operations in
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described with respect to
Users interact with electronic devices in many different manners, including using an electronic device to browse for and view items of content on the electronic device. In some embodiments, the electronic device maintains one or more of the user's preferences, settings, viewing history, etc., sometimes known as a user profile, to provide the user with a more customized experience. In some embodiments, the electronic device maintains multiple user profiles for different users to reflect each user's individual preferences, settings, viewing histories, etc. The embodiments described below provide ways in which an electronic device switches the active profile of the device from one user profile to another, thus enhancing users' interactions with the device. Enhancing interactions with a device reduces the amount of time needed by a user to perform operations, and thus reduces the power usage of the device and increases battery life for battery-powered devices. It is understood that people use devices. When a person uses a device, that person is optionally referred to as a user of the device.
In
In some embodiments, the unified media browsing application maintains data regarding the user's entitlement to content and data regarding the user's viewing preferences. In some embodiments, the unified media browsing application determines content that is available via multiple content providers and determines whether the user has entitlement to the content providers or to any content items directly. Thus, in some embodiments, the unified media browsing application is able to provide the user with multiple ways of viewing respective content items and adjust the recommendations provided to the user based on the user's entitlements. In some embodiments, the unified media browsing application stores the user's viewing history and viewing preferences to allow the unified media browsing application to recommend content items to the user that are most likely to be of interest to the user. For example, the unified media browsing application is able to recommend the next episode of a television show to the user or a movie that is similar to a previously watched movie.
Thus, as shown in
In
In some embodiments, as shown in
In
Thus, as shown in
In
In
Thus, in some embodiments, the podcast application is able to determine that User 1 is the current active user, that User 1 has subscriptions to one or more podcasts, and that User 1 has previously played one or more podcasts. In some embodiments, the podcast application is able to update user interface 1600-5 to reflect the user's subscriptions and playback history. It is understood that the user interface 1600-5 as shown illustrates that the podcast application is able to determine the user's subscriptions and playback history and optionally reflects the determined subscriptions and playback history on the user interface, and should not be interpreted as limiting.
In
In
In
It is understood that the user interfaces 1600-6 and 1600-7 as shown illustrate that the music application is able to determine the user's subscription status, entitlements, and playback history and optionally reflect the determined subscription status, entitlements, and playback history on the user interface, and should not be interpreted as limiting.
In
For example, in
In
For example, in
In
In
In
In
In
In
In
In
In
In
In
In
As shown above, for example, in
Thus, as described above, some applications on device 500 support the profile switching functionalities (e.g., ability to determine the active profile and maintain and display separate sets of entitlements, recommendations, viewing history, etc.) and some applications on device 500 do not support the profile switching functionalities. It is understood that although certain applications are described above as having or not having the profile switching functionalities, this is illustrative of certain embodiments of the disclosure and should not be considered limiting. In some embodiments, any of the above-discussed applications optionally has, or does not have, the profile switching functionalities, or has a subset of the profile switching functionalities described above.
As described below, the method 1700 provides ways to switch the active user profile of the electronic device 500. The method reduces the cognitive burden on a user when interacting with a user interface of the device of the disclosure, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, increasing the efficiency of the user's interaction with the user interface conserves power and increases the time between battery charges.
In some embodiments, such as in
In some embodiments, the settings and/or content of the electronic device are associated with one or more user accounts and/or user profiles. In some embodiments, one of the one or more user profiles is active at any one time on the electronic device. In some embodiments, the active profile determines the settings and/or available content on the electronic device. Thus, in some embodiments, if a first user profile is active, the various content applications on the electronic device are configured to provide content that the first user profile is entitled to access on the electronic device (but not content that the second user profile is entitled to access on the electronic device), and if a second user profile is active, the various content applications on the electronic device are configured to provide content that the second user profile is entitled to access on the electronic device (but not content that the first user profile is entitled to access on the electronic device). In some embodiments, the settings and/or content defined by the user profile include associations with cloud accounts, history of purchased content, viewing history, etc.
In some embodiments, the request comprises selecting the second user profile from the control center user interface as described above with reference to method 1500. In some embodiments, the request comprises selecting the second user profile from a settings application. In some embodiments, the request is received from another electronic device that is remotely controlling the electronic device.
In some embodiments, in response to receiving the input corresponding to the request to configure the electronic device with the second user profile of the second user, the electronic device configures (1704) the electronic device with the second user profile of the second user, which allows the first content application to provide a third set of content, different than the first set of content, on the electronic device and the second content application to provide a fourth set of content, different than the second set of content, on the electronic device, such as in
In some embodiments, setting the second user profile as the active profile causes one or more of the applications on the electronic device to change from being associated with the first user profile to being associated with the second user profile. For example, the first content application logs out of the account associated with the first user profile and logs into the account associated with the second user profile. In some embodiments, the account associated with the second user profile has different content entitlements such that logging into the account associated with the second user profile gives the electronic device access to a different set of content. In some embodiments, not all applications have a separate and/or dedicated user account and optionally, instead, rely on and/or have access to the active user profile of the electronic device. In some embodiments, setting up a profile on the electronic device provides these applications with access to the profile (e.g., the applications use the user profile instead of a dedicated user account to uniquely identify users). In some embodiments, the data from these applications are able to be saved to and associated with the active user profile (e.g., settings, viewing history, etc). In such examples, when the active profile is changed from the first user profile to the second user profile, these applications are updated to refer to the second user profile and the data that these applications access that are associated with the first user profile (e.g., settings, viewing history, etc.) are switched to the data that is associated with the second user. In some embodiments, the data associated with the first profile is removed and the data associated with the second profile is loaded (e.g., the data is saved on a server, the cloud, or a local repository), or the data is not removed and the application is updated to access a different set of data for the new user profile (e.g., the system stores one or more sets of data corresponding to the one or more user profiles). In some embodiments, not all applications and content are associated with a user profile or are capable of being switched (e.g., agnostic to user accounts or user profiles). In such examples, the applications and content that are not associated with a user profile or are not capable of being switched are not changed or updated to reflect the change in the active profile.
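The propagation of a profile switch described above, in which profile-aware applications swap to the data associated with the new active profile while profile-agnostic applications are left unchanged, can be sketched as follows in Swift; all types and names here are hypothetical illustrations and not part of this disclosure.

```swift
struct PerProfileData {
    var settings: [String: String]
    var viewingHistory: [String]
    var entitledContentIDs: Set<String>
}

struct App {
    let name: String
    let isProfileAware: Bool
    var activeData: PerProfileData?
}

/// `dataStore` maps an application name and profile ID to that profile's data,
/// which may live on a server, in the cloud, or in a local repository.
func switchActiveProfile(to profileID: String,
                         apps: [App],
                         dataStore: [String: [String: PerProfileData]]) -> [App] {
    apps.map { (app: App) -> App in
        guard app.isProfileAware else { return app }   // agnostic apps are not updated
        var updated = app
        updated.activeData = dataStore[app.name]?[profileID]   // load the new profile's data set
        return updated
    }
}
```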
The above-described manner of changing user profiles allows the electronic device to provide the user with the ability to quickly update the settings and change the available content to another set of settings and content, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by providing a mechanism for the user to switch from a first user profile to a second user profile and automatically update applications to reflect the changed user profile without requiring the user to individually navigate to each application to log out of the account associated with the first user profile and log into the account associated with the second user profile or navigate to each setting to manually change each setting appropriately), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of switching user profiles.
In some embodiments, the user profiles that are available with which to configure the electronic device are user profiles that are part of a family account that includes the first user profile and the second user profile (1706), such as in
The above-described manner of changing user profiles (e.g., by selecting from the user profiles that are part of a family account) allows the electronic device to provide the user with the ability to select from user profiles of users that are likely to use the electronic device (e.g., the members of the family of the user), which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by automatically displaying the user profiles of members of a family account without requiring the user to manually add each member of the user's family to the list of profiles that can be switched to), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of switching user profiles.
In some embodiments, the user profiles that are available with which to configure the electronic device are user profiles added to a smart home application available to the electronic device (1708), such as in
The above-described manner of changing user profiles (e.g., by selecting from the user profiles that are included in a smart home application) allows the electronic device to provide the user with the ability to select from user profiles of users that are likely to use the electronic device (e.g., the people who live in the same residence as the user), which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by automatically displaying the user profiles of users who most likely live with the user without requiring the user to manually add each resident to the list of profiles that can be switched to), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of switching user profiles.
In some embodiments, while the electronic device is configured with the first user profile of the first user, such as in
In some embodiments, while the electronic device is configured with the second user profile of the second user, such as in
The above-described manner of changing user profiles (e.g., by selecting from the user profiles that are part of a family account) allows the electronic device to provide the user with the ability to select from user profiles of users that are likely to use the electronic device (e.g., the members of the family of the user), which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by automatically displaying the user profiles of members of a family account without requiring the user to manually add each member of the user's family to the list of profiles that can be switched to), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of switching user profiles.
In some embodiments, the third application is a photos application, and the fifth set of content is photos content associated with the first user profile of the first user (1722), such as in
The above-described manner of changing user profiles (e.g., by maintaining the content available via the photos application) allows the electronic device to provide the second user with the ability to view the first user's photos and/or videos that are available via the photos application, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., without requiring the user to manually log out of the photos application and log into the photos application as the first user in order to view the first user's content while the second user profile is the active profile), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of viewing photos and/or videos.
In some embodiments, while the electronic device is configured with the first user profile of the first user (1724), such as in
In some embodiments, while the electronic device is configured with the second user profile of the second user (1728), such as in
The above-described manner of changing user profiles (e.g., by changing the viewing history of a respective content from the viewing history associated with the first user profile to the viewing history associated with the second user profile) allows the electronic device to provide recommendations to the user that are most relevant to the active user profile (e.g., by setting the active viewing history as the viewing history of the active user profile such that a respective application that provides recommendations based on viewing history is able to provide the correct recommendations for the active user profile), which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by automatically updating the active viewing history of the device without requiring the user to clear the viewing history on each application and import the viewing history associated with the new active profile to achieve the same functionality), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of switching user profiles.
In some embodiments, while the electronic device is configured with the first user profile of the first user (1732), such as in
In some embodiments, while the electronic device is configured with the second user profile of the second user (1736), such as in
The above-described manner of changing user profiles (e.g., by changing the content recommendations of a respective content from the recommendations for the first user profile to the recommendations for the second user profile) allows the electronic device to provide recommendations to the user that are most relevant to the active user profile (e.g., by changing the recommendations provided by the application to the recommendations that are associated with the active user profile), which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by automatically updating the content that is recommended by respective applications based on the active user history without requiring the user to clear the recommendations on each application and import new viewing history and/or recommendations to achieve the same functionality), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of switching user profiles.
In some embodiments, while the electronic device is configured with the first user profile of the first user (1740), such as in
In some embodiments, when the active profile is the first user profile, the unified media browsing application is able to determine the first user profile's entitlements and appropriately identify what content the user is entitled to (selection of which initiates a process for displaying the content) and what content the user is not entitled to (selection of which does not initiate a process for displaying the content). In some embodiments, when the active profile is the first user profile, the active viewing activity information is the viewing activity information of the first user profile.
In some embodiments, while the electronic device is configured with the second user profile of the second user (1748), such as in
Thus, in some embodiments, switching the active profile from the first user profile to the second user profile causes the unified media browsing application to reflect any changes in entitlements between the first user profile and the second user profile. Thus, in some embodiments, switching the active profile from the first user profile to the second user profile causes the unified media browsing application to reflect the different consumption histories of the user profiles.
The above-described manner of changing user profiles (e.g., by switching the active viewing activity information and entitlements of the unified media browsing application from the first user profile to the second user profile) allows the electronic device to provide the second user with an experience that is customized for the second user, without artifacts from the first user's history, settings, and/or entitlements, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by automatically updating the entitlements and viewing history in the unified media browsing application without requiring the user to navigate to the unified media browsing application and log out of the first user's user profile and log into the second user's user profile to achieve the same functionality), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of switching user profiles.
In some embodiments, while the electronic device is configured with the first user profile of the first user, an application that provides access to content based on a subscription to a subscription service provides content based on a subscription status of the first user with the subscription service (1754), such as in
In some embodiments, while the electronic device is configured with the second user profile of the second user, the application that provides access to content based on a subscription to the subscription service provides content based on a subscription status of the second user with the subscription service (1756), such as in
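As a purely illustrative sketch of the subscription-based behavior described above, the snippet below gates a subscription service's catalog on the subscription status recorded for the active profile; the names (SubscriptionApp, statusByProfileID, and so on) are hypothetical assumptions, not an implementation of any particular embodiment.

```swift
// Illustrative only: a subscription application provides its catalog based on
// the subscription status recorded for the active profile. Names are hypothetical.
enum SubscriptionStatus {
    case subscribed
    case notSubscribed
}

struct SubscriptionApp {
    var statusByProfileID: [String: SubscriptionStatus]
    var activeProfileID: String

    // The service's content is provided only if the active profile subscribes.
    func availableCatalog(fullCatalog: [String]) -> [String] {
        statusByProfileID[activeProfileID] == .subscribed ? fullCatalog : []
    }
}

let app = SubscriptionApp(
    statusByProfileID: ["first-user": .subscribed, "second-user": .notSubscribed],
    activeProfileID: "first-user"
)
print(app.availableCatalog(fullCatalog: ["Movie A", "Show B"]))  // full catalog for a subscriber
```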
The above-described manner of changing user profiles (e.g., by changing the access to a set of content based on the subscription status of the second user profile instead of the first user profile) allows the electronic device to provide the proper content access entitlements based on the subscription status of the second user profile, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by automatically updating the content entitlement of the application based on the subscription status of the active user without requiring the user to manually log out of the application and log into the application with the second user profile), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of switching user profiles.
In some embodiments, while the electronic device is configured with the first user profile of the first user (1758), such as in
In some embodiments, when the active profile is the first user profile, the music application is able to determine the first user profile's entitlements (e.g., items that the user has purchased access to, or items that the user has access to as a result of a subscription to a music subscription service) and appropriately identify what content the user is entitled to (selection of which initiates a process for playing the content) and what content the user is not entitled to (selection of which does not initiate a process for playing the content). In some embodiments, when the active profile is the first user profile, the active content consumption activity is the playback activity of the first user profile.
In some embodiments, while the electronic device is configured with the second user profile of the second user (1764), such as in
Thus, in some embodiments, switching the active profile from the first user profile to the second user profile causes the music application to reflect any changes in entitlements between the first user profile and the second user profile. Thus, in some embodiments, switching the active profile from the first user profile to the second user profile causes the music application to reflect the different consumption histories of the user profiles.
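A minimal, hypothetical sketch of how a content application such as a music application might swap its entitlements and consumption history when the device's active profile changes, without a manual log-out/log-in; the notification mechanism and all names (MusicApp, ProfileState, activeProfileDidChange) are assumptions for illustration only.

```swift
// Illustrative only: a content application swaps its per-profile entitlements
// and consumption history when notified that the active profile changed, so no
// manual log-out/log-in is needed. All names are hypothetical.
struct ProfileState {
    var entitledItemIDs: Set<String>   // purchases and/or subscription entitlements
    var playbackHistory: [String]      // consumption history for this profile
}

final class MusicApp {
    private var stateByProfileID: [String: ProfileState] = [:]
    private(set) var current = ProfileState(entitledItemIDs: [], playbackHistory: [])

    func register(profileID: String, state: ProfileState) {
        stateByProfileID[profileID] = state
    }

    // Conceptually invoked when the device's active profile changes.
    func activeProfileDidChange(to profileID: String) {
        current = stateByProfileID[profileID]
            ?? ProfileState(entitledItemIDs: [], playbackHistory: [])
    }
}
```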
The above-described manner of changing user profiles (e.g., by changing the access to a set of music based on the entitlements of the second user profile instead of the first user profile, and by changing the consumption history from the consumption history of the first user to the consumption history of the second user) allows the electronic device to provide the proper content access entitlements and viewing history based on the entitlements of the second user profile, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by automatically updating the content entitlement and viewing history of the application such that the user does not improperly attempt to access content to which the user is not entitled and without requiring the user to navigate to the music application to manually log out of the first user profile and log into the second user profile), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of switching user profiles.
In some embodiments, while the electronic device is configured with the first user profile of the first user (1770), such as in
In some embodiments, when the active profile is the first user profile, the podcast application is able to determine the first user profile's entitlements (e.g., the podcasts to which the user has subscribed) and appropriately identify what content the user is entitled to (selection of which initiates a process for playing the content) and what content the user is not entitled to (selection of which does not initiate a process for playing the content). In some embodiments, when the active profile is the first user profile, the active content consumption activity is the playback activity of the first user profile.
In some embodiments, while the electronic device is configured with the second user profile of the second user (1776), such as in
Thus, in some embodiments, switching the active profile from the first user profile to the second user profile causes the podcast application to reflect any changes in entitlements between the first user profile and the second user profile. Thus, in some embodiments, switching the active profile from the first user profile to the second user profile causes the podcast application to reflect the different consumption histories of the user profiles.
The above-described manner of changing user profiles (e.g., by changing the access to a set of podcasts based on the entitlements of the second user profile instead of the first user profile, and by changing the consumption history from the consumption history of the first user to the consumption history of the second user) allows the electronic device to provide the proper content access entitlements and viewing history based on the entitlements of the second user profile, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by automatically updating the content entitlement and viewing history of the application such that the user does not improperly attempt to access content to which the user is not entitled and without requiring the user to navigate to the podcast application to manually log out of the first user profile and log into the second user profile), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of switching user profiles.
In some embodiments, when the input corresponding to the request to configure the electronic device with the second user profile of the second user was received, a first set of applications, including the first content application and the second content application, were installed on the electronic device (1782), such as in
In some embodiments, configuring the electronic device with the second user profile of the second user includes maintaining the first set of applications installed on the electronic device and not installing additional applications on the electronic device (1784), such as in
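The following sketch, using hypothetical names, illustrates the point above that switching the active profile changes only per-profile state while the device-level set of installed applications is maintained; it is not a description of any particular implementation.

```swift
// Illustrative only: switching the active profile changes per-profile state but
// leaves the device-level set of installed applications untouched. Names are
// hypothetical.
struct Device {
    var installedAppIDs: Set<String>   // device-level; shared by all profiles
    var activeProfileID: String

    mutating func switchProfile(to newProfileID: String) {
        // Only the active profile changes; nothing is installed or uninstalled.
        activeProfileID = newProfileID
    }
}

var device = Device(installedAppIDs: ["tv", "music", "podcasts"], activeProfileID: "first-user")
let installedBefore = device.installedAppIDs
device.switchProfile(to: "second-user")
assert(device.installedAppIDs == installedBefore)  // the installed set is maintained
```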
The above-described manner of changing user profiles (e.g., by maintaining the applications that are installed on the electronic device despite changing the active profile from the first user profile to the second user profile) allows the electronic device to provide a consistent experience to the first user and to the second user without requiring the device to uninstall or reinstall applications every time the active user profile is changed, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by not changing the set of applications that are installed on the electronic device and without requiring the user to re-install applications that the user desires to remain installed on the device), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of switching user profiles.
It should be understood that the particular order in which the operations in
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described with respect to
Users interact with electronic devices in many different manners, including using an electronic device to browse for and view items of content on the electronic device. In some embodiments, the user desires to concurrently view multiple content items or to view a content item while simultaneously browsing for content. The embodiments described below provide ways in which an electronic device displays a content item overlaid over another user interface from which the user is able to browse for and display other content items, thus enhancing users' interactions with the device. Enhancing interactions with a device reduces the amount of time needed by a user to perform operations, and thus reduces the power usage of the device and increases battery life for battery-powered devices. It is understood that people use devices. When a person uses a device, that person is optionally referred to as a user of the device.
In
In
In
In
In
In
As shown in
In
In
In
However, in some embodiments, if the primary display is not outputting audio, then device 500 will output the audio from the PIP display. For example, in
In
In
In
In
In
In
As described below, the method 1900 provides ways to display a content item in picture-in-picture mode. The method reduces the cognitive burden on a user when interacting with a user interface of the device of the disclosure, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, increasing the efficiency of the user's interaction with the user interface conserves power and increases the time between battery charges.
In some embodiments, such as in
In some embodiments, in response to receiving the indication of the contact detected on the touch-sensitive surface of the remote control device, in accordance with a determination that the user interface comprises a content playback user interface (e.g., a movie or TV show playback user interface in which a movie or TV show is currently playing or paused), the electronic device displays (1904), in the user interface, a selectable option for displaying the user interface as an overlay over another user interface, such as in
In some embodiments, the electronic device receives (1906), via the one or more input devices, an input selecting the selectable option for displaying the user interface as the overlay over another user interface, such as in
In some embodiments, in response to receiving the input selecting the selectable option, the electronic device displays (1908), via the display device, the user interface as the overlay over the other user interface, such as in
The above-described manner of activating a picture-in-picture mode allows the electronic device to provide the user with a method of activating picture-in-picture, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by providing a mechanism for the user to enter picture-in-picture without requiring the user to navigate to a separate user interface or perform additional inputs to enable picture-in-picture), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of enabling picture-in-picture.
In some embodiments, the user interface was displayed in response to an input that was received, via the one or more input devices, when a respective user interface was displayed via the display device, and the other user interface is the respective user interface (1910), such as in
The above-described manner of activating a picture-in-picture mode (e.g., by displaying the picture-in-picture content overlaid over the user interface that was displayed before content playback began) allows the electronic device to provide the user with the user interface that the user was previously browsing so that the user is able to continue browsing for other content when the device enters into picture-in-picture mode, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by automatically displaying the user interface that the user was browsing when the device enters picture-in-picture mode without requiring the user to navigate through multiple user interfaces to reach the same user interface that was displayed before content playback began), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of enabling picture-in-picture mode.
In some embodiments, in response to receiving the indication of the contact detected on the touch-sensitive surface of the remote control device, in accordance with a determination that the user interface does not comprise a content playback user interface, the electronic device forgoes displaying (1912), in the user interface, the selectable option for displaying the user interface as an overlay over another user interface, such as in
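As an illustrative sketch of the conditional behavior above (the option is displayed for a content playback user interface and forgone otherwise), the snippet below returns which controls would be shown for a given kind of user interface; the enum and field names are hypothetical.

```swift
// Illustrative only: the option to display the user interface as an overlay is
// offered in response to a contact only when the current user interface is a
// content playback user interface; otherwise it is forgone. Names are hypothetical.
enum UserInterfaceKind {
    case contentPlayback
    case browsing
}

struct PlaybackControls {
    var showsScrubberBar = false
    var showsPictureInPictureOption = false
}

func controlsAfterContact(on ui: UserInterfaceKind) -> PlaybackControls {
    var controls = PlaybackControls()
    if ui == .contentPlayback {
        controls.showsScrubberBar = true            // scrubbing through the playing content
        controls.showsPictureInPictureOption = true // option to display the UI as an overlay
    }
    return controls
}
```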
The above-described manner of displaying a selectable option for entering a picture-in-picture mode (e.g., by displaying a selectable option to enter picture-in-picture mode when the user interface is a content playback user interface, but not displaying a selectable option to enter picture-in-picture mode when the user interface is not a content playback user interface) allows the electronic device to provide the user with the option to enter picture-in-picture mode only if the device is displaying content that can be displayed in a picture-in-picture overlay, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., without requiring the user to determine whether picture-in-picture mode is actually available and without unnecessarily displaying an option to enter picture-in-picture mode when picture-in-picture mode is not actually available), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of enabling picture-in-picture.
In some embodiments, displaying, via the display device, the user interface as the overlay over the other user interface includes displaying the user interface as the overlay without displaying one or more selectable options for interacting with the overlay (1914), such as in
The above-described manner of displaying a picture-in-picture overlay (e.g., by displaying the content in the picture-in-picture overlay without displaying selectable options on the overlay for interacting with the overlay) allows the electronic device to provide the user with a clean viewing experience of the picture-in-picture content and to display selectable options only when the user performs an input corresponding to a request to access the selectable options, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by not unnecessarily displaying options for interacting with the picture-in-picture overlay when the user has not shown a desire for them), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of watching picture-in-picture content.
In some embodiments, in response to receiving the indication of the contact detected on the touch-sensitive surface of the remote control device, in accordance with the determination that the user interface comprises a content playback user interface, the electronic device displays (1916), in the user interface, a scrubber bar for scrubbing through content being played in the content playback user interface, such as in
The above-described manner of displaying a selectable option for entering a picture-in-picture mode (e.g., by displaying a selectable option to enter picture-in-picture mode concurrently with the display of a scrubber bar) allows the electronic device to provide the user, after a single gesture, with multiple options of how to interact with the content currently playing, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., without requiring the user to determine whether picture-in-picture mode is available and without interrupting the user's playback to navigate through a series of menus to activate picture-in-picture mode), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of enabling picture-in-picture.
In some embodiments, while displaying, in the user interface, the scrubber bar and the selectable option for displaying the user interface as an overlay over another user interface, and while the selectable option does not have a current focus, the electronic device detects (1918), via the remote control device, an input including a contact having an intensity greater than an intensity threshold in the touch-sensitive surface of the remote control device, such as in
In some embodiments, in response to detecting the input including the contact having the intensity greater than the intensity threshold in the touch-sensitive surface of the remote control device, the electronic device initiates (1920) a scrubbing mode for scrubbing through the content being played in the content playback user interface without displaying, via the display device, the user interface as the overlay over the other user interface, such as in
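A hypothetical sketch of the intensity-based branching described above: a press above an intensity threshold begins scrubbing when the picture-in-picture option does not have focus, and selects that option when it does. The threshold value and all names are illustrative assumptions, not the described implementation.

```swift
// Illustrative only: a press whose intensity exceeds a threshold initiates a
// scrubbing mode when the picture-in-picture option does not have focus, and
// selects that option when it does. The threshold and names are hypothetical.
struct PressEvent {
    let intensity: Double   // normalized 0...1 for this sketch
}

enum PlayerAction {
    case beginScrubbing
    case enterPictureInPicture
    case none
}

func resolve(_ press: PressEvent,
             optionHasFocus: Bool,
             intensityThreshold: Double = 0.5) -> PlayerAction {
    guard press.intensity > intensityThreshold else { return .none }
    return optionHasFocus ? .enterPictureInPicture : .beginScrubbing
}
```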
The above-described manner of interacting with the content currently playing (e.g., by entering scrubbing mode in response to receiving a click input on the touch-sensitive surface of the remote control device) allows the electronic device to provide the user with the ability to scrub through the currently playing content while simultaneously displaying the option to enter into picture-in-picture mode, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient, which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of enabling picture-in-picture.
In some embodiments, while the user interface is displayed as the overlay over playing content, the electronic device receives (1922), via the one or more input devices, an indication of a second contact detected on the touch-sensitive surface of the remote control device, such as in
In some embodiments, in response to receiving the indication of the second contact detected on the touch-sensitive surface of the remote control device (1924), such as in
In some embodiments, the selectable options for interacting with the overlay include a selectable option that is selectable to exit picture-in-picture mode. In some embodiments, the selectable options for interacting with the overlay include a selectable option that is selectable to move the picture-in-picture overlay to another location on the user interface (e.g., move the picture-in-picture overlay to a different corner of the user interface). In some embodiments, the selectable options for interacting with the overlay include a selectable option that is selectable to swap the content that is displayed (e.g., swap the content being displayed in the picture-in-picture overlay with the content that is being displayed beneath the picture-in-picture overlay (e.g., in the primary user interface)). In some embodiments, the selectable options for interacting with the overlay are displayed in the picture-in-picture overlay, overlaid over the content being displayed in the picture-in-picture overlay. In some embodiments, the selectable options for interacting with the overlay are displayed at another location on the user interface (e.g., not overlaid over the picture-in-picture overlay).
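By way of illustration, the sketch below models the overlay's controls (exit, move, swap) as hidden by default and revealed in response to a contact, consistent with the behavior described above; the command names and types are hypothetical.

```swift
// Illustrative only: the overlay's controls are hidden by default and revealed
// in response to a contact on the remote's touch-sensitive surface. The command
// names and types are hypothetical.
enum OverlayCommand {
    case exitPictureInPicture
    case moveToAnotherCorner
    case swapWithPrimaryContent
}

struct PictureInPictureOverlayControls {
    private(set) var visibleCommands: [OverlayCommand] = []   // none shown initially

    mutating func contactDetected() {
        visibleCommands = [.exitPictureInPicture, .moveToAnotherCorner, .swapWithPrimaryContent]
    }

    mutating func dismiss() {
        visibleCommands = []
    }
}
```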
The above-described manner of displaying selectable options for interacting with the picture-in-picture overlay (e.g., by displaying the selectable options for interacting with the picture-in-picture overlay in response to receiving a user contact on the touch-sensitive surface) allows the electronic device to provide the user with selectable options for interacting with the picture-in-picture overlay only after the user requests display of the selectable options, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., without always displaying the selectable options or without interrupting the user's playback to navigate through a series of menus to interact with the picture-in-picture overlay), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, displaying the scrubber bar for scrubbing through the playing content comprises displaying the scrubber bar without displaying a selectable option for displaying the playing content as an overlay over another user interface (1930), such as in
The above-described manner of displaying a selectable option for entering a picture-in-picture mode (e.g., by displaying a selectable option to enter picture-in-picture mode if the device is not already in picture-in-picture mode, but not displaying the selectable option if the device is already in picture-in-picture mode) allows the electronic device to only provide the user with the selectable option to enter into picture-in-picture mode if the device is not already in picture-in-picture mode, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by not displaying an unnecessary option to activate picture-in-picture mode and without requiring the user to separately determine whether picture-in-picture mode is actually available), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently, such as by streamlining the process of enabling picture-in-picture.
In some embodiments, while the selectable options that are selectable to interact with the user interface do not have a current focus, the electronic device receives, via the one or more input devices, an indication of a directional input detected at the touch-sensitive surface of the remote control device (1932), such as in
In some embodiments, in response to receiving the indication of the directional input detected at the touch-sensitive surface of the remote control device, the electronic device updates a respective selectable option of the one or more selectable options that are selectable to interact with the user interface that is displayed as the overlay over the playing content to have the current focus (1934), such as in
The above-described manner of accessing the selectable options for interacting with the picture-in-picture overlay (e.g., by moving a focus to the selectable options for interacting with the picture-in-picture overlay in response to receiving a directional input) allows the electronic device to provide the user with the ability to access the selectable options for interacting with the picture-in-picture overlay while simultaneously providing the user with access to the scrubber bar, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., without requiring the user to perform additional inputs to access either the scrubber bar or the selectable options for interacting with the picture-in-picture overlay or without requiring the user to interrupt playback to navigate through a series of menus to interact with the picture-in-picture overlay), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, while the selectable options that are selectable to interact with the user interface do not have a current focus, the electronic device receives (1936), via the one or more input devices, an indication of a contact having an intensity greater than an intensity threshold detected at the touch-sensitive surface of the remote control device, such as in
In some embodiments, in response to receiving the indication of the contact having the intensity greater than the intensity threshold detected at the touch-sensitive surface of the remote control device, the electronic device initiates a scrubbing mode for scrubbing through the playing content (1938), such as in
The above-described manner of interacting with the currently displayed content while in picture-in-picture mode (e.g., by displaying a selectable option to enter picture-in-picture mode concurrently with the display of a scrubber bar and entering scrubbing mode in response to receiving a click on the touch-sensitive surface of the remote control device) allows the electronic device to provide the user with the ability to access the selectable options for interacting with the picture-in-picture overlay while simultaneously providing the user with access to the scrubber bar, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., without requiring the user to perform additional inputs to enter a scrubbing mode), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, while the user interface is displayed as the overlay over the other user interface (e.g., while in picture-in-picture mode), wherein the overlay is displayed over a first location in the other user interface (e.g., in a respective corner of the user interface), the electronic device receives (1940), via the one or more input devices, an input corresponding to a request to move a current focus in the other user interface to a second location in the other user interface, such as in
In some embodiments, in response to receiving the input corresponding to the request to move the current focus in the other user interface to the second location in the other user interface (1942), such as in
In some embodiments, in accordance with a determination that the second location is not within the threshold distance of the first location, the electronic device maintains display of the overlay at the first location over the other user interface (1946), such as in
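A minimal sketch of the distance test described above, assuming a simple two-dimensional layout: the overlay is relocated only when the newly focused location falls within a threshold distance of the overlay's current location, and is otherwise left in place. The coordinate model, the threshold value, and all names are illustrative assumptions.

```swift
// Illustrative only: the overlay is relocated when the newly focused location
// falls within a threshold distance of the overlay's current location, and is
// otherwise left where it is. Coordinates, threshold, and names are hypothetical.
struct Point {
    var x: Double
    var y: Double

    func distance(to other: Point) -> Double {
        ((x - other.x) * (x - other.x) + (y - other.y) * (y - other.y)).squareRoot()
    }
}

func overlayLocation(afterFocusMovesTo focus: Point,
                     overlayAt overlay: Point,
                     alternateLocation: Point,
                     threshold: Double = 200) -> Point {
    // Move the overlay out of the way only if the focused item would be obscured.
    focus.distance(to: overlay) < threshold ? alternateLocation : overlay
}
```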
The above-described manner of automatically moving the picture-in-picture overlay (e.g., by moving the picture-in-picture overlay when the user moves a focus to an item that is obscured by the overlay) allows the electronic device to provide the user with the ability to navigate to all items in the user interface without requiring the user to manually move the picture-in-picture overlay to a different location to access items that are displayed beneath the overlay, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by automatically moving the overlay if items that the user is interested in are obscured by the overlay), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, while the current focus is at the second location in the other user interface and the overlay is displayed over the third location in other user interface, the electronic device receives (1948), via the one or more input devices, an input corresponding to a request to move the current focus in the other user interface to a fourth location in the other user interface, such as in
In some embodiments, in response to receiving the input corresponding to the request to move the current focus in the other user interface to the fourth location in the other user interface (1950), such as in
The above-described manner of automatically moving the picture-in-picture overlay (e.g., by moving the picture-in-picture overlay when the user moves a focus to an item that is obscured by the overlay and moving the overlay back to its original position after the user moves the focus away from the item that would have been obscured by the overlay) allows the electronic device to provide the user with the ability to navigate to all items in the user interface while minimizing the disruption to the playback of the picture-in-picture content (e.g., by moving the overlay back to its original position after the user is done navigating to items that would have been obscured by the overlay), which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by automatically moving the overlay back to its original position that is familiar to the user to provide a consistent display and without requiring the user to manually move the overlay back to its original position), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, while the user interface is displayed as the overlay over the other user interface, the electronic device receives (1954), via the one or more input devices, an indication of selection of a respective button on the remote control device, such as in
In some embodiments, in response to receiving the indication of the selection of the respective button on the remote control device (1956), such as in
In some embodiments, if the selection of the respective button does not meet the first criteria (e.g., is not a depression for longer than the time threshold), then the electronic device does not cause display of the control center user interface or display of the selectable options for interacting with the picture-in-picture overlay. In some embodiments, in response to the user input that does not meet the first criteria, the electronic device launches the unified media browsing application or performs another action corresponding to a short click or tap of the respective button (e.g., as opposed to a long-click or click-and-hold input). In some embodiments, a user input corresponding to a selection of the “home” or “menu” button corresponding to a request to cease display of the control center user interface causes the control center user interface and the selectable options for interacting with the picture-in-picture overlay to cease being displayed and causes focus to move back to the item that had focus before the control center user interface was displayed.
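The following sketch illustrates, with hypothetical names and a hypothetical time threshold, the first-criteria branching described above for the respective button: a press held past the threshold yields the control center together with the overlay's controls, while a shorter press performs a different action.

```swift
// Illustrative only: a press of the respective button held longer than a time
// threshold displays the control center together with the overlay's controls,
// while a shorter press performs a different action (e.g., launching a media
// browsing application). The threshold value and all names are hypothetical.
enum ButtonResponse {
    case showControlCenterAndOverlayControls   // long press meets the first criteria
    case performShortPressAction               // short press does not
}

func respond(toPressHeldForSeconds duration: Double,
             longPressThreshold: Double = 0.5) -> ButtonResponse {
    duration >= longPressThreshold
        ? .showControlCenterAndOverlayControls
        : .performShortPressAction
}
```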
In some embodiments, a selectable option in the control user interface has a current focus (1964), such as in
In some embodiments, focus is moved from the control center to the selectable options for interacting with the picture-in-picture overlay (e.g., the selectable options, discussed above, for swapping the content displayed in the overlay with the content displayed on the primary display, for moving the picture-in-picture overlay, and for exiting picture-in-picture mode) in response to a user input corresponding to a navigation toward the direction of the selectable options for interacting with the picture-in-picture overlay. For example, if the selectable options for interacting with the picture-in-picture overlay are displayed to the left of the control center (e.g., if the picture-in-picture overlay is displayed to the left of the control center), then a leftward navigation causes focus to move from a selectable option on the control center user interface to one of the selectable options for interacting with the picture-in-picture overlay (e.g., optionally the selectable option closest to the control center user interface, such as the selectable option for exiting picture-in-picture mode).
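As a purely illustrative sketch of the directional focus movement described above, the function below moves focus from the control center to the overlay's controls only when the navigation direction matches the side on which those controls are displayed; all names are hypothetical.

```swift
// Illustrative only: a navigation toward the side on which the overlay's
// controls are displayed moves focus from the control center onto those
// controls; other navigations leave focus unchanged. Names are hypothetical.
enum Direction { case left, right, up, down }

enum FocusTarget {
    case controlCenterOption
    case overlayControl
}

func focusAfterNavigation(_ direction: Direction,
                          current: FocusTarget,
                          overlayControlsSide: Direction) -> FocusTarget {
    if current == .controlCenterOption && direction == overlayControlsSide {
        return .overlayControl
    }
    return current
}
```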
The above-described manner of displaying selectable options for interacting with the picture-in-picture overlay (e.g., by displaying the selectable options for interacting with the picture-in-picture overlay in response to the same user input that causes display of the control center user interface) allows the electronic device to provide the user with selectable options for interacting with the picture-in-picture overlay when the primary user interface is not playing content, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., by still providing controls to the user even if content is not being played on the primary user interface, without requiring the user to play back content on the primary display, then display the selectable options, interact with the picture-in-picture overlay as desired, and then stop playback of the content on the primary display to achieve the same functionality), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, while the user interface is displayed as the overlay over the other user interface (e.g., and while a current focus is in the other user interface, such as on a representation of a content item in the other user interface (e.g., a unified media browsing application user interface)), the electronic device receives (1966), via the one or more input devices, an indication of selection of a respective button on the remote control device, such as in
In some embodiments, in response to receiving the indication of the selection of the respective button on the remote control device (1968), such as in
For example, the first criteria is satisfied if the user input is a double-click on the respective button (e.g., a play/pause button). In some embodiments, other input patterns are possible to satisfy the first criteria (e.g., such as a long press, or a click followed quickly by a click-and-hold, etc.). In some embodiments, one of the selectable options of the selectable options for interacting with the picture-in-picture overlay has a focus. In some embodiments, if the user input does not meet the first criteria, then the electronic device does not display the selectable options for interacting with the picture-in-picture overlay. In some embodiments, if the user input does not meet the first criteria, then the device performs a different action, such as the action corresponding to a single button press actuation of the respective button (e.g., in response to the user performing a single button press actuation of the respective button). In some embodiments, a user input corresponding to a selection of the “home” or “menu” button corresponding to a request to cease display of the selectable options for interacting with the picture-in-picture overlay causes the selectable options for interacting with the picture-in-picture overlay to cease being displayed and causes focus to move back to the item that had focus before the selectable options were displayed (e.g., focus moves back to a representation of a content item in a unified media browsing application).
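A hypothetical sketch of one way the double-click pattern mentioned above could be detected: two presses within a short interval are treated as a double-click that reveals the overlay's controls, while an isolated press falls through to its ordinary action. The interval and all names are assumptions for illustration.

```swift
// Illustrative only: two presses of the respective button within a short
// interval are treated as a double-click that reveals the overlay's controls;
// an isolated press falls through to its ordinary action. The interval and
// names are hypothetical.
struct DoubleClickDetector {
    private var lastPressTime: Double?   // seconds, from any monotonic clock
    let maximumInterval: Double = 0.4

    mutating func registerPress(at time: Double) -> Bool {
        if let previous = lastPressTime, time - previous <= maximumInterval {
            lastPressTime = nil          // consume the pair
            return true                  // double-click: show overlay controls
        }
        lastPressTime = time
        return false                     // single press: ordinary action
    }
}

var detector = DoubleClickDetector()
_ = detector.registerPress(at: 10.0)                  // single press
let isDoubleClick = detector.registerPress(at: 10.3)  // true: reveal the controls
```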
The above-described manner of displaying selectable options for interacting with the picture-in-picture overlay (e.g., by displaying the selectable options for interacting with the picture-in-picture overlay in response to receiving a double-click button actuation) allows the electronic device to provide the user with selectable options for interacting with the picture-in-picture overlay only after the user requests display of the selectable options even when the device is not currently playing content on the primary user interface, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., without always displaying the selectable options or without interrupting the user's playback to navigate through a series of menus to interact with the picture-in-picture overlay and without requiring that content be played on the primary user interface), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, while the user interface is displayed as the overlay over the other user interface (1974), such as in
In some embodiments, in accordance with a determination that the other user interface does not include content that is currently playing that includes respective audio, the electronic device plays (1978) the audio for the content in the overlay, such as in
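As an illustrative sketch of the audio behavior described above, the function below prefers the audio of the primary content and falls back to the overlay's audio only when the primary content is not producing audible output (not playing, has no audio, or is muted); the types and names are hypothetical.

```swift
// Illustrative only: audio from the primary (full-screen) content is preferred;
// the overlay's audio is played only when the primary content is not producing
// audible output (not playing, has no audio, or is muted). Names are hypothetical.
struct AudioSourceState {
    var isPlaying: Bool
    var hasAudio: Bool
    var isMuted: Bool
}

enum AudioRoute {
    case primaryContent
    case pictureInPictureOverlay
}

func chooseAudioRoute(primary: AudioSourceState, overlay: AudioSourceState) -> AudioRoute {
    let primaryIsAudible = primary.isPlaying && primary.hasAudio && !primary.isMuted
    return primaryIsAudible ? .primaryContent : .pictureInPictureOverlay
}
```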
The above-described manner of outputting audio from content being played by the device (e.g., by always outputting the audio from the playback of content on the primary user interface unless the playback of content on the primary user interface does not include audio or the audio is muted, in which case the audio from the playback of content in the picture-in-picture overlay is output) allows the electronic device to provide the user with the ability to be fully immersed in the content being displayed on the primary user interface, but quickly hear the audio from the content on the picture-in-picture overlay if the user requests it (e.g., by muting the content on the primary user interface), which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient (e.g., without requiring the user to swap the content being displayed on the primary user interface with the content on the picture-in-picture overlay to hear audio from the content that is in the picture-in-picture overlay, even for a short time frame), which additionally reduces power usage and improves battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
It should be understood that the particular order in which the operations in
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described with respect to
As described above, one aspect of the present technology is the gathering and use of data available from various sources to improve the delivery to users of content that may be of interest to them. The present disclosure contemplates that in some instances, this gathered data may include personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include demographic data, location-based data, telephone numbers, email addresses, Twitter IDs, home addresses, data or records relating to a user's health or level of fitness (e.g., vital signs measurements, medication information, exercise information), date of birth, or any other identifying or personal information.
The present disclosure recognizes that the use of such personal information data, in the present technology, can be used to the benefit of users. For example, the personal information data can be used to deliver targeted content that is of greater interest to the user. Accordingly, use of such personal information data enables users to have calculated control of the delivered content. Further, other uses for personal information data that benefit the user are also contemplated by the present disclosure. For instance, viewing history data may be used to provide customized recommendations to users, or may be used to provide the user with the user's own past viewing history. Further, personal information such as personal preferences and settings can be used to quickly load and switch between respective users' preferences and settings.
The present disclosure contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data private and secure. Such policies should be easily accessible by users, and should be updated as the collection and/or use of data changes. Personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection/sharing should occur after receiving the informed consent of the users. Additionally, such entities should consider taking any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices. In addition, policies and practices should be adapted for the particular types of personal information data being collected and/or accessed and adapted to applicable laws and standards, including jurisdiction-specific considerations. For instance, in the US, collection of or access to certain health data may be governed by federal and/or state laws, such as the Health Insurance Portability and Accountability Act (HIPAA); whereas health data in other countries may be subject to other regulations and policies and should be handled accordingly. Hence different privacy practices should be maintained for different personal data types in each country.
Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, in the case of content delivery services, the present technology can be configured to allow users to select to “opt in” or “opt out” of participation in the collection of personal information data during registration for services or anytime thereafter. In another example, users can select not to provide content taste data, for targeted content delivery services. In addition to providing “opt in” and “opt out” options, the present disclosure contemplates providing notifications relating to the access or use of personal information. For instance, a user may be notified upon downloading an app that their personal information data will be accessed and then reminded again just before personal information data is accessed by the app.
Moreover, it is the intent of the present disclosure that personal information data should be managed and handled in a way to minimize risks of unintentional or unauthorized access or use. Risk can be minimized by limiting the collection of data and deleting data once it is no longer needed. In addition, and when applicable, including in certain health-related applications, data de-identification can be used to protect a user's privacy. De-identification may be facilitated, when appropriate, by removing specific identifiers (e.g., date of birth, etc.), controlling the amount or specificity of data stored (e.g., collecting location data at a city level rather than at an address level), controlling how data is stored (e.g., aggregating data across users), and/or other methods.
Therefore, although the present disclosure broadly covers use of personal information data to implement one or more various disclosed embodiments, the present disclosure also contemplates that the various embodiments can also be implemented without the need for accessing such personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data. For example, content can be selected and delivered to users by inferring preferences based on non-personal information data or a bare minimum amount of personal information, such as the content being requested by the device associated with a user, other non-personal information available to the content delivery services, or publicly available information.
The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best use the invention and various described embodiments with various modifications as are suited to the particular use contemplated.
This application claims the benefit under 35 USC 119(e) of U.S. Provisional Patent Application No. 62/822,966, filed Mar. 24, 2019 and U.S. Provisional Patent Application No. 62/855,867, filed May 31, 2019, the contents of which are incorporated herein by reference in their entirety for all purposes.
20080260252 | Borgaonkar et al. | Oct 2008 | A1 |
20080270886 | Gossweiler et al. | Oct 2008 | A1 |
20080276279 | Gossweiler et al. | Nov 2008 | A1 |
20080301260 | Goldeen et al. | Dec 2008 | A1 |
20080301579 | Jonasson et al. | Dec 2008 | A1 |
20080301734 | Goldeen et al. | Dec 2008 | A1 |
20080307343 | Robert et al. | Dec 2008 | A1 |
20080307458 | Kim et al. | Dec 2008 | A1 |
20080307459 | Migos | Dec 2008 | A1 |
20080320391 | Lemay et al. | Dec 2008 | A1 |
20080320532 | Lee | Dec 2008 | A1 |
20090055385 | Jeon et al. | Feb 2009 | A1 |
20090063521 | Bull et al. | Mar 2009 | A1 |
20090063975 | Rottler et al. | Mar 2009 | A1 |
20090089837 | Momosaki | Apr 2009 | A1 |
20090094662 | Chang et al. | Apr 2009 | A1 |
20090119754 | Schubert | May 2009 | A1 |
20090158325 | Johnson | Jun 2009 | A1 |
20090158326 | Hunt et al. | Jun 2009 | A1 |
20090161868 | Chaudhry | Jun 2009 | A1 |
20090164944 | Webster et al. | Jun 2009 | A1 |
20090165054 | Rudolph | Jun 2009 | A1 |
20090174679 | Westerman | Jul 2009 | A1 |
20090177301 | Hayes | Jul 2009 | A1 |
20090177989 | Ma et al. | Jul 2009 | A1 |
20090178083 | Carr et al. | Jul 2009 | A1 |
20090228491 | Malik | Sep 2009 | A1 |
20090228807 | Lemay | Sep 2009 | A1 |
20090239587 | Negron | Sep 2009 | A1 |
20090256807 | Nurmi | Oct 2009 | A1 |
20090259957 | Slocum et al. | Oct 2009 | A1 |
20090278916 | Ito | Nov 2009 | A1 |
20090282444 | Laksono et al. | Nov 2009 | A1 |
20090288079 | Zuber et al. | Nov 2009 | A1 |
20090313100 | Ingleshwar | Dec 2009 | A1 |
20090322962 | Weeks | Dec 2009 | A1 |
20090327952 | Karas et al. | Dec 2009 | A1 |
20100009629 | Jung et al. | Jan 2010 | A1 |
20100031162 | Wiser et al. | Feb 2010 | A1 |
20100053220 | Ozawa et al. | Mar 2010 | A1 |
20100053432 | Cheng | Mar 2010 | A1 |
20100057696 | Miyazawa et al. | Mar 2010 | A1 |
20100064313 | Beyabani | Mar 2010 | A1 |
20100080163 | Krishnamoorthi et al. | Apr 2010 | A1 |
20100083181 | Matsushima et al. | Apr 2010 | A1 |
20100095240 | Shiplacoff | Apr 2010 | A1 |
20100100899 | Bradbury et al. | Apr 2010 | A1 |
20100104269 | Prestenback et al. | Apr 2010 | A1 |
20100115592 | Belz et al. | May 2010 | A1 |
20100121714 | Bryant et al. | May 2010 | A1 |
20100146442 | Nagasaka et al. | Jun 2010 | A1 |
20100153881 | Dinn | Jun 2010 | A1 |
20100153999 | Yates | Jun 2010 | A1 |
20100159898 | Krzyzanowski et al. | Jun 2010 | A1 |
20100162172 | Aroner | Jun 2010 | A1 |
20100194998 | Lee et al. | Aug 2010 | A1 |
20100198822 | Glennon et al. | Aug 2010 | A1 |
20100205628 | Davis et al. | Aug 2010 | A1 |
20100211636 | Starkenburg et al. | Aug 2010 | A1 |
20100223646 | Goldeen et al. | Sep 2010 | A1 |
20100229194 | Blanchard et al. | Sep 2010 | A1 |
20100235744 | Schultz | Sep 2010 | A1 |
20100251304 | Donoghue et al. | Sep 2010 | A1 |
20100257005 | Phenner et al. | Oct 2010 | A1 |
20100269145 | Ingrassia et al. | Oct 2010 | A1 |
20100275143 | Fu et al. | Oct 2010 | A1 |
20100277337 | Brodersen | Nov 2010 | A1 |
20100293190 | Kaiser et al. | Nov 2010 | A1 |
20100293586 | Simoes et al. | Nov 2010 | A1 |
20100299606 | Morita | Nov 2010 | A1 |
20100312824 | Smith et al. | Dec 2010 | A1 |
20100325660 | Holden | Dec 2010 | A1 |
20100333142 | Busse | Dec 2010 | A1 |
20100333143 | Civanlar et al. | Dec 2010 | A1 |
20110004831 | Steinberg et al. | Jan 2011 | A1 |
20110047513 | Onogi et al. | Feb 2011 | A1 |
20110052146 | Murthy et al. | Mar 2011 | A1 |
20110054649 | Sarkis et al. | Mar 2011 | A1 |
20110055762 | Jung et al. | Mar 2011 | A1 |
20110055870 | Yum et al. | Mar 2011 | A1 |
20110071977 | Nakajima et al. | Mar 2011 | A1 |
20110078739 | Grad | Mar 2011 | A1 |
20110080935 | Kim et al. | Apr 2011 | A1 |
20110087992 | Wang et al. | Apr 2011 | A1 |
20110090402 | Huntington | Apr 2011 | A1 |
20110093415 | Rhee et al. | Apr 2011 | A1 |
20110119715 | Chang et al. | May 2011 | A1 |
20110131607 | Thomas et al. | Jun 2011 | A1 |
20110154194 | Mathai et al. | Jun 2011 | A1 |
20110154305 | Leroux et al. | Jun 2011 | A1 |
20110157029 | Tseng | Jun 2011 | A1 |
20110162022 | Xia | Jun 2011 | A1 |
20110163971 | Wagner et al. | Jul 2011 | A1 |
20110167339 | Lemay | Jul 2011 | A1 |
20110175930 | Hwang et al. | Jul 2011 | A1 |
20110179388 | Fleizach et al. | Jul 2011 | A1 |
20110179453 | Poniatowski | Jul 2011 | A1 |
20110197153 | King et al. | Aug 2011 | A1 |
20110209177 | Sela et al. | Aug 2011 | A1 |
20110218948 | De et al. | Sep 2011 | A1 |
20110231280 | Farah | Sep 2011 | A1 |
20110231823 | Frye et al. | Sep 2011 | A1 |
20110231872 | Gharachorloo et al. | Sep 2011 | A1 |
20110231878 | Hunter et al. | Sep 2011 | A1 |
20110246332 | Alcodray et al. | Oct 2011 | A1 |
20110281517 | Ukkadam | Nov 2011 | A1 |
20110283304 | Roberts et al. | Nov 2011 | A1 |
20110283333 | Ukkadam | Nov 2011 | A1 |
20110289064 | Lebeau et al. | Nov 2011 | A1 |
20110289317 | Darapu | Nov 2011 | A1 |
20110289419 | Yu et al. | Nov 2011 | A1 |
20110289421 | Jordan et al. | Nov 2011 | A1 |
20110289452 | Jordan et al. | Nov 2011 | A1 |
20110289531 | Moonka et al. | Nov 2011 | A1 |
20110289534 | Jordan et al. | Nov 2011 | A1 |
20110296351 | Ewing et al. | Dec 2011 | A1 |
20110302532 | Missig | Dec 2011 | A1 |
20110307631 | Park et al. | Dec 2011 | A1 |
20110312278 | Matsushita et al. | Dec 2011 | A1 |
20110321072 | Patterson et al. | Dec 2011 | A1 |
20120019674 | Ohnishi et al. | Jan 2012 | A1 |
20120023450 | Noto et al. | Jan 2012 | A1 |
20120036552 | Dare et al. | Feb 2012 | A1 |
20120042245 | Askey | Feb 2012 | A1 |
20120042343 | Laligand et al. | Feb 2012 | A1 |
20120053887 | Nurmi | Mar 2012 | A1 |
20120054178 | Tran et al. | Mar 2012 | A1 |
20120054642 | Balsiger et al. | Mar 2012 | A1 |
20120054797 | Skog et al. | Mar 2012 | A1 |
20120059910 | Cassidy | Mar 2012 | A1 |
20120060092 | Hill | Mar 2012 | A1 |
20120064204 | Davila et al. | Mar 2012 | A1 |
20120084136 | Seth et al. | Apr 2012 | A1 |
20120093481 | Mcdowell et al. | Apr 2012 | A1 |
20120096011 | Kay et al. | Apr 2012 | A1 |
20120102573 | Spooner et al. | Apr 2012 | A1 |
20120105367 | Son et al. | May 2012 | A1 |
20120110616 | Kilar et al. | May 2012 | A1 |
20120110621 | Gossweiler, III | May 2012 | A1 |
20120114303 | Chung et al. | May 2012 | A1 |
20120117584 | Gordon | May 2012 | A1 |
20120131615 | Kobayashi et al. | May 2012 | A1 |
20120139938 | Khedouri et al. | Jun 2012 | A1 |
20120144003 | Rosenbaum et al. | Jun 2012 | A1 |
20120158524 | Hintz et al. | Jun 2012 | A1 |
20120173991 | Roberts et al. | Jul 2012 | A1 |
20120174157 | Stinson et al. | Jul 2012 | A1 |
20120198020 | Parker et al. | Aug 2012 | A1 |
20120198336 | Novotny et al. | Aug 2012 | A1 |
20120210366 | Wong | Aug 2012 | A1 |
20120215684 | Kidron | Aug 2012 | A1 |
20120216113 | Li | Aug 2012 | A1 |
20120216117 | Arriola et al. | Aug 2012 | A1 |
20120216296 | Kidron | Aug 2012 | A1 |
20120221498 | Kaszynski et al. | Aug 2012 | A1 |
20120222056 | Donoghue et al. | Aug 2012 | A1 |
20120233640 | Odryna et al. | Sep 2012 | A1 |
20120236173 | Telek et al. | Sep 2012 | A1 |
20120242704 | Bamford et al. | Sep 2012 | A1 |
20120260291 | Wood | Oct 2012 | A1 |
20120260293 | Young et al. | Oct 2012 | A1 |
20120262371 | Lee et al. | Oct 2012 | A1 |
20120262407 | Hinckley et al. | Oct 2012 | A1 |
20120266069 | Moshiri et al. | Oct 2012 | A1 |
20120272261 | Reynolds et al. | Oct 2012 | A1 |
20120284753 | Roberts et al. | Nov 2012 | A1 |
20120290933 | Rajaraman et al. | Nov 2012 | A1 |
20120291079 | Gordon et al. | Nov 2012 | A1 |
20120308143 | Bellegarda et al. | Dec 2012 | A1 |
20120311443 | Chaudhri et al. | Dec 2012 | A1 |
20120311638 | Reyna et al. | Dec 2012 | A1 |
20120317482 | Barraclough et al. | Dec 2012 | A1 |
20120323938 | Skeen et al. | Dec 2012 | A1 |
20120324504 | Archer et al. | Dec 2012 | A1 |
20120327125 | Kutliroff et al. | Dec 2012 | A1 |
20130014150 | Seo et al. | Jan 2013 | A1 |
20130014159 | Wiser et al. | Jan 2013 | A1 |
20130021288 | Kaerkkaeinen et al. | Jan 2013 | A1 |
20130024895 | Yong et al. | Jan 2013 | A1 |
20130031585 | Itagaki et al. | Jan 2013 | A1 |
20130033643 | Kim et al. | Feb 2013 | A1 |
20130042271 | Yellin et al. | Feb 2013 | A1 |
20130061234 | Piira et al. | Mar 2013 | A1 |
20130061267 | Cansino et al. | Mar 2013 | A1 |
20130067366 | Almosnino | Mar 2013 | A1 |
20130073403 | Tuchman et al. | Mar 2013 | A1 |
20130083076 | Liu et al. | Apr 2013 | A1 |
20130097009 | Akadiri | Apr 2013 | A1 |
20130110978 | Gordon et al. | May 2013 | A1 |
20130124998 | Pendergast et al. | May 2013 | A1 |
20130132874 | He | May 2013 | A1 |
20130132966 | Chanda et al. | May 2013 | A1 |
20130151300 | Le Chevalier | Jun 2013 | A1 |
20130173034 | Reimann et al. | Jul 2013 | A1 |
20130174193 | Yu et al. | Jul 2013 | A1 |
20130179812 | BianRosa | Jul 2013 | A1 |
20130179995 | Basile et al. | Jul 2013 | A1 |
20130198686 | Kawai et al. | Aug 2013 | A1 |
20130205312 | Huang | Aug 2013 | A1 |
20130212531 | Yoshida | Aug 2013 | A1 |
20130227482 | Thorsander et al. | Aug 2013 | A1 |
20130247105 | Jovanovski et al. | Sep 2013 | A1 |
20130262431 | Garner et al. | Oct 2013 | A1 |
20130262558 | Wood et al. | Oct 2013 | A1 |
20130262619 | Goodwin et al. | Oct 2013 | A1 |
20130262633 | Goodwin et al. | Oct 2013 | A1 |
20130263189 | Garner | Oct 2013 | A1 |
20130283154 | Sasakura | Oct 2013 | A1 |
20130283168 | Brown et al. | Oct 2013 | A1 |
20130283317 | Guntupalli et al. | Oct 2013 | A1 |
20130283318 | Wannamaker | Oct 2013 | A1 |
20130285937 | Billings et al. | Oct 2013 | A1 |
20130290233 | Ferren et al. | Oct 2013 | A1 |
20130290848 | Billings et al. | Oct 2013 | A1 |
20130291018 | Billings et al. | Oct 2013 | A1 |
20130291037 | Im et al. | Oct 2013 | A1 |
20130294755 | Arme et al. | Nov 2013 | A1 |
20130312044 | Itagaki | Nov 2013 | A1 |
20130326499 | Mowatt et al. | Dec 2013 | A1 |
20130326554 | Shkedi | Dec 2013 | A1 |
20130326561 | Pandey | Dec 2013 | A1 |
20130332838 | Naggar et al. | Dec 2013 | A1 |
20130332960 | Young | Dec 2013 | A1 |
20130339877 | Skeen et al. | Dec 2013 | A1 |
20130340006 | Kwan | Dec 2013 | A1 |
20130346564 | Warrick et al. | Dec 2013 | A1 |
20130347044 | Lee et al. | Dec 2013 | A1 |
20140006635 | Braness et al. | Jan 2014 | A1 |
20140006795 | Han et al. | Jan 2014 | A1 |
20140006951 | Hunter | Jan 2014 | A1 |
20140012859 | Heilprin et al. | Jan 2014 | A1 |
20140013283 | Matas | Jan 2014 | A1 |
20140020017 | Stern et al. | Jan 2014 | A1 |
20140024341 | Johan | Jan 2014 | A1 |
20140033245 | Barton | Jan 2014 | A1 |
20140049692 | Sirpal et al. | Feb 2014 | A1 |
20140052683 | Kirkham et al. | Feb 2014 | A1 |
20140053116 | Smith | Feb 2014 | A1 |
20140053195 | Sirpal | Feb 2014 | A1 |
20140059605 | Sirpal | Feb 2014 | A1 |
20140059615 | Sirpal | Feb 2014 | A1 |
20140059625 | Dourado et al. | Feb 2014 | A1 |
20140059635 | Sirpal | Feb 2014 | A1 |
20140068654 | Marlow et al. | Mar 2014 | A1 |
20140071068 | Shih et al. | Mar 2014 | A1 |
20140074454 | Brown et al. | Mar 2014 | A1 |
20140075313 | Bachman et al. | Mar 2014 | A1 |
20140075316 | Li | Mar 2014 | A1 |
20140075394 | Nawle | Mar 2014 | A1 |
20140075574 | Zheng et al. | Mar 2014 | A1 |
20140082497 | Chalouhi et al. | Mar 2014 | A1 |
20140088952 | Fife et al. | Mar 2014 | A1 |
20140089816 | Dipersia et al. | Mar 2014 | A1 |
20140098102 | Raffle | Apr 2014 | A1 |
20140104646 | Nishiyama | Apr 2014 | A1 |
20140109204 | Papillon et al. | Apr 2014 | A1 |
20140111416 | Sugiura | Apr 2014 | A1 |
20140115636 | Stuckman | Apr 2014 | A1 |
20140123006 | Chen et al. | May 2014 | A1 |
20140129232 | Jones et al. | May 2014 | A1 |
20140130097 | Londero | May 2014 | A1 |
20140136946 | Matas | May 2014 | A1 |
20140137029 | Stephenson et al. | May 2014 | A1 |
20140137030 | Matas | May 2014 | A1 |
20140143260 | Simonson et al. | May 2014 | A1 |
20140143683 | Underwood et al. | May 2014 | A1 |
20140156792 | Roberts et al. | Jun 2014 | A1 |
20140157204 | Roberts et al. | Jun 2014 | A1 |
20140157329 | Roberts et al. | Jun 2014 | A1 |
20140164966 | Kim | Jun 2014 | A1 |
20140168071 | Ahmed et al. | Jun 2014 | A1 |
20140171153 | Kienzle et al. | Jun 2014 | A1 |
20140172622 | Baronshin | Jun 2014 | A1 |
20140172953 | Blanksteen | Jun 2014 | A1 |
20140173660 | Correa et al. | Jun 2014 | A1 |
20140184471 | Martynov | Jul 2014 | A1 |
20140189523 | Shuttleworth et al. | Jul 2014 | A1 |
20140189574 | Stallings et al. | Jul 2014 | A1 |
20140189606 | Shuttleworth | Jul 2014 | A1 |
20140196064 | Kennedy et al. | Jul 2014 | A1 |
20140196069 | Ahmed et al. | Jul 2014 | A1 |
20140208268 | Jimenez | Jul 2014 | A1 |
20140208360 | Kardatzke | Jul 2014 | A1 |
20140219637 | McIntosh | Aug 2014 | A1 |
20140224867 | Werner et al. | Aug 2014 | A1 |
20140244751 | Tseng | Aug 2014 | A1 |
20140245148 | Silva et al. | Aug 2014 | A1 |
20140245186 | Tseng | Aug 2014 | A1 |
20140245222 | Kovacevic | Aug 2014 | A1 |
20140250465 | Mulholland et al. | Sep 2014 | A1 |
20140250479 | Lee et al. | Sep 2014 | A1 |
20140253463 | Hicks | Sep 2014 | A1 |
20140259074 | Ansari et al. | Sep 2014 | A1 |
20140278072 | Fino et al. | Sep 2014 | A1 |
20140278940 | Wade | Sep 2014 | A1 |
20140280728 | Szerlip Joyce et al. | Sep 2014 | A1 |
20140282208 | Chaudhri | Sep 2014 | A1 |
20140282636 | Petander et al. | Sep 2014 | A1 |
20140282677 | Mantell et al. | Sep 2014 | A1 |
20140288686 | Sant et al. | Sep 2014 | A1 |
20140289226 | English et al. | Sep 2014 | A1 |
20140289751 | Hsu | Sep 2014 | A1 |
20140310742 | Kim | Oct 2014 | A1 |
20140317653 | Mlodzinski | Oct 2014 | A1 |
20140325357 | Sant et al. | Oct 2014 | A1 |
20140333530 | Agnetta | Nov 2014 | A1 |
20140337607 | Peterson et al. | Nov 2014 | A1 |
20140340358 | Martinoli | Nov 2014 | A1 |
20140341109 | Cartmell et al. | Nov 2014 | A1 |
20140344247 | Procopio et al. | Nov 2014 | A1 |
20140344291 | Simonson et al. | Nov 2014 | A9 |
20140344294 | Skeen et al. | Nov 2014 | A1 |
20140351691 | Neil | Nov 2014 | A1 |
20140359598 | Oliver et al. | Dec 2014 | A1 |
20140365479 | Lyons et al. | Dec 2014 | A1 |
20140365481 | Novosel et al. | Dec 2014 | A1 |
20140365604 | Lewis et al. | Dec 2014 | A1 |
20140365919 | Shaw et al. | Dec 2014 | A1 |
20140366040 | Parker et al. | Dec 2014 | A1 |
20140366047 | Thomas et al. | Dec 2014 | A1 |
20150020127 | Doshi et al. | Jan 2015 | A1 |
20150022481 | Andersson et al. | Jan 2015 | A1 |
20150039685 | Lewis et al. | Feb 2015 | A1 |
20150046866 | Shimadate | Feb 2015 | A1 |
20150062069 | Shin et al. | Mar 2015 | A1 |
20150067582 | Donnelly et al. | Mar 2015 | A1 |
20150067724 | Johnson et al. | Mar 2015 | A1 |
20150074522 | Harned et al. | Mar 2015 | A1 |
20150074552 | Chai | Mar 2015 | A1 |
20150074603 | Abe | Mar 2015 | A1 |
20150082187 | Wallters | Mar 2015 | A1 |
20150095460 | Berger et al. | Apr 2015 | A1 |
20150095845 | Chun et al. | Apr 2015 | A1 |
20150113429 | Edwards | Apr 2015 | A1 |
20150121408 | Jacoby et al. | Apr 2015 | A1 |
20150134653 | Bayer et al. | May 2015 | A1 |
20150150049 | White | May 2015 | A1 |
20150150066 | Park et al. | May 2015 | A1 |
20150153571 | Ballard et al. | Jun 2015 | A1 |
20150161251 | Ramanarayanan et al. | Jun 2015 | A1 |
20150169705 | Korbecki et al. | Jun 2015 | A1 |
20150169975 | Kienzle et al. | Jun 2015 | A1 |
20150186002 | Suzuki | Jul 2015 | A1 |
20150189347 | Oztaskent et al. | Jul 2015 | A1 |
20150193192 | Kidron | Jul 2015 | A1 |
20150195624 | Gossweiler, III | Jul 2015 | A1 |
20150205591 | Jitkoff et al. | Jul 2015 | A1 |
20150237389 | Grouf et al. | Aug 2015 | A1 |
20150277720 | Thorson et al. | Oct 2015 | A1 |
20150296072 | Zhou et al. | Oct 2015 | A1 |
20150301729 | Wang et al. | Oct 2015 | A1 |
20150309670 | Wheeler et al. | Oct 2015 | A1 |
20150312603 | Singh et al. | Oct 2015 | A1 |
20150317343 | Cselle et al. | Nov 2015 | A1 |
20150334464 | Shin | Nov 2015 | A1 |
20150346975 | Lee et al. | Dec 2015 | A1 |
20150350741 | Rajaraman et al. | Dec 2015 | A1 |
20150355816 | Shim | Dec 2015 | A1 |
20150363035 | Hinckley et al. | Dec 2015 | A1 |
20150365729 | Kaya | Dec 2015 | A1 |
20150370435 | Kirmse et al. | Dec 2015 | A1 |
20150370455 | Van Os et al. | Dec 2015 | A1 |
20150370920 | Van Os et al. | Dec 2015 | A1 |
20150373107 | Chan et al. | Dec 2015 | A1 |
20150382047 | Van Os et al. | Dec 2015 | A1 |
20150382066 | Heeter et al. | Dec 2015 | A1 |
20160004425 | Yoon et al. | Jan 2016 | A1 |
20160004772 | Kim et al. | Jan 2016 | A1 |
20160004773 | Jannink et al. | Jan 2016 | A1 |
20160005013 | Perry | Jan 2016 | A1 |
20160014461 | Leech et al. | Jan 2016 | A1 |
20160021412 | Zito | Jan 2016 | A1 |
20160035119 | Lee et al. | Feb 2016 | A1 |
20160036897 | Kim et al. | Feb 2016 | A1 |
20160041702 | Wang | Feb 2016 | A1 |
20160043962 | Kim et al. | Feb 2016 | A1 |
20160066004 | Lieu et al. | Mar 2016 | A1 |
20160066021 | Thomas | Mar 2016 | A1 |
20160066040 | Webster | Mar 2016 | A1 |
20160066049 | Mountain | Mar 2016 | A1 |
20160078526 | Nations et al. | Mar 2016 | A1 |
20160080815 | Ruffini et al. | Mar 2016 | A1 |
20160092042 | Yenigalla et al. | Mar 2016 | A1 |
20160092559 | Lind et al. | Mar 2016 | A1 |
20160096113 | Decoufle | Apr 2016 | A1 |
20160099991 | Lonkar et al. | Apr 2016 | A1 |
20160105540 | Kwon et al. | Apr 2016 | A1 |
20160110064 | Shapira | Apr 2016 | A1 |
20160127783 | Garcia Navarro | May 2016 | A1 |
20160127789 | Roberts et al. | May 2016 | A1 |
20160133230 | Daniels et al. | May 2016 | A1 |
20160142783 | Bagga et al. | May 2016 | A1 |
20160146935 | Lee et al. | May 2016 | A1 |
20160165307 | Lavender et al. | Jun 2016 | A1 |
20160188902 | Jin | Jun 2016 | A1 |
20160191639 | Dai et al. | Jun 2016 | A1 |
20160192017 | Tirpak | Jun 2016 | A1 |
20160231885 | Lee et al. | Aug 2016 | A1 |
20160249105 | Carney Landow | Aug 2016 | A1 |
20160255379 | Langan et al. | Sep 2016 | A1 |
20160277785 | Newman et al. | Sep 2016 | A1 |
20160334935 | Jeon | Nov 2016 | A1 |
20160345070 | Beeson et al. | Nov 2016 | A1 |
20160357305 | Wells et al. | Dec 2016 | A1 |
20160357352 | Matas et al. | Dec 2016 | A1 |
20160357355 | Carrigan et al. | Dec 2016 | A1 |
20160357366 | Migos | Dec 2016 | A1 |
20160370982 | Penha | Dec 2016 | A1 |
20170003879 | Tamai et al. | Jan 2017 | A1 |
20170010846 | Bernstein | Jan 2017 | A1 |
20170010847 | Bernstein | Jan 2017 | A1 |
20170013295 | Wertheimer et al. | Jan 2017 | A1 |
20170024587 | Nonogaki et al. | Jan 2017 | A1 |
20170046039 | Karunamuni et al. | Feb 2017 | A1 |
20170046339 | Bhat et al. | Feb 2017 | A1 |
20170068402 | Lochhead et al. | Mar 2017 | A1 |
20170068511 | Brown et al. | Mar 2017 | A1 |
20170094360 | Keighran | Mar 2017 | A1 |
20170097969 | Stein et al. | Apr 2017 | A1 |
20170115867 | Bargmann | Apr 2017 | A1 |
20170124594 | Naiga et al. | May 2017 | A1 |
20170132659 | Dirks et al. | May 2017 | A1 |
20170132829 | Blas et al. | May 2017 | A1 |
20170134778 | Christie et al. | May 2017 | A1 |
20170140748 | Roberts et al. | May 2017 | A1 |
20170188116 | Major et al. | Jun 2017 | A1 |
20170192642 | Fishman | Jul 2017 | A1 |
20170195736 | Chai | Jul 2017 | A1 |
20170201618 | Schmidt | Jul 2017 | A1 |
20170201850 | Raleigh et al. | Jul 2017 | A1 |
20170214975 | Schmidt et al. | Jul 2017 | A1 |
20170220228 | Sang et al. | Aug 2017 | A1 |
20170242913 | Tijssen et al. | Aug 2017 | A1 |
20170243471 | Banfi | Aug 2017 | A1 |
20170245017 | Chaudhri | Aug 2017 | A1 |
20170251257 | Obrien | Aug 2017 | A1 |
20170300151 | Lue-sang et al. | Oct 2017 | A1 |
20170339443 | Lue-sang et al. | Nov 2017 | A1 |
20170344553 | Evnine et al. | Nov 2017 | A1 |
20170345040 | Pirnack et al. | Nov 2017 | A1 |
20170353603 | Grunewald et al. | Dec 2017 | A1 |
20170357387 | Clarke | Dec 2017 | A1 |
20170359722 | Folse et al. | Dec 2017 | A1 |
20170364246 | Van Os et al. | Dec 2017 | A1 |
20180011580 | Lebowitz et al. | Jan 2018 | A1 |
20180041814 | Christie et al. | Feb 2018 | A1 |
20180053094 | Patel et al. | Feb 2018 | A1 |
20180059872 | Iida | Mar 2018 | A1 |
20180063591 | Newman et al. | Mar 2018 | A1 |
20180070121 | Zimmerman et al. | Mar 2018 | A1 |
20180070138 | Chai et al. | Mar 2018 | A1 |
20180107353 | Lee | Apr 2018 | A1 |
20180113579 | Johnston et al. | Apr 2018 | A1 |
20180130097 | Tran et al. | May 2018 | A1 |
20180136800 | Johnston et al. | May 2018 | A1 |
20180146377 | Folse et al. | May 2018 | A1 |
20180157368 | Park et al. | Jun 2018 | A1 |
20180189076 | Liston et al. | Jul 2018 | A1 |
20180253900 | Finding et al. | Sep 2018 | A1 |
20180260070 | Mun et al. | Sep 2018 | A1 |
20180275855 | Van Os et al. | Sep 2018 | A1 |
20180293210 | Xue et al. | Oct 2018 | A1 |
20180293771 | Piemonte et al. | Oct 2018 | A1 |
20180295403 | Christie | Oct 2018 | A1 |
20180302680 | Cormican | Oct 2018 | A1 |
20180343497 | Brown et al. | Nov 2018 | A1 |
20180349509 | Abou Mahmoud et al. | Dec 2018 | A1 |
20180367834 | Carpenter et al. | Dec 2018 | A1 |
20190012048 | Johnston et al. | Jan 2019 | A1 |
20190020925 | Christie et al. | Jan 2019 | A1 |
20190028769 | Jeon et al. | Jan 2019 | A1 |
20190045271 | Christie et al. | Feb 2019 | A1 |
20190052744 | Jung et al. | Feb 2019 | A1 |
20190058921 | Christie et al. | Feb 2019 | A1 |
20190064998 | Chowdhury et al. | Feb 2019 | A1 |
20190066672 | Wood et al. | Feb 2019 | A1 |
20190073104 | Wang | Mar 2019 | A1 |
20190073680 | Knox | Mar 2019 | A1 |
20190129588 | Johnston et al. | May 2019 | A1 |
20190138163 | Howland | May 2019 | A1 |
20190141399 | Auxer et al. | May 2019 | A1 |
20190246060 | Tanabe et al. | Aug 2019 | A1 |
20190258373 | Davydov et al. | Aug 2019 | A1 |
20190272853 | Moore | Sep 2019 | A1 |
20190324614 | Brillon et al. | Oct 2019 | A1 |
20190324640 | Park et al. | Oct 2019 | A1 |
20190342616 | Domm et al. | Nov 2019 | A1 |
20190354264 | Van Os et al. | Nov 2019 | A1 |
20190373320 | Balsamo | Dec 2019 | A1 |
20200034792 | Rogers et al. | Jan 2020 | A1 |
20200068274 | Aher et al. | Feb 2020 | A1 |
20200084488 | Christie et al. | Mar 2020 | A1 |
20200099985 | Keighran et al. | Mar 2020 | A1 |
20200104021 | Bylenok et al. | Apr 2020 | A1 |
20200133631 | Christie et al. | Apr 2020 | A1 |
20200137175 | Ganci et al. | Apr 2020 | A1 |
20200257415 | Clarke | Aug 2020 | A1 |
20200272666 | Van Os et al. | Aug 2020 | A1 |
20200301575 | Lindholm | Sep 2020 | A1 |
20200304863 | Domm et al. | Sep 2020 | A1 |
20200304876 | Cielak et al. | Sep 2020 | A1 |
20200304879 | Ellingford | Sep 2020 | A1 |
20200304880 | Diaz Delgado et al. | Sep 2020 | A1 |
20200363934 | Van Os et al. | Nov 2020 | A1 |
20200374595 | Yang et al. | Nov 2020 | A1 |
20200380029 | Chen | Dec 2020 | A1 |
20200382845 | Payne | Dec 2020 | A1 |
20200396507 | Balsamo | Dec 2020 | A1 |
20210021903 | Christie et al. | Jan 2021 | A1 |
20210168424 | Sharma | Jun 2021 | A1 |
20210181901 | Johnston et al. | Jun 2021 | A1 |
20210195277 | Thurlow et al. | Jun 2021 | A1 |
20210223925 | Bylenok et al. | Jul 2021 | A1 |
20210286454 | Beaumier et al. | Sep 2021 | A1 |
20210306711 | Ellingford et al. | Sep 2021 | A1 |
20210337280 | Diaz Delgado et al. | Oct 2021 | A1 |
20210345004 | Christie et al. | Nov 2021 | A1 |
20210365134 | Beaumier et al. | Nov 2021 | A1 |
20210397306 | Rajam et al. | Dec 2021 | A1 |
20210406995 | Peters et al. | Dec 2021 | A1 |
20220132215 | Venugopal et al. | Apr 2022 | A1 |
20220179526 | Schöberl | Jun 2022 | A1 |
20220244824 | Cielak | Aug 2022 | A1 |
20220321940 | Christie et al. | Oct 2022 | A1 |
20220329891 | Christie et al. | Oct 2022 | A1 |
20220337914 | Christie et al. | Oct 2022 | A1 |
20220360858 | Christie et al. | Nov 2022 | A1 |
20220413796 | Christie et al. | Dec 2022 | A1 |
20230022781 | Lindholm et al. | Jan 2023 | A1 |
20230033604 | Diaz Delgado et al. | Feb 2023 | A1 |
20230096458 | Van Os et al. | Mar 2023 | A1 |
20230127228 | Clarke | Apr 2023 | A1 |
20230132595 | Van Os et al. | May 2023 | A1 |
20230300415 | Balsamo | Sep 2023 | A1 |
20230328327 | Cielak et al. | Oct 2023 | A1 |
Number | Date | Country |
---|---|---|
2009255409 | Jul 2012 | AU |
2016100476 | May 2016 | AU |
2017101431 | Nov 2017 | AU |
2018100810 | Jul 2018 | AU |
1295419 | May 2001 | CN |
1391765 | Jan 2003 | CN |
1985277 | Jun 2007 | CN |
101160932 | Apr 2008 | CN |
101228570 | Jul 2008 | CN |
101317149 | Dec 2008 | CN |
101370104 | Feb 2009 | CN |
101405679 | Apr 2009 | CN |
101436110 | May 2009 | CN |
101465993 | Jun 2009 | CN |
101529437 | Sep 2009 | CN |
101641662 | Feb 2010 | CN |
101699505 | Apr 2010 | CN |
101706704 | May 2010 | CN |
101719125 | Jun 2010 | CN |
101860447 | Oct 2010 | CN |
102098537 | Jun 2011 | CN |
102103460 | Jun 2011 | CN |
102187338 | Sep 2011 | CN |
102265586 | Nov 2011 | CN |
102325144 | Jan 2012 | CN |
102819715 | Dec 2012 | CN |
102859484 | Jan 2013 | CN |
102880404 | Jan 2013 | CN |
102890615 | Jan 2013 | CN |
102955653 | Mar 2013 | CN |
102981695 | Mar 2013 | CN |
103037265 | Apr 2013 | CN |
103177738 | Jun 2013 | CN |
103399967 | Nov 2013 | CN |
103516933 | Jan 2014 | CN |
103546816 | Jan 2014 | CN |
103562848 | Feb 2014 | CN |
103562947 | Feb 2014 | CN |
103620531 | Mar 2014 | CN |
103620541 | Mar 2014 | CN |
103620639 | Mar 2014 | CN |
103686418 | Mar 2014 | CN |
103985045 | Aug 2014 | CN |
103999017 | Aug 2014 | CN |
104508618 | Apr 2015 | CN |
104822098 | Aug 2015 | CN |
105190590 | Dec 2015 | CN |
105247526 | Jan 2016 | CN |
105264479 | Jan 2016 | CN |
105303372 | Feb 2016 | CN |
105308634 | Feb 2016 | CN |
105308923 | Feb 2016 | CN |
105336350 | Feb 2016 | CN |
105657554 | Jun 2016 | CN |
105812849 | Jul 2016 | CN |
105828098 | Aug 2016 | CN |
105955520 | Sep 2016 | CN |
105955607 | Sep 2016 | CN |
105989085 | Oct 2016 | CN |
105992068 | Oct 2016 | CN |
106101982 | Nov 2016 | CN |
108292190 | Jul 2018 | CN |
109313651 | Feb 2019 | CN |
202016003233 | Aug 2016 | DE |
0608708 | Aug 1994 | EP |
0624853 | Nov 1994 | EP |
2386984 | Nov 2011 | EP |
2453667 | May 2012 | EP |
2535844 | Dec 2012 | EP |
2574089 | Mar 2013 | EP |
2605203 | Jun 2013 | EP |
2642402 | Sep 2013 | EP |
2672703 | Dec 2013 | EP |
2704032 | Mar 2014 | EP |
2725531 | Apr 2014 | EP |
2879398 | Jun 2015 | EP |
2000-112977 | Apr 2000 | JP |
2000-163031 | Jun 2000 | JP |
2001-197445 | Jul 2001 | JP |
2002-27381 | Jan 2002 | JP |
2002-342033 | Nov 2002 | JP |
2003-99452 | Apr 2003 | JP |
2003-534737 | Nov 2003 | JP |
2004-62237 | Feb 2004 | JP |
2006-31219 | Feb 2006 | JP |
2007-124465 | May 2007 | JP |
2007-512640 | May 2007 | JP |
2007-140910 | Jun 2007 | JP |
2007-294068 | Nov 2007 | JP |
2008-71112 | Mar 2008 | JP |
2008-135911 | Jun 2008 | JP |
2009-60328 | Mar 2009 | JP |
2009-206957 | Sep 2009 | JP |
2009-260947 | Nov 2009 | JP |
2010-28437 | Feb 2010 | JP |
2010-56595 | Mar 2010 | JP |
2010-509684 | Mar 2010 | JP |
2010-114733 | May 2010 | JP |
2011-512701 | Apr 2011 | JP |
2011-123750 | Jun 2011 | JP |
2011-154455 | Aug 2011 | JP |
2011-182146 | Sep 2011 | JP |
2011-205562 | Oct 2011 | JP |
2011-257930 | Dec 2011 | JP |
2012-95123 | May 2012 | JP |
2012-123685 | Jun 2012 | JP |
2012-208622 | Oct 2012 | JP |
2013-8369 | Jan 2013 | JP |
2013-12021 | Jan 2013 | JP |
2013-223150 | Oct 2013 | JP |
2013-235523 | Nov 2013 | JP |
2014-81740 | May 2014 | JP |
2014-102660 | Jun 2014 | JP |
2015-50655 | Mar 2015 | JP |
2015-70404 | Apr 2015 | JP |
2001-0005939 | Jan 2001 | KR |
2001-0035356 | May 2001 | KR |
10-2002-0010151 | Feb 2002 | KR |
10-2007-0114329 | Dec 2007 | KR |
10-2009-0106104 | Oct 2009 | KR |
10-2010-0039194 | Apr 2010 | KR |
10-2011-0036408 | Apr 2011 | KR |
10-2011-0061811 | Jun 2011 | KR |
10-2012-0076682 | Jul 2012 | KR |
10-2012-0124445 | Nov 2012 | KR |
10-2013-0014712 | Feb 2013 | KR |
10-2013-0058034 | Jun 2013 | KR |
10-2013-0137969 | Dec 2013 | KR |
10-2014-0041939 | Apr 2014 | KR |
10-2019-0033658 | Mar 2019 | KR |
10-2022-0041231 | Mar 2022 | KR |
200622893 | Jul 2006 | TW |
200719204 | May 2007 | TW |
201337717 | Sep 2013 | TW |
201349049 | Dec 2013 | TW |
201351261 | Dec 2013 | TW |
1994009438 | Apr 1994 | WO |
1999040728 | Aug 1999 | WO |
2004063862 | Jul 2004 | WO |
2004102285 | Nov 2004 | WO |
2005050652 | Jun 2005 | WO |
2005109345 | Nov 2005 | WO |
2007078623 | Jul 2007 | WO |
2008005135 | Jan 2008 | WO |
2008060486 | May 2008 | WO |
2009016607 | Feb 2009 | WO |
2009039786 | Apr 2009 | WO |
2009148781 | Dec 2009 | WO |
2010022570 | Mar 2010 | WO |
2010025168 | Mar 2010 | WO |
2010118690 | Oct 2010 | WO |
2011095693 | Aug 2011 | WO |
2011158475 | Dec 2011 | WO |
2012012446 | Jan 2012 | WO |
2012061760 | May 2012 | WO |
2012088665 | Jul 2012 | WO |
2013000741 | Jan 2013 | WO |
2013149128 | Oct 2013 | WO |
2013169849 | Nov 2013 | WO |
2013169877 | Nov 2013 | WO |
2013187370 | Dec 2013 | WO |
2013149128 | Feb 2014 | WO |
2014105276 | Jul 2014 | WO |
2014144908 | Sep 2014 | WO |
2014177929 | Nov 2014 | WO |
2014200730 | Dec 2014 | WO |
2015200227 | Dec 2015 | WO |
2015200228 | Dec 2015 | WO |
2015200537 | Dec 2015 | WO |
2016030437 | Mar 2016 | WO |
2016048308 | Mar 2016 | WO |
2016048310 | Mar 2016 | WO |
2016111065 | Jul 2016 | WO |
2017008079 | Jan 2017 | WO |
2017124116 | Jul 2017 | WO |
2017200923 | Nov 2017 | WO |
2017218104 | Dec 2017 | WO |
2018081157 | May 2018 | WO |
Entry |
---|
Advisory Action received for U.S. Appl. No. 15/167,801, dated Feb. 16, 2018, 4 pages. |
Applicant Initiated Interview Summary received for U.S. Appl. No. 15/167,801, dated Apr. 23, 2018, 3 pages. |
Applicant Initiated Interview Summary received for U.S. Appl. No. 15/167,801, dated Jul. 29, 2019, 3 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 14/242,575, dated Dec. 15, 2016, 7 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 14/242,575, dated Nov. 16, 2016, 7 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 14/255,664, dated Aug. 29, 2017, 4 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 14/267,671, dated Nov. 29, 2018, 3 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 14/749,288, dated Sep. 21, 2017, 5 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 15/276,633, dated Sep. 10, 2019, 7 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 15/695,880, dated Jun. 11, 2018, 6 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 15/714,904, dated Sep. 7, 2018, 5 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 16/010,280, dated Aug. 6, 2019, 2 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 16/036,810, dated Nov. 19, 2018, 6 pages. |
Examiner Initiated Interview Summary received for U.S. Appl. No. 15/390,377, dated Oct. 30, 2017, 2 pages. |
Examiner's Answer to Appeal Brief received for U.S. Appl. No. 15/876,715, dated Aug. 18, 2020, 16 pages. |
Extended European Search Report received for European Patent Application No. 17813728.7, dated Feb. 11, 2019, 8 pages. |
Final Office Action received for U.S. Appl. No. 14/255,664, dated Oct. 17, 2016, 16 pages. |
Final Office Action received for U.S. Appl. No. 14/267,671, dated May 23, 2018, 17 pages. |
Final Office Action received for U.S. Appl. No. 14/267,671, dated Oct. 26, 2016, 21 pages. |
Final Office Action received for U.S. Appl. No. 14/271,179, dated Dec. 15, 2016, 10 pages. |
Final Office Action received for U.S. Appl. No. 14/271,179, dated Jun. 20, 2019, 15 pages. |
Final Office Action received for U.S. Appl. No. 14/271,179, dated Jun. 21, 2018, 14 pages. |
Final Office Action received for U.S. Appl. No. 14/746,095, dated Jul. 16, 2018, 33 pages. |
Final Office Action received for U.S. Appl. No. 14/746,662, dated Apr. 24, 2017, 8 pages. |
Final Office Action received for U.S. Appl. No. 14/746,662, dated Jun. 27, 2017, 9 pages. |
Final Office Action received for U.S. Appl. No. 15/167,801, dated Apr. 5, 2019, 18 pages. |
Final Office Action received for U.S. Appl. No. 15/167,801, dated May 28, 2020, 17 pages. |
Final Office Action received for U.S. Appl. No. 15/167,801, dated Nov. 29, 2017, 12 pages. |
Final Office Action received for U.S. Appl. No. 15/235,000, dated Dec. 19, 2018, 33 pages. |
Final Office Action received for U.S. Appl. No. 15/235,000, dated Mar. 13, 2018, 31 pages. |
Final Office Action received for U.S. Appl. No. 15/272,393, dated Mar. 25, 2019, 54 pages. |
Final Office Action received for U.S. Appl. No. 15/272,397, dated Mar. 7, 2017, 23 pages. |
Final Office Action received for U.S. Appl. No. 15/276,633, dated Jul. 26, 2017, 15 pages. |
Final Office Action received for U.S. Appl. No. 15/276,633, dated Oct. 29, 2018, 12 pages. |
Final Office Action received for U.S. Appl. No. 15/390,377, dated Nov. 9, 2017, 18 pages. |
Final Office Action received for U.S. Appl. No. 15/507,229, dated Jul. 15, 2020, 20 pages. |
Final Office Action received for U.S. Appl. No. 15/507,229, dated Sep. 18, 2019, 15 pages. |
Final Office Action received for U.S. Appl. No. 15/719,404, dated Aug. 8, 2019, 19 pages. |
Final Office Action received for U.S. Appl. No. 15/876,715, dated Nov. 5, 2018, 15 pages. |
Final Office Action received for U.S. Appl. No. 16/108,519, dated Dec. 12, 2019, 10 pages. |
Final Office Action received for U.S. Appl. No. 16/126,962, dated Apr. 8, 2020, 20 pages. |
Final Office Action received for U.S. Appl. No. 16/136,005, dated Mar. 9, 2020, 9 pages. |
Final Office Action received for U.S. Appl. No. 16/144,077, dated Jul. 12, 2019, 22 pages. |
Final Office Action received for U.S. Appl. No. 16/584,790, dated May 27, 2020, 27 pages. |
International Search Report received for PCT Patent Application No. PCT/US2014/057272, dated May 28, 2015, 4 pages. |
International Search Report received for PCT Patent Application No. PCT/US2014/057280, dated May 27, 2015, 4 pages. |
International Search Report received for PCT Patent Application No. PCT/US2015/037027, dated Sep. 28, 2015, 3 pages. |
International Search Report received for PCT Patent Application No. PCT/US2015/037030, dated Dec. 10, 2015, 7 pages. |
International Search Report received for PCT Patent Application No. PCT/US2015/037520, dated Mar. 7, 2016, 6 pages. |
International Search Report received for PCT Patent Application No. PCT/US2017/029448, dated Jul. 13, 2017, 3 pages. |
International Search Report received for PCT Patent Application No. PCT/US2017/031764, dated Aug. 7, 2017, 2 pages. |
International Search Report received for PCT Patent Application No. PCT/US2017/058132, dated Mar. 27, 2018, 6 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/208,099, dated Jun. 25, 2015, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/242,575, dated Mar. 21, 2016, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/255,664, dated Apr. 1, 2016, 15 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/262,435, dated Feb. 22, 2016, 22 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/267,671, dated Apr. 1, 2016, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/267,671, dated Dec. 1, 2017, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/267,671, dated May 26, 2017, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/271,179, dated May 29, 2015, 25 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/271,179, dated Oct. 5, 2018, 15 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/271,179, dated Sep. 21, 2017, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/746,095, dated Dec. 1, 2017, 34 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/746,095, dated Jul. 25, 2019, 33 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/746,620, dated Jan. 11, 2017, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/746,662, dated Aug. 9, 2016, 8 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/749,288, dated Oct. 12, 2016, 11 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/167,801, dated Mar. 24, 2017, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/167,801, dated Aug. 30, 2018, 15 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/167,801, dated Sep. 26, 2019, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/224,370, dated Oct. 3, 2017, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/235,000, dated Jul. 14, 2017, 31 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/235,000, dated Jul. 25, 2018, 31 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/235,000, dated Jun. 26, 2019, 31 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/272,393, dated Oct. 2, 2018, 52 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/272,397, dated Nov. 22, 2016, 20 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/276,633, dated Feb. 23, 2018, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/276,633, dated Mar. 5, 2019, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/276,633, dated Nov. 17, 2016, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/390,377, dated Apr. 5, 2017, 17 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/414,493, dated Oct. 6, 2017, 15 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/507,229, dated Feb. 27, 2020, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/507,229, dated Jun. 3, 2019, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/674,992, dated May 11, 2018, 8 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/719,404, dated Dec. 14, 2018, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/798,092, dated Dec. 20, 2017, 20 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/876,715, dated Jun. 4, 2018, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/876,715, dated Sep. 10, 2019, 13 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/990,327, dated Jul. 31, 2018, 8 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/010,280, dated Mar. 7, 2019, 5 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/108,519, dated Aug. 2, 2019, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/108,519, dated May 8, 2020, 11 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/126,962, dated Aug. 25, 2020, 22 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/126,962, dated Sep. 3, 2019, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/136,005, dated Sep. 9, 2020, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/136,005, dated Sep. 18, 2019, 9 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/142,635, dated Jun. 8, 2020, 19 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/144,077, dated Feb. 19, 2019, 24 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/144,077, dated Nov. 27, 2019, 40 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/233,990, dated Jun. 18, 2020, 17 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/392,467, dated Sep. 27, 2019, 5 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/584,790, dated Dec. 26, 2019, 24 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/682,443, dated Sep. 23, 2020, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/697,090, dated Jul. 6, 2020, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/865,172, dated Aug. 20, 2020, 19 pages. |
Notice of Allowance received for U.S. Appl. No. 14/208,099, dated Feb. 3, 2016, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 14/242,575, dated Oct. 27, 2016, 11 pages. |
Notice of Allowance received for U.S. Appl. No. 14/255,664, dated May 5, 2017, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 14/262,435, dated Aug. 16, 2016, 6 pages. |
Notice of Allowance received for U.S. Appl. No. 14/267,671, dated Sep. 19, 2018, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 14/746,095, dated Dec. 31, 2019, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 14/746,620, dated Sep. 25, 2017, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 14/746,662, dated Sep. 25, 2017, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 14/749,288, dated May 25, 2017, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 15/272,393, dated Jan. 15, 2020, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 15/272,393, dated Sep. 18, 2019, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 15/272,397, dated Oct. 18, 2017, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 15/276,633, dated Aug. 26, 2019, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 15/390,377, dated Jul. 2, 2018, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 15/414,493, dated Mar. 14, 2018, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 15/674,992, dated Oct. 1, 2018, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 15/695,880, dated Feb. 28, 2018, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 15/695,880, dated Oct. 18, 2017, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 15/714,904, dated May 22, 2018, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 15/798,092, dated Jun. 7, 2018, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 15/798,092, dated Oct. 9, 2018, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 15/833,618, dated Mar. 14, 2018, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 15/990,327, dated Jan. 11, 2019, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 16/010,280, dated Jul. 29, 2019, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 16/036,810, dated Oct. 31, 2018, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 16/144,077, dated May 8, 2020, 15 pages. |
Notice of Allowance received for U.S. Appl. No. 16/392,467, dated Mar. 23, 2020, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,942, dated Oct. 5, 2020, 10 pages. |
Restriction Requirement received for U.S. Appl. No. 14/208,099, dated Feb. 24, 2015, 5 pages. |
Search Report received for Chinese Patent Application No. 201580028382.1, dated Oct. 12, 2018, 5 pages (2 pages of English Translation & 3 pages of Official copy). |
Search Report received for Danish Patent Application No. PA 201670581, dated Apr. 4, 2017, 2 pages. |
Search Report received for Danish Patent Application No. PA 201670581, dated Feb. 5, 2018, 1 page. |
Search Report received for Danish Patent Application No. PA 201670581, dated Nov. 3, 2016, 1 page. |
Search Report received for Danish Patent Application No. PA 201870354, dated Sep. 26, 2018, 4 pages. |
Search Report received for Danish Patent Application No. PA201670582, dated Feb. 9, 2017, 1 page. |
Search Report received for Danish Patent Application No. PA201670582, dated Mar. 6, 2018, 2 pages. |
Search Report received for Danish Patent Application No. PA201670582, dated Oct. 28, 2016, 4 pages. |
Search Report received for Danish Patent Application No. PA201770200, completed on Jul. 12, 2017, 4 pages. |
Search Report received for Taiwanese Patent Application No. 104120369, dated Aug. 8, 2016, 2 pages (1 page of official copy & 1 page of English translation). |
Search Report received for Taiwanese Patent Application No. 104120385, dated Nov. 25, 2016, 2 pages (1 page of official copy & 1 page of English translation). |
Supplemental Notice of Allowance received for U.S. Appl. No. 15/798,092, dated Jan. 9, 2019, 2 pages. |
Akhtar Iyaz, “Movies Anywhere: Everything You Need to Know”, Available online at: <https://www.cnet.com/how-to/movies-anywhere-ultraviolet-movies-locker-streaming-redeem-faq/>, 2017, 8 pages. |
Alvarez Edgar, “Sling TV Redesign Makes It Easy to Find Your Favorite Content”, Engadget, Available online at: <https://www.engadget.com/2016/01/05/sling-tv-major-redesign/>, May 1, 2016, pp. 1-12. |
Bishop Bryan, “Netflix Introduces One Unified TV Interface to Rule them All”, The Verge, Available online at: <https://www.theverge.com/2013/11/13/5098224/netflix-introduces-one-unified-tv-interface-to-rule-them-all>, Nov. 13, 2013, 3 pages. |
Bohn Dieter, “Rebooting WebOS: How LG Rethought the Smart TV”, The Verge, Available online at: <http://www.theverge.com/2014/1/6/5279220/rebooting-webos-how-lg-rethought-the-smart-tv>, Jan. 6, 2014, 5 pages. |
episodecalendar.com, "Keep track of your favorite TV shows!—TV Episode Calendar", Available Online at: <https://web.archive.org/web/20140517060612/https://episodecalendar.com/>, May 17, 2014, 6 pages. |
Fingas Roger, “Walmart's Vudu to get Native Apple TV”, AppleInsider, 2017, pp. 1-4. |
Grey Melissa, “Comcast's New X2 Platform Moves your DVR Recordings from the Box to the Cloud”, Engadget, Available online at: <http://www.engadget.com/2013/06/11/comcast-x2-platform/>, Jun. 11, 2013, 15 pages. |
International Standard—ISO, "Ergonomic Requirements for Office Work with Visual Display Terminals (VDTs)", Part 13: User Guidance, Zurich, CH, vol. 9241-13, XP001525163, Section 10, Jul. 15, 1998, 40 pages. |
Lee et al., “A Multi-Touch Three Dimensional Touch-Sensitive Tablet”, CHI'85 Proceedings, Apr. 1985, pp. 21-25. |
Ng Gary, “New Netflix User Interface Coming This Month, First Redesign in Four Years”, iPhone in Canada, Available online at: <https://www.iphoneincanada.ca/news/new-netflix-user-interface/>, Jun. 1, 2015, 3 pages. |
Panzarino Matthew, “Apple Announces Voice Activated Siri Assistant Feature for iOS 5, Integrates Wolfram Alpha and Wikipedia”, Available online at: <www.thenextweb.com>, Oct. 4, 2011, pp. 1-6. |
Pierce David, “Got Hulu and Netflix? You Need an App to Search It All”, Wired, Available online at: <https://www.wired.com/2016/03/got-hulu-netflix-need-app-search/>, Mar. 10, 2016, pp. 1-4. |
Rubine Dean, “Combining Gestures and Direct Manipulation”, CHI'92, May 3-7, 1992, pp. 659-660. |
Rubine Dean H., “The Automatic Recognition of Gestures”, CMU-CS-91-202, Submitted in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Computer Science at Carnegie Mellon University, Dec. 1991, 285 pages. |
Westerman Wayne, “Hand Tracking, Finger Identification, and Chordic Manipulation on a Multi-Touch Surface”, A Dissertation Submitted to the Faculty of the University of Delaware in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Electrical Engineering, 1999, 363 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/719,404, dated Oct. 16, 2020, 18 pages. |
Cover Flow—Wikipedia, Available online at: <https://en.wikipedia.org/w/index.php?t%20itle=Cover%20Flow&oldid=879285208>, Jan. 20, 2019, 3 pages. |
Extended European Search Report received for European Patent Application No. 20190698.9, dated Oct. 30, 2020, 6 pages. |
Final Office Action received for U.S. Appl. No. 16/108,519, dated Nov. 25, 2020, 12 pages. |
Final Office Action received for U.S. Appl. No. 16/142,635, dated Feb. 3, 2021, 23 pages. |
Final Office Action received for U.S. Appl. No. 16/233,990, dated Jan. 11, 2021, 17 pages. |
Final Office Action received for U.S. Appl. No. 16/697,090, dated Jan. 27, 2021, 18 pages. |
Final Office Action received for U.S. Appl. No. 16/865,172, dated Feb. 12, 2021, 29 pages. |
International Search Report received for PCT Patent Application No. PCT/US2019/034921, dated Nov. 19, 2019, 5 pages. |
International Search Report received for PCT Patent Application No. PCT/US2020/024452, dated Aug. 6, 2020, 6 pages. |
International Search Report received for PCT Patent Application No. PCT/US2020/024485, dated Aug. 3, 2020, 6 pages. |
International Search Report received for PCT Patent Application No. PCT/US2020/024486, dated Aug. 11, 2020, 6 pages. |
International Search Report received for PCT Patent Application No. PCT/US2020/024492, dated Aug. 10, 2020, 6 pages. |
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2019/034921, dated Sep. 24, 2019, 12 pages. |
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2020/024452, dated Jun. 15, 2020, 13 pages. |
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2020/024485, dated Jun. 8, 2020, 11 pages. |
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2020/024486, dated Jun. 3, 2020, 11 pages. |
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2020/024492, dated Jun. 8, 2020, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/167,801, dated Dec. 11, 2020, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/584,790, dated Dec. 23, 2020, 30 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/827,918, dated Dec. 10, 2020, 28 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/065,387, dated Jan. 28, 2021, 28 pages. |
Notice of Allowance received for U.S. Appl. No. 16/136,005, dated Feb. 24, 2021, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,942, dated Jan. 22, 2021, 5 pages. |
Supplemental Notice of Allowability received for U.S. Appl. No. 16/827,942, dated Nov. 4, 2020, 3 pages. |
Extended European Search Report received for European Patent Application No. 20199219.5, dated Apr. 22, 2021, 8 pages. |
Final Office Action received for U.S. Appl. No. 15/719,404, dated Mar. 30, 2021, 19 pages. |
Final Office Action received for U.S. Appl. No. 16/175,565, dated Nov. 12, 2020, 40 pages. |
Final Office Action received for U.S. Appl. No. 16/222,619, dated Jul. 27, 2020, 11 pages. |
Final Office Action received for U.S. Appl. No. 16/584,790, dated Jun. 15, 2021, 30 pages. |
Final Office Action received for U.S. Appl. No. 16/682,443, dated Mar. 9, 2021, 9 pages. |
Final Office Action received for U.S. Appl. No. 16/827,918, dated Jul. 8, 2021, 31 pages. |
International Search Report received for PCT Patent Application No. PCT/US2020/035423, dated Oct. 13, 2020, 4 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/108,519, dated Apr. 5, 2021, 13 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/142,635, dated Jun. 11, 2021, 23 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/222,619, dated Mar. 19, 2020, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/233,990, dated Jul. 9, 2021, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/697,090, dated Aug. 3, 2021, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/827,910, dated Jun. 17, 2021, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/827,931, dated Mar. 3, 2021, 24 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/865,172 dated Jun. 29, 2021, 29 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/872,274, dated Jul. 9, 2021, 19 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/888,453, dated Jun. 4, 2021, 37 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/888,478, dated Feb. 8, 2021, 24 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/945,724, dated Jul. 19, 2021, 8 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/065,387, dated Jun. 1, 2021, 25 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/133,550, dated Jun. 8, 2021, 23 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/175,565, dated Mar. 4, 2020, 36 pages. |
Notice of Allowance received for U.S. Appl. No. 16/136,005, dated Jun. 9, 2021, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 16/222,619, dated Nov. 20, 2020, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 16/726,179, dated Jun. 17, 2021, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,942, dated Apr. 28, 2021, 5 pages. |
Patent Board Decision received for U.S. Appl. No. 15/876,715, dated Aug. 3, 2021, 8 pages. |
Search Report received for Chinese Patent Application No. 201780033590.X, dated Mar. 24, 2021, 4 pages (2 pages of English Translation and 2 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 201910469185.3, dated Feb. 23, 2021, 6 pages (3 pages of English Translation and 3 pages of Official Copy). |
Supplemental Notice of Allowability received for U.S. Appl. No. 16/222,619, dated Mar. 8, 2021, 3 pages. |
Cheredar, Tom, “Verizon's Viewdini lets you watch Netflix, Comcast, & Hulu videos from a single app”, venturebeat.com, May 22, 2012, 6 pages. |
Kaijser, Martijn, “Mimic skin for Kodi 15.x: Installation and showcase”, Time 2:23-2:28, Available online at: <https://www.youtube.com/watch?v=RGfpbUWVkgQ&t=143s>, Aug. 3, 2015, 1 page. |
Li, Xiaoshan, “CNTV, Hulu, BBC iPlayer Comparative Study on User Interface of Three Network TV Stations”, Modern Communication (Journal of Communication University of China), Issue 11, Nov. 5, 2010, pp. 156-158. See attached Communication 37 CFR § 1.98(a) (3). |
Corrected Notice of Allowability received for U.S. Appl. No. 16/108,519, mailed on Dec. 22, 2021, 3 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 15/876,715, mailed on Oct. 20, 2021, 2 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 16/827,931, mailed on Dec. 6, 2021, 4 pages. |
Final Office Action received for U.S. Appl. No. 16/872,274, mailed on Dec. 23, 2021, 20 pages. |
Final Office Action received for U.S. Appl. No. 16/888,478, mailed on Nov. 15, 2021, 27 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/167,801, mailed on Sep. 3, 2021, 17 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/719,404, mailed on Nov. 26, 2021, 19 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/175,565, mailed on Sep. 20, 2021, 33 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/000,112, mailed on Dec. 7, 2021, 15 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/210,352, mailed on Oct. 18, 2021, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/353,527, mailed on Oct. 5, 2021, 14 pages. |
Notice of Allowance received for U.S. Appl. No. 15/876,715, mailed on Oct. 14, 2021, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 16/108,519, mailed on Sep. 21, 2021, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 16/142,635, mailed on Nov. 10, 2021, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 16/682,443, mailed on Aug. 20, 2021, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 16/682,443, mailed on Nov. 17, 2021, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 16/726,179, mailed on Sep. 30, 2021, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,931, mailed on Jan. 5, 2022, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,931, mailed on Sep. 15, 2021, 11 pages. |
Notice of Allowance received for U.S. Appl. No. 16/865,172, mailed on Dec. 16, 2021, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 16/945,724, mailed on Dec. 20, 2021, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 17/065,387, mailed on Dec. 1, 2021, 10 pages. |
Search Report received for Chinese Patent Application No. 201680050096.X, mailed on Jan. 10, 2022, 2 pages (Official Copy Only). See attached Communication 37 CFR § 1.98(a)(3). |
Search Report received for Chinese Patent Application No. 201910587972.8, mailed on Jan. 4, 2022, 4 pages (2 pages of English Translation and 2 pages of Official Copy). |
Applicant Initiated Interview Summary received for U.S. Appl. No. 17/210,352, mailed on Feb. 28, 2022, 4 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 15/876,715, mailed on Apr. 11, 2022, 4 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 15/876,715, mailed on Apr. 19, 2022, 4 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 16/142,635, mailed on Mar. 10, 2022, 2 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 16/233,990, mailed on Mar. 8, 2022, 4 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 17/065,387, mailed on Mar. 30, 2022, 2 pages. |
Final Office Action received for U.S. Appl. No. 16/888,453, mailed on Apr. 8, 2022, 39 pages. |
Final Office Action received for U.S. Appl. No. 16/175,565, mailed on May 27, 2022, 33 pages. |
Final Office Action received for U.S. Appl. No. 16/697,090, mailed on Feb. 23, 2022, 25 pages. |
Final Office Action received for U.S. Appl. No. 16/827,910, mailed on Feb. 28, 2022, 17 pages. |
Final Office Action received for U.S. Appl. No. 17/133,550, mailed on Feb. 11, 2022, 18 pages. |
Final Office Action received for U.S. Appl. No. 17/210,352, mailed on Jun. 3, 2022, 21 pages. |
Final Office Action received for U.S. Appl. No. 17/353,527, mailed on May 11, 2022, 17 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/167,801, mailed on May 18, 2022, 17 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/584,790, mailed on Feb. 1, 2022, 33 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/888,478, mailed on May 2, 2022, 29 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/379,785, mailed on Mar. 30, 2022, 18 pages. |
Notice of Allowance received for U.S. Appl. No. 15/876,715, mailed on Apr. 4, 2022, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 16/233,990, mailed on Feb. 22, 2022, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 16/233,990, mailed on May 26, 2022, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,918, mailed on Feb. 7, 2022, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,931, mailed on Apr. 19, 2022, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 16/865,172, mailed on Apr. 13, 2022, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 16/872,274, mailed on Apr. 19, 2022, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 16/945,724, mailed on Apr. 4, 2022, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 17/000,112, mailed on Jun. 3, 2022, 14 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 16/872,274, mailed on Aug. 12, 2022, 5 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 17/000,112, mailed on Jun. 17, 2022, 2 pages. |
Extended European Search Report received for European Patent Application No. 22167405.4, mailed on Jul. 4, 2022, 11 pages. |
Final Office Action received for U.S. Appl. No. 16/584,790, mailed on Jun. 14, 2022, 37 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/697,090, mailed on Jul. 7, 2022, 25 pages. |
Notice of Allowance received for U.S. Appl. No. 15/719,404, mailed on Jul. 13, 2022, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 15/876,715, mailed on Aug. 3, 2022, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,918, mailed on Jun. 8, 2022, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 16/945,724, mailed on Jul. 20, 2022, 8 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 16/233,990, mailed on Oct. 20, 2022, 2 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 16/945,724, mailed on Aug. 31, 2022, 2 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/827,910, mailed on Sep. 14, 2022, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/133,550, mailed on Sep. 9, 2022, 23 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/457,901, mailed on Apr. 28, 2022, 24 pages. |
Notice of Allowance received for U.S. Appl. No. 16/233,990, mailed on Oct. 5, 2022, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 16/865,172, mailed on Aug. 25, 2022, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 17/000,112, mailed on Oct. 18, 2022, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 17/654,578, mailed on Oct. 25, 2022, 8 pages. |
Final Office Action received for U.S. Appl. No. 16/697,090, mailed on Dec. 14, 2022, 28 pages. |
Final Office Action received for U.S. Appl. No. 16/827,910, mailed on Mar. 15, 2023, 18 pages. |
Final Office Action received for U.S. Appl. No. 16/888,478, mailed on Feb. 13, 2023, 27 pages. |
Final Office Action received for U.S. Appl. No. 17/133,550, mailed on Feb. 15, 2023, 22 pages. |
Final Office Action received for U.S. Appl. No. 17/379,785, mailed on Oct. 28, 2022, 14 pages. |
Final Office Action received for U.S. Appl. No. 17/586,625, mailed on May 4, 2023, 15 pages. |
Final Office Action received for U.S. Appl. No. 17/660,622, mailed on May 24, 2023, 20 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/167,801, mailed on Feb. 8, 2023, 23 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/719,404, mailed on May 10, 2023, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/175,565, mailed on Feb. 17, 2023, 33 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/353,527, mailed on Dec. 8, 2022, 17 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/379,785, mailed on Mar. 9, 2023, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/586,625, mailed on Sep. 1, 2022, 13 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/651,731, mailed on Apr. 25, 2023, 9 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/656,610, mailed on Feb. 6, 2023, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/660,622, mailed on Dec. 20, 2022, 17 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/937,410, mailed on Mar. 2, 2023, 15 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/937,704, mailed on Mar. 30, 2023, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 18/060,902, mailed on Mar. 10, 2023, 8 pages. |
Notice of Allowability received for U.S. Appl. No. 17/457,901, mailed on Mar. 8, 2023, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 15/719,404, mailed on Nov. 9, 2022, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 16/233,990, mailed on Jan. 31, 2023, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 16/584,790, mailed on Feb. 3, 2023, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 16/888,453, mailed on Jun. 21, 2023, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 16/888,453, mailed on Mar. 1, 2023, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 17/210,352, mailed on Dec. 5, 2022, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 17/210,352, mailed on Mar. 16, 2023, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 17/367,227, mailed on Mar. 23, 2023, 12 pages. |
Notice of Allowance received for U.S. Appl. No. 17/457,901, mailed on Nov. 16, 2022, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 17/654,578, mailed on Feb. 15, 2023, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 17/654,578, mailed on Jun. 13, 2023, 7 pages. |
Search Report received for Chinese Patent Application No. 201780066823.6, mailed on Nov. 1, 2022, 4 pages (2 pages of English Translation and 2 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 201811143102.3, mailed on Nov. 22, 2022, 5 pages (2 pages of English Translation and 3 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 201911313480.6, mailed on Jan. 20, 2023, 4 pages (2 pages of English Translation and 2 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 201911313496.7, mailed on Jan. 20, 2023, 4 pages (2 pages of English Translation and 2 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 201911313497.1, mailed on Apr. 11, 2023, 5 pages (2 pages of English Translation and 3 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 201911313497.1, mailed on Dec. 14, 2022, 3 pages (1 page of English Translation and 2 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 202010011436.6, mailed on Dec. 15, 2022, 9 pages (4 pages of English Translation and 5 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 202010662190.9, mailed on Apr. 28, 2023, 5 pages (2 pages of English Translation and 3 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 202010662206.6, mailed on Apr. 28, 2023, 5 pages (2 pages of English Translation and 3 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 202010662994.9, mailed on Apr. 28, 2023, 5 pages (2 pages of English Translation and 3 pages of Official Copy). |
Search Report received for European Patent Application No. 20718506.7, mailed on Mar. 21, 2023, 2 pages. |
Anonymous, “Video Progress Bar—YouTube Help”, Retrieved from the Internet: <URL:https://web.archive.org/web/20190317001501/https://support.google.com/youtube/answer/7174115?hl=en>, [retrieved on Mar. 22, 2023], Mar. 17, 2019, 2 pages. |
Apple, “The control is all yours”, Available online at: <https://www.apple.com.cn/privacy/control/>, [Retrieved Dec. 29, 2022], Nov. 30, 2022, 12 pages. See attached Communication 37 CFR § 1.98(a)(3). |
Beer et al., “The Odds Of Running A Nonlinear TV Program Using Web Technologies”, IEEE International Symposium on Broadband Multimedia Systems and Broadcasting (BMSB), 2011, 4 pages. |
Biao et al., “Research on UI Optimization of Chinese Network Television Stations”, Southeast Communications, 2013, 4 pages. See attached Communication 37 CFR § 1.98(a)(3). |
Budhraja et al., “Probability Based Playlist Generation Based on Music Similarity and User Customization”, National Conference On Computing And Communication Systems, 2012, 5 pages. |
Cheng, Luo, “The Designing of Dynamic Play-list Based on Flash Streaming Media Technology”, Computer and Telecommunication, 2008, 3 pages. See attached Communication 37 CFR § 1.98(a)(3). |
Drews et al., “Virtual Jukebox—Reviving a Classic”, Proceedings of the 35th Hawaii International Conference on System Sciences, 2002, 7 pages. |
Jin et al., “Pricing Sponsored Content in Wireless Networks with Multiple Content Providers”, The Fourth IEEE Workshop on Smart Data Pricing 2015, 2015, pp. 668-673. |
Kimbler, Kristofer, “App Store Strategies for Service Providers”, 2010 4th International Conference on Intelligence in Next Generation Networks, Nov. 18, 2010, 5 pages. |
Liu, Chang, “Functions and Design of Multi-Screen Playing System in TV Variety Studio”, Modern TV Technology, 2013, 5 pages. See attached Communication 37 CFR § 1.98(a)(3). |
Meng et al., “Role Authorization Based Web Service Access Control Model”, Journal of Lanzhou University (Natural Science Edition), vol. 42, No. 2, 2007, pp. 84-88. See attached Communication 37 CFR § 1.98(a)(3). |
Tinari, George, “What's New in the Netflix Redesign and How to Use It”, Retrieved from the Internet: <https://web.archive.org/web/20161110092133/https://www.guidingtech.com/48443/netflix-redesign-overview/>, [retrieved on Mar. 22, 2023], Nov. 10, 2016, 9 pages. |
Wang et al., “Authorization Management Mechanism of Web Application System”, Network and Information Technology, vol. 25, No. 11, 2006, 3 pages. See attached Communication 37 CFR § 1.98(a)(3). |
Zhang et al., “Music Playlist Prediction Via Detecting Song Moods”, IEEE China Summit and International Conference on Signal and Information Processing, 2013, pp. 174-178. |
Examiner's Answer to Appeal Brief received for U.S. Appl. No. 16/175,565, mailed on Dec. 15, 2023, 27 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/133,550, mailed on Dec. 18, 2023, 25 pages. |
Notice of Allowance received for U.S. Appl. No. 15/719,404, mailed on Dec. 8, 2023, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,910, mailed on Dec. 13, 2023, 19 pages. |
Advisory Action received for U.S. Appl. No. 18/060,902, mailed on Nov. 13, 2023, 2 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 16/888,453, mailed on Jul. 26, 2023, 5 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 16/888,478, mailed on Oct. 31, 2023, 6 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 17/210,352, mailed on Sep. 20, 2023, 5 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 17/367,227, mailed on Jul. 27, 2023, 2 pages. |
Examiner's Answer to Appeal Brief received for U.S. Appl. No. 16/697,090, mailed on Oct. 26, 2023, 10 pages. |
Final Office Action received for U.S. Appl. No. 15/167,801, mailed on Sep. 19, 2023, 19 pages. |
Final Office Action received for U.S. Appl. No. 17/379,785, mailed on Aug. 23, 2023, 13 pages. |
Final Office Action received for U.S. Appl. No. 17/937,410, mailed on Aug. 3, 2023, 15 pages. |
Final Office Action received for U.S. Appl. No. 17/937,704, mailed on Aug. 31, 2023, 18 pages. |
Final Office Action received for U.S. Appl. No. 18/060,902, mailed on Aug. 25, 2023, 8 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/656,610, mailed on Jul. 26, 2023, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/657,913, mailed on Jul. 21, 2023, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 18/060,902, mailed on Dec. 1, 2023, 9 pages. |
Non-Final Office Action received for U.S. Appl. No. 18/146,336, mailed on Aug. 3, 2023, 23 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,910, mailed on Aug. 3, 2023, 21 pages. |
Notice of Allowance received for U.S. Appl. No. 16/888,478, mailed on Aug. 2, 2023, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 17/353,527, mailed on Jul. 21, 2023, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 17/353,527, mailed on Oct. 4, 2023, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 17/586,625, mailed on Oct. 26, 2023, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 17/651,731, mailed on Oct. 3, 2023, 5 pages. |
Search Report received for Chinese Patent Application No. 201811143102.3, mailed on Nov. 2, 2023, 5 pages (3 pages of English Translation and 2 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 202010011436.6, mailed on Aug. 30, 2023, 4 pages (2 pages of English Translation and 2 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 202010662994.9, mailed on Sep. 28, 2023, 3 pages (1 page of English Translation and 2 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 202110201931.8, mailed on Oct. 16, 2023, 3 pages (1 page of English Translation and 2 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 202210799020.4, mailed on Jul. 27, 2023, 5 pages (1 page of English Translation and 4 pages of Official Copy). |
Cai, Chongshan, “Analysis of Copyright Infringement Problems of Video Aggregation App”, China Copyright, vol. 02, [retrieved on Oct. 6, 2023], Available online at: <http://www.cqvip.com/qk/81889a/2015002/90716681504849534850485048.html>, Apr. 15, 2015, 2 pages (1 page of English Translation and 1 page of Official Copy). |
Chen et al., “What a Juke! A Collaborative Music Sharing System”, IEEE, 2012, 6 pages. |
Cunningham et al., “An Ethnographic Study of Music Information Seeking: Implications for the Design of a Music Digital Library”, IEEE, 2003, 13 pages. |
Number | Date | Country
---|---|---
20200301567 A1 | Sep 2020 | US
Number | Date | Country
---|---|---
62822966 | Mar 2019 | US
62855867 | May 2019 | US