This relates generally to user interfaces for navigating and displaying content items.
User interfaces for navigating and displaying content items on electronic devices are often cluttered and confusing for users. Further, bringing up the user interface after viewing video, for example, can be a jarring and unintuitive experience.
Many electronic devices have graphical user interfaces that allow a user to navigate through numerous content items. There is a need to provide a fast, efficient, and convenient way for users to navigate through and select content items for consumption (e.g., viewing, listening, etc.). The embodiments described below provide a fast, efficient, and convenient way for users to navigate through and select content items for consumption using a column user interface.
For a better understanding of the various described embodiments, reference should be made to the Detailed Description below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.
In the following description of examples, reference is made to the accompanying drawings, which form a part hereof and in which are shown, by way of illustration, specific examples that can be practiced. It is to be understood that other examples can be used and structural changes can be made without departing from the scope of the disclosed examples.
Embodiments of electronic devices, user interfaces for such devices, and associated processes for using such devices are described. In some embodiments, the device is a portable communications device, such as a mobile telephone, that also contains other functions, such as PDA and/or music player functions. Exemplary embodiments of portable multifunction devices include, without limitation, the iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, California. Other portable electronic devices, such as laptops or tablet computers with touch-sensitive surfaces (e.g., touch screen displays and/or touch pads), are, optionally, used. It should also be understood that, in some embodiments, the device is not a portable communications device, but is a desktop computer or a television with a touch-sensitive surface (e.g., a touch screen display and/or a touch pad). In some embodiments, the device does not have a touch screen display and/or a touch pad, but rather is capable of outputting display information (such as the user interfaces of the disclosure) for display on a separate display device, and capable of receiving input information from a separate input device having one or more input mechanisms (such as one or more buttons, a touch screen display and/or a touch pad). In some embodiments, the device has a display, but is capable of receiving input information from a separate input device having one or more input mechanisms (such as one or more buttons, a touch screen display and/or a touch pad).
In the discussion that follows, an electronic device that includes a display and a touch-sensitive surface is described. It should be understood, however, that the electronic device optionally includes one or more other physical user-interface devices, such as a physical keyboard, a mouse and/or a joystick. Further, as described above, it should be understood that the described electronic device, display and touch-sensitive surface are optionally distributed amongst two or more devices. Therefore, as used in this disclosure, information displayed on the electronic device or by the electronic device is optionally used to describe information outputted by the electronic device for display on a separate display device (touch-sensitive or not). Similarly, as used in this disclosure, input received on the electronic device (e.g., touch input received on a touch-sensitive surface of the electronic device) is optionally used to describe input received on a separate input device, from which the electronic device receives input information.
The device typically supports a variety of applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disk authoring application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a workout support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, a television channel browsing application, and/or a digital video player application.
The various applications that are executed on the device optionally use at least one common physical user-interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface as well as corresponding information displayed on the device are, optionally, adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture (such as the touch-sensitive surface) of the device optionally supports the variety of applications with user interfaces that are intuitive and transparent to the user.
Attention is now directed toward embodiments of portable or non-portable devices with touch-sensitive displays, though the devices need not include touch-sensitive displays or displays in general, as described above.
As used in the specification and claims, the term “intensity” of a contact on a touch-sensitive surface refers to the force or pressure (force per unit area) of a contact (e.g., a finger contact) on the touch-sensitive surface, or to a substitute (proxy) for the force or pressure of a contact on the touch-sensitive surface. The intensity of a contact has a range of values that includes at least four distinct values and more typically includes hundreds of distinct values (e.g., at least 256). Intensity of a contact is, optionally, determined (or measured) using various approaches and various sensors or combinations of sensors. For example, one or more force sensors underneath or adjacent to the touch-sensitive surface are, optionally, used to measure force at various points on the touch-sensitive surface. In some implementations, force measurements from multiple force sensors are combined (e.g., a weighted average) to determine an estimated force of a contact. Similarly, a pressure-sensitive tip of a stylus is, optionally, used to determine a pressure of the stylus on the touch-sensitive surface. Alternatively, the size of the contact area detected on the touch-sensitive surface and/or changes thereto, the capacitance of the touch-sensitive surface proximate to the contact and/or changes thereto, and/or the resistance of the touch-sensitive surface proximate to the contact and/or changes thereto are, optionally, used as a substitute for the force or pressure of the contact on the touch-sensitive surface. In some implementations, the substitute measurements for contact force or pressure are used directly to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is described in units corresponding to the substitute measurements). In some implementations, the substitute measurements for contact force or pressure are converted to an estimated force or pressure and the estimated force or pressure is used to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is a pressure threshold measured in units of pressure).
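The proxy-based intensity determination described above can be sketched in a few lines of code. The following Swift fragment is purely illustrative — the type names and the weighting scheme are assumptions, not from this disclosure. It combines readings from multiple force sensors with a weighted average and compares the estimate against a threshold expressed in the same units as the (substitute) measurement.

```swift
// Hypothetical sketch: combining force-sensor readings into an estimated
// contact intensity and testing it against a threshold, as described above.
struct ForceSensorReading {
    let force: Double   // raw reading from one sensor
    let weight: Double  // weight based on the sensor's proximity to the contact
}

/// Weighted average of several sensor readings (the "combined" estimate).
func estimatedIntensity(of readings: [ForceSensorReading]) -> Double {
    let totalWeight = readings.reduce(0) { $0 + $1.weight }
    guard totalWeight > 0 else { return 0 }
    let weightedSum = readings.reduce(0) { $0 + $1.force * $1.weight }
    return weightedSum / totalWeight
}

/// The threshold is expressed in the same units as the substitute
/// measurement, so no conversion to physical pressure is required.
func exceedsThreshold(_ readings: [ForceSensorReading],
                      threshold: Double) -> Bool {
    estimatedIntensity(of: readings) > threshold
}
```

Because the threshold is stated directly in the substitute units, this sketch corresponds to the first implementation style described above; the alternative converts the estimate to pressure and compares it against a pressure threshold.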
As used in the specification and claims, the term “tactile output” refers to physical displacement of a device relative to a previous position of the device, physical displacement of a component (e.g., a touch-sensitive surface) of a device relative to another component (e.g., housing) of the device, or displacement of the component relative to a center of mass of the device that will be detected by a user with the user's sense of touch. For example, in situations where the device or the component of the device is in contact with a surface of a user that is sensitive to touch (e.g., a finger, palm, or other part of a user's hand), the tactile output generated by the physical displacement will be interpreted by the user as a tactile sensation corresponding to a perceived change in physical characteristics of the device or the component of the device. For example, movement of a touch-sensitive surface (e.g., a touch-sensitive display or trackpad) is, optionally, interpreted by the user as a “down click” or “up click” of a physical actuator button. In some cases, a user will feel a tactile sensation such as a “down click” or “up click” even when there is no movement of a physical actuator button associated with the touch-sensitive surface that is physically pressed (e.g., displaced) by the user's movements. As another example, movement of the touch-sensitive surface is, optionally, interpreted or sensed by the user as “roughness” of the touch-sensitive surface, even when there is no change in smoothness of the touch-sensitive surface. While such interpretations of touch by a user will be subject to the individualized sensory perceptions of the user, there are many sensory perceptions of touch that are common to a large majority of users. Thus, when a tactile output is described as corresponding to a particular sensory perception of a user (e.g., an “up click,” a “down click,” “roughness”), unless otherwise stated, the generated tactile output corresponds to physical displacement of the device or a component thereof that will generate the described sensory perception for a typical (or average) user.
It should be appreciated that device 100 is only one example of a portable or non-portable multifunction device, and that device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of the components. The various components shown are implemented in hardware, software, or a combination of both hardware and software, including one or more signal processing and/or application specific integrated circuits.
Memory 102 optionally includes high-speed random access memory and optionally also includes non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 102 by other components of device 100, such as CPU 120 and the peripherals interface 118, is, optionally, controlled by memory controller 122.
Peripherals interface 118 can be used to couple input and output peripherals of the device to CPU 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for device 100 and to process data.
In some embodiments, peripherals interface 118, CPU 120, and memory controller 122 are, optionally, implemented on a single chip, such as chip 104. In some other embodiments, they are, optionally, implemented on separate chips.
RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals. RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. RF circuitry 108 optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. RF circuitry 108 optionally communicates with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. The wireless communication optionally uses any of a plurality of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), Evolution, Data-Only (EV-DO), HSPA, HSPA+, Dual-Cell HSPA (DC-HSPDA), long term evolution (LTE), near field communication (NFC), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for e-mail (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between a user and device 100. Audio circuitry 110 receives audio data from peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to speaker 111. Speaker 111 converts the electrical signal to human-audible sound waves. Audio circuitry 110 also receives electrical signals converted by microphone 113 from sound waves. Audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to peripherals interface 118 for processing. Audio data is, optionally, retrieved from and/or transmitted to memory 102 and/or RF circuitry 108 by peripherals interface 118. In some embodiments, audio circuitry 110 also includes a headset jack (e.g., 212). The headset jack provides an interface between audio circuitry 110 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (e.g., a headphone for one or both ears) and input (e.g., a microphone).
I/O subsystem 106 couples input/output peripherals on device 100, such as touch screen 112 and other input control devices 116, to peripherals interface 118. I/O subsystem 106 optionally includes display controller 156, optical sensor controller 158, intensity sensor controller 159, haptic feedback controller 161 and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/send electrical signals from/to other input or control devices 116. The other input control devices 116 optionally include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) 160 are, optionally, coupled to any (or none) of the following: a keyboard, infrared port, USB port, and a pointer device such as a mouse. The one or more buttons (e.g., 208) optionally include an up/down button for volume control of speaker 111 and/or microphone 113, and optionally include a push button (e.g., 206).
Touch-sensitive display 112 provides an input interface and an output interface between the device and a user. As described above, the touch-sensitive operation and the display operation of touch-sensitive display 112 are optionally separated from each other, such that a display device is used for display purposes and a touch-sensitive surface (whether display or not) is used for input detection purposes, and the described components and functions are modified accordingly. However, for simplicity, the following description is provided with reference to a touch-sensitive display. Display controller 156 receives and/or sends electrical signals from/to touch screen 112. Touch screen 112 displays visual output to the user. The visual output optionally includes graphics, text, icons, video, and any combination thereof (collectively termed “graphics”). In some embodiments, some or all of the visual output corresponds to user-interface objects.
Touch screen 112 has a touch-sensitive surface, sensor or set of sensors that accepts input from the user based on haptic and/or tactile contact. Touch screen 112 and display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on touch screen 112 and convert the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages or images) that are displayed on touch screen 112. In an exemplary embodiment, a point of contact between touch screen 112 and the user corresponds to a finger of the user.
Touch screen 112 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments. Touch screen 112 and display controller 156 optionally detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch screen 112. In an exemplary embodiment, projected mutual capacitance sensing technology is used, such as that found in the iPhone®, iPod Touch®, and iPad® from Apple Inc. of Cupertino, California.
Touch screen 112 optionally has a video resolution in excess of 100 dpi. In some embodiments, the touch screen has a video resolution of approximately 160 dpi. The user optionally makes contact with touch screen 112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which can be less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user.
In some embodiments, in addition to the touch screen, device 100 optionally includes a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad is, optionally, a touch-sensitive surface that is separate from touch screen 112 or an extension of the touch-sensitive surface formed by the touch screen.
Device 100 also includes power system 162 for powering the various components. Power system 162 optionally includes a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable or non-portable devices.
Device 100 optionally also includes one or more optical sensors 164.
Device 100 optionally also includes one or more contact intensity sensors 165.
Device 100 optionally also includes one or more proximity sensors 166.
Device 100 optionally also includes one or more tactile output generators 167.
Device 100 optionally also includes one or more accelerometers 168.
In some embodiments, the software components stored in memory 102 include operating system 126, communication module (or set of instructions) 128, contact/motion module (or set of instructions) 130, graphics module (or set of instructions) 132, text input module (or set of instructions) 134, Global Positioning System (GPS) module (or set of instructions) 135, and applications (or sets of instructions) 136. Furthermore, in some embodiments, memory 102 stores device/global internal state 157.
Operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
Communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by RF circuitry 108 and/or external port 124. External port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with the 30-pin connector used on iPod (trademark of Apple Inc.) devices.
Contact/motion module 130 optionally detects contact with touch screen 112 (in conjunction with display controller 156) and other touch-sensitive devices (e.g., a touchpad or physical click wheel). Contact/motion module 130 includes various software components for performing various operations related to detection of contact, such as determining if contact has occurred (e.g., detecting a finger-down event), determining an intensity of the contact (e.g., the force or pressure of the contact or a substitute for the force or pressure of the contact), determining if there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining if the contact has ceased (e.g., detecting a finger-up event or a break in contact). Contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact, which is represented by a series of contact data, optionally includes determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations are, optionally, applied to single contacts (e.g., one finger contacts) or to multiple simultaneous contacts (e.g., “multitouch”/multiple finger contacts). In some embodiments, contact/motion module 130 and display controller 156 detect contact on a touchpad.
In some embodiments, contact/motion module 130 uses a set of one or more intensity thresholds to determine whether an operation has been performed by a user (e.g., to determine whether a user has “clicked” on an icon). In some embodiments, at least a subset of the intensity thresholds are determined in accordance with software parameters (e.g., the intensity thresholds are not determined by the activation thresholds of particular physical actuators and can be adjusted without changing the physical hardware of device 100). For example, a mouse “click” threshold of a trackpad or touch screen display can be set to any of a large range of predefined threshold values without changing the trackpad or touch screen display hardware. Additionally, in some implementations a user of the device is provided with software settings for adjusting one or more of the set of intensity thresholds (e.g., by adjusting individual intensity thresholds and/or by adjusting a plurality of intensity thresholds at once with a system-level click “intensity” parameter).
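A minimal sketch of such software-defined thresholds follows; the structure and field names are hypothetical, not from this disclosure. It shows individual thresholds that can be edited directly alongside a single system-level parameter that scales the whole set at once, with no hardware change.

```swift
// Hypothetical sketch: intensity thresholds held as software parameters.
struct IntensitySettings {
    var tapThreshold: Double = 0.2
    var clickThreshold: Double = 0.5
    var deepPressThreshold: Double = 0.8
    /// System-level parameter that adjusts all thresholds at once.
    var systemClickIntensity: Double = 1.0

    /// A threshold as actually applied, after the system-level scaling.
    func effectiveThreshold(_ base: Double) -> Double {
        base * systemClickIntensity
    }
}

var settings = IntensitySettings()
settings.clickThreshold = 0.6            // adjust one threshold individually
settings.systemClickIntensity = 1.25     // or scale the whole set at once
print(settings.effectiveThreshold(settings.clickThreshold))  // 0.75
```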
Contact/motion module 130 optionally detects a gesture input by a user. Different gestures on the touch-sensitive surface have different contact patterns and intensities. Thus, a gesture is, optionally, detected by detecting a particular contact pattern. For example, detecting a finger tap gesture includes detecting a finger-down event followed by detecting a finger-up (lift off) event at the same position (or substantially the same position) as the finger-down event (e.g., at the position of an icon). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event followed by detecting one or more finger-dragging events, and subsequently followed by detecting a finger-up (lift off) event.
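The contact-pattern matching just described might look like the following hypothetical Swift sketch, which classifies a sequence of finger-down, finger-drag, and finger-up events as a tap or a swipe. The event and gesture types are illustrative assumptions.

```swift
import Foundation  // for hypot

// Hypothetical sketch: classifying a gesture from its sub-event pattern.
// A tap is finger-down followed by finger-up at (substantially) the same
// position; a swipe is finger-down, one or more drags, then finger-up.
enum ContactEvent {
    case fingerDown(x: Double, y: Double)
    case fingerDrag(x: Double, y: Double)
    case fingerUp(x: Double, y: Double)
}

enum Gesture { case tap, swipe, unrecognized }

func classify(_ events: [ContactEvent], slop: Double = 10) -> Gesture {
    guard case let .fingerDown(x0, y0)? = events.first,
          case let .fingerUp(x1, y1)? = events.last else { return .unrecognized }
    // "Substantially the same position" modeled as a small slop radius.
    let moved = hypot(x1 - x0, y1 - y0) > slop
    let dragged = events.contains {
        if case .fingerDrag = $0 { return true } else { return false }
    }
    return (dragged || moved) ? .swipe : .tap
}

print(classify([.fingerDown(x: 10, y: 10), .fingerUp(x: 11, y: 10)]))  // tap
print(classify([.fingerDown(x: 10, y: 10), .fingerDrag(x: 60, y: 10),
                .fingerUp(x: 120, y: 10)]))                            // swipe
```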
Graphics module 132 includes various known software components for rendering and displaying graphics on touch screen 112 or other display, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast or other visual property) of graphics that are displayed. As used herein, the term “graphics” includes any object that can be displayed to a user, including without limitation text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations and the like.
In some embodiments, graphics module 132 stores data representing graphics to be used. Each graphic is, optionally, assigned a corresponding code. Graphics module 132 receives, from applications etc., one or more codes specifying graphics to be displayed along with, if necessary, coordinate data and other graphic property data, and then generates screen image data to output to display controller 156.
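A toy version of this code-to-graphic lookup is sketched below; all names are hypothetical. An application supplies a code plus coordinate and property data, and the store resolves it into a draw command destined for the display controller.

```swift
// Hypothetical sketch: a graphics store keyed by code, as described above.
struct Graphic { let name: String }

struct DrawCommand {
    let graphic: Graphic
    let x: Double, y: Double
    let opacity: Double
}

final class GraphicsStore {
    private var graphics: [Int: Graphic] = [:]   // code -> stored graphic

    func register(code: Int, graphic: Graphic) {
        graphics[code] = graphic
    }

    /// Resolve a code (plus coordinate/property data) into a draw command.
    func command(code: Int, x: Double, y: Double,
                 opacity: Double = 1.0) -> DrawCommand? {
        guard let g = graphics[code] else { return nil }
        return DrawCommand(graphic: g, x: x, y: y, opacity: opacity)
    }
}

let store = GraphicsStore()
store.register(code: 42, graphic: Graphic(name: "play-icon"))
if let cmd = store.command(code: 42, x: 100, y: 200) {
    print("draw \(cmd.graphic.name) at (\(cmd.x), \(cmd.y))")
}
```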
Haptic feedback module 133 includes various software components for generating instructions used by tactile output generator(s) 167 to produce tactile outputs at one or more locations on device 100 in response to user interactions with device 100.
Text input module 134, which is, optionally, a component of graphics module 132, provides soft keyboards for entering text in various applications (e.g., contacts 137, e-mail 140, IM 141, browser 147, and any other application that needs text input).
GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to telephone module 138 for use in location-based dialing, to camera module 143 as picture/video metadata, and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets).
Applications 136 optionally include the modules (or sets of instructions) described below, or a subset or superset thereof.
Examples of other applications 136 that are, optionally, stored in memory 102 include other word processing applications, other image editing applications, drawing applications, presentation applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
In conjunction with touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, contacts module 137 is, optionally, used to manage an address book or contact list (e.g., stored in application internal state 192 of contacts module 137 in memory 102 or memory 370), including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), e-mail address(es), physical address(es) or other information with a name; associating an image with a name; categorizing and sorting names; providing telephone numbers or e-mail addresses to initiate and/or facilitate communications by telephone 138, video conference 139, e-mail 140, or IM 141; and so forth.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, telephone module 138 is, optionally, used to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in address book 137, modify a telephone number that has been entered, dial a respective telephone number, conduct a conversation, and disconnect or hang up when the conversation is completed. As noted above, the wireless communication optionally uses any of a plurality of communications standards, protocols and technologies.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, optical sensor 164, optical sensor controller 158, contact module 130, graphics module 132, text input module 134, contact list 137, and telephone module 138, videoconferencing module 139 includes executable instructions to initiate, conduct, and terminate a video conference between a user and one or more other participants in accordance with user instructions.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, e-mail client module 140 includes executable instructions to create, send, receive, and manage e-mail in response to user instructions. In conjunction with image management module 144, e-mail client module 140 makes it very easy to create and send e-mails with still or video images taken with camera module 143.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the instant messaging module 141 includes executable instructions to enter a sequence of characters corresponding to an instant message, to modify previously entered characters, to transmit a respective instant message (for example, using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages or using XMPP, SIMPLE, or IMPS for Internet-based instant messages), to receive instant messages and to view received instant messages. In some embodiments, transmitted and/or received instant messages optionally include graphics, photos, audio files, video files and/or other attachments as are supported in an MMS and/or an Enhanced Messaging Service (EMS). As used herein, “instant messaging” refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, or IMPS).
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, map module 154, and music player module 146, workout support module 142 includes executable instructions to create workouts (e.g., with time, distance, and/or calorie burning goals); communicate with workout sensors (sports devices); receive workout sensor data; calibrate sensors used to monitor a workout; select and play music for a workout; and display, store and transmit workout data.
In conjunction with touch screen 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact module 130, graphics module 132, and image management module 144, camera module 143 includes executable instructions to capture still images or video (including a video stream) and store them into memory 102, modify characteristics of a still image or video, or delete a still image or video from memory 102.
In conjunction with touch screen 112, display controller 156, contact module 130, graphics module 132, text input module 134, and camera module 143, image management module 144 includes executable instructions to arrange, modify (e.g., edit), or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images.
In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, browser module 147 includes executable instructions to browse the Internet in accordance with user instructions, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages.
In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, e-mail client module 140, and browser module 147, calendar module 148 includes executable instructions to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to do lists, etc.) in accordance with user instructions.
In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, widget modules 149 are mini-applications that are, optionally, downloaded and used by a user (e.g., weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, and dictionary widget 149-5) or created by the user (e.g., user-created widget 149-6). In some embodiments, a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! Widgets).
In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, the widget creator module 150 is, optionally, used by a user to create widgets (e.g., turning a user-specified portion of a web page into a widget).
In conjunction with touch screen 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, search module 151 includes executable instructions to search for text, music, sound, image, video, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms) in accordance with user instructions.
In conjunction with touch screen 112, display system controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, and browser module 147, video and music player module 152 includes executable instructions that allow the user to download and play back recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files, and executable instructions to display, present or otherwise play back videos (e.g., on touch screen 112 or on an external, connected display via external port 124). In some embodiments, device 100 optionally includes the functionality of an MP3 player, such as an iPod (trademark of Apple Inc.).
In conjunction with touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, notes module 153 includes executable instructions to create and manage notes, to do lists, and the like in accordance with user instructions.
In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, map module 154 is, optionally, used to receive, display, modify, and store maps and data associated with maps (e.g., driving directions; data on stores and other points of interest at or near a particular location; and other location-based data) in accordance with user instructions.
In conjunction with touch screen 112, display system controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, text input module 134, e-mail client module 140, and browser module 147, online video module 155 includes instructions that allow the user to access, browse, receive (e.g., by streaming and/or download), play back (e.g., on the touch screen or on an external, connected display via external port 124), send an e-mail with a link to a particular online video, and otherwise manage online videos in one or more file formats, such as H.264. In some embodiments, instant messaging module 141, rather than e-mail client module 140, is used to send a link to a particular online video.
Each of the above identified modules and applications corresponds to a set of executable instructions for performing one or more functions described above and the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules are, optionally, combined or otherwise re-arranged in various embodiments. In some embodiments, memory 102 optionally stores a subset of the modules and data structures identified above. Furthermore, memory 102 optionally stores additional modules and data structures not described above.
In some embodiments, device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through a touch screen and/or a touchpad (whether included in device 100 or on a separate device, such as an input device). By using a touch screen and/or a touchpad as the primary input control device for operation of device 100, the number of physical input control devices (such as push buttons, dials, and the like) on device 100 is, optionally, reduced.
The predefined set of functions that are performed exclusively through a touch screen and/or a touchpad optionally include navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates device 100 to a main, home, or root menu from any user interface that is displayed on device 100. In such embodiments, a “menu button” is implemented using a touchpad. In some other embodiments, the menu button is a physical push button or other physical input control device instead of a touchpad.
Event sorter 170 receives event information and determines the application 136-1 and application view 191 of application 136-1 to which to deliver the event information. Event sorter 170 includes event monitor 171 and event dispatcher module 174. In some embodiments, application 136-1 includes application internal state 192, which indicates the current application view(s) displayed on touch-sensitive display 112 when the application is active or executing. In some embodiments, device/global internal state 157 is used by event sorter 170 to determine which application(s) is (are) currently active, and application internal state 192 is used by event sorter 170 to determine application views 191 to which to deliver event information.
In some embodiments, application internal state 192 includes additional information, such as one or more of: resume information to be used when application 136-1 resumes execution, user interface state information that indicates information being displayed or that is ready for display by application 136-1, a state queue for enabling the user to go back to a prior state or view of application 136-1, and a redo/undo queue of previous actions taken by the user.
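Gathered into one hypothetical structure, the application internal state described above might look like the following sketch; the field names and types are illustrative assumptions only.

```swift
// Hypothetical sketch of the kinds of per-application state described above.
struct ApplicationInternalState {
    var resumeInfo: [String: String] = [:]  // used when the application resumes
    var displayedViews: [String] = []       // user interface state information
    var stateQueue: [String] = []           // prior states/views to go back to
    var undoStack: [String] = []            // previous actions taken by the user
    var redoStack: [String] = []
}

var state = ApplicationInternalState()
state.stateQueue.append("inbox-view")   // remember a view the user can return to
```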
Event monitor 171 receives event information from peripherals interface 118. Event information includes information about a sub-event (e.g., a user touch on touch-sensitive display 112, as part of a multi-touch gesture). Peripherals interface 118 transmits information it receives from I/O subsystem 106 or a sensor, such as proximity sensor 166, accelerometer(s) 168, and/or microphone 113 (through audio circuitry 110). Information that peripherals interface 118 receives from I/O subsystem 106 includes information from touch-sensitive display 112 or a touch-sensitive surface.
In some embodiments, event monitor 171 sends requests to the peripherals interface 118 at predetermined intervals. In response, peripherals interface 118 transmits event information. In other embodiments, peripherals interface 118 transmits event information only when there is a significant event (e.g., receiving an input above a predetermined noise threshold and/or for more than a predetermined duration).
In some embodiments, event sorter 170 also includes a hit view determination module 172 and/or an active event recognizer determination module 173.
Hit view determination module 172 provides software procedures for determining where a sub-event has taken place within one or more views, when touch-sensitive display 112 displays more than one view. Views are made up of controls and other elements that a user can see on the display.
Another aspect of the user interface associated with an application is a set of views, sometimes herein called application views or user interface windows, in which information is displayed and touch-based gestures occur. The application views (of a respective application) in which a touch is detected optionally correspond to programmatic levels within a programmatic or view hierarchy of the application. For example, the lowest level view in which a touch is detected is, optionally, called the hit view, and the set of events that are recognized as proper inputs are, optionally, determined based, at least in part, on the hit view of the initial touch that begins a touch-based gesture.
Hit view determination module 172 receives information related to sub-events of a touch-based gesture. When an application has multiple views organized in a hierarchy, hit view determination module 172 identifies a hit view as the lowest view in the hierarchy which should handle the sub-event. In most circumstances, the hit view is the lowest level view in which an initiating sub-event occurs (i.e., the first sub-event in the sequence of sub-events that form an event or potential event). Once the hit view is identified by the hit view determination module, the hit view typically receives all sub-events related to the same touch or input source for which it was identified as the hit view.
Active event recognizer determination module 173 determines which view or views within a view hierarchy should receive a particular sequence of sub-events. In some embodiments, active event recognizer determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, active event recognizer determination module 173 determines that all views that include the physical location of a sub-event are actively involved views, and therefore determines that all actively involved views should receive a particular sequence of sub-events. In other embodiments, even if touch sub-events were entirely confined to the area associated with one particular view, views higher in the hierarchy would still remain as actively involved views.
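The hit view and the set of actively involved views can be illustrated with a small hypothetical view hierarchy. In this Swift sketch (the names and the use of absolute coordinates are assumptions), the hit chain returns every view containing the touch location, outermost first; the last element is the hit view (the lowest view in the hierarchy), and the whole chain corresponds to the actively involved views.

```swift
// Hypothetical sketch: hit testing over a view hierarchy, as described above.
final class View {
    let name: String
    var frame: (x: Double, y: Double, w: Double, h: Double)  // absolute coords
    var subviews: [View] = []

    init(name: String, frame: (x: Double, y: Double, w: Double, h: Double)) {
        self.name = name
        self.frame = frame
    }

    func contains(_ px: Double, _ py: Double) -> Bool {
        px >= frame.x && px < frame.x + frame.w &&
        py >= frame.y && py < frame.y + frame.h
    }

    /// Views containing the point, outermost first; the last element is the
    /// hit view, and the whole chain is the set of actively involved views.
    func hitChain(_ px: Double, _ py: Double) -> [View] {
        guard contains(px, py) else { return [] }
        for sub in subviews {
            let chain = sub.hitChain(px, py)
            if !chain.isEmpty { return [self] + chain }
        }
        return [self]
    }
}

let root = View(name: "root", frame: (0, 0, 800, 600))
let button = View(name: "button", frame: (100, 100, 80, 40))
root.subviews = [button]
print(root.hitChain(120, 110).map { $0.name })  // ["root", "button"]
```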
Event dispatcher module 174 dispatches the event information to an event recognizer (e.g., event recognizer 180). In embodiments including active event recognizer determination module 173, event dispatcher module 174 delivers the event information to an event recognizer determined by active event recognizer determination module 173. In some embodiments, event dispatcher module 174 stores in an event queue the event information, which is retrieved by a respective event receiver module 182.
In some embodiments, operating system 126 includes event sorter 170. Alternatively, application 136-1 includes event sorter 170. In yet other embodiments, event sorter 170 is a stand-alone module, or a part of another module stored in memory 102, such as contact/motion module 130.
In some embodiments, application 136-1 includes a plurality of event handlers 190 and one or more application views 191, each of which includes instructions for handling touch events that occur within a respective view of the application's user interface. Each application view 191 of the application 136-1 includes one or more event recognizers 180. Typically, a respective application view 191 includes a plurality of event recognizers 180. In other embodiments, one or more of event recognizers 180 are part of a separate module, such as a user interface kit (not shown) or a higher level object from which application 136-1 inherits methods and other properties. In some embodiments, a respective event handler 190 includes one or more of: data updater 176, object updater 177, GUI updater 178, and/or event data 179 received from event sorter 170. Event handler 190 optionally utilizes or calls data updater 176, object updater 177 or GUI updater 178 to update the application internal state 192. Alternatively, one or more of the application views 191 includes one or more respective event handlers 190. Also, in some embodiments, one or more of data updater 176, object updater 177, and GUI updater 178 are included in a respective application view 191.
A respective event recognizer 180 receives event information (e.g., event data 179) from event sorter 170, and identifies an event from the event information. Event recognizer 180 includes event receiver 182 and event comparator 184. In some embodiments, event recognizer 180 also includes at least a subset of: metadata 183, and event delivery instructions 188 (which optionally include sub-event delivery instructions).
Event receiver 182 receives event information from event sorter 170. The event information includes information about a sub-event, for example, a touch or a touch movement. Depending on the sub-event, the event information also includes additional information, such as location of the sub-event. When the sub-event concerns motion of a touch, the event information optionally also includes speed and direction of the sub-event. In some embodiments, events include rotation of the device from one orientation to another (e.g., from a portrait orientation to a landscape orientation, or vice versa), and the event information includes corresponding information about the current orientation (also called device attitude) of the device.
Event comparator 184 compares the event information to predefined event or sub-event definitions and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event. In some embodiments, event comparator 184 includes event definitions 186. Event definitions 186 contain definitions of events (e.g., predefined sequences of sub-events), for example, event 1 (187-1), event 2 (187-2), and others. In some embodiments, sub-events in an event 187 include, for example, touch begin, touch end, touch movement, touch cancellation, and multiple touching. In one example, the definition for event 1 (187-1) is a double tap on a displayed object. The double tap, for example, comprises a first touch (touch begin) on the displayed object for a predetermined phase, a first lift-off (touch end) for a predetermined phase, a second touch (touch begin) on the displayed object for a predetermined phase, and a second lift-off (touch end) for a predetermined phase. In another example, the definition for event 2 (187-2) is a dragging on a displayed object. The dragging, for example, comprises a touch (or contact) on the displayed object for a predetermined phase, a movement of the touch across touch-sensitive display 112, and lift-off of the touch (touch end). In some embodiments, the event also includes information for one or more associated event handlers 190.
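The following hypothetical sketch models an event definition as a sequence of sub-events and checks the sub-events received so far against it, omitting the phase and timing constraints for brevity. Once the received sequence stops being a prefix of a definition, the corresponding recognizer would enter the event failed state described below.

```swift
// Hypothetical sketch: event definitions as predefined sub-event sequences.
enum SubEvent { case touchBegin, touchEnd, touchMove, touchCancel }

struct EventDefinition {
    let name: String
    let pattern: [SubEvent]

    /// True while the received sub-events remain a prefix of the pattern;
    /// a complete match means the event is recognized.
    func matches(_ received: [SubEvent]) -> Bool {
        received.count <= pattern.count &&
            zip(received, pattern).allSatisfy { $0.0 == $0.1 }
    }
}

let doubleTap = EventDefinition(
    name: "event 1: double tap",
    pattern: [.touchBegin, .touchEnd, .touchBegin, .touchEnd])

let drag = EventDefinition(
    name: "event 2: drag",
    pattern: [.touchBegin, .touchMove, .touchEnd])

print(doubleTap.matches([.touchBegin, .touchEnd]))   // true: valid prefix so far
print(doubleTap.matches([.touchBegin, .touchMove]))  // false: recognizer fails
```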
In some embodiments, event definition 187 includes a definition of an event for a respective user-interface object. In some embodiments, event comparator 184 performs a hit test to determine which user-interface object is associated with a sub-event. For example, in an application view in which three user-interface objects are displayed on touch-sensitive display 112, when a touch is detected on touch-sensitive display 112, event comparator 184 performs a hit test to determine which of the three user-interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190, the event comparator uses the result of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects an event handler associated with the sub-event and the object triggering the hit test.
In some embodiments, the definition for a respective event 187 also includes delayed actions that delay delivery of the event information until after it has been determined whether the sequence of sub-events does or does not correspond to the event recognizer's event type.
When a respective event recognizer 180 determines that the series of sub-events does not match any of the events in event definitions 186, the respective event recognizer 180 enters an event impossible, event failed, or event ended state, after which it disregards subsequent sub-events of the touch-based gesture. In this situation, other event recognizers, if any, that remain active for the hit view continue to track and process sub-events of an ongoing touch-based gesture.
In some embodiments, a respective event recognizer 180 includes metadata 183 with configurable properties, flags, and/or lists that indicate how the event delivery system should perform sub-event delivery to actively involved event recognizers. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate how event recognizers interact, or are enabled to interact, with one another. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate whether sub-events are delivered to varying levels in the view or programmatic hierarchy.
In some embodiments, a respective event recognizer 180 activates event handler 190 associated with an event when one or more particular sub-events of an event are recognized. In some embodiments, a respective event recognizer 180 delivers event information associated with the event to event handler 190. Activating an event handler 190 is distinct from sending (and deferred sending) sub-events to a respective hit view. In some embodiments, event recognizer 180 throws a flag associated with the recognized event, and event handler 190 associated with the flag catches the flag and performs a predefined process.
In some embodiments, event delivery instructions 188 include sub-event delivery instructions that deliver event information about a sub-event without activating an event handler. Instead, the sub-event delivery instructions deliver event information to event handlers associated with the series of sub-events or to actively involved views. Event handlers associated with the series of sub-events or with actively involved views receive the event information and perform a predetermined process.
In some embodiments, data updater 176 creates and updates data used in application 136-1. For example, data updater 176 updates the telephone number used in contacts module 137, or stores a video file used in video player module 145. In some embodiments, object updater 177 creates and updates objects used in application 136-1. For example, object updater 177 creates a new user-interface object or updates the position of a user-interface object. GUI updater 178 updates the GUI. For example, GUI updater 178 prepares display information and sends it to graphics module 132 for display on a touch-sensitive display.
In some embodiments, event handler(s) 190 includes or has access to data updater 176, object updater 177, and GUI updater 178. In some embodiments, data updater 176, object updater 177, and GUI updater 178 are included in a single module of a respective application 136-1 or application view 191. In other embodiments, they are included in two or more software modules.
It shall be understood that the foregoing discussion regarding event handling of user touches on touch-sensitive displays and/or touchpads also applies to other forms of user inputs to operate multifunction devices 100 with input devices, not all of which are initiated on touch screens. For example, mouse movement and mouse button presses, optionally coordinated with single or multiple keyboard presses or holds; contact movements such as taps, drags, scrolls, etc., on touch-pads; pen stylus inputs; movement of the device; oral instructions; detected eye movements; biometric inputs; and/or any combination thereof are optionally utilized as inputs corresponding to sub-events which define an event to be recognized.
The touch screen 112 optionally displays one or more graphics within user interface (UI) 200. In this embodiment, as well as others described below, a user is enabled to select one or more of the graphics by making a gesture on the graphics, for example, with one or more fingers 202 (not drawn to scale in the figure) or one or more styluses 203 (not drawn to scale in the figure). In some embodiments, selection of one or more graphics occurs when the user breaks contact with the one or more graphics. In some embodiments, the gesture optionally includes one or more taps, one or more swipes (from left to right, right to left, upward and/or downward) and/or a rolling of a finger (from right to left, left to right, upward and/or downward) that has made contact with device 100. In some implementations or circumstances, inadvertent contact with a graphic does not select the graphic. For example, a swipe gesture that sweeps over an application icon optionally does not select the corresponding application when the gesture corresponding to selection is a tap.
Device 100 optionally also includes one or more physical buttons, such as “home” or menu button 204. As described previously, menu button 204 is, optionally, used to navigate to any application 136 in a set of applications that are, optionally, executed on device 100. Alternatively, in some embodiments, the menu button is implemented as a soft key in a GUI displayed on touch screen 112.
In one embodiment, device 100 includes touch screen 112, menu button 204, push button 206 for powering the device on/off and locking the device, volume adjustment button(s) 208, Subscriber Identity Module (SIM) card slot 210, headset jack 212, and docking/charging external port 124. Push button 206 is, optionally, used to turn the power on/off on the device by depressing the button and holding the button in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process. In an alternative embodiment, device 100 also accepts verbal input for activation or deactivation of some functions through microphone 113. Device 100 also, optionally, includes one or more contact intensity sensors 165 for detecting intensity of contacts on touch screen 112 and/or one or more tactile output generators 167 for generating tactile outputs for a user of device 100.
Each of the above identified elements is, optionally, stored in one or more of the previously mentioned memory devices, and corresponds to a set of instructions for performing a function described above.
Although some of the examples which follow will be given with reference to inputs on touch screen display 112 (where the touch-sensitive surface and the display are combined), in some embodiments, the device detects inputs on a touch-sensitive surface that is separate from the display.
Additionally, while the following examples are given primarily with reference to finger inputs (e.g., finger contacts, finger tap gestures, finger swipe gestures), it should be understood that, in some embodiments, one or more of the finger inputs are replaced with input from another input device (e.g., mouse-based input or stylus input). For example, a swipe gesture is, optionally, replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact). As another example, a tap gesture is, optionally, replaced with a mouse click while the cursor is located over the location of the tap gesture (e.g., instead of detection of the contact followed by ceasing to detect the contact). Similarly, when multiple user inputs are simultaneously detected, it should be understood that multiple computer mice are, optionally, used simultaneously, or a mouse and finger contacts are, optionally, used simultaneously.
As used herein, the term “focus selector” refers to an input element that indicates a current part of a user interface with which a user is interacting. In some implementations that include a cursor or other location marker, the cursor acts as a “focus selector,” so that when an input (e.g., a press input) is detected on a touch-sensitive surface (e.g., touchpad 355) while the cursor is over a particular user interface element (e.g., a button, window, slider, or other user interface element), the particular user interface element is adjusted in accordance with the detected input.
In some embodiments described herein, one or more operations are performed in response to detecting a gesture that includes a respective press input or in response to detecting the respective press input performed with a respective contact (or a plurality of contacts), where the respective press input is detected based at least in part on detecting an increase in intensity of the contact (or plurality of contacts) above a press-input intensity threshold. In some embodiments, the respective operation is performed in response to detecting the increase in intensity of the respective contact above the press-input intensity threshold (e.g., a “down stroke” of the respective press input). In some embodiments, the press input includes an increase in intensity of the respective contact above the press-input intensity threshold and a subsequent decrease in intensity of the contact below the press-input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in intensity of the respective contact below the press-input threshold (e.g., an “up stroke” of the respective press input).
In some embodiments, the device employs intensity hysteresis to avoid accidental inputs sometimes termed “jitter,” where the device defines or selects a hysteresis intensity threshold with a predefined relationship to the press-input intensity threshold (e.g., the hysteresis intensity threshold is X intensity units lower than the press-input intensity threshold, or the hysteresis intensity threshold is 75%, 90%, or some reasonable proportion of the press-input intensity threshold). Thus, in some embodiments, the press input includes an increase in intensity of the respective contact above the press-input intensity threshold and a subsequent decrease in intensity of the contact below the hysteresis intensity threshold that corresponds to the press-input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in intensity of the respective contact below the hysteresis intensity threshold (e.g., an “up stroke” of the respective press input). Similarly, in some embodiments, the press input is detected only when the device detects an increase in intensity of the contact from an intensity at or below the hysteresis intensity threshold to an intensity at or above the press-input intensity threshold and, optionally, a subsequent decrease in intensity of the contact to an intensity at or below the hysteresis intensity threshold, and the respective operation is performed in response to detecting the press input (e.g., the increase in intensity of the contact or the decrease in intensity of the contact, depending on the circumstances).
For ease of explanation, operations described as performed in response to a press input associated with a press-input intensity threshold, or in response to a gesture including the press input, are, optionally, triggered in response to detecting any of the following: an increase in intensity of a contact above the press-input intensity threshold, an increase in intensity of a contact from an intensity below the hysteresis intensity threshold to an intensity above the press-input intensity threshold, a decrease in intensity of the contact below the press-input intensity threshold, and/or a decrease in intensity of the contact below the hysteresis intensity threshold corresponding to the press-input intensity threshold. Additionally, in examples where an operation is described as being performed in response to detecting a decrease in intensity of a contact below the press-input intensity threshold, the operation is, optionally, performed in response to detecting a decrease in intensity of the contact below a hysteresis intensity threshold corresponding to, and lower than, the press-input intensity threshold.
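By way of illustration only, and not as part of the claimed embodiments, the hysteresis logic described above can be sketched in Swift as a small state machine. The PressDetector name, the threshold values, and the intensity units below are hypothetical and do not appear in this disclosure.

// A minimal sketch of press-input detection with intensity hysteresis.
struct PressDetector {
    let pressThreshold = 1.0
    // The hysteresis threshold sits a fixed proportion below the press
    // threshold (e.g., 75% of the press-input intensity threshold).
    var hysteresisThreshold: Double { pressThreshold * 0.75 }

    private(set) var isPressed = false

    // Returns "down" on a down stroke, "up" on an up stroke, nil otherwise.
    mutating func update(intensity: Double) -> String? {
        if !isPressed && intensity >= pressThreshold {
            isPressed = true
            return "down"  // increase above the press-input intensity threshold
        }
        if isPressed && intensity <= hysteresisThreshold {
            isPressed = false
            return "up"    // decrease below the hysteresis threshold, not merely
                           // below the press threshold, which filters out jitter
        }
        return nil
    }
}

var detector = PressDetector()
for sample in [0.2, 0.9, 1.1, 0.95, 0.8, 0.7] {
    if let stroke = detector.update(intensity: sample) {
        print("\(stroke) stroke at intensity \(sample)")
    }
}

Note that the sample at 0.95 falls below the press threshold but above the hysteresis threshold, so no "up stroke" is reported; this is precisely the jitter case the hysteresis is designed to absorb.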
In some embodiments, display controller 508 causes the various user interfaces of the disclosure to be displayed on display 500. Further, input to device 500 is optionally provided by remote 510 via remote interface 512, which is optionally a wireless or a wired connection. It is understood that the embodiment of
Many electronic devices have graphical user interfaces that allow a user to navigate through numerous content items. There is a need to provide a fast, efficient, and convenient way for users to navigate through and select content items for consumption (e.g., viewing, listening, etc.). The embodiments described below provide a fast, efficient, and convenient way for users to navigate through and select content items for consumption using a column user interface.
For example, the menu items in the “On Now” column 6002-1 each correspond to video content that is currently airing live, and a user can view the corresponding video content by selecting one of the menu items. Menu items corresponding to live video content, such as menu items 6004-1, 6004-2, 6004-3, and 6004-4, are optionally represented in the user interface by live video. The menu items in the “Watch List” column 6002-2 each correspond to video content that is available to view on-demand, and a user can view the corresponding on-demand video content by selecting one of the menu items in the “Watch List” column. Menu items corresponding to on-demand video content, such as menu items 6006-1, 6006-2, 6006-3, 6006-4, and 6006-5, are optionally represented in the user interface by one or more still images and/or text. Further, the text optionally indicates a number of unwatched episodes of a content series corresponding to the menu item. For example, menu item 6006-1 indicates that the user has three unwatched episodes of Game of Thrones in the watch list.
In
In some embodiments, a background color of the user interface optionally changes based on the menu item that is currently highlighted. For example, the logo of “Game of Thrones” contains the color brown, and thus the background of the user interface may be brown when the “Game of Thrones” menu item 6006-1 is highlighted by the item focus indicator 6008. The logo of “Mad Men” contains the color red, and thus the background of the user interface may be red when the “Mad Men” menu item 6006-2 is highlighted by the item focus indicator 6008.
In
In some embodiments, one or more columns optionally have multiple menu items on a row, as illustrated in
In
In some embodiments, a plurality of the menu items in a column in a first set of columns are optionally grouped together in a single column in a second set of columns. For example,
As described below, the method 700 provides ways in which a device can display a column user interface. The method reduces the cognitive burden on a user when interacting with a user interface on the device by providing an intuitive user interface for selecting content items, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, increasing the efficiency of the user's interaction with the user interfaces conserves power and increases the time between battery charges.
In some embodiments, an electronic device (e.g., a set top box or other user interface generating device that is in communication with a display device) with one or more processors and memory provides (702), for display on a display device (e.g., a separate display such as a television or computer monitor, or an integrated display that is part of the electronic device), a user interface including a first set of columns. Two or more columns in the first set of columns each include two or more menu items (e.g., text, images, and/or video content, some or all of which may optionally include links to content, applications, and/or other user interfaces). For example,
In some embodiments, the first set of columns are optionally configured (704) to move together when scrolled in a first direction (e.g., a horizontal direction) and to move separately when scrolled in a second direction (e.g., a vertical direction) that is different from the first direction (e.g., when scrolling horizontally, all the columns are scrolled, and when scrolling vertically, only a selected column is scrolled; and when scrolling horizontally, the onscreen horizontal position of the selected column may be proportional to the relative position of the selected column among the total plurality of columns, including those that are offscreen). For example,
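As an illustrative, non-limiting sketch of this two-axis rule, the following Swift snippet scrolls every column for the horizontal component of an input but only the selected column for the vertical component. The ColumnUI and Column names are hypothetical.

// A minimal sketch of the scrolling rule: horizontal input moves all columns
// together, vertical input moves only the currently selected column.
struct Column {
    var verticalOffset = 0.0
}

struct ColumnUI {
    var columns: [Column]
    var selectedIndex: Int
    var horizontalOffset = 0.0

    mutating func scroll(dx: Double, dy: Double) {
        horizontalOffset += dx                       // all columns translate together
        columns[selectedIndex].verticalOffset += dy  // only the selected column scrolls
    }
}

var ui = ColumnUI(columns: [Column(), Column(), Column()], selectedIndex: 1)
ui.scroll(dx: -40, dy: 0)  // horizontal: all three columns shift together
ui.scroll(dx: 0, dy: 25)   // vertical: only column 1 scrolls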
In some embodiments, the electronic device receives (718) a selection input. In response (720) to receiving the selection input, in accordance with a determination that the selection input corresponds to selection of a first column (e.g., an input selecting a heading of the first column, such as a “Watch List” heading for a Watch List column), the electronic device provides (722), for display on the display device, a second set of columns (different from the first set of columns). Two or more of the columns in the second set of columns correspond to different subsets of menu items that were displayed in the first column that was selected in response to the selection input (e.g., each column in the second set of columns corresponds to an item in the user's Watch List) (and one or more additional columns, e.g., each subset is an episode of a television series, and the one or more additional columns include extra content related to the television series). In some embodiments, a first column in the second set of columns optionally corresponds (724) to a first menu item from the first column (and only the first menu item), and a second column in the second set of columns optionally corresponds to a second menu item from the first column that is different from (e.g., distinct from) the first menu item from the first column (and only the second menu item). For example,
In some embodiments, a first column in the second set of columns optionally corresponds (726) to a first plurality of the menu items from the first column, and a second column in the second set of columns optionally corresponds to a second plurality of the menu items from the first column that is different from (e.g., distinct from) the first plurality of menu items from the first column (e.g., a subset for each letter in the alphabet, a subset for each artist in a music collection, a subset for each television series in a Watch List, etc.). For example,
In some embodiments, in response (720) to the selection input, in accordance with a determination that the selection input corresponds to selection of a menu item from the first column (e.g., an input selecting a representation of an episode of a television show from the Watch List column), the electronic device optionally provides (730), for display on the display device, a user interface associated with the menu item that was selected (e.g., content associated with the menu item, a set of columns associated with the menu item, a grid associated with the menu item, a set of rows associated with the menu item, etc.). In some embodiments, the menu item optionally corresponds (732) to a content series (e.g., a television series, a miniseries, a set of webisodes, among other possibilities) that includes a plurality of episodes, and a respective episode of the plurality of episodes is in a watch list (e.g., a list of movies, content series, episodes, music, actors, genres, searches, etc.). For example,
In some embodiments, the electronic device optionally generates (734) one or more links to ancillary content related to the respective episode (e.g., a clip of a talk show featuring an actor from the episode, a clip of a parody of the episode, a song featured in the episode, among other possibilities), and the second set of columns optionally includes the one or more links to ancillary content related to the respective episode. For example,
In some embodiments, the first set of columns are optionally scrolled (736) in a first direction. A selected column display position (e.g., a horizontal position on the display given with respect to the user interface) is optionally determined (738) based on a position of a currently selected column with respect to a total number of the plurality of columns (e.g., the position may be the first column of twelve total columns, or the sixth column of six total columns). The plurality of columns are optionally scrolled (740) such that the currently selected column moves to the selected column display position on the display device (e.g., if there are 100 total columns, including those that are currently offscreen, and the selected column is the 20th column, then the selected column may have a horizontal position that is 20% of the total horizontal display length; similarly, if there are 100 total columns, and the selected column is the 90th column, then the selected column may have a horizontal position that is 90% of the total display length, etc.). For example,
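A minimal sketch of this proportional placement rule, assuming a 1-based column position mapped linearly onto the display width; the function name and the 1920-point width are illustrative assumptions.

func selectedColumnDisplayPosition(selectedIndex: Int,
                                   totalColumns: Int,
                                   displayWidth: Double) -> Double {
    // The selected column's horizontal position tracks its 1-based position
    // among all columns, including those currently offscreen.
    let fraction = Double(selectedIndex + 1) / Double(totalColumns)
    return fraction * displayWidth
}

// With 100 total columns on a 1920-point display, the 20th column lands at
// 20% of the width (384 points) and the 90th at 90% (1728 points).
print(selectedColumnDisplayPosition(selectedIndex: 19, totalColumns: 100, displayWidth: 1920))
print(selectedColumnDisplayPosition(selectedIndex: 89, totalColumns: 100, displayWidth: 1920))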
In some embodiments, second input to advance from the first column to a second column is optionally received (742), and in response to the second input, a visual characteristic of one or both of the first column and the second column is optionally altered (744). In some embodiments, altering the visual characteristic optionally includes increasing (746) a visual emphasis of the second column relative to a visual emphasis of the first column (e.g., by increasing a brightness, contrast, opacity, saturation, or other visual property of the second column). For example,
In some embodiments, second input is optionally received (752) to advance from a first menu item of the first column to a second menu item of the first column. In response, a color associated with a logo of the second menu item is optionally obtained (754), and the background color of the user interface is optionally altered (756) in accordance with the color associated with the logo of the second menu item. For example,
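The following Swift sketch illustrates one plausible realization of steps 752-756, deriving the background color from a color associated with the highlighted item's logo. The RGB type and all names are hypothetical, and a real implementation might additionally darken or desaturate the obtained color before applying it.

struct RGB { var r, g, b: Double }

struct MenuItem {
    var title: String
    var logoColor: RGB  // a color obtained from the item's logo artwork
}

struct ColumnInterface {
    var items: [MenuItem]
    var highlightedIndex = 0
    var backgroundColor = RGB(r: 0, g: 0, b: 0)

    // Advancing the highlight re-derives the background from the new logo,
    // e.g., brown for one logo, red for another.
    mutating func advanceHighlight(to index: Int) {
        highlightedIndex = index
        backgroundColor = items[index].logoColor
    }
}

var interface = ColumnInterface(items: [
    MenuItem(title: "Game of Thrones", logoColor: RGB(r: 0.40, g: 0.26, b: 0.13)), // brown
    MenuItem(title: "Mad Men", logoColor: RGB(r: 0.80, g: 0.10, b: 0.10))          // red
])
interface.advanceHighlight(to: 1)  // the background now follows the red logo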
In some embodiments, the first column optionally includes (728) first and second headings (e.g., sub-headings of the first column). A scrolling input corresponding to the first column is optionally received (758). In response (760) to the scrolling input, the menu items of the first column are optionally scrolled (762), where the first heading remains stationary during the scrolling of the menu items of the first column. The first heading is optionally scrolled (764) off an edge of the user interface, such that the first heading is no longer visible in the user interface after scrolling the first heading off the edge of the user interface. The first heading is optionally replaced (766) with the second heading, such that the second heading remains stationary during scrolling after replacing the first heading. For example,
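A minimal sketch of this pinned-heading behavior, assuming fixed per-section row counts; the Section type, the row counts, and the “Up Next” heading are hypothetical.

struct Section { let heading: String; let rowCount: Int }

// While menu items scroll, the heading of the section containing the first
// visible row stays pinned; once that section scrolls off the edge, the
// next section's heading replaces it.
func pinnedHeading(sections: [Section], firstVisibleRow: Int) -> String {
    var row = firstVisibleRow
    for section in sections {
        if row < section.rowCount { return section.heading }
        row -= section.rowCount
    }
    return sections.last?.heading ?? ""
}

let sections = [Section(heading: "On Now", rowCount: 5),
                Section(heading: "Up Next", rowCount: 8)]
print(pinnedHeading(sections: sections, firstVisibleRow: 3))  // "On Now"
print(pinnedHeading(sections: sections, firstVisibleRow: 6))  // "Up Next"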
In some embodiments, the first set of columns is optionally generated based on a partial search term (708). A search suggestions column in the first set of columns is optionally generated (770) for display. The search suggestions column optionally comprises a plurality of search suggestions, and a respective search suggestion of the plurality of search suggestions is optionally highlighted. A search results column in the first set of columns is optionally generated (772) for display. The search results column optionally comprises a plurality of search results corresponding to the respective search suggestion. A selection of the respective search suggestion is optionally received (774), and in response to the selection of the respective search suggestion, a plurality of search results columns is optionally generated (776) for display, including two or more search results columns that each correspond to one search result of the plurality of search results. For example,
In some embodiments, providing the user interface including the first set of columns optionally includes generating (710) representations of a plurality of content items including a plurality of on-demand content items (e.g., content items stored locally at the electronic device or other local storage, or content items stored remotely at a server) and a plurality of live content items (e.g., content items live streaming over the internet, live broadcast content items, or other live content items). Two or more of the representations of on-demand content items optionally include static images (e.g., a movie poster, production art, a screenshot, or other placeholder image) corresponding to the on-demand content, and two or more of the representations of live content items optionally include live video corresponding to the live content. In some embodiments, the live video optionally includes (712) live video of a live sporting event, and the static images optionally represent one or more of completed sporting events, upcoming sporting events, and highlight reels. For example,
In some embodiments, it is optionally determined (778) whether a first content item of the plurality of content items is an on-demand content item or a live content item. In accordance with a determination that the first content item is an on-demand content item, a static image corresponding to the on-demand content item is optionally obtained (780). In accordance with a determination that the first content item is a live content item, a live video corresponding to the live content item is optionally obtained (784). In some embodiments, obtaining the static image optionally includes capturing the static image from video corresponding to the on-demand content item. For example,
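As a non-limiting sketch of steps 778-784, the following Swift snippet chooses a representation per content item; the enum cases and the placeholder strings are hypothetical stand-ins for real assets.

enum ContentItem {
    case onDemand(title: String)
    case live(title: String, channel: String)
}

enum Representation {
    case staticImage(String)  // e.g., a poster or a frame captured from the video
    case liveVideo(String)    // e.g., a live stream for the airing channel
}

func representation(for item: ContentItem) -> Representation {
    switch item {
    case .onDemand(let title):
        // On-demand: obtain (or capture from video) a still image.
        return .staticImage("poster-for-\(title)")
    case .live(_, let channel):
        // Live: obtain live video corresponding to the content item.
        return .liveVideo("stream-for-\(channel)")
    }
}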
In some embodiments, the first set of columns optionally includes (714) a first recent content column and a second recent content column. Each of the first and second recent content columns optionally includes content corresponding to a different category of recent content (e.g., a recent TV column, a recent movies column, a recent songs column, etc.). In some embodiments, the first recent content column only includes (716) content that has been viewed past a predetermined threshold (e.g., only content that has been viewed for at least 8 minutes, only content that has been viewed for at least 10% of its running time, etc.; and the threshold may be different for different content categories, for example, television may have a 2-minute threshold, movies may have a 10-minute threshold, and songs may have a 30-second threshold, etc.). For example,
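A minimal sketch of this per-category threshold, using the example values given above (2 minutes for television, 10 minutes for movies, 30 seconds for songs); the type and function names are hypothetical.

import Foundation

enum Category { case television, movie, song }

// Threshold past which viewed content qualifies as "recent"; the value
// differs by content category.
func watchedThreshold(for category: Category) -> TimeInterval {
    switch category {
    case .television: return 2 * 60
    case .movie:      return 10 * 60
    case .song:       return 30
    }
}

func qualifiesAsRecent(category: Category, secondsViewed: TimeInterval) -> Bool {
    secondsViewed >= watchedThreshold(for: category)
}

print(qualifiesAsRecent(category: .movie, secondsViewed: 8 * 60))  // false: under 10 minutes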
In some embodiments, the first column of the first set of columns is optionally (786) a first type of column (e.g., a column with only a single menu item in each row of the column) and a second column of the first set of columns is optionally a second type of column (e.g., a column with multiple menu items in one or more rows of the column). While the first column is selected, a first navigation input (e.g., a swipe left or a swipe right) is optionally received (788). In response to receiving the first navigation input, a different column of the first set of columns is optionally selected (790) (e.g., a column immediately to the left or the right of the first column). While the second column is selected, a second navigation input (e.g., a swipe left or a swipe right) is optionally received (792). In response to receiving the second navigation input, selection of the second column is optionally maintained (794) and multiple menu items in a row of the second column are optionally navigated among. For example,
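The following Swift sketch illustrates this type-dependent navigation (786-794); the Navigator type and its swipe handling are hypothetical simplifications of the behavior described above.

enum ColumnType {
    case singleItemRows
    case multiItemRows(itemsPerRow: Int)
}

struct Navigator {
    var columnTypes: [ColumnType]
    var selectedColumn = 0
    var selectedItemInRow = 0

    mutating func swipeRight() {
        switch columnTypes[selectedColumn] {
        case .singleItemRows:
            // First type: advance to the column immediately to the right.
            selectedColumn = min(columnTypes.count - 1, selectedColumn + 1)
            selectedItemInRow = 0
        case .multiItemRows(let itemsPerRow):
            // Second type: keep the column selected and navigate among the
            // multiple menu items within the row.
            selectedItemInRow = min(itemsPerRow - 1, selectedItemInRow + 1)
        }
    }
}

var nav = Navigator(columnTypes: [.singleItemRows, .multiItemRows(itemsPerRow: 3)])
nav.swipeRight()  // moves selection from column 0 to column 1
nav.swipeRight()  // stays on column 1, advances within the row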
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described above with respect to
The operations described above with reference to
It should be understood that the particular order in which the operations in
Many electronic devices have graphical user interfaces that overlay some other visual content. The sudden display of a new graphical user interface can be jarring to a smooth user experience. The embodiments described below provide a smooth transition to a graphical user interface by first displaying a translucent graphical user interface over visual content, and then decreasing the degree of translucency as the user continues to interact with the interface.
As described below, the method 900 provides ways in which a device can display a user interface with translucent portions. The method reduces the cognitive burden on a user when interacting with a user interface on the device by providing a smooth and intuitive user interface for selecting content items, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, increasing the efficiency of the user's interaction with the user interfaces conserves power and increases the time between battery charges.
In some embodiments, an electronic device (e.g., a set top box or other user interface generating device that is in communication with a display device) with one or more processors and memory, while a user interface that includes visual content (e.g., video, still image, animation, etc.) is displayed on a display, receives (902) a request to display a menu for controlling the visual content that is displayed in the user interface.
In some embodiments, in response to receiving the request to display the menu, the electronic device provides (904), for display on the display device, an updated user interface that includes the menu, where the menu includes at least one translucent portion with a first degree (or amount) of translucency (in some examples, the whole menu is translucent), so that an indication of the visual content in the user interface can be seen through the translucent portion of the menu in accordance with the first degree of translucency. In some embodiments, the menu optionally includes (906) a first set of columns, two or more columns in the first set of columns each including two or more menu items (e.g., text, images, and/or video content, some or all of which may optionally include links to content, applications, and/or other user interfaces). For example,
In some embodiments, while the menu with the translucent portion is displayed on the display device, the electronic device receives (908) a request to perform an operation in the menu (e.g., a menu navigation operation such as switching from displaying one column to displaying another column or advancing from a first menu item to a second menu item). In some embodiments, the menu operation is optionally (910) a navigation operation that corresponds to a request to move the menu in a respective direction.
In some embodiments, in response (912) to receiving the request to perform the operation in the menu, the electronic device performs (914) the operation in the menu. For example,
In some embodiments, performing the operation in the menu optionally includes (916) changing a state of an item in the menu (e.g., selecting a menu item, activating a checkbox, or manipulating a slider, among other possibilities). Thus, in some embodiments, performing the operation in the menu includes performing an operation other than merely navigating through the menu (e.g., by scrolling through items in the menu). For example,
Further in response (912) to receiving the request to perform the operation in the menu, the electronic device changes (918) the translucency of the translucent portion of the menu from the first degree of translucency to a second degree of translucency that is different from the first degree of translucency (e.g., once the user has indicated an intention to interact with the menu, the visual properties of the menu are changed so as to make the menu more legible). For example, in
In some embodiments, in accordance with a determination that the respective direction is a first direction, the translucency of the translucent portion of the menu is optionally increased (920). For example,
In some embodiments, while the menu with the translucent portion is displayed on the display device, a request is optionally received (924) to perform a navigation operation in a second direction (e.g., a direction opposite the first direction). In response to receiving the request to perform the navigation operation in the second direction, the navigation operation is optionally performed (926) in the second direction, and the translucency of the translucent portion of the menu is optionally reduced (928).
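A minimal sketch of this direction-dependent translucency (920-928); the step size and the clamping to a 0-1 range are illustrative assumptions, not values from this disclosure.

enum NavigationDirection { case first, second }

struct Menu {
    // 0.0 = fully opaque, 1.0 = fully translucent.
    var translucency = 0.5

    mutating func navigate(_ direction: NavigationDirection) {
        let step = 0.1
        switch direction {
        case .first:
            translucency = min(1.0, translucency + step)  // more translucent
        case .second:
            translucency = max(0.0, translucency - step)  // more opaque, more legible
        }
    }
}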
In some embodiments, while the menu with the translucent portion is displayed on the display device, a request is optionally received (930) to dismiss the menu. In response to receiving the request to dismiss the menu, the user interface including the visual content and not including the menu is optionally provided (932) for display on the display device. In some embodiments, the visual content optionally includes (934) video content. The video content is optionally paused (936) in response to receiving the request to display the menu, and the video content is optionally resumed (938) in response to receiving the request to dismiss the menu.
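The pause/resume behavior of steps 936-938 might be sketched as follows; the Player protocol is a hypothetical stand-in for a real playback engine and is not an API named in this disclosure.

protocol Player {
    func pause()
    func play()
}

final class MenuController {
    private let player: Player
    private(set) var menuVisible = false

    init(player: Player) { self.player = player }

    // Requesting the menu pauses the underlying video content.
    func requestMenu() {
        menuVisible = true
        player.pause()
    }

    // Dismissing the menu resumes the video where it left off.
    func dismissMenu() {
        menuVisible = false
        player.play()
    }
}

struct LoggingPlayer: Player {
    func pause() { print("video paused") }
    func play()  { print("video resumed") }
}

let controller = MenuController(player: LoggingPlayer())
controller.requestMenu()   // prints "video paused"
controller.dismissMenu()   // prints "video resumed"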
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described above with respect to
The operations described above with reference to
It should be understood that the particular order in which the operations in
Many electronic devices have graphical user interfaces that overlay some other visual content. The sudden display of a new graphical user interface can be jarring to a smooth user experience. The embodiments described below provide a smooth transition to a graphical user interface by presenting a first column after a column display condition (e.g., detecting presence of a user, detecting motion proximate the electronic device, or detecting a face of a user using an optical sensor, among other possibilities) has been met, and then gradually presenting additional columns of a column user interface after input is received (e.g., a swipe, a tap, or other input to expose one or more additional columns, among other possibilities).
As described below, the method 1100 provides ways in which a device can display a user interface gradually over visual content. The method reduces the cognitive burden on a user when interacting with a user interface on the device by providing a smooth and intuitive user interface for selecting content items, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, increasing the efficiency of the user's interaction with the user interfaces conserves power and increases the time between battery charges.
In some embodiments, an electronic device (e.g., a set top box or other user interface generating device that is in communication with a display device) with one or more processors and memory detects (1102) that a column display condition (e.g., detecting presence of a user, detecting motion proximate the electronic device, or detecting a face of a user using an optical sensor, among other possibilities) has been met.
In some embodiments, in response to detecting that the column display condition has been met, the electronic device provides (1114) a user interface for presentation on a display (e.g., a separate display device or an integrated display that is part of the electronic device). The user interface optionally includes at least a portion of a first column proximate to (e.g., at or near) a first edge (e.g., the rightmost edge) of the display. For example,
In some embodiments, input is optionally received (1122) (e.g., a swipe, a tap, or other input to expose one or more additional columns, among other possibilities). In some embodiments, in response to the input, the electronic device provides (1128) for display an animation including the first column moving away from the first edge of the display to a location on the display that is proximate to (e.g., at or near) a second edge (e.g., the leftmost edge) of the display, the second edge being opposite the first edge. The animation further includes a second column gradually appearing from the first edge of the display, such that a plurality of columns including the first and second columns fills the display from the first edge to the second edge (e.g., the plurality of columns fills the user interface from the rightmost edge of the display to the leftmost edge of the display, although portions of the user interface above and/or below the plurality of columns may not be filled by the columns and, in some circumstances, there are margins between one or more of the columns and the edge of the display). In some embodiments, the plurality of columns is optionally included (1132) in a first set of columns, two or more columns in the first set of columns each including two or more menu items (e.g., text, images, and/or video content, some or all of which may optionally include links to content, applications, and/or other user interfaces).
For example,
In some embodiments, detecting that the column display condition has been met optionally does not include (1104) receiving directional input, and receiving the input optionally includes (1124) receiving directional input in a first direction (e.g., a swipe, a tap, or other input in a direction such as left, right, up, or down). In some embodiments, the directional input optionally includes (1126) a magnitude (e.g., an amount of movement of a contact or a velocity of movement of a contact). The animation optionally further includes (1130) moving the second column in accordance with the magnitude of the directional input (e.g., direct manipulation of the second column, such as a swipe left corresponding to movement of a contact causes the second column to move to the left by an amount that corresponds to a distance moved by the contact during the swipe left gesture or a velocity of the contact during the swipe left gesture, etc.).
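A minimal sketch of this direct manipulation: the gesture's magnitude maps onto the reveal animation's progress, so the incoming second column tracks the contact. The linear mapping and the clamping are illustrative assumptions.

struct RevealAnimation {
    var displayWidth: Double
    // 0.0 = only the first column peeks in at the first edge;
    // 1.0 = the full set of columns fills the display.
    var progress = 0.0

    // A directional input advances the animation in proportion to the
    // magnitude (e.g., distance moved) of the gesture.
    mutating func apply(gestureTranslation: Double) {
        let delta = gestureTranslation / displayWidth
        progress = min(1.0, max(0.0, progress + delta))
    }

    // The second column's leading edge tracks the current progress.
    var secondColumnOffset: Double { (1.0 - progress) * displayWidth }
}

var reveal = RevealAnimation(displayWidth: 1920)
reveal.apply(gestureTranslation: 480)  // a quarter-width swipe
print(reveal.secondColumnOffset)       // 1440.0: the column is partway in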
In some embodiments, video content is optionally displayed (1112) on the display when the column display condition is met, the user interface optionally includes (1116) an overlay on the video content, and the first column optionally includes (1120) content (e.g., menu items of the first column) selected based on the video content being displayed (e.g., if the video content is a particular movie, the selected content included in the first column may be information related to the particular movie; if the video content is a particular television show, the selected content included in the first column may be additional episodes of the particular television show; etc.).
In some embodiments, video content is optionally displayed (1112) on the display when the column display condition is met, the user interface optionally includes (1116) an overlay on the video content, and detecting the column display condition optionally includes (1108) detecting selection of a menu button (e.g., on a touch sensitive device, on a remote control, and/or on a mobile device).
In some embodiments, passive content (e.g., splash image, screen saver, background image, etc.) is optionally displayed (1110) on the display when the column display condition is met, the user interface optionally includes (1118) an overlay on the passive content, and detecting the column display condition optionally includes (1106) detecting user presence proximate to the display (e.g., detecting motion proximate the electronic device, detecting a face of a user using an optical sensor, or detecting presence of a mobile device). For example,
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described above with respect to
The operations described above with reference to
It should be understood that the particular order in which the operations in
Many electronic devices have graphical user interfaces that provide access to episodes of various content series. It can be difficult for a user to keep track of which episodes have already been watched. Further, for an unwatched content series, it can be more intuitive to present either the first episode of the series or the most recent episode of the series, depending on the release status of the content series. The embodiments described below provide a more intuitive user interface by intelligently presenting contextual information for a content series based on the release status of the content series.
As described below, the method 1300 provides ways in which a device can display a user interface including intelligent presentation of contextual information for a content series. The method reduces the cognitive burden on a user when interacting with a user interface on the device by providing an intuitive user interface for selecting content items, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, increasing the efficiency of the user's interaction with the user interfaces conserves power and increases the time between battery charges.
In some embodiments, an electronic device (e.g., a set top box or other user interface generating device that is in communication with a display device) with one or more processors and memory receives (1302) a request for information about a content series (e.g., a television series, a miniseries, a set of webisodes, among other possibilities) that has a release status (e.g., whether the series is still in production, whether the series has been canceled, whether there are remaining episodes that have yet to be released, among other possibilities).
In some embodiments, in response to receiving the request, the electronic device provides (1304) a user interface for display on a display (e.g., a television or other display device) that includes information about the content series, including respective contextual information that is based on the release status of the content series. In some embodiments, the user interface optionally includes (1312) a plurality of columns, two or more of the plurality of columns including two or more menu items (e.g., text, images, and/or video content, some or all of which may optionally include links to content, applications, and/or other user interfaces). A first column of the plurality of columns optionally includes the respective contextual information that is based on the release status of the content series (e.g., the column is a column dedicated to the content series, such as a “Game of Thrones” column; and/or the column is a column dedicated to a content provider of the content series, such as an HBO column with a menu item for “Game of Thrones”, etc.). For example,
In some embodiments, in accordance with a determination that the release status is a first release status (1306), the respective contextual information is first contextual information provided based on recent release-status activity (e.g., current or upcoming episodes) for the content series.
In some embodiments, in accordance with a determination that the release status is a second release status (1308), different from the first release status, the respective contextual information is second contextual information (different from the first contextual information) provided without reference to recent release-status activity (e.g., current or upcoming episodes) for the content series.
In some embodiments, the first contextual information optionally includes (1314) an affordance that, when selected, causes an episode of the content series to be provided for display on the display (in some embodiments, the device receives a selection of the affordance and, in response to receiving the selection of the affordance, presents an episode of the content series for display). In some embodiments, the second contextual information optionally includes (1322) information indicating when a next episode of the content series will be available for presentation without including an affordance that, when selected, causes an episode of the content series to be presented for display. For example,
In some embodiments, the release status is optionally canceled (1310) and the respective contextual information optionally includes a first episode of the content series (or a last unwatched episode of the content series). For example,
In some embodiments, the first release status is optionally currently releasing (1316) (e.g., the content series is currently on the air and/or new episodes are being released) and a most recently released episode is marked as watched. Providing the first contextual information based on recent release-status activity optionally includes providing, for display, a date of a next releasing episode (e.g., the air date of an upcoming episode in the content series). For example,
In some embodiments, the first release status is optionally currently releasing (1318) (e.g., the content series is currently on the air and/or new episodes are being released) and a most recently released episode is not marked as watched. Providing the first contextual information based on recent release-status activity optionally includes providing an affordance for presenting a most recently released episode of the content series. In some embodiments, a selection of the affordance is optionally received (1324), and in response (1326) to receiving the selection of the affordance, the most recently released episode of the content series is optionally provided for display (1328). For example,
In some embodiments, the first release status is optionally currently releasing (1320) (e.g., the content series is currently on the air and/or new episodes are being released) and an episode of the content series is currently airing live on a respective channel. Providing the first contextual information based on recent release-status activity optionally includes providing an affordance for providing for display the respective channel on which the episode is currently airing live. In some embodiments, a selection of the affordance is optionally received (1324), and in response (1326) to receiving the selection of the affordance, the respective channel on which the episode is currently airing live is optionally provided for display (1330). For example,
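Taken together, the cases above amount to a dispatch on release status. The following Swift sketch is a hypothetical, non-limiting rendering of that dispatch; the types and placeholder strings are not part of this disclosure, and the parenthesized numbers in the comments refer to the steps above.

enum ReleaseStatus {
    case canceled
    case currentlyReleasing(mostRecentWatched: Bool, liveChannel: String?, nextAirDate: String)
}

enum ContextualInfo {
    case playEpisode(String)      // affordance that presents an episode
    case tuneToChannel(String)    // affordance that presents a live channel
    case nextEpisodeDate(String)  // informational only, no playback affordance
}

func contextualInfo(for status: ReleaseStatus) -> ContextualInfo {
    switch status {
    case .canceled:
        // Canceled series: surface the first (or last unwatched) episode (1310).
        return .playEpisode("first episode")
    case .currentlyReleasing(let watched, let liveChannel, let nextAirDate):
        if let channel = liveChannel {
            // An episode is airing live right now (1320).
            return .tuneToChannel(channel)
        }
        if watched {
            // Most recent episode already watched: show the next air date (1316).
            return .nextEpisodeDate(nextAirDate)
        }
        // Most recent episode not yet watched: offer to play it (1318).
        return .playEpisode("most recently released episode")
    }
}

let info = contextualInfo(for: .currentlyReleasing(mostRecentWatched: false,
                                                   liveChannel: nil,
                                                   nextAirDate: "next Sunday"))
// info is .playEpisode("most recently released episode")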
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described above with respect to
The operations described above with reference to
It should be understood that the particular order in which the operations in
Many electronic devices have graphical user interfaces that allow a user to directly add content items to a watch list for later viewing. It can be difficult for a user to keep track of potentially interesting content items as they are released. The embodiments described below provide a user interface for quickly adding both content and non-content items (e.g., actors, search terms, genres, sports leagues/players, etc.) to a watch list, so that content items related to the non-content items can be automatically populated to the watch list.
As described below, the method 1500 provides ways in which a device can display user interfaces for adding both content and non-content items to a watch list. The method reduces the cognitive burden on a user when interacting with a user interface on the device by providing an intuitive user interface for selecting content items, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, increasing the efficiency of the user's interaction with the user interfaces conserves power and increases the time between battery charges.
In some embodiments, an electronic device (e.g., a set top box or other user interface generating device that is in communication with a display device) with one or more processors and memory provides (1504) for display a first affordance (e.g., a button, checkbox, or other selectable user interface object) in association with a first content item (e.g., a movie, an episode, a song, a video clip, etc.) and a second affordance in association with a non-content item (e.g., an actor, a genre, a search term, etc.). For example,
In some embodiments, the electronic device receives (1522) an input (e.g., a tap gesture on a touch-sensitive surface) selecting the first affordance and an input selecting the second affordance (e.g., input to add the first content item and the non-content item to a watch list). For example, input can be received selecting any of affordances 14002, 14004, 14006, 14010, 14012, 14014, 14016, and 14018 in
In some embodiments, the electronic device provides (1528) for display a list of content items (e.g., a watch list displayed as a plurality of columns, or as a plurality of menu items in a menu), the list including the first content item and one or more additional content items associated with the non-content item (e.g., movies starring the actor if the non-content item is an actor, movies from the genre if the non-content item is a genre, or results of the search if the non-content item is a search, etc.). For example,
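A minimal sketch of a watch list that mixes directly added content with content auto-populated from non-content items; the in-memory catalog query below is a hypothetical stand-in for a real content service.

struct Show {
    let title: String
    let actors: [String]
    let genre: String
}

enum WatchListEntry {
    case content(Show)   // a content item added directly
    case actor(String)   // non-content items that auto-populate the list
    case genre(String)
}

// Expands non-content entries into related content items at display time.
func displayList(entries: [WatchListEntry], catalog: [Show]) -> [Show] {
    var result: [Show] = []
    for entry in entries {
        switch entry {
        case .content(let show):
            result.append(show)
        case .actor(let name):
            result += catalog.filter { $0.actors.contains(name) }
        case .genre(let genre):
            result += catalog.filter { $0.genre == genre }
        }
    }
    return result
}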
In some embodiments, the first affordance and the second affordance are optionally identical (1506) (e.g., the first and second affordances are buttons including the text “Add to Watch List”). In some embodiments, the first affordance and the second affordance are substantially identical (e.g., both affordances are buttons that include the text “Add to Watch List” so that it is clear to the user that both buttons add content associated with a currently displayed content or non-content item to the user's watch list, even if different shapes, colors, or fonts are used in the buttons). For example,
In some embodiments, a list of search results based on a search term is optionally provided for display (1502). The second affordance is optionally provided (1508) proximate to (e.g., in association with) the list of search results (and the second affordance is generated for display proximate to the list of search results; e.g., the affordance includes the text “Add Search to Watch List”). In some embodiments, providing the list of content items optionally includes (1530) searching based on the search term to obtain an additional list of search results (and at least one of the additional search results is included in the list of content items). For example,
In some embodiments, providing for display the first affordance in association with the first content optionally includes (1510) receiving a request to display information about the first content item on the display. In response to receiving the request to display the information about the first content item on the display, the information about the first content item is optionally displayed (1512), and the first affordance is optionally displayed adjacent to the information about the first content item. For example, in
While displaying the first affordance adjacent to the information about the first content item, an input is optionally received (1524) that corresponds to activation of the first affordance (e.g., a selection command received while the “add to watch list” affordance is highlighted). Providing the second affordance for display in association with the non-content item optionally includes, while displaying the information about the first content item, receiving (1514) a request to display information about the non-content item on the display. In response to receiving the request to display the information about the non-content item on the display, display of the information about the first content item is optionally replaced (1516) with display of the information about the non-content item. The second affordance is optionally displayed adjacent to the information about the non-content item. For example, in
Receiving the input selecting the second affordance optionally includes, while displaying the second affordance adjacent to the information about the non-content item, receiving (1526) an input that corresponds to activation of the second affordance (e.g., a selection command received while the “add to watch list” affordance is highlighted). In some embodiments, one or more additional affordances are selected while navigating through a user interface that includes columns corresponding to a plurality of different content and non-content items (e.g., the user has the option of selecting from a plurality of different affordances that correspond to non-content items and a plurality of affordances that correspond to content items while navigating through a series of menus or user interfaces corresponding to the different content items and non-content items). For example, the plurality of affordances 14002, 14004, 14006, 14010, 14012, 14014, 14016, and 14018 in
In some embodiments, a plurality of affordances are optionally provided (1518) that correspond to non-content items. The plurality of affordances optionally include an affordance corresponding to a respective non-content item (e.g., a sports player), and an affordance corresponding to a grouping of non-content items that includes the respective non-content item and one or more other non-content items (e.g., a team that includes the sports player or a league that includes a team). For example,
In some embodiments, a plurality of affordances are optionally provided (1520) that correspond to content items. The plurality of affordances optionally include an affordance corresponding to a respective content item (e.g., an episode of a TV show), and an affordance corresponding to a grouping of content items that includes the respective content item and one or more other content items (e.g., a TV show that includes the episode). For example,
The operations in the information processing methods described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general purpose processors (e.g., as described above with respect to
The operations described above with reference to
It should be understood that the particular order in which the operations in
In accordance with some embodiments,
As shown in
In some embodiments, the processing unit 1610 is configured to provide (e.g., with the display enabling unit 1612), for display on a display device, a user interface including a first set of columns, two or more columns in the first set of columns each including two or more menu items. The processing unit 1610 is configured to receive (e.g., with the receiving unit 1614) a selection input. In response to receiving the selection input, in accordance with a determination that the selection input corresponds to selection of a first column, the processing unit 1610 is further configured to provide (e.g., with the display enabling unit 1612), for display on the display device, a second set of columns, two or more of the columns in the second set of columns corresponding to different subsets of menu items that were displayed in the first column that was selected in response to the selection input.
In some embodiments, the processing unit 1610 is further configured to, in response to the selection input, in accordance with a determination that the selection input corresponds to selection of a menu item from the first column, provide (e.g., with the display enabling unit 1612), for display on the display device, a user interface associated with the menu item that was selected.
In some embodiments, the menu item corresponds to a content series that includes a plurality of episodes, and a respective episode of the plurality of episodes is in a watch list. The processing unit 1610 is further configured to generate (e.g., with the display enabling unit 1612) one or more links to ancillary content related to the respective episode, and the second set of columns includes the one or more links to ancillary content related to the respective episode.
In some embodiments, a first column in the second set of columns corresponds to a first menu item from the first column, and a second column in the second set of columns corresponds to a second menu item from the first column that is different from the first menu item from the first column. In some embodiments, a first column in the second set of columns corresponds to a first plurality of the menu items from the first column, and a second column in the second set of columns corresponds to a second plurality of the menu items from the first column that is different from the first plurality of menu items from the first column.
In some embodiments, the first set of columns are configured to move together when scrolled in a first direction and to move separately when scrolled in a second direction that is different from the first direction. In some embodiments, scrolling the first set of columns in the first direction includes determining (e.g., with the determining unit 1616) a selected column display position based on a position of a currently selected column with respect to a total number of the plurality of columns, and scrolling (e.g., with the display enabling unit 1612) the plurality of columns such that the currently selected column moves to the selected column display position on the display device.
In some embodiments, the processing unit 1610 is further configured to receive (e.g., with the receiving unit 1614) second input to advance from the first column to a second column, and, in response to the second input, alter (e.g., with the display enabling unit 1612) a visual characteristic of one or both of the first column and the second column. In some embodiments, altering the visual characteristic includes increasing a visual emphasis (e.g., with the display enabling unit 1612) of the second column relative to a visual emphasis of the first column. In some embodiments, altering the visual characteristic includes displaying (e.g., with the display enabling unit 1612) additional information on the second column. In some embodiments, altering the visual characteristic includes increasing (e.g., with the display enabling unit 1612) a spatial property of the second column relative to a spatial property of the first column.
In some embodiments, the user interface has at least one background color, and the processing unit 1610 is further configured to receive (e.g., with the receiving unit 1614) second input to advance from a first menu item of the first column to a second menu item of the first column, obtain (e.g., with the obtaining unit 1618) a color associated with a logo of the second menu item, and alter (e.g., with the display enabling unit 1612) the background color of the user interface in accordance with the color associated with the logo of the second menu item.
In some embodiments, the first column includes first and second headings, and the processing unit 1610 is further configured to receive (e.g., with the receiving unit 1614) a scrolling input corresponding to the first column. In response to the scrolling input, the processing unit 1610 is further configured to scroll (e.g., with the display enabling unit 1612) the menu items of the first column, wherein the first heading remains stationary during the scrolling of the menu items of the first column, scroll (e.g., with the display enabling unit 1612) the first heading off an edge of the user interface, such that the first heading is no longer visible in the user interface after scrolling the first heading off the edge of the user interface, and replace (e.g., with the display enabling unit 1612) the first heading with the second heading, such that the second heading remains stationary during scrolling after replacing the first heading.
In some embodiments, the first set of columns is generated based on a partial search term, and the processing unit 1610 is further configured to generate (e.g., with the display enabling unit 1612) for display a search suggestions column in the first set of columns, the search suggestions column comprising a plurality of search suggestions, wherein a respective search suggestion of the plurality of search suggestions is highlighted, generate (e.g., with the display enabling unit 1612) for display a search results column in the first set of columns, the search results column comprising a plurality of search results corresponding to the respective search suggestion, and receive (e.g., with the receiving unit 1614) a selection of the respective search suggestion. In response to the selection of the respective search suggestion, the processing unit 1610 is further configured to generate (e.g., with the display enabling unit 1612) for display a plurality of search results columns, including two or more search results columns that each correspond to one search result of the plurality of search results.
In some embodiments, providing the user interface including the first set of columns includes generating (e.g., with the display enabling unit 1612) representations of a plurality of content items including a plurality of on-demand content items and a plurality of live content items, wherein two or more of the representations of on-demand content items include static images corresponding to the on-demand content and two or more of the representations of live content items include live video corresponding to the live content.
In some embodiments, the processing unit 1610 is further configured to determine (e.g., with the determining unit 1616) whether a first content item of the plurality of content items is an on-demand content item or a live content item, in accordance with a determination that the first content item is an on-demand content item, obtain (e.g., with the obtaining unit 1618) a static image corresponding to the on-demand content item, and in accordance with a determination that the first content item is a live content item, obtain (e.g., with the obtaining unit 1618) a live video corresponding to the live content item. In some embodiments, obtaining the static image includes capturing (e.g., with the obtaining unit 1618) the static image from video corresponding to the on-demand content item. In some embodiments, the live video includes live video of a live sporting event, and the static images represent one or more of completed sporting events, upcoming sporting events, and highlight reels.
In some embodiments, the first set of columns includes a first recent content column and a second recent content column, and each of the first and second recent content columns includes content corresponding to a different category of recent content. In some embodiments, the first recent content column only includes content that has been viewed past a predetermined threshold.
In some embodiments, the first column of the first set of columns is a first type of column and a second column of the first set of columns is a second type of column, and the processing unit 1610 is further configured to, while the first column is selected, receive (e.g., with the receiving unit 1614) a first navigation input. In response to receiving the first navigation input, the processing unit 1610 is further configured to select (e.g., with the display enabling unit 1612) a different column of the first set of columns, and, while the second column is selected, receive (e.g., with the receiving unit 1614) a second navigation input. In response to receiving the second navigation input, the processing unit 1610 is further configured to maintain selection (e.g., with the display enabling unit 1612) of the second column and navigate (e.g., with the display enabling unit 1612) among multiple menu items in a row of the second column.
In accordance with some embodiments,
As shown in
In some embodiments, the processing unit 1710 is configured to, while a user interface that includes visual content is displayed on a display, receive (e.g., with the receiving unit 1714) a request to display a menu for controlling the visual content that is displayed in the user interface. In response to receiving the request to display the menu, the processing unit 1710 is further configured to provide (e.g., with the display enabling unit 1712), for display on the display device, an updated user interface that includes the menu, and the menu includes at least one translucent portion with a first degree of translucency, so that an indication of the visual content in the user interface can be seen through the translucent portion of the menu in accordance with the first degree of translucency. While the menu with the translucent portion is displayed on the display device, the processing unit 1710 is further configured to receive (e.g., with the receiving unit 1714) a request to perform an operation in the menu. In response to receiving the request to perform the operation in the menu, the processing unit 1710 is further configured to perform (e.g., with the display enabling unit 1712) the operation in the menu and change (e.g., with the display enabling unit 1712) the translucency of the translucent portion of the menu from the first degree of translucency to a second degree of translucency that is different from the first degree of translucency.
In some embodiments, the operation is a first navigation operation in a first direction, and changing the translucency of the translucent portion of the menu includes increasing (e.g., with the display enabling unit 1712) the translucency. While the menu with the translucent portion is displayed on the display device, the processing unit 1710 is further configured to receive (e.g., with the receiving unit 1714) a request to perform a navigation operation in a second direction. In response to receiving the request to perform the navigation operation in the second direction, the processing unit 1710 is further configured to perform (e.g., with the display enabling unit 1712) the navigation operation in the second direction and reduce (e.g., with the display enabling unit 1712) the translucency of the translucent portion of the menu.
In some embodiments, the operation is a navigation operation that corresponds to a request to move the menu in a respective direction, and changing the translucency of the translucent portion of the menu includes: in accordance with a determination that the respective direction is a first direction, increasing (e.g., with the display enabling unit 1712) the translucency of the translucent portion of the menu; and in accordance with a determination that the respective direction is a second direction that is different from the first direction, decreasing (e.g., with the display enabling unit 1712) the translucency of the translucent portion of the menu.
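A minimal sketch of the direction-dependent translucency change described in the last two paragraphs, assuming a hypothetical OverlayMenu type, a step size of 0.2, and clamping of the translucency to [0, 1]:

```swift
enum NavigationDirection { case first, second }

struct OverlayMenu {
    /// 0.0 = fully opaque, 1.0 = fully transparent.
    var translucency: Double

    mutating func navigate(_ direction: NavigationDirection, step: Double = 0.2) {
        switch direction {
        case .first:
            translucency = min(1.0, translucency + step)  // more content shows through
        case .second:
            translucency = max(0.0, translucency - step)  // menu becomes more opaque
        }
    }
}

var menu = OverlayMenu(translucency: 0.4)  // first degree of translucency
menu.navigate(.first)                      // second degree: increased translucency
menu.navigate(.second)                     // navigating the other way decreases it
print(menu.translucency)                   // ≈0.4
```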
In some embodiments, performing the operation in the menu includes changing (e.g., with the display enabling unit 1712) a state of an item in the menu.
In some embodiments, the processing unit 1710 is further configured to, while the menu with the translucent portion is displayed on the display device, receive (e.g., with the receiving unit 1714) a request to dismiss the menu, and, in response to receiving the request to dismiss the menu, provide (e.g., with the display enabling unit 1712), for display on the display device, the user interface including the visual content and not including the menu.
In some embodiments, the visual content includes video content, and the processing unit 1710 is further configured to pause the video content (e.g., with the display enabling unit 1712) in response to receiving the request to display the menu, and resume the video content (e.g., with the display enabling unit 1712) in response to receiving the request to dismiss the menu. In some embodiments, the menu includes a first set of columns, two or more columns in the first set of columns each including two or more menu items.
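The pause-on-display, resume-on-dismiss coupling might be modeled as follows; VideoPlayback and MenuController are stand-in types for this sketch, not a real media API:

```swift
final class VideoPlayback {
    private(set) var isPlaying = true
    func pause()  { isPlaying = false }
    func resume() { isPlaying = true }
}

final class MenuController {
    let playback: VideoPlayback
    private(set) var menuVisible = false

    init(playback: VideoPlayback) { self.playback = playback }

    func requestDisplayMenu() {
        menuVisible = true
        playback.pause()   // pause the video content when the menu is requested
    }

    func requestDismissMenu() {
        menuVisible = false
        playback.resume()  // resume the video content when the menu is dismissed
    }
}

let controller = MenuController(playback: VideoPlayback())
controller.requestDisplayMenu()
print(controller.playback.isPlaying)  // false
controller.requestDismissMenu()
print(controller.playback.isPlaying)  // true
```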
In accordance with some embodiments, a functional block diagram illustrates an electronic device configured in accordance with the principles of the various described embodiments. As shown in that functional block diagram, the electronic device includes a processing unit 1810 coupled to a display enabling unit 1812, a receiving unit 1814, and a detecting unit 1816.
In some embodiments, the processing unit 1810 is configured to detect (e.g., with the detecting unit 1816) that a column display condition has been met. In response to detecting that the column display condition has been met, the processing unit 1810 is further configured to provide (e.g., with the display enabling unit 1812) a user interface for presentation on a display, the user interface including at least a portion of a first column proximate to a first edge of the display. The processing unit 1810 is further configured to receive input (e.g., with the receiving unit 1814). In response to the input, the processing unit 1810 is further configured to provide (e.g., with the display enabling unit 1812) for display an animation including the first column moving away from the first edge of the display to a location on the display that is proximate to a second edge of the display, the second edge being opposite the first edge, and a second column gradually appearing from the first edge of the display, such that a plurality of columns including the first and second columns fills the display from the first edge to the second edge.
In some embodiments, detecting that the column display condition has been met does not include receiving directional input, and receiving the input includes receiving (e.g., with the receiving unit 1814) directional input in a first direction. In some embodiments, the directional input includes a magnitude, and the animation further includes moving (e.g., with the display enabling unit 1812) the second column in accordance with the magnitude of the directional input. In some embodiments, video content is displayed on the display (e.g., with the display enabling unit 1812) when the column display condition is met, the user interface includes an overlay on the video content, and the first column includes content selected based on the video content being displayed.
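As one way to picture the edge-to-edge reveal, the following sketch reduces the animation to a progress value advanced in accordance with the magnitude of the directional input; the display width, the progress mapping, and all names here are assumptions for illustration:

```swift
struct ColumnReveal {
    let displayWidth: Double
    /// 0.0 = first column peeking at the first edge; 1.0 = resting at the second edge.
    private(set) var progress = 0.0

    mutating func apply(directionalInputMagnitude: Double) {
        // Move the columns in accordance with the magnitude of the input.
        progress = min(1.0, progress + directionalInputMagnitude / displayWidth)
    }

    var firstColumnX: Double { progress * displayWidth }  // toward the opposite edge
    var secondColumnVisible: Bool { progress > 0.0 }      // gradually appears behind it
}

var reveal = ColumnReveal(displayWidth: 1920)
reveal.apply(directionalInputMagnitude: 960)  // input worth half the display width
print(reveal.firstColumnX, reveal.secondColumnVisible)  // 960.0 true
```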
In some embodiments, video content is displayed on the display (e.g., with the display enabling unit 1812) when the column display condition is met, the user interface includes an overlay on the video content, and detecting the column display condition includes detecting (e.g., with the detecting unit 1816) selection of a menu button. In some embodiments, passive content is displayed on the display (e.g., with the display enabling unit 1812) when the column display condition is met, the user interface includes an overlay on the passive content, and detecting the column display condition includes detecting (e.g., with the detecting unit 1816) user presence proximate to the display. In some embodiments, the plurality of columns is included in a first set of columns, two or more columns in the first set of columns each including two or more menu items.
In accordance with some embodiments, a functional block diagram illustrates an electronic device configured in accordance with the principles of the various described embodiments. As shown in that functional block diagram, the electronic device includes a processing unit 1910 coupled to a display enabling unit 1912 and a receiving unit 1914.
In some embodiments, the processing unit 1910 is configured to receive (e.g., with the receiving unit 1914) a request for information about a content series that has a release status. In response to receiving the request, the processing unit 1910 is further configured to provide (e.g., with the display enabling unit 1912) a user interface for display on a display that includes information about the content series, including respective contextual information that is based on the release status of the content series. In accordance with a determination that the release status is a first release status, the respective contextual information is first contextual information provided based on recent release-status activity for the content series, and in accordance with a determination that the release status is a second release status, different from the first release status, the respective contextual information is second contextual information provided without reference to recent release-status activity for the content series.
In some embodiments, the first contextual information includes an affordance that, when selected, causes (e.g., with the display enabling unit 1912) an episode of the content series to be provided for display on the display, and the second contextual information includes information indicating when a next episode of the content series will be available for presentation without including an affordance that, when selected, causes (e.g., with the display enabling unit 1912) an episode of the content series to be presented for display.
In some embodiments, the release status is cancelled and the respective contextual information includes a first episode of the content series. In some embodiments, the first release status is currently releasing and a most recently released episode is marked as watched, and providing the first contextual information based on recent release-status activity includes providing, for display, (e.g., with the display enabling unit 1912) a date of a next releasing episode.
In some embodiments, the first release status is currently releasing and a most recently released episode is not marked as watched, providing the first contextual information based on recent release-status activity includes providing (e.g., with the display enabling unit 1912) an affordance for presenting a most recently released episode of the content series, and the processing unit 1910 is further configured to receive (e.g., with the receiving unit 1914) a selection of the affordance. In response to receiving the selection of the affordance, the processing unit 1910 is further configured to provide (e.g., with the display enabling unit 1912) for display the most recently released episode of the content series.
In some embodiments, the first release status is currently releasing and an episode of the content series is currently airing live on a respective channel, and providing the first contextual information based on recent release-status activity includes providing (e.g., with the display enabling unit 1912) an affordance for providing for display the respective channel on which the episode is currently airing live, and the processing unit 1910 is further configured to receive (e.g., with the receiving unit 1914) a selection of the affordance. In response to receiving the selection of the affordance, the processing unit 1910 is further configured to provide (e.g., with the display enabling unit 1912) for display the respective channel on which the episode is currently airing live.
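The release-status branching of the preceding paragraphs can be summarized in a single switch; the ReleaseStatus cases and the returned affordance labels (including the "Friday" date) are hypothetical placeholders for the behaviors described above:

```swift
enum ReleaseStatus {
    case cancelled
    case currentlyReleasing(lastEpisodeWatched: Bool, liveChannel: String?)
    case notYetReleased(nextEpisodeDate: String)
}

func contextualInfo(for status: ReleaseStatus) -> String {
    switch status {
    case .cancelled:
        // A cancelled series surfaces its first episode.
        return "Play first episode"
    case .notYetReleased(let date):
        // No play affordance; only when the next episode will be available.
        return "Available \(date)"
    case .currentlyReleasing(let watched, let channel):
        if let channel = channel {
            return "Watch live on \(channel)"            // jump to the airing channel
        }
        return watched ? "Next episode airs Friday"       // date of the next release
                       : "Play most recently released episode"
    }
}

print(contextualInfo(for: .currentlyReleasing(lastEpisodeWatched: false,
                                              liveChannel: nil)))
// Play most recently released episode
```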
In some embodiments, the user interface includes a plurality of columns, two or more of the plurality of columns including two or more menu items, wherein a first column of the plurality of columns includes the respective contextual information that is based on the release status of the content series.
In accordance with some embodiments, a functional block diagram illustrates an electronic device configured in accordance with the principles of the various described embodiments. As shown in that functional block diagram, the electronic device includes a processing unit 2010 coupled to a display enabling unit 2012, a receiving unit 2014, and a searching unit 2016.
In some embodiments, the processing unit 2010 is configured to provide (e.g., with the display enabling unit 2012) for display a first affordance in association with a first content item and a second affordance in association with a non-content item. The processing unit 2010 is further configured to receive (e.g., with the receiving unit 2014) an input selecting the first affordance and an input selecting the second affordance, and provide (e.g., with the display enabling unit 2012) for display a list of content items, the list including the first content item and one or more additional content items associated with the non-content item. In some embodiments, the first affordance and the second affordance are identical.
In some embodiments, the processing unit 2010 is further configured to provide (e.g., with the display enabling unit 2012) for display a list of search results based on a search term, wherein the second affordance is provided proximate to the list of search results. In some embodiments, providing the list of content items includes searching (e.g., with the searching unit 2016) based on the search term to obtain an additional list of search results.
In some embodiments, providing for display the first affordance in association with the first content includes receiving (e.g., with the receiving unit 2014) a request to display information about the first content item on the display and, in response to receiving the request to display the information about the first content item on the display, displaying (e.g., with the display enabling unit 2012) the information about the first content item and displaying the first affordance adjacent to the information about the first content item. Receiving the input selecting the first affordance includes, while displaying the first affordance adjacent to the information about the first content item, receiving (e.g., with the receiving unit 2014) an input that corresponds to activation of the first affordance. Providing for display the second affordance in association with the non-content item includes, while displaying the information about the first content item, receiving (e.g., with the receiving unit 2014) a request to display information about the non-content item on the display and, in response to receiving the request to display the information about the non-content item on the display, replacing (e.g., with the display enabling unit 2012) display of the information about the first content item with display of the information about the non-content item and displaying (e.g., with the display enabling unit 2012) the second affordance adjacent to the information about the non-content item. Receiving the input selecting the second affordance includes, while displaying the second affordance adjacent to the information about the non-content item, receiving (e.g., with the receiving unit 2014) an input that corresponds to activation of the second affordance.
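One way to read the identical-affordance behavior above is that both item kinds expose the same "add" affordance, which resolves a content item to itself and a non-content item to its associated content items; the Listable protocol and WatchList type below are illustrative assumptions, not part of the disclosure:

```swift
struct Content { let title: String }

protocol Listable {
    /// Resolve this item to the content items it contributes to the list.
    func contents() -> [Content]
}

struct ContentEntry: Listable {          // a content item: contributes itself
    let content: Content
    func contents() -> [Content] { [content] }
}

struct NonContentEntry: Listable {       // e.g., an actor or a saved search
    let associated: [Content]
    func contents() -> [Content] { associated }
}

final class WatchList {
    private(set) var items: [Content] = []
    /// The identical affordance for both item kinds routes through here.
    func affordanceSelected(_ item: any Listable) {
        items.append(contentsOf: item.contents())
    }
}

let list = WatchList()
list.affordanceSelected(ContentEntry(content: Content(title: "Movie A")))
list.affordanceSelected(NonContentEntry(associated: [Content(title: "Show B, Ep. 1"),
                                                     Content(title: "Show B, Ep. 2")]))
print(list.items.map(\.title))  // ["Movie A", "Show B, Ep. 1", "Show B, Ep. 2"]
```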
In some embodiments, the processing unit 2010 is further configured to provide (e.g., with the display enabling unit 2012) a plurality of affordances that correspond to non-content items, the plurality of affordances including: an affordance corresponding to a respective non-content item, and an affordance corresponding to a grouping of non-content items that includes the respective non-content item and one or more other non-content items.
In some embodiments, the processing unit 2010 is further configured to provide (e.g., with the display enabling unit 2012) a plurality of affordances that correspond to content items, the plurality of affordances including: an affordance corresponding to a respective content item, and an affordance corresponding to a grouping of content items that includes the respective content item and one or more other content items.
The foregoing description has, for purposes of explanation, been given with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, and thereby to enable others skilled in the art to best use the invention and the various described embodiments with such modifications as are suited to the particular use contemplated.
This application is a continuation of U.S. patent application Ser. No. 16/872,274, filed May 11, 2020, and published as U.S. Publication No. 2020-0272666, which is a continuation of U.S. patent application Ser. No. 14/746,095, filed Jun. 22, 2015, and issued on May 12, 2020 as U.S. Pat. No. 10,650,052, which claims the benefit of U.S. Provisional Application No. 62/016,599, filed Jun. 24, 2014, the entire disclosures of which are herein incorporated by reference in their entireties for all purposes.
Number | Name | Date | Kind |
---|---|---|---|
2718550 | Hoyt et al. | Sep 1955 | A |
4672677 | Yamakawa | Jun 1987 | A |
5029223 | Fujisaki | Jul 1991 | A |
5483261 | Yasutake | Jan 1996 | A |
5488204 | Mead et al. | Jan 1996 | A |
5585866 | Miller et al. | Dec 1996 | A |
5596373 | White et al. | Jan 1997 | A |
5621456 | Florin et al. | Apr 1997 | A |
5818439 | Nagasaka et al. | Oct 1998 | A |
5825352 | Bisset et al. | Oct 1998 | A |
5835079 | Shieh | Nov 1998 | A |
5880411 | Gillespie et al. | Mar 1999 | A |
5886690 | Pond et al. | Mar 1999 | A |
5926230 | Niijima et al. | Jul 1999 | A |
6021320 | Bickford et al. | Feb 2000 | A |
6028600 | Rosin et al. | Feb 2000 | A |
6049333 | Lajoie et al. | Apr 2000 | A |
6188391 | Seely et al. | Feb 2001 | B1 |
6310610 | Beaton et al. | Oct 2001 | B1 |
6323846 | Westerman et al. | Nov 2001 | B1 |
6405371 | Oosterhout et al. | Jun 2002 | B1 |
6487722 | Okura et al. | Nov 2002 | B1 |
6570557 | Westerman et al. | May 2003 | B1 |
6628304 | Mitchell et al. | Sep 2003 | B2 |
6677932 | Westerman | Jan 2004 | B1 |
6690387 | Zimmerman et al. | Feb 2004 | B2 |
6745391 | Macrae et al. | Jun 2004 | B1 |
6909837 | Unger | Jun 2005 | B1 |
6928433 | Goodman et al. | Aug 2005 | B2 |
7015894 | Morohoshi | Mar 2006 | B2 |
7039879 | Bergsten et al. | May 2006 | B2 |
7103906 | Katz et al. | Sep 2006 | B1 |
7134089 | Celik et al. | Nov 2006 | B2 |
7184064 | Zimmerman et al. | Feb 2007 | B2 |
7213255 | Markel et al. | May 2007 | B2 |
7293275 | Krieger et al. | Nov 2007 | B1 |
7324953 | Murphy | Jan 2008 | B1 |
7330192 | Brunner et al. | Feb 2008 | B2 |
7596761 | Lemay et al. | Sep 2009 | B2 |
7614008 | Ording | Nov 2009 | B2 |
7631278 | Miksovsky et al. | Dec 2009 | B2 |
7633076 | Huppi et al. | Dec 2009 | B2 |
7636897 | Koralski et al. | Dec 2009 | B2 |
7649526 | Ording et al. | Jan 2010 | B2 |
7650569 | Allen et al. | Jan 2010 | B1 |
7653883 | Hotelling et al. | Jan 2010 | B2 |
7657849 | Chaudhri et al. | Feb 2010 | B2 |
7663607 | Hotelling et al. | Feb 2010 | B2 |
7694231 | Kocienda et al. | Apr 2010 | B2 |
7712051 | Chadzelek et al. | May 2010 | B2 |
7783892 | Russell et al. | Aug 2010 | B2 |
7810043 | Ostojic et al. | Oct 2010 | B2 |
7814023 | Rao et al. | Oct 2010 | B1 |
7827483 | Unbedacht et al. | Nov 2010 | B2 |
7836475 | Angiolillo et al. | Nov 2010 | B2 |
7844914 | Andre et al. | Nov 2010 | B2 |
7849487 | Vosseller | Dec 2010 | B1 |
7856605 | Ording et al. | Dec 2010 | B2 |
7917477 | Hutson et al. | Mar 2011 | B2 |
7956846 | Ording et al. | Jun 2011 | B2 |
7957762 | Herz et al. | Jun 2011 | B2 |
7970379 | White et al. | Jun 2011 | B2 |
8006002 | Kalayjian et al. | Aug 2011 | B2 |
8026805 | Rowe | Sep 2011 | B1 |
8082523 | Forstall et al. | Dec 2011 | B2 |
8094132 | Frischling et al. | Jan 2012 | B1 |
8115731 | Varanda | Feb 2012 | B2 |
8145617 | Verstak et al. | Mar 2012 | B1 |
8170931 | Ross et al. | May 2012 | B2 |
8205240 | Ansari et al. | Jun 2012 | B2 |
8239784 | Hotelling et al. | Aug 2012 | B2 |
8279180 | Hotelling et al. | Oct 2012 | B2 |
8291452 | Yong et al. | Oct 2012 | B1 |
8299889 | Kumar et al. | Oct 2012 | B2 |
8301484 | Kumar | Oct 2012 | B1 |
8312484 | Mccarty et al. | Nov 2012 | B1 |
8312486 | Briggs et al. | Nov 2012 | B1 |
8325160 | St. Pierre et al. | Dec 2012 | B2 |
8346798 | Spiegelman et al. | Jan 2013 | B2 |
8370874 | Chang et al. | Feb 2013 | B1 |
8381135 | Hotelling et al. | Feb 2013 | B2 |
8386588 | Cooley | Feb 2013 | B1 |
8407737 | Ellis | Mar 2013 | B1 |
8416217 | Eriksson et al. | Apr 2013 | B1 |
8418202 | Ahmad-taylor | Apr 2013 | B2 |
8424048 | Lyren et al. | Apr 2013 | B1 |
8479122 | Hotelling et al. | Jul 2013 | B2 |
8495499 | Denise | Jul 2013 | B1 |
8516063 | Fletcher | Aug 2013 | B2 |
8516525 | Jerding et al. | Aug 2013 | B1 |
8560398 | Few et al. | Oct 2013 | B1 |
8584165 | Kane et al. | Nov 2013 | B1 |
8607163 | Plummer | Dec 2013 | B2 |
8607268 | Migos | Dec 2013 | B2 |
8613015 | Gordon et al. | Dec 2013 | B2 |
8613023 | Narahara et al. | Dec 2013 | B2 |
8625974 | Pinson | Jan 2014 | B1 |
8674958 | Kravets et al. | Mar 2014 | B1 |
8683362 | Shiplacoff et al. | Mar 2014 | B2 |
8683517 | Carpenter et al. | Mar 2014 | B2 |
8730190 | Moloney | May 2014 | B2 |
8742885 | Brodersen et al. | Jun 2014 | B2 |
8754862 | Zaliva | Jun 2014 | B2 |
8762852 | Davis et al. | Jun 2014 | B2 |
8769408 | Madden et al. | Jul 2014 | B2 |
8782706 | Ellis | Jul 2014 | B2 |
8850471 | Kilar et al. | Sep 2014 | B2 |
8850490 | Thomas et al. | Sep 2014 | B1 |
8869207 | Earle | Oct 2014 | B1 |
8887202 | Hunter et al. | Nov 2014 | B2 |
8930839 | He et al. | Jan 2015 | B2 |
8952987 | Momeyer et al. | Feb 2015 | B2 |
8963847 | Hunt | Feb 2015 | B2 |
8983950 | Askey et al. | Mar 2015 | B2 |
8988356 | Tseng | Mar 2015 | B2 |
8990857 | Yong et al. | Mar 2015 | B2 |
9007322 | Young | Apr 2015 | B1 |
9066146 | Suh et al. | Jun 2015 | B2 |
9081421 | Lai et al. | Jul 2015 | B1 |
9092057 | Varela et al. | Jul 2015 | B2 |
9116569 | Stacy et al. | Aug 2015 | B2 |
9118967 | Sirpal et al. | Aug 2015 | B2 |
9129656 | Prather et al. | Sep 2015 | B2 |
9141200 | Bernstein et al. | Sep 2015 | B2 |
9196309 | Schultz et al. | Nov 2015 | B2 |
9214290 | Xie et al. | Dec 2015 | B2 |
9215273 | Jonnala et al. | Dec 2015 | B2 |
9219634 | Morse et al. | Dec 2015 | B1 |
9235317 | Matas et al. | Jan 2016 | B2 |
9241121 | Rudolph | Jan 2016 | B2 |
9244600 | Mcintosh et al. | Jan 2016 | B2 |
9247014 | Rao | Jan 2016 | B1 |
9247174 | Sirpal et al. | Jan 2016 | B2 |
9285977 | Greenberg et al. | Mar 2016 | B1 |
9319727 | Phipps et al. | Apr 2016 | B2 |
9348458 | Hotelling et al. | May 2016 | B2 |
9357250 | Newman et al. | May 2016 | B1 |
9380343 | Webster et al. | Jun 2016 | B2 |
9414108 | Sirpal et al. | Aug 2016 | B2 |
9454288 | Raffle et al. | Sep 2016 | B2 |
9514476 | Kay et al. | Dec 2016 | B2 |
9532111 | Christie et al. | Dec 2016 | B1 |
9538310 | Fjeldsoe-nielsen et al. | Jan 2017 | B2 |
9542060 | Brenner et al. | Jan 2017 | B1 |
9560399 | Kaya et al. | Jan 2017 | B2 |
9575944 | Neil et al. | Feb 2017 | B2 |
9591339 | Christie et al. | Mar 2017 | B1 |
9600159 | Lawson et al. | Mar 2017 | B2 |
9602566 | Lewis et al. | Mar 2017 | B1 |
9639241 | Penha et al. | May 2017 | B2 |
9652118 | Hill et al. | May 2017 | B2 |
9652448 | Pasquero et al. | May 2017 | B2 |
9658740 | Chaudhri | May 2017 | B2 |
9774917 | Christie et al. | Sep 2017 | B1 |
9792018 | Van Os et al. | Oct 2017 | B2 |
9807462 | Wood | Oct 2017 | B2 |
9864508 | Dixon et al. | Jan 2018 | B2 |
9864509 | Howard et al. | Jan 2018 | B2 |
9871905 | Habiger et al. | Jan 2018 | B1 |
9913142 | Folse et al. | Mar 2018 | B2 |
9933937 | Lemay et al. | Apr 2018 | B2 |
9973800 | Yellin et al. | May 2018 | B2 |
10019142 | Van Os et al. | Jul 2018 | B2 |
10025499 | Howard et al. | Jul 2018 | B2 |
10079872 | Thomas et al. | Sep 2018 | B1 |
10091558 | Christie et al. | Oct 2018 | B2 |
10116996 | Christie et al. | Oct 2018 | B1 |
10126904 | Agnetta et al. | Nov 2018 | B2 |
10168871 | Wallters et al. | Jan 2019 | B2 |
10200761 | Christie et al. | Feb 2019 | B1 |
10205985 | Lue-sang et al. | Feb 2019 | B2 |
10209866 | Johnson et al. | Feb 2019 | B2 |
10237599 | Gravino et al. | Mar 2019 | B1 |
10275148 | Matas et al. | Apr 2019 | B2 |
10282088 | Kim et al. | May 2019 | B2 |
10303422 | Woo et al. | May 2019 | B1 |
10405015 | Kite et al. | Sep 2019 | B2 |
10521188 | Christie et al. | Dec 2019 | B1 |
10551995 | Ho et al. | Feb 2020 | B1 |
10552470 | Todd et al. | Feb 2020 | B2 |
10564823 | Dennis et al. | Feb 2020 | B1 |
10601808 | Nijim et al. | Mar 2020 | B1 |
10606539 | Bernstein et al. | Mar 2020 | B2 |
10631042 | Zerr et al. | Apr 2020 | B2 |
10795490 | Chaudhri et al. | Oct 2020 | B2 |
10827007 | Kode et al. | Nov 2020 | B2 |
11062358 | Lewis et al. | Jul 2021 | B1 |
20020015024 | Westerman et al. | Feb 2002 | A1 |
20020026637 | Markel et al. | Feb 2002 | A1 |
20020042920 | Thomas et al. | Apr 2002 | A1 |
20020060750 | Istvan et al. | May 2002 | A1 |
20020085045 | Vong et al. | Jul 2002 | A1 |
20020100063 | Herigstad et al. | Jul 2002 | A1 |
20020112239 | Goldman | Aug 2002 | A1 |
20020113816 | Mitchell et al. | Aug 2002 | A1 |
20020144269 | Connelly | Oct 2002 | A1 |
20020171686 | Kamen et al. | Nov 2002 | A1 |
20020178446 | Sie et al. | Nov 2002 | A1 |
20030001907 | Bergsten et al. | Jan 2003 | A1 |
20030005445 | Schein et al. | Jan 2003 | A1 |
20030009757 | Kikinis | Jan 2003 | A1 |
20030011641 | Totman et al. | Jan 2003 | A1 |
20030013483 | Ausems et al. | Jan 2003 | A1 |
20030088872 | Maissel et al. | May 2003 | A1 |
20030093790 | Logan et al. | May 2003 | A1 |
20030126600 | Heuvelman | Jul 2003 | A1 |
20030149628 | Abbosh et al. | Aug 2003 | A1 |
20030158950 | Sako | Aug 2003 | A1 |
20030167471 | Roth et al. | Sep 2003 | A1 |
20030177075 | Burke | Sep 2003 | A1 |
20030177498 | Ellis et al. | Sep 2003 | A1 |
20030192060 | Levy | Oct 2003 | A1 |
20030221191 | Khusheim | Nov 2003 | A1 |
20030228130 | Tanikawa et al. | Dec 2003 | A1 |
20030234804 | Parker et al. | Dec 2003 | A1 |
20040019497 | Volk et al. | Jan 2004 | A1 |
20040046801 | Lin et al. | Mar 2004 | A1 |
20040070573 | Graham | Apr 2004 | A1 |
20040088328 | Cook et al. | May 2004 | A1 |
20040090463 | Celik et al. | May 2004 | A1 |
20040093262 | Weston et al. | May 2004 | A1 |
20040133909 | Ma | Jul 2004 | A1 |
20040139401 | Unbedacht et al. | Jul 2004 | A1 |
20040161151 | Iwayama et al. | Aug 2004 | A1 |
20040168184 | Steenkamp et al. | Aug 2004 | A1 |
20040193421 | Blass | Sep 2004 | A1 |
20040252120 | Hunleth et al. | Dec 2004 | A1 |
20040254883 | Kondrk et al. | Dec 2004 | A1 |
20040254958 | Volk | Dec 2004 | A1 |
20040267715 | Polson et al. | Dec 2004 | A1 |
20050012599 | Dematteo | Jan 2005 | A1 |
20050071761 | Kontio | Mar 2005 | A1 |
20050071785 | Chadzelek et al. | Mar 2005 | A1 |
20050076363 | Dukes et al. | Apr 2005 | A1 |
20050091254 | Stabb et al. | Apr 2005 | A1 |
20050091597 | Ackley | Apr 2005 | A1 |
20050134625 | Kubota | Jun 2005 | A1 |
20050162398 | Eliasson et al. | Jul 2005 | A1 |
20050162402 | Watanachote | Jul 2005 | A1 |
20050186988 | Lim et al. | Aug 2005 | A1 |
20050190059 | Wehrenberg | Sep 2005 | A1 |
20050223335 | Ichikawa | Oct 2005 | A1 |
20050235316 | Ahmad-taylor | Oct 2005 | A1 |
20050257166 | Tu | Nov 2005 | A1 |
20050283358 | Stephanick et al. | Dec 2005 | A1 |
20060017692 | Wehrenberg et al. | Jan 2006 | A1 |
20060020904 | Aaltonen et al. | Jan 2006 | A1 |
20060026521 | Hotelling et al. | Feb 2006 | A1 |
20060029374 | Park | Feb 2006 | A1 |
20060031872 | Hsiao et al. | Feb 2006 | A1 |
20060033724 | Chaudhri et al. | Feb 2006 | A1 |
20060053449 | Gutta | Mar 2006 | A1 |
20060069998 | Artman et al. | Mar 2006 | A1 |
20060071905 | Varanda | Apr 2006 | A1 |
20060080352 | Boubez et al. | Apr 2006 | A1 |
20060097991 | Hotelling et al. | May 2006 | A1 |
20060107304 | Cleron et al. | May 2006 | A1 |
20060112346 | Miksovsky et al. | May 2006 | A1 |
20060112352 | Tseng et al. | May 2006 | A1 |
20060117267 | Koralski et al. | Jun 2006 | A1 |
20060120624 | Jojic et al. | Jun 2006 | A1 |
20060195479 | Spiegelman et al. | Aug 2006 | A1 |
20060195512 | Rogers et al. | Aug 2006 | A1 |
20060197753 | Hotelling | Sep 2006 | A1 |
20060224987 | Caffarelli | Oct 2006 | A1 |
20060236847 | Withop | Oct 2006 | A1 |
20060248113 | Leffert et al. | Nov 2006 | A1 |
20060265637 | Marriott et al. | Nov 2006 | A1 |
20060271968 | Zellner | Nov 2006 | A1 |
20060282856 | Errico et al. | Dec 2006 | A1 |
20060288848 | Gould et al. | Dec 2006 | A1 |
20060294545 | Morris et al. | Dec 2006 | A1 |
20070005569 | Hurst-hiller et al. | Jan 2007 | A1 |
20070009229 | Liu | Jan 2007 | A1 |
20070011702 | Vaysman | Jan 2007 | A1 |
20070024594 | Sakata et al. | Feb 2007 | A1 |
20070028267 | Ostojic et al. | Feb 2007 | A1 |
20070038957 | White | Feb 2007 | A1 |
20070073596 | Alexander et al. | Mar 2007 | A1 |
20070092204 | Wagner et al. | Apr 2007 | A1 |
20070150802 | Wan et al. | Jun 2007 | A1 |
20070154163 | Cordray | Jul 2007 | A1 |
20070157220 | Cordray et al. | Jul 2007 | A1 |
20070157249 | Cordray et al. | Jul 2007 | A1 |
20070168413 | Barletta et al. | Jul 2007 | A1 |
20070186254 | Tsutsui et al. | Aug 2007 | A1 |
20070199035 | Schwartz et al. | Aug 2007 | A1 |
20070204057 | Shaver et al. | Aug 2007 | A1 |
20070229465 | Sakai et al. | Oct 2007 | A1 |
20070233880 | Nieh et al. | Oct 2007 | A1 |
20070244902 | Seide et al. | Oct 2007 | A1 |
20070248317 | Bahn | Oct 2007 | A1 |
20080046928 | Poling et al. | Feb 2008 | A1 |
20080059884 | Ellis et al. | Mar 2008 | A1 |
20080065989 | Conroy et al. | Mar 2008 | A1 |
20080066010 | Brodersen et al. | Mar 2008 | A1 |
20080077562 | Schleppe | Mar 2008 | A1 |
20080092168 | Logan et al. | Apr 2008 | A1 |
20080092173 | Shannon et al. | Apr 2008 | A1 |
20080111822 | Horowitz et al. | May 2008 | A1 |
20080120668 | Yau | May 2008 | A1 |
20080127281 | Van et al. | May 2008 | A1 |
20080155475 | Duhig | Jun 2008 | A1 |
20080189740 | Carpenter et al. | Aug 2008 | A1 |
20080189742 | Ellis et al. | Aug 2008 | A1 |
20080208844 | Jenkins | Aug 2008 | A1 |
20080216020 | Plummer | Sep 2008 | A1 |
20080222677 | Woo et al. | Sep 2008 | A1 |
20080235331 | Melamed | Sep 2008 | A1 |
20080235588 | Gonze et al. | Sep 2008 | A1 |
20080243817 | Chan et al. | Oct 2008 | A1 |
20080250312 | Curtis | Oct 2008 | A1 |
20080260252 | Borgaonkar et al. | Oct 2008 | A1 |
20080270886 | Gossweiler et al. | Oct 2008 | A1 |
20080276279 | Gossweiler et al. | Nov 2008 | A1 |
20080301260 | Goldeen et al. | Dec 2008 | A1 |
20080301579 | Jonasson et al. | Dec 2008 | A1 |
20080301734 | Goldeen et al. | Dec 2008 | A1 |
20080307343 | Robert et al. | Dec 2008 | A1 |
20080307458 | Kim et al. | Dec 2008 | A1 |
20080307459 | Migos | Dec 2008 | A1 |
20080320391 | Lemay et al. | Dec 2008 | A1 |
20080320532 | Lee | Dec 2008 | A1 |
20090055385 | Jeon et al. | Feb 2009 | A1 |
20090063521 | Bull et al. | Mar 2009 | A1 |
20090063975 | Rottler et al. | Mar 2009 | A1 |
20090089837 | Momosaki | Apr 2009 | A1 |
20090094662 | Chang et al. | Apr 2009 | A1 |
20090119754 | Schubert | May 2009 | A1 |
20090158325 | Johnson | Jun 2009 | A1 |
20090158326 | Hunt et al. | Jun 2009 | A1 |
20090161868 | Chaudhry | Jun 2009 | A1 |
20090164944 | Webster et al. | Jun 2009 | A1 |
20090165054 | Rudolph | Jun 2009 | A1 |
20090174679 | Westerman | Jul 2009 | A1 |
20090177301 | Hayes | Jul 2009 | A1 |
20090177989 | Ma et al. | Jul 2009 | A1 |
20090178083 | Carr et al. | Jul 2009 | A1 |
20090228491 | Malik | Sep 2009 | A1 |
20090228807 | Lemay | Sep 2009 | A1 |
20090239587 | Negron et al. | Sep 2009 | A1 |
20090256807 | Nurmi | Oct 2009 | A1 |
20090259957 | Slocum et al. | Oct 2009 | A1 |
20090278916 | Ito | Nov 2009 | A1 |
20090282444 | Laksono et al. | Nov 2009 | A1 |
20090288079 | Zuber et al. | Nov 2009 | A1 |
20090313100 | Ingleshwar | Dec 2009 | A1 |
20090322962 | Weeks | Dec 2009 | A1 |
20090327952 | Karas et al. | Dec 2009 | A1 |
20100009629 | Jung et al. | Jan 2010 | A1 |
20100017713 | Igarashi | Jan 2010 | A1 |
20100031162 | Wiser | Feb 2010 | A1 |
20100053220 | Ozawa et al. | Mar 2010 | A1 |
20100053432 | Cheng et al. | Mar 2010 | A1 |
20100057696 | Miyazawa et al. | Mar 2010 | A1 |
20100064313 | Beyabani | Mar 2010 | A1 |
20100080163 | Krishnamoorthi et al. | Apr 2010 | A1 |
20100083181 | Matsushima et al. | Apr 2010 | A1 |
20100095240 | Shiplacoff et al. | Apr 2010 | A1 |
20100100899 | Bradbury | Apr 2010 | A1 |
20100104269 | Prestenback et al. | Apr 2010 | A1 |
20100115592 | Belz et al. | May 2010 | A1 |
20100121714 | Bryant et al. | May 2010 | A1 |
20100146442 | Nagasaka et al. | Jun 2010 | A1 |
20100153881 | Dinn | Jun 2010 | A1 |
20100153999 | Yates | Jun 2010 | A1 |
20100159898 | Krzyzanowski et al. | Jun 2010 | A1 |
20100162172 | Aroner | Jun 2010 | A1 |
20100194998 | Lee et al. | Aug 2010 | A1 |
20100198822 | Glennon et al. | Aug 2010 | A1 |
20100205628 | Davis et al. | Aug 2010 | A1 |
20100211636 | Starkenburg et al. | Aug 2010 | A1 |
20100223646 | Goldeen et al. | Sep 2010 | A1 |
20100229194 | Blanchard et al. | Sep 2010 | A1 |
20100235744 | Schultz et al. | Sep 2010 | A1 |
20100251304 | Donoghue et al. | Sep 2010 | A1 |
20100257005 | Phenner et al. | Oct 2010 | A1 |
20100269145 | Ingrassia et al. | Oct 2010 | A1 |
20100275143 | Fu et al. | Oct 2010 | A1 |
20100277337 | Brodersen et al. | Nov 2010 | A1 |
20100293190 | Kaiser et al. | Nov 2010 | A1 |
20100293586 | Simoes et al. | Nov 2010 | A1 |
20100299606 | Morita | Nov 2010 | A1 |
20100312824 | Smith et al. | Dec 2010 | A1 |
20100325660 | Holden | Dec 2010 | A1 |
20100333142 | Busse et al. | Dec 2010 | A1 |
20100333143 | Civanlar et al. | Dec 2010 | A1 |
20110004831 | Steinberg et al. | Jan 2011 | A1 |
20110033168 | Iyer | Feb 2011 | A1 |
20110047513 | Onogi et al. | Feb 2011 | A1 |
20110052146 | Murthy et al. | Mar 2011 | A1 |
20110054649 | Sarkis et al. | Mar 2011 | A1 |
20110055762 | Jung et al. | Mar 2011 | A1 |
20110055870 | Yum et al. | Mar 2011 | A1 |
20110071977 | Nakajima et al. | Mar 2011 | A1 |
20110078739 | Grad | Mar 2011 | A1 |
20110080935 | Kim et al. | Apr 2011 | A1 |
20110087992 | Wang et al. | Apr 2011 | A1 |
20110090402 | Huntington et al. | Apr 2011 | A1 |
20110093415 | Rhee et al. | Apr 2011 | A1 |
20110099519 | Ma | Apr 2011 | A1 |
20110119715 | Chang et al. | May 2011 | A1 |
20110131607 | Thomas et al. | Jun 2011 | A1 |
20110154194 | Mathai et al. | Jun 2011 | A1 |
20110154305 | Leroux et al. | Jun 2011 | A1 |
20110157029 | Tseng | Jun 2011 | A1 |
20110162022 | Xia | Jun 2011 | A1 |
20110163971 | Wagner et al. | Jul 2011 | A1 |
20110167339 | Lemay | Jul 2011 | A1 |
20110175930 | Hwang et al. | Jul 2011 | A1 |
20110179388 | Fleizach et al. | Jul 2011 | A1 |
20110179453 | Poniatowski | Jul 2011 | A1 |
20110197153 | King et al. | Aug 2011 | A1 |
20110209177 | Sela et al. | Aug 2011 | A1 |
20110218948 | De et al. | Sep 2011 | A1 |
20110231280 | Farah | Sep 2011 | A1 |
20110231823 | Fryc et al. | Sep 2011 | A1 |
20110231872 | Gharachorloo et al. | Sep 2011 | A1 |
20110231878 | Hunter et al. | Sep 2011 | A1 |
20110246332 | Alcodray et al. | Oct 2011 | A1 |
20110281517 | Ukkadam | Nov 2011 | A1 |
20110283304 | Roberts et al. | Nov 2011 | A1 |
20110283333 | Ukkadam | Nov 2011 | A1 |
20110289064 | Lebeau et al. | Nov 2011 | A1 |
20110289317 | Darapu et al. | Nov 2011 | A1 |
20110289419 | Yu et al. | Nov 2011 | A1 |
20110289421 | Jordan et al. | Nov 2011 | A1 |
20110289452 | Jordan et al. | Nov 2011 | A1 |
20110289531 | Moonka et al. | Nov 2011 | A1 |
20110289534 | Jordan et al. | Nov 2011 | A1 |
20110296351 | Ewing et al. | Dec 2011 | A1 |
20110302532 | Missig | Dec 2011 | A1 |
20110307631 | Park et al. | Dec 2011 | A1 |
20110312278 | Matsushita et al. | Dec 2011 | A1 |
20110321072 | Patterson et al. | Dec 2011 | A1 |
20120019674 | Ohnishi et al. | Jan 2012 | A1 |
20120036552 | Dare et al. | Feb 2012 | A1 |
20120042245 | Askey et al. | Feb 2012 | A1 |
20120042343 | Laligand et al. | Feb 2012 | A1 |
20120053887 | Nurmi | Mar 2012 | A1 |
20120054178 | Tran et al. | Mar 2012 | A1 |
20120054642 | Balsiger et al. | Mar 2012 | A1 |
20120054679 | Ma | Mar 2012 | A1 |
20120054797 | Skog et al. | Mar 2012 | A1 |
20120059910 | Cassidy | Mar 2012 | A1 |
20120060092 | Hill et al. | Mar 2012 | A1 |
20120064204 | Davila et al. | Mar 2012 | A1 |
20120084136 | Seth et al. | Apr 2012 | A1 |
20120093481 | Mcdowell et al. | Apr 2012 | A1 |
20120096011 | Kay et al. | Apr 2012 | A1 |
20120102573 | Spooner et al. | Apr 2012 | A1 |
20120105367 | Son et al. | May 2012 | A1 |
20120110616 | Kilar et al. | May 2012 | A1 |
20120110621 | Gossweiler, III | May 2012 | A1 |
20120114303 | Chung et al. | May 2012 | A1 |
20120117584 | Gordon | May 2012 | A1 |
20120124615 | Lee | May 2012 | A1 |
20120131615 | Kobayashi et al. | May 2012 | A1 |
20120139938 | Khedouri et al. | Jun 2012 | A1 |
20120141095 | Schwesinger | Jun 2012 | A1 |
20120144003 | Rosenbaum et al. | Jun 2012 | A1 |
20120158524 | Hintz et al. | Jun 2012 | A1 |
20120173991 | Roberts et al. | Jul 2012 | A1 |
20120174157 | Stinson et al. | Jul 2012 | A1 |
20120198020 | Parker et al. | Aug 2012 | A1 |
20120198336 | Novotny et al. | Aug 2012 | A1 |
20120210366 | Wong et al. | Aug 2012 | A1 |
20120215684 | Kidron | Aug 2012 | A1 |
20120216113 | Li | Aug 2012 | A1 |
20120216117 | Arriola et al. | Aug 2012 | A1 |
20120216296 | Kidron | Aug 2012 | A1 |
20120221498 | Kaszynski et al. | Aug 2012 | A1 |
20120222056 | Donoghue et al. | Aug 2012 | A1 |
20120233640 | Odryna et al. | Sep 2012 | A1 |
20120242704 | Bamford et al. | Sep 2012 | A1 |
20120260291 | Wood | Oct 2012 | A1 |
20120260293 | Young et al. | Oct 2012 | A1 |
20120262371 | Lee et al. | Oct 2012 | A1 |
20120262407 | Hinckley et al. | Oct 2012 | A1 |
20120266069 | Moshiri et al. | Oct 2012 | A1 |
20120272261 | Reynolds et al. | Oct 2012 | A1 |
20120284753 | Roberts et al. | Nov 2012 | A1 |
20120290933 | Rajaraman et al. | Nov 2012 | A1 |
20120291079 | Gordon et al. | Nov 2012 | A1 |
20120308143 | Bellegarda et al. | Dec 2012 | A1 |
20120311443 | Chaudhri et al. | Dec 2012 | A1 |
20120311638 | Reyna | Dec 2012 | A1 |
20120317482 | Barraclough et al. | Dec 2012 | A1 |
20120323938 | Skeen et al. | Dec 2012 | A1 |
20120324504 | Archer et al. | Dec 2012 | A1 |
20120327125 | Kutliroff et al. | Dec 2012 | A1 |
20130014150 | Seo et al. | Jan 2013 | A1 |
20130014159 | Wiser et al. | Jan 2013 | A1 |
20130021288 | Kaerkkaeinen et al. | Jan 2013 | A1 |
20130024895 | Yong et al. | Jan 2013 | A1 |
20130031585 | Itagaki et al. | Jan 2013 | A1 |
20130033643 | Kim et al. | Feb 2013 | A1 |
20130042271 | Yellin et al. | Feb 2013 | A1 |
20130061234 | Piira et al. | Mar 2013 | A1 |
20130061267 | Cansino et al. | Mar 2013 | A1 |
20130067366 | Almosnino | Mar 2013 | A1 |
20130073403 | Tuchman et al. | Mar 2013 | A1 |
20130080968 | Hanson | Mar 2013 | A1 |
20130083076 | Liu et al. | Apr 2013 | A1 |
20130097009 | Akadiri | Apr 2013 | A1 |
20130110978 | Gordon et al. | May 2013 | A1 |
20130124998 | Pendergast et al. | May 2013 | A1 |
20130132874 | He et al. | May 2013 | A1 |
20130132966 | Chanda et al. | May 2013 | A1 |
20130151300 | Le et al. | Jun 2013 | A1 |
20130173034 | Reimann et al. | Jul 2013 | A1 |
20130174193 | Yu et al. | Jul 2013 | A1 |
20130179812 | Bianrosa et al. | Jul 2013 | A1 |
20130179995 | Basile et al. | Jul 2013 | A1 |
20130198686 | Kawai et al. | Aug 2013 | A1 |
20130205312 | Huang | Aug 2013 | A1 |
20130212531 | Yoshida | Aug 2013 | A1 |
20130227482 | Thorsander et al. | Aug 2013 | A1 |
20130247105 | Jovanovski et al. | Sep 2013 | A1 |
20130262431 | Garner et al. | Oct 2013 | A1 |
20130262558 | Wood et al. | Oct 2013 | A1 |
20130262619 | Goodwin et al. | Oct 2013 | A1 |
20130262633 | Goodwin et al. | Oct 2013 | A1 |
20130263189 | Garner | Oct 2013 | A1 |
20130283154 | Sasakura | Oct 2013 | A1 |
20130283168 | Brown et al. | Oct 2013 | A1 |
20130283317 | Guntupalli | Oct 2013 | A1 |
20130283318 | Wannamaker | Oct 2013 | A1 |
20130285937 | Billings | Oct 2013 | A1 |
20130290233 | Ferren et al. | Oct 2013 | A1 |
20130290848 | Billings et al. | Oct 2013 | A1 |
20130291018 | Billings et al. | Oct 2013 | A1 |
20130291037 | Im et al. | Oct 2013 | A1 |
20130294755 | Arme et al. | Nov 2013 | A1 |
20130312044 | Itagaki | Nov 2013 | A1 |
20130326499 | Mowatt et al. | Dec 2013 | A1 |
20130326554 | Shkedi | Dec 2013 | A1 |
20130326561 | Pandey | Dec 2013 | A1 |
20130332838 | Naggar et al. | Dec 2013 | A1 |
20130332960 | Young et al. | Dec 2013 | A1 |
20130339877 | Skeen et al. | Dec 2013 | A1 |
20130340006 | Kwan | Dec 2013 | A1 |
20130346564 | Warrick et al. | Dec 2013 | A1 |
20130347044 | Lee et al. | Dec 2013 | A1 |
20140006635 | Braness et al. | Jan 2014 | A1 |
20140006795 | Han et al. | Jan 2014 | A1 |
20140006951 | Hunter | Jan 2014 | A1 |
20140012859 | Heilprin et al. | Jan 2014 | A1 |
20140013283 | Matas et al. | Jan 2014 | A1 |
20140020017 | Stern et al. | Jan 2014 | A1 |
20140024341 | Johan | Jan 2014 | A1 |
20140033245 | Barton et al. | Jan 2014 | A1 |
20140049692 | Sirpal et al. | Feb 2014 | A1 |
20140052683 | Kirkham et al. | Feb 2014 | A1 |
20140053116 | Smith et al. | Feb 2014 | A1 |
20140053195 | Sirpal et al. | Feb 2014 | A1 |
20140059605 | Sirpal et al. | Feb 2014 | A1 |
20140059615 | Sirpal et al. | Feb 2014 | A1 |
20140059625 | Dourado | Feb 2014 | A1 |
20140059635 | Sirpal et al. | Feb 2014 | A1 |
20140068654 | Marlow et al. | Mar 2014 | A1 |
20140071068 | Shih et al. | Mar 2014 | A1 |
20140074454 | Brown et al. | Mar 2014 | A1 |
20140075313 | Bachman et al. | Mar 2014 | A1 |
20140075316 | Li | Mar 2014 | A1 |
20140075394 | Nawle et al. | Mar 2014 | A1 |
20140075574 | Zheng et al. | Mar 2014 | A1 |
20140082497 | Chalouhi et al. | Mar 2014 | A1 |
20140082660 | Zhang | Mar 2014 | A1 |
20140088952 | Fife et al. | Mar 2014 | A1 |
20140089816 | Dipersia et al. | Mar 2014 | A1 |
20140098102 | Raffle et al. | Apr 2014 | A1 |
20140101706 | Kardatzke | Apr 2014 | A1 |
20140104646 | Nishiyama | Apr 2014 | A1 |
20140109204 | Papillon et al. | Apr 2014 | A1 |
20140111416 | Sugiura | Apr 2014 | A1 |
20140115636 | Stuckman | Apr 2014 | A1 |
20140123006 | Chen et al. | May 2014 | A1 |
20140129232 | Jones et al. | May 2014 | A1 |
20140130097 | Londero | May 2014 | A1 |
20140136946 | Matas | May 2014 | A1 |
20140137029 | Stephenson et al. | May 2014 | A1 |
20140137030 | Matas | May 2014 | A1 |
20140143260 | Simonson et al. | May 2014 | A1 |
20140143683 | Underwood et al. | May 2014 | A1 |
20140156792 | Roberts et al. | Jun 2014 | A1 |
20140157204 | Roberts et al. | Jun 2014 | A1 |
20140157329 | Roberts et al. | Jun 2014 | A1 |
20140164966 | Kim et al. | Jun 2014 | A1 |
20140168071 | Ahmed et al. | Jun 2014 | A1 |
20140171153 | Kienzle et al. | Jun 2014 | A1 |
20140172622 | Baronshin | Jun 2014 | A1 |
20140172953 | Blanksteen | Jun 2014 | A1 |
20140173660 | Correa et al. | Jun 2014 | A1 |
20140184471 | Martynov et al. | Jul 2014 | A1 |
20140189523 | Shuttleworth et al. | Jul 2014 | A1 |
20140189574 | Stallings et al. | Jul 2014 | A1 |
20140189606 | Shuttleworth et al. | Jul 2014 | A1 |
20140196064 | Kennedy et al. | Jul 2014 | A1 |
20140196069 | Ahmed et al. | Jul 2014 | A1 |
20140208268 | Jimenez | Jul 2014 | A1 |
20140208360 | Kardatzke | Jul 2014 | A1 |
20140219637 | Mcintosh et al. | Aug 2014 | A1 |
20140224867 | Werner et al. | Aug 2014 | A1 |
20140244751 | Tseng | Aug 2014 | A1 |
20140245148 | Silva et al. | Aug 2014 | A1 |
20140245186 | Tseng | Aug 2014 | A1 |
20140245222 | Kovacevic et al. | Aug 2014 | A1 |
20140250465 | Mulholland et al. | Sep 2014 | A1 |
20140250479 | Lee et al. | Sep 2014 | A1 |
20140253463 | Hicks | Sep 2014 | A1 |
20140259074 | Ansari et al. | Sep 2014 | A1 |
20140278072 | Fino et al. | Sep 2014 | A1 |
20140278940 | Wade | Sep 2014 | A1 |
20140280728 | Szerlip Joyce et al. | Sep 2014 | A1 |
20140282208 | Chaudhri | Sep 2014 | A1 |
20140282636 | Petander et al. | Sep 2014 | A1 |
20140282677 | Mantell et al. | Sep 2014 | A1 |
20140288686 | Sant et al. | Sep 2014 | A1 |
20140289226 | English et al. | Sep 2014 | A1 |
20140289751 | Hsu et al. | Sep 2014 | A1 |
20140310742 | Kim | Oct 2014 | A1 |
20140317653 | Mlodzinski | Oct 2014 | A1 |
20140325357 | Sant et al. | Oct 2014 | A1 |
20140333530 | Agnetta et al. | Nov 2014 | A1 |
20140337607 | Peterson et al. | Nov 2014 | A1 |
20140340358 | Martinoli | Nov 2014 | A1 |
20140341109 | Cartmell et al. | Nov 2014 | A1 |
20140344247 | Procopio et al. | Nov 2014 | A1 |
20140344291 | Simonson et al. | Nov 2014 | A9 |
20140344294 | Skeen et al. | Nov 2014 | A1 |
20140351691 | Neil et al. | Nov 2014 | A1 |
20140359598 | Oliver et al. | Dec 2014 | A1 |
20140365479 | Lyons et al. | Dec 2014 | A1 |
20140365481 | Novosel et al. | Dec 2014 | A1 |
20140365604 | Lewis et al. | Dec 2014 | A1 |
20140365919 | Shaw et al. | Dec 2014 | A1 |
20140366040 | Parker et al. | Dec 2014 | A1 |
20140366047 | Thomas et al. | Dec 2014 | A1 |
20150020127 | Doshi et al. | Jan 2015 | A1 |
20150039685 | Lewis et al. | Feb 2015 | A1 |
20150046866 | Shimadate | Feb 2015 | A1 |
20150067582 | Donnelly et al. | Mar 2015 | A1 |
20150067724 | Johnson et al. | Mar 2015 | A1 |
20150074522 | Harned et al. | Mar 2015 | A1 |
20150074552 | Chai et al. | Mar 2015 | A1 |
20150074603 | Abe et al. | Mar 2015 | A1 |
20150082187 | Wallters et al. | Mar 2015 | A1 |
20150095460 | Berger et al. | Apr 2015 | A1 |
20150095845 | Chun et al. | Apr 2015 | A1 |
20150113429 | Edwards et al. | Apr 2015 | A1 |
20150121408 | Jacoby et al. | Apr 2015 | A1 |
20150134653 | Bayer et al. | May 2015 | A1 |
20150150049 | White | May 2015 | A1 |
20150150066 | Park et al. | May 2015 | A1 |
20150153571 | Ballard et al. | Jun 2015 | A1 |
20150161251 | Ramanarayanan et al. | Jun 2015 | A1 |
20150169705 | Korbecki et al. | Jun 2015 | A1 |
20150169975 | Kienzle et al. | Jun 2015 | A1 |
20150186002 | Suzuki et al. | Jul 2015 | A1 |
20150189347 | Oztaskent et al. | Jul 2015 | A1 |
20150193192 | Kidron | Jul 2015 | A1 |
20150195624 | Gossweiler, III | Jul 2015 | A1 |
20150205591 | Jitkoff et al. | Jul 2015 | A1 |
20150237389 | Grouf et al. | Aug 2015 | A1 |
20150296072 | Zhou et al. | Oct 2015 | A1 |
20150301729 | Wang et al. | Oct 2015 | A1 |
20150309670 | Wheeler et al. | Oct 2015 | A1 |
20150312603 | Singh et al. | Oct 2015 | A1 |
20150317343 | Cselle et al. | Nov 2015 | A1 |
20150334464 | Shin | Nov 2015 | A1 |
20150346975 | Lee et al. | Dec 2015 | A1 |
20150350741 | Rajaraman et al. | Dec 2015 | A1 |
20150355816 | Shim | Dec 2015 | A1 |
20150363035 | Hinckley et al. | Dec 2015 | A1 |
20150365729 | Kaya et al. | Dec 2015 | A1 |
20150370435 | Kirmse et al. | Dec 2015 | A1 |
20150370455 | Van Os et al. | Dec 2015 | A1 |
20150370920 | Van Os et al. | Dec 2015 | A1 |
20150373107 | Chan et al. | Dec 2015 | A1 |
20150382047 | Van Os et al. | Dec 2015 | A1 |
20150382066 | Heeter et al. | Dec 2015 | A1 |
20160004425 | Yoon et al. | Jan 2016 | A1 |
20160004772 | Kim et al. | Jan 2016 | A1 |
20160004773 | Jannink et al. | Jan 2016 | A1 |
20160005013 | Perry | Jan 2016 | A1 |
20160014461 | Leech et al. | Jan 2016 | A1 |
20160021412 | Zito, Jr. | Jan 2016 | A1 |
20160035119 | Lee et al. | Feb 2016 | A1 |
20160036897 | Kim et al. | Feb 2016 | A1 |
20160041702 | Wang | Feb 2016 | A1 |
20160043962 | Kim et al. | Feb 2016 | A1 |
20160066004 | Lieu et al. | Mar 2016 | A1 |
20160066021 | Thomas et al. | Mar 2016 | A1 |
20160066040 | Webster et al. | Mar 2016 | A1 |
20160066049 | Mountain | Mar 2016 | A1 |
20160078526 | Nations et al. | Mar 2016 | A1 |
20160080815 | Ruffini et al. | Mar 2016 | A1 |
20160092042 | Yenigalla et al. | Mar 2016 | A1 |
20160092559 | Lind et al. | Mar 2016 | A1 |
20160096113 | Decoufle | Apr 2016 | A1 |
20160099991 | Lonkar et al. | Apr 2016 | A1 |
20160105540 | Kwon et al. | Apr 2016 | A1 |
20160110064 | Shapira | Apr 2016 | A1 |
20160127783 | Garcia Navarro | May 2016 | A1 |
20160127789 | Roberts et al. | May 2016 | A1 |
20160133230 | Daniels et al. | May 2016 | A1 |
20160142783 | Bagga et al. | May 2016 | A1 |
20160165307 | Lavender et al. | Jun 2016 | A1 |
20160188902 | Jin | Jun 2016 | A1 |
20160191639 | Dai et al. | Jun 2016 | A1 |
20160192017 | Tirpak | Jun 2016 | A1 |
20160231885 | Lee et al. | Aug 2016 | A1 |
20160249105 | Carney Landow | Aug 2016 | A1 |
20160255379 | Langan et al. | Sep 2016 | A1 |
20160277785 | Newman et al. | Sep 2016 | A1 |
20160345070 | Beeson et al. | Nov 2016 | A1 |
20160357305 | Wells et al. | Dec 2016 | A1 |
20160357352 | Matas et al. | Dec 2016 | A1 |
20160357355 | Carrigan et al. | Dec 2016 | A1 |
20160357366 | Migos et al. | Dec 2016 | A1 |
20160370982 | Penha et al. | Dec 2016 | A1 |
20170010846 | Bernstein et al. | Jan 2017 | A1 |
20170010847 | Bernstein et al. | Jan 2017 | A1 |
20170013295 | Wertheimer et al. | Jan 2017 | A1 |
20170046039 | Karunamuni et al. | Feb 2017 | A1 |
20170046339 | Bhat et al. | Feb 2017 | A1 |
20170068402 | Lochhead et al. | Mar 2017 | A1 |
20170068511 | Brown et al. | Mar 2017 | A1 |
20170094360 | Keighran et al. | Mar 2017 | A1 |
20170097969 | Stein et al. | Apr 2017 | A1 |
20170115867 | Bargmann | Apr 2017 | A1 |
20170124594 | Naiga et al. | May 2017 | A1 |
20170132659 | Dirks et al. | May 2017 | A1 |
20170132829 | Blas et al. | May 2017 | A1 |
20170134778 | Christie et al. | May 2017 | A1 |
20170140748 | Roberts et al. | May 2017 | A1 |
20170188116 | Major et al. | Jun 2017 | A1 |
20170192642 | Fishman et al. | Jul 2017 | A1 |
20170195736 | Chai et al. | Jul 2017 | A1 |
20170201850 | Raleigh et al. | Jul 2017 | A1 |
20170214975 | Schmidt et al. | Jul 2017 | A1 |
20170220228 | Sang et al. | Aug 2017 | A1 |
20170242913 | Tijssen et al. | Aug 2017 | A1 |
20170245017 | Chaudhri et al. | Aug 2017 | A1 |
20170251257 | Obrien | Aug 2017 | A1 |
20170300151 | Lue-Sang et al. | Oct 2017 | A1 |
20170339443 | Lue-Sang et al. | Nov 2017 | A1 |
20170344553 | Evnine et al. | Nov 2017 | A1 |
20170345040 | Pimack et al. | Nov 2017 | A1 |
20170353603 | Grunewald et al. | Dec 2017 | A1 |
20170357387 | Clarke | Dec 2017 | A1 |
20170359722 | Folse et al. | Dec 2017 | A1 |
20170364246 | Van Os et al. | Dec 2017 | A1 |
20180011580 | Lebowitz et al. | Jan 2018 | A1 |
20180041814 | Christie et al. | Feb 2018 | A1 |
20180053094 | Patel et al. | Feb 2018 | A1 |
20180063591 | Newman et al. | Mar 2018 | A1 |
20180070121 | Zimmerman et al. | Mar 2018 | A1 |
20180070138 | Chai et al. | Mar 2018 | A1 |
20180107353 | Lee | Apr 2018 | A1 |
20180113579 | Johnston et al. | Apr 2018 | A1 |
20180130097 | Tran et al. | May 2018 | A1 |
20180136800 | Johnston et al. | May 2018 | A1 |
20180146377 | Folse et al. | May 2018 | A1 |
20180189076 | Liston et al. | Jul 2018 | A1 |
20180253900 | Finding et al. | Sep 2018 | A1 |
20180275855 | Van Os et al. | Sep 2018 | A1 |
20180293210 | Xue et al. | Oct 2018 | A1 |
20180293771 | Piemonte et al. | Oct 2018 | A1 |
20180295403 | Christie et al. | Oct 2018 | A1 |
20180302680 | Cormican | Oct 2018 | A1 |
20180343497 | Brown et al. | Nov 2018 | A1 |
20180349509 | Abou Mahmoud et al. | Dec 2018 | A1 |
20180367834 | Carpenter et al. | Dec 2018 | A1 |
20190012048 | Johnston et al. | Jan 2019 | A1 |
20190020925 | Christie et al. | Jan 2019 | A1 |
20190028769 | Jeon et al. | Jan 2019 | A1 |
20190045271 | Christie et al. | Feb 2019 | A1 |
20190052744 | Jung et al. | Feb 2019 | A1 |
20190058921 | Christie et al. | Feb 2019 | A1 |
20190066672 | Wood et al. | Feb 2019 | A1 |
20190073104 | Wang | Mar 2019 | A1 |
20190073680 | Knox | Mar 2019 | A1 |
20190129588 | Johnston et al. | May 2019 | A1 |
20190138163 | Howland et al. | May 2019 | A1 |
20190141399 | Auxer et al. | May 2019 | A1 |
20190258373 | Davydov et al. | Aug 2019 | A1 |
20190272853 | Moore | Sep 2019 | A1 |
20190324614 | Brillon et al. | Oct 2019 | A1 |
20190342616 | Domm et al. | Nov 2019 | A1 |
20190354264 | Van Os et al. | Nov 2019 | A1 |
20190373320 | Balsamo | Dec 2019 | A1 |
20200034792 | Rogers et al. | Jan 2020 | A1 |
20200068274 | Aher et al. | Feb 2020 | A1 |
20200084488 | Christie et al. | Mar 2020 | A1 |
20200099985 | Keighran et al. | Mar 2020 | A1 |
20200133631 | Christie et al. | Apr 2020 | A1 |
20200137175 | Ganci et al. | Apr 2020 | A1 |
20200257415 | Clarke | Aug 2020 | A1 |
20200272666 | Van Os et al. | Aug 2020 | A1 |
20200301567 | Park et al. | Sep 2020 | A1 |
20200301575 | Lindholm et al. | Sep 2020 | A1 |
20200304863 | Domm et al. | Sep 2020 | A1 |
20200304876 | Cielak et al. | Sep 2020 | A1 |
20200304879 | Ellingford | Sep 2020 | A1 |
20200304880 | Diaz Delgado et al. | Sep 2020 | A1 |
20200363934 | Van Os et al. | Nov 2020 | A1 |
20200374595 | Yang et al. | Nov 2020 | A1 |
20200380029 | Chen | Dec 2020 | A1 |
20200382845 | Payne | Dec 2020 | A1 |
20200396507 | Balsamo | Dec 2020 | A1 |
20210021903 | Christie et al. | Jan 2021 | A1 |
20210168424 | Sharma | Jun 2021 | A1 |
20210181901 | Johnston et al. | Jun 2021 | A1 |
20210195277 | Thurlow et al. | Jun 2021 | A1 |
20210286454 | Beaumier et al. | Sep 2021 | A1 |
20210306711 | Ellingford et al. | Sep 2021 | A1 |
20210337280 | Diaz Delgado et al. | Oct 2021 | A1 |
20210345004 | Christie et al. | Nov 2021 | A1 |
20210365134 | Beaumier et al. | Nov 2021 | A1 |
20210397306 | Rajam et al. | Dec 2021 | A1 |
20210406995 | Peters et al. | Dec 2021 | A1 |
20220132215 | Venugopal et al. | Apr 2022 | A1 |
20220179526 | Schöberl | Jun 2022 | A1 |
20220244824 | Cielak | Aug 2022 | A1 |
20220321940 | Christie et al. | Oct 2022 | A1 |
20220329891 | Christie et al. | Oct 2022 | A1 |
20220337914 | Christie et al. | Oct 2022 | A1 |
20220360858 | Christie et al. | Nov 2022 | A1 |
20220413796 | Christie et al. | Dec 2022 | A1 |
20230022781 | Lindholm et al. | Jan 2023 | A1 |
20230033604 | Diaz Delgado et al. | Feb 2023 | A1 |
20230096458 | Van Os et al. | Mar 2023 | A1 |
20230127228 | Clarke | Apr 2023 | A1 |
20230300415 | Balsamo | Sep 2023 | A1 |
20230328327 | Cielak et al. | Oct 2023 | A1 |
20240037144 | Chen | Feb 2024 | A1 |
20240089550 | Ellingford et al. | Mar 2024 | A1 |
20240089553 | Payne | Mar 2024 | A1 |
Number | Date | Country |
---|---|---|
2009255409 | Jul 2012 | AU |
2016100476 | May 2016 | AU |
2017101431 | Nov 2017 | AU |
2018100810 | Jul 2018 | AU |
1391765 | Jan 2003 | CN |
1985277 | Jun 2007 | CN |
1985327 | Jun 2007 | CN |
101160932 | Apr 2008 | CN |
101228570 | Jul 2008 | CN |
101317149 | Dec 2008 | CN |
101370104 | Feb 2009 | CN |
101405679 | Apr 2009 | CN |
101436110 | May 2009 | CN |
101465993 | Jun 2009 | CN |
101529437 | Sep 2009 | CN |
101641662 | Feb 2010 | CN |
101699505 | Apr 2010 | CN |
101706704 | May 2010 | CN |
101719125 | Jun 2010 | CN |
102098537 | Jun 2011 | CN |
102103460 | Jun 2011 | CN |
102187338 | Sep 2011 | CN |
102265586 | Nov 2011 | CN |
102325144 | Jan 2012 | CN |
102819715 | Dec 2012 | CN |
102859484 | Jan 2013 | CN |
102880404 | Jan 2013 | CN |
102890615 | Jan 2013 | CN |
102955653 | Mar 2013 | CN |
102981695 | Mar 2013 | CN |
103177738 | Jun 2013 | CN |
103399967 | Nov 2013 | CN |
103546816 | Jan 2014 | CN |
103562848 | Feb 2014 | CN |
103562947 | Feb 2014 | CN |
103620531 | Mar 2014 | CN |
103620639 | Mar 2014 | CN |
103686418 | Mar 2014 | CN |
103985045 | Aug 2014 | CN |
103999017 | Aug 2014 | CN |
104508618 | Apr 2015 | CN |
104822098 | Aug 2015 | CN |
105190590 | Dec 2015 | CN |
105247526 | Jan 2016 | CN |
105264479 | Jan 2016 | CN |
105303372 | Feb 2016 | CN |
105308634 | Feb 2016 | CN |
105308923 | Feb 2016 | CN |
105336350 | Feb 2016 | CN |
105657554 | Jun 2016 | CN |
105812849 | Jul 2016 | CN |
105828098 | Aug 2016 | CN |
105955520 | Sep 2016 | CN |
105955607 | Sep 2016 | CN |
105989085 | Oct 2016 | CN |
105992068 | Oct 2016 | CN |
106101982 | Nov 2016 | CN |
108292190 | Jul 2018 | CN |
109313651 | Feb 2019 | CN |
202016003233 | Aug 2016 | DE |
0608708 | Aug 1994 | EP |
0624853 | Nov 1994 | EP |
2386984 | Nov 2011 | EP |
2453667 | May 2012 | EP |
2535844 | Dec 2012 | EP |
2574089 | Mar 2013 | EP |
2605203 | Jun 2013 | EP |
2642402 | Sep 2013 | EP |
2672703 | Dec 2013 | EP |
2679017 | Jan 2014 | EP |
2725531 | Apr 2014 | EP |
2000-163031 | Jun 2000 | JP |
2001-197445 | Jul 2001 | JP |
2002-027381 | Jan 2002 | JP |
2002-342033 | Nov 2002 | JP |
2003-099452 | Apr 2003 | JP |
2003534737 | Nov 2003 | JP |
2004-062237 | Feb 2004 | JP |
2006-031219 | Feb 2006 | JP |
2007-124465 | May 2007 | JP |
2007-512640 | May 2007 | JP |
2007-140910 | Jun 2007 | JP |
2007-294068 | Nov 2007 | JP |
2008-71112 | Mar 2008 | JP |
2008-135911 | Jun 2008 | JP |
2009-60328 | Mar 2009 | JP |
2009-206957 | Sep 2009 | JP |
2009-260947 | Nov 2009 | JP |
2010-28437 | Feb 2010 | JP |
2010-509684 | Mar 2010 | JP |
2010-114733 | May 2010 | JP |
2011-512701 | Apr 2011 | JP |
2011-123750 | Jun 2011 | JP |
2011-154455 | Aug 2011 | JP |
2011-182146 | Sep 2011 | JP |
2011-205562 | Oct 2011 | JP |
2011-257930 | Dec 2011 | JP |
2012-095123 | May 2012 | JP |
2012-123685 | Jun 2012 | JP |
2012-208622 | Oct 2012 | JP |
2013-008369 | Jan 2013 | JP |
2013-12021 | Jan 2013 | JP |
2013-223150 | Oct 2013 | JP |
2013-235523 | Nov 2013 | JP |
2014-81740 | May 2014 | JP |
2014-102660 | Jun 2014 | JP |
2015-050655 | Mar 2015 | JP |
2015-70404 | Apr 2015 | JP |
10-2001-0005939 | Jan 2001 | KR |
10-2002-0010151 | Feb 2002 | KR |
10-2007-0114329 | Dec 2007 | KR |
10-2010-0039194 | Apr 2010 | KR |
10-2011-0036408 | Apr 2011 | KR |
10-2011-0061811 | Jun 2011 | KR |
10-2012-0076682 | Jul 2012 | KR |
10-2012-0124445 | Nov 2012 | KR |
10-2013-0014712 | Feb 2013 | KR |
10-2013-0058034 | Jun 2013 | KR |
10-2013-0137969 | Dec 2013 | KR |
10-2014-0041939 | Apr 2014 | KR |
10-2019-0033658 | Mar 2019 | KR |
10-2022-0041231 | Mar 2022 | KR |
200622893 | Jul 2006 | TW |
200719204 | May 2007 | TW |
201337717 | Sep 2013 | TW |
201349049 | Dec 2013 | TW |
201351261 | Dec 2013 | TW |
1994009438 | Apr 1994 | WO |
1999040728 | Aug 1999 | WO |
2004063862 | Jul 2004 | WO |
2004102285 | Nov 2004 | WO |
2005050652 | Jun 2005 | WO |
2005109345 | Nov 2005 | WO |
2007078623 | Jul 2007 | WO |
2008005135 | Jan 2008 | WO |
2008060486 | May 2008 | WO |
2009016607 | Feb 2009 | WO |
2009039786 | Apr 2009 | WO |
2009148781 | Dec 2009 | WO |
2010022570 | Mar 2010 | WO |
2010025168 | Mar 2010 | WO |
2010118690 | Oct 2010 | WO |
2011095693 | Aug 2011 | WO |
2011158475 | Dec 2011 | WO |
2012012446 | Jan 2012 | WO |
2012061760 | May 2012 | WO |
2012088665 | Jul 2012 | WO |
2013000741 | Jan 2013 | WO |
2013149128 | Oct 2013 | WO |
2013169849 | Nov 2013 | WO |
2013169877 | Nov 2013 | WO |
2013187370 | Dec 2013 | WO |
2013149128 | Feb 2014 | WO |
2014105276 | Jul 2014 | WO |
2014144908 | Sep 2014 | WO |
2014177929 | Nov 2014 | WO |
2014200730 | Dec 2014 | WO |
2015200227 | Dec 2015 | WO |
2015200228 | Dec 2015 | WO |
2015200537 | Dec 2015 | WO |
2016030437 | Mar 2016 | WO |
2016048308 | Mar 2016 | WO |
2016048310 | Mar 2016 | WO |
2016111065 | Jul 2016 | WO |
2017008079 | Jan 2017 | WO |
2017124116 | Jul 2017 | WO |
2017200923 | Nov 2017 | WO |
2017218104 | Dec 2017 | WO |
2018081157 | May 2018 | WO |
Entry |
---|
Matejka, Justin, Tovi Grossman, and George Fitzmaurice. “Swifter: improved online video scrubbing.” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 2013 (Year: 2013). |
Schmidt, Alexander. “Graphical user interface for video on demand navigation from an IPTV set top box.” (2009) (Year: 2009). |
Kim, Jinwoo, Hyunho Kim, and Kyungwook Park. “Towards optimal navigation through video content on interactive TV.” Interacting with Computers 18.4 (2006): 723-746 (Year: 2006). |
Advisory Action received for U.S. Appl. No. 15/167,801, mailed on Feb. 16, 2018, 4 pages. |
Applicant Initiated Interview Summary received for U.S. Appl. No. 15/167,801, mailed on Apr. 23, 2018, 3 pages. |
Applicant Initiated Interview Summary received for U.S. Appl. No. 15/167,801, mailed on Jul. 29, 2019, 3 pages. |
Applicant Initiated Interview Summary received for U.S. Appl. No. 17/210,352, mailed on Feb. 28, 2022, 4 pages. |
Corrected Notice of Allowability received for U.S. Appl. No. 16/108,519, mailed on Dec. 22, 2021, 3 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 14/242,575, mailed on Dec. 15, 2016, 7 pages. |
Corrected Notice of Allowance for U.S. Appl. No. 14/224,575, mailed on Nov. 16, 2016, 7 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 14/255,664, mailed on Aug. 29, 2017, 4 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 14/267,671, mailed on Nov. 29, 2018, 3 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 14/749,288, mailed on Sep. 21, 2017, 5 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 15/276,633, mailed on Sep. 10, 2019, 7 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 15/695,880, mailed on Jun. 11, 2018, 6 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 15/714,904, mailed on Sep. 7, 2018, 5 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 15/876,715, mailed on Oct. 20, 2021, 2 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 16/010,280, mailed on Aug. 6, 2019, 2 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 16/036,810, mailed on Nov. 19, 2018, 6 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 16/142,635, mailed on Mar. 10, 2022, 2 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 16/233,990, mailed on Mar. 8, 2022, 4 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 16/827,931, mailed on Dec. 6, 2021, 4 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 16/872,274, mailed on Aug. 12, 2022, 5 pages. |
Cover Flow, Wikipedia, Available online at: <https://en.wikipedia.org/w/index.php?title=Cover_Flow&oldid=879285208>, Jan. 20, 2019, 3 pages. |
Examiner Initiated Interview Summary received for U.S. Appl. No. 15/390,377, mailed on Oct. 30, 2017, 2 pages. |
Examiner's Answer to Appeal Brief received for U.S. Appl. No. 15/876,715, mailed on Aug. 18, 2020, 16 pages. |
Extended European Search Report received for European Patent Application No. 17813728.7, mailed on Feb. 11, 2019, 8 pages. |
Extended European Search Report received for European Patent Application No. 20199219.5, mailed on Apr. 22, 2021, 8 pages. |
Final Office Action received for U.S. Appl. No. 14/255,664, mailed on Oct. 17, 2016, 16 pages. |
Final Office Action received for U.S. Appl. No. 14/267,671, mailed on May 23, 2018, 17 pages. |
Final Office Action received for U.S. Appl. No. 14/267,671, mailed on Oct. 26, 2016, 21 pages. |
Final Office Action received for U.S. Appl. No. 14/271,179, mailed on Dec. 15, 2016, 10 pages. |
Final Office Action received for U.S. Appl. No. 14/271,179, mailed on Jun. 20, 2019, 15 pages. |
Final Office Action received for U.S. Appl. No. 14/271,179, mailed on Jun. 21, 2018, 14 pages. |
Final Office Action received for U.S. Appl. No. 14/746,095, mailed on Jul. 16, 2018, 33 pages. |
Final Office Action received for U.S. Appl. No. 14/746,662, mailed on Apr. 24, 2017, 8 pages. |
Final Office Action received for U.S. Appl. No. 14/746,662, mailed on Jun. 27, 2017, 9 pages. |
Final Office Action received for U.S. Appl. No. 15/167,801, mailed on Apr. 5, 2019, 18 pages. |
Final Office Action received for U.S. Appl. No. 15/167,801, mailed on May 28, 2020, 17 pages. |
Final Office Action received for U.S. Appl. No. 15/167,801, mailed on Nov. 29, 2017, 12 pages. |
Final Office Action received for U.S. Appl. No. 15/235,000, mailed on Dec. 19, 2018, 33 pages. |
Final Office Action received for U.S. Appl. No. 15/235,000, mailed on Mar. 13, 2018, 31 pages. |
Final Office Action received for U.S. Appl. No. 15/272,393, mailed on Mar. 25, 2019, 54 pages. |
Final Office Action received for U.S. Appl. No. 15/272,397, mailed on Mar. 7, 2017, 23 pages. |
Final Office Action received for U.S. Appl. No. 15/276,633, mailed on Jul. 26, 2017, 15 pages. |
Final Office Action received for U.S. Appl. No. 15/276,633, mailed on Oct. 29, 2018, 12 pages. |
Final Office Action received for U.S. Appl. No. 15/390,377, mailed on Nov. 9, 2017, 18 pages. |
Final Office Action received for U.S. Appl. No. 15/507,229, mailed on Jul. 15, 2020, 20 pages. |
Final Office Action received for U.S. Appl. No. 15/507,229, mailed on Sep. 18, 2019, 15 pages. |
Final Office Action received for U.S. Appl. No. 15/719,404, mailed on Aug. 8, 2019, 19 pages. |
Final Office Action received for U.S. Appl. No. 15/719,404, mailed on Mar. 30, 2021, 19 pages. |
Final Office Action received for U.S. Appl. No. 15/876,715, mailed on Nov. 5, 2018, 15 pages. |
Final Office Action received for U.S. Appl. No. 16/108,519, mailed on Dec. 12, 2019, 10 pages. |
Final Office Action received for U.S. Appl. No. 16/108,519, mailed on Nov. 25, 2020, 12 pages. |
Final Office Action received for U.S. Appl. No. 16/126,962, mailed on Apr. 8, 2020, 20 pages. |
Final Office Action received for U.S. Appl. No. 16/136,005, mailed on Mar. 9, 2020, 9 pages. |
Final Office Action received for U.S. Appl. No. 16/142,635, mailed on Feb. 3, 2021, 23 pages. |
Final Office Action received for U.S. Appl. No. 16/144,077, mailed on Jul. 12, 2019, 22 pages. |
Final Office Action received for U.S. Appl. No. 16/175,565, mailed on Nov. 12, 2020, 40 pages. |
Final Office Action received for U.S. Appl. No. 16/233,990, mailed on Jan. 11, 2021, 17 pages. |
Final Office Action received for U.S. Appl. No. 16/584,790, mailed on Jun. 15, 2021, 30 pages. |
Final Office Action received for U.S. Appl. No. 16/584,790, mailed on May 27, 2020, 27 pages. |
Final Office Action received for U.S. Appl. No. 16/682,443, mailed on Mar. 9, 2021, 9 pages. |
Final Office Action received for U.S. Appl. No. 16/697,090, mailed on Feb. 23, 2022, 25 pages. |
Final Office Action received for U.S. Appl. No. 16/697,090, mailed on Jan. 27, 2021, 18 pages. |
Final Office Action received for U.S. Appl. No. 16/827,910, mailed on Feb. 28, 2022, 17 pages. |
Final Office Action received for U.S. Appl. No. 16/827,918, mailed on Jul. 8, 2021, 31 pages. |
Final Office Action received for U.S. Appl. No. 16/827,926, mailed on Mar. 17, 2021, 44 pages. |
Final Office Action received for U.S. Appl. No. 16/865,172, mailed on Feb. 12, 2021, 29 pages. |
Final Office Action received for U.S. Appl. No. 16/872,274, mailed on Dec. 23, 2021, 20 pages. |
Final Office Action received for U.S. Appl. No. 16/888,478, mailed on Nov. 15, 2021, 27 pages. |
Final Office Action received for U.S. Appl. No. 17/133,550, mailed on Feb. 11, 2022, 18 pages. |
International Search Report received for PCT Patent Application No. PCT/US2019/034921, mailed on Nov. 19, 2019, 5 pages. |
International Search Report received for PCT Patent Application No. PCT/US2020/024452, mailed on Aug. 6, 2020, 6 pages. |
International Search Report received for PCT Patent Application No. PCT/US2020/024485, mailed on Aug. 3, 2020, 6 pages. |
International Search Report received for PCT Patent Application No. PCT/US2020/024486, mailed on Aug. 11, 2020, 6 pages. |
International Search Report received for PCT Patent Application No. PCT/US2020/024492, mailed on Aug. 10, 2020, 6 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/208,099, mailed on Jun. 25, 2015, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/242,575, mailed on Mar. 21, 2016, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/255,664, mailed on Apr. 1, 2016, 15 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/262,435, mailed on Feb. 22, 2016, 22 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/267,671, mailed on Apr. 1, 2016, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/267,671, mailed on Dec. 1, 2017, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/267,671, mailed on May 26, 2017, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/271,179, mailed on May 29, 2015, 25 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/271,179, mailed on Oct. 5, 2018, 15 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/271,179, mailed on Sep. 21, 2017, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/746,095, mailed on Dec. 1, 2017, 34 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/746,095, mailed on Jul. 25, 2019, 33 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/746,620, mailed on Jan. 11, 2017, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/746,662, mailed on Aug. 9, 2016, 8 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/749,288, mailed on Oct. 12, 2016, 11 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/167,801, mailed on Mar. 24, 2017, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/167,801, mailed on Aug. 30, 2018, 15 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/167,801, mailed on Dec. 11, 2020, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/167,801, mailed on Sep. 3, 2021, 17 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/167,801, mailed on Sep. 26, 2019, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/224,370, mailed on Oct. 3, 2017, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/235,000, mailed on Jul. 14, 2017, 31 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/235,000, mailed on Jul. 25, 2018, 31 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/235,000, mailed on Jun. 26, 2019, 31 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/272,393, mailed on Oct. 2, 2018, 52 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/272,397, mailed on Nov. 22, 2016, 20 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/276,633, mailed on Feb. 23, 2018, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/276,633, mailed on Mar. 5, 2019, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/276,633, mailed on Nov. 17, 2016, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/390,377, mailed on Apr. 5, 2017, 17 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/414,493, mailed on Oct. 6, 2017, 15 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/507,229, mailed on Feb. 27, 2020, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/507,229, mailed on Jun. 3, 2019, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/674,992, mailed on May 11, 2018, 8 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/719,404, mailed on Dec. 14, 2018, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/719,404, mailed on Nov. 26, 2021, 19 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/719,404, mailed on Oct. 16, 2020, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/798,092, mailed on Dec. 20, 2017, 20 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/876,715, mailed on Jun. 4, 2018, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/876,715, mailed on Sep. 10, 2019, 13 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/990,327, mailed on Jul. 31, 2018, 8 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/010,280, mailed on Mar. 7, 2019, 5 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/108,519, mailed on Apr. 5, 2021, 13 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/108,519, mailed on Aug. 2, 2019, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/108,519, mailed on May 8, 2020, 11 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/126,962, mailed on Aug. 25, 2020, 22 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/126,962, mailed on Sep. 3, 2019, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/136,005, mailed on Sep. 9, 2020, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/136,005, mailed on Sep. 18, 2019, 9 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/142,635, mailed on Jun. 8, 2020, 19 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/142,635, mailed on Jun. 11, 2021, 23 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/144,077, mailed on Feb. 19, 2019, 24 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/144,077, mailed on Nov. 27, 2019, 40 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/175,565, mailed on Sep. 20, 2021, 33 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/233,990, mailed on Jul. 9, 2021, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/233,990, mailed on Jun. 18, 2020, 17 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/392,467, mailed on Sep. 27, 2019, 5 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/584,790, mailed on Dec. 23, 2020, 30 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/584,790, mailed on Dec. 26, 2019, 24 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/584,790, mailed on Feb. 1, 2022, 33 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/682,443, mailed on Sep. 23, 2020, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/697,090, mailed on Aug. 3, 2021, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/697,090, mailed on Jul. 6, 2020, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/827,910, mailed on Jun. 17, 2021, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/827,918, mailed on Dec. 10, 2020, 28 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/827,926, mailed on Oct. 29, 2020, 45 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/827,931, mailed on Mar. 3, 2021, 24 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/865,172, mailed on Jun. 29, 2021, 29 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/865,172, mailed on Aug. 20, 2020, 19 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/872,274, mailed on Jul. 9, 2021, 19 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/888,453, mailed on Jun. 4, 2021, 37 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/888,478, mailed on Feb. 8, 2021, 24 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/945,724, mailed on Jul. 19, 2021, 8 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/000,112, mailed on Dec. 7, 2021, 15 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/065,387, mailed on Jan. 28, 2021, 28 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/065,387, mailed on Jun. 1, 2021, 25 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/133,550, mailed on Jun. 8, 2021, 23 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/210,352, mailed on Oct. 18, 2021, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/353,527, mailed on Oct. 5, 2021, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/175,565, mailed on Mar. 4, 2020, 36 pages. |
Notice of Allowance received for U.S. Appl. No. 14/208,099, mailed on Feb. 3, 2016, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 14/242,575, mailed on Oct. 27, 2016, 11 pages. |
Notice of Allowance received for U.S. Appl. No. 14/255,664, mailed on May 5, 2017, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 14/262,435, mailed on Aug. 16, 2016, 6 pages. |
Notice of Allowance received for U.S. Appl. No. 14/267,671, mailed on Sep. 19, 2018, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 14/746,095, mailed on Dec. 31, 2019, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 14/746,620, mailed on Sep. 25, 2017, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 14/746,662, mailed on Sep. 25, 2017, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 14/749,288, mailed on May 25, 2017, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 15/272,393, mailed on Jan. 15, 2020, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 15/272,393, mailed on Sep. 18, 2019, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 15/272,397, mailed on Oct. 18, 2017, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 15/276,633, mailed on Aug. 26, 2019, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 15/390,377, mailed on Jul. 2, 2018, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 15/414,493, mailed on Mar. 14, 2018, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 15/674,992, mailed on Oct. 1, 2018, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 15/695,880, mailed on Feb. 28, 2018, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 15/695,880, mailed on Oct. 18, 2017, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 15/714,904, mailed on May 22, 2018, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 15/798,092, mailed on Jun. 7, 2018, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 15/798,092, mailed on Oct. 9, 2018, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 15/833,618, mailed on Mar. 14, 2018, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 15/876,715, mailed on Oct. 14, 2021, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 15/990,327, mailed on Jan. 11, 2019, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 16/010,280, mailed on Jul. 29, 2019, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 16/036,810, mailed on Oct. 31, 2018, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 16/108,519, mailed on Sep. 21, 2021, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 16/136,005, mailed on Feb. 24, 2021, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 16/136,005, mailed on Jun. 9, 2021, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 16/142,635, mailed on Nov. 10, 2021, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 16/144,077, mailed on May 8, 2020, 15 pages. |
Notice of Allowance received for U.S. Appl. No. 16/233,990, mailed on Feb. 22, 2022, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 16/392,467, mailed on Mar. 23, 2020, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 16/682,443, mailed on Aug. 20, 2021, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 16/682,443, mailed on Nov. 17, 2021, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 16/726,179, mailed on Jun. 17, 2021, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 16/726,179, mailed on Sep. 30, 2021, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,918, mailed on Feb. 7, 2022, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,926, mailed on Nov. 1, 2021, 35 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,931, mailed on Jan. 5, 2022, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,931, mailed on Sep. 15, 2021, 11 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,942, mailed on Apr. 28, 2021, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,942, mailed on Jan. 22, 2021, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,942, mailed on Oct. 5, 2020, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 16/865,172, mailed on Dec. 16, 2021, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 16/872,274, mailed on Apr. 19, 2022, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 16/945,724, mailed on Dec. 20, 2021, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 17/065,387, mailed on Dec. 1, 2021, 10 pages. |
Restriction Requirement received for U.S. Appl. No. 14/208,099, mailed on Feb. 24, 2015, 5 pages. |
Patent Board Decision received for U.S. Appl. No. 15/876,715, mailed on Aug. 3, 2021, 8 pages. |
Search Report received for Chinese Patent Application No. 201680050096.X, mailed on Jan. 10, 2022, 4 pages (2 pages of English Translation and 2 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 201780033590.X, mailed on Mar. 24, 2021, 4 pages (2 pages of English Translation and 2 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 201910469185.3, mailed on Feb. 23, 2021, 6 pages (3 pages of English Translation and 3 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 201910587972.8, mailed on Jan. 4, 2022, 4 pages (2 pages of English Translation and 2 pages of Official Copy). |
Search Report received for Taiwanese Patent Application No. 104120385, mailed on Nov. 25, 2016, 2 pages (1 page of English Translation and 1 page of Official Copy). |
Supplemental Notice of Allowability received for U.S. Appl. No. 16/827,942, mailed on Nov. 4, 2020, 3 pages. |
Supplemental Notice of Allowance received for U.S. Appl. No. 15/798,092, mailed on Jan. 9, 2019, 2 pages. |
Bohn Dieter, “Rebooting WebOS: How LG Rethought the Smart TV”, The Verge, Available online at: <http://www.theverge.com/2014/1/6/5279220/rebooting-webos-how-lg-rethought-the-smart-tv>, [Retrieved Aug. 26, 2019], Jan. 6, 2014, 5 pages. |
Cheredar Tom, “Verizon's Viewdini Lets You Watch Netflix, Comcast, & Hulu Videos from a Single App”, Available online at: <venturebeat.com>, [Retrieved Jun. 10, 2021], May 22, 2012, 6 pages. |
episodecalendar.com, “Keep track of your favorite TV shows!—TV Episode Calendar”, Available Online at: <https://web.archive.org/web/20140517060612/https://episodecalendar.com/>, [Retrieved Oct. 18, 2017], May 17, 2014, 6 pages. |
Grey Melissa, “Comcast's New X2 Platform Moves your DVR Recordings from the Box to the Cloud”, Engadget, Available online at: <http://www.engadget.com/2013/06/11/comcast-x2-platform/>, Jun. 11, 2013, 15 pages. |
Kaijser Martijn, “Mimic Skin for Kodi 15.x: Installation and Showcase”, Time 2:23-2:28, Available online at: <https://www.youtube.com/watch?v=RGfpbUWVkgQ&t=143s>, [Retrieved Jun. 10, 2021], Aug. 3, 2015, 1 page. |
Lee et al., “A Multi-Touch Three Dimensional Touch-Sensitive Tablet”, CHI'85 Proceedings, Apr. 1985, pp. 21-25. |
Li Xiaoshan, “CNTV, HULU, BBC iPlayer: A Comparative Study on the User Interfaces of Three Network TV Stations”, Modern Communication (Journal of Communication University of China), Issue 11, Nov. 5, 2010, pp. 156-158. |
Rubine Dean, “Combining Gestures and Direct Manipulation”, CHI'92, May 3-7, 1992, pp. 659-660. |
Rubine Dean, “The Automatic Recognition of Gestures”, CMU-CS-91-202, Submitted in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Computer Science at Carnegie Mellon University, Dec. 1991, 285 pages. |
Westerman Wayne, “Hand Tracking, Finger Identification, and Chordic Manipulation on a Multi-Touch Surface”, A Dissertation Submitted to the Faculty of the University of Delaware in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Electrical Engineering, 1999, 363 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 15/876,715, mailed on Apr. 11, 2022, 4 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 15/876,715, mailed on Apr. 19, 2022, 4 pages. |
Notice of Allowance received for U.S. Appl. No. 16/233,990, mailed on Oct. 20, 2022, 2 pages. |
Notice of Allowance received for U.S. Appl. No. 16/945,724, mailed on Aug. 31, 2022, 2 pages. |
Notice of Allowance received for U.S. Appl. No. 17/000,112, mailed on Jun. 17, 2022, 2 pages. |
Notice of Allowance received for U.S. Appl. No. 17/065,387, mailed on Mar. 30, 2022, 2 pages. |
Extended European Search Report received for European Patent Application No. 22167405.4, mailed on Jul. 4, 2022, 11 pages. |
Final Office Action received for U.S. Appl. No. 16/175,565, mailed on May 27, 2022, 33 pages. |
Final Office Action received for U.S. Appl. No. 16/584,790, mailed on Jun. 14, 2022, 37 pages. |
Final Office Action received for U.S. Appl. No. 16/697,090, mailed on Dec. 14, 2022, 28 pages. |
Final Office Action received for U.S. Appl. No. 16/888,453, mailed on Apr. 8, 2022, 39 pages. |
Final Office Action received for U.S. Appl. No. 17/210,352, mailed on Jun. 3, 2022, 21 pages. |
Final Office Action received for U.S. Appl. No. 17/353,527, mailed on May 11, 2022, 17 pages. |
Final Office Action received for U.S. Appl. No. 17/379,785, mailed on Oct. 28, 2022, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/167,801, mailed on May 18, 2022, 17 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/697,090, mailed on Jul. 7, 2022, 25 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/827,910, mailed on Sep. 14, 2022, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/827,926, mailed on Apr. 25, 2022, 27 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/888,478, mailed on May 2, 2022, 29 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/133,550, mailed on Sep. 9, 2022, 23 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/353,527, mailed on Dec. 8, 2022, 17 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/379,785, mailed on Mar. 30, 2022, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/457,901, mailed on Apr. 28, 2022, 24 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/586,625, mailed on Sep. 1, 2022, 13 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/656,610, mailed on Feb. 6, 2023, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/660,622, mailed on Dec. 20, 2022, 17 pages. |
Notice of Allowance received for U.S. Appl. No. 15/719,404, mailed on Jul. 13, 2022, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 15/719,404, mailed on Nov. 9, 2022, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 15/876,715, mailed on Apr. 4, 2022, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 15/876,715, mailed on Aug. 3, 2022, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 16/233,990, mailed on Jan. 31, 2023, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 16/233,990, mailed on May 26, 2022, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 16/233,990, mailed on Oct. 5, 2022, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 16/584,790, mailed on Feb. 3, 2023, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,918, mailed on Jun. 8, 2022, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,931, mailed on Apr. 19, 2022, 7 Pages. |
Notice of Allowance received for U.S. Appl. No. 16/865,172, mailed on Apr. 13, 2022, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 16/865,172, mailed on Aug. 25, 2022, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 16/945,724, mailed on Apr. 4, 2022, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 16/945,724, mailed on Jul. 20, 2022, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 17/000,112, mailed on Jun. 3, 2022, 14 pages. |
Notice of Allowance received for U.S. Appl. No. 17/000,112, mailed on Oct. 18, 2022, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 17/210,352, mailed on Dec. 5, 2022, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 17/457,901, mailed on Nov. 16, 2022, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 17/654,578, mailed on Oct. 25, 2022, 8 pages. |
Search Report received for Chinese Patent Application No. 201780066823.6, mailed on Nov. 1, 2022, 4 pages (2 pages of English Translation and 2 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 201811143102.3, mailed on Nov. 22, 2022, 5 pages (2 pages of English Translation and 3 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 201911313497.1, mailed on Dec. 14, 2022, 3 pages (1 page of English Translation and 2 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 202010011436.6, mailed on Dec. 15, 2022, 9 pages (4 pages of English Translation and 5 pages of Official Copy). |
Apple, “The control is all yours”, Available online at: <https://www.apple.com.cn/privacy/control/>, [Retrieved Dec. 29, 2022], Nov. 30, 2022, 12 pages. See attached Communication 37 CFR § 1.98(a)(3). |
Drews et al., “Virtual Jukebox—Reviving a Classic”, Proceedings of the 35th Hawaii International Conference on System Sciences, 2002, 7 pages. |
Jin et al., “Pricing Sponsored Content in Wireless Networks with Multiple Content Providers”, The Fourth IEEE Workshop on Smart Data Pricing 2015, 2015, pp. 668-673. |
Kimbler Kristofer, “App Store Strategies for Service Providers”, 2010 4th International Conference on Intelligence in Next Generation Networks, Nov. 18, 2010, 5 pages. |
Wang et al., “Authorization Management Mechanism of Web application system”, Network and Information Technology, vol. 25, No. 11, 2006, 3 pages. See attached Communication 37 CFR § 1.98(a)(3). |
Meng et al., “Role Authorization Based Web Service Access Control Model”, Journal of Lanzhou University (Natural Science Edition), vol. 42, No. 2, 2007, pp. 84-88. See attached Communication 37 CFR § 1.98(a)(3). |
Final Office Action received for U.S. Appl. No. 16/827,910, mailed on Mar. 15, 2023, 18 pages. |
Final Office Action received for U.S. Appl. No. 16/827,926, mailed on Apr. 18, 2023, 32 pages. |
Final Office Action received for U.S. Appl. No. 16/888,478, mailed on Feb. 13, 2023, 27 pages. |
Final Office Action received for U.S. Appl. No. 17/133,550, mailed on Feb. 15, 2023, 22 pages. |
Final Office Action received for U.S. Appl. No. 17/586,625, mailed on May 4, 2023, 15 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/167,801, mailed on Feb. 8, 2023, 23 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/719,404, mailed on May 10, 2023, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/175,565, mailed on Feb. 17, 2023, 33 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/379,785, mailed on Mar. 9, 2023, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/651,731, mailed on Apr. 25, 2023, 9 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/937,410, mailed on Mar. 2, 2023, 15 pages. |
Non-Final Office Action received for U.S. Appl. No. 18/060,902, mailed on Mar. 10, 2023, 8 pages. |
Notice of Allowability received for U.S. Appl. No. 17/457,901, mailed on Mar. 8, 2023, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 16/888,453, mailed on Mar. 1, 2023, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 17/210,352, mailed on Mar. 16, 2023, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 17/367,227, mailed on Mar. 23, 2023, 12 pages. |
Notice of Allowance received for U.S. Appl. No. 17/654,578, mailed on Feb. 15, 2023, 8 pages. |
Search Report received for Chinese Patent Application No. 201911313480.6, mailed on Jan. 20, 2023, 4 pages (2 pages of English Translation and 2 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 201911313496.7, mailed on Jan. 20, 2023, 4 pages (2 pages of English Translation and 2 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 201911313497.1, mailed on Apr. 11, 2023, 5 pages (2 pages of English Translation and 3 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 202010662206.6, mailed on Apr. 28, 2023, 5 pages (2 pages of English Translation and 3 pages of Official Copy). |
Search Report received for European Patent Application No. 20718506.7, mailed on Mar. 21, 2023, 2 pages. |
Anonymous, “Video Progress Bar—YouTube Help”, Retrieved from the Internet: <URL:https://web.archive.org/web/20190317001501/https://support.google.com/youtube/answer/7174115?hl=en>, [retrieved on Mar. 22, 2023], Mar. 17, 2019, 2 pages. |
Beer et al., “The Odds of Running a Nonlinear TV Program Using Web Technologies”, IEEE International Symposium on Broadband Multimedia Systems and Broadcasting (BMSB), 2011, 4 pages. |
Biao et al., “Research on UI Optimization of Chinese Network Television Stations”, Southeast Communications, 2013, 4 pages. See attached Communication 37 CFR § 1.98(a)(3). |
Budhraja et al., “Probability Based Playlist Generation Based on Music Similarity and User Customization”, National Conference on Computing and Communication Systems, 2012, 5 pages. |
Cheng, Luo, “The Designing of Dynamic Play-list Based on Flash Streaming Media Technology”, Computer and Telecommunication, 2008, 3 pages. See attached Communication 37 CFR § 1.98(a)(3). |
Liu, Chang, “Functions and Design of Multi-Screen Playing System in TV Variety Studio”, Modern TV Technology, 2013, 5 pages. See attached Communication 37 CFR § 1.98(a)(3). |
Tinari, George, “What's New in the Netflix Redesign and How to Use It”, Retrieved from the Internet: <https://web.archive.org/web/20161110092133/https://www.guidingtech.com/48443/netflix-redesign-overview/>, [retrieved on Mar. 22, 2023], Nov. 10, 2016, 9 pages. |
Zhang et al., “Music Playlist Prediction via Detecting Song Moods”, IEEE China Summit and International Conference on Signal and Information Processing, 2013, pp. 174-178. |
Corrected Notice of Allowance received for U.S. Appl. No. 16/888,453, mailed on Jul. 26, 2023, 5 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 17/367,227, mailed on Jul. 27, 2023, 2 pages. |
Final Office Action received for U.S. Appl. No. 17/660,622, mailed on May 24, 2023, 20 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/656,610, mailed on Jul. 26, 2023, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/657,913, mailed on Jul. 21, 2023, 16 pages. |
Notice of Allowance received for U.S. Appl. No. 16/888,453, mailed on Jun. 21, 2023, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 17/353,527, mailed on Jul. 21, 2023, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 17/654,578, mailed on Jun. 13, 2023, 7 pages. |
Search Report received for Chinese Patent Application No. 202010662994.9, mailed on Apr. 28, 2023, 5 pages (2 pages of English Translation and 3 pages of Official Copy). |
Advisory Action received for U.S. Appl. No. 18/060,902, mailed on Nov. 13, 2023, 2 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 16/888,478, mailed on Oct. 31, 2023, 6 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 17/210,352, mailed on Sep. 20, 2023, 5 pages. |
Examiner's Answer to Appeal Brief received for U.S. Appl. No. 16/697,090, mailed on Oct. 26, 2023, 10 pages. |
Final Office Action received for U.S. Appl. No. 15/167,801, mailed on Sep. 19, 2023, 19 pages. |
Final Office Action received for U.S. Appl. No. 17/379,785, mailed on Aug. 23, 2023, 13 pages. |
Final Office Action received for U.S. Appl. No. 17/937,410, mailed on Aug. 3, 2023, 15 pages. |
Final Office Action received for U.S. Appl. No. 18/060,902, mailed on Aug. 25, 2023, 8 pages. |
Non-Final Office Action received for U.S. Appl. No. 18/146,336, mailed on Aug. 3, 2023, 23 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,910, mailed on Aug. 3, 2023, 21 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,926, mailed on Sep. 13, 2023, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 16/888,478, mailed on Aug. 2, 2023, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 17/353,527, mailed on Oct. 4, 2023, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 17/586,625, mailed on Oct. 26, 2023, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 17/651,731, mailed on Oct. 3, 2023, 5 pages. |
Search Report received for Chinese Patent Application No. 201811143102.3, mailed on Nov. 2, 2023, 5 pages (3 pages of English Translation and 2 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 202010011436.6, mailed on Aug. 30, 2023, 4 pages (2 pages of English Translation and 2 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 202010662994.9, mailed on Sep. 28, 2023, 3 pages (1 page of English Translation and 2 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 202110201931.8, mailed on Oct. 16, 2023, 3 pages (1 page of English Translation and 2 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 202210799020.4, mailed on Jul. 27, 2023, 5 pages (1 page of English Translation and 4 pages of Official Copy). |
Cai, Chongshan, “Analysis of Copyright Infringement Problems of Video Aggregation App”, China Copyright, vol. 02, [retrieved on Oct. 6, 2023], Available online at: <http://www.cqvip.com/qk/81889a/2015002/90716681504849534850485048.html>, Apr. 15, 2015, 2 pages (1 page of English Translation and 1 page of Official Copy). |
Chen et al., “What a Juke! A Collaborative Music Sharing System”, IEEE, 2012, 6 pages. |
Cunningham et al., “An Ethnographic Study of Music Information Seeking: Implications for the Design of a Music Digital Library”, IEEE, 2003, 13 pages. |
Corrected Notice of Allowability received for U.S. Appl. No. 15/719,404, mailed on Mar. 22, 2024, 2 pages. |
Corrected Notice of Allowability received for U.S. Appl. No. 17/586,625, mailed on Feb. 20, 2024, 2 pages. |
Examiner's Answer to Appeal Brief received for U.S. Appl. No. 16/175,565, mailed on Dec. 15, 2023, 27 pages. |
Final Office Action received for U.S. Appl. No. 17/656,610, mailed on Jan. 16, 2024, 12 pages. |
Final Office Action received for U.S. Appl. No. 18/146,336, mailed on Feb. 23, 2024, 23 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/133,550, mailed on Dec. 18, 2023, 25 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/379,785, mailed on Feb. 15, 2024, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/657,913, mailed on Jan. 11, 2024, 20 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/937,410, mailed on Feb. 29, 2024, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 18/060,902, mailed on Dec. 1, 2023, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 15/719,404, mailed on Dec. 8, 2023, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,910, mailed on Dec. 13, 2023, 19 pages. |
Notice of Allowance received for U.S. Appl. No. 16/827,926, mailed on Feb. 2, 2024, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 17/651,731, mailed on Jan. 25, 2024, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 17/660,622, mailed on Jan. 24, 2024, 11 pages. |
Search Report received for Chinese Patent Application No. 202111293833.8, mailed on Dec. 9, 2023, 4 pages (2 pages of English Translation and 2 pages of Official Copy). |
Search Report received for Chinese Patent Application No. 202111635535.2, mailed on Dec. 21, 2023, 5 pages (2 pages of English Translation and 3 pages of Official Copy). |
Number | Date | Country |
---|---|---|
20230132595 A1 | May 2023 | US |
Number | Date | Country |
---|---|---|
62016599 | Jun 2014 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 16872274 | May 2020 | US |
Child | 17937704 | | US |
Parent | 14746095 | Jun 2015 | US |
Child | 16872274 | | US |