The present disclosure relates to the field of interactive television and graphical user interfaces.
Digital media playback capabilities may be incorporated into a wide range of devices, including digital televisions, including so-called “smart” televisions, laptop or desktop computers, tablet computers, e-book readers, personal digital assistants (PDAs), digital recording devices, digital media players, video gaming devices, digital cameras, cellular or satellite radio telephones, including so-called “smart” phones, dedicated video streaming devices, and the like. Digital media content may originate from a plurality of sources including, for example, over-the-air television providers, satellite television providers, cable television providers, online media-sharing services, including, online media streaming and downloading services, peer devices, and the like. Further, devices with digital media playback capabilities may be configured to output digital media to ultra-resolution displays.
Due to the wide range of content users may access on devices with digital media playback capabilities, traditional techniques for organizing, sorting, and displaying available content choices may be less than ideal, particularly for televisions and secondary connected devices. Further, traditional electronic programming guides may be intended to be displayed on relatively smaller and/or lower resolution displays than the larger and higher resolution displays that are currently available or may become available in the future.
The following brief summary is not intended to include all features and aspects of the present invention, nor does it imply that the invention must include all features and aspects discussed in this summary. The present disclosure relates to the field of graphical user interfaces and more specifically describes techniques for presenting a user with dynamic interactive graphical user interfaces. In particular, this disclosure describes techniques for enabling content selection. In some examples, the techniques may be implemented in a device with digital media playback capabilities, including for example, laptop or desktop computers, tablet computers, smart phones, set top boxes, and televisions.
According to one example of the disclosure, a method for enabling content selection comprises displaying one or more sub-categories of content included within a category of content, wherein displaying one or more sub-categories of content includes displaying icons representing items of content included within each of the one or more sub-categories of content in a stack structure, wherein the one or more sub-categories of content are associated with a sub-category type, and enabling a user to change a sub-category type.
According to another example of the disclosure, a device for enabling content selection comprises one or more processors configured to display one or more sub-categories of content included within a category of content, wherein displaying one or more sub-categories of content includes displaying icons representing items of content included within each of the one or more sub-categories of content in a stack structure, wherein the one or more sub-categories of content are associated with a sub-category type, and enable a user to change a sub-category type.
According to another example of the disclosure, an apparatus for enabling content selection comprises means for displaying one or more sub-categories of content included within a category of content, wherein displaying one or more sub-categories of content includes displaying icons representing items of content included within each of the one or more sub-categories of content in a stack structure, wherein the one or more sub-categories of content are associated with a sub-category type, and means for enabling a user to change a sub-category type.
According to another example of the disclosure, a non-transitory computer-readable storage medium has instructions stored thereon that upon execution cause one or more processors of a device to display one or more sub-categories of content included within a category of content, wherein displaying one or more sub-categories of content includes displaying icons representing items of content included within each of the one or more sub-categories of content in a stack structure, wherein the one or more sub-categories of content are associated with a sub-category type, and enable a user to change a sub-category type.
The details of one or more examples are set forth in the accompanying drawings and the description below. Other features, objects, and advantages will be apparent from the description and drawings, and from the claims.
Described herein are systems and methods for enabling the selection of content. Some embodiments extend to a machine-readable medium embodying instructions which, when executed by a machine, cause the machine to perform any one or more of the methodologies described herein. Other features will be apparent from the accompanying drawings and from the detailed description that follows. Examples merely typify possible variations. Unless explicitly stated otherwise, components and functions are optional and may be combined or subdivided, and operations may vary in sequence or may be combined or subdivided. In the following description, for purposes of explanation, numerous specific details are set forth to provide a thorough understanding of example embodiments. It will be evident to one skilled in the art, however, that the present subject matter may be practiced without these specific details.
Devices with digital media playback capabilities, including, for example, televisions, set top boxes, and mobile devices, may be configured to provide users thereof with graphical user interfaces that enable the selection of content. In some examples, these graphical user interfaces may be referred to as electronic program guides (EPGs). Traditional electronic program guides may be designed to only display content available through a television provider. Further, traditional electronic programming guides may have been designed for relatively smaller and/or lower resolution displays than the larger and higher resolution displays that are currently available or may become available in the future. That is, traditional electronic program guides may have been designed when standard definition displays (576i resolution) with a maximum screen size of 32″ were common, whereas current commercially available displays may be larger than 100″ and may have resolutions as high as 3840 pixels by 2160 pixels. Further, even larger displays with higher resolutions are anticipated to be commercially available in the future. Displays with resolutions higher than standard or high definition displays may be referred to as ultra-resolution displays. Ultra-resolution displays, such as ultra-resolution televisions, increase available screen real estate because their higher pixel counts allow more content to be displayed efficiently. Further, smaller devices, such as tablet computers, may include ultra-resolution displays. Traditional electronic program guides may be less than ideal for use with ultra-resolution displays. The techniques described herein may enable a user to more efficiently select content from a plurality of diverse sources.
Example embodiments described herein may allow for a more natural interaction with a graphical user interface by zooming back from content and surrounding it with relevant features and data. This mimics how users actually organize and interact with objects in the real world. Additionally, the techniques described herein may leverage ultra-resolution displays to minimize text and content confusion by using thumbnail navigation. Thumbnail navigation may allow users to quickly and seamlessly move across content solely using icons representing content. Example embodiments described herein may replace standard data that is represented in text form with high resolution graphics to reduce clutter and allow for easier browsing of large catalogs of content. These features may then be transported to many other devices beyond a television, creating a seamless interface that enhances the user experience. That is, the graphical user interfaces described herein may be displayed on a secondary display (e.g., a smart phone display) in conjunction with being displayed on a primary display (e.g., an ultra-resolution television).
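The stack-based grouping of content into sub-categories, as recited in the summary above, can be sketched in a few lines (an illustrative sketch only; the function name, field names, and the sample sub-category types such as genre and decade are hypothetical and not part of the disclosure):

```python
from collections import defaultdict

def group_into_stacks(items, sub_category_type):
    """Group items of content into sub-category stacks keyed by the
    currently selected sub-category type."""
    stacks = defaultdict(list)
    for item in items:
        # Each item carries metadata for several sub-category types;
        # changing the sub-category type simply re-keys the same items.
        stacks[item[sub_category_type]].append(item)
    return dict(stacks)

movies = [
    {"title": "Movie A", "genre": "comedy", "decade": "1990s"},
    {"title": "Movie B", "genre": "drama", "decade": "1990s"},
    {"title": "Movie C", "genre": "comedy", "decade": "2000s"},
]

by_genre = group_into_stacks(movies, "genre")    # two stacks: comedy, drama
by_decade = group_into_stacks(movies, "decade")  # two stacks: 1990s, 2000s
```

When the user changes the sub-category type, the same catalog of icons is simply re-grouped into a different set of stacks, which is what allows the interface to re-organize without fetching new content.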
System 100 represents an example of a system that may be configured to allow digital content, such as, for example, music, videos, images, webpages, messages, voice communications, and applications, to be distributed to and accessed by a plurality of computing devices, such as computing devices 102A-102N. In the example illustrated in
Communications network 104 may comprise any combination of wireless and/or wired communication media. Communications network 104 may include coaxial cables, fiber optic cables, twisted pair cables, wireless transmitters and receivers, routers, switches, repeaters, base stations, or any other equipment that may be useful to facilitate communications between various devices and sites. Communications network 104 may operate according to a combination of one or more telecommunication protocols. Telecommunications protocols may include proprietary aspects and/or may include standardized telecommunication protocols. Examples of standardized telecommunications protocols include Digital Video Broadcasting (DVB) standards, Advanced Television Systems Committee (ATSC) standards, Integrated Services Digital Broadcasting (ISDB) standards, Data Over Cable Service Interface Specification (DOCSIS) standards, Global System for Mobile Communications (GSM) standards, code division multiple access (CDMA) standards, 3rd Generation Partnership Project (3GPP) standards, European Telecommunications Standards Institute (ETSI) standards, Internet Protocol (IP) standards, Wireless Application Protocol (WAP) standards, and IEEE standards, such as, for example, one or more of the 802 standards.
As illustrated in
Television provider network 106 is an example of a network configured to provide a user with television services. For example, television provider network 106 may include public over-the-air television networks, public or subscription-based satellite television service provider networks, and public or subscription-based cable television provider networks. It should be noted that although in some examples television provider network 106 may primarily be used to provide television services, television provider network 106 may also provide other types of data and services according to any combination of the telecommunication protocols described herein.
Public network 108 is an example of a packet-based network, such as a local area network, a wide area network, or a global network, such as the Internet, configured to provide a user with World Wide Web based services. Public network 108 may be configured to operate according to Internet Protocol (IP) standards. It should be noted that although in some examples public network 108 may primarily be used to provide access to hypertext web pages, public network 108 may also provide other types of media content according to any combination of the telecommunication protocols described herein.
Referring again to
On demand engine 114 may be configured to access a multimedia library and distribute multimedia content to one or more of computing devices 102A-102N through television provider network 106. For example, on demand engine 114 may access multimedia content (e.g., music, movies, and TV shows) stored in multimedia database 116A and provide a subscriber of a cable television service with movies on a Pay Per View (PPV) basis. Multimedia database 116A may be a storage device configured to store multimedia content. It should be noted that multimedia content accessed through on demand engine 114 may also be located at various sites within system 100 (e.g., peer-to-peer distribution).
Media service provider site 118 represents an example of a multimedia service provider. Media service provider site 118 may be configured to access a multimedia library and distribute multimedia content to one or more of computing devices 102A-102N through public network 108. For example, media service provider site 118 may access multimedia (e.g., music, movies, and TV shows) stored in multimedia database 116B and provide a user of a media service with multimedia. Multimedia database 116B may be a storage device configured to store multimedia content. In one example, media service provider site 118 may be configured to provide content to one or more of computing devices 102A-102N using the Internet protocol suite. In some examples, a media service may be referred to as a streaming service. Commercial examples of media services may include Hulu, YouTube, Netflix, and Amazon Prime. As described above, television provider network 106 and public network 108 may share physical and logical aspects. Thus, content accessed by one or more of computing devices 102A-102N through media service provider site 118 may be transmitted through physical components of television provider network 106. For example, a user of a computing device may access the internet and multimedia content provided by a media service through a cable modem connected to a coaxial network maintained by a cable television provider.
Webpage content distribution site 120 represents an example of a webpage service provider. Webpage content distribution site 120 may be configured to provide hypertext based content to one or more of computing devices 102A-102N through public network 108. It should be noted that hypertext based content may include audio and video content. Hypertext content may be defined according to programming languages, such as, for example, Hypertext Markup Language (HTML), Dynamic HTML, and Extensible Markup Language (XML). Examples of webpage content distribution sites include the Wikipedia website and the United States Patent and Trademark Office website.
Application distribution site 122 represents an example of an application distribution service. Application distribution site 122 may be configured to distribute developed software applications to one or more of computing devices 102A-102N. In one example, software applications may include games and programs operable on computing devices. In other examples, software applications may be configured to allow a computing device to access content provided by a site in a manner specific to the computing device. For example, software applications may be configured to provide enhanced or reduced functionality of a webpage to a mobile device or a set top box. Software applications may be developed using a specified programming language. Examples of programming languages include Java™, C, C++, Perl, UNIX Shell, Visual Basic, and Visual Basic Script. In some examples, developers may write software applications using a software development kit (SDK) provided by a device manufacturer or a service provider. In the example where one or more of computing devices 102A-102N are mobile devices, application distribution site 122 may be maintained by a mobile device manufacturer, a service provider, and/or a mobile device operating system provider. In the example where one or more of computing devices 102A-102N are set top boxes, application distribution site 122 may be maintained by a set top box manufacturer, a service provider, and/or an operating system provider. In some examples, an application distribution site may be referred to as an app store. Examples of commercially available application distribution sites include Google Play, the Apple App Store, BlackBerry World, Windows Phone Store, and the Amazon Appstore.
Social media site 124 represents an example of a social media service. Social media site 124 may be configured to allow users of computing devices 102A-102N to communicate with one another. Social media site 124 may be configured to host profile pages corresponding to users of computing devices 102A-102N. For example, social media site 124 may be configured such that users of computing devices 102A-102N are able to display messages and upload photos, videos, and other media to a user's profile page. Examples of commercially available social media sites include Facebook, YouTube, LinkedIn, Google Plus, Twitter, Flickr, and Instagram. In addition to allowing users to maintain profile pages, social media site 124 may be configured to generate analytical data based on information included in user profile pages and/or user activity. For example, social media site 124 may be configured to track the popularity of a news story based on comments provided by users of computing devices 102A-102N. As described in detail below, the techniques described herein may allow users of computing devices 102A-102N to incorporate functions of social media sites to share content and recommendations with other users. For example, users may discover content endorsed by other users.
Search engine site 126 represents an example of a content search service. Search engine site 126 may be a service configured to allow users of computing devices 102A-102N to search for content available through communications network 104. Search engine site 126 may be configured to receive queries from computing devices 102A-102N and provide a list of search results to computing devices 102A-102N. For example, search engine site 126 may be configured such that users of computing devices 102A-102N are presented with a webpage including a search query field and are able to search content based on keywords. Examples of commercially available search engine sites include Google, Bing, and Yahoo!. Further, search engine site 126 may be configured to generate analytical data based on information included in search queries. For example, search engine site 126 may be configured to track the popularity of an actress based on the number of times a query related to the actress is provided by users of computing devices 102A-102N.
As illustrated in
CPU(s) 202 may be configured to implement functionality and/or process instructions for execution in computing device 200. CPU(s) 202 may be capable of retrieving and processing instructions, code, and/or data structures for implementing one or more of the techniques described herein. Instructions may be stored on a computer readable medium, such as system memory 204 or storage devices 220. CPU(s) 202 may include multi-core central processing units.
System memory 204 may be described as a non-transitory or tangible computer-readable storage medium. In some examples, system memory 204 may provide temporary and/or long-term storage. In some examples, system memory 204 or portions thereof may be described as non-volatile memory and in other examples portions of system memory 204 may be described as volatile memory. Examples of volatile memories include random access memories (RAM), dynamic random access memories (DRAM), and static random access memories (SRAM). Examples of non-volatile memories include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable read-only memories (EPROM) or electrically erasable and programmable read-only memories (EEPROM).
System memory 204 may be configured to store information that may be used by computing device 200 during operation. System memory 204 may be used to store program instructions for execution by CPU(s) 202 and may be used by software or applications running on computing device 200 to temporarily store information during program execution. For example, system memory 204 may store instructions associated with operating system 206 and applications 208. Applications 208 may include applications implemented within or executed by computing device 200 and may be implemented or contained within, operable by, executed by, and/or be operatively/communicatively coupled to components of computing device 200. Applications 208 may include instructions that may cause CPU(s) 202 of computing device 200 to perform particular functions. Applications 208 may include algorithms which are expressed in computer programming statements, such as for-loops, while-loops, if-statements, do-loops, etc. Applications 208 may be distributed to computing device 200 through an application distribution site, such as, for example, application distribution site 122 described above.
As further illustrated in
System interface 210 may be configured to enable communications between components of computing device 200. In one example, system interface 210 comprises structures that enable data to be transferred from one peer device to another peer device or to a storage medium. For example, system interface 210 may include a chipset supporting Accelerated Graphics Port (“AGP”) based protocols, Peripheral Component Interconnect (PCI) bus based protocols, such as, for example, the PCI Express™ (“PCIe”) bus specification, which is maintained by the Peripheral Component Interconnect Special Interest Group, or any other form of structure that may be used to interconnect peer devices.
Storage devices 220 represent memory of computing device 200 that may be configured to store relatively larger amounts of information for relatively longer periods of time than system memory 204. For example, in the example where computing device 200 is included as part of a digital video recorder, storage devices 220 may be configured to store numerous video files. Similar to system memory 204, storage device(s) 220 may also include one or more non-transitory or tangible computer-readable storage media. Storage device(s) 220 may include internal and/or external memory devices and in some examples may include volatile and non-volatile storage elements. Examples of memory devices include file servers, FTP servers, network attached storage (NAS) devices, local disk drives, or any other type of device or storage medium capable of storing data. Storage media may include Blu-ray discs, DVDs, CD-ROMs, flash memory, or any other suitable digital storage media.
I/O devices 222 may be configured to receive input and provide output during operation of computing device 200. Input may be generated from an input device, such as, for example, a push-button remote control, a motion based remote control, a device including a touch-sensitive screen, a device including a track pad, a mouse, a keyboard, a microphone, a video camera, a motion sensor, or any other type of device configured to receive user input. In one example, an input device may include an advanced user input device, such as a smart phone or a tablet computing device. For example, an input device may be a secondary computing device and may be configured to receive user input via touch gestures, buttons on the secondary computing device, and/or voice control. Further, in some examples, an input device may include a display that is configured to display the graphical user interfaces described herein. For example, in the case where computing device 200 includes a television, an input device may include a smart phone in communication with the television. In this example, a user may provide commands to a television by activating portions of a graphical user interface displayed on a smart phone. Output may be provided to output devices, such as, for example, internal speakers, an integrated display device, and/or external components, such as a secondary computing device. In some examples, I/O device(s) 222 may be operatively coupled to computing device 200 using a standardized communication protocol, such as, for example, Universal Serial Bus protocol (USB), Bluetooth, ZigBee or a proprietary communications protocol, such as, for example, a proprietary infrared communications protocol.
In the example illustrated in
In one example, activation of the “+” and “−” channel buttons 264 may cause a selected navigational item 304, as described in detail below, to change. For example, when menu bar 302 is active, activation of the “+” channel button may cause a navigational item to the left of the selected navigational item to be selected and activation of the “−” channel button may cause a navigational item to the right of the selected navigational item to be selected. In one example, activation of the “+” and “−” channel buttons 264 may cause a selected stack structure, as described in detail below, to change. For example, when stack structure selection is active, activation of the “+” channel button may cause a stack structure to the left of the selected stack structure to be selected and activation of the “−” channel button may cause a stack structure to the right of the selected stack structure to be selected.
Playback controls 270 may be configured to enable a user to control the playback of multimedia content and/or to record multimedia content. For example, playback controls 270 may enable a user to control the playback of a video originating from a media service provider site, an on demand engine, and/or a personal video recorder (PVR). As illustrated in
In one example, activation of reverse playback button 271 and forward playback button 273 may cause a selected navigational item 304, as described in detail below, to change. For example, when menu bar 302 is active, activation of reverse playback button 271 may cause a navigational item to the left of the selected navigational item to be selected and activation of forward playback button 273 may cause a navigational item to the right of the selected navigational item to be selected. In one example, activation of reverse playback button 271 and forward playback button 273 may cause a selected stack structure, as described in detail below, to change. For example, when stack structure selection is active, activation of reverse playback button 271 may cause a stack structure to the left of the selected stack structure to be selected and activation of forward playback button 273 may cause a stack structure to the right of the selected stack structure to be selected.
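The left/right selection behavior described for the channel and playback buttons can be sketched as a simple index update over a row of navigational items or stack structures (a hypothetical sketch; clamping at the row ends, rather than wrapping around, is an assumption, as the disclosure does not specify the boundary behavior):

```python
def change_selection(items, selected_index, direction):
    """Move the selection one position within a row of navigational
    items or stack structures; direction is -1 (left) or +1 (right).
    The index is clamped at the ends of the row."""
    new_index = selected_index + direction
    return max(0, min(new_index, len(items) - 1))

nav_items = ["Movies", "TV Shows", "Sports", "Apps"]
idx = 1
idx = change_selection(nav_items, idx, -1)  # "+" channel or reverse playback: move left
idx = change_selection(nav_items, idx, +1)  # "-" channel or forward playback: move right
```

The same update applies whether menu bar navigation or stack structure selection is active; only the list of items being traversed differs.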
As described above, devices with digital media playback capabilities, including, for example, televisions, set top boxes, and mobile devices, may be configured to provide users thereof with graphical user interfaces that enable the selection of content. Navigational controls 280 may be configured to enable a user to navigate graphical user interfaces and select content using a graphical user interface. In one example, navigational controls 280 may be configured to enable a user to navigate graphical user interfaces and select content using one or more of the example graphical user interfaces described below with respect to
In the example illustrated in
Select button 282 may enable a user to further select an item of content. As described in detail below, an icon representing an item of content may be associated with multiple levels of selection. In one example, consecutive activations of select button 282 may cause respective levels of selection to occur. Information button 283 may be configured to cause additional information associated with an item of content to be displayed. For example, when an icon representing an item of content is initially selected, activation of information button 283 may cause information associated with the content (e.g., cast and crew information) to be displayed.
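The multiple levels of selection triggered by consecutive activations of select button 282 can be modeled as a small state machine (a sketch only; the level names are hypothetical and the number of levels is illustrative):

```python
class SelectableIcon:
    """Icon representing an item of content with multiple levels of
    selection; each activation of the select button advances one level."""
    LEVELS = ["unselected", "highlighted", "previewed", "launched"]

    def __init__(self):
        self.level = 0  # start unselected

    def press_select(self):
        # Advance to the next selection level, holding at the final level.
        if self.level < len(self.LEVELS) - 1:
            self.level += 1
        return self.LEVELS[self.level]

icon = SelectableIcon()
first = icon.press_select()   # "highlighted"
second = icon.press_select()  # "previewed"
```

Each level could map to a distinct interface change, e.g., enlarging the icon, showing a preview pane, and finally launching playback.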
Menu button 284, guide button 285, back button 286, and exit button 287 may be configured to enable a user to cause different graphical user interfaces to be presented. Upon activation, menu button 284 may cause a graphical user interface including a high level menu to be displayed. In one example, a high level menu may include a menu that enables a user to change settings associated with the operation of a computing device. In one example, a high-level menu may include a menu that enables a user to select a user profile (e.g., a log-in graphical user interface). Upon activation, guide button 285 may be configured to provide a graphical user interface that enables a user to select content. In one example, upon activation of guide button 285, graphical user interface 300 described with respect to
Back button 286 may be configured to enable a user to return to a previous graphical user interface. For example, when graphical user interface 800, described below with respect to
As described in detail below with respect to
Referring again to
In one example, modem 212 may be configured to perform physical signaling, addressing, and channel access control according to the physical and MAC layers utilized in a television provider network, such as, for example, television provider network 106. In one example, modem 212 may be configured to receive signals from a coaxial cable and/or an over-the-air signal and perform low level signal processing (e.g., demodulation). In one example, modem 212 may be configured to extract transport streams from signals received from a coaxial cable. In one example, a transport stream may be based on a transport stream defined by the Moving Pictures Experts Group (MPEG). In one example, a transport stream may include a plurality of program streams where each program stream respectively corresponds to a program available from a television network. Further, a transport stream may include a plurality of data streams (e.g., Program Map Table and EPG data).
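The low-level extraction described above can be illustrated with a parser for the fixed 4-byte header of an MPEG-2 transport stream packet as defined in MPEG-2 Part 1 (a minimal sketch; the subset of fields returned and the synthetic example packet are illustrative choices, not part of the disclosure):

```python
TS_PACKET_SIZE = 188  # MPEG-2 transport stream packets are 188 bytes
SYNC_BYTE = 0x47      # every packet begins with this sync byte

def parse_ts_header(packet: bytes):
    """Parse the 4-byte header of an MPEG-2 transport stream packet,
    returning the packet identifier (PID), the payload unit start
    indicator (PUSI), and the continuity counter."""
    if len(packet) != TS_PACKET_SIZE or packet[0] != SYNC_BYTE:
        raise ValueError("not a valid transport stream packet")
    pid = ((packet[1] & 0x1F) << 8) | packet[2]  # 13-bit PID
    return {
        "pid": pid,
        "pusi": bool(packet[1] & 0x40),
        "cc": packet[3] & 0x0F,
    }

# A synthetic packet with PID 0x0100, PUSI set, and continuity counter 5:
pkt = bytes([0x47, 0x41, 0x00, 0x15]) + bytes(184)
header = parse_ts_header(pkt)
```

The PID is what allows downstream components to associate each packet with a particular program stream or data stream within the multiplex.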
Transport module 214 may be configured to receive data from modem 212 and process received data. For example, transport module 214 may be configured to receive a transport stream including a plurality of program streams and extract individual program streams from a received transport stream. In one example, a program stream may include a video stream, an audio stream, and a data stream. AV demux 216 may be configured to receive data from transport module 214 and process received data. For example, AV demux 216 may be configured to receive a program stream from transport module 214 and extract audio packets, video packets, and data packets. That is, AV demux 216 may apply demultiplexing techniques to separate video streams, audio streams, and data streams from a program stream. In one example, AV demux 216 may be configured to decapsulate packetized elementary video and audio streams from a transport stream defined according to MPEG-2 Part 1. It should be noted that although modem 212, transport module 214, and AV demux 216 are illustrated as having distinct functional blocks, the functions performed by modem 212, transport module 214, and AV demux 216 may be highly integrated and realized using any combination of hardware, firmware and/or software implementations.
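The demultiplexing performed by AV demux 216 can be sketched as routing transport stream packets to elementary streams by packet identifier (PID). In practice the PID-to-stream mapping would come from the Program Map Table; the mapping and packets below are illustrative:

```python
from collections import defaultdict

def demux_program_stream(packets, pid_map):
    """Separate transport stream packets into elementary streams
    (e.g., video, audio, data) according to each packet's PID."""
    streams = defaultdict(list)
    for pkt in packets:
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]  # 13-bit PID from the header
        stream_type = pid_map.get(pid)
        if stream_type is not None:  # packets with unmapped PIDs are dropped
            streams[stream_type].append(pkt)
    return streams

# Synthetic packets: PID 0x100 carries video, PID 0x101 carries audio.
video_pkt = bytes([0x47, 0x01, 0x00, 0x10]) + bytes(184)
audio_pkt = bytes([0x47, 0x01, 0x01, 0x10]) + bytes(184)
streams = demux_program_stream(
    [video_pkt, audio_pkt, video_pkt],
    {0x100: "video", 0x101: "audio"},
)
```

A real demultiplexer would additionally reassemble packetized elementary stream (PES) payloads across packets; this sketch shows only the PID-based separation step.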
Network interface 218 may be configured to enable computing device 200 to send and receive data via a public network. As described above, data sent or received via a public network may include data associated with digital content, such as, for example, music, videos, images, webpages, messages, voice communications, and applications. Network interface 218 may include a network interface card, such as an Ethernet card, an optical transceiver, a radio frequency transceiver, or any other type of device configured to send and receive information. Network interface 218 may be configured to perform physical signaling, addressing, and channel access control according to the physical and MAC layers utilized in a public network, such as, for example, public network 108. Further, in a manner similar to that described above with respect to transport module 214 and A/V demux 216, network interface 218 may be configured to extract audio packets, video packets, and data packets from a data stream. For example, network interface 218 may be configured to extract video packets, audio packets, and data packets according to one or more of internet protocol (IP), transport control protocol (TCP), real time streaming protocol (RTSP), user datagram protocol (UDP), real time protocol (RTP), MPEG transport stream protocols, and IPTV protocols. It should be noted that the techniques described herein are generally applicable to any and all methods of digital content distribution and are not limited to particular communications network implementations. For example, the techniques described herein may be applicable to digital content originating from one or more of a broadcast, a multicast, a unicast, an over-the-top content source, a personal video recorder (PVR), and a peer-to-peer content source.
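The RTP-based extraction mentioned above can be illustrated with a parser for the fixed 12-byte RTP header defined in RFC 3550 (a minimal sketch; the sample field values are arbitrary, and payload type 33 denotes an MPEG-2 transport stream per the RFC 3551 payload type registry):

```python
import struct

def parse_rtp_header(datagram: bytes):
    """Parse the fixed 12-byte RTP header (RFC 3550) from a UDP payload."""
    if len(datagram) < 12:
        raise ValueError("datagram too short for an RTP header")
    b0, b1, seq, timestamp, ssrc = struct.unpack("!BBHII", datagram[:12])
    return {
        "version": b0 >> 6,           # RTP version (2 for RFC 3550)
        "marker": bool(b1 & 0x80),
        "payload_type": b1 & 0x7F,    # 33 = MPEG-2 transport stream
        "sequence": seq,              # for loss detection and reordering
        "timestamp": timestamp,
        "ssrc": ssrc,                 # synchronization source identifier
    }

# A synthetic RTP header carrying an MPEG-2 transport stream payload:
datagram = struct.pack("!BBHII", 0x80, 33, 1000, 90000, 0x1234ABCD)
info = parse_rtp_header(datagram)
```

After stripping the RTP header, the remaining payload would typically contain an integral number of 188-byte transport stream packets, which could then be handled as described above for transport module 214 and AV demux 216.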
Referring again to
Audio decoder 224 may be configured to retrieve and process coded audio data. For example, audio decoder 224 may be a combination of hardware and software used to implement aspects of an audio codec. Audio data may be coded using multi-channel formats such as those developed by Dolby and Digital Theater Systems. Audio data may be coded using a compressed or uncompressed format. Examples of compressed audio formats include MPEG-1 and MPEG-2 Audio Layers II and III, AC-3, AAC, and Ogg Vorbis. An example of an uncompressed audio format is the pulse-code modulation (PCM) audio format. Audio processor 226 may be configured to retrieve captured audio samples and may process audio data for output to an audio system (not shown). In some examples, audio processor 226 may include a digital-to-analog converter. An audio system may comprise any of a variety of audio output devices, such as headphones, a single-speaker system, a multi-speaker system, or a surround sound system.
Video decoder 228 may be configured to retrieve and process coded video data. For example, video decoder 228 may be a combination of hardware and software used to implement aspects of a video codec. In one example, video decoder 228 may be configured to decode video data encoded according to any number of video compression standards, such as ITU-T H.261, ISO/IEC MPEG-1 Visual, ITU-T H.262 or ISO/IEC MPEG-2 Visual, ITU-T H.263, ISO/IEC MPEG-4 Visual, ITU-T H.264 (also known as ISO/IEC MPEG-4 AVC), VP8, and High-Efficiency Video Coding (HEVC).
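A decoder-selection mechanism of the kind implied above can be sketched as a simple registry mapping codec identifiers to decoding routines. The codec keys and placeholder decode functions below are illustrative assumptions, not part of the disclosure.

```python
from typing import Callable

# Hypothetical registry sketch: a video decoder might dispatch to a
# codec-specific routine based on an identifier carried in the stream
# metadata. The decode functions here are placeholders.
DECODERS: dict[str, Callable[[bytes], str]] = {}

def register_decoder(codec: str):
    def wrap(fn):
        DECODERS[codec] = fn
        return fn
    return wrap

@register_decoder("avc")   # ITU-T H.264 / ISO/IEC MPEG-4 AVC
def decode_avc(data: bytes) -> str:
    return f"decoded {len(data)} bytes as H.264/AVC"

@register_decoder("hevc")  # High-Efficiency Video Coding
def decode_hevc(data: bytes) -> str:
    return f"decoded {len(data)} bytes as HEVC"

def decode(codec: str, data: bytes) -> str:
    """Dispatch coded video data to the registered codec routine."""
    try:
        return DECODERS[codec](data)
    except KeyError:
        raise ValueError(f"unsupported codec: {codec}") from None
```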
As described above, a device with media playback capabilities may provide a graphical user interface (e.g., an EPG) that enables a user to select content. A graphical user interface may include images and graphics displayed in conjunction with video content (e.g., playback icons overlaid on a video). Graphics processing unit 230 is an example of a dedicated processing unit that may be configured to generate graphical user interfaces, including the graphical user interfaces described herein. That is, graphics processing unit 230 may be configured to receive commands and content data and output pixel data. Graphics processing unit 230 may operate according to a graphics pipeline process (e.g., input assembler, vertex shader, geometry shader, rasterizer, pixel shader, and output merger). Graphics processing unit 230 may include multiple processing cores and may be configured to operate according to OpenGL (Open Graphics Library, managed by the Khronos Group) and/or Direct3D (managed by Microsoft, Inc.).
Display processor 232 may be configured to retrieve and process pixel data for display. For example, display processor 232 may receive pixel data from video decoder 228 and/or graphics processing unit 230 and output data for display. Display processor 232 may be coupled to a display, such as display 250 (not shown in
As described above, traditional electronic program guides (EPGs) may be less than ideal for displaying available content originating from a plurality of diverse sources and further may be less than ideal for ultra-resolution displays. Further, traditional EPGs may be limited in how content may be sorted by a user. For example, a user may be limited to sorting on-air television programming by channel and time and may be limited to sorting movies and streaming content by genre and alphabetically. Further, traditional EPGs do not enable users to customize how content is sorted or provide dynamic sorting techniques. Computing device 200 may be configured to enable the selection of content by providing one or more of the graphical user interfaces described herein. The graphical user interfaces described herein may be provided to a computing device and/or an I/O device in communication with a computing device.
As illustrated in
Navigational items HOME, ON AIR TV, ON DEMAND, PHOTOS, MUSIC, APPS, SOCIAL, and MY STUFF represent different sources, types, and/or categories of content. As described in detail below, different graphical user interfaces may be presented to a user based on the navigational item that is selected. In the example illustrated in
In the example illustrated in
In one example, individual television shows represented by icon 308, individual movies represented by icon 310, and individual musical albums represented by icon 312 may be presented to a user based on an algorithm that determines the likelihood a user will select a particular piece of content. For example, computing device 200 may be configured to present individual pieces of content based on any and all combinations of consumption, behavior, and environment. In one example, consumption may include content a user has accessed or is accessing. In one example, behavior may include user usage information such as, for example, how fast the user changes channels, how often the user skips commercials, and how frequently the user accesses content through a computing device. In one example, environment may include the time (e.g., hour, day, month, or year) and location (e.g., home, car, or airport) of a computing device. For example, an algorithm may determine that a user prefers to watch crime dramas on Friday nights, and graphical user interface 300 may present an icon representing a crime drama television show at a center position and/or the icon may be larger than other icons. Further, in one example, graphical user interface 300 may incorporate 3D effects such that icons appear to be positioned in the foreground or the background based on the likelihood of selection.
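The likelihood-of-selection determination described above can be sketched as a weighted combination of the three signal groups. The weights and per-signal scores are illustrative assumptions; the disclosure does not specify a particular formula.

```python
# Sketch of a likelihood-of-selection score combining the three signal
# groups named above: consumption, behavior, and environment. Each
# per-signal score is assumed to lie in [0, 1]; the weights are
# illustrative, not taken from the disclosure.

def likelihood_score(consumption: float, behavior: float,
                     environment: float,
                     weights=(0.5, 0.3, 0.2)) -> float:
    """Weighted combination of per-signal scores."""
    wc, wb, we = weights
    return wc * consumption + wb * behavior + we * environment

def rank_icons(candidates: dict[str, tuple[float, float, float]]) -> list[str]:
    """Order content icons by descending likelihood of selection; the
    highest-scoring icon would be placed at the center position and/or
    rendered larger than the others."""
    return sorted(candidates,
                  key=lambda title: likelihood_score(*candidates[title]),
                  reverse=True)
```

In the Friday-night example, a crime drama's high environment score on Friday evenings would push it to the front of the ranking.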
As described above, applications may include games and programs operable on a computing device. As further described above, applications may be configured to allow a computing device to access content provided by a site in a manner specific to the computing device. Referring again to
In the example illustrated in
In one example, graphical user interface 300 may display content recommendations by displaying an information window or a preview window (not shown in
Further, in one example, a computing device may be configured to enable a user to select one of the plurality of icons using an I/O device. As described above, an I/O device may include a push-button remote control, a motion-based remote control, a device including a touch-sensitive screen, a device including a track pad, a mouse, a keyboard, a microphone, a video camera, a motion sensor, and/or an advanced user input device, such as a smart phone or a tablet computing device. For example, a user of computing device 200 may be able to select an icon using I/O device 222. Graphical user interface 300 may be configured such that each icon is capable of multiple levels of selection. For example, a user may be able to use an I/O device to move a cursor, which may be visible or invisible, to the location of an icon; remaining on the icon for a predetermined amount of time may be associated with one level of selection, while activation of one or more controls on the I/O device (e.g., a single tap or a double tap on a touch-sensitive display) may be associated with other levels of selection (e.g., display information or provide a preview).
In one example, graphical user interface 300 may be configured to enable four levels of selection for each icon: a level that enlarges or highlights an icon, a level that provides information associated with content (e.g., cast and crew information), a level that provides a preview of content associated with an icon (e.g., a trailer), and a level that provides full access to content associated with an icon (e.g., play movie or television show or launch an application). These levels of selection are described in further detail below with respect to
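The four selection levels just described can be sketched as a mapping from input gestures to levels. The dwell-time thresholds and gesture-to-level assignments are illustrative assumptions.

```python
from typing import Optional

# Sketch of the four selection levels described above: highlight,
# information, preview, and full access. Thresholds and the
# gesture-to-level mapping are illustrative assumptions.

HIGHLIGHT, INFO, PREVIEW, FULL_ACCESS = range(1, 5)

def selection_level(dwell_ms: int, taps: int) -> Optional[int]:
    """Map cursor dwell time and control activations on an icon to a level."""
    if taps >= 2:
        return FULL_ACCESS   # e.g., double tap plays content or launches an app
    if taps == 1:
        return PREVIEW       # e.g., single tap shows a trailer
    if dwell_ms >= 2000:
        return INFO          # long hover shows cast and crew information
    if dwell_ms >= 500:
        return HIGHLIGHT     # short hover enlarges or highlights the icon
    return None              # no selection yet
```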
In addition to enabling the user to select icons, graphical user interface 300 may enable a user to select one of navigational items 306. In one example, a user may be able to select one of ON AIR TV, ON DEMAND, PHOTOS, MUSIC, APPS, SOCIAL, and MY STUFF by moving a cursor to a corresponding location and/or by activating another navigational button on an I/O device. As described above, activation of one or more of “+” and “−” channel buttons 264, reverse playback button 271 and forward playback button 273, and navigational arrow buttons 281 may cause a selected navigational item 304 to change. In one example, a user may initially select one of navigational items 306 using navigational arrow buttons 281 and may further select an initially selected navigational item 306 by activating select button 282. In one example, upon activation of the select button 282, a graphical user interface other than graphical user interface 300 may be presented (e.g., graphical user interface 400 if ON AIR TV is selected) and menu bar 302 may become inactive. In one example, when menu bar 302 becomes inactive a user may not be able to change a selected navigational item using navigational arrow buttons 281, as navigational arrow buttons 281 may be used to navigate a respective graphical user interface. A user may need to reactivate menu bar 302 before navigational arrow buttons 281 may be used to select a navigational item 306. In one example, menu bar 302 may be reactivated upon a user activating back button 286.
As described above with respect to
As illustrated in
It should be noted that although only television shows are displayed in the examples of
A computing device may be configured to enable a user to select one of the icons using an I/O device. For example, a user of computing device 200 may be able to select an icon using I/O device 222. Similar to the levels of selection available for icons described above with respect to graphical user interface 300, graphical user interface 400 may enable multiple levels of selection for each icon. In the examples illustrated in
In addition to enabling a user to select individual icons within stack structure 402 and stack structure 404, graphical user interface 400 may enable a user to select a stack structure. In the example illustrated in
As illustrated in
In the examples illustrated in
The example graphical user interfaces illustrated in
As described above with respect to
It should be noted that although the examples illustrated in
In another example, a sort cycle may change based on the frequency with which a sub-category type is selected. For example, the sub-category type most commonly used by a user for browsing content may be displayed first in a sorting cycle and less frequently used sub-category types may be displayed later in a sorting cycle. In one example, a computing device may analyze metadata associated with content and provide a ranked list of sorting cycles available to the user. For example, with respect to the example sort cycle described with respect to
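A frequency-ordered sort cycle of the kind described above can be sketched as follows; the usage counts and sub-category identifiers are hypothetical.

```python
from collections import Counter

# Sketch of a sort cycle that reorders itself by how frequently the
# user browses with each sub-category type, as described above.

def build_sort_cycle(usage: Counter, default_order: list[str]) -> list[str]:
    """Most frequently used sub-category types come first; ties and
    unused types fall back to the default cycle order."""
    return sorted(default_order,
                  key=lambda t: (-usage[t], default_order.index(t)))
```

For example, a user who mostly browses by genre would see genre first in the cycle, with rarely used sub-category types pushed toward the end.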
In addition to the sub-category types described above with respect to the ON AIR TV navigational item, other sub-category types may be associated with other navigational items. Sub-category types for content associated with the ON DEMAND navigational item may include service providers, genres, titles, actors, directors, and/or popularity rankings. Sub-category types for content associated with the ON DEMAND navigational item are described in greater detail below with respect to
In a manner similar to that described above with respect to the ON AIR TV navigational item, a computing device may enable a user to change a sub-category type, and thus, how content within a category is organized, for content respectively associated with each of the ON DEMAND navigational item, PHOTOS navigational item, MUSIC navigational item, APPS navigational item, SOCIAL navigational item, and MY STUFF navigational item. Further, computing device 200 may be configured to enable a user to progress through respective sort cycles for each of the navigational items.
In one example, computing device 200 may be configured to use one or more of the following sort cycles for content associated with the respective navigational items: ON AIR TV: availability time, network, alphabetical by title, most watched now, most popular amongst friends, and/or most trending shows. ON DEMAND: genre and/or sub-genre, alphabetical by title, chronological by release date, and/or popularity. PHOTOS: alphabetical by file name, chronological by date taken, alphabetical by photo album name, geographic location, subject in photo or faces, and/or source of photo. MUSIC: alphabetical by song title, alphabetical by album name, alphabetical by group or artist name, chronological by release date, genre, popularity, and/or source of music. APPS: genre, price, release date, and/or popularity. In a manner similar to that described above with respect to content associated with ON AIR TV, each of these sort cycles may be modified based on one or more of consumption, behavior, and environment, and/or may be set by a user. It should be noted that these are merely example sort cycles, and each could be personalized further by determining a user's preferred sorting techniques. In this manner, each individual user could have their own sorting preferences.
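The per-category sort cycles enumerated above can be captured as a configuration mapping, with a helper that advances through a cycle on successive sort-button activations. The identifiers are illustrative simplifications of the sub-category names above.

```python
# Sort cycles from the description above, expressed as a mapping from
# navigational item to its ordered list of sub-category types.
SORT_CYCLES: dict[str, list[str]] = {
    "ON_AIR_TV": ["availability_time", "network", "title",
                  "most_watched_now", "popular_with_friends", "trending"],
    "ON_DEMAND": ["genre", "title", "release_date", "popularity"],
    "PHOTOS": ["file_name", "date_taken", "album_name",
               "location", "faces", "source"],
    "MUSIC": ["song_title", "album_name", "artist_name",
              "release_date", "genre", "popularity", "source"],
    "APPS": ["genre", "price", "release_date", "popularity"],
}

def next_sub_category(item: str, current: str) -> str:
    """Advance to the next sub-category type in the item's sort cycle,
    wrapping around (e.g., on successive activations of a sort button)."""
    cycle = SORT_CYCLES[item]
    return cycle[(cycle.index(current) + 1) % len(cycle)]
```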
It should be noted that, for the sake of brevity, graphical user interfaces corresponding to PHOTOS, MUSIC, APPS, and SOCIAL being the selected navigational item are not individually represented in the drawings. However, when one of PHOTOS, MUSIC, APPS, and SOCIAL is the selected navigational item, associated content may be organized into stack structures based on any of the sub-categories described above and a user may be able to select stack structures and icons in a manner similar to that described with respect to
As described above, content associated with MY STUFF navigational item includes content personalized to a user. In the example illustrated in
Thus, as illustrated in
As described above, a user of a computing device may be able to select a stack using an I/O device, such as I/O device 222. Examples of additional levels of selection available for a stack are described below with respect to
In the example illustrated in
The position of icons within a stack may be based on an algorithm that determines the likelihood a user will select a particular icon. For example, recommended movies may be positioned within a stack in a position that facilitates selection by a user (e.g., at a center position). In a manner similar to that described above with respect to
In the example illustrated in
In the example illustrated in
Referring again to
As illustrated in
It should be noted that a computing device may display any and all combinations of the graphical user interfaces illustrated in
Flowchart 1100 illustrates an example of how graphical user interfaces described herein may be presented to a user. It should be noted that although flowchart 1100 is described with respect to computing device 200, the techniques described with respect to flowchart 1100 may be performed by any and all combinations of components of computing device 200. Computing device 200 provides an initial graphical user interface to a user (1102). In one example, an initial graphical user interface may include graphical user interface 300 and may be presented to a user after a user logs in to a profile and/or upon a user activating guide button 285. Computing device 200 receives a category selection (1104). In one example, computing device 200 may receive a category selection from an I/O device and a user may indicate a category selection by selecting a navigational item included in a menu bar. In one example, a user may select a navigational item using stack structure navigational buttons 292. Computing device 200 displays content within a category according to sub-category stack structures (1106). For example, computing device 200 may respectively display one of graphical user interface 400, graphical user interface 500, graphical user interface 600, and graphical user interface 700. As described above, a user may be able to organize content associated with graphical user interface 400, graphical user interface 500, graphical user interface 600, and graphical user interface 700 by changing a sub-category type. In one example, computing device 200 may enable a user to progress through a sort cycle by successively activating sort button 291.
Computing device 200 receives a stack structure selection (1108). In one example, computing device 200 may receive a stack structure selection from I/O device 222 and a user may indicate a stack structure selection by highlighting a stack structure with a cursor and activating an I/O device control. In the example illustrated in flowchart 1100, computing device 200 displays content within a sub-category according to a mosaic (1110). In one example, computing device 200 may display graphical user interface 800. Computing device 200 receives a user content selection (1112). In one example, computing device 200 may receive a user content selection according to the techniques described above with respect to
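The steps of flowchart 1100 can be sketched as a linear flow. User I/O is simulated with canned inputs, and the state names are illustrative rather than taken from the disclosure.

```python
# Sketch of the presentation flow of flowchart 1100: initial interface,
# category selection, stack-structure display, stack selection, mosaic
# display, and content selection.

def run_guide_flow(inputs: dict) -> list[str]:
    """Walk the flowchart steps, returning the sequence of UI states."""
    states = ["initial_interface"]                    # provide initial GUI (1102)
    category = inputs["category"]                     # receive category selection (1104)
    states.append(f"stacks:{category}")               # display sub-category stacks (1106)
    stack = inputs["stack"]                           # receive stack selection (1108)
    states.append(f"mosaic:{category}/{stack}")       # display mosaic (1110)
    content = inputs["content"]                       # receive content selection (1112)
    states.append(f"playback:{content}")
    return states
```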
The disclosed and other embodiments, modules and the functional operations described in this document can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this document and their structural equivalents, or in combinations of one or more of them. The disclosed and other embodiments can be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer readable medium for execution by, or to control the operation of, data processing apparatus. The computer readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The term “data processing apparatus” encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them. A propagated signal is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, which is generated to encode information for transmission to suitable receiver apparatus.
A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a standalone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a mark-up language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
The processes and logic flows described in this document can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks. However, a computer need not have such devices. Computer readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto optical disks; and CD ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
While this patent document contains many specifics, these should not be construed as limitations on the scope of an invention that is claimed or of what may be claimed, but rather as descriptions of features specific to particular embodiments. Certain features that are described in this document in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable sub-combination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a sub-combination or a variation of a sub-combination. Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results.
Only a few examples and implementations are disclosed. Variations, modifications, and enhancements to the described examples and implementations and other implementations can be made based on what is disclosed.
This application is a continuation of U.S. patent application Ser. No. 17/249,876, filed on Mar. 17, 2021, which is a continuation of U.S. patent application Ser. No. 16/745,570, filed on Jan. 17, 2020, which is a continuation of U.S. patent application Ser. No. 16/148,843, filed on Oct. 1, 2018, which is a continuation of U.S. patent application Ser. No. 15/841,904, filed on Dec. 14, 2017, which is a continuation of U.S. patent application Ser. No. 14/336,758, filed on Jul. 21, 2014, which is a continuation-in-part of U.S. patent application Ser. No. 14/242,459, filed on Apr. 1, 2014, which claims the benefit of priority to U.S. Provisional Application Ser. No. 61/876,188, filed on Sep. 10, 2013 and U.S. Provisional Application Ser. No. 61/876,199, filed on Sep. 10, 2013. The above applications are incorporated herein by reference in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
5623613 | Rowe et al. | Apr 1997 | A |
5798785 | Hendricks et al. | Aug 1998 | A |
5812123 | Rowe et al. | Sep 1998 | A |
6795826 | Flinn et al. | Sep 2004 | B2 |
6804675 | Knight et al. | Oct 2004 | B1 |
6845374 | Oliver et al. | Jan 2005 | B1 |
7188356 | Miura et al. | Mar 2007 | B1 |
7353235 | Sally et al. | Apr 2008 | B2 |
7607150 | Kobayashi et al. | Oct 2009 | B1 |
7644427 | Horvitz et al. | Jan 2010 | B1 |
7739604 | Lyons et al. | Jun 2010 | B1 |
7757250 | Horvitz et al. | Jul 2010 | B1 |
7801419 | Sakai et al. | Sep 2010 | B2 |
7853600 | Herz et al. | Dec 2010 | B2 |
7895624 | Thomas et al. | Feb 2011 | B1 |
7904924 | de Heer et al. | Mar 2011 | B1 |
8108341 | Barsook et al. | Jan 2012 | B2 |
8230360 | Ma et al. | Jul 2012 | B2 |
8234147 | Olejniczak et al. | Jul 2012 | B2 |
8286206 | Aaron et al. | Oct 2012 | B1 |
8346624 | Goad et al. | Jan 2013 | B2 |
8402031 | Govani et al. | Mar 2013 | B2 |
8429530 | Neuman et al. | Apr 2013 | B2 |
8515975 | Federici | Aug 2013 | B1 |
8539359 | Rapaport et al. | Sep 2013 | B2 |
8666979 | Chen et al. | Mar 2014 | B2 |
8677235 | Chronister et al. | Mar 2014 | B2 |
8780163 | Cahill et al. | Jul 2014 | B2 |
8803882 | Lam et al. | Aug 2014 | B2 |
9009768 | Agnihotri et al. | Apr 2015 | B2 |
9135333 | Cameron et al. | Sep 2015 | B2 |
9595300 | Duffin et al. | Mar 2017 | B2 |
9602563 | Barkai et al. | Mar 2017 | B2 |
9678623 | Neuman et al. | Jun 2017 | B2 |
9699503 | Fishman | Jul 2017 | B2 |
9883250 | Chai et al. | Jan 2018 | B2 |
10080060 | Fishman et al. | Sep 2018 | B2 |
10129600 | Fishman et al. | Nov 2018 | B2 |
10210160 | Fishman et al. | Feb 2019 | B2 |
10419817 | Fishman et al. | Sep 2019 | B2 |
10595094 | Fishman et al. | Mar 2020 | B2 |
10992995 | Fishman et al. | Apr 2021 | B2 |
11363342 | Fishman et al. | Jun 2022 | B2 |
20020010625 | Smith et al. | Jan 2002 | A1 |
20020011988 | Sai et al. | Jan 2002 | A1 |
20020053084 | Escobar et al. | May 2002 | A1 |
20020059593 | Shao et al. | May 2002 | A1 |
20020112239 | Goldman | Aug 2002 | A1 |
20030037334 | Khoo et al. | Feb 2003 | A1 |
20030093790 | Logan et al. | May 2003 | A1 |
20030172374 | Vinson et al. | Sep 2003 | A1 |
20030217365 | Caputo | Nov 2003 | A1 |
20030236695 | Litwin, Jr. | Dec 2003 | A1 |
20040054572 | Oldale et al. | Mar 2004 | A1 |
20040056900 | Blume | Mar 2004 | A1 |
20040252119 | Hunleth et al. | Dec 2004 | A1 |
20050038717 | McQueen et al. | Feb 2005 | A1 |
20050097121 | Sally et al. | May 2005 | A1 |
20060008256 | Khedouri et al. | Jan 2006 | A1 |
20060010464 | Azami | Jan 2006 | A1 |
20060123448 | Ma et al. | Jun 2006 | A1 |
20060200434 | Flinn et al. | Sep 2006 | A1 |
20060242554 | Gerace et al. | Oct 2006 | A1 |
20060250358 | Wroblewski | Nov 2006 | A1 |
20060277098 | Chung et al. | Dec 2006 | A1 |
20070011702 | Vaysman | Jan 2007 | A1 |
20070043617 | Stein et al. | Feb 2007 | A1 |
20070061745 | Anthony et al. | Mar 2007 | A1 |
20070074123 | Omura et al. | Mar 2007 | A1 |
20070100824 | Richardson et al. | May 2007 | A1 |
20070107019 | Romano et al. | May 2007 | A1 |
20070136753 | Bovenschulte et al. | Jun 2007 | A1 |
20070157248 | Ellis | Jul 2007 | A1 |
20070192794 | Curtis et al. | Aug 2007 | A1 |
20070220543 | Shanks et al. | Sep 2007 | A1 |
20070240180 | Shanks et al. | Oct 2007 | A1 |
20080066010 | Brodersen et al. | Mar 2008 | A1 |
20080092173 | Shannon et al. | Apr 2008 | A1 |
20080117202 | Martinez et al. | May 2008 | A1 |
20080134053 | Fischer | Jun 2008 | A1 |
20080155588 | Roberts et al. | Jun 2008 | A1 |
20080178239 | Yampanis | Jul 2008 | A1 |
20080222106 | Rao et al. | Sep 2008 | A1 |
20080282190 | Kagaya | Nov 2008 | A1 |
20080301118 | Chien et al. | Dec 2008 | A1 |
20090006374 | Kim et al. | Jan 2009 | A1 |
20090006398 | Lam et al. | Jan 2009 | A1 |
20090031354 | Riley et al. | Jan 2009 | A1 |
20090037254 | Colando | Feb 2009 | A1 |
20090046101 | Askey et al. | Feb 2009 | A1 |
20090049476 | Jung et al. | Feb 2009 | A1 |
20090060469 | Olague et al. | Mar 2009 | A1 |
20090070185 | Farrelly | Mar 2009 | A1 |
20090083326 | Pelton | Mar 2009 | A1 |
20090089433 | Kisel et al. | Apr 2009 | A1 |
20090092183 | O'hern | Apr 2009 | A1 |
20090100469 | Conradt et al. | Apr 2009 | A1 |
20090119258 | Petty | May 2009 | A1 |
20090144773 | Cavanaugh et al. | Jun 2009 | A1 |
20090150214 | Mohan | Jun 2009 | A1 |
20090150786 | Brown | Jun 2009 | A1 |
20090158337 | Stiers et al. | Jun 2009 | A1 |
20090163183 | O'donoghue et al. | Jun 2009 | A1 |
20090164450 | Martinez et al. | Jun 2009 | A1 |
20090177989 | Ma et al. | Jul 2009 | A1 |
20090182725 | Govani et al. | Jul 2009 | A1 |
20090249393 | Shelton et al. | Oct 2009 | A1 |
20090265359 | Barsook et al. | Oct 2009 | A1 |
20100042608 | Kane, Jr. | Feb 2010 | A1 |
20100058241 | Saijo et al. | Mar 2010 | A1 |
20100071000 | Amento et al. | Mar 2010 | A1 |
20100083318 | Weare et al. | Apr 2010 | A1 |
20100088312 | Goldfeder | Apr 2010 | A1 |
20100201618 | Lorente | Aug 2010 | A1 |
20100235745 | Shintani | Sep 2010 | A1 |
20100293034 | Olejniczak et al. | Nov 2010 | A1 |
20100333143 | Civanlar et al. | Dec 2010 | A1 |
20110035707 | Kitayama | Feb 2011 | A1 |
20110060649 | Dunk et al. | Mar 2011 | A1 |
20110162008 | Aldrey et al. | Jun 2011 | A1 |
20110175867 | Satake | Jul 2011 | A1 |
20110225290 | Kansal et al. | Sep 2011 | A1 |
20110239158 | Barraclough et al. | Sep 2011 | A1 |
20110283189 | Mccarty | Nov 2011 | A1 |
20110283304 | Roberts et al. | Nov 2011 | A1 |
20110289189 | Bartholomew | Nov 2011 | A1 |
20110289422 | Spivack et al. | Nov 2011 | A1 |
20110320715 | Ickman et al. | Dec 2011 | A1 |
20120059825 | Fishman et al. | Mar 2012 | A1 |
20120060094 | Irwin et al. | Mar 2012 | A1 |
20120060195 | Fishman et al. | Mar 2012 | A1 |
20120124625 | Foote et al. | May 2012 | A1 |
20120311453 | Reyna et al. | Dec 2012 | A1 |
20130044050 | Ylivainio | Feb 2013 | A1 |
20130061140 | Nseir et al. | Mar 2013 | A1 |
20130066885 | Komuves | Mar 2013 | A1 |
20130073988 | Groten et al. | Mar 2013 | A1 |
20130086159 | Gharachorloo et al. | Apr 2013 | A1 |
20130152129 | Alberth et al. | Jun 2013 | A1 |
20130191401 | Xia et al. | Jul 2013 | A1 |
20130204825 | Su | Aug 2013 | A1 |
20130204833 | Pang et al. | Aug 2013 | A1 |
20130212178 | Krishnamurthy | Aug 2013 | A1 |
20130212493 | Krishnamurthy | Aug 2013 | A1 |
20130232414 | Neuman et al. | Sep 2013 | A1 |
20140052785 | Sirpal | Feb 2014 | A1 |
20140068689 | Sirpal et al. | Mar 2014 | A1 |
20140108541 | Kawai | Apr 2014 | A1 |
20140215512 | Maruyama et al. | Jul 2014 | A1 |
20140365873 | Willis et al. | Dec 2014 | A1 |
20150006280 | Ruiz et al. | Jan 2015 | A1 |
20150033109 | Marek | Jan 2015 | A1 |
20150074552 | Chai et al. | Mar 2015 | A1 |
20150074721 | Fishman et al. | Mar 2015 | A1 |
20150074728 | Chai et al. | Mar 2015 | A1 |
20150206269 | Qin | Jul 2015 | A1 |
20180020255 | Fishman et al. | Jan 2018 | A1 |
20180035161 | Fishman et al. | Feb 2018 | A1 |
20180220194 | Chai et al. | Aug 2018 | A1 |
20180234736 | Fishman et al. | Aug 2018 | A1 |
20190045272 | Fishman et al. | Feb 2019 | A1 |
20190258689 | Fishman et al. | Aug 2019 | A1 |
20200045369 | Fishman et al. | Feb 2020 | A1 |
20200260152 | Fishman et al. | Aug 2020 | A1 |
20210258651 | Fishman et al. | Aug 2021 | A1 |
Number | Date | Country |
---|---|---|
201101152 | Oct 2011 | AU |
2020202800 | Dec 2021 | AU |
2847496 | Oct 2017 | CA |
2923815 | Sep 2022 | CA |
2923853 | Oct 2022 | CA |
1310921 | Aug 2001 | CN |
1325235 | Dec 2001 | CN |
1784647 | Jun 2006 | CN |
101303697 | Nov 2008 | CN |
102473191 | May 2012 | CN |
102656898 | Sep 2012 | CN |
102685583 | Sep 2012 | CN |
103081502 | May 2013 | CN |
106105230 | Nov 2016 | CN |
106462316 | Feb 2017 | CN |
111522480 | Aug 2020 | CN |
112822561 | May 2021 | CN |
2490454 | Aug 2012 | EP |
2614444 | Jul 2013 | EP |
2001145087 | May 2001 | JP |
2002521929 | Jul 2002 | JP |
2002288018 | Oct 2002 | JP |
2003345729 | Dec 2003 | JP |
2004023118 | Jan 2004 | JP |
2007516496 | Jun 2007 | JP |
2007267173 | Oct 2007 | JP |
2007300563 | Nov 2007 | JP |
2007323543 | Dec 2007 | JP |
2008152584 | Jul 2008 | JP |
2008527539 | Jul 2008 | JP |
2008191779 | Aug 2008 | JP |
2008276705 | Nov 2008 | JP |
2009520379 | May 2009 | JP |
2009122981 | Jun 2009 | JP |
2009266238 | Nov 2009 | JP |
2010506299 | Feb 2010 | JP |
2011070316 | Apr 2011 | JP |
2011097400 | May 2011 | JP |
2011234198 | Nov 2011 | JP |
2012038292 | Feb 2012 | JP |
2012503832 | Feb 2012 | JP |
2012065265 | Mar 2012 | JP |
2013009444 | Jan 2013 | JP |
2013012954 | Jan 2013 | JP |
2013505599 | Feb 2013 | JP |
2013080507 | May 2013 | JP |
2013135466 | Jul 2013 | JP |
6673990 | Mar 2020 | JP |
6677781 | Mar 2020 | JP |
20060031600 | Apr 2006 | KR |
20080095972 | Oct 2008 | KR |
1020120094690 | Aug 2012 | KR |
102231535 | Mar 2021 | KR |
10201807442R | Sep 2018 | SG
WO-0033572 | Jun 2000 | WO |
WO-2004099903 | Nov 2004 | WO |
WO-2007078623 | Jul 2007 | WO |
WO-2008050613 | May 2008 | WO |
WO-2010084602 | Jul 2010 | WO |
WO-2011017316 | Feb 2011 | WO |
WO-2012015117 | Feb 2012 | WO |
WO-2012033489 | Mar 2012 | WO |
WO-2012088307 | Jun 2012 | WO |
WO-2012166925 | Dec 2012 | WO |
WO-2015038515 | Mar 2015 | WO |
WO-2015038516 | Mar 2015 | WO |
Entry |
---|
“A method of comfortably watching Sasuki, an Internet video”, ASCII Media Works Co., Ltd., with English translation, (Jul. 24, 2011), 7 pages. |
“U.S. Appl. No. 12/877,034, Appeal Brief filed Jun. 11, 2015”, 21 pgs. |
“U.S. Appl. No. 12/877,034, Appeal Decision dated Jan. 3, 2017”, 10 pgs. |
“U.S. Appl. No. 12/877,034, Decision on Pre-Appeal Brief Request dated Dec. 11, 2014”, 2 pgs. |
“U.S. Appl. No. 12/877,034, Examiner Interview Summary dated Jul. 24, 2013”, 3 pgs. |
“U.S. Appl. No. 12/877,034, Final Office Action dated Mar. 25, 2013”, 14 pgs. |
“U.S. Appl. No. 12/877,034, Final Office Action dated Jun. 13, 2014”, 14 pgs. |
“U.S. Appl. No. 12/877,034, Non Final Office Action dated Aug. 10, 2012”, 11 pgs. |
“U.S. Appl. No. 12/877,034, Non Final Office Action dated Oct. 1, 2013”, 13 pgs. |
“U.S. Appl. No. 12/877,034, Notice of Allowance dated Mar. 29, 2017”, 9 pgs. |
“U.S. Appl. No. 12/877,034, Pre-Appeal Brief Request filed Nov. 4, 2014”, 5 pgs. |
“U.S. Appl. No. 12/877,034, Response filed Feb. 26, 2014 to Non Final Office Action dated Oct. 1, 2013”, 13 pgs. |
“U.S. Appl. No. 12/877,034, Response filed Aug. 26, 2013 to Final Office Action dated Mar. 25, 2013”, 12 pgs. |
“U.S. Appl. No. 12/877,034, Response filed Nov. 13, 2012 to Non Final Office Action dated Aug. 10, 2012”, 11 pgs. |
“U.S. Appl. No. 12/877,993, Amendment with Request to Reopen Prosecution filed Jul. 7, 2017”, 18 pgs. |
“U.S. Appl. No. 12/877,993, Appeal Brief filed Feb. 24, 2016”, 20 pgs. |
“U.S. Appl. No. 12/877,993, Appeal Decision dated May 8, 2017”, 9 pgs. |
“U.S. Appl. No. 12/877,993, Examiner Interview Summary dated Mar. 19, 2018”, 3 pgs. |
“U.S. Appl. No. 12/877,993, Examiner Interview Summary dated May 21, 2019”, 3 pgs. |
“U.S. Appl. No. 12/877,993, Examiner's Answer dated Jun. 3, 2016 to Appeal Brief filed Feb. 24, 2016”, 10 pgs. |
“U.S. Appl. No. 12/877,993, Final Office Action dated Jan. 28, 2015”, 35 pgs. |
“U.S. Appl. No. 12/877,993, Final Office Action dated Mar. 15, 2013”, 30 pgs. |
“U.S. Appl. No. 12/877,993, Final Office Action dated Jul. 9, 2018”, 37 pgs. |
“U.S. Appl. No. 12/877,993, Final Office Action dated Jul. 23, 2019”, 35 pgs. |
“U.S. Appl. No. 12/877,993, Non Final Office Action dated Feb. 4, 2019”, 34 pgs. |
“U.S. Appl. No. 12/877,993, Non Final Office Action dated Jun. 20, 2014”, 31 pgs. |
“U.S. Appl. No. 12/877,993, Non Final Office Action dated Aug. 2, 2012”, 26 pgs. |
“U.S. Appl. No. 12/877,993, Non Final Office Action dated Dec. 15, 2017”, 36 pgs. |
“U.S. Appl. No. 12/877,993, Reply Brief filed Aug. 3, 2016 to Examiner's Answer dated Jun. 3, 2016”, 5 pgs. |
“U.S. Appl. No. 12/877,993, Response filed Mar. 15, 2018 to Non Final Office Action dated Dec. 15, 2017”, 25 pgs. |
“U.S. Appl. No. 12/877,993, Response filed May 6, 2019 to Non Final Office Action dated Feb. 4, 2019”, 15 pgs. |
“U.S. Appl. No. 12/877,993, Response filed Jul. 22, 2013 to Final Office Action dated Mar. 15, 2013”, 17 pgs. |
“U.S. Appl. No. 12/877,993, Response filed Oct. 14, 2014 to Non Final Office Action dated Jun. 20, 2014”, 19 pgs. |
“U.S. Appl. No. 12/877,993, Response filed Nov. 8, 2018 to Final Office Action dated Jul. 9, 2018”, 15 pgs. |
“U.S. Appl. No. 12/877,993, Response filed Dec. 3, 2012 to Non Final Office Action dated Aug. 2, 2012”, 17 pgs. |
“U.S. Appl. No. 12/878,001, Appeal Brief filed May 12, 2015”, 16 pgs. |
“U.S. Appl. No. 12/878,001, Appeal Decision dated Mar. 20, 2017”, 10 pgs. |
“U.S. Appl. No. 12/878,001, Examiner Interview Summary dated Jul. 24, 2013”, 3 pgs. |
“U.S. Appl. No. 12/878,001, Examiner Interview Summary dated Jul. 27, 2018”, 3 pgs. |
“U.S. Appl. No. 12/878,001, Examiner Interview Summary dated Dec. 18, 2017”, 3 pgs. |
“U.S. Appl. No. 12/878,001, Final Office Action dated Mar. 29, 2013”, 13 pgs. |
“U.S. Appl. No. 12/878,001, Final Office Action dated Apr. 23, 2018”, 18 pgs. |
“U.S. Appl. No. 12/878,001, Final Office Action dated Jul. 17, 2014”, 12 pgs. |
“U.S. Appl. No. 12/878,001, Non Final Office Action dated Aug. 9, 2012”, 11 pgs. |
“U.S. Appl. No. 12/878,001, Non Final Office Action dated Aug. 24, 2017”, 14 pgs. |
“U.S. Appl. No. 12/878,001, Non Final Office Action dated Oct. 3, 2013”, 12 pgs. |
“U.S. Appl. No. 12/878,001, Notice of Allowance dated Oct. 2, 2018”, 12 pgs. |
“U.S. Appl. No. 12/878,001, Request to Reopen Prosecution under 37 C.F.R. 41.50 filed May 19, 2017”, 8 pgs. |
“U.S. Appl. No. 12/878,001, Response filed Apr. 1, 2014 to Non Final Office Action dated Oct. 3, 2013”, 13 pgs. |
“U.S. Appl. No. 12/878,001, Response filed Jul. 23, 2018 to Final Office Action dated Apr. 23, 2018”, 12 pgs. |
“U.S. Appl. No. 12/878,001, Response filed Aug. 23, 2013 to Final Office Action dated Mar. 29, 2013”, 12 pgs. |
“U.S. Appl. No. 12/878,001, Response filed Nov. 9, 2012 to Non Final Office Action dated Aug. 9, 2012”, 11 pgs. |
“U.S. Appl. No. 12/878,001, Response filed Dec. 22, 2017 to Non Final Office Action dated Aug. 24, 2017”, 16 pgs. |
“U.S. Appl. No. 14/242,459, Advisory Action dated Sep. 2, 2015”, 6 pgs. |
“U.S. Appl. No. 14/242,459, Appeal Brief filed Dec. 4, 2015”, 17 pgs. |
“U.S. Appl. No. 14/242,459, Applicant Summary of Interview with Examiner filed Sep. 15, 2015”, 2 pgs. |
“U.S. Appl. No. 14/242,459, Applicant Summary of Interview with Examiner filed Nov. 17, 2015”, 4 pgs. |
“U.S. Appl. No. 14/242,459, Decision on Pre-Appeal Brief Request dated Nov. 4, 2015”, 4 pgs. |
“U.S. Appl. No. 14/242,459, Examiner Interview Summary dated Mar. 3, 2015”, 3 pgs. |
“U.S. Appl. No. 14/242,459, Examiner Interview Summary dated Jul. 21, 2015”, 3 pgs. |
“U.S. Appl. No. 14/242,459, Examiner Interview Summary dated Nov. 17, 2015”, 3 pgs. |
“U.S. Appl. No. 14/242,459, Examiners Answer to Appeal Brief dated Jul. 12, 2016”, 18 pgs. |
“U.S. Appl. No. 14/242,459, Final Office Action dated Jun. 19, 2015”, 21 pgs. |
“U.S. Appl. No. 14/242,459, Non Final Office Action dated Jan. 5, 2015”, 11 pgs. |
“U.S. Appl. No. 14/242,459, Pre-Brief Conference request filed Sep. 15, 2015”, 5 pgs. |
“U.S. Appl. No. 14/242,459, Response filed Feb. 19, 2015 to Non Final Office Action dated Jan. 5, 2015”, 9 pgs. |
“U.S. Appl. No. 14/242,459, Response filed Jul. 21, 2015 to Final Office Action dated Jun. 19, 2015”, 11 pgs. |
“U.S. Appl. No. 14/260,677, Advisory Action dated Dec. 9, 2016”, 3 pgs. |
“U.S. Appl. No. 14/260,677, Corrected Notice of Allowance dated Nov. 3, 2017”, 2 pgs. |
“U.S. Appl. No. 14/260,677, Examiner Interview Summary dated Aug. 28, 2017”, 3 pgs. |
“U.S. Appl. No. 14/260,677, Final Office Action dated Sep. 23, 2016”, 20 pgs. |
“U.S. Appl. No. 14/260,677, Non Final Office Action dated Jun. 6, 2017”, 18 pgs. |
“U.S. Appl. No. 14/260,677, Non Final Office Action dated Jun. 7, 2016”, 15 pgs. |
“U.S. Appl. No. 14/260,677, Notice of Allowability dated Sep. 25, 2017”, 2 pgs. |
“U.S. Appl. No. 14/260,677, Notice of Allowance dated Sep. 12, 2017”, 7 pgs. |
“U.S. Appl. No. 14/260,677, Response filed Aug. 29, 2017 to Non Final Office Action dated Jun. 6, 2017”, 11 pgs. |
“U.S. Appl. No. 14/260,677, Response filed Sep. 6, 2016 to Non Final Office Action dated Jun. 7, 2016”, 9 pgs. |
“U.S. Appl. No. 14/260,677, Response filed Dec. 1, 2016 to Final Office Action dated Sep. 23, 2016”, 11 pgs. |
“U.S. Appl. No. 14/336,758, Advisory Action dated Mar. 9, 2016”, 3 pgs. |
“U.S. Appl. No. 14/336,758, Appeal Brief filed May 20, 2016”, 18 pgs. |
“U.S. Appl. No. 14/336,758, Appeal Decision dated May 17, 2017”, 6 pgs. |
“U.S. Appl. No. 14/336,758, Examiner Interview Summary dated Dec. 20, 2017”, 3 pgs. |
“U.S. Appl. No. 14/336,758, Examiners Answer to Appeal Brief dated Jul. 12, 2016”, 13 pgs. |
“U.S. Appl. No. 14/336,758, Final Office Action dated Nov. 25, 2015”, 11 pgs. |
“U.S. Appl. No. 14/336,758, Non Final Office Action dated Jan. 29, 2015”, 10 pgs. |
“U.S. Appl. No. 14/336,758, Non Final Office Action dated Jul. 23, 2015”, 10 pgs. |
“U.S. Appl. No. 14/336,758, Notice of Allowance dated May 4, 2018”, 8 pgs. |
“U.S. Appl. No. 14/336,758, Notice of Allowance dated Aug. 1, 2017”, 8 pgs. |
“U.S. Appl. No. 14/336,758, Notice of Allowance dated Sep. 14, 2017”, 8 pgs. |
“U.S. Appl. No. 14/336,758, PTO Response to rule 312 Communication dated Dec. 21, 2017”, 2 pgs. |
“U.S. Appl. No. 14/336,758, Reply Brief filed Aug. 31, 2016”, 4 pgs. |
“U.S. Appl. No. 14/336,758, Response filed Feb. 25, 2016 to Final Office Action dated Nov. 25, 2015”, 5 pgs. |
“U.S. Appl. No. 14/336,758, Response filed Apr. 28, 2015 to Non Final Office Action dated Jan. 29, 2015”, 10 pgs. |
“U.S. Appl. No. 14/336,758, Response filed Sep. 22, 2015 to Non Final Office Action dated Jul. 23, 2015”, 14 pgs. |
“U.S. Appl. No. 15/637,561, Advisory Action dated Mar. 6, 2019”, 3 pgs. |
“U.S. Appl. No. 15/637,561, Final Office Action dated Nov. 26, 2018”, 20 pgs. |
“U.S. Appl. No. 15/637,561, Non Final Office Action dated Apr. 23, 2018”, 22 pgs. |
“U.S. Appl. No. 15/637,561, Notice of Allowance dated Apr. 18, 2019”, 12 pgs. |
“U.S. Appl. No. 15/637,561, Preliminary Amendment filed Oct. 5, 2017”, 7 pgs. |
“U.S. Appl. No. 15/637,561, Response filed Jan. 22, 2019 to Final Office Action dated Nov. 26, 2018”, 11 pgs. |
“U.S. Appl. No. 15/637,561, Response filed Jul. 23, 2018 to Non Final Office Action dated Apr. 23, 2018”, 11 pgs. |
“U.S. Appl. No. 15/637,561, Response filed Mar. 26, 2019 to Final Office Action dated Nov. 26, 2018”, 8 pgs. |
“U.S. Appl. No. 15/726,102, Non Final Office Action dated Apr. 18, 2018”, 32 pgs. |
“U.S. Appl. No. 15/726,102, Preliminary Amendment filed Oct. 6, 2017”, 7 pgs. |
“U.S. Appl. No. 15/841,904, Notice of Allowance dated Jul. 2, 2018”, 8 pgs. |
“U.S. Appl. No. 15/841,904, Preliminary Amendment filed May 8, 2018”, 8 pgs. |
“U.S. Appl. No. 15/882,472, Examiner Interview Summary dated Jan. 7, 2021”, 3 pgs. |
“U.S. Appl. No. 15/882,472, Examiner Interview Summary dated Mar. 26, 2021”, 2 pgs. |
“U.S. Appl. No. 15/882,472, Examiner Interview Summary dated Apr. 30, 2020”, 3 pgs. |
“U.S. Appl. No. 15/882,472, Examiner Interview Summary dated Nov. 5, 2019”, 3 pgs. |
“U.S. Appl. No. 15/882,472, Final Office Action dated Jan. 22, 2021”, 18 pgs. |
“U.S. Appl. No. 15/882,472, Final Office Action dated Nov. 27, 2019”, 16 pgs. |
“U.S. Appl. No. 15/882,472, Final Office Action dated Dec. 24, 2021”, 19 pgs. |
“U.S. Appl. No. 15/882,472, Non Final Office Action dated May 26, 2022”, 19 pgs. |
“U.S. Appl. No. 15/882,472, Non Final Office Action dated Jul. 12, 2019”, 16 pgs. |
“U.S. Appl. No. 15/882,472, Non Final Office Action dated Aug. 19, 2021”, 16 pgs. |
“U.S. Appl. No. 15/882,472, Non Final Office Action dated Oct. 5, 2020”, 15 pgs. |
“U.S. Appl. No. 15/882,472, Preliminary Amendment Filed Apr. 23, 2018”, 7 pgs. |
“U.S. Appl. No. 15/882,472, Response filed Jan. 5, 2021 to Non Final Office Action dated Oct. 5, 2020”, 11 pgs. |
“U.S. Appl. No. 15/882,472, Response filed Mar. 22, 2022 to Final Office Action dated Dec. 24, 2021”, 17 pgs. |
“U.S. Appl. No. 15/882,472, Response filed Apr. 14, 2020 to Final Office Action dated Nov. 27, 2019”, 11 pgs. |
“U.S. Appl. No. 15/882,472, Response filed Apr. 22, 2021 to Final Office Action dated Jan. 22, 2021”, 13 pgs. |
“U.S. Appl. No. 15/882,472, Response Filed Jan. 6, 2019 to Non Final Office Action dated Jul. 12, 2019”, 10 pgs. |
“U.S. Appl. No. 15/882,472, Response filed Nov. 5, 2021 to Non Final Office Action dated Aug. 19, 2021”, 18 pgs. |
“U.S. Appl. No. 16/148,843, Notice of Allowance dated Jul. 18, 2019”, 8 pgs. |
“U.S. Appl. No. 16/148,843, Notice of Allowance dated Oct. 18, 2019”, 8 pgs. |
“U.S. Appl. No. 16/148,843, Supplemental Amendment filed Oct. 9, 2019”, 8 pgs. |
“U.S. Appl. No. 16/237,022, Non Final Office Action dated Jan. 21, 2020”, 34 pgs. |
“U.S. Appl. No. 16/237,022, Preliminary Amendment Filed May 14, 2019”, 8 pgs. |
“U.S. Appl. No. 16/511,648, Preliminary Amendment Filed Oct. 29, 2019”, 6 pgs. |
“U.S. Appl. No. 16/745,570, Non Final Office Action dated Sep. 2, 2020”, 8 pgs. |
“U.S. Appl. No. 16/745,570, Notice of Allowance dated Dec. 18, 2020”, 8 pgs. |
“U.S. Appl. No. 16/745,570, Preliminary Amendment filed May 6, 2020”, 9 pgs. |
“U.S. Appl. No. 16/745,570, Response filed Dec. 2, 2020 to Non Final Office Action dated Sep. 2, 2020”, 12 pgs. |
“U.S. Appl. No. 17/249,876, Non Final Office Action dated Oct. 7, 2021”, 7 pgs. |
“U.S. Appl. No. 17/249,876, Notice of Allowance dated Feb. 16, 2022”, 8 pgs. |
“U.S. Appl. No. 17/249,876, Preliminary Amendment filed May 12, 2021”, 8 pgs. |
“U.S. Appl. No. 17/249,876, Response filed Dec. 21, 2021 to Non Final Office Action dated Oct. 7, 2021”, 10 pgs. |
“Australian Application Serial No. 2011101152, Examination Report No. 1 dated May 6, 2013”, 4 pgs. |
“Australian Application Serial No. 2011101152, Response filed Sep. 17, 2013 to Examination Report No. 1 dated May 6, 2013”, 13 pgs. |
“Australian Application Serial No. 2011299234, Amendment filed Apr. 4, 2013”, 11 pgs. |
“Australian Application Serial No. 2011299234, Amendment filed Aug. 25, 2015”, 26 pgs. |
“Australian Application Serial No. 2011299234, First Examiner Report dated Aug. 25, 2014”, 3 pgs. |
“Australian Application Serial No. 2011299234, Response filed Oct. 26, 2015 to Subsequent Examiners Report dated Sep. 4, 2015”, 3 pgs. |
“Australian Application Serial No. 2011299234, Subsequent Examiners Report dated Sep. 4, 2015”, 4 pgs. |
“Australian Application Serial No. 2014318961, First Examination Report dated Apr. 23, 2018”, 4 pgs. |
“Australian Application Serial No. 2014318961, Response filed Jan. 6, 2020 to First Examination Report dated Apr. 23, 2018”, 15 pgs. |
“Australian Application Serial No. 2014318962, First Examination Report dated Apr. 5, 2018”, 6 pgs. |
“Australian Application Serial No. 2014318962, Response filed Jan. 16, 2019 to Subsequent Examiners Report dated Sep. 25, 2018”, 12 pgs. |
“Australian Application Serial No. 2014318962, Response filed Aug. 29, 2018 to First Examination Report dated Apr. 5, 2018”, 14 pgs. |
“Australian Application Serial No. 2014318962, Subsequent Examiners Report dated Sep. 25, 2018”, 5 pgs. |
“Australian Application Serial No. 2016201377, First Examiner Report dated Feb. 1, 2017”, 3 pgs. |
“Australian Application Serial No. 2016201377, Response filed May 25, 2017 to First Examiner Report dated Feb. 1, 2017”, 55 pgs. |
“Australian Application Serial No. 2016201377, Response filed Aug. 9, 2017 to Subsequent Examiners Report dated Jun. 6, 2017”, 2 pgs. |
“Australian Application Serial No. 2016201377, Subsequent Examiners Report dated Jun. 6, 2017”, 3 pgs. |
“Australian Application Serial No. 2016201377, Subsequent Examiners Report dated Aug. 23, 2017”, 3 pgs. |
“Australian Application Serial No. 2020202800, First Examination Report dated Jun. 15, 2021”, 4 pages. |
“Australian Application Serial No. 2020202800, Response filed Nov. 2, 2021 to First Examination Report dated Jun. 15, 2021”, 3 pgs. |
“Brazilian Application Serial No. BR1120130055251, Office Action dated Sep. 24, 2019”, w/English translation, 7 pgs. |
“Brazilian Application Serial No. BR1120130055251, Response filed Dec. 23, 2019 to Office Action dated Sep. 24, 2019”, w/English Claims, 32 pgs. |
“Brazilian Application Serial No. 1120160051297, Office Action dated Jul. 15, 2020”, with English translation, 6 pages. |
“Brazilian Application Serial No. 1120160051297, Response filed Oct. 16, 2020 to Office Action dated Jul. 15, 2020”, with English claims, 24 pages. |
“Brazilian Application Serial No. 1120160051319, Office Action dated Feb. 27, 2020”, with English translation, 6 pages. |
“Brazilian Application Serial No. 1120160051319, Response filed Jun. 8, 2020 to Office Action dated Feb. 27, 2020”, with English claims, 53 pages. |
“Brazilian Application Serial No. BR1120130055251, Voluntary Amendment filed Sep. 8, 2014”, with English claims, 9 pages. |
“Canadian Application Serial No. 2,810,521, Examiner's Rule 30(2) Requisition dated Jan. 4, 2019”, 4 pgs. |
“Canadian Application Serial No. 2,810,521, Office Action dated Mar. 1, 2018”, 5 pgs. |
“Canadian Application Serial No. 2,810,521, Office Action dated Jun. 8, 2017”, 3 pgs. |
“Canadian Application Serial No. 2,810,521, Response filed Apr. 4, 2019 to Examiner's Rule 30(2) Requisition dated Jan. 4, 2019”, 9 pgs. |
“Canadian Application Serial No. 2,810,521, Response filed Jul. 30, 2018 to Office Action dated Mar. 1, 2018”, 17 pgs. |
“Canadian Application Serial No. 2,810,521, Response filed Sep. 7, 2017 to Office Action dated Jun. 8, 2017”, 15 pgs. |
“Canadian Application Serial No. 2,923,815, Office Action dated Oct. 16, 2020”, 4 pgs. |
“Canadian Application Serial No. 2,923,815, Response filed Feb. 10, 2021 to Office Action dated Oct. 16, 2020”, 40 pgs. |
“Canadian Application Serial No. 2,923,815, Voluntary Amendment Filed Oct. 8, 2021”, w/ English Claims, 39 pgs. |
“Canadian Application Serial No. 2,923,853, Office Action dated Jun. 25, 2021”, 5 pgs. |
“Canadian Application Serial No. 2,923,853, Office Action dated Oct. 14, 2020”, 4 pgs. |
“Canadian Application Serial No. 2,923,853, Office Action dated Dec. 23, 2021”, 4 pgs. |
“Canadian Application Serial No. 2,923,853, Response Filed Feb. 15, 2022 to Office Action dated Dec. 23, 2021”, 15 pgs. |
“Canadian Application Serial No. 2,923,853, Response filed Apr. 29, 2021 to Office Action dated Oct. 14, 2020”, 3 pgs. |
“Canadian Application Serial No. 2,923,853, Response filed Oct. 19, 2021 to Office Action dated Jun. 25, 2021”, 23 pgs. |
“Canadian Application Serial No. 2,923,853, Voluntary Amendment filed Sep. 10, 2020”, 29 pgs. |
“Chinese Application Serial No. 201480056859.2, Decision of Rejection dated Jun. 11, 2019”, w/ English translation, 13 pgs. |
“Chinese Application Serial No. 201480056859.2, Decision on Reexamination dated Dec. 22, 2020”, with English translation, 27 pgs. |
“Chinese Application Serial No. 201480056859.2, Notice of Reexamination dated Jul. 8, 2020”, with English translation, 16 pages. |
“Chinese Application Serial No. 201480056859.2, Office Action dated Jan. 23, 2019”, (w/English Translation), 12 pages. |
“Chinese Application Serial No. 201480056859.2, Office Action dated Jul. 31, 2018”, W/ English Translation, 17 pgs. |
“Chinese Application Serial No. 201480056859.2, Office Action dated Sep. 21, 2020”, w/ English translation, 17 pgs. |
“Chinese Application Serial No. 201480056859.2, Response filed Sep. 18, 2019 to Decision of Rejection dated Jun. 11, 2019”, w/English Translation of Claims, 10 pgs. |
“Chinese Application Serial No. 201480056859.2, Response filed Apr. 3, 2019 to Office Action dated Jan. 23, 2019”, w/ English Translation, 2 pgs. |
“Chinese Application Serial No. 201480056859.2, Response filed Aug. 21, 2020 to Notice of Reexamination dated Jul. 8, 2020”, w/English Claims, 11 pgs. |
“Chinese Application Serial No. 201480056859.2, Response filed Oct. 28, 2020 to Notice of Reexamination dated Sep. 21, 2020”, with English translation of claims, 13 pgs. |
“Chinese Application Serial No. 201480056859.2, Response Filed Dec. 17, 2018 to Office Action dated Jul. 31, 2018”, with English claims, 10 pages. |
“Chinese Application Serial No. 201480056877.0, Office Action dated Feb. 1, 2019”, (w/English Translation), 16 pages. |
“Chinese Application Serial No. 201480056877.0, Office Action dated Aug. 30, 2019”, W/English Translation, 18 pgs. |
“Chinese Application Serial No. 201480056877.0, Response filed Jan. 13, 2020 to Office Action dated Aug. 30, 2019”, w/English Claims. 15 pgs. |
“Chinese Application Serial No. 201480056877.0, Response filed Jun. 17, 2019 to Office Action dated Feb. 1, 2019”, 13 pgs. |
“Chinese Application Serial No. 202110295575.0, Voluntary Amendment Filed Sep. 7, 2021”, w/English Claims, 9 pgs. |
“European Application Serial No. 11824078.7, Communication Pursuant to Article 94(3) EPC dated May 7, 2018”, 3 pgs. |
“European Application Serial No. 11824078.7, Communication Pursuant to Article 94(3) EPC dated Aug. 16, 2018”, 7 pgs. |
“European Application Serial No. 11824078.7, Extended European Search Report dated Aug. 19, 2016”, 10 pgs. |
“European Application Serial No. 11824078.7, Response filed Feb. 26, 2019 to Communication Pursuant to Article 94(3) EPC dated Aug. 16, 2018”, 13 pgs. |
“European Application Serial No. 11824078.7, Response filed Mar. 3, 2017 to Extended European Search Report dated Aug. 19, 2016”, 4 pgs. |
“European Application Serial No. 11824078.7, Response filed May 29, 2018 to Communication Pursuant to Article 94(3) EPC dated May 7, 2018”, 1 pg. |
“European Application Serial No. 11824078.7, Response filed Nov. 7, 2019 to Summons to Attend Oral Proceedings dated Jul. 23, 2019”, 29 pgs. |
“European Application Serial No. 11824078.7, Summons to Attend Oral Proceedings dated Jul. 23, 2019”, 9 pgs. |
“European Application Serial No. 14843569.6, Communication Pursuant to Article 94(3) EPC dated May 14, 2019”, 8 pgs. |
“European Application Serial No. 14843569.6, Extended European Search Report dated Mar. 6, 2017”, 10 pgs. |
“European Application Serial No. 14843569.6, Response filed Sep. 20, 2017 to Extended European Search Report dated Mar. 6, 2017”, 12 pgs. |
“European Application Serial No. 14843569.6, Response filed Oct. 26, 2016 to Communication pursuant to Rules 161(2) and 162 EPC dated Apr. 22, 2016”, 9 pgs. |
“European Application Serial No. 14843569.6, Response Filed Nov. 25, 2019 to Communication Pursuant to Article 94(3) EPC dated May 14, 2019”, 12 pgs. |
“European Application Serial No. 14843569.6, Summons to Attend Oral Proceedings dated Mar. 20, 2020”, 9 pgs. |
“European Application Serial No. 14844441.7, Communication Pursuant to Article 94(3) EPC dated Aug. 27, 2018”, 7 pgs. |
“European Application Serial No. 14844441.7, Decision to Refuse dated Dec. 10, 2019”, 12 pgs. |
“European Application Serial No. 14844441.7, Extended European Search Report dated Mar. 2, 2017”, 10 pgs. |
“European Application Serial No. 14844441.7, Response Filed Jan. 3, 2019 to Communication Pursuant to Article 94(3) EPC dated Aug. 27, 2018”, 35 pgs. |
“European Application Serial No. 14844441.7, Response filed Sep. 20, 2017 to Extended European Search Report dated Mar. 2, 2017”, 45 pgs. |
“European Application Serial No. 14844441.7, Response filed Oct. 21, 2019 to Summons to Attend Oral Proceedings dated Mar. 14, 2019”, 12 pgs. |
“European Application Serial No. 14844441.7, Response filed Oct. 26, 2016 to Communication pursuant to Rules 161(2) and 162 EPC dated Apr. 19, 2016”, 7 pgs. |
“European Application Serial No. 14844441.7, Summons to Attend Oral Proceedings dated Mar. 14, 2019”, 8 pgs. |
“Indian Application Serial No. 201627011112, First Examination Report dated Jun. 19, 2020”, 8 pgs. |
“Indian Application Serial No. 201627011112, Response filed Dec. 14, 2020 to First Examination Report dated Jun. 19, 2020”, 155 pgs. |
“Indian Application Serial No. 201627011113, First Examination Report dated Oct. 22, 2019”, 5 pgs. |
“Indian Application Serial No. 201627011113, Response filed Apr. 10, 2020 to First Examination Report dated Oct. 22, 2019”, 21 pgs. |
“International Application Serial No. PCT/US2011/50712, International Preliminary Report on Patentability dated Mar. 21, 2013”, 8 pgs. |
“International Application Serial No. PCT/US2011/50712, International Search Report dated Jan. 5, 2012”, 2 pgs. |
“International Application Serial No. PCT/US2011/50712, Written Opinion dated Jan. 5, 2012”, 6 pgs. |
“International Application Serial No. PCT/US2014/054701, International Preliminary Report on Patentability dated Mar. 24, 2016”, 8 pgs. |
“International Application Serial No. PCT/US2014/054701, International Search Report dated Jan. 12, 2015”, 2 pgs. |
“International Application Serial No. PCT/US2014/054701, Written Opinion dated Jan. 12, 2015”, 6 pgs. |
“International Application Serial No. PCT/US2014/054702, International Preliminary Report on Patentability dated Mar. 24, 2016”, 6 pgs. |
“International Application Serial No. PCT/US2014/054702, International Search Report dated Nov. 19, 2014”, 2 pgs. |
“International Application Serial No. PCT/US2014/054702, Written Opinion dated Nov. 19, 2014”, 4 pgs. |
“Japanese Application Serial No. 2016-542039, Office Action dated May 15, 2018”, With English Translation, 10 pgs. |
“Japanese Application Serial No. 2016-542040, Office Action dated May 22, 2018”, With English Translation, 7 pgs. |
“Japanese Application Serial No. 2018-153570, Notification of Reasons for Rejection dated Aug. 6, 2019”, w/ English Translation, 6 pgs. |
“Japanese Application Serial No. 2018-153570, Response filed Nov. 6, 2019 to Notification of Reasons for Rejection dated Aug. 6, 2019”, with English claims, 16 pgs. |
“Japanese Application Serial No. 2018-193563, Notification of Reasons for Refusal dated Jun. 25, 2019”, w/ English translation, 11 pgs. |
“Japanese Application Serial No. 2018-193563, Response filed Sep. 25, 2019 to Notification of Reasons for Refusal dated Jun. 25, 2019”, w/English Translation of Claims, 27 pgs. |
“Japanese Application Serial No. 2020-044427, Examiners Decision of Final Refusal dated Jun. 8, 2021”, with English translation, 6 pages. |
“Japanese Application Serial No. 2020-044427, Notification of Reasons for Refusal dated Mar. 2, 2021”, with English translation, 8 pages. |
“Japanese Application Serial No. 2020-044427, Response filed May 17, 2021 to Notification of Reasons for Refusal dated Mar. 2, 2021”, with English claims, 18 pages. |
“Korean Application Serial No. 10-2016-7009259, Notice of Preliminary Rejection dated Apr. 11, 2018”, w/ English translation, 14 pgs. |
“Korean Application Serial No. 10-2016-7009259, Notice of Preliminary Rejection dated Oct. 2, 2018”, w/ English Translation, 5 pgs. |
“Korean Application Serial No. 10-2016-7009259, Response Filed Dec. 12, 2018 to Notice of Preliminary Rejection dated Oct. 2, 2018”, (w/English Claims), 17 pages. |
“Korean Application Serial No. 10-2016-7009259, Response filed Jun. 11, 2018 to Notice of Preliminary Rejection dated Apr. 11, 2018”, With English Claims, 28 pgs. |
“Korean Application Serial No. 10-2016-7009260, Notice of Preliminary Rejection dated Jun. 29, 2020”, with English translation, 10 pages. |
“Korean Application Serial No. 10-2016-7009260, Response filed Aug. 31, 2020 to Notice of Preliminary Rejection dated Jun. 29, 2020”, w/English Claims, 25 pgs. |
“Mexican Application Serial No. MX/a/2016/003114, Office Action dated Nov. 16, 2017”, with English translation, 4 pages. |
“Mexican Application Serial No. MX/a/2016/003114, Response filed Feb. 6, 2018 to Office Action dated Nov. 16, 2017”, with English translation, 18 pages. |
“Mexican Application Serial No. MX/a/2016/003115, Office Action dated Nov. 7, 2017”, with English translation, 6 pages. |
“Mexican Application Serial No. MX/a/2016/003115, Response filed Feb. 19, 2018 to Office Action dated Nov. 7, 2017”, with English translation, 14 pages. |
“Mexican Application Serial No. MX/a/2018/006222, Office Action dated Jun. 3, 2021”, with English translation, 3 pages. |
“Mexican Application Serial No. MX/a/2018/006222, Office Action dated Sep. 14, 2021”, with machine English translation, 3 pgs. |
“Mexican Application Serial No. MX/a/2018/006222, Response filed Aug. 19, 2021 to Office Action dated Jun. 3, 2021”, with machine English translation, 16 pages. |
“Mexican Application Serial No. MX/a/2018/006222, Response filed Nov. 11, 2021 to Office Action dated Sep. 14, 2021”, with English claims, 11 pages. |
“Singapore Application Serial No. 11201602200U, Written Opinion dated Aug. 13, 2018”, 4 pgs. |
“Singapore Application Serial No. 10201807442R, Search Report and Written Opinion dated Jul. 15, 2022”, 6 pgs. |
“Singapore Application Serial No. 10201807442R, Search Report dated May 26, 2020”, 4 pgs. |
“Singapore Application Serial No. 11201602200U, Response filed Nov. 7, 2018 to Written Opinion dated Aug. 16, 2018”, with English translation of claims, 12 pgs. |
“Brazilian Application Serial No. 1120160051319, Response filed Nov. 23, 2022 to Office Action dated Aug. 23, 2022”, with machine translation, 15 pgs. |
“Chinese Application Serial No. 202110295575.0, Office Action dated Oct. 10, 2022”, w/ English translation, 10 pgs. |
“Singapore Application Serial No. 10201807442R, Response filed Nov. 30, 2022 to Search Report and Written Opinion dated Jul. 15, 2022”, 20 pgs. |
U.S. Appl. No. 12/877,993, filed Sep. 8, 2010, System and Method for Displaying Information Related to Video Programs in a Graphical User Interface. |
U.S. Appl. No. 14/260,677, U.S. Pat. No. 9,883,250, filed Apr. 24, 2014, System and Method of Displaying Content and Related Social Media Data.
U.S. Appl. No. 15/882,472, filed Jan. 29, 2018, System and Method of Displaying Content and Related Social Media Data.
U.S. Appl. No. 14/242,459, filed Apr. 1, 2014, Systems and Methods of Displaying Content.
U.S. Appl. No. 14/336,758, U.S. Pat. No. 10,080,060, filed Jul. 21, 2014, Systems and Methods of Displaying Content.
U.S. Appl. No. 15/841,904, U.S. Pat. No. 10,129,600, filed Dec. 14, 2017, Systems and Methods of Displaying Content.
U.S. Appl. No. 16/745,570, U.S. Pat. No. 10,992,995, filed Jan. 17, 2020, Systems and Methods of Displaying Content. |
U.S. Appl. No. 16/148,843, U.S. Pat. No. 10,595,094, filed Oct. 1, 2018, Systems and Methods of Displaying Content. |
U.S. Appl. No. 17/249,876, U.S. Pat. No. 11,363,342, filed Mar. 17, 2021, Systems and Methods of Displaying Content. |
“U.S. Appl. No. 15/882,472, Final Office Action dated Sep. 16, 2022”, 18 pgs. |
“U.S. Appl. No. 15/882,472, Response filed Aug. 18, 2022 to Non Final Office Action dated May 26, 2022”, 20 pgs. |
“Brazilian Application Serial No. 1120160051319, Office Action dated Aug. 23, 2022”, with machine translation, 8 pgs. |
“Brazilian Application Serial No. 1120160051319, Opinion for non-patentability (RPI 7.1) dated Jan. 3, 2023”, with English translation, 11 pgs. |
“Brazilian Application Serial No. 1120160051297, Office Action dated Jan. 9, 2023”, with English translation, 16 pgs. |
“Chinese Application Serial No. 202110295575.0, Response filed Feb. 16, 2023 to Office Action dated Oct. 10, 2022”, with English claims, 12 pgs. |
“Chinese Application Serial No. 202010338682.2, Office Action dated Mar. 20, 2023”, with English translation, 23 pgs. |
“Brazilian Application Serial No. 1120160051319, Response filed Apr. 3, 2023 to Opinion for non-patentability (RPI 7.1) dated Jan. 3, 2023”, with machine translation, 9 pgs. |
“Brazilian Application Serial No. 1120160051319, Final Office Action dated Jun. 6, 2023”, with machine translation, 9 pgs. |
“Chinese Application Serial No. 202110295575.0, Office Action dated May 27, 2023”, with English translation, 15 pgs. |
Number | Date | Country | |
---|---|---|---|
20220329914 A1 | Oct 2022 | US |
Number | Date | Country | |
---|---|---|---|
61876199 | Sep 2013 | US | |
61876188 | Sep 2013 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17249876 | Mar 2021 | US |
Child | 17662808 | US | |
Parent | 16745570 | Jan 2020 | US |
Child | 17249876 | US | |
Parent | 16148843 | Oct 2018 | US |
Child | 16745570 | US | |
Parent | 15841904 | Dec 2017 | US |
Child | 16148843 | US | |
Parent | 14336758 | Jul 2014 | US |
Child | 15841904 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 14242459 | Apr 2014 | US |
Child | 14336758 | US |