Input devices may be used to interface with various types of electronic devices, such as those of an entertainment system. In particular, such input devices may allow a user to interface with the entertainment system wirelessly. Traditionally, input devices (e.g., remote controls) may be configured for one-way communication, so as to transmit commands to the entertainment system. As such, it may be difficult to enhance the user experience associated with the entertainment system by expanding that experience to incorporate the input device.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
According to one aspect of this disclosure, a method of dynamically changing a user interface of a companion device configured to remotely control an entertainment system is provided. The method includes establishing two-way communication with the entertainment system, and registering one or more trigger events with the entertainment system. The method further includes, upon occurrence of a trigger event, receiving a notification of the trigger event from the entertainment system, and dynamically changing the user interface of the companion device responsive to the notification.
An input device may utilize communication protocols and/or networking protocols such as Internet protocols, infrared protocols, radio-frequency protocols, Universal Plug and Play (UPnP), etc. to interface with an entertainment system. However, the user experience is typically fairly restricted in such an environment. Traditionally, input devices support one-way communication to the entertainment system but may not be configured to receive content from the entertainment system via a back channel. As such, the entertainment system is unable to provide content and/or information to the input device. The present disclosure is directed to an input device configured to provide an expanded user experience by displaying companion content (e.g., themed user interface, related programming, images, advertisements, etc.) related to content being provided by an entertainment system.
As a nonlimiting example, an entertainment system 22 may include a display device 24 and a content device 26 configured to receive content signals.
The content signals received by entertainment system 22 may be provided by any suitable content source, such as a cable television provider, a satellite television provider, an Internet protocol television (IPTV) provider, a media-disc player, a digital video recorder, data stored on mass storage, etc. In the depicted example, entertainment system 22 is displaying content 28 of a basketball game.
A user 30 may interface with entertainment system 22 via a companion device 32, such as the user's mobile communication device. Thus, companion device 32 may be configured as an input device for entertainment system 22 as well as being configured as a computing and/or communication device. For example, companion device 32 may be configured to communicate via two-way radio telecommunications over a cellular network. Companion device 32 may additionally or alternatively be configured to communicate via other technologies such as via the Internet, Bluetooth, infrared, radio-frequency, etc. Further, companion device 32 may additionally be configured to send and/or receive text communications (e.g., SMS messages, email, etc.). As depicted, companion device 32 includes a display 34 for displaying content. Such content may be received from any suitable source, such as local mass storage at companion device 32, entertainment system 22, a service 36 via a network 38, etc.
Companion device 32 may register itself with entertainment system 22 so as to receive notifications when desired events occur, such as channel changes, advertisements, etc. In some embodiments, the companion device 32 may register itself by sending a registration message to the entertainment system. In some embodiments, the companion device 32 may register itself by sending a registration message to service 36.
A registration message may have virtually any suitable format without departing from the scope of this disclosure. In some embodiments, the registration message may be formatted with an extensible markup language. In some embodiments, an application programming interface may be established for registration communications and/or event notifications.
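As a minimal sketch of the extensible-markup-language option, a registration message might be assembled as follows. The element names, the `channel-change` category, and the callback address are illustrative assumptions; the disclosure does not define a specific schema.

```python
import xml.etree.ElementTree as ET

def build_registration_message(device_id, event_category, callback_address):
    """Build an illustrative XML registration message.

    The element and attribute names here are assumptions for
    illustration only; any suitable format may be used.
    """
    root = ET.Element("registration")
    ET.SubElement(root, "device").text = device_id
    # The category of trigger event to be reported (e.g., all channel changes).
    ET.SubElement(root, "trigger-event").set("category", event_category)
    # The address to which notifications of that category are to be sent.
    ET.SubElement(root, "notify-address").text = callback_address
    return ET.tostring(root, encoding="unicode")

message = build_registration_message(
    "companion-32", "channel-change", "192.168.1.50:8080")
```

The same message could equally be sent to a service such as service 36 rather than directly to the entertainment system.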
Event notifications that are sent responsive to registered trigger events may be sent from entertainment system 22 and/or service 36 via network 38. Further, upon receiving an event notification, companion device 32 may dynamically change the user interface 40 displayed on display 34 of companion device 32. In the depicted example, companion device 32 is displaying a basketball-themed user interface which corresponds to the basketball game of content 28 provided by entertainment system 22.
In some embodiments, companion device 32 may be configured to determine the content being displayed at display device 24 by querying (e.g., “polling”) content device 26. The companion device 32 may poll the content device at virtually any fixed or variable interval using any suitable approach.
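A variable-interval poll could be sketched as follows. The callables, interval values, and backoff factor are illustrative assumptions standing in for the companion device's actual query mechanism.

```python
import time

def poll_content(query_current_content, on_change, interval=2.0,
                 max_interval=30.0, backoff=1.5, max_polls=None):
    """Poll the content device for the current content, widening the
    interval while content is unchanged and resetting it on a change.

    `query_current_content` and `on_change` are hypothetical callables
    supplied by the companion device; the interval and backoff values
    are illustrative, and a fixed interval would work equally well.
    """
    last = None
    wait = interval
    polls = 0
    while max_polls is None or polls < max_polls:
        current = query_current_content()
        if current != last:
            on_change(current)          # content changed; report it
            last = current
            wait = interval             # reset the interval on a change
        else:
            wait = min(wait * backoff, max_interval)  # back off while idle
        polls += 1
        time.sleep(wait)

# Demonstration with canned responses standing in for content device 26.
observed = []
responses = iter(["basketball game", "basketball game", "movie"])
poll_content(lambda: next(responses), observed.append,
             interval=0.0, max_interval=0.0, max_polls=3)
```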
It should be appreciated that the above-described examples are nonlimiting.
A companion device such as example companion device 32 configured to remotely control an entertainment system such as entertainment system 22 may be configured to dynamically change its user interface responsive to event notifications in any suitable manner. As an example, method 50, described below, illustrates one such approach, beginning with the companion device establishing two-way communication with the entertainment system.
Upon establishing two-way communication, method 50 proceeds to 56 where the companion device registers one or more trigger events with the entertainment system. A trigger event may include, for example, an event occurring at the entertainment system, such as a changing of content being provided by the entertainment system. As a nonlimiting example, content may change in response to a user's request, such as a channel change. As another nonlimiting example, content may automatically change at a program boundary transition when current programming ends and subsequent programming commences.
Such registration may be done in any suitable manner. For example, the companion device may send a registration message from the companion device to the entertainment system, as indicated at 58. Such a registration message may define a category of trigger event that is to be reported (e.g., all channel changes). In some embodiments, the registration message may further define an address to which the notification of the trigger event is to be reported upon occurrence of that category of trigger event.
In the case that the companion device sends a registration message, the entertainment system may then receive the message, as indicated at 60, and register the companion device as indicated at 62.
Next, upon registering the companion device, the entertainment system may determine that a trigger event has occurred, as indicated at 64. The entertainment system may determine a trigger event has occurred in any suitable manner. For example, in some embodiments, the entertainment system may be configured to locally detect such events, as indicated at 66. However, in some embodiments, a service may determine a trigger event has occurred, as indicated at 68, and may then send a message to the entertainment system to notify the entertainment system of the trigger event, as indicated at 70. In response to the trigger event, the entertainment system may then send a notification of the trigger event to the companion device, as indicated at 72.
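The dispatch step at 72 can be sketched as follows on the entertainment-system side. The registration table layout and the `send` transport callable are illustrative assumptions; the trigger event could equally originate locally (at 66) or from the service (at 68).

```python
def notify_registered_devices(registrations, trigger_event, send):
    """Send a notification of `trigger_event` to every companion
    device registered for that event's category.

    `registrations` maps an event category (e.g. "channel-change") to
    the notification addresses registered for it; `send` is a
    hypothetical transport callable over any suitable protocol.
    """
    for address in registrations.get(trigger_event["category"], []):
        send(address, trigger_event)

# Demonstration: one registered companion device, one channel change.
sent = []
registrations = {"channel-change": ["192.168.1.50:8080"]}
event = {"category": "channel-change", "content": "basketball game"}
notify_registered_devices(registrations, event,
                          lambda addr, ev: sent.append((addr, ev)))
```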
Thus, method 50 proceeds to 74, wherein upon occurrence of a trigger event, the companion device receives a notification of the trigger event from the entertainment system. In some embodiments, this may include receiving user interface elements responsive to the notification, as indicated at 76. Such user interface elements may be sent by the entertainment system and/or service, as indicated at 78 and 80 respectively.
Next, method 50 proceeds to 82, wherein the companion device dynamically changes the user interface of the companion device responsive to the notification. In the case that the companion device received user interface elements from the entertainment system and/or service, the user interface may be dynamically changed to include those user interface elements, as indicated at 84. In some embodiments, the entertainment system may be configured to provide content, for example to a display device, and the user interface elements sent to the companion device may be associated with the content. As a nonlimiting example, dynamically changing the user interface of the companion device may include updating a theme of the user interface based on content being provided by the entertainment system. As another nonlimiting example, dynamically changing the user interface of the companion device may include visually presenting an advertisement.
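The change at 82 and 84 might be handled on the companion device as sketched below. The notification fields, theme names, and element names are assumptions for illustration.

```python
def update_user_interface(ui_state, notification):
    """Dynamically change the companion-device user interface in
    response to a trigger-event notification.

    The `content_theme` and `ui_elements` notification fields are
    illustrative assumptions, not a defined message format.
    """
    new_state = dict(ui_state)
    # Update the theme to match the content now being provided.
    if "content_theme" in notification:
        new_state["theme"] = notification["content_theme"]
    # Include any user interface elements sent with the notification,
    # as indicated at 84.
    if "ui_elements" in notification:
        new_state["elements"] = (ui_state.get("elements", [])
                                 + notification["ui_elements"])
    return new_state

# Demonstration: a channel change to a basketball game re-themes the
# UI and adds content-related elements, including an advertisement.
ui = {"theme": "default", "elements": ["channel-up", "channel-down"]}
ui = update_user_interface(ui, {
    "content_theme": "basketball",
    "ui_elements": ["player-stats", "shoe-advertisement"],
})
```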
It should be appreciated that the user interface of the companion device may dynamically change in any suitable manner. For example, the user interface may dynamically change to have a different aesthetic to match the content on the entertainment system, but may retain the same virtual buttons and controls, thus remaining functionally equivalent to a previously displayed user interface.
As another example, the user interface may dynamically change to have different virtual buttons and/or controls to functionally augment the content on the entertainment system. For example, if the content on the entertainment system is a basketball game, then the user interface may include virtual buttons for changing the channel to other sporting events that are currently playing. Further, if the content on the entertainment system changes to a movie, then the user interface may dynamically change to present controls for selecting subtitles, viewing environments, surround-sound preferences, etc.
As yet another example, the user interface may dynamically change to include content that supplements the main content on the entertainment system. For example, if content on the entertainment system has changed to a basketball game, then the user interface of the companion device may dynamically change to display a player's statistics for the basketball game, an upcoming game schedule, etc.
As yet another example, the user interface may dynamically change to include advertisements targeted to content on the entertainment system. For example, in the above-described case where the content is a basketball game, the user interface of the companion device may dynamically change to display an advertisement for basketball shoes, tickets to a next game, etc. As another example, for the case where the content is a movie, the user interface of the companion device may dynamically change to display an advertisement for action figures of the movie, restaurant promotions related to the movie, etc.
It should be appreciated that in some embodiments the companion device may be configured to dynamically change its user interface responsive to events other than a notification. For example, an Internet Protocol (IP)-based companion device may be configured to query the entertainment system for the current content using the Transmission Control Protocol (TCP). As another example, the companion device may query a service to determine the content being provided by the entertainment system.
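A TCP query of this kind could be sketched as follows. The single-line request/response protocol is an illustrative assumption; the disclosure specifies only that TCP may be used. A small in-process server stands in for the entertainment system so the example is self-contained.

```python
import socket
import threading

def query_current_content(host, port):
    """Query the entertainment system over TCP for the current content.

    The "GET-CURRENT-CONTENT" request line is a hypothetical protocol
    invented for this sketch.
    """
    with socket.create_connection((host, port)) as sock:
        sock.sendall(b"GET-CURRENT-CONTENT\n")
        return sock.recv(1024).decode().strip()

# Minimal stand-in for the entertainment system, for demonstration only.
def _serve_once(server):
    conn, _ = server.accept()
    with conn:
        if conn.recv(1024).startswith(b"GET-CURRENT-CONTENT"):
            conn.sendall(b"basketball game\n")

server = socket.create_server(("127.0.0.1", 0))
threading.Thread(target=_serve_once, args=(server,), daemon=True).start()
content = query_current_content("127.0.0.1", server.getsockname()[1])
server.close()
```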
It should be further appreciated that the user interface may dynamically change responsive to any suitable trigger events, such as program boundaries, arbitrary time boundaries, channel boundaries, randomly, and/or any other suitable criteria related to the content being presented by the entertainment system.
It should be further appreciated that the entertainment system may be further configured to support communication of a content identifier and/or metadata. The metadata may be exposed, for example, through an external server and/or directly to the companion device. The companion device may then be configured to detect the identity of the content being presented, and select user interface elements (e.g., advertising) based on that content, and dynamically change its user interface to include those user interface elements.
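Selecting user interface elements from a content identifier might reduce to a lookup such as the following; the identifiers and element names are assumptions for the sketch, and in practice the mapping could live on an external server.

```python
# Illustrative mapping from a content identifier to user interface
# elements (e.g., advertising); the keys and values are hypothetical.
UI_ELEMENTS_BY_CONTENT = {
    "basketball-game": ["player-stats", "game-schedule",
                        "shoe-advertisement"],
    "movie": ["subtitle-controls", "surround-sound-preferences"],
}

def select_ui_elements(content_id, catalog=None):
    """Select user interface elements based on the identity of the
    content being presented; unknown content yields no elements."""
    catalog = UI_ELEMENTS_BY_CONTENT if catalog is None else catalog
    return catalog.get(content_id, [])

elements = select_ui_elements("basketball-game")
```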
In some embodiments, the above described methods and processes may be tied to a computing system including one or more computers. In particular, the methods and processes described herein may be implemented as a computer application, computer service, computer API, computer library, and/or other computer program product.
Computing system 90 includes a logic subsystem 92 and a data-holding subsystem 94. Computing system 90 may optionally include a display subsystem 96, communication subsystem 98, and/or other components not shown.
Logic subsystem 92 may include one or more physical devices configured to execute one or more instructions. For example, the logic subsystem may be configured to execute one or more instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more devices, or otherwise arrive at a desired result.
The logic subsystem may include one or more processors that are configured to execute software instructions, such as instructions for dynamically changing a user interface of the companion device. Additionally or alternatively, the logic subsystem may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. Processors of the logic subsystem may be single core or multicore, and the programs executed thereon may be configured for parallel or distributed processing. The logic subsystem may optionally include individual components that are distributed throughout two or more devices, which may be remotely located and/or configured for coordinated processing. One or more aspects of the logic subsystem may be virtualized and executed by remotely accessible networked computing devices configured in a cloud computing configuration.
Data-holding subsystem 94 may include one or more physical, non-transitory, devices configured to hold data and/or instructions executable by the logic subsystem to implement the herein described methods and processes. When such methods and processes are implemented, the state of data-holding subsystem 94 may be transformed (e.g., to hold different data).
Data-holding subsystem 94 may include removable media and/or built-in devices. Data-holding subsystem 94 may include optical memory devices (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory devices (e.g., RAM, EPROM, EEPROM, etc.) and/or magnetic memory devices (e.g., hard disk drive, floppy disk drive, tape drive, MRAM, etc.), among others. Data-holding subsystem 94 may include devices with one or more of the following characteristics: volatile, nonvolatile, dynamic, static, read/write, read-only, random access, sequential access, location addressable, file addressable, and content addressable. In some embodiments, logic subsystem 92 and data-holding subsystem 94 may be integrated into one or more common devices, such as an application specific integrated circuit or a system on a chip.
The terms “module,” “program,” and “engine” may be used to describe an aspect of computing system 90 that is implemented to perform one or more particular functions. In some cases, such a module, program, or engine may be instantiated via logic subsystem 92 executing instructions held by data-holding subsystem 94. It is to be understood that different modules, programs, and/or engines may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same module, program, and/or engine may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The terms “module,” “program,” and “engine” are meant to encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.
It is to be appreciated that a “service”, as used herein, may be an application program executable across multiple user sessions and available to one or more system components, programs, and/or other services. In some implementations, a service may run on a server responsive to a request from a client.
When included, display subsystem 96 may be used to present a visual representation of data held by data-holding subsystem 94. As the herein described methods and processes change the data held by the data-holding subsystem, and thus transform the state of the data-holding subsystem, the state of display subsystem 96 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 96 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic subsystem 92 and/or data-holding subsystem 94 in a shared enclosure, or such display devices may be peripheral display devices.
When included, communication subsystem 98 may be configured to communicatively couple computing system 90 with one or more other computing devices. Communication subsystem 98 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As nonlimiting examples, the communication subsystem may be configured for communication via a wireless telephone network, a wireless local area network, a wired local area network, a wireless wide area network, a wired wide area network, etc. In some embodiments, the communication subsystem may allow computing system 90 to send and/or receive messages to and/or from other devices via a network such as the Internet.
It is to be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated may be performed in the sequence illustrated, in other sequences, in parallel, or in some cases omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.