Viewing of media programs (e.g., television programs, movies, streaming video, and the like) has become increasingly popular as movie theater-like televisions, screens, and sound systems have become more affordable for mainstream consumers. However, there remains an ever-present need to improve the viewing experience and immersion level for viewers.
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosure. The summary is not an extensive overview of the disclosure. It is neither intended to identify key or critical elements of the disclosure nor to delineate the scope of the disclosure. The following summary merely presents some concepts of the disclosure in a simplified form as a prelude to the description below.
Aspects of this disclosure relate to systems and methods that dynamically control one or more objects, such as devices disposed near a media consumption apparatus. The dynamic control may be implemented in synchronicity with a media program being viewed on the media consumption apparatus. According to one aspect, lighting in a video viewing environment (e.g., a retail, commercial, or domestic environment) may be dynamically altered to enhance a viewing experience while watching a media program such as a television show, advertisement or informational production, on-line video game, streaming video, movie, or the like. According to another aspect, a consumer electronic device, such as an animatronic stuffed animal, may be controlled to perform some action or play back a particular audio file at a time specified by instructions contained in a media transmission, e.g., a content stream. Such instructions may be wirelessly transmitted to the one or more objects after being processed, e.g., when associated content is being presented, by the media consumption device or another device.
According to a first aspect, a method includes receiving a media transmission over a network connection, where the media transmission comprises a media program as an audio component, a video component, and one or more (e.g., a series of) dynamic media packets. The method may include outputting the video component to a video display device, outputting the audio component to one or more speakers associated with the video display device, and outputting instructions associated with the dynamic media packets to a dynamic device associated with each dynamic media packet. The dynamic media packet may define an action to be performed by the dynamic device at a predefined time associated with the media program.
Another aspect provides a device comprising a processor, a network input/output port, a video output port, an audio output port, a wireless transceiver, and memory storing instructions that, when executed by the processor, configure the device to perform various actions, such as receiving a media stream over the network input/output port, where the media stream defines a media program as an audio component, a video component, and dynamic media packets. The device outputs the video component via the video output port, outputs the audio component via the audio output port, and outputs the instructions associated with the dynamic media packets via the wireless transceiver to a dynamic device associated with each dynamic media packet. The dynamic media packet defines an action to be performed by the dynamic device at a predefined time within the media program.
According to yet another aspect, a computing device (e.g., a server device) includes a processor and memory storing instructions that, when executed by the processor, configure the server device to perform various actions, such as receiving a request from a first user device for a media program, and determining a first set of one or more dynamic devices known to be associated with the first user device. The server device determines a first version of the media program, from a plurality of versions of the media program, based on the first set of one or more dynamic devices, and transmits the first version of the media program to the first user device.
These and other aspects will be readily apparent upon reviewing the detailed description below.
The present disclosure is illustrated by way of example, and not limited, in the accompanying figures, in which like reference numerals indicate similar elements and in which:
In the following description of various illustrative embodiments, reference is made to the accompanying drawings, which form a part hereof, and in which is shown, by way of illustration, various embodiments in which aspects of the disclosure may be practiced. It is to be understood that other embodiments may be utilized, and structural and functional modifications may be made, without departing from the scope of the present disclosure.
Illustrative aspects described herein provide methods and systems for providing an immersive user experience while viewing a media program, based on the content of the media program (e.g., video, television, movie, interactive web site, etc.), by interacting with one or more objects, such as devices, external to the media program viewing device (e.g., a television) in use. The external device(s) may include dynamic lights, toys, sound generators, wind generators, olfactory emitters, lasers, smoke machines, heat sources, rain generators, microphones, or any other objects that can communicate with a controller as described herein. Such objects may be referred to herein as dynamic devices.
First illustrative aspects provide methods and systems for dynamically altering lighting in a room when a media program is playing, based on the content of the media program. Stated differently, some aspects described herein define how to alter ambient lighting based on the content of a television show, movie, or other video program. For example, during a sunrise scene in a video program, ambient lighting might get stronger to enhance the viewer's sensory perception of the sun rising; during a sunset, the ambient lighting might be reduced to enhance the viewer's sensory perception of the sun going down; and during a scene in which a police car is shown with flashing lights, ambient lighting might increase and decrease in alternating cycles between left and right portions of the room to enhance the viewer's sensory perception of a police car with flashing lights. A large number of embodiments exist based on the content being shown in a media program. Aspects described herein provide methods and systems for defining lighting schemes, associating lighting schemes with a video program, communicating the lighting information to a viewer's terminal equipment, and controlling lighting within a room based on the received lighting information.
There may be one or more communication channels 101 originating from the central location 103, and the communication channels may traverse one or more different paths (e.g., lines, routers, nodes, hubs) to distribute the signal to various premises 102 which may be, for example, many miles distant from the central location 103. The communication channels 101 may include components not illustrated, such as splitters, filters, amplifiers, etc. Portions of the communication channels 101 may also be implemented with fiber-optic cable, while other portions may be implemented with coaxial cable, other lines, or wireless communication paths.
The central location 103 may include an interface 104 (such as a termination system (TS), router, modem, cable modem termination system, fiber termination system, etc.), which may include one or more processors configured to manage communications between devices on the communication channels 101 and/or backend devices such as servers 105-107 (to be discussed further below). Interface 104 may be as specified in a suitable communication standard, such as the Data Over Cable Service Interface Specification (DOCSIS) standard published by Cable Television Laboratories, Inc. (a.k.a. CableLabs), 802.11, FDDI, or MPLS. Interface 104 may also use a custom standard, e.g., a similar or modified version of a standard interface. Interface 104 may be variously configured to use time division, frequency division, time/frequency division, wave division, etc. In one illustrative embodiment, the interface 104 may be configured to place data on one or more downstream frequencies to be received by modems at the various premises 102, and to receive upstream communications from those modems on one or more upstream frequencies. The central location 103 may also include one or more network interfaces 108, which can permit the central location 103 to communicate with various other external networks 109. These external networks 109 may include, for example, networks of Internet devices, telephone networks, cellular telephone networks (3G, 4G, etc.), fiber optic networks, local wireless networks (e.g., WiMAX), satellite networks, PSTN networks, internets, intranets, the Internet, and/or any other desired network. The interface 108 may include the corresponding circuitry needed to communicate on the external network 109, and/or to communicate with other devices on the external network.
As noted above, the central location 103 may include a variety of servers 105-107 that may be configured to perform various functions. For example, the central location 103 may include a push notification server 105. The push notification server 105 may generate push notifications to deliver data and/or commands to the various premises 102 in the network (or more specifically, to the devices in the premises 102 that are configured to detect such notifications, e.g., ambient lighting devices). The central location 103 may also include a content server 106. The content server 106 may be one or more processors/computing devices that are configured to provide content to users in the premises. This content may be, for example, video on demand movies, television programs, songs, text listings, etc. The content may include associated lighting instructions. The content server 106 may include software to validate user identities and entitlements, locate and retrieve requested content, encrypt the content, and initiate delivery (e.g., streaming) of the content to the requesting user and/or device. The content server 106 may also include segmented video where lighting instructions are inserted into the video and associated with particular segments of video.
The central location 103 may also include one or more application servers 107. An application server 107 may be a computing device configured to offer any desired service, and may run various languages and operating systems (e.g., servlets and JSP pages running on Tomcat/MySQL, OSX, BSD, Ubuntu, Redhat, HTML5, JavaScript, AJAX and COMET). For example, an application server may be responsible for collecting television program listings information and generating a data download for electronic program guide listings. The program guide may be variously configured. In one embodiment, the program guide will display an indication (e.g., an icon) indicating that a program is ambient lighting enabled. For example, the program guide may include an icon of a static or dynamically changing light bulb indicating that the particular program is ambient lighting enabled. Another application server may be responsible for monitoring user viewing habits and collecting that information for use in selecting advertisements. Additionally, the lighting instructions may be included in advertisements. In one illustrative embodiment, the room brightens markedly when an advertisement appears in the program. Another application server may be responsible for formatting and inserting advertisements into a video stream being transmitted to the premises 102. Another application server may be configured to operate ambient lighting devices manually via controls input by the user from a remote device such as a remote control, IPHONE, IPAD, tablet, laptop computer, and/or similar device. Still referring to
Device controller 211 (e.g., a lighting controller in this example) may dynamically control one or more dynamic devices 220 (e.g., including one or more light sources 300 such as a light fixture and/or the bulb therein), as further described herein, via one or more networks, e.g., wireless, wired (e.g., IP, DLNA, etc.), powerline, Wi-Fi, Bluetooth, infrared (IR), Z-Wave, and/or ZigBee-compliant networks (e.g., RF4CE or ZigBee Pro). Presently there exist approximately 1 billion incandescent light sources in residential premises in the US. Aspects of this disclosure make these light sources much more versatile, controllable, and adaptable to users. Device 220 is representative of a generic device controllable as described herein. Dynamic devices may include, but are not limited to, dynamic lights, toys, sound generators, wind generators, olfactory emitters, lasers, smoke machines, heat sources, rain generators, microphones, or any other device that can communicate with or be controlled by a device controller as described herein.
With reference to
Light source 300 may also include a housing 301 in which any number of LEDs may be included (e.g., four light emitting diode strands 303-309). Housing 301 may include a standard base so that the light source 300 can be screwed into any conventional lamp or fixture. The LEDs within the light source 300 may be variously configured. For example, LED 303 may be a red LED; LED 305 may be a blue LED; LED 307 may be a green LED; and LED 309 may be a high-intensity white LED. LEDs 303-309 may be connected to, for example, one or more processors 311 using any suitable means, such as control logic and/or control wires 313, 315, 317, 319, respectively. Processor 311 may be variously configured. In one illustrative embodiment, processor 311 is manufactured by Marvell Technology Group Ltd. of Bermuda and Santa Clara, Calif., and is configured to control the LED strands within the light source, e.g., turning up or down the intensity, or “volume”, of one or more of the LED strands.
In illustrative embodiments, the light source 300 may be configured to include a media access control (MAC) address. The MAC address may be registered with the computing device 200 and/or with devices located proximate to the central location 103. In illustrative embodiments, the processor 311 (or light source 300) is initially manufactured having a unique MAC address. The processor 311 may control the LEDs based on communication signals (e.g., lighting instructions) received via transceiver 321, when those communication signals are addressed to the MAC address associated with that light source. Transceiver 321 may be variously configured to include, for example, a Wi-Fi, Bluetooth, IEEE 802.15.4, or ZigBee-compliant transceiver. Light source 300 may further include one or more dip switches 323 to set various parameters associated with the light source 300, and may further include an input button 325 which may be used to place light source 300 in a designated mode, e.g., a pairing mode, as further described herein.
According to some embodiments, transceiver 321 may instead consist only of a receiver, and not include the ability to send data. According to other embodiments, light source 300 might include only 3 LEDs, omitting the high-intensity white LED. Light source 300 may be variously configured such that processor 311 and/or transceiver 321 may be mounted in the base of the housing 301. In illustrative embodiments, an application downloadable to a remote control device (e.g., an iPad or iPhone) may be utilized to set and/or control the light source, either alone and/or in conjunction with the lighting instructions. The remote control may override the lighting instructions and/or enable the lighting instructions. Further, the remote control may set parameters for the lighting instructions, such as minimum lighting levels.
With reference to
Each light source 300 may be controlled by its respective internal processor 311. Each processor, in turn, may control the LEDs in that light source based on instructions received via wireless transceiver 321. These instructions may be manual instructions from a remote control and/or lighting instructions as discussed above. According to one illustrative aspect, with reference to
With reference to
Lighting effects may be defined by creatively determining sequences of lighting primitives for each of a plurality of light channels. Each light channel may be associated with a particular location of a light source corresponding to that channel. For example, in one aspect, 6 light channels may be used: front right, front left, rear right, rear left, center front, and burst channels. Each of the left, right, and center channels may be associated with a single and/or multicolor bulb as described herein, whereas the burst channel may be associated with a single bright white light source that can be used to present bright light bursts (e.g., during explosions, search lights, etc.). In another aspect, 2 additional channels may be used as well: middle left, middle right, where each middle channel is located between its respective front and rear channels, and each associated with a multicolor bulb. In other aspects, different or additional channels may be used, e.g., floor channels, ceiling channels, dim channels, strobe channels, or other special purpose channels. Special purpose channels may be associated with a special purpose light source, e.g., burst channel, strobe channel, etc. For illustrative purposes only, the remainder of this description assumes that 6 channels are being used, as illustrated in Table 1 below, where channels 401-405 use a multicolor LED bulb, and burst channel 406 uses a single color high lumen white bulb.
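For illustration only, the six-channel layout of Table 1 might be represented in software as in the following Python sketch. The assignment of channel IDs 401-406 to particular positions, as well as the class and field names, are assumptions; the description above specifies only the channel names and that channels 401-405 use multicolor bulbs while channel 406 uses a high-lumen white bulb.

```python
# Minimal sketch of the 6-channel layout of Table 1. The mapping of
# IDs 401-406 to positions is assumed for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class LightChannel:
    channel_id: int
    position: str
    bulb_type: str   # "multicolor" or "burst" (single-color, high-lumen white)

TABLE_1 = [
    LightChannel(401, "front left", "multicolor"),
    LightChannel(402, "front right", "multicolor"),
    LightChannel(403, "rear left", "multicolor"),
    LightChannel(404, "rear right", "multicolor"),
    LightChannel(405, "center front", "multicolor"),
    LightChannel(406, "burst", "burst"),
]
```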
In addition, primitives may be defined for video games. For example, in car chase scenes in Grand Theft Auto, police lights may be shown as the police are closing in on the player's vehicle. Further, headlights may appear when another car is being passed. A video game's video sequences may also include lighting instructions as defined herein. These lighting instructions may appear in on-line versions of the games as well as local versions.
As shown in
In the sunrise effect example illustrated in
The remainder of the primitive examples, excepting the last primitive shown in
In still further examples, some effects may be defined to reference actions to be performed based on the previous effect. For example, Effect ID 2000 might indicate that the light should gradually return to a default state (e.g., whatever state the light was in prior to the start of the video program, i.e., what the viewer had set the lighting to prior to watching the video program) over some predefined or specified period of time. For example, the duration for lighting effect 2001 might indicate the amount of time over which the light should gradually return to the default state. Effect ID 2002 might be used to indicate that the final state of the previous effect should be held for the period of time specified in the duration field. Effect ID 2003 might be used to indicate a blackout, i.e., all lights off, for the period of time specified in the duration, or indefinitely if the duration is zero. Additional or different transition effects may also be defined.
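For illustration, dispatch of these reserved transition Effect IDs might be sketched as follows. The light interface (fade_to, hold, off) is hypothetical; only the Effect ID semantics come from the description above.

```python
# Hypothetical dispatch of the reserved transition Effect IDs.
# The per-light interface (fade_to/hold/off) is assumed for illustration.
def apply_transition(effect_id: int, duration: float, lights) -> None:
    for light in lights:
        if effect_id in (2000, 2001):
            # Gradually return to the default (pre-program) state.
            light.fade_to(light.default_state, seconds=duration)
        elif effect_id == 2002:
            # Hold the final state of the previous effect.
            light.hold(seconds=duration)
        elif effect_id == 2003:
            # Blackout: all lights off; duration 0 means indefinitely.
            light.off(seconds=duration if duration > 0 else None)
```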
With reference to
Continuing with this example, lighting scheme 801 next indicates that, at 23 minutes and 12.5 seconds, sunrise effect (Effect ID 2) is executed. The duration is set to 0, indicating that the effect is to be executed as defined by the primitives in Effect ID 2. Scheme 801 next indicates that Effect ID 2001 is executed, which by agreement refers to a gradual return to the default state of each light over the time period specified in the duration for that effect, i.e., in this example over a period of 30 seconds. The Time=0 indicates that Effect ID 2001 is to be executed immediately after the preceding effect (sunrise) is completed.
Referring to the same example, lighting scheme 801 next indicates that, at 36 minutes and 8.8 seconds, sunset effect (Effect ID 3) is executed. The duration is set to 0, indicating that the effect is to be executed as defined by the primitives defined in Effect ID 3. Scheme 801 next indicates that blackout Effect ID 2003 is immediately executed upon completion of the sunset effect, thereby causing all lights to be completely off (regardless of how the sunset effect ended) for 5 seconds. Scheme 801 next indicates that Effect ID 2001 is again executed to gradually return the lights to their default state over the time period specified in the duration for that effect, i.e., in this example over a period of 45 seconds. The Time=0 indicates that Effect ID 2001 is also to be executed immediately after the preceding effect (blackout) is completed.
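For illustration, the scheme 801 entries walked through above might be represented as (time, effect ID, duration) entries, as in the following Python sketch; the tuple encoding is illustrative, not a wire format. A time of "0" means the effect executes immediately after the preceding effect completes, and a duration of 0 means the effect runs as defined by its primitives.

```python
# The example lighting scheme 801, expressed as (time, effect_id, duration).
SCHEME_801 = [
    ("23:12.5", 2,    0),    # sunrise effect, runs as defined by its primitives
    ("0",       2001, 30),   # gradual return to default state over 30 seconds
    ("36:08.8", 3,    0),    # sunset effect
    ("0",       2003, 5),    # blackout for 5 seconds
    ("0",       2001, 45),   # gradual return to default state over 45 seconds
]
```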
Using the hardware components (lights, wireless networks, media distribution networks, etc.), primitives, effects, and schemes described above, aspects described herein provide the architecture for dynamic lighting schemes to be performed in conjunction with a media program, dynamically changing the hue and intensity of light sources within the viewing area proximate to a video display in order to enhance the viewing experience.
In order to effect dynamic lighting based on the lighting primitives, effects, and schemes, in illustrative embodiments lighting controller 211 (
In some examples, before lighting primitives, effects and schemes can be effected, lighting controller 211 (
According to a first aspect, when each light source is manufactured it may be hardcoded to be a bulb for a specific light channel. In still further embodiments, 5.1 (“five point one”) is the common name for a multi-channel surround sound (e.g., six-channel) system. 5.1 surround sound is the layout used in many cinemas and home theaters. The standard employs five full-bandwidth channels and one “point one” enhancement channel, and is used in digital broadcasts. Similarly, aspects of the present invention propose extending 5.1 to ambient lighting to enhance the overall cinematic experience.
In an illustrative 5.1 ambient lighting channel system (e.g., two front, two rear, one center, and one burst), light sources may be sold in kits of 6 light bulbs, labeled appropriately for each channel, or may be sold in kits of 5 bulbs (one for each multicolor channel), with the burst channel bulb sold separately. Other combinations of bulbs may be packaged together (for example, a kit of the four front and rear bulbs only), and each bulb may also be sold individually, e.g., so a consumer can replace an individual bulb that is no longer working. In this example, where light sources' respective channels are set at manufacturing, e.g., by hardcoding the light channel in the light source, no further setup is required beyond the user ensuring that the correct bulb is inserted into its correspondingly located lamp 401-406. Subsequently, when lighting controller 211 sends commands to a bulb designated as “front right”, any light source designated as a front right bulb may respond to those commands (regardless of where that light source is actually located). For example, the outer housing 301 of the light source itself may be labeled front left, front right, rear left, rear right, center, and/or burst. The user simply needs to place the correctly labeled light source in a lamp in the correct location. Alternatively, the light sources can be dynamically programmed based on an interactive remote control. For example, a tablet device could activate each detected device in sequence, and the user could simply drag an icon indicative of the active light source to a location on the tablet such as front left, front right, rear left, rear right, center, and/or burst.
According to another example, each light source 300 may include a plurality of interactive control elements such as dip switches 323 through which a user can set each bulb to be on a designated channel. In the example shown in
In yet another aspect, light source 300 may include a pairing button 325. Processor 311 may be configured, upon detecting that pairing button 325 has been pressed, to enter a pairing mode. While in the pairing mode, the processor may utilize a remote control and/or display screen to allow a user to input a code assigning a light source to a particular location such as front left, front right, rear left, rear right, center, and/or burst. For example, the lighting controller may include instructions that execute a configuration wizard program. The configuration wizard program may cause device 200 to display various commands on display 206. For example, the wizard may cause one of the detected light sources to blink while displaying a message stating “Press the appropriate pairing button: front left ‘1’, front right ‘2’, rear left ‘3’, rear right ‘4’, center ‘5’, and/or burst ‘6’.” The wizard then listens for an identification message received from the user to complete the location pairing with the activated light source. In this example, when the user subsequently presses the pairing button input on the remote control, the processor thereafter associates the light source with the location selected during the pairing. In this manner, the bulb's MAC address (or other ID) is paired with a location in the lighting controller 211. Lighting controller 211 records the ID as being associated with, for example, the front right channel. Similar steps may be performed for each of the other channels in use.
In yet another aspect, an RF4CE ZigBee protocol may be used to pair the lighting controller with the individual bulb devices to be controlled. Wi-Fi Protected Setup (WPS) may also be used, e.g., in combination with pairing button 325.
According to another aspect, pairing may be performed via near-field communications (NFC) or other short range RF communication using input device 208, e.g., a remote control. In such an aspect, device 200 and input device 208 may each include an NFC chip 212, 215. Device 200 may optionally include NFC chip 212 within lighting controller 211, I/O 209, or separately within device 200, as shown. Each light source 300 may also include NFC circuitry, e.g., within transceiver 321, or separately. NFC chips are known in the art. Other short range RF communication standards and/or technologies may also or alternatively be used.
After the number and position of the lights are selected by the user in step 1201, in step 1203 device 200 determines an order in which the lights should be paired. The order is used by device 200 to later determine which paired light is in each position. Step 1203 may include, e.g., device 200 displaying a chart to the user, as shown in
In step 1205 the lighting controller 211 (e.g., via device 200) transfers pairing information to a transfer device. Pairing information may include configuration information. In this example, the transfer device may be input device/remote control 208, which may be provided to the user with set top box/device 200. Device 208 may be configured with NFC chip 215, as well as a processor (not shown) controlling operations of device 208, and memory (not shown) to temporarily store pairing information for transfer between device 200 and each light 401-406. The memory of device 208 may further store control logic for controlling device 208 during the transfer/exchange process. Device 208 may initiate a pairing mode of operation, during which a processor of device 208 controls communications via NFC 215 with device 200 and lights 401-406.
Pairing information may include, in an illustrative RF4CE embodiment, a personal area network (PAN) ID, MAC Address of device 200, an encryption key (e.g., an AES-128 key), a channel number, and an RF4CE address of device 200 or other shortened address to which data for device 200 (or lighting controller 211) can be sent. Pairing/configuration information may include different information needed to establish communications between device 200 and each light 401-406. Pairing/configuration information may also include a device type, e.g., where each light may need to be configured differently depending on the type of lighting controller in use. Pairing/configuration information may also include a lighting protocol identifier, e.g., where a light is configurable for use with different ambient lighting protocols. Pairing/configuration information may also include a bit rate to be used, where the lighting controller and/or light are configurable for use with different bit rate streams.
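For illustration, the pairing record and a byte encoding for the NFC exchange might be sketched as follows; the field widths, ordering, and serialization format are assumptions, since the description above names only the fields themselves.

```python
# Sketch of the RF4CE pairing/configuration record described above, with
# an assumed byte layout for the NFC exchange.
import struct
from dataclasses import dataclass

@dataclass
class PairingInfo:
    pan_id: int          # 16-bit personal area network (PAN) ID
    mac_address: bytes   # 8-byte MAC address of device 200
    aes128_key: bytes    # 16-byte AES-128 encryption key
    channel: int         # RF channel number
    short_address: int   # RF4CE (shortened) address of device 200

    _FMT = ">H8s16sBH"   # assumed field widths/ordering, 29 bytes total

    def to_bytes(self) -> bytes:
        return struct.pack(self._FMT, self.pan_id, self.mac_address,
                           self.aes128_key, self.channel, self.short_address)

    @classmethod
    def from_bytes(cls, data: bytes) -> "PairingInfo":
        pan, mac, key, ch, short = struct.unpack(cls._FMT, data)
        return cls(pan, mac, key, ch, short)
```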
The transfer of pairing information from device 200 to device 208 may comprise placing NFC 215 of device 208 in proximity to NFC 212 of device 200, at which time device 200 sends its pairing information to device 208 via NFC. Such NFC communication may be referred to herein as “touching” devices together to share or exchange information. Each device may have a graphic or sticker indicating a location of its internal NFC chip to identify where each device should be touched.
Next, in step 1207, the user may ensure that each light is powered on, thereby providing power to each light's NFC circuitry 321. The user may position device 208 in proximity to each light in the order prescribed by device 200 (e.g., as shown in
Each light's pairing information provided to device 208 may include a MAC Address of the light, and an RF4CE address or other shortened address to which data for that light can be sent, as well as an encryption key and channel number. Optional configuration information may be included as well, e.g., light capabilities, how many colors or LEDs the light has, types of lighting formats supported, acceptable bit rates, etc. Device 208 stores each light's pairing information in its memory in such a manner that device 208 can provide to device 200 an indication of order among the information received for each light (e.g., storing in a queue, stack, or any other ordered data structure).
After “touching” device 208 to each light in the determined order, in step 1209 the user again touches device 208 to device 200 to transfer the pairing and configuration information received from each light to lighting controller 211 via NFC 212. Lighting controller 211, in step 1211, determines which light is assigned to each position/light channel based on knowing the number and arrangement of lights selected by the user in step 1201, the order in which the lights were “touched” (e.g., assumed to be as shown to the user), and the order in which the data was stored (queue, stack, numbered, etc.) and/or received from device 208. The lighting controller may also confirm the proper number of lights based on the pairing/configuration information objects downloaded from device 208. Once lighting controller 211 stores pairing information for each light, lighting controller 211 in step 1213 can begin transmitting ambient lighting instructions, as further described herein.
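For illustration, the order-matching of step 1211 might look like the following Python sketch, assuming the pairing records arrive from device 208 in an ordered queue; the position list, record fields, and MAC values shown are hypothetical.

```python
# Sketch of step 1211: assign each paired light to a position by matching
# the order chosen in step 1203 against the order in which pairing records
# were received from device 208 (an ordered queue).
from collections import deque

positions = ["front left", "front right", "rear left",
             "rear right", "center", "burst"]        # order from step 1203

received = deque([                                   # order the lights were "touched"
    {"mac": "00:1A:00:00:00:01"}, {"mac": "00:1A:00:00:00:02"},
    {"mac": "00:1A:00:00:00:03"}, {"mac": "00:1A:00:00:00:04"},
    {"mac": "00:1A:00:00:00:05"}, {"mac": "00:1A:00:00:00:06"},
])

assert len(received) == len(positions)   # confirm the expected light count
channel_map = {pos: received.popleft()["mac"] for pos in positions}
```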
The process described with respect to
While NFC has been used in this example, other short range communication protocols may also be used. The method may be modified by adding steps, combining steps, or rearranging steps, provided the end result is the same, namely, pairing each light with the lighting controller, with known positions. While the above method has been described with respect to dynamic ambient lights, a similar pairing method may also be used with any other location dependent system using position-dependent devices, e.g., surround sound speakers, microphone arrays, vibration generators, smell/olfactory sources, directional wind generators, heat sources, moisture generators, and the like, in order to exchange pairing and position information in a single process. Other position-dependent devices may alternatively be used.
In illustrative embodiments, after lighting controller 211 has been configured (as necessary) to communicate with the appropriate light source for each light channel in use, lighting controller 211 may then dynamically alter room lighting based on the video program being displayed on TV 206. According to a first aspect, lighting controller 211 may dynamically alter the lighting in real-time based on a color analysis of the video program being performed or displayed. According to a second aspect, lighting controller 211 may dynamically alter the lighting based on a predefined lighting scheme corresponding to the program being performed or displayed. Each example is described in turn below.
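As an illustrative sketch of the first aspect, per-channel colors might be derived from a frame in real time as follows. The frame representation (a nested list of (r, g, b) pixels) and the simple left/right split driving the front channels are assumptions, not the specific analysis performed by lighting controller 211.

```python
# Sketch of real-time color analysis: average the left and right halves
# of a frame to drive the front left and front right channels.
def average_color(pixels):
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) // n for i in range(3))

def frame_to_channels(frame):
    left, right = [], []
    for row in frame:
        mid = len(row) // 2
        left.extend(row[:mid])
        right.extend(row[mid:])
    return {"front left": average_color(left),
            "front right": average_color(right)}

# Example: a red-left, blue-right frame maps red to the front left channel.
frame = [[(255, 0, 0), (0, 0, 255)], [(255, 0, 0), (0, 0, 255)]]
print(frame_to_channels(frame))
```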
With reference to
According to an alternative aspect, the lighting analysis may continue until user input is received indicating a desire to end dynamic ambient lighting, rather than ending when the video program ends. In yet another alternative, device 200 may query a user at the end of a video program to determine whether to continue dynamic ambient lighting or not. Other ways of determining when the device should end ambient lighting may also or alternatively be used.
With reference to
In this example, in step 1001, a lighting scheme is generated based on a particular video program. The lighting designer may be a human user, using a studio application or other software, manually selecting effects to be applied within a video program and associating those effects with specified times, durations, and/or transitions. Alternatively, the lighting designer may be automated video analysis software that automatically segments the video into various segments, detects certain events within those segments (e.g., flashing police lights, explosions, plays in a football game, touchdowns, etc.), automatically applies applicable effects at corresponding times and durations in the video program, and optionally also sets a transition after each lighting effect is completed. The set of lighting effects, durations, and transitions associated with a particular video program is then saved as a lighting scheme that can be associated with that particular video program. These may be associated with the video program as lighting instructions that may be synchronized with the video either within a digital stream (e.g., an MPEG stream) and/or as a separate file time-coded to the digital stream.
In certain examples, because multiple lighting schemes might be based on the same particular video program, e.g., created by two different lighting designers, in step 1003 a single lighting scheme may be selected for transmission with the particular video program. Next, in illustrative step 1005, the selected lighting scheme may be packaged for transmission with the particular video program. According to one aspect, packaging may include saving the video program and lighting scheme as a single file or set of associated files in a predetermined format for sending over a desired delivery platform. For example, in one aspect the selected lighting scheme may be intended to be sent in a synchronized MPEG-2 and/or MPEG-4 stream, e.g., using the Enhanced TV Binary Interchange Format (EBIF), to transmit the ambient lighting scheme in a time-synchronized manner with the video program. In such an environment, the video program and lighting scheme may be saved in a format for immediate or near-immediate transmission, with little or no conversion required before transmission. In other embodiments, the files are sent as separate files and then time-coded to particular segments of the MPEG stream.
In illustrative step 1007 the packaged file is transmitted to a media consumer device. Transmission may occur at or initiate from a headend 103 or other media distribution location. In step 1009 the transmission is received by a media device, e.g., device 200, a set-top box (STB), digital video recorder (DVR), computer server, or any other desired computing device capable of receiving and decoding the transmission.
In illustrative step 1011, the media device decodes the transmission into a video program and a lighting scheme, and forwards each portion to applicable hardware for further handling. In illustrative step 1013 the media device outputs the video program portion of the transmission for display on a video display screen, e.g., display 206. In this illustrative method, the media device outputs the lighting scheme to lighting controller 211 for control of an ambient lighting system as described herein. Based on the time-based information in each of the video program and the lighting scheme, the video and the ambient lighting information may be performed in synchronicity with each other, thereby rendering the lighting scheme in conjunction with the video program as intended by the lighting designer.
The above aspects and information describe only one possible implementation of the dynamic ambient lighting system and methods thus far described. Many variations and alternatives are possible that allow a system to remotely control multiple light sources, using a synchronized transport stream (e.g., an MPEG-2 transport stream) or an asynchronous transmission as its communications path. A system remote from the individual light sources themselves can thereby control lighting in predefined ways. For example, a movie might have, encoded within its MPEG-2 transport stream, instructions for lighting in the room where the movie is being viewed. A scene in the movie might have police lights flashing; a remote command might be sent to specific bulbs in the viewing room to flash red and blue. The result is an intelligent expansion of the viewing platform.
In another illustrative embodiment, a lighting controller might query a lighting scheme database (e.g., over network 109, 210, the Internet, etc.) based on a program ID of received video content. If a lighting scheme is identified as a result of the query, the lighting controller (or other applicable component) might download the lighting scheme from the lighting scheme database for use during playback of the video content, as described herein. If more than one lighting scheme is identified as a result of the query, the lighting controller (or another applicable component) might query the user to determine which lighting scheme should be used, or may pick a lighting scheme automatically, e.g., based on an author of the lighting scheme, popularity, user feedback or reviews, or based on other information known about the lighting scheme. Once selected and downloaded, the lighting controller uses the selected lighting scheme to control ambient lighting during playback of the video content, as described herein.
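A minimal sketch of such a query follows, assuming a hypothetical HTTP endpoint and a JSON response in which each scheme carries a popularity field; none of these specifics come from the description above.

```python
# Sketch of a lighting-scheme database lookup keyed by program ID.
# The endpoint URL, query parameter, and response fields are hypothetical.
import json
import urllib.request

def fetch_lighting_scheme(program_id: str):
    url = f"https://example.com/lighting-schemes?program_id={program_id}"
    with urllib.request.urlopen(url) as resp:
        schemes = json.load(resp)
    if not schemes:
        return None   # no scheme found for this program
    # Pick automatically, e.g., by popularity (could instead prompt the user).
    return max(schemes, key=lambda s: s.get("popularity", 0))
```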
According to one example, instead of the format shown in
According to another example, the synchronized lighting scheme data, upon encapsulation within the MPEG transport stream, may be encapsulated into descriptor elements as “proprietary data” as that term is used in the MPEG standards. In one embodiment, the lighting instructions may be packaged as proprietary data and identified within a Program Map Table (PMT) of the client device or gateway. This metadata can be utilized by the computing device 200 to control lighting, and also by the program guide to show programs that are ambient lighting enabled. The computing device 200 may be configured to check the descriptor elements, including the proprietary data, in order to recognize that the proprietary data is of a type that includes lighting instructions. For example, a type identifier from within the PMT may be used, with the binary stream synchronized to the concurrently received video and audio streams. Upon reading the lighting instructions, the computing device may be configured to broadcast data associated with the lighting instructions to 802.15.4 radio receivers embedded within each light channel's light source. According to this aspect, each light source may be configured with a specific identification. Using a field within the lightControl packet structure to determine whether the lighting control message is meant for it, a light source's processor determines whether that light source should implement the lighting instruction it has received. As discussed above, a lighting instruction might be a simple set of intensity values for each LED strand, e.g., a primitive, or alternatively the lighting instruction could be a more complex lighting effect, perhaps lasting many seconds.
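For illustration, a light source's receive-side check might be sketched as follows. The lightControl packet layout shown (a 16-bit destination ID followed by four strand intensities) is an assumption, since the actual packet structure is not specified here.

```python
# Sketch of a light source's receive path for a lightControl message.
# The packet layout is assumed for illustration only.
import struct

LIGHT_CONTROL_FMT = ">H4B"   # assumed: dest ID + R, G, B, W strand intensities

def handle_light_control(packet: bytes, my_id: int) -> None:
    dest, r, g, b, w = struct.unpack(LIGHT_CONTROL_FMT, packet)
    if dest != my_id:
        return   # instruction addressed to a different light source
    # Drive LED strands 303-309 (stubbed here for illustration).
    print(f"set strand intensities R={r} G={g} B={b} W={w}")
```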
According to other aspects, ambient lighting may be used to signify external events in or around the viewing area. For example, when a loud video program is playing, it may be difficult for a viewer to hear the telephone ring. Currently, media distribution systems tie in to the telephone line and may display caller ID information on a television or other display apparatus. According to an inventive aspect herein, the lighting controller may be configured to perform a specific lighting effect or scheme when a telephone rings or upon the occurrence of other predefined events not associated with the video program being watched. For example, when the phone rings, the lighting controller may cause the ambient lights to perform a strobe effect. In another example, when a doorbell is rung the lighting controller may cause the ambient lights to repeatedly transition from dim to bright and vice versa, or perform some other predefined effect. Device 200 may also be configured to act as an alarm clock and have the lighting activated responsive to an alarm event such as a predetermined wakeup hour. Further, the lighting may be responsive to other events such as a laundry cycle ending, a stove timer, the dishwasher finishing, etc. Predetermined effects may include any desired light channel(s), colors, strobes, durations, patterns, etc. Auxiliary devices such as laundry machines may be tied in via network 210.
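For illustration, the mapping from external events to predefined effects might be sketched as follows; the event names, effect names, and durations are hypothetical.

```python
# Sketch of an external-event-to-lighting-effect mapping.
EVENT_EFFECTS = {
    "phone_ring":   ("strobe", 5),            # strobe for 5 seconds
    "doorbell":     ("dim_bright_cycle", 4),   # repeated dim/bright transitions
    "alarm_clock":  ("gradual_bright", 60),    # wake-up brightening
    "laundry_done": ("pulse", 3),
}

def on_external_event(event: str, run_effect) -> None:
    effect = EVENT_EFFECTS.get(event)
    if effect is not None:
        run_effect(*effect)   # run_effect(name, duration) is assumed
```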
According to some aspects described herein, a set-top box or other media device may be configured to output the lighting scheme portion of the transport stream via USB or HDMI (e.g., over the consumer electronics control (CEC) data path) to an external device that includes the lighting controller and/or associated wireless transmitter. This configuration allows set-top boxes or other currently available devices, which do not have the requisite hardware installed for use in the described ambient lighting control system(s), to be retrofitted for such use. In another variation, a Digital to Analog (DTA) adapter may be used to receive streamed (e.g., via MPEG-2) lighting instructions. The latest generation of these devices includes RF4CE transmitter capability, so there would be no need for an external adapter. The DTA adapter, in such an embodiment, may also transmit the lighting instructions to the light sources using the RF4CE transmitter.
As described above, aspects described herein are not limited to dynamic ambient lighting. With reference back to
Data packet 1500 may also include a device ID 1503 identifying a particular dynamic device 220, e.g., to distinguish between lighting, toys, wind generators, etc. The device ID 1503 may reference a particular device, or may reference a device type or genre. Device ID 1503 may refer, for example, to classes of devices, e.g., wind generators, olfactory emitters, lasers, etc., and if the device controller is in communication with any device of the specified type, it may act upon the received data packet at the indicated timestamp. For example, the device ID may indicate a value corresponding only to a single type of stuffed animal (e.g., an elephant) that has built-in electronics for movement of the elephant's trunk, memory for storing one or more audio clips, and a speaker for playback of the stored audio clip in synchronization with a media program. Alternatively, the device ID may indicate a value corresponding to a genre of devices that have predetermined capabilities corresponding to that device ID. For example, the device ID may correspond to any plush toy or other device having memory for storing audio clips and a speaker for playing the stored audio clips, regardless of whether device 220 is an elephant, bear, tree, cube, etc.
Data packet 1500 may also include payload 1507 including all information and/or instructions necessary for the device(s) identified by Device ID 1503 to perform some action at the time identified by timestamp 1505 during the media program identified by Program ID 1501. Payload 1507, e.g., may include a command instructing device 220 to play stored audio clip 1 at time 4:17 of a specified media program. Alternatively, payload 1507 may include the audio clip to be played by device 220, as well as a command to play the audio clip at a particular time. In still other embodiments, payload 1507 may include one or more lighting instructions for a dynamic light device as described above. Payload 1507 may include instructions defining an intensity of wind for a wind generator, as well as times to start and/or stop blowing wind at the specific intensity. An infinite number of possibilities exist for different types of dynamic devices to be synchronized to playback of a media program. The specific information in packet 1500 is secondary to the ability to synchronize some action performed by a dynamic device with playback of a media program based on information contained in media packet 1500.
In some embodiments, the payload instructions may be predetermined according to an adopted standard or protocol. In other embodiments, the payload instructions may be particular or specific to a device or genre of devices. In other embodiments, a combination of the two may be used.
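For illustration, dynamic media packet 1500 might be modeled as follows; the Python types are assumptions, while the four fields (1501, 1503, 1505, 1507) come from the description above.

```python
# Sketch of dynamic media packet 1500. The payload is opaque bytes,
# interpreted by the target device per the adopted standard or a
# device-specific protocol.
from dataclasses import dataclass

@dataclass
class DynamicMediaPacket:
    program_id: str      # field 1501: media program the packet belongs to
    device_id: str       # field 1503: specific device, type, or genre
    timestamp: float     # field 1505: media time (seconds) for the action
    payload: bytes       # field 1507: instructions/data for the device
```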
Media packet 1500, or a stream of media packets 1500, may be included within an MPEG media stream delivering one or more media programs to a user device (e.g., to a television, set-top box, computer, media server, etc.). Device controller 211 may store a lookup table or database of devices in communication with the device controller. Device controller 211 may then determine, upon receiving a media packet 1500 in a media stream, whether device controller 211 is in communication with a device 220, 300 that corresponds to Device ID 1503. When device controller 211 is in communication with such a device, controller 211 may send the payload to the corresponding device for execution at the time specified by timestamp 1505. In one embodiment, controller 211 might not send payload 1507 to the relevant device until timestamp 1505, at which time the device 220 executes the instructions immediately upon receipt. In another embodiment, controller 211 might send payload 1507 to the relevant device along with timestamp 1505 (or a preset delay), and the device 220 then waits until timestamp 1505 (or until the delay has expired) before executing the instructions in payload 1507.
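For illustration, the first embodiment (holding the payload until the timestamp) might be sketched as follows, using the DynamicMediaPacket model above; the lookup table contents, address format, and send callback are hypothetical.

```python
# Sketch of controller 211's dispatch: check Device ID 1503 against a
# lookup table of known devices, then deliver payload 1507 at timestamp 1505.
import threading

known_devices = {"plush-elephant-01": "rf4ce:0x1234"}   # illustrative table

def on_media_packet(pkt: "DynamicMediaPacket", media_time: float, send) -> None:
    addr = known_devices.get(pkt.device_id)
    if addr is None:
        return   # no matching dynamic device 220 registered at this premises
    delay = max(0.0, pkt.timestamp - media_time)
    threading.Timer(delay, send, args=(addr, pkt.payload)).start()
```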
In embodiments where the payload instructions are not executed instantly by a device 220, the delay is preferably minimized to ensure that the user is still watching the media program to which media packet 1500 corresponds. That is, the user may change a television channel, or otherwise navigate away from the media program associated with media packet 1500 before payload 1507 of media packet 1500 is executed by device 220. As a result, the user may be watching a different media program when the instruction is executed, causing confusion to the user. The delay or other advance notification of a media packet to a device 220 should therefore be minimized to reduce this possibility. Alternatively, when a user changes a channel or navigates away from a media program, device controller 211 may be adapted to send a cancellation message to a device 220 for any sent, but unexecuted, payload instructions.
In step 1611 device controller 211 waits for timestamp 1505 to occur during playback of media program 1501. In another aspect, waiting for a timestamp might not occur, and the payload instructions may be sent immediately upon processing. When controller 211 detects that media program 1501 is at or near timestamp 1505, then in step 1613 device controller 211 sends payload instructions 1507 to the dynamic device 220 identified in step 1609, e.g., via wireless communications, RF4CE, ZigBee protocols, etc. In step 1615 dynamic device 220 executes the received payload instructions at or near timestamp 1505. In step 1617, if the media stream continues, then device 200 continues to receive the media stream back in step 1601.
In some aspects, the method described above may be modified, e.g., to include the timestamp in the information sent to each device 220, which may be used by device 220 to ensure that execution of the payload instructions occurs at a particular time. The method may also be altered to send cancellation instructions to a device when a payload has been sent to the device but not yet acted upon, and the user has changed channels or otherwise navigated away from the media program with which the payload is associated. In some embodiments, steps may be combined, split, and/or reordered.
As described herein, instructions (dynamic media packets) for execution by a dynamic device may be embedded in an MPEG stream, e.g., an MPEG-2 program stream sent over a media distribution network. The instructions may be decoded by a media consumption device such as a set top box, home gateway, media server, computer, or similarly configured devices. Each dynamic media packet contains information for execution by a dynamic device in synchronization with playback of a media program or other video stream. Communication between a device controller and a dynamic device may be in accordance with 802.15.4 chipsets (e.g., according to ZigBee and/or RF4CE protocols), or in accordance with other standards and/or protocols. The dynamic device receiving the instructions then executes the instructions to play a sound, generate wind, alter lighting, ring a chime, generate fog or smoke, or perform any other action identified in a dynamic media packet and which the dynamic device is capable of performing.
According to aspects described herein, interactivity may be controlled by the video rendering device (e.g., TV), or any other computing device (e.g., set top box, device controller 211, etc.) which has information about the current media time of the program which is being watched.
According to an aspect, a set top box or other device 200 may download a set of triggers (e.g., predefined or preset dynamic media packets) from a web server. Each trigger may be or include a UPnP action corresponding to a media time or timestamp. In another case, these could be a series of tuples with UPnP actions, timestamps, and activation parameters (program ID, device ID, business rules, and/or other parameters determining when the UPnP action should be performed).
Device 200 may discover all dynamic devices within a premises using a wireless universal plug-and-play (UPnP) protocol, store device information in a lookup table or database for future reference (e.g., during step 1605 above) and then invoke UPnP actions corresponding to the defined triggers when the conditions for the trigger are met. The triggers and conditions may be included in dynamic media packets. When a condition or trigger is based on additional information beyond program ID, device ID, and timestamp, any such additional requirements may be included in payload 1507, or the format of dynamic media packet 1500 may be altered to include additional or different fields, as needed.
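For illustration, such a trigger set might be represented as follows; the UPnP action names, the matching tolerance, and the parameter keys are hypothetical.

```python
# Sketch of downloaded triggers: (UPnP action, media timestamp, parameters).
triggers = [
    ("PlayClip",  257.0, {"program_id": "sesame-e101", "device_id": "big-bird"}),
    ("MoveMouth", 258.5, {"program_id": "sesame-e101", "device_id": "big-bird"}),
]

def check_triggers(program_id: str, media_time: float, invoke) -> None:
    for action, ts, params in triggers:
        # Fire when the current media time is within a small window of the
        # trigger's timestamp for the program being watched.
        if params["program_id"] == program_id and abs(media_time - ts) < 0.25:
            invoke(action, params)   # invoke the UPnP action on the device
```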
According to an illustrative use case, a user may purchase a set of stuffed toys, such as characters from Sesame Street®, which will speak along with a Sesame Street show. The video stream in which the show is sent to end-user devices may be customized with dynamic media packets instructing the stuffed toy characters to talk in a personalized manner, or in supplement to the show.
According to an additional aspect, the media program itself may be altered based on the characters present in a particular premises, tailoring the dialog/interactivity to the dynamic devices present. For example, after receiving device 200 registers one or more dynamic devices 220, device 200 may communicate with a server over a network 210, such as the Internet, and identify the dynamic devices associated with that receiving device 200. When a user instructs device 200 to tune to a particular channel or to play a particular media program, a server device 103 serving content to device 200 may select a media stream based on the devices 220 known to be associated with device 200. In this manner, a portion of dialogue associated with a television program or other media program may be spoken by one or more of the dynamic devices 220, rather than output via a speaker associated with a television set or stereo system. Alternatively, a single media stream may include separate audio channels for dialog by different characters; device 200 may send audio associated with a particular character to a corresponding dynamic device when a dynamic device has previously been registered for that character, or may send audio associated with that particular character to a default speaker when a dynamic device has not been previously registered for that character.
For example, central location 103 (e.g., a headend, local office, or other media distribution location), upon receiving a request from device 200 to tune to a Sesame Street media program, may determine that device 200 is known to be associated with dynamic devices having device IDs corresponding to the Big Bird and Oscar the Grouch characters in the show. Each of the Big Bird and Oscar the Grouch dynamic devices may be known to have memory for storing audio clips, a processor, animatronics (or other mechanical actuators) to move the mouths of the respective characters, and a speaker for playback of the stored audio clips. Headend 103 may then select a version of the Sesame Street media program that replaces some or all of the dialog for the characters Big Bird and/or Oscar in the media program with one or more dynamic media packets storing replacement dialog to be spoken by the corresponding Big Bird and Oscar the Grouch dynamic devices, with corresponding instructions to move the mouths of the characters in concert with playback. Alternatively, the media program might not be altered, but may instead include separate audio channels for each character's dialog. Device 200 may send audio output corresponding to dialog by Big Bird and Oscar the Grouch to device controller 211 for transmission to the Big Bird and Oscar the Grouch dynamic devices, respectively, and may send the remainder of the audio channels to an audio processor for output via a default speaker or speakers.
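For illustration, the headend's version selection might be sketched as follows; the version names and the subset-matching rule are assumptions, as the description requires only that a version be chosen based on the set of dynamic devices registered for the requesting device.

```python
# Sketch of server-side version selection: choose the program version whose
# required dynamic devices are all registered for the requesting device 200.
VERSIONS = {
    "default":        set(),                    # no dynamic devices required
    "big-bird":       {"big-bird"},
    "big-bird+oscar": {"big-bird", "oscar"},
}

def select_version(registered: set) -> str:
    # Among versions whose requirements are satisfied, prefer the one that
    # makes use of the most registered devices.
    usable = {name: req for name, req in VERSIONS.items() if req <= registered}
    return max(usable, key=lambda name: len(usable[name]))

print(select_version({"big-bird", "oscar"}))   # -> "big-bird+oscar"
print(select_version(set()))                   # -> "default"
```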
In another example, dynamic media packets may be inserted into a media stream dynamically, based on events occurring in the media program. For example, a producer of the broadcast of a football game may instruct a broadcast computer to insert a dynamic media packet when a team scores. A remote-controlled, motorized (e.g., electromechanically actuated) football action figure may then be controlled to spin around in response to the inserted dynamic media packet.
In another example, an advertiser may give away a dynamic device that responds (animates, plays sound, etc.) to that advertiser's television commercials. For example, a Dunkin' Donuts coffee cup may lift its lid and play a melody in response to dynamic media packets included in a Dunkin' Donuts commercial. The melody may be preloaded on the dynamic device, or included within the dynamic media packets.
It will thus be appreciated and understood that modifications may be made without departing from the true spirit and scope of the present disclosure. The description is thus to be regarded as illustrative instead of restrictive on the present disclosure.
This application claims priority to and is a continuation in part of co-pending application Ser. No. 13/326,617, filed Dec. 15, 2011, entitled “Dynamic Ambient Lighting”, which claims priority to provisional application No. 61/567,783, filed Dec. 7, 2011, also having the title “Dynamic Ambient Lighting,” each of which is herein incorporated by reference for all purposes.