The present invention claims priority of Korean Patent Application No. 10-2008-0105763, filed on Oct. 28, 2008, which is incorporated herein by reference.
The present invention relates to a technique for playing media; and, more particularly, to a system and method for an orchestral media service suitable for playing media that includes multiple audio/video tracks and neodata synchronized with multiple active and passive devices over a wired or wireless network.
The digital home will evolve into a real-sense, intelligent ubiquitous home. The home digital devices present in such a home will be interconnected through a wired or wireless network. To date, home media devices have handled media playback using an actuator. The actuator may be implemented by, e.g., a home server, a set-top box, a DTV (digital television) and the like in the home, and by, e.g., a smart phone, a PDA (Personal Digital Assistant), a PMP (Portable Media Player) and the like while on the move. For example, media has typically been played on a single playback device such as a television at home. In the future, rather than processing all media playback in one actuator, media playback devices will cooperate with each other and play together to give richer effects to users, and will evolve to suit the user's home. Various media playing methods that use multiple devices together are currently being discussed in this regard.
However, although the number of media playback devices present at home is increasing and each device has a built-in function capable of playing media, there is no adequate playback method that plays media by integrating these home appliances, so the devices present at home are not fully utilized.
As described above, in state-of-the-art media playback systems, a media item consisting of one video and one audio track is usually played on a single playback device. Even when there are various devices capable of playing media at home, only one device can be used to play one media item, because these devices do not support playing multiple audio/video tracks. When one media item contains multiple audio/video tracks and effect data related to specific scenes, it is better to use all the devices to play the media in order to maximize its effects.
In view of the above, the present invention provides a system and method for an orchestral media service capable of playing media including multiple audio/video tracks synchronized with multiple active devices, e.g., a PC, a PDA (Personal Digital Assistant), a UMPC (Ultra Mobile PC), a PMP (Portable Media Player), a PSP (PlayStation Portable) and the like, and passive devices, e.g., a heating device, a lighting device, a shading device, a temperature and humidity controller and the like, through a wired or wireless network.
Further, the present invention provides a system and method for an orchestral media service capable of transferring media including multiple tracks to multiple active devices through a wired or wireless network, playing the different audio/video tracks included in the orchestral media on the multiple active devices, and controlling passive devices to produce non-audiovisual effects (e.g., scent, fog, light, vibration, etc.) synchronized with a main audio/video played in an actuator.
In accordance with a first aspect of the present invention, there is provided a system for an orchestral media service which receives orchestral media having multiple tracks and neodata from a media service provider and spreads the tracks over the multiple connected devices for playback, the system including: a client engine that parses the orchestral media to separate it into individual audio/video tracks and neodata (containing effect data), synchronizes the connected devices on the basis of the playtime of the orchestral media, analyzes the neodata, maps the effect data inside the neodata into control commands that control the effect devices connected with the actuator, and outputs the mapped control commands to the passive devices; and a communication interface that establishes connections with the devices having respective communication interfaces and transfers the control commands to the connected devices.
In accordance with a second aspect of the present invention, there is provided a method for an orchestral media service, including: controlling the total time to play the orchestral media transferred from the media service provider, in the actuator connected with the active and passive devices, to perform continuous synchronization; parsing the orchestral media to separate it into individual audio/video data and neodata; playing the main audio/video (normally, the first track among the multiple tracks can be the main audio/video) on a media output device (e.g., a DTV) connected with the actuator by performing synchronization, and transferring the other audio/video tracks, except the main audio/video, to the active devices around the user to play them synchronously with the main audio/video; analyzing the neodata and mapping the effect data inside the neodata into control commands to activate the connected passive devices; and transferring the mapped control commands to the passive devices and each audio/video, except the main audio/video, to the active devices.
The objects and features of the present invention will become apparent from the following description of embodiments given in conjunction with the accompanying drawings, in which:
Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings which form a part hereof.
Referring to
A client engine 104 of the actuator 102 analyzes the transferred orchestral media to make the multiple audio/videos playable on the respective active devices, and transfers the corresponding media and the neodata separated from the orchestral media to the active and passive devices around the user, respectively connected through interfaces 108 (serial port, USB port, LAN/WLAN port, audio out port, video out port) via a communication interface, which is an Application Program Interface (API) 106. In practice, the active devices use the WLAN/LAN interface to receive the multiple audio/video tracks, and the passive devices use the serial port, USB port, audio out port, video out port and the like.
Specifically, the control data transferred through the control interface (e.g., serial port 110) is transferred to a ZigBee coordinator 122 through the ZigBee wireless network 120. The ZigBee coordinator 122 transfers the control data to the heater 124, fan 126, scent generator 128 and other devices.
Further, the serial port 110 can be used to transfer the control data to a lighting device 130 such as a dimmer, light, color light and the like connected by a control interface (e.g., an RS-485 serial communication interface), and to a blind, curtain 132 and the like connected by a control interface (e.g., an RS-232 serial communication interface). A USB port 112 can be used to transfer the control data to a flash 134 connected by a control interface (e.g., a USB communication interface). A LAN/WLAN port 114 transfers each audio/video to appropriate active devices 136 linked by LAN/WLAN communication, such as a computer, a cellular phone, an Ultra Mobile PC (UMPC), a Personal Digital Assistant (PDA) and the like.
An electromechanical device such as a vibration chair 138 can be connected to the control interface (e.g., the audio out port 116 through an audio cable), and a digital television 140 is connected to the control interface (e.g., the video out port 118 through a High-Definition Multimedia Interface (HDMI) cable) to transfer the media data to the corresponding devices.
An active device and a passive device used in the orchestral media service system may be a home appliance generally used in a home network, or built-in equipment, for example, a fog machine, a soap bubble generator and the like, used to play a specialized effect.
Referring to
Then, the main audio and video selected from the multiple audio/video tracks are delivered to the rendering process in step 208 and then played in the A/V player inside the actuator 102. The passive device that receives the control data is activated simultaneously in step 210.
Referring to
The media consisting of several tracks is transferred to the actuator 102 and goes through the parsing process so that the tracks can be transferred to the respective active devices. Each active device and the actuator 102 continuously synchronize with each other. Assuming that time is synchronized, an event channel 300 is shared and, if a control command is generated in the event channel 300, the control command is registered in an event queue 302. The event control command registered in the event queue 302 is dispatched to the corresponding active devices 306 and 308 by the event dispatcher 304, and the active devices 306 and 308 execute the event.
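The event-channel flow above can be sketched as follows. This is a minimal illustration, not the actual implementation: all class and method names are hypothetical, and real devices would receive commands over the network rather than by direct method call.

```python
import queue


class ActiveDevice:
    """Hypothetical stand-in for an active device (306, 308) that executes events."""

    def __init__(self, name):
        self.name = name
        self.executed = []  # record of executed event commands

    def execute(self, command):
        self.executed.append(command)


class EventDispatcher:
    """Sketch of the shared event channel: commands generated in the channel
    are registered in an event queue and dispatched to the active devices."""

    def __init__(self):
        self.event_queue = queue.Queue()  # the event queue (302)
        self.devices = []                 # connected active devices

    def register_device(self, device):
        self.devices.append(device)

    def post(self, command):
        # A control command generated in the event channel is queued.
        self.event_queue.put(command)

    def dispatch_all(self):
        # The event dispatcher (304) delivers each queued command to the
        # connected active devices, which then execute the event.
        while not self.event_queue.empty():
            command = self.event_queue.get()
            for device in self.devices:
                device.execute(command)


# Usage: two active devices share the event channel.
d1, d2 = ActiveDevice("PDA"), ActiveDevice("UMPC")
dispatcher = EventDispatcher()
dispatcher.register_device(d1)
dispatcher.register_device(d2)
dispatcher.post("PLAY track-2 at t=9s")
dispatcher.dispatch_all()
```

The queue decouples command generation from execution, so commands produced at parse time can be released to the devices once time synchronization is assumed.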
Referring to
Specifically, the orchestral media from the orchestral media service provider 100 is transferred to the main controller 404 of the client engine 104 through the transfer engine 402. The main controller 404 manages the total time to play the orchestral media and parses the orchestral media to separate it into individual audio/video tracks and neodata, transferring the separated data to the A/V player module 406 and the parser module 408. The A/V player module 406 synchronizes and plays the audio/video data transferred from the main controller 404. The parser module 408 analyzes the neodata transferred from the main controller 404 and maps the neodata into control commands to be transferred to the respective connected passive devices.
The synchronization module 410 receives the control commands and synchronization information from the parser module 408 and synchronizes with the active and passive devices to which the control commands are to be transferred. In the synchronized state, the synchronization module 410 transfers the mapped control commands to the device controller 412, and the device controller 412 confirms the passive devices 418 connected through the communication API 106. Then, the device controller 412 selects, among the passive devices, those capable of implementing the effect specified in the transferred mapped control command, and transfers the executable control command to the selected passive devices.
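The device controller's selection step can be illustrated with a short sketch. The data shapes below (a capability set per device, an effect name per command) are assumptions for illustration only; the patent does not specify how capabilities are represented.

```python
def select_capable_devices(connected, command):
    """Sketch of the device controller (412) step: among the connected
    passive devices, select those capable of implementing the effect
    named in the mapped control command."""
    effect = command["effect"]
    return [d for d in connected if effect in d["capabilities"]]


# Hypothetical registry of passive devices confirmed via the communication API.
connected_passive = [
    {"id": 1005, "type": "Electronic Fan",  "capabilities": {"wind"}},
    {"id": 1010, "type": "Dimmer",          "capabilities": {"light"}},
    {"id": 1020, "type": "Scent Generator", "capabilities": {"scent"}},
]

# A mapped control command for a wind effect starting at 9 seconds.
command = {"effect": "wind", "value": "ON", "start": 9.0}
targets = select_capable_devices(connected_passive, command)
```

Only the devices returned by the selection receive the command, so an effect with no matching device is simply dropped rather than sent to hardware that cannot execute it.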
Further, the multi-track sender 608 of the A/V player module 406 transfers each audio/video separated from the orchestral media, except the main audio/video, to the active devices around the user, which will be described in
Hereinafter, each block will be described in detail with reference to the following drawings.
Referring to
Referring to
The A/V buffer 600 stores the audio/video tracks parsed by the media parser 502 and then transferred from the A/V controller 504 of the main controller 404. The audio sync 602 performs synchronization of the audio/video stored in the buffer. The A/V renderer 604 renders the synchronized audio/video into one resource. The H/W decoder 606 performs decoding to output the rendered resource in hardware. The multi-track sender 608 is responsible for transferring the audio/videos of the other tracks to the active devices connected with the actuator 102 through wired or wireless interfaces.
Referring to
Since the neodata stored in the parsing table 700 includes only effect information about the audio/video transferred together with it, the neodata analyzer 702 analyzes the neodata stored in the parsing table 700 to convert the effect data into control commands. The neodata analyzer 702 analyzes the effect information included in the neodata to confirm the data structure included in the effect information. In the neodata mapper 704, the neodata whose effect information has been analyzed in the neodata analyzer 702 undergoes a mapping process that transforms the data structure so that it is tied to the device actually connected with the actuator 102 and is appropriate for executing the effect information on the corresponding device.
An example of mapping the neodata is as follows. For example, consider the neodata of a wind-blowing scene having the data structure 706 shown in
In order to play the above effect on a device at home, the neodata mapper 704 transforms it into the control information <Electronic Fan, 1005, IR, 9s, 3 step control code, ON> and transfers the result to the synchronization module 410, since the neodata can be represented with a device type, device identification number, connection interface, execution time, control type, control value and the like, as shown in
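The wind-blowing example above can be sketched in code. The control-information fields follow the tuple given in the text (device type, identification number, connection interface, execution time, control type, control value); the lookup table and function names are assumptions made for illustration.

```python
from collections import namedtuple

# Control-information structure following the tuple in the text:
# <device type, device id, connection interface, execution time,
#  control type, control value>.
ControlInfo = namedtuple(
    "ControlInfo",
    "device_type device_id interface exec_time control_type value")

# Assumed mapping from an abstract effect type in the neodata to the
# device actually connected with the actuator.
DEVICE_TABLE = {
    "wind": ("Electronic Fan", 1005, "IR", "3 step control code"),
}

def map_neodata(effect):
    """Sketch of the neodata mapper (704): transform an analyzed effect
    entry into control information appropriate for the connected device."""
    device_type, device_id, interface, control_type = DEVICE_TABLE[effect["type"]]
    return ControlInfo(device_type, device_id, interface,
                       effect["time"], control_type, effect["value"])

# A wind-blowing effect at 9 seconds, as in the example above.
info = map_neodata({"type": "wind", "time": "9s", "value": "ON"})
```

The key point is that the neodata itself stays device-independent; only the mapper knows which concrete device, interface, and control code realize each abstract effect.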
Referring to
The sync timer checker 802 continuously checks synchronization among the connected devices, for example, the active devices, according to the time of the main clock manager 500. If there is an active device that is not synchronized, a synchronization set command is transferred to the unsynchronized active device. The sync table updater 804 is responsible for correcting the control information so that each device executes early enough to account for its actual execution time. In the sync table updater 804, Equation 1 is used to calculate the actual execution time. The actual execution time Ei of each device is calculated by subtracting the activation time Δt(di) of each device and the network delay time Δt(ni) from the start time Ti of each device.
Ei = Ti − Δt(di) − Δt(ni)   [Equation 1]
The passive device uses hardware and may have only a small error range, e.g., 40 μs or less. However, active devices such as computers and PDAs, which internally schedule processes with their own CPUs, have irregular execution times for the respective processes. Therefore, an error can occur in the activation time even if the control command from the actuator 102 is transferred instantly. Further, since current wired/wireless communication interfaces are not protocols that guarantee real-time characteristics, the resulting delay also needs to be considered. When calculating the device activation time, the sync table updater 804 distinguishes whether the device is of the active type or the passive type. The activation time Δt(di) of each active or passive device can be obtained by using the following Equation 2.
The sender processing delay (SPD) is the delay generated by the command processing time on the actuator 102 side, and the sender media access delay (SMAD) is the time taken to read media on the actuator 102 side. The receiver processing delay (RPD) is the processing delay of the active device receiving the audio/video, and the receiver media access delay (RMAD) is the time used to play the audio/video on the player of the active device.
The network delay time Δt(ni) for a passive device can be set to 0 since the passive device uses hardware, while the network delay time Δt(ni) for an active device is obtained from the delay value produced when transferring data through wired/wireless communication.
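Equation 1 can be checked numerically with a short sketch. The composition of the activation delay for an active device as a plain sum of the four delays named above (SPD + SMAD + RPD + RMAD) is an assumption made here for illustration, since Equation 2 is not reproduced in this passage; all numeric values are invented examples.

```python
def actual_execution_time(start_time, activation_delay, network_delay):
    """Equation 1: Ei = Ti - dt(di) - dt(ni).
    The command must be issued early by the device's activation and
    network delays so the effect lands on the intended start time."""
    return start_time - activation_delay - network_delay

def activation_delay_active(spd, smad, rpd, rmad):
    # Assumed composition of the four delays named in the text:
    # sender processing + sender media access + receiver processing
    # + receiver media access delay.
    return spd + smad + rpd + rmad

# Passive device: hardware-driven, so the network delay is taken as 0
# and the activation error is tiny (e.g., 40 microseconds).
e_passive = actual_execution_time(9.0, 0.00004, 0.0)

# Active device (e.g., a PDA receiving an audio/video track), with
# example delays in seconds and a 15 ms network delay.
dt_active = activation_delay_active(0.02, 0.05, 0.03, 0.10)
e_active = actual_execution_time(9.0, dt_active, 0.015)
```

In both cases the corrected time Ei is earlier than the nominal start time Ti, which is exactly the "execute ahead" correction the sync table updater applies.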
The device control interface 806 is connected with the device controller 412 shown in
Referring to
Referring to
Referring to
The media parser 502 parses the orchestral media to separate each audio/video and the neodata in step 1104. The parsed audio/videos are transferred to the A/V player module 406. In step 1106, the A/V player module 406 synchronizes the audio/video and renders the audio/video data through the rendering and decoding processes. When there are multiple audio/videos in one orchestral media, the parser divides them into individual parts, and the multi-track sender sends each separated track to an active device. To determine an active device, the actuator must know the capacity of the active device. When an active device receives a separated audio/video, it plays the audio/video synchronously with the main audio/video.
In step 1108, the neodata is sent to the parser module 408, where the neodata is analyzed and mapped, converting it into control commands executable on the corresponding devices. Then, in step 1110, the device controller 412 receives the mapped control commands from the parser module 408 and sends them to the passive devices to activate the effect devices, while the A/V player module 406 plays the main audio/video on an output device such as a television and transfers the other audio/videos separated from the orchestral media to the corresponding active devices to play them synchronously with the main audio/video. After this step, the main audio/video, the other audio/videos, and the effect data play individually on different devices; with the help of the synchronization process, each device contributes to a harmonized whole. Namely, although they play apart, they remain synchronized.
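The overall distribution flow can be summarized in one sketch: the first track plays on the actuator, the remaining tracks go to the active devices, and neodata effects go to capable passive devices. The dict-based media structure and all names below are illustrative assumptions, not the format used by the actual system.

```python
def play_orchestral_media(media, active_devices, passive_devices):
    """End-to-end sketch of the playback flow: main track on the actuator,
    other tracks to active devices, effects to capable passive devices."""
    # The first track among the multiple tracks is the main audio/video.
    main_track, *other_tracks = media["tracks"]

    log = []
    # The actuator plays the main audio/video on its output device (e.g., DTV).
    log.append(("actuator", "play", main_track))

    # Remaining tracks are distributed to the surrounding active devices,
    # which play them synchronously with the main audio/video.
    for device, track in zip(active_devices, other_tracks):
        log.append((device, "play", track))

    # Neodata effects are mapped to control commands for the passive
    # devices capable of implementing them.
    for effect in media["neodata"]:
        for device in passive_devices:
            if effect["type"] in device["capabilities"]:
                log.append((device["name"], "control", effect["value"]))
    return log


# Usage: a three-track media item with one wind effect.
media = {
    "tracks": ["main-av", "left-av", "right-av"],
    "neodata": [{"type": "wind", "value": "ON"}],
}
log = play_orchestral_media(
    media,
    active_devices=["PDA", "UMPC"],
    passive_devices=[{"name": "fan", "capabilities": {"wind"}}],
)
```

Each entry in the resulting log corresponds to one device playing its own part; the synchronization machinery described earlier is what keeps these independent playbacks aligned in time.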
The described orchestral media service system plays multiple audio/videos by using several active devices and activates multiple passive devices to produce effects beyond what is possible when one media item is played on one device, thereby increasing the applicability of media. For example, it may be used for 3D media playback in a home media service (e.g., one orchestral media for a car advertisement contains three audio/video tracks: the first track contains a front shot of the car, the second a left-side shot, and the third a right-side shot; these tracks play together and can give a 3D effect to users), or for a dome-shaped (360-degree view) theater by attaching many small media outputs in series, if a greater number of audio/video tracks and active devices are used and the playback method is adjusted.
As described above, the present invention, which embodies the playing of media including multiple audio/videos synchronized with multiple active and passive devices through a wired/wireless network, transfers media including multiple tracks to multiple active devices through the wired/wireless network, plays the different audio/videos included in the media on the multiple active devices, and controls the passive devices in synchronization with a main audio/video played in an actuator.
While the invention has been shown and described with respect to the embodiments, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the scope of the invention as defined in the following claims.
Number | Date | Country | Kind |
---|---|---|---|
10-2008-0105763 | Oct 2008 | KR | national |