The present invention relates to a data transmitting apparatus, a data receiving apparatus, a data transmitting method, a data receiving method, and an audio-visual environment controlling method for transmitting/receiving data that makes various peripheral devices operate in conjunction with video/sound content, in order to realize content viewing/listening with a highly realistic sensation.
In recent years, methods have been proposed that realize content viewing/listening with a highly realistic sensation by making various peripheral devices in an audio-visual space, such as lighting, a fan, and an air conditioner, work in conjunction with video/sound content. For example, Patent documents 1 and 2 disclose a lighting controlling method for controlling audio-visual environment lighting in conjunction with an image displayed on an image display device to thereby obtain a highly realistic sensation.
However, in the conventional method, when the specification of a user's lighting device does not strictly support the lighting conditions (such as the brightness and color temperature of the lighting, for example) intended by a producer of the content, it has been difficult for the user to judge whether or not to turn the lighting on. Moreover, the user needs to adjust the execution of an additional effect, such as lighting, that brings a highly realistic sensation, and has to set the operations of peripheral devices for each viewing/listening, for reasons such as consideration for others depending on the time, place, and situation of viewing/listening.
The present invention has been made to solve the above problems and an object thereof is to provide a data transmitting apparatus, a data receiving apparatus, a data transmitting method, a data receiving method, and an audio-visual environment controlling method that control peripheral devices in accordance with an intention of a content producer to realize viewing/listening of a content with a highly realistic sensation.
In order to solve the above problems, a first technical means of the present invention is a data transmitting apparatus comprising a transmitting portion for multiplexing and transmitting video data and/or sound data together with either audio-visual environment control data for one or more peripheral devices in an audio-visual environment space where the video data and/or the sound data is reproduced, or identification information for identifying the audio-visual environment control data, wherein the audio-visual environment control data includes a control designation value for the peripheral device and producer preference information indicating an error permission range for the control designation value.
A second technical means of the present invention is the data transmitting apparatus as defined in the first technical means, wherein the producer preference information includes permission type information for allowing the peripheral device to be controlled only with the control designation value.
A third technical means of the present invention is the data transmitting apparatus as defined in the first technical means, wherein the producer preference information includes permission type information for allowing the peripheral device to be controlled using a value under the control designation value.
A fourth technical means of the present invention is the data transmitting apparatus as defined in the first technical means, wherein the producer preference information includes permission type information for allowing the peripheral device to be controlled using a value over the control designation value.
A fifth technical means of the present invention is the data transmitting apparatus as defined in the first technical means, wherein the producer preference information includes permission type information for allowing the peripheral device to be controlled using a value within a predetermined range including the control designation value.
A sixth technical means of the present invention is the data transmitting apparatus as defined in the third technical means, wherein the producer preference information includes proximity permission range information indicating, as a ratio to the control designation value, a range within which the value is allowed.
A seventh technical means of the present invention is the data transmitting apparatus as defined in the fourth technical means, wherein the producer preference information includes proximity permission range information indicating, as a ratio to the control designation value, a range within which the value is allowed.
An eighth technical means of the present invention is the data transmitting apparatus as defined in the fifth technical means, wherein the producer preference information includes proximity permission range information indicating, as a ratio to the control designation value, a range within which the value is allowed.
A ninth technical means of the present invention is the data transmitting apparatus as defined in the first technical means, wherein the control designation value is represented by a quantized value.
A tenth technical means of the present invention is the data transmitting apparatus as defined in the first technical means, wherein the audio-visual environment control data is lighting control data for one or more lighting devices in the audio-visual environment space.
An eleventh technical means of the present invention is a data receiving apparatus comprising a receiving portion for receiving multiplexed data including video data and/or sound data together with either audio-visual environment control data for one or more peripheral devices in an audio-visual environment space where the video data and/or the sound data is reproduced, or identification information for identifying the audio-visual environment control data, wherein the audio-visual environment control data includes a control designation value for the peripheral device and producer preference information indicating an error permission range for the control designation value.
A twelfth technical means of the present invention is the data receiving apparatus as defined in the eleventh technical means, wherein the producer preference information includes permission type information for allowing the peripheral device to be controlled only with the control designation value.
A thirteenth technical means of the present invention is the data receiving apparatus as defined in the eleventh technical means, wherein the producer preference information includes permission type information for allowing the peripheral device to be controlled using a value under the control designation value.
A fourteenth technical means of the present invention is the data receiving apparatus as defined in the eleventh technical means, wherein the producer preference information includes permission type information for allowing the peripheral device to be controlled using a value over the control designation value.
A fifteenth technical means of the present invention is the data receiving apparatus as defined in the eleventh technical means, wherein the producer preference information includes permission type information for allowing the peripheral device to be controlled using a value within a predetermined range including the control designation value.
A sixteenth technical means of the present invention is the data receiving apparatus as defined in the thirteenth technical means, wherein the producer preference information includes proximity permission range information indicating, as a ratio to the control designation value, a range within which the value is allowed.
A seventeenth technical means of the present invention is the data receiving apparatus as defined in the fourteenth technical means, wherein the producer preference information includes proximity permission range information indicating, as a ratio to the control designation value, a range within which the value is allowed.
An eighteenth technical means of the present invention is the data receiving apparatus as defined in the fifteenth technical means, wherein the producer preference information includes proximity permission range information indicating, as a ratio to the control designation value, a range within which the value is allowed.
A nineteenth technical means of the present invention is the data receiving apparatus as defined in the eleventh technical means, wherein the control designation value is represented by a quantized value.
A twentieth technical means of the present invention is the data receiving apparatus as defined in the eleventh technical means, wherein the audio-visual environment control data is lighting control data for one or more lighting devices in the audio-visual environment space.
A twenty-first technical means of the present invention is a data transmitting method for multiplexing and transmitting video data and/or sound data together with either audio-visual environment control data for one or more peripheral devices in an audio-visual environment space where the video data and/or the sound data is reproduced, or identification information for identifying the audio-visual environment control data, wherein the audio-visual environment control data includes a control designation value for the peripheral device and producer preference information indicating an error permission range for the control designation value.
A twenty-second technical means of the present invention is a data receiving method for receiving multiplexed data including video data and/or sound data together with either audio-visual environment control data for one or more peripheral devices in an audio-visual environment space where the video data and/or the sound data is reproduced, or identification information for identifying the audio-visual environment control data, wherein the audio-visual environment control data includes a control designation value for the peripheral device and producer preference information indicating an error permission range for the control designation value.
A twenty-third technical means of the present invention is an audio-visual environment controlling method for receiving multiplexed data including video data and/or sound data together with either audio-visual environment control data for one or more peripheral devices in an audio-visual environment space where the video data and/or the sound data is reproduced, or identification information for identifying the audio-visual environment control data, wherein the audio-visual environment control data includes a control designation value for the peripheral device and producer preference information indicating an error permission range for the control designation value.
According to the present invention, by describing the preference of a content producer or a content provider as audio-visual environment control data corresponding to an audio-visual content, it is possible to appropriately control a peripheral device in the content audio-visual environment in accordance with the intention of the content producer or the content provider, and to realize content viewing/listening with a highly realistic sensation. Moreover, by setting execution conditions for audio-visual environment control depending on a user preference and an audio-visual situation, a user (viewer) is able to realize audio-visual environment control suited to the audio-visual situation.
Embodiments of the present invention will hereinafter be described in detail with reference to drawings.
First, a data transmitting apparatus will be described.
The data transmitting apparatus is comprised of a video coding portion 101, a sound coding portion 102, an audio-visual environment control data coding portion 103, an audio-visual environment control data input portion 104, a data multiplexing portion 105, and a transmitting portion 106.
Input video data is compressed and coded by the video coding portion 101 and output to the data multiplexing portion 105. Various compression methods are usable for the video coding, including ISO/IEC 13818-2 (MPEG-2 Video), ISO/IEC 14496-2 (MPEG-4 Visual), ISO/IEC 14496-10 (MPEG-4 AVC), and the like.
Similarly, input sound data is compressed and coded by the sound coding portion 102 and output to the data multiplexing portion 105. Various compression methods are usable for the sound coding, including ISO/IEC 13818-7 (MPEG-2 AAC), ISO/IEC 14496-3 (MPEG-4 Audio), and the like.
Moreover, audio-visual environment control data input by the audio-visual environment control data input portion 104 is compressed and coded by the audio-visual environment control data coding portion 103 and output to the data multiplexing portion 105. Note that, the audio-visual environment control data will be described below in detail. For example, the XML (Extensible Markup Language) format and the like are used as a description method of the audio-visual environment control data. In addition, as a compression method of the audio-visual environment control data, the BiM (Binary format for MPEG-7) format in ISO/IEC 15938-1 (MPEG-7 Systems) and the like are usable. Alternatively, the audio-visual environment control data may be output in the XML format as-is, without compression.
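As a concrete illustration of such an XML description, the sketch below parses a small, hypothetical control data document. The element and attribute names (`ControlDescription`, `Light`, `ControlData`, `preferenceType`, and so on) are assumptions made for this example only; the actual schema is not specified here.

```python
import xml.etree.ElementTree as ET

# Hypothetical audio-visual environment control data for one lighting
# device; the schema below is illustrative, not normative.
CONTROL_DATA_XML = """
<ControlDescription>
  <Light id="1">
    <ControlData brightness="200" colorTemperature="3000"
                 preferenceType="Under" preferenceRange="10"/>
  </Light>
</ControlDescription>
"""

def parse_control_data(xml_text):
    """Parse lighting control data into a list of dictionaries."""
    root = ET.fromstring(xml_text)
    entries = []
    for light in root.findall("Light"):
        for cd in light.findall("ControlData"):
            entries.append({
                "light_id": light.get("id"),
                "brightness_lx": int(cd.get("brightness")),
                "color_temperature_k": int(cd.get("colorTemperature")),
                "preference_type": cd.get("preferenceType"),
                "preference_range_pct": int(cd.get("preferenceRange")),
            })
    return entries

entries = parse_control_data(CONTROL_DATA_XML)
```

A parsed entry bundles the control designation values (brightness and color temperature) with the producer preference, which is the combination the device control side works with.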
The video data, the sound data, and the audio-visual environment control data that have been coded are multiplexed by the data multiplexing portion 105 and transmitted or accumulated through the transmitting portion 106. For example, a transport stream packet (TSP) prescribed in ISO/IEC 13818-1 (MPEG-2 Systems), an IP packet, an RTP packet, and the like are usable as a multiplexing method.
For example, when the transport stream packet (TSP) prescribed in MPEG-2 is used, it is possible that, subsequent to a header in which information prescribed in MPEG-2 is described, the audio-visual environment control data is described in an extended header portion, and the video data and the sound data are further sent in a payload subsequent to the extended header. Alternatively, the audio-visual environment control data may be sent in a payload, in the same manner as the video data and the sound data. Moreover, the video data, the sound data, and the audio-visual environment control data may be sent as different data streams that are multiplexed together.
Although it is configured in the above such that the video, sound, and audio-visual environment control data are multiplexed and transmitted, it may be configured such that the video/sound multiplexed data and the audio-visual environment control data are transmitted through different networks. For example, there is a case where the video/sound multiplexed data is transmitted as broadcasting data and the audio-visual environment control data is transmitted over the Internet.
Audio-visual environment control data coded by the audio-visual environment control data coding portion 103 is accumulated from an audio-visual environment control data transmitting portion 1401, through an audio-visual environment control data server 1402, in an accumulating portion 1403. Note that, the audio-visual environment control data transmitting portion 1401 notifies the data multiplexing portion 105 of a URL (Uniform Resource Locator) for identifying the audio-visual environment control data, and the URL is multiplexed with the video/sound data and transmitted together with it. The data receiving apparatus side is thus able to acquire the audio-visual environment control data from the audio-visual environment control data server 1402 based on the URL as identification information, which makes it possible to link the video/sound multiplexed data and the audio-visual environment control data. The transmission of the audio-visual environment control data may be carried out at the same time as the transmission of the video/sound multiplexed data, or may be carried out upon reception of a request from outside (a user, etc.).
Note that, when the audio-visual environment control data is transmitted through a network different from the network where the video/sound multiplexed data is transmitted, the identification information for associating the audio-visual environment control data with the video/sound multiplexed data is not limited to the URL described above; any identification information that can specify the correspondence between the video/sound multiplexed data and the audio-visual environment control data, such as a CRID (Content Reference ID) in the TV-Anytime specification or a content name, may be used.
Alternatively, only the audio-visual environment control data may be recorded in another recording medium for distribution. For example, there is a case where the video/sound data is distributed by a large capacity recording medium such as a Blu-ray Disc and a DVD, and the audio-visual environment control data is distributed by a small-sized semiconductor recording medium or the like. In this case, when a plurality of contents are recorded for distribution, identification information which can specify a corresponding relation between the video/sound data and the audio-visual environment control data is also necessary.
Next, the audio-visual environment control data input portion 104 will be described.
For example, in the case of a lighting effect, as shown in
Here, illumination light makes it possible to produce an atmosphere and a realistic sensation for each scene of the video data, so that lighting conditions for a lighting device serve as useful information.
For example, controlling audio-visual environment lighting makes it possible to improve the realistic sensation for each scene of contemporary and samurai dramas. That is, for the lighting devices generally used in contemporary dramas, the color temperature is about 5000 K for fluorescent lamps (daylight white color), about 6700 K for fluorescent lamps (daylight color), and about 2800 K for incandescent lamps. On the other hand, the color temperature is about 1800 to 2500 K for candlelight, which is frequently used as a light source at night in samurai dramas. Moreover, the light intensity tends to be high in contemporary dramas and low in samurai dramas.
Therefore, when an indoor scene in the twilight is shown in a samurai drama, a realistic sensation is not obtained by viewing/listening under an environment where the color temperature of the lighting is high and the light intensity is high, and such an environment results in viewing/listening under an audio-visual environment contrary to the intention of the content producer or the content provider.
Accordingly, in order to appropriately control the peripheral devices in a content audio-visual environment in accordance with the intention of the producer at the time of viewing/listening of the video data and the sound data, it is desirable that the content producer or the content provider designates conditions for the various peripheral devices. The content producer or the content provider is able to set these conditions by actually viewing/listening the video data and the sound data and designating, for example, lighting conditions (brightness and color temperatures) for each scene. Alternatively, the average luminance and a dominant color in the entire screen or around the screen may be automatically extracted by analyzing the video data or the video coded data, and lighting conditions such as the brightness and color temperature of the lighting may be determined based on the extracted luminance information and color information. In addition, at this time, the producer preference, which will be described below, may be designated together.
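The automatic analysis mentioned above can be sketched as follows. A frame is modeled here as a plain list of (R, G, B) tuples in the 0-255 range, the average luminance uses the Rec. 601 luma weights, and the mapping from the analysis results to brightness and color temperature is a placeholder heuristic, not a mapping given by this description.

```python
def analyze_frame(pixels):
    """Return (average_luminance, average_color) for one frame of (R, G, B) tuples."""
    n = len(pixels)
    avg = tuple(sum(p[i] for p in pixels) / n for i in range(3))
    # Rec. 601 luma weights for the average luminance.
    luminance = 0.299 * avg[0] + 0.587 * avg[1] + 0.114 * avg[2]
    return luminance, avg

def derive_lighting_conditions(luminance, color):
    """Map analysis results to provisional lighting conditions.

    The scale factors below are assumptions for illustration only.
    """
    brightness_lx = round(luminance / 255 * 400)  # assumed 0-400 lx scale
    r, g, b = color
    # Warmer (reddish) frames map to a lower color temperature, cooler
    # (bluish) frames to a higher one; the 2000-6700 K span is assumed.
    warmth = r / (r + b) if (r + b) else 0.5
    color_temperature_k = round(6700 - warmth * 4700)
    return brightness_lx, color_temperature_k

# Example: a uniformly warm, orange-ish frame.
lum, color = analyze_frame([(255, 128, 0)] * 4)
print(derive_lighting_conditions(lum, color))
```

In practice the extracted values would then be stored as control designation values in the audio-visual environment control data, possibly after review by the producer.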
Note that, effect types illustrated in
Moreover, it is possible to add the audio-visual environment control data per frame, per shot, or per scene of the video data. At the least, the audio-visual environment control data may be added per scene; however, adding it per frame makes it possible to control the audio-visual environment more precisely. For example, the audio-visual environment control data may be added only to a specific frame (such as a scene switching frame) in accordance with an intention of a video producer (such as a scenario writer or a director).
Moreover, adding the audio-visual environment control data per shot makes it possible to realize appropriate control over audio-visual environment lighting, for example, even in a case where outdoor and indoor shots are included in the same scene. In addition, it may be configured such that the audio-visual environment control data is added per GOP (Group of Pictures), which is a unit of random access to video data, and the like.
At the preference input portion 202, whether to recommend strict reproduction of the lighting conditions (such as brightness and color temperature values) designated by the effect input portion 201, or to allow a permission range indicated by a predetermined threshold, is input as the preference (fondness) of a content producer or a content provider.
As examples of descriptive contents of preference, as shown in
Note that, the contents (values) input by the preference input portion 202 are referred to as producer preference or producer preference information. Thus, the audio-visual environment control data includes an effect type added to a content, the conditions thereof (values indicating the effects, which serve as control designation values), and the producer preference.
At the format portion 203, contents input by the effect input portion 201 and the preference input portion 202 are output according to a predetermined format.
In
In each of the Control Data, the brightness and color temperature of the lighting and the producer preference are described. For example, in the Control Data (2) of light 1, the brightness is 200 (lx) and the color temperature is 3000 (K), and as producer preference, a permission range of −10% is allowed; that is, a brightness of 180 to 200 (lx) and a color temperature of 2700 to 3000 (K) are permitted.
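The way a permission range follows from a control designation value, a permission type, and a proximity permission range given as a percentage can be sketched as below. The type labels ("Strict", "Under", "Over", "Both") are illustrative names for the four permission types described above, not identifiers from the data format.

```python
def permission_range(designation, pref_type, range_pct=0):
    """Return the (low, high) range of values permitted by producer preference."""
    delta = designation * range_pct / 100
    if pref_type == "Strict":   # only the control designation value itself
        return (designation, designation)
    if pref_type == "Under":    # values below, down to -range_pct
        return (designation - delta, designation)
    if pref_type == "Over":     # values above, up to +range_pct
        return (designation, designation + delta)
    if pref_type == "Both":     # values within +/- range_pct
        return (designation - delta, designation + delta)
    raise ValueError("unknown permission type: %s" % pref_type)

# Control Data (2) of light 1: brightness 200 lx and color temperature
# 3000 K with an "Under" 10% preference give 180-200 lx and 2700-3000 K.
print(permission_range(200, "Under", 10))
print(permission_range(3000, "Under", 10))
```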
In this manner, by adding to the audio-visual environment control data the effects added to a content, such as lighting, wind, and temperature, together with the preference of the content producer or the content provider with respect to those effects, it is possible to carry out control in accordance with the intention of the producer and provide content viewing/listening with a realistic sensation even in a case where the specification of a peripheral device in the audio-visual environment (the support ranges of brightness and color temperature in the case of a lighting device) does not satisfy the designated conditions.
Next, a data receiving apparatus will be described.
The data receiving apparatus is comprised of a receiving portion 701, a data separating portion 702, a video decoding portion 703, a sound decoding portion 704, an audio-visual environment control data decoding portion 705, a video reproducing portion 706, a sound reproducing portion 707, a device control portion 708, and a lighting device 709 as an example of devices to be controlled.
Multiplexed data (for example, an MPEG-2 transport stream) including video, sound, and audio-visual environment control data received by the receiving portion 701 is separated by the data separating portion 702 into video coded data, sound coded data, and audio-visual environment control data, each of which is output to the video decoding portion 703, the sound decoding portion 704, and the audio-visual environment control data decoding portion 705, respectively.
The video coded data is decoded by the video decoding portion 703 and reproduced by the video reproducing portion 706. The sound coded data is decoded by the sound decoding portion 704 and reproduced by the sound reproducing portion 707. In addition, the audio-visual environment control data is decoded by the audio-visual environment control data decoding portion 705 and output to the device control portion 708.
The device control portion 708 controls the lighting device 709 according to descriptive contents of the audio-visual environment control data that has been input. Note that, although only the lighting device 709 is described as a device to be controlled by the device control portion 708 in the present embodiment, a fan, an air conditioner, a vibration device, a scent generating device, and the like may also be objects to be controlled.
Although it is configured in the above such that data in which the video, sound, and audio-visual environment control data are multiplexed is received, it may be configured such that the video/sound multiplexed data and the audio-visual environment control data are received from separate networks. For example, there is a case where the video/sound multiplexed data is received as broadcasting data and the audio-visual environment control data is received from the Internet.
Note that, the data receiving apparatus illustrated in
An audio-visual environment control data receiving portion 1501 receives the audio-visual environment control data accumulated in the accumulating portion 1403 through the audio-visual environment control data server 1402, and outputs it to the audio-visual environment control data decoding portion 705. Note that, the audio-visual environment control data receiving portion 1501 acquires from the data separating portion 702, for example, the URL (Uniform Resource Locator) for identifying the audio-visual environment control data that has been multiplexed with the video/sound data as described above; based on the URL, it acquires the audio-visual environment control data from the audio-visual environment control data server 1402, thereby making it possible to link the video/sound multiplexed data and the audio-visual environment control data. The reception of the audio-visual environment control data may be carried out at the timing when the URL for identifying the audio-visual environment control data is acquired from the data separating portion 702, or may be carried out based on a user request.
Note that, the identification information for associating the audio-visual environment control data with the video/sound multiplexed data is not limited to the above-described URL as described before.
Moreover, only the audio-visual environment control data may be acquired from another recording medium. For example, there is a case where the video/sound data is acquired from a large capacity recording medium such as a Blu-ray Disc and a DVD and the audio-visual environment control data is acquired from a small-sized semiconductor recording medium such as a compact flash (registered trademark) and an SD card.
Next, the device control portion 708 will be described.
The device control portion 708 is comprised of an analyzing portion (parser) 801, an effect extracting portion 802, a preference extracting portion 803, a device capability acquiring portion 804, a control value determining portion 805, and a command issuing portion 806.
Audio-visual environment control data output by the audio-visual environment control data decoding portion 705 for controlling the lighting (
First, the analyzing portion 801 parses audio-visual environment control data output by the audio-visual environment control data decoding portion 705, and the effect extracting portion 802 acquires a designation value (such as brightness and a color temperature) of lighting conditions from the audio-visual environment control data (step S91). On the other hand, the device capability acquiring portion 804 acquires a value of device capability (support range) of the lighting device 709 (step S92).
Subsequently, the control value determining portion 805 compares the designation value of the lighting conditions with the support range of the lighting device (step S93). When the designation value falls within the support range, the flow goes to step S94, and the command issuing portion 806 issues a command for turning the lights on with the brightness and the color temperature designated by the designation value and finishes the processing.
When the designation value falls out of the support range at step S93, the flow goes to step S95, and the preference extracting portion 803 acquires the producer preference (permission type and permission range) from the audio-visual environment control data parsed by the analyzing portion 801. The flow then goes to step S96, and the control value determining portion 805 compares the permission range of the lighting conditions derived from the preference with the support range of the lighting device acquired at step S92. When support is possible within the permission range, the flow goes to step S98, and the control value determining portion 805 determines an approximate value closest to the designation value within the support range of the lighting device and informs the command issuing portion 806 of the approximate value. The command issuing portion 806 issues a command for turning the lighting on with the approximate value to the lighting device 709 (step S99).
Alternatively, when support is not possible even within the permission range derived from the preference at step S96, the flow goes to step S97, where the command issuing portion 806 does not issue a lighting command to the lighting device, and the processing is finished with the lighting turned off.
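The decision flow of steps S91 to S99 can be sketched as below. The lighting device's capability is modeled as a simple (min, max) support range, and the values used in the usage lines (a designation of 300 lx with a 270 to 300 lx permission range) are for illustration; this is a sketch of the logic, not the actual implementation.

```python
def decide_control_value(designation, support, permitted):
    """Return the value to turn the lighting on with, or None to leave it off.

    designation: value designated by the content producer (S91)
    support:     (min, max) support range of the lighting device (S92)
    permitted:   (low, high) permission range from producer preference (S95)
    """
    lo, hi = support
    if lo <= designation <= hi:          # S93 -> S94: use the designation value
        return designation
    p_lo, p_hi = permitted
    overlap_lo, overlap_hi = max(lo, p_lo), min(hi, p_hi)
    if overlap_lo > overlap_hi:          # S96 -> S97: no supported permitted value
        return None
    # S98: approximate value closest to the designation value within the
    # supported part of the permission range.
    return min(max(designation, overlap_lo), overlap_hi)

# Designation 300 lx, "Under" 10% preference (permission range 270-300 lx):
print(decide_control_value(300, (290, 340), (270, 300)))  # supported directly
print(decide_control_value(300, (230, 280), (270, 300)))  # approximate value
print(decide_control_value(300, (360, 410), (270, 300)))  # lighting stays off
```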
Here, description will be given with reference to
The designation values of lighting conditions by a content producer are such that brightness is 300 (lx) and preference is “Under” 10%, that is, the permission range is 270 to 300 (lx).
The lighting device A has the brightness support range of 290 to 340 (lx), so that the lighting is allowed to be turned on at the designation value of 300 (lx).
Moreover, the lighting device B has the brightness support range of 230 to 280 (lx), where the designation value of 300 (lx) is not supported, but a part (270 to 280 (lx)) within the permission range (270 to 300 (lx)) led by preference is supported, so that the lighting is allowed to be turned on, for example, at 280 (lx) closest to the designation value.
In addition, the lighting device C has the support range of 360 to 410 (lx), where the designation value of 300 (lx) is not supported. Further, support is not made even within the permission range led by preference, so that the lighting is turned off.
The lighting device 709 connected to the data receiving apparatus can be configured by LEDs that emit light of three primary colors, for example, RGB, with predetermined hues. However, the lighting device 709 may have any configuration capable of controlling the colors and brightness of the lighting in the surrounding environment of a video display device; it is not limited to the combination of LEDs emitting light of predetermined colors as described above, and may be configured by white LEDs and color filters, a combination of white lamps or fluorescent tubes and color filters, color lamps, or the like. Moreover, one or more of the lighting devices 709 may be arranged.
In addition, the command issuing portion 806 may be anything capable of generating RGB data corresponding to the designation values (such as brightness and a color temperature) of the lighting conditions received from the control value determining portion 805 and outputting the RGB data to the lighting device 709.
In this manner, when the lighting device does not satisfy designated lighting conditions, it is possible to carry out appropriate control in accordance with an intention of a producer by controlling ON/OFF of the lighting at an approximate value corresponding to preference.
The schematic configuration of the data transmitting apparatus in the present embodiment is similar to
Different from the embodiment 1, the configuration of
In this manner, by adding to the audio-visual environment control data the effects added to a content, such as lighting, wind, and temperature, together with the preference of the content producer or the content provider with respect to those effects, it is possible to carry out control in accordance with the intention of the producer and provide content viewing/listening with a realistic sensation even in a case where the specification of a peripheral device in the audio-visual environment (the support range of brightness and color temperature in the case of a lighting device) does not satisfy the designated condition. Further, adding producer preference collectively to an upper element, or only to a necessary element, makes it possible to avoid unnecessarily increasing the data amount and to reduce the burden on the side where producer preference is added.
The schematic configuration of the data receiving apparatus in the embodiment 2 is similar to
In the present embodiment, the operation for acquiring preference at step S95 in the flowchart illustrated in
When the designation value falls out of the support range at step S93 of
Alternatively, when the preference is not described in the corresponding element at step S131, the analyzing portion 801 judges whether or not an upper (parent) element of the corresponding element exists (step S132); in the case of YES, the flow moves to the upper (parent) element (step S134) and then returns to step S131.
Alternatively, when there is no upper element at step S132, the flow goes to step S133, where the control value determining portion 805 uses a preference determined in advance, and the flow then goes to step S96 of
In the example shown in
In this manner, when the lighting device does not satisfy the designated lighting conditions, appropriate control in accordance with the intention of the producer can be carried out by controlling ON/OFF of the lighting at an approximate value in accordance with the preference.
In the embodiments 1 and 2, examples of preference acting on a lighting control unit are illustrated; however, a second preference acting on the entire audio-visual environment control data may further be described. That is, a no-permission type flag may be described that selects, when a control condition of certain lighting is not satisfied, whether all lighting control in the content is not permitted (control of lighting whose condition is satisfied is also not permitted) or only control of the lighting whose condition is not satisfied is not permitted (control of lighting whose condition is satisfied is permitted).
Note that, the above-described no-permission type flag may be collectively transmitted/received together with each control data in the content, or may be transmitted/received prior to actual control data.
Alternatively, the flag may be described as an attribute or a child element of a root element of metadata (Effect Description, Device Control Description, and the like).
In this manner, when a lighting device does not satisfy a designated lighting condition, it is even possible to turn lighting control OFF for the entire content in accordance with the intention of the producer (the second preference). It is therefore possible to avoid the problem of lighting being inconsistent through the entire content, such as lighting being turned ON in one scene while turned OFF in another, which would rather impair the realistic sensation against the intention of the producer.
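The behavior of the no-permission type flag described above can be sketched as follows; the list/dictionary representation of lighting controls and their satisfied/unsatisfied states is an assumption for illustration only.

```python
# Sketch of the "no-permission type" flag: when any lighting control
# condition in a content is unsatisfied, the flag selects whether to
# disable all lighting control in the content or only the unsatisfied
# controls. The data representation here is illustrative.

def permitted_controls(controls, satisfied, disable_all_on_failure):
    # controls: list of control identifiers in the content
    # satisfied: mapping of identifier -> whether its condition is met
    if disable_all_on_failure and not all(satisfied[c] for c in controls):
        return []                      # one failure disables every control
    return [c for c in controls if satisfied[c]]

sat = {"scene1": True, "scene2": False}
print(permitted_controls(["scene1", "scene2"], sat, True))   # []
print(permitted_controls(["scene1", "scene2"], sat, False))  # ['scene1']
```

The first call models the "all control not permitted" choice that preserves consistency through the content; the second permits only the controls whose conditions are satisfied.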
In the embodiments 1 to 3, examples are shown where producer preference is applied to control designation values such as brightness (lx), a color temperature (K), a wind speed (m/s), and a temperature (°C). In the present embodiment, description will be given of an application example in which control designation values are quantized in accordance with a predetermined method.
Description will be given for a method of applying producer preference for quantized color temperature values (quantization index of color temperatures) with reference to drawings.
For the quantized color temperature values (quantization index of color temperatures), it is possible to apply producer preference (permission type and permission range) in accordance with the following equations;
Strict: Min(g(i)) to Max(g(i))
Under: Min(g(i))×(1−Range) to Max(g(i))
Over: Min(g(i)) to Max(g(i))×(1+Range)
Both: Min(g(i))×(1−Range) to Max(g(i))×(1+Range);
where
g( ): inverse quantization function
i: quantization index
Range: permission range in producer preference (%)
Min( ): minimum value
Max( ): maximum value.
Description will be given of the application of producer preference, taking as an example a case where a quantization index value of “01000000” (binary), that is, 2251 K or more and less than 2267 K, is designated as the control designation value.
When a permission type of producer preference is “Strict”, control at color temperature values indicated by a quantization index, that is, 2251 (K) or more and less than 2267 (K) is permitted.
When a permission type of producer preference is “Under” and a permission range is “10%”, control at −10% to ±0% from color temperature values indicated by a quantization index, that is, 2026 (K) or more and less than 2267 (K) is permitted.
When a permission type of producer preference is “Over” and a permission range is “10%”, control at ±0% to +10% from color temperature values indicated by a quantization index, that is, 2251 (K) or more and less than 2494 (K) is permitted.
When the permission type of producer preference is “Both” and the permission range is “10%”, control at −10% to +10% from the color temperature values indicated by the quantization index, that is, 2026 (K) or more and less than 2494 (K), is permitted.
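The permission equations above, and the worked examples for the index “01000000”, can be sketched as follows. The inverse quantization function g( ) is hypothetical and models only the worked example; the actual quantization method is left to the predetermined method mentioned above.

```python
# Sketch of applying producer preference (permission type and range) to a
# quantized color-temperature index. g() is a hypothetical inverse
# quantization function that maps index 0b01000000 to the interval
# [2251 K, 2267 K) used in the worked example.

def g(index):
    if index == 0b01000000:
        return (2251.0, 2267.0)
    raise NotImplementedError("only the worked-example index is modeled")

def permitted_range(index, permission_type, rng=0.0):
    lo, hi = g(index)
    if permission_type == "Strict":
        return (lo, hi)
    if permission_type == "Under":
        return (lo * (1 - rng), hi)
    if permission_type == "Over":
        return (lo, hi * (1 + rng))
    if permission_type == "Both":
        return (lo * (1 - rng), hi * (1 + rng))
    raise ValueError(permission_type)

# "Under" with a 10% range: approximately 2026 K to 2267 K, as in the text.
print(permitted_range(0b01000000, "Under", 0.10))
# "Both" with a 10% range: approximately 2026 K to 2494 K.
print(permitted_range(0b01000000, "Both", 0.10))
```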
Alternatively, a permission range of producer preference may be a ratio where the maximum quantization index value is taken as 100%. For example, the permission ranges of quantization index values in the case of n-bit quantization are, respectively,
Strict: ±0
Under: −Range×2^n to ±0
Over: ±0 to +Range×2^n
Both: −Range×2^n to +Range×2^n.
Alternatively, a permission range may be described not by a ratio but by a maximum difference value of quantization index values:
Strict: ±0
Under: −Range to ±0
Over: ±0 to +Range
Both: −Range to +Range
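The two index-domain variants above (a ratio of the maximum index value, or a direct index difference) can be sketched as follows; the function signature is an assumption for illustration.

```python
# Sketch of a permission window expressed in the quantization-index domain.
# When n_bits is given, rng is a fraction of the maximum index value 2**n;
# otherwise rng is a direct maximum difference of index values.

def index_window(index, permission_type, rng, n_bits=None):
    delta = rng * (2 ** n_bits) if n_bits is not None else rng
    if permission_type == "Strict":
        return (index, index)          # ±0: only the designated index
    if permission_type == "Under":
        return (index - delta, index)
    if permission_type == "Over":
        return (index, index + delta)
    if permission_type == "Both":
        return (index - delta, index + delta)
    raise ValueError(permission_type)

# 8-bit quantization, index 64, "Both" with a 10% ratio:
# approximately indices 38.4 to 89.6 (i.e. 64 ± 25.6).
print(index_window(64, "Both", 0.10, n_bits=8))
# Direct difference value of 3 indices:
print(index_window(64, "Under", 3))    # (61, 64)
```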
Note that, in the above-described configuration, producer preference is applied to color temperature values of lighting; however, when the conditions of lighting are designated by RGB values or the like, the preference may be applied to correlated color temperature values obtained in accordance with a predetermined conversion equation, or, in the case of luminance, to an approximate value obtained from the designated RGB values.
For example, it is possible to obtain a correlated color temperature T by converting the color system from the RGB color system to the XYZ color system, obtaining the chromaticity coordinates (x, y), and approximating them using a predetermined function f as follows.
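The conversion described above can be sketched as follows. The particular conversion matrix (linear sRGB primaries with a D65 white point) and the approximation function (McCamy's formula) are assumptions chosen for illustration; the text itself only requires a predetermined conversion equation and a predetermined function f.

```python
# Sketch of one possible RGB -> correlated color temperature conversion,
# assuming linear sRGB (D65) as the RGB color system and McCamy's
# approximation as the "predetermined function f".

def rgb_to_cct(r, g, b):
    # linear sRGB -> CIE XYZ (D65)
    x_ = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y_ = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z_ = 0.0193 * r + 0.1192 * g + 0.9505 * b
    s = x_ + y_ + z_
    x, y = x_ / s, y_ / s              # chromaticity coordinates (x, y)
    n = (x - 0.3320) / (0.1858 - y)    # McCamy's approximation
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# R = G = B recovers roughly the D65 white point (about 6500 K) under
# these assumptions.
print(round(rgb_to_cct(1.0, 1.0, 1.0)))
```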
In addition, the present invention is not limited to an application to a lighting device, but is applicable to various peripheral devices such as a fan and an air conditioner.
In the embodiments 1 to 4, the preference of a content producer or a content provider with respect to additional effects of a content (such as permission values relating to effect reproduction) and methods for utilizing it are described. In the present embodiment, the preference of a user who enjoys additional effects of a content, and the use of user preference for a content having producer preference, will be shown.
Here, user preference is a description of a user's tastes, which usually describes attribute information of the user, a genre and keywords of preferred contents, or a method for operating a device. The user preference is used for personalizing an AV device, for example, by extracting only a preferred genre from accumulated contents, or by enabling a series of frequently performed device operations to be executed with a single touch of a button. In the present embodiment, preference as to the operation of additional effects of a content is described in the user preference in addition to the preference information described above.
The additional effects of a content are, as described above, effects for improving the realistic sensation at the time of viewing/listening the content, including lighting control and control of a scent effect and a vibration effect. These additional effects are effective for producing a highly realistic sensation. However, under some audio-visual environment conditions, for example, when the user does not desire to execute an effect because a vibration effect would annoy other people during viewing/listening at midnight, it is necessary to take measures for every reproduction of a content, such as turning off the execution of each effect or not connecting the reproduction device to the devices to be controlled for producing the additional effects (for example, a lighting device or a vibration device). By having the device interpret the execution conditions described in the user preference data as the user's preference, users can save this trouble and enjoy a realistic sensation suited to their audio-visual environment and tastes.
In user preference for additional effects of a content, execution conditions for an intended additional effect are described.
As shown in
When a value of the field is FALSE, the additional effect is turned off. When it is TRUE, it indicates that there is a restriction on execution. The restriction is described in the restriction type field and the restriction value field. In the restriction type field, the description method of the restriction value field is designated with an identifier. Any identifier capable of identifying the description method of the restriction value field is sufficient; in the present embodiment, Range Of Rendering, which indicates a percentage representation with respect to a control designation value, and Difference Value, which indicates a difference value representation with respect to a control designation value, are used.
When the restriction type field is Range Of Rendering, the restriction value field is a percentage representation with respect to the control designation value; in the case of Difference Value, the restriction value field is a difference value with respect to the control designation value.
The availability of operation field need not take the values TRUE and FALSE; as long as it is possible to judge whether reproduction of the additional effect is off or restricted, a combination of 1 and 0 or the like may be used.
In
1603 denotes a control in which the effect intended field is lighting and the availability of operation field is TRUE, so a restriction is imposed on reproduction. Since the restriction type field is Difference Value, the restriction value field is a difference representation of the control designation value. Since the value in the restriction value field is −50, the lighting is reproduced at a value smaller than the control designation value designated in the content by 50 lx. At this time, the unit of the restriction value field is determined depending on each additional effect; for example, it is lux (lx) in the case of lighting, decibel (dB) in the case of audio, and so on.
In this manner, in the user preference for additional effects in the present invention, an intended effect and its operation are described. Note that an additional effect which is not described (restricted) in the user preference is executed exactly at the control designation value.
Description will be given below for a receiving apparatus with the use of user preference.
In the present embodiment, it is assumed that user preference including control descriptions for additional effects set by a user is accumulated in advance in the user preference managing portion 1701. As the user preference managing portion 1701, for example, a large-capacity recording medium represented by a hard disk or a Blu-ray Disc attached to the data receiving apparatus, or a small-sized semiconductor recording medium such as an SD card or a smart card, is usable.
Similarly to the data receiving apparatus illustrated in
The device control portion 1702 compares the acquired user preference with the analyzed descriptive contents of the audio-visual environment control data and determines the control availability and the control value of a device to be controlled. When it is determined that control is available, the determined control value is output, for performing control, to the lighting device 709 when the intended additional effect is lighting, or to the vibration device 1703 in the case of a vibration effect.
In the present embodiment, in the data receiving apparatus illustrated in
Here, description will be given for the device control portion 1702.
The device control portion 1702 comprises an analyzing portion (parser) 801, an effect extracting portion 802, a control value determining portion 1802, a command issuing portion 806, and a user preference acquiring portion 1801. Note that a common reference numeral is given to portions having functions similar to those of the device control portion 708 illustrated in
First, the analyzing portion 801 parses audio-visual environment control data output from the audio-visual environment control data decoding portion 705, and the effect extracting portion 802 acquires an additional effect attached to a content and a control designation value thereof (brightness, a color temperature, and intensity of vibration) among the audio-visual environment control data (step S1901).
Next, the flow goes to step S1902 and the user preference acquiring portion 1801 acquires user preference from the user preference managing portion 1701.
The control value determining portion 1802 judges whether the acquired additional effect is included in the entries of additional effects in the user preference description, that is, whether a restriction on the additional effect exists (step S1903).
When an entry for the intended additional effect does not exist in the user preference, the flow goes to step S1907, and since there is no description in the user preference, the command issuing portion 806 issues a command for turning ON the device to be controlled at the control designation value of the additional effect.
When entry of the intended additional effect exists in the user preference at step S1903, the flow goes to step S1904, and an availability of operation field in user preference of an additional effect to be reproduced is checked. Here, when the availability of operation field is FALSE (reproduction of the additional effect is not performed), the flow goes to step S1908, and the command issuing portion 806 does not issue a command because no control is performed over the additional effect.
When the availability of operation field is TRUE at step S1904, the flow goes to step S1905, and the control value determining portion 1802 calculates a control value for additional effect control. Here, the control value determining portion 1802, when a control value of additional effects is described in user preference, calculates a control value based on a control value field included in the user preference for the control designation value extracted at step S1901. Alternatively, when a control value for additional effects is not described in user preference, the control designation value extracted at step S1901 is used.
At step S1906, the command issuing portion 806 issues a command for turning on the device to be controlled at the control value (brightness and a color temperature, or intensity of vibration) calculated by the control value determining portion 1802 to the device to be controlled (the lighting device 709 or the vibration device 1703).
The device control portion 1702 performs the above-described processing to all the additional effects included in the content.
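The per-effect processing of steps S1901 to S1908 described above can be sketched as follows. The dictionary-based preference entries and field names are assumptions for illustration; the actual data format of the user preference is not fixed here.

```python
# Sketch of the per-effect control flow (steps S1901-S1908): look up the
# effect in the user preference, honor the availability of operation
# field, and compute a control value from the restriction type and value.

def control_command(effect, designated, user_pref):
    """Return ("ON", control_value), or None when no command is issued."""
    entry = user_pref.get(effect)
    if entry is None:                  # no entry in user preference (S1907):
        return ("ON", designated)      # control at the designated value
    if not entry["available"]:         # availability of operation FALSE:
        return None                    # no command is issued (S1908)
    rtype = entry.get("type")          # compute the control value (S1905)
    if rtype == "RangeOfRendering":    # percentage of the designated value
        value = designated * entry["value"] / 100.0
    elif rtype == "DifferenceValue":   # difference from the designated value
        value = designated + entry["value"]
    else:                              # no control value described
        value = designated
    return ("ON", value)               # issue the command (S1906)

# Lighting designated at 200 lx with a Difference Value restriction of -50,
# and a vibration effect whose reproduction is turned off:
pref = {"lighting": {"available": True, "type": "DifferenceValue",
                     "value": -50.0},
        "vibration": {"available": False}}
print(control_command("lighting", 200.0, pref))   # ('ON', 150.0)
print(control_command("vibration", 1.0, pref))    # None
print(control_command("scent", 0.5, pref))        # ('ON', 0.5)
```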
In
On the other hand, in
The user preference mentioned so far is applied whenever the user preference data exists. However, the preference for reproduction of additional effects sometimes changes depending on the audio-visual situation. For example, a user who has a family may desire to turn a vibration effect off so as not to annoy other people when viewing/listening a content including additional effects at midnight, and may desire to turn a lighting effect off when viewing/listening a content on a mobile device in a public place such as an office or a train. In order to realize this, audio-visual environment information can be introduced into the user preference restricting additional effects.
In
At least one of place information and time information may be present in the audio-visual environment information. For example, when place information does not exist, reproduction control of additional effects is performed in accordance with the time information in any place; when both exist, reproduction control of additional effects is performed at the designated time in the designated place.
In
In
For example, when the receiving apparatus has a GPS (Global Positioning System) reception function, the place may be specified from position information and map information received via GPS. Alternatively, the position of the receiving apparatus may be specified with the use of RFID, by installing RFID tags in the receiving apparatus and at the viewing/listening place (for example, at the entrance of each room of a house or inside a train car) and communicating between the receiving apparatus and the viewing/listening place by RFID.
Alternatively, the receiving apparatus may acquire position information from a mobile phone held by the user, an access point of a communication network such as a LAN (Local Area Network), or the like; any acquiring means may be used as long as the device control portion 1702 can specify the place information of the receiving apparatus. In addition, when a position information acquiring portion does not exist in the receiving apparatus, the device control portion 1702 may inquire of the user about the viewing/listening place at the time of viewing/listening a content including additional effects. The device control portion 1702 is thereby able to apply the user preference shown in
In addition, when two or more people having preferences view/listen a content having additional effects, only the additional effects common to the users' preferences may be restricted, or control of additional effects satisfying all the respective preferences may be performed.
For example, suppose that John and Mary view/listen a content together, John's preference describes turning off a lighting effect and a vibration effect of 50%, and Mary's preference describes a lighting effect of 70% and turning off a scent effect. When only common additional effects are restricted, only the lighting effect is restricted, and the vibration effect and the scent effect are not restricted.
On the other hand, in the case of executing so as to satisfy all the preferences, execution of the lighting effect, the vibration effect, and the scent effect is restricted.
Note that, although the method for restricting each effect is not particularly limited, in the case of the lighting effect, for example, the control value that executes the effect more fully (Mary's 70%) may be selected, or the more restrictive value (John's turning off) may be selected.
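The two multi-user policies described above can be sketched as follows, using the John and Mary example; the preference maps (effect name to restriction, with None meaning "turn off") are an illustrative representation only.

```python
# Sketch of the two multi-user policies: restricting only the additional
# effects common to every user's preference, versus restricting every
# effect that any user's preference mentions.

def restricted_effects(prefs, policy):
    keys = [set(p) for p in prefs]
    if policy == "common":
        return set.intersection(*keys)   # effects all users restrict
    if policy == "all":
        return set.union(*keys)          # effects any user restricts
    raise ValueError(policy)

john = {"lighting": None, "vibration": 50}   # lighting off, vibration 50%
mary = {"lighting": 70, "scent": None}       # lighting 70%, scent off
print(sorted(restricted_effects([john, mary], "common")))  # ['lighting']
print(sorted(restricted_effects([john, mary], "all")))
# ['lighting', 'scent', 'vibration']
```

How the common effect is then reconciled (the fuller 70% or the stricter turning-off) is a separate choice, as noted above.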
In addition, user information may be set by an individual as shown in
Further, in the user preference mentioned so far, all the controls for additional effects desired by a user have to be described. That is, when only a lighting effect is desired to be effective among several types of additional effects included in a content (a lighting effect, a vibration effect, a scent effect, etc.), description has to be made such that all the additional effects that may be included in the content, other than the lighting effect, are not executed. In order to avoid this, an additional effect whose execution is permitted may be described in the user preference, and filtering of additional effects may be performed.
Specifically, a description indicating execution permission is prepared and set in the user preference. In this example, the description takes either the value Allow or Deny. When the value of the description is Allow, only the described additional effects are permitted to be executed, and additional effects which are not described are not executed. When the value of the description is Deny, the described additional effects are not executed, and additional effects other than those described can be executed.
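The Allow/Deny filtering described above can be sketched as follows; the function and field names are assumptions for illustration.

```python
# Sketch of Allow/Deny filtering of additional effects: an Allow list
# permits only the described effects, a Deny list excludes them.

def effect_permitted(effect, filter_value, listed):
    if filter_value == "Allow":
        return effect in listed        # only described effects execute
    if filter_value == "Deny":
        return effect not in listed    # described effects do not execute
    raise ValueError(filter_value)

content_effects = ["lighting", "vibration", "scent"]
# A user who wants only the lighting effect lists just that one effect
# under Allow, rather than denying every other conceivable effect:
print([e for e in content_effects
       if effect_permitted(e, "Allow", {"lighting"})])   # ['lighting']
print([e for e in content_effects
       if effect_permitted(e, "Deny", {"vibration"})])
```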
In
Further, filtering of additional effects depending on the audio-visual environment may be realized by combining the filtering with audio-visual environment information as in
In the present embodiment, the reproduction state of individual additional effects is described in the user preference; however, description may be made not only for an additional effect but also for a specific parameter. For example, as for a lighting effect, control information may be described for each of brightness and luminance. In addition, when audio-visual environment control data relating to operations such as a blinking effect of lighting is included in a content, a description controlling the blinking interval and the like may be made in the user preference.
In this manner, since the reproduction state of additional effects is described in the user preference, a reproduction method of additional effects preferred by the user, and reproduction of additional effects suitable for the viewing/listening time and place, are possible without setting the operations of peripheral devices for every viewing/listening.
101 . . . video coding portion; 102 . . . sound coding portion; 103 . . . audio-visual environment control data coding portion; 104 . . . audio-visual environment control data input portion; 105 . . . data multiplexing portion; 106 . . . transmitting portion; 201 . . . effect input portion; 202 . . . preference input portion; 203 . . . format portion; 701 . . . receiving portion; 702 . . . data separating portion; 703 . . . video decoding portion; 704 . . . sound decoding portion; 705 . . . audio-visual environment control data decoding portion; 706 . . . video reproducing portion; 707 . . . sound reproducing portion; 708, 1702 . . . device control portion; 709 . . . lighting device; 801 . . . analyzing portion (parser); 802 . . . effect extracting portion; 803 . . . preference extracting portion; 804 . . . device capability acquiring portion; 805, 1802 . . . control value determining portion; 806 . . . command issuing portion; 1401 . . . audio-visual environment control data transmitting portion; 1402 . . . audio-visual environment control data server; 1403 . . . accumulating portion; 1501 . . . audio-visual environment control data receiving portion; 1701 . . . user preference managing portion; 1703 . . . vibration device; and 1801 . . . user preference acquiring portion.
| Number | Date | Country | Kind |
| --- | --- | --- | --- |
| 2008-183813 | Jul 2008 | JP | national |
| 2009-015429 | Jan 2009 | JP | national |
| Filing Document | Filing Date | Country | Kind | 371(c) Date |
| --- | --- | --- | --- | --- |
| PCT/JP2009/062736 | 7/14/2009 | WO | 00 | 3/7/2011 |