1. Field of the Invention
Exemplary embodiments of the present invention relate to a method and apparatus for providing metadata for a sensory effect, a computer-readable recording medium on which metadata for a sensory effect is recorded, and a method and apparatus for sensory reproduction.
2. Description of Related Art
In general, contents may be provided to a user through a computing device or an optical disk player capable of reproducing the contents. When the contents are recorded on an optical disk such as a CD, DVD or Blu-ray disk, the moving picture contents are reproduced through the computing device or optical disk player, and the reproduced contents may be displayed on a monitor or television connected to the computing device or optical disk player.
As the Moving Picture Experts Group (MPEG) standards have been extended from MPEG-1 to MPEG-21 via MPEG-2, MPEG-4 and MPEG-7, which relate to media technologies such as audio (including voice and sound) or video (including still pictures and moving pictures) contents, media concepts and multimedia processing technologies have been developed. MPEG-1 defines a format for storing audio and video, and MPEG-2 focuses on media transmission. MPEG-7 deals with metadata related to media, and MPEG-21 deals with a media distribution framework technology.
In order to further develop the contents reproduction technology, studies on sensory effects have recently been conducted to provide more realistic pictures to a user in reproduction of moving pictures. That is, studies on a peripheral sensory reproduction apparatus for representing a sensory effect and a signal processing system for controlling the peripheral sensory reproduction apparatus have been actively conducted so as to provide a sensory effect such as fog, wind, temperature, smell, light, lightning or movement of a chair according to contents.
An embodiment of the present invention is directed to a method and apparatus for providing metadata for a sensory effect, a computer-readable recording medium on which metadata for a sensory effect are recorded, and a method and apparatus for reproducing a sensory effect, which provide the sensory effect according to contents reproduction (consumption).
Another embodiment of the present invention is directed to a method and apparatus for providing metadata for a sensory effect, a computer-readable recording medium on which metadata for a sensory effect are recorded, and a method and apparatus for reproducing a sensory effect, which provide a color correction effect for contents.
Other objects and advantages of the present invention can be understood by the following description, and become apparent with reference to the embodiments of the present invention. Also, it is obvious to those skilled in the art to which the present invention pertains that the objects and advantages of the present invention can be realized by the means as claimed and combinations thereof.
In accordance with an embodiment of the present invention, a method for providing metadata for a sensory effect includes generating Sensory Effect Metadata (SEM) including sensory effect information on contents using binary representation syntax, and analyzing the SEM and transmitting the SEM to a sensory reproduction apparatus engine unit that generates control information on a sensory reproduction apparatus. In the method, the sensory effect information includes color correction effect information on the contents.
In accordance with another embodiment of the present invention, a method for providing metadata for a sensory effect includes generating User Sensory Preference (USP) metadata including consumer preference information on a sensory effect using binary representation syntax, and analyzing the USP metadata and transmitting the USP metadata to a sensory reproduction apparatus engine unit that generates control information on a sensory reproduction apparatus. In the method, the preference information includes preference information on a color correction effect of contents.
In accordance with another embodiment of the present invention, a method for providing metadata for a sensory effect includes generating Sensory Device Capabilities (SDCap) metadata including reproduction capability information on a sensory effect of a sensory reproduction apparatus using binary representation syntax, and analyzing the SDCap metadata and transmitting the SDCap metadata to a sensory reproduction apparatus engine unit that generates control information on the sensory reproduction apparatus. In the method, the reproduction capability information includes reproduction capability information on a color correction effect of contents.
In accordance with another embodiment of the present invention, a method for providing metadata for a sensory effect includes receiving SEM comprising sensory effect information, analyzing the SEM and generating Sensory Device Commands (SDCmd) metadata including control information on a sensory effect of a sensory reproduction apparatus, and transmitting the SDCmd metadata to a control device that controls the sensory reproduction apparatus. In the method, the SEM are generated using binary representation syntax, and the sensory effect information includes color correction effect information on contents.
In accordance with another embodiment of the present invention, a sensory effect representing method of a sensory reproduction apparatus for representing a sensory effect includes receiving control information on a sensory effect on a sensory reproduction apparatus, and representing the sensory effect based on the control information on the sensory effect. In the method, the control information on the sensory effect includes control information on a color correction effect of contents.
In accordance with another embodiment of the present invention, a computer-readable recording medium on which metadata are recorded includes SEM including sensory effect information on contents. In the computer-readable recording medium, the sensory effect information includes color correction information on the contents.
In the computer-readable recording medium on which the metadata are recorded, the metadata may include USP metadata including consumer preference information on a sensory effect, and the preference information may include preference information on a color correction effect of the contents.
In the computer-readable recording medium on which the metadata are recorded, the metadata may include SDCap metadata including reproduction capability information on the sensory effect of a sensory reproduction apparatus, and the reproduction capability information may include reproduction capability information on the color correction effect of the contents.
In the computer-readable recording medium on which the metadata are recorded, the metadata may include SDCmd metadata including control information on the sensory effect of the sensory reproduction apparatus, and the control information on the sensory effect may include control information on the color correction effect of the contents.
Exemplary embodiments of the present invention will be described below in more detail with reference to the accompanying drawings. The present invention may, however, be embodied in different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the present invention to those skilled in the art. Throughout the disclosure, like reference numerals refer to like parts throughout the various figures and embodiments of the present invention.
In general, when a contents consumer consumes contents, a sensory effect may be reproduced by a corresponding sensory reproduction apparatus according to the consumed contents. Since the relations and compatibilities between consumed contents and sensory reproduction apparatuses may differ, a sensory reproduction apparatus compatible with the contents is required to reproduce sensory effects for the contents.
Meanwhile, the sensory effect is used as a means that enables a contents consumer to experience contents more realistically, but a sensory effect for reproducing the colors intended by a contents provider in a sensory reproduction apparatus has not been provided. That is, since the contents providing environment and the characteristics of the sensory reproduction apparatus are different, it is difficult to reproduce pictures identical to the original pictures of the contents in the sensory reproduction apparatus.
In accordance with the present invention, color correction effect information on contents is provided to a sensory reproduction apparatus, so that colors intended by a contents provider can be reproduced in the sensory reproduction apparatus. That is, in accordance with the present invention, color information on original pictures of contents and information on a target for color correction are provided to a sensory reproduction apparatus, so that pictures intended by a contents provider or pictures identical to the original pictures of the contents can be reproduced in the sensory reproduction apparatus.
As illustrated in
First, metadata will be described as follows.
Sensory Effect Metadata (SEM) are metadata including sensory effect information on contents. User Sensory Preference (USP) metadata are metadata including user preference information on a sensory effect for contents. Sensory Device Commands (SDCmd) metadata are metadata including information for controlling a sensory reproduction apparatus that represents a sensory effect. Sensory Device Capabilities (SDCap) metadata are metadata including sensory effect reproduction capability information of the sensory reproduction apparatus.
The SEM generating unit 101 generates SEM including sensory effect information on contents. The sensory effect information on the contents may be provided by a contents provider. Here, the sensory effect information may include wind effect information, vibration information, temperature information, main illumination information and color correction effect information on the contents.
The USP metadata generating unit 103 generates USP metadata including consumer preference information on a sensory effect. The consumer preference information may be provided by a consumer that consumes contents. For example, the consumer preference information may include information that the consumer prefers a wind effect in the sensory effects and does not prefer a vibration effect in the sensory effects.
The SDCmd metadata generating unit 105 generates SDCmd metadata including control information for controlling the sensory reproduction apparatus 113 to represent a sensory effect. That is, the sensory reproduction apparatus 113 may represent the sensory effect according to the SDCmd metadata.
The SDCap metadata generating unit 107 generates SDCap metadata including sensory effect reproduction capability information of the sensory reproduction apparatus 113. For example, the sensory effect reproduction capability information may include temperature control capability information of the temperature control device 121 in the sensory reproduction apparatus 113.
The sensory reproduction apparatus engine unit 109 receives the SEM input from the SEM generating unit 101 and analyzes the received SEM. The sensory reproduction apparatus engine unit 109 may also analyze at least one of the USP metadata and the SDCap metadata. That is, the sensory reproduction apparatus engine unit 109 analyzes the SEM, the USP metadata and the SDCap metadata and provides the analyzed result to the SDCmd metadata generating unit 105, so that the SDCmd metadata generating unit 105 can generate SDCmd metadata.
Meanwhile, the SDCmd metadata generating unit 105 may be included in the sensory reproduction apparatus engine unit 109, and the sensory reproduction apparatus engine unit 109 may generate the SDCmd metadata. That is, the sensory reproduction apparatus engine unit 109 may generate the SDCmd metadata using sensory effect information, user preference information and reproduction capability information.
The sensory reproduction apparatus control unit 111 receives SDCmd metadata inputted from the SDCmd metadata generating unit 105 and analyzes the received SDCmd metadata. The sensory reproduction apparatus control unit 111 may control the sensory reproduction apparatus 113 using the control information on the sensory reproduction apparatus 113. The sensory reproduction apparatus control unit 111 may include the SDCap metadata generating unit 107, and the SDCap metadata generating unit 107 may generate SDCap metadata of the sensory reproduction apparatus 113 connected to the sensory reproduction apparatus control unit 111.
The sensory reproduction apparatus 113 reproduces or represents a sensory effect under a control of the sensory reproduction apparatus control unit 111. For example, the sensory reproduction apparatus 113 may represent a wind effect for contents, a vibration effect, a temperature effect, a main illumination effect, a peripheral illumination effect and a color correction effect for contents. That is, the illumination device 117 and the LED device 119 may represent the main illumination effect and the peripheral illumination effect, and the temperature control device 121 may represent the temperature effect. The display device 115 may represent the color correction effect together with the reproduction of contents. Other reproduction apparatuses 123 may represent the wind effect and the vibration effect.
Particularly, the sensory reproduction apparatus 113 in accordance with the embodiment of the present invention may represent the color correction effect for the contents as described above. The color correction effect is, for example, a color effect intended by the contents provider. The sensory reproduction apparatus 113 may reproduce the contents by representing the color effect intended by the contents provider in the contents, or may reproduce the contents by maximally reflecting original color pictures of the contents.
The multimedia system in accordance with the embodiment of the present invention generates SEM, USP metadata, SDCap metadata and SDCmd metadata for the color correction effect so as to implement the color correction effect, and reproduces contents using these metadata.
Meanwhile, the sensory reproduction apparatus 113 in accordance with the embodiment of the present invention may receive the SDCmd metadata directly inputted from the sensory reproduction apparatus engine unit 109 so as to represent the sensory effect, or may include the SDCap metadata generating unit 107 so as to transmit the SDCap metadata to the sensory reproduction apparatus engine unit 109.
In the multimedia system in accordance with the embodiment of the present invention, each of the metadata may be transmitted/received through a communication channel (not shown). The communication channel may be a wired network that transmits/receives data according to a specific communication protocol. Alternatively, the communication channel may be a wireless channel using a mobile communication scheme such as CDMA, WCDMA or FDMA, or a wireless communication scheme such as Bluetooth, WiBro or wireless LAN.
In accordance with the embodiment of the present invention, a method for describing metadata according to the standardized format and structure may be based on the MPEG-7 multimedia description scheme (MDS) or the MPEG-21 digital item adaptation (DIA) MDS.
As illustrated in
The metadata generating unit 201 generates SEM 200 including sensory effect information. The transmission unit 203 transmits the SEM to the sensory reproduction apparatus engine unit 109, which analyzes the metadata and generates control information on the sensory reproduction apparatus 113 that reproduces a sensory effect.
Here, the sensory effect information may include at least one of color correction effect information, wind effect information, vibration information, temperature information, main illumination information and peripheral illumination information.
As illustrated in
Specifically, the automatic extraction attribute information describes whether or not the sensory effect described in the SEM 200 can be automatically extracted from media including the SEM 200 and contents. General explanation information on the SEM and the like may be described in the form of an appendix in the Description metadata 304. The Declarations metadata 306 are metadata for declaring in advance sensory effect information (Group Of Effects, Effect) and parameter information (Parameter) included in the SEM 200, so that the declared information can be referred to in the reproduction of a sensory effect, if necessary. The Reference Effect metadata 312 are metadata for referring to the previously declared sensory effect information in the reproduction of a sensory effect.
Here, any one of the Declarations metadata 306, the Group Of Effects metadata 308, the Effect metadata 310 and the Reference Effect metadata 312 may be repeatedly described in the SEM 200.
In Table 1, the description structure of the SEM 200 is shown in the form of an eXtensible Markup Language (XML) schema.
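For illustration only, a minimal SEM instance following the structure described above might look like the following sketch. The element and attribute names (including `autoExtraction` and the namespace-free spelling) are assumptions drawn from the descriptions of the metadata 302 to 312, not the normative schema of Table 1:

```xml
<!-- Illustrative sketch only; element and attribute names are assumed -->
<SEM autoExtraction="true">
  <DescriptionMetadata>
    <!-- general explanation of the SEM (Description metadata 304) -->
  </DescriptionMetadata>
  <Declarations>
    <!-- effects and parameters declared in advance (Declarations metadata 306) -->
  </Declarations>
  <GroupOfEffects>
    <!-- two or more Effect elements described together (308) -->
  </GroupOfEffects>
  <Effect>
    <!-- a single sensory effect (Effect metadata 310) -->
  </Effect>
  <ReferenceEffect uri="#declaredEffect"/>
  <!-- refers to previously declared sensory effect information (312) -->
</SEM>
```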
As illustrated in
The SEM Base Type 500 becomes a base type for a plurality of metadata included in the SEM, and the plurality of metadata included in the SEM may use a data type extended from the SEM Base Type 500. The data type extended from the SEM Base Type 500 includes all the attributes or information of the SEM Base Type 500.
In Table 2, the description structure of the SEM Base Type metadata 500 is shown in the form of an XML schema.
As illustrated in
The Classification Scheme Alias metadata 700 may be extended from, for example, the SEM Base Type 500. The Classification Scheme Alias metadata 700 includes metadata (Alias) 702 for describing attribute information for another name assigned to the classification scheme, and metadata (HREF) 704 for describing attribute information for referring to the classification scheme assigned the other name using a URI. That is, another classification scheme may be referred to through the Classification Scheme Alias metadata 700.
Specifically, the Alias metadata 702 are metadata for assigning a separate name to the name of the classification scheme. The URI refers to path information for referring to a file in which the classification scheme is defined on the Web. The URI is defined as attribute information of the href metadata 704.
In Table 3, the description structure of the Description metadata 304 is shown in the form of an XML schema.
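As a sketch of the Classification Scheme Alias metadata described above, an alias may be assigned to an externally defined classification scheme and the scheme may then be referred to by URI. The attribute spellings and the URL are hypothetical:

```xml
<!-- Illustrative sketch only: "CS1" is assigned as another name for the
     classification scheme located at the (hypothetical) URI -->
<ClassificationSchemeAlias alias="CS1"
    href="http://example.com/schemes/SensoryEffectCS.xml"/>
```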
As illustrated in
The Effect Base Type metadata 900 includes SEM Base Attributes metadata 902 for describing a group of attributes including attributes necessary for sensory effect description and metadata (##other) 904 for describing extendible attribute information.
In Table 4, the description structure of the Effect Base Type metadata 900 is shown in the form of an XML schema.
As illustrated in
The SEMAdaptablilityAttributes metadata 1116 includes metadata (adaptType) 1118 for describing attribute information showing preference for adaptation and metadata (adaptRange) 1120 for describing attribute information showing a range for the adaptation.
Specifically, the alt metadata 1108 shows location information on a substitutable sensory effect when it is necessary to substitute another sensory effect for a predetermined sensory effect. The position metadata 1114 shows information on a position at which the sensory effect is represented, e.g., that the wind effect is represented at a left side.
The SEMAdaptablilityAttributes metadata 1116 shows a degree of adaptation for the reproduction intensity in the reproduction of the sensory effect. For example, when the predetermined reproduction intensity is 100%, the SEMAdaptablilityAttributes metadata 1116 shows information on whether the sensory effect is reproduced by strictly applying the reproduction intensity of 100% or by flexibly applying it. That is, the adaptType metadata 1118 describes attribute information related to adaptation, for example, indicating that the sensory effect may be reduced. If an adaptation range of 10% is described in the adaptRange metadata 1120, a wind effect of 90% may be represented in the sensory reproduction apparatus 113.
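The adaptation attributes described above might be sketched as follows; the attribute names follow the description (adaptType, adaptRange), while the effect type name, the intensity attribute and the value "Reduce" are assumptions for illustration:

```xml
<!-- Illustrative sketch only: a wind effect whose 100% reproduction
     intensity may be adapted downward (adaptType) within a range of 10%
     (adaptRange), e.g., represented at 90% by the reproduction apparatus -->
<Effect xsi:type="WindType" intensity="100"
        adaptType="Reduce" adaptRange="10"/>
```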
In Table 5, the description structure of the SEM Base Attributes metadata 902 is shown in the form of an XML schema.
As illustrated in
The Group Of Effects metadata 308 includes the SEM Base Attributes metadata 902 for describing a group of attributes including attributes necessary for sensory effect description, the metadata (##other) 904 for describing extendible attribute information, and the Effect metadata 310 for describing information on one sensory effect. The Group Of Effects metadata 308 may include two or more Effect metadata 310.
In Table 6, the description structure of the Group Of Effects Type metadata 1200 is shown in the form of an XML schema.
As illustrated in
The Reference Effect metadata 312 includes metadata (uri) 1402 for describing attribute information showing a position of a sensory effect to be referred to, the SEM Base Attributes metadata 902 for describing a group of attributes including attributes necessary for sensory effect description, and metadata (##other) 1404 for describing extendible attribute information.
In Table 7, the description structure of the Reference Effect Type metadata 1400 is shown in the form of an XML schema.
As illustrated in
The Declarations metadata 306 includes the Group Of Effects metadata 308, the Effect metadata 310, and Parameter metadata 1602 for describing parameter information referred to by the sensory effect. Any one of the Group Of Effects metadata 308, the Effect metadata 310 and the Parameter metadata 1602 may be repeatedly described in the Declarations metadata 306.
The Group Of Effects metadata 308 and the Effect metadata 310 may be included in the Declarations metadata 306, or may be included in the SEM. In a case where the Group Of Effects metadata 308 and the Effect metadata 310 are used in the Declarations metadata 306, they are used as data defined in advance, as in a dictionary. In a case where they are used in the SEM, they are used as data according to the contents of the media. For example, when a temperature effect for the media is continuously represented, the temperature effect may be defined in the Declarations metadata. When a temperature effect is represented depending on a situation, the temperature effect may be defined in the SEM.
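The dictionary-like use of the Declarations metadata might be sketched as follows; the id/uri referencing mechanism and the type name are assumptions based on the description above:

```xml
<!-- Illustrative sketch only: a continuously used temperature effect is
     declared once and later referred to, instead of being re-described -->
<SEM>
  <Declarations>
    <Effect xsi:type="TemperatureType" id="warmScene"/>
  </Declarations>
  <!-- later, at the point of the media where the effect is reproduced -->
  <ReferenceEffect uri="#warmScene"/>
</SEM>
```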
In Table 8, the description structure of the Declarations Type metadata 1600 is shown in the form of an XML schema.
As illustrated in
In Table 9, the description structure of the Parameter Base Type metadata 1800 is shown in the form of an XML schema.
As illustrated in
The Color Correction Parameter Type metadata 2000 includes at least one of Tone Reproduction Curves 2002, Conversion LUT 2004, Color Temperature 2006, Input Device Color Gamut 2008 and Illumination Of Surround 2010.
Since contents providing environment and characteristics of the sensory reproduction apparatus are different as described above, it is difficult to reproduce pictures identical to the original pictures of contents in the sensory reproduction apparatus. That is, since the apparatus used to generate contents and the sensory reproduction apparatus have a difference in color property, a difference in color representation occurs when the original pictures of the contents are reproduced in the sensory reproduction apparatus as they are. Thus, in the embodiment of the present invention, color correction effect information on the original pictures of the contents is provided to the sensory reproduction apparatus, so that the original pictures of the contents can be reproduced in the sensory reproduction apparatus. Here, the color correction effect information may include color correction parameters and Color Correction Type metadata 3000 of
That is, the sensory reproduction apparatus engine unit 109 analyzes color correction parameters from the SEM and generates SDCmd metadata so that the sensory reproduction apparatus can restore the original pictures of the contents or display pictures intended by the contents provider. Alternatively, the sensory reproduction apparatus may represent a color correction effect intended by the contents provider with reference to the color correction parameters.
The Tone Reproduction Curves 2002 show characteristics of an original picture display device for the original pictures of the contents. That is, the Tone Reproduction Curves 2002 for describing tone reproduction curves showing the characteristics of the original picture display device used in production of the contents are provided as color correction parameters so as to perform successful color restoration in the sensory reproduction apparatus 113.
The Conversion LUT 2004 includes information for converting a color space of the original pictures into a standard color space. Since the color space of the original pictures and the color space in the sensory reproduction apparatus 113 are different, the Conversion LUT 2004 is provided as a color correction parameter. Here, the Conversion LUT 2004 includes an LUT describing how the color space of the original pictures is converted into the standard color space, together with parameter information.
The Color Temperature 2006 shows color temperature information of illumination used in a production space of the original pictures. That is, the Color Temperature 2006 includes color temperature information on an illumination source used in the production space of the original pictures.
The Input Device Color Gamut 2008 shows input device color gamut information on the original picture display device. Since the input device color gamut of the original picture display unit and the input device color gamut of the sensory reproduction apparatus 113 are different, the Input Device Color Gamut 2008 including input device color gamut information on the original picture display device is provided as a color correction parameter.
The Illumination Of Surround 2010 shows illuminance information on the environment in which the consumer reproduces the contents.
In the embodiment of the present invention, although a gain offset gamma (GOG) model is used as the method for converting the color space, another conversion model such as a polynomial or PLCC model may be used as the method for converting the color space.
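For reference, the GOG model mentioned above may be written as follows. This is the standard formulation of the gain-offset-gamma display characterization; the symbols are not drawn from the tables of this document:

```latex
% GOG (gain-offset-gamma) characterization of one display channel,
% here the red channel; d_R is the DAC value and N its bit depth.
R = \left( k_{g,R}\,\frac{d_R}{2^{N}-1} + k_{o,R} \right)^{\gamma_R}
% The linearized channel values are then mapped to the standard XYZ
% color space by the matrix of the tristimulus values of the primaries:
\begin{pmatrix} X \\ Y \\ Z \end{pmatrix} =
\begin{pmatrix}
X_{R,\max} & X_{G,\max} & X_{B,\max} \\
Y_{R,\max} & Y_{G,\max} & Y_{B,\max} \\
Z_{R,\max} & Z_{G,\max} & Z_{B,\max}
\end{pmatrix}
\begin{pmatrix} R \\ G \\ B \end{pmatrix}
```

Here $k_{g,R}$, $k_{o,R}$ and $\gamma_R$ correspond to the gain, offset and gamma values carried by the parameter information described below.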
In Table 10, the description structure of the Color Correction Parameter metadata 2000 is shown in the form of an XML schema.
As illustrated in
The Tone Reproduction Curves Type metadata 2200 includes a digital to analog conversion (DAC) value for an RGB channel of the original picture display device and an RGB value (RGB_value) 2204 of the RGB channel according to the DAC value (DAC_value) 2202. The DAC value and the RGB value are used to evaluate a gamma value, i.e., a tone reproduction curve. The gamma value is a numerical value showing correlation between the input and output of the display device. The gamma value shows a ratio of input voltage to brightness.
That is, the gamma value may be evaluated through the DAC value that is a digital value outputted from the RGB channel according to an input voltage and the RGB value measured through a spectrometer. The sensory reproduction apparatus 113 may reproduce the contents or represent the color correction effect with reference to the gamma value evaluated using the DAC value and the RGB value.
The DAC_Value metadata 2202 and the RGB_value metadata 2204 may be repeatedly described in the Tone Reproduction Curves metadata 2002 from once to 256 times in pairs (described in sequence).
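The paired description of DAC and measured RGB values might be sketched as follows; the element spellings follow the description above, and the numeric values are invented for illustration:

```xml
<!-- Illustrative sketch only: DAC_Value/RGB_Value pairs sampled along the
     tone reproduction curve of the original picture display device -->
<ToneReproductionCurves>
  <DAC_Value>0</DAC_Value>   <RGB_Value>0.000 0.000 0.000</RGB_Value>
  <DAC_Value>128</DAC_Value> <RGB_Value>0.212 0.201 0.198</RGB_Value>
  <DAC_Value>255</DAC_Value> <RGB_Value>1.000 1.000 1.000</RGB_Value>
</ToneReproductionCurves>
```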
In Table 11, the description structure of the Tone Reproduction Curves Type metadata 2200 is shown in the form of an XML schema.
As illustrated in
The Conversion LUT metadata 2004 includes RGB_XYZ_LUT 2402, Parameter information and Inverse LUT 2410.
The RGB_XYZ_LUT 2402 is information for converting the RGB color space into the XYZ color space. The Inverse LUT 2410 is information for converting the XYZ color space into the RGB color space.
The parameter information describes a gain, offset and gamma value of the original picture display device for GOG conversion and an RGB scalar maximum value for the RGB channel. That is, the parameter information includes RGBScalar_Max 2404 for describing the RGB scalar maximum value for each channel, which is necessary for the GOG conversion, Offset_Value 2406 for describing the offset value of the original picture display device, and Gain_Offset_Gamma 2408 for describing the gain, offset and gamma value of the original picture display device, that are parameters necessary for the GOG conversion.
That is, according to conversion information, the RGB color space that is a color space of the original pictures may be converted into the XYZ color space that is a standard color space. The sensory reproduction apparatus 113 may reproduce the contents or represent the color correction effect with reference to the conversion information.
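The Conversion LUT elements described above might be sketched as follows. The element names follow the description (RGB_XYZ_LUT 2402, RGBScalar_Max 2404, Offset_Value 2406, Gain_Offset_Gamma 2408, Inverse LUT 2410); the matrix values shown are the well-known sRGB/D65 RGB-XYZ matrices, used here purely as placeholder data:

```xml
<!-- Illustrative sketch only: a 3x3 RGB-to-XYZ matrix, the per-channel
     parameters needed for GOG conversion, and the inverse XYZ-to-RGB LUT -->
<ConversionLUT>
  <RGB_XYZ_LUT>
    0.4124 0.3576 0.1805
    0.2126 0.7152 0.0722
    0.0193 0.1192 0.9505
  </RGB_XYZ_LUT>
  <RGBScalar_Max>1.0 1.0 1.0</RGBScalar_Max>
  <Offset_Value>0.0 0.0 0.0</Offset_Value>
  <!-- gain, offset, gamma for each of the R, G and B channels -->
  <Gain_Offset_Gamma>1.0 0.0 2.2  1.0 0.0 2.2  1.0 0.0 2.2</Gain_Offset_Gamma>
  <InverseLUT>
    3.2406 -1.5372 -0.4986
   -0.9689  1.8758  0.0415
    0.0557 -0.2040  1.0570
  </InverseLUT>
</ConversionLUT>
```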
In Table 12, the description structure of the Conversion LUT Type metadata 2400 is shown in the form of an XML schema.
As illustrated in
The Color Temperature 2006 may include type information of the illuminant (Daylight) 2602, a white point chromaticity value (xy_Value) 2604 and brightness of the illuminant (Y_Value) 2606. Alternatively, the Color Temperature 2006 may include correlated color temperature information (Correlated_CT) 2608. That is, the Daylight 2602, the xy_Value 2604 and the Y_Value 2606 are necessarily described together in the Color Temperature metadata 2006, or, alternatively, only the Correlated_CT 2608 may be described in the Color Temperature metadata 2006.
Here, the Daylight metadata 2602 may be an illumination type according to the name (type) of a Commission Internationale de l'Eclairage (CIE) standard illuminant. The xy_Value metadata 2604 may use the Chromaticity Type metadata of MPEG-21 DIA.
The sensory reproduction apparatus 113 may reproduce the contents or represent the color correction effect with reference to the Color Temperature 2006.
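The two alternative forms of the Color Temperature metadata might be sketched as follows; the element spellings follow the description above, and the values shown (the CIE D65 white point and its correlated color temperature) are illustrative:

```xml
<!-- Illustrative sketch only: illuminant type, white point chromaticity
     and illuminant brightness described together -->
<ColorTemperature>
  <Daylight>D65</Daylight>
  <xy_Value x="0.3127" y="0.3290"/>
  <Y_Value>100</Y_Value>
</ColorTemperature>

<!-- Alternatively, only a correlated color temperature (in kelvins) -->
<ColorTemperature>
  <Correlated_CT>6504</Correlated_CT>
</ColorTemperature>
```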
In Table 13, the description structure of the Illuminant Type metadata 2600 is shown in the form of an XML schema.
As illustrated in
The Input Device Color Gamut 2008 includes kind information of the original picture display device (IDCG_Type) 2802 and input device color gamut value according to the maximum DAC value of the original picture display device (IDCG_Value) 2804. That is, the IDCG_Type 2802 describes the kind of an input device that receives the original pictures of the contents, and the IDCG_Value 2804 describes the input device color gamut value in the maximum DAC value of the input device as a value on x and y coordinates.
The sensory reproduction apparatus 113 may reproduce the contents or represent the color correction effect with reference to the Input Device Color Gamut 2008.
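The Input Device Color Gamut metadata might be sketched as follows; the element spellings follow the description (IDCG_Type 2802, IDCG_Value 2804), and the x, y coordinate values shown are the sRGB primaries, used as placeholder data:

```xml
<!-- Illustrative sketch only: the kind of the original picture display
     device and its gamut at the maximum DAC value, as x,y coordinates
     of the R, G and B primaries -->
<InputDeviceColorGamut>
  <IDCG_Type>LCD monitor</IDCG_Type>
  <IDCG_Value>0.640 0.330  0.300 0.600  0.150 0.060</IDCG_Value>
</InputDeviceColorGamut>
```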
In Table 14, the description structure of the Input Device Color Gamut Type metadata 2800 is shown in the form of an XML schema.
As illustrated in
The Color Correction Type metadata 3000 may include at least one of a Spatio Temporal Locator 3002 and a Spatio Temporal Mask 3004. The Spatio Temporal Locator 3002 and the Spatio Temporal Mask 3004 are elements used to trace and interpolate a range (or object) subjected to color correction depending on a color correction range and a change in position for the purpose of applying partial color correction. The Spatio Temporal Locator 3002 shows the position of a color correction object using a coordinate, and the Spatio Temporal Mask 3004 shows the position of the color correction object using a mask.
The sensory reproduction apparatus 113 may represent the color correction effect using the Color Correction Type metadata 3000 with reference to the Color Correction Parameter metadata 2000.
The Spatio Temporal Locator Type of MPEG-7 MDS may be used as the Spatio Temporal Locator 3002, and the Spatio Temporal Mask Type of MPEG-7 MDS may be used as the Spatio Temporal Mask 3004.
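A hypothetical instance of the Color Correction Type metadata 3000 might look as follows; the child element names mirror the MPEG-7 MDS types named above, and the comments stand in for the MPEG-7 locator and mask content, which is not reproduced here.

```xml
<ColorCorrection>
  <!-- position of the correction object over time, given as coordinates -->
  <SpatioTemporalLocator>
    <!-- MPEG-7 MDS SpatioTemporalLocatorType content -->
  </SpatioTemporalLocator>
  <!-- or, alternatively, the object region over time, given as a mask -->
  <SpatioTemporalMask>
    <!-- MPEG-7 MDS SpatioTemporalMaskType content -->
  </SpatioTemporalMask>
</ColorCorrection>
```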
In Table 15, the description structure of the Color Correction Type metadata 3000 is shown in the form of an XML schema.
As illustrated in
In Table 16, the description structure of the Wind Type metadata 3200 is shown in the form of an XML schema.
As illustrated in
The metadata generating unit 3401 generates USP metadata 3400 including consumer preference information on a sensory effect. The transmission unit 3403 analyzes the USP metadata 3400 and transmits the USP metadata 3400 to the sensory reproduction apparatus engine unit 109 for generating control information on the sensory reproduction apparatus 113 that reproduces the sensory effect.
Here, the sensory effect information may include at least one of color correction effect information, wind effect information, vibration information, temperature information, main illumination information and peripheral illumination information. For example, the USP metadata 3400 may include consumer preference information on the color correction effect.
As illustrated in
Here, the preference information on the sensory effect may be preference information on the color correction effect, and the Preference metadata 3504 is necessarily described at least once. The sensory reproduction apparatus engine unit 109 may generate SDCmd metadata based on the presence of user preference for the color correction effect using the USP metadata 3400.
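For illustration only (the normative schema is in Table 17, and the type name in the xsi:type attribute is hypothetical), USP metadata expressing a consumer preference for the color correction effect could be sketched as:

```xml
<USP>
  <!-- the Preference element is mandatory and may be repeated per sensory effect -->
  <Preference xsi:type="ColorCorrectionPrefType" activate="true"/>
</USP>
```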
In Table 17, the description structure of the USP metadata 3400 is shown in the form of an XML schema.
As illustrated in
The USP Base Type 3700 becomes a base type for a plurality of metadata included in the USP metadata, and the plurality of metadata included in the USP metadata may use a data type extended from the USP Base Type 3700. For example, the USP Base Type 3700 may be used as the preference base type of the Preference metadata 3504. The data type extended from the USP Base Type 3700 includes all the attributes or information of the USP Base Type 3700.
In Table 18, the description structure of the USP Base Type metadata 3700 is shown in the form of an XML schema.
As illustrated in
The Preference Base Type metadata 3900 includes metadata 3902 for describing a group of attributes including attributes necessary for description of the sensory effect information and metadata (##other) 3904 for describing extendible attribute information. The USP base attributes metadata 3902 includes metadata (activate) 3906 for describing attribute information showing activation of the reproduction effect and metadata (maxIntensity) 3908 for describing attribute information showing the maximum reproduction intensity.
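The attribute structure described above can be sketched in XML schema form; the declarations below are an illustrative reconstruction from the prose, not the normative schema of Table 19.

```xml
<attributeGroup name="USPBaseAttributes">
  <attribute name="activate" type="boolean"/>       <!-- activation of the reproduction effect -->
  <attribute name="maxIntensity" type="integer"/>   <!-- maximum reproduction intensity -->
</attributeGroup>

<complexType name="PreferenceBaseType" abstract="true">
  <attributeGroup ref="USPBaseAttributes"/>
  <anyAttribute namespace="##other"/>               <!-- extensible attribute information -->
</complexType>
```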
In Table 19, the description structure of the Preference Base Type metadata 3900 is shown in the form of an XML schema.
As illustrated in
In Table 20, the description structure of the Color Correction Preference metadata 4100 is shown in the form of an XML schema.
As illustrated in
The metadata generating unit 4301 generates SDCap metadata 4300 including reproduction capability information of the sensory reproduction apparatus for a sensory effect. The transmission unit 4303 analyzes the SDCap metadata 4300 and transmits the SDCap metadata 4300 to the sensory reproduction apparatus engine unit 109 for generating control information on the sensory reproduction apparatus 113 that reproduces the sensory effect.
Here, the sensory effect information may include at least one of color correction effect information, wind effect information, vibration information, temperature information, main illumination information and peripheral illumination information. For example, the SDCap metadata 4300 may include reproduction capability information on the sensory reproduction apparatus 113 for the color correction effect.
As illustrated in
Here, the Device Capability metadata 4402 is necessarily described at least once.
In Table 21, the description structure of the SDCap metadata 4300 is shown in the form of an XML schema.
As illustrated in
The SDCap Base Type 4600 becomes a base type for a plurality of metadata included in the SDCap metadata, and the plurality of metadata included in the SDCap metadata may use a data type extended from the SDCap Base Type 4600. For example, the SDCap Base Type 4600 may be used as the device capability base type of the Device Capability metadata 4402. The data type extended from the SDCap Base Type 4600 includes all the attributes or information of the SDCap Base Type 4600.
In Table 22, the description structure of the SDCap Base Type metadata 4600 is shown in the form of an XML schema.
As illustrated in
The Device Capability Base Type metadata 4800 includes SDCap Base Attributes metadata 4802 for describing a group of attributes including attributes necessary for description of sensory reproduction apparatus capability and metadata (##other) 4804 for describing extendible attribute information. The SDCap Base Attributes metadata 4802 includes metadata (maxIntensity) 4806 for describing attribute information showing the maximum reproduction capability and metadata (position) 4808 for describing attribute information showing position information of the sensory reproduction apparatus.
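As an illustrative instance of these attributes (the capability type name, position vocabulary and values are hypothetical; the normative schema is in Table 23):

```xml
<!-- maxIntensity: maximum reproduction capability of the apparatus
     position: location of the sensory reproduction apparatus -->
<DeviceCapability xsi:type="LightCapabilityType"
                  maxIntensity="100" position="front-left"/>
```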
In
In Table 23, the description structure of the Device Capability Base Type metadata 4800 is shown in the form of an XML schema.
As illustrated in
In Table 24, the description structure of the Color Correction Device Capability metadata 5000 is shown in the form of an XML schema.
As illustrated in
The metadata generating unit 5201 generates SDCmd metadata 5200 including control information on a sensory effect of the sensory reproduction apparatus 113 that reproduces the sensory effect. The metadata generating unit 5201 may generate the SDCmd metadata 5200 by receiving the analyzed result for at least one of the SEM, the USP metadata and the SDCap metadata from the sensory reproduction apparatus engine unit 109.
The transmission unit 5203 transmits the SDCmd metadata 5200 to a control device for controlling the sensory reproduction apparatus 113. For example, the control device may be a control device included in the sensory reproduction apparatus control unit 111 or the sensory reproduction apparatus 113.
Here, the sensory effect information may include at least one of color correction effect information, wind effect information, vibration information, temperature information, main illumination information and peripheral illumination information. For example, the SDCmd metadata 5200 may include control information on the color correction effect.
As described above, the SDCmd metadata generating unit 105 may be included in the sensory reproduction apparatus engine unit 109.
As illustrated in
One of the Group Of Commands metadata 5302 and the Device Command metadata 5304 is necessarily described at least once.
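A hypothetical SDCmd instance showing both alternatives (the idref values and attribute settings are illustrative; the normative schema is in Table 25):

```xml
<SDCmd>
  <!-- alternative 1: several commands grouped and issued together -->
  <GroupOfCommands>
    <DeviceCommand idref="mainLight" activate="true"/>
    <DeviceCommand idref="peripheralLED" activate="true"/>
  </GroupOfCommands>
  <!-- alternative 2: a single device command -->
  <DeviceCommand idref="fan01" intensity="30"/>
</SDCmd>
```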
In Table 25, the description structure of the SDCmd metadata 5200 is shown in the form of an XML schema.
As illustrated in
The SDCmd Base Type 5500 becomes a base type for a plurality of metadata included in the SDCmd metadata, and the plurality of metadata included in the SDCmd metadata may use a data type extended from the SDCmd Base Type 5500. For example, the SDCmd Base Type 5500 may be used as a device command base type of the Device Command metadata 5304. The data type extended from the SDCmd Base Type 5500 includes all the attributes or information of the SDCmd Base Type 5500.
In Table 26, the description structure of the SDCmd Base Type metadata 5500 is shown in the form of an XML schema.
As illustrated in
The Device Command Base Type metadata 5700 includes SDCmd Base Attributes metadata 5702 for describing a group of attributes including attributes necessary for description of command information on the sensory reproduction apparatus and metadata (##other) 5704 for describing extendible attribute information.
The SDCmd Base Attributes metadata 5702 includes metadata (idref) 5706 for describing attribute information showing identifier (id) reference of the sensory reproduction apparatus 113, metadata (activate) 5708 for describing attribute information showing activation information of the sensory reproduction apparatus 113, and metadata (intensity) 5710 for describing attribute information showing sensory reproduction intensity information.
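The three base attributes can be illustrated on a single command; the device identifier and values below are hypothetical.

```xml
<!-- idref:     references the id of the target sensory reproduction apparatus
     activate:  activation information (turns the apparatus on or off)
     intensity: sensory reproduction intensity information -->
<DeviceCommand idref="airConditioner01" activate="true" intensity="50"/>
```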
In Table 27, the description structure of the Device Command Base Type metadata 5700 is shown in the form of an XML schema.
As illustrated in
The Group Of Commands metadata 5302 includes metadata (##other) 5902 for describing extendible attribute information. The Group Of Commands metadata 5302 includes at least two metadata (Device Command) 5304, each for describing one piece of sensory reproduction command information.
In Table 28, the description structure of the Group Of Commands Type metadata 5900 is shown in the form of an XML schema.
As illustrated in
The Color Correction Device Command metadata 6100 may include at least one of a Spatio Temporal Locator 6102 and a Spatio Temporal Mask 6104. That is, the Spatio Temporal Locator 3002 and the Spatio Temporal Mask 3004 are included in the Color Correction Type metadata 3000, and the Color Correction Device Command metadata 6100 according to the analyzed result of the SEM also includes the Spatio Temporal Locator 6102 and the Spatio Temporal Mask 6104.
The Spatio Temporal Locator 6102 and the Spatio Temporal Mask 6104 are elements used, when partial color correction is applied, to trace and interpolate the range (or object) subjected to color correction as the color correction range and its position change. The Spatio Temporal Locator 6102 shows the position of a color correction object using coordinates, and the Spatio Temporal Mask 6104 shows the position of the color correction object using a mask. The Spatio Temporal Locator Type of MPEG-7 MDS may be used as the Spatio Temporal Locator 6102, and the Spatio Temporal Mask Type of MPEG-7 MDS may be used as the Spatio Temporal Mask 6104.
The sensory reproduction apparatus 113 may represent the color correction effect based on the Color Correction Device Command metadata 6100.
In
In Table 29, the description structure of the Color Correction Device Command metadata 6100 is shown in the form of an XML schema.
As illustrated in
At the step S6301, the SEM generating unit 101 generates SEM including sensory effect information on contents.
At step S6302, the SEM generating unit 101 transmits the SEM to the sensory reproduction apparatus engine unit 109. The sensory reproduction apparatus engine unit 109 receives the SEM transmitted from the SEM generating unit 101 so as to analyze the SEM and generates control information on the sensory reproduction apparatus 113. The sensory effect information may include color correction effect information on the contents.
Specifically, the sensory effect information may include at least one of the Spatio Temporal Locator and the Spatio Temporal Mask as described in
As described in
The SEM generated by the SEM generating unit 101 is analyzed by the sensory reproduction apparatus engine unit 109, and the analyzed result of the sensory reproduction apparatus engine unit 109 may be used in representation of the color correction effect. That is, the sensory reproduction apparatus 113 may represent the color correction effect intended by a contents provider using the color correction effect information included in the SEM. The sensory reproduction apparatus 113 may represent the color correction effect with reference to the Color Correction Parameter. The sensory reproduction apparatus 113 may represent colors of original pictures for the contents with reference to the Color Correction Parameter.
As illustrated in
At the step S6401, the USP metadata generating unit 103 generates USP metadata including consumer preference information on a sensory effect.
At step S6403, the USP metadata generating unit 103 transmits the USP metadata to the sensory reproduction apparatus engine unit 109. The sensory reproduction apparatus engine unit 109 receives the USP metadata transmitted from the USP metadata generating unit 103 so as to analyze the USP metadata and generate control information on the sensory reproduction apparatus 113.
The consumer preference information on the sensory effect may include preference information on the color correction effect as described in
As illustrated in
At the step S6501, the SDCap metadata generating unit 107 generates SDCap metadata including reproduction capability information on a sensory effect in the sensory reproduction apparatus 113.
At step S6503, the SDCap metadata generating unit 107 transmits the SDCap metadata to the sensory reproduction apparatus engine unit 109. The sensory reproduction apparatus engine unit 109 receives the SDCap metadata transmitted from the SDCap metadata generating unit 107 so as to analyze the SDCap metadata and generate control information on the sensory reproduction apparatus.
The reproduction capability information may include reproduction capability information on the color correction effect for the contents as described in
As illustrated in
At the step S6601, the sensory reproduction apparatus engine unit 109 receives SEM including sensory effect information. The sensory reproduction apparatus engine unit 109 may receive the SEM inputted from the SEM generating unit 101.
At step S6603, the sensory reproduction apparatus engine unit 109 transmits SDCmd metadata to a control device for controlling the sensory reproduction apparatus 113. The control device for controlling the sensory reproduction apparatus 113 may be a control device included in the sensory reproduction apparatus control unit 111 or the sensory reproduction apparatus 113.
Meanwhile, the method in accordance with the embodiment of the present invention may further include receiving USP metadata including consumer preference information on the color correction effect or receiving SDCap metadata including reproduction capability information on the color correction effect of the sensory reproduction apparatus 113. The sensory reproduction apparatus engine unit 109 may generate and transmit SDCmd metadata by additionally analyzing the USP metadata or SDCap metadata.
The sensory effect may be a color correction effect for contents. Thus, the SEM may include color correction effect information on the contents, and the USP metadata may include consumer preference information on the color correction effect. The SDCap metadata may include reproduction capability information on the color correction effect of the sensory reproduction apparatus 113.
The sensory reproduction apparatus 113 that receives the SDCmd metadata including the color correction effect information on the contents may represent the color correction effect based on the SDCmd metadata. That is, the sensory reproduction apparatus engine unit 109 analyzes the color correction effect information including color correction parameters and generates the SDCmd metadata so that the sensory reproduction apparatus 113 can represent the color correction effect according to the color correction parameters. The sensory reproduction apparatus 113 may represent the color correction effect according to the color correction parameters, or may represent the color correction effect according to the Spatio Temporal Locator and the Spatio Temporal Mask for a color correction range.
As illustrated in
At the step S6701, the sensory reproduction apparatus 113 receives control information on a sensory effect of the sensory reproduction apparatus 113. The control information on the sensory effect may be inputted from the sensory reproduction apparatus control unit 111 that receives SDCmd metadata, or may be inputted in the form of SDCmd metadata from the sensory reproduction apparatus engine unit 109.
At step S6703, the sensory reproduction apparatus 113 represents a sensory effect based on the control information on the sensory effect. The control information on the sensory effect may be control information on the contents color correction effect.
According to the embodiment of the present invention, metadata for various sensory effects may be generated depending on the metadata description structure described above, and the various sensory effects may be represented in the sensory reproduction apparatus. For example, the sensory reproduction apparatus for representing sensory effects has base device command type metadata, and the base type metadata may be extended to metadata for the type of each sensory reproduction apparatus. In the extended metadata, various metadata may be included as elements of the reproduction apparatus type metadata, i.e., as sensory effect information of the sensory reproduction apparatus and parameter information related to the sensory effect. Here, the various metadata include original color restoration setting information on contents, illuminance reproduction setting information, vibration setting information, temperature reproduction setting information, reproduction intensity setting information of each sensory reproduction apparatus, and the like.
As illustrated in
In Tables 30 to 34, the SEM produced by the advertisement producer is shown in an XML instance form. Specifically, Tables 30 to 34 show SEM metadata for describing parameters for original color correction intended by the advertisement producer, correction information (range and changed position) on a color correction object, an illumination effect, a temperature effect, a wind effect, and the like. Tables 30 to 34 are shown in the consecutive XML instance form.
The advertisement contents and the advertisement moving picture contents 6300 may be generated in the form of a multimedia application format (MAF). The generated advertisement moving picture contents 6300, i.e., media are transmitted to the sensory reproduction apparatus engine unit 109, and a consumer of the advertisement moving picture contents 6300 can see that there exists a sensory effect for the advertisement contents.
Thus, the consumer selects whether or not to apply the sensory effect for the transmitted advertisement moving picture contents 6300. That is, the consumer may select the presence of preference for the sensory effect using a graphic user interface (GUI) of the sensory reproduction apparatus 113. Accordingly, the generated USP metadata is transmitted to the sensory reproduction apparatus engine unit 109.
In Table 35, the USP metadata showing preference of the sensory effect for the advertisement moving picture contents 6300 is represented in an XML instance form. That is, Table 35 shows the USP metadata for describing the sensory effect preference information of the consumer. Here, the USP metadata indicate that the original picture color correction effect, the main illumination effect, the peripheral illumination effect, the temperature effect and the wind effect are all used, and describe the degree of the reproduction effect for illumination, temperature and wind control.
The sensory reproduction apparatus engine unit 109 receives the SEM 200 for reproducing the sensory effect of the advertisement moving picture contents 6300, the SDCap metadata 4300 of the peripheral devices (a main illumination, a peripheral illumination (LED), an air conditioner and the like) connected to the sensory reproduction apparatus control unit 111, and the USP metadata 3400, which is the sensory effect reproduction preference information of the consumer, so as to generate the SDCmd metadata 5200.
In Table 36, the SDCap metadata 4300 generated from the sensory reproduction apparatus control unit 111 is shown in an XML instance form. Table 36 describes the reproduction capability range of a main illumination device (dimmer), a peripheral illumination device (LED), and a temperature and wind control device (air-conditioner).
While the advertisement contents are reproduced, the sensory reproduction apparatus engine unit 109 analyzes the SEM 200 and the SDCap metadata 4300 and determines which sensory reproduction apparatuses are currently available for the sensory effect intended by the contents provider. Then, the sensory reproduction apparatus engine unit 109 finally analyzes the consumer preference information based on the USP metadata of the consumer and transmits the generated SDCmd metadata 5200 to the sensory reproduction apparatus control unit 111.
In Tables 37 to 39, the SDCmd metadata 5200 generated from the sensory reproduction apparatus engine unit 109 is shown in an XML instance form. Tables 37 to 39 show the SDCmd metadata 5200 including control information on a sensory effect, controlled based on the USP metadata 3400 of the consumer, and describe control information on the original color correction effect, the main illumination and peripheral illumination (LED) effect, and the temperature and wind control effect. Tables 37 to 39 are shown in a consecutive XML instance form.
The sensory reproduction apparatus control unit 111 transmits a control signal to each of the sensory reproduction apparatuses based on the SDCmd metadata 5200. The sensory reproduction apparatus 113 that receives the control signal reproduces (represents) the sensory effect intended by the advertisement producer to the consumer in response to the control signal.
For example, a beer advertisement will be described as the advertisement contents. In a case where pictures of a cool sea under intense sunlight are reproduced, a specific range, an object (the beer or the sea) or the entire picture is displayed with the original color sense the advertisement producer intends. Further, the main illumination may be intensely spotlighted, and a peripheral LED (peripheral illumination) may emit blue light. Furthermore, cool wind from an air-conditioner may blow from behind the consumer. The consumer may feel the urge to purchase the advertised product while watching such advertisement media.
In a case where the consumer does not prefer a sensory effect, pictures without the color correction effect intended by the contents provider are displayed in the display device. That is, pictures according to the color characteristics of the consumer's display device are displayed, and the advertisement effect for the consumer may be reduced.
In Table 40, the USP metadata when a consumer does not prefer a sensory effect is shown in an XML instance form. Table 40 describes that the original color correction effect, the main illumination and peripheral illumination (LED) effect, and the temperature and wind control effect are not used.
As described above, the method for providing metadata for a sensory effect and the method for reproducing the sensory effect in accordance with the present invention can be written using computer programs. Codes and code segments constituting the programs may be easily construed by computer programmers skilled in the art to which the invention pertains. Furthermore, the created programs may be stored in computer-readable recording media or data storage media and may be read out and executed by the computers. Examples of the computer-readable recording media include any computer-readable recording media, e.g., intangible media such as carrier waves, as well as tangible media such as a CD or DVD.
A computer-readable medium in accordance with an embodiment of the present invention includes SEM including sensory effect information on contents. The sensory effect information includes color correction effect information on the contents.
A computer-readable medium in accordance with another embodiment of the present invention includes USP metadata including consumer preference information on a sensory effect. The preference information includes preference information on a color correction effect of contents.
A computer-readable medium in accordance with another embodiment of the present invention includes SDCap metadata including reproduction capability information on a sensory effect of a sensory reproduction apparatus. The reproduction capability information includes reproduction capability information on a color correction effect of contents.
A computer-readable medium in accordance with another embodiment of the present invention includes SDCmd metadata including control information on a sensory effect of a sensory reproduction apparatus. The control information on the sensory effect includes control information on a color correction effect of contents.
Hereinafter, an apparatus and method for implementing sensory effect metadata using binary representation of the sensory effect metadata will be described. The present invention provides a method for encoding/decoding the metadata in a binary representation form. In other words, each of the SEM, the USP metadata, the SDCap metadata and the SDCmd metadata may be generated using binary representation syntax.
<Control Information>
The control information describes base types of SDCap metadata and USP metadata, base attributes and binary representation syntax. Particularly, the control information describes binary representation syntax of color correction.
1. Sensory Device Capability Base Type
A. Syntax
B. Binary Representation Syntax
C. Semantics of the SensoryDeviceCapabilityBaseType
2. Sensory Device Capability Base Attributes
A. Syntax
B. Binary Representation Syntax
C. Semantics of the SensoryDeviceCapabilityBaseAttributes
In the binary description of “Location” described in
3. Color Correction Capability Type
A. Syntax
B. Binary Representation Syntax
C. Semantics
4. User Sensory Preference Base Type
A. Syntax
B. Binary Representation Syntax
C. Semantics of the UserSensoryPreferenceBaseType
5. User Sensory Preference Base Attributes
A. Syntax
B. Binary Representation Syntax
C. Semantics
6. Color Correction Preference Type
A. Syntax
B. Binary Representation Syntax
C. Semantics
<Sensory Information>
The sensory information describes attributes, base type, effect base type, parameter base type, color correction parameter base type and color correction effect with respect to SEM. Particularly, the sensory information describes binary representation syntax.
1. SEM Base Attributes
A. XML Representation Syntax
B. Binary Representation Syntax
C. Semantics
Semantics of the SEMBaseAttributes
In the binary description of “Location” described in
Semantics of the SEMAdaptabilityAttributes:
In the binary description of “adaptType” described in
2. SEM Base Type
A. XML Representation Syntax
B. Binary Representation Syntax
C. Semantics
Semantics of the SEMBaseType
3. EffectBaseType
A. XML Representation Syntax
B. Binary Representation Syntax
C. Semantics
Semantics of the EffectBaseType
In the binary description of “autoExtraction” described in
Semantics of the SupplimentalInformationType
In the binary description of “Operator” described in
4. Parameter Base Type
A. XML Representation Syntax
B. Binary Representation Syntax
C. Semantics of the ParameterBaseType
5. Color Correction Parameter Type
A. XML Representation Syntax
B. Binary Representation Syntax
C. Semantics
Semantics of the ColorCorrectionParameterType
Semantics of the ToneReproductionCurvesType
Semantics of the ConversionLUTType
The way of describing the values in the binary representation is in the order of [Rx, Gx, Bx; Ry, Gy, By; Rz, Gz, Bz].
The way of describing the values in the binary representation is in the order of [Gainr, Gaing, Gainb; Offsetr, Offsetg, Offsetb; Gammar, Gammag, Gammab].
The way of describing the values in the binary representation is in the order of [Rx′, Gx′, Bx′; Ry′, Gy′, By′; Rz′, Gz′, Bz′].
Semantics of the illuminantType
In the binary description of “ElementType” described in
Semantics of the InputDeviceColorGamutType
The way of describing the values in the binary representation is in the order of [xr, yr, xg, yg, xb, yb].
6. Color Correction Effect
A. XML Representation Syntax
B. Binary Representation Syntax
C. Semantics
Semantics of the ColorCorrectionType
<Mnemonics>
The following mnemonics are defined to describe another data type used in an encoded bit stream.
<Data Formats for Device Information>
Here, a data format for interaction between apparatuses is described.
1. Device Command Base Type
A. Syntax
B. Binary Representation Syntax
C. Semantics
Semantics of the DeviceCommandBaseType
2. Device Command Base Attributes
A. Syntax
B. Binary Representation Syntax
C. Semantics
Semantics of the DeviceCmBaseAttributes
3. Color correction type
A. Syntax
B. Binary Representation Syntax
C. Semantics
Semantics of the ColorCorrectionType
4. Initialize Color Correction Parameter Type
This command transmits a parameter for supporting a color correction effect of a sensory reproduction apparatus.
A. Syntax
B. Binary Representation Syntax
C. Semantics
Semantics of the ColorCorrectionType
<Common Types>
A. Syntax
B. Binary Representation Syntax
C. Semantics
Semantics of the Basic Datatypes
The way of describing the value in the binary representation is in the order of [Rx, Gx, Bx; Ry, Gy, By; Rz, Gz, Bz].
The way of describing the values in the binary representation is in the order of [Gainr, Gaing, Gainb; Offsetr, Offsetg, Offsetb; Gammar, Gammag, Gammab].
The way of describing the values in the binary representation is in the order of [Rx′, Gx′, Bx′; Ry′, Gy′, By′; Rz′, Gz′, Bz′].
The way of describing the values in the binary representation is in the order of [xr, yr, xg, yg, xb, yb].
While the present invention has been described with respect to the specific embodiments, it will be apparent to those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention as defined in the following claims.
Number | Date | Country | Kind |
---|---|---|---|
10-2009-0032566 | Apr 2009 | KR | national |
Number | Date | Country | |
---|---|---|---|
Parent | PCT/KR2010/002362 | Apr 2010 | US |
Child | 13275045 | US |