This application is a U.S. National Phase application of International Application No. PCT/KR2011/005379, filed on Jul. 21, 2011, which claims the priority benefit of Korean Patent Application No. 10-2010-0070667, filed on Jul. 21, 2010, in the Korean Intellectual Property Office, the disclosures of which are incorporated herein by reference.
1. Field
Example embodiments of the following disclosure relate to a data transmission apparatus and method, and more particularly, to an apparatus and method for transmitting information on a virtual object of a virtual world.
2. Description of the Related Art
Interest in experience-type games has been increasing in the video gaming market. MICROSOFT CORPORATION introduced PROJECT NATAL at the "E3 2009" press conference. PROJECT NATAL may provide a user body motion capturing function, a face recognition function, and a voice recognition function by combining MICROSOFT's XBOX 360 game console with a separate sensor device including a depth/color camera and a microphone array, thereby enabling a user to interact with a virtual world without a dedicated controller. In addition, SONY CORPORATION introduced WAND, an experience-type game motion controller. The WAND enables interaction with a virtual world through input of a motion trajectory of a controller by applying, to the SONY PLAYSTATION 3 game console, a location/direction sensing technology obtained by combining a color camera, a marker, and an ultrasonic sensor.
The real world and a virtual world may interact with each other in one of two directions. In one direction, data obtained by a sensor in the real world may be reflected in the virtual world. In the other direction, data obtained from the virtual world may be reflected in the real world using an actuator, for example.
Meanwhile, with developments in technologies related to the virtual world, research on interaction between virtual worlds is actively being conducted. For such interaction, a method of transmitting information on one virtual world to another virtual world is required.
Accordingly, example embodiments suggest an apparatus and method for transmitting data related to a virtual world for exchange of information between virtual worlds.
Additional aspects and/or advantages will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the invention.
According to an aspect of the present disclosure, there is provided a data transmission apparatus that transmits data from a first virtual world to a second virtual world, the data transmission apparatus including an encoder to encode information on the first virtual world into first metadata; and a transmission unit to transmit the first metadata to the second virtual world.
According to another aspect of the present disclosure, there is provided a data transmission method to transmit data from a first virtual world to a second virtual world, the data transmission method including encoding information on the first virtual world into first metadata; and transmitting the first metadata to the second virtual world.
According to another aspect of the present disclosure, there is provided a system for transmitting data between a first virtual world and a second virtual world, the system including: a first data transmission apparatus to encode information relating to the first virtual world, and to transmit the encoded information to a second data transmission apparatus; and the second data transmission apparatus to receive the encoded information from the first data transmission apparatus, and to decode the received encoded information, wherein the decoded information is applied to the second virtual world.
According to example embodiments, information exchange between virtual worlds is implemented by encoding information on a virtual world into metadata and transmitting the metadata to another virtual world.
These and/or other aspects and advantages will become apparent and more readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
Reference will now be made in detail to embodiments of the present disclosure, examples of which are illustrated in the accompanying drawings. However, the embodiments are not limiting. In the drawings, like reference numerals refer to like elements throughout.
Referring to FIG. 1, a first virtual world 110 may include a car 130 and a musical instrument 140, which are objects of the first virtual world 110.
The car 130, as an object of the virtual world 110, may include information 131 relating to the car 130. The information 131 may include sound information of an engine, a horn, and a brake pedal, and scent information of gasoline.
The musical instrument 140, as an object of the virtual world 110, may include information 141 relating to the musical instrument 140. The information 141 may include sound information, such as a, b, and c, owner information, such as George Michael, and price information, such as $5.
The data transmission apparatus according to example embodiments may migrate a virtual object from a certain virtual world to another virtual world.
Depending on embodiments, the data transmission apparatus with respect to the first virtual world 110 may transmit information relating to the first virtual world 110 to a second virtual world 120. For example, the data transmission apparatus with respect to the first virtual world 110 may transmit, to the second virtual world 120, the information 131 and 141 related to the car 130 and the musical instrument 140, respectively, which are the objects implemented in the first virtual world 110.
Referring to FIG. 2, a data transmission apparatus 210 with respect to a first virtual world 201 may include an encoder 211 and a transmission unit 212, and a data transmission apparatus 220 with respect to a second virtual world 202 may include an encoder 221 and a transmission unit 222.
The data transmission apparatus 210 may transmit data from a first virtual world 201 to a second virtual world 202. The data transmission apparatus 220 may transmit data from the second virtual world 202 to the first virtual world 201.
The encoder 211 may encode information on the first virtual world 201 into first metadata.
The virtual worlds 201 and 202 may be classified into a virtual environment and a virtual world object. The virtual world object may characterize various types of objects in the virtual environment. In addition, the virtual world object may provide interaction in the virtual environment.
The virtual world object may include an avatar and a virtual object. The avatar may be used as a representation of a user in the virtual environment. The avatar and the virtual object will be described in detail later.
Information on the virtual worlds 201 and 202 may include at least one of information on the avatar of the virtual worlds 201 and 202 and information on the virtual object of the virtual worlds 201 and 202.
The encoder 211 may generate the first metadata by encoding the information relating to the first virtual world 201 into a binary format, for example. In addition, the encoder 211 may generate the first metadata by encoding the information on the first virtual world 201 into the binary format and losslessly compressing the binary format information.
Depending on embodiments, the encoder 211 may generate the first metadata by encoding the information on the first virtual world 201 into an XML format and encoding the XML format information into a binary format, for example. In addition, the encoder 211 may generate the first metadata by encoding the information on the first virtual world 201 into an XML format, encoding the XML format information into a binary format, and losslessly compressing the binary format information.
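As a purely illustrative sketch of this two-stage encoding (the element names below are assumptions and are not taken from the sources described later), the XML stage might serialize information on the first virtual world 201 as follows; the binary encoder would then map this structure onto the binary representation syntax, and the lossless compression would be applied to the resulting bitstream:

    <!-- Hypothetical XML-stage output; all names are assumed -->
    <VirtualWorldInformation>
      <VirtualObjectList>
        <VirtualObject id="ObjectID1">
          <SoundList>
            <Sound soundID="SoundID1" intensity="60"/>
          </SoundList>
        </VirtualObject>
      </VirtualObjectList>
    </VirtualWorldInformation>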
The transmission unit 212 may transmit the first metadata encoded by the encoder 211 to the second virtual world 202. Depending on embodiments, the transmission unit 212 may transmit the encoded first metadata to the data transmission apparatus 220 that corresponds to the second virtual world 202.
A receiving unit 223 of the data transmission apparatus 220 may receive the first metadata from the transmission unit 212 of the first data transmission apparatus 210. A decoder 224 may recover the information on the first virtual world 201 by decoding the received first metadata.
According to an aspect, the data transmission apparatus 210 may further include a receiving unit 213 and a decoder 214.
The encoder 221 of the data transmission apparatus 220 may encode the information on the second virtual world 202 into second metadata.
According to an aspect, the encoder 221 may generate the second metadata by encoding the information relating to the second virtual world 202 into a binary format, for example. In addition, the encoder 221 may generate the second metadata by encoding the information relating to the second virtual world 202 into a binary format and losslessly compressing the binary format information.
Depending on embodiments, the encoder 221 may generate the second metadata by encoding the information on the second virtual world 202 into an XML format and encoding the XML format information into a binary format. In addition, the encoder 221 may generate the second metadata by encoding the information on the second virtual world 202 into an XML format, encoding the XML format information into a binary format, and losslessly compressing the binary format information.
The transmission unit 222 may transmit the second metadata encoded by the encoder 221 to the first virtual world 201. Depending on embodiments, the transmission unit 222 may transmit the encoded second metadata to the data transmission apparatus 210, which corresponds to the first virtual world 201.
The receiving unit 213 may receive the second metadata, which is encoded from the information on the second virtual world 202.
The decoder 214 may decode the second metadata received by the receiving unit 213. The decoder 214 may recover the information on the second virtual world 202 by decoding the second metadata.
Hereinafter, an encoder to encode information relating to a virtual world and a decoder to decode the encoded information will be described in detail with reference to FIGS. 3 and 4.
Referring to FIG. 3, a first data transmission apparatus may include an XML encoder 311 and a binary encoder 312, and a second data transmission apparatus may include a binary decoder 321 and an XML decoder 322.
The XML encoder 311 may encode information relating to a first virtual world into an XML format, for example. The binary encoder 312 may encode the XML format information into a binary format, for example.
When the information encoded into the binary format is transmitted to the second data transmission apparatus, the binary decoder 321 may decode the transmitted information. In addition, the XML decoder 322 may recover the information on the first virtual world, by decoding the information decoded by the binary decoder 321.
Referring to FIG. 4, a first data transmission apparatus may include a binary encoder 411 and a lossless compression unit 412, and a second data transmission apparatus may include a data recovery unit 421 and a binary decoder 422.
The binary encoder 411 may encode information on a first virtual world into a binary format. The lossless compression unit 412 may losslessly compress the information encoded into the binary format.
When the compressed information is transmitted to the second data transmission apparatus, the data recovery unit 421 may recover the transmitted information. In addition, the binary decoder 422 may recover the information on the first virtual world, by decoding the information recovered by the data recovery unit 421.
The data transmission apparatus, according to the example embodiments, may control interoperability between a virtual world and a real world or between virtual worlds.
Here, the virtual world may be classified into a virtual environment and a virtual world object.
The virtual world object may characterize various types of objects in the virtual environment. In addition, the virtual world object may provide interaction in the virtual environment.
The virtual world object may include an avatar and a virtual object. The avatar may be used as a representation of a user in the virtual environment. These described virtual world objects are exemplary, and thus, the present disclosure is not limited thereto.
Hereinafter, the virtual world object will be described in detail with reference to FIGS. 5 through 12.
Referring to FIG. 5, a virtual world object (VWO) base type 510 may include attributes 520 and characteristics 530, 540, and 550.
The attributes 520 and the characteristics 530, 540, and 550 of the VWO base type 510 may be shared by both the avatar and the virtual object. That is, the VWO base type 510 may be inherited as avatar metadata and virtual object metadata to extend particular aspects of the respective metadata. The virtual object metadata, which is a representation of the virtual object in the virtual environment, may characterize various types of objects in the virtual environment. In addition, the virtual object metadata may provide interaction between the virtual object and the avatar. Furthermore, the virtual object metadata may provide interaction with a virtual environment, however, the present disclosure is not limited thereto.
The VWO may include a root element. The root element may include an avatar list and a virtual object list.
Depending on embodiments, the root element may be expressed using an extensible markup language (XML) as shown in Source 1. However, a program source shown in Source 1 is not limiting and is only an example embodiment.
Table 1-2 shows binary representation syntax corresponding to the root element, according to the example embodiments.
Table 1-3 shows descriptor components semantics of the root element, according to the example embodiments.
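Because Source 1 and Tables 1-2 and 1-3 are not reproduced here, the following is only a hypothetical sketch of a root element carrying the two lists; the actual element names are those defined in Source 1:

    <!-- Hypothetical root element with an avatar list and a virtual object list -->
    <VWOInfo>
      <AvatarList>
        <Avatar id="AvatarID1"/>
      </AvatarList>
      <VirtualObjectList>
        <VirtualObject id="ObjectID1"/>
      </VirtualObjectList>
    </VWOInfo>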
Depending on embodiments, the VWO base type 510 may be expressed using the XML as shown in Source 1-4 below. However, a program source shown in Source 1-4 is not limiting and is only an example embodiment.
The attributes 520 may include id 521.
The id 521 may refer to a unique identifier for identifying an individual piece of VWO information.
The VWO base type 510 may include characteristics such as identification 530, VWO characteristics (VWOC) 540, and behavior model list 550.
The identification 530 refers to an identification of the VWO.
The VWOC 540 refers to a set of the characteristics of the VWO. According to example embodiments, the VWOC 540 may include a sound list 541, a scent list 542, a control list 543, and an event list 544, however, the present disclosure is not limited thereto. The sound list 541 may refer to a list of sound effects related to the VWO. The scent list 542 may refer to a list of scent effects related to the VWO. The control list 543 may refer to a list of controls related to the VWO. The event list 544 may refer to a list of input events related to the VWO.
The behavior model list 550 refers to a list of behavior models related to the VWO.
Example 1-5 below shows an example of the VWO base type 510. However, Example 1-5 is not limiting and is only an example embodiment.
Table 1-6 shows binary representation syntax corresponding to the VWO base type 510, according to the example embodiments.
Table 1-7 shows descriptor components semantics of a VWO base type 510, according to the example embodiments.
Referring to Table 1-7, flags may express whether the characteristics with respect to the respective VWOs of the virtual world are used, the characteristics including the identification 530, the VWOC 540 including the sound list 541, the scent list 542, the control list 543, and the event list 544, and the behavior model list 550.
For example, when the sound list 541 with respect to a given VWO of the virtual world is used, a sound list flag may have a value of "1." When the sound list 541 is not used, the sound list flag may have a value of "0."
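To illustrate, a hypothetical VWO instance that uses only the sound list 541 and the event list 544 might be serialized as follows (names are assumptions, not quotations from Source 1-4 or Table 1-6); in the binary representation, only the sound list flag and the event list flag would then be "1":

    <!-- Hypothetical VWO instance; unused characteristics are omitted,
         so their flags would be "0" in the binary representation -->
    <VWO id="ObjectID1">
      <VWOC>
        <SoundList>
          <Sound soundID="SoundID1"/>
        </SoundList>
        <EventList>
          <Event eventID="EventID1"/>
        </EventList>
      </VWOC>
    </VWO>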
Referring to FIG. 6, an identification type 610 may include attributes 620 and elements 631, 632, 633, and 634.
The identification type 610 may refer to identification of the VWO.
The attributes 620 may include at least a name 621 and a family 622, however, the present disclosure is not limited thereto.
The name 621 may refer to a name of the VWO.
The family 622 may refer to a relationship of the VWO with another object.
The identification type 610, according to the example embodiments, may include at least user ID 631, ownership 632, right 633, and credits 634, however, the present disclosure is not limited thereto.
The user ID 631 may contain a user identifier related to the VWO.
The ownership 632 may refer to ownership of the VWO.
The right 633 may refer to a right of the VWO.
The credits 634 may refer to chronological contributors of the VWO.
Depending on embodiments, the identification type 610 may be expressed using the XML, for example, as shown below in Source 2. However, a program source shown in Source 2 is not limiting and is only an example embodiment.
Table 2 shows binary representation syntax corresponding to the identification type 610, according to the example embodiments.
Table 2-2 shows descriptor components semantics of the identification type 610, according to the example embodiments.
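For illustration only, an identification instance carrying the elements described above might be sketched as follows, with element names assumed rather than taken from Source 2:

    <!-- Hypothetical identification instance -->
    <Identification name="MyGuitar" family="instrument">
      <UserID>user-0001</UserID>
      <Ownership>George Michael</Ownership>
      <Right>transferable</Right>
      <Credits>created by user-0001; modified by user-0002</Credits>
    </Identification>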
According to an aspect of the present disclosure, descriptor components starting with “Num” among the descriptor components related to the VWO may represent a number of pieces of information included in components following “Num.” For example, NumVWOHapticPropertyType may be a field representing a number of haptic property types included in VWOHapticPropertyType.
Referring to FIG. 6, a VWO sound list type 640 may include sound 641.
The VWO sound list type 640 may refer to a data format of the sound list 541 of FIG. 5.
The VWO sound list type 640 may refer to a wrapper element type that allows multiple occurrences of sound effects related to the VWO.
The sound 641 may refer to a sound effect related to the VWO.
Depending on embodiments, the VWO sound list type 640 may be expressed using the XML, for example, as shown below in Source 3. However, a program source shown in Source 3 is not limiting and is only an example embodiment.
Table 3 shows binary representation syntax corresponding to the VWO sound list type 640, according to the example embodiments.
Table 3-2 shows descriptor components semantics of the VWO sound list type 640, according to the example embodiments.
Referring to FIG. 6, a VWO scent list type 650 may include scent 651.
The VWO scent list type 650 may refer to a data format of the scent list 542 of FIG. 5.
The VWO scent list type 650 may refer to a wrapper element type that allows multiple occurrences of scent effects related to the VWO, however, the present disclosure is not limited thereto.
The scent 651 may refer to a scent effect related to the VWO.
Depending on embodiments, the VWO scent list type 650 may be expressed using the XML, for example, as shown below in Source 4. However, a program source shown in Source 4 is not limiting and is only an example embodiment.
Table 4 shows binary representation syntax corresponding to the VWO scent list type 650, according to the example embodiments.
Table 4-2 shows descriptor components semantics of the VWO scent list type 650, according to the example embodiments.
Referring to FIG. 6, a VWO control list type 660 may include control 661.
The VWO control list type 660 may refer to a data format of the control list 543 of FIG. 5.
The VWO control list type 660 may refer to a wrapper element type that allows multiple occurrences of control related to the VWO.
The control 661 may refer to the control related to the VWO.
Depending on embodiments, the VWO control list type 660 may be expressed using the XML, for example, as shown below in Source 5. However, a program source shown in Source 5 is not limiting and is only an example embodiment.
Table 5 shows binary representation syntax corresponding to the VWO control list type 660, according to the example embodiments.
Table 5-2 shows descriptor components semantics of the VWO control list type 660, according to the example embodiments.
Referring to FIG. 6, a VWO event list type 670 may include event 671.
The VWO event list type 670 may refer to a data format of the event list 544 of FIG. 5.
The VWO event list type 670 may refer to a wrapper element type that allows multiple occurrences of input events related to the VWO.
The event 671 may refer to the input events related to the VWO.
Depending on embodiments, the VWO event list type 670 may be expressed using the XML, for example, as shown below in Source 6. However, a program source shown in Source 6 is not limiting and is only an example embodiment.
Table 6 shows binary representation syntax corresponding to the VWO event list type 670, according to the example embodiments.
Table 6-2 shows descriptor components semantics of the VWO event list type 670, according to the example embodiments.
Referring to FIG. 6, a VWO behavior model list type 680 may include behavior model 681.
The VWO behavior model list type 680 may refer to a data format of the behavior model list 550 of FIG. 5.
The VWO behavior model list type 680 may refer to a wrapper element type that allows multiple occurrences of input behavior models related to the VWO.
The behavior model 681 may refer to the input behavior models related to the VWO.
Depending on embodiments, the VWO behavior model list type 680 may be expressed using the XML, for example, as shown below in Source 7. However, a program source shown in Source 7 is not limiting and is only an example embodiment.
Table 7 shows binary representation syntax corresponding to the VWO behavior model list type 680, according to the example embodiments.
Table 7-2 shows descriptor components semantics of the VWO behavior model list type 680, according to the example embodiments.
According to an aspect of the present disclosure, the VWO base type 510 may further include characteristics of a haptic property list.
The haptic property list type may refer to a data structure of the haptic property list.
Depending on embodiments, the haptic property list type may be expressed using the XML, for example, as shown below in Source 7-3. However, a program source shown in Source 7-3 is not limiting and is only an example embodiment.
Table 7-4 shows binary representation syntax corresponding to the haptic property list type, according to the example embodiments.
Table 7-5 shows descriptor components semantics of the haptic property list type, according to the example embodiments, however, the present disclosure is not limited thereto.
Referring to FIG. 7, a VWO sound type 710 may include attributes 720 and resources URL 730.
The VWO sound type 710 may refer to type information of sound effects related to the VWO.
Depending on embodiments, the VWO sound type 710 may be expressed using the XML, for example, as shown below in Source 8. However, a program source shown in Source 8 is not limiting and is only an example embodiment.
The attributes 720 may include sound ID 721, intensity 722, duration 723, loop 724, and name 725, however, the present disclosure is not limited thereto.
The sound ID 721 may refer to a unique identifier of an object sound.
The intensity 722 may refer to an intensity of the sound.
The duration 723 may refer to a length of duration of the sound.
The loop 724 may refer to a number of repetitions of the sound.
The name 725 may refer to a name of the sound.
The resources URL 730 may include a link related to a sound file. The sound file may be in the form of an MP4 file, for example, however, the present disclosure is not limited thereto, and thus, the sound file may be in other forms or formats.
Example 2 shows an example of the VWO sound type 710. However, Example 2 is not limiting and is only an example embodiment.
Referring to Example 2, a sound resource having "BigAlarm" as the name 725 is stored at "http://sounddb.com/alarmsound_0001.wav", and the sound ID 721 is "SoundID3." The duration 723 of the sound of Example 2 may be 30 seconds, and the intensity 722 of the sound may be 50%.
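Although Example 2 itself is not reproduced here, its description suggests an instance along the following hypothetical lines, in which the attribute names are assumptions based on the attributes 721 to 725:

    <!-- Hypothetical rendering of Example 2 -->
    <Sound soundID="SoundID3" name="BigAlarm" duration="30" intensity="50">
      <ResourcesURL>http://sounddb.com/alarmsound_0001.wav</ResourcesURL>
    </Sound>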
Table 8 shows binary representation syntax corresponding to the VWO sound type 710, according to the example embodiments.
Table 8-2 shows descriptor components semantics of the VWO sound type 710, according to the example embodiments.
Referring to FIG. 8, a VWO scent type 810 may include attributes 820 and resources URL 830.
The VWO scent type 810 may refer to type information of scent effects related to the VWO.
Depending on embodiments, the VWO scent type 810 may be expressed using the XML, for example, as shown below in Source 9. However, a program source shown in Source 9 is not limiting and is only an example embodiment.
The attributes 820 may include scent ID 821, intensity 822, duration 823, loop 824, and name 825.
The scent ID 821 may refer to a unique identifier of an object scent.
The intensity 822 may refer to an intensity of the scent.
The duration 823 may refer to a length of duration of the scent.
The loop 824 may refer to a number of repetitions of the scent.
The name 825 may refer to a name of the scent.
The resources URL 830 may include a link related to a scent file.
Example 3 shows an example of the VWO scent type 810. However, Example 3 is not limiting and is only an example embodiment of the VWO scent type 810.
Table 9 shows example binary representation syntax corresponding to the VWO scent type 810, according to the example embodiments.
Table 9-2 shows example descriptor components semantics of the VWO scent type 810, according to the example embodiments.
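In the same spirit as Example 3, a scent instance might be sketched as follows; the element names and the resource URL below are assumptions for illustration only:

    <!-- Hypothetical VWO scent instance -->
    <Scent scentID="ScentID1" name="Gasoline" duration="10" intensity="30">
      <ResourcesURL>http://scentdb.com/gasoline_0001.sct</ResourcesURL>
    </Scent>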
Referring to FIG. 9, a VWO control type 910 may include attributes 920 and motion feature control 930.
The VWO control type 910 may refer to type information of control related to the VWO.
Depending on embodiments, the VWO control type 910 may be expressed using the XML, for example, as shown below in Source 10. However, a program source shown in Source 10 is not limiting and is only an example embodiment.
The attributes 920 may include control ID 921.
The control ID 921 may refer to a unique identifier of the control.
The motion feature control 930 may refer to a set of elements controlling a position, an orientation, and a scale of the virtual object. According to example embodiments, the motion feature control 930 may include elements including a position 941, an orientation 942, and a scale factor 943.
The position 941 may refer to a position of the object in a scene. As a non-limiting example, the position 941 may be expressed as a 3-dimensional (3D) floating point vector (x, y, z).
The orientation 942 may refer to an orientation of the object in the scene. Depending on embodiments, the orientation 942 may be expressed as a 3D floating point vector (yaw, pitch, roll) using Euler angles.
The scale factor 943 may refer to a scale of the object in the scene. Depending on embodiments, the scale factor 943 may be expressed as a 3D floating point vector (Sx, Sy, Sz).
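For illustration (element names are assumptions, since Source 10 is not reproduced here), a control instance carrying the three motion feature elements might be sketched as:

    <!-- Hypothetical VWO control instance; the three vectors are
         (x, y, z), (yaw, pitch, roll), and (Sx, Sy, Sz), respectively -->
    <Control controlID="ControlID1">
      <MotionFeatureControl>
        <Position>1.0 0.0 -2.5</Position>
        <Orientation>0.0 90.0 0.0</Orientation>
        <ScaleFactor>1.0 1.0 1.0</ScaleFactor>
      </MotionFeatureControl>
    </Control>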
Table 10 shows example binary representation syntax corresponding to the VWO control type 910, according to the example embodiments.
Table 10-2 shows example descriptor components semantics of the VWO control type 910, according to the example embodiments.
Referring to FIG. 10, a VWO event type 1010 may include attributes 1020 and elements 1031, 1032, 1033, and 1034.
The VWO event type 1010 may refer to type information of an event related to the VWO.
Depending on embodiments, the VWO event type 1010 may be expressed using the XML, for example, as shown below in Source 11. However, a program source shown in Source 11 is not limiting and is only an example embodiment.
The attributes 1020 may include event ID 1021.
The event ID 1021 may refer to a unique identifier of an event.
The VWO event type 1010, according to the example embodiments, may include elements including mouse 1031, keyboard 1032, sensor input 1033, and user defined input 1034.
The mouse 1031 may refer to a mouse event. That is, the mouse 1031 may refer to an event occurring based on an input by manipulation of a mouse. For example, the mouse 1031 may include elements shown in Table 11.
The keyboard 1032 refers to a keyboard event. That is, the keyboard 1032 may refer to an event generated based on an input by manipulation of a keyboard. For example, the keyboard 1032 may include elements shown in Table 11-2.
The sensor input 1033 may refer to a sensor input. That is, the sensor input 1033 may refer to an event occurring based on an input by manipulation of a sensor.
The user defined input 1034 may refer to an input event defined by a user.
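A hypothetical event instance describing a mouse click might be sketched as follows; the element names and the admissible values are assumptions, the actual values being listed in Tables 11 and 11-2:

    <!-- Hypothetical VWO event instance for a mouse event -->
    <Event eventID="EventID1">
      <Mouse>click</Mouse>
    </Event>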
Table 11-3 shows example binary representation syntax corresponding to the VWO event type 1010, according to the example embodiments.
Table 11-4 shows example descriptor components semantics of the VWO event type 1010, according to the example embodiments.
Referring to FIG. 11, a VWO behavior model type 1110 may include a behavior input 1120 and a behavior output 1130.
The VWO behavior model type 1110 may refer to type information of a behavior model related to the VWO.
Depending on embodiments, the VWO behavior model type 1110 may be expressed using the XML, for example, as shown below in Source 12. However, a program source shown in Source 12 is not limiting and is only an example embodiment.
The behavior input 1120 may refer to an input event for making an object behavior. Depending on embodiments, the behavior input 1120 may include attributes 1121.
The attributes 1121 may include event ID reference 1122, however, the present disclosure is not limited thereto. The event ID reference 1122 may refer to a unique identifier of the input event.
The behavior output 1130 may refer to an output of the object behavior corresponding to the input event. For example, the behavior output 1130 may include attributes 1131.
The attributes 1131 may include sound ID reference 1132, scent ID reference 1133, animation ID reference 1134, and control ID reference 1135, however, the present disclosure is not limited thereto.
The sound ID reference 1132 may reference a sound ID to provide sound effects of the object.
The scent ID reference 1133 may reference a scent ID to provide scent effects of the object.
The animation ID reference 1134 may reference an animation ID to provide an animation clip of the object.
The control ID reference 1135 may reference a control ID to provide control of the object.
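Combining the identifiers above, a hypothetical behavior model might specify that a given input event triggers a sound and an animation clip; all names below are assumptions:

    <!-- Hypothetical behavior model: the input event triggers
         a sound effect and an animation clip of the object -->
    <BehaviorModel>
      <BehaviorInput eventIDRef="EventID1"/>
      <BehaviorOutput soundIDRef="SoundID3" animationIDRef="AnimationID1"/>
    </BehaviorModel>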
Table 12 shows example binary representation syntax corresponding to the VWO behavior model type 1110, according to the example embodiments.
Table 12-2 shows example descriptor components semantics of the VWO behavior model type 1110, according to the example embodiments.
The VWO according to the example embodiments may include a common data type for the avatar metadata and the virtual object metadata. The common data type may be used as a basic building block. The common data type may include a haptic property type, a description type, an animation description type, an animation resources description type, and a common simple data type, however, the present disclosure is not limited thereto.
Hereinafter, the common data type will be described in detail with reference to FIG. 12.
Referring to FIG. 12, a VWO haptic property type 1210 may include attributes 1220.
The VWO haptic property type 1210 may refer to type information related to a haptic property of the VWO.
Depending on embodiments, the VWO haptic property type 1210 may be expressed using the XML, for example, as shown below in Source 13. However, a program source shown in Source 13 is not limiting and is only an example embodiment.
The attributes 1220 may include haptic ID 1221.
The haptic ID 1221 may refer to a unique identifier of the haptic property.
The VWO haptic property type 1210 may include a material property type 1230, a dynamic force effect type 1240, and a tactile type 1250.
The material property type 1230 may include parameters characterizing properties of a material.
The dynamic force effect type 1240 may include parameters characterizing force effects.
The tactile type 1250 may include parameters characterizing tactile properties.
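As an illustrative sketch only (Source 13 defines the actual structure), a haptic property instance carrying the three parameter groups might look like the following, in which the parameter names are assumptions:

    <!-- Hypothetical VWO haptic property instance; parameter names are assumed -->
    <HapticProperty hapticID="HapticID1">
      <MaterialProperty stiffness="0.8" staticFriction="0.3"/>
      <DynamicForceEffect mass="2.0"/>
      <Tactile temperature="36.5" vibration="0.5"/>
    </HapticProperty>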
Table 13 shows example binary representation syntax corresponding to the VWO haptic property type 1210, according to the example embodiments.
Table 13-2 shows descriptor components semantics of the VWO haptic property type 1210, according to the example embodiments.
According to an aspect of the present disclosure, the common simple data type may be expressed using the XML, for example, as shown below in Source 13-3. However, a program source shown in Source 13-3 is not limiting and is only an example embodiment.
Table 13-4 shows example binary representation syntax corresponding to the common simple data type, according to the example embodiments.
Table 13-5 shows example descriptor components semantics of the common simple data type, according to the example embodiments.
According to an aspect of the present disclosure, the simple data type may include an indicate of LH type, an indicate of LMH type, an indicate of SMB type, an indicate of SML type, an indicate of DMU type, an indicate of DU type, an indicate of PMN type, an indicate of RC type, an indicate of LR type, an indicate of LMR type, a measure unit of LMH type, a measure unit of SMB type, a level of 5 type, an angle type, a percentage type, an unlimited percentage type, and a point type.
The indicate of LH type represents whether a value is high or low.
Depending on embodiments, the indicate of LH type may be expressed using the XML, for example, as shown below in Source 20. However, a program source shown in Source 20 is not limiting and is only an example embodiment.
The indicate of LMH type represents whether the value is high, medium, or low.
Depending on embodiments, the indicate of LMH type may be expressed using the XML as shown in Source 21. However, a program source shown in Source 21 is not limiting and is only an example embodiment.
The indicate of SMB type represents whether the value is small, medium, or big.
Depending on embodiments, the indicate of SMB type may be expressed using the XML, for example, as shown below in Source 22. However, a program source shown in Source 22 is not limiting and is only an example embodiment.
The indicate of SML type represents whether the value is short, medium, or long.
Depending on embodiments, the indicate of SML type may be expressed using the XML, for example, as shown in Source 23. However, a program source shown in Source 23 is not limiting and is only an example embodiment.
The indicate of DMU type represents whether the value is down, medium, or up.
Depending on embodiments, the indicate of DMU type may be expressed using the XML, for example, as shown below in Source 24. However, a program source shown in Source 24 is not limiting and is only an example embodiment.
The indicate of DU type represents whether the value is down or up.
Depending on embodiments, the indicate of DU type may be expressed using the XML, for example, as shown below in Source 25. However, a program source shown in Source 25 is not limiting and is only an example embodiment.
The indicate of PMN type represents whether the value is pointed, middle, or not pointed.
Depending on embodiments, the indicate of PMN type may be expressed using the XML, for example, as shown below in Source 26. However, a program source shown in Source 26 is not limiting and is only an example embodiment.
The indicate of RC type represents whether the value is round or cleft.
Depending on embodiments, the indicate of RC type may be expressed using the XML, for example, as shown below in Source 27. However, a program source shown in Source 27 is not limiting and is only an example embodiment.
The indicate of LR type represents whether the value is left or right.
Depending on embodiments, the indicate of LR type may be expressed using the XML, for example, as shown below in Source 28. However, a program source shown in Source 28 is not limiting and is only an example embodiment.
The indicate of LMR type represents whether the value is left, middle, or right.
Depending on embodiments, the indicate of LMR type may be expressed using the XML, for example, as shown below in Source 29. However, a program source shown in Source 29 is not limiting and is only an example embodiment.
The measure unit of LMH type refers to the indicate of LMH type or float.
Depending on embodiments, the measure unit of LMH type may be expressed using the XML, for example, as shown below in Source 30. However, a program source shown in Source 30 is not limiting and is only an example embodiment.
The measure unit of SMB type refers to the indicate of SMB type or float.
Depending on embodiments, the measure unit of SMB type may be expressed using the XML, for example, as shown below in Source 31. However, a program source shown in Source 31 is not limiting and is only an example embodiment.
The level of 5 type refers to a type of an integer value from 1 to 5.
Depending on embodiments, the level of 5 type may be expressed using the XML, for example, as shown below in Source 32. However, a program source shown in Source 32 is not limiting and is only an example embodiment.
The angle type refers to a type of a floating value from 0 degrees to 360 degrees.
Depending on embodiments, the angle type may be expressed using the XML, for example, as shown below in Source 33. However, a program source shown in Source 33 is not limiting and is only an example embodiment.
The percentage type refers to a type of a floating value from 0% to 100%.
Depending on embodiments, the percentage type may be expressed using the XML, for example, as shown below in Source 34. However, a program source shown in Source 34 is not limiting and is only an example embodiment.
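For instance, assuming an XML schema with the conventional xsd prefix, a percentage type restricted to this range could be declared along the following hypothetical lines:

    <!-- Hypothetical schema sketch of the percentage type -->
    <xsd:simpleType name="percentageType">
      <xsd:restriction base="xsd:float">
        <xsd:minInclusive value="0"/>
        <xsd:maxInclusive value="100"/>
      </xsd:restriction>
    </xsd:simpleType>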
The unlimited percentage type refers to a type of a floating value of 0% or greater, with no upper limit.
Depending on embodiments, the unlimited percentage type may be expressed using the XML, for example, as shown below in Source 35. However, a program source shown in Source 35 is not limiting and is only an example embodiment.
The point type may provide roots related to two point types, that is, a logical point type and a physical 3D point type, which specify feature points for face feature control.
The logical point type provides names of the feature points.
The physical 3D point type provides 3D point vector values.
Depending on embodiments, the point type may be expressed using the XML, for example, as shown below in Source 36. However, a program source shown in Source 36 is not limiting and is only an example embodiment.
Hereinafter, the avatar will be described in detail with reference to FIGS. 13 through 18.
Referring to FIG. 13, an avatar type 1310 may include attributes 1320 and characteristics 1330 to 1380.
According to the example embodiments, the attributes 1320 may include a gender, which represents a gender of the avatar.
The characteristics 1330 to 1380 may include appearance 1330, animation 1340, communication skills 1350, personality 1360, control feature 1370, and haptic property list 1380.
The avatar type 1310 may extend a VWO base type 1390 and share attributes 1391 and characteristics 1392, 1393, and 1394 of the VWO base type 1390. Since the attributes 1391 and the characteristics 1392, 1393, and 1394 of the VWO base type 1390 have already been described in detail with reference to FIG. 5, a repeated description thereof will be omitted.
Depending on embodiments, the avatar type 1310 may be expressed using the XML, for example, as shown below in Source 37. However, a program source shown in Source 37 is not limiting and is only an example embodiment.
Table 37-4 shows example binary representation syntax corresponding to the avatar type 1310, according to the example embodiments.
Table 37-5 shows example descriptor components semantics of the avatar type 1310, according to the example embodiments.
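Purely as an illustration of how the avatar type 1310 extends the VWO base type 1390, an avatar instance might be sketched as follows; the element names are assumptions, not quotations from Source 37:

    <!-- Hypothetical avatar instance; the gender is an attribute, and the
         child elements correspond to the characteristics 1330 to 1380 -->
    <Avatar id="AvatarID1" gender="female">
      <Appearance/>
      <Animation/>
      <CommunicationSkills/>
      <Personality/>
      <ControlFeature/>
      <HapticPropertyList/>
    </Avatar>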
Referring to FIG. 14, an avatar appearance type 1410 may include elements 1411 to 1429.
The elements of the avatar appearance type 1410 may include body 1411, head 1412, eyes 1413, ears 1414, nose 1415, mouth lip 1416, skin 1417, facial 1418, nail 1419, body look 1420, hair 1421, eyebrows 1422, facial hair 1423, appearance resources 1424, facial calibration points 1425, physical condition 1426, clothes 1427, shoes 1428, and accessories 1429.
Depending on embodiments, the avatar appearance type 1410 may be expressed using the XML, for example, as shown below in Source 38. However, a program source shown in Source 38 is not limiting and is only an example embodiment.
Table 38-1 shows example binary representation syntax corresponding to the avatar appearance type 1410, according to the example embodiments.
Table 38-2 shows example descriptor components semantics of the avatar appearance type 1410, according to the example embodiments.
Depending on embodiments, the body 1411 of the avatar appearance type 1410 may be expressed using the XML, for example, as shown below in Source 39. However, a program source shown in Source 39 is not limiting and is only an example embodiment.
Table 39-1 shows example binary representation syntax corresponding to the body 1411 of the avatar appearance type 1410, according to the example embodiments.
Table 39-2 shows example descriptor components semantics of the body 1411 of the avatar appearance type 1410, according to the example embodiments.
Depending on embodiments, the head 1412 of the avatar appearance type 1410 may be expressed using the XML, for example, as shown below in Source 40. However, a program source shown in Source 40 is not limiting and is only an example embodiment.
Table 40-1 shows example binary representation syntax corresponding to the head 1412 of the avatar appearance type 1410, according to the example embodiments.
Table 40-2 shows example descriptor components semantics of the head 1412 of the avatar appearance type 1410, according to the example embodiments.
Depending on embodiments, the eyes 1413 of the avatar appearance type 1410 may be expressed using the XML, for example, as shown below in Source 41. However, a program source shown in Source 41 is not limiting and is only an example embodiment.
Table 41-1 shows example binary representation syntax corresponding to the eyes 1413 of the avatar appearance type 1410, according to the example embodiments.
Table 41-2 shows example descriptor components semantics of the eyes 1413 of the avatar appearance type 1410, according to the example embodiments.
Depending on embodiments, the ears 1414 of the avatar appearance type 1410 may be expressed using the XML, for example, as shown below in Source 42. However, a program source shown in Source 42 is not limiting and is only an example embodiment.
Table 42-1 shows example binary representation syntax corresponding to the ears 1414 of the avatar appearance type 1410, according to the example embodiments.
Table 42-2 shows example descriptor components semantics of the ears 1414 of the avatar appearance type 1410, according to the example embodiments.
Depending on embodiments, the nose 1415 of the avatar appearance type 1410 may be expressed using the XML, for example, as shown below in Source 43. However, a program source shown in Source 43 is not limiting and is only an example embodiment.
Table 43-1 shows example binary representation syntax corresponding to the nose 1415 of the avatar appearance type 1410, according to the example embodiments.
Table 43-2 shows example descriptor components semantics of the nose 1415 of the avatar appearance type 1410, according to the example embodiments.
Depending on embodiments, the mouth lip 1416 of the avatar appearance type 1410 may be expressed using the XML, for example, as shown below in Source 44. However, a program source shown in Source 44 is not limiting and is only an example embodiment.
Table 44-1 shows example binary representation syntax corresponding to the mouth lip 1416 of the avatar appearance type 1410, according to the example embodiments.
Table 44-2 shows example descriptor components semantics of the mouth lip 1416 of the avatar appearance type 1410, according to the example embodiments.
Depending on embodiments, the skin 1417 of the avatar appearance type 1410 may be expressed using the XML, for example, as shown below in Source 45. However, a program source shown in Source 45 is not limiting and is only an example embodiment.
Table 45-1 shows example binary representation syntax corresponding to the skin 1417 of the avatar appearance type 1410, according to the example embodiments.
Table 45-2 shows example descriptor components semantics of the skin 1417 of the avatar appearance type 1410, according to the example embodiments.
Depending on embodiments, the facial 1418 of the avatar appearance type 1410 may be expressed using the XML, for example, as shown below in Source 46. However, a program source shown in Source 46 is not limiting and is only an example embodiment.
Table 46-1 shows example binary representation syntax corresponding to the facial 1418 of the avatar appearance type 1410, according to the example embodiments.
Table 46-2 shows example descriptor components semantics of the facial 1418 of the avatar appearance type 1410, according to the example embodiments.
Depending on embodiments, the nail 1419 of the avatar appearance type 1410 may be expressed using the XML, for example, as shown below in Source 47. However, a program source shown in Source 47 is not limiting and is only an example embodiment.
Table 47-1 shows example binary representation syntax corresponding to the nail 1419 of the avatar appearance type 1410, according to the example embodiments.
Table 47-2 shows example descriptor components semantics of the nail 1419 of the avatar appearance type 1410, according to the example embodiments.
Depending on embodiments, the body look 1420 of the avatar appearance type 1410 may be expressed using the XML, for example, as shown below in Source 48. However, a program source shown in Source 48 is not limiting and is only an example embodiment.
Table 48-1 shows example binary representation syntax corresponding to the body look 1420 of the avatar appearance type 1410, according to the example embodiments.
Table 48-2 shows example descriptor components semantics of the body look 1420 of the avatar appearance type 1410 according to the example embodiments.
Depending on embodiments, the hair 1421 of the avatar appearance type 1410 may be expressed using the XML, for example, as shown below in Source 49. However, a program source shown in Source 49 is not limiting and is only an example embodiment.
Table 49-1 shows example binary representation syntax corresponding to the hair 1421 of the avatar appearance type 1410, according to the example embodiments.
Table 49-2 shows example descriptor components semantics of the hair 1421 of the avatar appearance type 1410, according to the example embodiments.
Depending on embodiments, the eyebrows 1422 of the avatar appearance type 1410 may be expressed using the XML, for example, as shown below in Source 50. However, a program source shown in Source 50 is not limiting and is only an example embodiment.
Table 50-1 shows example binary representation syntax corresponding to the eyebrows 1422 of the avatar appearance type 1410, according to the example embodiments.
Table 50-2 shows example descriptor components semantics of the eyebrows 1422 of the avatar appearance type 1410, according to the example embodiments.
Depending on embodiments, the facial hair 1423 of the avatar appearance type 1410 may be expressed using the XML, for example, as shown below in Source 51. However, a program source shown in Source 51 is not limiting and is only an example embodiment.
Table 51-1 shows example binary representation syntax corresponding to the facial hair 1423 of the avatar appearance type 1410, according to the example embodiments.
Table 51-2 shows example descriptor components semantics of the facial hair 1423 of the avatar appearance type 1410, according to the example embodiments.
Depending on embodiments, the facial calibration points 1425 of the avatar appearance type 1410 may be expressed using the XML, for example, as shown below in Source 52. However, a program source shown in Source 52 is not limiting and is only an example embodiment.
Table 52-1 shows example binary representation syntax corresponding to the facial calibration points 1425 of the avatar appearance type 1410, according to the example embodiments.
Table 52-2 shows example descriptor components semantics of the facial calibration points 1425 of the avatar appearance type 1410, according to the example embodiments.
Depending on embodiments, the physical condition type 1426 of the avatar appearance type 1410 may be expressed using the XML, for example, as shown below in Source 53. However, a program source shown in Source 53 is not limiting and is only an example embodiment.
Table 53-1 shows example binary representation syntax corresponding to the physical condition type 1426 of the avatar appearance type 1410, according to the example embodiments.
Table 53-2 shows example descriptor components semantics of the physical condition type 1426 of the avatar appearance type 1410, according to the example embodiments.
Referring to FIG. 15, an avatar animation type 1510 may include elements 1511 to 1524.
According to the example embodiments, the elements of the avatar animation type 1510 may include idle 1511, greeting 1512, dance 1513, walk 1514, moves 1515, fighting 1516, hearing 1517, smoke 1518, congratulations 1519, common actions 1520, specific actions 1521, facial expression 1522, body expression 1523, and animation resources 1524. The described elements of the avatar animation type are exemplary, and thus, the present disclosure is not limited thereto.
Depending on embodiments, the avatar animation type 1510 may be expressed using the XML, for example, as shown below in Source 54. However, a program source shown in Source 54 is not limiting and is only an example embodiment.
Table 54-1 shows example binary representation syntax corresponding to the avatar animation type 1510, according to the example embodiments.
Table 54-2 shows example descriptor components semantics of the avatar animation type 1510, according to the example embodiments.
Referring to FIG. 16, an avatar personality type 1610 may include elements 1631 to 1635.
The elements of the avatar personality type 1610 may include openness 1631, agreeableness 1632, neuroticism 1633, extraversion 1634, and conscientiousness 1635, however, the present disclosure is not limited thereto.
Depending on embodiments, the avatar personality type 1610 may be expressed using the XML, for example, as shown below in Source 55. However, a program source shown in Source 55 is not limiting and is only an example embodiment.
Table 55-1 shows example binary representation syntax corresponding to the avatar personality type 1610, according to the example embodiments.
Table 55-2 shows descriptor components semantics of the avatar personality type 1610, according to the example embodiments.
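Since the avatar personality type 1610 lists the five traits above, a hypothetical instance might be sketched as follows; the element names and the value scale are assumptions, the actual definitions being given in Source 55:

    <!-- Hypothetical avatar personality instance; the value scale is assumed -->
    <Personality>
      <Openness>0.7</Openness>
      <Agreeableness>0.5</Agreeableness>
      <Neuroticism>0.2</Neuroticism>
      <Extraversion>0.9</Extraversion>
      <Conscientiousness>0.6</Conscientiousness>
    </Personality>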
Referring to FIG. 17, an avatar communication skills type 1701 may include elements 1703 to 1706.
The elements of the avatar communication skills type 1701 may include input verbal communication 1703, input nonverbal communication 1704, output verbal communication 1705, and output nonverbal communication 1706, however, the present disclosure is not limited thereto.
Depending on embodiments, the avatar communication skills type 1701 may be expressed using the XML, for example, as shown below in Source 56. However, a program source shown in Source 56 is not limiting and is only an example embodiment.
Table 56-1 shows example binary representation syntax corresponding to the avatar communication skills type 1701, according to the example embodiments.
Table 56-2 shows example descriptor components semantics of the avatar communication skills type 1701, according to the example embodiments.
Referring to FIG. 17, a verbal communication type 1710 may refer to a data format of the input verbal communication 1703 and the output verbal communication 1705.
Depending on embodiments, the verbal communication type 1710 may be expressed using the XML, for example, as shown below in Source 57. However, a program source shown in Source 57 is not limiting and is only an example embodiment.
Table 57-1 shows example binary representation syntax corresponding to the verbal communication type 1710, according to the example embodiments.
Table 57-2 shows example descriptor components semantics of the verbal communication type 1710, according to the example embodiments.
Referring to FIG. 18, an avatar control features type 1801 may include a control face features type 1810 and a control body features type 1840.
Depending on embodiments, the avatar control features type 1801 may be expressed using the XML, for example, as shown below in Source 58. However, a program source shown in Source 58 is not limiting and is only an example embodiment.
Table 58-1 shows example binary representation syntax corresponding to the avatar control features type 1801, according to the example embodiments.
Table 58-2 shows example descriptor components semantics of the avatar control features type 1801, according to the example embodiments.
Referring to FIG. 18, a control face features type 1810 may include elements 1812 to 1822.
The elements of the control face features type 1810 may include head outline 1812, left eye outline 1813, right eye outline 1814, mouth lip outline 1815, nose outline 1816, left eye brow outline 1817, right eye brow outline 1818, left ear outline 1819, right ear outline 1820, face points 1821, and miscellaneous points 1822; however, the present disclosure is not limited thereto.
Depending on embodiments, the control face features type 1810 may be expressed using the XML, for example, as shown below in Source 59. However, a program source shown in Source 59 is not limiting and is only an example embodiment.
Table 59-1 shows example binary representation syntax corresponding to the control face features type 1810, according to the example embodiments.
Table 59-2 shows example descriptor components semantics of the control face features type 1810, according to the example embodiments.
Referring to FIG. 18, an outline type 1830 may include outline 4 points 1831, outline 5 points 1832, outline 8 points 1833, and outline 14 points 1834.
Depending on embodiments, the outline type 1830 may be expressed using the XML, for example, as shown below in Source 60. However, a program source shown in Source 60 is not limiting and is only an example embodiment.
Table 60-1 shows example binary representation syntax corresponding to the outline type 1830, according to the example embodiments.
Table 60-2 shows example descriptor components semantics of the outline type 1830, according to the example embodiments.
Depending on embodiments, the outline 4 points 1831 of the outline type 1830 may be expressed using the XML, for example, as shown below in Source 61. However, a program source shown in Source 61 is not limiting and is only an example embodiment.
Table 61-1 shows example binary representation syntax corresponding to the outline 4 points 1831 of the outline type 1830, according to the example embodiments.
Table 61-2 shows example descriptor components semantics of the outline 4 points 1831 of the outline type 1830, according to the example embodiments.
Depending on embodiments, the outline 5 points 1832 of the outline type 1830 may be expressed using the XML, for example, as shown below in Source 62. However, a program source shown in Source 62 is not limiting and is only an example embodiment.
Table 62-1 shows example binary representation syntax corresponding to the outline 5 points 1832 of the outline type 1830, according to the example embodiments.
Table 62-2 shows example descriptor components semantics of the outline 5 points 1832 of the outline type 1830, according to the example embodiments.
Depending on embodiments, the outline 8 points 1833 of the outline type 1830 may be expressed using the XML, for example, as shown below in Source 63. However, a program source shown in Source 63 is not limiting and is only an example embodiment.
Table 63-1 shows example binary representation syntax corresponding to the outline 8 points 1833 of the outline type 1830, according to the example embodiments.
Table 63-2 shows example descriptor components semantics of the outline 8 points 1833 of the outline type 1830, according to the example embodiments.
Depending on embodiments, the outline 14 points 1834 of the outline type 1830 may be expressed using the XML, for example, as shown below in Source 64. However, a program source shown in Source 64 is not limiting and is only an example embodiment.
Table 64-1 shows example binary representation syntax corresponding to the outline 14 points 1834 of the outline type 1830, according to the example embodiments.
Table 64-2 shows example descriptor components semantics of the outline 14 points 1834 of the outline type 1830, according to the example embodiments.
Referring to FIG. 18, a control body features type 1840 may include head bones 1841, upper body bones 1842, down body bones 1843, and middle body bones 1844.
Depending on embodiments, the control body features type 1840 may be expressed using the XML, for example, as shown below in Source 65. However, a program source shown in Source 65 is not limiting and is only an example embodiment.
Table 65-1 shows example binary representation syntax corresponding to the control body features type 1840, according to the example embodiments.
Table 65-2 shows example descriptor components semantics of the control body features type 1840, according to the example embodiments.
Depending on embodiments, the head bones 1841 of the control body features type 1840 may be expressed using the XML, for example, as shown below in Source 66. However, a program source shown in Source 66 is not limiting and is only an example embodiment.
Table 66-1 shows example binary representation syntax corresponding to the head bones 1841 of the control body features type 1840, according to the example embodiments.
Table 66-2 shows example descriptor components semantics of the head bones 1841 of the control body features type 1840, according to the example embodiments.
Depending on embodiments, the upper body bones 1842 of the control body features type 1840 may be expressed using the XML, for example, as shown below in Source 67. However, a program source shown in Source 67 is not limiting and is only an example embodiment.
Table 67-1 shows example binary representation syntax corresponding to the upper body bones 1842 of the control body features type 1840, according to the example embodiments.
Table 67-2 shows example descriptor components semantics of the upper body bones 1842 of the control body features type 1840, according to the example embodiments.
Depending on embodiments, the down body bones 1843 of the control body features type 1840 may be expressed using the XML, for example, as shown below in Source 68. However, a program source shown in Source 68 is not limiting and is only an example embodiment.
Table 68-1 shows example binary representation syntax corresponding to the down body bones 1843 of the control body features type 1840, according to the example embodiments.
Table 68-2 shows example descriptor components semantics of the down body bones 1843 of the control body features type 1840, according to the example embodiments.
Depending on embodiments, the middle body bones 1844 of the control body features type 1840 may be expressed using XML, for example, as shown below in Source 69. However, the program source shown in Source 69 is only an example embodiment and is not limiting.
Table 69-1 shows example binary representation syntax corresponding to the middle body bones 1844 of the control body features type 1840, according to the example embodiments.
Table 69-2 shows example descriptor components semantics of the middle body bones 1844 of the control body features type 1840, according to the example embodiments.
The virtual object in the virtual environment, according to example embodiments, may be represented by virtual object metadata.
The virtual object metadata may characterize various types of objects in the virtual environment. Additionally, the virtual object metadata may provide for interaction between the virtual object and the avatar, as well as for interaction within the virtual environment.
The virtual object, according to the example embodiments, may further include elements including appearance and animation, in addition to extending the base type of the VWO. Hereinafter, the virtual object will be described in detail with reference to the corresponding figure.
Referring to the corresponding figure, a data structure for the virtual object may include a virtual object type 1910 and a VWO base type 1920.
The virtual object type 1910 may refer to a data type with respect to the virtual object.
The VWO base type 1920 may refer to the VWO base type 510 described above.
The virtual object type 1910, according to the example embodiments, may include elements including appearance 1931 and animation 1932. Depending on embodiments, the virtual object type 1910 may further include haptic property 1933 and virtual object components 1934.
The appearance 1931 may include at least one resource link to an appearance file describing the tactile and visual elements of the virtual object.
The animation 1932 may include a set of metadata describing pre-recorded animations related to the virtual object.
The haptic property 1933 may include a set of haptic property descriptors defined by the VWO haptic property type 1210 described above.
The virtual object components 1934 may include a list of virtual objects that are concatenated as components of the virtual object.
Depending on embodiments, the virtual object type 1910 may be expressed using XML, for example, as shown below in Source 70. However, the program source shown in Source 70 is only an example embodiment and is not limiting.
Table 70-1 shows example binary representation syntax corresponding to the virtual object type 1910, according to the example embodiments.
Table 70-2 shows example descriptor components semantics of the virtual object type 1910, according to the example embodiments.
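For illustration only, the elements described above may be modeled as a simple data structure. The following is a minimal sketch in Python; the class and field names are assumptions made here for clarity, not the normative schema of Source 70.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class VirtualObjectType:
    # Illustrative model of the virtual object type 1910; all names are
    # assumptions, not the normative Source 70 schema.
    appearance: List[str] = field(default_factory=list)   # resource links to appearance files (appearance 1931)
    animation: List[str] = field(default_factory=list)    # pre-recorded animation metadata (animation 1932)
    haptic_property: Optional[str] = None                 # haptic property descriptors (haptic property 1933)
    components: List["VirtualObjectType"] = field(default_factory=list)  # component objects (virtual object components 1934)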
Referring to the corresponding figure, the VO animation type 2010 may include motion 2020, deformation 2030, and additional animation 2040.
The motion 2020 may refer to a set of animations defined as rigid motions. Depending on embodiments, the motion 2020 may include an animation description type 2021. The animation description type 2021 may refer to the animation description type 1710 described above.
Table 71 below shows a specific embodiment of the motion 2020.
The deformation 2030 may refer to a set of deformation animations. Depending on embodiments, the deformation 2030 may include an animation description type 2031. The animation description type 2031 may refer to the animation description type 1710 described above.
Table 71-1 below shows a specific embodiment of the deformation 2030.
The additional animation 2040 may include at least one link to an animation file. Depending on embodiments, the additional animation 2040 may include an animation resources description type 2041. The animation resources description type 2041 may refer to the animation resources description type 1810 described above.
Depending on embodiments, the VO animation type 2010 may be expressed using XML, for example, as shown below in Source 71-2. However, the program source shown in Source 71-2 is only an example embodiment and is not limiting.
Table 71-3 shows example binary representation syntax corresponding to the VO animation type 2010, according to the example embodiments.
Table 71-4 shows example descriptor components semantics of the VO animation type 2010, according to the example embodiments.
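Similarly, and again for illustration only, the three elements of the VO animation type 2010 may be grouped as follows; the Python names are assumptions, not the normative schema of Source 71-2.

from dataclasses import dataclass, field
from typing import List

@dataclass
class VOAnimationType:
    # Illustrative model of the VO animation type 2010; names are assumptions.
    motion: List[str] = field(default_factory=list)                # rigid-motion animations (motion 2020)
    deformation: List[str] = field(default_factory=list)           # deformation animations (deformation 2030)
    additional_animation: List[str] = field(default_factory=list)  # links to animation files (additional animation 2040)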
Referring to the corresponding figure, the data transmission method may encode information on the first virtual world into first metadata, in operation S2110.
The virtual world may be classified into a virtual environment and a virtual world object. The virtual world object may characterize various types of objects in the virtual environment. In addition, the virtual world object may provide interaction in the virtual environment.
The virtual world object may include an avatar and a virtual object. The avatar may be used as a representation of a user in the virtual environment.
Information on the virtual world may include at least one of information relating to the avatar of the virtual world and information relating to the virtual object of the virtual world.
The data transmission method may generate the first metadata by encoding the information on the first virtual world into a binary format. In addition, the data transmission method may generate the first metadata by encoding the information on the first virtual world into the binary format and losslessly compressing the binary format information.
Depending on embodiments, the data transmission method may generate the first metadata by encoding the information on the first virtual world into an XML format and encoding the XML format information into a binary format. In addition, the data transmission method may generate the first metadata by encoding the information on the first virtual world into an XML format, encoding the XML format information into a binary format, and losslessly compressing the binary format information.
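As a minimal sketch of the encoding paths described above, the following Python fragment uses UTF-8 byte encoding as a stand-in for an actual binary XML encoding and zlib as a stand-in for whichever lossless compressor an implementation chooses; the function names are hypothetical.

import zlib

def encode_to_binary(xml_text: str) -> bytes:
    # Encode the XML-format information into a binary format. A real encoder
    # might instead produce a dedicated binary XML representation.
    return xml_text.encode("utf-8")

def encode_metadata(xml_text: str, compress: bool = True) -> bytes:
    # Generate the first metadata: binary encoding, optionally followed by
    # lossless compression of the binary-format information.
    binary = encode_to_binary(xml_text)
    return zlib.compress(binary) if compress else binary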
The data transmission method may transmit the encoded first metadata to the second virtual world, in operation S2120. Depending on embodiments, the transmission unit 212 may transmit the encoded first metadata to a data transmission apparatus that corresponds to the second virtual world.
The data transmission method may receive the second metadata related to the second virtual world in operation S2130.
In addition, the data transmission method may decode the received second metadata in operation S2140. The data transmission method may recover information on the second virtual world by decoding the second metadata.
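Decoding reverses those steps. The following minimal sketch matches the hypothetical encoder above; because the compression is lossless, the recovered information is identical to the information that was encoded.

import zlib

def decode_metadata(payload: bytes, compressed: bool = True) -> str:
    # Recover the information on the second virtual world: losslessly
    # decompress if needed, then decode the binary format back into
    # XML-format information.
    binary = zlib.decompress(payload) if compressed else payload
    return binary.decode("utf-8")

# Round trip: the recovered information matches what was encoded.
original = "<VirtualObject>example</VirtualObject>"
assert decode_metadata(zlib.compress(original.encode("utf-8"))) == original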
The methods according to the above-described example embodiments may be recorded in non-transitory computer-readable media including program instructions to implement various operations embodied by a computer. The media may also include, alone or in combination with the program instructions, data files, data structures, and the like. The program instructions recorded on the media may be those specially designed and constructed for the purposes of the example embodiments, or they may be of the kind well known and available to those having skill in the computer software arts. Examples of non-transitory computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM disks and DVDs; magneto-optical media such as floptical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (ROM), random access memory (RAM), flash memory, and the like. The media may also be transfer media, such as optical lines, metal lines, or waveguides including a carrier wave, for transmitting a signal designating the program instructions and the data structures. Examples of program instructions include both machine code, such as that produced by a compiler, and files containing higher-level code that may be executed by the computer using an interpreter. The described hardware devices may be configured to act as one or more software modules in order to perform the operations of the above-described example embodiments, or vice versa. Examples of the magnetic recording apparatus include a hard disk device (HDD), a flexible disk (FD), and a magnetic tape (MT). Examples of the optical disk include a DVD (Digital Versatile Disc), a DVD-RAM, a CD-ROM (Compact Disc-Read Only Memory), and a CD-R (Recordable)/RW.
Further, according to an aspect of the embodiments, any combinations of the described features, functions and/or operations can be provided.
Moreover, the data transmission apparatus, and its various embodiments, may include at least one processor to execute at least one of the above-described units and methods.
Although example embodiments have been shown and described, it would be appreciated by those skilled in the art that changes may be made in these example embodiments without departing from the principles and spirit of the disclosure, the scope of which is defined in the claims and their equivalents.