INFORMATION PROCESSING DEVICE AND METHOD

Information

  • Patent Application
  • 20220388178
  • Publication Number
    20220388178
  • Date Filed
    December 03, 2020
  • Date Published
    December 08, 2022
Abstract
There is provided an information processing device and method capable of performing remote control with higher accuracy. With respect to a plurality of observation points of a device serving as an interface, haptic data including at least one of kinesthetic data including information on a kinesthetic sensation detected at the observation points, tactile data including information on a tactile sensation detected at the observation points, and force data including information on a magnitude of force applied to the observation points is transmitted. The present disclosure can be applied to, for example, an information processing device, a control device, an electronic apparatus, an information processing method, a program, or the like.
Description
TECHNICAL FIELD

The present disclosure relates to an information processing device and method, and in particular, to an information processing device and method capable of performing remote control with higher accuracy.


BACKGROUND ART

Conventionally, systems for performing remote control by transmitting kinesthetic data, tactile data, or the like have been conceived (refer to NPL 1 and NPL 2, for example). In recent years, more accurate remote control has been required with the improvement of information processing technology.


CITATION LIST
Non Patent Literature
NPL 1



  • “Definition and Representation of Haptic-Tactile Essence For Broadcast Production Applications,” SMPTE ST2100-1, 2017



NPL 2



  • P. Hinterseer, S. Hirche, S. Chaudhuri, E. Steinbach, M. Buss, “Perception-based data reduction and transmission of haptic data in telepresence and teleaction systems,” IEEE Transactions on Signal Processing, 2008



SUMMARY
Technical Problem

However, in conventional systems, only information on a single observation point was transmitted. Therefore, it was difficult to sufficiently improve the accuracy of remote control, and only simple operations could be reproduced on the remote side.


In view of such circumstances, the present disclosure enables more accurate remote control.


Solution to Problem

An information processing device of one aspect of the present technology is an information processing device including a transmission unit that transmits, with respect to a plurality of observation points of a device serving as an interface, haptic data including at least one of kinesthetic data including information on a kinesthetic sensation detected at the observation points, tactile data including information on a tactile sensation detected at the observation points, and force data including information on a magnitude of force applied to the observation points.


An information processing method of one aspect of the present technology is an information processing method including transmitting, with respect to a plurality of observation points of a device serving as an interface, haptic data including at least one of kinesthetic data including information on a kinesthetic sensation detected at the observation points, tactile data including information on a tactile sensation detected at the observation points, and force data including information on a magnitude of force applied to the observation points.


An information processing device of another aspect of the present technology is an information processing device including a reception unit that receives, with respect to a plurality of observation points of a device serving as an interface, haptic data including at least one of kinesthetic data including information on a kinesthetic sensation detected at the observation points, tactile data including information on a tactile sensation detected at the observation points, and force data including information on a magnitude of force applied to the observation points, and a driving unit that drives a plurality of driven points of a device serving as an interface on the basis of the haptic data received by the reception unit.


An information processing method of another aspect of the present technology is an information processing method including receiving, with respect to a plurality of observation points of a device serving as an interface, haptic data including at least one of kinesthetic data including information on a kinesthetic sensation detected at the observation points, tactile data including information on a tactile sensation detected at the observation points, and force data including information on a magnitude of force applied to the observation points, and driving a plurality of driven points of a device serving as an interface on the basis of the received haptic data.


In the information processing device and method of one aspect of the present technology, with respect to a plurality of observation points of a device serving as an interface, haptic data including at least one of kinesthetic data including information on a kinesthetic sensation detected at the observation points, tactile data including information on a tactile sensation detected at the observation points, and force data including information on a magnitude of force applied to the observation points is transmitted.


In the information processing device and method of another aspect of the present technology, with respect to a plurality of observation points of a device serving as an interface, haptic data including at least one of kinesthetic data including information on a kinesthetic sensation detected at the observation points, tactile data including information on a tactile sensation detected at the observation points, and force data including information on a magnitude of force applied to the observation points is received, and a plurality of driven points of a device serving as an interface are driven on the basis of the received haptic data.





BRIEF DESCRIPTION OF DRAWINGS


FIG. 1 is a diagram illustrating an overview of a haptic system.



FIG. 2 is a diagram illustrating an example of a remote control system.



FIG. 3 is a block diagram showing a main configuration example of a local system.



FIG. 4 is a block diagram showing a main configuration example of an MPD system.



FIG. 5 is a diagram illustrating an example of a haptic interface.



FIG. 6 is a diagram showing an example of a three-dimensional stimulation device.



FIG. 7 is a diagram showing an example of haptic data.



FIG. 8 is a diagram showing an example of stimulus operation data.



FIG. 9 is a diagram showing an example of a glove-type device.



FIG. 10 is a diagram showing a usage example of the glove-type device.



FIG. 11 is a diagram showing an example of haptic data.



FIG. 12 is a diagram showing an example of a KT map.



FIG. 13 is a diagram showing an application example of a remote control system.



FIG. 14 is a diagram showing an application example of a remote control system.



FIG. 15 is a diagram showing an application example of a remote control system.



FIG. 16 is a diagram showing an example of tactile data.



FIG. 17 is a diagram showing an application example of a remote control system.



FIG. 18 is a diagram showing an application example of the remote control system.



FIG. 19 is a block diagram showing a main configuration example of a renderer.



FIG. 20 is a block diagram showing a main configuration example of the renderer.



FIG. 21 is a diagram showing an example of GPS information.



FIG. 22 is a diagram illustrating an example of deriving an inclination angle.



FIG. 23 is a diagram showing an example of a KT map.



FIG. 24 is a diagram showing an example of a KT map.



FIG. 25 is a diagram showing an example of haptic data.



FIG. 26 is a diagram showing an example of a frame structure of coded data.



FIG. 27 is a diagram showing an example of coding by a dead band.



FIG. 28 is a diagram showing an example of coding haptic data.



FIG. 29 is a block diagram showing a main configuration example of a coding unit and a decoding unit.



FIG. 30 is a diagram showing an example of a state of processing.



FIG. 31 is a diagram showing an example of an in-frame structure of coded data.



FIG. 32 is a flowchart illustrating an example of a flow of haptic data transmission processing.



FIG. 33 is a flowchart illustrating an example of a flow of haptic data reception processing.



FIG. 34 is a flowchart illustrating an example of a flow of MPD generation processing.



FIG. 35 is a flowchart illustrating an example of a flow of MPD control processing.



FIG. 36 is a diagram showing an example of a state of data reference using MPD.



FIG. 37 is a diagram showing an example of a state of data reference using MPD.



FIG. 38 is a diagram showing an example of a state of data reference using MPD.



FIG. 39 is a diagram showing a configuration example of a container.



FIG. 40 is a diagram showing a configuration example of a container.



FIG. 41 is a diagram showing a configuration example of a container.



FIG. 42 is a diagram showing an arrangement example of data in a file format.



FIG. 43 is a diagram showing an example of a file format.



FIG. 44 is a diagram showing an arrangement example of data in a file format.



FIG. 45 is a diagram showing an arrangement example of data in a file format.



FIG. 46 is a diagram showing an example of a file format.



FIG. 47 is a diagram showing an example of MPD.



FIG. 48 is a diagram showing an example of MPD.



FIG. 49 is a diagram showing an example of MPD.



FIG. 50 is a diagram showing an example of MPD.



FIG. 51 is a diagram showing an example of MPD.



FIG. 52 is a diagram showing an example of MPD.



FIG. 53 is a diagram showing an example of MPD.



FIG. 54 is a diagram showing an example of MPD.



FIG. 55 is a diagram showing an example of MPD.



FIG. 56 is a diagram showing an example of MPD.



FIG. 57 is a diagram showing an example of MPD.



FIG. 58 is a diagram showing an example of MPD.



FIG. 59 is a diagram showing an example of semantics.



FIG. 60 is a diagram showing an example of semantics.



FIG. 61 is a diagram showing a configuration example of HDMI transmission.



FIG. 62 is a diagram showing an example of metadata.



FIG. 63 is a diagram showing an example of metadata.



FIG. 64 is a diagram showing an example of semantics.



FIG. 65 is a diagram showing an example of a transmission format of haptic data through a TMDS channel.



FIG. 66 is a block diagram showing a main configuration example of a computer.





DESCRIPTION OF EMBODIMENTS

Hereinafter, modes for carrying out the present disclosure (hereinafter referred to as embodiments) will be described. The descriptions will be given in the following order.

  • 1. Haptic transmission
  • 2. First embodiment (remote control system)
  • 3. Second embodiment (application of MPD)
  • 4. Third embodiment (digital interface)
  • 5. Supplement


1. Haptic Transmission
<Haptic System>

A telexistence society intends to realize the effect of instantaneous spatial movement by disposing a device controlled at the user's will in a spatially separate place and controlling the device via a network. It is conceived to lead to the realization of human augmentation: an action on the local side, which leads the control, is reproduced on the remote side, and the remote device operates according to that action while progress and results are fed back to the local side at any time. By incorporating humans into such a feedback system, they become free from spatiotemporal constraints, the feedback allows local activity to continue, and human abilities can be amplified rather than simply a sense of presence being conveyed.


For example, a haptic system 10 of FIG. 1 includes a haptic device 11 and a haptic device 15 which are installed at places remote from each other and are composed of sensors, actuators, and the like. One transmits haptic data (kinesthetic data, tactile data, and the like) detected by a sensor, and the other receives the haptic data and drives an actuator on the basis of the haptic data. By exchanging such haptic data, the operation of one haptic device can be reproduced in the other haptic device. That is, a remote operation is realized. Such exchange of haptic data is realized by a communication device 12 and a communication device 14 communicating with each other via a network 13.


The communication device 12 can also feed back haptic data from the haptic device 11 to the haptic device 11. Similarly, the communication device 14 can also feed back haptic data from the haptic device 15 to the haptic device 15.


In such transmission of haptic data, it is necessary to accurately describe how outputs of a plurality of local kinesthetic sensors change in conjunction with each other and to convey it to a remote receiving device.


A haptic device may be, for example, a device in a bent skeletal arm shape or a glove-shaped device that can be worn on a hand. When an operator moves a skeletal arm or a glove as a haptic display on a local side, position information and a motion state at each articulation change.


As a haptic device, a higher-order haptic device that raises the degree of freedom of a kinesthetic sensor configuration from the first order (1 degree of freedom (1DoF)) to the third order (3DoF), increases the number of articulation points, or the like has been conceived. However, transmission to a remote side has been limited to information on a single articulation point. In addition, previous studies have assumed projection of an element onto a single vector in a three-dimensional space, with compression coding performed and the coding results delivered. Information on the plurality of articulation points that form the skeleton of a kinesthetic sensor can, when combined, describe a motion of an end part, and if such interconnection information is not clearly described, accurate tracing cannot be performed in a reproduction device on the reception side.


Therefore, connection information of each articulation part composed of a kinesthetic sensor is described and, at the time of transmission, is transmitted to the reception side as metadata. By doing so, it is possible to realize high kinesthetic reproducibility and tactile reproducibility.


2. First Embodiment
<Remote Control System>


FIG. 2 is a diagram illustrating an overview of a remote control system which is an embodiment of a communication system (information processing system) to which the present technology is applied. The remote control system 100 shown in FIG. 2 has a local system 101 and a remote system 102 that are remote from each other. The local system 101 and the remote system 102 each have haptic devices, communicate with each other via a network 110, and realize remote control of the haptic devices by exchanging haptic data. For example, an operation input to one haptic device can be reproduced in the other haptic device.


Here, the system on the side of the subject of communication is referred to as the local system 101 and the system on the side of the communication partner is referred to as the remote system 102; however, the local system 101 and the remote system 102 are systems that can basically play the same role. Therefore, unless otherwise specified, the description of the local system 101 below can also be applied to the remote system 102.


Configurations of the local system 101 and the remote system 102 are arbitrary. The local system 101 and the remote system 102 may have different configurations or the same configuration. Further, although one local system 101 and one remote system 102 are shown in FIG. 2, the remote control system 100 can include an arbitrary number of local systems 101 and remote systems 102.


In addition, the remote control system 100 can include an MPD server 103. The MPD server 103 performs processing related to registration and provision of media presentation description (MPD) on the local system 101 and the remote system 102. The local system 101 and the remote system 102 can select and acquire necessary information using this MPD. Of course, the configuration of the MPD server 103 is also arbitrary, and the number thereof is also arbitrary.


This MPD server 103 can be omitted. For example, the local system 101 or the remote system 102 may supply MPD to the communication partner. Further, the local system 101 and the remote system 102 may exchange haptic data without using MPD, for example.


The network 110 is configured as, for example, any wired communication network, wireless network, or both, such as a local area network, a dedicated-line network, a wide area network (WAN), the Internet, or satellite communication. Further, the network 110 may be configured as a plurality of communication networks.


<Local System>


FIG. 3 is a block diagram showing a main configuration example of the local system 101. As shown in FIG. 3, the local system 101 includes a haptic device 121, a communication device 122, a digital interface 123, and a digital interface 124.


The haptic device 121 is a device that can serve as an interface for a user or a remote device and generates haptic data or operates on the basis of the haptic data. In addition, the haptic device 121 can supply haptic data or the like to the communication device 122 via the digital interface 123, for example. Further, the haptic device 121 can acquire haptic data or the like supplied from the communication device 122 via the digital interface 124.


The communication device 122 can communicate with other devices via the network 110 (FIG. 2). The communication device 122 can exchange haptic data and exchange MPD through the communication, for example. Further, the communication device 122 can acquire haptic data or the like supplied from the haptic device 121 via the digital interface 123, for example. Further, the communication device 122 can supply haptic data or the like to the haptic device 121 via the digital interface 124. The digital interface 123 and the digital interface 124 are interfaces for digital apparatuses of arbitrary standards, such as a Universal Serial Bus (USB) (registered trademark) and a High-Definition Multimedia Interface (HDMI) (registered trademark).


The haptic device 121 includes a sensor unit 131, a renderer 132, an actuator 133, and a haptic interface (I/F) 134.


The sensor unit 131 detects kinesthetic data, tactile data, and the like at an observation point in the haptic interface 134 and supplies the detected data to the communication device 122 as haptic data. The sensor unit 131 can also supply haptic data to the renderer 132.


The sensor unit 131 may include any sensor capable of detecting the necessary data, for example, a magnetic sensor, an ultrasonic sensor, or a Global Positioning System (GPS) sensor that detects a position and a motion, a gyro sensor that detects a motional state such as an angular velocity, an acceleration sensor that detects an acceleration, and the like. For example, the sensor unit 131 may include an image sensor 141 and a spatial coordinate conversion unit 142. The image sensor 141 supplies captured image data to the spatial coordinate conversion unit 142. The spatial coordinate conversion unit 142 derives spatial coordinates (coordinates of a three-dimensional coordinate system) of an observation point (for example, an articulation or the like) of the haptic interface 134 from the image. The spatial coordinate conversion unit 142 can supply the coordinate data to the communication device 122 and the renderer 132 as haptic data.


The image sensor 141 may have a detection function in a depth direction. In this way, the spatial coordinate conversion unit 142 can more easily derive spatial coordinates of an observation point using a captured image and a depth value thereof. Further, the sensor unit 131 may have a plurality of sensors.
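As an illustrative sketch of such a conversion, the following example (in Python) back-projects a pixel position and its depth value into camera-space three-dimensional coordinates using a pinhole camera model. The function name, the camera intrinsics, and the use of a pinhole model are assumptions made purely for illustration; the actual processing of the spatial coordinate conversion unit 142 is not limited to this.

# Minimal sketch (assumption): deriving spatial coordinates of an observation
# point from a pixel position and a depth value using a pinhole camera model.
def pixel_to_camera_coords(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth (meters) into camera-space (x, y, z)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    z = depth_m
    return (x, y, z)

# Example: an articulation detected at pixel (640, 360) with a depth of 0.8 m,
# assuming a 1280x720 image and focal lengths of 900 pixels (hypothetical values).
print(pixel_to_camera_coords(640, 360, 0.8, fx=900.0, fy=900.0, cx=640.0, cy=360.0))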


The renderer 132 performs rendering using the haptic data transmitted from another external device and supplied from the communication device 122 to generate control information for the actuator 133. The renderer 132 supplies the control information to the actuator 133. The renderer 132 can also perform rendering using haptic data supplied from the sensor unit 131 (spatial coordinate conversion unit 142).


The actuator 133 drives the haptic interface 134 in response to the control information supplied from the renderer 132. For example, the actuator 133 causes the haptic interface 134 to reproduce a motion (kinesthetic sensation, a tactile sensation, or the like) represented by the transmitted haptic data.


The haptic interface 134 serves as an interface for kinesthetic data, tactile data, and the like for an operator that is a user, a remote device, or the like. For example, the haptic interface 134 is controlled by the actuator 133 to reproduce a motion, a tactile sensation, or the like corresponding to the haptic data output from the renderer 132.


The communication device 122 performs processing related to transmission/reception of haptic data. In the case of the example of FIG. 3, the communication device 122 includes a composer 151, a coding unit 152, a container processing unit 153, an MPD generation unit 154, an imaging unit 155, and a video coding unit 156.


The composer 151 converts haptic data supplied from the haptic device 121 (sensor unit 131) into a format for transmission (for example, generates a KT map which will be described later). The composer 151 supplies the haptic data for transmission to the coding unit 152. Further, the composer 151 can also supply the haptic data for transmission to the MPD generation unit 154.


The coding unit 152 acquires and codes the haptic data supplied from the composer 151 to generate coded data of the haptic data. The coding unit 152 supplies the coded data to the container processing unit 153.


The container processing unit 153 performs processing related to generation of transmission data. For example, the container processing unit 153 may acquire the coded data of the haptic data supplied from the coding unit 152, store the coded data in transmission data, and transmit it to the remote system 102.


Further, with respect to the haptic data generated in the haptic device 121, the communication device 122 can generate MPD that is control information for controlling reproduction of the haptic data.


For example, the MPD generation unit 154 may acquire haptic data from the composer 151, generate MPD with respect to the haptic data, and supply the generated MPD to the coding unit 152. In such a case, the coding unit 152 can acquire the MPD supplied from the MPD generation unit 154 and code the MPD to generate coded data of the MPD, and supply the coded data to the container processing unit 153. The container processing unit 153 can store the coded data of the MPD supplied from the coding unit 152 in transmission data and transmit it to, for example, the MPD server 103. The container processing unit 153 may transmit the transmission data in which the coded data of the MPD is stored to the remote system 102.


Furthermore, the communication device 122 can also transmit data that is not haptic data. For example, the communication device 122 can image the haptic device 121 and transmit the captured image to the remote system 102.


For example, the imaging unit 155 may include an image sensor or the like, image the haptic device 121 (or a user of the haptic device 121), generate captured image data, and supply the captured image data to the video coding unit 156. The video coding unit 156 can acquire and code the captured image data supplied from the imaging unit 155 to generate coded data of the captured image data and supply the coded data to the container processing unit 153. In such a case, the container processing unit 153 can store the coded data of the captured image data supplied from the video coding unit 156 in the transmission data, and transmit the transmission data to, for example, the remote system 102.


Further, the communication device 122 includes a container processing unit 161, a decoding unit 162, an MPD control unit 163, a video decoding unit 164, and a display unit 165.


The container processing unit 161 performs processing of extracting desired data from the transmission data. For example, the container processing unit 161 can receive the transmission data transmitted from the remote system 102, extract the coded data of the haptic data from the transmission data, and supply the extracted coded data to the decoding unit 162.


The decoding unit 162 can acquire the coded data of the haptic data supplied from the container processing unit 161 and decode it to generate the haptic data and supply the generated haptic data to (the renderer 132 of) the haptic device 121.


In addition, the communication device 122 can control reproduction of haptic data using the MPD.


For example, the container processing unit 161 can acquire transmission data in which MPD corresponding to desired haptic data is stored from the MPD server 103, extract coded data of the MPD from the transmission data, and supply it to the decoding unit 162. The decoding unit 162 can decode the coded data to generate the MPD and supply the MPD to the MPD control unit 163. The MPD control unit 163 can acquire the MPD supplied from the decoding unit 162, control the container processing unit 161 using the MPD, and cause the transmission data in which the coded data of the desired haptic data is stored to be acquired.


Further, the communication device 122 can also receive data that is not haptic data. For example, the communication device 122 can receive a captured image of the haptic device 121 from the remote system 102.


In such a case, the container processing unit 161 can receive transmission data transmitted from the remote system 102, extract coded data of captured image data from the transmission data, and supply the extracted coded data to the video decoding unit 164. The video decoding unit 164 can acquire the coded data supplied from the container processing unit 161 and decode the coded data to generate the captured image data and supply the generated captured image data to the display unit 165. The display unit 165 includes an arbitrary display device such as a monitor or a projector and can display the captured image corresponding to the supplied captured image data.


The remote system 102 can have the same configuration as the local system 101. That is, the description given with reference to FIG. 3 can also be applied to the remote system 102.


<MPD Server>


FIG. 4 is a block diagram showing a main configuration example of the MPD server 103. In the MPD server 103 shown in FIG. 4, a central processing unit (CPU) 201, a read only memory (ROM) 202, and a random access memory (RAM) 203 are connected via a bus 204.


An input/output interface 210 is also connected to the bus 204. An input unit 211, an output unit 212, a storage unit 213, a communication unit 214, and a drive 215 are connected to the input/output interface 210.


The input unit 211 may include any input device such as a keyboard, a mouse, a microphone, a touch panel, an image sensor, a motion sensor, or various other sensors. Further, the input unit 211 may include an input terminal. The output unit 212 may include any output device such as a display, a projector, a speaker, or the like. Further, the output unit 212 may include an output terminal.


The storage unit 213 includes, for example, an arbitrary storage medium such as a hard disk, a RAM disk, or a non-volatile memory, and a storage control unit that writes or reads information to or from the storage medium. The communication unit 214 includes, for example, a network interface. The drive 215 drives an arbitrary removable recording medium 221 such as a magnetic disk, an optical disc, a magneto-optical disc, or a semiconductor memory and writes or reads information to or from the removable recording medium 221.


In the MPD server 103 configured as described above, the CPU 201 realizes various functions indicated by functional blocks which will be described later, for example, by loading a program stored in the storage unit 213 into the RAM 203 via the input/output interface 210 and the bus 204 and executing the program. The RAM 203 also appropriately stores data and the like necessary for the CPU 201 to execute various types of processing of the program.


A program executed by a computer can be recorded on a removable recording medium 221 as packaged media or the like and provided in that form, for example. In such a case, the program can be installed in the storage unit 213 via the input/output interface 210 by mounting the removable recording medium 221 in the drive 215.


This program can also be provided via a wired or wireless transmission medium such as a local area network, a dedicated-line network, a WAN, the Internet, or satellite communication. In such a case, the program can be received by the communication unit 214 and installed in the storage unit 213.


In addition, this program can be installed in the ROM 202 or the storage unit 213 in advance.


<Force Representation/Transmission of Multiple Articulations Information>

For example, the remote control system 100 represents a force applied to a device position (articulation, or the like) of the haptic interface 134 when a force is applied to an object such as a stylus or a handle mounted on the haptic interface 134, as shown in FIG. 5.


In addition, the remote control system 100 transmits haptic data (kinesthetic data and tactile data) with respect to a plurality of device positions (multiple articulations or the like).


In the following, it is assumed that haptic data includes at least one of kinesthetic data detected at a device position, tactile data detected at the device position, and force data representing a force applied to the device position.
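As a non-normative sketch of how such per-device-position haptic data might be organized, the following Python data structures make each of the kinesthetic, tactile, and force components optional, with at least one expected to be present. All class and field names are illustrative assumptions and do not represent a format defined by this disclosure.

from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class KinestheticData:
    position: Tuple[float, float, float]                              # fulcrum articulation point position (x, y, z)
    stimulus_operation: Optional[Tuple[float, float, float]] = None   # pitch/yaw/roll velocities

@dataclass
class TactileData:
    hardness_gpa: Optional[float] = None           # rigidity (GPa)
    friction_coefficient: Optional[float] = None   # dimensionless mu
    temperature_c: Optional[float] = None          # degrees Celsius

@dataclass
class ObservationPoint:
    p_id: str                                      # e.g. "A1"
    upsp: str                                      # adjacent upstream observation point
    kinesthetic: Optional[KinestheticData] = None
    tactile: Optional[TactileData] = None
    force_n: Optional[float] = None                # signed magnitude of applied force (N)

# At least one of the kinesthetic, tactile, or force components is expected to be present.
tip = ObservationPoint("A4", upsp="A3",
                       kinesthetic=KinestheticData((0.10, 0.02, 0.25), (0.0, 0.1, 0.0)),
                       force_n=1.5)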


When only a motion of an end part is reproduced, even a single piece of position information can represent it, but a motion of an articulation point from the fulcrum to the end cannot be faithfully reproduced, which limits the usage. If articulation points are configured in multiple stages, and the position of each articulation point and information on a point of action, such as a force or a torque at that position, need to be reproduced, more accurate control can be performed by describing, for each articulation point, the yaw/pitch/roll rotation information of the sensor output and the information on the force/torque.


In addition, by describing each of multi-stage sensor outputs and transmitting connection information to a receiving side as metadata, more accurate tracing can be performed in a reproduction device on the receiving side. The remote control system 100 can realize high kinesthetic reproducibility by describing connection information of each articulation part composed of a kinesthetic sensor, and at the time of transmission, transmitting the connection information to the receiving side as metadata.


<Use Case 1>

An application example of the remote control system 100 will be described. For example, a three-dimensional stimulation device 301 as shown in FIG. 6 may be applied as the haptic interface 134, and the remote control system 100 may be applied to a system that reproduces a motion of the three-dimensional stimulation device 301 at a remote location.


In the three-dimensional stimulation device 301, a rod-shaped object 302 is attached to the tip of an arm having a plurality of articulations and the position and posture of the object 302 are supported by the arm. In other words, when a user or the like moves the object 302 or changes the posture thereof, for example, the three-dimensional stimulation device 301 can detect a change in the position or posture of the object 302 by the arm.


The object 302 serves as, for example, a pen, a pointer, or the like. For example, when a user moves the object 302 to write letters or draw a line or a picture, the three-dimensional stimulation device 301 can detect a change in the position or posture of the object 302 by the arm and obtain haptic data capable of representing the change. For example, it is possible to reproduce a motion of the aforementioned object 302 by controlling a motion of a device having a drive mechanism similar to that of the three-dimensional stimulation device 301 using the haptic data.


As shown in FIG. 6, the arm has a plurality of articulations (movable parts). The sensor unit 131 uses the plurality of articulations of the three-dimensional stimulation device 301 (haptic interface 134) and the tip of the arm to which the object 302 is attached as observation points and detects haptic data with respect to each observation point.


For example, three articulation points of the arm are set as an observation point Art_1, an observation point Art_2, and an observation point Art_3 in order from the root side (left side in the figure). In addition, the tip (the position to which the object 302 is attached) of the arm is set as an observation point Art_4. The connection position relationship of the observation points is Art_1→Art_2→Art_3→Art_4 in order from the origin of a motion, and Art_1 is the origin of a motion and becomes a fulcrum of subsequent connection relationship. It is also possible to set the origin at another position and set Art_1 as a position having an offset from the origin.


The sensor unit 131 detects information about positions, information about a motion, and information about an applied force with respect to observation points as haptic data. The information about positions includes information indicating the positions of the observation points. The information about a motion includes information indicating a motion (amount of change in a posture) of the object 302 attached to an observation point. In the case of the example of FIG. 6, this information is obtained only for the observation point Art_4 to which the object 302 is attached. The information about an applied force includes information indicating the magnitude and direction (pushing, pulling, or the like) of a force applied in the vertical direction at an observation point. In the case of the example of FIG. 6, this force represents a force applied through the object 302. Therefore, this information is obtained only for the observation point Art_4 to which the object is attached.


The sensor unit 131 outputs information as shown in the table of FIG. 7 as haptic data. The example of FIG. 7 corresponds to the example of FIG. 6. In the table of FIG. 7, P_ID is identification information (ID) of an observation point. In the case of the example of FIG. 7, haptic data is detected with respect to four observation points A1 (Art_1 in FIG. 6) to A4 (Art_4 in FIG. 6). UPSP indicates an adjacent observation point (articulation point) located upstream in a fulcrum direction. For example, the UPSP of the observation point A2 (Art_2) is the observation point A1 (Art_1), the UPSP of the observation point A3 (Art_3) is the observation point A2 (Art_2), and the UPSP of the observation point A4 (Art_4) is the observation point A3 (Art_3). Since the observation point A1 (Art_1) is the uppermost observation point (on the root side of the arm), the observation point A1 itself is assigned as its UPSP.


A fulcrum articulation point position is information indicating the position of an observation point. The method of representing the position of an observation point is arbitrary. For example, the position of an observation point may be represented by coordinates (for example, xyz coordinates) with the fulcrum of the observation point as the origin (x1/y1/z1, x2/y2/z2, x3/y3/z3, x4/y4/z4). The radius of gyration r is uniquely obtained by taking the absolute value of the vector to the adjacent articulation point. Further, by combining these coordinates with the aforementioned connection position relationship between observation points, the position of each observation point can be represented in a single coordinate system.
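The following sketch illustrates, under simplifying assumptions, how positions given relative to each fulcrum can be combined along the UPSP connection relationship to place every observation point in a single coordinate system, and how the radius of gyration r can be obtained as the absolute value of the vector to the adjacent articulation point. The numerical values are hypothetical, and rotations of intermediate articulations are ignored for brevity.

import math

# Relative position of each observation point with respect to its UPSP (fulcrum).
# Values are hypothetical, for illustration only.
relative = {
    "A1": (0.0, 0.0, 0.0),    # root; its UPSP is A1 itself
    "A2": (0.10, 0.0, 0.0),
    "A3": (0.12, 0.0, 0.05),
    "A4": (0.08, 0.0, 0.02),
}
upsp = {"A1": "A1", "A2": "A1", "A3": "A2", "A4": "A3"}

def absolute_position(p_id):
    """Accumulate relative offsets along the UPSP chain up to the root (simplified)."""
    x = y = z = 0.0
    while True:
        dx, dy, dz = relative[p_id]
        x, y, z = x + dx, y + dy, z + dz
        if upsp[p_id] == p_id:            # reached the root (its own UPSP)
            return (x, y, z)
        p_id = upsp[p_id]

def radius_of_gyration(p_id):
    """Absolute value of the vector to the adjacent (upstream) articulation point."""
    dx, dy, dz = relative[p_id]
    return math.sqrt(dx * dx + dy * dy + dz * dz)

print(absolute_position("A4"), radius_of_gyration("A4"))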


The stimulus operation is information about a motion of an observation point. As described above, this information is obtained only for the observation point Art_4 to which the object is attached. The method of representing this motion is arbitrary. For example, as shown in FIG. 8, it may be represented by motions (velocity) in rotation directions (pitch, yaw, and roll) having coordinate axes of x, y, and z as rotation axes (pt4/yw4/rl4).


The force is information about a force applied to an observation point. As described above, this information is obtained only for the observation point Art_4 to which the object is attached. The method of representing this force is arbitrary. For example, it may be represented by the magnitude of a force (N4) expressed in newtons (N). In addition, the direction of the force may be indicated by the sign of the value. For example, a positive value may indicate a force in a direction of pushing the object 302 toward the observation point Art_4, and a negative value may indicate a force in a direction of moving the object 302 away from the observation point Art_4 (direction of pulling). Further, the mass (m) of the arm or the like may be used as known information, and an acceleration (a) may be used to represent this force (F=ma).


Meanwhile, observation points having motions that are chained (having motions that are related) may be grouped. In addition, a subgroup (Sub_Gp) may be formed of observation points having a stronger relationship among the observation points in the group. For example, since the observation points A1 to A4 are observation points of one three-dimensional stimulation device 301, their operations are related and they can thus be grouped. In addition, since the observation points A1 to A4 are also observation points of one arm, their operations have a stronger relationship and they can thus form a subgroup. In the case of the example of FIG. 7, a subgroup consisting of the observation point A1 and a subgroup consisting of the observation points A2 to A4 are formed. In this manner, an observation point on the most root side, which serves as a fulcrum, may form its own subgroup.


The table of FIG. 7 is an example, and haptic data output by the sensor unit 131 is not limited to the example of FIG. 7. Haptic data output by the sensor unit 131 may include at least one of kinesthetic data including information on a kinesthetic sensation detected at an observation point, tactile data including information on a tactile sensation detected at the observation point, and force data including information on the magnitude of a force applied to the observation point.


In the example of FIG. 7, the fulcrum articulation point position is information about the position of an observation point and information on a kinesthetic sensation. Further, the stimulus operation is information about the velocity of an observation point and information on a kinesthetic sensation. That is, data of the fulcrum articulation point position and the stimulus operation are included in kinesthetic data. In addition, the force is information about the magnitude of a force applied to an observation point. That is, data of the force is included in force data.


For example, haptic data output by the sensor unit 131 may also include tactile data, or a part of the information shown in FIG. 7 may be omitted.


The remote control system 100 can represent, for example, a motion of the object 302 that cannot be represented only using position information of the observation point Art_4 by transmitting haptic data with respect to a plurality of observation points, as in the example of FIG. 7. In addition, an applied force can also be represented. Therefore, it is possible to perform remote control with higher accuracy.


<Use Case 2>

In addition, for example, a glove-type device 311 as shown in FIG. 9 may be applied as the haptic interface 134 and the remote control system 100 may be applied to a system that reproduces a motion of the glove-type device 311 at a remote location.


The glove-type device 311 is a device of a glove type, and when a user or the like wears the glove-type device on his/her hand like a glove and moves it, for example, a motion can be detected. As shown in FIG. 9, the sensor unit 131 uses the wrist, articulations, fingertips, and the like of the glove-type device 311 as observation points. An observation point A1 is the wrist of the glove-type device 311. Observation points A2 and A3 are articulations of the thumb, and an observation point B1 is the tip of the thumb. Observation points A4 to A6 are articulations of the index finger, and an observation point B2 is the tip of the index finger. Observation points A7 to A9 are articulations of the middle finger, and an observation point B3 is the tip of the middle finger. Observation points A10 to A12 are articulations of the ring finger, and an observation point B4 is the tip of the ring finger. Observation points A13 to A15 are articulations of the little finger, and an observation point B5 is the tip of the little finger.


As in the case of use case 1, the sensor unit 131 detects haptic data with respect to these observation points. For example, when a user wears the glove-type device 311 and grasps an object 312 (for example, a smartphone), as shown in FIG. 10, motions of the fingers, that is, motions of the respective observation points are detected by the sensor unit 131 and haptic data is output. The table of FIG. 11 shows an example of the haptic data.


The table of FIG. 11 shows the P_ID, the UPSP, and the fulcrum articulation point position with respect to each observation point, similarly to the example of FIG. 7. In addition, a stimulus operation is shown with respect to each of the observation points (for example, observation points A2, A3, A6, A9, A12, and A15) in contact with the object 312. Further, a force applied to each of the observation points (for example, observation points B1 to B5) at the tips of the fingers is shown. The method of representing this information is arbitrary, as in the case of use case 1.


In addition, in this case, as indicated by dotted line frames in the table of FIG. 11, the observation points are subgrouped for each finger. For example, the observation point A1 that is a fulcrum, the observation points A2, A3, and B1 of the thumb, the observation points A4, A5, A6, and B2 of the index finger, the observation points A7, A8, A9, and B3 of the middle finger, the observation points A10, A11, A12, and B4 of the ring finger, and the observation points A13, A14, A15, and B5 of the little finger are subgrouped, respectively.


The table of FIG. 11 is an example, and haptic data output by the sensor unit 131 is not limited to the example of FIG. 11. Haptic data output by the sensor unit 131 may include at least one of kinesthetic data including information on a kinesthetic sensation detected at an observation point, tactile data including information on a tactile sensation detected at the observation point, and force data including information on the magnitude of a force applied to the observation point.


In the example of FIG. 11, the fulcrum articulation point position is information about the position of an observation point and information on a kinesthetic sensation. Further, the stimulus operation is information about the velocity of an observation point and information on a kinesthetic sensation. That is, data of the fulcrum articulation point position and the stimulus operation are included in kinesthetic data. In addition, the force is information about the magnitude of a force applied to an observation point. That is, data of the force is included in force data.


For example, haptic data output by the sensor unit 131 may also include tactile data, or a part of the information shown in FIG. 11 may be omitted.


The remote control system 100 can represent, for example, a motion of the glove-type device 311 that cannot be represented using only position information of the observation points B1 to B5 at the fingertips by transmitting haptic data with respect to a plurality of observation points, as in the example of FIG. 11. In addition, an applied force can also be represented. Therefore, it is possible to perform remote control with higher accuracy.


<KT Map>

Each piece of information of haptic data with respect to a plurality of observation points as described above may be integrated for each observation point (may be classified for each observation point) using, for example, a kinesthetic & tactile (KT) map 331 shown in FIG. 12. The KT map shown in FIG. 12 is a table in which haptic data with respect to each observation point is integrated for each observation point (classified for each observation point).


As shown in FIG. 12, in this KT map 331, observation points are grouped, and identification information (group ID) of each group is shown. Within each group, identification information (P_ID) of each observation point is indicated, and information such as the UPSP, a fulcrum articulation point position, a stimulus operation, force (N), hardness (G), a coefficient of friction (μ), and temperature (° C.) is associated with each P_ID.


The P_ID, UPSP, fulcrum articulation point position, stimulus operation, and force (N) are the same information as those in FIG. 7 and FIG. 11. The type of force may be pressure. Pressure is represented by a force applied to a unit area and is expressed in N/m² or pascals (Pa), for example. The hardness (G) is information indicating the hardness of the surface of an object with which the haptic interface 134 is in contact, which is detected at an observation point. Here, hardness is assumed to be expressed as rigidity. The rigidity is a kind of elastic modulus, which is a physical property value that determines the difficulty of deformation, and can be expressed in GPa. A larger GPa value represents greater hardness. The coefficient of friction (μ) is information indicating the coefficient of friction of the surface of an object with which the haptic interface 134 is in contact, which is detected at an observation point. The coefficient of friction can be obtained as a dimensionless quantity (μ=F/N) by dividing the frictional force by the normal force acting on the contact surface. The temperature (° C.) is information indicating the temperature of the surface of an object with which the haptic interface 134 is in contact, which is detected at an observation point. That is, the hardness (G), the coefficient of friction (μ), and the temperature (° C.) are all information about the surface of the object with which the haptic interface 134 is in contact, that is, information on a tactile sensation. That is, this data is included in tactile data.


By forming the KT map 331 that integrates haptic data for each observation point in this manner, it is possible to more easily select and exchange haptic data of a desired observation point. Further, by configuring haptic data as the KT map 331 in the form of a table as shown in the example of FIG. 12, it is possible to more easily select and exchange desired information of a desired observation point. Therefore, it is possible to control and adjust loads of communication and processing related to transmission of haptic data.
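As a rough sketch of how a KT map in this table form could be held and filtered in software, the following example builds a small map keyed by observation point and extracts only the entries and fields needed for a particular transmission, which is the kind of selection that allows the communication and processing load to be adjusted. The dictionary structure and field names are illustrative assumptions modeled on FIG. 12, not a defined format.

# Sketch of a KT-map-like table keyed by observation point (values are hypothetical).
kt_map = {
    "group_id": "G1",
    "points": {
        "A1": {"UPSP": "A1", "position": (0.0, 0.0, 0.0)},
        "A4": {"UPSP": "A3", "position": (0.3, 0.0, 0.07),
               "stimulus_operation": (0.0, 0.1, 0.0),   # pitch/yaw/roll
               "force_N": 1.5, "hardness_GPa": 2.0,
               "friction_mu": 0.4, "temperature_C": 24.5},
    },
}

def select(kt_map, point_ids, fields):
    """Extract only the requested observation points and fields for transmission."""
    return {pid: {f: v for f, v in kt_map["points"][pid].items() if f in fields}
            for pid in point_ids if pid in kt_map["points"]}

# Transmit only the kinesthetic information of the tip observation point.
payload = select(kt_map, ["A4"], {"UPSP", "position", "stimulus_operation"})
print(payload)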


Meanwhile, the KT map 331 may include haptic data for each observation point, and details thereof are not limited to the example shown in FIG. 12. For example, information that is not shown in FIG. 12 may be included in the KT map 331, or a part of the information shown in FIG. 12 may be omitted. Further, observation points may be subgrouped as in the case of the examples of FIG. 7 and FIG. 11.


<Use Case 3>

The remote control system 100 can bidirectionally transmit such haptic data. For example, when a local operator operates the glove-type device 311, as shown in FIG. 13, haptic data detected at a plurality of observation points (black circles) of the glove-type device 311 (such as data of the position, velocity, and acceleration of each observation point, and applied force data) is transmitted (forward) to a remote device 351 at a remote location. The remote device 351 is a hand-shaped device (a so-called robot hand) corresponding to the glove-type device 311 and is the haptic interface 134 of the remote system 102.


The remote device 351 controls a plurality of control points (white circles) such as its articulations on the basis of the transmitted haptic data of the plurality of observation points to grip a predetermined object 352 (for example, a ball) that is a gripping target.


When haptic data such as kinesthetic data, tactile data, and force data is detected at a plurality of observation points (white circles) of the remote device 351 by grasping the object 352, the haptic data (such as data of the position, velocity, and acceleration of each observation point, and applied force data) is transmitted (fed back) to the glove-type device 311 of the local operator. That is, in this case, there is a kinesthetic sensor at the force point (the part that grips the object) of the device, and the applied force is detected, quantified, and transmitted to the other party. The position of the force point is independent of the position of an articulation point and can be any position.


By performing feedback transmission in this manner, the glove-type device 311 can reproduce the reaction force at the time of gripping the object 352, which is detected by the remote device 351. That is, the local operator can feel the reaction force. Therefore, when the remote device 351 feeds back, as the reaction force, a force of the same magnitude as the received signal in the opposite direction, the local operator feels that the grip position is stable when grasping with the fingers. By adjusting the force applied to the glove-type device 311 such that the local operator can obtain such a feeling, remote control can be performed such that the remote device 351 can stably grip the object 352. That is, remote control can be performed with higher accuracy.


Meanwhile, this feedback transmission may be performed only when forward-transmitted kinesthetic data is updated by gripping the object 352.
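A minimal sketch of this feedback behavior, under the assumption of a simple update threshold: the remote side returns a reaction force of equal magnitude and opposite sign, and feedback is transmitted only when the detected value has actually been updated by gripping. The threshold value and function names are hypothetical.

# Sketch: feed back -force (equal magnitude, opposite direction), and only when
# the detected force at an observation point has changed beyond a small threshold.
_last_sent = {}

def maybe_feedback(point_id, detected_force_n, threshold_n=0.05):
    """Return the reaction force to feed back, or None if no update occurred."""
    previous = _last_sent.get(point_id)
    if previous is not None and abs(detected_force_n - previous) < threshold_n:
        return None                      # no meaningful update; skip feedback
    _last_sent[point_id] = detected_force_n
    return -detected_force_n             # same magnitude, opposite direction

print(maybe_feedback("B2", 1.20))   # -> -1.2 (first contact, fed back)
print(maybe_feedback("B2", 1.22))   # -> None (within threshold, not fed back)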


<Use Case 4>

Meanwhile, haptic data may be transmitted through wireless communication. For example, as shown in FIG. 14, the remote control system 100 may be applied to remote control of a robot hand 362 provided on a flying object 361 such as a so-called drone. In this case, a local operator operates the glove-type device 311 to input an operation for the robot hand 362. The local system 101 transmits position data and force data as kinesthetic data detected in the glove-type device 311 to the flying object 361 through wireless communication. The flying object 361 supplies the information to the robot hand 362, and the robot hand 362 is driven according to the control thereof to perform an operation such as grasping an object. When the robot hand 362 grasps an object or the like, position data and force data thereof are fed back to the glove-type device 311 as in the case of use case 3. The local operator can remotely control the robot hand 362 while experiencing a kinesthetic sensation, a tactile sensation, and the like reproduced by the glove-type device 311 on the basis of the fed back position data and force data.


Such remote control can be applied to, for example, an operation in which a rescue flying object 361 flies to the scene of an incident and the robot hand 362 grasps the body of a person found there to rescue the person. For example, the robot hand 362 may move in conjunction with a hand of the local operator and hold the person. Further, at that time, feedback transmission may allow the local operator to confirm whether or not the person is being held. Meanwhile, the feedback data may include video and audio data.


<Use Case 5>

In addition, the remote control system 100 may be applied to a system that detects the state of the surface of a material by a remote device. For example, as shown in FIG. 15, when a local operator wears the glove-type device 311 and scans from the top to the bottom in the figure, haptic data detected in the glove-type device 311 is forward-transmitted to a remote device 371 at a remote location, and a motion of the glove-type device 311 is reproduced in the remote device 371. That is, the remote device 371 scans an object surface 372 from the top to the bottom in the figure.


According to this scanning, the remote device 371 detects the state of unevenness of the object surface 372, and the like. This haptic data is fed back to the glove-type device 311 on the side of the local operator. The glove-type device 311 reproduces the unevenness of the object surface 372 detected by the remote device 371 by a tactile actuator on the basis of the haptic data. Accordingly, the local operator can ascertain the unevenness without directly touching it.


For example, regarding detection of a tactile sensation, the local operator can simultaneously sense a range having a width by operating with two fingers at a predetermined force. When a material having a surface unevenness interval of λ mm is traced at a velocity V (mm/sec) calculated from the change in position information over time, vibration with a frequency of f=V/λ Hz is generated on the skin surface. An example of tracing this vibration is shown in FIG. 16. Accordingly, from the position information of the two fingers (tactile points), it can be ascertained that the feedback patterns of tactile data obtained by scanning the two fingers side by side in parallel are similar to each other with a slight time difference. Therefore, it can be ascertained that the surface has unevenness running diagonally across the distance scanned by the two fingers.


By comparing the positions of the two fingers and the tactile feedback information obtained at those positions on the local side in this manner, it is possible to ascertain the shape, unevenness, roughness, and smoothness of the surface at a plurality of positions more accurately and efficiently, and to ascertain the texture and structure of the surface. It is also possible to control the degree of force to be applied on the remote side using force information transmitted from the local side. In this manner, the texture of an object surface can be ascertained by simultaneously scanning a plurality of points, even in transmission of only a tactile sensation without distribution of a video.
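The relationships described above can be sketched as follows: the perceived vibration frequency is f = V/λ, and the similarity with a slight time difference between the traces of two fingers scanned in parallel can be estimated by a simple cross-correlation. The signal here is synthetic, and the sampling rate, shift search range, and function names are assumptions for illustration.

import math

def vibration_frequency_hz(velocity_mm_per_s, unevenness_interval_mm):
    """f = V / lambda, the vibration frequency felt when tracing the surface."""
    return velocity_mm_per_s / unevenness_interval_mm

def best_time_shift(trace_a, trace_b, max_shift):
    """Return the sample shift of trace_b that best matches trace_a (cross-correlation)."""
    def score(shift):
        pairs = [(trace_a[i], trace_b[i + shift])
                 for i in range(len(trace_a)) if 0 <= i + shift < len(trace_b)]
        return sum(a * b for a, b in pairs)
    return max(range(-max_shift, max_shift + 1), key=score)

print(vibration_frequency_hz(50.0, 2.0))   # 25 Hz for V = 50 mm/s, lambda = 2 mm

# Two synthetic finger traces of the same surface pattern, offset by 10 samples
# (assuming a 1000 Hz sampling rate of the tactile sensor output).
f = 25.0
samples = [math.sin(2 * math.pi * f * n / 1000.0) for n in range(200)]
trace_a = samples[10:150]
trace_b = samples[0:140]
print(best_time_shift(trace_a, trace_b, 20))   # -> 10, the estimated time difference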


<Use Case 6>

Further, Extended Reality (XR) may be applied to the system. For example, as shown in FIG. 17, reality of sensation may be improved by displaying a video of a virtual hand. In the case of the example of FIG. 17, a local operator wears a glove-type device 311-1 and performs a virtual handshaking operation with a remote operator at a remote location. The remote operator also wears a glove-type device 311-2 and similarly performs a virtual handshaking operation with the local operator.


According to forward transmission and feedback transmission, haptic data detected in each glove-type device 311 is exchanged, and the motion is reproduced by a kinesthetic/tactile actuator of the glove-type device 311 of the other side.


In addition, the local operator is wearing a head mounted display (HMD) 381-1. Similarly, the remote operator is wearing a head mounted display (HMD) 381-2. When it is not necessary to distinguish the head mounted display 381-1 and the head mounted display 381-2 from each other in description, they are referred to as a head mounted display 381. The head mounted display 381 is equipped with an image sensor for recognizing the surroundings.


The head mounted display 381-1 displays a video of a virtual space including an image of a real hand corresponding to the glove-type device 311-1 operated by the local operator and an image of a virtual hand corresponding to the glove-type device 311-2 operated by the remote operator. The image of the real hand corresponding to the glove-type device 311-1 is generated on the basis of haptic data detected in the glove-type device 311-1. That is, the image of the real hand reproduces the state of the motion of the glove-type device 311-1 operated by the local operator. Similarly, the image of the virtual hand corresponding to the glove-type device 311-2 is generated on the basis of haptic data transmitted from the glove-type device 311-2. That is, this image of the virtual hand reproduces the state of the motion of the glove-type device 311-2 operated by the remote operator.


The local operator can shake virtual hands in the virtual space displayed on the head mounted display 381-1, for example, by operating the glove-type device 311-1 while viewing the video displayed on the head mounted display 381-1. Further, since the glove-type device 311-1 can reproduce a kinesthetic sensation and a tactile sensation with its kinesthetic/tactile actuator using the haptic data fed back from the glove-type device 311-2, the local operator can feel the motion and the amount of force of the other person's hand at the time of shaking hands, as well as sensations such as temperature. Further, in the case of this system, the local operator can view the video of the handshaking, and thus the reality of the handshaking sensation can be improved.


The same applies to the remote operator side. The head mounted display 381-2 displays a video of a virtual space including an image of a virtual hand corresponding to the glove-type device 311-1 operated by the local operator and an image of a virtual hand corresponding to the glove-type device 311-2 operated by the remote operator. The image of the virtual hand corresponding to the glove-type device 311-1 is generated on the basis of haptic data transmitted from the glove-type device 311-1. That is, this image of the virtual hand reproduces the state of the motion of the glove-type device 311-1 operated by the local operator. Similarly, the image of the virtual hand corresponding to the glove-type device 311-2 is generated on the basis of haptic data detected in the glove-type device 311-2. That is, this image of the virtual hand reproduces the state of the motion of the glove-type device 311-2 operated by the remote operator.


The remote operator can shake the virtual hands in the virtual space displayed on the head mounted display 381-2, for example, by operating the glove-type device 311-2 while viewing the video displayed on the head mounted display 381-2. Further, since the glove-type device 311-2 can reproduce a kinesthetic sensation and a tactile sensation with its kinesthetic/tactile actuator using haptic data fed back from the glove-type device 311-1, the remote operator can feel the sensation of the other person's hand at the time of shaking hands. Further, in the case of this system, the remote operator can view the video of the state of the handshaking, and thus the reality of the handshaking sensation can be improved.


For example, the local system 101 represents the position of the glove-type device 311-1 using information such as an absolute coordinate position of a reference position RP1 (Reference Position 1), directions of absolute coordinate axes (horizontal plane and vertical direction), coordinate values in absolute coordinates (x, y, z), a reference observation point BP1 (e.g., fulcrum of Group1 (=Organ1)), and an offset from the reference position RP1.


Similarly, the remote system 102 represents the position of the glove-type device 311-2 using information such as an absolute coordinate position of a reference position RP2 (Reference Position 2), directions of absolute coordinate axes (horizontal plane and vertical direction), coordinate values in absolute coordinates (x, y, z), a reference observation point BP2 (e.g., fulcrum of Group2 (=Organ2)), and an offset from the reference position RP2.


By exchanging this information, relative positions of the glove-type device 311-1 and the glove-type device 311-2 can be derived and handshaking can be performed in a virtual space.
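As a minimal sketch under assumed data structures, and assuming that the absolute coordinate axes of both systems are already aligned, the position of a remote observation point relative to the local reference position could be derived from the exchanged information as follows.

# Minimal sketch: deriving the position of a remote observation point in the
# local coordinate system from the exchanged reference-position information.
# Data structures are illustrative assumptions, and the absolute coordinate
# axes of the local and remote systems are assumed to be aligned.
import numpy as np

def to_local_coordinates(rp1_abs, rp2_abs, bp2_offset):
    """rp1_abs: absolute (x, y, z) of the local reference position RP1,
    rp2_abs: absolute (x, y, z) of the remote reference position RP2,
    bp2_offset: offset of the remote reference observation point BP2 from RP2."""
    rp1 = np.asarray(rp1_abs, dtype=float)
    rp2 = np.asarray(rp2_abs, dtype=float)
    bp2 = rp2 + np.asarray(bp2_offset, dtype=float)   # BP2 in absolute coordinates
    return bp2 - rp1                                  # BP2 relative to the local RP1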


<Use Case 7>

Further, in a system that realizes virtual handshaking between operators at remote locations like the system of FIG. 17, a local operator and a remote operator may perform handshaking with actual robot hands.


In this case, the head mounted display 381 is unnecessary. Instead, a robot hand 391-1 is provided near the local operator. The local operator operates the glove-type device 311-1 to actually shake a hand with the robot hand 391-1. The robot hand 391-1 is driven on the basis of haptic data transmitted from the glove-type device 311-2 and reproduces a motion of the glove-type device 311-2 operated by the remote operator. That is, the local operator and the remote operator can get a feeling (a kinesthetic sensation or tactile sensation) as if they are really shaking hands with each other. Moreover, since the actual robot hand 391-1 is driven, the reality of the handshaking sensation for the local operator can be improved.


The same applies to the remote operator side. That is, a robot hand 391-2 is provided near the remote operator. The remote operator operates the glove-type device 311-2 to actually shake a hand with the robot hand 391-2. The robot hand 391-2 is driven on the basis of haptic data transmitted from the glove-type device 311-1 and reproduces a motion of the glove-type device 311-1 operated by the local operator. That is, the local operator and the remote operator can get a feeling (a kinesthetic sensation or tactile sensation) as if they are really shaking hands with each other. Further, since the actual robot hand 391-2 is driven, the reality of the handshaking sensation for the remote operator can be improved.


For example, the local system 101 represents the position of the glove-type device 311-1 using information such as an absolute coordinate position of a reference position RP1 (Reference Position 1), directions of absolute coordinate axes (horizontal plane and vertical direction), coordinate values in absolute coordinates (x, y, z), a reference observation point BP1 (e.g., fulcrum of Group1 (=Organ1)), and an offset from the reference position RP1.


Similarly, the remote system 102 represents the position of the glove-type device 311-2 using information such as an absolute coordinate position of a reference position RP2 (Reference Position 2), directions of absolute coordinate axes (horizontal plane and vertical direction), coordinate values in absolute coordinates (x, y, z), a reference observation point BP2 (e.g., fulcrum of Group2 (=Organ2)), and an offset from the reference position RP2.


By exchanging this information, relative positions of the glove-type device 311-1 and the glove-type device 311-2 can be derived, and virtual handshaking (handshaking with the actual robot hand 391) can be performed.


<Renderer>

Next, the renderer 132 will be described. Regarding force data, the renderer 132 may synthesize, for example, force data generated through a local operation and force data received from the remote side. FIG. 19 shows a main configuration example of the renderer 132 in such a case. As shown in FIG. 19, the renderer 132 in this case includes a synthesis unit 401. The synthesis unit 401 acquires force data (Force1 (power, direction)) supplied from the sensor unit 131 and generated through a local operation. Further, the synthesis unit 401 acquires force data (Force2 (power, direction)) supplied from the decoding unit 162 and received from the remote side. The synthesis unit 401 synthesizes the force data to generate control information (Synth_force (power, direction)) of the actuator 133. The synthesis unit 401 supplies the control information to the actuator 133.
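For example, a minimal sketch of such a synthesis unit could look like the following; the representation of force data as a magnitude (power) plus a three-dimensional direction vector is an illustrative assumption, not a definition taken from the present system.

# Minimal sketch of the synthesis unit 401: combining locally generated force
# data (Force1) with force data received from the remote side (Force2) into
# control information for the actuator 133. The (power, 3D direction vector)
# representation is an illustrative assumption.
import numpy as np

def synthesize_force(force1, force2):
    """force1, force2: (power, direction) with direction as a 3-element vector."""
    p1, d1 = force1
    p2, d2 = force2
    v = p1 * np.asarray(d1, dtype=float) + p2 * np.asarray(d2, dtype=float)
    power = float(np.linalg.norm(v))
    direction = v / power if power > 0 else np.zeros(3)
    return power, direction    # Synth_force (power, direction)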


In addition, the renderer 132 may correct a posture of a haptic system at an absolute position. Further, the renderer 132 may correct and map remote position information into a local positional relationship. FIG. 20 shows a main configuration example of the renderer 132 in such a case. In this case, the renderer 132 includes a positional relationship determination unit 411 and a position correction unit 412.


The positional relationship determination unit 411 determines a positional relationship between a reference position RP1 and a reference articulation point BP1 from local absolute coordinates 1. Similarly, it determines a positional relationship between RP1 and a local dependent articulation point CP1.


The position correction unit 412 obtains, from remote absolute coordinates 2, a positional relationship between the remote reference position RP2 and the remote reference observation point BP2 and a positional relationship of the remote dependent articulation point CP2. These positional relationships are scaled and corrected to match the local positional relationships obtained as described above. Then, the corrected position data is sent to the actuator.
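For instance, a minimal sketch of this scaling and correction, assuming a uniform scale factor derived from the RP-BP spans and illustrative data structures, could be written as follows.

# Minimal sketch of the position correction unit 412: scaling a remote
# positional relationship into the local positional relationship. The uniform
# scale factor and the data structures are illustrative assumptions.
import numpy as np

def correct_remote_position(rp1, bp1, rp2, bp2, cp2):
    """Map the remote dependent articulation point CP2 into local coordinates."""
    rp1, bp1, rp2, bp2, cp2 = (np.asarray(p, dtype=float) for p in (rp1, bp1, rp2, bp2, cp2))
    local_span = np.linalg.norm(bp1 - rp1)            # local RP1-BP1 distance
    remote_span = np.linalg.norm(bp2 - rp2)           # remote RP2-BP2 distance
    scale = local_span / remote_span if remote_span > 0 else 1.0
    return rp1 + scale * (cp2 - rp2)                  # corrected position for the actuator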


Meanwhile, the same processing may also be performed in a renderer on the remote side.


<Delay Compensation by Position Detection Sensor>

A predictor may be provided in the sensor that detects a motion on each of the local side and the remote side, and the position immediately following the current position may be predicted using the immediately previous history. With this prediction function, a value ahead of the current position on the local side (preemptive position data) can be transmitted as the position information that is actually transmitted.


On the remote side, feedback information at the preemptive data position is transmitted. When the actual motion on the local side has progressed, renderer output can be obtained on the local side by receiving the feedback information from the remote side and tuning it such that it is reproduced by the actuator. The same applies to sensor information transmitted from the remote side: by returning feedback from the local side for the preemptive position information from the remote side, kinesthetic and tactile information can be obtained without a feeling of delay in the renderer output of the remote side.
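As a minimal sketch of such a predictor, assuming simple linear extrapolation from the immediately previous history (the actual prediction method is not limited to this), the preemptive position could be computed as follows.

# Minimal sketch of a predictor for delay compensation: linear extrapolation
# from the immediately previous history to obtain a preemptive position.
# Linear extrapolation is an illustrative assumption.
def predict_preemptive_position(history, lookahead_samples=1):
    """history: recent positions sampled at a fixed interval (newest last)."""
    if len(history) < 2:
        return history[-1]
    velocity = history[-1] - history[-2]               # change per sample
    return history[-1] + velocity * lookahead_samples  # position ahead of the current one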


<Identification of Absolute Position>

An inclination with respect to north may be detected using global positioning system (GPS) information and a 3-axis direction sensor to identify the fulcrum position of a haptic device and the direction in which the device is set. In the GPS information, for example, information such as two-dimensional coordinates (latitude, longitude), elevation, and time is defined as spatial coordinates. FIG. 21 shows an example of describing GPS information in xml format.


<Deviation in Vertical Direction>

In addition, as shown in FIG. 22, for example, an acceleration sensor 421 may be used to obtain an inclination angle of a device 420 in which the acceleration sensor is provided. For example, when the sensing axis of the acceleration sensor 421 is aligned with the direction in which gravity acts, the acceleration sensor 421 detects an acceleration of 9.8 m/sec². On the other hand, when the acceleration sensor 421 is disposed in the direction perpendicular to the direction in which gravity acts, the influence of gravity disappears and the output of the acceleration sensor 421 becomes 0. When an acceleration a is obtained as the output of the acceleration sensor 421 at an arbitrary inclination angle θ, the inclination angle θ can be derived by the following formula (1).





θ = sin⁻¹(a/g)   (1)


In this manner, a deviation of the haptic device in the vertical direction can be calculated.
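A minimal sketch of formula (1), with g = 9.8 m/sec² and illustrative variable names, is shown below.

# Minimal sketch of formula (1): deriving the inclination angle theta from the
# output of the acceleration sensor 421. Variable names are illustrative.
import math

G = 9.8  # gravitational acceleration (m/sec^2)

def inclination_angle(a):
    """a: acceleration output of the sensor (m/sec^2); returns theta in degrees."""
    a = max(-G, min(G, a))                  # clamp to the valid domain of asin
    return math.degrees(math.asin(a / G))   # theta = sin^-1(a / g)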


<Other Examples of Haptic Data (KT Map)>

The configuration of haptic data (KT map) is arbitrary and is not limited to the above example. The haptic data in the example of FIG. 7 may be configured as shown in FIG. 23. The haptic data in the example of FIG. 11 may be configured as shown in FIG. 24. That is, the position, velocity, and acceleration of each observation point may be indicated as a fulcrum articulation point position.


<Coding of Haptic Data>

Values of haptic data (kinesthetic data, tactile data, force data) of an observation point dynamically change along the time axis as shown in the graph of FIG. 25, for example. Therefore, when such haptic data is coded, for example, a frame (haptic frame (haptic_frame)) may be formed at predetermined time intervals, as shown in FIG. 26.


In FIG. 26, the rectangles represent haptic data arranged in a time series, and the haptic data is separated at predetermined time intervals to form a haptic frame (haptic_frame) 451-1, a haptic frame 451-2, a haptic frame 451-3, and so on. When it is not necessary to distinguish the haptic frames from each other in description, they are referred to as a haptic frame 451.


In addition, prediction can be performed between samples of haptic data in the haptic frame 451, and when prediction is performed, an increase in the amount of information is curbed.


As a one-dimensional data coding method, for example, there is the deadband method. In the case of the deadband method, for example, in one-dimensional data as shown in A of FIG. 27, only data whose value varies considerably from the immediately previous value (data indicated by the black circles) is coded, and coding of the data in the gray parts having small variation is omitted. At the time of decoding, coded/decoded data is duplicated for a section that has not been coded. Therefore, the decoded data is as shown in B of FIG. 27. That is, in the case of the deadband method, there is a possibility that it is difficult to reproduce a minute change. On the contrary, when a motion with high definition is intended to be reproduced, the perception threshold range becomes very narrow and the frequency of data to be transmitted increases, and thus the coding efficiency may be reduced.


Therefore, in coding of haptic data, a difference is obtained between adjacent points in a time series, and the difference value is variable-length coded. For example, as shown in A of FIG. 28, a difference is derived between adjacent points in a time series in a haptic frame 451 of one-dimensional haptic data. For data at the beginning of a haptic frame, a difference is not derived and the value thereof is used as is.


In this way, the haptic data (difference values) for the respective haptic frames becomes, for example, curves 461 to 464. Then, these difference values are quantized and additionally variable-length coded. In this way, the amount of information can be reduced and reduction in coding efficiency can be curbed as compared to a case where the original data is coded intact. Further, when such coded data is decoded, the original data can be faithfully reconstructed, as shown by the curve 465 in B of FIG. 28. Therefore, it is possible to curb the decrease in reproduction accuracy that coding/decoding such as the deadband method causes. That is, it is possible to reproduce a motion with high definition on the receiving side, and it is also possible to achieve a constant data compression effect at all times. The same applies in the case of a motion in three-dimensional space.


Meanwhile, when such prediction processing is performed, invalid decoded values caused by an error at the time of transmission or the like may propagate to subsequent samples. However, by configuring a haptic frame at predetermined time intervals and refreshing the prediction at the beginning of the haptic frame by performing coding without deriving a difference value, as described above, the propagation can be curbed even if decoding has failed. Accordingly, it is possible to improve the resistance to transmission errors and the like, for example.


<Coding Unit>

A main configuration example of the coding unit 152 (FIG. 3) in this case is shown in A of FIG. 29. In this case, the coding unit 152 includes a delay unit 501, an arithmetic operation unit 502, a quantization unit 503, and a variable length coding unit 504.


The delay unit 501 delays input haptic data (Input data (t)) to generate delayed haptic data (dd (t)). The arithmetic operation unit 502 derives a difference value (also referred to as a predicted residual value) (Predicted data (t)) between the input haptic data (Input data (t)) and the haptic data (dd (t)) delayed by the delay unit 501. That is, prediction is performed between samples and a predicted residual is derived. The quantization unit 503 quantizes the predicted residual value (Predicted data (t)) to generate a quantization coefficient (quantized data (t)). The variable length coding unit 504 variable-length-codes the quantization coefficient (quantized data (t)) to generate coded data (variable length code (t)) and outputs the coded data.
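For example, a minimal sketch of this processing chain could be written as follows; the quantization step and the signed Exp-Golomb-style variable-length code are illustrative assumptions and are not necessarily the scheme used by the coding unit 152.

# Minimal sketch of the coding unit 152: inter-sample prediction (difference
# from the delayed sample), quantization, and variable-length coding.
# The quantization step and the signed Exp-Golomb-style code are assumptions.
def signed_exp_golomb(v):
    """Map a signed integer to a bit string (0, 1, -1, 2, -2, ... -> 0, 1, 2, ...)."""
    code_num = 2 * v - 1 if v > 0 else -2 * v
    bits = bin(code_num + 1)[2:]                 # binary representation of code_num + 1
    return "0" * (len(bits) - 1) + bits          # leading zeros + value

def encode_haptic_frame(samples, q_step=1):
    """Encode one haptic frame; the first sample is coded without prediction."""
    coded = []
    prev = 0
    for i, x in enumerate(samples):
        residual = x if i == 0 else x - prev     # prediction refreshed at the frame top
        prev = x                                 # delayed haptic data dd(t)
        q = int(round(residual / q_step))        # quantization
        coded.append(signed_exp_golomb(q))       # variable-length code
    return coded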


<Decoding Unit>

In addition, a main configuration example of the decoding unit 162 in this case is shown in B of FIG. 29. In this case, the decoding unit 162 includes a variable length decoding unit 511, a dequantization unit 512, an arithmetic operation unit 513, and a delay unit 514.


The variable length decoding unit 511 variable-length-decodes the input coded data (variable length code (t)) to generate the quantization coefficient (quantized data (t)). The dequantization unit 512 dequantizes the quantization coefficient (quantized data (t)) to derive a dequantization coefficient (dequantized data (t)). This dequantization coefficient (dequantized data (t)) is a reconstructed predicted residual value (Predicted data′ (t)). Since quantization is performed and it is irreversible depending on a quantization step width, this reconstructed predicted residual value (Predicted data′ (t)) may not be the same as the predicted residual value (Predicted data (t)) of the coding unit 152.


The arithmetic operation unit 513 adds data (dd′ (t)) obtained by delaying haptic data (Reconstructed data (t)) by the delay unit 514 to the dequantization coefficient (dequantized data (t)) to derive reconstructed haptic data (Reconstructed data (t)). That is, the arithmetic operation unit 513 generates haptic data of the current sample by adding the decoding result of the past sample to the predicted residual.


For example, if the haptic data (Input data (t)) value input to the coding unit 152 changes in the order of “2”→“8”→“4”→“6”→“10” in a time series, the haptic data (dd (t)), the predicted residual value (Predicted data (t)), the haptic data (dd′ (t)), and the haptic data (Reconstructed data (t)) become values as shown in FIG. 30. Generally, mapping is performed such that a code length decreases as an input value approaches zero in variable-length coding, and thus the amount of information can be reduced and reduction in coding efficiency can be curbed.
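A corresponding minimal sketch of the decoding side is shown below (variable-length decoding is omitted for brevity, and the quantization step is an illustrative assumption). For the input sequence 2, 8, 4, 6, 10 above, the residuals 2, 6, −4, 2, 4 are decoded back to 2, 8, 4, 6, 10 when the quantization step is 1.

# Minimal sketch of the decoding unit 162: dequantization and addition of the
# delayed reconstructed value. Variable-length decoding is omitted, and one
# quantization coefficient per sample is assumed as input.
def decode_haptic_frame(quantized, q_step=1):
    """quantized: quantization coefficients, one per sample of a haptic frame."""
    reconstructed = []
    prev = 0
    for i, q in enumerate(quantized):
        residual = q * q_step                            # dequantization
        value = residual if i == 0 else prev + residual  # add delayed reconstruction
        reconstructed.append(value)
        prev = value                                     # delayed data dd'(t)
    return reconstructed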


In a haptic frame, information such as a fulcrum articulation point position, a stimulus operation, and a force is predicted between samples, as shown in A to C of FIG. 31, and the frame is composed of unpredicted values and predicted residual values.


<Flow of Haptic Data Transmission Processing>

Next, processing executed in the local system 101 will be described. The local system 101 transmits haptic data by performing haptic data transmission processing. An example of the flow of this haptic data transmission processing will be described with reference to the flowchart of FIG. 32.


When haptic data transmission processing is started, the sensor unit 131 detects kinesthetic data, tactile data, force data, and the like at a plurality of observation points of the haptic interface 134 in step S101. In step S102, the sensor unit 131 generates multipoint haptic data using such information. In step S103, the composer 151 generates a KT map by performing integration of the multipoint haptic data for each observation point, and the like.


In step S104, the coding unit 152 codes the KT map using prediction between samples to generate coded data.


In step S105, the container processing unit 153 stores the coded data in a container and generates transmission data. In step S106, the container processing unit 153 transmits the transmission data.


When the transmission data including the haptic data is transmitted as described above, haptic data transmission processing ends.


In this way, haptic data of multiple observation points can be transmitted, and remote control can be performed with higher accuracy.


<Flow of Haptic Data Reception Processing>

Next, an example of a flow of haptic data reception processing will be described with reference to the flowchart of FIG. 33.


When haptic data reception processing is started, the container processing unit 161 receives transmitted data and extracts coded data in step S131. In step S132, the decoding unit 162 decodes the coded data to reproduce a KT map. In step S133, the decoding unit 162 reproduces the multipoint haptic data using the KT map.


In step S134, the renderer 132 performs rendering using the multipoint haptic data to generate control information for controlling the actuator 133.


In step S135, the actuator 133 drives the haptic interface 134 on the basis of the control information generated in step S134.


When the haptic interface 134 is driven, haptic data reception processing ends.


In this way, haptic data of a plurality of observation points can be received, and remote control can be performed with higher accuracy.


3. Second Embodiment
<Application of MPD>

As described above, the remote control system 100 can also apply MPD to reproduction of haptic data. In such a case, the local system 101 can generate MPD with respect to haptic data detected thereby and register the MPD in the MPD server 103.


<Flow of MPD Generation Processing>

An example of a flow of MPD generation processing for generating such MPD will be described with reference to the flowchart of FIG. 34.


When MPD generation processing is started, the MPD generation unit 154 acquires multipoint haptic data from the composer 151 and generates MPD corresponding to the multipoint haptic data in step S161.


In step S162, the container processing unit 153 transmits the MPD generated in step S161 to the MPD server 103. When the MPD is transmitted, MPD generation processing ends.


By generating MPD with respect to multipoint haptic data and registering it in the MPD server in this manner, reproduction control based on the MPD can be performed.


<Flow of MPD Control Processing>

An example of a flow of MPD control processing, which is processing related to reproduction control of haptic data using MPD, will be described with reference to the flowchart of FIG. 35.


When MPD control processing is started, the MPD control unit 163 requests and acquires MPD from the MPD server 103 via the container processing unit 161 in step S191.


In step S192, the MPD control unit 163 selects haptic data to be acquired on the basis of the MPD acquired through processing of step S191.


In step S193, the MPD control unit 163 requests the haptic data selected through processing of step S192 from the remote system 102 via the container processing unit 161.


Haptic data supplied on the basis of the aforementioned request can be acquired by the local system 101 performing the above-described haptic data reception processing.


<Configuration Example of Container>


A main configuration example of a container for storing haptic data is shown in A of FIG. 36. Haptic data is stored in, for example, a container (transmission data) in the ISO base media file format (ISOBMFF). In the case of this ISOBMFF, the container has an initialization segment (IS) and a media segment (MS), as shown in A of FIG. 36. Track identification information (trackID), a time stamp (Timestamp), and the like are stored in the MS. In addition, DASH MPD is associated with this MS.


A of FIG. 36 shows an example of a state of the container when the media segment is not fragmented. B of FIG. 36 shows an example of a state of the container when the media segment is fragmented. The media segment contains moof and mdat as a movie fragment.


DASH MPD is associated with moof of the media segment according to a group ID (Group_id). A of FIG. 37 shows an example in the case of transmission from a local side. B of FIG. 37 shows an example in the case of transmission from a remote side.


Further, when there are multiple haptic devices locally, one MPD can be associated with multiple moofs as shown in FIG. 38. Even in such a case, association is performed according to the group ID. As in this example, haptic data may be stored in different tracks for respective groups.


<Allocation of Track>


Regarding haptic data, one group may be allocated to one track of a media file. FIG. 39 shows an example of a container configuration when the haptic data of the example of FIG. 7 is stored. In addition, FIG. 40 shows an example of a container configuration when the haptic data of the example of FIG. 11 is stored. Further, FIG. 41 shows an example of a container configuration when the haptic data of the example of FIG. 16 is stored.


<Data Arrangement in File Format>

For example, as shown in FIG. 42, haptic data of all the observation points of one haptic device 121 (haptic interface 134) may be disposed in one track of a container. In this case, the track is allocated to one file and moof is disposed independently within the track. hapticSampleGroupEntry (‘hsge’) indicates an offset position between subparts at the same time, an offset position in ascending order on the time axis, an offset position between samples, and the like, along with the connection arrangement of each observation point. The same also applies when the coding target is tactile data. On the reproduction side, it is possible to easily pick up only a necessary subptrep group according to the structure of observation points by using subpt_grp_offset, which indicates an offset position between subparts.



FIG. 43 shows an example of a quantization table (Q_table), precision of value (Precision_of_positions), and connection chain (connection_chain). The connection_chain information indicates the placement order of coded sample elements when subpartrep_ordering_type is “interleaved.” It is also possible to transmit and receive velocity and acceleration information instead of position. In addition, it is also possible to transmit and receive rotposition and rotvelocity information instead of rotacceleration.


<Consideration for Error Tolerance>

When the decoding unit 162 variable-length-decodes the coding target data, the influence of deterioration due to an error in the transmission process can be minimized by detecting a predetermined code string (unique word) having a meaning different from that of normal data, detecting the beginning of the next time data or the next haptic frame, and restarting decoding processing therefrom. An example is shown in FIG. 44. In FIG. 44, a diagonally hatched rectangle indicates identification information for identifying the frame top of a haptic frame according to a variable-length-coded unique word. Further, a rectangle in gray indicates identification information for identifying the top of time data (Time_data Top) according to the variable-length-coded unique word.


For example, "1111111111" may be a unique word for identifying the frame top of a haptic frame, and "1111111110" may be a unique word for identifying the top of time data (Time Data Top). That is, N "1s" may continue at the frame top of the haptic frame, (N−1) "1s" may continue at the top of the time data (Time Data Top), and a maximum of (N−2) "1s" may continue in other pieces of coded data.


Even when packet loss occurs (which can be checked with the checksum of an upper layer, or the like), a unique word can be detected by checking the number of consecutive '1s', and recovery from a predetermined timing can be performed in the receiver.
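A minimal sketch of this recovery on the receiving side, assuming a bit-string input and N = 10 as in the example above, could look like the following.

# Minimal sketch of error recovery: scanning a coded bit string for the unique
# words marking the frame top (N consecutive 1s) and the top of time data
# ((N-1) consecutive 1s followed by 0). N = 10 matches the example above;
# the bit-string representation is an illustrative assumption.
def find_resync_points(bitstring, n=10):
    """Return (position, kind) pairs at which decoding can be restarted."""
    points = []
    run = 0
    for i, bit in enumerate(bitstring):
        if bit == "1":
            run += 1
            if run == n:
                points.append((i - n + 1, "haptic_frame_top"))
                run = 0
        else:
            if run == n - 1:
                points.append((i - n + 1, "time_data_top"))
            run = 0
    return points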


<Securing Reproduction Selection System on Receiving Side>

There are cases where devices on the transmitting and receiving sides have different scales of articulation points (that is, the number of observation points on the transmitting side differs from the number of control points on the receiving side). When the number (M) of articulation points operated on the transmitting side differs from the number (P) of articulation points reproducible on the receiving side, particularly when M>P, only a subpoint group corresponding to articulation points that can be reproduced by the receiving side may be decoded. For example, as shown in FIG. 45, only the subgroups (Sbgsp) to be reproduced may be selected, and other decoding or reproduction may be skipped. In the case of the example of FIG. 45, decoding or reproduction of the subgroups represented in gray (for example, Sbgsp4 and Sbgsp6) is skipped.


At that time, as a decision on the receiving side, the initial values of the position information of the articulation points are checked for each subgroup set included in each subrepresentation in the MPD. Subgroup sets having many articulation points whose position information is close to positions that can be reproduced in the receiver are adopted as reproduction targets. For example, when subgroup sets=1, 2, 3, and 5 are adopted, the subgroups (subgroupset) represented in gray in FIG. 45 are skipped. At the time of skipping, the Subpt_grp_offset information in the MPD is used.
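As a minimal sketch of this decision, assuming illustrative data structures and a distance threshold (neither of which is defined by the present system), subgroup sets could be adopted as follows.

# Minimal sketch of the receiving-side decision: adopting only subgroup sets
# whose articulation-point initial positions are close to points the receiver
# can reproduce, and skipping the rest. Data structures and the threshold are
# illustrative assumptions.
import numpy as np

def select_subgroup_sets(subgroup_sets, reproducible_points, threshold=0.05):
    """subgroup_sets: {set_id: [initial positions of articulation points]},
    reproducible_points: positions of articulation points the receiver can drive."""
    adopted = []
    repro = np.asarray(reproducible_points, dtype=float)
    for set_id, positions in subgroup_sets.items():
        pts = np.asarray(positions, dtype=float)
        # distance from each articulation point to its nearest reproducible point
        dists = np.linalg.norm(pts[:, None, :] - repro[None, :, :], axis=-1).min(axis=1)
        if np.all(dists <= threshold):
            adopted.append(set_id)     # adopt this subgroup set as a reproduction target
    return adopted                     # the others are skipped using Subpt_grp_offset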


When reproduction is performed in this manner, feedback information to the local side is transmitted from the remote side only for the adopted subgroups. According to this function, a wider range of types of reproduction devices can be used as haptic kinesthetic reproduction devices.



FIG. 46 shows an example of the moof syntax in the ISOBMFF file format corresponding to the KT map. In FIG. 46, /hsge and /connection_chain are elements provided to correspond to the KT map.


<Control of Distribution Target by MPD on Receiving Side>

The receiving side may access the MPD server 103, acquire and analyze an MPD file, consider the available bandwidth of the network on the receiving side, and select an appropriate bit rate. Further, depending on the device configuration on the receiving side, control for selecting the scale of observation points to be delivered may be performed such that the scale is within a reproducible range.
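For example, a minimal sketch of this selection, assuming that each candidate described in the MPD is summarized as a bit rate and a number of subgroups (the data structure is an illustrative assumption), could be written as follows.

# Minimal sketch of receiving-side selection: choosing, from the candidates
# described in the MPD, the highest bit rate that fits the available network
# bandwidth and whose observation-point scale is within the reproducible
# range. The candidate data structure is an illustrative assumption.
def select_representation(candidates, available_bps, max_subgroups):
    """candidates: list of dicts with 'bandwidth' (bps) and 'sub_groups' (count)."""
    feasible = [c for c in candidates
                if c["bandwidth"] <= available_bps and c["sub_groups"] <= max_subgroups]
    if not feasible:
        return None                      # nothing deliverable; request a smaller scale
    return max(feasible, key=lambda c: c["bandwidth"])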


In order to realize the above-described functions, a new schema is defined using supplementary descriptor in MPD.


For example, when the transmitting side receives a bit rate request via the MPD server 103, the transmitting side performs coding quantization control with a quantization step size that realizes the request. An example of MPD in such a case is shown in FIG. 47. FIG. 47 is an example of MPD corresponding to the haptic data of FIG. 11. The description of subrepresentation of FIG. 48 to FIG. 50 is inserted into the parts of frames 701 and 702 indicated by alternate long and short dash lines in FIG. 47. Position data, rotaccdate, and force data below subrepresentations of FIG. 48 to FIG. 50 indicate initial values of dynamic change. In this example, the total amount of coding bit rates can be selected from 4 Mbps or 2 Mbps.


In addition, when the transmitting side receives a request for reducing the scale of articulation points via the MPD server 103, the transmitting side reduces the number of subgroups (sub_group) and performs coding so as to realize the request. An example of MPD in such a case is shown in FIG. 51. FIG. 51 is an example of MPD corresponding to the haptic data of FIG. 11. The description of subrepresentation of FIG. 52 and FIG. 53 is inserted into the parts of frames 711 and 712 indicated by alternate long and short dash lines in FIG. 51. Position data, rotaccdate, and force data below subrepresentations of FIG. 52 and FIG. 53 indicate initial values of dynamic change. In this example, the number of subgroups can be selected from 6 and 4.


Although the examples of FIG. 47 to FIG. 53 describe MPD when haptic data is transmitted from a single kinesthetic device, haptic data may be transmitted from a plurality of kinesthetic devices. An example of MPD in such a case is shown in FIG. 54. In the case of the example of FIG. 54, two adaptation sets are provided. Different group IDs are allocated to the adaptation sets, and the adaptation sets correspond to different kinesthetic devices.


In addition, FIG. 55 shows an example of MPD corresponding to transmission of haptic data from the local system 101 to the remote system 102 when haptic data is bidirectionally transmitted. In this case, a request is performed for the remote system 102 such that the remote system 102 returns tactile data such as hardness, coefficient of friction, and temperature information in MPD corresponding to transmission from the local system 101. For example, <SupplementaryDesctiptor schemeIdUri=“urn:bicom:haptic3d:qrhardness” value=“ptB1queryhardness”/> in the MPD of FIG. 55 is a description of requesting that the remote system 102 will return “hardness” as tactile (tactile data). In addition, <SupplementaryDesctiptor schemeIdUri=“urn:bicom:haptic3d:qrfriction” value=“ptB1queryfriction”/> is a description requesting that the remote system 102 will return “coefficient of friction” as tactile (tactile data). Further, <SupplementaryDesctiptor schemeIdUri=“urn:bicom:haptic3d:qrtemp” value=“ptB1querytemperature”/> is a description requesting that the remote system 102 will return “temperature information” as tactile (tactile data). Similarly, <SupplementaryDesctiptor schemeIdUri=“urn:bicom:haptic3d:qrhardness” value=“ptB2queryhardness”/> in the MPD of FIG. 55 is a description requesting that the remote system 102 will return “hardness” as tactile (tactile data). In addition, <SupplementaryDesctiptor schemeIdUri=“urn:bicom:haptic3d:qrfriction” value=“ptB2queryfriction”/> is a description requesting that the remote system 102 will return “coefficient of friction” as tactile (tactile data). Further, <SupplementaryDesctiptor schemeIdUri=“urn:bicom:haptic3d:qrtemp” value=“ptB2querytemperature”/> is a description requesting that the remote system 102 will return “temperature information” as tactile (tactile data).


In addition, FIG. 56 shows an example of MPD corresponding to transmission (feedback transmission) of haptic data from the remote system 102 to the local system 101, which corresponds to the MPD of FIG. 55. In this feedback transmission, tactile data such as hardness, coefficient of friction, and temperature information are returned on the basis of requests in the MPD of FIG. 55. For example, <SupplementaryDesctiptor schemeIdUri=“urn:bicom:haptic3d:pthardness” value=“ptB1hardness”/> in the MPD of FIG. 56 is a description showing that “hardness” is fed back as tactile data and the initial value thereof is reflected in ptB1hardness in response to a request from the local system 101. In addition, <SupplementaryDesctiptor schemeIdUri=“urn:bicom:haptic3d:ptfriction” value=“ptB1friction”/> is a description showing that “coefficient of friction” is fed back as tactile data and the initial value thereof is reflected in ptB1friction in response to a request from the local system 101. Further, <SupplementaryDesctiptor schemeIdUri=“urn:bicom:haptic3d:pttemp” value=“ptB1temperature”/> is a description showing that “temperature information” is fed back as tactile data and the initial value thereof is reflected in ptB1temperature in response to a request from the local system 101. Similarly, <SupplementaryDesctiptor schemeIdUri=“urn:bicom:haptic3d:pthardness” value=“ptB2hardness”/> in the MPD of FIG. 56 is a description showing that “hardness” is fed back as tactile data and the initial value thereof is reflected in ptB2hardness in response to a request from the local system 101. In addition, <SupplementaryDesctiptor schemeIdUri=“urn:bicom:haptic3d:ptfriction” value=“ptB2friction”/> is a description showing that “coefficient of friction” is fed back as tactile data and the initial value thereof is reflected in ptB2friction in response to a request from the local system 101. Further, <SupplementaryDesctiptor schemeIdUri=“urn:bicom:haptic3d:pttemp” value=“ptB2temperature”/> is a description showing that “temperature information” is fed back as tactile data and the initial value thereof is reflected in ptB2temperature in response to a request from the local system 101.



FIG. 57 is a diagram showing another example of MPD corresponding to transmission of haptic data from the local system 101 to the remote system 102 when haptic data is bidirectionally transmitted. In this case, a request is performed for the remote system 102 such that the remote system 102 returns vibration information as frequency and amplitude values in the MPD corresponding to transmission from the local system 101. For example, <SupplementaryDesctiptor schemeIdUri=“urn:bicom:haptic3d:qrvibfreqency” value=“ptB1queryvibfrequency”/> in the MPD of FIG. 57 is a description requesting that the remote system 102 will return vibration information as a frequency. In addition, <SupplementaryDesctiptor schemeIdUri=“urn:bicom:haptic3d:qrvibamplitude” value=“ptB1queryvibamplitude”/> is a description requesting that the remote system 102 will return vibration information as an amplitude value. Similarly, the <SupplementaryDesctiptor schemeIdUri=“urn:bicom:haptic3d:qrvibfreqency” value=“ptB2queryvibfrequency”/> in the MPD of FIG. 57 is a description requesting that the remote system 102 will return vibration information as a frequency. In addition, <SupplementaryDesctiptor schemeIdUri=“urn:bicom:haptic3d:qrvibamplitude” value=“ptB2queryvibamplitude”/> is a description requesting that the remote system 102 will return vibration information as an amplitude value.


In addition, FIG. 58 shows an example of MPD corresponding to transmission (feedback transmission) of haptic data from the remote system 102 to the local system 101, which corresponds to the MPD of FIG. 57. In this feedback transmission, vibration information is returned as frequency and amplitude values on the basis of requests described in the MPD of FIG. 57. For example, <SupplementaryDesctiptor schemeIdUri=“urn:bicom:haptic3d:vibfreqency” value=“ptB1vibfrequency”/> in the MPD of FIG. 58 is a description showing that vibration information is fed back as a frequency and the initial value thereof is reflected in ptB1vibfrequency in response to a request from the local system 101. Further, for example, <SupplementaryDesctiptor schemeIdUri=“urn:bicom:haptic3d:vibamplitude” value=“ptB1vibamplitude”/> in the MPD of FIG. 58 is a description showing that vibration information is fed back as an amplitude value and the initial value thereof is reflected in ptB1vibamplitude in response to a request from the local system 101. Similarly, <SupplementaryDesctiptor schemeIdUri=“urn:bicom:haptic3d:vibfreqency” value=“ptB2vibfrequency”/> in the MPD of FIG. 58 is a description showing that vibration information is fed back as a frequency and the initial value thereof is reflected in ptB2vibfrequency in response to a request from the local system 101. Further, for example, <SupplementaryDesctiptor schemeIdUri=“urn:bicom:haptic3d:vibamplitude” value=“ptB2vibamplitude”/> in the MPD of FIG. 58 is a description showing that vibration information is fed back as an amplitude value and the initial value thereof is reflected in ptB2vibamplitude in response to a request from the local system 101.


Examples of MPD semantics are shown in FIG. 59 and FIG. 60.


4. Third Embodiment
<Digital Interface>

In one or both of the local system 101 and the remote system 102, transmission of haptic data between the haptic device 121 and the communication device 122 can be performed via the digital interface 123 or the digital interface 124 as shown in FIG. 3. As described above, an interface for digital devices of any standard can be applied to these digital interfaces. For example, FIG. 61 shows a main configuration example of the digital interface 123 when HDMI is applied. A main configuration example of the digital interface 124 when HDMI is applied is the same as the example of FIG. 61. In this case, transmission of haptic data between the haptic device 121 and the communication device 122 is performed using a transition minimized differential signaling (TMDS) channel of HDMI. Meanwhile, the same premise applies even if the data transmitted through the TMDS channel is transmitted through an optical transmission line or by radio waves.


Transmission of haptic data is performed in synchronization with video data or using a blanking period. Examples of states of insertion of metadata in this data island slot are shown in FIG. 62 and FIG. 63. In addition, an example of semantics is shown in FIG. 64. Further, an example of transmission formats of haptic data in the TMDS channel is shown in FIG. 65.


5. Supplement
<Computer>

The above-described series of processing can be executed by hardware or software. When the series of processing is performed by software, a program including the software is installed in a computer. Here, the computer includes a computer which is embedded in dedicated hardware or, for example, a general-purpose personal computer capable of executing various functions by installing various programs.



FIG. 66 is a block diagram showing an example of a hardware configuration of a computer that executes the above-described series of processing according to a program.


In a computer 900 illustrated in FIG. 66, a central processing unit (CPU) 901, a read only memory (ROM) 902, and a random access memory (RAM) 903 are connected to each other via a bus 904.


An input/output interface 910 is also connected to the bus 904. An input unit 911, an output unit 912, a storage unit 913, a communication unit 914, and a drive 915 are connected to the input/output interface 910.


The input unit 911 may include, for example, a keyboard, a mouse, a microphone, a touch panel, an input terminal, and the like. The output unit 912 includes, for example, a display, a speaker, an output terminal, and the like. The storage unit 913 includes, for example, a hard disk, a RAM disk, a non-volatile memory, and the like. The communication unit 914 includes, for example, a network interface. The drive 915 drives a removable recording medium 921 such as a magnetic disk, an optical disc, a magneto-optical disk, or a semiconductor memory.


In the computer configured as above, for example, the CPU 901 performs the above-described series of processing by loading a program stored in the storage unit 913 into the RAM 903 via the input/output interface 910 and the bus 904 and executing the program. Data and the like necessary for the CPU 901 to execute various kinds of processing are also appropriately stored in the RAM 903.


A program executed by a computer can be recorded on the removable recording medium 921 as a package medium or the like and provided in that form, for example. In such a case, the program can be installed in the storage unit 913 via the input/output interface 910 by mounting the removable recording medium 921 in the drive 915.


This program can also be provided via a wired or wireless transmission medium such as a local area network, a leased line network, a WAN, the Internet, or satellite communication. In this case, the program can be received by the communication unit 914 and installed in the storage unit 913.


In addition, the program can also be installed in advance in the ROM 902 or the storage unit 913.


<Application Target of Present Technology>

Although each device of the remote control system 100 has been described above as an example to which the present technology is applied, the present technology can be applied to any configuration.


For example, the present technology can be applied to various electronic apparatuses such as a transmitter or a receiver (for example, a television receiver or a mobile phone) in satellite broadcasting, wired broadcasting such as cable TV, transmission on the Internet, a local area network, a network using a dedicated line, or a WAN, transmission to a terminal through cellular communication, and the like, or a device (for example, a hard disk recorder or a camera) that records an image on media such as an optical disc, a magnetic disk, and a flash memory or reproduces an image from these storage media.


For example, the present technology can be implemented as a configuration of a part of a device such as a processor (for example, a video processor) of a system large scale integration (LSI), a module (for example, a video module) using a plurality of processors or the like, a unit (for example, a video unit) using a plurality of modules or the like, or a set (for example, a video set) with other functions added to the unit.


For example, the present technology can also be applied to a network system configured by a plurality of devices. For example, the present technology may be implemented as cloud computing shared or processed in cooperation with a plurality of devices via a network. For example, the present technology can be implemented in a cloud service providing a service related to images (moving images) to any terminal such as a computer, an audio visual (AV) device, a portable information processing terminal, or an Internet of things (IoT) device.


In the present specification, a system means a set of a plurality of constituent elements (devices, modules (parts), or the like), and all the constituent elements do not need to be in the same casing. Accordingly, a plurality of devices accommodated in separate casings and connected via a network, and a single device accommodating a plurality of modules in a single casing, are both systems.




<Others>

The embodiments of the present technology are not limited to the above-described embodiments, and various modifications can be made without departing from the essential spirit of the present technology.


For example, a configuration described as one device (or processing unit) may be divided and configured as a plurality of devices (or processing units). Conversely, a configuration described above as a plurality of devices (or processing units) may be collected and configured as one device (or processing unit). A configuration other than the above-described configurations may be added to the configuration of each device (or each processing unit). Further, as long as the configuration and operation of the system as a whole are substantially the same, a part of the configuration of a certain device (or processing unit) may be included in the configuration of another device (or another processing unit).


For example, the above-described program may be executed in any device. In this case, the device may have a necessary function (a functional block or the like) and may be able to obtain necessary information.


For example, each step of one flowchart may be executed by one device or may be shared and executed by a plurality of devices. Further, when a plurality of kinds of processing are included in one step, the plurality of kinds of processing may be performed by one device or may be shared and performed by a plurality of devices. In other words, a plurality of kinds of processing included in one step can also be executed as processing of a plurality of steps. In contrast, processing described as a plurality of steps can be collectively performed as one step.


For example, for a program executed by a computer, processing of steps of describing the program may be performed chronologically in order described in the present specification or may be performed in parallel or individually at a necessary timing such as the time of calling. That is, processing of each step may be performed in order different from the above-described order as long as inconsistency does not occur. Further, processing of steps describing the program may be performed in parallel to processing of another program or may be performed in combination with processing of another program.


For example, a plurality of technologies related to the present technology can be implemented independently alone as long as inconsistency does not occur. Of course, any number of modes of the present technology may be used in combination. For example, part or all of the present technology described in any of the embodiments may be implemented in combination with part or all of the present technology described in the other embodiments. Further, part or all of the above-described present technology may be implemented in combination with other technologies not described above.


The present technology can also be configured as follows.


(1) An information processing device including a transmission unit that transmits haptic data including at least one piece of kinesthetic data including information on a kinesthetic sensation detected at a plurality of observation points, tactile data including information on a tactile sensation detected at the observation points, and force data including information on a magnitude of force applied to the observation points with respect to the observation points of a device serving as an interface.


(2) The information processing device according to (1), wherein the kinesthetic data includes information on positions of the observation points or information on velocity of the observation points.


(3) The information processing device according to (1) or (2), wherein the tactile data includes at least one piece of information on a hardness of another object in contact with the observation points, information on a coefficient of friction of the object, and information on a temperature of the object.


(4) The information processing device according to any one of (1) to (3), wherein the haptic data includes at least one piece of the kinesthetic data, the tactile data, and the force data classified for each observation point.


(5) The information processing device according to (4), wherein the haptic data further includes information indicating a relationship between the observation points.


(6) The information processing device according to (4) or (5), wherein the transmission unit stores the haptic data for each observation point in different tracks for respective groups of the observation points according to a control structure of the device and transmits the haptic data.


(7) The information processing device according to any one of (1) to (6), further including a coding unit that codes the haptic data to generate coded data, wherein the transmission unit transmits the coded data generated by the coding unit.


(8) The information processing device according to (7), wherein the coding unit performs prediction between samples to derive a predicted residual and codes the predicted residual to generate the coded data with respect to the haptic data.


(9) The information processing device according to any one of (1) to (8), further including a generation unit that generates control information for controlling reproduction of the haptic data,


wherein the transmission unit transmits the control information generated by the generation unit.


(10) An information processing method including transmitting haptic data including at least one piece of kinesthetic data including information on a kinesthetic sensation detected at a plurality of observation points, tactile data including information on a tactile sensation detected at the observation points, and force data including information on a magnitude of force applied to the observation points with respect to the observation points of a device serving as an interface.


(11) An information processing device including a reception unit that receives haptic data including at least one piece of kinesthetic data including information on a kinesthetic sensation detected at a plurality of observation points, tactile data including information on a tactile sensation detected at the observation points, and force data including information on a magnitude of force applied to the observation points with respect to the observation points of a device serving as an interface, and


a driving unit that drives a plurality of driven points of a device serving as an interface on the basis of the haptic data received by the reception unit.


(12) The information processing device according to (11), further including a control unit that controls the reception unit on the basis of control information for controlling reproduction of the haptic data,


wherein the reception unit receives the haptic data designated by the control unit.


(13) The information processing device according to (12), wherein the reception unit receives the haptic data at a bit rate designated by the control unit.


(14) The information processing device according to (12) or (13), wherein the reception unit receives the haptic data of a group of the observation points designated by the control unit.


(15) The information processing device according to any one of (11) to (14), wherein the reception unit receives coded data of the haptic data, and the information processing device further includes a decoding unit that decodes the coded data received by the reception unit to generate the haptic data.


(16) The information processing device according to (15), wherein the coded data is coded data of a predicted residual of the haptic data, derived by performing prediction between samples, and


the decoding unit decodes the coded data to generate the predicted residual, and adds a decoding result of a past sample to the predicted residual to generate the haptic data of a current sample.


(17) The information processing device according to (16), wherein the decoding unit detects a predetermined code string included in the coded data and identifies a start position of prediction between the samples.


(18) The information processing device according to any one of (11) to (17), further including a rendering processing unit that performs rendering processing of generating control information for controlling the driving unit on the basis of haptic data detected at observation points of the device having the driven points and the haptic data received by the receiving unit,


wherein the driving unit drives the plurality of driven points of the device on the basis of the control information generated by the rendering processing unit.


(19) The information processing device according to (18), wherein the rendering processing unit corrects the control information using absolute coordinates.


(20) An information processing method including receiving haptic data including at least one piece of kinesthetic data including information on a kinesthetic sensation detected at a plurality of observation points, tactile data including information on a tactile sensation detected at the observation points, and force data including information on a magnitude of force applied to the observation points with respect to the observation points of a device serving as an interface, and


driving a plurality of driven points of a device serving as an interface on the basis of the received haptic data.


REFERENCE SIGNS LIST




  • 100 Remote control system


  • 101 Local system


  • 102 Remote system


  • 103 MPD server


  • 121 Haptic device


  • 122 Communication device


  • 131 Sensor unit


  • 132 Renderer


  • 133 Actuator


  • 134 Haptic interface


  • 141 Image sensor


  • 142 Spatial coordinate conversion unit


  • 151 Composer


  • 152 Coding unit


  • 153 Container processing unit


  • 154 MPD generation unit


  • 155 Imaging unit


  • 156 Video coding unit


  • 161 Container processing unit


  • 162 Decoding unit


  • 163 MPD control unit


  • 164 Video decoding unit


  • 165 Display unit


  • 301 Three-dimensional stimulation device


  • 302 Object


  • 311 Glove-type device


  • 312 Object


  • 331 KT map


  • 351 Remote device


  • 352 Object


  • 353 Virtual object


  • 361 Flying object


  • 362 Robot hand


  • 371 Remote device


  • 372 Object surface


  • 381 Head mounted display


  • 391 Robot hand


  • 401 Synthesis unit


  • 411 Positional relationship determination unit


  • 412 Position correction unit


  • 420 Device


  • 421 Acceleration sensor


  • 451 Haptic frame


  • 501 Delay unit


  • 502 Arithmetic operation unit


  • 503 Quantization unit


  • 504 Variable length coding unit


  • 511 Variable length decoding unit


  • 512 Dequantization unit


  • 513 Arithmetic operation unit


  • 514 Delay unit


Claims
  • 1. An information processing device comprising a transmission unit that transmits haptic data including at least one piece of kinesthetic data including information on a kinesthetic sensation detected at a plurality of observation points, tactile data including information on a tactile sensation detected at the observation points, and force data including information on a magnitude of force applied to the observation points with respect to the observation points of a device serving as an interface.
  • 2. The information processing device according to claim 1, wherein the kinesthetic data includes information on positions of the observation points or information on velocity of the observation points.
  • 3. The information processing device according to claim 1, wherein the tactile data includes at least one piece of information on a hardness of another object in contact with the observation points, information on a coefficient of friction of the object, and information on a temperature of the object.
  • 4. The information processing device according to claim 1, wherein the haptic data includes at least one piece of the kinesthetic data, the tactile data, and the force data classified for each observation point.
  • 5. The information processing device according to claim 4, wherein the haptic data further includes information indicating a relationship between the observation points.
  • 6. The information processing device according to claim 4, wherein the transmission unit stores the haptic data for each observation point in different tracks for respective groups of the observation points according to a control structure of the device and transmits the haptic data.
  • 7. The information processing device according to claim 1, further comprising a coding unit that codes the haptic data to generate coded data, wherein the transmission unit transmits the coded data generated by the coding unit.
  • 8. The information processing device according to claim 7, wherein the coding unit performs, with respect to the haptic data, prediction between samples to derive a predicted residual and codes the predicted residual to generate the coded data.
  • 9. The information processing device according to claim 1, further comprising a generation unit that generates control information for controlling reproduction of the haptic data, wherein the transmission unit transmits the control information generated by the generation unit.
  • 10. An information processing method comprising transmitting haptic data including at least one piece of kinesthetic data including information on a kinesthetic sensation detected at a plurality of observation points, tactile data including information on a tactile sensation detected at the observation points, and force data including information on a magnitude of force applied to the observation points with respect to the observation points of a device serving as an interface.
  • 11. An information processing device comprising: a reception unit that receives haptic data including at least one piece of kinesthetic data including information on a kinesthetic sensation detected at a plurality of observation points, tactile data including information on a tactile sensation detected at the observation points, and force data including information on a magnitude of force applied to the observation points with respect to the observation points of a device serving as an interface; and a driving unit that drives a plurality of driven points of a device serving as an interface on the basis of the haptic data received by the reception unit.
  • 12. The information processing device according to claim 11, further comprising a control unit that controls the reception unit on the basis of control information for controlling reproduction of the haptic data, wherein the reception unit receives the haptic data designated by the control unit.
  • 13. The information processing device according to claim 12, wherein the reception unit receives the haptic data at a bit rate designated by the control unit.
  • 14. The information processing device according to claim 12, wherein the reception unit receives the haptic data of a group of the observation points designated by the control unit.
  • 15. The information processing device according to claim 11, wherein the reception unit receives coded data of the haptic data, and the information processing device further includes a decoding unit that decodes the coded data received by the reception unit to generate the haptic data.
  • 16. The information processing device according to claim 15, wherein the coded data is coded data of a predicted residual of the haptic data, derived by performing prediction between samples, and the decoding unit decodes the coded data to generate the predicted residual, and adds a decoding result of a past sample to the predicted residual to generate the haptic data of a current sample.
  • 17. The information processing device according to claim 16, wherein the decoding unit detects a predetermined code string included in the coded data and identifies a start position of prediction between the samples.
  • 18. The information processing device according to claim 11, further comprising a rendering processing unit that performs rendering processing of generating control information for controlling the driving unit on the basis of haptic data detected at observation points of the device having the driven points and the haptic data received by the reception unit, wherein the driving unit drives the plurality of driven points of the device on the basis of the control information generated by the rendering processing unit.
  • 19. The information processing device according to claim 18, wherein the rendering processing unit corrects the control information using absolute coordinates.
  • 20. An information processing method comprising: receiving haptic data including at least one piece of kinesthetic data including information on a kinesthetic sensation detected at a plurality of observation points, tactile data including information on a tactile sensation detected at the observation points, and force data including information on a magnitude of force applied to the observation points with respect to the observation points of a device serving as an interface; and driving a plurality of driven points of a device serving as an interface on the basis of the received haptic data.
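
As a purely illustrative picture of the data layout recited in claims 1 and 4 to 6 (and of the haptic frame, reference sign 451), the Python sketch below holds haptic data per observation point and groups the points into tracks according to an assumed control structure of the device. Every field name, group name, and value here is a hypothetical assumption rather than a format defined by this application.

```python
# Hypothetical per-observation-point haptic data grouped into tracks (cf. claims 1, 4-6).
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple


@dataclass
class ObservationPointSample:
    point_id: int                                              # identifies the observation point
    kinesthetic: Optional[Tuple[float, float, float]] = None   # e.g. position of the point
    tactile: Optional[Dict[str, float]] = None                 # e.g. hardness, friction, temperature
    force: Optional[float] = None                              # magnitude of force applied to the point


@dataclass
class HapticFrame:
    # Each group of observation points (e.g. one finger) is stored in its own track,
    # and relationships between points can be carried alongside the per-point data.
    tracks: Dict[str, List[ObservationPointSample]] = field(default_factory=dict)
    relations: List[Tuple[int, int]] = field(default_factory=list)


# Minimal usage example with made-up values.
frame = HapticFrame()
frame.tracks["index_finger"] = [
    ObservationPointSample(point_id=1, kinesthetic=(0.10, 0.02, 0.35), force=0.8),
    ObservationPointSample(point_id=2, tactile={"hardness": 0.6, "friction": 0.3, "temperature": 24.0}),
]
frame.relations.append((1, 2))   # points 1 and 2 belong to the same group of the device
```

Because each group is carried in its own track, a receiver could, on the basis of control information, request only particular groups or a lower bit rate, in line with claims 12 to 14.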
Priority Claims (1)
  • Number: 2019-227550; Date: Dec. 2019; Country: JP; Kind: national
PCT Information
  • Filing Document: PCT/JP2020/044985; Filing Date: 12/3/2020; Country: WO