The present disclosure relates to an information processing device, an information processing method, and a program.
Various technologies for presenting a haptic stimulus such as vibration to a user have been conventionally proposed. As an example, there is a technology of presenting, to a user, a haptic stimulus based on sensing information regarding the user. For example, Patent Document 1 below discloses a technology of presenting, to a driver, a haptic stimulus determined on the basis of sensing information regarding a situation surrounding a vehicle.
The technology disclosed in Patent Document 1 is intended to notify a driver who drives a vehicle of an emergency. It is therefore sufficient for the driver to recognize the presented haptic stimulus as an emergency notification, and the realism of the haptic stimulus is not considered at all.
In view of this, the present disclosure proposes a novel and improved information processing device, information processing method, and program capable of presenting a more realistic haptic stimulus.
The present disclosure provides an information processing device including: an acquisition unit that acquires sensing information regarding a user and first haptic information unique to a haptic presentation object; and a data processing unit that generates second haptic information from the first haptic information on the basis of the sensing information, the second haptic information being used in a case where a haptic presentation device presents a haptic stimulus to the user.
Further, the present disclosure provides an information processing method executed by a processor, the method including: acquiring sensing information regarding a user and first haptic information unique to a haptic presentation object; and generating second haptic information from the first haptic information on the basis of the sensing information, the second haptic information being used in a case where a haptic presentation device presents a haptic stimulus to the user.
Furthermore, the present disclosure provides a program for causing a computer to function as: an acquisition unit that acquires sensing information regarding a user and first haptic information unique to a haptic presentation object; and a data processing unit that generates second haptic information from the first haptic information on the basis of the sensing information, the second haptic information being used in a case where a haptic presentation device presents a haptic stimulus to the user.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. Note that, in this specification and the drawings, components having substantially the same functional configurations will be represented as the same reference signs, and repeated description thereof will be omitted.
Note that description will be provided in the following order.
A technology according to an embodiment of the present disclosure relates to an information processing device that presents a haptic stimulus based on sensing information to a user. The information processing device according to the present embodiment generates second haptic information from first haptic information unique to a haptic presentation object on the basis of sensing information regarding a user, the second haptic information being used in a case where a haptic presentation device presents a haptic stimulus to the user.
The sensing information regarding the user can include various types of information. For example, the sensing information includes contact information indicating a contact state between the user and the haptic presentation device. Examples of the contact information encompass a moving speed (acceleration) of a part of the user in contact with the haptic presentation device (hereinafter, also referred to as “contact part”), a pressure applied from the contact part to the haptic presentation device, and an area where the contact part and the haptic presentation device are in contact with each other. Note that the contact information is not limited to such examples. With such a configuration, the information processing device can generate the second haptic information from the first haptic information on the basis of the contact information.
Further, the sensing information includes non-contact information indicating a non-contact state between the user and the haptic presentation device. Examples of the non-contact information encompass a body temperature of the user, a humidity of a body surface of the user, and a distance from the haptic presentation device to the contact part. Note that the non-contact information is not limited to such examples. With such a configuration, the information processing device can generate the second haptic information from the first haptic information on the basis of the non-contact information.
Further, the sensing information includes environmental information regarding a surrounding environment of the user. Examples of the environmental information encompass a temperature and humidity of a space where the user exists. Note that the environmental information is not limited to such examples. With such a configuration, the information processing device can generate the second haptic information from the first haptic information on the basis of the environmental information.
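For illustration, the three categories above can be pictured as a single record handed from the sensors to the generation processing described later. The following Python sketch is purely illustrative; every field name is a hypothetical placeholder, since the present disclosure defines only the categories and examples above, not a concrete schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensingInfo:
    """Sensing information regarding the user, grouped into the three
    categories above. All field names are hypothetical placeholders."""
    # Contact information: contact state between the user and the device.
    contact_speed_mm_s: Optional[float] = None    # moving speed of the contact part
    contact_pressure_pa: Optional[float] = None   # pressure applied by the contact part
    contact_area_mm2: Optional[float] = None      # area of the contact part
    # Non-contact information: state of the user away from the device.
    body_temperature_c: Optional[float] = None
    body_surface_humidity: Optional[float] = None
    distance_to_device_mm: Optional[float] = None
    # Environmental information: surrounding environment of the user.
    ambient_temperature_c: Optional[float] = None
    ambient_humidity: Optional[float] = None
```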
The haptic presentation object is a target object based on which a haptic stimulus is presented to the user via the haptic presentation device. Information regarding the haptic presentation object can be managed in association with, for example, an image of the haptic presentation object.
The first haptic information includes information regarding a haptic stimulus that is transmitted to the user when the user actually touches the haptic presentation object. For example, the first haptic information includes information indicating a quantified intensity of a haptic stimulus (hereinafter, also referred to as “haptic stimulus value”).
The haptic stimulus value is information unique to the haptic presentation object. The haptic stimulus value can be set for each predetermined region. For example, in a case where the haptic presentation object is shown as an image, the haptic stimulus value may be set for each pixel of the image. Further, one haptic stimulus value may be set for a plurality of pixels. Furthermore, the image showing the haptic presentation object may be divided into a plurality of regions of any size, and the haptic stimulus value may be set for each region. Hereinafter, a region where the haptic stimulus value is set will also be referred to as “haptic stimulus value region”. Further, in the following description, “information density” indicates an amount of haptic stimulus values that are set per unit area of the image.
The second haptic information is information generated from the first haptic information on the basis of the sensing information. For example, the second haptic information is generated by changing (hereinafter, also referred to as “processing”) the haptic stimulus value included in the first haptic information on the basis of the sensing information. Hereinafter, processing of generating the second haptic information will also be referred to as “generation processing”.
After the second haptic information is generated, the information processing device causes the haptic presentation device to present, to the user, a haptic stimulus based on the generated second haptic information. For example, the information processing device maps the haptic stimulus value included in the second haptic information onto a region where the haptic presentation device can present a haptic sensation (hereinafter, also referred to as “haptic presentation region”). Then, the information processing device causes the haptic presentation device to present, to the user, a haptic stimulus of an intensity indicated by the haptic stimulus value mapped onto a position in the haptic presentation region touched by the user.
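The flow described so far — acquiring the first haptic information, generating the second haptic information on the basis of the sensing information, mapping it onto the haptic presentation region, and reading the value at the touched position — can be sketched minimally as follows, using the SensingInfo sketch above. The pressure-based gain is a placeholder for the generation processing detailed in section 3-2-2, not the disclosed rule.

```python
import numpy as np

def generate_second_haptic_info(first: np.ndarray, sensing: SensingInfo) -> np.ndarray:
    """Generation processing: change the haptic stimulus values included in
    the first haptic information on the basis of the sensing information.
    The pressure-based gain below is a placeholder rule, not the disclosed one."""
    gain = 1.0
    if sensing.contact_pressure_pa is not None:
        gain += sensing.contact_pressure_pa / 1000.0  # arbitrary illustration constant
    return first * gain

def read_stimulus(second: np.ndarray, touch_xy: tuple) -> float:
    """Read the haptic stimulus value mapped onto the position in the
    haptic presentation region touched by the user."""
    x, y = touch_xy
    return float(second[y, x])

# First haptic information: one haptic stimulus value per pixel of a 64x64 image.
first_info = np.random.rand(64, 64)
second_info = generate_second_haptic_info(first_info, SensingInfo(contact_pressure_pa=500.0))
intensity = read_stimulus(second_info, (10, 20))  # intensity presented at pixel (10, 20)
```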
A haptic sensation that the user feels when touching an object in a real space typically depends on the way the user touches the object and characteristics unique to the object such as a material and hardness of the object. In this regard, the information processing device according to the present embodiment generates the second haptic information on the basis of the sensing information corresponding to the way the user touches the object and the first haptic information corresponding to the characteristics unique to the object, thereby presenting a realistic haptic stimulus to the user.
(Outline of Processing)
Herein, an outline of processing according to the embodiment of the present disclosure will be described with reference to
The haptic presentation object 62 in the example of
The user moves his/her hand from a position of a hand 52a to a position of a hand 52b while keeping the hand in contact with the haptic presentation unit 160. At this time, for example, an amount of change in a moving speed obtained when the user moves his/her hand is acquired as the sensing information. In a case where the sensing information is acquired, the haptic presentation device 10 performs generation processing. In the generation processing, the haptic presentation device 10 processes the first haptic information 72 on the basis of the sensing information, thereby generating second haptic information 74 that reflects a change in the way the user touches.
The generated second haptic information 74 is mapped onto the haptic presentation unit 160. Then, the haptic presentation unit 160 presents a haptic stimulus to the user on the basis of the haptic stimulus value of the second haptic information 74 mapped onto a position of the haptic presentation unit 160 touched by the user.
(Exemplary Presentation of Haptic Stimulus)
Here, an exemplary presentation of a haptic stimulus according to the embodiment of the present disclosure will be described with reference to
As illustrated in an upper diagram of
A graph in a lower diagram of
Here, the problems to be solved are summarized. General haptic presentation devices do not present, to the user, a haptic stimulus obtained by processing a haptic stimulus value on the basis of the sensing information. Therefore, such devices do not generate the second haptic information from the first haptic information even if information indicating a change in the way the user touches is acquired as the sensing information.
The embodiment of the present disclosure has been made in view of the above point, and proposes a technology capable of presenting a more realistic haptic stimulus. Hereinafter, the present embodiment will be sequentially described in detail.
First, a configuration example of an information processing system according to the embodiment of the present disclosure will be described with reference to
<2-1. System Configuration>
As illustrated in
(1) Haptic Presentation Device 10
The haptic presentation device 10 is a device (information processing device) that presents a haptic stimulus to an arbitrary target. For example, the haptic presentation device 10 presents a haptic stimulus to a part of the user in contact with the haptic presentation device.
The haptic presentation device 10 is connected to the server 20 via the network 50, and can transmit and receive information to and from the server 20. Further, the haptic presentation device 10 is connected to the sensor device 30 via the network 50, and can transmit and receive information to and from the sensor device 30. Furthermore, the haptic presentation device 10 is connected to the display device 40 via the network 50, and can cause the display device 40 to display an image of the haptic presentation object.
In the haptic presentation device 10, haptic presentation processing is performed by the information processing device according to the present embodiment. For example, the information processing device is provided in the haptic presentation device 10, and performs the haptic presentation processing of causing the haptic presentation unit of the haptic presentation device 10 to present a haptic stimulus. Hereinafter, an example where the information processing device is provided in the haptic presentation device 10 will be described. However, a device in which the information processing device is provided is not limited to the haptic presentation device 10, and may be any device. For example, the information processing device may be provided in the server 20 to control the haptic presentation processing in the haptic presentation device 10 via the network 50.
(2) Server 20
The server 20 is a server device having a function of storing information regarding the haptic presentation processing of the haptic presentation device 10. For example, the server 20 may be a haptic information server that stores the first haptic information.
The server 20 is connected to the haptic presentation device 10 via the network 50, and can transmit and receive information to and from the haptic presentation device 10. For example, the server 20 transmits the first haptic information to the haptic presentation device 10 via the network 50.
(3) Sensor Device 30
The sensor device 30 has a function of sensing information used for processing in the haptic presentation device 10. For example, the sensor device 30 senses the sensing information regarding the user. After sensing, the sensor device 30 transmits the sensing information to the haptic presentation device 10 via the network 50.
The sensor device 30 can include various sensor devices. As an example, the sensor device 30 may include a camera, a thermosensor, and a humidity sensor. Note that the sensor devices included in the sensor device 30 are not limited to such examples, and any other sensor device may be included.
The camera is an imaging device that includes a lens system, a drive system, and an imaging element of an RGB camera or the like and captures an image (a still image or a moving image). For example, the camera captures a captured image showing a contact state between the user and the haptic presentation device 10. Therefore, the camera is desirably provided at a position at which the contact state between the user and the haptic presentation device 10 can be imaged. With such a configuration, the sensor device 30 can acquire the captured image showing the contact state between the user and the haptic presentation device 10 as the contact information.
The thermosensor is a device that senses a temperature. The thermosensor can sense various temperatures. For example, the thermosensor senses a temperature of a space where the user exists. Further, the thermosensor senses the body temperature of the user. Furthermore, the thermosensor senses a temperature of an object (e.g., the haptic presentation device 10) with which the user is in contact. With such a configuration, the sensor device 30 can acquire the temperature of the space where the user exists as the environmental information, the body temperature of the user as the non-contact information, and the temperature of the object in contact with the user as the contact information.
The humidity sensor is a device that senses a humidity. The humidity sensor can sense various humidities. For example, the humidity sensor senses a humidity of the space where the user exists. Further, the humidity sensor senses a humidity of the body surface of the user. Furthermore, the humidity sensor senses a humidity of a contact position between the user and an object (e.g., the haptic presentation device 10). With such a configuration, the sensor device 30 can acquire the humidity of the space where the user exists as the environmental information, the humidity of the body surface of the user as the non-contact information, and the humidity of the contact position between the user and the object as the contact information.
(4) Display Device 40
The display device 40 has a function of displaying an image regarding the haptic presentation processing of the haptic presentation device 10. For example, in a case where the haptic presentation object is an image, the display device 40 displays the image.
The display device 40 is connected to the haptic presentation device 10 via the network 50, and can transmit and receive information to and from the haptic presentation device 10. For example, the display device 40 receives an image of the haptic presentation object from the haptic presentation device 10 via the network 50 and displays the image.
The display device 40 can be achieved by various devices. For example, the display device 40 is achieved by a terminal device including a display unit, such as a personal computer (PC), a smartphone, a tablet terminal, a wearable terminal, or an agent device.
Note that the display device 40 may be achieved by a display. Examples of the display encompass a CRT display, a liquid crystal display, a plasma display, and an EL display. Further, the display device 40 may be achieved by a laser projector, an LED projector, or the like.
(5) Network 50
The network 50 has a function of connecting the haptic presentation device 10 and the server 20 and connecting the haptic presentation device 10 and the sensor device 30. The network 50 may include public networks such as the Internet, a telephone network, and a satellite communication network, various local area networks (LANs) including Ethernet (registered trademark), wide area networks (WANs), and the like. Further, the network 50 may include a dedicated network such as the Internet protocol-virtual private network (IP-VPN). Furthermore, the network 50 may include wireless communication networks such as Wi-Fi (registered trademark) and Bluetooth (registered trademark).
<2-2. Functional Configuration>
Next, a functional configuration of the haptic presentation device 10 according to the embodiment of the present disclosure will be described. As illustrated in
(1) Communication Unit 110
The communication unit 110 has a function of communicating with an external device. For example, in communicating with the external device, the communication unit 110 outputs information received from the external device to the control unit 140. Specifically, in communicating with the server 20 via the network 50, the communication unit 110 receives the first haptic information from the server 20 and outputs the first haptic information to the control unit 140.
For example, in communicating with the external device, the communication unit 110 transmits information input from the control unit 140 to the external device. Specifically, the communication unit 110 transmits, to the server 20, information indicating the haptic presentation object serving as a target from which the first haptic information is acquired. The information is input from an acquisition unit 142 of the control unit 140 at the time of acquiring the first haptic information.
(2) Sensor Unit 120
The sensor unit 120 has a function of sensing information used for processing in the control unit 140. For example, the sensor unit 120 senses the sensing information regarding the user. After sensing, the sensor unit 120 outputs the sensing information to the control unit 140.
The sensor unit 120 can include various sensor devices. As an example, the sensor unit 120 may include a touchscreen, a pressure-sensitive sensor, an acceleration sensor, a gyro sensor, and a proximity sensor. Note that the sensor devices included in the sensor unit 120 are not limited to such examples, and any other sensor device may be included. As an example, the sensor unit 120 may include the camera, the thermosensor, and the humidity sensor described above as the sensor devices that can be included in the sensor device 30.
The touchscreen is a device that senses a contact state. For example, the touchscreen detects whether or not the touchscreen is in contact with a target. As an example, the touchscreen detects whether or not the user and the haptic presentation unit 160 are in contact with each other. Further, the touchscreen senses a speed while the target is in contact with the touchscreen. As an example, in a case where the user touches the haptic presentation unit 160, the touchscreen senses a speed at which the user moves the contact part. With such a configuration, the sensor unit 120 can acquire, as the contact information, information indicating whether or not the user is in contact with the haptic presentation unit 160 and a moving speed of the contact part.
The pressure-sensitive sensor is a device that senses a pressure. For example, the pressure-sensitive sensor senses a pressure applied to the pressure-sensitive sensor when the pressure-sensitive sensor is brought into contact with a target. As an example, in a case where the user and the haptic presentation unit 160 are brought into contact with each other, the pressure-sensitive sensor senses a pressure applied to the contact part. Further, in a case where the pressure-sensitive sensor is brought into contact with the target, the pressure-sensitive sensor senses a contact area with the target. As an example, in a case where the user and the haptic presentation unit 160 are brought into contact with each other, the pressure-sensitive sensor senses an area of the contact part. With such a configuration, in a case where the user and the haptic presentation unit 160 are brought into contact with each other, the sensor unit 120 can acquire the pressure applied to the contact part and the area of the contact part as the contact information.
The acceleration sensor is a device that senses acceleration. For example, the acceleration sensor senses acceleration that is an amount of change in speed at which a target moves. As an example, the acceleration sensor senses the acceleration when the user moves the contact part in contact with the haptic presentation unit 160. With such a configuration, the sensor unit 120 can acquire, as the contact information, the acceleration when the user moves the contact part.
The gyro sensor is a device that senses an angular velocity. For example, the gyro sensor senses an angular velocity that is an amount of change in a posture of the target. As an example, in a case where the haptic presentation device 10 is achieved as a device held and operated by the user, the gyro sensor senses the angular velocity when the user changes a posture of the haptic presentation device 10. With such a configuration, the sensor unit 120 can acquire, as the contact information, the angular velocity when the user changes the posture of the haptic presentation device 10.
The proximity sensor is a device that detects a nearby object. The proximity sensor may be achieved by various devices. As an example, the proximity sensor may be achieved by a depth camera that senses distance information regarding an object ahead. With such a configuration, the sensor unit 120 can acquire, as the non-contact information, a distance from the haptic presentation unit 160 to a part of the user that is expected to come into contact with the haptic presentation unit 160.
The control unit 140 is an information processing device having a function of controlling the entire operation of the haptic presentation device 10. In order to achieve the function, the control unit 140 includes the acquisition unit 142, a data processing unit 144, and a haptic presentation control unit 146 as illustrated in
(3-1. Acquisition Unit 142)
The acquisition unit 142 has a function of acquiring the sensing information. For example, the acquisition unit 142 acquires the sensing information regarding the user and the first haptic information. At the time of acquiring the sensing information, the acquisition unit 142 can acquire the sensing information from a plurality of acquisition sources. For example, the acquisition unit 142 acquires information sensed by the sensor unit 120 as the sensing information from the sensor unit 120. Further, the acquisition unit 142 may acquire information sensed by the sensor device 30 as the sensing information from the sensor device 30 via the communication unit 110. Note that the acquisition unit 142 may acquire the sensing information from either one or both of the sensor unit 120 and the sensor device 30.
After acquiring the sensing information, the acquisition unit 142 outputs the acquired sensing information to the data processing unit 144. With such a configuration, the acquisition unit 142 can output the sensing information acquired from both the sensor unit 120 and the sensor device 30 to the data processing unit 144. Note that the acquisition unit 142 may output the acquired sensing information to the storage unit 150 to cause the storage unit 150 to store the acquired sensing information.
At the time of acquiring the first haptic information, the acquisition unit 142 acquires the first haptic information from the server 20 (haptic information server) via the network 50. After acquiring the first haptic information, the acquisition unit 142 outputs the acquired first haptic information to the data processing unit 144. Note that the acquisition unit 142 may output the acquired first haptic information to the storage unit 150 to cause the storage unit 150 to store the acquired first haptic information.
Note that, in a case where the first haptic information is held in the storage unit 150, the acquisition unit 142 may acquire the first haptic information from the storage unit 150. With such a configuration, the acquisition unit 142 can improve processing efficiency in the control unit 140, as compared with a case where the first haptic information is acquired from the server 20 via the network 50.
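The storage-first acquisition described above amounts to a cache-aside pattern. The following is a minimal sketch, assuming hypothetical `storage` and `server` interfaces standing in for the haptic information storage unit 152 and the server 20.

```python
def acquire_first_haptic_info(object_id: str, storage: dict, server) -> bytes:
    """Acquire first haptic information, preferring the local storage unit.

    `storage` stands in for the haptic information storage unit 152 and
    `server.fetch` for a request to the server 20 over the network 50;
    both interfaces are hypothetical."""
    if object_id in storage:
        return storage[object_id]      # held locally: no network round trip
    data = server.fetch(object_id)     # acquire from the haptic information server
    storage[object_id] = data          # keep for later reuse
    return data
```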
(3-2. Data Processing Unit 144)
The data processing unit 144 has a function of performing generation processing of the second haptic information. For example, the data processing unit 144 generates the second haptic information from the first haptic information on the basis of the sensing information, the second haptic information being used in a case where the haptic presentation unit 160 presents a haptic stimulus to the user. Specifically, the data processing unit 144 generates the second haptic information by changing a haptic stimulus value included in the first haptic information input from the acquisition unit 142 on the basis of the sensing information also input from the acquisition unit 142. Then, the data processing unit 144 maps the generated second haptic information onto the haptic presentation unit 160. Hereinafter, the processing performed by the data processing unit 144 will be sequentially described in detail.
(3-2-1. Configuration of Haptic Information)
First, a configuration example of haptic information according to the embodiment of the present disclosure will be described with reference to
The header section can store information regarding the haptic information. The information regarding the haptic information is, for example, a data size of the haptic information, information regarding a predetermined region, globally applied information, or the like. In a case where the haptic presentation object is an image 66 as illustrated in
The data section can store information for each predetermined region. As illustrated in
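One way to picture this header/data split is the small in-memory layout sketched below; the concrete header fields are assumptions, since the figure detailing the format is not reproduced here.

```python
from dataclasses import dataclass

@dataclass
class HapticInfoHeader:
    data_size: int        # total data size of the haptic information
    region_width: int     # width of one predetermined region, in pixels
    region_height: int    # height of one predetermined region, in pixels
    global_gain: float    # example of globally applied information

@dataclass
class HapticInfo:
    header: HapticInfoHeader
    # Data section: one haptic stimulus value per predetermined region,
    # stored row by row over the image of the haptic presentation object.
    stimulus_values: list
```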
(3-2-2. Generation of Second Haptic Information)
(Generation Examples Based on Sensing Information)
For example, the data processing unit 144 generates the second haptic information by processing the first haptic information on the basis of the sensing information.
As an example, the data processing unit 144 generates the second haptic information by processing the first haptic information on the basis of the contact information. Herein, generation of the second haptic information based on the contact information will be described with reference to
Generation Examples Based on Speed
For example, the data processing unit 144 generates the second haptic information on the basis of a change speed at a contact position between the haptic presentation unit 160 and the user. Hereinafter, a specific description will be given with reference to
For example, the data processing unit 144 generates the second haptic information by processing the first haptic information in accordance with the change speed at the contact position. Specifically, the data processing unit 144 generates the second haptic information in which an amount of change in haptic stimulus per unit distance is smaller as the change speed at the contact position is higher, and generates the second haptic information in which the amount of change in haptic stimulus per unit distance is larger as the change speed at the contact position is lower.
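One way to realize this inverse relation between the change speed and the amount of change in haptic stimulus per unit distance is a smoothing filter whose window widens with speed. The sketch below is one possible reading, not the disclosed implementation; the constants are arbitrary.

```python
import numpy as np

def generate_by_speed(profile: np.ndarray, speed_mm_s: float) -> np.ndarray:
    """Smooth a 1-D profile of haptic stimulus values more strongly at
    higher contact speeds, so the change per unit distance becomes smaller."""
    window = max(1, int(speed_mm_s / 50.0))   # window widens with speed (arbitrary constant)
    kernel = np.ones(window) / window
    return np.convolve(profile, kernel, mode="same")

profile = np.array([0.0, 1.0, 0.0, 1.0, 0.0, 1.0])  # stimulus values along a stroke
slow = generate_by_speed(profile, 50.0)    # window 1: large change per unit distance
fast = generate_by_speed(profile, 300.0)   # window 6: flattened profile
```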
Generation Examples Based on Pressure
Further, the data processing unit 144 generates the second haptic information on the basis of a pressure between the haptic presentation unit 160 and the user. Hereinafter, a specific description will be given with reference to
Generation Example Based on Contact Area
Further, the data processing unit 144 generates the second haptic information on the basis of the contact area between the haptic presentation unit 160 and the user. Hereinafter, a specific description will be given with reference to
Generation Example Based on Environmental Information
Further, the data processing unit 144 may generate the second haptic information further on the basis of the environmental information. Hereinafter, a specific description will be given with reference to
Generation Example Based on Humidity
(Generation Example Based on Size of Haptic Presentation Unit 160)
Further, in a case where there is a plurality of pieces of the first haptic information having different sizes and information densities, the data processing unit 144 may generate the second haptic information on the basis of, among the plurality of pieces of the first haptic information, a piece of the first haptic information having the information density corresponding to the size of the haptic presentation unit 160. Hereinafter, a specific description will be given with reference to
As illustrated in an upper left diagram of
As illustrated in a middle left diagram of
As illustrated in a lower left diagram of
With such a configuration, the user can feel an appropriate haptic sensation without being affected by the size of the haptic presentation unit 160.
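A selection rule of this kind can be sketched as picking, among the available pieces of first haptic information, the one whose number of haptic stimulus value regions best matches the resolution of the haptic presentation unit 160. The matching criterion below is an assumption, not the disclosed rule.

```python
def select_by_unit_size(candidates: list, unit_width_px: int, unit_height_px: int):
    """Among pieces of first haptic information with different information
    densities, pick the one whose number of haptic stimulus value regions
    best matches the resolution of the haptic presentation unit 160."""
    def region_count(info: "HapticInfo") -> int:
        return len(info.stimulus_values) * len(info.stimulus_values[0])

    target = unit_width_px * unit_height_px
    return min(candidates, key=lambda info: abs(region_count(info) - target))
```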
(Generation Examples Based on Scale Ratio)
Further, the data processing unit 144 may generate the second haptic information in accordance with a scale ratio of the haptic presentation object mapped onto the haptic presentation unit 160. The scale ratio means an enlargement ratio or a reduction ratio of an image. For example, the enlargement ratio is a magnification obtained in a case where the image of the haptic presentation object displayed on the display device 40 is enlarged. Further, the reduction ratio is a magnification obtained in a case where the image of the haptic presentation object displayed on the display device 40 is reduced. Hereinafter, a specific description will be given with reference to
In the example of
In the example of
Note that it is assumed that, in a case where the image of the haptic presentation object displayed on the display device 40 is enlarged or reduced, the size of the enlarged or reduced image falls within a predetermined range. In this case, the data processing unit 144 generates the second haptic information on the basis of the first haptic information corresponding to an actual size of the haptic presentation object.
Further, it is assumed that the size of the enlarged or reduced image is out of the predetermined range and is larger than the actual size of the haptic presentation object. Furthermore, in a case where there is the first haptic information having the information density corresponding to the enlarged or reduced size, the data processing unit 144 may generate the second haptic information on the basis of the first haptic information. Meanwhile, in a case where there is no first haptic information having the information density corresponding to the enlarged or reduced size, the data processing unit 144 may generate the second haptic information by repeating a pattern of predetermined regions in the first haptic information of the actual size. Note that, in a case where there is the first haptic information having a size smaller than the actual size, the data processing unit 144 may generate the second haptic information by repeating a pattern of predetermined regions in the first haptic information of the small size.
Further, it is assumed that the size of the enlarged or reduced image is out of the predetermined range and is smaller than the actual size of the haptic presentation object. Furthermore, in a case where there is the first haptic information having the information density corresponding to the enlarged or reduced size, the data processing unit 144 may generate the second haptic information on the basis of the first haptic information. Meanwhile, in a case where there is no first haptic information having the information density corresponding to the enlarged or reduced size, the data processing unit 144 may generate the second haptic information by reducing the first haptic information of the actual size.
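The branch logic of the preceding three paragraphs can be summarized as follows; the tolerance window, the tiling rule, and the strided reduction are all illustrative assumptions standing in for the disclosed behavior.

```python
import numpy as np

def generate_for_scale(first_actual: np.ndarray, scale: float,
                       variants: dict, tolerance: float = 0.25) -> np.ndarray:
    """Derive haptic values for an image enlarged or reduced by `scale`.

    `variants` maps available scales to first haptic information of the
    matching information density; the tolerance window, the tiling rule,
    and the strided reduction are all illustrative assumptions."""
    if abs(scale - 1.0) <= tolerance:
        return first_actual                  # within the predetermined range: use actual size
    if scale in variants:
        return variants[scale]               # matching information density exists
    if scale > 1.0:
        reps = int(np.ceil(scale))           # no match: repeat the pattern of regions
        return np.tile(first_actual, (reps, reps))
    step = max(1, int(round(1.0 / scale)))   # reduction: thin out the actual-size values
    return first_actual[::step, ::step]
```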
(3-2-3. Mapping of Second Haptic Information)
Next, mapping of the second haptic information according to the embodiment of the present disclosure will be described with reference to
The data processing unit 144 maps the generated second haptic information onto the haptic presentation unit 160. For example, the data processing unit 144 maps the second haptic information of the full-size haptic presentation object as it is onto the haptic presentation unit 160.
As an example, as illustrated in an upper diagram of
Further, as illustrated in a middle diagram of
Furthermore, as illustrated in a lower diagram of
(3-2-4. Scaling of Second Haptic Information)
Next, scaling of the second haptic information according to the embodiment of the present disclosure will be described with reference to
In a case where the second haptic information is mapped onto the haptic presentation unit 160 but the second haptic information is not mapped in an appropriate size, the data processing unit 144 may scale the second haptic information to the appropriate size by enlarging or reducing the second haptic information and then map the second haptic information.
For example, as illustrated in an upper diagram of
Further, as illustrated in a lower diagram of
In view of this, the data processing unit 144 may regenerate the second haptic information so that the second haptic information has the information density corresponding to the scale ratio at the time of scaling. With such a configuration, the data processing unit 144 can map, onto the haptic presentation unit 160, the second haptic information having the information density suitable for the size of the scaled haptic presentation object 62 at the time of scaling the haptic presentation object 62.
(3-3. Haptic Presentation Control Unit 146)
The haptic presentation control unit 146 has a function of controlling an operation of the haptic presentation unit 160. For example, the haptic presentation control unit 146 generates a presentation signal to be presented by the haptic presentation unit 160 on the basis of the second haptic information mapped onto a position where the user touches the haptic presentation unit 160. Specifically, the haptic presentation control unit 146 reads a haptic stimulus value from the second haptic information mapped onto the position where the user touches the haptic presentation unit 160 and converts the haptic stimulus value, thereby generating the presentation signal. Then, the haptic presentation control unit 146 outputs the generated presentation signal to the haptic presentation unit 160.
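As an illustration of this conversion, the sketch below turns a haptic stimulus value into a drive waveform for a vibrator-type haptic presentation unit; the 200 Hz carrier and the amplitude scaling are hypothetical choices, not the disclosed conversion.

```python
import numpy as np

def to_presentation_signal(stimulus_value: float, carrier_hz: float = 200.0,
                           duration_s: float = 0.05, rate_hz: int = 8000) -> np.ndarray:
    """Convert a haptic stimulus value into a drive waveform for the
    haptic presentation unit 160. A fixed-frequency carrier scaled by the
    stimulus value is one plausible choice for a vibrator-type unit."""
    t = np.arange(int(duration_s * rate_hz)) / rate_hz
    return stimulus_value * np.sin(2.0 * np.pi * carrier_hz * t)
```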
(4) Storage Unit 150
The storage unit 150 has a function of storing information regarding the processing in the haptic presentation device 10. In order to achieve the function, as illustrated in
(4-1. Haptic Information Storage Unit 152)
The haptic information storage unit 152 is a storage unit that stores haptic information. For example, the haptic information storage unit 152 stores the first haptic information acquired by the acquisition unit 142 from the server 20 via the communication unit 110 and the network 50.
(4-2. Sensing Information Storage Unit 154)
The sensing information storage unit 154 is a storage unit that stores the sensing information. For example, the sensing information storage unit 154 stores the sensing information acquired by the acquisition unit 142 from the sensor device 30 via the communication unit 110 and the sensing information acquired by the acquisition unit 142 from the sensor unit 120.
(4-3. Haptic-Presentation-Unit-Information Storage Unit 156)
The haptic-presentation-unit-information storage unit 156 is a storage unit that stores haptic presentation unit information. For example, the haptic-presentation-unit-information storage unit 156 stores the haptic presentation unit information prepared in advance. The haptic presentation unit information is, for example, information unique to the haptic presentation unit 160, such as a coefficient of restitution and a coefficient of friction.
Note that the information stored in the storage unit 150 is not limited to such examples. For example, the storage unit 150 may store programs such as various applications.
(5) Haptic Presentation Unit 160
The haptic presentation unit 160 has a function of presenting a haptic stimulus to the user. For example, the haptic presentation unit 160 presents, to the user, a haptic stimulus corresponding to a presentation signal input from the control unit 140.
The haptic presentation unit 160 can present a haptic stimulus to the user by various means. As an example, the haptic presentation unit 160 can present a haptic stimulus by an electrical stimulus, a Peltier device, a motor, air pressure, a vibrator, a speaker, a display, or the like.
The haptic presentation unit 160 presents, for example, an electrical stimulus having an intensity corresponding to a presentation signal to the user as a haptic stimulus. With such a configuration, the user can feel unevenness of a surface of the haptic presentation object as a haptic sensation via the haptic presentation unit 160.
The haptic presentation unit 160 presents, for example, heat adjusted by the Peltier device in response to a presentation signal to the user as a haptic stimulus. With such a configuration, the user can feel a temperature of the surface of the haptic presentation object as a haptic sensation via the haptic presentation unit 160.
The haptic presentation unit 160 presents, for example, reaction force generated by moving the haptic presentation unit 160 by using the motor in response to a presentation signal to the user as a haptic stimulus. With such a configuration, the user can feel a texture of the surface of the haptic presentation object as a haptic sensation via the haptic presentation unit 160.
The haptic presentation unit 160 presents, for example, vibration generated by vibrating the haptic presentation unit 160 at an arbitrary frequency by using air pressure in response to a presentation signal to the user as a haptic stimulus. Further, the haptic presentation unit 160 presents reaction force generated by moving the haptic presentation unit 160 by using air pressure in response to a presentation signal to the user as a haptic stimulus. With such a configuration, the user can feel a texture of the surface of the haptic presentation object as a haptic sensation via the haptic presentation unit 160.
The haptic presentation unit 160 presents, for example, vibration generated by vibrating the haptic presentation unit 160 at an arbitrary frequency by using the vibrator in response to a presentation signal to the user as a haptic stimulus. With such a configuration, the user can feel a texture of the surface of the haptic presentation object as a haptic sensation via the haptic presentation unit 160. Further, the haptic presentation unit 160 presents a change in a direction of motion to the user as a haptic stimulus by changing a movement pattern of mass by using the vibrator in response to a presentation signal. With such a configuration, the user can feel a change in weight of the haptic presentation object as a haptic sensation via the haptic presentation unit 160.
The haptic presentation unit 160 presents, for example, a sound of a specific frequency to the user as a haptic stimulus through the speaker in response to a presentation signal. With such a configuration, the user can feel a change in humidity as a haptic sensation via the haptic presentation unit 160. For example, in a case where a sound is output at a frequency of about 2000 Hz, the user can feel a low humidity and dry air. Meanwhile, in a case where a sound is output at a lower frequency, the user can feel a high humidity and wet air.
For example, the haptic presentation unit 160 presents visual feedback to the user as a haptic stimulus by using the display in response to a presentation signal. With such a configuration, the user can feel pseudo haptics via the haptic presentation unit 160.
Hereinabove, the configuration example according to the present embodiment has been described. Next, processing examples according to the present embodiment will be described.
<3-1. Flow of Processing in a Case where Haptic Information is not Switched>
First, the control unit 140 of the haptic presentation device 10 acquires information unique to the haptic presentation unit 160 from the storage unit 150 (S102). Next, the control unit 140 detects the haptic presentation object selected by the user (S104). Next, the control unit 140 acquires the first haptic information corresponding to the selected haptic presentation object from the server 20 via the communication unit 110 (S106), and stores the acquired first haptic information in the storage unit 150 (S108). Next, the control unit 140 acquires the sensing information from the sensor unit 120 (S110).
After acquiring the sensing information, the control unit 140 generates the second haptic information through the generation processing (S112). After the generation processing, the control unit 140 generates a presentation signal on the basis of the generated second haptic information (S114). Then, the control unit 140 causes the haptic presentation unit 160 to present a haptic stimulus on the basis of the presentation signal (S116).
After the presentation of the haptic stimulus, in a case where another haptic presentation object is selected by the user (S118/YES), the control unit 140 repeats the processing from S106. Meanwhile, in a case where no other haptic presentation object is selected by the user (S118/NO), the control unit 140 repeats the processing from S110.
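The flow of S102 to S118 can be summarized as the following control loop. Every method on `device` is a hypothetical stand-in for the units described in section 2-2, not a disclosed API.

```python
def run_without_switching(device) -> None:
    """Control loop corresponding to S102-S118. Every method on `device`
    is a hypothetical stand-in for the units described in section 2-2."""
    unit_info = device.load_unit_info()                  # S102
    obj = device.detect_selected_object()                # S104
    while True:
        first = device.acquire_first_haptic_info(obj)    # S106
        device.store(first)                              # S108
        while True:
            sensing = device.acquire_sensing_info()              # S110
            second = device.generate(first, sensing, unit_info)  # S112
            signal = device.to_presentation_signal(second)       # S114
            device.present(signal)                               # S116
            new_obj = device.detect_selected_object()            # S118
            if new_obj != obj:                                   # S118/YES
                obj = new_obj
                break                                    # repeat from S106
            # S118/NO: repeat from S110
```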
<3-2. Flow of Processing in a Case Where Haptic Information is Switched>
First, the control unit 140 of the haptic presentation device 10 acquires information unique to the haptic presentation unit 160 from the storage unit 150 (S202). Next, the control unit 140 detects the haptic presentation object selected by the user (S204). Next, the control unit 140 acquires the first haptic information corresponding to the selected haptic presentation object from the server 20 via the communication unit 110 (S206), and stores the acquired first haptic information in the storage unit 150 (S208). Next, the control unit 140 acquires the sensing information from the sensor unit 120 (S210).
After acquiring the sensing information, the control unit 140 switches the first haptic information in accordance with the sensing information (S212). After switching the first haptic information, the control unit 140 generates the second haptic information through the generation processing (S214). After the generation processing, the control unit 140 generates a presentation signal on the basis of the generated second haptic information (S216). Then, the control unit 140 causes the haptic presentation unit 160 to present a haptic stimulus on the basis of the presentation signal (S218).
After the presentation of the haptic stimulus, in a case where another haptic presentation object is selected by the user (S220/YES), the control unit 140 repeats the processing from S206. Meanwhile, in a case where no other haptic presentation object is selected by the user (S220/NO), the control unit 140 repeats the processing from S210.
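The only difference from the previous flow is the switching step S212. A minimal sketch of such a step follows, assuming (hypothetically) that the switching criterion is the contact pressure between the user and the haptic presentation unit 160; the actual criteria depend on the specific examples described below.

```python
def switch_first_haptic_info(candidates: dict, sensing: SensingInfo):
    """S212: switch the first haptic information in accordance with the
    sensing information. Keying the switch on contact pressure (e.g., a
    'touched' texture versus a 'pressed' texture) is an assumption made
    purely for illustration."""
    pressure = sensing.contact_pressure_pa or 0.0
    return candidates["pressed" if pressure > 1000.0 else "touched"]
```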
Hereinabove, the flow of processing according to the present embodiment has been described. Next, specific examples of switching the first haptic information will be described.
For example, in the example of
For example, in the example of
Hereinafter, modification examples according to the embodiment of the present disclosure will be described. Note that the modification examples described below may be applied to the embodiment of the present disclosure independently or in combination. Further, the modification examples may be applied instead of or in addition to the configuration described in the embodiment of the present disclosure.
For example, as illustrated in
In a case where the user holds the haptic presentation device 10 without tilting the haptic presentation device 10, the virtual object is displayed as illustrated in a left diagram of
As illustrated in a left diagram of
Finally, a hardware configuration example of the information processing device according to the present embodiment will be described with reference to
As illustrated in
The CPU 901 functions as, for example, an arithmetic processing device or a control device, and controls a part of or the entire operation of each component on the basis of various programs recorded in the ROM 902, the RAM 903, or the storage device 910. The ROM 902 is means for storing the programs read by the CPU 901, data used for calculation, and the like. The RAM 903 temporarily or permanently stores, for example, the programs read by the CPU 901, various parameters that appropriately change when the programs are executed, and the like. Those components are mutually connected by the host bus 904 including a CPU bus or the like. The CPU 901, the ROM 902, and the RAM 903 can achieve, for example, the function of the control unit 140 described with reference to
The CPU 901, the ROM 902, and the RAM 903 are mutually connected via, for example, the host bus 904 capable of transmitting data at a high speed. Meanwhile, for example, the host bus 904 is connected to the external bus 906 that transmits data at a relatively low speed via the bridge 905. Further, the external bus 906 is connected to various components via the interface 907.
The input device 908 is achieved by, for example, a device to which information is input by the user, such as a mouse, a keyboard, a touchscreen, a button, a microphone, a switch, or a lever. Further, the input device 908 may be, for example, a remote control device using infrared rays or other radio waves, or may be an external connection device, such as a mobile phone or a PDA, compatible with operation of the information processing device 900. Furthermore, the input device 908 may include, for example, an input control circuit that generates an input signal on the basis of information input by the user by using the above input means and outputs the input signal to the CPU 901. By operating the input device 908, the user of the information processing device 900 can input various kinds of data to the information processing device 900 and instruct the information processing device 900 to perform a processing operation.
In addition, the input device 908 can include a device that detects information regarding the user. For example, the input device 908 may include various sensors such as an image sensor (e.g., a camera), a depth sensor (e.g., a stereo camera), an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, a sound sensor, a distance measurement sensor (e.g., a time of flight (ToF) sensor), and a force sensor. Further, the input device 908 may acquire information regarding a state of the information processing device 900 itself, such as a posture and moving speed of the information processing device 900, and information regarding a surrounding environment of the information processing device 900, such as luminance and noise around the information processing device 900. Further, the input device 908 may include a global navigation satellite system (GNSS) module that receives a GNSS signal (e.g., a global positioning system (GPS) signal from a GPS satellite) from a GNSS satellite to measure position information including a latitude, longitude, and altitude of the device. Furthermore, regarding the position information, the input device 908 may detect the position by Wi-Fi (registered trademark), transmission and reception with a mobile phone, a PHS, a smartphone, or the like, near field communication, or the like. The input device 908 can achieve, for example, the function of the sensor unit 120 described with reference to
The output device 909 includes a device capable of visually or aurally notifying the user of acquired information. Examples of such a device encompass display devices such as a CRT display, a liquid crystal display, a plasma display, an EL display, a laser projector, an LED projector, and a lamp, sound output devices such as a speaker and headphones, and printer devices. The output device 909 outputs, for example, results of various kinds of processing performed by the information processing device 900. Specifically, the display device visually displays the results of the various kinds of processing performed by the information processing device 900 in various formats such as text, images, tables, and graphs. Meanwhile, the sound output device converts audio signals including reproduced sound data, acoustic data, and the like into analog signals and aurally outputs the analog signals. The output device 909 can achieve, for example, the function of the haptic presentation unit 160 described with reference to
The storage device 910 is a data storage device provided as an example of a storage unit of the information processing device 900. The storage device 910 is achieved by, for example, a magnetic storage unit device such as an HDD, a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like. The storage device 910 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like. The storage device 910 stores programs and various kinds of data executed by the CPU 901, various kinds of data acquired from the outside, and the like. The storage device 910 can achieve, for example, the function of the storage unit 150 described with reference to
The drive 911 is a storage medium reader/writer, and is included in or externally attached to the information processing device 900. The drive 911 reads information recorded on a removable storage medium such as an attached magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, and outputs the information to the RAM 903. Further, the drive 911 can also write information into the removable storage medium.
The connection port 912 is, for example, a port for connecting an external connection device, such as a universal serial bus (USB) port, an IEEE 1394 port, a small computer system interface (SCSI), an RS-232C port, or an optical audio terminal.
The communication device 913 is a communication interface including, for example, a communication device to be connected to the network 920, and the like. The communication device 913 is, for example, a communication card for a wired or wireless local area network (LAN), long term evolution (LTE), Bluetooth (registered trademark), wireless USB (WUSB), or the like. Further, the communication device 913 may be an optical communication router, an asymmetric digital subscriber line (ADSL) router, various communication modems, or the like. For example, the communication device 913 can transmit/receive signals and the like to/from the Internet and other communication devices in accordance with, for example, a predetermined protocol such as TCP/IP. The communication device 913 can achieve, for example, the function of the communication unit 110 described with reference to
Note that the network 920 is a wired or wireless transmission path for information transmitted from a device connected to the network 920. For example, the network 920 may include public networks such as the Internet, a telephone network, and a satellite communication network, various local area networks (LANs) including Ethernet (registered trademark), wide area networks (WANs), and the like. Further, the network 920 may include a dedicated network such as the Internet protocol-virtual private network (IP-VPN). The network 920 can achieve, for example, the function of the network 50 described with reference to
Hereinabove, there has been described an example of the hardware configuration capable of achieving the function of the information processing device 900 according to the present embodiment. Each of the above components may be achieved by using a general-purpose member, or may be achieved by hardware specialized for the function of each component. Therefore, it is possible to appropriately change the hardware configuration to be used in accordance with a technological level at the time of implementing the present embodiment.
As described above, the information processing device according to the embodiment of the present disclosure acquires the sensing information regarding the user and the first haptic information unique to the haptic presentation object. The information processing device generates the second haptic information from the first haptic information on the basis of the acquired sensing information, the second haptic information being used in a case where the haptic presentation device presents a haptic stimulus to the user.
With such a configuration, the information processing device can generate haptic information corresponding to the sensing information regarding the user from the haptic information unique to the haptic presentation object.
Therefore, it is possible to provide a novel and improved information processing device, information processing method, and program capable of presenting a more realistic haptic stimulus.
Hereinabove, the preferred embodiment of the present disclosure has been described in detail with reference to the accompanying drawings. However, the technical scope of the present disclosure is not limited to such examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure may find various changes or modifications within the scope of the technical idea described in the claims. As a matter of course, it is understood that those changes and modifications also belong to the technical scope of the present disclosure.
For example, each device described in the present specification may be achieved as a single device, or some or all of the devices may be achieved as separate devices. For example, the control unit 140 included in the haptic presentation device 10 of
Further, the series of processing performed by each device described in the present specification may be achieved by software, hardware, or a combination of software and hardware. A program forming the software is stored in advance in, for example, a recording medium (non-transitory medium) provided inside or outside each device. Further, for example, each program is read into the RAM at the time of execution by a computer and is executed by a processor such as a CPU.
Further, the processing described by using the flowcharts in the present specification may not necessarily be executed in the shown order. Some processing steps may be performed in parallel. Further, additional processing steps may be adopted, and some processing steps may be omitted.
Further, the effects described in this specification are merely illustrative or exemplary and are not limited. In other words, the technology according to the present disclosure can have other effects that are apparent to those skilled in the art from the description of the present specification in addition to or in place of the above effects.
Note that the following configurations also belong to the technical scope of the present disclosure.
(1)
An information processing device including: an acquisition unit that acquires sensing information regarding a user and first haptic information unique to a haptic presentation object; and a data processing unit that generates second haptic information from the first haptic information on the basis of the sensing information, the second haptic information being used in a case where a haptic presentation device presents a haptic stimulus to the user.
(2)
The information processing device according to (1), in which
(3)
The information processing device according to (2), in which the data processing unit generates the second haptic information on the basis of a change speed at a contact position between the haptic presentation device and the user.
(4)
The information processing device according to (3), in which the data processing unit generates the second haptic information by processing the first haptic information in accordance with the change speed at the contact position.
(5)
The information processing device according to (4), in which the data processing unit generates the second haptic information in which an amount of change in haptic stimulus per unit distance is smaller as the change speed at the contact position is higher, and generates the second haptic information in which the amount of change in haptic stimulus per unit distance is larger as the change speed at the contact position is lower.
(6)
The information processing device according to any one of (2) to (5), in which the data processing unit generates the second haptic information on the basis of a contact pressure between the haptic presentation device and the user.
(7)
The information processing device according to any one of (2) to (6), in which the data processing unit generates the second haptic information on the basis of a contact area between the haptic presentation device and the user.
(8)
The information processing device according to any one of (1) to (7), in which
(9)
The information processing device according to any one of (1) to (8), in which the data processing unit generates the second haptic information further on the basis of information included in the sensing information and indicating a posture of the haptic presentation device held by the user.
(10)
The information processing device according to any one of (1) to (9), in which the data processing unit generates the second haptic information further on the basis of information included in the sensing information and indicating a position and posture of the user with respect to a virtual object located in a space.
(11)
The information processing device according to any one of (1) to (10), in which the data processing unit generates the second haptic information on the basis of, among a plurality of pieces of the first haptic information, a piece of the first haptic information having an information density corresponding to a size of the haptic presentation device.
(12)
The information processing device according to any one of (1) to (11), in which the data processing unit generates the second haptic information in accordance with a scale ratio of the haptic presentation object mapped onto the haptic presentation device.
(13)
The information processing device according to (12), in which the data processing unit generates the second haptic information on the basis of, among a plurality of pieces of the first haptic information, a piece of the first haptic information having an information density corresponding to the scale ratio.
(14)
The information processing device according to (12), in which the data processing unit generates the second haptic information by processing the first haptic information in accordance with the scale ratio.
(15)
The information processing device according to (14), in which
(16)
The information processing device according to (14), in which
(17)
The information processing device according to any one of (1) to (16), further including
(18)
The information processing device according to any one of (1) to (17), further including
(19)
An information processing method executed by a processor, the method including: acquiring sensing information regarding a user and first haptic information unique to a haptic presentation object; and generating second haptic information from the first haptic information on the basis of the sensing information, the second haptic information being used in a case where a haptic presentation device presents a haptic stimulus to the user.
(20)
A program for causing a computer to function as: an acquisition unit that acquires sensing information regarding a user and first haptic information unique to a haptic presentation object; and a data processing unit that generates second haptic information from the first haptic information on the basis of the sensing information, the second haptic information being used in a case where a haptic presentation device presents a haptic stimulus to the user.
Number | Date | Country | Kind |
---|---|---|---
2019-032930 | Feb 2019 | JP | national |
This application is a divisional of U.S. patent application Ser. No. 17/432,346 (filed on Aug. 19, 2021), which is a National Stage Patent Application of PCT International Patent Application No. PCT/JP2020/005905 (filed on Feb. 14, 2020) under 35 U.S.C. § 371, which claims priority to Japanese Patent Application No. 2019-032930 (filed on Feb. 26, 2019), which are all hereby incorporated by reference in their entirety.
Number | Date | Country
---|---|---
Parent 17432346 | Aug 2021 | US
Child 18371692 | | US