The present invention relates to a technique that presents self-motion to a user through tactile stimuli that simulate tactile stimuli produced by self-motion.
Motion that changes the position or attitude of one's own body relative to the environment is called "self-motion". For example, walking is a self-motion. A sensory stimulus that simulates a sensory stimulus produced by self-motion is called a "sensory stimulus that suggests self-motion"; an optical flow having a focus of expansion in the direction of movement is one example. The human brain estimates self-motion on the basis of a variety of sensory inputs, and uses that information for perception, control, and the like.
Presenting sensory stimuli that simulate sensory stimuli produced by self-motion, and thereby appropriately engaging the processes by which the brain estimates self-motion, makes it possible to implement a system that presents desired self-motion to a user. Thus far, such systems have used visual stimuli such as optical flows, electrical stimuli to the vestibular system, and the like. Recently, systems that use tactile stimuli that simulate tactile stimuli produced by self-motion have begun to be proposed in order to enhance the sensation of self-motion presented by visual stimuli, adjust that sensation in a desired direction, or the like. For example, NPL 1 discusses the possibility of presenting forward motion by presenting tactile pseudo-motion on a seating surface, and of manipulating the speed of self-motion perceived from observing expanding dot motion. Additionally, NPL 2 indicates the possibility of manipulating similar perceptions by presenting a tactile stimulus suggesting forward motion by blowing air on the face.
However, previously-proposed systems for presenting self-motion using tactile stimuli that simulate the tactile stimuli produced by self-motion were designed assuming that the user and the tactile presentation device are always in a specific relative positional relationship. Accordingly, in situations where the positional relationship changes, it has not been possible to present desired self-motion using tactile stimuli that simulate the tactile stimuli produced by self-motion.
An object of the present invention is to provide a technique capable of presenting desired self-motion to a user through tactile stimuli that simulate tactile stimuli produced by self-motion, even in situations where the relative positional relationship between the user and a tactile presentation device changes.
To solve the above-described problem, a tactile presentation device according to one aspect of the present invention is a tactile presentation device that presents, to a body of a user, a simulated tactile stimulus that simulates a tactile stimulus produced when the user performs a desired self-motion. The tactile presentation device includes a control unit that generates a drive signal for driving the tactile presentation device, and a drive unit that presents the simulated tactile stimulus in accordance with the drive signal. The drive signal is generated on the basis of contact point motion information expressing motion arising at a contact point between the body of the user and the tactile presentation device due to the self-motion. Assuming that the contact point is fixed with respect to the outside world, the contact point motion information corresponds to a change in the relative positional relationship between the body of the user and the contact point arising when the user performs the self-motion.
According to the present invention, it is possible to provide a technique capable of presenting desired self-motion to a user through tactile stimuli that simulate tactile stimuli produced by self-motion, even in situations where the relative positional relationship between the user and a tactile presentation device changes.
An embodiment of this invention will be described in detail hereinafter. In the drawings, the same reference numerals are assigned to constituent elements having the same functions, and redundant descriptions thereof will be omitted.
An embodiment of the present invention is a self-motion presentation system that presents a sensation of desired self-motion, including at least one of translation and rotation, to a user by using a tactile presentation device that presents tactile stimuli as motion of a contact point on the skin of the user's hand.
In the present embodiment, the position, attitude, and the like of the user, as well as the position, motion, and the like of the contact point, are defined using predetermined coordinate systems. In the following descriptions, a device coordinate system C1, a pre-motion body coordinate system C2, and a post-motion body coordinate system C3, illustrated in the drawings, will be used.
The functional configuration of the self-motion presentation system will be described with reference to the drawings. As illustrated therein, the self-motion presentation system includes a tactile presentation device 1, a state measurement device 10, and a contact point motion calculation device 20.
The state measurement device 10 measures position/attitude information S12 of the user 2 in the device coordinate system C1 (called "user position/attitude information" hereinafter) and position information S14 of the contact point 4 in the device coordinate system C1 (called "contact point position information" hereinafter). The contact point motion calculation device 20 receives input self-motion information S23, along with the user position/attitude information S12 and the contact point position information S14 output by the state measurement device 10, and calculates information S145 expressing the contact point motion in the device coordinate system C1 to be presented to the user 2 (called "contact point motion information" hereinafter). The tactile presentation device 1 presents tactile stimuli corresponding to the contact point motion (called "simulated tactile stimuli" hereinafter) to the user 2.
As illustrated in the drawings, the state measurement device 10 includes a contact point position measurement unit 11 and a body position/attitude measurement unit 12.
The contact point position measurement unit 11 measures the contact point position information S14 in the device coordinate system C1. As illustrated in the drawings, the contact point position information S14 includes a position vector V14 from the tactile presentation device 1 to the contact point 4.
The body position/attitude measurement unit 12 measures the user position/attitude information S12 in the device coordinate system C1. As illustrated in the drawings, the user position/attitude information S12 includes the position of the center of the body of the user 2 and the attitude (orientation) of the user 2 in the device coordinate system C1.
The contact point position measurement unit 11 uses, for example, a sensor such as an encoder of the tactile presentation device 1 or a camera fixed to the tactile presentation device 1. The body position/attitude measurement unit 12 uses, for example, a sensor such as a camera fixed to the tactile presentation device 1, a laser rangefinder, or a floor sensor installed in the environment. The contact point position measurement unit 11 and the body position/attitude measurement unit 12 may use a common sensor. Additionally, in a situation where the position of the contact point 4 in the device coordinate system C1 does not change significantly, the state measurement device 10 need not include the contact point position measurement unit 11. In this case, the state measurement device 10 outputs a predetermined value as the contact point position information S14.
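As one illustration, the state measurement device 10 could be sketched in software as follows. This is a minimal sketch under stated assumptions: the StateMeasurement container, the sensor-reading callables, and the FIXED_CONTACT_XY fallback value are hypothetical names introduced here for illustration, not elements specified by the embodiment.

```python
# Minimal sketch of the state measurement device 10 (assumed names).
from dataclasses import dataclass
from typing import Callable, Optional, Tuple

@dataclass
class StateMeasurement:
    user_xy: Tuple[float, float]     # S12: body center (Tx, Ty) in C1
    user_rz: float                   # S12: body attitude Rz in C1 (radians)
    contact_xy: Tuple[float, float]  # S14: position of contact point 4 in C1

# Predetermined S14 output when the contact point position measurement
# unit 11 is omitted (hypothetical value for illustration).
FIXED_CONTACT_XY = (0.5, 0.0)

def measure(
    read_body_sensor: Callable[[], Tuple[float, float, float]],
    read_contact_sensor: Optional[Callable[[], Tuple[float, float]]] = None,
) -> StateMeasurement:
    """Return S12 and S14; fall back to a predetermined S14 if unit 11 is absent."""
    tx, ty, rz = read_body_sensor()  # e.g. a camera or floor-sensor estimate
    contact = read_contact_sensor() if read_contact_sensor else FIXED_CONTACT_XY
    return StateMeasurement((tx, ty), rz, contact)
```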
As illustrated in the drawings, the contact point motion calculation device 20 includes a pre-movement contact point position calculation unit 21, a post-motion contact point position calculation unit 22, a post-movement contact point position calculation unit 23, and a contact point displacement calculation unit 24.
The pre-movement contact point position calculation unit 21 receives the contact point position information S14 and the user position/attitude information S12 output by the state measurement device 10, and calculates position information S24 of the contact point 4 in the pre-motion body coordinate system C2 (called “pre-movement contact point position information” hereinafter). The pre-movement contact point position information S24 includes a position vector V24 from the user 2 to the contact point 4. In other words, the pre-movement contact point position information S24 expresses a relative positional relationship between the pre-self-motion user 2 and the pre-movement contact point 4.
The post-motion contact point position calculation unit 22 receives the self-motion information S23 input to the contact point motion calculation device 20 and the pre-movement contact point position information S24 output by the pre-movement contact point position calculation unit 21, and calculates position information S34 of the contact point 4 in the post-motion body coordinate system C3 (called “post-motion contact point position information” hereinafter). The post-motion contact point position information S34 includes a position vector V34 from the user 3 to the contact point 4. In other words, the post-motion contact point position information S34 expresses a relative positional relationship between the post-self-motion user 3 and the pre-movement contact point 4.
The post-movement contact point position calculation unit 23 receives the post-motion contact point position information S34 output by the post-motion contact point position calculation unit 22, and calculates position information S15, in the device coordinate system C1, of the position whose relative positional relationship to the pre-self-motion user 2 corresponds to the post-motion contact point position information S34 (called "post-movement contact point position information" hereinafter; the contact point 4 having moved to this position is represented by a contact point 5). The post-movement contact point position information S15 includes a position vector V15 from the tactile presentation device 1 to the contact point 5. In other words, the post-movement contact point position information S15 expresses a relative positional relationship between the pre-self-motion user 2 and the post-movement contact point 5.
The contact point displacement calculation unit 24 receives the contact point position information S14 output by the state measurement device 10 and the post-movement contact point position information S15 output by the post-movement contact point position calculation unit 23, subtracts the position of the pre-movement contact point 4 from the position of the post-movement contact point 5, and calculates a vector V145 expressing displacement of the contact point between before and after the movement (called a “contact point displacement vector” hereinafter).
The contact point motion calculation device 20 outputs the contact point displacement vector V145, which has been output by the contact point displacement calculation unit 24, as the contact point motion information S145. Note that the contact point motion calculation device 20 need not include the contact point displacement calculation unit 24, as illustrated in the drawings. In this case, the contact point position information S14 and the post-movement contact point position information S15 may together be output as the contact point motion information S145.
The calculation by the pre-movement contact point position calculation unit 21 will be described in detail with reference to the drawings. The pre-movement contact point position calculation unit 21 obtains the position vector V24 by applying, to the position vector V14, a transformation matrix M12 from the device coordinate system C1 to the pre-motion body coordinate system C2.
V24 = M12 * V14 [Math 1]
The transformation matrix M12 can be calculated using the user position/attitude information S12 obtained from the state measurement device 10. For example, the following can be written when (x, y) represents the positional coordinates of the contact point 4 in the device coordinate system C1, (x′, y′) represents the positional coordinates of the contact point 4 in the pre-motion body coordinate system C2, (Tx, Ty) represents the positional coordinates of the center of the body of the pre-self-motion user 2 in the device coordinate system C1, and Rz represents the angle of rotation of the coordinate axes.

x′ = (x − Tx) cos Rz + (y − Ty) sin Rz
y′ = −(x − Tx) sin Rz + (y − Ty) cos Rz [Math 2]
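For concreteness, [Math 1] and [Math 2] can be sketched in Python as follows; the function name, the use of NumPy, and the radian convention for Rz are assumptions made for this illustration.

```python
# Sketch of the pre-movement contact point position calculation (unit 21).
import numpy as np

def device_to_pre_motion_body(v14, body_xy, rz):
    """Transform V14 (device coordinate system C1) into V24
    (pre-motion body coordinate system C2), per [Math 1]-[Math 2].

    v14     -- (x, y) of contact point 4 in C1
    body_xy -- (Tx, Ty), body center of the pre-self-motion user 2 in C1
    rz      -- rotation angle Rz of the body axes relative to C1, in radians
    """
    x, y = v14
    tx, ty = body_xy
    c, s = np.cos(rz), np.sin(rz)
    # Translate so the body center becomes the origin, then rotate by -Rz.
    xp = (x - tx) * c + (y - ty) * s
    yp = -(x - tx) * s + (y - ty) * c
    return np.array([xp, yp])
```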
The calculation by the post-motion contact point position calculation unit 22 will be described in detail with reference to the drawings. The post-motion contact point position calculation unit 22 obtains the position vector V34 by applying, to the position vector V24, a transformation matrix M23 from the pre-motion body coordinate system C2 to the post-motion body coordinate system C3.
V34 = M23 * V24 [Math 3]
The transformation matrix M23 can be calculated using the self-motion information S23 input to the contact point motion calculation device 20. For example, the following can be written when (x′, y′) represents the positional coordinates of the contact point 4 in the pre-motion body coordinate system C2, (x″, y″) represents the positional coordinates of the contact point 4 in the post-motion body coordinate system C3, (T′x, T′y) represents the positional coordinates of the center of the body of the post-self-motion user 3 in the pre-motion body coordinate system C2, and R′z represents the angle of rotation of the coordinate axes resulting from the self-motion.

x″ = (x′ − T′x) cos R′z + (y′ − T′y) sin R′z
y″ = −(x′ − T′x) sin R′z + (y′ − T′y) cos R′z [Math 4]
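The same form of transform, now parameterized by the self-motion, gives a sketch of [Math 3] and [Math 4] (names and conventions assumed as before):

```python
# Sketch of the post-motion contact point position calculation (unit 22).
import numpy as np

def pre_to_post_motion_body(v24, motion_xy, rz_prime):
    """Transform V24 (pre-motion body coordinate system C2) into V34
    (post-motion body coordinate system C3), per [Math 3]-[Math 4].

    v24       -- (x', y') of contact point 4 in C2
    motion_xy -- (T'x, T'y), body center of the post-self-motion user 3 in C2
    rz_prime  -- rotation angle R'z produced by the self-motion, in radians
    """
    xp, yp = v24
    tx, ty = motion_xy
    c, s = np.cos(rz_prime), np.sin(rz_prime)
    xpp = (xp - tx) * c + (yp - ty) * s
    ypp = -(xp - tx) * s + (yp - ty) * c
    return np.array([xpp, ypp])
```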
The calculations by the post-movement contact point position calculation unit 23 and the contact point displacement calculation unit 24 will be described in detail with reference to the drawings. The post-movement contact point position calculation unit 23 obtains the position vector V15 by applying, to the position vector V34, a transformation matrix M21 from the pre-motion body coordinate system C2 to the device coordinate system C1 (the inverse of the transformation matrix M12).
V15 = M21 * V34 [Math 5]
The transformation matrix M21 can be calculated using the user position/attitude information S12 obtained from the state measurement device 10. For example, the following can be written when (x″, y″) represents the positional coordinates of the pre-movement contact point 4 in the post-motion body coordinate system C3, (x‴, y‴) represents the positional coordinates of the post-movement contact point 5 in the device coordinate system C1, (Tx, Ty) represents the positional coordinates of the center of the body of the pre-self-motion user 2 in the device coordinate system C1, and Rz represents the angle of rotation of the coordinate axes.

x‴ = x″ cos Rz − y″ sin Rz + Tx
y‴ = x″ sin Rz + y″ cos Rz + Ty [Math 6]
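Because M21 is the inverse of M12, a sketch of [Math 5] and [Math 6] simply undoes the earlier transform (names assumed, as before):

```python
# Sketch of the post-movement contact point position calculation (unit 23).
import numpy as np

def body_to_device(v34, body_xy, rz):
    """Reinterpret V34 through M21, the inverse of M12, to obtain V15 in the
    device coordinate system C1, per [Math 5]-[Math 6].
    """
    xpp, ypp = v34
    tx, ty = body_xy
    c, s = np.cos(rz), np.sin(rz)
    # Rotate by +Rz, then translate back by (Tx, Ty): the inverse of [Math 2].
    xppp = xpp * c - ypp * s + tx
    yppp = xpp * s + ypp * c + ty
    return np.array([xppp, yppp])
```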
As illustrated in the drawings, the contact point displacement calculation unit 24 calculates the contact point displacement vector V145 by subtracting the position vector V14 from the position vector V15.
V145 = V15 − V14 [Math 7]
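Chaining [Math 1] through [Math 7] gives an end-to-end sketch of the contact point motion calculation device 20, followed by a small worked example; the function names and the numeric values are illustrative assumptions.

```python
# End-to-end sketch of units 21-24 of the contact point motion calculation
# device 20 (illustrative names and values).
import numpy as np

def rot(a):
    """2x2 rotation matrix for an angle a, in radians."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s], [s, c]])

def contact_point_displacement(v14, body_xy, rz, motion_xy, rz_prime):
    v14 = np.asarray(v14, float)
    t = np.asarray(body_xy, float)
    tp = np.asarray(motion_xy, float)
    v24 = rot(-rz) @ (v14 - t)         # unit 21: [Math 1]-[Math 2]
    v34 = rot(-rz_prime) @ (v24 - tp)  # unit 22: [Math 3]-[Math 4]
    v15 = rot(rz) @ v34 + t            # unit 23: [Math 5]-[Math 6]
    return v15 - v14                   # unit 24: [Math 7], V145 = V15 - V14

# Example: user at the origin of C1 with Rz = 0, contact point 0.5 m ahead,
# and a desired self-motion of a 0.1 m forward translation.
v145 = contact_point_displacement(
    v14=(0.5, 0.0), body_xy=(0.0, 0.0), rz=0.0,
    motion_xy=(0.1, 0.0), rz_prime=0.0)
print(v145)  # [-0.1  0. ] : the contact point moves 0.1 m back toward the user
```

As expected for a contact point assumed fixed in the outside world, forward self-motion yields a contact point displacement directed back toward the user.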
As illustrated in the drawings, the tactile presentation device 1 presents simulated tactile stimuli to the user 2 in accordance with the contact point motion information S145.
The tactile presentation device 1 presents the contact point motion as, for example, a change in the position of the contact point between the user 2 and the tactile presentation device 1. For example, the tactile presentation device 1 moves the robot arm such that the end of the robot arm moves from the position of the pre-movement contact point 4 to the position of the post-movement contact point 5. The tactile presentation device 1 may also present tactile motion or tactile pseudo-motion of a length proportional to the magnitude of the contact point displacement vector V145 in the direction indicated by the contact point displacement vector V145. Furthermore, the tactile presentation device 1 may present the contact point motion as a force sensation by applying skin deformation, an external force, symmetrical vibration, or asymmetrical vibration of a magnitude proportional to the magnitude of the contact point displacement vector V145 in the direction indicated by the contact point displacement vector V145.
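As one way to connect V145 to an actuator, the following hedged sketch converts the displacement vector into a direction and a proportional intensity; the DriveCommand structure and the GAIN constant are hypothetical, since the embodiment leaves the concrete drive signal to the chosen presentation method (arm motion, skin deformation, vibration, and so on).

```python
# Hedged sketch: mapping the contact point displacement vector V145 to a
# drive command (all names and the gain are illustrative assumptions).
import math
from dataclasses import dataclass

@dataclass
class DriveCommand:
    direction_rad: float  # direction of the presented motion in C1
    magnitude: float      # stimulus intensity, proportional to |V145|

GAIN = 5.0  # assumed proportionality constant

def to_drive_command(v145):
    dx, dy = v145
    return DriveCommand(direction_rad=math.atan2(dy, dx),
                        magnitude=GAIN * math.hypot(dx, dy))

print(to_drive_command((-0.1, 0.0)))
# DriveCommand(direction_rad=3.141592653589793, magnitude=0.5)
```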
Variations
Although the foregoing embodiment described the calculations for a case where there is one contact point between the user 2 and the tactile presentation device 1, there may be a plurality of contact points between the user 2 and the tactile presentation device 1. In this case, as illustrated in the drawings, the above-described calculations are performed for each of the contact points, and the resulting contact point motions are presented at the respective contact points.
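A minimal sketch of this variation, reusing the contact_point_displacement function from the earlier sketch, simply repeats the calculation for each contact point; the contact point coordinates and the 0.1 rad rotation are assumed values.

```python
# Sketch: the same calculation repeated for a plurality of contact points,
# here for a desired self-motion of a 0.1 rad rotation (assumed values).
contact_points = [(0.5, -0.2), (0.5, 0.2)]  # e.g. left and right hands in C1

for p in contact_points:
    v145 = contact_point_displacement(
        v14=p, body_xy=(0.0, 0.0), rz=0.0,
        motion_xy=(0.0, 0.0), rz_prime=0.1)
    print(p, "->", v145)
```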
When a plurality of contact point motions are presented simultaneously, the self-motion suggested by the tactile stimuli can be constrained more narrowly than when a single contact point motion is presented. For example, assume that a contact point motion which pulls forward on only one point of the user's left hand is presented, as illustrated in the drawings. Such a single contact point motion is consistent with more than one self-motion, such as a forward translation of the whole body or a rotation of the body, so the suggested self-motion remains ambiguous. Presenting contact point motions at a plurality of points leaves fewer self-motions consistent with all of them, and thus suggests the desired self-motion with less ambiguity.
An application is conceivable in which a mobile tactile presentation device is used to present self-motion to a user and guide the user along a desired route or to a desired destination in a situation where the user is moving, such as walking in a city. An application is also conceivable in which walking motion is stabilized by attaching or incorporating a tactile presentation device into a cane, a mobile terminal, or the like used by an elderly or disabled person, and inducing attitude responses, walking responses, and the like that compensate for the presented self-motion.
Although embodiments of the invention have been described thus far, the specific configuration is not intended to be limited to these embodiments, and it goes without saying that changes to the design and the like, to the extent that they do not depart from the essential spirit of the invention, are included in the invention. The various types of processing described in the embodiments need not be executed in time series according to the order in the descriptions, and may instead be executed in parallel or individually as necessary or in accordance with the processing capabilities of the device executing the processing.
Program and Recording Medium
When the various processing functions of the respective devices described in the foregoing embodiments are implemented by a computer, the processing content of the functions that the devices should have is written as a program. Then, by loading the program into a storage unit 1020 of the computer illustrated in the drawings and executing it, the various processing functions of the respective devices are implemented on the computer.
The program in which the processing details are written can be recorded on a computer-readable recording medium. The computer-readable recording medium is, for example, a non-transitory recording medium such as a magnetic recording device or an optical disk.
Additionally, the program is distributed by, for example, selling, transferring, or lending portable recording media such as DVDs and CD-ROMs in which the program is recorded. Furthermore, the configuration may be such that the program is distributed by storing this program in a storage device of a server computer and transferring the program from the server computer to another computer over a network.
A computer executing such a program first stores, for example, the program recorded in the portable recording medium or the program transferred from the server computer in an auxiliary recording unit 1050, which is its own non-transitory storage device. Then, when executing the processing, the computer loads the program stored in the auxiliary recording unit 1050 into the storage unit 1020, which is a transitory storage device, and executes processing in accordance with the loaded program. As another way to execute the program, the computer may load the program directly from the portable recording medium and execute processing in accordance with the program; furthermore, each time a program is transferred to the computer from the server computer, processing in accordance with the received program may be executed sequentially. Additionally, the configuration may be such that the above-described processing is executed by what is known as an ASP (Application Service Provider)-type service that implements the processing functions only through execution instructions and the acquisition of results, without transferring the program from the server computer to the computer in question. Note that the program according to this embodiment includes information that is provided for use in processing by an electronic computer and that is based on the program (such as data that is not a direct command to a computer but has a property of defining the processing performed by the computer).
Additionally, although these devices are configured by causing a computer to execute a predetermined program in this embodiment, at least part of the processing content may be realized by hardware.
Filing Document: PCT/JP2020/012263
Filing Date: 3/19/2020
Country: WO