The present invention relates to a technique for presenting a pseudo haptic sense.
By manipulating visual information on the basis of information obtained from a haptic input device such as a mouse, a keyboard, a touch pen, or a reaction force presentation device, visual feedback to the haptic input can be given to a user. At this time, a haptic impression different from an actual haptic impression can be given to the user in an illusory manner by adding delays or fluctuations to the visual feedback. This technique is referred to herein as a pseudo-haptic technique.
There is also a technique for giving a haptic impression to a user by manipulating visual information on the basis of tracking information obtained by tracking the user's body motion or hand motion in a touchless manner, instead of using a haptic input device. For example, Non Patent Literature 1 discloses a touchless pseudo-haptic technique in a carousel interface that slides a video to the left and right on the basis of a swipe gesture of a user. In this technique, a friction coefficient and a magnetic force are applied to the movement of the carousel on the basis of a physics computation, thereby giving a pseudo haptic sense to the user while reproducing natural movement and stopping of the carousel. Non Patent Literature 2 discloses that, in a virtual reality environment, manipulating a speed difference between a real object gripped in a real space and its visual feedback makes the perceived heaviness of the real object change.
There is no known technique for allowing a user to perceive a pseudo haptic sense without performing a complicated physics computation for simulating a real environment or using a haptic sense based on a physical reaction to a motion of a human body. For example, in Non Patent Literature 2, although the heaviness of the object in the virtual environment is modulated by a pseudo haptic sense, the technique relies on a speed difference between the visual feedback and the gripped real object, which provides a haptic sense through a physical reaction; thus, the user is not allowed to perceive the pseudo haptic sense without using a haptic sense caused by a physical reaction to the motion of the human body.
The present invention has been made in view of these points, and an object of the present invention is to allow a user to perceive a pseudo haptic sense without performing a complicated physics computation for simulating a real environment or using a haptic sense based on a physical reaction to a motion of a human body.
A visual object is visually changed by a magnitude of visual change of the visual object according to a manipulation based on a motion of a body part on the basis of manipulation information indicating the manipulation and information indicating a relationship between the magnitude of visual change and a manipulation amount based on the manipulation, and information for presenting the visually changed visual object is output. Here, this relationship is defined at least according to a degree of a pseudo haptic sense to be presented.
Accordingly, it is possible to allow the user to perceive a pseudo haptic sense without performing a complicated physics computation for simulating the real environment or using a haptic sense based on a physical reaction to the motion of the human body.
Embodiments of the present invention will be described below with reference to the drawings.
First, the principle of each embodiment will be described.
In each embodiment, a visual object is visually changed by a magnitude of visual change of the visual object according to a manipulation based on a motion of a body part on the basis of manipulation information indicating the manipulation and information indicating a relationship between the magnitude of visual change and a manipulation amount based on the manipulation, and the visually changed visual object is presented to a user. Here, the relationship between the magnitude of visual change of the visual object and the manipulation amount based on the manipulation is defined at least according to a degree of a pseudo haptic sense to be presented. Accordingly, it is possible to allow the user to perceive a pseudo haptic sense without performing a complicated physics computation for simulating the real environment or using a haptic sense based on a physical reaction to the motion of the human body. The pseudo haptic sense that can be presented is, for example, a heaviness sense. The heaviness sense can also be rephrased as a weight sense, a resistance sense, or the like.
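As a concrete illustration of this principle, the following is a minimal Python sketch (not part of the embodiments themselves; the function names, the linear form of the relationship, and the numeric gains are illustrative assumptions). It shows how a magnitude of visual change c can be derived from a manipulation amount m through a relationship r that depends on the degree of the pseudo haptic sense to be presented.

```python
# Minimal sketch: deriving a magnitude of visual change c from a manipulation
# amount m through a relationship r that depends on the degree of the pseudo
# haptic sense to be presented. The linear form c = r(i) * m and the gains
# below are illustrative assumptions, not values taken from the embodiments.

# Relationship r for each index i ("light", "medium", "heavy"): the greater
# the heaviness to be presented, the smaller the ratio delta_c / delta_m.
RELATIONSHIP_R = {"light": 1.0, "medium": 0.6, "heavy": 0.3}

def magnitude_of_change(m: float, index_i: str) -> float:
    """Return the magnitude of visual change c for manipulation amount m."""
    return RELATIONSHIP_R[index_i] * m

def update_visual_object(initial_diameter: float, m: float, index_i: str) -> float:
    """Visually change the object (here, its diameter) by the magnitude c."""
    c = magnitude_of_change(m, index_i)
    return initial_diameter + c  # information for presenting the changed object

if __name__ == "__main__":
    # The same hand movement (m = 50) yields a smaller visual change when a
    # heavier pseudo haptic sense is to be presented.
    for i in ("light", "medium", "heavy"):
        print(i, update_visual_object(initial_diameter=100.0, m=50.0, index_i=i))
```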
Specific examples of the body part include a hand, a palm, an arm, a head, a face part, a waist, a trunk, and a foot, but these examples do not limit the present invention, and the manipulation may be based on a motion of any body part. The motion of the body part is, for example, a motion of the body part in a touchless environment. A touchless environment is an environment in which a physical reaction to the motion of the human body (that is, a reaction force due to the reaction) is not used for presentation of a pseudo haptic sense. An environment in which a human body does not grip or wear a physical input device is a touchless environment. Even if the human body grips or wears a physical input device (for example, a VR controller), if a physical reaction from the input device to the motion of the human body is not used for presentation of a pseudo haptic sense, the environment is a touchless environment. The motion of the body part may be a motion of moving the position of the body part, a motion of rotating the body part, a motion of deforming the body part, or a combination thereof. The position of the body part may be an absolute position or a relative position. An example of the former is a position in a coordinate system fixed to an external reference position (for example, the ground or the floor), and an example of the latter is a relative position with respect to a visual object or a device that presents the visual object. The position of the body part may be a position represented by a three-dimensional coordinate system, a position represented by a two-dimensional coordinate system, or a position represented by a one-dimensional coordinate system. The motion of rotating the body part may be a motion of absolutely rotating the body part or a motion of relatively rotating the body part. An example of the former is a motion of rotating a body part with respect to a coordinate system fixed to an external reference position, and an example of the latter is a motion of rotating a body part with respect to a visual object or a device that presents the visual object. The motion of deforming the body part is a motion of deforming the body part in accordance with the movement of muscles and joints. Examples of the motion of deforming the body part include a pinching motion with a finger and a motion of clenching a fist. The motion of the body part can be represented, for example, by the position, orientation, movement amount, rotation amount, movement speed, rotation speed, acceleration, or angular acceleration of the body part, or a combination of at least some of these.
The manipulation based on the motion of the body part may be any manipulation as long as the manipulation is performed on the basis of the motion of the body part. For example, the manipulation based on the motion of the body part may be any manipulation as long as the manipulation content or the manipulation amount based on the manipulation is determined on the basis of the position, orientation, movement amount, rotation amount, movement speed, rotation speed, acceleration, angular acceleration, or a combination of at least some of these of the body part. For example, the manipulation by the body part performed in the touchless environment may be detected by a detection device such as a hand tracker, and the manipulation content or the manipulation amount based on the manipulation may be determined on the basis of the detected position, orientation, movement amount, rotation amount, movement speed, rotation speed, acceleration, angular acceleration, or a combination of at least some of these of the body part. The manipulation information indicating the manipulation based on the motion of the body part may be any information as long as the manipulation information indicates the manipulation based on the motion of the body part. For example, the manipulation information may be information indicating the manipulation content, information indicating the manipulation amount based on the manipulation, or information indicating both the manipulation content and the manipulation amount based on the manipulation. The manipulation content indicates the type of manipulation. Specific examples of the manipulation content include selection, start, continuation, and end of the manipulation. The manipulation amount based on the manipulation represents the amount of any type of manipulation. For example, the greater the motion of the body part, the greater the manipulation amount based on the manipulation. More specifically, for example, the greater the movement amount of the body part, the greater the manipulation amount based on the manipulation. Hereinafter, the manipulation amount based on the manipulation is referred to as a manipulation amount m. The manipulation amount m is, for example, an amount directly proportional to the movement amount of the body part. The movement amount of the body part may be used as the manipulation amount m as it is, or a function value (for example, a non-decreasing function value or a monotonically increasing function value) of the movement amount of the body part may be used as the manipulation amount m. The movement amount of the body part in this case may be a distance from a motion start position of the body part to a current position, may be a specific direction component (for example, a vertical component, a vertical upward component, a vertical downward component, a horizontal component, a horizontal specific direction component, and the like) of the distance from the motion start position of the body part to the current position, or may be a path from the motion start position of the body part to the current position. The motion start position of the body part may be, for example, a position of the body part at the time of transition from a state where a predetermined start condition is not satisfied to a state where the predetermined start condition is satisfied, or may be a predetermined position. 
Examples of the start condition include that the body part has started a motion or movement (Condition 1), that the body part has performed a specific trigger motion (for example, a pinching motion with a finger, a motion of clenching a fist, and the like) (Condition 2), that a positional relationship between the body part and the visual object or another position satisfies a predetermined condition (Condition 3), that a device state satisfies a predetermined condition (Condition 4), that a manipulation state of the device satisfies a predetermined condition (Condition 5), that there has been an output from another processing unit (Condition 6), and the like. Any combination of Conditions 1 to 6 may be used as the start condition. Note that the manipulation amount m may be a value of 0 or more, a positive value, a value of 0 or less, a negative value, or a value that can be positive, negative, or 0.
The visual object may be any object as long as it is visually perceived and visually changed by a magnitude of change according to the manipulation. For example, the visual object may be a two-dimensional image, a three-dimensional image, an image in a virtual space, a stereoscopic hologram, or an entity (for example, a mechanically moving attraction device, an advertisement signboard, or the like) in a real space. The visual change of the visual object is not limited, and for example, a size of the visual object may change, a shape of the visual object may change, a luminance may change, a color may change, the pattern may change, or any combination thereof may change. At least one of the luminance, the color, or the pattern of the visual object may change without changing the size or shape of the visual object. At least one of the luminance, the color, or the pattern of the visual object may or may not be uniform in a spatial region. At least one of the luminance, the color, or the pattern of the visual object may be periodic or aperiodic in the spatial region.
The magnitude of visual change of the visual object is a magnitude of change from the initial state to the current state of the visual element of the visual object. The initial state of the visual element of the visual object may be determined in advance, may be determined on the basis of input information, or may be determined on the basis of other information (for example, the position of the presented visual object). The initial state of the visual element of the visual object may be the same regardless of the degree of the pseudo haptic sense to be presented, or may be different depending on the degree of the pseudo haptic sense to be presented. However, in order to clarify the difference in the degree of the pseudo haptic sense to be presented, it is desirable to make the initial state of the visual element of the visual object the same regardless of the degree of the pseudo haptic sense to be presented. Hereinafter, the magnitude of visual change of the visual object is referred to as a magnitude of change c. The magnitude of change c corresponds to the magnitude of change of at least one of the size, the shape, the luminance, the color, or the pattern of the visual object, and at least one of the size, the shape, the luminance, the color, or the pattern of the visual object changes by the magnitude of change according to the manipulation based on the motion of the body part. Examples of the magnitude of size change of the visual object include a magnitude of diameter change, a magnitude of radius change, a magnitude of area change, and a magnitude of volume change of the visual object. Examples of the magnitude of shape change of the visual object include a magnitude of change in aspect ratio, a magnitude of change in aspect/depth ratio, and a magnitude of change in oblateness of the visual object. Examples of the magnitude of luminance change include a magnitude of luminance change, a magnitude of brightness change, and a magnitude of phase change in spatial region. Examples of the magnitude of color change include a magnitude of change in pixel value, a magnitude of change in color space, and a magnitude of phase change in spatial region. Examples of the magnitude of pattern change include a magnitude of phase change in spatial region, a magnitude of change in pixel value, a sum of magnitudes of change in pixel value, and the like. The magnitude of change c may be a value of 0 or more, a positive value, a value of 0 or less, a negative value, or a value that can be positive, negative, or 0.
The relationship between the magnitude of visual change c of the visual object and the manipulation amount m based on the manipulation is defined according to a degree of a pseudo haptic sense to be presented. Hereinafter, this relationship is referred to as a relationship r. Only a single relationship r with respect to a degree of a single pseudo haptic sense may be defined, or a plurality of relationships r with respect to degrees of a plurality of pseudo haptic senses may be defined. Note that the degree of the pseudo haptic sense to be presented may be determined in advance, may be determined on the basis of the input information, or may be determined on the basis of other processing. Hereinafter, an index representing the degree of the pseudo haptic sense to be presented is referred to as an index i. An example of the index i is a value (for example, an index, a numerical value, a vector, or a symbol) representing the magnitude (strength) of the pseudo haptic sense, for example, a value representing the magnitude of the heaviness sense, the magnitude of the weight sense, or the magnitude of the resistance sense. In a case where a plurality of relationships r with respect to degrees of a plurality of pseudo haptic senses are defined, the relationships r are also different if the pseudo haptic senses to be presented are different. That is, when the relationship in a case where the index i is i1 (first index value) is referred to as a relationship r(i1), and the relationship in a case where the index i representing the degree of the pseudo haptic sense to be presented is i2 (second index value) different from i1 is referred to as a relationship r(i2), the relationship r(i1) and the relationship r(i2) are different. The relationship r between the magnitude of change c and the manipulation amount m may be a linear relationship or a non-linear relationship. An example of the relationship r is a ratio Δc/Δm of a change Δc in the magnitude of visual change c of the visual object to a change Δm in the manipulation amount m. For example, Δm means a unit manipulation amount in the manipulation amount m. For example, in a case where c is a function value f(m) of m, an example of the ratio r=Δc/Δm is the first derivative (slope) dc/dm of c=f(m) with respect to m. Here, f(·) denotes a function of the argument ·.
When the ratio r=Δc/Δm in a case where the index i is i1 (first index value) is referred to as a ratio r(i1) (first value), and the ratio r in a case where the index i is i2 (second index value) different from i1 is referred to as a ratio r(i2) (second value), the ratio r(i1) (first value) and the ratio r(i2) (second value) are different from each other. For example, in a case where the degree of the pseudo haptic sense represented by the index i1 (first index value) (for example, the magnitude of the pseudo haptic sense such as the magnitude of the heaviness sense, the magnitude of the weight sense, the magnitude of the resistance sense, and the like) is greater than the degree of the pseudo haptic sense represented by the index i2 (second index value), the ratio r(i1) (first value) is smaller than the ratio r(i2) (second value). For example, the ratio r decreases as the degree of the pseudo haptic sense to be presented increases. In other words, the smaller the ratio r, the greater the degree of the pseudo haptic sense to be presented.
For the same index i, the ratio r=Δc/Δm may be constant regardless of the magnitude of m (at least within a predetermined m range), or the ratio r=Δc/Δm may be different according to the magnitude of m. That is, in a case where the manipulation amount m changes from m1 (first manipulation amount) to m2 (second manipulation amount) and/or the manipulation amount m changes from m2 (second manipulation amount) to m1 (first manipulation amount) according to the manipulation (m1 is different from m2), a ratio r (third value) when the manipulation amount m is the manipulation amount m1 (first manipulation amount) and a ratio r (fourth value) when the manipulation amount m is the manipulation amount m2 (second manipulation amount) may be equal to or different from each other. For example, in a case where the ratio r when the manipulation amount m is m1 is r(i1, m1) (third value) and the ratio r when the manipulation amount m is the manipulation amount m2 is r(i1, m2) (fourth value) with respect to the index i1, r(i1, m1)=r(i1, m2), or r(i1, m1)≠r(i1, m2) (that is, r(i1, m2) (fourth value) is different from r(i1, m1) (third value)) may be satisfied. For example, the manipulation amount m1 (first manipulation amount) may be greater than the manipulation amount m2 (second manipulation amount), and r(i1, m1) (third value) may be smaller than r(i1, m2) (fourth value). In this case, the degree of the pseudo haptic sense increases as the manipulation amount m increases, and for example, the greater the manipulation amount m, the greater the degree of the pseudo haptic sense. Conversely, for example, the manipulation amount m1 (first manipulation amount) may be greater than the manipulation amount m2 (second manipulation amount), and r(i1, m1) (third value) may be greater than r(i1, m2) (fourth value). In this case, the degree of the pseudo haptic sense decreases as the manipulation amount m increases, and for example, the greater the manipulation amount m, the smaller the degree of the pseudo haptic sense. Alternatively, for example, in a case where the ratio r to the manipulation amount m1 is r(i1, m1), the ratio r to the manipulation amount m2 is r(i1, m2), and the ratio r to the manipulation amount m3 is r(i1, m3) with respect to the manipulation amounts m1, m2, and m3 satisfying m1<m2<m3, r(i1, m2)<r(i1, m1) and r(i1, m2)<r(i1, m3) may be satisfied. In this case, the degree of the pseudo haptic sense presented at the manipulation amount m2 is greater than the degrees of the pseudo haptic senses presented at the manipulation amounts m1 and m3 before and after the manipulation amount m2. Conversely, r(i1, m2)>r(i1, m1) and r(i1, m2)>r(i1, m3) may be satisfied. In this case, the degree of the pseudo haptic sense presented at the manipulation amount m2 is smaller than the degrees of the pseudo haptic senses presented at the manipulation amounts m1 and m3 before and after the manipulation amount m2.
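The case where the ratio varies with the manipulation amount can be illustrated with a short sketch (again an assumption for illustration, not a prescribed mapping): with a concave function c = f(m), the local ratio Δc/Δm shrinks as m grows, so the presented heaviness increases the further the body part is moved.

```python
import math

# Illustrative assumption: a concave mapping c = f(m) = k * sqrt(m) whose
# local slope decreases as the manipulation amount m grows, so the presented
# heaviness increases with larger hand movements.
K = 4.0

def f(m: float) -> float:
    """Magnitude of visual change c for manipulation amount m."""
    return K * math.sqrt(m)

def local_ratio(m: float, delta_m: float = 1.0) -> float:
    """Approximate ratio delta_c / delta_m around the manipulation amount m."""
    return (f(m + delta_m) - f(m)) / delta_m

if __name__ == "__main__":
    for m in (1.0, 25.0, 100.0):
        # The ratio shrinks as m grows: the same extra hand movement produces
        # a smaller extra visual change, which reads as a heavier object.
        print(f"m = {m:6.1f}  c = {f(m):6.2f}  ratio = {local_ratio(m):.3f}")
```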
In addition, the index i representing the degree of the pseudo haptic sense may be switched according to the manipulation amount m. For example, in a case where the manipulation amount m changes from m1 (first manipulation amount) to m2 (second manipulation amount) and/or the manipulation amount m changes from m2 (second manipulation amount) to m1 (first manipulation amount) according to the manipulation (m1 is different from m2), the index i when the manipulation amount m is the manipulation amount m1 (first manipulation amount) and the index i when the manipulation amount m is the manipulation amount m2 (second manipulation amount) may be equal to or different from each other. In this manner, the degree of the pseudo haptic sense may be changed by switching the index i according to the manipulation amount m. In this case, the ratio r(i)=Δc/Δm corresponding to each index i may be constant regardless of the magnitude of m, or the ratio r(i)=Δc/Δm may be different according to the magnitude of m (for example, as described above). For example, even if the ratio r(i) corresponding to each index i is constant regardless of the magnitude of m, the degree of the pseudo haptic sense to be presented can be changed according to the manipulation amount m by switching the index i according to the manipulation amount m. For example, in a case where the manipulation amount m1 is greater than the manipulation amount m2 and the degree of the pseudo haptic sense represented by the index i1 is greater than the degree of the pseudo haptic sense represented by the index i2, the index may be switched to the index i1 at the manipulation amount m1, and may be switched to the index i2 at the manipulation amount m2. Conversely, the index may be switched to the index i2 at the manipulation amount m1, and may be switched to the index i1 at the manipulation amount m2. Alternatively, for example, in a case where the degree of the pseudo haptic sense represented by the index i1 is greater than the degree of the pseudo haptic sense represented by the index i2 and the degree of the pseudo haptic sense represented by the index i2 is greater than the degree of the pseudo haptic sense represented by the index i3 with respect to the manipulation amounts m1, m2, and m3 satisfying m1<m2<m3, the index may be switched to the index i1 at the manipulation amount m2, may be switched to the index i2 at the manipulation amount m1, and may be switched to the index i3 at the manipulation amount m3. Alternatively, in this case, the index may be switched to the index i3 at the manipulation amount m2, may be switched to the index i2 at the manipulation amount m1, and may be switched to the index i3 at the manipulation amount m3.
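Switching the index i according to the manipulation amount m can likewise be sketched as follows; the thresholds and the three indices are hypothetical and only illustrate the switching scheme, with each index keeping a constant ratio of its own (here i1 is switched in at the middle range of m, matching the first example above).

```python
# Illustrative sketch of switching the index i by manipulation-amount
# thresholds. The thresholds and ratios below are hypothetical; each index
# keeps a constant ratio delta_c / delta_m of its own.
RATIO_BY_INDEX = {"i1": 0.3, "i2": 0.6, "i3": 1.0}  # i1 = heaviest, i3 = lightest

def select_index(m: float) -> str:
    """Pick the index i from the manipulation amount m (thresholds are assumptions)."""
    if m < 30.0:
        return "i2"   # small movements: intermediate heaviness
    if m < 70.0:
        return "i1"   # mid-range movements: heaviest (smallest ratio)
    return "i3"       # large movements: lightest (largest ratio)

def magnitude_of_change(m: float) -> float:
    """Magnitude of visual change c under the index selected for this m."""
    return RATIO_BY_INDEX[select_index(m)] * m

if __name__ == "__main__":
    for m in (10.0, 50.0, 90.0):
        print(f"m = {m:5.1f}  index = {select_index(m)}  c = {magnitude_of_change(m):6.1f}")
```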
In a first embodiment, a case where the body part is a user's hand, the visual object is a disk-shaped two-dimensional figure, the motion of the body part is a gesture that moves the visual object up and down, the diameter of the visual object is changed by a magnitude of change according to a manipulation based on this motion, and the user is allowed to perceive a heaviness sense that is a pseudo haptic sense will be exemplified. However, this does not limit the present invention.
As illustrated in
The pseudo haptic sense presentation apparatus 11 includes an input unit 111, a storage unit 112, a motion information detection unit 113, a manipulation determination unit 115, a visual object update unit 116, and an output unit 117. Although not described again below, data obtained in each process is stored in the storage unit 112 and is read out and used as necessary.
The detection device 12 is a device that detects a position of a body part 101 (for example, a hand) of a user 100 in the touchless environment, and outputs a detection result. Examples of the detection device 12 include a hand tracker such as Leap motion (registered trademark), a motion capture system, a touchless display, an acceleration sensor, and a gyro sensor; these do not limit the present invention.
The visual object presentation device 13 is a device that visually presents a visual object 130 to the user 100. Examples of the visual object presentation device 13 include a liquid crystal display, a virtual reality headset, a video projector, a stereoscopic hologram display, and the like, which do not limit the present invention.
As preprocessing, at least information p indicating the relationship r between the magnitude of visual change c of the visual object 130 and the manipulation amount m based on the manipulation (manipulation based on the motion of the body part 101) is input to the input unit 111 and stored in the storage unit 112. Specific examples of the manipulation amount m, the visual object 130, the magnitude of change c, and the relationship r are as described above. For the sake of simplicity of description, in the present embodiment, a case where the visual object 130 is a disk-shaped two-dimensional figure, the magnitude of visual change c of the visual object 130 is a magnitude (positive value) of diameter change of the visual object 130, the manipulation amount m is a vertical component of the movement amount of the body part 101 (hand) based on the motion of the body part 101 that moves the visual object 130 up and down, and the relationship r is a relationship between the magnitude of diameter change c of the visual object 130 and the manipulation amount m will be exemplified. However, this does not limit the present invention.
Next, an operation of the pseudo haptic sense presentation system 1 according to the present embodiment will be exemplified.
The visual object update unit 116 (
The user 100 moves the body part 101 (for example, a hand) in a touchless environment while viewing the visual object 130 presented by the visual object presentation device 13. The position of the body part 101 is detected by the detection device 12. Information d indicating the position of the body part 101 detected by the detection device 12 is output to the motion information detection unit 113. The position detection of the body part 101 and the output of the information d in the detection device 12 may be continuously performed at predetermined time intervals, for example, or may be performed each time the movement of the body part 101 is detected. Every time the information d is input, the motion information detection unit 113 detects the motion of the body part 101 from the information d and outputs information am indicating the motion of the body part 101. The motion detection of the body part 101 and the output of the information am are also continuously performed. The information am indicating the motion is, for example, information indicating at least one of the position, the movement, and the movement amount m of the body part 101. In the present embodiment, an example is illustrated in which the information am indicating the motion is information for specifying the movement amount m of the body part 101. A specific example of the movement amount m of the body part 101 is as described above. In the present embodiment, an example in which the vertical component of the distance from the motion start position of the body part 101 to the current position is used as the movement amount will be described. A specific example of the motion start position of the body part 101 is as described above. In the example of the present embodiment, the position of the body part 101 at the time of transition from a state in which the start condition, which is a combination of the above-described Condition 2 and Condition 3, is not satisfied to a state in which the start condition is satisfied is set as the motion start position. That is, in the example of the present embodiment, the position of the body part 101 at the time of transition to a state in which the body part 101 performs a specific trigger motion (Condition 2) and the positional relationship between the body part 101 and the presented visual object 130 or another position satisfies a predetermined condition (Condition 3) is set as the motion start position. The trigger motion may be any motion, but in the present embodiment, an example in which a pinching motion with a finger is used as the trigger motion will be described. Furthermore, the predetermined condition (Condition 3) that the positional relationship satisfies may be any condition. In the example of the present embodiment, the predetermined condition is that the body part 101 (for example, a hand) is away from the presentation position (for example, the display screen of the visual object presentation device 13) of the visual object 130 by more than a certain distance. 
In this case, the motion information detection unit 113 measures a distance d1 between the thumb and the index finger, which are the body part 101 of the user 100, on the basis of the information d, further measures a distance d2 between the body part 101 (for example, a hand) and the presentation position of the visual object 130 (for example, the display screen of the visual object presentation device 13), and outputs the distance d1 and the distance d2 as information am=(d1, d2) indicating the motion of the body part 101.
The information am=(d1, d2) is input to the manipulation determination unit 115. The manipulation determination unit 115 obtains and outputs manipulation information indicating a manipulation based on the motion of the body part 101 on the basis of the information am. An example of the manipulation information is as described above, but in the present embodiment, an example in which information indicating the manipulation amount m is output as the manipulation information will be described. For example, first, the manipulation determination unit 115 determines whether or not the distance d1 is less than a predetermined threshold value dth1 (alternatively, whether or not the distance d1 is equal to or less than the threshold value dth1). Here, the threshold value dth1 is a positive real number representing the distance. Here, in a case where the distance d1 exceeds the threshold value dth1 (alternatively, in a case where the distance d1 is equal to or greater than the threshold value dth1), the manipulation determination unit 115 determines that the pinching motion, that is, the trigger motion, on the body part 101 (the hand in this example) is not performed. On the other hand, in a case where the distance d1 is equal to or less than the threshold value dth1 (alternatively, in a case where the distance d1 is less than the threshold value dth1), the manipulation determination unit 115 determines that the trigger motion is performed (that is, it is determined that Condition 2 is satisfied). In addition, the manipulation determination unit 115 determines whether or not the distance d2 exceeds a predetermined threshold value dth2 (alternatively, whether or not the distance d2 is equal to or greater than the threshold value dth2). Here, the threshold value dth2 is a positive real number representing the distance. Here, in a case where the distance d2 is equal to or less than the threshold value dth2 (alternatively, in a case where the distance d2 is less than the threshold value dth2), it is determined that the positional relationship of the body part 101 does not satisfy the predetermined condition. On the other hand, in a case where the distance d2 exceeds the threshold value dth2 (alternatively, in a case where the distance d2 is equal to or greater than the threshold value dth2), it is determined that the positional relationship of the body part 101 satisfies the predetermined condition (that is, it is determined that Condition 3 is satisfied). In a case where the trigger motion is performed and it is determined that the positional relationship of the body part 101 satisfies the predetermined condition (that is, in a case where it is determined that both Condition 2 and Condition 3 are satisfied), the manipulation determination unit 115 obtains a vertical component of the distance from the motion start position to the current position of the body part 101 as the movement amount on the basis of the distance d2, and sets the movement amount as the manipulation amount m. Note that the motion start position is as described above. The motion start position exemplified in the present embodiment is the position of the body part 101 at the time of transition from a state in which at least one of Condition 2 or Condition 3 is not satisfied to a state in which both Condition 2 and Condition 3 are satisfied. 
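A compact sketch of this determination logic is given below (the threshold values, the 3-D data layout of the tracker output, and the helper names are assumptions; in practice the positions would come from a detection device such as a hand tracker).

```python
from dataclasses import dataclass
import math

# Sketch of the manipulation determination described above. The thresholds,
# the 3-D position format, and the class/field names are assumptions for
# illustration; positions would come from a detection device (hand tracker).
DTH1 = 0.03  # threshold on thumb-index distance d1 [m]: pinch (trigger motion)
DTH2 = 0.10  # threshold on hand-display distance d2 [m]: Condition 3

@dataclass
class HandSample:
    thumb: tuple[float, float, float]
    index_finger: tuple[float, float, float]
    hand: tuple[float, float, float]      # representative hand position
    display: tuple[float, float, float]   # presentation position of the visual object

class ManipulationDeterminer:
    def __init__(self) -> None:
        self.start_y = None  # vertical position recorded at the motion start

    def manipulation_amount(self, s: HandSample):
        d1 = math.dist(s.thumb, s.index_finger)  # pinch distance
        d2 = math.dist(s.hand, s.display)        # hand-to-display distance
        if d1 <= DTH1 and d2 > DTH2:             # Condition 2 and Condition 3 hold
            if self.start_y is None:
                self.start_y = s.hand[1]         # record the motion start position
            return s.hand[1] - self.start_y      # vertical component = manipulation amount m
        self.start_y = None                      # start condition no longer satisfied
        return None                              # no manipulation in progress
```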
The manipulation determination unit 115 outputs information indicating the manipulation amount m (manipulation information indicating a manipulation based on the motion of the body part) to the visual object update unit 116.
Information indicating the manipulation amount m is input to the visual object update unit 116. The visual object update unit 116 refers to the information p stored in the storage unit 112, and visually changes the visual object 130 by the magnitude of change according to the manipulation on the basis of at least the information indicating the manipulation amount m (manipulation information indicating the manipulation based on the motion of the body part) and the information p (information indicating the relationship between the magnitude of visual change of the visual object and the manipulation amount based on the manipulation). For example, if the information p represents only a single relationship r with respect to a degree of a single pseudo haptic sense, the visual object update unit 116 refers to this relationship r, obtains the magnitude of change c corresponding to the input manipulation amount m, and visually changes the visual object 130 from the initial state by the magnitude of change c. In the example of the present embodiment, the diameter of the visual object 130 is changed from the initial value c1 to c1+b·c. That is, the visual object update unit 116 in this example generates a disk-shaped visual object 130 having a diameter c2=c1+b·c, and outputs the information v indicating the visual object 130. b={b+, b−} may be information included in the information p or may be determined in advance. On the other hand, for example, if the information p represents a plurality of relationships r with respect to degrees of a plurality of pseudo haptic senses, the visual object update unit 116 refers to the relationship r corresponding to the degree of the pseudo haptic sense to be presented, obtains the magnitude of change c corresponding to the input manipulation amount m, and visually changes the visual object 130 from the initial state by the magnitude of change c. In the case of the example of
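The diameter update of this example can be sketched as follows (the initial diameter, the gain b, and the ratios stored as the information p are assumptions used only to illustrate the computation c2 = c1 + b·c).

```python
# Sketch of the diameter update c2 = c1 + b * c performed by the visual object
# update unit. The initial diameter, the gain b, and the ratios stored as the
# information p are illustrative assumptions.
INFO_P = {"heavy": 0.3, "light": 1.0}  # relationship r for each pseudo haptic degree

def updated_diameter(c1: float, m: float, index_i: str, b: float = 1.0) -> float:
    """Return c2 = c1 + b * c, where c is obtained from m via the relationship r."""
    c = INFO_P[index_i] * m          # magnitude of change c for this manipulation amount
    return c1 + b * c                # diameter of the visually changed object

if __name__ == "__main__":
    # The same upward hand movement (m = 40) enlarges the disk less when the
    # heavier pseudo haptic sense is to be presented.
    print(updated_diameter(c1=100.0, m=40.0, index_i="light"))  # 140.0
    print(updated_diameter(c1=100.0, m=40.0, index_i="heavy"))  # 112.0
```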
With reference to
The information v indicating the visual object 130 output from the visual object update unit 116 is transmitted to the output unit 117, and is output from the output unit 117 to the visual object presentation device 13. The visual object presentation device 13 presents (displays) the visual object 130 on the basis of the transmitted information v (
Next, experimental results of Experiment 1 for showing the effects of the first embodiment will be exemplified.
As illustrated in
In the first embodiment, the case where the visual object 130 is a disk-shaped two-dimensional figure, the motion of the body part 101 is a gesture that moves the visual object 130 up and down, the diameter of the visual object 130 is changed by a magnitude of change according to a manipulation based on this motion, and the user is allowed to perceive a heaviness sense that is a pseudo haptic sense has been exemplified. That is, in the first embodiment, an example has been shown in which the heaviness sense is perceived by changing the size of the visual object 130 by the magnitude of change according to the manipulation based on the motion. In a second embodiment, it is shown that the heaviness sense can be perceived by changing at least one of the luminance, the color, or the pattern of the visual object by the magnitude of change according to the manipulation based on the motion without changing the size or the shape of the visual object. In order to show this, in the present embodiment, it is assumed that a two-dimensional figure having concentric fringes with a fixed diameter is a visual object. In the second embodiment, the heaviness sense is perceived by changing a phase in the spatial region of the concentric fringes of the visual object (hereinafter simply referred to as a “phase”) by the magnitude of change according to the manipulation based on the motion.
Differences from the first embodiment will be mainly described below, and the same reference numerals will be used for the matters that have already been described to simplify the description.
As illustrated in
Preprocessing of the present embodiment is the same as the preprocessing of the first embodiment except that the visual object 130 is replaced with a visual object 230 that is a disk-shaped two-dimensional figure having concentric fringes with a fixed diameter (for example,
Next, an operation of the pseudo haptic sense presentation system 2 according to the present embodiment will be exemplified.
The visual object update unit 216 (
The user 100 moves the body part 101 (for example, a hand) in a touchless environment while viewing the visual object 230 presented by the visual object presentation device 13. The position of the body part 101 is detected by the detection device 12. Information d indicating the position of the body part 101 detected by the detection device 12 is output to the motion information detection unit 113. Every time the information d is input, the motion information detection unit 113 detects the motion of the body part 101 from the information d and outputs information am=(d1, d2) indicating the motion of the body part 101. The operations of the detection device 12 and the motion information detection unit 113 are the same as those in the first embodiment.
The information am=(d1, d2) is input to the manipulation determination unit 115, and the manipulation determination unit 115 obtains and outputs manipulation information indicating a manipulation based on the motion of the body part 101 on the basis of the information am. As in the first embodiment, also in the present embodiment, an example in which information indicating the manipulation amount m is output as manipulation information will be described. The operation of the manipulation determination unit 115 is the same as that of the first embodiment.
Information indicating the manipulation amount m is input to the visual object update unit 216. The visual object update unit 216 refers to the information p stored in the storage unit 112, and visually changes the visual object 230 by the magnitude of change according to the manipulation on the basis of at least the information indicating the manipulation amount m (manipulation information indicating the manipulation based on the motion of the body part) and the information p (information indicating the relationship between the magnitude of visual change of the visual object and the manipulation amount based on the manipulation). This operation is the same as the operation of the visual object update unit 116 of the first embodiment except that the phase of the concentric fringes of the visual object 230 is changed such that the distance x changes from x1 to x2=x1+b·c instead of changing the diameter of the visual object 130 from the initial value c1 to c2=c1+b·c. In a case where the magnitude of change c is positive and b=b+, changing the phase of the concentric fringes of the visual object 230 such that the distance x changes from x1 to x2=x1+b·c means moving the fringes of the concentric fringes in the direction from the center O toward the outside. That is, in a case where b=b+, the higher the vertical position of the body part 101 (for example, a hand), the more the fringes are moved outward, and the lower the vertical position of the body part 101 (for example, a hand), the more the fringes are moved inward. In a case where the magnitude of change c is positive and b=b−, changing the phase of the concentric fringes of the visual object 230 such that the distance x changes from x1 to x2=x1+b·c means moving the fringes of the concentric fringes in the direction from the outside toward the center O. That is, in a case where b=b−, the higher the height (vertical component of the movement amount) of the body part 101 (for example, a hand), the more the fringes are moved inward, and the lower the height of the body part 101 (for example, a hand), the more the fringes are moved outward. Note that the diameter (size) and the shape of the visual object 230 are not changed.
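How such a phase shift of concentric fringes might be rendered is sketched below with NumPy (the image size, spatial frequency, and sinusoidal fringe profile are assumptions; only the phase term x1 + b·c moves, while the size and shape of the disk stay fixed).

```python
import numpy as np

# Sketch of rendering a disk of fixed diameter whose concentric fringes are
# phase-shifted by b * c. The image size, spatial frequency, and sinusoidal
# fringe profile are assumptions for illustration.
SIZE = 256          # image is SIZE x SIZE pixels
RADIUS = 120.0      # fixed radius of the disk (its size never changes)
FREQ = 0.15         # spatial frequency of the fringes [cycles per pixel]

def concentric_fringes(phase_x: float) -> np.ndarray:
    """Return a grayscale image whose fringes are shifted outward by phase_x pixels."""
    yy, xx = np.mgrid[0:SIZE, 0:SIZE]
    r = np.hypot(xx - SIZE / 2, yy - SIZE / 2)          # distance from the center O
    fringes = 0.5 + 0.5 * np.cos(2 * np.pi * FREQ * (r - phase_x))
    fringes[r > RADIUS] = 0.0                           # the diameter stays fixed
    return fringes

def render(x1: float, b: float, c: float) -> np.ndarray:
    """Shift the fringe phase so that the distance x changes from x1 to x1 + b*c."""
    return concentric_fringes(x1 + b * c)

if __name__ == "__main__":
    before = render(x1=0.0, b=+1.0, c=0.0)
    after = render(x1=0.0, b=+1.0, c=5.0)   # positive c with b = b+: fringes move outward
    print(before.shape, float(np.abs(after - before).mean()))
```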
With reference to
The information v indicating the visual object 230 output from the visual object update unit 216 is transmitted to the output unit 117, and is output from the output unit 117 to the visual object presentation device 13. The visual object presentation device 13 presents (displays) the visual object 230 on the basis of the transmitted information v (
Next, experimental results of Experiments 2 and 3 for showing the effects of the second embodiment will be exemplified.
Experiment 2 is an experiment in a case where b=b+, that is, the higher the vertical position of the body part 101 (for example, a hand), the more the fringes are moved outward, and the lower the vertical position of the body part 101 (for example, a hand), the more the fringes are moved inward. As illustrated in
Experiment 3 is an experiment in a case where b=b−, that is, the higher the vertical position of the body part 101 (for example, a hand), the more the fringes are moved inward, and the lower the vertical position of the body part 101 (for example, a hand), the more the fringes are moved outward. The experimental method is the same as in Experiment 2 except that b=b−.
In the second embodiment, the heaviness sense is perceived by changing the phase of the concentric fringes of the visual object by the magnitude of change according to the manipulation based on the motion. However, this effect can be obtained not only in the case where the phases of the concentric fringes are changed but also in the case where the phases of other figures are changed. In order to show this, in the present embodiment, it is assumed that a two-dimensional figure having one-dimensional fringes is a visual object. The size and the shape of the visual object of the present embodiment do not change. Differences from the first and second embodiments will be mainly described below, and the same reference numerals will be used for the matters that have already been described to simplify the description.
As illustrated in
Preprocessing of the present embodiment is the same as the preprocessing of the first embodiment except that the visual object 130 is replaced with a visual object 330 that is a rectangular two-dimensional figure having one-dimensional fringes with a fixed size (for example,
Next, an operation of the pseudo haptic sense presentation system 3 according to the present embodiment will be exemplified.
The visual object update unit 316 (
The user 100 moves the body part 101 (for example, a hand) in a touchless environment while viewing the visual object 330 presented by the visual object presentation device 13. The position of the body part 101 is detected by the detection device 12. Information d indicating the position of the body part 101 detected by the detection device 12 is output to the motion information detection unit 113. Every time the information d is input, the motion information detection unit 113 detects the motion of the body part 101 from the information d and outputs information am=(d1, d2) indicating the motion of the body part 101. The operations of the detection device 12 and the motion information detection unit 113 are the same as those in the first embodiment.
The information am=(d1, d2) is input to the manipulation determination unit 115, and the manipulation determination unit 115 obtains and outputs manipulation information indicating a manipulation based on the motion of the body part 101 on the basis of the information am. As in the first embodiment, also in the present embodiment, an example in which information indicating the manipulation amount m is output as manipulation information will be described. The operation of the manipulation determination unit 115 is the same as that of the first embodiment.
Information indicating the manipulation amount m is input to the visual object update unit 316. The visual object update unit 316 refers to the information p stored in the storage unit 112, and visually changes the visual object 330 by the magnitude of change according to the manipulation on the basis of at least the information indicating the manipulation amount m (manipulation information indicating the manipulation based on the motion of the body part) and the information p (information indicating the relationship between the magnitude of visual change of the visual object and the manipulation amount based on the manipulation). This operation is the same as the operation of the visual object update unit 116 of the first embodiment except that the phase of the one-dimensional fringes of the visual object 330 is changed such that the distance x changes from x1 to x2=x1+b·c instead of changing the diameter of the visual object 130 from the initial value c1 to c2=c1+b·c. In a case where the magnitude of change c is positive and b=b+, changing the phase of the one-dimensional fringes of the visual object 330 such that the distance x changes from x1 to x2=x1+b·c means moving the fringes of the one-dimensional fringes rightward, away from the edge E. That is, in a case where b=b+, the higher the vertical position of the body part 101 (for example, a hand), the more the fringes are moved to the right, and the lower the vertical position of the body part 101 (for example, a hand), the more the fringes are moved to the left. In a case where the magnitude of change c is positive and b=b−, changing the phase of the one-dimensional fringes of the visual object 330 such that the distance x changes from x1 to x2=x1+b·c means moving the fringes of the one-dimensional fringes in the direction from the right toward the edge E (leftward). That is, in a case where b=b−, the higher the height (vertical component of the movement amount) of the body part 101 (for example, a hand), the more the fringes are moved to the left, and the lower the height of the body part 101 (for example, a hand), the more the fringes are moved to the right. Note that the size and the shape of the visual object 330 are not changed.
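A corresponding sketch for the one-dimensional fringes is shown below (again with assumed image size and spatial frequency); increasing the phase x1 + b·c translates the vertical fringes to the right, away from the edge E, while the rectangle itself keeps its size and shape.

```python
import numpy as np

# Sketch of a rectangle of fixed size whose one-dimensional (vertical) fringes
# are translated horizontally by b * c. The image size and spatial frequency
# are illustrative assumptions.
WIDTH, HEIGHT = 320, 120
FREQ = 0.1  # cycles per pixel along the horizontal axis

def one_dimensional_fringes(phase_x: float) -> np.ndarray:
    """Return a grayscale rectangle whose fringes are shifted right by phase_x pixels."""
    x = np.arange(WIDTH)
    row = 0.5 + 0.5 * np.cos(2 * np.pi * FREQ * (x - phase_x))
    return np.tile(row, (HEIGHT, 1))  # the rectangle's size and shape never change

if __name__ == "__main__":
    before = one_dimensional_fringes(phase_x=0.0)
    after = one_dimensional_fringes(phase_x=5.0)   # c > 0 with b = b+: fringes move rightward
    print(after.shape, float(np.abs(after - before).mean()))
```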
The information v indicating the visual object 330 output from the visual object update unit 316 is transmitted to the output unit 117, and is output from the output unit 117 to the visual object presentation device 13. The visual object presentation device 13 presents (displays) the visual object 330 on the basis of the transmitted information v (
Next, experimental results of Experiment 4 for showing the effects of the third embodiment will be exemplified.
As illustrated in
In the first to third embodiments, an example has been described in which the size or the phase of the visual object is changed by the magnitude of change according to the manipulation on the basis of the manipulation information indicating the manipulation based on the motion of the body part and the information indicating the relationship between the magnitude of visual change of the visual object and the manipulation amount based on the manipulation, thereby allowing the user to perceive the heaviness sense. However, the present invention is not limited thereto as described above. For example, the shape of the visual object may be a disk shape, an elliptical shape, a rectangular shape, another polygonal shape, or any other shape. There is no limitation on the visual change of the visual object. For example, the manner of changing the size of the visual object is not limited to that of the first embodiment, and the size of the visual object may be changed while the visual object is deformed by the magnitude of change according to the manipulation, or the size or the shape of the visual object may be changed while at least one of the luminance, the color, or the pattern of the visual object is changed. The phase of the periodic fringes of the visual object may be changed while the size and the shape of the visual object are changed by the magnitude of change according to the manipulation. The visual object may be a two-dimensional image (for example, a white noise image) having an aperiodic pattern, and at least one of the size, the shape, the pattern, the luminance, or the color of the visual object may be changed by the magnitude of change according to the manipulation. Furthermore, as described above, for the same index i, the relationship r (for example, r=Δc/Δm) may be constant, or the relationship r (for example, r=Δc/Δm) may be different according to the magnitude of m. As described above, the visual object update units 116, 216, and 316 may switch the index i representing the degree of the pseudo haptic sense according to the manipulation amount m, refer to the relationship r corresponding to the switched index i, obtain the magnitude of change c corresponding to the input manipulation amount m, and visually change the visual object by the obtained magnitude of change.
Also, various kinds of processing described above may be executed not only in time series in accordance with the description but also in parallel or individually in accordance with processing capabilities of the devices that execute the processes or as necessary. It is needless to say that appropriate modifications can be made without departing from the scope of the present invention.
Each of the pseudo haptic sense presentation apparatuses 11, 21, and 31 according to each embodiment is a device configured with a general-purpose or dedicated computer executing a predetermined program, the computer including a processor (a hardware processor) such as a central processing unit (CPU) and a memory such as a random access memory (RAM) and a read only memory (ROM), for example. That is, each of the pseudo haptic sense presentation apparatuses 11, 21, and 31 according to each embodiment includes, for example, processing circuitry configured to implement each unit included in the apparatus. The computer may include one processor and one memory, or may include a plurality of processors and a plurality of memories. The program may be installed into the computer, or may be recorded in a ROM or the like in advance. Also, some or all of the processing units may be configured using an electronic circuit that independently implements the processing functions, rather than an electronic circuit (circuitry), such as a CPU, that implements the functional components by reading a program. Also, an electronic circuit forming one device may include a plurality of CPUs.
The program described above can be recorded in a computer-readable recording medium. Examples of the computer-readable recording medium include a non-transitory recording medium. Examples of such a recording medium include a magnetic recording device, an optical disc, a magneto-optical recording medium, a semiconductor memory, and the like.
The program is distributed by selling, giving, or renting portable recording media such as DVDs or CD-ROMs recording the program thereon, for example. Furthermore, a configuration in which the program is stored in a storage device in a server computer and is distributed by transfer from the server computer to other computers via a network may also be employed. As described above, the computer executing such a program first stores a program recorded in a portable recording medium or a program transferred from the server computer temporarily into a storage device of the computer, for example. At the time of execution of a process, the computer reads the program stored in the storage device of the computer, and performs processing in accordance with the read program. Also, in other modes of execution of the program, the computer may read the program directly from a portable recording medium and perform processing in accordance with the program, or alternatively, the computer may sequentially perform processing in accordance with a received program every time a program is transferred from the server computer to the computer. Moreover, the above-described processing may be executed by a so-called application service provider (ASP) type service that implements a processing function only by an execution instruction and result acquisition without transferring the program from the server computer to the computer. Note that the program in the present embodiment includes information that is used for processing by an electronic computer and is equivalent to the program (data or the like that is not a direct command to the computer but has a property that defines processing performed by the computer).
Although this device is configured with a computer executing a predetermined program in each embodiment, at least some of the processing contents may be implemented by hardware.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2021/040589 | 11/4/2021 | WO |