The present disclosure relates to an information processing device and an information processing method, and more particularly, to an information processing device and an information processing method capable of improving the operability of a 3D object.
In the related art, to move or rotate a 3D object displayed through AR (Augmented Reality) or VR (Virtual Reality), an operation to “grip” the 3D object is performed with a hand of a user, and the 3D object is then moved or rotated in accordance with the motion of the hand of the user.
For example, PTL 1 discloses a technique of correcting the position of a virtual object, displayed through AR by a plurality of AR devices, between the AR devices.
When a 3D object is operated as discussed above, the axis as the reference for the operation is not fixed. Therefore, the 3D object may be operated in a direction not expected by the user, and it has been desired to improve operability.
The present disclosure has been made in view of such circumstances, and makes it possible to improve the operability of a 3D object.
An aspect of the present disclosure provides an information processing device including: an axis attitude determination unit that determines an attitude of an adjustment axis on the basis of an attitude of a finger of a user, the adjustment axis serving as a reference for performing an adjustment process for a 3D object disposed in a three-dimensional space on computer graphics so as to be superimposed on a real space, and that changes and fixes the attitude of the adjustment axis; and an adjustment processing unit that performs the adjustment process for the 3D object according to an operation by the user in accordance with an axial direction of the adjustment axis at the fixed attitude.
An aspect of the present disclosure provides an information processing method that causes an information processing device to perform a process including: determining an attitude of an adjustment axis on the basis of an attitude of a finger of a user, the adjustment axis serving as a reference for performing an adjustment process for a 3D object disposed in a three-dimensional space on computer graphics so as to be superimposed on a real space, and changing and fixing the attitude of the adjustment axis; and performing the adjustment process for the 3D object according to an operation by the user in accordance with an axial direction of the adjustment axis at the fixed attitude.
In an aspect of the present disclosure, an attitude of an adjustment axis is determined on the basis of an attitude of a finger of a user, the adjustment axis serving as a reference for performing an adjustment process for a 3D object disposed in a three-dimensional space on computer graphics so as to be superimposed on a real space, and the attitude of the adjustment axis is changed and fixed; and the adjustment process for the 3D object according to an operation by the user is performed in accordance with an axial direction of the adjustment axis at the fixed attitude.
Hereinafter, specific embodiments to which the present technology is applied will be described in detail with reference to the drawings.
An AR system 11 illustrated in
The glasses-type wearable terminal 12 is shaped so as to be wearable by a user as with common glasses, and is provided with a transmissive display unit 14L and a transmissive display unit 14R in the lens portions of the glasses. The glasses-type wearable terminal 12 is also provided with a camera 15 at a position at which the camera 15 can capture the field of view of the user wearing the glasses-type wearable terminal 12.
The transmissive display unit 14L and the transmissive display unit 14R display a 3D object 21 expressed through computer graphics, and transmit light in a region other than the region in which the 3D object 21 is displayed. Thus, the user can visually recognize the 3D object 21 displayed by the transmissive display unit 14L and the transmissive display unit 14R as if the 3D object 21 were disposed as superimposed on the real space.
For example, the 3D object 21 displayed by the transmissive display unit 14L and the transmissive display unit 14R is displayed in accordance with a world coordinate system 22 as a coordinate system that defines the entire three-dimensional space on the computer graphics. In addition, the 3D object 21 is provided with a local coordinate system 23 that defines the front-rear direction, the left-right direction, and the up-down direction of the 3D object 21 with the center of the 3D object 21 serving as the origin, for example.
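By way of illustration only (not part of the disclosed embodiment), the following Python sketch shows one possible way to represent the relationship between the world coordinate system 22 and the local coordinate system 23: a point expressed in the local coordinate system of the 3D object 21 is mapped into the world coordinate system by a rotation and a translation. The class name Pose3D and the helper to_world are hypothetical.

```python
import numpy as np

class Pose3D:
    """Position and attitude of the 3D object 21 expressed in the world coordinate system 22."""
    def __init__(self, position, rotation):
        self.position = np.asarray(position, dtype=float)   # origin of the local coordinate system 23
        self.rotation = np.asarray(rotation, dtype=float)   # 3x3 rotation matrix

    def to_world(self, local_point):
        """Convert a point given in the local coordinate system 23 to world coordinates."""
        return self.rotation @ np.asarray(local_point, dtype=float) + self.position

# Example: the center of the 3D object 21 (local origin) maps to its world position.
pose = Pose3D(position=[0.5, 0.0, 1.2], rotation=np.eye(3))
print(pose.to_world([0.0, 0.0, 0.0]))   # -> [0.5 0.  1.2]
```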
The camera 15 can capture motion of a hand, a finger, etc., of the user, for example.
The ring-type device 13 is an input device that is used to input an operation by the user for the 3D object 21, and is used in the state of being worn on an index finger of the user as illustrated in the drawing, for example.
In the AR system 11, for example, when the user makes touch-on (change from non-contact to contact) on a touch pad 33 (
After that, when the user makes touch-off (change from contact to non-contact) on the touch pad 33 (
Further, the AR system 11 can perform an adjustment process for the 3D object 21, as will be discussed later, using the ring-type device 13.
As illustrated in
The body portion 31 is configured to include, integrated therein, a battery 41, a vibration actuator 42, an audio module 43, an IMU (Inertial Measurement Unit) 44, a wireless communication substrate 45, etc. In the body portion 31, the various portions are driven using electric power supplied from the battery 41; for example, the vibration actuator 42 vibrates, the audio module 43 outputs a sound, the IMU 44 measures the three-dimensional angular velocity and acceleration of the ring-type device 13, and the wireless communication substrate 45 communicates with the glasses-type wearable terminal 12.
The belt portion 32 is used to allow the ring-type device 13 to be worn on a finger of the user. For example, the user can fix the ring-type device 13 to his/her index finger by inserting his/her index finger into the belt portion 32 along the one-dot chain line indicated in the drawing and thereafter adjusting the fitting of the belt portion 32.
The touch pad 33 is provided on a side surface of the belt portion 32, and detects a touch operation by the user on the surface of the touch pad 33. For example, the user can wear the ring-type device 13 on his/her index finger such that the touch pad 33 faces the thumb side as illustrated in
The tactile switch 34 is a switch to be turned on only while being pressed by a finger of the user and to be turned off when the user releases his/her finger, for example.
As illustrated in
The display 51 is constituted by the transmissive display unit 14L and the transmissive display unit 14R in
The speaker 52 outputs a sound in accordance with sound data supplied from the device body 54.
The sensor 53 is constituted by the camera 15 in
The device body 54 is configured to include, integrated therein, a CPU 55, a memory 56, a communication device 57, and an input/output interface 58. The CPU 55 controls the various portions of the glasses-type wearable terminal 12 by executing a program. The memory 56 stores data required for various processes executed in the glasses-type wearable terminal 12. The communication device 57 communicates with the ring-type device 13. The input/output interface 58 performs data input and output with the display 51, the speaker 52, and the sensor 53.
The ring-type device 13 is configured such that a vibrator 61 and a sensor 62 are connected to a device body 63.
The vibrator 61 is constituted by the vibration actuator 42 in
The sensor 62 is constituted by the IMU 44 in
The device body 63 is configured to include, integrated therein, a CPU 64, a memory 65, a communication device 66, and an input/output interface 67. The CPU 64 controls the various portions of the ring-type device 13 by executing a program. The memory 65 stores data required for various processes executed in the ring-type device 13. The communication device 66 communicates with the glasses-type wearable terminal 12. The input/output interface 67 performs data input and output with the vibrator 61 and the sensor 62.
The glasses-type wearable terminal 12 may be various head-mounted displays that are used in optical see-through AR, video see-through AR, VR, etc. A part of the hardware configuration of the glasses-type wearable terminal 12 may be implemented by a personal computer (not illustrated). Alternatively, the glasses-type wearable terminal 12 may be replaced with a projector or a large or small display device. In the AR system 11, the ring-type device 13 may not be used, and the function of the ring-type device 13 may be integrated in the glasses-type wearable terminal 12. In the AR system 11, an operation terminal that is used to operate a robot, a drone, etc., can be used, for example.
As illustrated in
The sensor control unit 71 controls the camera 15 in
In addition, the sensor control unit 71 can control various sensors such as an RGB camera and a depth camera (not illustrated). Such sensors can be disposed in an environment in which the AR system 11 is used, or attached to a part of the body of the user other than the face or a hand, for example, rather than being provided in the glasses-type wearable terminal 12 or the ring-type device 13. A single sensor or a plurality of sensors may be provided.
The information processing unit 72 is configured to include a sensor information acquisition unit 81, an adjustment mode determination unit 82, a finger attitude determination unit 83, an axis attitude determination unit 84, a finger operation determination unit 85, an adjustment processing unit 86, and an output control unit 87.
The sensor information acquisition unit 81 acquires, as sensor information, the visual data, the angular velocity data and the acceleration data on the ring-type device 13, the touch position data for the touch pad 33, the on/off data for the tactile switch 34, etc., supplied from the sensor control unit 71.
The adjustment mode determination unit 82 determines whether or not to make a transition to an adjustment mode in which an adjustment process for the 3D object 21 is performed, and whether or not to end the adjustment mode, on the basis of the sensor information acquired by the sensor information acquisition unit 81, such as the visual data or the angular velocity data or the acceleration data on the ring-type device 13, for example.
The finger attitude determination unit 83 acquires bone position-attitude information that indicates the position and the attitude of each finger by analyzing the skeletal frame of the fingers of the user captured in the visual data acquired by the sensor control unit 71. The finger attitude determination unit 83 also acquires device position-attitude information that indicates the position and the attitude of the ring-type device 13 on the basis of the angular velocity data and the acceleration data on the ring-type device 13 acquired by the sensor control unit 71. Then, the finger attitude determination unit 83 can determine the attitude of the fingers of the user on the basis of the bone position-attitude information and the device position-attitude information. Besides, the finger attitude determination unit 83 may determine the attitude of the fingers of the user using depth information acquired by a depth sensor (not illustrated) as well.
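As an illustration only, the following sketch shows one conceivable way for the finger attitude determination unit 83 to combine the bone position-attitude information with the device position-attitude information, here as a weighted rotation average; the function name and the weighting rule are assumptions and are not taken from the disclosure.

```python
from scipy.spatial.transform import Rotation as R

def fuse_finger_attitude(bone_attitude: R, device_attitude: R, imu_weight: float = 0.7) -> R:
    """Weighted rotation average of the camera-based and IMU-based attitude estimates
    (the weighting is an assumption for illustration)."""
    return R.concatenate([bone_attitude, device_attitude]).mean(
        weights=[1.0 - imu_weight, imu_weight])

# Example: the two estimates roughly agree, so the fused attitude lies in between.
fused = fuse_finger_attitude(R.from_euler("z", 10, degrees=True),
                             R.from_euler("z", 20, degrees=True))
print(fused.as_euler("zyx", degrees=True))   # ~[17, 0, 0]
```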
The axis attitude determination unit 84 determines the attitude of the adjustment axes 25 (
The finger operation determination unit 85 determines an operation by a finger of the user on the basis of the touch position data for the touch pad 33 and the on-off data for the tactile switch 34 acquired by the sensor control unit 71. For example, the finger operation determination unit 85 can determine whether or not the touch pad 33 is touched, the touch position and the amount of movement of a touch on the touch pad 33, and whether the tactile switch 34 is on or off.
The adjustment processing unit 86 performs an adjustment process of moving the position of the 3D object 21, turning the attitude of the 3D object 21, or increasing/reducing the size of the 3D object 21, on the basis of the touch position and the amount of movement of a touch on the touch pad 33 determined by the finger operation determination unit 85 while the user is in touch with the touch pad 33.
The output control unit 87 controls output of visual data to the visual display unit 73, output of tactile data to the tactile presentation unit 74, and output of sound data to the sound presentation unit 75.
The visual display unit 73 is constituted by the transmissive display unit 14L and the transmissive display unit 14R in
The tactile presentation unit 74 is constituted by the vibration actuator 42 in
The sound presentation unit 75 is constituted by the audio module 43 in
An adjustment mode determination range 24 will be described with reference to
In the AR system 11, for example, an adjustment mode determination range 24 for determining a transition and an end of the adjustment mode in which an adjustment process for the 3D object 21 is performed is set around the 3D object 21.
For example, the adjustment mode determination range 24 is set in a certain range around the 3D object 21 as illustrated in A of
Here, when the position of the 3D object 21 is adjusted in the adjustment mode, the position of the adjustment mode determination range 24 can be adjusted together with the 3D object 21 so that the positional relationship of the adjustment mode determination range 24 relative to the 3D object 21 is not changed. Alternatively, when the position of the 3D object 21 is adjusted in the adjustment mode, the position of the adjustment mode determination range 24 may be kept fixed so that the positional relationship of the adjustment mode determination range 24 relative to the 3D object 21 is changed as illustrated in B of
In the AR system 11, a transition and an end of the adjustment mode may be determined without using the adjustment mode determination range 24.
For example, a transition may be made to the adjustment mode when the user moves his/her finger wearing the ring-type device 13 closer to the 3D object 21 and performs an operation of clicking, double-clicking, or holding down the tactile switch 34. After that, the adjustment mode can be ended when the user performs an operation of clicking, double-clicking, or holding down the tactile switch 34.
Similarly, a transition may be made to the adjustment mode when the user moves his/her finger wearing the ring-type device 13 closer to the 3D object 21 and performs an operation of single-tapping, double-tapping, or holding down the touch pad 33. After that, the adjustment mode can be ended when the user performs an operation of single-tapping, double-tapping, or holding down the touch pad 33.
Alternatively, a transition may be made to the adjustment mode when the user moves his/her finger wearing the ring-type device 13 closer to the 3D object 21 and performs an operation of tapping, double-tapping, or holding down the 3D object 21 with the fingertip. After that, the adjustment mode can be ended when the user taps, double-taps, or holds down the 3D object 21 with the fingertip. Besides, a specific gesture such as a finger snap may be used. Such motion of the fingertip can be detected by the IMU 44, the camera 15, etc.
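The range-based determination described earlier can be pictured with the following minimal sketch, which assumes that the adjustment mode determination range 24 and the 3D object 21 are approximated by spheres; the function names and the bounding-sphere approximation are illustrative assumptions, not the disclosed implementation.

```python
import numpy as np

def should_enter_adjustment_mode(fingertip, object_center, object_radius, range_radius):
    """Enter the adjustment mode when the fingertip is inside the adjustment mode
    determination range 24 but not in contact with the 3D object 21."""
    d = np.linalg.norm(np.asarray(fingertip) - np.asarray(object_center))
    return d <= range_radius and d > object_radius

def should_exit_adjustment_mode(fingertip, object_center, range_radius):
    """End the adjustment mode when the fingertip has moved outside the range 24."""
    d = np.linalg.norm(np.asarray(fingertip) - np.asarray(object_center))
    return d > range_radius
```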
The adjustment axes 25 displayed in the adjustment mode will be described with reference to
In the AR system 11, for example, the adjustment axes 25 are displayed at the origin of the local coordinate system 23 for the 3D object 21, in accordance with the attitude of the touch pad 33 of the ring-type device 13 at the timing of a transition to the adjustment mode. As discussed above, the operation axes (X- and Y-axes in
In the AR system 11, the adjustment axes 25 are displayed from the timing when a transition is made to the adjustment mode until the timing when the adjustment mode is ended, allowing the user to recognize whether or not the adjustment mode is activated on the basis of the adjustment axes 25. In that event, feedback may be given through a sound, vibration, etc., or feedback may be given visually by means other than the adjustment axes 25.
In the AR system 11, for example, the user can change the attitude of the adjustment axes 25 displayed in the adjustment mode by tilting the ring-type device 13. That is, when the attitude of the operation axes of the touch pad 33 of the ring-type device 13 is varied, the adjustment axes 25 are displayed with their attitude changed so as to coincide with the attitude of the operation axes.
When the user tilts the ring-type device 13 so as to be rotated leftward about an axis orthogonal to the surface of the touch pad 33 as illustrated in A of
When the user tilts the ring-type device 13 so as to be rotated leftward about the Y-axis of the surface of the touch pad 33 as illustrated in B of
In the AR system 11, for example, while the user is performing an operation to keep the tactile switch 34 on in the adjustment mode, the display of the adjustment axes 25 can be fixed at the attitude at the timing when the tactile switch 34 is turned on. That is, the attitude of the adjustment axes 25 is not changed even if the user tilts the ring-type device 13 with the tactile switch 34 kept on.
The attitude of the adjustment axes 25 is kept fixed even if the user tilts the ring-type device 13 after performing an operation to turn on the tactile switch 34 with the adjustment axes 25 displayed as illustrated on the left side of
After that, when the user performs an operation to turn off the tactile switch 34, the relative relationship between the adjustment axes 25 and the operation axes of the touch pad 33 at the timing when the operation is performed is maintained. That is, an offset is produced, with the ring-type device 13 tilted with respect to the adjustment axes 25, as illustrated on the right side of
When the user performs a specific cancellation operation such as double-tapping the touch pad 33 or shaking the ring-type device 13, the offset may be canceled, and the adjustment axes 25 may be displayed so as to coincide with the attitude of the operation axes of the touch pad 33.
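A minimal sketch of the behavior described above, assuming that attitudes are represented as rotations: the attitude of the adjustment axes 25 is held while the tactile switch 34 is on, the offset relative to the operation axes of the touch pad 33 is preserved when the switch is turned off, and the offset is cleared by a cancellation operation. The class and method names are hypothetical.

```python
from scipy.spatial.transform import Rotation as R

class AdjustmentAxes:
    def __init__(self):
        self.offset = R.identity()   # rotation from the operation axes to the adjustment axes 25
        self.held = None             # attitude held while the tactile switch 34 is on

    def attitude(self, device_attitude: R) -> R:
        if self.held is not None:               # switch on: attitude stays fixed
            return self.held
        return device_attitude * self.offset    # switch off: follow the device plus offset

    def on_switch_pressed(self, device_attitude: R):
        self.held = self.attitude(device_attitude)

    def on_switch_released(self, device_attitude: R):
        # Maintain the relative relationship between the axes and the operation axes.
        self.offset = device_attitude.inv() * self.held
        self.held = None

    def on_cancel(self):
        # Cancellation operation: the axes again coincide with the operation axes.
        self.offset = R.identity()
```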
In the AR system 11, for example, a designated angle of the adjustment axes 25 can be set such that the X-axis of the adjustment axes 25 is in the horizontal direction and the Y-axis of the adjustment axes 25 is in the vertical direction when the 3D object 21 is seen from the position of the user. Then, when the user changes the attitude of the adjustment axes 25 by tilting the ring-type device 13, a user interface that indicates the designated angle can be displayed when the present attitude of the adjustment axes 25 reaches a certain angle or less (e.g., ±10° or less) from the designated angle. This allows the user to match the attitude of the adjustment axes 25 with the designated angle.
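The designated-angle guidance can be sketched, for example, as a simple angular proximity test; the 10° threshold follows the example above, while the function name and the angle normalization are assumptions.

```python
def should_show_designated_angle_ui(current_angle_deg, designated_angle_deg=0.0,
                                    threshold_deg=10.0):
    """Show the guidance UI when the attitude of the adjustment axes 25 is within
    the threshold of the designated angle (difference normalized to -180..180)."""
    diff = abs((current_angle_deg - designated_angle_deg + 180.0) % 360.0 - 180.0)
    return diff <= threshold_deg

print(should_show_designated_angle_ui(7.5))    # True: close to the designated angle
print(should_show_designated_angle_ui(25.0))   # False
```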
An adjustment process for the 3D object 21 will be described with reference to
In the AR system 11, for example, an adjustment process of moving the position of the 3D object 21, an adjustment process of turning the attitude of the 3D object 21, and an adjustment process of increasing/reducing the size of the 3D object 21 can be performed. Switching can be made among the adjustment processes by single-tapping, double-tapping, or holding down the touch pad 33, clicking, double-clicking, or holding down the tactile switch 34, etc.
When a transition is made to the adjustment mode, the adjustment axes 25 are displayed for the 3D object 21, as illustrated on the left side of
When the user performs an operation to move the touch from the right side toward the left side of the touch pad 33, the position of the 3D object 21 is moved toward the left side of the X-axis of the adjustment axes 25. When the user performs an operation to move the touch from the upper side toward the lower side of the touch pad 33, meanwhile, the position of the 3D object 21 is moved toward the lower side of the Y-axis of the adjustment axes 25. When the user performs an operation to move the touch from the lower side toward the upper side of the touch pad 33, similarly, the position of the 3D object 21 is moved toward the upper side of the Y-axis of the adjustment axes 25.
In this manner, the adjustment process of moving the position of the 3D object 21 is performed in accordance with the direction of movement and the amount of movement of a touch operation by the user on the touch pad 33.
When a transition is made to the adjustment mode, the adjustment axes 25 are displayed for the 3D object 21, as illustrated on the left side of
When the user performs an operation to move the touch from the lower side toward the upper side of the touch pad 33, the attitude of the 3D object 21 is turned toward the back side about the X-axis of the adjustment axes 25. When the user performs an operation to move the touch from the left side toward the right side of the touch pad 33, meanwhile, the attitude of the 3D object 21 is turned toward the right side about the Y-axis of the adjustment axes 25. When the user performs an operation to move the touch from the right side toward the left side of the touch pad 33, similarly, the attitude of the 3D object 21 is turned toward the left side about the Y-axis of the adjustment axes 25.
In this manner, the adjustment process of turning the attitude of the 3D object 21 is performed in accordance with the direction of movement and the amount of movement of a touch operation by the user on the touch pad 33.
When a transition is made to the adjustment mode, the adjustment axes 25 are displayed for the 3D object 21, as illustrated on the left side of
When the user performs an operation to move the touch from the right side toward the left side of the touch pad 33, the size of the 3D object 21 is reduced in a direction along the X-axis of the adjustment axes 25. When the user performs an operation to move the touch from the upper side toward the lower side of the touch pad 33, meanwhile, the size of the 3D object 21 is reduced in a direction along the Y-axis of the adjustment axes 25. When the user performs an operation to move the touch from the lower side toward the upper side of the touch pad 33, similarly, the size of the 3D object 21 is increased in a direction along the Y-axis of the adjustment axes 25.
In this manner, the adjustment process of increasing/reducing the size of the 3D object 21 is performed in accordance with the direction of movement and the amount of movement of a touch operation by the user on the touch pad 33.
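The three adjustment processes described above can be summarized by the following sketch, which maps a touch movement (dx, dy) on the touch pad 33 onto the X- and Y-axes of the fixed adjustment axes 25 to move, turn, or scale the 3D object 21. The gain, the per-axis scaling simplification, and the pose representation are illustrative assumptions rather than the disclosed implementation.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def apply_adjustment(mode, position, rotation, scale, axes, dx, dy, gain=1.0):
    """axes: 3x3 matrix whose columns are the X-, Y-, Z-axes of the fixed adjustment axes 25."""
    x_axis, y_axis = axes[:, 0], axes[:, 1]
    if mode == "move":
        # Touch movement along the pad moves the object along the corresponding axis.
        position = position + gain * (dx * x_axis + dy * y_axis)
    elif mode == "rotate":
        # Up/down movement turns about the X-axis; left/right movement turns about the Y-axis.
        rotation = R.from_rotvec(gain * (dy * x_axis + dx * y_axis)) * rotation
    elif mode == "scale":
        # Simplified: independent scale factors along the X- and Y-axes of the adjustment axes.
        scale = scale * np.array([1.0 + gain * dx, 1.0 + gain * dy, 1.0])
    return position, rotation, scale

# Example: a rightward swipe (dx > 0) in "move" mode shifts the object along the X-axis.
p, r, s = apply_adjustment("move", np.zeros(3), R.identity(), np.ones(3),
                           np.eye(3), dx=0.1, dy=0.0)
print(p)   # -> [0.1 0.  0. ]
```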
A 3D object adjustment process performed in the AR system 11 will be described with reference to the flowchart illustrated in
In step S11, the adjustment mode determination unit 82 determines whether or not to make a transition to an adjustment mode in which an adjustment process for the 3D object 21 is performed, on the basis of the sensor information supplied from the sensor information acquisition unit 81. For example, the adjustment mode determination unit 82 determines to make a transition to the adjustment mode when it is detected that the user moves his/her finger wearing the ring-type device 13 closer to the 3D object 21 and that the fingertip is located at a position within the adjustment mode determination range 24 and at which the fingertip is not in contact with the 3D object 21.
The process stands by until the adjustment mode determination unit 82 determines in step S11 to make a transition to the adjustment mode. When the adjustment mode determination unit 82 determines to make a transition to the adjustment mode, the process proceeds to step S12.
In step S12, the axis attitude determination unit 84 determines the attitude of the adjustment axes 25 at the timing when a transition is made to the adjustment mode, on the basis of the device position-attitude information on the ring-type device 13 supplied from the finger attitude determination unit 83. For example, the axis attitude determination unit 84 determines the attitude of the adjustment axes 25 so as to coincide with the attitude of the operation axes of the surface of the touch pad 33 at the timing when a transition is made to the adjustment mode. Then, the axis attitude determination unit 84 instructs the output control unit 87 to display the adjustment axes 25 at an attitude according to the determination, and the output control unit 87 supplies the visual display unit 73 with visual data on the adjustment axes 25 generated to be displayed at the attitude indicated by the axis attitude determination unit 84. Consequently, the adjustment axes 25 at an attitude that coincides with the attitude of the operation axes of the surface of the touch pad 33 at the timing when a transition is made to the adjustment mode are displayed at the origin of the local coordinate system 23 for the 3D object 21, as described with reference to
In step S13, the finger operation determination unit 85 determines whether or not a touch by the user on the touch pad 33 of the ring-type device 13 is detected, on the basis of the sensor information supplied from the sensor information acquisition unit 81.
When the finger operation determination unit 85 determines in step S13 that a touch by the user on the touch pad 33 of the ring-type device 13 is not detected, the process proceeds to step S14.
In step S14, the axis attitude determination unit 84 changes the attitude of the adjustment axes 25 in accordance with variations in the attitude of the operation axes of the surface of the touch pad 33, if there are any such variations, on the basis of the device position-attitude information on the ring-type device 13 supplied from the finger attitude determination unit 83. For example, the axis attitude determination unit 84 instructs the output control unit 87 to display the adjustment axes 25 at an attitude based on the device position-attitude information, and the output control unit 87 supplies the visual display unit 73 with visual data on the adjustment axes 25 generated to be displayed at the attitude indicated by the axis attitude determination unit 84. Consequently, the adjustment axes 25 are displayed with their attitude changed in accordance with variations in the attitude of the ring-type device 13, as described with reference to
When the finger operation determination unit 85 determines in step S13 that a touch by the user on the touch pad 33 of the ring-type device 13 is detected, on the other hand, the process proceeds to step S15.
In step S15, the adjustment processing unit 86 performs an adjustment process for the 3D object 21 on the basis of the touch position and the amount of movement of the touch on the touch pad 33 supplied from the finger operation determination unit 85 with the attitude of the adjustment axes 25 fixed. For example, when setting has been made to perform an adjustment process to move the position of the 3D object 21, the adjustment processing unit 86 instructs the output control unit 87 to display the 3D object 21 at a position after movement by an amount of movement based on the amount of movement of the touch. Accordingly, the output control unit 87 supplies the visual display unit 73 with visual data on the 3D object 21 generated to be displayed at the position indicated by the adjustment processing unit 86. Consequently, the 3D object 21 after being moved is displayed, as described with reference to
After the process in step S14 or S15, the process proceeds to step S16, and the adjustment mode determination unit 82 determines whether or not to end the adjustment mode in which an adjustment process for the 3D object 21 is performed, on the basis of the sensor information supplied from the sensor information acquisition unit 81. For example, the adjustment mode determination unit 82 determines to end the adjustment mode when it is detected that the user moves his/her finger wearing the ring-type device 13 away from the 3D object 21 and that the fingertip is located at a position outside the adjustment mode determination range 24.
When the adjustment mode determination unit 82 determines in step S16 not to end the adjustment mode, the process returns to step S13, and similar processes are repeatedly performed subsequently. When the adjustment mode determination unit 82 determines in step S16 to end the adjustment mode, on the other hand, the process returns to step S11, and similar processes are repeatedly performed subsequently.
In the AR system 11, as has been described above, an adjustment process for the 3D object 21 can be performed in the 3D object adjustment process, with the attitude of the adjustment axes 25 changed when the touch pad 33 is not touched and with the attitude of the adjustment axes 25 fixed when the touch pad 33 is touched. Consequently, it is possible to perform an adjustment process for the 3D object 21 as intended by the user, improving the operability of the 3D object 21.
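The flow of steps S11 to S16 can be condensed into the following sketch, in which the attitude of the adjustment axes 25 follows the touch pad 33 while no touch is detected and is treated as fixed while the adjustment process is performed; the sensors, axes, and adjuster objects stand in for the units described above and are assumptions for illustration.

```python
def object_adjustment_loop(sensors, axes, adjuster):
    """Condensed sketch of steps S11 to S16 of the 3D object adjustment process."""
    while True:
        if not sensors.should_enter_adjustment_mode():            # S11
            continue
        axes.set_from_touch_pad(sensors.touch_pad_attitude())     # S12: initial attitude
        while True:
            if sensors.touch_detected():                          # S13
                # S15: perform the adjustment process with the axis attitude fixed
                adjuster.apply(sensors.touch_position(),
                               sensors.touch_movement(), axes)
            else:
                # S14: change the axis attitude following the touch pad 33
                axes.set_from_touch_pad(sensors.touch_pad_attitude())
            if sensors.should_exit_adjustment_mode():             # S16
                break
```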
Examples of various processes that can be performed by the AR system 11 will be described with reference to
In the AR system 11, for example, the adjustment process of moving the position of the 3D object 21 can be performed by tilting the ring-type device 13, besides performing the adjustment process of moving the position of the 3D object 21 using the touch pad 33 as discussed above. That is, the adjustment processing unit 86 can move the position of the 3D object 21 in accordance with the tilt of the ring-type device 13 on the basis of the angular velocity data and the acceleration data on the ring-type device 13 acquired by the sensor information acquisition unit 81.
When a transition is made to the adjustment mode, the adjustment axes 25 are displayed for the 3D object 21, as illustrated on the left side of
The adjustment process performed using a touch on the touch pad 33 and the adjustment process performed by tilting the ring-type device 13 may be combined with each other.
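As an illustration of the tilt-based variant, the following assumed mapping moves the position of the 3D object 21 at a rate proportional to the tilt of the ring-type device 13; the gain and the sign conventions are hypothetical.

```python
import numpy as np

def move_by_tilt(position, tilt_x_rad, tilt_y_rad, gain=0.01):
    """Shift the position of the 3D object 21, expressed along the adjustment axes 25,
    in proportion to the tilt of the ring-type device 13 (assumed mapping)."""
    return np.asarray(position, dtype=float) + gain * np.array([tilt_y_rad, -tilt_x_rad, 0.0])
```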
Further, the adjustment process can also be performed using operation means such as a pointing stick or a scroll wheel when the ring-type device 13 includes such operation means. Alternatively, even when the ring-type device 13 is not worn, the adjustment process can be performed by detecting motion of one finger in contact with another using a skin strain sensor, an electromyogram sensor, etc., or the adjustment process can be performed by detecting motion in the depth direction using a pressure sensor, etc., for example.
In the AR system 11, for example, the magnification of the amount of movement in the position of the 3D object 21 can be changed when the adjustment process is performed by touching the touch pad 33.
When the user performs an operation to give a touch in the vicinity of the upper side of the touch pad 33 and move the touch in the X-axis direction while keeping the touch as illustrated in A of
When the user performs an operation to give a touch in the vicinity of the lower side of the touch pad 33 and move the touch in the X-axis direction while keeping the touch as illustrated in B of
Similarly, when the user performs an operation to give a touch in the vicinity of the right side of the touch pad 33 and move the touch in the Y-axis direction while keeping the touch, the position of the 3D object 21 can be moved significantly in the Y-axis direction. When the user performs an operation to give a touch in the vicinity of the left side of the touch pad 33 and move the touch in the Y-axis direction while keeping the touch, meanwhile, the position of the 3D object 21 can be moved slightly in the Y-axis direction.
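One conceivable realization of this position-dependent magnification is sketched below, with the touch start position normalized to [0, 1] on the touch pad 33 and illustrative coarse/fine factors; the values and normalization are assumptions.

```python
def movement_magnification(touch_start_x, touch_start_y, coarse=5.0, fine=0.2):
    """touch_start_x/y: touch start position normalized to [0, 1] on the touch pad 33.
    Upper side -> large X-axis movement, lower side -> small X-axis movement;
    right side -> large Y-axis movement, left side -> small Y-axis movement."""
    mag_x = coarse if touch_start_y > 0.5 else fine
    mag_y = coarse if touch_start_x > 0.5 else fine
    return mag_x, mag_y

print(movement_magnification(0.5, 0.9))   # (5.0, 0.2): touch near the upper side
print(movement_magnification(0.5, 0.1))   # (0.2, 0.2): touch near the lower side
```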
As with the magnification used to move the position of the 3D object 21, a magnification may or may not be applied to the amount of rotation used to turn the attitude of the 3D object 21 or to the amount of increase or reduction used to increase/reduce the size of the 3D object 21.
The magnification may be varied in accordance with the size of the 3D object 21. For example, the position of the 3D object 21 can be moved significantly when the size of the 3D object 21 is large, and the position of the 3D object 21 can be moved slightly when the size of the 3D object 21 is small.
When the magnification in the adjustment process for the 3D object 21 is changed in this manner, the display on the user interface may be changed in order to cause the user to recognize such a change. For example, the adjustment axes 25 may be indicated by thick lines as illustrated in
Before an adjustment process for the 3D object 21 is performed, the magnification may be changed in accordance with the time since a finger touches the touch pad 33 until the finger starts moving. For example, the magnification can be low when the finger starts moving soon after touching the touch pad 33, and the magnification can be high when the finger starts moving several seconds after touching the touch pad 33.
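This dwell-time-dependent magnification could, for example, be realized as a simple threshold on the time between the touch and the start of the movement; the values below are illustrative assumptions.

```python
def dwell_time_magnification(dwell_seconds, fine=0.2, coarse=5.0, threshold_seconds=2.0):
    """Low magnification when the finger starts moving soon after the touch,
    high magnification when it starts moving several seconds later (assumed values)."""
    return coarse if dwell_seconds >= threshold_seconds else fine

print(dwell_time_magnification(0.3))   # 0.2: starts moving soon -> fine adjustment
print(dwell_time_magnification(3.0))   # 5.0: waited a few seconds -> coarse adjustment
```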
The magnification may be stored in advance in the ring-type device 13 for each user that uses the AR system 11. In this case, the adjustment process is performed using the magnification stored in the ring-type device 13 by recognizing the user wearing the ring-type device 13.
Occasionally, a touch position cannot be accurately detected when the user touches the upper side of the touch pad 33 and moves the touch position to the lower side of the touch pad 33 while keeping the touch, for example.
When the ball of the thumb contacts the touch pad 33 at the start of a touch, the center of the touch pad 33 is detected as the touch position even if the user intends to be in touch with the upper side of the touch pad 33, as illustrated in A of
Thus, in the AR system 11, detection of the touch position on the touch pad 33 can be assisted by detecting and tracking the position of the thumb of the user on the basis of a video of the fingers of the user captured by the camera 15. In the AR system 11, the touch position can be detected more accurately by using tracking based on a video in this manner.
In the AR system 11, as discussed above, the attitude of the adjustment axes 25 can be freely changed to any angle in accordance with the attitude of the ring-type device 13. However, it may occasionally be difficult to change the attitude of the adjustment axes 25 to exactly a desired angle. Thus, in the AR system 11, the attitude of the adjustment axes 25 can be changed by a predetermined angle each time the ring-type device 13 is tilted to a certain angle or more, for example.
For example,
That is, when the ring-type device 13 is tilted rightward or leftward to an angle of less than 45°, the adjustment axes 25 are kept fixed, even if the ring-type device 13 is tilted. When the ring-type device 13 is tilted rightward or leftward to 45° or more, the attitude of the adjustment axes 25 can be turned rightward or leftward by 90°. Consequently, the attitude of the adjustment axes 25 can be accurately adjusted to the predetermined angle on either side.
As a matter of course, the changes in the attitude of the adjustment axes 25 are not limited to 90° each, and setting can be made such that the attitude of the adjustment axes 25 is changed by any predetermined angle (such as 45° or 60°) each time.
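The stepwise change of the axis attitude can be sketched as a simple threshold-and-step rule, using the 45° threshold and 90° step of the example above; the function name and the signed-tilt convention are assumptions.

```python
def snapped_axes_angle(current_angle_deg, device_tilt_deg,
                       threshold_deg=45.0, step_deg=90.0):
    """Turn the adjustment axes 25 by one step only when the ring-type device 13
    is tilted beyond the threshold from the attitude at which the axes were last set."""
    if device_tilt_deg >= threshold_deg:
        return current_angle_deg + step_deg    # tilted rightward by 45° or more
    if device_tilt_deg <= -threshold_deg:
        return current_angle_deg - step_deg    # tilted leftward by 45° or more
    return current_angle_deg                   # less than 45°: axes kept fixed

print(snapped_axes_angle(0.0, 30.0))   # 0.0  (kept fixed)
print(snapped_axes_angle(0.0, 50.0))   # 90.0 (turned by 90°)
```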
In the AR system 11, as discussed above, the adjustment axes 25 are displayed in accordance with the attitude of the touch pad 33 of the ring-type device 13 at the timing when a transition is made to the adjustment mode. However, when the attitude of the adjustment axes 25 initially displayed at that timing differs depending on the attitude of the touch pad 33 of the ring-type device 13, it may occasionally be difficult to perform an adjustment process for the 3D object 21.
Thus, in the AR system 11, the adjustment axes 25 can always be initially displayed at the default attitude, irrespective of the attitude of the touch pad 33 of the ring-type device 13 at the timing when a transition is made to the adjustment mode. For example, an attitude at which the X-axis of the adjustment axes 25 is in the horizontal direction and the Y-axis of the adjustment axes 25 is in the vertical direction when the 3D object 21 is seen from the position of the user can be set as the default attitude for initially displaying the adjustment axes 25. The default attitude for initially displaying the adjustment axes 25 may be determined so as to match the result of estimating a psychological position and attitude of a hand of the user, for example.
For example, when a transition is made to the adjustment mode with the ring-type device 13 tilted as illustrated in A of
Here, when offset is caused between the attitude of the adjustment axes 25 and the attitude of the operation axes of the touch pad 33 of the ring-type device 13 as illustrated in B of
In the AR system 11, the direction of the Z-axis (not displayed) may be fixed so that the Z-axis is always in the direction of the user, for example.
In the AR system 11, further, the 3D object 21 may be initially displayed in accordance with the default attitude. For example, the default attitude for initially displaying the 3D object 21 can be set such that the Z-axis direction of the local coordinate system 23 for the 3D object 21 coincides with the Z-axis direction of the world coordinate system 22 when displaying the 3D object 21 with reference to the ring-type device 13. Consequently, the 3D object 21 is initially displayed such that the attitude of the 3D object 21 in the up-down direction is always the same when initially displaying the 3D object 21.
For example, when displaying the 3D object 21 with reference to the ring-type device 13, it is not expected to initially display the 3D object 21 in accordance with the attitude of the ring-type device 13 (i.e., such that the up-down direction of the 3D object 21 does not coincide with the Z-axis direction of the world coordinate system 22) as illustrated in A of
Selection of the 3D object 21 as the target for an adjustment process will be described with reference to
In the AR system 11, for example, selection of the 3D object 21 as the target for an adjustment process can be changed in accordance with the touch position on the touch pad 33 of the ring-type device 13.
For example, it is possible to select the 3D object 21 which is larger when the user touches the lower side of the touch pad 33, and to select the 3D object 21a which is smaller when the user touches the upper side of the touch pad 33. This allows the user to select the 3D object 21 quickly and easily.
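One assumed way to realize this touch-position-dependent selection is sketched below, with the touch position normalized to [0, 1] and the candidate objects ordered by size; the rule shown is illustrative only.

```python
def select_target(candidates, touch_y):
    """candidates: list of (name, size); touch_y in [0, 1], where 1 is the upper side
    of the touch pad 33. Upper side selects the smaller object, lower side the larger."""
    by_size = sorted(candidates, key=lambda c: c[1])
    return by_size[0] if touch_y > 0.5 else by_size[-1]

objects = [("3D object 21", 1.0), ("3D object 21a", 0.3)]
print(select_target(objects, 0.9))   # ('3D object 21a', 0.3): smaller object
print(select_target(objects, 0.1))   # ('3D object 21', 1.0): larger object
```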
As a matter of course, the target to be selected may also be changed in accordance with the touch position on the touch pad 33 when selecting the 3D object 21 as the target for an operation to “grip” the 3D object 21 as in the related art, besides when selecting the 3D object 21 as the target for an adjustment process. Besides, the same can also be applied to an operation to switch the appearance of an object, an intensity simulation, or display of a layer such as temperature.
Here, while the ring-type device 13 is used to adjust the 3D object 21 in the present embodiment by way of example, devices other than the ring-type device 13 such as a glove-type device or a device shaped so as to cover a single finger as a whole can also be used. The touch pad 33 is also not limited to the one capable of two-axis detection discussed above, and one capable of single-axis detection may also be used.
For example, a pen-type device 91 may also be used as illustrated in
To use the pen-type device 91, a single-axis adjustment axis 27 is displayed for the 3D object 21 in accordance with the tilt of the pen-type device 91, as illustrated in B of
A mouse-type device 93 may also be used as illustrated in
For example, the user can perform an operation to perform an adjustment process for the 3D object 21 with the mouse-type device 93 lifted in the air. At this time, the adjustment axes 25 are displayed at the origin of the local coordinate system 23 for the 3D object 21, in accordance with the attitude of the mouse-type device 93 at the timing when a transition is made to the adjustment mode, as illustrated in B of
The above-described series of processing (information processing method) can be executed by hardware or software. In a case where the series of processing is executed by software, a program that configures the software is installed in a general-purpose computer or the like.
The program can be recorded in advance in the hard disk 105 or ROM 103 as a recording medium built into the computer.
Alternatively, the program can be stored (recorded) in a removable recording medium 111 driven by a drive 109. Such a removable recording medium 111 can be provided as so-called package software. Here, examples of the removable recording medium 111 include a flexible disc, a compact disc read-only memory (CD-ROM), a magneto-optical (MO) disc, a digital versatile disc (DVD), a magnetic disk, and a semiconductor memory.
The program can be installed in the computer from the removable recording medium 111 as described above, or can be downloaded to the computer via a communication network or broadcasting network and installed in the built-in hard disk 105. That is, for example, the program is transferred from the download site to the computer wirelessly via an artificial satellite for digital satellite broadcasting, or transferred to the computer by wire via a network such as a LAN (Local Area Network) or the Internet.
The computer contains a central processing unit (CPU) 102. An input/output interface 110 is connected to the CPU 102 via a bus 101.
When a user inputs an instruction by manipulating an input unit 107 through the input/output interface 110, the CPU 102 executes a program stored in a read-only memory (ROM) 103 in accordance with the instruction. Alternatively, the CPU 102 loads a program stored in the hard disk 105 into a random access memory (RAM) 104 and executes it.
As a result, the CPU 102 performs processing according to the above-described flowcharts or processing executed by components of the above-described block diagrams. Then, the CPU 102 causes an output unit 106 to output a processing result, causes a communication unit 108 to transmit the processing result, and causes the hard disk 105 to record the processing result, for example, via the input/output interface 110 as necessary.
The input unit 107 is composed of a keyboard, mouse, microphone, and the like. Further, the output unit 106 is configured by a liquid crystal display (LCD), a speaker, or the like.
The processing executed by the computer in accordance with the program described herein may not necessarily be executed chronologically in the order described in the flowcharts. In other words, the processing executed by the computer in accordance with the program also includes processing that is executed in parallel or individually (for example, parallel processing or processing by objects).
The program may be a program processed by one computer (processor) or may be distributed and processed by a plurality of computers. Furthermore, the program may be a program transmitted to a remote computer to be executed.
Moreover, in the present specification, a system means a collection of a plurality of constituent elements (including devices and modules (components)) regardless of whether all the constituent elements are contained in the same casing. Accordingly, a plurality of devices accommodated in separate casings and connected via a network and one device in which a plurality of modules are accommodated in one casing are all systems.
For example, a configuration described as one device (or processing unit) may be divided and configured as a plurality of devices (or processing units). On the other hand, the configuration described above as a plurality of devices (or processing units) may be collectively configured as one device (or processing unit). Further, of course, a configuration other than the above may be added to the configuration of each device (or each processing unit). Further, a part of the configuration of a device (or processing unit) may be included in the configuration of another device (or another processing unit) as long as the configuration or operation of the system as a whole is substantially the same.
Further, for example, the present technology may have a cloud computing configuration in which one function is shared with and processed by a plurality of devices via a network.
Further, for example, the above-described program can be executed in any device. In this case, the device only needs to have necessary functions (functional blocks, and the like) and to be able to obtain necessary information.
Further, for example, the respective steps described in the above-described flowchart may be executed by one device or in a shared manner by a plurality of devices. Furthermore, in a case where a plurality of steps of processing are included in one step, the plurality of steps of processing included in one step may be executed by one device or by a plurality of devices in a shared manner. In other words, it is also possible to execute the plurality of processing included in one step as processing of a plurality of steps. On the other hand, it is also possible to execute processing described as a plurality of steps collectively as one step.
For example, for a program executed by a computer, processing of steps describing the program may be performed chronologically in the order described in the present specification, or may be performed in parallel or individually at a necessary timing such as the time of calling. That is, the processing of the respective steps may be executed in an order different from the above-described order as long as there is no contradiction. Further, the processing of the steps describing this program may be executed in parallel with processing of another program, or may be executed in combination with the processing of the other program.
Note that the present technology described as various modes in the present description may be implemented independently alone as long as no contradiction arises. Of course, any plurality of technologies may be implemented together. For example, some or all of the present technologies described in several embodiments may be implemented in combination with some or all of the present technologies described in the other embodiments. A part or all of any above-described present technology can also be implemented together with another technology which has not been described above.
The present disclosure can also be configured as follows.
(1)
An information processing device including:
(2)
The information processing device according to (1), further including
(3)
The information processing device according to (2), in which:
(4)
The information processing device according to (3), in which
(5)
The information processing device according to any one of (1) to (4), in which
(6)
The information processing device according to (3), in which
(7)
The information processing device according to (6), in which
(8)
The information processing device according to (6) or (7), in which
(9)
The information processing device according to any one of (6) to (8), in which
(10)
The information processing device according to (3), in which
(11)
The information processing device according to (3), in which
(12)
The information processing device according to (3), in which
(13)
The information processing device according to (3), in which
(14)
The information processing device according to (3), in which
(15)
The information processing device according to (14), in which
(16)
The information processing device according to (3), in which
(17)
The information processing device according to (3), in which
(18)
The information processing device according to (3), in which
(19)
The information processing device according to (3), in which
(20)
An information processing method that causes an information processing device to perform a process including:
Note that embodiments of the present disclosure are not limited to the above-mentioned embodiments and can be modified in various manners without departing from the scope and spirit of the present disclosure. The advantageous effects described in the present specification are merely exemplary and are not limitative, and other advantageous effects may be achieved.
Number | Date | Country | Kind |
---|---|---|---|
2022-092457 | Jun 2022 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2023/019269 | 5/24/2023 | WO |