This application is a U.S. National Phase of International Patent Application No. PCT/JP2020/044031 filed on Nov. 26, 2020, which claims priority benefit of Japanese Patent Application No. JP 2019-219694 filed in the Japan Patent Office on Dec. 4, 2019. Each of the above-referenced applications is hereby incorporated herein by reference in its entirety.
The present invention relates to an information processing device, an information processing method, and an information processing program.
Conventionally, there is an information processing device that provides virtual reality for a wearer of a head-mounted display. A field of view of the wearer is optically blocked during display of content by the head-mounted display.
There is disclosed a technology that, in a case where a surrounding person who attempts to communicate with the wearer is detected on the basis of an external situation of the head-mounted display, notifies the wearer of existence of the detected surrounding person (see, for example, Patent Literature 1).
In such a head-mounted display, it is not easy for a surrounding person to determine whether the wearer is viewing the content or the real space, even while the wearer is viewing content displayed on the head-mounted display.
Therefore, the surrounding person may feel that the wearer of the head-mounted display is gazing at him/her even though the wearer is merely viewing a content image displayed on the head-mounted display. This may give the surrounding person an unpleasant feeling.
The present invention has been made in view of the above, and an object thereof is to provide an information processing device, an information processing method, and an information processing program capable of reducing an unpleasant feeling given to a surrounding person.
In order to solve the above problem and achieve the object, an information processing device according to an embodiment includes a display control unit and a decision unit. The display control unit displays a content image on a head-mounted display. During display of the content image by the display control unit, the decision unit decides whether or not a surrounding person exists in a front direction of the head-mounted display on the basis of a camera image obtained by capturing the surrounding environment of the head-mounted display. In a case where the decision unit decides that a surrounding person exists, the display control unit moves a display position of the content image.
According to one aspect of the embodiment, it is possible to reduce an unpleasant feeling given to a surrounding person.
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings. In each of the following embodiments, the same parts are denoted by the same reference signs, and redundant description will be omitted.
First, an outline of a display device according to an embodiment will be described with reference to
In the example of
In the example of
As illustrated in
Therefore, a surrounding person existing around the wearer U cannot observe the field of view (eye movement) of the wearer U of the HMD. That is, the field of view of the wearer U is blocked while the wearer U is viewing the content image; the wearer U is thus less likely to notice the surrounding person, and the surrounding person cannot tell where the wearer U is looking.
As a result, in some cases, the surrounding person feels that the wearer U is gazing at him/her while the wearer U is viewing the content image. Meanwhile, the wearer U does not notice the surrounding person and may therefore give the surrounding person an unpleasant feeling.
Therefore, in the information processing method according to the embodiment, in a case where a surrounding person exists in a front direction of the HMD, a posture of the wearer U, that is, the front direction of the HMD is changed by moving a display position of image content displayed on the display device 10.
Specifically, as illustrated in
At this time, the wearer U can comfortably view the content image when the center coordinates C substantially match the initial coordinates Rp. It is therefore expected that the wearer U changes his/her posture so as to follow the center coordinates C until the initial coordinates Rp substantially match the center coordinates C.
Therefore, in the example of
That is, the information processing method according to the embodiment prompts the wearer U to change the posture in a direction different from a direction of the surrounding person who has originally existed in the front direction of the HMD, thereby reducing an unpleasant feeling given by the wearer U to the surrounding person.
Next, a configuration example of an information processing device 1 according to the embodiment will be described with reference to
First, the display device 10 will be described. As illustrated in
The gyroscope sensor 12 detects angular velocities in three axes for detecting movement of the display device 10. Because the display device 10 is the HMD as described above, the gyroscope sensor 12 detects a change in the posture of the wearer U of the display device 10 and outputs a posture signal corresponding to the detected change in the posture to the information processing device 1.
The camera 13 includes an image sensor and captures an image of an area in front of the display device 10. The camera 13 preferably includes a wide-angle lens such as a fisheye lens. For example, the camera 13 captures an image at an angle of view corresponding to the field of view of the wearer U wearing the HMD and outputs the captured camera image to the information processing device 1.
The distance measurement sensor 14 is an example of a sensor for sensing the surrounding environment of the display unit 11 and is, for example, a time-of-flight (ToF) sensor. Instead of the distance measurement sensor 14, the image sensor of the camera 13 may serve as the sensor for sensing the surrounding environment. That is, in a case where the distance to a surrounding person can be measured by image analysis, the image sensor may implement the function of the distance measurement sensor 14.
Next, the information processing device 1 will be described. As illustrated in
The storage unit 2 is implemented by, for example, a semiconductor memory element such as a RAM or a flash memory or a storage device such as a hard disk or an optical disk. In the example of
The model information 20 is information regarding a model for detecting a predetermined target object from a camera image captured by the camera 13. For example, a feature value of each target object in the camera image is stored in the storage unit 2 as the model information 20.
The score information 21 is information regarding a score for determining a destination in a case where the display position of the content image is moved. A specific example of the score information 21 will be described later with reference to
Next, the control unit 3 will be described. The control unit 3 is implemented by, for example, a central processing unit (CPU) or a micro processing unit (MPU) executing a program stored in the information processing device 1 by using a random access memory (RAM) or the like as a work area. The control unit 3 is a controller and may also be implemented by, for example, an integrated circuit such as an application-specific integrated circuit (ASIC) or a field programmable gate array (FPGA).
As illustrated in
The acquisition unit 30 acquires various types of information input from the display device 10. Specifically, the acquisition unit 30 acquires a posture signal from the gyroscope sensor 12 and acquires a camera image from the camera 13.
During display of the content image by the display control unit 34, the decision unit 31 decides whether or not a surrounding person exists in the front direction of the display device 10 on the basis of a camera image obtained by capturing an image of the surrounding environment of the display device 10.
For example, the decision unit 31 detects a person appearing in the camera image captured by the camera 13 on the basis of the feature value registered in the model information 20, and, in a case where the detected person is in the front direction of the display device 10, the decision unit 31 decides that a surrounding person exists in the front direction of the display device 10.
At this time, the decision unit 31 may decide that a surrounding person exists in a case where the decision unit 31 detects a person satisfying a predetermined decision condition. A specific example of processing by the decision unit 31 will be described with reference to
Hereinafter, the front direction of the display device 10 may also be referred to as a front vector V. As illustrated in
In the example of
When detecting the specific parts, the decision unit 31 calculates a relative position between the display device 10 and each specific part. At this time, for example, the decision unit 31 may calculate the relative position on the basis of a measurement result of the distance measurement sensor 14 in
Then, the decision unit 31 decides whether or not the person satisfies the predetermined decision condition, and, when deciding that the predetermined decision condition is satisfied, the decision unit 31 decides that a surrounding person exists in the front direction of the display device 10.
In the example of
The distance d between the front vector V and the specific part C1 corresponds to a length of a perpendicular line from the specific part C1 toward the front vector V. In the example of
In other words, the decision condition of the decision unit 31 is satisfied in a case where the surrounding person can notice the situation in which the wearer U seems to be gazing at him/her. Hereinafter, the above decision condition will also be referred to as a line-of-sight condition.
At this time, the decision condition may be satisfied without considering the direction of the surrounding person. Specifically, as illustrated in
In other words, in a case where a person stops in the front direction of the display device 10 for a predetermined period of time or more, the decision unit 31 decides that a surrounding person exists. That is, the decision unit 31 decides that a surrounding person exists in the front direction of the display device 10 when a situation arises in which the wearer U seems to be gazing at a specific part of that person. Hereinafter, the above decision condition will also be referred to as a distance condition.
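As a concrete illustration, the line-of-sight and distance conditions can be sketched as follows. This is only a minimal sketch under assumptions not in the specification: the thresholds `DIST_THRESHOLD` and `DWELL_TIME`, the coordinate convention (specific-part positions expressed in the HMD frame, with the front vector V as a ray from the origin), and all function names are hypothetical.

```python
import time

import numpy as np

# Illustrative values; the specification does not prescribe concrete thresholds.
DIST_THRESHOLD = 0.3   # max perpendicular distance [m] from the front vector V
DWELL_TIME = 2.0       # seconds a person must stay near the front direction

def perpendicular_distance(front_vector, part_position):
    """Distance d from a specific part (e.g. a face) to the ray along the
    front vector V of the HMD, both expressed in HMD coordinates."""
    v = np.asarray(front_vector, dtype=float)
    p = np.asarray(part_position, dtype=float)
    v_hat = v / np.linalg.norm(v)
    t = max(float(p @ v_hat), 0.0)       # projection onto the forward ray only
    return float(np.linalg.norm(p - t * v_hat))

class DistanceCondition:
    """Satisfied when a specific part stays within DIST_THRESHOLD of the
    front vector for DWELL_TIME seconds or more (the 'distance condition')."""
    def __init__(self):
        self._since = None               # time the part entered the front region

    def update(self, front_vector, part_position, now=None):
        now = time.monotonic() if now is None else now
        if perpendicular_distance(front_vector, part_position) <= DIST_THRESHOLD:
            if self._since is None:
                self._since = now
            return (now - self._since) >= DWELL_TIME
        self._since = None               # the person left the front direction
        return False
```

The line-of-sight condition would additionally check the detected direction of the person (whether he/she is facing the wearer U) before declaring the condition satisfied.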
Returning to the description of
That is, in a case where the decision unit 31 decides that a surrounding person exists in the front direction of the display device 10, the determination unit 32 determines the destination of the display position of the content image on the basis of the camera image captured by the camera 13.
A series of processing by the determination unit 32 will be described with reference to
As illustrated in
At this time, the determination unit 32 detects not only the people appearing in the camera image I but also the direction each person is facing. In other words, the determination unit 32 distinguishes between people facing the wearer U and people facing other directions.
As illustrated in
In the example of
Examples of other feature objects include predetermined monuments such as a bronze statue and crowds. A feature value of the feature object satisfying the visual condition may be registered in advance, and an object having the feature value may be detected from the camera image I as the feature object, or the feature object may be detected on the basis of lines of sight of the people appearing in the camera image I.
In this case, in a case where a plurality of people appearing in the camera image I are gazing at a predetermined object, this object is detected as the feature object. Further, a position of the feature object may be registered in a map in advance, and the feature object may be detected on the basis of a relative positional relationship with a current location of the wearer U on the basis of the map.
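The gaze-based detection of a feature object described above can be sketched as follows. The `min_gazers` parameter, the abstraction of the camera image into discrete regions, and the function name are assumptions for illustration, not part of the specification.

```python
from collections import Counter

def gazed_object_regions(gaze_targets, min_gazers=2):
    """Detect candidate feature-object regions in the camera image I.

    A region counts as a feature object if at least `min_gazers` people
    appearing in the image are gazing at it. `gaze_targets` maps a person
    identifier to the region index that person's estimated line of sight
    falls on. Returns the qualifying region indices in ascending order.
    """
    counts = Counter(gaze_targets.values())
    return sorted(region for region, n in counts.items() if n >= min_gazers)
```

A detector based on pre-registered feature values, or on a map of known monument positions, could feed the same downstream scoring step.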
When detecting the people or feature objects appearing in the camera image I, the determination unit 32 calculates a score on the basis of each detection result. Specifically, as illustrated in
As an example of a method of calculating the score, a point is deducted in a region where a person exists, and a point is further deducted in a region where a person facing the wearer U exists, whereas a point is added in a region where a feature object exists.
Then, for example, the determination unit 32 determines a region having the highest score as the destination of the display position of the image content, that is, the destination of the center coordinates C in
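The scoring scheme can be sketched roughly as follows. The point values, the division of the camera image into discrete regions, and the convention that the destination is the region with the best (highest) score are illustrative assumptions rather than values taken from the specification.

```python
# Illustrative point values (not specified in the text).
PERSON_PENALTY = 1
FACING_PENALTY = 2   # additional deduction for a person facing the wearer U
FEATURE_BONUS = 3

def score_regions(num_regions, detections):
    """Score horizontal regions of the camera image I.

    `detections` is a list of (region_index, kind) tuples, where kind is
    'person', 'facing_person', or 'feature'. Points are deducted where a
    person exists, deducted further where the person faces the wearer, and
    added where a feature object exists. Returns the per-region scores and
    the index of the best-scoring region (the candidate destination).
    """
    scores = [0] * num_regions
    for region, kind in detections:
        if kind == 'person':
            scores[region] -= PERSON_PENALTY
        elif kind == 'facing_person':
            scores[region] -= PERSON_PENALTY + FACING_PENALTY
        elif kind == 'feature':
            scores[region] += FEATURE_BONUS
    best = max(range(num_regions), key=lambda i: scores[i])
    return scores, best
```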
That is, the determination unit 32 determines a direction in which no person satisfying the decision condition exists in the front direction of the moved HMD as the destination of the center coordinates C, thereby reducing a frequency of moving the content image.
Further, because the determination unit 32 determines the destination of the center coordinates C on the basis of the feature object, it is possible to make the people around the wearer U think that the wearer U is looking at the feature object. This makes it possible to eliminate, in the first place, the unpleasant feeling that the wearer U gives to the surrounding people.
Note that, depending on the kind of the feature object, the surrounding people may feel strange if the wearer U is gazing at the feature object for too long. Therefore, in a case where a predetermined time elapses after the feature object comes in the front direction of the HMD, the destination of the center coordinates C may be determined again.
Returning to the description of
In a case where the decision unit 31 decides that the decision condition is satisfied, the display control unit 34 described later moves the center coordinates C of the content image from the initial coordinates Rp. In this case, if the head of the wearer U does not rotate, a situation in which the wearer U seems to gaze at the surrounding person continues.
Therefore, the calculation unit 33 calculates the amount of rotation of the head of the wearer U when the center coordinates C are moved and notifies the display control unit 34 in a case where, for example, the calculated amount of rotation exceeds a threshold. The threshold here corresponds to the amount of rotation required for the original surrounding person to leave the front direction of the HMD, but may also be determined, for example, on the basis of the destination determined by the determination unit 32. That is, when the front direction of the HMD comes close to the destination determined by the determination unit 32, it may be decided that the amount of rotation exceeds the threshold.
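One plausible way for the calculation unit 33 to obtain the amount of rotation from the posture signal is to integrate the yaw angular velocity reported by the gyroscope sensor 12. The sketch below assumes a yaw-only model and a caller-supplied sampling interval, neither of which is prescribed by the specification.

```python
class RotationTracker:
    """Estimates how far the wearer U's head has rotated (yaw only) since
    the content image started moving, by integrating the gyroscope's
    angular velocity samples."""
    def __init__(self, threshold_rad):
        self.threshold = threshold_rad   # amount needed for the original
                                         # surrounding person to leave the
                                         # front direction of the HMD
        self.yaw = 0.0

    def feed(self, yaw_rate, dt):
        """Accumulate one gyroscope sample: yaw_rate [rad/s] over dt [s]."""
        self.yaw += yaw_rate * dt

    def exceeds_threshold(self):
        """True once the accumulated rotation passes the threshold, i.e.
        when the display control unit 34 should be notified."""
        return abs(self.yaw) > self.threshold
```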
In a case where the decision unit 31 decides that a surrounding person exists in the front direction of the HMD, the display control unit 34 moves the display position of the content image. At this time, the initial coordinates Rp are set at the center of the display region of the display device 10 as described above.
Therefore, the display control unit 34 moves the center coordinates C such that the distance between the initial coordinates Rp and the center coordinates C increases. Further, the display control unit 34 moves the center coordinates C to the destination determined by the determination unit 32.
At this time, for example, if the center coordinates C are moved rapidly from the initial coordinates Rp, the visibility of the content image may be impaired depending on the display position of the content image. Moreover, if the wearer U rapidly rotates the head to follow the movement of the center coordinates C, the wearer U abruptly averts his/her eyes from the surrounding person, which may give the surrounding person a sense of distrust.
Therefore, when moving the display position of the content image, the display control unit 34 preferably keeps the moving speed of the content image at or below a predetermined value and keeps the distance between the display position before each movement and the display position after the movement within a predetermined range.
This makes it possible to reduce the unpleasant feeling given to a surrounding person without impairing the visibility of the content image, while keeping the amount of head rotation required of the wearer U at any one time small. While the center coordinates C are shifted from the initial coordinates Rp, the wearer U is expected to rotate the head; in a case where the shift between the center coordinates C and the initial coordinates Rp is sufficiently small, however, the wearer U is assumed to move only the line of sight to the center coordinates C without rotating the head.
Therefore, the distance between the initial coordinates Rp and the moved center coordinates C is preferably equal to or more than a predetermined value. In this case, the display control unit 34 may acquire the amount of rotation of the head from the calculation unit 33 every time the center coordinates C are moved, and, only in a case where the head is not rotated, the display control unit 34 may move the center coordinates C to the next display position.
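The gradual movement of the center coordinates C can be sketched as a per-frame step toward the destination with a capped step length. The 2-D pixel coordinates and the `max_step` parameter are assumptions for illustration; the specification only requires that the moving speed stay at or below a predetermined value.

```python
def step_toward(current, target, max_step):
    """Move the center coordinates C one frame toward `target`, limiting
    the per-frame displacement to `max_step` so the content image never
    jumps (moving speed <= predetermined value)."""
    dx = target[0] - current[0]
    dy = target[1] - current[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= max_step:
        return target                      # destination reached this frame
    scale = max_step / dist
    return (current[0] + dx * scale, current[1] + dy * scale)
```

Calling this once per frame, and pausing when the calculation unit 33 reports that the head has not yet rotated, yields the stepwise guidance described above.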
Thereafter, as illustrated in
Next, a processing procedure executed by the information processing device 1 according to the embodiment will be described with reference to
As illustrated in
Then, the information processing device 1 decides whether or not the line-of-sight condition in
In a case where the information processing device 1 decides that the line-of-sight condition is not satisfied in the decision processing in step S103 (step S103, No), the information processing device 1 decides whether or not the distance condition in
In a case where the distance condition is satisfied in the decision in step S104 (step S104, Yes), the information processing device 1 proceeds to the processing in step S105. In a case where the information processing device 1 decides that the distance condition is not satisfied (step S104, No), the information processing device 1 terminates the processing. In a case where the content image is not displayed in the decision in step S101 (step S101, No), the information processing device 1 omits the processing in step S102 and the subsequent steps and terminates the processing.
Next, the processing procedure in step S105 in
Then, the information processing device 1 determines a destination of a display position of image content on the basis of the score of each region (step S202) and moves the display position to the determined destination (step S203).
Then, the information processing device 1 decides whether or not the amount of rotation of the head of the wearer U is larger than the threshold (step S204) and, in a case where the amount of rotation exceeds the threshold (step S204, Yes), the information processing device 1 moves the display position to the initial coordinates Rp (step S205) and terminates the processing.
In a case where the amount of rotation does not exceed the threshold in the decision in step S204 (step S204, No), the information processing device 1 returns to the processing in step S203.
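Putting steps S201 to S205 together, the procedure can be read as the following loop. The callback names are hypothetical stand-ins for the determination unit, display control unit, and calculation unit described above; they are not interfaces defined in the specification.

```python
def relocation_procedure(score_fn, move_fn, rotation_fn, restore_fn):
    """Sketch of steps S201-S205.

    score_fn    -- scores the camera-image regions and returns the
                   destination of the display position (S201 + S202)
    move_fn     -- moves the display position toward the destination (S203)
    rotation_fn -- True once the wearer U's head rotation exceeds the
                   threshold (S204)
    restore_fn  -- moves the display position back to the initial
                   coordinates Rp (S205)
    """
    destination = score_fn()           # S201 + S202
    while True:
        move_fn(destination)           # S203
        if rotation_fn():              # S204: Yes -> restore; No -> loop
            break
    restore_fn()                       # S205
```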
In the above embodiment, in a case where a surrounding person exists in the front direction of the HMD, the display position of the content image is moved to guide the line of sight of the wearer U. However, the present invention is not limited thereto. That is, the wearer U may instead be notified of the existence of the surrounding person by a warning image or a warning sound. In this case, the transmittance of part of or the entire content image may be increased to allow the wearer U to directly and visually recognize the surrounding person.
Further, the above embodiment shows a case where the display device 10 is an optical see-through display device. However, the present invention is not limited thereto and is similarly applicable to a video see-through display device.
An information device such as the information processing device according to each embodiment described above is implemented by, for example, a computer 1000 having a configuration of
The CPU 1100 operates on the basis of programs stored in the ROM 1300 or the HDD 1400, thereby controlling each unit. For example, the CPU 1100 develops the programs stored in the ROM 1300 or the HDD 1400 in the RAM 1200 and executes processing corresponding to the various programs.
The ROM 1300 stores a boot program such as a basic input output system (BIOS) executed by the CPU 1100 at the time of activation of the computer 1000, a program depending on hardware of the computer 1000, and the like.
The HDD 1400 is a computer-readable recording medium that non-transitorily records programs executed by the CPU 1100, data used by the programs, and the like. Specifically, the HDD 1400 is a recording medium that records a program according to the present disclosure as an example of the program data 1450.
The communication interface 1500 is an interface for connecting the computer 1000 to an external network 1550 (e.g., the Internet). For example, the CPU 1100 receives data from another device or transmits data generated by the CPU 1100 to another device via the communication interface 1500.
The input/output interface 1600 is an interface for connecting an input/output device 1650 and the computer 1000. For example, the CPU 1100 receives data from an input device such as a keyboard or a mouse via the input/output interface 1600. The CPU 1100 transmits data to an output device such as a display, a speaker, or a printer via the input/output interface 1600. The input/output interface 1600 may function as a media interface that reads a program or the like recorded in a predetermined recording medium (medium). The medium is, for example, an optical recording medium such as a digital versatile disc (DVD) or a phase change rewritable disk (PD), a magneto-optical recording medium such as a magneto-optical disk (MO), a tape medium, a magnetic recording medium, or a semiconductor memory.
For example, in a case where the computer 1000 functions as the information processing device 1 according to the embodiment, the CPU 1100 of the computer 1000 implements the function of the acquisition unit 30 by executing a program loaded on the RAM 1200. The HDD 1400 also stores the program according to the present disclosure and the data in the storage unit 2. The CPU 1100 reads the program data 1450 from the HDD 1400 and executes it; as another example, the CPU 1100 may acquire these programs from another device via the external network 1550.
The present technology can also have the following configurations.
(1)
An information processing device comprising:
The information processing device according to (1), wherein
The information processing device according to (2), wherein
The information processing device according to any one of (1) to (3), wherein
The information processing device according to any one of (1) to (4), wherein
The information processing device according to any one of (1) to (5), wherein
The information processing device according to any one of (1) to (6), further comprising
The information processing device according to (7), wherein:
The information processing device according to any one of (1) to (8), wherein
The information processing device according to (9), wherein
The information processing device according to any one of (1) to (10), wherein
The information processing device according to any one of (1) to (11), wherein
The information processing device according to any one of (1) to (12), further comprising
The information processing device according to (13), wherein
The information processing device according to (13) or (14), wherein
The information processing device according to any one of (13) to (15), wherein
An information processing method comprising
Number | Date | Country | Kind |
---|---|---|---|
2019-219694 | Dec 2019 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2020/044031 | 11/26/2020 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2021/111975 | 6/10/2021 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
20130265232 | Yun et al. | Oct 2013 | A1 |
20150123997 | Hayasaka et al. | May 2015 | A1 |
20160133051 | Aonuma | May 2016 | A1 |
Number | Date | Country |
---|---|---|
104272371 | Jan 2015 | CN |
104635338 | May 2015 | CN |
2657929 | Oct 2013 | EP |
2007-304721 | Nov 2007 | JP |
2015-087909 | May 2015 | JP |
2015-090635 | May 2015 | JP |
2015-518580 | Jul 2015 | JP |
2017-149335 | Aug 2017 | JP |
10-2013-0113902 | Oct 2013 | KR |
2013154295 | Oct 2013 | WO |
2014156388 | Oct 2014 | WO |
Entry |
---|
International Search Report and Written Opinion of PCT Application No. PCT/JP2020/044031, issued on Feb. 16, 2021, 12 pages of ISRWO. |
Number | Date | Country | |
---|---|---|---|
20230005454 A1 | Jan 2023 | US |