The present disclosure relates to a display control apparatus, a display control method, and a program.
Techniques for controlling display of various types of information on the basis of an operation detected by various sensors have been developed. For example, Patent Literature 1 discloses a technique for detecting the position or the like of an operating body such as a hand and, on the basis of the obtained detection result, controlling the output of information to a screen or a display region of a display device such as a display.
Patent Literature 1: WO 2015/098187
For example, it is considered that an object operating in tandem with an operating body is caused to be displayed in the display region in order to improve the accuracy of the operation on the information displayed in the display region. However, when the object is simply displayed, it is difficult for the user to perceive through bodily sensation that the object operates in tandem with the operating body, and it takes time to acquire an operation feeling.
In this regard, the present disclosure proposes a display control apparatus, a display control method, and a program which are novel and improved and capable of easily acquiring an operation feeling of an object corresponding to an operating body.
According to the present disclosure, there is provided a display control apparatus, including: a first display control unit configured to control display of a first object corresponding to an operating body in a display region on the basis of a position of the operating body and a viewpoint position of a user; and a second display control unit configured to control display of a second object arranged toward a display position of the first object in the display region.
In addition, according to the present disclosure, there is provided a display control method, including: controlling, by a processor, display of a first object corresponding to an operating body in a display region on the basis of a position of the operating body and a viewpoint position of a user; and controlling, by the processor, display of a second object arranged toward a display position of the first object in the display region.
In addition, according to the present disclosure, there is provided a program causing a computer to function as: a first display control unit configured to control display of a first object corresponding to an operating body in a display region on the basis of a position of the operating body and a viewpoint position of a user; and a second display control unit configured to control display of a second object arranged toward a display position of the first object in the display region.
As described above, according to the present disclosure, it is possible to easily acquire an operation feeling of an object corresponding to an operating body.
Note that the effects described above are not necessarily limitative. With or in the place of the above effects, there may be achieved any one of the effects described in this specification or other effects that may be grasped from this specification.
Hereinafter, (a) preferred embodiment(s) of the present disclosure will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
Further, the description will proceed in the following order.
The display control apparatus 10 is a device having a display control function for acquiring detection information obtained from each detecting device and performing control of display based on the detection information. The display control apparatus 10 may include a processing circuit, a storage device, a communication device, and the like. The display control apparatus 10 can be realized by any information processing device such as a personal computer (PC), a tablet, or a smartphone. Further, as illustrated in
As illustrated in
The control unit 100 controls overall operation of the display control apparatus 10 according to the present embodiment. The function of the control unit 100 is realized by a processing circuit such as a central processing unit (CPU) included in the display control apparatus 10. Further, the control unit 100 has functions realized by respective functional units illustrated in
The communication unit 110 is a communication device included in the display control apparatus 10, and carries out various types of communications with an external device via a network (or directly) in a wireless or wired manner. The function of the communication unit 110 is realized by a communication device included in the display control apparatus 10. Specifically, the communication unit 110 is realized by a communication device such as a communication antenna and a radio frequency (RF) circuit (wireless communication), an IEEE 802.15.1 port and a transmission/reception circuit (wireless communication), an IEEE 802.11b port and a transmission/reception circuit (wireless communication), or a local area network (LAN) terminal and a transmission/reception circuit (wired communication). For example, as illustrated in
The storage unit 120 is a storage device included in the display control apparatus 10, and stores information acquired by the communication unit 110, information obtained by processes of the respective functional units of the control unit 100, and the like. The storage unit 120 is realized by, for example, a magnetic recording medium such as a hard disk, a non-volatile memory such as a flash memory, or the like. For example, the storage unit 120 may store information related to a body of the user using the display control system 1 (a line of sight position PV1 or the like). Further, the storage unit 120 appropriately outputs stored information in response to a request from each functional unit included in the control unit 100 or from the communication unit 110. Further, the storage unit 120 need not necessarily be included in the display control apparatus 10, and for example, the function of the storage unit 120 may be realized by an external cloud server or the like.
The operating body detecting device 20 is an example of a detecting device used for detecting the operating body. The operating body detecting device 20 according to the present embodiment generates operating body detection information related to a hand H1 of the user U1 which is an example of the operating body. The generated operating body detection information is output to the display control apparatus 10 via the network NW (or directly). Further, as illustrated in
The operating body detection information includes, for example, information (three-dimensional position information) related to a position of the detected operating body in a three-dimensional space. In the present embodiment, the operating body detection information includes three-dimensional position information of the operating body in a coordinate system of the space 2. Further, the operating body detection information may include a model or the like generated on the basis of a shape of the operating body. As described above, the operating body detecting device 20 generates information related to an operation which the user performs on the operating body detecting device 20 as the operating body detection information.
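As a rough, hedged illustration of this record, the following Python sketch models the operating body detection information as a simple data structure. The type and field names (OperatingBodyDetectionInfo, position, hand_model) are assumptions made for this sketch and are not names used in the present disclosure.

```python
from dataclasses import dataclass
from typing import Optional, Sequence, Tuple

Vec3 = Tuple[float, float, float]  # a point in the coordinate system of the space 2

@dataclass
class OperatingBodyDetectionInfo:
    """Hypothetical record generated by the operating body detecting device 20."""
    position: Vec3                               # three-dimensional position of the operating body
    hand_model: Optional[Sequence[Vec3]] = None  # optional model points describing the shape of the hand
```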
The operating body detecting device 20 according to the present embodiment can be realized by an infrared irradiation light source, an infrared camera, and the like. Further, the operating body detecting device 20 may be realized by any of various types of sensors such as, for example, a depth sensor, a camera, a magnetic sensor, and a microphone. In other words, the operating body detecting device 20 is not particularly limited as long as it can acquire a position, a form, or the like of the operating body.
Further, in the example illustrated in
Further, in the present embodiment, the hand is assumed as an example of the operating body, but the present technology is not limited to such an example. The operating body may be, for example, a finger of the hand of the user or a foot of the user. Further, the operating body may be an object used by the user to operate an operation target such as a device gripped by the user (for example, a dish, a laboratory instrument, a medical instrument, or a tool).
The object detecting device 30 is an example of a detecting device used for estimating the position or the like of the user. The object detecting device 30 according to the present embodiment generates three-dimensional position information of a detected body. The generated three-dimensional position information is output to the display control apparatus 10 via the network NW (or directly). Further, as illustrated in
The object detecting device 30 according to the present embodiment can be realized by a depth sensor. Further, the object detecting device 30 may be realized by, for example, a stereo camera or the like. Further, the object detecting device 30 may be realized by a sensor capable of performing distance measurement using an infrared sensor, a time of flight (TOF) type sensor, an ultrasonic sensor, or the like or may be realized by a device which projects an IR laser pattern. In other words, the object detecting device 30 is not particularly limited as long as it can detect the body or the like of the user in the space.
Further, although the object detecting device 30 is described as being arranged on the ceiling, the wall, or the like of the space 2 in the example illustrated in
Further, in another embodiment, the display control system 1 may include a device capable of detecting a position of each part of the body of the user such as a viewpoint position of the user instead of the object detecting device 30. Such a device may be realized by, for example, an image recognition sensor or the like capable of identifying the head, the arm, the shoulder (hereinafter collectively referred to as an “upper limb”), or the like of the user by image recognition or the like.
The display device 40 is a device which is arranged in the space 2 to which the display control system 1 is applied, and displays a predetermined screen and information output from the display control apparatus 10 via the network NW (or directly) in a display region 41. Although the details will be described later, the display in the display region 41 of the display device 40 according to the present embodiment is controlled by the display control apparatus 10.
For example, as illustrated in
In the display control system 1, the operating body detecting device 20 detects the position or the like of the hand H1 of the user U1 which is an example of the operating body, the object detecting device 30 detects a skeleton of the user U1, and the detection information is output to the display control apparatus 10. On the basis of the detection information, the display control apparatus 10 controls display of a virtual object corresponding to the operating body such as the hand H1 in the display region. Accordingly, the user U1 can perform an operation on a screen displayed in the display region while looking at the virtual object corresponding to his/her hand H1 reflected on the screen.
However, it is not easy for the user to recognize that the virtual object corresponding to the hand H1 operates in tandem with his/her own hand H1. For example, even when the virtual object is displayed on the screen in the display region, the user must find the virtual object corresponding to the hand H1 through trial and error, such as by moving the hand H1, and it is difficult to intuitively recognize the virtual object corresponding to one's own hand H1.
In this regard, the present disclosure proposes technology that enables the virtual object corresponding to the operating body to be intuitively recognized. Specifically, the present disclosure proposes a technique of controlling display of a second object, which is a virtual object different from the first object corresponding to the operating body, such that the second object is arranged to face the display position of the first object displayed in the display region. The second object may be, for example, an object corresponding to an arm S1 of the user U1, but the second object and the arm S1 need not necessarily operate in tandem with each other. With such a technique, the display of the second object corresponding to the arm S1 of the user U1 makes it easier to find the virtual object corresponding to the hand H1 of the user U1 in the screen displayed in the display region. Therefore, the operation feeling in the operation screen can be easily acquired.
Hereinafter, the display control system 1 according to the present embodiment will be described in detail.
Next, an example of a configuration and a function of the control unit 100 according to the first embodiment of the present disclosure will be described.
The acquiring unit 101 has a function of acquiring position information of the user using the display control system 1. The position information of the user is not limited to information of the position of the user and includes position information of each part of the body of the user. The position information of each part of the body of the user includes, for example, position information of the hand H1 of the user U1, information of the viewpoint position PV1 of the user U1, position information of the upper arm (for example, the arm S1) of the user U1, and the like illustrated in
Further, the information of the position of the user here means a representative position within the space 2 of the user using the display control system 1. Therefore, the information of the position of the user may be information obtained independently of the position information of each part of the user, may be identical to the position information of each part, or may be information of a position estimated on the basis of any of those positions. For example, the position of the user may be a position of the hand of the user (that is, the position of the operating body), may be the position of the viewpoint position of the user or the position of the upper limb of the user, or may be a position estimated on the basis of any of those positions.
Here, the upper limb of the user is an example of a supporting body in the present disclosure. The supporting body is associated with the operating body and supports it; in the case in which the operating body is the hand, the supporting body corresponds to the upper limb.
The acquiring unit 101 according to the present embodiment can first acquire the position information of the hand of the user which is the operating body using the operating body detection information generated by the operating body detecting device 20. Specifically, the acquiring unit 101 can estimate the position of the hand of the user on the basis of the detection position of the operating body included in the operating body detection information and generate the position information of the hand of the user.
Further, in another embodiment, instead of the acquiring unit 101, the operating body detecting device 20 may calculate the position of the hand of the user, and the acquiring unit 101 may acquire the position information of the hand of the user.
Further, the acquiring unit 101 may acquire information related to the shape of the hand of the user using the operating body detection information. Specifically, the acquiring unit 101 can calculate the shape of the hand of the user on the basis of a model of the operating body included in the operating body detection information and generate shape information of the hand of the user. The shape information of the hand can be used, for example, for control of a display form of the first object.
Further, the acquiring unit 101 according to the present embodiment can acquire the position of the user, the viewpoint position of the user, and the position information of the upper limb of the user using the three-dimensional position information generated by the object detecting device 30. Specifically, the acquiring unit 101 may identify the body of the user, which is a detected body, from the three-dimensional position information, generate the skeleton information of the user, estimate the viewpoint position or the like of the user from the skeleton information, and generate each piece of position information. Further, the viewpoint position of the user can be estimated, for example, from the position of the part corresponding to the head in the skeleton of the user. A known skeleton estimation engine or the like may be used for the detection of the skeleton of the user.
Here, the viewpoint position of the user according to the present embodiment means, for example, a position corresponding to the eye of the user using the display control system 1. The viewpoint position of the user may be acquired by directly measuring the position of the eye of the user or may be acquired by estimating it on the basis of the position of the body of the user, the direction of the line of sight, or the like. Further, as described above, the viewpoint position of the user may be a part corresponding to the head, the upper body, or the like of the user in addition to the eye of the user. The viewpoint position of the user in the display control system 1 can be defined by a coordinate system based on an arbitrary component in the space 2. For example, the viewpoint position of the user may be defined by relative coordinates of the eye (or the head) of the user in a coordinate system based on the display region of the display device 40. Alternatively, the viewpoint position of the user in the display control system 1 may be defined by absolute coordinates of the eye (or the head) of the user in a global coordinate system indicating the space 2 in which the display control system 1 is used.
Further, in another embodiment, instead of the acquiring unit 101, the object detecting device 30 may generate the skeleton information of the user from the three-dimensional position information and estimate the viewpoint position or the like of the user. In this case, the acquiring unit 101 may acquire each piece of position information from the object detecting device 30.
Further, as illustrated in
The acquiring unit 101 outputs information related to the obtained viewpoint position of the user and the position of the hand to the first display control unit 102. Further, the acquiring unit 101 outputs information related to the acquired position of the user and the position of the upper limb of the user to the second display control unit 103.
The first display control unit 102 has a function of controlling display of the virtual object (first object) corresponding to the operating body.
The first display control unit 102 according to the present embodiment controls the display of the first object corresponding to the hand of the user which is the operating body in the display region 41 of the display device 40. For example, the first display control unit 102 controls the display of the first object in the display region 41 on the basis of the viewpoint position of the user and the position of the hand. More specifically, the first display control unit 102 performs control such that the first object is displayed at a position at which an extension line of the line of sight in a case in which the user views the hand from his/her viewpoint intersects with the display region 41. Accordingly, in a case in which the user views his/her hand, the user can have a feeling as if his/her hand were immersed into the screen displayed in the display region 41.
More specifically, the first display control unit 102 calculates a display position of the first object in the display region 41 on the basis of the three-dimensional position information of the viewpoint position of the user and the hand position and information of the shortest distance between the display region 41 and the hand (or the viewpoint). Further, in a case in which the calculated display position of the first object is outside the display region 41, the first display control unit 102 may not display the first object. Further, the shortest distance between the display region 41 and the hand (or the viewpoint) may be a previously acquired predetermined value or a value obtained by measuring a distance with the object detecting device 30, another sensor, or the like.
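The position calculation described above can be illustrated with a minimal geometric sketch, assuming the display region 41 lies in the plane z = 0 of a coordinate system whose x and y axes span the display plane and whose z axis points toward the user. The function name and this coordinate convention are assumptions for illustration, not the implementation of the first display control unit 102.

```python
import numpy as np

def first_object_display_position(viewpoint, hand, region_w, region_h):
    """Intersect the ray from the viewpoint through the hand with the plane z = 0.

    Returns the (x, y) display position of the first object in the display
    plane, or None if the intersection falls outside the display region
    (in which case the first object need not be displayed).
    """
    viewpoint = np.asarray(viewpoint, dtype=float)
    hand = np.asarray(hand, dtype=float)
    direction = hand - viewpoint              # extension line of the line of sight toward the hand
    if abs(direction[2]) < 1e-9:              # line parallel to the display plane: no intersection
        return None
    t = -viewpoint[2] / direction[2]          # parameter at which the line reaches z = 0
    if t <= 0:                                # display plane lies behind the viewpoint
        return None
    x, y, _ = viewpoint + t * direction
    if 0.0 <= x <= region_w and 0.0 <= y <= region_h:
        return (x, y)
    return None
```

For example, with the viewpoint at (1.0, 1.5, 2.0) and the hand at (1.2, 1.2, 1.0), the first object lands at (1.4, 0.9) on a sufficiently large display region.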
Further, the first display control unit 102 may cause a display form of the first object corresponding to the hand of the user to be changed on the basis of information related to the shape of the hand of the user. For example, the first display control unit 102 may cause the display form of the first object to be changed on the basis of a motion and a form of the hand estimated on the basis of the information related to the shape of the hand of the user. The change of the display form of the first object may include causing a structure of the first object to be changed and causing a different object to be added to the first object. For example, in a case in which the first object imitates an object of a hand, the first display control unit 102 may cause the object of the hand to be changed to a different object (such as an object of a hand of an animal) or cause a pointer to be added to the object of the hand. Accordingly, it is possible to increase variations of the operation by the first object and to perform a free operation on the screen displayed in the display region 41.
The second display control unit 103 has a function of controlling display of the virtual object (second object) arranged to face the display position of the first object.
For example, the second object is arranged to extend toward the display position of the first object in the display region 41. Since the second object associated with the first object corresponding to the hand is displayed as described above, not only the hand of the user but also the object corresponding to the arm is expressed. In this case, a feeling as if the hand and the arm of the user were present in the screen displayed in the display region 41 is obtained. Therefore, the user can easily acquire the operation feeling for the screen displayed in the display region 41, and the initial learning load of the operation by the first object can be reduced.
Further, the second display control unit 103 according to the present embodiment may control the display of the second object on the basis of the position of the user. More specifically, the second display control unit 103 may control the display position of the second object on the basis of the position of the user. For example, in a case in which the user is located on the left side toward the display region 41, the second display control unit 103 may display the second object in the region on the left side of the display region 41. Accordingly, it is possible to obtain a feeling as if the second object were extended from the position at which the user is located. Therefore, the operation feeling of the first object can be acquired more easily.
Further, as will be described in detail in a second embodiment, since the second object is displayed on the basis of the position of the user, in a case in which a plurality of users performs operations on a screen displayed in a single display region 41, the second object is displayed at the display position corresponding to the position of each user. Therefore, the user can immediately identify the first object corresponding to his/her hand, and since the possibility of confusion between the first object corresponding to his/her hand and another object in the screen displayed in the display region 41 is reduced, the operability is further improved.
Further, the second object according to the present embodiment may be, for example, a virtual object corresponding to the supporting body. The supporting body is an object for supporting the operating body, and specifically, the supporting body corresponds to the upper limb such as the arm or the shoulder for the hand of the user. As the second object corresponding to the supporting body is displayed, the feeling as if the arm of the user were immersed into the display region 41 can be obtained, and thus the operability can be further improved.
The second display control unit 103 may control the display of the second object on the basis of, for example, the position of the supporting body. Here, the position of the supporting body is, for example, a representative position of the upper limb of the user, and more specifically, the position of the supporting body may be an elbow, a base of an arm, a shoulder, or the like. The position of the supporting body is obtained on the basis of, for example, the skeleton information. As the display of the second object (more specifically, the display position) is controlled on the basis of the position of the supporting body, it is possible to cause the display of the second object in the display region 41 to be reflected on the state of the upper limb of the user which actually performs an operation. Therefore, since his/her own upper limb and the second object operate in tandem with each other, the operability of the user can be further improved.
Further, the second display control unit 103 may control the display of the second object on the basis of a relation between the position of the supporting body and the display position of the first object in the display region 41.
In this regard, the second display control unit 103 performs control such that the second object is displayed to extend from the first object Obj1 in the display region 41 toward a representative position S2 (for example, the elbow) of an arm S1 of the user U1 as illustrated in
Here, an example of control of display in the display region 41 for the display device 40 by the first display control unit 102 and the second display control unit 103 will be described.
The operation object Obj is integrally formed by a first object Obj1 corresponding to the hand and a second object Obj2 corresponding to the arm. In particular, since the operation object Obj illustrated in
As illustrated in
The line RS1 may be, for example, a line obtained by projecting the extension line DS1 of the arm illustrated in
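As a hedged sketch of this arrangement, the following function computes an on-screen segment for the second object from the display position of the first object and the representative position S2 of the supporting body. Orthogonal projection onto the display region is modeled by simply dropping the depth coordinate, under the same assumed coordinate convention as the earlier sketch; the function name and parameters are illustrative only.

```python
import numpy as np

def second_object_segment(first_obj_pos, support_pos_3d, length):
    """Segment along which the second object extends in the display plane.

    `first_obj_pos` is the (x, y) display position of the first object Obj1,
    `support_pos_3d` is the three-dimensional representative position S2 of
    the supporting body (for example, the elbow), and `length` is the desired
    on-screen length of the second object.
    """
    p1 = np.asarray(first_obj_pos, dtype=float)
    s2 = np.asarray(support_pos_3d, dtype=float)[:2]  # orthogonal projection: drop the depth coordinate
    direction = s2 - p1
    norm = np.linalg.norm(direction)
    if norm < 1e-9:                                   # S2 projects onto Obj1 itself: no direction
        return p1, p1
    direction /= norm
    # The second object extends from the first object toward the projected S2,
    # so that it appears to reach into the screen from the user's arm.
    return p1, p1 + length * direction
```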
Further, an arrangement position of the second object Obj2 is not limited to the above example. For example, in the example illustrated in
Further, although the operation object Obj illustrated in
Further, as illustrated in
Next, an example of a flow of a process by the display control system 1 according to the present embodiment will be described with reference to
Referring to
Then, the acquiring unit 101 acquires the position information such as the viewpoint position of the user, the position of the hand, and the position of the upper limb on the basis of the operating body detection information and the three-dimensional position information (step S107). Then, the first display control unit 102 calculates the display position of the first object in the display region on the basis of the viewpoint position of the user, the hand position, or the like (step S109). If the calculated display position is within the display region (YES in step S111), the second display control unit 103 calculates the display position of the second object (step S113).
Then, the first display control unit 102 and the second display control unit 103 decide the display forms of the first object and the second object (step S115). Further, the process of step S115 will be described later in detail. Then, the first display control unit 102 and the second display control unit 103 control the display of the first object and the second object for the display device 40 (step S117).
The display control system 1 repeats the processes of steps S101 to S117 until the process ends.
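Summarized as code, one pass of this flow might look like the following loop. Every name here (read, acquire_positions, first_object_position, and so on) is a placeholder invented for this sketch rather than an API of the display control system 1.

```python
def display_control_loop(operating_body_device, object_device, display, ctrl):
    """Repeat the processes corresponding to steps S101 to S117 until the process ends."""
    while not ctrl.should_stop():
        detection = operating_body_device.read()        # operating body detection information
        positions_3d = object_device.read()             # three-dimensional position information
        user = ctrl.acquire_positions(detection, positions_3d)          # step S107
        p1 = ctrl.first_object_position(user.viewpoint, user.hand)      # step S109
        if p1 is None:                                  # outside the display region (step S111: NO)
            continue
        p2 = ctrl.second_object_position(p1, user.upper_limb)           # step S113
        form1, form2 = ctrl.decide_display_forms(user, p1, p2)          # step S115
        display.render(form1, form2)                    # step S117
```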
The example of the flow of the process of the display control system 1 according to the present embodiment has been described above.
Next, an example of display control in the display control system 1 according to the present embodiment will be described.
More specifically, as illustrated in
In the example illustrated in
Next, referring to
Further, in addition to the example of the display control illustrated in
The display control system 1 according to the first embodiment of the present disclosure has been described above.
Next, a display control system 1A according to the second embodiment of the present disclosure will be described. The display control system 1A according to the present embodiment controls display of operation objects respectively corresponding to the hands of a plurality of users.
As illustrated in
Specifically, as illustrated in
In this regard, the second display control unit 103 controls an arrangement of second objects Obj2A to Obj2C such that the second objects Obj2A to Obj2C face the display positions of the first objects Obj1A to Obj1C, respectively.
In the present embodiment, display of the second objects Obj2A to Obj2C is controlled on the basis of the positions of the respective users UA to UC. In other words, the second display control unit 103 controls the display positions of the second objects Obj2A to Obj2C on the basis of the positions of the users UA to UC. More specifically, the second display control unit 103 performs the display so that the second objects Obj2A to Obj2C are positioned on the extension lines of the extended hands and arms of the users UA to UC on the basis of the viewpoint positions PVA to PVC of the users UA to UC, the positions of the hands HA to HC, and the positions of the upper limbs. Accordingly, the users UA to UC can intuitively identify the first objects Obj1A to Obj1C corresponding to their hands. In other words, even in a case in which a plurality of users performs operations on the screen in the single display region 41, each user can immediately understand the operation object corresponding to his/her hand.
Incidentally, in a case in which a plurality of users performs operations at the same time, depending on the operation, the first objects may be closely arranged or overlapped, or the second objects extending toward the first objects may be closely arranged or overlapped. In this case, it may be difficult for the respective users to identify the first objects corresponding to their hands. Further, the close arrangement or the overlap of the second objects may even make it difficult to perform the operation on the operation target in the screen displayed in the display region 41 in the first place.
In this regard, the second display control unit 103 according to the present embodiment can control the display of the second object on the basis of a positional relation of a plurality of users. Accordingly, confusion between the operation objects is avoided, and display of content can be prevented from being hindered.
The second display control unit 103 may control the display position of the second object, for example, on the basis of a positional relation of a plurality of users.
Further, in the example illustrated in
Further, the present technology is not limited to the examples illustrated in
Further, the “positional relation of the users” in the present disclosure may include, for example, an arrangement of the users for the display region 41. As described above, the second display control unit 103 may control the display of the second object on the basis of the arrangement of a plurality of users for the display region 41. This arrangement may be an arrangement in a direction parallel to the plane of the display region 41 as illustrated in the example of
For example, in a case in which the users are arranged in the depth direction with respect to the display region 41 and the first objects corresponding to the users are displayed close to each other, the second objects may also be closely arranged or overlapped. Therefore, in a case in which a plurality of users is arranged in the depth direction with respect to the display region 41, the second display control unit 103 may appropriately adjust the display positions or the like of the second objects on the basis of the display positions or the like of the first objects. For example, as will be described later, the second display control unit 103 may adjust the height of the second object in the VR space and the inclination angle of the second object in accordance with the distance between the display region 41 and each user. Accordingly, even when the second objects corresponding to the users are closely arranged or overlapped on the plane of the display region 41, each user can identify the second object corresponding to him/herself.
Further, the “positional relation of the users” may include, for example, the density of the users. In other words, the second display control unit 103 may control the display of the second objects on the basis of the density of the users. The density of the users means, for example, the number of users who are located in a region of a predetermined size.
For example, in a case in which the density of the users is high, it is considered that the second objects are also likely to be closely arranged. In this regard, the second display control unit 103 may control the display positions of the second objects corresponding to the users belonging to a group in which the density of the users is high. Accordingly, it is possible to eliminate the close arrangement of the second objects and prevent confusion between the first objects corresponding to the hands of the users. The density of the users may be estimated, for example, on the basis of a position of the operating body detecting device 20 or may be estimated by positions of a plurality of users detected by the object detecting device 30.
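A minimal sketch of one possible density estimate follows, counting the users located inside a circular region of a predetermined radius; the circular shape of the region and the function name are assumptions for illustration.

```python
def user_density(user_positions, center, radius):
    """Number of users located within a circular region of a predetermined size.

    `user_positions` is a list of (x, y) floor positions of the users, for
    example as detected by the object detecting device 30.
    """
    return sum(
        (x - center[0]) ** 2 + (y - center[1]) ** 2 <= radius ** 2
        for (x, y) in user_positions
    )
```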
Further, in the example illustrated in
Further, the second display control unit 103 may control the display form of the second objects, for example, on the basis of the positional relation of a plurality of users. Here, the display form can include, for example, the size and the shape of display of the operation object, the height in the VR space, the display effect (including a sequential change or the like), and the like as described in the first embodiment.
In this regard, the second display control unit 103 may control the heights of the second objects Obj2A to Obj2C in the VR space displayed in the display region 41 on the basis of the positional relation of the users. For example, as illustrated in
Further, in the example illustrated in
In this regard, the second display control unit 103 may control the sizes of the second objects Obj2A to Obj2C on the basis of the positional relation of the users. For example, as illustrated in
Further, regarding the size of the second object, the second object corresponding to a user closer to the display region 41 may be larger, and the second object corresponding to a user far from the display region 41 may be smaller. Here, in consideration of consistency with the control of the size of the first object based on the relation of the viewpoint position of the user, the position of the hand, and the position of the display region 41, it is desirable to perform control such that the size of the second object is increased as the distance between the display region 41 and the user increases.
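As a sketch of these two display-form controls, the following function maps the distance between the display region 41 and a user to a height of the second object in the VR space and to a display size, both increasing with distance as described above. The linear mappings and gain values are assumptions made purely for illustration.

```python
def second_object_height_and_size(distance, base_height=0.0, base_size=1.0,
                                  height_gain=0.5, size_gain=0.2):
    """Map a user's distance from the display region 41 to a display form.

    Returns the height of the second object in the VR space and its display
    size; both grow linearly with the distance in this sketch.
    """
    height = base_height + height_gain * distance    # farther user -> higher second object
    size = base_size * (1.0 + size_gain * distance)  # farther user -> larger second object
    return height, size
```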
In this regard, the second display control unit 103 may perform control such that the display of the second objects Obj2A to Obj2C intersecting with each other is made transparent. In this case, for example, the second display control unit 103 may cause the second objects Obj2A to Obj2C to become transparent after causing them to be displayed without change for a certain period of time. Accordingly, it is possible to intuitively recognize the first objects Obj1A to Obj1C corresponding to the hands of the users, and it is possible to prevent display of content from being hindered. In the case of causing the second objects Obj2A to Obj2C to become transparent after a certain period of time elapses, the second display control unit 103 may cause them to become transparent instantaneously or gradually over a predetermined period of time.
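The gradual change to transparency can be sketched as an opacity curve over elapsed time, assuming the opacity is recomputed every frame from the time at which the intersection of the second objects was detected; the hold and fade durations are assumed values.

```python
def second_object_alpha(elapsed, hold=2.0, fade=1.0):
    """Opacity of an intersecting second object as a function of elapsed seconds.

    The object is displayed without change for `hold` seconds and then fades
    from opaque (1.0) to fully transparent (0.0) over `fade` seconds.
    Setting `fade` close to zero approximates the instantaneous case.
    """
    if elapsed <= hold:
        return 1.0
    if elapsed >= hold + fade:
        return 0.0
    return 1.0 - (elapsed - hold) / fade
```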
Further, in the example illustrated in
Further, in a case in which a plurality of second objects is displayed in a superimposed manner, the second display control unit 103 may cause a second object serving as an upper layer to be transparent. Accordingly, since the second object serving as a lower layer is displayed, it is possible to prevent the operability of the user corresponding to the lower-layer second object from being lowered.
The display control examples for the second object based on the positional relation between a plurality of users by the second display control unit 103 according to the present embodiment have been described above. Further, the examples illustrated in
Next, the hardware configuration of an information processing apparatus 900 according to an embodiment of the present disclosure is described with reference to
The information processing apparatus 900 includes a central processing unit (CPU) 901, read-only memory (ROM) 903, and random-access memory (RAM) 905. In addition, the information processing apparatus 900 may include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input apparatus 915, an output apparatus 917, a storage apparatus 919, a drive 921, a connection port 925, and a communication apparatus 929. In conjunction with, or in place of, the CPU 901, the information processing apparatus 900 may have a processing circuit called a digital signal processor (DSP) or application specific integrated circuit (ASIC).
The CPU 901 functions as an arithmetic processing unit and a control unit, and controls the whole operation in the information processing apparatus 900 or a part thereof in accordance with various programs recorded in the ROM 903, the RAM 905, the storage apparatus 919, or a removable recording medium 923. The ROM 903 stores programs, operation parameters, or the like used by the CPU 901. The RAM 905 temporarily stores programs used in the execution by the CPU 901, parameters that vary as appropriate in the execution, or the like. For example, the CPU 901, the ROM 903, and the RAM 905 may realize the functions of the control unit 100 in the foregoing embodiment. The CPU 901, the ROM 903, and the RAM 905 are connected with each other via the host bus 907 that includes an internal bus such as a CPU bus. Furthermore, the host bus 907 is connected to the external bus 911 such as peripheral component interconnect/interface (PCI) bus via the bridge 909.
The input apparatus 915 is, in one example, an apparatus operated by a user, such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever. The input apparatus 915 may be, in one example, a remote control apparatus using infrared rays or other radio waves, or may be externally connected equipment 927 such as a cellular phone that supports the operation of the information processing apparatus 900. The input apparatus 915 includes an input control circuit that generates an input signal on the basis of the information input by the user and outputs it to the CPU 901. The user operates the input apparatus 915 to input various data to the information processing apparatus 900 and to instruct the information processing apparatus 900 to perform a processing operation.
The output apparatus 917 includes an apparatus capable of notifying visually or audibly the user of the acquired information. The output apparatus 917 may be a display apparatus such as a liquid crystal display (LCD), a plasma display panel (PDP), and an organic electro-luminescence display (OELD), an audio output apparatus such as a speaker and a headphone, as well as printer apparatus or the like. The output apparatus 917 outputs the result obtained by the processing of the information processing apparatus 900 as a video such as a text or an image, or outputs it as audio such as a speech or sound.
The storage apparatus 919 is a data storage apparatus configured as an example of a storage portion of the information processing apparatus 900. The storage apparatus 919 includes, in one example, a magnetic storage unit device such as hard disk drive (HDD), a semiconductor storage device, an optical storage device, and a magneto-optical storage device. The storage apparatus 919 stores programs executed by the CPU 901, various data, various types of data obtained from the outside, and the like.
The drive 921 is a reader-writer for a removable recording medium 923 such as a magnetic disk, an optical disk, a magneto-optical disk, and a semiconductor memory, and is incorporated in the information processing apparatus 900 or externally attached thereto. The drive 921 reads the information recorded on the loaded removable recording medium 923 and outputs it to the RAM 905. In addition, the drive 921 writes records to the loaded removable recording medium 923. At least one of the storage apparatus 919, or the drive 921 together with the removable recording medium 923, may realize the functions of the storage unit 120 in the foregoing embodiment.
The connection port 925 is a port for directly connecting equipment to the information processing apparatus 900. The connection port 925 may be, in one example, a Universal Serial Bus (USB) port, an IEEE 1394 port, or a Small Computer System Interface (SCSI) port. In addition, the connection port 925 may be, in one example, an RS-232C port, an optical audio terminal, or a High-Definition Multimedia Interface (HDMI, registered trademark) port. The connection of the externally connected equipment 927 to the connection port 925 makes it possible to exchange various kinds of data between the information processing apparatus 900 and the externally connected equipment 927.
The communication apparatus 929 is, in one example, a communication interface including a communication device or the like used for connecting to the communication network NW. The communication apparatus 929 may be, in one example, a communication card for wired or wireless local area network (LAN), Bluetooth (registered trademark), or wireless USB (WUSB). In addition, the communication apparatus 929 may be, in one example, a router for optical communication, a router for asymmetric digital subscriber line (ADSL), or a modem for various communications. The communication apparatus 929 transmits and receives signals or the like to and from the Internet or other communication equipment using a predetermined protocol such as TCP/IP, in one example. In addition, the communication network NW connected to the communication apparatus 929 is a network connected by wire or wirelessly, and is, in one example, the Internet, a home LAN, infrared communication, radio wave communication, satellite communication, or the like. Note that at least one of the connection port 925 and the communication apparatus 929 may realize the functions of the communication unit 110 in the foregoing embodiment.
The above illustrates one example of a hardware configuration of the information processing apparatus 900.
The preferred embodiment(s) of the present disclosure has/have been described above with reference to the accompanying drawings, whilst the present disclosure is not limited to the above examples. A person skilled in the art may find various alterations and modifications within the scope of the appended claims, and it should be understood that they will naturally come under the technical scope of the present disclosure.
Further, in the above embodiments, VR content using the VR space has been treated as the application target of the present technology, but the present technology is not limited to this example. For example, the screen displayed in the display region may be a screen of an augmented reality (AR) space used for AR content, or may be a screen displaying arbitrary video content such as a video game, a moving image, or a still image expressed by a two-dimensional video. Further, the screen displayed in the display region may be a screen realizing an interface or the like provided for an operation for using arbitrary content in addition to the above content. In other words, the present technology can be applied to any content, interface, or the like in which an operation is performed on the screen displayed in the display region and an output responsive to the operation is reflected on the screen.
Note that each of the steps in the processes of the display control apparatus in this specification is not necessarily required to be processed in a time series following the sequence described as a flowchart. For example, each of the steps in the processes of the display control apparatus may be processed in a sequence that differs from the sequence described herein as a flowchart, and furthermore may be processed in parallel.
Additionally, it is possible to create a computer program for causing hardware such as a CPU, ROM, and RAM built into a display control apparatus to exhibit functions similar to each component of the display control apparatus described above. In addition, a readable recording medium storing the computer program is also provided.
Further, the effects described in this specification are merely illustrative or exemplified effects, and are not limitative. That is, with or in the place of the above effects, the technology according to the present disclosure may achieve other effects that are clear to those skilled in the art from the description of this specification.
Additionally, the present technology may also be configured as below.
(1)
A display control apparatus, including:
a first display control unit configured to control display of a first object corresponding to an operating body in a display region on the basis of a position of the operating body and a viewpoint position of a user; and
a second display control unit configured to control display of a second object arranged toward a display position of the first object in the display region.
(2)
The display control apparatus according to (1), in which the second display control unit controls the display of the second object on the basis of a position of the user in a space in which the display region is installed.
(3)
The display control apparatus according to (2), in which the second object is an object corresponding to a supporting body which is associated with the user and supports the operating body, and
the control of the display of the second object includes control of the display of the second object based on a position of the supporting body.
(4)
The display control apparatus according to (3), in which the second display control unit controls the display of the second object on the basis of a relation between a display position of the first object in the display region and the position of the supporting body.
(5)
The display control apparatus according to (3) or (4), in which the position of the supporting body is estimated on the basis of information obtained by detecting a skeleton of the user.
(6)
The display control apparatus according to any one of (2) to (5), in which the position of the user in the space is identical to the position of the operating body.
(7)
The display control apparatus according to any one of (2) to (6), in which the second display control unit controls the display of the second object on the basis of a positional relation of a plurality of the users.
(8)
The display control apparatus according to (7), in which the positional relation of the plurality of the users includes a density of the users.
(9)
The display control apparatus according to (7) or (8), in which the positional relation of the plurality of the users includes an arrangement of the users.
(10)
The display control apparatus according to any one of (1) to (9), in which the second display control unit controls the display of the second object on the basis of a density of the first object.
(11)
The display control apparatus according to any one of (1) to (10), in which the control of the display of the second object includes control of a display position of the second object.
(12)
The display control apparatus according to any one of (1) to (11), in which the control of the display of the second object includes control of a display form of the second object.
(13)
The display control apparatus according to (12), in which the control of the display form of the second object includes control of a height of the second object in a virtual space displayed in the display region.
(14)
The display control apparatus according to (12) or (13), in which the control of the display form of the second object includes control of a size of the second object.
(15)
The display control apparatus according to any one of (1) to (14), in which the second object is displayed to extend from a contour of the display region.
(16)
The display control apparatus according to any one of (1) to (15), in which the viewpoint position of the user is estimated on the basis of information obtained by detecting a skeleton of the user.
(17)
The display control apparatus according to any one of (1) to (16), in which the first display control unit controls a size of the first object on the basis of a relation of the viewpoint position of the user, the position of the operating body, and a position of the display region.
(18)
The display control apparatus according to any one of (1) to (17), in which a screen related to a virtual reality (VR) space is displayed in the display region, and the display of the screen is controlled on the basis of the position of the operating body and the viewpoint position of the user.
(19)
A display control method, including: controlling, by a processor, display of a first object corresponding to an operating body in a display region on the basis of a position of the operating body and a viewpoint position of a user; and
controlling, by the processor, display of a second object arranged toward a display position of the first object in the display region.
(20)
A program causing a computer to function as:
a first display control unit configured to control display of a first object corresponding to an operating body in a display region on the basis of a position of the operating body and a viewpoint position of a user; and
a second display control unit configured to control display of a second object arranged toward a display position of the first object in the display region.
Priority Application: JP 2016-204951, filed October 2016 (national).
International Filing: PCT/JP2017/030145, filed August 23, 2017 (WO).