Field of the Invention
The present invention relates to a technique for presenting Mixed Reality (MR).
Description of the Related Art
Recently, studies of MR, which aims at a seamless combination of a physical space and a virtual space, have been conducted actively. An image display device which presents mixed reality is, for example, a device which superimposes, on an image of a physical space sensed by an image sensing device such as a video camera, an image in a virtual space (for example, a virtual object or character information rendered by computer graphics) generated in accordance with the position and orientation of the image sensing device, and displays the resulting composite image. As such a device, for example, a Head Mounted Display (HMD) is usable (e.g., see Japanese Patent Application No. 2006-320872, Japanese Patent Application No. 2010-189458, and Japanese Patent Laid-Open No. 2007-172596). The image display device can also be implemented by an optical see-through method which displays, on an optical see-through display mounted on the head of a user, an image in a virtual space generated in accordance with the position and orientation of the viewpoint of the user. A method is also disclosed that uses markers to calculate the position and orientation of an image sensing device from an image of the physical space acquired by the image sensing device (e.g., see Japanese Patent Application No. 2006-320872 and Japanese Patent Laid-Open No. 2007-172596).
When building a system which obtains the position and orientation of the viewpoint by using an image of the physical space including markers, and which enables a user to experience MR by using the obtained position and orientation of the viewpoint, a user having little knowledge of MR does not know which arrangement pattern of markers stabilizes calculation of the position and orientation of the viewpoint.
The present invention has been made in consideration of the above-described problem, and provides a technique for creating information for notifying a user of an arrangement pattern of indices that stabilizes calculation of the position and orientation of the viewpoint.
According to an aspect of an exemplary embodiment, an information processing apparatus includes: an acquisition unit configured to acquire information of a position at which a user can experience mixed reality in a physical space; a generation unit configured to generate an arrangement pattern of indices in the physical space used to calculate a position and orientation of a viewpoint of the user based on the information of the position; and an output unit configured to output the arrangement pattern generated by the generation unit.
According to another aspect of an exemplary embodiment, an information processing method includes: acquiring information of a position at which a user can experience mixed reality in a physical space; generating an arrangement pattern of indices in the physical space used to calculate a position and orientation of a viewpoint of the user based on the information of the position; and outputting the generated arrangement pattern.
Further features will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Exemplary embodiments are described below with reference to the accompanying drawings.
The first embodiment will explain one aspect of an information processing apparatus. This information processing apparatus generates (first generation) a plurality of types of arrangement pattern candidates. Each arrangement pattern candidate is a candidate for the arrangement pattern of a plurality of indices in a physical space used to calculate the position and orientation of the viewpoint, and enables observation of at least a predetermined number of indices from a position in an area where mixed reality can be experienced in the physical space. Then, the information processing apparatus generates (second generation) and outputs information representing the arrangement pattern of a plurality of indices in the physical space by using the generated arrangement pattern candidates.
First, an example of the functional arrangement of the information processing apparatus according to the first embodiment will be described with reference to the accompanying block diagram. The information processing apparatus includes a main body unit 1000, an operation unit 1010, and a display unit 1080.
The display unit 1080 will be explained first. The display unit 1080 is constituted by a CRT, a liquid crystal screen, or the like, and displays images, characters, and the like based on data output from the main body unit 1000.
The operation unit 1010 will be explained next. The operation unit 1010 is constituted by a keyboard, mouse, or the like. The user can operate the operation unit 1010 to input various instructions to the main body unit 1000.
The main body unit 1000 will be explained next. The main body unit 1000 includes a data storage unit 1020, an arrangement information generation unit 1030, an arrangement information output unit 1040, and an image output unit 1070.
The data storage unit 1020 stores MR experience information input by operating the operation unit 1010 by the user. The MR experience information is information used to decide a plurality of types of arrangement pattern candidates that enable observation of at least a predetermined number of indices from a position in an area where mixed reality can be experienced in a physical space. Details of the MR experience information will be described later.
By using the MR experience information stored in the data storage unit 1020, the arrangement information generation unit 1030 decides a plurality of types of arrangement pattern candidates that enable observation of a predetermined number or more of indices from a position in an area where mixed reality can be experienced in a physical space. The arrangement information generation unit 1030 stores the respective decided arrangement pattern candidates as pieces of index arrangement information in the data storage unit 1020.
By using the pieces of index arrangement information stored in the data storage unit 1020, the arrangement information output unit 1040 generates information representing the arrangement pattern of a plurality of indices in a physical space. The arrangement information output unit 1040 stores the generated information in the data storage unit 1020.
Based on the information stored in the data storage unit 1020 by the arrangement information output unit 1040, the image output unit 1070 generates an image or digital document representing the arrangement pattern of a plurality of indices in the physical space, and outputs it to the display unit 1080.
To experience mixed reality, the user mounts a head mounted display on his head. As is well known, the head mounted display includes a display unit, and a camera which senses a physical space. In the example used in the following description, the physical space is represented by a cube 2000, indices are arranged on a wall 2010 of that physical space, and the camera of the head mounted display is a camera 2090 having a field 2040 of view.
A cube 2050 represents an area where mixed reality can be experienced in the physical space represented by the cube 2000. As long as the head mounted display is located within the cube 2050, a user who mounts the head mounted display can experience mixed reality. In this example, the camera 2090 is located within the cube 2050.
As described above, the camera 2090 is movable in the area represented by the cube 2050. In accordance with the position of the camera 2090, the appearance of indices arranged on the wall 2010 (in particular, the number of indices falling within the field 2040 of view) changes. If a sufficient number of indices do not fall within the field 2040 of view, the number of indices appearing on a sensed image becomes insufficient, reducing the position/orientation calculation accuracy.
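To make the relationship between viewing distance and the number of visible indices concrete, the following minimal sketch estimates how many indices fit across the field of view of a camera squarely facing the wall. The 60 degree angle of view and 0.2 m index size are illustrative assumptions, not values taken from the embodiment:

```python
import math

def indices_across_view(distance_m: float, fov_deg: float, index_size_m: float) -> int:
    """Estimate how many indices of the given size fit across the camera's
    horizontal field of view when the camera squarely faces the wall."""
    visible_width = 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)
    return int(visible_width // index_size_m)

# A 60-degree camera 2 m from the wall sees a strip about 2.31 m wide,
# so eleven indices 0.2 m on a side fit across it.
print(indices_across_view(2.0, 60.0, 0.2))  # -> 11
```

As the camera approaches the wall, the visible strip narrows and the count drops, which is exactly the failure mode described above.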
In the embodiment, therefore, the information processing apparatus described above generates information representing an arrangement pattern of indices that allows at least a predetermined number of indices to be observed from positions in the area where mixed reality can be experienced. The processing performed by the information processing apparatus is described below step by step.
In step S201, the user inputs MR experience information by operating the operation unit 1010. The control unit (not shown) of the information processing apparatus stores the MR experience information in the data storage unit 1020. As described above, the MR experience information is information used to decide a plurality of types of arrangement pattern candidates that enable observation of at least a predetermined number of indices from a position in an area where mixed reality can be experienced in a physical space. Thus, any information may be used as long as it serves this purpose. For example, assume that the MR experience information contains the following kinds of information:
The physical space information is, for example, information which defines a physical space in an area observable by a user who experiences mixed reality, in a world coordinate system set in the physical space (a coordinate system in which one point in the physical space is set as the origin and three axes perpendicular to each other at the origin are defined as the x-, y- and z-axes). In the example described above, the physical space information defines the wall 2010 of the cube 2000 in the world coordinate system.
The area information is, for example, information representing an area where mixed reality can be experienced in the world coordinate system.
In addition to these pieces of information, the MR experience information may contain pieces of information such as the camera resolution, lens distortion information, focal length information, and information which defines a region where a virtual object is to be arranged.
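The embodiment does not prescribe a concrete data format for the MR experience information. The following sketch shows one hypothetical way to hold it; all field names and default values are illustrative assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class MRExperienceInfo:
    """Hypothetical container for the MR experience information of step S201."""
    # Physical space information: the wall as an axis-aligned box
    # (min corner, max corner) in the world coordinate system, in meters.
    wall_min: tuple[float, float, float]
    wall_max: tuple[float, float, float]
    # Area information: the box in which mixed reality can be experienced.
    area_min: tuple[float, float, float]
    area_max: tuple[float, float, float]
    # Optional camera information.
    resolution: tuple[int, int] = (1920, 1080)
    focal_length_mm: float = 4.0
    lens_distortion: list = field(default_factory=list)
```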
In step S202, the arrangement information generation unit 1030 reads out the MR experience information from the data storage unit 1020. The arrangement information generation unit 1030 then obtains the positional relationship between a portion at which indices are arranged (the wall 2010 in this example) and the area where mixed reality can be experienced (the cube 2050), from the physical space information and the area information contained in the MR experience information.

The processing in step S202 will be explained with reference to the example described above. The arrangement information generation unit 1030 obtains, as a distance d1, the maximum distance between the portion at which indices are arranged and the area where mixed reality can be experienced, and obtains, as a distance d2, the minimum distance between them.
Note that the distances d1 and d2 are not limited to those obtained by the above-mentioned method. For example, maximum and minimum distances input by the user by operating the operation unit 1010 may be set as d1 and d2, respectively.
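As a minimal sketch, under the simplifying assumption that the wall lies in the plane z = 0 of the world coordinate system and the experience area is an axis-aligned box, d1 and d2 can be computed as follows:

```python
def wall_to_area_distances(area_min_z: float, area_max_z: float) -> tuple[float, float]:
    """Return (d1, d2): the maximum and minimum distances between the wall,
    assumed to lie in the plane z = 0, and the experience area, assumed to
    span area_min_z..area_max_z along the wall normal (the z-axis)."""
    d1 = max(abs(area_min_z), abs(area_max_z))  # farthest point of the area
    d2 = min(abs(area_min_z), abs(area_max_z))  # nearest point of the area
    return d1, d2

d1, d2 = wall_to_area_distances(1.0, 4.0)  # area extends 1 m to 4 m from the wall
```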
Referring back to the processing sequence, in step S203, the arrangement information generation unit 1030 sets the value of a variable D to d1, and obtains the size S of a rectangle serving as an index by calculating S=M×D, where M is a predetermined coefficient.
By using the physical space information contained in the MR experience information, the arrangement information generation unit 1030 specifies a portion at which indices are arranged. The arrangement information generation unit 1030 arranges rectangles each having the size S side by side at this portion, thereby arranging a rectangle array (for example, a rectangle array 10000). The arrangement information generation unit 1030 stores information of this rectangle array as a piece of index arrangement information in the data storage unit 1020.
In step S204, the arrangement information generation unit 1030 subtracts a predetermined value from the value of the variable D, thereby decreasing the value of the variable D by the predetermined value.
In step S205, the arrangement information generation unit 1030 determines whether the value of the variable D is equal to or smaller than d2. If the arrangement information generation unit 1030 determines that the value of the variable D is equal to or smaller than d2, the process advances to step S209. If the arrangement information generation unit 1030 determines that the value of the variable D is larger than d2, the process advances to step S207.
In step S207, the arrangement information generation unit 1030 determines whether a predetermined number (for example, four) or more of rectangles fall within the field of view of the viewpoint in the currently set rectangle array when the viewpoint is set at a position spaced apart by the distance D from the portion at which indices are arranged. If the arrangement information generation unit 1030 determines that the predetermined number or more of rectangles fall within the field of view of the viewpoint, the process advances to step S204. If the arrangement information generation unit 1030 determines that the predetermined number or more of rectangles do not fall within the field of view of the viewpoint, the process advances to step S208.
In step S208, the arrangement information generation unit 1030 obtains the size S of the rectangle by calculating S=M×D, as in step S203 described above. By using the physical space information contained in the MR experience information, the arrangement information generation unit 1030 specifies a portion at which indices are arranged. The arrangement information generation unit 1030 arranges rectangles each having the size S side by side at this portion, instead of the previously arranged array, thereby arranging a rectangle array (for example, a rectangle array 11000). The arrangement information generation unit 1030 stores information of this rectangle array as another piece of index arrangement information in the data storage unit 1020, and the process returns to step S204.
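Steps S203 to S208 can be summarized in the following sketch. The coefficient M, the decrement step of D, and the simplified visibility test are hypothetical stand-ins for details the embodiment leaves open, and the camera is assumed to squarely face the wall:

```python
import math

def count_in_view(distance, fov_deg, index_size, wall_width):
    """Number of indices of size index_size visible across the wall
    when the viewpoint faces the wall at the given distance."""
    visible = min(2.0 * distance * math.tan(math.radians(fov_deg) / 2.0), wall_width)
    return int(visible // index_size)

def generate_candidates(d1, d2, fov_deg, wall_width, m=0.1, step=0.1, required=4):
    """Steps S203-S208: emit one rectangle size per candidate array."""
    candidates = []
    D = d1
    S = m * D                      # step S203: S = M x D at the farthest distance
    candidates.append(S)           # first piece of index arrangement information
    while True:
        D -= step                  # step S204: bring the viewpoint closer
        if D <= d2:                # step S205: nearest distance reached
            break                  # -> step S209
        if count_in_view(D, fov_deg, S, wall_width) < required:  # step S207
            S = m * D              # step S208: smaller rectangles for this distance
            candidates.append(S)
    return candidates

sizes = generate_candidates(d1=4.0, d2=1.0, fov_deg=60.0, wall_width=5.0)
```

Note that recomputing S as M×D at the current distance restores roughly the same number of indices in view (about 2·tan(θ/2)/M across the field of view, independent of D), which is what makes each candidate array suitable for its own band of viewing distances.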
In step S209, the arrangement information output unit 1040 reads out pieces of index arrangement information stored in the data storage unit 1020. By using the readout pieces of index arrangement information, the arrangement information output unit 1040 generates information representing the arrangement pattern of a plurality of indices in the physical space. As processing to be performed in this step, various processes are conceivable. An example of the processing will be explained below.
For descriptive convenience, a case in which information representing the arrangement pattern of a plurality of indices in the physical space is generated by using two types of index arrangement information will be explained. Even when three or more pieces of index arrangement information are used, the following processing is applied in the same way.
First, a composite arrangement pattern is generated by superimposing an arrangement pattern candidate represented by one piece of index arrangement information and an arrangement pattern candidate represented by the other piece of index arrangement information.
When the composite arrangement pattern 12000 is generated in this way, it is determined whether rectangles contained in the composite arrangement pattern overlap each other.

As processing to be performed when it is determined that rectangles overlap each other, various processes are conceivable. For example, when A % or more of the area of one rectangle (the rectangle 13001) is contained in the other rectangle (the rectangle 13002), one of the overlapping rectangles, for example, the contained rectangle 13001, may be deleted from the composite arrangement pattern.

Alternatively, when it is determined that rectangles overlap each other, for example, when A % or more of the area of one rectangle (a rectangle 14001) is contained in the other rectangle (a rectangle 14003), one of the rectangles may be moved to a position at which the rectangles no longer overlap each other.
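The overlap determination reduces to computing, for two axis-aligned rectangles on the wall plane, what fraction of one rectangle's area lies inside the other. A minimal sketch, in which the threshold A and the delete-first policy are illustrative assumptions:

```python
def containment_ratio(r1, r2):
    """Fraction of rectangle r1's area contained in rectangle r2.
    Rectangles are (x_min, y_min, x_max, y_max) on the wall plane."""
    ix = max(0.0, min(r1[2], r2[2]) - max(r1[0], r2[0]))
    iy = max(0.0, min(r1[3], r2[3]) - max(r1[1], r2[1]))
    area1 = (r1[2] - r1[0]) * (r1[3] - r1[1])
    return (ix * iy) / area1 if area1 > 0 else 0.0

def resolve_overlaps(rects, a_percent=50.0):
    """Keep a rectangle only if less than a_percent of its area lies inside
    any already-kept rectangle (the 'delete' variant described above);
    a 'move' variant would translate the rectangle to a free position."""
    kept = []
    for r in rects:
        if all(containment_ratio(r, k) * 100.0 < a_percent for k in kept):
            kept.append(r)
    return kept
```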
When the wall 2010 has an outlet, a region where no index can be physically arranged, or a region 17000 where no index is to be arranged, a rectangle overlapping such a region may be moved to a position outside the region or deleted.
It is also possible to include, in advance in the physical space information or the MR experience information, a position where a virtual object is to be displayed, and to move a rectangle so as to arrange it at that position. In this fashion, when it is determined that rectangles overlap each other, the composite arrangement pattern may be edited appropriately. Depending on the application, the composite arrangement pattern need not be edited. Alternatively, the composite arrangement pattern may be displayed on the display unit 1080 to prompt the user to edit it. In this case, the user edits the composite arrangement pattern by operating the operation unit 1010 to move or delete one or more rectangles.
In step S210, the arrangement information output unit 1040 stores the edited or unedited composite arrangement pattern in the data storage unit 1020. The composite arrangement pattern may be stored in an arbitrary format in the data storage unit 1020, for example, a digital document format such as PDF or XPS, or an image format such as JPEG or BMP. Note that the composite arrangement pattern contains data which defines each rectangle contained in it, such as the size and three-dimensional position of each rectangle.
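As one hypothetical realization of this storage step, the composite arrangement pattern can be rasterized to an image file, here using the Pillow imaging library; the scale factor and file name are illustrative:

```python
from PIL import Image, ImageDraw

def save_pattern_image(rects, wall_w_m, wall_h_m, path="pattern.png", px_per_m=200):
    """Rasterize the composite arrangement pattern: each rectangle is
    (x_min, y_min, x_max, y_max) in meters on the wall plane."""
    img = Image.new("RGB", (int(wall_w_m * px_per_m), int(wall_h_m * px_per_m)), "white")
    draw = ImageDraw.Draw(img)
    for x0, y0, x1, y1 in rects:
        draw.rectangle([x0 * px_per_m, y0 * px_per_m, x1 * px_per_m, y1 * px_per_m],
                       outline="black", width=3)
    img.save(path)
```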
By executing the above-described processing, information (the composite arrangement pattern) representing the arrangement pattern of a plurality of indices in the physical space is registered in the data storage unit 1020.
The image output unit 1070 reads out the composite arrangement pattern registered in the data storage unit 1020, and displays it as an image or digital document on the display unit 1080. At this time, the image output unit 1070 may also display, on the display unit 1080, the three-dimensional coordinate position of each rectangle contained in the composite arrangement pattern. The contents displayed on the display unit 1080 are not limited to these, and an interface for experiencing mixed reality may further be displayed.
Note that the portion at which indices are arranged is not limited to the wall, and various portions in a physical space are conceivable. In the embodiment, the user mounts a head mounted display on his head in order to experience mixed reality. However, mixed reality can be experienced even by using an HHD (Hand Held Display), tablet, smartphone, or the like.
In the first embodiment, the display unit 1080 displays an image or digital document based on a composite arrangement pattern. However, the output destination of an image or digital document based on a composite arrangement pattern is not limited to the display unit 1080. The second embodiment will describe an information processing apparatus capable of causing a printing device to print an image or digital document based on a composite arrangement pattern.
An example of the functional arrangement of the information processing apparatus according to the second embodiment will be described with reference to the accompanying block diagram. A main body unit 1900 according to this embodiment is obtained by adding an arrangement information printing unit 18010 to the main body unit 1000 of the first embodiment, and a printing device 18020 is connected to the main body unit 1900.
The printing device 18020 is connected to the main body unit 1900 directly or indirectly so as to be able to perform data communication with the main body unit 1900. The connection may be wired or wireless.
The arrangement information printing unit 18010 reads out a composite arrangement pattern registered in the data storage unit 1020, generates print data of the composite arrangement pattern, and sends the generated print data to the printing device 18020. The printing device 18020 prints an image, character, or the like on a printing medium such as paper in accordance with the print data.
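A minimal sketch of this printing path, assuming a Unix-like host where the lpr spooler command is available; the image file is the one produced by the rasterization sketch above, and this is an illustration rather than the embodiment's actual print-data format:

```python
import subprocess

def print_pattern(path="pattern.png", printer=None):
    """Send the rendered composite arrangement pattern to the system print
    spooler, corresponding to handing print data to the printing device 18020."""
    cmd = ["lpr"] + (["-P", printer] if printer else []) + [path]
    subprocess.run(cmd, check=True)
```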
In the first embodiment, the user inputs physical space information by operating the operation unit 1010. However, a three-dimensional measurement device or the like may be used to measure three-dimensional information of a physical space in an area observable by a user who experiences mixed reality, and the measurement result may be acquired as physical space information.
In the first embodiment, the user inputs physical space information by operating the operation unit 1010. After that, the user may appropriately edit the physical space information by operating the operation unit 1010. Depending on the application, the input physical space information may be converted into, for example, physical space information representing a more simplified physical space. That is, input physical space information may be directly used, or may be modified and used in subsequent processing.
Although the respective functional units constituting the main body unit 1000 described above may be constituted by hardware, they may also be implemented by software (computer programs). In this case, a computer which executes the software can function as the main body unit 1000.

An example of the hardware arrangement of a computer applicable to the main body unit 1000 will be described below.
A CPU 15001 controls the operation of the overall computer by executing processing using computer programs and data stored in a RAM 15002 or ROM 15003. In addition, the CPU 15001 executes each processing described above as being performed by the main body unit 1000.
The RAM 15002 has an area for storing computer programs and data loaded from an external storage device 15007 or storage medium drive 15008, and computer programs and data externally received via an I/F (interface) 15009. Further, the RAM 15002 has a work area used when the CPU 15001 executes various processes. That is, the RAM 15002 can properly provide various areas.
The ROM 15003 stores setting data of the computer, a boot program, and the like.
The external storage device 15007 is a mass information storage device typified by a hard disk drive. The external storage device 15007 saves an OS (Operating System), and computer programs and data for causing the CPU 15001 to execute each processing described above as being performed by the main body unit 1000. The computer programs and data saved in the external storage device 15007 are loaded into the RAM 15002 as needed under the control of the CPU 15001.
The storage medium drive 15008 reads out computer programs and data recorded on a storage medium such as a CD-ROM or DVD-ROM, and outputs them to the RAM 15002 or external storage device 15007. Some of the computer programs and data described to be saved in the external storage device 15007 may be recorded on this storage medium.
The I/F 15009 is constituted by one or more interfaces for connecting an external device to the computer. An operation unit 1010, display unit 1080, and printing device 18020 described above are connected to the I/F 15009. For example, the I/F 15009 is constituted by a digital input/output port such as a USB for connecting the printing device 18020, and a DVI or HDMI port for connecting the display unit 1080. The above-mentioned head mounted display may be connected to the I/F 15009. The respective units described above are connected to a bus 15010.
Embodiments can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions recorded on a storage medium (e.g., non-transitory computer-readable storage medium) to perform the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more of a central processing unit (CPU), micro processing unit (MPU), or other circuitry, and may include a network of separate computers or separate computer processors. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
References Cited

U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---
8249361 | Steffens | Aug 2012 | B1
20060071945 | Anabuki | Apr 2006 | A1
20060071946 | Anabuki | Apr 2006 | A1
20090022369 | Satoh | Jan 2009 | A1
20100017407 | Beniyama et al. | Jan 2010 | A1
20100026714 | Utagawa | Feb 2010 | A1
20100265164 | Okuno | Oct 2010 | A1

Foreign Patent Documents

Number | Date | Country
---|---|---
2007-172596 | Jul 2007 | JP
2008-134161 | Jun 2008 | JP
2012-048456 | Mar 2012 | JP

Other Publications

WIPO, International Search Report and Written Opinion of the International Searching Authority for PCT/JP2015/003911, Oct. 13, 2015.
Bajura, Michael, et al., "Dynamic Registration Correction in Augmented-Reality Systems," Proceedings of the Virtual Reality Annual International Symposium, Mar. 1995.

Publication

Number | Date | Country
---|---|---
20160035134 A1 | Feb 2016 | US