The present invention relates to a system and the like for an information terminal to recognize a space.
An information terminal such as a head mounted display (HMD) or a smartphone has a function of displaying, on a display surface, an image (sometimes referred to as a virtual image or the like) corresponding to virtual reality (VR), augmented reality (AR), or the like. For example, the HMD worn by the user displays an AR image at a position corresponding to an actual object such as a wall or a desk in a space such as a room.
As a prior art example relating to the above technology, JP-A-2014-514653 (Patent Document 1) is cited. Patent Document 1 describes a technique in which a plurality of terminals recognize the same object in the real space, for example, a desk surface, as an anchor surface based on images captured by cameras, and each terminal displays a virtual object on the anchor surface, thereby displaying the virtual object at almost the same position.
[Patent Document 1] Japanese Patent Application Laid-Open Publication No. 2014-514653
The information terminal preferably grasps the position, orientation, shape, and the like of actual objects such as walls and desks in a space with as high accuracy as possible in order to be able to display a virtual image suitably in a function such as AR. For this grasp, the information terminal has a function of detecting and measuring the space around its own device using cameras and sensors. For example, the HMD can detect, as a feature point, the reflection point at which light emitted from a sensor of its own device strikes a surrounding object and returns, and can thereby obtain a plurality of surrounding feature points as point group data. The HMD can configure space data representing the shape and the like of the space (in other words, data for the information terminal to recognize the space) by using such point group data.
However, in the case where the information terminal of the user performs measurement on a large space or a large number of spaces in the real world, there are problems with respect to efficiency, convenience of the user, workload, and the like. For example, when one user performs an operation of measuring a space in a building with an information terminal, the operation may take a long time and impose a large workload.
Further, when an information terminal of a certain user once measures and grasps a space such as a room, uses the result for AR image display or the like, and then reuses the space later, the efficiency is poor if the space must be measured again.
It is an object of the present invention to provide a technique in which an information terminal can measure space to create and register space data, and an information terminal can acquire and use the space data, and a technique in which the space data and the corresponding space recognition can be shared among a plurality of information terminals of a plurality of users.
A typical embodiment of the present invention has the configuration shown below. The space recognition system of one embodiment includes an information terminal that has a terminal coordinate system and has a function of measuring a space and a function of displaying a virtual image on a display surface, and an information processing apparatus that performs processing based on a common coordinate system. The information terminal measures the relationship between the terminal coordinate system and the common coordinate system with respect to position and orientation, and matches the terminal coordinate system and the common coordinate system based on data representing the measured relationship, whereby the information terminal and the information processing apparatus share recognition of the space.
According to a representative embodiment of the present invention, the information terminal can measure space to create and register space data, the information terminal can acquire and use the space data, and the space data and the corresponding space recognition can be shared among a plurality of information terminals of a plurality of users.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. In all the drawings, the same parts are denoted by the same reference numerals in principle, and a repetitive description thereof is omitted.
A space recognition system and a method of the first embodiment of the present invention will be described with reference to
Conventionally, regarding terminals such as HMDs, the concept of a plurality of terminals of a plurality of users measuring a space to be used in a spatially or temporally divided manner to create and register space data, and suitable techniques therefor, have not been sufficiently examined. The space recognition system and method of the first embodiment provide suitable techniques for such shared measurement of a space, creation of space data, sharing and reuse of the space data, and a series of procedures therefor. The system and method realize the measurement of a space, the creation of space data, the sharing and reuse of the space data, and the like efficiently, e.g., quickly.
First, the basic configuration of the present invention is shown in
The sharing of the space data 6 is performed through the description of the space data 6 by the common coordinate system WS. For example, the information terminal 1 measures the space 2 to acquire the space data 6, and also acquires the space data 6 described based on the common coordinate system WS from the information processing apparatus 9. The information terminal 1 converts the space data 6 acquired from the information processing apparatus 9 into the terminal coordinate system WT of its own device, and uses it by integrating it with the space data 6 measured by its own device. Alternatively, the information terminal 1 converts the space data 6 measured by its own device into a description by the common coordinate system WS, and provides it to the information processing apparatus 9. The information processing apparatus 9 collectively processes and uses the provided space data 6 and the held space data 6.
The space recognition system of the first embodiment shown in
The space recognition system and method according to the first embodiment have a mechanism for matching the coordinate systems related to the measurement and recognition of the space 2 in the above sharing. In general, the coordinate system of a space (sometimes referred to as a “space coordinate system”) and the coordinate system of each terminal (sometimes referred to as a “terminal coordinate system”) are basically different coordinate systems, and at least initially do not coincide with each other. Therefore, in the embodiment, at the time of the above sharing, an operation of correlating and matching the terminal coordinate systems of the terminals 1 with each other is performed as “coordinate system pairing”. By this operation, the conversion parameters 7 for coordinate system conversion are set in the terminals 1. In a state in which the coordinate system pairing is established, positions, orientations, and the like can be mutually converted between the coordinate systems by the conversion parameters 7. As a result, the recognition of the position and the like of the same space 2 can be shared between the terminals 1 that perform the sharing. Each terminal 1 creates partial space data 6 described in its own terminal coordinate system by measurement in the sharing. The plurality of pieces of partial space data described in the respective terminal coordinate systems can be configured, by conversion and integration using the conversion parameters 7, as the space data 6 described in a certain unified terminal coordinate system in units of the space 2.
The space 2 is an arbitrary space managed by identification or division, and is, for example, one room in a building. In this example, the space 2 of the room is an object of creating the space data 6 by sharing, and is an object of recognition sharing by a plurality of terminals 1.
The plurality of terminals 1 includes, for example, a first terminal 1A (= first HMD) of the first user U1 and a second terminal 1B (= second HMD) of the second user U2. The HMD serving as a terminal 1 includes, in its housing, a transmissive display surface 11, a camera 12, a ranging sensor 13, and the like, and has a function of displaying a virtual image of AR on the display surface 11. Similarly, the smartphones 1a and 1b include a display surface such as a touch panel, a camera, a ranging sensor, and the like, and have a function of displaying a virtual image of AR on the display surface. When a smartphone or the like is used as the terminal 1, functions substantially similar to those of the HMD can be realized. For example, the user sees a virtual image such as an AR image displayed on the display surface of a smartphone held in his or her hand.
Each terminal 1 has a function of performing coordinate system pairing between its own device and another terminal 1. Each terminal 1 measures the relationship between the terminal coordinate system of its own device (e.g., the first terminal coordinate system WA) and the terminal coordinate system of the counterpart (e.g., the second terminal coordinate system WB), generates a conversion parameter 7 based on the relationship, and sets the conversion parameter 7 in at least one of its own device and the counterpart. The plurality of terminals 1 (1A, 1B) measure the object space 2 in a shared manner, and create respective pieces of space data 6 (sometimes referred to as “partial space data”). For example, the first terminal 1A creates space data D1A, and the second terminal 1B creates space data D1B. The plurality of terminals 1 can generate space data 6 (e.g., space data D1) in units of the space 2 from the partial space data 6, and can share the recognition of the space 2 using the space data 6. The terminal 1 has a function of measuring the space 2 using the camera 12, the ranging sensor 13, and the like, and creating the space data 6 based on the measurement data. The terminal 1 can use the conversion parameters 7 to convert, between coordinate systems, the representations of the measurement data and the space data 6.
The relationship between the terminal coordinate systems (WA, WB) is roughly determined as follows. First, the relationship of rotation between the coordinate systems is obtained based on measuring, in each of the terminal coordinate systems (WA, WB), the representations of two different specific directions in the real space. Next, based on measurement of the positional relationship between the terminals 1, the relationship of the origins of the terminal coordinate systems is determined. The conversion parameters 7 can be constituted by the rotation parameter and the origin parameter.
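As a hedged illustration of the rotation step, the sketch below (not from the patent; function names such as `rotation_between` are hypothetical) derives the rotation between two terminal coordinate systems from two non-parallel specific directions measured in each system, using the classic two-vector (TRIAD-style) construction:

```python
import math

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def triad(n, m):
    """Orthonormal basis built from two non-parallel directions n, m."""
    t1 = normalize(n)
    t2 = normalize(cross(t1, m))
    t3 = cross(t1, t2)
    return (t1, t2, t3)

def rotation_between(nA, mA, nB, mB):
    """3x3 rotation matrix R that maps the WB representation of a direction
    to its WA representation (nA/mA: the two specific directions seen in WA;
    nB/mB: the same directions seen in WB)."""
    TA, TB = triad(nA, mA), triad(nB, mB)
    # R = [tA1 tA2 tA3] * [tB1 tB2 tB3]^T, basis vectors as matrix columns
    return [[sum(TA[k][i] * TB[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply_rotation(R, v):
    return tuple(sum(R[i][j] * v[j] for j in range(3)) for i in range(3))
```

Together with the measured origin offset, such a rotation matrix would make up the conversion parameters 7 in this sketch.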
In the first embodiment, coordinate system pairing is performed for each pair of two terminals 1, with the terminal coordinate system of one of the plurality of terminals 1 (e.g., the first terminal coordinate system WA) serving as the common coordinate system. As a result, at least one of the terminals 1, e.g., the first terminal 1A, creates and holds the conversion parameters 7. Thereafter, each terminal 1 measures the space 2 in a shared manner, and creates partial space data described in its own terminal coordinate system. Each terminal 1 may exchange its partial space data with the other terminals 1 as data described on the basis of the common coordinate system. The terminal 1 converts the partial space data between the description based on its own terminal coordinate system and the description based on the common coordinate system using the conversion parameter 7. If the terminal coordinate system of the own device is the common coordinate system, this conversion is unnecessary. Then, the terminal 1 obtains the space data 6 in units of the space 2 by integrating the plurality of pieces of partial space data described in the unified terminal coordinate system. Thus, the plurality of terminals 1 can preferably display the same virtual image 22 at the same position 21 in the same space 2 by using the space data 6.
Even in the case of the terminal 1 provided with a non-transmissive display, the display position of the virtual image displayed on the display surface 11 can be shared with other terminals 1 as long as the terminal coordinate system of the terminal 1 is fixed to the real space.
In the first embodiment, a coordinate system serving as a reference for specifying a position, an orientation, or the like in the real space in each terminal 1 or space 2 is referred to as a world coordinate system. Each terminal 1 has a terminal coordinate system as its own world coordinate system. In
The first terminal coordinate system WA has an origin OA and an axis XA, an axis YA, and an axis ZA as three axes perpendicular to each other. The second terminal coordinate system WB has an origin OB and an axis XB, an axis YB, and an axis ZB as three axes perpendicular to each other. The space coordinate system W1 has an origin O1 and an axis X1, an axis Y1, and an axis Z1 as three axes perpendicular to each other. The origins OA and OB and the origin O1 are each fixed at a predetermined position in the real space. The position LA of the first terminal 1A in the first terminal coordinate system WA and the position LB of the second terminal 1B in the second terminal coordinate system WB are defined in advance as, for example, the housing center position (
When sharing the recognition of the space 2, the terminal 1 performs coordinate system pairing with other terminals 1. For example, the sharing terminals 1 (1A, 1B) perform coordinate system pairing with each other. At the time of coordinate system pairing, each terminal 1 measures and acquires predetermined quantities from each other (
In step S1, the first terminal 1A performs the coordinate system pairing with the second terminal 1B (
In step S2, the first terminal 1A measures the area 2A based on the sharing, and creates the partial space data 6 (referred to as the partial space data D1A) described in the first terminal coordinate system WA. Note that the symbol * in the drawing indicates the coordinate system describing the space data. On the other hand, in step S3, the second terminal 1B similarly measures the area 2B based on the sharing, and creates partial space data 6 described in the second terminal coordinate system WB (referred to as the partial space data D1B). Steps S2 and S3 can be executed in parallel.
In step S4, the first terminal 1A receives and acquires the partial space data D1B from the second terminal 1B. In step S5, the first terminal 1A converts the partial space data D1B into the partial space data 6 (the partial space data D1BA) described in the first terminal coordinate system WA using the conversion parameter 7.
In step S6, the first terminal 1A integrates the partial space data D1A and the partial space data D1BA into one, and obtains space data 6A (D1) in units of the space 2 described in the first terminal coordinate system WA. As a result, the first terminal 1A obtains the space data 6A (D1) in units of the space 2 even if only the area 2A is measured.
Furthermore, the method has the following steps. In step S7, the first terminal 1A converts the partial space data D1A into the partial space data 6 described in the second terminal coordinate system WB (referred to as the partial space data D1AB) using the conversion parameters 7. In step S8, the first terminal 1A integrates the partial space data D1B and the partial space data D1AB into one, and obtains space data 6B (D1) in units of space 2 described in the second terminal coordinate system WB. In step S9, the first terminal 1A transmits the space data 6B (D1) to the second terminal 1B. As a result, the second terminal 1B obtains the space data 6B (D1) in units of the space 2 even if only the area 2B is measured.
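Conversion and integration of the kind performed in steps S5 to S8 amount to applying the conversion parameters 7 to point group data and merging the results. A minimal sketch under assumed simplifications (points as 3-tuples, the conversion as a rotation matrix R plus an origin offset t, duplicates dropped naively; the function names are hypothetical):

```python
def convert_points(points_b, R, t):
    """Re-describe points measured in WB in WA: p_A = R * p_B + t,
    where t is WB's origin expressed in WA (cf. steps S5 and S7)."""
    return [tuple(sum(R[i][j] * p[j] for j in range(3)) + t[i]
                  for i in range(3))
            for p in points_b]

def integrate(partial_a, partial_b_in_a):
    """Merge two partial point clouds described in one unified coordinate
    system into space data in units of the space (cf. steps S6 and S8)."""
    merged = list(partial_a)
    for p in partial_b_in_a:
        if p not in merged:  # naive duplicate handling for the sketch
            merged.append(p)
    return merged
```

A real implementation would deduplicate by spatial proximity rather than exact equality; the exact-match check here only keeps the sketch short.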
In the above manner, for the same space 2, the first terminal 1A acquires the space data 6A (D1) described in the first terminal coordinate system WA, and the second terminal 1B acquires the space data 6B (D1) described in the second terminal coordinate system WB. Therefore, it is possible to share the recognition of the space 2 between the terminals 1 (1A, 1B). For example, the first terminal 1A and the second terminal 1B can display the same virtual image 22 at the same position 21 in the space 2 (as will be described later with reference to
The above-described methods can be similarly applied to the case where the conversion parameters 7 are generated by the second terminal 1B to configure the space data 6.
The space data 6 describing the space 2 (in particular, space shape data described later) is, for example, data of an arbitrary format representing the position, shape, and the like of the room. The space data 6 includes data representing the boundary of the space 2 and data of any object arranged in the space 2. The data representing the boundary of the space 2 includes, for example, data of structures such as a floor, walls, a ceiling, and a door 2d which constitute the room. In some cases, there is no structure at the boundary. The data of the objects in the space 2 includes, for example, data of a desk 2a, a whiteboard 2b, and the like arranged in the room. The space data 6 includes, for example, at least point group data, and is data having position coordinate information for each feature point in a certain terminal coordinate system. The space data 6 may be polygon data representing lines, planes, and the like in the space.
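As an illustration only (the patent leaves the format arbitrary; the container and its field names below are hypothetical), such space data could be held as a point group plus labeled objects:

```python
from dataclasses import dataclass, field

@dataclass
class SpaceData:
    space_id: str                # identifies the managed space, e.g. one room
    coord_system: str            # terminal coordinate system describing the points
    points: list = field(default_factory=list)    # feature points as (x, y, z)
    objects: dict = field(default_factory=dict)   # label -> indices into points

# Example: one feature point on a desk surface in coordinate system WA
room = SpaceData("room-1", "WA")
room.points.append((0.0, 1.2, 0.5))
room.objects["desk"] = [0]
```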
In this embodiment, the space 2, which is one room, is divided and measured in a shared manner by the terminals 1 (1A, 1B) of the two users U1 and U2 to create the space data 6 of the space 2. The content of the sharing can be arbitrarily determined; for example, the two users consult with each other and determine the sharing as shown in the figure. The sharing may be, as in the present example, spatially dividing the object space 2 into a plurality of areas (in other words, partial spaces). In this example, the space 2 is divided into left and right half areas with respect to the center in the left-right direction (axis Y1 direction) in
When measuring the target space 2, it is not necessary to cover 100% of the area. A sufficient area of the space 2 may be measured according to the function such as AR or according to necessity. In the space 2, some areas may remain unmeasured, or some areas may be measured in duplicate as a result of the sharing. In the example of
After the coordinate system pairing between the terminals 1 (1A, 1B), each terminal 1 measures its measurement range (401, 402) in the shared areas 2A and 2B, and obtains its measurement data. The first terminal 1A measures the measurement range 401 of the area 2A and obtains measurement data 411. The second terminal 1B measures the measurement range 402 of the area 2B and obtains measurement data 412. The measurement data is, for example, point group data obtained by the ranging sensor 13. The point group data is data having a position, an orientation, a distance, and the like for each of a plurality of surrounding feature points. Each terminal 1 creates the partial space data 420 from the measurement data. The first terminal 1A creates partial space data D1A described in the first terminal coordinate system WA from the measurement data 411. The second terminal 1B creates partial space data D1B described in the second terminal coordinate system WB from the measurement data 412.
In the example of
Each terminal 1 converts the partial space data 430 obtained from the other terminal 1 into the partial space data 440 in its own terminal coordinate system using the conversion parameter 7. The first terminal 1A converts the partial space data D1B into the partial space data D1BA described in the first terminal coordinate system WA. The second terminal 1B converts the partial space data D1A into the partial space data D1AB described in the second terminal coordinate system WB.
Each terminal 1 integrates the partial space data 420 obtained by the own terminal 1 and the partial space data 440 obtained by the partner terminal into the space data 6 (450) in units of one space 2 in the unified terminal coordinate system. The first terminal 1A integrates the partial space data D1A and the partial space data D1BA into one to obtain the space data D1 (6A) described in the first terminal coordinate system WA. The second terminal 1B integrates the partial space data D1B and the partial space data D1AB into one to obtain the space data D1 (6B) described in the second terminal coordinate system WB. In the case of this example, which of the two terminals 1 corresponds to the information processing apparatus 9 of the basic configuration (
According to the above method, the time required for the measurement and the acquisition of the space data can be shortened as compared with the case where the space 2 is measured by the terminal 1 of one user alone, so that the measurement and the acquisition of the space data can be realized more efficiently.
It should be noted that, when viewed from the terminal 1 of one user, the other user may appear in the space 2 and an occluded space portion may be created behind that user. Such a space portion can be measured by the terminal 1 of the other user, and it is more efficient to measure the space portion with the terminal 1 of the other user.
During the measurement, the user and the corresponding terminal 1 can also move appropriately to change the measurement range. The unmeasured area 491 shown can also be measured separately by including it in a measurement range from another position.
As a more sophisticated method of sharing, either terminal 1 may automatically judge and determine the sharing. For example, each terminal 1 judges, based on a camera image or the like, the approximate position and orientation of its own device in the room, the presence or absence of another user or another terminal in the image, and their positions and orientations. For example, when the second user U2 and the second terminal 1B do not appear in the camera image, the first terminal 1A selects the area and range in its orientation at that time as the area and range to be covered by the first terminal 1A.
The camera 12 includes, for example, two cameras disposed on the left and right sides of the housing 10, and captures an image of a range including the front of the HMD. The ranging sensor 13 is a sensor for measuring the distance between the HMD and an external object. The ranging sensor 13 may be a TOF (Time Of Flight) type sensor, or may use a stereo camera or another type of system. The sensor unit 14 includes a group of sensors for detecting the state of the position and orientation of the HMD. On the left and right sides of the housing 10, there are provided an audio input device 18 including a microphone, an audio output device 19 including a speaker and an earphone terminal, and the like.
An operating device such as a remote controller may be attached to the terminal 1. In this case, the HMD performs, for example, short-range wireless communication with the operating device. The user can input an instruction relating to the function of the HMD, move the cursor on the display surface 11, and the like by operating the operating device by hand. The HMD may communicate with an external smartphone, PC, or the like to perform cooperation. For example, the HMD may receive AR image data from an application of a smartphone.
The terminal 1 includes an application program or the like that displays a virtual image such as an AR on the display surface 11 for work assistance or entertainment. For example, the terminal 1 generates a virtual image 22 (
The processor 101 includes a CPU, a ROM, a RAM, and the like, and configures a controller of the HMD. The processor 101 executes processing in accordance with the control program 31 and the application program 32 of the memory 102 to realize functions such as an OS, a middleware, and an application and other functions. The memory 102 is composed of a non-volatile storage device or the like and stores various data and information handled by the processor 101 or the like. The memory 102 also stores, as temporary information, an image acquired by the camera 12 or the like, detected information, and the like.
The camera 12 acquires an image by converting the light incident from the lens into an electrical signal with an image pickup device. When a TOF sensor is used, for example, the ranging sensor 13 calculates the distance to an object from the time taken for light emitted to the outside to hit the object and return. The sensor unit 14 includes, for example, an acceleration sensor 141, a gyro sensor (angular velocity sensor) 142, a geomagnetic sensor 143, and a GPS receiver 144. Using the detection information of these sensors, the sensor unit 14 detects the position, orientation, movement state, and the like of the HMD. The HMD is not limited thereto, and may be provided with an illuminance sensor, a proximity sensor, an atmospheric pressure sensor, or the like.
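The TOF calculation mentioned above can be sketched as follows (a simplified illustration, not the patent's implementation): the measured round-trip time covers the path to the object and back, so the one-way distance is half the traveled path.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_time_s):
    """One-way distance to the object from the round-trip time of the
    emitted light; the light travels out and back, hence the factor 1/2."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0
```

For example, a 20 ns round trip corresponds to roughly 3 m.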
The display device 103 includes a display driving circuit and the display surface 11, and displays a virtual image or the like on the display surface 11 based on the image data of the display information 34. Note that the display device 103 is not limited to a transmissive display device, and may be a non-transmissive display device or the like.
The communication device 104 includes a communication processing circuit, an antenna, and the like corresponding to various predetermined communication interfaces. Examples of communication interfaces include mobile networks, Wi-Fi (registered trademark), Bluetooth (registered trademark), infra-red or the like. The communication device 104 performs wireless communication processing or the like between the other terminal 1 and the access point 23 (
The audio input device 18 converts input voice from the microphone into voice data. The audio output device 19 outputs sound from a speaker or the like based on sound data. The audio input device 18 may include a voice recognition function, and the audio output device 19 may include a voice synthesis function. The operation input unit 105 is a portion for receiving operation inputs to the HMD, for example, power on/off and volume adjustment, and is composed of hardware buttons, a touch sensor, or the like. The battery 106 supplies power to each part.
The controller implemented by the processor 101 includes a communication control unit 101A, a display control unit 101B, a data processing unit 101C, and a data acquiring unit 101D as examples of functional blocks realized by processing.
The memory 102 stores a control program 31, an application program 32, setting information 33, display information 34, coordinate system information 35, space data information 36, and the like. The control program 31 is a program for realizing control including a space recognition function. The application program 32 is a program that realizes a function such as an AR that uses the space data 6. The setting information 33 includes system setting information and user setting information related to each function. The display information 34 includes image data and position coordinate information for displaying an image such as the virtual image 22 on the display surface 11.
The coordinate system information 35 is management information related to the space recognition function. The coordinate system information 35 includes information of the first terminal coordinate system WA of the own device, information of the second terminal coordinate system WB of the other device, various quantity data of the own device side and various quantity data of the other device side (
The space data information 36 is the information corresponding to the space data 6 of
The communication control unit 101A controls communication processing using the communication device 104 when communicating with another terminal 1 or the like. The display control unit 101B controls the display of the virtual image 22 and the like on the display surface 11 of the display device 103 using the display information 34.
The data processing unit 101C reads and writes the coordinate system information 35, and performs processing for managing the terminal coordinate system of its own device, processing for pairing the coordinate system with the terminal coordinate system of the partner, processing for converting between the coordinate systems using the conversion parameters 7, and the like. At the time of the coordinate system pairing, the data processing unit 101C performs processing for measuring the quantity data on its own device side, processing for acquiring the quantity data on the other device side, processing for generating the conversion parameter 7, and the like.
The data acquiring unit 101D acquires the detected data from various sensors such as the camera 12, the ranging sensor 13, and the sensor unit 14. At the time of coordinate system pairing, the data acquiring unit 101D measures the quantity data of its own device side in accordance with the control from the data processing unit 101C.
Next, the details of the coordinate system pairing will be described.
In the example of
For example, when the first terminal 1A performs space recognition sharing with the second terminal 1B, the first terminal 1A performs coordinate system pairing as an operation of mutually sharing world coordinate system information, with those terminals forming one pair. It is sufficient for the two terminals 1 (1A, 1B) to perform the coordinate system pairing once. Even when there are three or more terminals 1, the coordinate system pairing may be performed similarly for each pair.
At the time of the coordinate system pairing, each terminal 1 (1A, 1B) measures predetermined quantities in its own terminal coordinate system (WA, WB), and exchanges the quantity data with the other terminal 1. The first terminal 1A measures a specific direction vector NA, an inter-terminal vector PBA, and a coordinate value dA as the quantities 801 measured by its own device. The first terminal 1A transmits the data of those quantities 801 to the second terminal 1B. The second terminal 1B measures a specific direction vector NB, an inter-terminal vector PAB, and a coordinate value dB as the quantities 802 measured by its own device. The second terminal 1B transmits the data of those quantities 802 to the first terminal 1A.
Each terminal 1 can determine the relationship between the terminal coordinate systems of the pair based on the quantity data measured by its own device and the quantity data obtained from the partner device, and, from the relationship, can calculate the conversion parameters 7 for conversion between the terminal coordinate systems. Thus, between the terminals 1, the sharing of world coordinate system information can be performed by associating the terminal coordinate systems with each other using the conversion parameters 7.
When only one of the terminals 1 in the coordinate system pairing, for example, only the first terminal 1A, performs conversion between the coordinate systems, only the first terminal 1A needs to acquire the quantities 801 on its own device side and the quantities 802 on the other device side to generate the conversion parameters 7. In this case, the quantities 801 need not be transmitted from the first terminal 1A to the second terminal 1B. The first terminal 1A may transmit the generated conversion parameters 7 to the second terminal 1B; in this case, the second terminal 1B side can also perform the conversion.
In the first embodiment, information of the following three elements is included as various quantities at the time of coordinate system pairing. The quantities include a specific direction vector as first information, an inter-terminal vector as second information, and a world coordinate value as third information.
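For illustration only, the three kinds of quantities exchanged during the pairing could be grouped as follows. This is a minimal Python sketch; the class and field names are our own assumptions and are not part of the embodiment.

```python
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class PairingQuantities:
    """Quantities measured by one terminal during coordinate system pairing,
    expressed in that terminal's own world (terminal) coordinate system."""
    specific_direction: Vec3   # e.g. NA: the vertically downward direction
    inter_terminal: Vec3       # e.g. PBA: the vector to the partner terminal
    own_position: Vec3         # e.g. dA: this terminal's coordinate value
```

Each terminal would fill in such a record from its own sensors and transmit it to the partner terminal during the pairing exchange.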
(1) Regarding the specific direction vector: Each terminal 1 uses specific direction vectors as information about specific directions in the real space in the world coordinate system. Two distinct direction vectors (NA, NB, MA, MB) are used to obtain the rotation relationship between the coordinate systems. The specific direction vector NA is the representation of the first direction vector in the first terminal 1A, and its unit direction vector is nA. The specific direction vector NB is the representation of the first direction vector in the second terminal 1B, and its unit direction vector is nB. The specific direction vector MA is the representation of the second direction vector in the first terminal 1A, and its unit direction vector is mA. The specific direction vector MB is the representation of the second direction vector in the second terminal 1B, and its unit direction vector is mB.
In the first embodiment, in particular, the vertically downward direction is used as one specific direction (the first specific direction), and a later-described inter-terminal vector is used as the other specific direction (the second specific direction). In the illustrative example of
The vertically downward direction can be measured as the direction of the gravitational acceleration, for example, by using a three-axis acceleration sensor as the acceleration sensor 141 provided in the terminal 1 (
(2) Regarding the inter-terminal vector: Each terminal 1 uses the information of a vector (i.e., a direction and a distance) between the terminal positions (LA, LB) as information representing the positional relationship from one terminal 1 (e.g., the first terminal 1A) to the other terminal 1 (e.g., the second terminal 1B). This information is referred to as the “inter-terminal vector”. In the example of
The inter-terminal vector includes information about the other specific direction (the second specific direction) in the real space for determining the orientation relationship between the world coordinate systems. Here, the specific direction vectors (MA, MB) and the inter-terminal vectors (PBA, PAB) have the following correspondence relationship.
PBA=MA
PAB=−MB
During the coordinate system pairing, each terminal 1 measures the inter-terminal vector to the other terminal 1, for example, using a ranging sensor 13 or a camera 12 of the stereo type such as shown in
(3) Regarding the world coordinate value: Each terminal 1 uses the information of a coordinate value representing a position in the world coordinate system. In the example of
In
By the coordinate system pairing, the relation between the world coordinate systems (WA, WB) of the terminals 1 (1A, 1B) becomes known, and positions and orientations can be mutually converted. That is, it is possible to perform a conversion for matching the second terminal coordinate system WB to the first terminal coordinate system WA, or vice versa. The conversion between the world coordinate systems is represented by the predetermined conversion parameter 7. The conversion parameter 7 is a parameter for converting the direction of the coordinate system (in other words, rotation) and for calculating the difference between the origins of the coordinate systems.
For example, when the coordinate system conversion can be performed at the first terminal 1A, the first terminal 1A calculates the relation between the terminal coordinate systems (WA, WB) from the quantities 801 on the own device side and the quantities 802 on the partner device side, generates the conversion parameter 7, and sets it in the own device. The same applies to the second terminal 1B. The conversion parameter 7 includes a conversion parameter 71 for converting a position or the like in the first terminal coordinate system WA into one in the second terminal coordinate system WB, and a conversion parameter 72 for converting a position or the like in the second terminal coordinate system WB into one in the first terminal coordinate system WA. These conversions are inverses of each other. It suffices for at least one terminal 1 to hold the conversion parameter 7, and both terminals 1 may hold the same conversion parameters 7.
(A) shows a first example. The first terminal 1A converts the position coordinate value rA that is the position in the first terminal coordinate system WA (for example, the position 21 of the display target of the virtual image 22) to the position in the second terminal coordinate system WB (position coordinate value rB) using the conversion parameter 71, and transmits the converted position coordinate value to the second terminal 1B.
(B) shows a second example. The first terminal 1A transmits the position coordinate value rA that is a position in the first terminal coordinate system WA to the second terminal 1B, and the second terminal 1B converts the received position coordinate value rA into the position coordinate value rB in the second terminal coordinate system WB using the conversion parameter 71.
(C) shows a third example. The second terminal 1B converts the position coordinate value rB, which is a position in the second terminal coordinate system WB, into the position coordinate value rA in the first terminal coordinate system WA using the conversion parameter 72, and transmits the result to the first terminal 1A.
(D) shows a fourth example. The second terminal 1B transmits the position coordinate value rB in the second terminal coordinate system WB to the first terminal 1A, and the first terminal 1A converts the received position coordinate value rB into the position coordinate value rA in the first terminal coordinate system WA using the conversion parameter 72.
As described above, for example, when the position is transmitted from the first terminal 1A to the second terminal 1B, a conversion may be performed by the method (A) or (B), and when the position is transmitted from the second terminal 1B to the first terminal 1A, a conversion may be performed by the method (C) or (D). In terms of correspondence with the basic configuration (
The lower part of
In the steps S1A and S1B, a radio communication connection related to space recognition sharing is established between the first terminal 1A and the second terminal 1B through the process of the communication device 107 of
In the steps S2A and S2B, the user performs an input operation for starting the measurement of the space 2 on the terminal 1, which is an HMD. For example, the user U1 inputs a measurement start instruction to the first terminal 1A, and the user U2 inputs a measurement start instruction to the second terminal 1B. It is to be noted that a communication related to the starting of measurement may be performed between the terminals 1 (1A, 1B). Further, for example, the terminal 1 may display a guide image relating to the operation of starting or ending the measurement on the display surface 11. The user performs an input operation for starting or ending the measurement in accordance with the display. The input operation may be a hardware button operation, an operation by voice recognition, or an operation by detection of a predetermined gesture such as movement of a finger or the like. As another method, the terminal 1 may control the start and end of the measurement by setting in advance or automatic determination.
Further, in the steps S2A, S2B, the sharing of the areas of the space 2 and the measuring ranges as shown in
Steps S3A to S6A and S3B to S6B are steps for performing coordinate system pairing. The method of the first embodiment is a method of measuring the space 2 after the coordinate system pairing. Therefore, the measurement start instruction of the steps S2A, S2B is, in other words, the measurement start instruction of the coordinate system pairing.
In steps S3A, S3B, a request for coordinate system pairing is transmitted from one terminal 1 to the other terminal 1. For example, the first terminal 1A transmits a coordinate system pairing request to the second terminal 1B. The second terminal 1B receives the coordinate system pairing request and, when accepting the request, transmits a coordinate system pairing response indicating acceptance of the request to the first terminal 1A. In steps S3A, S3B, each terminal 1 may display an image for guiding the coordinate system pairing on the display surface 11 (which will be described later with reference to
In steps S4A, S4B, the first terminal 1A and the second terminal 1B measure, in synchronization with each other, the quantities for coordinate system pairing (
In steps S5A, S5B, the first terminal 1A and the second terminal 1B mutually transmit the quantity data on the own device side to the other device side, thereby exchanging the quantity data. The first terminal 1A acquires the quantities 802 from the second terminal 1B, and the second terminal 1B acquires the quantities 801 from the first terminal 1A.
In steps S6A, S6B, the first terminal 1A and the second terminal 1B each generate the conversion parameter 7 and set it in the own device. The first terminal 1A generates a conversion parameter 7 (e.g., both the conversion parameters 71 and 72 in
It is to be noted that the measurement start instruction of the steps S2A, S2B may instead be input after the coordinate system pairing is established.
After the coordinate system pairing is established through the steps S6A, S6B, in the loop from the steps S7A, S7B onward, each terminal 1 measures its shared area of the space 2 (
In steps S8A, S8B, each terminal 1 constructs partial space data based on the measured data, and transmits the partial space data to each other's terminals 1 (
In steps S9A, S9B, each terminal 1 converts the partial space data described in the terminal coordinate system of the other side into the partial space data described in the terminal coordinate system of the own device side using the conversion parameters 7 as required (
In steps S10A, S10B, each terminal 1 judges whether or not to finish the space measurement in the coordinate system pairing state. At this time, the user may input an instruction to end the measurement to the terminal 1, or the terminal 1 may end the measurement by automatic determination. For example, the terminal 1 may determine that the measurement by the own device is completed when it determines, based on the measurement data, the space data, or the like, that a predetermined rate or more of the shared areas or objects of the space 2 has been measured or created. This rate is a variable setting. The terminal 1 proceeds to the following step when it determines that the measurement is completed (Yes), and returns to the steps S7A, S7B and repeats the same processing when it determines that the measurement is not completed (No).
In steps S11A, S11B, each terminal 1 uses the space 2 while sharing the recognition of the space 2 with the other terminal 1 by using the created space data 6. It should be noted that steps S11A, S11B can be omitted when the creation of the space data 6 is the object of the present method. The use of the space 2 typically includes displaying the same virtual image 22 at a desired same position 21 in the space 2 by using an AR function between the terminals 1 (1A, 1B), and performing an operation or the like (
In steps S12A, S12B, each terminal 1 cancels the state of the coordinate system pairing. For example, when the use of the space 2 is temporary, each terminal 1 may delete the conversion parameter 7 or the space data 6. The present method is not limited to this, and each terminal 1 may maintain the state of the coordinate system pairing thereafter. That is, each terminal 1 may continue to retain the conversion parameter 7 and the space data 6, in which case steps S12A, S12B can be omitted. For example, by holding the conversion parameter 7 and the space data 6 within the own device, each terminal 1 can omit processing such as re-measurement when subsequently reusing the same space 2.
Further, the first terminal 1A displays an image 1103 when measuring the quantities 801 in the step S4A. The image 1103 is, for example, a message image such as “Pairing in progress. Please keep as still as possible.” During direct coordinate system pairing between the terminals 1, keeping both terminals as stationary as possible allows the quantities to be measured with high accuracy. Therefore, the output of the image 1103 as such a guide is effective.
The coordinate conversion will now be described in detail. First, the notation for describing the relationship between coordinate systems is summarized. In the embodiments, the coordinate systems are unified to the right-handed system, and a normalized quaternion is used to represent the rotation of a coordinate system. A normalized quaternion is a quaternion with a norm of 1 and can represent a rotation about an axis; the rotation of an arbitrary coordinate system can be represented by such a normalized quaternion. The normalized quaternion q representing the rotation by an angle η about the unit vector (nX, nY, nZ) as the rotation axis is given by Equation 1 below, where i, j, k are the quaternion units. Rotation is taken as positive in the clockwise direction when viewed along the direction of the unit vector (nX, nY, nZ).
q=cos(η/2)+nX sin(η/2)i+nY sin(η/2)j+nZ sin(η/2)k Equation 1:
The real part of the quaternion q is denoted by Sc(q), and q* denotes the conjugate quaternion of the quaternion q. The operator that normalizes the norm of a quaternion q to 1 is denoted by [·]. For an arbitrary quaternion q, [·] is defined by Equation 2, in which the denominator on the right side is the norm of the quaternion q.
[q]=q/(qq*)^(1/2) Equation 2:
Next, the quaternion p representing the coordinate point or vector (pX, pY, pZ) is defined by Equation 3.
p=pXi+pYj+pZk Equation 3:
In this specification, unless otherwise noted, symbols representing coordinate points and vectors that are not written in component notation are assumed to be quaternion representations, and symbols representing rotations are assumed to be normalized quaternions.
The operator that projects a vector onto the plane perpendicular to the direction of the unit vector n is denoted by PT(n). The projection of the vector p is given by Equation 4.
PT(n)p=p+nSc(np) Equation 4:
When a coordinate point or direction vector p1 is converted into a coordinate point or direction vector p2 by the rotation about the origin represented by the quaternion q, p2 can be calculated by Equation 5.
p2=qp1q* Equation 5:
The normalized quaternion R(n1, n2), which rotates the unit vector n1 about an axis perpendicular to the plane including the unit vectors n1 and n2 so that the unit vector n1 is superimposed on the unit vector n2, is given by Equation 6 below.
R(n1,n2)=[1−n2n1] Equation 6:
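As a concrete illustration of this notation, the following Python sketch implements Equations 1 to 6. This is our own illustrative code, not part of the embodiment; quaternions are represented as (w, x, y, z) tuples and vectors as pure quaternions (0, x, y, z).

```python
import math

def qmul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def qconj(q):
    """Conjugate quaternion q*."""
    return (q[0], -q[1], -q[2], -q[3])

def sc(q):
    """Sc(q): the real (scalar) part of q."""
    return q[0]

def normalize(q):
    """Equation 2: [q] = q / (q q*)^(1/2)."""
    n = math.sqrt(sum(c*c for c in q))
    return tuple(c / n for c in q)

def axis_angle(n, eta):
    """Equation 1: rotation by angle eta about the unit axis n = (nX, nY, nZ)."""
    s = math.sin(eta / 2)
    return (math.cos(eta / 2), n[0]*s, n[1]*s, n[2]*s)

def vec(p):
    """Equation 3: embed a 3-vector (pX, pY, pZ) as a pure quaternion."""
    return (0.0, p[0], p[1], p[2])

def project(n, p):
    """Equation 4: PT(n)p = p + n Sc(np), projection of p onto the plane
    perpendicular to the unit vector n (both pure quaternions)."""
    s = sc(qmul(n, p))
    return tuple(pc + nc*s for pc, nc in zip(p, n))

def rotate(q, p):
    """Equation 5: p2 = q p1 q*."""
    return qmul(qmul(q, p), qconj(q))

def R(n1, n2):
    """Equation 6: R(n1, n2) = [1 - n2 n1], rotating n1 onto n2.
    Assumes n1 and n2 are unit vectors and not antiparallel."""
    prod = qmul(n2, n1)
    return normalize((1.0 - prod[0], -prod[1], -prod[2], -prod[3]))
```

For example, R(vec((1, 0, 0)), vec((0, 1, 0))) yields a 90-degree rotation about the z axis that maps the x axis onto the y axis.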
Based on the above-mentioned quantities (
The relation of the terminal coordinate systems (WA, WB) can be calculated as follows. Hereinafter, the calculation for obtaining the rotation and the coordinate origin difference in the case where the representation of the coordinate value and the vector value in the second terminal coordinate system WB is converted into the representation in the first terminal coordinate system WA will be described.
First, the rotation for matching the direction of the first terminal coordinate system WA with the direction of the second terminal coordinate system WB is obtained. Based on the inter-terminal vectors PBA and PAB explained above (
mA=[PBA]
mB=[−PAB]
First, consider the rotation qT1 that, in the representation of the first terminal coordinate system WA, superimposes the unit vector nA of the first specific direction on the unit vector nB. Specifically, the rotation qT1 is as follows.
qT1=R(nA,nB)
Next, the directions obtained by rotating the unit vectors nA and mA of the specific directions by this rotation qT1 are denoted as nA1 and mA1.
nA1=qT1nAqT1*=nB
mA1=qT1mAqT1*
Since these are angles between the same pair of directions in the real space, the angle between the direction nA1 and the direction mA1 is equal to the angle between the unit vector nB and the unit direction vector mB. Also, since the two specific directions are assumed in advance to be different directions, the angle between the unit vector nB and the unit direction vector mB is not 0. Therefore, a rotation qT2 can be constructed in which the direction nA1, i.e., the unit vector nB, is used as the axis and the direction mA1 is superimposed on the unit direction vector mB. Specifically, the rotation qT2 is given by the following equation.
qT2=R([PT(nB)mA1],[PT(nB)mB])
The direction nA1 (=nB) is invariant under this rotation qT2 because it coincides with the rotation axis nB of the rotation qT2, and the direction mA1 is rotated to the unit direction vector mB by this rotation qT2.
nB=qT2nA1qT2*
mB=qT2mA1qT2*
A rotation qBA is newly defined below.
qBA=qT2qT1
With this rotation qBA, the unit vector nA and the unit direction vector mA are rotated to the unit vector nB and the unit direction vector mB.
nB=qBAnAqBA*
mB=qBAmAqBA*
Since the unit vector nA and the unit direction vector mA are selected as two different directions, this rotation qBA is a rotation that converts the direction representation in the first terminal coordinate system WA into the direction representation in the second terminal coordinate system WB. Conversely, when the rotation that converts the direction representation in the second terminal coordinate system WB into the direction representation in the first terminal coordinate system WA is denoted as the rotation qAB, the rotation qAB is similarly given as follows.
qAB=qBA*
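The derivation of qT1, qT2, and qBA above can be sketched in Python as follows. This is illustrative only; the helper functions restate Equations 2 and 4 to 6, and all names are our own.

```python
import math

# Quaternions are (w, x, y, z) tuples; direction vectors are pure
# quaternions (0, x, y, z).

def qmul(a, b):
    """Hamilton product."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def qconj(q):
    return (q[0], -q[1], -q[2], -q[3])

def normalize(q):
    """[q] (Equation 2)."""
    n = math.sqrt(sum(c*c for c in q))
    return tuple(c / n for c in q)

def rotate(q, p):
    """p2 = q p1 q* (Equation 5)."""
    return qmul(qmul(q, p), qconj(q))

def project(n, p):
    """PT(n)p = p + n Sc(np) (Equation 4)."""
    s = qmul(n, p)[0]
    return tuple(pc + nc*s for pc, nc in zip(p, n))

def R(n1, n2):
    """R(n1, n2) = [1 - n2 n1] (Equation 6); n1, n2 not antiparallel."""
    prod = qmul(n2, n1)
    return normalize((1.0 - prod[0], -prod[1], -prod[2], -prod[3]))

def rotation_BA(nA, mA, nB, mB):
    """Compose qBA = qT2 qT1 from two distinct specific directions
    (nA, mA: representations in WA; nB, mB: representations in WB)."""
    qT1 = R(nA, nB)                       # superimpose nA on nB
    mA1 = rotate(qT1, mA)                 # mA after the first rotation
    qT2 = R(normalize(project(nB, mA1)),  # rotate about the axis nB so that
            normalize(project(nB, mB)))   # mA1 is superimposed on mB
    return qmul(qT2, qT1)
```

Because a quaternion and its negative represent the same rotation, the result is best checked by its action on vectors rather than by comparing components.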
Next, the conversion equations for the coordinate values dA, dB (
oBA=dA+PBA−qABdBqAB*
oAB=dB+PAB−qBAdAqBA* Equation A:
As can be easily understood, there is the following relationship.
oAB=−qBAoBAqBA*
Finally, the conversion equations between the coordinate value rA in the first terminal coordinate system WA and the coordinate value rB in the second terminal coordinate system WB for an arbitrary point (position 21) in the real space are given as follows.
rB=qBA(rA−oBA)qBA*=qBArAqBA*+oAB
rA=qAB(rB−oAB)qAB*=qABrBqAB*+oBA
As described above, when it is desired to convert a specific position 21 (coordinate value rA) viewed in the first terminal coordinate system WA into the position 21 (coordinate value rB) viewed in the second terminal coordinate system WB, for example, it can be calculated using the rotation qBA, the coordinate value rA, and the origin representation oAB. The inverse conversion can be calculated in the same way. The conversion parameters 7 (71, 72) of
As described above, according to the space recognition system and method of the first embodiment, each terminal 1 can measure the space 2 to create the space data 6, and the plurality of terminals 1 of the plurality of users can acquire and use the space data 6 from each other and share the recognition of the space 2. According to this system and method, the functions and operations described above can be realized efficiently, the convenience of the user can be enhanced, and the workload can be reduced. Further, by utilizing the space data 6, functions and services of various applications can be provided to the user.
The following is also possible as a modification of the first embodiment. In a modified example, the terminal 1 of each user may transmit and register the space data 6, created by the own device and described in its terminal coordinate system, to an external device such as a PC or a server. The terminal 1 may likewise transmit and register the generated conversion parameter 7 to an external device such as a PC or a server.
In the first modification of the first embodiment, each terminal 1 measures the space 2 before performing the coordinate system pairing, and creates the space data 6 described in the terminal coordinate system of its own device. Thereafter, the terminal 1 performs coordinate system pairing with the other terminal 1. The terminal 1 uses the conversion parameter 7 to convert the space data 6 into the space data 6 described in a common terminal coordinate system, that is, a common coordinate system.
For example, consider the terminal 1C of the user UC as its own device. First, as in the first embodiment, it is assumed that a coordinate system pairing 1301 is established between the terminal 1A and the terminal 1B, for example. Next, it is assumed that a coordinate system pairing 1302 is performed between the terminal 1B and the terminal 1C. As a result, a coordinate system pairing 1303 between the terminal 1C and the terminal 1A can be realized indirectly. This will be explained below.
First, by the coordinate system pairing 1301, the terminal 1B, as information 1321 of the conversion parameters, obtains a rotation qBA and the origin representation oAB for the conversion between the terminal coordinate system WA and the terminal coordinate system WB. The rotation qBA is a rotation in which the representation in the terminal coordinate system WA is made to the representation in the terminal coordinate system WB. The origin representation oAB is a coordinate value in the terminal coordinate system WB for the origin OA of the terminal coordinate system WA. Conversely, the terminal 1A obtains the rotation qAB and the origin representation oBA as the information 1311 of the conversion parameter.
Then, by the coordinate system pairing 1302, the terminal 1C, as information 1331 of the conversion parameter, obtains a rotation qCB and the origin representation oBC. The rotation qCB is a rotation in which the representation in the terminal coordinate system WB is made to the representation in the terminal coordinate system WC. The origin representation oBC is a coordinate value in the terminal coordinate system WC for the origin OB of the terminal coordinate system WB. Conversely, the terminal 1B obtains the rotation qBC and the origin representation oCB as information 1322 of the conversion parameter.
Here, the terminal 1C receives the information 1321 (the rotation qBA and the origin representation oAB) of the conversion parameter from the terminal 1B and holds it as the information 1332. Thus, the terminal 1C can calculate the rotation qCA and the origin representation oAC of the indirect coordinate system pairing 1303 with the terminal 1A by using the information 1331 (qCB, oBC) and the information 1332 (qBA, oAB) of the conversion parameters. The rotation qCA is a rotation that converts the representation in the terminal coordinate system WA into the representation in the terminal coordinate system WC. The origin representation oAC is the coordinate value, in the terminal coordinate system WC, of the origin OA of the terminal coordinate system WA.
qCA=qCBqBA
oAC=oBC+qCBoABqCB*
The terminal 1C holds the obtained information 1333 (qCA, oAC). The terminal 1C can use the information 1333 to convert the representation (rA) of the position 21 in the terminal coordinate system WA into the representation (rC) in the terminal coordinate system WC, as shown in the following equation.
rC=qCA(rA−oCA)qCA*=qCArAqCA*+oAC
The terminal 1C transmits the information 1333 (qCA, oAC) of the conversion parameter to the terminal 1A, and the terminal 1A holds it as the information 1312 (qCA, oAC). As a result, the conversion between the terminal coordinate system WA and the terminal coordinate system WC can be performed also in the terminal 1A. That is, the terminal 1A holds the information 1313 (qAC, oCA) of the conversion parameter related to the inverse conversion. In addition, since the following general relations hold, each terminal 1 may hold only one of qIJ and qJI, and only one of oJI and oIJ.
qIJ=qJI*
oJI=−qIJoIJqIJ*
As described above, in the second modification, the coordinate system pairing is sequentially performed with any two terminals 1 as a pair, thereby enabling space recognition sharing within the group. Even if the terminal 1C does not perform a direct coordinate system pairing process with the terminal 1A, the indirect coordinate system pairing 1303 can be realized through a coordinate system pairing with another terminal 1B for which the coordinate system pairing with the terminal 1A has been completed. Similarly, even when a terminal 1D newly participates in the group, the terminal 1D only needs to perform the same procedure with one terminal 1 in the group, for example, the coordinate system pairing 1304 with the terminal 1C, and a coordinate system pairing process with all the other terminals 1 is not necessary. In the first embodiment and the second modification, since the conversion parameter 7 is held in each terminal 1, the processing can be performed at high speed when the virtual image 22 is displayed at the shared position 21 or the like.
A table 1511 of the conversion parameter 7 held by the terminal 1A has conversion parameter information with respect to the respective terminals 1 (1B,1C,1D), similar to the table 1401 of
For example, when the terminal 1B designates the position 21 (
As another modification, a configuration is possible in which only the representative terminal holds the conversion parameter 7 and performs each conversion. This modification corresponds to a configuration in which the terminals 1B, 1C, 1D do not hold the tables 1512, 1513, and 1514 of the conversion parameter 7 in
As another modification, the terminal coordinate system of the representative terminal may be fixed as the common coordinate system in the group, and the position may be transmitted between the terminals 1. The representative terminal does not hold the conversion parameter 7. Each terminal 1 other than the representative terminal holds a conversion parameter 7 for conversion with the terminal coordinate system of the representative terminal. This modification corresponds to, for example, a configuration in which the terminal 1A, which is a representative terminal, does not hold the table 1511 in
In addition, in this modification, the position transmission between the terminals 1 may be performed without going through the representative terminal. For example, the terminal 1B converts the representation (rB) of the position 21 in the terminal coordinate system WB into the representation (rA) in the terminal coordinate system of the representative terminal using the table 1512, and transmits it to the terminal 1C. The terminal 1C uses the table 1513 to convert the representation (rA) into the representation (rC) of the own device.
As described above, according to each modification, the quantity of data of the conversion parameter 7 held in the entire system can be reduced.
A space recognition system and the like according to the second embodiment of the present invention will be described with reference to
In the second embodiment, at the time of coordinate system pairing, the terminal 1 measures, as the various quantities, the relationship with a predetermined feature (a feature point or a feature line) in the space 2. Based on the measured values, the terminal 1 obtains the relationship between the space coordinate system associated with the feature and the terminal coordinate system of the own device, and calculates the conversion parameter 7 from that relationship.
Further, in the second embodiment, the terminal 1 may register the created space data 6 in the DB 5 of an external server 4. In this case, the server 4 corresponds to the information processing apparatus 9 having the basic configuration (
In the second embodiment, information of the space coordinate system W1 related to the space 2 is defined in advance. In the space coordinate system W1, information such as the position of the space 2 and predetermined features such as feature points and feature lines is also defined. The space coordinate system W1 may be, for example, a local coordinate system unique to a building, or may be a coordinate system common to the Earth, an area, or the like. The space coordinate system W1 is fixed in the real space and has an origin O1 and an axis X1, an axis Y1, and an axis Z1 as three axes perpendicular to each other. In the example of
The second embodiment deals with coordinate system pairing between the terminal coordinate systems (WA, WB) of the respective terminals 1 and the space coordinate system (W1) of the space 2. These terminals 1 (1A, 1B) share the recognition of the space 2 using the space data 6 created by the sharing. Each terminal 1 measures the shape or the like of the space 2 in its own terminal coordinate system, and creates the space data 6, in particular the space shape data, describing the space 2. At this time, each terminal 1 performs coordinate system pairing with the space coordinate system W1 using a predetermined feature in the space 2 as a clue. Feature points, feature lines, and the like, which are predetermined features in the space 2, are defined in advance. Such a feature may be, for example, a boundary line of a wall, a ceiling, or the like, or may be a predetermined arrangement or the like. Note that a feature point as a predetermined feature of the space 2 is distinct in meaning from a feature point of the point group data obtained by the ranging sensor 13 described above.
For example, the first terminal 1A recognizes a predetermined feature of the space 2, measures the various quantities, and grasps the relation between the first terminal coordinate system WA and the space coordinate system W1. From this relation, the first terminal 1A generates the conversion parameter 7 between the first terminal coordinate system WA and the space coordinate system W1, and sets it in the own device. Each terminal 1 measures its shared area in the space 2 in the state of coordinate system pairing. For example, the first terminal 1A measures the area 2A and obtains measured data 1601 described in the first terminal coordinate system WA. The first terminal 1A constructs partial space data 1602 from the measured data 1601, and converts it, using the conversion parameter 7, into partial space data described in the space coordinate system W1. The first terminal 1A also acquires, from the second terminal 1B, the partial space data created by the second terminal 1B. Then, the first terminal 1A integrates the partial space data obtained by itself and the partial space data obtained from the partner into one piece, thereby obtaining the space data 6 described in the space coordinate system W1 in units of the space 2. The second terminal 1B side can obtain the space data 6 in the same manner as the first terminal 1A side.
In
In the second embodiment, the space data 6 relating to each space 2 in the real space is registered as a library, in particular in the DB 5 of the server 4, which is an external source. At first, in the stage before the space 2 is measured, the space shape data 61 of the space data 6 in the DB 5 is not yet registered. The space data 6 of the DB 5 includes the space shape data 61 and the feature data 62. The space shape data 61 is data representing the shape or the like of the space 2 described in the space coordinate system W1, and is the portion created by the terminal 1. The feature data 62 includes data defining the quantities of predetermined features, such as feature points and feature lines, in the space 2. The feature data 62 is referred to during the coordinate system pairing by the terminal 1.
The space data 6 of the DB 5 may be described in a unique space coordinate system corresponding to the space 2, or may be described in a space coordinate system common to a plurality of related spaces 2 (e.g., a building). The common space coordinate system may also be a coordinate system common to the Earth or a region, for example, a coordinate system using latitude, longitude, and altitude as in GPS or the like.
The configuration of the space data 6 is an example, and the details thereof are not limited. As data different from the space data 6, there may exist a space coordinate system W1 defined in advance, data relating to features and various quantities, and the like. The feature data 62 may be described as a part of the space shape data 61. The feature data 62 may be held in advance in the terminal 1. Various types of data may be maintained in different locations and associated through identification information. The server 4 is not limited to one, and may be a plurality of servers 4, for example, a server 4 associated with each one or more spaces 2.
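As an illustration only, the library structure of the space data 6 (space shape data 61 plus feature data 62, associated through identification information) could be modeled like this; the class and field names are assumptions for the sketch, not the embodiment's actual format.

```python
from dataclasses import dataclass, field

@dataclass
class FeatureData:
    # Quantities of predetermined features described in W1.
    feature_points: dict = field(default_factory=dict)  # name -> (x, y, z)
    feature_lines: dict = field(default_factory=dict)   # name -> unit direction vector

@dataclass
class SpaceData:
    space_id: int
    shape_data: list = field(default_factory=list)  # point group in W1; empty until measured
    features: FeatureData = field(default_factory=FeatureData)

# The DB of the server, keyed by space ID; before measurement only the
# feature data is registered and the shape data is still empty.
db = {}
db[100] = SpaceData(space_id=100,
                    features=FeatureData(feature_points={"p1": (0.0, 0.0, 0.0)}))
```

Keeping the feature data and shape data as separate parts of one record mirrors the text: the feature part is predefined for pairing, while the shape part is filled in by the terminals.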
In particular, in the second embodiment, each terminal 1 can register the space data 6 created by measuring the space 2 in the DB 5 of the server 4. At this time, the space data 6 created by the terminal 1 is registered against the space data 6 (in particular, the space shape data 61) registered in advance in the DB 5. In other words, the space data 6 of the server 4 is updated appropriately as space data 6 is registered from the terminals 1. Each terminal 1 can appropriately acquire and use the registered space data 6 from the DB 5 of the server 4, and does not have to hold the space data 6 inside its own device.
In the second embodiment, the space data 6 of each space 2 is registered as a library in an external source such as the server 4, but the present invention is not limited to this, and the space data 6 may be held in the terminals 1 as a library. The terminals 1 that share the recognition of the space 2 may simply create the space data 6, transmit and receive it among themselves, and share and hold it.
In the present exemplary embodiment, the position of the origin OA of the terminal coordinate system WA differs from the position LA of the first terminal 1, and the position of the origin O1 of the space coordinate system W1 differs from the position L1 of the feature point in the space 2; however, the present invention is not limited to this. Hereinafter, the case where the origin of the terminal coordinate system does not coincide with the position of the terminal 1 and the origin of the space coordinate system does not coincide with the position of the feature point of the space 2 will be described.
The coordinate in the terminal coordinate system WA with respect to the position LA of the terminal 1 is assumed to be dA=(xA, yA, zA). The coordinate value in the space coordinate system W1 with respect to the position L1 of the feature point in the space 2 is assumed to be d1=(x1, y1, z1). These coordinate values are determined in accordance with the setting of the world coordinate system. The terminal position vector VA is a vector from the origin OA to the position LA. The feature point position vector V1 is a vector from origin O1 to position L1.
At the time of coordinate system pairing, the terminal 1 acquires information on the space coordinate system W1 from the server 4 (or from the reference terminal in the modification). For example, the terminal 1 refers to the feature data 62 of the space data 6 from the server 4. The feature data 62 includes data of various quantities 1702 relating to the feature on the space 2 side, i.e., the corresponding object 1700. The terminal 1 measures the quantities 1701 on its own device side using the ranging sensor 13 or the like. The terminal 1 obtains the relationship between the terminal coordinate system WA and the space coordinate system W1 based on the quantities 1702 on the space 2 side and the measured quantities 1701 on its own device side. Based on this relationship, the terminal 1 calculates the conversion parameter 7 between the two coordinate systems and sets it in its own device.
The various quantities used in the coordinate system pairing include information on the following three elements: a specific direction vector as first information, a world coordinate value as second information, and a space position vector as third information. In the exemplary embodiment of
(1) Regarding specific direction vector: The terminal 1 uses a specific direction vector as information about a specific direction in the space 2 in the terminal coordinate system. This specific direction includes a direction which is measured by a sensor of the terminal 1, for example, a direction such as the vertical downward direction, and a direction of the feature line in the space 2, for example, a direction corresponding to the left side or the upper side of the object 1700. The terminal 1 may use unit vectors in two different specific directions from among a plurality of candidates. The representation of these unit vectors in the space coordinate system W1 is taken as n1, m1, and the representation in the terminal coordinate system WA is taken as nA, mA. The unit vectors nA, mA in the terminal coordinate system WA are measured by the terminal 1. The unit vectors n1, m1 in the space coordinate system W1 are predetermined and can be acquired from the feature data 62 of the server 4.
When the vertical downward direction is used as one specific direction, it can be measured as the direction of gravitational acceleration using an acceleration sensor as described above. Alternatively, in the settings of the world coordinate systems (WA, W1), the vertical downward direction may be set as the negative direction of the Z-axis (ZA, Z1). In either case, since the vertical downward direction does not change in the world coordinate system, it does not need to be measured at each coordinate system pairing.
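As a minimal sketch of the accelerometer approach, assuming the common convention that a stationary accelerometer reports the reaction to gravity (pointing upward); the function name is illustrative.

```python
import numpy as np

def vertical_down_unit(accel_reading):
    """Estimate the vertical downward unit vector in the terminal coordinate
    system from a stationary accelerometer reading. At rest the sensor
    measures the reaction to gravity, so downward is the opposite of the
    measured acceleration (convention assumed here)."""
    a = np.asarray(accel_reading, dtype=float)
    return -a / np.linalg.norm(a)
```

For example, a reading of roughly (0, 0, 9.8) m/s^2 yields the downward unit vector (0, 0, -1) in that terminal's axes.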
When the geomagnetic north direction is used as one specific direction, for example, it can be measured using the geomagnetic sensor 143 provided in the terminal 1 (
When the direction of a predetermined feature line in the space 2 is used as the specific direction, for example, when the directions of the two feature lines of the left side and the upper side of the object 1700 are used as the two specific directions, the measurement can be performed as follows. The terminal 1 measures position coordinate values in the terminal coordinate system WA for two different feature points constituting the feature line for each feature line. The terminal 1 obtains a direction vector (for example, a direction vector NA(nA) corresponding to the left side and a direction vector MA(mA) corresponding to the upper side) from the measured value. This coordinate value can be measured, for example, by the ranging sensor 13 of the terminal 1.
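A minimal sketch of deriving the specific-direction unit vectors from two measured feature points per feature line; the function name and example coordinates are illustrative assumptions.

```python
import numpy as np

def feature_line_direction(p_start, p_end):
    """Unit direction vector of a feature line, from the positions of two
    feature points on it measured in the terminal coordinate system."""
    d = np.asarray(p_end, dtype=float) - np.asarray(p_start, dtype=float)
    return d / np.linalg.norm(d)

# e.g. corner points of the object measured by the ranging sensor:
nA = feature_line_direction([0.0, 0.0, 0.0], [0.0, 0.0, -2.0])  # left side
mA = feature_line_direction([0.0, 0.0, 0.0], [3.0, 0.0, 0.0])   # upper side
```

The two resulting unit vectors correspond to nA and mA in the text, ready to be matched against the predefined n1 and m1 from the feature data.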
(2) Regarding world coordinate value: The terminal 1 uses information of a coordinate value representing a position in the terminal coordinate system. In the example of
(3) Regarding space position vector: The space position vector (space position vector P1A) is a vector from the position LA of the terminal 1 to the position L1 of the feature point of the space 2. The space position vector provides information about the positional relation between the two coordinate systems (WA, W1). The space position vector can be measured, for example, by the ranging sensor 13 of the terminal 1.
Since the relation between the first terminal coordinate system WA and the space coordinate system W1 is known from the quantity data (1701, 1702), the conversion between these world coordinate systems (WA, W1) can be calculated. That is, as the conversion parameter 7, a conversion parameter 73 for converting the space coordinate system W1 into the first terminal coordinate system WA and, as its inverse, a conversion parameter 74 for converting the first terminal coordinate system WA into the space coordinate system W1 can be configured. As described in the first embodiment, the conversion parameter 7 can be defined using a rotation and a difference between coordinate origins.
After the coordinate system pairing, any world coordinate system may be used for the recognition of the position in the space 2 by the terminal 1. The position in the space coordinate system W1 may be converted into the position in the first terminal coordinate system WA by the conversion parameter 73. The position in the first terminal coordinate system WA may be converted into the position in the space coordinate system W1 by the conversion parameter 74.
The table of conversion parameters 73 in the example of
Since the calculation method of the conversion parameter 7 in the second embodiment is the same as that in the first embodiment, only the calculation result will be described below. The conversion equation between the coordinate value rA in the terminal coordinate system WA for any point (position 21) in the space 2 and the coordinate value r1 in the space coordinate system W1 is given as follows.
r1 = q1A(rA − o1A)q1A* = q1A rA q1A* + oA1
rA = qA1(r1 − oA1)qA1* = qA1 r1 qA1* + o1A
However, the quantities in the above equations are given by the following:
qT1 = R(nA, n1)
mA1 = qT1 mA qT1*
qT2 = R([PT(n1)mA1], [PT(n1)m1])
q1A = qT2 qT1
qA1 = q1A*
o1A = dA + P1A − qA1 d1 qA1*
oA1 = d1 − q1A(dA + P1A)q1A*
As described above, for example, when the position 21 viewed in the first terminal coordinate system WA (coordinate value rA) is to be converted into the position 21 viewed in the space coordinate system W1 (coordinate value r1), it can be calculated using the rotation q1A, the coordinate value rA, and the origin representation oA1. The inverse conversion can be calculated in the same way. The conversion parameter 7 in the second embodiment can be configured by the parameters appearing in the above description. As in the first embodiment, since the rotations can easily be converted into each other, the conversion parameter 7 to be configured and held may include, for example, q1A instead of the rotation qA1.
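The calculation above can be sketched in code. This is a minimal illustration under stated assumptions, not the embodiment's implementation: `quat_between` plays the role of R(a, b) (a quaternion rotating unit vector a onto b; the antiparallel case is not handled), `perp` plays the role of the projection PT(n), quaternions are stored as (w, x, y, z), and all function names are chosen for the sketch.

```python
import numpy as np

def quat_mul(p, q):
    # Hamilton product of quaternions stored as (w, x, y, z).
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_conj(q):
    return np.array([q[0], -q[1], -q[2], -q[3]])

def rotate(q, v):
    # Apply q v q* to the 3-vector v.
    return quat_mul(quat_mul(q, np.array([0.0, *v])), quat_conj(q))[1:]

def quat_between(a, b):
    # R(a, b): quaternion rotating unit vector a onto unit vector b
    # (antiparallel a and b are not handled in this sketch).
    q = np.array([1.0 + np.dot(a, b), *np.cross(a, b)])
    return q / np.linalg.norm(q)

def perp(n, v):
    # PT(n)v: component of v perpendicular to unit vector n, normalized.
    p = v - np.dot(v, n) * n
    return p / np.linalg.norm(p)

def pairing(nA, mA, n1, m1, dA, d1, P1A):
    # Conversion parameter from the measured quantities, following the
    # equations in the text: two partial rotations, then the origins.
    qT1 = quat_between(nA, n1)
    mA1 = rotate(qT1, mA)
    qT2 = quat_between(perp(n1, mA1), perp(n1, m1))
    q1A = quat_mul(qT2, qT1)
    qA1 = quat_conj(q1A)
    o1A = dA + P1A - rotate(qA1, d1)
    oA1 = d1 - rotate(q1A, dA + P1A)
    return q1A, qA1, o1A, oA1
```

Given the quantities measured on the device side (nA, mA, dA, P1A) and those defined on the space side (n1, m1, d1), `pairing` returns the rotations (q1A, qA1) and origin representations (o1A, oA1) making up the conversion parameter; a point rA in WA then converts as r1 = rotate(q1A, rA − o1A), and the inverse as rA = rotate(qA1, r1 − oA1).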
As described above, according to the second embodiment, the space data 6 corresponding to the space coordinate system W1 of the space 2 used as the common coordinate system can be created by each terminal 1 and registered in the server 4, and the recognition of the space 2 can be shared among the plurality of terminals 1 of the plurality of users.
The following is also possible as a modification of the second embodiment. In a modified example, the terminal 1 measures the space 2 and creates the space data 6 described by the terminal coordinate system of its own device before performing the coordinate system pairing. Thereafter, the terminal 1 performs coordinate system pairing with the space coordinate system W1, and converts the space data 6 described in the terminal coordinate system into the space data 6 described in the space coordinate system W1 using the conversion parameter 7.
As a modification of the first and second embodiments, the following is also possible. The information provided between the terminals 1 or between the terminal 1 and the server 4 may include data such as a virtual image (AR object) related to a function such as the AR or the arrangement position information of the virtual image. For example, in
A space recognition system and the like according to a third embodiment of the present invention will be described using
The mark 3 (in other words, a marker, sign, etc.) has a special function for the terminal 1 in addition to its function as a general mark that allows the user to identify the space 2. The mark 3 gives a world coordinate system serving as a reference for the space 2 as the space coordinate system W1 (which may also be referred to as a mark coordinate system). The mark 3 is a specific object in which predetermined features are defined and which can be used for measurement of various quantities and the like at the time of coordinate system pairing by the terminal 1. The mark 3 also has a function for enabling the terminal 1 to identify the space 2 (corresponding ID) and acquire the space data 6. The position, shape, and the like of the mark 3 are described in the same space coordinate system W1 as the space 2. The feature in the space 2 in the second embodiment corresponds, in the third embodiment, to a feature point or a feature line of the mark 3. The features of the mark 3 are defined in advance as various quantities. For example, mark data 62 is registered in the space data 6 of the DB 5 of the server 4. The mark data 62 includes the various quantity data of the mark 3 and corresponds to the feature data 62 in the second embodiment.
The terminal 1, for example, the first terminal 1A, measures the features of the mark 3 as various quantity data on its own device side, grasps the relationship between the first terminal coordinate system WA and the space coordinate system W1, generates a conversion parameter 7 between the first terminal coordinate system WA and the space coordinate system W1 based on that relationship, and sets it in its own device.
In this example, feature points and feature lines in the space coordinate system W1 are defined in advance on the mark surface of the mark 3. On the mark surface, one feature point (point p1) representing a representative position L1 of the mark 3 is defined, and two other feature points (points p2, p3) are also defined. The three feature points (points p1 to p3) define two feature lines (lines v1, v2, corresponding to vectors). The point p1 is the upper left corner point of the mark surface, the point p2 is the lower left corner point, and the point p3 is the upper right corner point. The line v1 is the left side of the mark surface, and the line v2 is the upper side. These feature points and feature lines constitute the two specific directions described above. The quantity data relating to the space coordinate system W1 of the mark 3 includes, for example, the information of one feature point (point p1) and the information of two specific directions (lines v1, v2). It should be noted that although feature points such as the point p1 and feature lines such as the line v1 are illustrated for the purpose of explanation, they are not actually drawn on the mark surface. Alternatively, a feature point or a feature line may be drawn on the mark surface as a specific image so that it can be recognized by the user and the terminal 1.
The terminal 1 measures the relationship with the mark 3 as various quantities during the coordinate system pairing. At that time, the terminal 1 measures these three feature points (points p1 to p3) using the ranging sensor 13 and the camera 12 based on the mark data 62. In other words, the terminal 1 measures two feature lines (line v1, v2). When the positions of the three feature points in the terminal coordinate system WA can be grasped, the two feature lines corresponding to the two specific directions can be grasped.
Incidentally, the origin O1 of the space coordinate system W1 may be outside the space 2, may be in the space 2, or may be set in particular to the mark surface of the mark 3. For example, the origin O1 may be set in accordance with the feature point (point p1) of the mark 3.
The predetermined information described in the mark 3 may be information including an ID 2001 for identifying the space 2 and the mark 3, or may be information including an address and a URL for accessing the space data 6 of the server 4 as an external source, or may be configured as follows.
The predetermined information may be information including various quantity data relating to the space coordinate system W1 of the mark 3, i.e., the mark data 62 in
The predetermined information may be information including a predetermined ID and space data transmission destination information. The terminal 1 can use this information to access the server 4 and obtain space data 6 (in particular, the mark data 62) associated with the mark 3. The terminal 1 can acquire various quantity data from the mark data 62.
Next, in step S32, the terminal 1 measures the space 2 (above mentioned sharing area) in the terminal coordinate system WA, and creates the space data 6 described in the space coordinate system W1 using the conversion parameters 7. The terminal 1 appropriately converts the position or the like in the terminal coordinate system WA in the measurement data or the partial space data into the position or the like in the space coordinate system W1. The details of the processing in step S32 are the same as those described above.
In step S33, the terminal 1 transmits the created space data 6 described in the space coordinate system W1 to the server 4 based on the predetermined information of the mark 3. The terminal 1 may attach, to the space data 6 to be transmitted, identification information of its own device or the user, position information (measurement starting point), measurement date and time information (time stamp), and other related information. When the measurement date and time information is present, the server 4 can, for data management, grasp changes in the space data 6 on the time axis (states such as the shape of the space 2).
The server 4 registers and stores the space data 6 (in particular, the space shape data) received from the terminal 1 in the library of the DB 5. The server 4 registers the space data 6 (in particular, the space shape data 61) in association with the information such as the ID of the space 2. When the corresponding space data 6 (in particular, the space shape data 61) has already been registered in the DB 5, the server 4 updates the content of the space data 6. The server 4 manages the measurement date and time, the registration date and time, the update date and time, and the like of the space data 6.
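The registration and update bookkeeping described above might be sketched as follows; the record fields and the function name are illustrative assumptions, not the server's actual schema.

```python
from datetime import datetime, timezone

def register_space_data(db, space_id, shape_data, measured_at, terminal_id):
    """Register or update space shape data for a space in the library.
    `db` maps space IDs to records; a new record stores the registration
    time, while an existing one keeps it and records the update time."""
    now = datetime.now(timezone.utc)
    record = db.get(space_id)
    if record is None:
        db[space_id] = {"shape": shape_data, "measured_at": measured_at,
                        "registered_at": now, "updated_at": now,
                        "terminal_id": terminal_id}
    else:
        record.update(shape=shape_data, measured_at=measured_at,
                      updated_at=now, terminal_id=terminal_id)
```

Keeping the measurement time stamp alongside the shape data is what lets the server track how the space changes on the time axis.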
Alternatively, the following may be used. In steps S32 to S33, the terminal 1 creates the space data 6 described by the terminal coordinate system WA of the own device based on the measurement data. The terminal 1 transmits the space data 6 described in the terminal coordinate system WA and the conversion parameter 7 (a conversion parameter that can be converted from the terminal coordinate system WA to the space coordinate system W1) as a set to the server 4. The server 4 registers those data in the DB 5.
In step S301, the terminal 1 recognizes the mark 3, reads predetermined information (e.g., ID and space data transmission destination information), and establishes a communication connection with the server 4 based on the predetermined information. In step S301b, the server 4 establishes a communication connection with the terminal 1. At this time, the server 4 may authenticate the user or the terminal 1, confirm the authority related to the space 2, and permit the terminal 1 whose authority is confirmed. As the authority, for example, an authority for measurement, an authority for registration and update of the space data 6, an authority for acquisition and use of the space data 6, and the like may be provided.
In step S302, the terminal 1 transmits the coordinate system pairing request to the server 4, and in step S302b, the server 4 transmits the coordinate system pairing response to the terminal 1.
In step S303, the terminal 1 transmits a request for the various quantity data regarding the mark 3 to the server 4. In step S303b, the server 4 transmits the corresponding mark data 62 to the terminal 1 as a response containing the various quantity data relating to the mark 3. The terminal 1 thereby acquires the quantity data relating to the mark 3.
In step S304, the terminal 1, based on the quantity data obtained above, measures a predetermined feature of the mark 3 (point p1 and lines v1, v2 in
In step S305, the terminal 1 calculates the conversion parameter 7 between the terminal coordinate system WA and the space coordinate system W1, using the various quantity data described in the space coordinate system W1 on the mark side obtained in step S303 and the various quantity data described in the terminal coordinate system WA on its own device side obtained in step S304, and sets it in its own device.
In step S306, the terminal 1 measures the space 2, obtains measured data, and creates space data 6 (in particular, the space shape data 61) described in its terminal coordinate system WA. Specifically, this space data 6 is the partial space data based on the sharing.
In step S307, the terminal 1 converts the space data 6 created in step S306 into the space data 6 described in the space coordinate system W1 using the conversion parameter 7.
In step S308, the terminal 1 transmits the space data 6 obtained in step S307 to the server 4. In step S308b, the server 4 registers or updates the space data 6 received from the terminal 1 in the corresponding space data 6 (in particular, the space shape data 61) in the DB 5.
In another method, instead of steps S307, S308, the terminal 1 transmits the space data 6 and the conversion parameter 7 described in the terminal coordinate system WA of the own device as a set to the server 4. The server 4 registers the space data 6 and the conversion parameter 7 in association with each other in the DB 5. In this configuration, the server 4 may perform the coordinate conversion process using conversion parameters 7 of the DB 5.
In steps S309, S309b, the terminal 1 and the server 4 confirm whether or not the coordinate system pairing related to the space measurements is to be finished, and if the pairing is to be finished (Yes), the processing proceeds to step S310, and if the pairing is to be continued (No), the processing returns to step S306, and the processing is repeated in the same manner.
In steps S310, S310b, the terminal 1 and the server 4 disconnect the communication connection related to measuring the space 2. The terminal 1 and the server 4 may explicitly cancel the state of the coordinate system pairing (e.g., by deleting the conversion parameter 7), or may maintain it thereafter. The terminal 1 may be connected to the server 4 at all times via communication, or may be connected to the server 4 only when necessary. Basically, a method (client-server method) in which the terminal 1 does not hold data such as the space data 6 may be used.
In the above control flow example, the terminal 1 automatically transmits the created space data 6 to the server 4 and registers it. However, the present invention is not limited to this; the user may perform an operation for registering the space data in the terminal 1, and the space data 6 may be registered in the server 4 according to that operation. In that case, the terminal 1 displays a guide image related to the space data registration on the display surface 11, and the user performs an operation for the space data registration accordingly.
When the space data 6 of the space 2 (in particular, the space shape data 61) has been registered in the server 4 as described above, each terminal 1 can acquire and use the space data 6 by communication, particularly through the mark 3. The procedure at this time is as follows, for example.
The terminal 1 recognizes the corresponding mark 3 for the target space 2, acquires predetermined information (such as an ID or the like), and checks whether the coordinate system has been paired or whether the space data 6 has been registered, and the like. For example, when the space data 6 has already been registered, the terminal 1 acquires the space data 6 (in particular, the space shape data 61) related to the target space 2 from the server 4 using the predetermined information. When the coordinate system pairing has not been completed, the terminal 1 performs the coordinate system pairing with the space 2. When the conversion parameter 7 is already held in the terminal 1, the coordinate system pairing can be omitted.
When the terminal 1 recognizes the mark 3 or the like, it may display an option or guide image on the display surface 11 that lets the user choose between measuring the space 2 (creating the corresponding space data 6) and acquiring and using the registered space data 6, and may determine the subsequent processing according to the user's operation. For example, when the space data 6 is to be used based on the user's operation, the terminal 1 transmits a space data request to the server 4. The server 4 searches the DB 5 in response to the request, and when the target space data 6 (in particular, the space shape data 61) exists, transmits the space data 6 to the terminal 1 as a response.
The terminal 1 can suitably display the virtual image 22 in the space 2 at a position 21 that matches the shape of the object in the space 2 by, for example, an AR function using the acquired space data 6. The space data 6 (in particular, space shape data 61) can be used for various applications in addition to the use of the AR function to display the virtual image 22. For example, it can be used for the purpose of grasping the positions of the user and the own device, or for the purpose of searching for a route to a destination or guidance. For example, the HMD, which is the terminal 1, displays the shape of the space 2 on the display surface 11 using the acquired space data 6. At this time, the HMD may superimpose and display the shape of the space 2 on the real object by a virtual image of a line drawing, for example, in a real object size. Alternatively, the HMD may display the shape of the space 2 in a virtual image, such as a three-dimensional map or a two-dimensional map, with a size smaller than that of the real object. In addition, the HMD may display a virtual image representing the current position of the user and the own device on the map. Further, the HMD may display the position of the user's destination and the route from the current position to the destination position in the map as a virtual image. Alternatively, the HMD may display a virtual image such as an arrow for route guidance in accordance with the actual object.
As described above, in the third embodiment, efficient coordinate system pairing and space data acquisition are possible, in particular by using the mark 3. In the second and third embodiments, when the terminal coordinate system WA of the terminal 1 is paired with the space coordinate system W1 on the side of the space 2 or the mark 3, the object in the space 2 or the mark 3 is fixed. Therefore, during this coordinate system pairing, only whether the terminal 1 side is stationary needs to be considered, measurement can be performed with high accuracy, and the degree of freedom in practical use increases.
As a modification of the third embodiment, with respect to the coordinate system pairing between the terminal coordinate system of the terminal 1 and the space coordinate system of the mark 3, similarly to a fourth modification (
In the third embodiment, after the coordinate system pairing with the mark 3, the terminal 1 may use predetermined feature points and feature lines obtained by measuring the space 2 for calibration (adjustment) relating to the coordinate system pairing (corresponding conversion parameter 7). In addition, a plurality of marks 3 or features may be provided in one space 2, and the terminal 1 can use each mark 3 or feature for the coordinate system pairing or the adjustment.
As a modification (referred to as a fifth modification) of the first to third embodiments, the following is also possible. The fifth modification deals with sharing on the time axis in the case where the space data 6 is generated by measuring a certain space 2. In this case, there may be one user or a plurality of users. Even if there is only one terminal at a given time, sharing on the time axis is possible; each terminal 1 takes charge of one of a plurality of time slots formed by temporal division.
(A) of
(B) of
(C) of
As described above, in the DB 5 of the server 4, the space data 6 (in particular, the space shape data 61) about the space 2 (ID=100) is stored, and its content is updated on the time axis at any time. For example, at the third date and time, the space data D100 is composed of the partial space data D101, D102, and D103. Each piece of partial space data may have measurement date and time information, measuring user and terminal information, status information such as "measured", and the like. Thereafter, similarly, arbitrary areas in the space 2 are measured on the time axis by one or more arbitrary users or terminals 1; when a sufficient area of the space 2 has been measured, the space data 6 is completed in units of the space 2.
Before the start of each measurement, each terminal 1 can grasp a measured area in the space 2 by referring to the space data 6 from the server 4. Therefore, the terminal 1 can omit the measurement for the measured area and start the measurement for an unmeasured area. In addition, the terminal 1 can update or correct the shape or the like of the area when measuring the measured area again.
The following methods are possible for handling overlapped areas of measurements (e.g., the overlapped area 2212). In the first method, each piece of partial space data keeps its data of the overlapped area; for example, both the partial space data D101 and D102 include data of the overlapped area 2212.
As a second method, each partial space data does not have data of the overlapped area. For example, in a partial space data D101 or D102, there is no data of the overlapped area 2212. The terminal 1 or the server 4 determines whether or not a certain area in the space 2 has been measured. For example, the determination can be made based on the state of the contents of the registered space data 6. For example, in the second terminal 1B and the server 4, since there is already data of the overlapped area 2212 in the partial space data D101, the data in the overlapped area 2212 is not provided in the partial space data D102. Alternatively, as another method, the second terminal 1B and server 4 overwrite the data of the overlapped area 2212 in the partial space data D101 with the data of the overlapped area 2212 of the partial space data D102.
For an area in the space 2, the state, such as the shape, of that area may change on the time axis; for example, a furnishing such as a desk may be moved. In this case, the terminal 1 and the server 4 can determine the change by looking at the difference between the measurement data or the partial space data for each area on the time axis. Based on this determination, for example, when it is desired to reflect the latest state of the space 2, the terminal 1 and the server 4 may perform an overwrite update using the partial space data with the newer measurement date and time. The terminal 1 and the server 4 can also use such determinations to distinguish between the fixed arrangements constituting the space 2 (e.g., a wall, a floor, or a ceiling) and variable-position arrangements (e.g., a desk), and may register attribute information in the space data 6 that distinguishes, for each part, the fixed arrangements from the variable-position arrangements. Alternatively, the space data 6 may be configured so that variable-position arrangements are not components in the first place.
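The overwrite-by-newer-measurement handling of overlapped areas described above can be sketched per area as follows; the per-area keying and the `(shape, measured_at)` record format are assumptions for the illustration.

```python
def merge_partial(space, partial):
    """Merge partial space data into per-area space data, overwriting an
    already-measured area only when the new measurement is more recent.
    `space` and `partial` map area IDs to (shape, measured_at) tuples."""
    for area_id, (shape, measured_at) in partial.items():
        existing = space.get(area_id)
        if existing is None or measured_at > existing[1]:
            space[area_id] = (shape, measured_at)
```

With this rule, an unmeasured area is simply filled in, while an overlapped area reflects whichever terminal measured it last, so the library converges toward the latest state of the space.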
In the DB 5 of the server 4, only the space data of the latest measurement date and time may be held as the space data 6 of the same space 2, or the space data of each measurement date and time may be held as a history. In the latter case, the change of the space 2 on the time axis can be grasped as a history, and the fixed arrangements and the variable-position arrangements can also be discriminated from the differences between the space data of the respective measurement dates and times.
As a modification (referred to as a sixth modification) of the first to third embodiments, the following is also possible. In the sixth modification, when the space data 6 is generated by measuring a certain space 2, the terminals 1 do not share the measurement in advance. In this case, there may be one user or a plurality of users. Each terminal 1 provides its measured space data to another terminal 1 when requested, or registers it in the server 4. Each terminal 1 retrieves, acquires, and uses the space data 6 not held by its own device from the other terminals 1 or the server 4.
The flow of the sixth modification is shown in
In steps S331 and S331b, the first terminal 1A and the information processing apparatus 9 establish communication with each other. When the information processing apparatus 9 is the server 4, the first terminal 1A selects, at the time of establishing communication, the server 4 that manages the space data 6 to be acquired. The selection can be made, for example, from the position information of the space data 6. Alternatively, predetermined information (e.g., an ID and space data acquisition destination information) may be read by recognizing the mark 3, and a communication connection with the server 4 may be established based on that information. When the information processing apparatus 9 is the second terminal 1B, since the apparatus can be specified concretely, communication can be established using communication data held in advance.
In step S332, the first terminal 1A transmits a coordinate system pairing request to the information processing apparatus 9, and in step S332b, the information processing apparatus 9 transmits a coordinate system pairing response to the first terminal 1A.
In step S333, the first terminal 1A transmits a request for various quantity data to the information processing apparatus 9. When the information processing apparatus 9 is the second terminal 1B, this quantity data is the quantity data relating to the second terminal 1B. When the information processing apparatus 9 is the server 4, it is the quantity data relating to the mark 3. In step S333b, the information processing apparatus 9 transmits the requested quantity data to the first terminal 1A, and the first terminal 1A acquires it. When the common coordinate system for acquiring the space data 6 from the second terminal 1B is the space coordinate system, the first terminal 1A acquires from the second terminal 1B and the like the various quantity data, such as that of the mark 3, required for the coordinate system pairing with the space coordinate system.
In step S334, the first terminal 1A measures predetermined features (e.g., the point p1 and the lines v1, v2 in
In step S335, the first terminal 1A calculates the conversion parameter 7 between the terminal coordinate system WA and the common coordinate system WS, using the various quantity data obtained in step S333, which are described in the common coordinate system WS on the side of the second terminal 1B or the mark 3, and the various quantity data obtained in step S334, which are described in the terminal coordinate system WA on its own device side, and sets the conversion parameter 7 in its own device. As a result, sharing of space recognition between the first terminal 1A and the information processing apparatus 9 is obtained.
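The embodiment does not fix the mathematical form of the conversion parameter 7. As a minimal sketch, assuming the measured features reduce to at least three corresponding 3D points described in both coordinate systems, the parameter can be modeled as a rigid transform (rotation R and translation t) estimated by the standard SVD-based Kabsch procedure; the function names here are illustrative, not from the specification:

```python
import numpy as np

def estimate_conversion_parameter(pts_common, pts_terminal):
    """Estimate a rigid transform (R, t) mapping coordinates described in
    the terminal coordinate system WA to the common coordinate system WS,
    from N >= 3 corresponding feature points (Kabsch algorithm).

    pts_common, pts_terminal: (N, 3) arrays of matched points.
    """
    ca = pts_terminal.mean(axis=0)
    cb = pts_common.mean(axis=0)
    H = (pts_terminal - ca).T @ (pts_common - cb)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])        # guard against a reflection solution
    R = Vt.T @ D @ U.T
    t = cb - R @ ca
    return R, t

def to_common(R, t, p):
    """Convert a point described in WA into the common coordinate system WS."""
    return R @ p + t
```

With (R, t) set in the own device, step S337 below amounts to applying the inverse of this transform to each coordinate in the acquired space data 6.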
In steps S336 and S336b, the first terminal 1A makes an inquiry about the holding of the space data 6, and the information processing apparatus 9 responds to the inquiry and transmits the space data 6. First, the first terminal 1A transmits to the information processing apparatus 9 the position information, described based on the common coordinate system, of the area for which the space data 6 is to be acquired. The information processing apparatus 9 answers with a list of the space data 6 related to the inquired area. The area referred to here is, for example, a three-dimensional area surrounded by a rectangular parallelepiped defined by coordinate values, as shown in
In step S337, the first terminal 1A converts the space data 6 acquired in step S336 into space data described in the terminal coordinate system WA of its own device, using the conversion parameter 7, and uses it.
As another method, the conversion parameter 7 may be transmitted to the information processing apparatus 9, and the exchange of position information may be performed with reference to the terminal coordinate system WA.
In steps S338 and S338b, the first terminal 1A and the information processing apparatus 9 confirm whether or not to terminate the coordinate system pairing related to the provision of the space data. In the case of termination (Yes), the process proceeds to steps S339 and S339b; in the case of continuation (No), the process returns to steps S336 and S336b and is repeated in the same manner.
In steps S339 and S339b, the first terminal 1A and the information processing apparatus 9 disconnect the communication connection related to the provision of the space data. The first terminal 1A and the information processing apparatus 9 may explicitly cancel the state of the coordinate system pairing (for example, by deleting the conversion parameter 7), or may continue it thereafter. The first terminal 1A may be connected to the information processing apparatus 9 at all times via communication, or may be connected only when necessary. Basically, a method (client-server method) in which data such as the space data 6 is not held in the terminal 1 may also be used. In this case, the server need not be the server 4 serving as the information processing apparatus 9.
In addition to integrating the acquired space data 6 to create new space data 6, the terminal 1 may simply use the space data 6 acquired from the information processing apparatus 9 for displaying an AR object or the like.
According to the sixth modification, the measurement of the space data 6 by the own device can be omitted without performing the measurement sharing setting in advance, and it is also possible to improve the efficiency of the operation.
A space recognition system and the like according to the fourth embodiment will be described with reference to
The first terminal 1A superimposes and displays, on the display surface 11, an image 2300 representing the area 2A shared by its own device (i.e., the area to be measured), a measuring range, or the like. The image 2300 indicates that the area in the direction in which the image 2300 is visible is the shared area. In this example, the image 2300 is an image representing the boundary surface between the areas 2A and 2B in the space 2 (an image through which the background is visible), but it is not limited thereto; it may be an image representing the three-dimensional area 2A or the like. In this example, the position of the first terminal 1A (and the corresponding user U1) in the space coordinate system W1 is outside the area 2A and faces the area 2A. Therefore, the boundary surface between the areas 2A and 2B is displayed as the image 2300. When, for example, the position of the first terminal 1A is inside the area 2A and is oriented toward an arrangement in the area 2A, an image representing that state is displayed instead of the image 2300.
The user U1 can easily grasp the area 2A and measure it by viewing the image 2300. The user U1 may measure in the direction in which the image 2300 is visible. When the sensitive area of the sensor for measurement provided in the terminal 1 (e.g., the ranging sensor 13) lies in the front direction of the face of the user U1, the user U1 may use the image 2300 as a guideline for the direction in which to point the face during measurement. In other words, the user U1 may turn the face so that the line of sight points within the surface area of the image 2300 at the time of measurement.
As another example, the first terminal 1A may display another image representing an area (area 2B) shared by the other terminal 1 (e.g., the second terminal 1B) in addition to the image 2300 in accordance with the position and orientation.
In this example, the positions and directions (2401, 2402, 2403) are shown when the terminals 1 (1A, 1B, 1C) of the three users (U1, U2, U3) measure simultaneously. Between the terminals 1, the sharing ranges are first calculated in this state. Intersection points (2411, 2412, 2413) are taken between the perpendicular bisector of the line segment connecting adjacent terminals 1 and the boundary line of the space 2 (in this example, the four walls). Each such intersection point (and the corresponding vertical line) is defined as a boundary of the sharing ranges of the space 2. In addition, among the terminals 1, the sharing ranges may be set as initial values, and adjustment may further be performed (for example, the intersection points may be shifted in the horizontal direction) so that the sharing becomes fair (for example, of approximately the same size). Further, in this example, the sharing ranges do not overlap, with the intersection points as boundaries; however, the present invention is not limited to this, and the sharing ranges may overlap at the boundary portions including the intersection points. The shape of the wall or the like of the room can be measured by the shared measurement as shown in
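Treating the floor plan in two dimensions, the boundary computation above amounts to intersecting the perpendicular bisector of each pair of adjacent terminal positions with the wall segments. The following Python sketch (function name and wall representation are assumptions for illustration) shows one way to obtain the intersection points corresponding to 2411 to 2413:

```python
import numpy as np

def bisector_wall_intersections(p1, p2, walls):
    """Find where the perpendicular bisector of the segment p1-p2 meets
    the walls of the room (2D floor plan).

    p1, p2: positions of two adjacent terminals 1.
    walls: list of wall segments, each a pair of endpoints (a, b).
    Returns the intersection points used as sharing-range boundaries.
    """
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    mid = (p1 + p2) / 2.0
    d = p2 - p1
    n = np.array([-d[1], d[0]])          # direction of the bisector
    hits = []
    for a, b in walls:
        a, b = np.asarray(a, float), np.asarray(b, float)
        # Solve mid + s*n = a + u*(b - a) for the scalars s and u.
        A = np.column_stack([n, a - b])
        if abs(np.linalg.det(A)) < 1e-12:
            continue                     # bisector parallel to this wall
        s, u = np.linalg.solve(A, a - mid)
        if 0.0 <= u <= 1.0:
            hits.append(mid + s * n)
    return hits
```

The fairness adjustment described above could then shift these points horizontally along their wall segments so that the resulting ranges have approximately the same size.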
By moving the face of the user U1 along the image 2503 of the horizontal-line arrows so as to change the orientation of the face (the corresponding image 2504), efficient and accurate measurement can be realized. The image 2504 is an example of a display of an image, such as a cursor, representing the orientation of the HMD, the sensor, or the like. The direction and spacing of the horizontal-line arrows in the image 2503 are designed to enable efficient measurement. For example, the interval between two adjacent horizontal-line arrows is selected as an interval at which overlap between measurements is minimized without measurement leakage occurring.
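The embodiment does not give a formula for this arrow interval. As one hedged sketch (the function names and the `overlap_ratio` safety margin are assumptions, not from the specification), the interval can be derived from the sensor's angular field of view so that adjacent scan lines just abut with a small overlap:

```python
def scan_line_spacing(fov_deg, overlap_ratio=0.1):
    """Angular spacing between adjacent guide arrows: each new scan line
    is offset by the sensor's field of view minus a small safety overlap,
    so that coverage has no leakage and minimal redundancy."""
    return fov_deg * (1.0 - overlap_ratio)

def guide_directions(start_deg, end_deg, fov_deg, overlap_ratio=0.1):
    """Elevation angles at which to draw the horizontal-line arrows
    covering the angular range [start_deg, end_deg)."""
    step = scan_line_spacing(fov_deg, overlap_ratio)
    angles, a = [], float(start_deg)
    while a < end_deg:
        angles.append(a)
        a += step
    return angles
```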
Although the present invention has been specifically described based on the embodiments described above, the present invention is not limited to those embodiments, and various modifications can be made without departing from the gist thereof. It is possible to add, delete, replace, or combine the components of the embodiments. Some or all of the functions described above may be implemented in hardware, or may be implemented by software program processing. Programs and data implementing the functions and the like may be stored in a computer-readable storage medium, or may be placed in a device on a communication network.
1: terminal (HMD), 1A: first terminal, 1B: second terminal, 1a, 1b: smart phone, 2: space, 4: server, 6: space data, 7: conversion parameter, 9: information processing apparatus, 11: display surface, 12: camera, 13: ranging sensor, U1: first user, U2: second user, W1: space coordinate system, WA: first terminal coordinate system, WB: second terminal coordinate system, WS: common coordinate system, WT: terminal coordinate system, 21: position, 22: image.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2020/004388 | 2/5/2020 | WO |