The present disclosure relates to an information processing device, an information processing method, a program, and a server.
Information processing terminals capable of acquiring position information are recently widespread. In addition, various services using position information are developed. In one example, Patent Literature 1 discloses an information processing method of displaying information entered by the user on a map image in association with position information.
Patent Literature 1: JP 2015-003046A
The services described above, however, do not contemplate a real object whose position varies as a target to be associated with information. Thus, with such services, it is difficult for the user to check information associated with a moving real object.
In view of the above, the present disclosure provides a novel and improved information processing device, information processing method, program, and server, capable of changing display of information associated with a moving real object depending on the position of the real object.
According to the present disclosure, there is provided an information processing device including: a display control unit configured to control display of tag information managed in association with position information of a real object. The display control unit controls the display of the tag information in such a manner that the display of the tag information is changed depending on a change in the position information of the real object.
In addition, according to the present disclosure, there is provided a server including: an object management unit configured to manage an update of position information of a real object on a basis of the collected position information of the real object; and a control unit configured to cause the position information of the real object and tag information managed in association with the position information of the real object to be transmitted to an information processing device.
As described above, according to the present disclosure, it is possible to change the display of the information associated with the moving real object depending on the position of the real object. Note that the effects described above are not necessarily limitative. With or in the place of the above effects, there may be achieved any one of the effects described in this specification or other effects that may be grasped from this specification.
Hereinafter, (a) preferred embodiment(s) of the present disclosure will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
Moreover, the description will be given in the following order.
1. Control of tag display according to present disclosure
1.2. System configuration example according to present disclosure
1.3. Overview regarding control of tag display
1.4. Information processing device 10 according to present disclosure
1.5. Server 20 according to present disclosure
1.6. Real object 30 according to present disclosure
1.7. Modification of functional configuration according to present disclosure
2. First embodiment (battle game intended to contest real object 30)
2.1. Overview of battle game according to first embodiment
2.2. Example of information managed by server 20
2.3. Display control of information regarding battle game
2.4. Simplified display information
2.5. Specifying real object 30 to be attacked
2.6. Display control of specifying the real object 30
2.7. Control of input regarding battle
2.8. Control flow according to present embodiment
2.9. Summary of first embodiment
3. Second embodiment (bomb game using real object 30)
3.1. Overview of bomb game according to second embodiment
3.2. Details of bomb game according to second embodiment
3.3. Summary of second embodiment
4. Third embodiment (collection game intended to collect real objects 30)
4.1. Overview of collection game according to third embodiment
4.2. Details of collection game according to third embodiment
4.3. Summary of third embodiment
5. Fourth embodiment (evaluation function using real object 30)
5.1. Overview of evaluation function according to fourth embodiment
5.2. Details of evaluation function according to fourth embodiment
5.3. Summary of fourth embodiment
6. Fifth embodiment (language guidance using real object 30)
6.1. Overview of language guidance according to fifth embodiment
6.2. Details of language guidance according to fifth embodiment
6.3. Summary of fifth embodiment
7. Hardware configuration example
7.1. Common component
7.2. Component specific to the information processing device 10
Technology called augmented reality (AR) that superimposes additional information on the real space and presents it to the user has recently attracted attention. In AR technology, information presented to users is visualized in various forms of virtual objects such as text, icons, and animation. The virtual object is arranged depending on the position of a real object associated with the virtual object. The virtual object is typically displayed on a display of an information processing terminal.
An application in which AR technology is applied, in one example, makes it possible to associate additional information such as navigation information or advertisements with real objects such as buildings or roads existing in real space and to present it to the user. The application as described above, however, contemplates a real object whose position does not vary as a target to be associated with additional information. An information processing device and a server according to the present disclosure are conceived in view of the above points and are capable of displaying additional information associated with a moving real object. In addition, the information processing device and the server according to the present disclosure are capable of associating new additional information with a moving real object. In the following, characteristics of the information processing device and the server according to the present disclosure and the effects derived from the characteristics will be described.
A configuration example of an information system according to the present disclosure is now described with reference to
In the following description of tag display control according to the present disclosure, a head-mounted display (HMD) is described as an example of the information processing device 10, and a vehicle is described as an example of the real object 30, but the information processing device 10 and the real object 30 are not limited to this example. The information processing device 10 according to the present disclosure may be, in one example, a mobile phone, a smartphone, a tablet, or a personal computer (PC). In addition, the information processing device 10 may be an eyeglass or contact lens type wearable device, an information processing device used by being installed in ordinary eyeglasses, or the like. In addition, the real object 30 according to the present disclosure may be an object such as a ship, an animal, or a chair equipped with a GPS sensor.
Then, an overview of tag display control according to the present disclosure is described with reference to
In this example, the real object 30 transmits its own position information acquired by using a global positioning system (GPS), Wi-Fi, or the like to the server 20. The server 20 transmits the acquired position information of the real object 30 and tag information associated with the real object 30 to the information processing device 10. The information processing device 10 controls the display position of the tag display T1 on the basis of the acquired position information of the real object 30 and the tag information associated with the real object 30.
Further, the server 20, when acquiring the new position information of the real object 30, updates the position information of the real object 30 held in the server 20 and transmits the updated position information to the information processing device 10. The information processing device 10 controls the display position of the tag display T1 on the basis of the acquired new position information of the real object 30. Moreover, the server 20, when updating the position information of the real object 30, may again acquire the tag information associated with the real object 30 and transmit it to the information processing device 10.
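The update flow described above can be sketched in a few lines of code. This is a minimal illustration only; all class, method, and field names here (`Server`, `Device`, `on_position_report`, and so on) are hypothetical and are not part of the disclosure.

```python
# Hypothetical sketch of the position-update flow: the real object 30
# reports its position to the server 20, which updates its held record
# and pushes the new position plus associated tag information to each
# information processing device 10.

class Server:
    def __init__(self):
        self.positions = {}    # object_id -> (latitude, longitude)
        self.tags = {}         # object_id -> list of tag information
        self.subscribers = []  # information processing devices to notify

    def on_position_report(self, object_id, position):
        """Update the held position and transmit it to every device."""
        self.positions[object_id] = position
        for device in self.subscribers:
            device.on_position_update(
                object_id, position, self.tags.get(object_id, []))


class Device:
    def __init__(self):
        self.tag_displays = {}  # object_id -> current display anchor

    def on_position_update(self, object_id, position, tags):
        # Re-anchor the tag display at the object's new position.
        self.tag_displays[object_id] = {"position": position, "tags": tags}
```

In this reading, the display position of a tag display such as T1 simply follows the most recently pushed position of its real object 30.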
The processing regarding the addition of tag information to the real object 30 is now described. The information processing device 10 transmits information entered by the user to the server 20 together with identification information of the target real object 30. The server 20 links the information entered by the user with the target real object 30 on the basis of the acquired contents, and sets it as new tag information. Upon completion of the setting, the server 20 transmits the new tag information and the position information of the real object 30 to the information processing device 10. In addition, the information processing device 10 controls the display position of a new tag display on the basis of the acquired tag information and position information of the real object 30. Moreover, the information processing device 10 is also capable of generating a tag display and controlling the display position without transmitting the information entered by the user to the server 20.
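The tag-addition step on the server side amounts to linking user-entered information with the identification information of the target real object 30. A minimal sketch, with a hypothetical in-memory store standing in for the object management function:

```python
# Hypothetical sketch of linking user-entered information with a real
# object as new tag information. `object_store` maps an object's
# identification information to its list of tag information.

def link_tag(object_store, object_id, user_input):
    """Link the entered information with the target real object and
    return that object's full list of tag information."""
    object_store.setdefault(object_id, []).append(user_input)
    return object_store[object_id]
```

Once the link is stored, the server would transmit the new tag information together with the object's position information back to the information processing device 10, as described above.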
The overview of the tag display control according to the present disclosure is described above. Here, when referring to
As described above, the information processing device 10 according to the present disclosure is capable of controlling the display position of the tag display on the basis of the position information of the moving real object 30 and the tag information associated with the real object 30. In addition, the information processing device 10 is capable of adding new tag information to the moving real object 30.
The information processing device according to the present disclosure is now described in detail. As described above, the information processing device 10 according to the present disclosure has a function of controlling the display of tag information associated with the real object 30. In addition, the information processing device 10 has a function of adding new tag information to the real object 30. A functional configuration example of the information processing device 10 according to the present disclosure is now described with reference to
A communication unit 110 has a function of performing information communication with the server 20 or the real object 30. Specifically, the communication unit 110 receives position information of the real object 30, tag information associated with the real object 30, or the like from the server 20. In addition, the communication unit 110 transmits tag information that is set by an input control unit 150 to be described later or position information of the information processing device 10 to the server 20. In addition, the communication unit 110 may have a function of acquiring identification information, position information, or the like from the real object 30 using short-range wireless communication.
A storage unit 120 has a function of storing programs or various kinds of information to be used by the components in the information processing device 10. Specifically, the storage unit 120 stores identification information of the information processing device 10, setting information related to a filtering function of tag information to be described later, tag information set in the past, or the like.
A target management unit 130 manages the position information of the real object 30 that is acquired from the server 20 and manages the tag information associated with the real object 30. The target management unit 130 has a function of linking the tag information set by the input control unit 150 with the target real object 30.
A display control unit 140 controls display of the tag information managed in association with the position information of the real object in such a manner that the display is changed depending on a change in the position information of the real object. Specifically, the display control unit 140 controls the display of the tag information associated with the real object 30, on the basis of the information managed by the target management unit 130 and the position information and direction information of the information processing device 10 that are acquired from a sensor unit 160 to be described later. In addition, the display control unit 140 has a function of specifying the position of the real object 30 in detail on the basis of the information from the sensor unit 160. The display control unit 140 is capable of specifying the detailed position of the real object 30 or recognizing the target real object 30 by using, in one example, a technique such as image recognition or simultaneous localization and mapping (SLAM). In addition, the display control unit 140 has a function of filtering the tag information to be displayed depending on the type of the tag information. Moreover, the display of the tag information controlled by the display control unit 140 is not limited to the display on a display device. In one example, the display control unit 140 may control tag display using projection mapping by controlling a projection device such as a projector.
The input control unit 150 has a function of setting contents of the tag information. Here, the real object 30 in which the tag information is to be set is specified on the basis of the information acquired by the sensor unit 160. The information to be set as contents of the tag information may be input through a touch panel or various buttons, or may be input by voice or gesture. The input control unit 150 is capable of recognizing the input contents and setting them as the tag information on the basis of the user's voice or gesture information acquired by the sensor unit 160. In addition, the input control unit 150 has a function of estimating the tag information to be set on the basis of a tendency of tag information set in the past or the information acquired from the sensor unit 160. The input control unit 150 is capable of estimating the tag information to be set from, in one example, information related to the user's heart rate, blood pressure, breathing, perspiration, or the like that is acquired from the sensor unit 160.
The sensor unit 160 includes various types of sensors and has a function of collecting information corresponding to the type of sensor. The sensor unit 160 may include, in one example, a GPS sensor, an accelerometer, a gyro sensor, a geomagnetic sensor, an infrared sensor, a barometer, an optical sensor, a temperature sensor, a microphone, or the like. In addition, the sensor unit 160 may include various types of sensors for acquiring physiological data of the user. The physiological data of the user may include, in one example, heart rate, blood pressure, body temperature, respiration, eye movement, galvanic skin response, myoelectric potential, electroencephalogram, or the like.
The server 20 according to the present disclosure is now described in detail. As described above, the server 20 according to the present disclosure has the function of acquiring position information from the real object 30 and updating the position information of the real object 30 held by the server 20. In addition, the server 20 executes various processing corresponding to the mode of an application to be provided while communicating with the information processing device 10. The server 20 according to the present disclosure may include a plurality of information processing devices or may be made redundant or virtualized. The configuration of the server 20 can be changed appropriately depending on conditions regarding the specification or operation of the application. A functional configuration example of the server 20 according to the present disclosure is now described with reference to
A communication unit 210 has a function of performing information communication with the information processing device 10 or the real object 30. Specifically, the communication unit 210 acquires position information from the real object 30, and transmits the position information of the real object 30 and the tag information associated with the real object 30 to the information processing device 10. In addition, the communication unit 210 receives requests for various processing from the information processing device 10, and transmits the processing result corresponding to the mode of an application to the information processing device 10.
A user management unit 220 has a function of managing information related to the information processing device 10 and information related to the user who uses the information processing device 10. The user management unit 220 may be a database that stores the information related to the information processing device 10 and the user. The user management unit 220 stores, in one example, the position information of the information processing device 10, the identification information of the user, or the like. In addition, the user management unit 220 manages various types of information regarding the information processing device 10 and the user depending on the mode of the application.
An object management unit 230 has a function of managing information related to the real object 30. The object management unit 230 may be a database that stores information related to the real object 30. The object management unit 230 stores, in one example, the position information of the real object 30 and the tag information associated with the real object 30. In addition, the object management unit 230 stores various types of information regarding the real object 30 depending on the mode of the application.
A tag linkage unit 240 has a function of linking the real object 30 with the tag information. The tag linkage unit 240 links the identification information of the real object 30 that is acquired from the information processing device 10 with the newly set tag information, and stores it in the object management unit 230. In a case where the server 20 has a function related to the new tag information setting, the tag linkage unit 240 may link the tag information acquired using the function with the target real object 30.
A control unit 250 has a function of controlling each component of the server 20 and causing the components to execute their own processing. The control unit 250 controls the user management unit 220 and the object management unit 230, in one example, on the basis of a request related to registration for new information from the information processing device 10 or the real object 30. In addition, the control unit 250 executes various processing corresponding to the mode of an application to be provided.
Although the functional configuration example of the server 20 according to the present disclosure is described above, the server 20 according to the present disclosure is not limited to the above example, and may further have a configuration other than that illustrated in
The real object 30 according to the present disclosure is now described in detail. The real object 30 according to the present disclosure can be defined as a moving real object such as a vehicle, or a real object that is movable by a third party. A functional configuration of the real object 30 according to the present disclosure is now described with reference to
A communication unit 310 has a function of performing information communication with the server 20 or the information processing device 10. Specifically, the communication unit 310 transmits position information of the real object 30, which is acquired by a position information acquisition unit 320 to be described later, to the server 20. Moreover, the transmission of the position information to the server 20 may be performed periodically or irregularly. In the case where the transmission of the position information is performed irregularly, the information may be transmitted at the timing when the position information of the real object 30 is changed. In addition, the communication unit 310 may have a function of transmitting identification information, position information, or the like of the real object 30 to the information processing device 10 using short-range wireless communication. The short-range wireless communication may include communication by Bluetooth (registered trademark) or radio frequency identification (RFID).
The position information acquisition unit 320 has a function of acquiring the position information of the real object 30. The position information acquisition unit 320 acquires the position information of the real object 30 using, in one example, GPS, Wi-Fi, or the like.
The control of the tag display using the information processing device 10, the server 20, and the real object 30 according to the present disclosure is described above. The functional configuration described above is merely an example, and can be changed appropriately depending on the mode of an application to be provided. In one example, the position information of the real object 30 may be transmitted to the server 20 from the information processing device 10 that identifies the real object. The identification of the real object 30 by the information processing device 10 may be achieved by acquisition of identification information using a QR code (registered trademark) or by using image recognition technology. In addition, in one example, in a case where the real object 30 is a person holding a device capable of acquiring position information, the communication unit 310 of the real object 30 can perform information communication using intra-body communication with the communication unit 110 of the information processing device 10.
Embodiments according to the present disclosure using the information processing device 10, the server 20, and the real object 30 mentioned above are described below in detail.
A battle game according to a first embodiment of the present disclosure is now described with reference to
The user who participates in the game first decides a team to participate at the time of user registration. Moreover, the team to participate may be decided by the server 20 that performs the user registration processing. The user can check the tag display associated with the real object 30 through the information processing device 10 such as HMD and launch an attack against the real object 30 of the opponent team.
The users have individual physical strength and attack power (status). In addition, the real object 30 is also associated with tag information such as an acquisition difficulty level or a rarity level. The battle's victory or defeat is determined depending on the statuses of the user who launches an attack and of the user who owns the real object 30, as well as on the tag information of the real object 30. In this regard, in a case where the user who launches the attack wins, the user can take away the target real object 30 from the original owner. In addition, a user who wins the battle may, as a privilege, rise in status or be given an item or the like available in the game. Furthermore, the user who wins the battle can set a new acquisition difficulty level in the real object 30 in exchange for the user's status. In addition, the user who wins the battle can set an optional tag to be associated with the real object 30. A detailed description of the battle will be given later.
Moreover, the points acquired by each team are obtained from the sum of the acquisition difficulty levels of the real objects 30 owned by the users belonging to each team. The points acquired by each team are counted every predetermined period such as week, month, or the like, and the team's victory or defeat may be determined for each such period.
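The point calculation described above can be read as a simple sum over ownership. The following is a minimal sketch of that reading; the field names (`owner`, `difficulty`) are hypothetical, and the disclosure does not fix an exact data layout.

```python
# Hypothetical sketch: a team's points are the sum of the acquisition
# difficulty levels of the real objects 30 owned by users belonging to
# that team, counted per period (week, month, or the like).

def team_points(objects, team_members):
    """Sum the acquisition difficulty levels of objects owned by the team."""
    members = set(team_members)
    return sum(o["difficulty"] for o in objects if o["owner"] in members)
```

Running this once per counting period over the objects managed by the object management unit 230 would yield each team's total for that period.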
Various types of information used in the battle game according to the present embodiment are now described with reference to
The acquisition difficulty level is an item corresponding to the physical strength of the real object 30. A user who launches an attack can subtract, from the acquisition difficulty level, a numerical value obtained by multiplying the user's attack power by a random number. If the acquisition difficulty level of the real object 30 becomes 0 or less as a result of the attack, the attacking user gains the victory. The acquisition difficulty level is tag information that can be set by the user who wins the battle, and the user can set a new acquisition difficulty level of the real object 30 in exchange for the user's status. Setting a high acquisition difficulty level makes it possible to eliminate or reduce the possibility of the real object 30 being taken away when an attack is launched by another user.
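The attack resolution above (subtract attack power times a random number from the acquisition difficulty level; the attacker wins when it reaches 0 or less) can be sketched directly. The function name and the injectable random source are illustrative conveniences, not part of the disclosure.

```python
import random


def resolve_attack(difficulty, attack_power, rng=random.random):
    """Resolve one attack against a real object.

    Subtracts attack_power multiplied by a random number in [0, 1)
    from the acquisition difficulty level. Returns the remaining
    difficulty level and whether the attacking user gained the victory
    (remaining level is 0 or less).
    """
    damage = attack_power * rng()
    remaining = difficulty - damage
    return remaining, remaining <= 0
```

Passing a fixed `rng` (as a test would) makes the outcome deterministic; in the game itself the default random source keeps each battle's result uncertain.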
The manufacturer, model, and degree of luxury are product information related to the real object 30. The information may be information provided by a manufacturer that makes the real object 30. In addition, in the example illustrated in
The optional tag is tag information set by the user who owns the real object 30, and a user who launches an attack can set it upon winning the battle. The optional tag may be a simple message directed to another user.
The rarity level is a value indicating the scarcity of the real object 30. The rarity level may be calculated from, in one example, the number of real objects 30 of the same model that are managed by the object management unit 230. In other words, the rarity level of a real object 30 whose model accounts for only a small proportion of the whole is set high, and the rarity level of a real object 30 whose model is registered in large numbers is set low. Moreover, in the example illustrated in
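One possible realization of this calculation is to band each model's share of all registered objects into letter grades. The thresholds below are hypothetical; the disclosure states only that rarer models receive higher levels, not a concrete formula.

```python
from collections import Counter

def rarity_levels(models):
    """Assign a letter rarity level (A highest, D lowest) per model,
    based on that model's share of all registered real objects.
    Threshold values are illustrative assumptions only."""
    counts = Counter(models)
    total = len(models)
    levels = {}
    for model, n in counts.items():
        share = n / total
        if share < 0.05:
            levels[model] = "A"   # very few of this model registered
        elif share < 0.15:
            levels[model] = "B"
        elif share < 0.30:
            levels[model] = "C"
        else:
            levels[model] = "D"   # a common model
    return levels
```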
The owner is an item indicating a user who owns the real object 30. Referring to
The information related to the real object 30 managed by the object management unit 230 according to the present embodiment is described above. Moreover, the above-described information managed by the object management unit 230 may be distributed and stored in a plurality of tables. In addition, information other than the above may be managed together. In one example, the object management unit 230 may manage the image information of a vehicle for each model of the real object 30.
Information related to the user (the information processing device 10) managed by the user management unit 220 according to the present embodiment is now described.
The team represents a force on the game to which the user belongs. In the example illustrated in
The physical strength and attack power indicate user status information. The physical strength decreases due to counterattacks from the battle opponent, and when it reaches 0 or less, the user's defeat is decided. As described above, the attack power indicates how much the user can subtract from the acquisition difficulty level of the real object 30 to be attacked.
The ranking is a value indicating a user ranking in the game. The ranking is determined on the basis of points acquired for each user. In addition, the ranking may be a personal ranking of acquired points in the team, or may be a personal ranking of points acquired in all teams.
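A minimal sketch of deriving such a ranking from per-user acquired points follows; the disclosure does not specify the calculation, so this is one plausible reading (highest points first, ranks starting at 1, no tie handling).

```python
# Hypothetical sketch: rank users by acquired points, highest first.
# The same function could be applied per team or across all teams.

def rankings(points_by_user):
    """Return a dict mapping each user to a 1-based rank by points."""
    ordered = sorted(points_by_user.items(), key=lambda kv: -kv[1])
    return {user: rank for rank, (user, _) in enumerate(ordered, start=1)}
```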
The information related to the user (the information processing device 10) managed by the user management unit 220 according to the present embodiment is described above. Moreover, the above-described information managed by the user management unit 220 may be distributed and stored in a plurality of tables. In addition, information other than the above may be managed together. In one example, the user management unit 220 may further manage the user's status such as defense power or hit rate, to make the game more complicated.
The overview of the battle game according to the present embodiment is described above. Then, display control of information regarding the battle game is described.
Further, the tag displays T11 to T13 indicate tag displays associated with the real objects 30a to 30c, respectively. The tag displays T11 to T13 are controlled by the display control unit 140. Moreover, the display control unit 140 may acquire a change in the position information of the real objects 30a to 30c from the server 20 and control the display positions of the tag displays T11 to T13. In addition, the display control unit 140 may control the display positions of the tag displays T11 to T13 using image recognition technology such as SLAM on the basis of the information related to the real objects 30a to 30c that is acquired from the sensor unit 160.
The tag displays T11 to T13 illustrated in
Then, referring to the tag display T12, the same items as in the tag display T11 are displayed on the tag display T12, but the background of the tag display T12 is displayed in a format different from that of the tag display T11. As described above, the display control unit 140 may change the display format of the tag display depending on the tag information associated with the real object 30. In this example, the display control unit 140 controls the display format of the tag display depending on the rarity level that is set in the real object 30. By comparing the tag displays T11 and T12, it can be seen that the rarity level of the real object 30a is D while the rarity level of the real object 30b is A. The user is able to recognize intuitively that the rarity level of the real object 30b is higher by checking the display format of the tag display T12. The display format of the tag display may include color, shape, size, pattern, or the like.
Then, referring to the tag display T13, unlike the tag displays T11 and T12, text information “IN BATTLE!” is displayed. In this example, this message indicates that the real object 30c is being attacked by another user (in battle). As described above, the display control unit 140 is capable of acquiring a situation of processing regarding the real object 30 from the server 20 to control the tag display. In addition, as illustrated in
Further, the display control unit 140 according to the present embodiment may have a function of filtering tag information to be displayed depending on various conditions such as setting and state of the user. In one example, in a case where a predetermined rarity level is set as a condition for the user to display the tag information, the display control unit 140 may display only the tag display regarding the real object 30 associated with the rarity level having a predetermined value or more.
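The rarity-based filtering condition above can be sketched as follows. The letter-to-strength ordering and the field names are illustrative assumptions, since the disclosure describes only "a predetermined value or more" without fixing a representation.

```python
# Hypothetical sketch of the filtering function of the display control
# unit 140: only tag information whose rarity level meets the user's
# configured threshold is passed through for display.

RARITY_ORDER = {"A": 4, "B": 3, "C": 2, "D": 1}  # A is the rarest

def filter_tags(tag_infos, min_rarity="B"):
    """Keep only tag information whose rarity meets the threshold."""
    floor = RARITY_ORDER[min_rarity]
    return [t for t in tag_infos if RARITY_ORDER[t["rarity"]] >= floor]
```

The same pattern extends to the other conditions mentioned in this section, such as filtering on information related to the user's state.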
Further, the display control unit 140 may filter the tag information to be displayed on the basis of the information related to the user's emotion that is acquired by the sensor unit 160. In one example, in a case where the information related to the user's emotion indicates the excited state of the user, the display control unit 140 may perform display control in such a manner to display only the tag information associated with a red-colored vehicle. Moreover, examples of the information related to the user's emotion may include information related to heart rate, blood pressure, eye movement, or the like of the user.
Then, the windows W11 to W14 illustrated in
The window W12 is an area for displaying the position information of the information processing device 10 and the real object 30 on a map. In this example, the position of the information processing device 10 (the user's position) is indicated by a black circle mark, and the position of the real object 30 is indicated by a white triangle or white star mark. In this regard, the display control unit 140 may change the mark indicating the real object 30 depending on the rarity level of the real object 30. In one example, when the rarity level of the real object 30 is a predetermined rarity level or more, the display control unit 140 may cause the real object 30 to be displayed as a white star mark on the map. In addition, the display control unit 140 is capable of performing display control in such a manner as to display, on the map, information other than the real object 30 that is acquired from the server 20. In this example, an item used in the battle game is shown on the map with a heart-shaped mark. The item used in the battle game may be, in one example, one that restores the user's physical strength.
The window W13 is an area for displaying the information related to the user (the information processing device 10) such as the status including the user's physical strength or attack power, the ranking, or the like. The display control unit 140 is capable of causing various kinds of information related to the user that is acquired from the server 20 to be displayed in the window W13. Moreover, in this example, the physical strength of the user is represented as HP, and the attack power is represented as ATK. The display control unit 140 may acquire information related to the team to which the user belongs from the server 20 and cause it to be displayed in the window W13.
The window W14 is an example of an icon used to perform transition to various control screens regarding the battle game. In this manner, the display control unit 140 may control a display interface for the user to perform the processing regarding the battle game. Moreover, it is conceivable that examples of the various control screens regarding the battle game include a screen for user information setting, a screen for communication with other users, or the like.
As described above, it is possible for the display control unit 140 according to the present embodiment to control display of the information related to the user (the information processing device 10) or the information on the processing related to the battle game, in addition to the tag information associated with the real object 30.
Then, the control regarding simplification of the display information by the display control unit 140 is described. The display control unit 140 according to the present embodiment has a function of simplifying the display of the tag information depending on various conditions. Simplifying the display of the tag information makes it possible for the user to intuitively recognize the tag information associated with the real object 30. The display control unit 140 may simplify the display information, in one example, by using icons and changes in color.
The simplification of the display information by the display control unit 140 is now described in detail with reference to
When comparing
The display control unit 140 according to the present embodiment is capable of displaying the tag display in a simplified form on the basis of the moving speed of the real object 30 in consideration of the above situation. In this regard, the moving speed of the real object 30 may be a value calculated by the server 20 from the change in the position information of the real object 30, or may be a value calculated by the information processing device 10 from the information regarding the real object 30 that is acquired from the sensor unit 160.
Referring to
Moreover, the display control unit 140 is also capable of simplifying the information to be displayed on the basis of the moving speed of the user (the information processing device 10). Performing this control makes it possible to reduce the influence on the visual information of the real space perceived by the user and to secure the user's safety while moving. In this event, the display control unit 140 may display the windows W11 to W14 in a simplified form in a similar manner to the tag displays T11 to T13. In addition, the display positions of the windows W11 to W14 may be controlled so as to move to a corner of the user's field of view. The moving speed of the user (the information processing device 10) can be calculated on the basis of the information acquired from the sensor unit 160.
Furthermore, the display control unit 140 is also capable of simplifying the information to be displayed in consideration of the information amount of the tag information associated with the real object 30. In one example, in a case where the number of real objects 30 to be recognized is large, a case where the number of associated pieces of tag information is large, a case where the information amount of the tag information is large, or other similar cases, the display control unit 140 may display the tag display in a simplified form.
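The simplification conditions described above (the moving speed of the real object 30, the moving speed of the user, and the amount of tag information) can be illustrated by a small decision function. The thresholds and the name `display_mode` are hypothetical; the disclosure does not fix concrete values.

```python
def display_mode(object_speed_kmh, user_speed_kmh, tag_count,
                 speed_threshold_kmh=30.0, count_threshold=10):
    """Decide whether the tag display is shown in full or in a
    simplified form (icons / color changes). Thresholds are illustrative."""
    if object_speed_kmh >= speed_threshold_kmh:
        return "simplified"  # fast-moving real object 30
    if user_speed_kmh >= speed_threshold_kmh:
        return "simplified"  # user is moving: keep the field of view clear
    if tag_count >= count_threshold:
        return "simplified"  # large amount of tag information
    return "full"
```

A simplified mode would then render only icons and color cues, while the full mode renders the complete tag text.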
The information display control by the display control unit 140 according to the present embodiment is described above. Then, specifying the real object 30 to be attacked regarding the battle game according to the present embodiment is described with reference to
In the battle game according to the present embodiment, a user who checks the tag display associated with the real object 30 that is a moving vehicle launches an attack against the real object 30, and the battle is started. The display control unit 140 according to the present embodiment has a function of specifying a real object 30 to be attacked on the basis of the information acquired from the sensor unit 160.
The display control unit 140 according to the present embodiment is capable of specifying the target real object 30 using various methods corresponding to the type of sensor included in the sensor unit 160. In one example, in a case where the sensor unit 160 includes a microphone, the display control unit 140 may specify the target real object 30 by using voice recognition. In this event, voice information to be input may be the user's readout of a name of the user who owns the real object 30 or a model name of the real object 30. In addition, in a case where the sensor unit 160 detects an input from a user on an input device such as a touch panel, the display control unit 140 may specify the target real object 30 on the basis of this input information.
Further, in a case where the sensor unit 160 detects information on the user's line of sight, the display control unit 140 may specify the target real object 30 on the basis of the information on the user's line of sight. In this event, the display control unit 140 is capable of specifying, on the basis of a fact that the user's line of sight is fixed to the real object 30 for a predetermined time or longer, the real object 30 as a target. In addition, in a case where the sensor unit 160 detects a gesture of the user, the display control unit 140 may specify the target real object 30 on the basis of information on the user's gesture. In one example, the display control unit 140 is capable of specifying, on the basis of a fact that the user's finger points to the real object 30 for a predetermined time or longer, the real object 30 as a target.
Furthermore, the display control unit 140 may specify the target real object 30 on the basis of both the information on the user's line of sight and the information on the gesture.
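The line-of-sight and gesture based specification described above amounts to a dwell-time check: the real object 30 becomes the target when the user's gaze (or pointing finger) stays on it for a predetermined time or longer. A minimal sketch, assuming timestamped sensor samples of the form `(time, object_id)`; the function name and the 2-second default are illustrative assumptions.

```python
def specify_target(samples, target_id, dwell_time=2.0):
    """samples: chronological (timestamp_sec, object_id) pairs from the
    sensor unit. Return True when the gaze/gesture stays on target_id
    continuously for dwell_time seconds or longer."""
    start = None
    for t, obj in samples:
        if obj == target_id:
            if start is None:
                start = t          # gaze arrived on the target
            if t - start >= dwell_time:
                return True        # fixed for the predetermined time
        else:
            start = None           # gaze left the target: reset
    return False
```

Combining line-of-sight and gesture information, as the disclosure suggests, could be done by requiring both checks to pass.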
In an example illustrated in
Then, the display control of specifying the real object 30 to be attacked is described with reference to
Furthermore, in
Further, as illustrated in
Then, the control of input regarding the battle of the present embodiment is described with reference to
The input control unit 150 is capable of recognizing a battle command from the user on the basis of the user's gesture detected by the sensor unit 160. Here, the battle command may be an instruction to attack the real object 30 by a predetermined gesture or a defense instruction against a counterattack from a battle opponent. In the example illustrated in
The input control unit 150, when recognizing the battle command from the user, transmits contents of the battle command to the server 20 via the communication unit 110. In addition, in this event, the input control unit 150 may deliver the information on the recognized battle command to the display control unit 140. The display control unit 140 is capable of controlling the display including the guide G12 depending on the contents of the battle command. In addition, the display control unit 140 may cause the window W11 to display a fact that the battle command is recognized.
Moreover,
The recognition of the battle command according to the present embodiment is described above. Then, the setting of the tag information after the battle is ended by the input control unit 150 is described. In the battle game according to the present embodiment, after the battle is ended, a user who wins the battle is able to set an optional tag or a new acquisition difficulty level as tag information to be associated with the real object 30.
The input control unit 150 is capable of setting the optional tag or the acquisition difficulty level on the basis of the input information from the user that is detected by the sensor unit 160, in a similar manner to the recognition of the battle command. In one example, the input control unit 150 may set the tag on the basis of the user's voice information.
Further, the input control unit 150 according to the present embodiment may estimate contents of tag information set by the user and may set it as new tag information. The input control unit 150 may estimate the contents of the tag information to be set on the basis of, in one example, a tendency of tag information set by the user in the past, user's gesture information, information related to the user's emotion that is acquired by the sensor unit 160, or the like. In a case where the tag information is estimated on the basis of the tendency of the tag information set by the user in the past, the input control unit 150 is capable of acquiring the information from the storage unit 120 and executing the estimation. In addition, the information related to the user's emotion may include information such as heart rate, blood pressure, eye movement, or the like of the user.
Further, the input control unit 150 may estimate a plurality of patterns of tag information to be set and present it as a setting candidate to the user. In this case, the input control unit 150 may set the contents corresponding to a pattern selected by the user as new tag information and deliver it to the target management unit 130. The target management unit 130 transmits the tag information accepted from the input control unit 150 to the server 20 in association with the target real object 30.
The characteristics of the information processing device 10, the server 20, and the real object 30 in the battle game according to the present embodiment are described above. Then, the control flow regarding the battle game of the present embodiment is described with reference to
A procedure of new registration of the user (information processing device 10) information is now described with reference to
The user management unit 220, when receiving the request from the control unit 250, associates the information related to the user that is delivered from the control unit 250 with a new ID and performs registration processing of the user information (S5003). Subsequently, the user management unit 220 returns a result of the registration processing to the control unit 250 (S5004). In a case where the result of the registration processing that is acquired from the user management unit 220 is normal, the control unit 250 transmits a notification of user information registration to the information processing device 10 (S5005). Moreover, in a case where it is found that the result of the registration processing that is acquired from the user management unit 220 is abnormal, the control unit 250 may create a message corresponding to the result of the registration processing and transmit it to the information processing device 10.
Subsequently, the procedure of new registration of information on the real object 30 is described with reference to
The object management unit 230, when receiving the request from the control unit 250, associates the information related to the real object 30 that is delivered from the control unit 250 with a new ID and performs registration processing of the real object 30 (S5013). Subsequently, the object management unit 230 returns a result of the registration processing to the control unit 250 (S5014). In a case where the result of the registration processing that is acquired from the object management unit 230 is normal, the control unit 250 transmits a registration notification to the real object 30 (S5015). Moreover, in a case where it is found that the result of the registration processing that is acquired from the object management unit 230 is abnormal, the control unit 250 may create a message corresponding to the result of the registration processing and transmit the message to the real object 30.
Then, the procedure of updating the position information of the information processing device 10 is described with reference to
The user management unit 220, when receiving the request, updates the position information of the information processing device 10 on the basis of the new position information of the information processing device 10 that is delivered from the control unit 250 (S5023). Subsequently, the user management unit 220 returns a result of the update processing to the control unit 250 and ends the processing (S5024). Moreover, in a case where it is found that the result of the update processing that is acquired from the user management unit 220 is abnormal, the control unit 250 may create a message corresponding to the result of the update processing and transmit the message to the information processing device 10.
Subsequently, the procedure of updating the position information of the real object 30 is described with reference to
The object management unit 230, when receiving the request, updates the position information of the real object 30 on the basis of the new position information of the real object 30 that is delivered from the control unit 250 (S5033). Subsequently, the object management unit 230 returns a result of the update processing to the control unit 250 and ends the processing (S5034). Moreover, in a case where it is found that the result of the update processing that is acquired from the object management unit 230 is abnormal, the control unit 250 may create a message corresponding to the result of the update processing and transmit the message to the real object 30.
The procedure of acquiring tag information associated with the real object 30 is now described with reference to
Then, the tag linkage unit 240 requests the object management unit 230 to acquire information related to the real object 30 on the basis of the acquired position information of the user (the information processing device 10) (S5045). The object management unit 230, when receiving the request, searches for information on the real object 30 existing in the vicinity of the information processing device 10 on the basis of the position information of the information processing device 10 that is delivered from the tag linkage unit 240 (S5046). Subsequently, the object management unit 230 delivers the acquired information of the real object 30 to the tag linkage unit 240 (S5047).
Then, the tag linkage unit 240, when acquiring the information of the real object 30, transmits the acquired information list of the real object 30 to the target management unit 130 of the information processing device 10 (S5048). Moreover, in a case where it is found that the result of the information acquisition of the real object 30 that is acquired from the object management unit 230 is abnormal, the control unit 250 may create a message corresponding to the information acquisition result and transmit the message to the information processing device 10. Then, the target management unit 130 delivers the acquired information list of the real object 30 to the display control unit 140 (S5049), and ends the processing.
The procedure of acquiring the tag information associated with the real object 30 is described above. As described above, the server 20 is capable of acquiring the information of the real object 30 existing near the information processing device 10 on the basis of the position information of the information processing device 10. This processing makes it possible to achieve an effect of reducing the information amount of the real object 30 that the server 20 transmits to the information processing device 10.
Then, the procedure of controlling the battle according to the present embodiment is described with reference to
Then, the control unit 250 requests the user management unit 220 to acquire information related to the attacker and the owner of the real object 30 set as an attack target (S5052). The user management unit 220, when receiving the request, searches for information on the user on the basis of the user identification information delivered from the control unit 250 (S5053). In this event, the acquired user information includes status information of the attacker and the owner. Subsequently, the user management unit 220 returns the acquired user information to the control unit 250 (S5054).
Then, the control unit 250 requests the object management unit 230 to acquire the information on the real object 30 to be the attack target (S5055). The object management unit 230, when receiving the request, searches for the information on the real object 30 on the basis of the identification information of the real object 30 that is delivered from the control unit 250 (S5056). At this time, the information to be acquired includes the acquisition difficulty level or rarity level associated with the real object 30. Subsequently, the object management unit 230 returns the acquired information related to the real object 30 to the control unit 250 (S5057).
In a case where the acquisition of the user information and the information related to the real object 30 is normally completed, the control unit 250 notifies the display control unit 140 of the information processing device 10 owned by the attacker and the owner of the start of the battle (S5058a and S5058b). Then, the input control unit 150 of the information processing device 10a owned by the attacker recognizes the attack instruction on the basis of the input by the user and requests the control unit 250 of the server 20 to perform the attack processing (S5059). The control unit 250, when receiving the attack request, performs the battle determination on the basis of the attack (S5060). Specifically, the control unit 250 performs processing of subtracting a value obtained by multiplying the attack power of the attacker by a random number from the acquisition difficulty level of the real object 30 to be the attack target. Here, the description will be continued assuming that the acquisition difficulty level of the real object 30 does not become 0 or less after the processing.
Subsequently, the control unit 250 transmits the result of the battle determination to the display control unit 140 of the information processing device 10 of the attacker and the owner (S5061a and S5061b). Then, the input control unit 150 of the information processing device 10b owned by the owner recognizes the attack instruction on the basis of the input by the user and requests the control unit 250 of the server 20 to perform the attack processing (S5062). Moreover, here, in a case where the attack request from the information processing device 10b is not received within a predetermined time, the control unit 250 may perform the subsequent processing without waiting for the attack request. By performing the processing described above, even if the owner does not participate in the battle game, it is possible for the attacker to continue the game.
Then, the control unit 250, when receiving the attack request, performs a battle determination based on the attack (S5063). Specifically, the control unit 250 performs processing of subtracting a value obtained by multiplying the attack power of the owner by a random number from the physical strength of the attacker. Here, the description will be continued assuming that the physical strength of the attacker does not become 0 or less after the processing.
Subsequently, the control unit 250 transmits the result of the battle determination to the display control unit 140 of the information processing device 10 of the attacker and the owner (S5064a and S5064b). Then, steps S5059 to S5063 described above are repeatedly processed until the physical strength of the attacker or the acquisition difficulty level of the real object 30 becomes 0 or less.
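The battle determination in steps S5059 to S5063 can be summarized as the following loop. The subtraction rule (attack power multiplied by a random number) follows the description above, while the function name `battle` and the injectable random source `rng` are assumptions added for illustration.

```python
import random

def battle(attacker_hp, attacker_atk, owner_atk, difficulty,
           rng=random.random):
    """Repeat the battle determinations until the attacker's physical
    strength (HP) or the acquisition difficulty level becomes 0 or less."""
    while True:
        # S5060: subtract attacker ATK x random number from the
        # acquisition difficulty level of the real object 30
        difficulty -= attacker_atk * rng()
        if difficulty <= 0:
            return "attacker wins"
        # S5063: subtract owner ATK x random number from the attacker's HP
        attacker_hp -= owner_atk * rng()
        if attacker_hp <= 0:
            return "owner wins"
```

Passing a deterministic `rng` makes the determination reproducible, which is convenient for testing; in operation the server 20 would use an ordinary random number.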
(Procedure of Setting Tag Information after Completion of Battle)
Then, the procedure of setting tag information after completion of a battle is described with reference to
The user management unit 220, when receiving the request, updates the user information on the basis of the information delivered from the control unit 250 (S5072). Subsequently, the user management unit 220 returns the update result of the user information to the control unit 250 (S5073). In this event, the control unit 250 may create a message corresponding to the result of the update and transmit the message to the information processing device 10 owned by the attacker and the owner.
Then, the winner of the battle sets tag information to be associated with the real object 30. Here, the description is given on the assumption that the attacker wins the battle. The attacker who is the winner of the battle inputs, to the information processing device 10a, a new acquisition difficulty level and an optional tag to be associated with the real object 30. The input control unit 150, when recognizing the input, delivers the setting of the tag information based on the recognized contents to the target management unit 130 (S5074). Here, the input control unit 150 may estimate new tag information on the basis of past trends or information acquired from the sensor unit 160 and deliver it to the target management unit 130. The estimation of the tag information performed by the input control unit 150 makes it possible to reduce the input burden on the user. The target management unit 130 requests the control unit 250 of the server 20 to set the tag information delivered from the input control unit 150 in association with the target real object 30 (S5075).
The control unit 250, when receiving the tag setting request, requests the object management unit 230 to update the information of the real object 30 on the basis of contents of the request (S5076). The object management unit 230 updates the information on the real object 30 on the basis of the information delivered from the control unit 250. Specifically, the object management unit 230 sets the new acquisition difficulty level, the optional tag, and the owner of the real object 30 on the basis of the information delivered from the control unit 250 (S5077). Subsequently, the object management unit 230 returns the result of the update processing to the control unit 250 (S5078). In a case where the result of the update processing acquired from the object management unit 230 is normal, the control unit 250 transmits an update notification of the real object 30 to the display control unit 140 (S5079). Moreover, in a case where it is found that the result of the setting processing acquired from the object management unit 230 is abnormal, the control unit 250 may create a message corresponding to the result of the update processing and transmit the message to the display control unit 140.
The battle game according to the first embodiment of the present disclosure is described above. As described above, the battle game according to the present embodiment is a contest game that targets the moving real object 30. The user is able to check the tag display associated with the real object 30 through the information processing device 10 and perform processing such as attack instruction. In addition, the user is able to set new tag information in the real object 30.
Moreover, in the present embodiment, the description is given of the real object 30 by taking a moving vehicle as an example, but the real object 30 according to the present embodiment is not limited to this example. The real object 30 according to the present embodiment may be a train or an airplane, or may be an animal equipped with a device for transmitting the positional information to the server 20. The functions of the information processing device 10, the server 20, and the real object 30 as described above allow the battle game of the present embodiment to be appropriately changed.
Then, a bomb game according to a second embodiment of the present disclosure is described with reference to
It is assumed that the real object 30 causes an explosion when the associated time information is exhausted due to the countdown and a user within a predetermined range during the explosion drops out of the game as being involved in the explosion. The user is able to move the real object 30 before explosion of the real object 30 to escape the explosion or to cause the user of the opponent team to be involved in the explosion. The following description is given by focusing on the difference from the first embodiment, and the description of the common functions of the information processing device 10, server 20, and real object 30 will be omitted.
The real object 30 according to the second embodiment is defined as an object that can be moved by the user. The real object 30 according to the present embodiment may be, in one example, a chair, a book, or a ball provided with a device for transmitting position information to the server 20. The users are divided into two teams, and move the real objects 30 to involve users of the opponent team in the explosion. A plurality of real objects 30 may be used in the game.
In the example illustrated in
Further, the tag display T25 indicating the range of the explosion is associated with the real object 30d. The display control unit 140 performs display control of the tag display T25 on the basis of the tag information related to the explosion range associated with the real object 30d.
The persons P21 and P22 indicate participants of the game. The tag displays T22 and T23 indicating the teams to which the persons P21 and P22 belong are associated with the persons P21 and P22, respectively. In addition, the tag display T24 indicating the text information "Danger!" is associated with the person P21. The tag display T24 is a tag display indicating a warning to a user located within the explosion range of the real object 30d. In this manner, in the bomb game according to the present embodiment, a person carrying the information processing device 10 can be treated as the real object 30.
The windows W21 and W22 are areas for presenting various kinds of information related to the game to the user. In the example illustrated in
When the time information associated with the real object 30d is exhausted due to the countdown, the control unit 250 of the server 20 acquires the position information of the users participating in the game from the user management unit 220, and makes a hit determination for each user on the basis of the tag information related to the explosion range associated with the real object 30d. In addition, the control unit 250 may perform processing of expanding the explosion range of the real object 30d depending on the number of times users are involved in the explosion. The control unit 250 repeats the processing described above and terminates the game on the basis of the fact that the number of surviving users of either team becomes zero.
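The hit determination by the control unit 250 can be sketched as a simple range check between the bomb position and each participating user's position. The function name `hit_users` and the planar-coordinate representation are assumptions for illustration.

```python
import math

def hit_users(bomb_pos, explosion_range, user_positions):
    """Return the IDs of users within the explosion range.

    bomb_pos: (x, y) position of the real object 30 acting as the bomb.
    user_positions: mapping of user ID -> (x, y) position.
    """
    bx, by = bomb_pos
    return [uid for uid, (ux, uy) in user_positions.items()
            if math.hypot(ux - bx, uy - by) <= explosion_range]
```

Expanding the explosion range, as described above, would correspond to increasing `explosion_range` each time this determination involves additional users.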
The bomb game according to the second embodiment of the present disclosure is described above. As described above, the bomb game according to the present embodiment is a competition game in which the real object 30 that can be moved by the user is regarded as a bomb. In the bomb game according to the present embodiment, a user who owns the information processing device 10 can be treated as the real object 30.
Moreover, in the present embodiment, the description is given of the real object 30 by taking a chair as an example, but the real object 30 according to the present embodiment is not limited to this example. The real object 30 according to the present embodiment may be a ball that is thrown by a user. The bomb game according to the present embodiment may be applied to a game like a snowball fight with an explosion range by using a ball as the real object 30.
Then, a collection game according to a third embodiment of the present disclosure is described with reference to
In the example illustrated in
The windows W31 to W33 are areas for presenting various kinds of information related to the game to the user. In the example illustrated in
In the collection game according to the present embodiment, in addition to the method of specifying the real object 30 that is described in the first embodiment, the acquired points may be added on the basis of the fact that the user actually rides or boards the real object 30. In this case, when the difference between the position information of the real object 30 and the position information of the user (the information processing device 10) is equal to or less than a predetermined value, the control unit 250 of the server 20 may determine that the user rides or boards the real object 30. In addition, the information processing device 10 held by the user who rides or boards the real object 30 may receive the identification information from the real object 30 using short-range wireless communication and transmit it to the server 20.
In a case where the acquired points are added on the basis of the riding or boarding of the real object 30, the highest point may be given to the user who first rode or boarded the real object 30 among the users registered in the server 20. In addition, in a case where the collection game according to the present embodiment is competed by team, a bonus may be added to the acquired points depending on the number of users who ride or board the real object 30 at the same time.
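The boarding determination and point allocation described above can be sketched as follows. The 5-meter threshold, the point values, and the function names are illustrative assumptions; the disclosure only states that the positional difference is compared with a predetermined value.

```python
import math

def is_boarding(object_pos, user_pos, threshold_m=5.0):
    """Judge that the user rides or boards the real object 30 when the
    difference between the two positions is a predetermined value or
    less (the threshold is illustrative)."""
    dx = object_pos[0] - user_pos[0]
    dy = object_pos[1] - user_pos[1]
    return math.hypot(dx, dy) <= threshold_m

def boarding_points(boarding_order, base=100, first_bonus=50):
    """Give the highest point to the user who boarded first."""
    return {uid: base + (first_bonus if i == 0 else 0)
            for i, uid in enumerate(boarding_order)}
```

Alternatively, as noted above, the boarding determination may rely on identification information received from the real object 30 over short-range wireless communication instead of a position comparison.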
Furthermore, the collection game according to the present embodiment can be linked with a company's campaign. In one example, the user is able to obtain a higher acquisition point than usual by specifying a predetermined number or more of a cooperating company's sales vehicles. In addition, the user may be able to obtain other advantages in addition to or in lieu of the acquired points. Here, the other advantage may be a product sold by the cooperating company, key information for downloading the content of another application, or the like.
The collection game according to the third embodiment of the present disclosure is described above. As described above, the collection game according to the present embodiment is a game in which the user competes for acquisition points obtained by recognizing the real object 30. In addition, in the collection game according to the present embodiment, it is also possible to give an acquisition point on the basis of the fact that the user actually rides or boards the real object 30.
Moreover, in the present embodiment, the description is given of the real object 30 by taking transportation such as a vehicle, a train, an airplane, or the like as an example, but the real object 30 according to the present embodiment is not limited to such an example. The real object 30 according to the present embodiment may be, in one example, an animal equipped with a device that transmits position information to the server 20. The use of such an animal as the real object 30 allows the collection game according to the present embodiment to be held as an event at a venue such as a zoo.
Then, an evaluation function according to a fourth embodiment of the present disclosure is described with reference to
In the example illustrated in
In the tag display according to the present embodiment, information related to the evaluation of the real object 30 or the user who owns the real object 30 is displayed. In the tag display T41 illustrated in
Further, the user is able to check the tag information related to the evaluation request set by another user through the information processing device 10 and input the evaluation. In the example illustrated in
Further, in the evaluation function according to the present embodiment, filtering of the tag display may be performed in more detail. In a case where many users use the evaluation function, the amount of tag information controlled by the display control unit 140 becomes enormous, and it is difficult for the user to find the tag displays he or she wants to check. Thus, the user is able to configure the information processing device 10 in such a manner that only tag information of interest is displayed. The information related to such a setting may be stored in the storage unit 120. The display control unit 140 is capable of filtering the tag display to be displayed on the basis of the information set in the storage unit 120. In one example, in the example illustrated in
Further, the display control unit 140 may perform filtering on the basis of the distance to the real object 30. In one example, the display control unit 140 is capable of causing only the tag information associated with the real object 30 existing within a predetermined distance to be displayed on the basis of the position information of the information processing device 10. Further, the display control unit 140 may control the information amount of the tag display on the basis of the distance to the real object 30. The display control unit 140 may cause more detailed information to be included in the tag display as the distance between the information processing device 10 and the real object 30 becomes shorter.
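A minimal sketch of such distance-based filtering and detail control follows (the tag fields, distance thresholds, and function names are assumptions for illustration; a real implementation would use geodesic distance between position coordinates):

```python
import math

# Hypothetical tag records: real-object position plus short and detailed text.
TAGS = [
    {"pos": (35.0, 139.0), "summary": "Taxi: English OK",
     "detail": "English-speaking driver, card payment accepted"},
    {"pos": (36.0, 140.0), "summary": "Hotel",
     "detail": "Rooms available tonight"},
]

def distance(p, q):
    # Simple Euclidean distance; a real system would use geodesic distance.
    return math.hypot(p[0] - q[0], p[1] - q[1])

def visible_tags(device_pos, max_dist=1.0, detail_dist=0.5):
    """Return tag texts for real objects within max_dist of the device,
    including more detailed contents for nearer objects."""
    result = []
    for tag in TAGS:
        d = distance(device_pos, tag["pos"])
        if d > max_dist:
            continue  # filtered out: real object is too far away
        result.append(tag["detail"] if d <= detail_dist else tag["summary"])
    return result

print(visible_tags((35.1, 139.1)))  # only the nearby taxi tag, in detail
```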
The evaluation function according to the fourth embodiment of the present disclosure is described above. As described above, the use of the evaluation function according to the present embodiment makes it possible for the user to evaluate the real object 30 or the owner of the real object 30 through the information processing device 10. In addition, the user is able to request another user, through the information processing device 10, the server 20, and the real object 30, to evaluate a matter concerning the requesting user.
Moreover, in the present embodiment, the description is given of the case where the individual uses the evaluation function as an example, but the use of the evaluation function according to the present embodiment is not limited to such example. In one example, it is also possible for a company to collect evaluation data from consumers in real time by using the evaluation function according to the present embodiment. In addition, the evaluation function according to the present embodiment is expected to cooperate with a campaign or the like that gives a bonus to the user who performs the evaluation.
Next, the language guidance according to a fifth embodiment of the present disclosure is described with reference to
Referring to
As described above, in the language guidance according to the present embodiment, the use of the tag information filtering function makes it possible to filter the language type of the tag information to be displayed. In the example illustrated in
The respective tag displays are now described in detail. The tag display T51 is a type of advertisement for English-speaking users and is associated with the real object 30i shown as a taxi. An English-speaking user is able to learn the contents of the available services by checking the tag display T51 associated with the moving real object 30i. In addition, an English-speaking user is able to intuitively recognize the taxi (the real object 30) associated with the tag display and distinguish vehicles in which service is available in his or her native language. In addition, the tag display T52 is an evaluation comment attached by another user, so an English-speaking user is also able to select a vehicle from which to receive service with reference to the comment from the other user.
The tag display T53 is associated with the real object 30j shown as a smartphone held by the person P51. Here, the person P51 may be a police officer, a security guard, or store staff. An English-speaking user is able to recognize that the person P51 can speak English by checking the tag display T53 associated with the real object 30j held by the person P51.
The tag display T54 is a type of advertisement for English-speaking users and is associated with the real object 30k set on the signboard of a hotel. An English-speaking user is able to recognize that the hotel can provide services in English by checking the tag display T54 associated with the real object 30k. Further, the tag display T55 is an evaluation comment attached by another user, so an English-speaking user is also able to select a hotel at which to stay with reference to the comment from the other user. Moreover, as illustrated in
The language guidance according to the fifth embodiment of the present disclosure is described above. As described above, in the language guidance according to the present embodiment, the use of the tag information filtering function makes it possible to provide information matched to the user's language.
Moreover, in the present embodiment, the case is described in which one type of language is set as the filtering language, but the language guidance according to the present embodiment is not limited to such an example. In the language guidance according to the present embodiment, a plurality of languages may be set as the filtering languages. In one example, setting the filtering languages to both English and Japanese also makes it possible to provide Japanese language education to English-speaking users.
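The multi-language filtering described above could be sketched as follows (the tag structure, language codes, and function name are illustrative assumptions, not part of the present disclosure):

```python
# Hypothetical sketch: keep only tags written in one of the user's
# configured filter languages.
TAGS = [
    {"text": "English OK", "lang": "en"},
    {"text": "日本語対応", "lang": "ja"},
    {"text": "Se habla español", "lang": "es"},
]

def filter_by_language(tags, filter_langs):
    """Return only the tag texts whose language is in filter_langs.

    Setting filter_langs to {"en", "ja"} would, as in the example above,
    expose an English-speaking user to Japanese-language tags as well.
    """
    return [t["text"] for t in tags if t["lang"] in filter_langs]

print(filter_by_language(TAGS, {"en"}))        # English only
print(filter_by_language(TAGS, {"en", "ja"}))  # English and Japanese
```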
The hardware configuration example of the information processing device 10 and the server 20 according to the present disclosure is now described with reference to
A CPU 871 functions as, in one example, an arithmetic processing unit or a control unit, and controls the overall operation of each component or a part thereof on the basis of various programs recorded in a ROM 872, a RAM 873, a storage unit 880, or a removable recording medium 901.
The ROM 872 is a means for storing programs to be fetched by the CPU 871, data used for calculation, or the like. The RAM 873 temporarily or permanently stores, in one example, programs to be fetched by the CPU 871, various parameters appropriately changing at the time of executing the program, or the like.
The CPU 871, the ROM 872, and the RAM 873 are mutually connected via, in one example, a host bus 874 capable of high-speed data transmission. On the other hand, the host bus 874 is connected to an external bus 876 having a relatively low data transmission speed via, in one example, a bridge 875. In addition, the external bus 876 is connected to various components via an interface 877.
Examples of the input unit 878 include a mouse, a keyboard, a touch panel, a button, a switch, a lever, and the like. A further example of the input unit 878 is a remote controller capable of transmitting a control signal using infrared rays or other radio waves.
An output unit 879 is a device capable of visually or audibly notifying the user of acquired information, and examples thereof include a display device such as a cathode ray tube (CRT), an LCD, or an organic EL display, an audio output device such as a speaker or headphones, a printer, a mobile phone, a facsimile, and the like.
The storage unit 880 is a device for storing various types of data. Examples of the storage unit 880 include a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, a magneto-optical storage device, and the like.
A drive 881 is a device that reads out information recorded on the removable recording medium 901 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory or writes information to the removable recording medium 901.
The removable recording medium 901 is, in one example, a DVD medium, a Blu-ray (registered trademark) medium, an HD DVD medium, various kinds of semiconductor storage media, and the like. It may be apparent that the removable recording medium 901 may be, in one example, an IC card equipped with a contactless IC chip, an electronic device, or the like.
A connection port 882 is a port for connection with an external connection device 902, and examples thereof include a universal serial bus (USB) port, an IEEE 1394 port, a small computer system interface (SCSI) port, an RS-232C port, and an optical audio terminal.
The external connection device 902 is, in one example, a printer, a portable music player, a digital camera, a digital video camera, an IC recorder, or the like.
A communication unit 883 is a communication device for connecting to a network 903, and examples thereof include a communication card for wired or wireless LAN, Bluetooth (registered trademark), or wireless USB (WUSB), a router for optical communication, a router for asymmetric digital subscriber line (ADSL), or a modem for various types of communication.
The components common to the information processing device 10 and the server 20 according to the present disclosure are described above. Subsequently, components specific to the information processing device 10 are described. Each of the components described below is not necessarily specific to the information processing device 10, and may be provided in the server 20.
A sensor unit 884 includes a plurality of sensors and manages information acquired by each sensor. The sensor unit 884 includes, in one example, a geomagnetic sensor, an accelerometer, a gyro sensor, a barometer, and an optical sensor. Moreover, the hardware configuration shown here is an example, and some of the components may be omitted. In addition, the hardware configuration of the sensor unit 884 may further include components other than the components described here.
The geomagnetic sensor is a sensor that detects geomagnetism as a voltage value. The geomagnetic sensor may be a triaxial geomagnetic sensor that detects geomagnetism in the X-axis direction, the Y-axis direction, and the Z-axis direction.
The accelerometer is a sensor that detects the acceleration as a voltage value. The accelerometer may be a triaxial acceleration sensor that detects the acceleration along the X-axis direction, the acceleration along the Y-axis direction, and the acceleration along the Z-axis direction.
The gyro sensor is a type of measuring instrument for detecting the angle and angular velocity of an object. The gyro sensor may be a triaxial gyro sensor that detects the speed (angular velocity) at which the rotation angle around the X-axis, the Y-axis, and the Z-axis changes as a voltage value.
The barometer is a sensor that detects ambient atmospheric pressure as a voltage value. The barometer can detect atmospheric pressure at a predetermined sampling frequency.
The optical sensor is a sensor that detects electromagnetic energy such as light. Here, the optical sensor may be a sensor that detects visible light, or a sensor that detects invisible light.
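The sensor unit 884 described above, which manages the information acquired by each of its sensors, could be modeled as a simple aggregator (the class and method names below are assumptions for illustration only):

```python
# Hypothetical sketch of a sensor unit aggregating several sensors.
class SensorUnit:
    def __init__(self):
        # Latest reading per sensor, keyed by sensor name.
        self._readings = {}

    def update(self, sensor_name, value):
        """Store the latest value reported by the named sensor."""
        self._readings[sensor_name] = value

    def read(self, sensor_name):
        """Return the most recent value for the named sensor, if any."""
        return self._readings.get(sensor_name)

unit = SensorUnit()
unit.update("accelerometer", (0.0, 0.0, 9.8))   # triaxial acceleration
unit.update("geomagnetic", (30.0, -5.0, 42.0))  # triaxial geomagnetism
unit.update("barometer", 1013.25)               # atmospheric pressure sample
print(unit.read("barometer"))
```

A display control unit could poll such an aggregator for the device's latest position and direction readings when updating tag display positions.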
As described above, the information processing device 10 according to the present disclosure has a function of controlling display of tag information associated with the moving real object 30. In addition, the information processing device 10 has a function of adding new tag information to the moving real object 30. In addition, the server 20 according to the present disclosure has a function of acquiring position information from the real object 30 and updating the position information of the real object 30 that is held in the server 20. In addition, the server 20 executes various processing corresponding to the mode of the application to be provided while communicating with the information processing device 10. Such a configuration makes it possible to change the display of the information associated with the moving real object depending on the position of the real object.
The preferred embodiment(s) of the present disclosure has/have been described above with reference to the accompanying drawings, whilst the present disclosure is not limited to the above examples. A person skilled in the art may find various alterations and modifications within the scope of the appended claims, and it should be understood that they will naturally come under the technical scope of the present disclosure.
In one example, in the above embodiment, the display control unit 140 of the information processing device 10 controls display of tag information, but the present technology is not limited to this example. The display control of the tag information may be achieved by the server 20. In this case, the server 20 acquires the position information or direction information of the information processing device 10, and so the server 20 is capable of functioning as a display control unit that controls the display position of the tag information associated with the real object 30. In addition, the server 20 may control information display other than tag display to be displayed on the information processing device 10. In one example, the server 20 may perform control to cause the information processing device 10 to display a message related to the result of the processing by the server 20. Furthermore, the server 20 may perform filtering of a tag to be displayed or estimation of tag information to be newly set by the user in the real object 30 on the basis of the information acquired from the sensor unit of the information processing device 10.
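As a hedged sketch of such display control from position and direction information (a simplified 2-D model; the field-of-view value and function name are assumptions for illustration), the controlling side could decide whether and where a tag should appear from the device's position and heading and the real object's position:

```python
import math

def tag_display_offset(device_pos, device_heading_deg, object_pos,
                       fov_deg=120.0):
    """Return the horizontal angle (degrees) of the real object relative to
    the device's heading, or None if the object is outside the field of view.

    A display controller could map this angle to a horizontal screen
    position so that the tag display follows the moving real object.
    """
    dx = object_pos[0] - device_pos[0]
    dy = object_pos[1] - device_pos[1]
    bearing = math.degrees(math.atan2(dx, dy))  # 0 deg = +y axis ("north")
    # Normalize the offset into the range (-180, 180].
    offset = (bearing - device_heading_deg + 180.0) % 360.0 - 180.0
    return offset if abs(offset) <= fov_deg / 2.0 else None

print(tag_display_offset((0.0, 0.0), 0.0, (1.0, 1.0)))   # ahead-right: 45.0
print(tag_display_offset((0.0, 0.0), 0.0, (-5.0, 0.0)))  # outside FOV: None
```

Whether this computation runs on the information processing device 10 or on the server 20 is an implementation choice, as noted above; running it on the server requires the device's position and direction information to be transmitted there.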
Further, the effects described in this specification are merely illustrative or exemplified effects, and are not limitative. That is, with or in the place of the above effects, the technology according to the present disclosure may achieve other effects that are clear to those skilled in the art from the description of this specification.
Additionally, the present technology may also be configured as below.
(1)
An information processing device including:
a display control unit configured to control display of tag information managed in association with position information of a real object,
in which the display control unit controls the display of the tag information in such a manner that the display of the tag information is changed depending on a change in the position information of the real object.
(2)
The information processing device according to (1), further including:
a sensor unit including one or more sensors,
in which the display control unit controls a display position of the tag information depending on the change in the position information of the real object and a change in position information and direction information of the information processing device, the position information and the direction information being collected by the sensor unit.
(3)
The information processing device according to (1) or (2),
in which the display control unit controls the display position of the tag information in such a manner that the display of the tag information follows the real object.
(4)
The information processing device according to (2),
in which the display control unit controls the display of the tag information in such a manner that the display of the tag information is changed depending on a moving speed of the real object, the moving speed being collected by the sensor unit.
(5)
The information processing device according to any one of (2) to (4),
in which the display control unit limits display contents of the tag information on a basis of a fact that the moving speed of the real object exceeds a predetermined speed.
(6)
The information processing device according to any one of (1) to (5),
in which the display control unit, in a case of specifying the real object from information collected by the sensor unit, causes tag information playing a role as an avatar of the real object to be displayed and causes a display position of tag information associated with the real object to be kept.
(7)
The information processing device according to any one of (1) to (6),
in which the display control unit controls the display of the tag information in such a manner that the display of the tag information is changed depending on a distance between the real object and the information processing device.
(8)
The information processing device according to any one of (1) to (7),
in which the display control unit controls display contents of the tag information on a basis of a fact that the distance between the real object and the information processing device exceeds a predetermined distance.
(9)
The information processing device according to any one of (1) to (8),
in which the display control unit performs filtering of tag information to be displayed depending on contents of the tag information.
(10)
The information processing device according to (2), further including:
a target management unit configured to manage the position information of the real object and the tag information in association with each other.
(11)
The information processing device according to (10), further including:
an input control unit configured to set contents of the tag information.
(12)
The information processing device according to (11),
in which the target management unit associates the tag information set by the input control unit with the real object.
(13)
The information processing device according to (11) or (12),
in which the input control unit sets contents estimated from user-related information collected by the sensor unit as the contents of the tag information, and
the information collected by the sensor unit includes user's line of sight, user's gesture, and user's emotion.
(14)
The information processing device according to (12),
in which the target management unit associates tag contents set by the input control unit with the real object specified from information collected by the sensor unit, and
the information collected by the sensor unit includes user's line of sight, user's gesture, voice information, and image information of the real object.
(15)
The information processing device according to (12),
in which the target management unit associates tag contents set by the input control unit with the real object specified using SLAM techniques from information collected by the sensor unit.
(16)
The information processing device according to (12),
in which the target management unit associates tag contents set by the input control unit with the real object specified from information regarding the real object that is collected using short-range wireless communication.
(17)
The information processing device according to any one of (1) to (16),
in which the information processing device is a head-mounted display.
(18)
An information processing method including:
controlling, by a processor, display of tag information managed in association with position information of a real object; and
controlling the display of the tag information in such a manner that the display of the tag information is changed depending on a change in the position information of the real object.
(19)
A program causing a computer to function as an information processing device including:
a display control unit configured to control display of tag information managed in association with position information of a real object,
in which the display control unit controls the display of the tag information in such a manner that the display of the tag information is changed depending on a change in the position information of the real object.
(20)
A server including:
an object management unit configured to manage an update of position information of a real object on a basis of the collected position information of the real object; and
a control unit configured to cause the position information of the real object and tag information managed in association with the position information of the real object to be transmitted to an information processing device.
Number | Date | Country | Kind |
---|---|---|---
2016-001672 | Jan 2016 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---
PCT/JP2016/078813 | 9/29/2016 | WO | 00 |