The present invention relates to an information processing device, an information processing method, and a program.
There is known an augmented reality type game in which players grouped into two teams play against each other in a limited area such as a court or a field, for example. In recent years, there has also been constructed a mechanism in which not only players competing in a game, but also third parties other than the players, for example, viewers who watch the game, support desired players (supported players) (for example, Patent Document 1).
Patent Document 1: Japanese Unexamined Patent Application, Publication No. 2018-015354.
However, conventional games have been designed only to the extent that a viewer's cheer for a player allows a cheering message or stamp to be sent to the player and gives the player points corresponding to the amount of messages or stamps (the degree of cheering); they have not reached a level at which a third party such as a viewer can feel as if he/she participates in the game together with the player. Further, in a game in which objects are displayed while superimposed on real players in this way, how accurately the movements of the players' heads and hands can be tracked is important from the viewpoint of viewing or playing the game. Further, in a game in which a player throws or shoots an object such as a fireball or a ball, the timing at which the object is released from the player's hand according to the player's throwing motion is important from the viewpoint of viewing or playing the game. However, conventional games have been insufficient in tracking accuracy and in the sensation of release from the hand, and viewers have sometimes not enjoyed such games because they feel neither a sense of unity with nor a sense of reality for the players and objects, making the games unattractive to viewers.
The present invention has been made in view of such a situation, and has an object to enable players and third parties other than the players to enjoy games.
In order to attain the above object, an information processing device according to one aspect of the present invention comprises:
According to the present invention, players and other third parties can enjoy the game.
Embodiments of the present invention will be described below with reference to the drawings.
As shown in
Hereinafter, when it is not necessary to identify individual terminals, they will be referred to as a player terminal 1 (a head mounted display 1-1 (hereinafter referred to as “HMD 1-1”) and an arm sensor 1-2) and a supporter terminal 3. When it is not necessary to identify individual players, they will be referred to as a player P, a player PA, a player PB, etc. Further, when it is not necessary to identify individual supporters, they will be referred to as a supporter U, a supporter UA, a supporter UB, etc.
The supporters UA-1 to UA-h cheer, in a cheering area, for the players PA-1 to PA-m and their team. The supporters UB-1 to UB-h cheer, in a cheering area, for the players PB-1 to PB-m and their team.
As shown in
In the following description, the supporter terminals 3-1 to 3-n are referred to as supporter terminals 3 when it is not necessary to identify the individual terminals.
The player terminal 1 detects the movement of the player P wearing it. The HMD 1-1 is provided with a sensor for detecting the movement of the player P, a communication function, a speaker, a display, a camera, and the like. Using these, the HMD 1-1 detects the line of sight of the player P, the movement of the line of sight, and the movement of the head portion of the player P.
The speaker outputs sounds of the game, voices of surrounding cheers, and the like received via the communication function so that the player P can hear them. 3DCG hologram images of the game (3D objects such as an energy ball EB and a shield SH) received via the communication function are displayed on the display while superimposed on the real space that is actually seen through the display. 3DCG is an abbreviation for three-dimensional computer graphics.
The arm sensor 1-2 detects the movement of the arm of the player P. In the game, for example, when the arm is raised in a vertical direction, the arm sensor 1-2 detects the movement of the arm and notifies the server 4 of it, and through the game processing in the server 4, a shield SH is displayed on the display as if standing in front of the player P. Further, when the arm is swung as if to throw a ball, an energy ball EB is displayed on the display as if shot from the hand in the throwing direction. The peripheral equipment of the HMD 1-1 includes the arm sensor 1-2, and the arm sensor 1-2 also communicates with the HMD 1-1. In addition to the arm sensor 1-2, the peripheral equipment of the HMD 1-1 may also include, for example, Vive Tracker, Opti Track, and the like, and these may be used.
The supporter terminals 3-1 to 3-n are information terminals that can be operated by supporters U who support the players PA-1 to PA-m and the players PB-1 to PB-m, and detect various behaviors of the supporters U, including cheering behaviors. The behaviors detected by the supporter terminals 3-1 to 3-n include, for example: voices of the supporters U input to the supporter terminals 3-1 to 3-n; button pushing operations and touch operations by the supporters U; movements in images of the supporters U captured by the camera functions of the supporter terminals 3-1 to 3-n; behaviors of the supporters U moving the supporter terminals 3-1 to 3-n while holding them in their hands; vertical movements of the supporter terminals 3-1 to 3-n detected when the supporters U jump (counting steps by pedometers or the like); and positional displacements of the supporter terminals 3-1 to 3-n detected when the supporters U walk (direction measurement, movement measurement, etc.). “Manpo kei” (a trade name for a pedometer in Japan) is a registered trademark.
The CPU 21 executes various processing according to programs recorded in the ROM 22 or programs loaded from the storage unit 29 into the RAM 23. Data and the like which are necessary for the CPU 21 to execute various processing are also appropriately stored in the RAM 23.
The CPU 21, the ROM 22, and the RAM 23 are connected to one another via the bus 24. The input/output interface 25 is also connected to the bus 24. The display unit 27, the input unit 28, the storage unit 29, the communication unit 30, and the drive 31 are connected to the input/output interface 25.
The display unit 27 comprises a display such as a liquid crystal display, and displays a screen. The input unit 28 comprises various kinds of hardware buttons or software buttons, or the like, and receives various information input according to a user's instructing operation.
The storage unit 29 comprises DRAM (Dynamic Random Access Memory) or the like, and stores various data. The communication unit 30 is, for example, a LAN interface, and communicates with other devices (the player terminals 1, the supporter terminals 3, and the like in the example of
The drive 31 is provided as needed. A removable media 32 comprising a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is appropriately mounted on the drive 31, and data or a program is read from or written to the removable media 32. A program read out from the removable media 32 by the drive 31 is installed into the storage unit 29 as needed. Further, the removable media 32 can also store various data stored in the storage unit 29 in the same manner as the storage unit 29.
The devices other than the server 4, for example, the hardware of the player terminals 1, the supporter terminals 3, and the like are almost the same as that of the server 4, and the illustration thereof is omitted.
The supporter terminals 3-1 to 3-n are information terminals possessed by the supporters U, for example, smart phones, tablet terminals, and the like. Such a terminal includes a touch panel in which the display unit 27 and the input unit 28 are integrated with each other to accept an operation by a human finger or the like, which enables the supporter U to execute touch operations. It is not necessary for all spectators to have the supporter terminals 3-1 to 3-n. In addition, examples of the information terminal include a television device, a personal computer, a game console, and the like.
Further, recent smart phones include various sensors such as an infrared sensor, a gyro sensor, a magnetic sensor (direction sensor), a proximity sensor, a barometric pressure sensor, a temperature sensor, an optical sensor, an acceleration sensor (motion sensor), a heart rate sensor, and a blood pressure sensor, and also include a microphone, a GPS signal reception function, and communication functions such as a wireless LAN function. Therefore, the supporter terminals 3-1 to 3-n themselves can detect human behaviors and send (transmit) them to the outside as behavior information. The server 4 can acquire the behavior information sent from each of the supporter terminals 3-1 to 3-n and perform various processing based on the acquired behavior information.
Next, a functional configuration of the information processing system of
As shown in
The game information storage unit 42 stores not only information for displaying objects on the HMD 1-1 and causing the objects to operate in the game, but also setting information such as parameters, team results, a game history, and the like. The cheering information storage unit 43 stores a table 70 in which a condition (cheering behavior assessment condition) for assessing that a behavior acquired from each supporter terminal 3 is a cheering behavior is associated with information (game reflection content) indicating how a cheering behavior matching the cheering behavior assessment condition is reflected in the game, as shown in
The CPU 21 of the server 4 functions as a game execution unit 51, a Web server unit 52, a behavior information acquisition unit 53, and a reflection unit 54. The game execution unit 51 executes a game in which an object is operated according to the movement of the player P. The game execution unit 51 displays, on HMD 1-1, a real space which the player P wearing the HMD 1-1 sees through the HMD 1-1 while objects (energy ball EB, shield SH, etc.) in an AR space are superimposed on the real space, and executes, on the HMD 1-1, an AR game in which the objects are operated according to the movement of the player P (natural user interface such as movement of the head, hand gesture, and movement of line of sight).
The Web server unit 52 makes an AR game management site open to the public on the network N, and provides services related to games to members who have completed membership registration on the management site. Since registered members are given member IDs and passwords, the members can receive member services by accessing the management site from their own information terminals (HMD 1-1, supporter terminals 3-1 to 3-n, etc.) and logging in to their member pages with the member IDs and passwords.
The behavior information acquisition unit 53 acquires behavior information pertaining to behaviors of third parties other than the players P who are related to the game. Specifically, the behavior information acquisition unit 53 receives and acquires, from the respective supporter terminals 3-1 to 3-n through the communication unit 30, behavior information pertaining to cheering behaviors performed by the supporters U watching the game. In other words, the behavior information acquisition unit 53 acquires information on the cheering behaviors of the supporters U from the supporter terminals 3 possessed by the supporters U. The reflection unit 54 generates a change which affects the game, including the objects, based on the behavior information acquired from the respective supporter terminals 3-1 to 3-n by the behavior information acquisition unit 53. Specifically, the reflection unit 54 controls the game execution unit 51 so as to bring about a state change in the movements of the game, including the objects, according to a behavior which has been assessed as a cheering behavior based on the behavior information.
The reflection unit 54 includes a behavior assessment unit 61, a game reflection unit 62, and the like. The behavior assessment unit 61 assesses a behavior matching a preset condition based on the behavior information acquired from the respective supporter terminals 3-1 to 3-n by the behavior information acquisition unit 53. Specifically, when the behavior information acquisition unit 53 acquires the behavior information from the respective supporter terminals 3-1 to 3-n, the behavior assessment unit 61 refers to the table 70 of the cheering information storage unit 43 to determine whether or not a behavior included in the acquired behavior information matches a cheering behavior assessment condition preset in the table 70, and if a matching behavior is included, assesses the behavior as a cheering behavior. When the change in the situation of the supporter U exceeds a preset threshold value, the behavior assessment unit 61 notifies the game reflection unit 62 of it to cause a change in the game, including the objects. For example, the threshold is a predetermined number of steps or more for the number of steps, a predetermined number of claps or more for handclapping, a predetermined number of dB or more for voice, and the like.
The game reflection unit 62 generates a change affecting the game, including the objects, based on a behavior assessed by the behavior assessment unit 61. In other words, when the behavior assessment unit 61 assesses the behavior as a cheering behavior, the game reflection unit 62 reads out the game reflection content corresponding to the cheering behavior from the table 70, and outputs a control signal to the game execution unit 51 so that the game operates according to the content.
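As a reference illustration only (not a limitation of the present embodiment), the threshold assessment by the behavior assessment unit 61 and the lookup of the game reflection content for the game reflection unit 62 can be sketched as follows. The table contents, function names, and threshold values below are assumptions for explanation; the actual table 70 is defined on the management side.

```python
# Illustrative stand-in for the table 70: each cheering behavior assessment
# condition maps a behavior type to (threshold, game reflection content).
# Entries and values are hypothetical examples.
CHEER_TABLE = {
    "steps": (20, "speed_up_energy_ball"),
    "handclap": (5, "enlarge_shield"),
    "voice_db": (70, "charge_energy_ball"),
}

def assess_behavior(behavior_type, measured_value):
    """Return the game reflection content when the change in the supporter's
    situation meets or exceeds the preset threshold, otherwise None."""
    entry = CHEER_TABLE.get(behavior_type)
    if entry is None:
        return None  # not a registered cheering behavior assessment condition
    threshold, reflection_content = entry
    if measured_value >= threshold:
        return reflection_content  # would be notified to the game reflection unit 62
    return None
```

For example, a voice level of 82 dB would match the hypothetical 70 dB condition and yield its reflection content, while 3 steps would fall below the step threshold and be ignored.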
Next, the operation (the content of the service) of this information processing system will be described with reference to
In the case of this information processing system, the player P who participates in the game and the supporter U who is a spectator log in to the member site made open to the public by the server 4, and the supporter U registers which team (which player) he/she supports. If a favorite team or player P (supported player) is registered in advance at the time of membership registration, the registration of the supported player can be omitted at the time of log-in.
After the game is started, when the supporter U, cheering for a desired team or player P while watching the game, conducts a behavior for the favorite team or player P, for example, “gives a cheer”, “jumps”, or “waves a flag”, the supporter terminals 3-1 to 3-n detect the behavior of the supporter U as behavior information, and transmit the detected behavior information to the server 4.
The behavior of “giving a cheer” is detected as voice data input to the microphones of the supporter terminals 3-1 to 3-n, and the voice data is transmitted as the behavior information of the supporter. The behavior of “jumping” is counted as a number of steps by the pedometer function of the supporter terminals 3-1 to 3-n, and the number of steps is transmitted as the behavior information of the supporter. The behavior of “waving a flag” is recognized as a flag waving behavior by image recognition using the camera function of the supporter terminals 3-1 to 3-n, and a command corresponding to the flag waving behavior is transmitted. If the capacity, communication speed, etc. of the network N allow, an image (video or the like) of the supporter U captured by the camera function of the supporter terminals 3-1 to 3-n may be sent to the server 4 as behavior information. In this way, the behavior of the supporter U is converted into behavior information and then transmitted. The behaviors illustrated here are examples, and there are various cheering behaviors other than the above. Some of the behaviors assessed as cheering behaviors are disclosed in the cheering behavior assessment condition fields of the table 70 of
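As one possible sketch of how a supporter terminal 3 might package a detected behavior into behavior information for transmission to the server 4, consider the following. The JSON field names and payload format are hypothetical; no wire format is specified in this description.

```python
import json
import time

def make_behavior_info(supporter_id, behavior, value):
    """Package a detected behavior as behavior information to be sent to
    the server 4. Field names are illustrative assumptions only."""
    return json.dumps({
        "supporter_id": supporter_id,
        "behavior": behavior,  # e.g. "voice", "steps", "flag_wave"
        "value": value,        # a dB level, a step count, or a command
        "sent_at": time.time(),
    })
```

The server side would decode such a payload and hand it to the behavior information acquisition unit 53 for assessment.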
In the server 4, the behavior information acquisition unit 53 acquires behavior information pertaining to behaviors of the supporters U who are third parties related to the game with the exception of the players P in step S11 of
In step S12, the reflection unit 54 generates a change which affects the game, including the objects, based on the behavior information acquired by the behavior information acquisition unit 53. The processing of step S12 will be described in detail. In step S21, the behavior assessment unit 61 assesses a behavior matching a preset condition based on the behavior information acquired from the respective supporter terminals 3-1 to 3-n by the behavior information acquisition unit 53. Specifically, when the behavior information is acquired from the respective supporter terminals 3-1 to 3-n by the behavior information acquisition unit 53, the behavior assessment unit 61 refers to the table 70 of the cheering information storage unit 43 to determine whether or not a behavior included in the acquired behavior information matches a cheering behavior assessment condition preset in the table 70. If the matching behavior is included, the behavior assessment unit 61 assesses the matching behavior as a cheering behavior, reads out the game reflection content corresponding to the cheering behavior from the table 70, and notifies it to the game reflection unit 62.
Here, an example in which the cheering behavior causes a change that affects the game will be described with reference to
Now, the description returns to the processing. In step S22 of
Next, an information processing system according to a second embodiment of the present invention will be described with reference to
As shown in
As shown in
The arm sensor 1-2 includes an infrared sensor 101, a position detection unit 102, and a communication unit 103. The infrared sensor 101 has a large number of light receiving sensors installed therein, and receives infrared laser beams emitted from base stations 81 at different positions. The position detection unit 102 detects the position of the arm or hand of the player P on which the arm sensor 1-2 is mounted, from the times and angles at which the infrared sensor 101 receives the infrared laser beams. Hereinafter, the arm or hand is collectively referred to as the “hand”. As an example, a tracking technique based on HTC VIVE or the like may be used. The head, hands, and feet are collectively referred to as limbs. The communication unit 103 communicates with the server 4 and the HMD 1-1, and transmits the position information of the hand of the player P detected by the position detection unit 102 to the server 4.
The game execution unit 51 of the server 4 includes an arm tracking unit 91, a head portion tracking unit 92, a synchronization unit 93, and a game control unit 94. The head portion tracking unit 92 tracks the position of the head portion of the player P based on the position information of the head portion of the player P acquired from HMD 1-1. The arm tracking unit 91 tracks the position of the hand based on the position information of the hand of the player P acquired from the arm sensor 1-2.
The synchronization unit 93 synchronizes first tracking information obtained by tracking the position of the head portion by the head portion tracking unit 92 with second tracking information obtained by tracking the position of the hand by the arm tracking unit 91 to acquire the position information of the head portion and the hand at the same timing, that is, detect the positions of the head portion and the hand. The tracking information is transition information in which the position changes in time series (vector information on the distance and the direction). The game control unit 94 controls to display an image (effect) matched with the positions of the head portion and hand acquired (detected) by the synchronization unit 93 in a virtual space in the game.
Subsequently, the operation of the information processing system of the second embodiment will be described with reference to
As shown in
Subsequently, an operation of tracking the position of the head portion in the information processing system will be described with reference to
In the marker tracking, when the player P turns his/her line of sight to the marker M on the wall surface W while the player P stands within the range of the floor F as shown in
In the world tracking, when the player P moves his/her line of sight so as to look around while the player P stands within the range of the floor F as shown in
Next, a synchronization operation of two pieces of tracking information in the information processing system will be described with reference to
The synchronization unit 93 synchronizes first tracking information obtained by tracking the position of the head portion by the head portion tracking unit 92 (information in which the position changes in time series) with second tracking information obtained by tracking the position of the hand by the arm tracking unit 91 (information in which the position changes in time series) to generate tracking synchronization information including the position of the head portion (the coordinates (x, y, z) = (1, 2, 1)) and the position of the hand (the coordinates (x, y, z) = (1, 1, 1)) at the same timing as shown in
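As a reference illustration only, the pairing of the two time-series tracks performed by the synchronization unit 93 can be sketched as follows. The nearest-timestamp pairing strategy and all names here are assumptions for explanation, not the actual implementation.

```python
import bisect

def synchronize(head_track, hand_track):
    """Pair each head-position sample with the hand-position sample nearest
    in time. Each track is a list of (timestamp, (x, y, z)) sorted by
    timestamp; the result gives head and hand positions at the same timing."""
    hand_times = [t for t, _ in hand_track]
    synced = []
    for t, head_pos in head_track:
        i = bisect.bisect_left(hand_times, t)
        # candidate indices: the hand sample just before and just after t
        candidates = [j for j in (i - 1, i) if 0 <= j < len(hand_track)]
        j = min(candidates, key=lambda k: abs(hand_times[k] - t))
        synced.append((t, head_pos, hand_track[j][1]))
    return synced
```

For example, a head sample at time 0.0 at (1, 2, 1) would be paired with the closest hand sample, such as one at time 0.01 at (1, 1, 1), yielding the head and hand positions at effectively the same timing.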
Here, the tracking synchronization information generated by the synchronization unit 93 will be described with reference to
As described above, according to the second embodiment, the game execution unit 51 of the server 4 includes the synchronization unit 93, and the two pieces of tracking information from the HMD 1-1 and the arm sensor 1-2 are synchronized with each other, whereby the position of the head portion of the player P and the position of the hand of the player P are synchronized in time series. Therefore, when the player P moves, the objects of the game can be interlocked with the movements of the head portion and hand of the player P, so that the player P and the supporters U can obtain realistic play sensations from the motions of the player P and the objects (the appearance timings and appearance positions of an energy ball EB, a shield SH, and the like). As a result, the player P and the supporters U who are third parties other than the player P can enjoy the game.
Here, an example of application to a game using the synchronization function of the synchronization unit 93 will be described with reference to
Next, an information processing system according to a third embodiment will be described with reference to
As shown in
The game execution unit 51 of the server 4 includes a synchronization unit 93 and a game control unit 94. The synchronization unit 93 and the game control unit 94 have the same configurations as those of the second embodiment, and the description thereof will be omitted. The game control unit 94 includes a shooting determination unit 121 and an object control unit 122. The object control unit 122 controls the operations of the objects displayed on the HMD 1-1 based on shooting information notified from the shooting determination unit 121.
The shooting determination unit 121 performs shooting determination on an object interlocked with the movement of the player P, for example, an energy ball EB or the like, based on the tracking synchronization information input from the synchronization unit 93, and notifies shooting information including a shooting direction and a shooting timing as a result of the shooting determination to the object control unit 122. More specifically, the shooting determination unit 121 controls to shoot an object (energy ball EB or the like) at such a timing and in such a direction that the object interlocks with the movement of the player P, based on the vector at each position included in the tracking synchronization information input from the synchronization unit 93.
The shooting determination unit 121 includes a shooting timing determination unit 131, and a shooting direction determination unit 132. The shooting timing determination unit 131 determines the shooting timing of an object (when the object is to be shot) based on the vector (direction and speed) at each position included in the tracking synchronization information input from the synchronization unit 93. More specifically, the shooting timing determination unit 131 determines the shooting timing of an object (when the object is to be shot) based on the speed of the vector at each position included in the sequentially input tracking synchronization information.
For example, when the player P swings his/her arm forward (in the direction of his/her line of sight) from bottom to top and performs a shooting operation in the direction of an arrow H as shown in
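As a reference illustration only, one way the shooting timing determination unit 131 could detect the release point from the speed of the hand's vector is sketched below. The threshold value and the "release at the speed peak" rule are assumptions for explanation, not the claimed determination method.

```python
def detect_release(hand_positions, dt, speed_threshold=3.0):
    """Return the sample index at which the object should be shot.
    Sketch rule: release at the first sample where the hand speed has
    exceeded the threshold and then begins to fall (the peak of the swing).
    hand_positions is a list of (x, y, z) samples spaced dt seconds apart."""
    speeds = []
    for a, b in zip(hand_positions, hand_positions[1:]):
        dist = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        speeds.append(dist / dt)
    for i in range(1, len(speeds)):
        if speeds[i - 1] >= speed_threshold and speeds[i] < speeds[i - 1]:
            return i  # hand speed peaked: shoot the object here
    return None  # no release detected in this window
```

With such a rule, an accelerating swing followed by deceleration produces a release exactly at the fastest point of the throwing motion, which matches the sensation of an object leaving the hand.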
The shooting direction determination unit 132 determines the shooting direction of an object (where the object is to be shot) based on the vector at each position included in the tracking synchronization information input from the synchronization unit 93. More specifically, when the player P standing on the floor F directs his/her line of sight E forward (in the direction of the line of sight) in order to shoot an object as shown in
As a result, the object control unit 122 can control to shoot a shot S to a position around the height of the line of sight E as shown in
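As a reference illustration only, the shooting direction determination unit 132 could derive the shooting direction from the hand's vector as sketched below. Using only the last two synchronized samples is a simplifying assumption for explanation.

```python
def shooting_direction(hand_positions):
    """Determine the shooting direction as the normalized displacement of
    the hand over the last two synchronized samples (an illustrative
    simplification of determining the direction from the vector at each
    position). hand_positions is a list of (x, y, z) samples."""
    (x0, y0, z0), (x1, y1, z1) = hand_positions[-2], hand_positions[-1]
    v = (x1 - x0, y1 - y0, z1 - z0)
    norm = sum(c * c for c in v) ** 0.5
    return tuple(c / norm for c in v)
```

The resulting unit vector can then be handed to the object control unit 122 together with the release timing as shooting information.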
As described above, according to the third embodiment, the game execution unit 51 is provided with the shooting determination unit 121 and the object control unit 122, whereby the shooting determination is performed on the object interlocking with the movement of the player P based on the tracking synchronization information input from the synchronization unit 93, and the operation of the object is controlled based on the result of the shooting determination. Therefore, in such a game that the player P throws or shoots an object by his/her hand, the object is smoothly released from the hand according to the throwing motion of the player P, so that attractiveness is enhanced, and the player P and the supporters U can enjoy the game.
Although some embodiments of the present invention have been described above, the present invention is not limited to the above-described embodiments, and modifications, improvements, and the like are included in the present invention to the extent that the object of the present invention can be achieved.
In the above embodiments, the AR game which the player P plays while wearing the HMD 1-1 has been described as an example, but the embodiments can also be applied to, for example, a VR game and an MR game. Further, the embodiments can be applied to a game, a video game, and the like in which the player P plays without wearing the HMD 1-1 while operating a game machine or a controller. VR is an abbreviation for Virtual Reality, and MR is an abbreviation for Mixed Reality. In the VR game, the game execution unit 51 executes a game in which an avatar in a virtual reality space corresponding to the real player P moves according to the movement of the player P or the operation of the game machine, and an image in the virtual reality space changes according to the movement of the avatar. MR games are largely similar to AR games. In other words, in an MR game, the game execution unit 51 displays objects while the objects are superimposed on things in the real space that the player P can see through a transmissive type display, and operates the objects according to the movement of the player P. Further, in the above embodiments, a game in which a plurality of players P are grouped into two teams A and B to play against each other has been described as an example. However, in addition to this game, the present invention is applicable to a game in which a player P plays alone, a game in which a number of teams play against one another in the same field, etc., and is not limited by the number of players P or the content of the game. The game in the present invention has a broad meaning, and includes, for example, show-type video distribution in which one or more players P distribute a video to the supporter terminals 3 and the supporters U send monetary tips (“thrown money”) according to the content of the video. The game also includes sports, events, and the like.
In the third embodiment, an example of a game in which the player P throws an object such as an energy ball EB by hand has been described. In addition to such a game, the control described below is performed in order to support a game in which an object is shot by hand. The object control unit 122 fixes the shooting position (height) in the Y-axis direction, and performs control so that a shoot J flies to a target at a constant height K as shown in
In addition, the arm sensor 1-2 attached to the arm may be used to control shoot and charge. In this case, for example, as shown in
Further, as shown in
Further, in order to prevent erroneous shooting of shoots as much as possible, as shown in
Further, for example, the series of processing described above may be executed by hardware or may be executed by software. Further, one functional block may be configured by a single piece of hardware, a single piece of software, or a combination thereof.
When a series of processing is executed by software, a program constituting the software is installed into a computer or the like from a network or a recording medium. The computer may be a computer incorporated in dedicated hardware. Further, the computer may be a computer which is capable of executing various functions by installing various programs therein, for example, it may be a general-purpose smart phone, PC or the like in addition to a server.
A recording medium containing such a program is configured not only in the form of a removable medium (not shown) to be distributed separately from the main body of a device in order to provide the program to a user, but also in the form of a recording medium to be provided to a user in a state where the recording medium is installed in the main body of a device in advance.
In this specification, the steps for describing a program to be recorded on a recording medium include not only processing to be executed in time series along the order thereof, but also processing which is to be executed in parallel or individually without being necessarily executed in time series. Further, in the present specification, the term of the system means an overall device configured by a plurality of devices, a plurality of units, and the like.
In other words, the information processing system to which the present invention is applied, including the above-described information processing system as the embodiment of
In other words, an information processing device to which the present invention is applied (for example, the server 4 in
By providing this configuration, a cheering behavior itself to be conducted by the supporter U generates a change that affects the game including the object, so that the supporter U who is a third party other than the player P can feel as if he/she participates in the game together with the player P.
The reflection unit (for example, the reflection unit 54 of
By providing this configuration, when the behavior conducted by the supporter U matches a condition, a change that affects the game, including the object, is generated.
Therefore, when the supporter U conducts an effective cheering behavior, a change is generated in the game, so that the supporter U who is a third party other than the player P can feel as if he/she participates in the game.
The game execution unit (for example, the game execution unit 51 in
By providing this configuration, in the AR game or the MR game, a supporter U who is a third party other than the player P can feel as if he/she participates in the game.
The game execution unit (for example, the game execution unit 51 in
By providing this configuration, in the VR game, a supporter U who is a third party other than a player P can feel as if he/she participates in the game.
The condition (for example, a table 70 in
By providing this configuration, the behavior of the supporter U set in the table 70 is reflected to the game, so that a supporter U who is a third party other than the player P can feel as if he/she participates in the game.
The third party (supporter U) has an operable information terminal (for example, the supporter terminal 3 in
By providing this configuration, only the behavior of the supporter U which is set as a cheering behavior on a management side is reflected to the game, so that the game is not hindered by other disturbing behaviors and the like.
The information indicating the change in the situation includes one or more changes of a button pushing operation by the third party (supporter U) on the information terminal (for example, the supporter terminal 3 in
By providing this configuration, it is possible to detect various cheering behaviors of the supporter U by using the existing functions of the information terminal, and it is possible to make various reflection contents matching the cheering behaviors.
The button pushing operation includes not only an operation of pushing a hardware button, but also an operation of pushing a software button (icon, object, or the like) displayed on the screen.
Further, the pushing operation includes not only a simple pushing operation, but also an operation of tracing the screen, an operation of flipping, an operation by a plurality of fingers, and the like.
The game execution unit (for example, the game execution unit 51 in
By providing this configuration, in a game in which an object is displayed while superimposed on a real player P, the movement of the object is synchronized with the movements of the limbs, including the head, hands, and feet, of the player P, which enhances attractiveness and enables the players P and the supporters U to enjoy the game.
The game execution unit (for example, the game execution unit 51 in
1 . . . player terminal, 1-1 . . . HMD, 1-2 . . . arm sensor, 3, 3-1 to 3-n . . . supporter terminal, 4 . . . server, 21 . . . CPU, 29 . . . storage unit, 30 . . . communication unit, 41 . . . member information storage unit, 42 . . . game information storage unit, 43 . . . cheering information storage unit, 51 . . . game execution unit, 52 . . . Web server unit, 53 . . . behavior information acquisition unit, 54 . . . reflection unit, 61 . . . behavior assessment unit, 62 . . . game reflection unit, 93 . . . synchronization unit, 94 . . . game control unit, 121 . . . shooting determination unit, 122 . . . object control unit, 131 . . . shooting timing determination unit, 132 . . . shooting direction determination unit
Number | Date | Country | Kind |
---|---|---|---|
2019-204288 | Nov 2019 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2020/040908 | 10/30/2020 | WO |