The present invention relates to a game system that implements a game in which a plurality of characters acting in a virtual space appear, and a method for controlling a game system.
In the related art, games in which a plurality of characters acting in a virtual space appear and in which a specific character chases another character have been proposed. In this kind of game, the chasing mode of the chasing character may be changed depending on whether the chasing character visually recognizes the chased character.
For example, Patent Literature 1 discloses a game in which a player character (hereinafter also referred to as a “PC”) that acts in accordance with operations of a player and a non-player character (hereinafter also referred to as an “NPC”) appear in a virtual game space. In this game, a monster, which is an NPC, has vision-based and olfaction-based enemy searching abilities. Whether the vision of the monster is usable is determined; if the vision is usable, the monster chases the PC with vision, and if not, the monster chases the PC with olfaction.
Patent Literature 1: Japanese Laid-Open Patent Application Publication No. 2003-175281
However, Patent Literature 1 does not specifically describe how the monster, which is the chasing character, chases the PC with olfaction when the monster does not recognize the position of the PC. Therefore, a realistic chasing action of the chasing character using olfaction cannot be derived from the disclosure of Patent Literature 1.
Accordingly, an object of the invention is to provide a game system, and a method for controlling a game system, capable of imparting reality to chasing performed while the chasing character does not recognize the position of the chased character.
A game system according to an aspect of the invention is a game system including a storage unit and a control unit, wherein the control unit includes: a virtual space generating unit that generates a virtual space; a first character control unit that controls action of a first character moving in the virtual space; a second character control unit that controls action of a second character moving in the virtual space; a trace position storing unit that sequentially stores, in the storage unit, positions of the first character in the virtual space as trace positions at predetermined time intervals; and a chase processing unit that executes chasing of the first character by the second character based on the trace positions stored in the storage unit.
According to the invention, a game system capable of imparting reality to chasing when the chasing character is not recognizing a position of a chased character, and a method for controlling a game system can be provided.
A game system according to one aspect of the invention is a game system including a storage unit and a control unit, wherein the control unit includes: a virtual space generating unit that generates a virtual space; a first character control unit that controls action of a first character moving in the virtual space; a second character control unit that controls action of a second character moving in the virtual space; a trace position storing unit that sequentially stores, in the storage unit, positions of the first character in the virtual space as trace positions at predetermined time intervals; and a chase processing unit that executes chasing of the first character by the second character based on the trace positions stored in the storage unit.
With this configuration, the second character, which is the chasing character, chases the first character, which is the chased character, based on the trace positions. Therefore, reality can be imparted to chasing performed by the second character while it does not recognize the position of the first character.
The trace position storing unit may store, in the storage unit, the trace positions and the times at which the trace positions are stored in association with each other; an upper limit may be set for the number of trace positions stored in the storage unit; and when a new trace position is to be stored in a state in which the number of already stored trace positions has reached the upper limit, the trace position storing unit may delete the trace position stored earliest among the trace positions stored in the storage unit. Therefore, the capacity occupied by the trace position data in the storage unit can be saved.
Alternatively, the trace position storing unit may store, in the storage unit, the trace positions and times at which the trace positions are stored in association with each other; and the trace position storing unit may delete, from the storage unit, a trace position that has been stored in the storage unit for a certain time. Therefore, capacity occupied by the trace position data in the storage unit can be saved.
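For illustration only, the two storage-management schemes described above (an upper limit on the number of stored trace positions, and deletion of trace positions after a certain storage time) might be sketched with a bounded buffer. The names and constants below (TraceStore, MAX_TRACES, MAX_AGE_SEC) are assumptions of this sketch, not part of the disclosure.

```python
from collections import deque

# Illustrative constants, not taken from the specification.
MAX_TRACES = 8
MAX_AGE_SEC = 30.0

class TraceStore:
    def __init__(self):
        # With maxlen set, the oldest entry is dropped automatically
        # once the upper limit is reached (first scheme).
        self.traces = deque(maxlen=MAX_TRACES)

    def store(self, position, now):
        # Each trace position is stored together with its storage time.
        self.traces.append((now, position))

    def expire(self, now):
        # Second scheme: delete trace positions that have been stored
        # for longer than a certain time.
        while self.traces and now - self.traces[0][0] > MAX_AGE_SEC:
            self.traces.popleft()
```

Storing both the time and the position keeps the entries ordered, which the chase process can later use to pick the most recently stored trace position.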
The control unit may include a state change processing unit that changes a state of the second character between a position recognized state in which the second character recognizes the position of the first character and a lost-sight state in which the second character does not recognize the position of the first character; and when the state of the second character is changed from the position recognized state to the lost-sight state, the chase processing unit may execute chasing of the first character by the second character based on the trace positions. Thus, even when the second character, which is the chasing character, loses sight of the first character, which is the chased character, it is possible to give the user a feeling of tension of being chased by the second character.
The chase processing unit may include a trace position specifying unit that specifies a most recently stored trace position as a target position from among the trace positions in a predetermined search range including the second character; and the chase processing unit may execute chasing of the first character by the second character by alternately repeating specifying of the trace position and movement of the second character to the specified target position. Therefore, since one piece of trace data is specified from among trace data in the search range and then the position to move next is determined, a chasing route of the second character can be changed depending on the length of the time interval of storing the trace data and the size of the range in which the second character searches.
When no trace position in the search range is present among the trace positions stored in the storage unit, the trace position specifying unit may specify, as the target position, a predetermined trace position stored before the state is changed to the lost-sight state from among the trace positions outside the search range. Thus, even if the next destination of the second character, which is the chasing character, is not found in the search range, a destination of the second character is found outside the search range. Therefore, chasing by the second character continues.
In a case in which, after the second character is moved to the specified trace position, no trace position stored more recently than the trace position to which the second character was moved is present in the search range, the trace position specifying unit may specify, as the target position, the trace position stored next after the previously specified trace position from among the trace positions outside the search range. Thus, even if the next destination of the second character, which is the chasing character, is not found in the search range, a destination of the second character is found outside the search range. Therefore, chasing by the second character continues.
A game system according to another aspect of the invention includes: a virtual space generating unit that generates a virtual space in which a plurality of objects are arranged; a player character (hereinafter, “PC”) control unit that controls action of a PC moving in the virtual space in response to a user operation; a non-player character (hereinafter, “NPC”) control unit that controls action of an NPC that is a character other than the PC and moves in the virtual space; and a lost-sight state change determining unit that determines to change a state of the NPC from a position recognized state in which the NPC recognizes the position of the PC to a lost-sight state in which the NPC does not recognize the position of the PC when an imaginary line segment connecting the NPC and the PC in the virtual space touches or intersects a specific object among the plurality of objects.
When the imaginary line segment connecting the NPC and the PC touches or intersects the specific object, the PC is located in a position hidden from the NPC by the specific object. Therefore, the lost-sight state can be produced without causing the user to feel that it is unnatural.
When the NPC is in the position recognized state, if the imaginary line segment does not touch or intersect the specific object, the lost-sight state change determining unit may keep the NPC in the position recognized state. Even when the PC is not in the field of view of the NPC, the state of the NPC is not changed from the position recognized state to the lost-sight state as long as the imaginary line segment connecting the NPC and the PC does not touch or intersect the specific object. Thus, it is possible to prevent frequent occurrence of the lost-sight state during a battle between the PC and the NPC.
A lost-sight state release determining unit may be provided that, when the PC enters the field of view of the NPC, releases the lost-sight state of the NPC and determines to change the state of the NPC from the lost-sight state to the position recognized state. With this configuration, reality can be imparted to a situation in which the NPC rediscovers the PC in the virtual space.
A method for controlling a game system according to one aspect of the invention includes: a virtual space generating step of generating a virtual space; a first character controlling step of controlling action of a first character moving in the virtual space; a second character controlling step of controlling action of a second character moving in the virtual space; a trace position storing step of sequentially storing, in a storage unit, positions of the first character in the virtual space as trace positions at predetermined time intervals; and a chase processing step of executing chasing of the first character by the second character based on the trace positions stored in the storage unit.
Below, a game system and a method for controlling a game system according to an embodiment of the invention will be described with reference to the drawings.
A disc-shaped recording medium 30, such as a digital versatile disc (DVD)-ROM, is loadable into the disk drive 12. The disc-shaped recording medium 30 is an example of a nonvolatile recording medium according to the invention. A game program 30a and game data 30b according to the present embodiment are recorded on the recording medium 30. The game data 30b includes various types of data necessary for the progress of the game, such as data necessary to form each character and the virtual space, and sound data to be reproduced in the game. A card-shaped recording medium 31 is loadable into the memory card slot 13. Save data indicating a playing situation, such as the progress of the game, can be recorded on the recording medium 31 according to an instruction from the CPU 10.
The HDD 14 is a large-capacity recording medium built into the game device 2. The game program 30a and the game data 30b read from the disc-shaped recording medium 30, save data, etc. are recorded in the HDD 14. The ROM 15 is semiconductor memory, such as mask ROM or programmable ROM (PROM), in which a startup program for starting up the game device 2, a program for controlling an operation when the disc-shaped recording medium 30 is loaded, etc. are recorded. The RAM 16 is formed by dynamic random-access memory (DRAM), static random-access memory (SRAM), etc. The RAM 16 reads the game program 30a to be executed by the CPU 10, the game data 30b necessary for the execution of the game program 30a, etc. from the disc-shaped recording medium 30 or the HDD 14 in accordance with the play situation of the game and temporarily stores them.
A graphics processor 17, an audio synthesizer 20, a wireless communication controller 23, and a network interface 26 are also connected to the CPU 10 via the bus 11.
Among these components, the graphics processor 17 draws a game image including a virtual game space, each character, etc. according to an instruction from the CPU 10. That is, the graphics processor 17 adjusts position, direction, zoom (angle of view), etc. of a virtual camera set in the virtual space, and captures images of the virtual space. The graphics processor 17 renders the captured image and generates a two-dimensional game image for display. An external display (display unit) 19 is connected to the graphics processor 17 via a video converter 18. The game image drawn by the graphics processor 17 is converted into a moving image format by the video converter 18 and displayed on the display 19.
According to an instruction from the CPU 10, the audio synthesizer 20 reproduces and synthesizes digital sound data included in the game data 30b. An external speaker 22 is connected to the audio synthesizer 20 via an audio converter 21. Thus, the sound data reproduced and synthesized by the audio synthesizer 20 is decoded into an analog form by the audio converter 21 and output from the speaker 22 to the outside. Therefore, a user playing the game can hear the reproduced sound.
The wireless communication controller 23 has a wireless communication module operating in the 2.4 GHz band and is wirelessly connected to a controller 24 attached to the game device 2, so that data can be transmitted and received between them. The user can input a signal to the game device 2 by operating an operating unit, such as a button, provided on the controller 24, and thereby control the action of the player character displayed on the display 19.
The network interface 26 connects the game device 2 to a communication network NW, such as the Internet or a LAN, and enables the game device 2 to communicate with other game devices 2 and the server device 3. The network interface 26 connects the game device 2 to other game devices 2 via the communication network NW to transmit and receive data to and from each other, so that a plurality of player characters can be displayed synchronously in the same virtual space. This configuration enables a multiplayer game in which a plurality of players cooperatively proceed through a game.
Next, with reference to
As illustrated in
In this game, as illustrated in
Further, as indicated by a broken line in
The enemy character N changes its behavior before and after discovering the player character P.
As illustrated in
In this game, a situation in which the enemy character N has lost sight of the player character P occurs.
In this game, the enemy character N in the lost-sight state takes an action to chase the player character P.
The virtual space generating unit 41 generates a three-dimensional virtual space S. In the virtual space S, a specific character chases another character. As described above, in the virtual space S, the player character P is present as a first character, which is the chased character, and the enemy character N is present as a second character, which is the chasing character. NPCs other than the enemy character N are also present in the virtual space S in addition to the player character P and the enemy character N. The NPCs other than the enemy character N may include, for example, characters which attack the enemy character N cooperatively with the player character P, or villagers who are neither allies nor enemies. Further, the virtual space generating unit 41 generates the object A described above and places the object A in the virtual space S. As described above, the object A may include the specific object B and the general objects C1 and C2. In the present embodiment, the specific object B is generated to include an internal space which the player character P can enter. However, the specific object B does not necessarily have to include an internal space which the player character P can enter.
The character control unit 42 includes a first character control unit (first character control means) 42a that controls action of the first character which is the chased character, and a second character control unit (second character control means) 42b that controls action of the second character, which is the chasing character. In the present embodiment, the first character control unit 42a functions as a player character control unit (player character control means) 42a that controls action of the player character P in the virtual space S in response to a user's operation. Hereinafter, the player character control unit will be referred to as a PC control unit. The PC control unit 42a controls various types of action including movement, attack, defense, use of items, and the like of the player character P during the battle in response to, for example, the user operation of the controller 24. Further, in the present embodiment, the second character control unit 42b functions as a non-player character control unit (non-player character control means) 42b that controls action of the NPC in the virtual space S. Hereinafter, the non-player character control unit will be referred to as an NPC control unit. For example, the NPC control unit 42b controls various types of action, such as movement, attack, and defense, of the enemy character N fighting against the player character P. The NPC control unit 42b also controls action of NPCs other than the enemy character N that fight against the player character P.
The state change processing unit 43 processes the state change of the enemy character N. In the present embodiment, the state change processing unit 43 changes the state of the enemy character N among the normal state described above in which the enemy character N takes normal action (
When the enemy character N is not in the position recognized state (e.g., in the normal state or in the lost-sight state), the position recognition determining unit 43a determines whether the player character P is in the field of view V of the enemy character N (field of view determination).
Further, when the player character P enters the field of view V of the enemy character N, the position recognition determining unit 43a determines to change the state of the enemy character N from the current state to the position recognized state. In the present embodiment, when the player character P enters the field of view V of the enemy character N, the state of the enemy character N is changed from the current state to the position recognized state. However, the current state may instead be changed to the position recognized state when the state in which the player character P is in the field of view V of the enemy character N continues for a predetermined time. When the current state of the enemy character N is the lost-sight state, the position recognition determining unit 43a functions as a lost-sight state release determining unit (lost-sight state release determining means) that releases the lost-sight state.
The lost-sight state change determining unit 43b determines whether the imaginary line segment L connecting the enemy character N and the player character P in the virtual space S touches or intersects the specific object B (ray determination). The imaginary line segment L is not actually displayed on the display 19 but is a line segment calculated from position information of a predetermined point M1 positioned in the enemy character N and a predetermined point M2 positioned in the player character P. In the examples illustrated in
Determination as to whether the imaginary line segment L touches or intersects the specific object B may be made in any way. For example, whether a straight line (ray) extending from the point M1 to the point M2 touches or intersects the object A may be determined, or whether a straight line (ray) extending from the point M2 to the point M1 touches or intersects the specific object B may be determined. In the present embodiment, the specific object B includes an internal space which the player character P can enter. Therefore, when the player character P enters the specific object B, the imaginary line segment L between the player character P inside the specific object B and the enemy character N outside the specific object B reliably intersects the specific object B.
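As one non-authoritative way to perform this determination, the segment from the point M1 to the point M2 can be tested against an axis-aligned box standing in for the specific object B using the common slab method. The box representation, coordinate layout, and function name below are assumptions for illustration only.

```python
# Sketch of the "ray determination": does the imaginary line segment
# between M1 (a point on the NPC) and M2 (a point on the PC) touch or
# intersect an axis-aligned box representing the specific object B?
def segment_hits_box(m1, m2, box_min, box_max):
    # Clip the segment parameter t in [0, 1] against each axis slab.
    t_enter, t_exit = 0.0, 1.0
    for axis in range(len(m1)):
        d = m2[axis] - m1[axis]
        if abs(d) < 1e-12:
            # Segment parallel to this slab: it must already lie inside.
            if m1[axis] < box_min[axis] or m1[axis] > box_max[axis]:
                return False
        else:
            t0 = (box_min[axis] - m1[axis]) / d
            t1 = (box_max[axis] - m1[axis]) / d
            if t0 > t1:
                t0, t1 = t1, t0
            t_enter = max(t_enter, t0)
            t_exit = min(t_exit, t1)
            if t_enter > t_exit:
                return False
    return True
```

The same routine works in two or three dimensions, since it iterates over however many coordinates the points carry.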
Further, when the imaginary line segment L touches or intersects the specific object B, the lost-sight state change determining unit 43b determines to change the state of the enemy character N from the position recognized state to the lost-sight state. When the imaginary line segment L does not touch or intersect the specific object B, the lost-sight state change determining unit 43b keeps the enemy character N in the position recognized state. In the present embodiment, when the state in which the imaginary line segment L touches or intersects the specific object B continues for a predetermined time, the state of the enemy character N is changed from the position recognized state to the lost-sight state. However, the state of the enemy character N may instead be changed from the position recognized state to the lost-sight state as soon as the imaginary line segment L touches or intersects the specific object B. Further, the conditions for changing to the lost-sight state may include conditions other than the imaginary line segment L touching or intersecting the specific object B. For example, the conditions for changing to the lost-sight state may include a condition that the player character P takes a specific action, such as entering the internal space of the specific object B, or takes a specific posture, such as crouching on the spot.
Here, a state change process by the state change processing unit 43 will be described with reference to a flowchart illustrated in
As illustrated in
When changed to the position recognized state, the state change processing unit 43 determines whether the state in which the imaginary line segment L is touching or intersecting the specific object B has continued for a predetermined time (step S4). If the state in which the imaginary line segment L is touching or intersecting the specific object B has continued for a predetermined time (step S4: Yes), the state change processing unit 43 changes the state of the enemy character N from the position recognized state to the lost-sight state (step S5, see
When changed to the lost-sight state, the chase processing unit 45 executes the chase process (see step S6,
While the chase process is executed, the state change processing unit 43 determines whether the player character P has entered the field of view V of the enemy character N which is in the lost-sight state (step S7). When the player character P enters the field of view V of the enemy character N which is in the lost-sight state (step S7: Yes, see
When the player character P has not entered the field of view V of the enemy character N which is in the lost-sight state (step S7: No), the state change processing unit 43 determines whether a battle continuation parameter is lower than a threshold value (step S8). Here, the battle continuation parameter is a parameter for determining whether to continue the battle action or the chasing action of the enemy character N. The battle continuation parameter is managed by, for example, the state change processing unit 43 of the game device 2. In the present embodiment, while the enemy character N is in the lost-sight state, the battle continuation parameter gradually decreases with the elapse of time from its value at the time of the change to the lost-sight state. If the battle continuation parameter is not lower than the threshold value (step S8: No), the chase process is continued. If the battle continuation parameter is lower than the threshold value (step S8: Yes), the state of the enemy character N is changed from the lost-sight state to the normal state (step S1).
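The state change flow of steps S1 to S8 might be sketched as follows, under simplifying assumptions: the determinations are reduced to boolean inputs per tick, and the starting value, decay rate, and threshold of the battle continuation parameter are illustrative numbers, not values from the disclosure.

```python
# Illustrative state names; the disclosure describes the states in prose.
NORMAL, RECOGNIZED, LOST_SIGHT = "normal", "recognized", "lost-sight"
THRESHOLD = 0.0       # assumed threshold for the battle continuation parameter
DECAY_PER_TICK = 1.0  # assumed decay rate while in the lost-sight state

class EnemyState:
    def __init__(self):
        self.state = NORMAL
        self.battle_param = 0.0

    def tick(self, pc_in_view, segment_blocked_long_enough):
        if self.state == NORMAL:
            if pc_in_view:                        # step S2: field of view determination
                self.state = RECOGNIZED           # step S3
                self.battle_param = 10.0          # illustrative starting value
        elif self.state == RECOGNIZED:
            if segment_blocked_long_enough:       # step S4: ray determination held
                self.state = LOST_SIGHT           # step S5
        elif self.state == LOST_SIGHT:
            if pc_in_view:                        # step S7: lost-sight state released
                self.state = RECOGNIZED
            else:
                self.battle_param -= DECAY_PER_TICK
                if self.battle_param < THRESHOLD: # step S8
                    self.state = NORMAL           # back to the normal state (step S1)
        return self.state
```

The chase process of step S6 would run alongside this loop whenever the state is the lost-sight state.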
Returning to
In the present embodiment, the trace position storing unit 44 always stores the trace positions at predetermined time intervals while the player character P can move in the virtual space S, but this is not restrictive. For example, the trace positions may be stored at predetermined time intervals only when the enemy character N is in the position recognized state and in the lost-sight state.
In the present embodiment, an upper limit is set for the number of trace positions to be stored in the storage unit. When a new trace position is to be stored in a state in which the number of already stored trace positions has reached the upper limit, the trace position storing unit 44 deletes the trace position stored earliest among the trace positions stored in the storage unit. However, an upper limit for the number of trace positions stored in the storage unit does not necessarily have to be set. In this case, for example, the trace position storing unit 44 may delete, from the storage unit, a trace position that has been stored for a certain time.
When the state of the enemy character N is changed from the position recognized state to the lost-sight state, the chase processing unit 45 executes chasing of the player character P by the enemy character N based on the trace positions stored in the storage unit. The chase processing unit 45 includes a trace position specifying unit (trace position specifying means) 45a. The trace position specifying unit 45a specifies the most recently stored trace position, as the target position, from among the trace positions located in a predetermined search range R including the enemy character N. The search range R is, for example, a range within a predetermined distance from the enemy character N. The size of the search range R may be changed according to the type of the enemy character N. Alternatively, the size of the search range R may be changed each time the trace position specifying unit 45a specifies a trace position. For example, a plurality of types of actions of the enemy character N for specifying the trace position (e.g., a gesture of sniffing the ground) may be prepared in advance, and the size of the search range R may be changed according to the type of the action taken by the enemy character N. In this case, the action of the enemy character N for specifying the trace position may be selected by lottery, or may be selected based on the distance between the player character P and the enemy character N.
When the enemy character N arrives at the specific object B through the chase process, the chase processing unit 45 causes the enemy character N to perform a predetermined action, such as looking into the specific object B, in order to put the player character P inside the specific object B into the field of view V. As a result, when the player character P enters the field of view V of the enemy character N, the lost-sight state of the enemy character N is released. Then, the state of the enemy character N is changed to the position recognized state again. When the player character P is not present inside the specific object B the enemy character N looked into, the chase processing unit 45 continues the chasing action of the enemy character N. When the enemy character N arrives at the specific object B through the chase process, the specific object B at which the enemy character N arrived may be changed to a general object with respect to which losing sight does not occur.
Before the enemy character N arrives at the specific object B, the player character P may be moved out of the specific object B by the user's operation. In this case, unless the player character P enters the field of view V of the enemy character N, the chasing action of the enemy character N is continued.
When the lost-sight state of the enemy character N is released, the chasing action of the enemy character N by the chase processing unit 45 is ended. More specifically, when the player character P enters the field of view V of the enemy character N or when the above-described battle continuation parameter becomes lower than a threshold value, the chasing action of the enemy character N is ended.
Next, with reference to
However, the method for specifying the trace position when no trace position is present in the search range R is not limited to this. For example, when no trace position is present in the search range R, the trace position specifying unit 45a may specify, as the target position, the most recently stored trace position from among the trace positions stored before a point of time that is a predetermined time before the time at which the state of the enemy character N is changed to the lost-sight state. Further, the trace position specifying unit 45a may specify the trace position closest to the search range R as the target position, or may specify the trace position stored at the time of the change to the lost-sight state as the target position.
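One of the alternatives mentioned above, specifying the trace position closest to the search range R when none lies inside it, might be sketched as follows. Since the search range R is centered on the enemy character N, closeness to the range is approximated here by closeness to the enemy character; the function name and data layout are assumptions of this sketch.

```python
import math

# Illustrative fallback when no trace position lies in the search range:
# among all stored traces, pick the position closest to the enemy character.
def fallback_target(enemy_pos, traces):
    """traces: list of (time, position) tuples; returns a position or None."""
    if not traces:
        return None
    return min(traces, key=lambda tp: math.dist(enemy_pos, tp[1]))[1]
```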
However, the method for specifying the trace position in a case in which, after the enemy character N is moved to the trace position, no trace position stored more recently than that trace position is present in the search range R is not limited thereto. For example, the trace position specifying unit 45a may temporarily increase the search range R when the trace position stored more recently than the trace position t5 to which the enemy character N is moved is not present in the search range R. In this case, among the trace positions included in the increased search range R, the trace position specifying unit 45a may specify, as the target position, the trace position that is stored more recently than the trace position t5 and stored earliest excluding the trace position t5. Further, among the trace positions included in the increased search range R and stored more recently than the trace position t5, the trace position specifying unit 45a may specify, as the target position, a trace position located closest to the enemy character N excluding the trace position t5.
Alternatively, when no trace position stored more recently than the trace position t5 to which the enemy character N has been moved is present in the search range R, the trace position specifying unit 45a may move the enemy character N so as to search, near the trace position t5, for the trace positions t1 to t4 stored more recently than the trace position t5. In this case, the search range R moves together with the enemy character N; when any one of the more recently stored trace positions t1 to t4 enters the search range R, the trace position specifying unit 45a may specify, as the target position, the trace position that has entered the search range R. When a plurality of the more recently stored trace positions t1 to t4 enter the search range R as it moves, the trace position specifying unit 45a may specify, as the target position, the trace position stored earliest among the plurality of trace positions that have entered the search range R.
Next, the flow of the chase process illustrated in step S6 in
As illustrated in
When a trace position is present in the search range R (step T1: Yes), the chase processing unit 45 determines whether a trace position stored more recently than the previously specified trace position is present (step T2). When a trace position stored more recently than the previously specified trace position is present (step T2: Yes, see
After specifying the trace position as the target position in steps T3, T4, and T5, the chase processing unit 45 moves the enemy character N to the specified target position (step T6). After moving the enemy character N, the chase processing unit 45 returns to the determination as to whether a trace position is present in the search range R (step T1). In this manner, the chase process is repeated while the enemy character N is in the lost-sight state (see steps S6 to S8 in
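One iteration of the chase process of steps T1 to T6 might be sketched as follows, under assumed names and data layout (traces as (time, position) tuples, an illustrative search radius) and with the movement of step T6 reduced to returning the target position rather than animating the enemy character over several frames.

```python
import math

SEARCH_RADIUS = 5.0  # illustrative size of the search range R

def chase_step(enemy_pos, traces, last_specified_time):
    """One specify-then-move iteration; returns (target, time) or None."""
    # T1: collect trace positions inside the search range around the enemy.
    in_range = [(t, p) for (t, p) in traces
                if math.dist(enemy_pos, p) <= SEARCH_RADIUS]
    # T2/T3: among in-range traces stored more recently than the previously
    # specified one, specify the most recently stored as the target.
    newer = [(t, p) for (t, p) in in_range if t > last_specified_time]
    if newer:
        t, target = max(newer)
    else:
        # T4/T5 (an assumption of this sketch): fall back to the trace
        # stored next after the previously specified one, even if it lies
        # outside the search range, so that chasing can continue.
        later = [(t, p) for (t, p) in traces if t > last_specified_time]
        if not later:
            return None  # no destination left; chasing cannot proceed
        t, target = min(later)
    # T6: the enemy character would now be moved to the target position.
    return target, t
```

Repeating this step while the enemy character remains in the lost-sight state reproduces the alternation of specifying a trace position and moving to it.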
As described above, in the game system 1 according to the present embodiment, the chase processing unit 45 causes the enemy character N to chase the player character P based on the trace positions. Therefore, reality can be imparted to the chasing by the enemy character N even while it is not recognizing the position of the player character P.
When the state of the enemy character N is changed from the position recognized state to the lost-sight state, the chase processing unit 45 causes the enemy character N to chase the player character P based on the trace positions. Therefore, even when the enemy character N loses sight of the player character P, it is possible to give the user the tension of being chased by the enemy character N.
In the related art, a game in which a certain range around an NPC, such as a monster, is set to be a field of view range of the NPC is proposed (for example, Japanese Laid-Open Patent Application Publication No. 2010-88675). In this type of game, when a PC enters the field of view of the NPC, a state of the NPC is changed from a normal state to a battle state. Then, as the PC is moved out of the field of view range of the NPC, the NPC loses sight of the PC. When a predetermined time elapses in that state, the state of the NPC is changed from the battle state to the normal state.
However, this type of lost-sight mode of the NPC may cause the user to feel unnaturalness, because the NPC loses sight of the PC as soon as the PC leaves the certain range around the NPC, even if the PC is still plainly visible to the NPC. Conversely, as long as the PC stays within the certain range around the NPC, the NPC does not lose sight of the PC even if the PC cannot actually be seen by the NPC, which may also cause the user to feel unnaturalness. Further, when the PC is located near a boundary of the certain range, a state in which the PC is discovered by the NPC and a state in which the NPC has lost sight of the PC are frequently switched even when the PC is only slightly moved. This may frequently interrupt the battle, making it difficult to effectively produce a battle scene, which is the most enjoyable part of an action game.
In the present embodiment, when the imaginary line segment L connecting the enemy character N and the player character P touches or intersects the specific object B, the lost-sight state change determining unit 43b determines to change the state of the enemy character N from the position recognized state to the lost-sight state. When the imaginary line segment L touches or intersects the specific object B, the player character P is located in a position hidden from the enemy character N by the specific object B, so the lost-sight state can be produced without causing the user to feel unnaturalness. Further, even when the player character P is not in the field of view V of the enemy character N, the lost-sight state does not occur as long as the imaginary line segment L connecting the enemy character N and the player character P does not touch or intersect the specific object B. Thus, frequent occurrence of the lost-sight state during the battle between the player character P and the enemy character N can be prevented, and a battle scene, which is the most enjoyable part of an action game, can be produced effectively.
Further, when the player character P enters the field of view V of the enemy character N, the position recognition determining unit (lost-sight state release determining unit) 43a releases the lost-sight state of the enemy character N and determines to change the state of the enemy character N from the lost-sight state to the position recognized state. With this configuration, reality can be imparted to a situation in which the enemy character N rediscovers the player character P in the virtual space S.
The invention is not limited to the embodiment described above. Various modified embodiments are possible without departing from the spirit and scope of the invention.
For example, in the above embodiment, the game system 1 that implements a game in which the NPC chases the PC has been described, but the invention is also applicable to a game system that implements a game in which an NPC chases another NPC. That is, the first character, which is the chased character, and the second character, which is the chasing character, may be different NPCs, and the first character control unit 42a may function as an NPC control unit. Further, in the embodiment described above, the NPC appearing in the game implemented by the game system 1 is described as the enemy character N that battles with the player character P, but the invention is not limited to this. The NPC does not necessarily have to battle with the player character P. In this case, the NPC in the position recognized state may be set to take actions other than the battle action.
Further, the specific object B does not necessarily have to include an internal space which the player character P can enter. However, when the specific object B includes such an internal space, the imaginary line segment L can reliably touch or intersect the specific object B when the player character P enters the specific object B. Therefore, the user can more deliberately create a situation in which the enemy character N loses sight of the player character P in the virtual space S.
Although the above embodiment has described a single player character P appearing in the virtual space S, the game implemented by the game system 1 may be a multiplayer game in which a plurality of player characters P appear simultaneously in the same virtual space S.
When a plurality of player characters P are present in the virtual space S, the trace position storing unit 44 may manage which trace positions belong to which player characters in an identifiable manner. Alternatively, the trace position storing unit 44 may manage only positions and times without distinguishing the player characters. The trace position storing unit 44 may store and manage trace positions of all the player characters P present in the virtual space S, or of only some of them.
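A per-character trace store of the kind described above can be sketched as follows. The class name, interface, and bounded history are hypothetical; the source only requires that trace positions be recorded at predetermined time intervals and remain identifiable per character.

```python
class TracePositionStore:
    """Hypothetical trace position storing unit for multiple player
    characters: positions are recorded at fixed time intervals and kept
    identifiable per character, with a bounded history per character."""

    def __init__(self, interval, capacity=32):
        self.interval = interval        # predetermined recording interval
        self.capacity = capacity        # max traces kept per character
        self._traces = {}               # player_id -> [(time, position), ...]
        self._last_time = {}            # player_id -> last recording time

    def record(self, player_id, now, position):
        """Record a trace if the predetermined interval has elapsed."""
        last = self._last_time.get(player_id)
        if last is not None and now - last < self.interval:
            return False                # interval not yet elapsed; skip
        history = self._traces.setdefault(player_id, [])
        history.append((now, position))
        del history[:-self.capacity]    # discard the oldest traces
        self._last_time[player_id] = now
        return True

    def traces_of(self, player_id):
        """Return the identifiable trace history of one player character."""
        return list(self._traces.get(player_id, []))
```

Managing positions without distinguishing characters, as also contemplated above, would amount to collapsing the dictionary into a single shared list.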
If a plurality of player characters P are present in the virtual space S, when the state of the enemy character N is changed from the position recognized state to the lost-sight state, the chase processing unit 45 may cause the enemy character N to chase one player character selected from among the plurality of player characters P. The player character P to be chased by the enemy character N may be selected by lottery from among the plurality of player characters P, or may be the player character P targeted by the enemy character N immediately before it lost sight of the player character P.
Alternatively, a priority indicating whether a player character P is to be preferentially chased by the enemy character N may be set and managed for each player character P. In this case, a player character P having a higher priority may be selected as the player character P to be chased by the enemy character N. The priority may be set and managed according to, for example, the magnitude of damage each player character P has caused to the enemy character N, the level of each player character P, equipped items, current health, and the like. For example, from among the plurality of player characters P, the player character P that has caused the greatest damage to the enemy character N, or the player character P of the highest level, may be selected as the player character P to be chased by the enemy character N.
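The priority-based selection described above can be sketched as a one-line ranking. The criteria (damage dealt first, level as a tie-breaker) and the field names are illustrative choices drawn from the examples above, not a prescribed implementation.

```python
def select_chase_target(players):
    """Pick the player character for the enemy character N to chase:
    highest damage dealt to the enemy wins, ties broken by level
    (illustrative criteria and field names)."""
    return max(players, key=lambda p: (p["damage_dealt"], p["level"]))
```

Other criteria mentioned above (equipped items, current health) would simply extend the key tuple, and a lottery-based selection would replace `max` with a weighted random choice.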
41: Virtual space generating unit
40b: PC control unit
40d: NPC control unit
43a: Position recognition determining unit (lost-sight state release determining unit)
43b: Lost-sight state change determining unit
44: Trace position storing unit
45: Chase processing unit
45a: Trace position specifying unit
A: Object
B: Specific object
L: Imaginary line segment
R: Search range
S: Virtual space
V: Field of view
Number | Date | Country | Kind
---|---|---|---
2016-213570 | Oct 2016 | JP | national
2016-213571 | Oct 2016 | JP | national
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2017/039159 | 10/30/2017 | WO | 00