This application claims priority to Japanese Patent Application No. 2023-121939 filed on Jul. 26, 2023, the entire contents of which are incorporated herein by reference.
The present disclosure relates to information processing for a game or the like.
Hitherto, there have been games in which an image showing how to hold controllers is displayed before the game starts.
When holding controllers with left and right hands, it is conceivable that a player cannot instantly determine whether or not an image showing how to hold controllers is a laterally inverted image (i.e., an image displayed as if reflected in a mirror). In particular, when the player is required to take a laterally asymmetric posture (i.e., hold the controllers in a laterally asymmetric manner), the player may make an operational error if the player makes a mistake in holding the controllers.
Therefore, it is an object of the exemplary embodiments to provide a non-transitory computer-readable storage medium having a game program stored therein, a game processing system, a game processing apparatus, and a game processing method that are capable of displaying a how-to-hold image whose orientation is easily grasped by a player.
In order to attain the object described above, the following configuration examples are given.
A first configuration example is directed to a non-transitory computer-readable storage medium having stored therein instructions that, when executed by a processor of an information processing apparatus for executing a plurality of types of games in succession, cause the information processing apparatus to:
According to the above first configuration example, by rotating the model object, it becomes easier to understand that the model object is not displayed so as to be laterally inverted as if reflected in a mirror, and in the case of performing the games in succession, the player can be prevented from failing to be ready for each game in time.
In a second configuration example based on the first configuration example, the instructions cause the information processing apparatus to:
According to the above second configuration example, in the case of playing the game while holding the controllers with both hands, respectively, there is no confusion as to whether or not the model object is displayed so as to be laterally inverted.
In a third configuration example based on the second configuration example, the first type of game includes a second type of game in which the pose is set as a laterally asymmetric pose for the model object.
According to the third configuration example, in a game in which a laterally asymmetric pose is made, whether or not the model object is displayed so as to be laterally inverted is a particular problem, but this problem can be substantially eliminated.
In a fourth configuration example based on the third configuration example, the instructions cause the information processing apparatus to:
According to the fourth configuration example, by setting a dominant hand, a game in which a laterally asymmetric pose is made can be performed such that display of an object to be operated or the like is inverted.
In a fifth configuration example based on the second configuration example, the instructions cause the information processing apparatus to:
In a sixth configuration example based on the fifth configuration example, the third type of game is a game in which the pose is set as a pose in which the left and right hands are positioned in front of the model object.
According to the sixth configuration example, each hand part, of the model object, holding the controller can be prevented from being hidden by the trunk of the model object.
In a seventh configuration example based on the second configuration example, the instructions cause the information processing apparatus to, for a fifth type of game in which the pose is set as a laterally symmetric pose for the model object and which is not included in the first type of game, display the model video as an animation remaining in an orientation in which the front side or the back side of the model object is displayed.
According to the seventh configuration example, for some games in which an object to be operated is laterally symmetric, the presence or absence of lateral inversion does not become a problem, so that the model video can be an animation in which the front side or the back side of the model object remains displayed.
In an eighth configuration example based on the first configuration example, the instructions cause the information processing apparatus to:
In a ninth configuration example based on the eighth configuration example, the instructions cause the information processing apparatus to, for the first type of game, display the model video as an animation in which the model object makes the pose, while rotating the model object or the virtual camera.
According to the ninth configuration example, by the model object making the pose using the animation, it can be made easier to understand the pose.
According to the exemplary embodiment, it is possible to provide a non-transitory computer-readable storage medium having a game program stored therein, a game processing system, a game processing apparatus, and a game processing method that are capable of displaying a how-to-hold image whose orientation is easily grasped by a player.
Hereinafter, an exemplary embodiment will be described.
An information processing system (game system) according to an example of the exemplary embodiment will now be described. An example of a game system 1 according to the exemplary embodiment includes a main body apparatus (an information processing apparatus, which functions as a game apparatus main body in the exemplary embodiment) 2, a left controller 3, and a right controller 4. Each of the left controller 3 and the right controller 4 is attachable to and detachable from the main body apparatus 2. That is, the game system 1 can be used as a unified apparatus obtained by attaching each of the left controller 3 and the right controller 4 to the main body apparatus 2. Further, in the game system 1, the main body apparatus 2, the left controller 3, and the right controller 4 can also be used as separate bodies (see
The shape and the size of the housing 11 are discretionary. As an example, the housing 11 may be of a portable size. Further, the main body apparatus 2 alone or the unified apparatus obtained by attaching the left controller 3 and the right controller 4 to the main body apparatus 2 may function as a mobile apparatus. The main body apparatus 2 or the unified apparatus may function as a handheld apparatus or a portable apparatus.
As shown in
The main body apparatus 2 includes a touch panel 13 on the screen of the display 12. In the exemplary embodiment, the touch panel 13 is of a type capable of receiving a multi-touch input (e.g., electrical capacitance type). However, the touch panel 13 may be of any type, and may be, for example, of a type capable of receiving a single-touch input (e.g., resistive film type).
The main body apparatus 2 includes speakers (i.e., speakers 88 shown in
Further, the main body apparatus 2 includes a left terminal 17, which is a terminal for the main body apparatus 2 to perform wired communication with the left controller 3, and a right terminal 21, which is a terminal for the main body apparatus 2 to perform wired communication with the right controller 4.
As shown in
The main body apparatus 2 includes a lower terminal 27. The lower terminal 27 is a terminal for the main body apparatus 2 to communicate with a cradle. In the exemplary embodiment, the lower terminal 27 is a USB connector (more specifically, a female connector). Further, when the unified apparatus or the main body apparatus 2 alone is mounted on the cradle, the game system 1 can display on a stationary monitor an image generated by and outputted from the main body apparatus 2. Further, in the exemplary embodiment, the cradle has the function of charging the unified apparatus or the main body apparatus 2 alone mounted on the cradle. Further, the cradle has the function of a hub device (specifically, a USB hub).
The left controller 3 includes a left analog stick (hereinafter, referred to as a “left stick”) 32 as an example of a direction input device. As shown in
The left controller 3 includes various operation buttons. The left controller 3 includes four operation buttons 33 to 36 (specifically, a right direction button 33, a down direction button 34, an up direction button 35, and a left direction button 36) on the main surface of the housing 31. Further, the left controller 3 includes a record button 37 and a “−” (minus) button 47. The left controller 3 includes a first L-button 38 and a ZL-button 39 in an upper left portion of a side surface of the housing 31. Further, the left controller 3 includes a second L-button 43 and a second R-button 44, on the side surface of the housing 31 on which the left controller 3 is attached to the main body apparatus 2. These operation buttons are used to give instructions depending on various programs (e.g., an OS program and an application program) executed by the main body apparatus 2.
Further, the left controller 3 includes a terminal 42 for the left controller 3 to perform wired communication with the main body apparatus 2.
Similarly to the left controller 3, the right controller 4 includes a right analog stick (hereinafter, referred to as a “right stick”) 52 as a direction input section. In the exemplary embodiment, the right stick 52 has the same configuration as that of the left stick 32 of the left controller 3. Further, the right controller 4 may include a directional pad, a slide stick that allows a slide input, or the like, instead of the analog stick. Further, similarly to the left controller 3, the right controller 4 includes four operation buttons 53 to 56 (specifically, an A-button 53, a B-button 54, an X-button 55, and a Y-button 56) on a main surface of the housing 51. Further, the right controller 4 includes a “+” (plus) button 57 and a home button 58. Further, the right controller 4 includes a first R-button 60 and a ZR-button 61 in an upper right portion of a side surface of the housing 51. Further, similarly to the left controller 3, the right controller 4 includes a second L-button 65 and a second R-button 66.
Further, the right controller 4 includes a terminal 64 for the right controller 4 to perform wired communication with the main body apparatus 2.
The main body apparatus 2 includes a processor 81. The processor 81 is an information processing section for executing various types of information processing to be executed by the main body apparatus 2. For example, the processor 81 may be composed only of a CPU (Central Processing Unit), or may be composed of a SoC (System-on-a-chip) having a plurality of functions such as a CPU function and a GPU (Graphics Processing Unit) function. The processor 81 executes an information processing program (e.g., a game program) stored in a storage section (specifically, an internal storage medium such as a flash memory 84, an external storage medium attached to the slot 23, or the like), thereby performing the various types of information processing.
The main body apparatus 2 includes the flash memory 84 and a DRAM (Dynamic Random Access Memory) 85 as examples of internal storage media built into the main body apparatus 2. The flash memory 84 and the DRAM 85 are connected to the processor 81. The flash memory 84 is a memory mainly used to store various types of data (or programs) to be saved in the main body apparatus 2. The DRAM 85 is a memory used to temporarily store various types of data used for information processing.
The main body apparatus 2 includes a slot interface (hereinafter, abbreviated as “I/F”) 91. The slot I/F 91 is connected to the processor 81. The slot I/F 91 is connected to the slot 23, and in accordance with an instruction from the processor 81, reads and writes data from and to the predetermined type of storage medium (e.g., a dedicated memory card) attached to the slot 23.
The processor 81 appropriately reads and writes data from and to the flash memory 84, the DRAM 85, and each of the above storage media, thereby performing the above information processing.
The main body apparatus 2 includes a network communication section 82. The network communication section 82 is connected to the processor 81. The network communication section 82 communicates (specifically, through wireless communication) with an external apparatus via a network. In the exemplary embodiment, the network communication section 82 connects to a wireless LAN by a method compliant with the Wi-Fi standard, for example, and performs Internet communication or the like with an external apparatus (another main body apparatus 2). Further, the network communication section 82 can also perform short-range wireless communication (e.g., infrared light communication) with another main body apparatus 2.
The main body apparatus 2 includes a controller communication section 83. The controller communication section 83 is connected to the processor 81. The controller communication section 83 wirelessly communicates with the left controller 3 and/or the right controller 4. The communication method between the main body apparatus 2, and the left controller 3 and the right controller 4, is discretionary. In the exemplary embodiment, the controller communication section 83 performs communication compliant with the Bluetooth (registered trademark) standard with the left controller 3 and with the right controller 4.
The processor 81 is connected to the left terminal 17, the right terminal 21, and the lower terminal 27. When performing wired communication with the left controller 3, the processor 81 transmits data to the left controller 3 via the left terminal 17 and also receives operation data from the left controller 3 via the left terminal 17. Further, when performing wired communication with the right controller 4, the processor 81 transmits data to the right controller 4 via the right terminal 21 and also receives operation data from the right controller 4 via the right terminal 21. Further, when communicating with the cradle, the processor 81 transmits data to the cradle via the lower terminal 27. As described above, in the exemplary embodiment, the main body apparatus 2 can perform both wired communication and wireless communication with each of the left controller 3 and the right controller 4. Further, when the unified apparatus obtained by attaching the left controller 3 and the right controller 4 to the main body apparatus 2 or the main body apparatus 2 alone is attached to the cradle, the main body apparatus 2 can output data (e.g., image data or sound data) to the stationary monitor or the like via the cradle.
Here, the main body apparatus 2 can communicate with a plurality of left controllers 3 simultaneously (in other words, in parallel). Further, the main body apparatus 2 can communicate with a plurality of right controllers 4 simultaneously (in other words, in parallel). Thus, a plurality of users can simultaneously provide inputs to the main body apparatus 2, each using a set of the left controller 3 and the right controller 4. As an example, a first user can provide an input to the main body apparatus 2 using a first set of the left controller 3 and the right controller 4, and simultaneously, a second user can provide an input to the main body apparatus 2 using a second set of the left controller 3 and the right controller 4.
The main body apparatus 2 includes a touch panel controller 86, which is a circuit for controlling the touch panel 13. The touch panel controller 86 is connected between the touch panel 13 and the processor 81. On the basis of a signal from the touch panel 13, the touch panel controller 86 generates data indicating the position at which a touch input has been performed, for example, and outputs the data to the processor 81.
Further, the display 12 is connected to the processor 81. The processor 81 displays a generated image (e.g., an image generated by executing the above information processing) and/or an externally acquired image on the display 12.
The main body apparatus 2 includes a codec circuit 87 and speakers (specifically, a left speaker and a right speaker) 88. The codec circuit 87 is connected to the speakers 88 and a sound input/output terminal 25 and also connected to the processor 81. The codec circuit 87 is a circuit for controlling the input and output of sound data to and from the speakers 88 and the sound input/output terminal 25.
The main body apparatus 2 includes a power control section 97 and a battery 98. The power control section 97 is connected to the battery 98 and the processor 81. Further, although not shown in
Further, the battery 98 is connected to the lower terminal 27. When an external charging device (e.g., the cradle) is connected to the lower terminal 27 and power is supplied to the main body apparatus 2 via the lower terminal 27, the battery 98 is charged with the supplied power.
The left controller 3 includes a communication control section 101, which communicates with the main body apparatus 2. As shown in
Further, the left controller 3 includes a memory 102 such as a flash memory. The communication control section 101 includes, for example, a microcomputer (or a microprocessor) and executes firmware stored in the memory 102, thereby performing various processes.
The left controller 3 includes buttons 103 (specifically, the buttons 33 to 39, 43, 44, and 47). Further, the left controller 3 includes the left stick 32. Each of the buttons 103 and the left stick 32 outputs information regarding an operation performed on itself to the communication control section 101 repeatedly at appropriate timings.
The left controller 3 includes inertial sensors. Specifically, the left controller 3 includes an acceleration sensor 104. Further, the left controller 3 includes an angular velocity sensor 105. In the exemplary embodiment, the acceleration sensor 104 detects the magnitudes of accelerations along three predetermined axial directions (e.g., the x, y, z axes shown in
The communication control section 101 acquires information regarding an input (specifically, information regarding an operation or the detection result of the sensor) from each of input sections (specifically, the buttons 103, the left stick 32, and the sensors 104 and 105). The communication control section 101 transmits operation data including the acquired information (or information obtained by performing predetermined processing on the acquired information) to the main body apparatus 2. The operation data is transmitted repeatedly, once every predetermined time. The interval at which the information regarding an input is transmitted from each of the input sections to the main body apparatus 2 may or may not be the same.
The above operation data is transmitted to the main body apparatus 2, whereby the main body apparatus 2 can obtain inputs provided to the left controller 3. That is, the main body apparatus 2 can determine operations on the buttons 103 and the left stick 32 on the basis of the operation data. Further, the main body apparatus 2 can calculate information regarding the motion and/or the orientation of the left controller 3 on the basis of the operation data (specifically, the detection results of the acceleration sensor 104 and the angular velocity sensor 105).
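The flow described above — collecting information from the input sections, bundling it into operation data, and deriving orientation information from the inertial sensor readings — can be illustrated with a minimal sketch. The names and data layout below are illustrative assumptions and are not part of the embodiment:

```python
from dataclasses import dataclass


@dataclass
class OperationData:
    """One operation-data packet transmitted repeatedly to the main body
    apparatus (layout is illustrative only)."""
    buttons: dict            # button name -> pressed state
    stick: tuple             # (x, y) analog stick deflection
    acceleration: tuple      # (ax, ay, az) from the acceleration sensor
    angular_velocity: tuple  # (wx, wy, wz) from the angular velocity sensor


def integrate_yaw(yaw_deg, angular_velocity_deg_s, dt_s):
    """Estimate a new yaw angle by integrating one axis of the angular
    velocity over one transmission interval; a real implementation would
    fuse all three axes together with the accelerometer readings."""
    return yaw_deg + angular_velocity_deg_s * dt_s


# A controller turning at 90 degrees per second for half a second:
yaw = integrate_yaw(0.0, 90.0, 0.5)
```

In this way, the main body apparatus can accumulate information regarding the motion and/or the orientation of the controller from the periodically received packets.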
The left controller 3 includes a power supply section 108. In the exemplary embodiment, the power supply section 108 includes a battery and a power control circuit. Although not shown in
As shown in
The right controller 4 includes input sections similar to the input sections of the left controller 3. Specifically, the right controller 4 includes buttons 113, the right stick 52, and inertial sensors (an acceleration sensor 114 and an angular velocity sensor 115). These input sections have functions similar to those of the input sections of the left controller 3 and operate similarly to the input sections of the left controller 3.
The right controller 4 includes a power supply section 118. The power supply section 118 has a function similar to that of the power supply section 108 of the left controller 3 and operates similarly to the power supply section 108.
Next, an outline of game processing (an example of the information processing) executed in the game system 1 according to the exemplary embodiment will be described. The game assumed in the exemplary embodiment is a game in which multiple short games are played in succession, and when a short game is cleared, the next short game starts. The game ends when the player fails to clear a short game, and when playing the game again, the game can be restarted, for example, from the short game that the player failed to clear.
In this game processing, during a waiting period before the start of each short game, a model video showing the player (user) how to hold controllers is displayed, and then the short game is played. The model video is a video obtained by capturing a model object placed in a virtual space (game space) by a virtual camera, and shows that a model object imitating the player holding the left controller 3 with the left hand and the right controller 4 with the right hand makes a pose (a pose set in advance according to the short game to be executed after the model video is displayed). Then, the player makes the pose shown in the model video (the pose of holding the controllers in the same manner as the pose of the model object) and starts an operation in the short game. Hereinafter, the model video and the short game will be specifically described.
The model video 1 is started from a state where, as shown in
As described above, in the model video 1, the model object 310 moves both hands holding the controllers while rotating, and finally makes a pose viewed from the right oblique back side (a pose in which the right oblique back portion is displayed), in which the state of the arms and the hands is visible. This allows the player to intuitively understand that the model object 310 is not displayed so as to be laterally inverted as if reflected in a mirror, and also allows the player to quickly and intuitively understand a pose to be made at the start of a short game described later with reference to
When the display of the model video 1 is completed, the short game in
Then, the orientation of the telescope is changed in the virtual space on the basis of the movement of the left and right controllers corresponding to the movement of the left and right hands of the player. In
In the above, the case where the right hand of the player is set as a dominant hand in advance before the start of the short game has been described. In the case where the right hand is the dominant hand, it is generally easier to operate the telescope if the right hand is on the upper side when holding the telescope. Therefore, in the pose (first pose) of the model object 310 in
On the other hand, in the case where the left hand of the player is set as a dominant hand in advance before the start of the short game, the model video 1 in
The model video 2 is started from a state shown in
As described above, in the model video 2, the model object 310 moves both hands holding the controllers while rotating, and finally makes a pose viewed from the right oblique back side (a pose in which the oblique back portion is displayed), in which the state of the arms and the hands is visible. This allows the player to intuitively understand that the model object 310 is not displayed so as to be laterally inverted as if reflected in a mirror, and also allows the player to quickly and intuitively understand a pose to be made at the start of a short game described later with reference to
When the display of the model video 2 is completed, the short game in
The model video 3 is started from a state shown in
As described above, in the model video 3, the model object 310 moves both hands holding the controllers while rotating, and finally makes a pose viewed from directly behind (a pose in which the back portion is displayed), in which the state of the arms and the hands is visible. This allows the player to intuitively understand that the model object 310 is not displayed so as to be laterally inverted as if reflected in a mirror, and also allows the player to quickly and intuitively understand a pose to be made at the start of a short game described later with reference to
When the display of the model video 3 is completed, the short game in
Then, if, at the timing when the enemy object 355 comes directly above the player object 356, the player object 356 jumps while raising the right hand to push up the blocks above the player object 356 on the basis of the movement of the left and right controllers corresponding to the motion of the player jumping while raising the right hand (
In the above, the case where the right hand of the player is set as a dominant hand in advance before the start of the short game has been described. In the case where the right hand is a dominant hand, generally, a motion of jumping while raising the right hand can be made properly.
On the other hand, in the case where the left hand of the player is set as a dominant hand in advance before the start of the short game, the model video 3 in
The model video 4 is started from a state shown in
As described above, in the model video 4, the model object 310 moves both hands holding the controllers while facing the front (in the direction toward the virtual camera), and finally makes a pose viewed from the front side, in which the state of the arms and the hands is visible. This allows the player to quickly and intuitively understand a pose to be made at the start of the short game described later with reference to
In
When the display of the model video 4 is completed, the short game in
Next, the information processing of the exemplary embodiment will be described in detail with reference to
Various types of data used in the game processing will be described.
The game program 401 is a game program for executing the game processing.
The object data 403 is data of objects to be placed in the virtual space, such as player objects, enemy objects, items, ground, blocks, etc. The object data 403 also includes data of the coordinates (position), orientation, posture, state, etc., of each object.
The image data 408 is image data of backgrounds, virtual effects, etc.
The virtual camera control data 409 is data for controlling the motion of the virtual camera placed in the virtual space. Specifically, the virtual camera control data 409 is data that specifies the position/orientation, angle of view, imaging direction, etc., of the virtual camera.
The operation data 410 is data indicating the contents of operations performed on the left controller 3 and the right controller 4. The operation data 410 includes, for example, data indicating motions and orientation changes of the left controller 3 and the right controller 4 and input states regarding press states and the like of various buttons. The contents of the operation data 410 are updated at a predetermined cycle on the basis of signals from the left controller 3 and the right controller 4.
The transmission data 411 is data to be transmitted to other game systems 1, and is data including at least information for identifying the transmission source, and the contents of the operation data 410. The transmission data 411 includes data, regarding an own player character, to be transmitted to another game system 1 of a multiplay partner (data indicating coordinates (position), posture, state, etc.).
The reception data 412 is data stored such that transmission data received from other game systems 1 (i.e., transmission sources) can be discerned for each of the other game systems 1. The reception data 412 includes data, regarding another player character, received from another game system 1 of a multiplay partner (or a server) (data indicating coordinates (position), posture, state, etc.).
In addition, various types of data to be used in game processing are stored as necessary in the DRAM 85.
Next, the game processing according to the exemplary embodiment will be described in detail with reference to a flowchart.
First, as shown in
In step S101, the processor 81 determines whether or not the left hand has been inputted as the dominant hand in step S100. When the determination in step S101 is YES, the processing proceeds to step S102. When this determination is NO, the processing proceeds to step S103.
In step S102, the processor 81 performs setting in which display of a model video is laterally inverted and operation data acquired from the left and right controllers are swapped. Specifically, as described for the short game with reference to
The process of setting a dominant hand in steps S100 to S102 may not necessarily be executed each time the game is started as described above, but may be executed by the player performing a predetermined setting operation at any timing.
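The inversion setting of step S102 can be sketched as follows; the function name and the dictionary layout of the operation data are hypothetical and serve only to illustrate swapping the data acquired from the left and right controllers:

```python
def apply_left_handed_setting(operation_data):
    """Step S102 (sketch): swap the operation data acquired from the left
    and right controllers so that subsequent game processing evaluates a
    left-handed player's motions as if mirrored. The model video itself
    is laterally inverted separately at display time."""
    swapped = dict(operation_data)
    swapped["left"], swapped["right"] = operation_data["right"], operation_data["left"]
    return swapped


# Example: the stick deflections of the two controllers change places
data = {"left": {"stick": (0.0, 1.0)}, "right": {"stick": (1.0, 0.0)}}
swapped = apply_left_handed_setting(data)
```

With this setting in place, the evaluation logic of each short game can remain written for the right-handed case.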
In step S103, the processor 81 determines a short game to be executed. The order of the short games to be executed is determined at the start of the game processing. The order of the short games to be executed may be determined in advance. Then, the processing proceeds to step S104.
In step S104, the processor 81 displays a model video corresponding to the short game determined to be executed in step S103. For example, when the short game described with reference to
In step S105, the processor 81 executes the short game determined to be executed in step S103. Then, the processing proceeds to step S106.
In step S106, the processor 81 determines whether or not a game ending condition has been satisfied. The game ending condition is a condition that the player failed to clear the short game. The game ending condition is not limited thereto, and may be, for example, a condition that the player failed to clear short games a predetermined number of times (e.g., five times). When the determination in step S106 is YES, the game ends, and the processing ends. When this determination is NO, the processing returns to step S103, and a short game to be executed next is determined.
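The loop of steps S103 to S106 can be summarized in a short sketch; `play` and `show_model_video` below are hypothetical stand-ins for the processing of steps S105 and S104, and the structure is illustrative only:

```python
def show_model_video(game):
    # Placeholder for step S104: a real implementation would render the
    # model video corresponding to the determined short game.
    pass


def run_game(short_games, play, max_failures=1):
    """Sketch of steps S103 to S106: determine the next short game,
    display its model video, execute it, and end the game when the
    ending condition (a failure to clear) is satisfied."""
    cleared = []
    failures = 0
    index = 0
    while index < len(short_games):
        game = short_games[index]  # S103: determine short game to execute
        show_model_video(game)     # S104: display corresponding model video
        if play(game):             # S105: execute the short game
            cleared.append(game)
            index += 1             # proceed to the next short game
        else:
            failures += 1
            if failures >= max_failures:
                break              # S106: game ending condition satisfied
            # otherwise restart from the short game that was not cleared
    return cleared, failures


# Example: the second short game is failed, which ends the game
cleared, failures = run_game(["telescope", "jump"], play=lambda g: g == "telescope")
# cleared == ["telescope"], failures == 1
```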
As described above, according to the exemplary embodiment, before the start of the short game, the animated model video in which the model object imitating the player holding the controllers makes a pose is displayed (see
In the above exemplary embodiment, the model object 310 appears to rotate in the model video by rotating the model object 310 in the virtual space (see
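The equivalence noted here — rotating the virtual camera instead of the model object — can be illustrated with a minimal sketch; the coordinate convention (model object at the origin, camera orbiting in the horizontal plane while aimed at the model) is an assumption for illustration:

```python
import math


def orbit_camera(radius, angle_deg):
    """Place the virtual camera on a horizontal circle around the model
    object at the origin. Sweeping angle_deg from 0 to 180 shows the
    model from the front around to the back, producing the same apparent
    rotation as turning the model object itself the opposite way."""
    a = math.radians(angle_deg)
    return (radius * math.sin(a), 0.0, radius * math.cos(a))


front = orbit_camera(5.0, 0.0)   # camera directly in front of the model
back = orbit_camera(5.0, 180.0)  # camera directly behind the model
```

Because only the relative orientation of the camera and the model determines the displayed image, either implementation yields the same model video.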
In the above exemplary embodiment, the model video is a three-dimensional video obtained by capturing the virtual space by the virtual camera (see
In the above exemplary embodiment, the movement of the hands of the player is detected by the inertial sensors of the controllers. However, the present disclosure is not limited thereto, and for example, an image of the player may be taken from the front side, and the movement of the hands of the player may be detected on the basis of the taken image of the player.
In the above exemplary embodiment, the evaluation to determine success or failure in the short game is performed on the basis of the operation data acquired from the controllers. However, the present disclosure is not limited thereto, and an evaluation to assign a score in the short game may be performed on the basis of the operation data acquired from the controllers.
In the exemplary embodiment, a case in which a series of processes regarding the game processing are executed in a single game apparatus has been described. In another exemplary embodiment, the series of processes may be executed in an information processing system including a plurality of information processing apparatuses. For example, in an information processing system including a terminal-side apparatus and a server-side apparatus communicable with the terminal-side apparatus via a network, some of the series of processes above may be executed by the server-side apparatus. Further, in an information processing system including a terminal-side apparatus and a server-side apparatus communicable with the terminal-side apparatus via a network, major processes among the series of processes above may be executed by the server-side apparatus, and some of the processes may be executed in the terminal-side apparatus. Further, in the above information processing system, the system on the server side may be implemented by a plurality of information processing apparatuses, and processes that should be executed on the server side may be shared and executed by a plurality of information processing apparatuses. Further, a configuration of a so-called cloud gaming may be adopted. For example, a configuration may be adopted in which: the game apparatus sends operation data indicating operations performed by the user to a predetermined server; various game processes are executed in the server; and the execution result is streaming-distributed as a moving image/sound to the game apparatus.
While the exemplary embodiment and the modifications have been described, the description thereof is in all aspects illustrative and not restrictive. It is to be understood that various other modifications and variations may be made to the exemplary embodiment and the modifications.