The disclosure of Japanese Patent Application No. 2010-41530 is incorporated herein by reference.
The technology herein relates to a storage medium storing an object controlling program, an object controlling apparatus and an object controlling method. More specifically, certain example embodiments relate to a storage medium storing an object controlling program, an object controlling apparatus and an object controlling method which cause an object to be generated and moved according to an operation by a player.
One example of the related art is disclosed in Japanese Patent Application Laid-Open No. 2007-34634 [G06F 3/14, A63F 13/00, G06T 13/00] (Document 1) laid-open on Feb. 8, 2007. In the object controlling apparatus of Document 1, when a player makes a touch-on operation, a bullet object is generated at the touched position, and when the stick is slid in a downward direction of the screen with the screen kept touched, a tail fluke of the bullet object is generated. Thereafter, when the player makes a touch-off operation, the bullet object is moved in an upward direction of the screen. When the bullet object collides with an enemy character, if the offensive power of the bullet object exceeds the stamina of the enemy character, the enemy character disappears.
However, in the object controlling apparatus disclosed in Document 1, the bullet object can be made to appear at an arbitrary touched position, but the shape of the bullet object is fixed to that of a vertically long rod. Furthermore, the bullet object basically just moves from the lower part of the screen toward the upper part. That is, the bullet object merely moves linearly. Thus, the movement of the bullet object is easy to predict, making the game simple. As a result, the player soon tires of the game.
Therefore, certain example embodiments provide a novel storage medium storing an object controlling program, a novel object controlling apparatus and a novel object controlling method.
Furthermore, certain example embodiments provide a storage medium storing an object controlling program, an object controlling apparatus and an object controlling method which are able to increase interest in a game.
In certain example embodiments a storage medium storing an object controlling program is provided, and the object controlling program causes a computer to function as an input detector, an object generator, and an object mover. The input detector detects an operation input by a player. The object generator generates a first object according to the operation inputs detected by the input detector. The object mover moves the first object generated by the object generator according to a moving course based on the operation inputs detected by the input detector.
According to some embodiments, the object is generated according to the operation inputs by the player, and the object moves according to the moving course based on the operation inputs. Therefore, it becomes possible to generate an object with a complex shape by a simple operation, and moreover to move the object according to a complex moving course. That is, the player has to generate the object with a strategy in mind, which can increase interest in the game.
A second aspect is according to the first aspect, and further includes a moving direction decider which decides a moving direction in which the first object moves, based on a locus indicated by the operation inputs detected by the input detector when the first object is generated by the object generator. The object mover moves the first object according to a moving course determined by utilizing the moving direction decided by the moving direction decider.
According to certain example embodiments, the first object is generated according to the locus of the operation inputs by the player, and the moving course is determined by utilizing the moving direction decided based on that locus, so that the first object has to be drawn in view of the moving course. Thus, the movement of the object as well as the drawing of the object is taken into consideration, and the game does not easily become simple, which can increase interest.
A third aspect is according to the first aspect, and the operation input by the player is coordinate data indicating coordinates on a display surface of a displayer. The object generator generates the first object by connecting the coordinates indicated by a plurality of coordinate data in chronological order with line segments. That is, the player generates the object as a line drawing.
According to certain example embodiments, an object can be generated merely by drawing a line, so that even an object with a complex shape can be generated with a simple operation.
A fourth aspect is according to the third aspect, and the object generator changes the thickness of each of the line segments depending on a distance between temporally continuous coordinates. For example, in a case that the distance between temporally continuous coordinates is short, the thickness of the line segment is increased, whereas in a case that the distance between the temporally continuous coordinates is long, the thickness of the line segment is reduced. Here, the thickness may be set conversely.
According to certain example embodiments, the thickness of the line segments making up the object is changed according to the manner of operation by the player, so that it is possible to generate an object having a complex shape with a simple operation.
A fifth aspect is according to the third aspect, and the object controlling program causes the computer to further function as a copy mover which, when a situation in which the distance between the temporally continuous coordinates is less than a constant distance continues, moves a copy of the first object generated so far by the object generator according to a moving course based on the operation inputs detected so far by the input detector.
According to certain example embodiments, in the course of generating the object, it is possible to know the moving course of the object at that point, and therefore, it is easy to decide whether the generation of the object should be continued or ended.
A sixth aspect is according to the first aspect, and the object mover changes a moving velocity of the first object when a predetermined condition is satisfied. For example, in a case that a specific operation is executed, or in a case that a specific event occurs, the moving velocity of the first object is made higher or lower.
According to certain example embodiments, the moving velocity of the object is changed, so that the moving course also changes depending on the moving velocity. That is, it is hard to predict the moving course of the object, which keeps the game from becoming simple or monotonous.
A seventh aspect is according to the sixth aspect, and the operation input by the player is coordinate data indicating coordinates on a display surface of a displayer. The object controlling program causes the computer to further function as a condition judger. The condition judger judges whether or not a predetermined condition is satisfied on the basis of a change of the coordinates indicated by a plurality of coordinate data.
According to certain example embodiments, whether or not the predetermined condition is satisfied is determined according to a change of the coordinates by the operation inputs by the player, and therefore, it is possible to determine whether or not the predetermined condition is satisfied on the basis of the operation input when the first object is drawn, for example.
An eighth aspect is according to the first aspect, and a second object different from the first object exists in a virtual space. An impacter gives an impact to at least the second object when the first object moved by the object mover hits the second object.
According to certain example embodiments, the second object is attacked by the first object, and therefore, the first object has to be drawn with a strategy in mind, making the virtual game varied and interesting.
A ninth aspect is an object controlling apparatus and comprises an input detector which detects an operation input by a player, an object generator which generates an object according to the operation inputs detected by the input detector, and an object mover which moves the object generated by the object generator according to a moving course based on the operation inputs detected by the input detector.
According to the ninth aspect as well, similar to the first aspect, it is possible to increase interest in the game.
A tenth aspect is an object controlling method, and includes the following steps of (a) detecting an operation input by a player, (b) generating an object according to the operation inputs detected by the step (a), and (c) moving the object generated by the step (b) according to a moving course based on the operation inputs detected by the step (a).
According to the tenth aspect as well, similar to the first aspect, it is possible to increase interest in the game.
The above described objects and other objects, features, aspects and advantages of certain example embodiments will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings.
Referring to
Generally, the user uses the game apparatus 10 in the open state. Furthermore, the user keeps the game apparatus 10 in the closed state when not using the game apparatus 10. Here, in addition to the aforementioned closed state and open state, the game apparatus 10 can maintain the opening and closing angle formed between the upper housing 12 and the lower housing 14 at an arbitrary angle between the closed state and the open state by a friction force, etc. exerted at the connected portion. That is, the upper housing 12 can be fixed with respect to the lower housing 14 at an arbitrary angle.
Additionally, the game apparatus 10 is equipped with cameras (32, 34) described later and functions as an imaging device, for example, imaging an image with the camera (32, 34), displaying the imaged image on the screen, and saving the imaged image data.
As shown in
In addition, although an LCD is utilized as a display in this embodiment, an EL (Electroluminescence) display, a plasma display, etc. may be used in place of the LCD. Furthermore, the game apparatus 10 can utilize a display with an arbitrary resolution.
As shown in
The direction input button (cross key) 20a functions as a digital joystick, and is used for instructing a moving direction of a player object, moving a cursor, and so forth. Each of the operation buttons 20b-20e is a push button, and is used for causing the player object to make an arbitrary action, executing a decision or cancellation, and so forth. The power button 20f is a push button, and is used for turning on or off the main power supply of the game apparatus 10. The start button 20g is a push button, and is used for temporarily stopping (pausing) or starting (restarting) a game, and so forth. The select button 20h is a push button, and is used for a game mode selection, a menu selection, etc.
Although operation buttons 20i-20k are omitted in
The L button 20i and the R button 20j are push buttons, and can be used for operations similar to those of the operation buttons 20b-20e, or as subsidiary operations of these operation buttons 20b-20e. Furthermore, in this embodiment, the L button 20i and the R button 20j can also be used for an imaging instruction operation (shutter operation). The volume button 20k is made up of two push buttons, and is utilized for adjusting the volume of the sound output from two speakers (right speaker and left speaker) not shown. In this embodiment, the volume button 20k is provided with an operating portion including two push portions, and the aforementioned push buttons are provided so as to correspond to the respective push portions. Thus, when one push portion is pushed, the volume is turned up, and when the other push portion is pushed, the volume is turned down. For example, when a push portion is held down, the volume is gradually turned up or gradually turned down.
Returning to
Additionally, at the right side surface of the lower housing 14, a loading slot (represented by a dashed line shown in
Moreover, on the right side surface of the lower housing 14, a loading slot for housing a memory card 26 (represented by a chain double-dashed line in
In addition, on the upper side surface of the lower housing 14, a loading slot (represented by an alternate long and short dash line
At the left end of the connected portion (hinge) between the upper housing 12 and the lower housing 14, an indicator 30 is provided. The indicator 30 is made up of three LEDs 30a, 30b, 30c. Here, the game apparatus 10 can make a wireless communication with another appliance, and the first LED 30a lights up when a wireless communication with the appliance is established. The second LED 30b lights up while the game apparatus 10 is recharged. The third LED 30c lights up when the main power supply of the game apparatus 10 is turned on. Thus, by the indicator 30 (LEDs 30a-30c), it is possible to inform the user of a communication-established state, a charge state, and a main power supply on/off state of the game apparatus 10.
As described above, the upper housing 12 is provided with the first LCD 16. In this embodiment, the touch panel 22 is set so as to cover the second LCD 18, but the touch panel 22 may be set so as to cover the first LCD 16. Alternatively, two touch panels 22 may be set so as to cover the first LCD 16 and the second LCD 18. For example, on the second LCD 18, an operation explanatory screen for teaching the user how the respective operation buttons 20a-20k and the touch panel 22 work or how to operate them, and a game screen are displayed.
Additionally, the upper housing 12 is provided with the two cameras (inward camera 32 and outward camera 34). As shown in
Accordingly, the inward camera 32 can image the direction to which the inner surface of the upper housing 12 is turned, and the outward camera 34 can image the direction opposite to the imaging direction of the inward camera 32, that is, the direction to which the outer surface of the upper housing 12 is turned. Thus, in this embodiment, the two cameras 32, 34 are provided such that the imaging directions of the inward camera 32 and the outward camera 34 are opposite to each other. For example, the user holding the game apparatus 10 can image the scene on the user's side (including the user, for example) as seen from the game apparatus 10 with the inward camera 32, and can image the scene on the side opposite to the user (a landscape, another user, etc.) as seen from the game apparatus 10 with the outward camera 34.
Additionally, on the internal surface near the aforementioned connected portion, a microphone 84 (see
Furthermore, on the outer surface of the upper housing 12, in the vicinity of the outward camera 34, a fourth LED 38 (dashed line in
Moreover, the upper housing 12 is formed with a sound release hole 40 on each side of the first LCD 16. The above-described speakers are housed at positions corresponding to the sound release holes 40 inside the upper housing 12. The sound release holes 40 are through holes for releasing the sound from the speakers to the outside of the game apparatus 10.
As described above, the upper housing 12 is provided with the inward camera 32 and the outward camera 34 which are constituted to image an image and the first LCD 16 as a displayer for mainly displaying the imaged image and a game screen. On the other hand, the lower housing 14 is provided with the input device (operation button 20 (20a-20k) and the touch panel 22) for performing an operation input to the game apparatus 10 and the second LCD 18 as a displayer for mainly displaying an operation explanatory screen and a game screen. Accordingly, the game apparatus 10 has two screens (16, 18) and two kinds of operating portions (20, 22).
The CPU 50 is an information processing means for executing a predetermined program. In this embodiment, the predetermined program is stored in a memory (the memory for saved data 56, for example) within the game apparatus 10 or in the memory card 26 and/or 28, and the CPU 50 executes information processing described later by executing the predetermined program.
Here, the program to be executed by the CPU 50 may be stored in advance in the memory within the game apparatus 10, acquired from the memory card 26 and/or 28, or acquired from another appliance by communicating with that appliance.
The CPU 50 is connected with the main memory 52, the memory controlling circuit 54, and the memory for preset data 58. The memory controlling circuit 54 is connected with the memory for saved data 56. The main memory 52 is a memory means to be utilized as a work area and a buffer area of the CPU 50. That is, the main memory 52 stores (temporarily stores) various data to be utilized in the aforementioned information processing, and stores a program from the outside (memory cards 26 and 28, and another appliance). In this embodiment, as a main memory 52, a PSRAM (Pseudo-SRAM) is used, for example. The memory for saved data 56 is a memory means for storing (saving) a program to be executed by the CPU 50, data of an image imaged by the inward camera 32 and the outward camera 34, etc. The memory for saved data 56 is constructed by a nonvolatile storage medium, and can utilize a NAND type flash memory, for example. The memory controlling circuit 54 controls reading and writing from and to the memory for saved data 56 according to an instruction from the CPU 50. The memory for preset data 58 is a memory means for storing data (preset data), such as various parameters, etc. which are previously set in the game apparatus 10. As a memory for preset data 58, a flash memory to be connected to the CPU 50 through an SPI (Serial Peripheral Interface) bus can be used.
Both of the memory card I/Fs 60 and 62 are connected to the CPU 50. The memory card I/F 60 performs reading and writing of data from and to the memory card 26 attached to the connector according to an instruction from the CPU 50. Furthermore, the memory card I/F 62 performs reading and writing of data from and to the memory card 28 attached to the connector according to an instruction from the CPU 50. In this embodiment, image data corresponding to the images imaged by the inward camera 32 and the outward camera 34 and image data received from other devices are written to the memory card 26, and the image data stored in the memory card 26 is read from the memory card 26 and stored in the memory for saved data 56, or sent to other devices. Furthermore, the various programs stored in the memory card 28 are read by the CPU 50 so as to be executed.
Here, the information processing program such as a game program is not only supplied to the game apparatus 10 through an external storage medium, such as the memory card 28, etc., but may also be supplied to the game apparatus 10 through a wired or wireless communication line. In addition, the information processing program may be recorded in advance in a nonvolatile storage device inside the game apparatus 10. Additionally, as an information storage medium for storing the information processing program, an optical disk storage medium, such as a CD-ROM, a DVD or the like, may be used in addition to the aforementioned nonvolatile storage device.
The wireless communication module 64 has a function of connecting to a wireless LAN according to an IEEE 802.11b/g standard-based system, for example. The local communication module 66 has a function of performing a wireless communication with game apparatuses of the same type by a predetermined communication system. The wireless communication module 64 and the local communication module 66 are connected to the CPU 50. The CPU 50 can send and receive data over the Internet to and from other appliances by means of the wireless communication module 64, and can send and receive data to and from other game apparatuses of the same type by means of the local communication module 66.
Furthermore, the CPU 50 is connected with the RTC 68 and the power supply circuit 70. The RTC 68 counts a time to output the same to the CPU 50. For example, the CPU 50 can calculate a date and a current time, etc. on the basis of the time counted by the RTC 68. The power supply circuit 70 controls power supplied from the power supply (typically, a battery accommodated in the lower housing 14) included in the game apparatus 10, and supplies the power to the respective circuit components within the game apparatus 10.
Also, the game apparatus 10 includes the microphone 84 and an amplifier 86. Both of the microphone 84 and the amplifier 86 are connected to the I/F circuit 72. The microphone 84 detects a voice or a sound (a clap, handclap, etc.) produced by the user toward the game apparatus 10, and outputs a sound signal indicating the voice or the sound to the I/F circuit 72. The amplifier 86 amplifies the sound signal applied from the I/F circuit 72, and applies the amplified signal to the speaker (not illustrated). The I/F circuit 72 is connected to the CPU 50.
The touch panel 22 is connected to the I/F circuit 72. The I/F circuit 72 includes a sound controlling circuit for controlling the microphone 84 and the amplifier 86 (speaker), and a touch panel controlling circuit for controlling the touch panel 22. The sound controlling circuit performs an A/D conversion and a D/A conversion on a sound signal, or converts a sound signal into sound data in a predetermined format. The touch panel controlling circuit generates touch position data in a predetermined format on the basis of a signal from the touch panel 22 and outputs the same to the CPU 50. For example, touch position data is data indicating coordinates of a position where an input is performed on an input surface of the touch panel 22.
Additionally, the touch panel controlling circuit performs reading of a signal from the touch panel 22 and generation of the touch position data every predetermined time period. By fetching the touch position data via the I/F circuit 72, the CPU 50 can know the position on the touch panel 22 where the input is made.
The operation button 20 is made up of the aforementioned respective operation buttons 20a-20k, and is connected to the CPU 50. Operation data indicating an input state (whether or not the button is pushed) with respect to each of the operation buttons 20a-20k is output from the operation button 20 to the CPU 50. The CPU 50 acquires the operation data from the operation button 20, and executes processing according to the acquired operation data.
Both of the inward camera 32 and the outward camera 34 are connected to the CPU 50. The inward camera 32 and the outward camera 34 image images according to an instruction from the CPU 50, and output image data corresponding to the imaged images to the CPU 50. In this embodiment, the CPU 50 issues an imaging instruction to either one of the inward camera 32 and the outward camera 34, and the camera (32, 34) that has received the imaging instruction images an image and sends the image data to the CPU 50.
The first GPU 74 is connected with the first VRAM 78, and the second GPU 76 is connected with the second VRAM 80. The first GPU 74 generates a first display image on the basis of data for generating the display image stored in the main memory 52 according to an instruction from the CPU 50, and draws the same in the first VRAM 78. The second GPU 76 similarly generates a second display image according to an instruction from the CPU 50, and draws the same in the second VRAM 80. The first VRAM 78 and the second VRAM 80 are connected to the LCD controller 82.
The LCD controller 82 includes a register 82a. The register 82a stores a value of “0” or “1” according to an instruction from the CPU 50. In a case that the value of the register 82a is “0”, the LCD controller 82 outputs the first display image drawn in the first VRAM 78 to the second LCD 18, and outputs the second display image drawn in the second VRAM 80 to the first LCD 16. Furthermore, in a case that the value of the register 82a is “1”, the LCD controller 82 outputs the first display image drawn in the first VRAM 78 to the first LCD 16, and outputs the second display image drawn in the second VRAM 80 to the second LCD 18.
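In other words, the register 82a simply selects whether the two VRAM images are routed straight through or swapped between the two LCDs. The following Python fragment is only a hedged illustration of that selection logic (the routing itself is performed by the LCD controller hardware; the function and parameter names are assumptions).

```python
def route_display_images(register_82a, first_vram_image, second_vram_image):
    """Return (image for the first LCD 16, image for the second LCD 18).
    A register value of 0 swaps the two display images; 1 passes them straight through."""
    if register_82a == 0:
        return second_vram_image, first_vram_image
    return first_vram_image, second_vram_image
```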
Here, on the game screen 100 shown in
On the upper screen 110, an enemy object 112 which imitates a cloud and an enemy object 114 are displayed. On the lower screen 120, an object (drawn object) 122 which is drawn (generated) by the player by utilizing the touch pen 24 is displayed.
Although it is difficult to comprehend in the drawing, in the virtual game of this embodiment, the drawn object 122 generated according to a touch operation by the player is moved according to a moving course based on the locus of the touch operation. When the drawn object 122 hits the enemy object 112, 114, it can inflict damage (offensive power) on the enemy object 112, 114. Furthermore, when the damage given by the drawn object 122 exceeds the life (stamina) of the enemy object 112, 114, the enemy object 112, 114 is made to disappear. Although illustration is omitted, in this embodiment, a hit judging polygon having a shape (size) the same or approximately the same as the drawn object 122 is set. Here, the hit judging polygon is colorless and transparent.
Furthermore, in a case that the drawn object 122 and the enemy object 112, 114 hit with each other for the first time, only when the hit portion is a leading end portion (portion encircled by the dotted line in
It should be noted that in this embodiment, the enemy object 112, 114 is damaged only when the drawn object 122 hits the enemy object 112, 114 with its leading end portion, but this is not a restriction. The enemy object 112, 114 may be damaged irrespective of which portion of the drawn object 122 hits the enemy object 112, 114. For example, these rules may be used selectively depending on the game level.
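As a hedged illustration of the hit rule described above, a check might look like the following Python sketch; the helper name and the damage scale are assumptions, since this embodiment only states that a bolder leading end gives more damage.

```python
def apply_first_hit(enemy_stamina, hit_is_leading_end, leading_end_thickness):
    """On the first hit between the drawn object and an enemy object, apply
    damage only when the leading end portion is the part that hit; the damage
    grows with the thickness of the leading end.  Returns the remaining
    stamina and whether the enemy object disappears."""
    if not hit_is_leading_end:
        return enemy_stamina, False        # no damage from other portions
    damage = leading_end_thickness         # assumed: damage proportional to thickness
    remaining = enemy_stamina - damage
    return remaining, damage > enemy_stamina
```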
For example, when the player succeeds in causing all the enemy objects 112, 114 to disappear within a time limit (or within a limited number of attacks), the game is cleared. On the other hand, when the time limit (or the limited number of attacks) is exceeded before all the enemy objects 112, 114 are caused to disappear, the game is over.
Here, an explanation is made on the generating processing and the movement control of the drawn object 122. For example, when the player starts a touch operation by utilizing the touch pen 24, that is, when a touch-off state switches to a touch-on state, sampling of the coordinate data detected via the touch panel 22 is started. Here, in order to ignore a mere unintentional movement of the game apparatus 10 or the touch pen 24, in a case that the distance d1 from the coordinates indicated by the previously detected coordinate data is less than a constant distance L1 (3 dots, for example), the current coordinate data is not sampled.
The sampling continues until the end of the touch operation, that is, until the touch-on state switches to the touch-off state. Furthermore, in parallel with the sampling of the coordinate data, the drawn object 122 is generated (drawn). More specifically, two temporally continuous coordinates are connected with a line (line segment). Here, the thickness of the line (line segment) is variably set depending on the distance d1 between the two temporally continuous coordinates. The distance d between two coordinates is calculated according to Equation 1, where one of the coordinates is regarded as (x1, y1) and the other is regarded as (x2, y2). The same applies wherever the distance d between coordinates is calculated below.
d = √((x1 − x2)² + (y1 − y2)²)   [Equation 1]
In this embodiment, in a case that the distance d1 is short (is equal to or more than 3 dots and less than 8 dots), the thickness of the line is set to bold (5 dots). Furthermore, in a case that the distance d1 is normal (middle) (is equal to or more than 8 dots and less than 12 dots), the thickness of the line is set to normal (3 dots). Moreover, in a case that the distance d1 is long (is equal to or more than 12 dots), the thickness of the line is set to fine (1 dot). Moreover, depending on the thickness of the line, the size of the damage (attack) applied to the enemy object 112, 114 is made different. In this embodiment, the bolder the line, the more the damage given to the enemy object 112, 114 is.
In this embodiment, the shorter the distance d1 is, the bolder the line is and the more damage it gives, and vice versa. That is, the longer the distance d1 is, the finer the line is and the less damage it gives.
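As a rough illustration of Equation 1 and the thickness table of this embodiment, the following Python sketch computes the distance d1 between two samples, discards samples closer than the constant distance L1, and picks a line thickness; the function names are hypothetical and the code is only an illustrative assumption, not the actual implementation.

```python
import math

L1 = 3  # dots; samples closer than this to the previous sample are not sampled

def distance(p1, p2):
    """Equation 1: distance d between coordinates (x1, y1) and (x2, y2)."""
    (x1, y1), (x2, y2) = p1, p2
    return math.sqrt((x1 - x2) ** 2 + (y1 - y2) ** 2)

def sample(points, new_point):
    """Append new_point only if it is at least L1 dots away from the last sample."""
    if not points or distance(points[-1], new_point) >= L1:
        points.append(new_point)
    return points

def line_thickness(d1):
    """Thickness-of-line table: the shorter d1, the bolder the line."""
    if d1 < 8:        # short (3 or more and less than 8 dots): bold
        return 5
    elif d1 < 12:     # normal (8 or more and less than 12 dots): normal
        return 3
    else:             # long (12 dots or more): fine
        return 1
```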
In this manner, the sampling of the coordinate data and the generation of the drawn object 122 are performed until the touch operation ends. Here, when the total number of sampled coordinate data reaches a maximum value (200, for example), the generation of the drawn object 122 is forcibly ended.
After completion of generating the drawn object 122, the drawn object 122 starts to move. In this embodiment, the drawn object 122 moves according to a moving course based on its own locus. Here, if the moving course were decided by utilizing all the coordinates making up the drawn object 122, an enormous amount of processing would be required, and therefore, in this embodiment, the moving course is decided by utilizing only a part of the coordinates. Thus, the coordinates are searched in order from the drawing start position (search reference), and the coordinates for which the distance d2 from the immediately preceding coordinates is equal to or less than a constant distance L2 (10 dots, for example) are removed. That is, only the coordinates for which the distance d2 from the immediately preceding coordinates exceeds the constant distance L2 are extracted. Then, on the basis of the plurality of extracted coordinates, vectors (direction vectors) for deciding the direction in which the drawn object 122 is moved are calculated. Here, in this embodiment, the enemy object 112, 114 existing on the upper screen 110 is attacked by the drawn object 122 drawn on the lower screen 120, so that the direction vectors of the drawn object 122 are decided in order from the trailing end of the drawn object 122 so that the moving course traces the locus of the drawn object 122.
More specifically, in a case that n coordinates are extracted from the coordinates at the drawing start position to the coordinates at the drawing end position, a direction vector Vh is calculated according to Equation 2.
Vh = coordinates(n−2) − coordinates(n)   [Equation 2]
Here, the direction vector Vh having the n-th coordinates as a starting point and the (n−2)-th coordinates as an end point is calculated every time n is decremented by 1. Accordingly, for example, as to the drawn object 122 shown in
Here, because only its direction is used, the direction vector Vh may be represented by a unit vector having a magnitude of 1 by dividing each vector by its own magnitude.
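The thinning of the locus with the constant distance L2 and the derivation of the direction vectors of Equation 2 could be sketched as follows. This is a minimal Python sketch reusing the distance() helper above; the function names are hypothetical, and 0-based indexing is assumed in place of the 1-based numbering of the description.

```python
L2 = 10  # dots; coordinates within this distance of the previously kept ones are removed

def extract_course_points(points):
    """Keep only the coordinates whose distance d2 from the previously kept
    coordinates exceeds the constant distance L2."""
    kept = [points[0]]
    for p in points[1:]:
        if distance(kept[-1], p) > L2:
            kept.append(p)
    return kept

def direction_vectors(points):
    """Equation 2: Vh = coordinates(n-2) - coordinates(n), decided in order
    from the trailing end while n is decremented by 1, then reduced to unit
    vectors because only the direction is used."""
    vectors = []
    n = len(points) - 1                                  # trailing end (0-based index)
    while n >= 2:
        (xs, ys), (xe, ye) = points[n], points[n - 2]    # start: n-th, end: (n-2)-th
        vx, vy = xe - xs, ye - ys
        mag = math.hypot(vx, vy) or 1.0
        vectors.append((vx / mag, vy / mag))
        n -= 1
    return vectors
```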
In this embodiment, the drawn object 122 moves for a predetermined number of frames (3 frames in this embodiment) in the direction represented by the decided direction vector Vh at a constant moving velocity A (see Equation 3). Here, a frame corresponds to the screen updating rate (1/60 seconds). In a case that the drawn object 122 goes out of the game screen 100 (first LCD 16 and second LCD 18) during movement, the movement of the drawn object 122 is ended. However, as long as even a part of the drawn object 122 is displayed on the game screen 100, the drawn object 122 continues to move, and after completing movement according to the last direction vector Vh, the drawn object 122 continues to move in the direction of the last direction vector Vh until it is out of the game screen 100.
moving velocity A = reference velocity A0 + flipping velocity α   [Equation 3]
Here, the reference velocity A0 is a velocity set in advance by a programmer or a developer, and is set to an empirically obtained numerical value. Furthermore, the flipping velocity α is given according to Equation 4.
flipping velocity α = maximum distance dmax × constant C   [Equation 4]
Here, the constant C is set to a numerical value empirically obtained by examinations, etc.
The flipping velocity α is explained here. In this embodiment, as described above, the drawn object 122 is generated, and when generating (drawing) of the drawn object 122 is ended, the player can make the touch-off operation so as to flip the touch panel 22 with the touch pen 24, to thereby increase the moving velocity A.
Accordingly, in this embodiment, when a touch operation is ended, it is detected whether or not a flipping operation was made. If a flipping operation was made, a flipping velocity α corresponding thereto is calculated according to Equation 4 and reflected in the moving velocity A. Here, if it is determined that no flipping operation was made, the flipping velocity α is set to 0.
More specifically, the coordinates indicated by the sampled coordinate data are examined in reverse chronological order from the trailing end, and the distance d3 from the immediately preceding coordinates is sequentially calculated. Furthermore, it is determined whether or not the distance d3 is equal to or less than a constant distance L3 (3 dots), and the coordinates (point) at which the distance d3 becomes equal to or less than the constant distance L3 are regarded as the starting position of the flipping operation. Then, the maximum of the distances d3 (maximum distance dmax) between the starting position of the flipping operation and the trailing end of the drawn object 122 (drawing end position) is extracted to thereby calculate the flipping velocity α according to Equation 4. Furthermore, at this time, the coordinate data corresponding to the coordinates from the starting position of the flipping operation to the end thereof are excluded. This is because the flipping operation is considered an operation separate from the operation for drawing (generating) the drawn object 122. Accordingly, the flipping direction does not have an effect on the moving direction of the drawn object 122.
Furthermore, when the flipping velocity α is added by the flipping operation, the distance by which the drawn object 122 moves during one frame becomes longer. Thus, in a case that a flipping operation is made, the drawn object 122 moves according to a moving course a little longer than the moving course when no flipping operation is made.
Although detailed explanation is omitted, the flipping operation is executed when drawing of the drawn object 122 is ended, and therefore, the presence or absence of the flipping operation is detected by using the coordinates from the trailing end back to a predetermined number of samples (20, for example) temporally backward out of the sampled coordinates. In a case that a flipping operation is made, the flipping velocity α is calculated.
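The backward search for the flipping operation and Equations 3 and 4 might look roughly like the following Python sketch; the reference velocity A0 and the constant C are shown as placeholder values (they are empirically determined in the embodiment), and the function names and exact loop bounds are assumptions.

```python
L3 = 3             # dots; a gap this small marks the start of the flipping operation
SEARCH_DEPTH = 20  # only this many samples back from the trailing end are examined
A0 = 4.0           # reference velocity (placeholder; empirically set)
C = 0.1            # constant of Equation 4 (placeholder; empirically set)

def flipping_velocity(points):
    """Scan the sampled coordinates in reverse chronological order.  When the
    distance d3 to the immediately preceding coordinates falls to L3 or less,
    treat that point as the start of the flipping operation, take the maximum
    distance dmax up to the trailing end, and apply Equation 4.  The coordinates
    of the flipping operation are excluded from the object data.
    Returns (flipping velocity alpha, remaining points)."""
    distances = []
    p = len(points) - 1
    limit = max(1, len(points) - 1 - SEARCH_DEPTH)
    while p >= limit:
        d3 = distance(points[p - 1], points[p])
        distances.append(d3)
        if d3 <= L3:
            dmax = max(distances)          # maximum distance dmax
            return dmax * C, points[:p]    # Equation 4; flip coordinates removed
        p -= 1
    return 0.0, points                     # no flipping operation detected

def moving_velocity(alpha):
    """Equation 3: moving velocity A = reference velocity A0 + flipping velocity alpha."""
    return A0 + alpha
```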
For example, as shown in
Here, the block object 116 may or may not be made to disappear in response to being hit by the drawn object 122. In either case, as described above, if all the enemy objects 112, 114 are not defeated within the time limit (or within the limited number of attacks), the game is not cleared. Although illustration is omitted, the enemy object 112, 114 may move within the game screen 100 (game space) under the control of the computer (CPU 50).
The main processing program 522a is a program for processing a main routine of the virtual game of this embodiment. The image generating program 522b is a program for generating game image data to display a game screen (100, etc.) on the first LCD 16 and the second LCD 18 by utilizing image data 524b and object data 524d. The image displaying program 522c is a program for displaying the game image data generated according to the image generating program 522b on the first LCD 16 and the second LCD 18 as a game screen (100, etc.).
The object drawing program 522d generates image data of the drawn object 122 according to the object data 524d generated on the basis of the coordinate data that is input in response to a touch operation by the player. That is, every time coordinate data is sampled, the object drawing program 522d connects the coordinates indicated by the current coordinate data and the coordinates indicated by the immediately preceding coordinate data with a line segment having a thickness depending on the distance d1 between the two coordinates.
The flipping velocity calculating program 522e is a program for detecting the presence or absence of a flipping operation by utilizing a predetermined number of coordinate data counted back from the last coordinate data of the object data 524d, and for calculating, on the basis of the maximum distance dmax, a flipping velocity α if a flipping operation is made.
The object moving program 522f is a program for moving the drawn object 122 on the game screen 100 (game space). More specifically, the object moving program 522f extracts, with reference to the object data 524d, only the coordinates for which the distance d2 from the immediately preceding coordinates exceeds the constant distance L2 (10 dots) from all the coordinates from the drawing start position to the drawing end position, and calculates the direction vectors Vh (object moving direction data 524f) as described above. Furthermore, the object moving program 522f calculates the moving velocity A (object moving velocity data 524g) of the drawn object 122 according to Equation 3 by utilizing the flipping velocity α calculated according to the flipping velocity calculating program 522e. Then, the object moving program 522f moves the drawn object 122 according to the calculated direction vectors Vh and the calculated moving velocity A.
The hit judging program 522g determines a hit between the drawn object 122 and the enemy object 112, 114 or the block object 116. Here, as described above, only when the leading end portion of the drawn object 122 first hits the enemy object 112, 114 can the enemy object 112, 114 be damaged.
Although illustration is omitted, the program memory area 522 also stores a sound output program and a backup program. The sound output program is a program for outputting a sound necessary for the game, such as a voice (onomatopoeic sound) of the game object, a sound effect, music, etc. The backup program is a program for storing (saving) the game data (proceeding data and result data) in the memory card 26, and the memory for saved data 56 according to a game event and an instruction by the player.
The data memory area 524 is provided with an input data buffer 524a. The input data buffer 524a is provided for storing (temporarily storing) the operation data from the operation button 20 and the coordinate data from the touch panel 22 in chronological order.
Furthermore, the data memory area 524 stores image data 524b, thickness data 524c, object data 524d, hit judging data 524e, object moving direction data 524f, and object moving velocity data 524g.
The image data 524b is data, such as polygon data, texture data, etc. for generating game image data. The thickness data 524c is table data for deciding the thickness of the line (line segment) of the drawn object 122. In
Returning to
The sample number is a serial number given according to the sampling order of the coordinate data. The coordinates are the coordinates (XY coordinates) indicated by the sampled coordinate data. Here, although detailed explanation is omitted, the resolution of the second LCD 18 and the detection accuracy of the touch panel 22 are set to be the same, and the second LCD 18 and the touch panel 22 share the same coordinate system. Accordingly, the coordinates according to the coordinate data detected via the touch panel 22 can be used as coordinates on the second LCD 18 as they are.
However, in this embodiment, a lateral direction of the second LCD 18 and the touch panel 22 is an X-axis direction, and a vertical direction thereof is a Y-axis direction. In
Furthermore, the thickness is decided depending on the distance between the coordinates indicated by the currently sampled coordinate data and the coordinates indicated by the coordinate data sampled immediately before as described above. The thickness of the line (the number of dots) corresponding to the distance is fetched from the thickness-of-line table shown in
Returning to
The object moving direction data 524f is data as to the direction vectors Vh generated according to the object moving program 522f. The object moving velocity data 524g is data as to the moving velocity A calculated according to the object moving program 522f.
Although illustration is omitted, in the data memory area 524, sound (music) data for generating the sound necessary for the game is stored, and a counter (timer) and flags which are necessary for the game processing are provided.
Although illustration is omitted, when the game is started, a timer for counting a time limit is started. Here, in a case that the number of attacks is restricted, a counter for counting the number of attacks is reset.
In a next step S3, it is determined whether or not drawing is started. That is, the CPU 50 determines whether or not a touch-off state changes to a touch-on state. More specifically, the CPU 50 determines whether or not coordinate data is input to the input data buffer 524a. If "NO" in the step S3, that is, if drawing is not started, the process returns to the same step S3 as it is. On the other hand, if "YES" in the step S3, that is, if drawing is started, sampling is started in a step S5. Here, the CPU 50 stores the first input coordinate data as the coordinate data of the drawing start position in the data memory area 524. That is, generation of the object data 524d (information of the object) is started.
In a succeeding step S7, it is determined whether or not the distance d1 is equal to or more than a constant distance L1 (3 dots). Here, the CPU 50 calculates, according to Equation 1, the distance d1 between the coordinates indicated by the coordinate data sampled immediately before and the coordinates indicated by the coordinate data currently being referred to, and determines whether or not the distance d1 is equal to or more than the constant distance L1.
If "NO" in the step S7, that is, if the distance d1 is not equal to or more than the constant distance L1, the process returns to the step S7 as it is. That is, coordinate data for which the distance d1 is less than the constant distance L1 is not sampled, whereby coordinate data due to an unintentional movement is excluded. That is, it is possible to prevent the memory from being filled with unnecessary data. On the other hand, if "YES" in the step S7, that is, if a movement of the constant distance L1 or more is performed, the coordinate data currently being referred to is sampled in a step S9. That is, the CPU 50 adds the coordinate data currently being referred to, to the object data 524d. Then, in a step S11, the two coordinates are connected with a line segment having a thickness depending on the distance d1. Here, the CPU 50 connects the coordinates currently sampled and the coordinates sampled immediately before with a line segment having the thickness (number of dots) decided in the thickness table depending on the distance d1 between the coordinates.
Succeedingly, in a step S13, it is determined whether or not drawing is ended. The CPU 50, here, determines whether or not a touch-on state changes to a touch-off state. More specifically, the CPU 50 determines whether or not there is no input of the coordinate data to the input data buffer 524a. If “NO” in the step S13, that is, if drawing is not ended, it is determined whether or not the number of samples of the coordinate data exceeds a maximum number (200) in a step S15.
If “YES” in the step S15, that is, if the number of samples of the coordinate data exceeds the maximum number, the process proceeds to a step S19. Although illustration is omitted, if “YES” in the step S15, after the flipping velocity α is set to 0, the process proceeds to the step S19. On the other hand, if “NO” in the step S15, that is, if the number of samples of the coordinate data does not exceed the maximum number, the process returns to the step S7 as it is.
Alternatively, if “YES” in the step S13, that is, if drawing is ended, flipping velocity calculating processing (see
Succeedingly, in a step S21, the coordinates for which the distance d2 from the immediately preceding coordinates is equal to or less than the constant distance L2 (10 dots) are removed. That is, the CPU 50 extracts the coordinates for generating the direction vectors Vh from the information of the object indicated by the object data 524d. In a next step S23, the total number of coordinates is set to the variable n. Here, the CPU 50 sets the total number of coordinates which have not been removed by the processing in the step S21.
As shown in
In a following step S29, the drawn object 122 is moved by one frame. Here, the drawn object 122 is moved by one frame's worth of the moving distance decided by the moving velocity A calculated in the step S19, in the direction of the direction vector Vh calculated in the step S25. Then, in a step S31, a hit between the enemy object 112, 114 and the drawn object 122 is determined at the position after the movement.
Here, the hit determining method is not an essential content of the present invention, and it is already well-known, so that detailed explanation is omitted.
Succeedingly, in a step S33, it is determined whether or not the drawn object 122 hits the enemy. If "NO" in the step S33, that is, if the drawn object 122 does not hit the enemy, the process proceeds to a step S41. On the other hand, if "YES" in the step S33, that is, if the drawn object 122 hits the enemy, the stamina of the enemy object 112, 114 is reduced by the damage corresponding to the thickness of the leading end portion of the drawn object 122 in a step S35. Although illustration is omitted, if the damage by the drawn object 122 exceeds the stamina of the enemy object 112, 114 at this time, the enemy object 112, 114 is made to disappear from the game screen 100 (game space).
Although illustration is omitted, as described above, the enemy object 112, 114 is damaged only when the drawn object 122 hits the enemy object 112, 114 with its leading end portion, so that if the drawn object 122 hits the enemy object 112, 114 with a portion other than the leading end portion, the process proceeds to the step S41 without the enemy object 112, 114 being damaged.
Although illustration is omitted, if the drawn object 122 hits the block object 116 with its leading end portion, the block object 116 may be made to disappear.
Then, in a step S37, it is determined whether or not all the enemies have disappeared. That is, the CPU 50 determines whether or not all the enemy objects 112, 114 have disappeared from the game screen 100 (game space). If "NO" in the step S37, that is, if at least one enemy object 112, 114 exists on the game screen 100 (game space), the process proceeds to the step S41. On the other hand, if "YES" in the step S37, that is, if all the enemy objects 112, 114 have disappeared from the game screen 100 (game space), game clearing processing is executed in a step S39, and the entire game processing is ended. For example, in the step S39, a game screen showing that the game is cleared is displayed, and a sound effect and music indicating that the game is cleared are output.
In the step S41, it is determined whether or not a time limit (60 seconds, for example) is exceeded. That is, the CPU 50 determines whether or not the time from the start of the game exceeds the time limit. Here, in a case that the number of attacks is restricted, it is determined whether or not the limited number of attacks is exceeded. Furthermore, the number of attacks is incremented by one when the processing in a step S53 is ended and the process returns to the step S3, as described later. In addition, limits may be imposed on both the time from the start of the game and the number of attacks, and if either limit is exceeded, the game may be over.
If “NO” in the step S41, that is, if the time limit is not exceeded, the process proceeds to a step S45 shown in
As shown in
If “NO” in the step S47, that is, if the variable k is not 0, the process returns to the step S29 shown in
Although illustration is omitted, in a case that the drawn object 122 is moved in the step S29, if the entire drawn object 122 is out of the game screen 100, even if the drawn object 122 has not yet been moved according to all direction vectors Vh, the drawn object 122 is made to disappear, and the process returns to the step S3.
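Putting the pieces together, the per-frame movement loop of steps S25 through S51 could be sketched as below: each direction vector is used for a fixed number of frames at the moving velocity A, and after the last vector the object keeps moving straight until it leaves the screen. This is a hedged Python sketch; the on_screen callback and the generator structure are assumptions made for illustration, and the caller would perform the hit determination after each yielded position.

```python
FRAMES_PER_VECTOR = 3   # predetermined number of frames per direction vector

def move_drawn_object(position, vectors, A, on_screen):
    """Yield the object position frame by frame.  Each direction vector Vh is
    applied for FRAMES_PER_VECTOR frames at velocity A; after the last vector
    the object continues in that direction until no part of it is on screen."""
    last_vh = None
    for vh in vectors:
        last_vh = vh
        for _ in range(FRAMES_PER_VECTOR):            # variable k = 3, 2, 1
            position = (position[0] + vh[0] * A, position[1] + vh[1] * A)
            yield position                            # caller does hit judging here
            if not on_screen(position):
                return                                # entire object left the screen
    while last_vh is not None and on_screen(position):
        position = (position[0] + last_vh[0] * A, position[1] + last_vh[1] * A)
        yield position
```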
Succeedingly, in a step S77, it is determined whether or not the distance d3 is equal to or less than a constant distance L3 (3 dots in this embodiment). If "NO" in the step S77, that is, if the distance d3 exceeds the constant distance L3, the variable p is decremented by one (p = p − 1) in a step S79, and it is determined whether or not the variable p equals the total number of coordinates minus 20 in a step S81. That is, the determination processing of the presence or absence of a flipping operation is ended once the 20 samples of coordinates counted back from the trailing end, out of all the sampled coordinates, have been examined.
If "NO" in the step S81, that is, if the variable p is not the total number of coordinates minus 20, the process returns to the step S73 as it is. On the other hand, if "YES" in the step S81, that is, if the variable p is the total number of coordinates minus 20, it is determined that there is no flipping operation, 0 is set to the flipping velocity α in a step S83, and the process returns to the entire game processing.
Furthermore, if "YES" in the step S77, that is, if the distance d3 is equal to or less than the constant distance L3, the p-th coordinates are regarded as the starting position of the flipping operation in a step S85. In a succeeding step S87, the maximum distance dmax between the starting position of the flipping operation and the coordinates at the trailing end is fetched. That is, the CPU 50 fetches the maximum value out of the plurality of distances d3 stored in the working area of the main memory 52.
In a next step S89, the flipping velocity α is calculated by using the maximum distance dmax according to Equation 4. Then, in a step S91, the coordinates from the starting position of the flipping operation to the trailing end are deleted from the object data 524d, and the process returns to the entire game processing.
According to this embodiment, the drawn object is generated according to a touch operation by the player, and the generated drawn object is moved according to a course decided based on the locus of the touch operation. Thus, it is possible to generate a complex object with a simple operation, and to move the object according to a complex route. Accordingly, the player has to generate the drawn object with a strategy in mind, which can increase interest in the game.
The game apparatus 10 of another embodiment is the same as the above-described embodiment except that, if the coordinates of the touched position do not change for a certain period of time or more during drawing (generating) of the drawn object 122, a copy of the drawn object 122 generated so far is moved according to the moving course based on the locus; a duplicated description is therefore omitted.
Although illustration is omitted, the copy of the drawn object 122 is an object (copy object) which translucently represents the drawn object 122, for example. The copy object is for showing the player how the drawn object 122 would move at this time point, that is, the moving course of the drawn object 122. Thus, a hit determination between the copy object and the enemy object 112, 114 or the block object 116 is not required, and therefore, no hit judging polygon is set for the copy object.
Furthermore, in a case that the coordinates of the touched position do not change for a certain period of time or more, the copy object starts to move, so that there is no increase in velocity by the flipping operation.
More specifically, movement processing of the copy object shown in
As shown in
In the step S105, it is determined whether or not the stop time is above a certain period of time (three seconds, for example). That is, the CPU 50 determines whether or not the count value of the stop time counter is above the certain period of time. If “NO” in the step S105, that is, if the stop time is equal to or less than the certain period of time, the process returns to the step S7 as it is. On the other hand, if “YES” in the step S105, that is, if the stop time is above the certain period of time, the stop time counter is instructed to stop counting in a step S107, and the process proceeds to a step S109 shown in
Although illustration is omitted, if “YES” in the step S7 as well, the counting of the stop time counter is stopped.
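The stop-time check that triggers the copy object could be sketched as follows; the timer handling and the names are assumptions, and only the condition (the touched position staying within the constant distance L1 for about three seconds) is taken from the description.

```python
STOP_TIME_LIMIT = 3.0   # seconds; stall this long and the copy object starts moving

def update_stop_timer(stalled_since, now, d1, L1=3):
    """Track how long the touched position has stayed within L1 dots of the
    previous sample.  Returns (new stall start time, start_copy_movement)."""
    if d1 >= L1:
        return None, False            # the pen moved: stop/reset the stop time counter
    if stalled_since is None:
        stalled_since = now           # start counting the stop time
    return stalled_since, (now - stalled_since) > STOP_TIME_LIMIT
```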
As shown in
In a next step S117, the copy object is moved by one frame. Here, the moving velocity A is the reference velocity A0 set in advance. In a next step S119, the variable k is decremented by one. Then, it is determined whether or not the variable k is 0 in a step S121. If "NO" in the step S121, the process returns to the step S117 as it is. On the other hand, if "YES" in the step S121, the variable n is decremented by one in a step S123.
Then, in a step S125, it is determined whether or not the variable n is 1. If “NO” in the step S125, the process returns to the step S113. On the other hand, if “YES” in the step S125, the copy object is moved until it is out of the screen in a step S127, and the process returns to the step S7.
According to this other embodiment, the copy object is moved under a predetermined condition, so that the player can know the moving course of the drawn object in advance. Therefore, it is possible to attack the enemy object with a strategy in mind, such as making a touch-off operation as it is, continuing to draw the drawn object, executing a flipping operation, etc. Thus, it is possible to further increase interest in the game.
Additionally, in the above-described embodiment, in a case that a flipping operation is performed, the velocity of the drawn object is made high in response thereto, but the velocity may be made low.
Moreover, in the above-described embodiment, in a case that a flipping operation is made, the velocity to be added is calculated on the basis of the maximum distance in the flipping operation, but this is not a limitation. For example, in a case that any one of the operation buttons 20 is pushed, or a predetermined event occurs in the virtual game, the moving velocity of the drawn object may be made higher. In addition, the velocity to be added may be set to a constant value.
Additionally, in the above-described embodiment, only a hand-held type game apparatus is explained as one example of the object controlling apparatus, but this is not a restriction. The present application can be applied to other types of object controlling apparatuses, such as a console type game apparatus, a personal computer, a PDA, a cellular phone having a game function, etc., which have a pointing device, such as a touch panel, a computer mouse or a pen tablet, or which are connected with such a pointing device. However, in a case that a pointing device other than the touch panel is provided, the position designated by the pointing device needs to be shown, and therefore, a designation image such as a mouse pointer is displayed on the game screen.
In addition, the configuration of the game apparatus is not required to be restricted to this embodiment. For example, only one camera may be provided, or no camera may be provided. Furthermore, a touch panel may be provided on both of the two LCDs.
Although the embodiments have been described and illustrated in detail, it is clearly understood that the same is by way of illustration and example only and is not to be taken by way of limitation, the spirit and scope of the present invention being limited only by the terms of the appended claims.
Number | Date | Country | Kind |
---|---|---|---|
2010-041530 | Feb 2010 | JP | national |
Number | Name | Date | Kind |
---|---|---|---|
5511983 | Kashii et al. | Apr 1996 | A |
5617117 | Kataoka et al. | Apr 1997 | A |
5900877 | Weiss et al. | May 1999 | A |
6590568 | Astala et al. | Jul 2003 | B1 |
6664965 | Yamamoto et al. | Dec 2003 | B1 |
6874126 | Lapidous | Mar 2005 | B1 |
7248270 | Boylan | Jul 2007 | B1 |
7607983 | Nakajima et al. | Oct 2009 | B2 |
7803049 | Nakada et al. | Sep 2010 | B2 |
8062131 | Hoga et al. | Nov 2011 | B2 |
8207970 | Matsuoka | Jun 2012 | B2 |
20030034439 | Reime et al. | Feb 2003 | A1 |
20030034961 | Kao | Feb 2003 | A1 |
20030197744 | Irvine | Oct 2003 | A1 |
20030214491 | Keely et al. | Nov 2003 | A1 |
20040021663 | Suzuki et al. | Feb 2004 | A1 |
20040164956 | Yamaguchi et al. | Aug 2004 | A1 |
20040196267 | Kawai et al. | Oct 2004 | A1 |
20050130738 | Miyamoto et al. | Jun 2005 | A1 |
20050168449 | Katayose | Aug 2005 | A1 |
20070024597 | Matsuoka | Feb 2007 | A1 |
20070063986 | Hoga et al. | Mar 2007 | A1 |
20080225007 | Nakadaira et al. | Sep 2008 | A1 |
Number | Date | Country |
---|---|---|
4-333912 | Nov 1992 | JP |
5-031256 | Feb 1993 | JP |
5-100809 | Sep 1998 | JP |
2001-297258 | Oct 2001 | JP |
2002-328040 | Nov 2002 | JP |
2004-070492 | Mar 2004 | JP |
2005-032015 | Feb 2005 | JP |
2005-092472 | Apr 2005 | JP |
2005-193006 | Jul 2005 | JP |
2007-034634 | Feb 2007 | JP |
Entry |
---|
Sakaguchi et al. “Visual Basic 6.0 Application Development and Programming: The Definite Edition” pp. 48-53, Kyoritsu Shuppan Co., Ltd., Jun. 20, 2000. |
Office Action for U.S. Appl. No. 11/493,037, mailed Mar. 5, 2009. |
Office Action for U.S. Appl. No. 11/493,037, mailed Sep. 10, 2009. |
Office Action for U.S. Appl. No. 11/493,037, mailed Mar. 5, 2010. |
Office Action for U.S. Appl. No. 11/493,037, mailed Oct. 13, 2010. |
Office Action for U.S. Appl. No. 11/493,037, mailed Apr. 6, 2011. |
Notice of Allowance for U.S. Appl. No. 11/493,037, mailed Dec. 30, 2011. |
Number | Date | Country | |
---|---|---|---|
20110214093 A1 | Sep 2011 | US |