The present invention relates to a game device, a game device control method, a program, and an information storage medium.
There is known a game device which displays a virtual three-dimensional space in which a character object is placed on a game screen. Conventionally, such a game device employs a method of changing the position of an eyeball of a character object according to the viewing direction of the character object, as a method of representing the direction in which the character object is looking.
However, when a person is looking in one direction, changes occur not only in the positions of the eyeballs but also in the peripheral parts of the eyes. For example, wrinkles form in the skin around the eyes. For this reason, the method of changing only the position of an eyeball as described above may give a user the impression that the peripheral part of the eye of a character object is unnatural.
As a method of suppressing this drawback, one may conceive of preparing, beforehand, data (motion data or animation data) which shows how the peripheral part of an eye of a character object changes according to the viewing direction of the character object, and changing the peripheral part of the eye of the character object according to that data. However, the viewing direction of a character object is not restricted to a fixed direction but varies, and hence the use of this method requires a vast amount of such data. That is, both the amount of data and the amount of work required to produce the data increase.
The present invention has been made in light of the foregoing problem, and it is an object of the present invention to provide a game device, a game device control method, a program, and an information storage medium, which are capable of showing how a peripheral part of an eye of a character object changes according to a viewing direction of the character object while reducing the amount of data and the amount of work required to produce the data.
In order to solve the above-mentioned problem, according to the present invention, a game device, which displays a virtual three-dimensional space in which a character object is placed on a game screen, is characterized by including: vertex position specifying data storage means for storing a plurality of pieces of vertex position specifying data corresponding to a plurality of basic viewing directions, the vertex position specifying data specifying positions of vertexes of a peripheral part of an eye of the character object; blend control data storage means for storing blend control data associating viewing direction information about a viewing direction of the character object with blend ratio information about a blend ratio of the plurality of pieces of vertex position specifying data; viewing direction acquisition means for acquiring the viewing direction of the character object; blend ratio information acquisition means for acquiring the blend ratio information corresponding to the viewing direction of the character object, which is acquired by the viewing direction acquisition means, based on the blend control data; vertex position specifying data acquisition means for blending the plurality of pieces of vertex position specifying data based on the blend ratio information acquired by the blend ratio information acquisition means, to thereby acquire the vertex position specifying data corresponding to the viewing direction of the character object acquired by the viewing direction acquisition means; and display control means for displaying the game screen based on the vertex position specifying data, which corresponds to the viewing direction of the character object acquired by the viewing direction acquisition means, acquired by the vertex position specifying data acquisition means.
Further, according to the present invention, a control method for a game device, which displays a virtual three-dimensional space in which a character object is placed on a game screen, is characterized by including: a step of reading storage content of vertex position specifying data storage means storing a plurality of pieces of vertex position specifying data corresponding to a plurality of basic viewing directions, the vertex position specifying data specifying positions of vertexes of a peripheral part of an eye of the character object; a step of reading storage content of blend control data storage means storing blend control data associating viewing direction information about a viewing direction of the character object with blend ratio information about a blend ratio of the plurality of pieces of vertex position specifying data; a viewing direction acquisition step of acquiring the viewing direction of the character object; a blend ratio information acquisition step of acquiring the blend ratio information corresponding to the viewing direction of the character object, which is acquired in the viewing direction acquisition step, based on the blend control data; a vertex position specifying data acquisition step of blending the plurality of pieces of vertex position specifying data based on the blend ratio information acquired in the blend ratio information acquisition step, to thereby acquire the vertex position specifying data corresponding to the viewing direction of the character object acquired in the viewing direction acquisition step; and a display control step of displaying the game screen based on the vertex position specifying data, which corresponds to the viewing direction of the character object acquired in the viewing direction acquisition step, acquired in the vertex position specifying data acquisition step.
Further, a program according to the present invention causes a computer such as a consumer game machine, a portable game machine, an arcade game machine, a mobile phone, a personal digital assistant (PDA), or a personal computer to function as a game device which displays a virtual three-dimensional space in which a character object is placed on a game screen, and further causes the computer to function as: vertex position specifying data storage means for storing a plurality of pieces of vertex position specifying data corresponding to a plurality of basic viewing directions, the vertex position specifying data specifying positions of vertexes of a peripheral part of an eye of the character object; blend control data storage means for storing blend control data associating viewing direction information about a viewing direction of the character object with blend ratio information about a blend ratio of the plurality of pieces of vertex position specifying data; viewing direction acquisition means for acquiring the viewing direction of the character object; blend ratio information acquisition means for acquiring the blend ratio information corresponding to the viewing direction of the character object, which is acquired by the viewing direction acquisition means, based on the blend control data; vertex position specifying data acquisition means for blending the plurality of pieces of vertex position specifying data based on the blend ratio information acquired by the blend ratio information acquisition means, to thereby acquire the vertex position specifying data corresponding to the viewing direction of the character object acquired by the viewing direction acquisition means; and display control means for displaying the game screen based on the vertex position specifying data, which corresponds to the viewing direction of the character object acquired by the viewing direction acquisition means, acquired by the vertex position specifying data acquisition means.
Further, an information storage medium according to the present invention is a computer-readable information storage medium recording the program. Further, a program delivery device according to the present invention includes an information storage medium recording the program, reads the program from the information storage medium, and delivers the program. A program delivery method according to the present invention is a program delivery method of reading the program from an information storage medium recording the program, and delivering the program.
The present invention relates to a game device which displays a virtual three-dimensional space in which a character object is placed on a game screen. According to the present invention, a plurality of pieces of vertex position specifying data corresponding to a plurality of basic viewing directions are stored. The vertex position specifying data is data for specifying positions of vertexes of a peripheral part of an eye of a character object. According to the present invention, blend control data is stored, which associates viewing direction information about a viewing direction of the character object with blend ratio information about a blend ratio of the plurality of pieces of vertex position specifying data. According to the present invention, the viewing direction of the character object is acquired. The blend ratio information corresponding to the viewing direction of the character object is acquired based on the blend control data. Further, blending the plurality of pieces of vertex position specifying data based on the blend ratio information provides vertex position specifying data corresponding to the viewing direction of the character object. The game screen is then displayed based on the vertex position specifying data. The present invention makes it possible to show how the peripheral part of the eye of the character object changes according to the viewing direction of the character object while reducing the amount of data and the amount of work required to produce the data.
Further, according to one aspect of the present invention, one or more skeleton parts related to one or more vertexes of the character object may be set to the character object. One or more positions of the one or more vertexes of the character object which are related to the one or more skeleton parts may change according to a change in a state of the one or more skeleton parts. The vertex position specifying data may indicate the state of the skeleton part or parts, among the one or more skeleton parts, to which vertexes of the peripheral part of the eye of the character object are related. Further, the display control means may include: motion data storage means for storing motion data indicating the state of the one or more skeleton parts in a case where the character object performs a predetermined motion; means for changing the state of the one or more skeleton parts based on the motion data; and means for replacing a state of a skeleton part to which a vertex of the peripheral part of the eye of the character object is related from a state indicated by the motion data to a state indicated by the vertex position specifying data, which corresponds to the viewing direction of the character object acquired by the viewing direction acquisition means, acquired by the vertex position specifying data acquisition means.
Further, according to another aspect of the present invention, the display control means may include: motion data storage means for storing motion data for specifying a position of each vertex of the character object in a case where the character object performs a predetermined motion; means for changing the position of each vertex of the character object based on the motion data; and means for replacing a position of a vertex of the peripheral part of the eye of the character object from a position specified by the motion data to a position specified by the vertex position specifying data, which corresponds to the viewing direction of the character object acquired by the viewing direction acquisition means, acquired by the vertex position specifying data acquisition means.
Hereafter, an example of one embodiment of the present invention is described in detail referring to the accompanying drawings. A game device according to the embodiment of the present invention is realized by, for example, a consumer game machine, a portable game machine, a mobile phone, a personal digital assistant (PDA), or a personal computer. The following description is given of a case where the game device according to the embodiment of the present invention is realized by a consumer game machine.
The consumer game machine 11 is a publicly known computer game system. The consumer game machine 11 includes a bus 12, a microprocessor 14, a main memory 16, an image processing unit 18, an input/output processing unit 20, a sound processing unit 22, an optical disc reading unit 24, a hard disk 26, a communication interface 28, and a controller 30. The components other than the controller 30 are accommodated in a casing of the consumer game machine 11.
The bus 12 is used for exchanging addresses and data among the individual components of the consumer game machine 11. The microprocessor 14, the main memory 16, the image processing unit 18, and the input/output processing unit 20 are connected via the bus 12 so as to be capable of exchanging data with one another.
The microprocessor 14 controls the individual components of the consumer game machine 11 based on an operating system stored in a ROM (not shown), and a program and data which are read from the optical disc 36 or the hard disk 26. The main memory 16 includes a RAM, for example. The program and data read from the optical disc 36 or the hard disk 26 are written in the main memory 16 as needed. The main memory 16 is used also as a work memory of the microprocessor 14.
The image processing unit 18, which includes a VRAM, renders a game screen into the VRAM based on image data sent from the microprocessor 14. The image processing unit 18 then converts the game screen rendered in the VRAM into a video signal, and outputs the video signal to the monitor 32 at a predetermined timing.
The input/output processing unit 20 is an interface via which the microprocessor 14 accesses the sound processing unit 22, the optical disc reading unit 24, the hard disk 26, the communication interface 28, and the controller 30. The sound processing unit 22, the optical disc reading unit 24, the hard disk 26, the communication interface 28, and the controller 30 are connected to the input/output processing unit 20.
The sound processing unit 22 includes a sound buffer in which various kinds of sound data read from the optical disc 36 or the hard disk 26, such as game music, game sound effects, and messages, are stored. The sound processing unit 22 reproduces the various kinds of sound data stored in the sound buffer, and outputs the sound data from the speaker 34.
The optical disc reading unit 24 reads the program and data recorded on the optical disc 36 in response to an instruction from the microprocessor 14. While the optical disc 36 is used here to supply a program and data to the consumer game machine 11, another information storage medium, such as a ROM card, may be used. Alternatively, a program and data may be supplied to the consumer game machine 11 from a remote place over a communication network, such as the Internet.
The hard disk 26 is a common hard disk drive (auxiliary storage device). A program and data are stored in the hard disk 26. The communication interface 28 is an interface for establishing a wired or wireless connection of the consumer game machine 11 to a communication network, such as the Internet.
The controller 30 is general-purpose operation input means for a user to input various game operations. The input/output processing unit 20 scans a state of each component of the controller 30 for every given period (for example, every 1/60th of a second). Then, the input/output processing unit 20 sends an operation signal indicating the scanning result to the microprocessor 14 via the bus 12. The microprocessor 14 determines a game operation carried out by a player based on the operation signal. A plurality of the controllers 30 can be connected to the consumer game machine 11. The microprocessor 14 executes game control based on the operation signal input from each controller 30.
The game device 10 with the foregoing configuration executes the game program read from the optical disc 36 or the hard disk 26, whereby a soccer game, for example, is realized.
A virtual three-dimensional space is created in the main memory 16 of the game device 10.
A skeleton is set inside the character object 46 and is used to change the posture and expression of the character object 46. A skeleton includes joints, which correspond to joint portions, and bones, which connect joints. One or more vertexes of the character object 46 are related to each skeleton part (joint or bone), and the position of a vertex related to a skeleton part changes according to a change in the state (position, rotational angle, etc.) of that skeleton part. In order to change the posture or expression of the character object 46, therefore, a change is given to the state of each skeleton part; the related vertexes move according to the change, and as a result the posture and expression of the character object 46 change.
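As an illustration only, the following Python sketch shows one way the relationship between skeleton parts and their related vertexes could be represented. The names, values, and translation-only skinning are assumptions for explanation, not the embodiment's actual data layout.

```python
from dataclasses import dataclass

@dataclass
class SkeletonPart:
    name: str
    rotation: float = 0.0                # rotational angle (degrees)
    position: tuple = (0.0, 0.0, 0.0)    # position of the joint/bone

@dataclass
class Vertex:
    part: SkeletonPart                   # skeleton part this vertex is related to
    offset: tuple                        # offset from the related skeleton part

    def world_position(self):
        # Simplified skinning: the vertex follows the position of its
        # related skeleton part (rotation is ignored for brevity).
        return tuple(p + o for p, o in zip(self.part.position, self.offset))

eyelid = SkeletonPart("upper_eyelid")
v = Vertex(part=eyelid, offset=(0.0, 0.1, 0.0))
eyelid.position = (0.0, 0.02, 0.0)       # change the state of the skeleton part
print(v.world_position())                # -> (0.0, 0.12, 0.0): the vertex moved
```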
For example, skeleton parts (hereinafter referred to as “eye-related skeleton parts”) related to the vertexes of an eye 52 and the peripheral part of the eye 52 (in other words, portions which show changes according to a change in a viewing direction of the character object 46), skeleton parts related to the vertexes of a mouth 58 and the peripheral part of the mouth 58, and so forth are set on the face 50 (head) of the character object 46. The eye-related skeleton parts include, for example, a joint and/or bone for moving an upper eyelid 54, and a joint and/or bone for moving an eyebrow 56. The states of those joints and/or bones change, whereby the upper eyelid 54 and the eyebrow 56 of the character object 46 move.
A virtual camera 49 is disposed in the virtual three-dimensional space 40. A game screen showing the picture obtained by viewing the virtual three-dimensional space 40 from the virtual camera 49 is displayed on the monitor 32. A user operates a character object 46 as an operation target using the controller 30 while viewing the game screen. The character object 46 to be operated by the user acts according to contents of the user's operation. The other character objects 46 act according to contents determined by a predetermined algorithm (which is hereafter referred to as an “action decision algorithm”).
The following describes technology for allowing the game device 10 to show how the peripheral part of the eye 52 of the character object 46 changes according to the viewing direction of the character object 46 while reducing the amount of data and the amount of work required to produce the data.
First, data to be stored in the game device 10 is described. Of the data described below, the data to be stored in the optical disc 36 may instead be stored in the hard disk 26.
Model data of each object placed in the virtual three-dimensional space 40 is stored in the optical disc 36. Motion data of the character object 46 is also stored in the optical disc 36 (motion data storage means). Motion data is data indicating a change in the position of each vertex of the character object 46 for each frame (for example, every 1/60th of a second) when the character object 46 performs various motions. According to this embodiment, the position of a vertex of the character object 46 is specified by the state (rotational angle, position, etc.) of each skeleton part set for the character object 46. According to this embodiment, therefore, motion data is data indicating a change in the state (rotational angle, position, etc.) of each skeleton part for each frame when the character object 46 performs various motions. Hereafter, changing the state of each skeleton part of the character object 46 according to motion data is referred to as “reproduction of motion data”.
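To make “reproduction of motion data” concrete, here is a minimal sketch under assumed data shapes (per-frame dictionaries of skeleton part states; the part names and values are hypothetical):

```python
# motion_data[frame][part] = (rotational angle, position) -- an assumed layout.
motion_data = {
    0: {"upper_eyelid": (0.0, (0.0, 0.00, 0.0))},
    1: {"upper_eyelid": (5.0, (0.0, 0.01, 0.0))},
}

# "Skeleton state" data: the current state of each skeleton part.
skeleton_state = {"upper_eyelid": {"rotation": 0.0, "position": (0.0, 0.0, 0.0)}}

def reproduce_motion(frame: int) -> None:
    """Copy the state stored for the current frame into the skeleton state
    data; the related vertexes then follow the skeleton parts."""
    for part, (rotation, position) in motion_data[frame].items():
        skeleton_state[part]["rotation"] = rotation
        skeleton_state[part]["position"] = position

reproduce_motion(1)   # frame 1: the eyelid joint rotates 5 degrees and rises
print(skeleton_state["upper_eyelid"])
```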
Examples of motion data of the character object 46 include motion data in a case where the character object 46 runs, motion data (hereafter referred to as “pass motion data”) in a case where the character object 46 makes a pass, motion data (hereafter referred to as “joy motion data”) in a case where the character object 46 is happy, and motion data (hereafter referred to as “pain motion data”) in a case where the character object 46 shows pain.
Joy motion data is used, for example, when a score is obtained or a game is won. The joy motion data includes data indicating a change in the state of each skeleton part of the face 50 of the character object 46 for making the character object 46 have a smiley face. When joy motion data is reproduced, therefore, the character object 46 takes a “joyful action”, and the expression of the face 50 of the character object 46 turns into a smiley face. Pain motion data is used, for example, when the character object 46 is tackled by a character object 46 of the opponent team. The pain motion data includes data indicating a change in the state of each skeleton part of the face 50 of the character object 46 for making the character object 46 have a “painful expression”. When pain motion data is reproduced, therefore, the character object 46 performs a “painful action”, and the expression of the face 50 of the character object 46 turns into a “painful expression”.
Data (vertex position specifying data) for specifying the positions of the vertexes of the eye 52 and the peripheral part of the eye 52 of the character object 46 when the character object 46 is looking in each of a plurality of basic viewing directions is stored in the optical disc 36 (vertex position specifying data storage means). As described above, the positions of the vertexes of the eye 52 and the peripheral part of the eye 52 of the character object 46 are specified by the states (rotational angle, position, etc.) of the eye-related skeleton parts of the character object 46. According to this embodiment, therefore, as data for specifying the positions of the vertexes of the eye 52 and the peripheral part of the eye 52 when the character object 46 is looking in a basic viewing direction, data indicating the states (rotational angle, position, etc.) of eye-related skeleton parts when the character object 46 is looking in the basic viewing direction is stored. Hereafter, this data is referred to as “basic state data” of an eye-related skeleton part.
According to this embodiment, pieces of basic state data of the eye-related skeleton parts corresponding to four basic viewing directions (Up, Down, Left, and Right) are stored in the optical disc 36. Hereinafter, the basic state data corresponding to the Up, Down, Left, and Right directions are referred to as “basic state data U”, “basic state data D”, “basic state data L”, and “basic state data R”, respectively.
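A sketch of what the four pieces of basic state data might hold; the part names and numbers are purely illustrative assumptions:

```python
# For each basic viewing direction, the state (rotational angle, position)
# of every eye-related skeleton part when the character looks that way.
basic_state_data = {
    "U": {"upper_eyelid": {"rotation":  10.0, "position": (0.0,  0.02, 0.0)}},
    "D": {"upper_eyelid": {"rotation": -10.0, "position": (0.0, -0.02, 0.0)}},
    "L": {"upper_eyelid": {"rotation":   0.0, "position": (-0.01, 0.0, 0.0)}},
    "R": {"upper_eyelid": {"rotation":   0.0, "position": ( 0.01, 0.0, 0.0)}},
}
```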
Blend control data is stored in the optical disc 36 (blend control data storage means). Blend control data is data which associates viewing direction information about the viewing direction of the character object 46 with blend ratio information about the blend ratio (composition ratio) in the case of blending (composing) the basic state data U, D, L, and R of an eye-related skeleton part.
The “viewing direction condition” field shows the conditions for the viewing direction of the character object 46. More specifically, the “viewing direction condition” field shows the conditions for the characteristic amounts θx and θy of the viewing direction of the character object 46.
The characteristic amounts θx and θy of the viewing direction of the character object 46 are described below.
The characteristic amounts θx and θy are angles which indicate deviations between the viewing direction 62 of the character object 46 and the frontward direction 64 of the face 50 of the character object 46. θx is an angle indicating how much the viewing direction 62 is shifted in the latitudinal direction (X-axial direction) with respect to the frontward direction 64 of the face 50. Similarly, θy is an angle indicating how much the viewing direction 62 is shifted in the longitudinal direction (Y-axial direction) with respect to the frontward direction 64.
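One way to compute θx and θy from direction vectors is sketched below; the axis conventions (Y up, yaw about the vertical axis, pitch toward it) are assumptions made for illustration, not the embodiment's definition.

```python
import math

def yaw_pitch(d):
    """Yaw (left-right) and pitch (up-down) of a direction vector, in degrees."""
    x, y, z = d
    return (math.degrees(math.atan2(x, z)),
            math.degrees(math.atan2(y, math.hypot(x, z))))

def characteristic_angles(view_dir, front_dir):
    """θx: up-down deviation, θy: left-right deviation of the viewing
    direction 62 with respect to the frontward direction 64 of the face."""
    view_yaw, view_pitch = yaw_pitch(view_dir)
    front_yaw, front_pitch = yaw_pitch(front_dir)
    return view_pitch - front_pitch, view_yaw - front_yaw

# A face looking straight ahead (+z) whose gaze shifts up and to the right:
print(characteristic_angles((1.0, 1.0, 1.0), (0.0, 0.0, 1.0)))
# -> approximately (35.3, 45.0)
```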
The “blend ratio” field shows the blend ratio in the case of blending the basic state data U, D, L, and R of an eye-related skeleton part.
Blend control data may be data in equation form. That is, blend control data may be an operational equation for computing the blend ratios of the basic state data U, D, L, and R of the eye-related skeleton parts based on the values of θx and θy. Alternatively, blend control data may be a combination of data in table format and data in equation form. For example, blend control data may define, for every angle range of θx and θy, an operational equation for computing the blend ratios of the basic state data U, D, L, and R based on the values of θx and θy.
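As one concrete (and purely hypothetical) equation form, the blend ratios could be derived directly from θx and θy as below; this sketch happens to reproduce the ratios used in the worked examples later in the text (θx = θy = 45 degrees yields 0.5 for U and R):

```python
def blend_ratio_from_angles(theta_x, theta_y, max_angle=90.0):
    """A hypothetical operational equation: αu, αd grow with upward/downward
    deviation θx, and αl, αr with leftward/rightward deviation θy."""
    au = max(theta_x, 0.0) / max_angle     # Up
    ad = max(-theta_x, 0.0) / max_angle    # Down
    al = max(-theta_y, 0.0) / max_angle    # Left
    ar = max(theta_y, 0.0) / max_angle     # Right
    return (au, ad, al, ar)

print(blend_ratio_from_angles(45.0, 45.0))    # -> (0.5, 0.0, 0.0, 0.5)
print(blend_ratio_from_angles(-45.0, -45.0))  # -> (0.0, 0.5, 0.5, 0.0)
```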
In addition to the above-mentioned data, game status data indicating the current status of the game is stored in the main memory 16.
The processing which is executed by the game device 10 is described below.
First, the microprocessor 14 updates the game status data stored in the main memory 16 (S101).
For example, the microprocessor 14 updates the “position”, “moving direction”, and “moving speed” data of each character object 46 according to the contents of the user's operation or the decision contents of the action decision algorithm. More specifically, the microprocessor 14 updates the “position” data and the like of the character object 46 which is the user's operation target according to the contents of the user's operation, and updates the “position” data and the like of each character object 46 which is not the user's operation target according to the decision contents of the action decision algorithm.
Further, for example, the microprocessor 14 updates the “reproduced motion data ID” of each character object 46 according to the contents of the user's operation or the decision contents of the action decision algorithm. When the user performs an operation to instruct a pass, for example, the microprocessor 14 sets the ID of the pass motion data to the “reproduced motion data ID” of the character object 46 which is the user's operation target. When a score is obtained, for example, the microprocessor 14 sets the ID of the joy motion data to the “reproduced motion data ID” of the character object 46 which has made the score. When a character object 46 is tackled, for example, the microprocessor 14 sets the ID of the pain motion data to its “reproduced motion data ID”. Note that when the “reproduced motion data ID” is updated, the microprocessor 14 initializes the “current frame number” to the first frame number. When one of the character objects 46 performs a ball-related operation such as a pass or a shot, the microprocessor 14 also updates the “position”, “moving direction”, and “moving speed” data of the ball object 48.
Further, for example, the microprocessor 14 advances the “current frame number” of each character object 46 by one frame.
Further, for example, the microprocessor 14 (display control means) updates the “skeleton state” data of each character object 46 based on the “reproduced motion data ID” and the “current frame number”. More specifically, the microprocessor 14 acquires the state of each skeleton part in the current frame from the motion data being reproduced, and updates the “skeleton state” data of the character object 46 so that it reflects the acquired state. When the ID of the joy motion data is set to the “reproduced motion data ID” of a character object 46, for example, the “skeleton state” data of the character object 46 is updated so that the state of each skeleton part (eye-related skeleton part or the like) of the face 50 of the character object 46 becomes the state in the current frame of the joy motion data. Accordingly, the state indicated by the motion data being reproduced (for example, joy motion data) is held in the “skeleton state” data as the state of each eye-related skeleton part.
Further, for example, the microprocessor 14 updates the “viewing target” data of each character object 46. For example, the microprocessor 14 selects a predetermined position in the virtual three-dimensional space 40, the ball object 48, another character object 46, or the like as the viewing target according to a predetermined algorithm. The viewing target of the character object 46 may be determined every predetermined time, or may be determined when a certain game event takes place.
Thereafter, the microprocessor 14 performs a process for correcting the state of the eye-related skeleton parts of each character object 46 (S102 to S112). First, the microprocessor 14 selects one of the character objects 46 (S102). Hereafter, the character object 46 selected in S102 or in S112 (described later) is referred to as the “selected character object”.
Then, the microprocessor 14 (viewing direction acquisition means) acquires the viewing direction 62 of the selected character object (S103). The microprocessor 14 acquires the viewing direction 62 of the selected character object based on the position of the selected character object and the position of the viewing target. For example, the microprocessor 14 acquires, as the viewing direction 62, the direction from the central point 60 between the right eye 52 and the left eye 52 of the selected character object toward the position of the viewing target. Then, the microprocessor 14 acquires the characteristic amounts θx and θy of the viewing direction 62 (S104).
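A sketch of the viewing direction acquisition in S103, under the assumption that positions are plain 3-tuples standing in for the engine's vector type:

```python
import math

def viewing_direction(left_eye, right_eye, target):
    """Direction from the central point 60 between the two eyes toward the
    position of the viewing target (cf. S103), as a unit vector."""
    center = tuple((a + b) / 2.0 for a, b in zip(left_eye, right_eye))
    d = tuple(t - c for t, c in zip(target, center))
    length = math.sqrt(sum(c * c for c in d))
    return tuple(c / length for c in d)

# Eyes at head height; the viewing target (e.g., the ball) ahead and above:
print(viewing_direction((-0.03, 1.6, 0.0), (0.03, 1.6, 0.0), (0.0, 2.6, 1.0)))
# -> approximately (0.0, 0.707, 0.707)
```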
Next, the microprocessor 14 reads the top record from the blend control data (S105). The microprocessor 14 then determines whether or not θx and θy satisfy the condition indicated in the “viewing direction condition” field of the read record (S106). If θx and θy do not satisfy the condition, the microprocessor 14 reads the next record from the blend control data (S107) and determines whether or not θx and θy satisfy the condition indicated in the “viewing direction condition” field of that record (S106).
If θx and θy satisfy the condition indicated in the “viewing direction condition” field of the read record, on the other hand, the microprocessor 14 (blend ratio information acquisition means) acquires the blend ratio indicated in the “blend ratio” field of the read record (S108). Then, the microprocessor 14 (vertex position specifying data acquisition means) blends the basic state data U, D, L, and R of the eye-related skeleton part based on the blend ratio acquired in S108 to acquire state data of the eye-related skeleton part corresponding to the viewing direction 62 acquired in S103 (S109).
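The top-to-bottom scan of S105 to S108 might look as follows, with two hypothetical records standing in for a full table (a real table would cover every range of θx and θy):

```python
# Each record: (θx range, θy range, blend ratios (αu, αd, αl, αr)).
blend_control_data = [
    (( 30.0,  90.0), ( 30.0,  90.0), (0.5, 0.0, 0.0, 0.5)),  # upper right
    ((-90.0, -30.0), (-90.0, -30.0), (0.0, 0.5, 0.5, 0.0)),  # lower left
]

def acquire_blend_ratio(theta_x, theta_y):
    """Read records from the top (S105/S107) and return the blend ratio of
    the first record whose viewing direction condition is satisfied (S106,
    S108)."""
    for (x_lo, x_hi), (y_lo, y_hi), ratios in blend_control_data:
        if x_lo <= theta_x <= x_hi and y_lo <= theta_y <= y_hi:
            return ratios
    raise LookupError("no matching record; a complete table covers all cases")

print(acquire_blend_ratio(45.0, 45.0))  # -> (0.5, 0.0, 0.0, 0.5)
```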
In this case, for example, the rotational angle θ of each skeleton part (joint or bone) corresponding to the viewing direction 62 acquired in S103 is acquired by the following equation (1). In the following equation (1), θu represents the rotational angle of the skeleton part that is indicated by the basic state data U. θd represents the rotational angle of the skeleton part that is indicated by the basic state data D. θl represents the rotational angle of the skeleton part that is indicated by the basic state data L. θr represents the rotational angle of the skeleton part that is indicated by the basic state data R. Further, αu represents the blend ratio of the basic state data U. αd represents the blend ratio of the basic state data D. αl represents the blend ratio of the basic state data L. αr represents the blend ratio of the basic state data R.
θ=θu*αu+θd*αd+θl*αl+θr*αr (1)
For example, the position P of each skeleton part (joint or bone) corresponding to the viewing direction 62 acquired in S103 is acquired by the following equation (2). In the following equation (2), Pu represents the position of the skeleton part that is indicated by the basic state data U. Pd represents the position of the skeleton part that is indicated by the basic state data D. Pl represents the position of the skeleton part that is indicated by the basic state data L. Pr represents the position of the skeleton part that is indicated by the basic state data R. Note that, αu, αd, αl, and αr in the equation (2) are the same as those of the equation (1).
P=Pu*αu+Pd*αd+Pl*αl+Pr*αr (2)
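Equations (1) and (2) amount to a ratio-weighted sum over the four basic states; a minimal sketch, reusing the hypothetical state layout from earlier:

```python
def blend_state(U, D, L, R, ratios):
    """Equations (1) and (2): blend one skeleton part's four basic states.
    Each state is {"rotation": float, "position": (x, y, z)}."""
    states = (U, D, L, R)
    rotation = sum(a * s["rotation"] for s, a in zip(states, ratios))
    position = tuple(sum(a * s["position"][i] for s, a in zip(states, ratios))
                     for i in range(3))
    return {"rotation": rotation, "position": position}

U = {"rotation":  10.0, "position": (0.0,  0.02, 0.0)}
D = {"rotation": -10.0, "position": (0.0, -0.02, 0.0)}
L = {"rotation":   0.0, "position": (-0.01, 0.0, 0.0)}
R = {"rotation":   0.0, "position": ( 0.01, 0.0, 0.0)}
print(blend_state(U, D, L, R, (0.5, 0.0, 0.0, 0.5)))
# -> {'rotation': 5.0, 'position': (0.005, 0.01, 0.0)}
```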
After the state data of the eye-related skeleton part corresponding to the viewing direction 62 acquired in S103 is acquired, the microprocessor 14 (display control means) updates the state of the eye-related skeleton part held in the “skeleton state” data of the selected character object based on the acquired state data (S110). More specifically, the microprocessor 14 changes the state of the eye-related skeleton part held in the “skeleton state” data of the selected character object from the state acquired based on motion data in S101 to the state indicated by the state data acquired in S109.
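The replacement in S110 then touches only the eye-related parts; a minimal sketch, assuming the skeleton state is a dictionary keyed by (hypothetical) part names:

```python
EYE_RELATED_PARTS = ("upper_eyelid", "eyebrow")   # hypothetical part names

def correct_eye_related_state(skeleton_state, blended_state):
    """S110: overwrite the eye-related skeleton parts with the state data
    acquired in S109; all other parts keep the state set from the motion
    data in S101."""
    for part in EYE_RELATED_PARTS:
        skeleton_state[part] = blended_state[part]
```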
Then, the microprocessor 14 determines whether or not there is any character object 46 which has not yet been selected as the selected character object (S111). If such a character object 46 remains, the microprocessor 14 selects one of the character objects 46 not yet selected (S112), and performs the processing from S103 onward.
On the other hand, if there is no character object 46 unselected as the selected character object, i.e., if correction of the state of the eye-related skeleton parts of all the character objects 46 is completed, the microprocessor 14 and the image processing unit 18 (display control means) generate, in the VRAM, a game screen showing a picture obtained by viewing the virtual three-dimensional space 40 from the virtual camera 49, based on the game status data stored in the main memory 16 (S113). For example, the microprocessor 14 and the image processing unit 18 compute the position of each vertex of each character object 46 based on the “skeleton state” data of the character object 46. Then, the microprocessor 14 and the image processing unit 18 generate the game screen based on the computation result, the “position” data of each character object 46, and the like. The game screen generated in the VRAM is output to the monitor 32 at a given timing to be displayed thereon.
The game device 10 described above can express how the peripheral part of the eye 52 of the character object 46 changes according to the viewing direction of the character object 46.
For example, a case is assumed in which the viewing direction 62 of the character object 46 is shifted to the upper right with respect to the frontward direction 64 of the face 50. More specifically, it is assumed that θx is 45 degrees and θy is 45 degrees. In this case, “0.5”, “0”, “0”, and “0.5” are acquired as the blend ratios of the basic state data U, D, L, and R, respectively (see S108). As a result, the state data acquired in S109 indicates a state intermediate between the basic state data U and the basic state data R, and the eye 52 and the peripheral part of the eye 52 of the character object 46 are displayed as they appear when the character object 46 looks to the upper right.
Another case is assumed, in which, for example, the viewing direction 62 of the character object 46 is shifted to the lower left with respect to the frontward direction 64 of the face 50. More specifically, it is assumed that θx is −45 degrees and θy is −45 degrees. In this case, “0”, “0.5”, “0.5”, and “0” are acquired as the blend ratios of the basic state data U, D, L, and R, respectively (see S108). As a result, the state data acquired in S109 indicates a state intermediate between the basic state data D and the basic state data L, and the eye 52 and the peripheral part of the eye 52 of the character object 46 are displayed as they appear when the character object 46 looks to the lower left.
Incidentally, the following method may also be conceived as a method of expressing how the peripheral part of the eye 52 of the character object 46 changes according to the viewing direction 62 of the character object 46. That is, motion data expressing how the peripheral part of the eye 52 changes may be prepared for every viewing direction 62 of the character object 46; for example, joy motion data, pain motion data, and the like may be prepared for every viewing direction 62. However, the viewing direction 62 of the character object 46 changes according to the positional relationship between the character object 46 and a viewing target (for example, a predetermined position in the virtual three-dimensional space 40, the ball object 48, or another character object 46), and is not restricted to a fixed direction. For that reason, if the above-mentioned method is adopted, a vast amount of motion data has to be prepared; that is, the amount of data and the amount of work required to produce the data increase. In this regard, the game device 10 merely requires preparation of the basic state data (e.g., the basic state data U, D, L, and R) of the eye-related skeleton parts corresponding to each of a plurality of basic viewing directions, and the blend control data. The game device 10 therefore reduces the amount of data and the amount of work required to produce the data.
In the game device 10, after the state of each skeleton part of a character object 46 is updated according to motion data such as joy motion data or pain motion data, only the states of the eye-related skeleton parts are replaced. Accordingly, the states of the skeleton parts other than the eye-related skeleton parts (e.g., the skeleton parts to which the vertexes of the mouth 58 and the periphery of the mouth 58 are related) remain the states determined by the original motion data (e.g., joy motion data or pain motion data). As a result, for example, the mouth 58 shows the expression intended by the original motion data (a smiley face, a “painful expression”, or the like).
The present invention is not limited to the embodiment described above.
For example, the basic viewing directions are not restricted to the four directions of Up, Down, Left, and Right. For example, the basic viewing directions may be eight directions of Up, Down, Left, Right, Upper Left, Upper Right, Lower Left, and Lower Right.
Further, for example, the data blended in S109 is not limited to the basic state data of the eye-related skeleton parts.
For example, instead of the basic state data U, D, L and R of eye-related skeleton parts, basic position data U, D, L and R which indicate the positions of the vertexes of the eye 52 and the peripheral part of the eye 52 when the character object 46 is looking in each of the four basic viewing directions (Up, Down, Left, Right) may be stored in the optical disc 36 or the hard disk 26.
In this case, in S109, the position P′ of each vertex of the eye 52 and the peripheral part of the eye 52 corresponding to the viewing direction 62 acquired in S103 is acquired by the following equation (3). In the following equation (3), Pu′, Pd′, Pl′, and Pr′ represent the positions of the vertex indicated by the basic position data U, D, L, and R, respectively. Note that αu, αd, αl, and αr are the same as those of the equation (1).
P′=Pu′*αu+Pd′*αd+Pl′*αl+Pr′*αr (3)
In this case, in S110, the microprocessor 14 changes the position of each vertex of the eye 52 and the peripheral part of the eye 52 of the selected character object from the position specified by the motion data to the position acquired in S109.
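A sketch of this per-vertex variant of equation (3), again with illustrative values:

```python
def blend_vertex_position(pu, pd, pl, pr, ratios):
    """Equation (3): the position P' of one vertex of the eye 52 or its
    peripheral part as a ratio-weighted sum of the four basic positions."""
    au, ad, al, ar = ratios
    return tuple(au * u + ad * d + al * l + ar * r
                 for u, d, l, r in zip(pu, pd, pl, pr))

print(blend_vertex_position((0.0, 0.02, 0.0), (0.0, -0.02, 0.0),
                            (-0.01, 0.0, 0.0), (0.01, 0.0, 0.0),
                            (0.5, 0.0, 0.0, 0.5)))
# -> (0.005, 0.01, 0.0)
```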
For example, the present invention can be adapted to games other than a soccer game.
For example, the program is supplied to the consumer game machine 11 from the optical disc 36 as an information storage medium in the foregoing description, but the program may be delivered to a home or the like over a communication network.
Priority: Japanese Patent Application No. 2007-207254, filed August 2007 (JP, national).
International filing: PCT/JP2008/051822, filed February 5, 2008 (WO); 371(c) date: February 12, 2010.