Three-dimensional image processing apparatus

Information

  • Patent Number
    6,590,578
  • Date Filed
    Wednesday, February 28, 2001
  • Date Issued
    Tuesday, July 8, 2003
Abstract
A three-dimensional image processing apparatus includes a CPU. When the CPU detects by collision determination that another object, e.g., a wall, exists between an operable object and a camera, it calculates a moving angle for the camera such that the camera's view of the operable object is not obstructed by the other object. The camera is moved in accordance with the moving angle to a position where the operable object and the other object existing in the photographed three-dimensional space are displayed on a display.
Description




FIELD OF THE INVENTION




This invention relates to a three-dimensional image processing apparatus and an external memory device to be used therewith. More particularly, the invention relates to a three-dimensional image processing apparatus which displays on a display device an image of a player controlled object or other objects, existing in three-dimensional space, from the perspective of a predetermined “camera” position (the point of view).




BACKGROUND AND SUMMARY OF THE INVENTION




The conventional so-called 3D (3-Dimensional) video game displays player controlled or operable objects (objects operable by an operator) that are configured from three-dimensional data and viewed by an apparent camera at predetermined angles and distances, thereby obtaining displayed images. In the conventional game, however, if a background image (e.g., a wall) or an object used as an opponent character (another object) comes between the player controlled object and the “camera”, or if another object is moved so as to interrupt the line of sight between the operable object and the camera, the operable object cannot be viewed in the three-dimensional world. Consequently, conventional 3D games have been limited in that the other object has to be arranged by the program so as not to exist between the operable object and the camera.




It is therefore an object of the present invention to provide an image processing apparatus which is capable of displaying an operable object at substantially all times and hence free from limitation in arranging other objects.




The illustrative image processing apparatus displays on a display an operable object image and another object existing in a three-dimensional space from a predetermined viewing or “photographic” position. The image processing apparatus includes an external memory which stores data for the operable object and the other object, together with a predetermined program. The system uses an input controller which inputs data that alters the position of the operable object in the three-dimensional space. Operable object position data generating hardware and software generates operable object position data so as to alter the position of the operable object in the three-dimensional space based on the data input by the input controller. The three-dimensional data is created based on the data stored in the external memory and the operable object position data. Point of view position data is generated representing photographing position data in the three-dimensional space for displaying the operable object. The system detects whether or not the other object exists between the “camera” view position and the operable object position. If the detecting means detects existence of the other object, the system alters the photographing position data such that the other object no longer exists between the photographing position and the operable object position. The system creates display data for displaying the image of the operable object photographed from a predetermined position in the three-dimensional space based on the three-dimensional data and the photographing position data, and image signal generating circuitry outputs an image signal to the display based on the generated display data.




The system determines whether or not there is a possibility of a collision between the operable object and a polygon plane of the other object. If there is a possibility of a collision of the operable object with the other object, the camera position is changed so that the other object does not exist between the operable object and the camera. Therefore, the operable object is “photographed” without interference by the other object.




In accordance with the present invention, even if another object is permitted to freely move, it is possible to display at substantially all times an operable object on a screen of a display. Consequently, if the present invention is applied to a game apparatus, the operable object can be displayed at all times on a display, even for a game that involves an operable object and a number of other objects moving around on the display screen.




The above and other objects, features, aspects, and advantages of the present invention will become more apparent from the ensuing detailed description of the present invention when taken in conjunction with the accompanying drawings.











BRIEF DESCRIPTION OF THE DRAWINGS





FIG. 1 is an illustrative schematic external view showing one embodiment of an exemplary image processing system;

FIG. 2 is an exemplary block diagram of an image processing apparatus in the FIG. 1 embodiment;

FIG. 3 is an illustrative view showing a CPU memory map for use in the FIG. 2 embodiment, showing an external memory and a W-RAM address space;

FIG. 4 is a block diagram showing an exemplary controller control circuit in the FIG. 2 embodiment;

FIG. 5 is an illustrative view for explaining a modulating/demodulating method;

FIG. 6 is an illustrative view showing a memory map of a RAM in FIG. 4;

FIG. 7 is a perspective view of a controller of the FIG. 2 embodiment as viewed from the top;

FIG. 8 is a perspective view of the controller of the FIG. 2 embodiment as viewed from the bottom;

FIG. 9 is a block diagram showing in detail the controller and an expansion device;

FIG. 10 shows illustrative data from a controller's analog joystick and the respective keys/buttons;

FIG. 11 shows illustrative transmission and reception data when a command “0” is transmitted from the controller control circuit;

FIG. 12 shows illustrative transmission and reception data when a command “1” is transmitted from the controller control circuit;

FIG. 13 shows an illustrative view of transmission and reception data when a command “2” is transmitted from the controller control circuit;

FIG. 14 shows an illustrative view of transmission and reception data when a command “3” is transmitted from the controller control circuit;

FIG. 15 is a flowchart showing operation of the CPU of the FIG. 2 embodiment;

FIG. 16 is a flowchart showing operation of the bus control circuit of the FIG. 2 embodiment;

FIG. 17 is a flowchart showing operation of the controller control circuit of the FIG. 2 embodiment;

FIG. 18 is a flowchart showing operation of the controller circuit of the FIG. 2 embodiment;

FIG. 19 shows illustrative transmission and reception data when a command “255” is transmitted from the controller control circuit;

FIG. 20 is an illustrative view showing a state in which a wall exists between the operable object (Mario) and the camera;

FIG. 21 is an illustrative view showing point of view movement regarding FIG. 20;

FIG. 22 is a flowchart showing operation for a camera turning-around process;

FIG. 23 is a flowchart showing a collision-determining routine;

FIG. 24 is an illustrative view showing a wall polygon;

FIG. 25 is an illustrative view showing each polygon;

FIG. 26 is an illustrative view showing a projected surface;

FIG. 27 is an illustrative view showing a state in which a projection is made onto a YZ plane;

FIG. 28 is an illustrative view showing a state in which a projection is made onto an XY plane; and

FIG. 29 is an illustrative view showing a normal vector of the plane and a point of view vector of the camera.











EMBODIMENTS





FIG. 1 depicts an exemplary image processing system according to one embodiment of the present invention. The image processing system is for example a video game system, which comprises an image processing apparatus 10, a ROM cartridge 20 (as one example of an external memory device), a display 30 (as one example of a display means) connected to the image processing apparatus main body 10, and a controller 40 as one example of a player controller or operating device. The preferred controller is shown in FIGS. 7 and 8 and is described below. A RAM cartridge 50 is one example of an extension device detachably attached to the controller 40. The external memory device stores image data and program data for image processing for games, and audio data for music, sound effects, etc. A CD-ROM or a magnetic disc may alternatively be employed in place of the ROM cartridge. Where the image processing system of this example is applied to a personal computer, an input device such as a keyboard or a mouse is used as the player operating device.





FIG. 2 is a block diagram of the image processing system of this example. The image processing apparatus 10 incorporates therein a central processing unit (hereinafter “CPU”) 11 and a bus control processing circuit 12. The bus control circuit 12 is connected to a cartridge connector 13 for detachably attaching the ROM cartridge 20, as well as to a working RAM 14. The bus control processing circuit 12 is also connected to an audio signal generating circuit 15 for outputting an audio signal processed by the CPU 11 and to a video signal generating circuit 16 for outputting a video signal to a display, and further to a controller control circuit 17 for serially transferring operating data of one or a plurality of controller(s) 40 and/or data from RAM cartridge(s) 50. The controller control circuit 17 is connected with controller connectors (hereinafter abbreviated as “connectors”) 181-184 which are provided at a front face of the image processing apparatus 10. To one of the connectors 181-184 is detachably connected a connection jack 41 of the controller 40 through a cable 42. Thus, the connection of the controller to a connector 181-184 places the controller 40 into electrical connection with the image processing apparatus 10, enabling transmission and reception of data therebetween.




More specifically, the bus control processing circuit 12 inputs commands output as parallel signals from the CPU 11 via a bus, performs parallel-to-serial conversion, outputs the commands as serial signals to the controller control circuit 17, and converts serial signal data input from the controller control circuit 17 into parallel signals for output to the bus. The data output through the bus is subjected to processing by the CPU 11, or is stored in the W-RAM 14. The W-RAM 14 is a memory for temporarily storing data to be processed by the CPU 11, wherein read-out and write-in of data are possible through the bus control circuit 12.





FIG. 3 is a diagrammatic illustration showing memory regions assigned to respective memory spaces. The memory spaces accessible by the CPU via the bus control processing circuit 12 involve an external memory address space of the ROM cartridge 20 and a memory address space of the W-RAM 14. The ROM cartridge 20 is structured by mounting, on a board, a ROM storing data for game processing and accommodating the board in a housing. The ROM includes an image data storage region 201 for storing image data required to cause the image processing apparatus 10 to generate image signals for the game, and a program data region 202 for storing program data required for predetermined operation of the CPU 11. In the program data region 202, there are stored an image display program for performing image display processing based on the image data 201, a time-measuring program for carrying out measurement of time, and a determination program for determining that the cartridge 20 and an extension or expansion device 50 are in a predetermined relationship. The details of the time-measuring program and the determination program are described below. The memory region of the W-RAM 14 includes a region 141 for temporarily storing data representative of an operating state from a control panel.





FIG. 4 is a more detailed circuit diagram of the controller control circuit 17. The controller control circuit 17 transmits and receives data in serial form to and from the bus control processing circuit 12 and the controller connectors 181-184, and includes a data transfer control circuit 171, a signal transmitting circuit 172, a signal receiving circuit 173 and a RAM 174 for temporarily storing transmission and reception data. The data transfer control circuit 171 includes a parallel-serial conversion circuit and a serial-parallel conversion circuit for conversion of data format during data transfer, and also performs control of write-in and read-out of the RAM 174. The serial-parallel conversion circuit converts serial data supplied from the bus control processing circuit 12 into parallel data to provide such data to the RAM 174 or the signal transmitting circuit 172. The parallel-serial conversion circuit converts parallel data supplied from the RAM 174 or the signal receiving circuit 173 into serial data to provide such data to the bus control processing circuit 12. The signal transmitting circuit 172 converts parallel data for signal read-in control of the controller 40, supplied from the data transfer control circuit 171, and write-in data (parallel data) for the RAM cartridge 50 into serial data, which serial data is transmitted through a corresponding channel CH1-CH4 to each of the plurality of controllers 40. The signal receiving circuit 173 receives serial read-out data, representative of an operating state of each of the controllers 40, input through the corresponding channel CH1-CH4 for each of the controllers 40, as well as read-out data from the RAM cartridge 50, and converts such data into parallel data to provide it to the data transfer control circuit 171.




The signal transmitting circuit 172 and the signal receiving circuit 173 adopt a duty-cycle modulation and demodulation (hereinafter referred to as “modulation/demodulation”) method as one example of the modulation/demodulation methods that may be employed here. The duty-cycle modulation/demodulation method, as shown in FIG. 5, is a method wherein “1” and “0” are represented by varying a Hi time period and a Lo time period of a signal within a certain interval. Explaining the modulation/demodulation method in more detail, when the data to be transmitted serially is a logical “1”, a signal having, within one cycle period T, a high-level period tH rendered longer than a low-level period tL (tH>tL) is transmitted, while when the data to be transmitted is a logical “0”, a signal having, within one cycle period T, tH rendered shorter than tL (tH<tL) is transmitted.




The demodulation method samples the received serial signal (bit transmission signal) so as to monitor at all times whether the received signal is at a high level or a low level, where one cycle is expressed as T=tL+tH, tL being the time period from low until the change to high and tH being the time period from high until the change to low. In this case, the relationship tL<tH is recognized as a logical “1”, while tL>tH is recognized as a logical “0”, thereby achieving demodulation. If a duty-cycle modulation/demodulation method like this is employed, there is no necessity of transmitting data in synchronism with a clock signal, offering the advantage that transmission and reception of data are possible with only one signal line. If two signal lines are available, another modulation/demodulation method may be utilized.
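
As an aside, the encoding rule just described can be captured in a few lines of C. The sketch below is illustrative only: the concrete phase durations and the BitWaveform structure are assumptions, not part of the patent; only the rule that a logical “1” means tH>tL and a logical “0” means tH<tL comes from the text above.

```c
#include <stdbool.h>
#include <stdint.h>

/* Duty-cycle modulation sketch: one bit period T is split into a high phase
 * (tH) and a low phase (tL).  Per the description above, logical "1" means
 * tH > tL and logical "0" means tH < tL.  The absolute durations below are
 * hypothetical values chosen only for illustration. */
#define PERIOD_US      4u
#define LONG_PHASE_US  3u
#define SHORT_PHASE_US 1u

typedef struct {
    uint8_t high_us;   /* measured (or generated) high time within one period */
    uint8_t low_us;    /* measured (or generated) low time within one period */
} BitWaveform;

/* Modulation: encode one bit as a (tH, tL) pair. */
static BitWaveform encode_bit(bool bit)
{
    BitWaveform w;
    w.high_us = bit ? LONG_PHASE_US : SHORT_PHASE_US;
    w.low_us  = (uint8_t)(PERIOD_US - w.high_us);
    return w;
}

/* Demodulation: compare the measured phase lengths; tH > tL decodes as "1". */
static bool decode_bit(BitWaveform w)
{
    return w.high_us > w.low_us;
}
```

Because a bit is recovered purely from the ratio of the two phases, no separate clock line is needed, which is the advantage noted above.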




The RAM 174 includes memory regions or memory areas 174a-174h as shown in the memory map of FIG. 6. Specifically, the area 174a stores a command for channel 1, while the area 174b stores transmission data and reception data for channel 1. The area 174c stores a command for channel 2, while the area 174d stores transmission data and reception data for channel 2. The area 174e stores a command for channel 3, while the area 174f stores transmission data and reception data for channel 3. The area 174g stores a command for channel 4, while the area 174h stores transmission data and reception data for channel 4.
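
A minimal C sketch of this per-channel layout is given below; the buffer sizes are assumptions made only so the structure compiles, since the patent does not state exact area sizes.

```c
#include <stdint.h>

/* Per-channel layout of RAM 174 as described above: a command area and a
 * transmission/reception data area for each of channels 1-4 (174a/174b for
 * channel 1, 174c/174d for channel 2, and so on).  The sizes chosen here
 * are assumptions, not figures from the patent. */
#define NUM_CHANNELS    4
#define CMD_AREA_SIZE   4    /* hypothetical */
#define DATA_AREA_SIZE  40   /* hypothetical: room for a 32-byte payload plus CRC */

typedef struct {
    uint8_t command[CMD_AREA_SIZE];      /* e.g. area 174a for channel 1 */
    uint8_t tx_rx_data[DATA_AREA_SIZE];  /* e.g. area 174b for channel 1 */
} ChannelArea;

typedef struct {
    ChannelArea channel[NUM_CHANNELS];   /* channels CH1..CH4 */
} Ram174Layout;
```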




Accordingly, the data transfer control circuit 171 controls writing into the RAM 174 of the data transferred from the bus control processing circuit 12, of the operating state data of the controller 40 received by the signal receiving circuit 173, and of the read-out data from the RAM cartridge 50, and reads data out of the RAM 174 based on a command from the bus control processing circuit 12 to transfer it to the bus control circuit 12.




FIG. 7 and FIG. 8 are external views of the top and rear surfaces of the presently preferred controller 40. The controller 40 is in a shape that can be grasped by both hands or one hand, and has a housing having an exterior formed with a plurality of projecting buttons which, when depressed, are operable to generate an electrical signal, and a vertically-standing control portion. Specifically, the controller 40 is constituted by an upper housing and a lower housing. As shown in FIG. 7, the controller 40 housing has an operating area formed on an upper surface in a planar shape running from the switch 403 through the buttons 404. In the operating area of the controller 40, there are provided a cross-shaped digital direction switch (hereinafter referred to as “cross switch”) 403 on a left side, a plurality of button switches (hereinafter merely abbreviated as “switches”) 404A-404F on a right side, a start switch 405 generally at a laterally central portion, and a joystick 45 for allowing analog input at a centrally lower portion. The cross switch 403 is a direction switch for designating the direction of movement of a player controlled heroic character or a cursor, which has upper, lower, left and right depression points used for designating movement in four directions. The switches 404A-404F, whose functions differ by game software, are used, for example, to launch a missile in a shooting game, or to designate various actions such as jumping, kicking, or picking up an item in an action game. Although the joystick 45 is used in place of the cross switch 403 to designate the direction of movement of a player controlled heroic character or the like, it can designate direction over the entire angular range of 360 degrees, being utilized as an analog direction designating switch.




The controller 40 housing has three grips 402L, 402C and 402R formed in a manner projecting downward from three locations of the operating area. The grips 402L, 402C and 402R are rod-shaped such that, when seized by the hand, they are contoured by the palm, the middle finger, the ring finger and the little finger. Each grip is formed a little thin at a base portion, thick at an intermediate portion and thinner toward an open end (downward in FIG. 7). The lower housing of the controller 40 has an insertion aperture 409 formed at a centrally upper portion, projecting from the underside, for detachably attaching a RAM cartridge 50 as an expansion device. The housing has a button switch 406L and a button 406R provided left and right on upper side faces thereof at locations corresponding to the positions to which the left and right index fingers of a player extend. On a back surface at the base portion of the central grip 402C, a switch 407 is provided as a switch having a function alternative to the switch 406L when the joystick 45 is used in place of the cross switch 403.




The lower half of the housing on a back surface side extends toward a bottom surface to have an aperture 408 formed at a tip end thereof. At a deep end of the aperture 408, a connector (not shown) to which an extension cartridge 50 is to be connected is provided. In the aperture 408 is also formed a lever 409 for ejecting the cartridge 50 inserted in the aperture 408. On a side opposite to the lever 409 in the aperture 408 for insertion of an extension cartridge 50, a cut-out 410 is formed, which cut-out 410 provides a space for pulling out the extension cartridge 50 upon taking out the extension cartridge 50 by using the lever 409.





FIG. 9 is a detailed circuit diagram of a controller 40 and a RAM cartridge 50 (as one example of an extension device). The controller 40 incorporates within its housing electronic circuits, such as an operation signal processing circuit 44, etc., in order to detect operating states of the switches 403-407 or the joystick 45 or the like and transfer detected data to the controller control circuit 17. The operation signal processing circuit 44 includes a signal receiving circuit 441, a control circuit 442, a switch signal detecting circuit 443, a counter circuit 444, a signal transmitting circuit 445, a joyport control circuit 446, a reset circuit 447 and a NOR gate 448.




The signal receiving circuit 441 converts a serial signal, such as a control signal transmitted from the controller control circuit 17, write-in data to the RAM cartridge 50, etc., into a parallel signal to supply it to the control circuit 442. The control circuit 442 generates a reset signal to cause resetting (e.g., setting to 0) of the measured values of an X-axis counter 444X and a Y-axis counter 444Y included in the counter 444, when the control signal transmitted from the controller control circuit 17 is a reset signal for the X, Y coordinates of the joystick 45. The joystick 45 includes photo-interrupters for the X-axis and Y-axis so as to generate numbers of pulses proportional to the amount of inclination of a lever in the directions of the X-axis and Y-axis, providing respective pulse signals to the counters 444X and 444Y. The counter 444X, when the joystick 45 is inclined in the X-axis direction, measures the number of pulses generated in proportion to the amount of inclination. The counter 444Y measures the number of pulses generated in proportion to the amount of inclination when the joystick 45 is inclined in the Y-axis direction. Accordingly, the resultant vector, determined by the measured values in the X-axis and Y-axis of the counter 444X and the counter 444Y, determines the direction of movement and the coordinate position of the heroic character or the cursor. The measured values of the counter 444X and the counter 444Y are also reset by a reset signal supplied from the reset signal generating circuit 447 upon turning on the power supply, or by a reset signal supplied from the switch signal detecting circuit 443 when the player simultaneously depresses two predetermined switches.
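
For illustration, a small C sketch of how the two counter values could be combined into a direction and magnitude is shown below; the signed 8-bit counter range follows the data format described later for the third and fourth controller data bytes, and the helper names are hypothetical.

```c
#include <math.h>
#include <stdint.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* The X-axis and Y-axis counters hold pulse counts proportional to how far
 * the stick is tilted along each axis; their resultant vector gives the
 * commanded direction and magnitude.  Signed 8-bit values (-128..127) match
 * the controller data format described for FIG. 10. */
typedef struct {
    int8_t x;   /* counter 444X */
    int8_t y;   /* counter 444Y */
} StickCounts;

/* Direction of the resultant vector, in degrees. */
static double stick_angle_degrees(StickCounts s)
{
    return atan2((double)s.y, (double)s.x) * 180.0 / M_PI;
}

/* Magnitude of the resultant vector (amount of inclination). */
static double stick_magnitude(StickCounts s)
{
    return sqrt((double)s.x * (double)s.x + (double)s.y * (double)s.y);
}
```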




The switch signal detecting circuit 443 responds to an output command signal representing a switch state supplied at a constant period (e.g., at a 1/30-second interval corresponding to a frame period of a television), and reads a signal that varies with the state of depression of the cross switch 403 and the switches 404A-404F, 405, 406L, 406R and 407 to supply it to the control circuit 442.




The control circuit 442 responds to a read-out command signal of operating state data from the controller control circuit 17, and supplies the operating state data of the switches 403-407 and the measured values of the counters 444X, 444Y to the signal transmitting circuit 445 in a predetermined data-format order. The signal transmitting circuit 445 converts these parallel signals output from the control circuit 442 into serial data to transfer them to the controller control circuit 17 via a conversion circuit 43 and a signal line 42.




The control circuit 442 is connected to an address bus, a data bus, and a port control circuit 446 through a port connector. The port control circuit 446 performs input-output control (or signal transmission or reception control) on data according to commands from the CPU 11, when the RAM cartridge 50 (as one example of an extension device) is connected to a port connector 46. The RAM cartridge 50 includes a RAM 51 and a timer chip, not shown, as one example of a time-related information generating means (or a calendar timer), connected to the address bus and the data bus, a battery 52 connected thereto for supplying power to the RAM 51 and the timer counter 54, and a decoder 54 for activating the timer counter 54 when a predetermined address is given. The RAM 51 is a RAM that has a capacity lower than half of the maximum memory capacity accessible by using the address bus, and is comprised, for example, of a 256 k-bit RAM. This is to avoid duplication between the write-in/read-out addresses of the RAM and the read-out addresses of the timer chip: a value of an arbitrary counter within the timer chip is read out when the highest order address bit becomes “1”. The RAM 51 stores backup data associated with a game, so that, if the RAM cartridge 50 is removed from the port connector 46, the stored data is retained by power supplied from the battery 52. The details of the kind of data stored by the RAM 51, the writing of data therein, and the utilization of the data stored are described below.
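
The address-splitting rule described above, where the upper half of the address space selects the timer chip rather than the RAM, can be sketched as follows; treating address H and address L as one 16-bit address is a simplification made only for illustration.

```c
#include <stdbool.h>
#include <stdint.h>

/* Keeping the RAM below half of the addressable range means the
 * highest-order address bit distinguishes the two devices on the cartridge:
 * addresses of 8000h or higher reach the timer chip, lower addresses reach
 * the backup RAM.  The 16-bit combined address is an illustrative
 * simplification of the address H / address L pair. */
static bool address_selects_timer(uint16_t address)
{
    return (address & 0x8000u) != 0u;   /* highest-order bit set -> timer chip */
}
```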





FIG. 10 is a graphical illustration of the data format by which the image processing apparatus reads out data representative of the operating states of the switches 403-407 and the joystick 45 from the controller 40. The data generated by the controller 40 is configured as 4-byte data. The first byte represents B, A, G, START, upper, lower, left and right, i.e., the depression of the switches 404B, 404A, 407 and 405 and the four directions of the cross switch 403. For example, when the button B, i.e., the switch 404B, is depressed, the highest order bit of the first byte becomes “1”. Similarly, the second byte represents JSRST, 0 (not employed in the embodiment), L, R, E, D, C and F, i.e., the depression of the switches 409, 406L, 406R, 404E, 404D, 404C and 404F. The third byte represents in binary digits the X coordinate value (the value measured by the X counter 444X), which value is dependent upon the inclination angle of the joystick 45 in the X direction. The fourth byte represents in binary digits the Y coordinate value (the value measured by the Y counter 444Y), which value is dependent upon the inclination angle of the joystick 45 in the Y direction. Because the X and Y coordinate values are expressed by 8 bits of binary digits, conversion into decimal digits makes it possible to represent the inclination of the joystick 45 by a numeral from 0-255. If the highest order bit is treated as a sign bit denoting a negative value, the inclination angle of the joystick 45 can be expressed by a numeral between −128 and 127.
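
A sketch of a decoder for this 4-byte format is shown below. The bit positions follow the order in which the fields are listed above (most significant bit first); that ordering, and the structure and function names, are assumptions made for illustration.

```c
#include <stdbool.h>
#include <stdint.h>

/* The controller returns 4 bytes: byte 0 packs B, A, G, START and the four
 * cross-switch directions (B in the highest-order bit), byte 1 packs JSRST,
 * an unused bit, L, R, E, D, C, F, and bytes 2-3 are the signed X and Y
 * joystick counter values. */
typedef struct {
    bool b, a, g, start, up, down, left, right;
    bool jsrst, l, r, e, d, c, f;
    int8_t stick_x;
    int8_t stick_y;
} PadState;

static PadState decode_pad(const uint8_t raw[4])
{
    PadState p;
    p.b     = (raw[0] >> 7) & 1;
    p.a     = (raw[0] >> 6) & 1;
    p.g     = (raw[0] >> 5) & 1;
    p.start = (raw[0] >> 4) & 1;
    p.up    = (raw[0] >> 3) & 1;
    p.down  = (raw[0] >> 2) & 1;
    p.left  = (raw[0] >> 1) & 1;
    p.right = (raw[0] >> 0) & 1;
    p.jsrst = (raw[1] >> 7) & 1;
    /* bit 6 of byte 1 is not employed in the embodiment */
    p.l     = (raw[1] >> 5) & 1;
    p.r     = (raw[1] >> 4) & 1;
    p.e     = (raw[1] >> 3) & 1;
    p.d     = (raw[1] >> 2) & 1;
    p.c     = (raw[1] >> 1) & 1;
    p.f     = (raw[1] >> 0) & 1;
    p.stick_x = (int8_t)raw[2];   /* counter 444X, -128..127 */
    p.stick_y = (int8_t)raw[3];   /* counter 444Y, -128..127 */
    return p;
}
```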




Referring to FIG. 11 to FIG. 14, an explanation will be made on a format for the signals transmitted and received between the image processing apparatus 10 and the controller 40.





FIG. 11 is an illustrative representation of a format for the signals transmitted and received between the image processing apparatus 10 and the controller 40 for identification of the type of the controller 40 by the image processing apparatus 10. The image processing apparatus 10 transmits a type data request signal of a command “0”, configured by 1 byte (8 bits), to the control circuit 442 within the controller 40, and receives in response thereto 3 bytes of a type data signal concerning the controller 40, consisting of TYPE L (1 byte), TYPE H (1 byte) and the status, generated by the control circuit 442. Here, TYPE L and TYPE H are data representative of the function of a device or apparatus connected to the connector 46. The respective data of TYPE L and TYPE H are data inherent to the type of the RAM cartridge 50. Based on these data, the image processing apparatus 10 identifies the type of the controller 40, i.e., the type of the RAM cartridge 50 being connected to the controller 40. The types of RAM cartridge 50 involve, for example, a type merely mounted with a RAM 51, a type mounted with a RAM 51 together with a timer chip, and a type mounted with a RAM 51 together with a liquid crystal display. In the present embodiment, the type mounted with a RAM 51 and a timer chip is explained in detail. Meanwhile, the status data is data that represents whether or not the port is connected to an extension device such as a RAM cartridge 50 and whether or not an extension device has been connected thereto after resetting.





FIG. 12 is an illustrative representation of a format for the signals transmitted and received between the image processing apparatus 10 and the controller 40 for determining the operating state of the controller 40 by the image processing apparatus 10. The image processing apparatus 10 transmits a controller data request signal of a command “1”, configured by 1 byte (8 bits), to the control circuit 442 within the controller 40, and receives in response thereto an operating state data signal, concerning the controller 40, generated by the control circuit 442. Based on the operating state data, the image processing apparatus 10 recognizes how the operator is operating the controller 40 and utilizes that information for varying the image. The operating state data signal has been described in detail in the explanation of FIG. 10.





FIG. 13 is an illustrative representation of a format for a read data signal used when the image processing apparatus 10 reads data from the RAM 51 within the RAM cartridge 50 which is connected to the controller 40. The image processing apparatus 10 transmits to the control circuit 442 a read command signal of a command “2” configured by 1 byte (8 bits), an address H (8 bits) signal representative of the higher order bits of an address, an address L (8 bits) signal representative of the lower order bits of an address, and an address CRC (5 bits) signal for checking for transmission errors of the address data of the address H signal and the address L signal. The image processing apparatus receives in response thereto a storage data signal, from the RAM 51, generated by the control circuit 442 and a data CRC (8 bits) signal for checking for data transmission errors. Incidentally, to read out time-related information of the timer chip by the image processing apparatus 10, it is satisfactory to read out addresses of 8000h or higher by merely rendering the address H signal value greater than 80h.





FIG. 14 is an illustrative representation of a format for a write data signal used when the image processing apparatus 10 writes data into the RAM 51 within the RAM cartridge 50 connected to the controller 40. The image processing apparatus 10 transmits, to the control circuit 442, a write command signal of a command “3” configured by 1 byte (8 bits), an address H (8 bits) signal representative of the higher order bits of an address, an address L (3 bits) signal representative of the lower order bits of an address, an address CRC (5 bits) signal for checking for transmission errors of the address data of the address H signal and the address L signal, and a 32-byte write-in data signal to be written into the RAM 51. The image processing apparatus 10 receives in response thereto a data CRC (8 bits) signal generated by the control circuit 442 for checking for data reception errors. The image processing apparatus 10 receives the CRC signal to perform a CRC check against the transmitted write-in data, and judges based thereon whether the data has correctly been written into the RAM 51. Incidentally, to reset, for example, the date and time by writing time-related information into the timer chip from the image processing apparatus 10, it is satisfactory to perform writing into addresses of 8000h or higher by merely rendering the address H signal value greater than 80h.




The operation of data transmission and reception between the image processing apparatus 10 and the controller 40 will now be explained.




Referring first to a flowchart for the CPU of the image processing apparatus 10 in FIG. 15, explanations will be made on the image processing. At a step S11, the CPU 11 is initialized based on an initial value (not shown) stored in the program data area 202 in FIG. 3. Then, at a step S12, the CPU 11 outputs a control pad data request command stored in the program data area 202 to the bus control circuit 12. At a step S13, the CPU 11 carries out the desired image processing based on the program stored in the program data area 202 and the image data area 201. While the CPU 11 is executing step S13, the bus control processing circuit 12 is executing steps S21-S24 of FIG. 16. Then, at a step S14, the CPU 11 outputs image data based on the control pad data stored in the control pad data area 141 in FIG. 3. After completing step S14, the CPU branches to step S12 and repeats the execution of steps S12-S14.




The operation of the bus control processing circuit 12 is explained in conjunction with FIG. 16. At a step S21, the bus control circuit 12 determines whether or not the CPU 11 has output a controller data request command (a request command for data relating to the switches of the controller 40 or data from the extension device 50). If a controller data request command has been output, the process proceeds to a step S22. At the step S22, the bus control circuit 12 outputs a command for reading in data of the controller 40 (command “1” or command “2” referred to above) to the controller control circuit 17. Then, at a step S23, the bus control circuit 12 determines whether or not the controller control circuit 17 has received data from the controller 40 and stored it in the RAM 174. If the controller control circuit 17 has not received data from the controller 40 to store in the RAM 174, the bus control circuit 12 waits at step S23. If the controller control circuit 17 has received data from the controller 40 and stored it in the RAM 174, the process proceeds to a step S24. At step S24, the bus control circuit 12 transfers the data of the controller 40 stored in the RAM 174 to the W-RAM 14. The bus control circuit 12, when completing the data transfer to the W-RAM 14, returns the process back to the step S21 to repeat execution of the steps S21-S24.




The FIG. 15 and FIG. 16 flowcharts show the example wherein, after the bus control circuit 12 has transferred data from the RAM 174 to the W-RAM 14, the CPU 11 processes the data stored in the W-RAM 14. However, the CPU 11 may directly process the data in the RAM 174 through the bus control circuit 12.





FIG. 17 is a flowchart for explaining the operation of the controller control circuit 17. At a step S31, it is determined whether or not there is data to be written from the bus control circuit 12. If there is not, the data transfer control circuit 171 waits until there is data to be written from the bus control circuit 12. If there is data to be written, at a next step S32 the data transfer control circuit 171 causes the RAM 174 to store the commands and/or data (hereinafter abbreviated as “command/data”) for the first to the fourth channels. At a step S33, the command/data for the first channel is transmitted to the controller 40 connected to the connector 181. The control circuit 442 performs a predetermined operation based on the command/data to output data to be transmitted to the image processing apparatus 10. The content of the data will be described below in explaining the operation of the control circuit 442. At a step S34, the data transfer control circuit 171 receives the data output from the control circuit 442 and causes the RAM 174 to store that data.




At a step S35 the command/data for the second channel is transmitted to the controller 40, in a manner similar to the operation for the first channel at the steps S33 and S34. The control circuit 442 performs a predetermined operation based on this command/data to output the data to be transmitted to the image processing apparatus 10. At a step S36 data transfer and write-in processes are carried out for the second channel. Meanwhile, at a step S37, the command/data for the third channel is transmitted to the controller 40. The control circuit 442 performs a predetermined operation based on this command/data to output the data to be transmitted to the image processing apparatus 10. At a step S38 data transfer and write-in processes are carried out for the third channel. Furthermore, at a step S39, the command/data for the fourth channel is transmitted to the controller 40. The control circuit 442 of the controller 40 performs a predetermined operation based on this command/data to output the data to be transmitted to the image processing apparatus 10. At a step S40 data transfer and write-in processes are carried out for the fourth channel. At a subsequent step S41, the data transfer circuit 171 transfers in batch the data which it has received at the steps S34, S36, S38 and S40 to the bus control circuit 12.




In the above identified manner, the data for the first channel to the fourth channel, that is, the commands for the controllers 40 connected to the connectors 181-184 and the operating state data to be read out of the controllers 40, are transferred by time-divisional processing between the data transfer control circuit 171 and the control circuit 442 respectively within the controllers 40.
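
The channel-by-channel sequence of steps S33-S41 can be summarized by the following C sketch; the ChannelIo callbacks, buffer sizes and function names are placeholders, not part of the patent.

```c
#include <stddef.h>
#include <stdint.h>

/* Per-frame polling sequence described above (steps S33-S41): for each of
 * the four channels in turn, send that channel's stored command/data to the
 * connected controller, store the reply, and finally hand everything back
 * in one batch.  All sizes and names here are hypothetical. */
#define NUM_CHANNELS 4
#define CMD_CAP      8    /* hypothetical per-channel command buffer size */
#define REPLY_CAP    40   /* hypothetical per-channel reply buffer size */

typedef struct {
    void   (*send)(int channel, const uint8_t *cmd, size_t len);
    size_t (*receive)(int channel, uint8_t *buf, size_t cap);
} ChannelIo;

static void poll_all_channels(const ChannelIo *io,
                              const uint8_t  cmd[NUM_CHANNELS][CMD_CAP],
                              const size_t   cmd_len[NUM_CHANNELS],
                              uint8_t        reply[NUM_CHANNELS][REPLY_CAP],
                              size_t         reply_len[NUM_CHANNELS])
{
    for (int ch = 0; ch < NUM_CHANNELS; ch++) {
        io->send(ch, cmd[ch], cmd_len[ch]);                      /* S33/S35/S37/S39 */
        reply_len[ch] = io->receive(ch, reply[ch], REPLY_CAP);   /* S34/S36/S38/S40 */
    }
    /* S41: the collected replies would now be transferred in batch to the
     * bus control circuit 12 (omitted in this sketch). */
}
```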





FIG. 18 is a flowchart explaining the operation of the controller circuit 44. First, at a step S51, it is determined whether or not a command has been input from the image processing apparatus 10 to the control circuit 442. If no command has been input, the controller circuit waits for a command. If a command is input, at a step S52 it is determined whether or not the command input to the control circuit 442 is a status request command (command “0”). If a command “0” is detected, the process proceeds to a step S53, wherein a status transmitting process is carried out.




At the step S53, where the CPU 11 outputs the command “0”, data in the format shown in FIG. 11 is transmitted and received between the image processing apparatus 10 and the controller 40. On this occasion, the control circuit 442, when receiving the command “0” data configured by 1 byte (8 bits), transmits TYPE L (1 byte), TYPE H (1 byte) and the status. Here, TYPE L and TYPE H are data for identifying the function of a device or apparatus being connected to the joyport connector 46, which are inherently recorded in the RAM cartridge 50. This makes possible recognition by the image processing apparatus 10 as to what extension device (e.g., a RAM cartridge 50 or another extension device such as a liquid crystal display) is connected to the controller 40. The status is data representative of whether or not an extension device such as a RAM cartridge 50 is connected to the port and whether or not the connection of the extension device was made after resetting.




On the other hand, if the determination at the step S52 reveals that the command is not a command “0”, it is determined at a step S54 whether or not the input command is a pad-data request command (command “1”). If it is a command “1”, the process proceeds to a step S55 where the process of transmitting pad data is performed. Specifically, where the CPU 11 outputs a command “1”, data in the format shown in FIG. 12 is transmitted and received between the image processing apparatus 10 and the controller 40. On this occasion, the control circuit 442, upon receiving command “1” data configured by 1 byte (8 bits), transmits the data of 14 switches (16 bits), namely B, A, G, START, upper, lower, left, right, L, R, E, D, C and F, the data of JSRST (1 bit), and the data of the counter 444X and the counter 444Y (16 bits). By transmitting these data to the image processing apparatus 10, the image processing apparatus 10 recognizes how the operator has operated the controller 40. Thus, these data are utilized for modifying the image by the image processing apparatus 10 in accordance with the operating state of the controller 40 as manipulated by the player.




If the determination at the aforesaid step S54 reveals that the command is not a command “1”, it is determined at a step S56 whether or not the input command is a read-out request command (command “2”) for data associated with the RAM cartridge 50 connected to the extension connector. If it is a command “2”, the process proceeds to a step S57 where the process of reading data out of the extension connector is performed. Specifically, where the CPU 11 outputs a command “2”, data in the format shown in FIG. 13 is transmitted and received between the image processing apparatus 10 and the controller 40. On this occasion, when the control circuit 442 receives command “2” data configured by 1 byte (8 bits), address H representative of the higher-order bits (8 bits) of the address, address L representative of the lower-order bits (3 bits) of the address, and address CRC (5 bits) for checking for errors in the address data transmitted and received, the control circuit 442 transmits the data stored in the RAM cartridge (32 bytes) and a CRC (8 bits) for checking for data errors. In this manner, the connection of the RAM cartridge 50 (or another extension device) to the image processing apparatus 10 enables the image processing apparatus 10 to process data from the RAM cartridge 50, etc.




If the determination at the aforesaid step S56 is not a command “2”, it is determined at a subsequent step S58 whether or not the input command is a write-in request command (command “3”) for information associated with the RAM cartridge 50 being connected to the extension connector 46. Where it is the command “3”, the process of writing data is carried out at a step S59 for the RAM cartridge 50 being connected to the extension connector 46. Specifically, if the CPU 11 outputs a command “3”, the data shown in FIG. 14 is transmitted and received, in response to the command “3”, between the image processing apparatus 10 and the controller 40.




That is, when the control circuit 442 receives command “3” data configured by 1 byte (8 bits), address H representative of the higher-order bits of the address (8 bits), address L representative of the lower-order bits of the address (3 bits), address CRC for checking for errors in the address data transmitted and received (5 bits), and the data to be transmitted to the RAM cartridge 50 (32 bytes), it transmits a CRC for checking for errors in the data received (8 bits). In this manner, the connection of the extension device 50 and the image processing apparatus 10 enables the image processing apparatus 10 to control the extension device 50. The connection of the extension device 50 and the image processing apparatus 10 also drastically improves the function of the controller 40.




If at the aforesaid step S58 the determination is not a command “3”, it is determined at a step S60 whether or not it is a reset command (command 255). Where it is the reset command (255), the process of resetting the counter 444 for the joystick 45 is performed at a step S61.




Where the CPU 11 outputs a reset command (command 255), the data shown in FIG. 19 is transmitted and received between the image processing apparatus 10 and the controller 40. That is, the control circuit 442 of the controller 40, if receiving command 255 data configured by 1 byte (8 bits), outputs a reset signal to reset the X counter 444X and the counter 444Y, and transmits the aforesaid TYPE L (1 byte), TYPE H (1 byte) and the status.




An explanation is now made concerning changing the camera perspective (point of view) in a three-dimensional space. That is, where in the conventional 3D game there exists between a camera and an operable object (e.g., Mario) another object (e.g., a wall or an opponent character) as shown in FIG. 20, the operable object or Mario cannot be viewed or “photographed” by the camera. In contrast, it is possible in the present invention to continuously display Mario at all times by turning the camera around Mario up to a lateral side thereof as shown in FIG. 20.




Stated briefly, where the objects are situated as shown in FIG. 21, a determination is made of a collision with a topographical polygon extending from Mario's side, at several points on a straight line between Mario and the camera. On this occasion, a check is made for a polygon that is perpendicular to the XZ plane inside a radius R from each point. The process of turning-around of the camera is performed on a polygon P determined as collisional. The wall surface P is expressed by the flat-plane equation as given by Equation (1).

Ax+By+Cz+D=0  (1)

The correction in camera position is done by moving the “camera” in parallel with this plane P. Incidentally, the angle about the Y-axis of the direction parallel with the plane is calculated from the flat-plane equation.
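
A hedged C sketch of the check just described, sampling a few points on the Mario-to-camera segment and handing each to a wall collision test, is shown below; the number of sample points, the Vec3 type and the callback name are illustrative assumptions, and check_wall_collision() stands in for the FIG. 23 routine.

```c
/* Sampling check described above: test several points on the straight line
 * between the operable object (Mario) and the camera against the wall
 * polygons within radius R. */
typedef struct { float x, y, z; } Vec3;

#define NUM_SAMPLE_POINTS 4   /* "several points" - the exact count is assumed */

static int wall_blocks_view(Vec3 mario, Vec3 camera, float radius,
                            int (*check_wall_collision)(Vec3 q, float r))
{
    for (int i = 1; i <= NUM_SAMPLE_POINTS; i++) {
        float t = (float)i / (float)(NUM_SAMPLE_POINTS + 1);
        Vec3 q = {
            mario.x + (camera.x - mario.x) * t,
            mario.y + (camera.y - mario.y) * t,
            mario.z + (camera.z - mario.z) * t,
        };
        if (check_wall_collision(q, radius))
            return 1;   /* a wall polygon was hit; the turn-around is triggered */
    }
    return 0;
}
```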




Explaining in further detail in conjunction with the FIG. 22 flowchart, the No. n of a polygon to be collision-determined is initialized (n=1) at a first step S101. At a next step S102, it is determined whether or not the polygon No. n has reached the number N of polygons to be checked; if not, a collision-determination is made at a next step S103.





FIG. 23 shows in detail step S103, i.e., an illustrative collision-determination routine. Before explaining this collision-determination routine, reference is made to FIG. 24 and FIG. 25, which show the wall data to be collision-determined. That is, the wall data is depicted as in FIG. 24, wherein triangular polygons as in FIG. 25 are gathered together. These respective polygons are stored as a list of wall polygons in a memory.




At a first step S201 in FIG. 23, a point Q (Xg, Yg, Zg) and a radius R are input. The point Q is a point to be checked and the radius R is a distance considered to be collisional against the wall. At a next step S202, a wall-impingement flag is reset. At a step S203, it is determined whether or not the wall-polygon list explained hereinbefore is stored in the memory. If there exists a wall polygon list, it is determined at a next step S204 whether or not that polygon is a polygon to be processed by the turning-around of the camera. If so at this step S204, the process proceeds to a step S205.




At the step S205, the distance (dR) between the point Q and the plane of the wall polygon is calculated according to Equation (2).

dR=AXg+BYg+CZg+D  (2)

Then at a step S206 it is determined whether or not the distance dR calculated at the step S205 is smaller than the radius R. When the distance dR is greater than the radius R, no collision occurs between Mario and the wall, and accordingly the process returns back to the aforesaid step S203.




If “Yes” is determined at the step S206, that is, when |dR|<R, a calculation is made at a step S207 according to Equation (3) to determine the positional coordinates (Xg′, Yg′, Zg′) of the point of intersection Q′ between a straight line extending perpendicularly from the point Q to the wall polygon P and the plane of the wall polygon.

Xg′=Xg+A×dR

Yg′=Yg+B×dR

Zg′=Zg+C×dR  (3)

Then at a next step S208, it is determined whether or not the point Q′ is on the inner side of the polygon (within its range).
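
A C sketch of these two steps, the point-to-plane distance of Equation (2) and the foot of the perpendicular of Equation (3), is given below. Note that it normalizes the plane normal (A, B, C) so that dR is a true distance and subtracts along the normal so that the projected point provably lands on the plane; the patent's equations assume their own scaling and sign convention, and the type and function names are illustrative.

```c
#include <math.h>

/* Steps S205-S207: signed distance dR from a test point Q to the wall
 * polygon's plane Ax + By + Cz + D = 0 (Equation (2)) and the foot of the
 * perpendicular Q' dropped from Q onto that plane (Equation (3)). */
typedef struct { float x, y, z; } Vec3;
typedef struct { float a, b, c, d; } Plane;   /* Ax + By + Cz + D = 0 */

static float signed_distance(Plane p, Vec3 q)
{
    float len = sqrtf(p.a * p.a + p.b * p.b + p.c * p.c);
    return (p.a * q.x + p.b * q.y + p.c * q.z + p.d) / len;   /* Equation (2) */
}

static Vec3 project_onto_plane(Plane p, Vec3 q)
{
    float len = sqrtf(p.a * p.a + p.b * p.b + p.c * p.c);
    float dr  = (p.a * q.x + p.b * q.y + p.c * q.z + p.d) / len;
    Vec3 foot = {                        /* Equation (3), sign chosen so that */
        q.x - (p.a / len) * dr,          /* 'foot' lies on the plane          */
        q.y - (p.b / len) * dr,
        q.z - (p.c / len) * dr,
    };
    return foot;
}

/* Step S206: only distances smaller than the radius R count as a collision. */
static int within_collision_radius(Plane p, Vec3 q, float radius)
{
    return fabsf(signed_distance(p, q)) < radius;
}
```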




At step S208, it is determined onto which plane projection is to be made in dependence upon the direction of the wall (a value A). That is, when A<−0.707 or A>0.707, projection is onto a YZ plane shown in FIG. 26. Otherwise, projection is onto an XY plane. Where the projection is onto the YZ plane, it is determined whether or not in FIG. 27 the point Q′ is on an inner side of the polygon P1.




Meanwhile, where projection is onto the XY plane, it is determined from the point Q′ and the apexes of the polygon P1 in FIG. 28 whether the value of each counterclockwise cross product is positive or negative. That is, when C in the polygon-plane equation is C≧0, if each of the resulting cross products is 0 or negative, then the determination is that the point Q′ is on the inner side of the polygon P.

(Y1−Yq)×(X2−X1)−(X1−Xq)×(Y2−Y1)≦0

(Y2−Yq)×(X3−X2)−(X2−Xq)×(Y3−Y2)≦0

(Y3−Yq)×(X1−X3)−(X3−Xq)×(Y1−Y3)≦0  (4)






Meanwhile, when C<0, if each of the resulting cross products is 0 or positive, then the determination is that the point Q′ is on the inner side of the polygon P.

(Y1−Yq)×(X2−X1)−(X1−Xq)×(Y2−Y1)≧0

(Y2−Yq)×(X3−X2)−(X2−Xq)×(Y3−Y2)≧0

(Y3−Yq)×(X1−X3)−(X3−Xq)×(Y1−Y3)≧0  (5)






In this manner the point Q′ is checked at the step S208 as to whether or not it is on the inner side of the polygon, and at a step S209 it is determined whether or not the point Q′ is on the inner side of the polygon. If “Yes” at this step S209, the wall-impingement flag that had been reset at the aforesaid step S202 is set (step S210). Thereafter the process returns to FIG. 22.
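
The projected inside-the-polygon test of Equations (4) and (5) can be sketched in C as follows; after dropping one coordinate the polygon is treated as a triangle in 2D, and the sign of the plane coefficient C selects which inequality set applies. Function and type names are illustrative.

```c
/* Step S208 (projected form): the point Q' is inside the triangle P1 P2 P3
 * when the three edge cross products of Equations (4)/(5) all share the
 * sign dictated by the plane coefficient C. */
typedef struct { float x, y; } Vec2;

static float edge_cross(Vec2 v0, Vec2 v1, Vec2 q)
{
    /* (Y1 - Yq) * (X2 - X1) - (X1 - Xq) * (Y2 - Y1), as in Equation (4) */
    return (v0.y - q.y) * (v1.x - v0.x) - (v0.x - q.x) * (v1.y - v0.y);
}

static int point_in_projected_triangle(Vec2 p1, Vec2 p2, Vec2 p3, Vec2 q,
                                       float plane_c)
{
    float c1 = edge_cross(p1, p2, q);
    float c2 = edge_cross(p2, p3, q);
    float c3 = edge_cross(p3, p1, q);
    if (plane_c >= 0.0f)                      /* Equation (4): all <= 0 */
        return c1 <= 0.0f && c2 <= 0.0f && c3 <= 0.0f;
    else                                      /* Equation (5): all >= 0 */
        return c1 >= 0.0f && c2 >= 0.0f && c3 >= 0.0f;
}
```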




Note that the above-stated collision-determination is merely one example, and it should be recognized that the collision-determination is possible by other methods.




Referring back to FIG. 22, after the collision-determination at the step S103, it is determined at a step S104 whether or not a wall-impingement flag is set. If “No” at this step S104, the process of turning around is unnecessary, so that the No. n of a point to be checked is incremented at step S105 and the process returns back to the step S102.




If “Yes” at the step S104, it is determined at step S106 and step S107 whether the wall has its back side directed toward the camera. That is, the directionality of the polygon is determined. Whether or not the polygon is directed to the camera (the point of view) can be determined by examining the sign of the dot product of a normal vector N and an eye (point of view) vector V in FIG. 29. The conditional expression therefor is given by Equation (6).








A=V·N=VxNx+VyNy+VzNz  (6)






With Equation (6), determinations are respectively possible such that if A≧0 the wall is directed toward the camera (frontward), while if A<0 the wall has its back side directed toward the camera. If a plane existing between the camera and Mario is directed frontward relative to the camera, the turning-around of the camera in FIG. 20 is not done. In this case, the No. n of the point is incremented at a step S105, and the process returns back to the step S102.
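
In C, the facing test of Equation (6) is a single dot product; the sketch below follows the patent's sign convention for the view vector V and the normal N of FIG. 29, and the names are illustrative.

```c
/* Steps S106-S107: the wall's facing relative to the camera is read off the
 * sign of the dot product A = V . N of the view vector V and the polygon
 * normal N (Equation (6)). */
typedef struct { float x, y, z; } Vec3;

static float dot3(Vec3 a, Vec3 b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

/* Nonzero when A >= 0, i.e. the wall is directed toward the camera
 * (frontward); a negative A means its back side faces the camera. */
static int wall_faces_camera(Vec3 view, Vec3 normal)
{
    return dot3(view, normal) >= 0.0f;
}
```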




If the plane between the camera and Mario is directed backward, the answer at the step S107 becomes “Yes”, and the turning-around process is carried out at subsequent steps S108 and S109. At the step S108, the angle of movement through which the position of the camera (photographing position) is to be altered is calculated based on the flat-plane equation for the wall. That is, the flat-plane equation in terms of three points P1(X1, Y1, Z1), P2(X2, Y2, Z2) and P3(X3, Y3, Z3) on the plane is expressed by the multi-term equation of Equation (7).








Ax+By+Cz+D=0

where,

A=Y1(Z2−Z3)+Y2(Z3−Z1)+Y3(Z1−Z2)

B=Z1(X2−X3)+Z2(X3−X1)+Z3(X1−X2)

C=X1(Y2−Y3)+X2(Y3−Y1)+X3(Y1−Y2)

D=X1(Y2Z3−Z2Y3)+Y1(Z2X3−X2Z3)+Z1(X2Y3−Y2X3)  (7)






The angle Ry of the normal vector with respect to the Y-axis is given by Equation (8).








Ry=tan⁻¹(A/C)  (8)






Therefore, the turning-around angle of the camera is either Ry+90° or Ry−90°. That is, at the step S109 the camera is rotationally moved about Mario, or the operable object, in either the direction Ry+90° or the direction Ry−90°. Specifically, the movement is to the location closer to the presently-situated camera position (C in FIG. 21).
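
A C sketch of steps S108-S109 is given below: the plane coefficients come from Equation (7) (with D chosen here so the three input points satisfy the plane equation), Ry is obtained with atan2 rather than tan⁻¹(A/C) to avoid division by zero, and the candidate angle Ry+90° or Ry−90° nearer to the present camera position is selected while preserving the horizontal camera-to-object distance. The types and names are illustrative assumptions.

```c
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

typedef struct { float x, y, z; } Vec3;
typedef struct { float a, b, c, d; } Plane;   /* Ax + By + Cz + D = 0 */

/* Equation (7): plane through three wall-polygon vertices.  D is chosen so
 * that P1 (and hence P2, P3) satisfies Ax + By + Cz + D = 0. */
static Plane plane_from_points(Vec3 p1, Vec3 p2, Vec3 p3)
{
    Plane p;
    p.a = p1.y * (p2.z - p3.z) + p2.y * (p3.z - p1.z) + p3.y * (p1.z - p2.z);
    p.b = p1.z * (p2.x - p3.x) + p2.z * (p3.x - p1.x) + p3.z * (p1.x - p2.x);
    p.c = p1.x * (p2.y - p3.y) + p2.x * (p3.y - p1.y) + p3.x * (p1.y - p2.y);
    p.d = -(p.a * p1.x + p.b * p1.y + p.c * p1.z);
    return p;
}

/* Steps S108-S109: swing the camera about the operable object to Ry + 90 or
 * Ry - 90 degrees, keeping whichever candidate lands closer to where the
 * camera already is; the horizontal distance to the object is preserved. */
static Vec3 turn_camera_around(Vec3 mario, Vec3 camera, Plane wall)
{
    float ry   = atan2f(wall.a, wall.c);              /* Equation (8) via atan2 */
    float dx   = camera.x - mario.x;
    float dz   = camera.z - mario.z;
    float dist = sqrtf(dx * dx + dz * dz);
    float cand[2] = { ry + (float)M_PI / 2.0f, ry - (float)M_PI / 2.0f };
    Vec3  best    = camera;
    float best_d2 = -1.0f;

    for (int i = 0; i < 2; i++) {
        Vec3 c = { mario.x + sinf(cand[i]) * dist,
                   camera.y,
                   mario.z + cosf(cand[i]) * dist };
        float ddx = c.x - camera.x, ddz = c.z - camera.z;
        float d2  = ddx * ddx + ddz * ddz;
        if (best_d2 < 0.0f || d2 < best_d2) {
            best_d2 = d2;
            best    = c;
        }
    }
    return best;
}
```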




Although the present invention has been described and illustrated in detail, it is clearly understood that the same is by way of illustration and example only and is not to be taken by way of limitation, the spirit and scope of the present invention being limited only by the terms of the appended claims.



Claims
  • 1. A video game system comprising: a game program executing processing system for executing a video game program to generate a three-dimensional world display; and at least one player controller operable by a player for generating video game control signals, wherein said video game program includes point of view modifying instructions for automatically determining in response to a player's operation of a control key on the player controller whether modification of the displayed three-dimensional world point of view is necessary in order to prevent a first displayed object from having its view obstructed by a second displayed object; and automatically changing the point of view if the modification is determined to be necessary in order to maintain an unobstructed view of the second displayed object by the first displayed object.
  • 2. For use in a video game system having a game program executing processing system for executing a video game program, at least one player controller operable by a player for generating player controller related data, a method of operating said video game system including: generating a first three-dimensional world display from a first point of view in which a first displayed object is depicted in the three-dimensional world; automatically detecting in response to player manipulation of the player controller whether a modification of the first point of view display is necessary in order to prevent the first displayed object from obstructing a view of a second displayed object, and automatically generating a second three-dimensional world display from a second point of view in response to detecting that said modification is necessary in order to maintain an unobstructed view of the second displayed object.
Priority Claims (2)
Number Date Country Kind
7-288006 Oct 1995 JP
8-152728 Jun 1996 JP
Parent Case Info

This is a continuation of application Ser. No. 09/377,160, filed Aug. 19, 1999, now U.S. Pat. No. 6,421,056, which is a continuation of Ser. No. 08/836,739, now U.S. Pat. No. 5,923,704, which is a 371 of PCT/JP96/02931 filed Oct. 9, 1996, the entire content of which is hereby incorporated by reference in this application.

US Referenced Citations (176)
Number Name Date Kind
3666900 Rothweiler et al. May 1972 A
3729129 Fletcher et al. Apr 1973 A
3827313 Kiessling Aug 1974 A
4148014 Burson Apr 1979 A
4161726 Burson et al. Jul 1979 A
4315113 Fisher et al. Feb 1982 A
4359222 Smith, III et al. Nov 1982 A
4467412 Hoff Aug 1984 A
4469330 Asher Sep 1984 A
4485457 Balaska et al. Nov 1984 A
4538035 Pool Aug 1985 A
4552360 Bromley et al. Nov 1985 A
4575591 Lugaresi Mar 1986 A
4587510 Kim May 1986 A
4620176 Hayes Oct 1986 A
4639225 Washizuka Jan 1987 A
4659313 Kuster et al. Apr 1987 A
4685678 Frederiksen Aug 1987 A
4748441 Brzezinski May 1988 A
4766423 Ono et al. Aug 1988 A
4783812 Kaneoka Nov 1988 A
4789932 Cutler et al. Dec 1988 A
4799677 Frederiksen Jan 1989 A
4870389 Ishiwata et al. Jun 1989 A
4858930 Sato Aug 1989 A
4868780 Stern Sep 1989 A
4875164 Monfort Oct 1989 A
4887230 Noguchi et al. Dec 1989 A
4887966 Gellerman Dec 1989 A
4890832 Komaki Jan 1990 A
4916440 Faeser et al. Apr 1990 A
4924216 Leung May 1990 A
4926372 Nakagawa May 1990 A
4933670 Wislocki Jun 1990 A
4949298 Nakanishi et al. Aug 1990 A
4974192 Face et al. Nov 1990 A
4976429 Nagel Dec 1990 A
4976435 Shatford et al. Dec 1990 A
4984193 Nakagawa Jan 1991 A
5001632 Hall-Tipping Mar 1991 A
5012230 Yasuda Apr 1991 A
D316879 Shulman et al. May 1991 S
5014982 Okada et al. May 1991 A
5016876 Loffredo May 1991 A
D317946 Tse Jul 1991 S
5046739 Reichow Sep 1991 A
5095798 Okada et al. Mar 1992 A
5146557 Yamrom et al. Sep 1992 A
5160918 Saposnik et al. Nov 1992 A
5203563 Loper, III Apr 1993 A
5207426 Inoue et al. May 1993 A
5213327 Kitaue May 1993 A
5226136 Nakagawa Jul 1993 A
5237311 Mailey et al. Aug 1993 A
5245320 Bouton Sep 1993 A
5259626 Ho Nov 1993 A
5273294 Amanai Dec 1993 A
5276831 Nakanishi et al. Jan 1994 A
5286024 Winblad Feb 1994 A
5290034 Hineman Mar 1994 A
5291189 Otake et al. Mar 1994 A
5317714 Nakagawa et al. May 1994 A
5327158 Takahashi et al. Jul 1994 A
5329276 Hirabayashi Jul 1994 A
5337069 Otake et al. Aug 1994 A
5357604 San et al. Oct 1994 A
5358259 Best Oct 1994 A
5371512 Otake et al. Dec 1994 A
5388841 San et al. Feb 1995 A
5388990 Beckman Feb 1995 A
5390937 Sakaguchi et al. Feb 1995 A
5393070 Best Feb 1995 A
5393071 Best Feb 1995 A
5393072 Best Feb 1995 A
5393073 Best Feb 1995 A
5394168 Smith, III et al. Feb 1995 A
D357712 Wu Apr 1995 S
5415549 Logg May 1995 A
5421590 Robbins Jun 1995 A
5426763 Okada Jun 1995 A
5436640 Reeves Jul 1995 A
5437464 Terasima et al. Aug 1995 A
5451053 Garrido Sep 1995 A
5453763 Nakagawa et al. Sep 1995 A
D363092 Hung Oct 1995 S
5459487 Bouton Oct 1995 A
5473325 McAlindon Dec 1995 A
5512920 Gibson Apr 1996 A
5513307 Naka et al. Apr 1996 A
5515044 Glatt May 1996 A
5551693 Goto et al. Sep 1996 A
5551701 Bouton et al. Sep 1996 A
5558329 Liu Sep 1996 A
5563629 Caprara Oct 1996 A
5566280 Fukui et al. Oct 1996 A
D375326 Yokoi et al. Nov 1996 S
5577735 Reed et al. Nov 1996 A
5589854 Tsai Dec 1996 A
5593350 Bouton et al. Jan 1997 A
5599232 Darling Feb 1997 A
5607157 Nagashima Mar 1997 A
5615083 Burnett Mar 1997 A
5624117 Ohkubo et al. Apr 1997 A
5628686 Svancarek et al. May 1997 A
5632680 Chung May 1997 A
5640177 Hsu Jun 1997 A
5643087 Marcus et al. Jul 1997 A
5649862 Sakaguchi et al. Jul 1997 A
5653637 Tai Aug 1997 A
5663747 Shulman Sep 1997 A
5670955 Thorne, III et al. Sep 1997 A
5680534 Yamato et al. Oct 1997 A
5684512 Schoch et al. Nov 1997 A
5691898 Rosenberg et al. Nov 1997 A
5694153 Aoyagi et al. Dec 1997 A
5704837 Iwasaki et al. Jan 1998 A
5706029 Tai Jan 1998 A
5714981 Scott-Jackson et al. Feb 1998 A
5724497 San et al. Mar 1998 A
5731806 Harrow et al. Mar 1998 A
5734373 Rosenberg et al. Mar 1998 A
5734376 Hsien Mar 1998 A
5734807 Sumi Mar 1998 A
5759100 Nakanishi Jun 1998 A
5769718 Rieder Jun 1998 A
5769719 Hsu Jun 1998 A
5784051 Harrow et al. Jul 1998 A
5785597 Shinohara Jul 1998 A
5786807 Couch et al. Jul 1998 A
5791994 Hirano et al. Aug 1998 A
5793356 Svancarek et al. Aug 1998 A
5804781 Okabe Sep 1998 A
5805138 Brawne et al. Sep 1998 A
5808591 Mantani Sep 1998 A
5816921 Hosokawa Oct 1998 A
5820462 Yokoi et al. Oct 1998 A
5830066 Goden et al. Nov 1998 A
5838330 Ajima Nov 1998 A
5850230 San et al. Dec 1998 A
5862229 Shimizu Jan 1999 A
5867051 Liu Feb 1999 A
5872999 Koizumi et al. Feb 1999 A
5877749 Shiga et al. Mar 1999 A
5880709 Itai et al. Mar 1999 A
5883628 Mullaly et al. Mar 1999 A
5896125 Niedzwiecki Apr 1999 A
5898424 Flannery Apr 1999 A
5946004 Kitamura et al. Aug 1999 A
5963196 Nishiumi et al. Oct 1999 A
5973704 Nishiumi et al. Oct 1999 A
5984785 Takeda et al. Nov 1999 A
6001015 Nishiumi et al. Dec 1999 A
6002351 Takeda et al. Dec 1999 A
6007428 Nishiumi et al. Dec 1999 A
6020876 Rosenberg et al. Feb 2000 A
6022274 Takeda et al. Feb 2000 A
6034669 Chiang et al. Mar 2000 A
6036495 Marcus et al. Mar 2000 A
6042478 Ng Mar 2000 A
6050718 Schena et al. Apr 2000 A
6050896 Hanado et al. Apr 2000 A
6067077 Martin et al. May 2000 A
6071194 Sanderson et al. Jun 2000 A
6078329 Umeki et al. Jun 2000 A
6102803 Takeda et al. Aug 2000 A
6126544 Kojima Oct 2000 A
6126545 Takahashi et al. Oct 2000 A
6146277 Ikeda Nov 2000 A
6149519 Osaki et al. Nov 2000 A
6154197 Watari et al. Nov 2000 A
6169540 Rosenberg et al. Jan 2001 B1
6175366 Watanabe et al. Jan 2001 B1
6186896 Takeda et al. Feb 2001 B1
6196919 Okubo Mar 2001 B1
6200253 Nishiumi et al. Mar 2001 B1
6219033 Rosenberg et al. Apr 2001 B1
Foreign Referenced Citations (57)
Number Date Country
32 04 428 Aug 1983 DE
40 18 052 Dec 1990 DE
268 419 May 1988 EP
0 431 723 Jun 1991 EP
0 470 615 Feb 1992 EP
553 532 Aug 1993 EP
685 246 Dec 1995 EP
724 220 Jul 1996 EP
2 234 575 Feb 1991 GB
2 244 546 Dec 1991 GB
2 263 802 Aug 1993 GB
50-22475 Mar 1975 JP
57-2084 Jan 1982 JP
57-18236 Jan 1982 JP
57-136217 Aug 1982 JP
59-40258 Mar 1984 JP
59-121500 Jul 1984 JP
61-16641 Jan 1986 JP
61-198286 Sep 1986 JP
61-185138 Nov 1986 JP
62-269221 Nov 1987 JP
2-41342 Mar 1990 JP
2-68404 May 1990 JP
2-283390 Nov 1990 JP
3-16620 Jan 1991 JP
3-248215 Nov 1991 JP
4-26432 Jan 1992 JP
4-20134 Feb 1992 JP
4-42029 Feb 1992 JP
4-104893 Sep 1992 JP
4-291468 Oct 1992 JP
5-100759 Apr 1993 JP
5-19925 May 1993 JP
5-177057 Jul 1993 JP
5-241502 Sep 1993 JP
6-23148 Feb 1994 JP
6-54962 Mar 1994 JP
6-68238 Mar 1994 JP
6-110602 Apr 1994 JP
6-114683 Apr 1994 JP
9412999 Jun 1994 JP
6-190145 Jul 1994 JP
6-190147 Jul 1994 JP
6-205010 Jul 1994 JP
6-285259 Oct 1994 JP
6-315095 Nov 1994 JP
07088252 Apr 1995 JP
7-104930 Apr 1995 JP
7-144069 Jun 1995 JP
7-222865 Aug 1995 JP
7-288006 Oct 1995 JP
7-317230 Dec 1995 JP
8-45392 Feb 1996 JP
9-56927 Mar 1997 JP
9209347 Jun 1992 WO
9717651 May 1997 WO
9732641 Dec 1997 WO
Non-Patent Literature Citations (21)
Entry
US 6,017,271, 1/2000, Miyamoto et al. (withdrawn)
3D Ballz Instruction Booklet, Accolade, San Jose, California, #3050-0023 Rev. A.
6 Photographs of Sony PlayStation: 1) top case and compact disk; 2) hand controller; 3) internal circuit boards (top view); 4) internal circuit boards (top view); 5) compact disk reader (bottom view); and 6) internal main circuit board (bottom view).
Knuckles Chaotix Instruction Manual, SEGA, Redwood City, California, #84503 (1995).
Nintendo Power, vol. 30, p. 22, PilotWings article.
Nintendo Power, vol. 31, p. 35, PilotWings article.
Nintendo Power, vol. 31, pp. 74-76, PilotWings article.
Nintendo Power, vol. 38, p. 25, PilotWings article.
Nintendo Power, vol. 46, PilotWings article.
PilotWings Instruction Booklet, Super Nintendo Entertainment System, SNS-PW-USA, copyright 1991.
PilotWings, It's a Festival of Flight, Top Secret Password Nintendo Player's Guide, pp. 82-83 and 160, 1991.
PilotWings, Soar with the Flight Club, Super Nintendo Entertainment System Player's Guide, pp. 100-105, 1991.
SEGA Genesis 32X Instruction Manual, SEGA, Redwood City, California, #672-2116 (1994).
SEGA Genesis Instruction Manual, SEGA, Hayward, California, #3701-926-0-01 (1994).
Sonic 2 The Hedgehog Instruction Manual, Sega, Hayward, California, #672-0944 3701-925-0-01 (1992).
Sony PlayStation Instruction Manual, and information materials, Sony Computer Entertainment Inc. 1995.
IBM Technical Disclosure Bulletin, vol. 37, No. 08, Aug. 1994, pp. 73-74, “Analog Joystick Interface Emulation using a Digital Counter”.
IBM Technical Disclosure Bulletin, vol. 33, No. 11, Apr. 1991, pp. 105-106, “Hardware Reset With Microcode Warning Period”.
Drucker et al., “Cinema: A System for Procedural Camera Movements”, Proceedings of the Symposium on Interactive 3D Graphics, Cambridge, MA Mar. 29-Apr. 1, 1992, pp. 67-70.
Sega Force/Saturn Peripherals, Data Information, 1997-99.
Sega Force/Saturn Tech Specs, Data Information, 1997.
Continuations (2)
Number Date Country
Parent 09/377160 Aug 1999 US
Child 09/794623 US
Parent 08/836739 US
Child 09/377160 US