This invention relates to game devices that change sound and images in accordance with a tilt operation. More particularly, this invention relates to a game device that changes sound in response to a change in an image of a character in accordance with the tilt of a controller or a handheld game device operated by a player.
A game device using a tilt sensor (or an acceleration sensor) is known, and is disclosed in Japanese Patent Laid-Open Publication No. 2001-170358. A product using the above-described conventional technique is the game titled "Kirby's Tilt'n Tumble"®, which is embodied in a handheld game machine released by the applicant of the present invention. The game device as described above allows a moving direction or an amount of movement of a player character to be changed (and, as a result, the position of a background screen relative to the player character to be changed, or the screen to be scrolled) in accordance with a tilt operation performed by a player who tilts a housing of a handheld game device longitudinally or laterally.
In the conventional game device using the tilt sensor, however, the detected tilt direction and/or amount of tilt is reflected only in the animated image used for rendering a moving character, or in the moving speed thereof. Thus, the player often finds the game monotonous, and may soon tire of the game.
Therefore, a feature of the exemplary embodiments is to provide a game device producing novel staging effects by concurrently changing a game image of a character, etc., and game sound (imitative sound and music used in the game) in accordance with the tilt direction and/or the amount of tilt of a tilt operation performed by a player who changes the tilt of a game controller or a handheld game device.
Another feature of the exemplary embodiments is to provide a game device that can enhance the realism of a game by changing a generation mode of game sound in response to a change in an image of a character in accordance with a tilt operation.
Still another feature of the exemplary embodiments is to provide a game device that can further enhance the realism of a game by synergistically enhancing the staging effects produced by the image and the sound of the game, the tilt operation changing various sound factors such as the tempo, the volume, and the tone interval (pitch) of the sound, or the degree of conversion of the waveform data serving as a sound source.
The exemplary embodiments have the following features.
A first aspect of an exemplary embodiment is directed to a game device that causes a display to display a game image including a character operated by a player, and causes a sound generator to output game sound corresponding to the character in accordance with an operation performed by the player, comprising: an operation controller operated by the player; a tilt detector; game processing circuitry (and associated software); an image data storage area; a sound data storage area; an image display processing unit; and a sound output processing unit.
The tilt detector is provided to the operation controller and detects an amount of tilt of the operation controller. The game processing circuitry processes a game in accordance with the operation performed by the player. The image data storage area stores image data used for displaying the character. The sound data storage area stores sound data of sound to be produced by the character. The image display processing unit reads the image data of the character from the image data storage area and causes the display to display an image of the character. The sound output processing unit reads the sound data from the sound data storage area and causes the sound generator to output sound related to the image of the character while the image of the character is displayed by the image display processing unit.
The image display processing unit and the sound output processing unit respond to a change in output of the tilt detector, which changes in accordance with the tilt operation performed by the player with the operation controller, and change the game image displayed by the display and the game sound output from the sound generator concurrently and in an associated manner.
According to a second aspect, in the first aspect, the sound output processing unit changes an interval of reading the sound data from the sound data storage area in accordance with the output of the tilt detector.
According to a third aspect, in the first aspect, the sound data storage area stores at least a first type of sound data and a second type of sound data for one character, and the sound output processing unit selects either the first type of sound data or the second type of sound data based on an output value of the tilt detector and reads the selected first or second type of sound data.
According to a fourth aspect, in the first aspect, the sound output processing unit outputs sound converted from the sound data read from the sound data storage area by using an output value of the tilt detector (for example, sound whose frequency, volume, or tone has been changed).
According to a fifth aspect, in the third aspect, a determination mechanism that determines whether or not the output value of the tilt detector is greater than a predetermined value is further included. The first type of sound data corresponds to sound that is used when the displayed character moves faster than a predetermined speed, and the second type of sound data corresponds to sound that is used when the displayed character moves slower than the predetermined speed.
The sound output processing unit reads the first type of sound data when it is determined by the determination mechanism that the output value of the tilt detector is greater than the predetermined value, and reads the second type of sound data when it is determined by the determination mechanism that the output value of the tilt detector is equal to or smaller than the predetermined value.
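For purposes of illustration only, the following is a minimal sketch in C of the threshold comparison described in the fifth aspect; the enum values, function name, and integer types are assumptions introduced for this example and are not part of the claimed device.

```c
/* Hypothetical identifiers: SOUND_DATA_FAST stands for the first type of
 * sound data and SOUND_DATA_SLOW for the second type (names are illustrative). */
typedef enum { SOUND_DATA_FAST, SOUND_DATA_SLOW } SoundDataType;

/* Selects which type of sound data to read, given the tilt detector output
 * value and the predetermined value used by the determination mechanism. */
SoundDataType select_sound_data(int tilt_output, int predetermined_value)
{
    /* Output greater than the predetermined value: the character moves
     * faster than the predetermined speed, so read the first type. */
    if (tilt_output > predetermined_value) {
        return SOUND_DATA_FAST;
    }
    /* Equal to or smaller than the predetermined value: read the second type. */
    return SOUND_DATA_SLOW;
}
```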
According to a sixth aspect, in the first aspect, the tilt detector is an acceleration sensor for detecting a magnitude of acceleration exerted in at least a lateral direction and a longitudinal direction of the game device.
A seventh aspect of the exemplary embodiments is directed to a computer-readable storage medium having stored therein a game program which causes a game device to execute a tilt detecting step, an image display processing step, and a sound output processing step. The game program is executed in a game device including: an operation controller operated by a player; a tilt detector provided to the operation controller for detecting an amount of tilt of the operation controller; a display for displaying a game image; a sound generator for outputting game sound; game processing circuitry for processing the game program in accordance with an operation performed by the player; an image data storage area for storing image data used for displaying a character; and a sound data storage area for storing sound data of sound to be produced by the character.
The tilt detecting step detects a tilt of the operation controller. The image display processing step reads the image data from the image data storage area and causes the display to display an image of the character. The sound output processing step reads the sound data from the sound data storage area and outputs sound related to the image of the character from the sound generator while the image of the character is displayed at the image display processing step.
In the image display processing step and the sound output processing step, an image displayed by the display and the sound output from the sound generator are changed concurrently and in an associated manner in accordance with the tilt operation performed by the player with the operation controller by using an output detected at the tilt detecting step.
According to an eighth aspect, in the seventh aspect, the sound output processing step changes an interval of reading the sound data from the sound data storage area based on an output value obtained at the tilt detecting step.
According to a ninth aspect, in the seventh aspect, the sound data storage area stores at least a first type of sound data and a second type of sound data for one character. The sound output processing step selects either of the first type of sound data and the second type of sound data based on an output value obtained at the tilt detecting step, and reads the selected first or second type of sound data.
According to a tenth aspect, in the seventh aspect, the sound output processing step includes a sound data converting step of converting the sound data read from the sound data storage area by using an output value obtained at the tilt detecting step.
According to the present exemplary embodiments, it is possible to concurrently change the game image of the character, etc., and the game sound (imitative sound and music used in the game) in accordance with the tilt operation, thereby providing the game device with the ability to realize novel staging effects by changing the image and the sound in an associated manner.
Furthermore, according to the present exemplary embodiments, it is possible to enhance the realism of the game by changing a generation mode of the game sound in response to a change in the image of the character in accordance with the tilt operation.
Still further, according to the present exemplary embodiments, it is possible to further enhance the realism of the game by synergistically enhancing the staging effects produced by the image and the sound of the game, the tilt operation changing various sound factors such as the tempo, the volume, and the tone interval (pitch) of the sound, or the degree of conversion of the waveform data serving as a sound source.
These and other objects, features, aspects and advantages of the present illustrative embodiments will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.
With reference to the drawings, an exemplary embodiment is described below. The game device 10 includes a main unit 11, a controller 12 (or a handheld game device 13) connected to the main unit 11, an image display device 15, and a sound output device 16.
A game program operated in the game device 10 is provided from an external storage medium 14 such as an optical disk, etc. The external storage medium 14 can be removably mounted on the main unit 11. The main unit 11 reads a program stored in the external storage medium 14, and executes the game program. Note that the external storage medium in the form of an optical disk is shown in the drawing, but a storage medium using a semiconductor or a storage medium using a magnetic or a magneto-optical technology may be used. In that case, a reading function corresponding to each medium is provided to the main unit 11. During execution of the game program, a game image is displayed on an image display device 15, and sound is output from a sound output device 16. Note that the sound output device 16 is connected to the image display device 15 in the drawing, but the sound output device 16 may be directly connected to the main unit 11. Furthermore, a data storage medium 17 can be inserted into the main unit 11. The data storage medium 17 can store, for example, processing results of the game program operated in the game device 10. Thus, the player can store the game progress in the data storage medium 17 when the game is temporarily suspended, and resume the game later by reading the stored data.
The CPU 21 is connected to a disk drive 26 via the interface 25. The disk drive 26 has a function of reading a program and data stored in the external storage medium 14. The read program and data are temporarily stored in the RAM 22, and the CPU 21 processes the read program and data as a game program.
The CPU 21 is connected to the controller 12 (or the handheld game device 13) via the interface 25. Hereinafter, the controller 12 and the handheld game device 13 are treated as equivalent. The controller 12 includes an operation switch 12a and a joystick 12b. The operation switch 12a detects whether or not a switch is pressed by the player. An operation output detected by the controller 12 is used in game program processing. The joystick 12b is operated by the player for instructing a direction of a game character, for example. If an analog joystick capable of detecting a tilt angle of the stick as well as instructing a direction is used as the joystick 12b, it is possible to provide a game having more involved and interesting features.
The controller 12 further includes an acceleration sensor 12c. The acceleration sensor 12c has the following two functions: a sensor function of outputting the magnitude of acceleration caused in an X-axis direction of the sensor, and a sensor function of outputting the magnitude of acceleration caused in a Y-axis direction thereof. The X-axis direction of the sensor is a rotating direction whose center of rotation corresponds to a longitudinal direction of the operation surface of the controller 12, and the Y-axis direction thereof is a rotating direction whose center of rotation corresponds to a lateral direction of the operation surface of the controller 12. More specifically, for example, if the controller 12 is slowly tilted to the right, the acceleration sensor 12c outputs acceleration data corresponding to the direction and the magnitude of that tilt.
The controller 12 may further include a Z-axis contact switch 12d. The Z-axis contact switch 12d is a digital sensor for detecting a movement of the controller 12 in a Z-axis direction (a direction perpendicular to the operation surface of the controller 12). When the controller 12 is moved up and down in the Z-axis direction, for example, an output corresponding to the movement is obtained, whereby it is possible to realize a game including novel staging effects using the resultant output. In place of the Z-axis contact switch 12d, the acceleration sensor 12c additionally including a Z-axis acceleration sensor may be used.
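The following C sketch illustrates, under assumed names and types, how the outputs described above (X-axis acceleration, Y-axis acceleration, and the Z-axis contact switch state) might be gathered into a single controller sample; the read functions are placeholders standing in for the hardware interface, not an actual API.

```c
#include <stdint.h>

/* Hypothetical stand-ins for the hardware reads of the acceleration sensor
 * 12c and the Z-axis contact switch 12d; a real game device would query the
 * controller interface here. */
static int16_t read_accel_x(void)  { return 0; }
static int16_t read_accel_y(void)  { return 0; }
static uint8_t read_z_switch(void) { return 0; }

/* One controller sample: X/Y acceleration and the Z contact switch state. */
typedef struct {
    int16_t accel_x;   /* tilt about the longitudinal axis of the surface */
    int16_t accel_y;   /* tilt about the lateral axis of the surface      */
    uint8_t z_contact; /* 1 while up/down movement is detected            */
} ControllerSample;

/* Copies the latest sensor outputs into a sample that can be stored in the
 * acceleration sensor output data storage area. */
void store_sensor_outputs(ControllerSample *out)
{
    out->accel_x   = read_accel_x();
    out->accel_y   = read_accel_y();
    out->z_contact = read_z_switch();
}
```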
The image processing unit 23 processes, based on an instruction from the CPU 21 performing program processing, image data stored in the external storage medium 14, or image data generated as a result of the program processing and stored in the RAM 22. The processed image data is displayed on the image display device 15 via a video encoder 23a.
The sound processing unit 24 processes, based on an instruction from the CPU 21 performing the program processing, sound data stored in the external storage medium 14, or sound data generated as a result of the program processing and stored in the RAM 22. The processed sound data is output to the sound output device 16 via an audio codec 24a.
The program storage area 31 includes a main program storage area 32, a tilt detection program storage area 33, an image output program storage area 34, a sound output program storage area 35, an image generation program storage area 36, and a sound program storage area 37.
The main program storage area 32 stores a main program for executing the game according to the present invention in the game device 10. The tilt detection program storage area 33 stores a program for detecting the amount of tilt from acceleration sensor output data stored in the data storage area 41. More specifically, the tilt detection program storage area 33 stores a program for calculating a tilt direction of the controller based on data indicating the magnitude of acceleration caused in the X-axis direction and data indicating the magnitude thereof caused in the Y-axis direction, which are output from the acceleration sensor 12c. Changing the tilt of the controller is a basic operation of the present invention.
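As a rough sketch of what such a tilt detection program might compute, the following C function derives a tilt magnitude and direction from the stored X-axis and Y-axis acceleration values; the function name and the use of a vector length and atan2 are assumptions for illustration, not the actual stored program.

```c
#include <math.h>

/* Derives a tilt magnitude and a tilt direction (in radians) from the X-axis
 * and Y-axis acceleration values output by the acceleration sensor 12c. */
void detect_tilt(double accel_x, double accel_y,
                 double *magnitude, double *direction)
{
    *magnitude = sqrt(accel_x * accel_x + accel_y * accel_y);
    *direction = atan2(accel_y, accel_x); /* 0 radians: tilt purely along X */
}
```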
The image output program storage area 34 stores a program for generating an image, which is to be finally output from the image display device 15, from image data processed by a program stored in the image generation program storage area 36.
The sound output program storage area 35 stores a program for generating sound, which is to be finally output from the sound output device 16, from sound data processed by a program stored in the sound program storage area 37.
The image generation program storage area 36 stores programs each of which is uniquely used for one of various characters and is processed using image data stored in an image data storage area 45. The sound program storage area 37 stores programs each of which is uniquely used for one of various characters and is processed using sound data stored in a sound data storage area 46.
The data storage area 41 stores data used in a program stored in the program storage area 31. The data storage area 41 includes a character data storage area 42, the image data storage area 45, the sound data storage area 46, and an acceleration sensor output data storage area 47.
The character data storage area 42 stores data (character data A, character data B) about various characters (a character A and a character B, for example) used in the game, and includes a character A data storage area 43 and a character B data storage area 44, for example. More specifically, for example, the character A data storage area 43 includes an image generation program designation data A storage area 43a, an image data designation data A storage area 43b, a sound program designation data A storage area 43c, and a sound data designation data A storage area 43d.
The image generation program designation data A storage area 43a stores data for designating a specific image generation program that is uniquely used for the character A and stored in the image generation program storage area 36. The image data designation data A storage area 43b stores data for designating specific image data which is uniquely used for the character A and stored in the image data storage area 45. The sound program designation data A storage area 43c stores data for designating a specific sound program which is uniquely used for the character A and stored in the sound program storage area 37. The sound data designation data A storage area 43d stores data for designating specific sound data which is uniquely used for the character A and stored in the sound data storage area 46.
Assume that, as the character data A, an image generation program 2 is designated by the image generation program designation data, image data 1 is designated by the image data designation data, a sound program 1 is designated by the sound program designation data, and sound data 1 is designated by the sound data designation data. In this case, if an operation is performed so that the controller 12 included in the game device according to the present invention is tilted, the image generation program 2 generates a unique image of the character A using the image data 1, and the sound program 1 generates sound unique to the character A using the sound data 1, based on the above-described operation.
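A minimal C sketch of the character data described above might look as follows; the struct layout, field names, and numeric identifiers are assumptions chosen only to mirror the example of the character data A.

```c
/* Each character record only designates which image generation program,
 * image data, sound program, and sound data to use; the designated items
 * themselves reside in their own storage areas. */
typedef struct {
    int image_program_id; /* e.g. 2 -> image generation program 2 */
    int image_data_id;    /* e.g. 1 -> image data 1               */
    int sound_program_id; /* e.g. 1 -> sound program 1            */
    int sound_data_id;    /* e.g. 1 -> sound data 1               */
} CharacterData;

/* The character data A from the example above. */
static const CharacterData character_data_a = { 2, 1, 1, 1 };
```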
Similarly, with regard to the character data B, the character B data storage area 44 includes an image generation program designation data B storage area 44a, an image data designation data B storage area 44b, a sound program designation data B storage area 44c, and a sound data designation data B storage area 44d, which are used for designating a program or data uniquely used for the character B.
The image data storage area 45 stores image data (image data 1, image data 2, for example) of various characters, which is displayed in the game.
The sound data storage area 46 stores sound data (sound data 1, sound data 2, for example) of various characters, which is output in the game.
The acceleration sensor output data storage area 47 includes an X-axis acceleration sensor data storage area 47a, a Y-axis acceleration sensor data storage area 47b, and a Z-axis contact switch data storage area 47c.
The X-axis acceleration sensor data storage area 47a stores data indicating the magnitude of X-axis acceleration output from the acceleration sensor 12c. The Y-axis acceleration sensor data storage area 47b stores data indicating the magnitude of Y-axis acceleration output from the acceleration sensor 12c. The Z-axis contact switch data storage area 47c stores data indicating an ON/OFF status of the Z-axis contact switch.
However, if acceleration is detected on both the X-axis and the Y-axis, the moving direction of the character has to be determined by comparing the two magnitudes. In the case where the X-axis acceleration is equal to the Y-axis acceleration, that is, y=x (x>0, y>0), it is determined that the moving direction is diagonally upward, as shown by arrow 55. Similarly, in the case where x>y (x>0, y>0), it is determined that the moving direction is somewhat closer to the X-axis than arrow 55, as shown by arrow 56. Also, in the case where x<y (x>0, y>0), it is determined that the moving direction is somewhat closer to the Y-axis than arrow 55, as shown by arrow 57. Then, image generation processing is performed so as to generate an image of the character 60 rolling over in the determined direction, and the moving speed of the character is changed in accordance with the magnitude of the acceleration of rolling, so that the generated image is displayed on the screen.
Note that the determined direction as described above has to be considered as an example, and another direction may be determined in accordance with the detected magnitude of acceleration.
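Purely as an illustration of the comparison described above, the following C sketch discretizes the direction from the two acceleration values; the enum names and the tolerance used to treat the values as equal are assumptions, and a real implementation could choose directions differently, as noted above.

```c
#include <math.h>

typedef enum {
    DIR_DIAGONAL, /* x roughly equal to y: diagonal, as shown by arrow 55 */
    DIR_NEAR_X,   /* x > y: closer to the X-axis, as shown by arrow 56    */
    DIR_NEAR_Y    /* x < y: closer to the Y-axis, as shown by arrow 57    */
} RollDirection;

/* Determines the rolling direction from the X- and Y-axis acceleration
 * (both assumed positive, as in the example above). */
RollDirection determine_direction(double x, double y)
{
    const double tolerance = 1e-6; /* lets floating-point values compare as equal */
    if (fabs(x - y) < tolerance) return DIR_DIAGONAL;
    return (x > y) ? DIR_NEAR_X : DIR_NEAR_Y;
}
```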
With reference to the drawings, a specific example in which an image and sound of a character are concurrently changed in accordance with the tilt operation is described below.
Assume that, in the game, the character 60 is a bell-shaped character producing sound while rolling over on the display screen in response to the tilt operation of the controller. A status illustration 61 represents a status of a game screen displayed on the image display device 15 and the controller 12 (or the handheld game device 13) when no tilt operation is performed. In this case, the acceleration of the controller 12 is 0 because no tilt operation is performed. That is, the X-axis and the Y-axis acceleration data obtained by the acceleration sensor 12c is x=y=0.
In the above-described case, assume that the character data is, for example, the aforementioned character data A. In this case, an image of the character is generated by the image generation program 2 using the designated image data 1 (an image of a bell). If the image generation program 2 does not move the image on the screen when the X-axis and the Y-axis acceleration data is x=y=0, the character 60 is displayed as a still image as shown in the status illustration 61. On the other hand, sound of the character is generated by the sound program 1 using the designated sound data 1 (sound of a bell). In this case, if mute processing is performed when the X-axis and the Y-axis acceleration data is x=y=0, that is, when the image is not moved, the sound of the character 60 is not produced, as shown in the status illustration 61.
Next, a status illustration 62 is described. Assume that the X-axis acceleration data is x=x1, and the Y-axis acceleration data is y=y1. Also assume that the magnitude of tilt is relatively small as shown in the status illustration 62. In this case, the image generation program 2 uses the image data 1 and the acceleration data (x, y)=(x1, y1) for generating animation in which the bell-shaped character 60 rolls over slowly, and displays the generated animation. Concurrently, the sound program 1 changes, for example, a data reading interval of the sound of a bell (“tinkle”) of the sound data 1 using the acceleration data (x, y)=(x1, y1), and produces the sound of a bell “tinkle—tinkle—” whose interval is relatively lengthened. As a result, it is possible to produce the effect of making the bell-shaped character appear to slowly roll down a gentle slope.
Next, a status illustration 63 is described. Assume that the X-axis acceleration data is x=x2, and the Y-axis acceleration data is y=y2. Also assume that x2>x1 and y2>y1, and the magnitude of tilt is relatively large as shown in the status illustration 63. In this case, the image generation program 2 uses the image data 1 and the acceleration data (x, y)=(x2, y2) for generating animation in which the bell-shaped character 60 rapidly rolls over, and displays the generated animation. Concurrently, the sound program 1 changes the data reading interval of the sound of a bell ("tinkle") using the above-described acceleration data (x, y)=(x2, y2), and produces the sound of a bell "tinkle, tinkle, tinkle" whose interval is relatively shortened. As a result, it is possible to produce the effect of making the bell-shaped character appear to roll down a steep slope.
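The following C function is a minimal sketch, under assumed constants, of how a sound program like the sound program 1 could map a larger tilt (acceleration) to a shorter interval between readings of the bell sound data; the concrete millisecond values are illustrative only.

```c
#include <math.h>

/* Returns the interval between successive readings of the bell sound data:
 * the larger the tilt (acceleration), the shorter the interval, so the
 * "tinkle" repeats faster. Constants are placeholders, not claimed values. */
double bell_read_interval_ms(double accel_x, double accel_y)
{
    const double base_interval_ms = 1200.0; /* interval when barely tilted   */
    const double min_interval_ms  = 150.0;  /* fastest allowed repetition    */
    const double scale            = 900.0;  /* how strongly tilt shortens it */

    double magnitude = sqrt(accel_x * accel_x + accel_y * accel_y);
    double interval  = base_interval_ms - scale * magnitude;
    return (interval < min_interval_ms) ? min_interval_ms : interval;
}
```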
As such, an image and sound of a character operated by a player are concurrently changed in accordance with the tilt of the controller 12, whereby the player can experience a realistic sensation when operating the character of the game. Thus, it is possible to attract the interest of the player. Here, a sound program (a sound program 2 shown in the drawings, for example) that changes the sound data to be read in accordance with an output value of the acceleration sensor may also be used, so that, for example, a first type of sound data is read when the character moves fast and a second type of sound data is read when the character moves slowly.
Furthermore, a sound program (a sound program 3 shown in the drawings, for example) that converts the sound data read from the sound data storage area by using an output value of the acceleration sensor (for example, by changing the frequency, volume, or tone of the sound data) may be used.
Still further, a sound program combining processing for changing a data reading interval of the sound, processing for changing the read data, and processing for converting the sound data can be used. The use of the above-described sound program allows more advantageous staging effects to be obtained.
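As one hedged example of the conversion processing mentioned above, the following C sketch scales the amplitude of waveform data by a gain derived from the acceleration magnitude; the gain formula, the 16-bit sample format, and the function name are assumptions and do not represent the actual sound program.

```c
#include <stddef.h>
#include <stdint.h>

/* Converts sound data by scaling its amplitude with a gain derived from the
 * acceleration output, so a larger tilt produces a louder sound. */
void convert_volume(const int16_t *src, int16_t *dst, size_t samples,
                    double accel_magnitude)
{
    double gain = 0.5 + accel_magnitude; /* larger tilt -> larger gain (assumed) */
    for (size_t i = 0; i < samples; i++) {
        double v = src[i] * gain;
        if (v >  32767.0) v =  32767.0;  /* clip to the 16-bit sample range */
        if (v < -32768.0) v = -32768.0;
        dst[i] = (int16_t)v;
    }
}
```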
With reference to the flowchart in the drawings, a processing flow of the game program executed in the game device 10 is described below.
At step S13, the CPU 21 obtains output data from the acceleration sensor based on the main program, and stores an output value of the acceleration sensor in the acceleration sensor output data storage area 47. Then, at step S14, the magnitude and the direction of tilt of the controller 12 (or 13) are detected based on the output of the acceleration sensor, which is stored in the acceleration sensor output data storage area 47.
Next, at step S15, by referring to the character data storage area 42, a character image and character sound to be changed in accordance with the tilt of the controller are determined.
Then, at step S16, the image generation program designated by the data stored in the character data storage area 42 reads the image data, and generates a character image based on the magnitude and the direction of tilt, which are detected at step S14.
Then, at step S17, the sound program designated based on the same character data used at step S16 reads the sound data, and generates the sound based on the magnitude and the direction of tilt, which are detected at step S14.
At step S18, after the generation of the image and the sound of one character is completed, it is determined whether or not an image and sound are to be generated for another character. If it is determined that an image and sound are to be generated for another character, the game process goes back to step S15, and the processing from step S15 to step S17 is repeated until the generation of the image and the sound of all the characters to be processed is completed.
On the other hand, if it is determined at step S18 that no image and sound remain to be generated for another character, the game process proceeds to step S19. At step S19, the character image generated at step S16 is displayed on the image display device 15 in accordance with the image output program stored in the image output program storage area 34. Then, at step S20, the sound generated at step S17 is output from the sound output device 16 in accordance with the sound output program stored in the sound output program storage area 35.
Finally, at step S21, it is determined whether or not the game is ended. If it is determined that the game is not ended, the game process goes back to step S12, and obtains new output data of the tilt sensor for continuing processing. Otherwise, the game is ended.
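The overall flow of steps S13 through S21 might be outlined in C as follows; every helper function named here is a placeholder standing in for a program stored in the program storage area 31, under the assumption that one image and one sound are generated per character on each pass.

```c
/* Placeholder prototypes standing in for the stored programs (assumptions). */
int  game_is_ended(void);
int  character_count(void);
void obtain_sensor_output(void);      /* S13: store acceleration sensor output */
void detect_tilt_values(void);        /* S14: magnitude and direction of tilt  */
void generate_character_image(int c); /* S16: image for one character          */
void generate_character_sound(int c); /* S17: sound for one character          */
void display_image(void);             /* S19: output to image display device 15 */
void output_sound(void);              /* S20: output to sound output device 16 */

void game_main_loop(void)
{
    while (!game_is_ended()) {                        /* S21: repeat until end */
        obtain_sensor_output();                       /* S13 */
        detect_tilt_values();                         /* S14 */
        for (int c = 0; c < character_count(); c++) { /* S15 and S18           */
            generate_character_image(c);              /* S16 */
            generate_character_sound(c);              /* S17 */
        }
        display_image();                              /* S19 */
        output_sound();                               /* S20 */
    }
}
```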
Note that, in the above-described embodiment, the game device 10 is assumed to include the controller 12 (or the handheld game device 13) having the built-in tilt sensor 12c, and the image display device 15 provided independently of the main unit 11. It is understood that the present invention can be easily adapted to a game device integrally provided with a display function and a controller function, such as the handheld game device 13.
While the invention has been described in detail, the foregoing description is in all aspects illustrative and not restrictive. It is understood that numerous other modifications and variations can be devised without departing from the scope of the invention.
Foreign Application Priority Data

Number | Date | Country | Kind
---|---|---|---
2002-142503 | May 2002 | JP | national
Foreign Patent Documents

Number | Date | Country
---|---|---
3437 456 | Sep 1985 | DE
2 317 086 | Mar 1998 | GB
2 331 686 | May 1999 | GB
60-7128 | Jan 1985 | JP
61-194231 | Dec 1986 | JP
7-24141 | Jan 1995 | JP
10-21000 | Jan 1998 | JP
11-249653 | Sep 1999 | JP
2001-170358 | Jun 2001 | JP
Prior Publication Data

Number | Date | Country
---|---|---
20030216179 A1 | Nov 2003 | US