Wireless game controllers

Information

  • Patent Grant
    8870655
  • Patent Number
    8,870,655
  • Date Filed
    Monday, April 17, 2006
  • Date Issued
    Tuesday, October 28, 2014
Abstract
A game controller arrangement includes a first control unit generating first operation data including linear acceleration sensed within a first control unit body. A second control unit generates second operation data in accordance with a direction input operation performed by a player. One of the first control unit and the second control unit includes a transmission section for wirelessly transmitting the first operation data and the second operation data to a computer.
Description
BACKGROUND

1. Field


The present invention relates to a game controller and a game system, and more particularly to a game controller which includes two control units connected to each other, e.g., by a flexible cable or wirelessly, and which is operated using the two control units, and to a game system including such a game controller.


2. Description of the Background Art


For example, Japanese Laid-Open Patent Publication No. 2004-313492 (hereinafter, referred to as Patent Document 1) discloses a controller having its control units held by both hands of a player, respectively, so as to play a game.


The controller disclosed in Patent Document 1 is composed of an R unit to be held by a right hand of a player and an L unit to be held by a left hand of the player. The R unit and the L unit each have an operation button and a stick on the top surface and the side of their respective housings. The R unit and the L unit can be physically coupled to each other so as to be used as a combined controller.


However, the controller disclosed in Patent Document 1 is constructed by simply separating a conventional game apparatus controller into right and left units. That is, although a player can place his or her right and left hands anywhere when holding the R and L units, the player cannot operate the controller itself with any greater flexibility. For example, neither the combined controller nor the controller separated into right and left units can realize a new operation.


SUMMARY

Therefore, an object of the present invention is to provide a novel game controller and game system which realize a novel operation having enhanced flexibility by using a plurality of control units.


The following features attain the object mentioned above. The reference numerals and the like in the parentheses indicate the correspondence with the embodiment described below in order to aid in understanding the present invention and are not intended to limit, in any way, the scope of the present invention.


A first aspect is directed to a game controller (7) for transmitting operation data to a computer (30) executing a game program. The game controller comprises: a first control unit (70); a second control unit (76); and a cable (79). The cable is flexible and electrically connects between the first control unit and the second control unit. The first control unit includes a first operation data generation section (74, 701). The first operation data generation section generates first operation data in accordance with a motion of a first control unit body included in the first control unit. The second control unit includes a second operation data generation section (78). The second operation data generation section generates second operation data in accordance with a direction input operation performed by a player. One of the first control unit and the second control unit further includes a transmission section (75). The transmission section transmits the first operation data and the second operation data to the computer at a predetermined timing.
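As an informal illustration of this arrangement (not part of the claimed subject matter), the following Python sketch, using hypothetical names, models the first operation data, the second operation data, and a single transmission step that forwards both to the computer:

    from dataclasses import dataclass
    from typing import Tuple
    import struct

    @dataclass
    class FirstOperationData:
        # generated in accordance with a motion of the first control unit body
        acceleration: Tuple[float, float, float]

    @dataclass
    class SecondOperationData:
        # generated in accordance with a direction input operation by the player
        stick_x: float  # -1.0 .. 1.0
        stick_y: float  # -1.0 .. 1.0

    def transmit_frame(first: FirstOperationData, second: SecondOperationData) -> bytes:
        # one transmission section packs both operation data sets into a single frame
        return struct.pack("<3f2f", *first.acceleration, second.stick_x, second.stick_y)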


The first operation data generation section includes an image pickup section (74). The image pickup section is fixed to the first control unit body and takes an image of a periphery along a predetermined direction from the first control unit body. The first operation data generation section outputs, as the first operation data, one selected from the group consisting of an image taken by the image pickup section and a result of subjecting the image taken by the image pickup section to a predetermined calculation.


The first operation data generation section further includes a positional information calculation section (744). The positional information calculation section calculates positional information indicating a position, in the image taken by the image pickup section, of at least one marker image which is included in the taken image and is used as an imaging target, when performing the predetermined calculation, and outputs the positional information as the first operation data.


The transmission section wirelessly transmits the first operation data and the second operation data to the computer.


The first operation data generation section has one of an acceleration sensor (701) and a gyro sensor included in the first control unit body. The first operation data generation section outputs data generated by the one of the acceleration sensor and the gyro sensor as the first operation data.


The cable is detachably connected to at least the first control unit. The transmission section is included in the first control unit.


The transmission section collects and transmits to the computer the first operation data and the second operation data at intervals shorter than 1/60 second.


The second operation data generation section includes a stick (78a) which has a tip projecting from a second control unit body included in the second control unit and is inclinable on the second control unit body. The second operation data generation section outputs data obtained in accordance with an inclining direction of the stick as the second operation data.


The second operation data generation section includes an operation button (78f) which has operation portions representing at least four directions and which is able to be pushed into the second control unit body by the operation portions. The second operation data generation section outputs, as the second operation data, data corresponding to the operation portion at which the operation button is pushed.


The second operation data generation section includes a sliding member (78g) which has a top surface exposed from the second control unit body and which is horizontally movable on the second control unit body. The second operation data generation section outputs data obtained in accordance with a horizontal moving direction of the sliding member as the second operation data.


The second operation data generation section includes a touch pad (78h) on an outer surface of the second control unit body. The second operation data generation section outputs, as the second operation data, data obtained in accordance with a position on the touch pad at which the touch pad is touched.


The second operation data generation section includes at least four operation buttons (78i, 78j, 78k, 78l) which are able to be pushed into the second control unit body. The second operation data generation section outputs data obtained in accordance with the pushed operation button as the second operation data.


A further aspect is directed to a game controller for transmitting operation data to a computer executing a game program. The game controller comprises: a first control unit; a second control unit; and a wireless connecting means. The wireless connecting means wirelessly connects between the first control unit and the second control unit. The first control unit includes a first operation data generation section. The first operation data generation section generates first operation data in accordance with a motion of a first control unit body included in the first control unit. The second control unit includes a second operation data generation section. The second operation data generation section generates second operation data in accordance with a direction input operation performed by a player. Further, one of the first control unit and the second control unit includes a transmission section. The transmission section transmits the first operation data and the second operation data to the computer at a predetermined timing.


The first operation data generation section includes an image pickup section. The image pickup section is fixed to the first control unit body and takes an image of a periphery along a predetermined direction from the first control unit body. The first operation data generation section outputs, as the first operation data, one selected from the group consisting of an image taken by the image pickup section and a result of subjecting the image taken by the image pickup section to a predetermined calculation.


The first operation data generation section further includes a positional information calculation section. The positional information calculation section calculates positional information indicating a position, in the image taken by the image pickup section, of at least one marker image which is included in the taken image and is used as an imaging target, when performing the predetermined calculation, and outputs the positional information as the first operation data.


The transmission section wirelessly transmits the first operation data and the second operation data to the computer.


The first operation data generation section has one of an acceleration sensor and a gyro sensor included in the first control unit body. The first operation data generation section outputs data generated by the one of the acceleration sensor and the gyro sensor as the first operation data.


The transmission section collects and transmits to the computer the first operation data and the second operation data at intervals shorter than 1/60 second.


The second operation data generation section includes a stick which has a tip projecting from a second control unit body included in the second control unit and is inclinable on the second control unit body. The second operation data generation section outputs data obtained in accordance with an inclining direction of the stick as the second operation data.


The second operation data generation section includes an operation button (78f) which has operation portions representing at least four directions and which is able to be pushed into the second control unit body by the operation portions. The second operation data generation section outputs, as the second operation data, data corresponding to the operation portion at which the operation button is pushed.


The second operation data generation section includes a sliding member which has a top surface exposed from the second control unit body and which is horizontally movable on the second control unit body. The second operation data generation section outputs data obtained in accordance with a horizontal moving direction of the sliding member as the second operation data.


The second operation data generation section includes a touch pad on an outer surface of the second control unit body. The second operation data generation section outputs, as the second operation data, data obtained in accordance with a position on the touch pad at which the touch pad is touched.


The second operation data generation section includes at least four operation buttons which are able to be pushed into the second control unit body. The second operation data generation section outputs data obtained in accordance with the pushed operation button as the second operation data.


A further aspect is directed to a game system (1) comprising a game controller and a game apparatus (3). The game controller is described in the first aspect. The game apparatus is communicably connected to the game controller and includes a computer for executing a game program to represent a virtual game world on a display screen (2). The game apparatus performs a game process in accordance with at least one of the first operation data transmitted from the first control unit and the second operation data transmitted from the second control unit.


The game apparatus causes a player character appearing in the virtual game world to perform an action in accordance with at least one of the first operation data transmitted from the game controller and the second operation data transmitted from the game controller.


A further aspect is directed to a game system comprising a game controller and a game apparatus. The game controller is as described above. The game apparatus is communicably connected to the game controller and includes a computer for executing a game program to represent a virtual game world on a display screen. The game apparatus performs a game process in accordance with at least one of the first operation data transmitted from the first control unit and the second operation data transmitted from the second control unit.


The game apparatus causes a player character appearing in the virtual game world to perform an action in accordance with at least one of the first operation data transmitted from the game controller and the second operation data transmitted from the game controller.


According to the first aspect, the first control unit generates operation data in accordance with a motion of a controller body included in the game controller, and the second control unit generates operation data in accordance with a direction input operation. Thereby, when the game controller is used in a game, a player can make an input with a finger of one hand, as with a conventional controller, while moving the other hand. That is, the player can cause his or her right and left hands to perform separate operations, enabling a new operation which cannot be performed with a conventional controller. Further, by connecting the two control units to each other by a cable, the game controller requires only one transmission section for transmitting the operation data to the computer.


The first control unit may generate operation data in accordance with a motion of a controller body included in the game controller, and the second control unit generates operation data in accordance with a direction input operation. Thereby, when the game controller is used in a game, a player can make an input with a finger of one hand, as with a conventional controller, while moving the other hand. That is, the player can cause his or her right and left hands to perform respective separate operations, enabling a new operation which cannot be performed with a conventional controller. Further, the two control units are completely separated from each other, thereby providing improved controllability and enabling two players to operate the game controller.


An image taken by the image pickup section fixed to the first control unit or information obtained from the taken image can be used as the operation data. For example, a direction and a position of the first control unit with respect to the imaging target can be detected, whereby a game operation can be performed in accordance with the direction and the position of the unit.


The game controller and the computer are wirelessly connected to each other, thereby providing improved controllability of the game controller.


The acceleration sensor or the gyro sensor is used as the first operation data generation section, thereby reducing cost.


According to another aspect, the cable is eliminated from the first control unit, whereby the operation data can be transmitted to the computer using only the first control unit.


Data can be collected and transmitted at intervals shorter than a typical game process cycle (1/60 second).


The second operation data generation section for outputting a signal in accordance with a direction input operation performed by a player can be realized by the inclinable stick, a button such as a cross key having portions to be pressed depending on a direction, a horizontally movable (sliding) member, a touch pad, buttons each representing a direction, and the like.


Further, the game system according to the present invention can obtain the same effect as that of the aforementioned game controller.


These and other objects, features, aspects and advantages will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is an external view illustrating a game system 1 according to an embodiment of the present invention;



FIG. 2 is a functional block diagram of a game apparatus 3 shown in FIG. 1;



FIG. 3 is a perspective view illustrating an outer appearance of a controller 7 shown in FIG. 1;



FIG. 4 is a perspective view illustrating a state of a connecting cable 79 of the controller 7 shown in FIG. 3 being connected to or disconnected from a core unit 70;



FIG. 5 is a perspective view of the core unit 70 shown in FIG. 3 as seen from the top rear side thereof;



FIG. 6 is a perspective view of the core unit 70 shown in FIG. 3 as seen from the bottom rear side thereof;



FIG. 7A is a perspective view illustrating a state where an upper casing of the core unit 70 shown in FIG. 3 is removed;



FIG. 7B is a perspective view illustrating a state where a lower casing of the core unit 70 shown in FIG. 3 is removed;



FIG. 8A is a top view of a subunit 76 shown in FIG. 3;



FIG. 8B is a bottom view of the subunit 76 shown in FIG. 3;



FIG. 8C is a left side view of the subunit 76 shown in FIG. 3;



FIG. 9 is a perspective view of the subunit 76 shown in FIG. 3 as seen from the top front side thereof;



FIG. 10 is a top view illustrating an example of a first modification of the subunit 76 shown in FIG. 3;



FIG. 11 is a top view illustrating an example of a second modification of the subunit 76 shown in FIG. 3;



FIG. 12 is a top view illustrating an example of a third modification of the subunit 76 shown in FIG. 3;



FIG. 13 is a top view illustrating an example of a fourth modification of the subunit 76 shown in FIG. 3;



FIG. 14 is a block diagram illustrating a structure of the controller 7 shown in FIG. 3;



FIG. 15 is a diagram illustrating a state of a game being generally controlled with the controller 7 shown in FIG. 3;



FIG. 16 shows an exemplary state of a player holding the core unit 70 with a right hand as seen from the front surface side of the core unit 70;



FIG. 17 shows an exemplary state of a player holding the core unit 70 with a right hand as seen from the left side of the core unit 70;



FIG. 18 is a diagram illustrating a viewing angle of an LED module 8L, a viewing angle of an LED module 8R, and a viewing angle of an image pickup element 743;



FIG. 19 shows an exemplary state of a player holding the subunit 76 with a left hand as seen from the right side of the subunit 76; and



FIG. 20 shows an exemplary game image displayed on the monitor 2 when the game apparatus 3 executes a shooting game.





DESCRIPTION OF THE PREFERRED EMBODIMENTS

With reference to FIG. 1, a game system 1 according to one embodiment of the present invention will be described. FIG. 1 is an external view illustrating the game system 1. In the following description, the game system 1 according to the present invention includes a stationary game apparatus.


As shown in FIG. 1, the game system 1 includes an installation type game apparatus (hereinafter, referred to simply as a “game apparatus”) 3, which is connected to a display (hereinafter, referred to as a “monitor”) 2 of a home-use television receiver or the like having a speaker 2a via a connection cord, and a controller 7 for giving operation information to the game apparatus 3. The game apparatus 3 is connected to a receiving unit 6 via a connection terminal. The receiving unit 6 receives transmission data which is wirelessly transmitted from the controller 7. The controller 7 and the game apparatus 3 are connected to each other by wireless communication. On the game apparatus 3, an optical disc 4 as an example of an exchangeable information storage medium is detachably mounted. The game apparatus 3 includes a power ON/OFF switch, a game process reset switch, and an OPEN switch for opening a top lid of the game apparatus 3 on a top main surface of the game apparatus 3. When a player presses the OPEN switch, the lid is opened, so that the optical disc 4 can be mounted or dismounted.


Further, on the game apparatus 3, an external memory card 5 is detachably mounted when necessary. The external memory card 5 has a backup memory or the like mounted thereon for fixedly storing saved data or the like. The game apparatus 3 executes a game program or the like stored on the optical disc 4 and displays the result on the monitor 2 as a game image. The game apparatus 3 can also reproduce a state of a game played in the past using saved data stored in the external memory card 5 and display the game image on the monitor 2. A player playing with the game apparatus 3 can enjoy the game by operating the controller 7 while watching the game image displayed on the monitor 2.


The controller 7 wirelessly transmits the transmission data from a communication section 75 included therein (described later) to the game apparatus 3 connected to the receiving unit 6, using the technology of, for example, Bluetooth (registered trademark). The controller 7 has two control units, a core unit 70 and a subunit 76, connected to each other by a flexible connecting cable 79. The controller 7 is an operation means for mainly operating a player object appearing in a game space displayed on the monitor 2. The core unit 70 and the subunit 76 each includes an operation section such as a plurality of operation buttons, a key, a stick and the like. As described later in detail, the core unit 70 includes an imaging information calculation section 74 for taking an image viewed from the core unit 70. As an example of an imaging target of the imaging information calculation section 74, two LED modules 8L and 8R are provided in the vicinity of a display screen of the monitor 2. The LED modules 8L and 8R each outputs infrared light forward from the monitor 2. Although in the present embodiment the core unit 70 and the subunit 76 are connected to each other by the flexible cable, the subunit 76 may have a wireless unit, thereby eliminating the connecting cable 79. For example, the subunit 76 has a Bluetooth (registered trademark) unit as the wireless unit, whereby the subunit 76 can transmit operation data to the core unit 70.


Next, with reference to FIG. 2, a structure of the game apparatus 3 will be described. FIG. 2 is a functional block diagram of the game apparatus 3.


As shown in FIG. 2, the game apparatus 3 includes, for example, a RISC CPU (central processing unit) 30 for executing various types of programs. The CPU 30 executes a boot program stored in a boot ROM (not shown) to, for example, initialize memories including a main memory 33, and then executes a game program stored on the optical disc 4 to perform game process or the like in accordance with the game program. The CPU 30 is connected to a GPU (Graphics Processing Unit) 32, the main memory 33, a DSP (Digital Signal Processor) 34, and an ARAM (audio RAM) 35 via a memory controller 31. The memory controller 31 is connected to a controller I/F (interface) 36, a video I/F 37, an external memory I/F 38, an audio I/F 39, and a disc I/F 41 via a predetermined bus. The controller I/F 36, the video I/F 37, the external memory I/F 38, the audio I/F 39 and the disc I/F 41 are respectively connected to the receiving unit 6, the monitor 2, the external memory card 5, the speaker 2a, and a disc drive 40.


The GPU 32 performs image processing based on an instruction from the CPU 30. The GPU 32 includes, for example, a semiconductor chip for performing calculation process necessary for displaying 3D graphics. The GPU 32 performs the image process using a memory dedicated for image process (not shown) and a part of the storage area of the main memory 33. The GPU 32 generates game image data and a movie to be displayed on the monitor 2 using such memories, and outputs the generated data or movie to the monitor 2 via the memory controller 31 and the video I/F 37 as necessary.


The main memory 33 is a storage area used by the CPU 30, and stores a game program or the like necessary for processing performed by the CPU 30 as necessary. For example, the main memory 33 stores a game program read from the optical disc 4 by the CPU 30, various types of data or the like. The game program, the various types of data or the like stored in the main memory 33 are executed by the CPU 30.


The DSP 34 processes sound data or the like generated by the CPU 30 during the execution of the game program. The DSP 34 is connected to the ARAM 35 for storing the sound data or the like. The ARAM 35 is used when the DSP 34 performs a predetermined process (for example, storage of the game program or sound data already read). The DSP 34 reads the sound data stored in the ARAM 35, and outputs the sound data to the speaker 2a included in the monitor 2 via the memory controller 31 and the audio I/F 39.


The memory controller 31 comprehensively controls data transmission, and is connected to the various I/Fs described above. The controller I/F 36 includes, for example, four controller I/Fs 36a, 36b, 36c and 36d, and communicably connects the game apparatus 3 to an external device which is engageable via connectors of the controller I/Fs 36a, 36b, 36c and 36d. For example, the receiving unit 6 is engaged with such a connector and is connected to the game apparatus 3 via the controller I/F 36. As described above, the receiving unit 6 receives the transmission data from the controller 7 and outputs the transmission data to the CPU 30 via the controller I/F 36. The video I/F 37 is connected to the monitor 2. The external memory I/F 38 is connected to the external memory card 5 and is accessible to a backup memory or the like provided in the external memory card 5. The audio I/F 39 is connected to the speaker 2a built in the monitor 2 such that the sound data read by the DSP 34 from the ARAM 35 or sound data directly outputted from the disc drive 40 can be outputted from the speaker 2a. The disc I/F 41 is connected to the disc drive 40. The disc drive 40 reads data stored at a predetermined reading position of the optical disc 4 and outputs the data to a bus of the game apparatus 3 or the audio I/F 39.


Next, with reference to FIGS. 3 and 4, the controller 7 will be described. FIG. 3 is a perspective view illustrating an outer appearance of the controller 7. FIG. 4 is a perspective view illustrating a state of the connecting cable 79 of the controller 7 shown in FIG. 3 being connected to or disconnected from the core unit 70.


As shown in FIG. 3, the controller 7 includes the core unit 70 and the subunit 76 connected to each other by the connecting cable 79. The core unit 70 has a housing 71 including a plurality of operation sections 72. The subunit 76 has a housing 77 including a plurality of operation sections 78. The core unit 70 and the subunit 76 are connected to each other by the connecting cable 79.


As shown in FIG. 4, the connecting cable 79 has a connector 791 detachably connected to the connector 73 of the core unit 70 at one end thereof, and the connecting cable 79 is fixedly connected to the subunit 76 at the other end thereof. The connector 791 of the connecting cable 79 is engaged with the connector 73 provided at the rear surface of the core unit 70 so as to connect the core unit 70 and the subunit 76 to each other by the connecting cable 79.


With reference to FIGS. 5 and 6, the core unit 70 will be described. FIG. 5 is a perspective view of the core unit 70 as seen from the top rear side thereof. FIG. 6 is a perspective view of the core unit 70 as seen from the bottom rear side thereof.


As shown in FIGS. 5 and 6, the core unit 70 includes the housing 71 formed by plastic molding or the like. The housing 71 has a generally parallelepiped shape extending in a longitudinal direction from front to rear. The overall size of the housing 71 is small enough to be held by one hand of an adult or even a child.


At the center of a front part of a top surface of the housing 71, a cross key 72a is provided. The cross key 72a is a cross-shaped four-direction push switch. The cross key 72a includes operation portions corresponding to the four directions (front, rear, right and left) represented by arrows, which are respectively located on cross-shaped projecting portions arranged at intervals of 90 degrees. The player selects one of the front, rear, right and left directions by pressing one of the operation portions of the cross key 72a. Through an operation on the cross key 72a, the player can, for example, instruct a direction in which a player character or the like appearing in a virtual game world is to move or a direction in which the cursor is to move.


Although the cross key 72a is an operation section for outputting an operation signal in accordance with the aforementioned direction input operation performed by the player, such an operation section may be provided in another form. For example, the cross key 72a may be replaced with a composite switch including a push switch including a ring-shaped four-direction operation section and a center switch provided at the center thereof. Alternatively, the cross key 72a may be replaced with an operation section which includes an inclinable stick projecting from the top surface of the housing 71 and outputs an operation signal in accordance with the inclining direction of the stick. Still alternatively, the cross key 72a may be replaced with an operation section which includes a disc-shaped member horizontally slideable and outputs an operation signal in accordance with the sliding direction of the disc-shaped member. Still alternatively, the cross key 72a may be replaced with a touch pad. Still alternatively, the cross key 72a may be replaced with an operation section which includes switches representing at least four directions (front, rear, right and left) and outputs an operation signal in accordance with the switch pressed by the player.


Behind the cross key 72a on the top surface of the housing 71, a plurality of operation buttons 72b, 72c, 72d, 72e, 72f and 72g are provided. The operation buttons 72b, 72c, 72d, 72e, 72f and 72g are each an operation section for outputting a respective operation signal assigned to the operation buttons 72b, 72c, 72d, 72e, 72f or 72g when the player presses a head thereof. For example, the operation buttons 72b, 72c, and 72d are assigned with functions of an X button, a Y button, and a B button. Further, the operation buttons 72e, 72f and 72g are assigned with functions of a select switch, a menu switch and a start switch, for example. The operation buttons 72b, 72c, 72d, 72e, 72f and 72g are assigned with various functions in accordance with the game program executed by the game apparatus 3, but this will not be described in detail because the functions are not directly relevant to the present invention. In an exemplary arrangement shown in FIG. 5, the operation buttons 72b, 72c and 72d are arranged in a line at the center in the front-rear direction on the top surface of the housing 71. The operation buttons 72e, 72f and 72g are arranged in a line in the left-right direction between the operation buttons 72b and 72d on the top surface of the housing 71. The operation button 72f has a top surface thereof buried in the top surface of the housing 71, so as not to be inadvertently pressed by the player.


In front of the cross key 72a on the top surface of the housing 71, an operation button 72h is provided. The operation button 72h is a power switch for remote-controlling the power of the game apparatus 3 to be on or off. The operation button 72h also has a top surface thereof buried (recessed) in the top surface of the housing 71, so as not to be inadvertently pressed by the player.


Behind the operation button 72c on the top surface of the housing 71, a plurality of LEDs 702 are provided. The controller 7 is assigned a controller type (number) so as to be distinguishable from other controllers 7. For example, the LEDs 702 are used for informing the player of the controller type currently set for the controller 7 that he or she is using. Specifically, when the core unit 70 transmits the transmission data to the receiving unit 6, one of the plurality of LEDs 702 corresponding to the controller type is lit up.


On a bottom surface of the housing 71, a recessed portion is formed. As described later in detail, the recessed portion is formed at a position at which an index finger or middle finger of the player is located when the player holds the core unit 70. On a rear slope surface of the recessed portion, an operation button 72i is provided. The operation button 72i is an operation section acting as, for example, an A button. The operation button 72i is used, for example, as a trigger switch in a shooting game, or for attracting attention of a player object to a predetermined object.


On a front surface of the housing 71, an image pickup element 743 included in the imaging information calculation section 74 is provided. The imaging information calculation section 74 is a system for analyzing image data taken by the core unit 70 and detecting the position of the center of gravity, the size and the like of an area having a high brightness in the image data. The imaging information calculation section 74 has, for example, a maximum sampling rate of about 200 frames/sec., and therefore can trace and analyze even a relatively fast motion of the core unit 70. The imaging information calculation section 74 will be described later in detail. On a rear surface of the housing 71, the connector 73 is provided. The connector 73 is, for example, a 32-pin edge connector, and is used for engaging and connecting the core unit 70 with the connector 791 of the connecting cable 79.


With reference to FIGS. 7A and 7B, an internal structure of the core unit 70 will be described. FIG. 7A is a perspective view illustrating a state where an upper casing (a part of the housing 71) of the core unit 70 is removed. FIG. 7B is a perspective view illustrating a state where a lower casing (a part of the housing 71) of the core unit 70 is removed, showing a reverse side of a substrate 700 shown in FIG. 7A.


As shown in FIG. 7A, the substrate 700 is fixed inside the housing 71. On a top main surface of the substrate 700, the operation buttons 72a, 72b, 72c, 72d, 72e, 72f, 72g and 72h, an acceleration sensor 701, the LEDs 702, a quartz oscillator 703, a wireless module 753, an antenna 754 and the like are provided. These elements are connected to a microcomputer 751 (see FIG. 14) via lines (not shown) formed on the substrate 700 and the like. The wireless module 753 and the antenna 754 allow the core unit 70 to act as a wireless controller. The quartz oscillator 703 generates a reference clock of the microcomputer 751 described later.


As shown in FIG. 7B, at a front edge of a bottom main surface of the substrate 700, the imaging information calculation section 74 is provided. The imaging information calculation section 74 includes an infrared filter 741, a lens 742, the image pickup element 743 and an image processing circuit 744 located in this order from the front surface of the core unit 70 on the bottom main surface of the substrate 700. At a rear edge of the bottom main surface of the substrate 700, the connector 73 is attached. The operation button 72i is attached on the bottom main surface of the substrate 700 behind the imaging information calculation section 74, and cells 705 are accommodated behind the operation button 72i. On the bottom main surface of the substrate 700 between the cells 705 and the connector 73, a vibrator 704 is attached. The vibrator 704 may be, for example, a vibration motor or a solenoid. The core unit 70 is vibrated by an actuation of the vibrator 704, and the vibration is conveyed to the player's hand holding the core unit 70.


With reference to FIGS. 8A, 8B, 8C and 9, the subunit 76 will be described. FIG. 8A is a top view of the subunit 76. FIG. 8B is a bottom view of the subunit 76. FIG. 8C is a left side view of the subunit 76. FIG. 9 is a perspective view of the subunit 76 as seen from the top front side thereof.


As shown in FIGS. 8A, 8B, 8C and 9, the subunit 76 includes the housing 77 formed by, for example, plastic molding. The housing 77 extends in a longitudinal direction from front to rear, and has a streamlined solid shape including a head which is a widest portion in the subunit 76. The overall size of the subunit 76 is small enough to be held by one hand of an adult or even a child.


In the vicinity of the widest portion on the top surface of the housing 77, a stick 78a is provided. The stick 78a is an operation section which includes an inclinable stick projecting from the top surface of the housing 77 and outputs an operation signal in accordance with the inclining direction of the stick. For example, a player can arbitrarily designate a direction and a position by inclining a stick tip in any direction of 360 degrees, whereby the player can instruct a direction in which a player character or the like appearing in a virtual game world is to move, or can instruct a direction in which a cursor is to move.


Although the stick 78a is an operation section for outputting an operation signal in accordance with a direction input operation performed by the player as described above, such an operation section may be provided in another form. Hereinafter, with reference to FIGS. 10 to 13, first through fifth exemplary modifications, each of which includes the subunit 76 having an operation section for outputting an operation signal in accordance with the direction input operation, will be described.


As the first exemplary modification, as shown in FIG. 10, the subunit 76 may include a cross key 78f similar to the cross key 72a of the core unit 70 instead of the stick 78a. As the second exemplary modification, as shown in FIG. 11, the subunit 76 may include a slide pad 78g which includes a disc-shaped member horizontally slideable and outputs an operation signal in accordance with the sliding direction of the disc-shaped member, instead of the stick 78a. As the third exemplary modification, as shown in FIG. 12, the subunit 76 may include a touch pad 78h instead of the stick 78a. As the fourth exemplary modification, as shown in FIG. 13, the subunit 76 may include an operation section which has buttons 78i, 78j, 78k, and 78l representing at least four directions (front, rear, right and left), respectively, and outputs an operation signal in accordance with the button (78i, 78j, 78k, or 78l) pressed by a player, instead of the stick 78a. As the fifth exemplary modification, the subunit 76 may include a composite switch including a push switch having a ring-shaped four-direction operation section and a center switch provided at the center thereof, instead of the stick 78a.


Behind the stick 78a on the top surface of the housing 77 and on the front surface of the housing 77, a plurality of operation buttons 78b, 78c, 78d and 78e are provided. The operation buttons 78b, 78c, 78d and 78e are each an operation section for outputting a respective operation signal assigned to the operation buttons 78b, 78c, 78d and 78e when the player presses a head thereof. For example, the operation buttons 78b, 78c, 78d and 78e are assigned with functions of an X button, a Y button and the like. The operation buttons 78b, 78c, 78d and 78e are assigned with various functions in accordance with the game program executed by the game apparatus 3, but this will not be described in detail because the functions are not directly relevant to the present invention. In the exemplary arrangement shown in FIGS. 8A, 8B, 8C and 9, the operation buttons 78b and 78c are arranged in a line at the center in the left-right direction on the top surface of the housing 77. The operation buttons 78d and 78e are arranged in a line in the front-rear direction on the front surface of the housing 77.


Next, with reference to FIG. 14, an internal structure of the controller 7 will be described. FIG. 14 is a block diagram illustrating the structure of the controller 7.


As shown in FIG. 14, the core unit 70 includes the communication section 75 and the acceleration sensor 701 in addition to the aforementioned operation section 72 and the imaging information calculation section 74.


The imaging information calculation section 74 includes the infrared filter 741, the lens 742, the image pickup element 743 and the image processing circuit 744. The infrared filter 741 allows only infrared light to pass therethrough, among light incident on the front surface of the core unit 70. The lens 742 collects the infrared light which has passed through the infrared filter 741 and outputs the infrared light to the image pickup element 743. The image pickup element 743 is a solid-state imaging device such as, for example, a CMOS sensor or a CCD. The image pickup element 743 takes an image of the infrared light collected by the lens 742. Accordingly, the image pickup element 743 takes an image of only the infrared light which has passed through the infrared filter 741 and generates image data. The image data generated by the image pickup element 743 is processed by the image processing circuit 744. Specifically, the image processing circuit 744 processes the image data obtained from the image pickup element 743, identifies a spot thereof having a high brightness, and outputs process result data representing the identified position coordinates and size of the area to the communication section 75. The imaging information calculation section 74 is fixed to the housing 71 of the core unit 70. The imaging direction of the imaging information calculation section 74 can be changed by changing the direction of the housing 71. The housing 71 is connected to the subunit 76 by the flexible connecting cable 79, and therefore the imaging direction of the imaging information calculation section 74 is not changed by changing the direction and position of the subunit 76. As described later in detail, a signal can be obtained in accordance with the position of the core unit 70 based on the process result data outputted by the imaging information calculation section 74.
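As an informal illustration of the processing attributed to the image processing circuit 744, the following Python sketch thresholds an infrared image and reports the position coordinates and size of each high-brightness spot. The threshold value and the use of a connected-component pass are assumptions made for illustration; they are not the actual circuit.

    import numpy as np
    from scipy import ndimage

    def find_bright_spots(image: np.ndarray, threshold: int = 200):
        """Return (centroid_x, centroid_y, pixel_count) for each high-brightness region."""
        mask = image >= threshold                  # keep only high-brightness pixels
        labels, count = ndimage.label(mask)        # group them into connected spots
        spots = []
        for spot_id in range(1, count + 1):
            ys, xs = np.nonzero(labels == spot_id)
            spots.append((float(xs.mean()), float(ys.mean()), int(xs.size)))
        return spots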


The core unit 70 preferably includes a three-axis, linear acceleration sensor 701 that detects linear acceleration in three directions, i.e., the up/down direction, the left/right direction, and the forward/backward direction. Alternatively, a two axis linear accelerometer that only detects linear acceleration along each of the up/down and left/right directions (or other pair of directions) may be used in another embodiment depending on the type of control signals desired. As a non-limiting example, the three-axis or two-axis linear accelerometer 701 may be of the type available from Analog Devices, Inc. or STMicroelectronics N.V. Preferably, the acceleration sensor 701 is an electrostatic capacitance or capacitance-coupling type that is based on silicon micro-machined MEMS (microelectromechanical systems) technology. However, any other suitable accelerometer technology (e.g., piezoelectric type or piezoresistance type) now existing or later developed may be used to provide the three-axis or two-axis acceleration sensor 701.


As one skilled in the art understands, linear accelerometers, as used in acceleration sensor 701, are only capable of detecting acceleration along a straight line corresponding to each axis of the acceleration sensor. In other words, the direct output of the acceleration sensor 701 is limited to signals indicative of linear acceleration (static or dynamic) along each of the two or three axes thereof. As a result, the acceleration sensor 701 cannot directly detect movement along a non-linear (e.g. arcuate) path, rotation, rotational movement, angular displacement, tilt, position, attitude or any other physical characteristic.


However, through additional processing of the linear acceleration signals output from the acceleration sensor 701, additional information relating to the core unit 70 can be inferred or calculated, as one skilled in the art will readily understand from the description herein. For example, by detecting static, linear acceleration (i.e., gravity), the linear acceleration output of the acceleration sensor 701 can be used to infer tilt of the object relative to the gravity vector by correlating tilt angles with detected linear acceleration. In this way, the acceleration sensor 701 can be used in combination with the micro-computer 751 (or another processor) to determine tilt, attitude or position of the core unit 70. Similarly, various movements and/or positions of the core unit 70 can be calculated or inferred through processing of the linear acceleration signals generated by the acceleration sensor 701 when the core unit 70 containing the acceleration sensor 701 is subjected to dynamic accelerations by, for example, the hand of a user, as explained herein. In another embodiment, the acceleration sensor 701 may include an embedded signal processor or other type of dedicated processor for performing any desired processing of the acceleration signals output from the accelerometers therein prior to outputting signals to micro-computer 751. For example, the embedded or dedicated processor could convert the detected acceleration signal to a corresponding tilt angle when the acceleration sensor is intended to detect static acceleration (i.e., gravity). Data representing the acceleration detected by the acceleration sensor 701 is outputted to the communication section 75.
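As an informal illustration of inferring tilt from static (gravity) acceleration as described above, the following Python sketch converts a three-axis acceleration sample into pitch and roll angles. The axis conventions are assumptions made for illustration only.

    import math

    def tilt_from_gravity(ax: float, ay: float, az: float):
        """Return (pitch, roll) in radians from one three-axis acceleration sample."""
        pitch = math.atan2(ax, math.hypot(ay, az))  # rotation about the left/right axis
        roll = math.atan2(ay, math.hypot(ax, az))   # rotation about the forward/backward axis
        return pitch, roll

    # A unit lying flat reports roughly (0, 0, 1) g, i.e. no tilt.
    print(tilt_from_gravity(0.0, 0.0, 1.0))         # -> (0.0, 0.0)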


In another exemplary embodiment, the acceleration sensor 701 may be replaced with a gyro-sensor of any suitable technology incorporating, for example, a rotating or vibrating element. Exemplary MEMS gyro-sensors that may be used in this embodiment are available from Analog Devices, Inc. Unlike the linear acceleration sensor 701, a gyro-sensor is capable of directly detecting rotation (or angular rate) around an axis defined by the gyroscopic element (or elements) therein. Thus, due to the fundamental differences between a gyro-sensor and a linear acceleration sensor, corresponding changes need to be made to the processing operations that are performed on the output signals from these devices depending on which device is selected for a particular application.


More specifically, when a tilt or inclination is calculated using a gyroscope instead of the acceleration sensor, significant changes are necessary. Specifically, when using a gyro-sensor, the value of inclination is initialized at the start of detection. Then, data on the angular velocity which is output from the gyroscope is integrated. Next, a change amount in inclination from the value of inclination previously initialized is calculated. In this case, the calculated inclination corresponds to an angle. In contrast, when an acceleration sensor is used, the inclination is calculated by comparing the value of the acceleration of gravity of each axial component with a predetermined reference. Therefore, the calculated inclination can be represented as a vector. Thus, without initialization, an absolute direction can be determined with an accelerometer. The type of the value calculated as an inclination is also very different between a gyroscope and an accelerometer; i.e., the value is an angle when a gyroscope is used and is a vector when an accelerometer is used. Therefore, when a gyroscope is used instead of an acceleration sensor or vice versa, data on inclination also needs to be processed by a predetermined conversion that takes into account the fundamental differences between these two devices. Due to the fact that the nature of gyroscopes is known to one skilled in the art, as well as the fundamental differences between linear accelerometers and gyroscopes, further details are not provided herein so as not to obscure the remainder of the disclosure. While gyro-sensors provide certain advantages due to their ability to directly detect rotation, linear acceleration sensors are generally more cost effective when used in connection with the controller applications described herein.
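The contrast described above can be summarized in a short Python sketch: the gyroscope path initializes an inclination value and integrates angular velocity into an angle, while the accelerometer path compares the measured gravity components and yields a direction vector without initialization. The sample period and axis choice are assumptions for illustration only.

    import math

    def inclination_from_gyro(previous_angle: float, angular_velocity: float, dt: float) -> float:
        # gyroscope path: integrate angular velocity (deg/s) onto the initialized angle
        return previous_angle + angular_velocity * dt

    def inclination_from_accel(ax: float, ay: float, az: float):
        # accelerometer path: normalize the measured gravity components into a direction vector
        g = math.sqrt(ax * ax + ay * ay + az * az)
        return (ax / g, ay / g, az / g)

    angle = 0.0                                        # the gyroscope needs an initial value
    angle = inclination_from_gyro(angle, 90.0, 0.005)  # one 5 ms sample at 90 deg/s
    vector = inclination_from_accel(0.0, 0.0, 1.0)     # the accelerometer gives an absolute direction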


The communication section 75 includes the microcomputer 751, a memory 752, the wireless module 753 and the antenna 754. The microcomputer 751 controls the wireless module 753 for wirelessly transmitting the transmission data while using the memory 752 as a storage area during the process.


Data from the core unit 70 including an operation signal (core key data) from the operation section 72, acceleration signals (acceleration data) from the acceleration sensor 701, and the process result data from the imaging information calculation section 74 are outputted to the microcomputer 751. An operation signal (sub key data) from the operation section 78 of the subunit 76 is outputted to the microcomputer 751 via the connecting cable 79. The microcomputer 751 temporarily stores the input data (core key data, sub key data, acceleration data, and process result data) in the memory 752 as the transmission data which is to be transmitted to the receiving unit 6. The wireless transmission from the communication section 75 to the receiving unit 6 is performed periodically at a predetermined time interval. Since game processing is generally performed at a cycle of 1/60 sec., data needs to be collected and transmitted at a shorter cycle. Specifically, the game process cycle is 16.7 ms (1/60 sec.), and the transmission interval of the communication section 75 structured using the Bluetooth (registered trademark) technology is 5 ms. At the transmission timing to the receiving unit 6, the microcomputer 751 outputs the transmission data stored in the memory 752 as a series of operation information to the wireless module 753. The wireless module 753 uses, for example, the Bluetooth (registered trademark) technology to modulate the operation information onto a carrier wave of a predetermined frequency, and radiates the low power radio wave signal from the antenna 754. Thus, the core key data from the operation section 72 included in the core unit 70, the sub key data from the operation section 78 included in the subunit 76, the acceleration data from the acceleration sensor 701, and the process result data from the imaging information calculation section 74 are modulated onto the low power radio wave signal by the wireless module 753 and radiated from the core unit 70. The receiving unit 6 of the game apparatus 3 receives the low power radio wave signal, and the game apparatus 3 demodulates or decodes the low power radio wave signal to obtain the series of operation information (the core key data, the sub key data, the acceleration data, and the process result data). Based on the obtained operation information and the game program, the CPU 30 of the game apparatus 3 performs the game process. In the case where the communication section 75 is structured using the Bluetooth (registered trademark) technology, the communication section 75 can have a function of receiving transmission data which is wirelessly transmitted from other devices. The acceleration data and/or the process result data are included in the first operation data, and the sub key data is included in the second operation data.
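As an informal illustration of this collection-and-transmission timing, the following Python sketch gathers the core key data, sub key data, acceleration data, and process result data into one frame and sends it every 5 ms, i.e., several times per 16.7 ms game process cycle. The frame layout and field sizes are hypothetical and do not represent the actual Bluetooth payload.

    import struct
    import time

    TRANSMISSION_INTERVAL = 0.005   # 5 ms, shorter than the 16.7 ms (1/60 sec.) game cycle

    def build_frame(core_keys: int, sub_keys: int, accel, marker_result) -> bytes:
        # pack core key data, sub key data, three acceleration values and two marker coordinates
        return struct.pack("<HB3f2f", core_keys, sub_keys, *accel, *marker_result)

    def transmission_loop(read_inputs, send):
        # collect and send one frame per interval; send() stands in for the wireless module
        while True:
            core_keys, sub_keys, accel, marker_result = read_inputs()
            send(build_frame(core_keys, sub_keys, accel, marker_result))
            time.sleep(TRANSMISSION_INTERVAL)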


As shown in FIG. 15, in order to play a game using the controller 7 with the game system 1, a player holds the core unit 70 with one hand (for example, a right hand) (see FIGS. 16 and 17), and holds the subunit 76 with the other hand (for example, a left hand) (see FIG. 19). The player holds the core unit 70 so as to point the front surface of the core unit 70 (that is, a side having an entrance through which light is incident on the imaging information calculation section 74 taking an image of the light) to the monitor 2. On the other hand, two LED modules 8L and 8R are provided in the vicinity of the display screen of the monitor 2. The LED modules 8L and 8R each outputs infrared light forward from the monitor 2.


When a player holds the core unit 70 so as to point the front surface thereof to the monitor 2, infrared lights outputted by the two LED modules 8L and 8R are incident on the imaging information calculation section 74. The image pickup element 743 takes images of the infrared lights incident through the infrared filter 741 and the lens 742, and the image processing circuit 744 processes the taken images. The imaging information calculation section 74 detects infrared components outputted by the LED modules 8L and 8R so as to obtain positions and area information of the LED modules 8L and 8R. Specifically, the imaging information calculation section 74 analyzes image data taken by the image pickup element 743, eliminates images which do not represent the infrared lights outputted by the LED modules 8L and 8R from the area information, and identifies points each having a high brightness as positions of the LED modules 8L and 8R. The imaging information calculation section 74 obtains position coordinates, coordinates of the center of gravity, and the like of each of the identified points having the high brightness and outputs them as the process result data. When such process result data is transmitted to the game apparatus 3, the game apparatus 3 can obtain, based on the position coordinates and the coordinates of the center of gravity, operation signals relating to the motion, posture, position and the like of the imaging information calculation section 74, that is, the core unit 70, with respect to the LED modules 8L and 8R. Specifically, the high-brightness position in the image obtained through the communication section 75 changes in accordance with the motion of the core unit 70, so that a direction input or a coordinate input can be performed along the moving direction of the core unit 70.
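As an informal illustration of how such process result data can serve as a coordinate input, the following Python sketch maps the midpoint of the two detected marker positions onto display screen coordinates. The sensor and screen resolutions, and the mirroring of the horizontal axis, are assumptions made for illustration only.

    IMAGE_W, IMAGE_H = 1024, 768     # assumed resolution of the taken image
    SCREEN_W, SCREEN_H = 640, 480    # assumed resolution of the display screen

    def pointer_from_markers(left, right):
        """Map the midpoint of the two bright spots (x, y) to a display screen position."""
        mid_x = (left[0] + right[0]) / 2.0
        mid_y = (left[1] + right[1]) / 2.0
        # when the core unit moves right, the markers appear to move left in the image,
        # so the horizontal axis is mirrored
        screen_x = (1.0 - mid_x / IMAGE_W) * SCREEN_W
        screen_y = (mid_y / IMAGE_H) * SCREEN_H
        return screen_x, screen_y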


Thus, the imaging information calculation section 74 of the core unit 70 takes images of stationary markers (infrared lights from the two LED modules 8L and 8R in the present embodiment), and therefore the game apparatus 3 can use the process result data relating to the motion, posture, position and the like of the core unit 70 in the game process, whereby an operation input, which is different from an input made by pressing an operation button or using an operation key, can be performed more intuitively. As described above, since the markers are provided in the vicinity of the display screen of the monitor 2, the motion, posture, position and the like of the core unit 70 with respect to the display screen of the monitor 2 can be easily calculated based on the positions of the markers. That is, the process result data used for obtaining the motion, posture, position and the like of the core unit 70 can be used as an operation input immediately applied to the display screen of the monitor 2.


With reference to FIGS. 16 and 17, a state of a player holding the core unit 70 with one hand will be described. FIG. 16 shows an exemplary state of a player holding the core unit 70 with a right hand as seen from the front surface side of the core unit 70. FIG. 17 shows an exemplary state of a player holding the core unit 70 with a right hand as seen from the left side of the core unit 70.


As shown in FIGS. 16 and 17, the overall size of the core unit 70 is small enough to be held by one hand of an adult or even a child. When the player puts a thumb on the top surface of the core unit 70 (for example, near the cross key 72a), and puts an index finger in the recessed portion on the bottom surface of the core unit 70 (for example, near the operation button 72i), the light entrance of the imaging information calculation section 74 on the front surface of the core unit 70 is exposed forward to the player. It should be understood that also when the player holds the core unit 70 with a left hand, the holding state is the same as that described for the right hand.


Thus, the core unit 70 allows a player to easily operate the operation section 72 such as the cross key 72a or the operation button 72i while holding the core unit 70 with one hand. Further, when the player holds the core unit 70 with one hand, the light entrance of the imaging information calculation section 74 on the front surface of the core unit 70 is exposed, whereby the light entrance can easily receive infrared lights from the aforementioned two LED modules 8L and 8R. That is, the player can hold the core unit 70 with one hand without preventing the imaging information calculation section 74 from functioning. Thus, when the player moves his or her hand holding the core unit 70 with respect to the display screen, the core unit 70 can further perform an operation input enabling a motion of the player's hand to directly act on the display screen.


As shown in FIG. 18, the LED modules 8L and 8R each have a viewing angle θ1. The image pickup element 743 has a viewing angle θ2. For example, the viewing angle θ1 of the LED modules 8L and 8R is 34 degrees (half-value angle), and the viewing angle θ2 of the image pickup element 743 is 41 degrees. When both the LED modules 8L and 8R are in the viewing angle θ2 of the image pickup element 743, and the image pickup element 743 is in the viewing angle θ1 of the LED module 8L and the viewing angle θ1 of the LED module 8R, the game apparatus 3 determines the position of the core unit 70 using the positional information relating to the high-brightness points of the two LED modules 8L and 8R.


When either the LED module 8L or the LED module 8R is in the viewing angle θ2 of the image pickup element 743, or when the image pickup element 743 is in either the viewing angle θ1 of the LED module 8L or the viewing angle θ1 of the LED module 8R, the game apparatus 3 determines the position of the core unit 70 using the positional information relating to the high-brightness point of the LED module 8L or the LED module 8R.
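The decision of whether two high-brightness points, one, or none are usable can be sketched as follows (illustrative only); the example treats the stated angles as simple cone half-angles and assumes a planar geometry, which is a simplification of the arrangement in FIG. 18.

```cpp
// Sketch of the trackability decision above: how many LED modules the image
// pickup element 743 can currently see, under simplifying planar assumptions.
#include <cmath>

constexpr double kLedHalfAngleDeg    = 34.0;  // theta1, per LED module (example value)
constexpr double kCameraHalfAngleDeg = 41.0;  // theta2, image pickup element 743 (example value)

// angleCameraToLed: bearing of the LED as seen from the camera axis (degrees)
// angleLedToCamera: bearing of the camera as seen from the LED axis (degrees)
bool markerUsable(double angleCameraToLed, double angleLedToCamera) {
    return std::fabs(angleCameraToLed) <= kCameraHalfAngleDeg &&
           std::fabs(angleLedToCamera) <= kLedHalfAngleDeg;
}

int visibleMarkerCount(double camToL, double lToCam, double camToR, double rToCam) {
    return (markerUsable(camToL, lToCam) ? 1 : 0) + (markerUsable(camToR, rToCam) ? 1 : 0);
}
// Two visible markers: position from both high-brightness points; one: single-point fallback.
```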


Next, with reference to FIG. 19, a state of a player holding the subunit 76 with one hand will be described. FIG. 19 shows an exemplary state of a player holding the subunit 76 with a left hand as seen from the right side of the subunit 76.


As shown in FIG. 19, the overall size of the subunit 76 is small enough to be held by one hand of an adult or even a child. For example, a player can put a thumb on the top surface of the subunit 76 (for example, near the stick 78a), put an index finger on the front surface of the subunit 76 (for example, near the operation buttons 78d and 78e), and put a middle finger, a ring finger and a little finger on the bottom surface of the subunit 76 so as to hold the subunit 76. It should be understood that the holding state is similar to that described above when the player holds the subunit 76 with a right hand. Thus, the subunit 76 allows the player to easily operate the operation section 78, such as the stick 78a and the operation buttons 78d and 78e, while holding the subunit 76 with one hand.


Here, an exemplary game played using the aforementioned controller 7 will be described. As a first example, a shooting game played using the controller 7 will be described. FIG. 20 is a diagram illustrating an exemplary game image displayed on the monitor 2 when the game apparatus 3 executes the shooting game.


As shown in FIG. 20, a portion of a three-dimensional virtual game space S is displayed on the display screen of the monitor 2. As game objects acting in accordance with an operation of the controller 7, a portion of the player character P and a portion of a gun G held by the player character P are displayed on the display screen. Moreover, the virtual game space S displayed on the display screen represents the field of view in front of the player character P, and in FIG. 20 an opponent character E is displayed as a shooting target, for example. A target cursor T, indicating the position at which the player character P shoots the gun G, is also displayed on the display screen.


In the shooting game having such a game image displayed on the monitor 2, a player operates the core unit 70 with one hand and operates the subunit 76 with the other hand as shown in FIG. 15 so as to play the game. For example, when the player inclines the stick 78a (see FIGS. 8A, 8B, 8C and 9) on the subunit 76, the player character P is moved in the virtual game space S in accordance with the inclining direction. Further, when the player moves his or her hand holding the core unit 70 with respect to the display screen, the target cursor T is moved in accordance with the motion, posture, position and the like of the core unit 70 with respect to the monitor 2 (LED modules 8L and 8R). When the player presses the operation button 72i (shown in FIG. 6) on the core unit 70, the player character P shoots the gun G at the target cursor T.
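A minimal, hypothetical game-loop sketch of this control mapping is shown below; the data layout, value ranges, and movement speed are assumptions and do not represent the actual game program.

```cpp
// Minimal sketch, not the actual game program: mapping operation data to the
// shooting-game actions described above.
struct OperationData {
    float stickX, stickY;     // subunit stick 78a inclination, assumed range -1..1
    float cursorX, cursorY;   // pointing position derived from markers 8L/8R
    bool  fireButton;         // operation button 72i
};

struct GameState {
    float playerX = 0, playerZ = 0;   // player character P position
    float targetX = 0, targetY = 0;   // target cursor T position
    bool  firing  = false;            // whether the gun G is being shot this frame
};

void updateShootingGame(GameState& g, const OperationData& in, float dt) {
    const float moveSpeed = 4.0f;                 // world units per second (assumption)
    g.playerX += in.stickX * moveSpeed * dt;      // stick inclination moves the character
    g.playerZ += in.stickY * moveSpeed * dt;
    g.targetX  = in.cursorX;                      // cursor follows the core unit's pointing
    g.targetY  = in.cursorY;
    g.firing   = in.fireButton;                   // pressing 72i shoots at the cursor
}
```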


That is, while the player uses the stick 78a on the subunit 76 to instruct the player character P to move, the player can operate the core unit 70 as if the core unit 70 were a gun for the shooting game, thereby enhancing enjoyment in playing the shooting game. Because the player performs the operation of moving the player character P and the operation of moving the target cursor T with units held in different hands, the player can perform the two operations independently. For example, since the virtual game space S displayed on the display screen is changed in accordance with the movement of the player character P, it is sometimes difficult to keep the target positioned near a position the player is observing in the virtual game space S, for example because the player may be paying attention to an opponent character E suddenly jumping into the virtual game space S. However, while the player is moving the player character P with one hand (for example, a thumb of a left hand), the player can control, with the other arm (for example, a right arm), the core unit 70 such that its front surface points to the observed position, thereby substantially enhancing flexibility in operating the controller 7 and increasing the reality of the shooting game. Further, although the player moves the controller in order to move the target cursor T, this movement does not hinder the direction instruction operation for moving the player character P, so the player can stably perform the two direction instruction operations. That is, by using the controller 7, the player can freely use his or her left and right hands and can perform a new operation having increased flexibility, which cannot be achieved using a physically single controller.


In a second example, a player inclines the stick 78a on the subunit 76 so as to move the player character P in the virtual game space S in accordance with the inclining direction as in the first example. The player moves a hand holding the core unit 70 with respect to the display screen so as to move a sight point of a virtual camera in accordance with a position of the core unit 70 with respect to the monitor 2 (LED modules 8L and 8R). These operations allow the player to observe a position to which the core unit 70 is pointed in the virtual game space S while operating the stick 78a on the subunit 76 so as to instruct the player character P to move.
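This camera behavior can be sketched, purely for illustration, as easing the virtual camera's sight point toward the pointed-to position each frame; the structures and the smoothing factor below are assumptions.

```cpp
// Illustrative sketch only: ease the virtual camera's sight point toward the
// position the core unit points to. The smoothing factor is an assumption.
struct Camera { float sightX = 0, sightY = 0; };

void updateCameraSight(Camera& cam, float pointedX, float pointedY, float smoothing = 0.2f) {
    cam.sightX += (pointedX - cam.sightX) * smoothing;
    cam.sightY += (pointedY - cam.sightY) * smoothing;
}
```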


In the above description, the controller 7 and the game apparatus 3 are connected to each other by wireless communication. However, the controller 7 and the game apparatus 3 may be electrically connected to each other by a cable. In this case, the cable connected to the core unit 70 is connected to a connection terminal of the game apparatus 3.


Moreover, in the present embodiment, of the core unit 70 and the subunit 76 of the controller 7, only the core unit 70 has the communication section 75. However, the subunit 76 may instead have a communication section for wirelessly transmitting the transmission data to the receiving unit 6. Further, both the core unit 70 and the subunit 76 may have respective communication sections. For example, the respective communication sections included in the core unit 70 and the subunit 76 may each wirelessly transmit the transmission data to the receiving unit 6. Alternatively, the communication section of the subunit 76 may wirelessly transmit the transmission data to the communication section 75 of the core unit 70, and the communication section 75 of the core unit 70 may then wirelessly transmit, to the receiving unit 6, the transmission data received from the subunit 76 together with the transmission data of the core unit 70. In these cases, the connecting cable 79 for electrically connecting the core unit 70 and the subunit 76 to each other can be eliminated.
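The relay variant, in which the subunit 76 transmits to the core unit 70 and the core unit 70 forwards the combined data, can be sketched as follows; the packet field names and sizes are assumptions and not the actual transmission format.

```cpp
// Sketch of the relay variant: the subunit's operation data is merged with the
// core unit's data into one frame before forwarding to the receiving unit 6.
#include <cstdint>

struct SubunitPacket { int8_t stickX, stickY; uint8_t buttons; };
struct CorePacket    { int16_t accelX, accelY, accelZ; uint8_t buttons;
                       uint16_t marker1X, marker1Y, marker2X, marker2Y; };
struct CombinedFrame { CorePacket core; SubunitPacket sub; };

CombinedFrame buildFrame(const CorePacket& core, const SubunitPacket& sub) {
    return CombinedFrame{ core, sub };   // one combined frame per reporting interval
}
```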


In the above description, the receiving unit 6 connected to the connection terminal of the game apparatus 3 is used as a receiving means for receiving transmission data which is wirelessly transmitted from the controller 7. Alternatively, the receiving means may be a receiving module built in the game apparatus 3. In this case, the transmission data received by the receiving module is outputted to the CPU 30 via a predetermined bus.


Although in the present embodiment the imaging information calculation section 74 included in the core unit 70 is described as an example of a determining section for outputting a signal (process result data) in accordance with a motion of the core unit 70 body, the determining section may be provided in another form. For example, the core unit 70 may include the acceleration sensor 701 as described above, or may include a gyro sensor. The acceleration sensor or the gyro sensor can be used to determine a motion or posture of the core unit 70 and can therefore serve as a determining section for outputting a signal in accordance with the motion of the core unit 70 body, based on its detection signal for the motion or posture. In this case, the imaging information calculation section 74 may be eliminated from the core unit 70, or the sensor and the imaging information calculation section may be used in combination.
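For illustration only, a posture (pitch and roll) estimate of the core unit 70 body could be derived from the output of the acceleration sensor 701 when the unit is nearly static, as sketched below; the axis conventions are assumptions, and a gyro sensor could instead integrate angular rate for the same purpose.

```cpp
// Illustrative sketch: when the unit is nearly static, gravity dominates the
// acceleration reading and pitch/roll of the housing can be estimated from it.
#include <cmath>

struct Posture { double pitchRad, rollRad; };

Posture postureFromAcceleration(double ax, double ay, double az) {
    Posture p;
    p.pitchRad = std::atan2(-ax, std::sqrt(ay * ay + az * az));
    p.rollRad  = std::atan2(ay, az);
    return p;   // a gyro sensor could instead supply angular rate for integration
}
```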


Further, although in the present embodiment only the core unit 70 includes the imaging information calculation section 74, the subunit 76 may also include a similar imaging information calculation section.


In the present embodiment, image data taken by the image pickup element 743 is analyzed so as to obtain the position coordinates and the like of an image of the infrared lights from the LED modules 8L and 8R, and the core unit 70 generates process result data from the obtained coordinates and the like and transmits the process result data to the game apparatus 3. However, the core unit 70 may transmit data obtained at another process step to the game apparatus 3. For example, the core unit 70 may transmit, to the game apparatus 3, the image data taken by the image pickup element 743, and the CPU 30 may perform the aforementioned analysis so as to obtain the process result data. In this case, the image processing circuit 744 can be eliminated from the core unit 70. Alternatively, the core unit 70 may transmit, to the game apparatus 3, image data which has been analyzed only partway. For example, the core unit 70 may transmit, to the game apparatus 3, data indicating a brightness, a position, an area size and the like obtained from the image data, and the CPU 30 may perform the remaining analysis so as to obtain the process result data.
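The three payload options discussed above (raw image data, data analyzed partway, and final process result data) could be represented, purely as an illustrative sketch, by a tagged structure such as the following; this layout is an assumption, not the actual data format.

```cpp
// Sketch of the three payload options: full image data, per-spot summaries
// (brightness/position/area), or final process result data.
#include <cstdint>
#include <vector>

struct SpotSummary   { uint8_t brightness; uint16_t x, y, area; };
struct ProcessResult { uint16_t x1, y1, x2, y2; };   // centroid coordinates of 8L and 8R

enum class PayloadKind { RawImage, SpotSummaries, Result };

struct TransmissionPayload {
    PayloadKind kind;
    std::vector<uint8_t>     rawImage;     // used when the console performs all analysis
    std::vector<SpotSummary> spots;        // used when analysis stops partway
    ProcessResult            result{};     // used when the core unit finishes analysis
};
```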


Although in the present embodiment infrared lights from the two LED modules 8L and 8R are used as imaging targets of the imaging information calculation section 74 in the core unit 70, the imaging target is not restricted thereto. For example, infrared light from one LED module or infrared lights from at least three LED modules provided in the vicinity of the monitor 2 may be used as the imaging target of the imaging information calculation section 74. Alternatively, the display screen of the monitor 2 or another emitter (room light or the like) can be used as the imaging target of the imaging information calculation section 74. When the position of the core unit 70 with respect to the display screen is calculated based on the positional relationship between the imaging target and the display screen of the monitor 2, various emitters can be used as the imaging target of the imaging information calculation section 74.


The aforementioned shapes of the core unit 70 and the subunit 76 are merely examples. Further, the shape, the number, the setting position and the like of each of the operation section 72 of the core unit 70 and the operation section 78 of the subunit 76 are merely examples. Needless to say, even when the shape, the number, the setting position and the like of each of the core unit 70, the subunit 76, the operation section 72, and the operation section 78 are different from those described in the embodiment, the present invention can be realized. Further, the imaging information calculation section 74 (the light entrance of the imaging information calculation section 74) of the core unit 70 need not be positioned on the front surface of the housing 71. The imaging information calculation section 74 may be provided on another surface at which light can be received from the exterior of the housing 71.


Thus, the controller of the present invention allows a player to operate both the core unit 70 and the subunit 76 included therein so as to enjoy a game. The core unit 70 has a function of outputting a signal in accordance with a motion of the unit body, including the imaging information calculation section 74 and the acceleration sensor 701. The subunit 76 has a function of outputting a signal in accordance with a direction input operation performed by the player. For example, if a controller into which the core unit 70 and the subunit 76 were integrated were used, the whole controller would have to be moved so as to output a signal in accordance with the motion of the unit body, thereby exerting some influence on the direction input operation. Further, such integration of the core unit 70 and the subunit 76 has the opposite effect as well: the flexibility realized by separating the core unit 70 from the subunit 76 is substantially reduced. Therefore, the core unit 70 and the subunit 76 can be separated into a right unit and a left unit as in the case of a conventional controller for the game apparatus, and at the same time the core unit 70 and the subunit 76 allow the player to freely use his or her right and left hands, thereby providing the player with a new operation which cannot be anticipated with an integrated controller. Further, the controller can be operated with substantially enhanced flexibility, thereby providing the player with a game operation having increased reality.


The game controller and the game system according to the present invention can realize an operation having increased flexibility, and are useful as a game controller which includes two independent units and is operated by a player holding the two independent units, a game system including the game controller, and the like.


While the invention has been described in detail, the foregoing description is in all aspects illustrative and not restrictive. It is understood that numerous other modifications and variations can be devised without departing from the scope of the invention.

Claims
  • 1. A game controller arrangement operable by at least one human player, the game controller arrangement comprising first and second control units, one of the first and second control units, in use, sending operating data obtained from the first control unit and the second control unit to a video game console providing video game play, the game controller arrangement comprising: a first control unit structured to be held in the air by and operated by a first hand, said first control unit including a first control unit body having disposed therein a sensor comprising a linear acceleration sensing arrangement that detects linear acceleration or a gyro-sensor that detects angular rate of rotation; a second control unit structured to be held in the air by and operated by a second hand, said second control unit including an imaging section that detects plural infrared markers and obtains marker coordinate related information therefrom, said second control unit further including at least one digit-operated control; and a wireless connection that wirelessly connects the first control unit with the second control unit, wherein the first control unit further includes a first operating data generator coupled to said sensor that provides first operating data sensed by said sensor, said first operating data in use controlling at least a first aspect of game play provided by said video game console; the second control unit further includes a second operating data generator coupled to said imaging section that generates second operating data in response to the imaging section and position of the at least one digit-operated control, said second operating data in use controlling at least a second aspect of game play provided by said video game console; and one of the first and second control units further includes a wireless transceiver for wirelessly transmitting the first operating data from the first control unit and the second operating data from the second control unit wirelessly to the video game console to thereby at least in part control said video game play.
  • 2. The game controller arrangement according to claim 1, wherein the imaging section includes an image pickup sensor that senses an image along a predetermined direction, and outputs, as the second operating data, data including marker image position coordinate data that is a result of subjecting an image sensed by the image pickup sensor to a predetermined calculation.
  • 3. The game controller arrangement according to claim 2, wherein the second operating data generator further includes a coordinate information calculator calculating coordinate information indicating positional coordinates, in the image sensed by the image pickup sensor, of at least one marker image which is included in the sensed image and is used as an imaging target, when performing the predetermined calculation, and outputting said marker image positional coordinates as the second operating data.
  • 4. The game controller arrangement according to claim 1, wherein the sensor senses linear acceleration operating data, the imaging section provides marker image positional coordinate operating data, and the transceiver transmits to the video game player the linear acceleration operating data and the marker image positional coordinate operating data continually at intervals shorter than 1/60 second.
  • 5. The game controller arrangement according to claim 1, wherein the second control unit comprises a second control unit body, and the second control unit digit operated control includes a thumb-operable stick which has a tip projecting from the second control unit body and is inclinable relative to second control unit body, and outputs data obtained in accordance with an inclining direction of the thumb-operable stick for wireless transmission to the video game console.
  • 6. The game controller arrangement according to claim 1, wherein the second control unit comprises a second control unit body and the second control unit digit-operated control includes an operation button which has operation portions representing at least four directions and which is able to be pushed by a thumb into the second control unit body by the operation portions, and outputs, as the second operating data for wireless transmission to the video game console, data corresponding to the operation portion at which the operation button is pushed.
  • 7. The game controller arrangement according to claim 1, wherein the second control unit has a body and the second operating data generator includes a sliding member which has a top surface exposed from the second control unit body and which is horizontally movable on the second control unit body, and outputs data obtained in accordance with a horizontal moving direction of the sliding member as the second operating data for wireless transmission to the video game console.
  • 8. The game controller arrangement according to claim 1, wherein the second control unit has a body and the second operating data generator includes a touch pad on an outer surface of the second control unit body and outputs, as the second operating data for wireless transmission to the video game console, data obtained in accordance with a position on the touch pad at which the touch pad is touched.
  • 9. The game controller arrangement according to claim 1, wherein the second control unit has a body and the second operating data generator includes at least four operation buttons which are able to be pushed into the second control unit body, and outputs data obtained in accordance with the pushed operation button as the second operating data for wireless transmission to the video game console.
  • 10. A game system comprising the game controller arrangement according to claim 1, and wherein said video game console is wirelessly connected to the game controller arrangement and includes a computer for executing a game program to represent a virtual game world on a display screen, wherein the video game console provides animated game play including the first aspect controlled by the first operating data from the first control unit and the second aspect controlled by the second operating data from the second control unit.
  • 11. The game system according to claim 10, wherein the video game console causes a player character appearing in the virtual game world to perform a first action in accordance with the first operating data and causes the player character appearing in the virtual game world to perform a second action in accordance with the second operating data.
  • 12. For use with a video game playing system comprising (a) a video game console providing video game play and (b) a remote controller configured to be held in the air and operated by a first hand and including an image sensor that detects plural infrared markers disposed in the vicinity of a display screen and obtains marker image position coordinates therefrom and a three-axis linear acceleration sensor that detects linear acceleration to generate first data and wirelessly transmits said first data to the video game console, a further controller comprising: a housing structured to be held in a second hand; a thumbstick disposed on said housing, said thumbstick generating inclination data in response to inclination of said thumbstick by a thumb of said second hand relative to said housing; at least one button disposed on said housing, said button generating button data in response to depression of said button by a finger of said second hand; a further sensor disposed within said housing, said further sensor used to generate posture representing data related to posture of said housing; and a wireless transmitter operatively coupled to said thumbstick, said at least one button and said further sensor, said wireless transmitter configured for sending said thumbstick inclination data, said button data and said posture representing data over the air through said remote controller to said video game console, said remote controller and further controller cooperating to, in use, control a first aspect of video game play provided by said video game console by operating said remote controller with the first hand and controlling an additional aspect of said video game play by operating said further controller with the second hand, wherein said remote controller is configured to, in use, send said further controller's thumbstick inclination data, button data and posture representing data, and to also send said detected linear acceleration and said marker image position coordinates, over the air to said video game console.
  • 13. The video game playing system of claim 12 wherein said further controller is configured so that said remote controller can optionally connect to said further controller.
  • 14. A video game accessory controller wherein at least a first aspect of video game play is, in use, controlled by said accessory controller and at least a second aspect of said video game play is, in use, controlled by another handheld controller, and said accessory controller is configured to communicate with a video game console through said another handheld controller, said video game accessory controller comprising: a housing structured to be held in the air and operated by a single hand of a video game player; a joystick disposed on and projecting from said housing, said joystick generating inclination data in response to manual inclination thereof relative to said housing by a digit of said single hand; at least one button disposed on said housing, said button generating button data in response to depression thereof; a sensor disposed within said housing, said sensor generating data relating to attitude of said housing; and a wireless transceiver operatively coupled to said joystick, said at least one button and said sensor, said wireless transceiver sending said joystick inclination data, said button data and said sensor data over the air to another handheld controller held in the air and operated by a further hand for said another handheld controller to send to a video game console providing video game play.
  • 15. The accessory controller of claim 14 wherein the sensor comprises an image sensor.
  • 16. The accessory controller of claim 14 wherein wireless communication between said accessory controller and said another handheld controller is two-way.
  • 17. For use with a video game console providing video game play and having a remote game controller for being held in the air and operated by a first hand, a further game controller for being held in the air and operated by a second hand, said further game controller comprising: a housing dimensioned and structured to be held in the air and operated by the second hand, said housing comprising a tapered curved upper surface terminating in a forward surface and further comprising a lower surface having a concavity defined therein, a joystick disposed on said housing upper surface, said joystick generating inclination data in response to detected inclination thereof relative to said upper surface by a digit of said second hand; at least one button disposed on said housing forward surface, said button generating button data in response to depression thereof by a digit of said second hand; a further sensor disposed within said housing, said further sensor generating further data related to the posture or attitude of said housing; and a wireless communications device responsive to said joystick, said at least one button and said further sensor, said wireless communications device being configured to, in use, send said joystick inclination data, said button data and said further data wirelessly to said remote game controller for said remote game controller to forward to said video game console, operation of said remote game controller by said first hand, in use, controlling at least one aspect of game play provided by said video game console, operation of said further game controller by said second hand controlling, in use, at least one further aspect of said game play provided by said video game console.
  • 18. The further controller of claim 17 wherein said remote game controller and said further game controller are configured for the same game player to hold in the air in different hands and operate simultaneously.
  • 19. The further controller of claim 17 wherein said housing is dimensioned so that in use a plurality of fingers of said second hand can rest within said concavity while (a) at least a portion of the heel of the second hand is placed in contact with the tapered curved upper surface, (b) the thumb of said second hand can operate the joystick, and (c) the index finger of said second hand can operate the at least one button.
  • 20. The further controller of claim 17 wherein said joystick comprises a thumbstick.
  • 21. The further controller of claim 17 wherein said at least one button comprises plural buttons of different sizes and shapes disposed on said forward surface.
  • 22. The further controller of claim 17 further comprising a cross-switch, an image sensor and a two-way transceiver provided on or in said housing.
  • 23. The further controller of claim 17 wherein said remote game controller also wirelessly receives data from said further game controller.
  • 24. The further controller of claim 17 wherein said wireless communications device comprises a radio transceiver.
  • 25. A handheld remote controller structured to be held in, carried by and operated by a single hand of a video game player, said handheld remote controller comprising: a housing shaped and structured so it can be held in the air by and manipulated by the game player's single hand, said housing defining a surface; an inclinable thumbstick projecting from said housing surface, said inclinable thumbstick outputting inclination data in accordance with the inclining direction of the inclinable thumbstick relative to the housing; at least one button disposed on said housing, said button generating button depression data in response to depression thereof by an index finger of said single hand; a three-axis linear acceleration sensor disposed within said housing, said three-axis linear acceleration sensor detecting linear acceleration of said housing and generating three axes of acceleration data related to posture of said housing; and a wireless transceiver operatively coupled to said inclinable thumbstick, said at least one button and said acceleration sensor, said wireless transceiver sending at least said thumbstick inclination data, said button depression data and said three axes of acceleration data over the air for use by a video game console in generating video game play, wherein, in use, at least a first aspect of said generated video game play can be controlled by said thumbstick inclination data and at least a second aspect of said generated video game play can be controlled by said linear acceleration data.
  • 26. A handheld controller structured to be held in, carried by and operated by a single hand of a video game player, said handheld controller comprising: a housing shaped and structured so the handheld controller can in use be held in, carried by and simultaneously manipulated by the game player's single hand, said housing defining at least one surface; an inclinable thumbstick projecting from said at least one surface of said housing, said inclinable thumbstick detecting inclination of the inclinable thumbstick relative to said at least one surface of said housing as controlled by the thumb of said single hand; plural buttons disposed on said housing, said plural buttons generating button depression data in response to depression thereof by a digit of said single hand; an inertial sensor disposed within said housing, said inertial sensor selected from the group consisting of a linear acceleration sensor and a gyrosensor, said inertial sensor detecting information related to posture of said housing; and a wireless transceiver operatively coupled to said inclinable thumbstick, said plural buttons and said inertial sensor, said wireless transceiver sending said detected thumbstick inclination, said plural button depression data and said detected housing posture related information over the air for use by a video game console in generating video game play, wherein, in use, at least a first aspect of said generated video game play can be controlled by said detected thumbstick inclination and at least a second aspect of said generated video game play can be controlled by said detected housing posture related information.
  • 27. The controller of claim 26 wherein said inertial sensor comprises a three-axis linear accelerometer.
  • 28. The controller of claim 26 wherein said inertial sensor comprises a gyro-sensor capable of directly detecting rotation or angular rate.
  • 29. A game controller arrangement for use with a video game console and operable by the left and right hands of a user, the game controller arrangement comprising first and second control units, the game controller arrangement comprising: a first control unit structured to be held in the air by and operated by a first hand of a user, said first control unit including: an imaging section that detects plural infrared markers and obtains marker coordinate related information therefrom, a linear accelerometer that detects linear acceleration, and a wireless transmitter that wirelessly transmits data to the video game console; and a second control unit having a housing structured to be held in the air by and operated by a second, different hand of said user, said second control unit including: a sensor that generates game-related input by sensing stimuli external to said second control unit; a joystick that generates inclination data in response to inclination thereof relative to said housing by a digit of said second hand; and a wireless transmitter that wirelessly transmits output from the sensor and the digit-operated control to said first control unit.
  • 30. The game controller arrangement of claim 29 wherein said second control unit sensor comprises an image sensor that senses coordinates of plural infrared markers.
Priority Claims (1)
Number Date Country Kind
2005-242926 Aug 2005 JP national
CROSS REFERENCE TO RELATED APPLICATION

The disclosure of Japanese Patent Application No. 2005-242926 is incorporated herein by reference. This application also claims the benefit of Provisional Application No. 60/714,862, filed Sep. 8, 2005, the entire content of which is hereby incorporated by reference in this application.

US Referenced Citations (523)
Number Name Date Kind
3454920 Mehr Jul 1969 A
3474241 Kuipers Oct 1969 A
D220268 Kliewer Mar 1971 S
3660648 Kuipers May 1972 A
3973257 Rowe Aug 1976 A
4009619 Snyman Mar 1977 A
4038876 Morris Aug 1977 A
4166406 Maughmer Sep 1979 A
4287765 Kreft Sep 1981 A
4303978 Shaw et al. Dec 1981 A
4318245 Stowell et al. Mar 1982 A
4321678 Krogmann Mar 1982 A
4337948 Breslow Jul 1982 A
4342985 Desjardins Aug 1982 A
4402250 Baasch Sep 1983 A
4425488 Moskin Jan 1984 A
4443866 Burgiss, Sr. Apr 1984 A
4450325 Luque May 1984 A
4503299 Henrard Mar 1985 A
4514600 Lentz Apr 1985 A
4514798 Lesche Apr 1985 A
4540176 Baer Sep 1985 A
4546551 Franks Oct 1985 A
4558604 Auer Dec 1985 A
4561299 Orlando et al. Dec 1985 A
4578674 Baker et al. Mar 1986 A
4623930 Oshima et al. Nov 1986 A
4672374 Desjardins Jun 1987 A
4739128 Grisham Apr 1988 A
4761540 McGeorge Aug 1988 A
4787051 Olson Nov 1988 A
4816810 Moore Mar 1989 A
4839838 LaBiche et al. Jun 1989 A
4849655 Bennett Jul 1989 A
4851685 Dubgen Jul 1989 A
4914598 Krogmann et al. Apr 1990 A
4918293 McGeorge Apr 1990 A
4957291 Miffitt et al. Sep 1990 A
4961369 McGill Oct 1990 A
4969647 Mical et al. Nov 1990 A
4988981 Zimmerman Jan 1991 A
4994795 MacKenzie Feb 1991 A
5045843 Hansen Sep 1991 A
5059958 Jacobs et al. Oct 1991 A
5062696 Oshima et al. Nov 1991 A
5068645 Drumm Nov 1991 A
D325225 Adhida Apr 1992 S
5124938 Algrain Jun 1992 A
5128671 Thomas, Jr. Jul 1992 A
D328463 King et al. Aug 1992 S
5136222 Yamamoto Aug 1992 A
5138154 Hotelling Aug 1992 A
D331058 Morales Nov 1992 S
5175481 Kanno Dec 1992 A
5178477 Gambaro Jan 1993 A
5181181 Glynn Jan 1993 A
5202844 Kamio et al. Apr 1993 A
D340042 Copper et al. Oct 1993 S
5262777 Low et al. Nov 1993 A
D342256 Payne Dec 1993 S
5280744 DeCarlo et al. Jan 1994 A
5296871 Paley Mar 1994 A
5307325 Scheiber Apr 1994 A
5317394 Hale et al. May 1994 A
5329276 Hirabayashi Jul 1994 A
5332322 Gambaro Jul 1994 A
5339095 Redford Aug 1994 A
D350736 Takahashi et al. Sep 1994 S
D350782 Barr Sep 1994 S
D351430 Barr Oct 1994 S
5357267 Inoue Oct 1994 A
5359321 Ribic Oct 1994 A
5359348 Pilcher et al. Oct 1994 A
5363120 Drumm Nov 1994 A
5369580 Monji et al. Nov 1994 A
5369889 Callaghan Dec 1994 A
5373857 Hirabayashi et al. Dec 1994 A
5396265 Ulrich et al. Mar 1995 A
5421590 Robbins Jun 1995 A
5430435 Hoch et al. Jul 1995 A
D360903 Barr et al. Aug 1995 S
5440326 Quinn Aug 1995 A
5453758 Sato Sep 1995 A
5459489 Redford Oct 1995 A
5469194 Clark et al. Nov 1995 A
5481957 Paley et al. Jan 1996 A
5484355 King, II et al. Jan 1996 A
5485171 Copper et al. Jan 1996 A
5490058 Yamasaki et al. Feb 1996 A
5502486 Ueda et al. Mar 1996 A
5506605 Paley Apr 1996 A
5512892 Corballis et al. Apr 1996 A
5517183 Bozeman, Jr. May 1996 A
5526022 Donahue et al. Jun 1996 A
5528265 Harrison Jun 1996 A
5531443 Cruz Jul 1996 A
5541860 Takei et al. Jul 1996 A
5551701 Bouton et al. Sep 1996 A
5554033 Bizzi Sep 1996 A
5554980 Hashimoto et al. Sep 1996 A
5563628 Stroop Oct 1996 A
D375326 Yokoi et al. Nov 1996 S
5573011 Felsing Nov 1996 A
5574479 Odell Nov 1996 A
5579025 Itoh Nov 1996 A
D376826 Ashida Dec 1996 S
5587558 Matsushima Dec 1996 A
5594465 Poulachon Jan 1997 A
5598187 Ide et al. Jan 1997 A
5602569 Kato Feb 1997 A
5603658 Cohen Feb 1997 A
5605505 Han Feb 1997 A
5606343 Tsuboyama et al. Feb 1997 A
5611731 Bouton et al. Mar 1997 A
5615132 Horton et al. Mar 1997 A
5621459 Ueda et al. Apr 1997 A
5624117 Ohkubo et al. Apr 1997 A
5627565 Morishita et al. May 1997 A
5640152 Copper Jun 1997 A
5645077 Foxlin et al. Jul 1997 A
5645277 Cheng Jul 1997 A
5666138 Culver Sep 1997 A
5667220 Cheng Sep 1997 A
5670845 Grant et al. Sep 1997 A
5670988 Tickle Sep 1997 A
5676673 Ferre et al. Oct 1997 A
5679004 McGowan et al. Oct 1997 A
5682181 Nguyen et al. Oct 1997 A
5698784 Hotelling et al. Dec 1997 A
5701131 Kuga Dec 1997 A
5702305 Norman et al. Dec 1997 A
5703623 Hall et al. Dec 1997 A
5724106 Autry et al. Mar 1998 A
5726675 Inoue Mar 1998 A
5734371 Kaplan Mar 1998 A
5734373 Rosenberg et al. Mar 1998 A
5734807 Sumi Mar 1998 A
5736970 Bozeman, Jr. Apr 1998 A
5739811 Rosenberg et al. Apr 1998 A
5741182 Lipps et al. Apr 1998 A
5742331 Uomori et al. Apr 1998 A
5745226 Gigioli, Jr. Apr 1998 A
5746602 Kikinis May 1998 A
5751273 Cohen May 1998 A
5752880 Gabai et al. May 1998 A
5757354 Kawamura May 1998 A
5757360 Nitta et al. May 1998 A
5764224 Lilja et al. Jun 1998 A
5771038 Wang Jun 1998 A
D397162 Yokoi et al. Aug 1998 S
5794081 Itoh et al. Aug 1998 A
5796354 Cartabiano et al. Aug 1998 A
5805256 Miller Sep 1998 A
5807284 Foxlin Sep 1998 A
5819206 Horton Oct 1998 A
5820462 Yokoi et al. Oct 1998 A
5822713 Profeta Oct 1998 A
5825350 Case, Jr. et al. Oct 1998 A
D400885 Goto Nov 1998 S
5831553 Lenssen et al. Nov 1998 A
5835077 Dao Nov 1998 A
5835156 Blonstein et al. Nov 1998 A
5841409 Ishibashi et al. Nov 1998 A
5847854 Benson, Jr. Dec 1998 A
5850624 Gard et al. Dec 1998 A
5854622 Brannon Dec 1998 A
D405071 Gambaro Feb 1999 S
5867146 Kim et al. Feb 1999 A
5874941 Yamada Feb 1999 A
5875257 Marrin et al. Feb 1999 A
5897437 Nishiumi et al. Apr 1999 A
5898421 Quinn Apr 1999 A
5902968 Sato et al. May 1999 A
5912612 DeVolpi Jun 1999 A
5919149 Allum Jul 1999 A
5923317 Sayler et al. Jul 1999 A
5929782 Stark et al. Jul 1999 A
5947868 Dugan Sep 1999 A
5955713 Titus et al. Sep 1999 A
5955988 Blonstein et al. Sep 1999 A
5956035 Scianmanella et al. Sep 1999 A
5973757 Aubuchon et al. Oct 1999 A
5982352 Pryor Nov 1999 A
5982356 Akiyama Nov 1999 A
5984785 Takeda Nov 1999 A
5986644 Herder Nov 1999 A
5991085 Rallison et al. Nov 1999 A
5999168 Rosenberg et al. Dec 1999 A
6002394 Schein et al. Dec 1999 A
6010406 Kajikawa et al. Jan 2000 A
6011526 Toyoshima et al. Jan 2000 A
6013007 Root et al. Jan 2000 A
6016144 Blonstein et al. Jan 2000 A
6019680 Cheng Feb 2000 A
6020876 Rosenberg et al. Feb 2000 A
6037882 Levy Mar 2000 A
6044297 Sheldon et al. Mar 2000 A
6049823 Hwang Apr 2000 A
6052083 Wilson Apr 2000 A
6057788 Cummings May 2000 A
6058342 Orbach et al. May 2000 A
6059576 Brann May 2000 A
6069594 Barnes et al. May 2000 A
6072467 Walker Jun 2000 A
6072470 Ishigaki Jun 2000 A
6081819 Ogino Jun 2000 A
6084315 Schmitt Jul 2000 A
6084577 Sato et al. Jul 2000 A
6087950 Capan Jul 2000 A
6110039 Oh Aug 2000 A
6115028 Balakrishnan Sep 2000 A
6137457 Tokuhashi et al. Oct 2000 A
6148100 Anderson et al. Nov 2000 A
6155926 Miyamoto et al. Dec 2000 A
6160405 Needle et al. Dec 2000 A
6160540 Fishkin et al. Dec 2000 A
6162191 Foxlin Dec 2000 A
6164808 Shibata et al. Dec 2000 A
6176837 Foxlin Jan 2001 B1
6181329 Stork et al. Jan 2001 B1
6183365 Tonomura et al. Feb 2001 B1
6184862 Leiper Feb 2001 B1
6184863 Silbert et al. Feb 2001 B1
6186896 Takeda et al. Feb 2001 B1
6191774 Schena et al. Feb 2001 B1
6198295 Hill Mar 2001 B1
6198470 Agam et al. Mar 2001 B1
6198471 Cook Mar 2001 B1
6200219 Rudell et al. Mar 2001 B1
6200253 Nishiumi et al. Mar 2001 B1
6201554 Lands Mar 2001 B1
6217450 Meredith Apr 2001 B1
6217478 Vohmann et al. Apr 2001 B1
6225987 Matsuda May 2001 B1
6226534 Aizawa May 2001 B1
6239806 Nishiumi et al. May 2001 B1
6241611 Takeda et al. Jun 2001 B1
6243658 Raby Jun 2001 B1
6244987 Oshuga et al. Jun 2001 B1
6245014 Brainard, II Jun 2001 B1
6264558 Nishiumi et al. Jul 2001 B1
6273819 Strauss et al. Aug 2001 B1
6280327 Leifer et al. Aug 2001 B1
6297751 Fadavi-Ardekani Oct 2001 B1
6301534 McDermott, Jr. et al. Oct 2001 B1
6304250 Yang et al. Oct 2001 B1
6323614 Palazzolo et al. Nov 2001 B1
6323654 Needle et al. Nov 2001 B1
6325718 Nishiumi et al. Dec 2001 B1
6331841 Tokuhashi et al. Dec 2001 B1
6331856 Van Hook et al. Dec 2001 B1
6337954 Soshi et al. Jan 2002 B1
6361507 Foxlin Mar 2002 B1
D456410 Ashida Apr 2002 S
6369794 Sakurai et al. Apr 2002 B1
6375572 Masuyama et al. Apr 2002 B1
6377793 Jenkins Apr 2002 B1
6377906 Rowe Apr 2002 B1
D456854 Ashida May 2002 S
6394904 Stalker May 2002 B1
6400996 Hoffberg et al. Jun 2002 B1
6409687 Foxlin Jun 2002 B1
D459727 Ashida Jul 2002 S
6415223 Lin et al. Jul 2002 B1
6421056 Nishiumi et al. Jul 2002 B1
6424333 Tremblay Jul 2002 B1
6426719 Nagareda et al. Jul 2002 B1
6426741 Goldsmith et al. Jul 2002 B1
D462683 Ashida Sep 2002 S
6452494 Harrison Sep 2002 B1
6456276 Park Sep 2002 B1
D464053 Zicolello Oct 2002 S
6466198 Feinstein Oct 2002 B1
6466831 Shibata et al. Oct 2002 B1
6473070 Mishra et al. Oct 2002 B2
6473713 McCall et al. Oct 2002 B1
6474159 Foxlin et al. Nov 2002 B1
6484080 Breed Nov 2002 B2
6492981 Stork et al. Dec 2002 B1
6518952 Leiper Feb 2003 B1
6538675 Aratani et al. Mar 2003 B2
D473942 Motoki et al. Apr 2003 S
6544124 Ireland et al. Apr 2003 B2
6544126 Sawano et al. Apr 2003 B2
6545661 Goschy et al. Apr 2003 B1
6549191 Leman Apr 2003 B2
6567536 McNitt et al. May 2003 B2
6572108 Bristow Jun 2003 B1
6577350 Proehl et al. Jun 2003 B1
6582299 Matsuyama et al. Jun 2003 B1
6582380 Kazlausky et al. Jun 2003 B2
6585596 Leifer Jul 2003 B1
6590536 Walton Jul 2003 B1
6591677 Rothuff Jul 2003 B2
6597342 Haruta Jul 2003 B1
6597443 Boman Jul 2003 B2
6599194 Smith et al. Jul 2003 B1
6608563 Weston et al. Aug 2003 B2
6609977 Shimizu et al. Aug 2003 B1
6616607 Hashimoto et al. Sep 2003 B2
6628257 Oka et al. Sep 2003 B1
6634949 Briggs et al. Oct 2003 B1
6636826 Abe et al. Oct 2003 B1
6650029 Johnston Nov 2003 B1
6650313 Levine et al. Nov 2003 B2
6650345 Saito et al. Nov 2003 B1
6654001 Su Nov 2003 B1
6672962 Ozaki et al. Jan 2004 B1
6676520 Nishiumi Jan 2004 B2
6677990 Kawahara Jan 2004 B1
6681629 Foxlin et al. Jan 2004 B2
6682351 Abraham-Fuchs et al. Jan 2004 B1
6686954 Kitaguchi et al. Feb 2004 B1
6692170 Abir Feb 2004 B2
6693622 Shahoian et al. Feb 2004 B1
6712692 Basson et al. Mar 2004 B2
6717573 Shahoian et al. Apr 2004 B1
6718280 Hermann Apr 2004 B2
6724366 Crawford Apr 2004 B2
6725173 An et al. Apr 2004 B2
6736009 Schwabe May 2004 B1
6747632 Howard Jun 2004 B2
6747690 Mølgaard Jun 2004 B2
6749432 French et al. Jun 2004 B2
6753849 Curran et al. Jun 2004 B1
6753888 Kamiwada et al. Jun 2004 B2
6757068 Foxlin Jun 2004 B2
6757446 Li et al. Jun 2004 B1
6761637 Weston et al. Jul 2004 B2
6765553 Odamura Jul 2004 B1
6786877 Foxlin Sep 2004 B2
6796177 Mori Sep 2004 B2
6811489 Shimizu et al. Nov 2004 B1
6811491 Levenberg et al. Nov 2004 B1
6813525 Reid et al. Nov 2004 B2
6813584 Zhou et al. Nov 2004 B2
6816151 Dellinger Nov 2004 B2
6836705 Hellmann et al. Dec 2004 B2
6836751 Paxton et al. Dec 2004 B2
6836971 Wan Jan 2005 B1
6842991 Levi et al. Jan 2005 B2
6850221 Tickle Feb 2005 B1
6850844 Walters et al. Feb 2005 B1
6856327 Choi Feb 2005 B2
6868738 Moscrip et al. Mar 2005 B2
6872139 Sato et al. Mar 2005 B2
6873406 Hines et al. Mar 2005 B1
D505424 Ashida et al. May 2005 S
6897845 Ozawa May 2005 B2
6897854 Cho et al. May 2005 B2
6906700 Armstrong Jun 2005 B1
6908388 Shimizu et al. Jun 2005 B2
6922632 Foxlin Jul 2005 B2
6925410 Narayanan Aug 2005 B2
6929543 Ueshima et al. Aug 2005 B1
6929548 Wang Aug 2005 B2
6933923 Feinstein Aug 2005 B2
6954980 Song Oct 2005 B2
6956564 Williams Oct 2005 B1
6967566 Weston et al. Nov 2005 B2
6982697 Wilson et al. Jan 2006 B2
6984208 Zheng Jan 2006 B2
6990639 Wilson Jan 2006 B2
6993206 Ishino Jan 2006 B2
6993451 Chang et al. Jan 2006 B2
6995748 Gordon et al. Feb 2006 B2
6998966 Pederson et al. Feb 2006 B2
7000469 Foxlin et al. Feb 2006 B2
7002591 Leather et al. Feb 2006 B1
7031875 Ellenby et al. Apr 2006 B2
D522011 Hayes et al. May 2006 S
7066781 Weston Jun 2006 B2
7098891 Pryor Aug 2006 B1
7098894 Yang et al. Aug 2006 B2
7102616 Sleator Sep 2006 B1
7107168 Oystol et al. Sep 2006 B2
7126584 Nishiumi et al. Oct 2006 B1
7127370 Kelly et al. Oct 2006 B2
D531585 Weitgasser et al. Nov 2006 S
7133026 Horie et al. Nov 2006 B2
7139983 Kelts Nov 2006 B2
7140962 Okuda et al. Nov 2006 B2
7142191 Idesawa et al. Nov 2006 B2
7145551 Bathiche et al. Dec 2006 B1
7149627 Ockerse et al. Dec 2006 B2
7154475 Crew Dec 2006 B2
7158118 Liberty Jan 2007 B2
7173604 Marvit et al. Feb 2007 B2
7176919 Drebin et al. Feb 2007 B2
7182691 Schena Feb 2007 B1
7183480 Nishitani et al. Feb 2007 B2
7184059 Fouladi et al. Feb 2007 B1
7220220 Stubbs et al. May 2007 B2
7225101 Usuda et al. May 2007 B2
7231063 Naimark et al. Jun 2007 B2
7233316 Smith et al. Jun 2007 B2
7236156 Liberty et al. Jun 2007 B2
7239301 Liberty et al. Jul 2007 B2
7262760 Liberty Aug 2007 B2
D556201 Ashida et al. Nov 2007 S
7292151 Ferguson et al. Nov 2007 B2
7301527 Marvit Nov 2007 B2
7301648 Foxlin Nov 2007 B2
D556760 Ashida et al. Dec 2007 S
D559847 Ashida et al. Jan 2008 S
7335134 LaVelle Feb 2008 B1
D567243 Ashida et al. Apr 2008 S
7351148 Rothschild et al. Apr 2008 B1
RE40324 Crawford May 2008 E
7379566 Hildreth May 2008 B2
7395181 Foxlin Jul 2008 B2
7414611 Liberty Aug 2008 B2
7445550 Barney et al. Nov 2008 B2
7488231 Weston Feb 2009 B2
7500917 Barney et al. Mar 2009 B2
7568289 Burlingham et al. Aug 2009 B2
7582016 Suzuki Sep 2009 B2
7614958 Weston et al. Nov 2009 B2
20010008847 Miyamoto et al. Jul 2001 A1
20010010514 Ishino Aug 2001 A1
20010049302 Hagiwara et al. Dec 2001 A1
20020024500 Howard Feb 2002 A1
20020028071 Mølgaard Mar 2002 A1
20020072418 Masuyama et al. Jun 2002 A1
20020075335 Rekimoto Jun 2002 A1
20020107069 Ishino Aug 2002 A1
20020140745 Ellenby et al. Oct 2002 A1
20020158843 Levine et al. Oct 2002 A1
20030038778 Noguera et al. Feb 2003 A1
20030052860 Park et al. Mar 2003 A1
20030063068 Anton et al. Apr 2003 A1
20030069077 Korienek Apr 2003 A1
20030107551 Dunker Jun 2003 A1
20030144056 Leifer Jul 2003 A1
20030193572 Wilson et al. Oct 2003 A1
20030195041 McCauley Oct 2003 A1
20030204361 Townsend et al. Oct 2003 A1
20030216176 Shimizu et al. Nov 2003 A1
20030222851 Lai et al. Dec 2003 A1
20040028258 Naimark et al. Feb 2004 A1
20040048666 Bagley Mar 2004 A1
20040070564 Dawson Apr 2004 A1
20040075650 Paul et al. Apr 2004 A1
20040095317 Zhang et al. May 2004 A1
20040134341 Sandoz et al. Jul 2004 A1
20040140954 Faeth Jul 2004 A1
20040143413 Oystol et al. Jul 2004 A1
20040152515 Wegmuller et al. Aug 2004 A1
20040193413 Wilson et al. Sep 2004 A1
20040203638 Chan Oct 2004 A1
20040204240 Barney Oct 2004 A1
20040218104 Smith et al. Nov 2004 A1
20040222969 Buchenrieder Nov 2004 A1
20040227725 Calarco et al. Nov 2004 A1
20040229693 Lind et al. Nov 2004 A1
20040239626 Noguera Dec 2004 A1
20040259651 Storek Dec 2004 A1
20040268393 Hunleth et al. Dec 2004 A1
20050020369 Davis et al. Jan 2005 A1
20050047621 Cranfill Mar 2005 A1
20050054457 Eyestone et al. Mar 2005 A1
20050076161 Albanna et al. Apr 2005 A1
20050085298 Woolston Apr 2005 A1
20050107160 Cheng et al. May 2005 A1
20050125826 Hunleth et al. Jun 2005 A1
20050130739 Argentar Jun 2005 A1
20050143173 Barney et al. Jun 2005 A1
20050172734 Alsio Aug 2005 A1
20050174324 Liberty et al. Aug 2005 A1
20050179644 Alsio Aug 2005 A1
20050210419 Kela Sep 2005 A1
20050212749 Marvit Sep 2005 A1
20050212750 Marvit Sep 2005 A1
20050212751 Marvit Sep 2005 A1
20050212752 Marvit Sep 2005 A1
20050212753 Marvit Sep 2005 A1
20050212754 Marvit Sep 2005 A1
20050212755 Marvit Sep 2005 A1
20050212756 Marvit Sep 2005 A1
20050212757 Marvit Sep 2005 A1
20050212758 Marvit Sep 2005 A1
20050212759 Marvit Sep 2005 A1
20050212760 Marvit Sep 2005 A1
20050212764 Toba Sep 2005 A1
20050212767 Marvit et al. Sep 2005 A1
20050215295 Arneson Sep 2005 A1
20050217525 McClure Oct 2005 A1
20050243061 Liberty et al. Nov 2005 A1
20050243062 Liberty Nov 2005 A1
20050253806 Liberty et al. Nov 2005 A1
20050256675 Kurata Nov 2005 A1
20060028446 Liberty et al. Feb 2006 A1
20060030385 Barnet et al. Feb 2006 A1
20060092133 Touma et al. May 2006 A1
20060148563 Yang Jul 2006 A1
20060152487 Grunnet-Jepsen et al. Jul 2006 A1
20060152488 Salsman et al. Jul 2006 A1
20060152489 Sweetser et al. Jul 2006 A1
20060154726 Weston et al. Jul 2006 A1
20060178212 Penzias Aug 2006 A1
20060256081 Zalewski et al. Nov 2006 A1
20060264260 Zalewski et al. Nov 2006 A1
20060282873 Zalewski et al. Dec 2006 A1
20060287086 Zalewski et al. Dec 2006 A1
20060287087 Zalewski et al. Dec 2006 A1
20070049374 Ikeda et al. Mar 2007 A1
20070050597 Ikeda Mar 2007 A1
20070052177 Ikeda et al. Mar 2007 A1
20070060391 Ikeda et al. Mar 2007 A1
20070066394 Ikeda et al. Mar 2007 A1
20070066396 Weston et al. Mar 2007 A1
20070072680 Ikeda Mar 2007 A1
20070252815 Kuo et al. Nov 2007 A1
20070265076 Lin et al. Nov 2007 A1
20080014835 Weston et al. Jan 2008 A1
20080015017 Ashida et al. Jan 2008 A1
20080039202 Sawano et al. Feb 2008 A1
20080273011 Lin Nov 2008 A1
20080278445 Sweetser et al. Nov 2008 A1
20090005166 Sato Jan 2009 A1
20090051653 Barney et al. Feb 2009 A1
20090124165 Weston May 2009 A1
20090156309 Weston et al. Jun 2009 A1
Foreign Referenced Citations (110)
Number Date Country
03930581 Mar 1991 DE
19701344 Jul 1997 DE
19701374 Jul 1997 DE
19648487 Jun 1998 DE
19814254 Oct 1998 DE
19937307 Feb 2000 DE
10029173 Jan 2002 DE
10241392 May 2003 DE
10219198 Nov 2003 DE
0835676 Apr 1998 EP
0848226 Jun 1998 EP
0852961 Jul 1998 EP
1062994 Dec 2000 EP
1279425 Jan 2003 EP
1293237 Mar 2003 EP
1 524 334 Sep 1978 GB
1524334 Sep 1978 GB
2 244 546 May 1990 GB
2284478 Jun 1995 GB
2307133 May 1997 GB
2316482 May 1998 GB
2319374 May 1998 GB
60-077231 May 1985 JP
62-14527 Jan 1987 JP
H3-74434 Jul 1991 JP
3-059619 Nov 1991 JP
U-H05-56191 Jul 1993 JP
2901476 Dec 1993 JP
06-050758 Feb 1994 JP
3262677 May 1994 JP
06-154422 Jun 1994 JP
A-H06-198075 Jul 1994 JP
3194841 Oct 1994 JP
6-308879 Nov 1994 JP
3273531 Nov 1994 JP
7-028591 Jan 1995 JP
3228845 Jan 1995 JP
7044315 Feb 1995 JP
7-22312 May 1995 JP
7-146123 Jun 1995 JP
3517482 Jun 1995 JP
7-200142 Aug 1995 JP
7-302148 Nov 1995 JP
7-318332 Dec 1995 JP
A-H08-71252 Mar 1996 JP
8-095704 Apr 1996 JP
8-106352 Apr 1996 JP
08-111144 Apr 1996 JP
A-H08-111144 Apr 1996 JP
8-114415 May 1996 JP
8-122070 May 1996 JP
8-152959 Jun 1996 JP
8-211993 Aug 1996 JP
8-335136 Dec 1996 JP
9-230997 Sep 1997 JP
9-274534 Oct 1997 JP
9-319510 Dec 1997 JP
1033831 Feb 1998 JP
10-099542 Apr 1998 JP
10-154038 Jun 1998 JP
10-254614 Sep 1998 JP
11-114223 Apr 1999 JP
A-H11-114223 Apr 1999 JP
11-506857 Jun 1999 JP
2000-270237 Sep 2000 JP
2000-308756 Nov 2000 JP
U3078268 Nov 2000 JP
U-3078268 Apr 2001 JP
2001-175412 Jun 2001 JP
A-2003-140823 Nov 2001 JP
2002-062981 Feb 2002 JP
2002233665 Feb 2002 JP
2002-091692 Mar 2002 JP
2002-153673 May 2002 JP
2002-298145 Oct 2002 JP
2003-53038 Feb 2003 JP
2003-140823 May 2003 JP
3422383 Jun 2003 JP
2003208263 Jul 2003 JP
2003-236246 Aug 2003 JP
2003-325974 Nov 2003 JP
2004-062774 Feb 2004 JP
2004-313492 Nov 2004 JP
2005021458 Jan 2005 JP
2005-040493 Feb 2005 JP
2005-063230 Mar 2005 JP
2006-113019 Apr 2006 JP
9300171 Aug 1994 NL
2125853 Feb 1999 RU
2126161 Feb 1999 RU
2141738 Nov 1999 RU
WO94 02931 Feb 1994 WO
WO9605766 Feb 1996 WO
WO9709101 Mar 1997 WO
WO 9712337 Apr 1997 WO
WO9728864 Aug 1997 WO
WO 9811528 Mar 1998 WO
WO0033168 Jun 2000 WO
WO 0035345 Jun 2000 WO
WO 0047108 Aug 2000 WO
WO 0063874 Oct 2000 WO
WO 0187426 Nov 2001 WO
WO 0191042 Nov 2001 WO
WO 0217054 Feb 2002 WO
WO02 34345 May 2002 WO
WO 03015005 Feb 2003 WO
WO 03107260 Jun 2003 WO
WO 03088147 Oct 2003 WO
WO 2004039055 May 2004 WO
WO2004-051391 Jun 2004 WO
Non-Patent Literature Citations (436)
Entry
“ASCII Grip One Handed Controller” One Switch-ASCII Grip One Handed Playstation Controller http://www.oneswitch.org.uk/1/ascii/grip.htm Jul. 11, 2008 pp. 1-2.
“Superfamicom Grip controller by ASCII” http://superfami.com/sfc—grip.html Jul. 10, 2008 pp. 1-2.
“ASCII/Sammy Grip V2” One Switch-Accessible Gaming Shop-ASCII Grip V2 http://www.oneswitch.org.uk/1/AGS/AGS-onehand/ascii-grip-v2.html Jul. 10, 2008 pp. 1-2.
Photographs of prior art ASCII Grip V2 Controller (cited in previous IDS as: ASCII/Sammy Grip V2 One Switch-Accessible Gaming Shop-ASCII Grip V2 http://www.oneswitch.org.uk/1/AGS/AGS-onehand/ascii-grip-v2.html Jul. 10, 2008 pp. 1-2.).
Kennedy P.J. “Hand-Held Data Input Device” IBM Technical Disclosure Bulletin vol. 26 No. 11 Apr. 1984 pp. 5826-5827.
“Controllers Atari Space Age Joystick” AtariAge: Have You Played Atari Today? www.atariage.com/controller—page.html?SystemID=2600&ControllerID=12.
“Controllers-Booster Grip” AtariAge: Have You Played Atari Today? www.atariage.com/controller—page.html?SystemID=2600&ControllerID=18.
“Coleco Vision: Super Action™ Controller Set” www.vintagecomputing.com/wp-content/images/retroscan/coleco—sac—1—large.jpg.
Electro-Plankton Weblog http://www.tranism.com/weblog/2005/09/ “This is the Revolution Nintendo Style” Sep. 15, 2005 2 pgs.
“ASCII Grip” One-Handed Controller the Ultimate One-Handed Controller Designed for the Playstation Game Console (ASCII Entertainment 1997).
“Game Controller” Wikipedia, 7 pages, http://en.wikipedia.org/w/index.php?title=Game—controller&oldid=21390758. (Aug. 19, 2005).
Dichtburn, “Camera in Direct3D” Toymaker, 5 pages, http://web.archive.org/web/20050206032104/http://toymaker.info/games/html/camera.html. (Mar. 5, 2005).
Wilson, Andy, “XWand: UI for Intelligent Environments”, http://research.microsoft.com/en-us/um/people/awilson/wand/default.htm (last edit Apr. 26, 2004).
Wilson, Andrew, XWand: UI for Intelligent Spaces, CHI 2003, Ft. Lauderdale, FL, US, ACM (Apr. 5-10, 2003).
Office Action issued in corresponding Japanese patent application 2007-203785 (Oct. 27, 2008).
Odell, “An Optical Pointer for Infrared Remote Controllers,” Proceedings of International Conference on Consumer Electronics (1995).
Odell, Transcript of Testimony, Investigation No. 337-TA-658, Before the United States International Trade Commission, vol. IV, public session (May 14, 2009).
Selectech, Selectech AirMouse Devices (image) (1991).
Selectech, “Selectech AirMouse Remote Controls, Model # AM-R1,” photographs (1991).
Selectech, “Airmouse Remote Control System Model AM-1 User's Guide,” Colchester, VT (Sep. 24, 1991).
Selectech, Facsimile Transmission from Rossner to Monastiero, Airmouse Remote Controls, Colchester, VT (Mar. 25, 1992).
Selectech, “Changing Driver Versions on CDTV/AMIGA” (Oct. 17, 1991).
Selectech, “AirMouse Remote Controls, AirMouse Remote Control Warranty” (1991).
Selectech, Software, “AirMouse for DOS and Windows IBM & Compatibles,” “AirMouse Remote Control B0100EN-C, Amiga Driver, CDTV Driver, Version: 1.00,” “AirMouse Remote Control B0100EM-C.1, Apple Macintosh Serial Driver Version: 1.00 (1.01B),” “AirMouse Remote Control B0100EL-B/3.05 DOS Driver Version: 3.0, Windows Driver Version 1.00,” “AirMouse Remote Control MS-DOS Driver Version: 3.00/3.05, Windows 3.0 Driver Version: 1.00” (1991).
Wilson, “Wireless User Interface Devices for Connected Intelligent Environments,” Ubicomp 2003 Workshop (2003).
Wilson, “WorldCursor: Pointing in Intelligent Environments with a Tele-operated Laser Pointer,” UIST '03 Companion (Nov. 2003).
Wilson, Research page, biography available at http://research.microsoft.com/en-us/um/people/awilson/?0sr=a, Microsoft Corp. (2009).
Wilson, XWand video, http://research.microsoft.com/˜awilson/wand/wand%20video%20768k.WMV (Mar. 2002).
Wilson, Transcript of Testimony, Investigation No. 337-TA-658, Before the United States International Trade Commission, vol. V (May 15, 2009).
Office Action mailed in applicants' copending U.S. Appl. No. 11/764,409 (Feb. 20, 2009).
Titterton et al., “Strapdown Inertial Navigation Technology,” 2nd ed., Institution of Electrical Engineers (2004).
Acar, “Robust Micromachined Vibratory Gyroscopes” Dissertation (Dec. 2004).
Acar, et al., “Experimental evaluation and comparative analysis of commercial-variable-capacitance MEMS accelerometers,” Journal of Micromechanics and Microengineering, vol. 13 (1), pp. 634-645 (May 2003).
Agard, Agard, “Advances in Strapdown Inertial Systems,” Lecture Series Advisory Group for Aerospace Research and Development Neuilly-Sur-Seine (France) (1984).
Albrecht, “An Adaptive Digital Filter to Predict Pilot Head Look Direction for Helmet-mounted Displays,” MS Thesis University of Dayton (1989).
Algrain, “Estimation of 3-D Angular Motion Using Gyroscopes and Linear Accelerometers,” IEEE Transactions on Aerospace and Electronic Systems, vol. 27, No. 6, pp. 910-920 (Nov. 1991).
Algrain, et al., “Accelerometer Based Line-of-Sight Stabilization Approach for Pointing and Tracking System,” Second IEEE Conference on Control Applications, vol. 1 , Issue 13-16 pp. 159-163 (Sep. 1993).
Algrain, et al., “Interlaced Kalman Filtering of 3-D Angular Motion Based on Euler's Nonlinear Equations,” IEEE Transactions on Aerospace and Electronic Systems, vol. 30, No. 1 (Jan. 1994).
Allen, et al., “A General Method for Comparing the Expected Performance of Tracking and Motion Capture Systems,” {VRST} '05: Proceedings of the ACM symposium on Virtual reality software and technology, pp. 201-210 (Nov. 2005).
Allen, et al., “Tracking: Beyond 15 Minutes of Thought,” SIGGRAPH 2001 Course 11 (Course Pack) from Computer Graphics (2001).
Alves, “Extended Kalman filtering applied to a full accelerometer strapdown inertial measurement unit,” M.S. Thesis Massachusetts Institute of Technology. Dept. of Aeronautics and Astronautics, Santiago (1992).
Analog Devices “ADXL50 Single Axis Accelerometer” (Data Sheet), http://www.analog.com/en/obsolete/adxl50/products/product.html (Mar. 1996).
Analog Devices “ADXL202E Low-Cost ±2 g Dual-Axis Accelerometer with Duty Cycle Output” (Data Sheet), Rev. A (2000).
Analog Devices “ADXL330 Small, Low Power, 3-Axis ±2 g iMEMS Accelerometer” (Data Sheet), Rev. PrA (2005).
Analog Devices “ADXRS150 ±150°/s Single Chip Yaw Rate Gyro with Signal Conditioning” (Data Sheet), Rev. B (2004).
Analog Devices “ADXRS401 ±75°/s Single Chip Yaw Rate Gyro with Signal Conditioning” (Data Sheet), Rev. O (2004).
Ang, et al., “Design and Implementation of Active Error Canceling in Hand-held Microsurgical Instrument,” Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, vol. 2, (Oct. 2001).
Ang, et al., “Design of All-Accelerometer Inertial Measurement Unit for Tremor Sensing in Hand-held Microsurgical Instrument,” Proceedings of the 2003 IEEE International Conference on Robotics & Automation (Sep. 2003).
Apostolyuk, Vladislav, “Theory and design of micromechanical vibratory gyroscopes,” MEMS/NEMS Handbook, Springer, 2006, vol. 1, pp. 173-195 (2006).
Arcanatech, IMP (Photos) (1994).
Arcanatech, “IMP User's Guide” (1994).
Ascension Technology, The Bird 6D Input Devices (specification) (1998).
Ator, “Image-Velocity with Parallel-Slit Reticles,” Journal of the Optical Society of America (Dec. 1963).
Azarbayejani, et al, “Real-Time 3-D Tracking of the Human Body,” Proceedings of IMAGE'COM 96 (1996).
Azarbayejani, et al., “Visually Controlled Graphics,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 15, No. 6, pp. 602-605 (Jun. 1993).
Azuma et al., “Improving Static and Dynamic Registration in an Optical See-Through HMD,” International Conference on Computer Graphics and Interactive Techniques Proceedings of the 21st annual conference on Computer graphics and interactive techniques, pp. 197-204 (1994).
Azuma et al., “Making Augmented Reality Work Outdoors Requires Hybrid Tracking,” Proceedings of the International Workshop on Augmented Reality, San Francisco, CA, Nov. 1, 1998, Bellevue, Washington, pp. 219-224 (1999).
Azuma, “Predictive Tracking for Augmented Reality,” Ph.D. Dissertation, University of North Carolina at Chapel Hill (1995).
Azuma, et al., “A Frequency-Domain Analysis of Head-Motion Prediction,” Proceedings of SIGGRAPH '94, pp. 401-408 (1995).
Azuma, et al., “A motion-stabilized outdoor augmented reality system,” Proceedings of IEEE Virtual Reality '99, Houston, TX (Mar. 1999).
Bachmann et al., “Inertial and Magnetic Posture Tracking for Inserting Humans into Networked Virtual Environments,” Virtual Reality Software and Technology archive, Proceedings of the ACM Symposium on Virtual Reality Software and Technology, Banff, Alberta, Canada, pp. 9-16 (2001).
Bachmann et al., “Orientation Tracking for Humans and Robots Using Inertial Sensors” (CIRA '99), Naval Postgraduate School, Monterey, CA (1999).
Bachmann, “Inertial and Magnetic Angle Tracking of Limb Segments for Inserting Humans into Synthetic Environments,” Dissertation, Naval Postgraduate School, Monterey, CA (Dec. 2000).
Baker et al., “Active Multimodal Control of a Floppy Telescope Structure,” Proc. SPIE, vol. 4825, 74 (Mar. 2003).
Balakrishnan, “The Rockin' Mouse: Integral 3D Manipulation on a Plane,” (CHI '97), Univ. Toronto, (1997).
Ballagas, et al., “iStuff: A Physical User Interface Toolkit for Ubiquitous Computer Environments,” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, vol. 5, No. 1, at 537-44 (ACM) (Apr. 5-10, 2003).
Baraff, “An Introduction to Physically Based Modeling,” SIGGRAPH 97 Course Notes (1997).
Baudisch, et al., “Soap: a pointing device that works in mid-air” Proc. UIST (2006).
BBN Report, “Virtual Environment Technology for Training (VETT),” The Virtual Environment and Teleoperator Research Consortium (VETREC) (Mar. 1992).
Behringer, “Improving Registration Precision Through Visual Horizon Silhouette Matching,” Proceedings of the International Workshop on Augmented Reality: Placing Artificial Objects in Real Scenes, Bellevue, Washington, United States, pp. 225-232 (1999).
Behringer, “Registration for Outdoor Augmented Reality Applications Using Computer Vision Techniques and Hybrid Sensors,” Virtual Reality, 1999 Proceedings, IEEE Computer Society, pp. 244-261 (1999).
BEI, “BEI Gyrochip™ Model QRS11 Data Sheet,” BEI Systron Donner Inertial Division, BEI Technologies, Inc. (Sep. 1998).
BEI Systron Donner Inertial Division, Gyrochip Theory of Operation (2001).
Benbasat, “An Inertial Measurement Unit for User Interfaces,” Massachusetts Institute of Technology Dissertation, (Sep. 2000).
Benbasat, et al., “An Inertial Measurement Framework for Gesture Recognition and Applications,” Gesture and Sign Language in Human-Computer Interaction, International Gesture Workshop, GW 2001, London, UK, 2001 Proceedings, LNAI 2298, at 9-20, I. Wachsmuth and T. Sowa (eds.), Springer-Verlag Berlin Heidelberg (2001, 2002).
Beuter, A., Publications, University of Quebec at Montreal, http://www.er.uqam.ca/nobel/r11040/publicat.htm (Aug. 2007).
BGM-109 Tomahawk, http://en.wikipedia.org/wiki/BGM-109_Tomahawk, Wikipedia (Jan. 2009).
Bhatnagar, “Position trackers for Head Mounted Display systems: A survey” (Technical Report), University of North Carolina at Chapel Hill (Mar. 1993).
Bianchi, “A Tailless Mouse, New cordless Computer Mouse Invented by ArcanaTech, Inc.,” Article (Jun. 1992).
Bishop, “The Self-Tracker: A Smart Optical Sensor on Silicon,” Ph.D. Dissertation, Univ. of North Carolina at Chapel Hill (1984).
Bishop, et al., “Grids Progress Meeting” (Slides), University of North Carolina at Chapel Hill, NC (1998).
Bishop, et al., Self-Tracker: Tracking for Hybrid Environments without Infrastructure (1996).
Bona, et al., “Optimum Reset of Ship's Inertial Navigation System,” IEEE Transactions on Aerospace and Electronic Systems (1965).
Borenstein, et al., “Where am I? Sensors and Methods for Mobile Robot Positioning” (1996).
Boser, “3-Axis Accelerometer with Differential Sense Electronics,” http://www.eecs.berkeley.edu/˜boser/pdf/3axis.pdf (1997).
Boser, “Accelerometer Design Example: Analog Devices XL-05/5,” http://www.eecs.berkeley.edu/˜boser/pdf/xl05.pdf (1996).
Bowman et al., 3D User Interfaces: Theory and Practice, Addison-Wesley, Inc. (2005).
Bowman, et al., “An Introduction to 3-D User Interface Design,” MIT Presence, vol. 10, No. 1, pp. 96-108 (2001).
Britton et al., “Making Nested Rotations Convenient for the User,” ACM SIGGRAPH Computer Graphics, vol. 12, Issue 3, pp. 222-227 (Aug. 1978).
Britton, “A Methodology for the Ergonomic Design of Interactive Computer Graphic Systems, and its Application to Crystallography” (UNC Thesis) (1977).
Business Wire, “Feature/Virtual reality glasses that interface to Sega channel,” Time Warner, TCI: project announced concurrent with COMDEX (Nov. 1994).
Business Wire, “Free-space ‘Tilt’ Game Controller for Sony Playstation Uses Scenix Chip; SX Series IC Processes Spatial Data in Real Time for On-Screen” (Dec. 1999).
Business Wire, “InterSense Inc. Launches InertiaCube2—The World's Smallest Precision Orientation Sensor With Serial Interface” (Aug. 14, 2001).
Business Wire, “Logitech MAGELLAN 3D Controller,” Logitech (Apr. 1997).
Business Wire, “Mind Path Introduces GyroPoint RF Wireless Remote” (Jan. 2000).
Business Wire, “Pegasus' Wireless PenCell Writes on Thin Air with ART's Handwriting Recognition Solutions,” Business Editors/High Tech Writers Telecom Israel 2000 Hall 29, Booth 19-20 (Nov. 2000).
Business Wire, “RPI ships low-cost pro HMD Plus 3D Mouse and VR PC graphics card system for CES” (Jan. 1995).
Buxton, Bill, “Human input/output devices,” in M. Katz (ed.), Technology Forecast: 1995, Menlo Park, C.A.: Price Waterhouse World Firm Technology Center, 49-65 (1994).
Buxton, Bill, A Directory of Sources for Input Technologies, http://www.billbuxton.com/InputSources.html, Apr. 2001 (last update 2008).
Byte, “Imp Coexists With Your Mouse,” What's New, ArcanaTech (Jan. 1994).
Canaday, R67-26 “The Lincoln Wand,” IEEE Transactions on Electronic Computers, vol. EC-16, No. 2, p. 240 (Apr. 1967).
Caruso et al., “New Perspective on Magnetic Field Sensing,” Sensors Magazine (Dec. 1998).
Caruso et al., “Vehicle Detection and Compass Applications using AMR Magnetic Sensors,” Honeywell (May 1999).
Caruso, “Application of Magnetoresistive Sensors in Navigation Systems,” Sensors and Actuators, SAE SP-1220, pp. 15-21 (Feb. 1997).
Caruso, “Applications of Magnetic Sensors for Low Cost Compass Systems,” Honeywell, SSEC, http://www.ssec.honeywell.com/magnetic/datasheets/lowcost.pdf (May 1999).
Chatfield, “Fundamentals of High Accuracy Inertial Navigation,” vol. 174 Progress in Astronautics and Aeronautics, American Institute of Aeronautics and Astronautics, Inc. (1997).
Cheng, “Direct interaction with large-scale display systems using infrared laser tracking devices,” ACM International Conference Proceeding Series; vol. 142 (2003).
Cho, et al., “Magic Wand: A Hand-Drawn Gesture Input Device in 3-D Space with Inertial Sensors,” Proceedings of the 9th Intl Workshop on Frontiers in Handwriting Recognition (IWFHR-9 2004), IEEE (2004).
Computergram, “RPI Entertainment Pods Improve Virtual Experience” (1995).
Cookbook, Numerical Recipes Electronic Edition, http://www.library.cornell.edu/nr/cbookcpdf.html.
Cooke, et al., “NPSNET: flight simulation dynamic modeling using quaternions,” Presence, vol. 1, No. 4, pp. 404-420, MIT Press (1992/1994).
CSIDC Winners—Tablet-PC Classroom System Wins Design Competition, IEEE Computer Society Press, vol. 36 , Issue 8, pp. 15-18 , IEEE Computer Society (Aug. 2003).
Cutrone, “Hot products: Gyration GyroPoint Desk, GyroPoint Pro gyroscope-controlled wired and wireless mice” (Computer Reseller News) (Dec. 1995).
Cutts, “A Hybrid Image/Inertial System for Wide-Area Tracking” (Internal to UNC-CH Computer Science) (Jun. 1999).
Deruyck, et al., “An Electromagnetic Position Sensor,” Polhemus Navigation Sciences, Inc., Burlington, VT (Nov. 1973).
Donelson, et al., “Spatial Management of Information” (1978).
Eißele, “Orientation as an additional User Interface in Mixed-Reality Environments,” 1. Workshop Erweiterte und Virtuelle Realität, pp. 79-90, GI-Fachgruppe AR/VR (2007).
Emura, et al., “Sensor Fusion Based Measurement of Human Head Motion,” 3rd IEEE International Workshop on Robot and Human Communication (Jul. 1994).
Ferrin, “Survey of Helmet Tracking Technologies,” Proc. SPIE vol. 1456, p. 86-94 (Apr. 1991).
Foxlin et al., “An Inertial Head-Orientation Tracker with Automatic Drift Compensation for Use with HMD's,” Proceedings of the conference on Virtual reality software and technology, Singapore, Singapore, pp. 159-173 (1994).
Foxlin et al., “Miniature 6-DOF Inertial System for Tracking HMDs,” SPIE vol. 3362 (Apr. 1998).
Foxlin et al., “Miniaturization, Calibration & Accuracy Evaluation of a Hybrid Self-Tracker,” The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, pp. 151-160 (2003).
Foxlin et al., “WearTrack: A Self-Referenced Head and Hand Tracker for Wearable Computers and Portable VR,” International Symposium on Wearable Computers (ISWC 2000), Oct. 16-18, 2000, Atlanta, GA (2000).
Foxlin, “FlightTracker: A Novel Optical/Inertial Tracker for Cockpit Enhanced Vision,” Symposium on Mixed and Augmented Reality, Proceedings of the 3rd IEEE/ACM International Symposium on Mixed and Augmented Reality, pp. 212-221 (Nov. 2004).
Foxlin, “Generalized architecture for simultaneous localization, auto-calibration, and map-building,” IEEE/RSJ Conf. on Intelligent Robots and Systems, Lausanne, Switzerland (Oct. 2002).
Foxlin, “Head-tracking Relative to a Moving Vehicle or Simulator Platform Using Differential Inertial Sensors,” InterSense, Inc., Presented: Helmet and Head-Mounted Displays V, SPIE vol. 4021, AeroSense Symposium, Orlando, FL, Apr. 24-25, 2000 (2000).
Foxlin, “Inertial Head Tracker Sensor Fusion by a Complementary Separate-bias Kalman Filter,” Proceedings of the IEEE 1996 Virtual Reality Annual International Symposium, pp. 185-194, 267 (1996).
Foxlin, “Inertial Head-Tracking,” MS Thesis, Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science (Sep. 1993).
Foxlin, “Motion Tracking Requirements and Technologies,” Chapter 7, from Handbook of Virtual Environment Technology, Stanney Kay, Ed. (2002).
Foxlin, “Pedestrian Tracking with Shoe-Mounted Inertial Sensors,” IEEE Computer Graphics and Applications, vol. 25, No. 6, pp. 38-46 (Nov. 2005).
Foxlin, et al., “Constellation: A Wide-Range Wireless Motion-Tracking System for Augmented Reality and Virtual Set Applications,” ACM SIGGRAPH, pp. 372-378 (1998).
Foxlin, et al., “VIS-Tracker: A Wearable Vision-Inertial Self-Tracker,” IEEE Computer Society (2003).
Freiburg Center for Data Analysis and Modeling—Publications, http://www.fdm.uni-freiburg.de/cms/puplications/publications/ (Aug. 2007).
Friedmann, et al., “Device Synchronization Using an Optimal Linear Filter,” SI3D '92: Proceedings of the 1992 symposium on Interactive 3D graphics, pp. 57-62 (1992).
Friedmann, et al., “Synchronization in virtual realities,” MIT Presence, vol. 1, No. 1, pp. 139-144 (1992).
Fröhlich, “The Yo Yo: An interaction device combining elastic and isotonic control,” at http://www.uni-weimar.de/cms/medien/vr/research/hci/3d-handheld-interaction/the-yoyo-a-handheld-device-combining-elastic-and-isotonic-input.html (2003).
Green, et al., “ADI's iMEMS Angular Rate Sensing Gyroscope,” Analog Dialogue (Jan. 2003).
Grimm et al., “Real-Time Hybrid Pose Estimation from Vision and Inertial Data,” Proceedings, First Canadian Conference on Computer and Robot Vision, pp. 480-486 (2004).
Gyration Inc., “The Magic Inside GyroPoint”.
Gyration, “Gyration GP110 Ultra Cordless Optical Mouse Data Sheet,” http://www.gyration.com/descriptions/document/GP110-SPEC-EN.pdf (2002).
Gyration, “Gyration GP110 Ultra Cordless Optical Mouse User Manual,” http://www.gyration.com/descriptions/document/GP110-MANUAL-EN.pdf (2002).
Gyration, “Gyration Ultra Cordless Optical Mouse,” photos (2002).
Gyration, “Gyration MicroGyro 100 Developer Kit Data Sheet,” http://web.archive.org/web/19980708122611/www.gyration.com/html/devkit.html (Jul. 1998).
Hamilton Institute, http://www.dcs.gla.ac.uk/~rod/, R. Murray-Smith (Aug. 2007).
Harada, et al., “Portable Absolute Orientation Estimation Device with Wireless Network under Accelerated Situation,” Proceedings, 2004 IEEE International Conference on Robotics and Automation, vol. 2, pp. 1412-1417 (Apr. 26-May 1, 2004).
Harada, et al., “Portable orientation estimation device based on accelerometers, magnetometers and gyroscope sensors for sensor network,” Proceedings of IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, MFI2003, pp. 191-196 (Jul. 2003).
Haykin, et al., “Adaptive Tracking of Linear Time-Variant Systems by Extended RLS Algorithms,” IEEE Transactions on Signal Processing, vol. 45, No. 5 (May 1997).
Heath, “Virtual Reality Resource Guide AI Expert,” v9 n5 p. 32(14) (May 1994).
Hinckley et al., “The VideoMouse: A Camera-Based Multi-Degree-of-Freedom Input Device” A59, ACM UIST'99 Symposium on User Interface Software & Technology, CHI Letters 1 (1), pp. 103-112. (Jan. 1999).
Hinckley, “Synchronous Gestures for Multiple Persons and Computers”, CHI Letters vol. 5 No. 2 (ACM 2003) & Proceedings of the 16th Annual ACM UIST 2003 Symposium on User Interface Software & Technology, at 149-58 (UIST'03 Vancouver BC Canada) (ACM) (Nov. 2003).
Hinckley, et al., “Sensing Techniques for Mobile Interaction,” Proceedings of the 13th Annual ACM Symposium on User Interface Software and Technology (San Diego, Cal.), ACM UIST 2000 & Technology, CHI Letters 2 (2), at 91-100 (ACM) (2000).
Hinckley, et al., “A Survey of Design Issues in Spatial Input,” Proceedings of the ACM Symposium on User Interface Software and Technology (1994).
Hogue, “Marvin: A Mobile Automatic Realtime Visual and INertial tracking system,” Master's Thesis, York University (2003).
Hogue, et al., “An optical-inertial tracking system for fully-enclosed VR displays,” Proceedings of the 1st Canadian Conference on Computer and Robot Vision, pp. 22-29 (May 2004).
Hollands, Robin, “Sourceless Trackers,” VR News (Apr. 1995).
Holloway, Richard Lee, “Registration Errors in Augmented Reality Systems,” Ph.D. Dissertation, University of North Carolina at Chapel Hill (1995).
Hudson Soft, “Brochure of Toukon Road Brave Warrior, Brave Spirits” (1998).
Inman, “Cheap sensors could capture your every move,” http://technology.newscientist.com/article/dn12963-cheap-sensors-could-capture-your-every-move.html (Nov. 2007).
InterSense, “InterSense InertiaCube2 Devices,” (Specification) (image) (2001).
InterSense, “InterSense InertiaCube2 Manual for Serial Port Model” (2001).
InterSense, “InterSense IS-1200 FlightTracker Prototype Demonstration” (Video) (Nov. 2004).
InterSense, “InterSense IS-1200 InertiaHawk Datasheet” (2009).
InterSense, “InterSense IS-1200 VisTracker Datasheet” (2007).
InterSense, “InterSense IS-1200 VisTracker Devices,” (image) (2007).
InterSense, “InterSense IS-900 MicroTraxTM Datasheet” (2007).
InterSense, “InterSense IS-900 Systems Datasheet” (2007).
InterSense, “InterSense MicroTrax Demo Reel,” http://www.youtube.com/watch?v=O2F4fu_CISo (2007).
InterSense, “InterSense Mobile Mixed Reality Demonstration” (Video), http://www.youtube.com/watch?v=daVdzGK0nUE&feature=channel_page (Oct. 2006).
InterSense, “InterSense Motion Gaming Tech Demo,” http://www.youtube.com/watch?v=7-3y5tdju4E, InterSense (Mar. 2008).
InterSense, “IS-1200 VisTracker Augmented Maintenance Demonstration” (Video), http://www.intersense.com/IS-1200 Systems.aspx, http://www.youtube.com/watch?v=IM178s91WQo&feature=channel_page (Jan. 2009).
InterSense, “IS-1200 VisTracker Industrial Cart Demonstration” (Video), InterSense, http://www.intersense.com/IS-1200 Systems.aspx, http://www.youtube.com/watch?v=7xKLCvDGMgY&feature=channel_page (Jan. 2008).
InterSense, “IS-900 Product Technology Brief,” http://www.intersense.com/uploadedFiles/Products/White_Papers/IS900_Tech_Overview_Enhanced.pdf (1999).
InterSense, Inc., “Comparison of InterSense IS-900 System and Optical Systems,” http://www.intersense.com/uploadedFiles/Products/White_Papers/Comparison%20of%20InterSense%20IS-900%20System%20and%20Optical%20Systems.pdf (Jul. 12, 2004).
Izumori et al., High School Algebra: Geometry (1986).
Jacob, “Human-Computer Interaction—Input Devices” http://www.cs.tufts.edu/˜jacob/papers/surveys.html, “Human-Computer Interaction: Input Devices,” ACM Computing Surveys, vol. 28, No. 1, pp. 177-179 (Mar. 1996).
Jakubowski, et al., “Increasing Effectiveness of Human Hand Tremor Separation Process by Using Higher-Order Statistics,” Measurement Science Review, vol. 1 (2001).
Jakubowski, et al., “Higher Order Statistics and Neural Network for Tremor Recognition,” IEEE Transactions on Biomedical Engineering, vol. 49, No. 2 (Feb. 2002).
Jian, et al., “Adaptive Noise Cancellation,” Rice University, http://www.ece.rice.edu/~klwang/elec434/elec434.htm (Aug. 2007).
Jiang, “Capacitive position-sensing interface for micromachined inertial sensors,” Dissertation at Univ. of Cal. Berkley (2003).
Ju, et al., “The Challenges of Designing a User Interface for Consumer Interactive Television,” Consumer Electronics, Digest of Technical Papers, IEEE 1994 International Conference, pp. 114-115 (Jun. 21-23, 1994).
Keir, et al., “Gesture-recognition with Non-referenced Tracking,” IEEE Symposium on 3D User Interfaces, pp. 151-158 (Mar. 25-26, 2006).
Kessler, et al., “The Simple Virtual Environment Library” (MIT Presence) (2000).
Kindratenko, “A Comparison of the Accuracy of an Electromagnetic and a Hybrid Ultrasound-Inertia Position Tracking System,” MIT Presence, vol. 10, No. 6, Dec. 2001, 657-663 (2001).
Klein et al.,“Tightly Integrated Sensor Fusion for Robust Visual Tracking,” British Machine Vision Computing, vol. 22, No. 10, pp. 769-776 (2004).
Kohlhase, “NASA Report, The Voyager Neptune travel guide,” Jet Propulsion Laboratory Publication 89-24, excerpt (Jun. 1989).
Krumm, et al.,“How a Smart Environment Can Use Perception,” Ubicomp 2001 (Sep. 2001).
Kuipers, Jack B., “SPASYN —An Electromagnetic Relative Position and Orientation Tracking System,” IEEE Transactions on Instrumentation and Measurement, vol. 29, No. 4, pp. 462-466 (Dec. 1980).
La Scala, et al., “Design of an Extended Kalman Filter Frequency Tracker,” IEEE Transactions on Signal Processing, vol. 44, No. 3 (Mar. 1996).
Larimer et al., “VEWL: A Framework for building a Windowing Interface in a Virtual Environment,” in Proc. of IFIP TC13 Int. Conf. on Human-Computer Interaction Interact'2003 (Zürich), http://people.cs.vt.edu/~bowman/papers/VEWL_final.pdf (2003).
Laughlin, et al., “Inertial Angular Rate Sensors: Theory and Applications,” SENSORS Magazine (Oct. 1992).
Lee et al., “Tilta-Pointer: The Free-Space Pointing Device,” Princeton COS 436 Project, http://www.milyehuang.com/cos436/project/specs.html (2004).
Lee, et al., “Innovative Estimation Method with Measurement Likelihood for all-Accelerometer Type Inertial Navigation System,” IEEE Transactions on Aerospace and Electronic Systems, vol. 38, No. 1 (Jan. 2002).
Lee, et al., “Two-Dimensional Position Detection System with MEMS Accelerometer for Mouse Applications” Design Automation Conference, 2001. Proceedings, 2001 pp. 852-857 (Jun. 2001).
Leonard, “Computer Pointer Controls 3D Images in Free Space,” Electronic Design, pp. 160, 162, 165 (Nov. 1991).
Liang, et al., “On Temporal-Spatial Realism in the Virtual Reality Environment,” ACM 1991 Symposium on User Interface Software and Technology (Nov. 1991).
Link, “Field-Qualified Silicon Accelerometers From 1 Milli g to 200,000 g,” SENSORS (Mar. 1993).
Liu, et al., “Enhanced Fisher Linear Discriminant Models for Face Recognition,” Proc. 14th International Conference on Pattern Recognition, Queensland, Australia (Aug. 1998).
Lobo et al., “Vision and Inertial Sensor Cooperation Using Gravity as a Vertical Reference,” IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 25, No. 12, pp. 1597-1608 (Dec. 2003).
Logitech, Logitech 2D/6D Mouse Devices Specification (1991).
Logitech, “Logitech 2D/6D Mouse Technical Reference Manual” (1991).
Logitech, “Logitech Tracker—Virtual Reality Motion Tracker.” http://www.vrealities.com/logitech.html.
Logitech, Inc., “3D Mouse & Head Tracker Technical Reference Manual” (1992).
Luinge, Inertial sensing of human movement, Thesis, University of Twente (2002).
Luinge, et al., “Estimation of orientation with gyroscopes and accelerometers,” Proceedings of the First Joint BMES/EMBS Conference, 1999., vol. 2, p. 844 (Oct. 1999).
Luthi, P. et al., “Low Cost Inertial Navigation System,” and translation (2000).
Mackenzie et al., “A two-ball mouse affords three degrees of freedom,” Extended Abstracts of the CHI '97 Conference on Human Factors in Computing Systems, pp. 303-304. New York: ACM (1997).
Mackinlay, “Rapid Controlled Movement Through a Virtual 3D Workspace,” ACM SIGGRAPH Computer Graphics archive, vol. 24, No. 4, pp. 171-176 (Aug. 1990).
MacLean, “Designing with Haptic Feedback,” Proceedings of IEEE Robotics and Automation (ICRA '2000), at 783-88 (Apr. 22-28, 2000).
Masliah, “Measuring the Allocation of Control in 6 Degree of Freedom Human-Computer Interaction Tasks,” Proceedings of the SIGCHI conference on Human factors in computing systems, pp. 25-32 (2001).
Maybeck, “Stochastic Models, Estimation and Control,” vol. 1, Mathematics in Science and Engineering, vol. 141 (1979).
Merrill, “FlexiGesture: A sensor-rich real-time adaptive gesture and affordance learning platform for electronic music control,” Thesis, Massachusetts Institute of Technology (Jun. 2004).
Meyer et al., “A Survey of Position Trackers,” vol. 1, Issue 2, pp. 173-200, MIT Presence (1992).
Microsoft Research Corp., “XWand Devices” (image).
Miles, “New pads lack control,” The Times, Dec. 6, 1999 (Dec. 1999).
Mizell, “Using Gravity to Estimate Accelerometer Orientation,” IEEE Computer Society (2003).
Morris, “Accelerometry—a technique for the measurement of human body movements,” J Biomechanics 6: 729-736 (1973).
Mulder, “How to Build an Instrumental Glove Based on the Powerglove Flex Sensors,” PCVR 16, pp. 10-14 (1994).
Mulder, “Human movement tracking technology,” School of Kinesiology, Simon Fraser University (Jul. 1994).
Myers, et al., “Interacting at a Distance: Measuring the Performance of Laser Pointers and Other Devices,” CHI 2002 (Apr. 2002).
N.I.C.E., “The N.I.C.E. Project” (video), http://www.niceproject.com/about/ (1997).
Naimark, et al., “Circular Data Matrix Fiducial System and Robust Image Processing for a Wearable Vision-Inertial Self-Tracker,” Proceedings. International Symposium on Mixed and Augmented Reality, ISMAR (2002).
Naimark, et al., “Encoded LED System for Optical Trackers,” Fourth IEEE and ACM International Symposium on Mixed and Augmented Reality, pp. 150-153 (2005).
Navarrete, et al., “Eigenspace-based Recognition of Faces: Comparisons and a new Approach,” Image Analysis and Processing (2001).
Newswire PR, “Five New Retailers to Carry Gyration's Gyropoint Point and Gyropoint Pro” (1996).
Newswire PR, “Three-Axis MEMS-based Accelerometer From STMicroelectronics Targets Handheld Terminals,” STMicro (Feb. 2003).
Nichols, “Geospatial Registration of Information for Dismounted Soldiers (GRIDS),” Contractor's Progress, Status, and Management Report (Milestone 3 Report to DARPA ETO) (Oct. 1998).
Nintendo, Nintendo Entertainment System (NES) (1984).
Nintendo, NES System and Controllers (1984).
Nintendo, NES Controller (1984).
Nintendo, NES Zapper Guns (1984).
Nintendo, NES Duck Hunt Game (1984).
Nintendo, Nintendo GameBoy System (1989).
Nintendo, Nintendo Super NES (SNES) (1991).
Nintendo, SNES System & Controllers (1991).
Nintendo, SNES Superscope (1991).
Nintendo, Nintendo 64 System (N64) (1996).
Nintendo, Nintendo 64 System and Controllers (1996).
Nintendo, Nintendo 64 Controller (1996).
Nintendo, Nintendo N64 Controller with Rumble Pack (1996-1997).
Nintendo, Nintendo N64 Rumble Packs (1996-1997).
Nintendo, Nintendo GameBoy Color System (1998).
Nintendo, GameBoy Color (1998).
Nintendo, Nintendo: Kirby Tilt & Tumble game, packaging and user manual (Aug. 2000-2001).
Nintendo, Pokemon Pinball (1998).
Nintendo, Nintendo Game Boy Advance System (2001).
Nintendo, Nintendo Game Boy Advance (2001).
Nintendo, Nintendo: WarioWare: Twisted game, packaging and user manual (2004-2005).
Nintendo, Nintendo Game Boy Advance Wireless Adapter (Sep. 26, 2003).
Nintendo, Nintendo GameCube System (2001).
Nintendo, GameCube System and Controller (2001).
Nintendo, GameCube Controller (2001).
Nintendo, Wavebird Wireless Controllers (May 2002).
Nintendo, G3 Wireless Controller (Pelican) (2001).
Nintendo, Game Boy Advance SP System (2003).
Nintendo, Nintendo Game Boy Color Game Cartridge with Built-In Rumble (Jun. 28, 2009).
Nishiyama, “A Nonlinear Filter for Estimating a Sinusoidal Signal and its Parameters in White Noise: On the Case of a Single Sinusoid,” IEEE Transactions on Signal Processing, vol. 45, No. 4 (Apr. 1997).
Nishiyama, “Robust Estimation of a Single Complex Sinusoid in White Noise-H.infin. Filtering Approach,” IEEE Transactions on Signal Processing, vol. 47, No. 10 (Oct. 1999).
Odell, Transcript of Testimony, Investigation No. 337-TA-658, Before the United States International Trade Commission, vol. IV, redacted (May 14, 2009).
Ojeda, et al., “No GPS? No Problem!” University of Michigan Develops Award-Winning Personal Dead-Reckoning (PDR) System for Walking Users, http://www.engin.umich.edu/research/mrl/urpr/In_Press/P135.pdf (post 2004).
Omelyan, “On the numerical integration of motion for rigid polyatomics: The modified quaternion approach” Computers in Physics, vol. 12 No. 1, pp. 97-103 (1998).
Ovaska, “Angular Acceleration Measurement: A Review,” Instrumentation and Measurement Technology Conference, Conference Proceedings. IEEE, vol. 2 (Oct. 1998).
Pai, et al., “The Tango: A Tangible Tangoreceptive Whole-Hand Interface,” Proceedings of World Haptics and IEEE Eurohaptics Conference, Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (2005).
Paradiso, et al., “Interactive Therapy with Instrumented Footwear,” CHI 2004, Apr. 24-29, 2004, Vienna, Austria (2004).
Park, Adaptive control strategies for MEMS gyroscopes (Dissertation), Univ. Cal. Berkley (2000).
PCTRACKER, Product Technology Brief, at http://www.intersense.com/uploadedFiles/Products/White_Papers/PCTracker_Tech_Overview.pdf.
Pelican Accessories G3 Wireless Controller (Sep. 6, 2002).
Perforce, Perforce Controller (image).
Pham, Hubert, and Wilson, Andrew, “Pointing in Intelligent Environments with the WorldCursor,” Proceedings of Interact 2003 (2003).
Phillips, “Forward/Up Directional Incompatibilities During Cursor Placement Within Graphical User Interfaces,” Ergonomics, informaworld.com (May 2005).
Phillips, “On the Right Track: A unique optical tracking system gives users greater freedom to explore virtual worlds” (Apr. 2000).
Pierce et al., “Image Plane Interaction Techniques in 3D Immersive Environments,” Proceedings of the 1997 symposium on Interactive 3D graphics, portal.acm.org (1997).
Pilcher, “AirMouse Remote Controls,” IEEE Conference on Consumer Electronics (1992).
Pique, “Semantics of Interactive Rotations,” Interactive 3D Graphics, Proceedings of the 1986 workshop on Interactive 3D graphics, pp. 259-269 (Oct. 1986).
Piyabongkarn, “Development of a MEMS Gyroscope for Absolute Angle Measurement,” IEEE Transactions on Control Systems Technology, vol. 13, Issue 2, pp. 185-195 (Mar. 2005).
Piyabongkarn, “Development of a MEMS Gyroscope for Absolute Angle Measurement,” Dissertation, Univ. Minnesota (Nov. 2004).
Polhemus, “Polhemus 3SPACE FASTRAK devices” (image) (2000).
Pryor et al., “A Reusable Software Architecture for Manual Controller Integration,” IEEE Conf. on Robotics and Automation, Univ of Texas (Apr. 1997).
Raab, et al., “Magnetic Position and Orientation Tracking System,” IEEE Transactions on Aerospace and Electronic Systems, vol. AES-15, No. 5, pp. 709-718 (Sep. 1979).
Raethjen, et al., “Tremor Analysis in Two Normal Cohorts,” Clinical Neurophysiology 115 (2004).
Rebo, “Helmet-mounted virtual environment display system,” Thesis, Air Force Institute of Technology, Defense Technical Information Center (Dec. 1988).
Rebo, et al., “Helmet-Mounted Virtual Environment Display System,” Proc. SPIE vol. 1116, pp. 80-84 (Sep. 1989).
Rekimoto, “Tilting Operations for Small Screen Interfaces,” Proceedings of the 9th Annual ACM Symposium on User Interface Software and Technology, pp. 167-168 (1996).
Reunert, “Fiber-Optic Gyroscopes: Principles and Applications,” SENSORS, (Aug. 1993).
Ribo, et al., “Hybrid Tracking for Outdoor Augmented Reality Applications,” IEEE Computer Graphics and Applications, vol. 22, No. 6, pp. 54-63 (Nov./Dec. 2002).
Riviere, C., Robotics Institute, http://www.ri.cmu.edu/people/riviere_cameron.html, http://www.ri.cmu.edu/person.html?type=publications&person_id=248 (Aug. 2007).
Riviere, et al., “Adaptive Canceling of Physiological Tremor for Improved Precision in Microsurgery,” IEEE Transactions on Biomedical Engineering, vol. 45, No. 7 (Jul. 1998).
Riviere, et al., “Toward Active Tremor Canceling in Handheld Microsurgical Instruments,” IEEE Transactions on Robotics and Automation, vol. 19, No. 5 (Oct. 2003).
Robbinett et al., “Implementation of Flying, Scaling, and Grabbing in Virtual Worlds,” ACM Symposium (1992).
Roberts, “The Lincoln Wand,” AFIPS Conference Proceedings, MIT Lincoln Laboratory (1966).
Robinett et al., “The Visual Display Transformation for Virtual Reality,” University of North Carolina at Chapel Hill (1994).
Roetenberg, “Inertial and magnetic sensing of human motion,” Thesis (2006).
Roetenberg, et al., “Inertial and Magnetic Sensing of Human Movement Near Ferromagnetic Materials,” Proceedings. The Second IEEE and ACM International Symposium on Mixed and Augmented Reality (Mar. 2003).
Rolland, et al., “A Survey of Tracking Technology for Virtual Environments,” University of Central Florida, Center for Research and Education in Optics and Lasers (CREOL) (2001).
Sakai, et al., “Optical Spatial Filter Sensor for Ground Speed,” Optical Review, vol. 2, No. 1 pp. 65-67 (1994).
Saxena et al., “In Use Parameter Estimation of Inertial Sensors by Detecting Multilevel Quasi-Static States,” Lecture Notes in Computer Science, 2005—Berlin: Springer-Verlag, (Apr. 2004).
Sayed, “A Framework for State-Space Estimation with Uncertain Models,” IEEE Transactions on Automatic Control, vol. 46, No. 7 (Jul. 2001).
Sayed, UCLA Adaptive Systems Laboratory-Home Page, UCLA, http://asl.ee.ucla.edu/index.php?option=com_frontpage&Itemid=1 (Aug. 2007).
Schofield, Jack et al., Coming up for airpad, The Guardian (Feb. 2000).
Screen Shot of Brave Spirits (1998).
Selectech, “AirMouse Remote Controls, AirMouse Remote Control Warranty” (1991).
Selectech, Software, “AirMouse for DOS and Windows IBM & Compatibles,” “AirMouse Remote Control B0100EN-C, Amiga Driver, CDTV Driver, Version: 1.00,” “AirMouse Remote Control B0100EM-C.1, Apple Macintosh Serial Driver Version: 1.00 (1.01B),” “AirMouse Remote Control B0100EL-B/3.05 DOS Driver Version: 3.0, Windows Driver Version 1.00,” AirMouse Remote Control MS-DOS Driver Version: 3.00/3.05, Windows 3.0 Driver Version: 1.00 (1991).
Seoul National Univ., “EMMU System”—Seoul National Univ Power Point Presentation, www.computer.org/portal/cms_docs_ieeecs/ieeecs/education/csidc/CSIDC03Presentations/SNU.ppt (2003).
Shoemake, Ken, Quaternions, UPenn, Online.
Simon, et al. “The Yo Yo: A Handheld Combining Elastic and Isotonic Input,” http://www.uni-weimar.de/cms/fileadmin/medien/vr/documents/publications/TheYoYo-Interact2003-Talk.pdf (2003).
Simon, et al., “The Yo Yo: A Handheld Device Combining Elastic and Isotonic Input,” Human-Computer Interaction—Interact'03, pp. 303-310 (2003).
Smith, “Gyrevolution: Orienting the Digital Era,” http://www.gyration.com/images/pdfs/Gyration_White_Paper.pdf (2007).
Sorenson, et al., “The Minnesota Scanner: A Prototype Sensor for Three-Dimensional Tracking of Moving Body Segments,” IEEE Transactions on Robotics and Automation (Aug. 1989).
sourceforge.com, “ARToolkit API Documentation” (SourceForge web pages) (2004-2006).
Stovall, “Basic Inertial Navigation,” NAWCWPNS TM 8128, Navigation and Data Link Section, Systems Integration Branch (Sep. 1997).
Sutherland, “A Head-Mounted Three Dimensional Display,” AFIPS '68 (Fall, part I): Proceedings of the Dec. 9-11, 1968, fall joint computer conference, part I, pp. 757-764 (Dec. 1968).
Sutherland, Ivan E., “Sketchpad: A Man-Machine Graphical Communication System,” AFIPS '63 (Spring): Proceedings of the May 21-23, 1963, Spring Joint Computer Conference, pp. 329-346 (May 1963).
Sweetser, “A Quaternion Algebra Tool Set,” http://world.std.com/%7Esweetser/quaternions/intro/tools/tools.html (Jun. 2005).
Thinkoptics, Thinkoptics Wavit devices (image) (2007).
Timmer, “Data Analysis and Modeling Dynamic Processes in the Life Sciences,” Freiburg Center for Data Analysis and Modeling, http://webber.physik.uni-freiburg.de/~jeti/ (Aug. 2007).
Timmer, “Modeling Noisy Time Series: Physiological Tremor,” International Journal of Bifurcation and Chaos, vol. 8, No. 7 (1998).
Timmer, et al, “Pathological Tremors: Deterministic Chaos or Nonlinear Stochastic Oscillators?” Chaos, vol. 10, No. 1 (Mar. 2000).
Timmer, et al., “Characteristics of Hand Tremor Time Series,” Biological Cybernetics, vol. 70 (1993).
Timmer, et al., “Cross-Spectral Analysis of Physiological Tremor and Muscle Activity: II Application to Synchronized Electromyogram,” Biological Cybernetics, vol. 78 (1998).
Timmer, et al., “Cross-Spectral Analysis of Tremor Time Series,” International Journal of Bifurcation and Chaos, vol. 10, No. 11 (2000).
Titterton et al., “Strapdown Inertial Navigation Technology,” pp. 1-56 and pp. 292-321 (May 1997).
Tokimec, et al., “A Wearable Attitude-Measurement System Using a Fiberoptic Gyroscope,” MIT Presence (Apr. 2002).
UNC Computer Science Department, “News & Notes from Sitterson Hall,” UNC Computer Science, Department Newsletter, Issue 24, Spring 1999 (Apr. 1999).
Univ. Illinois at Chicago, “CAVE—A Virtual Reality Theater,” http://www.youtube.com/watch?v=-Sf6bJjwSCE (1993).
Univ. Wash., “ARToolkit” (U. Wash. web pages) (1999).
Urban, “BAA 96-37 Proposer Information,” DARPA/ETO (1996).
US Dynamics Corp, “Spinning Mass Mechanical Gyroscopes” (Aug. 2006).
US Dynamics Corp, “The Concept of ‘Rate’ (more particularly, angular rate pertaining to rate gyroscopes) (rate gyro explanation)” (Aug. 2006).
US Dynamics Corp, “US Dynamics Model 475 Series Rate Gyroscope Technical Brief—brief discussion on rate gyroscope basics, operation, and uses, and a dissection of the model by major component” (Dec. 2005).
US Dynamics Corp, “US Dynamics Rate Gyroscope Interface Brief (rate gyro IO)” (Aug. 2006).
Van Den Bogaard, “Using linear filters for real-time smoothing of rotational data in virtual reality application,” http://www.science.uva.nl/research/ias/alumni/m.sc.theses/theses/RobvandenBogaard.pdf (Aug. 2004).
Van Laerhoven, et al., “Using an Autonomous Cube for Basic Navigation and Input,” Proceedings of the 5th International Conference on Multimodal Interfaces, Vancouver, British Columbia, Canada, pp. 203-210 (2003).
Van Rheeden, et al., “Noise Effects on Centroid Tracker Aim Point Estimation,” IEEE Trans. on Aerospace and Electronic Systems, vol. 24, No. 2, pp. 177-185 (Mar. 1988).
Vaz, et al., “An Adaptive Estimation of Periodic Signals Using a Fourier Linear Combiner,” IEEE Transactions on Signal Processing, vol. 42, Issue 1, pp. 1-10 (Jan. 1994).
Verplaetse, “Inertial Proprioceptive Devices: Self-Motion Sensing Toys and Tools,” IBM Systems Journal (Sep. 1996).
Verplaetse, “Inertial-Optical Motion-Estimating Camera for Electronic Cinematography,” Masters of Science Thesis, MIT, (1997).
Vorozcovs, et al., “The Hedgehog: A Novel Optical Tracking Method for Spatially Immersive Displays,” MIT Presence, vol. 15, No. 1, pp. 108-121 (2006).
Wang, et al., “Tracking a Head-Mounted Display in a Room-Sized Environment with Head-Mounted Cameras,” SPIE 1990 Technical Symposium on Optical Engineering and Photonics in Aerospace Sensing, vol. 1290, pp. 47-57 (1990).
Ward, et al., “A Demonstrated Optical Tracker With Scalable Work Area for Head-Mounted Display Systems,” Symposium on Interactive 3D Graphics, Proceedings of the 1992 Symposium on Interactive 3D Graphics, pp. 43-52, ACM Press, Cambridge, MA (1992).
Watt, 3D Computer Graphics, “Three-Dimensional Geometry in Computer Graphics,” pp. 1-22, Addison-Wesley (1999).
Welch et al., “HiBall-3100™ Wide-Area, High-Precision Tracker and 3D Digitizer,” http://www.3rdtech.com/HiBall.htm (2002-2006).
Welch et al., HiBall Devices (image) (2002-2006).
Welch et al., “Motion Tracking: No Silver Bullet, but a Respectable Arsenal,” IEEE Computer Graphics and Applications, vol. 22, No. 6, pp. 24-38 (Nov. 2002).
Welch, “Hybrid Self-Tracker: An Inertial/Optical Hybrid Three-Dimensional Tracking System,” Tech. Report TR95-048, Dissertation Proposal, Univ. of North Carolina at Chapel Hill, Dept. Computer Science, Chapel Hill, N.C. (1995).
Welch, “A Self-Contained Wide-Area Tracker Using Sensor Fusion” (2001).
Welch, “Hawkeye Zooms in on Mac Screens with Wireless Infrared Penlight Pointer,” MacWeek (May 1993).
Welch, et al., “Complementary Tracking and Two-Handed Interaction for Remote 3D Medical Consultation with a PDA,” Proceedings of Trends and Issues in Tracking for Virtual Environments, Workshop at the IEEE Virtual Reality 2007 Conference (Mar. 2007).
Welch, et al., “High-Performance Wide-Area Optical Tracking: The HiBall Tracking System,” MIT Presence: Teleoperators & Virtual Environments (2001).
Welch, et al., “SCAAT: Incremental Tracking with Incomplete Information,” Computer Graphics, SIGGRAPH 97 Conference Proceedings, pp. 333-344 (Aug. 1997).
Welch, et al., “Source Code for HiBall+Inerital device,” UNC-CH Computer Science (Jun. 1998).
Welch, et al., “The HiBall Tracker: High-Performance Wide-Area Tracking for Virtual and Augmented Environments,” ACM SIGGRAPH, Addison-Wesley (1999).
Welch, et al., “The High-Performance Wide-Area Optical Tracking: The HiBall Tracking System,” MIT Presence, vol. 10, No. 1 (Feb. 2001).
Welch, et al., “Tracking for Training in Virtual Environments: Estimating the Pose of People and Devices for Simulation and Assessment,” [J. Cohn, D. Nicholson, and D. Schmorrow, editors, The PSI Handbook of Virtual Environments for Training and Education: Developments for the Military and Beyond, Chap.1, pp. 23-47] (2008).
Widrow, et al., “Fundamental Relations Between the LMS Algorithm and the DFT” IEEE Transactions on Circuits and Systems, vol. 34, No. CAS-7, (Jul. 1987).
Williams, et al., “Physical Presence: Palettes in Virtual Spaces,” Society of Photo-Optical Instrumentation Engineers (SPIE) Conference Series, vol. 3639, pp. 374-384 (May 1999).
Wilson, “XWand: UI for Intelligent Environments,” http://research.microsoft.com/en-us/um/people/awilson/wand/default.htm (Apr. 2004).
Wilson, et al., “Demonstration of the XWand Interface for Intelligent Spaces,” UIST '02 Companion, pp. 37-38 (Oct. 2002).
Wilson, et al., “Gesture Recognition Using the Xwand,” ri.cmu.edu (2004).
Wilson, et al., “Xwand: UI for Intelligent Spaces,” CHI 2003, Proceedings of the SIGCHI conference on Human factors in computing systems, pp. 545-552 (Apr. 2003).
Wilson, XWand video, http://research.microsoft.com/~awilson/wand/wand%20video%20768k.WMV (Mar. 2002).
Wormell, “Unified Camera, Content and Talent Tracking in Digital Television and Movie Production,” InterSense, Inc. & Mark Read, Hypercube Media Concepts, Inc. Presented: NAB 2000, Las Vegas, NV, Apr. 8-13, (2000).
Wormell, et al., “Advancements In 3D Interactive Devices for Virtual Environments,” ACM International Conference Proceeding Series; vol. 39 (2003).
Worringham, et al., “Directional Stimulus-Response Compatibility: A Test of Three Alternative Principles,” Ergonomics, vol. 41, Issue 6, pp. 864-880 (Jun. 1998).
Worringham, et al., “Tablet-PC Classroom System Wins Design Competition,” Computer, vol. 36, No. 8, pp. 15-18 (Aug. 2003).
Yang, et al., “Implementation and Evaluation of ‘Just Follow Me’: An Immersive, VR-Based, Motion-Training System,” MIT Presence: Teleoperators and Virtual Environments, vol. 11 No. 3, at 304-23 (MIT Press) (Jun. 2002).
You, et al., “Hybrid Inertial and Vision Tracking for Augmented Reality Registration,” http://graphics.usc.edu/cgit/pdf/papers/Vr1999.PDF (1999).
You, et al., “Orientation Tracking for Outdoor Augmented Reality Registration,” IEEE Computer Graphics and Applications, IEEE, vol. 19, No. 6, pp. 36-42 (Nov. 1999).
Youngblut, et al., “Review of Virtual Environment Interface Technology,” Institute for Defense Analyses (Jul. 1996).
Yun, et al., “Recent Developments in Silicon Microaccelerometers,” SENSORS, University of California at Berkeley (Oct. 1992).
Zhai, “Human Performance in Six Degree of Freedom Input Control,” Thesis, University of Toronto (1995).
Zhou, et al., “A survey—Human Movement Tracking and Stroke Rehabilitation,” Technical Report: CSM-420, ISSN 1744-8050, Dept. of Computer Sciences, University of Essex, UK (Dec. 8, 2004).
Zhu, et al., “A Real-Time Articulated Human Motion Tracking Using Tri-Axis Inertial/Magnetic Sensors Package,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 12, No. 2 (Jun. 2004).
Office Action issued in related Chinese patent application No. 200610111559.7 (Sep. 18, 2009).
Cyberglove/Cyberforce, Immersion, Cyberforce CyberGlove Systems “Immersion Ships New Wireless CyberGlove(R) II Hand Motion-Capture Glove; Animators, Designers, and Researchers Gain Enhanced Efficiency and Realism for Animation, Digital Prototyping and Virtual Reality Projects,” Business Wire, Dec. 7, 2005.
Ewalt, David M., “Nintendo's Wii Is a Revolution,” Review, Forbes.com (Nov. 13, 2006).
Foxlin IS-900 Motion Tracking System, Technical Overview, 10 pages, intersense.com, 1999.
Frankie, “E3 2002: Roll O Rama,” IGN: Roll-o-Rama Preview, 3 pages, E3 Demo of Kirby game (“Roll O Rama”), http://cube.ign.com/objects/482/482164.html (May 23, 2002).
Goschy, “Midway Velocity Controller” (youtube video http://www.youtube.com/watch?v=wjLhSrSxFNw) (Sep. 8, 2007).
Hartley, Matt, “Why is the Nintendo Wii So Successful?”, SmartHouse—The Lifestyle Technology Guide Website (Sep. 12, 2007).
Hinckley, Ken, “Haptic Issues for Virtual Manipulation,” Thesis (Dec. 1996).
Interview with Pat Goschy (youtube video http://www.youtube.com/watch?v=oKtZysYGDLE) (Jan. 14, 2008).
Kohler, Chris, “Triumph of the Wii: How Fun Won Out in the Console Wars,” WIRED (Jun. 11, 2007).
Kunz, Andreas M. et al., “Design and Construction of a New Haptic Interface,” Proceedings of DETC'00, ASME 2000 Design Engineering Technical Conferences and Computers and Information in Engineering Conference, Baltimore, Maryland (Sep. 10-13, 2000).
Louderback, Jim, “Nintendo Wii,” Reviews by PC Magazine, (Nov. 13, 2006).
Marrin, Teresa, “Possibilities for the Digital Baton as a General-Purpose Gestural Interface,” Late-Breaking/Short Talks, CHI 97, pp. 122-27 (Mar. 1997).
Marti, Gaetan et al., “Biopsy navigator: a smart haptic interface for interventional radiological gestures,” Swiss Federal Institute of Technology (EPFL), Lausanne, Switzerland (2003).
Mattel Power Glove Instructions, Licensed by Nintendo for play on Nintendo Entertainment System (1989).
Office Action issued in related Japanese patent application 2006-216569 (Oct. 20, 2009).
Ogawa et al., “Wii are the Elite,” GameSpot web site (Feb. 5, 2008).
Paley, W. Bradford, “Interaction in 3D Graphics,” SIGGRAPH Computer Graphics Newsletter, Cricket input device (Nov. 1998).
PC World, “The 20 Most Innovative Products of the Year” (Dec. 27, 2006).
Press Release, “Logitech's Wingman Cordless RumblePad Sets PC Gamers Free,” http://www.logitech.com/index.cfm/172/1373&cl=nz,en (Sep. 2, 2001).
Riviere, Cameron, Testimony, Trial Day 5, in the Matter of Certain Video Game Machines and Related Three-Dimensional Pointing Devices, ITC Investigation No. 337-TA-658 (May 15, 2009).
Sulic, Ivan, “Logitech Wingman Cordless Rumblepad Review”, Review at IGN, 4 pages, Jan. 14, 2002.
TRAQ 3D (Trazer) Product, http://www.exergamefitness.com/traq_3d.htm, http://www.trazer.com/, http://www.traq3d.com/ (1997).
Ulanoff, Lance, “Nintendo's Wii is the Best Product Ever,” PC Magazine (Jun. 21, 2007).
Williams, Robert L. et al., “Implementation and Evaluation of a Haptic Playback System,” vol. 3 No. 3, Haptics-e (2004).
Williams, Robert L. et al., “The Virtual Haptic Back Project,” Presented at the Image 2003 Conference, Scottsdale, Arizona (Jul. 14-18, 2003).
Briefs. (New & Improved) (Brief Article), PC Magazine, Oct. 26, 1993.
Foremski, T. “Remote Control Mouse Aims At Interactive TV”, Electronics Weekly, Mar. 9, 1994.
Gelmis, J.; “Ready to Play, the Future Way”, Jul. 23, 1996, Buffalo News.
Ji, H., “Study on the infrared remote-control lamp-gesture device,” Yingyong Jiguang/Applied Laser Technology, v 17, n. 5, pp. 225-227, Oct. 1997. Language: Chinese (abstract only).
Maggioni, C., “A novel gestural input device for virtual reality”, IEEE Virtual Reality Annual International Symposium (Cat. No. 93CH3336-5), 118-24, 1993.
Morgan, C.; “Still chained to the overhead projector instead of the podium? (TV Interactive Corp's LaserMouse Remote Pro infrared mouse) (Clipboard)(Brief Article) (Product Announcement)”, Government Computer News, Jun. 13, 1994.
Templeman, James N., “Virtual Locomotion: Walking in Place through Virtual Environments,” Presence, vol. 8 No. 6, pp. 598-617, Dec. 1999.
Office Action issued in commonly assigned copending U.S. Appl. No. 12/222,787, Feb. 5, 2010.
Sega/Sports Sciences Inc., “Batter Up, It's a Hit,” Instruction Manual, Optional Equipment Manual (1994).
Sega/Sports Sciences Inc., “Batter Up, It's a Hit,” photos of baseball bat (1994).
U.S. Appl. No. 11/745,842, filed May 2007, Ashida et al.
AirPad Controller Manual (AirPad Corp. 2000).
Physical Product: Airpad Motion Reflex Controller for Sony Playstation, (AirPad Corp. 2000).
Office Action in commonly assigned copending U.S. Appl. No. 11/404,844 (Oct. 6, 2010).
Third Party Opposition submitted in counterpart EP Application No. EP 1 854 518 (Sep. 2, 2010).
PAD—Controller and Memory I/F in PlayStation (Apr. 17, 1995).
Game Controller (Wikipedia) (May 1, 2005).
Computer Mouse (Wikipedia) (May 7, 2005).
Transmission Mode (Apr. 22, 1999).
Serial Communication (Wikipedia) (Jul. 2, 2005).
Wireless (Wikipedia) (Aug. 1, 2005).
Office Action in corresponding Japanese Patent Application (Sep. 9, 2010).
Translation of JP2005-063230 (previously submitted with translation of abstract only) (Mar. 2005).
European Examination Report issued in EP Application No. 10176870.3 on Aug. 9, 2011.
You et al., Fusion of Vision and Gyro Tracking for Robust Augmented Reality Registration, Proceedings of the Virtual Reality 2001 Conference, 2001, 1-8.
Office Action issued in Taiwanese Patent Appl No. 10021121610 on Dec. 14, 2011.
Office Action/Search Report issued in Taiwanese Patent Appl No. 10021121610 on Dec. 14, 2011.
Cancellation Request of BigBen against German utility model 20 2006 020 818 (UM1) (Oct. 15, 2010) and translation.
Cancellation Request of BigBen against German utility model 20 2006 020 819 (UM2) (Oct. 15, 2010) and translation.
Cancellation Request of BigBen against German utility model 20 2006 020 820 (UM3) (Oct. 15, 2010) and translation.
Brief of System Conn 99 (Oct. 27, 2010) and translation.
Brief of BigBen (Oct. 27, 2010) and translation.
Buxton, et al, “A Study in Two-Handed Input,” ACM CHI'86 Proceedings (1986).
Leganchuk, et al, “Manual and Cognitive Benefits of Two-Handed Input: An Experimental Study,” ACM Transactions on Computer-Human Interaction vol. 5, No. 4, pp. 326-359 (Dec. 1998).
Japanese Office Action, “Notice of Reasons for Rejections”, issued Sep. 9, 2010 for corresponding Japanese Patent Application No. 2008-250858, 3 pages.
Extended European Search Report EP 10 17 8309 (Feb. 4, 2011).
Office Action, JP Application No. 2008-256858 (Apr. 22, 2011).
European Communication for Application No. 10178309.0, dated Oct. 25, 2012.
Notice of Allowance and Fee(s) Due dated Apr. 15, 2014, issued in related U.S. Appl. No. 12/285,812.
Related Publications (1)
Number Date Country
20070050597 A1 Mar 2007 US
Provisional Applications (1)
Number Date Country
60714862 Sep 2005 US