1. Field of the Invention
The present invention relates generally to methods and systems for expanding input device functionality, and more particularly, to methods and systems for improving interaction between a game controller and a gaming base system by expanding the capabilities of the controller.
2. Description of the Related Art
The video game industry has seen many changes over the years. As computing power has expanded, developers of video games have likewise created game software that takes advantage of these increases in computing power. To this end, video game developers have been coding games that incorporate sophisticated operations and mathematics to produce a very realistic game experience.
Example gaming platforms include the Sony Playstation®, Sony Playstation2® (PS2), and Sony Playstation3® (PS3), each of which is sold in the form of a game console. As is well known, the game console is designed to connect to a monitor (usually a television) and enable user interaction through handheld controllers. The game console is designed with specialized processing hardware, including a CPU, a graphics synthesizer for processing intensive graphics operations, a vector unit for performing geometry transformations, and other glue hardware, firmware, and software. The game console is further designed with an optical disc tray for receiving game compact discs for local play through the game console. Online gaming is also possible, where a user can interactively play against or with other users over the Internet.
As game complexity continues to grow, game and hardware manufacturers have continued to innovate to enable additional interactivity. A growing trend in the computer gaming industry is to develop games that increase the interaction between the user and the gaming system. One way of accomplishing a richer interactive experience is to use wireless game controllers whose movement is tracked by the gaming system, so that the player's movements can be used as inputs for the game. Another way to improve the interactive experience is to provide the user with controllers that have better communication capabilities.
However, input device development processes are usually lengthy, and adding new features requires large investments in time and research. Further, adding features and complexity to input devices also raises the final cost to the user, which may decrease the number of potential customers. Still further, some features may be useful to only a small community of players.
It is in this context that embodiments of the invention arise.
Embodiments of the present invention provide apparatus, methods and systems for interfacing a controller with a gaming system to interact with a computer program rendering interactive content on a display.
It should be appreciated that the present invention can be implemented in numerous ways, such as a process, an apparatus, a system, a device or a method on a computer readable medium. Several inventive embodiments of the present invention are described below.
In one embodiment, a controller includes a handle, a connector and an attachment. The handle has an elongated shape with opposite first and second ends, and is configured to be held by a single hand of a user during operation. The second end is aimed towards the display during operation. The connector is located at the second end of the handle. The attachment includes two interfaces: a visual interface for communication with the gaming system, and a connector interface that is coupled to the connector in the handle. The attachment can be coupled to and decoupled from the connector, and allows the gaming system to visually track the location of the handle via the visual interface.
In another embodiment, a handheld controller includes a handle, a connector in the handle and an attachment. The attachment includes a communications interface and a connector interface. The communications interface enables the attachment to exchange information directly with the gaming system, and the connector interface allows the attachment to communicate with the handle. The attachment can be coupled to and decoupled from the connector, allowing the handle to exchange information with the gaming system and the user via the communications and connector interfaces.
In yet another embodiment, a method is presented for interfacing a controller with a gaming system to interact with a computer program. The method includes the operation of connecting an attachment to the controller, enabling the controller to receive and send information. The controller receives information from the attachment that was previously sent from the base station to the attachment. The controller sends information to the base station via the attachment. The method further includes the operation of updating a state of the controller when the controller receives information, and the operation of updating a state of the computer program when the controller sends information via the attachment.
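By way of illustration only, the flow of the operations described above can be sketched in Python. The class and field names below are hypothetical examples and do not correspond to elements of the specification:

```python
class Attachment:
    """Relays information between the controller and the base station."""
    def forward(self, base_station, message):
        base_station.receive(message)

class Controller:
    def __init__(self):
        self.attachment = None
        self.state = {"light": "off"}     # example controller state

    def connect(self, attachment):
        # Operation: connect an attachment, enabling the controller
        # to receive and send information.
        self.attachment = attachment

    def receive(self, message):
        # Operation: update a state of the controller when the
        # controller receives information.
        self.state.update(message)

    def send(self, base_station, message):
        # Information reaches the base station via the attachment.
        self.attachment.forward(base_station, message)

class BaseStation:
    def __init__(self):
        self.program_state = {"avatar": "idle"}   # example program state

    def receive(self, message):
        # Operation: update a state of the computer program when
        # information arrives from the controller.
        self.program_state.update(message)
```

The sketch only models the direction of information flow; an actual system would carry this traffic over the radio and visual interfaces described below.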
In another embodiment, a system for playing games includes a controller, an attachment, an image capture device, and a base gaming system. The controller is designed to be held and operated by a single hand of a user, and the attachment can be connected to and disconnected from the controller. The image capture device takes images of the attachment in the field of play. The base gaming system is connected to the image capture device and interfaces with the attachment when the attachment is connected to the game controller. In the system, a first type of information is provided to the base gaming system and to a user when the attachment is connected to the controller and a second type of information is provided when the attachment is disconnected.
Other aspects of the invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating by way of example the principles of the invention.
The invention may best be understood by reference to the following description taken in conjunction with the accompanying drawings in which:
The following embodiments describe apparatus, methods and systems for interfacing a controller with a gaming system to interact with a computer program rendering interactive content on a display.
It will be obvious, however, to one skilled in the art, that the present invention may be practiced without some or all of these specific details. In other instances, well known process operations have not been described in detail in order not to unnecessarily obscure the present invention.
In one embodiment, the four controllers include spherical attachments, sometimes referred to as “balls,” that can be illuminated with different colors that enable visual controller differentiation. For example, controller C1 lights up as red, C2 as yellow, C3 as white, and C4 as blue. This color selection is exemplary, and many other color combinations are possible. In one embodiment, the movement of the controllers is used for playing a game where players fly virtual kites, but many other applications are possible, such as karate fighting, firing, sword fighting, virtual worlds, etc.
In some embodiments, the light in the controller is used to provide feedback to the user, such as when the player is “hit,” to indicate the amount of life left, to flag when the controller is occluded from view of the camera, etc. The two modes, providing visual input via camera pictures and providing user feedback, can be used simultaneously. In one embodiment, each time the ball is lit to provide user feedback, the base station uses the information associated with lighting the ball in the controller to analyze an image taken by image capture device 302, searching for the color associated with the lighting of the ball. For example, in one mode of operation, when a player pushes a button on the controller, the controller responds by lighting up the ball. The base station monitors the visual status of the ball, and when the base station detects that the ball has lit up, the base station processes this event to register that the player pushed the button.
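By way of illustration, the detection step described above can be sketched as a simple per-pixel color match over a captured frame. The function names, color tolerance, and pixel-count threshold are illustrative assumptions rather than part of the specification:

```python
def ball_lit(frame, expected_rgb, tolerance=30, min_pixels=50):
    """frame: iterable of (r, g, b) pixels from the image capture device.
    Returns True when enough pixels fall near the expected ball color."""
    er, eg, eb = expected_rgb
    hits = 0
    for r, g, b in frame:
        if (abs(r - er) <= tolerance and
                abs(g - eg) <= tolerance and
                abs(b - eb) <= tolerance):
            hits += 1
    return hits >= min_pixels

def detect_button_press(frame, controller_color):
    # The base station infers "player pushed the button" from the
    # visual state of the ball rather than from a radio message.
    return ball_lit(frame, controller_color)
```

A production system would of course segment the ball region first; the sketch only shows the principle of turning a lighting event into an input event.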
Attachments providing expanded capabilities to handle 402 are connected to and disconnected from expansion connector 402. In one embodiment, a spherical attachment enables the base computing device to locate the combination of handle and attachment within a three-dimensional space via visual recognition of images taken by a camera attached to the base device. Other embodiments provide additional communication capabilities to controller 400, such as an attachment that provides ultrasonic communication with the base computing device or with other controllers in the field of play. In yet another embodiment, an attachment provides infrared capabilities to allow the controller to communicate via infrared frequencies with the base station, or to use controller 400 as a remote control for a TV or other electronic equipment. More attachment embodiments are presented below.
In one embodiment, the attachment communicates directly with the base station and can act upon commands received from the base station, such as turning on an internal light or emitting a sound. In another embodiment, the attachment is directly controlled by handle 424 and the attachment only reacts to commands from handle 424. In yet another embodiment, the attachment can react to commands received from the base station or from the handle.
Inside handle 424, printed circuit board 416 holds processor 412, Input/Output (I/O) module 406, memory 416, and Bluetooth module 418, all interconnected by bus 422. A Universal Serial Bus (USB) module 420 also provides interactivity with the base computing device, or with other devices connected to USB port 432. The USB port can also be used to charge the rechargeable battery 430. Vibrotactile feedback is provided by vibrotactile module 428. Speaker 426 provides audio output.
Note that the above controller configuration is exemplary, and many modifications thereto, including eliminating or adding modules, would occur to a person of ordinary skill in the art with access to the present Specification, and are well within the scope of the claimed invention. For example, controller 400 can also include sensors for mechanical tracking of the controller movement.
Attachment 502 has a spherical shape that allows the visual tracking of the attachment's location. The ability to track the location of the attachment by analyzing captured images of it defines a visual interface between the attachment and the image capture device. When the attachment is rigidly connected to the controller, the combination of controller and attachment moves synchronously, thereby enabling the detection of location and movement of the controller by tracking the location and movement of the attachment. In another embodiment, attachment 502 is not rigidly attached and the system must track the relative positions of the attachment and the controller in order to relate the attachment's location to the controller's location. Further, certain functionality provided by the attachment, such as providing ultrasound communications, is independent of the relative position between attachment and controller, thus it is not required that the attachment be rigidly connected to the handle. The attachment can thus be connected via a flexible cable, for example.
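By way of illustration, the spherical shape supports three-dimensional localization from a single camera because the sphere's physical size is known: the apparent radius in the image gives depth, and the centroid gives direction. The following sketch assumes a simple pinhole camera model; the focal length, ball radius, and principal point are made-up example values:

```python
def sphere_position(u, v, r_pix, focal_px=800.0, ball_radius_m=0.02,
                    cx=320.0, cy=240.0):
    """Estimate the 3D position of a sphere of known radius from its
    image centroid (u, v) and apparent radius r_pix in pixels."""
    z = focal_px * ball_radius_m / r_pix   # depth from apparent size
    x = (u - cx) * z / focal_px            # lateral offset from centroid
    y = (v - cy) * z / focal_px
    return x, y, z
```

Under these example parameters, a ball imaged at the principal point with a 16-pixel radius sits one meter straight ahead of the camera.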
The different modules in spherical attachment 502 are interconnected via a common bus, but other interconnection mechanisms are possible. Connector 504 provides the interface to connect or disconnect attachment 502 from the controller. Attachment 502 includes a processor or circuit plus memory allowing the attachment to process computer instructions. Further, attachment 502 includes communication modules such as ultrasound, infrared, and WiFi. Such communications enable the attachment to communicate with the base station or other electronic devices, which is referred to herein as a communications interface between the controller and the base station or any other electronic device. In one embodiment, the attachment operates as a modem by receiving information from the controller and forwarding the information to the base station, and vice versa.
Information received by the attachment and passed to the controller is used to change the state of the controller. For example, the controller may emit a sound, change button configuration, disable the controller, load registers in memory, send a command to the attachment to light up, etc. The information received by the base station is used by the computer program to update the state of the computer program. For example, the computer program may move an avatar on the screen or change the status of the avatar, fire a weapon, start a game, select an option in a menu, etc.
An accelerometer, a magnetometer and a gyroscope provide mechanical information related to the movement of the attachment. In one embodiment, the mechanical or inertial information is combined with other location determination information, such as visual tracking information, in order to refine the determination of the location of the controller-attachment combo. For example, if spherical attachment 502 is occluded from view, inertial information can be used to track movement of the attachment.
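By way of illustration, the combination of visual and inertial information can be sketched as a simple fallback scheme: trust the camera while the ball is visible, and dead-reckon from the accelerometer while it is occluded. The function name and the basic integration step are illustrative assumptions:

```python
def fuse_position(visual_pos, last_pos, velocity, accel, dt):
    """Return (position, velocity). When the attachment is visible,
    visual_pos is a 3-tuple; when occluded it is None and the inertial
    data is integrated from the last known state."""
    if visual_pos is not None:
        # Camera fix available: use it directly.
        return visual_pos, velocity
    # Occluded: integrate acceleration, then position (Euler step).
    new_velocity = tuple(v + a * dt for v, a in zip(velocity, accel))
    new_pos = tuple(p + v * dt for p, v in zip(last_pos, new_velocity))
    return new_pos, new_velocity
```

A real tracker would blend the two sources continuously (e.g., with a Kalman filter); the sketch only shows the occlusion fallback named in the text.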
An internal light emitting device allows the attachment to be lit from the inside to improve visual recognition or to provide user feedback. In one embodiment, the light emitting device can emit light of a single color, and in another embodiment, the light emitting device can be configured to emit light from a choice of colors. In yet another embodiment, attachment 502 includes several light emitting devices, each device being capable of emitting light of one color. The light emitting device is configurable to emit different levels of brightness. The base computing device can provide interactivity to the user holding the controller by changing the light emitting status of attachment 502, producing audio signals, providing vibrotactile feedback, etc. One feedback operation or a combination of feedback operations is possible. In one embodiment, the type of feedback is selected from a list of predefined feedback operations based on what is occurring in the game.
A microphone and a speaker provide audio capabilities, while a battery powers the rest of the components, including the processor and the light emitting device. The battery can also be used by the handle as a second source of power. For example, if the rechargeable battery in the controller is discharged, the attachment can provide the required power so the user can continue playing instead of having to stop to recharge the controller. In one embodiment, attachment 502 does not include the battery and power to the modules in attachment 502 is obtained via an electrical connection with the power source of the handle.
A camera provides image capture capabilities and a USB module allows USB communication to and from the attachment. In one embodiment, the USB connection is used to charge the battery in the attachment. In yet another embodiment, attachment 502 includes files in memory that are transferred to the controller, or to the base station, or to both the controller and the base station. The files in memory can include configuration files or programs that are transferred for execution in the controller or the gaming system. The files can be used to identify a specific user, to configure the controller or the base system, to load a game, to add features to existing games, etc. For example, one file is a game that is loaded to the base station for playing, another file contains karaoke songs that can be used in a sing-along game, another file contains new player rosters and statistics for an update to a sports game, etc. In addition, the attachment can be used to store user parameters, such as player configuration for a particular game. The player can then use the attachment in a different gaming system to play with other players using the configuration obtained from the original gaming system.
The visual cues generated by attachment 502 can be used to provide visual input to the base computing device, to provide feedback to the user, or both. Additionally, the visual cues can be generated upon a command transmitted from the base station or upon the occurrence of a preconfigured condition detected at controller 424, such as pressing a button or jerking the controller at great speed. The different combinations of visual cue generation and purpose can place the controller in different modes. In one mode, the base computing device sends a command to the controller to set the light emitted by attachment 502 in a desired state (such as lighting up green), and then the base computing device proceeds to visually track the controller. In a second mode, the base computing device sends a command to the controller to create a visual cue every time a desired event takes place, such as pressing a button to simulate firing. The base computing device can then track the visual state of the controller and detect the event at the controller by analyzing the images taken of the controller. In this mode, the visual cue can also provide feedback to the user, such as flashing a light when the button gets pushed.
In yet another mode, the primary purpose of the visual cues is to provide feedback to the user, such as for example lighting up the ball in a color indicative of a state of the game played. It should be noted that even when the purpose of the visual cue is to provide user feedback, the base computing device can also use the cues for input, such as visually tracking attachment 502, because the base computing device knows the color of the visual cue or is able to monitor the different visual cues that can be produced at any time by attachment 502.
In one embodiment, attachment 502 interacts with controller 424 via a communications interface, such as a USB interface. In another embodiment, attachment 502 is in electrical communication with one or several internal modules inside controller 424. For example, the processor or circuit of attachment 502 can communicate directly with processor 412 inside the handle.
It should be noted that the depicted embodiment is exemplary, and other embodiments are also possible.
When the connector rigidly connects to both controllers, the combination unit moves in unison, thus visually tracking the movement of one of the controllers is enough to detect the movement and location of the combination. In another embodiment, both controllers have spherical attachments connected, and both are tracked by the base station for redundancy and improved accuracy.
Attachments 1004 and 1006 add features to controller 1002. For example, attachment 1004 can provide ultrasonic communication, an input wheel, or an extra battery, while attachment 1006 provides a visual interface or a microphone.
After the controller receives information, a state of the controller is updated in operation 158. After the controller sends information, a state of the computer program is updated in operation 160.
The different communication devices connected to the base computing system connect to the respective controllers inside the computing system. The memory area includes running programs, an image processing area, a sound processing area, and a clock synchronization area. Running programs include a gaming program, image processing program, sound processing program, clock synchronization program, etc. These programs use the corresponding areas of memory, such as the image processing area containing image data, the sound processing area containing ultrasound communications data, and the clock synchronization area used for the synchronization with remote devices.
Several embodiments for controller configuration are shown in the player environment area. Controller A represents a “fully loaded” controller with many of the features previously described. Controller A includes a Clock Synchronization (CS) module used for clock synchronization with the base computing system; a Sound Receiver (SRx) for receiving ultrasonic data; a Sound Transmitter (STx) for sending ultrasonic data; a WiFi (WF) module for WiFi communications with computing system 700; an Acoustic Chamber (AC) for conducting sound to and from the front and/or the sides of the controller; an Image Capture (IC) device, such as a digital video camera, for capturing image data; and a Light Emitter (LE) in the infrared or visible spectrum for easier image recognition from the image processing module at computing system 700.
Additionally, controller A includes a connector (not shown) configured to be coupled to and decoupled from an attachment, such as a spherical section, to improve image recognition by a remote capture device. The light created by the light emitter can be in the infrared or the visible spectrum, therefore the image capture device will work in the same light spectrum. The different components in Controller A can be implemented as separate devices or modules inside Controller A. In another embodiment, the different components in Controller A are grouped into a smaller number of integrated components enabling a more compact implementation. The various controllers can also include one or more USB plugs, to enable charging of the controllers when connected to the game station or a computer.
According to the intended use of a given controller, simpler configurations can be used with fewer features than those described for controller A. Some embodiments of simpler devices are shown with respect to controllers B-E utilizing a subset of features from those described for controller A. The person skilled in the art will readily appreciate that similar configurations are possible within the spirit of the invention by adding or subtracting components, as long as the principles of the invention are maintained.
The I/O bridge 1434 also connects to six Universal Serial Bus (USB) 2.0 ports 1424; a gigabit Ethernet port 1422; an IEEE 802.11b/g wireless network (Wi-Fi) port 1420; and a Bluetooth® wireless link port 1418 capable of supporting up to seven Bluetooth connections.
In operation, the I/O bridge 1434 handles all wireless, USB and Ethernet data, including data from one or more game controllers 1402-1403. For example, when a user is playing a game, the I/O bridge 1434 receives data from the game controllers 1402-1403 via a Bluetooth link and directs it to the Cell processor 1428, which updates the current state of the game accordingly.
The wireless, USB and Ethernet ports also provide connectivity for other peripheral devices in addition to game controllers 1402-1403, such as: a remote control 1404; a keyboard 1406; a mouse 1408; a portable entertainment device 1410 such as a Sony Playstation Portable® entertainment device; a video camera such as an EyeToy® video camera 1412; a microphone headset 1414; and a microphone 1415. Such peripheral devices may therefore in principle be connected to the system unit 1400 wirelessly; for example the portable entertainment device 1410 may communicate via a Wi-Fi ad-hoc connection, whilst the microphone headset 1414 may communicate via a Bluetooth link.
The provision of these interfaces means that the Playstation 3 device is also potentially compatible with other peripheral devices such as digital video recorders (DVRs), set-top boxes, digital cameras, portable media players, Voice over IP telephones, mobile telephones, printers and scanners.
In addition, a legacy memory card reader 1416 may be connected to the system unit via a USB port 1424, enabling the reading of memory cards 1448 of the kind used by the Playstation® or Playstation 2® devices.
The game controllers 1402-1403 are operable to communicate wirelessly with the system unit 1400 via the Bluetooth link, or to be connected to a USB port, thereby also providing power by which to charge the battery of the game controllers 1402-1403. Game controllers 1402-1403 can also include memory, a processor, a memory card reader, permanent memory such as flash memory, light emitters such as LEDs or infrared lights, microphone and speaker for ultrasound communications, an acoustic chamber, a digital camera, an internal clock, a recognizable shape such as a spherical section facing the game console, and wireless communications using protocols such as Bluetooth®, WiFi™, etc.
Game controller 1402 is a controller designed to be used with two hands, and game controller 1403 is a single-hand controller with a connector for expansion capabilities via a hardware attachment. In addition to one or more analog joysticks and conventional control buttons, the game controller is susceptible to three-dimensional location determination. Consequently gestures and movements by the user of the game controller may be translated as inputs to a game in addition to or instead of conventional button or joystick commands. Optionally, other wirelessly enabled peripheral devices such as the Playstation™ Portable device may be used as a controller. In the case of the Playstation™ Portable device, additional game or control information (for example, control instructions or number of lives) may be provided on the screen of the device. Other alternative or supplementary control devices may also be used, such as a dance mat (not shown), a light gun (not shown), a steering wheel and pedals (not shown) or bespoke controllers, such as a single or several large buttons for a rapid-response quiz game (also not shown).
The remote control 1404 is also operable to communicate wirelessly with the system unit 1400 via a Bluetooth link. The remote control 1404 comprises controls suitable for the operation of the Blu Ray™ Disk BD-ROM reader 1440 and for the navigation of disk content.
The Blu Ray™ Disk BD-ROM reader 1440 is operable to read CD-ROMs compatible with the Playstation and PlayStation 2 devices, in addition to conventional pre-recorded and recordable CDs, and so-called Super Audio CDs. The reader 1440 is also operable to read DVD-ROMs compatible with the Playstation 2 and PlayStation 3 devices, in addition to conventional pre-recorded and recordable DVDs. The reader 1440 is further operable to read BD-ROMs compatible with the Playstation 3 device, as well as conventional pre-recorded and recordable Blu-Ray Disks.
The system unit 1400 is operable to supply audio and video, either generated or decoded by the Playstation 3 device via the Reality Synthesizer graphics unit 1430, through audio and video connectors to a display and sound output device 1442 such as a monitor or television set having a display 1444 and one or more loudspeakers 1446. The audio connectors 1450 may include conventional analogue and digital outputs whilst the video connectors 1452 may variously include component video, S-video, composite video and one or more High Definition Multimedia Interface (HDMI) outputs. Consequently, video output may be in formats such as PAL or NTSC, or in 720p, 1080i or 1080p high definition.
Audio processing (generation, decoding and so on) is performed by the Cell processor 1428. The Playstation 3 device's operating system supports Dolby® 5.1 surround sound, DTS® surround sound, and the decoding of 7.1 surround sound from Blu-Ray® disks.
In the present embodiment, the video camera 1412 comprises a single charge coupled device (CCD), an LED indicator, and hardware-based real-time data compression and encoding apparatus so that compressed video data may be transmitted in an appropriate format such as an intra-image based MPEG (Moving Picture Experts Group) standard for decoding by the system unit 1400. The camera LED indicator is arranged to illuminate in response to appropriate control data from the system unit 1400, for example to signify adverse lighting conditions. Embodiments of the video camera 1412 may variously connect to the system unit 1400 via a USB, Bluetooth or Wi-Fi communication port. Embodiments of the video camera may include one or more associated microphones and also be capable of transmitting audio data. In embodiments of the video camera, the CCD may have a resolution suitable for high-definition video capture. In use, images captured by the video camera may for example be incorporated within a game or interpreted as game control inputs. In another embodiment the camera is an infrared camera suitable for detecting infrared light.
In general, in order for successful data communication to occur with a peripheral device such as a video camera or remote control via one of the communication ports of the system unit 1400, an appropriate piece of software such as a device driver should be provided. Device driver technology is well-known and will not be described in detail here, except to say that the skilled man will be aware that a device driver or similar software interface may be required in the present embodiment described.
The Power Processing Element (PPE) 1550 is based upon a two-way simultaneous multithreading Power 1470 compliant PowerPC core (PPU) 1555 running with an internal clock of 3.2 GHz. It comprises a 512 kB level 2 (L2) cache and a 32 kB level 1 (L1) cache. The PPE 1550 is capable of eight single precision operations per clock cycle, translating to 25.6 GFLOPs at 3.2 GHz. The primary role of the PPE 1550 is to act as a controller for the Synergistic Processing Elements 1510A-H, which handle most of the computational workload. In operation the PPE 1550 maintains a job queue, scheduling jobs for the Synergistic Processing Elements 1510A-H and monitoring their progress. Consequently each Synergistic Processing Element 1510A-H runs a kernel whose role is to fetch a job, execute it, and synchronize with the PPE 1550.
Each Synergistic Processing Element (SPE) 1510A-H comprises a respective Synergistic Processing Unit (SPU) 1520A-H, and a respective Memory Flow Controller (MFC) 1540A-H comprising in turn a respective Direct Memory Access Controller (DMAC) 1542A-H, a respective Memory Management Unit (MMU) 1544A-H and a bus interface (not shown). Each SPU 1520A-H is a RISC processor clocked at 3.2 GHz and comprising 256 kB local RAM 1530A-H, expandable in principle to 4 GB. Each SPE gives a theoretical 25.6 GFLOPS of single precision performance. An SPU can operate on 4 single precision floating point numbers, 4 32-bit numbers, 8 16-bit integers, or 16 8-bit integers in a single clock cycle. In the same clock cycle it can also perform a memory operation. The SPU 1520A-H does not directly access the system memory XDRAM 1426; the 64-bit addresses formed by the SPU 1520A-H are passed to the MFC 1540A-H which instructs its DMA controller 1542A-H to access memory via the Element Interconnect Bus 1580 and the memory controller 1560.
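The quoted 25.6 GFLOPS figure per SPE follows from the vector width and clock rate, on the common assumption that each of the four single-precision lanes completes one fused multiply-add (counted as two floating-point operations) per cycle:

```python
# Single-precision throughput of one SPU at 3.2 GHz, assuming one
# fused multiply-add (2 flops) per lane per cycle.
lanes = 4                       # four 32-bit single-precision lanes
flops_per_lane_per_cycle = 2    # multiply-add counts as 2 operations
clock_ghz = 3.2
spe_gflops = lanes * flops_per_lane_per_cycle * clock_ghz  # 25.6 GFLOPS
```

The same arithmetic (8 operations per cycle at 3.2 GHz) yields the 25.6 GFLOPs quoted above for the PPE.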
The Element Interconnect Bus (EIB) 1580 is a logically circular communication bus internal to the Cell processor 1428 which connects the above processor elements, namely the PPE 1550, the memory controller 1560, the dual bus interface 1570A,B and the 8 SPEs 1510A-H, totaling 12 participants. Participants can simultaneously read and write to the bus at a rate of 8 bytes per clock cycle. As noted previously, each SPE 1510A-H comprises a DMAC 1542A-H for scheduling longer read or write sequences. The EIB comprises four channels, two each in clockwise and anti-clockwise directions. Consequently for twelve participants, the longest step-wise data-flow between any two participants is six steps in the appropriate direction. The theoretical peak instantaneous EIB bandwidth for 12 slots is therefore 96 B per clock, in the event of full utilization through arbitration between participants. This equates to a theoretical peak bandwidth of 307.2 GB/s (gigabytes per second) at a clock rate of 3.2 GHz.
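The peak figures quoted for the EIB follow directly from the stated parameters: twelve participants, each moving 8 bytes per clock cycle, at a 3.2 GHz clock:

```python
# Peak EIB throughput from the parameters given in the text.
participants = 12                       # PPE, memory controller, dual bus
                                        # interface (x2) and 8 SPEs
bytes_per_participant_per_cycle = 8
peak_bytes_per_cycle = participants * bytes_per_participant_per_cycle  # 96 B
clock_hz = 3.2e9
peak_gb_per_s = peak_bytes_per_cycle * clock_hz / 1e9   # 307.2 GB/s
```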
The memory controller 1560 comprises an XDRAM interface 1562, developed by Rambus Incorporated. The memory controller interfaces with the Rambus XDRAM 1426 with a theoretical peak bandwidth of 25.6 GB/s.
The dual bus interface 1570A,B comprises a Rambus FlexIO® system interface 1572A,B. The interface is organized into 12 channels, each 8 bits wide, with five channels inbound and seven outbound. This provides a theoretical peak bandwidth of 62.4 GB/s (36.4 GB/s outbound, 26 GB/s inbound) between the Cell processor and the I/O Bridge 700 via controller 1570A and the Reality Simulator graphics unit 1430 via controller 1570B.
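As an illustrative consistency check of the dual bus interface figures (the per-channel rate is inferred from the stated totals, not given in the text):

```python
# Illustrative check: 12 byte-wide FlexIO channels, 7 outbound and 5
# inbound. The 5.2 GB/s per-channel rate is inferred as 62.4 / 12,
# an assumption made for this sketch.
CHANNELS_OUT, CHANNELS_IN = 7, 5
PER_CHANNEL_GB_S = 62.4 / (CHANNELS_OUT + CHANNELS_IN)  # 5.2 GB/s (inferred)

outbound_gb_s = CHANNELS_OUT * PER_CHANNEL_GB_S  # 36.4 GB/s
inbound_gb_s = CHANNELS_IN * PER_CHANNEL_GB_S    # 26.0 GB/s
```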
Data sent by the Cell processor 1428 to the Reality Simulator graphics unit 1430 will typically comprise display lists, being a sequence of commands to draw vertices, apply textures to polygons, specify lighting conditions, and so on.
Embodiments of the present invention may be practiced with various computer system configurations, including hand-held devices, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like. The invention can also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a wire-based or wireless network.
With the above embodiments in mind, it should be understood that the invention can employ various computer-implemented operations involving data stored in computer systems. These operations are those requiring physical manipulation of physical quantities. Any of the operations described herein that form part of the invention are useful machine operations. The invention also relates to a device or an apparatus for performing these operations. In one embodiment, the apparatus can be specially constructed for the required purpose (e.g. a special purpose machine), or the apparatus can be a general-purpose computer selectively activated or configured by a computer program stored in the computer. In particular, various general-purpose machines can be used with computer programs written in accordance with the teachings herein, or it may be more convenient to construct a more specialized apparatus to perform the required operations.
The invention can also be embodied as computer readable code on a computer readable medium. The computer readable medium is any data storage device that can store data which can thereafter be read by a computer system. Examples of the computer readable medium include hard drives, network attached storage (NAS), read-only memory, random-access memory, CD-ROMs, CD-Rs, CD-RWs, magnetic tapes, and other optical and non-optical data storage devices. The computer readable medium can include computer readable tangible media distributed over a network-coupled computer system so that the computer readable code is stored and executed in a distributed fashion.
Although the method operations were described in a specific order, it should be understood that other housekeeping operations may be performed between operations, that operations may be adjusted so that they occur at slightly different times, or that operations may be distributed in a system which allows processing operations to occur at various intervals associated with the processing, as long as the processing of the overlay operations is performed in the desired way.
Although the foregoing invention has been described in some detail for purposes of clarity of understanding, it will be apparent that certain changes and modifications can be practiced within the scope of the appended claims. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the invention is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.
This application is a continuation application of U.S. patent application Ser. No. 11/588,779 (now issued as Pat. No. 8,062,126), filed Oct. 26, 2006 and entitled, “SYSTEM AND METHOD FOR INTERFACING WITH A COMPUTER PROGRAM”, which claims priority from U.S. Provisional Patent Application No. 60/730,659, filed Oct. 26, 2005, and entitled “SYSTEM AND METHOD FOR INTERFACING WITH A COMPUTER PROGRAM”. This application is a continuation application of U.S. patent application Ser. No. 12/259,181, filed Oct. 27, 2008, and entitled “DETERMINING LOCATION AND MOVEMENT OF BALL-ATTACHED CONTROLLER”. This application claims priority from U.S. Provisional Patent Application No. 61/057,783, filed May 30, 2008, and entitled, “DETERMINATION OF CONTROLLER THREE-DIMENSIONAL LOCATION USING IMAGE ANALYSIS AND ULTRASONIC COMMUNICATION”; U.S. Provisional Patent Application No. 61/120,340, filed on Dec. 5, 2008, and entitled “CONTROL DEVICE FOR COMMUNICATING VISUAL INFORMATION”; and U.S. Provisional Patent Application No. 61/200,973, filed on Dec. 5, 2008, and entitled “SPHERICAL ENDED CONTROLLER WITH CONFIGURABLE MODES”. All these Patent Applications are incorporated herein by reference. This application is related to U.S. patent application Ser. No. 12/145,455, filed Jun. 24, 2008 and entitled, “DETERMINATION OF CONTROLLER THREE-DIMENSIONAL LOCATION USING IMAGE ANALYSIS AND ULTRASONIC COMMUNICATION”; U.S. patent application Ser. No. 11/429,133, filed May 4, 2006, and entitled “SELECTIVE SOUND SOURCE LISTENING IN CONJUNCTION WITH COMPUTER INTERACTIVE PROCESSING”; International Application No: PCT/US2006/017483, filed May 4, 2006, and titled “SELECTIVE SOUND SOURCE LISTENING IN CONJUNCTION WITH COMPUTER INTERACTIVE PROCESSING”, all of which are incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
3943277 | Everly et al. | Mar 1976 | A |
4263504 | Thomas | Apr 1981 | A |
4313227 | Eder | Jan 1982 | A |
4558864 | Medwedeff | Dec 1985 | A |
4565999 | King et al. | Jan 1986 | A |
4787051 | Olson | Nov 1988 | A |
4802227 | Elko et al. | Jan 1989 | A |
4823001 | Kobayashi et al. | Apr 1989 | A |
4843568 | Krueger et al. | Jun 1989 | A |
5034986 | Karmann et al. | Jul 1991 | A |
5055840 | Bartlett | Oct 1991 | A |
5111401 | Everett et al. | May 1992 | A |
5128671 | Thomas, Jr. | Jul 1992 | A |
5144594 | Gilchrist | Sep 1992 | A |
5260556 | Lake et al. | Nov 1993 | A |
5297061 | Dementhon et al. | Mar 1994 | A |
5317140 | Dunthorn | May 1994 | A |
5335011 | Addeo et al. | Aug 1994 | A |
5394168 | Smith, III et al. | Feb 1995 | A |
5426450 | Drumm | Jun 1995 | A |
5455685 | Mori | Oct 1995 | A |
5473701 | Cezanne et al. | Dec 1995 | A |
5485273 | Mark et al. | Jan 1996 | A |
5528265 | Harrison | Jun 1996 | A |
5534917 | MacDougall | Jul 1996 | A |
5543818 | Scott | Aug 1996 | A |
5557684 | Wang et al. | Sep 1996 | A |
5563988 | Maes et al. | Oct 1996 | A |
5568928 | Munson et al. | Oct 1996 | A |
5581276 | Cipolla et al. | Dec 1996 | A |
5583478 | Renzi | Dec 1996 | A |
5586231 | Florent et al. | Dec 1996 | A |
5611731 | Bouton et al. | Mar 1997 | A |
5616078 | Oh | Apr 1997 | A |
5638228 | Thomas, III | Jun 1997 | A |
5649021 | Matey et al. | Jul 1997 | A |
5675825 | Dreyer et al. | Oct 1997 | A |
5675828 | Stoel et al. | Oct 1997 | A |
5677710 | Thompson-Rohrlich | Oct 1997 | A |
5706364 | Kopec et al. | Jan 1998 | A |
5768415 | Jagadish et al. | Jun 1998 | A |
5796354 | Cartabiano et al. | Aug 1998 | A |
5818424 | Korth | Oct 1998 | A |
5846086 | Bizzi et al. | Dec 1998 | A |
5850222 | Cone | Dec 1998 | A |
5850473 | Andersson | Dec 1998 | A |
5861910 | McGarry et al. | Jan 1999 | A |
5870100 | DeFreitas | Feb 1999 | A |
5883616 | Koizumi et al. | Mar 1999 | A |
5889505 | Toyama et al. | Mar 1999 | A |
5889672 | Schuler et al. | Mar 1999 | A |
5900863 | Numazaki | May 1999 | A |
5913727 | Ahdoot | Jun 1999 | A |
5914723 | Gajewska | Jun 1999 | A |
5917493 | Tan et al. | Jun 1999 | A |
5923306 | Smith et al. | Jul 1999 | A |
5923318 | Zhai et al. | Jul 1999 | A |
5929444 | Leichner | Jul 1999 | A |
5930383 | Netaer | Jul 1999 | A |
5930741 | Kramer | Jul 1999 | A |
5937081 | O'Brill et al. | Aug 1999 | A |
5959596 | McCarten et al. | Sep 1999 | A |
5963250 | Parker et al. | Oct 1999 | A |
5993314 | Dannenberg et al. | Nov 1999 | A |
6009210 | Kang | Dec 1999 | A |
6021219 | Andersson et al. | Feb 2000 | A |
6031545 | Ellenby et al. | Feb 2000 | A |
6031934 | Ahmad et al. | Feb 2000 | A |
6037942 | Millington | Mar 2000 | A |
6044181 | Szeliski et al. | Mar 2000 | A |
6049619 | Anandan et al. | Apr 2000 | A |
6056640 | Schaaij | May 2000 | A |
6057909 | Yahav et al. | May 2000 | A |
6061055 | Marks | May 2000 | A |
6075895 | Qiao et al. | Jun 2000 | A |
6078789 | Bodenmann et al. | Jun 2000 | A |
6091905 | Yahav et al. | Jul 2000 | A |
6094625 | Ralston | Jul 2000 | A |
6097369 | Wambach | Aug 2000 | A |
6100517 | Yahav et al. | Aug 2000 | A |
6100895 | Miura et al. | Aug 2000 | A |
6101289 | Kellner | Aug 2000 | A |
6115052 | Freeman et al. | Sep 2000 | A |
6134346 | Berman et al. | Oct 2000 | A |
6144367 | Berstis | Nov 2000 | A |
6151009 | Kanade et al. | Nov 2000 | A |
6157368 | Faeger | Dec 2000 | A |
6160540 | Fishkin et al. | Dec 2000 | A |
6166744 | Jaszlics et al. | Dec 2000 | A |
6173059 | Huang et al. | Jan 2001 | B1 |
6175343 | Mitchell et al. | Jan 2001 | B1 |
6184863 | Sibert et al. | Feb 2001 | B1 |
6191773 | Maruno et al. | Feb 2001 | B1 |
6195104 | Lyons | Feb 2001 | B1 |
6215898 | Woodfill et al. | Apr 2001 | B1 |
6243491 | Andersson | Jun 2001 | B1 |
6275213 | Tremblay et al. | Aug 2001 | B1 |
6281930 | Parker et al. | Aug 2001 | B1 |
6282362 | Murphy et al. | Aug 2001 | B1 |
6295064 | Yamaguchi | Sep 2001 | B1 |
6297838 | Chang et al. | Oct 2001 | B1 |
6304267 | Sata | Oct 2001 | B1 |
6307549 | King et al. | Oct 2001 | B1 |
6307568 | Rom | Oct 2001 | B1 |
6323839 | Fukuda et al. | Nov 2001 | B1 |
6323942 | Bamji | Nov 2001 | B1 |
6326901 | Gonzales | Dec 2001 | B1 |
6327073 | Yahav et al. | Dec 2001 | B1 |
6331911 | Manassen et al. | Dec 2001 | B1 |
6342010 | Slifer | Jan 2002 | B1 |
6346929 | Fukushima et al. | Feb 2002 | B1 |
6351661 | Cosman | Feb 2002 | B1 |
6371849 | Togami | Apr 2002 | B1 |
6375572 | Masuyama et al. | Apr 2002 | B1 |
6392644 | Miyata et al. | May 2002 | B1 |
6394897 | Togami | May 2002 | B1 |
6400374 | Lanier | Jun 2002 | B2 |
6409602 | Wiltshire et al. | Jun 2002 | B1 |
6411392 | Bender et al. | Jun 2002 | B1 |
6411744 | Edwards | Jun 2002 | B1 |
6417836 | Kumar et al. | Jul 2002 | B1 |
6441825 | Peters | Aug 2002 | B1 |
6473516 | Kawaguchi et al. | Oct 2002 | B1 |
6498860 | Sasaki et al. | Dec 2002 | B1 |
6504535 | Edmark | Jan 2003 | B1 |
6516466 | Jackson | Feb 2003 | B1 |
6533420 | Eichenlaub | Mar 2003 | B1 |
6542927 | Rhoads | Apr 2003 | B2 |
6545706 | Edwards et al. | Apr 2003 | B1 |
6546153 | Hoydal | Apr 2003 | B1 |
6556704 | Chen | Apr 2003 | B1 |
6577748 | Chang | Jun 2003 | B2 |
6580414 | Wergen et al. | Jun 2003 | B1 |
6580415 | Kato et al. | Jun 2003 | B1 |
6587573 | Stam et al. | Jul 2003 | B1 |
6593956 | Potts et al. | Jul 2003 | B1 |
6621938 | Tanaka et al. | Sep 2003 | B1 |
6628265 | Hwang | Sep 2003 | B2 |
6661914 | Dufour | Dec 2003 | B2 |
6674415 | Nakamura et al. | Jan 2004 | B2 |
6676522 | Rowe et al. | Jan 2004 | B2 |
6677967 | Swano et al. | Jan 2004 | B2 |
6677987 | Girod | Jan 2004 | B1 |
6709108 | Levine et al. | Mar 2004 | B2 |
6720949 | Pryor et al. | Apr 2004 | B1 |
6727988 | Kim et al. | Apr 2004 | B2 |
6741741 | Farrell | May 2004 | B2 |
6746124 | Fischer et al. | Jun 2004 | B2 |
6751338 | Wallack | Jun 2004 | B1 |
6753849 | Curran et al. | Jun 2004 | B1 |
6767282 | Matsuyama et al. | Jul 2004 | B2 |
6769769 | Podlleanu et al. | Aug 2004 | B2 |
6772057 | Breed et al. | Aug 2004 | B2 |
6774939 | Peng | Aug 2004 | B1 |
6785329 | Pan et al. | Aug 2004 | B1 |
6789967 | Forester | Sep 2004 | B1 |
6791531 | Johnston et al. | Sep 2004 | B1 |
6795068 | Marks | Sep 2004 | B1 |
6809776 | Simpson | Oct 2004 | B1 |
6819318 | Geng | Nov 2004 | B1 |
6847311 | Li | Jan 2005 | B2 |
6863609 | Okuda et al. | Mar 2005 | B2 |
6881147 | Naghi et al. | Apr 2005 | B2 |
6884171 | Eck et al. | Apr 2005 | B2 |
6890262 | Oishi et al. | May 2005 | B2 |
6917688 | Yu et al. | Jul 2005 | B2 |
6919824 | Lee | Jul 2005 | B2 |
6924787 | Kramer et al. | Aug 2005 | B2 |
6928180 | Stam et al. | Aug 2005 | B2 |
6930725 | Hayashi | Aug 2005 | B1 |
6931596 | Gutta et al. | Aug 2005 | B2 |
6943776 | Ehrenburg | Sep 2005 | B2 |
6945653 | Kobori et al. | Sep 2005 | B2 |
6951515 | Ohshima et al. | Oct 2005 | B2 |
6952198 | Hansen | Oct 2005 | B2 |
6965362 | Ishizuka | Nov 2005 | B1 |
6970183 | Monroe | Nov 2005 | B1 |
6990639 | Wilson | Jan 2006 | B2 |
7006009 | Newman | Feb 2006 | B2 |
7016411 | Azuma et al. | Mar 2006 | B2 |
7039199 | Rui | May 2006 | B2 |
7039253 | Matsuoka et al. | May 2006 | B2 |
7042440 | Pryor et al. | May 2006 | B2 |
7043056 | Edwards et al. | May 2006 | B2 |
7054452 | Ukita | May 2006 | B2 |
7059962 | Watashiba | Jun 2006 | B2 |
7061507 | Tuomi et al. | Jun 2006 | B1 |
7071914 | Marks | Jul 2006 | B1 |
7090352 | Kobor et al. | Aug 2006 | B2 |
7098891 | Pryor | Aug 2006 | B1 |
7102615 | Marks | Sep 2006 | B2 |
7106366 | Parker et al. | Sep 2006 | B2 |
7116330 | Marshall et al. | Oct 2006 | B2 |
7116342 | Dengler et al. | Oct 2006 | B2 |
7121946 | Paul et al. | Oct 2006 | B2 |
7139767 | Taylor et al. | Nov 2006 | B1 |
7148922 | Shimada | Dec 2006 | B2 |
7164413 | Davis et al. | Jan 2007 | B2 |
7183929 | Antebi et al. | Feb 2007 | B1 |
7212308 | Morgan | May 2007 | B2 |
7223173 | Masuyama et al. | May 2007 | B2 |
7224384 | Iddan et al. | May 2007 | B1 |
7227526 | Hildreth et al. | Jun 2007 | B2 |
7227976 | Jung et al. | Jun 2007 | B1 |
7245273 | Eberl et al. | Jul 2007 | B2 |
7259375 | Tichit et al. | Aug 2007 | B2 |
7263462 | Funge et al. | Aug 2007 | B2 |
7274305 | Lutrell | Sep 2007 | B1 |
7283679 | Okada et al. | Oct 2007 | B2 |
7296007 | Funge et al. | Nov 2007 | B1 |
7301530 | Lee et al. | Nov 2007 | B2 |
7305114 | Wolff et al. | Dec 2007 | B2 |
7331856 | Nakamura et al. | Feb 2008 | B1 |
7346387 | Wachter et al. | Mar 2008 | B1 |
7352359 | Zalewski et al. | Apr 2008 | B2 |
7364297 | Goldfain et al. | Apr 2008 | B2 |
7379559 | Wallace et al. | May 2008 | B2 |
7391409 | Zalewski et al. | Jun 2008 | B2 |
7436887 | Yeredor et al. | Oct 2008 | B2 |
7445550 | Barney | Nov 2008 | B2 |
7446650 | Schofield et al. | Nov 2008 | B2 |
7545926 | Mao | Jun 2009 | B2 |
7558698 | Funge et al. | Jul 2009 | B2 |
7613610 | Zimmerman et al. | Nov 2009 | B1 |
7623115 | Marks | Nov 2009 | B2 |
7627139 | Marks et al. | Dec 2009 | B2 |
7636645 | Yen et al. | Dec 2009 | B1 |
7636697 | Dobson et al. | Dec 2009 | B1 |
7636701 | Funge et al. | Dec 2009 | B2 |
7697700 | Mao | Apr 2010 | B2 |
7721231 | Wilson | May 2010 | B2 |
20010056477 | McTernan et al. | Dec 2001 | A1 |
20020010655 | Kjallstrom | Jan 2002 | A1 |
20020056114 | Fillebrown et al. | May 2002 | A1 |
20020065121 | Fukunaga et al. | May 2002 | A1 |
20020072414 | Stylinski et al. | Jun 2002 | A1 |
20020075286 | Yonezawa et al. | Jun 2002 | A1 |
20020083461 | Hutcheson et al. | Jun 2002 | A1 |
20020085097 | Colmenarez et al. | Jul 2002 | A1 |
20020094189 | Navab et al. | Jul 2002 | A1 |
20020103025 | Murzanski et al. | Aug 2002 | A1 |
20020126899 | Farrell | Sep 2002 | A1 |
20020134151 | Naruoka et al. | Sep 2002 | A1 |
20020158873 | Williamson | Oct 2002 | A1 |
20030014212 | Ralston et al. | Jan 2003 | A1 |
20030022716 | Park et al. | Jan 2003 | A1 |
20030093591 | Hohl | May 2003 | A1 |
20030100363 | Ali | May 2003 | A1 |
20030160862 | Charlier et al. | Aug 2003 | A1 |
20030232649 | Gizis et al. | Dec 2003 | A1 |
20040001082 | Said | Jan 2004 | A1 |
20040017355 | Shim | Jan 2004 | A1 |
20040063480 | Wang | Apr 2004 | A1 |
20040063481 | Wang | Apr 2004 | A1 |
20040070565 | Nayar et al. | Apr 2004 | A1 |
20040087366 | Shum et al. | May 2004 | A1 |
20040095327 | Lo | May 2004 | A1 |
20040140955 | Metz | Jul 2004 | A1 |
20040150728 | Ogino | Aug 2004 | A1 |
20040178576 | Hillis et al. | Sep 2004 | A1 |
20040207597 | Marks | Oct 2004 | A1 |
20040212589 | Hall et al. | Oct 2004 | A1 |
20040213419 | Varma et al. | Oct 2004 | A1 |
20040227725 | Calarco et al. | Nov 2004 | A1 |
20040254017 | Cheng | Dec 2004 | A1 |
20050009605 | Rosenberg et al. | Jan 2005 | A1 |
20050037844 | Shum et al. | Feb 2005 | A1 |
20050047611 | Mao | Mar 2005 | A1 |
20050088369 | Yoshioka | Apr 2005 | A1 |
20050102374 | Moragne et al. | May 2005 | A1 |
20050105777 | Koslowski et al. | May 2005 | A1 |
20050117045 | Abdellatif et al. | Jun 2005 | A1 |
20050130743 | Leifer | Jun 2005 | A1 |
20050156903 | Kawell et al. | Jul 2005 | A1 |
20050198095 | Du et al. | Sep 2005 | A1 |
20050226431 | Mao | Oct 2005 | A1 |
20050239548 | Ueshima et al. | Oct 2005 | A1 |
20060033713 | Pryor | Feb 2006 | A1 |
20060035710 | Festejo et al. | Feb 2006 | A1 |
20060038819 | Festejo et al. | Feb 2006 | A1 |
20060084504 | Chan et al. | Apr 2006 | A1 |
20060204012 | Marks et al. | Sep 2006 | A1 |
20060233389 | Mao et al. | Oct 2006 | A1 |
20060252541 | Zalewski et al. | Nov 2006 | A1 |
20060256081 | Zalewski et al. | Nov 2006 | A1 |
20060264258 | Zalewski et al. | Nov 2006 | A1 |
20060264259 | Zalewski et al. | Nov 2006 | A1 |
20060264260 | Zalewski et al. | Nov 2006 | A1 |
20060269072 | Mao | Nov 2006 | A1 |
20060269073 | Mao | Nov 2006 | A1 |
20060274032 | Mao et al. | Dec 2006 | A1 |
20060274911 | Mao et al. | Dec 2006 | A1 |
20060277571 | Marks et al. | Dec 2006 | A1 |
20060280312 | Mao | Dec 2006 | A1 |
20060282873 | Zalewski et al. | Dec 2006 | A1 |
20060287084 | Mao et al. | Dec 2006 | A1 |
20060287085 | Mao et al. | Dec 2006 | A1 |
20060287086 | Zalewski et al. | Dec 2006 | A1 |
20060287087 | Zalewski et al. | Dec 2006 | A1 |
20070015559 | Zalewski et al. | Jan 2007 | A1 |
20070021208 | Mao et al. | Jan 2007 | A1 |
20070025562 | Zalewski et al. | Feb 2007 | A1 |
20070060336 | Marks et al. | Mar 2007 | A1 |
20070061413 | Larsen et al. | Mar 2007 | A1 |
20070066394 | Ikeda et al. | Mar 2007 | A1 |
20070072675 | Hammano et al. | Mar 2007 | A1 |
20070072680 | Ikeda | Mar 2007 | A1 |
20070117625 | Marks et al. | May 2007 | A1 |
20070120834 | Boillot | May 2007 | A1 |
20070120996 | Boillot | May 2007 | A1 |
20070260340 | Mao | Nov 2007 | A1 |
20070260517 | Zalewski et al. | Nov 2007 | A1 |
20070261077 | Zalewski et al. | Nov 2007 | A1 |
20080056561 | Sawachi | Mar 2008 | A1 |
20080070684 | Haigh-Hutchinson | Mar 2008 | A1 |
20080091421 | Gustavsson | Apr 2008 | A1 |
20080261693 | Zalewski | Oct 2008 | A1 |
20090010494 | Bechtel et al. | Jan 2009 | A1 |
20090016642 | Hart | Jan 2009 | A1 |
20090209343 | Foxlin et al. | Aug 2009 | A1 |
20090221368 | Yen et al. | Sep 2009 | A1 |
20090221374 | Yen et al. | Sep 2009 | A1 |
20090288064 | Yen et al. | Nov 2009 | A1 |
20100004896 | Yen et al. | Jan 2010 | A1 |
20100137064 | Shum et al. | Jun 2010 | A1 |
Number | Date | Country |
---|---|---|
0353200 | Jan 1990 | EP |
0652686 | May 1995 | EP |
0750202 | Dec 1996 | EP |
0835676 | Apr 1998 | EP |
1098686 | May 2003 | EP |
1435258 | Jul 2004 | EP |
2814965 | Apr 2002 | FR |
2206716 | Jan 1989 | GB |
2206716 | Nov 1989 | GB |
2376397 | Nov 2002 | GB |
2388418 | Nov 2003 | GB |
01-284897 | Nov 1989 | JP |
06-102980 | Apr 1994 | JP |
07-311568 | Nov 1995 | JP |
9-128141 | May 1997 | JP |
9-185456 | Jul 1997 | JP |
9-265346 | Oct 1997 | JP |
9-265346 | Oct 1998 | JP |
11-38949 | Feb 1999 | JP |
2000-172431 | Jun 2000 | JP |
2000259856 | Sep 2000 | JP |
2000350859 | Dec 2000 | JP |
2001-166676 | Jun 2001 | JP |
2002369969 | Dec 2002 | JP |
2004-145448 | May 2004 | JP |
2004145448 | May 2004 | JP |
2005-046422 | Feb 2005 | JP |
200525410 | Aug 2005 | TW |
WO 8805942 | Aug 1988 | WO |
WO 9848571 | Oct 1998 | WO |
WO 9935633 | Jul 1999 | WO |
WO 9926198 | Oct 1999 | WO |
WO 0227456 | Feb 2002 | WO |
WO 03079179 | Sep 2003 | WO |
WO 2005073838 | Aug 2005 | WO |
WO 2005107911 | Nov 2005 | WO |
WO 2007095082 | Aug 2007 | WO |
WO 2008056180 | May 2008 | WO |
Entry |
---|
Bolt, R.A., “Put-that-there”: voice and gesture at the graphics interface, Computer Graphics, vol. 14, No. 3 (ACM SIGGRAPH Conference Proceedings) Jul. 1980, pp. 262-270. |
DeWitt, Thomas and Edelstein, Phil “Pantomation: A System for Position Tracking”, Proceedings of the 2nd Symposium on Small Computers in the Arts, Oct. 1982, pp. 61-69. |
Ephraim et al. “Speech Enhancement Using a Minimum Mean-Square Error Log-Spectral Amplitude Estimator”, 1985, IEEE. |
Ephraim et al. “Speech Enhancement Using a Minimum Mean-Square Error Short-Time Spectral Amplitude Estimator”, 1984, IEEE. |
Richardson et al. “Virtual Network Computing”, 1998, IEEE Internet Computing vol. 2. |
XP-002453974, “CFS and FS95/98/2000: How to Use the Trim Controls to Keep Your Aircraft Level”, Aug. 10, 2007, http://support.microsoft.com/?scid=kb%3Ben-us%3B175195&x=13&y=15. |
“The Tracking Cube: A Three-Dimensional Input Device”, IBM Technical Disclosure Bulletin, Aug. 1, 1989, pp. 91-95, No. 3B, IBM Corp. New York, U.S. |
K. B. Shimoga, et al., “Touch and Force Reflection for Telepresence Surgery”, Engineering in Medicine and Biology Opportunities of the IEEE, Baltimore, MD, USA, Nov. 3, 1994, New York, New York, USA, pp. 1049-1050. |
Iddan, et al., “3D Imaging in the Studio (and Elsewhere . . . )”, Proceedings of the SPIE, SPIE, Bellingham, WA, US, vol. 4298, Jan. 24, 2001, pp. 48-55, XP008005351. |
Jojic, et al., “Tracking Self-Occluding Articulated Objects in Dense Disparity Maps”, Computer Vision, 1999, The Proceedings of the Seventh IEEE International Conference on Kerkyra, Greece, Sep. 20-27, 1999, Los Alamitos, CA, US, IEEE Computer Society, US, Sep. 20, 1999, pp. 123-130. |
Klinker, et al., “Distributed User Tracking Concepts for Augmented Reality Applications”, pp. 37-44, Augmented Reality, 2000, IEEE and ACM Int'l Symposium, Oct. 2000, XP010520308, ISBN: 0-7695-0846-4, Germany. |
Nakagawa, et al., “A Collision Detection and Motion Image Synthesis Between a Background Image and a Foreground 3-Dimensional Object”, TVRSJ Vol. 4, No. 2, pp. 425-430, 1999, Japan. |
Mihara, et al., “A Realtime Vision-Based Interface Using Motion Processor and Applications to Robotics”, vol. J84-D-11, No. 9, pp. 2070-2078. Sep. 2001, Japan. |
Nakamura, et al., “A Consideration on Reconstructing 3-D Model Using Object Views”, 2004-01601-003, pp. 17-21, Hokkaido University, Japan, nakamura@media.eng.hokudai.ac.jp. |
Nishida, et al., “A Method of Estimating Human Shapes by Fitting the Standard Human Model to Partial Measured Data”, D-II vol. J84-D-II, No. 7, pp. 1310-1318, Jul. 2001. |
Wilson & Darrell, “Audio-Video Array Source Localization for Intelligent Environments”, 2002 IEEE Dept. of Electrical Eng and Computer Science, MIT, Cambridge, MA 02139. |
Fiala, et al., “A Panoramic Video and Acoustic Beamforming Sensor for Videoconferencing”, 2004 IEEE, Computational Video Group, National Research Council, Ottawa, Canada K1A 0R6. |
Hemmi, et al., “3-D Natural Interactive Interface-Using Marker Tracking from a Single View”, Sep. 9, 1991, Systems and Computers in Japan. |
Lanier, Jaron, “Virtually there: three-dimensional tele-immersion may eventually bring the world to your desk”, Scientific American, ISSN: 0036-8733, Year: 2001. |
Richardson et al., “Virtual Network Computing” IEEE Internet Computing, vol. 2, No. 1 Jan./Feb. 1998. |
Fujitsu, “Internet Development of Emulators” Abstract, Mar. 1997, vol. 48, No. 2. |
Kanade, et al., “A Stereo Machine for Video-rate Dense Depth Mapping and Its New Application” 1996, CVPR 96, IEEE Computer Society Conference, pp. 196-202 (022). |
Gvili, et al., “Depth Keying”, SPIE vol. 5006 (2003), 2003 SPIE-IS&T, pp. 564-574 (031). |
Taiwan Intellectual Property Office, “Taiwan IPO Search Report (with English translation)” for Taiwan Invention Patent Application No. 098129110, completed Jan. 2, 2013. |
Number | Date | Country | |
---|---|---|---|
20090298590 A1 | Dec 2009 | US |
Number | Date | Country | |
---|---|---|---|
61057783 | May 2008 | US | |
61120340 | Dec 2008 | US | |
61200973 | Dec 2008 | US | |
60730659 | Oct 2005 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 11588779 | Oct 2006 | US |
Child | 12428433 | US | |
Parent | 12259181 | Oct 2008 | US |
Child | 11588779 | US |