In the past, the term “Virtual Reality” has been used as a catch-all description for a number of technologies, products, and systems in the gaming, entertainment, training, and computing industries. It is often used to describe almost any simulated graphical environment, interaction device, or display technology. The term “immersion” is often used to describe any computer game in which the gamer is highly engrossed or immersed in playing the game (perhaps because of the complexity or rapid reactions required of the game), just as a reader can be engrossed in a book, even though the gamer can usually still see and hear real-world events not associated with the game.
True immersion in a game can be defined as the effect of convincing the gamer's mind to perceive the simulated game world as if it were real, so that the gamer perceives and interacts with the virtual game environment as if it were the real world. This immersive effect allows the gamer to focus on the activity of game play rather than on the mechanics of interacting with the game environment.
In recent years games and simulators have been developed in which game activity is displayed on a television, computer screen or other display, and the scene on the display changes according to the movement of the user. Many games and simulators use joysticks to translate hand movement of the user to activity on the screen. Other games and simulators use sensors attached to the user or player which translate movement of the user to activity on the screen. An example of the use of position sensors in video games can be found in Published United States Patent Application No. 2007/0132785, the content of which is herein expressly incorporated by reference.
One video game system that is currently popular is sold under the name Wii. This system contains a controller, also called a remote, similar in size to a television remote. The remote contains an infrared (IR) camera and is capable of receiving infrared light. The player holds the remote in his or her hand or attaches the controller to a leg or foot. Depending upon the game being played, movement of the player's arm or leg is translated into a display of throwing, hitting or kicking a ball. The angle and speed of the arm or leg movement determine the direction and speed of the ball in the game.
Head tracking has been used in simulators and some vehicles to enable the driver or operator to cause an action according to the movement and position of his or her head. In such systems the user wears a helmet, glasses or other device having sensors or emitters which enable the system to track the position and movement of the head.
Head tracking can be used in combination with video games to give the user a sense of being part of the environment of the game. Indeed, a head tracking unit of the type here disclosed can enable the user to see a three dimensional display and have the feeling that he or she is in that virtual space. Such a system is shown in a video on the YouTube website at http://www.youtube.com/watch?v=Jd3-eiid-Uw titled “Head Tracking for Desktop Virtual Reality Displays using the Wii Remote.” In this system a Wii controller is placed below a video display screen on which the video game is played. The user is given a bar containing two spaced apart light emitting diodes (LEDs) which emit continuous infrared light. Alternatively, the two spaced apart LEDs can be provided on a pair of glasses, with one infrared LED attached to each side of the frame. Both LEDs emit the same wavelength of light and are either on or off. The glasses or the bar with the LEDs are worn on the head of the user to permit head tracking by the Wii controller. The LEDs permit head tracking such that the scene on the screen responds to the position of the player. Head tracking can create the illusion that certain objects on the screen are behind or in front of other objects. As the user moves to different positions relative to the screen and the Wii controller positioned below the screen, objects on the screen are shown in different views. An object becomes larger on the screen as the user moves toward the screen, and other parts of an object or other objects appear on the screen as the user moves left or right. In this system, however, head tracking only works for one person at a time playing the game.
The software used in this head tracking system is a custom C# DirectX program. Johnny Chung Lee, a Ph.D. student at Carnegie Mellon University, recently made this program available as sample code for developers, without support or documentation, under the name WiiDesktopVR. This program requires information about the display size and the spacing of the LEDs.
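By way of illustration only, the following sketch shows the kind of calculation such a program can make to estimate the head position from the two infrared dots reported by the remote's camera: the pixel separation of the dots indicates the distance to the head, and their midpoint indicates the lateral and vertical offsets. The camera resolution, field of view and function names used here are assumptions made for the example and are not taken from the WiiDesktopVR program.

```c
#include <math.h>
#include <stdio.h>

#define CAMERA_X_RES  1024.0   /* assumed IR camera horizontal resolution, pixels */
#define CAMERA_Y_RES   768.0   /* assumed vertical resolution, pixels             */
#define RAD_PER_PIXEL (0.7854 / CAMERA_X_RES)  /* assumed ~45 degree horizontal field of view */

/* Estimate the head position, in millimetres relative to the camera, from the
 * pixel coordinates of the two LED dots and the known spacing of the LEDs. */
static void estimate_head_position(double x1, double y1, double x2, double y2,
                                   double led_spacing_mm,
                                   double *head_x, double *head_y, double *head_z)
{
    /* Angle subtended by the two dots, from their pixel separation. */
    double dx = x1 - x2, dy = y1 - y2;
    double angle = sqrt(dx * dx + dy * dy) * RAD_PER_PIXEL;

    /* Distance to the head: half the LED spacing over the tangent of half the angle. */
    *head_z = (led_spacing_mm / 2.0) / tan(angle / 2.0);

    /* Midpoint of the dots, re-centred and scaled by the distance, gives the
     * lateral and vertical offsets of the head. */
    double mid_x = (x1 + x2) / 2.0 - CAMERA_X_RES / 2.0;
    double mid_y = (y1 + y2) / 2.0 - CAMERA_Y_RES / 2.0;
    *head_x = mid_x * RAD_PER_PIXEL * (*head_z);
    *head_y = mid_y * RAD_PER_PIXEL * (*head_z);
}

int main(void)
{
    double hx, hy, hz;
    /* Two dots roughly centred in the image, about 220 pixels apart, LEDs 150 mm apart. */
    estimate_head_position(400, 380, 620, 390, 150.0, &hx, &hy, &hz);
    printf("head at approximately (%.0f, %.0f, %.0f) mm\n", hx, hy, hz);
    return 0;
}
```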
We provide a head tracking system in which there is a cluster of LEDs on either side of the glasses or other device worn by the user. Each cluster can be arranged in a pattern, which may or may not be the same for both clusters.
The LEDs could emit different wavelengths of light. Such wavelengths need not be limited to the infrared spectrum, but could be visible light or any other wavelength that is detectable.
The LEDs can be activated in a manner to strobe or provide a distinct pattern of on and off pulses. The pulses may or may not be the same for all of the LEDs or cluster of LEDs.
Preferably the LEDs are controlled by a microprocessor which enables the LEDs to be strobed or activated in distinct patterns or according to encryption methods. These patterns may be selected to correspond to a particular game or gaming device, and may be digitally modulated to transmit digital data. Consequently, a particular set of head or body apparatus could be designed for use with only one type or brand of video game system. The patterns may also be used to identify different body part locations.
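By way of illustration only, the following sketch shows one way a microprocessor could strobe an LED cluster with a device-specific on/off code of the kind described above. The hardware calls are stubbed out so the bit pattern can be observed on a desktop build; on the actual device they would drive LED pins and a timer. The code values and function names are assumptions made for this example.

```c
#include <stdint.h>
#include <stdio.h>

#define NUM_LEDS 4

/* Stub: on real hardware this would switch an LED driver pin on or off. */
static void set_led(int led, int on) { printf("LED%d=%d ", led, on); }

/* Stub: on real hardware this would wait for the next strobe slot. */
static void delay_ms(unsigned int ms) { (void)ms; printf("| "); }

/* One hypothetical identification code per LED; different codes could be
 * assigned to a headset, an arm band, a leg band, or a particular game system. */
static const uint8_t cluster_codes[NUM_LEDS] = { 0xA5, 0xA5, 0x3C, 0x3C };

/* Transmit each LED's code, most significant bit first, one bit per strobe
 * slot. A camera sampling at the strobe rate can recover the code and hence
 * identify the wearer, the body part, or the intended game system. */
static void strobe_identification(void)
{
    for (int bit = 7; bit >= 0; bit--) {
        for (int led = 0; led < NUM_LEDS; led++)
            set_led(led, (cluster_codes[led] >> bit) & 1);
        delay_ms(10);
    }
    /* Leave all LEDs on between code bursts so position tracking is uninterrupted. */
    for (int led = 0; led < NUM_LEDS; led++)
        set_led(led, 1);
    printf("\n");
}

int main(void) { strobe_identification(); return 0; }
```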
Other aspects and advantages of our system will become apparent from a description of certain present preferred embodiments shown in the accompanying drawings.
The first present preferred embodiment of our head tracking system is in the form of a headset 1 shown in
A second present preferred embodiment shown in
In a third present embodiment, each LED cluster 5, 6 is attached to a band 14 that may fit over an arm or leg of a user as shown in
In the embodiments illustrated in
The LEDs can be arranged in any desired configuration. In the embodiment of
The receiver in the eyewear or other device worn by the user receives signals from a controller or other device associated with the video game system. The receiver could use IR from the light bar included in the Wii or other IR device, using some type of modulation or coding method. It could also use RF or other communication techniques. The receiver could be coupled to a microprocessor or controller that activates and controls the LEDs. This embodiment can be designed so that a distinct signal or pattern must be received to activate the head tracking unit. Indeed, different patterns or signals could be used to enable the head tracking unit to be used with different games, multiple players on the same game, or other activities. Consequently, one pattern would enable the user to play one game and a different pattern could be used to play another game. The patterns may also be used to set the level of difficulty of the game. Similarly, patterns emitted from the LEDs on the head tracking unit could be used in a similar way. The receiver and LEDs enable two way communications between the eyeglasses or other wearable device and the game controller. All of this would be determined by software in the microprocessor or microprocessors used to control the LEDs and the game controller. The patterns may be sent once, continuously or intermittently.
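By way of illustration only, the following sketch shows how the microprocessor on the eyewear might compare a received code against a table of activation patterns and enable head tracking only for a recognized game, player or difficulty level. The code values and the receiver stub are assumptions made for this example and do not describe any particular console's protocol.

```c
#include <stdint.h>
#include <stdio.h>

/* Activation codes: one per game, per player, or per difficulty level. */
struct activation { uint16_t code; const char *purpose; };
static const struct activation known_codes[] = {
    { 0x5A3C, "game A, player 1" },
    { 0x5AC3, "game A, player 2" },
    { 0xA531, "game B, hard difficulty" },
};

/* Stub standing in for the IR or RF receiver: returns a 16-bit code
 * assembled from the received pulse train. */
static uint16_t read_received_code(void) { return 0x5AC3; }

int main(void)
{
    uint16_t code = read_received_code();
    for (size_t i = 0; i < sizeof known_codes / sizeof known_codes[0]; i++) {
        if (known_codes[i].code == code) {
            printf("activating head tracking: %s\n", known_codes[i].purpose);
            /* ...here the unit would begin the LED strobe pattern that answers this code... */
            return 0;
        }
    }
    printf("unknown code 0x%04X: head tracking stays off\n", code);
    return 0;
}
```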
The microprocessor that is used as the controller can be very small and attached to the frame of the glasses or headband as shown in
One could provide diffusers or filters on the LEDs or LED clusters to create a desired effect.
While we have discussed using the LEDs on the headset, eyeglasses and band shown in
A speaker and a microphone could be provided in the glasses, earmuffs or other wearable device. These components could be wired to the game console or be wireless.
The source of power for the LEDs in the glasses or other wearable device could be a single use or rechargeable battery. If a rechargeable battery is used, battery leads may be located to enable the glasses or other wearable device to be placed in a docking station for recharging when not in use. The eyeglasses or other wearable device could plug into the Wii remote held in the player's hand, using the “nunchuk” port, to provide power and/or communications to and from the headset, from the Wii remote and/or from the Wii console, which talks to the remote via RF (Bluetooth). The power source could also be wireless, RF or inductive. The power can be switched on manually, by an external trigger such as IR or RF, or by a motion sensing trigger.
Because the use of the glasses or other wearable device containing the LEDs allows the system to know the position of the player or user in the room, one can design games or other displays that use that position information as part of the display or game. For example, the game may require the player to go to a position in the room and wait until the player does so. Then the user's position could be displayed on the screen or otherwise used. Indeed, the game software could utilize the position of the user in the room as a feature of the game. For example, the user may be directed by the game to move through a virtual room.
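By way of illustration only, the following sketch shows game logic of this kind: the game polls the tracked position and proceeds only once the player is standing within a set radius of a target spot in the room. The coordinates and the position stub are example values only and do not describe any particular game.

```c
#include <math.h>
#include <stdio.h>

struct pos { double x, z; };   /* position on the floor, metres, relative to the camera */

/* Stub standing in for the tracked head position described above. */
static struct pos get_player_position(void) { return (struct pos){ 0.4, 1.9 }; }

/* True when the player is within radius_m of the target spot. */
static int player_at(struct pos target, double radius_m)
{
    struct pos p = get_player_position();
    double dx = p.x - target.x, dz = p.z - target.z;
    return sqrt(dx * dx + dz * dz) <= radius_m;
}

int main(void)
{
    struct pos target = { 0.5, 2.0 };   /* e.g. "go to the middle of the room" */
    if (player_at(target, 0.3))
        printf("player in position: start the next stage\n");
    else
        printf("waiting for the player to reach the target spot\n");
    return 0;
}
```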
While the discussion has been focused on activity in a video game context, the system is not so limited. Being able to sense and track the position of the user in a room enables the system to be used to teach movement to the user. Those movements may constitute a dance step, a physical exercise or any other activity involving movement. The movement of the user could be displayed on the screen along with or in addition to the movement being taught.
Currently, the Wii system, as well as other video game consoles, is designed for network connection via the internet with other users of a comparable system. This enables two or more players in different locations to play the same game. The position tracking capability here disclosed enables the creation of video games in which the movement of two or more players becomes part of the game. Each player could be in a virtual room or other virtual location and the position of each player could be displayed on the screen. Even if a player's position is not displayed, that position could be tracked and be utilized in the game.
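By way of illustration only, the following sketch shows a simple per-player position message that networked consoles could exchange so that each console can display, or silently use, the other players' tracked positions. The field layout is an assumption made for this example and is not an existing protocol.

```c
#include <stdint.h>
#include <stdio.h>

struct player_update {
    uint8_t player_id;          /* which player this update describes        */
    int16_t x_mm, y_mm, z_mm;   /* tracked head position, millimetres        */
    uint8_t flags;              /* e.g. bit 0: display this player on screen */
};

/* Write a 16-bit value into the buffer, low byte first. */
static void put16(uint8_t *p, int16_t v)
{
    uint16_t u = (uint16_t)v;
    p[0] = (uint8_t)(u & 0xFF);
    p[1] = (uint8_t)(u >> 8);
}

/* Pack the update into an 8-byte message for transmission to the other
 * consoles over the network connection. */
static size_t pack_update(const struct player_update *u, uint8_t buf[8])
{
    buf[0] = u->player_id;
    put16(&buf[1], u->x_mm);
    put16(&buf[3], u->y_mm);
    put16(&buf[5], u->z_mm);
    buf[7] = u->flags;
    return 8;
}

int main(void)
{
    struct player_update me = { .player_id = 1, .x_mm = 350,
                                .y_mm = 1600, .z_mm = 2100, .flags = 1 };
    uint8_t wire[8];
    printf("sending %zu-byte position update for player %d\n",
           pack_update(&me, wire), me.player_id);
    return 0;
}
```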
Our tracking device is not limited to the specific embodiments described and illustrated but may be variously embodied within the scope of the following claims.
Applicant claims the benefit of U.S. Provisional Application Ser. No. 61/070516 filed Mar. 24, 2008.