Virtual reality technology is becoming more sophisticated and available to the general public. Currently, many virtual reality systems require a user to sit in a chair, wear a bulky headset, and face a specific direction while limited optical sensors track certain movements of portions of the headset. As the user moves his or her head from side to side, the image provided to the user may change. The optical sensors provide a line-of-sight signal to the headset and may provide input to a remote server to update a graphical interface when the headset is detected to shift to the left or the right.
Virtual reality systems based on optical tracking have significant limitations. First, tracking systems based on optical sensors require a line of sight between the optical sensor and the user. Additionally, the resulting virtual environments are limited to the space defined by a physical arena or room. What is needed is an improved virtual reality system.
The present technology, roughly described, provides a combined physical and virtual environment in which a user's position in a physical environment is displayed at an offset position within the virtual environment. The offset is determined based on a mapping between the physical environment and the virtual environment, with offsets generated for the user's position and direction as the user moves throughout the physical environment. The physical environment and the corresponding virtual environment may have different layouts. The offsets are used to correlate portions of the physical and virtual environments so that a user does not perceive the differences between them. By providing offsets in this manner, an enclosed physical environment may be used to provide an expanded and effectively unlimited virtual environment for a user to navigate and explore.
In some implementations, when a user moves through a physical environment that is curved or otherwise nonlinear, offsets may be used to make it appear that a user is traveling in a straight direction in a corresponding virtual environment. In fact, if the physical environment includes a closed loop curve (e.g., a circular hallway), a user may be guided indefinitely along a straight path or “infinite hallway.”
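As a rough illustration of this offset idea, the following sketch applies a positional and directional offset to a tracked physical pose. The 2-D rigid transform and all names are hypothetical; the source does not prescribe a particular formula.

```python
import math

def to_virtual(phys_x, phys_y, phys_heading, pos_offset, heading_offset):
    """Map a tracked physical pose to an offset virtual pose.

    pos_offset is an (x, y) translation and heading_offset a rotation in
    radians, both maintained by the system as the user moves.
    """
    # Rotate the physical position by the accumulated heading offset ...
    cos_h, sin_h = math.cos(heading_offset), math.sin(heading_offset)
    virt_x = phys_x * cos_h - phys_y * sin_h + pos_offset[0]
    virt_y = phys_x * sin_h + phys_y * cos_h + pos_offset[1]
    # ... and rotate the heading so the perceived direction of travel
    # differs from the physical one.
    return virt_x, virt_y, phys_heading + heading_offset
```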
In an embodiment, a method may provide a combined virtual and physical environment. A local machine may track a user position in a physical environment. The local machine may also determine the user's position in a virtual environment based on the user's tracked position in the physical environment and offset data.
In an embodiment, a system for transmitting a plurality of wideband tracking signals within a position tracking system may include a processor, memory, and one or more modules stored in memory. The one or more modules may be executable by the processor to track a user position in a physical environment and display the user's position in a virtual environment based on the user's tracked position in the physical environment and offset data.
Receivers 112-117 may be placed on a player 140 or an accessory 135. Each receiver may receive one or more signals from one or more of transmitters 102-108. The signals received from each transmitter may include an identifier identifying the particular transmitter. In some instances, the transmitters may be synchronized so that each periodically transmits an omnidirectional signal at the same point in time. Each receiver may receive signals from multiple transmitters, and each receiver may then provide signal identification information and timestamp information for each received signal to player computer 120. By determining when each transmitter's signal was received by each receiver, player computer 120 may identify the location of each receiver.
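The source does not specify how the arrival timestamps are converted into a position. One plausible approach, given simultaneous transmissions, is time-difference-of-arrival (TDOA) multilateration; a minimal sketch with hypothetical transmitter coordinates follows.

```python
import numpy as np
from scipy.optimize import least_squares

# Known transmitter positions within the pod (hypothetical coordinates, meters).
TRANSMITTERS = np.array([[0.0, 0.0, 3.0],
                         [10.0, 0.0, 3.0],
                         [10.0, 10.0, 3.0],
                         [0.0, 10.0, 3.0]])
C = 299_792_458.0  # RF propagation speed, m/s

def locate_receiver(arrival_times, guess=(5.0, 5.0, 1.0)):
    """Estimate a receiver's position from the arrival timestamps of
    synchronized transmitter signals (TDOA multilateration)."""
    # Arrival-time differences relative to transmitter 0 cancel the unknown
    # (shared) emission time, leaving range differences.
    dt = np.asarray(arrival_times) - arrival_times[0]
    range_diff = C * dt

    def residuals(p):
        dists = np.linalg.norm(TRANSMITTERS - p, axis=1)
        return (dists - dists[0]) - range_diff

    return least_squares(residuals, guess).x
```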
Player computer 120 may be positioned on a player, such as on the back of a vest worn by the player. A player computer may receive information from a plurality of receivers, determine the location of each receiver, and then locally update a virtual environment accordingly. Updates to the virtual environment may include the player's point of view, events that occur in the environment, and the video and audio output provided to the player to represent that point of view and those events.
Player computer 120 may also communicate changes to the virtual environment determined locally at the computer to other player computers, such as player computer 122, through game computer 150. In particular, a player computer for a first player may detect a change in the player's position based on receivers on the player's body, determine changes to the virtual environment for that player, and provide those changes to game computer 150, which may then provide those updates to any other player computers for other players in the same virtual reality session, such as a player associated with player computer 122.
A player 140 may have multiple receivers on his or her body. The receivers receive information from the transmitters 102-108 and provide that information to the player computer. In some instances, each receiver may provide the data to the player computer wirelessly, such as through a radio-frequency signal such as a Bluetooth signal. In some instances, each receiver may be paired or otherwise configured to only communicate data with a particular player's computer. In some instances, a particular player computer may be configured to only receive data from a particular set of receivers. Based on physical environment events such as a player walking, local virtual events provided by the player's computer, or remote virtual events triggered by an element of the virtual environment located remotely from the player, haptic feedback may be triggered and sensed by a player. The haptic feedback may be provided by way of transducer 132 and motor 133. For example, if an animal or object touches a player at a particular location on the player's body within the virtual environment, a transducer located at that position may be activated to provide the haptic sensation of being touched by that object.
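As an illustration of how a virtual touch event might be routed to a body-worn transducer, here is a minimal sketch; the device schema and the `activate` callback are hypothetical, and the actual dispatch logic is not specified by the source.

```python
from dataclasses import dataclass

@dataclass
class HapticDevice:
    """A transducer or motor worn at a known location on the player's body."""
    device_id: int
    body_position: tuple  # (x, y, z) in the player's body frame

def trigger_touch_feedback(touch_point, devices, activate):
    """Activate the haptic device closest to a virtual touch location."""
    nearest = min(
        devices,
        key=lambda d: sum((a - b) ** 2
                          for a, b in zip(d.body_position, touch_point)),
    )
    activate(nearest.device_id)  # e.g., sent over the paired wireless link
```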
Visual display 134 may be provided through a headset worn by player 140. The visual display 134 may include a helmet, a display, and other elements and components needed to provide visual and audio output to player 140. In some instances, player computer 120 may generate and provide virtual environment graphics to the player through visual display 134.
Accessory 135 may be an element separate from the player, in communication with player computer 120, and displayed within the virtual environment through visual display 134. For example, an accessory may include a gun, a torch, a light saber, a wand, or any other object that can be graphically displayed within the virtual environment and physically engaged or interacted with by player 140. Accessories 135 may be held by a player 140, touched by a player 140, or otherwise engaged in a physical environment and represented within the virtual environment by player computer 120 through visual display 134.
Game computer 150 may communicate with player computers 120 and 122 to receive updated virtual information from the player computers and provide that information to other player computers currently active in the virtual reality session. Game computer 150 may store and execute a virtual reality engine, such as Unity game engine, Leap Motion, Unreal game engine, or another virtual reality engine. Game computer 150 may also provide virtual environment data to networking computer 170 and ultimately to other remote locations through network 180.
Environment devices 162 may include physical devices that form part of the physical environment. The devices 162 may provide an output that may be sensed or detected by a player 140. For example, an environment device 162 may be a source of heat, cold, wind, sound, smell, vibration, or some other stimulus that may be detected by a player 140.
Transmitters 102-108 may transmit a synchronized wideband signal within a pod to one or more receivers 112-117. Logic on the receiver and on a player computing device, such as player computing device 120 or 122, may enable the location of each receiver to be determined in a universal space within the pod.
A virtual reality system may be initialized and calibrated at step 415. Initialization and calibration may include calibrating a tracking system, initializing the virtual environment software, and other initialization and calibration tasks.
The user's physical position may be tracked at step 420. A user may be tracked continuously as the user navigates throughout the physical environment. As the user moves throughout the physical environment, position data generated by a tracking system is provided to a local machine at step 425. The local machine may be, in some implementations, attached, coupled, worn, or otherwise positioned on a user's body. The user position data may include data indicating a position of one or more receivers located on portions of the user, objects carried by the user, or at other locations.
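The position data of step 425 might be organized as one reading per receiver; the following schema is purely illustrative and not taken from the source.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ReceiverReading:
    """One receiver's position sample, as computed from transmitter signals."""
    receiver_id: int
    x: float
    y: float
    z: float
    timestamp: float

@dataclass
class UserPositionData:
    """Position data provided to the local machine at step 425."""
    user_id: int
    readings: List[ReceiverReading]  # body-worn and accessory receivers
```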
Offsets for the user within the virtual environment may be determined at step 430. The offsets may include directional offsets and positional offsets, and may be used to alter the perceived path of the user within the virtual environment from the actual path of the user within the physical environment. For example, the offsets may be used to make a curved physical path traveled by a user appear as a straight path within the virtual environment. Determining offsets for a user within a virtual environment is discussed in more detail below.
A user is displayed within a virtual environment with offsets at step 435. A user may be displayed as a first object within the virtual environment. The movement of the user within the virtual environment may be displayed based on tracking data received by the local machine and offsets determined based on the location of the user. An offset user position is transmitted to remote machines at step 440. In some instances, the local machine of the user may first transmit the user's offset location to a game computer, and the game computer may transmit the offset user position data to other user computers or remote machines. The remote machines may update the user location within the virtual environment for the particular user associated with a remote machine at step 440. Hence, as a user moves around a physical environment, the updated offset position of the user within the virtual environment is provided to other users participating in a virtual reality session in real time.
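Steps 420 through 440 amount to a per-frame loop on the local machine. The sketch below uses hypothetical interfaces for the tracking system, offset computation, headset display, and game computer; the update rate is likewise an assumption.

```python
import time

def run_tracking_loop(tracker, offset_model, display, game_server, user_id):
    """Track the user, apply offsets, render, and publish, once per frame."""
    while True:
        phys_pose = tracker.read_pose()            # steps 420-425: physical position
        offsets = offset_model.compute(phys_pose)  # step 430: directional/positional offsets
        virt_pose = offsets.apply(phys_pose)       # offset position in the virtual space
        display.render(virt_pose)                  # step 435: display user with offsets
        game_server.publish(user_id, virt_pose)    # step 440: forward to remote machines
        time.sleep(1 / 90)                         # assumed 90 Hz update rate
```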
Physical points along the walls and corners are assigned to points within a virtual environment at step 520. Assigning the physical points to the virtual environment points ensures that the physical walls are aligned with walls displayed within the virtual environment and can be interacted with as such. The virtual environment may be restructured to fit the physical space at step 525. Restructuring a virtual environment may include adjusting the size of virtual spaces, adjusting the speed at which a user may travel through a particular space, and adjusting other parameters of the virtual environment. One simple way to realize such restructuring is sketched below.
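The sketch assumes a per-axis scale factor between the measured physical extent and the nominal virtual extent; this is an assumption, as the source does not give the restructuring rules.

```python
def fit_virtual_to_physical(virtual_extent, physical_extent):
    """Return per-axis scale factors that shrink or stretch a virtual space
    so its walls align with the measured physical walls (steps 520-525)."""
    return tuple(p / v for p, v in zip(physical_extent, virtual_extent))

# Example: an 8 m x 6 m physical room hosting a nominally 12 m x 9 m virtual
# room yields a 2/3 scale per axis, i.e., movement is mapped at 2/3 speed.
print(fit_virtual_to_physical((12.0, 9.0), (8.0, 6.0)))  # -> (0.666..., 0.666...)
```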
Returning to the method, a current user position with respect to a starting position is determined at step 620. The user position with respect to the starting position is used to determine how far the user has traveled along the curved path in the model of the physical environment.
The length of travel in a virtual environment hall or path is determined based on the determined angle at step 630. The length of travel may be determined by applying the proportion of the angle traveled, with respect to the maximum allowed angle of travel, to the maximum length of travel in the corresponding path in the virtual environment. The proportion may be expressed as:

$$\frac{\alpha_n}{\alpha_{tot}} = \frac{D_n'}{D_{tot}'}, \qquad \text{so that} \qquad D_n' = \frac{\alpha_n}{\alpha_{tot}}\,D_{tot}'$$

where the angle of travel is $\alpha_n$, the maximum possible angle of travel is $\alpha_{tot}$, the maximum possible distance traveled in the virtual environment is $D_{tot}'$, and the determined distance traveled in the virtual environment is $D_n'$.
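In code, the step-630 mapping is a single proportion; here is a minimal sketch (function name hypothetical).

```python
def virtual_distance(angle_traveled, angle_max, virtual_length_max):
    """Map the angle traveled along the curved physical path to a distance
    along the straight virtual path: D_n' = (a_n / a_tot) * D_tot'."""
    return (angle_traveled / angle_max) * virtual_length_max

# Example: a quarter of the allowed arc maps to a quarter of the hallway.
assert virtual_distance(90.0, 360.0, 40.0) == 10.0
```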
A side-to-side position within a hall or other traversable space within the virtual environment is determined, based on the distance the user is from the rotation point in the physical environment, at step 635.
The shortest distance a user may be from the rotation point may be represented by minimum distance $d_{min}$, and the furthest distance a user may be from the rotation point may be represented by maximum distance $d_{max}$. The actual distance the user is located from the rotation point may be represented as $d_{off}$. In the virtual environment, these distances are correlated to distances $d_{min}'$, $d_{max}'$, and $d_{off}'$ in the straight path of the virtual environment.
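The source does not give the correlation formula for step 635; a linear interpolation between the two wall distances is a natural assumption, sketched below.

```python
def virtual_lateral_position(d_off, d_min, d_max, d_min_v, d_max_v):
    """Correlate the user's radial distance from the rotation point to a
    side-to-side position in the straight virtual hall (linear mapping
    assumed; the source does not specify the formula)."""
    t = (d_off - d_min) / (d_max - d_min)  # 0 at the inner wall, 1 at the outer
    return d_min_v + t * (d_max_v - d_min_v)

# Example: standing midway between the curved walls maps to the middle of a
# 2 m wide virtual hallway.
print(virtual_lateral_position(3.0, 2.0, 4.0, 0.0, 2.0))  # -> 1.0
```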
A first user movement is detected at step 920. A determination is then made as to whether the first user movement results in a new chunk at step 930. If the movement does not result in a new chunk, the method returns to step 920 to detect further user movement.
A determination is made as to whether a second user is present in the physical space associated with the second chunk at step 950. When a user moves from a first chunk to a second chunk, other users may exist in the same physical space as the first user but be experiencing different chunks of the virtual environment. If there are no other users in the first user's physical space who are experiencing a different chunk, the method continues without generating secondary objects.
A secondary object is generated to represent the second user in the new chunk for the first user at step 960. Though each user within the virtual environment is associated with a graphical object, a secondary graphical object may be generated to represent a particular user in a chunk other than the one that user is experiencing. This allows a user to recognize that another user, or some object, occupies the same physical space while experiencing a different chunk, which helps to prevent collisions or other contact between the two users. A secondary object may also be generated to represent the first user in the chunk associated with the second user at step 970.
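The secondary-object logic of steps 950 through 970 might look like the following sketch; the user model and `spawn_proxy` callback are hypothetical.

```python
def on_chunk_transition(first_user, users, spawn_proxy):
    """After a user enters a new chunk, spawn proxy objects for every other
    user who shares the physical space but is experiencing a different chunk."""
    for other in users:
        if other is first_user:
            continue
        same_space = other.physical_space == first_user.physical_space
        different_chunk = other.chunk != first_user.chunk
        if same_space and different_chunk:
            # Represent the other user inside the first user's chunk ...
            spawn_proxy(owner_chunk=first_user.chunk, represents=other)
            # ... and the first user inside the other user's chunk, so both
            # can avoid collisions in the shared physical space.
            spawn_proxy(owner_chunk=other.chunk, represents=first_user)
```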
Mass storage device 1130, which may be implemented with a magnetic disk drive, an optical disk drive, or solid-state non-volatile storage, is a non-volatile storage device for storing data and instructions for use by processor unit 1110. Mass storage device 1130 can store the system software for implementing embodiments of the present invention for purposes of loading that software into main memory.
Portable storage device 1140 operates in conjunction with a portable non-volatile storage medium, such as a floppy disk, compact disk, or digital video disc, to input and output data and code to and from the computer system 1100.
Input devices 1160 provide a portion of a user interface. Input devices 1160 may include an alpha-numeric keypad, such as a keyboard, for inputting alpha-numeric and other information, or a pointing device, such as a mouse, a trackball, a stylus, or cursor direction keys.
Display system 1170 may include a liquid crystal display (LCD) or other suitable display device. Display system 1170 receives textual and graphical information, and processes the information for output to the display device.
Peripherals 1180 may include any type of computer support device to add additional functionality to the computer system. For example, peripheral device(s) 1180 may include a modem or a router.
The components contained in the computer system 1100 are those typically found in computer systems that may be suitable for use with embodiments of the present technology, and are intended to represent a broad category of such computer components that are well known in the art.
The foregoing detailed description of the technology herein has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the technology to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. The described embodiments were chosen in order to best explain the principles of the technology and its practical application to thereby enable others skilled in the art to best utilize the technology in various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the technology be defined by the claims appended hereto.
This application is a continuation-in-part of and claims the priority benefit of U.S. patent application Ser. No. 14/942,878, titled "Combined Virtual and Physical Environment," filed Nov. 15, 2015, which claims the priority benefit of U.S. provisional application 62/080,308, titled "Systems and Methods for Creating Combined Virtual and Physical Environments," filed Nov. 15, 2014, and U.S. provisional application 62/080,307, titled "Systems and Methods for Creating Combined Virtual and Physical Environments," filed Nov. 15, 2014, the disclosures of which are incorporated herein by reference.
Provisional applications:

| Number | Date | Country |
| --- | --- | --- |
| 62/080,308 | Nov. 2014 | US |
| 62/080,307 | Nov. 2014 | US |
Parent case:

| Relation | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | 14/942,878 | Nov. 2015 | US |
| Child | 15/183,839 | | US |