Virtual reality systems simulate virtual environments in which a user may be immersed. Displays such as head-up displays, head-mounted displays, etc., may be utilized to display the virtual environment. Typically, virtual reality systems provide the user with a fully virtual experience having no correspondence to the physical environment in which the user is located. In some cases, virtual environments are based on real-world settings, though such systems typically involve pre-experience modeling of the physical environment and are limited in the extent to which real-world features enrich the user's virtual experience.
According to one aspect of the disclosure, an integrated virtual environment is displayed on a display device for a user and from the user's vantage point. The integrated virtual environment incorporates virtualized representations of real-world physical objects from the user's environment into an existing virtual environment. The view of the integrated virtual environment may change in response to the user moving within their physical environment and/or interacting with the physical objects in their physical environment.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
Aspects of this disclosure will now be described by example and with reference to the illustrated embodiments listed above. Components, process steps, and other elements that may be substantially the same in one or more embodiments are identified coordinately and are described with minimal repetition. It will be noted, however, that elements identified coordinately may also differ to some degree. It will be further noted that the drawing figures included herein are schematic and generally not drawn to scale. Rather, the various drawing scales, aspect ratios, and numbers of components shown in the figures may be purposely distorted to make certain features or relationships easier to see.
Virtual reality systems allow a user to become immersed to varying degrees in a simulated virtual environment. To heighten the sense of immersion, the virtual environment may be displayed to the user via a head-mounted display (HMD). The present disclosure describes systems and methods that allow a user to interact with their physical environment and incorporate real-world elements from the physical environment into the virtual environment.
Referring first to
Turning back to
For example,
Integrated virtual environment 200 may further include one or more virtual objects, such as virtual objects 214, 216, 218, and 220, that do not correspond to a physical object. As such, virtual objects 214, 216, 218, and 220 may be virtual objects of an existing virtual environment associated with a particular video game. When an existing virtual environment further includes virtualized representations of physical objects, such as those discussed above, the existing virtual environment may be referred to as an integrated virtual environment, such as integrated virtual environment 200.
Using the combat game scenario as an example, a virtualized representation of a physical object may be selected from one of a plurality of candidate virtualized representations based on characteristics of the physical object. For example,
As non-limiting examples, a gaming system may consider one or more characteristics of a physical object, such as geometric shape, geometric size, weight, and/or tactile feel. One or more of said characteristics may be used to match a physical object to a virtualized representation. For example, the system may recognize that physical objects 102 and 104 have geometric shapes similar to candidates 302 and 304, respectively, and select candidates 302 and 304 as virtualized representations of their respective physical objects. The system may modify the appearance, such as the size and/or the perspective view, of candidates 302 and 304 to more closely match the dimensions of physical objects 102 and 104. For example, as shown in
As another example, the system may recognize that physical object 106 is heavy and select candidate 306 as a virtualized representation of physical object 106. The system may modify, for example, the number of sandbags and/or the configuration of the sandbags to closely resemble the geometric size and geometric shape of physical object 106, which is depicted in
As another example, the system may recognize that physical object 108, depicted in
It will be appreciated that some physical objects may be incorporated into an existing virtual environment such that their virtualized representation is substantially the same as the physical object. Such objects may be virtually rendered with few, if any, modifications. Using the combat game as a non-limiting example, a physical environment may include a helmet 110 and/or a canteen 112 that the system may incorporate into the virtual environment as a virtual helmet 310 and a virtual canteen 312 for the user to interact with. User interaction with virtualized representations of physical objects will be discussed in greater detail with respect to
Alternatively, in a semi-transparent virtual environment, physical objects that are already compatible with the existing virtual environment may be visually displayed to the user without creating a virtualized representation of the physical object. For example, since a helmet and a canteen are listed as candidates in
In some embodiments, the virtualized representations of the physical objects may occupy a greater or lesser geometric space than the physical objects, and/or may differ to some degree in exact location. In such cases, the geometric center of the virtualized representations may substantially align with the geometric center of their respective physical objects in order to maintain a spatial relationship between the user and the physical objects. However, it will be appreciated that other configurations for maintaining a spatial relationship are possible.
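By way of non-limiting illustration only, the following Python sketch shows one possible way to match a physical object to a candidate virtualized representation based on its characteristics, scale the selected candidate toward the object's dimensions, and align geometric centers as described above. All record types, field names, and the scoring rule are hypothetical assumptions for illustration, not the disclosed implementation.

```python
from dataclasses import dataclass

@dataclass
class PhysicalObject:
    shape: str           # coarse geometric class, e.g. "box"
    size: tuple          # bounding-box extents (x, y, z) in meters
    center: tuple        # geometric center in room coordinates
    weight_class: str    # e.g. "light", "heavy"

@dataclass
class Candidate:
    name: str
    shape: str
    weight_class: str
    base_size: tuple     # native extents of the candidate model

def score(obj, cand):
    """Crude match score: more shared characteristics -> higher score."""
    return (obj.shape == cand.shape) + (obj.weight_class == cand.weight_class)

def select_and_place(obj, candidates):
    best = max(candidates, key=lambda c: score(obj, c))
    # Scale the candidate toward the physical object's extents, then align
    # geometric centers to preserve the user/object spatial relationship.
    scale = tuple(o / b for o, b in zip(obj.size, best.base_size))
    return {"candidate": best.name, "scale": scale, "position": obj.center}

couch = PhysicalObject("box", (2.0, 0.9, 0.8), (1.5, 0.45, 3.0), "heavy")
library = [Candidate("sandbag_pile", "box", "heavy", (1.0, 1.0, 1.0)),
           Candidate("crate", "box", "light", (0.5, 0.5, 0.5))]
print(select_and_place(couch, library))  # -> sandbag pile scaled and centered on the couch
```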
While
The methods and processes described herein may be tied to a variety of different types of computing systems.
As shown in
Virtually any object recognition and/or scene capture technology may be used without departing from the scope of this disclosure. As one example, structured light 3D scanners may determine the geometry of physical object 106 within physical environment 100. Example object recognition and/or scene capture technologies are further discussed below with reference to
In some cases, such as with objects that have moving parts, the 3D spatial model 502 may include or be used to generate a virtual skeleton 504. Virtual skeleton 504 may be derived from, or included as part of, 3D spatial model 502 to provide a machine-readable representation of physical object 106. The virtual skeleton 504 may be generated in any suitable manner; in some embodiments, one or more skeletal fitting algorithms may be applied. The present disclosure is compatible with virtually any skeletal modeling techniques.
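As a non-limiting illustration, the sketch below models a virtual skeleton as a set of named joints derived from a labeled 3D spatial model. The per-cluster centroid "fitting" is a deliberately simple stand-in for whatever skeletal fitting algorithm an embodiment actually employs, and the joint names are hypothetical.

```python
import numpy as np

class Joint:
    def __init__(self, name, position):
        self.name = name                              # corresponds to a feature of the object
        self.position = np.asarray(position, float)   # 3D position in model coordinates

class VirtualSkeleton:
    def __init__(self, joints):
        self.joints = {j.name: j for j in joints}

def fit_skeleton(points, labels):
    """Derive one joint per labeled feature cluster of the 3D spatial model."""
    joints = [Joint(f"feature_{k}", points[labels == k].mean(axis=0))
              for k in np.unique(labels)]
    return VirtualSkeleton(joints)

# Two clusters of model points -> two joints (e.g., seat and backrest of a couch):
pts = np.array([[0, 0, 0], [0.2, 0, 0], [1, 1, 0], [1.2, 1, 0]], float)
skel = fit_skeleton(pts, np.array([0, 0, 1, 1]))
print({name: j.position for name, j in skel.joints.items()})
```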
The virtual skeleton 504 may include a plurality of joints, each joint corresponding to a feature of the physical object. In
As shown in
In some scenarios, a virtualized representation may be changed. As one non-limiting example, a user may move relative to the physical object and thus the virtualized representation may move within the integrated virtual environment. Therefore, depending on the vantage point of the user, the perspective view of the virtualized representation within the integrated virtual environment may change. Such an example will be discussed in greater detail with respect to
As introduced above, a 3D virtual reality combat game may incorporate physical objects as virtualized representations within an existing virtual environment via 3D spatial modeling, thus creating an integrated virtual environment. Within such an environment a game player may interact with the physical environment in various ways and have such interactions translate to the integrated virtual environment. This translation can result in modifying gameplay sequences of the video game.
For a first example,
At time t1, game player 10 moves within physical environment 600 such that game player 10 is closer to physical object 106. Such a movement may change integrated virtual environment 700 by changing the perspective view and/or scale of virtualized representation 206 of the physical object. In this way, a game player moving relative to the physical object may modify a gameplay sequence of a video game. For example, as shown, the game player may use physical object 106 (and thus virtualized representation 206) as a protective barrier from virtual enemies. In other words, the computing system tracks the position of the game player relative to the physical object and translates this position to the integrated virtual environment to maintain a spatial relationship between the user and the physical object. In this way, the gameplay sequence is modified in response to the user moving relative to a physical object. Thus, in the example provided, the combat gameplay sequence may be modified to include the avatar being protected from enemy fire as a result of the game player moving to hide behind the couch.
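For illustration only, the following sketch shows one way a system might map the tracked player position into the integrated virtual environment and decide whether the virtualized representation (here, the couch) blocks a virtual enemy's line of fire. The 2D geometry, the cover radius, and all positions are assumptions, not values from the disclosure.

```python
import numpy as np

def is_behind_cover(player, enemy, cover_center, cover_radius):
    """True if the segment enemy->player passes within cover_radius of the cover."""
    d = player - enemy
    t = np.clip(np.dot(cover_center - enemy, d) / np.dot(d, d), 0.0, 1.0)
    closest = enemy + t * d                 # nearest point on the line of fire
    return np.linalg.norm(cover_center - closest) < cover_radius

player = np.array([1.0, 4.0])   # tracked physical position, mapped into the game
enemy = np.array([1.0, 10.0])
couch = np.array([1.0, 5.0])    # virtualized representation keeps the couch's center
print(is_behind_cover(player, enemy, couch, 0.8))  # True: line of fire blocked
```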
As another example,
At time t1, game player 10 throws physical object 108. Such an interaction may change integrated virtual environment 900 by moving virtualized representation 208 of the physical object within integrated virtual environment 900.
At time t2, physical object 108 hits wall 802, which may correspond to virtualized representation 208 exploding at 902. In this example, the game player's interaction with the physical object modifies the appearance of virtualized representation 208 within integrated virtual environment 900. In this way, a game player may interact with the physical object to modify a gameplay sequence of a video game. For example, as shown, the game player may use physical object 108 (and thus virtualized representation 208) as a weapon to combat virtual enemies. In other words, the computing system tracks the game player's interaction with the physical object and translates this interaction such that it is included within the integrated virtual environment. In some embodiments, the position, velocity, or other attributes of one or more physical objects may be taken into consideration. In this way, the gameplay sequence is modified in response to the user interacting with a physical object. Thus, in the example provided, the combat gameplay sequence may be modified to include the avatar throwing a grenade at an enemy as a result of the game player throwing a ball.
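A minimal sketch of how a tracked interaction such as a throw might be translated into a gameplay event follows. The engine interface, event names, and speed threshold are hypothetical assumptions made for illustration.

```python
class Gameplay:
    """Stand-in for a game engine's event interface (hypothetical)."""
    def spawn_projectile(self, kind, pos, vel):
        print(f"avatar throws {kind} from {pos} with velocity {vel}")
    def move_representation(self, kind, pos):
        print(f"{kind} representation moved to {pos}")

THROW_SPEED = 2.0  # m/s; assumed threshold separating a throw from a carry

def on_object_tracked(prev_pos, pos, dt, gameplay):
    velocity = [(b - a) / dt for a, b in zip(prev_pos, pos)]
    speed = sum(v * v for v in velocity) ** 0.5
    if speed > THROW_SPEED:
        gameplay.spawn_projectile("grenade", pos, velocity)   # interaction -> gameplay event
    else:
        gameplay.move_representation("grenade", pos)          # keep spatial relationship

on_object_tracked((0.0, 1.2, 0.0), (0.3, 1.4, 0.9), 1 / 30, Gameplay())
```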
As another non-limiting example, a game player may interact with the physical environment by pushing or knocking a physical object over. Such an interaction may modify the gameplay sequence. Using the combat game as an example, pushing a coffee table to a new location within the physical environment may be translated as pushing a virtual rock aside to uncover a trap door, thereby modifying the gameplay sequence. It will be appreciated that the above examples are provided as non-limiting examples of a game player interacting with the physical environment and incorporating those interactions into the integrated virtual environment. As such, it will be understood that other user interactions are possible without departing from the scope of this disclosure.
At 1004, the method includes identifying a physical object within the physical environment via analysis of the 3D spatial model. At 1006, the method includes selecting one of a plurality of candidate virtualized representations based on one or more characteristics of the physical object, as described above. At 1008, the method includes modifying said one of the plurality of candidate virtualized representations based on one or more of said characteristics of the physical object. At 1010, the method includes generating a virtualized representation of the physical object.
At 1012, the method includes incorporating the virtualized representation of the physical object into an existing virtual environment, thereby yielding the integrated virtual environment. As described above, the virtualized representation may be incorporated by placing the virtualized representation of the physical object within the existing virtual environment based on a spatial relationship between the user and the physical object. In this way, real-world physical elements may be incorporated into a virtual environment.
At 1014, the method includes displaying a view of the integrated virtual environment on a display device. The integrated virtual environment may be displayed from a vantage point of the user, for example. Further, the view of the integrated virtual environment may be changeable in response to the user moving within the physical environment.
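The following runnable sketch strings steps 1004 through 1014 together to show the overall data flow. Every helper body here is a trivial placeholder, an assumption for illustration of the ordering only, not the disclosed implementation.

```python
def generate_spatial_model(frames):     # capture, as described above
    return {"objects": [{"shape": "box", "center": (1.0, 0.0, 3.0)}]}
def identify_objects(model):            # 1004: analyze the 3D spatial model
    return model["objects"]
def select_candidate(obj, candidates):  # 1006: match by characteristics
    return next(c for c in candidates if c["shape"] == obj["shape"])
def modify_candidate(cand, obj):        # 1008: adapt the candidate to the object
    return {**cand, "center": obj["center"]}
def incorporate(env, rep):              # 1012: place by spatial relationship
    env["reps"].append(rep)

def run_method(frames, env, vantage):
    model = generate_spatial_model(frames)
    for obj in identify_objects(model):
        rep = modify_candidate(select_candidate(obj, env["candidates"]), obj)  # 1010
        incorporate(env, rep)
    return {"view_from": vantage, "scene": env["reps"]}  # 1014: user's vantage point

env = {"candidates": [{"name": "sandbags", "shape": "box"}], "reps": []}
print(run_method([], env, (0.0, 1.7, 0.0)))
```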
At 1106, the method includes changing the integrated virtual environment in response to detecting the user physically moving relative to the physical object. In other words, changing the integrated virtual environment comprises translating a physical user movement relative to the physical object into a virtual movement relative to the virtualized representation of the physical object. For example, if the integrated virtual environment is part of a video game, the method at 1106 comprises modifying a gameplay sequence of the video game in response to detecting the user physically moving relative to the physical object, as described above in reference to
At 1206, the method includes changing the integrated virtual environment in response to the user physically interacting with or moving relative to the physical object. This can include changing the virtualized representation of the physical object, for example, by moving and/or modifying the appearance of the virtualized representation of the physical object within the integrated virtual environment. In other words, changing the virtualized representation of the physical object (and thus changing the integrated virtual environment) comprises translating a physical user interaction with, or movement relative to, the physical object into an avatar interaction with, or movement relative to, the virtualized representation of the physical object. As indicated above, the integrated virtual environment may be part of a video game, and thus changing the integrated virtual environment and/or virtualized representation of the physical object may modify the gameplay sequence of the video game, as described above in reference to
While described with reference to a 3D virtual combat video game, the integrated virtual environment described above may be applied to other games or applications. Furthermore, the integrated virtual environment described above may be used as a training tool, such as a flight simulator for training pilots. Further, while the above description relates to displaying an integrated virtual environment derived from an indoor physical environment, it will be appreciated that an integrated virtual environment may be derived from an outdoor physical environment.
In some embodiments, the gaming system may be configured for global positioning. As such, global positioning data may be included as information for generating a 3D spatial model of the physical environment. In this way, one or more users and/or one or more physical objects may be tracked with global positioning and this data may be translated to an integrated virtual environment.
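As one non-limiting way to fold global-positioning fixes into the local frame of a 3D spatial model, an implementation might use a simple equirectangular approximation, which is adequate over the short distances involved. This is an assumption for illustration, not the disclosed method.

```python
import math

EARTH_RADIUS = 6_371_000.0  # meters

def gps_to_local(lat, lon, origin_lat, origin_lon):
    """Meters east/north of a chosen origin fix (equirectangular approximation)."""
    east = math.radians(lon - origin_lon) * EARTH_RADIUS * math.cos(math.radians(origin_lat))
    north = math.radians(lat - origin_lat) * EARTH_RADIUS
    return east, north

# Track a player relative to where the session started (made-up coordinates):
print(gps_to_local(47.64201, -122.13690, 47.64200, -122.13700))  # ~ (7.5 m E, 1.1 m N)
```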
In some embodiments, the above described methods and processes may be tied to a computing system comprising one or more computers. In particular, the methods and processes described herein may be implemented as a computer application, computer service, computer API, computer library, and/or other computer program product.
Computing system 1300 includes a logic subsystem 1302 and a data-holding subsystem 1304. Computing system 1300 may optionally include a display subsystem 1306, a communication subsystem 1308, one or more sensors 1310, and/or other components not shown in
Logic subsystem 1302 may include one or more physical devices configured to execute one or more instructions. For example, the logic subsystem may be configured to execute one or more instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more devices, or otherwise arrive at a desired result. As described above, such instructions may be executable by a logic subsystem to provide an integrated virtual environment incorporating real-world physical elements. Further, the instructions may be executable to detect the user physically interacting with, or moving relative to, a physical object via information obtained via one or more sensors 1310.
The logic subsystem may include one or more processors that are configured to execute software instructions. Additionally or alternatively, the logic subsystem may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. Processors of the logic subsystem may be single core or multicore, and the programs executed thereon may be configured for parallel or distributed processing. The logic subsystem may optionally include individual components that are distributed throughout two or more devices, which may be remotely located and/or configured for coordinated processing. One or more aspects of the logic subsystem may be virtualized and executed by remotely accessible networked computing devices configured in a cloud computing configuration.
Data-holding subsystem 1304 may include one or more physical, non-transitory devices configured to hold data and/or instructions executable by the logic subsystem to implement the herein-described methods and processes. When such methods and processes are implemented, the state of data-holding subsystem 1304 may be transformed (e.g., to hold different data).
Data-holding subsystem 1304 may include removable media and/or built-in devices. Data-holding subsystem 1304 may include optical memory devices (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory devices (e.g., RAM, EPROM, EEPROM, etc.), and/or magnetic memory devices (e.g., hard disk drive, floppy disk drive, tape drive, MRAM, etc.), among others. Data-holding subsystem 1304 may include devices with one or more of the following characteristics: volatile, nonvolatile, dynamic, static, read/write, read-only, random access, sequential access, location addressable, file addressable, and content addressable. In some embodiments, logic subsystem 1302 and data-holding subsystem 1304 may be integrated into one or more common devices, such as an application-specific integrated circuit or a system on a chip.
It is to be appreciated that data-holding subsystem 1304 includes one or more physical, non-transitory devices. In contrast, in some embodiments, aspects of the instructions described herein may be propagated in a transitory fashion by a pure signal (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for at least a finite duration. Furthermore, data and/or other forms of information pertaining to the present disclosure may be propagated by a pure signal.
The terms “module,” “program,” and “engine” may be used to describe an aspect of computing system 1300 that is implemented to perform one or more particular functions. In some cases, such a module, program, or engine may be instantiated via logic subsystem 1302 executing instructions held by data-holding subsystem 1304. It is to be understood that different modules, programs, and/or engines may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same module, program, and/or engine may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The terms “module,” “program,” and “engine” are meant to encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.
It is to be appreciated that a “service”, as used herein, may be an application program executable across multiple user sessions and available to one or more system components, programs, and/or other services. In some implementations, a service may run on a server responsive to a request from a client.
Display subsystem 1306 may be used to present a visual representation of data held by data-holding subsystem 1304. As the herein described methods and processes change the data held by the data-holding subsystem, and thus transform the state of the data-holding subsystem, the state of display subsystem 1306 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 1306 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic subsystem 1302 and/or data-holding subsystem 1304 in a shared enclosure, or such display devices may be peripheral display devices.
For example,
Turning back to
For example, communication subsystem 1308 may enable more than one integrated virtual environment corresponding to more than one physical environment to be networked. In such cases, the one or more integrated virtual environments may be combined into a merged integrated virtual environment incorporating one or more physical objects from each of the physical environments. In this way, game players may play a networked game in a merged integrated virtual environment that each game player may interact with.
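As a non-limiting sketch of such merging, each player's virtualized objects might be offset into disjoint regions of one shared scene; the offsetting scheme, field names, and objects below are assumptions for illustration only.

```python
def merge_environments(envs, spacing=20.0):
    """envs: list of per-player lists of {'name': ..., 'position': (x, y, z)} reps."""
    merged = []
    for i, reps in enumerate(envs):
        dx = i * spacing                      # give each physical room its own zone
        for rep in reps:
            x, y, z = rep["position"]
            merged.append({**rep, "position": (x + dx, y, z), "owner": i})
    return merged

room_a = [{"name": "sandbags", "position": (1.0, 0.0, 3.0)}]
room_b = [{"name": "crate", "position": (0.5, 0.0, 2.0)}]
print(merge_environments([room_a, room_b]))
```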
Computing system 1300 may include one or more sensors 1310 configured to obtain information about the physical environment. The one or more sensors 1310 may be configured to obtain information optically and in real time. For example, the one or more sensors 1310 may comprise an image capture device configured to obtain one or more depth images of the physical environment. As additional non-limiting examples, computing system 1300 may be operatively coupled to one or more laser range finders, time-of-flight cameras, and/or structured light 3D scanners. Such technologies may be directly coupled and/or remotely linked to computing system 1300. As one example, one or more sensors 1310 may be included in a display device, such as HMD 1400. As another example, one or more sensors 1310 may be remotely linked to computing system 1300 and HMD 1400. In this way, one or more sensors 1310 may be placed at different positions within an environment and, as such, may be wirelessly linked to computing system 1300. The one or more sensors 1310 may be configured to obtain information regarding the position of a user and/or one or more physical objects. In this way, the sensors may detect the position and movement of the user within the physical environment based on a spatial relationship between the user and the one or more physical objects.
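For illustration of how a depth image from such a sensor might yield 3D geometry, the sketch below recovers a point cloud using the standard pinhole camera model. The intrinsic parameters (fx, fy, cx, cy) are made-up values, assumptions for the example rather than parameters of any particular sensor.

```python
import numpy as np

def depth_to_points(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5):
    """depth: HxW array of meters; returns Nx3 points in camera coordinates."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]                # drop pixels with no depth reading

cloud = depth_to_points(np.full((480, 640), 2.0))  # e.g., a flat wall 2 m away
print(cloud.shape)                                 # (307200, 3)
```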
It is to be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated may be performed in the sequence illustrated, in other sequences, in parallel, or in some cases omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.