This patent application relates to a Location-Based Experience (LBE) where a user makes superhuman movements in a simulated virtual world simply by walking through the real world.
Simulation systems that present a virtual world are key to video games, data visualization, and other fields, engaging a user's senses of sound and vision. But they traditionally ignore the user's sense of proprioception, the body's innate sense of its own movement. That's why virtual reality (VR) headsets are considered an improvement over tapping on a keyboard and making small mouse movements. You're still sitting in a chair, but physically moving your head in a VR headset to look around feels more immersive.
More recently, augmented reality simulation systems give an even greater immersive feeling, by allowing users to stand up and physically walk around to navigate a virtual world. As one example, see U.S. Pat. No. 11,112,250B1 entitled “System for automatic instantiation of a generically described location-based travelling experience to specific map coordinates and real-time conditions” assigned to Monsarrat, Inc., the assignee of the present application, the entire contents of which are hereby incorporated by reference. In that patent, a user's position and orientation in the real world are tracked by a device such as a mobile phone or Augmented Reality (AR) headset. A simulation system maintains a correlation vector that maps a virtual world space onto a real world space. The resulting Location Based Travel Experience (LBTE) is such that when the user walks one step forward in the real world, the mobile device senses this and the correlation vector is applied to move the user's avatar one step forward in the virtual world.
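Purely as an illustration of how such a correlation might be applied (and not as a description of the patented implementation), the following minimal Python sketch maps a tracked real world position into a correlated virtual world position; the class name, the two-dimensional simplification, and the numeric values are assumptions made for the example only.

```python
# Minimal sketch (not the patented implementation): a correlation between a
# real world coordinate frame and a virtual coordinate frame, applied to a
# tracked device position. The 2D simplification is an assumption.
import math

class Correlation:
    def __init__(self, origin_real, origin_virtual, heading_offset_rad):
        self.origin_real = origin_real            # (x, y) anchor in the real world
        self.origin_virtual = origin_virtual      # (x, y) anchor in the virtual world
        self.heading_offset = heading_offset_rad  # rotation between the two frames

    def to_virtual(self, real_xy):
        """Map a real world position to the correlated virtual world position."""
        dx = real_xy[0] - self.origin_real[0]
        dy = real_xy[1] - self.origin_real[1]
        cos_h, sin_h = math.cos(self.heading_offset), math.sin(self.heading_offset)
        return (self.origin_virtual[0] + dx * cos_h - dy * sin_h,
                self.origin_virtual[1] + dx * sin_h + dy * cos_h)

# One real world step forward moves the avatar one correlated step in the virtual world.
corr = Correlation(origin_real=(0.0, 0.0), origin_virtual=(100.0, 50.0),
                   heading_offset_rad=math.radians(90))
print(corr.to_virtual((0.0, 0.75)))   # a 0.75 m step forward -> new virtual position
```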
The user's movements are both a type of control input into the virtual world and also a new type of “output”: the proprioceptive sense of physically moving, and the heightened body chemistry that comes from a little exercise, become part of the immersive experience (suspension of disbelief) of navigating a virtual world. Haptic feedback may also form part of the output of the simulation system.
This works well when the real user's walking in the real world is perfectly matched to a user avatar walking in a virtual world. But it doesn't work when the user avatar needs to move in virtual ways that cannot match what the user is capable of in the real world. For example, no human in the real world is going to leap tall buildings as a control input to a virtual superhero. And superhuman virtual movements, for example flying an airplane upside down, such that the virtual world appears upside down, may work for a user seated in a chair with a static computer monitor, but won't work for a real world user trying to maintain his or her sense of balance while walking.
A new type of simulation system is needed where simple real world movements such as walking are a control input into, and a type of “output” from, superhuman virtual world experiences.
More particularly, described herein is a simulation system that enables a user to navigate a virtual space in a superhuman way, simply by holding or wearing a mobile device that tracks the user walking around. The system solves the problem of associating user movements, which are necessarily limited to what is possible in the real world, with superhuman movements of an avatar in a virtual world. As a result, movements of the avatar in the virtual world can now be quite different from movements that are possible for a normal human. The avatar may represent a superpowered character, a vehicle (such as a race car or airplane), a game piece, or some other object in the virtual world whose movements are controlled by, but not necessarily in lock step with, the user's movements in the real world.
The simulation system is designed to satisfy several goals.
A first goal is to make the real world control movements feel intuitive as an input to the simulation system. Like a child running with a toy car, the user should be able to simply walk straight ahead, walk while turning, or come to a stop, and optionally move the mobile device. Walking sideways or backwards will feel awkward and should not be required as control inputs.
A second goal is to make the virtual world movements feel intuitive to the type of user avatar. A person walking can stop on a dime, but a user avatar that is a virtual airplane can't do that. So if the user makes abrupt, mismatching real world movements, they need to be ignored or adapted.
The third goal is that the user should not be required to move the device, if held in the hand, in a way that prevents the user from seeing the device's display, which needs to show the virtual world.
The final goal is to reduce the dissonance between the real world and virtual world. The key is to add limitations to movement of the user viewpoint and user avatar in the virtual world that reduce disorientation.
For example:
To satisfy these goals, the simulation system defines a series of real world user movement control inputs, and uses a matrix of movement rules to map them to two types of change in the virtual world: changes to the user avatar and changes to the user's viewpoint. The system maximizes the alignment of these two changes, to increase user immersion and decrease dissonance. The matrix of movement rules may also be used in reverse, if something happens in the virtual world which requires the user to perform a real world movement in response, or which triggers real world haptic feedback to the user.
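One possible, purely illustrative way to encode such a matrix of movement rules is sketched below in Python: each real world control movement is mapped to a pair of handlers, one changing the user avatar and one changing the user viewpoint. The movement names, handler behaviors, and numeric values are assumptions made for the sketch, not definitions used by the system.

```python
# Illustrative sketch only: a "matrix of movement rules" as a lookup from a
# real world control movement to a pair of handlers, one updating the user
# avatar and one updating the user viewpoint.
from typing import Callable, Dict, Tuple

AvatarRule = Callable[[dict, float], None]     # (avatar_state, magnitude) -> change avatar
ViewpointRule = Callable[[dict, float], None]  # (viewpoint_state, magnitude) -> change viewpoint

MOVEMENT_RULES: Dict[str, Tuple[AvatarRule, ViewpointRule]] = {
    # Walking forward: accelerate the avatar in proportion to the user's speed;
    # no viewpoint change is needed for this rule in the sketch.
    "x_translation_positive": (
        lambda avatar, speed: avatar.update(target_speed=speed * avatar["speed_scale"]),
        lambda view, speed: None,
    ),
    # Turning the body: bank the avatar toward the new heading and turn the
    # viewpoint with the user so the two changes stay aligned.
    "z_rotation": (
        lambda avatar, angle: avatar.update(target_bank=angle),
        lambda view, angle: view.update(heading=view["heading"] + angle),
    ),
}

def apply_rule(movement: str, magnitude: float, avatar: dict, view: dict) -> None:
    """Apply both halves of a movement rule; unmapped movements are ignored."""
    rule = MOVEMENT_RULES.get(movement)
    if rule is None:
        return
    avatar_rule, viewpoint_rule = rule
    avatar_rule(avatar, magnitude)
    viewpoint_rule(view, magnitude)

avatar = {"speed_scale": 40.0, "target_speed": 0.0, "target_bank": 0.0}
view = {"heading": 0.0}
apply_rule("x_translation_positive", 1.4, avatar, view)   # user walks at 1.4 m/s
print(avatar["target_speed"])                              # -> 56.0 in the virtual world
```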
In a specific implementation, a virtual experience method or system operates with a portable electronic device operated by a user. A definition for a virtual space is provided within which the user may navigate using superhuman movements. Such superhuman movements within the virtual space cannot be determined by precisely replicating movements of the user within the physical space.
A relation or correlation is maintained between a virtual coordinate system associated with the virtual space and a physical coordinate system associated with a physical or “real world” space. The portable electronic device provides location data responsive to estimates of the physical location of the portable electronic device in the real world. The location data may include position, orientation, or acceleration information.
The method or system also maintains two correlated locations associated with the virtual space. These include (a) a user viewpoint, which defines how the virtual space is shown on the portable electronic device, and (b) a user avatar, of a type associated with superhuman movements.
The location data is also processed against a matrix of movement rules to thereby determine changes to the user viewpoint and user avatar. The user viewpoint and user avatar are displayed on the portable electronic device.
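As a further illustration, the quantities just described might be represented by data structures such as the following Python sketch; the field names and default values are assumptions, not a required data model.

```python
# Assumed (illustrative) data model for the location data reported by the
# portable device and the two correlated virtual-space locations that the
# movement rules update.
from dataclasses import dataclass

@dataclass
class LocationData:
    position: tuple = (0.0, 0.0, 0.0)       # estimated real world position of the device
    orientation: tuple = (0.0, 0.0, 0.0)    # estimated device orientation
    acceleration: tuple = (0.0, 0.0, 0.0)   # optional accelerometer reading

@dataclass
class UserViewpoint:
    position: tuple = (0.0, 0.0, 0.0)       # where the virtual camera sits
    orientation: tuple = (0.0, 0.0, 0.0)    # the direction in which it looks

@dataclass
class UserAvatar:
    kind: str = "airplane"                  # an avatar type associated with superhuman movement
    position: tuple = (0.0, 0.0, 0.0)
    orientation: tuple = (0.0, 0.0, 0.0)
    speed: float = 0.0
```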
A description of preferred embodiments follows.
How A Virtual World Can Be Correlated With the Real World
The embodiments described herein assume the existence of a technology that provides virtual Location-Based Experiences (LBEs). One example of such a technology was described in the aforementioned U.S. Pat. No. 11,112,250B1. As explained in that patent, an LBE maps virtual user experience locations to real world locations using a graph of location elements, where nodes represent locations, and parent nodes define routes in-between locations and also define location properties to be inherited by child nodes. The location elements may further include map descriptors that refer to map metadata; real-time descriptors, including at least whether locations are open or closed, or behavior of other users; experiential descriptors, including whether line-of-sight between locations should be kept or avoided, a mode of transportation users are expected to employ, or whether the route needs to be entirely completed within a set time; and nesting and inheritance properties, such that larger LBEs may be assembled from contained smaller LBEs. A route can be laid out by the user physically moving from one location to the next in the real world map and selecting points on the real world map along the route; or by operating a flow-based layout model where each node in sequence is automatically placed relative to an initial location, via a graph drawing algorithm that identifies possible location placements relative to element constraints based on a best-fit layout.
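The following Python sketch suggests one way such a graph of location elements with inherited properties might be represented; the field names and the inheritance lookup are illustrative assumptions, not the data structures of the referenced patent.

```python
# Sketch of a location-element graph: nodes carry descriptors, and child nodes
# inherit properties from their parents (nesting and inheritance).
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class LocationElement:
    name: str
    map_descriptors: dict = field(default_factory=dict)           # refers to map metadata
    realtime_descriptors: dict = field(default_factory=dict)      # e.g. open or closed
    experiential_descriptors: dict = field(default_factory=dict)  # e.g. line-of-sight, transport mode
    parent: Optional["LocationElement"] = None
    children: List["LocationElement"] = field(default_factory=list)

    def inherited(self, key: str):
        """Resolve a property by walking up the graph toward the root."""
        node = self
        while node is not None:
            for descriptors in (node.map_descriptors,
                                node.realtime_descriptors,
                                node.experiential_descriptors):
                if key in descriptors:
                    return descriptors[key]
            node = node.parent
        return None

park = LocationElement("park", experiential_descriptors={"transport_mode": "walking"})
bench = LocationElement("bench", parent=park)
park.children.append(bench)
print(bench.inherited("transport_mode"))   # -> "walking", inherited from the parent node
```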
The present application is directed to a particular way in which a user avatar and representations of the virtual space are presented on the mobile device.
As in
As in
At given moments in time, as in
The most trivial case is described in the prior art patent referenced above: the user walks a Walking Path in the Real World, as in
This invention describes extensions to that prior art, referred to as Virtual World Correlation Vectors 109, 110, that let the user control superhuman motion through the virtual world simply by walking 101 and moving the device 102 in the real world.
Permitted Real World User Control Movements
The Simulation System 400 enables a User 401 to physically walk through a Real World Space 402, holding or wearing a Computer Device 403, and displays a Virtual World 404 from a User Viewpoint 404 along with a User Avatar 405 in that Virtual World 404. The Virtual World 404 is mapped onto (correlated with) the Real World Space 402. The User's 401 movements in the Real World Space 402 control corresponding movements of the User Avatar 405 in the Virtual World 404.
The computer device 403 has one or more Location Sensors 406 that report the Device's Position and Orientation 407, which the simulation system 400 then feeds into a Rules Matrix 408 to calculate changes to the User Avatar Virtual Position and Orientation 409 and the User Viewpoint Virtual Position and Orientation 410. These changes are then rendered on the computer device 403 for the user to see. Output from the computer device could also include haptic feedback or commands for the user to move in a specified way.
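A minimal sketch of this per-frame data flow, with the sensor reading, rules matrix, and rendering all stubbed out, might look as follows; the method names are placeholders rather than an actual interface of the Simulation System 400.

```python
# Hedged sketch of one update cycle: read the device pose from its location
# sensors, feed it through the rules matrix, and write back the avatar's and
# the viewpoint's virtual pose. Only the data flow is illustrated; the
# read_sensors, rules_matrix, and render arguments are assumed stubs.
def simulation_step(read_sensors, rules_matrix, avatar, viewpoint, render):
    device_pose = read_sensors()                 # Position and Orientation 407
    for movement, magnitude in device_pose.as_movements():
        avatar_rule, viewpoint_rule = rules_matrix[movement]
        avatar_rule(avatar, magnitude)           # -> Virtual Position and Orientation 409
        viewpoint_rule(viewpoint, magnitude)     # -> Virtual Position and Orientation 410
    render(avatar, viewpoint)                    # drawn on the Computer Device 403
```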
It should be understood that the Simulation System 400 may be implemented entirely by one or more programmable data processors that are resident within the Computer Device 403. However, in other implementations, the Simulation System 400 may be implemented in whole or in part by one or more data processors that are external to the Computer Device 403, such as servers or cloud computers connected wirelessly to communicate with, exchange data with, and control the Computer Device 403.
Permitted Virtual World Movements of the User Viewpoint
While the user avatar moves in the virtual world, the user's viewpoint may also move. As in
Permitted Virtual World Movements of the User Avatar
To rotate the avatar around the X-axis, as in
To rotate the avatar around the Z-axis, the avatar is not shown rotating on the user's display, because that would disturb the user's sense of left and right. Instead, the virtual world itself is shown rotated. Likewise, when the avatar is translating through the virtual world, the avatar is not shown as moving; instead, the virtual world moves.
In other words, movements of the avatar through the virtual world are shown on the device as either an avatar movement or as a virtual world movement (but not both), whichever is better to reduce disorientation of the user, as the user walks in a real world space.
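This choice can be illustrated with a short sketch: certain motions are rendered by moving the avatar model, while others are rendered by moving the virtual world around a fixed avatar. The particular assignment below is an assumption for the airplane example, not a required mapping.

```python
# Illustrative sketch of the display choice: some avatar motions are rendered
# by moving the avatar, others by moving the virtual world around a fixed
# avatar, whichever better reduces user disorientation.
RENDER_AS_WORLD_MOTION = {"z_rotation", "translation"}   # the world moves; the avatar stays put
RENDER_AS_AVATAR_MOTION = {"x_rotation"}                 # the avatar visibly banks or tilts

def render_motion(motion: str, amount: float, avatar, world) -> None:
    if motion in RENDER_AS_AVATAR_MOTION:
        avatar.apply(motion, amount)      # e.g. tilt the airplane model on screen
    elif motion in RENDER_AS_WORLD_MOTION:
        world.apply(motion, -amount)      # move or rotate the scenery instead
```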
Matrix of Movement Rules
The center column lists example constraints. For example, a positive X-axis translation in the real world results in acceleration or deceleration of the virtual avatar, in an amount proportional to the user's walking speed. A negative translation may brake the avatar's momentum. An X-axis rotation may result in banking the avatar left or right, stopping when its tilt angle matches the user orientation. A Y-axis rotation may accelerate banking of the avatar. A Z-axis rotation may also result in banking the plane avatar and stopping when its virtual world orientation matches the user's orientation.
As indicated in the right hand column, other constraints may be appropriate for a given real world motion. For example, a positive X-axis translation above certain speeds may be ignored and speeds below a minimum may result in stalling the plane avatar. A negative translation may not result in reverse movement of the plane. An X-axis rotation may result in spinning the avatar, but not the ground plane. A Y-axis rotation above a certain amount may be ignored.
These constraints are now described in more detail.
Proportional Positive X-Axis Translation and “Outpacing”
As in
The user's speed in the real world SR is typically used, for example by simply multiplying it by a proportionality constant, to calculate a set proportional speed for the airplane avatar in the virtual world, Sv. The airplane avatar smoothly accelerates or decelerates until it reaches Sv.
If the user's speed goes above some maximum limit that is either beyond the airplane avatar's maximum proportional speed, or is an unsafe speed for the real world, that maximum limit is used for SR in calculations.
If the user's speed goes below some minimum limit, or the user stops walking, as in
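A worked sketch of this proportional speed rule, using assumed example values for the proportionality constant, the maximum and minimum limits, and the avatar's acceleration, is shown below.

```python
# Worked sketch with assumed numbers: the avatar's target speed Sv is the
# user's walking speed SR times a constant, SR is clamped to a maximum, speeds
# below a minimum stall the airplane avatar, and the avatar smoothly
# accelerates or decelerates toward Sv.
SPEED_SCALE = 40.0   # Sv = SPEED_SCALE * SR (illustrative constant)
SR_MAX      = 2.5    # m/s; faster real world movement is capped at this value
SR_MIN      = 0.4    # m/s; anything slower stalls the airplane avatar
MAX_ACCEL   = 5.0    # m/s^2; the avatar cannot jump instantly to Sv

def avatar_speed(current_sv: float, user_sr: float, dt: float) -> float:
    if user_sr < SR_MIN:
        return 0.0                             # stall: come to a stop
    sr = min(user_sr, SR_MAX)                  # cap unsafe or out-of-range speeds
    target_sv = SPEED_SCALE * sr
    max_change = MAX_ACCEL * dt
    delta = max(-max_change, min(max_change, target_sv - current_sv))
    return current_sv + delta                  # smooth acceleration or deceleration

print(avatar_speed(current_sv=0.0, user_sr=1.5, dt=0.1))   # 0.5, ramping toward 60.0
```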
A normal experience would be for the user to walk forwards in the real world at a moderate speed. As in
The system may also need to handle when the user outpaces the user avatar. For example, if the user immediately starts to run at high speed, the plane's measured acceleration will not allow it to immediately jump to its top virtual speed. Or the user may be running so quickly that he or she is simply going faster in the real world than the plane's proportional top speed, as mapped back into the real world. In this case, keeping the avatar and user viewpoint aligned, as in
So in this case, as in
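One way the “outpacing” behavior might be realized, sketched here under the assumption that the user viewpoint stays locked to the user's correlated position while the avatar obeys its own speed limits, is to let the on-screen offset between the two grow and shrink:

```python
# Sketch of assumed "outpacing" mechanics: the viewpoint tracks the user's
# correlated position, the avatar obeys its own speed limits, and the avatar
# therefore drifts off-center (or off-screen) until it catches up.
def screen_offset(user_virtual_x: float, avatar_x: float, meters_per_screen: float) -> float:
    """Fraction of the screen by which the avatar leads (+) or trails (-) the user."""
    return (avatar_x - user_virtual_x) / meters_per_screen

# The user sprints: their correlated position (120 m) is ahead of the
# speed-limited avatar (95 m), so the avatar trails toward the screen edge.
print(screen_offset(user_virtual_x=120.0, avatar_x=95.0, meters_per_screen=50.0))  # -0.5
```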
Negative X-Axis Translation
As in
So in this case, the result is that the user “outpaces” the avatar in the negative direction. If the system kept the avatar in a static position on the user's device, it would feel to the user as if the aircraft itself were moving backwards. So instead, the system allows the avatar to move out in front of the user. The effect will be similar to that shown in
Y-Axis Translation
As in
Z-Axis Translation
As in
X-Axis Rotation
As in
Y-Axis Rotation
As in
In addition, the system should not “reward” the user for awkward movements in the real world like trying to look directly upwards. For example, if the user tries to tilt the mobile device being worn or held to an uncomfortably low or high angle past some limits, the system should not provide any further input to the airplane.
The airplane avatar may also have fixed limits to how tilted up or down it can go. If the user outpaces the airplane's maximum turn speed or maximum angle or orientation, the avatar is allowed to slip off of the user's device screen until it catches up, if ever.
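A short sketch of these two limits, with assumed numeric values, follows: device tilt beyond a comfortable range contributes no further input, and the airplane avatar's own pitch is capped regardless of what the user does.

```python
# Sketch with assumed limits: clamp the device tilt input to a comfortable
# range, then clamp the result to the avatar's own fixed pitch limit.
DEVICE_TILT_LIMIT = 0.6    # rad; tilting the device further adds no more input
AVATAR_PITCH_LIMIT = 0.4   # rad; the airplane cannot pitch beyond this angle

def clamp(value: float, limit: float) -> float:
    return max(-limit, min(limit, value))

def avatar_pitch_input(device_tilt: float) -> float:
    effective_tilt = clamp(device_tilt, DEVICE_TILT_LIMIT)   # ignore awkward extremes
    return clamp(effective_tilt, AVATAR_PITCH_LIMIT)         # respect the avatar's limits

print(avatar_pitch_input(1.2))   # -> 0.4: both limits applied
```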
Z-Axis Rotation
As in
Similar to the above section, this motion will bank the plane left or right until it reaches the user's orientation as a set point. If the user outpaces the avatar's maximum turning speed, the avatar then rotates away or slides off of the user's device screen to the left or right until it catches up.
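This set-point behavior can be sketched as a rate-limited turn toward the user's real world heading, with an assumed maximum turn rate; if the user turns faster than this, the avatar closes the remaining gap over subsequent frames.

```python
# Sketch (assumed turn rate): bank toward the user's heading as a set point,
# turning at no more than the avatar's maximum turn rate per frame.
import math

MAX_TURN_RATE = math.radians(45)   # rad/s, illustrative airplane turn limit

def bank_toward(avatar_heading: float, user_heading: float, dt: float) -> float:
    # Shortest signed angular difference, wrapped into (-pi, pi].
    error = math.atan2(math.sin(user_heading - avatar_heading),
                       math.cos(user_heading - avatar_heading))
    step = max(-MAX_TURN_RATE * dt, min(MAX_TURN_RATE * dt, error))
    return avatar_heading + step

heading = 0.0
for _ in range(3):   # the user has turned 90 degrees; the avatar closes the gap gradually
    heading = bank_toward(heading, math.radians(90), dt=0.5)
print(math.degrees(heading))   # ~67.5 after 1.5 s, at 22.5 degrees per frame
```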
How to Display the Avatar's Landscape Backdrop at High Speeds
In the system and methods described herein, however, the user avatar in the virtual world moves too quickly to be matched to a real world environment as it passes by the slowly walking real-world user.
The augmented reality simulation system may respond to this challenge by:
At step 1010, the method receives a definition for a virtual space for the user to navigate using the portable electronic device.
At step 1020, the method receives a correlation vector that relates a virtual coordinate system associated with the virtual space to a physical coordinate system associated with a real world space.
At step 1030, the method maintains location data of the portable electronic device within the physical space, the location data responsive to estimates of the physical location of the portable electronic device and including one or more of position, orientation, or acceleration.
At step 1040, the method maintains two correlated locations associated with the virtual space. The correlated locations include:
At 1050, a user viewpoint, which defines how the virtual space is shown on the portable electronic device.
At 1060, a user avatar, of a type associated with superhuman movements, wherein such movements cannot be calculated by replicating movements of the user within the physical space.
At step 1070, the method continues by processing the location data against a matrix of movement rules to thereby determine changes to the user viewpoint and user avatar.
At step 1080, the method displays the user viewpoint and user avatar on the portable electronic device.
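Taken together, steps 1010 through 1080 can be summarized by the following Python sketch of a single update loop; the function and method names are placeholders for the components described above, not an actual programming interface of the system.

```python
# Compact sketch of steps 1010-1080 as one loop; device, rules_matrix,
# viewpoint, avatar, and apply_movement_rules are assumed placeholders.
def run_virtual_experience(device, rules_matrix, viewpoint, avatar, apply_movement_rules):
    virtual_space = device.receive_virtual_space_definition()   # step 1010
    correlation = device.receive_correlation_vector()           # step 1020
    while device.is_active():
        location_data = device.read_location_data()             # step 1030: position,
                                                                 # orientation, acceleration
        apply_movement_rules(rules_matrix, location_data,        # step 1070: rules determine
                             correlation, viewpoint, avatar)     # viewpoint and avatar changes
        device.display(virtual_space, viewpoint, avatar)         # step 1080
```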
The foregoing description, along with the accompanying drawings, sets forth certain specific details in order to provide a thorough understanding of various disclosed embodiments. However, one skilled in the relevant art will recognize that the disclosed embodiments may be practiced in various combinations, without one or more of these specific details, or with other methods, components, devices, materials, etc. In other instances, well-known structures or components that are associated with the environment of the present disclosure, including but not limited to the data processing systems, or wireless communication systems and networks, have not been shown or described in order to avoid unnecessarily obscuring descriptions of the embodiments. Additionally, the various embodiments may be methods, systems, media, or devices. Accordingly, the various embodiments may be entirely hardware embodiments, entirely software embodiments, or embodiments combining software and hardware aspects.
As used herein, the term “or” is an inclusive “or” operator, and is equivalent to the phrases “A or B, or both” or “A or B or C, or any combination thereof,” and lists with additional elements are similarly treated. The term “based on” is not exclusive and allows for being based on additional features, functions, aspects, or limitations not described, unless the context clearly dictates otherwise. In addition, throughout the specification, the meaning of “a,” “an,” and “the” include singular and plural references.
In some instances, the various “data processors” may each be implemented by a physical or virtual general purpose computer having a central processor, memory, disk or other mass storage, communication interface(s), input/output (I/O) device(s), and other peripherals. The general-purpose computer is transformed into the processors and executes the processes described above, for example, by loading software instructions into the processor, and then causing execution of the instructions to carry out the functions described. As is known in the art, such a computer may contain one or more central processing units, disks, various memories, and input/output ports that enable the transfer of information between the elements. The central processor units provide for the execution of computer instructions. One or more memories provide volatile and/or non-volatile storage for these computer software instructions and data used to implement an embodiment. Disks or other mass storage provides non-volatile storage for these computer software instructions and data used to implement, for example, the various procedures described herein. The instructions may therefore typically be implemented in hardware, custom designed semiconductor logic, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), firmware, software, or any combination thereof. In certain embodiments, the procedures, devices, and processes described herein are a computer program product, including a computer readable medium (e.g., a removable storage medium such as one or more DVD-ROM's, CD-ROM's, diskettes, tapes, etc.) that provides at least a portion of the software instructions for the system.
It also should be understood that the block and system diagrams may include more or fewer elements, be arranged differently, or be represented differently. But it further should be understood that certain implementations may dictate that the block and network diagrams, and the number of block and network diagrams, illustrating the execution of the embodiments be implemented in a particular way.
It is understood by those skilled in the art that various changes in form and details may be made therein without departing from the legal scope of this patent as encompassed by the appended claims.
Number | Name | Date | Kind |
---|---|---|---|
10521962 | Nussbaum et al. | Dec 2019 | B1 |
11112250 | Monsarrat | Sep 2021 | B1 |
11430187 | Monsarrat | Aug 2022 | B1 |
11776206 | Gupta | Oct 2023 | B1 |
20020090985 | Tochner | Jul 2002 | A1 |
20050049022 | Mullen | Mar 2005 | A1 |
20050216181 | Estkowski et al. | Sep 2005 | A1 |
20090005140 | Rose | Jan 2009 | A1 |
20100259610 | Petersen | Oct 2010 | A1 |
20110009241 | Lane | Jan 2011 | A1 |
20110102459 | Hall | May 2011 | A1 |
20110208425 | Zheng et al. | Aug 2011 | A1 |
20120100911 | Rejen | Apr 2012 | A1 |
20130339098 | Looman et al. | Dec 2013 | A1 |
20140171962 | Kang | Jun 2014 | A1 |
20140221090 | Mutschler | Aug 2014 | A1 |
20150097864 | Alaniz | Apr 2015 | A1 |
20150209664 | Haseltine | Jul 2015 | A1 |
20160004335 | Hosenpud | Jan 2016 | A1 |
20160232713 | Lee | Aug 2016 | A1 |
20160232715 | Lee | Aug 2016 | A1 |
20170068323 | West | Mar 2017 | A1 |
20170255256 | Kim | Sep 2017 | A1 |
20170263032 | Cricri et al. | Sep 2017 | A1 |
20180033204 | Dimitrov | Feb 2018 | A1 |
20180345129 | Rathod | Dec 2018 | A1 |
20190019378 | Greiner | Jan 2019 | A1 |
20190033960 | Ho | Jan 2019 | A1 |
20190073832 | Kim | Mar 2019 | A1 |
20190180509 | Laaksonen et al. | Jun 2019 | A1 |
20190240568 | Routhier | Aug 2019 | A1 |
20190265055 | Chen et al. | Aug 2019 | A1 |
20190301953 | Harvey | Oct 2019 | A1 |
20200049522 | Wang et al. | Feb 2020 | A1 |
20200133618 | Kim | Apr 2020 | A1 |
20200184221 | Alexander | Jun 2020 | A1 |
20200279407 | Liljeroos | Sep 2020 | A1 |
20200284416 | Greiner | Sep 2020 | A1 |
20200294350 | Soon-Shiong | Sep 2020 | A1 |
20200341541 | Olah-Reiken | Oct 2020 | A1 |
20200384351 | Asano | Dec 2020 | A1 |
20210201581 | Xie et al. | Jul 2021 | A1 |
20230277943 | Hegedűs | Sep 2023 | A1 |
20240019935 | Kondo | Jan 2024 | A1 |
20240316461 | Crosby | Sep 2024 | A1 |
Number | Date | Country |
---|---|---|
3839699 | Jun 2011 | EP |
2013074997 | May 2013 | WO |