This disclosure relates generally to digital imaging. More particularly, without limitation, certain embodiments relate to systems and methods that may be used in mixed reality multi-user environments and related technologies.
Mixed reality (MR) systems combine elements of both virtual reality (VR) and augmented reality (AR). In MR, digital objects are overlaid onto the physical world in a way that allows users to interact with both the real and virtual environments. This may create a sense of immersion and presence that can be used for a variety of applications, such as gaming, education, and training.
Some commercial examples of MR systems include the Microsoft HoloLens, the Magic Leap One, and the Meta Quest/Quest Pro. These systems typically consist of a headset or glasses that are equipped with cameras and sensors to track users' movements and the environment around them. The MR system then typically renders and overlays digital objects onto a user's field of view, creating a blending of the real and virtual worlds that may appear to a user to be seamless.
As just one example of a mixed reality application, “Spatial Ops” is a mixed reality multiplayer game developed by Resolution Games AB, which is designed to test and improve players' spatial awareness and problem-solving skills. In the game, players take on the role of a member of a futuristic space crew tasked with navigating through complex and dangerous environments. Players must use their wits, reflexes, and spatial awareness to navigate through a series of increasingly challenging levels, avoiding obstacles, solving puzzles, and defeating enemies along the way. The game features a range of different environments, including futuristic cities, space stations, and alien worlds, each with its own unique challenges and hazards. Players can move around freely in the game world using a variety of different movement options, including teleportation and free movement. As an example, with Spatial Ops, multiple players (e.g., up to eight players in some implementations) can transform a real-world space into what appears to be an urban battlefield and then team up with or against other players in a first-person shooter (FPS) game experience.
Technology known to skilled artisans uses VR, AR, and/or MR to designate or define points or objects in the physical world that can be digitally tagged or annotated. These tags can then be used to overlay digital content onto the physical environment. Such tags (sometimes referred to as spatial anchors) may be shared across multiple users and devices, enabling different users to experience the same digital content or interact with the same spatial anchors, even if they are in different physical locations.
There is a need to provide innovations in the above technologies to enhance multi-player experiences. It is therefore desirable to address the limitations in the known art by means of the systems and methods described herein.
By way of example, reference will now be made to the accompanying drawings, which are not to scale.
Those of ordinary skill in the art will realize that the following description of the present invention is illustrative only and not in any way limiting. Other embodiments of the invention will readily suggest themselves to such skilled persons upon their having the benefit of this disclosure. Reference will now be made in detail to specific implementations of the present invention, as illustrated in the accompanying drawings. The same reference numbers will be used throughout the drawings and the following description to refer to the same or like parts.
Certain figures in this specification are flow charts illustrating methods and systems. It will be understood that each block of these flow charts, and combinations of blocks in these flow charts, may be implemented by computer program instructions. These computer program instructions may be loaded onto a computer or other programmable apparatus to produce a machine, such that the instructions that execute on the computer or other programmable apparatus create structures for implementing the functions specified in the flow chart block or blocks. These computer program instructions may also be stored in computer-readable memory that can direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in computer-readable memory produce an article of manufacture including instruction structures that implement the function specified in the flow chart block or blocks. The computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions that execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flow chart block or blocks.
Accordingly, blocks of the flow charts support combinations of structures for performing the specified functions and combinations of steps for performing the specified functions. It will also be understood that each block of the flow charts, and combinations of blocks in the flow charts, can be implemented by special purpose hardware-based computer systems that perform the specified functions or steps, or combinations of special purpose hardware and computer instructions.
For example, any number of computer programming languages, such as C, C++, C# (C Sharp), Perl, Ada, Python, Pascal, Smalltalk, FORTRAN, assembly language, and the like, may be used to implement aspects of the present invention. Further, various programming approaches such as procedural, object-oriented, or artificial intelligence techniques may be employed, depending on the requirements of each particular implementation. Compiler programs and/or virtual machine programs executed by computer systems generally translate higher-level programming languages to generate sets of machine instructions that may be executed by one or more processors to perform a programmed function or set of functions.
In the descriptions in this document, certain embodiments are described in terms of particular data structures, preferred and optional implementations, preferred control flows, and examples. Other and further applications of the described methods, as would be understood after review of this application by those with ordinary skill in the art, are within the scope of the claimed invention.
The term “machine-readable medium” should be understood to include any structure that participates in providing data that may be read by an element of a computer system. Such a medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media include, for example, optical or magnetic disks and other persistent memory such as devices based on flash memory (such as solid-state drives, or SSDs). Volatile media include dynamic random access memory (DRAM) and/or static random access memory (SRAM). Transmission media include cables, wires, and fibers, including the wires that comprise a system bus coupled to a processor. Common forms of machine-readable media include, for example and without limitation, a floppy disk, a flexible disk, a hard disk, a solid-state drive, a magnetic tape, any other magnetic medium, a CD-ROM, a DVD, or any other optical medium.
As used herein, the term “computer system” is defined to include one or more processing devices (such as a central processing unit (“CPU”) or graphics processing unit (“GPU”)) for processing data and instructions that are coupled with one or more data storage devices for exchanging data and instructions with the processing unit, including, but not limited to, RAM, ROM, internal SRAM, on-chip RAM, on-chip flash, CD-ROM, hard disks, and the like. Examples of computer systems include everything from a controller to a laptop or desktop computer, to a super-computer. The data storage devices can be dedicated, i.e., coupled directly with the processing unit, or remote, i.e., coupled with the processing unit over a computer network. It should be appreciated that remote data storage devices coupled to a processing unit over a computer network can be capable of sending program instructions to the processing unit for execution. In addition, the processing device can be coupled with one or more additional processing devices, either through the same physical structure (e.g., a parallel processor) or over a computer network (e.g., a distributed processor). The use of such remotely coupled data storage devices and processors will be familiar to those of skill in the computer science arts. The term “computer network” as used herein is defined to include a set of communications channels interconnecting a set of computer systems that can communicate with each other. The communications channels can include transmission media such as, but not limited to, twisted pair wires, coaxial cable, optical fibers, satellite links, or digital microwave radio. The computer systems can be distributed over wide areas (e.g., a wide area network (WAN) spanning tens, hundreds, or thousands of miles) or over local areas (e.g., a local area network (LAN) spanning several feet to hundreds of feet). Furthermore, various local-area and wide-area networks can be combined to form aggregate networks of computer systems.
Mixed reality system 100 may also comprise additional components (not shown), such as tracking devices, microphones, headphones, and the like. Depending on the particular requirements of each implementation, computing system 135 may be configured as a separate desktop or laptop computer, a mobile or cell phone, or any of a number of other embodiments known to skilled artisans. Alternatively, computing system 135 may be integrated into head-mounted display system 120. Computing system 135 communicates with the components of mixed reality system 100, either wirelessly or with one or more wired connections, in accordance with techniques that are well-known to skilled artisans. Mixed reality system 100 may include a network connection to enable downloading software updates or accessing online content, as well as to facilitate communication, without limitation, with remote servers or other users.
As is well-known to skilled artisans, head-mounted display device 120 may be mounted on the user's head so as to cover the user's eyes, and may provide visual content to the user 110 through display elements within the headset that face the user's eyes (not shown).
To enable the user 110 to see the surrounding real-world environment, head-mounted display device 120 may comprise an image passthrough feature, as known in the art. Specifically, to enable the user 110 to perceive their physical surroundings while wearing the head-mounted display device 120, one or more cameras may be implemented into the head-mounted display device 120, such as outward-facing cameras 140A and 140B as depicted in the accompanying drawings.
In certain embodiments, instead of a head-mounted display device (such as device 120 as shown in the accompanying drawings), a different type of display device may be employed to present the mixed-reality content, depending on the requirements of each particular implementation.
Processors 350 may include, without limitation, any type of conventional processors, microprocessors, CPUs, GPUs, or processing logic that interprets and executes instructions. Main memory 310 may include, without limitation, a random-access memory (RAM) or another type of dynamic storage device that stores information and instructions for execution by processors 350. ROM 320 may include, without limitation, a conventional ROM device or another type of static storage device that stores static information and instructions for use by processors 350. Storage device 330 may include, without limitation, a magnetic and/or optical recording medium and its corresponding drive.
Input device(s) 380 may include, without limitation, one or more conventional mechanisms that permit a user to input information to computing device 300, such as a keyboard, a mouse, a pen, a stylus, handwriting recognition, voice recognition, biometric mechanisms, touch screen, and the like (e.g., controllers 130L and 130R as depicted in the accompanying drawings).
As described in detail herein, computing device 300 may perform operations based on software instructions that may be read into memory 310 from another computer-readable medium, such as data storage device 330, or from another device via communication interface 360. The software instructions contained in memory 310 cause one or more processors 350 to perform processes that are described elsewhere. Alternatively, hardwired circuitry may be used in place of, or in combination with, software instructions to implement processes consistent with the present invention. Thus, various implementations are not limited to any specific combination of hardware circuitry and software.
In certain embodiments, a client 420 may connect to network 410 via wired and/or wireless connections, and thereby communicate or become coupled with server 400, either directly or indirectly. Alternatively, client 420 may be associated with server 400 through any suitable tangible computer-readable media or data storage device (such as a disk drive, CD-ROM, DVD, or the like), data stream, file, or communication channel.
Network 410 may include, without limitation, one or more networks of any type, including a Public Land Mobile Network (PLMN), a telephone network (e.g., a Public Switched Telephone Network (PSTN) and/or a wireless network), a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN), an Internet Protocol Multimedia Subsystem (IMS) network, a private network, the Internet, an intranet, a cellular network, and/or another type of suitable network, depending on the requirements of each particular implementation.
One or more components of networked environment 430 may perform one or more of the tasks described as being performed by one or more other components of networked environment 430.
Details regarding the foregoing components (e.g., as depicted in the accompanying drawings) are familiar to skilled artisans and are therefore not repeated here.
Certain embodiments of the present invention may be implemented in the context of a networked multiplayer environment such as a computer game. As is well-known to skilled artisans, such a game requires a multiplayer networking system that can connect players to each other and allow them to interact within the same virtual environment. Among other requirements, such a system should typically be able to handle multiple players, provide reliable connectivity, and minimize latency to ensure smooth gameplay. Such a game also requires a game server to manage the multiplayer environment and store data related to the players' progress and interactions. The server should be powerful enough to handle multiple players at once and provide a stable and secure environment for gameplay. Furthermore, the game should be designed with multiple players in mind, with mechanics and gameplay elements that encourage social interaction and collaboration between players. Finally, without limitation, the game should provide players with a way to communicate with each other during gameplay, either through voice chat or text chat.
In the first step (510), a user (such as user 110 depicted in the accompanying drawings) calibrates the physical room in which he or she is located, for example by marking points (e.g., four floor corners) that establish the position, orientation, and boundaries of the room, as described further below.
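By way of illustration only, and without limitation, the following Python sketch (in which all names are hypothetical and no particular engine or platform API is assumed) shows one simple way that four marked floor-corner points might be reduced to a room origin and orientation of the kind used for the calibration described above:

```python
import math

def calibrate_room(corners):
    """Derive a room origin and yaw orientation from four marked
    floor-corner points, given as (x, z) pairs in headset-tracking
    coordinates. Hypothetical sketch: a real system would also
    validate that the points form a sensible quadrilateral."""
    # Use the centroid of the marked corners as the room origin.
    cx = sum(p[0] for p in corners) / len(corners)
    cz = sum(p[1] for p in corners) / len(corners)
    # Use the direction of the first marked edge as the room's
    # forward axis, so the orientation is reproducible.
    dx, dz = corners[1][0] - corners[0][0], corners[1][1] - corners[0][1]
    yaw = math.atan2(dz, dx)
    return (cx, cz), yaw

# Example: a 4 m x 3 m room whose first corner sits at the tracking origin.
origin, yaw = calibrate_room([(0, 0), (4, 0), (4, 3), (0, 3)])
print(origin, math.degrees(yaw))  # (2.0, 1.5) 0.0
```

Using the centroid as the origin and the first marked edge as the forward axis is merely one convention; any reproducible choice suffices for the room-stitching computation described further below.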
At step (520), the user defines the boundaries of one or more objects within the relevant room. For example, a desk may rest on the floor of the room at a specific fixed location, and the user may hold a controller (e.g., controller 130L or 130R) and move it around the edges of the object while activating a predetermined button or other input device on the controller. In a virtual version of the object, such as the desk described in the preceding sentence, the object may appear as a rectangular prism (e.g., a box) or other simplified form, approximately the same size as the real-world object and located at approximately the same position as the real-world object, as depicted in the accompanying drawings.
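By way of illustration only, the following Python sketch (hypothetical names; illustrative rather than definitive) shows how controller samples captured while tracing an object's edges might be reduced to the simplified rectangular-prism proxy described above:

```python
def bounding_box(points):
    """Reduce controller samples captured while tracing an object's
    edges, given as (x, y, z) tuples, to an axis-aligned box, i.e.,
    the simplified rectangular-prism proxy described above."""
    xs, ys, zs = zip(*points)
    lo = (min(xs), min(ys), min(zs))  # one corner of the proxy box
    hi = (max(xs), max(ys), max(zs))  # the diagonally opposite corner
    return lo, hi

# Example: samples traced around a desk about 1.2 m wide and 0.7 m tall.
samples = [(0.0, 0.0, 0.0), (1.2, 0.0, 0.6), (1.2, 0.7, 0.0), (0.1, 0.7, 0.6)]
print(bounding_box(samples))  # ((0.0, 0.0, 0.0), (1.2, 0.7, 0.6))
```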
Referring back to the flow chart, at step (530), the user defines an inter-user window on a surface of the relevant room (e.g., a wall), for example by marking two points that establish the position and extent of the window, as described further below.
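By way of illustration only, the following Python sketch (hypothetical names) shows how two marked points, interpreted as diagonally opposite corners of the window on a vertical wall, might be reduced to a window rectangle; a real system might additionally project the marked points onto the detected wall plane:

```python
import math

def window_from_corners(p1, p2):
    """Build an inter-user window rectangle from two marked points,
    interpreted as diagonally opposite corners of the window on a
    vertical wall. Points are (x, y, z) with y up. Hypothetical
    sketch only."""
    (x1, y1, z1), (x2, y2, z2) = p1, p2
    center = ((x1 + x2) / 2.0, (y1 + y2) / 2.0, (z1 + z2) / 2.0)
    width = math.hypot(x2 - x1, z2 - z1)  # measured along the wall
    height = abs(y2 - y1)                 # measured vertically
    return {"center": center, "width": width, "height": height}

# Example: corners marked 1.6 m apart horizontally and 1.2 m vertically.
print(window_from_corners((0.0, 0.8, 2.0), (1.6, 2.0, 2.0)))
# -> center (0.8, 1.4, 2.0), width 1.6, height ~1.2
```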
At step (540), multi-user match-making is performed with the objective of associating the current user with another user who has also defined an inter-user window as described above. Once this multi-player match-making step has been performed, the rooms in which both users are located are virtually “stitched” together at the location of the inter-user window, such that the second user may virtually “see” into the first user's room (e.g., the second user may see a virtual representation of the first user, a visual representation of portions of the room in which the first user is located, and a virtual representation of one or more objects within the room in which the first user is located).
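By way of illustration only, the following Python sketch (hypothetical names; no particular networking framework is assumed) shows one simple first-come, first-served approach to the match-making described above:

```python
import queue

class WindowMatchmaker:
    """Illustrative first-come, first-served matchmaker that pairs
    users who have each defined an inter-user window. A production
    service would also weigh latency, region, session options, and
    the like; all names here are hypothetical."""

    def __init__(self):
        self._waiting = queue.Queue()

    def request_match(self, user_id, window):
        """Pair the caller with a waiting user, if one exists.

        Returns ((user_a, window_a), (user_b, window_b)) when a pair
        is formed, or None if the caller has been queued to wait."""
        try:
            other = self._waiting.get_nowait()
        except queue.Empty:
            self._waiting.put((user_id, window))
            return None
        return other, (user_id, window)

mm = WindowMatchmaker()
print(mm.request_match("user-1", {"width": 2.0, "height": 1.5}))  # None
print(mm.request_match("user-2", {"width": 2.0, "height": 1.5}))  # a pair
```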
At step (550), a multi-user mixed-reality session (e.g., a game) is initiated that includes at least both of the users described above with reference to step (540).
A second user (not shown) is located in a second room (also not shown), which may be remote with respect to first room (620) (e.g., the first user may be located in a room in his or her home in a first city, and the second user may be located in a room in his or her home in a second city). The second user also performs steps (510), (520), and (530) in accordance with the process described above.
As a result of the actions described in the preceding two paragraphs, in certain embodiments, as depicted in the accompanying drawings, the two users' rooms are virtually stitched together at inter-user window 680, which may appear to the first user 610 to be located on left-hand wall 640 of first room 620. Through inter-user window 680, the first user 610 may virtually “see” into the second user's room (e.g., the first user 610 may see a virtual representation of the second user 690, as well as virtual representations of portions of the second room and of one or more objects within it).
Conversely, the second user 690 (who may also be wearing a head-mounted mixed-reality display device), through his or her own version of inter-user window 680 (which may appear to the second user 690 to be located on a right-side wall of the second room, as opposed to left-hand wall 640 in the example described above), may virtually “see” into first room 620 (e.g., the second user 690 may see a virtual representation of the first user 610, as well as virtual representations of portions of first room 620 and of one or more objects within it).
Still referring to this example, in certain embodiments, one or more virtual objects, such as virtual monster 670, may be rendered within the shared mixed-reality space (e.g., within first room 620), such that both the first user 610 and the second user 690 see and perceive the virtual object at a consistent location within their shared virtual space.
While the interactive shared mixed reality possibilities are vast, an additional example may include the first user 610 firing projectiles from his or her weapon through inter-user window 680, such that the second user 690 sees and perceives the effects of these projectiles (e.g., first user 610 may intentionally or accidentally shoot projectiles at the second user 690, or vice versa). As an additional example, virtual monster 670 may fire projectiles at first user 610 such that the second user 690 sees and perceives these actions and their effects through his or her own side of inter-user window 680. As another example, virtual monster 670 may appear to fire a projectile through the first room 620 and through inter-user window 680, ending up in a virtual representation of the second room (in which the second user 690 is located), such that the first user 610 and the second user 690 both see and perceive these actions and their effects in real time, as if they were occurring in a coordinated manner throughout the mixed-reality space shared by the first user 610 and the second user 690. As another example, virtual monster 670 may appear to enter first room 620, in a manner visible to both the first user 610 and the second user 690, and the second user 690 may fire projectiles through inter-user window 680 into the virtual representation of first room 620 so as to assist with fighting monster 670. As another example, multiple users may be physically present in first room 620 and/or in the second room, and, in certain embodiments according to aspects of the present invention, the movements and actions of all such users are tracked and reflected in the mixed-reality multiplayer virtual space that is shared by all such users.
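By way of illustration only, the following Python sketch (hypothetical names) shows how a projectile's position and velocity might be mapped from one room's coordinate frame into the other room's frame using the rigid transform produced when the rooms are stitched together, so that both users perceive the projectile's flight consistently; one way of computing such a transform is sketched below in connection with room stitching:

```python
import math

def to_other_room(position, velocity, yaw, translation):
    """Map a projectile's state from one room's coordinate frame into
    the other room's frame, using the rigid transform (a yaw rotation
    about the vertical axis plus a translation) produced when the two
    rooms were stitched together at the inter-user window. Vectors
    are (x, y, z) with y up; all names are hypothetical."""
    c, s = math.cos(yaw), math.sin(yaw)

    def rotate(v):
        x, y, z = v
        return (c * x - s * z, y, s * x + c * z)  # rotate in floor plane

    px, py, pz = rotate(position)
    tx, ty, tz = translation
    # Positions are rotated and translated; velocities only rotated.
    return (px + tx, py + ty, pz + tz), rotate(velocity)

# Example: rooms stitched with a 180-degree yaw and a (7, 0, 3) offset.
pos, vel = to_other_room((1.0, 1.5, 2.0), (0.0, 0.0, 5.0),
                         math.pi, (7.0, 0.0, 3.0))
print(pos, vel)  # approximately (6.0, 1.5, 1.0) and (0.0, 0.0, -5.0)
```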
Aspects of the present invention in certain embodiments enable users to collaborate in unique and novel ways, as users are given the visual and auditory impression that another user's movements are projected onto a wall of the room in which that user is located, even though the two users may be located far apart from each other in the real world.
In certain embodiments, the process of defining an inter-user window in accordance with step (530) of the flow chart described above may be performed by the user marking two points on a wall of the relevant room (e.g., by pointing a controller such as controller 130L or 130R and activating a predetermined input), which two points define the position and extent of inter-user window 680.
In certain embodiments, the minimum amount of information required from each user according to aspects of the present invention is four points within each user's physical room to calibrate the room position and orientation (or, alternatively, origin or room boundary information may be provided using so-called guardian or chaperone systems used in commercially available VR, AR, or MR systems), as well as two additional points to define the position of inter-user window 680. Once this information has been collected, the virtual versions of two physical rooms may be stitched together into a single shared and combined virtual room. For example, a virtual version of the second room may be added or stitched to first room 620, using the position of the defined origin point as a reference, and this addition or stitching takes place by translating and/or rotating the virtual representation of the second room so that it becomes adjacent to first room 620 and connected to first room 620 such that inter-user window 680 is located in the same position within the users' shared virtual space.
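By way of illustration only, the following Python sketch (floor-plane coordinates; hypothetical names) shows one way the translation and rotation described above might be computed, by aligning the second room's window segment with the reverse of the first room's segment so that the two rooms face each other across inter-user window 680:

```python
import math

def stitch_transform(a1, a2, b1, b2):
    """Compute the yaw rotation and translation that place the second
    room adjacent to the first by making the two rooms' inter-user
    window segments coincide. a1/a2 are the window endpoints in the
    first room's frame and b1/b2 in the second room's frame, all as
    (x, z) floor-plane coordinates. The second segment is aligned to
    the first segment reversed, so the rooms face each other across
    the window. All names are hypothetical."""
    ax, az = a2[0] - a1[0], a2[1] - a1[1]   # first window direction
    bx, bz = b2[0] - b1[0], b2[1] - b1[1]   # second window direction
    # Rotate the second room so its window runs opposite to the first's.
    yaw = math.atan2(-az, -ax) - math.atan2(bz, bx)
    c, s = math.cos(yaw), math.sin(yaw)
    # After rotating, translate so that b1 lands on a2 (b2 lands on a1).
    rx, rz = c * b1[0] - s * b1[1], s * b1[0] + c * b1[1]
    return yaw, (a2[0] - rx, a2[1] - rz)

# Example: two 2 m windows; the second room rotates 180 degrees and
# translates so that the window segments coincide.
yaw, t = stitch_transform((0, 3), (2, 3), (5, 0), (7, 0))
print(round(math.degrees(yaw)), t)  # 180, approximately (7.0, 3.0)
```

Applying the resulting rotation and translation to every vertex and object of the second room's virtual representation yields the combined shared room; the same transform may be reused for avatars and projectiles, as in the earlier sketch.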
Certain embodiments allow multiple users to align their play space with other users in different locations. For example, in certain embodiments, a player may map out the physical space around that player to use it as that player's play space. This includes, but is not limited to, mapping out walls, windows, doors, tables, chairs, and the like. For example, such embodiments proceed as follows:
The process through which a player may map out his or her space is defined as a process that allows a player either directly or indirectly to mark out the layout and objects of the room that he or she is in. In certain embodiments, the information required for such layout marking may be provided by the application executing the audiovisual experience itself, or by the device that the application is running on, or by another device which allows the player to map out their space and share that information with the application.
For example, one way to implement the system is to use a room-setup facility provided commercially, such as the OpenXR Scene API provided by Meta. That facility allows the player to map out his or her room, and the acquired map data can then be shared with the application for use as needed. The application may create representations of the different objects that have been mapped out in order to visualize them to the player within the application.
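By way of illustration only, the following Python sketch uses plain, hypothetical data structures (rather than any vendor's actual API or data format) to suggest how an application might receive mapped-room data and create simplified in-application representations of the mapped objects:

```python
from dataclasses import dataclass

@dataclass
class MappedObject:
    """One mapped room feature as the application might receive it
    from a platform room-setup facility. This plain structure is
    hypothetical; the actual format depends on the platform."""
    label: str     # e.g., "WALL", "DESK", "DOOR"
    center: tuple  # (x, y, z) in room coordinates
    size: tuple    # (width, height, depth) in meters

def build_scene_proxies(mapped_objects):
    """Create simplified in-application representations (here, plain
    dictionaries standing in for engine objects) for each mapped
    feature, so the application can visualize them to the player."""
    return [{"label": o.label, "center": o.center, "size": o.size,
             "visible": True} for o in mapped_objects]

room = [MappedObject("WALL", (0.0, 1.25, 2.0), (4.0, 2.5, 0.1)),
        MappedObject("DESK", (1.0, 0.35, 1.0), (1.2, 0.7, 0.6))]
print(build_scene_proxies(room))
```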
An example of the process that players may take, and of how the application handles the system, is as follows: first, each player maps out his or her physical space (e.g., walls, windows, doors, and furniture) as described above; second, each player defines an inter-user window within that space; third, multi-user match-making associates players who have defined such windows; and finally, the players' rooms are virtually stitched together at the inter-user window and the shared multi-user session begins.
While the above description contains many specifics and certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of, and not restrictive on, the broad invention, and that this invention is not to be limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art, as mentioned above. The invention includes any combination or sub-combination of the elements from the different species and/or embodiments disclosed herein.
This application claims the benefit of Provisional Application Ser. No. 63/452,541 filed 16 Mar. 2023, the contents of which are herein incorporated by reference in their entirety for all purposes.