The United States Media and Entertainment Industry is the largest in the world, representing a third of the global media and entertainment industry, which delivers events, such as musical events, theatrical events, sporting events, and/or motion picture events, to an audience for their viewing pleasure. Different real-world venues have different configurations and arrangements from one another, such as media surfaces, seating locations, and/or standing locations to provide some examples, for presenting an event to an audience. Event planners often design and plan the presentation of the event to optimize the event experience for the specific configurations and arrangements at these specific venues. In some situations, the event might need to be optimized to accommodate the specific configurations and arrangements of the specific venue before the event is presented, and/or the specific venue might need to be modified to accommodate the event before the event is presented. In these situations, the event and/or the content of the event may need to be modified differently for different venues. In current systems, however, customizing and planning the content for these specific configurations and arrangements requires planners to physically travel to the specific venue, which may be located anywhere in the world.
The accompanying drawings, which are incorporated herein and form a part of the specification, illustrate the present disclosure and, together with the description, further explain the principles thereof and enable a person skilled in the pertinent art to make and use the same. Aspects of the present disclosure are best understood from the following detailed description when read with the accompanying figures. It is noted that, in accordance with the standard practice in the industry, features are not drawn to scale. In fact, the dimensions of the features may be arbitrarily increased or reduced for clarity of discussion. In the drawings:
In the accompanying drawings, like reference numbers indicate identical or functionally similar elements. Additionally, the left most digit(s) of a reference number identifies the drawing in which the reference number first appears.
The following disclosure provides many different embodiments, or examples, for implementing different features of the provided subject matter. Specific examples of components and arrangements are described below to simplify the present disclosure. These are, of course, merely examples and are not intended to be limiting. For example, the formation of a first feature over a second feature in the description that follows may include embodiments in which the first and second features are formed in direct contact, and may also include embodiments in which additional features may be formed between the first and second features, such that the first and second features may not be in direct contact. In addition, the present disclosure may repeat reference numerals and/or letters in the examples. This repetition does not in itself dictate a relationship between the embodiments and/or configurations discussed.
Overview
Real-world users of an event simulation system can play back a digital representation of an event that is mapped onto a virtual model of a real-world venue to simulate the event being presented at the real-world venue. As these real-world users view this virtual event, they can virtually interact with the virtual event, for example, by moving around the virtual event to view it at various locations and/or by modifying parameters, characteristics, and/or attributes of the virtual event. Thereafter, the event simulation system can propagate these modifications across multiple real-world users of the event simulation system to allow these real-world users to collaboratively interact with the virtual event.
Exemplary Event Simulation System
As illustrated in
In the exemplary embodiment illustrated in
In some embodiments, event simulation server 102 synchronizes a virtual event among multiple user devices 106.1 through 106.n, which facilitates collaboration on the design and planning of the virtual event. After retrieving the virtual model, the event simulation server 102 can map the digital representation of the event onto the virtual model to generate a virtual event as illustrated in
In some embodiments, event simulation server 102 may only function to synchronize the virtual events 112.1 to 112.n based on interactions and modifications by user devices 106.1 to 106.n. In these embodiments, each of user devices 106.1 to 106.n may retrieve the virtual model or the virtual event directly from event storage 104 or remote event sources 108. After retrieval of either the virtual model or the virtual event, user devices 106.1 to 106.n may then transmit respective interactions and modifications to event simulation server 102 which transmits the interactions and modifications to the other user devices. In this manner, event simulation server 102 facilitates collaboration between the user devices 106.1 to 106.n by ensuring that each user device has the same view of the virtual event.
After receiving the virtual event (either from event simulation server 102 or directly from event storage 104 or remote event sources 108), each user device may store and interact with a local copy of the virtual event, such as virtual event 112.1 for user device 106.1, virtual event 112.2 for user device 106.2, and virtual event 112.n for user device 106.n. Modifications to the virtual event (e.g., moving digital objects or changing acoustic or visual parameters of the virtual event) and other interactions by each user device are transmitted to event simulation server 102, which forwards the modifications to the other user devices to update their respective virtual events. For example, if user device 106.1 adjusts a property of virtual event 112.1, such as a wind flow pattern (e.g., for a concert), a scent model (e.g., the modeling of a scent through the venue), or changes to acoustic beamforming, to name a few examples, user device 106.1 transmits the adjustment to user devices 106.2 and 106.n via event simulation server 102. User devices 106.2 and 106.n may then make a corresponding adjustment to their respective virtual events 112.2 and 112.n so that the adjustment may be visually represented at user devices 106.2 and 106.n.
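The relay of modifications described above might be organized as follows. This is a minimal sketch under assumed names (Modification, UserDevice, EventSimulationServer) and an assumed dictionary layout for each local copy of the virtual event; the disclosure does not prescribe a particular data model.

    # Minimal sketch of modification relay between user devices via the server.
    from dataclasses import dataclass


    @dataclass
    class Modification:
        source_device: str          # e.g., "106.1"
        target: str                 # e.g., "wind_machine_1"
        parameter: str              # e.g., "flow_direction_degrees"
        value: object               # new parameter value


    class UserDevice:
        def __init__(self, device_id: str):
            self.device_id = device_id
            self.virtual_event: dict = {}   # local copy, e.g., virtual event 112.1

        def apply(self, mod: Modification) -> None:
            # Update the local copy so the adjustment can be visually represented.
            self.virtual_event.setdefault(mod.target, {})[mod.parameter] = mod.value


    class EventSimulationServer:
        def __init__(self):
            self.devices: list = []

        def register(self, device: UserDevice) -> None:
            self.devices.append(device)

        def relay(self, mod: Modification) -> None:
            # Forward the modification to every other user device so that each
            # device keeps the same view of the virtual event.
            for device in self.devices:
                if device.device_id != mod.source_device:
                    device.apply(mod)


    server = EventSimulationServer()
    d1, d2 = UserDevice("106.1"), UserDevice("106.2")
    server.register(d1)
    server.register(d2)
    server.relay(Modification("106.1", "wind_machine_1", "flow_direction_degrees", 45))
    print(d2.virtual_event)  # {'wind_machine_1': {'flow_direction_degrees': 45}}

In this sketch, the server only forwards state changes rather than streaming full copies of the virtual event, which is one straightforward way each user device could keep the same view while storing its own local copy.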
The digital representation of the event can represent one or more computer-generated digital representations of a musical event, a theatrical event, and/or a sporting event to provide some examples, and/or the event itself, such as a motion picture event to provide an example. In some embodiments, a real-time, or near real-time, event also referred to as a live event, such as the musical event, the theatrical event, and/or the sporting event to provide some examples, can be digitally captured, for example, by one or more digital cameras, to provide the digital representation of the event of this real-time, or near real-time, event for mapping onto the virtual model. As illustrated in
In some embodiments, the event can include one or more performers and/or one or more theatrical properties, also referred to as props, that are associated with the event. In these embodiments, the event simulation server 102 can map one or more computer-generated models of the one or more performers and/or one or more computer-generated models of the one or more props that are associated with the event onto the virtual model to generate the virtual event.
In some embodiments, virtual events 112.1 to 112.n may also include visual representations of real-world effects related to the actual event. Various virtual effects, which may relate to the senses of the human body, such as sight, smell, touch, taste, and/or hearing to provide some examples, can be incorporated into the virtual event. These various virtual effects can include audio, visual, and/or sensory effects that are visually represented in the virtual events and provide a visual aid for real-world effects that are not necessarily visible. For example, real-world effects of an event may include wind flow patterns (e.g., from a wind machine), sound localization (e.g., from acoustic beamforming techniques), and scent trails. A virtual event may represent these real-world effects through visual virtual effects, such as arrows, lines, or any other visual effect that can be displayed in the virtual event.
In some embodiments, these various visual virtual effects can relate to lighting options available in the real-world venue, colors present in the real-world venue, different materials in the real-world venue, seats in the real-world venue, screens, exterior surroundings of the real-world venue, such as trees, buildings, roads, sky, lighting, and/or sun effects, and/or other real-world viewers in the real-world venue to provide some examples. In some embodiments, the audio effects can include realistic, confirmatory effects; realistic, evocative effects; symbolic, evocative effects; conventionalized effects; impressionistic effects; and/or music as effects to provide some examples. In some embodiments, the visual effects can include special effects, motion capture, matte painting, animation, three-dimensional modeling, rigging, rotoscoping, match moving, and/or compositing to provide some examples. In some embodiments, the sensory effects can include various effects that are related to the senses that can be experienced by the human body, such as temperature, touch, and/or smell to provide some examples. In some embodiments, the virtual event may include these virtual effects and/or computer-generated digital models of various electrical, mechanical, and/or electro-mechanical devices to simulate these virtual effects. For example, the virtual event may include computer-generated digital models of lighting systems; fog machines; smoke machines; wind machines; robots or animatronics; platforms, such as moveable platforms for performers to provide an example, and/or four-dimensional effects pods into the virtual event.
After generating the virtual event, the event simulation server 102 can provide the virtual event to the user devices 106.1 through 106.n. As illustrated in
In the exemplary embodiment illustrated in
In the exemplary embodiment illustrated in
In some embodiments, these interactions can include virtually modifying the virtual event as these users are viewing the virtual event views 114.1 through 114.n. In some embodiments, the user devices 106.1 through 106.n can provide various virtual graphical elements to the real-world users to allow these users to modify the virtual event. In these embodiments, these virtual graphical elements can outline various interactions, for example, modifications, that are available to the real-world users. In these embodiments, these virtual graphical elements can include one or more radio buttons, one or more check boxes, one or more text boxes, one or more toggle switches, one or more pop-up menus, one or more lists, and/or any other suitable mechanism that allows the real-world users to interact to provide some examples. For example, these modifications can include removing the one or more parameters, characteristics, and/or attributes of the virtual event from the three-dimensional space of the virtual event. As another example, these modifications can include moving a location, for example, a position and/or an orientation, of the one or more parameters, characteristics, and/or attributes of the virtual event 112 within the three-dimensional space of the virtual event. As a further example, these modifications can include inserting one or more new parameters, new characteristics, and/or new attributes into the three-dimensional space of the virtual event. In some embodiments, the parameters, characteristics, and/or attributes of the virtual event can include, or relate to, the one or more computer-generated digital models of the various architectural features of the real-world venue, the one or more computer-generated digital models of the various objects, the one or more computer-generated models of the one or more performers, the one or more computer-generated models of the one or more props that are associated with the event, and/or other suitable parameters, characteristics, and/or attributes of the virtual event that will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure.
In some embodiments, the virtual graphical elements correspond to virtual effects and are displayed in virtual event views 114.1 to 114.n. Virtual graphical elements may be visual representations of real-world effects, and the parameters, characteristics, and/or attributes of the virtual event correspond to the parameters, characteristics, and/or attributes of the real-world effects. Examples of real-world effects are wind flow, scent trails, smoke/fog trails, audio directions (e.g., from beamforming), and lighting effects, just to name a few examples. Examples of parameters, characteristics, and/or attributes for wind flow may include wind speed, wind direction, and wind duration. Examples of parameters, characteristics, and/or attributes for scent or smoke/fog trails may include scent or smoke intensity, initial direction, and duration. Audio direction relates to beamforming technology, which controls the size, shape, and direction of an acoustic wave in order to direct sound to a particular location. For example, beamforming can allow sound to be directed to a particular location of a venue so that only certain users hear the sound. Examples of parameters, characteristics, and/or attributes for audio direction may include the target location of the audio and volume. Examples of parameters, characteristics, and/or attributes for lighting effects may include color, intensity, movement pattern, and target location (to be illuminated).
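One way the per-effect parameters listed above might be organized is with a small set of typed records, one per effect. The field names below are illustrative assumptions rather than names taken from the disclosure.

    # Illustrative data structures for effect parameters, characteristics, and attributes.
    from dataclasses import dataclass
    from typing import Tuple


    @dataclass
    class WindFlowEffect:
        speed_mps: float            # wind speed
        direction_degrees: float    # wind direction
        duration_s: float           # wind duration


    @dataclass
    class ScentTrailEffect:
        intensity: float            # scent or smoke intensity
        initial_direction_degrees: float
        duration_s: float


    @dataclass
    class DirectedAudioEffect:
        target_location: Tuple[float, float, float]  # where the beamformed sound lands
        volume_db: float


    @dataclass
    class LightingEffect:
        color_rgb: Tuple[int, int, int]
        intensity: float
        movement_pattern: str
        target_location: Tuple[float, float, float]  # area to be illuminated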
In some embodiments, the virtual graphical elements provide visual representations of real-world effects that are not typically visible to the human eye, such as wind, scent, and audio, as discussed above. For example, a virtual graphical element for wind may depict the trails of wind flow (e.g., from a wind machine) and the interaction of the wind with the architecture of the venue within the virtual event. As another example, a virtual graphical element for directed audio (e.g., beamforming) may depict the direction of audio from a source to an intended target and the interaction of the audio with the architecture of the venue. When parameters, characteristics, and/or attributes of these real-world effects are modified (e.g., by any of user devices 106.1 through 106.n), event simulation server 102 and/or user devices 106.1 through 106.n may update the virtual graphical elements to represent that modification. For example, the updated virtual graphical elements may represent a new direction for the wind flow or a new direction or target for the directed audio.
In some embodiments, users may modify parameters, characteristics, and/or attributes of virtual effects via an interface provided by user devices 106.1 to 106.n. For example, a user may modify a parameter, characteristic, and/or attribute of virtual event 112.1 at user device 106.1, such as the wind direction of a wind machine. Such a modification changes virtual event 112.1, which displays the modification as a visual representation via virtual event view 114.1. For example, virtual event view 114.1 may display an arrow representing the new wind direction and wind flow within the venue. User device 106.1 may transmit the modification to the user devices 106.2 to 106.n (e.g., via event simulation server 102). Upon receiving the modification, user devices 106.2 to 106.n may update respective virtual events 112.2 to 112.n based on the modification. This update includes displaying the arrow representing the new wind direction and wind flow in respective virtual event views 114.2 to 114.n. Although wind direction is discussed in this embodiment, a similar discussion applies to other virtual effects, such as those discussed above (e.g., scent trails, fog/smoke trails, audio directions), for displaying virtual graphical elements that represent virtual effects and any resulting modifications to those effects. In this manner, virtual events 112.1 to 112.n may simulate the behavior of real-world effects within a particular venue and display that simulated behavior as virtual graphical elements in virtual event views 114.1 to 114.n.
In some embodiments, event simulation server 102 may process the virtual graphical elements to simulate the real-world effects. For example, event simulation server 102 may receive a modification to the parameters, characteristics, and/or attributes from a user device, simulate the impact of that modification on the virtual event, generate the virtual graphical element to correspond to the simulated impact, and transmit the virtual graphical element to the other user devices. In some embodiments, user devices 106.1 to 106.n receive the modification from event simulation server 102 and simulate the impact of the modification locally.
In some embodiments, processing of the virtual graphical elements may include simulating the interaction between two or more virtual graphical elements. For example, a user may modify one or more parameters, characteristics, and/or attributes of both a wind flow and a smoke machine. The virtual event may be updated based on the modifications to the wind flow and the smoke trail, which may include simulating the impact of the modifications on the wind flow and the smoke trail. This may further include updating the virtual graphical elements within the virtual event view to display the updated simulation, such as a line representing a new smoke trail that is affected by the change in the wind flow.
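As a rough illustration of simulating the interaction between two effects, a smoke trail might be advected by the current wind vector and the resulting points rendered as a line in the virtual event views. The simple forward-integration scheme below is an assumption; the disclosure does not specify a particular numerical method.

    # Rough illustration: a smoke trail bent by the current wind vector.
    import math


    def smoke_trail(origin, smoke_direction_deg, wind_direction_deg, wind_speed,
                    smoke_speed=1.0, steps=10, dt=0.5):
        """Return a list of (x, y) points for a smoke trail bent by the wind."""
        x, y = origin
        points = [(x, y)]
        sx = smoke_speed * math.cos(math.radians(smoke_direction_deg))
        sy = smoke_speed * math.sin(math.radians(smoke_direction_deg))
        wx = wind_speed * math.cos(math.radians(wind_direction_deg))
        wy = wind_speed * math.sin(math.radians(wind_direction_deg))
        for _ in range(steps):
            # Each step, the smoke keeps its own velocity plus the wind's.
            x += (sx + wx) * dt
            y += (sy + wy) * dt
            points.append((x, y))
        return points


    # Changing the wind direction yields a new trail, which could then be drawn
    # as an updated line (virtual graphical element) in the virtual event views.
    print(smoke_trail(origin=(0.0, 0.0), smoke_direction_deg=90,
                      wind_direction_deg=0, wind_speed=2.0))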
In the exemplary embodiment illustrated in
In the exemplary embodiment illustrated in
In some embodiments, the communications between user devices 106.1 through 106.n may be associated with the virtual graphical elements being displayed in corresponding virtual event views 114.1 through 114.n. For example, a user at user device 106.1 may interact with a virtual graphical element in virtual event view 114.1. Examples of interactions include selection, annotation, or modification of a virtual graphical element. The user may wish to collaborate with another user at user device 106.2 on the interaction with the virtual graphical element. User device 106.1 may initiate a communication with the user device 106.2, which may include modifying the virtual event view 114.2 to show the virtual graphical element. For example, user device 106.1 may send an instruction to user device 106.2 to move the location of the virtual user so that the virtual graphical element is displayed in virtual event view 114.2. As another example, user device 106.1 may transmit a text-based communication that includes an image of the interaction with the virtual graphical element to the user device 106.2.
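The collaboration flow above could be expressed as a small message that asks the receiving device to move its virtual user so the selected element falls within its view. The message fields and the dictionary used for the virtual user are illustrative assumptions only.

    # Hedged sketch: device 106.1 asks device 106.2 to refocus its view on an element.
    from dataclasses import dataclass


    @dataclass
    class FocusRequest:
        element_id: str                      # virtual graphical element of interest
        suggested_location: tuple            # (x, y, z) in the venue model
        note: str = ""                       # optional text-based communication


    def handle_focus_request(virtual_user: dict, request: FocusRequest) -> dict:
        # Move the recipient's virtual user so the element appears in its view.
        virtual_user["location"] = request.suggested_location
        virtual_user["look_at"] = request.element_id
        return virtual_user


    user_2 = {"location": (0, 0, 0), "look_at": None}
    handle_focus_request(user_2, FocusRequest("wind_arrow_3", (12.0, 4.0, 1.5),
                                              "Check this wind flow near section B"))
    print(user_2)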
The remote event sources 108 can provide the event to the event simulation server 102 via the communication network 110 as described above. In some embodiments, the remote event sources 108 can include a remote depository that stores the digital representation of the event, for example, a remote depository that is associated with an owner or owners of the digital representation of the event. In some embodiments, the remote event sources 108 can live-stream the digital representation of the event to the event simulation server 102 via the communication network 110. For example, the remote event sources 108 can provide the digital representation of the event, such as the one or more computer-generated digital representations of the musical event, the one or more computer-generated digital representations of the theatrical event, the one or more computer-generated digital representations of the sporting event, and/or the motion picture event to provide some examples, to the event simulation server 102 as the event is being presented at another real-world venue.
The communication network 110 can include a wireless communication network, a wireline communication network, and/or any combination thereof that will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure to communicatively couple the event simulation server 102, the user devices 106.1 through 106.n, and/or the one or more remote event sources 108 to one another. In some embodiments, the wireless communication network can be compliant with, for example, a version of an Institute of Electrical and Electronics Engineers (IEEE) 802.11 communication standard, for example, 802.11a, 802.11b/g/n, 802.11h, and/or 802.11ac, which are collectively referred to as Wi-Fi, a version of a Bluetooth communication standard, and/or any other wireless communication standard or protocol that will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure. In some embodiments, the wireline communication network can be compliant with, for example, a version of an Institute of Electrical and Electronics Engineers (IEEE) 802.3 communication standard or protocol, also referred to as Ethernet, such as 50G Ethernet, 100G Ethernet, 200G Ethernet, and/or 400G Ethernet to provide some examples, and/or any other wireline communication standard or protocol that will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure.
Exemplary Operation of the Exemplary Event Simulation System
At operation 202, the operational control flow 200 maps the digital representation of the event onto the virtual model to generate the virtual event. The digital representation of the event is substantially similar to the digital representation of the event as described above in
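A simplified sketch of operation 202 is placing assets from the digital representation of the event at anchor positions in the venue's virtual model. The dictionary layout (a venue model keyed by anchor names and an event keyed by asset names) is assumed purely for illustration.

    # Simplified sketch of operation 202: mapping event assets onto the venue model.
    def map_event_onto_model(venue_model: dict, event_assets: dict) -> dict:
        """Return a virtual event combining the venue model with event assets."""
        virtual_event = {"venue": venue_model, "placed_assets": []}
        for asset_name, anchor_name in event_assets.items():
            anchor = venue_model["anchors"].get(anchor_name)
            if anchor is None:
                continue  # skip assets whose anchor does not exist in this venue
            virtual_event["placed_assets"].append(
                {"asset": asset_name, "position": anchor}
            )
        return virtual_event


    venue = {"anchors": {"main_stage": (0, 0, 0), "screen_left": (-20, 5, 10)}}
    event = {"performer_model": "main_stage", "video_feed": "screen_left"}
    print(map_event_onto_model(venue, event))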
At operation 204, the operational control flow 200 generates a virtual event view at a location of the three-dimensional space of the virtual event from operation 202. The location of the three-dimensional space may correspond to a physical location of the venue, and therefore the virtual event view at the location corresponds to the real-world view at the corresponding physical location at the venue. In the exemplary embodiment illustrated in
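One minimal way to realize operation 204 is to derive a virtual camera pose for a view whose location corresponds to a physical seat in the venue. The seat map and the convention of aiming the camera at the stage are assumptions made for this sketch.

    # Minimal sketch of operation 204: a camera pose for a view at a seat location.
    import math


    def view_for_seat(seat_positions: dict, seat_id: str, stage_position=(0.0, 0.0, 0.0)):
        """Return a camera pose (position plus yaw toward the stage) for a seat."""
        position = seat_positions[seat_id]
        dx = stage_position[0] - position[0]
        dy = stage_position[1] - position[1]
        yaw_degrees = math.degrees(math.atan2(dy, dx))
        return {"position": position, "yaw_degrees": yaw_degrees}


    seats = {"A-101": (30.0, 12.0, 2.0), "B-014": (-18.0, 40.0, 6.5)}
    print(view_for_seat(seats, "A-101"))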
At operation 206, the operational control flow 200 can receive interactions from user devices 106.1 through 106.n in the form of user input and represent these interactions in the virtual event view from operation 204. The operational control flow 200 can play the virtual event from operation 202 to virtually simulate the event being presented at the real-world venue. The operational control flow 200 can receive user input from user devices 106.1 through 106.n as the virtual event view from operation 204 is being viewed (e.g., on a display of user devices 106.1 through 106.n). In some embodiments, this user input can include instructions (e.g., from an input device of user devices 106.1 through 106.n) for virtually moving virtual users around the three-dimensional space of the virtual event from operation 202 to view the digital representation of the event at one or more locations of the three-dimensional space of the virtual event from operation 202. In some embodiments, this user input can include virtually modifying one or more parameters, characteristics, and/or attributes of the virtual event from operation 202 as the virtual event view from operation 204 is being viewed in a substantially similar manner as described above in
Exemplary Virtual Event that can be Implemented within the Exemplary Event Simulation System
As illustrated in
For example, as illustrated in
Exemplary Virtual Event Views that can be Generated by the Exemplary Event Simulation System
As illustrated in
As illustrated in
Exemplary Interactions with the Exemplary Virtual Event Views that can be Generated by Users of the Exemplary Event Simulation System
As illustrated in
After emphasizing the virtual architectural feature 502, the real-world user can utilize the virtual pointer 504 to interact with a user interface 506 to modify the one or more parameters, characteristics, and/or attributes of the computer-generated digital model of the virtual architectural feature 502. In some embodiments, the user interface 506 can represent an exemplary embodiment of the user interface 408 as described above in
As will be described in further detail below, a real-world user of a user device of an event simulation system, such as one or more of the user devices 106.1 through 106.n as described above in
Exemplary Collaboration Among Users of the Exemplary Event Simulation System
In the exemplary embodiment illustrated in
In the exemplary embodiment illustrated in
Exemplary Event Simulation Server that can be Implemented within the Exemplary Event Simulation System
In the exemplary embodiment illustrated in
The venue modeling module 706 can retrieve and/or generate a virtual model of a real-world venue. The virtual model represents a computer-generated digital model of the real-world venue in three-dimensional space. In some embodiments, the real-world venue can represent a music venue, for example, a music theater, a music club, and/or a concert hall, a sporting venue, for example, an arena, a convention center, and/or a stadium, and/or any other suitable venue that will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure. In these embodiments, the virtual model can represent a computer-generated digital model of the music theater, the sporting venue, and/or the other suitable venue in the three-dimensional space. In some embodiments, the virtual model can include one or more computer-generated digital models of various architectural features of the real-world venue in the three-dimensional space, such as the performance area, the media surfaces, the seating locations, and/or the standing locations to provide some examples.
In the exemplary embodiment illustrated in
The event modeling module 708 can retrieve and/or generate a digital representation of an event. In some embodiments, the event can include a musical event, a theatrical event, a sporting event, and/or a motion picture event to provide some examples. In the exemplary embodiment illustrated in
The virtual event organization module 710 can retrieve and/or generate various virtual effects including those related to the senses of the human body, such as sight, smell, touch, taste, and/or hearing to provide some examples, that are to be included within the virtual event. These various virtual effects can include audio, visual, and/or sensory effects that are to be inserted into the virtual events. In some embodiments, these various virtual effects can relate to lighting options available in the real-world venue, colors present in the real-world venue, different materials in the real-world venue, seats in the real-world venue, screens, exterior surroundings of the real-world venue, such as trees, buildings, roads, sky, lighting, and/or sun effects, and/or other real-world viewers in the real-world venue to provide some examples. In some embodiments, the audio effects can include realistic, confirmatory effects; realistic, evocative effects; symbolic, evocative effects; conventionalized effects; impressionistic effects; and/or music as effects to provide some examples. In some embodiments, the visual effects can include special effects, motion capture, matte painting, animation, three-dimensional modeling, rigging, rotoscoping, match moving, and/or compositing to provide some examples. In some embodiments, the sensory effects can include various effects that are related to the senses that can be experienced by the human body, such as temperature, touch, and/or smell to provide some examples. In some embodiments, the virtual event organization module 710 can insert these virtual effects and/or computer-generated digital models of various electrical, mechanical, and/or electro-mechanical devices into the virtual event to simulate these virtual effects. For example, the virtual event organization module 710 can insert computer-generated digital models of lighting systems; fog machines; smoke machines; wind machines; robots or animatronics; platforms, such as moveable platforms for performers to provide an example, and/or four-dimensional effects pods into the virtual event. An exemplary four-dimensional effects pod is described in U.S. patent application Ser. No. 16/997,511, filed on Aug. 19, 2020, U.S. patent application Ser. No. 16/997,518, filed on Aug. 19, 2020, and U.S. patent application Ser. No. 17/150,794, filed on Jan. 15, 2021, each of which is incorporated herein by reference in its entirety.
In some embodiments, the real-world users can modify one or more parameters, characteristics, and/or attributes of the sensory effects and/or the computer-generated digital models of various electrical, mechanical, and/or electro-mechanical devices that simulate these virtual effects as these real-world users are viewing the virtual event in a substantially similar manner as described above in
The virtual event simulation module 712 can map the digital representation of the event from the event modeling module 708 onto the virtual model from venue modeling module 706 to generate the virtual event. In some embodiments, the virtual event represents a virtual presentation of the event at the real-world venue using the virtual model. For example, the event can include a musical event, a theatrical event, a sporting event, and/or a motion picture event to provide some examples. In this example, the virtual event simulation module 712 can map the musical event, the theatrical event, the sporting event, and/or the motion picture onto the virtual model to generate the virtual event. As such, the virtual event represents a virtual presentation of the musical event, the theatrical event, the sporting event, and/or the motion picture at the real-world venue using the virtual model. In some embodiments, the virtual event simulation module 712 can insert the sensory effects from the virtual event organization module 710 and/or the computer-generated digital models of various electrical, mechanical, and/or electro-mechanical devices into the virtual event to simulate these virtual effects into the virtual event.
In the exemplary embodiment illustrated in
The communication module 716 and the user devices functionally cooperate to provide an interactive environment for interacting with the virtual event. In some embodiments, the communication module 716 can provide multiple real-world users of the user devices with various communication capabilities, for example, audio, video, and/or data communications. In these embodiments, the communication module 716 can establish one or more communication sessions, for example, audio, video, and/or data communication sessions, between the multiple real-world users of the user devices to allow these real-world users to communicate among themselves while interacting with the virtual event as described above. For example, the communication module 716 can establish voice calls, video calls, and/or text messaging among multiple real-world users of the user devices to allow these real-world users to communicate among themselves while interacting with the virtual event as described above. In some embodiments, the one or more communication sessions can include synchronous audio and/or video conferencing sessions among the multiple users to allow these users to communicate among themselves in real-time, or near-real time, while interacting with the virtual event. In these embodiments, the synchronous audio and/or video conferencing sessions can be implemented in accordance with an Internet Relay Chat (IRC) conferencing protocol, a Protocol for Synchronous Conferencing (PSYC), a Secure Internet Live Conferencing (SILC) protocol, an Extensible Messaging and Presence Protocol (XMPP), and/or Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE) to provide some examples. In some embodiments, the one or more communication sessions can include asynchronous audio and/or video conferencing sessions among the multiple users to allow these users to communicate among themselves with a delay while interacting with the virtual event. In these embodiments, the asynchronous audio and/or video conferencing sessions can include electronic bulletin boards, electronic messages (e-mails), online forums and/or polls, social networking sites, and/or shared calendars to provide some examples.
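The session bookkeeping performed by a communication module such as the one described above might resemble the sketch below. The session types and fields are illustrative assumptions; the underlying conferencing protocols (e.g., XMPP or SIMPLE for synchronous sessions) are not modeled here.

    # Sketch of session bookkeeping for the communication module.
    from dataclasses import dataclass, field
    from typing import List


    @dataclass
    class CommunicationSession:
        session_id: str
        kind: str                       # "audio", "video", or "data"
        synchronous: bool               # real-time conference vs. delayed channel
        participants: List[str] = field(default_factory=list)


    class CommunicationModule:
        def __init__(self):
            self.sessions: List[CommunicationSession] = []

        def establish(self, kind: str, synchronous: bool, participants: List[str]):
            # Record a new session among the listed user devices.
            session = CommunicationSession(
                session_id=f"session-{len(self.sessions) + 1}",
                kind=kind,
                synchronous=synchronous,
                participants=list(participants),
            )
            self.sessions.append(session)
            return session


    module = CommunicationModule()
    print(module.establish("video", True, ["106.1", "106.2"]))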
Exemplary Event User Device that can be Implemented within the Exemplary Event Simulation System
In some embodiments, the user device 800 can be implemented as a standalone, or a discrete device, and/or can be incorporated within or coupled to one or more computing devices, such as one or more desktop computers, one or more mobile phones, one or more mobile computing devices; one or more mobile internet devices, such as tablet computers and/or laptop computers, one or more mobile video game consoles, one or more mobile wearable electronic devices, such as smartwatches, and/or any other computing device having one or more processors that will be recognized by those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure to provide some examples. In some embodiments, the user device 800 can represent an exemplary embodiment of one or more of the user devices 106.1 through 106.n as described above in
The user simulation module 810 can process the virtual event to generate a virtual event view that corresponds to a location of a virtual user that is associated with the real-world user within the virtual event. In some embodiments, the user simulation module 810 can process the virtual event to generate the virtual event view for presentation in a virtual reality (VR) environment. In these embodiments, the virtual event view displayed by user device 800 represents an immersive virtual world of the virtual event view. This virtual world effectively immerses the real-world user within the virtual event, giving the impression to the real-world user that they have entered the virtual event. In some embodiments, as the real-world user moves, such as by changing location within the virtual event and/or moving a part of his or her body within the real world, for example, moving his or her head up-and-down or side-to-side, the user device 800 may update the virtual event view to effectively immerse the real-world user within the virtual event. In some embodiments, the user simulation module 810 can process the virtual event to generate the virtual event view in a substantially similar manner as described above in
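A rough sketch of how the view might track the real-world user's movement follows; the pose format (a dictionary with position, yaw, and pitch) is assumed for illustration and is not taken from the disclosure.

    # Rough sketch: refreshing the virtual event view as the user's head and location move.
    def update_view(view: dict, head_yaw_deg: float, head_pitch_deg: float,
                    new_location=None) -> dict:
        """Refresh the view's camera to track the user's head and location."""
        if new_location is not None:
            view["position"] = new_location          # user changed location
        view["yaw_degrees"] = head_yaw_deg           # side-to-side head motion
        view["pitch_degrees"] = head_pitch_deg       # up-and-down head motion
        return view


    view = {"position": (30.0, 12.0, 2.0), "yaw_degrees": 0.0, "pitch_degrees": 0.0}
    print(update_view(view, head_yaw_deg=15.0, head_pitch_deg=-5.0))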
The interaction module 812 functionally cooperates with an event simulation server, such as the event simulation server 102 as described above in
The venue manipulator module 814 receives interactions of the real-world user with the virtual event. In some embodiments, the venue manipulator module 814 can insert, for example, overlay, various virtual graphical elements onto the virtual event view to allow the real-world user to interact with the virtual event. In some embodiments, these virtual graphical elements can outline various interactions, for example, modifications, that are available to the real-world user. In these embodiments, these virtual graphical elements can include one or more radio buttons, one or more check boxes, one or more text boxes, one or more toggle switches, one or more pop-up menus, one or more lists, and/or any other suitable mechanism that allows the real-world user to interact to provide some examples. In some embodiments, the venue manipulator module 814 can insert, for example, overlay, a virtual selection tool such as those described above in
In the exemplary embodiment illustrated in
In some embodiments, user devices 800A-C can process the virtual event by representing interactions with the virtual event view 808, for example, move around the virtual event to view the virtual event at various locations and/or modify parameters, characteristics, and/or attributes of the virtual event. Different interactions may be available via a graphical user interface in the virtual event view of the virtual event based on the type of user device. For example, user device 800A may include physical interface devices such as a keyboard and mouse. Virtual event view 808A may be customized to include interactions that are more easily input via such physical interface devices. Examples of such interactions for user device 800A include modifying code segments of the virtual event or any modification of parameters, characteristics, and/or attributes that requires text entry. As another example, virtual event view 808B may be customized to accommodate the VR implementation of user device 800B to include interactions that are specific to user controller device 806. As another example, virtual event view 808C may be customized to accommodate the mobility of user device 800C. Examples of such interactions for user device 800C include providing augmented reality (AR) based interactions. For example, user device 800C may be physically located within a venue in which an event is to take place while user device 800A and/or user device 800B may be physically located remotely from the venue. Virtual event view 808C may combine a real-time view of the venue along with the virtual event in an augmented reality format. For example, virtual graphical elements of virtual event view 808C may be displayed as an overlay over real-world elements of the venue.
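An illustrative mapping from device type to the interactions exposed in its virtual event view, following the examples above, is sketched below; the category names are assumptions rather than terms from the disclosure.

    # Illustrative mapping from device type to available interactions.
    def interactions_for_device(device_type: str) -> list:
        catalog = {
            "desktop": ["edit_code_segments", "text_parameter_entry",
                        "drag_and_drop_objects"],
            "vr_headset": ["controller_point_and_click", "immersive_walkthrough"],
            "mobile_ar": ["ar_overlay_on_venue", "touch_select_elements"],
        }
        return catalog.get(device_type, [])


    for device in ("desktop", "vr_headset", "mobile_ar"):
        print(device, "->", interactions_for_device(device))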
User device 800B may present the virtual event view 808B. In some embodiments, the viewer 802 represents a virtual reality (VR) headset for presenting the virtual event view 808B in a virtual reality (VR) environment. In these embodiments, the viewer 802 presents an immersive virtual world of the virtual event view 808B to the real-world users to effectively immerse the real-world user within the virtual event. In some embodiments, the viewer 802 can be implemented as a standalone device. In some embodiments, viewer 802 can be implemented as a tethered device that is communicatively coupled to another device, such as user device 800A.
The user controller device 806 represents an input device that is used by the real-world user to interact with the virtual event when using user device 800B. In some embodiments, the user controller device 806 can include one or more action buttons and/or one or more omnidirectional control sticks or buttons that can be manipulated by the real-world user to interact with the virtual event. In some embodiments, the real-world user can use the one or more action buttons and/or the one or more omnidirectional control sticks or buttons to perform various actions within the virtual world. For example, the real-world user can use the one or more action buttons and/or the one or more omnidirectional control sticks to “point-and-click” and/or “drag and drop” one or more computer-generated digital models of various architectural features of the real-world venue in the three-dimensional space, such as the performance area, the media surfaces, the seating locations, and/or the standing locations to provide some examples, and/or one or more computer-generated digital models of various objects at the real-world venue in the three-dimensional space, such as stage objects that are associated with the real-world venue and/or stage objects that are associated with the event to provide some examples.
Exemplary Computer System that can be Utilized to Implement Electronic Devices within the Exemplary Venue
In the exemplary embodiment illustrated in
The computer system typically includes an operating system, such as Microsoft's Windows, Sun Microsystems' Solaris, Apple Computer's macOS, Linux, or UNIX. The computer system also typically can include a Basic Input/Output System (BIOS) and processor firmware. The operating system, BIOS, and firmware are used by the processor to control subsystems and interfaces coupled to the processor. Typical processors compatible with these operating systems include the Pentium and Itanium from Intel, the Opteron and Athlon from Advanced Micro Devices, and the ARM processor from ARM Holdings.
As illustrated in
The user interface input devices 922 may include an alphanumeric keyboard, a keypad, pointing devices such as a mouse, trackball, touchpad, stylus, or graphics tablet, a scanner, a touchscreen incorporated into the display, audio input devices such as voice recognition systems or microphones, eye-gaze recognition, brainwave pattern recognition, and other types of input devices. Such devices can be connected by wire or wirelessly to a computer system. In general, use of the term “input device” is intended to include all possible types of devices and ways to input information into the computer system 910 or onto the communication network 918. The user interface input devices 922 typically allow a user to select objects, icons, text and the like that appear on some types of user interface output devices, for example, a display subsystem.
The user interface output devices 920 may include a display subsystem, a printer, a fax machine, or non-visual displays such as audio output devices. The display subsystem may include a cathode ray tube (CRT), a flat-panel device such as a liquid crystal display (LCD), a projection device, or some other device for creating a visible image such as a virtual reality system. The display subsystem may also provide non-visual display such as via audio output or tactile output (e.g., vibrations) devices. In general, use of the term “output device” is intended to include all possible types of devices and ways to output information from the computer system 910 to the user or to another machine or computer system.
The memory subsystem 926 typically includes a number of memories including a main random-access memory (“RAM”) 930 (or other volatile storage device) for storage of instructions and data during program execution and a read only memory (“ROM”) 932 in which fixed instructions are stored. The file storage subsystem 928 provides persistent storage for program and data files, and may include a hard disk drive, a floppy disk drive along with associated removable media, a CD-ROM drive, an optical drive, a flash memory, or removable media cartridges. The databases and modules implementing the functionality of certain embodiments may be stored by file storage subsystem 928.
The bus subsystem 912 provides a device for letting the various components and subsystems of the computer system 910 communicate with each other as intended. Although the bus subsystem 912 is shown schematically as a single bus, alternative embodiments of the bus subsystem may use multiple busses. For example, RAM-based main memory can communicate directly with file storage systems using Direct Memory Access (“DMA”) systems.
The Detailed Description refers to the accompanying figures to illustrate exemplary embodiments consistent with the disclosure. References in the disclosure to "an exemplary embodiment" indicate that the exemplary embodiment described can include a particular feature, structure, or characteristic, but every exemplary embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same exemplary embodiment. Further, any feature, structure, or characteristic described in connection with an exemplary embodiment can be included, independently or in any combination, with features, structures, or characteristics of other exemplary embodiments whether or not explicitly described.
The Detailed Description is not meant to be limiting. Rather, the scope of the disclosure is defined only in accordance with the following claims and their equivalents. It is to be appreciated that the Detailed Description section, and not the Abstract section, is intended to be used to interpret the claims. The Abstract section can set forth one or more, but not all, exemplary embodiments of the disclosure and thus is not intended to limit the disclosure and the following claims and their equivalents in any way.
The exemplary embodiments described within the disclosure have been provided for illustrative purposes and are not intended to be limiting. Other exemplary embodiments are possible, and modifications can be made to the exemplary embodiments while remaining within the spirit and scope of the disclosure. The disclosure has been described with the aid of functional building blocks illustrating the implementation of specified functions and relationships thereof. The boundaries of these functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternate boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed.
Embodiments of the disclosure can be implemented in hardware, firmware, software application, or any combination thereof. Embodiments of the disclosure can also be implemented as instructions stored on a machine-readable medium, which can be read and executed by one or more processors. A machine-readable medium can include any mechanism for storing or transmitting information in a form readable by a machine (e.g., computing circuitry). For example, a machine-readable medium can include non-transitory machine-readable media such as read only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; and others. As another example, the machine-readable medium can include transitory machine-readable media such as electrical, optical, acoustical, or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). Further, firmware, software applications, routines, and instructions can be described herein as performing certain actions. However, it should be appreciated that such descriptions are merely for convenience and that such actions in fact result from computing devices, processors, controllers, or other devices executing the firmware, software applications, routines, instructions, etc.
The Detailed Description of the exemplary embodiments has fully revealed the general nature of the disclosure such that others can, by applying knowledge of those skilled in the relevant art(s), readily modify and/or adapt such exemplary embodiments for various applications, without undue experimentation and without departing from the spirit and scope of the disclosure. Therefore, such adaptations and modifications are intended to be within the meaning and range of equivalents of the exemplary embodiments based upon the teaching and guidance presented herein. It is to be understood that the phraseology or terminology herein is for the purpose of description and not of limitation, such that the terminology or phraseology of the present specification is to be interpreted by those skilled in the relevant art(s) in light of the teachings herein.
This application claims the benefit of U.S. Provisional Application No. 63/347,828, filed Jun. 1, 2022, which is incorporated by reference in its entirety for all purposes.