Not Applicable
Not Applicable
INCORPORATION-BY-REFERENCE OF MATERIAL SUBMITTED ON A COMPACT DISC
Not Applicable
The invention disclosed herein provides systems and methods for simplifying virtual reality (VR), augmented reality (AR), or virtual augmented reality (VAR) based communication and collaboration through a streamlined user interface framework that enables both synchronous and asynchronous interactions in immersive environments.
VR, AR, and VAR systems (hereinafter referred to, collectively or individually, as VAR) viewed in spherical coordinates or other three-dimensional or immersive environments require complex and heavyweight files for all stakeholders who wish to collaborate in these environments. There is a need to simplify VAR environments for synchronous and asynchronous interaction and communication.
Generally, as used herein, a publisher may publish a VAR environment in an immersive environment for a participant to view and/or annotate at a later time or asynchronously. A user may view the annotated VAR environment in an immersive environment. A publisher, participant, third party, or combination thereof may be a user.
According to one embodiment, a participant's movement throughout a VAR immersive environment is recorded or tracked. According to one embodiment, movement means the progression of a participant's focus point (FP) from a starting point (SP) through more than one FP in a VAR immersive environment. According to one embodiment, a participant's FP is determined by the participant's head position and/or eye gaze. According to one embodiment, the participant annotates his movement through a VAR immersive environment.
According to one embodiment, there exists more than one participant. According to one embodiment, there exists more than one user. According to one embodiment, the participant's movement in the VAR immersive environment is traced for a user with a visible reticle.
According to one embodiment, the reticles may have different colors, shapes, icons, etc. According to one embodiment, more than one user may synchronously or asynchronously view the annotated immersive environment.
According to one embodiment, a published and/or annotated VAR immersive environment may be viewed on a mobile computing device such as a smartphone or tablet. According to one embodiment, the participant may view the immersive environment using any attachable binocular optical system, such as Google Cardboard or another similar device. According to one embodiment, a publisher, participant, or user may interact with an annotated or unannotated VAR immersive environment via a touch-sensitive screen or other touch-sensitive device.
Other features and advantages of the present invention will become apparent in the following detailed description of the preferred embodiment with reference to the accompanying drawings, of which:
In the following detailed description, reference is made to the accompanying drawings, which form a part hereof. In the drawings, the use of similar or the same symbols in different drawings typically indicates similar or identical items, unless context dictates otherwise.
The illustrative embodiments described in the detailed description, drawings, and claims are not meant to be limiting. Other embodiments may be utilized, and other changes may be made, without departing from the spirit or scope of the subject matter presented here.
One skilled in the art will recognize that the herein described components (e.g., operations), devices, objects, and the discussion accompanying them are used as examples for the sake of conceptual clarity and that various configuration modifications are contemplated. Consequently, as used herein, the specific exemplars set forth and the accompanying discussion are intended to be representative of their more general classes. In general, use of any specific exemplar is intended to be representative of its class, and the non-inclusion of specific components (e.g., operations), devices, and objects should not be taken as limiting.
The present application uses formal outline headings for clarity of presentation. However, it is to be understood that the outline headings are for presentation purposes, and that different types of subject matter may be discussed throughout the application (e.g., device(s)/structure(s) may be described under process(es)/operations heading(s) and/or process(es)/operations may be discussed under structure(s)/process(es) headings; and/or descriptions of single topics may span two or more topic headings). Hence, the use of the formal outline headings is not intended to be in any way limiting. Given by way of overview, illustrative embodiments include systems and methods for simplifying VAR based communication and collaboration through a streamlined user interface framework that enables both synchronous and asynchronous interactions in immersive environments.
Referring to
According to one embodiment, a participant's movement throughout a VAR immersive environment is recorded or tracked. Movement throughout a VAR immersive environment means tracking or recording a participant's focus point (FP) from a starting point (SP) through more than one FP in the VAR immersive environment. According to one embodiment, a participant's FP (30) is determined by head position and/or eye gaze. According to one embodiment, a participant annotates his movement throughout a VAR immersive environment. (5)
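Purely by way of non-limiting illustration, the following TypeScript sketch shows one way such a movement trace might be recorded as a sequence of focus points. The names FocusPoint and MovementRecorder, and the choice of yaw/pitch angles sampled from head position or eye gaze, are assumptions of this sketch rather than part of the disclosure.

```typescript
/** A single focus point (FP) sampled from head position and/or eye gaze. */
interface FocusPoint {
  timestampMs: number; // when the sample was taken
  yaw: number;         // horizontal head/gaze angle, radians
  pitch: number;       // vertical head/gaze angle, radians
}

/** Records a participant's movement from a starting point (SP)
 *  through more than one focus point (FP). */
class MovementRecorder {
  private path: FocusPoint[] = [];

  /** The first recorded sample serves as the starting point (SP). */
  get startingPoint(): FocusPoint | undefined {
    return this.path[0];
  }

  record(yaw: number, pitch: number): void {
    this.path.push({ timestampMs: Date.now(), yaw, pitch });
  }

  /** The full SP-to-FP trace, e.g. for replay as a visible reticle. */
  getPath(): readonly FocusPoint[] {
    return this.path;
  }
}
```

For example, a VAR platform might call record(yaw, pitch) on every head-tracking update and later replay getPath() as a visible reticle trace for a user.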
According to one embodiment, annotation is voice annotation from a SP (20) through more than one FP (30). According to another embodiment, annotation is movement throughout the VAR environment. In another embodiment, annotation is movement throughout the VAR environment coordinated with voice annotation through the same space. (5) According to one embodiment, the participant's annotation is marked with a unique identifier or UID. (6)
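A minimal sketch of such a UID-marked annotation follows, assuming Node's built-in randomUUID for identifier generation and a URL-addressable voice track; the Annotation shape and its field names are hypothetical.

```typescript
import { randomUUID } from "node:crypto";

// FocusPoint as in the earlier sketch (simplified here).
type FocusPoint = { timestampMs: number; yaw: number; pitch: number };

/** Hypothetical UID-marked annotation: a movement trace optionally
 *  coordinated with a voice track recorded through the same space. */
interface Annotation {
  uid: string;            // unique identifier (UID) for this annotation
  participantId: string;
  path: FocusPoint[];     // movement throughout the VAR environment
  voiceTrackUrl?: string; // optional time-aligned voice annotation
}

function createAnnotation(
  participantId: string,
  path: FocusPoint[],
  voiceTrackUrl?: string
): Annotation {
  return { uid: randomUUID(), participantId, path, voiceTrackUrl };
}
```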
According to one embodiment, a user may view an annotated immersive environment. (8) According to one embodiment, a user receives notice that a participant has annotated an immersive environment. (7) The user may then review the annotated immersive environment. (8)
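One simple way to deliver such notices is a publish/subscribe pattern; the sketch below is a hypothetical in-memory notifier, not a prescribed implementation.

```typescript
// Simplified annotation reference; see the earlier sketch.
type Annotation = { uid: string; participantId: string };

type AnnotationListener = (annotation: Annotation) => void;

/** Hypothetical notifier: tells subscribed users that a participant
 *  has annotated an immersive environment so they may review it. */
class AnnotationNotifier {
  private listeners: AnnotationListener[] = [];

  subscribe(listener: AnnotationListener): void {
    this.listeners.push(listener);
  }

  notify(annotation: Annotation): void {
    for (const listener of this.listeners) {
      listener(annotation);
    }
  }
}
```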
According to one embodiment, the participant is more than one participant. (2) According to one embodiment, more than one participant may view the VAR immersive environment asynchronously on a VAR platform. (2) According to one embodiment, more than one participant may annotate the VAR immersive environment asynchronously. (5) According to one embodiment, more than one participant may view the VAR immersive environment synchronously (2) but may annotate the environment asynchronously. (5) According to one embodiment, each annotated immersive environment is marked with a UID. (6)
According to one embodiment, the user is more than one user. According to one embodiment, more than one user may synchronously view one annotated immersive environment on a VAR platform. (8) According to one embodiment, at least one user may join or leave a synchronous viewing group. (12) According to one embodiment, at least one user may view at least one UID annotated VAR immersive environment on a VAR platform. (8)
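Such a synchronous viewing group with join/leave semantics could be kept as simply as a set of user identifiers, as in this hypothetical sketch (ViewingGroup and its members are illustrative names only).

```typescript
/** Hypothetical synchronous viewing group for one UID-identified
 *  annotated immersive environment; users may join or leave at will. */
class ViewingGroup {
  private members = new Set<string>();

  constructor(readonly annotationUid: string) {}

  join(userId: string): void {
    this.members.add(userId);
  }

  leave(userId: string): void {
    this.members.delete(userId);
  }

  get memberCount(): number {
    return this.members.size;
  }
}
```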
Referring to
Referring to
According to one embodiment, a VAR immersive environment is viewed on a touch-sensitive device (50). A touch-sensitive device (50) is a device that responds to the touch of, for example, a finger by transmitting the coordinates of the touched point to a computer. The touch-sensitive area may be the screen itself, in which case it is called a touch screen. Alternatively, it may be integral with the keyboard or a separate unit that can be placed on a desk; movement of the finger across a touchpad causes the cursor to move around the screen.
According to one embodiment, the user may view the VAR immersive environment on a mobile computing device (50), such as a smartphone or tablet, which has a touch screen. (2) According to one embodiment, the user may view the VAR immersive environment using any attachable binocular optical system, such as Google Cardboard or another similar device.
According to one embodiment, the user may select an action that affects a VAR immersive environment by touching a portion of the screen that is outside (51) the VAR immersive environment. According to one embodiment, the actions are located on the corners of the touch screen (51). This allows the user to select an action ambidextrously. According to one embodiment, the user may select an action by manipulating a touch pad. An action may include: choosing one from 1, 2, 3, 4; choosing to publish, view, or annotate; choosing to teleport; choosing to view a point of interest; choosing to view one of several annotations; choosing to enter or leave a VAR platform when synchronously viewing an annotated immersive VAR environment; among others.
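The corner placement can be implemented with a simple hit test on raw touch coordinates, as in the hypothetical sketch below; the 96-pixel corner size is an assumed value, not taken from the disclosure.

```typescript
type Corner = "topLeft" | "topRight" | "bottomLeft" | "bottomRight";

/** Hypothetical hit test: actions occupy the four corners of the
 *  touch screen, outside the rendered VAR environment, so that either
 *  hand can reach them. Returns null for touches inside the scene. */
function cornerAt(
  x: number, y: number,          // touch coordinates, pixels
  width: number, height: number, // screen size, pixels
  cornerSize = 96                // assumed tappable corner region
): Corner | null {
  const left = x < cornerSize;
  const right = x > width - cornerSize;
  const top = y < cornerSize;
  const bottom = y > height - cornerSize;
  if (top && left) return "topLeft";
  if (top && right) return "topRight";
  if (bottom && left) return "bottomLeft";
  if (bottom && right) return "bottomRight";
  return null;
}
```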
Referring to
Referring to
According to one embodiment, the content professional may ask at least one user to vote (11) from the recommendations of more than one stakeholder, where a vote is given after viewing each annotated VAR immersive environment. (5) According to one embodiment, each vote may be graphically presented. (14) According to one embodiment, the user may choose a hot spot (53), a touch screen (51), or a combination thereof to vote.
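Tallying such votes per annotated environment is straightforward; the sketch below assumes one vote per user, keyed by annotation UID, and is illustrative only.

```typescript
/** Hypothetical vote tally: each user casts one vote, after viewing,
 *  for an annotated VAR environment identified by its UID. */
class VoteTally {
  private votes = new Map<string, string>(); // userId -> annotation UID

  cast(userId: string, annotationUid: string): void {
    this.votes.set(userId, annotationUid); // re-voting overwrites
  }

  /** Per-UID counts, e.g. for a graphical (bar-chart) presentation. */
  counts(): Map<string, number> {
    const result = new Map<string, number>();
    for (const uid of this.votes.values()) {
      result.set(uid, (result.get(uid) ?? 0) + 1);
    }
    return result;
  }
}
```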
According to one embodiment, the more than one stakeholder may synchronously view at least one annotated VAR environment on a VAR platform. (8) According to one embodiment, the more than one stakeholder may choose one out of a plurality of annotated VAR environments to view. (8) According to one embodiment, the more than one stakeholder may choose more than one annotated VAR environment to view simultaneously. (8) According to one embodiment, at least one of the more than one stakeholder may join or leave a synchronous viewing group. (12) According to one embodiment, at least one published VAR immersive environment, annotated immersive environment, vote, graphical representation, or a combination thereof may be stored or processed on a server or cloud. (15) One skilled in the art will appreciate that more than one server or cloud may be utilized.
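A server- or cloud-side store for published environments, annotations, votes, and graphical representations could be as simple as a UID-keyed map, as in this hypothetical in-memory sketch; a production system would presumably substitute a database or cloud object store.

```typescript
/** Hypothetical UID-keyed store for published or annotated VAR
 *  environments, votes, or graphical representations. */
class EnvironmentStore<T extends { uid: string }> {
  private byUid = new Map<string, T>();

  save(item: T): void {
    this.byUid.set(item.uid, item);
  }

  load(uid: string): T | undefined {
    return this.byUid.get(uid);
  }

  /** All stored items, e.g. for a stakeholder choosing among several
   *  annotated environments to view. */
  list(): T[] {
    return Array.from(this.byUid.values());
  }
}
```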
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.), or an embodiment combining software and hardware aspects. Further aspects of this invention may take the form of a computer program embodied in one or more computer-readable media having computer-readable program code/instructions thereon. Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing. The computer code may be executed entirely on a user's computer; partly on the user's computer as a standalone software package or cloud service; partly on the user's computer and partly on a remote computer; or entirely on a remote computer or a remote or cloud-based server.