The present disclosure relates to rendering extended reality (XR) environments and associated XR rendering devices, and more particularly to rendering avatars and virtual objects in immersive XR environments displayed on XR participant devices.
Immersive extended reality (XR) environments have been developed which provide a myriad of different types of user experiences for gaming, on-line meetings, co-creation of products, etc. Immersive XR environments (also referred to as “XR environments”) include virtual reality (VR) environments where human users only see computer generated graphical renderings and include augmented reality (AR) environments where users see a combination of computer generated graphical renderings overlaid on a view of the physical real-world through, e.g., see-through display screens.
Immersive XR environments, such as gaming environments and meeting environments, are often configured to display computer generated avatars which represent poses of human users in the immersive XR environments. A user may select an avatar and customize its characteristics, such as gender, clothing, hair style, etc., to represent that user for viewing by other users participating in the immersive XR environment. Example XR environment rendering devices include, without limitation, XR environment servers, XR headsets, gaming consoles, smartphones running an XR application, and tablet/laptop/desktop computers running an XR application. Oculus Quest is an example XR device and Google Glass is an example AR device.
Rendering an XR environment in real-time or near-real-time is computationally intensive and requires trade-offs between device responsiveness and the level of rendered graphical detail in order to not exceed the computational capacity (e.g., graphics processing unit (GPU) bandwidth) of the XR rendering device. Various technical approaches are directed to reducing or constraining the rendering computational requirements.
There are various approaches intended to reduce rendering load in the context of, e.g., XR gaming. One approach is known as foveated rendering, where an XR headset, XR glasses, etc. worn by an XR environment participant uses eye tracking to determine where the eyes are directed relative to an area of the XR environment displayed on a head-mounted display, and responsively controls rendering to provide higher resolution quality (e.g., image and/or video quality) for a sub-area corresponding to the user's primary eye focus while reducing the resolution quality outside the user's primary eye focus, such as within the user's peripheral vision.
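As a rough, non-limiting sketch of the foveated-rendering idea (the function name, parameter names, and falloff values below are illustrative assumptions, not part of this disclosure), per-tile resolution quality could be scaled by angular distance from the tracked gaze point:

```python
import math

def foveation_scale(tile_center, gaze_point, inner_deg=10.0, outer_deg=30.0):
    """Return a resolution scale in [0.25, 1.0] for a screen tile.

    tile_center, gaze_point: (x, y) positions in degrees of visual angle.
    Tiles within inner_deg of the gaze point render at full resolution;
    quality falls off linearly out to outer_deg, beyond which peripheral
    tiles render at quarter resolution.
    """
    dx = tile_center[0] - gaze_point[0]
    dy = tile_center[1] - gaze_point[1]
    eccentricity = math.hypot(dx, dy)
    if eccentricity <= inner_deg:
        return 1.0
    if eccentricity >= outer_deg:
        return 0.25
    t = (eccentricity - inner_deg) / (outer_deg - inner_deg)
    return 1.0 - t * 0.75

# Example: a tile 20 degrees from the gaze point renders at ~62% resolution.
print(foveation_scale((20.0, 0.0), (0.0, 0.0)))
```

The computational savings come from the periphery: only the small foveal region keeps full resolution, while most of the displayed area renders at a fraction of it.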
Another approach is referred to as fixed foveated rendering which controls rendered resolution quality based on a fixed focal orientation relative to the user's location within the XR environment.
Another approach assumes a fixed rendering depth, where virtual objects that are within a fixed distance of the user's location in the XR environment are rendered with higher resolution quality relative to other virtual objects that are located beyond the fixed distance. For example, spherical rendering (i.e., a 360-sphere) operates to provide high resolution rendering within the sphere. The larger the distance (e.g., radius) of “the high-resolution sphere” that is defined around the user, the greater the computational rendering load required from the XR rendering device. Although spherical rendering can provide a high user quality of experience for panning, image-tracking, mobility, etc., the computational requirements can exceed the capacity of a commercially available or viable XR rendering device.
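The trade-off can be made concrete with a small sketch (names and values are illustrative assumptions, using a simple doubling-based level-of-detail scheme): objects inside the high-resolution sphere render at full quality, and, assuming a roughly uniform object density, enlarging the sphere radius pulls on the order of r³ more objects into full-quality rendering.

```python
import math

def lod_for_distance(distance_m, sphere_radius_m=5.0):
    """Level of detail 0 (full quality) inside the high-resolution
    sphere; one coarser level per doubling of distance beyond it."""
    if distance_m <= sphere_radius_m:
        return 0
    return int(math.log2(distance_m / sphere_radius_m)) + 1

# Growing the sphere promotes the same object to finer detail levels,
# at a correspondingly higher rendering cost:
for radius in (5.0, 10.0, 20.0):
    print(radius, lod_for_distance(12.0, radius))  # LOD 2, 1, then 0
```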
Some embodiments disclosed herein are directed to an XR rendering device including at least one processor and at least one memory storing instructions executable by the at least one processor to perform operations. The operations render an immersive XR environment on a display device for viewing by a participant among a group of participants who have associated avatars representing the participants which are rendered in the immersive XR environment. The operations obtain rendering prioritization rules which indicate conditions for prioritizing rendering of avatars and/or virtual objects associated with the avatars. The operations prioritize rendering of particular ones of the avatars and/or particular ones of the virtual objects which have satisfied conditions indicated by the rendering prioritization rules, when rendering in the immersive XR environment the avatars representing the group of the participants and/or the virtual objects associated with the avatars.
Some other related embodiments are directed to a corresponding method by an XR rendering device. The method renders an immersive XR environment on a display device for viewing by a participant among a group of participants who have associated avatars representing the participants which are rendered in the immersive XR environment. The method obtains rendering prioritization rules which indicate conditions for prioritizing rendering of avatars and/or virtual objects associated with the avatars. The method prioritizes rendering of particular ones of the avatars and/or particular ones of the virtual objects which have satisfied conditions indicated by the rendering prioritization rules, when rendering in the immersive XR environment the avatars representing the group of the participants and/or the virtual objects associated with the avatars.
Some other related embodiments are directed to a corresponding computer program product including a non-transitory computer readable medium storing program code executable by at least one processor of an XR rendering device for performing operations. The operations render an immersive XR environment on a display device for viewing by a participant among a group of participants who have associated avatars representing the participants which are rendered in the immersive XR environment. The operations obtain rendering prioritization rules which indicate conditions for prioritizing rendering of avatars and/or virtual objects associated with the avatars. The operations also prioritize rendering of particular ones of the avatars and/or particular ones of the virtual objects which have satisfied conditions indicated by the rendering prioritization rules, when rendering in the immersive XR environment the avatars representing the group of the participants and/or the virtual objects associated with the avatars.
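By way of a non-limiting illustration of these operations, the following Python sketch (all class, function, and field names are hypothetical, not part of the disclosure) shows one way rendering prioritization rules could be represented and applied to order avatars and virtual objects for rendering:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Renderable:
    """An avatar or a virtual object associated with an avatar."""
    name: str
    kind: str                               # "avatar" or "virtual_object"
    attributes: Dict = field(default_factory=dict)

@dataclass
class PrioritizationRule:
    """A condition over a renderable plus a priority boost when satisfied."""
    condition: Callable[[Renderable], bool]
    boost: int

def prioritize(items: List[Renderable],
               rules: List[PrioritizationRule]) -> List[Renderable]:
    """Order renderables so that items satisfying rule conditions are
    rendered first (and may then receive a larger share of GPU budget)."""
    def score(item: Renderable) -> int:
        return sum(rule.boost for rule in rules if rule.condition(item))
    return sorted(items, key=score, reverse=True)

# Example: a rule prioritizing the meeting presenter's avatar.
rules = [PrioritizationRule(
    condition=lambda it: it.attributes.get("role") == "presenter", boost=10)]
scene = [Renderable("avatar-200b", "avatar", {"role": "attendee"}),
         Renderable("avatar-200a", "avatar", {"role": "presenter"})]
print([it.name for it in prioritize(scene, rules)])  # avatar-200a first
```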
Some potential advantages of various embodiments disclosed herein are that they provide means to enhance the experience of meetings or sessions in immersive XR environments by determining which avatars and/or faces, among a set of potential avatars representing individual ones of a group of participants, should be prioritized for XR rendering of a media stream provided to a participant viewing the immersive XR environment.
Other XR rendering devices, methods, and computer program products according to embodiments will be or become apparent to one with skill in the art upon review of the following drawings and detailed description. It is intended that all such additional XR rendering devices, methods, and computer program products be included within this description and protected by the accompanying claims.
Aspects of the present disclosure are illustrated by way of example and are not limited by the accompanying drawings. In the drawings:
Inventive concepts will now be described more fully hereinafter with reference to the accompanying drawings, in which examples of embodiments of inventive concepts are shown. Inventive concepts may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of various present inventive concepts to those skilled in the art. It should also be noted that these embodiments are not mutually exclusive. Components from one embodiment may be tacitly assumed to be present/used in another embodiment.
Various embodiments of the present disclosure are directed to devices and methods for dynamic avatar representation in immersive extended reality (XR) environments, which use rendering prioritization rules to determine which avatars and/or virtual objects associated with avatars should be prioritized for higher quality rendering. Higher quality rendering can include, but is not limited to, any one or more of finer-grained resolution and detail, higher video frame rate, higher audio bitrate, depth of field, or selective rendering quality for parts of avatars and/or objects associated with the avatars. Rendering prioritization may be performed based on relations between individual participants in a group of participants in the immersive XR environment.
For example, a first participant may have a defined role relative to other participants which causes prioritization of higher quality XR rendering of the avatar associated with the first participant and/or a virtual object associated with the avatar by an XR rendering device. Certain participant attributes, such as a defined role in a meeting, social relations with other participants, digital meeting context (e.g., casual, professional), physical environment and sensor attributes, geo-location, social media attributes (e.g., Tinder preferences), etc., may be used as input to control XR rendering prioritizations.
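A minimal sketch, assuming such attribute-based conditions are evaluated per pair of participants (all field names, roles, and context labels below are hypothetical, not from the disclosure):

```python
def satisfies_prioritization_rule(viewer, subject, meeting_context):
    """Example condition checks over the kinds of participant
    attributes listed above: defined role, social relation, and
    digital meeting context."""
    if subject.get("role") in {"presenter", "moderator"}:
        return True                     # defined role in the meeting
    if subject.get("id") in viewer.get("social_relations", set()):
        return True                     # social relation between participants
    if meeting_context == "professional" and subject.get("team") == viewer.get("team"):
        return True                     # context-dependent condition
    return False

viewer = {"id": "p1", "social_relations": {"p7"}, "team": "design"}
print(satisfies_prioritization_rule(viewer, {"id": "p7"}, "casual"))  # True
```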
Although the XR rendering device 100 is illustrated in
In a multi-participant XR environment scenario such as illustrated in
Some potential advantages of various embodiments disclosed herein are that they provide means to enhance the experience of meetings or sessions in immersive XR environments by determining which avatars representing participants and/or virtual objects associated with the avatars should be prioritized for XR rendering in a media stream provided to a participant viewing the immersive XR environment. Avatars and/or virtual objects which satisfy rendering prioritization rules would be assigned higher rendering priority, and the associated rendered bits are then provided for viewing before other, less important bits and/or are rendered with higher graphical detail quality.
Referring to
For example, a rendering prioritization rule can indicate that one avatar (e.g., avatar 200a) representing a first participant and/or a virtual object associated with the avatar (e.g., virtual object 204 in the FOV of avatar 200a) is to be prioritized for rendering. The XR rendering device 100 can, when rendering in the immersive XR environment a plurality of avatars (e.g., avatars 200a-g) representing the participants among a group of participants and/or a plurality of virtual objects which includes the virtual object associated with the avatars (e.g., virtual object 204 in the FOV of avatars 200a-c), operate to prioritize 404 rendering of particular ones of the avatars (e.g., avatar 200a) and/or particular ones of the virtual objects (e.g., virtual object 204 in the FOV of avatar 200a) which have satisfied conditions indicated by the rendering prioritization rule.
As was explained above, the participant devices may be any type of XR device. In some embodiments some or all of the participant devices are virtual reality (VR) headsets in which the wearers only see computer generated graphical renderings such as avatars, virtual objects, etc. In some other embodiments, some or all of the participant devices are augmented reality (AR) headsets (e.g., glasses) through which wearers see a combination of computer generated graphical renderings overlaid on a view of the physical real-world through, e.g., see-through display screens. In the case of an AR headset, the XR rendering device 100 can render graphical objects, textures, and color shading that are overlaid on the wearer's view of the physical real-world, such as a computer generated graphical rendering of clothing which is posed on a display screen to be viewed as overlaid on the real-world body of another participant and/or as a computer generated graphical rendering of a virtual object (e.g., weapon, handbag, hat, shield, scooter, etc.) which is posed on a display screen to be viewed as being carried by, connected to, or otherwise associated with the real-world body of the other participant.
Accordingly, in situations when the XR rendering device 100 becomes processing-bandwidth limited, with a resulting degraded ability to provide rendering quality, the XR rendering device 100 operates to render the graphical details of the avatar 200a with a higher priority than the other avatars 200b-g. This can result in higher resolution quality being rendered for the graphical details of avatar 200a when viewed by a participant in the XR environment, and can correspondingly result in lower resolution quality being rendered for the other avatars 200b-g and/or for virtual objects associated with the other avatars 200b-g, when viewed by the participant.
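One plausible (hypothetical, not from the disclosure) way to realize this behavior is a greedy allocation of a fixed GPU budget over a priority-ordered list of avatars, so that lower-priority avatars absorb the quality reduction:

```python
def allocate_budget(items_by_priority, total_budget, full_cost):
    """Greedy allocation: give each avatar its full rendering cost in
    priority order; once the budget runs low, remaining avatars get a
    reduced share (i.e., lower resolution quality)."""
    allocation = {}
    remaining = total_budget
    for item in items_by_priority:
        grant = min(full_cost, remaining)
        allocation[item] = grant
        remaining -= grant
    return allocation

# Three avatars needing 40 units each, but only 100 units of GPU budget:
print(allocate_budget(["avatar-200a", "avatar-200b", "avatar-200c"], 100, 40))
# {'avatar-200a': 40, 'avatar-200b': 40, 'avatar-200c': 20}
# -> the lowest-priority avatar is rendered at reduced quality.
```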
In the example context of
Similarly, other participants may define one or more rendering prioritization rules that are stored in the rendering priority data structure repository 104 with associations to corresponding identities of the other participants who request such rendering prioritization.
In some embodiments, the conditions indicated by the rendering prioritization rules are related to at least one of:
1) personal relationships defined among the participants in the immersive XR environment;
2) contexts of the immersive XR environment; and
3) participant roles in the immersive XR environment.
In some embodiments, the 1) personal relationships defined among the participants in the immersive XR environment are related to at least one of:
In some embodiments, the above 2) contexts of the immersive XR environment are related to at least one of:
In some embodiments, the above 3) participant roles in the immersive XR environment relate to at least one of:
Some embodiments are directed to determining an active participant status and applying it to the rendering prioritization rules. In some embodiments, the conditions indicated by the rendering prioritization rules are related to at least one of:
A second participant may satisfy a rendering prioritization rule as having an active participant status when it is determined that the second participant addresses a first participant specifically. For example, the rendering prioritization rule may be satisfied by the second participant having addressed the first participant by name or by sending/receiving an in-application text message. In another example, the rendering prioritization rule may be satisfied by the first and second participants being located in a same friend/family-zone of the XR environment.
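A minimal sketch of such active-status checks (the event and viewer field names are assumptions, not defined by the disclosure):

```python
def has_active_participant_status(viewer, event):
    """Example checks from the paragraph above: the second participant
    addresses the viewer by name, exchanges an in-application text
    message with the viewer, or shares a friend/family zone."""
    addressed_by_name = viewer["name"].lower() in event.get("speech_text", "").lower()
    in_app_message = event.get("message_to") == viewer["id"]
    same_zone = event.get("zone") is not None and event.get("zone") == viewer.get("zone")
    return addressed_by_name or in_app_message or same_zone

viewer = {"id": "p1", "name": "Alice", "zone": "family-zone-3"}
print(has_active_participant_status(viewer, {"speech_text": "Alice, look at this"}))  # True
```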
Some embodiments are directed to determining one participant has an avatar that is about to emerge within the FOV of the participant viewing the immersive XR environment. In some embodiments, the conditions indicated by the rendering prioritization rules are related to at least one of:
Prediction of emerging active participants may be determined from a participant's relative mobility patterns. For example, determining that the participant is “approaching,” that the participant has an “ETA estimated in x seconds,” or that the participant is “just around the corner/other side of an object.” An avatar may be prioritized for rendering based on the avatar being predicted to become “active” within view of another participant within a pre-determined (adaptive, etc.) number of seconds. In one example, “any approaching participant” may not satisfy the rendering prioritization rules; however, the rules may be satisfied when the approaching participant's avatar has an identified social relation to a participant viewing the XR environment.
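The “approaching” and “ETA in x seconds” determinations could, for example, be approximated from position and velocity. The following sketch (simplified 2D geometry; all names and the circular view-region approximation are assumptions) estimates when an avatar will enter the viewer's vicinity and gates prioritization on a social relation, per the example above:

```python
import math

def emergence_eta(avatar_pos, avatar_vel, viewer_pos, view_radius_m=10.0):
    """Seconds until the avatar enters a circle of view_radius_m around
    the viewer, or None if the avatar is not approaching."""
    dx = viewer_pos[0] - avatar_pos[0]
    dy = viewer_pos[1] - avatar_pos[1]
    dist = math.hypot(dx, dy)
    if dist <= view_radius_m:
        return 0.0  # already within the view region
    # Closing speed: velocity component directed toward the viewer.
    closing = (avatar_vel[0] * dx + avatar_vel[1] * dy) / dist
    if closing <= 0:
        return None  # moving away or tangentially
    return (dist - view_radius_m) / closing

def should_prioritize(eta, has_social_relation, threshold_s=5.0):
    """'Any approaching participant' is not enough; prioritize only a
    socially related avatar predicted to emerge within the threshold."""
    return eta is not None and eta <= threshold_s and has_social_relation

eta = emergence_eta((0.0, 0.0), (2.0, 0.0), (20.0, 0.0))
print(eta, should_prioritize(eta, has_social_relation=True))  # 5.0 True
```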
In some embodiments, one of the conditions indicated by the rendering prioritization rules is related to a facial emotion status of the avatar of one of the participants.
Rendering priority may be determined based on an offset from an expected or determined baseline of facial emotions. For example, a small facial emotion offset can be categorized as “usual,” while a larger facial emotion offset can be categorized as “someone being angry, sad, or happy in association with an action by the first user.” The rendering prioritization rules may be based on the determined categorization of the facial emotion. Avatar rendering for a larger-offset facial emotion can be prioritized, e.g., when something suddenly happens in an ongoing meeting and at least one person becomes upset, happy, or angry, rendering priority is given to that person's avatar. If many participants are determined to have a changed offset, priority may be given to positive/negative emotion changes.
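One simple way to realize this offset-from-baseline idea (the emotion scores, distance metric, and thresholds below are illustrative assumptions) is to measure the distance between current facial-emotion scores and a participant's baseline:

```python
def emotion_offset(current, baseline):
    """L1 distance between current facial-emotion scores and the
    participant's baseline, e.g. {'happy': 0.2, 'angry': 0.1, ...}."""
    keys = set(current) | set(baseline)
    return sum(abs(current.get(k, 0.0) - baseline.get(k, 0.0)) for k in keys)

def emotion_priority_boost(current, baseline, small_offset=0.2):
    """A small offset is 'usual' (no boost); a larger offset earns a
    rendering priority boost proportional to its magnitude."""
    offset = emotion_offset(current, baseline)
    if offset <= small_offset:
        return 0
    return round(offset * 10)

baseline = {"happy": 0.2, "angry": 0.1}
print(emotion_priority_boost({"happy": 0.1, "angry": 0.7}, baseline))  # 7
```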
In some embodiments, one of the conditions indicated by the rendering prioritization rules is related to rendering resource loading of a participant's device relative to a threshold capacity level.
In some embodiments, when rendering in the immersive XR environment the avatars representing the group of the participants and/or the virtual objects associated with the avatars, the operations prioritize rendering one of the virtual objects based on the rendering prioritization rules defining one of the following conditions that is determined to be satisfied:
Referring to
In further detail, each one of the participants can define rendering prioritization rules that are to be used by each one of the participants' devices to control prioritization of what is rendered. The rendering prioritization rules may be stored as an attribute of a participant's profile in the participant's device. For example, a first participant can define rendering prioritization rules which are provided 310a to the XR rendering device 100 and which request that rendering prioritization be given to an avatar associated with other participants, a virtual object associated with the other participants' avatars, and/or which provide a parametric representation and/or a digital graphic representation of the avatar and/or a virtual object for rendering. Similarly, a second participant, such as a participant associated with one of the other participant devices, can define rendering prioritization rules which are provided 310b to the XR rendering device 100 and which request that rendering prioritization be given to an avatar associated with the first participant or other participants, a virtual object associated with the first participant's avatar or other participants' avatars, and/or which provide a parametric representation and/or a digital graphic representation of the avatar(s) and/or a virtual object for rendering.
The XR rendering device 100 can control rendering of the XR environment by a rendering circuit 300 based on the rendering prioritization rules that have been defined, and may provide rendered parametric representations of the XR environment to the participant devices for display. Alternatively or additionally, the XR rendering device 100 can use the rendering prioritization rules that have been defined to provide priority commands 314a, 314b, etc., which control the rendering operations performed by the respective participant devices. For example, the XR rendering device 100 may use the rendering prioritization rules provided 310a by the first participant device to generate a message 314b that controls rendering priority by one or more other participant devices, such as by controlling: a sequential order in which the avatar representing the first participant and/or the plurality of virtual objects identified by the first participant are rendered on the display device responsive to the preference indicated by the respective rendering prioritization rules; a number of display pixels used to render the avatar representing the first participant and/or the virtual object on the display device responsive to the preference indicated by the rendering prioritization rules; and/or a color depth used to render the avatar representing the first participant and/or the virtual object on the display device responsive to the preference indicated by the rendering prioritization rules. Similarly, the XR rendering device 100 may use the rendering prioritization rules provided 310b by the second participant device to generate a message 314a that controls rendering priority by the first participant device.
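A priority command such as 314a/314b could plausibly carry the three controls named above: render order, pixel budget, and color depth. A minimal sketch follows (the message fields and encoding are hypothetical, not defined by the disclosure):

```python
import json

def make_priority_command(target_device, ordered_items, pixel_scale, color_depth_bits):
    """Build a priority command carrying the three rendering controls
    described above for a target participant device."""
    return json.dumps({
        "to": target_device,
        "render_order": ordered_items,         # sequential rendering order
        "pixel_scale": pixel_scale,            # fraction of full resolution
        "color_depth_bits": color_depth_bits,  # e.g., 24 vs. 16 bits per pixel
    })

print(make_priority_command("device-2", ["avatar-200a", "object-204"], 1.0, 24))
```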
For example, when the XR rendering device 100 determines that an avatar for the first participant has become within the FOV of an avatar for the second participant or is estimated to become within the FOV within a threshold time, the XR rendering device 100 can provide 314b the rendering prioritization rules for the first participant to the device of the second participant to control how prioritization is carried out when rendering of the first participant's avatar and/or a virtual object associated with the first participant.
The XR rendering device 100 may determine what to prioritize for rendering based on a ranking of requests indicated by the rendering prioritization rules from the participant devices, e.g., first face, next accessory, next shirt, etc. For example, if a majority of the rendering prioritization rules indicate that the highest priority for rendering is requested for the face of an avatar, followed by an accessory carried by the avatar (e.g., a shield), followed by a shirt worn by the avatar, etc., then the XR rendering device 100 can prioritize sequential rendering of the various items of each of the avatars according to the ranking.
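Such ranking could be implemented, for example, with Borda-style rank aggregation across the participant devices' requests (a sketch with hypothetical names; the disclosure does not prescribe a specific aggregation method):

```python
from collections import defaultdict

def aggregate_ranking(requests):
    """Combine per-device priority requests into one rendering order:
    an item ranked first by a device earns the most points."""
    scores = defaultdict(int)
    for ranking in requests:
        for position, item in enumerate(ranking):
            scores[item] += len(ranking) - position
    return sorted(scores, key=scores.get, reverse=True)

requests = [["face", "accessory", "shirt"],
            ["face", "shirt", "accessory"],
            ["accessory", "face", "shirt"]]
print(aggregate_ranking(requests))  # ['face', 'accessory', 'shirt']
```

Here the majority preference for faces wins the top slot, matching the example above of rendering faces first, then accessories, then shirts.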
In some embodiments, the XR rendering device uses machine learning (ML) to generate rendering prioritization rules based on learning related to, for example, which avatar is focused on most (or re-viewed most often) by participants, which type(s) of virtual objects (e.g., clothing articles, weapons, etc.) are focused on most (or re-viewed most often) by participants, etc. In the example of
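As a minimal stand-in for such ML (a frequency-based heuristic rather than a trained model; all names are hypothetical), the device could accumulate gaze-dwell time per item type and promote the most-viewed types:

```python
from collections import Counter

class FocusLearner:
    """Count how long each item type is gazed at across participants,
    and derive a learned rendering priority order from the counts."""
    def __init__(self):
        self.views = Counter()

    def observe_gaze(self, item_type, dwell_seconds):
        self.views[item_type] += dwell_seconds

    def learned_priority_order(self):
        return [item_type for item_type, _ in self.views.most_common()]

learner = FocusLearner()
learner.observe_gaze("face", 12.0)
learner.observe_gaze("weapon", 3.5)
learner.observe_gaze("face", 8.0)
print(learner.learned_priority_order())  # ['face', 'weapon']
```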
In the above description of various embodiments of present inventive concepts, it is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of present inventive concepts. Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which present inventive concepts belong. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of this specification and the relevant art, and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
When an element is referred to as being “connected”, “coupled”, “responsive”, or variants thereof to another element, it can be directly connected, coupled, or responsive to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected”, “directly coupled”, “directly responsive”, or variants thereof to another element, there are no intervening elements present. Like numbers refer to like elements throughout. Furthermore, “coupled”, “connected”, “responsive”, or variants thereof as used herein may include wirelessly coupled, connected, or responsive. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Well-known functions or constructions may not be described in detail for brevity and/or clarity. The term “and/or” includes any and all combinations of one or more of the associated listed items.
It will be understood that although the terms first, second, third, etc. may be used herein to describe various elements/operations, these elements/operations should not be limited by these terms. These terms are only used to distinguish one element/operation from another element/operation. Thus, a first element/operation in some embodiments could be termed a second element/operation in other embodiments without departing from the teachings of present inventive concepts. The same reference numerals or the same reference designators denote the same or similar elements throughout the specification.
As used herein, the terms “comprise”, “comprising”, “comprises”, “include”, “including”, “includes”, “have”, “has”, “having”, or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components, or functions but do not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions, or groups thereof. Furthermore, as used herein, the common abbreviation “e.g.”, which derives from the Latin phrase “exempli gratia,” may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item. The common abbreviation “i.e.”, which derives from the Latin phrase “id est,” may be used to specify a particular item from a more general recitation.
Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits. These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).
These computer program instructions may also be stored in a tangible computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the functions/acts specified in the block diagrams and/or flowchart block or blocks. Accordingly, embodiments of present inventive concepts may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as “circuitry,” “a module” or variants thereof.
It should also be noted that in some alternate implementations, the functions/acts noted in the blocks may occur out of the order noted in the flowcharts. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Moreover, the functionality of a given block of the flowcharts and/or block diagrams may be separated into multiple blocks and/or the functionality of two or more blocks of the flowcharts and/or block diagrams may be at least partially integrated. Finally, other blocks may be added/inserted between the blocks that are illustrated, and/or blocks/operations may be omitted without departing from the scope of inventive concepts. Moreover, although some of the diagrams include arrows on communication paths to show a primary direction of communication, it is to be understood that communication may occur in the opposite direction to the depicted arrows.
Many variations and modifications can be made to the embodiments without substantially departing from the principles of the present inventive concepts. All such variations and modifications are intended to be included herein within the scope of present inventive concepts. Accordingly, the above disclosed subject matter is to be considered illustrative, and not restrictive, and the appended examples of embodiments are intended to cover all such modifications, enhancements, and other embodiments, which fall within the spirit and scope of present inventive concepts. Thus, to the maximum extent allowed by law, the scope of present inventive concepts is to be determined by the broadest permissible interpretation of the present disclosure including the following examples of embodiments and their equivalents, and shall not be restricted or limited by the foregoing detailed description.