The present disclosure generally relates to warping a frame based on pose and warping data.
Some devices include applications that generate application frames. For example, some devices include a camera application that captures an image frame via an image sensor. These application frames may be presented on mobile communication devices.
So that the present disclosure can be understood by those of ordinary skill in the art, a more detailed description may be had by reference to aspects of some illustrative implementations, some of which are shown in the accompanying drawings.
In accordance with common practice the various features illustrated in the drawings may not be drawn to scale. Accordingly, the dimensions of the various features may be arbitrarily expanded or reduced for clarity. In addition, some of the drawings may not depict all of the components of a given system, method or device. Finally, like reference numerals may be used to denote like features throughout the specification and figures.
Various implementations disclosed herein include devices, systems, and methods for warping an application frame based on a pose of a device and intermediate warping data. In some implementations, a device includes an environmental sensor, a display, a non-transitory memory and one or more processors coupled with the environmental sensor, the display and the non-transitory memory. In some implementations, a method includes generating, at a first time, intermediate warping data for a warping operation to be performed on an application frame. In some implementations, the method includes obtaining, at a second time that occurs after the first time, via the environmental sensor, environmental data that indicates a pose of the device within a physical environment of the device. In some implementations, the method includes generating a warped application frame by warping the application frame in accordance with the pose of the device and the intermediate warping data. In some implementations, the method includes displaying the warped application frame on the display.
In accordance with some implementations, a device includes one or more processors, a non-transitory memory, and one or more programs. In some implementations, the one or more programs are stored in the non-transitory memory and are executed by the one or more processors. In some implementations, the one or more programs include instructions for performing or causing performance of any of the methods described herein. In accordance with some implementations, a non-transitory computer readable storage medium has stored therein instructions that, when executed by one or more processors of a device, cause the device to perform or cause performance of any of the methods described herein. In accordance with some implementations, a device includes one or more processors, a non-transitory memory, and means for performing or causing performance of any of the methods described herein.
Numerous details are described in order to provide a thorough understanding of the example implementations shown in the drawings. However, the drawings merely show some example aspects of the present disclosure and are therefore not to be considered limiting. Those of ordinary skill in the art will appreciate that other effective aspects and/or variants do not include all of the specific details described herein. Moreover, well-known systems, methods, components, devices and circuits have not been described in exhaustive detail so as not to obscure more pertinent aspects of the example implementations described herein.
Some devices utilize a pose of the device to perform a warping operation. For example, a device may overlay visual elements onto an application frame based on a location and/or an orientation of the device relative to physical objects in the physical environment. Generally, the device obtains pose information of the device as late as possible so that the visual elements are composited at appropriate positions within the application frame. However, sometimes the warping operation is time-intensive and results in a delay in rendering the application frame. As such, warping can sometimes increase a latency of the device and degrade a user experience of the device.
The present disclosure provides methods, systems, and/or devices for warping an application frame with reduced latency by completing the warping operation in a shorter time duration after a pose of the device is obtained. A device shortens this time duration by performing a pose-independent portion of the warping operation prior to obtaining the pose of the device and performing a pose-dependent portion of the warping operation after obtaining the pose of the device. Since the device performs the pose-independent portion of the warping operation prior to obtaining the pose, the device only has to perform the pose-dependent portion of the warping operation, and not the pose-independent portion, after obtaining the pose. As such, the device uses less time to complete the warping operation after obtaining the pose.
The pose-independent portion results in intermediate warping data that is used to perform the pose-dependent portion of the warping operation. The pose-independent portion of the warping operation does not rely on the pose of the device and can be performed before obtaining the pose of the device. By contrast, the pose-dependent portion of the warping operation relies on the pose of the device and is performed after obtaining the pose of the device. Splitting the warping operation into a pose-independent portion and a pose-dependent portion reduces an amount of time required for warping after obtaining the pose, thereby reducing a latency of the device and improving a user experience of the device. Splitting the warping operation into the pose-independent portion and the pose-dependent portion also allows the warping operation to be completed closer to a timing signal that serves as a trigger for rendering the warped application frame. Completing the warping operation closer to the timing signal results in a more accurate warp.
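By way of illustration only, the split described above can be expressed as the following sketch, in which the pose-independent portion produces intermediate warping data before the pose is available and the pose-dependent portion consumes that data together with a late-sampled pose. The helper names, the data layout, and the trivial shift used as a stand-in for the warp are assumptions for the example and do not limit the implementations described herein.

```python
import numpy as np

def prepare_intermediate_data(depth, image):
    """Pose-independent portion: runs before the pose of the device is known."""
    low_res_depth = depth[::4, ::4]                      # e.g., down-sampled depth
    avg_color = image.reshape(-1, 3).mean(axis=0)        # e.g., average color value
    return {"low_res_depth": low_res_depth, "avg_color": avg_color}

def apply_pose_dependent_warp(app_frame, pose, intermediate):
    """Pose-dependent portion: runs after the pose is obtained. A real warp would
    reproject the frame using the pose and the intermediate data; here a simple
    horizontal shift proportional to the pose's location stands in for that step."""
    shift = int(pose["location"][0] * 100)
    return np.roll(app_frame, shift, axis=1)

# Per-frame flow: intermediate data is computed early, the pose is sampled late.
depth = np.ones((480, 640), dtype=np.float32)
image = np.zeros((480, 640, 3), dtype=np.float32)
app_frame = np.zeros((480, 640, 3), dtype=np.float32)

intermediate = prepare_intermediate_data(depth, image)                      # first time T1
pose = {"location": np.array([0.02, 0.0, 0.0]), "orientation": np.eye(3)}   # second time T2
warped_frame = apply_pose_dependent_warp(app_frame, pose, intermediate)
```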
In some implementations, the electronic device 20 includes an environmental sensor 22 that captures environmental data 24 that corresponds to the operating environment 10. In some implementations, the environmental sensor 22 includes a depth sensor (e.g., a depth camera) and the environmental data 24 includes depth data 24a that is captured by the depth sensor. In some implementations, the environmental sensor 22 includes an image sensor (e.g., a camera, for example, an infrared (IR) camera or a visible light camera) and the environmental data 24 includes image data 24b that is captured by the image sensor. In some implementations, the electronic device 20 includes a tablet or a smartphone and the environmental sensor 22 includes a rear-facing camera of the tablet or the smartphone that the user points in a desired direction within the operating environment 10. In some implementations, the electronic device 20 includes a display 26.
As shown in
In various implementations, the electronic device 20 includes a compositor 40. In some implementations, the compositor 40 composites visual elements onto the application frame 32 generated by the application 30. For example, in some implementations, the compositor 40 overlays XR elements (e.g., AR elements) onto the application frame 32. In various implementations, the compositor 40 performs a warping operation on the application frame 32. Warping the application frame 32 results in a warped application frame 42 that the electronic device 20 presents on the display 26.
In various implementations, the electronic device 20 (e.g., the application 30 and/or the compositor 40) splits the warping operation into a pose-independent portion that does not rely on a pose 70 of the electronic device 20, and a pose-dependent portion that relies on the pose 70 of the electronic device 20. In various implementations, the electronic device 20 (e.g., the application 30 and/or the compositor 40) performs the pose-independent portion of the warping operation at a first time T1 that is prior to obtaining the pose 70 of the electronic device 20. In some implementations, performing the pose-independent portion of the warping operation results in intermediate warping data 50 that is used in the pose-dependent portion of the warping operation.
In some implementations, the electronic device 20 (e.g., the application 30 and/or the compositor 40) generates the intermediate warping data 50 based on the environmental data 24 captured by the environmental sensor 22. In some implementations, the intermediate warping data 50 includes low resolution depth data 52 for the operating environment 10. In some implementations, the low resolution depth data 52 is a modified version (e.g., a lower resolution version) of the depth data 24a captured by a depth sensor of the electronic device 20. In some implementations, the intermediate warping data 50 includes low precision depth data 54 for the operating environment 10. In some implementations, the low precision depth data 54 is a modified version (e.g., a lower precision version) of the depth data 24a captured by the depth sensor of the electronic device 20. In some implementations, the electronic device 20 generates the low resolution depth data 52 and/or the low precision depth data 54 by down-sampling the depth data 24a captured by the depth sensor of the electronic device 20. In some implementations, the electronic device 20 obtains the low resolution depth data 52 and/or the low precision depth data 54 by operating the depth sensor at a reduced capability (e.g., at a lower resolution, at a lower frequency and/or at a lower power-consumption setting).
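For purposes of illustration, one way to derive the low resolution depth data 52 and the low precision depth data 54 from captured depth data is by block-averaging and quantizing the depth map, as in the sketch below; the down-sampling factor, depth range, and number of quantization levels are arbitrary assumptions rather than values specified by the disclosure.

```python
import numpy as np

def downsample_depth(depth, factor=4):
    """Low resolution depth: average each factor x factor block of the depth map."""
    h, w = depth.shape
    h, w = h - h % factor, w - w % factor                 # crop to a multiple of factor
    blocks = depth[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

def quantize_depth(depth, levels=256, max_depth=10.0):
    """Low precision depth: quantize metric depth to a small number of levels."""
    clipped = np.clip(depth, 0.0, max_depth)
    return np.round(clipped / max_depth * (levels - 1)) * (max_depth / (levels - 1))

depth = np.random.uniform(0.5, 5.0, size=(480, 640)).astype(np.float32)
low_res_depth = downsample_depth(depth)        # e.g., 120 x 160 (low resolution depth data 52)
low_precision_depth = quantize_depth(depth)    # 256 discrete values (low precision depth data 54)
```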
In some implementations, the intermediate warping data 50 includes an average color value 56 of pixels in an image that is captured by an image sensor of the electronic device 20. For example, in some implementations, the image data 24b indicates respective color values of pixels and the average color value 56 is an average of the respective color values indicated by the image data 24b. In some implementations, the intermediate warping data 50 includes a quad tree representation 58 of the operating environment 10. In some implementations, the electronic device 20 generates the quad tree representation 58 based on the environmental data 24 (e.g., based on the depth data 24a and/or the image data 24b).
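The average color value 56 can be computed as a single mean over the pixels of the image data, and the quad tree representation 58 can be built by recursively subdividing a region until its depth is roughly uniform. The following sketch shows one possible construction under assumed inputs (a square, power-of-two depth map and an arbitrary uniformity threshold); the disclosure does not prescribe these particulars.

```python
import numpy as np

def average_color(image):
    """Average color value over all pixels of an H x W x 3 image."""
    return image.reshape(-1, image.shape[-1]).mean(axis=0)

def build_quad_tree(depth, x=0, y=0, size=None, threshold=0.05):
    """Quad tree over a depth map: split a square region into four children
    until the depth within the region is roughly uniform."""
    if size is None:
        size = depth.shape[0]                 # assumes a square, power-of-two map
    region = depth[y:y + size, x:x + size]
    if size == 1 or region.max() - region.min() <= threshold:
        return {"x": x, "y": y, "size": size, "depth": float(region.mean())}
    half = size // 2
    return {"x": x, "y": y, "size": size, "children": [
        build_quad_tree(depth, x, y, half, threshold),
        build_quad_tree(depth, x + half, y, half, threshold),
        build_quad_tree(depth, x, y + half, half, threshold),
        build_quad_tree(depth, x + half, y + half, half, threshold),
    ]}

image = np.random.rand(256, 256, 3)
depth = np.random.uniform(1.0, 2.0, size=(256, 256)).astype(np.float32)
avg_color = average_color(image)      # average color value 56
tree = build_quad_tree(depth)         # quad tree representation 58
```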
In some implementations, the application 30 generates the intermediate warping data 50, and the application 30 provides the intermediate warping data 50 to the compositor 40. Alternatively, in some implementations, the compositor 40 generates the intermediate warping data 50.
In various implementations, the compositor 40 obtains the pose 70 of the electronic device 20 at a second time T2 that is after the first time T1. In some implementations, the pose 70 indicates a location 72 of the electronic device 20 within the operating environment 10. For example, the pose 70 indicates the location 72 of the electronic device 20 relative to other objects in the operating environment 10. In some implementations, the pose 70 indicates an orientation 74 of the electronic device 20 within the operating environment 10. For example, the pose 70 indicates the orientation 74 of the electronic device 20 relative to other objects in the operating environment 10. In some implementations, the electronic device 20 (e.g., the compositor 40) determines the pose 70 based on the environmental data 24 captured by the environmental sensor 22. For example, in some implementations, the electronic device 20 generates the pose 70 based on the depth data 24a and/or the image data 24b.
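For illustration, the pose 70 can be packaged as a rigid transform assembled from the location 72 and the orientation 74. The sketch below assumes the orientation is available as a 3x3 rotation matrix and also shows a relative transform between two pose samples, which is the quantity a pose-dependent warp typically consumes; this representation is an assumption rather than a requirement of the implementations described herein.

```python
import numpy as np

def pose_matrix(location, orientation):
    """4x4 device-to-world transform from a 3-vector location and a 3x3 rotation."""
    T = np.eye(4)
    T[:3, :3] = orientation
    T[:3, 3] = location
    return T

rendered_pose = pose_matrix(np.array([0.0, 1.5, 0.0]), np.eye(3))   # pose used to render the frame
late_pose = pose_matrix(np.array([0.05, 1.5, 0.0]), np.eye(3))      # late-sampled pose 70
# Relative transform mapping coordinates in the rendered camera frame
# into the late-sampled camera frame.
pose_delta = np.linalg.inv(late_pose) @ rendered_pose
```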
In various implementations, the compositor 40 utilizes the intermediate warping data 50 and the pose 70 to complete the warping operation and generate the warped application frame 42. For example, in some implementations, the compositor 40 uses the intermediate warping data 50 and the pose 70 to perform the pose-dependent portion of the warping operation. Since the compositor 40 only has to perform the pose-dependent portion of the warping operation after obtaining the pose 70, and not the pose-independent portion of the warping operation, it takes the compositor 40 less time to generate the warped application frame 42 after obtaining the pose 70.
In some implementations, the compositor 40 uses the low resolution depth data 52 and the pose 70 to perform the pose-dependent portion of the warping operation and generate the warped application frame 42. In some implementations, the compositor 40 uses the low precision depth data 54 to perform the pose-dependent portion of the warping operation and generate the warped application frame 42. In some implementations, the compositor 40 uses the low precision depth data 54 and the pose 70 to perform an occlusion operation (e.g., to occlude a portion of the operating environment 10 by compositing visual elements onto a portion of the application frame 32 that corresponds to the portion of the operating environment 10 that is being occluded). In some implementations, the compositor 40 uses the low precision depth data 54 and the pose 70 to perform a point-of-view correction (POVC) operation (e.g., to change a POV of the application frame 32).
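The two uses of the low precision depth data 54 noted above, an occlusion test and a point-of-view correction, can be sketched as follows. The occlusion test composites virtual content only where it is nearer than the physical scene, and the point-of-view correction unprojects each pixel with its depth, moves it by the change in pose, and reprojects it (a simple forward mapping). The intrinsics, array shapes, and forward-mapping strategy are illustrative assumptions, not the disclosed implementation.

```python
import numpy as np

def composite_with_occlusion(app_frame, physical_depth, virtual_rgb, virtual_depth):
    """Occlusion: show virtual content only where it is nearer than the physical scene."""
    visible = virtual_depth < physical_depth            # per-pixel depth test
    out = app_frame.copy()
    out[visible] = virtual_rgb[visible]
    return out

def pov_correct(frame, depth, K, pose_delta):
    """POV correction: unproject each pixel with its depth, move it by the change
    in pose, and reproject it into the new viewpoint (simple forward mapping)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pixels = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T
    points = (np.linalg.inv(K) @ pixels) * depth.reshape(1, -1)     # 3 x N camera-space points
    points = np.vstack([points, np.ones((1, points.shape[1]))])     # homogeneous coordinates
    proj = K @ (pose_delta @ points)[:3]
    uv = np.round(proj[:2] / np.clip(proj[2], 1e-6, None)).astype(int)
    out = np.zeros_like(frame)
    ok = (uv[0] >= 0) & (uv[0] < w) & (uv[1] >= 0) & (uv[1] < h)
    out[uv[1, ok], uv[0, ok]] = frame.reshape(-1, 3)[ok]
    return out

frame = np.random.rand(480, 640, 3)
depth = np.full((480, 640), 2.0)                         # low precision physical depth
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
pose_delta = np.eye(4)
pose_delta[0, 3] = 0.05                                  # small sideways motion of the device
composited = composite_with_occlusion(frame, depth,
                                      virtual_rgb=np.ones_like(frame),
                                      virtual_depth=np.full((480, 640), 1.5))
corrected = pov_correct(frame, depth, K, pose_delta)
```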
In some implementations, the compositor 40 uses the average color value 56 and the pose 70 to perform a tone mapping operation and/or an accessibility operation. For example, the compositor 40 adjusts a color of visual elements that are being composited onto the application frame 32 based on the average color value 56 so that the color of the visual elements matches the color of the operating environment 10. In some implementations, the compositor 40 uses the quad tree representation 58 and the pose 70 to improve the warping operation (e.g., to perform a more accurate warping operation and/or to perform the warping operation in less time).
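By way of example, the tone mapping described above can blend the colors of the virtual content toward the average color value 56 of the captured environment so that composited elements better match the scene; the blend strength below is an arbitrary illustrative value.

```python
import numpy as np

def tone_map_virtual_content(virtual_rgb, environment_avg_color, strength=0.3):
    """Shift virtual content toward the environment's average color so that
    composited elements better match the captured scene."""
    virtual_avg = virtual_rgb.reshape(-1, 3).mean(axis=0)
    correction = environment_avg_color - virtual_avg
    return np.clip(virtual_rgb + strength * correction, 0.0, 1.0)

overlay = np.random.rand(64, 64, 3)                               # virtual content to composite
environment_avg_color = np.array([0.6, 0.55, 0.5])                # average color value 56
adjusted_overlay = tone_map_virtual_content(overlay, environment_avg_color)
```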
In some implementations, the application thread 210 includes various application rendering operations 220 (“application renders 220”, hereinafter for the sake of brevity). For example, as shown in
In some implementations, the compositor thread 230 includes various compositing operations 235. In the example of
In the example of
Similarly, the system 200 splits the second compositing operation 235b into second pose-independent work 240b for the second application render 220b that is performed prior to obtaining a second pose 250b and second pose-dependent work 260b for the second application render 220b that is performed after obtaining the second pose 250b. Performing the second pose-independent work 240b, obtaining the second pose 250b and performing the second pose-dependent work 260b results in a second warped application frame that is presented at a time corresponding to a second instance of the timing signal 270.
Advantageously, since the pose-independent work 240 is performed prior to obtaining the poses 250, an amount of time required to complete the warping after obtaining the poses 250 is reduced, thereby reducing a latency of the system 200 and enhancing a user experience provided by the system 200. Furthermore, since a portion of the warping operation (e.g., the pose-independent work 240) can be performed without the poses 250, the time at which the poses 250 are obtained can be delayed. Obtaining the poses 250 as late as possible results in more accurate warping because there is less time for the pose of the electronic device 20 to change before display. In other words, the poses 250 are more likely to represent actual poses of the electronic device 20 at times corresponding to the timing signal 270.
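A schematic per-frame compositor loop reflecting this timing is sketched below: the pose-independent work begins as soon as the application render is available, the pose is sampled as late as the remaining pose-dependent budget allows, and the result is handed off at the timing signal. The frame period, budget figure, and helper callables are assumptions for the sketch rather than parameters of the system 200.

```python
import time

FRAME_PERIOD_S = 1 / 60                  # timing signal period (e.g., display refresh)
POSE_DEPENDENT_BUDGET_S = 0.003          # assumed worst-case pose-dependent work time

def composite_frame(app_frame, deadline, do_pose_independent, sample_pose, do_pose_dependent):
    intermediate = do_pose_independent(app_frame)          # runs early, no pose needed
    # Wait until just enough time remains for the pose-dependent work,
    # so the pose is sampled as late as possible before the timing signal.
    sleep_s = deadline - time.monotonic() - POSE_DEPENDENT_BUDGET_S
    if sleep_s > 0:
        time.sleep(sleep_s)
    pose = sample_pose()                                   # late pose sample
    return do_pose_dependent(app_frame, pose, intermediate)

warped = composite_frame(
    app_frame="application frame",
    deadline=time.monotonic() + FRAME_PERIOD_S,
    do_pose_independent=lambda frame: {"intermediate": "warping data"},
    sample_pose=lambda: {"location": (0.0, 0.0, 0.0), "orientation": (0.0, 0.0, 0.0, 1.0)},
    do_pose_dependent=lambda frame, pose, intermediate: ("warped", frame, pose, intermediate),
)
```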
Referring to
Referring to
Referring to
Referring to
As represented by block 310, in various implementations, the method 300 includes generating, at a first time, intermediate warping data for a warping operation to be performed on an application frame. For example, as shown in
As represented by block 310a, in some implementations, the intermediate warping data is generated while the application frame is being generated and the warped application frame is generated after the application frame has been generated. For example, as shown in
As represented by block 310b, in some implementations, generating the intermediate warping data includes generating the intermediate warping data based on depth data corresponding to the physical environment. For example, as discussed in relation to
As represented by block 310c, in some implementations, generating the intermediate warping data includes generating the intermediate warping data based on color data corresponding to the physical environment. For example, as shown in
As represented by block 310d, in some implementations, the intermediate warping data is generated after the application frame has been generated and before determining the pose of the device. For example, as shown in
As represented by block 310e, in some implementations, the application frame includes an image frame that is captured via a camera application. For example, referring to
As represented by block 320, in some implementations, the method 300 includes obtaining, at a second time that occurs after the first time, via the environmental sensor, environmental data that indicates a pose of the device within a physical environment of the device. For example, as shown in
As represented by block 320a, in some implementations, the method 300 includes determining the pose of the device based on the environmental data. In some implementations, the method 300 includes determining the pose of the device based on a distance and/or an orientation of the device relative to a physical object in the physical environment.
As represented by block 320b, in some implementations, the environmental sensor includes a depth sensor and obtaining the environmental data includes obtaining depth data via the depth sensor. In such implementations, the method 300 includes determining the pose of the device based on the depth data captured by the depth sensor. For example, referring to
As represented by block 320c, in some implementations, the environmental sensor includes an image sensor and obtaining the environmental data includes obtaining image data via the image sensor. In such implementations, the method 300 includes determining the pose of the device based on the image data captured by the image sensor. For example, referring to
As represented by block 330, in some implementations, the method 300 includes generating a warped application frame by warping the application frame in accordance with the pose of the device and the intermediate warping data. For example, as shown in
As represented by block 330a, in some implementations, the intermediate warping data includes depth data of the physical environment at a particular resolution that is lower than a threshold resolution, and generating the warped application frame includes warping the application frame based on the depth data at the particular resolution. For example, as shown in
As represented by block 330b, in some implementations, the intermediate warping data includes depth data of the physical environment at a particular precision that is lower than a threshold precision, and generating the warped application frame includes performing an occlusion operation based on the depth data at the particular precision in order to occlude a physical object in the physical environment. For example, as shown in
As represented by block 330c, in some implementations, the intermediate warping data includes depth data of the physical environment at a particular precision that is lower than a threshold precision, and generating the warped application frame includes performing a point-of-view (POV) adjustment operation based on the depth data at the particular precision in order to adjust a POV of the application frame. For example, as shown in
As represented by block 330d, in some implementations, the intermediate warping data indicates an average color value of a plurality of pixels in the application frame, and generating the warped application frame includes performing a tone mapping operation by modifying a color value of virtual content that is to be composited onto the application frame based on the average color value of the plurality of pixels in the application frame. For example, as shown in
As represented by block 330e, in some implementations, the intermediate warping data includes a quad tree representation of the physical environment, and generating the warped application frame includes warping the application frame based on the quad tree representation of the physical environment. For example, as shown in
As represented by block 340, in various implementations, the method 300 includes displaying the warped application frame on the display. For example, as shown in
In some implementations, the network interface 402 is provided to, among other uses, establish and maintain a metadata tunnel between a cloud hosted network management system and at least one private network including one or more compliant devices. In some implementations, the one or more communication buses 405 include circuitry that interconnects and controls communications between system components. The memory 404 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM or other random access solid state memory devices, and may include non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. The memory 404 optionally includes one or more storage devices remotely located from the one or more CPUs 401. The memory 404 comprises a non-transitory computer readable storage medium.
In some implementations, the memory 404 or the non-transitory computer readable storage medium of the memory 404 stores the following programs, modules and data structures, or a subset thereof including an optional operating system 406, an environmental data obtainer 420, a warping data generator 430, a frame warper 440 and a frame presenter 450. In various implementations, the device 400 performs the method 300 shown in
In some implementations, the environmental data obtainer 420 includes instructions 420a, and heuristics and metadata 420b for obtaining (e.g., receiving and/or capturing) the environmental data 24 shown in
In some implementations, the warping data generator 430 includes instructions 430a, and heuristics and metadata 430b for generating the intermediate warping data 50 shown in
In some implementations, the frame warper 440 includes instructions 440a, and heuristics and metadata 440b for generating a warped application frame by warping an application frame based on the intermediate warping data and the pose of the device 400. For example, the frame warper 440 uses the intermediate warping data 50 and the pose 70 to warp the application frame 32 and generate the warped application frame 42 shown in
In some implementations, the frame presenter 450 includes instructions 450a, and heuristics and metadata 450b for presenting a warped application frame (e.g., the warped application frame 42 shown in
In some implementations, the one or more I/O devices 410 include an input device for obtaining inputs (e.g., a touchscreen for detecting user inputs). In some implementations, the one or more I/O devices 410 include an environmental sensor (e.g., the environmental sensor 22 shown in
In various implementations, the one or more I/O devices 410 include a video pass-through display which displays at least a portion of a physical environment surrounding the device 400 as an image captured by a camera. In various implementations, the one or more I/O devices 410 include an optical see-through display which is at least partially transparent and passes light emitted by or reflected off the physical environment.
It will be appreciated that
While various aspects of implementations within the scope of the appended claims are described above, it should be apparent that the various features of implementations described above may be embodied in a wide variety of forms and that any specific structure and/or function described above is merely illustrative. Based on the present disclosure one skilled in the art should appreciate that an aspect described herein may be implemented independently of any other aspects and that two or more of these aspects may be combined in various ways. For example, an apparatus may be implemented and/or a method may be practiced using any number of the aspects set forth herein. In addition, such an apparatus may be implemented and/or such a method may be practiced using other structure and/or functionality in addition to or other than one or more of the aspects set forth herein.
This application claims the benefit of U.S. Provisional Patent App. No. 63/247,938, filed on Sep. 24, 2021, which is incorporated by reference in its entirety.
Filing Document: PCT/US2022/042897; Filing Date: 9/8/2022; Country: WO.
Provisional Application: No. 63/247,938; Date: Sep. 2021; Country: US.