The present disclosure generally relates to interacting with computer-generated content.
Some devices are capable of generating and presenting graphical environments that include numerous objects. These objects may mimic real-world objects. These environments may be presented on mobile communication devices.
So that the present disclosure can be understood by those of ordinary skill in the art, a more detailed description may be had by reference to aspects of some illustrative implementations, some of which are shown in the accompanying drawings.
In accordance with common practice, the various features illustrated in the drawings may not be drawn to scale. Accordingly, the dimensions of the various features may be arbitrarily expanded or reduced for clarity. In addition, some of the drawings may not depict all of the components of a given system, method, or device. Finally, like reference numerals may be used to denote like features throughout the specification and figures.
Various implementations disclosed herein include devices, systems, and methods for associating chronology with a physical article. In some implementations, a method is performed by a device that includes a display, one or more processors, and a memory. The method may include presenting an environment comprising a representation of a physical article. An amount of time since a previous event associated with the physical article may be monitored. An indicator of the amount of time may be displayed proximate the representation of the physical article.
In accordance with some implementations, a device includes one or more processors, a non-transitory memory, and one or more programs. In some implementations, the one or more programs are stored in the non-transitory memory and are executed by the one or more processors. In some implementations, the one or more programs include instructions for performing or causing performance of any of the methods described herein. In accordance with some implementations, a non-transitory computer readable storage medium has stored therein instructions that, when executed by one or more processors of a device, cause the device to perform or cause performance of any of the methods described herein. In accordance with some implementations, a device includes one or more processors, a non-transitory memory, and means for performing or causing performance of any of the methods described herein.
Numerous details are described in order to provide a thorough understanding of the example implementations shown in the drawings. However, the drawings merely show some example aspects of the present disclosure and are therefore not to be considered limiting. Those of ordinary skill in the art will appreciate that other effective aspects and/or variants do not include all of the specific details described herein. Moreover, well-known systems, methods, components, devices, and circuits have not been described in exhaustive detail so as not to obscure more pertinent aspects of the example implementations described herein.
People may sense or interact with a physical environment or world without using an electronic device. Physical features, such as a physical object or surface, may be included within a physical environment. For instance, a physical environment may correspond to a physical city having physical buildings, roads, and vehicles. People may directly sense or interact with a physical environment through various means, such as smell, sight, taste, hearing, and touch. In contrast, an extended reality (XR) environment refers to a partially or wholly simulated environment that people may sense or interact with using an electronic device. The XR environment may include virtual reality (VR) content, mixed reality (MR) content, augmented reality (AR) content, or the like. Using an XR system, a portion of a person's physical motions, or representations thereof, may be tracked and, in response, properties of virtual objects in the XR environment may be changed in a way that complies with at least one law of nature. For example, the XR system may detect a user's head movement and adjust auditory and graphical content presented to the user in a way that simulates how sounds and views would change in a physical environment. In other examples, the XR system may detect movement of an electronic device (e.g., a laptop, tablet, mobile phone, or the like) presenting the XR environment and adjust auditory and graphical content presented to the user accordingly. In some instances, other inputs, such as a representation of physical motion (e.g., a voice command), may cause the XR system to adjust properties of graphical content.
Numerous types of electronic systems may allow a user to sense or interact with an XR environment. A non-exhaustive list of examples includes lenses having integrated display capability to be placed on a user's eyes (e.g., contact lenses), heads-up displays (HUDs), projection-based systems, head mountable systems, windows or windshields having integrated display technology, headphones/earphones, input systems with or without haptic feedback (e.g., handheld or wearable controllers), smartphones, tablets, desktop/laptop computers, and speaker arrays. Head mountable systems may include an opaque display and one or more speakers. Other head mountable systems may be configured to receive an opaque external display, such as that of a smartphone. Head mountable systems may capture images/video of the physical environment using one or more image sensors or capture audio of the physical environment using one or more microphones. Instead of an opaque display, some head mountable systems may include a transparent or translucent display. Transparent or translucent displays may direct light representative of images to a user's eyes through a medium, such as a hologram medium, optical waveguide, an optical combiner, optical reflector, other similar technologies, or combinations thereof. Various display technologies, such as liquid crystal on silicon, LEDs, uLEDs, OLEDs, laser scanning light source, digital light projection, or combinations thereof, may be used. In some examples, the transparent or translucent display may be selectively controlled to become opaque. Projection-based systems may utilize retinal projection technology that projects images onto a user's retina or may project virtual content into the physical environment, such as onto a physical surface or as a hologram.
Some devices display an extended reality (XR) environment that includes one or more objects, e.g., representations of physical articles. Representations of physical articles may include sets of pixels representing physical articles, e.g., in the case of a video passthrough. In some implementations, representations of physical articles include the physical articles themselves, e.g., as seen through a lens, as in the case of an optical passthrough.
A user may wish to track a duration of time that is associated with a physical article. For example, the user may wish to monitor a duration of time that has elapsed since a most recent interaction with the physical article. A timer application may be used to track a duration of time that is associated with a physical article. In some implementations, a timer application may be used to track multiple durations of time. However, tracking multiple durations of time may cause the user to lose track of one or more timers. When timers are run for long periods of time, e.g., days or weeks, a user may forget that a timer is running, diminishing the utility of the timer.
The present disclosure provides methods, systems, and/or devices for associating chronology with a physical article. In some implementations, a physical article is associated with a user interface element that displays an indicator of the elapsed time between interactions with the physical article or an indicator of the elapsed time since the most recent interaction with the physical article. The user interface element may be displayed when a user looks at the physical article. In some implementations, the appearance of the user interface element changes with the time scale of the elapsed time. For example, the user interface element may have one appearance when indicating time in hours and another appearance when indicating time in days.
In some implementations, an electronic device 100 presents an extended reality (XR) environment 106 to a user 20.
In some implementations, the XR environment 106 includes a virtual environment that is a simulated replacement of a physical environment. In some implementations, the XR environment 106 is synthesized by the electronic device 100. In such implementations, the XR environment 106 is different from a physical environment in which the electronic device 100 is located. In some implementations, the XR environment 106 includes an augmented environment that is a modified version of a physical environment. For example, in some implementations, the electronic device 100 modifies (e.g., augments) the physical environment in which the electronic device 100 is located to generate the XR environment 106. In some implementations, the electronic device 100 generates the XR environment 106 by simulating a replica of the physical environment in which the electronic device 100 is located. In some implementations, the electronic device 100 generates the XR environment 106 by removing and/or adding items from the simulated replica of the physical environment in which the electronic device 100 is located.
In some implementations, the XR environment 106 includes a representation 110 of a physical article. The representation 110 may include a set of pixels representing a physical article, e.g., in the case of a video passthrough. In some implementations, the representation 110 includes the physical article, e.g., as seen through a lens, as in the case of an optical passthrough.
In some implementations, the physical article is associated with a duration of time. The duration of time may be an amount of time since a previous event associated with the physical article. In some implementations, the previous event is a user interaction with the physical article. For example, if the physical article is a plant, the physical article may be associated with an elapsed time since the last time the plant was watered. As another example, if the physical article is an oven, the physical article may be associated with a cooking time, e.g., an elapsed time since the oven was turned on.
In some implementations, the electronic device 100 monitors the amount of time since the previous event associated with the physical article. For example, the electronic device 100 may include a timer that monitors the amount of time since the previous event. In some implementations, the electronic device 100 compares a first timestamp (e.g., corresponding to a current time) with a second timestamp (e.g., corresponding to a time associated with the previous event) to determine the amount of time since the previous event. In some implementations, the electronic device 100 receives an indication of the amount of time since the previous event.
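For illustration only, the timestamp comparison described above may be sketched as follows in Swift. The type and property names (e.g., ArticleTimer, lastEventDate) are assumptions introduced for this sketch and are not part of the disclosure.

```swift
import Foundation

// A minimal sketch of monitoring the amount of time since a previous event
// by comparing a first timestamp (the current time) with a second timestamp
// (the time of the previous event). Names are illustrative assumptions.
struct ArticleTimer {
    /// Timestamp of the previous event associated with the physical article.
    var lastEventDate: Date

    /// The monitored amount of time since the previous event.
    func elapsedTime(asOf now: Date = Date()) -> TimeInterval {
        now.timeIntervalSince(lastEventDate)
    }

    /// Resets the monitored amount of time, e.g., when the event recurs.
    mutating func reset(at date: Date = Date()) {
        lastEventDate = date
    }
}

// Example: a plant that was last watered two days ago.
var wateringTimer = ArticleTimer(lastEventDate: Date().addingTimeInterval(-2 * 24 * 3600))
print(wateringTimer.elapsedTime() / 3600, "hours since the last watering event")
```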
In some implementations, the electronic device 100 displays an indicator 120 of the amount of time proximate the representation 110 of the physical article. The indicator 120 may have an appearance that represents the amount of time. For example, the indicator 120 may include one or more rings that represent the amount of time.
In some implementations, the electronic device 100 determines a plurality of respective time periods that correspond to a plurality of events associated with the physical article. For example, if the physical article is a plant, the electronic device 100 may monitor multiple time intervals over which the plant is watered. In some implementations, the electronic device 100 determines one or more of a trend, an average, and/or historical information relating to the plurality of time periods.
In some implementations, the electronic device 100 includes or is attached to a head-mountable device (HMD) worn by the user 20. The HMD presents (e.g., displays) the XR environment 106 according to various implementations. In some implementations, the HMD includes an integrated display (e.g., a built-in display) that displays the XR environment 106. In some implementations, the HMD includes a head-mountable enclosure. In various implementations, the head-mountable enclosure includes an attachment region to which another device with a display can be attached. For example, in some implementations, the electronic device 100 can be attached to the head-mountable enclosure. In various implementations, the head-mountable enclosure is shaped to form a receptacle for receiving another device that includes a display (e.g., the electronic device 100). For example, in some implementations, the electronic device 100 slides/snaps into or otherwise attaches to the head-mountable enclosure. In some implementations, the display of the device attached to the head-mountable enclosure presents (e.g., displays) the XR environment 106. In various implementations, examples of the electronic device 100 include smartphones, tablets, media players, laptops, etc.
In some implementations, the electronic device 100 includes an environment renderer 210 that displays the XR environment 106, including the representation 110 of the physical article, a time monitoring subsystem 220, a user interface generator 230, and a user input subsystem 240.
In some implementations, the time monitoring subsystem 220 monitors an amount of time since a previous event associated with the physical article. For example, the time monitoring subsystem 220 may include a timer 222 that monitors the amount of time since the previous event. In some implementations, the time monitoring subsystem 220 compares a first timestamp (e.g., corresponding to a current time) with a second timestamp (e.g., corresponding to a time associated with the previous event) to determine the amount of time since the previous event. In some implementations, the time monitoring subsystem 220 receives an indication of the amount of time since the previous event.
In some implementations, the time monitoring subsystem 220 determines a plurality of respective time periods that correspond to a plurality of events associated with the physical article. For example, if the physical article is a plant, the time monitoring subsystem 220 may monitor multiple time intervals over which the plant is watered. In some implementations, the time monitoring subsystem 220 determines one or more of a trend, an average, and/or historical information relating to the plurality of time periods. For example, the time monitoring subsystem 220 may determine the average time interval between watering events for a plant and/or a log of previous watering events.
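As a hedged sketch of how such statistics might be derived, the following Swift function computes an average interval, a simple trend, and a log from recorded event timestamps. The function name and the trend heuristic are assumptions made for illustration.

```swift
import Foundation

// Derives an average interval, a coarse trend, and a chronological log from
// a set of event timestamps (e.g., watering events for a plant).
func intervalStatistics(eventDates: [Date]) -> (average: TimeInterval, increasing: Bool?, log: [Date]) {
    let log = eventDates.sorted()
    // Time intervals between consecutive events.
    let intervals = zip(log.dropFirst(), log).map { later, earlier in
        later.timeIntervalSince(earlier)
    }
    guard !intervals.isEmpty else { return (0, nil, log) }
    let average = intervals.reduce(0, +) / Double(intervals.count)
    // Illustrative trend heuristic: is the most recent interval above average?
    let increasing: Bool? = intervals.count > 1 ? intervals.last! > average : nil
    return (average, increasing, log)
}
```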
In some implementations, the user interface generator 230 synthesizes a user interface that displays an indicator of the amount of time proximate the representation of the physical article. For example, the user interface generator 230 may generate a user interface and insert the user interface into the XR environment 106 to be rendered by the environment renderer 210. In some implementations, the user interface generator 230 modifies the XR environment 106 to generate a modified XR environment that includes a representation of the user interface.
The user interface includes an indicator of the amount of time since the previous event associated with the physical article. The indicator may be displayed proximate the representation of the physical article and may have an appearance that represents the amount of time. In some implementations, as represented in FIGS. 1A-1K, the indicator includes one or more rings that represent the amount of time. For example, a ring may be displayed with a thickness that corresponds to the amount of time. In some implementations, a ring includes an arc section having an arc length that corresponds to the amount of time. In some implementations, the indicator includes multiple rings, and the number and/or spacing of the rings corresponds to the amount of time.
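One plausible realization of the arc-length mapping, sketched in Swift under the assumption that a full revolution corresponds to one unit of the current time scale (an assumption not stated in the disclosure):

```swift
import Foundation

// Maps an elapsed time to the arc angle of a ring so that the arc grows as
// time accrues; a full circle corresponds to `fullCircle` seconds.
func arcAngle(elapsed: TimeInterval, fullCircle: TimeInterval = 3600) -> Double {
    let fraction = min(elapsed / fullCircle, 1.0)  // clamp to one revolution
    return fraction * 2 * Double.pi                // arc angle in radians
}
```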
In some implementations, the user interface generator 230 changes a visual property of the indicator based on a time scale of the indicator. For example, a dimension (e.g., a thickness) of the indicator may be used to represent the time scale of the indicator, with thin rings representing a first time scale (e.g., minutes) and thick rings representing a second time scale (e.g., hours) larger than the first time scale. In some implementations, the user interface generator 230 changes a color and/or a brightness of the indicator based on the time scale of the indicator. For example, a first color or brightness level may be used to represent a first time scale. If the time scale of the indicator changes, e.g., due to the passage of time, the user interface generator 230 may change the indicator to a second color or brightness level different from the first color or brightness level to indicate a different time scale. In some implementations, multiple rings of a single color or brightness level represent multiple units of a corresponding time scale. In some implementations, a single ring includes an arc section having an arc length that represents the units of a corresponding time scale. In some implementations, the user interface generator 230 changes a color and/or a brightness of the indicator to indicate that an acceptable (e.g., desired) or unacceptable (e.g., undesired) amount of time has elapsed relative to a threshold or thresholds. For example, if the indicator is used to indicate an amount of time since a plant has been watered, the indicator may be green if the time since the last watering event is less than a first threshold. If the time since the last watering event is greater than the first threshold, the indicator may progressively change from green to red. If the time since the last watering event is greater than a second threshold, the indicator may be red.
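The threshold-driven color change may be sketched as a simple interpolation. The RGB representation and the linear blend below are illustrative assumptions, not the disclosure's implementation:

```swift
import Foundation

// Green below the first threshold, a progressive green-to-red blend between
// the thresholds, and red above the second threshold.
func indicatorColor(elapsed: TimeInterval,
                    firstThreshold: TimeInterval,
                    secondThreshold: TimeInterval) -> (r: Double, g: Double, b: Double) {
    if elapsed <= firstThreshold { return (0, 1, 0) }   // acceptable: green
    if elapsed >= secondThreshold { return (1, 0, 0) }  // unacceptable: red
    let t = (elapsed - firstThreshold) / (secondThreshold - firstThreshold)
    return (t, 1 - t, 0)                                // progressive blend
}
```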
In some implementations, the indicator includes a plurality of rings that are separated by a distance d. In some implementations, the distance d changes based on the amount of time. For example, the distance d may decrease as the amount of time increases (e.g., the rings may become closer together). In some implementations, when the amount of time breaches a threshold (e.g., an hour), the time scale of the indicator changes, and a visual property of the indicator may change as disclosed herein. For example, when the time scale of the indicator changes from minutes to hours, the rings may be displayed with a thicker appearance to indicate the change in time scale.
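The following Swift sketch combines these behaviors: the time scale (and ring thickness) changes when the one-hour threshold is breached, the ring count tracks the elapsed units, and the distance d shrinks as time accrues. The specific thresholds, thicknesses, and spacing rule are assumptions for illustration:

```swift
import Foundation

struct RingParameters {
    var unit: TimeInterval  // seconds per ring (the time scale)
    var thickness: Double   // thicker rings denote a larger time scale
    var count: Int          // one ring per elapsed unit
    var spacing: Double     // distance d between rings
}

func ringParameters(elapsed: TimeInterval) -> RingParameters {
    // Breaching the one-hour threshold changes the scale from minutes to hours.
    let (unit, thickness): (TimeInterval, Double) = elapsed < 3600 ? (60, 1.0) : (3600, 3.0)
    let count = max(1, Int(elapsed / unit))
    // The distance d decreases as the amount of time increases.
    let spacing = max(1.0, 8.0 - Double(count))
    return RingParameters(unit: unit, thickness: thickness, count: count, spacing: spacing)
}
```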
In some implementations, the user input subsystem 240 detects a user input directed to the indicator. For example, the user input subsystem 240 may obtain a gesture input 242 from an image sensor 244 (e.g., a scene-facing image sensor). In some implementations, the user input subsystem 240 obtains a gaze input 246 from a user-facing image sensor 248 (e.g., a front-facing camera or an inward-facing camera). The user-facing image sensor 248 may capture a set of one or more images of the eyes of the user and may generate image data. The image data may be used to determine a gaze vector. The user input subsystem 240 may determine, based on the gaze vector, that the gaze of the user is directed to a location (e.g., the indicator) within the field of view. In some implementations, the user input subsystem 240 obtains an audio input 252. For example, an audio sensor 254 may obtain an audio signal corresponding to a spoken command (e.g., “reset the timer”).
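A gaze-vector test of the kind described above can be sketched with basic vector math. The angular tolerance and function names below are assumptions; the disclosure does not specify how the gaze vector is compared with the indicator's location:

```swift
import Foundation
import simd

// Returns true when the gaze ray points to within a small angle of the
// indicator's position, approximating "the gaze is directed to the indicator".
func gazeIsOnIndicator(eyePosition: SIMD3<Float>,
                       gazeDirection: SIMD3<Float>,
                       indicatorPosition: SIMD3<Float>,
                       toleranceRadians: Float = 0.05) -> Bool {
    let toIndicator = simd_normalize(indicatorPosition - eyePosition)
    let gaze = simd_normalize(gazeDirection)
    // Clamp to acos's domain to guard against floating-point drift.
    let cosAngle = max(-1, min(1, simd_dot(gaze, toIndicator)))
    return acos(cosAngle) < toleranceRadians
}
```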
In some implementations, the user interface generator 230 displays a collapsed indicator in response to the user input. The collapsed indicator may display a different time scale (e.g., a less granular time scale) than the indicator to accommodate a reduced size. In some implementations, the collapsed indicator displays less information than the indicator. For example, if the indicator includes a description of the event that is being monitored, the collapsed indicator may omit the description. In some implementations, the user interface generator displays an expanded indicator in response to the user input. The expanded indicator displays a different time scale (e.g., a more granular time scale) than the indicator to take advantage of the greater available display space. In some implementations, the expanded indicator displays additional information that is not displayed in the indicator. For example, the expanded indicator may display a narrative description of the event that is being monitored.
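As an illustration of the granularity change, the sketch below formats the same elapsed time coarsely for a collapsed indicator and finely for an expanded one, using Foundation's DateComponentsFormatter. The particular unit choices are assumptions:

```swift
import Foundation

func formattedElapsedTime(_ elapsed: TimeInterval, expanded: Bool) -> String {
    let formatter = DateComponentsFormatter()
    formatter.unitsStyle = .abbreviated
    // The expanded indicator displays a more granular time scale.
    formatter.allowedUnits = expanded ? [.day, .hour, .minute] : [.day]
    return formatter.string(from: elapsed) ?? ""
}

// e.g., 26 hours renders as "1d" when collapsed and "1d 2h" when expanded.
```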
In some implementations, the user interface generator 230 displays an affordance in connection with the representation of the physical article. The affordance may be implemented as a user-interactive graphical user interface object. Examples of user-interactive graphical user interface objects include, without limitation, buttons, sliders, icons, selectable menu items, switches, hyperlinks, and/or other user interface controls.
In some implementations, the user input subsystem 240 obtains a user input directed to the affordance. For example, the user input may include the gesture input 242 obtained by the image sensor 244. In some implementations, the user input includes the gaze input 246. In some implementations, the user input includes the audio input 252. For example, the audio sensor 254 may obtain an audio signal corresponding to a spoken command (e.g., “reset the timer”). When the user input subsystem 240 obtains the user input, the time monitoring subsystem 220 may reset the monitored amount of time. In some implementations, the user interface generator 230 adjusts the indicator to display a representation of the monitored amount of time that has been reset.
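The reset flow may be sketched as follows, reusing the illustrative ArticleTimer type from the earlier sketch; updateIndicator is a hypothetical rendering hook, not an API of the disclosure:

```swift
import Foundation

// Resets the monitored amount of time in response to a user input directed
// to the affordance, then adjusts the displayed indicator.
func handleAffordanceActivation(timer: inout ArticleTimer,
                                updateIndicator: (TimeInterval) -> Void) {
    timer.reset()
    updateIndicator(timer.elapsedTime())  // now reflects the reset time
}
```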
As represented by block 310, in various implementations, the method 300 includes presenting an XR environment that includes a representation of a physical article. In some implementations, the XR environment is generated. In some implementations, the XR environment is received from another device that generated the XR environment.
The XR environment may include a virtual environment that is a simulated replacement of a physical environment. In some implementations, the XR environment is synthesized by the electronic device 100 and is different from a physical environment in which the electronic device 100 is located. In some implementations, the XR environment includes an augmented environment that is a modified version of a physical environment. For example, the physical environment may be modified (e.g., augmented) to generate the XR environment, e.g., by simulating a replica of the physical environment and/or by adding objects to or removing objects from the simulated replica of the physical environment. In some implementations, objects are added to a passthrough portion of the XR environment.
In some implementations, the XR environment includes a representation of a physical article. The representation may include a set of pixels representing the physical article, e.g., in the case of a video passthrough. In some implementations, the representation includes the physical article, e.g., as seen through a lens, as in the case of an optical passthrough.
In various implementations, as represented by block 320, the method 300 includes determining an amount of time that has passed since an occurrence of a previous event associated with the physical article. For example, a timer may monitor the amount of time since the previous event. In some implementations, the electronic device 100 compares a first timestamp (e.g., corresponding to a current time) with a second timestamp (e.g., corresponding to a time associated with the previous event) to determine the amount of time since the previous event. In some implementations, the electronic device 100 receives an indication of the amount of time since the previous event. In some implementations, as represented by block 320a, the previous event comprises a user interaction with the physical article. For example, if the physical article is an oven, the amount of time since the oven has been preheated may be monitored.
In various implementations, as represented by block 330, the method 300 includes displaying an indicator of the amount of time proximate to the representation of the physical article. The indicator may have an appearance that represents the amount of time. For example, the indicator may include one or more rings. In some implementations, the thickness of the one or more rings corresponds to the amount of time. In some implementations, a ring includes an arc section having an arc length that corresponds to the amount of time. As the amount of time increases, the size of the arc section may also increase.
In some implementations, as represented by block 330a, the method 300 includes changing a visual property of the indicator based on a time scale of the indicator. As represented by block 330b, the visual property may include a dimension of the indicator. For example, the electronic device 100 may change a thickness of the indicator to indicate that the indicator represents a larger or smaller unit of time.
As represented by block 330c, in some implementations, the visual property includes a color of the indicator. For example, a first color may correspond to a time scale of hours. A second color different from the first color may correspond to a time scale of days. Multiple rings of the first and second colors may be used to represent multiple hours and/or days, respectively.
In some implementations, as represented by block 330d, the visual property includes a brightness of the indicator. For example, a first brightness level may correspond to a time scale of days. A second brightness level different from the first brightness level may correspond to a time scale of weeks. Multiple rings of the first and second brightness levels may be used to represent multiple days and/or weeks, respectively. In some implementations, a first ring of the first brightness level has an arc section having a first arc length corresponding to a number of days, and a second ring of the second brightness level has an arc section having a second arc length corresponding to a number of weeks.
In some implementations, as represented by block 330e, the indicator includes a plurality of rings. As represented by block 330f, the rings may be separated by a distance, and the distance may change based on the amount of time. For example, the distance may decrease as the amount of time increases (e.g., the rings may become closer together). In some implementations, when the amount of time breaches a threshold (e.g., an hour), the time scale of the indicator changes, and a visual property of the indicator may change as disclosed herein. For example, when the time scale of the indicator changes from minutes to hours, the rings may be displayed with a thicker appearance to indicate the change in time scale.
As represented by block 330g, in some implementations, the method 300 includes detecting a user input directed to the indicator. For example, as represented by block 330h, the user input may include a gesture input. The gesture input may be obtained from an image sensor (e.g., a scene-facing image sensor). In some implementations, as represented by block 330i, the user input includes a gaze input, e.g., obtained from a user-facing image sensor (e.g., a front-facing camera or an inward-facing camera). The user-facing image sensor may capture a set of one or more images of the eyes of the user and may generate image data that may be used to determine a gaze vector. In some implementations, as represented by block 330j, the user input includes an audio input. For example, an audio sensor may obtain an audio signal corresponding to a spoken command (e.g., “reset the timer”).
In some implementations, as represented by block 330k, the method 300 includes displaying a collapsed indicator in response to the user input. The collapsed indicator may display a different time scale (e.g., a less granular time scale) than the indicator to accommodate a reduced size. In some implementations, the collapsed indicator displays less information than the indicator. For example, if the indicator includes a description of the event that is being monitored, the collapsed indicator may omit the description. In some implementations, as represented by block 330l, the method 300 includes displaying an expanded indicator in response to the user input. The expanded indicator displays a different time scale (e.g., a more granular time scale) than the indicator to take advantage of the greater available display space. In some implementations, the expanded indicator displays additional information that is not displayed in the indicator. For example, the expanded indicator may display a narrative description of the event that is being monitored.
In some implementations, as represented by block 330o, the method 300 includes compositing an affordance with the representation of the physical article. The affordance may be implemented as a user-interactive graphical user interface object. Examples of user-interactive graphical user interface objects include, without limitation, buttons, sliders, icons, selectable menu items, switches, hyperlinks, and/or other user interface controls. In some implementations, as represented by block 330p, the method 300 includes detecting a user input directed to the affordance and resetting the monitored amount of time in response to detecting the user input. For example, the user input may include a gesture input, a gaze input, and/or an audio input. As represented by block 330q, the method 300 may include resetting the monitored amount of time and adjusting the displayed indicator of the amount of time in response to resetting the monitored amount of time.
In some implementations, as represented by block 330r, the method 300 includes determining a plurality of respective time periods corresponding to a plurality of events associated with the physical article. For example, if the physical article is a plant, multiple watering intervals may be determined. In some implementations, a plurality of respective time periods are determined corresponding to different types of interactions with the physical article. For example, if the physical article is an oven, a first time period may be associated with a first interaction with the oven (e.g., placing a pie into the oven). A second time period may be associated with a second interaction with the oven (e.g., placing a sheet of cookies into the oven).
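One way to keep separate time periods for different interaction types is a dictionary keyed by interaction type, as in the hedged sketch below; the MultiEventTracker type and the string keys are illustrative assumptions:

```swift
import Foundation

// Tracks separate last-event timestamps for different types of interactions
// with the same physical article (e.g., an oven).
struct MultiEventTracker {
    private var lastEvents: [String: Date] = [:]

    mutating func recordEvent(ofType type: String, at date: Date = Date()) {
        lastEvents[type] = date
    }

    func elapsedTime(forType type: String, asOf now: Date = Date()) -> TimeInterval? {
        lastEvents[type].map { now.timeIntervalSince($0) }
    }
}

var oven = MultiEventTracker()
oven.recordEvent(ofType: "pie placed in oven")
oven.recordEvent(ofType: "cookies placed in oven")
```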
In some implementations, as represented by block 330s, the method 300 includes displaying a trend associated with the plurality of respective time periods. For example, the indicator may notify the user of whether a watering interval is increasing or decreasing relative to previous watering intervals. In some implementations, as represented by block 330t, the method 300 includes displaying an average of the plurality of respective time periods. As represented by block 330u, historical information relating to the plurality of respective time periods may be displayed.
In some implementations, the network interface 402 is provided to, among other uses, establish and maintain a metadata tunnel between a cloud hosted network management system and at least one private network including one or more compliant devices. In some implementations, the one or more communication buses 405 include circuitry that interconnects and controls communications between system components. The memory 404 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices, and may include non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. The memory 404 optionally includes one or more storage devices remotely located from the one or more CPUs 401. The memory 404 comprises a non-transitory computer readable storage medium.
In some implementations, the memory 404 or the non-transitory computer readable storage medium of the memory 404 stores the following programs, modules and data structures, or a subset thereof including an optional operating system 406, the environment renderer 210, the time monitoring subsystem 220, the user interface generator 230, and the user input subsystem 240. In various implementations, the device 400 performs the method 300 described herein.
In some implementations, the environment renderer 210 displays an extended reality (XR) environment that includes a representation of a physical article. In some implementations, the environment renderer 210 performs at least some of the operation(s) represented by block 310.
In some implementations, the time monitoring subsystem 220 monitors an amount of time since a previous event associated with the physical article. In some implementations, the time monitoring subsystem 220 performs the operation(s) represented by block 320.
In some implementations, the user interface generator 230 synthesizes a user interface that displays an indicator of the amount of time proximate the representation of the physical article. In some implementations, the user interface generator 230 performs the operations represented by block 330.
In some implementations, the user input subsystem 240 detects a user input directed to the indicator. To that end, the user input subsystem 240 includes instructions 240a and heuristics and metadata 240b.
In some implementations, the one or more I/O devices 410 include a user-facing image sensor. In some implementations, the one or more I/O devices 410 include one or more head position sensors that sense the position and/or motion of the head of the user. In some implementations, the one or more I/O devices 410 include a display for presenting the graphical environment (e.g., for presenting the XR environment 106). In some implementations, the one or more I/O devices 410 include a speaker for outputting an audible signal.
In various implementations, the one or more I/O devices 410 include a video pass-through display which displays at least a portion of a physical environment surrounding the device 400 as an image captured by a scene camera. In various implementations, the one or more I/O devices 410 include an optical see-through display which is at least partially transparent and passes light emitted by or reflected off the physical environment.
It will be appreciated that the components shown are intended more as a functional description of the various features that may be present in a particular implementation, as opposed to a structural schematic of the implementations described herein. As recognized by those of ordinary skill in the art, items shown separately could be combined and some items could be separated.
While various aspects of implementations within the scope of the appended claims are described above, it should be apparent that the various features of implementations described above may be embodied in a wide variety of forms and that any specific structure and/or function described above is merely illustrative. Based on the present disclosure one skilled in the art should appreciate that an aspect described herein may be implemented independently of any other aspects and that two or more of these aspects may be combined in various ways. For example, an apparatus may be implemented and/or a method may be practiced using any number of the aspects set forth herein. In addition, such an apparatus may be implemented and/or such a method may be practiced using other structure and/or functionality in addition to or other than one or more of the aspects set forth herein.
This application claims the benefit of U.S. Provisional Patent App. No. 63/226,371, filed on Jul. 28, 2021, which is incorporated by reference in its entirety.
Filing Document: PCT/US2022/038299; Filing Date: Jul. 26, 2022; Country: WO.