This relates generally to systems and methods of selectively controlling display of a user interface at an electronic device based on a context of the electronic device within a computer-generated environment.
Some electronic devices are used to track an activity, such as an exercise activity, of a user. In some examples, while the user is participating in the activity, the electronic device presents a user interface corresponding to the activity. There is a need for systems and methods for selectively controlling and/or updating display of the user interface corresponding to the activity based on changes in context of the electronic device in the physical environment.
Some examples of the disclosure are directed to systems and methods for displaying one or more user interfaces based on a context of an electronic device within a physical environment. In some examples, a method is performed at an electronic device in communication with a display, one or more input devices, and one or more cameras. In some examples, the electronic device detects, via the one or more input devices, initiation of an exercise activity associated with a user of the electronic device, optionally while a computer-generated environment is presented at the electronic device. In some examples, in response to detecting the initiation of the exercise activity, the electronic device activates an exercise tracking mode of operation. In some examples, while the exercise tracking mode of operation is active, the electronic device captures, via the one or more cameras, one or more images of a physical environment. In some examples, in accordance with detecting, in the one or more images, a feature of the physical environment, the electronic device performs a first operation associated with the exercise tracking mode of operation and directed toward a computer-generated environment presented at the electronic device.
In some examples, activating the exercise tracking mode of operation includes displaying, via the display, one or more representations of one or more fitness metrics that are associated with the exercise activity and that are tracked by the electronic device during the exercise activity. In some examples, the physical environment captured in the one or more images is the physical environment in which the user performs the exercise activity. In some examples, detecting the feature of the physical environment includes detecting a physical object that is associated with a portion of the user. In some such examples, performing the first operation includes displaying one or more visual indications of information associated with the physical object in the computer-generated environment. In some examples, detecting the feature of the physical environment includes detecting a physical object in the physical environment that corresponds to a pause of the exercise activity. In some such examples, performing the first operation includes temporarily ceasing updating display of the one or more representations of the one or more fitness metrics associated with the exercise activity (e.g., for a duration of the pause of the exercise activity).
In some examples, the one or more fitness metrics associated with the exercise activity include an intensity metric that measures and/or qualifies an intensity of the exercise activity, such as a relative difficulty of the exercise activity for the user. In some examples, detecting the feature of the physical environment includes detecting terrain or other surface material and/or a change in elevation or grade of the physical environment that corresponds to a change in the intensity metric because the feature of the physical environment will cause the relative difficulty of the exercise activity to change. In some such examples, performing the first operation includes updating the intensity metric and thus updating display of a representation of the intensity metric in the computer-generated environment.
The full descriptions of these examples are provided in the Drawings and the Detailed Description, and it is understood that this Summary does not limit the scope of the disclosure in any way.
For improved understanding of the various examples described herein, reference should be made to the Detailed Description below along with the following drawings. Like reference numerals often refer to corresponding parts throughout the drawings.
Some examples of the disclosure are directed to systems and methods for displaying one or more user interfaces based on a context of an electronic device within a physical environment. In some examples, a method is performed at an electronic device in communication with a display, one or more input devices, and one or more cameras. In some examples, the electronic device detects, via the one or more input devices, initiation of an exercise activity associated with a user of the electronic device, optionally while a computer-generated environment is presented at the electronic device. In some examples, in response to detecting the initiation of the exercise activity, the electronic device activates an exercise tracking mode of operation. In some examples, while the exercise tracking mode of operation is active, the electronic device captures, via the one or more cameras, one or more images of a physical environment. In some examples, in accordance with detecting, in the one or more images, a feature of the physical environment, the electronic device performs a first operation associated with the exercise tracking mode of operation and directed toward a computer-generated environment presented at the electronic device.
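By way of illustration only, the flow described above may be modeled as a simple tracking loop, as in the following sketch; all names, types, and interfaces in the sketch are hypothetical placeholders and do not limit the disclosure:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, Optional

@dataclass
class Frame:
    """A single captured camera frame (placeholder for real image data)."""
    pixels: bytes = b""

@dataclass
class ExerciseTracker:
    """Hypothetical exercise tracking mode of operation."""
    active: bool = False
    # Maps a detected feature label (e.g., "red_traffic_light") to the
    # operation performed toward the computer-generated environment.
    operations: Dict[str, Callable[[], None]] = field(default_factory=dict)

    def activate(self) -> None:
        # Called in response to detecting initiation of an exercise activity.
        self.active = True

    def detect_feature(self, frame: Frame) -> Optional[str]:
        # Placeholder for computer-vision feature detection on the frame.
        return None

    def step(self, frame: Frame) -> None:
        # While the mode is active, inspect each captured frame and, if a
        # known feature of the physical environment is detected, perform
        # the associated operation.
        if not self.active:
            return
        feature = self.detect_feature(frame)
        if feature in self.operations:
            self.operations[feature]()
```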
In some examples, activating the exercise tracking mode of operation includes displaying, via the display, one or more representations of one or more fitness metrics that are associated with the exercise activity and that are tracked by the electronic device during the exercise activity. In some examples, the physical environment captured in the one or more images is the physical environment in which the user performs the exercise activity. In some examples, detecting the feature of the physical environment includes detecting a physical object that is associated with a portion of the user. In some such examples, performing the first operation includes displaying one or more visual indications of information associated with the physical object in the computer-generated environment. In some examples, detecting the feature of the physical environment includes detecting a physical object in the physical environment that corresponds to a pause of the exercise activity. In some such examples, performing the first operation includes temporarily ceasing updating display of the one or more representations of the one or more fitness metrics associated with the exercise activity (e.g., for a duration of the pause of the exercise activity).
In some examples, the one or more fitness metrics associated with the exercise activity include an intensity metric that measures and/or qualifies an intensity of the exercise activity, such as a relative difficulty of the exercise activity for the user. In some examples, detecting the feature of the physical environment includes detecting terrain or other surface material and/or a change in elevation or grade of the physical environment that corresponds to a change in the intensity metric because the feature of the physical environment will cause the relative difficulty of the exercise activity to change. In some such examples, performing the first operation includes updating the intensity metric and thus updating display of a representation of the intensity metric in the computer-generated environment.
In some examples, as shown in
In some examples, display 120 has a field of view visible to the user (e.g., that may or may not correspond to a field of view of external image sensors 114b and 114c). Because display 120 is optionally part of a head-mounted device, the field of view of display 120 is optionally the same as or similar to the field of view of the user's eyes. In other examples, the field of view of display 120 may be smaller than the field of view of the user's eyes. In some examples, electronic device 101 may be an optical see-through device in which display 120 is a transparent or translucent display through which portions of the physical environment may be directly viewed. In some examples, display 120 may be included within a transparent lens and may overlap all or only a portion of the transparent lens. In other examples, electronic device 101 may be a video-passthrough device in which display 120 is an opaque display configured to display images of the physical environment in which the electronic device 101 is located. While a single display 120 is shown, it should be appreciated that display 120 may include a stereo pair of displays.
In some examples, in response to a trigger, the electronic device 101 may be configured to display a virtual object 104 in the XR environment represented by a cube illustrated in
It should be understood that virtual object 104 is a representative virtual object and one or more different virtual objects (e.g., of various dimensionality, such as two-dimensional or other three-dimensional virtual objects) can be included and rendered in a three-dimensional XR environment. For example, the virtual object can represent an application or a user interface displayed in the XR environment. In some examples, the virtual object can represent content corresponding to the application and/or displayed via the user interface in the XR environment. In some examples, the virtual object 104 is optionally configured to be interactive and responsive to user input (e.g., air gestures, such as air pinch gestures, air tap gestures, and/or air touch gestures), such that a user may virtually touch, tap, move, rotate, or otherwise interact with the virtual object 104.
In some examples, displaying an object in a three-dimensional environment may include interaction with one or more user interface objects in the three-dimensional environment. For example, initiation of display of the object in the three-dimensional environment can include interaction with one or more virtual options/affordances displayed in the three-dimensional environment. In some examples, a user's gaze may be tracked by the electronic device as an input for identifying one or more virtual options/affordances targeted for selection when initiating display of an object in the three-dimensional environment. For example, gaze can be used to identify one or more virtual options/affordances targeted for selection using another selection input. In some examples, a virtual option/affordance may be selected using hand-tracking input detected via an input device in communication with the electronic device. In some examples, objects displayed in the three-dimensional environment may be moved and/or reoriented in the three-dimensional environment in accordance with movement input detected via the input device.
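By way of illustration only, the following sketch (with hypothetical names and placeholder coordinates) shows how gaze may target a virtual option/affordance while a separate hand input, such as an air pinch, commits the selection:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Affordance:
    """A selectable virtual option at a (placeholder) 2D location."""
    name: str
    x: float
    y: float

def gaze_target(affordances: List[Affordance], gaze_x: float, gaze_y: float,
                radius: float = 0.05) -> Optional[Affordance]:
    """Return the affordance nearest the gaze point within a tolerance
    radius; gaze alone targets an affordance but does not select it."""
    best, best_d2 = None, radius * radius
    for a in affordances:
        d2 = (a.x - gaze_x) ** 2 + (a.y - gaze_y) ** 2
        if d2 <= best_d2:
            best, best_d2 = a, d2
    return best

def handle_input(affordances: List[Affordance], gaze_x: float, gaze_y: float,
                 pinch_detected: bool) -> Optional[Affordance]:
    """Commit selection of the gazed-at affordance only when a separate
    selection input (e.g., an air pinch) is detected."""
    target = gaze_target(affordances, gaze_x, gaze_y)
    return target if pinch_detected else None
```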
In the discussion that follows, an electronic device that is in communication with a display generation component and one or more input devices is described. It should be understood that the electronic device optionally is in communication with one or more other physical user-interface devices, such as a touch-sensitive surface, a physical keyboard, a mouse, a joystick, a hand tracking device, an eye tracking device, a stylus, etc. Further, as described above, it should be understood that the described electronic device, display and touch-sensitive surface are optionally distributed amongst two or more devices. Therefore, as used in this disclosure, information displayed on the electronic device or by the electronic device is optionally used to describe information outputted by the electronic device for display on a separate display device (touch-sensitive or not). Similarly, as used in this disclosure, input received on the electronic device (e.g., touch input received on a touch-sensitive surface of the electronic device, or touch input received on the surface of a stylus) is optionally used to describe input received on a separate input device, from which the electronic device receives input information.
The device typically supports a variety of applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disk authoring application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a workout support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, a television channel browsing application, and/or a digital video player application.
As illustrated in
Communication circuitry 222 optionally includes circuitry for communicating with electronic devices, networks, such as the Internet, intranets, a wired network and/or a wireless network, cellular networks, and wireless local area networks (LANs). Communication circuitry 222 optionally includes circuitry for communicating using near-field communication (NFC) and/or short-range communication, such as Bluetooth®.
Processor(s) 218 include one or more general processors, one or more graphics processors, and/or one or more digital signal processors. In some examples, memory 220 is a non-transitory computer-readable storage medium (e.g., flash memory, random access memory, or other volatile or non-volatile memory or storage) that stores computer-readable instructions configured to be executed by processor(s) 218 to perform the techniques, processes, and/or methods described below. In some examples, memory 220 can include more than one non-transitory computer-readable storage medium. A non-transitory computer-readable storage medium can be any medium (e.g., excluding a signal) that can tangibly contain or store computer-executable instructions for use by or in connection with the instruction execution system, apparatus, or device. In some examples, the storage medium is a transitory computer-readable storage medium. In some examples, the storage medium is a non-transitory computer-readable storage medium. The non-transitory computer-readable storage medium can include, but is not limited to, magnetic, optical, and/or semiconductor storages. Examples of such storage include magnetic disks, optical discs based on compact disc (CD), digital versatile disc (DVD), or Blu-ray technologies, as well as persistent solid-state memory such as flash, solid-state drives, and the like.
In some examples, display generation component(s) 214 include a single display (e.g., a liquid-crystal display (LCD), organic light-emitting diode (OLED), or other types of display). In some examples, display generation component(s) 214 includes multiple displays. In some examples, display generation component(s) 214 can include a display with touch capability (e.g., a touch screen), a projector, a holographic projector, a retinal projector, a transparent or translucent display, etc. In some examples, electronic device 201 includes touch-sensitive surface(s) 209 for receiving user inputs, such as tap inputs and swipe inputs or other gestures. In some examples, display generation component(s) 214 and touch-sensitive surface(s) 209 form touch-sensitive display(s) (e.g., a touch screen integrated with electronic device 201 or external to electronic device 201 that is in communication with electronic device 201).
Electronic device 201 optionally includes image sensor(s) 206. Image sensor(s) 206 optionally include one or more visible light image sensors, such as charge-coupled device (CCD) sensors, and/or complementary metal-oxide-semiconductor (CMOS) sensors operable to obtain images of physical objects from the real-world environment. Image sensor(s) 206 also optionally include one or more infrared (IR) sensors, such as a passive or an active IR sensor, for detecting infrared light from the real-world environment. For example, an active IR sensor includes an IR emitter for emitting infrared light into the real-world environment. Image sensor(s) 206 also optionally include one or more cameras configured to capture movement of physical objects in the real-world environment. Image sensor(s) 206 also optionally include one or more depth sensors configured to detect the distance of physical objects from electronic device 201. In some examples, information from one or more depth sensors can allow the device to identify and differentiate objects in the real-world environment from other objects in the real-world environment. In some examples, one or more depth sensors can allow the device to determine the texture and/or topography of objects in the real-world environment.
In some examples, electronic device 201 uses CCD sensors, event cameras, and depth sensors in combination to detect the physical environment around electronic device 201. In some examples, image sensor(s) 206 include a first image sensor and a second image sensor. The first image sensor and the second image sensor work in tandem and are optionally configured to capture different information of physical objects in the real-world environment. In some examples, the first image sensor is a visible light image sensor and the second image sensor is a depth sensor. In some examples, electronic device 201 uses image sensor(s) 206 to detect the position and orientation of electronic device 201 and/or display generation component(s) 214 in the real-world environment. For example, electronic device 201 uses image sensor(s) 206 to track the position and orientation of display generation component(s) 214 relative to one or more fixed objects in the real-world environment.
In some examples, electronic device 201 includes microphone(s) 213 or other audio sensors. Electronic device 201 optionally uses microphone(s) 213 to detect sound from the user and/or the real-world environment of the user. In some examples, microphone(s) 213 includes an array of microphones (a plurality of microphones) that optionally operate in tandem, such as to identify ambient noise or to locate the source of a sound in the space of the real-world environment.
Electronic device 201 includes location sensor(s) 204 for detecting a location of electronic device 201 and/or display generation component(s) 214. For example, location sensor(s) 204 can include a global positioning system (GPS) receiver that receives data from one or more satellites and allows electronic device 201 to determine the device's absolute position in the physical world.
Electronic device 201 includes orientation sensor(s) 210 for detecting orientation and/or movement of electronic device 201 and/or display generation component(s) 214. For example, electronic device 201 uses orientation sensor(s) 210 to track changes in the position and/or orientation of electronic device 201 and/or display generation component(s) 214, such as with respect to physical objects in the real-world environment. Orientation sensor(s) 210 optionally include one or more gyroscopes and/or one or more accelerometers.
Electronic device 201 includes hand tracking sensor(s) 202 and/or eye tracking sensor(s) 212 (and/or other body tracking sensor(s), such as leg, torso and/or head tracking sensor(s)), in some examples. Hand tracking sensor(s) 202 are configured to track the position/location of one or more portions of the user's hands, and/or motions of one or more portions of the user's hands with respect to the extended reality environment, relative to the display generation component(s) 214, and/or relative to another defined coordinate system. Eye tracking sensor(s) 212 are configured to track the position and movement of a user's gaze (eyes, face, or head, more generally) with respect to the real-world or extended reality environment and/or relative to the display generation component(s) 214. In some examples, hand tracking sensor(s) 202 and/or eye tracking sensor(s) 212 are implemented together with the display generation component(s) 214. In some examples, the hand tracking sensor(s) 202 and/or eye tracking sensor(s) 212 are implemented separate from the display generation component(s) 214.
In some examples, the hand tracking sensor(s) 202 (and/or other body tracking sensor(s), such as leg, torso and/or head tracking sensor(s)) can use image sensor(s) 206 (e.g., one or more IR cameras, 3D cameras, depth cameras, etc.) that capture three-dimensional information from the real-world including one or more body parts (e.g., hands, legs, or torso of a human user). In some examples, the hands can be resolved with sufficient resolution to distinguish fingers and their respective positions. In some examples, one or more image sensors 206 are positioned relative to the user to define a field of view of the image sensor(s) 206 and an interaction space in which finger/hand position, orientation and/or movement captured by the image sensors are used as inputs (e.g., to distinguish from a user's resting hand or other hands of other persons in the real-world environment). Tracking the fingers/hands for input (e.g., gestures, touch, tap, etc.) can be advantageous in that it does not require the user to touch, hold or wear any sort of beacon, sensor, or other marker.
In some examples, eye tracking sensor(s) 212 includes at least one eye tracking camera (e.g., infrared (IR) cameras) and/or illumination sources (e.g., IR light sources, such as LEDs) that emit light towards a user's eyes. The eye tracking cameras may be pointed towards a user's eyes to receive reflected IR light from the light sources directly or indirectly from the eyes. In some examples, both eyes are tracked separately by respective eye tracking cameras and illumination sources, and a focus/gaze can be determined from tracking both eyes. In some examples, one eye (e.g., a dominant eye) is tracked by one or more respective eye tracking cameras/illumination sources.
Electronic device 201 is not limited to the components and configuration of
Attention is now directed towards examples of selective control and/or display of one or more user interfaces associated with a detected user activity within a computer-generated environment at an electronic device based on changes in context of the electronic device within a physical environment.
As shown in
In
In some examples, as shown in
In some examples, as shown in
In some examples, the electronic device 301 updates the one or more fitness metrics associated with the exercise activity in accordance with a progression of the exercise activity. For example, in
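By way of illustration only, the one or more fitness metrics may be represented as a simple record that is advanced as the exercise activity progresses; the field names, units, and calorie factor below are assumptions for illustration and not a disclosed model:

```python
from dataclasses import dataclass

@dataclass
class FitnessMetrics:
    elapsed_s: float = 0.0      # duration of the exercise activity
    distance_m: float = 0.0     # distance travelled
    calories_kcal: float = 0.0  # estimated energy burned

def update_metrics(m: FitnessMetrics, dt_s: float, speed_mps: float,
                   kcal_per_m: float = 0.06) -> None:
    """Advance the metrics by one tracking interval; the calorie factor is
    a placeholder constant, not a disclosed model."""
    m.elapsed_s += dt_s
    m.distance_m += speed_mps * dt_s
    m.calories_kcal += speed_mps * dt_s * kcal_per_m
```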
In some examples, the electronic device 301 updates display of the three-dimensional environment 350 based on detected changes in a context of the electronic device 301. As used herein, the context of the electronic device 301 may include a location of the electronic device 301 within the physical environment (e.g., the city environment 340), an orientation of the electronic device 301 within the physical environment, and/or a relationship between the electronic device 301 and features of (e.g., one or more real-world objects in, a terrain of, and/or a surface elevation of) the physical environment. In some examples, detecting a feature of the physical environment includes visually detecting one or more real-world objects in the field of view of the three-dimensional environment 350 and/or changes in appearance of the one or more real-world objects in the field of view of the three-dimensional environment 350, as discussed below. As discussed herein, the electronic device 301 optionally visually detects a feature of the physical environment based on images captured by one or more cameras of the electronic device 301 (e.g., captured while the exercise tracking mode of operation is active and/or while the exercise tracking mode of operation is not active).
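By way of illustration only, the context of the electronic device 301 enumerated above may be grouped into a single record, as in the following sketch with hypothetical fields:

```python
from dataclasses import dataclass, field
from typing import Set, Tuple

@dataclass
class DeviceContext:
    """Hypothetical snapshot of the device's context in the physical environment."""
    location: Tuple[float, float] = (0.0, 0.0)  # e.g., position in the environment
    heading_deg: float = 0.0                    # orientation of the device
    detected_features: Set[str] = field(default_factory=set)  # e.g., {"crosswalk"}
    terrain: str = "road"                       # surface material underfoot
    grade_pct: float = 0.0                      # surface grade/elevation change
```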
In
In some examples, when the electronic device 301 visually detects the first shoe 308-1 and/or the second shoe 308-2 of the user 306, the electronic device 301 initiates a process to track (e.g., record) a life of the first shoe and/or the second shoe (which are optionally the same). For example, visually detecting the first shoe 308-1 and/or the second shoe 308-2 includes recognizing, via one or more processors of the electronic device 301 (e.g., using computer vision, optical recognition, and/or other techniques), the first shoe 308-1 and the second shoe 308-2 as being associated with (e.g., belonging to and/or having previously been worn by) the user 306. In some examples, the electronic device 301 automatically tracks the shoe life of a given pair of shoes associated with the user each time the user 306 is wearing the pair of shoes and the shoes are visible in the field of view of the electronic device 301 (e.g., enabling the electronic device 301 to visually detect the pair of shoes as similarly discussed above). In some examples, tracking the shoe life of the first shoe 308-1 and/or the second shoe 308-2 includes tracking/recording a distance travelled (e.g., walked, run, etc.) in the first shoe 308-1 and/or the second shoe 308-2 (e.g., by attributing the distance 305-4 at the end of the activity to the detected shoe 308). In some examples, the electronic device 301 tracks the shoe life of the first shoe 308-1 and/or the second shoe 308-2 in the manner discussed above irrespective of whether the exercise tracking mode of operation is active and/or irrespective of whether the user 306 is participating in an exercise activity.
In some examples, as shown in
In some examples, in accordance with a determination that the first shoe 308-1 and/or the second shoe 308-2 is a new pair of shoes (e.g., a newly purchased pair of shoes (e.g., purchased via a web-browsing application or an online marketplace application on the electronic device 301), a pair of shoes worn by the user for the first time (e.g., visually detected by the electronic device 301 for the first time), etc.), the electronic device 301 initiates a new shoe life tracking for the first shoe 308-1 and/or the second shoe 308-2. In some examples, the electronic device 301 provides a prompt to the user 306 inquiring whether to enable shoe life tracking according to the processes discussed above (e.g., the electronic device 301 displays a notification or other user interface element in the three-dimensional environment 350 that includes a selectable option for enabling shoe life tracking of the new pair of shoes and a selectable option for rejecting shoe life tracking of the new pair of shoes). In some examples, the electronic device 301 automatically enables shoe life tracking of the new pair of shoes (e.g., and displays a visual confirmation that the shoe life of the new pair of shoes is being tracked).
In some examples, as alluded to above, the electronic device 301 updates the shoe life of the first shoe 308-1 and/or the second shoe 308-2 in accordance with a determination that a progression of the exercise activity causes the shoe life of the first shoe 308-1 and/or the second shoe 308-2 to change. For example, in
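By way of illustration only, shoe life tracking of the kind described above may attribute the distance travelled during an activity to the recognized shoe, as sketched below; the identifiers and the assumed total shoe life (commonly cited guidelines for running shoes are on the order of several hundred kilometers) are illustrative only:

```python
from dataclasses import dataclass

@dataclass
class ShoeRecord:
    """Hypothetical per-shoe record for shoe life tracking."""
    shoe_id: str
    lifetime_m: float = 650_000.0  # assumed total shoe life, in meters
    used_m: float = 0.0

    @property
    def remaining_fraction(self) -> float:
        """Fraction of shoe life remaining, clamped at zero."""
        return max(0.0, 1.0 - self.used_m / self.lifetime_m)

def attribute_activity(shoe: ShoeRecord, activity_distance_m: float) -> None:
    """Attribute the distance travelled during an activity to the visually
    recognized shoe, reducing its remaining shoe life."""
    shoe.used_m += activity_distance_m
```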
In some examples, detecting a change in the context of the electronic device 301 includes detecting a change in a visual appearance of one or more real-world objects in the city environment 340 that is visible in the three-dimensional environment 350. As shown in
In some examples, in response to visually detecting the illumination of the red light 342-1 of the traffic light 341 in the city environment 340, the electronic device 301 pauses the exercise tracking mode of operation, such that the electronic device 301 forgoes updating the one or more fitness metrics associated with the exercise activity. For example, as shown in
Additionally, in some examples, the electronic device 301 detects a pause in the movement of the electronic device 301. For example, in
In
In some examples, the response to the change in context discussed above (e.g., the change in the visual appearance of the traffic light 341) may similarly be applied to other changes in context that cause the user to be unable to progress. For example, the electronic device 301 pauses the tracking of the exercise activity as similarly discussed above in response to visually detecting a physical object or obstruction in the user's field of view that causes the motion of the user to cease, such as one or more persons, vehicles, road work, animals, etc. On the other hand, should the user intentionally cease motion, which causes the movement of the electronic device 301 to cease as well, the electronic device 301 optionally does not pause the tracking of the exercise activity in the manner discussed above. For example, if the user stops running due to exhaustion or fatigue or to interact with the electronic device 301 (or another electronic device, such as a smartphone or smartwatch), the electronic device 301 continues to track the running activity and update the fitness metrics associated with the running activity because such user action does not correspond to a change in the context of the electronic device 301 that causes the user to be unable to progress the running activity.
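By way of illustration only, the pause decision described above may be reduced to a predicate that pauses metric updates only when cessation of movement coincides with a visually detected external obstruction; the feature labels below are hypothetical:

```python
from typing import Set

# Hypothetical labels for visually detected features that prevent progression.
OBSTRUCTIONS = {"red_traffic_light", "stop_sign", "vehicle", "crowd", "road_work"}

def should_pause_tracking(device_moving: bool, detected_features: Set[str]) -> bool:
    """Pause updating the fitness metrics only when the device has ceased
    moving AND the stop is attributable to a detected external obstruction;
    an intentional stop (e.g., resting) keeps tracking active."""
    return (not device_moving) and bool(OBSTRUCTIONS & detected_features)

# A red light is detected and the user has stopped: pause metric updates.
assert should_pause_tracking(False, {"red_traffic_light"})
# The user stops to rest with no obstruction detected: keep tracking.
assert not should_pause_tracking(False, set())
```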
In some examples, tracking one or more fitness metrics associated with the exercise activity includes tracking an intensity metric associated with the exercise activity. For example, as shown in
In some examples, as shown in
In some examples, as shown in
In some examples, when the electronic device 301 determines that the difficulty of the exercise activity is increased as discussed above, the electronic device 301 updates the intensity metric associated with the exercise activity. For example, the electronic device 301 updates the second user interface 330 in the three-dimensional environment 350 such that the representation 331-3 of the third intensity level is visually emphasized, which signifies that the user of the electronic device 301 is running at the third intensity level.
Similarly, the electronic device 301 optionally updates the intensity metric in accordance with a determination that the exercise activity has decreased in difficulty. For example, as shown in
In some examples, as shown in
In some examples, as mentioned previously above, the electronic device 301 determines the intensity metric associated with the exercise activity at least in part based on a terrain, surface material, and/or grade of the physical environment in which the exercise activity is being performed. In
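By way of illustration only, an intensity metric that accounts for pace, heart rate, terrain or surface material, and grade may be sketched as follows; the weighting and cut points are assumptions for illustration and not a disclosed model:

```python
def intensity_level(pace_mps: float, heart_rate_bpm: float,
                    terrain_factor: float, grade_pct: float) -> int:
    """Map current effort to a discrete intensity level (1-5). Softer
    surfaces (terrain_factor > 1, e.g., sand) and uphill grades raise the
    effort estimate; the weights and cut points are illustrative only."""
    effort = pace_mps * terrain_factor * (1.0 + max(grade_pct, 0.0) / 100.0)
    effort += (heart_rate_bpm - 60.0) / 60.0  # crude heart-rate contribution
    cut_points = [2.0, 3.0, 4.0, 5.0]         # hypothetical thresholds
    return 1 + sum(effort > t for t in cut_points)

# Running on sand (higher terrain factor) raises the level relative to road.
print(intensity_level(3.0, 150.0, terrain_factor=1.3, grade_pct=0.0))  # -> 5
print(intensity_level(3.0, 150.0, terrain_factor=1.0, grade_pct=0.0))  # -> 4
```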
In some examples, displaying the second user interface 330 corresponding to the intensity metric is in accordance with an intensity competition between the user of the electronic device 301 and a second user, different from the user, of a second electronic device, different from the electronic device 301. For example, as shown in
As previously described herein, the electronic device 301 determines the intensity metric based on a relative difficulty of the exercise activity (e.g., determined based on environmental factors and/or user health and/or fitness profiles). As shown in
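By way of illustration only, an intensity competition of the kind described above may order the competitors' representations by their current intensity levels, as in the following sketch with hypothetical names and levels:

```python
from typing import List, Tuple

def competition_order(entries: List[Tuple[str, int]]) -> List[Tuple[str, int]]:
    """Order competitors by current intensity level, highest first, e.g., to
    decide whose representation is displayed more prominently."""
    return sorted(entries, key=lambda e: e[1], reverse=True)

# The user is running at level 3 and the second user at level 2, so the
# user's representation is ordered first.
print(competition_order([("user", 3), ("second_user", 2)]))
```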
It is understood that the examples shown and described herein are merely exemplary and that additional and/or alternative elements may be provided within the three-dimensional environment relating to the display and tracking of user exercise activities. It should be understood that the appearance, shape, form and size of each of the various user interface elements and objects shown and described herein are exemplary and that alternative appearances, shapes, forms and/or sizes may be provided. For example, the virtual objects representative of user interfaces (e.g., user interfaces 315, 330, and 335) may be provided in an alternative shape than a rectangular shape, such as a circular shape, triangular shape, etc. Additionally or alternatively, in some examples, the various user interface elements described herein may be selected and/or manipulated via user input received via one or more separate input devices in communication with the electronic device(s). For example, where applicable, selection input (e.g., for initiating tracking of the exercise activity) may be received via physical input devices, such as a mouse, trackpad, keyboard, etc. in communication with the electronic device(s).
In some examples, at 404, in response to detecting the initiation of the exercise activity, the electronic device activates an exercise tracking mode of operation. For example, as shown in
In some examples, at 410, in accordance with detecting, in the one or more images, a feature of the physical environment, the electronic device performs a first operation associated with the exercise tracking mode of operation and directed toward a computer-generated environment presented at the electronic device. For example, as described with reference to
It is understood that process 400 is an example and that more, fewer, or different operations can be performed in the same or in a different order. Additionally, the operations in process 400 described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as general-purpose processors (e.g., as described with respect to
Therefore, according to the above, some examples of the disclosure are directed to a method, comprising at an electronic device in communication with a display, one or more input devices, and one or more cameras: detecting, via the one or more input devices, initiation of an exercise activity associated with a user of the electronic device; in response to detecting the initiation of the exercise activity, activating an exercise tracking mode of operation; and while the exercise tracking mode of operation is active, capturing, via the one or more cameras, one or more images of a physical environment, and in accordance with detecting, in the one or more images, a feature of the physical environment, performing a first operation associated with the exercise tracking mode of operation and directed toward a computer-generated environment presented at the electronic device.
Additionally or alternatively, in some examples, the method further comprises, while the exercise tracking mode of operation is active, displaying, via the display, one or more indications associated with one or more fitness metrics corresponding to the exercise activity in the computer-generated environment. Additionally or alternatively, in some examples, the method further comprises: while the exercise tracking mode of operation is active and while the one or more indications are displayed in the computer-generated environment, detecting a progression in the exercise activity; and in response to detecting the progression in the exercise activity, updating display, via the display, of the one or more indications based on the progression in the exercise activity. Additionally or alternatively, in some examples, detecting the feature of the physical environment includes detecting a physical object in the physical environment, wherein the physical object includes at least one of a stop sign, a traffic signal, and a vehicle. Additionally or alternatively, in some examples, detecting the feature of the physical environment includes detecting, via the one or more input devices or the one or more cameras, a pause of the exercise activity while the physical object is detected in the physical environment. Additionally or alternatively, in some examples, performing the first operation includes forgoing updating display of the one or more indications. Additionally or alternatively, in some examples, the method further comprises: after performing the first operation, detecting, via the one or more input devices, a progression in the exercise activity that includes movement of the electronic device; and in response to detecting the progression in the exercise activity, updating display of the one or more indications based on the progression in the exercise activity.
Additionally or alternatively, in some examples, detecting the feature of the physical environment includes detecting an object that is associated with a portion of the user. Additionally or alternatively, in some examples, the object corresponds to a shoe worn on a foot of the user, and performing the first operation includes displaying an indication of a shoe life associated with the shoe in the computer-generated environment. Additionally or alternatively, in some examples, the object corresponds to a shoe worn on a foot of the user, and performing the first operation includes displaying an indication of a shoe type of the shoe. Additionally or alternatively, in some examples, the method further comprises: while the exercise tracking mode of operation is active, detecting, via the one or more input devices, a conclusion of the exercise activity; and in response to detecting the conclusion of the exercise activity, deactivating the exercise tracking mode of operation. Additionally or alternatively, in some examples, the method further comprises: after deactivating the exercise tracking mode of operation, visually detecting, via the one or more cameras, a shoe worn on a foot of the user; and in response to visually detecting the shoe, displaying, via the display, an indication of information associated with the shoe in the computer-generated environment. Additionally or alternatively, in some examples, the method further comprises, while the exercise tracking mode of operation is active, displaying, via the display, one or more indications associated with an intensity metric corresponding to the exercise activity in the computer-generated environment, wherein the intensity metric measures a relative difficulty of the exercise activity for the user based on the physical environment.
Additionally or alternatively, in some examples, detecting the feature of the physical environment includes detecting terrain in the physical environment that produces a change in the intensity metric. Additionally or alternatively, in some examples, detecting the feature of the physical environment includes detecting a change in elevation or grade of a surface of the physical environment that produces a change in the intensity metric. Additionally or alternatively, in some examples, performing the first operation includes updating the intensity metric corresponding to the exercise activity, including updating display of the one or more indications associated with the intensity metric in the computer-generated environment. Additionally or alternatively, in some examples, updating the intensity metric corresponding to the exercise activity includes, in accordance with a determination that the feature of the physical environment corresponds to a decrease in the intensity metric because the feature of the physical environment will cause the relative difficulty of the exercise activity to decrease below a threshold difficulty for the user, displaying, via the display, a user interface object prompting the user to adjust one or more characteristics of the exercise activity for increasing the relative difficulty of the exercise activity in the computer-generated environment. Additionally or alternatively, in some examples, the exercise activity includes a running activity. Additionally or alternatively, in some examples, the exercise activity includes a cycling activity. Additionally or alternatively, in some examples, the electronic device includes a head-mounted display.
Some examples of the disclosure are directed to an electronic device, comprising: one or more processors; memory; and one or more programs stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for performing any of the above methods.
Some examples of the disclosure are directed to a non-transitory computer-readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by one or more processors of an electronic device, cause the electronic device to perform any of the above methods.
Some examples of the disclosure are directed to an electronic device, comprising one or more processors, memory, and means for performing any of the above methods.
Some examples of the disclosure are directed to an information processing apparatus for use in an electronic device, the information processing apparatus comprising means for performing any of the above methods.
The foregoing description, for purpose of explanation, has been described with reference to specific examples. However, the illustrative discussions above are not intended to be exhaustive or to limit the disclosure to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The examples were chosen and described in order to best explain the principles of the disclosure and its practical applications, to thereby enable others skilled in the art to best use the disclosure and various described examples with various modifications as are suited to the particular use contemplated.
This application claims the benefit of U.S. Provisional Application No. 63/583,567, filed Sep. 18, 2023, the content of which is herein incorporated by reference in its entirety for all purposes.
Number | Date | Country
--- | --- | ---
63/583,567 | Sep. 18, 2023 | US