This relates generally to systems and methods for grouping user interfaces in a computer-generated environment.
Computer-generated environments are environments where at least some objects displayed for a user's viewing are generated using a computer. Users may interact with a computer-generated environment, such as by instantiating user interfaces of applications and displaying the user interfaces in the computer-generated environment.
Some embodiments described in this disclosure are directed to methods of grouping user interfaces in a three-dimensional environment into containers. Some embodiments described in this disclosure are directed to methods of adding user interfaces to containers, moving user interfaces within containers, and removing user interfaces from containers. These interactions provide a more efficient and intuitive user experience. The full descriptions of the embodiments are provided in the Drawings and the Detailed Description, and it is understood that this Summary does not limit the scope of the disclosure in any way.
For a better understanding of the various described embodiments, reference should be made to the Detailed Description below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.
In the following description of embodiments, reference is made to the accompanying drawings which form a part hereof, and in which it is shown by way of illustration specific embodiments that are optionally practiced. It is to be understood that other embodiments are optionally used and structural changes are optionally made without departing from the scope of the disclosed embodiments.
A person can interact with and/or sense a physical environment or physical world without the aid of an electronic device. A physical environment can include physical features, such as a physical object or surface. An example of a physical environment is a physical forest that includes physical plants and animals. A person can directly sense and/or interact with a physical environment through various means, such as hearing, sight, taste, touch, and smell. In contrast, a person can use an electronic device to interact with and/or sense an extended reality (XR) environment that is wholly or partially simulated. The XR environment can include mixed reality (MR) content, augmented reality (AR) content, virtual reality (VR) content, and/or the like. An XR environment is often referred to herein as a computer-generated environment. With an XR system, some of a person's physical motions, or representations thereof, can be tracked and, in response, characteristics of virtual objects simulated in the XR environment can be adjusted in a manner that complies with at least one law of physics. For instance, the XR system can detect the movement of a user's head and adjust graphical content and auditory content presented to the user similar to how such views and sounds would change in a physical environment. In another example, the XR system can detect movement of an electronic device that presents the XR environment (e.g., a mobile phone, tablet, laptop, or the like) and adjust graphical content and auditory content presented to the user similar to how such views and sounds would change in a physical environment. In some situations, the XR system can adjust characteristic(s) of graphical content in response to other inputs, such as a representation of a physical motion (e.g., a vocal command).
Many different types of electronic systems can enable a user to interact with and/or sense an XR environment. A non-exclusive list of examples includes heads-up displays (HUDs), head mountable systems, projection-based systems, windows or vehicle windshields having integrated display capability, displays formed as lenses to be placed on users' eyes (e.g., contact lenses), headphones/earphones, input systems with or without haptic feedback (e.g., wearable or handheld controllers), speaker arrays, smartphones, tablets, and desktop/laptop computers. A head mountable system can have one or more speaker(s) and an opaque display. Other head mountable systems can be configured to accept an external opaque display (e.g., a smartphone). The head mountable system can include one or more image sensors to capture images/video of the physical environment and/or one or more microphones to capture audio of the physical environment. A head mountable system may have a transparent or translucent display, rather than an opaque display. The transparent or translucent display can have a medium through which light is directed to a user's eyes. The display may utilize various display technologies, such as μLEDs, OLEDs, LEDs, liquid crystal on silicon, laser scanning light source, digital light projection, or combinations thereof. An optical waveguide, an optical reflector, a hologram medium, an optical combiner, combinations thereof, or other similar technologies can be used for the medium. In some implementations, the transparent or translucent display can be selectively controlled to become opaque. Projection-based systems can utilize retinal projection technology that projects images onto users' retinas. Projection systems can also project virtual objects into the physical environment (e.g., as a hologram or onto a physical surface).
In some embodiments, device 200 is a mobile device, such as a mobile phone (e.g., smart phone or other portable communication device), a tablet computer, a laptop computer, a desktop computer, a television, a wearable device, a head-mounted display, an auxiliary device in communication with another device, etc. In some embodiments, device 200, as illustrated in
Communication circuitry 202 optionally includes circuitry for communicating with electronic devices, networks, such as the Internet, intranets, a wired network and/or a wireless network, cellular networks and wireless local area networks (LANs). Communication circuitry 202 optionally includes circuitry for communicating using near-field communication (NFC) and/or short-range communication, such as Bluetooth®.
Processor(s) 204 optionally include one or more general processors, one or more graphics processors, and/or one or more digital signal processors. In some embodiments, memory 206 is a non-transitory computer-readable storage medium (e.g., flash memory, random access memory, or other volatile or non-volatile memory or storage) that stores one or more programs including instructions or computer-readable instructions configured to be executed by processor(s) 204 to perform the techniques, processes, and/or methods described below. In some embodiments, memory 206 can include more than one non-transitory computer-readable storage medium. A non-transitory computer-readable storage medium can be any medium (e.g., excluding a signal) that can tangibly contain or store computer-executable instructions for use by or in connection with the instruction execution system, apparatus, or device. In some embodiments, the storage medium is a transitory computer-readable storage medium. In some embodiments, the storage medium is a non-transitory computer-readable storage medium. The non-transitory computer-readable storage medium can include, but is not limited to, magnetic, optical, and/or semiconductor storages. Examples of such storage include magnetic disks, optical discs based on CD, DVD, or Blu-ray technologies, as well as persistent solid-state memory such as flash, solid-state drives, and the like.
In some embodiments, display generation component(s) 224 include a single display (e.g., a liquid-crystal display (LCD), organic light-emitting diode (OLED), or other types of display). In some embodiments, display generation component(s) 224 includes multiple displays. In some embodiments, display generation component(s) 224 can include a display with touch capability (e.g., a touch screen), a projector, a holographic projector, a retinal projector, etc. In some embodiments, device 200 includes touch-sensitive surface(s) 220 for receiving user inputs, such as tap inputs and swipe inputs or other gestures. In some embodiments, display generation component(s) 224 and touch-sensitive surface(s) 220 form touch-sensitive display(s) (e.g., a touch screen integrated with device 200 or external to device 200 that is in communication with device 200).
Image sensor(s) 210 optionally include one or more visible light image sensors, such as charge-coupled device (CCD) sensors, and/or complementary metal-oxide-semiconductor (CMOS) sensors operable to obtain images of physical objects from the real-world environment. Image sensor(s) 210 also optionally include one or more infrared (IR) sensors, such as a passive or an active IR sensor, for detecting infrared light from the real-world environment. For example, an active IR sensor includes an IR emitter for emitting infrared light into the real-world environment. Image sensor(s) 210 also optionally include one or more cameras configured to capture movement of physical objects in the real-world environment. Image sensor(s) 210 also optionally include one or more depth sensors configured to detect the distance of physical objects from device 200. In some embodiments, information from one or more depth sensors can allow the device to identify and differentiate objects in the real-world environment from other objects in the real-world environment. In some embodiments, one or more depth sensors can allow the device to determine the texture and/or topography of objects in the real-world environment. In some embodiments, device 200 uses CCD sensors, event cameras, and depth sensors in combination to detect the physical environment around device 200. In some embodiments, image sensor(s) 210 include a first image sensor and a second image sensor. The first image sensor and the second image sensor work in tandem and are optionally configured to capture different information of physical objects in the real-world environment. In some embodiments, the first image sensor is a visible light image sensor and the second image sensor is a depth sensor. In some embodiments, device 200 uses image sensor(s) 210 to detect the position and orientation of device 200 and/or display generation component(s) 224 in the real-world environment. For example, device 200 uses image sensor(s) 210 to track the position and orientation of display generation component(s) 224 relative to one or more fixed objects in the real-world environment.
Device 200 optionally uses microphone(s) 218 or other audio sensors to detect sound from the user and/or the real-world environment of the user. In some embodiments, microphone(s) 218 includes an array of microphones (a plurality of microphones) that optionally operate in tandem, such as to identify ambient noise or to locate the source of sound in space of the real-world environment. In some embodiments, audio or voice inputs captured by one or more microphones (e.g., audio sensors) can be used to interact with the user interface or computer-generated environment.
Device 200 optionally includes location sensor(s) 214 for detecting a location of device 200 and/or display generation component(s) 224. For example, location sensor(s) 214 can include a GPS receiver that receives data from one or more satellites and allows device 200 to determine the device's absolute position in the physical world. Device 200 optionally includes orientation sensor(s) 216 for detecting orientation and/or movement of device 200 and/or display generation component(s) 224. For example, device 200 uses orientation sensor(s) 216 to track changes in the position and/or orientation of device 200 and/or display generation component(s) 224, such as with respect to physical objects in the real-world environment. Orientation sensor(s) 216 optionally include one or more gyroscopes and/or one or more accelerometers.
Device 200 includes hand tracking sensor(s) 230 and/or eye tracking sensor(s) 232, in some embodiments. Hand tracking sensor(s) 230 are configured to track the position/location of one or more portions of the user's hands, and/or motions of one or more portions of the user's hands with respect to the computer-generated environment, relative to the display generation component(s) 224, and/or relative to another defined coordinate system. Eye tracking sensor(s) 232 are configured to track the position and movement of a user's gaze (eyes, face, or head, more generally) with respect to the real-world or computer-generated environment and/or relative to the display generation component(s) 224. In some embodiments, hand tracking sensor(s) 230 and/or eye tracking sensor(s) 232 are implemented together with the display generation component(s) 224. In some embodiments, the hand tracking sensor(s) 230 can use image sensor(s) 210 (e.g., one or more IR cameras, 3D cameras, depth cameras, etc.) that capture three-dimensional information from the real-world including one or more hands (e.g., of a human user). In some examples, the hands can be resolved with sufficient resolution to distinguish fingers and their respective positions. In some embodiments, one or more image sensor(s) 210 are positioned relative to the user to define a field of view of the image sensor(s) and an interaction space in which finger/hand position, orientation and/or movement captured by the image sensors are used as inputs (e.g., to distinguish from a user's resting hand or other hands of other persons in the real-world environment). Tracking the fingers/hands for input (e.g., gestures) can be advantageous in that it does not require the user to touch, hold or wear any sort of beacon, sensor, or other marker. In some embodiments, eye tracking sensor(s) 232 includes at least one eye tracking camera (e.g., infrared (IR) cameras) and/or illumination sources (e.g., IR light sources, such as LEDs) that emit light towards a user's eyes. The eye tracking cameras may be pointed towards a user's eyes to receive reflected IR light from the light sources directly or indirectly from the eyes. In some embodiments, both eyes are tracked separately by respective eye tracking cameras and illumination sources, and a focus/gaze can be determined from tracking both eyes. In some embodiments, one eye (e.g., a dominant eye) is tracked by a respective eye tracking camera/illumination source(s). In some embodiments, eye tracking sensor(s) 232 can use image sensor(s) 210 (e.g., one or more IR cameras, 3D cameras, depth cameras, etc.).
Device 200 is not limited to the components and configuration of
As described herein, a computer-generated environment including various graphical user interfaces (“GUIs”) may be displayed using an electronic device, such as electronic device 100 or device 200, including one or more display generation components. The computer-generated environment can include one or more GUIs associated with an application. Device 100 or device 200 may support a variety of applications, such as productivity applications (e.g., a presentation application, a word processing application, a spreadsheet application, etc.), a gaming application, a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a web browsing application, etc.
In some embodiments, locations in a computer-generated environment (e.g., a three-dimensional environment, an XR environment, etc.) optionally have corresponding locations in the physical environment. Thus, when a device is described as displaying a virtual object at a respective location with respect to a physical object (e.g., such as a location at or near the hand of the user, or at or near a physical table), the device displays the virtual object at a particular location in the three-dimensional environment such that it appears as if the virtual object is at or near the physical object in the physical world (e.g., the virtual object is displayed at a location in the three-dimensional environment that corresponds to a location in the physical environment at which the virtual object would be displayed if it were a real object at that particular location).
In some embodiments, real world objects that exist in the physical environment that are displayed in the three-dimensional environment can interact with virtual objects that exist only in the three-dimensional environment. For example, a three-dimensional environment can include a table and a user interface located in front of the table, with the table being a view of (or a representation of) a physical table in the physical environment, and the user interface being a virtual object.
Similarly, a user is optionally able to interact with virtual objects in the three-dimensional environment (e.g., user interfaces of applications running on the device) using one or more hands as if the virtual objects were real objects in the physical environment. For example, as described above, one or more sensors of the device optionally capture one or more of the hands of the user and display representations of the hands of the user in the three-dimensional environment (e.g., in a manner similar to displaying a real world object in the three-dimensional environment described above). Alternatively, in some embodiments, the hands of the user are visible via the display generation component because the physical environment can be seen through the user interface (e.g., due to the transparency/translucency of the portion of the display generation component that is displaying the user interface, the projection of the user interface onto a transparent/translucent surface, or the projection of the user interface onto the user's eye or into a field of view of the user's eye). Thus, in some embodiments, the hands of the user are displayed at a respective location in the three-dimensional environment and are treated as if they were objects in the three-dimensional environment that are able to interact with the virtual objects in the three-dimensional environment (e.g., grabbing, moving, touching, pointing at virtual objects, etc.) as if they were real physical objects in the physical environment. In some embodiments, a user is able to move his or her hands to cause the representations of the hands in the three-dimensional environment to move in conjunction with the movement of the user's hands.
In some of the embodiments described below, the device is optionally able to determine the "effective" distance between physical objects in the physical world and virtual objects in the three-dimensional environment, for example, for the purpose of determining whether a physical object is interacting with a virtual object (e.g., whether a hand is touching, grabbing, holding, etc. a virtual object or within a threshold distance from a virtual object). For example, the device determines the distance between the hands of the user and virtual objects when determining whether the user is interacting with virtual objects and/or how the user is interacting with virtual objects. In some embodiments, the device determines the distance between the hands of the user and a virtual object by determining the distance between the location of the hands in the three-dimensional environment and the location of the virtual object of interest in the three-dimensional environment. For example, the one or more hands of the user can be located at a particular position in the physical world, which the device optionally captures and displays at a particular corresponding position in the three-dimensional environment (e.g., the position in the three-dimensional environment at which the hands would be displayed if the hands were virtual, rather than physical, hands). The position of the hands in the three-dimensional environment is optionally compared against the position of the virtual object of interest in the three-dimensional environment to determine the distance between the one or more hands of the user and the virtual object. In some embodiments, the device optionally determines a distance between a physical object and a virtual object by comparing positions in the physical world (e.g., as opposed to comparing positions in the three-dimensional environment). For example, when determining the distance between one or more hands of the user and a virtual object, the device optionally determines the corresponding location in the physical world of the virtual object (e.g., the position at which the virtual object would be located in the physical world if it were a physical object rather than a virtual object), and then determines the distance between the corresponding physical position and the one or more hands of the user. In some embodiments, the same techniques are optionally used to determine the distance between any physical object and any virtual object. Thus, as described herein, when determining whether a physical object is in contact with a virtual object or whether a physical object is within a threshold distance of a virtual object, the device optionally performs any of the techniques described above to map the location of the physical object to the three-dimensional environment and/or map the location of the virtual object to the physical world.
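By way of a non-limiting illustration only, the following Swift sketch shows one way such a distance determination could be structured. The type names (e.g., Point3D, EnvironmentMapping), the simple translation-only mapping between coordinate spaces, and the threshold value are assumptions for this example and are not part of the disclosure.

```swift
import Foundation

// A point in either the physical coordinate space or the three-dimensional environment.
struct Point3D {
    var x: Double, y: Double, z: Double

    func distance(to other: Point3D) -> Double {
        let dx = x - other.x, dy = y - other.y, dz = z - other.z
        return (dx * dx + dy * dy + dz * dz).squareRoot()
    }
}

// Hypothetical mapping from physical-world coordinates to environment coordinates
// (here simplified to a pure translation, e.g., derived from device pose tracking).
struct EnvironmentMapping {
    var originOffset: Point3D

    func toEnvironment(_ physical: Point3D) -> Point3D {
        Point3D(x: physical.x + originOffset.x,
                y: physical.y + originOffset.y,
                z: physical.z + originOffset.z)
    }
}

// Returns true when the hand is close enough to the virtual object to be treated as
// touching/grabbing it (threshold in meters; the value is illustrative only).
func handIsInteracting(handPhysical: Point3D,
                       virtualObjectPosition: Point3D,
                       mapping: EnvironmentMapping,
                       threshold: Double = 0.02) -> Bool {
    // Map the physical hand location into the environment, then compare positions there.
    let handInEnvironment = mapping.toEnvironment(handPhysical)
    return handInEnvironment.distance(to: virtualObjectPosition) <= threshold
}
```

The same comparison could equivalently be carried out in the physical coordinate space by mapping the virtual object's position in the opposite direction, as described above.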
In some embodiments, the same or similar technique is used to determine where and what the gaze of the user is directed to. For example, if the gaze of the user is directed to a particular position in the physical environment, the device optionally determines the corresponding position in the three-dimensional environment and if a virtual object is located at that corresponding virtual position, the device optionally determines that the gaze of the user is directed to that virtual object.
Similarly, the embodiments described herein may refer to the location of the user (e.g., the user of the device) and/or the location of the device in the three-dimensional environment. In some embodiments, the user of the device is holding, wearing, or otherwise located at or near the electronic device. Thus, in some embodiments, the location of the device is used as a proxy for the location of the user. In some embodiments, the location of the device and/or user in the physical environment corresponds to a respective location in the three-dimensional environment. In some embodiments, the respective location is the location from which the “camera” or “view” of the three-dimensional environment extends. For example, the location of the device would be the location in the physical environment (and its corresponding location in the three-dimensional environment) from which, if a user were to stand at that location facing the respective portion of the physical environment displayed by the display generation component, the user would see the objects in the physical environment in the same position, orientation, and/or size as they are displayed by the display generation component of the device (e.g., in absolute terms and/or relative to each other). Similarly, if the virtual objects displayed in the three-dimensional environment were physical objects in the physical environment (e.g., placed at the same location in the physical environment as they are in the three-dimensional environment, and having the same size and orientation in the physical environment as in the three-dimensional environment), the location of the device and/or user is the position at which the user would see the virtual objects in the physical environment in the same position, orientation, and/or size as they are displayed by the display generation component of the device (e.g., in absolute terms and/or relative to each other and the real world objects).
Some embodiments described herein may refer to selection inputs as either discrete inputs or as continuous inputs. For example, a selection input can correspond to a single selection input or a selection input can be held (e.g., maintained) while performing one or more other gestures or inputs. In some embodiments, a selection input can have an initiation stage, a holding stage, and a termination stage. For example, in some embodiments, a pinch gesture by a hand of the user can be interpreted as a selection input. In this example, the motion of the hand into a pinch position can be referred to as the initiation stage and the device is able to detect that the user has initiated a selection input. The holding stage refers to the stage at which the hand maintains the pinch position. Lastly, the termination stage refers to the motion of the hand terminating the pinch position (e.g., releasing the pinch). In some embodiments, if the holding stage is less than a predetermined threshold amount of time (e.g., less than 0.1 seconds, 0.3 seconds, 0.5 seconds, 1 second, 2 seconds, etc.), then the selection input is interpreted as a discrete selection input (e.g., a single event actuating a respective user interface element), such as a mouse click-and-release, a keyboard button press-and-release, etc. In such embodiments, the electronic device optionally reacts to the discrete selection event (e.g., optionally after detecting the termination). In some embodiments, if the holding stage is more than the predetermined threshold amount of time, then the selection input is interpreted as a select-and-hold input, such as a mouse click-and-hold, a keyboard button press-and-hold, etc. In such embodiments, the electronic device can react to not only the initiation of the selection input (e.g., initiation stage), but also to any gestures or events detected during the holding stage (e.g., such as the movement of the hand that is performing the selection gesture), and/or the termination of the selection input (e.g., termination stage).
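As a minimal sketch of this staged interpretation (the names SelectionInput and SelectionInterpretation are hypothetical, and the 0.3-second threshold is merely one of the example values listed above), a pinch could be classified as follows:

```swift
import Foundation

enum SelectionInterpretation {
    case discreteSelection      // e.g., a click-and-release style actuation
    case selectAndHold          // e.g., a click-and-hold whose movement can drive a drag
}

struct SelectionInput {
    let initiationTime: TimeInterval   // when the pinch position was formed
    let terminationTime: TimeInterval  // when the pinch was released
}

// Interprets a completed selection input: holding stages shorter than the threshold are
// treated as discrete selections; longer holds are treated as select-and-hold inputs.
func interpret(_ input: SelectionInput,
               holdThreshold: TimeInterval = 0.3) -> SelectionInterpretation {
    let holdingDuration = input.terminationTime - input.initiationTime
    return holdingDuration < holdThreshold ? .discreteSelection : .selectAndHold
}
```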
In some embodiments, three-dimensional environment 300 includes one or more real-world objects (e.g., representations of objects in the physical environment around the device) and/or one or more virtual objects (e.g., representations of objects generated and displayed by the device that are not necessarily based on real world objects in the physical environment around the device). For example, in
In
In some embodiments, user interfaces in a container have sizes and/or shapes based on the characteristics of the respective user interface and/or respective application. For example, if a first user interface in a container is associated with a first application and a second user interface in the container is associated with a second application, the size and shape of the first user interface is determined based on the design and requirements of the first application and the size and shape of the second user interface is determined based on the design and requirements of the second application. In some embodiments, whether a user interface is a member of a container (e.g., as opposed to not being a member of a container) does not affect the size and shape of a respective user interface. In some embodiments, a user is able to resize a user interface in a container. In some embodiments, different user interfaces in three-dimensional environment 300 can have different sizes, shapes, and/or orientations (e.g., portrait vs. landscape).
In some embodiments, a container can impose size and shape restrictions on the user interfaces in the container, optionally to ensure a consistent look and feel. For example, a container can require that user interfaces in the container be less than a maximum height, be less than a maximum width, and/or have an aspect ratio within a predetermined range. It is understood that the sizes and shapes of the user interfaces illustrated herein are merely exemplary and not limiting.
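By way of illustration only, such restrictions could be checked as in the Swift sketch below; the structure names, units, and particular limit values are assumptions for this example rather than requirements of the disclosure.

```swift
import Foundation

// Hypothetical size restrictions a container might impose on its member user interfaces.
struct ContainerConstraints {
    var maxWidth: Double                        // e.g., meters
    var maxHeight: Double
    var aspectRatioRange: ClosedRange<Double>   // allowed width/height ratios
}

struct WindowSize {
    var width: Double
    var height: Double
    var aspectRatio: Double { width / height }
}

// Returns true when a user interface of the given size may be placed in the container.
func satisfiesConstraints(_ size: WindowSize, _ constraints: ContainerConstraints) -> Bool {
    size.width <= constraints.maxWidth
        && size.height <= constraints.maxHeight
        && constraints.aspectRatioRange.contains(size.aspectRatio)
}

let constraints = ContainerConstraints(maxWidth: 1.5, maxHeight: 1.0,
                                       aspectRatioRange: 0.5...2.0)
print(satisfiesConstraints(WindowSize(width: 1.2, height: 0.8), constraints))  // true
print(satisfiesConstraints(WindowSize(width: 1.2, height: 0.4), constraints))  // false (3:1 aspect)
```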
In some embodiments, user interfaces in three-dimensional environment 300 are accompanied by one or more manipulation affordances. For example, in
In some embodiments, affordance 308-1 is not displayed (e.g., is hidden) if the focus of the user is not directed at the user interface 306-1. In some embodiments, affordance 308-1 is always displayed (e.g., without regard to whether the focus is on user interface 306-1).
In some embodiments, affordance 308-1 can be manipulated by the user to perform one or more manipulation operations on user interface 306-1. For example, a user is able to use a hand to perform one or more gestures directed to affordance 308-1 to interact with affordance 308-1. In some embodiments, the user is able to perform a selection gesture with a hand (e.g., a pinch and release gesture, a tap gesture, etc.) to actuate affordance 308-1, which optionally causes display of one or more options to perform one or more functions with respect to user interface 306-1, such as to close, resize, move, etc. user interface 306-1. In some embodiments, the user is able to perform a selection gesture with a hand and while maintaining the selection gesture (e.g., a pinch-and-hold gesture, a pointing gesture with one or more fingers while maintaining the extended position of the one or more fingers, etc.), move the hand to cause affordance 308-1 and user interface 306-1 to move around in three-dimensional environment 300 in accordance with the movement of the hand (e.g., a drag-and-drop operation). In some embodiments, a selection gesture can include a forward pointing gesture with a finger of a hand pointing at affordance 308-1 (e.g., a forward and/or upward movement by the hand and/or an extension of one or more fingers towards affordance 308-1), a tap gesture with a finger of the hand (e.g., a forward movement by a finger of the hand towards affordance 308-1 such that the finger touches affordance 308-1 or approaches within a threshold distance of affordance 308-1), a pinch gesture by two or more fingers of the hand (e.g., a pinch by a thumb and forefinger of the hand at a location associated with affordance 308-1 such that it appears as if the user is pinching affordance 308-1), or any other suitable gesture indicative of the user's interest in affordance 308-1.
In
In some embodiments, the detected gesture includes a selection gesture on a representation of an application (e.g., selection of an application icon from a buffet of applications that are installed on the device and able to be executed, launched and/or otherwise displayed in three-dimensional environment 300). In some embodiments, the gesture optionally includes a selection gesture and a movement by hand 301 moving a user interface associated with the application to be launched to the desired location in three-dimensional environment 300 (e.g., a drag-and-drop gesture). For example, in some embodiments, in response to selecting an application icon, a representation of a user interface associated with the selected application (e.g., such as user interface 306-2) is displayed at or near the location of the user's selection (e.g., the application icon turns into and/or morphs into a representation of a user interface), and while displaying the representation of the user interface, the user is able to move hand 301 while maintaining the selection gesture to move the representation of the user interface to different locations in three-dimensional environment 300 (e.g., dragging the representation of the user interface).
In some embodiments, in response to detecting the termination of the selection gesture, user interface 306-2 is displayed where the representation of user interface 306-2 was located when the termination of the selection gesture was detected, as will be described in further detail below with respect to
In some embodiments, three-dimensional environment 300 has one or more predetermined positions at which user interfaces can be displayed and/or placed. In such embodiments, when a user interface (or representation of a user interface, as the case may be) is moved to within the threshold distance of the one or more predetermined positions (e.g., within 1 inch, 3 inches, 6 inches, 1 foot, etc. from the predetermined position), an outline (e.g., such as outline 310) is displayed at the respective predetermined position to indicate that the user interface can be and/or will be placed at the respective predetermined position in response to the release of the selection input. In some embodiments, if the user interface is not within the threshold distance from the predetermined location, outline 310 is not displayed in three-dimensional environment 300.
In some embodiments, while outline 310 is displayed, if user interface 306-2 is moved to more than the threshold distance from the respective location (e.g., moved away from the respective location), then outline 310 optionally ceases to be displayed. In some embodiments, the threshold distance at which outline 310 ceases to be displayed is more than (e.g., 1 inch more, 3 inches more, 6 inches more, etc.) the threshold distance at which outline 310 begins to be displayed (e.g., a hysteresis effect). Implementing a hysteresis effect prevents outline 310 from flickering in and out of display if, for example, user interface 306-2 is at or near the threshold distance at which outline 310 begins to be displayed.
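The hysteresis behavior can be pictured with the following Swift sketch, in which the two threshold values (in meters) and the type name OutlineHysteresis are illustrative assumptions:

```swift
import Foundation

// Tracks whether the placement outline is shown, using two thresholds so the outline
// does not flicker when the dragged user interface hovers near the boundary.
struct OutlineHysteresis {
    var showThreshold: Double = 0.15   // appear when closer than this (illustrative value)
    var hideThreshold: Double = 0.25   // disappear only when farther than this
    private(set) var isShowingOutline = false

    // Call with the current distance between the dragged user interface and the
    // predetermined position; returns whether the outline should be displayed.
    mutating func update(distanceToSnapPosition distance: Double) -> Bool {
        if !isShowingOutline && distance <= showThreshold {
            isShowingOutline = true
        } else if isShowingOutline && distance > hideThreshold {
            isShowingOutline = false
        }
        return isShowingOutline
    }
}

var outline = OutlineHysteresis()
print(outline.update(distanceToSnapPosition: 0.30))   // false — still far from the snap position
print(outline.update(distanceToSnapPosition: 0.12))   // true  — crossed the show threshold
print(outline.update(distanceToSnapPosition: 0.20))   // true  — inside the hysteresis band, no flicker
print(outline.update(distanceToSnapPosition: 0.28))   // false — beyond the hide threshold
```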
In some embodiments, the one or more predetermined positions can be based on the location of objects in three-dimensional environment 300, such as tables, walls, and/or other user interfaces. For example, three-dimensional environment 300 can impose rules to prevent user interfaces from conflicting with other user interfaces by setting the predetermined positions based on distance from existing user interfaces (e.g., optionally taking into account the size and/or shape of the existing user interfaces and/or the user interface to be placed). For example, three-dimensional environment 300 can require a buffer between user interfaces (e.g., a 3 inch, 6 inch, 1 foot, 3 feet, etc. buffer). Thus, a respective predetermined position can be located to the left, right, above, or below an existing user interface, but optionally cannot be located at a location that causes the obscuring of either the existing user interface or the user interface being placed.
In some embodiments, the one or more predetermined positions can be based on acceptable locations and/or positions for user interfaces in a container (e.g., a set of user interfaces that move together in response to movement inputs). In some embodiments, a container is optionally an organizational element (e.g., a user interface element, a software element, a software construct, etc.) that includes a set of user interfaces (e.g., one or more, a plurality, etc.) that have been grouped together (e.g., a set of user interfaces, a workspace, etc.). In some embodiments, user interfaces that are grouped together in a container share certain characteristics, properties, and/or behaviors with each other. For example, user interfaces in a container optionally are automatically aligned with each other (e.g., aligned horizontally and/or aligned vertically, etc.), maintain a predetermined amount of separation from each other, and/or maintain the same distance from the user as the other user interfaces in the same container. For example, if a first user interface in a container is moved around in three-dimensional environment 300 (e.g., in response to a user input, for example), then the other user interfaces in the same container are optionally also moved around in three-dimensional environment 300 to maintain the same position relative to each other (e.g., the user interfaces move together as a single unit).
Thus, user interfaces in a container can be arranged according to predetermined criteria. For example, user interfaces in a container can be aligned horizontally and have a fixed spacing between each user interface. Thus, acceptable locations for user interfaces in a container can be a location to the left of the left-most user interface in the container (e.g., separated from the left-most user interface by a predetermined spacing, such as 1 inch, 3 inches, 6 inches, 1 foot, etc.), a location to the right of the right-most user interface in the container (e.g., separated from the right-most user interface by a predetermined spacing, such as 1 inch, 3 inches, 6 inches, 1 foot, etc.), and/or a location between two user interfaces (e.g., separated from each adjacent user interface by a predetermined spacing, such as 1 inch, 3 inches, 6 inches, 1 foot, etc.). In some embodiments, the predetermined locations for a container are locations such that, when a user interface is placed and/or snapped to a respective predetermined location for the container, the user interface is added to the container at the respective predetermined location (optionally causing the existing user interfaces to adjust positions to allow the user interface being added to be added at the respective location), as will be discussed in further detail below. Thus, if a respective user interface is moved to within a threshold distance from these predetermined locations in a container (e.g., 1 inch, 3 inches, 6 inches, 1 foot, 3 feet, etc.), an outline can be displayed indicating that the respective user interface will be added to the container and placed at the location indicated by the outline. In some embodiments, the predetermined locations in a container are based on the size and/or shape of the user interface being added. For example, if user interface 306-2 has a width of 1 foot, then a respective predetermined location can be 6 inches (e.g., half the width of user interface 306-2) to the left or right of user interface 306-1, optionally including a spacing margin (e.g., a predetermined spacing between user interfaces, such as 1 inch, 3 inches, 6 inches, etc.). Setting the respective predetermined location at least 6 inches to the left or right of user interface 306-1 prevents overlap between user interface 306-1 and user interface 306-2 (e.g., prevents one user interface from obscuring the other). Thus, in some embodiments, the predetermined locations in a container can dynamically be updated and/or adjusted based on the size and/or shape of the existing user interfaces and/or the size and/or shape of the user interfaces being added.
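As a concrete (and purely illustrative) sketch of how such predetermined locations could be computed for a horizontally arranged container, the following Swift example derives candidate snap centers from the existing windows' widths, the spacing, and the width of the window being added; the type names, units, and values are assumptions.

```swift
import Foundation

// Horizontal extent of a user interface in the container (centers share a common row).
struct WindowSlot {
    var centerX: Double
    var width: Double
    var minX: Double { centerX - width / 2 }
    var maxX: Double { centerX + width / 2 }
}

// Candidate snap centers for adding a window of `newWidth` to a horizontal container:
// one to the left of the left-most window, one to the right of the right-most window,
// and one at the midpoint of each gap between adjacent windows.
func snapCenters(existing: [WindowSlot], newWidth: Double, spacing: Double) -> [Double] {
    let sorted = existing.sorted { $0.centerX < $1.centerX }
    guard let first = sorted.first, let last = sorted.last else { return [] }

    var centers: [Double] = []
    centers.append(first.minX - spacing - newWidth / 2)   // left of the left-most window
    centers.append(last.maxX + spacing + newWidth / 2)    // right of the right-most window
    for (left, right) in zip(sorted, sorted.dropFirst()) {
        centers.append((left.maxX + right.minX) / 2)      // between adjacent windows
    }
    return centers
}

// Two 1.0 m wide windows centered at x = -0.75 and x = 0.75, with 0.25 m spacing.
let container = [WindowSlot(centerX: -0.75, width: 1.0), WindowSlot(centerX: 0.75, width: 1.0)]
print(snapCenters(existing: container, newWidth: 0.5, spacing: 0.25))
// [-1.75, 1.75, 0.0] — left of, right of, and between the existing windows
```

Note how each candidate center is offset by half the new window's width, which mirrors the overlap-prevention reasoning in the preceding paragraph.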
In some embodiments, outline 310 is only displayed for predetermined locations associated with containers and not displayed for predetermined locations that are not associated with containers (e.g., locations that do not cause the user interface to be added to the container). In some embodiments, outline 310 is displayed for predetermined locations without regard to whether they are associated or not associated with containers.
In some embodiments, the threshold distance at which outline 310 is displayed can be the same threshold distance at which user interface 306-2 will snap to the location indicated by outline 310. For example, outline 310 is displayed when user interface 306-2 reaches a position such that the release of the selection gesture will cause user interface 306-2 to “snap” to the respective location indicated by outline 310.
In some embodiments, outline 310 is based on user interface 306-2 and has the same size and/or shape as user interface 306-2. In some embodiments, outline 310 has the same shape, but has a larger size than user interface 306-2 (e.g., 5% larger, 10% larger, 20% larger, etc.). In some embodiments, outline 310 provides a preview of how user interface 306-2 will look once placed at the respective location (e.g., a preview of the boundary of user interface 306-2, a preview of the area which will be occupied by user interface 306-2, etc.).
In some embodiments, the display of outline 310 indicates that the release of the selection gesture will cause user interface 306-2 to be placed at (e.g., snapped to) the respective location indicated by outline 310. In some embodiments, outline 310 is displayed if the release of the selection gesture will cause user interface 306-2 to be added to a container that includes user interface 306-1 (as will be described in further detail below with respect to
In some embodiments, if the user moves user interface 306-2 to a position such that outline 310 is displayed (e.g., as described above) and while outline 310 is displayed, if user interface 306-2 hovers at or near that respective location (e.g., user interface 306-2 moves by less than 1 inch, 3 inches, 6 inches, etc.) for more than a threshold amount of time (e.g., 1 second, 3 seconds, 5 seconds, etc.), then an animation is optionally displayed moving user interface 306-2 into the location indicated by outline 310 (e.g., the location that user interface 306-2 will be displayed upon detection of the release of the selection input) (e.g., without requiring a movement by hand 301 moving user interface 306-2 to that location). For example, an animation is displayed of user interface 306-2 sliding into position, thus providing a further visual indication that the release of the selection gesture will cause user interface 306-2 to be placed at the respective location indicated by outline 310.
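One way to picture the hover-dwell condition is the Swift sketch below, which reports when the dragged user interface has stayed within a small drift allowance for long enough that the snap animation could begin. The DwellDetector name, the one-axis simplification, and the drift/duration values are assumptions for this example only.

```swift
import Foundation

// Decides when to animate the dragged user interface into the outlined position: the
// outline must be visible and the drag must have moved less than a small distance for
// at least a dwell duration. Threshold values are illustrative.
struct DwellDetector {
    var maxDrift: Double = 0.05          // movement (meters) still counted as "hovering"
    var dwellDuration: TimeInterval = 1.0

    private var anchorPosition: Double? = nil
    private var anchorTime: TimeInterval? = nil

    // Feed the current drag position (simplified to one axis) and timestamp.
    // Returns true once the hover has lasted long enough to start the snap animation.
    mutating func update(position: Double, time: TimeInterval, outlineVisible: Bool) -> Bool {
        guard outlineVisible else { anchorPosition = nil; anchorTime = nil; return false }
        if let anchor = anchorPosition, abs(position - anchor) <= maxDrift {
            return time - (anchorTime ?? time) >= dwellDuration
        }
        // Movement exceeded the drift allowance: restart the dwell timer from here.
        anchorPosition = position
        anchorTime = time
        return false
    }
}

var dwell = DwellDetector()
print(dwell.update(position: 0.0, time: 0.0, outlineVisible: true))   // false — dwell just started
print(dwell.update(position: 0.02, time: 1.2, outlineVisible: true))  // true  — hovered long enough
```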
In
In some embodiments, in response to detecting the termination of the selection gesture by hand 301, the application associated with user interface 306-2 is launched (e.g., if the application was not already running) and user interface 306-2 is displayed (e.g., placed) at the respective location indicated by outline 310 (e.g., snapped to the location around which outline 310 was displayed), as shown in
In some embodiments, in response to detecting the termination of the selection gesture by hand 301, user interface 306-2 is added to a container that includes user interface 306-1. For example, in
In some embodiments, if three-dimensional environment 300 already includes a container and user interface 306-1 is already a member of that container, then in response to detecting the termination of the selection gesture by hand 301 when user interface 306-2 is within the threshold distance of the respective location to the left of user interface 306-1, user interface 306-2 is added to the existing container to the left of user interface 306-1, as shown in
In some embodiments, three-dimensional environment 300 can include one or more containers, each of which includes one or more user interfaces. In some embodiments, three-dimensional environment 300 can include a maximum of one container. In some embodiments, three-dimensional environment 300 can include no containers. In some embodiments, a container can contain one or more user interfaces. In some embodiments, a container must include at least two user interfaces and a container cannot be created with just one user interface. In such embodiments, removing the second-to-last user interface from the container automatically disbands the container (e.g., while maintaining display of the last user interface, which is now no longer a member of a container). In some embodiments, three-dimensional environment 300 can include user interfaces that are part of containers (e.g., which optionally move as a group, in accordance with the behavioral rules of the container) and user interfaces that are not part of containers (e.g., which are able to move freely, without being constrained by the behavioral rules of the container). For example, not every user interface in three-dimensional environment 300 must be a member of a container if a container exists. In some embodiments, if a container exists in three-dimensional environment 300, all user interfaces in three-dimensional environment 300 must be a member of that container or another container (e.g., must be a member of some container). In some embodiments, a user interface cannot be a member of more than one container. In some embodiments, a respective user interface can be a member of multiple containers (e.g., which optionally causes the user interfaces in the multiple containers of which the respective user interface is a member to be indirectly associated with each other).
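By way of illustration only, a container could be modeled as a small ordered collection that enforces the minimum-of-two-members rule described above. The Container type, the use of string identifiers for user interfaces, and the window names in the example are assumptions, not part of the disclosure.

```swift
import Foundation

// Minimal model of a container: an ordered group of user-interface identifiers that move
// together, created only from two or more members and disbanded when one would remain.
struct Container {
    private(set) var members: [String]

    // A container can only be created from two or more user interfaces.
    init?(members: [String]) {
        guard members.count >= 2 else { return nil }
        self.members = members
    }

    mutating func insert(_ window: String, at index: Int) {
        members.insert(window, at: index)
    }

    // Removes a member. Returns false when the removal leaves a single member, signalling
    // that the container should be disbanded (the last window remains displayed, but is
    // no longer part of any container).
    mutating func remove(_ window: String) -> Bool {
        members.removeAll { $0 == window }
        return members.count >= 2
    }
}

// Example: build a container, then remove windows until it disbands.
var group = Container(members: ["Browser", "Notes"])!
group.insert("Photos", at: 1)          // ["Browser", "Photos", "Notes"]
print(group.remove("Photos"))          // true  — two members remain, container persists
print(group.remove("Notes"))           // false — disband; "Browser" stands alone
```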
Thus, as described above, in response to detecting a user moving a user interface (e.g., either a representation of a user interface that was previously an icon of an application, or a user interface from another location in three-dimensional environment 300, as will be described in further detail below with respect to
As shown in
In some embodiments, affordance 312 is associated with the container and optionally is manipulable to manipulate the container (e.g., to manipulate the user interfaces in the container, to manipulate all user interfaces in the container, etc., such as to move the user interfaces in horizontal and/or vertical directions and/or to change the distance of the user interfaces from the user (e.g., change the z depth), etc.). For example, a user is able to select affordance 312 with hand 301 and while maintaining the selection input, move hand 301 to move affordance 312 in accordance with the movement of hand 301. In some embodiments, moving affordance 312 causes one or more of the user interfaces in the container to move in accordance with the movement of hand 301 (e.g., the user interfaces adjacent to affordance 312, all user interfaces in the container, etc., optionally including the associated affordances). For example, if hand 301 moves rightwards, affordance 312 (e.g., and optionally affordances 308-1 and 308-2) and/or user interfaces 306-1 and 306-2 (e.g., the user interfaces in the container) move rightward, optionally with a speed and/or magnitude that is based on the speed and/or magnitude of the movement of hand 301. In some embodiments, as will be described in more detail below with respect to
Thus, as described above, if a user interface is moved (e.g., via a drag-and-drop style interaction) to within a threshold distance of a respective location which can cause the user interface to be added to a container that includes one or more existing user interfaces, a target can be displayed at the respective location. In response to detecting the termination of the selection input (e.g., the "drop" of the drag-and-drop interaction), the user interface being moved is snapped to the respective location and added to a container that includes one or more of the existing user interfaces.
In some embodiments, if the user interface is not brought to within the threshold distance from the respective location, then in response to detecting the termination of the selection input, the application associated with the user interface is not launched and/or the user interface being moved is not displayed or placed in three-dimensional environment 300 (e.g., the launch is canceled and the user interface being moved ceases to be displayed). In some embodiments, if a user interface is not within the threshold distance from the predetermined location, then in response to detecting the termination of the selection input, the user interface is placed at the location that the user interface was at when the termination of the selection input was received (e.g., the user interface is not snapped to one of the predetermined locations). Thus, the user is able to flexibly place user interfaces at any location in three-dimensional environment 300 by dragging and dropping the user interfaces (e.g., optionally via dragging and dropping an affordance, such as affordance 308-1 and/or optionally via dragging and dropping the user interface itself) at the desired location.
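The outcomes on release could be summarized as in the Swift sketch below. Because the disclosure presents canceling the launch and placing the user interface in place as alternative embodiments, keying the choice off whether the drag began from an application icon or from an existing window is an assumption made here purely for illustration, as are the names and the threshold value.

```swift
import Foundation

enum DragSource {
    case applicationIcon        // dragging a representation that has not been launched yet
    case existingWindow         // dragging a user interface already in the environment
}

enum DropOutcome {
    case snapIntoContainer(at: Double)   // placed at the predetermined location and added
    case placeAtCurrentLocation          // left wherever the drag ended
    case cancelLaunch                    // the application is not launched or displayed
}

// Resolves the end of a drag. Positions are simplified to a single horizontal coordinate.
func resolveDrop(source: DragSource,
                 dropPosition: Double,
                 nearestSnapPosition: Double?,
                 snapThreshold: Double = 0.15) -> DropOutcome {
    if let snap = nearestSnapPosition, abs(dropPosition - snap) <= snapThreshold {
        return .snapIntoContainer(at: snap)
    }
    switch source {
    case .applicationIcon: return .cancelLaunch            // one described variant
    case .existingWindow:  return .placeAtCurrentLocation  // free placement variant
    }
}

print(resolveDrop(source: .existingWindow, dropPosition: 0.5, nearestSnapPosition: 0.45))
// within the snap threshold, so the window snaps into the container at 0.45
```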
In
In some embodiments, user interfaces 406-1 and/or 406-2 move to the right and left, respectively, to make room for user interface 406-3 to be inserted between user interface 406-1 and user interface 406-2. For example, user interface 406-1 can move to the right and user interface 406-3 can be displayed at least partially where user interface 406-1 was previously located (e.g., centered on where user interface 406-1 was previously centered), user interface 406-2 can move to the left and user interface 406-3 can be displayed at least partially where user interface 406-2 was previously located (e.g., centered on where user interface 406-2 was previously centered), or both user interfaces 406-1 and 406-2 can move to the right and left, respectively, and user interface 406-3 can be displayed at least partially where user interfaces 406-1 and 406-2 were previously located (e.g., centered on a location between user interface 406-1 and user interface 406-2).
In some embodiments, if the workspace has multiple user interfaces to the left and/or right of the location at which the new user interface is inserted, then the multiple user interfaces are moved to the left or right, as the case may be, to make room for the new user interface to be inserted. For example, in
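The reflow that makes room for an inserted user interface can be sketched as follows; the one-axis layout, the "keep the row centered" policy, and the widths and spacing are illustrative assumptions rather than requirements.

```swift
import Foundation

// Recomputes the horizontal centers of a container row so that the windows stay centered
// as a group and keep a fixed gap between neighbors. Inserting a new width into the list
// therefore shifts the existing windows outward to make room.
func reflowCenters(widths: [Double], spacing: Double) -> [Double] {
    let totalWidth = widths.reduce(0, +) + spacing * Double(max(widths.count - 1, 0))
    var centers: [Double] = []
    var cursor = -totalWidth / 2
    for width in widths {
        centers.append(cursor + width / 2)
        cursor += width + spacing
    }
    return centers
}

// Two 1.0 m windows; then a 0.5 m window is inserted between them (index 1).
var widths = [1.0, 1.0]
print(reflowCenters(widths: widths, spacing: 0.25))   // [-0.625, 0.625]
widths.insert(0.5, at: 1)
print(reflowCenters(widths: widths, spacing: 0.25))   // [-1.0, 0.0, 1.0] — neighbors moved apart
```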
In some embodiments, an animation is displayed of the user interfaces moving to the left and/or right to make room for the new user interface to be inserted and/or an animation of the new user interface to be inserted appearing at the respective location during and/or after the user interfaces move to the left and/or right to make room.
Thus, as shown in
In some embodiments, the position and/or location where the new user interface is inserted is based at least on the location of the gaze of the user, such as gaze 412 illustrated in
In some embodiments, if gaze 412 is directed at an existing user interface, then in response to voice command 414, the electronic device replaces the existing user interface at which gaze 412 is directed with the new user interface being displayed (e.g., optionally without displaying an outline such as outline 310). For example, in
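Combining this paragraph with the preceding one, the gaze-dependent choice between replacing an existing user interface and inserting the new one could be sketched as below; the index-based container model, the one-axis gaze coordinate, and the fallback rule for picking the insertion index are assumptions for illustration.

```swift
import Foundation

enum VoiceLaunchPlacement {
    case replace(existingIndex: Int)     // gaze rests on an existing window: swap it out
    case insert(at: Int)                 // gaze rests beside/between windows: insert there
}

// Chooses where a voice-launched user interface goes, based on which existing window
// (if any) the gaze is directed at.
func placement(gazeX: Double, windowCenters: [Double], windowWidth: Double) -> VoiceLaunchPlacement {
    // If the gaze falls within a window's horizontal extent, replace that window.
    for (index, center) in windowCenters.enumerated()
    where abs(gazeX - center) <= windowWidth / 2 {
        return .replace(existingIndex: index)
    }
    // Otherwise insert before the first window whose center lies to the right of the gaze.
    let insertIndex = windowCenters.firstIndex { $0 > gazeX } ?? windowCenters.count
    return .insert(at: insertIndex)
}

print(placement(gazeX: 0.0, windowCenters: [-0.75, 0.75], windowWidth: 1.0))
// insert(at: 1) — the gaze is in the gap, so the new window goes between the two
```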
It is understood that although
In some embodiments, if three-dimensional environment 400 does not include a container and/or includes at least one user interface that is not a member of a container (e.g., such as in
It is noted that
In some embodiments, on the other hand, the device is optionally able to determine, from portions of the voice command, as the voice command is being received, that the user desires to launch an application and/or display a user interface, and the device is able to display an outline at the respective location. For example, in response to receiving a first portion of a voice command saying "display", the device is optionally able to determine that the command is likely a request to launch an application and/or display a user interface. Thus, in response to receiving the command "display" (e.g., or any other similar command), an outline can be displayed at the respective location (e.g., before the full voice command is received, which optionally causes the user interface to be placed at the respective location). In such embodiments, the respective location can be determined based on the location of gaze 412 when the first portion of the voice command is received.
While displaying three-dimensional environment 500 that includes one or more user interfaces that are members of a container, such as in
In some embodiments, application launcher element 514 includes a text field in which a user is able to enter text to launch an application or otherwise cause a user interface for an application to be displayed. Application launcher element 514 can be used to perform a plurality of functions including searching for documents, opening documents, searching for applications, launching applications, performing mathematical calculations, navigating to web pages, etc. For example, a user is able to enter text into application launcher element 514 via a physical keyboard, a virtual keyboard, a voice command, etc., and the device is able to identify one or more documents and/or applications that match or partially match the text entered into application launcher element 514 (e.g., document and/or application search results). In some embodiments, the device displays the results of the search as a list. In some embodiments, the search results can be labeled and/or categorized as either documents or applications (or other types of search results). The user is optionally able to select the desired document or application from the list of search results to cause the respective document or application to be displayed. For example, a user is able to use a voice command to indicate which search result to actuate and/or use a hand to perform a selection gesture directed to a respective search result. In some embodiments, a user is able to execute the top search result by performing a user input corresponding to a confirmation or execution user input. For example, if the top search result is a document, then in response to the selection of an “enter” key from a soft keyboard, the document is displayed (e.g., an application for displaying the document is launched and the document is opened using the application) and if the top search result is an application, then in response to the selection of the “enter” key from the soft keyboard, the application is launched (or a user interface of the application is displayed).
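Purely as an illustrative stand-in for this search behavior (the catalog contents, the app-before-document ranking, and the type names are assumptions), the matching and "top result" selection could look like this in Swift:

```swift
import Foundation

enum SearchResult {
    case document(name: String)
    case application(name: String)
}

// Small stand-in for the launcher's search: match entered text against known documents
// and applications, list application matches first, and treat the first match as the
// "top result" that a confirmation input (e.g., Enter) activates.
struct AppLauncher {
    var applications = ["Browser", "Notes", "Photos"]
    var documents = ["Notes for Monday.txt", "Photo shoot plan.txt"]

    func results(for query: String) -> [SearchResult] {
        let apps = applications
            .filter { $0.localizedCaseInsensitiveContains(query) }
            .map { SearchResult.application(name: $0) }
        let docs = documents
            .filter { $0.localizedCaseInsensitiveContains(query) }
            .map { SearchResult.document(name: $0) }
        return apps + docs
    }

    // What a confirmation input would act on: the top result, if any.
    func topResult(for query: String) -> SearchResult? {
        results(for: query).first
    }
}

let launcher = AppLauncher()
print(launcher.results(for: "photo"))
// the Photos application ranks first, followed by the matching document
```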
As shown in
In some embodiments, if a user interface will be displayed via application launcher element 514, then three-dimensional environment 500 is updated to indicate that a user interface will be displayed at a respective position. In some embodiments, updating three-dimensional environment 500 can include displaying an outline (e.g., such as outline 310 described above with respect to
In
In some embodiments, in accordance with a determination that the device identified Application 3 as matching search criteria based on the text entered in application launcher element 514, the device determines that Application 3 will be launched and/or a user interface associated with Application 3 will be displayed in response to a user input confirming the launch. In some embodiments, in response to determining that Application 3 will be launched and/or a user interface associated with Application 3 will be displayed, user interface 506-1 moves to the right and user interface 506-2 moves to the left to make room for the user interface associated with Application 3 (e.g., before Application 3 is launched), as shown in
In some embodiments, additionally or alternatively to moving user interfaces 506-1 and 506-2, outline 516 is displayed at the location where the user interface associated with Application 3 will be displayed (e.g., in response to a user input confirming the launch). Outline 516 optionally has characteristics and/or behaviors similar to outline 310 described above with respect to
In some embodiments, application launcher element 514 continues to be displayed while user interface 506-1 and user interface 506-2 move to the left and right, respectively, and before launching the respective user interface, the user is able to change the text in application launcher element 514 to change the search results and/or change whether outline 516 is displayed and/or whether the existing user interfaces move. For example, if the user deletes the text in application launcher element 514, then Application 3 is no longer the application that will be launched, and user interfaces 506-1 and 506-2 are optionally moved back to their original positions and/or outline 516 ceases to be displayed. Similarly, if the user changes the text in application launcher element 514 such that a search result that is not associated with any application is the top search result (e.g., an application will not be launched), then user interfaces 506-1 and 506-2 are optionally moved back to their original positions and/or outline 516 ceases to be displayed.
On the other hand, if the user changes the text in application launcher element 514 such that a different application is now the top search result (for example, Application 4, instead of Application 3), then in some embodiments, outline 516 may be updated and/or adjusted based on the size and/or shape of the user interface for Application 4 and/or user interfaces 506-1 and 506-2 may move closer or farther apart based on the size and/or shape of the user interface for Application 4. For example, if the landing page for Application 4 is larger than the landing page for Application 3, then the user interfaces are optionally moved farther apart to provide additional space for the larger landing page. If the user interface that will be displayed for Application 4 is the same size and shape as the user interface that will be displayed for Application 3, then outline 516 optionally does not change size and/or shape.
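A compact way to picture how the preview tracks the top search result is the sketch below; the per-application landing-page widths are hypothetical, since the disclosure only states that the outline and spacing depend on the size of the user interface to be launched.

```swift
import Foundation

// State of the pre-launch preview while text is being typed into the launcher: either an
// outline of a given width is shown (and the neighbors are pushed apart accordingly), or
// everything returns to its original layout.
enum LaunchPreview: Equatable {
    case none
    case outline(width: Double)
}

// Hypothetical landing-page widths per application, for illustration only.
let landingPageWidths = ["Application 3": 0.8, "Application 4": 1.2]

func preview(forTopResult topResult: String?) -> LaunchPreview {
    guard let app = topResult, let width = landingPageWidths[app] else {
        return .none     // no application result: restore the original positions
    }
    return .outline(width: width)
}

print(preview(forTopResult: "Application 3"))   // outline(width: 0.8)
print(preview(forTopResult: "Application 4"))   // outline(width: 1.2) — a wider gap is needed
print(preview(forTopResult: nil))               // none — the windows move back together
```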
In some embodiments, three-dimensional environment 500 may exhibit similar behavior based on the current focus within the search results of application launcher element 514 (e.g., as opposed to the top search result described above). For example, if the search results of application launcher element 514 include multiple matches, the user is able to move the focus within the search results. If the current focus is a document, then in response to a confirmation user input, an application associated with the document will be launched and user interfaces 506-1 and 506-2 move apart and/or outline 516 is displayed, as shown in
In some embodiments, the location at which the application will be launched (e.g., the location at which the user interface associated with the application to be launched will be displayed) is at least partially based on the location of the user's gaze at the time when the user input to display application launcher element 514 is received and/or is based on the position of application launcher element 514 (e.g., which optionally is based on the location of the user's gaze at the time when the user input to display application launcher element 514 is received). For example, in
Similarly, if gaze 512 is looking at a respective location to the left or to the right of the left-most and right-most user interface in the container, application launcher element 514 (e.g., and optionally outline 516) is displayed at the respective location to the left or right of the container, respectively, and an application that is launched via application launcher element 514 is optionally launched at the respective location to the left or right of the container, respectively (e.g., and added to the container at the left-most or right-most position, respectively).
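As a non-authoritative illustration of the gaze-based placement described above, the sketch below maps a gaze location to an insertion position relative to a container's members; the Placement enumeration and the placement(forGazeX:memberPositions:) function are hypothetical names introduced only for this example, and positions are simplified to one dimension.

```swift
// Hypothetical sketch: choosing where the launcher (and any launched user
// interface) appears based on gaze relative to an existing container.
enum Placement {
    case between(leftIndex: Int, rightIndex: Int)  // inserted between two members
    case leftOfContainer                           // added at the left-most position
    case rightOfContainer                          // added at the right-most position
}

/// Chooses an insertion placement from the gaze x-coordinate and the x positions of
/// the container's members (assumed sorted from left to right).
func placement(forGazeX gazeX: Double, memberPositions: [Double]) -> Placement {
    guard let first = memberPositions.first, let last = memberPositions.last else {
        return .leftOfContainer  // empty container: the placement is arbitrary here
    }
    if gazeX < first { return .leftOfContainer }
    if gazeX > last { return .rightOfContainer }
    // Gaze falls within the container: insert between the two members it lies between.
    for i in 0..<(memberPositions.count - 1) where gazeX <= memberPositions[i + 1] {
        return .between(leftIndex: i, rightIndex: i + 1)
    }
    return .rightOfContainer
}

// Example: a container whose two members are centered at x = -1 and x = 1.
print(placement(forGazeX: -2.0, memberPositions: [-1.0, 1.0]))  // leftOfContainer
print(placement(forGazeX: 0.2, memberPositions: [-1.0, 1.0]))   // between(leftIndex: 0, rightIndex: 1)
print(placement(forGazeX: 3.0, memberPositions: [-1.0, 1.0]))   // rightOfContainer
```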
In some embodiments, if the top search result is a document and/or if the user moves a current focus to a document in the list of search results (e.g., as opposed to an application, as described above), then the behavior described above (e.g., with respect to displaying an outline, moving existing user interfaces, and/or launching a user interface in response to a user input confirming the launch) is not performed. In some embodiments, if the top search result is a document and/or if the user moves a current focus to a document in the list of search results, then the behavior described above is performed and the user interface to be launched is a user interface of the application used to open and/or view the respective document. For example, if the document is a text document, then the application that will be launched is a text editing and/or viewer application.
In
In some embodiments, in response to receiving a user input to launch Application 3, outline 516 ceases to be displayed and user interface 506-3 associated with Application 3 is displayed at a location based on the location of outline 516 and added to the container that includes user interface 506-1 and user interface 506-2.
As shown in
In some embodiments, while application launcher element 514 is displayed, user interface 506-1 and user interface 506-2, which are shown partially overlapping with application launcher element 514 in
Although the description of
Additionally, as shown in
In some such embodiments, a user input is received to launch Application 3 (e.g., in a similar manner as described above). In some such embodiments, in response to receiving a user input to launch Application 3, outline 516 ceases to be displayed and user interface 506-3 associated with Application 3 replaces user interface 506-1 and is added to the container that now includes user interface 506-2 and user interface 506-3. For example, as shown in
In some embodiments, replacing an application in a container using application launcher element 514 as described with respect to
Thus, as shown in
In
Thus, in some embodiments, performing a selection input on an affordance between user interfaces in the same container causes one or more user interfaces to be separated from the container. In some embodiments, if the user selects a vertical affordance that is in the center of four user interfaces in the same container, then the two user interfaces on each side of the selected affordance are disassociated from each other and optionally two containers are generated, each including the two user interfaces to the left and right of the selected affordance. For example, the container that includes the four user interfaces is optionally dissolved and two new containers are generated. Alternatively, two of the user interfaces remain in the original container while a new container is generated for the other two user interfaces.
In some embodiments, a container cannot have only a single user interface and thus, in response to selecting a vertical affordance between the left-most or right-most user interface and the next adjacent user interface, the left-most or right-most user interface (as the case may be) is removed from the container and is no longer a member of any container. In some embodiments, if the container includes only two user interfaces, then in response to selecting the vertical affordance between the two user interfaces, the container that includes the two user interfaces is dissolved and neither user interface is a member of any container.
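A minimal sketch of the container-splitting behavior described above is shown below, assuming containers are modeled as ordered lists of user interface identifiers; groups that would contain only a single user interface are not kept as containers, consistent with the rule that a container cannot have only one member. The Container type, the split function, and the identifiers are illustrative and not part of the disclosure.

```swift
// Hypothetical sketch: splitting a container at the vertical affordance
// between two adjacent members.
struct Container {
    var members: [String]  // user interface identifiers, ordered left to right
}

/// Splits `container` at the affordance between members `index` and `index + 1`.
/// Groups with fewer than two members are not kept as containers; their user
/// interfaces are returned as free-floating (no longer members of any container).
func split(_ container: Container, atAffordanceAfter index: Int)
        -> (containers: [Container], freed: [String]) {
    let left = Array(container.members[...index])
    let right = Array(container.members[(index + 1)...])
    var containers: [Container] = []
    var freed: [String] = []
    for group in [left, right] {
        if group.count >= 2 {
            containers.append(Container(members: group))
        } else {
            freed.append(contentsOf: group)
        }
    }
    return (containers, freed)
}

// Selecting the center affordance of a four-member container yields two two-member containers.
print(split(Container(members: ["A", "B", "C", "D"]), atAffordanceAfter: 1))
// Selecting the affordance next to an end member removes that member from any container.
print(split(Container(members: ["A", "B", "C"]), atAffordanceAfter: 0))
```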
In some embodiments, after a user interface is removed from a container, the respective user interface is able to be manipulated without automatically manipulating the other user interfaces in the same container. For example, a user is able to interact with affordance 608-1 to move user interface 606-1 without causing user interface 606-2 and/or user interface 606-3 to also move with user interface 606-1. For example, in
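The contrast described above (a removed user interface moves independently, whereas user interfaces that remain grouped move together) could be modeled as in the following hedged sketch; the move function, the coordinate representation, and the positions are illustrative assumptions only.

```swift
// Hypothetical sketch: moving a user interface that belongs to a container moves
// every member of that container, while a removed user interface moves on its own.
struct Point { var x: Double; var y: Double }

/// Moves `target` by `offset`; if `target` still belongs to a container, every member
/// of that container is moved by the same offset so the group stays together.
func move(_ target: String,
          by offset: Point,
          containers: [[String]],
          positions: inout [String: Point]) {
    let group = containers.first(where: { $0.contains(target) }) ?? [target]
    for id in group {
        if let p = positions[id] {
            positions[id] = Point(x: p.x + offset.x, y: p.y + offset.y)
        }
    }
}

var positions = ["606-1": Point(x: -2, y: 0),
                 "606-2": Point(x: 0, y: 0),
                 "606-3": Point(x: 2, y: 0)]
// User interface 606-1 has been removed from the container, so only it moves.
move("606-1", by: Point(x: 1, y: 0), containers: [["606-2", "606-3"]], positions: &positions)
print(positions["606-1"]!.x, positions["606-2"]!.x)  // -1.0 0.0
```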
In some embodiments, a user need not first select affordance 610-1 to remove user interface 606-1 from the container that included user interfaces 606-1, 606-2, and 606-3, as described above with respect to
In
In
In some embodiments, if user interface 606-1 is farther away than user interface 606-3 (e.g., at a farther z-depth), then in accordance with a determination that user interface 606-3 is obscuring a threshold amount of user interface 606-1 (e.g., because user interface 606-3 is in front of user interface 606-1), user interface 606-3 is made at least partially transparent (e.g., 10% transparency, 30% transparency, 50% transparency, 90% transparency, etc.), for example, so that user interface 606-1 is at least partially visible. In some embodiments, making user interface 606-1 at least partially visible in this way (by modifying the transparency of user interface 606-3) allows the user to see the user interface that is replacing the existing user interface.
In some embodiments, the transparency feature described above is only performed if the user interface that is obscuring the user interface being obscured is a larger size than the user interface being obscured (e.g., larger in both width and height dimensions, larger in surface area, etc.). In some embodiments, the transparency feature described above is performed without regard to the relative sizes of the user interfaces.
In some embodiments, the transparency feature described above is only performed if the user interface being moved by hand 601 is within a threshold distance of the location of an existing user interface such that the user interface being moved will replace the existing user interface upon release of the selection input. For example, if user interface 606-1 is moved in three-dimensional environment 600 such that it overlaps with a portion of user interface 606-3, but is not within the threshold distance of user interface 606-3 (e.g., the threshold distance from the center of user interface 606-3), then user interface 606-1 is optionally not made at least partially transparent. If, on the other hand, user interface 606-1 is moved to overlap more of user interface 606-3 and reaches the threshold distance from the location occupied by user interface 606-3 (e.g., the center of user interface 606-3), then user interface 606-1 is optionally made at least partially transparent (e.g., and outline 612 is displayed). In some embodiments, the transparency feature described above is performed without regard to whether an existing user interface will be replaced by the user interface being moved.
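One possible way to combine the conditions described above (a threshold amount of obscuring and, optionally, being within the replacement distance) into a single transparency decision is sketched below; the TransparencyPolicy and DragState types, the thresholds, and the 50% transparency value are assumptions for illustration only.

```swift
// Hypothetical sketch: deciding whether to make a user interface partially
// transparent while another user interface is being dragged over it.
struct TransparencyPolicy {
    var obscuredFractionThreshold: Double = 0.5   // e.g., half of the user interface is covered
    var requireReplacementDistance: Bool = true   // only when a replacement would occur on release
    var transparency: Double = 0.5                // e.g., 50% transparency when applied
}

struct DragState {
    var obscuredFraction: Double         // how much of the obscured user interface is covered (0...1)
    var withinReplacementDistance: Bool  // dragged user interface is within the threshold distance
}

/// Returns the transparency to apply, or nil to leave the transparency unchanged.
func transparencyToApply(for state: DragState,
                         policy: TransparencyPolicy = TransparencyPolicy()) -> Double? {
    guard state.obscuredFraction >= policy.obscuredFractionThreshold else { return nil }
    if policy.requireReplacementDistance && !state.withinReplacementDistance { return nil }
    return policy.transparency
}

// Overlapping but not close enough to replace: transparency unchanged.
print(transparencyToApply(for: DragState(obscuredFraction: 0.6,
                                         withinReplacementDistance: false)) as Any)  // nil
// Covering enough and within the replacement distance: transparency applied.
print(transparencyToApply(for: DragState(obscuredFraction: 0.6,
                                         withinReplacementDistance: true)) as Any)   // Optional(0.5)
```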
In
Thus, as described above, a user is able to replace an existing user interface in a container with another user interface by moving the other user interface to within a threshold distance of the location occupied by the existing user interface, such as described in
In some embodiments, a user is able to add a user interface to a container (e.g., to either end or between user interfaces) via a user input (or sequence of user inputs) that does not involve moving and/or dragging a user interface to a respective location and/or does not include a “setup” stage and a “confirmation” or “execution” stage. In such embodiments, the gaze of the user can be used to determine the location that the user interface will be inserted. In some embodiments, an outline can be displayed at the respective location. For example, in
It is understood that although the figures illustrate user interfaces in a container aligned horizontally, user interfaces in a container can be arranged in any orientation. For example, user interfaces can be oriented vertically, horizontally, or in a grid (e.g., 2×2 grid, 3×3 grid, 2×4 grid, etc.). In such embodiments, user interfaces can be added or inserted anywhere within the container (e.g., above, below, to the left or right, etc.).
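Because members of a container can be arranged horizontally, vertically, or in a grid, a member's position can be derived from its index in the container's ordering. The following sketch shows one hypothetical mapping from index to a row/column slot; the Arrangement enumeration and the slot function are illustrative only.

```swift
// Hypothetical sketch: mapping a member's index in a container to a slot in a
// horizontal row, a vertical column, or a grid.
enum Arrangement {
    case horizontal
    case vertical
    case grid(columns: Int)
}

/// Returns the (row, column) slot for the member at `index`.
func slot(forMemberAt index: Int, arrangement: Arrangement) -> (row: Int, column: Int) {
    switch arrangement {
    case .horizontal:
        return (0, index)
    case .vertical:
        return (index, 0)
    case .grid(let columns):
        return (index / columns, index % columns)
    }
}

// Example: the third member (index 2) of a container in different arrangements.
print(slot(forMemberAt: 2, arrangement: .horizontal))        // (row: 0, column: 2)
print(slot(forMemberAt: 2, arrangement: .vertical))          // (row: 2, column: 0)
print(slot(forMemberAt: 2, arrangement: .grid(columns: 2)))  // (row: 1, column: 0)
```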
In some embodiments, an electronic device (e.g., a mobile device (e.g., a tablet, a smartphone, a media player, or a wearable device), a computer, etc. such as device 100 and/or device 200) in communication with a display generation component (e.g., a display integrated with the electronic device (e.g., a touch screen display, a head mounted display, etc.) and/or an external display such as a monitor, projector, television, etc.) and one or more input devices (e.g., a touch screen, mouse (e.g., external), trackpad (optionally integrated or external), touchpad (optionally integrated or external), remote control device (e.g., external), another mobile device (e.g., separate from the electronic device), a handheld device (e.g., external), a controller (e.g., external), a camera (e.g., visible light camera), a depth sensor and/or a motion sensor (e.g., a hand tracking sensor, a hand motion sensor), etc.) presents (702), via the display generation component, a computer-generated environment, including a first user interface at a first location in the computer-generated environment, such as three-dimensional environment 300 including user interface 306-1 in
In some embodiments, while presenting the computer-generated environment, the electronic device receives (704), via the one or more input devices, a request to display a second user interface in the computer-generated environment, such as a user input moving user interface 306-2 in
In some embodiments, in accordance with the determination that the request satisfies the one or more criteria, the electronic device displays the second user interface at the second location, such as displaying user interface 306-2 at a location adjacent to user interface 306-1 in
In some embodiments, in accordance with a determination that the request does not satisfy the one or more criteria, the electronic device displays the second user interface at a respective location in the computer-generated environment other than the second location, such as if user interface 306-2 were displayed at a location other than the location associated with outline 310 in
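The criteria-dependent branch described above can be summarized in the following minimal sketch, in which a request that satisfies the one or more criteria results in the second user interface being added to the existing container, and a request that does not satisfy the criteria results in the second user interface being displayed outside the container; the Environment type, the handleDisplayRequest function, and the identifier 306-4 are hypothetical.

```swift
// Hypothetical sketch of the criteria-dependent branch: add the new user
// interface to the container, or display it elsewhere without adding it.
struct Environment {
    var containers: [[String]] = []        // each container is an ordered list of user interface ids
    var freeUserInterfaces: [String] = []  // user interfaces not in any container
}

func handleDisplayRequest(_ newUI: String,
                          satisfiesCriteria: Bool,
                          containerIndex: Int,
                          in environment: inout Environment) {
    if satisfiesCriteria {
        // Display at the second location (adjacent to the first user interface)
        // and add the second user interface to the existing container.
        environment.containers[containerIndex].append(newUI)
    } else {
        // Display at a respective location other than the second location,
        // without adding the second user interface to the container.
        environment.freeUserInterfaces.append(newUI)
    }
}

var env = Environment(containers: [["306-1"]])
handleDisplayRequest("306-2", satisfiesCriteria: true, containerIndex: 0, in: &env)
handleDisplayRequest("306-4", satisfiesCriteria: false, containerIndex: 0, in: &env)
print(env.containers)          // [["306-1", "306-2"]]
print(env.freeUserInterfaces)  // ["306-4"]
```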
In some embodiments, the one or more criteria includes a requirement that the request includes a user input for displaying the second user interface that does not include a movement component moving the second user interface, such as voice command 414 in
In some embodiments, before adding the second user interface to the container that includes the first user interface, the electronic device displays an animation of the second user interface moving to the second location, such as displaying an animation of user interface 306-2 moving into the location indicated by outline 310 in
In some embodiments, before receiving the request to display the second user interface in the computer-generated environment, the computer-generated environment included a third user interface, and the container that includes the first user interface included the third user interface, such as three-dimensional environment 500 including user interfaces 506-1 and user interface 506-2 in
In some embodiments, in accordance with the determination that the request satisfies the one or more criteria, the electronic device moves at least one of the first user interface or the third user interface to provide space for the second user interface, such as in
In some embodiments, after adding the second user interface to the container that includes the first user interface, the electronic device receives a user input corresponding to a request to move one of the first user interface or the second user interface, such as if the device receives a user input to move user interface 306-1 or 306-2 in
In some embodiments, the visual indication is displayed while receiving the request to display the second user interface in the computer-generated environment, such as in
In some embodiments, receiving the request to display the second user interface in the computer-generated environment includes detecting a predetermined gesture performed by a hand of the user of the electronic device, such as detecting a pinch or pointing gesture by hand 301 in
In some embodiments, an electronic device (e.g., a mobile device (e.g., a tablet, a smartphone, a media player, or a wearable device), a computer, etc. such as device 100 and/or device 200) in communication with a display generation component (e.g., a display integrated with the electronic device (e.g., a touch screen display, a head mounted display, etc.) and/or an external display such as a monitor, projector, television, etc.) and one or more input devices (e.g., a touch screen, mouse (e.g., external), trackpad (optionally integrated or external), touchpad (optionally integrated or external), remote control device (e.g., external), another mobile device (e.g., separate from the electronic device), a handheld device (e.g., external), a controller (e.g., external), a camera (e.g., visible light camera), a depth sensor and/or a motion sensor (e.g., a hand tracking sensor, a hand motion sensor), etc.) presents (802), via the display generation component, a computer-generated environment, including a first user interface at a first location in the computer-generated environment, such as three-dimensional environment 400 in
In some embodiments, while presenting the computer-generated environment, the electronic device receives (804), via the one or more input devices, a user input corresponding to a request to display a second user interface in the computer-generated environment, such as voice command 414 requesting the device open App 3 in
In some embodiments, in accordance with a determination that the request does not satisfy the one or more first criteria (812), the electronic device maintains (814) display of the first user interface at the first location, and displays (816) the second user interface at a second location in the computer-generated environment, such as if user interface 406-3 were added to the container at a new location or user interface 406-3 is displayed in three-dimensional environment 400 outside of the container in
In some embodiments, the second location is a location at which the gaze of the user was directed when the user input was received, such as the location between user interfaces 406-1 and 406-2 in
In some embodiments, before receiving the user input, the computer-generated environment includes a first container that includes the first user interface, such as the container that includes user interface 406-1 and user interface 406-2 in
In some embodiments, in accordance with the determination that the request satisfies the one or more first criteria, the electronic device removes the first user interface from the first container, and adds the second user interface to the first container, such as if user interface 406-3 is added to the container and the first user interface is removed from the container in
In some embodiments, the three-dimensional environment includes a third user interface, and the first container includes the third user interface, such as in
In some embodiments, before displaying the second user interface at the location between the first user interface and the third user interface, the electronic device moves at least one of the first user interface or the third user interface to provide space for the second user interface, such as user interface 406-1 moving to the right and user interface 406-2 moving to the left in
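The gaze-dependent outcomes described above (replacing an existing member of the container versus inserting between two members that move apart) could be modeled as in the following sketch; the GazeTarget enumeration, the insert function, and the identifier 406-4 are illustrative assumptions rather than the disclosed implementation.

```swift
// Hypothetical sketch: a new user interface either replaces the member the
// gaze is directed at, or is inserted between two members the gaze is between.
enum GazeTarget {
    case member(index: Int)                 // gaze on an existing member of the container
    case gapBetween(left: Int, right: Int)  // gaze between two adjacent members
}

func insert(_ newUI: String, at target: GazeTarget, into container: inout [String]) {
    switch target {
    case .member(let index):
        // Remove the existing member from the container and add the new
        // user interface in its place.
        container[index] = newUI
    case .gapBetween(_, let right):
        // The existing members move apart to provide space, and the new
        // user interface is displayed between them.
        container.insert(newUI, at: right)
    }
}

var container = ["406-1", "406-2"]
insert("406-3", at: .gapBetween(left: 0, right: 1), into: &container)
print(container)  // ["406-1", "406-3", "406-2"]
insert("406-4", at: .member(index: 0), into: &container)
print(container)  // ["406-4", "406-3", "406-2"]
```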
In some embodiments, an electronic device (e.g., a mobile device (e.g., a tablet, a smartphone, a media player, or a wearable device), a computer, etc. such as device 100 and/or device 200) in communication with a display generation component (e.g., a display integrated with the electronic device (e.g., a touch screen display, a head mounted display, etc.) and/or an external display such as a monitor, projector, television, etc.) and one or more input devices (e.g., a touch screen, mouse (e.g., external), trackpad (optionally integrated or external), touchpad (optionally integrated or external), remote control device (e.g., external), another mobile device (e.g., separate from the electronic device), a handheld device (e.g., external), a controller (e.g., external), a camera (e.g., visible light camera), a depth sensor and/or a motion sensor (e.g., a hand tracking sensor, a hand motion sensor), etc.) presents (902) a computer-generated environment, including a first user interface at a first location in the computer-generated environment, such as three-dimensional environment 600 in
In some embodiments, while presenting the computer-generated environment, the electronic device receives (904), via the one or more input devices, a request to display a second user interface in the computer-generated environment, including an input moving the second user interface to a respective location in the computer-generated environment, such as in
In some embodiments, in accordance with a determination that the request satisfies one or more criteria, the electronic device modifies (906) a transparency of a given user interface, such as user interface 606-1 obscuring a threshold amount of user interface 606-3 in
In some embodiments, the one or more criteria includes a criterion that is satisfied when the second user interface obscures at least a threshold amount of the first user interface, such as in
In some embodiments, the one or more criteria includes a second criterion that is satisfied when a size of the second user interface is less than a size of the first user interface by a threshold amount, such as if user interface 606-3 were smaller than user interface 606-1 in
In some embodiments, the one or more criteria includes a criterion that is satisfied when the first user interface obscures at least a threshold amount of the second user interface, such as if user interface 606-1 were at a farther z-depth than user interface 606-3 such that user interface 606-1 were obscured by user interface 606-3 by a threshold amount in
In some embodiments, the one or more criteria includes a criterion that is satisfied when at least a threshold amount of the given user interface is obscured for a threshold amount of time, such as if user interface 606-1 is hovered at location in
In some embodiments, the first user interface is a user interface of a first application, such as user interface 606-3 being a user interface of a particular application in
In some embodiments, in accordance with a determination that the request does not satisfy the one or more criteria, the electronic device forgoes modifying the transparency of the given user interface, such as if user interface 606-1 were not made partially transparent in
In some embodiments, after modifying the transparency of the given user interface and subsequently restoring the transparency value that the given user interface had before the request was received, and while displaying the given user interface with an unmodified transparency value, the electronic device detects a termination of the request to display the second user interface in the computer-generated environment, such as if the device detects a termination of the selection input after hand 601 moves user interface 606-1 away from user interface 606-3 in
In some embodiments, after modifying the transparency of the given user interface and while displaying the given user interface with a modified transparency value, the electronic device detects a termination of the request to display the second user interface in the computer-generated environment, such as the release of the pointing or pinching gesture in
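One simple way to support restoring a user interface's transparency after it has been temporarily modified, as described above, is to remember the prior value when the modification is applied; the UserInterface class below is a hypothetical sketch of that bookkeeping and is not taken from the disclosure.

```swift
// Hypothetical sketch: remembering a user interface's transparency so that it can be
// modified while another user interface is dragged over it and restored afterwards.
final class UserInterface {
    let id: String
    var transparency: Double  // 0 = fully opaque, 1 = fully transparent
    private var savedTransparency: Double?

    init(id: String, transparency: Double = 0) {
        self.id = id
        self.transparency = transparency
    }

    /// Applies a temporary transparency, saving the current value once.
    func applyTemporaryTransparency(_ value: Double) {
        if savedTransparency == nil { savedTransparency = transparency }
        transparency = value
    }

    /// Restores the transparency the user interface had before the modification.
    func restoreTransparency() {
        if let saved = savedTransparency {
            transparency = saved
            savedTransparency = nil
        }
    }
}

let existing = UserInterface(id: "606-3")
existing.applyTemporaryTransparency(0.5)  // a dragged user interface obscures enough of it
print(existing.transparency)              // 0.5
existing.restoreTransparency()            // the drag moved away or was terminated
print(existing.transparency)              // 0.0
```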
It should be understood that, as used herein, presenting an environment includes presenting a real-world environment, presenting a representation of a real-world environment (e.g., displaying via a display generation component), and/or presenting a virtual environment (e.g., displaying via a display generation component). Virtual content (e.g., user interfaces, content items, etc.) can also be presented with these environments (e.g., displayed via a display generation component). It is understood that as used herein the terms “presenting”/“presented” and “displaying”/“displayed” are often used interchangeably, but depending on the context it is understood that when a real-world environment is visible to a user without being generated by the display generation component, such a real-world environment is “presented” to the user (e.g., allowed to be viewable, for example, via a transparent or translucent material) and not necessarily technically “displayed” to the user.

Additionally or alternatively, as used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms, unless the context clearly indicates otherwise. The term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. The terms “includes,” “including,” “comprises,” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.

Further, although the above description uses terms “first,” “second,” etc. to describe various elements, these elements should not be limited by the terms. These terms are only used to distinguish one element from another. For example, a respective user interface could be referred to as a “first” or “second” user interface, without implying that the respective user interface has different characteristics based merely on the fact that the respective user interface is referred to as a “first” or “second” user interface. On the other hand, a user interface referred to as a “first” user interface and a user interface referred to as a “second” user interface are both user interfaces, but are not the same user interface, unless explicitly described as such.
Additionally or alternatively, as described herein, the term “if,” optionally, means “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” is, optionally, construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context. The terminology used in the description of the various described embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting.
The foregoing description, for purposes of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best use the invention and various described embodiments with various modifications as are suited to the particular use contemplated.
This application is a continuation of U.S. application Ser. No. 18/260,022, filed Jun. 29, 2023, which is a National Phase application under 35 U.S.C. § 371 of International Application No. PCT/US2021/065240, filed Dec. 27, 2021, which claims the priority benefit of U.S. Provisional Application No. 63/132,956, filed Dec. 31, 2020, the contents of which are hereby incorporated by reference in their entireties for all intended purposes.
Non-Final Office Action received for U.S. Appl. No. 18/336,770, mailed on Jun. 5, 2024, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 18/473,196, mailed on Aug. 16, 2024, 21 pages. |
Notice of Allowance received for U.S. Appl. No. 18/154,757, mailed on Aug. 26, 2024, 12 pages. |
Notice of Allowance received for U.S. Appl. No. 14/531,874, mailed on Mar. 28, 2017, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 15/644,639, mailed on Jan. 16, 2020, 16 pages. |
Notice of Allowance received for U.S. Appl. No. 16/881,599, mailed on Dec. 17, 2021, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 17/123,000, mailed on May 27, 2022, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 17/123,000, mailed on Sep. 19, 2022, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 17/448,875, mailed on Apr. 17, 2024, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 17/448,875, mailed on Jul. 12, 2024, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 17/448,876, mailed on Apr. 7, 2022, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 17/448,876, mailed on Jul. 20, 2022, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 17/478,593, mailed on Aug. 31, 2022, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 17/479,791, mailed on Mar. 13, 2023, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 17/479,791, mailed on Nov. 17, 2022, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 17/580,495, mailed on Jun. 6, 2023, 6 pages. |
Notice of Allowance received for U.S. Appl. No. 17/580,495, mailed on Nov. 30, 2022, 12 pages. |
Notice of Allowance received for U.S. Appl. No. 17/659,147, mailed on Jan. 26, 2024, 13 pages. |
Notice of Allowance received for U.S. Appl. No. 17/659,147, mailed on May 29, 2024, 13 pages. |
Notice of Allowance received for U.S. Appl. No. 17/816,314, mailed on Jan. 4, 2024, 6 pages. |
Notice of Allowance received for U.S. Appl. No. 17/932,655, mailed on Jan. 24, 2024, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 17/932,655, mailed on Sep. 29, 2023, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 17/932,999, mailed on Sep. 12, 2024, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 17/935,095, mailed on Jul. 3, 2024, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 18/154,697, mailed on Aug. 6, 2024, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 18/154,757, mailed on Jan. 23, 2024, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 18/154,757, mailed on May 10, 2024, 12 pages. |
Notice of Allowance received for U.S. Appl. No. 18/421,675, mailed on Apr. 11, 2024, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 18/421,675, mailed on Jul. 31, 2024, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 18/421,827, mailed on Aug. 14, 2024, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 18/423,187, mailed on Jun. 5, 2024, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 18/463,739, mailed on Feb. 1, 2024, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 18/463,739, mailed on Jun. 17, 2024, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 18/463,739, mailed on Oct. 30, 2023, 11 pages. |
Notice of Allowance received for U.S. Appl. No. 18/465,098, mailed on Jun. 20, 2024, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 18/465,098, mailed on Mar. 4, 2024, 6 pages. |
Notice of Allowance received for U.S. Appl. No. 18/465,098, mailed on Nov. 17, 2023, 8 pages. |
Restriction Requirement received for U.S. Appl. No. 17/932,999, mailed on Oct. 3, 2023, 6 pages. |
Search Report received for Chinese Patent Application No. 202310873465.7, mailed on Feb. 1, 2024, 5 pages (2 pages of English Translation and 3 pages of Official Copy). |
Supplemental Notice of Allowance received for U.S. Appl. No. 14/531,874, mailed on Jul. 26, 2017, 5 pages. |
Bhowmick Shimmila, “Explorations on Body-Gesture Based Object Selection on HMD Based VR Interfaces for Dense and Occluded Dense Virtual Environments”, Report: State of the Art Seminar, Department of Design Indian Institute of Technology, Guwahati, Nov. 2018, 25 pages. |
Bohn Dieter, “Rebooting WebOS: How LG Rethought The Smart TV”, The Verge, Available online at: <http://www.theverge.com/2014/1/6/5279220/rebooting-webos-how-Ig-rethought-the-smart-tv>, [Retrieved Aug. 26, 2019], Jan. 6, 2014, 5 pages. |
Bolt et al., “Two-Handed Gesture in Multi-Modal Natural Dialog”, Uist '92, 5th Annual Symposium on User Interface Software And Technology. Proceedings Of the ACM Symposium on User Interface Software And Technology, Monterey, Nov. 15-18, 1992, pp. 7-14. |
Brennan Dominic, “4 Virtual Reality Desktops for Vive, Rift, and Windows VR Compared”, [online]. Road to VR, Available online at: <https://www.roadtovr.com/virtual-reality-desktop-compared-oculus-rift-htc-vive/>, [retrieved on Jun. 29, 2023], Jan. 3, 2018, 4 pages. |
Camalich Sergio, “CSS Buttons with Pseudo-elements”, Available online at: <https://tympanus.net/codrops/2012/01/11/css-buttons-with-pseudo-elements/>, [retrieved on Jul. 12, 2017], Jan. 11, 2012, 8 pages. |
Chatterjee et al., “Gaze+Gesture: Expressive, Precise and Targeted Free-Space Interactions”, ICMI '15, Nov. 9-13, 2015, 8 pages. |
Fatima et al., “Eye Movement Based Human Computer Interaction”, 3rd International Conference On Recent Advances In Information Technology (RAIT), Mar. 3, 2016, pp. 489-494. |
Grey Melissa, “Comcast's New X2 Platform Moves your DVR Recordings from the Box to the Cloud”, Engadget, Available online at: <http://www.engadget.com/2013/06/11/comcast-x2-platform/>, Jun. 11, 2013, 15 pages. |
Lin et al., “Towards Naturally Grabbing and Moving Objects in VR”, IS&T International Symposium on Electronic Imaging and The Engineering Reality of Virtual Reality, 2016, 6 pages. |
Mcgill et al., “Expanding The Bounds Of Seated Virtual Workspaces”, University of Glasgow, Available online at: <https://core.ac.uk/download/pdf/323988271.pdf>, [retrieved on Jun. 27, 2023], Jun. 5, 2020, 44 pages. |
Pfeuffer et al., “Gaze + Pinch Interaction in Virtual Reality”, In Proceedings of SUI '17, Brighton, United Kingdom, Oct. 16-17, 2017, pp. 99-108. |
Pfeuffer et al., “Gaze and Touch Interaction on Tablets”, UIST '16, Tokyo, Japan, ACM, Oct. 16-19, 2016, pp. 301-311. |
Corrected Notice of Allowability received for U.S. Appl. No. 17/932,999, mailed on Feb. 20, 2025, 2 pages. |
Corrected Notice of Allowability received for U.S. Appl. No. 17/932,999, mailed on Jan. 23, 2025, 9 pages. |
Corrected Notice of Allowability received for U.S. Appl. No. 18/174,337, mailed on Jan. 15, 2025, 2 pages. |
Extended European Search Report received for European Patent Application No. 24178730.8, mailed on Oct. 14, 2024, 8 pages. |
Extended European Search Report received for European Patent Application No. 24178752.2, mailed on Oct. 4, 2024, 8 pages. |
Extended European Search Report received for European Patent Application No. 24190323.6, mailed on Dec. 12, 2024, 9 pages. |
Final Office Action received for U.S. Appl. No. 17/580,495, mailed on Feb. 12, 2025, 29 pages. |
Final Office Action received for U.S. Appl. No. 18/157,040, mailed on Dec. 2, 2024, 25 pages. |
Final Office Action received for U.S. Patent Application No. 18/473, 196, mailed on Dec. 6, 2024, 22 pages. |
International Search Report received for PCT Application No. PCT/US2024/032451, mailed on Nov. 15, 2024, 6 pages. |
International Search Report received for PCT Application No. PCT/US2024/032456, mailed on Nov. 14, 2024, 6 pages. |
Restriction Requirement received for U.S. Patent Application No. 18/473, 187, mailed on Dec. 30, 2024, 5 pages. |
International Search Report received for PCT Patent Application No. PCT/US2024/039190, mailed on Nov. 22, 2024, 5 pages. |
Non-Final Office Action received for U.S. Appl. No. 18/149,640, mailed on Jan. 15, 2025, 17 pages. |
Non-Final Office Action received for U.S. Appl. No. 18/375,280, mailed on Nov. 27, 2024, 17 pages. |
Non-Final Office Action received for U.S. Appl. No. 18/645,292, mailed on Feb. 21, 2025, 39 pages. |
Notice of Allowance received for U.S. Appl. No. 18/154,697, mailed on Dec. 3, 2024, 7 pages. |
Notice of Allowance received for U.S. Appl. No. 18/154,757, mailed on Jan. 23, 2025, 12 pages. |
Notice of Allowance received for U.S. Patent Application No. 18/174, 337, mailed on Jan. 2, 2025, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 18/336,770, mailed on Nov. 29, 2024, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 18/671,936, mailed on Jan. 15, 2025, 9 pages. |
European Search Report received for European Patent Application No. 22703771.0, mailed on Feb. 26, 2025, 4 pages. |
Extended European Search Report received for European Patent Application No. 24217335.9, mailed on Feb. 24, 2025, 8 pages. |
International Search Report received for PCT Patent Application No. PCT/US2023/060592, mailed on Jun. 14, 2023, 7 pages. |
Non-Final Office Action received for U.S. Appl. No. 18/153,943, mailed on Dec. 31, 2024, 33 pages. |
Non-Final Office Action received for U.S. Appl. No. 18/298,994, mailed on Mar. 7, 2025, 26 pages. |
Non-Final Office Action received for U.S. Appl. No. 18/473,196, mailed on Feb. 28, 2025, 20 pages. |
Non-Final Office Action received for U.S. Appl. No. 18/988,115, mailed on Feb. 24, 2025, 40 pages. |
Notice of Allowance received for U.S. Appl. No. 18/671,936, mailed on Mar. 5, 2025, 7 pages. |
Macmostvideo, “A Beginner's Guide to Selecting Items On Your Mac (#1566)”, Bibliographic Information, Jan. 4, 2018, Retrieved from <URL:https://www.youtube.com/watch?v=a6MDAuh7MOQ&ab_channel=macmostvideo/>, [retrieved on Feb. 19, 2025], Most relevant passage of the video is 00:10 to 00:30, 2 pages. |
| Number | Date | Country |
| --- | --- | --- |
| 20240086031 A1 | Mar 2024 | US |
| Number | Date | Country |
| --- | --- | --- |
| 63132956 | Dec 2020 | US |
|  | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | 18260022 |  | US |
| Child | 18515188 |  | US |