This relates generally to electronic devices with touch-sensitive surfaces, including but not limited to electronic devices with touch-sensitive surfaces that display user interfaces with wallpapers.
The use of touch-sensitive surfaces as input devices for computers and other electronic computing devices has increased significantly in recent years. Example touch-sensitive surfaces include touchpads and touch-screen displays. Such surfaces are widely used to manipulate user interfaces and objects therein on a display. Example user interface objects include digital images, video, text, icons, widgets, and control elements such as buttons and other graphics.
Example manipulations include adjusting the position and/or size of one or more user interface objects or activating buttons or opening files/applications represented by user interface objects, as well as associating metadata with one or more user interface objects or otherwise manipulating user interfaces. A user will, in some circumstances, need to perform such manipulations on user interface objects while configuring a system user interface (e.g., a wake screen user interface and/or home screen user interface), and/or in a file management program (e.g., Finder from Apple Inc. of Cupertino, California), an image management application (e.g., Aperture, iPhoto, Photos from Apple Inc. of Cupertino, California), a digital content (e.g., videos and music) management application (e.g., iTunes from Apple Inc. of Cupertino, California), a drawing application, a presentation application (e.g., Keynote from Apple Inc. of Cupertino, California), a word processing application (e.g., Pages from Apple Inc. of Cupertino, California), or a spreadsheet application (e.g., Numbers from Apple Inc. of Cupertino, California).
But existing methods for performing these manipulations are cumbersome and inefficient. For example, using a sequence of mouse-based inputs to select one or more user interface objects and perform one or more actions on the selected user interface objects is tedious and creates a significant cognitive burden on a user. In addition, these methods take longer than necessary, thereby wasting energy. This latter consideration is particularly important in battery-operated devices.
Accordingly, there is a need for electronic devices with faster, more efficient methods and interfaces for configuring and interacting with wallpapers and user interface elements overlaid on wallpapers. Such methods and interfaces optionally complement or replace conventional methods for configuring and interacting with wallpapers and user interface elements overlaid on wallpapers. Such methods and interfaces reduce the number, extent, and/or nature of the inputs from a user and produce a more efficient human-machine interface. For battery-operated devices, such methods and interfaces conserve power and increase the time between battery charges.
The above deficiencies and other problems associated with user interfaces for electronic devices (or more generally, computer systems) with touch-sensitive surfaces are reduced or eliminated by the disclosed devices. In some embodiments, the device is a desktop computer. In some embodiments, the device is portable (e.g., a notebook computer, tablet computer, or handheld device). In some embodiments, the device is a personal electronic device (e.g., a wearable electronic device, such as a watch). In some embodiments, the device has a touchpad. In some embodiments, the device has a touch-sensitive display (also known as a “touch screen” or “touch-screen display”). In some embodiments, the device has a graphical user interface (GUI), one or more processors, memory and one or more modules, programs or sets of instructions stored in the memory for performing multiple functions. In some embodiments, the user interacts with the GUI primarily through stylus and/or finger contacts and gestures on the touch-sensitive surface. In some embodiments, the functions optionally include image editing, drawing, presenting, word processing, spreadsheet making, game playing, telephoning, video conferencing, e-mailing, instant messaging, workout support, digital photographing, digital videoing, web browsing, digital music playing, note taking, and/or digital video playing. Executable instructions for performing these functions are, optionally, included in a non-transitory computer readable storage medium or other computer program product configured for execution by one or more processors.
In accordance with some embodiments, a method includes receiving a request to display a first user interface. The method includes, in response to receiving the request to display the first user interface, displaying the first user interface that includes a set of user interface elements displayed overlaid on a wallpaper, including: in accordance with a determination that the wallpaper has a first set of one or more wallpaper foreground elements, the wallpaper has a first wallpaper background with an appearance that was automatically selected based on the first set of one or more wallpaper foreground elements; and in accordance with a determination that the wallpaper has a second set of one or more wallpaper foreground elements that are different from the first set of one or more wallpaper foreground elements, the wallpaper has a second wallpaper background with an appearance that was automatically selected based on the second set of one or more wallpaper foreground elements.
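By way of illustration only, the following Swift sketch shows one way such a conditional background determination could be structured; the types and the color-averaging heuristic are hypothetical placeholders and are not drawn from any particular embodiment:

    struct ForegroundElement {
        let averageColor: (red: Double, green: Double, blue: Double)
    }

    struct BackgroundAppearance {
        let red: Double
        let green: Double
        let blue: Double
    }

    // Automatically selects a background appearance based on which set of
    // foreground elements the wallpaper has, so that different foreground
    // sets yield different backgrounds.
    func selectBackground(for foreground: [ForegroundElement]) -> BackgroundAppearance {
        guard !foreground.isEmpty else {
            return BackgroundAppearance(red: 0, green: 0, blue: 0)  // neutral fallback
        }
        // One possible heuristic: average the foreground colors, then darken
        // the result so overlaid user interface elements remain legible.
        let n = Double(foreground.count)
        let r = foreground.reduce(0.0) { $0 + $1.averageColor.red } / n
        let g = foreground.reduce(0.0) { $0 + $1.averageColor.green } / n
        let b = foreground.reduce(0.0) { $0 + $1.averageColor.blue } / n
        let dim = 0.6
        return BackgroundAppearance(red: r * dim, green: g * dim, blue: b * dim)
    }

The same function applied to two different foreground sets returns two different background appearances, mirroring the two branches of the determination above.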
In accordance with some embodiments, a method includes receiving a request to display a first user interface and in response to receiving the request to display the first user interface, displaying the first user interface with a set of user interface elements overlaid on a wallpaper that includes a plurality of selectable elements. The method further includes, while displaying the first user interface, detecting a first user input directed to the wallpaper while a first selectable element and a second selectable element of the plurality of selectable elements are concurrently displayed in the wallpaper. The method includes, in response to detecting the first user input directed to the wallpaper: in accordance with a determination that the first user input meets selection criteria and is directed to the first selectable element of the plurality of selectable elements, updating the wallpaper of the first user interface to include the first selectable element, without including the second selectable element of the plurality of selectable elements; and in accordance with a determination that the first user input meets the selection criteria and is directed to the second selectable element of the plurality of selectable elements, updating the wallpaper of the first user interface to include the second selectable element, without including the first selectable element of the plurality of selectable elements.
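A minimal sketch of resolving such an input against concurrently displayed selectable elements follows; the SelectableElement and Wallpaper types and the rectangular hit-test are hypothetical stand-ins for whatever selection logic a given embodiment uses:

    struct SelectableElement {
        let id: Int
        let minX: Double, minY: Double, width: Double, height: Double

        func contains(x: Double, y: Double) -> Bool {
            x >= minX && x <= minX + width &&
            y >= minY && y <= minY + height
        }
    }

    struct Wallpaper {
        var elements: [SelectableElement]

        // If the input meets the selection criteria and is directed to one of
        // the concurrently displayed elements, update the wallpaper to include
        // that element without including the others.
        mutating func handleSelection(atX x: Double, y: Double,
                                      meetsSelectionCriteria: Bool) {
            guard meetsSelectionCriteria,
                  let hit = elements.first(where: { $0.contains(x: x, y: y) })
            else { return }
            elements = [hit]
        }
    }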
In accordance with some embodiments, a method includes receiving a request to display a first user interface. The method includes, in response to receiving the request to display the first user interface, displaying the first user interface that includes a plurality of widgets, including: in accordance with a determination that the first user interface is displayed in a first orientation, displaying a first version of the first user interface that includes a first set of widgets; and in accordance with a determination that the first user interface is displayed in a second orientation that is different from the first orientation, displaying a second version of the first user interface that includes a second set of widgets, wherein the widgets included in the second set of widgets are different from the widgets included in the first set of widgets.
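For illustration, the orientation-dependent branching could be sketched as follows; the Orientation and Widget types and the particular widget sets are hypothetical examples, not taken from any embodiment:

    enum Orientation { case portrait, landscape }

    struct Widget { let name: String }

    // Returns a different set of widgets depending on the orientation in
    // which the first user interface is displayed.
    func widgets(for orientation: Orientation) -> [Widget] {
        switch orientation {
        case .portrait:
            return [Widget(name: "Calendar"), Widget(name: "Weather")]
        case .landscape:
            return [Widget(name: "Stocks"), Widget(name: "World Clock"),
                    Widget(name: "Battery")]
        }
    }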
In accordance with some embodiments, a method includes, while a computer system is in a low power state in which respective user-captured media with multiple frames is selected as a background for a wake screen, detecting an event that corresponds to a trigger to wake the computer system to a higher power state. The method further includes, in response to detecting the event, displaying, via one or more display generation components, a wake screen that includes device status information and the background, wherein displaying the background for the wake screen includes playing through a plurality of frames of the respective user-captured media.
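One possible shape of this behavior, sketched with hypothetical types (string frame identifiers stand in for image data, and the render callback stands in for the display generation components), is:

    enum PowerState { case low, higher }

    struct UserCapturedMedia {
        let frames: [String]  // frame identifiers stand in for image data
    }

    struct WakeScreenController {
        var powerState: PowerState = .low
        let background: UserCapturedMedia

        // On a wake trigger, transition to the higher power state and display
        // the wake screen; displaying the background includes playing through
        // a plurality of frames rather than showing a single still.
        mutating func handleWakeEvent(render: (String) -> Void) {
            powerState = .higher
            render("device status information")
            for frame in background.frames {
                render(frame)
            }
        }
    }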
In accordance with some embodiments, an electronic device (or computer system more generally) includes a display, a touch-sensitive surface, optionally one or more sensors to detect intensities of contacts with the touch-sensitive surface, optionally one or more tactile output generators, one or more processors, and memory storing one or more programs; the one or more programs are configured to be executed by the one or more processors and the one or more programs include instructions for performing or causing performance of the operations of any of the methods described herein. In accordance with some embodiments, a computer readable storage medium has stored therein instructions that, when executed by an electronic device with a display, a touch-sensitive surface, optionally one or more sensors to detect intensities of contacts with the touch-sensitive surface, and optionally one or more tactile output generators, cause the device to perform or cause performance of the operations of any of the methods described herein. In accordance with some embodiments, a graphical user interface on an electronic device with a display, a touch-sensitive surface, optionally one or more sensors to detect intensities of contacts with the touch-sensitive surface, optionally one or more tactile output generators, a memory, and one or more processors to execute one or more programs stored in the memory includes one or more of the elements displayed in any of the methods described herein, which are updated in response to inputs, as described in any of the methods described herein. In accordance with some embodiments, an electronic device includes: a display, a touch-sensitive surface, optionally one or more sensors to detect intensities of contacts with the touch-sensitive surface, and optionally one or more tactile output generators; and means for performing or causing performance of the operations of any of the methods described herein. In accordance with some embodiments, an information processing apparatus, for use in an electronic device with a display, a touch-sensitive surface, optionally one or more sensors to detect intensities of contacts with the touch-sensitive surface, and optionally one or more tactile output generators, includes means for performing or causing performance of the operations of any of the methods described herein.
Thus, electronic devices and other computer systems with displays, touch-sensitive surfaces, optionally one or more sensors to detect intensities of contacts with the touch-sensitive surface, optionally one or more tactile output generators, optionally one or more device orientation sensors, and optionally an audio system, are provided with improved methods and interfaces for configuring and displaying user interfaces with wallpapers, thereby increasing the effectiveness, efficiency, and user satisfaction with such devices. Such methods and interfaces may complement or replace conventional methods for configuring and displaying user interfaces with wallpapers.
For a better understanding of the various described embodiments, reference should be made to the Description of Embodiments below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.
FIGS. 4C1-4C2 illustrate an example state diagram of navigation between various user interfaces of the multifunction devices in accordance with some embodiments.
Many electronic devices have graphical user interfaces that are customizable for a user. Providing the user with additional options to configure various graphical user interfaces, including configuring different wallpapers that are displayed on various graphical user interfaces, improves the user experience and enables the user to have more control over their device. For example, automatically updating features and providing suggested wallpaper backgrounds for a wallpaper based on one or more inputs from the user provides real-time feedback to the user without requiring the user to manually configure the wallpaper.
The methods, devices, and GUIs described herein use haptic feedback to improve user interface interactions in multiple ways. For example, they make it easier to drag and drop objects and to indicate device orientation.
The processes described below enhance the operability of the devices and make the user-device interfaces more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) through various techniques, including by providing improved visual, audio, and/or tactile feedback to the user, reducing the number of inputs needed to perform an operation, providing additional control options without cluttering the user interface with additional displayed controls, performing an operation when a set of conditions has been met without requiring further user input, and/or additional techniques. These techniques also reduce power usage and improve battery life of the device by enabling the user to use the device more quickly and efficiently.
Below,
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the various described embodiments. However, it will be apparent to one of ordinary skill in the art that the various described embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
It will also be understood that, although the terms first, second, etc. are, in some instances, used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first contact could be termed a second contact, and, similarly, a second contact could be termed a first contact, without departing from the scope of the various described embodiments. The first contact and the second contact are both contacts, but they are not the same contact, unless the context clearly indicates otherwise.
The terminology used in the description of the various described embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “includes,” “including,” “comprises,” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used herein, the term “if” is, optionally, construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” is, optionally, construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context.
Embodiments of electronic devices (and computer systems more generally), user interfaces for such devices, and associated processes for using such devices are described. In some embodiments, the device is a portable communications device, such as a mobile telephone, that also contains other functions, such as PDA and/or music player functions. Example embodiments of portable multifunction devices include, without limitation, the iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, California. Other portable electronic devices, such as laptops or tablet computers with touch-sensitive surfaces (e.g., touch-screen displays and/or touchpads), are, optionally, used. It should also be understood that, in some embodiments, the device is not a portable communications device, but is a desktop computer with a touch-sensitive surface (e.g., a touch-screen display and/or a touchpad).
In the discussion that follows, a computer system in the form of an electronic device that includes a display and a touch-sensitive surface is described. It should be understood, however, that the electronic device optionally includes one or more other physical user-interface devices, such as a physical keyboard, a mouse and/or a joystick.
The device typically supports a variety of applications, such as one or more of the following: a note taking application, a drawing application, a presentation application, a word processing application, a website creation application, a disk authoring application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a workout support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
The various applications that are executed on the device optionally use at least one common physical user-interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface as well as corresponding information displayed on the device are, optionally, adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture (such as the touch-sensitive surface) of the device optionally supports the variety of applications with user interfaces that are intuitive and transparent to the user.
Attention is now directed toward embodiments of computer systems such as portable devices with touch-sensitive displays.
As used in the specification and claims, the term “tactile output” refers to physical displacement of a device relative to a previous position of the device, physical displacement of a component (e.g., a touch-sensitive surface) of a device relative to another component (e.g., housing) of the device, or displacement of the component relative to a center of mass of the device that will be detected by a user with the user's sense of touch. For example, in situations where the device or the component of the device is in contact with a surface of a user that is sensitive to touch (e.g., a finger, palm, or other part of a user's hand), the tactile output generated by the physical displacement will be interpreted by the user as a tactile sensation corresponding to a perceived change in physical characteristics of the device or the component of the device. For example, movement of a touch-sensitive surface (e.g., a touch-sensitive display or trackpad) is, optionally, interpreted by the user as a “down click” or “up click” of a physical actuator button. In some cases, a user will feel a tactile sensation such as a “down click” or “up click” even when there is no movement of a physical actuator button associated with the touch-sensitive surface that is physically pressed (e.g., displaced) by the user's movements. As another example, movement of the touch-sensitive surface is, optionally, interpreted or sensed by the user as “roughness” of the touch-sensitive surface, even when there is no change in smoothness of the touch-sensitive surface. While such interpretations of touch by a user will be subject to the individualized sensory perceptions of the user, there are many sensory perceptions of touch that are common to a large majority of users. Thus, when a tactile output is described as corresponding to a particular sensory perception of a user (e.g., an “up click,” a “down click,” “roughness”), unless otherwise stated, the generated tactile output corresponds to physical displacement of the device or a component thereof that will generate the described sensory perception for a typical (or average) user. Using tactile outputs to provide haptic feedback to a user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user to provide proper inputs and reducing user mistakes when operating/interacting with the device) which, additionally, reduces power usage and improves battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, a tactile output pattern specifies characteristics of a tactile output, such as the amplitude of the tactile output, the shape of a movement waveform of the tactile output, the frequency of the tactile output, and/or the duration of the tactile output.
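For illustration, the characteristics listed above could be grouped into a structure such as the following hypothetical sketch (the names and units are illustrative, not drawn from any embodiment):

    struct TactileOutputPattern {
        enum Waveform { case sine, square, sawtooth }

        let amplitude: Double   // normalized 0.0 ... 1.0
        let waveform: Waveform  // shape of the movement waveform
        let frequency: Double   // in Hz
        let duration: Double    // in seconds
    }

    // A brief, sharp pattern versus a longer, softer one:
    let click = TactileOutputPattern(amplitude: 1.0, waveform: .sine,
                                     frequency: 230, duration: 0.03)
    let buzz = TactileOutputPattern(amplitude: 0.4, waveform: .sine,
                                    frequency: 80, duration: 0.3)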
When tactile outputs with different tactile output patterns are generated by a device (e.g., via one or more tactile output generators that move a moveable mass to generate tactile outputs), the tactile outputs may invoke different haptic sensations in a user holding or touching the device. While the sensation of the user is based on the user's perception of the tactile output, most users will be able to identify changes in waveform, frequency, and amplitude of tactile outputs generated by the device. Thus, the waveform, frequency and amplitude can be adjusted to indicate to the user that different operations have been performed. As such, tactile outputs with tactile output patterns that are designed, selected, and/or engineered to simulate characteristics (e.g., size, material, weight, stiffness, smoothness, etc.); behaviors (e.g., oscillation, displacement, acceleration, rotation, expansion, etc.); and/or interactions (e.g., collision, adhesion, repulsion, attraction, friction, etc.) of objects in a given environment (e.g., a user interface that includes graphical features and objects, a simulated physical environment with virtual boundaries and virtual objects, a real physical environment with physical boundaries and physical objects, and/or a combination of any of the above) will, in some circumstances, provide helpful feedback to users that reduces input errors and increases the efficiency of the user's operation of the device. Additionally, tactile outputs are, optionally, generated to correspond to feedback that is unrelated to a simulated physical characteristic, such as an input threshold or a selection of an object. Such tactile outputs will, in some circumstances, provide helpful feedback to users that reduces input errors and increases the efficiency of the user's operation of the device.
In some embodiments, a tactile output with a suitable tactile output pattern serves as a cue for the occurrence of an event of interest in a user interface or behind the scenes in a device. Examples of the events of interest include activation of an affordance (e.g., a real or virtual button, or toggle switch) provided on the device or in a user interface, success or failure of a requested operation, reaching or crossing a boundary in a user interface, entry into a new state, switching of input focus between objects, activation of a new mode, reaching or crossing an input threshold, detection or recognition of a type of input or gesture, etc. In some embodiments, tactile outputs are provided to serve as a warning or an alert for an impending event or outcome that would occur unless a redirection or interruption input is timely detected. Tactile outputs are also used in other contexts to enrich the user experience, improve the accessibility of the device to users with visual or motor difficulties or other accessibility needs, and/or improve efficiency and functionality of the user interface and/or the device. Tactile outputs are optionally accompanied with audio outputs and/or visible user interface changes, which further enhance a user's experience when the user interacts with a user interface and/or the device, and facilitate better conveyance of information regarding the state of the user interface and/or the device, and which reduce input errors and increase the efficiency of the user's operation of the device.
It should be appreciated that device 100 is only one example of a portable multifunction device, and that device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of the components. The various components shown in
Memory 102 optionally includes high-speed random access memory and optionally also includes non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 102 by other components of device 100, such as CPU(s) 120 and the peripherals interface 118, is, optionally, controlled by memory controller 122.
Peripherals interface 118 can be used to couple input and output peripherals of the device to CPU(s) 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for device 100 and to process data.
In some embodiments, peripherals interface 118, CPU(s) 120, and memory controller 122 are, optionally, implemented on a single chip, such as chip 104. In some other embodiments, they are, optionally, implemented on separate chips.
RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals. RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. RF circuitry 108 optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. RF circuitry 108 optionally communicates with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. The wireless communication optionally uses any of a plurality of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), Evolution, Data-Only (EV-DO), HSPA, HSPA+, Dual-Cell HSPA (DC-HSPA), long term evolution (LTE), near field communication (NFC), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11ac, IEEE 802.11ax, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for e-mail (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between a user and device 100. Audio circuitry 110 receives audio data from peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to speaker 111. Speaker 111 converts the electrical signal to human-audible sound waves. Audio circuitry 110 also receives electrical signals converted by microphone 113 from sound waves. Audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to peripherals interface 118 for processing. Audio data is, optionally, retrieved from and/or transmitted to memory 102 and/or RF circuitry 108 by peripherals interface 118. In some embodiments, audio circuitry 110 also includes a headset jack (e.g., 212,
I/O subsystem 106 couples input/output peripherals on device 100, such as touch-sensitive display system 112 and other input or control devices 116, with peripherals interface 118. I/O subsystem 106 optionally includes display controller 156, optical sensor controller 158, intensity sensor controller 159, haptic feedback controller 161, and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/send electrical signals from/to other input or control devices 116. The other input or control devices 116 optionally include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) 160 are, optionally, coupled with any (or none) of the following: a keyboard, infrared port, USB port, stylus, and/or a pointer device such as a mouse. The one or more buttons (e.g., 208,
Touch-sensitive display system 112 provides an input interface and an output interface between the device and a user. Display controller 156 receives and/or sends electrical signals from/to touch-sensitive display system 112. Touch-sensitive display system 112 displays visual output to the user. The visual output optionally includes graphics, text, icons, video, and any combination thereof (collectively termed “graphics”). In some embodiments, some or all of the visual output corresponds to user interface objects. As used herein, the term “affordance” refers to a user-interactive graphical user interface object (e.g., a graphical user interface object that is configured to respond to inputs directed toward the graphical user interface object). Examples of user-interactive graphical user interface objects include, without limitation, a button, slider, icon, selectable menu item, switch, hyperlink, or other user interface control.
Touch-sensitive display system 112 has a touch-sensitive surface, sensor or set of sensors that accepts input from the user based on haptic and/or tactile contact. Touch-sensitive display system 112 and display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on touch-sensitive display system 112 and convert the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages or images) that are displayed on touch-sensitive display system 112. In some embodiments, a point of contact between touch-sensitive display system 112 and the user corresponds to a finger of the user or a stylus.
Touch-sensitive display system 112 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments. Touch-sensitive display system 112 and display controller 156 optionally detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch-sensitive display system 112. In some embodiments, projected mutual capacitance sensing technology is used, such as that found in the iPhone®, iPod Touch®, and iPad® from Apple Inc. of Cupertino, California.
Touch-sensitive display system 112 optionally has a video resolution in excess of 100 dpi. In some embodiments, the touch screen video resolution is in excess of 400 dpi (e.g., 500 dpi, 800 dpi, or greater). The user optionally makes contact with touch-sensitive display system 112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work with finger-based contacts and gestures, which can be less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user.
In some embodiments, in addition to the touch screen, device 100 optionally includes a touchpad for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad is, optionally, a touch-sensitive surface that is separate from touch-sensitive display system 112 or an extension of the touch-sensitive surface formed by the touch screen.
Device 100 also includes power system 162 for powering the various components. Power system 162 optionally includes a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.
Device 100 optionally also includes one or more optical sensors 164 (e.g., as part of one or more cameras).
Device 100 optionally also includes one or more contact intensity sensors 165.
Device 100 optionally also includes one or more proximity sensors 166.
Device 100 optionally also includes one or more tactile output generators 167.
Device 100 optionally also includes one or more accelerometers 168.
In some embodiments, the software components stored in memory 102 include operating system 126, communication module (or set of instructions) 128, contact/motion module (or set of instructions) 130, graphics module (or set of instructions) 132, haptic feedback module (or set of instructions) 133, text input module (or set of instructions) 134, Global Positioning System (GPS) module (or set of instructions) 135, and applications (or sets of instructions) 136. Furthermore, in some embodiments, memory 102 stores device/global internal state 157, as shown in
Operating system 126 (e.g., iOS, Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
Communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by RF circuitry 108 and/or external port 124. External port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with the 30-pin connector used in some iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, California. In some embodiments, the external port is a Lightning connector that is the same as, or similar to and/or compatible with the Lightning connector used in some iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, California. In some embodiments, the external port is a USB Type-C connector that is the same as, or similar to and/or compatible with the USB Type-C connector used in some electronic devices from Apple Inc. of Cupertino, California.
Contact/motion module 130 optionally detects contact with touch-sensitive display system 112 (in conjunction with display controller 156) and other touch-sensitive devices (e.g., a touchpad or physical click wheel). Contact/motion module 130 includes various software components for performing various operations related to detection of contact (e.g., by a finger or by a stylus), such as determining if contact has occurred (e.g., detecting a finger-down event), determining an intensity of the contact (e.g., the force or pressure of the contact or a substitute for the force or pressure of the contact), determining if there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining if the contact has ceased (e.g., detecting a finger-up event or a break in contact). Contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact, which is represented by a series of contact data, optionally includes determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations are, optionally, applied to single contacts (e.g., one finger contacts or stylus contacts) or to multiple simultaneous contacts (e.g., “multitouch”/multiple finger contacts). In some embodiments, contact/motion module 130 and display controller 156 detect contact on a touchpad.
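By way of illustration, a minimal sketch of deriving velocity (magnitude and direction) and speed (magnitude) from two timestamped contact samples follows; the ContactSample type and the two-sample scheme are hypothetical simplifications of tracking a series of contact data:

    struct ContactSample {
        let x: Double
        let y: Double
        let time: Double  // in seconds
    }

    // Velocity has magnitude and direction; speed is its magnitude.
    func velocity(from a: ContactSample, to b: ContactSample)
        -> (dx: Double, dy: Double, speed: Double)
    {
        let dt = max(b.time - a.time, 1e-6)  // guard against a zero interval
        let dx = (b.x - a.x) / dt
        let dy = (b.y - a.y) / dt
        return (dx, dy, (dx * dx + dy * dy).squareRoot())
    }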
Contact/motion module 130 optionally detects a gesture input by a user. Different gestures on the touch-sensitive surface have different contact patterns (e.g., different motions, timings, and/or intensities of detected contacts). Thus, a gesture is, optionally, detected by detecting a particular contact pattern. For example, detecting a finger tap gesture includes detecting a finger-down event followed by detecting a finger-up (lift off) event at the same position (or substantially the same position) as the finger-down event (e.g., at the position of an icon). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event followed by detecting one or more finger-dragging events, and subsequently followed by detecting a finger-up (lift off) event. Similarly, tap, swipe, drag, and other gestures are optionally detected for a stylus by detecting a particular contact pattern for the stylus.
In some embodiments, detecting a finger tap gesture depends on the length of time between detecting the finger-down event and the finger-up event, but is independent of the intensity of the finger contact between detecting the finger-down event and the finger-up event. In some embodiments, a tap gesture is detected in accordance with a determination that the length of time between the finger-down event and the finger-up event is less than a predetermined value (e.g., less than 0.1, 0.2, 0.3, 0.4 or 0.5 seconds), independent of whether the intensity of the finger contact during the tap meets a given intensity threshold (greater than a nominal contact-detection intensity threshold), such as a light press or deep press intensity threshold. Thus, a finger tap gesture can satisfy particular input criteria that do not require that the characteristic intensity of a contact satisfy a given intensity threshold in order for the particular input criteria to be met. For clarity, the finger contact in a tap gesture typically needs to satisfy a nominal contact-detection intensity threshold, below which the contact is not detected, in order for the finger-down event to be detected. A similar analysis applies to detecting a tap gesture by a stylus or other contact. In cases where the device is capable of detecting a finger or stylus contact hovering over a touch sensitive surface, the nominal contact-detection intensity threshold optionally does not correspond to physical contact between the finger or stylus and the touch sensitive surface.
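A minimal sketch of such duration-based, intensity-independent tap recognition follows; the types and threshold values are hypothetical examples:

    struct ContactEvent {
        let time: Double       // in seconds
        let intensity: Double  // normalized contact intensity
    }

    let contactDetectionThreshold = 0.05  // nominal; below this, no contact is detected
    let tapMaxDuration = 0.3              // e.g., 0.3 seconds

    func isTap(fingerDown: ContactEvent, fingerUp: ContactEvent) -> Bool {
        // The contact must satisfy the nominal contact-detection threshold for
        // a finger-down event to exist at all, but no light-press or deep-press
        // intensity threshold is consulted.
        guard fingerDown.intensity >= contactDetectionThreshold else { return false }
        return fingerUp.time - fingerDown.time < tapMaxDuration
    }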
The same concepts apply in an analogous manner to other types of gestures. For example, a swipe gesture, a pinch gesture, a depinch gesture, and/or a long press gesture are optionally detected based on the satisfaction of criteria that are either independent of intensities of contacts included in the gesture, or do not require that contact(s) that perform the gesture reach intensity thresholds in order to be recognized. For example, a swipe gesture is detected based on an amount of movement of one or more contacts; a pinch gesture is detected based on movement of two or more contacts towards each other; a depinch gesture is detected based on movement of two or more contacts away from each other; and a long press gesture is detected based on a duration of the contact on the touch-sensitive surface with less than a threshold amount of movement. As such, the statement that particular gesture recognition criteria do not require that the intensity of the contact(s) meet a respective intensity threshold in order for the particular gesture recognition criteria to be met means that the particular gesture recognition criteria are capable of being satisfied if the contact(s) in the gesture do not reach the respective intensity threshold, and are also capable of being satisfied in circumstances where one or more of the contacts in the gesture do reach or exceed the respective intensity threshold. In some embodiments, a tap gesture is detected based on a determination that the finger-down and finger-up event are detected within a predefined time period, without regard to whether the contact is above or below the respective intensity threshold during the predefined time period, and a swipe gesture is detected based on a determination that the contact movement is greater than a predefined magnitude, even if the contact is above the respective intensity threshold at the end of the contact movement. Even in implementations where detection of a gesture is influenced by the intensity of contacts performing the gesture (e.g., the device detects a long press more quickly when the intensity of the contact is above an intensity threshold or delays detection of a tap input when the intensity of the contact is higher), the detection of those gestures does not require that the contacts reach a particular intensity threshold so long as the criteria for recognizing the gesture can be met in circumstances where the contact does not reach the particular intensity threshold (e.g., even if the amount of time that it takes to recognize the gesture changes).
Contact intensity thresholds, duration thresholds, and movement thresholds are, in some circumstances, combined in a variety of different combinations in order to create heuristics for distinguishing two or more different gestures directed to the same input element or region so that multiple different interactions with the same input element are enabled to provide a richer set of user interactions and responses. The statement that a particular set of gesture recognition criteria do not require that the intensity of the contact(s) meet a respective intensity threshold in order for the particular gesture recognition criteria to be met does not preclude the concurrent evaluation of other intensity-dependent gesture recognition criteria to identify other gestures that do have criteria that are met when a gesture includes a contact with an intensity above the respective intensity threshold. For example, in some circumstances, first gesture recognition criteria for a first gesture—which do not require that the intensity of the contact(s) meet a respective intensity threshold in order for the first gesture recognition criteria to be met—are in competition with second gesture recognition criteria for a second gesture—which are dependent on the contact(s) reaching the respective intensity threshold. In such competitions, the gesture is, optionally, not recognized as meeting the first gesture recognition criteria for the first gesture if the second gesture recognition criteria for the second gesture are met first. For example, if a contact reaches the respective intensity threshold before the contact moves by a predefined amount of movement, a deep press gesture is detected rather than a swipe gesture. Conversely, if the contact moves by the predefined amount of movement before the contact reaches the respective intensity threshold, a swipe gesture is detected rather than a deep press gesture. Even in such circumstances, the first gesture recognition criteria for the first gesture still do not require that the intensity of the contact(s) meet a respective intensity threshold in order for the first gesture recognition criteria to be met because if the contact stayed below the respective intensity threshold until an end of the gesture (e.g., a swipe gesture with a contact that does not increase to an intensity above the respective intensity threshold), the gesture would have been recognized by the first gesture recognition criteria as a swipe gesture. As such, particular gesture recognition criteria that do not require that the intensity of the contact(s) meet a respective intensity threshold in order for the particular gesture recognition criteria to be met will (A) in some circumstances ignore the intensity of the contact with respect to the intensity threshold (e.g. for a tap gesture) and/or (B) in some circumstances still be dependent on the intensity of the contact with respect to the intensity threshold in the sense that the particular gesture recognition criteria (e.g., for a long press gesture) will fail if a competing set of intensity-dependent gesture recognition criteria (e.g., for a deep press gesture) recognize an input as corresponding to an intensity-dependent gesture before the particular gesture recognition criteria recognize a gesture corresponding to the input (e.g., for a long press gesture that is competing with a deep press gesture for recognition).
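For illustration, the competition described above between an intensity-dependent recognizer (e.g., deep press) and a movement-dependent recognizer (e.g., swipe) could be sketched as follows, with hypothetical types and thresholds; whichever criterion is met first claims the gesture:

    enum Gesture { case undecided, swipe, deepPress }

    let deepPressIntensity = 0.8  // the respective intensity threshold
    let swipeDistance = 10.0      // the predefined amount of movement

    struct GestureArbiter {
        private(set) var result: Gesture = .undecided

        // Feed successive samples of the contact; the first criterion to be
        // met claims the gesture, and the competing recognizer fails.
        mutating func update(intensity: Double, movement: Double) {
            guard result == .undecided else { return }
            if intensity >= deepPressIntensity {
                result = .deepPress  // intensity threshold crossed first
            } else if movement >= swipeDistance {
                result = .swipe      // movement threshold crossed first
            }
        }
    }

If the contact stays below the intensity threshold until the end of the gesture, only the movement criterion can fire, which is why the swipe criteria themselves remain intensity-independent.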
Graphics module 132 includes various known software components for rendering and displaying graphics on touch-sensitive display system 112 or other display, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast or other visual property) of graphics that are displayed. As used herein, the term “graphics” includes any object that can be displayed to a user, including without limitation text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations and the like.
In some embodiments, graphics module 132 stores data representing graphics to be used. Each graphic is, optionally, assigned a corresponding code. Graphics module 132 receives, from applications etc., one or more codes specifying graphics to be displayed along with, if necessary, coordinate data and other graphic property data, and then generates screen image data to output to display controller 156.
Haptic feedback module 133 includes various software components for generating instructions (e.g., instructions used by haptic feedback controller 161) to produce tactile outputs using tactile output generator(s) 167 at one or more locations on device 100 in response to user interactions with device 100.
Text input module 134, which is, optionally, a component of graphics module 132, provides soft keyboards for entering text in various applications (e.g., contacts module 137, e-mail client module 140, IM module 141, browser module 147, and any other application that needs text input).
GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to telephone module 138 for use in location-based dialing, to camera module 143 as picture/video metadata, and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets).
Applications 136 optionally include the following modules (or sets of instructions), or a subset or superset thereof: contacts module 137; telephone module 138; videoconferencing module 139; e-mail client module 140; instant messaging (IM) module 141; workout support module 142; camera module 143; image management module 144; browser module 147; calendar module 148; widget modules 149; widget creator module 150; search module 151; video and music player module 152; notes module 153; map module 154; and online video module 155, each of which is described below.
Examples of other applications 136 that are, optionally, stored in memory 102 include other word processing applications, other image editing applications, drawing applications, presentation applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, contacts module 137 includes executable instructions to manage an address book or contact list (e.g., stored in application internal state 192 of contacts module 137 in memory 102 or memory 370), including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), e-mail address(es), physical address(es) or other information with a name; associating an image with a name; categorizing and sorting names; providing telephone numbers and/or e-mail addresses to initiate and/or facilitate communications by telephone module 138, video conference module 139, e-mail client module 140, or IM module 141; and so forth.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, telephone module 138 includes executable instructions to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in address book 137, modify a telephone number that has been entered, dial a respective telephone number, conduct a conversation and disconnect or hang up when the conversation is completed. As noted above, the wireless communication optionally uses any of a plurality of communications standards, protocols and technologies.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch-sensitive display system 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact module 130, graphics module 132, text input module 134, contact list 137, and telephone module 138, videoconferencing module 139 includes executable instructions to initiate, conduct, and terminate a video conference between a user and one or more other participants in accordance with user instructions.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, e-mail client module 140 includes executable instructions to create, send, receive, and manage e-mail in response to user instructions. In conjunction with image management module 144, e-mail client module 140 makes it very easy to create and send e-mails with still or video images taken with camera module 143.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the instant messaging module 141 includes executable instructions to enter a sequence of characters corresponding to an instant message, to modify previously entered characters, to transmit a respective instant message (for example, using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages or using XMPP, SIMPLE, Apple Push Notification Service (APNs) or IMPS for Internet-based instant messages), to receive instant messages, and to view received instant messages. In some embodiments, transmitted and/or received instant messages optionally include graphics, photos, audio files, video files and/or other attachments as are supported in an MMS and/or an Enhanced Messaging Service (EMS). As used herein, “instant messaging” refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, APNs, or IMPS).
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, map module 154, and video and music player module 152, workout support module 142 includes executable instructions to create workouts (e.g., with time, distance, and/or calorie burning goals); communicate with workout sensors (in sports devices and smart watches); receive workout sensor data; calibrate sensors used to monitor a workout; select and play music for a workout; and display, store and transmit workout data.
In conjunction with touch-sensitive display system 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact module 130, graphics module 132, and image management module 144, camera module 143 includes executable instructions to capture still images or video (including a video stream) and store them into memory 102, modify characteristics of a still image or video, and/or delete a still image or video from memory 102.
In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, and camera module 143, image management module 144 includes executable instructions to arrange, modify (e.g., edit), or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, browser module 147 includes executable instructions to browse the Internet in accordance with user instructions, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, e-mail client module 140, and browser module 147, calendar module 148 includes executable instructions to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to do lists, etc.) in accordance with user instructions.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, widget modules 149 are mini-applications that are, optionally, downloaded and used by a user (e.g., weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, and dictionary widget 149-5) or created by the user (e.g., user-created widget 149-6). In some embodiments, a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! Widgets).
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, the widget creator module 150 includes executable instructions to create widgets (e.g., turning a user-specified portion of a web page into a widget).
In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, search module 151 includes executable instructions to search for text, music, sound, image, video, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms) in accordance with user instructions.
In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, and browser module 147, video and music player module 152 includes executable instructions that allow the user to download and play back recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files, and executable instructions to display, present or otherwise play back videos (e.g., on touch-sensitive display system 112, or on an external display connected wirelessly or via external port 124). In some embodiments, device 100 optionally includes the functionality of an MP3 player, such as an iPod (trademark of Apple Inc.).
In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, notes module 153 includes executable instructions to create and manage notes, to do lists, and the like in accordance with user instructions.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, map module 154 includes executable instructions to receive, display, modify, and store maps and data associated with maps (e.g., driving directions; data on stores and other points of interest at or near a particular location; and other location-based data) in accordance with user instructions.
In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, text input module 134, e-mail client module 140, and browser module 147, online video module 155 includes executable instructions that allow the user to access, browse, receive (e.g., by streaming and/or download), play back (e.g., on the touch screen 112, or on an external display connected wirelessly or via external port 124), send an e-mail with a link to a particular online video, and otherwise manage online videos in one or more file formats, such as H.264. In some embodiments, instant messaging module 141, rather than e-mail client module 140, is used to send a link to a particular online video.
Each of the above identified modules and applications corresponds to a set of executable instructions for performing one or more functions described above and the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (e.g., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules are, optionally, combined or otherwise re-arranged in various embodiments. In some embodiments, memory 102 optionally stores a subset of the modules and data structures identified above. Furthermore, memory 102 optionally stores additional modules and data structures not described above.
In some embodiments, device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through a touch screen and/or a touchpad. By using a touch screen and/or a touchpad as the primary input control device for operation of device 100, the number of physical input control devices (such as push buttons, dials, and the like) on device 100 is, optionally, reduced.
The predefined set of functions that are performed exclusively through a touch screen and/or a touchpad optionally include navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates device 100 to a main, home, or root menu from any user interface that is displayed on device 100. In such embodiments, a “menu button” is implemented using a touchpad. In some other embodiments, the menu button is a physical push button or other physical input control device instead of a touchpad.
Event sorter 170 receives event information and determines the application 136-1 and application view 191 of application 136-1 to which to deliver the event information. Event sorter 170 includes event monitor 171 and event dispatcher module 174. In some embodiments, application 136-1 includes application internal state 192, which indicates the current application view(s) displayed on touch-sensitive display system 112 when the application is active or executing. In some embodiments, device/global internal state 157 is used by event sorter 170 to determine which application(s) is (are) currently active, and application internal state 192 is used by event sorter 170 to determine application views 191 to which to deliver event information.
In some embodiments, application internal state 192 includes additional information, such as one or more of: resume information to be used when application 136-1 resumes execution, user interface state information that indicates information being displayed or that is ready for display by application 136-1, a state queue for enabling the user to go back to a prior state or view of application 136-1, and a redo/undo queue of previous actions taken by the user.
Event monitor 171 receives event information from peripherals interface 118. Event information includes information about a sub-event (e.g., a user touch on touch-sensitive display system 112, as part of a multi-touch gesture). Peripherals interface 118 transmits information it receives from I/O subsystem 106 or a sensor, such as proximity sensor 166, accelerometer(s) 168, and/or microphone 113 (through audio circuitry 110). Information that peripherals interface 118 receives from I/O subsystem 106 includes information from touch-sensitive display system 112 or a touch-sensitive surface.
In some embodiments, event monitor 171 sends requests to the peripherals interface 118 at predetermined intervals. In response, peripherals interface 118 transmits event information. In other embodiments, peripherals interface 118 transmits event information only when there is a significant event (e.g., receiving an input above a predetermined noise threshold and/or for more than a predetermined duration).
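As one way to picture the "significant event" variant, the following Swift sketch filters raw inputs by an intensity threshold and a minimum duration before they are forwarded as event information; the RawInput type and both threshold values are hypothetical stand-ins, not values from this disclosure.

    import Foundation

    // Hypothetical raw input, standing in for data from I/O subsystem 106.
    struct RawInput {
        let intensity: Double        // normalized signal strength, 0.0 ... 1.0
        let duration: TimeInterval   // seconds the input has persisted
    }

    let noiseThreshold = 0.05                  // illustrative value
    let minimumDuration: TimeInterval = 0.01   // illustrative value

    // Forward event information only for an input above the noise threshold
    // and/or one persisting for more than the predetermined duration.
    func isSignificant(_ input: RawInput) -> Bool {
        input.intensity > noiseThreshold || input.duration > minimumDuration
    }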
In some embodiments, event sorter 170 also includes a hit view determination module 172 and/or an active event recognizer determination module 173.
Hit view determination module 172 provides software procedures for determining where a sub-event has taken place within one or more views, when touch-sensitive display system 112 displays more than one view. Views are made up of controls and other elements that a user can see on the display.
Another aspect of the user interface associated with an application is a set of views, sometimes herein called application views or user interface windows, in which information is displayed and touch-based gestures occur. The application views (of a respective application) in which a touch is detected optionally correspond to programmatic levels within a programmatic or view hierarchy of the application. For example, the lowest level view in which a touch is detected is, optionally, called the hit view, and the set of events that are recognized as proper inputs are, optionally, determined based, at least in part, on the hit view of the initial touch that begins a touch-based gesture.
Hit view determination module 172 receives information related to sub-events of a touch-based gesture. When an application has multiple views organized in a hierarchy, hit view determination module 172 identifies a hit view as the lowest view in the hierarchy which should handle the sub-event. In most circumstances, the hit view is the lowest level view in which an initiating sub-event occurs (e.g., the first sub-event in the sequence of sub-events that form an event or potential event). Once the hit view is identified by the hit view determination module, the hit view typically receives all sub-events related to the same touch or input source for which it was identified as the hit view.
Active event recognizer determination module 173 determines which view or views within a view hierarchy should receive a particular sequence of sub-events. In some embodiments, active event recognizer determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, active event recognizer determination module 173 determines that all views that include the physical location of a sub-event are actively involved views, and therefore determines that all actively involved views should receive a particular sequence of sub-events. In other embodiments, even if touch sub-events were entirely confined to the area associated with one particular view, views higher in the hierarchy would still remain as actively involved views.
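The two determinations can be pictured with a short Swift sketch over a toy view hierarchy; the View type is hypothetical, and all frames are assumed to be expressed in one shared coordinate space for simplicity, so this is an illustration of the idea rather than the modules' actual interfaces.

    import CoreGraphics

    final class View {
        let frame: CGRect
        var subviews: [View] = []
        init(frame: CGRect) { self.frame = frame }

        // Hit view: the lowest (deepest) view in the hierarchy that
        // contains the point where the sub-event occurred.
        func hitView(at point: CGPoint) -> View? {
            guard frame.contains(point) else { return nil }
            for subview in subviews {
                if let hit = subview.hitView(at: point) { return hit }
            }
            return self
        }

        // Actively involved views: the hit view plus every view higher in
        // the hierarchy whose area also includes the point.
        func activelyInvolvedViews(at point: CGPoint) -> [View] {
            guard frame.contains(point) else { return [] }
            for subview in subviews {
                let involved = subview.activelyInvolvedViews(at: point)
                if !involved.isEmpty { return involved + [self] }
            }
            return [self]
        }
    }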
Event dispatcher module 174 dispatches the event information to an event recognizer (e.g., event recognizer 180). In embodiments including active event recognizer determination module 173, event dispatcher module 174 delivers the event information to an event recognizer determined by active event recognizer determination module 173. In some embodiments, event dispatcher module 174 stores in an event queue the event information, which is retrieved by a respective event receiver module 182.
In some embodiments, operating system 126 includes event sorter 170. Alternatively, application 136-1 includes event sorter 170. In yet other embodiments, event sorter 170 is a stand-alone module, or a part of another module stored in memory 102, such as contact/motion module 130.
In some embodiments, application 136-1 includes a plurality of event handlers 190 and one or more application views 191, each of which includes instructions for handling touch events that occur within a respective view of the application's user interface. Each application view 191 of the application 136-1 includes one or more event recognizers 180. Typically, a respective application view 191 includes a plurality of event recognizers 180. In other embodiments, one or more of event recognizers 180 are part of a separate module, such as a user interface kit or a higher level object from which application 136-1 inherits methods and other properties. In some embodiments, a respective event handler 190 includes one or more of: data updater 176, object updater 177, GUI updater 178, and/or event data 179 received from event sorter 170. Event handler 190 optionally utilizes or calls data updater 176, object updater 177 or GUI updater 178 to update the application internal state 192. Alternatively, one or more of the application views 191 includes one or more respective event handlers 190. Also, in some embodiments, one or more of data updater 176, object updater 177, and GUI updater 178 are included in a respective application view 191.
A respective event recognizer 180 receives event information (e.g., event data 179) from event sorter 170, and identifies an event from the event information. Event recognizer 180 includes event receiver 182 and event comparator 184. In some embodiments, event recognizer 180 also includes at least a subset of: metadata 183, and event delivery instructions 188 (which optionally include sub-event delivery instructions).
Event receiver 182 receives event information from event sorter 170. The event information includes information about a sub-event, for example, a touch or a touch movement. Depending on the sub-event, the event information also includes additional information, such as location of the sub-event. When the sub-event concerns motion of a touch, the event information optionally also includes speed and direction of the sub-event. In some embodiments, events include rotation of the device from one orientation to another (e.g., from a portrait orientation to a landscape orientation, or vice versa), and the event information includes corresponding information about the current orientation (also called device attitude) of the device.
Event comparator 184 compares the event information to predefined event or sub-event definitions and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event. In some embodiments, event comparator 184 includes event definitions 186. Event definitions 186 contain definitions of events (e.g., predefined sequences of sub-events), for example, event 1 (187-1), event 2 (187-2), and others. In some embodiments, sub-events in an event 187 include, for example, touch begin, touch end, touch movement, touch cancellation, and multiple touching. In one example, the definition for event 1 (187-1) is a double tap on a displayed object. The double tap, for example, comprises a first touch (touch begin) on the displayed object for a predetermined phase, a first lift-off (touch end) for a predetermined phase, a second touch (touch begin) on the displayed object for a predetermined phase, and a second lift-off (touch end) for a predetermined phase. In another example, the definition for event 2 (187-2) is a dragging on a displayed object. The dragging, for example, comprises a touch (or contact) on the displayed object for a predetermined phase, a movement of the touch across touch-sensitive display system 112, and lift-off of the touch (touch end). In some embodiments, the event also includes information for one or more associated event handlers 190.
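As an illustration only, the double-tap definition above can be modeled as a small state machine that consumes the sub-event sequence touch begin, touch end, touch begin, touch end, and fails if any phase exceeds a timeout; the types and the 0.3-second value below are assumptions for the sketch, not the content of event definitions 186.

    import Foundation

    enum SubEvent { case touchBegin(at: Date), touchEnd(at: Date) }

    enum RecognizerState { case possible, recognized, failed }

    struct DoubleTapDefinition {
        private(set) var state: RecognizerState = .possible
        private var lastTime: Date?
        private var phaseCount = 0
        private let phaseTimeout: TimeInterval = 0.3   // illustrative value

        // Assumes sub-events arrive in begin/end order and checks only
        // that each phase completes within the timeout.
        mutating func consume(_ subEvent: SubEvent) {
            guard state == .possible else { return }   // failed or recognized: ignore the rest
            let time: Date
            switch subEvent {
            case .touchBegin(let t), .touchEnd(let t): time = t
            }
            if let last = lastTime, time.timeIntervalSince(last) > phaseTimeout {
                state = .failed                        // a phase lasted too long
                return
            }
            lastTime = time
            phaseCount += 1
            if phaseCount == 4 { state = .recognized } // begin, end, begin, end completed
        }
    }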
In some embodiments, event definition 187 includes a definition of an event for a respective user-interface object. In some embodiments, event comparator 184 performs a hit test to determine which user-interface object is associated with a sub-event. For example, in an application view in which three user-interface objects are displayed on touch-sensitive display system 112, when a touch is detected on touch-sensitive display system 112, event comparator 184 performs a hit test to determine which of the three user-interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190, the event comparator uses the result of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects an event handler associated with the sub-event and the object triggering the hit test.
In some embodiments, the definition for a respective event 187 also includes delayed actions that delay delivery of the event information until after it has been determined whether the sequence of sub-events does or does not correspond to the event recognizer's event type.
When a respective event recognizer 180 determines that the series of sub-events does not match any of the events in event definitions 186, the respective event recognizer 180 enters an event impossible, event failed, or event ended state, after which it disregards subsequent sub-events of the touch-based gesture. In this situation, other event recognizers, if any, that remain active for the hit view continue to track and process sub-events of an ongoing touch-based gesture.
In some embodiments, a respective event recognizer 180 includes metadata 183 with configurable properties, flags, and/or lists that indicate how the event delivery system should perform sub-event delivery to actively involved event recognizers. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate how event recognizers interact, or are enabled to interact, with one another. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate whether sub-events are delivered to varying levels in the view or programmatic hierarchy.
In some embodiments, a respective event recognizer 180 activates event handler 190 associated with an event when one or more particular sub-events of an event are recognized. In some embodiments, a respective event recognizer 180 delivers event information associated with the event to event handler 190. Activating an event handler 190 is distinct from sending (and deferred sending) sub-events to a respective hit view. In some embodiments, event recognizer 180 throws a flag associated with the recognized event, and event handler 190 associated with the flag catches the flag and performs a predefined process.
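One hedged reading of the flag metaphor is a registry in which an event handler "catches" a named flag and the event recognizer "throws" it when its event is recognized; the type and method names below are hypothetical and only illustrate the pattern.

    final class FlagBoard {
        private var handlers: [String: () -> Void] = [:]

        // An event handler registers to catch a flag.
        func catchFlag(_ name: String, with handler: @escaping () -> Void) {
            handlers[name] = handler
        }

        // An event recognizer throws the flag for a recognized event; the
        // catching handler then performs its predefined process.
        func throwFlag(_ name: String) {
            handlers[name]?()
        }
    }

    let board = FlagBoard()
    board.catchFlag("doubleTap") { print("double tap handled") }
    board.throwFlag("doubleTap")   // prints "double tap handled"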
In some embodiments, event delivery instructions 188 include sub-event delivery instructions that deliver event information about a sub-event without activating an event handler. Instead, the sub-event delivery instructions deliver event information to event handlers associated with the series of sub-events or to actively involved views. Event handlers associated with the series of sub-events or with actively involved views receive the event information and perform a predetermined process.
In some embodiments, data updater 176 creates and updates data used in application 136-1. For example, data updater 176 updates the telephone number used in contacts module 137, or stores a video file used in video and music player module 152. In some embodiments, object updater 177 creates and updates objects used in application 136-1. For example, object updater 177 creates a new user-interface object or updates the position of a user-interface object. GUI updater 178 updates the GUI. For example, GUI updater 178 prepares display information and sends it to graphics module 132 for display on a touch-sensitive display.
In some embodiments, event handler(s) 190 includes or has access to data updater 176, object updater 177, and GUI updater 178. In some embodiments, data updater 176, object updater 177, and GUI updater 178 are included in a single module of a respective application 136-1 or application view 191. In other embodiments, they are included in two or more software modules.
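The division of labor among the three updaters could be sketched as follows, mirroring the examples given above (a contact's telephone number, a user-interface object's position, and display refresh); the Contact and UIObject types and the function names are illustrative assumptions.

    import CoreGraphics

    struct Contact { var phoneNumber: String }
    struct UIObject { var position: CGPoint }

    // Data updater: creates and updates data used in the application.
    func updatePhoneNumber(of contact: inout Contact, to number: String) {
        contact.phoneNumber = number
    }

    // Object updater: creates and updates user-interface objects.
    func moveObject(_ object: inout UIObject, to position: CGPoint) {
        object.position = position
    }

    // GUI updater: prepares display information for the graphics layer.
    func refreshGUI(for object: UIObject) {
        // e.g., package the object's position into display information and
        // hand it off for drawing on a touch-sensitive display.
    }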
It shall be understood that the foregoing discussion regarding event handling of user touches on touch-sensitive displays also applies to other forms of user inputs to operate multifunction devices 100 with input devices, not all of which are initiated on touch screens. For example, mouse movement and mouse button presses, optionally coordinated with single or multiple keyboard presses or holds; contact movements such as taps, drags, scrolls, etc., on touchpads; pen stylus inputs; movement of the device; oral instructions; detected eye movements; biometric inputs; and/or any combination thereof are optionally utilized as inputs corresponding to sub-events which define an event to be recognized.
Device 100 optionally also includes one or more physical buttons, such as "home" or menu button 204. As described previously, menu button 204 is, optionally, used to navigate to any application 136 in a set of applications that are, optionally, executed on device 100. Alternatively, in some embodiments, the menu button is implemented as a soft key in a GUI displayed on the touch-screen display, or as a system gesture such as an upward edge swipe.
In some embodiments, device 100 includes the touch-screen display, menu button 204 (sometimes called home button 204), push button 206 for powering the device on/off and locking the device, volume adjustment button(s) 208, Subscriber Identity Module (SIM) card slot 210, headset jack 212, and/or docking/charging external port 124. Push button 206 is, optionally, used to turn the power on/off on the device by depressing the button and holding the button in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process. In some embodiments, device 100 also accepts verbal input for activation or deactivation of some functions through microphone 113. Device 100 also, optionally, includes one or more contact intensity sensors 165 for detecting intensities of contacts on touch-sensitive display system 112 and/or one or more tactile output generators 167 for generating tactile outputs for a user of device 100.
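The push-button timing can be expressed as a simple duration check; the three-second interval below is an illustrative stand-in for the predefined time interval, not a value given in this disclosure.

    import Foundation

    enum PushButtonAction { case togglePower, lockDevice }

    let predefinedInterval: TimeInterval = 3.0   // illustrative value

    // Held past the predefined interval: power the device on/off.
    // Released before the interval elapses: lock the device.
    func action(forHoldDuration duration: TimeInterval) -> PushButtonAction {
        duration >= predefinedInterval ? .togglePower : .lockDevice
    }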
Each of the above identified elements in
Attention is now directed towards embodiments of user interfaces (“UI”) that are, optionally, implemented on portable multifunction device 100.
It should be noted that the icon labels illustrated in
Additionally, while the following examples are given primarily with reference to finger inputs (e.g., finger contacts, finger tap gestures, finger swipe gestures, etc.), it should be understood that, in some embodiments, one or more of the finger inputs are replaced with input from another input device (e.g., a mouse based input or a stylus input). For example, a swipe gesture is, optionally, replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact). As another example, a tap gesture is, optionally, replaced with a mouse click while the cursor is located over the location of the tap gesture (e.g., instead of detection of the contact followed by ceasing to detect the contact). Similarly, when multiple user inputs are simultaneously detected, it should be understood that multiple computer mice are, optionally, used simultaneously, or a mouse and finger contacts are, optionally, used simultaneously.
FIGS. 4C1-4C2 illustrate an example state diagram 4000 of navigation between various user interfaces of the multifunction device 100 in accordance with some embodiments. In some embodiments, the multifunction device 100 displays a respective user interface from a plurality of different user interfaces, including a wake screen user interface 490 (also referred to as a coversheet user interface 496), a home screen user interface 492, a widget user interface 491, a control user interface 498, a search user interface 494, an application library user interface 497, and an application user interface 493 of a respective application (e.g., a camera application (e.g., camera application user interface 495), a flashlight application, a settings application, a messaging application (e.g., application user interface 493), a telephony application, a maps application, a browser application, or another type of application) of a plurality of applications. In some embodiments, the multifunction device utilizes various portions of the display (e.g., touch-screen display 112, display 340 associated with a touch-sensitive surface, a head-mounted display, or another type of display) to display persistent content across multiple user interfaces. For example, in some embodiments, the display includes a dynamic status region 4002 for displaying alerts, status updates, and/or current states for various subscribed and/or ongoing events, and/or for various application activities, in real-time or substantially real-time. In some embodiments, the display includes a static status region 4022 for displaying status information for one or more system functions that is relatively stable over a period of time. In some embodiments, the dynamic status region 4002 changes (e.g., expands and/or shrinks) from a region that accommodates one or more hardware elements of the multifunction device (e.g., the camera lenses, microphone, and/or speakers). As described herein, although examples below are given with touch gestures on a touch-screen display, similar functions can be implemented with a display that is associated with a touch-sensitive surface, where a location (e.g., a location on a top edge, a bottom edge, a left edge, a right edge, a top left portion, a bottom right portion, an interior portion, and/or another portion) on the touch-sensitive surface has a corresponding location (e.g., a location on a top edge, a bottom edge, a left edge, a right edge, a top left portion, a bottom right portion, an interior portion, and/or another portion) on the display (and/or on the user interface presented on the display). Furthermore, although the examples below are given with touch gestures on a touch-screen display, similar functions can be implemented with a display that is associated with another type of input, such as mouse inputs, pointer inputs, or gaze inputs (e.g., gazes with time and location characteristics that are directed to various portions of the displayed user interface and/or user interface elements) in conjunction with air gesture inputs (e.g., air tap, air swipe, air pinch, pinch and hold, pinch-hold and drag, and/or another type of air gesture).
As described herein, although examples below are given with touch-gestures on a touch-screen display, similar functions can be implemented with a head-mounted display that displays the user interfaces in a three-dimensional environment and that is controlled with various input devices and sensors for detecting various types of user inputs (e.g., touch gestures, inputs provided by a pointer or controller, gaze inputs, voice inputs, and/or air gestures).
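For orientation, the navigation map of state diagram 4000 can be condensed into a transition table; the Swift sketch below is a simplification that keeps one representative gesture per edge and omits authentication, and its case and gesture names are hypothetical labels for the user interfaces and swipes described in the paragraphs that follow.

    enum ScreenState {
        case wakeScreen, homeScreen, widgetUI, controlUI, searchUI, cameraUI
    }

    enum EdgeGesture {
        case swipeUpFromBottomEdge
        case swipeDownFromTopEdge          // central portion of the top edge
        case swipeDownFromTopRightEdge     // right portion of the top edge
        case swipeRightFromLeftEdge
        case swipeLeftFromRightEdge
    }

    func transition(from state: ScreenState, on gesture: EdgeGesture) -> ScreenState {
        switch (state, gesture) {
        case (.wakeScreen, .swipeUpFromBottomEdge):     return .homeScreen   // 4105
        case (.wakeScreen, .swipeDownFromTopEdge):      return .searchUI     // 4106a
        case (.wakeScreen, .swipeDownFromTopRightEdge): return .controlUI    // 4109a
        case (.wakeScreen, .swipeRightFromLeftEdge):    return .widgetUI     // 4102a
        case (.wakeScreen, .swipeLeftFromRightEdge):    return .cameraUI     // 4104b
        case (.homeScreen, .swipeRightFromLeftEdge):    return .widgetUI     // 4102b
        default:                                        return state
        }
    }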
As shown in FIG. 4C1, when the multifunction device 100 is initially powered on (e.g., in response to a long press or other activation input 4100 on a power button 116a (
In some embodiments, after the wake screen user interface 490 has been displayed for a period of time, the multifunction device 100 optionally transitions (4101) to a low power state, where the display of the multifunction device 100 is optionally turned off, or dimmed, as illustrated by user interface 489. In some embodiments, the wake screen user interface 490 remains displayed in a dimmed, always-on state while the multifunction device 100 is in the low power state. For example, in the low power state illustrated by user interface 489, the time indication and/or date indication continues to be displayed.
In some embodiments, the multifunction device 100 transitions (4101) into the low power state (e.g., turns off the display or displays the wake screen user interface 490 in the dimmed, always-on state) in response to activation of the power button 116a of the multifunction device 100 by a user input 4101 (e.g., while displaying the wake screen user interface 490, and/or any of the other user interfaces described herein).
In some embodiments, the multifunction device transitions (e.g., automatically after a period of inactivity, and/or in response to detecting a user input activating the power button 116a) into the low power state from the normal operating state in which any of a number of user interfaces (e.g., the wake screen user interface 490, the home screen user interface 492, the application user interface 493 of a respective application, or another system and/or application user interface) may be the last displayed user interface before the transition into the low power state.
In some embodiments, when the multifunction device 100 is in the low power state, the multifunction device continues to detect inputs via one or more sensors and input devices of the multifunction device (e.g., movement of the device, touch gestures (e.g., swipe, tap, or other touch input), gaze input, air gestures, impact on the device, press on the power button, rotation of a crown, or other types of inputs). In some embodiments, in response to detecting a user input via the one or more sensors and input devices of the multifunction device, the multifunction device transitions (4100) from the low power state to the normal operating state, and displays the wake screen user interface 490 in a normal, undimmed state.
In some embodiments, when the multifunction device 100 is in the low power state illustrated in user interface 489, the multifunction device continues to detect events, such as arrival of notifications and status updates (e.g., notification for messages, incoming communication requests, and/or other application-generated events and system-generated events, and status updates for sessions, subscribed events, and/or other status changes that require the user's attention). In some embodiments, in response to detecting an event that generates an alert, a notification, and/or a status update, the multifunction device transitions from the low power state to the normal operating state, and displays the alert, notification, and/or status update on the wake screen user interface 490 in the normal, undimmed state. In some embodiments, the multifunction device automatically returns to the low power mode after a short period of time after displaying the alert, notification, and/or the status update.
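The low-power behavior described above amounts to a two-state model in which inactivity (or the power button) dims the device and any detected input or alert-generating event restores the normal state; the model below is an illustrative sketch with hypothetical names.

    enum PowerState { case normal, lowPower }

    struct PowerModel {
        private(set) var state: PowerState = .normal

        // A period of inactivity, or activation of the power button,
        // transitions the device into the low power state.
        mutating func handleInactivityOrPowerButton() { state = .lowPower }

        // A detected user input, or an event that generates an alert,
        // notification, or status update, restores the normal operating state.
        mutating func handleInputOrAlertEvent() { state = .normal }
    }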
In some embodiments, the wake screen user interface 490 displayed in the dimmed, always-on state includes the same or substantially the same set of user interface elements as the wake screen user interface 490 displayed in the normal operating state (e.g., as opposed to the dark screen shown in FIGS. 4C1 and 4C2). In some embodiments, the wake screen user interface 490 displayed in the dimmed, always-on state has fewer user interface elements than the wake screen user interface 490 displayed in the normal operating state. For example, in some embodiments, the wake screen user interface 490 displayed in the normal operating state includes a time element 4004 showing the current time, a date element 4006 showing the current date, and one or more widgets 4008 that include content from respective applications that is updated from time to time without user intervention. In some embodiments, the wake screen user interface 490 displayed in the normal operating state includes one or more application icons corresponding to respective applications, such as an application icon 4010 for the flashlight application, an application icon 4012 for the camera application, or another system-recommended or user-selected application. In some embodiments, the wake screen user interface 490 displayed in the normal operating state includes one or more shortcuts for accessing respective operations in one or more system-recommended and/or user-selected applications (e.g., shortcuts to play music using a media player application, to send a quick message using the messaging application, or to turn on the DND or sleep mode using a system application). In some embodiments, the wake screen user interface 490 includes the dynamic status region 4002 that displays status updates or the current state of an ongoing activity for one or more applications, such as a communication session, a charging session, a running timer, a music playing session, delivery updates, navigation instructions, location sharing status, and/or status updates for subscribed application and system events. In some embodiments, the wake screen user interface 490 includes the static status region 4022 that displays status for one or more system functions, such as the network connection status, battery status, location sharing status, cellular signal and carrier information, and other system status information. In some embodiments, a dynamic status update (e.g., battery charging, screen recording, location sharing, and other status updates) is displayed in the dynamic status region 4002 first, and then moved to the static status region 4022 after a period of time. In some embodiments, in the dimmed, always-on state, the wake screen user interface 490 omits the dynamic status region 4002, the static status region 4022, the application icons 4010 and 4012, and/or the shortcuts for application and/or system operations, and optionally disables interaction with the remaining user interface elements (e.g., the wallpaper, the time element 4004, the date element 4006, and/or the widgets 4008) of the wake screen user interface 490.
In some embodiments, the wake screen user interface includes one or more recently received notifications (e.g., notifications 4016, or other newly received notification(s)) that correspond to one or more applications. In some embodiments, the wake screen user interface displayed in the dimmed, always-on state transitions into the wake screen user interface 490 in the normal, undimmed state in response to detecting receipt or generation of a new notification (e.g., notification 4018, FIG. 4C2, or one or more other newly received notifications). In some embodiments, the notifications 4016 are grouped or coalesced based on event types and/or applications corresponding to the notifications. In some embodiments, a user can interact with the notifications to dismiss the notifications, send the notifications to the notification history, and/or expand the notifications to see additional notification content (e.g., optionally after valid authentication data has been requested and/or obtained).
In some embodiments, the wake screen user interface 490 may be displayed while the multifunction device is in a locked state or an unlocked state. In some embodiments, when the wake screen user interface 490 is displayed while the multifunction device is in the locked state, a locked symbol 4020a is optionally displayed in the status region (e.g., dynamic status region 4002, static status region in the upper right corner of the display) or elsewhere (e.g., below the dynamic status region 4002, in the upper left corner, or in another portion of the display) in the wake screen user interface 490 to indicate that the multifunction device is in the locked state (e.g., shown in wake screen user interface 490 in FIG. 4C1), and that authentication data is required to dismiss the wake screen user interface 490 to navigate to the home screen user interface 492 or last-displayed application user interface. In some embodiments, the multifunction device automatically attempts to obtain authentication data via biometric scan (e.g., facial, fingerprint, voiceprint, and/or iris) when the wake screen user interface 490 is displayed (e.g., in the low power state, and/or the normal operating state), and automatically transitions into the unlocked state if valid authentication data is successfully obtained. In some embodiments, in conjunction with transitioning into the unlocked state, the multifunction device replaces the locked symbol 4020a with an unlocked symbol 4020b to indicate that the multifunction device is now in the unlocked state (e.g., shown in wake screen user interface 490 in FIG. 4C2).
In some embodiments, the multifunction device allows user interaction with the user interface elements of the wake screen user interface 490 when the wake screen user interface 490 is displayed in the normal operating mode.
For example, in some embodiments, selecting (e.g., by tapping, clicking, and/or air tapping) a user interface element, such as one of the widgets 4008, status region 4002, notification 4018, and/or application icons 4010 or 4012, causes the multifunction device to navigate away from the wake screen user interface 490 and display a respective user interface of the application that corresponds to the selected user interface element, or an enlarged version of the user interface element that shows additional information and/or controls related to the initially displayed content in the selected user interface element. For example, as shown in FIG. 4C2, in response to a user input 4113 selecting message notification 4018, the computer system displays (4113) the application user interface 493 for the messaging application.
In another example, in some embodiments, an enhanced selection input 4112 (e.g., a touch and hold gesture, a light press input, or another type of input) on a respective user interface element, such as the time element 4004, the date element 4006, or a wallpaper of the wake screen user interface 490, causes the multifunction device to display a configuration user interface for configuring one or more aspects of the wake screen user interface 490 (e.g., selecting a wallpaper, configuring a color or font scheme of the user interface element, configuring how to lay out the different elements of the wake screen user interface, configuring additional wake screens, selecting a previously configured wake screen, and viewing additional customization options for the wake screen user interface). In some embodiments, configuration of the wake screen user interface 490 is partially applied to the home screen user interface 492, and vice versa.
In some embodiments, an enhanced selection input (e.g., a touch and hold gesture, a light press input, or another type of input) on the flashlight application icon 4010 or the camera application icon 4012 causes the multifunction device to activate the flashlight of the multifunction device or display the camera user interface 495 of the camera application. For example, in response to detecting selection input 4104a on the camera application icon 4012 in the wake screen user interface 490, the multifunction device activates the camera application and displays (4104a) the camera application UI 495 (e.g., as shown in FIG. 4C1).
In some embodiments, if the multifunction device detects user interaction with the user interface elements shown in the wake screen user interface 490 and determines that the wake screen user interface is in the locked state, the multifunction device attempts to obtain authentication data from the user by displaying an authentication user interface (e.g., a passcode entry interface, a password entry user interface, and/or a biometric scan user interface). The multifunction device proceeds to navigate away from the wake screen user interface 490 and performs the operation in accordance with the user's interaction after valid authentication data has been obtained from the user.
In some embodiments, in addition to performing operations (e.g., navigating to application user interfaces, displaying expanded versions of user interface elements that show additional information, and/or displaying configuration options for a respective user interface element or the wake screen user interface), the multifunction device allows the user to navigate from the wake screen user interface 490 to other user interfaces (optionally, after valid authentication data has been obtained) in response to navigation inputs (e.g., swipe gestures or other types of navigation inputs that are directed to regions of the wake screen user interface that are not occupied by a user interface element, and/or regions of the wake screen user interface that are occupied by user interface element (e.g., widgets, application icons, and/or time elements) that do not respond to swipe gestures or said other types of navigation inputs).
For example, in some embodiments, an upward swipe gesture 4105 that starts from the bottom edge of the wake screen user interface 490 causes (4105) the multifunction device to navigate away from the wake screen user interface 490 and display the home screen user interface 492 or the last-displayed application user interface (optionally, after requesting and obtaining valid authentication data).
In some embodiments, the upward swipe gesture 4105 is a representative example of a home gesture or dismissal gesture (e.g., other examples include upward swipe gestures 4103a, 4103c, 4103d, 4103e, 4110a, and 4111a) that causes the multifunction device to dismiss the currently displayed user interface (e.g., the wake screen user interface 490, an application user interface (e.g., camera user interface 495, messages user interface 493, or another application user interface), the control user interface 498, the search user interface 494, the application library user interface 497, or the home screen configuration user interface) and navigate to the home screen user interface 492 or a last-displayed user interface (e.g., the wake screen user interface 490, the wake screen configuration user interface, the search user interface 494, an application user interface, or the home screen user interface 492).
In some embodiments, a downward swipe from a top edge (e.g., the central portion of the top edge, or any portion of the top edge) or an interior region of the wake screen user interface 490 (e.g., downward swipe 4106a, or another downward swipe) causes (4106a) the multifunction device to display the search user interface 494 that includes a search input region 4030 and one or more application icons 4032 for recommended applications (e.g., recently used applications, and/or relevant applications based on the current context), as shown in FIG. 4C1. In some embodiments, in response to detecting a search input in the search input region 4030, the multifunction device retrieves and displays search results that include relevant application content (e.g., messages, notes, media files, and/or documents) from the different applications that are installed on the multifunction device, relevant applications (e.g., applications that are installed on the multifunction device and/or applications that are available in the app store), relevant webpages (e.g., bookmarked webpages and/or webpages newly retrieved from the Internet), and/or search results from other sources (e.g., news, social media platforms, and/or reference websites). In some embodiments, different sets of search results are provided depending on the locked and unlocked state of the multifunction device, and more details or additional search results may be displayed if the multifunction device is in the unlocked state when the search is performed. In some embodiments, the multifunction device attempts to obtain valid authentication data in response to receiving the search input, and displays different sets of search results depending on whether valid authentication data is obtained. In some embodiments, an upward swipe gesture 4103d that starts from the bottom edge of the search user interface (or another type of dismissal input) causes (4103d) the multifunction device to dismiss the search user interface 494 and redisplay the wake screen user interface 490 (e.g., since the wake screen user interface was the last displayed user interface), as shown in FIG. 4C1. In some embodiments, a downward swipe 4106b from an interior region of the home screen user interface 492 causes (4106b) the multifunction device to display the search user interface 494; and in response to a subsequent upward swipe gesture 4103d from the bottom edge of the search user interface 494, the home screen user interface 492 is (4103d) redisplayed (e.g., since the home screen user interface was the last displayed user interface), as shown in FIG. 4C1.
In some embodiments, as shown in FIG. 4C1, a rightward swipe gesture 4102a that starts from a left edge or interior region of the wake screen user interface 490 causes (4102a) the multifunction device to navigate from the wake screen user interface 490 to a widget user interface 491 (or another system user interface other than the home screen user interface, such as a control user interface, a search user interface, or a notification history user interface). In some embodiments, the widget user interface 491 includes a plurality of widgets 4026 (e.g., including widget 4026a, widget 4026b and widget 4026c) that are automatically selected by the operating system and/or selected by the user for inclusion in the widget user interface 491. In some embodiments, the widgets 4026 displayed in the widget user interface 491 have form factors that are larger than the widgets 4008 displayed under the time element 4004 in the wake screen user interface 490. In some embodiments, the widgets 4026 displayed in the widget user interface 491 and the widgets 4008 displayed in the wake screen user interface 490 are independently selected and/or configured from each other. In some embodiments, the widgets 4026 in the widget user interface 491 include content from their respective applications, and the content is automatically updated from time to time as updates to the content become available in the respective applications. In some embodiments, selection of a respective widget (e.g., tapping on the respective widget, or providing other selection input directed to the respective widget) in the widget user interface causes the multifunction device to navigate away from the widget user interface 491 and display a user interface of the application that corresponds to the respective widget (optionally, after valid authentication data is requested and/or obtained).
In some embodiments, an upward swipe gesture 4103a that starts from the bottom edge of the widget user interface 491 and/or a leftward swipe gesture 4103b that starts from the right edge or the interior region of the widget user interface 491 causes (4103a-1/4103b-1) the multifunction device to dismiss the widget user interface 491 and redisplay the wake screen user interface 490, as shown in FIG. 4C1.
In some embodiments, a leftward swipe gesture 4104b that starts from the right edge or interior portion of the wake screen user interface 490 causes (4104b) the multifunction device to navigate from the wake screen user interface 490 to a camera user interface 495 of the camera application. In some embodiments, access to the photo library through the camera application is restricted in the camera user interface 495 unless valid authentication data has been obtained. In some embodiments, as shown in FIG. 4C1, an upward swipe gesture 4103c that starts from the bottom edge of the camera user interface 495 or another dismissal input causes (4103c) the multifunction device to navigate away from the camera user interface 495 and redisplay the wake screen user interface 490 (e.g., since the wake screen user interface 490 is the last displayed user interface prior to displaying the camera user interface 495).
In some embodiments, a downward swipe gesture 4109a that starts from the right portion of the top edge of the wake screen user interface (e.g., as illustrated in FIG. 4C2) causes (4109a) the multifunction device to display the control user interface 498 overlaying or replacing display of the wake screen user interface 490. In some embodiments, the control user interface 498 includes status information for one or more static status indicators displayed in the static status region 4022, and respective sets of controls 4028 (e.g., including control 4028a, control 4028b, and control 4028c) for various system functions, such as network connections (e.g., WiFi, cellular data, airplane mode, Bluetooth, and other connection types), media playback controls, display controls (e.g., display brightness, color temperature, night shift, true tone, and dark mode controls), audio controls (e.g., volume and/or mute/unmute controls), focus mode controls (e.g., DND, work, study, sleep, and other modes in which generation of alerts and notifications is moderated based on context and configurations), and application icons (e.g., flashlight, timer, calculator, camera, screen recording, and/or other user-selected or system-recommended applications). In some embodiments, an upward swipe gesture 4110a that starts from the bottom edge of the control user interface 498 (or another dismissal input) causes the multifunction device to dismiss the control user interface 498 and redisplay (4110a-1) the wake screen user interface 490 (e.g., since the wake screen user interface 490 is the last displayed user interface prior to displaying the control user interface 498).
In some embodiments, an upward swipe gesture 4107 that starts from the interior region of the wake screen user interface 490 and/or an upward swipe gesture that starts from the interior of the coversheet user interface 496 (e.g., optionally, when there are no unread notifications displayed in the coversheet user interface) causes (4107) the multifunction device to display the notification history user interface that includes a plurality of previously saved notifications and notifications that have been sent directly to notification history without first being displayed on the wake screen user interface 490. In some embodiments, the notification history user interface can be scrolled to reveal additional notifications in response to an upward swipe gesture 4118 directed to the notification history in the wake screen user interface 490 and/or the coversheet user interface 496. In some embodiments, the notification history is displayed as part of the wake screen user interface 490 and/or coversheet user interface 496, and a downward swipe gesture 4103f that is directed to the interior portion of the notification history causes the notification history to cease to be displayed and causes the wake screen user interface 490 and/or coversheet user interface 496 to be redisplayed without the notification history.
As described above, after navigating from the wake screen user interface 490 to a respective user interface other than the home screen user interface (e.g., in response to a swipe gesture in the downward, leftward, or rightward directions), an upward swipe gesture 4103 (e.g., 4103a, and 4103c through 4103f) that starts from a bottom edge of the respective user interface (e.g., an upward swipe gesture that starts from the bottom edge of the touch-sensitive display that displays a respective user interface in full screen mode, or an upward swipe gesture that starts from the bottom edge of a touch-sensitive surface that corresponds to the display that displays the respective user interface) causes the multifunction device to dismiss the respective user interface and return to the wake screen user interface 490. In contrast, an upward swipe gesture 4105 that starts from the bottom edge of the wake screen user interface 490 causes (4105) the multifunction device to navigate away from the wake screen user interface 490 and display the home screen user interface 492, and another upward swipe gesture that starts from the bottom edge of the home screen user interface 492 does not cause the multifunction device to dismiss the home screen user interface 492 and return to the wake screen user interface 490. In other words, once the navigation from the wake screen user interface 490 to the home screen user interface 492 is completed, the multifunction device is no longer in the restricted state, and access to the application icons displayed on the home screen user interface 492 and access to the content and functions of the computer system are unrestricted to the user. The upward swipe gesture that starts from the bottom edge of the currently displayed user interface is a representative example of a dismissal input that dismisses the currently displayed user interface and redisplays the last displayed user interface. The upward swipe gesture that starts from the bottom edge of the currently displayed user interface is also a representative example of a home gesture that dismisses the currently displayed user interface and displays the home screen user interface (e.g., irrespective of whether the home screen user interface was the last displayed user interface prior to displaying the currently displayed user interface).
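The asymmetry described in the preceding paragraph can be captured in a few lines of Swift; the SystemUI cases and the lastDisplayed parameter are hypothetical, and the authentication step is omitted for brevity.

    enum SystemUI {
        case wakeScreen, homeScreen, widgetUI, cameraUI, searchUI, controlUI, appLibrary
    }

    // Bottom-edge upward swipe: a home gesture from the wake screen, a
    // dismissal gesture from other system user interfaces, and never a
    // route back to the wake screen from the home screen.
    func handleBottomEdgeUpwardSwipe(current: SystemUI,
                                     lastDisplayed: SystemUI) -> SystemUI {
        switch current {
        case .wakeScreen: return .homeScreen
        case .homeScreen: return .homeScreen
        default:          return lastDisplayed
        }
    }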
As shown in FIG. 4C2, once the multifunction device navigates away from the wake screen user interface 490 and displays the home screen user interface 492, the user can access the functions and applications of the multifunction device without restriction. For example, in some embodiments, the home screen user interface 492 includes multiple pages, and a respective page of the home screen user interface includes a respective set of application icons and/or widgets corresponding to different applications, and user selection of (e.g., by tapping on, clicking on, or otherwise selecting) a respective widget or application icon causes the multifunction device to display an application user interface of the application that corresponds to the respective widget or application icon.
In some embodiments, the home screen user interface 492 displays a search affordance 4034 (e.g., as illustrated in FIG. 4C1), and a tap on the search affordance 4034 causes the search user interface 494 described above to be displayed overlaying the home screen user interface 492. In some embodiments, in response to detecting an upward swipe gesture 4103d that starts from the bottom edge of the search user interface (or another dismissal input), the multifunction device dismisses the search user interface 494 and redisplays (4103d) the home screen user interface 492 (e.g., not the wake screen user interface 490, as the upward edge swipe gesture dismisses the currently displayed user interface and redisplays the last displayed system user interface).
In some embodiments, as shown in FIG. 4C1, a rightward swipe gesture 4102b that starts from the left edge of the first page of the home screen user interface 492 causes (4102b) the multifunction device to display the widget user interface 491 described above. In some embodiments, a leftward swipe gesture (e.g., gesture 4103b, or another leftward swipe gesture) that starts from the right edge or the interior region of the widget user interface, or an upward swipe gesture (e.g., gesture 4103a, or another upward swipe gesture) that starts from the bottom edge of the widget user interface 491, causes (4103a-2/4103b-2) the multifunction device to navigate away from the widget user interface 491 and redisplay the first page of the home screen user interface 492 (e.g., when the home screen user interface 492 was the last displayed user interface prior to displaying the widget user interface 491).
In some embodiments, consecutive leftward swipe gestures 4116 on the home screen user interface 492, as shown in FIG. 4C2, navigate through consecutive pages of the home screen user interface 492 until the application library user interface 497 is (4116) displayed. In some embodiments, the application library user interface 497 displays application icons from multiple pages of the home screen user interface grouped into different categories. In some embodiments, the application library user interface 497 includes a search user interface element 4036 that accepts search criteria (e.g., keywords, image, and/or other search criteria) and returns application icons for relevant applications (e.g., applications that are stored on the multifunction device and/or available in the app store) as search results. In some embodiments, user selection (e.g., by a tap input, a click input, or another type of selection input) of an application icon in the search results and/or in the application library causes the multifunction device to display the application user interface of the application that corresponds to the selected application icon.
In some embodiments, a downward swipe gesture 4109c that starts from the right portion of the top edge of the application library user interface 497 causes display of the control user interface 498 as described above. In some embodiments, an upward swipe gesture (e.g., upward swipe gesture 4110a, or another upward swipe gesture) that starts from the bottom edge of the control user interface 498, or another dismissal input, causes the multifunction device to dismiss the control user interface 498 and redisplay the user interface that was the last displayed user interface prior to displaying the control user interface: the application library user interface 497 (e.g., if the application library user interface is the last displayed user interface before the display of the control user interface); the wake screen user interface 490 (4110a-1) (e.g., if the control user interface 498 is displayed in response to swipe gesture 4109a); the home screen user interface 492 (4110a-3) (e.g., if the control user interface is displayed in response to a downward swipe from the right portion of the top edge of the display while the home screen user interface is displayed); or an application user interface (4110a-2) (e.g., if the control user interface is displayed in response to the downward swipe 4109b).
In some embodiments, a rightward swipe gesture 4115 that starts from the interior region or the left edge of the application library user interface 497, or an upward swipe gesture that starts from the bottom edge of the application library user interface 497, causes (4115) the multifunction device to dismiss the application library user interface 497 and redisplay the last page of the home screen user interface 492.
In some embodiments, a downward swipe gesture 4114 that starts from the interior region of the application library user interface 497 causes the multifunction device to display the application icons for applications stored on the multifunction device in a scrollable list (e.g., according to chronological or alphabetical order).
In some embodiments, an upward swipe gesture that starts from the bottom edge of the home screen user interface causes the multifunction device to display the first page of the home screen user interface 492 or display the multitasking user interface 488 (also referred to as an application switcher user interface). In some embodiments, different criteria (e.g., criteria based on the speed, direction, duration, distance, intensity, and/or other characteristics) are used to determine whether to navigate to the first page of the home screen user interface 492 or to the multitasking user interface 488 in response to detecting the upward swipe gesture that starts from the bottom edge of the home screen user interface. For example, in some embodiments, a short flick and a slow, long swipe cause the multifunction device to navigate to the first page of the home screen user interface 492, while a slow, medium-length swipe causes the multifunction device to display the multitasking user interface 488. In some embodiments, a navigation gesture is dynamically evaluated before the termination of the gesture is detected, and therefore the estimated destination user interface of the navigation gesture continues to change, and visual feedback regarding the estimated destination user interface continues to be provided, to guide the user to conclude the gesture when the desired destination user interface is indicated by the visual feedback. In some embodiments, in response to a user input 4117 at a portion of the multitasking user interface 488 that does not correspond to an application, a last displayed user interface that is displayed before displaying the multitasking user interface 488 is displayed (e.g., home screen user interface 492 is displayed when the multitasking user interface 488 is displayed in response to user input 4111b).
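One way to picture the criteria-based disambiguation is a classifier over the gesture's measured speed and distance; the cutoff values below are illustrative assumptions for the sketch, not values from this disclosure.

    enum HomeSwipeDestination { case firstHomePage, multitaskingUI }

    // A short flick or a slow, long swipe navigates to the first page of
    // the home screen; a slow, medium-length swipe opens the multitasking UI.
    func destination(speedPointsPerSecond: Double,
                     distancePoints: Double) -> HomeSwipeDestination {
        let isFlick = speedPointsPerSecond > 1_000   // illustrative cutoff
        let isLongSwipe = distancePoints > 400       // illustrative cutoff
        return (isFlick || isLongSwipe) ? .firstHomePage : .multitaskingUI
    }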
In some embodiments, a reconfiguration mode of the home screen user interface 492 is displayed in which application icons and/or widgets can be repositioned in, removed from, or added to the different pages of the home screen user interface 492. In some embodiments, a touch and hold gesture directed to the home screen user interface 492 for a respective threshold amount of time, or another enhanced selection input directed to the home screen user interface 492, causes the multifunction device to display the home screen user interface 492 in the reconfiguration mode. In some embodiments, selection of the search affordance 4034 in the home screen user interface 492 while the home screen user interface 492 is in the reconfiguration mode causes the multifunction device to display a page editing user interface for the home screen user interface in which pages of the home screen user interface may be reordered, deleted, hidden, or created. In some embodiments, a tap input on the home screen user interface in the reconfiguration mode causes the home screen user interface to exit the reconfiguration mode. In some embodiments, a tap input on an unoccupied portion of the page editing user interface causes the multifunction device to exit the page editing user interface and redisplay the home screen user interface in the reconfiguration mode; another tap on the home screen user interface then causes the home screen user interface to exit the reconfiguration mode and be redisplayed in the normal mode.
In some embodiments, while displaying the home screen user interface 492, a downward swipe gesture 4108a that starts from the top edge of the home screen user interface 492 causes (4108a) the multifunction device to cover the home screen user interface 492 with the coversheet user interface 496 (also referred to as the wake screen user interface 490 if the user interface is displayed when transitioning from a normal mode to a low-power mode, and/or vice versa (e.g., due to inactivity, due to activation of the power button, and/or due to user input that corresponds to a request to wake or lock the device)), and access to the home screen user interface is temporarily restricted by the coversheet user interface 496. In some embodiments, while the coversheet user interface 496 is displayed, an upward swipe gesture 4103e that starts from the bottom edge of the coversheet user interface 496 dismisses (4103e) the coversheet user interface 496 and redisplays the home screen user interface 492 (e.g., since the home screen user interface is the last displayed user interface). In some embodiments, the coversheet user interface responds to user inputs in a manner analogous to that described with respect to the wake screen user interface 490.
In some embodiments, an application user interface of a respective application can be displayed in response to user inputs in a number of scenarios, such as tapping on a widget displayed in the home screen user interface or the widget user interface; tapping on an application icon displayed in the home screen, in the widget user interface, in the search result or recommended application portion of the search user interface, in the application library user interface or in the search results provided in a search in the application library user interface; tapping on a notification on the wake screen user interface or in the notification history; tapping on a representation of an application in the multitasking user interface; or selecting a link to an application in a user interface of another application (e.g., a link to a document, a link to a phone number, a link to a message, a link to an image, and other types of links). In some embodiments, a user interface of a single application is displayed in a full-screen mode. In some embodiments, user interfaces of two or more applications are displayed in a concurrent-display configuration, such as in a side-by-side display configuration where the user interfaces of the applications are displayed adjacent to one another to fit within the display, or in an overlay display configuration where the user interface of a first application is displayed in the full-screen mode while the user interfaces of other applications are overlaid on portion(s) of the user interface of the first application (e.g., in a single stack or separately on different portions).
In some embodiments, while displaying a user interface of an application, an upward swipe gesture (e.g., upward swipe gesture 4111a, or another upward swipe gesture) that starts from the bottom edge of the application user interface (e.g., messages user interface 493, or another user interface of an application) or another dismissal input or home gesture causes (4111a-1, or 4111a-2) the multifunction device to dismiss the currently displayed application user interface, and display either the home screen user interface (e.g., shown as transition 4111a-1) or the multitasking user interface (e.g., shown as transition 4111a-2) depending on the characteristics of the upward swipe gesture. In some embodiments, while displaying home screen user interface 492, an upward swipe gesture 4111b that starts from the bottom edge of the home screen user interface causes (4111b) the multifunction device to dismiss the currently displayed home screen user interface 492, and display the multitasking user interface 488.
In some embodiments, a horizontal swipe gesture in the leftward and/or rightward direction that is performed within a bottom portion of the application user interface(s) causes the multifunction device to switch to another previously displayed application user interface of a different application. In some embodiments, the same swipe gesture that starts from the bottom portion of a respective application user interface is continuously evaluated to determine and update an estimated destination user interface among the multitasking user interface 488, the home screen user interface 492, or a user interface of a previously displayed application, based on the characteristics of the swipe gesture (e.g., location, speed, direction, and/or change in one or more of the above), and a final destination user interface is displayed in accordance with the estimated destination user interface at the termination of the swipe gesture (e.g., lift off of the contact, reduction in intensity of the contact, a pause in movement, and/or another type of change in the input).
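The continuous evaluation described above can likewise be sketched, purely as an assumption-laden illustration (the names and thresholds below are invented for this sketch):

enum EstimatedDestination {
    case previousApplication
    case homeScreen
    case multitaskingUserInterface
}

struct SwipeSample {
    var dx: Double            // cumulative horizontal movement, in points
    var dy: Double            // cumulative vertical movement (negative = upward)
    var verticalSpeed: Double // points per second (negative = upward)
}

// Re-evaluated on every movement of the contact; the estimate drives the
// visual feedback, and the last estimate at termination of the gesture
// selects the final destination user interface.
func estimateDestination(_ sample: SwipeSample) -> EstimatedDestination {
    if abs(sample.dx) > abs(sample.dy) {
        return .previousApplication   // predominantly horizontal swipe
    }
    return sample.verticalSpeed < -1200 ? .homeScreen : .multitaskingUserInterface
}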
In some embodiments, while displaying an application user interface of a respective application (or displaying application user interfaces of multiple applications in a concurrent-display configuration), a downward swipe gesture 4108b that starts from the top edge of the application user interface(s) causes (4108b) the multifunction device to display the coversheet user interface 496 (FIG. 4C1) (or the wake screen user interface 490 in FIG. 4C2) over the application user interface(s). The multifunction device dismisses the coversheet user interface 496 (or the wake screen user interface 490) and redisplays the application user interface(s) in response to an upward swipe gesture that starts from the bottom edge of the coversheet user interface (or another dismissal input).
In some embodiments, as shown in FIG. 4C2, a downward swipe gesture 4109b that starts from the static status region 4022 on the display causes (4109b) the multifunction device to display the control user interface 498 over the application user interface(s), and an upward swipe gesture 4110a that starts from the bottom edge of the control user interface 498 (or another dismissal input) dismisses the control user interface 498 and causes (4110a-2) the application user interfaces to be redisplayed (e.g., or the last displayed user interface that is displayed before displaying the control user interface 498).
In some embodiments, rotation of the display causes the multifunction device to display a different version of the currently displayed user interface (e.g., application user interface, home screen user interface, wake screen user interface, control user interface, notification user interface, widget user interface, application library user interface, and other user interfaces described with respect to FIGS. 4C1-4C2) that has a different layout (e.g., landscape version vs. portrait version). In some embodiments, rotation of the display has no effect on the orientation of the respective user interface that is currently displayed.
The above description of the navigation between user interfaces, and of the exact appearances and components of the various user interfaces, is merely illustrative and may be implemented with variations in various embodiments described herein. In addition, the transitions between pairs of user interfaces illustrated in FIGS. 4C1-4C2 are only a subset of all transitions that are possible between different pairs of user interfaces illustrated in FIGS. 4C1-4C2, and a transition to a respective user interface may be possible from any of multiple other user interfaces, in response to a user input of a same type directed to a same interaction region of the display, or in response to a different type of input or an input directed to a different interactive region, in accordance with various embodiments.
Attention is now directed towards embodiments of user interfaces (“UI”) and associated processes that may be implemented on an electronic device (or computer system more generally), such as portable multifunction device 100 or device 300, with a display, a touch-sensitive surface, (optionally) one or more tactile output generators for generating tactile outputs, and (optionally) one or more sensors to detect intensities of contacts with the touch-sensitive surface.
As used herein, the wallpaper of a user interface (e.g., the wallpaper of a wake screen user interface and/or the wallpaper of a home screen user interface) includes a wallpaper background (e.g., a background color, a background gradient, or a background texture) and/or one or more wallpaper foreground elements (e.g., foreground icons (e.g., emojis, avatars, or other icons), foreground subjects, and/or foreground images). In some embodiments, the one or more wallpaper foreground elements are repeated in a pattern (e.g., a small grid, a medium grid, a large grid, a spiral, or another layout) that is overlaid on the wallpaper background. In some embodiments, as described below, a user is enabled to configure the one or more wallpaper foreground elements and/or the wallpaper background to modify the appearance of the wallpaper. In some embodiments, changing the one or more wallpaper foreground elements causes the device 100 to automatically, without user input, update the wallpaper background (e.g., according to the new wallpaper foreground elements that are selected, the remaining wallpaper foreground elements after deletion of one or more foreground elements, and/or the combination of the wallpaper foreground elements that are chosen).
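A minimal data-model sketch of such a wallpaper follows, with invented names and a placeholder derivation policy (the actual background-selection behavior is described later in this document):

enum WallpaperBackground {
    case color(red: Double, green: Double, blue: Double)
    case gradient(top: String, bottom: String)
    case texture(name: String)
}

enum ForegroundLayout { case smallGrid, mediumGrid, largeGrid, spiral }

struct Wallpaper {
    private(set) var background: WallpaperBackground
    private(set) var foregroundElements: [String]   // e.g., emoji characters
    var layout: ForegroundLayout

    // Mirrors the described behavior: updating the foreground elements
    // causes the background to be re-derived without further user input.
    mutating func setForegroundElements(_ elements: [String]) {
        foregroundElements = elements
        background = Wallpaper.deriveBackground(from: elements)
    }

    // Placeholder derivation policy, for illustration only.
    static func deriveBackground(from elements: [String]) -> WallpaperBackground {
        let hue = Double(elements.count % 10) / 10.0
        return .color(red: hue, green: 0.5, blue: 1.0 - hue)
    }
}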
As used herein, a wake screen user interface is a user interface that is displayed after the display of device 100 exits a low power state during which the display is turned off or enters a dimmed always-on state. In some embodiments, a wake screen user interface is also referred to herein as a face. For example, actions described as being performed with respect to a wake screen user interface may also be described as being performed with respect to a face (e.g., “switching between wake screen user interfaces” may also be stated as “switching between faces” and “editing a wake screen user interface” may also be stated as “editing a face”). In some embodiments, an “expanded face switcher” user interface includes display of one or more faces (e.g., one or more wake screen user interfaces), wherein a size of a respective face (e.g., wake screen user interface) is less than a full size of the display area (e.g., as illustrated in
As used herein, a home screen user interface includes application icons for navigating to respective applications of a plurality of applications that are executed by the device 100. In some embodiments, the device 100 detects and responds to interaction with the home screen user interface using one or more gestures, including touch inputs. For example, a tap input or other selection input on a respective application icon causes the respective application to launch, or otherwise open a user interface for the respective application, on the display area of device 100. In some embodiments, a plurality of pages for the home screen user interface is available. For example, the device detects and responds to user inputs such as swipe gestures or other inputs (e.g., inputs directed to the currently displayed page of the home screen user interface) that correspond to requests to navigate between the plurality of pages, wherein each page of the home screen user interface includes different sets of application icons for different applications. In some embodiments, the home screen user interface includes widgets that correspond to different applications and optionally have different sizes from the application icons. In some embodiments, a widget that corresponds to a respective application displays application content from the respective application, wherein the application content is updated from time to time based on updates in content in the application, without displaying the application. More details and interactions with the home screen user interface are described in
As used herein, in some embodiments, widgets (also referred to as mini application objects) are user interface objects that provide a limited subset of functions and/or information available from their corresponding applications without requiring the applications to be launched. In some embodiments, mini application objects (or widgets) contain application content that is dynamically updated based on the current context. In some embodiments, a tap input or other selection input on a mini application object (widget) causes the corresponding application to be launched. In some embodiments, a mini application object is configured to perform a subset, less than all, of the functions of a corresponding application. In some embodiments, a mini application object displays an identifier for the corresponding application. In some embodiments, a mini application object displays a portion of the content from the corresponding application. For example, a map mini application object displays a portion of a map that is displayed in a map application that corresponds to the map mini application object, and a calendar mini application object displays a portion of a calendar that is displayed in a corresponding calendar application. In some embodiments, a predefined input on a mini application object launches the corresponding application. In some embodiments, a mini application object operates as a standalone application residing in memory of the device, distinct from an associated application also residing in the memory of the device. For example, a mini application object corresponding to a social networking application operates as a single-purpose or streamlined application with a subset, less than all, of the functionality of the corresponding application, but is associated with the full-featured social networking application; in this example, the mini application object operates independently of the social networking application, and in a scenario where the social networking application is not running, the mini application object continues to operate. In some embodiments, a mini application object operates as an extension or component of an associated application on the device. For example, a mini application object for a calendar application is a single feature or operational component of the full-featured calendar application; in this example, if the calendar application is not running (e.g., in the background), the calendar mini application object does not operate either. In some embodiments, a mini application object has a dedicated memory portion for temporary storage of information, and the memory portion is accessible by the corresponding full-featured application. For example, a mini application object for an instant messaging application has a memory portion for temporary storage of partially written reply messages. In this example, if the user opens the corresponding application in the middle of writing a reply message, the contents of the reply message are retrieved from the temporary storage location and used by the full-featured application to allow the user to complete the reply message.
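As an illustrative sketch only: on Apple platforms, an app-group shared container could serve as such a dedicated memory portion; the suite name, keys, and functions below are hypothetical.

import Foundation

// A shared store accessible to both the mini application object (widget)
// and its corresponding full-featured application.
let sharedStore = UserDefaults(suiteName: "group.example.messaging")

// The mini application object saves a partially written reply without
// the full application running.
func widgetSaveDraft(_ text: String, conversationID: String) {
    sharedStore?.set(text, forKey: "draft.\(conversationID)")
}

// When the user opens the full-featured application mid-reply, it
// retrieves the draft so the reply can be completed.
func applicationLoadDraft(conversationID: String) -> String? {
    sharedStore?.string(forKey: "draft.\(conversationID)")
}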
In some embodiments, the wake screen user interface (e.g., user interface 501, or another wake screen user interface) includes one or more recently received notifications, a date and/or time indication for current date and time, and/or one or more status indicators (e.g., status for one or more system functions, such as network connections, battery level, and/or other system functions).
Although
In some embodiments, the user is enabled to create a new wake screen user interface, for example, by selecting (e.g., via user input 520, or another selection input) the plus icon in representation 524. For example, in response to user input 520, device 100 displays user interface 523, illustrated in
In some embodiments, platter 526 further includes a set of selectable wake screen face designs 528-1 through 528-4. For example, different wake screen face designs 528 include different wallpapers (e.g., as indicated by the shading in wallpaper designs 528-2 and 528-4), different visual properties of the time and/or date indications (e.g., different fonts, sizes, and/or colors), and/or different sets of widgets for the respective wake screen face design. As such, the user is enabled to select a wake screen face design, and optionally enabled to further customize and/or modify the selected wake screen face design, to use as the wake screen user interface of device 100.
In some embodiments, the user is enabled to modify the currently selected wake screen user interface (e.g., represented by the representation 514 corresponding to user interface 501) by selecting customize button 518 via user input 522. In some embodiments, in response to user input 522, editing user interface 531 is displayed, as illustrated in
For example, editing user interface 531 enables the user to edit the wake screen user interface by changing (e.g., selecting, adding, removing, and/or rearranging) the widgets that are included on the wake screen user interface, changing a wallpaper of the wake screen user interface, and/or changing visual properties of the time and/or date indication. In some embodiments, the wallpaper includes a set of one or more wallpaper foreground elements, such as one or more types of emoji (e.g., or other icons, such as avatars) that are displayed in a pattern, as described in more detail with reference to
In some embodiments, platter 534 is displayed in editing user interface 531 in response to selection of the emoji button by input 530 in user interface 526 (e.g., if the currently edited wake screen user interface does not include emojis in the wallpaper). In some embodiments, platter 534 is displayed in editing user interface 531 in response to selection of the customize button 518 in editing user interface 513 (e.g., if the currently edited wake screen user interface includes at least one emoji in the wallpaper). In some embodiments, selecting the customization button 518 causes the device to display a customization option for adding or editing emojis in the wallpaper of the currently edited wake screen user interface, and selection of the customization option causes the platter 534 to be displayed in the editing user interface 531. In some embodiments, platter 534 includes search bar 536 that enables the user to search, via a text input (e.g., input through a keyboard (e.g., a virtual or physical keyboard) and/or a voice command), for emoji that match the text input. In some embodiments, platter 534 includes an emoji keyboard that displays a plurality of selectable emoji, avatars, or other icons, including thumbs up emoji 548, plane emoji 552, smiley face emoji 550, and rocket emoji 554. It will be understood that additional and/or alternative emoji are also displayed in some embodiments and that the emoji keyboard illustrated in
In some embodiments, device 100 detects user input 560 selecting done button 544, and in response to user input 560, device 100 ceases to display editing user interface 543 and displays wake screen user interface 563, as illustrated in FIG. 5F1. In some embodiments, wake screen user interface 563 includes the wallpaper, as selected in the editing user interface 543, including the plurality of smiley face emojis in the grid pattern overlaying the wallpaper background having the first color and the widgets indicated in editing user interface 543.
FIG. 5F1 further illustrates user input 564 for dismissing wake screen user interface 563 to navigate to the home screen user interface. For example, user input 564 comprises a swipe input in a first direction (e.g., an upward swipe, a downward swipe, or another user input) that starts from an edge portion of the display (e.g., the bottom edge, the top edge, or another portion of the display). In some embodiments, in response to detecting user input 564, device 100 ceases display of wake screen user interface 563 and displays home screen user interface 565, as illustrated in FIG. 5F2. In some embodiments, a wallpaper of home screen user interface 565 in FIG. 5F2 is the same wallpaper as the wallpaper of wake screen user interface 563 (e.g., the wallpaper includes the instances of smiley face emoji (e.g., wallpaper foreground elements) in a grid pattern overlaying a wallpaper background with the first color). In some embodiments, the wallpaper of home screen user interface 565 is a different wallpaper (e.g., a solid color, a different image, a blurred version of the image, or other wallpaper that is configurable by the user) than the wallpaper of wake screen user interface 563. In some embodiments, the wallpaper of home screen user interface 565 takes on some characteristics (e.g., theme, color scheme, and/or subject matter) of the wallpaper of the wake screen user interface 563, but is not exactly the same as the wallpaper of the wake screen user interface 563.
In some embodiments, going back to
In some embodiments, in response to user input 562 corresponding to a request to add thumbs up emoji 548 as a wallpaper foreground element for the wallpaper, the appearance of the wallpaper background (e.g., color, brightness, texture, or other visual properties) is automatically, without additional user input, updated (e.g., from the first color that is selected based only on smiley face emoji 550 to a second color that is distinct from the first color, and/or from a first brightness and/or texture to a second brightness and/or texture). In some embodiments, the first appearance is automatically selected based on the smiley face emoji, and the second appearance (e.g., the color, texture, brightness, and/or other visual properties) is automatically selected based at least in part on the newly added thumbs up emoji (e.g., based solely on the thumbs up emoji, or based on a combination of the smiley emoji and the thumbs up emoji). Using color as an example, in some embodiments, the second color used for the wallpaper background of the wallpaper is a color based at least in part on the selected thumbs up emoji 548. For example, thumbs up emoji 548 is optionally associated with the second color, wherein the second color is, in some embodiments, selected based on one or more colors in the thumbs up emoji. For example, the second color is selected to match or complement one or more colors present in the thumbs up emoji 548. In some embodiments, the second color is based on the last (e.g., most recently) selected emoji (e.g., thumbs up emoji 548) only (e.g., and the second color is not based on previously selected emoji, such as smiley face emoji 550). In some embodiments, the second color is a color that is based at least in part on all of the currently selected emoji that are selected as wallpaper foreground elements (e.g., both smiley face emoji 550 and thumbs up emoji 548). For example, the second color is selected as a color that is present in (e.g., or otherwise complements) both smiley face emoji 550 and thumbs up emoji 548. In some embodiments, the second color is selected as a combination of colors associated with each of the wallpaper foreground elements (e.g., a combination of a color associated with smiley face emoji 550 and a color associated with thumbs up emoji 548). As such, in some embodiments the second color is based on properties of all of the selected emoji (e.g., smiley face emoji 550 and the thumbs up emoji 548). Analogously, texture, brightness, and other visual properties of the wallpaper background can be automatically updated in accordance with various visual characteristics (e.g., unique characteristics, or shared characteristics) of the newly selected foreground element and/or the combination of the newly selected and existing foreground elements, to complement and/or contrast those visual characteristics, in accordance with various embodiments.
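For illustration, the color-selection policies described above can be sketched as follows; the per-emoji colors, names, and both policies are assumptions made for this sketch rather than a definitive implementation:

struct RGB { var r, g, b: Double }

// Assumed association of each foreground element with a representative color.
let associatedColor: [String: RGB] = [
    "😀": RGB(r: 1.00, g: 0.85, b: 0.20),  // smiley face: warm yellow
    "👍": RGB(r: 0.95, g: 0.75, b: 0.50),  // thumbs up: skin tone
    "✈️": RGB(r: 0.55, g: 0.70, b: 0.90),  // plane: sky blue
]

enum BackgroundPolicy { case lastSelected, blendAll }

func backgroundColor(for selected: [String], policy: BackgroundPolicy) -> RGB {
    let fallback = RGB(r: 0.5, g: 0.5, b: 0.5)
    switch policy {
    case .lastSelected:
        // Base the background only on the most recently selected emoji.
        return selected.last.flatMap { associatedColor[$0] } ?? fallback
    case .blendAll:
        // Average the colors of every currently selected emoji.
        let colors = selected.compactMap { associatedColor[$0] }
        guard !colors.isEmpty else { return fallback }
        let n = Double(colors.count)
        return RGB(r: colors.reduce(0) { $0 + $1.r } / n,
                   g: colors.reduce(0) { $0 + $1.g } / n,
                   b: colors.reduce(0) { $0 + $1.b } / n)
    }
}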
FIG. 5G1 further illustrates user input 566 selecting delete button 538 for removing an emoji from box 540 (e.g., and thus excluding the emoji from being included as a wallpaper foreground element in the wallpaper). In some embodiments, box 540 includes emojis that are already included in the wallpaper of the wake screen user interface, both from the current editing session and from previous editing sessions. In some embodiments, the deletion can start from any insertion point between adjacent emoji in box 540. In some embodiments, in response to the user input 566, the thumbs up emoji 548 is removed from box 540 and is removed as a wallpaper foreground element in the wallpaper, as illustrated in editing user interface 569 in FIG. 5G2. In some embodiments, in response to user input 566, the wallpaper background of the wallpaper in editing user interface 569 automatically, without additional user input, reverts to the first color that is selected based on smiley face emoji 550 (e.g., while the smiley face emoji 550 is the only emoji selected in FIG. 5G2). In some embodiments, in response to user input 566, the wallpaper background of the wallpaper in editing user interface 569 automatically, without additional user input, changes to another color, different from the first color and the second color, that is selected based on smiley face emoji 550 (e.g., while the smiley face emoji 550 is the only emoji selected in FIG. 5G2 and the smiley face emoji has multiple associated wallpaper background colors, or while there are other emojis remaining on the wallpaper background (e.g., from the current editing session, and/or from previous editing sessions)).
FIG. 5G2 further illustrates user input 568 selecting plane emoji 552 (e.g., after the thumbs up emoji is deleted and the color of the wallpaper background is updated accordingly). In some embodiments, in response to detecting user input 568 selecting plane emoji 552, plane emoji 552 is displayed in box 540, and is added as a wallpaper foreground element in the wallpaper of the wake screen user interface that is being edited in the editing user interface 571 (e.g., same editing user interface as editing user interface 569, 567, 543, and 531, but with a different appearance for the wallpaper of the currently edited wake screen user interface), as illustrated in
In some embodiments, selection of a same emoji, more than one time, to be included in the wallpaper foreground elements concurrently (e.g., if user reselects smiley face emoji 550 after selecting plane emoji 552, such that box 540 includes two smiley face emoji and one plane emoji), causes the color and/or other visual properties (e.g., texture, brightness, or other visual properties) of the wallpaper background of the wallpaper to update based on either (i) only the last-selected emoji (e.g., smiley face emoji 550) or (ii) a combination of all of the selected emoji, optionally with a greater amount of weight given to the emoji selected more than one time (e.g., smiley face emoji 550). Using color as an example of visual properties, while only one smiley face emoji and one plane emoji are selected in box 540, the third color of the wallpaper background is optionally based on a combination of two colors, one from each of the selected emoji, whereas, while two smiley face emoji 550 and one plane emoji 552 are selected, the color of the wallpaper background is optionally updated to a color that complements smiley face emoji 550 (e.g., without regard, or with less regard, to what color complements plane emoji 552).
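Continuing the sketch above, and only as an illustration: averaging over the selection list, duplicates included, naturally gives an emoji selected more than once a proportionally greater weight.

let selection = ["😀", "✈️", "😀"]   // two smiley face emoji, one plane emoji
let weighted = backgroundColor(for: selection, policy: .blendAll)
// `weighted` leans toward the smiley's warm yellow rather than the plane's
// sky blue, approximating the greater weight described above.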
In some embodiments, as illustrated in
In some embodiments, editing user interface 577 includes emoji button 584 that, when selected, causes device 100 to display the emoji platter (e.g., platter 534, or another selection interface for symbols and graphics to use as wallpaper foreground elements) for the user to select (e.g., modify and/or delete) emojis to be used as wallpaper foreground elements for the currently edited wallpaper. In some embodiments, editing user interface 577 includes wallpaper background button 580 that, when selected, displays a wallpaper background platter for the user to select and/or modify one or more properties of the wallpaper background (e.g., change a color, a pattern, texture, gradient, a respective combination of visual properties, filter, or other visual properties). For example, in response to user input 582 selecting wallpaper background button 580, the device displays wallpaper background platter 586 (or another configuration user interface for selecting and/or modifying one or more properties of the wallpaper background) in editing user interface 579, as shown in
In some embodiments, the instances of emoji that are displayed as wallpaper foreground elements that are within the display area of the display in the first orientation and remain within the display area of the display in the second orientation continue to be displayed, and any additional instances of emoji that are displayed on the display in the second orientation but not on the display in the first orientation (e.g., or vice-versa) appear at their respective positions within the display area during the animation (e.g., during rotation of the device, the display, and the wake screen user interface (e.g., the wallpaper background, the time elements, and/or widgets)). For example, the instances of emoji in the corners of the display area of device 100 are displayed (e.g., as if the wallpaper is expanding or has additional regions that were not previously visible and that have now been revealed) during the transition between the first orientation and the second orientation, such that instances of emoji in the corners of the display area instantaneously appear according to the pattern of the wallpaper foreground elements. As such, during the animated transition, display of blank areas of the wallpaper (e.g., areas that do not include wallpaper foreground elements and/or wallpaper background) is prevented.
In FIG. 5O1, device 100 is shown in the second orientation (e.g., portrait orientation, or another second canonical orientation of the device), in some embodiments, and the animation for rotating the emojis displayed in the wake screen user interface is initiated in response to detecting that the device 100 has satisfied the interface rotation criteria. For example, the interface rotation criteria include criteria that are met when the device 100 has been rotated by at least a threshold amount relative to the first orientation (e.g., landscape orientation, or another first canonical orientation of the device) and/or when device 100 (e.g., the display, touch-screen display, or other display generation component of device 100) is rotated with at least a threshold rate of rotational movement. In some embodiments, the interface rotation criteria include criteria that are met when the device has been rotated by the threshold angle that is less than 90 degrees (e.g., while the device is rotated between the first orientation and second orientation before the device is in the second orientation). For example, the animation described with reference to
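A minimal sketch of such interface rotation criteria, with invented threshold values:

struct DeviceRotationState {
    var angleFromFirstOrientation: Double   // degrees rotated so far
    var rotationRate: Double                // degrees per second
}

// The criteria are met once the device has turned far enough (by a
// threshold angle less than the full 90-degree turn) and/or fast enough.
func meetsInterfaceRotationCriteria(_ state: DeviceRotationState) -> Bool {
    let angleThreshold = 60.0   // illustrative, below 90 degrees
    let rateThreshold = 120.0   // illustrative: a deliberate, quick rotation
    return state.angleFromFirstOrientation >= angleThreshold
        || state.rotationRate >= rateThreshold
}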
In some embodiments, in response to detecting user input 5012 on done button 5010, as shown in
Although the above example describes the configuration process for configuring a wake screen user interface of the device, various aspects of the editing user interface, and of how the foreground elements of the wallpaper and the wallpaper background can be configured, may also apply to the configuration of other system user interfaces, such as the home screen user interface, the notification user interface, the widget user interface, and/or the control user interface of the device, using similar or the same editing user interface. In some embodiments, the editing user interface displays a preview of how the currently edited user interface would look in the background of the editing user interface (e.g., a view of the currently edited user interface, or a view of the wallpaper of the currently edited user interface, remains visible during the editing process behind the user interface elements of the editing user interface) as new configuration options are selected and/or the currently selected configuration options are changed using the editing user interface. In some embodiments, other ways of displaying a preview of the currently edited user interface in the editing user interface may be used (e.g., displaying a reduced version of the currently edited user interface in the editing user interface to show how the change in configuration may affect the appearance of the currently edited user interface). More details of editing a user interface are provided below with respect to
Continuing with the example of the wake screen user interface 601 with the plurality of planetary bodies 602 as wallpaper foreground elements overlaying a wallpaper background of space (e.g., represented with spatial qualities of space and optionally rendering a starry environment surrounding the planetary bodies), in some embodiments, the plurality of planetary bodies 602 are arranged in a solar system diagram in the first wallpaper in accordance with a current time of day and/or a current date. For example, the respective positions of the planetary bodies 602 are updated, in real time or substantially real time, based on the actual positions of the planetary bodies 602 in space at the current date and/or time. As such, the planetary bodies 602 appear to move along their respective orbits (e.g., around the Sun (e.g., star 602g)) in the solar system diagram shown in the first wallpaper over time, in accordance with some embodiments.
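Purely as an illustrative sketch (a circular-orbit approximation with assumed periods and an arbitrary epoch; real ephemeris computations are far more involved):

import Foundation

// Approximate orbital periods, in days, for a few planets (assumed values).
let orbitalPeriodDays: [String: Double] = [
    "Mercury": 87.97, "Venus": 224.70, "Earth": 365.26, "Mars": 686.98,
]

// Angle (radians) of a planet along its orbit at `date`, measured from an
// arbitrary reference epoch.
func orbitalAngle(of planet: String, at date: Date) -> Double? {
    guard let period = orbitalPeriodDays[planet] else { return nil }
    let days = date.timeIntervalSince1970 / 86_400
    let revolutions = days / period
    let fraction = revolutions - revolutions.rounded(.down)
    return fraction * 2 * Double.pi
}

// Re-running this periodically lets the planetary bodies appear to move
// along their orbits in the diagram as the current date and time advance.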
In some embodiments, wake screen user interface 609 includes one or more different wallpaper foreground elements than wake screen user interface 601, optionally while maintaining a same wallpaper background. For example, the wallpaper of wake screen user interface 601 includes the plurality of planetary bodies 602 as wallpaper foreground elements and displays a representation of space (e.g., a black or dark area, optionally with stars distributed within space) as the wallpaper background, and the second wallpaper of wake screen user interface 609 includes a different wallpaper foreground element (e.g., only one planet of the plurality of planetary bodies 602 without the other planets) while maintaining the wallpaper background as the representation of space. As such, during the animation and after the transition from wake screen user interface 601 to wake screen user interface 609, the wallpaper background is maintained while updating the wallpaper foreground elements. In some embodiments, different portions of the wallpaper background are shown in the wallpaper of the wake screen user interface during the transition between wake screen user interface 601 and wake screen user interface 609. For example, a portion of space closer to the selected planet 602a is shown in the wallpaper background in wake screen user interface 609, while a portion of space surrounding the plurality of planets 602 as a whole is shown in the wallpaper background in wake screen user interface 601, in accordance with some embodiments.
In some embodiments, at the end of the animated transition shown in
In some embodiments, as illustrated in
In some embodiments, user input 624 (e.g., a swipe user input in a horizontal direction or another type of navigation input) corresponds to a request to open a camera application, and in response to detecting user input 624, the device 100 displays a user interface that includes a camera view such that the user is enabled to capture images using one or more cameras of device 100.
In some embodiments, user input 620 corresponds to a selection of news widget 610 (e.g., a tap input directed to news widget 610, or another type of selection input), and in response to detecting user input 620, a user interface for the application associated with news widget 610 (e.g., a news application, or another type of app that provides news content) is displayed at device 100 (e.g., replacing display of wake screen user interface 619, or another system user interface displaying the news widget 610). In some embodiments, selection of a different respective widget causes the device 100 to display a user interface for the respective application associated with the respective widget (e.g., replacing display of the wake screen user interface 619, or another system user interface displaying the respective widget). In some embodiments, before displaying the user interface for an application (e.g., a camera application, a news application, and/or another application), the device 100 optionally requires user authentication to unlock the device 100 before providing the user with access to the application.
In some embodiments, user input 626 is detected in user interface 619. In some embodiments, user input 626 corresponds to a request to dismiss the currently displayed user interface (e.g., wake screen user interface 619, or another system user interface other than the home screen user interface) (e.g., to dismiss the wake screen user interface and to display a home screen user interface, or a last displayed application user interface). For example, user input 626 corresponds to a swipe gesture in a respective direction (e.g., an upward swipe that starts from a bottom edge of the currently displayed user interface, or another type of dismissal input) that satisfies dismissal criteria for dismissing the currently displayed user interface (e.g., wake screen user interface 619, or another currently displayed user interface other than the home screen user interface). In some embodiments, in response to detecting user input 626, device 100 ceases display of the wake screen user interface 619 and displays home screen user interface 621, as illustrated in
In some embodiments, as shown in
In some embodiments, editing user interface 623 includes the widgets of wake screen user interface 619, including calendar widget 606, activity widget 608, and news widget 610, in the view of the wake screen user interface 619. In some embodiments, the widgets displayed in editing user interface 623 are displayed in a widgets area. In some embodiments, the widgets area has a first size and is displayed at a first position relative to the wake screen user interface (e.g., at a position along the left edge, or at another position of the display) while device 100 is in the first orientation (e.g., landscape orientation, or another first canonical orientation of the device). In some embodiments, the user is enabled to add, remove, and/or modify the widgets displayed in the widgets area. In some embodiments, widgets are optionally confined within the widgets area (e.g., widgets cannot be placed outside of the widgets area) in the wake screen user interface. In some embodiments, the widgets are positioned freely within the widgets area, and may be separated by an arbitrary amount of spacing chosen by a user who repositions the widget(s) by dragging and dropping them within the widgets area.
In some embodiments, in response to detecting user input 630 directed to the widgets area (e.g., a tap input or another type of selection input) while the widgets area is displayed in editing user interface 623, as illustrated in
In some embodiments, as shown in
In some embodiments, as shown in
In some embodiments, device 100 is rotated from the first orientation (e.g., landscape orientation) to a second orientation (e.g., portrait orientation, or another second canonical orientation of the device), as illustrated in
In some embodiments, the wallpaper foreground element of the wake screen user interface is automatically updated after detecting an interaction with the device. For example, in some embodiments, after device 100 has entered a sleep mode and/or low power mode in which the wake screen user interface ceases to be displayed or is displayed in a dimmed, always on state, upon exiting the sleep mode and/or low power mode (e.g., in response to a raise to wake input, an input on the touch screen, an input on a button of device 100, or another input), the wallpaper foreground element that was displayed prior to entering sleep mode and/or low power mode ceases to be displayed and is replaced with another wallpaper foreground element in the wallpaper of the wake screen user interface.
In some embodiments, device 100 (e.g., the touch-screen display of device 100, or other display generation component of the device) is situated in a first orientation (e.g., a landscape orientation, or another canonical orientation of the device and/or display) in
Although the terms first orientation and second orientation are used herein to describe a landscape orientation and a portrait orientation, respectively, these terms are only used to distinguish one orientation from another. For example, a first orientation could be termed a second orientation, and, similarly, a second orientation could be termed a first orientation, without departing from the scope of the various described embodiments.
In some embodiments, in response to detecting user input 706 directed to wake screen user interface 701 displaying with the first orientation, where user input 706 corresponds to a request to configure wake screen user interface 701 (e.g., user input 706 is a touch-hold input on an unoccupied portion of the wallpaper of wake screen user interface 701, or another user input that meets configuration criteria), device 100 displays editing user interface 703, where editing user interface 703 shows various selectable options overlaying a view of wake screen user interface 701 that is being edited, as illustrated in
In some embodiments, editing user interface 703 includes widgets area 708 that corresponds to an area of the display to which one or more widgets may be added. For example, in response to user input 714 directed to widgets area 708, widgets platter 716 is displayed that includes a plurality of selectable widgets to be added to widgets area 708, as illustrated in
In some embodiments, in response to a user input selecting a respective widget in widget platter 716, the respective widget is added to widgets area 708. For example, in response to detecting user input 718, widget 728 is added to widgets area 708; in response to detecting user input 720, widget 726 is added to widgets area 708; and/or in response to detecting user input 722, widget 724 is added to widgets area 708. In some embodiments, in accordance with a determination that the user input selecting the respective widget is a tap input, the respective widget is automatically added to a position in the widgets area 708 according to one or more arrangement criteria. In some embodiments, the one or more arrangement criteria cause the device 100 to add a newly added widget to an open position in widgets area 708 at which another widget is not already displayed. For example, newly added widgets are added to open areas within widgets area 708 in the order in which they have been selected from the widget platter. In some embodiments, the one or more arrangement criteria cause device 100 to add newly added widgets based on a size of the respective widget (e.g., according to where the newly added widget can fit without overlapping and/or moving a widget that is already in widgets area 708). In some embodiments, the one or more arrangement criteria are based at least in part on a language setting for the device. For example, the one or more arrangement criteria cause device 100 to add newly added widgets from left-to-right in the open placement locations of the widgets area if the language setting corresponds to a language that reads from left-to-right, while the one or more arrangement criteria cause device 100 to add newly added widgets from right-to-left in the open placement locations of the widgets area if the language setting corresponds to a language that reads from right-to-left. In some embodiments, the one or more arrangement criteria cause device 100 to add newly added widgets from top-to-bottom and/or from bottom-to-top (e.g., in the first available open area in widgets area 708 determined from top-to-bottom and/or from bottom-to-top of the widgets area). A sketch of these arrangement criteria is shown after the example below.
For example, in
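A minimal sketch of such arrangement criteria, with invented types: it scans slots top-to-bottom in the reading direction of the current language setting and returns the first open placement.

struct WidgetSlot: Hashable {
    var row: Int
    var column: Int
}

enum ReadingDirection { case leftToRight, rightToLeft }

func firstOpenSlot(rows: Int, columns: Int,
                   occupied: Set<WidgetSlot>,
                   direction: ReadingDirection) -> WidgetSlot? {
    for row in 0..<rows {                       // top-to-bottom
        let columnOrder = direction == .leftToRight
            ? Array(0..<columns)
            : Array((0..<columns).reversed())
        for column in columnOrder {
            let slot = WidgetSlot(row: row, column: column)
            if !occupied.contains(slot) {
                return slot                     // first open placement
            }
        }
    }
    return nil                                  // widgets area is full
}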
As illustrated in
In some embodiments, while device 100 is in the second orientation (e.g., portrait orientation, or another second canonical orientation of the device), the area in which widgets are displayed is a different area of the display from the area in which widgets are displayed while the device 100 is in the first orientation (e.g., landscape orientation, or another first canonical orientation of the device). For example, FIG. 7M1 illustrates the display of device 100 after it has been rotated to the second orientation (e.g., portrait orientation, or another second canonical orientation of the device).
In some embodiments, in the second orientation (e.g., portrait orientation, or another second canonical orientation of the device), widgets 724a, 734a, and 726a are displayed in a widgets area that is below the time indication 746 (e.g., wherein the time indication 746 in the second orientation corresponds to time indication 704 in the first orientation). In some embodiments, the widgets area for the first orientation has a first size that is distinct from a second size of the widgets area for the second orientation, and/or the widgets area for the first orientation has a first position in the display area (e.g., relative to the time indication 704) of the device 100 while the device 100 is in the first orientation that is distinct from a second position in the display area (e.g., relative to the time indication 746) of device 100 that includes the widgets area while the device 100 is in the second orientation. For example, the widgets area in landscape orientation is a larger size than the widgets area in portrait orientation, in accordance with some embodiments.
In some embodiments, in response to device 100 rotating from the first orientation to the second orientation in
FIG. 7M2 illustrates that, in some embodiments, widgets that are added to wake screen user interface 723 in the first orientation are not automatically added to wake screen user interface 727 in the second orientation. For example, wake screen user interface 727 shown in the second orientation is independently configurable (e.g., using a respective editing user interface) from wake screen user interface 723 in the first orientation (e.g., the wake screen user interface in the first orientation is configurable separately and independently from the configuration of the wake screen user interface in the second orientation, and the wake screen user interface in the second orientation is configurable separately and independently from the configuration of the wake screen user interface in the first orientation).
FIG. 7M2 illustrates user input 748 (e.g., a touch and hold input or a light press input directed to an unoccupied portion of the wallpaper, or another type of input that meets editing criteria) corresponding to a request to display an editing user interface for the wake screen user interface, detected in wake screen user interface 727. In some embodiments, in response to detecting user input 748, device 100 displays editing user interface 729, as illustrated in
In some embodiments, in response to detecting user input 772, the selected widget 730f is added to widgets area 708 as widget 774, as illustrated in editing user interface 739 in
As described below, method 800 automatically updates a wallpaper background of a user interface in accordance with a selected wallpaper foreground element, thereby providing real-time feedback to the user in response to a user input to change the wallpaper foreground elements without requiring the user to manually configure the wallpaper background, which performs an operation automatically when a set of conditions has been met without requiring further user input and provides improved feedback to the user.
The computer system receives (802) a request to display a first user interface. In some embodiments, the first user interface is a system user interface that includes a wallpaper and a plurality of system user interface objects (e.g., time, widgets, application icons, status indicators, or other system user interface objects), where the wallpaper includes one or more foreground elements (e.g., a pattern, an image, or other graphical elements) overlaying a background (e.g., a background color, a background gradient, or a background texture), optionally in a respective layout arrangement. In some embodiments, the first user interface is a wake screen user interface (e.g., wake screen user interface 490) that is displayed when the device is in a low-power mode (e.g., with the display turned off or with the display in a dimmed always-on state, as described with reference to user interface 489), when the device is transitioned from a low-power mode to a regular operating mode (e.g., with the display turned on to a regular brightness level), when the device is transitioned from a regular operating mode (e.g., with the display lit at the regular brightness level) to a low-power mode (e.g., with the display turned off or dimmed), and/or when the device transitions from an application UI (e.g., application user interface 493) and home screen to a locked and/or restricted state. In some embodiments, the first user interface is a home screen user interface (e.g., home screen user interface 492). In some embodiments, the wake screen user interface and the home screen user interface are configured together as a pair, so a change to the wake screen configuration is applied to the configuration of the home screen user interface, and, optionally, vice versa. In some embodiments, receiving the request to display the first user interface includes detecting a raise to wake input or detecting generation of a notification or alert, while the device is in a low-power mode. In some embodiments, receiving the request to display the first user interface includes detecting a power-off input or device sleep input (e.g., a press on the power button, prolonged inactivity by the user, or other event or input that puts the device in a low power mode) while the device is in a regular operating mode. In some embodiments, detecting the request to display the first user interface includes detecting an input that corresponds to a request to close an open application (e.g., an upward edge swipe gesture, a press on the home button, or other inputs), or an input that corresponds to a request to display the coversheet user interface and put the device in a restricted mode (e.g., a downward edge swipe gesture, a press on the power button, or other inputs) that is displayed while an application user interface is displayed.
The computer system, in response to receiving the request to display the first user interface, displays (804) the first user interface that includes a set of user interface elements (e.g., wake screen features such as notifications, a date/time, widgets, and/or status indicators or home screen features such as widgets and/or application icons) displayed overlaid on a wallpaper. In some embodiments, the wallpaper comprises an image, pattern, design, canvas, or other background that is displayed in the user interface with one or more application icons and/or system user interface objects that are displayed as at least partially occluding the wallpaper. For example, the wallpaper is an image, pattern, design, canvas or other background that is displayed behind system information (e.g., an indication of a date, an indication of a time, notifications, or other text) and/or behind application icons and/or other user interface elements displayed in the user interface. In some embodiments, at least a portion of the wallpaper is displayed in front of system information (e.g., a portion of the wallpaper partially occludes the indication of the time). In some embodiments, the wallpaper is configurable by the user, whereby the user selects an image, pattern and/or design of the wallpaper. In some embodiments, the wallpaper includes a wallpaper background and/or one or more wallpaper foreground elements. For example, the wallpaper includes a color (e.g., or a pattern, such as a gradient of color) with objects (e.g., foreground elements, such as emoji, planets, or other objects) that are displayed, within the wallpaper, in front of the color that serves as the wallpaper background. As such, both the wallpaper background and wallpaper foreground elements make up the wallpaper that is displayed behind system information, application icons and/or other user interface elements. Displaying the first user interface includes: in accordance with a determination that the wallpaper has a first set of one or more wallpaper foreground elements (e.g., the determination of which wallpaper foreground elements are on the wallpaper is, optionally, made at a time while the user is configuring the wallpaper (e.g., in an edit mode of the wallpaper in which the user selects one or more foreground elements (e.g., emoji or other elements) to be displayed in the wallpaper), at a time right after the completion of the configuration process, and/or a time after the request to display the first user interface is received), the wallpaper has a first wallpaper background with an appearance that was automatically selected (e.g., by the computer system, without user intervention at the time of displaying the first user interface) based on the first set of one or more wallpaper foreground elements. For example, as described with reference to
In some embodiments, the computer system displays (806), via the display generation component, a second user interface for configuring the first user interface, wherein the second user interface includes respective selectable options for specifying one or more wallpaper foreground elements (e.g., symbols, emojis, stickers, icons, and other graphical elements or patterns that are repeated on a wallpaper background) that are to be included in the wallpaper of the first user interface (e.g., the wallpaper includes an emoji wallpaper, a cartoon character wallpaper, or another type of background with repeated symbols or graphics as the foreground elements overlaying an automatically selected wallpaper background color or texture). In some embodiments, while displaying the second user interface, the computer system detects a first user input selecting a first selectable option that corresponds to a first wallpaper foreground element (e.g., a first emoji, a first symbol, a first icon, or a first graphical element) (e.g., without displaying a corresponding wallpaper background underlying a selectable wallpaper foreground element, and/or while a set of multiple possible wallpaper backgrounds are independently selectable at a different time from the wallpaper foreground elements). In some embodiments, in response to detecting the first user input selecting the first selectable option that corresponds to the first wallpaper foreground element (e.g., without specifying a corresponding wallpaper background for the first wallpaper foreground element, and without presenting the first wallpaper foreground element with a corresponding wallpaper background as an option for selecting both as a combination), the computer system displays, in the second user interface, a representation of the wallpaper having the first wallpaper foreground element and the first wallpaper background with an appearance that is automatically selected based on one or more properties of the first wallpaper foreground element (e.g., the wallpaper background is selected based on the color(s) and style, and other characteristics of the first wallpaper foreground element). In some embodiments, the wallpaper background is displayed as part of a representation of the first user interface in the second user interface. In some embodiments, the computer system updates, in response to the selection of the first wallpaper foreground element, the wallpaper background and the set of foreground elements of an existing wallpaper that is currently shown in the first user interface, such that the wallpaper of the first user interface now comprises the first foreground element (e.g., in addition to, or instead of the previously displayed foreground element(s)) and the wallpaper background with an appearance that is automatically selected based on the first foreground element. In some embodiments, while displaying, in the second user interface, the representation of the wallpaper having the first wallpaper foreground element and the wallpaper background with an appearance that is automatically selected based on the one or more properties of the first wallpaper foreground element, the computer system detects a second user input selecting a second selectable option that corresponds to a second wallpaper foreground element, different from the first wallpaper foreground element. 
In some embodiments, in response to detecting the second user input selecting the second selectable option that corresponds to the second wallpaper foreground element (e.g., without specifying a corresponding wallpaper background for the second wallpaper foreground element, and without presenting the second wallpaper foreground element with a corresponding wallpaper background as an option for selecting both as a combination), the computer system updates, in the second user interface, the representation of the wallpaper to include the second wallpaper foreground element (e.g., in addition to the first foreground element, or instead of the first foreground element) and the second wallpaper background with an appearance that is automatically selected based on one or more properties of the second wallpaper foreground element (e.g., the second wallpaper background is selected, at least in part, based on the color(s) and style, and other characteristics of the second wallpaper foreground element). In some embodiments, the second wallpaper background is further based on the one or more properties of the first wallpaper foreground element, in conjunction with the one or more properties of the second wallpaper foreground element, if the first wallpaper foreground element is not deselected and will be used together with the second wallpaper foreground element in the wallpaper. In some embodiments, the second wallpaper background is only based on the second wallpaper foreground element, without regard to the first wallpaper foreground element (e.g., the second wallpaper foreground element completely replaces the first wallpaper foreground element in the wallpaper). For example, editing user interface 543 (
In some embodiments, after detecting the second user input selecting the second selectable option that corresponds to the second wallpaper foreground element (e.g., and updating the respective representation of the wallpaper to include the second wallpaper foreground element and the second wallpaper background with an appearance that is automatically selected based on the one or more properties of the second wallpaper foreground element), the computer system detects (808) a third user input that unselects the second selectable option that corresponds to the second wallpaper foreground element (e.g., the third user input deletes the second wallpaper foreground element from the set of selected wallpaper foreground elements, without providing an input to delete or change a corresponding wallpaper background for the second wallpaper foreground element, and without deleting or changing the second wallpaper foreground element and a corresponding wallpaper background together as a selectable combination). In some embodiments, in response to detecting the third user input that unselects the second selectable option that corresponds to the second wallpaper foreground element: the computer system ceases display of the second wallpaper foreground element in the representation of the wallpaper (e.g., and removes the second foreground element from the foreground of the wallpaper, so that when the first user interface is displayed, the wallpaper of the first user interface is displayed without the second foreground element) and updates, in the second user interface, the representation of the wallpaper to display a third wallpaper foreground element and a third wallpaper background with an appearance that is automatically selected based on one or more properties of the third wallpaper foreground element (e.g., and when the first user interface is displayed, the wallpaper of the first user interface is displayed with the third wallpaper background and the third wallpaper foreground element). In some embodiments, the third wallpaper background is the same as the first wallpaper background. For example, if the first and second wallpaper foreground elements were both selected, and the second foreground element is then deleted by the third user input, the wallpaper background reverts to the first wallpaper background that is automatically determined in accordance with one or more properties of the first wallpaper foreground element. In some embodiments, the third wallpaper background is distinct from both the first wallpaper background and the second wallpaper background (e.g., based on properties of any currently selected wallpaper foreground elements, if any, that remain after deleting the second wallpaper foreground element). For example, as described with reference to FIG. 5G1, in response to user input 566 corresponding to a request to delete thumbs up emoji 548 from the set of wallpaper foreground elements, the wallpaper background color is automatically, without additional user input, updated to a different color (e.g., the first color that was selected based on the smiley face emoji 550), as illustrated in editing user interface 569 in FIG. 5G2.
Automatically changing one or more visual features of a wallpaper based on a selected set of wallpaper foreground elements, including updating the visual features of the wallpaper in response to the user removing a wallpaper foreground element from the set of wallpaper foreground elements, without requiring the user to manually select the one or more visual features of the wallpaper, provides improved feedback that indicates a wallpaper foreground element has been deleted and reduces a number of inputs needed to design or edit the wallpaper.
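Continuing the illustrative Swift sketch above, the deletion path can be exercised directly: because the background is a computed property, removing an element re-derives the background from whatever elements remain, with no separate background input, mirroring the FIG. 5G1/5G2 example (the emoji and colors here are hypothetical):

```swift
var wallpaper = Wallpaper(foreground: [
    WallpaperElement(symbol: "👍", accentColor: Color(red: 0.9, green: 0.6, blue: 0.1)),
    WallpaperElement(symbol: "🙂", accentColor: Color(red: 1.0, green: 0.8, blue: 0.0)),
])

// Deleting the thumbs-up element: the smiley element is now first in the
// selection, so the derived background changes automatically.
wallpaper.foreground.removeAll { $0.symbol == "👍" }
print(wallpaper.background)  // color derived from the smiley element
```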
In some embodiments, the second user input selecting the second selectable option that corresponds to the second wallpaper foreground element includes (810) a user input that adds the second wallpaper foreground element as an additional foreground element to be included with the first wallpaper foreground element in the wallpaper (e.g., the user adds additional emoji or other wallpaper foreground elements to be displayed in the wallpaper). In some embodiments, updating the representation of the wallpaper to include the second wallpaper foreground element and the second wallpaper background with the appearance that is automatically selected based on the one or more properties of the second wallpaper foreground element includes updating the representation of the wallpaper to include both the first wallpaper foreground element and the second wallpaper foreground element overlaying the second wallpaper background. In some embodiments, the second wallpaper background has an appearance that is automatically selected based on one or more properties of the second wallpaper foreground element as well as one or more properties of the first wallpaper foreground element, or an interrelationship between the one or more properties of the first and second wallpaper foreground elements. For example, as described with reference to FIG. 5G2, in response to user input 568 corresponding to a request to add plane emoji 552 to the set of wallpaper foreground elements, the wallpaper is updated to include instances of smiley face emoji 550 and instances of plane emoji 552 repeated in a grid pattern as the wallpaper foreground elements, and the wallpaper background is updated to display a color that is selected based at least in part on one or more properties of plane emoji 552 (e.g., the color of the wallpaper background is optionally selected to complement and/or match one or more colors of plane emoji 552). Automatically changing one or more visual features of a wallpaper based on a selected set of wallpaper foreground elements, including updating the visual features of the wallpaper in response to the user adding a wallpaper foreground element to the set of wallpaper foreground elements, without requiring the user to manually select the one or more visual features of the wallpaper, provides improved feedback that indicates a wallpaper foreground element has been added and reduces a number of inputs needed to design or edit the wallpaper.
In some embodiments, the computer system displays (812), in the second user interface for configuring the first user interface, a search input area for receiving one or more search criteria for searching wallpaper foreground elements (e.g., a search bar for searching for wallpaper foreground elements using text-based or image-based search criteria). In some embodiments, the computer system detects a first set of search criteria in the search input area (e.g., detecting the user typing on a keyboard to input text and/or a voice search command that corresponds to search criteria); and in response to detecting the first set of search criteria in the search input area, displays one or more graphical elements (e.g., images, icons, emojis, symbols, and/or other objects suitable for serving as a wallpaper foreground element) that correspond to the first set of search criteria, as selectable options for specifying the one or more wallpaper foreground elements that are to be included in the wallpaper of the first user interface (e.g., without displaying other foreground elements that do not match the search criteria). For example, a plurality of emoji (e.g., or other icons, stickers, avatars, or foreground elements) are displayed as search results in response to the user input. For example, as described with reference to
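A minimal Swift sketch of the search behavior, assuming each library element carries keyword metadata (an assumption; the disclosure does not specify how foreground elements are indexed):

```swift
import Foundation

struct LibraryElement {
    let symbol: String
    let keywords: [String]  // assumed search metadata per element
}

func search(_ library: [LibraryElement], query: String) -> [LibraryElement] {
    let q = query.lowercased()
    guard !q.isEmpty else { return library }
    // Keep only matching elements; elements that do not match the search
    // criteria are not displayed, per the behavior described above.
    return library.filter { element in
        element.keywords.contains { $0.lowercased().contains(q) }
    }
}
```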
In some embodiments, displaying the first user interface that includes the set of user interface elements displayed overlaid on the wallpaper includes (814): in accordance with a determination that the wallpaper has a respective set of one or more wallpaper foreground elements and that the respective set of one or more wallpaper foreground elements includes a single wallpaper foreground element (e.g., as a standalone object, or as a pattern that is repeated multiple times throughout the wallpaper background), displaying the wallpaper with a respective wallpaper background that has an appearance that is automatically selected based on the one or more properties of the single wallpaper foreground element (e.g., automatically selecting a color of the wallpaper background based on the single wallpaper foreground element, wherein the selected color is associated with (e.g., assigned to or otherwise selected for) the single wallpaper foreground element). In some embodiments, different wallpaper foreground elements are associated with different wallpaper backgrounds by default, and if a respective wallpaper foreground element is the only wallpaper foreground element that has been specified for a wallpaper, the wallpaper is displayed with the respective wallpaper foreground element overlaying a wallpaper background that is associated with the respective wallpaper foreground element. There is no need for the user to separately specify the wallpaper background in addition to specifying the wallpaper foreground element in the configuration user interface. When a new wallpaper foreground element is selected to replace a previously selected wallpaper foreground element, the device displays the newly selected wallpaper foreground element overlaid on a new wallpaper background that is associated with the newly selected wallpaper foreground element. In some embodiments, a respective wallpaper foreground element has more than one wallpaper background available for use with the respective wallpaper foreground element, and the device selects one based on one or more other factors, such as the time of day, and/or visual properties of other foreground elements on the first user interface. For example, as described with reference to
In some embodiments, displaying the first user interface that includes the set of user interface elements displayed overlaid on the wallpaper includes (816): in accordance with a determination that the wallpaper has a respective set of one or more wallpaper foreground elements and that the respective set of one or more wallpaper foreground elements includes multiple wallpaper foreground elements (e.g., as standalone objects, or as patterns that are repeated multiple times throughout the wallpaper background), displaying the wallpaper with a respective wallpaper background that has an appearance that is automatically selected based on the one or more properties of the multiple wallpaper foreground elements. For example, in some embodiments, in accordance with a determination that the wallpaper includes at least one foreground element from the first set of one or more wallpaper foreground elements and at least one foreground element from the second set of the one or more wallpaper foreground elements, the wallpaper has a wallpaper background with an automatically selected appearance that was automatically selected based on both of the at least one foreground element from the first set of one or more wallpaper foreground elements and the at least one foreground element from the second set of the one or more wallpaper foreground elements. In some embodiments, the automatically selected appearance is automatically selected based on one or more properties that the at least one foreground element from the first set of one or more wallpaper foreground elements and the at least one foreground element from the second set of the one or more wallpaper foreground elements share (e.g., a color that is in both foreground elements). In some embodiments, the automatically selected appearance is automatically selected based on a combination of properties of the at least one foreground element from the first set of one or more wallpaper foreground elements and the at least one foreground element from the second set of the one or more wallpaper foreground elements (e.g., a color that complements and/or accents both foreground elements, such as a blended color of colors that occur in each of the foreground elements). For example, as described with reference to FIG. 5G1, in some embodiments, the second color of the wallpaper background of editing user interface 567 is a color that is based at least in part on all of the emoji that are currently selected as wallpaper foreground elements (e.g., both smiley face emoji 550 and thumbs up emoji 548). For example, the second color is selected as a color that is present in (e.g., or otherwise complements) both smiley face emoji 550 and thumbs up emoji 548. Automatically changing a color of a wallpaper background to a color that is associated with at least two different emojis, stickers, avatars, or icons used in the set of wallpaper foreground elements, without requiring the user to manually select the color of the wallpaper background, provides improved feedback indicating that two or more wallpaper foreground elements have been selected and reduces a number of inputs needed to design or edit the wallpaper.
In some embodiments, displaying the first user interface that includes the set of user interface elements displayed overlaid on the wallpaper includes (818): in accordance with a determination that the wallpaper has a respective set of one or more wallpaper foreground elements and that the respective set of one or more wallpaper foreground elements includes multiple wallpaper foreground elements (e.g., as standalone objects, or as patterns that are repeated multiple times throughout the wallpaper background), displaying the wallpaper with a respective wallpaper background that has an appearance that is automatically selected based on the one or more properties of a last selected wallpaper foreground element among the multiple wallpaper foreground elements. For example, in some embodiments, if the multiple wallpaper foreground elements include the first wallpaper foreground element selected by the first user input and the second wallpaper foreground element selected by the second user input, the wallpaper has a wallpaper background with an automatically selected appearance that was automatically selected based on the second wallpaper foreground element and not based on the first wallpaper foreground element. For example, as described with reference to FIG. 5G1, in some embodiments, the second color of the wallpaper background of editing user interface 567 is based on the last (e.g., most recently) selected emoji (e.g., thumbs up emoji 548) only (e.g., and the second color is not based on previously selected emoji, such as smiley face emoji 550). For example, as described with reference to
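The combined and last-selected policies described in the preceding paragraphs can be expressed as one selection function. A Swift sketch reusing the illustrative `Color` type from the earlier sketch; the averaging rule for the combined case is an assumption, and with a single selected element the two policies coincide:

```swift
enum BackgroundPolicy {
    case combineAll     // blend properties of every selected element
    case lastSelected   // only the most recent selection matters
}

func backgroundColor(for selection: [Color],
                     policy: BackgroundPolicy) -> Color? {
    guard !selection.isEmpty else { return nil }
    switch policy {
    case .lastSelected:
        // Earlier selections are ignored when choosing the background.
        return selection.last
    case .combineAll:
        // One plausible blend: average the colors associated with every
        // selected foreground element.
        let n = Double(selection.count)
        return Color(red: selection.reduce(0) { $0 + $1.red } / n,
                     green: selection.reduce(0) { $0 + $1.green } / n,
                     blue: selection.reduce(0) { $0 + $1.blue } / n)
    }
}
```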
In some embodiments, while displaying the representation of the wallpaper having a respective wallpaper foreground element (e.g., the first wallpaper foreground element, the second wallpaper foreground element, and/or another wallpaper foreground element) and a respective wallpaper background (e.g., the first wallpaper background, the second wallpaper background, or another wallpaper background) with an appearance that is automatically selected based on one or more properties of the respective wallpaper foreground element, the computer system detects (820) a fourth user input that changes the respective wallpaper background (e.g., switching to another wallpaper background as a whole) or one or more properties of the respective wallpaper background (e.g., color, layout, brightness, hue, translucency, texture, and/or other properties). In some embodiments, in response to detecting the fourth user input, the computer system updates display of the representation of the wallpaper to include the respective wallpaper foreground element and an updated wallpaper background specified by the fourth user input. In some embodiments, the user is enabled to reselect the automatically selected appearance for the wallpaper background to switch between the user-selected appearance (e.g., the color selected by the user) and the automatically selected appearance (e.g., the color selected automatically based on the foreground elements) for the wallpaper background. For example, in some embodiments, when the representation of the wallpaper is displayed with the user-specified wallpaper background, the second user interface includes a selectable option that corresponds to the wallpaper background having the automatically selected appearance. In response to detecting a fifth user input that selects that selectable option, the device updates display of the representation of the wallpaper to include the respective wallpaper foreground element and the original automatically selected wallpaper background that is automatically selected based on the one or more properties of the respective wallpaper foreground element. For example, as described with reference to
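A Swift sketch of the override behavior just described, reusing the illustrative `Color` type; the field names are hypothetical. The user-specified background takes precedence while it exists, and the selectable option simply clears it to restore the automatically selected appearance:

```swift
struct BackgroundChoice {
    var userOverride: Color?  // set by the fourth user input, if any
    var automatic: Color      // derived from the foreground elements

    // The user-specified background wins while present.
    var effective: Color { userOverride ?? automatic }

    // The selectable option described above: reverting to the
    // automatically selected appearance.
    mutating func restoreAutomatic() {
        userOverride = nil
    }
}
```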
In some embodiments, at least one of the first set of one or more wallpaper foreground elements and the second set of one or more wallpaper foreground elements include (822) one or more user-generated avatars (e.g., animated characters, models, icons, and/or emoji that are generated based on an appearance of a user (e.g., a user of the device, and/or one or more contacts of the user in one or more communication applications) or are otherwise configured by the user). In some embodiments, available foreground elements are stored in a library that can be searched and browsed in the second user interface. In some embodiments, a user is enabled to add to and/or create foreground elements in the library of available foreground elements. For example, as described with reference to
In some embodiments, while the display generation component is in a first orientation (e.g., a portrait orientation, a landscape orientation, or another orientation specified relative to the display or the physical environment), the computer system displays (824) the first user interface with a first layout that corresponds to the first orientation, including displaying the set of user interface elements overlaid on the wallpaper with the first layout, wherein a respective set of wallpaper foreground elements are oriented in a first manner relative to the set of user interface elements (e.g., oriented with the same up and down directions, oriented at a first angle relative to the up and down direction of the set of user interface elements). In some embodiments, while displaying the first user interface with the first layout, the computer system detects a change in orientation of the display generation component from the first orientation to a second orientation, different from the first orientation; and in response to detecting the change in orientation of the display generation component: in accordance with a determination that the change in orientation of the display generation component meets rotation criteria (e.g., criteria based on the amount, speed, rate of change, and/or direction of the change in orientation of the display generation component), the computer system displays the first user interface with a second layout that corresponds to the second orientation, including displaying the set of user interface elements overlaid on the wallpaper with the second layout (e.g., displaying an animation that shows the set of user interface elements rotating and/or moving in the first user interface from the first layout to the second layout), and rotates respective instances of the one or more wallpaper foreground elements in the respective set of one or more wallpaper foreground elements (e.g., such that the respective set of wallpaper foreground elements are oriented in the first manner relative to the set of user interface elements in the second layout (e.g., oriented with the same up and down directions, oriented at the first angle relative to the up and down direction of the set of user interface elements)) while maintaining respective positions of the respective instances of the one or more wallpaper foreground elements in the first user interface (e.g., individual instances of the wallpaper foreground element(s) are rotated in place such that the instances of the wallpaper foreground elements are rotated without shifting them relative to the display or the wallpaper background). In some embodiments, additional instances of the wallpaper foreground elements are displayed in some portions of the display area when the first user interface is displayed with the second layout, while some instances of the wallpaper foreground elements are no longer displayed in other display areas (e.g., corners and/or sides of the display generation component). For example, as described with reference to
In some embodiments, in response to detecting the change in orientation of the display generation component: in accordance with the determination that the change in orientation of the display generation component meets the rotation criteria (e.g., criteria based on the amount, speed, rate of change, and/or direction of the change in orientation of the display generation component), the computer system: expands (826) a display area of a respective wallpaper background (e.g., the first wallpaper background, the second wallpaper background, or another automatically selected wallpaper background or manually selected wallpaper background) of the wallpaper relative to the respective instances of the one or more wallpaper foreground elements in the respective set of one or more wallpaper foreground elements, while the respective instances of the one or more wallpaper foreground elements are rotated at their respective positions in the first user interface (e.g., the expanded display area of the wallpaper background includes one or more areas that were previously outside of the display area covered by the wallpaper background corresponding to the first orientation (e.g., corners, and/or edges)). For example, as described with reference to
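A Swift sketch of the rotate-in-place behavior described in the two preceding paragraphs; the geometry types are illustrative. Each tiled instance changes only its own angle, so the pattern does not shift relative to the display while the background re-covers the newly visible areas:

```swift
struct PatternInstance {
    var x: Double
    var y: Double          // position is fixed relative to the wallpaper
    var rotation: Double   // radians, relative to the display
}

func applyOrientationChange(to instances: [PatternInstance],
                            byAngle delta: Double) -> [PatternInstance] {
    instances.map { instance in
        // Rotate each instance in place: only its own angle changes, so
        // the instances are not shifted relative to the display or the
        // wallpaper background.
        PatternInstance(x: instance.x, y: instance.y,
                        rotation: instance.rotation + delta)
    }
}
```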
In some embodiments, detecting the first user input selecting the first selectable option that corresponds to the first wallpaper foreground element includes (828) detecting entry of a first emoji using an emoji keyboard; displaying the representation of the wallpaper having the first wallpaper foreground element and the first wallpaper background includes displaying a repeated pattern of the first emoji overlaying a first background color that is automatically selected based on one or more properties of the first emoji; and detecting the second user input selecting the second selectable option that corresponds to the second wallpaper foreground element includes detecting entry of a second emoji using the emoji keyboard, the second emoji being different from the first emoji. In some embodiments, updating the representation of the wallpaper to include the second wallpaper foreground element and the second wallpaper background includes displaying a repeated pattern of the second emoji (e.g., in place of or in addition to the first emoji, depending on whether the first emoji is replaced or remains selected) overlaying a second background color that is automatically selected based on one or more properties of the second emoji, the second background color replacing the first background color in the representation of the wallpaper. For example, as described with reference to
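One plausible way to lay out the repeated emoji pattern is a staggered grid; the following Swift sketch is an assumption for illustration, since the disclosure does not specify a tiling rule:

```swift
func tilePositions(width: Double, height: Double,
                   spacing: Double) -> [(x: Double, y: Double)] {
    precondition(spacing > 0, "spacing must be positive")
    var positions: [(x: Double, y: Double)] = []
    var row = 0
    var y = spacing / 2
    while y < height {
        // Stagger alternate rows by half the spacing for a repeated,
        // offset pattern.
        var x = (row % 2 == 0) ? spacing / 2 : spacing
        while x < width {
            positions.append((x: x, y: y))
            x += spacing
        }
        y += spacing
        row += 1
    }
    return positions
}
```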
It should be understood that the particular order in which the operations in
As described below, method 900 automatically updates a wallpaper displayed in a user interface in response to a user selecting a user interface object from a set of user interface objects displayed in the wallpaper, thereby displaying an updated wallpaper for the user without requiring the user to manually configure the wallpaper, which provides additional control options without cluttering the user interface with additional displayed controls and provides improved feedback to the user.
The computer system receives (902) a request to display a first user interface. In some embodiments, the first user interface is a system user interface that includes a wallpaper and a plurality of system user interface objects (e.g., time, widgets, application icons, status indicators, or other system user interface objects), where the wallpaper includes one or more foreground elements (e.g., a pattern, an image, or other graphical elements) overlaying a background (e.g., a background color, a background gradient, or a background texture), optionally in a respective layout arrangement. In some embodiments, the first user interface is a wake screen user interface (e.g., wake screen user interface 490) that is displayed when the device is in a low-power mode (e.g., with the display turned off or with the display in a dimmed always-on state), when the device is transitioned from a low-power mode to a regular operating mode (e.g., with the display turned on to a regular brightness level), when the device is transitioned from a regular operating mode (e.g., with the display lit at the regular brightness level) to a low-power mode (e.g., with the display turned off or dimmed), and/or when the device transitions from an application UI (e.g., application user interface 493) and home screen to a locked and/or restricted state. In some embodiments, the first user interface is a home screen user interface (e.g., home screen user interface 492). In some embodiments, the wake screen user interface and the home screen user interface are configured together as a pair, so a change to the wake screen configuration is applied to the configuration of the home screen user interface, and, optionally, vice versa. In some embodiments, receiving the request to display the first user interface includes detecting a raise to wake input or detecting generation of a notification or alert, while the device is in a low-power mode. In some embodiments, receiving the request to display the first user interface includes detecting a power-off input or device sleep input (e.g., a press on the power button, prolonged inactivity by the user, or other event or input that puts the device in a low power mode) while the device is in a regular operating mode. In some embodiments, detecting the request to display the first user interface includes detecting an input that corresponds to a request to close an open application (e.g., an upward edge swipe gesture, a press on the home button, or other inputs), or an input that corresponds to a request to display the coversheet user interface and put the device in a restricted mode (e.g., a downward edge swipe gesture, a press on the power button, or other inputs) detected while an application user interface is displayed.
In response to receiving the request to display the first user interface, the computer system displays (904) the first user interface with a set of user interface elements (e.g., wake screen features such as notifications, a date/time, widgets, and/or status indicators or home screen features such as widgets and/or application icons) overlaid on a wallpaper that includes a plurality of selectable elements (e.g., the wallpaper includes selectable representations of planets in a solar system, or the wallpaper includes selectable avatars of the user's family members, or other selectable representations of a group of subjects). For example,
While displaying the first user interface, the computer system detects (906) a first user input directed to the wallpaper while a first selectable element and a second selectable element of the plurality of selectable elements are concurrently displayed in the wallpaper. For example, in
In response to detecting the first user input directed to the wallpaper, the computer system (908): in accordance with a determination that the first user input meets selection criteria and is directed to the first selectable element of the plurality of selectable elements, updates the wallpaper of the first user interface to include the first selectable element, without including the second selectable element of the plurality of selectable elements (e.g., and without including other selectable elements of the plurality of selectable elements). For example, as described with reference to
In some embodiments, the plurality of selectable elements, including the first selectable element and the second selectable element that are concurrently displayed in the wallpaper, include (910) respective representations of a plurality of planetary bodies (e.g., planets of our solar system, or planets of other solar systems). In some embodiments, the plurality of selectable elements include selectable images of planets in the solar system. In some embodiments, the plurality of selectable elements are arranged in the wallpaper in accordance with their relative positions in the solar system. For example, as described with reference to
In some embodiments, displaying the first user interface with the set of user interface elements (e.g., wake screen features such as notifications, a date/time, widgets, and/or status indicators or home screen features such as widgets and/or application icons) overlaid on the wallpaper that includes the plurality of selectable elements includes (912): in accordance with a determination that a current time is a first time that corresponds to a first celestial arrangement of the plurality of planetary bodies, arranging the respective representations of the plurality of planetary bodies in the wallpaper in accordance with the first celestial arrangement of the plurality of planetary bodies; and in accordance with a determination that the current time is a second time that corresponds to a second celestial arrangement of the plurality of planetary bodies that is different from the first celestial arrangement of the plurality of planetary bodies, arranging the respective representations of the plurality of planetary bodies in the wallpaper in accordance with the second celestial arrangement of the plurality of planetary bodies. For example, in some embodiments, the representations of the planets in the wallpaper are concurrently displayed in a solar system diagram (e.g., an orrery, a representation of a solar system or other planetary model) in accordance with the current time (e.g., and/or a current date) and updated over time to represent the respective positions of the respective planets along their orbits around the Sun. For example, as described with reference to
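A Swift sketch of arranging the planetary representations according to the current time. The orbital periods (in days) are real, but the circular-orbit model and the Unix epoch as a reference point are simplifications for illustration only:

```swift
import Foundation

let orbitalPeriods: [String: Double] = [
    "Mercury": 87.97, "Venus": 224.70, "Earth": 365.25, "Mars": 686.98,
]

func planetAngles(at date: Date) -> [String: Double] {
    let days = date.timeIntervalSince1970 / 86_400
    return orbitalPeriods.mapValues { period in
        // Fraction of the orbit completed at `date`, as an angle in
        // radians; re-evaluating at a later time yields the updated
        // celestial arrangement.
        (days.truncatingRemainder(dividingBy: period) / period) * 2 * .pi
    }
}
```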
In some embodiments, in response to detecting the first user input directed to the wallpaper, the computer system (914): in accordance with the determination that the first user input meets the selection criteria and is directed to the first selectable element of the plurality of selectable elements, displays a first animation that transitions the wallpaper (e.g., over a period of time) from displaying the plurality of selectable elements to displaying the first selectable element without the second selectable element; and in accordance with the determination that the first user input meets the selection criteria and is directed to the second selectable element of the plurality of selectable elements, displays a second animation that transitions the wallpaper (e.g., over a period of time) from displaying the plurality of selectable elements to displaying the second selectable element without the first selectable element. In some embodiments, the animation that is displayed includes a simulated view of a viewer traveling through space across the plurality of selectable elements to the final position occupied by the selected element. In some embodiments, the animation includes moving the respective selected element forward in the user interface as if the selected element is traveling toward the user, while moving the other selectable elements in the plurality of selectable elements out of a display area of the display generation component. For example, as described with reference to
In some embodiments, displaying the first animation includes (916) displaying a change in view of a virtual viewer that travels toward the first selectable element of the plurality of selectable elements, and displaying the second animation includes displaying a change in view of the virtual viewer that travels toward the second selectable element of the plurality of selectable elements (e.g., visually deemphasizing the elements that are not selected and visually emphasizing the selected element, such as enlarging and/or moving the selected element towards a center of the field of view provided by the display, as if the viewer is traveling closer to the selected element from his/her initial position). For example, as described with reference to
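A Swift sketch of the fly-toward animation: a virtual camera is interpolated from its starting position toward the selected element as the animation progresses. The smoothstep easing is an assumption:

```swift
func cameraPosition(start: (x: Double, y: Double, z: Double),
                    target: (x: Double, y: Double, z: Double),
                    progress t: Double) -> (x: Double, y: Double, z: Double) {
    let eased = t * t * (3 - 2 * t)  // smoothstep easing over [0, 1]
    return (x: start.x + (target.x - start.x) * eased,
            y: start.y + (target.y - start.y) * eased,
            z: start.z + (target.z - start.z) * eased)
}
```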
In some embodiments, while displaying the first user interface, including the wallpaper that has been updated in response to detecting the first user input (e.g., the wallpaper displaying the first selectable element without the second selectable element, or the wallpaper displaying the second selectable element without the first selectable element), the computer system detects (918) a second user input directed to the first user interface; and in response to detecting the second user input that is directed to the first user interface: in accordance with a determination that the second user input meets dismissal criteria for dismissing the first user interface (e.g., the second user input includes an upward edge swipe gesture, a press on the home button, or another user input that is used to dismiss the first user interface), restores the wallpaper of the first user interface to display the plurality of selectable elements, including concurrently displaying the first selectable element and the second selectable element, without dismissing the first user interface. In some embodiments, while displaying the first user interface with the wallpaper that has been restored in response to the second user input, the computer system detects a third user input directed to the first user interface; and in response to detecting the third user input and in accordance with a determination that the third user input also meets the dismissal criteria for dismissing the first user interface, the computer system dismisses the first user interface and displays a home screen user interface, wherein the home screen user interface includes an arrangement of application icons overlaid on a wallpaper that includes the plurality of selectable elements. For example, as described with reference to
In some embodiments, detecting the second user input includes (920) detecting a swipe gesture. In some embodiments, the swipe gesture that meets the dismissal criteria is a swipe in a first direction (e.g., an upward swipe, a downward swipe, and/or a side swipe). In some embodiments, the swipe gesture that meets the dismissal criteria is an edge swipe gesture that starts from a first edge of a touch-sensitive surface that is associated with the display generation component. For example, as described with reference to
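The two-stage dismissal described above behaves like a small state machine: the first qualifying swipe restores the full set of selectable elements without dismissing, and a second qualifying swipe dismisses to the home screen. A Swift sketch with illustrative state names:

```swift
enum WakeScreenState {
    case showingSelectedElement  // wallpaper shows one chosen element
    case showingAllElements      // wallpaper restored to the full set
    case dismissed               // home screen shown
}

func handleDismissalSwipe(from state: WakeScreenState) -> WakeScreenState {
    switch state {
    case .showingSelectedElement:
        return .showingAllElements  // first swipe restores, does not dismiss
    case .showingAllElements:
        return .dismissed           // second swipe actually dismisses
    case .dismissed:
        return .dismissed
    }
}
```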
In some embodiments, while displaying the wallpaper that includes the plurality of selectable elements (e.g., in the first user interface, or in a representation of the first user interface shown in an editing user interface), the computer system detects (922) user selection of the first selectable element from the plurality of selectable elements in the wallpaper; and in response to detecting the user selection of the first selectable element from the plurality of selectable elements in the wallpaper, the computer system updates the wallpaper (e.g., in the first user interface, or in the representation of the first user interface shown in the editing user interface) to display the first selectable element without displaying the second selectable element (and without displaying other selectable elements of the plurality of selectable elements). In some embodiments, after updating the wallpaper in response to detecting the user selection of the first selectable element, while displaying the first user interface with the wallpaper including the first selectable element without the second selectable element, the computer system detects a request to dismiss the first user interface (e.g., an upward edge swipe gesture or a press on the home button to dismiss the wake screen user interface and display a home screen user interface, as described with reference to FIGS. 4C1-4C2). In some embodiments, in response to detecting the request to dismiss the first user interface, the computer system: in accordance with a determination that the user selection of the first selectable element from the plurality of selectable elements occurred while displaying the first user interface, displays a second user interface (e.g., the home screen user interface, or another user interface that has a wallpaper that is coordinated with the wallpaper of the first user interface) with a first wallpaper including the plurality of selectable elements; and in accordance with a determination that the user selection of the first selectable element from the plurality of selectable elements occurred while displaying an editing user interface of the first user interface, displays the second user interface with a second wallpaper including the first selectable element without including the second selectable element. For example, as described with reference to
In some embodiments, in an editing user interface for the first user interface, the computer system detects (924) user selection of a respective selectable element from the plurality of selectable elements (e.g., from a representation of the wallpaper while displaying the representation of the wallpaper that includes the plurality of selectable elements (e.g., in a representation of the first user interface shown in an editing user interface, or in a representation of the wallpaper in an editing user interface)). In some embodiments, in response to detecting the user selection of the respective selectable element from the plurality of selectable elements in the wallpaper, the computer system updates the wallpaper (e.g., in the first user interface, or in the representation of the first user interface shown in the editing user interface) to include the respective selectable element without including at least one of the plurality of selectable elements (e.g., without including the first selectable element, the second selectable element, and/or other selectable elements of the plurality of selectable elements that are not selected) that is different from the respective selectable element. In some embodiments, after updating the wallpaper in response to detecting the user selection of the respective selectable element in the editing user interface, the computer system detects a second request to display the first user interface. In some embodiments, in response to detecting the second request to display the first user interface, the computer system displays the first user interface with the set of user interface elements (e.g., wake screen features such as notifications, a date/time, widgets, and/or status indicators or home screen features such as widgets and/or application icons) overlaid on the wallpaper that includes the respective selectable element without including at least one of the plurality of selectable elements that is different from the respective selectable element (e.g., including the first selectable element without including the second selectable element, or including the second selectable element without including the first selectable element, or including one of the selectable elements without including other elements of the plurality of selectable elements). For example, as described with reference to
In some embodiments, the editing user interface for the first user interface includes (926) a respective selectable option for rotating through the plurality of selectable elements in the wallpaper. In some embodiments, while displaying the editing user interface for the first user interface, the computer system detects user selection of the respective selectable option for rotating through the plurality of selectable elements in the wallpaper (e.g., a toggle switch for turning on or off the automatic rotation through the selectable elements in the wallpaper). In some embodiments, in response to detecting the user selection of the respective selectable option for rotating through the plurality of selectable elements in the wallpaper, the computer system enables automatic selection of different selectable elements of the plurality of selectable elements at different times (e.g., in response to a set of conditions being met at a time that the first user interface is displayed or requested to be displayed, such as time-based conditions, a schedule, trigger-based conditions, or other conditions) to include in the wallpaper. In some embodiments, after enabling the automatic selection of different selectable elements of the plurality of selectable elements at different times to include in the wallpaper, the computer system detects a third request to display the first user interface; and in response to detecting the third request to display the first user interface, displays the first user interface with the set of user interface elements (e.g., wake screen features such as notifications, a date/time, widgets, and/or status indicators or home screen features such as widgets and/or application icons) overlaid on the wallpaper, including: at a first point in time, displaying the first user interface with the wallpaper including the first selectable element without including the second selectable element; and at a second point in time after the first point in time: in accordance with a determination that wallpaper updating criteria are met (e.g., a previous selectable element has been used for more than a threshold amount of time, the first user interface has been re-invoked after the device was in a low-power mode, location of the device has changed, or other conditions for switching the selectable element in the wallpaper have been met), displaying the first user interface with the wallpaper including the second selectable element without including the first selectable element; and in accordance with a determination that the wallpaper updating criteria are not met, displaying the first user interface with the wallpaper including the first selectable element without including the second selectable element. For example, as described with reference to
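A Swift sketch of the rotate-through option, assuming a hypothetical minimum-display-interval criterion (the paragraph above lists several possible updating criteria; this is just one):

```swift
import Foundation

struct RotatingWallpaper {
    var elements: [String]                    // e.g., planet names
    var index = 0
    var lastSwitch = Date.distantPast
    var minimumInterval: TimeInterval = 3600  // assumed 1-hour threshold

    mutating func elementToDisplay(now: Date) -> String {
        // Advance only when the updating criteria are met; otherwise keep
        // showing the previously selected element.
        if now.timeIntervalSince(lastSwitch) >= minimumInterval {
            index = (index + 1) % elements.count
            lastSwitch = now
        }
        return elements[index]
    }
}
```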
In some embodiments, the computer system displays (928) respective labels for the plurality of selectable elements in a representation of the first user interface in the editing user interface (e.g., a name of the planet is displayed adjacent to and/or overlapping the representation of the planet in the solar system view). For example, as described with reference to
In some embodiments, while displaying the respective labels for the plurality of selectable elements in the representation of the first user interface in the editing user interface (e.g., a name of the planet is displayed adjacent to and/or overlapping the representation of the planet in the solar system view), in accordance with a determination that user input has not been received for at least a first threshold amount of time, the computer system visually deemphasizes (930) the respective labels for the plurality of selectable elements displayed in the representation of the first user interface. In some embodiments, the plurality of labels fade over time. In some embodiments, the plurality of labels cease to be displayed after a period of inactivity on the representation of the first user interface. For example, as described with reference to
In some embodiments, while the respective labels for the plurality of selectable elements are visually deemphasized, the computer system detects (932) a third user input directed to the editing user interface (e.g., the third user input is a tap input directed to an unoccupied space in the editing user interface (e.g., an area that is not occupied by an affordance or the selectable elements)); and in response to the third user input directed to the editing user interface, the computer system restores a level of visual emphasis of the respective labels for the plurality of selectable elements in the representation of the first user interface. For example, as described with reference to
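A Swift sketch of the label deemphasis and restoration, with an assumed inactivity threshold and opacity values chosen purely for illustration:

```swift
import Foundation

struct LabelEmphasis {
    var lastInputTime: Date
    var fadeThreshold: TimeInterval = 5  // assumed seconds of inactivity

    // Full emphasis until the inactivity threshold passes, then dimmed.
    func opacity(now: Date) -> Double {
        now.timeIntervalSince(lastInputTime) < fadeThreshold ? 1.0 : 0.3
    }

    // A tap on unoccupied space restores the labels' visual emphasis.
    mutating func registerInput(at time: Date) {
        lastInputTime = time
    }
}
```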
In some embodiments, while displaying the first user interface that includes the set of user interface elements overlaid on the wallpaper that includes the plurality of selectable elements (e.g., concurrently displaying the first selectable element and the second selectable element, and/or other selectable elements of the plurality of selectable elements), the computer system detects (934) a fourth user input directed to a first user interface element of the set of user interface elements that are displayed overlaid on the wallpaper; and in response to detecting the fourth user input, in accordance with a determination that the fourth user input meets action criteria (e.g., criteria for detecting a long press, a light press, a double tap, or other types of input), performs a first operation corresponding to the first user interface element. In some embodiments, the first user interface element corresponds to a widget associated with a respective application; and in response to the fourth user input selecting the widget, the respective application associated with the widget is opened and/or displayed. In some embodiments, the first user interface element corresponds to an application shortcut (e.g., for accessing the flashlight, camera, or other application); and in response to the fourth user input, the application for the application shortcut is opened and/or displayed. In some embodiments, the first user interface element includes a time indication and in response to the fourth user input, an editing user interface for the first user interface is displayed. For example, as described with reference to
In some embodiments, in response to detecting the first user input directed to the wallpaper, the computer system (936): in accordance with a determination that the first user input meets the selection criteria and is directed to the first selectable element of the plurality of selectable elements, maintains display of the set of user interface elements displayed overlaid on the wallpaper after the wallpaper is updated to include the first selectable element without including the second selectable element; and in accordance with a determination that the first user input meets the selection criteria and is directed to the second selectable element of the plurality of selectable elements, maintains display of the set of user interface elements displayed overlaid on the wallpaper after the wallpaper is updated to include the second selectable element without including the first selectable element (e.g., the wake screen date and/or time indications and optionally widgets displayed on the wake screen are maintained while the wallpaper changes (e.g., to show different planets as the wallpaper)). For example, the same wake screen elements continue to be displayed even as the user changes the planet (or other element) that is featured as the wallpaper background. For example, as described with reference to
In some embodiments, the wallpaper that includes the plurality of selectable elements further includes (938) a first wallpaper background underlying the plurality of selectable elements (e.g., a wallpaper background of space, such as black with stars or other entities); and in response to detecting the first user input directed to the wallpaper, the computer system: in accordance with a determination that the first user input meets the selection criteria and is directed to the first selectable element of the plurality of selectable elements, maintains display of the first wallpaper background underlying the first selectable element after the wallpaper is updated to include the first selectable element without including the second selectable element; and in accordance with a determination that the first user input meets the selection criteria and is directed to the second selectable element of the plurality of selectable elements, maintains display of the first wallpaper background underlying the second selectable element after the wallpaper is updated to include the second selectable element without including the first selectable element. In some embodiments, displaying the updated wallpaper while maintaining the wallpaper background comprises maintaining the background of space, although a different portion of space is displayed in the updated background (e.g., a portion of space that corresponds to the space around the selectable element that has been selected). For example, as described with reference to
It should be understood that the particular order in which the operations in
As described below, method 1000 enables a user to configure a user interface while the device is in different orientations, thereby providing the user with options to add different application widgets in different configurations that can be accessed by rotating the device, which provides additional control options without cluttering the user interface with additional displayed controls.
The computer system receives (1002) a request to display a first user interface. In some embodiments, the first user interface is a system user interface that includes a wallpaper and a plurality of system user interface objects (e.g., time, widgets, application icons, status indicators, or other system user interface objects), where the wallpaper includes one or more foreground elements (e.g., a pattern, an image, or other graphical elements) overlaying a background (e.g., a background color, a background gradient, or a background texture), optionally in a respective layout arrangement. In some embodiments, the first user interface is a wake screen user interface (e.g., wake screen user interface 490) that is displayed when the device is in a low-power mode (e.g., with the display turned off or with the display in a dimmed always-on state), when the device is transitioned from a low-power mode to a regular operating mode (e.g., with the display turned on to a regular brightness level), when the device is transitioned from a regular operating mode (e.g., with the display lit at the regular brightness level) to a low-power mode (e.g., with the display turned off or dimmed), and/or when the device transitions from an application UI (e.g., application user interface 493) and home screen to a locked and/or restricted state. In some embodiments, the first user interface is a home screen user interface (e.g., home screen user interface 492). In some embodiments, the wake screen user interface and the home screen user interface are configured together as a pair, so a change to the wake screen configuration is applied to the configuration of the home screen user interface, and, optionally, vice versa. In some embodiments, receiving the request to display the first user interface includes detecting a raise to wake input or detecting generation of a notification or alert, while the device is in a low-power mode. In some embodiments, receiving the request to display the first user interface includes detecting a power-off input or device sleep input (e.g., a press on the power button, prolonged inactivity by the user, or other event or input that puts the device in a low power mode) while the device is in a regular operating mode. In some embodiments, detecting the request to display the first user interface includes detecting an input that corresponds to a request to close an open application (e.g., an upward edge swipe gesture, a press on the home button, or other inputs), or an input that corresponds to a request to display the coversheet user interface and put the device in a restricted mode (e.g., a downward edge swipe gesture, a press on the power button, or other inputs) detected while an application user interface is displayed.
In response to receiving the request to display the first user interface, the computer system displays (1004) the first user interface (e.g., a wake screen user interface, a home screen user interface, or another system user interface) that includes a plurality of widgets (e.g., along with one or more other user interface elements, such as wake screen features (e.g., notifications, a date/time, and/or status indicators) or home screen features (e.g., application icons)), including: in accordance with a determination that the first user interface is displayed in a first orientation (e.g., a landscape orientation, an upright orientation, or another canonical orientation of the user interface or display), displaying a first version of the first user interface that includes a first set of widgets (e.g., the first version of the first user interface is displayed in the first orientation relative to the display and the first set of widgets). For example, as described with reference to
In some embodiments, the device includes sensors that detect a change in the orientation of the display generation component, and in response to detecting movement of the display generation component: in accordance with a determination that the movement of the display generation component caused the display generation component to rotate from the first orientation to a second orientation that meets second orientation criteria, different from the first orientation criteria (e.g., the second orientation is recognized as a portrait orientation, as opposed to the landscape orientation, or vice versa), the computer system displays a second version of the first user interface. In some embodiments, the first version of the first user interface includes a first set of widgets arranged in accordance with a first layout corresponding to the first orientation of the display generation component, the second version of the first user interface includes a second set of widgets arranged in accordance with a second layout corresponding to the second orientation of the display generation component, and at least the first set of widgets and the second set of widgets, or the first layout and the second layout, are separately configurable by a user. In some embodiments, the user is enabled to configure the first set of widgets into a first layout in the portrait orientation and the second, different, set of widgets into a second, different, layout in the landscape orientation; or the user is enabled to configure the first set of widgets and the second set of widgets differently, and the computer system automatically arranges them into the first layout and the second layout in the two different orientations. In some embodiments, the user is enabled to select the same set of widgets for both orientations, and separately configure the layouts (e.g., sequence and spatial arrangement) for the two orientations. In some embodiments, for each of the orientations, the user can separately configure the set of widgets and their spatial arrangement in a configuration mode (e.g., the user is enabled to move, reorder, add, delete, and/or change the set of widgets (e.g., to include different user interface objects) in an arrangement while a respective version of the first user interface is displayed based on the current orientation of the display generation component).
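A Swift sketch of separately configurable widget sets per orientation, as described above; the identifiers and types are illustrative placeholders:

```swift
enum Orientation {
    case portrait, landscape
}

struct WidgetConfiguration {
    var sets: [Orientation: [String]] = [:]  // widget identifiers

    // Edits made in one orientation never disturb the other's set.
    mutating func setWidgets(_ widgets: [String], for orientation: Orientation) {
        sets[orientation] = widgets
    }

    func widgets(for orientation: Orientation) -> [String] {
        sets[orientation] ?? []
    }
}
```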
In some embodiments, while displaying the first version of the first user interface that includes the first set of widgets in accordance with the determination that the first user interface is displayed in the first orientation, the computer system detects (1006) a first set of one or more user inputs that corresponds to a request to replace (e.g., add, remove, reorder or otherwise change) the first set of widgets in the first version of the first user interface with a third set of widgets different from the first set of widgets. In some embodiments, the first set of one or more user inputs includes a tap and hold input on the first set of widgets to invoke display of a widget picker to add additional widgets into the existing set of widgets on the first version of the first user interface, followed by subsequent inputs that select widgets using the widget picker to add widgets into the existing set of widgets on the first version of the first user interface, and/or a tap and hold input on one of the widgets in the first set of widgets to invoke display of deletion affordances for the respective widgets of the first set of widgets, followed by selection of the deletion affordances for one or more of the first set of widgets to delete the corresponding widgets from the first version of the first user interface. In some embodiments, in response to detecting the first set of one or more user inputs, the computer system updates the first version of the first user interface displayed in the first orientation to include the third set of widgets instead of the first set of widgets. For example, as described with reference to
In some embodiments, while displaying the second version of the first user interface that includes the second set of widgets in accordance with the determination that the first user interface is displayed in the second orientation, the computer system detects (1008) a second set of one or more user inputs that corresponds to a request to replace (e.g., add, remove, reorder or otherwise change) the second set of widgets in the second version of the first user interface with a fourth set of widgets different from the second set of widgets. In some embodiments, the second set of one or more user inputs includes a tap and hold input on the second set of widgets to invoke display of a widget picker to add additional widgets into the existing set of widgets on the second version of the first user interface and subsequent inputs to select widgets from the widget picker to add to the existing set of widgets in the second version of the first user interface, and/or a tap and hold input on one of the widgets in the second set of widgets to invoke display of deletion affordances for the respective widgets of the second set of widgets, followed by selection of the deletion affordances for one or more of the second set of widgets to delete the corresponding widgets from the second version of the first user interface. In some embodiments, in response to detecting the second set of one or more user inputs, the computer system updates the second version of the first user interface displayed in the second orientation to include the fourth set of widgets instead of the second set of widgets. For example, as described with reference to FIGS. 7M2-7Q, the user is enabled to configure the user interface 735, including adding widgets to widgets area 750 in an editing user interface while the device 100 is in a portrait orientation. Enabling a user to direct inputs to objects at different locations in an editing user interface while the device is in a portrait orientation (or a landscape orientation), including to arrange widgets that are periodically updated with content from associated active applications, enables the user to customize different widgets in the user interface while the device is in that orientation and causes the device to automatically display a set of selectable widgets that are periodically updated with content from associated active applications on the user interface while the device is in that orientation.
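A corresponding sketch, reusing the hypothetical `FirstUserInterfaceConfiguration` type from the earlier block, of how a widget-picker edit might be applied: the additions and deletions rewrite the widget set only for the orientation in which the editing user interface is displayed.

```swift
// Applies additions (e.g., selections made in the widget picker) and
// deletions (e.g., widgets whose deletion affordance was tapped) to the
// version of the user interface matching the current orientation only;
// the other orientation's configuration is untouched.
func applyWidgetEdit(adding added: Set<String>,
                     removing removed: Set<String>,
                     to configuration: inout FirstUserInterfaceConfiguration,
                     in orientation: InterfaceOrientation) {
    var widgets = configuration.widgets(for: orientation)
    widgets.formUnion(added)
    widgets.subtract(removed)
    configuration.setWidgets(widgets, for: orientation)
}
```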
In some embodiments, while displaying the first version of the first user interface that includes the first set of widgets in accordance with a determination that the first user interface is displayed in the first orientation, the computer system detects (1010) a first request to display the first user interface in the second orientation (e.g., including detecting a first movement of the display generation component that causes the display generation component to rotate from the first orientation to the second orientation, or other inputs that cause the first user interface to be displayed in the second orientation). In some embodiments, in response to detecting the request to display the first user interface in the second orientation, the computer system displays the first user interface in the second orientation, including displaying the second version of the first user interface that includes the second set of widgets. In some embodiments, while displaying the first user interface in the second orientation, the computer system detects a third set of one or more user inputs that corresponds to a request to rearrange a respective layout of the second set of widgets in the second version of the first user interface. In some embodiments, in response to detecting the third set of user inputs, the computer system updates the first user interface displayed in the second orientation, including updating the respective layout of the second set of widgets in the second version of the first user interface in accordance with the third set of one or more user inputs (e.g., moving and/or reordering the widgets in the second set of widgets while the second version of the first user interface is displayed). In some embodiments, after updating the first user interface displayed in the second orientation (e.g., while displaying the second version of the first user interface that includes the second set of widgets in the updated layout), the computer system detects a first request to display the first user interface in the first orientation (e.g., including detecting a second movement of the display generation component that causes the display generation component to rotate from the second orientation to the first orientation, or other inputs that cause the first user interface to be displayed in the first orientation); and in response to detecting the first request to display the first user interface in the first orientation, the computer system displays the first user interface in the first orientation, including displaying the first version of the first user interface that includes the first set of widgets (e.g., without the effect of the changes made in the second version of the first user interface). For example, in
In some embodiments, after updating the first user interface displayed in the second orientation and while displaying the first user interface in the first orientation, including displaying the first version of the first user interface that includes the first set of widgets, the computer system detects (1012) a second request to display the first user interface in the second orientation. In some embodiments, in response to detecting the second request to display the first user interface in the second orientation, the computer system displays the first user interface in the second orientation, including displaying the second version of the first user interface in which the respective layout of the second set of widgets has been updated in accordance with the third set of one or more user inputs. For example, as described with reference to
In some embodiments, while displaying the second version of the first user interface that includes the second set of widgets in accordance with a determination that the first user interface is displayed in the second orientation, the computer system detects (1014) a second request to display the first user interface in the first orientation (e.g., including detecting a third movement of the display generation component that causes the display generation component to rotate from the second orientation to the first orientation, or other inputs that cause the first user interface to be displayed in the first orientation). In some embodiments, in response to detecting the second request to display the first user interface in the first orientation, the computer system displays the first user interface in the first orientation, including displaying the first version of the first user interface that includes the first set of widgets. In some embodiments, while displaying the first user interface in the first orientation, the computer system detects a fourth set of one or more user inputs that corresponds to a request to rearrange a respective layout of the first set of widgets in the first version of the first user interface. In some embodiments, in response to detecting the fourth set of user inputs, the computer system updates the first user interface displayed in the first orientation, including updating the respective layout of the first set of widgets in the first version of the first user interface in accordance with the fourth set of one or more user inputs (e.g., moving and/or reordering the widgets in the first set of widgets while the first version of the first user interface is displayed). In some embodiments, after updating the first user interface displayed in the first orientation (e.g., while displaying the first version of the first user interface that includes the first set of widgets in the updated layout), the computer system detects a third request to display the first user interface in the second orientation (e.g., including detecting a fourth movement of the display generation component that causes the display generation component to rotate from the first orientation to the second orientation, or other inputs that cause the first user interface to be displayed in the second orientation). In some embodiments, in response to detecting the third request to display the first user interface in the second orientation, the computer system displays the first user interface in the second orientation, including displaying the second version of the first user interface that includes the second set of widgets (e.g., without the effect of the changes made in the first version of the first user interface). For example, as described with reference to
In some embodiments, after updating the first user interface displayed in the first orientation and while displaying the first user interface in the second orientation, including displaying the second version of the first user interface that includes the second set of widgets, the computer system detects (1016) a third request to display the first user interface in the first orientation; and in response to detecting the third request to display the first user interface in the first orientation, displays the first user interface in the first orientation, including displaying the first version of the first user interface in which the respective layout of the first set of widgets has been updated in accordance with the fourth set of one or more user inputs. For example, after rotating the device 100 into the landscape orientation in
In some embodiments, while displaying the first user interface in the first orientation, including displaying the first version of the first user interface, the computer system detects (1018) a movement of the display generation component; and in response to detecting the movement of the display generation component: in accordance with a determination that rotation criteria (e.g., the display generation component has not been rotated into the second orientation, and/or maintained in the second orientation for a threshold amount of time) are not met by the movement of the display generation component, the computer system maintains display of the first set of widgets in the first version of the first user interface in accordance with a first layout (e.g., with a first order, first size and/or a first spacing between the widgets in the first set of widgets) in the first version of the respective user interface; and in accordance with a determination that the rotation criteria are met by the movement of the display generation component, the computer system displays the first user interface in the second orientation, including displaying the second version of the first user interface that includes the second set of widgets in accordance with a second layout (e.g., with a second order, second size and/or a second spacing between the widgets in the second set of widgets that is different than the order, size and/or spacing of the first set of widgets) in the second version of the respective user interface. In some embodiments, the first set of widgets and the second set of widgets have at least one widget in common. In some embodiments, the widget that is in both the first set of widgets and the second set of widgets is not located at the same location (e.g., does not have the same spatial relationship to other widgets and/or other elements of the first user interface) in the first version and the second version of the first user interface. For example, in
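The rotation criteria above could be approximated as follows, reusing the hypothetical `InterfaceOrientation` type from the earlier sketch. The dwell-time threshold is an assumption for illustration, not a value from the disclosure.

```swift
import Foundation

// The candidate orientation must differ from the current one and be held for
// a minimum dwell time before the alternate version of the interface is shown.
struct RotationCriteria {
    var minimumDwell: TimeInterval = 0.3  // assumed threshold

    func areMet(current: InterfaceOrientation,
                candidate: InterfaceOrientation,
                heldFor duration: TimeInterval) -> Bool {
        candidate != current && duration >= minimumDwell
    }
}
```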
In some embodiments, displaying the first user interface in the first orientation includes (1020) displaying the first set of widgets in a first region (e.g., at a first position, or along a first side edge) (e.g., having a first layout) of a display area of the display generation component; and displaying the first user interface in the second orientation includes displaying the second set of widgets in a second region (e.g., at a second position, or within an interior region, such as below a time indication) (e.g., having a second layout) of the display area of the display generation component, the second region being different from the first region. In some embodiments, the second region and the first region differ in size, location, overall grid size, individual grid size, maximum numbers of widgets that can be accommodated, widget reflow behavior, widget position snapping behavior, and/or other appearance and behavior attributes. In some embodiments, the first region and/or the second region are defined relative to the edges or center of the display area of the display generation component. In some embodiments, the first region and/or the second region are defined relative to other elements of the first user interface (e.g., the time indication, or the date indication). For example, a time indication is displayed in both the first orientation and the second orientation, wherein the time indication is displayed in an upper middle portion of the display in the respective orientation. In some embodiments, the first region is below the time indication and the second region is to the left and/or right of the time indication (e.g., along an edge of the display generation component). For example, as described with reference to
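As a rough geometric sketch of the two regions (all insets and sizes below are assumptions, and the function reuses the hypothetical `InterfaceOrientation` type from above), the portrait region can be derived from the time indication's frame while the landscape region hugs a side edge:

```swift
import CoreGraphics

// Illustrative-only region computation: in one orientation the widget region
// sits below the time indication; in the other it runs along a side edge.
func widgetRegion(for orientation: InterfaceOrientation,
                  displayBounds: CGRect,
                  timeIndicationFrame: CGRect) -> CGRect {
    switch orientation {
    case .portrait:
        // A band directly below the time indication.
        return CGRect(x: displayBounds.minX + 16,
                      y: timeIndicationFrame.maxY + 8,
                      width: displayBounds.width - 32,
                      height: 120)
    case .landscape:
        // A column along the leading edge, beside the time indication.
        return CGRect(x: displayBounds.minX + 16,
                      y: displayBounds.minY + 48,
                      width: 160,
                      height: displayBounds.height - 96)
    }
}
```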
In some embodiments, while displaying an editing user interface for editing the first user interface, the computer system detects (1022) a fifth set of user inputs that corresponds to a request to add one or more new widgets to the first user interface displayed in a respective orientation (e.g., the first orientation or the second orientation). In some embodiments, in response to detecting the fifth set of user inputs that corresponds to a request to add the one or more new widgets to the first user interface displayed in the respective orientation, the computer system adds the one or more new widgets to the first user interface displayed in the respective orientation, including: in accordance with a determination that the respective orientation is the first orientation, displaying the one or more new widgets adjacent to another widget in the first set of widgets in the first version of the first user interface. In some embodiments, adding the one or more new widgets to the first user interface displayed in the respective orientation includes, in accordance with a determination that the respective orientation is the second orientation, displaying the one or more new widgets adjacent to another widget in the second set of widgets in the second version of the first user interface. For example, as described with reference to
In some embodiments, while displaying the first user interface (e.g., in an editing user interface for editing the first user interface, or displaying the first user interface in an editing mode) (e.g., in a first orientation or a second orientation) in a respective orientation of the first orientation and the second orientation, the computer system detects (1024) a sixth set of user inputs that corresponds to a request to drag and drop a first widget of a respective set of widgets of the first user interface from a first location to a second location in a respective version (e.g., the first version or the second version) of the first user interface displayed in the respective orientation (e.g., the first orientation or the second orientation). In some embodiments, in response to detecting the sixth set of user inputs, the computer system moves the first widget of the respective set of widgets of the first user interface to a new location in the respective version (e.g., the first version or the second version) of the first user interface displayed in the respective orientation (e.g., the first orientation or the second orientation), including: in accordance with a determination that the sixth set of user inputs includes a first amount of movement, displaying the first widget at a first location; and in accordance with a determination that the sixth set of user inputs includes a second amount of movement, displaying the first widget at a second location, wherein the first location and the second location are unoccupied by other widgets of the respective set of widgets (e.g., the space between the first location and the second location is able to accommodate one or more widgets but is not currently occupied by any widget during the movement of the first widget). For example, as described with reference to
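One way to realize this movement-dependent drop behavior is to translate the drag distance into a grid-slot offset and then fall back to the nearest unoccupied slot. This is a sketch under an assumed one-dimensional grid model, not the disclosed implementation.

```swift
import CoreGraphics

// Maps a drag distance to a target grid slot; if that slot is occupied by
// another widget, the nearest open slot is chosen instead.
func dropSlot(forDragDistance distance: CGFloat,
              slotWidth: CGFloat,
              startSlot: Int,
              occupied: Set<Int>,
              slotCount: Int) -> Int {
    let offset = Int((distance / slotWidth).rounded())
    let target = max(0, min(slotCount - 1, startSlot + offset))
    if !occupied.contains(target) { return target }
    for delta in 1..<slotCount {
        for candidate in [target - delta, target + delta]
        where (0..<slotCount).contains(candidate) && !occupied.contains(candidate) {
            return candidate
        }
    }
    return startSlot  // grid is full apart from the starting slot
}
```

A larger amount of movement yields a larger slot offset, so the widget lands in a farther open location, consistent with the first-amount/second-amount distinction described above.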
In some embodiments, in response to detecting the fifth set of user inputs that corresponds to a request to add the one or more new widgets to the first user interface displayed in the respective orientation, the computer system automatically selects (1026) respective positions of the one or more new widgets in the first user interface in accordance with a language setting of the computer system. In some embodiments, an added widget is automatically placed in the first open space in which the widget fits, starting from the top left (or bottom left), for languages that read text from left to right, whereas an added widget is automatically placed in the first open space in which the widget fits, starting from the top right (or bottom right), for languages that read text from right to left. In some embodiments, another placement rule for automatic arrangement of widgets is provided. For example, as described with reference to
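The left-to-right versus right-to-left placement rule can be sketched as a scan over a row-major grid whose column order depends on the locale's character direction; the grid model is an assumption.

```swift
// Returns the first open slot, scanning rows top to bottom and scanning each
// row in the locale's reading direction: left to right for LTR languages,
// right to left for RTL languages.
func firstOpenSlot(occupied: Set<Int>,
                   columns: Int,
                   rows: Int,
                   rightToLeft: Bool) -> Int? {
    for row in 0..<rows {
        let columnOrder = rightToLeft
            ? Array((0..<columns).reversed())
            : Array(0..<columns)
        for column in columnOrder {
            let slot = row * columns + column
            if !occupied.contains(slot) { return slot }
        }
    }
    return nil  // every slot is taken
}
```

In practice, the direction flag could plausibly be derived from the system locale, e.g., via `Locale.characterDirection(forLanguage:)`.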
In some embodiments, the first user interface includes (1028) a set of user interface elements (e.g., time indication, widgets, and other user interface objects of the first user interface) displayed overlaid on a wallpaper that includes one or more wallpaper foreground elements (e.g., the wallpaper foreground elements include one or more planets, or one or more images or graphics overlaid on a wallpaper background). In some embodiments, in accordance with a determination that the first user interface is displayed in the first orientation, the computer system displays the first version of the first user interface (that, optionally, includes the first set of widgets among the set of user interface elements), including displaying the one or more wallpaper foreground elements as at least partially occluding the set of user interface elements, irrespective of whether the first version of the first user interface currently includes widgets. For example, in some embodiments, the wallpaper foreground elements include a planet that partially covers up a time indication that is displayed in a wake screen user interface when the wake screen user interface is displayed in the landscape orientation. In some embodiments, the one or more wallpaper foreground elements are displayed in front of portions of one or more user interface elements to provide a sense of depth (e.g., depth of field) of the respective user interface. In some embodiments, the one or more wallpaper foreground elements are displayed as at least partially occluding the set of user interface elements regardless of whether there are widgets displayed in the first user interface in the first orientation. In some embodiments, in accordance with a determination that the respective user interface is displayed in the second orientation, the computer system displays the second version of the first user interface, wherein displaying the second version of the first user interface includes: in accordance with a determination that the second version of the first user interface currently includes widgets (e.g., the second set of widgets, or another set of widgets), displaying the one or more wallpaper foreground elements without occluding the set of user interface elements in the second version of the first user interface. In some embodiments, in accordance with a determination that the second version of the first user interface does not currently include widgets, the computer system displays the one or more wallpaper foreground elements at least partially occluding the set of user interface elements in the second version of the first user interface. For example, in some embodiments, the wallpaper foreground elements include a planet that does not occlude the time indication that is displayed in the wake screen user interface in the portrait orientation if there are widgets in the wake screen user interface; and the planet does occlude the time indication if there is no widget in the wake screen user interface. In some embodiments, the widgets in the wake screen user interface are displayed as overlaid on the one or more wallpaper foreground elements in the wake screen user interface displayed in the portrait orientation. For example, as described with reference to
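The occlusion rule can be summarized in a small predicate, assuming (as in the example above) that the first orientation is landscape and the second is portrait, and reusing the hypothetical `InterfaceOrientation` type from the earlier sketch:

```swift
// Whether a wallpaper foreground element (e.g., the planet) may draw in front
// of user interface elements such as the time indication.
func foregroundOccludesUIElements(orientation: InterfaceOrientation,
                                  hasWidgets: Bool) -> Bool {
    switch orientation {
    case .landscape: return true        // occludes irrespective of widgets
    case .portrait:  return !hasWidgets // occludes only when no widgets shown
    }
}
```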
In some embodiments, the computer system displays (1030) an editing user interface for a respective version of the first user interface, wherein the editing user interface includes: a listing of applications, and for a respective application in the list of applications, one or more widgets associated with the respective application. In some embodiments, the computer system detects user selection of a third widget from the one or more widgets associated with the respective application; and in response to detecting the user selection of the third widget, the computer system adds the third widget to the respective set of widgets in the respective version of the first user interface. For example, as described with reference to
In some embodiments, the first set of widgets includes (1032) at least one widget that is not included in the second set of widgets (e.g., the first set of widgets includes a fourth widget and a fifth widget, and the second set of widgets includes the fourth widget and does not include the fifth widget). For example, in
In some embodiments, the second set of widgets includes (1034) at least one widget that is not included in the first set of widgets (e.g., the second set of widgets includes a sixth widget and a seventh widget and the first set of widgets includes the sixth widget and does not include the seventh widget). For example, in
It should be understood that the particular order in which the operations in
The operations described above with reference to
In some embodiments, device 100 detects a user input or other trigger corresponding to a request to wake the device 100 from the low power state. In some embodiments, the user input comprises a raising of the device (e.g., a change in position, orientation, and/or other movement of the device) in a manner that satisfies wake criteria (e.g., the device is raised at a threshold angle and/or speed that satisfies the wake criteria). In some embodiments, the user input corresponding to a request to wake the device 100 from the low power state is a tap input or other type of input that is detected on a touch screen of device 100. For example, user input 1104 is a tap input on the touch screen of device 100 while device 100 is in the low power state. In some embodiments, the user input corresponding to a request to wake the device 100 from the low power state is an activation of one or more buttons of the device, such as user input 1106 on the power button or another button of the device.
In some embodiments, the trigger is a trigger other than a user input, such as the device receiving a notification and/or alert. For example, device 100 wakes from the low power state in response to receiving a message communication, a phone communication, a notification from an application, and/or a system alert.
In some embodiments, the low power state corresponds to a dimmed, always-on state, as illustrated in
In some embodiments, user interface 1108 illustrated in
In some embodiments, the transition between the low power state to the normal operating state includes displaying the wallpaper background of the wake screen user interface 1110. In some embodiments, the wallpaper background of the wake screen user interface 1110 corresponds to a media item (e.g., a photo and/or video). In some embodiments, the media item is captured by the device 100 (e.g., using one or more cameras of device 100) (e.g., prior to the media item being set as the wallpaper background). For example, the user of the device 100 previously captured the media item and set the wallpaper background of the wake screen user interface 1110 after capturing the media item. In some embodiments, the media item is captured by another device, different from device 100, and is sent to or otherwise shared with device 100 such that device 100 is enabled to save and/or set the media item as the wallpaper background of the wake screen user interface 1110 of device 100. In some embodiments, the media item includes multiple frames. For example, the media item is a photo that includes a plurality of frames that are captured at or near a time when the shutter button is pressed to take the photo, sometimes referred to herein as a lively photo.
In some embodiments, the device 100 interpolates frames between two or more frames that are captured in the media item. For example, the media item includes a first frame and a second frame that are captured while the media item is captured, and device 100 adds a third frame between the first frame and the second frame based at least in part on the image content of the first frame and the second frame (e.g., such that the third frame includes image content that shows an intermediary amount of movement between the first frame and the second frame). As such, the third frame is interpolated and/or inserted between the first frame and the second frame such that playback of the media item (e.g., to generate an animation of the media item) includes playing back the first frame, then the third frame, then the second frame. In some embodiments, device 100 adds additional interpolated frames to playback of the media item in order to generate a slow motion effect.
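As a minimal sketch of inserting an interpolated frame, a simple cross-dissolve between two captured frames is shown below. A production implementation would more plausibly use motion-compensated (optical-flow) interpolation; this only illustrates the idea of synthesizing an intermediate frame from the image content of its neighbors.

```swift
import CoreImage

// Synthesizes an intermediate frame between two captured frames by blending
// them; progress 0.5 approximates the midpoint described above.
func interpolatedFrame(between first: CIImage,
                       and second: CIImage,
                       progress: CGFloat) -> CIImage? {
    guard let filter = CIFilter(name: "CIDissolveTransition") else { return nil }
    filter.setValue(first, forKey: kCIInputImageKey)
    filter.setValue(second, forKey: kCIInputTargetImageKey)
    filter.setValue(progress, forKey: kCIInputTimeKey)  // 0.0 ... 1.0
    return filter.outputImage
}

// Playing back [first, interpolated(0.5), second] then shows an intermediary
// amount of movement between the two captured frames.
```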
In some embodiments, displaying the wallpaper background of the wake screen includes displaying the multiple frames of the media item. For example, the wallpaper background is updated to display an animation and/or other movement that is created by displaying the multiple frames of the media item in succession. For example, as illustrated in
In some embodiments, as the device 100 wakes in response to the user input or other trigger, the wallpaper background is gradually updated from a first level of visual deemphasis to a second level of visual deemphasis. For example, the wallpaper background is initially displayed with a first level of blurring (e.g., or fading or other type of visual deemphasis) and, over time, as the plurality of frames progress, the wallpaper background is displayed with a second level of blurring, optionally less than the first level (e.g., optionally without any blurring). As such, the wallpaper background is updated to be displayed with less visual deemphasis over time as the device 100 wakes to its normal operating state illustrated in
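A sketch of this de-emphasis ramp (the starting blur radius is an assumption): as wake progress goes from 0 to 1, the blur applied to each wallpaper frame decreases toward zero.

```swift
import CoreImage

// Applies a blur whose radius shrinks as the wake transition progresses,
// so the wallpaper is shown with less visual deemphasis over time.
func deemphasizedFrame(_ frame: CIImage, wakeProgress: CGFloat) -> CIImage {
    let clamped = max(0.0, min(1.0, wakeProgress))
    let radius = 20.0 * (1.0 - Double(clamped))  // assumed starting radius
    return frame.applyingGaussianBlur(sigma: radius)
}
```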
In some embodiments, as the device 100 wakes in response to the user input or other trigger, the successive frames of the multiple frames of the media item used as the wallpaper background are updated at a plurality of different frame rates, optionally by inserting more and more interpolated frames between subsequent pairs of captured frames of the respective user-captured media. For example, the device 100 plays through the multiple frames of the media item such that playback of the media item is displayed as increasingly slowed slow motion over time. For example, in
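The increasingly slowed playback can be sketched as a schedule that inserts progressively more interpolated frames between successive pairs of captured frames; the linear growth schedule below is an assumption for illustration.

```swift
import CoreGraphics

// Builds a playback schedule over captured frame pairs: pair 0 gets one
// inserted frame, pair 1 gets two, and so on, so the content appears to slow
// while the display keeps presenting frames at a steady rate.
func playbackSchedule(capturedFrameCount: Int) -> [(pair: Int, progress: CGFloat)] {
    var schedule: [(pair: Int, progress: CGFloat)] = []
    guard capturedFrameCount > 1 else { return schedule }
    for pair in 0..<(capturedFrameCount - 1) {
        let inserted = pair + 1  // assumed growth: one more insert per pair
        for step in 0...inserted {
            schedule.append((pair, CGFloat(step) / CGFloat(inserted + 1)))
        }
    }
    schedule.append((capturedFrameCount - 2, 1.0))  // end on the final captured frame
    return schedule  // each entry names a source pair and a blend progress
}
```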
In some embodiments, as illustrated in
In some embodiments, as described with reference to FIGS. 4C1-4C2, the user is enabled to navigate from the wake screen to a home screen, an application user interface, and/or another user interface using one or more gestures. For example, in response to a swipe gesture in a first direction (e.g., an upward swipe, a downward swipe, or a side swipe), or another type of user input, the device 100 replaces display of the wake screen user interface with a home screen user interface.
In some embodiments, in response to a user input 1112 (e.g., a press and hold or other user input) corresponding to a request to edit the wake screen user interface, the device 100 displays editing user interface 1114 (e.g., editing user interface 531 illustrated in
In some embodiments, a cropping of the media item (e.g., as displayed as the wallpaper background upon wake and/or as displayed in the editing user interface 1114) is also changed based on whether the “Media Playback on Wake” setting is toggled on or off. For example, as illustrated in
In some embodiments, in accordance with a determination that the “Media Playback on Wake” setting is toggled “off,” one or more settings for editing the wake screen user interface become disabled such that the user is not enabled to modify the one or more settings. For example, while the “Media Playback on Wake” setting is toggled “on”, the user is enabled to change a color filter that is applied to the wallpaper background and/or is enabled to turn a depth effect of the media item on and/or off. In some embodiments, while the “Media Playback on Wake” setting is toggled to “off,” the settings to change the color filter and/or to toggle the depth effect on and/or off are no longer displayed and/or editable by the user.
In some embodiments, while the “Media Playback on Wake” setting is toggled to “off,” the device 100 detects a user input selecting the media library icon 1120, and in response to the user input, displays a plurality of selectable media items that the user is enabled to select to be used as the wallpaper background. In some embodiments, as explained above, device 100 displays a set of recommended media items, selected from the media library, that are determined to be good candidates to be used as the wallpaper background. In some embodiments, the device 100 detects a user input selecting a recommended media item from the set of recommended media items (or selecting another media item that is not displayed in the set of recommended media items but is still considered a good candidate to be used as the wallpaper background), and in response to the user input and in accordance with a determination that the user has selected a media item that is considered a good candidate to be used as the wallpaper background, the device 100 displays a prompt (e.g., a text notification or other prompt) indicating that the “Media Playback on Wake” setting is currently toggled to “off” and/or otherwise prompting and/or recommending that the user toggle the “Media Playback on Wake” setting to “on” for the selected media item.
In some embodiments, while the “Media Playback on Wake” setting is toggled “off,” the device 100 detects a user input selecting a media item from the media library that is not considered a good candidate to be used as the wallpaper background (e.g., the selected media item is not displayed in the set of recommended media items), and in response to the user input and in accordance with a determination that the user has selected a media item that is not considered a good candidate to be used as the wallpaper background, the device 100 disables button 1122 such that the user is not enabled to turn the “Media Playback on Wake” setting to “on” for the selected media item.
In some embodiments, in accordance with a determination that the selected media item is selected from a library (e.g., a collection and/or a set) of media items that include multiple frames (e.g., each of the media items in the library includes at least two or more frames, or at least another minimum number of frames, such as at least 4 frames, 10 frames, or another number of frames), the “Media Playback on Wake” setting is automatically, without user input, set to “on” (e.g., optionally without regard to what the setting was toggled to before selecting the media item) for the selected media item. For example, if the media item is selected from a library of video items and/or from a library of lively photos (e.g., and/or is selected from the set of recommended media items), the “Media Playback on Wake” setting is automatically, without user input, set to “on”. In some embodiments, in accordance with a determination that the selected media item is selected from a library (e.g., a collection and/or a set) of media items that do not include multiple frames (e.g., each of the media items in the library includes fewer than the minimum number of frames), for example, the media item is selected from a library of still images (e.g., static and/or non-lively photos), the “Media Playback on Wake” setting is automatically, without user input, set to “off” (e.g., and, optionally, the “Media Playback on Wake” setting is disabled for the still image such that the user is not enabled to toggle the setting to “on”).
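The automatic toggle can be summarized as follows, with the two-frame minimum as an assumed stand-in for the "minimum number of frames" described above.

```swift
// "Media Playback on Wake" is set on automatically when the selected item
// comes from a library in which every media item has multiple frames, and
// off when the source library contains items without multiple frames.
func automaticPlaybackOnWake(libraryItemFrameCounts: [Int],
                             minimumFrames: Int = 2) -> Bool {
    !libraryItemFrameCounts.isEmpty
        && libraryItemFrameCounts.allSatisfy { $0 >= minimumFrames }
}
```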
In some embodiments, the device 100 detects user input 1116 directed to selection box 712 for the time indication, and in response to detecting the user input 1116, the device 100 displays an editing platter 1124 for editing the time indication, as illustrated in
In some embodiments, the option to change the weight of the text optionally includes a slider 1126-1, or other user interface element, that enables the user to change a weight (e.g., thickness and/or boldness) of the text through continuously variable values. For example, the user perceives the weight of the text to change continuously based on an amount of adjustment via the slider. As used herein, a “continuously variable” set of font weights includes a substantially continuous set of font weights or a plurality of discrete font weights where a difference between two adjacent discrete font weights is not visually perceptible to a user of the device, which makes the set of discrete font weights appear, perceptually, to be continuous during adjustment. For example, in
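One way to achieve a perceptually continuous weight adjustment is a variable font whose 'wght' axis is driven directly by the slider position. The font name and the 100-900 axis range below are assumptions rather than details from the disclosure; 0x77676874 is the four-byte tag for 'wght'.

```swift
import UIKit
import CoreText

// Maps a normalized slider value (0...1) onto a variable font's weight axis,
// yielding many intermediate weights rather than a few discrete steps.
func clockFont(ofSize size: CGFloat, sliderValue: CGFloat) -> UIFont {
    let weight = 100 + 800 * max(0, min(1, sliderValue))  // 100 ... 900
    let variationKey = UIFontDescriptor.AttributeName(rawValue: kCTFontVariationAttribute as String)
    let descriptor = UIFontDescriptor(fontAttributes: [
        .name: "SF Pro",                    // assumed variable font
        variationKey: [0x77676874: weight]  // 'wght' axis -> weight value
    ])
    return UIFont(descriptor: descriptor, size: size)
}
```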
In some embodiments, in response to user input 1128 for adjusting the position on slider 1126-1, the weight of the text is updated in accordance with the user input (e.g., in accordance with the new position of slider 1126-2) without changing a size of the text, as illustrated in
Method 1200 enables the device to automatically animate a wallpaper background upon waking the device from a low power state while increasing a luminance of the display, which indicates to the user when the device is transitioning out of a low power mode, and enables the user to customize the wallpaper background to include an animation without requiring the user to define complicated animations and/or modify the media manually, thereby providing improved feedback about a state of the device and reducing a number of user inputs required to display an animation in the wallpaper background.
While the computer system is in a low power state (e.g., a power-saving state, a state that the computer system transitions into after prolonged inactivity or in response to a user input turning off the display to conserve battery charge) in which respective user-captured media with multiple frames (e.g., a video or a photo with auxiliary frames that were captured at or near a time when a shutter button was pressed, sometimes referred to as a lively photo) is selected as a background (e.g., a wallpaper) for a wake screen (e.g., a wake screen user interface, a lock screen user interface, an always-on user interface that has a dimmed state in the low power state and a lit state when transitioned out of the low power state, and/or another system user interface that corresponds to a restricted state of the computer system), the computer system detects (1202) an event that corresponds to a trigger to wake the computer system to a higher power state (e.g., movement of the computer system, raising the computer system from a resting position, and/or impact and/or contact by a user on the computer system or touch-screen of the computer system). For example, as described with reference to
In response to detecting the event, the computer system displays (1204), via the one or more display generation components, a first wake screen that includes device status information (e.g., indicators for WiFi signal strength, cellular signal strength, location tracking status, battery level, mobile carrier, the locked/unlocked state of the computer system, the current date and time, and/or other device status information, that were not displayed in the low power mode prior to detecting the event and displaying the first wake screen) and the background, wherein displaying the background for the wake screen includes playing through a plurality of frames of the respective user-captured media. For example, as described with reference to
In some embodiments, the event that corresponds to the trigger to wake the computer system is (1206) a change in orientation of the computer system (e.g., detected via one or more sensors of the computer system that optionally matches a movement or orientation profile for the computer system having been lifted or raised from a resting position and/or orientation). For example, as described with reference to
In some embodiments, the event that corresponds to the trigger to wake the computer system is (1208) a tap detected on the computer system (e.g., detected on a touch-sensitive surface such as a touch-sensitive display or detected on a housing of the computer system using one or more motion or vibration sensors). For example, as described with reference to
In some embodiments, the event that corresponds to the trigger to wake the computer system is (1210) a button press on a button of the computer system (e.g., a mechanical button or a solid-state button that is activated based on an intensity of an input meeting an intensity threshold and optionally provides haptic feedback when activation has occurred indicating that activation has occurred). For example, as described with reference to
In some embodiments, the low power state is (1212) a display off state (e.g., display is dark, turned off, and/or does not display any content); and displaying the first wake screen includes transitioning (e.g., gradually increasing a brightness of the display and/or displaying an animation for waking the display) from the display off state to displaying the first wake screen. For example, as described with reference to
In some embodiments, the low power state is (1214) a state in which a low power user interface is displayed (e.g., a user interface that is dimmer, has fewer user interface elements and/or which is updated less frequently than the wake screen). In some embodiments, displaying the first wake screen includes transitioning from displaying the low power user interface to displaying the first wake screen. For example, as described with reference to
In some embodiments, displaying the first wake screen includes (1216) reducing a degree of blurring of the respective user-captured media (e.g., while playing through a plurality of frames of the respective user-captured media, before playing through the plurality of frames of the respective user-captured media, and/or after playing through the plurality of frames of the respective user-captured media). For example, as described with reference to
In some embodiments, playing through the plurality of frames of the respective user-captured media includes (1218) playing through one or more interpolated frames, including one or more frames of the respective user-captured media that are automatically generated based on a prior frame that is played before a respective interpolated frame of the one or more interpolated frames and a subsequent frame that is played after the respective interpolated frame. In some embodiments there are multiple interpolated frames between two captured frames of the respective user-captured media. For example, as described with reference to
In some embodiments, playing through a plurality of frames of the respective user-captured media includes (1220) gradually (e.g., over a period of time) increasing an effective frame rate of the respective user-captured media as it is played back to create an effect (e.g., a visual effect and/or animation) of increasingly slowed slow motion over time as the respective user-captured media is played (e.g., optionally by inserting more and more interpolated frames between subsequent pairs of captured frames of the respective user-captured media). For example, upon waking the device from the low power state, the user-captured media (e.g., a lively photo or video) is initially displayed as progressing through the captured frames at a first rate, and the rate of progression through the captured frames decreases to a second, slower rate as the wake animation continues (even as additional interpolated frames are rendered between captured frames). For example, as described with reference to
In some embodiments, the device status information on the wake screen includes (1222) one or more of a current date, a current time, weather information, one or more widgets that include content corresponding to one or more applications on the computer system (e.g., wherein a respective widget includes status information for a respective application that is updated over time, in accordance with a current status of the respective application), and one or more notifications. In some embodiments, the wake screen includes function controls for initiating a corresponding function (e.g., a flashlight control for initiating a flashlight function of the device and/or a camera control for initiating a camera function of the device that can be used to capture respective user-captured media such as a video or lively photo). For example, as described with reference to
In some embodiments, while displaying the first wake screen, the computer system detects (1224), via the one or more input devices, an input that corresponds to a request to dismiss the first wake screen (e.g., a swipe gesture or button press that corresponds to a request to display a home screen user interface). In some embodiments, in response to detecting the input that corresponds to the request to dismiss the first wake screen, the computer system: ceases to display the first wake screen; and displays a home screen user interface that includes one or more representations of one or more applications that, when selected, cause the computer system to launch the corresponding application of the one or more applications (and, optionally, one or more widgets that include information for corresponding applications). For example, as described with reference to FIGS. 4C1-4C2 and
In some embodiments, prior to displaying the first wake screen, the computer system detects (1226), via the one or more input devices, an input that corresponds to a request to capture media (e.g., a press of a hardware or virtual shutter button while a camera user interface is displayed). In some embodiments, in response to detecting the input that corresponds to the request to capture media, the computer system captures user-captured media using one or more cameras that are in communication with the computer system, wherein the user-captured media includes the respective user-captured media that is used for the wake screen. In some embodiments, the user-captured media is captured using a camera application of the computer system, wherein the camera application includes options for capturing lively photos and/or videos using the one or more cameras that are in communication with the computer system. In some embodiments, the one or more cameras are internal to the computer system. In some embodiments, the captured media is not captured with the camera of the device. For example, the captured media is optionally captured by a different device and is shared and/or sent to the computer system. For example, as described with reference to
In some embodiments, the computer system detects (1228), via the one or more input devices, in a wake screen editing user interface (e.g., editing user interface 1114 in
In some embodiments, the input that corresponds to the request to use the respective user-captured media as a background for a wake screen of the computer system is detected (1230) while displaying the wake screen editing user interface, wherein the wake screen editing user interface includes a plurality of options for editing a wake screen (e.g., adding or removing widgets, selecting a background, selecting a visual effect for a background, selecting a style for text displayed on the wake screen such as a color, font, line weight, and/or other selectable editing options for configuring a wake screen). In some embodiments, displaying the wake screen editing user interface includes displaying one or more suggestions for media to use as backgrounds, wherein displaying the one or more suggestions includes: in accordance with a determination that the respective user-captured media is determined to be a good candidate to be used as an animated background for a wake screen (e.g., based on a quality of the motion in the respective user-captured media item, based on a number of frames in the user-captured media item, and/or based on the presence of information such as metadata or frame information that enables interpolation between frames of the respective user-captured media item to generate a slow motion effect for the respective user-captured media item), including a representation of the respective user-captured media in the one or more suggestions (e.g., along with one or more other suggested media items, which optionally include other user-captured media items); and in accordance with a determination that the respective user-captured media is not determined to be a good candidate to be used as an animated background for a wake screen, forgoing including a representation of the respective user-captured media in the one or more suggestions (e.g., and suggesting one or more other user-captured media items instead of the respective user-captured media item). In some embodiments, the user is enabled to request the respective user-captured media to be used as a background for the wake screen even when the representation of the respective user-captured media is not included in the one or more suggestions. For example, the user is enabled to select from a photo and/or video library that includes additional user-captured media items that are not included in the one or more suggestions. For example, as described with reference to
In some embodiments, the wake screen editing user interface includes (1232) one or more controls (e.g., a selection box, a toggle control, or another type of control) for selecting whether or not the background of the wake screen animates when the wake screen is displayed. In some embodiments, while displaying the wake screen editing user interface, the computer system detects an input directed to the one or more controls for selecting whether or not the background of the wake screen animates. In some embodiments, in response to detecting the input directed to the one or more controls for selecting whether or not the background of the wake screen animates: in accordance with a determination that the input meets first criteria (e.g., an input directed to an enable option or an input directed to a toggle that is currently set to disable animation of the background of the wake screen), the computer system enables animation of the background of the wake screen (e.g., wherein the animation of the background of the wake screen includes playing through a plurality of frames of the respective user-captured media) and in accordance with a determination that the input meets second criteria different from the first criteria (e.g., an input directed to a disable option or an input directed to a toggle that is currently set to enable animation of the background of the wake screen), the computer system disables animation of the background of the wake screen. In some embodiments, when the wake screen of the computer system is displayed (e.g., as described in greater detail with reference to
In some embodiments, displaying the wake screen editing user interface includes (1234): in accordance with a determination that animation of the background of the wake screen is enabled, displaying a first set of editing options for editing the wake screen; and in accordance with a determination that animation of the background of the wake screen is disabled, displaying a second set of editing options for editing the wake screen, wherein the second set of editing options is different from the first set of editing options. In some embodiments, the second set of editing options includes an option for changing and/or selecting a color filter and an option for turning a depth effect on and/or off. In some embodiments, the second set of editing options includes the first set of editing options and one or more additional editing options that are only available if the animation is disabled. For example, as described with reference to
In some embodiments, displaying (1236) the wake screen editing user interface includes, in accordance with a determination that a currently selected media item for a background of the wake screen is a good candidate for animation and animation of the background of the wake screen is disabled, displaying a prompt to enable animation for the currently selected media item. In some embodiments, in accordance with a determination that a currently selected media item for a background of the wake screen is a good candidate for animation and animation of the background of the wake screen is enabled, the computer system forgoes displaying the prompt to enable animation for the currently selected media item. In some embodiments, in accordance with a determination that a currently selected media item for a background of the wake screen is not a good candidate for animation and animation of the background of the wake screen is disabled, the computer system forgoes displaying the prompt to enable animation for the currently selected media item. For example, as described with reference to
In some embodiments, displaying the wake screen editing user interface includes (1238), in accordance with a determination that a currently selected media item for a background of the wake screen is not a good candidate for animation and animation of the background of the wake screen is disabled, preventing enablement of animation of the background of the wake screen (e.g., forgoing display of the control for enabling animation of the background of the wake screen or displaying the control for enabling animation of the background with a visual effect indicating that the control is inactive). In some embodiments, in accordance with a determination that a currently selected media item for a background of the wake screen is a good candidate for animation and animation of the background of the wake screen is disabled, the computer system permits enablement of animation of the background of the wake screen (e.g., displays the control for enabling animation of the background of the wake screen or displaying the control for enabling animation of the background with a visual effect indicating that the control is active). For example, as described with reference to
In some embodiments, displaying the wake screen editing user interface includes (1240): in accordance with a determination that a currently selected media item for the background was selected from a collection of media items that include multiple frames, enabling animation of the background of the first wake screen (e.g., by default, automatically, and/or otherwise without additional user input); and in accordance with a determination that a currently selected media item for the background was selected from a collection of media items that include one or more media items without multiple frames, disabling animation of the background of the first wake screen (e.g., by default, automatically, and/or otherwise without additional user input). For example, as described with reference to
In some embodiments, displaying the wake screen editing user interface includes (1242), in accordance with a determination that animation of the background of the wake screen is enabled, repeatedly playing (e.g., looping, cross-fading, and/or cycling through in other manners) through multiple frames of a currently selected media item for the background of the wake screen. In some embodiments, in accordance with a determination that animation of the background of the wake screen is disabled, the computer system displays a still image corresponding to the currently selected media item for the background of the wake screen (e.g., forgoes playing through multiple frames of the currently selected media item for the background of the wake screen). For example, as described with reference to
In some embodiments, displaying the wake screen editing user interface includes (1244): in accordance with a determination that animation of the background of the wake screen is enabled, displaying a currently selected media item for the background with a first cropping (e.g., or first zoom). In some embodiments, when animation of the background of the wake screen is enabled, a crop is selected that centers a focal point of movement (e.g., movement detected by the computer system using image processing) in the plurality of frames of the currently selected media item. In some embodiments, in accordance with a determination that animation of the background of the wake screen is disabled, the computer system displays the currently selected media item for the background with a second cropping that is different from the first cropping (e.g., or second zoom that is different from the first zoom). In some embodiments, when animation of the background of the wake screen is disabled, a crop is selected that centers a focal point of the still image corresponding to the currently selected media item. For example, as described with reference to
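The animation-dependent crop can be sketched as choosing between two focal points and clamping the crop to the media bounds; the focal points are assumed to be supplied by upstream image and motion analysis.

```swift
import CoreGraphics

// Centers the crop on the motion focal point when animation is enabled, and
// on the still image's focal point when it is disabled.
func cropRect(mediaBounds: CGRect,
              cropSize: CGSize,
              motionFocalPoint: CGPoint,
              stillFocalPoint: CGPoint,
              animationEnabled: Bool) -> CGRect {
    let center = animationEnabled ? motionFocalPoint : stillFocalPoint
    var origin = CGPoint(x: center.x - cropSize.width / 2,
                         y: center.y - cropSize.height / 2)
    // Clamp so the crop stays within the media bounds.
    origin.x = max(mediaBounds.minX, min(origin.x, mediaBounds.maxX - cropSize.width))
    origin.y = max(mediaBounds.minY, min(origin.y, mediaBounds.maxY - cropSize.height))
    return CGRect(origin: origin, size: cropSize)
}
```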
In some embodiments, while displaying the wake screen editing user interface, the computer system detects (1246) an input corresponding to a request to change whether animation of the background of the wake screen is enabled; and in response to detecting the input corresponding to the request to change whether the animation of the background of the wake screen is enabled (e.g., from enabled to disabled, or from disabled to enabled), the computer system changes whether the animation of the background of the wake screen is enabled; and changes a cropping (or zoom and/or center) of a currently selected media item for the background of the wake screen (e.g., that is displayed in the wake screen editing user interface). For example, as described with reference to
In some embodiments, displaying the wake screen editing user interface includes (1248) displaying a control (e.g., a slider control, a selection menu, or another type of control) for adjusting a font weight of text (e.g., text that is an indication of date and/or time or other computer system status information) in the wake screen editing user interface. In some embodiments, while displaying the wake screen editing user interface, the computer system detects an input (e.g., a swipe and/or drag input, a tap input at a different position on the slider and/or another type of input) directed to the control for adjusting the font weight of text in the wake screen editing user interface. In some embodiments, in response to detecting the input directed to the control for adjusting the font weight of text in the wake screen editing user interface, the computer system adjusts a font weight of the text (e.g., increasing or decreasing the font weight of the text). For example, as described with reference to
In some embodiments, the control for adjusting a font weight of text in the wake screen editing user interface includes (1250) a slider control. In some embodiments, adjusting the font weight of the text includes adjusting the font weight of the text through a perceptually continuous range of font weights (e.g., continuous set of font weights, a substantially continuous set of font weights, or a plurality of discrete font weights where a difference between two adjacent discrete font weights is not visually perceptible to a user of the device, which makes the set of discrete font weights appear, perceptually, to be continuous during adjustment) using the slider control. For example, as described with reference to
It should be understood that the particular order in which the operations in
The operations described above with reference to
In addition, in methods described herein where one or more steps are contingent upon one or more conditions having been met, it should be understood that the described method can be repeated in multiple repetitions so that over the course of the repetitions all of the conditions upon which steps in the method are contingent have been met in different repetitions of the method. For example, if a method requires performing a first step if a condition is satisfied, and a second step if the condition is not satisfied, then a person of ordinary skill would appreciate that the claimed steps are repeated until the condition has been both satisfied and not satisfied, in no particular order. Thus, a method described with one or more steps that are contingent upon one or more conditions having been met could be rewritten as a method that is repeated until each of the conditions described in the method has been met. This, however, is not required of system or computer readable medium claims where the system or computer readable medium contains instructions for performing the contingent operations based on the satisfaction of the corresponding one or more conditions and thus is capable of determining whether the contingency has or has not been satisfied without explicitly repeating steps of a method until all of the conditions upon which steps in the method are contingent have been met. A person having ordinary skill in the art would also understand that, similar to a method with contingent steps, a system or computer readable storage medium can repeat the steps of a method as many times as are needed to ensure that all of the contingent steps have been performed.
The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best use the invention and various described embodiments with various modifications as are suited to the particular use contemplated.
This application claims priority to U.S. Provisional Patent Application No. 63/470,964, filed Jun. 4, 2023 and U.S. Provisional Patent Application No. 63/465,209, filed May 9, 2023, each of which is hereby incorporated by reference in its entirety.
Number | Date | Country
--- | --- | ---
63470964 | Jun 2023 | US
63465209 | May 2023 | US