The present disclosure relates generally to computer user interfaces, and more specifically to techniques for sharing application visual output.
Broadcasting and replaying of video games and other live application output is a growing form of entertainment. Numerous websites support ecosystems in which video game players post previously recorded video game output or broadcast the output of a video game live. The popularity of these websites has only increased with the rise of e-sports leagues that hold competitions and tournaments for various applications.
Some techniques for sharing application visual output using electronic devices, however, are generally cumbersome and inefficient. For example, some existing techniques use a complex and time-consuming user interface, which may include multiple key presses or keystrokes. Existing techniques require more time than necessary, wasting user time and device energy. This latter consideration is particularly important in battery-operated devices.
Accordingly, the present technique provides electronic devices with faster, more efficient methods and interfaces for sharing application visual output. Such methods and interfaces optionally complement or replace other methods for sharing application visual output. Such methods and interfaces reduce the cognitive burden on a user and produce a more efficient human-machine interface.
In accordance with an embodiment, at a first electronic device with one or more processors, a communication interface, and memory, and that is in communication with a display, a first input is received corresponding to an affordance to start a task in an application executing on the one or more processors. In response to receiving the first input, the task is started. While the task is ongoing, visual output of the application is recorded as application task data. After the task has ceased, an affordance for sharing the application task data with a second electronic device that is associated with the first electronic device is caused to be displayed. While the affordance for sharing is displayed on the display, a second input is received that corresponds to selection of the affordance for sharing the application task data. In response to receiving the second input, the application task data is transmitted to the second electronic device over the communication interface.
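By way of illustration only, the record-then-share flow described above can be sketched against Apple's public ReplayKit framework, which provides analogous screen-recording functionality. The TaskRecorder type and its method names are hypothetical; only RPScreenRecorder and RPPreviewViewController are real ReplayKit APIs, and the system-supplied preview UI plays the role of the sharing affordance here.

```swift
import ReplayKit

final class TaskRecorder {
    private let recorder = RPScreenRecorder.shared()

    // Called in response to the first input (the affordance that starts the task).
    func startTask(completion: @escaping (Error?) -> Void) {
        // While the task is ongoing, the application's visual output is
        // recorded by the system as application task data.
        recorder.startRecording(handler: completion)
    }

    // Called after the task has ceased; the returned preview controller
    // exposes sharing options for the recorded task data.
    func endTask(completion: @escaping (RPPreviewViewController?, Error?) -> Void) {
        recorder.stopRecording(handler: completion)
    }
}
```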
In accordance with an embodiment, at a first electronic device with one or more processors, a communication interface, and memory, and that is in communication with a display, a first affordance in an application is caused to be displayed on the display. The first affordance is for broadcasting visual output of a task of the application. In response to receiving a first user input corresponding to selection of the first affordance and in accordance with a determination that a plurality of broadcast applications on the electronic device are capable of broadcasting visual output of the application while the task is ongoing: (1) a second affordance is caused to be displayed on the display, the second affordance being for selecting a broadcast application of the plurality of broadcast applications capable of broadcasting the visual output of the application, and (2) while the second affordance is displayed on the display, second user input is received corresponding to selection of the second affordance. After receiving the second user input, the task is started and the visual output of the application is sent to the broadcast application for transmitting the visual output over the communication interface to a remote server.
An embodiment of a transitory computer readable storage medium stores one or more programs, the one or more programs comprising instructions, which when executed by one or more processors of an electronic device with a display, a communication interface, and one or more input devices, cause the device to: cause to display on the display a first affordance in an application, wherein the first affordance is for broadcasting visual output of a task of the application; in response to receiving a first user input corresponding to selection of the first affordance: in accordance with a determination that a plurality of broadcast applications on the electronic device are capable of broadcasting visual output of the application while the task is ongoing: cause to display on the display a second affordance for selecting a broadcast application of the plurality of broadcast applications capable of broadcasting the visual output of the application; and while the second affordance is displayed on the display, receive second user input corresponding to selection of the second affordance; and after receiving the second user input, start the task and send the visual output of the application to the broadcast application for transmitting the visual output over the communication interface to a remote server.
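The broadcast variant can be sketched the same way, again purely as an illustrative analogue: ReplayKit's RPBroadcastActivityViewController presents a picker over the installed broadcast applications (playing the role of the second affordance), and the resulting RPBroadcastController transmits the visual output to a remote server. The presenting view controller and the delegate wiring shown here are assumptions, not the claimed implementation.

```swift
import ReplayKit
import UIKit

final class BroadcastStarter: NSObject, RPBroadcastActivityViewControllerDelegate {
    weak var host: UIViewController?

    // Called in response to selection of the first affordance.
    func pickBroadcastApplication() {
        RPBroadcastActivityViewController.load { [weak self] picker, error in
            guard let picker = picker, error == nil else { return }
            picker.delegate = self
            DispatchQueue.main.async { self?.host?.present(picker, animated: true) }
        }
    }

    // Called after the second user input selects a broadcast application.
    func broadcastActivityViewController(
        _ broadcastActivityViewController: RPBroadcastActivityViewController,
        didFinishWith broadcastController: RPBroadcastController?,
        error: Error?
    ) {
        DispatchQueue.main.async {
            broadcastActivityViewController.dismiss(animated: true)
        }
        // Starting the broadcast sends the visual output to the selected
        // broadcast application, which transmits it to a remote server.
        broadcastController?.startBroadcast { error in
            if error == nil {
                // Start the task here.
            }
        }
    }
}
```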
An embodiment of a transitory computer readable storage medium stores one or more programs, the one or more programs comprising instructions, which when executed by one or more processors of an electronic device with a display, a communication interface, and one or more input devices, cause the device to: receive a first input corresponding to an affordance to start a task in an application executing on the one or more processors; in response to receiving the first input, start the task; while the task is ongoing, record visual output of the application as application task data; after the task has ceased, cause to be displayed on the display an affordance for sharing the application task data with a second electronic device that is associated with the electronic device; while the affordance for sharing is displayed on the display, receive a second input that corresponds to selection of the affordance for sharing the application task data; and in response to receiving the second input, transmit the application task data to the second electronic device over the communication interface.
Executable instructions for performing these functions are, optionally, included in a non-transitory computer-readable storage medium or other computer program product configured for execution by one or more processors. Executable instructions for performing these functions are, optionally, included in a transitory computer-readable storage medium or other computer program product configured for execution by one or more processors.
Thus, devices are provided with faster, more efficient methods and interfaces for sharing application visual output, thereby increasing the effectiveness, efficiency, and user satisfaction with such devices. Such methods and interfaces may complement or replace other methods for sharing application visual output.
For a better understanding of the various described embodiments, reference should be made to the Description of Embodiments below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.
The following description sets forth exemplary methods, parameters, and the like. It should be recognized, however, that such description is not intended as a limitation on the scope of the present disclosure but is instead provided as a description of exemplary embodiments.
There is a need for electronic devices that provide efficient methods and interfaces for sharing application output. For example, video game broadcasts and replays are an increasingly popular form of entertainment. Such techniques can reduce the cognitive burden on a user who shares application visual output, thereby enhancing productivity. Further, such techniques can reduce processor and battery power otherwise wasted on redundant user inputs.
Below, exemplary devices for performing the techniques for sharing application visual output are described.
Although the following description uses terms “first,” “second,” etc. to describe various elements, these elements should not be limited by the terms. These terms are only used to distinguish one element from another. For example, a first touch could be termed a second touch, and, similarly, a second touch could be termed a first touch, without departing from the scope of the various described embodiments. The first touch and the second touch are both touches, but they are not the same touch.
The terminology used in the description of the various described embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “includes,” “including,” “comprises,” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The term “if” is, optionally, construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” is, optionally, construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context.
Embodiments of electronic devices, user interfaces for such devices, and associated processes for using such devices are described. In some embodiments, the device is a portable communications device, such as a mobile telephone, that also contains other functions, such as PDA and/or music player functions. Exemplary embodiments of portable multifunction devices include, without limitation, the iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, Calif. Other portable electronic devices, such as laptops or tablet computers with touch-sensitive surfaces (e.g., touch screen displays and/or touchpads), are, optionally, used. It should also be understood that, in some embodiments, the device is not a portable communications device, but is a desktop computer with a touch-sensitive surface (e.g., a touch screen display and/or a touchpad).
In the discussion that follows, an electronic device that includes a display and a touch-sensitive surface is described. It should be understood, however, that the electronic device optionally includes one or more other physical user-interface devices, such as a physical keyboard, a mouse, and/or a joystick.
The device typically supports a variety of applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disk authoring application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a workout support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
The various applications that are executed on the device optionally use at least one common physical user-interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface as well as corresponding information displayed on the device are, optionally, adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture (such as the touch-sensitive surface) of the device optionally supports the variety of applications with user interfaces that are intuitive and transparent to the user.
Attention is now directed toward embodiments of portable devices with touch-sensitive displays.
As used in the specification and claims, the term “intensity” of a contact on a touch-sensitive surface refers to the force or pressure (force per unit area) of a contact (e.g., a finger contact) on the touch-sensitive surface, or to a substitute (proxy) for the force or pressure of a contact on the touch-sensitive surface. The intensity of a contact has a range of values that includes at least four distinct values and more typically includes hundreds of distinct values (e.g., at least 256). Intensity of a contact is, optionally, determined (or measured) using various approaches and various sensors or combinations of sensors. For example, one or more force sensors underneath or adjacent to the touch-sensitive surface are, optionally, used to measure force at various points on the touch-sensitive surface. In some implementations, force measurements from multiple force sensors are combined (e.g., a weighted average) to determine an estimated force of a contact. Similarly, a pressure-sensitive tip of a stylus is, optionally, used to determine a pressure of the stylus on the touch-sensitive surface. Alternatively, the size of the contact area detected on the touch-sensitive surface and/or changes thereto, the capacitance of the touch-sensitive surface proximate to the contact and/or changes thereto, and/or the resistance of the touch-sensitive surface proximate to the contact and/or changes thereto are, optionally, used as a substitute for the force or pressure of the contact on the touch-sensitive surface. In some implementations, the substitute measurements for contact force or pressure are used directly to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is described in units corresponding to the substitute measurements). In some implementations, the substitute measurements for contact force or pressure are converted to an estimated force or pressure, and the estimated force or pressure is used to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is a pressure threshold measured in units of pressure). Using the intensity of a contact as an attribute of a user input allows for user access to additional device functionality that may otherwise not be accessible by the user on a reduced-size device with limited real estate for displaying affordances (e.g., on a touch-sensitive display) and/or receiving user input (e.g., via a touch-sensitive display, a touch-sensitive surface, or a physical/mechanical control such as a knob or a button).
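As a concrete illustration of the weighted-average substitute measurement described above, the following sketch combines several hypothetical force-sensor readings into a single estimated intensity and compares it against a threshold expressed in the same units; all names and values are assumptions.

```swift
struct ForceSample {
    let force: Double   // reading from one sensor, in arbitrary units
    let weight: Double  // contribution based on proximity to the contact
}

// Weighted average of the individual sensor readings, yielding an
// estimated force (intensity) for the contact.
func estimatedIntensity(of samples: [ForceSample]) -> Double {
    let totalWeight = samples.reduce(0) { $0 + $1.weight }
    guard totalWeight > 0 else { return 0 }
    return samples.reduce(0) { $0 + $1.force * $1.weight } / totalWeight
}

let samples = [ForceSample(force: 0.8, weight: 0.7),
               ForceSample(force: 0.5, weight: 0.3)]
let deepPressThreshold = 0.6
let isDeepPress = estimatedIntensity(of: samples) > deepPressThreshold
```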
As used in the specification and claims, the term “tactile output” refers to physical displacement of a device relative to a previous position of the device, physical displacement of a component (e.g., a touch-sensitive surface) of a device relative to another component (e.g., housing) of the device, or displacement of the component relative to a center of mass of the device that will be detected by a user with the user's sense of touch. For example, in situations where the device or the component of the device is in contact with a surface of a user that is sensitive to touch (e.g., a finger, palm, or other part of a user's hand), the tactile output generated by the physical displacement will be interpreted by the user as a tactile sensation corresponding to a perceived change in physical characteristics of the device or the component of the device. For example, movement of a touch-sensitive surface (e.g., a touch-sensitive display or trackpad) is, optionally, interpreted by the user as a “down click” or “up click” of a physical actuator button. In some cases, a user will feel a tactile sensation such as a “down click” or “up click” even when there is no movement of a physical actuator button associated with the touch-sensitive surface that is physically pressed (e.g., displaced) by the user's movements. As another example, movement of the touch-sensitive surface is, optionally, interpreted or sensed by the user as “roughness” of the touch-sensitive surface, even when there is no change in smoothness of the touch-sensitive surface. While such interpretations of touch by a user will be subject to the individualized sensory perceptions of the user, there are many sensory perceptions of touch that are common to a large majority of users. Thus, when a tactile output is described as corresponding to a particular sensory perception of a user (e.g., an “up click,” a “down click,” “roughness”), unless otherwise stated, the generated tactile output corresponds to physical displacement of the device or a component thereof that will generate the described sensory perception for a typical (or average) user.
It should be appreciated that device 100 is only one example of a portable multifunction device, and that device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of the components. The various components shown are implemented in hardware, software, or a combination of both hardware and software, including one or more signal processing and/or application-specific integrated circuits.
Memory 102 optionally includes high-speed random access memory and optionally also includes non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Memory controller 122 optionally controls access to memory 102 by other components of device 100.
Peripherals interface 118 can be used to couple input and output peripherals of the device to CPU 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for device 100 and to process data. In some embodiments, peripherals interface 118, CPU 120, and memory controller 122 are, optionally, implemented on a single chip, such as chip 104. In some other embodiments, they are, optionally, implemented on separate chips.
RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals. RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. RF circuitry 108 optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. RF circuitry 108 optionally communicates with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (WLAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. The RF circuitry 108 optionally includes well-known circuitry for detecting near field communication (NFC) fields, such as by a short-range communication radio. The wireless communication optionally uses any of a plurality of communications standards, protocols, and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), Evolution, Data-Only (EV-DO), HSPA, HSPA+, Dual-Cell HSPA (DC-HSDPA), long term evolution (LTE), near field communication (NFC), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Bluetooth Low Energy (BTLE), Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, IEEE 802.11n, and/or IEEE 802.11ac), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for e-mail (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between a user and device 100. Audio circuitry 110 receives audio data from peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to speaker 111. Speaker 111 converts the electrical signal to human-audible sound waves. Audio circuitry 110 also receives electrical signals converted by microphone 113 from sound waves. Audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to peripherals interface 118 for processing. Audio data is, optionally, retrieved from and/or transmitted to memory 102 and/or RF circuitry 108 by peripherals interface 118. In some embodiments, audio circuitry 110 also includes a headset jack (e.g., 212). The headset jack provides an interface between audio circuitry 110 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (e.g., a headphone for one or both ears) and input (e.g., a microphone).
I/O subsystem 106 couples input/output peripherals on device 100, such as touch screen 112 and other input control devices 116, to peripherals interface 118. I/O subsystem 106 optionally includes display controller 156, optical sensor controller 158, intensity sensor controller 159, haptic feedback controller 161, and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/send electrical signals from/to other input control devices 116. The other input control devices 116 optionally include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) 160 are, optionally, coupled to any (or none) of the following: a keyboard, an infrared port, a USB port, and a pointer device such as a mouse. The one or more buttons (e.g., 208) optionally include an up/down button for volume control of speaker 111 and/or microphone 113. The one or more buttons optionally include a push button (e.g., 206).
A quick press of the push button optionally disengages a lock of touch screen 112 or optionally begins a process that uses gestures on the touch screen to unlock the device, as described in U.S. patent application Ser. No. 11/322,549, “Unlocking a Device by Performing Gestures on an Unlock Image,” filed Dec. 23, 2005, U.S. Pat. No. 7,657,849, which is hereby incorporated by reference in its entirety. A longer press of the push button (e.g., 206) optionally turns power to device 100 on or off. The functionality of one or more of the buttons is, optionally, user-customizable. Touch screen 112 is used to implement virtual or soft buttons and one or more soft keyboards.
Touch-sensitive display 112 provides an input interface and an output interface between the device and a user. Display controller 156 receives and/or sends electrical signals from/to touch screen 112. Touch screen 112 displays visual output to the user. The visual output optionally includes graphics, text, icons, video, and any combination thereof (collectively termed “graphics”). In some embodiments, some or all of the visual output optionally corresponds to user-interface objects.
Touch screen 112 has a touch-sensitive surface, sensor, or set of sensors that accepts input from the user based on haptic and/or tactile contact. Touch screen 112 and display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on touch screen 112 and convert the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages, or images) that are displayed on touch screen 112. In an exemplary embodiment, a point of contact between touch screen 112 and the user corresponds to a finger of the user.
Touch screen 112 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments. Touch screen 112 and display controller 156 optionally detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch screen 112. In an exemplary embodiment, projected mutual capacitance sensing technology is used, such as that found in the iPhone® and iPod Touch® from Apple Inc. of Cupertino, Calif.
A touch-sensitive display in some embodiments of touch screen 112 is, optionally, analogous to the multi-touch sensitive touchpads described in the following U.S. Pat. No. 6,323,846 (Westerman et al.), U.S. Pat. No. 6,570,557 (Westerman et al.), and/or U.S. Pat. No. 6,677,932 (Westerman), and/or U.S. Patent Publication 2002/0015024A1, each of which is hereby incorporated by reference in its entirety. However, touch screen 112 displays visual output from device 100, whereas touch-sensitive touchpads do not provide visual output.
A touch-sensitive display in some embodiments of touch screen 112 is described in the following applications: (1) U.S. patent application Ser. No. 11/381,313, “Multipoint Touch Surface Controller,” filed May 2, 2006; (2) U.S. patent application Ser. No. 10/840,862, “Multipoint Touchscreen,” filed May 6, 2004; (3) U.S. patent application Ser. No. 10/903,964, “Gestures For Touch Sensitive Input Devices,” filed Jul. 30, 2004; (4) U.S. patent application Ser. No. 11/048,264, “Gestures For Touch Sensitive Input Devices,” filed Jan. 31, 2005; (5) U.S. patent application Ser. No. 11/038,590, “Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices,” filed Jan. 18, 2005; (6) U.S. patent application Ser. No. 11/228,758, “Virtual Input Device Placement On A Touch Screen User Interface,” filed Sep. 16, 2005; (7) U.S. patent application Ser. No. 11/228,700, “Operation Of A Computer With A Touch Screen Interface,” filed Sep. 16, 2005; (8) U.S. patent application Ser. No. 11/228,737, “Activating Virtual Keys Of A Touch-Screen Virtual Keyboard,” filed Sep. 16, 2005; and (9) U.S. patent application Ser. No. 11/367,749, “Multi-Functional Hand-Held Device,” filed Mar. 3, 2006. All of these applications are incorporated by reference herein in their entirety.
Touch screen 112 optionally has a video resolution in excess of 100 dpi. In some embodiments, the touch screen has a video resolution of approximately 160 dpi. The user optionally makes contact with touch screen 112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which can be less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user.
In some embodiments, in addition to the touch screen, device 100 optionally includes a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad is, optionally, a touch-sensitive surface that is separate from touch screen 112 or an extension of the touch-sensitive surface formed by the touch screen.
Device 100 also includes power system 162 for powering the various components. Power system 162 optionally includes a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.
Device 100 optionally also includes one or more optical sensors 164.
Device 100 optionally also includes one or more contact intensity sensors 165.
Device 100 optionally also includes one or more proximity sensors 166.
Device 100 optionally also includes one or more tactile output generators 167.
Device 100 optionally also includes one or more accelerometers 168.
In some embodiments, the software components stored in memory 102 include operating system 126, communication module (or set of instructions) 128, contact/motion module (or set of instructions) 130, graphics module (or set of instructions) 132, text input module (or set of instructions) 134, Global Positioning System (GPS) module (or set of instructions) 135, and applications (or sets of instructions) 136. Furthermore, in some embodiments, memory 102 stores device/global internal state 157. Device/global internal state 157 includes one or more of: active application state, indicating which applications, if any, are currently active; display state, indicating what applications, views, or other information occupy various regions of touch-sensitive display 112; sensor state, including information obtained from the device's various sensors and input control devices 116; and location information concerning the device's location and/or attitude.
Operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, iOS, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
Communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by RF circuitry 108 and/or external port 124. External port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with, the 30-pin connector used on iPod® (trademark of Apple Inc.) devices.
Contact/motion module 130 optionally detects contact with touch screen 112 (in conjunction with display controller 156) and other touch-sensitive devices (e.g., a touchpad or physical click wheel). Contact/motion module 130 includes various software components for performing various operations related to detection of contact, such as determining if contact has occurred (e.g., detecting a finger-down event), determining an intensity of the contact (e.g., the force or pressure of the contact or a substitute for the force or pressure of the contact), determining if there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining if the contact has ceased (e.g., detecting a finger-up event or a break in contact). Contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact, which is represented by a series of contact data, optionally includes determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations are, optionally, applied to single contacts (e.g., one finger contacts) or to multiple simultaneous contacts (e.g., “multitouch”/multiple finger contacts). In some embodiments, contact/motion module 130 and display controller 156 detect contact on a touchpad.
In some embodiments, contact/motion module 130 uses a set of one or more intensity thresholds to determine whether an operation has been performed by a user (e.g., to determine whether a user has “clicked” on an icon). In some embodiments, at least a subset of the intensity thresholds are determined in accordance with software parameters (e.g., the intensity thresholds are not determined by the activation thresholds of particular physical actuators and can be adjusted without changing the physical hardware of device 100). For example, a mouse “click” threshold of a trackpad or touch screen display can be set to any of a large range of predefined threshold values without changing the trackpad or touch screen display hardware. Additionally, in some implementations, a user of the device is provided with software settings for adjusting one or more of the set of intensity thresholds (e.g., by adjusting individual intensity thresholds and/or by adjusting a plurality of intensity thresholds at once with a system-level click “intensity” parameter).
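A minimal sketch of the point above, with hypothetical names and values: because the thresholds are ordinary software parameters rather than properties of a physical actuator, they can be tuned individually or scaled together by a system-level setting.

```swift
struct IntensityThresholds {
    var click: Double = 0.3      // the mouse-"click" threshold
    var deepPress: Double = 0.6

    // Adjust every threshold at once with a system-level "intensity" parameter.
    mutating func apply(systemIntensity scale: Double) {
        click *= scale
        deepPress *= scale
    }
}

var thresholds = IntensityThresholds()
thresholds.apply(systemIntensity: 1.25)  // user prefers firmer presses
```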
Contact/motion module 130 optionally detects a gesture input by a user. Different gestures on the touch-sensitive surface have different contact patterns (e.g., different motions, timings, and/or intensities of detected contacts). Thus, a gesture is, optionally, detected by detecting a particular contact pattern. For example, detecting a finger tap gesture includes detecting a finger-down event followed by detecting a finger-up (liftoff) event at the same position (or substantially the same position) as the finger-down event (e.g., at the position of an icon). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event followed by detecting one or more finger-dragging events, and subsequently followed by detecting a finger-up (liftoff) event.
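The contact-pattern idea can be illustrated with a short sketch using a hypothetical event model: a tap is a finger-down followed by a finger-up at substantially the same position, while a swipe includes one or more intervening drag events. The 10-point movement tolerance is an assumption.

```swift
import CoreGraphics

enum TouchEvent {
    case fingerDown(CGPoint), fingerDrag(CGPoint), fingerUp(CGPoint)
}

enum Gesture { case tap, swipe }

func recognizeGesture(in events: [TouchEvent], tolerance: CGFloat = 10) -> Gesture? {
    guard case let .fingerDown(start)? = events.first,
          case let .fingerUp(end)? = events.last else { return nil }
    let dx = end.x - start.x, dy = end.y - start.y
    let distance = (dx * dx + dy * dy).squareRoot()
    // Liftoff at substantially the same position as the touch-down: a tap.
    if distance <= tolerance { return .tap }
    // Finger-down, one or more drag events, then finger-up: a swipe.
    let dragged = events.contains { event -> Bool in
        if case .fingerDrag = event { return true }
        return false
    }
    return dragged ? .swipe : nil
}
```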
Graphics module 132 includes various known software components for rendering and displaying graphics on touch screen 112 or other display, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast, or other visual property) of graphics that are displayed. As used herein, the term “graphics” includes any object that can be displayed to a user, including, without limitation, text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations, and the like.
In some embodiments, graphics module 132 stores data representing graphics to be used. Each graphic is, optionally, assigned a corresponding code. Graphics module 132 receives, from applications etc., one or more codes specifying graphics to be displayed along with, if necessary, coordinate data and other graphic property data, and then generates screen image data to output to display controller 156.
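A minimal sketch of this code-based scheme, with entirely hypothetical types: graphics are registered under integer codes, and the module resolves incoming codes plus coordinate and property data into drawing commands for the display controller.

```swift
import CoreGraphics

struct GraphicRequest {
    let code: Int        // identifies which stored graphic to draw
    let origin: CGPoint  // coordinate data supplied by the application
    let alpha: CGFloat   // other graphic property data
}

final class GraphicsStore {
    private var renderers: [Int: (CGPoint, CGFloat) -> Void] = [:]

    func register(code: Int, renderer: @escaping (CGPoint, CGFloat) -> Void) {
        renderers[code] = renderer
    }

    // Generates screen output for the display controller from a list of codes.
    func render(_ requests: [GraphicRequest]) {
        for request in requests {
            renderers[request.code]?(request.origin, request.alpha)
        }
    }
}
```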
Haptic feedback module 133 includes various software components for generating instructions used by tactile output generator(s) 167 to produce tactile outputs at one or more locations on device 100 in response to user interactions with device 100.
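On current Apple platforms, the public UIFeedbackGenerator API plays an analogous role; a brief illustration follows (the trigger site is an assumption).

```swift
import UIKit

let generator = UIImpactFeedbackGenerator(style: .medium)
generator.prepare()          // warms up the tactile output hardware
generator.impactOccurred()   // produces a tactile output for a user interaction
```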
Text input module 134, which is, optionally, a component of graphics module 132, provides soft keyboards for entering text in various applications (e.g., contacts 137, e-mail 140, IM 141, browser 147, and any other application that needs text input).
GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to telephone 138 for use in location-based dialing; to camera 143 as picture/video metadata; and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets).
Applications 136 optionally include the following modules (or sets of instructions), or a subset or superset thereof: contacts module 137 (sometimes called an address book or contact list); telephone module 138; video conference module 139; e-mail client module 140; instant messaging (IM) module 141; workout support module 142; camera module 143 for still and/or video images; image management module 144; browser module 147; calendar module 148; widget modules 149, which optionally include one or more of: weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, dictionary widget 149-5, and user-created widget 149-6; widget creator module 150 for making user-created widgets; search module 151; video and music player module 152; notes module 153; map module 154; and online video module 155.
Examples of other applications 136 that are, optionally, stored in memory 102 include other word processing applications, other image editing applications, drawing applications, presentation applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, contacts module 137 is, optionally, used to manage an address book or contact list (e.g., stored in application internal state 192 of contacts module 137 in memory 102 or memory 370), including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), e-mail address(es), physical address(es) or other information with a name; associating an image with a name; categorizing and sorting names; providing telephone numbers or e-mail addresses to initiate and/or facilitate communications by telephone 138, video conference module 139, e-mail 140, or IM 141; and so forth.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, telephone module 138 is, optionally, used to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in contacts module 137, modify a telephone number that has been entered, dial a respective telephone number, conduct a conversation, and disconnect or hang up when the conversation is completed. As noted above, the wireless communication optionally uses any of a plurality of communications standards, protocols, and technologies.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, optical sensor 164, optical sensor controller 158, contact/motion module 130, graphics module 132, text input module 134, contacts module 137, and telephone module 138, video conference module 139 includes executable instructions to initiate, conduct, and terminate a video conference between a user and one or more other participants in accordance with user instructions.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, e-mail client module 140 includes executable instructions to create, send, receive, and manage e-mail in response to user instructions. In conjunction with image management module 144, e-mail client module 140 makes it very easy to create and send e-mails with still or video images taken with camera module 143.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, the instant messaging module 141 includes executable instructions to enter a sequence of characters corresponding to an instant message, to modify previously entered characters, to transmit a respective instant message (for example, using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages or using XMPP, SIMPLE, or IMPS for Internet-based instant messages), to receive instant messages, and to view received instant messages. In some embodiments, transmitted and/or received instant messages optionally include graphics, photos, audio files, video files and/or other attachments as are supported in an MMS and/or an Enhanced Messaging Service (EMS). As used herein, “instant messaging” refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, or IMPS).
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, GPS module 135, map module 154, and music player module, workout support module 142 includes executable instructions to create workouts (e.g., with time, distance, and/or calorie burning goals); communicate with workout sensors (sports devices); receive workout sensor data; calibrate sensors used to monitor a workout; select and play music for a workout; and display, store, and transmit workout data.
In conjunction with touch screen 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact/motion module 130, graphics module 132, and image management module 144, camera module 143 includes executable instructions to capture still images or video (including a video stream) and store them into memory 102, modify characteristics of a still image or video, or delete a still image or video from memory 102.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and camera module 143, image management module 144 includes executable instructions to arrange, modify (e.g., edit), or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, browser module 147 includes executable instructions to browse the Internet in accordance with user instructions, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, e-mail client module 140, and browser module 147, calendar module 148 includes executable instructions to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to-do lists, etc.) in accordance with user instructions.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and browser module 147, widget modules 149 are mini-applications that are, optionally, downloaded and used by a user (e.g., weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, and dictionary widget 149-5) or created by the user (e.g., user-created widget 149-6). In some embodiments, a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! Widgets).
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and browser module 147, the widget creator module 150 is, optionally, used by a user to create widgets (e.g., turning a user-specified portion of a web page into a widget).
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, search module 151 includes executable instructions to search for text, music, sound, image, video, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms) in accordance with user instructions.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, and browser module 147, video and music player module 152 includes executable instructions that allow the user to download and play back recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files, and executable instructions to display, present, or otherwise play back videos (e.g., on touch screen 112 or on an external, connected display via external port 124). In some embodiments, device 100 optionally includes the functionality of an MP3 player, such as an iPod (trademark of Apple Inc.).
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, notes module 153 includes executable instructions to create and manage notes, to-do lists, and the like in accordance with user instructions.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, map module 154 is, optionally, used to receive, display, modify, and store maps and data associated with maps (e.g., driving directions, data on stores and other points of interest at or near a particular location, and other location-based data) in accordance with user instructions.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, text input module 134, e-mail client module 140, and browser module 147, online video module 155 includes instructions that allow the user to access, browse, receive (e.g., by streaming and/or download), play back (e.g., on the touch screen or on an external, connected display via external port 124), send an e-mail with a link to a particular online video, and otherwise manage online videos in one or more file formats, such as H.264. In some embodiments, instant messaging module 141, rather than e-mail client module 140, is used to send a link to a particular online video. Additional description of the online video application can be found in U.S. Provisional Patent Application No. 60/936,562, “Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos,” filed Jun. 20, 2007, and U.S. patent application Ser. No. 11/968,067, “Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos,” filed Dec. 31, 2007, the contents of which are hereby incorporated by reference in their entirety.
Each of the above-identified modules and applications corresponds to a set of executable instructions for performing one or more functions described above and the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (e.g., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules are, optionally, combined or otherwise rearranged in various embodiments. For example, video player module is, optionally, combined with music player module into a single module (e.g., video and music player module 152). In some embodiments, memory 102 optionally stores a subset of the modules and data structures identified above. Furthermore, memory 102 optionally stores additional modules and data structures not described above.
In some embodiments, device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through a touch screen and/or a touchpad. By using a touch screen and/or a touchpad as the primary input control device for operation of device 100, the number of physical input control devices (such as push buttons, dials, and the like) on device 100 is, optionally, reduced.
The predefined set of functions that are performed exclusively through a touch screen and/or a touchpad optionally include navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates device 100 to a main, home, or root menu from any user interface that is displayed on device 100. In such embodiments, a “menu button” is implemented using a touchpad. In some other embodiments, the menu button is a physical push button or other physical input control device instead of a touchpad.
Event sorter 170 receives event information and determines the application 136-1 and application view 191 of application 136-1 to which to deliver the event information. Event sorter 170 includes event monitor 171 and event dispatcher module 174. In some embodiments, application 136-1 includes application internal state 192, which indicates the current application view(s) displayed on touch-sensitive display 112 when the application is active or executing. In some embodiments, device/global internal state 157 is used by event sorter 170 to determine which application(s) is (are) currently active, and application internal state 192 is used by event sorter 170 to determine application views 191 to which to deliver event information.
In some embodiments, application internal state 192 includes additional information, such as one or more of: resume information to be used when application 136-1 resumes execution, user interface state information that indicates information being displayed or that is ready for display by application 136-1, a state queue for enabling the user to go back to a prior state or view of application 136-1, and a redo/undo queue of previous actions taken by the user.
Event monitor 171 receives event information from peripherals interface 118. Event information includes information about a sub-event (e.g., a user touch on touch-sensitive display 112, as part of a multi-touch gesture). Peripherals interface 118 transmits information it receives from I/O subsystem 106 or a sensor, such as proximity sensor 166, accelerometer(s) 168, and/or microphone 113 (through audio circuitry 110). Information that peripherals interface 118 receives from I/O subsystem 106 includes information from touch-sensitive display 112 or a touch-sensitive surface.
In some embodiments, event monitor 171 sends requests to the peripherals interface 118 at predetermined intervals. In response, peripherals interface 118 transmits event information. In other embodiments, peripherals interface 118 transmits event information only when there is a significant event (e.g., receiving an input above a predetermined noise threshold and/or for more than a predetermined duration).
In some embodiments, event sorter 170 also includes a hit view determination module 172 and/or an active event recognizer determination module 173.
Hit view determination module 172 provides software procedures for determining where a sub-event has taken place within one or more views when touch-sensitive display 112 displays more than one view. Views are made up of controls and other elements that a user can see on the display.
Another aspect of the user interface associated with an application is a set of views, sometimes herein called application views or user interface windows, in which information is displayed and touch-based gestures occur. The application views (of a respective application) in which a touch is detected optionally correspond to programmatic levels within a programmatic or view hierarchy of the application. For example, the lowest level view in which a touch is detected is, optionally, called the hit view, and the set of events that are recognized as proper inputs are, optionally, determined based, at least in part, on the hit view of the initial touch that begins a touch-based gesture.
Hit view determination module 172 receives information related to sub-events of a touch-based gesture. When an application has multiple views organized in a hierarchy, hit view determination module 172 identifies a hit view as the lowest view in the hierarchy which should handle the sub-event. In most circumstances, the hit view is the lowest level view in which an initiating sub-event occurs (e.g., the first sub-event in the sequence of sub-events that form an event or potential event). Once the hit view is identified by the hit view determination module 172, the hit view typically receives all sub-events related to the same touch or input source for which it was identified as the hit view.
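Hit-view determination can be sketched as a recursive search for the lowest view in the hierarchy containing the point of the initiating sub-event. The View type is hypothetical, and, for simplicity, all frames are assumed to share one coordinate space.

```swift
import CoreGraphics

final class View {
    let frame: CGRect            // in the same coordinate space as the point
    var subviews: [View] = []
    init(frame: CGRect) { self.frame = frame }

    // Returns the deepest view containing the point, or nil if the point
    // falls outside this view entirely.
    func hitView(for point: CGPoint) -> View? {
        guard frame.contains(point) else { return nil }
        // Prefer the deepest subview that contains the point; fall back to self.
        for subview in subviews.reversed() {
            if let hit = subview.hitView(for: point) { return hit }
        }
        return self
    }
}
```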
Active event recognizer determination module 173 determines which view or views within a view hierarchy should receive a particular sequence of sub-events. In some embodiments, active event recognizer determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, active event recognizer determination module 173 determines that all views that include the physical location of a sub-event are actively involved views, and therefore determines that all actively involved views should receive a particular sequence of sub-events. In other embodiments, even if touch sub-events were entirely confined to the area associated with one particular view, views higher in the hierarchy would still remain as actively involved views.
Event dispatcher module 174 dispatches the event information to an event recognizer (e.g., event recognizer 180). In embodiments including active event recognizer determination module 173, event dispatcher module 174 delivers the event information to an event recognizer determined by active event recognizer determination module 173. In some embodiments, event dispatcher module 174 stores in an event queue the event information, which is retrieved by a respective event receiver 182.
In some embodiments, operating system 126 includes event sorter 170. Alternatively, application 136-1 includes event sorter 170. In yet other embodiments, event sorter 170 is a stand-alone module, or a part of another module stored in memory 102, such as contact/motion module 130.
In some embodiments, application 136-1 includes a plurality of event handlers 190 and one or more application views 191, each of which includes instructions for handling touch events that occur within a respective view of the application's user interface. Each application view 191 of the application 136-1 includes one or more event recognizers 180. Typically, a respective application view 191 includes a plurality of event recognizers 180. In other embodiments, one or more of event recognizers 180 are part of a separate module, such as a user interface kit (not shown) or a higher level object from which application 136-1 inherits methods and other properties. In some embodiments, a respective event handler 190 includes one or more of: data updater 176, object updater 177, GUI updater 178, and/or event data 179 received from event sorter 170. Event handler 190 optionally utilizes or calls data updater 176, object updater 177, or GUI updater 178 to update the application internal state 192. Alternatively, one or more of the application views 191 include one or more respective event handlers 190. Also, in some embodiments, one or more of data updater 176, object updater 177, and GUI updater 178 are included in a respective application view 191.
A respective event recognizer 180 receives event information (e.g., event data 179) from event sorter 170 and identifies an event from the event information. Event recognizer 180 includes event receiver 182 and event comparator 184. In some embodiments, event recognizer 180 also includes at least a subset of: metadata 183, and event delivery instructions 188 (which optionally include sub-event delivery instructions).
Event receiver 182 receives event information from event sorter 170. The event information includes information about a sub-event, for example, a touch or a touch movement. Depending on the sub-event, the event information also includes additional information, such as location of the sub-event. When the sub-event concerns motion of a touch, the event information optionally also includes speed and direction of the sub-event. In some embodiments, events include rotation of the device from one orientation to another (e.g., from a portrait orientation to a landscape orientation, or vice versa), and the event information includes corresponding information about the current orientation (also called device attitude) of the device.
Event comparator 184 compares the event information to predefined event or sub-event definitions and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event. In some embodiments, event comparator 184 includes event definitions 186. Event definitions 186 contain definitions of events (e.g., predefined sequences of sub-events), for example, event 1 (187-1), event 2 (187-2), and others. In some embodiments, sub-events in an event (187) include, for example, touch begin, touch end, touch movement, touch cancellation, and multiple touching. In one example, the definition for event 1 (187-1) is a double tap on a displayed object. The double tap, for example, comprises a first touch (touch begin) on the displayed object for a predetermined phase, a first liftoff (touch end) for a predetermined phase, a second touch (touch begin) on the displayed object for a predetermined phase, and a second liftoff (touch end) for a predetermined phase. In another example, the definition for event 2 (187-2) is a dragging on a displayed object. The dragging, for example, comprises a touch (or contact) on the displayed object for a predetermined phase, a movement of the touch across touch-sensitive display 112, and liftoff of the touch (touch end). In some embodiments, the event also includes information for one or more associated event handlers 190.
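The double-tap definition above, and the matching performed by the event comparator, can be sketched with hypothetical types as follows; the predetermined-phase timing constraints are omitted for brevity, and the failed state anticipates the discussion below.

```swift
enum SubEvent { case touchBegin, touchEnd, touchMove, touchCancel }

struct EventDefinition {
    let name: String
    let sequence: [SubEvent]
}

// Event 1: a double tap is touch begin, touch end, touch begin, touch end.
let doubleTap = EventDefinition(
    name: "double tap",
    sequence: [.touchBegin, .touchEnd, .touchBegin, .touchEnd])

final class EventRecognizer {
    enum State { case possible, recognized, failed }
    let definition: EventDefinition
    private(set) var state: State = .possible
    private var index = 0

    init(definition: EventDefinition) { self.definition = definition }

    func consume(_ subEvent: SubEvent) {
        guard state == .possible else { return } // failed recognizers ignore sub-events
        if definition.sequence[index] == subEvent {
            index += 1
            if index == definition.sequence.count { state = .recognized }
        } else {
            state = .failed
        }
    }
}
```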
In some embodiments, event definition 187 includes a definition of an event for a respective user-interface object. In some embodiments, event comparator 184 performs a hit test to determine which user-interface object is associated with a sub-event. For example, in an application view in which three user-interface objects are displayed on touch-sensitive display 112, when a touch is detected on touch-sensitive display 112, event comparator 184 performs a hit test to determine which of the three user-interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190, the event comparator uses the result of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects an event handler associated with the sub-event and the object triggering the hit test.
In some embodiments, the definition for a respective event (187) also includes delayed actions that delay delivery of the event information until after it has been determined whether the sequence of sub-events does or does not correspond to the event recognizer's event type.
When a respective event recognizer 180 determines that the series of sub-events does not match any of the events in event definitions 186, the respective event recognizer 180 enters an event impossible, event failed, or event ended state, after which it disregards subsequent sub-events of the touch-based gesture. In this situation, other event recognizers, if any, that remain active for the hit view continue to track and process sub-events of an ongoing touch-based gesture.
In some embodiments, a respective event recognizer 180 includes metadata 183 with configurable properties, flags, and/or lists that indicate how the event delivery system should perform sub-event delivery to actively involved event recognizers. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate how event recognizers interact, or are enabled to interact, with one another. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate whether sub-events are delivered to varying levels in the view or programmatic hierarchy.
In some embodiments, a respective event recognizer 180 activates event handler 190 associated with an event when one or more particular sub-events of an event are recognized. In some embodiments, a respective event recognizer 180 delivers event information associated with the event to event handler 190. Activating an event handler 190 is distinct from sending (and deferred sending) sub-events to a respective hit view. In some embodiments, event recognizer 180 throws a flag associated with the recognized event, and event handler 190 associated with the flag catches the flag and performs a predefined process.
In some embodiments, event delivery instructions 188 include sub-event delivery instructions that deliver event information about a sub-event without activating an event handler. Instead, the sub-event delivery instructions deliver event information to event handlers associated with the series of sub-events or to actively involved views. Event handlers associated with the series of sub-events or with actively involved views receive the event information and perform a predetermined process.
In some embodiments, data updater 176 creates and updates data used in application 136-1. For example, data updater 176 updates the telephone number used in contacts module 137, or stores a video file used in video player module. In some embodiments, object updater 177 creates and updates objects used in application 136-1. For example, object updater 177 creates a new user-interface object or updates the position of a user-interface object. GUI updater 178 updates the GUI. For example, GUI updater 178 prepares display information and sends it to graphics module 132 for display on a touch-sensitive display.
In some embodiments, event handler(s) 190 includes or has access to data updater 176, object updater 177, and GUI updater 178. In some embodiments, data updater 176, object updater 177, and GUI updater 178 are included in a single module of a respective application 136-1 or application view 191. In other embodiments, they are included in two or more software modules.
It shall be understood that the foregoing discussion regarding event handling of user touches on touch-sensitive displays also applies to other forms of user inputs to operate multifunction devices 100 with input devices, not all of which are initiated on touch screens. For example, mouse movement and mouse button presses, optionally coordinated with single or multiple keyboard presses or holds; contact movements such as taps, drags, scrolls, etc. on touchpads; pen stylus inputs; movement of the device; oral instructions; detected eye movements; biometric inputs; and/or any combination thereof are optionally utilized as inputs corresponding to sub-events which define an event to be recognized.
Device 100 optionally also includes one or more physical buttons, such as “home” or menu button 204. As described previously, menu button 204 is, optionally, used to navigate to any application 136 in a set of applications that are, optionally, executed on device 100. Alternatively, in some embodiments, the menu button is implemented as a soft key in a GUI displayed on touch screen 112.
In some embodiments, device 100 includes touch screen 112, menu button 204, push button 206 for powering the device on/off and locking the device, volume adjustment button(s) 208, subscriber identity module (SIM) card slot 210, headset jack 212, and docking/charging external port 124. Push button 206 is, optionally, used to turn the power on/off on the device by depressing the button and holding the button in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process. In an alternative embodiment, device 100 also accepts verbal input for activation or deactivation of some functions through microphone 113. Device 100 also, optionally, includes one or more contact intensity sensors 165 for detecting intensity of contacts on touch screen 112 and/or one or more tactile output generators 167 for generating tactile outputs for a user of device 100.
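The push-button duration rule just described can be summarized in a short sketch; the 2-second value below is an assumed stand-in for the predefined time interval, and the function name is purely illustrative.

```swift
// Hedged sketch: depressing past a predefined interval toggles power;
// releasing before the interval elapses locks (or begins unlocking) the
// device. The 2-second threshold is an assumption for illustration only.
func pushButtonAction(heldFor seconds: Double,
                      predefinedInterval: Double = 2.0) -> String {
    seconds >= predefinedInterval ? "toggle power" : "lock or start unlock"
}

print(pushButtonAction(heldFor: 3.0))   // "toggle power"
print(pushButtonAction(heldFor: 0.5))   // "lock or start unlock"
```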
Each of the above-identified elements in
Attention is now directed towards embodiments of user interfaces that are, optionally, implemented on, for example, portable multifunction device 100.
It should be noted that the icon labels illustrated in
Although some of the examples that follow will be given with reference to inputs on touch screen display 112 (where the touch-sensitive surface and the display are combined), in some embodiments, the device detects inputs on a touch-sensitive surface that is separate from the display, as shown in
In some embodiments, all of the operations described below with reference to
In some embodiments, the user interface navigation logic 480 includes one or more modules (e.g., one or more event handlers 190, including one or more object updaters 177 and one or more GUI updaters 178 as described in greater detail above with reference to
In some embodiments, both the display 450 and the touch-sensitive surface 451 are integrated with the computing device (e.g., Computing Device A in
In some embodiments, the touch-sensitive surface 451 is integrated with the computing device while the display 450 is not integrated with the computing device (e.g., Computing Device B in
In some embodiments, the display 450 is integrated with the computing device while the touch-sensitive surface 451 is not integrated with the computing device (e.g., Computing Device C in
In some embodiments, neither the display 450 nor the touch-sensitive surface 451 is integrated with the computing device (e.g., Computing Device D in
In some embodiments, the computing device has an integrated audio system. In some embodiments, the computing device is in communication with an audio system that is separate from the computing device. In some embodiments, the audio system (e.g., an audio system integrated in a television unit) is integrated with a separate display 450. In some embodiments, the audio system (e.g., a stereo system) is a stand-alone system that is separate from the computing device and the display 450.
Additionally, while the following examples are given primarily with reference to finger inputs (e.g., finger contacts, finger tap gestures, finger swipe gestures), it should be understood that, in some embodiments, one or more of the finger inputs are replaced with input from another input device (e.g., a mouse-based input or stylus input). For example, a swipe gesture is, optionally, replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact). As another example, a tap gesture is, optionally, replaced with a mouse click while the cursor is located over the location of the tap gesture (e.g., instead of detection of the contact followed by ceasing to detect the contact). Similarly, when multiple user inputs are simultaneously detected, it should be understood that multiple computer mice are, optionally, used simultaneously, or a mouse and finger contacts are, optionally, used simultaneously.
Exemplary techniques for detecting and processing touch intensity are found, for example, in related applications: International Patent Application Serial No. PCT/US2013/040061, titled “Device, Method, and Graphical User Interface for Displaying User Interface Objects Corresponding to an Application,” filed May 8, 2013, published as WIPO Publication No. WO/2013/169849, and International Patent Application Serial No. PCT/US2013/069483, titled “Device, Method, and Graphical User Interface for Transitioning Between Touch Input to Display Output Relationships,” filed Nov. 11, 2013, published as WIPO Publication No. WO/2014/105276, each of which is hereby incorporated by reference in its entirety.
In some embodiments, device 500 has one or more input mechanisms 506 and 508. Input mechanisms 506 and 508, if included, can be physical. Examples of physical input mechanisms include push buttons and rotatable mechanisms. In some embodiments, device 500 has one or more attachment mechanisms. Such attachment mechanisms, if included, can permit attachment of device 500 with, for example, hats, eyewear, earrings, necklaces, shirts, jackets, bracelets, watch straps, chains, trousers, belts, shoes, purses, backpacks, and so forth. These attachment mechanisms permit device 500 to be worn by a user.
Input mechanism 508 is, optionally, a microphone, in some examples. Personal electronic device 500 optionally includes various sensors, such as GPS sensor 532, accelerometer 534, directional sensor 540 (e.g., compass), gyroscope 536, motion sensor 538, and/or a combination thereof, all of which can be operatively connected to I/O section 514.
Memory 518 of personal electronic device 500 can include one or more non-transitory computer-readable storage mediums, for storing computer-executable instructions, which, when executed by one or more computer processors 516, for example, can cause the computer processors to perform the techniques described below, including processes 700 (
As used here, the term “affordance” refers to a user-interactive graphical user interface object that is, optionally, displayed on the display screen of devices 100, 300, and/or 500 (
As used herein, the term “focus selector” refers to an input element that indicates a current part of a user interface with which a user is interacting. In some implementations that include a cursor or other location marker, the cursor acts as a “focus selector” so that when an input (e.g., a press input) is detected on a touch-sensitive surface (e.g., touchpad 355 in
As used in the specification and claims, the term “characteristic intensity” of a contact refers to a characteristic of the contact based on one or more intensities of the contact. In some embodiments, the characteristic intensity is based on multiple intensity samples. The characteristic intensity is, optionally, based on a predefined number of intensity samples, or a set of intensity samples collected during a predetermined time period (e.g., 0.05, 0.1, 0.2, 0.5, 1, 2, 5, 10 seconds) relative to a predefined event (e.g., after detecting the contact, prior to detecting liftoff of the contact, before or after detecting a start of movement of the contact, prior to detecting an end of the contact, before or after detecting an increase in intensity of the contact, and/or before or after detecting a decrease in intensity of the contact). A characteristic intensity of a contact is, optionally, based on one or more of: a maximum value of the intensities of the contact, a mean value of the intensities of the contact, an average value of the intensities of the contact, a top 10 percentile value of the intensities of the contact, a value at the half maximum of the intensities of the contact, a value at the 90 percent maximum of the intensities of the contact, or the like. In some embodiments, the duration of the contact is used in determining the characteristic intensity (e.g., when the characteristic intensity is an average of the intensity of the contact over time). In some embodiments, the characteristic intensity is compared to a set of one or more intensity thresholds to determine whether an operation has been performed by a user. For example, the set of one or more intensity thresholds optionally includes a first intensity threshold and a second intensity threshold. In this example, a contact with a characteristic intensity that does not exceed the first threshold results in a first operation, a contact with a characteristic intensity that exceeds the first intensity threshold and does not exceed the second intensity threshold results in a second operation, and a contact with a characteristic intensity that exceeds the second threshold results in a third operation. In some embodiments, a comparison between the characteristic intensity and one or more thresholds is used to determine whether or not to perform one or more operations (e.g., whether to perform a respective operation or forgo performing the respective operation), rather than being used to determine whether to perform a first operation or a second operation.
In some embodiments, a portion of a gesture is identified for purposes of determining a characteristic intensity. For example, a touch-sensitive surface optionally receives a continuous swipe contact transitioning from a start location and reaching an end location, at which point the intensity of the contact increases. In this example, the characteristic intensity of the contact at the end location is, optionally, based on only a portion of the continuous swipe contact, and not the entire swipe contact (e.g., only the portion of the swipe contact at the end location). In some embodiments, a smoothing algorithm is, optionally, applied to the intensities of the swipe contact prior to determining the characteristic intensity of the contact. For example, the smoothing algorithm optionally includes one or more of: an unweighted sliding-average smoothing algorithm, a triangular smoothing algorithm, a median filter smoothing algorithm, and/or an exponential smoothing algorithm. In some circumstances, these smoothing algorithms eliminate narrow spikes or dips in the intensities of the swipe contact for purposes of determining a characteristic intensity.
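The two paragraphs above reduce to a short pipeline: smooth the raw intensity samples, reduce them to a single characteristic value, and compare that value against thresholds. The sketch below assumes an unweighted sliding average, a maximum-value reduction, a window size of 3, and threshold values of 0.3 and 0.7; all are illustrative choices, and any of the other reductions and algorithms named above could be substituted.

```swift
// Sketch: unweighted sliding-average smoothing, then a max reduction to a
// characteristic intensity, then a two-threshold comparison selecting one
// of three operations, as described in the surrounding text.

func slidingAverage(_ samples: [Double], window: Int) -> [Double] {
    guard window > 1, samples.count >= window else { return samples }
    return (0...(samples.count - window)).map { start in
        let slice = samples[start..<(start + window)]
        return slice.reduce(0, +) / Double(window)
    }
}

func characteristicIntensity(of samples: [Double]) -> Double {
    let smoothed = slidingAverage(samples, window: 3)  // removes narrow spikes
    return smoothed.max() ?? 0                         // max-value reduction
}

// Example: compare against two thresholds to pick one of three operations.
let intensity = characteristicIntensity(of: [0.1, 0.2, 0.5, 0.4, 0.1])
let (light, deep) = (0.3, 0.7)                         // assumed thresholds
switch intensity {
case ..<light: print("first operation")
case ..<deep:  print("second operation")              // prints this branch
default:       print("third operation")
}
```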
The intensity of a contact on the touch-sensitive surface is, optionally, characterized relative to one or more intensity thresholds, such as a contact-detection intensity threshold, a light press intensity threshold, a deep press intensity threshold, and/or one or more other intensity thresholds. In some embodiments, the light press intensity threshold corresponds to an intensity at which the device will perform operations typically associated with clicking a button of a physical mouse or a trackpad. In some embodiments, the deep press intensity threshold corresponds to an intensity at which the device will perform operations that are different from operations typically associated with clicking a button of a physical mouse or a trackpad. In some embodiments, when a contact is detected with a characteristic intensity below the light press intensity threshold (e.g., and above a nominal contact-detection intensity threshold below which the contact is no longer detected), the device will move a focus selector in accordance with movement of the contact on the touch-sensitive surface without performing an operation associated with the light press intensity threshold or the deep press intensity threshold. Generally, unless otherwise stated, these intensity thresholds are consistent between different sets of user interface figures.
An increase of characteristic intensity of the contact from an intensity below the light press intensity threshold to an intensity between the light press intensity threshold and the deep press intensity threshold is sometimes referred to as a “light press” input. An increase of characteristic intensity of the contact from an intensity below the deep press intensity threshold to an intensity above the deep press intensity threshold is sometimes referred to as a “deep press” input. An increase of characteristic intensity of the contact from an intensity below the contact-detection intensity threshold to an intensity between the contact-detection intensity threshold and the light press intensity threshold is sometimes referred to as detecting the contact on the touch-surface. A decrease of characteristic intensity of the contact from an intensity above the contact-detection intensity threshold to an intensity below the contact-detection intensity threshold is sometimes referred to as detecting liftoff of the contact from the touch-surface. In some embodiments, the contact-detection intensity threshold is zero. In some embodiments, the contact-detection intensity threshold is greater than zero.
In some embodiments described herein, one or more operations are performed in response to detecting a gesture that includes a respective press input or in response to detecting the respective press input performed with a respective contact (or a plurality of contacts), where the respective press input is detected based at least in part on detecting an increase in intensity of the contact (or plurality of contacts) above a press-input intensity threshold. In some embodiments, the respective operation is performed in response to detecting the increase in intensity of the respective contact above the press-input intensity threshold (e.g., a “down stroke” of the respective press input). In some embodiments, the press input includes an increase in intensity of the respective contact above the press-input intensity threshold and a subsequent decrease in intensity of the contact below the press-input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in intensity of the respective contact below the press-input threshold (e.g., an “up stroke” of the respective press input).
In some embodiments, the device employs intensity hysteresis to avoid accidental inputs sometimes termed “jitter,” where the device defines or selects a hysteresis intensity threshold with a predefined relationship to the press-input intensity threshold (e.g., the hysteresis intensity threshold is X intensity units lower than the press-input intensity threshold or the hysteresis intensity threshold is 75%, 90%, or some reasonable proportion of the press-input intensity threshold). Thus, in some embodiments, the press input includes an increase in intensity of the respective contact above the press-input intensity threshold and a subsequent decrease in intensity of the contact below the hysteresis intensity threshold that corresponds to the press-input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in intensity of the respective contact below the hysteresis intensity threshold (e.g., an “up stroke” of the respective press input). Similarly, in some embodiments, the press input is detected only when the device detects an increase in intensity of the contact from an intensity at or below the hysteresis intensity threshold to an intensity at or above the press-input intensity threshold and, optionally, a subsequent decrease in intensity of the contact to an intensity at or below the hysteresis intensity, and the respective operation is performed in response to detecting the press input (e.g., the increase in intensity of the contact or the decrease in intensity of the contact, depending on the circumstances).
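A minimal state-machine sketch of the hysteresis behavior follows; the 75% proportion is one of the example relationships named above, and the PressDetector type and "down"/"up" stroke labels are assumptions made for illustration.

```swift
// Sketch of press detection with intensity hysteresis: a "down stroke" is
// recognized when intensity rises to the press-input threshold, and the
// matching "up stroke" only when intensity then falls to the lower
// hysteresis threshold, suppressing jitter near the boundary.

struct PressDetector {
    let pressThreshold: Double
    var hysteresisThreshold: Double { pressThreshold * 0.75 }  // assumed 75%
    var isPressed = false

    // Returns "down" or "up" when a stroke is detected, nil otherwise.
    mutating func feed(intensity: Double) -> String? {
        if !isPressed, intensity >= pressThreshold {
            isPressed = true
            return "down"
        }
        if isPressed, intensity <= hysteresisThreshold {
            isPressed = false
            return "up"
        }
        return nil   // between thresholds: state is held, no transition
    }
}

var detector = PressDetector(pressThreshold: 0.6)
for sample in [0.2, 0.65, 0.5, 0.62, 0.4] {
    if let stroke = detector.feed(intensity: sample) {
        print(stroke)   // prints "down" (at 0.65) then "up" (at 0.4)
    }
}
```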
For ease of explanation, the descriptions of operations performed in response to a press input associated with a press-input intensity threshold or in response to a gesture including the press input are, optionally, triggered in response to detecting either: an increase in intensity of a contact above the press-input intensity threshold, an increase in intensity of a contact from an intensity below the hysteresis intensity threshold to an intensity above the press-input intensity threshold, a decrease in intensity of the contact below the press-input intensity threshold, and/or a decrease in intensity of the contact below the hysteresis intensity threshold corresponding to the press-input intensity threshold. Additionally, in examples where an operation is described as being performed in response to detecting a decrease in intensity of a contact below the press-input intensity threshold, the operation is, optionally, performed in response to detecting a decrease in intensity of the contact below a hysteresis intensity threshold corresponding to, and lower than, the press-input intensity threshold.
As used herein, an “installed application” refers to a software application that has been downloaded onto an electronic device (e.g., devices 100, 300, and/or 500) and is ready to be launched (e.g., become opened) on the device. In some embodiments, a downloaded application becomes an installed application by way of an installation program that extracts program portions from a downloaded package and integrates the extracted portions with the operating system of the computer system.
Attention is now directed towards embodiments of user interfaces (“UI”) and associated processes that are implemented on an electronic device, such as portable multifunction device 100, device 300, or device 500.
Attention is now directed towards embodiments of user interfaces (“UI”) and associated processes that optionally are implemented with an electronic device that communicates with and/or includes a display and a touch-sensitive surface, such as one of Computing Devices A-D in
In some cases, electronic device 600 connects to network device 601 via communications channel 604C, which allows for connections to external devices 603A-C via communications channels 604D-F, respectively. In some cases, electronic device 600 also directly connects to external device 603C via communications channel 604G. In some cases, electronic device 600 also connects to remote server 603 via communications channel 604H, the Internet, and communications channel 604I. Communications channels 604A-I are any form of communications channels, such as wired (e.g., Ethernet, USB, Lightning, Fiber) or wireless (e.g., WiFi, Bluetooth, IR) connections.
In response to receiving user input selecting affordance 622, electronic device 600 begins the gaming session and updates the visual output of the application to show the progress of the game, as depicted in
After the application task has ended,
In response to receiving user input selecting affordance 634, electronic device 600 causes menu 638 to be displayed, as depicted in
In some cases, menu 638 is displayed by the application based on data it received from the operating system of electronic device 600. In other cases, menu 638 is displayed by the operating system or other system components in response to a request from the application.
In response to user input selecting affordance 642 (or in the case where AirDrop is the only available way to share), device 600 causes menu 648 to be displayed, as depicted in
External devices 603A-603C corresponding to affordances 652, 654, and 656 are associated with electronic device 600 in some manner. In some embodiments, external devices 603A-603C are all within a threshold proximity of electronic device 600 (e.g., as indicated by Bluetooth communications); external devices 603A-603C are all associated with the user of electronic device 600 (e.g., the user has a common username on electronic device 600 and external devices 603A-603C or the user is signed on to electronic device 600 and external devices 603A-603C); or the devices are all only associated with the user of electronic device 600. In some cases electronic device 600 is associated with multiple users (e.g., electronic device 600 is a shared device such as a set top box that controls a user interface on a television which is a device that individual users do not typically sign on to using their personal communication or social accounts). In some cases external devices 603A-603C listed in menu 648 are used only with a single user (e.g., a device on which the user is signed on to one or more personal communication and social accounts so that the user can share the recorded visual output using the personal communication and/or social accounts).
In some embodiments, it is beneficial to share the recorded visual output with another device because electronic device 600 has limited applications or other means to share the recorded visual output (e.g., limited to sharing data with devices that are in close proximity and associated with the same user that is using electronic device 600), while one or more of external devices 603A-603C, in some embodiments, has many more applications or other means to share the recorded visual output (e.g., email, social networking, websites). In other words, in some cases, external devices 603A-603C in menu 648 have more ways to share the application task data than electronic device 600 that executed the application task. In addition to being able to use the additional resources of external devices 603A-603C listed in menu 648 to further share the recorded visual output, in some embodiments, it is beneficial to share the recorded visual output with one or more of these devices because electronic device 600 has limited memory (e.g., non-volatile memory), such as 64 GB or less.
In response to receiving user input selecting one of the affordances of menu 648, electronic device 600 will transmit the recorded visual output to the selected external device using any number of communications channels, such as WiFi, Bluetooth, or other communications protocols. In some cases, the transmission from electronic device 600 to the selected device (e.g., external device 603C) is direct in that the transmission does not pass through any intermediate servers or networking devices (e.g., via communications channel 604G of
Instead of recording visual output of the application automatically in response to the application starting, electronic device 600 can also record visual output of the application in response to a request from the user. For example, with reference to
In some embodiments, in response to receiving user input selecting affordance 664, electronic device 600 starts (e.g., immediately) to record visual output. In some embodiments, electronic device 600 waits until the current game session is resumed (e.g., unpaused). In some embodiments, the game session resumes automatically in response to the selection of affordance 664. In some embodiments, the game session resumes only in response to the user subsequently providing user input selecting affordance 668. In any of these cases, the visual output of the game session is recorded in accordance with the user input selecting affordance 664.
After the game session starts again (i.e., resumes), as depicted in
As described below, method 700 provides an intuitive way for sharing application visual output. The method reduces the cognitive burden on a user for sharing application visual output, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling a user to share application visual output faster and more efficiently conserves power and increases the time between battery charges.
The first electronic device receives a first input (e.g., affordance selection via remote control or other device connected to the first electronic device) corresponding to an affordance to start a task (e.g., 622, 664, 668) (e.g., starting a game session or resuming a game session from the pause screen) in an application (e.g., a game application depicted in
In accordance with some embodiments, the third user input is received before the task starts (e.g., an affordance to turn on recording is displayed at a start screen of the task).
In accordance with some embodiments, the third user input is received after the task has started and while the task is paused (e.g.,
In accordance with some embodiments, the application is a game application (e.g., the game application depicted in
In accordance with some embodiments, the application task is a game session (e.g., the game session depicted in
In accordance with some embodiments, prior to the first electronic device starting to record the visual output of the task, the first electronic device receives a third user input that corresponds to an indication to record the visual output of the task (e.g., 664), wherein the first electronic device recording the visual output of the application as the application task data is based on (e.g., in response to) receiving the third user input.
In accordance with some embodiments, the first electronic device receives a fourth user input that corresponds to selection of an affordance to request the recording of the visual output of the task to stop (e.g., 680). After the first electronic device receives the fourth user input, the first electronic device ceases the recording of the visual output of the task while the task is executing (and, optionally, deletes previously recorded visual output of the task).
In accordance with some embodiments, the first electronic device receives the fourth user input while the task is paused (e.g.,
In accordance with some embodiments, the first electronic device causing display of the affordance for sharing occurs in response to the cessation of the task (e.g., when the user has requested an end of the task by pausing or quitting a game, or when the task has ended automatically such as when the user loses a game or successfully completes a level). This simplifies the human-machine interface by automatically presenting an option to share the application task data in response to the task ending instead of having to switch applications to share the application task data.
In accordance with some embodiments, after the task has ceased, the first electronic device causes display of an affordance (e.g., 634) for providing an option to share the application task data on the display. The first electronic device receives fifth user input selecting the affordance for providing an option to share the application task data, wherein the display of the affordance for sharing (e.g., 644, 646, 652, 654, 656) occurs in response to receiving the fifth user input.
In accordance with some embodiments, the first electronic device detects the cessation of the task (e.g., the task is paused or terminated). In response to detecting the cessation of the task, the first electronic device ceases to record the visual output.
In accordance with some embodiments, the first electronic device detects the cessation of the task (e.g., the task is paused or terminated). In response to detecting the cessation of the task, the first electronic device causes display of an affordance (e.g., 632, 694) for viewing the application task data (e.g., providing the user with an option to review the video recording of the game when the game is paused, after successful completion of a level, and/or after losing the game). In response to the first electronic device receiving user input selecting the affordance for playing the application task data (e.g., 632, 694), the first electronic device causes display of the application task data.
In accordance with some embodiments, the second electronic device is a first external device (e.g., one of 603A-603C) of a plurality of external devices (e.g., 603A-603C). After the first electronic device receives the second user input selecting the affordance for sharing, the first electronic device causes display of one or more affordances (e.g., 652, 654, 656) associated with the plurality of external devices (e.g., 603A-603C). The first electronic device receives sixth user input selecting an affordance corresponding to the first external device (e.g., one of 603A-603C). In response to the first electronic device receiving the sixth user input, the first electronic device designates the first external device as the recipient of the transmission of the application task data.
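For illustration, the device-selection step of method 700 might be sketched as follows; the ExternalDevice type, the filtering criteria, and all function names are assumptions made for this sketch, not the described implementation.

```swift
// Sketch: filter known external devices down to share targets (nearby and
// associated with the same user), present them as affordances (e.g., menu
// 648), and transmit the recorded application task data to the selection.

import Foundation

struct ExternalDevice {
    let name: String
    let isNearby: Bool      // e.g., inferred from Bluetooth communications
    let sameUser: Bool      // signed on with the same user account
}

func shareTargets(among devices: [ExternalDevice]) -> [ExternalDevice] {
    devices.filter { $0.isNearby && $0.sameUser }
}

func share(taskData: Data,
           pick: ([ExternalDevice]) -> ExternalDevice?,  // user's choice
           from devices: [ExternalDevice],
           transmit: (Data, ExternalDevice) -> Void) {
    if let recipient = pick(shareTargets(among: devices)) {
        transmit(taskData, recipient)   // designate the recipient and send
    }
}
```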
In accordance with some embodiments, the one or more external devices (e.g., 603A-603C) are determined based on a proximity to the first electronic device.
In accordance with some embodiments, the second electronic device is associated with a user of the first electronic device (e.g., the first electronic device and the second electronic device are both associated or signed on with a same user account of a content synchronization or purchase sharing service such as a personal or family iCloud account).
In accordance with some embodiments, the second electronic device is associated with only a user (e.g., an iPhone that only has a single user) of the first electronic device.
In accordance with some embodiments, the first electronic device is associated with multiple users (e.g., the first electronic device is a shared device such as a set top box that controls a user interface on a television which is a device that individual users do not typically sign on to their personal communication and social accounts) and the second electronic device is associated with a single user (e.g., a device on which the user is signed on to one or more personal communication and social accounts so that the user can share the recorded video using the personal communication and/or social accounts).
In accordance with some embodiments, the second electronic device is a smartphone or a tablet computer.
In accordance with some embodiments, the memory has a size of 64 GB or less.
In accordance with some embodiments, the application task data is a multimedia file.
In accordance with some embodiments, the application task data is transmitted directly (e.g., without going through intermediate servers or networking devices) to the second electronic device (e.g., 603C via 604G).
In accordance with some embodiments, the first electronic device includes a first set of one or more ways to share the application task data (e.g.,
In accordance with some embodiments, the affordance for sharing the application task data (e.g., 644, 646, 652, 654, 656) is a system user interface element, and the affordance to start the task (e.g., 622, 664, 668) and the affordance to stop the recording of the visual output (e.g., 680) are application user interface elements (e.g., user interface elements controlled by the application that invoke system protocols for recording the application task data).
In accordance with some embodiments, the second electronic device is configured to enable sharing via a plurality of different sharing services (e.g., email, social networks, video archives).
Note that details of the processes described above with respect to method 700 (e.g.,
In accordance with some embodiments,
As shown in
The processing unit 806 is configured to: receive (e.g., with receiving unit 812) a first input corresponding to an affordance to start a task in an application executing on the one or more processors; in response to receiving the first input, start (e.g., with starting unit 814) the task; while the task is ongoing, record (e.g., with recording unit 816) visual output of the application as application task data; after the task has ceased, cause to be displayed (e.g., with display causing unit 810) on the display an affordance for sharing the application task data with a second electronic device that is associated with the first electronic device; and while the affordance for sharing is displayed on the display, receive (e.g., with receiving unit 812) a second input that corresponds to selection of the affordance for sharing the application task data; and in response to receiving the second input, transmit (e.g., with transmitting unit 818) the application task data to the second electronic device over the communication interface.
In some embodiments, the application is a game application.
In some embodiments, the application task is a game session.
In some embodiments, the processing unit 806 is further configured to, prior to starting to record the visual output of the task, receive (e.g., with receiving unit 812) a third user input that corresponds to an indication to record the visual output of the task, wherein the recording the visual output of the application as the application task data is based on receiving the third user input.
In some embodiments, the third user input is received before the task starts.
In some embodiments, the third user input is received after the task has started and while the task is paused.
In some embodiments, the processing unit 806 is further configured to receive (e.g., with receiving unit 812) a fourth user input that corresponds to selection of an affordance to request the recording of the visual output of the task to stop; and after receiving the fourth user input, cease the recording (e.g., with recording unit 816) of the visual output of the task while the task is executing.
In some embodiments, the fourth user input is received while the task is paused.
In some embodiments, causing display (e.g., with display causing unit 810) of the affordance for sharing occurs in response to the cessation of the task.
In some embodiments, the processing unit 806 is further configured to, after the task has ceased, cause display (e.g., with display causing unit 810) of an affordance for providing an option to share the application task data on the display; and receive (e.g., with the receiving unit 812) fifth user input selecting the affordance for providing an option to share the application task data, wherein the display of the affordance for sharing occurs in response to receiving the fifth user input.
In some embodiments, the processing unit 806 is further configured to detect (e.g., with detecting unit 822) the cessation of the task; and in response to detecting the cessation of the task, cease recording (e.g., with recording unit 816) the visual output.
In some embodiments, the processing unit 806 is further configured to detect (e.g., with detecting unit 822) the cessation of the task; in response to detecting the cessation of the task, cause display (e.g., with display causing unit 810) of an affordance for viewing the application task data; and in response to receiving user input selecting the affordance for playing the application task data, cause display (e.g., with display causing unit 810) of the application task data.
In some embodiments, the second electronic device is a first external device of a plurality of external devices, and the processing unit 806 is further configured to: after receiving the second user input selecting the affordance for sharing, cause display (e.g., with display causing unit 810) of one or more affordances associated with the plurality of external devices; receive (e.g., with receiving unit 812) sixth user input selecting an affordance corresponding to the first external device; and in response to receiving the sixth user input, designate (e.g., with designating unit 820) the first external device as the recipient of the transmission of the application task data.
In some embodiments, the one or more external devices are determined based on a proximity to the first electronic device.
In some embodiments, the second electronic device is associated with a user of the first electronic device.
In some embodiments, the second electronic device is associated with only a user of the first electronic device.
In some embodiments, the first electronic device is associated with multiple users and the second electronic device is associated with a single user.
In some embodiments, the second electronic device is a smartphone or a tablet computer.
In some embodiments, the memory has a size of 64 GB or less.
In some embodiments, the application task data is a multimedia file.
In some embodiments, the application task data is transmitted directly to the second electronic device.
In some embodiments, the first electronic device includes a first set of one or more ways to share the application task data and the second electronic device includes a second set of one or more ways to share the application task data different from the first set of one or more ways.
In some embodiments, the affordance for sharing the application task data is a system user interface element and the affordance to start the task and the affordance to stop the recording of the visual output are application user interface elements.
In some embodiments, the second electronic device is configured to enable sharing via a plurality of different sharing services.
The operations described above with reference to
In some embodiments, the list of broadcast applications in menu 908 is determined using various techniques, including an API call to the operating system of electronic device 600, querying a database, checking configuration data, or consulting other sources of data about installed applications. In some embodiments, broadcast applications are applications that receive or generate a live stream of the visual output of an application (e.g., video game output) and send the visual output for broadcasting and viewing by remote users. In some embodiments, in accordance with a determination that multiple broadcast applications are available, menu 908 is displayed to allow the user to select one of the broadcast applications. In some embodiments, in accordance with a determination that only one broadcast application is available, electronic device 600 skips causing display of menu 908.
In some embodiments, in accordance with a determination that no broadcast applications are available on electronic device 600, electronic device 600 causes display of menu 916 of
Menu 916 includes indicator 922 that aids the user in selecting affordance 924 or 926. In some cases, in response to user input selecting affordance 924, electronic device 600 causes to be displayed an application store interface displaying one or more broadcast applications that can be downloaded and installed on electronic device 600. In other cases, in response to user input selecting affordance 924, electronic device 600 causes to be displayed one or more links to broadcast applications that can be downloaded and installed on electronic device 600.
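The three-way branch described across the preceding paragraphs (several broadcast applications, exactly one, or none) might be sketched as follows; the BroadcastApp type, the hard-coded discovery result, and the closure-based menus standing in for menus 908 and 916 are all assumptions for illustration.

```swift
// Sketch: enumerate installed broadcast applications, then show a picker
// for several, skip the picker for exactly one, or prompt an install when
// none are available.

struct BroadcastApp { let name: String }

func installedBroadcastApps() -> [BroadcastApp] {
    // Assumption: the OS exposes some query (API call, database, or
    // configuration data) for apps that can receive a live stream of
    // application visual output; hard-coded here for the sketch.
    [BroadcastApp(name: "Pinch"), BroadcastApp(name: "BroadAll")]
}

func chooseBroadcastApp(
    presentPicker: ([BroadcastApp]) -> BroadcastApp,  // e.g., menu 908
    promptInstall: () -> Void                         // e.g., menu 916
) -> BroadcastApp? {
    let apps = installedBroadcastApps()
    switch apps.count {
    case 0:
        promptInstall()
        return nil
    case 1:
        return apps[0]            // skip the picker entirely
    default:
        return presentPicker(apps)
    }
}

let chosen = chooseBroadcastApp(
    presentPicker: { apps in apps[0] },          // user picks from the menu
    promptInstall: { print("show install prompt") })
```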
In some embodiments, in response to user input selecting affordance 926, electronic device 600 returns display 600B to the initial display, as depicted in
In some embodiments, if user input is received selecting affordance 912 (corresponding to the BroadAll application) or if only one broadcast application is installed, a determination is made as to whether electronic device 600 has access to broadcast services through the corresponding application. In some embodiments, electronic device 600 determines whether a user is logged into or needs to log into the broadcast application or the service associated with the broadcast application. If either the application or the service requires the user to log in and the user has not yet logged in, menu 928 is displayed to prompt the user to enter their login information for the application indicated by instructions 930. Affordance 932 allows the user to input a username. Affordance 934 allows the user to input a password. Once a username and password are entered, user input selecting affordance 935 results in the username and password being passed to the broadcast application for authentication. In some embodiments, the broadcast application authenticates the login information either locally or sends it to a remote server for authentication. Instead of being a user interface of the gaming application, menu 928 could be a menu of the broadcast application so that the login information is entered directly into the broadcast application. As another option, instead of menu 928, electronic device 600 could switch to the broadcast application to enable the user to enter login information directly into the broadcast application.
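The login gate just described might look like the following sketch; the LoginState enum and closure parameters (standing in for menu 928 and the broadcast start) are assumptions, not the described interface.

```swift
// Sketch: if login is required and the user is logged out, present a login
// menu and authenticate before the session is broadcast; otherwise proceed
// directly to broadcasting.

enum LoginState { case loggedIn, loggedOut, notRequired }

func prepareBroadcast(state: LoginState,
                      presentLogin: () -> Bool,   // menu 928; true on success
                      startBroadcast: () -> Void) {
    switch state {
    case .loggedIn, .notRequired:
        startBroadcast()
    case .loggedOut:
        if presentLogin() {       // credentials are passed to the broadcast
            startBroadcast()      // application for authentication
        }
    }
}

prepareBroadcast(state: .loggedOut,
                 presentLogin: { true },          // user authenticates
                 startBroadcast: { print("broadcast begins") })
```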
In some embodiments, in response to the login information being successfully authenticated, or the user already being logged in, or a login not being necessary, the application task (e.g., a session of a game application) begins. While the application task is executing, the visual output of the application task is sent to the broadcast application. The broadcast application then sends the visual output to a remote server (e.g., remote server 603 of
In some embodiments, pause menu 944 is displayed in response to electronic device 600 receiving user input requesting the game session to be paused. Pause menu 944 includes affordance 946 for resuming the game session, affordance 948 for stopping the broadcast of the game session, and affordance 950 for restarting the game session. Indicator 952, which is not present in some embodiments, is used to aid the user in selecting an affordance.
As described below, method 1000 provides an intuitive way for sharing application visual output. The method reduces the cognitive burden on a user for sharing application visual output, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling a user to share application visual output faster and more efficiently conserves power and increases the time between battery charges.
The first electronic device causes to display (1002) on the display a first affordance (e.g., 906) (e.g., on the start screen of the application) in an application (e.g., a game application) for broadcasting (e.g., sending for near-live distribution to an audience) visual output of a task (e.g., a game session) of the application.
In response to the first electronic device receiving a first user input (e.g., using a remote control or other input device in communication with the first electronic device) corresponding to selection of the first affordance (e.g., 906) and in accordance with the first electronic device determining that multiple broadcast applications on the first electronic device are capable of broadcasting visual output of the application while the task is ongoing: [a] the first electronic device causes to display (1006) on the display a second affordance (e.g., one of 910, 912, 914) for selecting a broadcast application of the plurality of broadcast applications (e.g., Pinch, BroadAll, BScreen) capable of broadcasting the visual output of the application; and [b] while the second affordance (e.g., one of 910, 912, 914) is displayed on the display, the first electronic device receives (1012) second user input corresponding to selection of the second affordance (e.g., one of 910, 912, 914). After (e.g., in response to or later in time) the first electronic device receives the second user input (e.g., after the user has selected an application and after potentially selecting the type of game to start), the first electronic device starts (1014) the task and sends the visual output of the application to the broadcast application for transmitting the visual output over the communication interface (e.g., over communication channels 604C and 604H) to a remote server (e.g., 603) (e.g., the broadcasting server).
In accordance with some embodiments, prior to the first electronic device causing the second affordance to be displayed on the display, the first electronic device determines (1006) whether multiple broadcast applications (e.g., Twitch, YouTube, XBox) on the electronic device are capable of broadcasting (e.g., sending for near-live distribution to an audience) the visual output of the application.
In accordance with some embodiments, further in response to the first electronic device receiving the first user input (e.g., using remote control or other user input device) corresponding to a selection of the first affordance (e.g., 904) and in accordance with the first electronic device determining that there is only one broadcast application capable of broadcasting the visual output of the application, the first electronic device starts (1016) the task and sends the visual output of the task to the broadcast application (e.g., the installed broadcast application) for transmitting the visual output over the communication interface to a remote server (e.g., 603) (e.g., the broadcasting server) without causing display of the second affordance (e.g., any of 910, 912, 914).
In accordance with some embodiments, further in response to the first electronic device receiving the first user input (e.g., using remote control) corresponding to a selection of the first affordance and in accordance with the first electronic device determining that there are no applications capable of broadcasting the visual output of the application, the first electronic device prompts (1018) the user to install a broadcast application (e.g.,
In accordance with some embodiments, the application is a game application and the task is a session of the game application.
In accordance with some embodiments, the first electronic device causes to display on the display a third affordance (e.g., 906) with the first affordance (e.g., 904), wherein the third affordance corresponds to a request to start the task without broadcasting the visual output of the application. In response to receiving the user input selecting the third affordance, the first electronic device starts the task without sending the visual output of the application (e.g., to a broadcast application) for broadcasting.
In accordance with some embodiments, in accordance with the first electronic device determining that there are no applications capable of broadcasting the visual output of the application, the first electronic device forgoes the display of the first affordance (e.g., 906).
In accordance with some embodiments, the first electronic device receives a third user input (e.g., user input selecting affordance 948 via menu 944) indicating that sending the visual output of the application to the broadcast application should be ceased (e.g., selecting an affordance on a pause screen to stop broadcasting). In response to the first electronic device receiving the third user input, the first electronic device ceases to send the visual output of the task to the broadcast application.
In accordance with some embodiments, the sending of the visual output of the application to the broadcast application occurs while the task is executing on the electronic device (e.g., a live stream of the visual output of the video game is being sent to the broadcast application, which is then broadcasting the content for viewing by remote users).
In accordance with some embodiments, the visual output of the application is a video recording of the application output (e.g., the output of the video game of
In accordance with some embodiments, the first electronic device transmits, via the broadcast application, the visual output over the communication interface to the remote server (e.g., 603) (e.g., for retransmission to remote users who are watching the live stream of the video game). By using the broadcast application, the design of the application producing the visual output can be simplified: instead of having to integrate broadcast functionality, the application need only pass on its visual output to a different application.
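This separation of concerns can be illustrated with a brief sketch; the Broadcaster protocol, the Game type, and the Data stand-in for a frame are assumptions made for illustration only.

```swift
// Sketch: the application only hands frames to a broadcast component
// rather than implementing any networking itself.

import Foundation

protocol Broadcaster {
    func send(frame: Data)       // forwards frames toward the remote server
}

final class Game {
    private let broadcaster: Broadcaster
    init(broadcaster: Broadcaster) { self.broadcaster = broadcaster }

    func renderFrame() {
        let frame = Data()       // stand-in for real visual output
        broadcaster.send(frame: frame)  // the game never touches the network
    }
}
```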
In accordance with some embodiments, in response to the first electronic device receiving the second user input, the first electronic device determines whether a user is logged into the broadcast application. In accordance with the first electronic device determining that the user is logged into the broadcast application, the first electronic device starts the task and sends the visual output of the application to the broadcast application. In accordance with the first electronic device determining that the user is not logged into the broadcast application, the first electronic device causes to display, on the display, a login window (e.g., 928) of the broadcast application.
In accordance with some embodiments, the login window (e.g., 928) is generated by the broadcast application user interface and is concurrently displayed with at least a portion of the application user interface (e.g., so that the application does not get access to the login credentials of the broadcast application).
In accordance with some embodiments, the first affordance (e.g., 904) is associated with the application user interface and the second affordance (e.g., 910, 912, or 914) for selecting the broadcast application is associated with a system user interface.
In accordance with some embodiments,
As shown in
The processing unit 1106 is configured to: cause to display (e.g., with display causing unit 1110) on the display a first affordance in an application for broadcasting visual output of a task of the application; in response to receiving a first user input corresponding to selection of the first affordance and in accordance with a determination that multiple broadcast applications on the electronic device are capable of broadcasting visual output of the application while the task is ongoing: cause to display (e.g., with display causing unit 1110) on the display a second affordance for selecting a broadcast application of the plurality of broadcast applications capable of broadcasting the visual output of the application; and while the second affordance is displayed on the display, receive (e.g., with receiving unit 1112) second user input corresponding to selection of the second affordance; and after receiving the second user input, start (e.g., with starting unit 1114) the task and send (e.g., with sending unit 1116) the visual output of the application to the broadcast application for transmitting the visual output over the communication interface to a remote server.
In some embodiments, the processing unit 1106 is further configured to, prior to causing the second affordance to be displayed on the display, determine (e.g., with determining unit 1118) whether multiple broadcast applications on the electronic device are capable of broadcasting the visual output of the application.
In some embodiments, the application is a game application and the task is a session of the game application.
In some embodiments, the processing unit 1106 is further configured to: cause to display (e.g., with display causing unit 1110) on the display a third affordance with the first affordance, wherein the third affordance corresponds to a request to start the task without broadcasting the visual output of the application; and in response to receiving the user input selecting the third affordance, start (e.g., with starting unit 1114) the task without sending the visual output of the application for broadcasting.
In some embodiments, the processing unit 1106 is further configured to, further in response to receiving the first user input corresponding to a selection of the first affordance and in accordance with a determination that there is only one broadcast application capable of broadcasting the visual output of the application, start (e.g., with starting unit 1114) the task and send (e.g., with sending unit 1116) the visual output of the task to the broadcast application for transmitting the visual output over the communication interface to a remote server without causing display of the second affordance.
In some embodiments, the processing unit 1106 is further configured to, further in response to receiving the first user input corresponding to a selection of the first affordance and in accordance with a determination that there are no applications capable of broadcasting the visual output of the application, prompt (e.g., with prompting unit 1122) the user to install a broadcast application.
In some embodiments, the processing unit 1106 is further configured to, in accordance with a determination that there are no applications capable of broadcasting the visual output of the application, forgo causing display (e.g., with display causing unit 1110) of the first affordance.
In some embodiments, the processing unit 1106 is further configured to: receive (e.g., with receiving unit 1112) a third user input indicating that sending the visual output of the application to the broadcast application should be ceased; and in response to receiving the third user input, cease sending (e.g., with sending unit 1116) the visual output of the task to the broadcast application.
In some embodiments, sending the visual output of the application to the broadcast application occurs while the task is executing.
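Continuing the earlier hypothetical sketch, ceasing the broadcast in response to the third user input can be as simple as clearing the forwarding target while the task keeps running. The extension below assumes it lives in the same file as the GameSession sketch, so the private property is reachable.

```swift
// Same-file extension of the hypothetical GameSession defined earlier.
extension GameSession {
    // Invoked in response to the third user input: stop forwarding
    // visual output to the broadcast application without ending the task.
    func stopBroadcasting() {
        broadcaster = nil
    }
}
```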
In some embodiments, the visual output of the application is a video recording of the application output.
In some embodiments, the processing unit 1106 is further configured to transmit (e.g., with transmitting unit 1120), by the broadcast application, the visual output over the communication interface to the remote server.
In some embodiments, the processing unit 1106 is further configured to, in response to receiving the second user input, determine (e.g., with determining unit 1118) whether a user is logged into the broadcast application; in accordance with a determination that the user is logged into the broadcast application, start (e.g., with starting unit 1114) the task and send (e.g., with sending unit 1116) the visual output of the application to the broadcast application; and in accordance with a determination that the user is not logged into the broadcast application, cause to display (e.g., with display causing unit 1110), on the display, a login window of the broadcast application.
In some embodiments, the login window is generated by the broadcast application and is concurrently displayed with at least a portion of the application user interface.
In some embodiments, the first affordance is associated with the application user interface and the second affordance for selecting the broadcast application is associated with a system user interface.
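The login determination of these embodiments can likewise be sketched. As before, every name below (BroadcastAppWithLogin, LoginWindow, and the handler function) is a hypothetical assumption introduced for illustration, not an actual API.

```swift
// Placeholder for the broadcast application's login surface.
struct LoginWindow {
    let title: String
}

// Hypothetical refinement of the earlier BroadcastApp protocol with a
// login state.
protocol BroadcastAppWithLogin: BroadcastApp {
    var isUserLoggedIn: Bool { get }
    func makeLoginWindow() -> LoginWindow
}

// Responds to the second user input: start the task when the user is
// logged in; otherwise present the broadcast application's login window
// concurrently with at least a portion of the application UI.
func handleSecondAffordanceSelection(app: BroadcastAppWithLogin,
                                     session: GameSession,
                                     presentLogin: (LoginWindow) -> Void) {
    if app.isUserLoggedIn {
        session.start(broadcastingTo: app)
    } else {
        presentLogin(app.makeLoginWindow())
    }
}
```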
The foregoing description has, for purposes of explanation, been given with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the techniques and their practical applications, thereby enabling others skilled in the art to best utilize the techniques and various embodiments with various modifications as are suited to the particular use contemplated.
Although the disclosure and examples have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of the disclosure and examples as defined by the claims.
This application claims priority to U.S. provisional patent application 62/349,041, entitled “RECORDING AND BROADCASTING APPLICATION VISUAL OUTPUT,” filed Jun. 12, 2016, the content of which is hereby incorporated by reference in its entirety.