1. Field of Technology
The embodiments herein generally relate to online content storage systems that enable sharing of content between user accounts, and in particular to accounts shared among family members.
2. Background
Some online (or more colloquially, “cloud”) based content storage systems enable users to share content between multiple devices and multiple different users. Typically, each user must have their own account with the content storage system, whereby they then designate particular folders or files to be shared with other users.
Some content storage systems do not allow children (typically under the age of thirteen, though this may vary by jurisdiction) to establish accounts due to legal concerns. However, many families would like their children to be able to share content with other family members. Not all online content storage systems allow for family accounts that include child accounts, and among those that do, establishing an account for a child can be complex and time consuming.
Content storage systems are sometimes associated with online digital stores where users can purchase digital content, such as music, video, electronic books, and computer applications. For content storage systems of this type, child family members typically have the ability to purchase digital content subject only to content rating restrictions, without direct supervision or approval by other family members. These content storage systems do not provide easy-to-administer facilities for controlling the kinds of content that child family members can purchase.
3. Summary
An online content storage system enables a user to create a family account to be shared by a number of family members, including child family members. In one aspect, an adult family member having an account on the content storage system can initiate processes to set up a shared family account on the content storage system. The adult family member, acting as the family organizer, can select an account to be used as the shared family account, select an account to use as a purchase account to allow purchases to be made by other family members, invite other family members to join the shared family account, and create new accounts for child family members in an efficient and flexible manner.
In creating a new account for a child family member, the family organizer can use credit card related information, such as a credit card verification code, to authenticate to the content storage system that the family organizer, rather than the child family member, is setting up the account. The family organizer can also select particular content servers to which the child family member can have access. The family organizer can designate that a real time purchase approval process be associated with the account of the child family member, so that future purchases of content from the content servers by the child family member must be approved in real time by an adult family member prior to the purchase being processed and the content downloaded to the child family member's computing device. The family organizer can further select particular content restrictions based on country and media type to be applied to the child family member's account. Different content restriction templates can be provided by the content storage system for different countries and age ranges; the system can automatically select and periodically update the content restriction template applied to the child family member's account based on the child family member's age and country location.
Upon acceptance of an invitation to join a shared family account, an adult family member automatically obtains access to the shared family account, as well as to designated content servers for obtaining content, and to various services, such as a family group messaging service, a family calendar service, and a device location service by which the computing devices of other family members may be easily located and shown on a map, or caused to output an audible sound to aid in locating them.
Accordingly, in one aspect a method comprises, at an electronic device with a display, displaying a first screen to initiate setup of a family account at an online content storage system that is to be shared between family members. The electronic device receives an input on the first screen initiating setup of the family account, and displays an identifier of a family account organizer and an indication of a source account with the online content storage system. The electronic device then displays at least one account to be selected as a purchase account for the family account, wherein purchases of content items by any family member having access to the family account are stored on the online content storage system in association with the family account, and can be accessed by any other family member having access to the family account. The electronic device receives an input selecting an account as the purchase account, and then displays credit card information associated with the purchase account. The electronic device receives an input confirming the credit card information, and sends to the online content storage system an indication to associate the credit card number for the purchase account with the family account. The device then displays a list of family members having access to the family account, along with an indication of the credit card number associated with the family account, and a first user interface element to add a family member to the family account.
In another aspect, the method includes the electronic device receiving a selection of the user interface element to create an account identifier for a child family member, and then displaying an input field for receiving a credit card verification number for a credit card number associated with the family account. If the electronic device receives an input that matches the actual credit card verification number of the credit card number associated with the account, it proceeds with the creation of an account identifier for the child family member by displaying input fields for receiving identification information for the child family member. If the electronic device receives an input that does not match the actual credit card verification number of the credit card number associated with the account, it terminates the creation of the account identifier for the child family member, and displays a list of family members having access to the family account, wherein the list of family members does not include the child family member.
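The verification gate described above can be sketched as follows. This is a minimal illustration: the function and screen names are assumptions, and the plain-text comparison stands in for what a deployed system would do by verifying the code with the payment processor rather than comparing stored values:

```python
def begin_child_account_creation(entered_cvv: str, actual_cvv: str) -> str:
    """Gate creation of a child account identifier on the card
    verification number of the card already on the family account.

    Returns the next screen to display (hypothetical screen names).
    """
    if entered_cvv == actual_cvv:
        # Verified: proceed to collect the child's identification info.
        return "child_identification_form"
    # Mismatch: abort, and fall back to the family member list,
    # which does not yet include the child family member.
    return "family_member_list"

# Usage:
# begin_child_account_creation("123", "123")  -> "child_identification_form"
# begin_child_account_creation("999", "123")  -> "family_member_list"
```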
In another aspect, the method includes the electronic device displaying a control for receiving an input that selectively enables the online content storage system to transmit to an adult family member associated with the family account, prior to a purchase of content by the added family member, a message indicating the content to be purchased by the added family member and indicating that the adult family member has the option to approve the purchase before the purchase is processed and the content is downloaded to a computing device of the added family member.
In another aspect the method includes the electronic device displaying an indication that the added family member is automatically associated with a shared family calendar service associated with the family account, or displaying an indication that the added family member is automatically associated with a group messaging service associated with the family account, or displaying an indication that the added family member is automatically associated with a device location service associated with the family account.
In another aspect the method includes the electronic device displaying an input field for receiving an identifier of the family member to be added to a family account at an online content storage system that is shared between family members, and receiving the identifier of the family member to be added to the family account. The electronic device determines an age of the family member to be added to the family account. Responsive to the age of the family member being less than a predetermined age limit, the electronic device selects one of a plurality of content access and purchase restriction templates to be associated with the family member, and displays an indication of the selected content access and purchase restriction template. The selected content access and purchase restriction template is applied to the account of the added family member.
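The template selection step above can be sketched as a lookup keyed by country and age band. The table contents, the age bands, and the age limit of 13 are all illustrative assumptions; the text specifies only that a template is selected when the member's age is below a predetermined limit:

```python
AGE_LIMIT = 13  # hypothetical predetermined age limit

# Hypothetical template table keyed by (country, age band); the actual
# template structure is not specified in the embodiments above.
RESTRICTION_TEMPLATES = {
    ("US", "under_9"): {"explicit_music": False, "movie_rating_max": "G"},
    ("US", "9_to_12"): {"explicit_music": False, "movie_rating_max": "PG"},
}

def select_restriction_template(age: int, country: str):
    """Pick a content access and purchase restriction template from the
    family member's age and country; members at or above the age limit
    get no template."""
    if age >= AGE_LIMIT:
        return None
    band = "9_to_12" if age >= 9 else "under_9"
    return RESTRICTION_TEMPLATES.get((country, band))
```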
A real time purchase approval process enables an adult family member to approve a purchase of content being made by a child family member from a content server accessible to the child family member. In this aspect, a content storage system (and associated content servers) receives, from the computing device of the child family member, a selection of a content item to be purchased by the child family member. The content storage system determines from the account of the child family member whether a real time approval process is required. If so, the content storage system notifies the child family member, via the computing device, that approval of an adult family member is required prior to the purchase, and provides options for the child family member to request approval of the purchase or decline. If the child family member requests approval of the purchase, then in one embodiment the content storage system transmits to a computing device of at least one adult family member a message informing the adult family member that their approval is required for the child family member to purchase the content, and providing options for the adult family member to review the request or defer the request until a later time. If the adult family member defers the request, the content storage system sets a timer and repeats the approval request at a later time. If the adult family member reviews the request, the content storage system provides an indication of the content being purchased, and options to either approve or decline the purchase. If the adult family member declines the purchase, then the purchase is terminated and the content is not made available to the child family member. If the adult family member approves the purchase, then the content storage system proceeds to process the transaction and download the content to the child family member's computing device.
In one embodiment, the child family member's computing device displays a message providing the child family member options for having an adult family member approve the purchase request in person. If the child family member selects the option to have an adult family member approve in person, then the computing device displays a screen into which the adult family member can enter their account password, or other authentication information (e.g., credit card verification code). Once the adult family member is authenticated then the purchase is processed and the content is downloaded to the child family member's computing device.
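The in-person path can be sketched as a local credential check. The plain-text comparison below is purely illustrative; the embodiment itself would authenticate the entered password or card verification code against the accounts server rather than compare stored strings on the device:

```python
def approve_in_person(entered_secret: str,
                      adult_password: str,
                      card_cvv: str) -> bool:
    """In-person approval: the adult enters either their account password
    or other authentication information such as the card verification
    code on the child's device. Returns True if the purchase may proceed."""
    return entered_secret in (adult_password, card_cvv)
```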
In one aspect, a method comprises, at a content storage system associated with at least one content server from which content can be purchased, storing a family account at the content storage system, the family account having at least one adult family member and at least one child family member, the family account associated with a purchase account. The content storage system receives from a computing device of a child family member a request to purchase a content item from the content server associated with the content storage system. The content storage system determines from the family account whether the purchase requires real time approval by an adult family member associated with the family account. In response to determining that the purchase requires real time approval, and prior to processing the purchase and downloading the content item to the computing device of the child family member, the content storage system transmits to a computing device of the at least one adult family member a message indicating that approval of the purchase by the child family member is required. In response to receiving an approval message from the computing device of the adult family member, the content storage system processes the purchase transaction using the purchase account, and transmits the purchased content item to the computing device of the child family member. In response to receiving a decline message from the computing device of the adult family member, the content storage system terminates the purchase.
Note that the various embodiments of the family sharing account described above can be combined with any other embodiments described herein. The features and advantages described in the specification are not all inclusive and, in particular, many additional features and advantages will be apparent to one of ordinary skill in the art in view of the drawings, specification, and claims. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter.
The figures depict, and the detailed description describes, various non-limiting embodiments for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles described herein.
Overview of System Architecture
An online content storage system enables users to set up family accounts that include accounts accessible by children of the family.
The content storage server 201 enables users to upload, search, browse and share content with others, including but not limited to other family members. Content handled by the content storage server 201 includes any type of files, including documents, presentations, computer data, spreadsheets, images, movies, music, applications, and so forth. Embodiments of a content storage server suitable for use as the content storage server 201 are disclosed in US20130311597, which is incorporated by reference herein in its entirety.
The media server 202 provides an online, digital media store through which users of computing devices 100 can search, browse and obtain various types of media, including music, videos, movies, and television programs. The apps server 204 provides an online, digital media store through which users can search, browse and obtain executable computer applications and computer data for their computing devices 100. The e-book server 206 provides an online, digital media store through which users can search, browse and obtain electronic books, magazines, and journals for reading on their computing devices 100. Embodiments of various servers suitable for use as the media server 202, apps server 204, and e-book server 206 are described in U.S. Pat. Nos. 7,895,661 and 7,899,714, which are incorporated by reference in their entirety, herein. Any of the foregoing types of digital content can be free, rented, trial, or purchased. In one embodiment, transactions from these various servers are handled by the purchase server 210, and the subsequent downloading of content is handled by the download server 212.
The family calendar server 214 provides support for hosted calendars that can be shared among family members. Embodiments of a calendar server suitable for use as the family calendar server 214 are disclosed in U.S. Pat. Nos. 7,822,713 and 7,814,055, which are incorporated by reference in their entirety, herein.
The group messaging server 216 provides a platform for instant messaging between users, and in the embodiments described here, is beneficially used to provide instant messaging services between family members associated with a family account. Embodiments of a messaging server suitable for use as the group messaging server 216 are disclosed in U.S. Pat. Nos. 8,468,580, 8,352,873, 8,554,861, 8,020,105 and US20110055735, each of which is incorporated by reference in its entirety, herein.
The user accounts server 208 stores user profile data (e.g., name, address, account ID, birthdate, access preferences and restrictions) and transactional information (e.g., purchase history, search history, etc.) for each user. The accounts server 208 handles updating of user account information, as well as verification and authentication of user credentials such as user name, user email address, and user passwords.
The computing device 100 shown in
Memory 102 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 102 by other components of the device 100, such as the CPU 120 and the peripherals interface 118, may be controlled by the memory controller 122.
The peripherals interface 118 couples the input and output peripherals of the device to the CPU 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for the device 100 and to process data.
In some embodiments, the peripherals interface 118, the CPU 120, and the memory controller 122 may be implemented on a single chip, such as a chip 104. In some other embodiments, they may be implemented on separate chips.
The RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals. The RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. The RF circuitry 108 may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. The RF circuitry 108 may communicate with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. The wireless communication may use any of a plurality of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for email (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), and/or Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS)), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
The audio circuitry 110, the speaker 111, and the microphone 113 provide an audio interface between a user and the device 100. The audio circuitry 110 receives audio data from the peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to the speaker 111. The speaker 111 converts the electrical signal to human-audible sound waves. The audio circuitry 110 also receives electrical signals converted by the microphone 113 from sound waves. The audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to the peripherals interface 118 for processing. Audio data may be retrieved from and/or transmitted to memory 102 and/or the RF circuitry 108 by the peripherals interface 118. In some embodiments, the audio circuitry 110 also includes a headset jack (e.g. 212,
The I/O subsystem 106 couples input/output peripherals on the device 100, such as the touch screen 112 and other input/control devices 116, to the peripherals interface 118. The I/O subsystem 106 may include a display controller 156 and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/send electrical signals from/to other input or control devices 116. The other input/control devices 116 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) 160 may be coupled to any (or none) of the following: a keyboard, infrared port, USB port, and a pointer device such as a mouse. The one or more buttons may include an up/down button for volume control of the speaker 111 and/or the microphone 113. The one or more buttons may include a push button. A quick press of the push button may disengage a lock of the touch screen 112 or begin a process that uses gestures on the touch screen to unlock the device, as described in U.S. Pat. No. 7,657,849, which is hereby incorporated by reference. A longer press of the push button may turn power to the device 100 on or off. The user may be able to customize a functionality of one or more of the buttons. The touch screen 112 is used to implement virtual or soft buttons and one or more soft keyboards.
The touch-sensitive touch screen 112 provides an input interface and an output interface between the device and a user. The display controller 156 receives and/or sends electrical signals from/to the touch screen 112. The touch screen 112 displays visual output to the user. The visual output may include graphics, text, icons, video, and any combination thereof (collectively termed “graphics”). In some embodiments, some or all of the visual output may correspond to user-interface objects, further details of which are described below.
A touch screen 112 has a touch-sensitive surface, sensor or set of sensors that accepts input from the user based on haptic and/or tactile contact. The touch screen 112 and the display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on the touch screen 112 and converts the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages or images) that are displayed on the touch screen. In an exemplary embodiment, a point of contact between a touch screen 112 and the user corresponds to a finger of the user.
The touch screen 112 may use LCD (liquid crystal display) technology, or LPD (light emitting polymer display) technology, although other display technologies may be used in other embodiments. The touch screen 112 and the display controller 156 may detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with a touch screen 112.
The display controller 156 determines movement of a contact based on speed, velocity, pressure, and/or acceleration of the contact point, and tracks the movement across the touch screen 112. The display controller 156 determines if the contact has stopped, such as by detecting a finger-up event or a break in contact with the touch screen 112. These operations may be applied to a single contact (e.g., one finger touch) or to multiple simultaneous contacts.
The display controller 156 detects a gesture input by a user on the computing device 100. Different gestures have different touch patterns. A touch pattern is characterized by one or more contact points and their associated movements, from which the spatial or geometrical relationships between the contact points can be determined. The display controller 156 detects a gesture based on a particular touch pattern on the display screen. For example, the display controller 156 detects a finger tap gesture by detecting a finger-down event indicating an initial contact at a position on the touch screen 112, followed by detecting a finger-up event at substantially the same position, where the finger is no longer touching the touch screen. In another example, the display controller 156 detects a finger swipe gesture on the touch screen based on detecting a finger-down event (appearance of a contact point), followed by detecting one or more finger-dragging events where the user drags his or her finger from the position associated with the finger-down event to another position on the touch screen 112 (movement of a contact point), and subsequently followed by a finger-up event (disappearance of the contact point).
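The tap and swipe examples above can be sketched as a small classifier. The event-tuple model and the movement threshold are illustrative assumptions; the embodiments do not specify how "substantially the same position" is quantified:

```python
def classify_gesture(events, move_threshold: float = 10.0) -> str:
    """Classify a touch event sequence as a "tap" or a "swipe".

    events is a list of (kind, x, y) tuples with kind in
    {"down", "drag", "up"} -- a simplified model of the finger-down,
    finger-dragging, and finger-up events described above.
    """
    down = next(e for e in events if e[0] == "down")
    up = next(e for e in events if e[0] == "up")
    dx, dy = up[1] - down[1], up[2] - down[2]
    distance = (dx * dx + dy * dy) ** 0.5
    # A tap ends at substantially the same position as it began;
    # a swipe includes dragging the contact to another position.
    if distance < move_threshold and not any(e[0] == "drag" for e in events):
        return "tap"
    return "swipe"
```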
A touch-sensitive display in some embodiments of the touch screen 112 may be analogous to the multi-touch sensitive tablets described in U.S. Pat. Nos. 6,323,846, 6,570,557, and US20020015024, each of which is hereby incorporated by reference herein, in its entirety. However, a touch screen 112 displays visual output from the portable device 100, whereas touch sensitive tablets do not provide visual output.
A touch-sensitive display in some embodiments of the touch screen 112 may be as described in U.S. Pat. Nos. 8,279,180, 7,663,607, 8,239,784, 7,614,008, and 7,844,914, and in US20060026521, US20060033724, and US20060197753, each of which is incorporated by reference in its entirety, herein.
The user may make contact with the touch screen 112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which are much less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user.
In some embodiments, in addition to the touch screen 112, the computing device 100 may include a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad may be a touch-sensitive surface that is separate from the touch screen 112 or an extension of the touch-sensitive surface formed by the touch screen.
In some embodiments, the computing device 100 may include a physical or virtual click wheel as an input control device 116. A user may navigate among and interact with one or more graphical objects (henceforth referred to as icons) displayed in the touch screen 112 by rotating the click wheel or by moving a point of contact with the click wheel (e.g., where the amount of movement of the point of contact is measured by its angular displacement with respect to a center point of the click wheel). The click wheel may also be used to select one or more of the displayed icons. For example, the user may press down on at least a portion of the click wheel or an associated button. User commands and navigation commands provided by the user via the click wheel may be processed by an input controller 160 as well as one or more of the modules and/or sets of instructions in memory 102. For a virtual click wheel, the click wheel and click wheel controller may be part of the touch screen 112 and the display controller 156, respectively. For a virtual click wheel, the click wheel may be either an opaque or semitransparent object that appears and disappears on the touch screen display in response to user interaction with the device. In some embodiments, a virtual click wheel is displayed on the touch screen of a portable multifunction device and operated by user contact with the touch screen.
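The angular-displacement measurement described above can be sketched as follows; the function name and the normalization convention are assumptions for illustration:

```python
import math

def angular_displacement(center, p0, p1) -> float:
    """Angular movement (in radians) of a contact point around the click
    wheel's center point, from position p0 to position p1 -- the quantity
    by which movement on the click wheel is measured above.
    """
    a0 = math.atan2(p0[1] - center[1], p0[0] - center[0])
    a1 = math.atan2(p1[1] - center[1], p1[0] - center[0])
    delta = a1 - a0
    # Normalize to (-pi, pi] so a small rotation across the +/-pi
    # boundary is not reported as a near-full turn.
    while delta <= -math.pi:
        delta += 2 * math.pi
    while delta > math.pi:
        delta -= 2 * math.pi
    return delta
```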
The computing device 100 also includes a power system 162 for powering the various components. The power system 162 may include a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.
The computing device 100 may also include one or more optical sensors 164.
The computing device 100 may also include one or more proximity sensors 166.
The computing device 100 may also include one or more accelerometers 168.
In some embodiments, the software components stored in memory 102 may include an operating system 126, a communication module (or set of instructions) 128, a contact/motion module (or set of instructions) 130, a graphics module (or set of instructions) 132, a text input module (or set of instructions) 134, a Global Positioning System (GPS) module (or set of instructions) 135, and applications (or set of instructions) 136.
The operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
The communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by the RF circuitry 108 and/or the external port 124. The external port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with the 30-pin connector used on iPod (trademark of Apple Computer, Inc.) devices.
The contact/motion module 130 may detect contact with the touch screen 112 (in conjunction with the display controller 156) and other touch sensitive devices (e.g., a touchpad or physical click wheel). The contact/motion module 130 includes various software components for performing various operations related to detection of contact, such as determining if contact has occurred, determining if there is movement of the contact and tracking the movement across the touch screen 112, and determining if the contact has been broken (i.e., if the contact has ceased). Determining movement of the point of contact may include determining speed (magnitude), velocity (magnitude and direction), and/or acceleration (a change in magnitude and/or direction) of the point of contact. These operations may be applied to single contacts (e.g., one finger contacts) or to multiple simultaneous contacts (e.g., "multitouch"/multiple finger contacts). In some embodiments, the contact/motion module 130 and the display controller 156 also detect contact on a touchpad. In some embodiments, the contact/motion module 130 and the controller 160 detect contact on a click wheel.
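The distinction drawn above between speed (magnitude) and velocity (magnitude and direction) can be illustrated with a minimal two-sample calculation; the function name and the sampling-time parameters are assumptions:

```python
def contact_motion(p0, p1, t0: float, t1: float):
    """Compute the speed (a scalar magnitude) and the velocity (a vector
    with magnitude and direction) of a moving contact point between two
    samples: positions p0 and p1 taken at times t0 and t1.
    """
    dt = t1 - t0
    vx = (p1[0] - p0[0]) / dt
    vy = (p1[1] - p0[1]) / dt
    speed = (vx * vx + vy * vy) ** 0.5
    return speed, (vx, vy)
```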
The graphics module 132 includes various known software components for rendering and displaying graphics on the touch screen 112, including components for changing the intensity of graphics that are displayed. As used herein, the term “graphics” includes any object that can be displayed to a user, including without limitation text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations and the like.
The text input module 134, which may be a component of graphics module 132, provides soft keyboards for entering text in various applications (e.g., contacts 137, e-mail 140, IM 141, blogging 142, browser 147, and any other application that needs text input).
The GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to telephone 138 for use in location-based dialing, to camera 143 and/or blogger 142 as picture/video metadata, and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets).
The applications 136 may include the following modules (or sets of instructions), or a subset or superset thereof:
Examples of other applications 136 that may be stored in memory 102 include a video chat application, a messaging application, other word processing applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
Each of the above identified modules and applications corresponds to a set of instructions for performing one or more functions described above. These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules may be combined or otherwise re-arranged in various embodiments. For example, video player module 145 may be combined with music player module 146 into a single module. In some embodiments, memory 102 may store a subset of the modules and data structures identified above. Furthermore, memory 102 may store additional modules and data structures not described above.
In some embodiments, the computing device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through a touch screen 112 and/or a touchpad. By using a touch screen and/or a touchpad as the primary input/control device for operation of the computing device 100, the number of physical input/control devices (such as push buttons, dials, and the like) on the device 100 may be reduced.
The predefined set of functions that may be performed exclusively through a touch screen and/or a touchpad include navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates the computing device 100 to a main, home, or root menu from any user interface that may be displayed on the device 100. In such embodiments, the touchpad may be referred to as a “menu button.” In some other embodiments, the menu button may be a physical push button or other physical input/control device instead of a touchpad.
Event sorter 170 receives event information and determines the application 136-1 and application view 191 of application 136-1 to which to deliver the event information. Event sorter 170 includes event monitor 171 and event dispatcher module 174. In some embodiments, application 136-1 includes application internal state 192, which indicates the current application view(s) displayed on touch sensitive display 112 when the application is active or executing. In some embodiments, device/global internal state 157 is used by event sorter 170 to determine which application(s) is (are) currently active, and application internal state 192 is used by event sorter 170 to determine application views 191 to which to deliver event information.
In some embodiments, application internal state 192 includes additional information, such as one or more of: resume information to be used when application 136-1 resumes execution, user interface state information that indicates information being displayed or that is ready for display by application 136-1, a state queue for enabling the user to go back to a prior state or view of application 136-1, and a redo/undo queue of previous actions taken by the user.
Event monitor 171 receives event information from peripherals interface 118. Event information includes information about a sub-event (e.g., a user touch on touch-sensitive display 112, as part of a multi-touch gesture). Peripherals interface 118 transmits information it receives from I/O subsystem 106 or a sensor, such as proximity sensor 166, accelerometer(s) 168, and/or microphone 113 (through audio circuitry 110). Information that peripherals interface 118 receives from I/O subsystem 106 includes information from touch-sensitive display 112 or a touch-sensitive surface.
In some embodiments, event monitor 171 sends requests to the peripherals interface 118 at predetermined intervals. In response, peripherals interface 118 transmits event information. In other embodiments, peripherals interface 118 transmits event information only when there is a significant event (e.g., receiving an input above a predetermined noise threshold and/or for more than a predetermined duration).
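The "significant event" filtering described above can be illustrated with a minimal predicate (the threshold values below are assumed for illustration; the actual predetermined values are implementation-specific):

```python
# Assumed thresholds; the actual predetermined values would be tuned per device.
NOISE_THRESHOLD = 0.05   # minimum input magnitude
MIN_DURATION = 0.10      # seconds

def is_significant(magnitude, duration):
    """Transmit event information only for inputs above the predetermined
    noise threshold that persist for more than the predetermined duration."""
    return magnitude > NOISE_THRESHOLD and duration > MIN_DURATION
```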
In some embodiments, event sorter 170 also includes a hit view determination module 172 and/or an active event recognizer determination module 173.
Hit view determination module 172 provides software procedures for determining where a sub-event has taken place within one or more views, when touch sensitive display 112 displays more than one view. Views are made up of controls and other elements that a user can see on the display.
Another aspect of the user interface associated with an application is a set of views, sometimes herein called application views or user interface windows, in which information is displayed and touch-based gestures occur. The application views (of a respective application) in which a touch is detected may correspond to programmatic levels within a programmatic or view hierarchy of the application. For example, the lowest level view in which a touch is detected may be called the hit view, and the set of events that are recognized as proper inputs may be determined based, at least in part, on the hit view of the initial touch that begins a touch-based gesture.
Hit view determination module 172 receives information related to sub-events of a touch-based gesture. When an application has multiple views organized in a hierarchy, hit view determination module 172 identifies a hit view as the lowest view in the hierarchy which should handle the sub-event. In most circumstances, the hit view is the lowest level view in which an initiating sub-event occurs (i.e., the first sub-event in the sequence of sub-events that form an event or potential event). Once the hit view is identified by the hit view determination module, the hit view typically receives all sub-events related to the same touch or input source for which it was identified as the hit view.
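The hit view determination described above amounts to a recursive search for the lowest view in the hierarchy whose area contains the initiating sub-event's location. A minimal sketch (the View class and frame layout are illustrative assumptions):

```python
class View:
    def __init__(self, frame, children=()):
        self.frame = frame                # (x, y, width, height)
        self.children = list(children)    # subviews, lower in the hierarchy

    def contains(self, x, y):
        fx, fy, fw, fh = self.frame
        return fx <= x < fx + fw and fy <= y < fy + fh

def hit_view(view, x, y):
    """Return the lowest view in the hierarchy whose area contains the
    touch location -- the hit view -- or None if the touch is outside."""
    if not view.contains(x, y):
        return None
    for child in view.children:
        found = hit_view(child, x, y)
        if found is not None:
            return found                  # a deeper (lower) view is the hit view
    return view                           # no child contains the touch
```

Once identified, this hit view would then receive all sub-events related to the same touch, per the paragraph above.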
Active event recognizer determination module 173 determines which view or views within a view hierarchy should receive a particular sequence of sub-events. In some embodiments, active event recognizer determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, active event recognizer determination module 173 determines that all views that include the physical location of a sub-event are actively involved views, and therefore determines that all actively involved views should receive a particular sequence of sub-events. In other embodiments, even if touch sub-events were entirely confined to the area associated with one particular view, views higher in the hierarchy would still remain as actively involved views.
Event dispatcher module 174 dispatches the event information to one or more event recognizers (e.g., event recognizer 180). In some embodiments, event dispatcher module 174 dispatches the event information to one or more event recognizers in a single application. In other embodiments, event dispatcher module 174 dispatches the event information to event recognizers in a plurality of applications 136, or the operating system 126 and at least one application. In embodiments including active event recognizer determination module 173, event dispatcher module 174 delivers the event information to one or more event recognizers including an event recognizer determined by active event recognizer determination module 173. In some embodiments, event dispatcher module 174 stores in an event queue the event information, which is retrieved by a respective event receiver module 182.
In some embodiments, operating system 126 includes event sorter 170. Alternatively, application 136-1 includes event sorter 170. In yet other embodiments, event sorter 170 is a stand-alone module, or a part of another module stored in memory 102, such as contact/motion module 130.
In some embodiments, respective applications (e.g., 136-1 and 136-2) include a plurality of event handlers 190 and one or more application views 191, each of which includes instructions for handling touch events that occur within a respective view of the application's user interface. Each application view 191 of the application 136-1 includes one or more event recognizers 180. Typically, a respective application view 191 includes a plurality of event recognizers 180. In some embodiments, the operating system 126 includes one or more event recognizers 180. In some embodiments, the event sorter 170 includes one or more event recognizers 180. In other embodiments, one or more of event recognizers 180 are part of a separate module, such as a user interface kit (not shown) or a higher level object from which application 136-1 inherits methods and other properties. In some embodiments, a respective event handler 190 includes one or more of: data updater 176, object updater 177, GUI updater 178, and/or event data 179 received from event sorter 170. Event handler 190 may utilize or call data updater 176, object updater 177 or GUI updater 178 to update the application internal state 192. Alternatively, one or more of the application views 191 include one or more respective event handlers 190. Also, in some embodiments, one or more of data updater 176, object updater 177, and GUI updater 178 are included in a respective application view 191.
A respective event recognizer 180 receives event information (e.g., event data 179) from event sorter 170, and identifies an event from the event information. Event recognizer 180 includes event receiver 182 and event comparator 184. In some embodiments, event recognizer 180 also includes at least a subset of: metadata 183, and event delivery instructions 188 (which may include sub-event delivery instructions).
Event receiver 182 receives event information from event sorter 170. The event information includes information about a sub-event, for example, a touch or a touch movement. Depending on the sub-event, the event information also includes additional information, such as location of the sub-event. When the sub-event concerns motion of a touch, the event information may also include speed and direction of the sub-event. In some embodiments, events include rotation of the device from one orientation to another (e.g., from a portrait orientation to a landscape orientation, or vice versa), and the event information includes corresponding information about the current orientation (also called device attitude) of the device.
Event comparator 184 compares the event information to predefined event or sub-event definitions and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event. In some embodiments, event comparator 184 includes event definitions 186. Event definitions 186 contain definitions of events (e.g., predefined sequences of sub-events), for example, event 1 (187-1), event 2 (187-2), and others. In some embodiments, sub-events in an event 187 include, for example, touch begin, touch end, touch movement, touch cancellation, and multiple touching. In one example, the definition for event 1 (187-1) is a double tap on a displayed object. The double tap, for example, comprises a first touch (touch begin) on the displayed object for a predetermined phase, a first lift-off (touch end) for a predetermined phase, a second touch (touch begin) on the displayed object for a predetermined phase, and a second lift-off (touch end) for a predetermined phase. In another example, the definition for event 2 (187-2) is a dragging on a displayed object. The dragging, for example, comprises a touch (or contact) on the displayed object for a predetermined phase, a movement of the touch across touch-sensitive display 112, and lift-off of the touch (touch end). In some embodiments, the event also includes information for one or more associated event handlers 190.
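The comparison of a recorded sub-event sequence against event definitions 186 can be sketched as a simple lookup (the sub-event names and definitions below are illustrative, not the source's actual data structures):

```python
# Illustrative event definitions as predefined sequences of sub-events.
EVENT_DEFINITIONS = {
    "double_tap": ["touch_begin", "touch_end", "touch_begin", "touch_end"],
    "drag": ["touch_begin", "touch_move", "touch_end"],
}

def match_event(sub_events):
    """Compare the observed sub-event sequence against each predefined
    event definition and return the name of the matching event, if any."""
    for name, definition in EVENT_DEFINITIONS.items():
        if sub_events == definition:
            return name
    return None
```

A fuller implementation would also enforce the predetermined phase durations mentioned above; this sketch matches on sequence alone.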
In some embodiments, event definition 187 includes a definition of an event for a respective user-interface object. In some embodiments, event comparator 184 performs a hit test to determine which user-interface object is associated with a sub-event. For example, in an application view in which three user-interface objects are displayed on touch-sensitive display 112, when a touch is detected on touch-sensitive display 112, event comparator 184 performs a hit test to determine which of the three user-interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190, the event comparator uses the result of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects an event handler associated with the sub-event and the object triggering the hit test.
In some embodiments, the definition for a respective event 187 also includes delayed actions that delay delivery of the event information until after it has been determined whether the sequence of sub-events does or does not correspond to the event recognizer's event type.
When a respective event recognizer 180 determines that the series of sub-events do not match any of the events in event definitions 186, the respective event recognizer 180 enters an event impossible, event failed, or event ended state, after which it disregards subsequent sub-events of the touch-based gesture. In this situation, other event recognizers, if any, that remain active continue to track and process sub-events of an ongoing touch-based gesture.
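The recognizer behavior described above -- tracking sub-events until the sequence either completes or fails, and disregarding subsequent sub-events once failed -- is essentially a small state machine. A hedged sketch (state names abbreviated from the source's event possible/recognized/failed states):

```python
class EventRecognizer:
    """Minimal recognizer state machine: feeds sub-events against a
    definition; once failed, subsequent sub-events are disregarded."""

    def __init__(self, definition):
        self.definition = definition
        self.index = 0
        self.state = "possible"   # possible -> recognized | failed

    def feed(self, sub_event):
        if self.state != "possible":
            return self.state     # failed/recognized: disregard further input
        if self.definition[self.index] == sub_event:
            self.index += 1
            if self.index == len(self.definition):
                self.state = "recognized"
        else:
            self.state = "failed"
        return self.state
```

As the paragraph notes, other recognizers running in parallel would continue to track the same touch even after this one fails.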
In some embodiments, a respective event recognizer 180 includes metadata 183 with configurable properties, flags, and/or lists that indicate how the event delivery system should perform sub-event delivery to actively involved event recognizers. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate how event recognizers may interact with one another. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate whether sub-events are delivered to varying levels in the view or programmatic hierarchy.
In some embodiments, a respective event recognizer 180 activates event handler 190 associated with an event when one or more particular sub-events of an event are recognized. In some embodiments, a respective event recognizer 180 delivers event information associated with the event to event handler 190. Activating an event handler 190 is distinct from sending (and deferred sending) sub-events to a respective hit view. In some embodiments, event recognizer 180 throws a flag associated with the recognized event, and event handler 190 associated with the flag catches the flag and performs a predefined process.
In some embodiments, event delivery instructions 188 include sub-event delivery instructions that deliver event information about a sub-event without activating an event handler. Instead, the sub-event delivery instructions deliver event information to event handlers associated with the series of sub-events or to actively involved views. Event handlers associated with the series of sub-events or with actively involved views receive the event information and perform a predetermined process.
In some embodiments, data updater 176 creates and updates data used in application 136-1. For example, data updater 176 updates the telephone number used in contacts module 137, or stores a video file used in video player module 145. In some embodiments, object updater 177 creates and updates objects used in application 136-1. For example, object updater 177 creates a new user-interface object or updates the position of a user-interface object. GUI updater 178 updates the GUI. For example, GUI updater 178 prepares display information and sends it to graphics module 132 for display on a touch-sensitive display.
In some embodiments, event handler(s) 190 includes or has access to data updater 176, object updater 177, and GUI updater 178. In some embodiments, data updater 176, object updater 177, and GUI updater 178 are included in a single module of a respective application 136-1 or application view 191. In other embodiments, they are included in two or more software modules.
It shall be understood that the foregoing discussion regarding event handling of user touches on touch-sensitive displays also applies to other forms of user inputs used to operate multifunction devices 100 with input devices, not all of which are initiated on touch screens, e.g., coordinating mouse movement and mouse button presses with or without single or multiple keyboard presses or holds; user movements such as taps, drags, scrolls, etc., on touchpads; pen stylus inputs; movement of the device; oral instructions; detected eye movements; biometric inputs; and/or any combination thereof, which may be utilized as inputs corresponding to sub-events that define an event to be recognized.
Family Account Management
The online content storage system 200 enables users to set up and manage family accounts that include accounts accessible by children of the family. To facilitate the set up process, a computing device 100 executes the family account management module 152 which displays a sequence of user interface screens configured to obtain information pertinent to the attributes of the accounts of the family members. The family account management module 152 can be locally executed on the computing device as illustrated, or can be itself hosted on the content storage system 200, e.g., as a browser accessing screens provided by the content storage system 200.
Referring again to
Referring generally to
Generally,
The method flow begins with a family sharing setup prompt 3.1, such as illustrated in screen 301 shown in
Following selection of link 303, the family sharing module 152 prompts 3.2 the user to select a specific account to use for sharing with other family members (“family account”). The user setting up the family sharing accounts is termed the “family organizer,” as shown in screen 305 in
Following selection 3.2 of an account for use as the family account, the family sharing module 152 prompts 3.3 the family organizer to select which account is to be used for purchases by other family members (“purchase account”), with a selection of links for a music account 308, an e-books account 310, an apps account 312, or another account 314, such as illustrated on screen 307 in
Each of the user's music account 308, e-books account 310, apps account 312, or other account 314 is associated with a credit card, debit card, or other payment mechanism. Accordingly, following selection 3.3 of one of the user's accounts as a purchase account, the family sharing module 152 retrieves from the user account 208 the payment information, e.g., credit card number, associated with the selected account, and displays at least a portion (e.g., the full number or the last 4 digits) of the credit card number 316 to the user, along with auxiliary information such as expiration date, such as illustrated on screen 309 in
Following selection of a purchase account (or new payment information), the family sharing module 152 provides 3.5 a family screen 321, such as illustrated in
Following selection of the link 328 for creating a new account for a child family member, the family organizer is prompted 3.6 with a notification explaining that a child account must be linked to an account of an adult family member, such as on screen 323 in
In the child information screen 327, the family organizer (or other adult family member) inputs the child's first and last names (336, 338), and date of birth 340. The family organizer can optionally select link 334 to set customized purchase and access restrictions for the child, as further described below. Selection of link 342 transmits this information to the content storage system 200, which provides it to the user accounts server 208. The accounts server 208 creates a new account for the child, links the child account to the shared family account, and returns an indication of a successful account creation to the family sharing module 152. To create an account for the child, the accounts server 208 creates a new account ID for the child and uses that account ID as an email address associated with the content storage system 200.
During (via link 342) or after the creation of the child account, the family organizer can set 3.9 specific purchase and access restrictions for a child, such as illustrated in screen 329 in
Further, the family organizer can select a control 370 to automatically update the restrictions as the child ages. Different templates of content access and purchase restrictions for media, movie ratings, and game ratings for different age ranges are stored by the user account server 208, for example a template for children ages 5-10, a template for children in the U.S. ages 11-13, a template for children ages 14-17, and a template for users over age 18. Different sets of content access and purchase restriction templates are stored for different countries (e.g., U.S., Canada, United Kingdom, Japan, China) to reflect different cultural standards and rating systems. The user account server 208 automatically updates the particular restrictions applied to the child member's account based on the age of the child. For example, each time a child member logs into the content storage system 200, the user account server 208 can determine the current age of the child from the birth date information previously entered and the current date. If the child's age in years has changed, the server 208 looks up the appropriate content restriction template for the current age and the child's country and updates the restrictions accordingly.
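The automatic, age-driven template lookup described above can be sketched as follows (the age ranges mirror the example in the text; the data layout and rating values are assumptions for illustration):

```python
from datetime import date

# Illustrative U.S. templates keyed by inclusive age range; actual templates
# are stored per country by the user account server.
US_TEMPLATES = [
    ((5, 10), {"movies": "G", "games": "E"}),
    ((11, 13), {"movies": "PG", "games": "E10+"}),
    ((14, 17), {"movies": "PG-13", "games": "T"}),
    ((18, 200), {"movies": "R", "games": "M"}),
]

def current_age(birth_date, today):
    """Age in whole years from the previously entered birth date."""
    years = today.year - birth_date.year
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1                      # birthday not yet reached this year
    return years

def restrictions_for(birth_date, today, templates=US_TEMPLATES):
    """Look up the content restriction template for the child's current age."""
    age = current_age(birth_date, today)
    for (low, high), template in templates:
        if low <= age <= high:
            return template
    return None
```

Running this lookup at each login reproduces the behavior described above: once the child's age in years changes, a different template applies.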
Referring again to
As illustrated in
Following step 3.11, the family sharing module 152 confirms 3.12 that the family organizer is allowing the child family member to purchase content from the various available content sources using the purchase account, for example using the screen 335 illustrated in
Another feature of the confirmation step 3.12, as illustrated in
Upon selection of the continue link 354, the designated selections are transmitted to the content storage system 200, which passes the selections to the account server 208 to update the account information for the family sharing account and the child member account.
The family sharing module 152 displays 3.13 an updated family screen 321, such as illustrated in
From the family screen 321, the family organizer may also select link 326 to initiate operations for adding a family member to the family account, where the family member already has an account ID with the content storage system 200. Following selection of link 326, the family sharing module 152 displays 3.14 a screen configured to receive the name or email address in field 358 of the family member to be added to the family account, such as screen 337 shown in
If the verification by the account server 208 is successful, the family sharing module 152 is notified and updates 3.15 the display of the name in field 358 on screen 337 in a visually distinguished manner (e.g., change of color, font), as illustrated in
More specifically, the link 360 is beneficially selected by the family organizer when the identified family member is present in the same location as the family organizer, and as before, allows the family organizer to provide the computing device 100A to the family member to enter their password directly, as illustrated for example by screen 339 in
As part of the authentication check by the user account server 208, the server 208 determines from the user's account whether the invited family member is a teenager, e.g., has an age in a predetermined age range, such as ages 13-18 (which range can be varied on a country-by-country basis by a system administrator). If the invited family member is in the teenage age range, then the family sharing module 152 displays 3.12 the confirmation screen 335, such as illustrated in
Referring back to step 3.15 and
As illustrated in
Referring to
When a family member joins a family account, the family member is automatically given access to a shared calendar associated with the family account. More specifically, upon creation of a family account, an online calendar of the family is instantiated and associated with the family account, and with each user account ID associated therewith. Access privileges are predefined by age group. For example, access privileges for each adult family member include the ability to create, edit, and delete calendar entries, as well as invite others, including family members and non-family members, to calendared events. Child family members have restricted access privileges, so that they can see but not edit or delete calendar entries created by other family members. Access privileges for teen family members can be limited to creating personal calendar entries only, and seeing but not editing calendar entries of other family members. The family organizer can modify the access privileges for teen and child family members. Management of these access privileges, as well as updating and distribution of calendar events between family members, is handled by the family calendar server 216. More specifically, when a user accesses the calendar module 148 on their respective computing device 100, the module 148 passes the user's credentials (e.g., user account ID, password) to the family calendar server 216; the server 216 uses the account ID to identify the family calendar associated with the user account ID. Upon identification of the appropriate family calendar, the calendar server 216 transmits to the calendar module 148 current calendar information and access restrictions (if any), which calendar can then be displayed to the user on the user's computing device 100.
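The predefined, per-age-group calendar access privileges can be represented as a simple table (the privilege names below are illustrative, not from the source):

```python
# Default calendar privileges by family-member age group; the family
# organizer can modify the teen and child entries.
CALENDAR_PRIVILEGES = {
    "adult": {"create", "edit", "delete", "invite", "view"},
    "teen": {"create_personal", "view"},
    "child": {"view"},
}

def can(member_group, action):
    """Return True if the member's age group permits the calendar action."""
    return action in CALENDAR_PRIVILEGES.get(member_group, set())
```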
Another service to which family members are automatically given access upon being added to a family account is a group messaging service, by which family members can readily use the instant messaging module 141 to communicate with each other. More specifically, in one embodiment, the group messaging server 216 maintains for each family account a list of family members. When a family organizer adds a new family member to the family account, the family sharing module 152 notifies the group messaging server 216 of the account ID and user name of the added member, and the group messaging server 216 updates the list of family members for the family account. When the newly added family member accesses the instant messaging module 141 on their computing device 100C, the module obtains from the group messaging server 216 the current list of family members for the family account. These family members are then added by the instant messaging module 141 to the family member's list of preselected users (e.g., “favorites” or “buddies” or “friends”) for messaging, which, for example, can place the names of these family members at the top of the list for ease of access by the family member. This automatic configuration of the instant messaging module 141 on a user's computing device 100 saves the family member time by eliminating the need to manually configure the instant messaging module 141 to include each family member, and thereby has the beneficial side effect of saving battery life on the device 100.
Another service to which family members are automatically given access upon being added to a family account is a device location service, provided by the device location server 218. As noted above, a computing device 100 preferably includes a GPS module 135 that is enabled to determine the geographic location of the device 100. Referring to
Referring to
Referring to
The system 200 prompts 608 the child family member to authenticate themselves again, using their account ID, a fingerprint (if the computing device 100 supports a fingerprint scanner), or other identification information. The child family member provides an authentication response 610. The system 200 receives and verifies 612 the authentication response using the user account server 208.
The system 200 determines 614 whether a family account is associated with the account ID of the child family member, and further determines 616 whether a real-time purchase approval process is enabled (as in
If the real-time purchase approval process is enabled, then the system 200 transmits 618 a notification to the child family member's computing device 100B indicating the requirement for the purchase approval. The child family member's computing device 100B displays 619 a screen or window indicating that an adult family member is required to approve the purchase of the requested item and prompting the child family member to indicate via a link 704 whether to request such approval, for example from the family organizer or another designated adult family member, or to decline. An example screen 703 with an overlay 708 is illustrated in
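The branching performed at steps 614-616 and 618 can be sketched as a small dispatch function (the field names and step labels are illustrative assumptions):

```python
def next_purchase_step(account, approval_enabled):
    """Decide how a verified purchase request proceeds: no family account
    means an ordinary purchase; otherwise real-time approval may be required."""
    if account.get("family_account") is None:
        return "process_normally"              # step 614: no family account
    if approval_enabled:
        return "request_adult_approval"        # steps 616-618: notify adult
    return "purchase_with_family_account"      # family purchase account used
```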
In one embodiment, when a child family member requests to purchase 606, the system 200 determines via the content storage server 201 if the item has already been downloaded by another family member, and included in the family account. If so, then the child family member is notified that the content is already available in the family account, for example as illustrated in
Referring to
The family sharing module 152 on each adult family member's computing device 100 receives this message and displays 626 a screen or a banner indicating that the adult family member's approval is required, such as screen 707 illustrated in
Following selection of link 712, in one embodiment, the system 200 transmits to the adult family member's computing device 100 the details of the content item that is to be purchased, which information is displayed 630 by the computing device, so that the adult family member has sufficient information about the content item, for example as illustrated by screen 709 in
Following selection of approve link 718, the adult family member's computing device 100 transmits an approval message back to the system 200. In one optional embodiment, following selection of approve link 718, the child family member's computing device 100 displays a screen with an input field to receive the credit card authorization code for the purchase account to validate that the user is in fact the adult family member; alternatively, the input may be the user's account password, or other credentials not known to the child family member. The system receives the approval message (and optional authorization information), and using the purchase server 210, proceeds to process 632 the transaction, including authorization of payment using the purchase account associated with the family account. Upon authorization of payment, the content item is downloaded 634 by the download server 212 to the user's computing device 100B. Optionally, the user's computing device 100B receives a message indicating the purchase has been approved, such as illustrated in
In another embodiment, following selection of Yes link 706, the system 200 transmits to the user's computing device 100 a message indicating that the child family member can request permission for the purchase in person, for example by selecting link 722 in banner 724 in
The disclosure herein has been described in particular detail with respect to one possible embodiment. Those of skill in the art will appreciate that other embodiments may be practiced. First, the particular naming of the components and variables, capitalization of terms, the attributes, data structures, or any other programming or structural aspect is not mandatory or significant, and the mechanisms that implement the invention or its features may have different names, formats, or protocols. Also, the particular division of functionality between the various system components described herein is merely exemplary, and not mandatory; functions performed by a single system component may instead be performed by multiple components, and functions performed by multiple components may instead be performed by a single component.
Some portions of the above description present features in terms of algorithms and symbolic representations of operations on information. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. These operations, while described functionally or logically, are understood to be implemented by computer programs. Furthermore, it has also proven convenient at times to refer to these arrangements of operations as modules or by functional names, without loss of generality.
Unless specifically stated otherwise as apparent from the above discussion, it is appreciated that throughout the description, discussions utilizing terms such as “determining” or “displaying” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system memories or registers or other such information storage, transmission or display devices.
Certain aspects of the embodiments disclosed herein include process steps and instructions described herein in the form of an algorithm. It should be noted that the process steps and instructions could be embodied in software, firmware or hardware, and when embodied in software, could be downloaded to reside on and be operated from different platforms used by real time network operating systems.
The algorithms and operations presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may also be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will be apparent to those of skill in the art, along with equivalent variations. In addition, the present invention is not described with reference to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any references to specific languages are provided for purposes of enablement and disclosure of the best mode of the present invention.
The embodiments disclosed herein are well suited to a wide variety of computer network systems over numerous topologies. Within this field, the configuration and management of large networks includes storage devices and computers that are communicatively coupled to dissimilar computers and storage devices over a network, such as the Internet.
Finally, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, the disclosure herein is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
This application claims the benefit of U.S. Provisional Application No. 62/005,621 filed on May 30, 2014, which is incorporated by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5483261 | Yasutake | Jan 1996 | A |
5488204 | Mead et al. | Jan 1996 | A |
5825352 | Bisset et al. | Oct 1998 | A |
5835079 | Shieh | Nov 1998 | A |
5880411 | Gillespie et al. | Mar 1999 | A |
6108735 | Pawlowski | Aug 2000 | A |
6188391 | Seely et al. | Feb 2001 | B1 |
6230204 | Fleming, III | May 2001 | B1 |
6310610 | Beaton et al. | Oct 2001 | B1 |
6323846 | Westerman et al. | Nov 2001 | B1 |
6570557 | Westerman et al. | May 2003 | B1 |
6678824 | Cannon et al. | Jan 2004 | B1 |
6690387 | Zimmerman et al. | Feb 2004 | B2 |
6785889 | Williams | Aug 2004 | B1 |
7015894 | Morohoshi | Mar 2006 | B2 |
7184064 | Zimmerman et al. | Feb 2007 | B2 |
7218226 | Wehrenberg | May 2007 | B2 |
7434713 | Linden | Oct 2008 | B2 |
7614008 | Ording | Nov 2009 | B2 |
7627652 | Commons | Dec 2009 | B1 |
7633076 | Huppi et al. | Dec 2009 | B2 |
7653883 | Hotelling et al. | Jan 2010 | B2 |
7657849 | Chaudhri et al. | Feb 2010 | B2 |
7663607 | Hotelling et al. | Feb 2010 | B2 |
7688306 | Wehrenberg et al. | Mar 2010 | B2 |
7814055 | Hullot et al. | Oct 2010 | B2 |
7822713 | Hullot et al. | Oct 2010 | B2 |
7844914 | Andre et al. | Nov 2010 | B2 |
7895661 | Dowdy et al. | Feb 2011 | B2 |
7899714 | Robbin et al. | Mar 2011 | B2 |
7957762 | Herz et al. | Jun 2011 | B2 |
8006002 | Kalayjian et al. | Aug 2011 | B2 |
8020105 | Lemay et al. | Sep 2011 | B1 |
8201224 | Spertus | Jun 2012 | B1 |
8239784 | Hotelling et al. | Aug 2012 | B2 |
8279180 | Hotelling et al. | Oct 2012 | B2 |
8285810 | Svendsen | Oct 2012 | B2 |
8352873 | Craig et al. | Jan 2013 | B2 |
8381135 | Hotelling et al. | Feb 2013 | B2 |
8468580 | Casey et al. | Jun 2013 | B1 |
8479122 | Hotelling et al. | Jul 2013 | B2 |
8538458 | Haney | Sep 2013 | B2 |
8554861 | Christie et al. | Oct 2013 | B2 |
8713535 | Malhotra et al. | Apr 2014 | B2 |
8812994 | Seymour et al. | Aug 2014 | B2 |
8825036 | Kemery | Sep 2014 | B2 |
9129135 | Hoefel et al. | Sep 2015 | B2 |
9137389 | Neal | Sep 2015 | B2 |
9172705 | Kong et al. | Oct 2015 | B1 |
9270760 | Heinberg | Feb 2016 | B2 |
9292882 | Blinder | Mar 2016 | B2 |
9294460 | Thomas | Mar 2016 | B1 |
9473419 | Brand | Oct 2016 | B2 |
9626720 | Robbin | Apr 2017 | B2 |
9755842 | Raleigh et al. | Sep 2017 | B2 |
9870238 | Astete et al. | Jan 2018 | B2 |
9875346 | Pitschel et al. | Jan 2018 | B2 |
9935956 | Cha et al. | Apr 2018 | B1 |
10243824 | Lalmanovitch et al. | Mar 2019 | B2 |
20020015024 | Westerman et al. | Feb 2002 | A1 |
20020049806 | Gatz | Apr 2002 | A1 |
20020065919 | Taylor et al. | May 2002 | A1 |
20030028622 | Inoue et al. | Feb 2003 | A1 |
20030061111 | Dutta et al. | Mar 2003 | A1 |
20030112874 | Rabinowitz et al. | Jun 2003 | A1 |
20040015985 | Kweon | Jan 2004 | A1 |
20040034853 | Gibbons et al. | Feb 2004 | A1 |
20040127197 | Roskind | Jul 2004 | A1 |
20040236650 | Zapiec et al. | Nov 2004 | A1 |
20050060412 | Chebolu et al. | Mar 2005 | A1 |
20050273399 | Soma | Dec 2005 | A1 |
20060026521 | Hotelling et al. | Feb 2006 | A1 |
20060033724 | Chaudhri et al. | Feb 2006 | A1 |
20060150240 | Robinson et al. | Jul 2006 | A1 |
20060197753 | Hotelling | Sep 2006 | A1 |
20060265192 | Turicchi | Nov 2006 | A1 |
20070261030 | Wadhwa | Nov 2007 | A1 |
20080005319 | Anderholm et al. | Jan 2008 | A1 |
20080134295 | Bailey | Jun 2008 | A1 |
20080154903 | Crowley | Jun 2008 | A1 |
20090064314 | Lee | Mar 2009 | A1 |
20090086010 | Tiphane | Apr 2009 | A1 |
20090150545 | Flores et al. | Jun 2009 | A1 |
20090254656 | Vignisson et al. | Oct 2009 | A1 |
20090300671 | Scott et al. | Dec 2009 | A1 |
20100077036 | Deluca et al. | Mar 2010 | A1 |
20100146607 | Piepenbrink | Jun 2010 | A1 |
20100188579 | Friedman | Jul 2010 | A1 |
20100205656 | Fein et al. | Aug 2010 | A1 |
20100302182 | Wei et al. | Dec 2010 | A1 |
20100306163 | Beaty et al. | Dec 2010 | A1 |
20100312696 | Sinha | Dec 2010 | A1 |
20110055735 | Wood et al. | Mar 2011 | A1 |
20110065419 | Book et al. | Mar 2011 | A1 |
20110078767 | Cai et al. | Mar 2011 | A1 |
20110295727 | Ferris et al. | Dec 2011 | A1 |
20110302182 | Crawford | Dec 2011 | A1 |
20120014516 | Nies | Jan 2012 | A1 |
20120036552 | Dare et al. | Feb 2012 | A1 |
20120047024 | Urban et al. | Feb 2012 | A1 |
20120101952 | Raleigh et al. | Apr 2012 | A1 |
20120117478 | Vadde et al. | May 2012 | A1 |
20120210444 | Yabe et al. | Aug 2012 | A1 |
20120265803 | Ha | Oct 2012 | A1 |
20120291101 | Ahlstrom et al. | Nov 2012 | A1 |
20120317609 | Carrara et al. | Dec 2012 | A1 |
20120331568 | Weinstein | Dec 2012 | A1 |
20130018960 | Knysz | Jan 2013 | A1 |
20130031191 | Bolt Ross | Jan 2013 | A1 |
20130060616 | Block | Mar 2013 | A1 |
20130065555 | Baker et al. | Mar 2013 | A1 |
20130073654 | Cohen | Mar 2013 | A1 |
20130080522 | Ren | Mar 2013 | A1 |
20130083059 | Hwang et al. | Apr 2013 | A1 |
20130086169 | Bruich et al. | Apr 2013 | A1 |
20130088650 | Rouady et al. | Apr 2013 | A1 |
20130174100 | Seymour et al. | Jul 2013 | A1 |
20130227675 | Fujioka | Aug 2013 | A1 |
20130231093 | Toy et al. | Sep 2013 | A1 |
20130254288 | Harrison | Sep 2013 | A1 |
20130254660 | Fujioka | Sep 2013 | A1 |
20130311597 | Arrouye et al. | Nov 2013 | A1 |
20140009378 | Chew | Jan 2014 | A1 |
20140068755 | King et al. | Mar 2014 | A1 |
20140113593 | Zhou et al. | Apr 2014 | A1 |
20140136607 | Ou et al. | May 2014 | A1 |
20140150068 | Janzer | May 2014 | A1 |
20140162595 | Raleigh et al. | Jun 2014 | A1 |
20140237235 | Kuno et al. | Aug 2014 | A1 |
20140248852 | Raleigh et al. | Sep 2014 | A1 |
20140289866 | Beck et al. | Sep 2014 | A1 |
20140298207 | Ittah et al. | Oct 2014 | A1 |
20140320398 | Papstein | Oct 2014 | A1 |
20140325407 | Morris et al. | Oct 2014 | A1 |
20140330945 | Dabbiere et al. | Nov 2014 | A1 |
20140331314 | Fujioka | Nov 2014 | A1 |
20140337997 | Beck et al. | Nov 2014 | A1 |
20140344951 | Brewer | Nov 2014 | A1 |
20140364056 | Belk et al. | Dec 2014 | A1 |
20150006723 | Sheth et al. | Jan 2015 | A1 |
20150049018 | Gomez | Feb 2015 | A1 |
20150106833 | Kang et al. | Apr 2015 | A1 |
20150153928 | Chen et al. | Jun 2015 | A1 |
20150269547 | Fan | Sep 2015 | A1 |
20150323974 | Shuster et al. | Nov 2015 | A1 |
20150341484 | Yablokov et al. | Nov 2015 | A1 |
20150341506 | Mirza et al. | Nov 2015 | A1 |
20150348032 | Ioveva et al. | Dec 2015 | A1 |
20150350215 | Shi et al. | Dec 2015 | A1 |
20160050194 | Rachmiel | Feb 2016 | A1 |
20160080510 | Dawoud et al. | Mar 2016 | A1 |
20160094538 | Dabbiere et al. | Mar 2016 | A1 |
20160094881 | Khatua | Mar 2016 | A1 |
20160127210 | Noureddin et al. | May 2016 | A1 |
20160139752 | Shim et al. | May 2016 | A1 |
20160188821 | Ozeran | Jun 2016 | A1 |
20160198322 | Pitis | Jul 2016 | A1 |
20160232336 | Pitschel et al. | Aug 2016 | A1 |
20160292060 | Mathew et al. | Oct 2016 | A1 |
20160323191 | Aoki | Nov 2016 | A1 |
20160330078 | Bostick et al. | Nov 2016 | A1 |
20160349932 | Gorny | Dec 2016 | A1 |
20160373588 | Raleigh et al. | Dec 2016 | A1 |
20170010794 | Cho et al. | Jan 2017 | A1 |
20170019858 | Zhao et al. | Jan 2017 | A1 |
20170034228 | Grossman et al. | Feb 2017 | A1 |
20170053129 | Arif et al. | Feb 2017 | A1 |
20170055110 | Tian et al. | Feb 2017 | A1 |
20170149795 | Day, II | May 2017 | A1 |
20170180494 | Agrawal et al. | Jun 2017 | A1 |
20170201850 | Raleigh et al. | Jul 2017 | A1 |
20170251268 | Zhao | Aug 2017 | A1 |
20170279971 | Raleigh et al. | Sep 2017 | A1 |
20170295258 | Raleigh et al. | Oct 2017 | A1 |
20180018252 | Farrell et al. | Jan 2018 | A1 |
20180046818 | Amacker et al. | Feb 2018 | A1 |
20180060102 | Zhu et al. | Mar 2018 | A1 |
20180224915 | Shuster et al. | Aug 2018 | A1 |
20180276353 | Pitschel et al. | Sep 2018 | A1 |
20180343306 | Lotter et al. | Nov 2018 | A1 |
20190037037 | Umeya et al. | Jan 2019 | A1 |
20190081949 | Wu et al. | Mar 2019 | A1 |
20190082227 | Zhao | Mar 2019 | A1 |
20190347180 | Cranfill et al. | Nov 2019 | A1 |
20190347181 | Cranfill et al. | Nov 2019 | A1 |
Number | Date | Country |
---|---|---|
103294965 | Sep 2013 | CN |
104471521 | Mar 2015 | CN |
104487929 | Apr 2015 | CN |
105308634 | Feb 2016 | CN |
2429183 | Mar 2012 | EP |
2860608 | Apr 2015 | EP |
2000-163031 | Jun 2000 | JP |
2001-218976 | Aug 2001 | JP |
2002-342033 | Nov 2002 | JP |
2007-125338 | May 2007 | JP |
2016-13151 | Jan 2016 | JP |
2016-517598 | Jun 2016 | JP |
2007064200 | Jun 2007 | WO |
2011163481 | Dec 2011 | WO |
2014144908 | Sep 2014 | WO |
2015010111 | Jan 2015 | WO |
Entry |
---|
Lee, S.K. et al. (Apr. 1985). “A Multi-Touch Three Dimensional Touch-Sensitive Tablet,” Proceedings of CHI: ACM Conference on Human Factors in Computing Systems, pp. 21-25. |
Rubine, D.H. (Dec. 1991). “The Automatic Recognition of Gestures,” CMU-CS-91-202, Submitted in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Computer Science at Carnegie Mellon University, 285 pages. |
Rubine, D.H. (May 1992). “Combining Gestures and Direct Manipulation,” CHI '92, pp. 659-660. |
Westerman, W. (Spring 1999). “Hand Tracking, Finger Identification, and Chordic Manipulation on a Multi-Touch Surface,” A Dissertation Submitted to the Faculty of the University of Delaware in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Electrical Engineering, 364 pages. |
PCT International Search Report and Written Opinion for PCT/US2015/023406, dated Jun. 26, 2015, 13 pages. |
“Check your Apple ID device list to see where you're signed in”, Apple Support, Available online at: <https://support.apple.com/en-us/HT205064>, Mar. 7, 2018, 4 pages. |
Extended Search Report received for European Patent Application No. 17756993.6, dated Oct. 4, 2019, 7 pages. |
Final Office Action received for U.S. Appl. No. 14/502,981, dated Dec. 28, 2018, 29 pages. |
Final Office Action received for U.S. Appl. No. 14/502,981, dated Oct. 12, 2017, 22 pages. |
Final Office Action received for U.S. Appl. No. 14/683,093, dated Dec. 15, 2016, 11 pages. |
Final Office Action received for U.S. Appl. No. 15/274,887, dated Jan. 5, 2018, 17 pages. |
Final Office Action received for U.S. Appl. No. 15/875,934, dated Aug. 8, 2019, 11 pages. |
Final Office Action received for U.S. Appl. No. 16/179,106, dated Mar. 6, 2020, 12 pages. |
Golightly, Daniel, “Google is Taking Steps in Search of “Digital Wellbeing”—I/O 2018”, Android Headlines, Available Online at: <https://www.androidheadlines.com/2018/05/google-is-taking-steps-in-search-of-digital-wellbeing-i-o-2018.html>, May 8, 2018, 3 pages. |
International Search Report received for PCT Patent Application No. PCT/US2019/029214, dated Nov. 4, 2019, 6 pages. |
Koester, Mark, “Moment: Automatic Track and Know Your iPhone Usage”, Minding the Borderlands, Available online at: <http://www.markwk.com/2016/11/iphone-tracking-with-moment.html>, Nov. 8, 2016, 6 pages. |
Langer, Christina, “How to Disable Gmail Notifications on Android”, Available online at: <https://ccm.net/faq/35590-how-to-disable-gmail-notifications-on-android>, Apr. 11, 2016, 2 pages. |
“My Data Manager Track your Mobile Data Usage and Save Money”, AppPicker, Aug. 18, 2015. |
Nam, Luke, “Live in the Moment (app review)”, deTeched, Available online at: <http://www.deteched.com/2017/11/20/live-moment-app-review/amp/>, Nov. 20, 2017, 5 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/502,981, dated Apr. 3, 2020, 42 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/502,981, dated Apr. 11, 2018, 20 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/502,981, dated Feb. 14, 2017, 17 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/683,093, dated Apr. 20, 2017, 11 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/683,093, dated May 5, 2016, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/875,934, dated Jan. 23, 2020, 11 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/875,934, dated Nov. 30, 2018, 12 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/147,108, dated Mar. 22, 2019, 17 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/179,106, dated Oct. 23, 2019, 10 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/274,887, dated Jun. 23, 2017, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 16/147,069, dated Jan. 6, 2020, 29 pages. |
Notice of Allowance received for U.S. Appl. No. 14/683,093, dated Sep. 6, 2017, 12 pages. |
Notice of Allowance received for U.S. Appl. No. 15/875,934, dated Nov. 25, 2019, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 16/147,069, dated Apr. 29, 2020, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 16/147,108, dated Sep. 26, 2019, 28 pages. |
Notice of Allowance received for U.S. Appl. No. 15/274,887, dated Jul. 20, 2018, 9 pages. |
Search Report received for Danish Patent Application No. PA201870341, completed on Sep. 6, 2018, 4 pages. |
Search Report received for Danish Patent Application No. PA201870345, completed on Aug. 28, 2018, 4 pages. |
Number | Date | Country | |
---|---|---|---|
20150348185 A1 | Dec 2015 | US |
Number | Date | Country | |
---|---|---|---|
62005621 | May 2014 | US |