The present disclosure relates in general to computer systems and in particular to user interfaces for messaging client application programs.
Individuals are becoming increasingly reliant on electronic messaging services, such as email services, instant messaging services, SMS/MMS (also referred to as text-messaging) services, and so on. Such services allow users to send and receive messages. In some instances, the services may also provide additional features related to managing messages, such as the ability to store or archive messages in folders, delete messages, search stored messages, and so on.
Many users who rely on electronic messaging services use various electronic devices, such as laptop or desktop computers, smart phones, tablets, and so on, that are capable of connecting to various data networks and acting as clients of an electronic messaging service. For example, a client can receive messages from a messaging service and can provide a user interface that allows the user to interact with messages, e.g., by replying to messages, deleting messages, archiving messages, composing new messages to be sent, etc. When connected to a data network, the client device can communicate updates from the messaging service to the user and can communicate instructions to the messaging service to implement user actions. The convenience and speed of electronic messaging can lead to a large volume of messages being sent and received.
Unfortunately, many users of messaging services now find themselves overwhelmed by the volume of messages they receive. Techniques that allow users to better manage their messages are therefore desirable.
A user of a messaging service may access the service using different client devices at different times. Each client device can execute an application program that provides a graphical user interface via which the user can read, sort, respond to, and compose messages. The client devices can vary in user-interface factors such as screen size and resolution, types of supported input devices (e.g., touchscreen versus keyboard-and-mouse), and so on. When a user moves from one client device to another, interfaces adapted for different types of input and output devices and/or different screen sizes may be inconsistent, and the user may find it difficult or non-intuitive to interact with a different client.
Certain embodiments of the present invention relate to user interfaces for messaging client applications that can provide a more consistent experience across different client devices. For example, in some embodiments, a user interface can be presented that includes a list of message collections in a first pane, a message list for a current collection in a second pane, and a currently selected message in a third pane, arranged such that all three panes are simultaneously visible on a display screen. This can scale to smaller devices by presenting fewer panes at once and allowing the user to navigate between panes. The user can interact with messages directly from the message list in the second pane. For instance, by executing various input operations on a representation of a message in the message list, the user can move a message from one collection to another, defer a message for later action, delete a message, and so on.
In embodiments where user input is received via keyboard and pointing device (e.g., a mouse), the number of clicks required to perform an operation can be reduced. For example, in some embodiments, a user operating a pointing device such as a mouse can indicate an action to be taken on a message by performing a “drag” operation on a representation of the message in the message list. The message to be acted upon can be indicated by the position (or location) of an on-screen cursor, which can be placed on the representation of the message to be acted upon. The action to be taken on the message can be indicated by the direction and distance of the drag. Similarly, a user operating a multi-touch track pad to control an on-screen cursor can perform multi-touch gestures (e.g., two-finger swipe) to indicate an action to be taken on a message, with the message being indicated by the position of the on-screen cursor and the action being indicated by the direction and distance of the swipe. Such gestures can be intuitively similar to gestures used on a different client device that has a touch screen, where the user can perform the gesture directly on the screen.
In some embodiments, the user interface can provide visual feedback during a drag operation, e.g., using color and/or glyphs, to indicate the action that will be taken if the user ends the drag at the current location.
As another example, in some embodiments, hovering an on-screen cursor over a representation of a message in the message list can result in a group of control elements (or control menu) appearing, each mapped to a different action that can be taken on the message. The user can then move the cursor to the control element corresponding to the desired action and “click” an input device to execute the action. As the user moves the cursor over a particular control element, the color of the control element and/or the control menu and/or the message can be changed to indicate the action that will be taken if the user clicks while the cursor is in its current location.
The following detailed description together with the accompanying drawings will provide a better understanding of the nature and advantages of the present invention.
In some embodiments, a client device can present a user interface including a message list pane that displays a list of messages. Other panes, such as a reading pane or a pane listing message collections, might or might not be present, depending, e.g., on the available display area of the client device. The client device can receive a user input from a user input device that controls a location of an on-screen cursor, and the input can indicate a drag operation on the first message from a starting cursor location to a current cursor location. While the user input continues to indicate the drag operation, the client device can provide a visual indicator identifying an action that will be performed on the first message if the user ends the drag operation, with the indicator being selected based on the current cursor location relative to the starting cursor location. The indicator can include, for example, a glyph and/or a color corresponding to a destination collection to which the message will be moved if the user ends the drag operation at the current cursor location.
When the user input indicates an end to the drag operation, the client device can perform the identified action on the message. For example, the client device can move the message from a current message collection to a destination message collection determined based on the current cursor location relative to the starting cursor location (e.g., direction and/or distance the cursor has moved). In some instances, when the user ends the drag operation, the client device can present a selection menu for the user to select options associated with the identified action (e.g., specifying how long to defer a message). The location at which the selection menu appears can be based on the current cursor location. The user can select an option from the menu, and the client device can use the selected option in connection with performing the action on the message.
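The mapping from a drag's direction and distance to an action can be illustrated with a short sketch. The pixel thresholds, action names, and the left/right assignments below are illustrative assumptions, not taken from this disclosure.

```python
from typing import Optional

# Illustrative thresholds (assumed values, in pixels of cursor travel)
SHORT_DRAG = 60   # minimum distance for the gesture to count as a drag
LONG_DRAG = 180   # distance at which the drag selects a different action

def resolve_drag_action(start_x: int, current_x: int) -> Optional[str]:
    """Return the action that would be performed if the drag ended at
    current_x, or None if the cursor has not moved far enough."""
    dx = current_x - start_x
    if abs(dx) < SHORT_DRAG:
        return None                           # too short to be a drag
    if dx > 0:                                # drag to the right
        return "archive" if dx < LONG_DRAG else "trash"
    return "defer" if -dx < LONG_DRAG else "move_to_list"  # drag left
```

The same function can drive the visual indicator during the drag: each time the cursor moves, the returned action determines which glyph and color to display.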
In some embodiments, the client device can present a user interface including a message list pane that displays a listing of messages. In response to user input, the client device can present a control menu that includes various control elements corresponding to actions that can be performed on the message (e.g., moving the message to various message collections). The control menu can occupy a portion of the area allocated to a message in the message list and can overlay a portion of (or all of) the message. Alternatively, the control menu can occupy an area adjacent to the area allocated to the message, and if desired, other messages can be moved (e.g., up or down) to create space to display the control menu. The control menu can provide a glyph within each control element, where the glyph indicates the action corresponding to that control element.
The control menu can be presented in a dormant state if the on-screen cursor is located within a particular message in the message list but not within the presentation area occupied by the control menu, and in an active state if the cursor is within that presentation area. When the control menu is presented in the active state, at least a portion of the control menu can be presented in an active color that is determined based on which one of the control elements includes the location of the on-screen cursor. In some embodiments, the entire control menu can be presented in the active color, or the entire message within which the cursor is located can be presented in the active color. In some embodiments, the control menu can be presented in the dormant state by presenting different control elements in different translucent colors; in the active state, one or more of the control elements (e.g., the element that contains the cursor location) can become opaque.
While the cursor is within the presentation area of the control menu, the client device can receive a click input. In response to the click input, the client device can perform an action based on which one of the control elements includes the current cursor location.
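One way to implement the dormant/active distinction is a simple hit test of the cursor location against the control elements' bounds. The element spans and colors below are assumptions for illustration; real bounds would come from the layout engine.

```python
from typing import Optional, Tuple

# Assumed horizontal spans (x_min, x_max) and colors for four control
# elements overlaying a message row (illustrative values only).
ELEMENTS = [
    ((0, 60), "brown"),     # move to lists collection
    ((60, 120), "yellow"),  # defer
    ((480, 540), "green"),  # archive
    ((540, 600), "red"),    # trash
]

def menu_state(cursor_x: int) -> Tuple[str, Optional[str]]:
    """Return ("active", color) when the cursor is inside a control
    element, else ("dormant", None)."""
    for (x_min, x_max), color in ELEMENTS:
        if x_min <= cursor_x < x_max:
            return ("active", color)
    return ("dormant", None)
```

A click input would then be dispatched to whichever element the hit test returns, performing that element's action.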
Messaging service 102 can be any service that allows users to send, receive, store, and/or access messages, where a “message” can include any electronic communication generated by a sender and directed to one or more recipients, such as email messages, instant messages (e.g., messages sent between users using various “chat” services), SMS/MMS messages (e.g., messages conforming to Short Messaging Service and/or Multimedia Messaging Service protocols supported by various cellular and other wireless data networks), voice messages, photo/image messages, social network messages, and so on. Examples of messaging service 102 can include email services such as Gmail™ (a service of Google Inc.) and Yahoo!® Mail (a service of Yahoo! Inc.). Other examples can include instant messaging or chat services such as Gmail's chat feature or the Facebook chat function (a service of Facebook, Inc.), SMS/MMS services provided by cellular data carriers, social network services with a messaging component (e.g., social networks provided by Facebook, Inc., or LinkedIn Corp.), and so on. In some embodiments, a user can establish an account with messaging service 102, and messaging service 102 can store and provide access to the user's messages 120. Messaging service 102 can also provide web-based client interfaces, dedicated application programs, application program interfaces (APIs), and/or other tools for facilitating user access to messages 120 using clients 108, 110.
Message management service 104 can be a service that acts as a proxy or intermediary between messaging service 102 and clients 108, 110, as indicated by dashed arrow 116. Message management service 104 can provide enhanced functionality for organizing, storing, accessing, composing, and/or responding to messages 120 stored by messaging service 102. One example of message management service 104 can be the Mailbox service of Dropbox, Inc.
Clients 108 and 110 can be user-operated computing devices that can execute software to interact with message management service 104 and/or messaging service 102. Various types of computing devices can be used, including desktop computers, laptop computers, tablet computers, smart phones, wearable computing devices, personal digital assistants, and so on. By way of example, client 108 can be a smart phone that can execute an application program (also referred to as an app) to communicate with message management service 104 via network 106. The app can be provided by a provider of message management service 104 and can be customized to allow access to enhanced message management functions supported by message management service 104.
Client 110 can be a desktop computer that can execute an app to communicate with message management service 104. This app can be, for example, a mail client app built into an operating system of a desktop computer, a web browser that interfaces with a web server provided by message management service 104, a service-specific application provided by the provider of message management service 104, or another app. It is to be understood that in some embodiments, clients 108 and 110 can execute other apps to communicate with messaging service 102 without using message management service 104.
A given user can have accounts with both messaging service 102 and message management service 104. The user's account with message management service 104 can be linked to the user's account with messaging service 102, allowing the user to use message management service 104 to manage messages 120 sent and received via messaging service 102. In some embodiments, a user can have multiple accounts with one or more messaging services 102 and can link any or all of these accounts to a single account with message management service 104. Message management service 104 can retrieve a subset of messages for the user from messaging service 102 and deliver these messages to client 108 and/or client 110.
As shown in
Switching back and forth between different clients can disrupt the user experience in subtle but potentially unpleasant ways. For instance, as noted above, client 108 may have a smaller display screen than client 110, making less information available at once when using client 108. As another example, client 108 may provide a touch screen interface 130 that allows the user to interact with messages by touching or performing gestures directly on an area of the screen on which the message is displayed. Client 110, in contrast, might not have a touch screen and can instead rely on separate devices such as keyboard 132 and mouse 134.
In accordance with certain embodiments of the present invention, a pleasant user experience can be facilitated by providing analogous interface controls across client devices. For example, client 108 may allow a user to move a message from the inbox to another folder or message collection by performing a swipe gesture in a particular direction across a representation of the message in a message list. In some embodiments, client 110 can support a similar interaction, for example, by allowing the user to operate mouse 134 to drag a message in a message list in a particular direction. In some embodiments, client 110 can also provide simplified and faster interactions with messages in a message list. For example, the user can select a message to act upon by hovering an on-screen cursor over the message to cause a menu of actions to appear, then select an action from the menu via point-and-click. Specific examples are described below.
It will be appreciated that system 100 is illustrative and that variations and modifications are possible. Embodiments of the present invention can support any number of client devices, including client devices belonging to or otherwise associated with different users.
Further, in some embodiments, a message management service can interact with multiple messaging services and can manage messages of disparate types (e.g., email and social network messages). As described below, where the message management service interacts with multiple messaging services, the message management service can dynamically generate and suggest filtering rules with or without regard to which messaging service was the origin of a particular message.
In examples described herein, it is assumed that the user interface screen is presented on a client device that does not have a touch screen or similar input capability that would allow the user to indicate input by tapping directly on a displayed item. Instead, the user can operate an input device to direct cursor 220 to a desired location on screen 200, referred to herein as “moving” or “pointing” cursor 220. It is further assumed that the user can perform a selection action, also referred to herein as “clicking,” to indicate that the client device should take an action, which can be based on the current location of cursor 220. For instance, referring to
Other implementations are also possible. For example, a client device can have a track pad that allows the user to indicate cursor movement by moving a finger across the track pad and to indicate selection by tapping or pressing the pad or an associated button; a pen tablet that allows the user to indicate cursor movement by moving a pen across the tablet surface and to indicate selection by pressing or tapping the pen at the desired location or on a designated area on the tablet; or a joystick that allows the user to indicate cursor movement by pushing the stick in a particular direction and to indicate selection by pressing a button. As another example, a user can indicate cursor movement in various directions by operating arrow keys (or other designated keys) on a keyboard or keypad and indicate selection by operating an enter key (or other designated key). Those skilled in the art will appreciate that still other user input devices and/or mechanisms can be used.
As shown in
Center pane 222 can display a list of messages 224a-f for the current collection. In this example, messages 224a-f are displayed in a column. For each message, e.g., message 224a, center pane 222 can show selected message information, such as a sender (or senders) 226, a subject line 228, a date/time 230 indicating when the message was sent or received, and a preview portion 232 of the message content. The number of messages presented in center pane 222 can be determined dynamically and can be based on considerations such as the available space on the display, the user's preference as to font size, and the number of messages 224 in current collection 204. If the number of messages in current collection 204 exceeds the available space, center pane 222 can be vertically scrollable to allow the user to view additional messages. If the available space exceeds the number of messages in current collection 204, white space can be presented, e.g., at the bottom of center pane 222. In some embodiments, a visual indicator can be shown at the bottom of the list to allow the user to recognize that there are no further messages in the collection. Users can interact with messages in center pane 222. For instance, a user can select a message to view (e.g., by moving cursor 220 over the message and clicking). As another example, the user can move a message from the current collection to a different (“destination”) collection. Examples of user actions to control moving of messages from a collection presented in center pane 222 to a destination collection are described below.
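The dynamic sizing of the message list described above can be sketched as a small layout computation: how many rows fit, whether the pane must scroll, and how much white space remains. The pixel values in the example are illustrative assumptions.

```python
def layout_message_list(pane_height: int, row_height: int,
                        message_count: int):
    """Return (visible_rows, scrollable, whitespace_px) for a message
    list pane. Heights are in pixels (assumed units for illustration)."""
    max_rows = pane_height // row_height
    if message_count > max_rows:
        return max_rows, True, 0             # list scrolls vertically
    whitespace = pane_height - message_count * row_height
    return message_count, False, whitespace  # white space at the bottom
```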
Right pane 240 can be used to display a message selected from center pane 222. In this example, a subject line 242, sender 244, time stamp 246, and message body 248 are shown. If message body 248 is too large to fit within pane 240, pane 240 (or a portion thereof) can be scrollable to allow the user to view different portions of message body 248. In some embodiments, right pane 240 can provide a “conversation” view of related messages, such as a thread of email messages, in which all or part of the body of several different messages is presented. Right pane 240 can also include control elements 250 operable to move the current message (or conversation in the case of a conversation view) to a different collection. Each of control elements 250 can correspond to a different collection (e.g., any of the collections 204 shown in left pane 202). Any one of control elements 250 can be selected by the user, e.g., by moving cursor 220 to the location of the desired control element 250 and clicking, and this can be interpreted as an instruction to move the message shown in right pane 240 to the corresponding collection. However, use of control elements 250 may not be the fastest way to move messages; as described below, the user can also move a message by interacting with the abbreviated message representation in center pane 222.
It will be appreciated that interface screen 200 is illustrative and that variations and modifications are possible. The arrangement and layout of screen elements (including panes, control elements, and information elements) can be varied, and different combinations of screen elements can be presented. The arrangement of the panes can also be varied, and the arrangement can include horizontal and/or vertical segmentation into panes as desired. Depending on the size of the display screen and/or user preferences, the number of panes presented at the same time can be more or fewer than the three shown. For example, on a device with a relatively small display, the number of panes displayed at the same time can be reduced; on some devices, only one of panes 202, 222, and 240 might be visible at a given time. In cases where fewer than all of panes 202, 222, and 240 are visible, navigation controls (e.g., clickable control elements) can be provided to allow the user to navigate between panes. On devices with larger screens, more panes might be shown; for instance, message lists for two or more message collections can be presented at the same time, using two or more panes similar to center pane 222. The particular message collections and messages shown are also for purposes of illustration and can be modified. Further, the messages can include any type of electronically communicated messages, including but not limited to email messages, text messages, social-network messages, photo or image messages, and so on.
In some embodiments, a user can interact with center pane 222 to move a message 224 from one collection to another. Examples of such interactions will now be described.
One type of interaction can be based on a “drag” operation using a pointing device. In one example of a drag operation, a user can hold down a button while moving cursor 220 from a starting location to an ending location, then release the button once cursor 220 has reached the desired ending location. For instance, referring to
In some embodiments of the present invention, a user can perform drag operations to instruct the client device to move a message from one collection to another, with the length and direction of the drag indicating the destination collection. An example is shown in
If the user ends the drag at the position shown in
In some embodiments, different destination collections can be selected by dragging the message by a different distance. For instance, from the position shown in
In some embodiments, different destination collections can be indicated by dragging the message in a different direction. For instance, from the position shown in
If the user ends the drag at the position shown in
In some embodiments, selection menu 520 can appear at a position determined by the location of cursor 320 at the end of the drag operation. For example, selection menu 520 can be rendered at a position such that cursor 320 is located at or near the center of selection menu 520, or at the top left corner, top right corner, or any other location. In some instances, the position of selection menu 520 can be constrained, e.g., by a rule that selection menu 520 should not extend beyond the edge of the visible area. Consistent positioning of selection menu 520 relative to cursor 320 can make selection menu 520 more conspicuous for the user. Further, if selection menu 520 is positioned such that cursor 320 is already near or inside menu 520, this can allow for faster user response, as cursor 320 does not have to be moved far in order to make a selection.
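Centering the selection menu on the cursor while keeping it inside the visible area can be sketched as a clamp on the menu's top-left corner. The screen and menu dimensions in the example are assumptions for illustration.

```python
def position_menu(cursor_x: int, cursor_y: int, menu_w: int, menu_h: int,
                  screen_w: int, screen_h: int):
    """Return the (x, y) top-left corner of a menu centered on the
    cursor, clamped so the menu never extends past a screen edge."""
    x = cursor_x - menu_w // 2
    y = cursor_y - menu_h // 2
    x = max(0, min(x, screen_w - menu_w))   # clamp horizontally
    y = max(0, min(y, screen_h - menu_h))   # clamp vertically
    return x, y
```

With this rule, the cursor is already inside (or very near) the menu when it appears, so little cursor travel is needed to make a selection.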
In some embodiments, instead of ending the drag at the position shown in
If the user ends the drag at the position shown in
In combination, the interfaces and actions shown in
At block 702, client 110 can display a user interface, e.g., interface screen 300 of
More specifically, at block 706, client 110 can determine whether the user input corresponds to an action on a message list. For instance, if the user is simply moving cursor 320, or if the user performs a click or a drag while cursor 320 is not positioned over a message in center pane 322 of
The remaining portions of process 700 show more specific examples of input processing that can occur when the user input corresponds to an action on a message list, e.g., an action performed while cursor 320 is positioned over a message in center pane 322 of
At block 714, client 110 can determine whether the user has begun a drag operation on a message (e.g., message 324b) in center pane 322. For example, a drag can be detected if, at a time when cursor 320 is positioned over a message, the user holds down a control button while moving a pointing device at least a certain minimum distance either to the left or right. The minimum distance can be selected as desired and can depend on properties of the pointing device; in some embodiments, the distance is selected to be long enough that it is unlikely that the action occurred by accident while the user was trying to execute some operation other than a drag. As another example, where a multi-touch track pad is used, a movement of two fingers horizontally for at least a minimum distance can be detected as beginning a message drag. At block 716, when the beginning of a message drag is detected, client 110 can display a first action option. The action option can depend on the direction of the drag. For instance, if the drag is to the right, action indicator 310 of
At block 718, client 110 can determine whether the user has extended a drag operation that was previously begun. For example, extension of a drag can be detected if, at a time when blocks 714 and 716 have previously been executed to display a first action option based on detecting a begin-drag, the user continues to hold down the control button while moving a pointing device in the same direction by at least a second threshold distance (y) beyond the distance (x) that caused the begin-drag to be detected. In some embodiments, the second threshold distance y can be equal to the minimum distance x; that is, if dragging by distance x indicates a begin-drag, then dragging by 2x indicates an extended drag. Other thresholds can also be chosen, and the second threshold distance y can be long enough that it is easy for the user to drag far enough (distance x) to trigger a begin-drag without also performing an extended drag (minimum distance x+y). At block 720, when an extended drag is detected, client 110 can transition the display to display a second action option. The second action option can depend on the direction of the drag. For instance, if a drag to the right is extended, action indicator 310 of
At block 722, client 110 can determine whether the user has reversed the direction of a drag operation that is in progress. This can occur, e.g., after detecting a begin-drag at block 714 or an extended drag at block 718. A reversal of drag direction can be detected, e.g., if while the user continues to hold the control button, motion of a pointing device that was previously moving to the left (right) becomes motion to the right (left). As another example, on a multi-touch track pad, a reversal of drag direction can be detected if the fingers (or other contact objects) begin moving in the other direction without lifting off the track pad. As with other drag actions, detecting a reversal can require that the movement in the reverse direction cover a large enough distance (e.g., distance y as defined above) to make it unlikely to have occurred inadvertently. When a reversal of drag direction is detected, at block 724, client 110 can revert to a previously displayed action option. For instance, if the reversal occurs when action indicator 322 of
At block 726, client 110 can determine whether the user has ended the drag operation. For example, client 110 can detect whether the user has released the control button of a mouse or lifted fingers off a track pad. If the drag operation has ended, at block 728, client 110 can perform an action on the message, such as moving the message to a destination collection. The message to be acted upon can be the message over which cursor 320 is positioned. (Where the displayed message moves with the cursor during a drag, the message is, in effect, selected at the beginning of the drag.) The action to be performed can be the action corresponding to the action indicator that is displayed when the drag operation ends. For instance, if action indicator 310 of
Moving a message to a destination collection can be accomplished in various ways. For example, the message can be removed from the message list in center pane 322 to indicate to the user that the message is being moved. An animated transition, e.g., as described above, can be used. In addition to updating the display, a message repository can also be updated to move the message from its previous collection to the destination collection. For example, referring to
Referring again to
It will be appreciated that process 700 is illustrative and that variations and modifications are possible. Steps described as sequential may be executed in parallel, order of steps may be varied, and steps may be modified, combined, added, or omitted. For instance, the set of actions that can be performed on a message can include more actions, fewer actions, or different actions, and the set is not limited to actions related to moving a message to a destination collection. Further, in the case of moving a message, the particular destination collection associated with a given direction and distance of a drag can depend on which collection is the current message collection displayed in center pane 322. The particular definition of a user action corresponding to a drag can be modified as desired, e.g., based on the user input device(s) supported by a given client device. Further, while the foregoing description refers to dragging a message in a specific direction (e.g., left or right), other embodiments may provide for dragging messages in other directions in order to effect message-processing actions.
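The begin/extend/revert logic of blocks 714 through 728 reduces to comparing the current drag distance against the thresholds x and y. In the sketch below, x and y are assumed equal (one option the description mentions), and the rightward action sequence is an illustrative assumption.

```python
X = 40  # begin-drag threshold (distance x; assumed value, in pixels)
Y = 40  # extension threshold (distance y; here chosen equal to x)

# Illustrative sequence of action options for a rightward drag:
# first option at distance x, extended option at distance x + y.
ACTIONS_RIGHT = ["archive", "trash"]

def displayed_action(drag_distance: int):
    """Return the action option displayed for a given rightward drag
    distance (None before the begin-drag threshold). Because this is a
    pure function of distance, a reversal that shrinks the distance
    automatically reverts to the previously displayed option."""
    if drag_distance < X:
        return None
    if drag_distance < X + Y:
        return ACTIONS_RIGHT[0]
    return ACTIONS_RIGHT[1]
```

When the drag ends (e.g., the mouse button is released), the client performs whichever action this function currently reports for the final distance.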
In embodiments described above, a user can perform a drag operation to move a message from a currently displayed collection to another collection. In other embodiments, when the user “hovers” an on-screen cursor over a message representation in a message list, a set of control elements for moving the message (also referred to as a “control menu”) can be presented, and the user can click on one of the control elements to select it. In some embodiments, the set of control elements can initially be presented in a “dormant” state when the user hovers the cursor over the message, and one of the control elements (or the entire control menu) can transition to an “active” state if the user moves the cursor over that control element. The dormant and active states can be visually distinct; examples are described below.
When cursor 820 hovers over message 824b, a control menu that includes control elements 830, 832, 834, 836 can appear in a dormant state. In this example, each of control elements 830, 832, 834, 836 can appear in the dormant state as a translucent colored region overlaying a different portion of message 824b. The colors can be selected to be associated with different message collections. For instance, control element 830 can be in a color (e.g., brown) associated with lists collection 810, control element 832 can be in a color (e.g., yellow) associated with deferred collection 808, control element 834 can be in a color (e.g., green) associated with archive collection 812, and control element 836 can be in a color (e.g., red) associated with trash collection 814. In the dormant state, the colors can be translucent or pale or otherwise adjusted such that the presented information for message 824b remains readable. In this example, center region 840 of message 824b does not change color, and the user can click on center region 840 to indicate that message 824b should be presented in right pane 842.
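The collection-to-color association and the dormant/active distinction described above can be sketched as follows. The specific RGB values and alpha levels are illustrative assumptions only; the disclosure names the colors but does not specify particular values.

```python
# Hypothetical sketch: rendering the control menu's colored regions in
# a dormant (translucent) versus active (opaque) state.

COLLECTION_COLORS = {
    "lists": (139, 69, 19),     # brown (assumed RGB values)
    "deferred": (255, 215, 0),  # yellow
    "archive": (0, 128, 0),     # green
    "trash": (255, 0, 0),       # red
}

DORMANT_ALPHA = 0.35  # translucent, so the message text stays readable
ACTIVE_ALPHA = 1.0    # fully opaque when the cursor is over the element

def control_element_style(collection, active):
    """Return an RGBA tuple for a control element overlay."""
    r, g, b = COLLECTION_COLORS[collection]
    alpha = ACTIVE_ALPHA if active else DORMANT_ALPHA
    return (r, g, b, alpha)
```

Keeping the hue fixed and varying only opacity between states is one way to make the dormant and active states visually distinct while preserving the color-to-collection cue.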
In some embodiments, an animated transition can be used to indicate moving a message to another collection.
It is to be understood that, from the configuration of
At block 902, client 110 can display a user interface, e.g., interface screen 800 of
More specifically, at block 906, client 110 can determine whether the user input corresponds to hovering over a message in a message list. For example, if cursor 820 is positioned over a message in center pane 822 and no control button is currently activated, this can be detected as hovering. In some embodiments, detecting hovering can further require that cursor 820 is not moving. When hovering over a message in a message list is not detected, at block 908, client 110 can perform appropriate input processing based on the user input received. Depending on the specific user input and implementation, such processing can include changing the current message collection displayed in center pane 822, moving a message to a collection based on user operation of control elements 850 in right pane 842 of
If, at block 906, cursor 820 is hovering over a message, then at block 910, client 110 can show a control menu including control elements for acting on the message over which cursor 820 is hovering. For instance, as shown in
At block 916, client 110 can detect a next user action. At block 918, if the user clicks a control button, then at block 920, client 110 can perform an action based on the cursor location. For example, if the user clicks while cursor 820 is positioned within control element 832 shown in
If the action taken at block 916 is not a click, then at block 922, client 110 can determine whether the action corresponds to moving cursor 820 within the region occupied by the current message. If so, control elements for the message can continue to be displayed, and process 900 can return to block 912 to update the active control element. If, at block 924, cursor 820 has moved to a different message in the message list, process 900 can return to block 910 to display control elements for the message where cursor 820 is now positioned; control elements for the previous message can become hidden (e.g., not rendered on the display screen). If cursor 820 has moved out of the message list entirely (i.e., cursor 820 did not stay within the same message at block 922 and did not move onto another message in the message list at block 924), then at block 926, the control elements that were presented starting at block 910 can be hidden, and process 900 can return to block 904 to detect and process the user input.
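The hover-and-click flow of blocks 906 through 926 can be viewed as a small state machine driven by cursor events: hovering over a message shows its control menu, clicking a control element performs the corresponding action, and moving off the message hides the menu. The sketch below is a simplified, hypothetical rendering of that flow; the event representation, class name, and callback are inventions for illustration.

```python
# Simplified, hypothetical state machine for the hover-and-click flow of
# process 900. Events are plain dicts for illustration.

class HoverMenuController:
    def __init__(self, perform_action):
        self.perform_action = perform_action  # callback: (message, action) -> None
        self.hovered_message = None           # message whose menu is shown, if any

    def on_event(self, event):
        """Dispatch one cursor event."""
        if event["type"] == "click":
            # Blocks 918/920: act based on the cursor location at the click.
            if self.hovered_message and event.get("element"):
                self.perform_action(self.hovered_message, event["element"])
        elif event["type"] == "move":
            target = event.get("message")  # message under the cursor, or None
            if target == self.hovered_message:
                return  # block 922: cursor is still within the same message
            # Blocks 910/924/926: hide the old menu; if the cursor is now
            # over another message, that message's menu is shown instead.
            self.hovered_message = target

# Usage: feed a short event sequence and record the actions performed.
actions = []
ctl = HoverMenuController(lambda msg, act: actions.append((msg, act)))
ctl.on_event({"type": "move", "message": "824b"})    # hover shows the menu
ctl.on_event({"type": "click", "element": "defer"})  # act on message 824b
ctl.on_event({"type": "move", "message": None})      # leaving the list hides it
```

A real client would additionally track which control element is active under the cursor (blocks 912/914) and render the menu state changes; the skeleton above captures only the dispatch structure.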
It will be appreciated that process 900 is illustrative and that variations and modifications are possible. Steps described as sequential may be executed in parallel, order of steps may be varied, and steps may be modified, combined, added or omitted. For instance, the set of available actions can include more actions, fewer actions, or different actions. Further, the set of control elements and their arrangement can depend on which collection is the current message collection displayed in center pane 822. The particular definition of user actions corresponding to hovering and clicking can be modified as desired, based on the particular user input device(s) available.
The style in which control elements are rendered in active and dormant states can also be modified.
In another arrangement, control elements can extend downward from a message to be acted upon.
In yet another arrangement, control elements can extend upward from the bottom of a message to be acted upon.
In yet another arrangement, a user can drag a message to reveal a control menu, then select a control element from the revealed menu.
Other similar arrangements can also be provided. For instance, all control elements can appear with motion in the same direction, and control elements can be arranged side-by-side (similar to
In yet another arrangement, a control menu can appear in a dormant state when the user hovers a cursor over a message and become active when the user moves the cursor over the menu.
In other embodiments, a control menu can appear as an inset over a portion of message 1424b. For example,
In any of the examples in
A control menu similar to control menu 1430 of
In other embodiments, rather than extending outward, the control menu can appear as an inset.
In embodiments where selection of a message and an action is based on a hover-and-click paradigm, color-based feedback cues can help reduce user errors. For example, in some of the examples shown in
Those skilled in the art with access to the present disclosure will recognize that other modifications are possible. Elements (including control elements and/or information elements) shown in different interface screens can be combined, and the arrangement of elements within an interface screen can be modified. Elements can be different from the specific examples shown; for instance, all glyphs, colors, transitions, and the like are illustrative and not limiting. Further, the actions taken in response to user selection of a control element or user performance of a drag operation can include but are not limited to moving messages to various destination collections. For example, in some embodiments, the user may have the option to reply to or forward a message using control elements and operations similar to those described herein. All transitions can be animated, and the speed of animation can be as fast or slow as desired.
Various operations described herein can be implemented on computer systems, which can include systems of generally conventional design.
Processing unit(s) 1705 can include a single processor, which can have one or more cores, or multiple processors. In some embodiments, processing unit(s) 1705 can include a general-purpose primary processor as well as one or more special-purpose co-processors such as graphics processors, digital signal processors, or the like. In some embodiments, some or all processing units 1705 can be implemented using customized circuits, such as application specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs). In some embodiments, such integrated circuits execute instructions that are stored on the circuit itself. In other embodiments, processing unit(s) 1705 can execute instructions stored in storage subsystem 1710.
Storage subsystem 1710 can include various memory units such as a system memory, a read-only memory (ROM), and a permanent storage device. The ROM can store static data and instructions that are needed by processing unit(s) 1705 and other modules of computer system 1700. The permanent storage device can be a read-and-write memory device. This permanent storage device can be a non-volatile memory unit that stores instructions and data even when computer system 1700 is powered down. Some embodiments of the invention can use a mass-storage device (such as a magnetic or optical disk or flash memory) as a permanent storage device. Other embodiments can use a removable storage device (e.g., a floppy disk, a flash drive) as a permanent storage device. The system memory can be a read-and-write memory device or a volatile read-and-write memory, such as dynamic random access memory. The system memory can store some or all of the instructions and data that processing unit(s) 1705 need at runtime.
Storage subsystem 1710 can include any combination of computer readable storage media including semiconductor memory chips of various types (DRAM, SRAM, SDRAM, flash memory, programmable read-only memory) and so on. Magnetic and/or optical disks can also be used. In some embodiments, storage subsystem 1710 can include removable storage media that can be readable and/or writeable; examples of such media include compact disc (CD), read-only digital versatile disc (e.g., DVD-ROM, dual-layer DVD-ROM), read-only and recordable Blu-Ray® discs, ultra density optical disks, flash memory cards (e.g., SD cards, mini-SD cards, micro-SD cards, etc.), magnetic disks, and so on. The computer readable storage media do not include carrier waves and transitory electronic signals passing wirelessly or over wired connections.
In some embodiments, storage subsystem 1710 can store one or more software programs to be executed by processing unit(s) 1705, such as an operating system, a messaging client application, and so on. “Software” refers generally to sequences of instructions that, when executed by processing unit(s) 1705, cause computer system 1700 to perform various operations, thus defining one or more specific machine implementations that execute and perform the operations of the software programs. The instructions can be stored as firmware residing in read-only memory and/or applications stored in non-volatile storage media that can be read into volatile working memory for execution by processing unit(s) 1705. Software can be implemented as a single program or a collection of separate programs or program modules that interact as desired. From storage subsystem 1710, processing unit(s) 1705 can retrieve program instructions to execute and data to process in order to execute various operations described herein.
A user interface can be provided by one or more user input devices 1720 and one or more user output devices 1725. Input devices 1720 can include any device via which a user can provide signals to computer system 1700; computer system 1700 can interpret the signals as indicative of particular user requests or information. In various embodiments, input devices 1720 can include any or all of a keyboard, track pad, touch screen, mouse or other pointing device, scroll wheel, click wheel, dial, button, switch, keypad, microphone, and so on.
User output devices 1725 can include any device via which computer system 1700 can provide information to a user. For example, user output devices 1725 can include a display to display images generated by computer system 1700. The display can incorporate various image generation technologies, e.g., a liquid crystal display (LCD), light-emitting diode (LED) including organic light-emitting diodes (OLED), projection system, cathode ray tube (CRT), or the like, together with supporting electronics (e.g., digital-to-analog or analog-to-digital converters, signal processors, or the like). Some embodiments can include a device such as a touchscreen that functions as both an input and an output device. In some embodiments, other user output devices 1725 can be provided in addition to or instead of a display. Examples include indicator lights, speakers, tactile “display” devices, printers, and so on.
In some embodiments, input devices 1720 and output devices 1725 can interoperate to provide a graphical user interface (“GUI”) that allows a user to interact with computer system 1700 by using an input device to select a control element displayed on the screen (e.g., by operating a pointing device such as a mouse or touching the location where a control element is displayed on a touch screen).
Network interface 1735 can provide voice and/or data communication capability for computer system 1700, including the ability to communicate with various messaging services and/or message management services to access and act upon messages. In some embodiments, network interface 1735 can include radio frequency (RF) transceiver components for accessing wireless voice and/or data networks (e.g., using cellular telephone technology, advanced data network technology such as 3G, 4G or EDGE, Wi-Fi (IEEE 802.11 family standards), or other mobile communication technologies, or any combination thereof), GPS receiver components, and/or other components. In some embodiments, network interface 1735 can provide wired network connectivity (e.g., Ethernet) in addition to or instead of a wireless interface. Network interface 1735 can be implemented using a combination of hardware (e.g., antennas, modulators/demodulators, encoders/decoders, and other analog and/or digital signal processing circuits) and software components.
Bus 1740 can include various system, peripheral, and chipset buses that communicatively connect the numerous components of computer system 1700. For example, bus 1740 can communicatively couple processing unit(s) 1705 with storage subsystem 1710. Bus 1740 can also connect to input devices 1720 and output devices 1725. Bus 1740 can also couple computer system 1700 to a network through network interface 1735. In this manner, computer system 1700 can be a part of a network of multiple computer systems (e.g., a local area network (LAN), a wide area network (WAN), an intranet, or a network of networks, such as the Internet).
Some embodiments include electronic components, such as microprocessors, storage and memory that store computer program instructions in a computer readable storage medium. Many of the features described in this specification can be implemented as processes that are specified as a set of program instructions encoded on a computer readable storage medium. When these program instructions are executed by one or more processing units, they cause the processing unit(s) to perform various operations indicated in the program instructions. Examples of program instructions or computer code include machine code, such as is produced by a compiler, and files including higher-level code that are executed by a computer, an electronic component, or a microprocessor using an interpreter.
Through suitable programming, processing unit(s) 1705 can provide various functionality for computer system 1700. For example, computer system 1700 can execute a messaging client app that provides an interface operable by the user to interact with messages, including, e.g., any or all of the interface screens described above.
It will be appreciated that computer system 1700 is illustrative and that variations and modifications are possible. Computer system 1700 can have other capabilities not specifically described here (e.g., mobile phone, global positioning system (GPS), power management, one or more cameras, various connection ports for connecting external devices or accessories, etc.). Further, while computer system 1700 is described with reference to particular blocks, it is to be understood that these blocks are defined for convenience of description and are not intended to imply a particular physical arrangement of component parts. Further, the blocks need not correspond to physically distinct components. Blocks can be configured to perform various operations, e.g., by programming a processor or providing appropriate control circuitry, and various blocks might or might not be reconfigurable depending on how the initial configuration is obtained. Embodiments of the present invention can be realized in a variety of apparatus including electronic devices implemented using any combination of circuitry and software.
Embodiments of the present invention can be realized using any combination of dedicated components and/or programmable processors and/or other programmable devices. The various processes described herein can be implemented on the same processor or different processors in any combination. Where components are described as being configured to perform certain operations, such configuration can be accomplished, e.g., by designing electronic circuits to perform the operation, by programming programmable electronic circuits (such as microprocessors) to perform the operation, or any combination thereof. Further, while the embodiments described above may make reference to specific hardware and software components, those skilled in the art will appreciate that different combinations of hardware and/or software components may also be used and that particular operations described as being implemented in hardware might also be implemented in software or vice versa.
Computer programs incorporating various features of the present invention may be encoded and stored on various computer readable storage media; suitable media include magnetic disk or tape, optical storage media such as compact disk (CD) or DVD (digital versatile disk), flash memory, and other non-transitory media. Computer readable media encoded with the program code may be packaged with a compatible electronic device, or the program code may be provided separately from electronic devices (e.g., via Internet download or as a separately packaged computer-readable storage medium).
Thus, although the invention has been described with respect to specific embodiments, it will be appreciated that the invention is intended to cover all modifications and equivalents within the scope of the following claims.
This application is a continuation of U.S. application Ser. No. 14/248,272, filed Apr. 8, 2014 and entitled “MESSAGING CLIENT APPLICATION INTERFACE,” which in turn is a continuation-in-part of U.S. application Ser. No. 14/056,838, filed Oct. 17, 2013, entitled “System and Method for Organizing Messages,” which claims the benefit of U.S. Provisional Application No. 61/728,626, filed Nov. 20, 2012. The disclosure of each application is incorporated by reference herein in its entirety.
Number | Date | Country
---|---|---
20170310813 A1 | Oct 2017 | US
Number | Date | Country
---|---|---
61728626 | Nov 2012 | US
 | Number | Date | Country
---|---|---|---
Parent | 14248272 | Apr 2014 | US
Child | 15644672 | | US
 | Number | Date | Country
---|---|---|---
Parent | 14056838 | Oct 2013 | US
Child | 14248272 | | US