Real-time document annotation

Information

  • Patent Grant
  • Patent Number
    11,625,529
  • Date Filed
    Wednesday, October 7, 2020
  • Date Issued
    Tuesday, April 11, 2023
Abstract
Aspects of the present disclosure relate to systems and methods for receiving, managing, and displaying annotations on documents in real-time. A user (e.g., an author of a document) uploads a document into a real-time annotation system, which may then generate a composite presentation based on the uploaded document. The composite presentation includes all the content of the document, presented within a graphical user interface specially configured to receive and manage annotations from a plurality of user devices.
Description
TECHNICAL FIELD

The subject matter disclosed herein relates to electronic document annotation. In particular, example embodiments may relate to techniques for receiving, allocating, managing, and displaying annotations on electronic documents.


BACKGROUND

Documents are often created and shared with groups of individuals to facilitate the exchange of information. Presentation programs, such as Microsoft PowerPoint, aid this effort by providing a medium in which a presenter (or group of presenters) can create presentation decks made up of slides containing visual and auditory content, which can then be presented to an individual or group of individuals in a slide show format. Individuals viewing the presentation often provide feedback to the presenter after the conclusion of the presentation, in the form of comments and questions provided in person, at a later time, or as a red-lined hardcopy of the presentation itself. This collaborative effort is often the ultimate goal of presenting the information to a group of individuals in the first place, but can be impaired by the inability to effectively provide, receive, and manage the feedback.


For example, in a traditional presentation, a presenter first creates and distributes a set of presentation documents in the form of a digital slide show delivered electronically, or as a printed hardcopy to a group of individuals. In a live presentation of the document, the presenter explains the content of the presentation to the group, pausing as a result of interruptions caused by questions or comments related to the content of the presentation. These frequent pauses in the presentation can result in a segmented and disconnected explanation of topics, and may also cause the presenter to lose their train of thought, slow the pace of the presentation, and possibly miss or skip over important concepts.


In cases where the presentation is simply delivered to individuals to view on their own time, the presenter is then faced with the problem of receiving feedback and comments through multiple mediums (e.g., phone calls, in-person conversations, red-lined hardcopies, or emails), as well as duplicative comments which become time-consuming and frustrating to address repeatedly. As a result, many comments may remain unresolved, and the presentation may fail to incorporate valuable feedback which the presenter may have missed.


It is therefore valuable to devise systems and methods that facilitate and encourage the collaborative process while avoiding the cumbersome, time-consuming, and error-prone methods that currently exist.





BRIEF DESCRIPTION OF THE DRAWINGS

Various ones of the appended drawings merely illustrate example embodiments of the present inventive subject matter and cannot be considered as limiting its scope.



FIG. 1 is an architecture diagram depicting a presentation platform having a client-server architecture configured for exchanging data, according to an example embodiment.



FIG. 2 is a block diagram illustrating various modules comprising a real-time annotation system, which is provided as part of the presentation platform, consistent with some embodiments.



FIG. 3 is a flowchart illustrating a method for receiving and managing annotations on documents in real-time, consistent with some embodiments.



FIG. 4 is a flowchart illustrating a method for displaying annotations and receiving comments on the annotations from one or more users, consistent with some embodiments.



FIG. 5 is an interface diagram illustrating a composite presentation overlaid with a graphical user interface, according to example embodiments.



FIG. 6 is an interface diagram illustrating a user interface, comprising a newsfeed, and a document repository, according to example embodiments.



FIG. 7 is an interface diagram illustrating the user interface, comprising the newsfeed, updated to display annotation notifications, and an upload window, according to example embodiments.



FIG. 8 is a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed.





DETAILED DESCRIPTION

Reference will now be made in detail to specific example embodiments for carrying out the inventive subject matter. Examples of these specific embodiments are illustrated in the accompanying drawings, and specific details are set forth in the following description in order to provide a thorough understanding of the subject matter. It will be understood that these examples are not intended to limit the scope of the claims to the illustrated embodiments. On the contrary, they are intended to cover such alternatives, modifications, and equivalents as may be included within the scope of the disclosure.


Aspects of the present disclosure relate to systems and methods for receiving, managing, and displaying annotations on presentation objects in real-time. The presentation objects include electronic documents, such as PowerPoint® presentations, word processing documents, and PDF files, and include an arrangement of one or more presentation elements (e.g., graphics, text, videos, interactive web elements). A user (e.g., a human author of a document) uploads one or more presentation objects into a real-time annotation system, which generates a composite presentation based on the presentation object(s). The presentation object includes one or more user identifiers of users associated with the presentation object, including, for example, an author of the presentation object, one or more users assigned to the presentation object, and a user who uploaded the presentation object into the real-time annotation system. The composite presentation is displayed within a graphical user interface (GUI), and the GUI includes one or more interactive elements overlaid on the composite presentation to receive and manage annotations from user devices in real-time.
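
By way of illustration only, the entities described above might be modeled as follows. This is a minimal TypeScript sketch; every name in it (PresentationObject, CompositePresentation, and the field names) is an assumption for exposition, not an identifier from the disclosure.

```typescript
// Hypothetical data model for the entities described above.
// All type and field names are illustrative, not from the disclosure.

type UserId = string;

interface PresentationElement {
  kind: "graphic" | "text" | "video" | "web"; // presentation element types
  content: string;                            // e.g., text body or resource URL
}

interface PresentationObject {
  id: string;
  elements: PresentationElement[]; // arrangement of presentation elements
  author: UserId;                  // author of the presentation object
  assignees: UserId[];             // users assigned to the presentation object
  uploadedBy: UserId;              // user who uploaded it into the system
}

interface CompositePresentation {
  id: string;
  source: PresentationObject;      // the uploaded presentation object
  associatedUsers: UserId[];       // identifiers used for notifications
}
```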


From a user perspective, the users viewing the composite presentation can provide annotations directly onto the GUI through an input component of a user device (e.g., mobile device or desktop computer) by a single or double-click of a mouse, or in the case of touch-enabled devices, a predefined tactile input (e.g., a tap, a swipe, etc.). For example, to provide an annotation on a particular presentation element within the composite presentation, a first user positions a cursor over a location in the GUI corresponding to a desired presentation element of the composite presentation, and provides a user input.


Responsive to receiving the user input, the real-time annotation system indicates receipt of the user input, for example by causing display of a dialogue box within the GUI, where the first user may provide additional annotation data. The annotation data may include, for example, a user identifier of the first user, a string of text (e.g., comments from users), an indication of approval or disapproval, an annotation type (e.g., a question or a comment), a status indicator flag (e.g., resolved or unresolved), a reference to another user (e.g., a user identifier), as well as multimedia content (e.g., audio and video). The annotation data is then indexed and stored according to the user identifier of the first user, the annotation content, and the composite presentation. The real-time annotation system generates and displays a graphical element representative of the annotation data at the location within the GUI corresponding to the presentation element indicated by the first user. When a user selects the graphical element, the annotation data is displayed in a separate window as a user generated annotation, for viewing by the user. For example, a user may select the graphical element by simply placing a cursor over a location of the graphical element within the GUI (e.g., hovering the cursor over the graphical element), as well as by providing an explicit selection of the graphical element (e.g., clicking).
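
The annotation data enumerated above can be pictured as a record. The following TypeScript sketch assumes one possible field layout, and an illustrative indexing key built from the user identifier, the annotation content, and the composite presentation, as the paragraph describes.

```typescript
// Hypothetical shape of the annotation data described above.
type UserId = string;

interface AnnotationData {
  author: UserId;                      // user identifier of the annotating user
  text?: string;                       // string of text, if any
  approval?: "approve" | "disapprove"; // indication of approval or disapproval
  type: "question" | "comment";        // annotation type
  status: "resolved" | "unresolved";   // status indicator flag
  mentions: UserId[];                  // references to other users
  mediaUrls: string[];                 // multimedia content (audio/video)
  presentationId: string;              // composite presentation being annotated
  location: { x: number; y: number };  // GUI location of the annotated element
}

// Illustrative index key: user identifier + annotation content +
// composite presentation, mirroring the paragraph above.
function annotationKey(a: AnnotationData): string {
  return `${a.presentationId}:${a.author}:${a.text ?? ""}`;
}
```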


In some embodiments, the graphical element generated by the real-time annotation system includes additional indicators to identify the specific annotation, as well as the annotation type. For example, annotations marked as a “question” may be displayed in red (or any other color), while annotations marked as a “comment” may be displayed in blue (or any other color).


Responsive to receiving the annotation data, the real-time annotation system causes a notification that the annotation data has been received to be displayed at the devices of the one or more users associated with the presentation object of the composite presentation. In this way, the real-time annotation system may notify the one or more users associated with a particular presentation object of annotations received on the composite presentation in real-time. The notification includes an identification of the composite presentation in which the annotation was made and a user identifier of the user who provided the annotation data. Responsive to a selection of the notification, the real-time annotation system may cause display of the annotation data as a user generated annotation, which the user may interact with by, for example, providing a response. The associated users may then review the user generated annotation and provide any necessary response or feedback requested.
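
The notification step might be implemented roughly as follows. This is a hedged sketch: the transport (pushToDevice) is a stand-in for whatever real-time channel the system uses, and all names are assumptions.

```typescript
// Hypothetical server-side fan-out of the notification described above.

interface AnnotationNotice {
  presentationId: string; // identifies the composite presentation
  annotator: string;      // user identifier of the annotating user
  summary: string;        // abbreviated annotation text
}

// Stub transport; a real system might push over a WebSocket connection.
function pushToDevice(userId: string, notice: AnnotationNotice): void {
  console.log(`notify ${userId}:`, notice);
}

function notifyAssociatedUsers(
  associatedUsers: string[],
  presentationId: string,
  annotator: string,
  text: string
): void {
  const notice: AnnotationNotice = {
    presentationId,
    annotator,
    summary: text.slice(0, 140), // abbreviated annotation text
  };
  // Notify every user associated with the presentation object in real-time.
  for (const userId of associatedUsers) {
    pushToDevice(userId, notice);
  }
}
```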


In some embodiments, the real-time annotation system maintains composite presentations associated with user identifiers, such that all composite presentations associated with a particular user may be accessed through a single GUI at the user device. For example, a composite presentation may be associated with a user identifier based on the user having authored or uploaded the document, the user placing an annotation on the document, or an annotation on the document referencing the user. The real-time annotation system causes display of a user interface which includes a presentation of one or more graphical elements representing each of the associated composite presentations.


In some instances, the user generated annotation may include references to specific users, for example based on a user identifier. Responsive to receiving an annotation with a reference to a specific user based on the user identifier, the real-time annotation system may cause display of a notification on a device of the specified user, which the specified user may then respond to.


In additional example embodiments, the user generated annotations may also include additional GUI elements with functionality to receive follow requests, as well as assignment requests to assign an annotation to a user based on a user identifier. For example, by selecting the follow request included in the GUI, a user may then be notified of all comments and other activities which occur in reference to the annotation. Similarly, by assigning the annotation to a user based on a user identifier, the assigned user may then be notified of all comments and activities which occur in reference to the annotation.
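
The follow and assignment bookkeeping described here could be tracked per annotation. A minimal sketch, assuming in-memory storage and illustrative function names:

```typescript
// Hypothetical follow/assign bookkeeping for an annotation.
interface AnnotationSubscriptions {
  followers: Set<string>; // users who requested to follow the annotation
  assignee?: string;      // user the annotation is assigned to
}

const subs = new Map<string, AnnotationSubscriptions>();

function follow(annotationId: string, userId: string): void {
  const s = subs.get(annotationId) ?? { followers: new Set<string>() };
  s.followers.add(userId); // user is now notified of all activity
  subs.set(annotationId, s);
}

function assign(annotationId: string, userId: string): void {
  const s = subs.get(annotationId) ?? { followers: new Set<string>() };
  s.assignee = userId;     // assigned user is notified like a follower
  s.followers.add(userId);
  subs.set(annotationId, s);
}
```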



FIG. 1 is a network diagram illustrating a network environment 100 suitable for operating the real-time annotation system 150. A presentation platform 102 provides server-side functionality, via a network 104 (e.g., an intranet, the Internet, or a Wide Area Network (WAN)), to one or more clients such as the client device 110. FIG. 1 illustrates a web client 112, client applications 114, and a programmatic client 116 executing on the client device 110.


An Application Program Interface (API) server 120 and a web server 122 are coupled to, and provide programmatic and web interfaces respectively to, one or more application servers 140. The application servers 140 host the real-time annotation system 150. The application servers 140 are, in turn, shown to be coupled to one or more database servers 124 that facilitate access to one or more databases 126.


A real-time annotation system 150 provides real-time document annotation management functions for the presentation platform 102. For example, the real-time annotation system 150 receives a document (or documents) and generates a composite presentation document configured to receive annotation data from user devices in real-time.


As shown, the network environment 100 includes a client device 110 in communication with a presentation platform 102 over a network 104. The presentation platform 102 communicates and exchanges data with the client device 110 that pertains to various functions and aspects associated with the presentation platform 102 and its users. Likewise, the client device 110, which may be any of a variety of types of devices that include at least a display, a processor, and communication capabilities that provide access to the network 104 (e.g., a smart phone, a tablet computer, a personal digital assistant (PDA), a personal navigation device (PND), a handheld computer, a desktop computer, a laptop or netbook, or a wearable computing device), may be operated by a user (e.g., a person) of the network environment 100 to exchange data with the presentation platform 102 over the network 104.


The client device 110 communicates with the network 104 via a wired or wireless connection. For example, one or more portions of the network 104 may comprise an ad hoc network, an intranet, an extranet, a Virtual Private Network (VPN), a Local Area Network (LAN), a wireless LAN (WLAN), a Wide Area Network (WAN), a wireless WAN (WWAN), a Metropolitan Area Network (MAN), a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a cellular telephone network, a wireless network, a Wireless Fidelity (Wi-Fi®) network, a Worldwide Interoperability for Microwave Access (WiMax) network, another type of network, or any suitable combination thereof.


In various embodiments, the data exchanged between the client device 110 and the presentation platform 102 may involve user-selected functions available through one or more user interfaces (UIs). The UIs may be specifically associated with a web client 112 (e.g., a browser) or an application 114, executing on the client device 110, and in communication with the presentation platform 102.


Turning specifically to the presentation platform 102, a web server 122 is coupled to (e.g., via wired or wireless interfaces), and provides web interfaces to, an application server 140. The application server 140 hosts one or more applications (e.g., web applications) that allow users to use various functions and services of the presentation platform 102. For example, the application server 140 may host the real-time annotation system 150, which provides a number of real-time document annotation management functions. In some embodiments, the real-time annotation system 150 runs and executes on the application server 140, while in other embodiments, the application server 140 provides the client device 110 with a set of instructions (e.g., computer-readable code) that causes the web client 112 of the client device 110 to execute and run the real-time annotation system 150. The real-time annotation system 150 receives presentation objects and generates composite presentations based on the presentation objects.


The presentation objects received and ingested by the real-time annotation system 150 may, for example, include a presentation slideshow (e.g., PowerPoint presentation) made up of a set of one or more presentation documents. Each of the presentation documents included within the presentation slideshow may include a set of presentation elements in the form of auditory or visual content (e.g., images, text strings, audio files, etc.).


The real-time annotation system 150 receives the presentation object (e.g., presentation slideshow) as presentation data delivered from a client device (e.g., client device 110), and stores the presentation object within a database (e.g., database 126), at a memory location associated with the user of the client device. The real-time annotation system 150 then generates a composite presentation based on the presentation data. The composite presentation generated by the real-time annotation system 150 includes the presentation data received from the client device, overlaid with elements of a GUI configured to receive and manage annotation data from one or more users in real-time.
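
Read as pseudocode, this ingest-then-compose flow might look like the following sketch; the in-memory map stands in for database 126, and the type and function names are assumptions.

```typescript
// Hypothetical ingest flow: store the uploaded presentation data at a
// memory location associated with the user, then build the composite
// presentation that the GUI overlays with annotation controls.

interface PresentationData {
  id: string;
  ownerId: string;   // user of the uploading client device
  content: string[]; // serialized presentation documents
}

interface CompositePresentation {
  presentation: PresentationData;
  annotations: unknown[]; // populated later as annotations arrive
}

// In-memory stand-in for database 126.
const db = new Map<string, PresentationData>();

function ingestAndCompose(data: PresentationData): CompositePresentation {
  // Store at a memory location associated with the uploading user.
  db.set(`${data.ownerId}/${data.id}`, data);
  // Generate the composite presentation the GUI will overlay.
  return { presentation: data, annotations: [] };
}
```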



FIG. 2 is a block diagram illustrating various modules comprising the real-time annotation system 150, which is provided as part of the presentation platform 102, consistent with some embodiments. As is understood by skilled artisans in the relevant computer and Internet-related arts, the modules and engines illustrated in FIG. 2 represent a set of executable software instructions and the corresponding hardware (e.g., memory and processor) for executing the instructions. To avoid obscuring the inventive subject matter with unnecessary detail, various functional components (e.g., modules and engines) that are not germane to conveying an understanding of the inventive subject matter have been omitted from FIG. 2. However, a skilled artisan will readily recognize that various additional functional components may be supported by the real-time annotation system 150 to facilitate additional functionality that is not specifically described herein. Furthermore, the various functional modules and engines depicted in FIG. 2 may reside on a single computer (e.g., a client device), or may be distributed across several computers in various arrangements such as cloud-based architectures.


The real-time annotation system 150 is shown as including a document ingest module 210, a presentation module 220, an annotation module 230, and a notification module 240, all configured to communicate with each other (e.g., via a bus, shared memory, a switch, or application programming interfaces (APIs)). The aforementioned modules of the real-time annotation system 150 may, furthermore, access one or more databases that are part of the presentation platform 102 (e.g., database 126), and each of the modules may access one or more computer-readable storage media of the client device 110.
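
As a rough picture of how the four modules of FIG. 2 might be wired together, the following TypeScript interfaces are illustrative assumptions only; the actual division of labor is described in the paragraphs that follow.

```typescript
// Hypothetical interfaces for the four modules of FIG. 2. The
// signatures are assumptions sketched from the surrounding text.

interface DocumentIngestModule {
  receive(presentationData: unknown): string; // returns a storage key
}

interface PresentationModule {
  compose(storageKey: string): unknown; // builds the composite presentation
}

interface AnnotationModule {
  record(presentationId: string, annotation: unknown): void;
}

interface NotificationModule {
  notify(userIds: string[], message: string): void;
}

// The modules communicate with one another (e.g., via a bus, shared
// memory, a switch, or APIs) behind a single system facade.
interface RealTimeAnnotationSystem {
  ingest: DocumentIngestModule;
  presentation: PresentationModule;
  annotation: AnnotationModule;
  notification: NotificationModule;
}
```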


The document ingest module 210 receives presentation data from various client computing devices, and communicates appropriate responses to the requesting client devices to indicate receipt of the documents. For example, the document ingest module 210 provides a number of interfaces (e.g., APIs or user interfaces that are presented by the client device 110) that allow users to upload data (e.g., documents).


The presentation module 220 is configured to access the ingested presentation data and generate a composite presentation to display within a GUI at a client device. The GUI includes one or more interactive elements overlaid on the composite presentation to receive and manage annotations from user devices in real-time. The composite presentation includes one or more user identifiers of users associated with the presentation data.


Additionally, the presentation module 220 causes display of additional user interfaces that include graphical representations of the various annotation options provided by the real-time annotation system 150. These user interfaces may include various interface elements to share, assign, save, comment on, and open composite presentations. The presentation module 220 also receives and processes user input received through such user interfaces. Examples of the user interfaces provided by the presentation module 220 are discussed below in reference to FIGS. 4-7.


The annotation module 230 is configured to receive and manage annotation data received from client devices (e.g., client device 110). For example, users (e.g., user 106) viewing the composite presentation at a client device may provide one or more annotations related to the presentation data of the composite presentation. The annotation module 230 is configured to receive the annotation data and store the annotation data at a memory location in a database (e.g., database 126) associated with the composite presentation. The annotation data includes, for example: location data indicating a location to which the annotation is directed; user identifiers associated with the user providing the annotation; users referenced within the annotation; annotation content, such as a text string; and annotation identification information, such as data indicating an annotation type.


The notification module 240 is configured to generate and cause display of notifications responsive to the annotation module 230 providing an indication that annotation data has been received from a client device. For example, responsive to receiving an indication that the annotation module 230 has received annotation data from a client device, the notification module 240 generates a notification of the annotation to be displayed at one or more client devices. The notification of the annotation may include, for example, the annotation data, including the user identifier of the user who provided the annotation, as well as an indication of the composite presentation to which the annotation applies. The notification module 240 may then cause display of the notification at the client devices of all users associated with the notification. Users associated with the notification include, for example, any users referenced within the annotation (e.g., based on a user identifier), the author of the composite presentation, a user who originally uploaded the document into the real-time annotation system 150, and the users who provide annotations.



FIG. 3 is a flowchart illustrating a method 300 for receiving and managing annotations on documents in real-time, consistent with some embodiments. The method 300 may be embodied in computer-readable instructions for execution by one or more processors such that the operations of the method 300 may be performed in part or in whole by the application server 140. In particular, the operations of the method 300 may be performed in part or in whole by the real-time annotation system 150; accordingly, the method 300 is described below by way of example with reference thereto. However, it shall be appreciated that at least some of the operations of the method 300 may be deployed on various other hardware configurations and the method 300 is not intended to be limited to the application server 140 or the real-time annotation system 150.


At operation 310, the document ingest module 210 receives presentation data from a client device (e.g., client device 110). The presentation data may include one or more presentation documents comprising presentation content. After receiving the presentation data, the document ingest module 210 allocates a memory location for the presentation data within the database 126, and stores the presentation data in the database 126.


At operation 320, responsive to receiving an indication that the real-time annotation system 150 has received presentation data, the presentation module 220 accesses the memory location associated with the presentation data and generates a composite presentation including the presentation data and one or more associated user identifiers. The presentation module 220 causes display of the composite presentation overlaid with a GUI, the GUI including functionality to receive and manage annotations from one or more client devices viewing the composite presentation.


At operation 330, the presentation module 220 causes display of the composite presentation at one or more client devices. The composite presentation may be displayed responsive to receiving an invitation to view the composite presentation from another user (e.g., the author of the presentation, or another user with access to the presentation). For example, a user may access the composite presentation through a GUI of the real-time annotation system 150 at their user device. The user may select a graphical element representative of the composite presentation, and responsive to receiving an indication that the graphical element has been selected, the presentation module 220 causes display of the composite presentation.


At operation 340, the annotation module 230 receives annotation data from one or more client devices on which the composite presentation is presented. The annotation data may include a string of text, a user identifier associated with the user providing the annotation, a location within the composite presentation to which the annotation applies, and an identifier associated with the composite presentation itself. In some instances, the annotation data may also include one or more user identifiers referenced by the annotating user. In this way, the system may provide notifications of the annotation to the users associated with the one or more user identifiers.


At operation 350, the notification module 240 receives an indication that annotation data has been received, and generates a notification to be displayed at one or more client devices. The notification may be displayed as a text box, with a first portion of the text box including a string of text or other indicator (e.g., a graphical element) referencing a corresponding composite presentation, and a second portion including a user identifier indicating the user who created the annotation. In some embodiments, the notification may also include an indication of an annotation type (e.g., question or comment), as well as a timestamp indicating the time and date at which the annotation was first received. A user may then access the corresponding composite presentation by selecting the notification. Responsive to a selection of the notification, the real-time annotation system 150 may then cause display of the composite presentation, along with the user generated annotation itself.


In some embodiments, the annotation data may also include a status indicator, indicating whether the user generated annotation is resolved or unresolved. For example, the user generated annotation may include annotation data indicating a question requiring a response. The status indicator of the annotation may indicate that the annotation is “unresolved” until a comment is provided, or until a user switches the state of the annotation to “resolved” by providing an input from a client device. Once the annotation has been marked resolved, the notification of the annotation may disappear from the GUI.
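
The resolved/unresolved lifecycle can be made concrete with a small sketch; the state values track the paragraph above, while the function names are assumptions.

```typescript
// Hypothetical status lifecycle for a user generated annotation.
type AnnotationStatus = "resolved" | "unresolved";

interface TrackedAnnotation {
  id: string;
  status: AnnotationStatus;
}

// A new question-type annotation starts unresolved...
function createQuestion(id: string): TrackedAnnotation {
  return { id, status: "unresolved" };
}

// ...and a user input from a client device flips it to resolved, at
// which point its notification can be removed from the GUI.
function resolve(a: TrackedAnnotation): TrackedAnnotation {
  return { ...a, status: "resolved" };
}
```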



FIG. 4 is a flowchart illustrating a method 400 for displaying user generated annotations and receiving comments on the user generated annotations from one or more users, consistent with some embodiments. The method 400 may be embodied in computer-readable instructions for execution by one or more processors such that the operations of the method 400 may be performed in part or in whole by the application server 140. In particular, the operations of the method 400 may be performed in part or in whole by the real-time annotation system 150; accordingly, the method 400 is described below by way of example with reference thereto. However, it shall be appreciated that at least some of the operations of the method 400 may be deployed on various other hardware configurations and the method 400 is not intended to be limited to the application server 140 or the real-time annotation system 150.


As discussed above in reference to FIG. 3, the method 300 describes receiving presentation data, and generating and displaying a composite presentation overlaid with a GUI configured to receive and manage user generated annotations in real-time from one or more user devices. The method 400 may initiate at a point at which a user has received a notification of a user generated annotation, and provided an indication that the user generated annotation be displayed at the user device. At operation 410, the annotation module 230 causes display of the user generated annotation pertaining to a composite presentation within a graphical user interface of the real-time annotation system 150. The user generated annotation may have been received from another user accessing the composite presentation, or alternatively could have been created by the viewing user themselves. The user generated annotation may be displayed as a graphical element represented as a text box including a first portion which includes a text string (e.g., a question or comment), a second portion which indicates a source of the annotation (e.g., a user identifier), and a third portion which may include a text field to provide an annotation comment on the annotation. In some instances, the user generated annotation may also include an annotation identifier (e.g., a number, letter, or other form of identification).


At operation 420, the user viewing the user generated annotation provides a comment on the user generated annotation. The user may initiate the comment by selecting the user generated annotation and providing an annotation comment as a text string directly into a text field of the user generated annotation. The annotation comment may include additional text pertaining to the annotation, as well as one or more references to additional users (e.g., based on user identifiers).
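
Operation 420 might be realized client-side along these lines; the “@user” mention syntax and the helper names are assumptions for illustration.

```typescript
// Hypothetical client-side comment submission for operation 420.
interface AnnotationComment {
  annotationId: string;
  author: string;     // user identifier of the commenting user
  text: string;       // comment text entered into the text field
  mentions: string[]; // user identifiers referenced in the comment
}

// Assumed convention: extract "@user" style references from the text.
function extractMentions(text: string): string[] {
  return [...text.matchAll(/@(\w+)/g)].map((m) => m[1]);
}

function buildComment(
  annotationId: string,
  author: string,
  text: string
): AnnotationComment {
  return { annotationId, author, text, mentions: extractMentions(text) };
}
```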


At operation 430, responsive to receiving an indication of the annotation comment from the user device, the notification module 240 generates and causes display of a notification at the user devices of the associated users. The associated users include the author of the composite presentation (or user who uploaded the presentation deck into the real-time annotation system), the user who provided the annotation on which the comment was received, as well as any users referenced in the comment itself, or who may have commented on the annotation. The notification may include a reference to the annotation on which the comment was received, a reference to the corresponding composite presentation, as well as a user identifier associated with the user who provided the comment.


At operation 440, the notification module 240 receives an indication of a selection of the notification from a user device. Responsive to the notification module 240 receiving the selection, at operation 450, the annotation module 230 causes display of the user generated annotation at the user device, including the most recent comment.



FIG. 5 is an interface diagram illustrating a composite presentation 510 overlaid with a GUI 500, according to example embodiments. The composite presentation 510 is shown to include a set of user generated annotations 515, graphical elements (e.g., 520 and 525) that represent a reference location of a corresponding user generated annotation within the set of user generated annotations 515, and a cursor 560 to place annotations on the composite presentation 510.


In some embodiments, the user generated annotations 515 are generated responsive to an input received from a client device (e.g., client device 110), via the cursor 560. For example, a user may move the cursor 560 over a location in the composite presentation 510 within the GUI 500 and provide a user input (e.g., a single or double click). Responsive to the received user input, the real-time annotation system 150 generates and places a graphical element (e.g., 520, 525) at the location of the cursor 560 in the GUI 500. Each graphical element may be visually distinct from the others, for example, by numbering, unique colors or patterns, or labels. As shown in FIG. 5, each graphical element includes a unique number identifying an associated annotation among the user generated annotations 515.
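
On a browser client, placing a numbered marker at the cursor location could look like the following sketch, using standard DOM APIs; the marker styling and numbering scheme are assumptions.

```typescript
// Hypothetical browser-side handler: on double-click, drop a numbered
// marker at the cursor location within the composite presentation.

let nextMarkerNumber = 1;

function placeMarker(canvas: HTMLElement, ev: MouseEvent): void {
  const marker = document.createElement("span");
  marker.textContent = String(nextMarkerNumber++); // unique number per marker
  marker.style.position = "absolute";
  marker.style.left = `${ev.offsetX}px`;           // cursor location in the GUI
  marker.style.top = `${ev.offsetY}px`;
  canvas.appendChild(marker);
  // A real system would also send the location to the server here.
}

// Usage: attach to the composite presentation container (id assumed).
document
  .getElementById("composite")
  ?.addEventListener("dblclick", (ev) =>
    placeMarker(ev.currentTarget as HTMLElement, ev)
  );
```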


The user generated annotations (e.g., 515) created are then displayed within a portion of the GUI 500. The user generated annotation may include: an annotation identifier 530, to identify a corresponding graphical element (e.g., graphical element 525) associated with the annotation; a source identifier 535 (e.g., a user identifier), identifying a user who created or placed the user generated annotation on the composite presentation 510; a status indicator 540, to indicate a status of the annotation (e.g., resolved or unresolved); an assignment field 545, where a user may assign the user generated annotation to one or more users based on one or more user identifiers; a comment field 550, where a user may provide a comment on the user generated annotation; and annotation content 555, which may include a text string, an indication of approval or disapproval (e.g., a like or dislike), or simply a flag to alert a recipient of a particular element within the composite presentation 510.



FIG. 6 is an interface diagram illustrating a GUI 600, comprising a newsfeed 610, and a document repository 615, according to example embodiments. In some embodiments, the newsfeed 610 may include a switch 620 to toggle between various types of content to be displayed within the newsfeed 610. For example, the switch 620 may toggle the content displayed between an updates section and an annotations section, wherein the updates section displays annotation notifications in real-time, and the annotations section displays abbreviated annotations. The newsfeed displayed in FIG. 6 depicts the annotations section.


The annotations section of the newsfeed 610 comprises a list of all user generated annotations which the user accessing the GUI 600 has created, has been assigned, or is following. As shown, the section may be separated into a personal tasks section 625 to display annotations which the user themselves has created, and a following section 630 to display annotations which the user is following (e.g., requested updates for).


A document repository 615 includes an arrangement of graphical representations of all documents which a user is associated with. As stated above, a user may be associated with a document if they uploaded the document into the real-time annotation system 150, created an annotation on a composite presentation, received an assignment on an annotation, or were referenced in a comment or annotation by a user identifier.



FIG. 7 is an interface diagram illustrating the GUI 600, comprising the newsfeed 610, updated to display annotation notifications (e.g., annotation notification 715), and an upload window 710, according to example embodiments. The annotation notifications are received from one or more client devices and displayed in real-time. In some embodiments, the user may sort the annotation notifications in the order they are received, with the most recently received annotations at the top of the newsfeed 610 and older annotations following below.
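
Newest-first ordering of the newsfeed amounts to a simple sort; a minimal sketch, assuming a timestamp field on each notification:

```typescript
// Hypothetical newest-first ordering for annotation notifications.
interface AnnotationNotification {
  text: string;
  receivedAt: number; // epoch milliseconds; assumed field name
}

function newestFirst(
  feed: AnnotationNotification[]
): AnnotationNotification[] {
  // Most recently received first, older annotations following below.
  return [...feed].sort((a, b) => b.receivedAt - a.receivedAt);
}
```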


The annotation notifications include an indication of the nature of the notification (e.g., “you were mentioned in a comment,” “you were mentioned in an annotation,” “you were assigned to a task”), an indication of which composite presentation the annotation pertains to (e.g., “Test Document”), as well as a text string. The user may access the corresponding composite presentation by selecting a desired annotation notification from among the set of annotation notifications displayed.


The upload window 710 enables users to upload presentation data into the real-time annotation system 150. Upon uploading the presentation data via the upload window 710, the real-time annotation system 150 stores the document within the database 126, and generates a composite presentation based on the uploaded document. A user may also provide a title for the document, a description for the document, and a file type of the document. In some embodiments, the uploading user may also share the document with one or more additional users by including a set of user identifiers into the upload window 710.
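
The upload described here might carry its metadata in a payload like the following; all field and function names are assumptions about one possible encoding.

```typescript
// Hypothetical upload payload for the upload window 710.
interface UploadRequest {
  title: string;        // user-provided title for the document
  description: string;  // user-provided description
  fileType: string;     // e.g., "pptx", "docx", "pdf"
  sharedWith: string[]; // user identifiers to share the document with
  bytes: Uint8Array;    // the document content itself
}

// Assumed handler: persist the document and return a presentation id.
function upload(req: UploadRequest): string {
  const id = `doc-${Date.now()}`;
  // ...store req.bytes in the database and generate the composite
  // presentation keyed by id (see FIG. 3, operations 310-320)...
  return id;
}
```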


Modules, Components, and Logic


Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a machine-readable medium) or hardware modules. A “hardware module” is a tangible unit capable of performing certain operations and may be configured or arranged in a certain physical manner. In various example embodiments, one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.


In some embodiments, a hardware module may be implemented mechanically, electronically, or any suitable combination thereof. For example, a hardware module may include dedicated circuitry or logic that is permanently configured to perform certain operations. For example, a hardware module may be a special-purpose processor, such as a Field-Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC). A hardware module may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations. For example, a hardware module may include software executed by a general-purpose processor or other programmable processor. Once configured by such software, hardware modules become specific machines (or specific components of a machine) uniquely tailored to perform the configured functions and are no longer general-purpose processors. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.


Accordingly, the phrase “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. As used herein, “hardware-implemented module” refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where a hardware module comprises a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor may be configured as respectively different special-purpose processors (e.g., comprising different hardware modules) at different times. Software accordingly configures a particular processor or processors, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.


Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).


The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions described herein. As used herein, “processor-implemented module” refers to a hardware module implemented using one or more processors.


Similarly, the methods described herein may be at least partially processor-implemented, with a particular processor or processors being an example of hardware. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. Moreover, the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an API).


The performance of certain of the operations may be distributed among the processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the processors or processor-implemented modules may be distributed across a number of geographic locations.


Example Machine Architecture and Machine-Readable Medium



FIG. 8 is a block diagram illustrating components of a machine 800, according to some example embodiments, able to read instructions from a machine-readable medium (e.g., a machine-readable storage medium) and perform any one or more of the methodologies discussed herein. Specifically, FIG. 8 shows a diagrammatic representation of the machine 800 in the example form of a computer system, within which instructions 816 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the machine 800 to perform any one or more of the methodologies discussed herein may be executed. Additionally, or alternatively, the machine 800 may correspond to any one of the client device 110, the web server 122, or the application server 140. The instructions transform the general, non-programmed machine into a particular machine programmed to carry out the described and illustrated functions in the manner described. In alternative embodiments, the machine 800 operates as a standalone device or may be coupled (e.g., networked) to other machines. In a networked deployment, the machine 800 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine 800 may comprise, but not be limited to, a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a PDA, an entertainment media system, a cellular telephone, a smart phone, a mobile device, a wearable device (e.g., a smart watch), a smart home device (e.g., a smart appliance), other smart devices, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 816, sequentially or otherwise, that specify actions to be taken by the machine 800. Further, while only a single machine 800 is illustrated, the term “machine” shall also be taken to include a collection of machines 800 that individually or jointly execute the instructions 816 to perform any one or more of the methodologies discussed herein.


The machine 800 may include processors 810, memory/storage 830, and I/O components 850, which may be configured to communicate with each other such as via a bus 802. In an example embodiment, the processors 810 (e.g., a Central Processing Unit (CPU), a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, a Graphics Processing Unit (GPU), a Digital Signal Processor (DSP), an ASIC, a Radio-Frequency Integrated Circuit (RFIC), another processor, or any suitable combination thereof) may include, for example, a processor 812 and a processor 814 that may execute the instructions 816. The term “processor” is intended to include multi-core processors that may comprise two or more independent processors (sometimes referred to as “cores”) that may execute instructions contemporaneously. Although FIG. 8 shows multiple processors, the machine 800 may include a single processor with a single core, a single processor with multiple cores (e.g., a multi-core processor), multiple processors with a single core, multiple processors with multiple cores, or any combination thereof.


The memory/storage 830 may include a memory 832, such as a main memory, or other memory storage, and a storage unit 836, both accessible to the processors 810 such as via the bus 802. The storage unit 836 and memory 832 store the instructions 816 embodying any one or more of the methodologies or functions described herein. The instructions 816 may also reside, completely or partially, within the memory 832, within the storage unit 836, within at least one of the processors 810 (e.g., within the processor's cache memory), or any suitable combination thereof, during execution thereof by the machine 800. Accordingly, the memory 832, the storage unit 836, and the memory of the processors 810 are examples of machine-readable media.


As used herein, “machine-readable medium” means a device able to store instructions and data temporarily or permanently, and may include, but is not limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, optical media, magnetic media, cache memory, other types of storage (e.g., Electrically Erasable Programmable Read-Only Memory (EEPROM)), and/or any suitable combination thereof. The term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store the instructions 816. The term “machine-readable medium” shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions (e.g., instructions 816) for execution by a machine (e.g., machine 800), such that the instructions, when executed by one or more processors of the machine (e.g., processors 810), cause the machine to perform any one or more of the methodologies described herein. Accordingly, a “machine-readable medium” refers to a single storage apparatus or device, as well as “cloud-based” storage systems or storage networks that include multiple storage apparatus or devices. The term “machine-readable medium” excludes signals per se.


Furthermore, the machine-readable medium is non-transitory in that it does not embody a propagating signal. However, labeling the tangible machine-readable medium “non-transitory” should not be construed to mean that the medium is incapable of movement—the medium should be considered as being transportable from one real-world location to another. Additionally, since the machine-readable medium is tangible, the medium may be considered to be a machine-readable device.


The I/O components 850 may include a wide variety of components to receive input, provide output, produce output, transmit information, exchange information, capture measurements, and so on. The specific I/O components 850 that are included in a particular machine will depend on the type of machine. For example, portable machines such as mobile phones will likely include a touch input device or other such input mechanisms, while a headless server machine will likely not include such a touch input device. It will be appreciated that the I/O components 850 may include many other components that are not shown in FIG. 8. The I/O components 850 are grouped according to functionality merely for simplifying the following discussion and the grouping is in no way limiting. In various example embodiments, the I/O components 850 may include output components 852 and input components 854. The output components 852 may include visual components (e.g., a display such as a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)), acoustic components (e.g., speakers), haptic components (e.g., a vibratory motor, resistance mechanisms), other signal generators, and so forth. The input components 854 may include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or another pointing instrument), tactile input components (e.g., a physical button, a touch screen that provides location and/or force of touches or touch gestures, or other tactile input components), audio input components (e.g., a microphone), and the like.


In further example embodiments, the I/O components 850 may include biometric components 856, motion components 858, environmental components 860, or position components 862 among a wide array of other components. For example, the biometric components 856 may include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram based identification), and the like. The motion components 858 may include acceleration sensor components (e.g., accelerometer), gravitation sensor components, rotation sensor components (e.g., gyroscope), and so forth. The environmental components 860 may include, for example, illumination sensor components (e.g., photometer), temperature sensor components (e.g., one or more thermometers that detect ambient temperature), humidity sensor components, pressure sensor components (e.g., barometer), acoustic sensor components (e.g., one or more microphones that detect background noise), proximity sensor components (e.g., infrared sensors that detect nearby objects), gas sensors (e.g., gas detection sensors to detect concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), or other components that may provide indications, measurements, or signals corresponding to a surrounding physical environment. The position components 862 may include location sensor components (e.g., a Global Position System (GPS) receiver component), altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like.


Communication may be implemented using a wide variety of technologies. The I/O components 850 may include communication components 864 operable to couple the machine 800 to a network 880 or devices 870 via a coupling 882 and a coupling 872, respectively. For example, the communication components 864 may include a network interface component or other suitable device to interface with the network 880. In further examples, the communication components 864 may include wired communication components, wireless communication components, cellular communication components, Near Field Communication (NFC) components, Bluetooth® components (e.g., Bluetooth® Low Energy), Wi-Fi® components, and other communication components to provide communication via other modalities. The devices 870 may be another machine or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via a Universal Serial Bus (USB)).


Moreover, the communication components 864 may detect identifiers or include components operable to detect identifiers. For example, the communication components 864 may include Radio Frequency Identification (RFID) tag reader components, NFC smart tag detection components, optical reader components (e.g., an optical sensor to detect one-dimensional bar codes such as Universal Product Code (UPC) bar code, multi-dimensional bar codes such as Quick Response (QR) code, Aztec code, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, UCC RSS-2D bar code, and other optical codes), or acoustic detection components (e.g., microphones to identify tagged audio signals). In addition, a variety of information may be derived via the communication components 864, such as location via Internet Protocol (IP) geo-location, location via Wi-Fi® signal triangulation, location via detecting an NFC beacon signal that may indicate a particular location, and so forth.


Transmission Medium


In various example embodiments, one or more portions of the network 880 may be an ad hoc network, an intranet, an extranet, a VPN, a LAN, a WLAN, a WAN, a WWAN, a MAN, the Internet, a portion of the Internet, a portion of the PSTN, a plain old telephone service (POTS) network, a cellular telephone network, a wireless network, a Wi-Fi® network, another type of network, or a combination of two or more such networks. For example, the network 880 or a portion of the network 880 may include a wireless or cellular network and the coupling 882 may be a Code Division Multiple Access (CDMA) connection, a Global System for Mobile communications (GSM) connection, or another type of cellular or wireless coupling. In this example, the coupling 882 may implement any of a variety of types of data transfer technology, such as Single Carrier Radio Transmission Technology (1×RTT), Evolution-Data Optimized (EVDO) technology, General Packet Radio Service (GPRS) technology, Enhanced Data rates for GSM Evolution (EDGE) technology, third Generation Partnership Project (3GPP) including 3G, fourth generation wireless (4G) networks, Universal Mobile Telecommunications System (UMTS), High Speed Packet Access (HSPA), Worldwide Interoperability for Microwave Access (WiMAX), Long Term Evolution (LTE) standard, others defined by various standard-setting organizations, other long range protocols, or other data transfer technology.


The instructions 816 may be transmitted or received over the network 880 using a transmission medium via a network interface device (e.g., a network interface component included in the communication components 864) and using any one of a number of well-known transfer protocols (e.g., hypertext transfer protocol (HTTP)). Similarly, the instructions 816 may be transmitted or received using a transmission medium via the coupling 872 (e.g., a peer-to-peer coupling) to the devices 870. The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying the instructions 816 for execution by the machine 800, and includes digital or analog communications signals or other intangible media to facilitate communication of such software.


Language


Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.


Although an overview of the inventive subject matter has been described with reference to specific example embodiments, various modifications and changes may be made to these embodiments without departing from the broader scope of embodiments of the present disclosure. Such embodiments of the inventive subject matter may be referred to herein, individually or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single disclosure or inventive concept if more than one is, in fact, disclosed.


The embodiments illustrated herein are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed. Other embodiments may be used and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. The Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.


As used herein, the term “or” may be construed in either an inclusive or exclusive sense. Moreover, plural instances may be provided for resources, operations, or structures described herein as a single instance. Additionally, boundaries between various resources, operations, modules, engines, and data stores are somewhat arbitrary, and particular operations are illustrated in a context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within the scope of various embodiments of the present disclosure. In general, structures and functionality presented as separate resources in the example configurations may be implemented as a combined structure or resource. Similarly, structures and functionality presented as a single resource may be implemented as separate resources. These and other variations, modifications, additions, and improvements fall within the scope of embodiments of the present disclosure as represented by the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.


In this document, the terms “a” or “an” are used, as is common in patent documents, to include one or more than one, independent of any other instances or usages of “at least one” or “one or more.” In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.” Also, in the following claims, the terms “including” and “comprising” are open-ended; that is, a system, device, article, or process that includes elements in addition to those listed after such a term in a claim is still deemed to fall within the scope of that claim. Moreover, in the following claims, the terms “first,” “second,” “third,” and so forth are used merely as labels, and are not intended to impose numerical requirements on their objects.

Claims
  • 1. A method comprising:
    receiving, by a first client device, an update to a set of graphical elements being displayed by the first client device, the update including an annotation identifier corresponding to a first annotation type and data identifying a first graphical element from the set of graphical elements, the update to the set of graphical elements having been transmitted by a cloud-based server in response to the cloud-based server having received annotation data defining the update to the set of graphical elements from a second client device that is displaying the set of graphical elements;
    updating a presentation of the set of graphical elements being displayed by the first client device based on the update such that the annotation identifier is presented at a location corresponding to the first graphical element;
    in response to detecting an input selecting the annotation identifier presented at the location corresponding to the first graphical element, transmitting a request to the cloud-based server for the annotation data corresponding to the annotation identifier;
    in response to receiving the annotation data from the cloud-based server, updating the presentation of the set of graphical elements being displayed by the first client device based on the annotation data;
    receiving, by the first client device, an assignment of the annotation data to a user based on a user identifier received via an input device of the first client device; and
    transmitting, by the first client device, the assignment and the user identifier to the cloud-based server, the transmitting causing a presentation of a notification on the second client device by the cloud-based server, the notification including an indication that the user has been assigned to the annotation data, text of the comment, the user identifier, and an indication of the annotation data.
  • 2. The method of claim 1, wherein the annotation data includes multimedia content.
  • 3. The method of claim 1, wherein updating the presentation of the set of graphical elements being displayed by the first client device comprises: causing presentation of a user identifier that identifies a user profile associated with the second client device.
  • 4. The method of claim 1, further comprising: receiving a second update to the set of graphical elements being displayed by the first client device, the second update including a second annotation identifier corresponding to a second annotation type and data identifying a second graphical element from the set of graphical elements, the second update to the set of graphical elements having been transmitted by the cloud-based server in response to the cloud-based server having received second annotation data defining the second update to the set of graphical elements from a third client device that is displaying the set of graphical elements.
  • 5. The method of claim 4, further comprising: updating presentation of the set of graphical elements being displayed by the first client device based on the second update such that the second annotation identifier is presented at a location corresponding to the second graphical element.
  • 6. A client device comprising:
    one or more computer processors; and
    one or more computer-readable mediums storing instructions that, when executed by the one or more computer processors, cause the client device to perform operations comprising:
    receiving an update to a set of graphical elements being displayed by the client device, the update including an annotation identifier corresponding to a first annotation type and data identifying a first graphical element from the set of graphical elements, the update to the set of graphical elements having been transmitted by a cloud-based server in response to the cloud-based server having received annotation data defining the update to the set of graphical elements from a second client device that is displaying the set of graphical elements;
    updating presentation of the set of graphical elements being displayed by the client device based on the update such that the annotation identifier is presented at a location corresponding to the first graphical element;
    in response to detecting an input selecting the annotation identifier presented at the location corresponding to the first graphical element, transmitting a request to the cloud-based server for the annotation data corresponding to the annotation identifier;
    in response to receiving the annotation data from the cloud-based server, updating presentation of the set of graphical elements being displayed by the client device based on the annotation data;
    receiving, by the client device, an assignment of the annotation data to a user based on a user identifier received via an input device of the client device; and
    transmitting, by the client device, the assignment and the user identifier to the cloud-based server, the transmitting causing a presentation of a notification on the second client device by the cloud-based server, the notification including an indication that the user has been assigned to the annotation data, text of the comment, the user identifier, and an indication of the annotation data.
  • 7. The client device of claim 6, wherein the annotation data includes multimedia content.
  • 8. The client device of claim 6, wherein updating the presentation of the set of graphical elements being displayed by the client device comprises: causing presentation of a user identifier that identifies a user profile associated with the second client device.
  • 9. The client device of claim 6, the operations further comprising: receiving a second update to the set of graphical elements being displayed by the client device, the second update including a second annotation identifier corresponding to a second annotation type and data identifying a second graphical element from the set of graphical elements, the second update to the set of graphical elements having been transmitted by the cloud-based server in response to the cloud-based server having received second annotation data defining the second update to the set of graphical elements from a third client device that is displaying the set of graphical elements.
  • 10. The client device of claim 9, the operations further comprising: updating presentation of the set of graphical elements being displayed by the client device based on the second update such that the second annotation identifier is presented at a location corresponding to the second graphical element.
  • 11. A non-transitory computer-readable medium storing instructions that, when executed by one or more computer processors of a client device, cause the client device to perform operations comprising:
    receiving an update to a set of graphical elements being displayed by the client device, the update including an annotation identifier corresponding to a first annotation type and data identifying a first graphical element from the set of graphical elements, the update to the set of graphical elements having been transmitted by a cloud-based server in response to the cloud-based server having received annotation data defining the update to the set of graphical elements from a second client device that is displaying the set of graphical elements;
    updating presentation of the set of graphical elements being displayed by the client device based on the update such that the annotation identifier is presented at a location corresponding to the first graphical element;
    in response to detecting an input selecting the annotation identifier presented at the location corresponding to the first graphical element, transmitting a request to the cloud-based server for the annotation data corresponding to the annotation identifier; and
    in response to receiving the annotation data from the cloud-based server, updating presentation of the set of graphical elements being displayed by the client device based on the annotation data;
    receiving, by the client device, an assignment of the annotation data to a user based on a user identifier received via an input device of the client device; and
    transmitting, by the client device, the assignment and the user identifier to the cloud-based server, the transmitting causing a presentation of a notification on the second client device by the cloud-based server, the notification including an indication that the user has been assigned to the annotation data, text of the comment, the user identifier, and an indication of the annotation data.
  • 12. The non-transitory computer-readable medium of claim 11, wherein the annotation data includes multimedia content.
  • 13. The non-transitory computer-readable medium of claim 11, wherein updating the presentation of the set of graphical elements being displayed by the client device comprises: causing presentation of a user identifier that identifies a user profile associated with the second client device.
  • 14. The non-transitory computer-readable medium of claim 11, the operations further comprising:
    receiving a second update to the set of graphical elements being displayed by the client device, the second update including a second annotation identifier corresponding to a second annotation type and data identifying a second graphical element from the set of graphical elements, the second update to the set of graphical elements having been transmitted by the cloud-based server in response to the cloud-based server having received second annotation data defining the second update to the set of graphical elements from a third client device that is displaying the set of graphical elements; and
    updating presentation of the set of graphical elements being displayed by the client device based on the second update such that the second annotation identifier is presented at a location corresponding to the second graphical element.
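
For illustration only (not part of the claimed subject matter), the following TypeScript sketch traces the client-side flow recited in claims 1, 6, and 11: an update pushed by a cloud-based server is presented as an annotation identifier anchored to a graphical element, selecting that identifier fetches the full annotation data, and an assignment of the annotation to a user is posted back to the server, which notifies the originating client device. The endpoint paths, message shapes, transport, and helper names below are assumptions, since the claims do not prescribe any particular API.

```typescript
// Minimal sketch of the claimed client-side flow. All names, endpoints,
// message shapes, and the WebSocket transport are illustrative assumptions.

interface AnnotationUpdate {
  annotationId: string; // annotation identifier, tied to an annotation type
  annotationType: string;
  elementId: string; // graphical element the annotation is anchored to
}

interface Assignment {
  annotationId: string;
  userId: string; // user identifier entered on this client device
}

const SERVER = "https://annotations.example.com"; // placeholder cloud-based server

// 1. Receive updates pushed by the cloud-based server and present the
//    annotation identifier at the location of the identified element.
const socket = new WebSocket("wss://annotations.example.com/updates");
socket.onmessage = (event) => {
  const update: AnnotationUpdate = JSON.parse(event.data);
  renderMarker(update.elementId, update.annotationId);
};

// 2. When the user selects a marker, request the full annotation data
//    from the server and update the presentation with it.
async function onMarkerSelected(annotationId: string): Promise<void> {
  const response = await fetch(`${SERVER}/annotations/${annotationId}`);
  renderAnnotation(await response.json());
}

// 3. Post an assignment back to the server; the server is then responsible
//    for notifying the client device that created the annotation.
async function assignAnnotation(assignment: Assignment): Promise<void> {
  await fetch(`${SERVER}/assignments`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(assignment),
  });
}

// Placeholder rendering hooks; a real client would update its GUI here.
function renderMarker(elementId: string, annotationId: string): void {
  console.log(`annotation ${annotationId} attached to element ${elementId}`);
}
function renderAnnotation(annotationData: unknown): void {
  console.log("annotation data received:", annotationData);
}
```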
RELATED APPLICATIONS

This application is a continuation of and claims the priority benefit of U.S. patent application Ser. No. 16/046,397, filed Jul. 26, 2018, which is a continuation of and claims the priority benefit of U.S. patent application Ser. No. 15/051,565, filed Feb. 23, 2016, which claims the priority benefit of U.S. Provisional Application No. 62/272,617, filed Dec. 29, 2015, all of which are incorporated by reference herein in their entireties.

Related Publications (1)
Number Date Country
20210027012 A1 Jan 2021 US
Provisional Applications (1)
Number Date Country
62272617 Dec 2015 US
Continuations (2)
Number Date Country
Parent 16046397 Jul 2018 US
Child 17064972 US
Parent 15051565 Feb 2016 US
Child 16046397 US