Schematic and database linking system

Information

  • Patent Grant
  • Patent Number
    11,275,753
  • Date Filed
    Friday, April 20, 2018
  • Date Issued
    Tuesday, March 15, 2022
  • CPC
    • G06F16/252
    • G06F16/248
    • G06F16/24575
    • G06F16/94
    • G06F16/9535
  • Field of Search
    • CPC
    • G06F16/252
    • G06F16/94
    • G06F16/248
    • G06F16/24575
    • G06F16/9535
  • International Classifications
    • G06F16/25
    • G06F16/93
    • G06F16/248
    • G06F16/9535
    • G06F16/2457
  • Disclaimer
    This patent is subject to a terminal disclaimer.
  • Term Extension
    443 days
Abstract
Various systems and methods are provided that display schematics and data associated with the various physical components in the schematics in an interactive user interface. For example, a computing device links data stored in one or more databases with schematics displayed in one or more interactive user interfaces. The computing device parses a digital image that depicts a schematic and identifies text visible in the digital image. Based on the identified text, the computing device recognizes representations of one or more physical components in the schematic and links the representations to data regarding the physical components in one or more databases, such as specification data, historical sensor data of the components, etc. The computing device modifies the digital image such that it becomes interactive and visible in a user interface in a manner that allows the user to select a physical component and view data associated with the selection.
Description
TECHNICAL FIELD

The present disclosure relates to systems and techniques for querying databases and displaying queried data in an interactive user interface.


BACKGROUND

A database may store a large quantity of data. For example, a system may comprise a large number of physical components that are each associated with measurements collected at regular intervals, and the measurements may be stored in the database and/or a system of databases. The measurement data can be supplemented with other data, such as information that describes each physical component, and the supplemental data can also be stored in the database and/or the system of databases. References herein to a “database” may refer to any type of data structure for storing and/or organizing data, including, but not limited to, relational databases (for example, an Oracle database, a MySQL database, and the like), spreadsheets, XML files, and text files, among others.


In some cases, a user may attempt to analyze a portion of the stored data. For example, the user may attempt to analyze a portion of the stored data that is associated with one or more physical components. However, as the number of measurements increases over time, it can become very difficult for the user to identify the relevant data and perform the analysis.


SUMMARY

The systems, methods, and devices described herein each have several aspects, no single one of which is solely responsible for its desirable attributes. Without limiting the scope of this disclosure, several non-limiting features will now be discussed briefly.


Disclosed herein are various systems and methods for displaying schematics (or other visual representations of multi-component systems) and data associated with various physical components in the schematics in an interactive user interface. For example, a computing device may be configured to link data stored in one or more databases with particular portions of schematics that can be displayed in one or more interactive user interfaces. The computing device may parse a digital image that depicts a schematic and identify text visible in the digital image, such as text that may include a component identifier (e.g., model number, name, characteristics, dimensions, etc.). Based on the identified text, the computing device may recognize representations of one or more physical components in the schematic and link the representations to data stored in the database that is associated with the recognized physical components.


In some embodiments, the computing device may modify the original digital image (or generate a mapping of the digital image) such that it becomes interactive and visible in a user interface in a manner that allows the user to select a physical component and view data that has been linked to the selected physical component. For example, in a schematic of a system (e.g., a manufacturing facility's automated fabrication process, which may include thousands of different physical components that provide dynamic data, such as sensor data) the digital image may be interactive such that the user can select a particular component (e.g., a particular sensor) in order to initiate automatic retrieval of various data regarding the sensor, such as information regarding the particular physical sensor component (e.g., specification data of the sensor that has been linked to the particular physical sensor component using the processes discussed herein), characteristics of the physical component, limits or alarms associated with the physical component, graphs that depict sensor data associated with the physical component, graphs that depict a relationship between the physical component and other physical components that are located nearby, current or historical output data of the particular physical sensor component, and/or other information related to the selected physical sensor component. For example, sensor data measurements may be provided in one or more graphs that are each associated with a selected physical component and, in some embodiments, multiple graphs (or other visualizations) of sensor data measurements from multiple different physical components may be concurrently displayed in an interactive user interface. In one embodiment, the graphs may be linked such that manipulating one graph (e.g., zooming in or out, adjusting the time range, etc.) causes an identical or nearly identical manipulation in other graphs that are visible. The graphs may be displayed and/or manipulated in any manner, such as by the techniques disclosed in U.S. Application No. 62/059,601, filed Oct. 3, 2014 and titled “CHRONICLE TIME-SERIES ANALYSIS SYSTEM,” which is hereby incorporated by reference in its entirety.


One aspect of the disclosure provides a computing system configured to access one or more databases in substantially real-time to identify and link data associated with particular physical components with representations of the particular physical components illustrated in a schematic layout of the physical components in an interactive user interface. The computing system comprises a computer processor. The computing system further comprises one or more parts databases storing entries that each include an identity of a physical component and data associated with the respective physical component. The computing system further comprises a computer readable storage medium storing program instructions configured for execution by the computer processor in order to cause the computing system to access a digital image, where the digital image includes a schematic layout of a plurality of physical components; parse the image to identify first text in the image; compare the first text with identities of physical components that are included in the entries stored in the one or more parts databases; identify a first identity stored in the one or more parts databases that matches the first text; retrieve, from the one or more parts databases, data associated with a first physical component identified by the first identity in the one or more parts databases; determine an area covered by the first physical component in the digital image; create a link in a linkage database between the data associated with the first physical component and one or more of the first text in the digital image or the area in the digital image covered by the first physical component; and generate user interface data such that the interactive user interface includes the digital image and a link at one or more of a location of the first text in the digital image or the area covered by the first physical component, where the link, when selected, causes the interactive user interface to display the data associated with the first physical component.


The computing system of the preceding paragraph can include any sub-combination of the following features: where the computer readable storage medium further stores program instructions that cause the computing system to associate, in the linkage database, one or more data series associated with the first physical component, the one or more data series including historical data regarding input values, and output values associated with the first physical component; where the data associated with the first physical component comprises sensor data measured by the first physical component; where the data associated with the first physical component comprises a graph depicting a relationship between the first physical component and other physical components illustrated in the schematic layout; where the computer readable storage medium further stores program instructions that cause the computing system to, in response to a selection of a second physical component in the graph, update the user interface data such that the interactive user interface displays data associated with the second physical component; where the computer readable storage medium further stores program instructions that cause the computing system to, in response to a selection of the first physical component in the interactive user interface, update the user interface data such that the interactive user interface includes a window, where the window includes sensor data measured by the first physical component; where the computer readable storage medium further stores program instructions that cause the computing system to, in response to a selection of a second physical component in the interactive user interface, update the user interface data such that the interactive user interface includes a second window, where the second window includes sensor data measured by the second physical component; where the computer readable storage medium further stores program instructions that cause the computing system to, in response to a selection of first sensor data measured by the first physical component corresponding to a first time, update the user interface data such that the interactive user interface includes a marker in the second window at a location of second sensor data measured by the second physical component that corresponds with the first time; where the computer readable storage medium further stores program instructions that cause the computing system to, in response to a command to zoom in on the window to a first zoom level corresponding to a first data range along a y-axis, update the user interface data such that the interactive user interface zooms in on the second window to the first zoom level so that the sensor data measured by the first physical component is displayed for the first data range along the y-axis and the sensor data measured by the second physical component is displayed for the first data range along the y-axis; where the first user interface includes an index window that lists identities for each of the plurality of physical components illustrated in the schematic layout, and where the computer readable storage medium further stores program instructions that cause the computing system to, in response to a selection of an identity of the first physical component, update the user interface data to adjust a location of the digital image in the interactive user interface such that a representation of the first physical component in the schematic layout is centered in the interactive user interface; 
where the first user interface includes a notes window that identifies previous changes to the schematic layout, and where the computer readable storage medium further stores program instructions that cause the computing system to, in response to a selection of a first note listed in the notes window, update the user interface data such that the interactive user interface identifies a second physical component illustrated in the schematic layout that is associated with the first note; where the computer readable storage medium further stores program instructions that cause the computing system to, in response to a selection of the first physical component in the interactive user interface, update the user interface data such that the interactive user interface displays a note associated with the first physical component; where the computer readable storage medium further stores program instructions that cause the computing system to, in response to a request to animate a flow of data through one or more of the physical components in the plurality of physical components, update the user interface data such that the interactive user interface includes an animation that indicates a sensor value as oil passes through the first physical component at a first time and the sensor value as the oil passes through a second physical component at a second time after the first time; where the computer readable storage medium further stores program instructions that cause the computing system to, in response to an input providing model sensor data for the first physical component, update the user interface data such that the interactive user interface includes a prediction of a sensor value for a second physical component that is coupled to the first physical component; and where the first physical component is an injector on an oil platform.


Another aspect of the disclosure provides a computer-implemented method of accessing one or more databases in substantially real-time to identify and link data associated with particular physical components with representations of the particular physical components illustrated in a schematic layout of the physical components in an interactive user interface. The computer-implemented method comprises accessing a digital image, wherein the digital image includes a schematic layout of a plurality of physical components. The computer-implemented method further comprises parsing the image to identify first text in the image. The computer-implemented method further comprises comparing the first text with identities of physical components that are included in the entries stored in a parts database. The computer-implemented method further comprises identifying a first identity stored in the parts database that matches the first text. The computer-implemented method further comprises retrieving, from the parts database, data associated with a first physical component identified by the first identity in the parts database. The computer-implemented method further comprises determining an area covered by the first physical component in the digital image. The computer-implemented method further comprises creating a link in the parts database between the data associated with the first physical component and one or more of the first text in the digital image or the area in the digital image covered by the first physical component. The computer-implemented method further comprises generating user interface data such that the interactive user interface includes the digital image and a link at one or more of a location of the first text in the digital image or the area covered by the first physical component, where the link, when selected, causes the interactive user interface to display the data associated with the first physical component.


The computer-implemented method of the preceding paragraph can include any sub-combination of the following features: where the data associated with the first physical component comprises sensor data measured by the first physical component and a graph depicting a relationship between the first physical component and other physical components illustrated in the schematic layout; where the method further comprises updating, in response to a selection of a second physical component in the graph, the user interface data such that the interactive user interface displays data associated with the second physical component; where the method further comprises updating, in response to a selection of the first physical component in the interactive user interface, the user interface data such that the interactive user interface includes a window, where the window includes sensor data measured by the first physical component; where the method further comprises updating, in response to a selection of a second physical component in the interactive user interface, the user interface data such that the interactive user interface includes a second window, where the second window includes sensor data measured by the second physical component; and where the method further comprises updating, in response to a selection of first sensor data measured by the first physical component corresponding to a first time, the user interface data such that the interactive user interface includes a marker in the second window at a location of second sensor data measured by the second physical component that corresponds with the first time.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1A illustrates a user interface that displays an unparsed digital image that depicts a schematic layout of various physical components.



FIG. 1B illustrates a user interface that displays a parsed digital image that depicts a schematic layout of various physical components.



FIGS. 2A-B illustrate user interfaces displaying the centering of physical components when selected in an index provided by the parts window.



FIG. 3 illustrates a user interface displaying the indication of a physical component associated with a selected note in a change log provided by the notes window.



FIGS. 4A-F illustrate user interfaces displaying a relationship between two selected physical components in the schematic layout depicted in the digital image.



FIGS. 4G-H illustrate user interfaces displaying the addition of a note to the schematic layout that is associated with a physical component.



FIGS. 5A-C illustrate user interfaces displaying a page providing information about a physical component.



FIGS. 6A-B illustrate user interfaces displaying a page providing information about another physical component that is selected.



FIGS. 7A-C illustrate user interfaces displaying a process for searching for a physical component of a particular interactive schematic or multiple schematics.



FIG. 8 is a flowchart depicting an illustrative operation of linking data with physical components in a schematic.



FIG. 9 illustrates a computer system with which certain methods discussed herein may be implemented, according to one embodiment.





DETAILED DESCRIPTION OF SPECIFIC EMBODIMENTS

Overview


As described above, it can become very difficult for the user to identify relevant data and perform an analysis for a system including thousands, millions, or more different components that interact with one another in various manners, especially when data regarding the physical components may be stored in multiple external databases. In some cases, the user may want to analyze data stored in one or more databases in conjunction with the layout of physical components. For example, the user may view a digital image in a user interface that includes a schematic of various physical components. The schematic may represent an actual device or structure, such as the example manufacturing facility schematics discussed above, a schematic of an electronic circuit, an oil well platform, or any other system including multiple physical components, and may depict relationships between the various physical components. The user may wish to view data measured by or associated with one or more of the physical components; however, there may be no linkage between the schematic and the various data regarding the physical components that is stored in one or more databases. For example, different external or internal data sources may store different types of sensor measurements associated with the physical components. Other external or internal data sources may store various schematics, and still other external or internal data sources may store information related to the physical components (e.g., model number, name, characteristics, dimensions, etc.). The schematic may not be interactive or provide any way for the user to selectively view data stored in one or more internal and/or external data sources that has been associated with physical components in the schematic by way of the process described herein, such as with reference to FIG. 8, while also viewing the schematic. Advantageously, such interactivity would allow the user, even an unskilled user, to easily view various types of data regarding components of interest while also viewing the physical location of the components with reference to one another on the schematic. Furthermore, it would advantageously reduce the need for the user to open a new window or separately access data sources to find the specific data associated with physical components of interest.


Accordingly, disclosed herein are various systems and methods for displaying schematics (or other visual representations of multi-component systems) and data associated with the various physical components in the schematics in an interactive user interface. For example, a computing device (e.g., the computing system 1000 of FIG. 9 described below) may be configured to link data stored in one or more databases with particular portions of schematics that can be displayed in one or more interactive user interfaces. The computing device may parse a digital image that depicts a schematic and identify text visible in the digital image, such as text that may include a component identifier (e.g., model number, name, characteristics, dimensions, etc.). Based on the identified text, the computing device may recognize representations of one or more physical components in the schematic and link the representations (or specific locations of the schematic in which the physical component is located) to various data associated with the physical components, which may be stored in one or more databases. In the embodiment of FIG. 9, the computing system 1000 may store (or have access to) a linkage database that stores associations between physical components in a schematic and the various information regarding the physical components, which may be copied into the linkage database itself and/or included in the linkage database as a reference to one or more other data sources.
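As a rough illustration (not part of the patent's disclosure), such a linkage database might be organized along the following lines. This is a minimal sketch: the use of SQLite, the table name, and the column names are assumptions chosen for the example, not taken from the patent.

```python
# Hypothetical linkage-database schema (illustrative names only).
# Each row ties a region of a schematic image to a parts-database
# entry, either by copying data into the row or by referencing an
# external data source.
import sqlite3

conn = sqlite3.connect("linkage.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS component_links (
    schematic_id   TEXT NOT NULL,   -- which digital image
    component_text TEXT NOT NULL,   -- recognized identifier, e.g. "RS-128"
    x INTEGER, y INTEGER,           -- top-left of the linked region, in pixels
    width INTEGER, height INTEGER,  -- area covered by the component
    parts_db       TEXT,            -- which parts database holds the entry
    parts_entry_id TEXT,            -- key of the matching parts entry
    copied_data    TEXT             -- optional data copied into the linkage DB
);
""")
conn.commit()
```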


In some embodiments, the computing device may modify the original digital image such that it becomes interactive and visible in a user interface in a manner that allows the user to select a physical component and view data associated with the selected physical component, such as data that is associated with the physical component in the linkage database and/or that is in external data sources at specific locations identified in the linkage database. For example, the digital image may be interactive such that the user can view sensor data measurements in one or more graphs that are each associated with a different physical component. The graphs may be linked such that manipulating one graph (e.g., zooming in or out, adjusting the time range, etc.) causes an identical or nearly identical manipulation in other graphs that are visible. As another example, the digital image may be interactive such that the user can select a physical component in the user interface, which causes the user interface to display information related to the physical component, such as an identification of the physical component, characteristics of the physical component, limits or alarms associated with the physical component, graphs that depict sensor data associated with the physical component, and/or graphs that depict a relationship between the physical component and other physical components that are located nearby.


A schematic can be one of many schematics that are associated with each other. Thus, the interactive digital image depicting one schematic can include one or more links or references to other interactive digital images that depict the other associated schematics. When a user selects a link or reference to another interactive digital image, the linked or referenced interactive digital image may be opened and displayed in the user interface (e.g., in the same window as the interactive digital image the user was initially viewing or in a separate window).


The systems and methods described herein may provide several benefits. For example, the systems and methods described herein may improve the usability of the user interface by providing graphs overlaying a digital image depicting a schematic that can be manipulated by a user in a concurrent manner, thereby allowing the user to identify trends or other information associated with the physical components included in the schematic without having to separately manipulate each individual graph. As another example, the systems and methods described herein may reduce the processor load by linking data associated with the physical components with the graphical representation of the physical components in the digital image. Because the system compiles data and/or references to data associated with components in a linkage database, such data can be automatically and quickly (e.g., in real-time) accessed by the system and displayed to the user. Rather than requiring the user to manually access multiple data sources, possibly in multiple different software applications (e.g., a browser, a proprietary parts information application, etc.) in multiple windows, while trying to match up the various data regarding physical components associated with the schematic in another window (e.g., an image viewer), the user can view all relevant information from the multiple data sources automatically, overlaid on (or otherwise integrated with) the schematic, and displayed within one window. As another example, the systems and methods described herein may reduce the latency experienced by the user in identifying relevant data associated with one or more physical components of a system by parsing the digital image and linking the relevant data stored in the linkage database with the identified physical components before the digital image is presented to the user. Additionally, the systems and methods discussed herein advantageously allow the user to view sensor output data, and even graphs of historical sensor data, for each of multiple sensors of a system, concurrently within a single user interface and with interactivity between the graphs of historical sensor data. Accordingly, these systems and methods may allow the user to identify correlations between sensors that may not be possible to identify using existing data analysis systems.


Example Parsing and Manipulation of a Digital Image in an Interactive User Interface



FIG. 1A illustrates a user interface 100 that displays an unparsed digital image 102 that depicts a schematic layout of various physical components. In this example, the physical components include a battery 110, a lamp 112, a resistor 114, an oscillator 116, a resistor 118, a resistor 120, a resistor 122, a capacitor 124, a capacitor 126, a resistor 128, a resistor 130, a capacitor 132, a resistor 134, a bell 136, a loop antenna 138, a speaker 140, an inductor 142, and a switch 144. While FIG. 1A depicts electrical physical components, this is not meant to be limiting. The embodiments described herein may relate to any type of physical component, such as mechanical components, devices present on an oil well platform, etc.


In an embodiment, the digital image 102 further includes a title box 104 that identifies the structure made up of the displayed physical components and a notes box 106 that identifies changes that have been made to the schematic over time. As illustrated in FIG. 1A, the digital image 102 may be a static image that has not been parsed. Thus, the user interface 100, at the portion that displays the digital image 102, may not be interactive. In some embodiments, the digital image is a scan of a physical printout or drawing of a schematic, such as may be generated with a scanner, camera, or other imaging equipment. In other embodiments, the digital image 102 is a computer aided design (CAD) image in one of various available file formats, or an image file generated from the CAD image, or any other digital format.



FIG. 1B illustrates the user interface 100 with a parsed digital image 103 that depicts a schematic layout of the various physical components included in the schematic that are have now been modified to become interactive. For example, the computing system 1000 may parse the digital image 102 and identify text, such as the outlines, shapes, colors, etc. of identified physical components, names or other textual (e.g., alphanumeric part numbers) information identifying the physical components, the text in the title box 104, and/or the text in the notes box 106. A database, for example, may be queried to determine whether any of the identified text corresponds to an identity of physical components found in entries of the database and/or to data stored in the database. If the query yields a match or close match (e.g., the text and an identity of the physical component match within a few letters), then the data associated with the identity of the physical component may be linked with the identified text. A separate page, not immediately visible to the user in the user interface 100, may be generated that includes some or all of the associated data and may be accessible upon a selection of a representation of a physical component in the digital image 103, as described in greater detail below. In a similar manner, shapes, colors, patterns, etc. of components may be detected and compared to those same types of features associated with known physical components (e.g., in manufacture specification manuals) in order to more positively identify a particular physical component. In some embodiments, information from multiple data sources may be used in order to identify (e.g., to a required confidence level) a particular physical component. For example, if text associated with a physical component is parsed and found to be associated with six different physical components, while the shape, size, outline, and/or other physical characteristics of the physical component are matched up with three possible physical components, an overlapping single physical component in each matched set may be determined to be the actual physical component in the schematic.
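The "match or close match" comparison described above could be sketched with Python's standard difflib; the cutoff value and the identifier strings below are illustrative assumptions, not values from the patent.

```python
import difflib

def match_component(ocr_text, known_identities, cutoff=0.8):
    """Return the parts-database identity that best matches recognized
    text, tolerating small errors ("within a few letters"), or None."""
    hits = difflib.get_close_matches(ocr_text, known_identities, n=1, cutoff=cutoff)
    return hits[0] if hits else None

# Usage: a misread of "RS-128" still resolves to the stored identity.
identities = ["RS-128", "RS-120", "CP-124", "IN-142"]
print(match_component("R5-128", identities))  # -> "RS-128"
```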


In an embodiment, once the text is identified, the text or a location near a physical component that corresponds with the text may be replaced with a link, or a link layer including the link may be overlaid on the schematic image, or some other software that tracks locations of user selections on the schematic may be used to determine a location of the user selection and initiate access of information regarding any physical components at that selected schematic location using the linkage database. The link may redirect the user to another page within the user interface 100 that provides additional information on the selected item. The link may also or in the alternative cause the user interface 100 to modify a placement of the digital image 103 (e.g., shift the digital image 103 to the right) or cause another window to appear in the user interface 100 to provide additional information on the selected item. For example, the text associated with the physical components in the schematic may include a link (as evidenced by the boxes surrounding the text associated with the physical components in FIG. 1B), and the text in the title box 104 may include a link and the text in the notes box 106 may include a link (as evidenced by the underlined text in FIG. 1B).
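One hedged sketch of how software that tracks the locations of user selections might resolve a click against the linked regions follows; the dictionary field names mirror the hypothetical linkage schema sketched earlier.

```python
def component_at(selection_x, selection_y, links):
    """Given a click location on the schematic, return the linked
    component whose stored area contains the point, if any.
    `links` is an iterable of rows like those in the schema sketch."""
    for link in links:
        if (link["x"] <= selection_x < link["x"] + link["width"]
                and link["y"] <= selection_y < link["y"] + link["height"]):
            return link
    return None
```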


The user interface 100 may further include a parts window 108 that provides an index of physical components that have been identified in the digital image 102. The parts window 108 may overlay the digital image 102 or may be displayed in a separate window (not shown).


The parts window 108 may be interactive such that the user can select a physical component from the index and be presented with a location of the physical component in the digital image 103. For example, FIGS. 2A-B illustrate user interfaces 200 displaying the centering of physical components when selected in an index provided by the parts window 108. As illustrated in FIG. 2A, the user selects battery 110 in the index via a cursor 210. Selection may include clicking on the text in the index, hovering over the text in the index, providing a voice or key command as the cursor 210 is placed over the text in the index, and/or the like. Upon the selection of the battery 110 in the index, the user interface 200 adjusts the location of the digital image 103 such that the representation of the battery 110 in the schematic is centered. Likewise, upon the selection of the speaker 140 in the index, as illustrated in FIG. 2B, the user interface 200 adjusts the location of the digital image 103 such that the representation of the speaker 140 in the schematic is centered. In some embodiments, selection of a component may be indicated in the schematic using other visual distinctions, such as highlighting the part and/or part identifier or causing them to blink.
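The centering behavior could be computed along these lines: given the area linked to a component, derive the pan offset that centers it in the viewport. A minimal sketch with illustrative names, not code from the patent.

```python
def pan_to_center(component, viewport_w, viewport_h):
    """Compute the image offset that centers a linked component's area
    in the viewport (positive offsets shift the image left/up)."""
    cx = component["x"] + component["width"] / 2
    cy = component["y"] + component["height"] / 2
    return cx - viewport_w / 2, cy - viewport_h / 2
```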



FIG. 3 illustrates a user interface 300 displaying the indication of a physical component associated with a selected note in a change log provided by the notes window 106. For example, the schematic illustrated in the digital image 103 may have been modified at previous times, and the notes window 106 may provide notes that indicate what physical components were changed and/or how the schematic was modified.


As illustrated in FIG. 3, the user may select a first note in the notes window 106 via the cursor 210. The first note may indicate that a capacitor CP-25 was replaced with the capacitor 124. Thus, an icon or marker, such as a bubble 310, may appear in the user interface 300 to highlight the physical component associated with the first note (e.g., the capacitor 124).



FIGS. 4A-F illustrate user interfaces 400 displaying a relationship between two selected physical components in the schematic layout depicted in the interactive digital image 103. As illustrated in FIG. 4A, the user may select the inductor 142 via the cursor 210.


As illustrated in FIG. 4B, selection of the inductor 142 may cause a window 410 to appear in the user interface 400. The window 410 may provide additional information associated with the inductor 142. For example, the window 410 may depict a graph showing measured sensor data associated with the inductor 142 and a description of the type and/or purpose of the inductor 142. The window 410 may also depict an indication of a lower limit (e.g., 1 ampere), represented by “LL” in FIG. 4B, and an indication of an upper limit (e.g., 2 amperes), represented by “HH” in FIG. 4B. If the user is viewing historical sensor data in the graph, an alert may be generated when a portion of the historical sensor data illustrated in the graph reaches the lower or upper limit. Likewise, if the user is viewing current (e.g., real-time) sensor data in the graph, an alert may be generated at the time or immediately after the time the measured sensor data reaches the lower or upper limit. The alert may appear in a window overlaying the user interface 400, as a message appearing in the user interface 400, in a separately generated user interface, and/or the like. In addition or in the alternative, the limits and/or other rules associated with a sensor's output (or a combination of sensor outputs) may be applied to real-time data even when a user is not viewing the data, such that alerts may be generated and communicated to the user via an electronic message (e.g., text message, electronic mail, instant message, etc.), which may encourage the user to view the sensor data in the user interface 400 in order to determine if preventative action should be taken (e.g., replacing a failing part). The window 410 may also include a part page button 415 or another control (such as a link) that, when selected, causes the user interface 400 to display a page providing more information about the inductor 142, such as information associated with the particular part number of the inductor 142, which may be included in the linkage database itself or accessible at a location included in the linkage database.
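The lower-limit/upper-limit alerting described above reduces to a threshold check over the sensor series; a minimal sketch, assuming samples arrive as (timestamp, value) pairs.

```python
def check_limits(samples, lower, upper):
    """Yield (timestamp, value, which_limit) alerts whenever a sensor
    reading crosses the lower ("LL") or upper ("HH") limit."""
    for timestamp, value in samples:
        if value <= lower:
            yield timestamp, value, "LL"
        elif value >= upper:
            yield timestamp, value, "HH"

# Usage: 1 A lower limit and 2 A upper limit, as in the FIG. 4B example.
readings = [(0, 1.5), (1, 2.3), (2, 0.9)]
for alert in check_limits(readings, lower=1.0, upper=2.0):
    print(alert)  # (1, 2.3, 'HH') then (2, 0.9, 'LL')
```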


As illustrated in FIG. 4C, the user has selected the resistor 128 via the cursor 210 while the window 410 is still open. As illustrated in FIG. 4D, selection of the resistor 128 may cause another window 420 to appear in the user interface 400. In this example, the window 420 provides additional information associated with the resistor 128, such as the information described above with respect to the inductor 142.


As illustrated in FIG. 4E, the graphs in the windows 410 and 420 may be linked. For example, the user may place the cursor 210 over a portion of the graph in the window 420 that corresponds with a first time, causing a marker 435 to appear in the graph in the window 410 at a location that also corresponds with the first time. Thus, a user can easily identify values of inputs and/or outputs of each of multiple components in a system at the same time or time period.


As illustrated in FIG. 4F, the user may manipulate the graph in one window, causing an identical or nearly identical manipulation in the graph in another window. For example, the user may zoom in on the graph in the window 420 (e.g., by changing the y-axis in the graph in the window 420 without changing the time range on the x-axis). In response to the zoom command, the graph in the window 410 may zoom in as well (e.g., by changing the y-axis in the graph in the window 410 in the same manner as with the graph in the window 420 without changing the time range on the x-axis). In some embodiments, the graphs in the windows 410 and 420 are manipulated such that they depict the same zoom level (e.g., the user may zoom to a first level in the graph in the window 420, and the graph in the window 410 may be zoomed to the first level as well).
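A compact sketch of the linked-graph behavior: a zoom applied to one graph is propagated to every peer so all visible graphs show the same range. The class and method names are assumptions for illustration only.

```python
class LinkedGraph:
    """Minimal sketch of linked graphs with synchronized zoom."""
    def __init__(self):
        self.peers = []
        self.y_range = (0.0, 10.0)

    def link(self, other):
        self.peers.append(other)
        other.peers.append(self)

    def zoom(self, y_min, y_max, _from_peer=False):
        self.y_range = (y_min, y_max)  # x-axis/time range left unchanged
        if not _from_peer:
            for peer in self.peers:
                peer.zoom(y_min, y_max, _from_peer=True)

# Usage: zooming the window-420 graph also zooms the window-410 graph.
g410, g420 = LinkedGraph(), LinkedGraph()
g410.link(g420)
g420.zoom(0.5, 2.5)
assert g410.y_range == (0.5, 2.5)
```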



FIGS. 4G-H illustrate user interfaces 450 displaying the addition of a note associated with a physical component. As illustrated in FIG. 4G, the user may select the loop antenna 138 via the cursor 210. As illustrated in FIG. 4H, upon selection of the loop antenna 138, a notes window 460 appears in the user interface 450 near a location of the loop antenna 138, displaying notes that had previously been associated with the physical component in some embodiments. The user may enter text in the notes window 460 and the notes window 460 may stay visible in the user interface 450 until the user closes the notes window 460. In some implementations, the notes window 460 is associated with the schematic layout and automatically appears when a user later reloads the schematic layout, without any additional user action.



FIGS. 5A-C illustrate user interfaces 500 displaying a page providing information about a physical component. As described above, the window 420 is associated with the resistor 128. As illustrated in FIG. 5A, the user may select the part page button 425 in the window 420 via the cursor 210.


Upon selection of the part page button 425, the user interface 500 displays a page 502 that provides additional information about the resistor 128, as illustrated in FIG. 5B. For example, the page 502 may include specifications of the resistor 128, limits and alarms associated with the resistor 128, a graph depicting sensor data measured by the resistor 128 or detected on the resistor 128 (e.g., a current, a voltage, etc.), and/or other physical components and their relation to the resistor 128.


In an embodiment, the relationship between the resistor 128 and the other physical components may be generated based on the schematic layout in the digital image 103. For example, by parsing the digital image 103, the computing system 1000 may recognize connectors (e.g., conductors, wires, pipes, conduits, etc.) that connect one or more physical components together. Based on these recognized connectors, the computing system 1000 may generate a graph (or other visualization) that depicts the connections between physical components in a block diagram 510 and/or the direction of flow of substances between physical components (as represented by arrows in FIGS. 5A-5C). As shown in the block diagram 510, a block representing the resistor 128 may be in the center and physical components coupled to the resistor 128 may be depicted to the right and left of the resistor 128. The block diagram 510 may further depict physical components coupled to the physical components coupled to the resistor 128, and so on.
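One way to derive the centered block diagram from the recognized connectors is a breadth-first walk over an adjacency list, sketched below; the component identifiers are illustrative, and the real system would presumably also track flow direction for the arrows.

```python
from collections import defaultdict

def build_block_diagram(connectors, center, depth=2):
    """From recognized connectors (pairs of component identifiers),
    build an adjacency list and collect the components within `depth`
    hops of the selected component, mirroring the centered diagram."""
    adjacency = defaultdict(set)
    for a, b in connectors:
        adjacency[a].add(b)
        adjacency[b].add(a)
    frontier, seen = {center}, {center}
    for _ in range(depth):
        frontier = {n for c in frontier for n in adjacency[c]} - seen
        seen |= frontier
    return adjacency, seen

# Usage: components one and two hops from the resistor "RS-128".
connectors = [("RS-128", "IN-142"), ("RS-128", "RS-120"), ("RS-120", "CP-124")]
_, nearby = build_block_diagram(connectors, "RS-128")
print(sorted(nearby))  # ['CP-124', 'IN-142', 'RS-120', 'RS-128']
```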


As illustrated in FIG. 5C, the page 502 may further include a notes section that may provide information about the resistor 128 and a drawings section that includes icons representing schematics in which the resistor 128 can be found. The image depicted in an icon shows the location around the part or component that was selected (e.g., resistor 128 in this case). For example, the icon 520 may be selectable and, upon selection by the user, may redirect the user such that the user interface 500 displays the digital image illustrating the schematic associated with the icon 520 centered and/or zoomed in on the selected part or component (e.g., selection of the icon 520 causes the user interface 500 to display the digital image 103 centered and/or zoomed in on the resistor 128, in this case).



FIGS. 6A-B illustrate user interfaces 600 displaying a page providing information about another physical component that is selected. As illustrated in FIG. 6A, the user may select a block representing the resistor 120 in the block diagram 510 via the cursor 210. Upon selection of the block representing the resistor 120, the user interface 600 may display a page 602 associated with the resistor 120.


As shown in page 602, the page 602 may include specifications associated with the resistor 120, limits and alarms associated with the resistor 120, a graph depicting sensor data measured by the resistor 120 or detected on the resistor 120 (e.g., a current, a voltage, etc.), and a block diagram 610 depicting the relationship between the resistor 120 and physical components located nearby. Like the block diagram 510, the block diagram 610 may depict the block representing the resistor 120 in the center.


In a further embodiment, the user can view an animation of a substance (e.g., oil, water, gas, current, voltage, etc.) as it is expected to pass through various physical components depicted in the digital image 103 (or in the block diagrams 510 and/or 610). The animation can be used to visualize historical data over time (e.g., historical sensor data that indicates how the substance has flowed through the physical components in the past in the current configuration of components) or to simulate the operation of the physical components in the current configuration of components using hypothetical data (e.g., a physical model describing the physical components in the current configuration and/or sample data to use in the simulation). For example, the user can select a physical component to serve as a starting point for the animation, choose a substance, and request to view an animation of the substance as it flows through the selected physical component as well as nearby physical components. The user can select whether the animation is to visualize historical data or whether the animation is to simulate a hypothetical scenario. The user can also select a time range for the animation (e.g., 1 minute, 1 hour, 1 day, 1 week, 1 month, 1 year, etc.). The user interface 100, 200, 300, 400, 450, 500, or 600 may include playback controls such that the user can play, pause, stop, rewind, and/or fast forward the animation.


If the user has indicated that the animation is for visualizing historical data, when the user selects the play option, the animation can show values associated with the substance historically measured at individual time instants as the substance passes through physical components during the selected time period (e.g., the historically measured values may be overlaid on the various components depicted in the user interface 100, 200, 300, 400, 450, 500, or 600). The animation may also or in the alternative be set to include a select number of physical components (e.g., the animation may progress until the substance reaches all selected physical components). The physical components that are a part of the animation may be highlighted or otherwise emphasized in the digital image 103 (or in the block diagrams 510 and/or 610). A timer may be displayed in the user interface 100, 200, 300, 400, 450, 500, or 600 to indicate a time that the substance reaches each physical component. Thus, the user may be able to visualize the flow of a substance as it passed through various physical components. The user can use this information to identify issues with the physical components, possible ways to reroute the substance if an issue with a particular physical component occurs, identify ways to efficiently route the substance, and/or the like.


If the user has indicated that the animation is for simulating hypothetical data, when the user selects the play option, the animation can simulate how changes to physical components or settings of physical components may impact operation of the system by displaying simulated values overlaid on the various components depicted in the user interface 100, 200, 300, 400, 450, 500, or 600. The user interface 100, 200, 300, 400, 450, 500, or 600 may display the same or similar information as displayed when the user has indicated that the animation is for visualizing historical data. For example, the animation may also or in the alternative be set to include a select number of physical components (e.g., the animation may progress until the substance reaches all selected physical components). The physical components that are a part of the animation may be highlighted or otherwise emphasized in the digital image 103 (or in the block diagrams 510 and/or 610). A timer may be displayed in the user interface 100, 200, 300, 400, 450, 500, or 600 to indicate a time that the substance reaches each physical component. Thus, the user may be able to visualize and model the flow of a substance as it would pass through various physical components. As described above, the user can use this information to identify issues with the physical components in a simulated configuration, possible ways to reroute the substance if an issue with a particular physical component is expected to occur, identify ways to efficiently route the substance, and/or the like.
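Both animation modes reduce to stepping values along a flow path; a minimal sketch follows, assuming per-component time-indexed histories (historical playback) or simulated values in the same shape (hypothetical mode), and a constant travel time per hop for simplicity. All names are illustrative.

```python
def animate_flow(path, history, start_time, travel_time):
    """Yield (time, component, sensor_value) steps as a substance moves
    along `path`; `history` maps component -> {time: value}. Historical
    playback and hypothetical simulation differ only in where the
    values come from."""
    t = start_time
    for component in path:
        value = history.get(component, {}).get(t)
        yield t, component, value
        t += travel_time

# Usage: replaying values as oil reaches each component in turn.
history = {"INJ-1": {0: 3.1}, "SEP-1": {5: 2.8}}
for step in animate_flow(["INJ-1", "SEP-1"], history, start_time=0, travel_time=5):
    print(step)  # (0, 'INJ-1', 3.1) then (5, 'SEP-1', 2.8)
```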


Additionally, any graphs of sensor data that are displayed while an animation is presented (either based on historical or hypothetical data) may be updated to include an indicator on the graph associated with the current state of that sensor in the animation, where the indicator may dynamically move (e.g., from left to right across the graph) as the animation is played.



FIGS. 7A-C illustrate user interfaces 700 displaying a process for searching for a physical component of a particular interactive schematic or multiple schematics. As illustrated in FIG. 7A, the user may enter a search term in a search field 702. For example, the user may search for the term “RS-1,” which refers to a resistor, and confirm the selection by selecting search button 704. In some embodiments, a similar search interface may be included as part of the user interfaces depicting portions of the interactive schematic also.


As illustrated in FIG. 7B, upon selection of the search button 704, the user interface 700 may include a list of selectable results 710, 712, 714, and/or 716 that correspond with the search term. For example, the results 710, 712, 714, and/or 716 may be selectable icons that represent graphical representations of various physical components matching or nearly matching the search term, “RS-1” in this example. In an embodiment, the user selects the search result 712, which corresponds with the resistor 128, via the cursor 210.


As illustrated in FIG. 7C, upon selection of the search result corresponding to the resistor 128, the user interface 700 may depict the page 502 corresponding to the resistor 128. The user may then be able to view information about the resistor 128 and/or view schematics in which the resistor 128 can be found, as described above. Depending on user preferences and/or system defaults, selection of a search result may result in display of the interactive schematic where the selected component is included, along with some visual effect to make it easier for the user to locate the selected physical component.
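The search behavior might run against the linkage index along these lines; a sketch with a simple substring match and illustrative field names (the fuzzy matcher shown earlier could be substituted).

```python
def search_components(term, links):
    """Return (schematic_id, component_text) pairs whose identifier
    matches the search term, e.g. "RS-1"."""
    term = term.upper()
    return [(link["schematic_id"], link["component_text"])
            for link in links
            if term in link["component_text"].upper()]

# Usage: "RS-1" matches both RS-128 and RS-120.
links = [{"schematic_id": "radio-v2", "component_text": "RS-128"},
         {"schematic_id": "radio-v2", "component_text": "RS-120"},
         {"schematic_id": "radio-v2", "component_text": "CP-124"}]
print(search_components("RS-1", links))
```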


Example Use Case of an Interactive User Interface Depicting a Parsed Digital Image


As an example, the user interface may include an interactive schematic that illustrates various selectable physical components that may be a part of an oil well platform, such as injectors, wellhead test separators, wellhead production separators, and/or the like. The user interface may also include an index that lists the physical components that can be found in the schematic of the oil well platform. Selecting any of the physical components depicted in the user interface may cause the user interface to display a page providing more information about the selected physical component, as described herein.


Example Process Flow



FIG. 8 is a flowchart 900 depicting an illustrative operation of linking data with physical components in a schematic. Depending on the embodiment, the method of FIG. 8 may be performed by various computing devices, such as by the computing system 1000 described below. Depending on the embodiment, the method of FIG. 8 may include fewer and/or additional blocks and the blocks may be performed in an order different than illustrated.


In block 902, a digital image is accessed. The digital image may include a schematic layout of a plurality of physical components. For example, the physical components may be components or devices found in an oil well platform.


In block 904, the image is parsed to identify first text in the image. The first text in the image may correspond with a physical component depicted in the schematic.


In block 906, the first text is compared with identities of physical components that are included in entries stored in one or more parts databases. For example, parts databases may include entries for various physical components that include the physical component's identity, sensor data, characteristics, descriptions, and/or the like. Different parts databases may be used for different types of components. For example, in the oil well platform example above, a first parts database may be accessed to get information on electrical sensor components, while another parts database is accessed to obtain data about fluid valves, pipes, and components. As noted above, information located in the multiple databases may be copied to the linkage database and/or may be referred to in the linkage database so that it is easily accessible to users.


In block 908, a first identity stored in a parts database (of possibly multiple databases that are searched) is identified that matches the first text. For example, the first text may be the name of a physical component that is also found in the parts database.


In block 910, data associated with a first physical component identified by the first identity in the parts database may be retrieved from the parts database. For example, the data may include characteristics of the first physical component, sensor data measured by or derived from the first physical component, and/or the like.


In block 912, an area covered by the first physical component is determined in the digital image. For example, the area covered by the first physical component may be near or at the same location as the first text.


In block 914, a link is created in the linkage database between the located data associated with the first physical component and one or more of the first text in the digital image or the area covered by the first physical component. Such linkage allows the system to, upon selection of the first physical component in the schematic, access the data referenced in the linkage database and display the data according to user preferences.


In block 916, user interface data is generated such that the interactive user interface includes the digital image and a link at one or more of a location of the first text in the digital image or the area covered by the first physical component. In an embodiment, selection of the first text or the area covered by the first physical component may result in a new window appearing in the interactive user interface displaying the linked data or a new page appearing in the interactive user interface that displays the linked data.
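Blocks 902 through 916 can be read as a single pipeline. The following is a hedged sketch under stated assumptions: `ocr_words` is an assumed helper (standing in for whatever text-recognition step block 904 uses) that yields (text, x, y, w, h) for each word in the image, `parts_db` maps component identity to its associated data, and the area covered by a component (block 912) is simplified to the bounding box of its identifying text.

```python
import difflib

def link_schematic(image, ocr_words, parts_db):
    """Sketch of blocks 902-916 (names and structure are illustrative)."""
    linkage, ui_links = [], []                                    # block 902: `image` accessed by caller
    identities = list(parts_db)                                   # block 906: identities to compare against
    for text, x, y, w, h in ocr_words(image):                     # block 904: parse image for text
        match = difflib.get_close_matches(text, identities, n=1)  # block 908: find matching identity
        if not match:
            continue
        identity = match[0]
        data = parts_db[identity]                                 # block 910: retrieve associated data
        area = (x, y, w, h)                                       # block 912: area covered (simplified)
        linkage.append({"identity": identity, "area": area,
                        "data": data})                            # block 914: create the link
        ui_links.append((area, identity))                         # block 916: selectable link for the UI
    return linkage, ui_links

# Usage with a stubbed OCR helper and a one-entry parts database.
fake_ocr = lambda img: [("RS-128", 40, 60, 50, 12), ("title", 0, 0, 30, 10)]
parts = {"RS-128": {"type": "resistor", "ohms": 470}}
linkage, ui_links = link_schematic(None, fake_ocr, parts)
print(ui_links)  # [((40, 60, 50, 12), 'RS-128')]
```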


In an embodiment, data associated with a specific physical component is automatically linked to the physical component (e.g., if there is only one part of a particular part number in a system, historical sensor data stored in one or more databases in association with the part number may be automatically linked to the physical component identified in the schematic as being associated with the part number). In other embodiments, a user is provided with a user interface that allows for the linking of physical components located in the schematic to other data (e.g., historical sensor data) of the physical components. In addition, as future sensor data is added to the one or more databases in association with the other data, the future sensor data may be automatically linked to the located physical components such that the future sensor data can be viewed in the user interface.
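The automatic linking of future sensor data might look like the following sketch, keyed by part number; all names are illustrative assumptions.

```python
def ingest_sensor_row(part_number, timestamp, value, series, linkage):
    """Append an incoming measurement to the series for `part_number`;
    if that part is already linked to a component in a schematic, the
    new point appears on the component's graph the next time it is
    selected, with no re-parsing of the digital image required."""
    series.setdefault(part_number, []).append((timestamp, value))
    return any(link["identity"] == part_number for link in linkage)
```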


Implementation Mechanisms


According to one embodiment, the techniques described herein are implemented by one or more special-purpose computing devices. The special-purpose computing devices may be hard-wired to perform the techniques, or may include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general purpose hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination. Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques. The special-purpose computing devices may be desktop computer systems, server computer systems, portable computer systems, handheld devices, networking devices or any other device or combination of devices that incorporate hard-wired and/or program logic to implement the techniques.


Computing device(s) are generally controlled and coordinated by operating system software, such as iOS, Android, Chrome OS, Windows XP, Windows Vista, Windows 7, Windows 8, Windows Server, Windows CE, Unix, Linux, SunOS, Solaris, Blackberry OS, VxWorks, or other compatible operating systems. In other embodiments, the computing device may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide user interface functionality, such as a graphical user interface (“GUI”), among other things.


For example, FIG. 9 is a block diagram that illustrates a computer system 1000 upon which an embodiment may be implemented. For example, any of the computing devices discussed herein may include some or all of the components and/or functionality of the computer system 1000.


Computer system 1000 includes a bus 1002 or other communication mechanism for communicating information, and a hardware processor, or multiple processors, 1004 coupled with bus 1002 for processing information. Hardware processor(s) 1004 may be, for example, one or more general purpose microprocessors.


Computer system 1000 also includes a main memory 1006, such as a random access memory (RAM), cache and/or other dynamic storage devices, coupled to bus 1002 for storing information and instructions to be executed by processor 1004. Main memory 1006 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 1004. Such instructions, when stored in storage media accessible to processor 1004, render computer system 1000 into a special-purpose machine that is customized to perform the operations specified in the instructions. Main memory 1006 may also store cached data, such as zoom levels and maximum and minimum sensor values at each zoom level.
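

As a rough illustration (hypothetical names; the specification does not prescribe this structure), such cached data might map each zoom level to the minimum and maximum sensor values computed for it, so that returning to a previously visited zoom level avoids recomputing or re-querying the range.

```python
# A minimal sketch of a per-zoom-level cache; names are hypothetical.
zoom_cache = {}  # zoom_level -> (min_value, max_value)

def value_range(zoom_level, fetch_range):
    """Return the cached (min, max) for a zoom level, computing it once."""
    if zoom_level not in zoom_cache:
        zoom_cache[zoom_level] = fetch_range(zoom_level)
    return zoom_cache[zoom_level]

# fetch_range stands in for a database query that aggregates the readings
# visible at the given zoom level.
print(value_range(3, lambda z: (11.2, 87.9)))   # computed and cached
print(value_range(3, lambda z: (0.0, 0.0)))     # served from cache
```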


Computer system 1000 further includes a read only memory (ROM) 1008 or other static storage device coupled to bus 1002 for storing static information and instructions for processor 1004. A storage device 1010, such as a magnetic disk, optical disk, or USB thumb drive (Flash drive), etc., is provided and coupled to bus 1002 for storing information and instructions. For example, the storage device 1010 may store measurement data obtained from a plurality of sensors.


Computer system 1000 may be coupled via bus 1002 to a display 1012, such as a cathode ray tube (CRT) or LCD display (or touch screen), for displaying information to a computer user. For example, the display 1012 can be used to display any of the user interfaces described herein with respect to FIGS. 1A through 8. An input device 1014, including alphanumeric and other keys, is coupled to bus 1002 for communicating information and command selections to processor 1004. Another type of user input device is cursor control 1016, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 1004 and for controlling cursor movement on display 1012. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane. In some embodiments, the same direction information and command selections as cursor control may be implemented via receiving touches on a touch screen without a cursor.


Computing system 1000 may include a user interface module to implement a GUI that may be stored in a mass storage device as executable software codes that are executed by the computing device(s). This and other modules may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.


In general, the word "module," as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, Java, Lua, C or C++. A software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software modules configured for execution on computing devices may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, magnetic disc, or any other tangible medium, or as a digital download (and may be originally stored in a compressed or installable format that requires installation, decompression or decryption prior to execution). Such software code may be stored, partially or fully, on a memory device of the executing computing device, for execution by the computing device. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. The modules or computing device functionality described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.


Computer system 1000 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs computer system 1000 to be a special-purpose machine. According to one embodiment, the techniques herein are performed by computer system 1000 in response to processor(s) 1004 executing one or more sequences of one or more instructions contained in main memory 1006. Such instructions may be read into main memory 1006 from another storage medium, such as storage device 1010. Execution of the sequences of instructions contained in main memory 1006 causes processor(s) 1004 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.


The term "non-transitory media," and similar terms, as used herein refers to any media that store data and/or instructions that cause a machine to operate in a specific fashion. Such non-transitory media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 1010. Volatile media includes dynamic memory, such as main memory 1006. Common forms of non-transitory media include, for example, a floppy disk, a flexible disk, hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, any other memory chip or cartridge, and networked versions of the same.


Non-transitory media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between non-transitory media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 1002. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.


Various forms of media may be involved in carrying one or more sequences of one or more instructions to processor 1004 for execution. For example, the instructions may initially be carried on a magnetic disk or solid state drive of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to computer system 1000 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal. An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on bus 1002. Bus 1002 carries the data to main memory 1006, from which processor 1004 retrieves and executes the instructions. The instructions received by main memory 1006 may optionally be stored on storage device 1010 either before or after execution by processor 1004.


Computer system 1000 also includes a communication interface 1018 coupled to bus 1002. Communication interface 1018 provides a two-way data communication coupling to a network link 1020 that is connected to a local network 1022. For example, communication interface 1018 may be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 1018 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN (or a WAN component to communicate with a WAN). Wireless links may also be implemented. In any such implementation, communication interface 1018 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.


Network link 1020 typically provides data communication through one or more networks to other data devices. For example, network link 1020 may provide a connection through local network 1022 to a host computer 1024 or to data equipment operated by an Internet Service Provider (ISP) 1026. ISP 1026 in turn provides data communication services through the world wide packet data communication network now commonly referred to as the “Internet” 1028. Local network 1022 and Internet 1028 both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 1020 and through communication interface 1018, which carry the digital data to and from computer system 1000, are example forms of transmission media.


Computer system 1000 can send messages and receive data, including program code, through the network(s), network link 1020 and communication interface 1018. In the Internet example, a server 1030 might transmit a requested code for an application program through Internet 1028, ISP 1026, local network 1022 and communication interface 1018.


The received code may be executed by processor 1004 as it is received, and/or stored in storage device 1010, or other non-volatile storage for later execution.


Terminology

Each of the processes, methods, and algorithms described in the preceding sections may be embodied in, and fully or partially automated by, code modules executed by one or more computer systems or computer processors comprising computer hardware. The processes and algorithms may be implemented partially or wholly in application-specific circuitry.


The various features and processes described above may be used independently of one another, or may be combined in various ways. All possible combinations and subcombinations are intended to fall within the scope of this disclosure. In addition, certain method or process blocks may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate. For example, described blocks or states may be performed in an order other than that specifically disclosed, or multiple blocks or states may be combined in a single block or state. The example blocks or states may be performed in serial, in parallel, or in some other manner. Blocks or states may be added to or removed from the disclosed example embodiments. The example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed example embodiments.


Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.


Any process descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.


It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure. The foregoing description details certain embodiments of the invention. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the invention can be practiced in many ways. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the invention should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the invention with which that terminology is associated. The scope of the invention should therefore be construed in accordance with the appended claims and any equivalents thereof.

Claims
  • 1. A computing system comprising: a computer processor; and a computer readable storage medium storing program instructions configured for execution by the computer processor in order to cause the computing system to: access a digital image, wherein the digital image includes a schematic layout of a first physical component and a second physical component; parse the digital image to create a link between first data associated with the first physical component and second data associated with the second physical component, wherein the first data is different than the second data; generate user interface data such that a user interface displays the digital image in a first window; in response to selection of the first physical component in the user interface, update the user interface data such that the user interface concurrently displays the first window and a second window, wherein the second window includes at least a portion of the first data associated with the first physical component; and in response to selection of the second physical component in the user interface, update the user interface data such that the user interface concurrently displays the first window, the second window, and a third window, wherein the third window is different than the first and second windows, wherein the third window includes at least a portion of the second data associated with the second physical component, and wherein a user-selected change to a visualization in the third window causes a corresponding change to a visualization in the second window, wherein the second window is displayed near the first physical component and the third window is displayed near the second physical component, and wherein the second window and third window concurrently at least partially cover respective portions of the digital image in the user interface; and in response to a selection of the portion of the first data associated with the first physical component corresponding to a first time, update the user interface data such that the user interface includes a marker in the third window at a location of the portion of the second data associated with the second physical component that corresponds with the first time.
  • 2. The computing system of claim 1, wherein the computer readable storage medium further stores program instructions that cause the computing system to associate one or more data series associated with the first physical component, the one or more data series including historical data regarding input values and output values associated with the first physical component.
  • 3. The computing system of claim 1, wherein the first data associated with the first physical component comprises sensor data measured by the first physical component.
  • 4. The computing system of claim 1, wherein the first data associated with the first physical component comprises a graph depicting a relationship between the first physical component and other physical components illustrated in the schematic layout.
  • 5. The computing system of claim 1, wherein the second data comprises sensor data measured by the second physical component.
  • 6. The computing system of claim 1, wherein the computer readable storage medium further stores program instructions that cause the computing system to, in response to a command to zoom in on the second window to a first zoom level corresponding to a first data range along a y-axis, update the user interface data such that the user interface zooms in on the third window to the first zoom level so that the portion of the first data associated with the first physical component is displayed for the first data range along the y-axis and the portion of the second data associated with the second physical component is displayed for the first data range along the y-axis.
  • 7. The computing system of claim 1, wherein the user interface includes an index window that lists identities for the first physical component and the second physical component illustrated in the schematic layout, and wherein the computer readable storage medium further stores program instructions that cause the computing system to, in response to a selection of an identity of the first physical component, update the user interface data to adjust a location of the digital image in the user interface such that a representation of the first physical component in the schematic layout is centered in the user interface.
  • 8. The computing system of claim 1, wherein the user interface includes a notes window that identifies previous changes to the schematic layout, and wherein the computer readable storage medium further stores program instructions that cause the computing system to, in response to a selection of a first note listed in the notes window, update the user interface data such that the user interface identifies a third physical component illustrated in the schematic layout that is associated with the first note.
  • 9. The computing system of claim 1, wherein the computer readable storage medium further stores program instructions that cause the computing system to, in response to the selection of the first physical component in the user interface, update the user interface data such that the user interface displays a note associated with the first physical component.
  • 10. The computing system of claim 1, wherein the computer readable storage medium further stores program instructions that cause the computing system to: identify one or more connectors that connect the first physical component to a third physical component; generate a graph depicting the first physical component, the third physical component, the one or more connectors that connect the first physical component to the third physical component, and a direction of flow of a substance between the first physical component and the third physical component; and in response to a selection of a part page button in the user interface, update the user interface data such that the user interface displays the generated graph.
  • 11. The computing system of claim 1, wherein the computer readable storage medium further stores program instructions that cause the computing system to, in response to a request to animate a flow of data through one or more of the first physical component or the second physical component, update the user interface data such that the user interface includes an animation that indicates a sensor value as a substance passes through the first physical component at a second time and the sensor value as the substance passes through the second physical component at a third time after the second time.
  • 12. The computing system of claim 11, wherein the animation visualizes one of historical sensor data or hypothetical data.
  • 13. The computing system of claim 1, wherein the computer readable storage medium further stores program instructions that cause the computing system to, in response to an input providing model sensor data for the first physical component, update the user interface data such that the user interface includes a prediction of a sensor value for a third physical component that is coupled to the first physical component.
  • 14. A computer-implemented method comprising: accessing a digital image, wherein the digital image includes a schematic layout of a first physical component and a second physical component; parsing the digital image to create a link between first data associated with the first physical component and second data associated with the second physical component, wherein the first data is different than the second data; generating user interface data such that a user interface displays the digital image in a first window; in response to selection of the first physical component in the user interface, updating the user interface data such that the user interface concurrently displays the first window and a second window, wherein the second window includes at least a portion of the first data associated with the first physical component; and in response to selection of the second physical component in the user interface, updating the user interface data such that the user interface concurrently displays the first window, the second window, and a third window, wherein the third window is different than the first and second windows, wherein the third window includes at least a portion of the second data associated with the second physical component, and wherein a user-selected change to a visualization in the third window causes a corresponding change to a visualization in the second window, wherein the second window is displayed near the first physical component and the third window is displayed near the second physical component, and wherein the second window and the third window concurrently at least partially cover respective portions of the digital image in the user interface; and in response to a selection of the portion of the first data associated with the first physical component corresponding to a first time, updating the user interface data such that the user interface includes a marker in the third window at a location of the portion of the second data associated with the second physical component that corresponds with the first time.
  • 15. The computer-implemented method of claim 14, wherein the first data associated with the first physical component comprises sensor data measured by the first physical component and a graph depicting a relationship between the first physical component and other physical components illustrated in the schematic layout.
  • 16. The computer-implemented method of claim 14, wherein the second data comprises sensor data measured by the second physical component.
  • 17. The computer-implemented method of claim 14, further comprising, in response to a command to zoom in on the second window to a first zoom level corresponding to a first data range along a y-axis, updating the user interface data such that the user interface zooms in on the third window to the first zoom level so that the portion of the first data associated with the first physical component is displayed for the first data range along the y-axis and the portion of the second data associated with the second physical component is displayed for the first data range along the y-axis.
  • 18. A non-transitory, computer-readable storage medium storing computer-executable instructions, which if performed by one or more processors, cause the one or more processors to at least: access a digital image, wherein the digital image includes a schematic layout of a first physical component and a second physical component; parse the digital image to create a link between first data associated with the first physical component and second data associated with the second physical component, wherein the first data is different than the second data; generate user interface data such that a user interface displays the digital image in a first window; in response to selection of the first physical component in the user interface, update the user interface data such that the user interface concurrently displays the first window and a second window, wherein the second window includes at least a portion of the first data associated with the first physical component; and in response to selection of the second physical component in the user interface, update the user interface data such that the user interface concurrently displays the first window, the second window, and a third window, wherein the third window is different than the first and second windows, wherein the third window includes at least a portion of the second data associated with the second physical component, and wherein a user-selected change to a visualization in the third window causes a corresponding change to a visualization in the second window, wherein the second window is displayed near the first physical component and the third window is displayed near the second physical component, and wherein the second window and third window concurrently at least partially cover respective portions of the digital image in the user interface; and in response to a selection of the portion of the first data associated with the first physical component corresponding to a first time, update the user interface data such that the user interface includes a marker in the third window at a location of the portion of the second data associated with the second physical component that corresponds with the first time.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 14/883,498, entitled “SCHEMATIC AND DATABASE LINKING SYSTEM” and filed on Oct. 14, 2015, which is hereby incorporated by reference herein in its entirety.

US Referenced Citations (748)
Number Name Date Kind
4881179 Vincent Nov 1989 A
5109399 Thompson Apr 1992 A
5241625 Epard et al. Aug 1993 A
5329108 Lamoure Jul 1994 A
5632009 Rao et al. May 1997 A
5670987 Doi et al. Sep 1997 A
5781704 Rossmo Jul 1998 A
5798769 Chiu et al. Aug 1998 A
5845300 Comer Dec 1998 A
5999911 Berg et al. Dec 1999 A
6057757 Arrowsmith et al. May 2000 A
6065026 Cornelia et al. May 2000 A
6091956 Hollenberg Jul 2000 A
6101479 Shaw Aug 2000 A
6161098 Wallman Dec 2000 A
6219053 Tachibana et al. Apr 2001 B1
6232971 Haynes May 2001 B1
6237138 Hameluck et al. May 2001 B1
6243706 Moreau et al. Jun 2001 B1
6247019 Davies Jun 2001 B1
6279018 Kudrolli et al. Aug 2001 B1
6341310 Leshem et al. Jan 2002 B1
6366933 Ball et al. Apr 2002 B1
6369835 Lin Apr 2002 B1
6370538 Lamping et al. Apr 2002 B1
6374251 Fayyad et al. Apr 2002 B1
6430305 Decker Aug 2002 B1
6456997 Shukla Sep 2002 B1
6523019 Borthwick Feb 2003 B1
6530065 McDonald Mar 2003 B1
6549944 Weinberg et al. Apr 2003 B1
6560620 Ching May 2003 B1
6581068 Bensoussan et al. Jun 2003 B1
6594672 Lampson et al. Jul 2003 B1
6631496 Li et al. Oct 2003 B1
6642945 Sharpe Nov 2003 B1
6665683 Meltzer Dec 2003 B1
6674434 Chojnacki et al. Jan 2004 B1
6714936 Nevin, III Mar 2004 B1
6775675 Nwabueze et al. Aug 2004 B1
6820135 Dingman Nov 2004 B1
6828920 Owen et al. Dec 2004 B2
6839745 Dingari et al. Jan 2005 B1
6850317 Mullins et al. Feb 2005 B2
6877137 Rivette et al. Apr 2005 B1
6944777 Belani et al. Sep 2005 B1
6944821 Bates et al. Sep 2005 B1
6967589 Peters Nov 2005 B1
6976210 Silva et al. Dec 2005 B1
6978419 Kantrowitz Dec 2005 B1
6980984 Huffman et al. Dec 2005 B1
6985950 Hanson et al. Jan 2006 B1
7036085 Barros Apr 2006 B2
7042469 Fuller May 2006 B2
7043702 Chi et al. May 2006 B2
7055110 Kupka et al. May 2006 B2
7072801 James Jul 2006 B2
7086028 Davis et al. Aug 2006 B1
7139800 Bellotti et al. Nov 2006 B2
7143363 Gaynor et al. Nov 2006 B1
7158878 Rasmussen et al. Jan 2007 B2
7162475 Ackerman Jan 2007 B2
7168039 Bertram Jan 2007 B2
7171427 Witowski et al. Jan 2007 B2
7174377 Bernard et al. Feb 2007 B2
7194680 Roy et al. Mar 2007 B1
7213030 Jenkins May 2007 B1
7257793 Okano et al. Aug 2007 B2
7269786 Malloy et al. Sep 2007 B1
7278105 Kitts Oct 2007 B1
7290698 Poslinski et al. Nov 2007 B2
7333998 Heckerman et al. Feb 2008 B2
7370047 Gorman May 2008 B2
7379811 Rasmussen et al. May 2008 B2
7379903 Caballero et al. May 2008 B2
7392254 Jenkins Jun 2008 B1
7426654 Adams et al. Sep 2008 B2
7441182 Beilinson et al. Oct 2008 B2
7441219 Perry et al. Oct 2008 B2
7454466 Bellotti et al. Nov 2008 B2
7467375 Tondreau et al. Dec 2008 B2
7487139 Fraleigh et al. Feb 2009 B2
7502786 Liu et al. Mar 2009 B2
7525422 Bishop et al. Apr 2009 B2
7529727 Arning et al. May 2009 B2
7529734 Dirisala May 2009 B2
7558677 Jones Jul 2009 B2
7574409 Patinkin Aug 2009 B2
7574428 Leiserowitz et al. Aug 2009 B2
7579965 Bucholz Aug 2009 B2
7596285 Brown et al. Sep 2009 B2
7614006 Molander Nov 2009 B2
7617232 Gabbert et al. Nov 2009 B2
7620628 Kapur et al. Nov 2009 B2
7627812 Chamberlain et al. Dec 2009 B2
7634717 Chamberlain et al. Dec 2009 B2
7703021 Flam Apr 2010 B1
7706817 Bamrah et al. Apr 2010 B2
7712049 Williams et al. May 2010 B2
7716077 Mikurak May 2010 B1
7716140 Nielsen et al. May 2010 B1
7725530 Sah et al. May 2010 B2
7725547 Albertson et al. May 2010 B2
7730082 Sah et al. Jun 2010 B2
7730109 Rohrs et al. Jun 2010 B2
7761834 McConaghy Jul 2010 B2
7765489 Shah Jul 2010 B1
7770100 Chamberlain et al. Aug 2010 B2
7805457 Viola et al. Sep 2010 B1
7809703 Balabhadrapatruni et al. Oct 2010 B2
7814123 Nguyen et al. Oct 2010 B2
7818291 Ferguson et al. Oct 2010 B2
7818658 Chen Oct 2010 B2
7870493 Pall et al. Jan 2011 B2
7877421 Berger et al. Jan 2011 B2
7880921 Dattilo et al. Feb 2011 B2
7890868 Shah et al. Feb 2011 B2
7894984 Rasmussen et al. Feb 2011 B2
7899611 Downs et al. Mar 2011 B2
7899796 Borthwick et al. Mar 2011 B1
7917376 Bellin et al. Mar 2011 B2
7920963 Jouline et al. Apr 2011 B2
7933862 Chamberlain et al. Apr 2011 B2
7941336 Robin-Jan May 2011 B1
7958147 Turner et al. Jun 2011 B1
7962848 Bertram Jun 2011 B2
7966199 Frasher Jun 2011 B1
8001465 Kudrolli et al. Aug 2011 B2
8001482 Bhattiprolu et al. Aug 2011 B2
8010507 Poston et al. Aug 2011 B2
8010545 Stefik et al. Aug 2011 B2
8015487 Roy et al. Sep 2011 B2
8024778 Cash et al. Sep 2011 B2
8036632 Cona et al. Oct 2011 B1
8042110 Kawahara et al. Oct 2011 B1
8073857 Sreekanth Dec 2011 B2
8103543 Zwicky Jan 2012 B1
8134457 Velipasalar et al. Mar 2012 B2
8135484 Yamazaki et al. Mar 2012 B2
8145703 Frishert et al. Mar 2012 B2
8185819 Sah et al. May 2012 B2
8191005 Baier et al. May 2012 B2
8214361 Sandler et al. Jul 2012 B1
8214764 Gemmell et al. Jul 2012 B2
8225201 Michael Jul 2012 B2
8229947 Fujinaga Jul 2012 B2
8230333 Decherd et al. Jul 2012 B2
8271461 Pike et al. Sep 2012 B2
8280880 Aymeloglu et al. Oct 2012 B1
8290838 Thakur et al. Oct 2012 B1
8290926 Ozzie et al. Oct 2012 B2
8290942 Jones et al. Oct 2012 B2
8301464 Cave et al. Oct 2012 B1
8301904 Gryaznov Oct 2012 B1
8302855 Ma et al. Nov 2012 B2
8312367 Foster Nov 2012 B2
8312546 Aime Nov 2012 B2
8352881 Champion et al. Jan 2013 B2
8368695 Howell et al. Feb 2013 B2
8386377 Xiong et al. Feb 2013 B1
8392556 Goulet et al. Mar 2013 B2
8397171 Klassen et al. Mar 2013 B2
8412707 Mianji Apr 2013 B1
8447722 Ahuja et al. May 2013 B1
8452790 Mianji May 2013 B1
8463036 Ramesh et al. Jun 2013 B1
8489331 Kopf et al. Jul 2013 B2
8489641 Seefeld et al. Jul 2013 B1
8498984 Hwang et al. Jul 2013 B1
8510743 Hackborn et al. Aug 2013 B2
8514082 Cova et al. Aug 2013 B2
8515207 Chau Aug 2013 B2
8527949 Pleis et al. Sep 2013 B1
8554579 Tribble et al. Oct 2013 B2
8554653 Falkenborg et al. Oct 2013 B2
8554709 Goodson et al. Oct 2013 B2
8560413 Quarterman Oct 2013 B1
8577911 Stepinski et al. Nov 2013 B1
8589273 Creeden et al. Nov 2013 B2
8595234 Siripuapu et al. Nov 2013 B2
8620641 Farnsworth et al. Dec 2013 B2
8639757 Zang et al. Jan 2014 B1
8646080 Williamson et al. Feb 2014 B2
8676857 Adams et al. Mar 2014 B1
8682696 Shanmugam Mar 2014 B1
8688573 Ruknoic et al. Apr 2014 B1
8689108 Duffield et al. Apr 2014 B1
8713467 Goldenberg et al. Apr 2014 B1
8726379 Stiansen et al. May 2014 B1
8732574 Burr et al. May 2014 B2
8739278 Varghese May 2014 B2
8742934 Sarpy et al. Jun 2014 B1
8744890 Bernier Jun 2014 B1
8745516 Mason et al. Jun 2014 B2
8781169 Jackson et al. Jul 2014 B2
8787939 Papakipos et al. Jul 2014 B2
8788407 Singh et al. Jul 2014 B1
8798354 Bunzel et al. Aug 2014 B1
8799313 Satlow Aug 2014 B2
8799799 Cervelli et al. Aug 2014 B1
8807948 Luo et al. Aug 2014 B2
8812960 Sun et al. Aug 2014 B1
8830322 Nerayoff et al. Sep 2014 B2
8832594 Thompson et al. Sep 2014 B1
8868486 Tamayo Oct 2014 B2
8868537 Colgrove et al. Oct 2014 B1
8917274 Ma et al. Dec 2014 B2
8924872 Bogomolov et al. Dec 2014 B1
8930874 Duff et al. Jan 2015 B2
8937619 Sharma et al. Jan 2015 B2
8938686 Erenrich et al. Jan 2015 B1
8984390 Aymeloglu et al. Mar 2015 B2
9009171 Grossman et al. Apr 2015 B1
9009827 Albertson et al. Apr 2015 B1
9021260 Falk et al. Apr 2015 B1
9021384 Beard et al. Apr 2015 B1
9043696 Meiklejohn et al. May 2015 B1
9043894 Dennison et al. May 2015 B1
9058315 Burr et al. Jun 2015 B2
9069842 Melby Jun 2015 B2
9116975 Shankar et al. Aug 2015 B2
9148349 Burr et al. Sep 2015 B1
9165100 Begur et al. Oct 2015 B2
9223773 Isaacson Dec 2015 B2
9286373 Elliot et al. Mar 2016 B2
9298678 Chakerian et al. Mar 2016 B2
9348880 Kramer et al. May 2016 B1
9367872 Visbal et al. Jun 2016 B1
9984133 Cervelli et al. May 2018 B2
20010021936 Bertram Sep 2001 A1
20020032677 Morgenthaler et al. Mar 2002 A1
20020033848 Sciammarella et al. Mar 2002 A1
20020065708 Senay et al. May 2002 A1
20020077711 Nixon Jun 2002 A1
20020091707 Keller Jul 2002 A1
20020095360 Joao Jul 2002 A1
20020095658 Shulman Jul 2002 A1
20020103705 Brady Aug 2002 A1
20020116120 Ruiz et al. Aug 2002 A1
20020123864 Eryurek et al. Sep 2002 A1
20020130907 Chi et al. Sep 2002 A1
20020174201 Ramer et al. Nov 2002 A1
20020193969 Frantz et al. Dec 2002 A1
20020194119 Wright et al. Dec 2002 A1
20020196229 Chen et al. Dec 2002 A1
20030028269 Spriggs et al. Feb 2003 A1
20030028560 Kudrolli et al. Feb 2003 A1
20030036848 Sheha et al. Feb 2003 A1
20030036927 Bowen Feb 2003 A1
20030039948 Donahue Feb 2003 A1
20030061132 Mason et al. Mar 2003 A1
20030093755 O'Carroll May 2003 A1
20030126102 Borthwick Jul 2003 A1
20030140106 Raguseo Jul 2003 A1
20030144868 MacIntyre et al. Jul 2003 A1
20030163352 Surpin et al. Aug 2003 A1
20030200217 Ackerman Oct 2003 A1
20030225755 Iwayama et al. Dec 2003 A1
20030229848 Arend et al. Dec 2003 A1
20040032432 Baynger Feb 2004 A1
20040034570 Davis Feb 2004 A1
20040044648 Anfindsen et al. Mar 2004 A1
20040064256 Barinek et al. Apr 2004 A1
20040078451 Dietz et al. Apr 2004 A1
20040085318 Hassler et al. May 2004 A1
20040095349 Bito et al. May 2004 A1
20040111410 Burgoon et al. Jun 2004 A1
20040126840 Cheng et al. Jul 2004 A1
20040143602 Ruiz et al. Jul 2004 A1
20040143796 Lerner et al. Jul 2004 A1
20040163039 Gorman Aug 2004 A1
20040181554 Heckerman et al. Sep 2004 A1
20040193600 Kaasten et al. Sep 2004 A1
20040205492 Newsome Oct 2004 A1
20040221223 Yu et al. Nov 2004 A1
20040236688 Bozeman Nov 2004 A1
20040236711 Nixon et al. Nov 2004 A1
20040260702 Cragun et al. Dec 2004 A1
20040267746 Marcjan et al. Dec 2004 A1
20050010472 Quatse et al. Jan 2005 A1
20050027705 Sadri et al. Feb 2005 A1
20050028094 Allyn Feb 2005 A1
20050039116 Slack-Smith Feb 2005 A1
20050039119 Parks et al. Feb 2005 A1
20050065811 Chu et al. Mar 2005 A1
20050078858 Yao et al. Apr 2005 A1
20050080769 Gemmell Apr 2005 A1
20050086207 Heuer et al. Apr 2005 A1
20050091186 Elish Apr 2005 A1
20050125715 Di Franco et al. Jun 2005 A1
20050143096 Boesch Jun 2005 A1
20050162523 Darrell et al. Jul 2005 A1
20050166144 Gross Jul 2005 A1
20050180330 Shapiro Aug 2005 A1
20050182793 Keenan et al. Aug 2005 A1
20050183005 Denoue et al. Aug 2005 A1
20050183043 Wu Aug 2005 A1
20050210409 Jou Sep 2005 A1
20050246327 Yeung et al. Nov 2005 A1
20050251786 Citron et al. Nov 2005 A1
20060026120 Carolan et al. Feb 2006 A1
20060026170 Kreitler et al. Feb 2006 A1
20060026561 Bauman et al. Feb 2006 A1
20060031779 Theurer et al. Feb 2006 A1
20060045470 Poslinski et al. Mar 2006 A1
20060053097 King et al. Mar 2006 A1
20060053170 Hill et al. Mar 2006 A1
20060059139 Robinson Mar 2006 A1
20060059423 Lehmann et al. Mar 2006 A1
20060074866 Chamberlain et al. Apr 2006 A1
20060074881 Vembu et al. Apr 2006 A1
20060080139 Mainzer Apr 2006 A1
20060080283 Shipman Apr 2006 A1
20060080619 Carlson et al. Apr 2006 A1
20060093222 Saffer et al. May 2006 A1
20060129746 Porter Jun 2006 A1
20060136513 Ngo et al. Jun 2006 A1
20060139375 Rasmussen et al. Jun 2006 A1
20060142949 Helt Jun 2006 A1
20060143034 Rothermel Jun 2006 A1
20060143075 Carr et al. Jun 2006 A1
20060149596 Surpin et al. Jul 2006 A1
20060155654 Plessis et al. Jul 2006 A1
20060178915 Chao Aug 2006 A1
20060203337 White Sep 2006 A1
20060218637 Thomas et al. Sep 2006 A1
20060236303 Wilson Oct 2006 A1
20060241974 Chao et al. Oct 2006 A1
20060242040 Rader et al. Oct 2006 A1
20060242630 Koike et al. Oct 2006 A1
20060265417 Amato et al. Nov 2006 A1
20060271277 Hu et al. Nov 2006 A1
20060277460 Forstall et al. Dec 2006 A1
20060279630 Aggarwal et al. Dec 2006 A1
20070000999 Kubo et al. Jan 2007 A1
20070011150 Frank Jan 2007 A1
20070016363 Huang et al. Jan 2007 A1
20070018986 Hauser Jan 2007 A1
20070038646 Thota Feb 2007 A1
20070038962 Fuchs et al. Feb 2007 A1
20070043686 Teng et al. Feb 2007 A1
20070057966 Ohno et al. Mar 2007 A1
20070061752 Cory Mar 2007 A1
20070078832 Ott et al. Apr 2007 A1
20070083541 Fraleigh et al. Apr 2007 A1
20070088596 Berkelhamer et al. Apr 2007 A1
20070094389 Nussey et al. Apr 2007 A1
20070113164 Hansen et al. May 2007 A1
20070136095 Weinstein Jun 2007 A1
20070150369 Zivin Jun 2007 A1
20070162454 D'Albora et al. Jul 2007 A1
20070168871 Jenkins Jul 2007 A1
20070174760 Chamberlain et al. Jul 2007 A1
20070185850 Walters et al. Aug 2007 A1
20070192122 Routson et al. Aug 2007 A1
20070192265 Chopin et al. Aug 2007 A1
20070198571 Ferguson et al. Aug 2007 A1
20070208497 Downs et al. Sep 2007 A1
20070208498 Barker et al. Sep 2007 A1
20070208736 Tanigawa et al. Sep 2007 A1
20070233709 Abnous Oct 2007 A1
20070240062 Christena et al. Oct 2007 A1
20070245339 Bauman et al. Oct 2007 A1
20070266336 Nojima et al. Nov 2007 A1
20070284433 Domenica et al. Dec 2007 A1
20070294643 Kyle Dec 2007 A1
20070299697 Friedlander et al. Dec 2007 A1
20070300198 Chaplin et al. Dec 2007 A1
20080016155 Khalatian Jan 2008 A1
20080016216 Worley et al. Jan 2008 A1
20080040275 Paulsen et al. Feb 2008 A1
20080040684 Crump Feb 2008 A1
20080046481 Gould et al. Feb 2008 A1
20080051989 Welsh Feb 2008 A1
20080052142 Bailey et al. Feb 2008 A1
20080077597 Butler Mar 2008 A1
20080077642 Carbone et al. Mar 2008 A1
20080082486 Lermant et al. Apr 2008 A1
20080091693 Murthy Apr 2008 A1
20080104019 Nath May 2008 A1
20080109714 Kumar et al. May 2008 A1
20080126951 Sood et al. May 2008 A1
20080148398 Mezack et al. Jun 2008 A1
20080155440 Trevor et al. Jun 2008 A1
20080172607 Baer Jul 2008 A1
20080177782 Poston et al. Jul 2008 A1
20080186904 Koyama et al. Aug 2008 A1
20080195417 Surpin et al. Aug 2008 A1
20080195608 Clover Aug 2008 A1
20080208735 Balet et al. Aug 2008 A1
20080222295 Robinson et al. Sep 2008 A1
20080227473 Haney Sep 2008 A1
20080249820 Pathria Oct 2008 A1
20080249983 Meisels et al. Oct 2008 A1
20080263468 Cappione et al. Oct 2008 A1
20080267107 Rosenberg Oct 2008 A1
20080270328 Lafferty et al. Oct 2008 A1
20080276167 Michael Nov 2008 A1
20080278311 Grange et al. Nov 2008 A1
20080281819 Tenenbaum et al. Nov 2008 A1
20080288306 MacIntyre et al. Nov 2008 A1
20080288475 Kim et al. Nov 2008 A1
20080301042 Patzer Dec 2008 A1
20080301559 Martinsen et al. Dec 2008 A1
20080301643 Appleton et al. Dec 2008 A1
20080313132 Hao et al. Dec 2008 A1
20080313243 Poston et al. Dec 2008 A1
20080313281 Scheidl et al. Dec 2008 A1
20090002492 Velipasalar et al. Jan 2009 A1
20090018996 Hunt et al. Jan 2009 A1
20090024962 Gotz Jan 2009 A1
20090027418 Maru et al. Jan 2009 A1
20090030915 Winter et al. Jan 2009 A1
20090031401 Cudich et al. Jan 2009 A1
20090037912 Stoitsev et al. Feb 2009 A1
20090043801 LeClair Feb 2009 A1
20090055251 Shah et al. Feb 2009 A1
20090070162 Leonelli et al. Mar 2009 A1
20090076845 Bellin et al. Mar 2009 A1
20090088964 Schaaf et al. Apr 2009 A1
20090089651 Herberger et al. Apr 2009 A1
20090094270 Alirez et al. Apr 2009 A1
20090106178 Chu Apr 2009 A1
20090112678 Luzardo Apr 2009 A1
20090112745 Stefanescu Apr 2009 A1
20090119309 Gibson et al. May 2009 A1
20090125359 Knapic May 2009 A1
20090125369 Kloosstra et al. May 2009 A1
20090125459 Norton et al. May 2009 A1
20090132921 Hwangbo et al. May 2009 A1
20090132953 Reed et al. May 2009 A1
20090143052 Bates et al. Jun 2009 A1
20090144262 White et al. Jun 2009 A1
20090144274 Fraleigh et al. Jun 2009 A1
20090150868 Chakra et al. Jun 2009 A1
20090157732 Hao et al. Jun 2009 A1
20090164934 Bhattiprolu et al. Jun 2009 A1
20090171939 Athsani et al. Jul 2009 A1
20090172511 Decherd et al. Jul 2009 A1
20090172821 Daira et al. Jul 2009 A1
20090177962 Gusmorino et al. Jul 2009 A1
20090179892 Tsuda et al. Jul 2009 A1
20090187464 Bai et al. Jul 2009 A1
20090187546 Whyte et al. Jul 2009 A1
20090199106 Jonsson et al. Aug 2009 A1
20090216562 Faulkner et al. Aug 2009 A1
20090222400 Kupershmidt et al. Sep 2009 A1
20090222759 Drieschner Sep 2009 A1
20090222760 Halverson et al. Sep 2009 A1
20090228365 Tomchek et al. Sep 2009 A1
20090234720 George et al. Sep 2009 A1
20090248757 Havewala et al. Oct 2009 A1
20090249178 Ambrosino et al. Oct 2009 A1
20090249244 Robinson et al. Oct 2009 A1
20090254970 Agarwal et al. Oct 2009 A1
20090271343 Vaiciulis et al. Oct 2009 A1
20090281839 Lynn et al. Nov 2009 A1
20090282068 Shockro et al. Nov 2009 A1
20090287470 Farnsworth et al. Nov 2009 A1
20090292626 Oxford Nov 2009 A1
20090300589 Watters et al. Dec 2009 A1
20090307049 Elliott et al. Dec 2009 A1
20090313463 Pang et al. Dec 2009 A1
20090318775 Michelson et al. Dec 2009 A1
20090319891 MacKinlay Dec 2009 A1
20100004857 Pereira et al. Jan 2010 A1
20100011282 Dollard et al. Jan 2010 A1
20100042922 Bradateanu et al. Feb 2010 A1
20100057622 Faith et al. Mar 2010 A1
20100057716 Stefik et al. Mar 2010 A1
20100058212 Belitz et al. Mar 2010 A1
20100070523 Delgo et al. Mar 2010 A1
20100070842 Aymeloglu et al. Mar 2010 A1
20100070844 Aymeloglu et al. Mar 2010 A1
20100070845 Facemire et al. Mar 2010 A1
20100070897 Aymeloglu et al. Mar 2010 A1
20100076813 Ghosh et al. Mar 2010 A1
20100098318 Anderson Apr 2010 A1
20100100963 Mahaffey Apr 2010 A1
20100103124 Kruzeniski et al. Apr 2010 A1
20100106752 Eckardt et al. Apr 2010 A1
20100114887 Conway et al. May 2010 A1
20100122152 Chamberlain et al. May 2010 A1
20100131457 Heimendinger May 2010 A1
20100162176 Dunton Jun 2010 A1
20100191563 Schlaifer et al. Jul 2010 A1
20100198684 Eraker et al. Aug 2010 A1
20100199225 Coleman et al. Aug 2010 A1
20100223260 Wu Sep 2010 A1
20100228812 Uomini Sep 2010 A1
20100238174 Haub et al. Sep 2010 A1
20100250412 Wagner Sep 2010 A1
20100262901 DiSalvo Oct 2010 A1
20100280851 Merkin Nov 2010 A1
20100280857 Liu et al. Nov 2010 A1
20100293174 Bennett et al. Nov 2010 A1
20100306713 Geisner et al. Dec 2010 A1
20100306722 LeHoty et al. Dec 2010 A1
20100313119 Baldwin et al. Dec 2010 A1
20100313239 Chakra et al. Dec 2010 A1
20100318924 Frankel et al. Dec 2010 A1
20100321399 Ellren et al. Dec 2010 A1
20100325526 Ellis et al. Dec 2010 A1
20100325581 Finkelstein et al. Dec 2010 A1
20100330801 Rouh Dec 2010 A1
20110004498 Readshaw Jan 2011 A1
20110004626 Naeymi-Rad et al. Jan 2011 A1
20110029526 Knight et al. Feb 2011 A1
20110035721 Shimoyama et al. Feb 2011 A1
20110047159 Baid et al. Feb 2011 A1
20110047540 Williams et al. Feb 2011 A1
20110060753 Shaked et al. Mar 2011 A1
20110061013 Bilicki et al. Mar 2011 A1
20110066933 Ludwig Mar 2011 A1
20110074788 Regan et al. Mar 2011 A1
20110074811 Hanson et al. Mar 2011 A1
20110078055 Faribault et al. Mar 2011 A1
20110078173 Seligmann et al. Mar 2011 A1
20110093327 Fordyce, III et al. Apr 2011 A1
20110099133 Chang et al. Apr 2011 A1
20110107196 Foster May 2011 A1
20110117878 Barash et al. May 2011 A1
20110119100 Ruhl et al. May 2011 A1
20110137766 Rasmussen et al. Jun 2011 A1
20110153384 Horne et al. Jun 2011 A1
20110161096 Buehler et al. Jun 2011 A1
20110161409 Nair Jun 2011 A1
20110167105 Ramakrishnan et al. Jul 2011 A1
20110170799 Carrino et al. Jul 2011 A1
20110173032 Payne et al. Jul 2011 A1
20110173093 Psota et al. Jul 2011 A1
20110179048 Satlow Jul 2011 A1
20110185316 Reid et al. Jul 2011 A1
20110208565 Ross et al. Aug 2011 A1
20110208724 Jones et al. Aug 2011 A1
20110213655 Henkin Sep 2011 A1
20110218934 Elser Sep 2011 A1
20110219450 McDougal et al. Sep 2011 A1
20110225198 Edwards et al. Sep 2011 A1
20110225482 Chan et al. Sep 2011 A1
20110225586 Bentley et al. Sep 2011 A1
20110225650 Margolies et al. Sep 2011 A1
20110238495 Kang Sep 2011 A1
20110238553 Raj et al. Sep 2011 A1
20110251951 Kolkowtiz Oct 2011 A1
20110258158 Resende et al. Oct 2011 A1
20110270705 Parker Nov 2011 A1
20110276938 Perry Nov 2011 A1
20110289397 Eastmond et al. Nov 2011 A1
20110289407 Naik et al. Nov 2011 A1
20110289420 Morioka et al. Nov 2011 A1
20110291851 Whisenant Dec 2011 A1
20110310005 Chen et al. Dec 2011 A1
20110314007 Dassa et al. Dec 2011 A1
20120004894 Butler Jan 2012 A1
20120004904 Shin et al. Jan 2012 A1
20120015673 Klassen et al. Jan 2012 A1
20120019559 Siler et al. Jan 2012 A1
20120022945 Falkenberg et al. Jan 2012 A1
20120036013 Neuhaus et al. Feb 2012 A1
20120036434 Oberstein Feb 2012 A1
20120050293 Carlhian et al. Mar 2012 A1
20120059853 Jagota Mar 2012 A1
20120065987 Farooq et al. Mar 2012 A1
20120066296 Appleton et al. Mar 2012 A1
20120072825 Sherkin et al. Mar 2012 A1
20120079363 Folting et al. Mar 2012 A1
20120084117 Tavares et al. Apr 2012 A1
20120084118 Bai et al. Apr 2012 A1
20120084184 Raleigh Apr 2012 A1
20120106801 Jackson May 2012 A1
20120117082 Koperda et al. May 2012 A1
20120123989 Yu et al. May 2012 A1
20120131512 Takeuchi et al. May 2012 A1
20120137235 Ts et al. May 2012 A1
20120144335 Abeln et al. Jun 2012 A1
20120159307 Chung et al. Jun 2012 A1
20120159362 Brown et al. Jun 2012 A1
20120159399 Bastide et al. Jun 2012 A1
20120170847 Tsukidate Jul 2012 A1
20120173985 Peppel Jul 2012 A1
20120180002 Campbell et al. Jul 2012 A1
20120188252 Law Jul 2012 A1
20120196557 Reich et al. Aug 2012 A1
20120196558 Reich et al. Aug 2012 A1
20120197651 Robinson et al. Aug 2012 A1
20120197657 Prodanovic Aug 2012 A1
20120197660 Prodanovic Aug 2012 A1
20120203708 Psota et al. Aug 2012 A1
20120208636 Feige Aug 2012 A1
20120215784 King et al. Aug 2012 A1
20120221511 Gibson et al. Aug 2012 A1
20120221553 Wittmer et al. Aug 2012 A1
20120221580 Barney Aug 2012 A1
20120226590 Love et al. Sep 2012 A1
20120245976 Kumar et al. Sep 2012 A1
20120246148 Dror Sep 2012 A1
20120254129 Wheeler et al. Oct 2012 A1
20120266245 McDougal et al. Oct 2012 A1
20120283988 Pandey et al. Nov 2012 A1
20120284345 Costenaro et al. Nov 2012 A1
20120284670 Kashik et al. Nov 2012 A1
20120290879 Shibuya et al. Nov 2012 A1
20120296907 Long et al. Nov 2012 A1
20120304244 Xie et al. Nov 2012 A1
20120311684 Paulsen et al. Dec 2012 A1
20120323829 Stokes et al. Dec 2012 A1
20120323888 Osann, Jr. Dec 2012 A1
20120330801 McDougal et al. Dec 2012 A1
20120330973 Ghuneim et al. Dec 2012 A1
20130005362 Borghei Jan 2013 A1
20130006426 Healey et al. Jan 2013 A1
20130006725 Simanek et al. Jan 2013 A1
20130006916 McBride et al. Jan 2013 A1
20130016106 Yip et al. Jan 2013 A1
20130018796 Kolhatkar et al. Jan 2013 A1
20130024268 Manickavelu Jan 2013 A1
20130046635 Grigg et al. Feb 2013 A1
20130046842 Muntz et al. Feb 2013 A1
20130055264 Burr et al. Feb 2013 A1
20130060786 Serrano et al. Mar 2013 A1
20130061169 Pearcy et al. Mar 2013 A1
20130073377 Heath Mar 2013 A1
20130073454 Busch Mar 2013 A1
20130078943 Biage et al. Mar 2013 A1
20130086482 Parsons Apr 2013 A1
20130097482 Marantz et al. Apr 2013 A1
20130101159 Chao et al. Apr 2013 A1
20130110822 Ikeda et al. May 2013 A1
20130110877 Bonham et al. May 2013 A1
20130111320 Campbell et al. May 2013 A1
20130117651 Waldman et al. May 2013 A1
20130124567 Balinsky et al. May 2013 A1
20130150004 Rosen Jun 2013 A1
20130151148 Parundekar et al. Jun 2013 A1
20130151305 Akinola et al. Jun 2013 A1
20130151388 Falkenborg et al. Jun 2013 A1
20130151453 Bhanot et al. Jun 2013 A1
20130157234 Gulli et al. Jun 2013 A1
20130166348 Scotto Jun 2013 A1
20130166480 Popescu et al. Jun 2013 A1
20130166550 Buchmann et al. Jun 2013 A1
20130176321 Mitchell et al. Jul 2013 A1
20130179420 Park et al. Jul 2013 A1
20130197925 Blue Aug 2013 A1
20130224696 Wolfe et al. Aug 2013 A1
20130225212 Khan Aug 2013 A1
20130226318 Procyk Aug 2013 A1
20130226953 Markovich et al. Aug 2013 A1
20130232045 Tai et al. Sep 2013 A1
20130238616 Rose et al. Sep 2013 A1
20130238664 Hsu et al. Sep 2013 A1
20130246170 Gross et al. Sep 2013 A1
20130251233 Yang et al. Sep 2013 A1
20130262527 Hunter et al. Oct 2013 A1
20130262528 Foit Oct 2013 A1
20130263019 Castellanos et al. Oct 2013 A1
20130267207 Hao et al. Oct 2013 A1
20130268520 Fisher et al. Oct 2013 A1
20130274898 Thatikonda et al. Oct 2013 A1
20130279757 Kephart Oct 2013 A1
20130282696 John et al. Oct 2013 A1
20130288719 Alonzo Oct 2013 A1
20130290011 Lynn et al. Oct 2013 A1
20130290825 Arndt et al. Oct 2013 A1
20130297619 Chandarsekaran et al. Nov 2013 A1
20130311375 Priebatsch Nov 2013 A1
20130325826 Agarwal et al. Dec 2013 A1
20140019936 Cohanoff Jan 2014 A1
20140032506 Hoey et al. Jan 2014 A1
20140033010 Richardt et al. Jan 2014 A1
20140040371 Gurevich et al. Feb 2014 A1
20140047319 Eberlein Feb 2014 A1
20140047357 Alfaro et al. Feb 2014 A1
20140058763 Zizzamia et al. Feb 2014 A1
20140059038 McPherson et al. Feb 2014 A1
20140067611 Adachi et al. Mar 2014 A1
20140068487 Steiger et al. Mar 2014 A1
20140074855 Zhao et al. Mar 2014 A1
20140081685 Thacker et al. Mar 2014 A1
20140089339 Siddiqui et al. Mar 2014 A1
20140095273 Tang et al. Apr 2014 A1
20140095363 Caldwell Apr 2014 A1
20140095509 Patton Apr 2014 A1
20140108068 Williams Apr 2014 A1
20140108380 Gotz et al. Apr 2014 A1
20140108985 Scott et al. Apr 2014 A1
20140129261 Bothwell et al. May 2014 A1
20140129936 Richards et al. May 2014 A1
20140149436 Bahrami et al. May 2014 A1
20140156484 Chan et al. Jun 2014 A1
20140156527 Grigg et al. Jun 2014 A1
20140157172 Peery et al. Jun 2014 A1
20140164502 Khodorenko et al. Jun 2014 A1
20140178845 Riesberg et al. Jun 2014 A1
20140189536 Lange et al. Jul 2014 A1
20140195515 Baker et al. Jul 2014 A1
20140195887 Ellis et al. Jul 2014 A1
20140208281 Ming Jul 2014 A1
20140214579 Shen et al. Jul 2014 A1
20140222521 Chait Aug 2014 A1
20140222793 Sadkin et al. Aug 2014 A1
20140244284 Smith Aug 2014 A1
20140244388 Manouchehri et al. Aug 2014 A1
20140258246 Lo Faro et al. Sep 2014 A1
20140267294 Ma Sep 2014 A1
20140267295 Sharma Sep 2014 A1
20140279824 Tamayo Sep 2014 A1
20140282177 Wang et al. Sep 2014 A1
20140310266 Greenfield Oct 2014 A1
20140316911 Gross Oct 2014 A1
20140320145 Johnson et al. Oct 2014 A1
20140330845 Feldschuh Nov 2014 A1
20140333651 Cervelli et al. Nov 2014 A1
20140337772 Cervelli et al. Nov 2014 A1
20140344230 Krause et al. Nov 2014 A1
20140351070 Christner et al. Nov 2014 A1
20140358829 Hurwitz Dec 2014 A1
20150019394 Unser et al. Jan 2015 A1
20150026622 Roaldson et al. Jan 2015 A1
20150046870 Goldenberg et al. Feb 2015 A1
20150073954 Braff Mar 2015 A1
20150089353 Folkening Mar 2015 A1
20150089424 Duffield et al. Mar 2015 A1
20150100897 Sun et al. Apr 2015 A1
20150100907 Erenrich et al. Apr 2015 A1
20150106379 Elliot et al. Apr 2015 A1
20150134666 Gattiker et al. May 2015 A1
20150169709 Kara et al. Jun 2015 A1
20150169726 Kara et al. Jun 2015 A1
20150170077 Kara et al. Jun 2015 A1
20150178825 Huerta Jun 2015 A1
20150178877 Bogomolov et al. Jun 2015 A1
20150186483 Tappan et al. Jul 2015 A1
20150186821 Wang et al. Jul 2015 A1
20150187036 Wang et al. Jul 2015 A1
20150212663 Papale et al. Jul 2015 A1
20150221203 Concepcion et al. Aug 2015 A1
20150227295 Meiklejohn et al. Aug 2015 A1
20150242401 Liu Aug 2015 A1
20150254220 Burr et al. Sep 2015 A1
20150309719 Ma et al. Oct 2015 A1
20150317342 Grossman et al. Nov 2015 A1
20150324868 Kaftan et al. Nov 2015 A1
20160062555 Ward et al. Mar 2016 A1
20160098176 Cervelli et al. Apr 2016 A1
20160110369 Cervelli et al. Apr 2016 A1
20160162519 Stowe et al. Jun 2016 A1
Foreign Referenced Citations (48)
Number Date Country
2013251186 Nov 2015 AU
102054015 May 2014 CN
102014103482 Sep 2014 DE
102014215621 Feb 2015 DE
1672527 Jun 2006 EP
2551799 Jan 2013 EP
2560134 Feb 2013 EP
2778977 Sep 2014 EP
2835745 Feb 2015 EP
2835770 Feb 2015 EP
2838039 Feb 2015 EP
2846241 Mar 2015 EP
2851852 Mar 2015 EP
2858014 Apr 2015 EP
2858018 Apr 2015 EP
2863326 Apr 2015 EP
2863346 Apr 2015 EP
2869211 May 2015 EP
2884439 Jun 2015 EP
2884440 Jun 2015 EP
2891992 Jul 2015 EP
2911078 Aug 2015 EP
2911100 Aug 2015 EP
2940603 Nov 2015 EP
2940609 Nov 2015 EP
2993595 Mar 2016 EP
3002691 Apr 2016 EP
3009943 Apr 2016 EP
3032441 Jun 2016 EP
2516155 Jan 2015 GB
2518745 Apr 2015 GB
2012778 Nov 2014 NL
2013306 Feb 2015 NL
624557 Dec 2014 NZ
WO 2000009529 Feb 2000 WO
WO 01025906 Apr 2001 WO
WO 2001088750 Nov 2001 WO
WO 2002065353 Aug 2002 WO
WO 2005104736 Nov 2005 WO
WO 2007133206 Nov 2007 WO
WO 2008064207 May 2008 WO
WO 2009061501 May 2009 WO
WO 2010000014 Jan 2010 WO
WO 2010030913 Mar 2010 WO
WO 2010030914 Mar 2010 WO
WO 2012119008 Sep 2012 WO
WO 2013010157 Jan 2013 WO
WO 2013102892 Jul 2013 WO
Non-Patent Literature Citations (321)
Entry
“A First Look: Predicting Market Demand for Food Retail using a Huff Analysis,” TRF Policy Solutions, Jul. 2012, p. 30.
“A Quick Guide to UniProtKB Swiss-Prot & TrEMBL,” Sep. 2011, p. 2.
“A Word About Banks and the Laundering of Drug Money,” Aug. 18, 2012, http://www.golemxiv.co.uk/2012/08/a-word-about-banks-and-the-laundering-of-drug-money/.
Abbey, Kristen, “Review of Google Docs,” May 1, 2007, p. 2.
Acklen, Laura, “Absolute Beginner's Guide to Microsoft Word 2003,” Dec. 24, 2003, pp. 15-18, 34-41, 308-316.
Adams et al., “Worklets: A Service-Oriented Implementation of Dynamic Flexibility in Workflows,” R. Meersman, Z. Tari et al. (Eds.): OTM 2006, LNCS, 4275, pp. 291-308, 2006.
Alur et al., “Chapter 2: IBM InfoSphere DataStage Stages,” IBM InfoSphere DataStage Data Flow and Job Design, Jul. 1, 2008, pp. 35-137.
Amnet, “5 Great Tools for Visualizing Your Twitter Followers,” posted Aug. 4, 2010, http://www.amnetblog.com/component/content/article/115-5-grate-tools-for-visualizing-your-twitter-followers.html.
Ananiev et al., “The New Modality API,” http://web.archive.org/web/20061211011958/http://java.sun.com/developer/technicalArticles/J2SE/De sktop/javase6/modality/ Jan. 21, 2006, p. 8.
Bluttman et al., “Excel Formulas and Functions for Dummies,” 2005, Wiley Publishing, Inc., pp. 280, 284-286.
Boyce, Jim, “Microsoft Outlook 2010 Inside Out,” Aug. 1, 2010, retrieved from the internet https://capdtron.files.wordpress.com/2013/01/outlook-2010-inside_out.pdf.
Bugzilla@Mozilla, “Bug 18726—[feature] Long-click means of invoking contextual menus not supported,” http://bugzilla.mozilla.org/show_bug.cgi?id=18726 printed Jun. 13, 2013 in 11 pages.
Canese et al., “Chapter 2: PubMed: The Bibliographic Database,” The NCBI Handbook, Oct. 2002, pp. 1-10.
Celik, Tantek, “CSS Basic User Interface Module Level 3 (CSS3 UI),” Section 8 Resizing and Overflow, Jan. 17, 2012, retrieved from internet http://www.w3.org/TR/2012/WD-css3-ui-20120117/#resizing-amp-overflow retrieved on May 18, 2015.
Chaudhuri et al., “An Overview of Business Intelligence Technology,” Communications of the ACM, Aug. 2011, vol. 54, No. 8.
Chen et al., “Bringing Order to the Web: Automatically Categorizing Search Results,” CHI 2000, Proceedings of the SIGCHI conference on Human Factors in Computing Systems, Apr. 1-6, 2000, The Hague, The Netherlands, pp. 145-152.
Chung, Chin-Wan, “Dataplex: An Access to Heterogeneous Distributed Databases,” Communications of the ACM, Association for Computing Machinery, Inc., vol. 33, No. 1, Jan. 1, 1990, pp. 70-80.
Conner, Nancy, “Google Apps: The Missing Manual,” May 1, 2008, pp. 15.
Definition “Identify”, downloaded Jan. 22, 2015, 1 page.
Definition “Overlay”, downloaded Jan. 22, 2015, 1 page.
Delcher et al., “Identifying Bacterial Genes and Endosymbiont DNA with Glimmer,” BioInformatics, vol. 23, No. 6, 2007, pp. 673-679.
Dramowicz, Ela, “Retail Trade Area Analysis Using the Huff Model,” Directions Magazine, Jul. 2, 2005 in 10 pages, http://www.directionsmag.com/articles/retail-trade-area-analysis-using-the-huff-model/123411.
“The FASTA Program Package,” fasta-36.3.4, Mar. 25, 2011, pp. 29.
Galliford, Miles, “SnagIt Versus Free Screen Capture Software: Critical Tools for Website Owners,” <http://www.subhub.com/articles/free-screen-capture-software>, Mar. 27, 2008, pp. 11.
GIS-NET 3 Public, Department of Regional Planning, Planning & Zoning Information for Unincorporated LA County. Retrieved Oct. 2, 2013 from http://gis.planning.lacounty.gov/GIS-NET3_Public/Viewer.html.
Glaab et al., “EnrichNet: Network-Based Gene Set Enrichment Analysis,” Bioinformatics 28.18 (2012): pp. i451-i457.
Goswami, Gautam, “Quite Writly Said!,” One Brick at a Time, Aug. 21, 2005, pp. 7.
“GrabUp—What a Timesaver!” <http://atlchris.com/191/grabup/>, Aug. 11, 2008, pp. 3.
Griffith, Daniel A., “A Generalized Huff Model,” Geographical Analysis, Apr. 1982, vol. 14, No. 2, pp. 135-144.
Gu et al., “Record Linkage: Current Practice and Future Directions,” Jan. 15, 2004, pp. 32.
Hansen et al., “Analyzing Social Media Networks with NodeXL: Insights from a Connected World”, Chapter 4, pp. 53-67 and Chapter 10, pp. 143-164, published Sep. 2010.
Hardesty, “Privacy Challenges: Analysis: It's Surprisingly Easy to Identify Individuals from Credit-Card Metadata,” MIT News On Campus and Around the World, MIT News Office, Jan. 29, 2015, 3 pages.
Hibbert et al., “Prediction of Shopping Behavior Using a Huff Model Within a GIS Framework,” Healthy Eating in Context, Mar. 18, 2011, pp. 16.
Hogue et al., “Thresher: Automating the Unwrapping of Semantic Content from the World Wide Web,” 14th International Conference on World Wide Web, WWW 2005: Chiba, Japan, May 10-14, 2005, pp. 86-95.
Hua et al., “A Multi-attribute Data Structure with Parallel Bloom Filters for Network Services”, HiPC 2006, LNCS 4297, pp. 277-288, 2006.
Huang et al., “Systematic and Integrative Analysis of Large Gene Lists Using DAVID Bioinformatics Resources,” Nature Protocols, 4.1 (2008): pp. 44-57.
Huff et al., “Calibrating the Huff Model Using ArcGIS Business Analyst,” ESRI, Sep. 2008, pp. 33.
Huff, David L., “Parameter Estimation in the Huff Model,” ESRI, ArcUser, Oct.-Dec. 2003, pp. 34-36.
Hur et al., “SciMiner: web-based literature mining tool for target identification and functional enrichment analysis,” Bioinformatics 25.6 (2009): pp. 838-840.
jetscreenshot.com, “Share Screenshots via Internet in Seconds,” <http://web.archive.org/web/20130807164204/http://www.jetscreenshot.com/>, Aug. 7, 2013, pp. 1.
Kahan et al., “Annotea: an Open RDF Infrastructure for Shared Web Annotations”, Computer Networks, Elsevier Science Publishers B.V., vol. 39, No. 5, dated Aug. 5, 2002, pp. 589-608.
keylines.com, “An Introduction to KeyLines and Network Visualization,” Mar. 2014, <http://keylines.com/wp-content/uploads/2014/03/KeyLines-White-Paper.pdf> downloaded May 12, 2014 in 8 pages.
keylines.com, “KeyLines Datasheet,” Mar. 2014, <http://keylines.com/wp-content/uploads/2014/03/KeyLines-datasheet.pdf> downloaded May 12, 2014 in 2 pages.
keylines.com, “Visualizing Threats: Improved Cyber Security Through Network Visualization,” Apr. 2014, <http://keylines.com/wp-content/uploads/2014/04/Visualizing-Threats1.pdf> downloaded May 12, 2014 in 10 pages.
Kitts, Paul, “Chapter 14: Genome Assembly and Annotation Process,” The NCBI Handbook, Oct. 2002, pp. 1-21.
Kwout, <http://web.archive.org/web/20080905132448/http://www.kwout.com/> Sep. 5, 2008, pp. 2.
Li et al., “Interactive Multimodal Visual Search on Mobile Device,” IEEE Transactions on Multimedia, vol. 15, No. 3, Apr. 1, 2013, pp. 594-607.
Liu, Tianshun, “Combining GIS and the Huff Model to Analyze Suitable Locations for a New Asian Supermarket in the Minneapolis and St. Paul, Minnesota USA,” Papers in Resource Analysis, 2012, vol. 14, pp. 8.
Madden, Tom, “Chapter 16: The BLAST Sequence Analysis Tool,” The NCBI Handbook, Oct. 2002, pp. 1-15.
Manno et al., “Introducing Collaboration in Single-user Applications through the Centralized Control Architecture,” 2010, pp. 10.
Manske, “File Saving Dialogs,” <http://www.mozilla.org/editor/ui_specs/FileSaveDialogs.html>, Jan. 20, 1999, pp. 7.
Map of San Jose, CA. Retrieved Oct. 2, 2013 from http://maps.yahoo.com.
Map of San Jose, CA. Retrieved Oct. 2, 2013 from http://maps.bing.com.
Map of San Jose, CA. Retrieved Oct. 2, 2013 from http://maps.google.com.
Microsoft—Developer Network, “Getting Started with VBA in Word 2010,” Apr. 2010, <http://msdn.microsoft.com/en-US/library/ff604039%28v=office.14%29.aspx> as printed Apr. 4, 2014 in 17 pages.
Microsoft Office—Visio, “About connecting shapes,” <http://office.microsoft.com/en-us/visio-help/about-connecting-shapes-HP085050369.aspx> printed Aug. 4, 2011 in 6 pages.
Microsoft Office—Visio, “Add and glue connectors with the Connector tool,” <http://office.microsoft.com/en-us/visio-help/add-and-glue-connectors-with-the-connector-tool-HA010048532.aspx?CTT=1> printed Aug. 4, 2011 in 1 page.
Microsoft Windows, “Microsoft Windows Version 2002 Print Out 2,” 2002, pp. 1-6.
Microsoft, “Registering an Application to a URI Scheme,” <http://msdn.microsoft.com/en-US/library/aa767914.aspx>, printed Apr. 4, 2009 in 4 pages.
Microsoft, “Using the Clipboard,” <http://msdn.microsoft.com/en-us/library/ms649016.aspx>, printed Jun. 8, 2009 in 20 pages.
Mizrachi, Ilene, “Chapter 1: GenBank: The Nucleotide Sequence Database,” The NCBI Handbook, Oct. 2002, pp. 1-14.
Nierman, “Evaluating Structural Similarity in XML Documents”, 6 pages, 2002.
Nitro, “Trick: How to Capture a Screenshot As PDF, Annotate, Then Share It,” <http://blog.nitropdf.com/2008/03/04/trick-how-to-capture-a-screenshot-as-pdf-annotate-it-then-share/>, Mar. 4, 2008, pp. 2.
Nolan et al., “MCARTA: A Malicious Code Automated Run-Time Analysis Framework,” Homeland Security, 2012 IEEE Conference on Technologies for, Nov. 13, 2012, pp. 13-17.
Olanoff, Drew, “Deep Dive with the New Google Maps for Desktop with Google Earth Integration, It's More than Just a Utility,” May 15, 2013, pp. 1-6, retrieved from the internet: http://web.archive.org/web/20130515230641/http://techcrunch.com/2013/05/15/deep-dive-with-the-new-google-maps-for-desktop-with-google-earth-integration-its-more-than-just-a-utility/.
Online Tech Tips, “Clip2Net—Share files, folders and screenshots easily,” <http://www.online-tech-tips.com/free-software-downloads/share-files-folders-screenshots/>, Apr. 2, 2008, pp. 5.
o'reilly.com, http://oreilly.com/digitalmedia/2006/01/01/mac-os-x-screenshot-secrets.html published Jan. 1, 2006 in 10 pages.
Palmas et al., “An Edge-Bundling Layout for Interactive Parallel Coordinates,” 2014 IEEE Pacific Visualization Symposium, pp. 57-64.
Perdisci et al., “Behavioral Clustering of HTTP-Based Malware and Signature Generation Using Malicious Network Traces,” USENIX, Mar. 18, 2010, pp. 1-14.
“Potential Money Laundering Warning Signs,” snapshot taken 2003, https://web.archive.org/web/20030816090055/http:/finsolinc.com/ANTI-MONEY%20LAUNDERING%20TRAINING%20GUIDES.pdf.
Quest, “Toad for ORACLE 11.6—Guide to Using Toad,” Sep. 24, 2012, pp. 1-162.
“Refresh CSS Ellipsis When Resizing Container—Stack Overflow,” Jul. 31, 2013, retrieved from internet http://stackoverflow.com/questions/17964681/refresh-css-ellipsis-when-resizing-container, retrieved on May 18, 2015.
Rouse, Margaret, “OLAP Cube,” <http://searchdatamanagement.techtarget.com/definition/OLAP-cube>, Apr. 28, 2012, pp. 16.
Schroder, Stan, “15 Ways To Create Website Screenshots,” <http://mashable.com/2007/08/24/web-screenshots/>, Aug. 24, 2007, pp. 2.
Shi et al., “A Scalable Implementation of Malware Detection Based on Network Connection Behaviors,” 2013 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery, IEEE, Oct. 10, 2013, pp. 59-66.
Sigrist, et al., “PROSITE, a Protein Domain Database for Functional Characterization and Annotation,” Nucleic Acids Research, 2010, vol. 38, pp. D161-D166.
Sirotkin et al., “Chapter 13: The Processing of Biological Sequence Data at NCBI,” The NCBI Handbook, Oct. 2002, pp. 1-11.
SnagIt, “SnagIt 8.1.0 Print Out 2,” Software release date Jun. 15, 2006, pp. 1-3.
SnagIt, “SnagIt 8.1.0 Print Out,” Software release date Jun. 15, 2006, pp. 6.
SnagIt, “SnagIt Online Help Guide,” <http://download.techsmith.com/snagit/docs/onlinehelp/enu/snagit_help.pdf>, TechSmith Corp., Version 8.1, printed Feb. 7, 2007, pp. 284.
Symantec Corporation, “E-Security Begins with Sound Security Policies,” Announcement Symantec, Jun. 14, 2001.
Thompson, Mick, “Getting Started with GEO,” Jul. 26, 2011.
Umagandhi et al., “Search Query Recommendations Using Hybrid User Profile with Query Logs,” International Journal of Computer Applications, vol. 80, No. 10, Oct. 1, 2013, pp. 7-18.
Wang et al., “Research on a Clustering Data De-Duplication Mechanism Based on Bloom Filter,” IEEE 2010, 5 pages.
Warren, Christina, “TUAW Faceoff: Screenshot apps on the firing line,” <http://www.tuaw.com/2008/05/05/tuaw-faceoff-screenshot-apps-on-the-firing-line/>, May 5, 2008, pp. 11.
Wikipedia, “Federated Database System,” Sep. 7, 2013, retrieved from the internet on Jan. 27, 2015 http://en.wikipedia.org/w/index.php?title=Federated_database_system&oldid=571954221.
Yang et al., “HTML Page Analysis Based on Visual Cues”, A129, pp. 859-864, 2001.
Zheng et al., “GOEAST: a web-based software toolkit for Gene Ontology enrichment analysis,” Nucleic Acids Research 36.suppl 2 (2008): pp. W358-W363.
Notice of Acceptance for Australian Patent Application No. 2014250678 dated Oct. 7, 2015.
Notice of Acceptance for Australian Patent Application No. 2013251186 dated Nov. 6, 2015.
Notice of Allowance for U.S. Appl. No. 12/556,318 dated Apr. 11, 2016.
Notice of Allowance for U.S. Appl. No. 12/556,318 dated Nov. 2, 2015.
Notice of Allowance for U.S. Appl. No. 13/247,987 dated Mar. 17, 2016.
Notice of Allowance for U.S. Appl. No. 13/839,026 dated Mar. 11, 2016.
Notice of Allowance for U.S. Appl. No. 14/102,394 dated Aug. 25, 2014.
Notice of Allowance for U.S. Appl. No. 14/108,187 dated Aug. 29, 2014.
Notice of Allowance for U.S. Appl. No. 14/135,289 dated Oct. 14, 2014.
Notice of Allowance for U.S. Appl. No. 14/148,568 dated Aug. 26, 2015.
Notice of Allowance for U.S. Appl. No. 14/192,767 dated Dec. 16, 2014.
Notice of Allowance for U.S. Appl. No. 14/192,767 dated Apr. 20, 2015.
Notice of Allowance for U.S. Appl. No. 14/225,084 dated May 4, 2015.
Notice of Allowance for U.S. Appl. No. 14/265,637 dated Feb. 13, 2015.
Notice of Allowance for U.S. Appl. No. 14/268,964 dated Dec. 3, 2014.
Notice of Allowance for U.S. Appl. No. 14/294,098 dated Dec. 29, 2014.
Notice of Allowance for U.S. Appl. No. 14/323,935 dated Oct. 1, 2015.
Notice of Allowance for U.S. Appl. No. 14/326,738 dated Nov. 18, 2015.
Notice of Allowance for U.S. Appl. No. 14/473,552 dated Jul. 24, 2015.
Notice of Allowance for U.S. Appl. No. 14/473,860 dated Jan. 5, 2015.
Notice of Allowance for U.S. Appl. No. 14/504,103 dated May 18, 2015.
Notice of Allowance for U.S. Appl. No. 14/552,336 dated Nov. 3, 2015.
Notice of Allowance for U.S. Appl. No. 14/579,752 dated Apr. 4, 2016.
Notice of Allowance for U.S. Appl. No. 14/616,080 dated Apr. 2, 2015.
Notice of Allowance for U.S. Appl. No. 14/676,621 dated Feb. 10, 2016.
Notice of Allowance for U.S. Appl. No. 14/961,481 dated May 2, 2016.
Official Communication for Australian Patent Application No. 2013251186 dated Mar. 12, 2015.
Official Communication for Australian Patent Application No. 2014201511 dated Feb. 27, 2015.
Official Communication for Australian Patent Application No. 2014202442 dated Mar. 19, 2015.
Official Communication for Australian Patent Application No. 2014210604 dated Jun. 5, 2015.
Official Communication for Australian Patent Application No. 2014210614 dated Jun. 5, 2015.
Official Communication for Australian Patent Application No. 2014213553 dated May 7, 2015.
Official Communication for Australian Patent Application No. 2014250678 dated Jun. 17, 2015.
Official Communication for Canadian Patent Application No. 2831660 dated Jun. 9, 2015.
Official Communication for European Patent Application No. 12181585.6 dated Sep. 4, 2015.
Official Communication for European Patent Application No. 14158861.6 dated Jun. 16, 2014.
Official Communication for European Patent Application No. 14159464.8 dated Jul. 31, 2014.
Official Communication for European Patent Application No. 14180142.3 dated Feb. 6, 2015.
Official Communication for European Patent Application No. 14180281.9 dated Jan. 26, 2015.
Official Communication for European Patent Application No. 14180321.3 dated Apr. 17, 2015.
Official Communication for European Patent Application No. 14180321.3 dated May 9, 2016.
Official Communication for European Patent Application No. 14180432.8 dated Jun. 23, 2015.
Official Communication for European Patent Application No. 14186225.0 dated Feb. 13, 2015.
Official Communication for European Patent Application No. 14187739.9 dated Jul. 6, 2015.
Official Communication for European Patent Application No. 14187996.5 dated Feb. 12, 2015.
Official Communication for European Patent Application No. 14187996.5 dated Feb. 19, 2016.
Official Communication for European Patent Application No. 14189344.6 dated Feb. 20, 2015.
Official Communication for European Patent Application No. 14189344.6 dated Feb. 29, 2016.
Official Communication for European Patent Application No. 14189347.9 dated Mar. 4, 2015.
Official Communication for European Patent Application No. 14189802.3 dated May 11, 2015.
Official Communication for European Patent Application No. 14191540.5 dated May 27, 2015.
Official Communication for European Patent Application No. 14197879.1 dated Apr. 28, 2015.
Official Communication for European Patent Application No. 14197895.7 dated Apr. 28, 2015.
Official Communication for European Patent Application No. 14197938.5 dated Apr. 28, 2015.
Official Communication for European Patent Application No. 14199182.8 dated Mar. 13, 2015.
Official Communication for European Patent Application No. 15155845.9 dated Oct. 6, 2015.
Official Communication for European Patent Application No. 15155846.7 dated Jul. 8, 2015.
Official Communication for European Patent Application No. 15165244.3 dated Aug. 27, 2015.
Official Communication for European Patent Application No. 15175106.2 dated Nov. 5, 2015.
Official Communication for European Patent Application No. 15175151.8 dated Nov. 25, 2015.
Official Communication for European Patent Application No. 15183721.8 dated Nov. 23, 2015.
Official Communication for European Patent Application No. 15188106.7 dated Feb. 3, 2016.
Official Communication for European Patent Application No. 15190307.7 dated Feb. 19, 2016.
Official Communication for European Patent Application No. 15190307.7 dated Sep. 26, 2017.
Official Communication for Great Britain Patent Application No. 1404457.2 dated Aug. 14, 2014.
Official Communication for Great Britain Patent Application No. 1404486.1 dated Aug. 27, 2014.
Official Communication for Great Britain Patent Application No. 1404489.5 dated Aug. 27, 2014.
Official Communication for Great Britain Patent Application No. 1404499.4 dated Aug. 20, 2014.
Official Communication for Great Britain Patent Application No. 1404574.4 dated Dec. 18, 2014.
Official Communication for Great Britain Patent Application No. 1408025.3 dated Nov. 6, 2014.
Official Communication for Great Britain Patent Application No. 1411984.6 dated Dec. 22, 2014.
Official Communication for Great Britain Patent Application No. 1413935.6 dated Jan. 27, 2015.
Official Communication for Netherlands Patent Application No. 2011729 dated Aug. 13, 2015.
Official Communication for Netherlands Patent Application No. 2012437 dated Sep. 18, 2015.
Official Communication for Netherlands Patent Application No. 2012438 dated Sep. 21, 2015.
Official Communication for Netherlands Patent Application No. 2013306 dated Apr. 24, 2015.
Official Communication for New Zealand Patent Application No. 622473 dated Jun. 19, 2014.
Official Communication for New Zealand Patent Application No. 622473 dated Mar. 27, 2014.
Official Communication for New Zealand Patent Application No. 622513 dated Apr. 3, 2014.
Official Communication for New Zealand Patent Application No. 622517 dated Apr. 3, 2014.
Official Communication for New Zealand Patent Application No. 624557 dated May 14, 2014.
Official Communication for New Zealand Patent Application No. 627962 dated Aug. 5, 2014.
Official Communication for New Zealand Patent Application No. 628161 dated Aug. 25, 2014.
Official Communication for New Zealand Patent Application No. 628263 dated Aug. 12, 2014.
Official Communication for New Zealand Patent Application No. 628495 dated Aug. 19, 2014.
Official Communication for New Zealand Patent Application No. 628585 dated Aug. 26, 2014.
Official Communication for New Zealand Patent Application No. 628840 dated Aug. 28, 2014.
Official Communication for U.S. Appl. No. 12/556,318 dated Jul. 2, 2015.
Official Communication for U.S. Appl. No. 12/556,321 dated Feb. 25, 2016.
Official Communication for U.S. Appl. No. 12/556,321 dated Jun. 6, 2012.
Official Communication for U.S. Appl. No. 12/556,321 dated Dec. 7, 2011.
Official Communication for U.S. Appl. No. 12/556,321 dated Jul. 7, 2015.
Official Communication for U.S. Appl. No. 13/247,987 dated Apr. 2, 2015.
Official Communication for U.S. Appl. No. 13/247,987 dated Sep. 22, 2015.
Official Communication for U.S. Appl. No. 13/669,274 dated Aug. 26, 2015.
Official Communication for U.S. Appl. No. 13/669,274 dated May 2, 2016.
Official Communication for U.S. Appl. No. 13/669,274 dated May 6, 2015.
Official Communication for U.S. Appl. No. 13/827,491 dated Dec. 1, 2014.
Official Communication for U.S. Appl. No. 13/827,491 dated Jun. 22, 2015.
Official Communication for U.S. Appl. No. 13/827,491 dated Mar. 30, 2016.
Official Communication for U.S. Appl. No. 13/827,491 dated Oct. 9, 2015.
Official Communication for U.S. Appl. No. 13/831,791 dated Feb. 11, 2016.
Official Communication for U.S. Appl. No. 13/831,791 dated Mar. 4, 2015.
Official Communication for U.S. Appl. No. 13/831,791 dated Aug. 6, 2015.
Official Communication for U.S. Appl. No. 13/835,688 dated Jun. 17, 2015.
Official Communication for U.S. Appl. No. 13/835,688 dated Sep. 30, 2015.
Official Communication for U.S. Appl. No. 13/835,688 dated Jun. 7, 2016.
Official Communication for U.S. Appl. No. 13/839,026 dated Aug. 4, 2015.
Official Communication for U.S. Appl. No. 14/102,394 dated Mar. 27, 2014.
Official Communication for U.S. Appl. No. 14/108,187 dated Apr. 17, 2014.
Official Communication for U.S. Appl. No. 14/108,187 dated Mar. 20, 2014.
Official Communication for U.S. Appl. No. 14/134,558 dated Oct. 7, 2015.
Official Communication for U.S. Appl. No. 14/135,289 dated Apr. 16, 2014.
Official Communication for U.S. Appl. No. 14/135,289 dated Jul. 7, 2014.
Official Communication for U.S. Appl. No. 14/148,568 dated Oct. 22, 2014.
Official Communication for U.S. Appl. No. 14/148,568 dated Mar. 26, 2015.
Official Communication for U.S. Appl. No. 14/148,568 dated Mar. 27, 2014.
Official Communication for U.S. Appl. No. 14/192,767 dated Sep. 24, 2014.
Official Communication for U.S. Appl. No. 14/192,767 dated May 6, 2014.
Official Communication for U.S. Appl. No. 14/196,814 dated Aug. 13, 2014.
Official Communication for U.S. Appl. No. 14/196,814 dated May 5, 2015.
Official Communication for U.S. Appl. No. 14/196,814 dated Oct. 7, 2015.
Official Communication for U.S. Appl. No. 14/222,364 dated Dec. 9, 2015.
Official Communication for U.S. Appl. No. 14/225,006 dated Sep. 10, 2014.
Official Communication for U.S. Appl. No. 14/225,006 dated Sep. 2, 2015.
Official Communication for U.S. Appl. No. 14/225,006 dated Dec. 21, 2015.
Official Communication for U.S. Appl. No. 14/225,006 dated Feb. 27, 2015.
Official Communication for U.S. Appl. No. 14/225,084 dated Sep. 11, 2015.
Official Communication for U.S. Appl. No. 14/225,084 dated Sep. 2, 2014.
Official Communication for U.S. Appl. No. 14/225,084 dated Feb. 20, 2015.
Official Communication for U.S. Appl. No. 14/225,084 dated Feb. 26, 2016.
Official Communication for U.S. Appl. No. 14/225,084 dated Jan. 4, 2016.
Official Communication for U.S. Appl. No. 14/225,160 dated Feb. 11, 2015.
Official Communication for U.S. Appl. No. 14/225,160 dated Aug. 12, 2015.
Official Communication for U.S. Appl. No. 14/225,160 dated May 20, 2015.
Official Communication for U.S. Appl. No. 14/225,160 dated Oct. 22, 2014.
Official Communication for U.S. Appl. No. 14/225,160 dated Jan. 25, 2016.
Official Communication for U.S. Appl. No. 14/225,160 dated Jul. 29, 2014.
Official Communication for U.S. Appl. No. 14/265,637 dated Sep. 26, 2014.
Official Communication for U.S. Appl. No. 14/268,964 dated Jul. 11, 2014.
Official Communication for U.S. Appl. No. 14/268,964 dated Sep. 3, 2014.
Official Communication for U.S. Appl. No. 14/289,596 dated Jul. 18, 2014.
Official Communication for U.S. Appl. No. 14/289,596 dated Jan. 26, 2015.
Official Communication for U.S. Appl. No. 14/289,596 dated Apr. 30, 2015.
Official Communication for U.S. Appl. No. 14/289,596 dated Aug. 5, 2015.
Official Communication for U.S. Appl. No. 14/289,596 dated May 9, 2016.
Official Communication for U.S. Appl. No. 14/289,599 dated Jul. 22, 2014.
Official Communication for U.S. Appl. No. 14/289,599 dated May 29, 2015.
Official Communication for U.S. Appl. No. 14/289,599 dated Sep. 4, 2015.
Official Communication for U.S. Appl. No. 14/294,098 dated Aug. 15, 2014.
Official Communication for U.S. Appl. No. 14/294,098 dated Nov. 6, 2014.
Official Communication for U.S. Appl. No. 14/306,138 dated Sep. 14, 2015.
Official Communication for U.S. Appl. No. 14/306,138 dated Mar. 17, 2016.
Official Communication for U.S. Appl. No. 14/306,138 dated Feb. 18, 2015.
Official Communication for U.S. Appl. No. 14/306,138 dated Sep. 23, 2014.
Official Communication for U.S. Appl. No. 14/306,138 dated Dec. 24, 2015.
Official Communication for U.S. Appl. No. 14/306,138 dated May 26, 2015.
Official Communication for U.S. Appl. No. 14/306,138 dated Dec. 3, 2015.
Official Communication for U.S. Appl. No. 14/306,147 dated Feb. 19, 2015.
Official Communication for U.S. Appl. No. 14/306,147 dated Dec. 24, 2015.
Official Communication for U.S. Appl. No. 14/306,147 dated Aug. 7, 2015.
Official Communication for U.S. Appl. No. 14/306,147 dated Sep. 9, 2014.
Official Communication for U.S. Appl. No. 14/306,154 dated Feb. 1, 2016.
Official Communication for U.S. Appl. No. 14/306,154 dated Mar. 11, 2015.
Official Communication for U.S. Appl. No. 14/306,154 dated May 15, 2015.
Official Communication for U.S. Appl. No. 14/306,154 dated Nov. 16, 2015.
Official Communication for U.S. Appl. No. 14/306,154 dated Mar. 17, 2016.
Official Communication for U.S. Appl. No. 14/306,154 dated Jul. 6, 2015.
Official Communication for U.S. Appl. No. 14/306,154 dated Sep. 9, 2014.
Official Communication for U.S. Appl. No. 14/319,765 dated Feb. 1, 2016.
Official Communication for U.S. Appl. No. 14/319,765 dated Sep. 10, 2015.
Official Communication for U.S. Appl. No. 14/319,765 dated Jun. 16, 2015.
Official Communication for U.S. Appl. No. 14/319,765 dated Nov. 25, 2014.
Official Communication for U.S. Appl. No. 14/319,765 dated Feb. 4, 2015.
Official Communication for U.S. Appl. No. 14/323,935 dated Jun. 22, 2015.
Official Communication for U.S. Appl. No. 14/323,935 dated Nov. 28, 2014.
Official Communication for U.S. Appl. No. 14/323,935 dated Mar. 31, 2015.
Official Communication for U.S. Appl. No. 14/326,738 dated Dec. 2, 2014.
Official Communication for U.S. Appl. No. 14/326,738 dated Jul. 31, 2015.
Official Communication for U.S. Appl. No. 14/326,738 dated Mar. 31, 2015.
Official Communication for U.S. Appl. No. 14/473,552 dated Feb. 24, 2015.
Official Communication for U.S. Appl. No. 14/473,860 dated Nov. 4, 2014.
Official Communication for U.S. Appl. No. 14/479,160 dated Apr. 20, 2016.
Official Communication for U.S. Appl. No. 14/486,991 dated Mar. 10, 2015.
Official Communication for U.S. Appl. No. 14/490,612 dated Aug. 18, 2015.
Official Communication for U.S. Appl. No. 14/490,612 dated Jan. 27, 2015.
Official Communication for U.S. Appl. No. 14/490,612 dated Mar. 31, 2015.
Official Communication for U.S. Appl. No. 14/504,103 dated Mar. 31, 2015.
Official Communication for U.S. Appl. No. 14/504,103 dated Feb. 5, 2015.
Official Communication for U.S. Appl. No. 14/516,386 dated Feb. 24, 2016.
Official Communication for U.S. Appl. No. 14/552,336 dated Jul. 20, 2015.
Official Communication for U.S. Appl. No. 14/571,098 dated Nov. 10, 2015.
Official Communication for U.S. Appl. No. 14/571,098 dated Mar. 11, 2015.
Official Communication for U.S. Appl. No. 14/571,098 dated Feb. 23, 2016.
Official Communication for U.S. Appl. No. 14/571,098 dated Aug. 24, 2015.
Official Communication for U.S. Appl. No. 14/571,098 dated Aug. 5, 2015.
Official Communication for U.S. Appl. No. 14/579,752 dated Aug. 19, 2015.
Official Communication for U.S. Appl. No. 14/579,752 dated May 26, 2015.
Official Communication for U.S. Appl. No. 14/579,752 dated Dec. 9, 2015.
Official Communication for U.S. Appl. No. 14/631,633 dated Feb. 3, 2016.
Official Communication for U.S. Appl. No. 14/639,606 dated Oct. 16, 2015.
Official Communication for U.S. Appl. No. 14/639,606 dated May 18, 2015.
Official Communication for U.S. Appl. No. 14/639,606 dated Jul. 24, 2015.
Official Communication for U.S. Appl. No. 14/639,606 dated Apr. 5, 2016.
Official Communication for U.S. Appl. No. 14/676,621 dated Oct. 29, 2015.
Official Communication for U.S. Appl. No. 14/676,621 dated Jul. 30, 2015.
Official Communication for U.S. Appl. No. 14/715,834 dated Apr. 13, 2016.
Official Communication for U.S. Appl. No. 14/715,834 dated Feb. 19, 2016.
Official Communication for U.S. Appl. No. 14/726,353 dated Mar. 1, 2016.
Official Communication for U.S. Appl. No. 14/726,353 dated Sep. 10, 2015.
Official Communication for U.S. Appl. No. 14/741,256 dated Feb. 9, 2016.
Official Communication for U.S. Appl. No. 14/800,447 dated Dec. 10, 2015.
Official Communication for U.S. Appl. No. 14/800,447 dated Mar. 3, 2016.
Official Communication for U.S. Appl. No. 14/800,447 dated Jun. 6, 2016.
Official Communication for U.S. Appl. No. 14/813,749 dated Sep. 28, 2015.
Official Communication for U.S. Appl. No. 14/813,749 dated Apr. 8, 2016.
Official Communication for U.S. Appl. No. 14/841,338 dated Feb. 18, 2016.
Official Communication for U.S. Appl. No. 14/842,734 dated Nov. 19, 2015.
Official Communication for U.S. Appl. No. 14/871,465 dated Apr. 11, 2016.
Official Communication for U.S. Appl. No. 14/871,465 dated Feb. 9, 2016.
Official Communication for U.S. Appl. No. 14/883,498 dated Mar. 17, 2016.
Official Communication for U.S. Appl. No. 14/883,498 dated Dec. 24, 2015.
Official Communication for U.S. Appl. No. 14/961,481 dated Mar. 2, 2016.
Official Communication for U.S. Appl. No. 14/975,215 dated May 19, 2016.
Restriction Requirement for U.S. Appl. No. 13/839,026 dated Apr. 2, 2015.
Ferreira et al., “A Scheme for Analyzing Electronic Payment Systems,” Brazil 1997.
Notice of Allowance for U.S. Appl. No. 14/883,498 dated Jan. 31, 2018.
Official Communication for U.S. Appl. No. 14/332,306 dated May 20, 2016.
Official Communication for U.S. Appl. No. 14/715,834 dated Jun. 28, 2016.
Official Communication for U.S. Appl. No. 14/883,498 dated Aug. 10, 2017.
Official Communication for U.S. Appl. No. 14/883,498 dated Aug. 22, 2016.
Official Communication for U.S. Appl. No. 14/883,498 dated Feb. 9, 2017.
Official Communication for European Patent Application No. 15190307.7 dated Apr. 16, 2018.
Related Publications (1)
Number Date Country
20180239768 A1 Aug 2018 US
Provisional Applications (1)
Number Date Country
62064793 Oct 2014 US
Continuations (1)
Number Date Country
Parent 14883498 Oct 2015 US
Child 15959016 US