The subject matter disclosed herein relates to real-time monitoring of conditions of industrial equipment. In particular, example embodiments relate to systems and methods for analyzing data associated with industrial equipment to derive qualitative measures of industrial equipment conditions and provide user interfaces (UIs) to present the derived qualitative measures of industrial equipment conditions.
Industrial equipment (e.g., machinery) is often equipped with sensors to provide insights regarding the ongoing performance and condition of the equipment to aid in decisions regarding maintenance and replacement of such equipment. Each piece of equipment may have several different sensors, each of which is responsible for providing output data related to different functional aspects of the equipment. Given the large quantity of sensor data output for each piece of equipment, it is often difficult to accurately interpret and understand the data that is output. For example, conventional methods involve performing calculations using the sensor data in accordance with a standard rubric. However, the conventional methodologies are very time-consuming, and their result is simply a snapshot of the condition of the equipment at the time of the particular sensor data used in the calculations. As a result, the insights gained from the application of these conventional methodologies do not reflect the current condition of the equipment being analyzed.
Additionally, industrial equipment is often subject to periodic reviews (e.g., annual or quarterly) to assess performance and condition of the equipment. During such reviews, the condition of the equipment is evaluated to determine whether repairs or maintenance need to occur to avoid potentially lengthy downtime of the equipment due to wear or damage. The conventional review process again involves performing calculations using the sensor data in accordance with a standard rubric. However, it is often difficult to manually collect the sensor data, and the calculations of the conventional methodologies are often too complex to produce results that are both accurate and timely. Further, in many instances, it is important for a decision maker (e.g., a human tasked with maintaining equipment in proper working order) to understand how sensor data output is used in the calculations and what the exact values are, though the conventional methodologies typically do not provide a convenient way for the decision makers to do this. Making decisions regarding maintenance and replacement of equipment without an accurate understanding of the current condition of the equipment and the calculations used to assess the current condition may ultimately compromise the health and longevity of the equipment.
Various ones of the appended drawings merely illustrate example embodiments of the present inventive subject matter and cannot be considered as limiting its scope.
Reference will now be made in detail to specific example embodiments for carrying out the inventive subject matter. Examples of these specific embodiments are illustrated in the accompanying drawings, and specific details are set forth in the following description in order to provide a thorough understanding of the subject matter. It will be understood that these examples are not intended to limit the scope of the claims to the illustrated embodiments. On the contrary, they are intended to cover such alternatives, modifications, and equivalents as may be included within the scope of the disclosure.
Aspects of the present disclosure relate to a system and methods for real-time monitoring of industrial equipment conditions. A real-time auditing system collects and consumes source data from sensors coupled to equipment (e.g., a machine or set of machines) and from human-generated reports (e.g., maintenance reports). The real-time auditing system uses the source data to determine conditions (e.g., a state of working order) of the equipment in real-time. The conditions are determined in “real-time” such that they are determined as the source data changes (e.g., as the sensor output changes). More specifically, the real-time auditing system analyzes the source data to compute scores related to various factors that impact the overall equipment condition. The factors that impact the overall equipment condition are referred to herein as “condition factors,” and the scores related thereto are, accordingly, referred to herein as “condition factor scores.” Each computed condition factor score provides a qualitative and quantitative measure of a functional aspect of the equipment condition. Using the condition factor scores, the real-time auditing system also determines an overall condition score that provides a qualitative and quantitative measure of an overall functional state of the equipment.
Additional aspects of the real-time auditing system involve UIs for presenting an overview of the condition factors of equipment along with the associated condition factor score of each piece of equipment. The UIs are configured to present the condition factor scores in two views: 1) a table view and 2) a detailed view. The table view includes presentation of condition factor scores for each condition factor for each piece of equipment in a simple table. The detailed view includes the condition factor scores for each condition factor for each piece of equipment along with detailed information related to how each score was derived. For example, for a particular condition factor score, the detailed view may include a presentation of any underlying formulas used to compute the score along with the variable values (e.g., sensor data values) used to generate the condition factor scores from the formulas. By providing the derivation information within the detailed view, the real-time auditing system allows for conditions of industrial equipment to be easily audited in real-time (e.g., as the conditions of the equipment change).
Additionally, the UIs allow users to filter and sort the presentation of condition factor scores by condition factors, condition factor values, equipment types, equipment features, and equipment locations. The UIs also provide users with a user guide that users may access to obtain the calculation methodology used in calculating any one of the condition factor scores. The UIs allow users to manually edit condition factor scores and create notes (e.g., textual or audio) to associate with edited scores.
The UIs also include multiple views for presenting aggregate condition data (e.g., overall condition scores and condition factor scores) associated with sets of equipment. For example, the UIs may include a view for presenting aggregate condition data associated with a facility (e.g., a manufacturing facility) that includes a set of equipment. As another example, the UIs may include a view for presenting aggregate condition data associated with an asset that includes multiple facilities, each of which includes a set of equipment. As yet another example, the UIs may include a view for presenting aggregate condition data associated with a region (e.g., a geographic region) that includes multiple assets as well as a global view that aggregates condition data from multiple regions. Accordingly, the real-time monitoring system may find application in monitoring a variety of industrial equipment that operates in a variety of industrial contexts, including manufacturing and processing facilities such as consumer product manufacturing facilities, chemical plants, oil drilling platforms, oil refineries, drug manufacturing facilities, aircraft manufacturing facilities, automotive manufacturing facilities, and the like.
As shown, the network system 100 includes a real-time auditing system 102, a device 104, and a third-party computing system 106, all communicatively coupled to each other via a network 108. The real-time auditing system 102 may be implemented in a special-purpose (e.g., specialized) computer system, in whole or in part, as described below.
Also shown in
The device 104 may also include any one of a web client 112 or application 114 to facilitate communication and interaction between the user 110 and the real-time auditing system 102. In various embodiments, information communicated between the real-time auditing system 102 and the device 104 may involve user-selected functions available through one or more user interfaces (UIs). The UIs may be specifically associated with the web client 112 (e.g., a browser) or the application 114. Accordingly, during a communication session with the device 104, the real-time auditing system 102 may provide the device 104 with a set of machine-readable instructions that, when interpreted by the device 104 using the web client 112 or the application 114, cause the device 104 to present a UI, and transmit user input received through such a UI back to the real-time auditing system 102. As an example, the UIs provided to the device 104 by the real-time auditing system 102 allow users to view information that is updated in real-time describing conditions of equipment (e.g., the information is updated as the conditions change).
The network 108 may be any network that enables communication between or among systems, machines, databases, and devices (e.g., between real-time auditing system 102 and the device 104). Accordingly, the network 108 may be a wired network, a wireless network (e.g., a mobile or cellular network), or any suitable combination thereof. The network 108 may include one or more portions that constitute a private network, a public network (e.g., the Internet), or any suitable combination thereof. Accordingly, the network 108 may include one or more portions that incorporate a local area network (LAN), a wide area network (WAN), the Internet, a mobile telephone network (e.g., a cellular network), a wired telephone network (e.g., a plain old telephone system (POTS) network), a wireless data network (e.g., a WiFi network or WiMax network), or any suitable combination thereof. Any one or more portions of the network 108 may communicate information via a transmission medium. As used herein, “transmission medium” refers to any intangible (e.g., transitory) medium that is capable of communicating (e.g., transmitting) instructions for execution by a machine (e.g., by one or more processors of such a machine), and includes digital or analog communication signals or other intangible media to facilitate communication of such software.
Also shown in
The interface module 200 receives requests from the device 104, and communicates appropriate responses to the device 104. The interface module 200 may receive requests from devices in the form of Hypertext Transfer Protocol (HTTP) requests or other web-based API requests. For example, the interface module 200 provides a number of interfaces (e.g., APIs or UIs that are presented by the device 104) that allow data to be received by the real-time auditing system 102.
The interface module 200 also provides UIs to devices (e.g., device 104) that include graphical representations of various analytics and other output produced by the real-time auditing system 102. To provide a UI to the device 104, the interface module 200 transmits a set of machine-readable instructions to the device 104 that causes the device 104 to present the UI on a display of the device 104. The set of machine-readable instructions may, for example, include presentation data (e.g., representing the UI) and a set of instructions to display the presentation data. The device 104 may temporarily store the presentation data to enable display of the UI.
The UIs provided by the interface module 200 may include various graphs, tables, charts, and other graphics used, for example, to present condition data describing the working state of equipment. The interfaces may also include various input control elements (e.g., sliders, buttons, drop-down menus, check-boxes, and data entry fields) that allow users to specify various inputs, and the interface module 200 receives and processes user input received through such input control elements.
The UIs provided by the interface module 200 may include multiple “views” for presenting condition data to users along with a selectable element (e.g., a toggle) to switch between views. For example, the UIs provided by the interface module 200 may include a table view and a detailed view of condition data along with a toggle to switch between the table view and the detailed view. The table view includes a table in which the rows correspond to condition factors, the columns correspond to individual pieces of industrial equipment, and each entry in the table includes a condition factor score for the corresponding condition factor and individual instance of industrial equipment. The detailed view includes, for each piece of industrial equipment included in the table view, a condition factor score for each condition factor along with derivation information associated with each condition factor score. The derivation information includes underlying information used in generating a condition factor score. For example, the derivation information may include one or more variable values (e.g., corresponding to a sensor output value) used in calculating the score as well as one or more formulas used to generate the score with the one or more variable values. Using the toggle button, a viewing user can easily switch between the table view and the detailed view from within the UI.
As shown, the interface module 200 includes a filter/sort module 201 configured to filter and sort condition data presented within UIs. The filter/sort module 201 may filter and/or sort information in accordance with user selections made via interactive elements (e.g., buttons, drop-down menus, check-boxes or combinations thereof) displayed in conjunction with the condition data. For example, the UIs may include selectable elements that allow the user to filter and/or sort information based on any one of condition factor, condition factor score, equipment type, equipment attributes, and equipment location (e.g., facility in which the equipment is located). Depending on the filters selected, the filtering of the condition data may include adding or removing one or more condition factor scores. Depending on the specified sorting selection, the sorting of the condition data may include displaying condition factor scores in a particular order (e.g., ascending or descending order), particular grouping (e.g., condition factor scores corresponding to equipment of the same type may be grouped together), or combinations of both. Examples of the UIs provided by the interface module 200 (e.g., to the device 104) are discussed below in reference to
The data retrieval module 202 is configured to retrieve and integrate source data (e.g., sensor data and report data) for analysis in the real-time auditing system 102. The source data obtained by the data retrieval module 202 includes: sensor data corresponding to data output by one or more sensors 116 coupled to the industrial equipment 118; and report data comprising information (e.g., human-generated information) describing operational aspects of the industrial equipment 118. Accordingly, in some instances, the data retrieval module 202 may retrieve sensor data through appropriate requests submitted to one or more machine interfaces of the sensors 116 that provide a direct link to the output of the sensors 116. In some instances, the data retrieval module 202 retrieves source data from the third-party computing system 106 through appropriate requests (e.g., API requests or calls) transmitted over the network 108. The data may be retrieved by the data retrieval module 202 on a constant basis or as changes to the data are detected.
In integrating the source data into the real-time auditing system 102, the data retrieval module 202 performs operations including cleaning, extracting, transforming, and translating raw data, and creating a data object for each instance of equipment (e.g., an individual machine) included in the source data. Each data object, which may be referred to as an “equipment data object,” is a data structure (e.g., a table) that includes information describing the equipment, such as: an equipment identifier, a designator of equipment type, equipment age, equipment location (e.g., a particular facility, asset, or region in which the equipment resides), and current and historic condition data of the equipment. The source data that is retrieved and integrated into the real-time auditing system 102 is stored in the database 212 for subsequent processing and analysis by the real-time auditing system 102.
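As an illustrative sketch only (the field names here are assumptions, not part of the disclosure), an equipment data object of the kind described above might be modeled as follows:

```python
from dataclasses import dataclass, field

@dataclass
class EquipmentDataObject:
    """Hypothetical equipment data object holding descriptive equipment
    information and a running history of condition data."""
    equipment_id: str
    equipment_type: str
    age_years: float
    location: dict                 # e.g., {"facility": ..., "asset": ..., "region": ...}
    condition_history: list = field(default_factory=list)  # (timestamp, condition data) entries

# Example: a record for a single machine at one facility
pump = EquipmentDataObject(
    equipment_id="EQ-001",
    equipment_type="pump",
    age_years=3.5,
    location={"facility": "F1", "asset": "A1", "region": "R1"},
)
```

Storing current and historic condition data together in one structure mirrors the stated goal of maintaining both current and historic scores per piece of equipment.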
The calculation engine 204 is configured to analyze source data to derive condition data (e.g., individual condition factor scores and an overall condition score) describing various aspects of the condition of equipment. More specifically, the calculation engine 204 uses the source data to compute condition factor scores, each of which provides a measure (both qualitative and quantitative) of a condition factor that impacts an overall functional state of the industrial equipment 118. In other words, the condition factor score provides a measure of an aspect of equipment condition. In computing each condition factor score, the calculation engine 204 accesses a predefined formula or set of formulas corresponding to the particular condition factor score being computed. The calculation engine 204 then evaluates the formula or set of formulas using a portion of the source data (e.g., specific sensor data or report data corresponding to one or more variables included in the formula). Each computed condition factor score is stored in the database 212 (e.g., as part of the equipment data object) with a timestamp so as to maintain a record of current and historic condition factor scores associated with the industrial equipment 118.
The calculation engine 204 is further configured to derive information related to the overall condition of the industrial equipment 118. In particular, the calculation engine 204 computes an overall condition score for the industrial equipment 118 that provides a measure (that is both qualitative and quantitative) of an overall functional state (e.g., an overall condition) of the industrial equipment 118. The calculation engine 204 computes the overall condition score by aggregating the condition factor scores computed for the industrial equipment 118. For example, in calculating the overall condition score for the industrial equipment 118, the calculation engine 204 may compute a weighted average of the condition factor scores. More specifically, in this example, the calculation engine 204 may apply a predefined weight to each condition factor score (e.g., by multiplying the condition factor score by the weight) based on the relative impact of the corresponding condition factor on the overall condition. Once the weight is applied to each condition factor score, the calculation engine 204 may compute the average of the weighted values to produce the overall condition score.
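The weighted-average aggregation described above can be sketched as follows (a minimal illustration, assuming scores and weights keyed by condition factor name; the normalization by total weight is one plausible choice):

```python
def overall_condition_score(factor_scores, weights):
    """Weighted average of condition factor scores.

    factor_scores: dict mapping condition factor name -> score
    weights: dict mapping condition factor name -> predefined weight
             reflecting that factor's relative impact on overall condition
    """
    total_weight = sum(weights[f] for f in factor_scores)
    weighted_sum = sum(factor_scores[f] * weights[f] for f in factor_scores)
    return weighted_sum / total_weight

# e.g., vibration weighted twice as heavily as temperature:
# (8 * 2 + 6 * 1) / (2 + 1) = 22 / 3 ≈ 7.33
score = overall_condition_score(
    {"vibration": 8, "temperature": 6},
    {"vibration": 2, "temperature": 1},
)
```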
The calculation engine 204 is further configured to compute aggregate condition data for various sets of industrial equipment. For example, the calculation engine 204 may compute aggregate condition data for a facility that includes a set of industrial equipment, an asset that includes a set of facilities, or a region that includes a set of assets. The aggregate condition data includes an overall aggregate condition score as well as one or more aggregate condition factor scores. Accordingly, in calculating aggregate condition data, the calculation engine 204 may aggregate respective condition factor scores (e.g., the condition factor score of each piece of equipment) corresponding to each condition factor to compute aggregate condition factor scores. The calculation engine 204 may further aggregate respective overall condition scores (e.g., the overall condition factor score of each piece of equipment) to compute an overall aggregate condition score.
The listener module 206 is configured to monitor incoming source data to detect any changes thereto. The listener module 206 detects changes to source data by comparing current source data to previous source data, and based on the comparison, determines whether a difference exists. For example, the listener module 206 may detect a change in value of sensor data produced by the sensors 116 by comparing current values in the sensor data to previous values in the sensor data and determining, based on the comparison, whether a difference exists between the two. In response to detecting a change to the source data, the listener module 206 causes the calculation engine 204 to dynamically re-compute one or more impacted condition factor scores. In other words, the listener module 206 causes the calculation engine 204 to compute one or more updated condition factor scores based on a detected difference in updated source data.
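The comparison performed by the listener module 206 can be illustrated with a simple snapshot diff (a sketch under the assumption that source data snapshots are dicts of sensor readings):

```python
def detect_changes(previous, current):
    """Return the keys whose values differ between the previous and
    current source-data snapshots (new keys count as changes)."""
    return [k for k in current if previous.get(k) != current[k]]

def on_new_snapshot(previous, current, recompute):
    """If any value changed, trigger recomputation of impacted scores."""
    changed = detect_changes(previous, current)
    if changed:
        recompute(changed)  # e.g., hand changed sensor keys to the calculation engine
    return changed
```

Only a detected difference triggers recomputation, matching the behavior described above in which the calculation engine re-computes scores in response to changed source data.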
The edit/comment module 208 is responsible for processing user edits and comments related to condition data of the industrial equipment 118. To this end, the edit/comment module 208 may work in conjunction with the interface module 200 to display a comment component (e.g., a window), which is an interface element operable to provide further information about a particular condition factor score and receive user comments related to the condition factor score, or an editor component (e.g., a window), which is an interface element operable to receive user edits and comments related to condition factor scores. The comment and editor components both include a field for users to submit comments, and the editor component further provides interactive elements (e.g., a drop-down menu) to enable users to manually edit condition factor scores. The user comments may, for example, be in the form of text, image, audio, or video. The user comments may be related to an edit to a condition factor score or an overall condition score of a single piece of industrial equipment or a set of industrial equipment. The edit/comment module 208 updates scores based on the received user edits and stores each user comment in the database 212 with an association to the applicable piece of industrial equipment or set of industrial equipment. Additionally, the edit/comment module 208 works in conjunction with the interface module 200 to present the received user comments in the comment component. Depending on the type of comment received, the presentation of the comment may include displaying a textual comment, displaying all or part of a video file or image, or presenting all or part of an audio file.
In some embodiments, the UIs provided by the interface module 200 include a selectable element that allows users to access a user guide provided by the user guide module 210. The user guide module 210 may work in conjunction with the interface module 200 to provide the user guide in response to selection of the selectable element. The user guide provided by the user guide module 210 comprises calculation criteria 210A and calculation methodology 210B. The calculation criteria 210A specifies criteria used to determine the condition factor scores and hazard levels. For example, the calculation criteria may specify how underlying values (e.g., resulting from intermediate calculations) map to particular condition factor scores. The calculation methodology 210B specifies assumptions, formulas, and methods used to determine condition factor scores. Further details regarding the user guides provided by the user guide module 210 are illustrated in
The database 212 is a network-accessible machine-readable storage medium. In addition to storing the equipment object corresponding to the industrial equipment 118, the database 212 stores equipment objects for multiple other pieces of equipment that may be distributed across multiple facilities, assets, and regions. Each equipment object includes information describing the equipment such as equipment identifiers, equipment age, equipment locations (e.g., facility, asset, or region), current and historic condition factor scores, and current and historic overall condition scores.
As is understood by skilled artisans in the relevant computer and Internet-related arts, each functional component (e.g., engine, module, or database) illustrated in
At operation 305, the data retrieval module 202 ingests source data. The source data includes sensor data (e.g., sets of output data of sensors 116) and report data (e.g., human-generated reports) related to the industrial equipment 118. The source data may include multiple sets of sensor data (e.g., sensor data measuring various operational aspects of a piece of equipment). Each set of sensor data may correspond to the output of a particular sensor and may include measurements of various operational aspects (e.g., functional attributes) of the industrial equipment 118 (e.g., engine or motor speed, torque, valve positions, precooler temperatures, deceleration, acceleration, pH, moisture, flow rate, altitude, depth/level, or flux). In ingesting the source data, the data retrieval module 202 performs operations including: obtaining sensor data from the sensors 116 (e.g., directly from the sensors 116 or from the third-party computing system 106); obtaining the report data from the third-party computing system 106; parsing the sensor data and the report data to identify data related to the industrial equipment 118; and storing the parsed source data in the database 212 for subsequent processing and analysis by the real-time auditing system 102. The data retrieval module 202 stores the parsed source data related to the industrial equipment 118 in association with or as part of an equipment object (e.g., a data structure) corresponding to the industrial equipment 118.
At operation 310, the calculation engine 204 computes one or more condition factor scores associated with the industrial equipment 118 based on the ingested source data. Each computed condition factor score provides a qualitative and quantitative measure of a condition factor that impacts an overall functional state of the industrial equipment. In computing each condition factor score, the calculation engine 204 accesses a predefined formula or set of formulas (e.g., from database 212) corresponding to the particular condition factor score being computed. Each formula may include one or more variables that correspond to the output of a particular sensor from the sensors 116. The calculation engine 204 evaluates the accessed formula or set of formulas using a portion of the source data (e.g., specific sensor data or report data corresponding to one or more variables included in the formula). In some instances, the calculation engine 204 may use the calculation methodology 210B to map intermediate values resulting from the evaluation of the formula to a particular condition factor score. For example, the calculation engine 204 may assign a condition factor score to a particular condition factor of a piece of industrial equipment based on an underlying value (e.g., resulting from evaluation of the one or more predefined formulas).
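The evaluate-then-map step of operation 310 can be sketched as follows (an illustration only: the formula is stood in for by a callable, and the score bands are hypothetical):

```python
def compute_condition_factor_score(formula, sensor_values, score_bands):
    """Evaluate a predefined formula over sensor variable values, then
    map the intermediate result to a condition factor score.

    formula: callable whose parameters name the sensor variables
             (a stand-in for a formula retrieved from storage)
    sensor_values: dict mapping variable name -> sensor data value
    score_bands: list of (lower, upper, score) tuples, standing in for
                 the mapping criteria of the calculation methodology
    """
    intermediate = formula(**sensor_values)
    for lower, upper, score in score_bands:
        if lower <= intermediate <= upper:
            return score
    raise ValueError(f"no score band covers intermediate value {intermediate}")

# Hypothetical usage: a bearing-temperature factor where higher
# intermediate values map to a better (higher) score band.
bands = [(0, 50, 2), (51, 100, 8)]
score = compute_condition_factor_score(lambda temp: temp, {"temp": 70}, bands)
```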
At operation 315, the calculation engine 204 computes an overall condition score associated with the industrial equipment 118 based on the one or more condition factor scores. The overall condition score provides a qualitative and quantitative measure of an overall functional state of the industrial equipment 118. The calculation engine 204 computes the overall condition score by aggregating the condition factor scores computed for the industrial equipment 118. For example, in calculating the overall condition score for the industrial equipment 118, the calculation engine 204 may compute a weighted average of the condition factor scores.
At operation 320, the calculation engine 204 stores the computed one or more condition factor scores and the overall condition score, which collectively compose the condition data, in the database 212. The calculation engine 204 may store the condition data in association with or as part of the equipment object corresponding to the industrial equipment 118. The calculation engine 204 stores the one or more condition factor scores and the overall condition scores with an associated time stamp so as to maintain a record of condition factor scores and overall condition scores of the industrial equipment 118 over time.
At operation 325, the interface module 200 generates presentation data representing a portion of a UI for presenting the condition data (e.g., the one or more condition factor scores and the overall condition score). In generating the presentation data, the interface module 200 accesses the one or more condition factor scores and the overall condition score from the database 212 and incorporates them into the presentation data. The portion of the UI may correspond to either the table view or detailed view of condition data. In instances in which the UI corresponds to the detailed view, the interface module 200 may further access and incorporate variable values (e.g., sensor data values), formulas, and intermediate values (e.g., resulting from evaluation of a formula using one or more variable values) used by the calculation engine 204 into the presentation data.
The generating of the presentation data representing the portion of the UI may further include determining a hazard level associated with each condition factor based on the corresponding condition factor score, and assigning a visual indicator to the condition factor score based on the determined hazard level. Similarly, the generating of the presentation data may further include determining an overall hazard level of the industrial equipment 118 based on the overall condition score, and assigning a visual indicator to the overall condition score based on the overall hazard level. Each hazard level may, for example, correspond to one of multiple coarsely granular relative rankings such as high, medium, and low. The determining of the hazard level may include comparing the score (either condition factor score or overall condition score) to a range of scores associated with each level, and determining, based on the comparison, the range in which the score falls. For example, condition factor scores of 0-4 may correspond to a high hazard level, scores of 5-7 may correspond to a medium hazard level, and scores of 8-10 may correspond to a low hazard level.
The visual indicator assigned to the condition factor score may include text corresponding to the coarsely granular relative ranking (e.g., “high,” “medium,” or “low”) and in addition, or in the alternative, a color. For example, scores corresponding to a “high” hazard level may be displayed with a red colored indicator, scores corresponding to a “medium” hazard level may be displayed with a yellow colored indicator, and scores corresponding to a “low” hazard level may be displayed with a green colored indicator. Accordingly, the assigning of the visual indicator to a condition factor may include selecting a particular color from among multiple available colors based on, for example, information included in a look-up table.
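Taken together, the hazard-level binning and color look-up described above can be sketched as follows (using the example bands and colors from the text; the function names are illustrative):

```python
# Example bands from the text: 0-4 -> high, 5-7 -> medium, 8-10 -> low.
HAZARD_BANDS = [
    (0, 4, "high"),
    (5, 7, "medium"),
    (8, 10, "low"),
]
# Look-up table mapping hazard level to indicator color.
HAZARD_COLORS = {"high": "red", "medium": "yellow", "low": "green"}

def hazard_level(score):
    """Determine the range in which the score falls and return its level."""
    for lower, upper, level in HAZARD_BANDS:
        if lower <= score <= upper:
            return level
    raise ValueError(f"score {score} outside the 0-10 range")

def visual_indicator(score):
    """Return the (text, color) pair assigned to a score."""
    level = hazard_level(score)
    return level, HAZARD_COLORS[level]
```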
At operation 330, the interface module 200 causes display of the portion of the UI (e.g., the table view or the detailed view) on the device 104. The interface module 200 may cause display of the UI on the device 104 by providing the device 104 (e.g., through electronic transmission) with the presentation data representing the portion of the UI and instructions that, when executed by the device 104, cause the device 104 to display the portion of UI. Examples of portions of the UI displayed by the device 104, according to some embodiments, are illustrated in
As shown in
At operation 340, responsive to the listener module 206 detecting the change in the source data, the calculation engine 204 dynamically recomputes the one or more condition factor scores, the result of which is at least one updated condition factor score. In other words, the calculation engine 204 uses updated source data (e.g., changed source data) to update (e.g., recompute) the one or more condition factor scores.
At operation 345, responsive to the computation of at least one updated condition factor score, the calculation engine 204 dynamically recomputes the overall condition score associated with the industrial equipment 118, the result of which is an updated overall condition score. In other words, the calculation engine 204 updates the overall condition score using the updated one or more condition factor scores.
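Operations 340 and 345 can be sketched as a single recompute pass, assuming for illustration that the overall condition score is a weighted average of the condition factor scores; the weighting scheme, function names, and data shapes here are assumptions, not the actual behavior of the calculation engine 204.

```python
def recompute_scores(source_data, factor_fns, weights):
    """Recompute each condition factor score from updated source data,
    then re-derive the overall score as a weighted average.

    factor_fns: mapping of factor name -> function(source_data) -> score
    weights:    mapping of factor name -> relative weight (assumed scheme)
    """
    # Operation 340: recompute every condition factor score.
    factor_scores = {name: fn(source_data) for name, fn in factor_fns.items()}
    # Operation 345: re-derive the overall score from the factor scores.
    total = sum(weights[name] for name in factor_scores)
    overall = sum(score * weights[name]
                  for name, score in factor_scores.items()) / total
    return factor_scores, overall
```

Triggering this pass from the listener module's change detection keeps the displayed scores consistent with the latest source data.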
At operation 350, the calculation engine 204 stores the updated one or more condition factor scores and the updated overall condition score in the database 212. The calculation engine 204 stores the updated one or more condition factor scores and the updated overall condition score as part of, or in association with, the equipment object corresponding to the industrial equipment 118 so as to maintain an association between current scores and historical scores.
At operation 355, the interface module 200 updates the portion of the UI displayed on the device 104 to include the updated one or more condition factor scores and the updated overall condition score associated with the industrial equipment 118. The updating of the UI includes replacing previous scores (e.g., condition factor scores or the overall condition score) in the presentation data with current (e.g., updated) scores. Additionally, the updating of the UI includes determining an updated hazard level based on the current scores, and updating the visual indicator assigned to each condition factor based on the corresponding updated hazard level. As with the initial determination of the hazard level, determining the updated hazard level may be based on which of multiple ranges (e.g., ranges of scores) the updated score falls into. In instances in which the portion of the UI being displayed corresponds to the detailed view, the updating of the display of the UI may further include updating one or more underlying variable values or other intermediate values to reflect changes to sensor data values in the source data.
As shown in
At operation 360, the interface module 200 provides a selectable element (e.g., a toggle button) within the UI that is operable (e.g., by way of user selection) to toggle the UI between the table view and a detailed view. The interface module 200 causes the selectable element to be presented in conjunction with the table view (e.g., in an area in the UI above the table view).
At operation 365, the interface module 200 receives user selection of the selectable element. The user selection may be accomplished using an input device (e.g., mouse or touch screen) of the device 104 on which the UI is being displayed.
In response to receiving the user selection, the interface module 200 accesses derivation information related to the condition factor scores displayed within the table view, at operation 370. For each condition factor score, the derivation information may include one or more underlying variable values (e.g., a sensor data value), one or more formulas used to generate the condition factor score using the one or more underlying variable values, and one or more intermediate values resulting from the evaluation of a formula using the one or more underlying variable values.
Also responsive to receiving the user selection, the interface module 200 updates the UI to present the detailed view that includes the derivation information, at operation 375. The updating of the UI may include providing the device 104 (e.g., through electronic transmission) with the presentation data representing the detailed view and further instructions that, when executed by the device 104, cause the device 104 to display the detailed view in place of the table view.
As shown in
At operation 380, the interface module 200 receives, via the device 104, a filter selection from among multiple filters presented within the UI. The filters may, for example, include condition factor, equipment type, equipment attributes, and equipment location (e.g., facility).
At operation 385, the filter/sort module 201 of the interface module 200 filters the condition data presented in the portion of the UI in accordance with the filter selection. Depending on what other filter selections have been selected, if any, the filtering of the condition data by the filter/sort module 201 may include adding or removing one or more condition factors, condition factor scores, or pieces of industrial equipment from display within the portion of the UI (e.g., table view or detailed view). For example, upon receiving a filter selection corresponding to a particular condition factor, the filter/sort module 201 may remove condition factor scores for all condition factors except for the condition factor corresponding to the filter selection.
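The filtering performed by the filter/sort module 201 can be sketched as follows; the row representation and field names are hypothetical, chosen only to illustrate that a row is retained when it matches every active filter selection.

```python
def apply_filters(rows, selections):
    """Keep only rows of condition data matching every active filter.

    rows:       list of dicts, each describing one displayed entry
                (assumed representation)
    selections: mapping of field name -> required value
    """
    return [row for row in rows
            if all(row.get(field) == value
                   for field, value in selections.items())]
```

With an empty selection mapping, every row passes, matching the unfiltered table view.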
As shown in
At operation 390, the interface module 200 receives a sort selection via the device 104. The sort selection may be specified by a user of the device 104 using one or more elements displayed in conjunction with the portion of the UI presented on the device 104. The sort selection may, for example, include a selection based on condition factor, condition factor score, equipment type, equipment attributes, and equipment location (e.g., facility).
At operation 395, the filter/sort module 201 of the interface module 200 sorts the condition data presented in the portion of the UI according to the sort selection. The sorting of the condition data by the filter/sort module 201 may include displaying condition factor scores according to a particular order and/or grouping.
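The sort step can be sketched in the same hypothetical row representation; the function name and parameters are assumptions for illustration only.

```python
def apply_sort(rows, sort_field, descending=False):
    """Order condition-data rows by the user-selected field."""
    return sorted(rows, key=lambda row: row[sort_field], reverse=descending)
```

Grouping (e.g., by equipment type) could be layered on top by sorting on the grouping field first, since Python's sort is stable.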
At operation 805, the interface module 200 receives a user selection of a score presented within a UI displayed on the device 104. The score may be either an overall condition score or a condition factor score. The user may select the score through appropriate interaction with an input device (e.g., mouse click) of the device 104.
In response to receiving the user selection of the score, the interface module 200, working in conjunction with the edit/comment module 208, causes display of an editor component in the UI (e.g., by transmitting a set of machine-readable instructions to the device 104 that causes the device 104 to display the editor component) at operation 810. The editor component is a UI element operable (e.g., by user interaction) to facilitate manual editing of scores and the submission of comments related to the edited score. The editor component may, accordingly, provide one or more interface elements (e.g., entry fields or buttons) that allow users to change scores and submit textual, audio, or video comments related to the selected score. For example, the editor component may include a drop-down menu to select a score and a text entry field for the user to enter a textual comment. As another example, the editor component may include one or more buttons that allow the user to record an audio or video comment, or a combination thereof.
The editor component may be displayed in conjunction with the score in the UI. For example, in response to receiving a selection of a score, the interface module 200 may cause the editor component to be presented overlaid upon a portion of the UI. An example editor component, according to some embodiments, is illustrated in
At operation 815, the interface module 200, working in conjunction with the edit/comment module 208, receives a user edit to the score (e.g., via selection of a new score from a drop-down menu included in the editor component). At operation 820, the edit/comment module 208 updates the score according to the user edit. The updating of the score performed by the edit/comment module 208 may include modifying a value included in an equipment object to reflect the user edit to the score.
At operation 825, the interface module 200, working in conjunction with the edit/comment module 208, receives a user-generated (e.g., human-generated) comment input via the editor component. As noted above, the user-generated comment may be in the form of text, audio, video, or various combinations thereof.
At operation 830, the edit/comment module 208 stores the user-generated comment in association with the user-edited score. For example, the edit/comment module 208 may store the user-generated comment as part of the corresponding equipment object.
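Operations 815 through 830 can be sketched together as an update to the equipment object; the object's representation as a dictionary, the function name, and the comment layout are all assumptions made for illustration.

```python
def apply_user_edit(equipment_object, score_key, new_score, comment=None):
    """Apply a manual score edit (operations 815-820) and store any
    comment with the edited score (operations 825-830)."""
    # Modify the value in the equipment object to reflect the edit.
    equipment_object[score_key] = new_score
    # Store the comment in association with the user-edited score.
    if comment is not None:
        equipment_object.setdefault("comments", []).append(
            {"score_key": score_key, "comment": comment})
    return equipment_object
```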
At operation 835, the interface module 200, working in conjunction with the edit/comment module 208, causes presentation of the user-generated comment within the editor component (e.g., by transmitting a set of machine-readable instructions to the device 104 that causes the device 104 to present the user-generated comment). Depending on the type of comment received, the presentation of the comment may include displaying a textual comment, displaying all or part of a video file, or presenting all or part of an audio file.
At operation 905, the calculation engine 204 accesses condition data associated with a set of industrial equipment. The condition data includes condition factor scores and overall condition scores of each piece of equipment in the set. The set of industrial equipment may, for example, correspond to a collection of equipment located at a particular facility (e.g., a consumer product manufacturing facility, a chemical plant, an oil refinery, a drug manufacturing facility, an aircraft manufacturing facility, or an automotive manufacturing facility), multiple collections of equipment located across multiple facilities of a particular asset, or multiple collections of equipment located across multiple facilities of multiple different assets in a particular region (e.g., geographic region).
At operation 910, the calculation engine 204 computes aggregate condition data for the set of industrial equipment. The aggregate condition data includes an overall aggregate condition score and one or more aggregate condition factor scores. Accordingly, the calculating of aggregate condition data may include, for each condition factor, aggregating (e.g., determining a weighted average of) the respective condition factor scores (e.g., the condition factor score of each piece of equipment) to compute an aggregate condition factor score. The calculating of aggregate condition data may further include aggregating (e.g., determining a weighted average of) the respective overall condition scores (e.g., the overall condition score of each piece of equipment) to compute an overall aggregate condition score.
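The aggregation described above can be sketched as a weighted average across the set of equipment; the per-equipment weight field (e.g., reflecting criticality or capacity), the function name, and the data shapes are illustrative assumptions.

```python
def aggregate_condition_data(equipment):
    """Compute per-factor aggregate scores and an overall aggregate
    score as weighted averages across a set of equipment.

    equipment: list of dicts with assumed keys 'weight', 'overall',
    and 'factors' (a mapping of factor name -> score).
    """
    total_weight = sum(e["weight"] for e in equipment)
    factor_names = equipment[0]["factors"]
    # Weighted average of each condition factor across the set.
    agg_factors = {
        name: sum(e["factors"][name] * e["weight"]
                  for e in equipment) / total_weight
        for name in factor_names
    }
    # Weighted average of the overall condition scores.
    agg_overall = sum(e["overall"] * e["weight"]
                      for e in equipment) / total_weight
    return agg_factors, agg_overall
```

The same routine applies whether the set spans one facility, the facilities of an asset, or the assets of a region; only the membership of the set changes.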
At operation 915, the interface module 200 uses the aggregate condition data to generate presentation data representing a portion of a UI. The UI represented by the presentation data includes an identifier of the set of equipment (e.g., an identifier of the facility, asset, or region), the overall aggregate condition score, and the one or more aggregate condition factor scores.
At operation 920, the interface module 200 causes display of the portion of the UI on the device 104. For example, the interface module 200 may provide the device 104 with the presentation data representing the portion of the UI and a set of instructions that, when executed by the device 104, cause the device 104 to display the presentation data.
Additionally, each condition factor score included in the table 1002 includes a visual indicator (e.g., cross-hatching or lack of cross-hatching) of the hazard level represented by the corresponding condition factor score. Although the visual indicators illustrated in
A user selection of one of the condition factor scores included in the table 1002 (e.g., accomplished through placing the cursor over the value and selecting using a mouse click), results in the display of a comment component, an example of which is illustrated in
The UI 1300 further includes information related to the specific facilities of the asset. For example, table 1314 includes information about each facility in the asset such as a facility identifier, a graph of overall condition score over time, a graph of production (e.g., in number of units) over time, and a graph of production at risk.
The UI 1400 further includes information related to the specific assets in the region. For example, table 1412 includes information about each asset in the region such as an asset identifier, a graph of overall condition score over time, a graph of production (e.g., in number of units) over time, a graph of production at risk, and a number of facilities at each hazard level (e.g., low, medium, and high).
A user selection of one of the condition factor scores included in the table (e.g., accomplished through placing the cursor over the value and selecting using a mouse click), results in the display of an editor component, an example of which is illustrated in
With reference back to
With reference again to
Similarly, with reference yet again back to
An example of the view of a calculation methodology 1900 is illustrated in
Modules, Components, and Logic
Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a machine-readable medium) or hardware modules. A “hardware module” is a tangible unit capable of performing certain operations and may be configured or arranged in a certain physical manner. In various example embodiments, one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
In some embodiments, a hardware module may be implemented mechanically, electronically, or any suitable combination thereof. For example, a hardware module may include dedicated circuitry or logic that is permanently configured to perform certain operations. For example, a hardware module may be a special-purpose processor, such as a Field-Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC). A hardware module may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations. For example, a hardware module may include software executed by a general-purpose processor or other programmable processor. Once configured by such software, hardware modules become specific machines (or specific components of a machine) uniquely tailored to perform the configured functions and are no longer general-purpose processors. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
Accordingly, the phrase “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. As used herein, “hardware-implemented module” refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where a hardware module comprises a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor may be configured as respectively different special-purpose processors (e.g., comprising different hardware modules) at different times. Software accordingly configures a particular processor or processors, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions described herein. As used herein, “processor-implemented module” refers to a hardware module implemented using one or more processors.
Similarly, the methods described herein may be at least partially processor-implemented, with a particular processor or processors being an example of hardware. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. Moreover, the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an API).
The performance of certain of the operations may be distributed among the processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the processors or processor-implemented modules may be distributed across a number of geographic locations.
Example Machine Architecture and Machine-Readable Medium
The machine 2100 may include processors 2110, memory/storage 2130, and input/output (I/O) components 1250, which may be configured to communicate with each other such as via a bus 2102. In an example embodiment, the processors 2110 (e.g., a Central Processing Unit (CPU), a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, a Graphics Processing Unit (GPU), a Digital Signal Processor (DSP), an ASIC, a Radio-Frequency Integrated Circuit (RFIC), another processor, or any suitable combination thereof) may include, for example, a processor 2112 and a processor 2114 that may execute the instructions 2116. The term “processor” is intended to include a multi-core processor that may comprise two or more independent processors (sometimes referred to as “cores”) that may execute instructions contemporaneously. Although
The memory/storage 2130 may include a memory 2132, such as a main memory, or other memory storage, and a storage unit 2136, both accessible to the processors 2110 such as via the bus 2102. The storage unit 2136 and memory 2132 store the instructions 2116 embodying any one or more of the methodologies or functions described herein. The instructions 2116 may also reside, completely or partially, within the memory 2132, within the storage unit 2136, within at least one of the processors 2110 (e.g., within the processor's cache memory), or any suitable combination thereof, during execution thereof by the machine 2100. Accordingly, the memory 2132, the storage unit 2136, and the memory of the processors 2110 are examples of machine-readable media.
As used herein, “machine-readable medium” means a device able to store instructions and data temporarily or permanently, and may include, but is not limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, optical media, magnetic media, cache memory, other types of storage (e.g., Electrically Erasable Programmable Read-Only Memory (EEPROM)), and/or any suitable combination thereof. The term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store the instructions 2116. The term “machine-readable medium” shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions (e.g., instructions 2116) for execution by a machine (e.g., machine 2100), such that the instructions, when executed by one or more processors of the machine (e.g., processors 2110), cause the machine to perform any one or more of the methodologies described herein. Accordingly, a “machine-readable medium” refers to a single storage apparatus or device, as well as “cloud-based” storage systems or storage networks that include multiple storage apparatus or devices. The term “machine-readable medium” excludes signals per se.
Furthermore, the machine-readable medium is non-transitory in that it does not embody a propagating signal. However, labeling the tangible machine-readable medium “non-transitory” should not be construed to mean that the medium is incapable of movement; the medium should be considered as being transportable from one real-world location to another. Additionally, since the machine-readable medium is tangible, the medium may be considered to be a machine-readable storage device.
The I/O components 1250 may include a wide variety of components to receive input, provide output, produce output, transmit information, exchange information, capture measurements, and so on. The specific I/O components 1250 that are included in a particular machine will depend on the type of machine. For example, portable machines such as mobile phones will likely include a touch input device or other such input mechanisms, while a headless server machine will likely not include such a touch input device. It will be appreciated that the I/O components 1250 may include many other components that are not shown in
In further example embodiments, the I/O components 1250 may include biometric components 2156, motion components 2158, environmental components 2160, or position components 2162 among a wide array of other components. For example, the biometric components 2156 may include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram based identification), and the like. The motion components 2158 may include acceleration sensor components (e.g., accelerometer), gravitation sensor components, rotation sensor components (e.g., gyroscope), and so forth. The environmental components 2160 may include, for example, illumination sensor components (e.g., photometer), temperature sensor components (e.g., one or more thermometers that detect ambient temperature), humidity sensor components, pressure sensor components (e.g., barometer), acoustic sensor components (e.g., one or more microphones that detect background noise), proximity sensor components (e.g., infrared sensors that detect nearby objects), gas sensors (e.g., gas detection sensors to detect concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), or other components that may provide indications, measurements, or signals corresponding to a surrounding physical environment. The position components 2162 may include location sensor components (e.g., a Global Position System (GPS) receiver component), altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like.
Communication may be implemented using a wide variety of technologies. The I/O components 1250 may include communication components 2164 operable to couple the machine 2100 to a network 2190 or devices 2170 via a coupling 2192 and a coupling 2172, respectively. For example, the communication components 2164 may include a network interface component or other suitable device to interface with the network 2190. In further examples, the communication components 2164 may include wired communication components, wireless communication components, cellular communication components, Near Field Communication (NFC) components, Bluetooth® components (e.g., Bluetooth® Low Energy), Wi-Fi® components, and other communication components to provide communication via other modalities. The devices 2170 may be another machine or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via a Universal Serial Bus (USB)).
Moreover, the communication components 2164 may detect identifiers or include components operable to detect identifiers. For example, the communication components 2164 may include Radio Frequency Identification (RFID) tag reader components, NFC smart tag detection components, optical reader components (e.g., an optical sensor to detect one-dimensional bar codes such as Universal Product Code (UPC) bar code, multi-dimensional bar codes such as Quick Response (QR) code, Aztec code, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, UCC RSS-2D bar code, and other optical codes), or acoustic detection components (e.g., microphones to identify tagged audio signals). In addition, a variety of information may be derived via the communication components 2164, such as location via Internet Protocol (IP) geo-location, location via Wi-Fi® signal triangulation, location via detecting an NFC beacon signal that may indicate a particular location, and so forth.
Transmission Medium
In various example embodiments, one or more portions of the network 2190 may be an ad hoc network, an intranet, an extranet, a VPN, a LAN, a WLAN, a WAN, a WWAN, a MAN, the Internet, a portion of the Internet, a portion of the PSTN, a plain old telephone service (POTS) network, a cellular telephone network, a wireless network, a Wi-Fi® network, another type of network, or a combination of two or more such networks. For example, the network 2190 or a portion of the network 2190 may include a wireless or cellular network and the coupling 2192 may be a Code Division Multiple Access (CDMA) connection, a Global System for Mobile communications (GSM) connection, or another type of cellular or wireless coupling. In this example, the coupling 2192 may implement any of a variety of types of data transfer technology, such as Single Carrier Radio Transmission Technology (1×RTT), Evolution-Data Optimized (EVDO) technology, General Packet Radio Service (GPRS) technology, Enhanced Data rates for GSM Evolution (EDGE) technology, third Generation Partnership Project (3GPP) including 3G, fourth generation wireless (4G) networks, Universal Mobile Telecommunications System (UMTS), High Speed Packet Access (HSPA), Worldwide Interoperability for Microwave Access (WiMAX), Long Term Evolution (LTE) standard, others defined by various standard-setting organizations, other long range protocols, or other data transfer technology.
The instructions 2116 may be transmitted or received over the network 2190 using a transmission medium via a network interface device (e.g., a network interface component included in the communication components 2164) and using any one of a number of well-known transfer protocols (e.g., HTTP). Similarly, the instructions 2116 may be transmitted or received using a transmission medium via the coupling 2172 (e.g., a peer-to-peer coupling) to the devices 2170. The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying the instructions 2116 for execution by the machine 2100, and includes digital or analog communications signals or other intangible media to facilitate communication of such software.
Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
Although an overview of the inventive subject matter has been described with reference to specific example embodiments, various modifications and changes may be made to these embodiments without departing from the broader scope of embodiments of the present disclosure. Such embodiments of the inventive subject matter may be referred to herein, individually or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single disclosure or inventive concept if more than one is, in fact, disclosed.
The embodiments illustrated herein are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed. Other embodiments may be used and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. The Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.
As used herein, the term “or” may be construed in either an inclusive or exclusive sense. Moreover, plural instances may be provided for resources, operations, or structures described herein as a single instance. Additionally, boundaries between various resources, operations, modules, engines, and data stores are somewhat arbitrary, and particular operations are illustrated in a context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within a scope of various embodiments of the present disclosure. In general, structures and functionality presented as separate resources in the example configurations may be implemented as a combined structure or resource. Similarly, structures and functionality presented as a single resource may be implemented as separate resources. These and other variations, modifications, additions, and improvements fall within a scope of embodiments of the present disclosure as represented by the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
In this document, the terms “a” or “an” are used, as is common in patent documents, to include one or more than one, independent of any other instances or usages of “at least one” or “one or more.” In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.” Also, in the following claims, the terms “including” and “comprising” are open-ended; that is, a system, device, article, or process that includes elements in addition to those listed after such a term in a claim is still deemed to fall within the scope of that claim. Moreover, in the following claims, the terms “first,” “second,” “third,” and so forth are used merely as labels, and are not intended to impose numerical requirements on their objects.
This application claims the benefit of priority of U.S. Provisional Application Ser. No. 62/459,514, titled “REAL-TIME AUDITING OF INDUSTRIAL EQUIPMENT CONDITION,” filed on Feb. 15, 2017, which is hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
5515488 | Hoppe et al. | May 1996 | A |
6430305 | Decker | Aug 2002 | B1 |
6587108 | Guerlain et al. | Jul 2003 | B1 |
6646660 | Patty | Nov 2003 | B1 |
6820135 | Dingman et al. | Nov 2004 | B1 |
6978419 | Kantrowitz | Dec 2005 | B1 |
6980984 | Huffman et al. | Dec 2005 | B1 |
7168039 | Bertram | Jan 2007 | B2 |
7461077 | Greenwood et al. | Dec 2008 | B1 |
7617232 | Gabbert et al. | Nov 2009 | B2 |
7756843 | Palmer | Jul 2010 | B1 |
7899796 | Borthwick et al. | Mar 2011 | B1 |
7917376 | Bellin et al. | Mar 2011 | B2 |
7941321 | Greenstein et al. | May 2011 | B2 |
8036971 | Aymeloglu et al. | Oct 2011 | B2 |
8037046 | Udezue et al. | Oct 2011 | B2 |
8046283 | Burns et al. | Oct 2011 | B2 |
8054756 | Chand et al. | Nov 2011 | B2 |
8155900 | Adams | Apr 2012 | B1 |
8214490 | Vos et al. | Jul 2012 | B1 |
8229902 | Vishniac et al. | Jul 2012 | B2 |
8290838 | Thakur et al. | Oct 2012 | B1 |
8302855 | Ma et al. | Nov 2012 | B2 |
8386377 | Xiong et al. | Feb 2013 | B1 |
8473454 | Evanitsky et al. | Jun 2013 | B2 |
8484115 | Aymeloglu et al. | Jul 2013 | B2 |
8489641 | Seefeld et al. | Jul 2013 | B1 |
8577911 | Stepinski et al. | Nov 2013 | B1 |
8589273 | Creeden et al. | Nov 2013 | B2 |
8688573 | Rukonic et al. | Apr 2014 | B1 |
8744890 | Bernier et al. | Jun 2014 | B1 |
8799799 | Cervelli et al. | Aug 2014 | B1 |
8806355 | Twiss et al. | Aug 2014 | B2 |
8812960 | Sun et al. | Aug 2014 | B1 |
8924388 | Elliot et al. | Dec 2014 | B2 |
8924389 | Elliot et al. | Dec 2014 | B2 |
8938686 | Erenrich et al. | Jan 2015 | B1 |
8949164 | Mohler | Feb 2015 | B1 |
9069842 | Melby | Jun 2015 | B2 |
9100428 | Visbal | Aug 2015 | B1 |
9111281 | Stibel et al. | Aug 2015 | B2 |
9129219 | Robertson et al. | Sep 2015 | B1 |
9256664 | Chakerian et al. | Feb 2016 | B2 |
9280618 | Bruce et al. | Mar 2016 | B1 |
9286373 | Elliot et al. | Mar 2016 | B2 |
9335911 | Elliot et al. | May 2016 | B1 |
20020065708 | Senay et al. | May 2002 | A1 |
20020095360 | Joao | Jul 2002 | A1 |
20020095658 | Shulman et al. | Jul 2002 | A1 |
20020103705 | Brady | Aug 2002 | A1 |
20020147805 | Leshem et al. | Oct 2002 | A1 |
20030126102 | Borthwick | Jul 2003 | A1 |
20040034570 | Davis | Feb 2004 | A1 |
20040111480 | Yue | Jun 2004 | A1 |
20040118933 | Readio | Jun 2004 | A1 |
20040153418 | Hanweck | Aug 2004 | A1 |
20040236688 | Bozeman | Nov 2004 | A1 |
20050010472 | Quatse et al. | Jan 2005 | A1 |
20050086207 | Heuer et al. | Apr 2005 | A1 |
20050154628 | Eckart et al. | Jul 2005 | A1 |
20050154769 | Eckart et al. | Jul 2005 | A1 |
20050240456 | Ward | Oct 2005 | A1 |
20060026120 | Carolan et al. | Feb 2006 | A1 |
20060026170 | Kreitler et al. | Feb 2006 | A1 |
20060080283 | Shipman | Apr 2006 | A1 |
20060143034 | Rothermel et al. | Jun 2006 | A1 |
20060143075 | Carr et al. | Jun 2006 | A1 |
20060143079 | Basak et al. | Jun 2006 | A1 |
20070000999 | Kubo et al. | Jan 2007 | A1 |
20070011304 | Error | Jan 2007 | A1 |
20070038646 | Thota | Feb 2007 | A1 |
20070150801 | Chidlovskii et al. | Jun 2007 | A1 |
20070156673 | Maga et al. | Jul 2007 | A1 |
20070162454 | D'Albora et al. | Jul 2007 | A1 |
20070185867 | Maga et al. | Aug 2007 | A1 |
20070192122 | Routson et al. | Aug 2007 | A1 |
20070203762 | Cutler | Aug 2007 | A1 |
20070284433 | Domenica et al. | Dec 2007 | A1 |
20080065655 | Chakravarthy et al. | Mar 2008 | A1 |
20080069081 | Chand et al. | Mar 2008 | A1 |
20080077642 | Carbone et al. | Mar 2008 | A1 |
20080103996 | Forman et al. | May 2008 | A1 |
20080208735 | Balet et al. | Aug 2008 | A1 |
20080222295 | Robinson et al. | Sep 2008 | A1 |
20080243711 | Aymeloglu et al. | Oct 2008 | A1 |
20080255973 | El Wade et al. | Oct 2008 | A1 |
20080270328 | Lafferty et al. | Oct 2008 | A1 |
20080294663 | Heinley et al. | Nov 2008 | A1 |
20080313132 | Hao et al. | Dec 2008 | A1 |
20090076845 | Bellin et al. | Mar 2009 | A1 |
20090094166 | Aymeloglu et al. | Apr 2009 | A1 |
20090094270 | Alirez et al. | Apr 2009 | A1 |
20090106178 | Chu | Apr 2009 | A1 |
20090112745 | Stefanescu | Apr 2009 | A1 |
20090125359 | Knapic et al. | May 2009 | A1 |
20090125459 | Norton et al. | May 2009 | A1 |
20090132953 | Reed, Jr. et al. | May 2009 | A1 |
20090157732 | Hao et al. | Jun 2009 | A1 |
20090187546 | Whyte | Jul 2009 | A1 |
20090187548 | Ji et al. | Jul 2009 | A1 |
20090249244 | Robinson et al. | Oct 2009 | A1 |
20090254842 | Leacock et al. | Oct 2009 | A1 |
20090259636 | Labrou et al. | Oct 2009 | A1 |
20090271343 | Vaiciulis et al. | Oct 2009 | A1 |
20090307049 | Elliott, Jr. et al. | Dec 2009 | A1 |
20090313463 | Pang et al. | Dec 2009 | A1 |
20090319418 | Herz | Dec 2009 | A1 |
20090319515 | Minton et al. | Dec 2009 | A1 |
20090319891 | MacKinlay et al. | Dec 2009 | A1 |
20100010667 | Sauder | Jan 2010 | A1 |
20100030722 | Goodson et al. | Feb 2010 | A1 |
20100031141 | Summers et al. | Feb 2010 | A1 |
20100042922 | Bradateanu et al. | Feb 2010 | A1 |
20100057622 | Faith | Mar 2010 | A1 |
20100070842 | Aymeloglu et al. | Mar 2010 | A1 |
20100098318 | Anderson | Apr 2010 | A1 |
20100106752 | Eckardt, III et al. | Apr 2010 | A1 |
20100114887 | Conway et al. | May 2010 | A1 |
20100131502 | Fordham | May 2010 | A1 |
20100161735 | Sharma | Jun 2010 | A1 |
20100191563 | Schlaifer et al. | Jul 2010 | A1 |
20100211535 | Rosenberger | Aug 2010 | A1 |
20100235915 | Memon et al. | Sep 2010 | A1 |
20100262688 | Hussain et al. | Oct 2010 | A1 |
20100293174 | Bennett | Nov 2010 | A1 |
20100312837 | Bodapati et al. | Dec 2010 | A1 |
20110040776 | Najm et al. | Feb 2011 | A1 |
20110061013 | Bilicki et al. | Mar 2011 | A1 |
20110078173 | Seligmann et al. | Mar 2011 | A1 |
20110093327 | Fordyce, III et al. | Apr 2011 | A1 |
20110099133 | Chang et al. | Apr 2011 | A1 |
20110153384 | Horne et al. | Jun 2011 | A1 |
20110173093 | Psota et al. | Jul 2011 | A1 |
20110208565 | Ross et al. | Aug 2011 | A1 |
20110208724 | Jones et al. | Aug 2011 | A1 |
20110213655 | Henkin et al. | Sep 2011 | A1 |
20110218955 | Tang et al. | Sep 2011 | A1 |
20110270604 | Qi et al. | Nov 2011 | A1 |
20110270834 | Sokolan et al. | Nov 2011 | A1 |
20110289397 | Eastmond et al. | Nov 2011 | A1 |
20110295649 | Fine et al. | Dec 2011 | A1 |
20110314007 | Dassa et al. | Dec 2011 | A1 |
20110314024 | Chang et al. | Dec 2011 | A1 |
20120004904 | Shin et al. | Jan 2012 | A1 |
20120011238 | Rathod | Jan 2012 | A1 |
20120011245 | Gillette et al. | Jan 2012 | A1 |
20120022945 | Falkenborg et al. | Jan 2012 | A1 |
20120054284 | Rakshit | Mar 2012 | A1 |
20120059853 | Jagota | Mar 2012 | A1 |
20120066166 | Curbera et al. | Mar 2012 | A1 |
20120079363 | Folting et al. | Mar 2012 | A1 |
20120084117 | Tavares et al. | Apr 2012 | A1 |
20120084287 | Lakshminarayan et al. | Apr 2012 | A1 |
20120089606 | Eshwar et al. | Apr 2012 | A1 |
20120131512 | Takeuchi et al. | May 2012 | A1 |
20120144335 | Abeln et al. | Jun 2012 | A1 |
20120158527 | Cannelongo et al. | Jun 2012 | A1 |
20120159362 | Brown et al. | Jun 2012 | A1 |
20120173381 | Smith | Jul 2012 | A1 |
20120215784 | King et al. | Aug 2012 | A1 |
20120221553 | Wittmer et al. | Aug 2012 | A1 |
20120226523 | Weiss et al. | Sep 2012 | A1 |
20120245976 | Kumar et al. | Sep 2012 | A1 |
20120323888 | Osann, Jr. | Dec 2012 | A1 |
20130016106 | Yip et al. | Jan 2013 | A1 |
20130054306 | Bhalla et al. | Feb 2013 | A1 |
20130055145 | Antony et al. | Feb 2013 | A1 |
20130057551 | Ebert et al. | Mar 2013 | A1 |
20130096988 | Grossman et al. | Apr 2013 | A1 |
20130110746 | Ahn | May 2013 | A1 |
20130151453 | Bhanot et al. | Jun 2013 | A1 |
20130166348 | Scotto | Jun 2013 | A1 |
20130166480 | Popescu et al. | Jun 2013 | A1 |
20130185245 | Anderson et al. | Jul 2013 | A1 |
20130185307 | El-yaniv et al. | Jul 2013 | A1 |
20130218879 | Park et al. | Aug 2013 | A1 |
20130226318 | Procyk et al. | Aug 2013 | A1 |
20130238616 | Rose et al. | Sep 2013 | A1 |
20130246170 | Gross et al. | Sep 2013 | A1 |
20130246537 | Gaddala | Sep 2013 | A1 |
20130246597 | Iizawa et al. | Sep 2013 | A1 |
20130263019 | Castellanos et al. | Oct 2013 | A1 |
20130268520 | Fisher et al. | Oct 2013 | A1 |
20130282696 | John et al. | Oct 2013 | A1 |
20130290825 | Arndt et al. | Oct 2013 | A1 |
20130297619 | Chandrasekaran et al. | Nov 2013 | A1 |
20130304770 | Boero et al. | Nov 2013 | A1 |
20130318604 | Coates et al. | Nov 2013 | A1 |
20140012796 | Petersen et al. | Jan 2014 | A1 |
20140040371 | Gurevich et al. | Feb 2014 | A1 |
20140053091 | Hou et al. | Feb 2014 | A1 |
20140058914 | Song et al. | Feb 2014 | A1 |
20140068487 | Steiger et al. | Mar 2014 | A1 |
20140095509 | Patton | Apr 2014 | A1 |
20140108380 | Gotz et al. | Apr 2014 | A1 |
20140108985 | Scott et al. | Apr 2014 | A1 |
20140123279 | Bishop et al. | May 2014 | A1 |
20140136285 | Carvalho | May 2014 | A1 |
20140143009 | Brice et al. | May 2014 | A1 |
20140156527 | Grigg et al. | Jun 2014 | A1 |
20140157172 | Peery et al. | Jun 2014 | A1 |
20140164502 | Khodorenko et al. | Jun 2014 | A1 |
20140189536 | Lange et al. | Jul 2014 | A1 |
20140189870 | Singla et al. | Jul 2014 | A1 |
20140195515 | Baker et al. | Jul 2014 | A1 |
20140222521 | Chait | Aug 2014 | A1 |
20140222784 | Handler | Aug 2014 | A1 |
20140222793 | Sadkin et al. | Aug 2014 | A1 |
20140226010 | Molin | Aug 2014 | A1 |
20140229554 | Grunin et al. | Aug 2014 | A1 |
20140280056 | Kelly | Sep 2014 | A1 |
20140282160 | Zarpas | Sep 2014 | A1 |
20140344230 | Krause et al. | Nov 2014 | A1 |
20140358829 | Hurwitz | Dec 2014 | A1 |
20140366132 | Stiansen et al. | Dec 2014 | A1 |
20150073929 | Psota et al. | Mar 2015 | A1 |
20150073954 | Braff | Mar 2015 | A1 |
20150095773 | Gonsalves et al. | Apr 2015 | A1 |
20150100897 | Sun et al. | Apr 2015 | A1 |
20150106170 | Bonica | Apr 2015 | A1 |
20150106379 | Elliot et al. | Apr 2015 | A1 |
20150134599 | Banerjee et al. | May 2015 | A1 |
20150135256 | Hoy et al. | May 2015 | A1 |
20150188872 | White | Jul 2015 | A1 |
20150242401 | Liu | Aug 2015 | A1 |
20150338233 | Cervelli et al. | Nov 2015 | A1 |
20150379413 | Robertson et al. | Dec 2015 | A1 |
20160004764 | Chakerian et al. | Jan 2016 | A1 |
20160180557 | Yousaf et al. | Jun 2016 | A1 |
Number | Date | Country |
---|---|---|
102546446 | Jul 2012 | CN |
103167093 | Jun 2013 | CN |
102054015 | May 2014 | CN |
102014204827 | Sep 2014 | DE |
102014204830 | Sep 2014 | DE |
102014204834 | Sep 2014 | DE |
2487610 | Aug 2012 | EP |
2858018 | Apr 2015 | EP |
2869211 | May 2015 | EP |
2889814 | Jul 2015 | EP |
2892197 | Jul 2015 | EP |
2963595 | Jan 2016 | EP |
2996053 | Mar 2016 | EP |
3035214 | Jun 2016 | EP |
3038002 | Jun 2016 | EP |
3040885 | Jul 2016 | EP |
WO-2005116851 | Dec 2005 | WO |
WO-2012061162 | May 2012 | WO |
Entry |
---|
“5 Great Tools for Visualizing your Twitter Followers”, Amnet Blog, http://www.amnetblog.com/component/content/article/115-5-great-tools-for-visualizing-your-twitter-followers.html, (Aug. 4, 2010), 1-5. |
“About OWA”, Open Web Analytics, [Online]. Retrieved from the Internet: <URL: http://www.openwebanalytics.com/?page_id=2>, (Accessed: Jul. 19, 2013), 5 pgs. |
“An Introduction to KeyLines and Network Visualization”, Keylines.com, [Online]. Retrieved from the Internet: <URL: http://keylines.com/wp-content/uploads/2014/03/KeyLines-White-Paper.pdf>, (Mar. 2014), 8 pgs. |
“Analytics for Data Driven Startups”, Trak.io, [Online]. Retrieved from the Internet: <URL: http://trak.io/>, (Accessed: Jul. 18, 2013), 3 pgs. |
“Appacts: Open Source Mobile Analytics Platform”, http://www.appacts.com, (Jul. 18, 2013), 1-4. |
“U.S. Appl. No. 13/827,491, Final Office Action dated Jun. 22, 2015”, 28 pgs. |
“U.S. Appl. No. 13/827,491, Non Final Office Action dated Oct. 9, 2015”, 16 pgs. |
“U.S. Appl. No. 13/827,491, Non Final Office Action dated Dec. 1, 2014”, 5 pgs. |
“U.S. Appl. No. 14/141,252, Final Office Action dated Apr. 14, 2016”, 28 pgs. |
“U.S. Appl. No. 14/141,252, Non Final Office Action dated Oct. 8, 2015”, 11 pgs. |
“U.S. Appl. No. 14/225,006, Advisory Action dated Dec. 21, 2015”, 4 pgs. |
“U.S. Appl. No. 14/225,006, Final Office Action dated Sep. 2, 2015”, 28 pgs. |
“U.S. Appl. No. 14/225,006, First Action Interview Pre-Interview Communication dated Feb. 27, 2015”, 5 pgs. |
“U.S. Appl. No. 14/225,006, First Action Interview Pre-Interview Communication dated Sep. 10, 2014”, 4 pgs. |
“U.S. Appl. No. 14/225,084, Examiner Interview Summary dated Jan. 4, 2016”, 3 pgs. |
“U.S. Appl. No. 14/225,084, Final Office Action dated Feb. 26, 2016”, 14 pgs. |
“U.S. Appl. No. 14/225,084, First Action Interview Pre-Interview Communication dated Feb. 20, 2015”, 5 pgs. |
“U.S. Appl. No. 14/225,084, First Action Interview Pre-Interview Communication dated Sep. 2, 2014”, 17 pgs. |
“U.S. Appl. No. 14/225,084, Non Final Office Action dated Sep. 11, 2015”, 13 pgs. |
“U.S. Appl. No. 14/225,084, Notice of Allowance dated May 4, 2015”, 26 pgs. |
“U.S. Appl. No. 14/225,160, Advisory Action dated May 20, 2015”, 7 pgs. |
“U.S. Appl. No. 14/225,160, Examiner Interview Summary dated Apr. 22, 2016”, 7 pgs. |
“U.S. Appl. No. 14/225,160, Final Office Action dated Jan. 25, 2016”, 25 pgs. |
“U.S. Appl. No. 14/225,160, Final Office Action dated Feb. 11, 2015”, 30 pgs. |
“U.S. Appl. No. 14/225,160, First Action Interview Pre-Interview Communication dated Jul. 29, 2014”, 19 pgs. |
“U.S. Appl. No. 14/225,160, First Action Interview Pre-Interview Communication dated Oct. 22, 2014”, 6 pgs. |
“U.S. Appl. No. 14/225,160, Non Final Office Action dated Jun. 16, 2016”, 14 pgs. |
“U.S. Appl. No. 14/225,160, Non Final Office Action dated Aug. 12, 2015”, 23 pgs. |
“U.S. Appl. No. 14/306,138, Examiner Interview Summary dated Dec. 3, 2015”, 3 pgs. |
“U.S. Appl. No. 14/306,138, Examiner Interview Summary dated Dec. 24, 2015”, 5 pgs. |
“U.S. Appl. No. 14/306,147, Final Office Action dated Dec. 24, 2015”, 22 pgs. |
“U.S. Appl. No. 14/319,161, Final Office Action dated Jan. 23, 2015”, 21 pgs. |
“U.S. Appl. No. 14/319,161, Notice of Allowance dated May 4, 2015”, 6 pgs. |
“U.S. Appl. No. 14/319,765, Non Final Office Action dated Feb. 1, 2016”, 10 pgs. |
“U.S. Appl. No. 14/323,935, Notice of Allowance dated Oct. 1, 2015”, 8 pgs. |
“U.S. Appl. No. 14/451,221, Non Final Office Action dated Oct. 21, 2014”, 16 pgs. |
“U.S. Appl. No. 14/463,615, Advisory Action dated Sep. 10, 2015”, 3 pgs. |
“U.S. Appl. No. 14/463,615, Final Office Action dated May 21, 2015”, 31 pgs. |
“U.S. Appl. No. 14/463,615, First Action Interview Pre-Interview Communication dated Jan. 28, 2015”, 29 pgs. |
“U.S. Appl. No. 14/463,615, First Action Interview Pre-Interview Communication dated Nov. 13, 2014”, 4 pgs. |
“U.S. Appl. No. 14/463,615, Non Final Office Action dated Dec. 9, 2015”, 44 pgs. |
“U.S. Appl. No. 14/479,863, First Action Interview Pre-Interview Communication dated Dec. 26, 2014”, 5 pgs. |
“U.S. Appl. No. 14/479,863, Notice of Allowance dated Mar. 31, 2015”, 23 pgs. |
“U.S. Appl. No. 14/483,527, Final Office Action dated Jun. 22, 2015”, 17 pgs. |
“U.S. Appl. No. 14/483,527, First Action Interview Pre-Interview Communication dated Jan. 28, 2015”, 6 pgs. |
“U.S. Appl. No. 14/483,527, Non Final Office Action dated Oct. 28, 2015”, 20 pgs. |
“U.S. Appl. No. 14/483,527, Notice of Allowance dated Apr. 29, 2016”, 34 pgs. |
“U.S. Appl. No. 14/552,336, First Action Interview Pre-Interview Communication dated Jul. 20, 2015”, 18 pgs. |
“U.S. Appl. No. 14/552,336, Notice of Allowance dated Nov. 3, 2015”, 13 pgs. |
“U.S. Appl. No. 14/562,524, First Action Interview Pre-Interview Communication dated Sep. 14, 2015”, 12 pgs. |
“U.S. Appl. No. 14/562,524, First Action Interview Pre-Interview Communication dated Nov. 10, 2015”, 6 pgs. |
“U.S. Appl. No. 14/571,098, Final Office Action dated Feb. 23, 2016”, 37 pgs. |
“U.S. Appl. No. 14/571,098, First Action Interview dated Aug. 24, 2015”, 4 pgs. |
“U.S. Appl. No. 14/571,098, First Action Interview Pre-Interview Communication dated Mar. 11, 2015”, 4 pgs. |
“U.S. Appl. No. 14/571,098, First Action Interview Pre-Interview Communication dated Aug. 5, 2015”, 4 pgs. |
“U.S. Appl. No. 14/571,098, First Action Interview Pre-Interview Communication dated Nov. 10, 2015”, 5 pgs. |
“U.S. Appl. No. 14/631,633, First Action Interview Pre-Interview Communication dated Sep. 10, 2015”, 5 pgs. |
“U.S. Appl. No. 14/676,621, Examiner Interview Summary dated Jul. 30, 2015”, 5 pgs. |
“U.S. Appl. No. 14/676,621, Final Office Action dated Oct. 29, 2015”, 10 pgs. |
“U.S. Appl. No. 14/746,671, First Action Interview Pre-Interview Communication dated Nov. 12, 2015”, 19 pgs. |
“U.S. Appl. No. 14/746,671, Notice of Allowance dated Jan. 21, 2016”, 7 pgs. |
“U.S. Appl. No. 14/800,447, First Action Interview Pre-Interview Communication dated Dec. 10, 2015”, 26 pgs. |
“U.S. Appl. No. 14/813,749, Final Office Action dated Apr. 8, 2016”, 80 pgs. |
“U.S. Appl. No. 14/813,749, Non Final Office Action dated Sep. 28, 2015”, 22 pgs. |
“U.S. Appl. No. 14/842,734, First Action Interview Pre-Interview Communication dated Nov. 19, 2015”, 17 pgs. |
“U.S. Appl. No. 14/858,647, Notice of Allowance dated Mar. 4, 2016”, 47 pgs. |
“U.S. Appl. No. 14/929,584, Final Office Action dated May 25, 2016”, 42 pgs. |
“U.S. Appl. No. 14/929,584, Non Final Office Action dated Feb. 4, 2016”, 15 pgs. |
“Apsalar—Mobile App Analytics & Advertising”, https://apsalar.com/, (Jul. 18, 2013), 1-8. |
“Beta Testing on the Fly”, TestFlight, [Online]. Retrieved from the Internet: <URL: https://testflightapp.com/>, (Accessed: Jul. 18, 2013), 3 pgs. |
“Countly”, Countly Mobile Analytics, [Online]. Retrieved from the Internet: <URL: http://count.ly/products/screenshots>, (Accessed: Jul. 18, 2013), 9 pgs. |
“DISTIMO—App Analytics”, [Online]. Retrieved from the Internet: <URL: http://www.distimo.com/app-analytics>, (Accessed: Jul. 18, 2013), 5 pgs. |
“European Application Serial No. 14187996.5, Communication Pursuant to Article 94(3) EPC dated Feb. 19, 2016”, 9 pgs. |
“European Application Serial No. 14187996.5, Extended European Search Report dated Feb. 12, 2015”, 7 pgs. |
“European Application Serial No. 14191540.5, Extended European Search Report dated May 27, 2015”, 9 pgs. |
“European Application Serial No. 14200246.8, Extended European Search Report dated May 29, 2015”, 8 pgs. |
“European Application Serial No. 14200298.9, Extended European Search Report dated May 13, 2015”, 7 pgs. |
“European Application Serial No. 15181419.1, Extended European Search Report dated Sep. 29, 2015”, 7 pgs. |
“European Application Serial No. 15184764.7, Extended European Search Report dated Dec. 14, 2015”, 8 pgs. |
“European Application Serial No. 15200073.3, Extended European Search Report dated Mar. 30, 2016”, 16 pgs. |
“European Application Serial No. 15201924.6, Extended European Search Report dated Apr. 25, 2016”, 8 pgs. |
“European Application Serial No. 15202919.5, Extended European Search Report dated May 9, 2016”, 13 pgs. |
“European Application Serial No. 16152984.7, Extended European Search Report dated Mar. 24, 2016”, 8 pgs. |
“Flurry Analytics”, [Online]. Retrieved from the Internet: <URL: http://www.flurry.com/>, (Accessed: Jul. 18, 2013), 14 pgs. |
“Google Analytics Official Website—Web Analytics & Reporting”, [Online]. Retrieved from the Internet: <URL: http://www.google.com/analytics/index.html>, (Accessed: Jul. 18, 2013), 22 pgs. |
“Great Britain Application Serial No. 1404486.1, Combined Search Report and Examination Report dated Aug. 27, 2014”, 5 pgs. |
“Great Britain Application Serial No. 1404486.1, Office Action dated May 21, 2015”, 2 pgs. |
“Great Britain Application Serial No. 1404489.5, Combined Search Report and Examination Report dated Aug. 27, 2014”, 5 pgs. |
“Great Britain Application Serial No. 1404489.5, Office Action dated May 21, 2015”, 3 pgs. |
“Great Britain Application Serial No. 1404489.5, Office Action dated Oct. 6, 2014”, 1 pg. |
“Great Britain Application Serial No. 1404499.4, Combined Search Report and Examination Report dated Aug. 20, 2014”, 6 pgs. |
“Great Britain Application Serial No. 1404499.4, Office Action dated Jun. 11, 2015”, 5 pgs. |
“Great Britain Application Serial No. 1404499.4, Office Action dated Sep. 29, 2014”, 1 pg. |
“Help File for ModelRisk Version 5—Part 1”, Vose Software, (2007), 375 pgs. |
“Help File for ModelRisk Version 5—Part 2”, Vose Software, (2007), 362 pgs. |
“Hunchlab: Heat Map and Kernel Density Calculation for Crime Analysis”, Azavea Journal, [Online]. Retrieved from the Internet: <www.azavea.com/blogs/newsletter/v4i4/kernel-density-capabilities-added-to-hunchlab>, (Sep. 9, 2014), 2 pgs. |
“KeyLines Datasheet”, Keylines.com, [Online]. Retrieved from the Internet: <URL: http://keylines.com/wp-content/uploads/2014/03/KeyLines-datasheet.pdf>, (Mar. 2014), 2 pgs. |
“Mixpanel: Actions speak louder than page views”, Mobile Analytics, [Online]. Retrieved from the Internet: <URL: https://mixpanel.com/>, (Accessed: Jul. 18, 2013), 13 pgs. |
“Mobile App Marketing & Analytics”, Localytics, [Online]. Retrieved from the Internet: <URL: http://www.localytics.com/>, (Accessed: Jul. 18, 2013), 12 pgs. |
“Mobile Web”, Wikipedia, [Online]. Retrieved from the Internet: <URL: https://en.wikipedia.org/w/index.php?title=Mobile_Web&oldid=643800164>, (Jan. 23, 2015), 6 pgs. |
“More than android analytics”, UserMetrix, [Online]. Retrieved from the Internet: <URL: http://usermetrix.com/android-analytics>, (Accessed: Jul. 18, 2013), 3 pgs. |
“More Than Mobile Analytics”, Kontagent, [Online]. Retrieved from the Internet: <URL: http://www. kontagent. com/>, (Accessed: Jul. 18, 2013), 9 pgs. |
“Multimap”, Wikipedia, [Online]. Retrieved from the Internet: <URL: https://en.wikipedia.org/w/index.php?title=Multimap&oldid=530800748>, (Jan. 1, 2013), 2 pgs. |
“Netherlands Application Serial No. 2012417, Netherlands Search Report dated Sep. 18, 2015”, W/ English Translation, 9 pgs. |
“Netherlands Application Serial No. 2012438, Search Report dated Sep. 21, 2015”, 8 pgs. |
“New Zealand Application Serial No. 622473, First Examination Report dated Mar. 27, 2014”, 3 pgs. |
“New Zealand Application Serial No. 622473, Office Action dated Jun. 19, 2014”, 2 pgs. |
“New Zealand Application Serial No. 622513, Office Action dated Apr. 3, 2014”, 2 pgs. |
“New Zealand Application Serial No. 628161, First Examination Report dated Aug. 25, 2014”, 2 pgs. |
“Piwik—Free Web Analytics Software”, Piwik, [Online]. Retrieved from the Internet: <URL: http://piwik.org/>, (Accessed: Jul. 19, 2013), 18 pgs. |
“Realtime Constant Customer Touchpoint”, Capptain—Pilot your apps, [Online] retrieved from the internet: <http://www.capptain.com>, (accessed Jul. 18, 2013), 6 pgs. |
“Refresh CSS ellipsis when resizing container”, Stack Overflow, [Online]. Retrieved from the Internet: <URL: http://stackoverflow.com/questions/17964681/refresh-css-ellipsis-when-resizing-container>, (Accessed: May 18, 2015), 1 pg. |
“SAP BusinessObjects Explorer Online Help”, SAP BusinessObjects, (Mar. 19, 2012), 68 pgs. |
“Visualizing Threats: Improved Cyber Security Through Network Visualization”, Keylines.com, [Online] retrieved from the internet: <http://keylines.com/wp-content/uploads/2014/04/Visualizing-Threats1.pdf>, (May 12, 2014), 10 pgs. |
“Welcome to StatCounter—Visitor Analysis for Your Website”, StatCounter, [Online]. Retrieved from the Internet: <URL: http://statcounter.com/>, (Accessed: Jul. 19, 2013), 17 pgs. |
Psaltis, Andrew G., “Streaming Data—Designing the real-time pipeline”, vol. MEAP V03, (Jan. 16, 2015), 16 pgs. |
Celik, T, “CSS Basic User Interface Module Level 3 (CSS3 UI)”, Section 8; Resizing and Overflow, [Online] retrieved from the internet: <http://www.w3.org/TR/2012/WD-css3-ui-20120117/#resizing-amp-overflow>, (Jan. 17, 2012), 1-58. |
Chaudhuri, Surajit, et al., “An Overview of Business Intelligence Technology”, Communications of the ACM, vol. 54, No. 8., (Aug. 2011), 88-98. |
Cohn, David, et al., “Semi-supervised Clustering with User Feedback”, Cornell University, (2003), 9 pgs. |
Gill, Leicester, et al., “Computerised linking of medical records: methodological guidelines”, Journal of Epidemiology and Community Health 47, (1993), pp. 316-319. |
Gorr, et al., “Crime Hot Spot Forecasting: Modeling and Comparative Evaluation”, Grant 98-IJ-CX-K005, (May 6, 2002), 37 pgs. |
Gu, Lifang, et al., “Record Linkage: Current Practice and Future Directions”, (Jan. 15, 2004), 32 pgs. |
Hansen, D., et al., “Analyzing Social Media Networks with NodeXL: Insights from a Connected World”, Chapter 4 & Chapter 10, (Sep. 2010), 38 pgs. |
Hua, Yu, et al., “A Multi-attribute Data Structure with Parallel Bloom Filters for Network Services”, HiPC 2006, LNCS 4297, (2006), 277-288. |
Janssen, Jan-Keno, “Wo bist'n du?—Googles Geodienst Latitude” [Where are you? Google's geo-service Latitude], (in German), [Online] retrieved from the internet: <http://www.heise.de/artikel-archiv/ct/2011/03/086/@00250@/ct.11.03.086-088.pdf>, (Jan. 17, 2011), 86-88. |
Manno, et al., “Introducing Collaboration in Single-user Applications through the Centralized Control Architecture”, (2010), 10 pgs. |
Windley, Phillip J., “The Live Web: Building Event-Based Connections in the Cloud”, Course Technology PTR, (Dec. 21, 2011), 61 pgs. |
Sigrist, Christian, et al., “PROSITE, a Protein Domain Database for Functional Characterization and Annotation”, Nucleic Acids Research, vol. 38, (2010), D161-D166. |
Valentini, Giorgio, et al., “Ensembles of Learning Machines”, Lecture Notes in Computer Science: Neural Nets, Springer Berlin Heidelberg, (Sep. 26, 2002), 3-20. |
Wang, Guohua, et al., “Research on a Clustering Data De-Duplication Mechanism Based on Bloom Filter”, IEEE, (2010), 5 pgs. |
Winkler, William E, et al., “Record Linkage Software and Methods for Merging Administrative Lists”, Bureau of the Census Statistical Research Division :Statistical Research Report Series, No. RR2001/03, (Jul. 23, 2001), 11 pgs. |
“European Application Serial No. 18155334.8, Extended European Search Report dated Oct. 22, 2018”, 10 pgs. |
“European Application Serial No. 18155334.8, Partial European Search Report dated Jul. 18, 2018”, 12 pgs. |
Publication Number | Date | Country |
---|---|---|
20180232084 A1 | Aug 2018 | US |
Related Application Number | Date | Country |
---|---|---|
62459514 | Feb 2017 | US |