Embodiments of the present disclosure relate generally to database queries and, more particularly, but not by way of limitation, to enhanced visual analysis of data using sequenced dataset reduction.
Users can query databases to perform investigations and find target data, e.g., the source of a food poisoning outbreak. However, the dramatic growth in data collection means the volume of data to be analyzed using queries can render investigations impractical, and target data may never be found. Inexperienced data investigators often take a dataset down the wrong analytical path, reducing the dataset to yield a useless result. As is evident, there is a demand for improved data investigation tools.
Various ones of the appended drawings merely illustrate example embodiments of the present disclosure and should not be considered as limiting its scope.
The description that follows includes systems, methods, techniques, instruction sequences, and computing machine program products that embody illustrative embodiments of the disclosure. In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide an understanding of various embodiments of the inventive subject matter. It will be evident, however, to those skilled in the art, that embodiments of the inventive subject matter may be practiced without these specific details. In general, well-known instruction instances, protocols, structures, and techniques are not necessarily shown in detail.
In various example embodiments, investigation of datasets can be enhanced through sequenced dataset reduction using sequenced filter templates. Reducing datasets using filters can result in widely varying results, many of which may not be useful for the type of analysis being conducted. For example, a user investigating a dataset trying to find the source of a food poisoning outbreak may implement different filters (e.g., filtering by distance, years, past outbreak data) to reduce the dataset to find the source of the outbreak. However, which filters are applied and in what order can drastically change the resulting dataset. For instance, an inexperienced user may apply a distance filter early in the analysis and inadvertently filter out the source of the outbreak.
These issues can be addressed using a sequenced filter template that reduces a dataset in a specific way—applying particular filters in a specified order—to yield a resultant dataset that more readily highlights the desired target to be identified (e.g., a source of a food poisoning outbreak). A sequenced filter template comprises a set of filters to be applied to a dataset in a specified sequence. The ordering of the sequence may, for example, be configured by an expert investigator that understands how to properly reduce a dataset to yield useful results. The expert investigator may, for example, be an individual that is familiar with past investigations and understands how to properly drill down into a set of data with multiple filters to yield a reduced dataset that readily identifies target sources.
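As an illustrative sketch (not taken from the disclosure), a sequenced filter template can be represented as an ordered list of filter specifications held in a template library; the template name "poison_analyzer", the filter names, and the clause formats below are hypothetical examples chosen to mirror the food-poisoning scenario:

```python
# Hypothetical representation of a template library holding sequenced
# filter templates. Order of the "filters" list matters: filters are
# applied first-to-last.
TEMPLATE_LIBRARY = {
    "poison_analyzer": {
        "filters": [
            {"name": "medium", "clause": "medium = '{medium}'"},
            {"name": "year", "clause": "year BETWEEN {start} AND {end}"},
            {"name": "area", "clause": "area = '{area}'"},
            {"name": "distributor", "clause": "distributor = '{distributor}'"},
        ],
    },
}

def select_template(name):
    """Retrieve a sequenced filter template from the library by name."""
    return TEMPLATE_LIBRARY[name]

template = select_template("poison_analyzer")
print([f["name"] for f in template["filters"]])
# ['medium', 'year', 'area', 'distributor']
```

Because the ordering is stored in the template itself, a non-expert user who selects the template inherits the expert's drill-down sequence without having to know it.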
To create datasets for analysis, in some embodiments, a browser may be configured to detect whether a webpage is parsable, and generate a parse interface to assist parsing useful datasets from the webpage. In some embodiments, the browser parse functionality is implemented using a browser plugin. The plugin detects the website of a webpage displayed within the browser and determines whether the website is parsable. If the website is parsable, the browser plugin parses the webpage and displays a parse user interface, which displays input fields auto-populated with parsed data from the webpage. The user may modify, remove, or add additional data to the input fields and submit directly to the backend system, which may in turn receive the data and store it as part of the dataset for analysis.
In various implementations, the client device 110 comprises a computing device that includes at least a display and communication capabilities that provide access to the networked system 102 via the network 104. The client device 110 comprises, but is not limited to, a remote device, work station, computer, Internet appliance, hand-held device, wireless device, portable device, wearable computer, cellular or mobile phone, Personal Digital Assistant (PDA), smart phone, tablet, ultrabook, netbook, laptop, desktop, multi-processor system, microprocessor-based or programmable consumer electronics, game console, set-top box, network Personal Computer (PC), mini-computer, and so forth. In an example embodiment, the client device 110 comprises one or more of a touch screen, accelerometer, gyroscope, biometric sensor, camera, microphone, Global Positioning System (GPS) device, and the like.
The client device 110 communicates with the network 104 via a wired or wireless connection. For example, one or more portions of the network 104 comprises an ad hoc network, an intranet, an extranet, a Virtual Private Network (VPN), a Local Area Network (LAN), a wireless LAN (WLAN), a Wide Area Network (WAN), a wireless WAN (WWAN), a Metropolitan Area Network (MAN), a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a cellular telephone network, a wireless network, a Wireless Fidelity (WI-FI®) network, a Worldwide Interoperability for Microwave Access (WiMax) network, another type of network, or any suitable combination thereof.
In some example embodiments, the client device 110 includes one or more of the applications (also referred to as “apps”). In some example embodiments, the browser parser 112 and data visualizer 114 access the various systems of the networked system 102 via a web interface supported by a web server 122. In some example embodiments, the browser parser 112 and data visualizer 114 access the various services and functions provided by the networked system 102 via a programmatic interface provided by an Application Program Interface (API) server 120. The data visualizer 114 is a dataset visualization tool that is configured to manipulate datasets and display visualizations that allow a human user to detect patterns, trends, or signals that would not previously have been detectable (e.g., signals that would otherwise be lost in noise). The data visualizer 114 is configured to work with a data visualizer backend system 150, which performs backend operations for the client side data visualizer 114. In some example embodiments, the data visualizer 114 is run from a browser as a web service and the data visualizer backend system 150 serves as the web service for the front end, e.g., the data visualizer 114.
The query sequencer 115 manages the sequenced filter template functionality for the data visualizer 114. In some embodiments, the query sequencer 115 is configured as a plugin that plugs into the data visualizer 114 to enhance the filtering capabilities of the data visualizer 114. As discussed in further detail below, in some embodiments, the modules and functionalities of the query sequencer 115 may be directly integrated into the data visualizer 114. The browser parser 112 is an Internet browser that is configured to parse webpages, and submit information obtained from parsing to a backend system for storage in the dataset. In some embodiments, the browser parser 112 is an Internet browser with a plugin that is configured to perform the parse operations.
A user (e.g., the user 106) comprises a person, a machine, or another means of interacting with the client device 110. In some example embodiments, the user 106 is not part of the network architecture 100, but interacts with the network architecture 100 via the client device 110 or another means. For instance, the user 106 provides input (e.g., touch screen input or alphanumeric input) to the client device 110 and the input is communicated to the networked system 102 via the network 104. In this instance, the networked system 102, in response to receiving the input from the user 106, communicates information to the client device 110 via the network 104 to be presented to the user 106. In this way, the user 106 can interact with the networked system 102 using the client device 110.
The API server 120 and the web server 122 are coupled to, and provide programmatic and web interfaces respectively to, one or more application servers 140. The application server 140 can host a data visualizer backend system 150 configured to support the data visualizer 114, each of which comprises one or more modules or applications and each of which can be embodied as hardware, software, firmware, or any combination thereof. The application server 140 is, in turn, shown to be coupled to one or more database servers 124 that facilitate access to one or more information storage repositories or databases 126. In an example embodiment, the databases 126 are storage devices that store database objects parsed by the browser parser 112, as well as datasets to be analyzed by the data visualizer 114.
Additionally, a third party application 132, executing on third party server 130, is shown as having programmatic access to the networked system 102 via the programmatic interface provided by the API server 120. For example, the third party application 132, utilizing information retrieved from the networked system 102, supports one or more features or functions on a website hosted by the third party. The third party website, for example, provides webpages which can be parsed using the browser parser 112.
Further, while the client-server-based network architecture 100 shown in
The user interface engine 220 is configured to generate and display user interfaces for implementing the sequenced filter templates. The template library 230 is a library of available sequenced filter templates for selection by a user. Each of the templates may be configured by an expert user to drill down and solve different types of investigative problems. For example, one template in the template library 230 can drill down into a set of restaurant distribution and logistics data to detect the source of a food poisoning outbreak. In some example embodiments, each of the sequenced filter templates specifies a sequence in which to apply filters to a dataset in order to produce a reduced dataset useful for analysis.
Though an investigative scenario involving food poisoning is discussed here for illustrative purposes, it is appreciated that each sequenced filter template can be configured for widely varying investigative purposes, e.g., detecting bank fraud, analyzing shipping/logistics problems, tracking humanitarian aid, detecting cyber threats, and other analysis problems.
The filter engine 240 manages the filters applied by templates of the template library 230. Each of the filters may have custom configured functionality that may be further refined through customization parameters supplied by the non-expert user at runtime of a selected filter. For example, a years filter may be preconfigured by the expert to return datasets matching a year range 1990-1999 (10 years), while a customization parameter may change the span of years, e.g., 1995-1999 (5 years), shift the year range to 2000-2009 (10 years, shifted), or make other changes.
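A minimal sketch of the years-filter example above, assuming a simple closure-based design (the function name and defaults are illustrative, not from the disclosure): the expert supplies defaults, and customization parameters override them at runtime.

```python
# Illustrative sketch: a "years" filter preconfigured with an expert default
# range, which a non-expert may override via customization parameters.
def make_years_filter(start=1990, end=1999):
    """Return a predicate keeping records whose year falls in [start, end]."""
    def predicate(record):
        return start <= record["year"] <= end
    return predicate

default_filter = make_years_filter()            # expert default: 1990-1999
custom_filter = make_years_filter(1995, 1999)   # customized: narrower span
shifted_filter = make_years_filter(2000, 2009)  # customized: shifted range

rows = [{"year": y} for y in (1992, 1997, 2003)]
print([r["year"] for r in rows if default_filter(r)])  # [1992, 1997]
print([r["year"] for r in rows if custom_filter(r)])   # [1997]
print([r["year"] for r in rows if shifted_filter(r)])  # [2003]
```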
The query constructor engine 250 receives or retrieves the sequenced filter template from the template library 230, receives filter data including filter logic and customizable parameter data as available, and constructs sequenced query code for submission to the data visualizer 114 or to the data visualizer backend system 150. The sequenced query code can be Structured Query Language (SQL) or another type of programmatic language used to query a database.
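One hedged way to sketch such a constructor (the function name, clause format, and nested-subquery strategy are assumptions, not taken from the disclosure) is to wrap one subquery per filter, so the generated SQL text applies the filters in exactly the template's order:

```python
# Hypothetical query constructor: nests one subquery per filter so the
# emitted SQL preserves the sequence defined by the template.
def construct_sequenced_query(table, ordered_clauses):
    query = f"SELECT * FROM {table}"
    for clause in ordered_clauses:
        # Each filter wraps the previous result, preserving the sequence.
        query = f"SELECT * FROM ({query}) WHERE {clause}"
    return query

sql = construct_sequenced_query(
    "table1",
    ["year BETWEEN 2015 AND 2019", "distance_km <= 50"],
)
print(sql)
# SELECT * FROM (SELECT * FROM (SELECT * FROM table1)
#   WHERE year BETWEEN 2015 AND 2019) WHERE distance_km <= 50
```

An equivalent flat conjunction would produce the same rows for simple predicates; the nesting is shown only to make the sequencing explicit in the generated code.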
One technical advantage of the query sequencer 115 implementing sequenced filter templates is that non-expert users (e.g., users applying a configured sequenced filter template) can generate a reduced dataset that is similar to or the same as a reduced dataset generated by an expert investigative user. An additional technical advantage stems from usability. Non-expert users may be of at least two types: a user that does not know the correct ordering of filters to apply, and a user that does not know how to produce the query code; in some cases, a single non-expert user falls into both categories. The query sequencer 115 addresses both of these shortcomings by using expert-created filter templates to handle order sequencing, and user interfaces together with the query constructor engine 250 to allow a non-expert user to produce query code for a sequenced filter template without having to write query code.
The data visualizer 114 may further include additional components used to communicate with other network components, manipulate data, and generate visualizations of data for analysis. As illustrated in the example embodiment of
The method 400 may be embodied in machine-readable instructions for execution by a hardware component (e.g., a processor) such that the operations of the method 400 may be performed by the data visualizer 114; accordingly, the method 400 is described below, by way of example with reference thereto. However, it shall be appreciated that the method 400 may be deployed on various other hardware configurations and is not intended to be limited to the data visualizer 114. At operation 410, the user interface engine 220 generates a display of a selected sequenced filter template on a display screen of client device 110. The selected sequence template may be selected from the template library 230. The display of the selected sequenced filter template comprises fields for customization parameters to modify the functionality of the filters, as described above.
At operation 420, the plugin engine 210 receives customization parameters (e.g., entered by the user 106 using a user interface presented on the client device 110). In some example embodiments, customization parameters modify the scope or effect of a filter. For example, a filter may be a year range filter that filters out data not in a given range. A customization parameter can change the range in duration (e.g., last five years, last 24 hours), modify the starting and ending points of the filter, or other modifications. Further details of customization parameters are discussed below with reference to
At operation 430, the query constructor engine 250 generates query code using the selected filter template. The query constructor engine 250 generates each filter, modifies each filter according to received customization parameters, and arranges the filters into a sequence in the query.
At operation 440, the query comprising the plurality of filters modified by customization parameters is applied to a dataset to filter data per each filter to result in a reduced dataset. In some example embodiments, the reduced dataset is a dataset honed by a user to more readily display patterns and find target sources. At operation 450, the visualization library 270 displays the reduced dataset using one or more visualizations. For example, the visualization library 270 may display the reduced dataset as graph data having nodes connected by edges.
The flow diagram in
At operation 505, the user interface engine 220 generates a display of a selected sequenced filter template on a display screen of client device 110. The selected sequence template may be selected from the template library 230. The display of the selected sequenced filter template comprises fields for customization parameters to modify the functionality of the filters.
At operation 510, the plugin engine 210 receives customization parameters from the user 106. At operation 515, the query constructor engine 250 generates query code using the selected filter template. The query constructor engine 250 generates code for each filter, modifies each filter according to received customization parameters, and arranges the filters into a sequence in the query. The query may then be passed through the backend API 300, over network 104, to the application server 140. At operation 520, the data visualizer backend system 150 receives the sequenced query. At operation 525, the data visualizer backend system 150 translates the query to a code format for the database server 124 if necessary. For example, the query received at operation 520 may be in a proprietary query language and database server 124 may be an off-the-shelf commercially available platform (e.g., an Oracle Database system) that uses structured query language incompatible with the proprietary query language. In such an example embodiment, at operation 525 the query is translated from the proprietary query language format to the query language of database server 124 (e.g., Oracle SQL), such that the ordering of the filter sequence and parameters of the original query generated at operation 515 are retained. The backend API 300 transmits the query (e.g., translated query) to the database server 124. At operation 530, the database server 124 applies the query to a dataset in database 126 to generate the reduced dataset. At operation 535, the database server 124 transmits the reduced dataset to the application server 140. At operation 540, the data visualizer backend system 150 transmits the reduced dataset to the backend API 300 of the data visualizer 114 on client device 110.
At operation 545, the backend API 300 stores the reduced dataset in memory local to the client device 110, according to some example embodiments. At operation 550, the visualization library 270 uses the stored reduced dataset to generate a visualization and display the visualized reduced dataset on the display screen of client device 110. The user 106 may then view and manipulate the reduced dataset to identify target data (e.g., a food poisoning source).
At operation 605, the database engine 275 receives the constructed sequenced query. At operation 610, the database engine 275 identifies the first filter in the sequence of the sequenced filter template. At operation 615, the database engine 275 applies the filter to the dataset to generate a first reduced dataset. At operation 620, the database engine 275 determines whether there are additional filters in the sequenced query. If there are additional filters in the sequence, then at operation 625, the next filter in the sequence is identified and the process returns to operation 615, where the next filter is applied. When there are no more filters in the sequence, the process continues to operation 630, where the dataset reduced by one or more filters is returned or output as the reduced dataset.
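The loop of operations 610-630 can be sketched as follows; this is a minimal in-memory illustration in which simple predicates stand in for the filter code of a template, and the dataset values are hypothetical:

```python
# Sketch of operations 610-630: iterate over the filters of a sequenced
# query, feeding each filter the output of the previous one.
def apply_sequenced_filters(dataset, filters):
    reduced = dataset
    for f in filters:                                 # operations 620/625
        reduced = [row for row in reduced if f(row)]  # operation 615
    return reduced                                    # operation 630

data = [
    {"year": 2016, "km": 10},
    {"year": 2016, "km": 500},
    {"year": 2009, "km": 10},
]
result = apply_sequenced_filters(
    data,
    [lambda r: r["year"] >= 2015,  # first filter in the sequence
     lambda r: r["km"] <= 50],     # second filter in the sequence
)
print(result)  # [{'year': 2016, 'km': 10}]
```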
The constructed query view 708 is a logical view of the query code constructed by the query constructor engine 250, according to some example embodiments. As illustrated, the query may be implemented using structured query language (SQL) designed to access the database 126, though it is appreciated that the filtering code implemented can be other programming languages, according to some embodiments. The expert investigative user may be a programmer or code developer that is fluent or experienced in writing the query or filter code. Once the query code is written and stored to the query sequencer as a sequenced filter template, the non-expert user can use the query code through user interface objects (e.g., checkboxes, drag and drop elements) as shown in further detail below.
The example query code begins with “SELECT * FROM table1”, where “SELECT” and “FROM” are statements of the query and “table1” is an example dataset to be reduced. Each of the four filters represented by arrows corresponds to filter code, as indicated by the double-sided arrows. In particular, the left-most arrow, a “medium” filter, corresponds to first filter code 710, which comprises additional query code (e.g., WHERE, AND, OR, etc.), as specified by the expert user. As illustrated, the first filter code 710 comprises parameter data 712 that includes one or more customization parameters that can be customized by the non-expert user when implementing the sequenced filter template. Similarly, the second filter (a “year” filter) corresponds to second filter code 714 with one or more parameter data 716, the third filter (an “area” filter) corresponds to the third filter code 718 with parameter data 720, and the fourth filter (a “distributor” filter) corresponds to the fourth filter code 722, having one or more parameter data 724. Each of the filters can be implemented using the loop operation of
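A runnable sketch of the four-filter example applied to a tiny “table1” is shown below, using SQLite; the column names, parameter values, and sample rows are illustrative assumptions, and the actual filter code would be authored by the expert user:

```python
# Hypothetical end-to-end run: four sequenced filters (medium, year, area,
# distributor) narrow an example "table1" to the surviving record.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE table1 (medium TEXT, year INT, area TEXT, distributor TEXT)"
)
conn.executemany(
    "INSERT INTO table1 VALUES (?, ?, ?, ?)",
    [
        ("lettuce", 2016, "county_12", "dist_42"),
        ("lettuce", 2009, "county_12", "dist_42"),  # removed by year filter
        ("beef",    2016, "county_12", "dist_42"),  # removed by medium filter
    ],
)

rows = conn.execute(
    "SELECT * FROM table1 "
    "WHERE medium = ? "          # first filter code (parameter data 712)
    "AND year BETWEEN ? AND ? "  # second filter code (parameter data 716)
    "AND area = ? "              # third filter code (parameter data 720)
    "AND distributor = ?",       # fourth filter code (parameter data 724)
    ("lettuce", 2015, 2019, "county_12", "dist_42"),
).fetchall()
print(rows)  # [('lettuce', 2016, 'county_12', 'dist_42')]
```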
Upon selecting the submit sequenced query 855, the filters and parameters of the selected poison analyzer template 831 are applied to the dataset to generate a reduced dataset (e.g., reduced dataset 706). As discussed, in some embodiments, the data visualizer 114 can directly apply the sequenced filter template to the dataset using the database engine 275. In other example embodiments, the sequenced filter template query code is constructed by the query constructor engine 250 on the client device 110, then transmitted to the data visualizer backend system 150 for application to the dataset and generation of the reduced dataset, as discussed above with reference to
With reference to
If the parse engine 940 determines, at operation 1030, that the website parse template library 920 does not have a parse template for the website, then the browser parser 112 cannot parse the page and the process terminates as illustrated at operation 1040. However, if it is determined that a parse template exists for the website, the parse engine 940 retrieves the parse template from the website parse template library 920 for processing. At operation 1050, the parse engine 940 uses the parse template retrieved from the website parse template library 920 to parse the webpage. As discussed, a parse template is configured to identify fields and extract values from the source code of the page. For example, the source code of a webpage may include title field source code, such as “&lt;title&gt; sample title &lt;/title&gt;”. The browser parser 112 identifies the field using the tags (&lt;title&gt;), and extracts the data enclosed in the tags (sample title). The data obtained from parsing the webpage (e.g., sample title) are then passed to the user interface engine 930 for further processing. At operation 1060, the user interface engine 930 receives the parsed values and generates a user interface for display within the browser. The user interface displays a number of editable fields, each of which can be prepopulated with data parsed from the webpage. The user 106 can edit the data in the fields or enter new data into the field if none was parsed. At operation 1070, the user 106 clicks a submit button on the generated user interface, which causes the database API 950 to transmit or otherwise store the webpage as an object in the dataset.
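The title-field example above can be sketched minimally as follows; a real parse template would cover many fields and more robust HTML handling, and the function name here is an assumption for illustration only:

```python
# Minimal sketch of field extraction per a parse template: identify a field
# by its enclosing tags and extract the enclosed value.
import re

def parse_field(source, tag):
    """Extract the text enclosed by <tag>...</tag>, or None if absent."""
    match = re.search(rf"<{tag}>\s*(.*?)\s*</{tag}>", source, re.DOTALL)
    return match.group(1) if match else None

html = "<html><title> sample title </title><body>...</body></html>"
print(parse_field(html, "title"))  # sample title
```

The extracted value would then prepopulate the corresponding editable field in the generated parse user interface.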
The machine 1200 can include processors 1210, memory/storage 1230, and I/O components 1250, which can be configured to communicate with each other such as via a bus 1202. In an example embodiment, the processors 1210 (e.g., a Central Processing Unit (CPU), a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, a Graphics Processing Unit (GPU), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Radio-Frequency Integrated Circuit (RFIC), another processor, or any suitable combination thereof) can include, for example, processor 1212 and processor 1214 that may execute instructions 1216. The term “processor” is intended to include multi-core processors that may comprise two or more independent processors (sometimes referred to as “cores”) that can execute instructions contemporaneously. Although
The memory/storage 1230 can include a memory 1232, such as a main memory, or other memory storage, and a storage unit 1236, both accessible to the processors 1210 such as via the bus 1202. The storage unit 1236 and memory 1232 store the instructions 1216 embodying any one or more of the methodologies or functions described herein. The instructions 1216 can also reside, completely or partially, within the memory 1232, within the storage unit 1236, within at least one of the processors 1210 (e.g., within the processor's cache memory), or any suitable combination thereof, during execution thereof by the machine 1200. Accordingly, the memory 1232, the storage unit 1236, and the memory of the processors 1210 are examples of machine-readable media.
As used herein, the term “machine-readable medium” means a device able to store instructions and data temporarily or permanently and may include, but is not limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, optical media, magnetic media, cache memory, other types of storage (e.g., Electrically Erasable Programmable Read-Only Memory (EEPROM)) or any suitable combination thereof. The term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions 1216. The term “machine-readable medium” shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions (e.g., instructions 1216) for execution by a machine (e.g., machine 1200), such that the instructions, when executed by one or more processors of the machine 1200 (e.g., processors 1210), cause the machine 1200 to perform any one or more of the methodologies described herein. Accordingly, a “machine-readable medium” refers to a single storage apparatus or device, as well as “cloud-based” storage systems or storage networks that include multiple storage apparatus or devices. The term “machine-readable medium” excludes signals per se.
The I/O components 1250 can include a wide variety of components to receive input, provide output, transmit information, exchange information, capture measurements, and so on. The specific I/O components 1250 that are included in a particular machine will depend on the type of machine. For example, portable machines such as mobile phones will likely include a touch input device or other such input mechanisms, while a headless server machine will likely not include such a touch input device. It will be appreciated that the I/O components 1250 can include many other components that are not shown in
In further example embodiments, the I/O components 1250 can include biometric components 1256, motion components 1258, environmental components 1260, or position components 1262 among a wide array of other components. For example, the biometric components 1256 can include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram based identification), and the like. The motion components 1258 can include acceleration sensor components (e.g., an accelerometer), gravitation sensor components, rotation sensor components (e.g., a gyroscope), and so forth. The environmental components 1260 can include, for example, illumination sensor components (e.g., a photometer), temperature sensor components (e.g., one or more thermometers that detect ambient temperature), humidity sensor components, pressure sensor components (e.g., a barometer), acoustic sensor components (e.g., one or more microphones that detect background noise), proximity sensor components (e.g., infrared sensors that detect nearby objects), gas sensor components (e.g., machine olfaction detection sensors, gas detection sensors to detect concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), or other components that may provide indications, measurements, or signals corresponding to a surrounding physical environment. The position components 1262 can include location sensor components (e.g., a Global Positioning System (GPS) receiver component), altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like.
Communication can be implemented using a wide variety of technologies. The I/O components 1250 may include communication components 1264 operable to couple the machine 1200 to a network 1280 or devices 1270 via a coupling 1282 and a coupling 1272, respectively. For example, the communication components 1264 include a network interface component or other suitable device to interface with the network 1280. In further examples, communication components 1264 include wired communication components, wireless communication components, cellular communication components, Near Field Communication (NFC) components, BLUETOOTH® components (e.g., BLUETOOTH® Low Energy), WI-FI® components, and other communication components to provide communication via other modalities. The devices 1270 may be another machine or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via a Universal Serial Bus (USB)).
Moreover, the communication components 1264 can detect identifiers or include components operable to detect identifiers. For example, the communication components 1264 can include Radio Frequency Identification (RFID) tag reader components, NFC smart tag detection components, optical reader components (e.g., an optical sensor to detect one-dimensional bar codes such as a Universal Product Code (UPC) bar code, multi-dimensional bar codes such as a Quick Response (QR) code, Aztec Code, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, Uniform Commercial Code Reduced Space Symbology (UCC RSS)-2D bar codes, and other optical codes), acoustic detection components (e.g., microphones to identify tagged audio signals), or any suitable combination thereof. In addition, a variety of information can be derived via the communication components 1264, such as location via Internet Protocol (IP) geo-location, location via WI-FI® signal triangulation, location via detecting a BLUETOOTH® or NFC beacon signal that may indicate a particular location, and so forth.
In various example embodiments, one or more portions of the network 1280 can be an ad hoc network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless WAN (WWAN), a metropolitan area network (MAN), the Internet, a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a plain old telephone service (POTS) network, a cellular telephone network, a wireless network, a WI-FI® network, another type of network, or a combination of two or more such networks. For example, the network 1280 or a portion of the network 1280 may include a wireless or cellular network, and the coupling 1282 may be a Code Division Multiple Access (CDMA) connection, a Global System for Mobile communications (GSM) connection, or other type of cellular or wireless coupling. In this example, the coupling 1282 can implement any of a variety of types of data transfer technology, such as Single Carrier Radio Transmission Technology (1×RTT), Evolution-Data Optimized (EVDO) technology, General Packet Radio Service (GPRS) technology, Enhanced Data rates for GSM Evolution (EDGE) technology, third Generation Partnership Project (3GPP) including 3G, fourth generation wireless (4G) networks, Universal Mobile Telecommunications System (UMTS), High Speed Packet Access (HSPA), Worldwide Interoperability for Microwave Access (WiMAX), Long Term Evolution (LTE) standard, others defined by various standard setting organizations, other long range protocols, or other data transfer technology.
The instructions 1216 can be transmitted or received over the network 1280 using a transmission medium via a network interface device (e.g., a network interface component included in the communication components 1264) and utilizing any one of a number of well-known transfer protocols (e.g., Hypertext Transfer Protocol (HTTP)). Similarly, the instructions 1216 can be transmitted or received using a transmission medium via the coupling 1272 (e.g., a peer-to-peer coupling) to devices 1270. The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying the instructions 1216 for execution by the machine 1200, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.
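As an illustration of transmitting instructions over a network using a well-known transfer protocol such as HTTP, the self-contained sketch below serves a hypothetical instruction payload from a loopback HTTP server and retrieves it with a client, using only the Python standard library. The payload bytes, handler name, and endpoint path are illustrative assumptions, not part of the disclosed embodiments.

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

PAYLOAD = b"\x90\x90\xc3"  # hypothetical instruction bytes for illustration

class InstructionHandler(BaseHTTPRequestHandler):
    """Serves the instruction payload in response to any GET request."""
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "application/octet-stream")
        self.send_header("Content-Length", str(len(PAYLOAD)))
        self.end_headers()
        self.wfile.write(PAYLOAD)

    def log_message(self, *args):
        pass  # suppress per-request logging

# Bind to an ephemeral port on the loopback interface and serve in a thread.
server = HTTPServer(("127.0.0.1", 0), InstructionHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "network interface device" side: fetch the instructions over HTTP.
received = urlopen(f"http://127.0.0.1:{port}/instructions").read()
server.shutdown()
assert received == PAYLOAD
```

In a real deployment the payload would traverse the network 1280 rather than the loopback interface, but the transfer-protocol mechanics are the same.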
Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
Although an overview of the inventive subject matter has been described with reference to specific example embodiments, various modifications and changes may be made to these embodiments without departing from the broader scope of embodiments of the present disclosure. Such embodiments of the inventive subject matter may be referred to herein, individually or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single disclosure or inventive concept if more than one is, in fact, disclosed.
The embodiments illustrated herein are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed. Other embodiments may be used and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. The Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.
As used herein, the term “or” may be construed in either an inclusive or exclusive sense. Moreover, plural instances may be provided for resources, operations, or structures described herein as a single instance. Additionally, boundaries between various resources, operations, modules, engines, and data stores are somewhat arbitrary, and particular operations are illustrated in a context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within a scope of various embodiments of the present disclosure. In general, structures and functionality presented as separate resources in the example configurations may be implemented as a combined structure or resource. Similarly, structures and functionality presented as a single resource may be implemented as separate resources. These and other variations, modifications, additions, and improvements fall within a scope of embodiments of the present disclosure as represented by the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
The present application claims priority to and incorporates by reference U.S. Provisional Application No. 62/353,233 filed Jun. 22, 2016, entitled “Visual Analysis of Data using Sequenced Dataset Reduction.”