The subject matter disclosed herein relates to computer graphics generation and processing. In particular, example embodiments relate to systems and methods for generating graphical representations of event participation flows for presentation by an electronic visual display.
Conventional flow diagrams are useful in representing structure and flow in a complex system or network. Typical flow diagrams illustrate an order of interactions (e.g., data transfers) between components or participants in the system or network. In this way, typical flow charts are suitable for illustrating the event participation flow of a single participant engaged in a sequence of events over a certain period of time. However, typical flow diagrams fail to convey information regarding the quantitative distribution of values involved in the interactions between components. Thus, while typical flow diagrams are useful in illustrating an event flow of a single participant, typical flow diagrams are not suitable to illustrate event flows of a group of participants engaged in multiple concurrent events. For example, though a typical flow chart may be suitable to illustrate a sequence of television shows viewed by a single viewer over a certain time period, a single typical flow chart would be unable to suitably illustrate an aggregate sequence of television shows viewed by a group of viewers over the same time period because the group of viewers is likely to be watching multiple different television shows during the same blocks of time. Additionally, although information regarding the individual participation of individual participants may be maintained in sources such as tables of numerical data, deriving an aggregate event flow of all such participants from such sources can be difficult due to the challenges involved in analyzing and understanding a bulk set of raw numerical data.
Various ones of the appended drawings merely illustrate example embodiments of the present inventive subject matter and cannot be considered as limiting its scope.
Reference will now be made in detail to specific example embodiments for carrying out the inventive subject matter. Examples of these specific embodiments are illustrated in the accompanying drawings, and specific details are set forth in the following description in order to provide a thorough understanding of the subject matter. It will be understood that these examples are not intended to limit the scope of the claims to the illustrated embodiments. On the contrary, they are intended to cover such alternatives, modifications, and equivalents as may be included within the scope of the disclosure.
Aspects of the present disclosure relate to systems and methods for generating graphical representations of event participation flows. An “event participation flow” includes a flow of participants of a subject event between events before and after the subject event. Accordingly, each event participation flow may include preceding events, which are events participated in by the participants prior to the subject event, and subsequent events, which are events participated in by the participants after the subject event.
Additional aspects of the present disclosure involve providing user interfaces to present graphical representations of event participation flows. The user interfaces include graphical elements representing events in the event participation flow and the relationships between each event. More specifically, the relationships illustrated in the graphical representations of event participation flows illustrate how participants transition between events and in what quantity. Accordingly, the graphical representations of event participation flows include indications of a number of participants for each event, an order of participation, and a number of participants that transition from participating in one event to another event.
Using interactive elements included in the user interface, a user may filter the graphical representations of event participation flows according to event category, event attributes, or participant attributes. The user interface includes other elements that allow users to group elements according to category such that the graphical representation is updated to illustrate the flow of participation between event categories. Additionally, the users may specify a secondary event in order to analyze the event participation flows of a subset of participants in the subject event that also participated in the secondary event.
As an example of the foregoing, the events included in event participation flows may correspond to content programming (e.g., a broadcast program or television program, which is also known as a “TV show” in common parlance), and the participants of the event correspond to viewers of the content programming. The event participation flows provided, according to this example, illustrate a transition of viewers between content programs. In this example, event categories by which a user may filter the graphical representations correspond to channels that broadcast individual content programs. In this way, users may visualize how viewers transition between programming provided by different channels. Further, in this example, the event attributes include a start and end time of broadcast of the programs, and the participant attributes include demographic information such as a viewer's age, gender, location, marital status, income level, employment status, and the like.
As another example, the events included in event participation flows may correspond to a purchase of a product, and the participants of the event correspond to consumers who purchased the product. The event participation flows provided, according to this example, illustrate a sequence of purchases made by a group of consumers. As a more concrete example, the products may be insurance, and the event participation flows may include a sequence of insurance product purchases such as a purchase of car insurance, followed by a purchase of fire insurance, and then followed by a purchase of life insurance.
As yet another example, the events included in event participation flows may correspond to content viewed via web pages, and the participants of the event may correspond to viewers of the content. In this way, the event participation flows may be used to illustrate users' navigational flow through web pages or content of a web site or network.
As shown, the network system 100 includes a client device 102 in communication with a data processing platform 104 over a network 106. The data processing platform 104 communicates and exchanges data with the client device 102 that pertains to various functions and aspects associated with the network system 100 and its users. Likewise, the client device 102 may be any of a variety of types of devices that include at least a display, a processor, and communication capabilities that provide access to the network 106 (e.g., a smart phone, a tablet computer, a personal digital assistant (PDA), a personal navigation device (PND), a handheld computer, a desktop computer, a laptop or netbook, or a wearable computing device), and may be operated by a user (e.g., a person) of the network system 100 to exchange data with the data processing platform 104 over the network 106.
The client device 102 communicates with the network 106 via a wired or wireless connection. For example, one or more portions of the network 106 may comprise an ad hoc network, an intranet, an extranet, a Virtual Private Network (VPN), a Local Area Network (LAN), a wireless LAN (WLAN), a Wide Area Network (WAN), a wireless WAN (WWAN), a Metropolitan Area Network (MAN), a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a cellular telephone network, a wireless network, a Wireless Fidelity (Wi-Fi®) network, a Worldwide Interoperability for Microwave Access (WiMax) network, another type of network, or any suitable combination thereof.
In various embodiments, the data exchanged between the client device 102 and the data processing platform 104 involve user-selected functions available through one or more user interfaces (UIs). The UIs may be specifically associated with a web client 108 (e.g., a browser) or an application 109, executing on the client device 102, and in communication with the data processing platform 104.
Turning specifically to the data processing platform 104, a web server 110 is coupled to (e.g., via wired or wireless interfaces), and provides web interfaces to, an application server 112. The application server 112 hosts one or more applications (e.g., web applications) that allow users to use various functions and services of the data processing platform 104. For example, the application server 112 may host an event participation flow visualization system 114 that is used to generate and present graphical representations of event participation flows. In some embodiments, the event participation flow visualization system 114 runs and executes on the application server 112, while in other embodiments, the application server 112 provides the client device 102 with a set of instructions (e.g., computer-readable code) that causes the web client 108 of the client device 102 to execute and run the event participation flow visualization system 114.
The event participation flow visualization system 114 analyzes data to determine a flow of participation in events by participants of a subject event. The subject event is user specified and serves as a subject for the participation flow visualization. An event participation flow represents the transition of participation by participants between multiple events. The event participation flow includes an aggregate of individual participation flows of the participants in the subject event. Each individual participation flow includes an event sequence of the participants that includes the subject event, one or more preceding events, and one or more subsequent events. Accordingly, the aggregate of individual event sequences (also referred to as “aggregate event sequence”) represents an overall participation flow of the participants as it includes events participated in by the participants prior to participation in the subject event (e.g., preceding events) as well as events participated in by the participants subsequent to participation in the subject event.
Upon determining the participation flow of the participants of the subject event, the event participation flow visualization system 114 generates a graphical representation thereof. The event participation flow visualization system 114 transmits instructions to the client device 102 that cause the device to present a user interface for viewing and interacting with the graphical representation of the participation flow. As an example of the interactions provided by the user interface, users may filter the information displayed according to event category such that only events of a certain category are displayed, by event attributes such that only events with certain attributes are displayed, or by participant attributes such that only participants with certain attributes are included in the represented participation flow.
Further, users may specify an additional event for visualization along with the subject event. In response to a user specifying an additional event, the event participation flow visualization system 114 generates and causes presentation of a graphical representation of the relationship between the subject event and the additional event, which includes the sequence of the two events along with other preceding and subsequent events participated in by the participants of the subject event and the additional event.
The data analyzed by the event participation flow visualization system 114 includes event data that comprises a plurality of event data records. Each event data record includes information about an event. For example, each event data record includes an event identifier (e.g., a name), event attributes (e.g., event category, start time, and end time), and a list of event participant identifiers. The event data is linked to participant data including one or more participant data records. The participant data records include information about event participants. For example, each participant data record includes a participant identifier and participant attributes describing characteristics of event participants. Participant attributes may relate to demographic data, and, accordingly, may include participant gender, age, location information (e.g., hometown or current location), income level, employment history, or education history. Depending on the embodiment, the participant attributes may further include any one of the following: a type of contract or service plan the participant has with a particular company; an effective date of the contract or service plan; an average bill amount; an aggregate amount billed over a particular time period; a credit score; and a type of home (e.g., apartment or single family home).
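By way of a non-limiting illustrative sketch only (the class and field names below are assumptions introduced for illustration and are not recited by this disclosure), the linked event and participant data records described above might be modeled as follows:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List

@dataclass
class EventRecord:
    """Hypothetical event data record: identifier, attributes, and participant list."""
    event_id: str                 # event identifier (e.g., a name)
    category: str                 # event attribute: event category
    start_time: datetime          # event attribute: start time
    end_time: datetime            # event attribute: end time
    participant_ids: List[str] = field(default_factory=list)  # list of event participant identifiers

@dataclass
class ParticipantRecord:
    """Hypothetical participant data record linked to the events the participant participated in."""
    participant_id: str                                           # participant identifier
    attributes: Dict[str, object] = field(default_factory=dict)   # e.g., {"age": 42, "location": "hometown"}
    event_ids: List[str] = field(default_factory=list)            # links to EventRecord.event_id values
```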
Data analyzed by the event participation flow visualization system 114 (e.g., event data and participant data) is obtained from a third-party computing system 118 and, in particular, a third-party database 120 communicatively coupled to the third-party computing system 118. The data may be retrieved routinely and automatically (e.g., nightly) by the event participation flow visualization system 114, or manually provided by a user of the third-party computing system 118 or the client device 102 for subsequent processing and analysis by the event participation flow visualization system 114.
The data obtained from the third-party computing system 118 is stored in a database 116, which is a machine-readable storage medium that is communicatively coupled to the application server 112 (e.g., via wired or wireless interfaces over the network 106). The data processing platform 104 may further include a database server (not shown) that facilitates access to the database 116. The database 116 may include multiple databases that may be internal or external to the data processing platform 104.
The event participation flow visualization system 114 is shown as including an interface module 200, a data retrieval module 202, a data analysis module 204, a visualization engine 206, and a filter module 208 all configured to communicate with each other (e.g., via a bus, shared memory, a switch, or application programming interfaces (APIs)). The aforementioned modules of the event participation flow visualization system 114 may, furthermore, access the database 116, and each of the modules may access one or more computer-readable storage media of the client device 102.
The interface module 200 receives requests from various client devices and communicates appropriate responses to the requesting client devices. The interface module 200 provides a number of interfaces (e.g., APIs or user interfaces that are presented by the client device 102) that allow data to be received by the event participation flow visualization system 114. For example, the interface module 200 may receive requests from client devices in the form of Hypertext Transfer Protocol (HTTP) requests, API requests, or other web-based requests.
The interface module 200 also provides user interfaces that include graphical representations of event participation flows. The interface module 200 also receives and processes user input received through such user interfaces. An example of the user interfaces provided by the interface module 200 is discussed below in reference to
The data retrieval module 202 is configured to retrieve data for processing and analysis. For example, the data retrieval module 202 obtains event data comprising a plurality of event data records. In some embodiments, the data retrieval module 202 retrieves such data from the third-party database 120 of the third-party computing system 118 through appropriate requests (e.g., API requests or calls) transmitted over the network 106. The data may be retrieved by the data retrieval module 202 on a periodic basis (e.g., nightly). In some embodiments, the data retrieval module 202 obtains data from a location specified by a user (e.g., via a user interface provided by the interface module 200).
The data analysis module 204 is configured to analyze data to determine event participation flows for participants of a subject event. The event participation flows represent a transition of participation between the subject event and multiple other events. Accordingly, an event participation flow includes the subject event, one or more preceding events, one or more subsequent events, and a flow quantity between each event. The subject event is specified by user input, and the participants of the subject event are identified based on information included in an event data record corresponding to the subject event. Preceding events are events in which one or more participants participated prior to participating in the subject event. Subsequent events are events in which one or more participants participated subsequent to participating in the subject event. The flow quantity between two events refers to a number of weighted or unweighted participants that transition from participating in one event to participating in another event.
The event participation flow is determined based on an aggregate of individual event sequences of each participant in the subject event. Accordingly, in determining the event participation flow of a set of participants, the data analysis module 204 determines an event sequence of each participant in the set. The event sequence is a sequence of events participated in by the same participant that includes the subject event. The data analysis module 204 determines the event participation flow of the set of participants by performing operations including identifying preceding events and subsequent events attended by at least one participant of the subject event. In some instances, the data analysis module 204 may determine that a participant did not participate in an event prior to the subject event, or that the participant did not participate in an event subsequent to the subject event.
The data analysis module 204 determines the preceding and subsequent events associated with each participant using a combination of information including event data (e.g., the list of participant identifiers for the event) and participant data (e.g., data regarding events participated in by the participants). For example, the data analysis module 204 may access participant data of the participants of the subject event that indicates the events participated in by each participant and is also linked to event data corresponding to these events. By cross referencing the participant data and the event data, the data analysis module 204 identifies preceding and subsequent events in which each participant has participated.
The data analysis module 204 further determines a participation quantity (e.g., a number of participants) for each identified preceding and subsequent event by summing a total number of participants associated with each respective event. The data analysis module 204 then determines flow quantities between each preceding event and the subject event, and between the subject event and each subsequent event. The data analysis module 204 determines the flow quantity between two events by determining the overlap of participants between the two events.
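As a minimal sketch of these computations, assuming the hypothetical EventRecord type introduced earlier, the participation quantity and flow quantity might be derived as set sizes and set overlaps of participant identifiers:

```python
from typing import Dict, Iterable

def participation_quantity(event: EventRecord) -> int:
    # Total number of participants associated with the event.
    return len(set(event.participant_ids))

def flow_quantity(event_a: EventRecord, event_b: EventRecord) -> int:
    # Overlap of participants between two events: the number of (unweighted)
    # participants appearing in both events' participant lists.
    return len(set(event_a.participant_ids) & set(event_b.participant_ids))

def flow_quantities(subject: EventRecord, related: Iterable[EventRecord]) -> Dict[str, int]:
    # Flow quantity between the subject event and each preceding or subsequent event.
    return {other.event_id: flow_quantity(subject, other) for other in related}
```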
The visualization engine 206 is configured to generate graphical representations of the event participation flows determined by the data analysis module 204. The graphical representations of the event participation flows are presented in interfaces generated by the interface module 200. Each graphical representation includes graphical elements representing the subject event, preceding events, subsequent events, and the participation flows between each event. The participation flows are represented by connector elements, and a width of each connector element is proportional to a flow quantity (e.g., the number of participants that transitioned from participation in a first event to participation in a second event).
The user interfaces in which the graphical representations are presented include interactive elements that facilitate interaction of users with the graphical representations. For example, using an interactive element in the user interface, a user may specify one or more filters for the graphical representation. The filter module 208 is configured to filter the graphical representation of event participation flows according to the user filter selections. The filtering of the graphical representation of the event participation flows may include removing or modifying one or more graphical elements. The filter selections may, for example, include category filters that specify a particular category of event, event attribute filters that specify a particular event attribute (e.g., events with a particular start time), or participant attribute filters that specify a particular participant attribute (e.g., an age range).
A participant table 304 includes a plurality of participant data records. The participant data records include information about event participants such as a participant identifier, a list of events in which the participant participated, and participant attributes 306 describing characteristics of event participants. Participant attributes 306 may, for example, relate to demographic data, and accordingly, may include participant gender, age, location information (e.g., hometown or current location), employment history, or education history. Each event record in the event table 300 is linked to one or more participant records within the participant table 304 so as to associate a particular event with the data records of its participants. Participant data records in the participant table 304 are indexed by participant identifier. Further, each participant data record in the participant table 304 may be linked to one or more event records within the event table 300 so as to associate a particular participant with the data records of events in which they have participated.
At operation 405, the interface module 200 receives a subject event identifier. The subject event identifier, which identifies a subject event, is received via user input entered into a user interface provided by the interface module 200 and displayed on the client device 102. The subject event identifier may be specified by the user by inputting text in a text-input field of the user interface, by selecting the event identifier from a drop-down menu (or similar interface element) that includes a list of subject event identifiers, or by selecting a graphical element from an existing graphical representation of an event participation flow.
At operation 410, the data analysis module 204 accesses event data from the database 116. In some other embodiments, the data analysis module 204 accesses the event data from a location specified by the user, such as a local data repository of the client device 102. The event data includes a plurality of event data records including an event data record corresponding to the subject event. Each event data record includes an event identifier, event attributes (e.g., start time and end time), and a list of participant identifiers corresponding to participants of the event.
At operation 415, the data analysis module 204 determines an event participation flow for the list of participant identifiers of the subject event using the event data. The event participation flow includes the subject event, one or more preceding events, one or more subsequent events, and a flow quantity between each event. The event participation flow is determined by aggregating individual event sequences of each participant in the subject event. Accordingly, the event participation flow includes a flow quantity from each preceding event to the subject event, and a flow quantity from the subject event to each subsequent event. Further details regarding the determination of the aggregate event sequence are discussed below in reference to
At operation 420, the visualization engine 206 generates a graphical representation of the event participation flow of the participants of the subject event. The graphical representation of the event participation flow includes a plurality of graphical elements (e.g., blocks) representing the subject event, preceding events, subsequent events, and the flow quantity of participants between the events. Each graphical element representing an event may include an indication of the number of participants for the event, a number of participants that participated in the event without participating in a preceding event, and a number of participants that participated in the event without participating in a subsequent event. In some instances, the number of participants in an event may be represented by a size of the graphical element representing the event. The flow of participants between events (e.g., the flow from a preceding event to the subject event or from the subject event to a subsequent event) is represented by connector elements (e.g., arrows), and the width of each connector element corresponds to the flow quantity of participants that have transitioned from one event to another.
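The disclosure does not prescribe any particular rendering library; purely as one possible sketch, a Sankey-style diagram in which connector (link) width is proportional to the flow quantity could be produced with an off-the-shelf charting package such as Plotly. The event names and flow quantities below are placeholders:

```python
import plotly.graph_objects as go

# Placeholder labels and flow quantities for a subject event with one
# preceding event and two subsequent events.
labels = ["Preceding Event 1", "Subject Event", "Subsequent Event 1", "Other"]
links = dict(
    source=[0, 1, 1],     # index into `labels` for the upstream event
    target=[1, 2, 3],     # index into `labels` for the downstream event
    value=[120, 80, 40],  # flow quantities; link width scales with value
)

fig = go.Figure(go.Sankey(node=dict(label=labels), link=links))
fig.show()
```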
At operation 425, the interface module 200 causes the presentation of a user interface that includes the graphical representation of the event participation flow of the participants of the subject event. For example, the interface module 200 provides the client device 102 with instructions that cause the client device 102 to present the user interface. The user interface used for displaying the graphical representation of the aggregate event sequence may, in some embodiments, be the same as the user interface for receiving the subject event identifier. In this manner, from the perspective of a user, once the subject event identifier is input into one portion of the user interface, the graphical representation of the event participation flow is then automatically displayed in another portion of the user interface. Further, the user may interact with the graphical representation of the event participation flow through interaction with elements of the user interface. For example, the user may select (e.g., by clicking on) one of the graphical elements representing a preceding or subsequent event to make that event the subject event, and, in response, the method 400 is repeated for the newly selected subject event.
As another example of the interaction provided by the user interface displaying the graphical representation of the event participation flow, the user may specify a secondary event in order to examine its relationship (e.g., the participation flow) with the subject event.
At operation 505, the interface module 200 receives a secondary event identifier identifying a secondary event for analysis along with the subject event. As with the subject event identifier, the secondary event identifier is received via user input entered into a user interface provided by the interface module 200 and displayed on the client device 102. The secondary event identifier may be specified by the user by inputting text in a text-input field of the user interface, by selecting the event identifier from a drop-down menu (or similar interface element) that includes a list of subject event identifiers, or by selecting a graphical element from an existing graphical representation of an event participation flow.
At operation 510, the data analysis module 204 determines a subset of participants in the subject event that also participated in the secondary event. The data analysis module 204 determines the subset of participants based on the event data. More specifically, the data analysis module 204 determines the subset of participants by comparing the list of participant identifiers included in the event data record corresponding to the subject event with the list of participant identifiers included in the event data record corresponding to the secondary event.
At operation 515, the data analysis module 204 determines a focused event participation flow involving the subset of participants of the subject event that also participated in the secondary event. The determining of the focused event participation flow by the data analysis module 204 includes determining one or more preceding events participated in by at least one participant of the subset of participants, determining one or more subsequent events participated in by at least one participant of the subset of participants, determining a total number of participants for each of the preceding and subsequent events, and determining a flow quantity between each preceding event and the subject event, and between the subject event and each subsequent event.
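A minimal sketch of this focusing step, again assuming the hypothetical EventRecord type and flow-quantity helpers sketched earlier, reduces to set intersections over participant identifiers:

```python
from typing import Set

def focused_participants(subject: EventRecord, secondary: EventRecord) -> Set[str]:
    # Subset of subject-event participants who also participated in the secondary event.
    return set(subject.participant_ids) & set(secondary.participant_ids)

def focused_flow_quantity(other: EventRecord, subset: Set[str]) -> int:
    # Flow quantity between 'other' (a preceding or subsequent event) and the
    # subject event, restricted to the focused subset of participants.
    return len(set(other.participant_ids) & subset)
```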
At operation 520, the visualization engine 206 generates the graphical representation of the focused event participation flow that includes the subject event and the secondary event. The graphical representation of the focused event participation flow includes a graphical element representing the subject event and graphical elements representing the one or more preceding events and the one or more subsequent events. The graphical representation further includes connector elements illustrating the flow of participation between each event, and the width of each connector element represents the flow quantity.
At operation 525, the interface module 200 causes presentation of the graphical representation of the focused event participation flow that includes the subject event and the secondary event. For example, the interface module 200 transmits instructions to the client device 102 that cause the client device 102 to display a user interface including the graphical representation.
As yet another example of the interaction provided by the user interface displaying the graphical representation of the event participation flow, the user may select one or more filters to reduce the elements displayed in the graphical representation to a particular subset of elements.
At operation 605, the interface module 200 receives a filter selection entered as user input into the user interface displaying the graphical representation of the event participation flow. The filter may be selected from a drop-down menu, check box, or other such interactive element included in the user interface. The filter selection may, for example, be: a category filter specifying an event category; a participant filter specifying a participant attribute; or an event attribute filter specifying an event attribute.
At operation 610, the filter module 208 filters the graphical representation of the event participation flow according to the received filter selection. The filtering of the graphical representation includes removing or modifying one or more graphical elements in accordance with the filter selection. For example, when the filter selection is an event category, the filter module 208 filters the graphical representation of the aggregate event sequence to include a subset of the graphical elements that correspond to representations of events in the event category. When the filter selection is a participant filter, the filter module 208 works in conjunction with the visualization engine 206 to update the graphical representation such that only participants having the specified participant attribute are included in the aggregate event sequence. When the filter selection is an event attribute filter, the filter module 208 filters the graphical representation of the event participation flow to include a subset of the graphical elements that correspond to representations of events having the event attribute.
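As an illustrative sketch only (attribute names such as category and start_time are assumptions carried over from the earlier hypothetical records), the three kinds of filter selections might be applied as follows:

```python
from datetime import datetime
from typing import Iterable, List, Optional

def filter_events(events: Iterable[EventRecord],
                  category: Optional[str] = None,
                  start_time: Optional[datetime] = None) -> List[EventRecord]:
    # Category filter keeps only events in the specified event category;
    # the event attribute filter is illustrated here with a particular start time.
    kept = []
    for event in events:
        if category is not None and event.category != category:
            continue
        if start_time is not None and event.start_time != start_time:
            continue
        kept.append(event)
    return kept

def filter_participants(participants: Iterable[ParticipantRecord],
                        **required_attrs) -> List[ParticipantRecord]:
    # Participant filter keeps only participants whose attributes match,
    # e.g. filter_participants(records, location="hometown").
    return [p for p in participants
            if all(p.attributes.get(k) == v for k, v in required_attrs.items())]
```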
At operation 705, the data analysis module 204 identifies a set of participants for the subject event. The subject event corresponds to a received subject event identifier (e.g., the received subject event identifier discussed above in reference to operation 405 of method 400). The data analysis module 204 identifies the set of participants for the subject event by accessing the event data record of the subject event, which includes a list of participant identifiers for the subject event.
At operation 710, the data analysis module 204 determines preceding events participated in by at least one participant of the subject event. A preceding event is an event participated in by a participant prior to participating in the subject event. At operation 715, the data analysis module 204 determines subsequent events participated in by at least one participant of the subject event. A subsequent event is an event participated in by a participant subsequent to participating in the subject event. The data analysis module 204 determines the preceding and subsequent events using a combination of information including event data records (e.g., the list of participant identifiers for the event) and participant data records (e.g., events participated in by the participants). For example, the data analysis module 204 may access participant data records corresponding to the list of participant identifiers for the subject event. Each of the accessed participant data records indicates the events participated in by the participant and is linked to the event data records corresponding to these events. Using the start and end time event attributes of each event data record attended by an individual participant, the data analysis module 204 identifies preceding and subsequent events in which the individual participant participated. In some instances, the data analysis module 204 may determine that one or more participants of the subject event did not participate in an event prior to the subject event, or that one or more participants did not participate in an event subsequent to the subject event.
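A minimal sketch of this classification, assuming the hypothetical record types above and the assumed convention that a preceding event ends no later than the subject event starts while a subsequent event starts no earlier than the subject event ends:

```python
from typing import Dict, List, Tuple

def classify_events(subject: EventRecord,
                    participant: ParticipantRecord,
                    events_by_id: Dict[str, EventRecord]) -> Tuple[List[EventRecord], List[EventRecord]]:
    # Split one participant's other events into preceding and subsequent events
    # by comparing start/end time attributes against the subject event.
    preceding, subsequent = [], []
    for event_id in participant.event_ids:
        if event_id == subject.event_id:
            continue
        event = events_by_id[event_id]
        if event.end_time <= subject.start_time:
            preceding.append(event)
        elif event.start_time >= subject.end_time:
            subsequent.append(event)
    return preceding, subsequent
```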
At operation 720, the data analysis module 204 determines a participation quantity (e.g., a number of participants) for each preceding and subsequent event identified at operations 710 and 715, respectively. The participation quantity of the preceding and subsequent events is determined by summing a total number of participants associated with each respective event.
At operation 725, the data analysis module 204 determines flow quantities between each preceding event and the subject event, and between the subject event and each subsequent event. The data analysis module 204 determines the flow quantity between each preceding and subsequent event and the subject event by calculating a sum of participants of the subject event that also participated in the respective preceding or subsequent event.
As shown, the graphical representation of the event flow participation includes the graphical element 810 that represents the subject event. The graphical element 810 includes textual information about the event including an event title, information describing attributes of the event, a number of participants in the subject event, a number of participants in the event that did not participate in an event prior to the subject event (illustrated in
The graphical representation of the event flow participation further includes graphical elements that represent preceding events, which are events participated in by the participants of the subject event prior to participating in the subject event. For example, graphical element 812 represents “Preceding Event 1.”
The graphical representation of the event flow participation further includes graphical elements that represent subsequent events, which are events participated in by the participants of the subject event subsequent to participating in the subject event. For example, graphical element 814 represents “Subsequent Event 1.” Each of the graphical elements representing an event may include textual information about the event including, for example, an event title, event attributes, and a number of participants in the event. Additionally, the graphical elements representing events may be color coded according to the event category to which they belong.
The graphical elements representing events also include a textual indication of a percentage of participants in the subject event that are represented by the participants of the event the graphical element represents. Further, the size of each of the graphical elements relative to the size of the graphical element 810 representing the subject event represents the percentage of participants in the subject event that participated in the event the graphical element represents.
The graphical elements representing the preceding events are connected to the graphical element 810 representing the subject event by connector elements, and the graphical element 810 representing the subject event is also connected to the graphical elements representing subsequent events by connector elements. For example, graphical element 812 is connected to graphical element 810 by connector element 816, and the graphical element 810 is also connected to graphical element 814 by a connector element 818. The connector elements represent the flow of participants between events. A width of each connector represents a flow quantity. The flow quantity may represent a number of weighted or unweighted participants.
In some instances, a graphical element included in the graphical representation of event flow participation may represent multiple events. For example, events having a number of participants below a predefined threshold may be grouped together into a single element and labeled “other” as is the case with graphical elements 820 and 822 illustrated in
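A minimal sketch of this grouping step, assuming a caller-supplied threshold and (event name, participation quantity) pairs:

```python
from typing import Iterable, List, Tuple

def group_small_events(events_with_counts: Iterable[Tuple[str, int]],
                       threshold: int) -> List[Tuple[str, int]]:
    # Events whose participation quantity falls below the threshold are merged
    # into a single "Other" element; larger events keep their own elements.
    kept, other_total = [], 0
    for name, count in events_with_counts:
        if count < threshold:
            other_total += count
        else:
            kept.append((name, count))
    if other_total:
        kept.append(("Other", other_total))
    return kept
```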
A user may specify a new subject event for event participation visualization by selecting another event from the list of events included in the drop-down menu 802 or through selection of one of the graphical elements representing a preceding or subsequent event. For example, upon receiving a user selection of the graphical element 812 (e.g., by positioning the mouse cursor over the element and clicking it), the event participation flow visualization system 114 generates and causes presentation of a graphical representation of an updated event participation flow with the preceding event represented by the graphical element 812 (“Preceding Event #1”) as the subject event.
By way of non-limiting example, the machine 1100 may comprise or correspond to a television, a computer (e.g., a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, or a netbook), a set-top box (STB), a personal digital assistant (PDA), an entertainment media system (e.g., an audio/video receiver), a cellular telephone, a smart phone, a mobile device, a wearable device (e.g., a smart watch), a portable media player, or any machine capable of outputting audio signals and capable of executing the instructions 1102, sequentially or otherwise, that specify actions to be taken by the machine 1100. Further, while only a single machine 1100 is illustrated, the term “machine” shall also be taken to include a collection of machines 1100 that individually or jointly execute the instructions 1102 to perform any one or more of the methodologies discussed herein.
The machine 1100 may include processors 1104, memory 1106, a storage unit 1108, and I/O components 1110, which may be configured to communicate with each other such as via a bus 1112. In an example embodiment, the processors 1104 (e.g., a central processing unit (CPU), a reduced instruction set computing (RISC) processor, a complex instruction set computing (CISC) processor, a graphics processing unit (GPU), a digital signal processor (DSP), an application specific integrated circuit (ASIC), a radio-frequency integrated circuit (RFIC), another processor, or any suitable combination thereof) may include, for example, a processor 1114 and a processor 1116 that may execute the instructions 1102. The term “processor” is intended to include multi-core processors that may comprise two or more independent processors (sometimes referred to as “cores”) that may execute instructions contemporaneously. Although
The memory 1106 (e.g., a main memory or other memory storage) and the storage unit 1108 are both accessible to the processors 1104 such as via the bus 1112. The memory 1106 and the storage unit 1108 store the instructions 1102 embodying any one or more of the methodologies or functions described herein. In some embodiments, the database 116 resides on the storage unit 1108. The instructions 1102 may also reside, completely or partially, within the memory 1106, within the storage unit 1108, within at least one of the processors 1104 (e.g., within the processor's cache memory), or any suitable combination thereof, during execution thereof by the machine 1100. Accordingly, the memory 1106, the storage unit 1108, and the memory of processors 1104 are examples of machine-readable media.
As used herein, “machine-readable medium” means a device able to store instructions and data temporarily or permanently and may include, but is not limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, optical media, magnetic media, cache memory, other types of storage (e.g., electrically erasable programmable read-only memory (EEPROM)), or any suitable combination thereof. The term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions 1102. The term “machine-readable medium” shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions (e.g., instructions 1102) for execution by a machine (e.g., machine 1100), such that the instructions, when executed by one or more processors of the machine 1100 (e.g., processors 1104), cause the machine 1100 to perform any one or more of the methodologies described herein (e.g., methods 400, 500, 600, and 700). Accordingly, a “machine-readable medium” refers to a single storage apparatus or device, as well as “cloud-based” storage systems or storage networks that include multiple storage apparatus or devices. The term “machine-readable medium” excludes signals per se.
Furthermore, the “machine-readable medium” is non-transitory in that it does not embody a propagating signal. However, labeling the tangible machine-readable medium as “non-transitory” should not be construed to mean that the medium is incapable of movement—the medium should be considered as being transportable from one real-world location to another. Additionally, since the machine-readable medium is tangible, the medium may be considered to be a machine-readable device.
The I/O components 1110 may include a wide variety of components to receive input, provide output, produce output, transmit information, exchange information, capture measurements, and so on. The specific I/O components 1110 that are included in a particular machine will depend on the type of machine. For example, portable machines such as mobile phones will likely include a touch input device or other such input mechanisms, while a headless server machine will likely not include such a touch input device. It will be appreciated that the I/O components 1110 may include many other components that are not specifically shown in
Communication may be implemented using a wide variety of technologies. The I/O components 1110 may include communication components 1122 operable to couple the machine 1100 to a network 1124 or devices 1126 via coupling 1128 and coupling 1130, respectively. For example, the communication components 1122 may include a network interface component or other suitable device to interface with the network 1124. In further examples, communication components 1122 may include wired communication components, wireless communication components, cellular communication components, near field communication (NFC) components, Bluetooth® components (e.g., Bluetooth® Low Energy), Wi-Fi® components, and other communication components to provide communication via other modalities. The devices 1126 may be another machine or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via a Universal Serial Bus (USB)).
Modules, Components and Logic
Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules. A hardware module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client, or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
In various embodiments, a hardware module may be implemented mechanically or electronically. For example, a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
Accordingly, the term “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired) or temporarily configured (e.g., programmed) to operate in a certain manner and/or to perform certain operations described herein. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses that connect the hardware modules). In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.
Similarly, the methods described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment, or a server farm), while in other embodiments the processors may be distributed across a number of locations.
The one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., APIs).
Electronic Apparatus and System
Example embodiments may be implemented in digital electronic circuitry, or in computer hardware, firmware, or software, or in combinations of them. Example embodiments may be implemented using a computer program product, for example, a computer program tangibly embodied in an information carrier, for example, in a machine-readable medium for execution by, or to control the operation of, data processing apparatus, for example, a programmable processor, a computer, or multiple computers.
A computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a standalone program or as a module, subroutine, or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or on multiple computers at one site, or distributed across multiple sites and interconnected by a communication network.
In example embodiments, operations may be performed by one or more programmable processors executing a computer program to perform functions by operating on input data and generating output. Method operations can also be performed by, and apparatus of example embodiments may be implemented as, special purpose logic circuitry (e.g., an FPGA or an ASIC).
The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. In embodiments deploying a programmable computing system, it will be appreciated that both hardware and software architectures merit consideration. Specifically, it will be appreciated that the choice of whether to implement certain functionality in permanently configured hardware (e.g., an ASIC), in temporarily configured hardware (e.g., a combination of software and a programmable processor), or in a combination of permanently and temporarily configured hardware may be a design choice. Below are set out hardware (e.g., machine) and software architectures that may be deployed, in various example embodiments.
Language
Although the embodiments of the present invention have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader scope of the inventive subject matter. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense. The accompanying drawings that form a part hereof show by way of illustration, and not of limitation, specific embodiments in which the subject matter may be practiced. The embodiments illustrated are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed herein. Other embodiments may be used and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. This Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.
Such embodiments of the inventive subject matter may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent, to those of skill in the art, upon reviewing the above description.
All publications, patents, and patent documents referred to in this document are incorporated by reference herein in their entirety, as though individually incorporated by reference. In the event of inconsistent usages between this document and those documents so incorporated by reference, the usage in the incorporated references should be considered supplementary to that of this document; for irreconcilable inconsistencies, the usage in this document controls.
In this document, the terms “a” or “an” are used, as is common in patent documents, to include one or more than one, independent of any other instances or usages of “at least one” or “one or more.” In this document, the term “or” is used to refer to a nonexclusive or, such that “A or B” includes “A but not B,” “B but not A,” and “A and B,” unless otherwise indicated. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.” Also, in the following claims, the terms “including” and “comprising” are open-ended; that is, a system, device, article, or process that includes elements in addition to those listed after such a term in a claim is still deemed to fall within the scope of that claim.
The present application claims the benefit of priority of U.S. Provisional Application No. 62/244,585, filed on Oct. 21, 2015, which is incorporated herein by reference in its entirety.
20130050217 | Armitage | Feb 2013 | A1 |
20130054306 | Bhalla | Feb 2013 | A1 |
20130057551 | Ebert et al. | Mar 2013 | A1 |
20130096988 | Grossman et al. | Apr 2013 | A1 |
20130110746 | Ahn | May 2013 | A1 |
20130151453 | Bhanot et al. | Jun 2013 | A1 |
20130166348 | Scotto | Jun 2013 | A1 |
20130166480 | Popescu et al. | Jun 2013 | A1 |
20130185245 | Anderson | Jul 2013 | A1 |
20130185307 | El-Yaniv et al. | Jul 2013 | A1 |
20130226318 | Procyk | Aug 2013 | A1 |
20130238616 | Rose et al. | Sep 2013 | A1 |
20130246170 | Gross et al. | Sep 2013 | A1 |
20130246537 | Gaddala | Sep 2013 | A1 |
20130246597 | Iizawa et al. | Sep 2013 | A1 |
20130208565 | Castellanos et al. | Oct 2013 | A1 |
20130263019 | Castellanos et al. | Oct 2013 | A1 |
20130282696 | John et al. | Oct 2013 | A1 |
20130290825 | Arndt et al. | Oct 2013 | A1 |
20130297619 | Chandrasekaran et al. | Nov 2013 | A1 |
20130304770 | Boero et al. | Nov 2013 | A1 |
20140012796 | Petersen et al. | Jan 2014 | A1 |
20140040371 | Gurevich et al. | Feb 2014 | A1 |
20140058914 | Song et al. | Feb 2014 | A1 |
20140068487 | Steiger et al. | Mar 2014 | A1 |
20140095509 | Patton | Apr 2014 | A1 |
20140108380 | Gotz et al. | Apr 2014 | A1 |
20140108985 | Scott et al. | Apr 2014 | A1 |
20140123279 | Bishop et al. | May 2014 | A1 |
20140136285 | Carvalho | May 2014 | A1 |
20140143009 | Brice et al. | May 2014 | A1 |
20140156527 | Grigg et al. | Jun 2014 | A1 |
20140157172 | Peery et al. | Jun 2014 | A1 |
20140164502 | Khodorenko et al. | Jun 2014 | A1 |
20140189536 | Lange et al. | Jul 2014 | A1 |
20140195515 | Baker et al. | Jul 2014 | A1 |
20140222521 | Chait | Aug 2014 | A1 |
20140222793 | Sadkin et al. | Aug 2014 | A1 |
20140229554 | Grunin et al. | Aug 2014 | A1 |
20140344230 | Krause et al. | Nov 2014 | A1 |
20140358829 | Hurwitz | Dec 2014 | A1 |
20140366132 | Stiansen et al. | Dec 2014 | A1 |
20150073929 | Psota et al. | Mar 2015 | A1 |
20150073954 | Braff | Mar 2015 | A1 |
20150095773 | Gonsalves et al. | Apr 2015 | A1 |
20150100897 | Sun et al. | Apr 2015 | A1 |
20150106379 | Elliot et al. | Apr 2015 | A1 |
20150135256 | Hoy et al. | May 2015 | A1 |
20150188872 | White | Jul 2015 | A1 |
20150338233 | Cervelli et al. | Nov 2015 | A1 |
20150347903 | Saxena et al. | Dec 2015 | A1 |
20150379413 | Robertson et al. | Dec 2015 | A1 |
Foreign Patent Documents
Number | Date | Country |
---|---|---|
102546446 | Jul 2012 | CN |
103167093 | Jun 2013 | CN |
102054015 | May 2014 | CN |
102014204827 | Sep 2014 | DE |
102014204830 | Sep 2014 | DE |
102014204834 | Sep 2014 | DE |
2487610 | Aug 2012 | EP |
2858018 | Apr 2015 | EP |
2869211 | May 2015 | EP |
2889814 | Jul 2015 | EP |
2892197 | Jul 2015 | EP |
WO 2005/116851 | Dec 2005 | WO |
Other Publications
Entry |
---|
AMNET, “5 Great Tools for Visualizing Your Twitter Followers,” posted Aug. 4, 2010, http://www.amnetblog.com/component/content/article/115-5-grate-tools-for-visualizing-your-twitter-followers.html. |
APPACTS, “Smart Thinking for Super Apps,” <http://www.appacts.com> Printed Jul. 18, 2013 in 4 pages. |
APSALAR, “Data Powered Mobile Advertising,” “Free Mobile App Analytics” and various analytics related screen shots <http://apsalar.com> Printed Jul. 18, 2013 in 8 pages. |
Capptain—Pilot Your Apps, <http://www.capptain.com> Printed Jul. 18, 2013 in 6 pages. |
Celik, Tantek, “CSS Basic User Interface Module Level 3 (CSS3 UI),” Section 8 Resizing and Overflow, Jan. 17, 2012, retrieved from internet http://www.w3.org/TR/2012/WD-css3-ui-20120117/#resizing-amp-overflow retrieved on May 18, 2015 in 58 pages. |
Chaudhuri et al., “An Overview of Business Intelligence Technology,” Communications of the ACM, Aug. 2011, vol. 54, No. 8, pp. 88-98. |
Cohn et al., “Semi-supervised Clustering with User Feedback,” Constrained Clustering: Advances in Algorithms, Theory, and Applications 4.1, 2003, pp. 17-32. |
Countly Mobile Analytics, <http://count.ly/> Printed Jul. 18, 2013 in 9 pages. |
DISTIMO—App Analytics, <http://www.distimo.com/app-analytics> Printed Jul. 18, 2013 in 5 pages. |
Flurry Analytics, <http://www.flurry.com/> Printed Jul. 18, 2013 in 14 pages. |
Google Analytics Official Website—Web Analytics & Reporting, <http://www.google.com/analytics.index.html> Printed Jul. 18, 2013 in 22 pages. |
Gorr et al., “Crime Hot Spot Forecasting: Modeling and Comparative Evaluation,” Grant 98-IJ-CX-K005, May 6, 2002, 37 pages. |
Gu et al., “Record Linkage: Current Practice and Future Directions,” Jan. 15, 2004, pp. 32. |
Hansen et al., “Analyzing Social Media Networks with NodeXL: Insights from a Connected World,” Chapter 4, pp. 53-67 and Chapter 10, pp. 143-164, published Sep. 2010. |
Hua et al., “A Multi-attribute Data Structure with Parallel Bloom Filters for Network Services” HiPC 2006, LNCS 4297, pp. 277-288, 2006. |
“HunchLab: Heat Map and Kernel Density Calculation for Crime Analysis,” Azavea Journal, printed from www.azavea.com/blogs/newsletter/v4i4/kernel-density-capabilities-added-to-hunchlab/ on Sep. 9, 2014, 2 pages. |
Keylines.com, “An Introduction to KeyLines and Network Visualization,” Mar. 2014, <http://keylines.com/wp-content/uploads/2014/03/KeyLines-White-Paper.pdf> downloaded May 12, 2014 in 8 pages. |
Keylines.com, “KeyLines Datasheet,” Mar. 2014, <http://keylines.com/wp-content/uploads/2014/03/KeyLines-datasheet.pdf> downloaded May 12, 2014 in 2 pages. |
Keylines.com, “Visualizing Threats: Improved Cyber Security Through Network Visualization,” Apr. 2014, <http://keylines.com/wp-content/uploads/2014/04/Visualizing-Threats1.pdf> downloaded May 12, 2014 in 10 pages. |
Kontagent Mobile Analytics, <http://www.kontagent.com/> Printed Jul. 18, 2013 in 9 pages. |
Localytics—Mobile App Marketing & Analytics, <http://www.localytics.com/> Printed Jul. 18, 2013 in 12 pages. |
Manno et al., “Introducing Collaboration in Single-user Applications through the Centralized Control Architecture,” 2010, pp. 10. |
Mixpanel—Mobile Analytics, <https://mixpanel.com/> Printed Jul. 18, 2013 in 13 pages. |
Open Web Analytics (OWA), <http://www.openwebanalytics.com/> Printed Jul. 19, 2013 in 5 pages. |
Piwik—Free Web Analytics Software, <http://piwik.org/> Printed Jul. 19, 2013 in 18 pages. |
“Refresh CSS Ellipsis When Resizing Container—Stack Overflow,” Jul. 31, 2013, retrieved from internet http://stackoverflow.com/questions/17964681/refresh-css-ellipsis-when-resizing-container, retrieved on May 18, 2015. |
Sigrist et al., “PROSITE, a Protein Domain Database for Functional Characterization and Annotation,” Nucleic Acids Research 38.Suppl 1, 2010, pp. D161-D166. |
StatCounter—Free invisible Web Tracker, Hit Counter and Web Stats, <http://statcounter.com/> Printed Jul. 19, 2013 in 17 pages. |
TestFlight—Beta Testing on the Fly, <http://testflightapp.com/> Printed Jul. 18, 2013 in 3 pages. |
trak.io, <http://trak.io/> Printed Jul. 18, 2013 in 3 pages. |
UserMetrix, <http://usermetrix.com/android-analytics> Printed Jul. 18, 2013 in 3 pages. |
Valentini et al., “Ensembles of Learning Machines,” M. Marinaro and R. Tagliaferri (Eds.): WIRN VIETRI 2002, LNCS 2486, pp. 3-20. |
Vose et al., “Help File for ModelRisk Version 5,” 2007, Vose Software, pp. 349-353. [Uploaded in 2 Parts]. |
Wang et al., “Research on a Clustering Data De-Duplication Mechanism Based on Bloom Filter,” IEEE 2010, 5 pages. |
Wikipedia, “Multimap,” Jan. 1, 2013, https://en.wikipedia.org/w/index.php?title=Multimap&oldid=530800748. |
Notice of Allowance for U.S. Appl. No. 14/479,863 dated Mar. 31, 2015. |
Notice of Allowance for U.S. Appl. No. 14/225,084 dated May 4, 2015. |
Notice of Allowance for U.S. Appl. No. 14/319,161 dated May 4, 2015. |
Notice of Allowance for U.S. Appl. No. 14/323,935 dated Oct. 1, 2015. |
Notice of Allowance for U.S. Appl. No. 14/552,336 dated Nov. 3, 2015. |
Official Communication for U.S. Appl. No. 14/225,160 dated Jul. 29, 2014. |
Official Communication for U.S. Appl. No. 14/225,084 dated Sep. 2, 2014. |
Official Communication for U.S. Appl. No. 14/225,006 dated Sep. 10, 2014. |
Official Communication for U.S. Appl. No. 14/451,221 dated Oct. 21, 2014. |
Official Communication for U.S. Appl. No. 14/225,160 dated Oct. 22, 2014. |
Official Communication for U.S. Appl. No. 14/463,615 dated Nov. 13, 2014. |
Official Communication for U.S. Appl. No. 13/827,491 dated Dec. 1, 2014. |
Official Communication for U.S. Appl. No. 14/479,863 dated Dec. 26, 2014. |
Official Communication for U.S. Appl. No. 14/319,161 dated Jan. 23, 2015. |
Official Communication for U.S. Appl. No. 14/483,527 dated Jan. 28, 2015. |
Official Communication for U.S. Appl. No. 14/463,615 dated Jan. 28, 2015. |
Official Communication for U.S. Appl. No. 14/225,160 dated Feb. 11, 2015. |
Official Communication for U.S. Appl. No. 14/225,084 dated Feb. 20, 2015. |
Official Communication for U.S. Appl. No. 14/225,006 dated Feb. 27, 2015. |
Official Communication for U.S. Appl. No. 14/571,098 dated Mar. 11, 2015. |
Official Communication for U.S. Appl. No. 14/225,160 dated May 20, 2015. |
Official Communication for U.S. Appl. No. 14/463,615 dated May 21, 2015. |
Official Communication for U.S. Appl. No. 13/827,491 dated Jun. 22, 2015. |
Official Communication for U.S. Appl. No. 14/483,527 dated Jun. 22, 2015. |
Official Communication for U.S. Appl. No. 14/552,336 dated Jul. 20, 2015. |
Official Communication for U.S. Appl. No. 14/676,621 dated Jul. 30, 2015. |
Official Communication for U.S. Appl. No. 14/571,098 dated Aug. 5, 2015. |
Official Communication for U.S. Appl. No. 14/225,160 dated Aug. 12, 2015. |
Official Communication for U.S. Appl. No. 14/571,098 dated Aug. 24, 2015. |
Official Communication for U.S. Appl. No. 14/225,006 dated Sep. 2, 2015. |
Official Communication for U.S. Appl. No. 14/631,633 dated Sep. 10, 2015. |
Official Communication for U.S. Appl. No. 14/463,615 dated Sep. 10, 2015. |
Official Communication for U.S. Appl. No. 14/225,084 dated Sep. 11, 2015. |
Official Communication for U.S. Appl. No. 14/562,524 dated Sep. 14, 2015. |
Official Communication for U.S. Appl. No. 14/813,749 dated Sep. 28, 2015. |
Official Communication for U.S. Appl. No. 14/746,671 dated Sep. 28, 2015. |
Official Communication for U.S. Appl. No. 14/141,252 dated Oct. 8, 2015. |
Official Communication for U.S. Appl. No. 13/827,491 dated Oct. 9, 2015. |
Official Communication for U.S. Appl. No. 14/483,527 dated Oct. 28, 2015. |
Official Communication for U.S. Appl. No. 14/676,621 dated Oct. 29, 2015. |
Official Communication for U.S. Appl. No. 14/571,098 dated Nov. 10, 2015. |
Official Communication for U.S. Appl. No. 14/562,524 dated Nov. 10, 2015. |
Official Communication for U.S. Appl. No. 14/746,671 dated Nov. 12, 2015. |
Official Communication for U.S. Appl. No. 14/842,734 dated Nov. 19, 2015. |
Official Communication for U.S. Appl. No. 14/306,138 dated Dec. 3, 2015. |
Official Communication for U.S. Appl. No. 14/463,615 dated Dec. 9, 2015. |
Official Communication for U.S. Appl. No. 14/800,447 dated Dec. 10, 2015. |
Official Communication for U.S. Appl. No. 14/225,006 dated Dec. 21, 2015. |
Official Communication for U.S. Appl. No. 14/306,147 dated Dec. 24, 2015. |
Official Communication for U.S. Appl. No. 14/225,084 dated Jan. 4, 2016. |
Official Communication for New Zealand Patent Application No. 622473 dated Mar. 27, 2014. |
Official Communication for New Zealand Patent Application No. 622513 dated Apr. 3, 2014. |
Official Communication for New Zealand Patent Application No. 622473 dated Jun. 19, 2014. |
Official Communication for Great Britain Patent Application No. 1404499.4 dated Aug. 20, 2014. |
Official Communication for New Zealand Patent Application No. 628161 dated Aug. 25, 2014. |
Official Communication for Great Britain Patent Application No. 1404486.1 dated Aug. 27, 2014. |
Official Communication for Great Britain Patent Application No. 1404489.5 dated Aug. 27, 2014. |
Official Communication for Great Britain Patent Application No. 1404499.4 dated Sep. 29, 2014. |
Official Communication for Great Britain Patent Application No. 1404489.5 dated Oct. 6, 2014. |
Official Communication for European Patent Application No. 14187996.5 dated Feb. 12, 2015. |
Official Communication for European Patent Application No. 14200298.9 dated May 13, 2015. |
Official Communication for Great Britain Patent Application No. 1404486.1 dated May 21, 2015. |
Official Communication for Great Britain Patent Application No. 1404489.5 dated May 21, 2015. |
Official Communication for European Patent Application No. 14191540.5 dated May 27, 2015. |
Official Communication for European Patent Application No. 14200246.8 dated May 29, 2015. |
Official Communication for Great Britain Patent Application No. 1404499.4 dated Jun. 11, 2015. |
Official Communication for Netherlands Patent Application No. 2012421 dated Sep. 18, 2015. |
Official Communication for Netherlands Patent Application No. 2012417 dated Sep. 18, 2015. |
Official Communication for Netherlands Patent Application No. 2012438 dated Sep. 21, 2015. |
Official Communication for European Patent Application No. 15181419.1 dated Sep. 29, 2015. |
Official Communication for European Patent Application No. 15184764.7 dated Dec. 14, 2015. |
Related U.S. Application Data
Number | Date | Country |
---|---|---|
62/244,585 | Oct 2015 | US |