The present disclosure generally relates to the technical field of special-purpose machines that facilitate model objects, including computerized variants of such special-purpose machines and improvements to such variants, and to the technologies by which such special-purpose machines become improved compared to other special-purpose machines that facilitate model objects. In particular, the present disclosure addresses systems and methods for a model object management and storage system.
Currently, users utilizing models, such as mathematical models, statistical models, linear regression models, etc., are storing the models they create locally or in a personal folder. There is no centralized repository and/or standardized method for saving and accessing models. As a result, these users are not able to utilize models previously created by other users and build off of a previously achieved state of the model. Further, performance of the model at various states cannot be analyzed.
Various ones of the appended drawings merely illustrate example embodiments of the present disclosure and are not intended to limit its scope to the illustrated embodiments. On the contrary, these examples are intended to cover alternatives, modifications, and equivalents as may be included within the scope of the disclosure.
Reference will now be made in detail to specific example embodiments for carrying out the inventive subject matter of the present disclosure. In the following description, specific details are set forth in order to provide a thorough understanding of the subject matter. It shall be appreciated that embodiments may be practiced without some or all of these specific details.
Disclosed are systems, methods, and computer-readable storage media for a model object management and storage system. A model management system provides a centralized repository and standardized method for storing and accessing models. The model management system generates a model key identifying a model and associates various versions (i.e., states) of the model with the model key. As a result, each version of the model can be easily accessed. Each state of the model is saved as a file that includes the set of functions defining the respective model state. The model management system associates the files with the corresponding model key. The model management system uses the model key to identify each associated file representing the various versions (i.e., states) of the model.
Modeling (e.g., mathematical modeling) is the process of using various mathematical structures (e.g., graphs, equations, diagrams, scatterplots, tree diagrams, etc.) to represent real-world situations. A resulting model provides an abstraction that reduces a problem to its essential characteristics.
Models are useful for a variety of purposes. For example, models are useful tools for engineers studying the effects of traffic on a bridge, a telephone company that wants to know the best price to charge for long distance service, and social scientists that wish to predict trends in population and disease.
A model generally describes a system by a set of variables and a set of functions that establish relationships between the variables. Variables may be of many types, such as real or integer numbers, boolean values, strings, etc. Variables represent some known properties of the system, such as measured system outputs, timing data, counters, and event occurrence (yes/no). The model itself is the set of functions (e.g., equations) that describe the relations between the different variables. Each function includes one or more terms and/or operators. Terms in a model can include variables, parameters (i.e., constants), variables multiplied by parameters, variables multiplied by other variables, or describe connections between nodes and/or units (e.g., as in a decision tree or deep neural network). An operator is an operation to be performed between multiple terms in the function, such as add, subtract, etc.
Once a model has been created, it can be used to predict the performance of the system based on provided variables. For example, a user may enter a set of variables into the model to receive a determined output based on the set of functions that define the model. The determined output describes the expected performance of the system based on the entered variable. For example, an engineer studying the effects of traffic on a bridge can enter variables for number of cars, time of day and weather into a model describing traffic on the bridge to determine the expected traffic under the given conditions.
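The relationship between variables, terms, operators, and a determined output can be sketched briefly. The bridge-traffic function below is a hypothetical illustration (its variable names and parameter values are not drawn from this disclosure), shown only to make the structure of a model concrete:

```python
# Minimal sketch of one function of a model. The terms combine a
# parameter (constant), variables multiplied by parameters, and two
# variables multiplied together, joined with add operators.

def traffic_model(num_cars: int, hour_of_day: int, rain: bool) -> float:
    """Determined output: expected crossing delay, in minutes."""
    base_delay = 2.0                       # parameter (constant) term
    delay = base_delay
    delay += 0.05 * num_cars               # variable * parameter
    delay += 0.1 * (hour_of_day % 12)      # variable * parameter
    delay += 0.002 * num_cars * int(rain)  # variable * variable
    return delay

# Entering a set of variables yields the determined output.
print(traffic_model(num_cars=100, hour_of_day=8, rain=False))  # → 7.8
```

Given the entered variables, the function evaluates its terms to predict the system's performance under those conditions.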
In some example embodiments, machine learning techniques may be used to automate the process of building models. By using algorithms to iteratively learn from data, a set of variables may be analyzed to determine the functions that describe the relations between the different variables. For example, a set of known variables describing traffic on a bridge (e.g., time of day, number of cars, weather, etc.) can be analyzed using machine learning to determine functions describing the relationships between the variables.
As more known data is analyzed using machine learning, the accuracy of the model improves. Accordingly, additional variable data can be analyzed after a model has been generated to continuously fine tune the set of functions that define the model. For example, the parameters (i.e., constants) in a function may be updated as additional known variable data is analyzed.
A model can therefore progress through multiple states based on the amount of variable data that has been analyzed to generate the model. Each state of the model is represented by the resulting set of functions defining the model in the state. For example, a model generated based on a first set of variable data is in a first state as defined by the resulting functions (e.g., parameters, operators, etc.) determined based on the first set of variables. As additional variable data is analyzed (e.g., a second set of variable data), the functions (e.g., parameters, operators, etc.) are fine-tuned (i.e., modified), resulting in a second state of the model.
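The progression of a model through states can be sketched with a single parameter fitted by least squares. The data values and helper name below are hypothetical; the point is only that analyzing additional variable data modifies the parameters, yielding a new state:

```python
# Sketch of a model progressing through states as more variable data
# is analyzed. The model is y ≈ slope * x, with one parameter.

def fit_slope(xs, ys):
    """Least-squares slope through the origin: w = Σxy / Σx²."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# First state: parameters determined from a first set of variable data.
xs1, ys1 = [1.0, 2.0, 3.0], [2.1, 3.9, 6.0]
state1 = {"version": 1, "functions": {"slope": fit_slope(xs1, ys1)}}

# Second state: the parameter is fine-tuned once a second set of
# variable data is analyzed together with the first.
xs2, ys2 = xs1 + [4.0, 5.0], ys1 + [8.2, 9.9]
state2 = {"version": 2, "functions": {"slope": fit_slope(xs2, ys2)}}

print(state1["functions"]["slope"], state2["functions"]["slope"])
```

Each state is fully represented by its resulting set of functions, here reduced to one parameter value per version.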
As explained above, prior systems do not provide a centralized repository and/or standardized method for saving and accessing models. As a result, users cannot easily find previous states of a model generated by other users, which can be used to further refine the model.
As shown, system 100 can include multiple computing devices connected to communication network 102 and configured to communicate with each other through use of communication network 102. Communication network 102 can be any type of network, including a local area network (“LAN”), such as an intranet, a wide area network (“WAN”), such as the internet, or any combination thereof. Further, communication network 102 can be a public network, a private network, or a combination thereof. Communication network 102 can also be implemented using any number of communication links associated with one or more service providers, including one or more wired communication links, one or more wireless communication links, or any combination thereof. Additionally, communication network 102 can be configured to support the transmission of data formatted using any number of protocols.
Multiple computing devices can be connected to communication network 102. A computing device can be any type of general computing device capable of network communication with other computing devices. For example, a computing device can be a personal computing device such as a desktop or workstation, a business server, or a portable computing device, such as a laptop, smart phone, or a tablet PC. A computing device can include some or all of the features, components, and peripherals of computing device 500 of
To facilitate communication with other computing devices, a computing device can include a communication interface configured to receive a communication, such as a request, data, etc., from another computing device in network communication with the computing device and pass the communication along to an appropriate module running on the computing device. The communication interface can also be configured to send a communication to another computing device in network communication with the computing device.
As shown, system 100 includes client device 104 and model management system 106. In system 100, a user can interact with model management system 106 through client device 104 connected to communication network 102 by direct and/or indirect communication. Client device 104 can be any of a variety of types of computing devices that include at least a display, a computer processor, and communication capabilities that provide access to communication network 102 (e.g., a smart phone, a tablet computer, a personal digital assistant (PDA), a personal navigation device (PND), a handheld computer, a desktop computer, a laptop or netbook, or a wearable computing device).
Model management system 106 can consist of one or more computing devices and support connections from a variety of different types of client devices 104, such as desktop computers; mobile computers; mobile communications devices (e.g., mobile phones, smart phones, tablets, etc.); smart televisions; set-top boxes; and/or any other network-enabled computing devices. Client device 104 can be of varying type, capabilities, operating systems, etc. Furthermore, model management system 106 can concurrently accept connections from and interact with multiple client devices 104.
A user can interact with model management system 106 via client-side application 108 installed on client device 104. In some embodiments, client-side application 108 can include a model management system 106 specific component. For example, the component can be a stand-alone application, one or more application plug-ins, and/or a browser extension. However, the user can also interact with model management system 106 via third-party application 110, such as a web browser, that resides on client device 104 and is configured to communicate with model management system 106. In either case, client-side application 108 and/or third party application 110 can present a user interface (UI) for the user to interact with model management system 106.
Model management system 106 can include data storage 112 to store data. The stored data can include any type of data, such as digital data, documents, text files, audio files, video files, etc. Data storage 112 can be a storage device, multiple storage devices, or one or more servers. Alternatively, data storage 112 can be a cloud storage provider or network storage. Model management system 106 can store data in a storage area network (SAN) device, in a redundant array of inexpensive disks (RAID), etc. Data storage 112 can store data items using one or more file system types, such as FAT, FAT32, NTFS, EXT2, EXT3, EXT4, ReiserFS, BTRFS, and so forth.
Model management system 106 includes model manager 114 configured to enable users to create, store and access models. A model is any type of model designed to describe a system, such as a mathematical model, linear regression model, logistic regression model, random forest model, etc.
Model manager 114 provides a user with a model management interface that enables a user to utilize functionality of the model management system 106. For example, the model management interface can include user interface elements (e.g., buttons, text boxes, etc.) to create new models, store models and access stored models.
Model manager 114 utilizes a model key to associate multiple versions of a model. A model key can be any type of unique identifier used to identify a model. For example, a model key can be a string, number, etc. Model manager 114 generates a model key for a new model created by a user. Model manager 114 stores the model key in data storage 112.
Once a model key has been generated, model manager 114 uses the model key to associate multiple versions of the model as well as identify available versions of the model. For example, upon a user selecting to store a created model, model manager 114 stores the model in data storage 112 and associates the stored model with the corresponding model key. Accordingly, each version of the model will be associated via the model key.
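The key-based association described above can be sketched with an in-memory dictionary standing in for data storage 112. The helper names and the use of a UUID as the key are illustrative assumptions, not requirements of this disclosure:

```python
# Sketch of model-key generation and version association. A dict
# stands in for data storage 112; a UUID string serves as the key.
import uuid

model_key_index = {}  # model key -> list of associated version files

def generate_model_key() -> str:
    """Generate a unique identifier for a newly created model."""
    key = uuid.uuid4().hex
    model_key_index[key] = []
    return key

def associate_version(key: str, file_name: str) -> None:
    """Associate a stored model version with its model key."""
    model_key_index[key].append(file_name)

key = generate_model_key()
associate_version(key, "bridge_traffic_v1.json")
associate_version(key, "bridge_traffic_v2.json")
print(model_key_index[key])  # every stored version is reachable via the key
```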
To store a new version of a model, model manager 114 generates a new file that includes the set of functions defining the model state. The new file includes equations, terms, parameters, operators, etc., that were determined using machine learning on a set of known variables. The functions can be stored in a standardized format to allow cross-language utilization; however, the present disclosure supports both language-agnostic and language-specific parameter stores. Model manager 114 stores the generated file in data storage 112 and associates the file with its corresponding model key. The file can be any type of file. For example, in some embodiments, the file is a JavaScript Object Notation (JSON) file.
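Saving a model state as a JSON file might look as follows. The schema (field names such as `functions`, `terms`, `operator`) is a hypothetical illustration of a standardized format, not a format defined by this disclosure:

```python
# Sketch of writing one model state to a JSON file. JSON keeps the
# parameters and operators language-agnostic: any language with a
# JSON parser can reload the state.
import json
import os
import tempfile

state = {
    "model_key": "a1b2c3",  # hypothetical model key
    "version": 1,
    "functions": [
        {"output": "delay",
         "terms": [{"parameter": 2.0},
                   {"variable": "num_cars", "parameter": 0.05}],
         "operator": "add"}
    ],
}

path = os.path.join(tempfile.mkdtemp(), "model_v1.json")
with open(path, "w") as f:
    json.dump(state, f)

with open(path) as f:
    restored = json.load(f)  # the state round-trips intact
print(restored["version"])
```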
Model manager 114 may also associate metadata with the stored file, such as a version number, timestamp, creating user, etc. Modeling libraries corresponding to the model may also be stored in data storage 112 and associated with the model key.
Model manager 114 also uses the model keys to present a user with available models. For example, model manager 114 uses the model keys to present a listing of the models maintained by model management system 106. The listing of models is presented in the model management interface, which enables a user to view the available models as well as select a desired model to work with. Upon receiving a user selection selecting a presented model, model manager 114 uses the corresponding model key to identify the associated versions of the selected model. Model manager 114 then presents a listing of the available versions of the model in the model management interface.
In some embodiments, model manager 114 generates a model architecture file associated with each model. The model architecture file describes the pipeline and/or progression of the model. For example, the model architecture file may list each version of the model, the dates the version of the model was created, a user associated with each version of the model, the set of functions defining the model state of each version of the model, etc. Additionally, the model architecture file may include links to each version of the model. A user can therefore access the model architecture file to view the progression of the model and select a listed model to access the model. Model manager 114 stores the generated model architecture file in data storage 112 and associates the model architecture file with the corresponding model key.
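A model architecture file describing a model's progression might be structured as below. The field names, dates, users, and links are hypothetical illustrations:

```python
# Sketch of a model architecture file: it lists each version, when and
# by whom it was created, and a link (here, a file name) to each version.
architecture = {
    "model_key": "a1b2c3",
    "versions": [
        {"version": 1, "created": "2017-03-01", "user": "alice",
         "file": "model_v1.json"},
        {"version": 2, "created": "2017-04-15", "user": "bob",
         "file": "model_v2.json"},
    ],
}

# A user viewing the progression can follow the link to any version,
# e.g., the most recent one.
latest = max(architecture["versions"], key=lambda v: v["version"])
print(latest["file"])  # → model_v2.json
```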
A user can select a desired version of the model presented in the model management interface or the model architecture file to further build on the model, generate a variation of the model, etc. A user may also select to view performance of the model across multiple versions. Model manager 114 can generate performance data, graphs, etc., based on data gathered from two or more versions of the model. For example, model manager 114 can present changes to the functions defining the model across multiple versions (e.g., states) of the model. This can include presenting changes to parameters (i.e., constants) included in the functions.
Model manager 114 stores generated performance data, graphs, etc., in a similar fashion as the models themselves. For example, model manager 114 stores the generated performance data, graphs, etc., in data storage 112 and associates the performance data, graphs, etc., with the corresponding model key, which is used by model manager 114 to access the stored performance data, graphs, etc. Further, model manager 114 can add data describing the performance data, graphs, etc., to the model architecture file and add links to the model architecture file for accessing the performance data, graphs, etc.
As shown, model manager 114 includes interface module 202, key generation module 204, file generation module 206, storing module 208, model access module 210, model building module 212 and performance module 214.
Interface module 202 provides a user with a model management interface that enables the user to create, store and access models. The model management interface provides the user with user interface elements, such as text boxes, buttons, check boxes, etc., that allow a user to select from various functionality provided by model management system 106. For example, the model management interface includes user interface elements that a user can utilize to select to create, store or access models. As another example, the model management interface can present a listing of available models and corresponding versions of the model, thereby enabling the user to view and/or select available models.
Further, the model management interface may include a click to score button that enables a user to apply a selected version of a model to a compatible dataset to produce a scored dataset. For example, the click to score button, upon being selected, may cause the user to be prompted to first select a model from a list of available models. Upon the user selecting a model, the user may then be presented with a listing of available datasets to choose from to generate the scored dataset. Alternatively, in some embodiments, the user may first select a listed model and dataset and then select the click to score button to generate the scored dataset.
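The scoring step behind the click to score button can be sketched as applying the selected model version to each row of the chosen dataset. The scoring helper, lambda model, and dataset below are hypothetical:

```python
# Sketch of "click to score": apply a selected model version to a
# compatible dataset to produce a scored dataset.

def score_dataset(model, dataset):
    """Apply the model to each row, appending a 'score' column."""
    return [dict(row, score=model(row)) for row in dataset]

# Stand-in for the user's selected model version.
selected_model = lambda row: 0.05 * row["num_cars"]
dataset = [{"num_cars": 100}, {"num_cars": 40}]

scored = score_dataset(selected_model, dataset)
print(scored)  # each row now carries its model output
```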
Key generation module 204 generates a model key for a new model. A model key is any type of identifier used to identify a model, such as a string, integer, etc. In some embodiments, key generation module 204 generates a unique model key for a newly created model. Alternatively, in some embodiments, key generation module 204 utilizes a model key provided by a user. For example, key generation module 204 can cause the interface module to query a user to provide a model key for a new model.
In some embodiments, key generation module 204 stores generated model keys in data storage 112. Data storage 112 may store a model key index listing the generated model keys. Key generation module 204 communicates with data storage 112 to modify the model key index to add a newly generated key. For example, key generation module 204 creates a new entry in the model key index that includes the generated model key.
File generation module 206 generates a file to save a version (e.g., state) of a model. The file includes the set of functions defining the model state. The new file includes equations, terms, parameters, operators, etc., that were determined using machine learning on a set of known variables. The functions are stored in a standardized format to allow cross language utilization. The file can be any type of file. For example, in some embodiments, the file is a JavaScript Object Notation (JSON) file.
File generation module 206 can further generate metadata associated with the file that describes the file. The metadata can include any type of data describing the file and/or the model state represented by the file. For example, the metadata can include a model name, version number, creation date, creating user, description of the model, etc.
Storing module 208 stores the generated file in data storage 112 and associates the file with its corresponding model key. The stored file can be associated with the model key using any of a number of known techniques. For example, storing module 208 can associate the stored file with the model key by updating the model key index in data storage 112 to indicate that the file is associated with a corresponding model key. In this type of embodiment, storing module 208 searches for the corresponding model key in the model key index and updates the associated entry to include data identifying the file. This can include a pointer to the file, an identifier for the file, etc.
As another example, storing module 208 can store the model key with the file. For example, storing module 208 can update metadata associated with the file to include the model key.
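The two association techniques described above can be sketched side by side. The index layout and metadata field name are illustrative assumptions:

```python
# Sketch of the two association techniques: (1) updating the model key
# index entry in data storage, or (2) storing the model key inside the
# file's own metadata.

model_key_index = {"a1b2c3": {"files": []}}  # hypothetical index entry

def associate_via_index(key, file_id):
    """Technique 1: add an identifier for the file to the key's entry."""
    model_key_index[key]["files"].append(file_id)

def associate_via_metadata(file_record, key):
    """Technique 2: embed the model key in the file's metadata."""
    file_record.setdefault("metadata", {})["model_key"] = key
    return file_record

associate_via_index("a1b2c3", "model_v1.json")
record = associate_via_metadata({"name": "model_v1.json"}, "a1b2c3")
print(model_key_index["a1b2c3"]["files"], record["metadata"]["model_key"])
```

Either way, the file remains reachable from its model key: the first technique searches the index, the second searches file metadata.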
Model access module 210 provides access to models maintained by model management system 106. Model access module 210 communicates with data storage 112 to access model keys stored in data storage 112. For example, model access module 210 accesses the model key index and provides data identifying the models to interface module 202 to provide a listing of available models to a user. A user can use the provided listing to select a model to interact with.
Model access module 210 provides users with the available versions of a selected model. For example, model access module 210 identifies the files associated with a model key to identify the saved versions of the model. Model access module 210 provides data associated with the identified files to interface module 202 to be presented on the model management interface. For example, the data can include version numbers, creation dates, descriptions, creating users, etc. A user can therefore browse the various versions of a model and select models to utilize.
Model building module 212 enables a user to create a new model. For example, model building module 212 enables a user to build either a completely new model or a new model from an existing model (e.g., a new version of the model). Model building module 212 provides tools enabling a user to visually train a model, develop featurization, vectorization, modeling algorithms, etc.
Performance module 214 provides functionality to analyze performance of a model over multiple versions. For example, performance module 214 enables users to view data indicating changes to the functions, equations, parameters, etc., that represent the model. A user can therefore monitor and analyze changes, trends, etc., in the model across multiple versions.
At operation 302, interface module 202 receives an input to store a model object in a first model state. The first model state can be generated by model building module 212 based on a first set of known variables. For example, model building module 212 generates the first set of functions using machine learning based on the first set of known variables. The first set of functions defines the model object. For example, the first set of functions includes equations, parameters, operators, etc., that define the relationships between the variables.
At operation 304, file generation module 206 generates a first file including a first set of functions defining the first model state. The first file can be any type of file. For example, in some embodiments, the first file is a JSON file.
At operation 306, storing module 208 associates the first file with a model key identifying the model object. The model key is a string, integer, etc., that uniquely identifies the model object. Key generation module 204 generates the model key identifying the model object when the model object is initially created (e.g., when a user selects to create a first version of a new model, when a user selects to save a first version of a new model, etc.).
Storing module 208 stores the first file in data storage 112 and associates the first file with the model key. For example, storing module 208 can update a model key index stored in data storage 112 to indicate that the first file is associated with the model key. As another example, storing module 208 appends metadata to the first file indicating that the first file is associated with the model key. For example, the first file can be appended with the model key (e.g., characters, integers, etc.) itself.
At operation 308, interface module 202 receives an input to store the model object in a second model state. The second model state can be generated by model building module 212 based on the model object in the first model state and a second set of known variables that are different than the first set of known variables. For example, model building module 212 generates the second set of functions using machine learning based on the first state of the model object and the second set of known variables. The second set of functions includes updated equations, parameters, operators, etc., that define the relationships between the variables. For example, the second set of functions may include an updated parameter determined based on the second set of known variables.
At operation 310, file generation module 206 generates a second file including a second set of functions defining the second model state. The second file can be any type of file. For example, in some embodiments, the second file is a JSON file.
At operation 312, storing module 208 associates the second file with the model key identifying the model object. The second file is associated with the model key identifying the model object because the second state of the model object is an updated version of the model object. Storing module 208 stores the second file in data storage 112 and associates the second file with the model key. For example, storing module 208 updates the model key index stored in data storage 112 to indicate that the second file is associated with the model key. As another example, storing module 208 appends metadata to the second file indicating that the second file is associated with the model key. For example, the second file can be appended with the model key (e.g., characters, integers, etc.) itself.
Although method 300 describes only two versions of the model object being created and associated with the model key, this is only for ease of explanation and is not meant to be limiting. Any number of versions of the model object can be generated and associated with the model key, and this disclosure contemplates any such embodiments.
At operation 402, interface module 202 receives an input to access a model object. The input can be the result of a user utilizing the model management interface to select to access the model object.
At operation 404, model access module 210 identifies available versions of the model object based on the model key corresponding to the model object. In some embodiments, model access module 210 accesses the model key index in data storage 112 to identify the files associated with the model key. As another example, model access module 210 searches data storage 112 for model objects that have been appended with the model key.
The model access module 210 can provide data identifying the various versions of the model object to interface module 202. Interface module 202 can then present the data to the user in the model management interface. A user uses the data presented in the model management interface to view the available versions of the model and select to interact with one or more versions. For example, a user can select to build off of a version of the model.
As another example, the user can select to evaluate performance of the model object across multiple versions. For example, in response to interface module 202 receiving an input to evaluate performance of the model object, performance module 214 generates a report based on the set of functions defining the various states of the model object. The report can indicate changes from one version to the next, such as changes to parameters in the functions.
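A performance report indicating version-to-version changes can be sketched as a parameter diff. The version contents below are hypothetical:

```python
# Sketch of a performance report: for each parameter shared by two
# versions of a model object, report the change in its value.

versions = [
    {"version": 1, "parameters": {"slope": 1.993, "intercept": 0.10}},
    {"version": 2, "parameters": {"slope": 2.004, "intercept": 0.08}},
]

def parameter_changes(old, new):
    """Return the delta for each parameter present in both versions."""
    return {name: round(new["parameters"][name] - value, 6)
            for name, value in old["parameters"].items()
            if name in new["parameters"]}

report = parameter_changes(versions[0], versions[1])
print(report)  # → {'slope': 0.011, 'intercept': -0.02}
```

A user can inspect such a report, or a graph built from it, to monitor trends in the model across multiple states.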
By way of non-limiting example, computing device 500 may comprise or correspond to a television, a computer (e.g., a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, or a netbook), a set-top box (STB), a personal digital assistant (PDA), an entertainment media system (e.g., an audio/video receiver), a cellular telephone, a smart phone, a mobile device, a wearable device (e.g., a smart watch), a portable media player, or any machine capable of outputting audio signals and capable of executing instructions 502, sequentially or otherwise, that specify actions to be taken by computing device 500. Further, while only a single computing device 500 is illustrated, the term “machine” shall also be taken to include a collection of computing devices 500 that individually or jointly execute instructions 502 to perform any one or more of the methodologies discussed herein.
Computing device 500 may include processors 504, memory 506, storage unit 508 and I/O components 510, which may be configured to communicate with each other such as via bus 512. In an example embodiment, processors 504 (e.g., a central processing unit (CPU), a reduced instruction set computing (RISC) processor, a complex instruction set computing (CISC) processor, a graphics processing unit (GPU), a digital signal processor (DSP), an application specific integrated circuit (ASIC), a radio-frequency integrated circuit (RFIC), another processor, or any suitable combination thereof) may include, for example, processor 514 and processor 516 that may execute instructions 502. The term “processor” is intended to include multi-core processors that may comprise two or more independent processors (sometimes referred to as “cores”) that may execute instructions contemporaneously. Although
Memory 506 (e.g., a main memory or other memory storage) and storage unit 508 are both accessible to processors 504 such as via bus 512. Memory 506 and storage unit 508 store instructions 502 embodying any one or more of the methodologies or functions described herein. In some embodiments, database 516 resides on storage unit 508. Instructions 502 may also reside, completely or partially, within memory 506, within storage unit 508, within at least one of processors 504 (e.g., within the processor's cache memory), or any suitable combination thereof, during execution thereof by computing device 500. Accordingly, memory 506, storage unit 508, and the memory of processors 504 are examples of machine-readable media.
As used herein, “machine-readable medium” means a device able to store instructions and data temporarily or permanently and may include, but is not limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, optical media, magnetic media, cache memory, other types of storage (e.g., electrically erasable programmable read-only memory (EEPROM)), or any suitable combination thereof. The term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions 502. The term “machine-readable medium” shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions (e.g., instructions 502) for execution by a machine (e.g., computing device 500), such that the instructions, when executed by one or more processors of computing device 500 (e.g., processors 504), cause computing device 500 to perform any one or more of the methodologies described herein (e.g., methods 300 and 400). Accordingly, a “machine-readable medium” refers to a single storage apparatus or device, as well as “cloud-based” storage systems or storage networks that include multiple storage apparatus or devices. The term “machine-readable medium” excludes signals per se.
Furthermore, the “machine-readable medium” is non-transitory in that it does not embody a propagating signal. However, labeling the tangible machine-readable medium as “non-transitory” should not be construed to mean that the medium is incapable of movement—the medium should be considered as being transportable from one real-world location to another. Additionally, since the machine-readable medium is tangible, the medium may be considered to be a machine-readable device.
The I/O components 510 may include a wide variety of components to receive input, provide output, produce output, transmit information, exchange information, capture measurements, and so on. The specific I/O components 510 that are included in a particular machine will depend on the type of machine. For example, portable machines such as mobile phones will likely include a touch input device or other such input mechanisms, while a headless server machine will likely not include such a touch input device. It will be appreciated that I/O components 510 may include many other components that are not specifically shown in
Communication may be implemented using a wide variety of technologies. I/O components 510 may include communication components 522 operable to couple computing device 500 to network 524 or devices 526 via coupling 528 and coupling 530, respectively. For example, communication components 522 may include a network interface component or other suitable device to interface with network 524. In further examples, communication components 522 may include wired communication components, wireless communication components, cellular communication components, near field communication (NFC) components, Bluetooth® components (e.g., Bluetooth® Low Energy), WiFi® components, and other communication components to provide communication via other modalities. The devices 526 may be another machine or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via a Universal Serial Bus (USB)).
Modules, Components and Logic
Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules. A hardware module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client, or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
In various embodiments, a hardware module may be implemented mechanically or electronically. For example, a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
Accordingly, the term “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired) or temporarily configured (e.g., programmed) to operate in a certain manner and/or to perform certain operations described herein. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses that connect the hardware modules). In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
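As a concrete illustration of memory-mediated communication between modules instantiated at different times (a minimal Python sketch only, not part of the claimed subject matter; the module names and the use of a queue as the shared memory structure are illustrative assumptions):

```python
from queue import Queue

# Shared memory structure to which both modules are communicatively coupled.
shared_store = Queue()

def producer_module(data):
    # One module performs an operation and stores the output of that
    # operation in the shared memory structure.
    result = sum(data)
    shared_store.put(result)

def consumer_module():
    # A further module, at a later time, accesses the memory structure to
    # retrieve and process the stored output.
    stored = shared_store.get()
    return stored * 2

producer_module([1, 2, 3])
print(consumer_module())  # 12
```

The two functions never call each other directly; their only coupling is the stored value, mirroring how temporally separated hardware modules exchange information.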
The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.
Similarly, the methods described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment, or a server farm), while in other embodiments the processors may be distributed across a number of locations.
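By way of example (an illustrative Python sketch under assumed names; the chunking scheme and worker function are hypothetical), the operations of a method may be distributed among several processors using a process pool:

```python
from concurrent.futures import ProcessPoolExecutor

def process_chunk(chunk):
    # A processor-implemented module performs its share of the operations.
    return sum(x * x for x in chunk)

def run_distributed(data, workers=4):
    # Divide the input and distribute the work among the available
    # processors; the processors need not reside in a single machine.
    chunks = [data[i::workers] for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(process_chunk, chunks)
    return sum(partials)

if __name__ == "__main__":
    print(run_distributed(list(range(10))))  # 285
```

Each worker process here stands in for one processor-implemented module; aggregating the partial results corresponds to the method's overall output.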
The one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., APIs).
Electronic Apparatus and System
Example embodiments may be implemented in digital electronic circuitry, or in computer hardware, firmware, or software, or in combinations of them. Example embodiments may be implemented using a computer program product, for example, a computer program tangibly embodied in an information carrier, for example, in a machine-readable medium for execution by, or to control the operation of, data processing apparatus, for example, a programmable processor, a computer, or multiple computers.
A computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a standalone program or as a module, subroutine, or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or on multiple computers at one site, or distributed across multiple sites and interconnected by a communication network.
In example embodiments, operations may be performed by one or more programmable processors executing a computer program to perform functions by operating on input data and generating output. Method operations can also be performed by, and apparatus of example embodiments may be implemented as, special purpose logic circuitry (e.g., an FPGA or an ASIC).
The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. In embodiments deploying a programmable computing system, it will be appreciated that both hardware and software architectures merit consideration. Specifically, it will be appreciated that the choice of whether to implement certain functionality in permanently configured hardware (e.g., an ASIC), in temporarily configured hardware (e.g., a combination of software and a programmable processor), or in a combination of permanently and temporarily configured hardware may be a design choice. Below are set out hardware (e.g., machine) and software architectures that may be deployed in various example embodiments.
Language
Although the embodiments of the present invention have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader scope of the inventive subject matter. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense. The accompanying drawings that form a part hereof show by way of illustration, and not of limitation, specific embodiments in which the subject matter may be practiced. The embodiments illustrated are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed herein. Other embodiments may be used and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. This Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.
Such embodiments of the inventive subject matter may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent, to those of skill in the art, upon reviewing the above description.
All publications, patents, and patent documents referred to in this document are incorporated by reference herein in their entirety, as though individually incorporated by reference. In the event of inconsistent usages between this document and those documents so incorporated by reference, the usage in the incorporated references should be considered supplementary to that of this document; for irreconcilable inconsistencies, the usage in this document controls.
In this document, the terms “a” or “an” are used, as is common in patent documents, to include one or more than one, independent of any other instances or usages of “at least one” or “one or more.” In this document, the term “or” is used to refer to a nonexclusive or, such that “A or B” includes “A but not B,” “B but not A,” and “A and B,” unless otherwise indicated. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.” Also, in the following claims, the terms “including” and “comprising” are open-ended; that is, a system, device, article, or process that includes elements in addition to those listed after such a term in a claim is still deemed to fall within the scope of that claim.
This application is a continuation of U.S. patent application Ser. No. 17/093,381, filed Nov. 9, 2020, now U.S. Pat. No. 11,526,471, which is a continuation of U.S. patent application Ser. No. 15/891,493, filed Feb. 8, 2018, now U.S. Pat. No. 10,866,936, which claims priority to U.S. Provisional Application Ser. No. 62/478,372, filed Mar. 29, 2017, the disclosures of which are incorporated herein in their entireties by reference.
Number | Name | Date | Kind |
---|---|---|---|
5515488 | Hoppe et al. | May 1996 | A |
6430305 | Decker | Aug 2002 | B1 |
6820135 | Dingman et al. | Nov 2004 | B1 |
6978419 | Kantrowitz | Dec 2005 | B1 |
6980984 | Huffman et al. | Dec 2005 | B1 |
7168039 | Bertram | Jan 2007 | B2 |
7461077 | Greenwood et al. | Dec 2008 | B1 |
7505990 | Krishna et al. | Mar 2009 | B2 |
7617232 | Gabbert et al. | Nov 2009 | B2 |
7756843 | Palmer | Jul 2010 | B1 |
7899796 | Borthwick et al. | Mar 2011 | B1 |
7917376 | Bellin et al. | Mar 2011 | B2 |
7941321 | Greenstein et al. | May 2011 | B2 |
8036971 | Aymeloglu et al. | Oct 2011 | B2 |
8037046 | Udezue et al. | Oct 2011 | B2 |
8046283 | Burns et al. | Oct 2011 | B2 |
8054756 | Chand et al. | Nov 2011 | B2 |
8214490 | Vos et al. | Jul 2012 | B1 |
8229902 | Vishniac et al. | Jul 2012 | B2 |
8239821 | Cook et al. | Aug 2012 | B2 |
8290838 | Thakur et al. | Oct 2012 | B1 |
8291230 | Moore et al. | Oct 2012 | B2 |
8302855 | Ma et al. | Nov 2012 | B2 |
8386377 | Xiong et al. | Feb 2013 | B1 |
8473454 | Evanitsky et al. | Jun 2013 | B2 |
8484115 | Aymeloglu et al. | Jul 2013 | B2 |
8489641 | Seefeld et al. | Jul 2013 | B1 |
8577911 | Stepinski et al. | Nov 2013 | B1 |
8589273 | Creeden et al. | Nov 2013 | B2 |
8688573 | Rukonic et al. | Apr 2014 | B1 |
8744890 | Bernier et al. | Jun 2014 | B1 |
8799799 | Cervelli et al. | Aug 2014 | B1 |
8806355 | Twiss et al. | Aug 2014 | B2 |
8812960 | Sun et al. | Aug 2014 | B1 |
8924388 | Elliot et al. | Dec 2014 | B2 |
8924389 | Elliot et al. | Dec 2014 | B2 |
8938686 | Erenrich et al. | Jan 2015 | B1 |
8949164 | Mohler | Feb 2015 | B1 |
9069842 | Melby | Jun 2015 | B2 |
9100428 | Visbal | Aug 2015 | B1 |
9111281 | Stibel et al. | Aug 2015 | B2 |
9129219 | Robertson et al. | Sep 2015 | B1 |
9256664 | Chakerian et al. | Feb 2016 | B2 |
9280618 | Bruce et al. | Mar 2016 | B1 |
9286373 | Elliot et al. | Mar 2016 | B2 |
9335911 | Elliot et al. | May 2016 | B1 |
10866936 | Lisuk et al. | Dec 2020 | B1 |
20020065708 | Senay et al. | May 2002 | A1 |
20020095360 | Joao | Jul 2002 | A1 |
20020095658 | Shulman et al. | Jul 2002 | A1 |
20020103705 | Brady | Aug 2002 | A1 |
20020147805 | Leshem et al. | Oct 2002 | A1 |
20030126102 | Borthwick | Jul 2003 | A1 |
20040034570 | Davis | Feb 2004 | A1 |
20040111480 | Yue | Jun 2004 | A1 |
20040153418 | Hanweck | Aug 2004 | A1 |
20040236688 | Bozeman | Nov 2004 | A1 |
20050010472 | Quatse et al. | Jan 2005 | A1 |
20050086207 | Heuer et al. | Apr 2005 | A1 |
20050154628 | Eckart et al. | Jul 2005 | A1 |
20050154769 | Eckart et al. | Jul 2005 | A1 |
20060026120 | Carolan et al. | Feb 2006 | A1 |
20060026170 | Kreitler et al. | Feb 2006 | A1 |
20060080283 | Shipman | Apr 2006 | A1 |
20060143034 | Rothermel et al. | Jun 2006 | A1 |
20060143075 | Carr et al. | Jun 2006 | A1 |
20060143079 | Basak et al. | Jun 2006 | A1 |
20070000999 | Kubo et al. | Jan 2007 | A1 |
20070011304 | Error | Jan 2007 | A1 |
20070038646 | Thota | Feb 2007 | A1 |
20070150801 | Chidlovskii et al. | Jun 2007 | A1 |
20070156673 | Maga et al. | Jul 2007 | A1 |
20070162454 | D'Albora et al. | Jul 2007 | A1 |
20070185867 | Maga et al. | Aug 2007 | A1 |
20070192122 | Routson et al. | Aug 2007 | A1 |
20070284433 | Domenica et al. | Dec 2007 | A1 |
20080065655 | Chakravarthy et al. | Mar 2008 | A1 |
20080069081 | Chand et al. | Mar 2008 | A1 |
20080077642 | Carbone | Mar 2008 | A1 |
20080103996 | Forman et al. | May 2008 | A1 |
20080208735 | Balet et al. | Aug 2008 | A1 |
20080222295 | Robinson et al. | Sep 2008 | A1 |
20080243711 | Aymeloglu et al. | Oct 2008 | A1 |
20080255973 | El Wade et al. | Oct 2008 | A1 |
20080270328 | Lafferty et al. | Oct 2008 | A1 |
20080294663 | Heinley et al. | Nov 2008 | A1 |
20080313132 | Hao et al. | Dec 2008 | A1 |
20090076845 | Bellin et al. | Mar 2009 | A1 |
20090094166 | Aymeloglu et al. | Apr 2009 | A1 |
20090094270 | Alirez et al. | Apr 2009 | A1 |
20090106178 | Chu | Apr 2009 | A1 |
20090112745 | Stefanescu | Apr 2009 | A1 |
20090125359 | Knapic et al. | May 2009 | A1 |
20090125459 | Norton et al. | May 2009 | A1 |
20090132953 | Reed, Jr. et al. | May 2009 | A1 |
20090157732 | Hao et al. | Jun 2009 | A1 |
20090187546 | Whyte | Jul 2009 | A1 |
20090187548 | Ji et al. | Jul 2009 | A1 |
20090210858 | Son et al. | Aug 2009 | A1 |
20090249244 | Robinson et al. | Oct 2009 | A1 |
20090254842 | Leacock et al. | Oct 2009 | A1 |
20090259636 | Labrou et al. | Oct 2009 | A1 |
20090271343 | Vaiciulis et al. | Oct 2009 | A1 |
20090307049 | Elliott, Jr. et al. | Dec 2009 | A1 |
20090313463 | Pang et al. | Dec 2009 | A1 |
20090319418 | Herz | Dec 2009 | A1 |
20090319515 | Minton et al. | Dec 2009 | A1 |
20090319891 | MacKinlay et al. | Dec 2009 | A1 |
20100030722 | Goodson et al. | Feb 2010 | A1 |
20100031141 | Summers et al. | Feb 2010 | A1 |
20100042922 | Bradateanu et al. | Feb 2010 | A1 |
20100057622 | Faith | Mar 2010 | A1 |
20100070842 | Aymeloglu et al. | Mar 2010 | A1 |
20100098318 | Anderson | Apr 2010 | A1 |
20100106752 | Eckardt, III et al. | Apr 2010 | A1 |
20100114887 | Conway et al. | May 2010 | A1 |
20100131502 | Fordham | May 2010 | A1 |
20100161735 | Sharma | Jun 2010 | A1 |
20100191563 | Schlaifer et al. | Jul 2010 | A1 |
20100211535 | Rosenberger | Aug 2010 | A1 |
20100235915 | Memon et al. | Sep 2010 | A1 |
20100262688 | Hussain et al. | Oct 2010 | A1 |
20100293174 | Bennett | Nov 2010 | A1 |
20100312837 | Bodapati et al. | Dec 2010 | A1 |
20110040776 | Najm et al. | Feb 2011 | A1 |
20110061013 | Bilicki et al. | Mar 2011 | A1 |
20110078173 | Seligmann et al. | Mar 2011 | A1 |
20110093327 | Fordyce, III et al. | Apr 2011 | A1 |
20110099133 | Chang et al. | Apr 2011 | A1 |
20110153384 | Horne et al. | Jun 2011 | A1 |
20110173093 | Psota et al. | Jul 2011 | A1 |
20110208565 | Ross et al. | Aug 2011 | A1 |
20110208724 | Jones et al. | Aug 2011 | A1 |
20110213655 | Henkin et al. | Sep 2011 | A1 |
20110218955 | Tang et al. | Sep 2011 | A1 |
20110270604 | Qi et al. | Nov 2011 | A1 |
20110270834 | Sokolan et al. | Nov 2011 | A1 |
20110289397 | Eastmond et al. | Nov 2011 | A1 |
20110295649 | Fine et al. | Dec 2011 | A1 |
20110314007 | Dassa et al. | Dec 2011 | A1 |
20110314024 | Chang et al. | Dec 2011 | A1 |
20120004904 | Shin et al. | Jan 2012 | A1 |
20120011238 | Rathod | Jan 2012 | A1 |
20120011245 | Gillette et al. | Jan 2012 | A1 |
20120022945 | Falkenborg et al. | Jan 2012 | A1 |
20120054284 | Rakshit | Mar 2012 | A1 |
20120059853 | Jagota | Mar 2012 | A1 |
20120066166 | Curbera et al. | Mar 2012 | A1 |
20120079363 | Folting et al. | Mar 2012 | A1 |
20120084117 | Tavares et al. | Apr 2012 | A1 |
20120084287 | Lakshminarayan et al. | Apr 2012 | A1 |
20120089606 | Eshwar et al. | Apr 2012 | A1 |
20120131512 | Takeuchi et al. | May 2012 | A1 |
20120144335 | Abeln et al. | Jun 2012 | A1 |
20120158527 | Cannelongo | Jun 2012 | A1 |
20120159362 | Brown et al. | Jun 2012 | A1 |
20120173381 | Smith | Jul 2012 | A1 |
20120215784 | King et al. | Aug 2012 | A1 |
20120221553 | Wittmer et al. | Aug 2012 | A1 |
20120226523 | Weiss et al. | Sep 2012 | A1 |
20120245976 | Kumar et al. | Sep 2012 | A1 |
20120323888 | Osann, Jr. | Dec 2012 | A1 |
20130016106 | Yip et al. | Jan 2013 | A1 |
20130054306 | Bhalla et al. | Feb 2013 | A1 |
20130055145 | Antony et al. | Feb 2013 | A1 |
20130057551 | Ebert et al. | Mar 2013 | A1 |
20130096988 | Grossman et al. | Apr 2013 | A1 |
20130110746 | Ahn | May 2013 | A1 |
20130151453 | Bhanot et al. | Jun 2013 | A1 |
20130166348 | Scotto | Jun 2013 | A1 |
20130166480 | Popescu et al. | Jun 2013 | A1 |
20130185245 | Anderson et al. | Jul 2013 | A1 |
20130185307 | El-yaniv et al. | Jul 2013 | A1 |
20130218879 | Park et al. | Aug 2013 | A1 |
20130226318 | Procyk et al. | Aug 2013 | A1 |
20130238616 | Rose et al. | Sep 2013 | A1 |
20130246170 | Gross et al. | Sep 2013 | A1 |
20130246537 | Gaddala | Sep 2013 | A1 |
20130246597 | Iizawa et al. | Sep 2013 | A1 |
20130263019 | Castellanos et al. | Oct 2013 | A1 |
20130268520 | Fisher et al. | Oct 2013 | A1 |
20130282696 | John et al. | Oct 2013 | A1 |
20130290825 | Arndt et al. | Oct 2013 | A1 |
20130297619 | Chandrasekaran et al. | Nov 2013 | A1 |
20130304770 | Boero et al. | Nov 2013 | A1 |
20130318604 | Coates et al. | Nov 2013 | A1 |
20140012796 | Petersen et al. | Jan 2014 | A1 |
20140040371 | Gurevich et al. | Feb 2014 | A1 |
20140053091 | Hou et al. | Feb 2014 | A1 |
20140058914 | Song et al. | Feb 2014 | A1 |
20140068487 | Steiger et al. | Mar 2014 | A1 |
20140095509 | Patton | Apr 2014 | A1 |
20140108380 | Gotz et al. | Apr 2014 | A1 |
20140108985 | Scott et al. | Apr 2014 | A1 |
20140123279 | Bishop et al. | May 2014 | A1 |
20140136285 | Carvalho | May 2014 | A1 |
20140143009 | Brice et al. | May 2014 | A1 |
20140156527 | Grigg et al. | Jun 2014 | A1 |
20140157172 | Peery et al. | Jun 2014 | A1 |
20140164502 | Khodorenko et al. | Jun 2014 | A1 |
20140189536 | Lange et al. | Jul 2014 | A1 |
20140189870 | Singla et al. | Jul 2014 | A1 |
20140195515 | Baker et al. | Jul 2014 | A1 |
20140222521 | Chait | Aug 2014 | A1 |
20140222793 | Sadkin et al. | Aug 2014 | A1 |
20140229554 | Grunin et al. | Aug 2014 | A1 |
20140280056 | Kelly | Sep 2014 | A1 |
20140282160 | Zarpas | Sep 2014 | A1 |
20140344230 | Krause et al. | Nov 2014 | A1 |
20140358829 | Hurwitz | Dec 2014 | A1 |
20140366132 | Stiansen et al. | Dec 2014 | A1 |
20150073929 | Psota et al. | Mar 2015 | A1 |
20150073954 | Braff | Mar 2015 | A1 |
20150095773 | Gonsalves et al. | Apr 2015 | A1 |
20150100897 | Sun et al. | Apr 2015 | A1 |
20150106170 | Bonica | Apr 2015 | A1 |
20150106379 | Elliot et al. | Apr 2015 | A1 |
20150134599 | Banerjee et al. | May 2015 | A1 |
20150135256 | Hoy et al. | May 2015 | A1 |
20150172120 | Dwarampudi et al. | Jun 2015 | A1 |
20150188872 | White | Jul 2015 | A1 |
20150242401 | Liu | Aug 2015 | A1 |
20150338233 | Cervelli et al. | Nov 2015 | A1 |
20150363435 | Ott et al. | Dec 2015 | A1 |
20150379413 | Robertson et al. | Dec 2015 | A1 |
20160004764 | Chakerian et al. | Jan 2016 | A1 |
20160180557 | Yousaf et al. | Jun 2016 | A1 |
20160292591 | Guirguis et al. | Oct 2016 | A1 |
20210056083 | Lisuk et al. | Feb 2021 | A1 |
Number | Date | Country |
---|---|---|
102546446 | Jul 2012 | CN |
103167093 | Jun 2013 | CN |
102054015 | May 2014 | CN |
102014204827 | Sep 2014 | DE |
102014204830 | Sep 2014 | DE |
102014204834 | Sep 2014 | DE |
2487610 | Aug 2012 | EP |
2858018 | Apr 2015 | EP |
2869211 | May 2015 | EP |
2889814 | Jul 2015 | EP |
2892197 | Jul 2015 | EP |
2963595 | Jan 2016 | EP |
2996053 | Mar 2016 | EP |
3035214 | Jun 2016 | EP |
3038002 | Jun 2016 | EP |
3040885 | Jul 2016 | EP |
WO-2005116851 | Dec 2005 | WO |
WO-2012061162 | May 2012 | WO |
Entry |
---|
Krasner et al., “A Description of the Model-View Controller User Interface Paradigm in the Smalltalk-80 System,” 1988, Journal of Object-Oriented Programming, pp. 1-4. (Year: 1988). |
Dan R. Olsen Jr., “A Programming Language Basis for User Interface Management,” CHI'89 Proceedings, ACM, May 1989, pp. 171-176. (Year: 1989). |
U.S. Appl. No. 15/891,493, U.S. Pat. No. 10,866,936, filed Feb. 8, 2018, Model Object Management and Storage System. |
U.S. Appl. No. 17/093,381, filed Nov. 9, 2020, Model Object Management and Storage System. |
“5 Great Tools for Visualizing your Twitter Followers”, Amnet Blog, [Online] Retrieved from the Internet: <URL: http://www.amnetblog.com/component/content/article/115-5-great-tools-for-visualizing-your-twitter-followers.html>, (Aug. 4, 2010), 1-5. |
“About OWA”, Open Web Analytics, [Online]. Retrieved from the Internet: <URL: http://www.openwebanalytics.com/?page_id=2>, (Accessed: Jul. 19, 2013), 5 pgs. |
“An Introduction to KeyLines and Network Visualization”, Keylines.com, [Online]. Retrieved from the Internet: <URL: http://keylines.com/wp-content/uploads/2014/03/KeyLines-White-Paper.pdf>, (Mar. 2014), 8 pgs. |
“Analytics For Data Driven Startups”, Trak.io, [Online]. Retrieved from the Internet: <URL: http://trak.io/>, (Accessed: Jul. 18, 2013), 3 pgs. |
“U.S. Appl. No. 13/827,491, Final Office Action dated Jun. 22, 2015”, 28 pgs. |
“U.S. Appl. No. 13/827,491, Non Final Office Action dated Oct. 9, 2015”, 16 pgs. |
“U.S. Appl. No. 13/827,491, Non Final Office Action dated Dec. 1, 2014”, 5 pgs. |
“U.S. Appl. No. 14/141,252, Final Office Action dated Apr. 14, 2016”, 28 pgs. |
“U.S. Appl. No. 14/141,252, Non Final Office Action dated Oct. 8, 2015”, 11 pgs. |
“U.S. Appl. No. 14/225,006, Advisory Action dated Dec. 21, 2015”, 4 pgs. |
“U.S. Appl. No. 14/225,006, Final Office Action dated Sep. 2, 2015”, 28 pgs. |
“U.S. Appl. No. 14/225,006, First Action Interview Pre-Interview Communication dated Feb. 27, 2015”, 5 pgs. |
“U.S. Appl. No. 14/225,006, First Action Interview Pre-Interview Communication dated Sep. 10, 2014”, 4 pgs. |
“U.S. Appl. No. 14/225,084, Examiner Interview Summary dated Jan. 4, 2016”, 3 pgs. |
“U.S. Appl. No. 14/225,084, Final Office Action dated Feb. 26, 2016”, 14 pgs. |
“U.S. Appl. No. 14/225,084, First Action Interview Pre-Interview Communication dated Feb. 20, 2015”, 5 pgs. |
“U.S. Appl. No. 14/225,084, First Action Interview Pre-Interview Communication dated Sep. 2, 2014”, 17 pgs. |
“U.S. Appl. No. 14/225,084, Non Final Office Action dated Sep. 11, 2015”, 13 pgs. |
“U.S. Appl. No. 14/225,084, Notice of Allowance dated May 4, 2015”, 26 pgs. |
“U.S. Appl. No. 14/225,160, Advisory Action dated May 20, 2015”, 7 pgs. |
“U.S. Appl. No. 14/225,160, Examiner Interview Summary dated Apr. 22, 2016”, 7 pgs. |
“U.S. Appl. No. 14/225,160, Final Office Action dated Jan. 25, 2016”, 25 pgs. |
“U.S. Appl. No. 14/225,160, Final Office Action dated Feb. 11, 2015”, 30 pgs. |
“U.S. Appl. No. 14/225,160, First Action Interview Pre-Interview Communication dated Jul. 29, 2014”, 19 pgs. |
“U.S. Appl. No. 14/225,160, First Action Interview Pre-Interview Communication dated Oct. 22, 2014”, 6 pgs. |
“U.S. Appl. No. 14/225,160, Non Final Office Action dated Jun. 16, 2016”, 14 pgs. |
“U.S. Appl. No. 14/225,160, Non Final Office Action dated Aug. 12, 2015”, 23 pgs. |
“U.S. Appl. No. 14/306,138, Examiner Interview Summary dated Dec. 3, 2015”, 3 pgs. |
“U.S. Appl. No. 14/306,138, Examiner Interview Summary dated Dec. 24, 2015”, 5 pgs. |
“U.S. Appl. No. 14/306,147, Final Office Action dated Dec. 24, 2015”, 22 pgs. |
“U.S. Appl. No. 14/319,161, Final Office Action dated Jan. 23, 2015”, 21 pgs. |
“U.S. Appl. No. 14/319,161, Notice of Allowance dated May 4, 2015”, 6 pgs. |
“U.S. Appl. No. 14/319,765, Non Final Office Action dated Feb. 1, 2016”, 19 pgs. |
“U.S. Appl. No. 14/323,935, Notice of Allowance dated Oct. 1, 2015”, 8 pgs. |
“U.S. Appl. No. 14/451,221, Non Final Office Action dated Oct. 21, 2014”, 16 pgs. |
“U.S. Appl. No. 14/463,615, Advisory Action dated Sep. 10, 2015”, 3 pgs. |
“U.S. Appl. No. 14/463,615, Final Office Action dated May 21, 2015”, 31 pgs. |
“U.S. Appl. No. 14/463,615, First Action Interview Pre-Interview Communication dated Jan. 28, 2015”, 29 pgs. |
“U.S. Appl. No. 14/463,615, First Action Interview Pre-Interview Communication dated Nov. 13, 2014”, 4 pgs. |
“U.S. Appl. No. 14/463,615, Non Final Office Action dated Dec. 9, 2015”, 44 pgs. |
“U.S. Appl. No. 14/479,863, First Action Interview Pre-Interview Communication dated Dec. 26, 2014”, 5 pgs. |
“U.S. Appl. No. 14/479,863, Notice of Allowance dated Mar. 31, 2015”, 23 pgs. |
“U.S. Appl. No. 14/483,527, Final Office Action dated Jun. 22, 2015”, 17 pgs. |
“U.S. Appl. No. 14/483,527, First Action Interview Pre-Interview Communication dated Jan. 28, 2015”, 6 pgs. |
“U.S. Appl. No. 14/483,527, Non Final Office Action dated Oct. 28, 2015”, 20 pgs. |
“U.S. Appl. No. 14/483,527, Notice of Allowance dated Apr. 29, 2016”, 34 pgs. |
“U.S. Appl. No. 14/552,336, First Action Interview Pre-Interview Communication dated Jul. 20, 2015”, 18 pgs. |
“U.S. Appl. No. 14/552,336, Notice of Allowance dated Nov. 3, 2015”, 13 pgs. |
“U.S. Appl. No. 14/562,524, First Action Interview Pre-Interview Communication dated Sep. 14, 2015”, 12 pgs. |
“U.S. Appl. No. 14/562,524, First Action Interview Pre-Interview Communication dated Nov. 10, 2015”, 6 pgs. |
“U.S. Appl. No. 14/571,098, Final Office Action dated Feb. 23, 2016”, 37 pgs. |
“U.S. Appl. No. 14/571,098, First Action Interview dated Aug. 24, 2015”, 4 pgs. |
“U.S. Appl. No. 14/571,098, First Action Interview Pre-Interview Communication dated Mar. 11, 2015”, 4 pgs. |
“U.S. Appl. No. 14/571,098, First Action Interview Pre-Interview Communication dated Aug. 5, 2015”, 4 pgs. |
“U.S. Appl. No. 14/571,098, First Action Interview Pre-Interview Communication dated Nov. 10, 2015”, 5 pgs. |
“U.S. Appl. No. 14/631,633, First Action Interview Pre-Interview Communication dated Sep. 10, 2015”, 5 pgs. |
“U.S. Appl. No. 14/676,621, Examiner Interview Summary dated Jul. 30, 2015”, 5 pgs. |
“U.S. Appl. No. 14/676,621, Final Office Action dated Oct. 29, 2015”, 10 pgs. |
“U.S. Appl. No. 14/746,671, First Action Interview Pre-Interview Communication dated Nov. 12, 2015”, 19 pgs. |
“U.S. Appl. No. 14/746,671, Notice of Allowance dated Jan. 21, 2016”, 7 pgs. |
“U.S. Appl. No. 14/800,447, First Action Interview-Pre-Interview Communication dated Dec. 10, 2015”, 6 pgs. |
“U.S. Appl. No. 14/813,749, Final Office Action dated Apr. 8, 2016”, 80 pgs. |
“U.S. Appl. No. 14/813,749, Non Final Office Action dated Sep. 28, 2015”, 22 pgs. |
“U.S. Appl. No. 14/842,734, First Action Interview Pre-Interview Communication dated Nov. 19, 2015”, 17 pgs. |
“U.S. Appl. No. 14/858,647, Notice of Allowance dated Mar. 4, 2016”, 47 pgs. |
“U.S. Appl. No. 14/929,584, Final Office Action dated May 25, 2016”, 42 pgs. |
“U.S. Appl. No. 14/929,584, Non Final Office Action dated Feb. 4, 2016”, 15 pgs. |
“U.S. Appl. No. 15/891,493, Notice of Allowance dated Aug. 10, 2020”, 9 pgs. |
“U.S. Appl. No. 17/093,381, Notice of Allowance dated Aug. 10, 2022”, 10 pgs. |
“Apsalar—Mobile App Analytics & Advertising”, Data Powered Mobile Advertising, https://apsalar.com/, (Jul. 18, 2013), 1-8. |
“Beta Testing On The Fly”, TestFlight, [Online]. Retrieved from the Internet: <URL: https://testflightapp.com/>, (Accessed: Jul. 18, 2013), 3 pgs. |
“Countly”, Countly Mobile Analytics, [Online]. Retrieved from the Internet: <URL: http://count.ly/products/screenshots>, (accessed Jul. 18, 2013), 9 pgs. |
“DISTIMO—App Analytics”, [Online]. Retrieved from the Internet: <URL: http://www.distimo.com/app-analytics>, (accessed Jul. 18, 2013), 5 pgs. |
“European Application Serial No. 14187996.5, Communication Pursuant to Article 94(3) EPC dated Feb. 19, 2016”, 9 pgs. |
“European Application Serial No. 14187996.5, Extended European Search Report dated Feb. 12, 2015”, 7 pgs. |
“European Application Serial No. 14191540.5, Extended European Search Report dated May 27, 2015”, 9 pgs. |
“European Application Serial No. 14200246.8, Extended European Search Report dated May 29, 2015”, 8 pgs. |
“European Application Serial No. 14200298.9, Extended European Search Report dated May 13, 2015”, 7 pgs. |
“European Application Serial No. 14202919.5, Office Action dated May 9, 2016”, 13 pgs. |
“European Application Serial No. 15181419.1, Extended European Search Report dated Sep. 29, 2015”, 7 pgs. |
“European Application Serial No. 15184764.7, Extended European Search Report dated Dec. 14, 2015”, 8 pgs. |
“European Application Serial No. 15200073.3, Extended European Search Report dated Mar. 30, 2016”, 16 pgs. |
“European Application Serial No. 15201924.6, Extended European Search Report dated Apr. 25, 2016”, 8 pgs. |
“European Application Serial No. 16152984.7, Extended European Search Report dated Mar. 24, 2016”, 8 pgs. |
“Flurry Analytics”, [Online]. Retrieved from the Internet: <URL: http://www.flurry.com/>, (accessed Jul. 18, 2013), 14 pgs. |
“Google Analytics Official Website—Web Analytics & Reporting”, [Online]. Retrieved from the Internet: <URL: http://www.google.com/analytics/index.html>, (accessed Jul. 18, 2013), 22 pgs. |
“Great Britain Application Serial No. 1404486.1, Combined Search Report and Examination Report dated Aug. 27, 2014”, 5 pgs.
“Great Britain Application Serial No. 1404486.1, Office Action dated May 21, 2015”, 2 pgs.
“Great Britain Application Serial No. 1404489.5, Combined Search Report and Examination Report dated Aug. 27, 2014”, 5 pgs.
“Great Britain Application Serial No. 1404489.5, Office Action dated May 21, 2015”, 3 pgs.
“Great Britain Application Serial No. 1404489.5, Office Action dated Oct. 6, 2014”, 1 pg.
“Great Britain Application Serial No. 1404499.4, Combined Search Report and Examination Report dated Aug. 20, 2014”, 6 pgs.
“Great Britain Application Serial No. 1404499.4, Office Action dated Jun. 11, 2015”, 5 pgs.
“Great Britain Application Serial No. 1404499.4, Office Action dated Sep. 29, 2014”, 1 pg.
“Help File for ModelRisk Version 5—Part 1”, Vose Software, (2007), 375 pgs.
“Help File for ModelRisk Version 5—Part 2”, Vose Software, (2007), 362 pgs.
“Hunchlab: Heat Map and Kernel Density Calculation for Crime Analysis”, Azavea Journal, [Online]. Retrieved from the Internet: <URL: www.azavea.com/blogs/newsletter/v4i4/kernel-density-capabilities-added-to-hunchlab>, (Sep. 9, 2014), 2 pgs.
“KeyLines Datasheet”, Keylines.com, [Online]. Retrieved from the Internet: <URL: http://keylines.com/wp-content/uploads/2014/03/KeyLines-datasheet.pdf>, (Mar. 2014), 2 pgs.
“Mixpanel: Actions speak louder than page views”, Mobile Analytics, [Online]. Retrieved from the Internet: <URL: https://mixpanel.com/>, (Accessed: Jul. 18, 2013), 13 pgs.
“Mobile App Marketing & Analytics”, Localytics, [Online]. Retrieved from the Internet: <URL: http://www.localytics.com/>, (Accessed: Jul. 18, 2013), 12 pgs.
“Mobile Web”, Wikipedia, [Online] Retrieved from the Internet: <URL: https://en.wikipedia.org/w/index.php?title=Mobile_Web&oldid=643800164>, (Jan. 23, 2015), 6 pgs.
“More than android analytics”, UserMetrix, [Online]. Retrieved from the Internet: <URL: http://usermetrix.com/android-analytics>, (Accessed: Jul. 18, 2013), 3 pgs.
“More Than Mobile Analytics”, Kontagent, [Online]. Retrieved from the Internet: <URL: http://www.kontagent.com/>, (Accessed: Jul. 18, 2013), 9 pgs.
“Multimap”, Wikipedia, [Online]. Retrieved from the Internet: <URL: https://en.wikipedia.org/w/index.php?title=Multimap&oldid=530800748>, (Jan. 1, 2013), 2 pgs.
“Netherlands Application Serial No. 2012417, Netherlands Search Report dated Sep. 18, 2015”, w/ English Translation, 9 pgs.
“Netherlands Application Serial No. 2012421, Netherlands Search Report dated Sep. 18, 2015”, 8 pgs.
“Netherlands Application Serial No. 2012438, Search Report dated Sep. 21, 2015”, 8 pgs.
“New Zealand Application Serial No. 622473, First Examination Report dated Mar. 27, 2014”, 3 pgs.
“New Zealand Application Serial No. 622473, Office Action dated Jun. 19, 2014”, 2 pgs.
“New Zealand Application Serial No. 622513, Office Action dated Apr. 3, 2014”, 2 pgs.
“New Zealand Application Serial No. 628161, First Examination Report dated Aug. 25, 2014”, 2 pgs.
“Piwik—Free Web Analytics Software”, Piwik, [Online]. Retrieved from the Internet: <URL: http://piwik.org/>, (Accessed: Jul. 19, 2013), 18 pgs.
“Realtime Constant Customer Touchpoint”, Capptain—Pilot your apps, [Online] Retrieved from the Internet: <URL: http://www.capptain.com>, (accessed Jul. 18, 2013), 6 pgs.
“Refresh CSS ellipsis when resizing container”, Stack Overflow, [Online]. Retrieved from the Internet: <URL: http://stackoverflow.com/questions/17964681/refresh-css-ellipsis-when-resizing-container>, Accessed: May 18, 2015, (Jul. 31, 2013), 1 pg.
“SAP BusinessObjects Explorer Online Help”, SAP BusinessObjects, (Mar. 19, 2012), 68 pgs.
“Smart Thinking for Super Apps”, Appacts: Open Source Mobile Analytics Platform, [Online] Retrieved from the Internet: <URL: http://www.appacts.com>, (Jul. 18, 2013), 1-4.
“Visualizing Threats: Improved Cyber Security Through Network Visualization”, Keylines.com, [Online] Retrieved from the Internet: <URL: http://keylines.com/wp-content/uploads/2014/04/Visualizing-Threats1.pdf>, (May 12, 2014), 10 pgs.
“Welcome to StatCounter—Visitor Analysis for Your Website”, StatCounter—Free Invisible Web Tracker, Hit Counter and Web Stats, [Online]. Retrieved from the Internet: <URL: http://statcounter.com/>, (Accessed: Jul. 19, 2013), 17 pgs.
Psaltis, Andrew G., “Streaming Data—Designing the real-time pipeline”, vol. MEAP V03, (Jan. 16, 2015), 12 pgs.
Bertino, Elisa, et al., “Object-Oriented Database Management Systems: Concepts and Issues”, IEEE, (Apr. 1991), 33-47.
Celik, T, “CSS Basic User Interface Module Level 3 (CSS3 UI)”, Section 8; Resizing and Overflow, [Online] Retrieved from the Internet: <URL: http://www.w3.org/TR/2012/WD-css3-ui-20120117/#resizing-amp-overflow>, (Jan. 17, 2012), 1-58.
Chaudhuri, Surajit, et al., “An Overview of Business Intelligence Technology”, Communications of the ACM, vol. 54, No. 8., (Aug. 2011), 88-98.
Cohn, David, et al., “Semi-supervised Clustering with User Feedback”, Cornell University, Constrained Clustering: Advances in Algorithms, Theory, and Applications 4.1, (2003), 9 pgs.
Dolk, Daniel R, et al., “Knowledge Representation for Model Management Systems”, IEEE Transactions on Software Engineering, vol. SE-10, No. 6, (Nov. 1984), 619-628.
Gill, Leicester, et al., “Computerised linking of medical records: methodological guidelines”, Journal of Epidemiology and Community Health 1993; 47, (Feb. 1993), 316-319.
Gorr, et al., “Crime Hot Spot Forecasting: Modeling and Comparative Evaluation”, Grant 98-IJ-CX-K005, (May 6, 2002), 37 pgs.
Goyal, Gaurav, et al., “A detailed analysis of data consistency concepts in data exchange formats (JSON and XML)”, International Conference on Computing, Communication and Automation (ICCCA), Greater Noida, IN, (2017), 72-77.
Gu, Lifang, et al., “Record Linkage: Current Practice and Future Directions”, (Jan. 15, 2004), 32 pgs.
Hansen, D., et al., “Analyzing Social Media Networks with NodeXL: Insights from a Connected World”, Chapter 4, pp. 53-67 and Chapter 10, pp. 143-164, (Sep. 2010), 53-67; 143-164.
Hua, Yu, et al., “A Multi-attribute Data Structure with Parallel Bloom Filters for Network Services”, HiPC 2006, LNCS 4297, (2006), 277-288.
Janssen, Jan-Keno, “Wo bist'n du?—Googles Geodienst Latitude” [English: “Where Are You? Google's Geoservice Latitude”], w/ English Translation; Issue 3; 86-88, [Online] Retrieved from the Internet: <URL: http://www.heise.de/artikel-archiv/ct/2011/03/086/@00250@/ct.11.03.086-088.pdf>, (Jan. 17, 2011), 6 pgs.
Manno, et al., “Introducing Collaboration in Single-user Applications through the Centralized Control Architecture”, (2010), 10 pgs.
Sigrist, Christian, et al., “PROSITE, a Protein Domain Database for Functional Characterization and Annotation”, Nucleic Acids Research, vol. 38, (2010), D161-D166.
Valentini, Giorgio, et al., “Ensembles of Learning Machines”, Lecture Notes in Computer Science: Neural Nets, Springer Berlin Heidelberg, (Sep. 26, 2002), 3-20.
Wang, Guohua, et al., “Research on a Clustering Data De-Duplication Mechanism Based on Bloom Filter”, IEEE, (2010), 5 pgs.
Windley, J Phillip, “The Live Web: Building Event-Based Connections in the Cloud”, Course Technology PTR Chapters 1, 2, and 10, (Dec. 21, 2011), 61 pgs.
Winkler, William E, et al., “Bureau of the Census Statistical Research Division Record Linkage Software and Methods for Merging Administrative Lists”, Statistical Research Report Series, No. RR2001/03, (Jul. 23, 2001), 11 pgs.
Prior Publication Data

Number | Date | Country
---|---|---
20230081135 A1 | Mar 2023 | US

Provisional Application

Number | Date | Country
---|---|---
62478372 | Mar 2017 | US

Related U.S. Application Data

Relation | Number | Date | Country
---|---|---|---
Parent | 17093381 | Nov 2020 | US
Child | 18051035 | | US
Parent | 15891493 | Feb 2018 | US
Child | 17093381 | | US