Searches of highly structured data

Information

  • Patent Grant
  • Patent Number
    10,296,617
  • Date Filed
    Monday, October 5, 2015
  • Date Issued
    Tuesday, May 21, 2019
Abstract
Techniques related to searches of highly structured data are described. A body of data may be represented by an object-centric data model. For a search of the body of data, an indication of a particular search template to use may be received. The particular search template may specify one or more hierarchical object types that are within a scope of the search. The one or more hierarchical object types may be defined in the object-centric data model. The particular search template may specify at least one search field. A user interface may be generated based on the particular search template. The user interface may include the at least one search field.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application incorporates by reference the entirety of U.S. patent application Ser. No. 13/831,199, filed Mar. 14, 2013.


TECHNICAL FIELD

The present Application relates to information retrieval technology. More specifically, the example embodiment(s) described below relate to searches of highly structured data.


BACKGROUND

The approaches described in this section are approaches that could be pursued, but not necessarily approaches that have been previously conceived or pursued. Therefore, unless otherwise indicated, it should not be assumed that any of the approaches described in this section qualify as prior art merely by virtue of their inclusion in this section.


Computers are very powerful tools for searching information. A search engine is a common mechanism that allows users to search for information using computers. A search engine accepts a search query, typically composed of one or more keywords, as input and provides a search result as output. The search result identifies information that the search engine has determined “satisfies” the search query. Search engines often maintain an index of a corpus of information that allows the search engine to efficiently identify information in the corpus that satisfies a given search query.


One type of well-known search engine is an Internet search engine. Internet search engines are useful for searching semi-structured or unstructured data, such as the text content of a web page. However, the user interfaces of Internet search engines typically take a “one size fits all” approach with regard to how the user may scope the search. In particular, Internet search engines typically provide only a single text entry field into which the user enters one or more keywords. The Internet search engine then uses the entered keywords to identify information items that satisfy the entered keywords. This approach works well with the semi-structured and unstructured data that is indexed by Internet search engines, because such data typically is not represented by a highly structured data model that is known to the user prior to the search. However, for highly structured data represented by a data model that the user has knowledge of prior to the search, the limited user interfaces provided by Internet search engines may be inadequate, inefficient, or cumbersome for users.





BRIEF DESCRIPTION OF THE DRAWINGS

The example embodiment(s) of the present Application are illustrated, by way of example and not limitation, in the accompanying drawings in which like reference numerals refer to similar elements and in which:



FIG. 1 depicts an example object-centric data model.



FIG. 2 depicts an example ontology.



FIG. 3 depicts an example user interface.



FIG. 4 depicts a plurality of example search templates.



FIGS. 5A-C depict example approaches for obtaining input to search fields.



FIG. 6 depicts an example customizable format for search results.



FIGS. 7-9 depict example detailed views of a search result.



FIG. 10 is a flow diagram that depicts an approach for searching highly structured data, according to some example embodiments of the present invention.



FIG. 11 depicts an example computer system in which embodiments may be implemented.



FIG. 12 is a very general block diagram of a computing device in which the example embodiment(s) of the present Application may be embodied.



FIG. 13 is a block diagram of a basic software system for controlling the operation of the computing device.





DESCRIPTION OF THE EXAMPLE EMBODIMENT(S)

In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the example embodiment(s) of the present Application. It will be apparent, however, that the example embodiment(s) may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the example embodiment(s). Modifiers such as “first” and “second” may be used to differentiate elements, but the modifiers do not necessarily indicate any particular order.


General Overview

Computer-implemented techniques for helping a user search a body of data that is highly structured are described. In some example embodiments, the body of data is represented by an object-centric data model. The object-centric data model is centered on the notion of data objects and properties of the data objects. Furthermore, the object-centric data model is based on an ontology that defines hierarchical object types and property types. For example, a data object in the body of data may have the hierarchical object type “Employee” and have a “Name” property, a “Title” property, and a “Salary” property. The ontology may define the “Employee” object type as a child object type of the “Person” object type. Thus, the “Employee” object is also a “Person” object according to the object-centric data model.


In some example embodiments, the techniques encompass a computer-implemented method performed at one or more computing devices. The one or more computing devices include one or more processors and storage media storing one or more computer programs executed by the one or more processors to perform the method.


Performance of the method includes performing the operation of receiving an indication of a search template to use for a search of the body of data represented by the object-centric data model. The search may have a scope specified by the template. In particular, the search template may specify one or more hierarchical object types, defined in the object-centric data model, that are within the scope of the search. The search template may also specify a plurality of search fields.


Performance of the method may further include performing the operations of: based on the search template, generating a user interface that includes the plurality of search fields; and providing the user interface to a user. For at least two reasons, the user interface may allow a user to search the body of data more efficiently and with higher precision and recall. One reason is that searches of the body of data initiated via the user interface may be scoped according to the one or more hierarchical object types specified in the search template. Scoping a search may involve associating a set of search fields with the one or more hierarchical object types that are specified. Another reason is that the search fields presented to the user in the user interface may be limited to those that are relevant to the one or more hierarchical object types specified in the search template. In other words, each search field may impose a property restriction on the search, the property restriction being dependent on the one or more hierarchical object types that are specified.


Example Object-Centric Data Model


FIG. 1 depicts an example object-centric data model. Referring to FIG. 1, object-centric data model 100 includes data objects 102, 104 and relationships 106A-N. Data object 102 includes object ID 108, object type 110, displayed data 112, property IDs 114A-N, and related object IDs 116. Property ID 114A is associated with property type 118, displayed type 120, and property value 122. Relationship 106N is associated with relationship type 124, displayed type 126, and related object IDs 128.


Object-centric data model 100 may be a logical data model that defines how data is represented. Object-centric data model 100 may be independent of any data storage model. For example, data may be stored in a relational database or a key-value store and still be represented by object-centric data model 100.


At a minimum, object-centric data model 100 is built on the notion of a data object 102, 104. A data object 102, 104 may represent a particular person, a particular location, a particular organization, a particular event, a particular document, or other instance of a noun. For example, a particular data object may correspond to “Barack Obama”, “San Francisco”, “Stanford University”, “2008 Financial Crisis”, etc.


A data object 102, 104 may be associated with zero or more properties. A property of a data object 102, 104 may be an attribute of the data object 102, 104 and may represent an individual data item. For example, a property may include a name, height, weight, or phone number of a person.


Data objects 102, 104 may be related based on one or more relationships 106A-N. A relationship 106A-N may be symmetric or asymmetric. For example, a pair of data objects 102, 104 may be related by an asymmetric “child of” relationship and/or a symmetric “kin of” relationship.


At the highest level of abstraction, a data object 102, 104 may be a container for information. The information may include object ID 108, object type 110, displayed data 112, property IDs 114A-N, related object IDs 116, any associated media (e.g., image, video recording, audio recording), any links to associated media, and/or any other data relevant to the data object 102, 104.


Data objects 102, 104 may be referenced based on unique identifiers that uniquely identify each data object 102, 104. Thus, a particular data object may store object ID 108, which is the unique identifier assigned to the particular data object. The particular data object may also store one or more related object IDs 116, which are the unique identifiers assigned to data objects 102, 104 that share one or more relationships 106A-N with the particular data object.
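The container structure described above can be sketched as follows. This is an illustrative assumption, not the patent's actual implementation; the class name, field names, and the `resolve_related` helper are invented for clarity, though the fields mirror object ID 108, object type 110, displayed data 112, property IDs 114A-N, and related object IDs 116.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class DataObject:
    object_id: str                # unique identifier (object ID 108)
    object_type: str              # category, e.g. a URI (object type 110)
    displayed_data: str           # user-friendly representation (112)
    property_ids: List[str] = field(default_factory=list)        # property IDs 114A-N
    related_object_ids: List[str] = field(default_factory=list)  # related object IDs 116

def resolve_related(obj: DataObject, index: Dict[str, DataObject]) -> List[DataObject]:
    """Look up the data objects that share a relationship with obj,
    using an index keyed by unique object identifier."""
    return [index[oid] for oid in obj.related_object_ids if oid in index]
```

Because each data object stores only the identifiers of its related objects, resolving a relationship is a lookup by unique identifier rather than a scan of the body of data.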


Object type 110 may indicate a category of data objects 102, 104. Example categories include “person”, “location”, “organization”, “event”, and “document”. The category may be indicated by a Uniform Resource Identifier (URI). For example, if a particular data object represents “Barack Obama”, then object type 110 may be “com.palantir.object.person”.


Displayed data 112 may be a user-friendly representation of a particular data object. For example, displayed data 112 may include a user-friendly version of object type 110, such as “Person”; a commonly used name for the particular data object, such as “Barack Obama”; and/or a thumbnail preview of the particular data object.


If a particular data object is associated with any properties, the particular data object may store property IDs 114A-N. Each property ID of property IDs 114A-N may be associated with a property type 118, a displayed type 120, and one or more property values 122.


Property type 118 may indicate a category of properties, and displayed type 120 may be a user-friendly version of property type 118. For example, if property type 118 is “com.palantir.property.name”, then displayed type 120 may be “Name”. A particular data object may have multiple properties of the same type. For example, a person may have multiple aliases.


Property value 122 may include one or more values of a particular property. Multiple values may correspond to component values. For example, the property value 122 “Barack Obama” may be broken down into component values “Barack” and “Obama”.
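A minimal sketch of the component-value breakdown in the “Barack Obama” example above (the helper name and whitespace-splitting rule are illustrative assumptions):

```python
from typing import List

def component_values(value: str) -> List[str]:
    """Break a multi-word property value down into its component values."""
    return value.split()
```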


If a particular data object stores any related object IDs 116, a search for matching object identifiers may be performed to determine a relationship 106A-N with the particular data object. Each relationship of relationships 106A-N may be associated with a relationship type 124, a displayed type 126, and related object IDs 128.


Relationship type 124 may indicate a category of relationships 106A-N, and displayed type 126 may be a user-friendly version of relationship type 124. For example, if relationship type 124 is “com.palantir.relationship.appearsin”, then displayed type 126 may be “Appears In”. In other words, a search of data object 102 may cause displaying of displayed type 126 of relationship 106A and/or displayed data 112 of data object 104. Any further data related to data object 104 may be retrieved based on an additional search.


Example Ontology

Object-centric data model 100 may include a hierarchical data type ontology. FIG. 2 depicts an example ontology. Referring to FIG. 2, ontology 200 includes object types 202A-N, 212A-N. Each object type of object types 202A-N, 212A-N is associated with property types 204, 206, 214, 216 and one or more relationship types 208, 210, 218, 220.


Ontology 200 may be a hierarchical organization of object types 202A-N, 212A-N; property types 204, 206, 214, 216; and/or relationship types 208, 210, 218, 220. In other words, ontology 200 may define which of the property types 204, 206, 214, 216 and/or relationship types 208, 210, 218, 220 correspond to a particular object type. Ontology 200 may be static or dynamic, depending on whether it can be modified by an end user.


In the example of FIG. 2, object types 202A and 202N share the same hierarchical level but correspond to different hierarchical object types. For example, object types 202A and 202N may correspond to the hierarchical object types “Person” and “Location”, respectively. Different hierarchical object types may be associated with one or more different property types 204, 206, 214, 216 and/or one or more different relationship types 208, 210, 218, 220. For example, property types 204 may include “Name”, “Height”, “Weight”, and/or any other property types 204 that are relevant to the hierarchical object type “Person”. In contrast, property types 206 may include “Name”, “Latitude”, “Longitude”, and/or any other property types 206 that are relevant to the hierarchical object type “Location”.


In FIG. 2, object types 212A-N are depicted as descending from object type 202A. In other words, object types 212A-N may be sub-types of object type 202A. For example, object types 212A and 212N correspond to the hierarchical object types “Teacher” and “Lawyer”, which are both sub-types of “Person”. Thus, object types 212A-N may inherit property types 204 and relationship types 208 from object type 202A. For example, property types 214 may include property types 204 and one or more additional property types 214 that are relevant to the hierarchical object type “Teacher”. However, property types 214 and property types 216 may differ in at least one property type 214, 216. For example, property types 214 may include the property type “Grades Taught”, whereas property types 216 may include the property type “Bar Admissions”.


Hierarchical object types that share the same hierarchical level may be disjunctive and may involve disjunctive searches. For example, object types 212A and 212N may be disjunctive insofar as a data object 102, 104 associated with object type 212A is excluded from association with object type 212N. Thus, object types 212A and 212N may be searched separately.


In contrast, a search of a particular hierarchical object type may be equivalent to searching each sub-type of the particular hierarchical object type. For example, a single search of object type 202A may be the equivalent of separate searches of object types 212A-N.
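The sub-type expansion described above can be sketched as follows: searching a parent hierarchical object type is treated as disjunctive searches of each of its sub-types. The ontology dictionary and its layout are illustrative assumptions; the type names come from the “Person”/“Teacher”/“Lawyer” example.

```python
from typing import Dict, List

# Illustrative ontology: each hierarchical object type maps to its
# immediate sub-types (compare object types 202A and 212A-N in FIG. 2).
ontology: Dict[str, List[str]] = {
    "Person": ["Teacher", "Lawyer"],
    "Teacher": [],
    "Lawyer": [],
    "Location": [],
}

def expand_scope(object_type: str) -> List[str]:
    """Return object_type together with all of its descendant sub-types,
    i.e. the full set of types a single search of object_type covers."""
    scope = [object_type]
    for sub in ontology.get(object_type, []):
        scope.extend(expand_scope(sub))
    return scope
```

A single search of “Person” thus expands to the equivalent of separate searches of “Person”, “Teacher”, and “Lawyer”, while “Location” (which has no sub-types here) expands only to itself.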


Example User Interface

Searches of highly structured data may be initiated based on input at a user interface. FIG. 3 depicts an example user interface. Referring to FIG. 3, user interface 300 is provided to a user of a client device.


In the example of FIG. 3, user interface 300 is associated with a native mobile application on a smartphone. In FIG. 3, user interface 300 includes a single search field for accepting one or more keywords as input. However, clicking “Advanced” (e.g., via a touch screen) may cause a different user interface to be provided to the user.


Example Plurality of Search Templates

The different user interface may include a plurality of search templates. FIG. 4 depicts a plurality of example search templates. Referring to FIG. 4, search templates 400A-N are provided to the user.


Search templates 400A-N translate user intent into search queries. Each search template of search templates 400A-N specifies a scope of a search of data represented by an object-centric data model 100. As shall be described in greater detail below, the scope of the search may be defined by a plurality of search fields specified by a selected search template.


Each search template of search templates 400A-N specifies one or more hierarchical object types within the scope of the search. The one or more hierarchical object types may include disjunctive object types. Additionally or alternatively, the one or more hierarchical object types may include a hierarchical object type as well as a sub-type of the hierarchical object type.


For example, in FIG. 4, search template 400A corresponds to the hierarchical object type “Person”. As mentioned above, a search of a particular object type may be equivalent to searching each sub-type of the particular object type. Thus, any sub-type of a hierarchical object type that is within the scope of the search is also within the scope of the search. For example, the hierarchical object types “Teacher” and “Lawyer” are also within the scope of the search specified by search template 400A. In other words, a “Person” search may be implemented as separate searches of “Teacher” and “Lawyer” that are performed disjunctively.


In another example, a user may modify the plurality of search templates 400A-N of FIG. 4 to include a customized search template that specifies the hierarchical object types “Person” and “Location”. The customized search template may be generated by editing an existing search template or creating a new search template. Thus, the customized search template may specify disjunctive searches of “Person” and “Location”.


Approaches for Obtaining Input to Search Fields

Selecting a particular search template may cause generating a user interface 300 that includes a plurality of search fields. For example, FIG. 5A depicts an example plurality of search fields 500A-N provided to the user in response to selecting a particular search template. In an embodiment, a search field may accept a keyword and/or a property value 122 as input.


Each search template of search templates 400A-N may specify search fields 500A-N that correspond to relevant property types associated with one or more data objects 102, 104 that are within the scope of the search defined by the search template. In the example of FIG. 5A, the particular search template may be a customized search template specifying disjunctive searches for a “Person” and a “Document”. Thus, search fields 500A-N may correspond to relevant property types associated with the hierarchical object types “Person” and “Document”.


Each search field of the plurality of search fields 500A-N may be associated with a displayed type 120. For example, search field 500A follows the displayed type 120 “First”. Furthermore, each search field of the plurality of search fields 500A-N accepts input specifying a property value 122. For example, search field 500A accepts input specifying the property value 122 “John”. Note that input need not be provided to all of the search fields 500A-N. Furthermore, there may be hidden search fields associated with fixed property values, such as “USA” in a hidden “Country” search field.
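One way to combine visible search-field input with hidden search fields carrying fixed property values, per the hidden “Country” example above, is sketched below. The merge rule (blank visible fields are ignored; hidden fixed values always apply) is an assumption for illustration.

```python
from typing import Dict

def build_criteria(user_input: Dict[str, str],
                   hidden_fields: Dict[str, str]) -> Dict[str, str]:
    """Merge user-supplied search-field input with hidden fixed values.

    Search fields left blank by the user are dropped; hidden fields
    contribute their fixed property values unconditionally."""
    criteria = {field: value for field, value in user_input.items() if value}
    criteria.update(hidden_fields)  # fixed values always apply
    return criteria
```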


Referring to FIG. 5A, there is a “Properties” section that includes search field 500A (hereinafter “property filter”). A property filter may specify a conjunctive search for each property type 118 that is provided with a property value 122. For example, a conjunctive search may be performed for the first name “John” and the last name “Smith”.



FIG. 5A also depicts a “Date Range” section that includes search field 500N (hereinafter “intrinsic date search”). An intrinsic date search is typically used with the hierarchical object types “Document” and “Event”. An “intrinsic date” may be a date of creation, a date of occurrence, a date of publication, or any other property that can be represented as a date.


A particular search field may obtain input in any of a number of different ways. Each search field of the plurality of search fields 500A-N is associated with an input type (e.g., SIMPLE, DATE, NUMERIC, ENUM, MAP). Thus, search fields 500A-N may include a text entry field, a date picker, a numeric keypad, a selectable list, an interactive map, or any other interface for obtaining input. FIG. 5B depicts a user interface 300 that obtains input in multiple ways.


Referring to FIG. 5B, each search field of search fields 502 corresponds to the same property type “First”. Selecting a “+” button may add an additional search field to search fields 502. Search fields 502 are disjunctive search fields. In other words, each property value 122 provided to search fields 502 may be searched disjunctively. However, search fields 502 may still be searched conjunctively with any other search fields.
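The matching rule described above can be sketched as follows: values supplied to copies of the same search field match disjunctively (any one suffices), while distinct fields match conjunctively (all must be satisfied). The data layout is an illustrative assumption.

```python
from typing import Dict, List

def matches(properties: Dict[str, str],
            criteria: Dict[str, List[str]]) -> bool:
    """Return True if a data object's properties satisfy the criteria.

    criteria maps each property type to a list of acceptable values:
    within one field the values are ORed; across fields, ANDed."""
    return all(
        properties.get(prop) in values        # OR within a field
        for prop, values in criteria.items()  # AND across fields
    )
```

For example, disjunctive “First” fields holding “John” and “Jon” match either spelling, but only in conjunction with the other fields, such as “Last”.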



FIG. 5B also depicts input 504 obtained for the property type “Location”. Input 504 may have been obtained based on selecting the crosshairs button adjacent to the “Location” search field. The crosshairs button may have caused the user to be provided with the interface depicted in FIG. 5C.



FIG. 5C depicts an example map interface including an interactive map 506. The user may add a geolocation pin to the interactive map 506. The geolocation pin may indicate a particular location and/or the center of a geographical region relevant to a search. For example, a geo-fenced search may be performed based on a user-specified geographical region 508. Thus, a particular search template may restrict a search to one or more locations. For example, the particular search template may restrict a scope of a search to one or more geographic regions.
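A geo-fenced search like the one above can be sketched as a radius filter around the geolocation pin. The haversine great-circle formula is standard; the per-object `lat`/`lon` layout and the circular (rather than polygonal) region are illustrative assumptions.

```python
import math
from typing import Dict, List

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def geo_fence(objects: List[Dict], pin_lat: float, pin_lon: float,
              radius_km: float) -> List[Dict]:
    """Keep only the objects whose coordinates fall inside the fence
    centered on the geolocation pin."""
    return [o for o in objects
            if haversine_km(o["lat"], o["lon"], pin_lat, pin_lon) <= radius_km]
```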


In an embodiment (not shown), an input 504 to a particular search field may be obtained from a camera and/or any other hardware of the client computer at which the user interface 300 is displayed. For example, a camera may be used to populate search fields with measurements (e.g., distance between eyes, length of face, width of face) that can be used for face recognition.


Example Customizable Format for Search Results

After a search is performed, search results may be retrieved and provided to the user. The search results may be provided in any number of different formats, which may be customized by the user. FIG. 6 depicts an example customizable format for search results.


Referring to FIG. 6, customizable format 600 includes a list of search results. The list may be arranged in any order. For example, the list may be organized based on hierarchical object type.
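One arrangement mentioned above, organizing a flat list of search results by hierarchical object type, can be sketched as a simple grouping step (the pair-based result layout is an illustrative assumption):

```python
from collections import defaultdict
from typing import Dict, Iterable, List, Tuple

def group_by_type(results: Iterable[Tuple[str, str]]) -> Dict[str, List[str]]:
    """Group (object_type, displayed_data) result pairs by object type,
    preserving the order in which results were retrieved."""
    grouped: Dict[str, List[str]] = defaultdict(list)
    for object_type, displayed_data in results:
        grouped[object_type].append(displayed_data)
    return dict(grouped)
```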


Search results may be presented in any of a number of different ways. Each search result may include some or all of the displayed data 112 associated with the search result. Additionally or alternatively, one or more search results may be plotted on a map. Additionally or alternatively, the search results may be represented graphically using any of the techniques described in U.S. patent application Ser. No. 13/608,864, filed Sep. 10, 2012, the entirety of which is incorporated herein by reference. Additionally or alternatively, the search results may be provided as selectable histograms using any of the techniques described in U.S. patent application Ser. No. 14/676,621, filed Apr. 1, 2015, the entirety of which is incorporated herein by reference.


In addition to the search results, the user may be provided with past search results. For example, the results of recent searches may be provided to the user as a list. Current and/or past search results may be used to modify search templates 400A-N.


Example Detailed Views of a Search Result

Selecting a particular search result may cause the user to be provided with one or more detailed views of the particular search result. FIGS. 7-9 depict example detailed views of a search result.



FIG. 7 depicts an example “Profile” view. This view may include a summary of properties associated with a particular data object. In the example of FIG. 7, this view includes displayed data 112 associated with the particular data object as well as displayed type 120 and property value 122 for each property of the particular data object.


The user can do any of a number of things with a particular data object obtained as a search result. The user can edit the particular data object. For example, the user can modify a property and store the modified data object in the data store from which the unmodified data object was retrieved. Additionally or alternatively, the user can share the particular data object with another user. For example, the user may send a link to the particular data object to another user. Additionally or alternatively, the user can specify additional searches to be performed based on the particular data object using any of the techniques described in U.S. patent application Ser. No. 13/608,864. Additionally or alternatively, a search result may be geotagged (e.g., associated with a particular location).



FIG. 8 depicts an example “Related” view. This view may include any data objects 102, 104 that are related to the particular data object obtained as a search result. In the example of FIG. 8, this view includes displayed type 126 of each relationship between the particular data object and a related data object as well as the displayed data 112 associated with the related data object.



FIG. 9 depicts an example “Media” view. This view may include links to any media data associated with the particular data object obtained as a search result. Selecting a media link may cause media data to be rendered in a suitable manner. For example, selecting a video file may cause the video file to be presented in a video player.


Process Overview


FIG. 10 is a flow diagram that depicts an approach for searching highly structured data. At block 1000, an indication of a search template is received. The indication may include user input (e.g., input indicating a user's selection of a search template in a user interface) and/or a network message (e.g., an HTTP message that indicates a user's selection of a search template). The indication may specify that the search template is to be used for a search of data represented by an object-centric data model. The search template may specify the scope of the search, which may include one or more hierarchical object types that are defined in the object-centric data model. The search template may also specify a plurality of search fields.


For example, a user may select a “Person” search template. The “Person” search template may specify that a search is to be performed for data belonging to the “Person” object type. A “Person” object type is associated with certain property types. Thus, selecting the “Person” search template specifies these property types.


At block 1002, a user interface is generated based on the search template. The user interface may include the plurality of search fields. For example, the property types specified by the “Person” search template may translate into search fields corresponding to a person's name, age, address, phone number, etc.


At optional block 1004, the user interface is provided to a user. For example, search fields may be provided to the user so that the user can input property values to be matched when the search is performed.


At optional block 1006, input is received from the user through the user interface. The input may be in any of a number of formats. For example, the input may be a text entry, an interaction with a map, a selection from an enumerated list, etc.


At optional block 1008, the search is performed based on the input. For example, text input may be enclosed with wildcard operators, such as “*”, and matched against stored property values.
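The wildcard step at block 1008 can be sketched as follows: text input is enclosed with “*” operators and matched against stored property values. Translating “*” into a regular expression, and case-insensitive substring semantics, are illustrative assumptions.

```python
import re

def wildcard_match(text_input: str, property_value: str) -> bool:
    """Enclose text_input with "*" wildcards and test it against a
    stored property value, treating "*" as "match any characters"."""
    wildcarded = f"*{text_input}*"
    # Escape literal segments; each "*" becomes the regex ".*".
    pattern = ".*".join(re.escape(part) for part in wildcarded.split("*"))
    return re.fullmatch(pattern, property_value, re.IGNORECASE) is not None
```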


At optional block 1010, one or more first data objects are retrieved as a result of performing the search. The one or more first data objects are data objects that fell within the scope of the search. Any of a number of subsequent actions may be performed based on the one or more first data objects, including searching for one or more second data objects that did not fall within the scope of the search.


Example Computer System


FIG. 11 depicts an example computer system in which embodiments may be implemented. Referring to FIG. 11, client computer 1100 includes client application 1102. Client computer 1100 is communicatively coupled to server computer 1104, which is communicatively coupled to body of data 1106.


A “computer” may be one or more physical computers, virtual computers, and/or computing devices. As an example, a computer may be one or more server computers, cloud-based computers, cloud-based clusters of computers, virtual machine instances or virtual machine computing elements such as virtual processors, storage and memory, data centers, storage devices, desktop computers, laptop computers, mobile devices, and/or any other special-purpose computing devices. A computer may be a client and/or a server. Any reference to “a computer” herein may mean one or more computers, unless expressly stated otherwise.


Client application 1102 may be a sequence of instructions executing at client computer 1100. At a minimum, client application 1102 may provide, to a user, any of the interfaces described in FIGS. 2-9. For example, client application 1102 may be a program executing in a web browser or a native mobile app. Client application 1102 may be installed on client computer 1100 to provide any of a number of benefits. The benefits may include faster execution, lower network latency and bandwidth consumption, and/or better access to the hardware of client computer 1100. Furthermore, a client application 1102 installed on client computer 1100 enables the client application 1102 to operate in an offline mode. For example, search results may be saved locally on client computer 1100 so that a user can interact with them without an Internet connection.


Server computer 1104 may include one or more computers, such as a web server, a mobile server, a gateway server, and/or a load-balancing server. A gateway server may regulate access to other servers, including structured data servers and unstructured data servers. Structured data servers may be computers that facilitate searches of structured data, such as data stored in an object-centric data model. Unstructured data servers may be computers that facilitate searches of unstructured data, such as by implementing text searches of documents. Server computer 1104 may send data to client computer 1100 using JavaScript Object Notation (JSON), Extensible Markup Language (XML), and/or any other data interchange format.
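As a concrete illustration of the JSON interchange mentioned above, a server might serialize a search result for the client as shown below. The key names are invented for illustration, though the object type URI follows the “com.palantir.object.person” example given earlier.

```python
import json

# Hypothetical search-result payload sent from server computer 1104
# to client computer 1100.
result = {
    "objectId": "obj-1",
    "objectType": "com.palantir.object.person",
    "displayedData": "Barack Obama",
    "properties": [{"displayedType": "Name", "value": "Barack Obama"}],
}

payload = json.dumps(result)            # server -> client, as JSON text
assert json.loads(payload) == result    # client recovers the same structure
```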


Body of data 1106 may represent the data being searched. For example, body of data 1106 may be stored in a database, a configuration file, and/or any other system and/or data structure that stores data. Additionally or alternatively, body of data 1106 may be stored in memory on server computer 1104. Additionally or alternatively, body of data 1106 may be stored in non-volatile storage. For example, body of data 1106 may be stored in a mobile database communicatively coupled to a mobile server and/or a repository communicatively coupled to a gateway server.


A user's access to body of data 1106 may be limited based on access controls. Thus, a search of body of data 1106 may be restricted to data to which the user has access.
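One minimal way such an access-control restriction might be applied is to check each object's access list before evaluating the search predicate. The group names and object layout below are assumptions for illustration, not details of the Application:

```python
# Sketch of access-controlled search, assuming each object carries
# an "acl" listing the groups permitted to read it (an illustrative
# assumption, not a detail of the Application).

def search(body_of_data, predicate, user_groups):
    """Return objects matching the predicate, restricted to objects
    whose ACL intersects the searching user's groups."""
    allowed = set(user_groups)
    return [obj for obj in body_of_data
            if allowed & set(obj["acl"])   # access check comes first
            and predicate(obj)]


body_of_data = [
    {"id": 1, "name": "Alpha", "acl": ["analysts"]},
    {"id": 2, "name": "Beta", "acl": ["admins"]},
]

# An analyst's search sees only object 1, even with a match-all predicate.
visible = search(body_of_data, lambda obj: True, ["analysts"])
```

Performing the access check inside the search itself ensures that restricted objects never reach the result set.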


Basic Computing Device

Referring now to FIG. 12, it is a block diagram that illustrates a basic computing device 1200 in which the example embodiment(s) of the present Application may be embodied. Computing device 1200 and its components, including their connections, relationships, and functions, are meant to be exemplary only and are not meant to limit implementations of the example embodiment(s). Other computing devices suitable for implementing the example embodiment(s) may have different components, including components with different connections, relationships, and functions.


Computing device 1200 may include a bus 1202 or other communication mechanism for addressing main memory 1206 and for transferring data between and among the various components of device 1200.


Computing device 1200 may also include one or more hardware processors 1204 coupled with bus 1202 for processing information. A hardware processor 1204 may be a general purpose microprocessor, a system on a chip (SoC), or other processor.


Main memory 1206, such as a random access memory (RAM) or other dynamic storage device, also may be coupled to bus 1202 for storing information and software instructions to be executed by processor(s) 1204. Main memory 1206 also may be used for storing temporary variables or other intermediate information during execution of software instructions to be executed by processor(s) 1204.


Software instructions, when stored in storage media accessible to processor(s) 1204, render computing device 1200 into a special-purpose computing device that is customized to perform the operations specified in the software instructions. The terms “software”, “software instructions”, “computer program”, “computer-executable instructions”, and “processor-executable instructions” are to be broadly construed to cover any machine-readable information, whether or not human-readable, for instructing a computing device to perform specific operations, and including, but not limited to, application software, desktop applications, scripts, binaries, operating systems, device drivers, boot loaders, shells, utilities, system software, JAVASCRIPT, web pages, web applications, plugins, embedded software, microcode, compilers, debuggers, interpreters, virtual machines, linkers, and text editors.


Computing device 1200 also may include read only memory (ROM) 1208 or other static storage device coupled to bus 1202 for storing static information and software instructions for processor(s) 1204.


One or more mass storage devices 1210 may be coupled to bus 1202 for persistently storing information and software instructions on fixed or removable media, such as magnetic, optical, solid-state, magnetic-optical, flash memory, or any other available mass storage technology. The mass storage may be shared on a network, or it may be dedicated mass storage. Typically, at least one of the mass storage devices 1210 (e.g., the main hard disk for the device) stores a body of program and data for directing operation of the computing device, including an operating system, user application programs, driver and other support files, as well as other data files of all sorts.


Computing device 1200 may be coupled via bus 1202 to display 1212, such as a liquid crystal display (LCD) or other electronic visual display, for displaying information to a computer user. In some configurations, a touch sensitive surface incorporating touch detection technology (e.g., resistive, capacitive, etc.) may be overlaid on display 1212 to form a touch sensitive display for communicating touch gesture (e.g., finger or stylus) input to processor(s) 1204.


An input device 1214, including alphanumeric and other keys, may be coupled to bus 1202 for communicating information and command selections to processor 1204. In addition to or instead of alphanumeric and other keys, input device 1214 may include one or more physical buttons or switches such as, for example, a power (on/off) button, a “home” button, volume control buttons, or the like.


Another type of user input device may be a cursor control 1216, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 1204 and for controlling cursor movement on display 1212. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane.


While in some configurations, such as the configuration depicted in FIG. 12, one or more of display 1212, input device 1214, and cursor control 1216 are external components (i.e., peripheral devices) of computing device 1200, some or all of display 1212, input device 1214, and cursor control 1216 are integrated as part of the form factor of computing device 1200 in other configurations.


Functions of the disclosed systems, methods, and modules may be performed by computing device 1200 in response to processor(s) 1204 executing one or more programs of software instructions contained in main memory 1206. Such software instructions may be read into main memory 1206 from another storage medium, such as storage device(s) 1210. Execution of the software instructions contained in main memory 1206 causes processor(s) 1204 to perform the functions of the example embodiment(s).


While functions and operations of the example embodiment(s) may be implemented entirely with software instructions, hard-wired or programmable circuitry of computing device 1200 (e.g., an ASIC, a FPGA, or the like) may be used in other embodiments in place of or in combination with software instructions to perform the functions, according to the requirements of the particular implementation at hand.


The term “storage media” as used herein refers to any non-transitory media that store data and/or software instructions that cause a computing device to operate in a specific fashion. Such storage media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, non-volatile random access memory (NVRAM), flash memory, optical disks, magnetic disks, or solid-state drives, such as storage device 1210. Volatile media includes dynamic memory, such as main memory 1206. Common forms of storage media include, for example, a floppy disk, a flexible disk, a hard disk, a solid-state drive, magnetic tape or any other magnetic data storage medium, a CD-ROM or any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, flash memory, or any other memory chip or cartridge.


Storage media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between storage media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 1202. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.


Various forms of media may be involved in carrying one or more sequences of one or more software instructions to processor(s) 1204 for execution. For example, the software instructions may initially be carried on a magnetic disk or solid-state drive of a remote computer. The remote computer can load the software instructions into its dynamic memory and send the software instructions over a telephone line using a modem. A modem local to computing device 1200 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal. An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on bus 1202. Bus 1202 carries the data to main memory 1206, from which processor(s) 1204 retrieves and executes the software instructions. The software instructions received by main memory 1206 may optionally be stored on storage device(s) 1210 either before or after execution by processor(s) 1204.


Computing device 1200 also may include one or more communication interface(s) 1218 coupled to bus 1202. A communication interface 1218 provides a two-way data communication coupling to a wired or wireless network link 1220 that is connected to a local network 1222 (e.g., Ethernet network, Wireless Local Area Network, cellular phone network, Bluetooth wireless network, or the like). Communication interface 1218 sends and receives electrical, electromagnetic, or optical signals that carry digital data streams representing various types of information. For example, communication interface 1218 may be a wired network interface card, a wireless network interface card with an integrated radio antenna, or a modem (e.g., ISDN, DSL, or cable modem).


Network link(s) 1220 typically provide data communication through one or more networks to other data devices. For example, a network link 1220 may provide a connection through a local network 1222 to a host computer 1224 or to data equipment operated by an Internet Service Provider (ISP) 1226. ISP 1226 in turn provides data communication services through the world wide packet data communication network now commonly referred to as the “Internet” 1228. Local network(s) 1222 and Internet 1228 use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link(s) 1220 and through communication interface(s) 1218, which carry the digital data to and from computing device 1200, are example forms of transmission media.


Computing device 1200 can send messages and receive data, including program code, through the network(s), network link(s) 1220 and communication interface(s) 1218. In the Internet example, a server 1230 might transmit a requested code for an application program through Internet 1228, ISP 1226, local network(s) 1222 and communication interface(s) 1218.


The received code may be executed by processor 1204 as it is received, and/or stored in storage device 1210, or other non-volatile storage for later execution.


Basic Software System


FIG. 13 is a block diagram of a basic software system 1300 that may be employed for controlling the operation of computing device 1200. Software system 1300 and its components, including their connections, relationships, and functions, are meant to be exemplary only and are not meant to limit implementations of the example embodiment(s). Other software systems suitable for implementing the example embodiment(s) may have different components, including components with different connections, relationships, and functions.


Software system 1300 is provided for directing the operation of computing device 1200. Software system 1300, which may be stored in system memory (RAM) 1206 and on fixed storage (e.g., hard disk or flash memory) 1210, includes a kernel or operating system (OS) 1310.


The OS 1310 manages low-level aspects of computer operation, including managing execution of processes, memory allocation, file input and output (I/O), and device I/O. One or more application programs, represented as 1302A, 1302B, 1302C . . . 1302N, may be “loaded” (e.g., transferred from fixed storage 1210 into memory 1206) for execution by the system 1300. The applications or other software intended for use on device 1200 may also be stored as a set of downloadable computer-executable instructions, for example, for downloading and installation from an Internet location (e.g., a Web server, an app store, or other online service).


Software system 1300 includes a graphical user interface (GUI) 1315, for receiving user commands and data in a graphical (e.g., “point-and-click” or “touch gesture”) fashion. These inputs, in turn, may be acted upon by the system 1300 in accordance with instructions from operating system 1310 and/or application(s) 1302. The GUI 1315 also serves to display the results of operation from the OS 1310 and application(s) 1302, whereupon the user may supply additional inputs or terminate the session (e.g., log off).


OS 1310 can execute directly on the bare hardware 1320 (e.g., processor(s) 1204) of device 1200. Alternatively, a hypervisor or virtual machine monitor (VMM) 1330 may be interposed between the bare hardware 1320 and the OS 1310. In this configuration, VMM 1330 acts as a software “cushion” or virtualization layer between the OS 1310 and the bare hardware 1320 of the device 1200.


VMM 1330 instantiates and runs one or more virtual machine instances (“guest machines”). Each guest machine comprises a “guest” operating system, such as OS 1310, and one or more applications, such as application(s) 1302, designed to execute on the guest operating system. The VMM 1330 presents the guest operating systems with a virtual operating platform and manages the execution of the guest operating systems.


In some instances, the VMM 1330 may allow a guest operating system to run as if it is running on the bare hardware 1320 of device 1200 directly. In these instances, the same version of the guest operating system configured to execute on the bare hardware 1320 directly may also execute on VMM 1330 without modification or reconfiguration. In other words, VMM 1330 may provide full hardware and CPU virtualization to a guest operating system in some instances.


In other instances, a guest operating system may be specially designed or configured to execute on VMM 1330 for efficiency. In these instances, the guest operating system is “aware” that it executes on a virtual machine monitor. In other words, VMM 1330 may provide para-virtualization to a guest operating system in some instances.


The above-described basic computer hardware and software is presented for purpose of illustrating the basic underlying computer components that may be employed for implementing the example embodiment(s). The example embodiment(s), however, are not necessarily limited to any particular computing environment or computing device configuration. Instead, the example embodiment(s) may be implemented in any type of system architecture or processing environment that one skilled in the art, in light of this disclosure, would understand as capable of supporting the features and functions of the example embodiment(s) presented herein.


EXTENSIONS AND ALTERNATIVES

In the foregoing specification, the example embodiment(s) of the present Application have been described with reference to numerous specific details. However, the details may vary from implementation to implementation according to the requirements of the particular implementation at hand. The example embodiment(s) are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

Claims
  • 1. A method, comprising: at one or more computing devices comprising one or more processors and storage media storing one or more computer programs executed by the one or more processors to perform the method, performing the operations of:
    receiving an indication that a particular search template is to be used for a search of a body of data represented by an object-centric data model, wherein the particular search template translates user intent into search queries and the object-centric data model defines data objects and properties of the data objects comprising hierarchical object types and property types based on an ontology independent of any data storage model;
    wherein the particular search template specifies one or more hierarchical object types defined in the object-centric data model that are within a scope of the search, wherein the hierarchical object types share disjunctive searches on a same hierarchical level;
    wherein the particular search template specifies at least one search field related to at least one corresponding hierarchical object type of the one or more hierarchical object types, wherein the search field imposes a property restriction on the search, the property restriction being dependent on the one or more hierarchical object types that are specified;
    based on the particular search template, generating a user interface that includes the at least one search field, wherein the search is scoped based on the at least one corresponding hierarchical object type for the at least one search field and equivalent to searching each sub-type of the hierarchical object type;
    retrieving one or more first data objects from the body of data based on causing the search to be executed against the body of data, wherein the one or more first data objects are within the scope of the search; and
    providing one or more second data objects that are outside the scope of the search based on one or more relationships, defined in the object-centric data model, between the one or more first data objects and the one or more second data objects.
  • 2. The method of claim 1, wherein each search field of the at least one search field corresponds to a respective property type, defined in the object-centric data model, that is associated with the one or more hierarchical object types.
  • 3. The method of claim 1, wherein the at least one search field includes one or more search fields that accept input from an interactive map.
  • 4. The method of claim 1, wherein the particular search template restricts the search to a user-specified geographical region.
  • 5. The method of claim 1, wherein a particular property type associated with the one or more hierarchical object types corresponds to two or more disjunctive search fields in the at least one search field.
  • 6. The method of claim 1, wherein the one or more hierarchical object types include two or more disjunctive object types.
  • 7. The method of claim 1, wherein the particular search template is one of a plurality of search templates.
  • 8. The method of claim 1, wherein the particular search template specifies the scope of the search.
  • 9. The method of claim 1, further comprising: subsequent to generating the user interface, providing the user interface to a user.
  • 10. A system, comprising: storage media; one or more processors; and one or more programs stored in the storage media and configured for execution by the one or more processors, the one or more programs comprising instructions for:
    receiving an indication that a particular search template is to be used for a search of a body of data represented by an object-centric data model, wherein the particular search template translates user intent into search queries and the object-centric data model defines data objects and properties of the data objects comprising hierarchical object types and property types based on an ontology independent of any data storage model;
    wherein the particular search template specifies one or more hierarchical object types defined in the object-centric data model that are within a scope of the search, wherein the hierarchical object types share disjunctive searches on a same hierarchical level;
    wherein the particular search template specifies at least one search field related to at least one corresponding hierarchical object type of the one or more hierarchical object types, wherein the search field imposes a property restriction on the search, the property restriction being dependent on the one or more hierarchical object types that are specified;
    based on the particular search template, generating a user interface that includes the at least one search field, wherein the search is scoped based on the at least one corresponding hierarchical object type for the at least one search field and equivalent to searching each sub-type of the hierarchical object type;
    retrieving one or more first data objects from the body of data based on causing the search to be executed against the body of data, wherein the one or more first data objects are within the scope of the search; and
    providing one or more second data objects that are outside the scope of the search based on one or more relationships, defined in the object-centric data model, between the one or more first data objects and the one or more second data objects.
  • 11. The system of claim 10, wherein each search field of the at least one search field corresponds to a respective property type, defined in the object-centric data model, that is associated with the one or more hierarchical object types.
  • 12. The system of claim 10, wherein the at least one search field includes one or more search fields that accept input from an interactive map.
  • 13. The system of claim 10, wherein the particular search template restricts the search to a user-specified geographical region.
  • 14. The system of claim 10, wherein a particular property type associated with the one or more hierarchical object types corresponds to two or more disjunctive search fields in the at least one search field.
  • 15. The system of claim 10, wherein the one or more hierarchical object types include two or more disjunctive object sub-types.
  • 16. The system of claim 10, wherein the particular search template is one of a plurality of search templates.
  • 17. The system of claim 10, wherein the particular search template specifies the scope of the search.
  • 18. The system of claim 10, wherein the one or more programs further comprise instructions for: subsequent to generating the user interface, providing the user interface to a user.
US Referenced Citations (599)
Number Name Date Kind
5021792 Hwang Jun 1991 A
5109399 Thompson Apr 1992 A
5329108 Lamoure Jul 1994 A
5555503 Kyrtsos et al. Sep 1996 A
5632009 Rao et al. May 1997 A
5670987 Doi et al. Sep 1997 A
5781704 Rossmo Jul 1998 A
5798769 Chiu et al. Aug 1998 A
5845300 Corner Dec 1998 A
6057757 Arrowsmith et al. May 2000 A
6091956 Hollenberg Jul 2000 A
6141659 Barker Oct 2000 A
6161098 Wallman Dec 2000 A
6189003 Leal Feb 2001 B1
6219053 Tachibana et al. Apr 2001 B1
6279018 Kudrolli et al. Apr 2001 B1
6232971 Haynes May 2001 B1
6247019 Davies Jun 2001 B1
6272489 Rauch et al. Aug 2001 B1
6341310 Leshem et al. Jan 2002 B1
6366933 Ball et al. Apr 2002 B1
6369835 Lin Apr 2002 B1
6456997 Shukla Sep 2002 B1
6549944 Weinberg et al. Apr 2003 B1
6560620 Ching May 2003 B1
6581068 Bensoussan et al. Jun 2003 B1
6594672 Lampson et al. Jul 2003 B1
6631496 Li et al. Oct 2003 B1
6642945 Sharpe Nov 2003 B1
6674434 Chojnacki et al. Jan 2004 B1
6714936 Nevin, III Mar 2004 B1
6775675 Nwabueze et al. Aug 2004 B1
6820135 Dingman Nov 2004 B1
6828920 Owen et al. Dec 2004 B2
6839745 Dingari et al. Jan 2005 B1
6877137 Rivette et al. Apr 2005 B1
6976210 Silva et al. Dec 2005 B1
6980984 Huffman et al. Dec 2005 B1
6985950 Hanson et al. Jan 2006 B1
7036085 Barros Apr 2006 B2
7043702 Chi et al. May 2006 B2
7055110 Kupka et al. May 2006 B2
7113964 Bequet et al. Sep 2006 B1
7139800 Bellotti et al. Nov 2006 B2
7158797 Jayaraman et al. Jan 2007 B1
7158878 Rasmussen et al. Jan 2007 B2
7162475 Ackerman Jan 2007 B2
7168039 Bertram Jan 2007 B2
7171427 Witowski et al. Jan 2007 B2
7188100 De Bellis et al. Mar 2007 B2
7269786 Malloy et al. Sep 2007 B1
7278105 Kitts Oct 2007 B1
7290698 Poslinski et al. Nov 2007 B2
7333998 Heckerman et al. Feb 2008 B2
7370047 Gorman May 2008 B2
7379811 Rasmussen et al. May 2008 B2
7379903 Caballero et al. May 2008 B2
7383053 Kent et al. Jun 2008 B2
7426654 Adams et al. Sep 2008 B2
7454466 Bellotti et al. Nov 2008 B2
7467375 Tondreau et al. Dec 2008 B2
7487139 Fraleigh et al. Feb 2009 B2
7502786 Liu et al. Mar 2009 B2
7523100 Bionda et al. Apr 2009 B1
7525422 Bishop et al. Apr 2009 B2
7529727 Arning et al. May 2009 B2
7529734 Dirisala May 2009 B2
7533008 Mangino et al. May 2009 B2
7558677 Jones Jun 2009 B2
7574409 Patinkin Aug 2009 B2
7574428 Leiserowitz et al. Aug 2009 B2
7579965 Bucholz Aug 2009 B2
7596285 Brown et al. Sep 2009 B2
7614006 Molander Nov 2009 B2
7617232 Gabbert et al. Nov 2009 B2
7620628 Kapur et al. Nov 2009 B2
7627812 Chamberlain et al. Dec 2009 B2
7634717 Chamberlain et al. Dec 2009 B2
7652622 Hansen et al. Jan 2010 B2
7703021 Flam Apr 2010 B1
7706817 Bamrah et al. Apr 2010 B2
7712049 Williams et al. May 2010 B2
7716077 Mikurak May 2010 B1
7725530 Sah et al. May 2010 B2
7725547 Albertson et al. May 2010 B2
7730082 Sah et al. Jun 2010 B2
7730109 Rohrs et al. Jun 2010 B2
7747648 Kraft et al. Jun 2010 B1
7760969 Silverbrook et al. Jul 2010 B2
7770100 Chamberlain et al. Aug 2010 B2
7805457 Viola et al. Sep 2010 B1
7809703 Balabhadrapatruni et al. Oct 2010 B2
7818291 Ferguson et al. Oct 2010 B2
7818658 Chen Oct 2010 B2
7870493 Pall et al. Jan 2011 B2
7894984 Rasmussen et al. Feb 2011 B2
7899611 Downs et al. Mar 2011 B2
7917376 Bellin et al. Mar 2011 B2
7920963 Jouline et al. Apr 2011 B2
7933862 Chamberlain et al. Apr 2011 B2
7945470 Cohen et al. May 2011 B1
7962281 Rasmussen et al. Jun 2011 B2
7962495 Jain et al. Jun 2011 B2
7962848 Bertram Jun 2011 B2
7970240 Chao et al. Jun 2011 B1
7971150 Raskutti et al. Jun 2011 B2
7984374 Caro et al. Jun 2011 B2
7971784 Lapstun Jul 2011 B2
8001465 Kudrolli et al. Aug 2011 B2
8001482 Bhattiprolu et al. Aug 2011 B2
8010545 Stefik et al. Aug 2011 B2
8015487 Roy et al. Sep 2011 B2
8024778 Cash et al. Sep 2011 B2
8028894 Lapstun et al. Oct 2011 B2
8036632 Cona et al. Oct 2011 B1
8042110 Kawahara et al. Oct 2011 B1
8103543 Zwicky Jan 2012 B1
8134457 Velipasalar et al. Mar 2012 B2
8145703 Frishert et al. Mar 2012 B2
8185819 Sah et al. May 2012 B2
8214361 Sandler et al. Jul 2012 B1
8214764 Gemmell et al. Jul 2012 B2
8225201 Michael Jul 2012 B2
8229947 Fujinaga Jul 2012 B2
8230333 Decherd et al. Jul 2012 B2
8271461 Pike et al. Sep 2012 B2
8280880 Aymeloglu et al. Oct 2012 B1
8285725 Bayliss Oct 2012 B2
8290926 Ozzie et al. Oct 2012 B2
8290942 Jones et al. Oct 2012 B2
8301464 Cave et al. Oct 2012 B1
8301904 Gryaznov Oct 2012 B1
8312367 Foster Nov 2012 B2
8312546 Alme Nov 2012 B2
8352881 Champion et al. Jan 2013 B2
8368695 Howell et al. Feb 2013 B2
8397171 Klassen et al. Mar 2013 B2
8402047 Mangini et al. Mar 2013 B1
8412707 Mianji Apr 2013 B1
8447722 Ahuja et al. May 2013 B1
8452790 Mianji May 2013 B1
8463036 Ramesh et al. Jun 2013 B1
8477994 Noshadi Jul 2013 B1
8489331 Kopf et al. Jul 2013 B2
8489641 Seefeld et al. Jul 2013 B1
8498984 Hwang et al. Jul 2013 B1
8510743 Hackborn et al. Aug 2013 B2
8514082 Cova et al. Aug 2013 B2
8515207 Chau Aug 2013 B2
8521135 Cryderman Aug 2013 B2
8554579 Tribble et al. Oct 2013 B2
8554653 Falkenborg et al. Oct 2013 B2
8554709 Goodson et al. Oct 2013 B2
8577911 Stepinski et al. Nov 2013 B1
8589273 Creeden et al. Nov 2013 B2
8595234 Siripuapu et al. Nov 2013 B2
8620641 Farnsworth et al. Dec 2013 B2
8639757 Zang et al. Jan 2014 B1
8646080 Williamson et al. Feb 2014 B2
8676857 Adams et al. Mar 2014 B1
8688069 Cazanas et al. Apr 2014 B1
8689108 Duffield et al. Apr 2014 B1
8713467 Goldenberg et al. Apr 2014 B1
8726379 Stiansen et al. May 2014 B1
8739059 Rabenold et al. May 2014 B2
8739278 Varghese May 2014 B2
8742934 Sarpy et al. Jun 2014 B1
8744890 Bernier Jun 2014 B1
8745516 Mason et al. Jun 2014 B2
8762870 Robotham et al. Jun 2014 B2
8781169 Jackson et al. Jul 2014 B2
8787939 Papakipos et al. Jul 2014 B2
8788407 Singh et al. Jul 2014 B1
8799799 Cervelli et al. Aug 2014 B1
8812960 Sun et al. Aug 2014 B1
8830322 Nerayoff et al. Sep 2014 B2
8832594 Thompson et al. Sep 2014 B1
8849254 Bolon Sep 2014 B2
8868537 Colgrove et al. Oct 2014 B1
8917274 Ma et al. Dec 2014 B2
8924872 Bogomolov et al. Dec 2014 B1
8937619 Sharma et al. Jan 2015 B2
8938686 Erenrich et al. Jan 2015 B1
9009171 Grossman et al. Apr 2015 B1
9009827 Albertson et al. Apr 2015 B1
9021260 Falk et al. Apr 2015 B1
9021384 Beard et al. Apr 2015 B1
9037407 Thompson May 2015 B2
9043696 Meiklejohn et al. May 2015 B1
9043894 Dennison et al. May 2015 B1
9116975 Shankar et al. Aug 2015 B2
9123086 Freeland et al. Sep 2015 B1
9262529 Colgrove et al. Feb 2016 B2
9275069 Garrod et al. Mar 2016 B1
9301103 Thompson Mar 2016 B1
9313233 Sprague et al. Apr 2016 B2
9380431 Freeland et al. Jun 2016 B1
9674662 Freeland et al. Jun 2017 B2
9727376 Bills et al. Aug 2017 B1
20020033848 Sciammarella et al. Mar 2002 A1
20020065708 Senay et al. May 2002 A1
20020091707 Keller Jul 2002 A1
20020095658 Shulman Jul 2002 A1
20020116120 Ruiz et al. Aug 2002 A1
20020174201 Ramer et al. Nov 2002 A1
20020194119 Wright et al. Dec 2002 A1
20030028560 Kudrolli et al. Feb 2003 A1
20030039948 Donahue Feb 2003 A1
20030061211 Shultz et al. Mar 2003 A1
20030140106 Raguseo Jul 2003 A1
20030144868 MacLntyre et al. Jul 2003 A1
20030152277 Hall et al. Aug 2003 A1
20030163352 Surpin et al. Aug 2003 A1
20030225755 Lwayama et al. Dec 2003 A1
20030227746 Sato Dec 2003 A1
20030229848 Arend et al. Dec 2003 A1
20040032432 Baynger Feb 2004 A1
20040064256 Barinek et al. Apr 2004 A1
20040085318 Hassler et al. May 2004 A1
20040095349 Bito et al. May 2004 A1
20040111410 Burgoon et al. Jun 2004 A1
20040126840 Cheng et al. Jul 2004 A1
20040143602 Ruiz et al. Jul 2004 A1
20040143796 Lerner et al. Jul 2004 A1
20040163039 McPherson et al. Aug 2004 A1
20040193600 Kaasten et al. Sep 2004 A1
20040203380 Hamdi et al. Oct 2004 A1
20040221223 Yu et al. Nov 2004 A1
20040260702 Cragun et al. Dec 2004 A1
20040267746 Marcjan et al. Dec 2004 A1
20050027705 Sadri et al. Feb 2005 A1
20050028094 Allyn Feb 2005 A1
20050039119 Parks et al. Feb 2005 A1
20050065811 Chu et al. Mar 2005 A1
20050080769 Gemmell Apr 2005 A1
20050086207 Heuer et al. Apr 2005 A1
20050125436 Mudunuri et al. Jun 2005 A1
20050125715 Franco et al. Jun 2005 A1
20050143096 Boesch Jun 2005 A1
20050162523 Darrell et al. Jul 2005 A1
20050166144 Gross Jul 2005 A1
20050180330 Shapiro Aug 2005 A1
20050182793 Keenan et al. Aug 2005 A1
20050183005 Denoue et al. Aug 2005 A1
20050210409 Jou Sep 2005 A1
20050246327 Yeung et al. Nov 2005 A1
20050251786 Citron et al. Nov 2005 A1
20060026120 Carolan et al. Feb 2006 A1
20060026170 Kreitler et al. Feb 2006 A1
20060053096 Subramanian Mar 2006 A1
20060059139 Robinson Mar 2006 A1
20060074881 Vembu et al. Apr 2006 A1
20060080619 Carlson et al. Apr 2006 A1
20060093222 Saffer et al. May 2006 A1
20060116991 Calderwood Jun 2006 A1
20060129746 Porter Jun 2006 A1
20060139375 Rasmussen et al. Jun 2006 A1
20060142949 Helt Jun 2006 A1
20060143034 Rothermel Jun 2006 A1
20060149596 Surpin et al. Jul 2006 A1
20060161558 Tamma et al. Jul 2006 A1
20060161568 Dettinger et al. Jul 2006 A1
20060203337 White Sep 2006 A1
20060206235 Shakes et al. Sep 2006 A1
20060218637 Thomas et al. Sep 2006 A1
20060241974 Chao et al. Oct 2006 A1
20060242040 Rader et al. Oct 2006 A1
20060242630 Koike et al. Oct 2006 A1
20060250764 Howarth et al. Nov 2006 A1
20060271277 Hu et al. Nov 2006 A1
20060279630 Aggarwal et al. Dec 2006 A1
20070011150 Frank Jan 2007 A1
20070016363 Huang et al. Jan 2007 A1
20070038646 Thota Feb 2007 A1
20070038962 Fuchs et al. Feb 2007 A1
20070043744 Carro Feb 2007 A1
20070057966 Ohno et al. Mar 2007 A1
20070072591 McGary et al. Mar 2007 A1
20070078832 Ott et al. Apr 2007 A1
20070083541 Fraleigh et al. Apr 2007 A1
20070088596 Berkelhamer et al. Apr 2007 A1
20070094389 Nussey et al. Apr 2007 A1
20070118547 Gupta et al. May 2007 A1
20070130541 Louch et al. Jun 2007 A1
20070150369 Zivin Jun 2007 A1
20070150520 Bennett et al. Jun 2007 A1
20070174760 Chamberlain et al. Jul 2007 A1
20070192265 Chopin et al. Aug 2007 A1
20070198571 Ferguson et al. Aug 2007 A1
20070208497 Downs et al. Sep 2007 A1
20070208498 Barker et al. Sep 2007 A1
20070208736 Tanigawa et al. Sep 2007 A1
20070233709 Abnous Oct 2007 A1
20070240062 Christena et al. Oct 2007 A1
20070250491 Olszak et al. Oct 2007 A1
20070266336 Nojima et al. Nov 2007 A1
20070294643 Kyle Dec 2007 A1
20080007618 Yuasa Jan 2008 A1
20080025629 Obrador et al. Jan 2008 A1
20080040684 Crump Feb 2008 A1
20080051989 Welsh Feb 2008 A1
20080052142 Bailey et al. Feb 2008 A1
20080077597 Butler Mar 2008 A1
20080077642 Carbone et al. Mar 2008 A1
20080082486 Lermant et al. Apr 2008 A1
20080104019 Nath May 2008 A1
20080126951 Sood et al. May 2008 A1
20080155440 Trevor et al. Jun 2008 A1
20080164998 Scherpbier et al. Jul 2008 A1
20080195417 Surpin et al. Aug 2008 A1
20080195608 Clover Aug 2008 A1
20080208844 Jenkins Aug 2008 A1
20080222295 Robinson et al. Sep 2008 A1
20080227473 Haney Sep 2008 A1
20080252419 Batchelor et al. Oct 2008 A1
20080263468 Cappione et al. Oct 2008 A1
20080267107 Rosenberg Oct 2008 A1
20080276167 Michael Nov 2008 A1
20080278311 Grange et al. Nov 2008 A1
20080288306 MacIntyre et al. Nov 2008 A1
20080301559 Martinsen et al. Dec 2008 A1
20080301643 Appleton et al. Dec 2008 A1
20080313281 Scheidl et al. Dec 2008 A1
20090002492 Velipasalar et al. Jan 2009 A1
20090005070 Forstall et al. Jan 2009 A1
20090006471 Richardson et al. Jan 2009 A1
20090006474 Richardson et al. Jan 2009 A1
20090027418 Maru et al. Jan 2009 A1
20090030915 Winter et al. Jan 2009 A1
20090037912 Stoitsev et al. Feb 2009 A1
20090055251 Shah et al. Feb 2009 A1
20090088964 Schaaf et al. Apr 2009 A1
20090119309 Gibson et al. May 2009 A1
20090119578 Relyea et al. May 2009 A1
20090125369 Kloosstra et al. May 2009 A1
20090125459 Norton et al. May 2009 A1
20090132921 Hwangbo et al. May 2009 A1
20090132953 Reed et al. May 2009 A1
20090138790 Larcheveque et al. May 2009 A1
20090143052 Bates et al. Jun 2009 A1
20090144262 White et al. Jun 2009 A1
20090144274 Fraleigh et al. Jun 2009 A1
20090156231 Versteeg et al. Jun 2009 A1
20090164934 Bhattiprolu et al. Jun 2009 A1
20090171939 Athsani et al. Jul 2009 A1
20090172511 Decherd et al. Jul 2009 A1
20090177962 Gusmorino et al. Jul 2009 A1
20090179892 Tsuda et al. Jul 2009 A1
20090187464 Bai et al. Jul 2009 A1
20090222400 Kupershmidt et al. Sep 2009 A1
20090222759 Drieschner Sep 2009 A1
20090222760 Halverson et al. Sep 2009 A1
20090234720 George et al. Sep 2009 A1
20090249244 Robinson et al. Oct 2009 A1
20090265105 Davis et al. Oct 2009 A1
20090281839 Lynn et al. Nov 2009 A1
20090287470 Farnsworth et al. Nov 2009 A1
20090292626 Oxford Nov 2009 A1
20090315679 Bauchot et al. Dec 2009 A1
20100004857 Pereira et al. Jan 2010 A1
20100011282 Dollard et al. Jan 2010 A1
20100042922 Bradateanu et al. Feb 2010 A1
20100057716 Stefik et al. Mar 2010 A1
20100058212 Belitz et al. Mar 2010 A1
20100070523 Delgo et al. Mar 2010 A1
20100070842 Aymeloglu et al. Mar 2010 A1
20100070845 Facemire et al. Mar 2010 A1
20100070897 Aymeloglu et al. Mar 2010 A1
20100073315 Lee et al. Mar 2010 A1
20100082842 Lavrov et al. Apr 2010 A1
20100100963 Mahaffey Apr 2010 A1
20100103124 Kruzeniski et al. Apr 2010 A1
20100114887 Conway et al. May 2010 A1
20100121817 Meyer et al. May 2010 A1
20100122152 Chamberlain et al. May 2010 A1
20100131457 Heimendinger May 2010 A1
20100162176 Dunton Jun 2010 A1
20100173619 Hua et al. Jul 2010 A1
20100185984 Wright et al. Jul 2010 A1
20100191563 Schlaifer et al. Jul 2010 A1
20100191884 Holenstein et al. Jul 2010 A1
20100198684 Eraker et al. Aug 2010 A1
20100199225 Coleman et al. Aug 2010 A1
20100214117 Hazzani Aug 2010 A1
20100223543 Marston Sep 2010 A1
20100228812 Uomini Sep 2010 A1
20100250412 Wagner Sep 2010 A1
20100280857 Liu et al. Nov 2010 A1
20100281458 Paladino et al. Nov 2010 A1
20100293174 Bennett et al. Nov 2010 A1
20100306713 Geisner et al. Dec 2010 A1
20100313119 Baldwin et al. Dec 2010 A1
20100318924 Frankel et al. Dec 2010 A1
20100321399 Ellren et al. Dec 2010 A1
20100325526 Ellis et al. Dec 2010 A1
20100325581 Finkelstein et al. Dec 2010 A1
20100330801 Rouh Dec 2010 A1
20110022312 McDonough et al. Jan 2011 A1
20110029526 Knight et al. Feb 2011 A1
20110047159 Baid et al. Feb 2011 A1
20110060753 Shaked et al. Mar 2011 A1
20110061013 Bilicki et al. Mar 2011 A1
20110066933 Ludwig Mar 2011 A1
20110074811 Hanson et al. Mar 2011 A1
20110078055 Faribault et al. Mar 2011 A1
20110078173 Seligmann et al. Mar 2011 A1
20110093327 Fordyce et al. Apr 2011 A1
20110093440 Asakura et al. Apr 2011 A1
20110111786 Rao May 2011 A1
20110117878 Barash et al. May 2011 A1
20110119100 Ruhl et al. May 2011 A1
20110137766 Rasmussen et al. Jun 2011 A1
20110153384 Horne et al. Jun 2011 A1
20110158469 Mastykarz Jun 2011 A1
20110161096 Buehler et al. Jun 2011 A1
20110167105 Ramakrishnan et al. Jul 2011 A1
20110170799 Carrino et al. Jul 2011 A1
20110173032 Payne et al. Jul 2011 A1
20110185316 Reid et al. Jul 2011 A1
20110202557 Atsmon et al. Aug 2011 A1
20110208724 Jones et al. Aug 2011 A1
20110213655 Henkin Sep 2011 A1
20110218934 Elser Sep 2011 A1
20110219450 McDougal et al. Sep 2011 A1
20110225198 Edwards et al. Sep 2011 A1
20110238495 Kang Sep 2011 A1
20110238553 Raj et al. Sep 2011 A1
20110251951 Kolkowitz Oct 2011 A1
20110258158 Resende et al. Oct 2011 A1
20110270705 Parker Nov 2011 A1
20110276423 Davidson Nov 2011 A1
20110289397 Eastmond et al. Nov 2011 A1
20110289407 Naik et al. Nov 2011 A1
20110289420 Morioka et al. Nov 2011 A1
20110291851 Whisenant Dec 2011 A1
20110310005 Chen et al. Dec 2011 A1
20110314007 Dassa et al. Dec 2011 A1
20120010812 Thompson Jan 2012 A1
20120014560 Obrador et al. Jan 2012 A1
20120015673 Klassen et al. Jan 2012 A1
20120019559 Siler et al. Jan 2012 A1
20120032975 Koch Feb 2012 A1
20120036013 Neuhaus et al. Feb 2012 A1
20120036434 Oberstein Feb 2012 A1
20120050293 Carlhian et al. Mar 2012 A1
20120066296 Appleton et al. Mar 2012 A1
20120072825 Sherkin et al. Mar 2012 A1
20120079363 Folting et al. Mar 2012 A1
20120084118 Bai et al. Apr 2012 A1
20120106801 Jackson May 2012 A1
20120117082 Koperda et al. May 2012 A1
20120131512 Takeuchi et al. May 2012 A1
20120137235 Ts et al. May 2012 A1
20120144335 Abeln et al. Jun 2012 A1
20120150578 Mangat et al. Jun 2012 A1
20120159307 Chung et al. Jun 2012 A1
20120159362 Brown et al. Jun 2012 A1
20120159399 Bastide et al. Jun 2012 A1
20120166929 Henderson et al. Jun 2012 A1
20120170847 Tsukidate Jul 2012 A1
20120173985 Peppel Jul 2012 A1
20120180002 Campbell et al. Jul 2012 A1
20120196557 Reich et al. Aug 2012 A1
20120196558 Reich et al. Aug 2012 A1
20120203708 Psota et al. Aug 2012 A1
20120208636 Feige Aug 2012 A1
20120216106 Casey Aug 2012 A1
20120221511 Gibson et al. Aug 2012 A1
20120221553 Wittmer et al. Aug 2012 A1
20120221580 Barney Aug 2012 A1
20120245976 Kumar et al. Sep 2012 A1
20120246148 Dror Sep 2012 A1
20120254129 Wheeler et al. Oct 2012 A1
20120268269 Doyle Oct 2012 A1
20120277914 Crow et al. Nov 2012 A1
20120284345 Costenaro et al. Nov 2012 A1
20120290879 Shibuya et al. Nov 2012 A1
20120296907 Long et al. Nov 2012 A1
20120311684 Paulsen et al. Dec 2012 A1
20120317202 Lewis Dec 2012 A1
20120323888 Osann, Jr. Dec 2012 A1
20120330973 Ghuneim et al. Dec 2012 A1
20130005362 Borghei Jan 2013 A1
20130006426 Healey et al. Jan 2013 A1
20130006725 Simanek et al. Jan 2013 A1
20130006916 McBride et al. Jan 2013 A1
20130013642 Klein et al. Jan 2013 A1
20130018796 Kolhatkar et al. Jan 2013 A1
20130024268 Manickavelu Jan 2013 A1
20130046635 Grigg et al. Feb 2013 A1
20130046842 Muntz et al. Feb 2013 A1
20130060786 Serrano et al. Mar 2013 A1
20130061169 Pearcy et al. Mar 2013 A1
20130073377 Heath Mar 2013 A1
20130073454 Busch Mar 2013 A1
20130078943 Biage et al. Mar 2013 A1
20130086482 Parsons Apr 2013 A1
20130097482 Marantz et al. Apr 2013 A1
20130101159 Chao et al. Apr 2013 A1
20130110877 Bonham et al. May 2013 A1
20130111320 Campbell et al. May 2013 A1
20130117651 Waldman et al. May 2013 A1
20130143597 Mitsuya et al. Jun 2013 A1
20130150004 Rosen Jun 2013 A1
20130151148 Parundekar et al. Jun 2013 A1
20130151388 Falkenborg et al. Jun 2013 A1
20130157234 Gulli et al. Jun 2013 A1
20130165069 Nitta et al. Jun 2013 A1
20130166550 Buchmann et al. Jun 2013 A1
20130176321 Mitchell et al. Jul 2013 A1
20130179420 Park et al. Jul 2013 A1
20130196614 Pahlevani Aug 2013 A1
20130224696 Wolfe et al. Aug 2013 A1
20130225212 Khan Aug 2013 A1
20130226318 Procyk Aug 2013 A1
20130226953 Markovich et al. Aug 2013 A1
20130232045 Tai Sep 2013 A1
20130235749 Cho et al. Sep 2013 A1
20130238616 Rose et al. Sep 2013 A1
20130246170 Gross et al. Sep 2013 A1
20130251233 Yang et al. Sep 2013 A1
20130262171 Solodko et al. Oct 2013 A1
20130262497 Case et al. Oct 2013 A1
20130262527 Hunter et al. Oct 2013 A1
20130262528 Foit Oct 2013 A1
20130263019 Castellanos et al. Oct 2013 A1
20130267207 Hao et al. Oct 2013 A1
20130268520 Fisher et al. Oct 2013 A1
20130279757 Kephart Oct 2013 A1
20130282696 John et al. Oct 2013 A1
20130288719 Alonzo Oct 2013 A1
20130290011 Lynn et al. Oct 2013 A1
20130290825 Arndt et al. Oct 2013 A1
20130295970 Sheshadri et al. Nov 2013 A1
20130297619 Chandarsekaran et al. Nov 2013 A1
20130311375 Priebatsch Nov 2013 A1
20140019936 Cohanoff Jan 2014 A1
20140032506 Hoey et al. Jan 2014 A1
20140033010 Richardt et al. Jan 2014 A1
20140040371 Gurevich et al. Feb 2014 A1
20140047319 Eberlein Feb 2014 A1
20140047357 Alfaro et al. Feb 2014 A1
20140059038 McPherson et al. Feb 2014 A1
20140067611 Adachi et al. Mar 2014 A1
20140068487 Steiger et al. Mar 2014 A1
20140074855 Zhao et al. Mar 2014 A1
20140079340 Kawano Mar 2014 A1
20140081685 Thacker et al. Mar 2014 A1
20140093174 Zhang et al. Apr 2014 A1
20140095273 Tang et al. Apr 2014 A1
20140095509 Patton Apr 2014 A1
20140108068 Williams Apr 2014 A1
20140108380 Gotz et al. Apr 2014 A1
20140108985 Scott et al. Apr 2014 A1
20140129261 Bothwell et al. May 2014 A1
20140149436 Bahrami et al. May 2014 A1
20140156527 Grigg et al. Jun 2014 A1
20140157172 Peery et al. Jun 2014 A1
20140164502 Khodorenko et al. Jun 2014 A1
20140176606 Narayan et al. Jun 2014 A1
20140189536 Lange et al. Jul 2014 A1
20140195515 Baker et al. Jul 2014 A1
20140195887 Ellis et al. Jul 2014 A1
20140214579 Shen et al. Jul 2014 A1
20140222521 Chait Aug 2014 A1
20140244388 Manouchehri et al. Aug 2014 A1
20140258827 Gormish et al. Sep 2014 A1
20140267294 Ma Sep 2014 A1
20140267295 Sharma Sep 2014 A1
20140279824 Tamayo Sep 2014 A1
20140302783 Aiuto et al. Oct 2014 A1
20140304582 Bills et al. Oct 2014 A1
20140310266 Greenfield Oct 2014 A1
20140316911 Gross Oct 2014 A1
20140333651 Cervelli et al. Nov 2014 A1
20140337772 Cervelli et al. Nov 2014 A1
20140344230 Krause et al. Nov 2014 A1
20140357299 Xu et al. Dec 2014 A1
20140358252 Ellsworth et al. Dec 2014 A1
20150005014 Huang et al. Jan 2015 A1
20150019394 Unser et al. Jan 2015 A1
20150046870 Goldenberg et al. Feb 2015 A1
20150080012 Sprague et al. Mar 2015 A1
20150089424 Duffield et al. Mar 2015 A1
20150100897 Sun et al. Apr 2015 A1
20150100907 Erenrich et al. Apr 2015 A1
20150134633 Colgrove et al. May 2015 A1
20150134666 Gattiker et al. May 2015 A1
20150169709 Kara et al. Jun 2015 A1
20150169726 Kara et al. Jun 2015 A1
20150170077 Kara et al. Jun 2015 A1
20150178825 Huerta Jun 2015 A1
20150178877 Bogomolov et al. Jun 2015 A1
20150186821 Wang et al. Jul 2015 A1
20150187036 Wang et al. Jul 2015 A1
20150227295 Meiklejohn et al. Aug 2015 A1
20150331919 Freeland et al. Nov 2015 A1
20160110458 Colgrove et al. Apr 2016 A1
20170132200 Noland May 2017 A1
Foreign Referenced Citations (41)
Number Date Country
102014103482 Sep 2014 DE
102014215621 Feb 2015 DE
1 672 527 Jun 2006 EP
2400448 Dec 2011 EP
2551799 Jan 2013 EP
2560134 Feb 2013 EP
2778977 Sep 2014 EP
2816513 Dec 2014 EP
2835745 Feb 2015 EP
2835770 Feb 2015 EP
2838039 Feb 2015 EP
2846241 Mar 2015 EP
2851852 Mar 2015 EP
2858014 Apr 2015 EP
2858018 Apr 2015 EP
2863326 Apr 2015 EP
2863346 Apr 2015 EP
2869211 May 2015 EP
2884439 Jun 2015 EP
2884440 Jun 2015 EP
2891992 Jul 2015 EP
2911078 Aug 2015 EP
2911100 Aug 2015 EP
2916276 Sep 2015 EP
2516155 Jan 2015 GB
2518745 Apr 2015 GB
2012778 Nov 2014 NL
2013306 Feb 2015 NL
624557 Dec 2014 NZ
WO 2000009529 Feb 2000 WO
WO 2002065353 Aug 2002 WO
WO 2004038548 May 2004 WO
WO 2005104736 Nov 2005 WO
WO 2008064207 May 2008 WO
WO 2009061501 May 2009 WO
WO 2010000014 Jan 2010 WO
WO 2010030913 Mar 2010 WO
WO 2013010157 Jan 2013 WO
WO 2013102892 Jul 2013 WO
Non-Patent Literature Citations (230)
Entry
Official Communication for European Patent Application No. 14180321.3 dated Apr. 17, 2015.
Official Communication for Australian Patent Application No. 2014201511 dated Feb. 27, 2015.
Official Communication for Australian Patent Application No. 2014202442 dated Mar. 19, 2015.
Official Communication for Great Britain Patent Application No. 1404457.2 dated Aug. 14, 2014.
Hansen et al., “Analyzing Social Media Networks with NodeXL: Insights from a Connected World”, Chapter 4, pp. 53-67 and Chapter 10, pp. 143-164, published Sep. 2010.
Official Communication for New Zealand Patent Application No. 628263 dated Aug. 12, 2014.
Gesher, Ari, “Palantir Screenshots in the Wild: Swing Sightings,” The Palantir Blog, Sep. 11, 2007, pp. 1- 12.
Official Communication for European Patent Application No. 15155846.7 dated Jul. 8, 2015.
European Patent Office, “Search Report” in Application No. 14 159 447.3-1958, dated Sep. 28, 2016, 6 pages.
Claims in European Application No. 14 159 447.3-1958 dated Sep. 2016, 2 pages.
Palmas et al., “An Edge-Bundling Layout for Interactive Parallel Coordinates,” 2014 IEEE Pacific Visualization Symposium, pp. 57-64.
Manske, “File Saving Dialogs,” <http://www.mozilla.org/editor/ui_specs/FileSaveDialogs.html>, Jan. 20, 1999, pp. 7.
Huang et al., “Systematic and Integrative Analysis of Large Gene Lists Using David Bioinformatics Resources,” Nature Protocols, 4.1, 2008, 44-57.
Definition “Identify”, downloaded Jan. 22, 2015, 1 page.
Official Communication for European Patent Application No. 14180142.3 dated Feb. 6, 2015.
Microsoft—Developer Network, “Getting Started with VBA in Word 2010,” Apr. 2010, <http://msdn.microsoft.com/en-us/library/ff604039%28v=office.14%29.aspx> as printed Apr. 4, 2014 in 17 pages.
Bugzilla@Mozilla, “Bug 18726—[feature] Long-click means of invoking contextual menus not supported,” http://bugzilla.mozilla.org/show_bug.cgi?id=18726 printed Jun. 13, 2013 in 11 pages.
Official Communication for Netherlands Patent Application No. 2013306 dated Apr. 24, 2015.
Official Communication for European Patent Application No. 14197879.1 dated Apr. 28, 2015.
“A Quick Guide to UniProtKB Swiss-Prot & TrEMBL,” Sep. 2011, pp. 2.
Chen et al., “Bringing Order to the Web: Automatically Categorizing Search Results,” CHI 2000, Proceedings of the SIGCHI conference on Human Factors in Computing Systems, Apr. 1-6, 2000, The Hague, The Netherlands, pp. 145-152.
Official Communication for European Patent Application No. 14187996.5 dated Feb. 12, 2015.
Keylines.com, “An Introduction to KeyLines and Network Visualization,” Mar. 2014, <http://keylines.com/wp-content/uploads/2014/03/KeyLines-White-Paper.pdf> downloaded May 12, 2014 in 8 pages.
Dramowicz, Ela, “Retail Trade Area Analysis Using the Huff Model,” Directions Magazine, Jul. 2, 2005 in 10 pages, http://www.directionsmag.com/articles/retail-trade-area-analysis-using-the-huff-model/123411.
Official Communication for European Patent Application No. 14158861.6 dated Jun. 16, 2014.
Yang et al., “HTML Page Analysis Based on Visual Cues”, A129, pp. 859-864, 2001.
Li et al., “Interactive Multimodal Visual Search on Mobile Device,” IEEE Transactions on Multimedia, vol. 15, No. 3, Apr. 1, 2013, pp. 594-607.
Official Communication for Australian Patent Application No. 2014210614 dated Jun. 5, 2015.
Official Communication for European Patent Application No. 14159464.8 dated Oct. 8, 2014.
Kitts, Paul, “Chapter 14: Genome Assembly and Annotation Process,” The NCBI Handbook, Oct. 2002, pp. 1-21.
Official Communication for European Patent Application No. 14159464.8 dated Jul. 31, 2014.
Official Communication for Australian Patent Application No. 2014210604 dated Jun. 5, 2015.
Rouse, Margaret, “OLAP Cube,” <http://searchdatamanagement.techtarget.com/definition/OLAP-cube>, Apr. 28, 2012, pp. 16.
Goswami, Gautam, “Quite Writly Said!,” One Brick at a Time, Aug. 21, 2005, pp. 7.
Conner, Nancy, “Google Apps: The Missing Manual,” May 1, 2008, pp. 15.
Olanoff, Drew, “Deep Dive with the New Google Maps for Desktop with Google Earth Integration, It's More than Just a Utility,” May 15, 2013, pp. 1-6, retrieved from the internet: http://web.archive.org/web/20130515230641/http://techcrunch.com/2013/05/15/deep-dive-with-the-new-google-maps-for-desktop-with-google-earth-integration-its-more-than-just-a-utility/.
Official Communication for New Zealand Patent Application No. 628840 dated Aug. 28, 2014.
Map of San Jose, CA. Retrieved Oct. 2, 2013 from http://maps.bing.com.
Official Communication for Great Britain Patent Application No. 1413935.6 dated Jan. 27, 2015.
Official Communication for European Patent Application No. 14159464.8 dated Sep. 22, 2014.
Official Communication for Australian Patent Application No. 2014250678 dated Jun. 17, 2015.
Griffith, Daniel A., “A Generalized Huff Model,” Geographical Analysis, Apr. 1982, vol. 14, No. 2, pp. 135-144.
Kahan et al., “Annotea: an Open RDF Infrastructure for Shared Web Annotations”, Computer Networks, Elsevier Science Publishers B.V., vol. 39, No. 5, dated Aug. 5, 2002, pp. 589-608.
Official Communication for Australian Patent Application No. 2014213553 dated May 7, 2015.
Official Communication for New Zealand Patent Application No. 622513 dated Aug. 3, 2014.
Nierman, “Evaluating Structural Similarity in XML Documents”, 6 pages, 2002.
Official Communication for European Patent Application No. 14191540.5 dated May 27, 2015.
Official Communication for European Patent Application No. 14186225.0 dated Feb. 13, 2015.
Ananiev et al., “The New Modality API,” http://web.archive.org/web/20061211011958/http://java.sun.com/developer/technicalArticles/J2SE/Desktop/javase6/modality/ Jan. 21, 2006, pp. 8.
Chung, Chin-Wan, “Dataplex: An Access to Heterogeneous Distributed Databases,” Communications of the ACM, Association for Computing Machinery, Inc., vol. 33, No. 1, Jan. 1, 1990, pp. 70-80.
Official Communication for New Zealand Patent Application No. 627962 dated Aug. 5, 2014.
“A Word About Banks and the Laundering of Drug Money,” Aug. 18, 2012, http://www.golemxiv.co.uk/2012/08/a-word-about-banks-and-the-laundering-of-drug-money/.
Official Communication for Great Britain Patent Application No. 1404574.4 dated Dec. 18, 2014.
“Potential Money Laundering Warning Signs,” snapshot taken 2003, https://web.archive.org/web/20030816090055/http:/finsolinc.com/ANTI-MONEY%20LAUNDERING%20TRAINING%20GUIDES.pdf.
Official Communication for Great Britain Patent Application No. 1411984.6 dated Dec. 22, 2014.
GIS-NET 3 Public, Department of Regional Planning, Planning & Zoning Information for Unincorporated LA County. Retrieved Oct. 2, 2013 from http://gis.planning.lacounty.gov/GIS-NET3_Public/Viewer.html.
Microsoft Office—Visio, “About connecting shapes,” <http://office.microsoft.com/en-us/visio-help/about-connecting-shapes-HP085050369.aspx> printed Aug. 4, 2011 in 6 pages.
Official Communication for European Patent Application No. 14189802.3 dated May 11, 2015.
Official Communication for European Patent Application No. 14189344.6 dated Feb. 20, 2015.
Definition “Overlay”, downloaded Jan. 22, 2015, 1 page.
Map of San Jose, CA. Retrieved Oct. 2, 2013 from http://maps.yahoo.com.
Manno et al., “Introducing Collaboration in Single-user Applications through the Centralized Control Architecture,” 2010, pp. 10.
Boyce, Jim, “Microsoft Outlook 2010 Inside Out,” Aug. 1, 2010, retrieved from the internet https://capdtron.files.wordpress.com/2013/01/outlook-2010-inside_out.pdf.
Wikipedia, “Federated Database System,” Sep. 7, 2013, retrieved from the internet on Jan. 27, 2015 http://en.wikipedia.org/w/index.php?title=Federated_database_system&oldid=571954221.
Official Communication for European Patent Application No. 14197895.7 dated Apr. 28, 2015.
Official Communication for European Patent Application No. 14199182.8 dated Mar. 13, 2015.
Huff et al., “Calibrating the Huff Model Using ArcGIS Business Analyst,” ESRI, Sep. 2008, pp. 33.
Official Communication for New Zealand Patent Application No. 624557 dated May 14, 2014.
Acklen, Laura, “Absolute Beginner's Guide to Microsoft Word 2003,” Dec. 24, 2003, pp. 15-18, 34-41, 308-316.
Celik, Tantek, “CSS Basic User Interface Module Level 3 (CSS3 UI),” Section 8 Resizing and Overflow, Jan. 17, 2012, retrieved from internet http://www.w3.org/TR/2012/WD-css3-ui-20120117/#resizing-amp-overflow retrieved on.
Liu, Tianshun, “Combining GIS and the Huff Model to Analyze Suitable Locations for a New Asian Supermarket in the Minneapolis and St. Paul, Minnesota USA,” Papers in Resource Analysis, 2012, vol. 14, pp. 8.
Official Communication for New Zealand Patent Application No. 622517 dated Apr. 3, 2014.
Keylines.com, “Visualizing Threats: Improved Cyber Security Through Network Visualization,” Apr. 2014, <http://keylines.com/wp-content/uploads/2014/04/Visualizing-Threats1.pdf> downloaded May 12, 2014 in 10 pages.
Official Communication for European Patent Application No. 14189347.9 dated Mar. 4, 2015.
Keylines.com, “KeyLines Datasheet,” Mar. 2014, <http://keylines.com/wp-content/uploads/2014/03/KeyLines-datasheet.pdf> downloaded May 12, 2014 in 2 pages.
Official Communication for New Zealand Patent Application No. 628495 dated Aug. 19, 2014.
Umagandhi et al., “Search Query Recommendations Using Hybrid User Profile with Query Logs,” International Journal of Computer Applications, vol. 80, No. 10, Oct. 1, 2013, pp. 7-18.
Official Communication for New Zealand Patent Application No. 628161 dated Aug. 25, 2014.
Huff, David L., “Parameter Estimation in the Huff Model,” ESRI, ArcUser, Oct.-Dec. 2003, pp. 34-36.
Hogue et al., “Thresher: Automating the Unwrapping of Semantic Content from the World Wide Web,” 14th International Conference on World Wide Web, WWW 2005: Chiba, Japan, May 10-14, 2005, pp. 86-95.
Canese et al., “Chapter 2: PubMed: The Bibliographic Database,” The NCBI Handbook, Oct. 2002, pp. 1-10.
“Refresh CSS Ellipsis When Resizing Container—Stack Overflow,” Jul. 31, 2013, retrieved from internet http://stackoverflow.com/questions/17964681/refresh-css-ellipsis-when-resizing-container, retrieved on May 18, 2015.
Thompson, Mick, “Getting Started with GEO,” Getting Started with GEO, Jul. 26, 2011.
Delcher et al., “Identifying Bacterial Genes and Endosymbiont DNA with Glimmer,” Bioinformatics, vol. 23, No. 6, 2007, pp. 673-679.
Hardesty, “Privacy Challenges: Analysis: It's Surprisingly Easy to Identify Individuals from Credit-Card Metadata,” MIT News on Campus and Around the World, MIT News Office, Jan. 29, 2015, 3 pages.
Official Communication for New Zealand Patent Application No. 628585 dated Aug. 26, 2014.
Bluttman et al., “Excel Formulas and Functions for Dummies,” 2005, Wiley Publishing, Inc., pp. 280, 284-286.
Official Communication for European Patent Application No. 14180432.8 dated Jun. 23, 2015.
Official Communication for European Patent Application No. 14180281.9 dated Jan. 26, 2015.
Hibbert et al., “Prediction of Shopping Behavior Using a Huff Model Within a GIS Framework,” Healthy Eating in Context, Mar. 18, 2011, pp. 16.
Sirotkin et al., “Chapter 13: The Processing of Biological Sequence Data at NCBI,” The NCBI Handbook, Oct. 2002, pp. 1-11.
Madden, Tom, “Chapter 16: The BLAST Sequence Analysis Tool,” The NCBI Handbook, Oct. 2002, pp. 1-15.
Microsoft Office—Visio, “Add and glue connectors with the Connector tool,” <http://office.microsoft.com/en-us/visio-help/add-and-glue-connectors-with-the-connector-tool-HA010048532.aspx?CTT=1> printed Aug. 4, 2011 in 1 page.
Map of San Jose, CA. Retrieved Oct. 2, 2013 from http://maps.google.com.
Notice of Acceptance for Australian Patent Application No. 2014250678 dated Oct. 7, 2015.
Official Communication for Great Britain Patent Application No. 1408025.3 dated Nov. 6, 2014.
“A First Look: Predicting Market Demand for Food Retail using a Huff Analysis,” TRF Policy Solutions, Jul. 2012, pp. 30.
Amnet, “5 Great Tools for Visualizing Your Twitter Followers,” posted Aug. 4, 2010, http://www.amnetblog.com/component/content/article/115-5-grate-tools-for-visualizing-your-twitter-followers.html.
Sigrist, et al., “Prosite, a Protein Domain Database for Functional Characterization and Annotation,” Nucleic Acids Research, 2010, vol. 38, pp. D161-D166.
“The FASTA Program Package,” fasta-36.3.4, Mar. 25, 2011, pp. 29.
Mizrachi, Ilene, “Chapter 1: GenBank: The Nucleotide Sequence Database,” The NCBI Handbook, Oct. 2002, pp. 1-14.
IBM, “Determining Business Object Structure,” IBM, 2004, 9 pages.
U.S. Appl. No. 13/181,392, filed Jul. 21, 2011, Notice of Allowance, dated Jan. 22, 2015.
U.S. Appl. No. 13/838,815, filed Mar. 15, 2013, Notice of Allowance, dated Jan. 29, 2015.
U.S. Appl. No. 13/838,815, filed Mar. 15, 2013, Notice of Allowance, dated Jun. 19, 2015.
U.S. Appl. No. 13/181,392, filed Jul. 21, 2011, Final Office Action, dated Aug. 28, 2014.
U.S. Appl. No. 14/088,251, filed Nov. 22, 2013, Office Action, dated Feb. 12, 2015.
U.S. Appl. No. 14/487,342, filed Sep. 16, 2014, First Action Interview, dated Apr. 23, 2015.
U.S. Appl. No. 14/196,814, filed Mar. 4, 2014, Office Action, dated May 5, 2015.
U.S. Appl. No. 14/027,118, filed Sep. 13, 2013, Office Action, dated May 12, 2015.
U.S. Appl. No. 14/088,251, filed Nov. 22, 2013, Final Office Action, dated May 20, 2015.
U.S. Appl. No. 13/831,199, filed Mar. 14, 2013, Office Action, dated Jun. 3, 2015.
U.S. Appl. No. 14/088,251, filed Nov. 22, 2013, Interview Summary, dated Jun. 30, 2015.
U.S. Appl. No. 14/334,232, filed Jul. 17, 2014, Office Action, dated Jul. 10, 2015.
U.S. Appl. No. 13/839,026, filed Mar. 15, 2013, Office Action, dated Aug. 4, 2015.
U.S. Appl. No. 14/088,251, filed Nov. 22, 2013, Office Action, dated Aug. 26, 2015.
U.S. Appl. No. 14/027,118, filed Sep. 13, 2013, Office Action, dated Sep. 16, 2015.
U.S. Appl. No. 13/839,026, filed Mar. 15, 2013, Restriction Requirement, dated Apr. 2, 2015.
U.S. Appl. No. 14/487,342, filed Sep. 16, 2014, Notice of Allowance, dated Sep. 23, 2015.
U.S. Appl. No. 14/196,814, filed Mar. 4, 2014, Office Action, dated Oct. 7, 2015.
U.S. Appl. No. 13/831,199, filed Mar. 14, 2013, Final Office Action, dated Oct. 6, 2015.
U.S. Appl. No. 14/334,232, filed Jul. 17, 2014, Notice of Allowance, dated Nov. 10, 2015.
U.S. Appl. No. 14/690,905, filed Apr. 20, 2015, Notice of Allowance, dated Nov. 23, 2015.
U.S. Appl. No. 14/690,905, filed Apr. 20, 2015, Office Action, dated Oct. 7, 2015.
U.S. Appl. No. 13/247,987, filed Sep. 28, 2011, Office Action, dated Apr. 2, 2015.
U.S. Appl. No. 13/831,791, filed Mar. 15, 2013, Office Action, dated Mar. 4, 2015.
U.S. Appl. No. 13/835,688, filed Mar. 15, 2013, First Office Action Interview, dated Jun. 17, 2015.
U.S. Appl. No. 14/102,394, filed Dec. 10, 2013, Notice of Allowance, dated Aug. 25, 2014.
U.S. Appl. No. 14/108,187, filed Dec. 16, 2013, Notice of Allowance, dated Aug. 29, 2014.
U.S. Appl. No. 14/135,289, filed Dec. 19, 2013, Notice of Allowance, dated Oct. 14, 2014.
U.S. Appl. No. 14/148,568, filed Jan. 6, 2014, Office Action, dated Oct. 22, 2014.
U.S. Appl. No. 14/148,568, filed Jan. 6, 2014, Office Action, dated Mar. 26, 2015.
U.S. Appl. No. 14/192,767, filed Feb. 27, 2014, Notice of Allowance, dated Dec. 16, 2014.
U.S. Appl. No. 14/225,006, filed Mar. 25, 2014, First Office Action Interview, dated Sep. 10, 2014.
U.S. Appl. No. 14/225,006, filed Mar. 25, 2014, First Office Action Interview, dated Feb. 27, 2015.
U.S. Appl. No. 14/225,084, filed Mar. 25, 2014, First Office Action Interview, dated Sep. 2, 2014.
U.S. Appl. No. 14/225,084, filed Mar. 25, 2014, Notice of Allowance, dated May 4, 2015.
U.S. Appl. No. 14/225,084, filed Mar. 25, 2014, First Office Action Interview, dated Feb. 20, 2015.
U.S. Appl. No. 14/225,160, filed Mar. 25, 2014, Final Office Action, dated Feb. 11, 2015.
U.S. Appl. No. 14/225,160, filed Mar. 25, 2014, Advisory Action, dated May 20, 2015.
U.S. Appl. No. 14/225,160, filed Mar. 25, 2014, First Office Action Interview, dated Oct. 22, 2014.
U.S. Appl. No. 14/268,964, filed May 2, 2014, First Office Action Interview, dated Sep. 3, 2014.
U.S. Appl. No. 14/268,964, filed May 2, 2014, Notice of Allowance, dated Dec. 3, 2014.
U.S. Appl. No. 14/289,596, filed May 28, 2014, First Office Action Interview, dated Jul. 18, 2014.
U.S. Appl. No. 14/289,596, filed May 28, 2014, Final Office Action, dated Jan. 26, 2015.
U.S. Appl. No. 14/289,599, filed May 28, 2014, First Office Action Interview, dated Jul. 22, 2014.
U.S. Appl. No. 14/289,599, filed May 28, 2014, Final Office Action, dated May 29, 2015.
U.S. Appl. No. 14/294,098, filed Jun. 2, 2014, Final Office Action, dated Nov. 6, 2014.
U.S. Appl. No. 14/294,098, filed Jun. 2, 2014, First Office Action Interview, dated Aug. 15, 2014.
U.S. Appl. No. 14/294,098, filed Jun. 2, 2014, Notice of Allowance, dated Dec. 29, 2014.
U.S. Appl. No. 14/306,138, filed Jun. 16, 2014, Final Office Action, dated Feb. 18, 2015.
U.S. Appl. No. 14/306,138, filed Jun. 16, 2014, First Office Action Interview, dated Sep. 23, 2014.
U.S. Appl. No. 14/306,138, filed Jun. 16, 2014, Office Action, dated May 26, 2015.
U.S. Appl. No. 14/306,147, filed Jun. 16, 2014, First Office Action Interview, dated Sep. 9, 2014.
U.S. Appl. No. 14/306,147, filed Jun. 16, 2014, Final Office Action, dated Feb. 19, 2015.
U.S. Appl. No. 14/306,154, filed Jun. 16, 2014, First Office Action Interview, dated Sep. 9, 2014.
U.S. Appl. No. 14/306,154, filed Jun. 16, 2014, Final Office Action, dated Mar. 11, 2015.
U.S. Appl. No. 14/306,154, filed Jun. 16, 2014, Advisory Action, dated May 15, 2015.
U.S. Appl. No. 14/319,765, filed Jun. 30, 2014, First Office Action Interview, dated Feb. 4, 2015.
U.S. Appl. No. 14/319,765, filed Jun. 30, 2014, First Office Action Interview, dated Nov. 25, 2014.
U.S. Appl. No. 14/323,935, filed Jul. 3, 2014, Office Action, dated Jun. 22, 2015.
U.S. Appl. No. 14/323,935, filed Jul. 3, 2014, First Office Action Interview, dated Nov. 28, 2014.
U.S. Appl. No. 14/323,935, filed Jul. 3, 2014, First Office Action Interview, dated Mar. 31, 2015.
U.S. Appl. No. 14/326,738, filed Jul. 9, 2014, First Office Action Interview, dated Dec. 2, 2014.
U.S. Appl. No. 14/326,738, filed Jul. 9, 2014, First Office Action Interview, dated Mar. 31, 2015.
U.S. Appl. No. 14/473,552, filed Aug. 29, 2014, Interview Summary, dated Feb. 24, 2015.
U.S. Appl. No. 14/486,991, filed Sep. 15, 2014, Office Action, dated Mar. 10, 2015.
U.S. Appl. No. 14/504,103, filed Oct. 1, 2014, First Office Action Interview, dated Feb. 5, 2015.
U.S. Appl. No. 14/504,103, filed Oct. 1, 2014, Notice of Allowance, dated May 18, 2015.
U.S. Appl. No. 14/504,103, filed Oct. 1, 2014, First Office Action Interview, dated Mar. 31, 2015.
U.S. Appl. No. 14/579,752, filed Dec. 22, 2014, First Office Action Interview, dated May 26, 2015.
U.S. Appl. No. 14/616,080, filed Feb. 6, 2015, Notice of Allowance, dated Apr. 2, 2015.
U.S. Appl. No. 14/639,606, filed Mar. 5, 2015 First Office Action Interview, dated May 18, 2015.
U.S. Appl. No. 14/639,606, filed Mar. 5, 2015, First Action Interview, dated Jul. 24, 2015.
U.S. Appl. No. 14/306,147, filed Jun. 16, 2014, Office Action, dated Aug. 7, 2015.
U.S. Appl. No. 14/319,765, filed Jun. 30, 2014, Final Office Action, dated Jun. 16, 2015.
U.S. Appl. No. 14/289,596, filed May 28, 2014, Advisory Action, dated Apr. 30, 2015.
U.S. Appl. No. 14/225,006, filed Mar. 25, 2014, Final Office Action, dated Sep. 2, 2015.
U.S. Appl. No. 13/836,815, filed Mar. 15, 2013, Office Action, dated Oct. 24, 2015.
U.S. Appl. No. 14/579,752, filed Dec. 22, 2014, Final Office Action, dated Aug. 19, 2015.
U.S. Appl. No. 13/839,026, filed Mar. 15, 2013, Notice of Allowance, dated Jul. 6, 2015.
U.S. Appl. No. 14/306,154, filed Jun. 16, 2014, Office Action, dated Jul. 6, 2015.
U.S. Appl. No. 14/306,138, filed Jun. 16, 2014, Final Office Action, dated Sep. 14, 2015.
U.S. Appl. No. 14/473,860, filed Aug. 29, 2014, Notice of Allowance, dated Jan. 5, 2015.
U.S. Appl. No. 13/831,199, filed Mar. 14, 2013, Office Action, dated May 9, 2016.
U.S. Appl. No. 14/196,814, filed Mar. 4, 2014, First Office Action Interview, dated Apr. 16, 2014.
U.S. Appl. No. 14/813,749, filed Jul. 30, 2015, Office Action, dated Sep. 28, 2015.
U.S. Appl. No. 14/225,160, filed Mar. 25, 2014, Office Action, dated Aug. 12, 2015.
U.S. Appl. No. 13/831,199, filed Mar. 14, 2013, Final Office Action, dated Nov. 4, 2016.
U.S. Appl. No. 14/490,612, filed Sep. 18, 2014, Final Office Action, dated Aug. 18, 2015.
U.S. Appl. No. 14/486,991, filed Sep. 15, 2014, Notice of Allowance, dated May 1, 2015.
U.S. Appl. No. 14/027,118, filed Feb. 4, 2016, Notice of Allowance, dated Apr. 4, 2016.
U.S. Appl. No. 15/047,405, filed Feb. 18, 2016, Office Action, dated Apr. 1, 2016.
U.S. Appl. No. 14/726,353, filed May 29, 2015, First Office Action Interview, dated Sep. 10, 2015.
U.S. Appl. No. 13/247,987, filed Sep. 28, 2011, Office Action, dated Sep. 22, 2015.
U.S. Appl. No. 12/556,318, filed Sep. 9, 2009, Office Action, dated Jul. 2, 2015.
U.S. Appl. No. 14/631,633, filed Feb. 25, 2015, First Office Action Interview, dated Sep. 10, 2015.
U.S. Appl. No. 14/326,738, filed Jul. 9, 2014, Final Office Action, dated Jul. 31, 2015.
U.S. Appl. No. 14/077,159, filed Nov. 11, 2013, Office Action, dated Mar. 12, 2014.
U.S. Appl. No. 15/145,177, filed May 3, 2016, Office Action, dated Jul. 29, 2016.
U.S. Appl. No. 14/289,599, filed May 28, 2014, Advisory Action, dated Sep. 4, 2015.
U.S. Appl. No. 15/145,177, filed May 3, 2016, Final Office Action, dated Aug. 7, 2015.
U.S. Appl. No. 14/319,765, filed Jun. 30, 2014, Advisory Action, dated Sep. 10, 2015.
U.S. Appl. No. 14/196,814, filed Mar. 4, 2014, Final Office Action, dated Dec. 14, 2016.
U.S. Appl. No. 14/196,814, filed Mar. 4, 2014, Notice of Allowance, dated Apr. 6, 2017.
U.S. Appl. No. 14/225,084, filed Mar. 25, 2014, Office Action, dated Sep. 11, 2015.
U.S. Appl. No. 14/196,814, filed Mar. 4, 2014, Final Office Action, dated Jun. 13, 2016.
U.S. Appl. No. 14/134,558, filed Dec. 19, 2013, Office Action, dated Oct. 7, 2016.
U.S. Appl. No. 13/831,791, filed Mar. 15, 2013, Final Office Action, dated Aug. 6, 2015.
New Zealand Intellectual Property Office, “First Examination Report” in application No. 35215130/AJS, dated Apr. 1, 2014, 2 pages.
Official Communication for New Zealand Patent Application No. 622501 dated Apr. 1, 2014.
Official Communication for New Zealand Patent Application No. 622501 dated Jun. 5, 2014.
Official Communication for European Patent Application No. 14159447.3 dated Nov. 25, 2014.
Official Communication for European Patent Application No. 14159447.3 dated Jan. 8, 2015.
Official Communication for European Patent Application No. 15157642.8 dated Jul. 20, 2015.
U.S. Appl. No. 14/985,201, filed Dec. 30, 2015, Notice of Allowance, dated Apr. 18, 2018.
U.S. Appl. No. 14/077,159, filed Nov. 11, 2013, Notice of Allowance, dated Aug. 15, 2014.
Stack Overflow, How to use update trigger to update another table, May 2012, 2 pages.
U.S. Appl. No. 14/088,251, filed Nov. 22, 2013, Final Office Action, dated Apr. 18, 2016.
U.S. Appl. No. 13/831,199, filed Mar. 14, 2013, Notice of Allowance, dated Jan. 2, 2018.
U.S. Appl. No. 13/831,199, filed Mar. 14, 2013, Office Action, dated May 19, 2017.
U.S. Appl. No. 14/196,814, filed Mar. 4, 2014, Final Office Action, dated Dec. 18, 2014.
U.S. Appl. No. 15/145,177, filed May 3, 2016, Final Office Action, dated Dec. 2, 2016.
U.S. Appl. No. 14/196,814, filed Mar. 4, 2014, Interview Summary, dated Jul. 28, 2015.
U.S. Appl. No. 13/839,026, filed Mar. 15, 2013, Notice of Allowance, dated Mar. 11, 2016.
U.S. Appl. No. 14/580,218, filed Dec. 23, 2014, Final Office Action, dated Jan. 7, 2016.
U.S. Appl. No. 14/806,517, filed Jul. 22, 2015, Pre Interview Office Action, dated Oct. 26, 2016.
U.S. Appl. No. 14/985,201, filed Dec. 30, 2015, Pre Interview Office Action, dated Jun. 15, 2017.
U.S. Appl. No. 14/985,201, filed Dec. 30, 2015, Pre Interview Office Action, dated Oct. 3, 2017.
U.S. Appl. No. 13/838,815, filed Mar. 15, 2013, Notice of Allowance, dated Mar. 3, 2015.