Methods for integrating semantic search, query, and analysis and devices thereof

Information

  • Patent Grant
  • Patent Number
    10,025,880
  • Date Filed
    Friday, January 6, 2017
  • Date Issued
    Tuesday, July 17, 2018
Abstract
A method, non-transitory computer readable medium, and data management computing apparatus comprising searching across a plurality of different heterogeneous data indexes based on portions of one or more search keywords in response to a received request. A result set for each of the plurality of different heterogeneous data indexes is obtained based on the searching. Further, one or more facets are added to each of the obtained result sets. Furthermore, one of a plurality of visualization techniques is automatically identified for each of the obtained result sets based on the facets in each of the obtained result sets and a model entity type associated with each of the plurality of different heterogeneous data indexes. Finally, each of the obtained result sets with the added facets and the identified one of the plurality of visualization techniques is provided.
Description
FIELD

This technology generally relates to the collection, semantic modeling, persistent storage, and subsequent search, query, and analysis of vast amounts of heterogeneous data that is derived from computer applications and systems, computer and network based human interactions, and networked physical devices, sensors, and systems.


BACKGROUND

The connected world, also referred to as the internet of things, is growing quickly. Analysts have estimated that along with the continued growth of humans using the Internet, the number of connected devices and systems will rise from five billion to one trillion in the next ten years. However, the traditional ways to manage and communicate with these systems have not changed. In other words, all the information from these systems is not accessible or is not able to be correlated in a way that helps people or businesses do their jobs better and more efficiently, find information they are looking for in the proper context, or make this data consumable in a meaningful way. In addition, user expectations for interacting with systems have changed and more consistent ways to share dynamic information in this environment have not been found.


Existing technologies handle the rising amount of data using enterprise resource planning (ERP) systems, portals and related technologies, traditional business intelligence systems, and manufacturing intelligence systems. However, these existing technologies do not provide the required data in real time and also restrict the type and amounts of data that can be accessed by the users. Additionally, existing technologies fail to provide an interactive system to solve a problem or to search information relating to a specific domain. Further, the existing technologies do not provide analytical solutions for the data available across different servers within an organization and are not compatible with third party database servers.


SUMMARY

A method for integrating semantic search, query, and analysis across heterogeneous data types includes searching, by a data management computing apparatus, across a plurality of different heterogeneous data indexes based on portions of one or more search keywords in response to a received request. A result set for each of the plurality of different heterogeneous data indexes is obtained based on the searching by the data management computing apparatus. Further, one or more facets are added to each of the obtained result sets by the data management computing apparatus. Furthermore, one of a plurality of visualization techniques is automatically identified by the data management computing apparatus for each of the obtained result sets based on the facets in each of the obtained result sets and a model entity type associated with each of the plurality of different heterogeneous data indexes. Finally, each of the obtained result sets with the added facets and the identified one of the plurality of visualization techniques is provided by the data management computing apparatus.


A non-transitory computer readable medium having stored thereon instructions for integrating semantic search, query, and analysis across heterogeneous data types comprising machine executable code which, when executed by at least one processor, causes the processor to perform steps including searching across a plurality of different heterogeneous data indexes based on portions of one or more search keywords in response to a received request. A result set for each of the plurality of different heterogeneous data indexes is obtained based on the searching. Further, one or more facets are added to each of the obtained result sets. Furthermore, one of a plurality of visualization techniques is automatically identified for each of the obtained result sets based on the facets in each of the obtained result sets and a model entity type associated with each of the plurality of different heterogeneous data indexes. Finally, each of the obtained result sets with the added facets and the identified one of the plurality of visualization techniques is provided.


A data management computing apparatus including one or more processors and a memory coupled to the one or more processors, which are configured to execute programmed instructions stored in the memory including searching across a plurality of different heterogeneous data indexes based on portions of one or more search keywords in response to a received request. A result set for each of the plurality of different heterogeneous data indexes is obtained based on the searching. Further, one or more facets are added to each of the obtained result sets. Furthermore, one of a plurality of visualization techniques is automatically identified for each of the obtained result sets based on the facets in each of the obtained result sets and a model entity type associated with each of the plurality of different heterogeneous data indexes. Finally, each of the obtained result sets with the added facets and the identified one of the plurality of visualization techniques is provided.


This technology provides a number of advantages including providing more effective methods, non-transitory computer readable media, and apparatuses for integrating semantic search, query, and analysis across heterogeneous data types. This technology more effectively guides users to the information that they are seeking. Additionally, this technology provides answers to questions that were previously unanswerable via traditional business intelligence and reporting tools applications. This technology also helps find unforeseen relationships in data and business processes that can lead to innovative solutions and better dissemination of knowledge.


Another advantage of this technology is that it executes and manages searches like a conversation. This technology is able to suggest, refine, relate, and educate a user during the search process. Additionally, this technology may provide feedback so that the process of searching provides the right answer or helps to change the question that is being searched. By adding the context of the application, as well as data about who the user is and how that user is currently interacting with the application, this technology can suggest a different question before it is even asked or add specific search terms to the question as it is being asked based on the user context to add more granularity to the results.


Yet another advantage of this technology is that it continuously collects and indexes more heterogeneous data than existing technologies which allows more data to be mined and searched over time. Additionally, by using a number of well-defined search paradigms, such as tagging, faceting, and text indexing, this technology helps users mine heterogeneous data more effectively to solve complex questions or problems easily and efficiently. Further, by extending traditional techniques for searching and combining those techniques with access to analytics and the existing capabilities of the underlying graph database, this technology is able to identify unforeseen scenarios buried within the captured heterogeneous data.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of a network environment with an exemplary data management computing apparatus for integrated search, query, and analysis across heterogeneous data types; and



FIGS. 2A-2B are flowcharts of an exemplary method for performing integrated search, query, and analysis across heterogeneous data types.





DETAILED DESCRIPTION

A network environment 10 with a data management computing apparatus 14 for integrated search, query, and analysis across heterogeneous data types is illustrated in FIG. 1. The environment 10 includes the data management computing apparatus 14, a plurality of client computing devices 12, and a plurality of data servers 16 which are coupled together by the Local Area Network (LAN) 28 and Wide Area Network (WAN) 30, although the environment 10 can include other types and numbers of devices, components, elements, databases and communication networks in other topologies and deployments. While not shown, the exemplary environment 10 may include additional components, such as routers, switches and other devices which are well known to those of ordinary skill in the art and thus will not be described here. This technology provides a number of advantages including providing more effective methods, non-transitory computer readable media, and apparatuses for integrating semantic search, query, and analysis across heterogeneous data types.


The data management computing apparatus 14 provides a number of functions including integrating semantic search, query, and analysis across heterogeneous data types and systems, although other numbers and types of systems can be used and other numbers and types of functions can be performed. The data management computing apparatus 14 includes at least one processor 18, memory 20, input and display devices 22, and interface device 24 which are coupled together by bus 26, although data management computing apparatus 14 may comprise other types and numbers of elements in other configurations.


Processor(s) 18 may execute one or more non-transitory programmed computer-executable instructions stored in the memory 20 for the exemplary methods illustrated and described herein, although the processor(s) can execute other types and numbers of instructions and perform other types and numbers of operations. The processor(s) 18 may comprise one or more central processing units (“CPUs”) or general purpose processors with one or more processing cores, such as AMD® processor(s), although other types of processor(s) could be used (e.g., Intel®).


Memory 20 may comprise one or more tangible storage media, such as RAM, ROM, flash memory, CD-ROM, floppy disk, hard disk drive(s), solid state memory, DVD, or any other memory storage types or devices, including combinations thereof, which are known to those of ordinary skill in the art. Memory 20 may store one or more non-transitory computer-readable instructions of this technology as illustrated and described with reference to the examples herein that may be executed by the one or more processor(s) 18. The flowcharts shown in FIGS. 2A-2B are representative of example steps or actions of this technology that may be embodied or expressed as one or more non-transitory computer or machine readable instructions stored in memory 20 that may be executed by the processor(s) 18.


Input and display devices 22 enable a user, such as an administrator, to interact with the data management computing apparatus 14, such as to input and/or view data and/or to configure, program, and/or operate it by way of example only. Input devices may include a touch screen, keyboard, and/or a computer mouse and display devices may include a computer monitor, although other types and numbers of input devices and display devices could be used. Additionally, the input and display devices 22 can be used by the user, such as an administrator, to develop applications using an application interface.


The interface device 24 in the data management computing apparatus 14 is used to operatively couple and communicate between the data management computing apparatus 14, the client computing devices 12, and the plurality of data servers which are all coupled together by LAN 28 and WAN 30. By way of example only, the interface device 24 can use TCP/IP over Ethernet and industry-standard protocols, including NFS, CIFS, SOAP, XML, LDAP, and SNMP although other types and numbers of communication protocols can be used.


Each of the client computing devices 12 includes a central processing unit (CPU) or processor, a memory, an interface device, and an I/O system, which are coupled together by a bus or other link, although other numbers and types of network devices could be used. The client computing device 12 communicates with the data management computing apparatus 14 through LAN 28, although the client computing device 12 can interact with the data management computing apparatus 14 in other manners.


Each of the plurality of data servers 16 includes a central processing unit (CPU) or processor, a memory, an interface device, and an I/O system, which are coupled together by a bus 26 or other link, although other numbers and types of devices and systems could be used. Each of the plurality of data servers 16 enters, updates, and/or stores content, such as files and directories, although other numbers and types of functions can be implemented and other types and amounts of data could be entered, updated, or stored. Each of the plurality of data servers 16 may include, by way of example only, enterprise resource planning (ERP) systems, portals and related technologies, traditional business intelligence systems, and manufacturing intelligence systems. Additionally, the plurality of data servers 16 can include real time information of executing devices or resources.


Although an exemplary environment 10 with the client computing devices 12, the data management computing apparatus 14, and the plurality of data servers 16 is described and illustrated herein, other types and numbers of systems and devices in other topologies can be used. It is to be understood that the systems of the examples described herein are for exemplary purposes, as many variations of the specific hardware and software used to implement the examples are possible, as will be appreciated by those skilled in the relevant art(s).


In addition, two or more computing systems or devices can be substituted for any one of the systems or devices in any example. Accordingly, principles and advantages of distributed processing, such as redundancy and replication, also can be implemented, as desired, to increase the robustness and performance of the devices and systems of the examples. The examples may also be implemented on computer system(s) that extend across any suitable network using any suitable interface mechanisms and traffic technologies, including by way of example only teletraffic in any suitable form (e.g., voice and modem), wireless traffic media, wireless traffic networks, cellular traffic networks, 3G traffic networks, Public Switched Telephone Networks (PSTNs), Packet Data Networks (PDNs), the Internet, intranets, and combinations thereof.


Furthermore, each of the systems of the examples may be conveniently implemented using one or more general purpose computer systems, microprocessors, digital signal processors, and micro-controllers, programmed according to the teachings of the examples, as described and illustrated herein, and as will be appreciated by those of ordinary skill in the art.


This technology defines a rapid, iterative approach to design and deployment, allowing solutions to be delivered incrementally, shortening the time to first value. This system's unique model-based design and development tools enable developers to build and deploy operational solutions in less time than traditional approaches.


The software platform described by this system defines a model driven development architecture in which the model has entities, which typically represent physical assets/devices, computer applications and systems, and people. Entities can also represent data objects and platform services. Each entity has its own properties and services, and can fire and consume events. All entities are treated as equal collaborators in any applications that utilize the underlying capabilities of the system.


Within this system, developers model the Things (people, systems and real world equipment/devices) in their world, independent of any specific use case. Things are augmented projections of their real world equivalents containing the complete set of data, services, events, historical activities, collaboration, relationships, and user interfaces that define them and their place in the world. These Things can then be easily combined into solutions, tagged and related into industrial social graphs, searched/queried/analyzed, and mashed up into new operational processes.


This system enables applications that are ‘dynamic’ in that they continuously evolve and grow over time. As the application runs, it continuously collects and indexes new data about the entities in the model, which allows more data to be mined and searched over time. This system's unique technology provides the basis for this evolution, allowing users to answer questions, solve problems, and capture opportunities that have not even been anticipated. This exemplary technology increases in value the more it is used.


An exemplary apparatus and method, referred to herein as Search, Query, and Analysis (SQUEAL), is provided that allows users to: utilize built-in application and user context to guide users to the information that they are seeking; provide answers to questions that were previously unanswerable via traditional BIRT (Business Intelligence and Reporting Tools) applications; help users find unforeseen relationships in their data and business processes that can lead to innovative solutions and better spread of knowledge; and integrate search, collaboration, and analytical applications.


SQUEAL addresses both the Known-Unknown and the Unknown-Unknown domains. This technology utilizes a user defined model that follows a specific set of rules. Therefore, this exemplary technology can know how different model elements will be related and stored. Using a number of well-defined search paradigms, such as tagging, faceting, and text indexing, this technology helps users solve the Known-Unknown questions. Basic relationships can be traced in the system because of their definition within the model.


The more difficult solution is to enable users to advance to the Unknown-Unknown realm. The next level of value comes from being able to answer questions that are not answerable by direct connections between entities. Extending traditional techniques for search, and combining those techniques with access to analytics and the existing capabilities of the underlying graph database, offer the ability to identify unforeseen scenarios buried within the captured data.


Because SQUEAL can be used in the Unknown-Unknown realm, it is expected that relationships between data and new solutions for innovation and problem solving will result, as the unintended consequence of asking a question and seeing an unforeseen answer.


One of the benefits of examples of this technology is that it can use search like a conversation. Since all entities in the system are able to ‘converse’, making devices, sensors, systems and people equal participants in the process, search can be viewed as a conversation between user and engine. The system is able to suggest, refine, relate, and educate the user during the search process. The engine should provide feedback so that the process of searching provides the right answer, or helps to change the question that is being asked. By adding the context of the application, as well as who the user is and how that user is currently interacting with the application, this exemplary technology can suggest a different question before it is even asked—or add specific search terms to the question as it is being asked based on the user context, to add more granularity to the results.


As this exemplary technology suggests new solutions or paths, the user walks through a discovery path that is more complex and rich than a simple full text search.


A discovery path may be saved for future use. The discovery path may be kept private or shared with other users of the system. Each discovery path will have a “breadcrumb trail” marking the stops and turns of the path and any breadcrumb can be “pinned” to represent a returnable end point.
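By way of illustration only, a discovery path with a breadcrumb trail and pinnable stops could be sketched as follows. This is a minimal sketch; the class and field names are assumptions for illustration, not the system's actual schema.

```python
# Illustrative sketch of a saved discovery path: an ordered breadcrumb trail
# whose stops can be "pinned" as returnable end points. Names are assumed.
class DiscoveryPath:
    def __init__(self, owner, shared=False):
        self.owner = owner
        self.shared = shared      # private by default; may be shared with other users
        self.breadcrumbs = []     # ordered stops and turns of the path
        self.pinned = set()       # indexes of breadcrumbs pinned as end points

    def visit(self, stop):
        """Record the next stop along the discovery path."""
        self.breadcrumbs.append(stop)

    def pin_last(self):
        """Pin the most recent breadcrumb as a returnable end point."""
        self.pinned.add(len(self.breadcrumbs) - 1)

    def pinned_stops(self):
        return [self.breadcrumbs[i] for i in sorted(self.pinned)]
```

A user could then revisit any pinned breadcrumb without replaying the full path.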


The ultimate goal is to make search ubiquitous within the system's collaboration and analytical applications. Traditional BIRT applications have been designed to answer specific questions. Examples are a report that summarizes yesterday's production output in a manufacturing plant, or last month's sales orders for a company. These applications provide analytics against well-defined data structures and well-defined semantic models that describe the data. They cannot adapt on the fly to user interaction and questioning. Even the output that is rendered to the end user follows a specific pattern.


Using search in a new way, with the help of the model, the user experience for consuming information can be entirely new. A user will be able to ask a question of the system, and a set of results can be presented that will include analytics, Mashups, and documents. The data can be part of the system's model or may point to data in an external store (Document management system, database application, or ERP application, for example).


There are multiple implementations within the system that enable search. These implementations will allow specific linkages between the system's model artifacts/content and search results. These include the following four implementations:


First, all data within the system can be tagged to add context. This includes design time data (the model), as well as runtime data that is collected within the scope of the customer applications. Tags can be dynamic. For example, you may have changing lot numbers in a manufacturing line. You can collect time series data against the machines on the line. When different material lots are moving through the line, you can tag the time series data with the lot numbers, for easy retrieval, correlation and analysis later.
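The lot-number example above could be sketched as follows. The store and its methods are assumptions chosen for illustration; they stand in for the system's actual tagged time-series collection.

```python
# Illustrative sketch of dynamic tagging (implementation one): time series
# readings from machines on a line are tagged with the material lot number
# currently moving through, for easy retrieval and correlation later.
from collections import defaultdict

class TimeSeriesStore:
    def __init__(self):
        self._by_tag = defaultdict(list)  # tag -> list of tagged readings

    def record(self, machine, timestamp, value, tags=()):
        """Collect one reading and index it under each dynamic tag."""
        entry = {"machine": machine, "ts": timestamp, "value": value}
        for tag in tags:
            self._by_tag[tag].append(entry)

    def find_by_tag(self, tag):
        """Retrieve all readings tagged with, e.g., a lot number."""
        return self._by_tag[tag]
```

Retrieval by lot number then becomes a single lookup rather than a scan of the raw time series.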


Second, all text fields will be fully indexed by the search engine for reference. This includes model data, run time data, and human collaboration data such as from operator or maintenance logs.


Third, the model is based on a graph database, with explicit relationships defined as part of the model. Relationships can be parent-child or sibling. A refrigerated truck implements a Thing Template that represents two Thing Shapes, a truck and a refrigerated cargo carrier. A user can ask the model to search for all Things that implement the Refrigerated_Truck Thing Template, and get a list of specific trucks in return. Relationship terms can be user defined, such as Vessel1 “feeds” Vessel2. Relationships apply to both design time and run time data, because the run time data is collected against, and is hence related to, an entity defined in the model. There is always a relationship between data collection and entities.
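The refrigerated truck example could be sketched as follows. This is a simplified stand-in for the underlying graph database; the class and method names are assumptions, and only the template/relationship lookups described above are modeled.

```python
# Illustrative sketch of implementation three: Things implement Thing
# Templates composed of Thing Shapes, and user-defined relationship terms
# (e.g. Vessel1 "feeds" Vessel2) are stored as explicit edges.
class ThingModel:
    def __init__(self):
        self.implements = {}       # thing name -> template name
        self.template_shapes = {}  # template name -> set of shape names
        self.relations = []        # (subject, verb, object) edges

    def define_template(self, template, shapes):
        self.template_shapes[template] = set(shapes)

    def add_thing(self, thing, template):
        self.implements[thing] = template

    def relate(self, subject, verb, obj):
        """Record a user-defined relationship term between two entities."""
        self.relations.append((subject, verb, obj))

    def things_implementing(self, template):
        """Ask the model for all Things that implement a given template."""
        return [t for t, tpl in self.implements.items() if tpl == template]
```

Asking for all Things implementing the Refrigerated_Truck template then returns the list of specific trucks.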


Fourth, external data can be “crawled” and indexed to be included as part of the search results along with a pointer to the actual document. These indexes of external data can also be tagged, using vocabularies, to add context to the search in relation to a user's query. Each of these implementations can be leveraged to provide a new experience for the data consumer.


When a search is performed, faceted results will automatically offer analytical applications based on the search results. This is possible because of the knowledge the SQUEAL application will have of the model defined within the system. Examples are: (1) Time series charts for stream data; (2) Full analytical applications for Mashups; and (3) Heat Maps and physical location maps for geotagged data.
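The facet-to-visualization selection described above could be sketched as follows. The mapping table is an assumption built only from the three examples given; the real system derives this from the model defined within it.

```python
# Illustrative sketch: faceted search results automatically offer analytical
# applications. The mapping below is assumed from the examples in the text.
FACET_VISUALIZATIONS = {
    "stream": "time series chart",                    # stream data
    "mashup": "full analytical application",          # Mashups
    "geotagged": "heat map / physical location map",  # geotagged data
}

def visualizations_for(result_set_facets):
    """Return the visualization techniques applicable to a result set,
    based on the facets attached to it."""
    return [FACET_VISUALIZATIONS[f] for f in result_set_facets
            if f in FACET_VISUALIZATIONS]
```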


Collaboration results will be treated similarly to analytical applications. Collaboration Mashups will be presented for search results that point to: (1) Blogs; (2) Forums; and (3) Wikis.


Interactive chat sessions can be automatically established for anything within a search result set that has a facet of Machine/RAP, where RAP is the system's Remote Application Proxy connector to a machine or device that is capable of supporting chat functionality.


Using contextualized search based on the user role and the application that the user is in, including any selections the user may have made within the application, the search results can be directed to specific types of analysis. Combining all these elements into a single user experience is the definition of SQUEAL.


An important capability of SQUEAL will be the ability to simultaneously search across many servers/data stores. For example, a manufacturing company will typically have many locations. A server may be deployed at each location, or in some companies, on a regional basis. Each server will collect their own data. If a maintenance worker at one plant site searches for a solution to a specific machine problem at his site, he may wish to search across each site within the company. That will require a simultaneous search across many servers, with the results aggregated and shown as a single search result.
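The simultaneous multi-server search above could be sketched as follows, assuming each site's server exposes a `search` callable; both the callable and the server identifiers are hypothetical.

```python
# Illustrative sketch of federated search: query each site's server
# concurrently and aggregate the results into a single result list.
from concurrent.futures import ThreadPoolExecutor

def federated_search(servers, keywords, search):
    """servers: site identifiers; search(server, keywords) -> list of results.
    Returns the per-server results aggregated as one search result."""
    with ThreadPoolExecutor(max_workers=len(servers)) as pool:
        futures = [pool.submit(search, s, keywords) for s in servers]
        aggregated = []
        for f in futures:
            aggregated.extend(f.result())  # preserve submission order
    return aggregated
```

A maintenance worker's query at one plant site would thus fan out to every site's server and come back as a single aggregated result set.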


Combining search and analytical solutions is a unique approach to managing and gaining knowledge from the large amount of data flows that are the result of the Internet of Things (IoT). SQUEAL is a single tool rather than the traditional split of the query and analytical solutions available today.


Using Mashup tagging to add to search terms allows web pages, mini-web applications, and other HTML artifacts to be included in the results of a SQUEAL inquiry.


A user who is working within an application that supports the user workflow may also have search embedded within the application. The application may have specific terms embedded that are automatically appended to the search, allowing the application itself as designed by the content developer to add search context. For example, this allows a content developer to specify “maintenance” or “maintenance procedure” so that a search within the maintenance application has pre-defined context.
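The "maintenance" example could be sketched as follows; the function and its argument names are illustrative assumptions.

```python
# Illustrative sketch: terms the content developer embedded in the
# application (e.g. "maintenance procedure") are appended to the user's
# query so the search carries pre-defined application context.
def contextualize_query(user_terms, app_context_terms):
    """Append the application's embedded context terms, skipping duplicates."""
    combined = list(user_terms)
    for term in app_context_terms:
        if term not in combined:
            combined.append(term)
    return combined
```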


Using Mashup faceting to have specific facets as the result of a search is a new approach to the search user experience. For example, this allows the content developer within a Mashup to define a specific search facet, such as analytical trends, so that a search within that Mashup will always have a category of analytical trends in the results.


The user context can be added to the search context, so that the selections that the user has made along the way in the Mashups may be added to the search terms.


Embedded search capability within a Mashup, as opposed to standalone search pages, contributes to delivering search and analytical results in a new way to the user, within normal work flow applications.


The examples may also be embodied as a non-transitory computer readable medium having instructions stored thereon for one or more aspects of the technology as described and illustrated by way of the examples herein, which when executed by a processor (or configurable hardware), cause the processor to carry out the steps necessary to implement the methods of the examples, as described and illustrated herein.


An exemplary method for integrating semantic search, query, and analysis across heterogeneous data will now be described with reference to FIGS. 1 and 2A-2B. In step 205, the user of the client computing device 12 enters the credentials to login to an application executing on the client computing device 12, although the user may login to the executing application on the client computing device using any other methods. By way of example only, the executing application can be an HTML and/or JavaScript application running in the web browser of the client computing device 12, although the executing application can be any other application on the client computing device 12.


In step 210, the client computing device 12 verifies the entered user credentials with the information stored within the memory, although the client computing device 12 can verify the user credentials using any other means or methods. If the client computing device 12 successfully verifies the user credentials, a Yes branch is taken to step 220 to provide access to use the executing one or more applications; otherwise, a No branch to step 215 is taken to reject the login request and close the executing application on the client computing device 12.


In step 220, upon successful login of the user to the executing application on the client computing device 12, the client computing device 12 establishes a connection with the data management computing apparatus 14 via LAN 28 or WAN 30, although the client computing device 12 may establish a connection with the data management computing apparatus even before the successful login of the user to the executing application on the client computing device 12 using any other means. Additionally, the client computing device 12 sends the application information, including application name and application version, along with the user credentials to the data management computing apparatus 14.


In step 225, the data management computing apparatus 14 receives a request from an application executing in the requesting client computing device 12 for search, query, and analysis, although the data management computing apparatus 14 can receive any other types of requests in other manners from other devices or systems. Along with the request, the data management computing apparatus 14 receives at least a portion of a complete request from the requesting client computing device 12, although the data management computing apparatus 14 may receive the complete request from the requesting client computing device 12. By way of example only, the request is entered one character at a time in a help text field of the executing application in the client computing device 12, although the request could be entered in other manners, such as being entered by pasting in a complete word or phrase. In this step, as each character is entered the client computing device 12 transmits the entered character to the data management computing apparatus 14, although the portions or all of the characters in the request can be provided in other manners, such as when each word of a search phrase is entered by way of example only. In this particular example, the request from the client computing device 12 is a query requesting information within the executing application in the client computing device 12, although other types and numbers of requests could be entered.


In step 230, the data management computing apparatus 14 utilizes information stored in the memory 20 about previous frequently asked questions, search terms, and recent search results which include the entered character(s) to automatically assist in completion of the query, or can add context to the query prior to searching based on parameters such as the type of the executing application, the geographical location of the requesting client computing device 12, or the role of the user using the executing application on the requesting client computing device 12, although the data management computing apparatus 14 can assist at other times and use any other parameters to assist in completion of the request or in adding context to the user request. In this example, the data management computing apparatus 14 obtains the role of the user using the requesting client computing device 12 when the user logs in to at least one of the executing one or more applications, although the data management computing apparatus 14 can obtain any additional information. Additionally, the data management computing apparatus 14 also refines the query during its completion based on previous top searches and highly rated searches stored within the memory 20, although the data management computing apparatus 14 may refine the query based on any other parameters.
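Step 230's query completion could be sketched as follows. The history structure and the scoring scheme (frequency plus a role-match bonus) are assumptions made for illustration only.

```python
# Illustrative sketch of query completion: rank previously stored searches
# that begin with the characters entered so far, biased toward searches made
# by users in the same role. Scoring weights are assumed, not prescribed.
def suggest_completions(prefix, history, role=None, limit=3):
    """history: list of (query, hit_count, role) tuples for past searches."""
    matches = [
        (count + (10 if role and r == role else 0), q)
        for q, count, r in history
        if q.startswith(prefix.lower())
    ]
    # Highest-scoring (most frequent / role-matched) suggestions first.
    return [q for _, q in sorted(matches, reverse=True)[:limit]]
```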


In step 235, the data management computing apparatus 14 splits the received query into keywords by splitting the words in the received query separated by blank spaces, although the data management computing apparatus 14 may split the received query into keywords using any other techniques. Optionally, the data management computing apparatus 14 may also refer to a dictionary stored within the memory 20 while splitting the received query into keywords. Additionally, while splitting the received query into keywords, the data management computing apparatus 14 ignores any articles, such as a or the, and/or special characters, such as a comma, period, exclamation mark, semi-colon, or question mark, by way of example. Further, in this example, the data management computing apparatus 14 may choose to ignore numerical characters in the received query, although the data management computing apparatus 14 may consider the numerical characters while splitting the received query into keywords. By way of example only, if the received query is “What is the temperature of the machine?” then the data management computing apparatus 14 splits the received query into keywords such as temperature and machine, and ignores the question mark within the query.
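By way of illustration only, the keyword splitting of step 235 may be sketched as follows; the function name and the small stopword set are illustrative assumptions, not part of the claimed method.

```python
# Sketch of step 235: split a query on blank spaces, drop articles and
# special characters, and optionally ignore numerals (illustrative only).
import re

# Small example stopword set covering articles and a few function words.
ARTICLES = {"a", "an", "the", "what", "is", "of"}

def split_into_keywords(query, ignore_numbers=True):
    words = query.lower().split()
    keywords = []
    for word in words:
        word = re.sub(r"[,.;!?]", "", word)   # strip special characters
        if ignore_numbers:
            word = re.sub(r"\d+", "", word)   # drop numerical characters
        if word and word not in ARTICLES:
            keywords.append(word)
    return keywords

print(split_into_keywords("What is the temperature of the machine?"))
```

A dictionary stored in the memory 20, as mentioned above, could replace the fixed stopword set.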


In step 240, the data management computing apparatus 14 searches across indexes of heterogeneous data sets stored in the plurality of data servers 16 using the keywords formed in step 235 in real time to identify and obtain result sets associated with the keywords, although the data management computing apparatus 14 can search heterogeneous data sets using any other methods or techniques stored at any other memory location. By way of example only, the heterogeneous data includes structured data, unstructured data, third party indexes and time series data stored in the plurality of data servers 16, although other types and amounts of data at different locations can be searched. By searching across the indexes using the keywords, this technology provides quick and accurate searching across heterogeneous data, as the keywords are searched across the indexes as opposed to searching across the actual data, although the entire data set could be searched. The data management computing apparatus 14 searches across indexes of heterogeneous data sets to obtain result sets relating to the received request, although the data management computing apparatus 14 can search the heterogeneous data sets for any other purpose. By way of example only, the result sets include time series data with explicit values, unstructured data such as blog entries, forum discussions, and information present on pages of web sites, structured data results from third party systems such as a knowledge management system, and also data from transactional systems such as work order execution or production order details.
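By way of illustration only, the fan-out search of step 240 may be sketched as a keyword match applied to each heterogeneous index in turn, producing one result set per index; the index names and entries below are invented for illustration.

```python
# Sketch of step 240: search the keywords across several heterogeneous
# indexes and collect one result set per index (contents illustrative).
def search_indexes(keywords, indexes):
    """Return {index_name: [entries matching any keyword]}."""
    result_sets = {}
    for name, entries in indexes.items():
        hits = [e for e in entries
                if any(kw in e.lower() for kw in keywords)]
        result_sets[name] = hits
    return result_sets

indexes = {
    "time_series": ["machine temperature reading 72F", "vibration level 3"],
    "unstructured": ["blog: keeping machine temperature stable",
                     "forum: paint colors"],
    "structured": ["work order 881: replace temperature sensor"],
}
results = search_indexes(["temperature", "machine"], indexes)
print(results["unstructured"])
```

The sketch scans entries directly for brevity; as described above, the technology searches prebuilt indexes rather than the underlying data.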


In step 245, the data management computing apparatus 14 synchronously stores the searched indexes and the result sets associated with the searched indexes within the memory 20, although the data management computing apparatus 14 may store any other additional information associated with the searched indexes at any other memory location. Additionally, in this example, the data management computing apparatus 14 stores the searched indexes and the associated result sets with a time stamp within the memory 20. By way of example only, the data management computing apparatus 14 stores the searched indexes and the associated result sets in a table which maps the indexes to the associated result sets, although the data management computing apparatus 14 can store the searched indexes and the associated data in any other format.
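By way of illustration only, the table of step 245 may be sketched as an in-memory mapping from index name to a time-stamped result set; the class and method names are hypothetical.

```python
# Sketch of step 245: store each searched index with its result set and
# a time stamp in an in-memory table (illustrative only).
import time

class ResultCache:
    def __init__(self):
        self.table = {}  # maps index name -> (time stamp, result set)

    def store(self, index_name, result_set):
        self.table[index_name] = (time.time(), result_set)

    def lookup(self, index_name):
        return self.table.get(index_name)

cache = ResultCache()
cache.store("time_series", ["machine temperature reading 72F"])
stamp, stored = cache.lookup("time_series")
print(stored)
```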


In step 250, the data management computing apparatus 14 automatically adds facets for each of the result sets. By way of example only, the data management computing apparatus 14 adds the facets present in the plurality of data servers 16 based on a model-entity type, although the data management computing apparatus 14 can add the facets based on any other parameters stored at any other memory locations. In this technology, facets relate to properties of the information in the result set which are dynamically derived and added by analysis of the result set obtained in step 240, although facets can include additional information and support operations such as classification of each item of information in the result set along multiple explicit dimensions, which enables the information of the result set to be accessed and ordered in multiple ways rather than in a single, pre-determined, taxonomic order as done in existing technologies. By way of example only, facets include time series charts for stream data, full analytical trends for mash-ups, or heat maps or physical location maps for geo-tagged data. By automatically adding facets to the search results, the technology disclosed in this patent application provides benefits and advantages such as classifying information and finding data accurately. Additionally, as would be appreciated by a person having ordinary skill in the art, the model-entity type in this technology relates to an interfacing relationship between the result sets and the facets.
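By way of illustration only, the faceting of step 250 may be sketched as a lookup from model-entity type to facet labels attached to each result set; the mapping below is invented for illustration and is not the stored mapping of this technology.

```python
# Sketch of step 250: derive facets for each result set from its
# model-entity type (mapping invented for illustration).
FACETS_BY_ENTITY_TYPE = {
    "stream": ["time series chart"],
    "mashup": ["full analytical trend", "heat map"],
    "geo_tagged": ["physical location map"],
}

def add_facets(result_sets, entity_types):
    """Attach facet labels to each result set based on its entity type."""
    faceted = {}
    for name, entries in result_sets.items():
        entity_type = entity_types.get(name, "unknown")
        faceted[name] = {
            "results": entries,
            "facets": FACETS_BY_ENTITY_TYPE.get(entity_type, []),
        }
    return faceted

faceted = add_facets(
    {"sensor_readings": ["machine temperature reading 72F"]},
    {"sensor_readings": "stream"},
)
print(faceted["sensor_readings"]["facets"])
```

Classifying each result set along several such facet dimensions is what permits the multi-way ordering described above, rather than a single taxonomic order.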


In step 255, the data management computing apparatus 14 automatically suggests visualization techniques for the result sets and the facets based on the model-entity type, although the data management computing apparatus 14 can automatically suggest visualization techniques based on any other parameters. In this technology, visualization techniques relate to techniques of representing data as a web page view, although visualization techniques can include representing data in any other format suitable for convenient viewing of the data. By way of example only, visualization techniques include web page views, print views, and representing data as charts or graphs, although visualization techniques can include any other techniques of representing data. In this example, the data management computing apparatus 14 suggests the visualization techniques by referring to a table present within the memory 20. The table in the memory 20 includes the keywords, facets and their associated visualization techniques, although the table can include any other amounts of additional information.
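By way of illustration only, the table lookup of step 255 may be sketched as follows; the table contents and fallback choice are illustrative assumptions.

```python
# Sketch of step 255: suggest a visualization technique by looking up
# each facet in a stored table, falling back to a web page view
# (table contents invented for illustration).
VISUALIZATION_TABLE = {
    "time series chart": "graph",
    "heat map": "chart",
    "physical location map": "web page view",
}

def suggest_visualization(facets):
    """Return one suggested visualization technique per facet."""
    return {f: VISUALIZATION_TABLE.get(f, "web page view") for f in facets}

print(suggest_visualization(["time series chart", "full analytical trend"]))
```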


In step 260, the data management computing apparatus 14 renders the result sets as they are being searched. In this example, rendering relates to loading the searched result sets in the format in which they were searched and stored, although the result sets can be converted by the data management computing apparatus 14 to a standard format suitable to the executing application on the requesting client computing device 12. By way of example only, the formats can be a PDF, textual format or an image format, although other formats can be used.


In step 265, the data management computing apparatus 14 embeds the rendered result set with the associated facets and the visualization techniques within the work flow of the executing application on the requesting client computing device 12, although the data management computing apparatus 14 can output the rendered information to the requesting client computing devices using other techniques. Additionally, in this example, the data management computing apparatus 14 can also embed interactive chat functionality within the work flow of the executing application in the requesting client computing device 12. The interactive chat functionality could assist the user of the requesting client computing device 12 to interact with subject matter experts or other professionals to find any additional information relating to the received request, although the chat functionality could provide any additional assistance. By embedding the rendered result set with the associated facets and the visualization techniques within the work flow, the technology provided in this patent application provides advantages to the user of the requesting client computing device 12, who can view all the rendered data within the executing application as opposed to switching between multiple screens to view different data.


In step 270, the data management computing apparatus 14 determines if the rendered result set is selected by the requesting client computing device 12 for further viewing. If the data management computing apparatus 14 determines that the rendered information is selected by the requesting client computing device 12, then the Yes branch is taken to step 275 where this exemplary method ends. If the data management computing apparatus 14 determines that the rendered information is not selected, then the No branch is taken to step 280.


In step 280, the data management computing apparatus 14 provides one or more filters to the requesting client computing device 12 to further refine the rendered result set, although the data management computing apparatus 14 can assist in refining the search information using any other methods. The data management computing apparatus 14 provides the filters based on the executing application in the client computing device 12, the request received in step 225, or the user role, although the data management computing apparatus 14 can provide the filters based on other types and amounts of criteria or other parameters. The filters provided by the data management computing apparatus 14 may be used to refine the result set to make the search more accurate.


In step 285, the data management computing apparatus 14 determines if the requesting client computing device 12 has selected any of the provided filters. If the data management computing apparatus 14 determines that the requesting client computing device 12 has not selected one or more of the provided filters, then the No branch is taken to step 275 where this exemplary method ends.


If in step 285 the data management computing apparatus 14 determines that the requesting client computing device 12 has selected one or more of the provided filters, then the Yes branch is taken to step 290. In step 290, the data management computing apparatus 14 refines the search by further searching the indexes and the associated result sets stored in step 245 using the selected filters, although the data management computing apparatus 14 can refine the stored results using the updated keywords by any other techniques or methods. In another example, the data management computing apparatus 14 may perform a new search using the updated keywords, and the process may flow back to step 240.
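By way of illustration only, the refinement of step 290 may be sketched as applying a selected filter predicate to the stored result sets; the filter definition below is invented for illustration.

```python
# Sketch of step 290: refine previously stored result sets by applying
# a selected filter predicate (filter definition illustrative only).
def refine_results(result_sets, selected_filter):
    """Keep only entries in each stored result set that satisfy the filter."""
    return {
        name: [e for e in entries if selected_filter(e)]
        for name, entries in result_sets.items()
    }

stored = {
    "unstructured": [
        "blog: keeping machine temperature stable",
        "forum: paint colors",
    ],
}
refined = refine_results(stored, lambda e: "temperature" in e)
print(refined["unstructured"])
```

Filtering the stored result sets in this way avoids repeating the full index search of step 240, although, as noted above, a new search may be performed instead.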


Next, in step 292, the data management computing apparatus 14 renders the refined search results as illustrated in step 260. The refined search results include the result set and the associated facets and visualization techniques, although the refined search results may include any amount of additional information.


Further, in step 294, the data management computing apparatus 14 embeds the refined result set, the facets and the visualization techniques as illustrated in step 265, and the process ends in step 296.


Having thus described the basic concept of this technology, it will be rather apparent to those skilled in the art that the foregoing detailed disclosure is intended to be presented by way of example only, and is not limiting. Various alterations, improvements, and modifications will occur to those skilled in the art, though not expressly stated herein. These alterations, improvements, and modifications are intended to be suggested hereby, and are within the spirit and scope of this technology. Additionally, the recited order of processing elements or sequences, or the use of numbers, letters, or other designations therefor, is not intended to limit the claimed processes to any order except as may be specified in the claims. Accordingly, this technology is limited only by the following claims and equivalents thereto.

Claims
  • 1. A method for integrating semantic search, query, and analysis across data sets associated with a plurality of monitored physical assets, the method comprising: presenting, via a graphical user interface, an input field for a search engine that accesses, in a single search, a plurality of indexes generated from data sets associated with a plurality of monitored physical assets, wherein the search engine accesses, in the same single search, run-time data and design-time data of a model of a monitored physical asset of the plurality of monitored physical assets, wherein the design-time data comprises graph-database elements comprising one or more model templates used to define the monitored physical asset, wherein a model template includes one or more attributes of the monitored physical asset including properties associated with the monitored physical asset, and wherein the run-time data comprises specific properties for the monitored physical asset implemented from the one or more model templates; receiving, in the input field of the graphical user interface, one or more search keywords; in response to receiving the one or more search keywords, searching, by a computing device, across the plurality of indexes to determine i) one or more result sets for each of the plurality of indexes having matched portions of the one or more search keywords to the run-time data and ii) one or more result sets having matched portions of the one or more search keywords to the graph-database elements of the design-time data; and presenting, via the graphical user interface, i) graphical objects associated with the one or more result sets associated with each matched monitored physical asset of the plurality of monitored physical assets, results associated with one or more matched properties of the run-time data, and results associated with one or more matched properties of the design-time data, and ii) graphical objects associated with one or more filters to refine the searched results.
  • 2. The method of claim 1, further comprising, in response to receiving a filter of the one or more filters, applying, by the computing device, the received one or more filters to the obtained results sets to provide a corresponding refined results set; and presenting, via the graphical user interface, graphical objects associated with the refined result sets.
  • 3. The method of claim 1, wherein the one or more filters includes a search facet.
  • 4. The method of claim 3, wherein the search facet includes a category of analytic trends for analysis of the one or more result sets.
  • 5. The method of claim 3, wherein at least one of the search facets is selected from the group consisting of a time series chart, a full analytical trend chart, a Mashup, a heat map, and a physical location map.
  • 6. The method of claim 1, further comprising: identifying, by the computing device, an applicable visualization technique for each of the results sets based on a given property or a given model entity type with which each of the one or more results sets is associated, wherein the graphical objects associated with the one or more result sets are presented in accordance with the identified visualization techniques.
  • 7. The method of claim 6, wherein the identified visualization technique is selected from the group consisting of a web page view, a graph, a chart, and a printable view.
  • 8. The method of claim 1, wherein the obtained results sets are graphically presented, via a single application, on a computing device associated with a user.
  • 9. The method of claim 1, wherein the data sets are heterogeneous data sets selected from the group consisting of data sets from an Enterprise Resource Planning (ERP) system, data sets from a Manufacturing Intelligence system, and data sets from a Business Intelligence system.
  • 10. The method of claim 1, wherein the model template further comprises services and events associated with the given monitored physical asset, and wherein the search across the plurality of indexes includes a search to determine one or more result sets having matched portions of one or more services or events.
  • 11. The method of claim 1, wherein a set of the results includes analytics, Mashup, and documents.
  • 12. The method of claim 1, wherein the data sets include tagged data.
  • 13. The method of claim 1, wherein the data sets include structured data, unstructured data, and time series data.
  • 14. A system comprising: a processor; and a memory having instructions stored thereon, wherein execution of the instructions causes the processor to: present, via a graphical user interface, an input field for a search engine that accesses, in a single search, a plurality of indexes generated from data sets associated with a plurality of monitored physical assets, wherein the search engine accesses, in the same single search, run-time data and design-time data of a model of a monitored physical asset of the plurality of monitored physical assets, wherein the design-time data comprises graph-database elements comprising one or more model templates used to define the monitored physical asset, wherein a model template includes one or more attributes of the monitored physical asset including properties associated with the monitored physical asset, and wherein the run-time data comprises specific properties for the monitored physical asset implemented from the one or more model templates; receive, in the input field of the graphical user interface, one or more search keywords; in response to receiving the one or more search keywords, direct a search, by a computing device, across the plurality of indexes to determine i) one or more result sets for each of the plurality of indexes having matched portions of the one or more search keywords to the run-time data and ii) one or more result sets having matched portions of the one or more search keywords to the graph-database elements of the design-time data; and present, via the graphical user interface, i) graphical objects associated with the one or more result sets associated with each matched monitored physical asset of the plurality of monitored physical assets, results associated with one or more matched properties of the run-time data, and results associated with one or more matched properties of the design-time data, and ii) graphical objects associated with one or more filters to refine the searched results.
  • 15. The system of claim 14, wherein the instructions, when executed by the processor, further cause the processor to: in response to receiving a filter of the one or more filters, direct application by the computing device of the received one or more filters to the obtained results sets to provide a corresponding refined results set; and present, via the graphical user interface, graphical objects associated with the refined result sets.
  • 16. The system of claim 14, wherein the filters include a search facet.
  • 17. The system of claim 14, wherein the instructions, when executed by the processor, further cause the processor to: direct identification, by the computing device, of an applicable visualization technique for each of the results sets based on a given property or a given model entity type with which each of the results sets is associated, wherein the graphical objects associated with the one or more result sets are presented in accordance with the identified visualization techniques.
  • 18. The system of claim 17, wherein the identified visualization technique is selected from the group consisting of a web page view, a graph, a chart, and a printable view.
  • 19. The system of claim 14, wherein the model template further comprises services and events associated with the given monitored physical asset, and wherein the search across the plurality of indexes includes a search to determine one or more result sets having matched portions of one or more services or events.
  • 20. A non-transitory computer readable medium having instructions stored thereon, wherein execution of the instructions by a processor causes the processor to: present, via a graphical user interface, an input field for a search engine that accesses, in a single search, a plurality of indexes generated from data sets associated with a plurality of monitored physical assets, wherein the search engine accesses, in the same single search, run-time data and design-time data of a model of a monitored physical asset of the plurality of monitored physical assets, wherein the design-time data comprises graph-database elements comprising one or more model templates used to define the monitored physical asset, wherein a model template includes one or more attributes of the monitored physical asset including properties associated with the monitored physical asset, and wherein the run-time data comprises specific properties for the monitored physical asset implemented from the one or more model templates; receive, in the input field of the graphical user interface, one or more search keywords; in response to receiving the one or more search keywords, direct a search, by a computing device, across the plurality of indexes to determine i) one or more result sets for each of the plurality of indexes having matched portions of the one or more search keywords to the run-time data and ii) one or more result sets having matched portions of the one or more search keywords to the graph-database elements of the design-time data; and present, via the graphical user interface, i) graphical objects associated with the one or more result sets associated with each matched monitored physical asset of the plurality of monitored physical assets, results associated with one or more matched properties of the run-time data, and results associated with one or more matched properties of the design-time data, and ii) graphical objects associated with one or more filters to refine the searched results.
Parent Case Info

This is a continuation application of U.S. patent application Ser. No. 13/679,361, filed Nov. 16, 2012, which claims priority to, and the benefit of, U.S. Provisional Patent Application Ser. No. 61/560,369 filed Nov. 16, 2011, each of which is hereby incorporated by reference in its entirety.

20110307295 Steiert et al. Dec 2011 A1
20110307363 N et al. Dec 2011 A1
20110307405 Hammer et al. Dec 2011 A1
20110320525 Agarwal et al. Dec 2011 A1
20120005577 Chakra et al. Jan 2012 A1
20120059856 Kreibe et al. Mar 2012 A1
20120072435 Han Mar 2012 A1
20120072885 Taragin et al. Mar 2012 A1
20120078959 Cho et al. Mar 2012 A1
20120096429 Desari et al. Apr 2012 A1
20120117051 Liu et al. May 2012 A1
20120131473 Biron, III May 2012 A1
20120136649 Freising et al. May 2012 A1
20120143970 Hansen Jun 2012 A1
20120144370 Kemmler et al. Jun 2012 A1
20120150859 Hu Jun 2012 A1
20120158825 Ganser Jun 2012 A1
20120158914 Hansen Jun 2012 A1
20120166319 Deledda et al. Jun 2012 A1
20120167006 Tillert et al. Jun 2012 A1
20120173671 Callaghan et al. Jul 2012 A1
20120179905 Ackerly Jul 2012 A1
20120197488 Lee et al. Aug 2012 A1
20120197852 Dutta et al. Aug 2012 A1
20120197856 Banka et al. Aug 2012 A1
20120197898 Pandey et al. Aug 2012 A1
20120197911 Banka et al. Aug 2012 A1
20120239381 Heidasch Sep 2012 A1
20120239606 Heidasch Sep 2012 A1
20120254825 Sharma et al. Oct 2012 A1
20120259932 Kang et al. Oct 2012 A1
20120284259 Jehuda Nov 2012 A1
20120311501 Nonez et al. Dec 2012 A1
20120311526 DeAnna et al. Dec 2012 A1
20120311547 DeAnna et al. Dec 2012 A1
20120324066 Alam et al. Dec 2012 A1
20130006400 Caceres et al. Jan 2013 A1
20130036137 Ollis et al. Feb 2013 A1
20130054563 Heidasch Feb 2013 A1
20130060791 Szalwinski et al. Mar 2013 A1
20130067031 Shedrinsky Mar 2013 A1
20130067302 Chen et al. Mar 2013 A1
20130073969 Blank et al. Mar 2013 A1
20130080898 Lavian et al. Mar 2013 A1
20130110496 Heidasch May 2013 A1
20130110861 Roy et al. May 2013 A1
20130124505 Bullotta et al. May 2013 A1
20130124616 Bullotta et al. May 2013 A1
20130125053 Brunswig et al. May 2013 A1
20130132385 Bullotta et al. May 2013 A1
20130166563 Mueller et al. Jun 2013 A1
20130166568 Binkert et al. Jun 2013 A1
20130166569 Navas Jun 2013 A1
20130173062 Koenig-Richardson Jul 2013 A1
20130179565 Hart et al. Jul 2013 A1
20130185593 Taylor et al. Jul 2013 A1
20130185786 Dyer et al. Jul 2013 A1
20130191767 Peters et al. Jul 2013 A1
20130207980 Ankisettipalli et al. Aug 2013 A1
20130211555 Lawson et al. Aug 2013 A1
20130290441 Linden Levy Aug 2013 A1
20130246897 O'Donnell Sep 2013 A1
20130262641 Zur et al. Oct 2013 A1
20130275344 Heidasch Oct 2013 A1
20130275550 Lee et al. Oct 2013 A1
20130304581 Soroca et al. Nov 2013 A1
20140016455 Ruetschi et al. Jan 2014 A1
20140019432 Lunenfeld Jan 2014 A1
20140032531 Ravi et al. Jan 2014 A1
20140040286 Bane Feb 2014 A1
20140040433 Russell, Jr. et al. Feb 2014 A1
20140095211 Gloerstad et al. Apr 2014 A1
20140164358 Benzatti Jun 2014 A1
20140223334 Jensen et al. Aug 2014 A1
20140282370 Schaefer et al. Sep 2014 A1
20150007199 Valeva et al. Jan 2015 A1
20150058833 Venkata Naga Ravi Feb 2015 A1
20150271109 Bullotta et al. Sep 2015 A1
20150271229 Bullotta et al. Sep 2015 A1
20150271271 Bullotta et al. Sep 2015 A1
20150271295 Mahoney et al. Sep 2015 A1
20150271301 Mahoney et al. Sep 2015 A1
Foreign Referenced Citations (6)
Number Date Country
0497010 Aug 1992 EP
1187015 Mar 2002 EP
9921152 Apr 1999 WO
0077592 Dec 2000 WO
2008115995 Sep 2008 WO
2014145084 Sep 2014 WO
Non-Patent Literature Citations (5)
Entry
International Search Report and Written Opinion issued in related International Application No. PCT/US2015/021882 dated Jul. 30, 2015.
International Search Report and Written Opinion issued in related International Application No. PCT/US2015/021867 dated Jul. 31, 2015.
Hart Server, retrieved from a 2001 Internet Archive capture of hartcomm.org, http://www.hartcomm.org/server2/index.html, 13 pages (2001).
Ray, Learning XML, first edition, 277 pages (2001), part 1, pp. 1-146.
Ray, Learning XML, first edition, 277 pages (2001), part 2, pp. 147-277.
Shi, L. et al., Understanding Text Corpora with Multiple Facets, IEEE Symposium on Visual Analytics Science and Technology (VAST), 99-106 (2010).
Related Publications (1)
Number Date Country
20170242934 A1 Aug 2017 US
Provisional Applications (1)
Number Date Country
61560369 Nov 2011 US
Continuations (1)
Number Date Country
Parent 13679361 Nov 2012 US
Child 15400230 US