Field of the Invention
This invention pertains in general to data visualization on a computer and, in particular, to ways of representing data obtained from web pages and other sources.
Description of the Related Art
The World Wide Web and other information storage and retrieval systems contain a great deal of information. With the advent of search engines and other similar tools, it has become relatively easy for an end-user to locate particular information. For example, one can obtain a wealth of information about the atomic elements by simply searching for the terms “atomic elements” on the Web.
It is generally acknowledged that certain types of information are easier to comprehend when presented in some formats rather than in other formats. For example, an end-user might find it easier to understand the relationships between the atomic masses of the atomic elements if the masses are presented graphically instead of listed as numeric values. Similarly, trends in stock prices are easier to understand when the prices are presented on a graph rather than as a list of dates and prices.
Information on the web is not always in a format that is easy for the end-user to comprehend. A web site that describes the atomic elements might display the periodic table and provide information like the atomic weights, melting points, and densities of the elements, but the site is unlikely to provide a graph of, say, atomic weights versus melting points. As a result, a person who desires to view such a graph must manually copy the information into another program that provides graphing capabilities.
Some web sites, like sites providing stock prices and other financial information, provide limited amounts of dynamic content. For example, a person can interact with such sites to create and manipulate graphs showing stock prices over time. Even on these sites, however, the person is restricted to a limited set of graphing options. Further, there is no way for a person to graphically manipulate data contained on multiple web sites short of copying the desired information into a separate graphing program.
Therefore, there is a need in the art for a way to enable the end-user to organize and view structured information on web pages in a way that makes it easier to comprehend.
The above need is met by a system, method, and computer program product for graphically presenting facts. In one embodiment, a set of facts is established, where each fact has an attribute and a corresponding value. A graph type that facilitates understanding of the facts is determined, and the facts are displayed in a graph of the determined type.
The figures depict an embodiment of the present invention for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the invention described herein.
Document hosts 102 store documents and provide access to them. A document comprises any machine-readable data, including any combination of text, graphics, multimedia content, etc. A document may be encoded in a markup language, such as Hypertext Markup Language (HTML), i.e., a web page, in an interpreted language (e.g., JavaScript), or in any other computer-readable or executable format. A document can include one or more hyperlinks to other documents. A typical document will include one or more facts within its content. A document stored in a document host 102 may be located and/or identified by a Uniform Resource Locator (URL), or Web address, or any other appropriate form of identification and/or location. A document host 102 is implemented by a computer system, and typically includes a server adapted to communicate over the network 104 via networking protocols (e.g., TCP/IP), as well as application and presentation protocols (e.g., HTTP, HTML, SOAP, D-HTML, Java). The documents stored by a host 102 are typically held in a file directory, a database, or other data repository. A host 102 can be implemented in any computing device (e.g., from a PDA or personal computer, a workstation, mini-computer, or mainframe, to a cluster or grid of computers), as well as in any processor architecture or operating system.
Janitors 110 operate to process facts extracted by importer 108. This processing can include, but is not limited to, data cleansing, object merging, and fact induction. In one embodiment, there are a number of different janitors 110 that perform different types of data management operations on the facts. For example, one janitor 110 may traverse some set of facts in the repository 115 to find duplicate facts (that is, facts that convey the same factual information) and merge them. Another janitor 110 may normalize facts into standard formats. Another janitor 110 may remove unwanted facts from repository 115, such as facts related to pornographic content. Other types of janitors 110 may be implemented, depending on the types of data management functions desired, such as translation, compression, spelling or grammar correction, and the like.
Various janitors 110 act on facts to normalize attribute names and values, and to delete duplicate and near-duplicate facts so an object does not have redundant information. For example, we might find on one page that Britney Spears' birthday is “12/2/1981” while on another page that her date of birth is “Dec. 2, 1981.” Birthday and Date of Birth might both be rewritten as Birthdate by one janitor, and then another janitor might notice that 12/2/1981 and Dec. 2, 1981 are different forms of the same date. It would choose the preferred form, remove the other fact, and combine the source lists for the two facts. As a result, when you look at the source pages for this fact, on some you'll find an exact match of the fact and on others text that is considered to be synonymous with the fact.
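The two janitor passes just described can be sketched as follows (in TypeScript, in keeping with the JavaScript-based embodiment described below); the simplified Fact shape, the ATTRIBUTE_SYNONYMS table, and the canonicalDate helper are merely illustrative assumptions, not elements of any particular embodiment:

```typescript
// A minimal sketch of two janitor passes over a simplified fact shape.
interface Fact {
  factId: string;
  objectId: string;
  attribute: string;
  value: string;
  sources: string[]; // URLs of the pages the fact was extracted from
}

// Janitor 1: rewrite synonymous attribute names to a canonical form.
const ATTRIBUTE_SYNONYMS: Record<string, string> = {
  "birthday": "birthdate",
  "date of birth": "birthdate",
};

function normalizeAttributes(facts: Fact[]): Fact[] {
  return facts.map((f) => ({
    ...f,
    attribute: ATTRIBUTE_SYNONYMS[f.attribute.toLowerCase()] ?? f.attribute,
  }));
}

// Janitor 2: merge facts that express the same information in different
// forms, keeping one preferred form and combining the source lists.
function canonicalDate(value: string): string {
  const t = Date.parse(value.replace(/\./g, "")); // tolerate "Dec. 2, 1981"
  return Number.isNaN(t) ? value : new Date(t).toISOString().slice(0, 10);
}

function mergeDuplicates(facts: Fact[]): Fact[] {
  const merged = new Map<string, Fact>();
  for (const f of facts) {
    const key = `${f.objectId}|${f.attribute}|${canonicalDate(f.value)}`;
    const existing = merged.get(key);
    if (existing) {
      existing.sources.push(...f.sources); // combine the source lists
    } else {
      merged.set(key, { ...f, sources: [...f.sources] });
    }
  }
  return [...merged.values()];
}
```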
Build engine 112 builds and manages the repository 115. Service engine 114 is an interface for querying the repository 115. Service engine 114's main function is to process queries, score matching objects, and return them to the caller, but it is also used by janitors 110.
Repository 115 stores factual information extracted from a plurality of documents that are located on document hosts 102. A document from which a particular fact may be extracted is a source document (or “source”) of that particular fact. In other words, a source of a fact includes that fact (or a synonymous fact) within its contents.
Repository 115 contains one or more facts. In one embodiment, each fact is associated with exactly one object. One implementation for this association includes in each fact an object ID that uniquely identifies the object of the association. In this manner, any number of facts may be associated with an individual object by including the object ID for that object in the facts. In one embodiment, objects themselves are not physically stored in the repository 115, but rather are defined by the set or group of facts with the same associated object ID, as described below. Further details about facts in repository 115 are described below.
It should be appreciated that in practice at least some of the components of the data processing system 106 will be distributed over multiple computers, communicating over a network. For example, repository 115 may be deployed over multiple servers. As another example, the janitors 110 may be located on any number of different computers. For convenience of explanation, however, the components of the data processing system 106 are discussed as though they were implemented on a single computer.
In another embodiment, some or all of document hosts 102 are located on data processing system 106 instead of being coupled to data processing system 106 by a network. For example, importer 108 may import facts from a database that is a part of or associated with data processing system 106.
As described above, each fact is associated with an object ID 209 that identifies the object that the fact describes. Thus, each fact that is associated with the same entity (such as George Washington) will have the same object ID 209. In one embodiment, objects are not stored as separate data entities in memory. In this embodiment, the facts associated with an object contain the same object ID, but no physical object exists. In another embodiment, objects are stored as data entities in memory, and include references (for example, pointers or IDs) to the facts associated with the object. The logical data structure of a fact can take various forms; in general, a fact is represented by a tuple that includes a fact ID, an attribute, a value, and an object ID. The storage implementation of a fact can be in any underlying physical data structure.
Each fact 204 also may include one or more metrics 218. A metric provides an indication of some quality of the fact. In some embodiments, the metrics include a confidence level and an importance level. The confidence level indicates the likelihood that the fact is correct. The importance level indicates the relevance of the fact to the object, compared to other facts for the same object. The importance level may optionally be viewed as a measure of how vital a fact is to an understanding of the entity or concept represented by the object.
Each fact 204 includes a list of one or more sources 220 that include the fact and from which the fact was extracted. Each source may be identified by a Uniform Resource Locator (URL), or Web address, or any other appropriate form of identification and/or location, such as a unique document identifier.
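Combining the fields described above, the logical structure of a fact might be sketched as follows; the field names and types are merely exemplary, and the physical storage can differ:

```typescript
// An exemplary logical representation of a fact and its metrics.
interface Metrics {
  confidence: number; // likelihood that the fact is correct (e.g., 0..1)
  importance: number; // relevance of the fact to its object
}

interface Fact {
  factId: string;    // unique identifier for this fact
  objectId: string;  // identifies the object the fact describes
  attribute: string; // e.g., "atomic weight"
  value: string;     // e.g., "227"
  metrics: Metrics;
  sources: string[]; // URLs or other identifiers of the source documents
}

// An object need not exist as a physical entity: it is simply the set of
// facts sharing the same object ID.
function factsForObject(repository: Fact[], objectId: string): Fact[] {
  return repository.filter((f) => f.objectId === objectId);
}
```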
Some embodiments include one or more specialized facts, such as a name fact 207 and a property fact 208. A name fact 207 is a fact that conveys a name for the entity or concept represented by the object ID. A name fact 207 includes an attribute 224 of “name” and a value, which is the name of the object. For example, for an object representing the country Spain, a name fact would have the value “Spain.” A name fact 207, being a special instance of a general fact 204, includes the same fields as any other fact 204; it has an attribute, a value, a fact ID, metrics, sources, etc. The attribute 224 of a name fact 207 indicates that the fact is a name fact, and the value is the actual name. The name may be a string of characters. An object ID may have one or more associated name facts, as many entities or concepts can have more than one name. For example, an object ID representing Spain may have associated name facts conveying the country's common name “Spain” and the official name “Kingdom of Spain.” As another example, an object ID representing the U.S. Patent and Trademark Office may have associated name facts conveying the agency's acronyms “PTO” and “USPTO” as well as the official name “United States Patent and Trademark Office.” If an object does have more than one associated name fact, one of the name facts may be designated as a primary name and other name facts may be designated as secondary names, either implicitly or explicitly.
A property fact 208 is a fact that conveys a statement about the entity or concept represented by the object ID. Property facts are generally used for summary information about an object. A property fact 208, being a special instance of a general fact 204, also includes the same parameters (such as attribute, value, fact ID, etc.) as other facts 204. The attribute field 226 of a property fact 208 indicates that the fact is a property fact (e.g., the attribute is “property”) and the value is a string of text that conveys the statement of interest. For example, for the object ID representing Bill Clinton, the value of a property fact may be the text string “Bill Clinton was the 42nd President of the United States from 1993 to 2001.” Some object IDs may have one or more associated property facts while other objects may have no associated property facts.
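As a merely exemplary illustration, a name fact and a property fact can be written as instances of the Fact shape sketched above; the IDs, metric values, and source URLs below are invented:

```typescript
// A name fact: the attribute marks it as a name, the value is the name itself.
const nameFact: Fact = {
  factId: "fact-207",
  objectId: "object-spain",
  attribute: "name",
  value: "Kingdom of Spain",
  metrics: { confidence: 0.98, importance: 0.9 },
  sources: ["http://example.org/spain"], // hypothetical source page
};

// A property fact: a summary statement about the object.
const propertyFact: Fact = {
  factId: "fact-208",
  objectId: "object-bill-clinton",
  attribute: "property",
  value:
    "Bill Clinton was the 42nd President of the United States from 1993 to 2001.",
  metrics: { confidence: 0.95, importance: 0.8 },
  sources: ["http://example.org/clinton"], // hypothetical source page
};
```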
As described previously, a collection of facts is associated with an object ID of an object. An object may become a null or empty object when facts are disassociated from the object. A null object can arise in a number of different ways. One type of null object is an object that has had all of its facts (including name facts) removed, leaving no facts associated with its object ID. Another type of null object is an object that has all of its associated facts other than name facts removed, leaving only its name fact(s). Alternatively, the object may be a null object only if all of its associated name facts are removed. A null object represents an entity or concept for which the data processing system 106 has no factual information and, as far as the data processing system 106 is concerned, does not exist. In some embodiments, facts of a null object may be left in the repository 115, but have their object ID values cleared (or have their importance set to a negative value). However, the facts of the null object are treated as if they were removed from the repository 115. In some other embodiments, facts of null objects are physically removed from repository 115.
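These null-object rules can be sketched as follows, again reusing the exemplary Fact shape from above; the helper names and the flag selecting between the variants are illustrative:

```typescript
function isNameFact(f: Fact): boolean {
  return f.attribute === "name";
}

// An object is null when no facts remain; under one variant, an object whose
// only remaining facts are name facts is also treated as null.
function isNullObject(facts: Fact[], nameOnlyIsNull = true): boolean {
  if (facts.length === 0) return true;
  return nameOnlyIsNull && facts.every(isNameFact);
}

// In some embodiments the facts of a null object stay in the repository but
// have their object ID values cleared, so they are treated as removed.
function clearNullObject(facts: Fact[]): Fact[] {
  return facts.map((f) => ({ ...f, objectId: "" }));
}
```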
A presentation engine 300 presents objects and facts in a customizable manner. That is, the presentation engine 300 provides flexible tools that enable an end-user to view the facts of one or more objects in ways desired by the end-user. The presentation engine 300 thus allows the end-user to view information in a way that is more comprehensible to the end-user.
In one embodiment, the modules of the presentation engine 300 are implemented as a JavaScript program that executes on client devices such as personal computers, cellular telephones, personal digital assistants (PDAs), etc. The JavaScript program interfaces with the data processing system 106 to access data and functionalities provided by it. The JavaScript program itself is controlled by a web browser, operating system, or other entity executing on the client device. In other embodiments, the presentation engine 300 is implemented using different coding techniques, relies solely on client-side data, and/or executes on the server.
An object access module 310 receives objects from the fact repository 115 and/or another source. In one embodiment, the objects are received in response to end-user interactions with a search engine or other object requestor 152 that provides access to the fact repository 115. For example, an end-user can utilize a search engine to search the fact repository 115 for objects matching a particular query. The object requestor 152 returns a set of matching objects. A user interface provided by the object access module 310 and/or another module allows the end-user to select particular objects and designate the objects for further analysis. In this manner, the end-user can execute multiple different searches on the fact repository 115 and designate objects from the different searches for further analysis.
For example, assume that the fact repository 115 stores a set of objects of type “atomic element,” including at least one object for each of the atomic elements. Also assume that the end-user executes a query for objects that match the phrase “atomic elements.” The results of this query include all of the objects of type “atomic element” and may also include other objects, depending upon the information in the objects and/or the search algorithms. The end-user can select one or more of these objects and designate the selected objects for further analysis. For purposes of this example, assume that the end-user designates one object for each atomic element.
In another embodiment, the object access module 310 receives the objects from the end-user. For example, the end-user can supply user defined objects for use with the presentation engine 300. Further, the end-user can encounter objects on the network 104 at locations other than the fact repository 115, such as on web pages not associated with the repository, and cause the object access module 310 to supply those objects to the presentation engine 300.
In one embodiment, a collection module 312 receives the objects designated for further analysis by the end-user. The collection module 312 stores multiple collections, and each collection stores zero or more objects. In one embodiment, the end-user specifies the collection in which the collection module 312 stores the designated objects. In other embodiments, the collection module 312 stores designated objects in a default collection if the end-user does not specify a particular collection. The collection module 312 provides an interface allowing the end-user to manipulate the collections and the objects within the collections. For example, the end-user can view the objects within different collections, and add and remove objects.
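A minimal sketch of this collection behavior follows; the class and method names are illustrative and not drawn from any particular embodiment:

```typescript
// Exemplary reference to an object held in a collection.
interface ObjRef {
  objectId: string;
  name: string;
}

class CollectionModule {
  private collections = new Map<string, ObjRef[]>();

  // Store designated objects in a named collection, or in a default one.
  add(objects: ObjRef[], collection = "default"): void {
    const list = this.collections.get(collection) ?? [];
    for (const o of objects) {
      if (!list.some((x) => x.objectId === o.objectId)) list.push(o);
    }
    this.collections.set(collection, list);
  }

  // Remove a single object from a collection.
  remove(objectId: string, collection = "default"): void {
    const list = this.collections.get(collection) ?? [];
    this.collections.set(
      collection,
      list.filter((o) => o.objectId !== objectId),
    );
  }

  // View the objects within a collection.
  view(collection = "default"): ObjRef[] {
    return [...(this.collections.get(collection) ?? [])];
  }
}
```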
Although most of this discussion focuses on a collection containing objects for the atomic elements, a collection can hold arbitrary and heterogeneous objects. For example, a collection can contain objects for the atomic elements, the actor M. Emmet Walsh, and the country China. Some heterogeneous objects may have attributes in common, while other objects might not have any common attributes.
A storage module 314 stores the collections and/or other data utilized by the presentation engine 300 for its operation. The storage module 314 acts as a place where other modules in the presentation engine 300 can store and retrieve information. In one embodiment, the storage module 314 is a logical construct that uses storage allocated from virtual memory on the client device on which the presentation engine 300 is executing. In other embodiments, some or all of the storage provided by the storage module 314 is located on a server connected to the client via the network 104. For example, the collection module 312 can store collections in the storage module 314. The storage module 314 stores the data describing the collections and references to the objects within the collections in a memory on the client. However, the objects themselves are stored on a server accessible via the network 104 and retrieved from the server when necessary or desired.
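The split between client-side collection data and server-side objects might be sketched as follows; the endpoint URL and response shape are assumptions made only for illustration:

```typescript
// Exemplary storage module: collection metadata lives in client memory,
// while objects are fetched from a server and cached on demand.
class StorageModule {
  private collectionRefs = new Map<string, string[]>(); // collection -> object IDs
  private objectCache = new Map<string, unknown>();

  saveCollection(name: string, objectIds: string[]): void {
    this.collectionRefs.set(name, [...objectIds]);
  }

  async getObject(objectId: string): Promise<unknown> {
    if (!this.objectCache.has(objectId)) {
      // Hypothetical repository endpoint; retrieved only when needed.
      const res = await fetch(`https://repository.example/objects/${objectId}`);
      this.objectCache.set(objectId, await res.json());
    }
    return this.objectCache.get(objectId);
  }
}
```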
A user interface (UI) generation module 316 generates a UI for the end-user. Generally, the UI allows the end-user to view and manipulate the objects and the facts within the objects. In addition, the UI allows the end-user to control other aspects of the presentation engine 300, such as executing a search for new objects and designating objects for storage in a collection. In one embodiment, the UI is displayed on a display device of the client and contains buttons, list boxes, text boxes, images, hyperlinks, and/or other tools with which the end-user can control the presentation engine.
In one embodiment, the UI generation module 316 includes other modules for generating specific UI elements. These modules include a tabular presentation module 318 that generates UI elements displaying objects and facts in tables. Further, the UI generation module 316 includes a graphical presentation module 320 for displaying objects and facts in graphs. Additionally, the UI generation module includes a map presentation module 322 that displays objects and facts in combination with maps. The UI elements provided by these modules are described in more detail below.
A scoring module 324 evaluates objects, attributes, and values and outputs corresponding scores. These scores can be used to produce ranked lists of objects and/or facts. The factors utilized for the evaluations and scores depend upon the embodiment and the objects, attributes, and/or values being evaluated. In one embodiment, the presentation engine 300 uses the scores produced by the scoring module 324 to establish facts to show in certain UIs. The operation of the scoring module 324 is described in more detail below.
The UI 400 includes a search text input box 410 into which the end-user can enter a search query.
A results area 414 below the search text input box 410 displays objects that match the search query. In the illustrated embodiment, the results area 414 displays the names of the objects and a few facts about each object. The UI 400 includes a scroll bar 416 with which the end-user can view objects that are lower on the page.
In the illustrated UI 400, the results area 414 shows the objects for the elements Actinium, Polonium, and Chromium. The names 418 of the elements are displayed, along with images 420 that illustrate the respective elements. Beneath each name 418 is a set of facts 422 about the element. The illustrated UI shows four facts 422 about each element, although other embodiments can show fewer or more facts. For example, for the element Actinium the displayed facts 422 are:
Obtained From: extremely rare
Uses: No uses known
Discoverer: Andre Debierne
Color: Silvery
In addition, each displayed fact includes a URL that links to the source page from which the fact was derived.
The initial facts displayed for an object are determined based on scores produced by the scoring module 324. The scoring module 324 calculates scores for the facts of the object based on the metrics (e.g., the importance and confidence levels) and outputs a ranked list. Facts having high importance and confidence are generally ranked higher than facts having low importance and/or confidence. In one embodiment, the rankings produced by the scoring module 324 are influenced by the search query utilized to produce the list of objects. The scoring module 324 can increase the importance component of a fact if that fact matches a search term. For example, a search query for “atomic element weight” will produce an increased importance for the “atomic weight” fact of an object, and likely result in the atomic weight fact being ranked high enough to appear on the UI 400.
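This ranking can be sketched as follows, reusing the exemplary Fact shape from above; the boost factor and the exact scoring formula are illustrative assumptions:

```typescript
// Score a fact from its metrics, boosting importance on a query match.
function scoreFact(fact: Fact, queryTerms: string[]): number {
  let importance = fact.metrics.importance;
  const text = `${fact.attribute} ${fact.value}`.toLowerCase();
  if (queryTerms.some((t) => text.includes(t.toLowerCase()))) {
    importance *= 1.5; // illustrative boost factor
  }
  return importance * fact.metrics.confidence;
}

// Facts with high importance and confidence rank above low-scoring facts.
function rankFacts(facts: Fact[], queryTerms: string[]): Fact[] {
  return [...facts].sort(
    (a, b) => scoreFact(b, queryTerms) - scoreFact(a, queryTerms),
  );
}
```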
Each displayed object 418 includes a button 424 that the end-user can select to save the object to a collection. In addition, other buttons 426 on the UI 400 provide the end-user with the options of saving all search results to a collection or saving only visible results to the collection. The collection 428 itself is displayed in a column on the left side of the UI 400. The names of the objects 430 in the collection are displayed, along with a check box 432 with which the end-user can remove the objects from the collection 428.
The illustrated UI 400 includes a “table” button 434 near the collection display. When this button 434 is selected by the end-user, the UI 400 presents a tabular view of the objects in the collection. This tabular view is produced by the tabular presentation module 318. Other embodiments of the presentation engine 300 provide other ways to access the table functionality.
The tabular presentation module 318 uses the scoring module 324 to select the initial attributes for the right-side columns 514 of the table 510. In one embodiment, the scoring module 324 identifies the most common attributes of the objects in the collection 428, and produces a list of the attributes sorted by commonality. Thus, an attribute shared by all of the objects in the collection 428 is ranked first in the list, while an attribute possessed by only one of the objects is ranked last. In one embodiment, the metrics associated with the attributes' facts are used to break ties. The tabular presentation module 318 shows the highest ranked attributes in the initial columns 514 of the table 510.
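This column selection might be sketched as follows; treating average confidence as the tie-breaking metric is one plausible reading of the embodiment above and is merely exemplary:

```typescript
// Rank attributes by how many objects in the collection possess them,
// breaking ties by the average confidence of the attributes' facts.
function rankAttributes(objectFacts: Fact[][]): string[] {
  const stats = new Map<string, { objects: number; conf: number; n: number }>();
  for (const facts of objectFacts) {
    for (const attr of new Set(facts.map((f) => f.attribute))) {
      const s = stats.get(attr) ?? { objects: 0, conf: 0, n: 0 };
      s.objects += 1;
      stats.set(attr, s);
    }
    for (const f of facts) {
      const s = stats.get(f.attribute)!;
      s.conf += f.metrics.confidence;
      s.n += 1;
    }
  }
  return [...stats.entries()]
    .sort(
      ([, a], [, b]) => b.objects - a.objects || b.conf / b.n - a.conf / a.n,
    )
    .map(([attr]) => attr);
}
```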
A drop down selection box 516 displayed near the top of the table 510 allows the end-user to select additional facts to include in the table. In one embodiment, the selection box 516 contains every unique attribute from the objects in the collection 428. The end-user can select one of the attributes, and in response the UI 500 adds a column to the table showing the values of the objects for the selected attribute. In one embodiment, the new column is made the rightmost column of the table 510. If an object lacks the attribute displayed in the table 510, one embodiment displays a blank space, text indicating that the attribute is not applicable, or another indicator that the object lacks the attribute.
The columns 514 of the table 510 displaying facts contain additional buttons allowing the end-user to manipulate the table. In one embodiment, each column contains buttons 518 labeled “Remove,” “Sort,” and “Graph.” The “Remove” button, when selected, removes the specified column from the table 510. If the “Sort” button is selected, the table is resorted based on the values of the attribute represented by the specified column. For example, choosing the “Sort” button of the “Density” column causes the table 510 to resort the atomic elements based on their densities.
When the “Graph” button is selected by the end-user, the UI 500 presents a graphical view of the objects in the collection. This graphical view is produced by the graphical presentation module 320. Specifically, the UI 500 presents a graph showing the objects' values for the attribute of the column selected by the end-user.
The graphical presentation module 320 determines the type of graph that best facilitates interpretation of the facts by the end-user and automatically generates a graph of that type. In one embodiment, graphical presentation module 320 considers the type of facts being graphed in determining the initial graph type. One consideration is the format of the value for the fact being graphed, e.g., whether the value is represented by a number, text, graphic, sound, etc. Another consideration is the meaning of the fact, e.g., is it a date, name, location, quantity, temperature, etc. In one embodiment, the graphical presentation module 320 determines the meaning of a fact from data within the object. For example, the meaning can be explicitly specified by meta-data within the object or can be derived from the name of the attribute associated with the fact.
The specific types of graphs that the graphical presentation module 320 produces can vary depending upon the embodiment and/or the values being graphed. In one embodiment, the module 320 graphs facts that are dates on a timeline. A timeline facilitates the interpretation of dates such as dates of births, deaths, disasters, discoveries and the like.
In one embodiment, the graphical presentation module 320 uses maps to graph facts that are locations because maps represent a good way to facilitate the understanding of such facts. The graphical presentation module 320 interacts with a map presentation module 322 to produce the maps. In one embodiment, the map presentation module 322 holds maps of geographic regions. The facts of objects specify geographic locations, such as countries, places of births and death, etc. either explicitly (e.g., latitude/longitude) or implicitly (e.g., by name). The graphical presentation module 320 uses the maps and facts to produce maps that illustrate the locations specified by the facts. For example, an end-user can execute a search on “volcanoes” and create a collection that stores objects representing volcanoes. Each of these volcano objects includes a fact that describes the volcano's location. The end-user can use the graphical presentation module 320 and map presentation module 322 to create a map that shows the locations of the volcanoes using push pins or other icons. Similarly, the end-user can create a collection of NBA basketball players, and then create a map showing the place of birth of each one.
Other embodiments of the graphical presentation module 320 graph other types of facts using other specialized graphs. In one embodiment, if the values of the attributes are numeric and are not of a type having a specialized graph, the graphical presentation module 320 produces a bar graph showing the value of the attribute for each object. If the values are not numeric, the graphical presentation module 320 produces a histogram grouping objects having the same value for the attribute. In either case, the graph shows the attribute along the X-axis. In one embodiment, the graphical presentation module 320 shows the graph as a scatter plot if two attributes are being graphed simultaneously.
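Putting these rules together, the graph-type decision might be sketched as follows; the attribute-name heuristics for inferring a fact's meaning are illustrative assumptions:

```typescript
type GraphType = "timeline" | "map" | "bar" | "histogram" | "scatter";

// Infer the meaning of a fact from its attribute name (one embodiment also
// allows meta-data within the object to specify the meaning explicitly).
function attributeMeaning(attribute: string): "date" | "location" | "other" {
  const a = attribute.toLowerCase();
  if (/(date|birth|death)/.test(a)) return "date";
  if (/(location|place|country|latitude|longitude)/.test(a)) return "location";
  return "other";
}

function chooseGraphType(attributes: string[], values: string[]): GraphType {
  if (attributes.length === 2) return "scatter"; // two attributes at once
  const meaning = attributeMeaning(attributes[0]);
  if (meaning === "date") return "timeline";
  if (meaning === "location") return "map";
  const numeric = values.every((v) => !Number.isNaN(parseFloat(v)));
  return numeric ? "bar" : "histogram"; // attribute along the X-axis
}
```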
The UI 500 displays the resulting graph 610 along with the table 510.
The end-user can use the “Graph” buttons to add or remove attributes from the graph 610. When an attribute is shown in the graph 610, its corresponding “Graph” button becomes a “No Graph” button. The end-user selects the “No Graph” button to remove the attribute from the graph. If an attribute shown in the table 510 is not displayed on the graph 610, the end-user can select that attribute's “Graph” button and cause the attribute to be graphed.
In one embodiment, the end-user can obtain additional information about a graphed object by using a mouse or other pointing device to move a cursor over the graphical representation of the object. For example, if the end-user moves the cursor over a particular bar of a bar graph, the graphical presentation module 320 displays additional facts about the object. Similarly, the end-user can move the cursor over a push pin icon on a map to view additional facts about the object having the location identified by the push pin.
Initially, the presentation engine 300 establishes 910 a collection of objects. This collection is typically established when an end-user interacts with the presentation engine 300, a search engine, and/or other entities to view a list of objects. The end-user selects certain objects and designates them for further analysis. The presentation engine 300 stores the designated objects in a collection or other logical grouping.
The end-user interacts with the presentation engine 300 to generate 912 a visual representation of the designated objects. For example, the end-user can use a UI presented by the presentation engine 300 to generate a table showing facts of the objects. Likewise, the end-user can cause the presentation engine 300 to generate graphs, timelines, and maps showing facts of the objects. Further, the end-user can manipulate 914 the visual representation by adding columns to a table, sorting the values shown by a graph, moving a cursor over a push pin icon on a map, etc. Using the presentation engine 300, the end-user can thus view the facts of the objects in a manner that is beneficial to the end-user, rather than being forced to use the presentation selected by the entity that initially provided the objects and/or facts.
The above description is included to illustrate the operation of the preferred embodiments and is not meant to limit the scope of the invention. The scope of the invention is to be limited only by the following claims. From the above discussion, many variations will be apparent to one skilled in the relevant art that would yet be encompassed by the spirit and scope of the invention.
This application is a continuation of U.S. application Ser. No. 11/342,277, filed Jan. 27, 2006, and entitled “Data Object Visualization Using Graphs,” incorporated herein by reference in its entirety. This application is related to the following U.S. Applications, all of which are incorporated by reference herein: U.S. application Ser. No. 11/342,290, entitled “Data Object Visualization”, filed on Jan. 27, 2006; U.S. application Ser. No. 11/342,293, entitled “Data Object Visualization Using Maps”, filed on Jan. 27, 2006; U.S. application Ser. No. 11/341,069, entitled “Object Categorization for Information Extraction”, filed on Jan. 27, 2006; U.S. application Ser. No. 11/341,907, entitled “Designating Data Objects for Analysis”, filed concurrently on Jan. 27, 2006.