Method and apparatus for contextual search and query refinement on consumer electronics devices

Information

  • Patent Grant
  • Patent Number
    8,935,269
  • Date Filed
    Monday, December 4, 2006
  • Date Issued
    Tuesday, January 13, 2015
Abstract
A method and a system for searching for information using an electronic device, such as a consumer electronic device, that can be connected to a network. Such searching for information involves determining a context for a search for information, forming a search query based on that context, and performing a contextual search based on the search query. Performing the contextual search further includes performing a query refinement.
Description
FIELD OF THE INVENTION

The present invention relates to contextual searches, and in particular, to search and query refinement on consumer electronic (CE) devices.


BACKGROUND OF THE INVENTION

The Internet (Web) has become a major source of information on virtually every conceivable topic. The easy accessibility of such vast amounts of information is unprecedented. In the past, someone seeking even the most basic information related to a topic was required to refer to a book or visit a library, spending many hours without a guarantee of success. However, with the advent of computers and the Internet, an individual can obtain virtually any information with a few keystrokes.


A consumer electronic (CE) device can be enriched by enabling the device to seamlessly obtain related information from the Internet while the user enjoys the content available at home. However, at times, finding the right piece of information on the Internet can be difficult. The complexity of natural language, with characteristics such as polysemy, makes retrieving the proper information a non-trivial task. The same word, when used in different contexts, can imply completely different meanings. For example, the word “sting” may mean a bee sting in entomology, an undercover operation in a spy novel, or the name of an artist in a musical context. In the absence of any information about the context, it is difficult to obtain the proper results.


The traditional searching approach on a personal computer (PC) has been for a user to form an initial query and then iteratively refine the query depending upon the kind of results obtained from the initial query. There are several problems with applying the PC approach to a CE device. First, a CE device would require a keyboard for a user to repeatedly enter queries and refinements to find the proper results. Further, searching is an involved process that imposes a cognitive load. A consumer using a CE device to listen to her favorite music may not be inclined to find relevant information from the Internet if it requires more effort than pushing a few buttons.


Further, querying a search engine not only requires entering keywords using a keyboard, but as noted, typically several iterations of refinement are required before the desired results are obtained. On a typical CE device without a keyboard, this is difficult to achieve. Forming a good query requires the user to have at least some knowledge about the context of the information desired, as well as the ability to translate that knowledge into appropriate search words. Even if the user has the skills required to form a good query and the means to enter the query, she may not be inclined to do so while using a CE device for entertainment. There is, therefore, a need for a method and system that provides contextual search and query refinement for CE devices.


BRIEF SUMMARY OF THE INVENTION

The present invention provides a method and a system for searching for information using an electronic device, such as a CE device, that can be connected to a network. Such searching for information involves determining a context for a search for information, forming a search query based on that context, and performing a contextual search based on the search query. Performing the contextual search further includes performing a query refinement.


The network includes a local network containing CE devices, and an external network such as the Internet, wherein the search is directed to information in the external network. Determining the context further includes determining the context based on the content in the network, wherein searching further includes filtering the search results based on said context.


Determining the context further includes using metadata related to the content in the local network to determine the context for search query formation. Determining said context can further include using metadata related to the content in the network and current application states in the local network, to determine the context for query formation and result filtering.


Determining said context can further include gathering metadata about available content in the network. When the network includes a local network and an external network, the step of gathering metadata further includes gathering metadata about available content in the local network.


In addition, the step of determining said context can further include determining the context using metadata related to: available content in the local network, current application states in the local network and additional contextual terms derived from the external network.


As such, the present invention provides contextual search and query refinement for CE devices. The cognitive load of query formation is relegated to the device itself, freeing the user to simply enjoy the content. Knowing the context of the search query, the device then uses that context for query formation, as well as result filtering on behalf of the user.


These and other features, aspects and advantages of the present invention will become understood with reference to the following description, appended claims and accompanying figures.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 shows an example of a network implementing an embodiment of the present invention.



FIG. 2 shows an example contextual search and query refinement method for CE devices, according to an embodiment of the present invention.



FIG. 3 shows an example functional block diagram of a system implementing a contextual search and query refinement method, according to an embodiment of the present invention.



FIG. 4 shows a local taxonomy of metadata, according to an embodiment of the present invention.





DETAILED DESCRIPTION OF THE INVENTION

The present invention provides contextual search and query refinement for CE devices, wherein a cognitive load of query formation is relegated to the device itself, freeing the user to simply enjoy the content. The device then uses that context for query formation, as well as result filtering on behalf of the user.


In one example implementation involving a local area network, metadata related to local content and current application states are gathered. The application states include user application states such as the current device(s) activity, for example, playing a music CD, playing a DVD, etc. The gathered data is then used to obtain the context for query formation and result filtering, essentially without user intervention. In this example, the user application states, the local context, and optionally, additional contextual terms derived from external sources (e.g., the external network), are utilized to form an appropriate query. The query is submitted to a search engine and the results are presented to a user. Preferably, contextual information is used to refine the search results returned by the search engine so that the search results are more likely to satisfy the user request.
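
For illustration only, the gathered context can be pictured as a simple record that combines the current application state with content metadata. The following Python sketch is not from the patent; the class and field names are assumptions:

    from dataclasses import dataclass, field

    @dataclass
    class SearchContext:
        """Context gathered without user intervention (illustrative)."""
        activity: str                                      # e.g., "playing music"
        metadata: dict = field(default_factory=dict)       # e.g., ID3/CDDB fields
        derived_terms: list = field(default_factory=list)  # e.g., ["biography"]

    # Example: context while a CD player is active in the local network.
    ctx = SearchContext(
        activity="playing music",
        metadata={"artist": "Sting", "title": "Brand New Day"},
        derived_terms=["biography", "lyrics"],
    )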



FIG. 1 shows a functional architecture of an example network 10, such as a local network (e.g., a home network) embodying aspects of the present invention. The network 10 comprises devices 20 which may include content, CE devices 30 (e.g., a cell phone, PDA, MP3 player, TV, VCR, STB, etc.) which may include content, and an interface device 40 that connects the network 10 to an external network 50 (e.g., another local network, the Internet, etc.). Though the devices 20 and 30 are shown separately, a single physical device can include one or more logical devices.


The devices 20 and 30 can implement the UPnP protocol for communication therebetween. Those skilled in the art will recognize that the present invention is useful with other network communication protocols such as JINI, HAVi, 1394, etc. The network 10 can comprise a wireless network, a wired network, or a combination thereof.


Referring to the flowchart in FIG. 2, each of the devices 20 and 30 can implement an example contextual search and query refinement process 200 according to the present invention, which includes the following steps (a sketch composing these steps appears after the list):

    • Step 202: Extracting contextual information indicating current user activity from application states in the local network.
    • Step 204: Extracting contextual information about the content in the local network, from metadata sources in the local network.
    • Step 206: Extracting contextual information about the content in the local network, from metadata sources in an external network.
    • Step 208: Deriving additional contextual information from the extracted contextual information.
    • Step 210: Extracting additional contextual information about the content in the local network from results obtained from the external network.
    • Step 212: Forming an appropriate query based on the obtained contextual information, and performing a search based on the query.
    • Step 214: Using the contextual information to guide the selection of desired content from the search results.
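
The steps above can be composed into a single pipeline. Below is a minimal Python sketch under the assumption that application states and metadata arrive as plain dictionaries and that the search engine is a callable; none of these names come from the patent:

    def contextual_search(app_states, local_metadata, search):
        """Illustrative composition of steps 202-214 (all names assumed).

        app_states:     e.g., {"cd_player": "playing"}
        local_metadata: e.g., {"artist": "Sting", "album": "Brand New Day"}
        search:         callable taking a query string, returning result dicts
        """
        # Steps 202-206: merge current activity and metadata into one context.
        context = {}
        if app_states.get("cd_player") == "playing":
            context["activity"] = "music"
        context.update(local_metadata)

        # Step 208: derive additional contextual terms (toy rule).
        derived = ["biography", "discography"] if "artist" in context else []

        # Step 212: form a query from a subset of contextual terms.
        query = " ".join(
            [context.get("activity", ""), context.get("artist", "")] + derived
        ).strip()
        results = search(query)

        # Step 214: keep only results that mention content known locally.
        album = context.get("album", "").lower()
        if not album:
            return results
        return [r for r in results if album in r.get("snippet", "").lower()]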


In one example, step 204 described above for extracting contextual information includes the further steps of:

    • (a) Extracting contextual information from one or more of the following sources:
      • (i) The user's current activity indicated by the states of applications running on devices in the local network (e.g., the user is playing media in a CD player, wherein the type of content being played is “music”); and
      • (ii) Metadata about the content available in the local network, from one of the following sources:
        • (1) The metadata sources in the local network (e.g., ID3 tags from a local MP3 player);
        • (2) The metadata sources over the external network (e.g., the album, the artist etc., information from a Compact Disc Database (CDDB));
        • (3) The metadata embedded in available content (e.g., closed captions), etc.; and
    • (b) Deriving additional contextual information from the obtained contextual information. In one example, given a current activity that is “playing music title: Brand New Day by artist: Sting,” it is deduced that Sting may have a “biography” and “Brand New Day” may have “lyrics.”


In addition, step 214, using the obtained contextual information to guide the selection of the most relevant content from the search results, can further include (a worked sketch follows the list):

    • (a) Using a subset of terms from the obtained contextual information to form a query. For example, “music artist Sting biography age debut.”
    • (b) Using the obtained contextual information to guide the selection of the most relevant content from the search results. For example, while searching for “Sting discography,” search results that do not contain the album “Brand New Day” are ignored because from the local content, it is known that one of the albums by Sting is called “Brand New Day.”
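
The following toy run, based on the patent's Sting example, shows (a) and (b) concretely; the data records and field names are illustrative assumptions:

    # (a) Form a query from a subset of contextual terms.
    context = {"type": "music", "artist": "Sting", "known_albums": ["Brand New Day"]}
    query = f"{context['type']} artist {context['artist']} discography"

    # (b) Discard results that omit an album known from the local content.
    results = [
        {"url": "http://example.com/a", "snippet": "Sting discography: Brand New Day ..."},
        {"url": "http://example.com/b", "snippet": "First aid for bee stings"},
    ]
    relevant = [
        r for r in results
        if all(album in r["snippet"] for album in context["known_albums"])
    ]
    # Only the first result survives the filter.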



FIG. 3 shows a functional block diagram of an example system 300 implementing such a contextual search and query refinement process, according to an embodiment of the present invention. The system 300 shows specific components that derive and use contextual information to form a query and to filter the search results for presentation to a user, as described.


The system 300 utilizes the following components: Broadcast Unstructured Data Sources 301, a Local Contextual Information Gatherer 302, a Local Metadata Cache 303, a Contextual Information Deriver 304, a Correlation Framework 305, a Broadcast Data Extractor and Analyzer 306, Local Content Sources 307, Document Theme Extractor 308, Application States 309, a client User Interface (UI) 310, a Query Execution Planner 312, a Correlation Plan Executor 314, a Correlation Constructor 316, an Internet Metadata Gatherer from Structured Sources 318, Internet Structured Data Sources 320, a query 322, a Search Engine Interface 324, Web Pages 326, a Snippet Analyzer 328, and Internet Unstructured Data Sources 330. The function of each component is further described below.


The Broadcast Unstructured Data Sources 301 comprises unstructured data embedded in media streams. Examples of such data sources include cable receivers, satellite receivers, TV antennas, radio antennas, etc.


The Local Contextual Information Gatherer (LCIG) 302 collects metadata and other contextual information about the contents in the local network. The LCIG 302 also derives additional contextual information from existing contextual information. The LCIG 302 further performs one or more of the following functions: (1) gathering metadata from local sources whenever new content is added to the local content/collection, (2) gathering information about a user's current activity from the states of applications running on the local network devices (e.g., devices 20, 30 in FIG. 1), and (3) accepting metadata and/or contextual information extracted from Internet sources and other external sources that describe the local content.


The LCIG 302 includes a Contextual Information Deriver (CID) 304 which derives new contextual information from existing information. For this purpose, the CID 304 uses a local taxonomy of metadata related concepts. An example partial taxonomy 400 is shown in FIG. 4. Each edge 402 (solid connector line) connects a pair of concepts 404 (solid ellipses) and represents a HAS-A relationship between that pair of concepts 404. Each edge 408 (dotted connector line) connects a concept 404 and a synonym 406 (dotted ellipse), and represents an IS-A relationship therebetween. As such, each edge 408 connects a concept 404 with its synonym 406.


In one example where the current information need is about a music artist, the CID 304 uses the taxonomy 400 to determine “biography” and “discography” as derived contextual terms. The CID 304 also knows that “age” and “debut” are relevant concepts in an artist's biography.
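
A tiny in-memory version of such a taxonomy, and a walk over its HAS-A edges, might look as follows in Python. The dictionaries below merely mirror the FIG. 4 example; the synonym entry and the depth limit are assumptions:

    # HAS-A edges (solid): concept -> sub-concepts. IS-A edges (dotted):
    # concept -> synonyms.
    HAS_A = {
        "music artist": ["biography", "discography"],
        "biography": ["age", "debut"],
    }
    IS_A = {
        "biography": ["life story"],  # assumed synonym, for illustration
    }

    def derive_terms(concept, depth=2):
        """Collect derived contextual terms by walking HAS-A edges (sketch)."""
        if depth == 0 or concept not in HAS_A:
            return []
        terms = []
        for child in HAS_A[concept]:
            terms.append(child)
            terms.extend(derive_terms(child, depth - 1))
        return terms

    # derive_terms("music artist") -> ["biography", "age", "debut", "discography"]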


Referring back to FIG. 3, the LCIG 302 further maintains a local metadata cache 303, and stores the collected metadata in the cache 303. The cache 303 provides an interface for other system components to add, delete, access, and modify the metadata in the cache 303. For example, the cache 303 provides an interface for the CID 304, Local Content Sources 307, Internet Metadata Gatherer from Structured Sources 318, Broadcast Data Extractor and Analyzer 306, Document Theme Extractor 308 and Snippet Analyzer 328, etc., for extracting metadata from local or external sources.


The Broadcast Data Extractor and Analyzer (BDEA) 306 receives contextual information from the Correlation Framework (CF) 305 described further below, and uses that information to guide the extraction of a list of terms from data embedded in the broadcast content. The BDEA 306 then returns the list of terms back to the CF 305.


The Local Content Sources 307 includes information about the digital content stored in the local network (e.g., on CD's, DVD's, tapes, internal hard disks, removable storage devices, etc.).


The Document Theme Extractor (DTE) 308 receives contextual information from the CF 305 as input and performs one or more of the following operations guided by the contextual information: (1) extracting and selecting a list of terms that best summarize the themes of documents returned as search results by the Search Engine Interface 324, and returning the list to the CF 305, and (2) clustering the documents returned as search results, extracting and selecting therefrom a list of terms that best summarize the themes of each cluster, and returning the list to the CF 305. The DTE 308 decides among one or more of these operations based on current user requirements. For example, if only the top (most important) keywords from a set of documents are needed, then operation 1 above is utilized. If there is a need to cluster the documents returned and then find the most important (representative) keywords from each cluster, then operation 2 above is utilized.
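
Operation 1 can be pictured as simple context-weighted term counting. This sketch is only one plausible reading; the scoring scheme and stop-word list are assumptions, not the patent's algorithm:

    import re
    from collections import Counter

    STOPWORDS = {"the", "a", "an", "and", "of", "in", "is", "to", "by"}

    def top_terms(documents, context_terms, k=5):
        """Pick the k terms that best summarize a set of documents,
        boosting terms already known to be contextually relevant."""
        counts = Counter()
        for doc in documents:
            for word in re.findall(r"[a-z']+", doc.lower()):
                if word not in STOPWORDS:
                    counts[word] += 1
        for term in context_terms:      # context boost (assumed heuristic)
            if term in counts:
                counts[term] *= 2
        return [word for word, _ in counts.most_common(k)]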


The Local Application States 309 includes information about the current user activity using one or more devices 20 or 30 (e.g., the user is listening to music using a DTV).


The client UI 310 provides an interface for user interaction with the system 300. The UI 310 maps user interface functions to a small number of keys, receives user input from the selected keys and passes the input to the CF 305 in a pre-defined form. Further, the UI 310 displays the results from the CF 305 when instructed by the CF 305. An implementation of the UI 310 includes a module that receives signals from a remote control, and a web browser that is overlaid on a TV screen.


The Query Execution Planner (QEP) 312 provides a plan that carries out a user request to perform a task such as a search. The Correlation Plan Executor (CPE) 314 executes the plan by orchestrating components in the system 300 and correlating the results from the components to deliver better results to the user. For example, the CPE 314 performs a “task” by orchestrating all the components and devices required for performing the task.


The Correlation Constructor 316 cooperates with the QEP 312 to form a plan by correlating data gathered from external sources with the data gathered from the local network. The Correlation Constructor 316 can also form the plan automatically using the correlation.


The Internet Metadata Gatherer from Structured Sources 318 gathers metadata about local content from the Internet Structured Data Sources 320. The Internet Structured Data Sources 320 includes data with well-defined semantics. Examples of such sources include Internet servers that host XML data enclosed by semantic-defining tags, Internet database servers such as CDDB, etc.


The query 322 encapsulates the information desired and is submitted for search, such as on the Internet. The query 322 is formed by the CF 305 from the information and metadata gathered from the local and/or external network.


The Search Engine Interface (SEI) 324 inputs a query 322 and transmits it to one or more search engines over the Internet, using a pre-defined Internet communication protocol such as HTTP. The SEI 324 also receives the response to the query from said search engines, and passes the response (i.e., search results) to a component or device that issued the query.
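
In modern terms, the SEI amounts to an HTTP round trip. A minimal sketch, assuming a hypothetical search endpoint that accepts a q parameter and returns JSON results (the URL and response shape are placeholders, not a real API):

    import json
    import urllib.parse
    import urllib.request

    def submit_query(query, endpoint="http://search.example.com/api"):
        """Send a query to a search engine over HTTP and parse the response."""
        url = endpoint + "?" + urllib.parse.urlencode({"q": query})
        with urllib.request.urlopen(url, timeout=10) as resp:
            # Assumed shape: [{"url": ..., "snippet": ...}, ...]
            return json.load(resp)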


The Internet Unstructured Data Sources 330 includes data or data segments with semantics that cannot be analyzed (e.g., free text). Internet servers that host web pages typically contain this type of data.


The web pages 326 include web pages on the Internet that are returned in the search results. In one example, when a query is sent to a search engine, the search engine returns a list of URLs that are relevant to that query. For each relevant URL, most search engines also return a small piece of text, known as a snippet, from the corresponding web page. The main purpose of the snippets is to provide the user with a brief overview of what the web page is about. The snippet is either taken from the web page itself or from the meta tags of the web page. Different search engines have different techniques for generating these snippets.


The Snippet Analyzer 328 inputs the search results and a query from the CF 305. The Snippet Analyzer 328 then analyzes snippets from the search results and extracts from the snippets terms that are relevant to the query. The extracted terms are provided to the CF 305.
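
One simple way to read this component: keep the words of any snippet that overlaps the query, minus the query words themselves. The heuristic below is an assumption for illustration, not the patented analysis:

    import re

    def snippet_terms(results, query):
        """Extract candidate contextual terms from snippets relevant to a query."""
        query_words = set(query.lower().split())
        terms = set()
        for r in results:
            words = re.findall(r"[a-z']+", r.get("snippet", "").lower())
            if query_words & set(words):   # the snippet mentions the query
                terms.update(w for w in words if w not in query_words)
        return sorted(terms)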


The CF 305 orchestrates contextual query formation, contextual search and refinement (a sketch of the refinement loop follows the list) by:

    • (a) Providing contextual information to appropriate components in the system 300 (i.e., one or more of the components 302, 306, 310, 324, 328, 308, 318) for query formation, query plan formation, plan execution or examining search results.
    • (b) Receiving a list of terms from components that retrieve related information from the Internet (i.e., one or more of components 308, 328, 324). Then, making the following decisions:
      • (i) Whether the terms in the list should be further refined;
      • (ii) Whether any of the terms in the list carry contextual information;
      • (iii) Whether and how a new query should be formed using the contextual information and the existing query; and
      • (iv) Whether any of the contextual information should be used as context for a query.
    • (c) If new contextual terms are found from a list of terms, then using all or some of the terms in task formation, and optionally providing the terms to the LCIG 302 to store for later use.
    • (d) If a new query should be formed, then constructing the query according to the decision made and executing the new query.
    • (e) If some of the contextual information should be used as context for a query, then using such information according to a pre-determined format, and executing the query.
    • (f) If a list of terms returned by a component needs to be further refined, then further refining the list of terms using the contextual information.
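
Decisions (b) through (f) amount to a refinement loop: search, mine the results for new contextual terms, and reissue a refined query until nothing new is learned. A minimal sketch, with an assumed stopping rule and term budget:

    def orchestrate(query, search, analyze, max_rounds=2):
        """Sketch of the CF refinement loop (stopping rule is assumed).

        search:  callable(query) -> list of result dicts
        analyze: callable(results, query) -> list of candidate terms
        """
        context_terms, results = [], []
        for _ in range(max_rounds):
            results = search(query)
            new_terms = [t for t in analyze(results, query)
                         if t not in context_terms]
            if not new_terms:              # nothing new learned: stop refining
                break
            context_terms.extend(new_terms)
            query = query + " " + " ".join(new_terms[:2])  # refine the query
        return results, context_terms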


As such, a CE device 20 or 30 (FIG. 1) configured based on the process 200 and the system 300 can form a query and perform a search using contextual information about a user's activity, local network content, and the metadata about such content. The user is not required to be involved in this process. Further, users need not be skilled in query formation to obtain relevant results such as from the Internet. Such a configured CE device uses contextual information to select among the relevant results returned in response to the query.


As is known to those skilled in the art, the aforementioned example architectures described above, according to the present invention, can be implemented in many ways, such as program instructions for execution by a processor, as logic circuits, as an application specific integrated circuit, as firmware, etc.


The present invention has been described in considerable detail with reference to certain preferred versions thereof; however, other versions are possible. Therefore, the spirit and scope of the appended claims should not be limited to the description of the preferred versions contained herein.

Claims
  • 1. A method for automatically obtaining relevant search results from a search engine for a user who is using an electronic device, though a keyboard is not used to manually input search terms, the method comprising: facilitating communication between a first electronic device, which is a movie player or a music player, with one or more other devices in a local home network; obtaining a media name from one of the devices in the local home network that helps identify a particular song, music group, album or movie that has been played on the first electronic device wherein the media name is at least one selected from the group consisting of a movie title, a song title, a music album name and a music group name; performing two linked searches, a first search and a linked second search, to obtain additional information for the user that relates to media played on the first electronic device, wherein the two linked searches do not involve search terms that are directly inputted by the user and wherein the two linked searches include: automatically performing a first search of the Internet using one or more search terms that include at least a part of the media name; identifying a plurality of web pages using the first search; receiving web page text contained in the plurality of web pages identified in the first search; automatically analyzing the web page text to determine that some of the web page text provides additional contextual information relating to the media name; automatically selecting at least some of the web page text to use as one or more search terms in the second search; automatically performing the second search on an external network using the selected web page text as one or more search terms in the second search; receiving a search result in response to the second search; and displaying information to the user based on the search result from the second search.
  • 2. A method as recited in claim 1 wherein the local home network includes a second consumer electronic device, the method further comprising: obtaining information from the second consumer electronic device in the local home network, wherein the one or more search terms used in the first search are based on the media name and the information obtained from the second consumer electronic device.
  • 3. A method as recited in claim 1 wherein the steps of said method are performed at the first consumer electronic device.
  • 4. A method as recited in claim 1 wherein the steps of said method are performed at a second consumer electronic device that is coupled with the first consumer electronic device.
  • 5. A method as recited in claim 1 wherein: the external network is the Internet; and the performing of the first search involves automatically entering the one or more search terms into an Internet search engine.
  • 6. A method as recited in claim 1 wherein the displaying of the information to the user is performed at a display module wherein the display module is selected from the group consisting of (a) a module that is part of the first consumer electronic device; and (b) a module that is coupled with but separate from the first consumer electronic device.
  • 7. A method as recited in claim 1 further comprising: analyzing the search result received in response to the second search to determine that a part of the search result is desirable and other parts of the search result are undesirable based on at least one selected from the group consisting of: (a) device activity information obtained from the first consumer electronic device and (b) the web page text received in response to the first search; and displaying the information to the user that is based on the desirable part of the search result and not based on the undesirable parts of the search result.
  • 8. A method as recited in claim 1 further comprising: obtaining external data from a database that is coupled with the first consumer electronic device through an external network wherein: the external data provides additional information related to current user activity on the first consumer electronic device; and the one or more search terms used in the first search are based on the external data obtained from the database.
  • 9. A method as recited in claim 2 wherein neither the first nor the second consumer electronic device involves a physical keyboard suitable for manually typing words and search terms.
  • 10. A method as recited in claim 1 wherein the one or more search terms used in the first search are automatically determined and do not require the user to directly type any search terms.
  • 11. A method as recited in claim 1 further comprising: deriving additional search terms from the one or more search terms used in the first search, which includes: extrapolating additional search terms from the one or more search terms used in the first search; and using the additional search terms in another search.
  • 12. A method as recited in claim 11 wherein: the extrapolating of the additional search terms is based on a taxonomy; the taxonomy is represented by a plurality of nodes that are connected by a multiplicity of edges, each node representing a concept, the multiplicity of edges including a first and a second type of edge, the first type of edge representing a “has” relationship between two of the concepts, the second type of edge representing an “is” relationship between two of the concepts; the extrapolating of the additional search terms further involves: matching at least one of the one or more search terms used in the first search with at least one of the nodes; and deriving the additional search terms from concepts that are linked by edges to the at least one of the nodes.
  • 13. A method as recited in claim 1 wherein the first consumer electronic device is selected from the group consisting of a mobile phone, a personal digital assistant (PDA), an MP3 player and a television.
  • 14. A computing system suitable for automatically obtaining relevant search results from a search engine for a user who is using an electronic device, though a keyboard is not used to manually input search terms, comprising: at least one processor; at least one memory that stores computer readable instructions, which when executed by the computing system cause the computing system to: facilitate communication between a first electronic device, which is a movie player or a music player, with one or more other devices in a local home network; obtain a media name from one of the devices in the local home network that helps identify a particular song, music group, album or movie that has been played on the first electronic device wherein the media name is at least one selected from the group consisting of a movie title, a song title, a music album name and a music group name; perform two linked searches, a first search and a linked second search, to obtain additional information for the user that relates to media played on the first electronic device, wherein the two linked searches do not involve search terms that are directly inputted by the user; automatically perform the first search wherein the first search is of the Internet and uses one or more search terms that include at least a part of the media name; identify a plurality of web pages using the first search; receive web page text contained in the plurality of web pages identified in the first search; automatically analyze the web page text to determine that some of the web page text provides additional contextual information relating to the media name; automatically select at least some of the web page text to use as one or more search terms in the second search; automatically perform the second search on an external network using the selected web page text as one or more search terms in the second search; receive a search result in response to the second search; and display information to the user based on the search result from the second search.
  • 15. A computing system as recited in claim 14 wherein the computer readable instructions, when executed by the computer system, further cause the computing system to: obtain information from a second consumer electronic device in the local home network wherein the one or more search terms used in the first search are based on the media name and the information obtained from the second consumer electronic device.
  • 16. A computing system as recited in claim 14 wherein the computing system is a second consumer electronic device that is coupled with the first consumer electronic device.
  • 17. A computing system as recited in claim 14 wherein the computing system is part of the first consumer electronic device.
  • 18. A computing system as recited in claim 14 wherein the computing system is separate from and coupled with the first consumer electronic device.
  • 19. A computing system as recited in claim 14 wherein: the external network is the Internet; and the performing of the first search involves entering the one or more search terms used in the first search into an Internet search engine.
  • 20. An electronic device that automatically obtains relevant search results from a search engine for a user, though a keyboard is not used to manually input search terms, wherein the electronic device includes: executable code operable to facilitate communication between an electronic device, which is a movie player or a music player, with one or more other devices in a local home network; executable code operable to obtain a media name from one of the devices in the local home network that helps identify a particular song, music group, album or movie that has been played on the electronic device wherein the media name is at least one selected from the group consisting of a movie title, a song title, a music album name and a music group name; executable code operable to perform two linked searches, a first search and a linked second search, to obtain additional information for the user that relates to media played on the electronic device, wherein the two linked searches do not involve search terms that are directly inputted by the user; executable code operable to automatically perform the first search of the Internet using one or more search terms that include at least a part of the media name; executable code operable to identify a plurality of web pages using the first search; executable code operable to receive web page text contained in the plurality of web pages identified in the first search; executable code operable to automatically analyze the web page text to determine that some of the web page text provides additional contextual information relating to the media name; executable code operable to automatically select at least some of the web page text to use as one or more search terms in the second search; executable code operable to automatically perform the second search on an external network using the selected web page text as one or more search terms in the second search; executable code operable to receive a search result in response to the second search; and executable code operable to display information to the user based on the search result from the second search.
  • 21. An electronic device as recited in claim 20 further includes: executable code operable to obtain information from a second electronic device in the local home network wherein the one or more search terms used in the first search are based on the media name and the information obtained from the second electronic device.
  • 22. An electronic device as recited in claim 20 wherein: the external network is the Internet; and the performing of the first search involves automatically entering the one or more search terms into an Internet search engine.
  • 23. A method for automatically obtaining more accurate and relevant information relating to music that a user is playing on an electronic device, the method comprising: facilitating communication between a first electronic device, which is a music player, with one or more other devices in a local home network; determining that the first electronic device is currently playing music; performing three searches, a first local search, a second Internet search and a third search, to obtain additional information for the user that relates to the music played on the first electronic device, wherein the three searches do not involve search terms that are directly inputted by the user and wherein the three searches involve: automatically performing the first search of the one or more networked devices in the local home network to find out more information relating to the music that is being played on the first electronic device; automatically performing the second search of the Internet to find out more information relating to the music that is being played on the first electronic device; receiving preliminary search results in response to the first and second searches; determining from the preliminary search results a media name that helps describe the music played on the first electronic device, the media name being one selected from the group consisting of a name of an album that contains the played music, a name of a music group that performs the played music and a name of an artist that performs the played music; forming a search query using the media name as one or more search terms; adding one or more additional search terms to the search query that indicate an interest in additional background information for the media name, the background information being at least one selected from the group consisting of lyrics of the played song and biographical information about the artist that performs the played music; after the additional terms have been added to the search query, performing the third search of the Internet using the search query; receiving search results in response to the third search that contains said background information; and displaying information to the user based on the search results received in response to the third search.
  • 24. A method as recited in claim 23 wherein the first search of the local home network involves searching for and obtaining information from metadata of an audio file stored in a device on the local home network.
  • 25. A method as recited in claim 24, wherein the audio file is an MP3 file and wherein the first search involves obtaining information from an ID3 tag of the MP3 audio file.
  • 26. A method as recited in claim 23 wherein the second search of the local home network involves searching for and obtaining information from a Compact Disc Database (CDDB).
Related Publications (1)
Number Date Country
20080133504 A1 Jun 2008 US