Media players are in common use among a broad base of users. Radio and television have provided entertainment for generations. The portable transistor radios of the 1960s began a trend toward smaller and more robust personal media players, including very small players that store all-digital content on either rotating or non-rotating media. Streaming media is available over both wireless and wired networks and may be displayed on cellular telephones and other portable media devices.
Information about the media being played is often available in the form of a ‘now playing’ identifier or a radio station genre. Often, however, a listener or viewer is interested in more information than simply what is playing. A listener may wonder what kind of instrument is playing at a given moment or the name of a backup singer. A media viewer may have similar questions related to a location or props in a particular scene.
A media player may be operable to accept a user input indicating interest in a media object at a particular point in time during playback of the media object. The ‘earmark’ may then be used to search for available information about the media object. The media object itself may contain metadata organized by time for use in supplying data. Alternatively, the metadata may provide keywords or phrases used to populate a search for related information. In another embodiment, the metadata may contain one or more URLs for directly accessing related information. The search may be made from the media player or may be performed at a computer using information sent by the media player. When searching from a computer, the search criteria, as little as a reference to the media object with a time or as complete as URLs, may be transferred to a computer explicitly for the purpose of searching or may be transferred during a normal synchronizing operation.
The media player may be a handheld device, a virtual player on a computer, a set-top box, a cellular telephone, or other device capable of supporting media objects, a user interface, and in many cases, external communication.
Although the following text sets forth a detailed description of numerous different embodiments, it should be understood that the legal scope of the description is defined by the words of the claims set forth at the end of this disclosure. The detailed description is to be construed as exemplary only and does not describe every possible embodiment since describing every possible embodiment would be impractical, if not impossible. Numerous alternative embodiments could be implemented, using either current technology or technology developed after the filing date of this patent, which would still fall within the scope of the claims.
It should also be understood that, unless a term is expressly defined in this patent using the sentence “As used herein, the term ‘——————’ is hereby defined to mean . . . ” or a similar sentence, there is no intent to limit the meaning of that term, either expressly or by implication, beyond its plain or ordinary meaning, and such term should not be interpreted to be limited in scope based on any statement made in any section of this patent (other than the language of the claims). To the extent that any term recited in the claims at the end of this patent is referred to in this patent in a manner consistent with a single meaning, that is done for the sake of clarity only so as to not confuse the reader, and it is not intended that such claim term be limited, by implication or otherwise, to that single meaning. Finally, unless a claim element is defined by reciting the word “means” and a function without the recital of any structure, it is not intended that the scope of any claim element be interpreted based on the application of 35 U.S.C. §112, sixth paragraph.
Much of the inventive functionality and many of the inventive principles are best implemented with or in software programs or instructions and integrated circuits (ICs) such as application specific ICs. It is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein will be readily capable of generating such software instructions and programs and ICs with minimal experimentation. Therefore, in the interest of brevity and minimization of any risk of obscuring the principles and concepts in accordance with the present invention, further discussion of such software and ICs, if any, will be limited to the essentials with respect to the principles and concepts of the preferred embodiments.
The media device 100 may also include additional storage 108 (removable and/or non-removable) including, but not limited to, magnetic or optical disks or tape, or any other memory that can be easily rewritten, that may keep data for long periods of time when power is removed, and that may allow quick and efficient access to data. Such additional storage is illustrated in
The processing unit 102 may be any processing unit 102 capable of executing computer code to decode media data from a compressed format into a useable form fast enough such that music and video may be played continuously without skips or jumps. When in a portable media device, it may also be useful if the processor 102 is efficient in using power to increase the life of the power source. The processing unit 102 may also be used to execute code to support a user interface and external communications.
The user interface may include one or more displays 114 for both displaying control information and displaying viewable media. The display 114 may be a color LCD screen that fits inside the device 100. User input(s) 116 may include either manual buttons, soft buttons, or a combination of both. Soft buttons may be used when the display 114 includes a touch screen capability. Manual buttons may include re-definable keys with programmable legends.
The media device 100 may also contain communications connection(s) 122 that allow the device 100 to communicate with external entities 124, such as network endpoints or a computer used for synchronization. Communications connection(s) 122 is an example of communication media. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. The term computer readable media as used herein includes both storage media and communication media.
The power source may be a rechargeable battery, a standard battery, or an input from a power converter.
In operation, a user may activate playback of a media object using the user interface to select and play. At any point during the playback, the user may create a mark object using the user interface, for example, a soft key available only during playback. Creating the mark object may initiate activity by the processor 102 to execute a search for references. The search may be based on a combination of the media object and the playback time associated with the creation of the mark object. As described in more detail below, the results of the initial search may be used to launch a second search for information associated with the references discovered in the initial search. In one embodiment, metadata from the media object may be used as a seed for a single search. While any number of combinations of search may be used, in one exemplary embodiment, the initial search may be performed on the media device to locate metadata associated with the media object and the particular playback time. The second search may involve communication of the metadata over the communication port 122 with the external entity 124 to either directly or indirectly perform a search, such as a web search, using the metadata as a search key. Other combinations of search and data retrieval are discussed in more detail below.
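The two-stage search described above can be sketched as follows. This is an illustrative sketch only, not an implementation from the patent text: the index layout, field names, and functions are all assumptions, with the first stage looking up locally stored, time-indexed metadata and the second stage building a search key for an external entity.

```python
from typing import Optional

def local_metadata_search(media_id: str, playback_time: float, index: dict) -> dict:
    """First stage: find locally stored metadata for the media object
    covering the marked playback time (in seconds)."""
    for (start, end), metadata in index.get(media_id, {}).items():
        if start <= playback_time <= end:
            return metadata
    return {}

def build_web_query(metadata: dict) -> Optional[str]:
    """Second stage: turn locally found metadata into a search key that an
    external entity (e.g. a web search engine) could use, if any exists."""
    keywords = metadata.get("keywords")
    return " ".join(keywords) if keywords else None

# Hypothetical time-indexed metadata for one audio track.
index = {"track-42": {(60.0, 65.0): {"keywords": ["guest vocalist", "saxophone"]}}}
query = build_web_query(local_metadata_search("track-42", 62.0, index))
```

A mark created at playback time 62.0 of `track-42` thus yields the search key `"guest vocalist saxophone"`, which the second stage may pass over the communication port to the external entity.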
At block 204 the media object may be played using the user interface, or played automatically by a programmable activation trigger. During playback of the media object, at block 206, creation of a mark object may be initiated by a user via a user interface element, such as a soft key. The mark object may be persistent, that is, permanently stored, or may be transitory, stored only in local volatile memory. The complexity of the mark object may vary substantially over different embodiments based on where and how much metadata or reference information is immediately available. In one embodiment, only a media object identifier and a time indicator may be used as the mark object. In such an embodiment, virtually all the metadata or reference information may be gathered from sources outside the media object itself, either locally or remotely. In another exemplary embodiment, when the media object contains its own metadata, creation of the mark object may include extracting metadata from the media object so that the mark object itself may include metadata corresponding to the media object.
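In the simplest embodiment above, the mark object carries only a media object identifier and a time indicator; richer embodiments attach extracted metadata. A minimal sketch, with field names assumed for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class MarkObject:
    media_id: str            # identifies the media object being played
    playback_time: float     # time indicator, seconds from start of playback
    metadata: dict = field(default_factory=dict)  # optional extracted metadata

# A mark created at minute 22:00 of a movie, with no local metadata attached.
mark = MarkObject(media_id="movie-7", playback_time=1320.0)
```

Whether such an object is persistent or transitory is then simply a question of whether it is serialized to non-volatile storage or held only in volatile memory.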
The metadata may be more or less specific to a given time. That is, some metadata may be appropriate to all times of the playback, such as producer or director. Other metadata may be specific to a very narrow time range, for example, a five second scene of a car driving through a city street.
The metadata itself may vary substantially based on a particular embodiment. The metadata may have specific search keys used to initiate a web search. Search key metadata or direct URL metadata may provide links or search keys for any kind of data, but may be particularly useful for information that may change over time, such as an actor's filmography or the name of a store. Alternatively, the metadata may have pre-determined information about the time range of the media in question, such as artists, instruments, actors, locations, etc. Such information may be anticipated as likely to be requested and is also relatively stable. Metadata that is complete in itself may be directly displayable on the media device itself without use of a network. For example, the metadata may include an actor's name, a product identifier or brand name, or a geographic location (e.g. Central Park). In such a case, that search key data may be passed to a common data search engine. In another embodiment, the metadata may include a universal resource locator (URL) and may represent a web destination of its own, for example, a specific product page in a company's on-line store. In another embodiment, the metadata may include a combination of keywords and URLs. To illustrate a range of embodiments, minutes 1:00-1:05 of an MP3 audio track may be associated with URL metadata that points to a record label's web site. Upon reaching the web site, a list of musicians and their instruments playing during that time period of the audio track may be displayed. Additionally, the record label web site may include a “listeners who liked this also enjoyed . . . ” section to help promote related items. To illustrate a more complex embodiment, at minute 22:00 of a movie, an actor in business attire may leave a subway station and walk into a hotel. Associated metadata may include the actor's name, the brand name of the suit, and a URL pointing to the hotel's web site.
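The two metadata forms described above lead to two different dispositions: search key metadata is handed to a common search engine, while URL metadata is used as a web destination directly. A hedged sketch of that dispatch, in which the search-engine URL format and field names are assumptions for illustration:

```python
from urllib.parse import quote_plus

def destination_for(metadata: dict) -> str:
    """Map one metadata entry to a web destination: a direct URL if one is
    present, otherwise a query URL for a generic search engine."""
    if "url" in metadata:                       # direct web destination
        return metadata["url"]
    if "keywords" in metadata:                  # search key metadata
        return ("https://example-search.invalid/?q="
                + quote_plus(" ".join(metadata["keywords"])))
    raise ValueError("no usable metadata")

# The hotel scene: a direct URL versus search keys for the suit's brand.
hotel = destination_for({"url": "https://example-hotel.invalid/"})
suit = destination_for({"keywords": ["brand", "business suit"]})
```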
A combination of web search and navigation to a web destination may be incorporated into the data session based on the metadata. In another exemplary embodiment, the metadata may include a schema with all the artists who perform on a track listed by identifier along with references by artist identifier by time period in the track. An inquiry regarding a guest vocalist may be answered without referring the query to a web search. However, additional information requests, for example, a request related to the guest vocalist may be queued using the data from the locally-generated answer.
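The schema suggested above, artists listed once by identifier with per-time-period references by artist identifier, allows the guest vocalist inquiry to be answered entirely locally. The schema layout below is an assumption for illustration, not a format defined by this disclosure:

```python
schema = {
    "artists": {
        "a1": {"name": "Lead Singer", "role": "vocals"},
        "a2": {"name": "Guest Vocalist", "role": "guest vocals"},
    },
    # (start_seconds, end_seconds) -> artist identifiers active in that period
    "periods": [((0, 240), ["a1"]), ((60, 65), ["a2"])],
}

def artists_at(schema: dict, t: float) -> list[str]:
    """Answer 'who is performing at time t?' without any web search."""
    names = []
    for (start, end), ids in schema["periods"]:
        if start <= t <= end:
            names.extend(schema["artists"][i]["name"] for i in ids)
    return names
```

A mark at 1:02 of the track resolves to both the lead singer and the guest vocalist locally; only a follow-up request about the guest vocalist would then need to be queued for an external search.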
To accommodate scenes or thematic music elements, the metadata may be organized by time range. Using the illustration above, the scene of the actor walking into the hotel may play from minute 20:05 to minute 23:30. Any mark object falling in that time range may cause an association to the same metadata. More relationships between mark objects and metadata are discussed with respect to
The time in a media object may be extracted according to the digital media itself. In one embodiment, the time may be cumulative from the start, while in another embodiment the time may be associated with an MPEG frame reference. In still another embodiment, the time may come from a presentation time stamp (PTS) in streaming video.
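In the PTS embodiment, the stamp is expressed in MPEG streams in units of a 90 kHz clock, so it can be converted to a playback time in seconds for use in a mark object:

```python
PTS_CLOCK_HZ = 90_000  # MPEG presentation time stamps use a 90 kHz timebase

def pts_to_seconds(pts: int) -> float:
    """Convert an MPEG presentation time stamp to seconds of playback."""
    return pts / PTS_CLOCK_HZ

seconds = pts_to_seconds(8_100_000)   # 8,100,000 / 90,000 = 90.0 seconds
```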
At block 208, the mark object may be stored. The mark object may be stored on the media device 100 in either volatile or non-volatile memory. The mark object may be used on the media device itself, for example, in a set-top box, game console, smart phone, cellular phone, and the like. A nominal amount of storage and network access allows utilization of both local and remote metadata searches. Alternatively, the mark object may be sent to another computer or device for further processing when the media device 100 has a relatively small amount of storage or does not have a suitable network connection. As discussed above, the mark object may include as little as a media identifier and time reference. The mark object may be supplemented with locally available metadata before being sent to another device to perform the search.
At block 210, a search for related data may be performed using the mark object or metadata retrieved using the mark object. In some embodiments, the metadata may include executable code for presenting a user with search options related to the media object. For example, the metadata may include HTML statements for presenting a user with search options, such as whether to search using keywords from the metadata or to select from a list of destination URLs from the metadata. User options may also include allowing the user to view locally available metadata before launching an external search. The search or web inquiry may be performed from the media device 100 or from an external computer, for example, a computer used in synchronizing the media device 100. The search may be performed while the media object is still playing, providing results during playback or after playback is complete. Delaying the display of results may be standard on a portable media player when screen size, processing power, or battery life is an issue. At block 212, the results may be returned and, at block 214, the results may be displayed. When the results are returned in a suitable form, they may be communicated to and stored on the media device 100, even though the search may have been performed at a network-accessible computer. The results may be added to locally available media object metadata to save time if the same query is made at a later time. Image analysis or scene identifiers may be used instead of, or as a supplement to, metadata-based query support. A cursor-oriented user interface may be used to indicate a location on the screen with a cursor click on a spot in the scene. The cursor click inherently marks a playback time. An examination of the scene may then use image analysis at the point of the cursor mark, applying edge analysis or another pattern recognition technique to identify the shape indicated.
The shape may be used as a key for a local or remote search to retrieve additional information about the shape.
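The result-caching behavior described above, merging returned results into locally available metadata so a repeated query avoids another external search, can be sketched as follows. All names here are illustrative assumptions:

```python
def cached_search(mark: tuple, local_cache: dict, remote_search) -> dict:
    """mark is a hypothetical (media_id, playback_time) pair; remote_search
    is any callable performing the external query."""
    if mark in local_cache:
        return local_cache[mark]          # answered locally, no network use
    results = remote_search(mark)
    local_cache[mark] = results           # saved for later repeat queries
    return results

# Stand-in for the external search, counting how often it is invoked.
calls = []
def fake_remote(mark):
    calls.append(mark)
    return {"results": ["hotel web site"]}

cache: dict = {}
first = cached_search(("movie-7", 1320.0), cache, fake_remote)
second = cached_search(("movie-7", 1320.0), cache, fake_remote)
```

Only the first query reaches the external entity; the second is served from the locally stored copy.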
Other combinations of metadata and search result sources are apparent to those of ordinary skill, including searches made on accessible devices in an ad-hoc network community.
The use of mark objects to create search criteria targeting time-oriented elements of a media object greatly expands the amount of information available to a consumer of media without burdening the media producer with changes to media data formats or media storage capability. However, as more data storage space becomes available through technologies such as Blu-ray, the ability to add items of interest will become more commonplace. Earmarking provides a useful way to make such additional data available to both current and future media object consumers. The techniques described above allow backward compatibility with ‘small media’ such as CDs using external metadata and forward compatibility with more dense storage media incorporating integral time-organized data.
Although the foregoing text sets forth a detailed description of numerous different embodiments of the invention, it should be understood that the scope of the invention is defined by the words of the claims set forth at the end of this patent. The detailed description is to be construed as exemplary only and does not describe every possible embodiment of the invention because describing every possible embodiment would be impractical, if not impossible. Numerous alternative embodiments could be implemented, using either current technology or technology developed after the filing date of this patent, which would still fall within the scope of the claims defining the invention.
Number | Date | Country
---|---|---
20080109405 A1 | May 2008 | US