This application claims the benefit of Korean Patent Application Nos. 2005-1749 filed on Jan. 7, 2005, and 2005-108532 filed on Nov. 14, 2005, in the Korean Intellectual Property Office, the disclosures of which are incorporated herein by reference.
1. Field of the Invention
The present invention relates to reproducing audio-visual (AV) data recorded on a storage medium, and more particularly, to a storage medium storing metadata for providing an enhanced search function.
2. Related Art
Storage media, such as DVDs and Blu-ray discs (BDs), store audio-visual (AV) data composed of video, audio, and/or subtitles that are compression-encoded according to standards for digital video and audio compression, such as an MPEG (Moving Picture Experts Group) standard. Storage media also store additional information, such as the encoding properties of the AV data or the order in which the AV data is to be reproduced. In general, moving pictures recorded on a storage medium are reproduced sequentially in a predetermined order. However, the moving pictures can also be reproduced in units of chapters while the AV data is being reproduced.
The clips 110 are each implemented as one object which includes a clip AV stream 112, i.e., an AV data stream for a high-picture-quality movie, and clip information 114 describing the attributes of that AV data stream. For example, the AV data stream may be compressed according to a standard such as an MPEG (Moving Picture Experts Group) standard. However, the clips 110 do not require the AV data stream 112 to be compressed in all aspects of the present invention. In addition, the clip information 114 may include audio/video properties of the AV data stream 112, an entry point map in which location information for randomly accessible entry points is recorded in units of a predetermined section, and the like.
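The entry point map in the clip information can be thought of as a sorted mapping from presentation time to stream address, used for random access into the clip AV stream. The following is a minimal sketch assuming a simplified in-memory structure; the names and the (time, address) pair layout are illustrative assumptions, not the actual binary on-disc format of the clip information file:

```python
from bisect import bisect_right

# Hypothetical, simplified model of a clip's entry point (EP) map:
# a temporally sorted list of (presentation_time_s, stream_byte_address)
# pairs. The real clip information file uses a binary on-disc format.
ep_map = [
    (0.0, 0),
    (5.0, 1_200_000),
    (10.0, 2_450_000),
    (15.0, 3_700_000),
]

def resolve_entry_point(ep_map, time_s):
    """Return the byte address of the nearest entry point at or before
    the requested presentation time, enabling random access."""
    times = [t for t, _ in ep_map]
    idx = bisect_right(times, time_s) - 1
    if idx < 0:
        raise ValueError("time precedes the first entry point")
    return ep_map[idx][1]

print(resolve_entry_point(ep_map, 12.3))  # address of the 10.0 s entry point
```

A player would jump to the returned address and begin decoding there, which is why each scene's start position can be located from its entry point alone.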
Each playlist 120 includes a playlist mark composed of marks which indicate the positions of clips 110 corresponding to the playlist 120. Each playlist 120 also includes a set of reproduction intervals of these clips 110, and each reproduction interval is referred to as a play item 122. Hence, the AV data can be reproduced in units of playlists 120 and in an order of play items 122 listed in each playlist 120.
The movie object 130 is formed of navigation command programs, and these navigation commands start reproduction of a playlist 120, switch between movie objects 130, or manage reproduction of a playlist 120 according to a user's preference.
The index table 140 is a table at the top layer of the storage medium that defines a plurality of titles and menus, and includes start location information for all titles and menus so that a title or menu selected by a user operation, such as a title search or menu call, can be reproduced. The index table 140 also includes start location information for the title or menu that is automatically reproduced first when the storage medium is loaded into a reproducing apparatus.
However, such a storage medium provides no method of jumping to an arbitrary scene matching a search condition (e.g., scene, character, location, sound, or item) desired by a user and reproducing that scene. In other words, a typical storage medium does not provide a function for moving to a portion of the AV data according to a search condition set by the user and reproducing that portion. Therefore, such a storage medium cannot offer diverse search functions.
Since AV data is compression-encoded according to the MPEG-2 standard, multiplexed, and recorded on a conventional storage medium, it is difficult to manufacture a storage medium that contains the metadata needed to search a moving picture. In addition, once a storage medium is manufactured, it is almost impossible to edit or reuse the AV data or metadata stored on it.
Further, a currently defined playlist mark cannot distinguish multiple angles or multiple paths. Therefore, even when AV data supports multiple angles or multiple paths, it is difficult to provide diverse enhanced search functions on the AV data.
Various aspects and example embodiments of the present invention provide a storage medium storing metadata for providing an enhanced search function using various search keywords of audio-visual (AV) data. The present invention also provides a storage medium storing metadata for actively providing an enhanced search function in connection with AV data in various formats, and an apparatus and method for reproducing data from the storage medium.
Additional aspects and/or advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
In accordance with an aspect of the present invention, there is provided a storage medium storing: audio-visual (AV) data; and metadata for conducting an enhanced search of the AV data by scene using information regarding at least one search keyword.
The AV data may be a movie title. The metadata may be defined for each playlist which is a reproduction unit of the AV data. The enhanced search may be applied to a main playback path playlist which is automatically reproduced according to an index table when the storage medium is loaded.
The metadata may include information regarding an entry point of each scene. Each scene may be represented as the content between two neighboring entry points. When a user searches for contents using a search keyword, search results may be represented as a group of entry points corresponding to metadata whose search keyword information matches the search keyword. The entry points may be sequentially arranged temporally on the playlist.
The metadata may include information regarding an entry point and a duration of each scene. When the entry points are sequentially arranged temporally, each scene may be defined as a section between an entry point of the scene and a point at the end of the duration of the scene.
When a user searches for contents using the search keyword, a playlist may be reproduced from an entry point of a scene selected from search results by the user to the end of the playlist.
When a user searches for contents using the search keyword, a scene selected by the user from search results may be reproduced from the entry point of the scene for the duration of the scene, and a next scene may be reproduced.
When a user searches for contents using the search keyword, the search results may be sequentially reproduced without waiting for a user input.
When a user searches for contents using the search keyword, a scene selected by the user from search results may be reproduced from the entry point of the scene for the duration of the scene, and reproduction may be stopped.
The metadata may further include information regarding the angles supported by each scene. When the AV data is represented by a single angle, each scene may be distinguished by its entry point alone, rather than by the information regarding the angles. In this case, none of the entry points found as a result of conducting the enhanced search using one search keyword overlap each other.
When the AV data is multi-angle data, each scene can be distinguished by both the entry point of the scene and the information regarding the angles. In this case, entry points found as a result of conducting the enhanced search using one search keyword can overlap each other.
The at least one search keyword may comprise at least one of a scene type, a character, an actor, and a search keyword that can be arbitrarily defined by an author. The metadata may be recorded in a file separate from the AV data.
In addition to the example embodiments and aspects as described above, further aspects and embodiments of the present invention will be apparent by reference to the drawings and by study of the following descriptions.
A better understanding of the present invention will become apparent from the following detailed description of example embodiments and the claims when read in connection with the accompanying drawings, all forming a part of the disclosure of this invention. While the following written and illustrated disclosure focuses on disclosing example embodiments of the invention, it should be clearly understood that the same is by way of illustration and example only and that the invention is not limited thereto. The spirit and scope of the present invention are limited only by the terms of the appended claims. The following represents brief descriptions of the drawings, wherein:
Reference will now be made in detail to the present embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the like elements throughout. The embodiments are described below in order to explain the present invention by referring to the figures.
The reading unit 210 reads audio-visual (AV) data and metadata for providing the enhanced search function from a storage medium 250, such as a Blu-ray disc (BD). The reproducing unit 220 decodes and reproduces the AV data. In particular, when a user inputs a search keyword, the reproducing unit 220 receives from the search unit 230 information regarding a scene matching the search keyword and reproduces the scene. When there are multiple scenes matching the search keyword, the reproducing unit 220 displays the scenes on the user interface 240 and reproduces one or more of the scenes selected by the user or sequentially reproduces all of the scenes. The reproducing unit 220 may also be called a playback control engine.
The search unit 230 receives a search keyword from the user interface 240 and searches for scenes matching the search keyword. Then, the search unit 230 transmits the search results to the user interface 240, which displays them in the form of a list, or to the reproducing unit 220, which reproduces them. As illustrated in
The user interface 240 receives a search keyword input by a user or displays search results. Also, when a user selects a scene from search results, i.e., a list of scenes found, displayed on the user interface 240, the user interface 240 receives information regarding the selection.
Next, all scenes matching the input search keyword are searched for with reference to a metadata file at block 320. The metadata file defines a plurality of scenes, and includes information regarding search keywords associated with each scene and an entry point of each scene. The structure of the metadata file will be described in detail below. Portions of AV data which correspond to found scenes are searched for using entry points of the found scenes and are reproduced at block 330. In this way, an enhanced search can be conducted on AV data using various search keywords. Hereinafter, the enhanced search function will also be referred to as a “title scene search function.”
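The search at block 320 amounts to matching each scene's search keyword information and collecting the entry points of the matching scenes in temporal order. The following is a minimal sketch with hypothetical scene data; the dictionary layout is an assumption for illustration, not the metadata file's actual structure:

```python
# Hypothetical scene metadata: each scene carries an entry point (seconds
# into the playlist), a duration, and its search keyword information.
scenes = [
    {"entry": 0.0,  "duration": 40.0, "keywords": {"battle", "A", "x"}},
    {"entry": 40.0, "duration": 25.0, "keywords": {"dialog", "B"}},
    {"entry": 65.0, "duration": 30.0, "keywords": {"battle", "C", "tower"}},
]

def title_scene_search(scenes, keyword):
    """Return (entry, duration) pairs of the matching scenes, sorted
    temporally, as the metadata-based enhanced search would produce."""
    hits = [(s["entry"], s["duration"])
            for s in scenes if keyword in s["keywords"]]
    return sorted(hits)

print(title_scene_search(scenes, "battle"))  # [(0.0, 40.0), (65.0, 30.0)]
```

The entry points of the hits are then resolved to stream addresses (via the entry point map) so that the found portions of the AV data can be reproduced at block 330.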
The user selects one of a plurality of search keyword categories displayed on the user interface 240 at stage #2, and selects a search keyword from the selected search keyword category at stage #3. For example, when the user selects "item" as a search keyword category and selects "tower" as a search keyword corresponding to "item," the movie title is searched for scenes in which "tower" appears, and the search results are displayed together with respective thumbnails at stage #4. When the user selects one of the search results, i.e., found scenes, the selected scene is reproduced at stage #5. Using a command such as "skip to next search result" or "skip to previous search result" on the user interface 240, a previous or next scene can be searched for and reproduced at stage #6.
A “highlight playback” function for sequentially reproducing all scenes found can also be provided. In the highlight playback, all search results are sequentially reproduced. As a result, there is no need to wait until a user selects one of the search results. When a user selects a search keyword associated with contents, search results for the selected search keyword are obtained. The search results form the highlights of the contents associated with the selected search keyword.
The structure of the metadata for the title scene search will now be described in detail herein below.
Using an entry point (EP) map included in clip information 114, each entry point is converted into an address of a scene in a clip AV stream 112 included in each clip 110. Therefore, the start position of each scene included in a clip AV stream 112, which is real AV data, can be found using an entry point. Each scene 512 also includes information regarding search keywords associated therewith (hereinafter referred to as search keyword information). For example, the search keyword information may include the following:
Scene 1 is a battle scene,
Characters are A, B and C,
Actors are a, b and c, and
Location is x.
Accordingly, a user can search for scenes matching a desired search keyword based on the search keyword information of each scene 512. In addition, the start positions of found scenes in a clip AV stream 112 can be determined using the entry points of the found scenes, and then the found scenes can be reproduced.
The metadata 500 for the title scene search is stored in files in a META directory, separately from the AV data. The metadata file for a disc library is dlmt_xxx.xml, and the metadata file for the title scene search is esmt_xxx_yyyyy.xml. According to an embodiment of the present invention, the metadata 500 is recorded in XML, a markup language, for easy editing and reuse. Hence, even after the storage medium is manufactured, the data recorded thereon can be edited and reused.
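Because the metadata is recorded in XML, a reproducing apparatus could load it with any standard parser. The fragment below is a sketch only: the element and attribute names (`titleSceneSearch`, `entryPoint`, `sceneType`, and so on) are assumptions made for illustration, not the actual schema of esmt_xxx_yyyyy.xml:

```python
import xml.etree.ElementTree as ET

# Illustrative title-scene-search metadata fragment; the tag and
# attribute names are assumptions, not the actual esmt_xxx_yyyyy.xml
# schema defined by the format.
ESMT_XML = """
<titleSceneSearch playlist="playlist_1">
  <scene entryPoint="0" duration="40" sceneType="battle" character="A" actor="a"/>
  <scene entryPoint="65" duration="30" sceneType="battle" character="C" actor="c"/>
</titleSceneSearch>
"""

def load_scenes(xml_text):
    """Parse the sketch schema into per-scene dictionaries holding the
    entry point, duration, and search keyword information."""
    root = ET.fromstring(xml_text)
    return [
        {
            "entry": int(s.get("entryPoint")),
            "duration": int(s.get("duration")),
            "keywords": {s.get("sceneType"), s.get("character"), s.get("actor")},
        }
        for s in root.findall("scene")
    ]

scene_list = load_scenes(ESMT_XML)
print([s["entry"] for s in scene_list])  # [0, 65]
```

Keeping the metadata in a plain-text markup file, apart from the multiplexed AV stream, is what makes the post-manufacture editing and reuse described above practical.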
An example of conducting the title scene search using metadata 500 will now be described as follows.
Specifically,
Referring to
A playlist which is automatically reproduced according to the index table when a storage medium 250 is loaded into an example reproducing apparatus 200, shown in
In summary, the application scope of the title that provides the enhanced search function has the following constraints.
When a user searches for contents using a search keyword, the search results are represented as a group of entry points included in scenes having metadata whose search keyword information matches the search keyword. Such entry points are sequentially arranged temporally and transmitted to the playback control engine, i.e., the reproducing unit 220, as shown in
Referring to
1) Scenario 1: Simple Playback
Regardless of duration, a playlist is reproduced from an entry point of a scene selected by a user from search results to the end of the playlist unless there is a user input. For example, when a user selects scenetype #1, playlist #1 is reproduced from an entry point of scene #1 to the end of playlist #1.
2) Scenario 2: Highlight Playback
A playlist is reproduced from an entry point of a scene selected by a user from the search results until the end of the duration of the selected scene. Then, the reproducing unit 220 jumps to the next scene and reproduces it. For example, when a user selects scenetype #2, only scene #1 and scene #3, which are the search results, are reproduced. In other words, only the highlights of playlist #1 that are associated with the search keyword scenetype #2 are reproduced. Another example of the highlight playback is illustrated in
3) Scenario 3: Scene-Based Playback
Search results are reproduced by scene. In other words, a scene selected by a user from search results is reproduced from an entry point of the scene for the duration of the scene. After the duration, reproduction is stopped until a user input is received. Scenario 3 is similar to scenario 2 except that the reproduction is stopped at the end of the scene.
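The three scenarios differ only in where playback stops after jumping to the selected entry point. The following sketch models them over hypothetical search results (times in seconds); it is an illustration of the scenarios, not a player implementation:

```python
# Hypothetical search results: temporally sorted (entry, duration) pairs,
# plus the playlist's total length in seconds.
results = [(10.0, 5.0), (30.0, 8.0), (50.0, 4.0)]
PLAYLIST_END = 60.0

def playback_intervals(results, selected, scenario, playlist_end):
    """Return the (start, stop) intervals reproduced under each scenario."""
    entry, duration = results[selected]
    if scenario == "simple":      # Scenario 1: play to the playlist's end
        return [(entry, playlist_end)]
    if scenario == "highlight":   # Scenario 2: each result in turn, back to back
        return [(e, e + d) for e, d in results[selected:]]
    if scenario == "scene":       # Scenario 3: one scene, then stop
        return [(entry, entry + duration)]
    raise ValueError("unknown scenario")

print(playback_intervals(results, 0, "simple", PLAYLIST_END))     # [(10.0, 60.0)]
print(playback_intervals(results, 0, "highlight", PLAYLIST_END))  # all found scenes
print(playback_intervals(results, 1, "scene", PLAYLIST_END))      # [(30.0, 38.0)]
```

Note that scenario 1 ignores the duration entirely, while scenarios 2 and 3 both honor it and differ only in whether playback continues to the next search result.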
Found scenes can overlap each other because overlapping entry points can be distinguished by “angle_num” shown in
Referring to
In the case of play items which support multiple angles (for example, the second and fourth play items), the metadata 500 is applied to AV data corresponding to one of the supported multiple angles. For example, in the case of scene #1, parts of the first and second play items are defined as a reproduction section, and the value of angle_num is three. The value of angle_num is applied only to play items that support multiple angles. Therefore, play items that do not support multiple angles are reproduced at a default angle. Player status register (PSR) 3, which is a state register of the reproducing apparatus 200, as shown, for example, in
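The angle handling just described, in which angle_num applies only to play items that support multiple angles while the others fall back to a default angle, can be sketched as follows; the play item structure is a hypothetical simplification (a real player would track the current angle in PSR 3):

```python
DEFAULT_ANGLE = 1

# Hypothetical play items within a scene's reproduction section; only
# some of them support multiple angles.
play_items = [
    {"name": "item1", "num_angles": 1},
    {"name": "item2", "num_angles": 4},  # multi-angle play item
]

def effective_angles(play_items, angle_num):
    """angle_num is applied only where multiple angles are supported;
    single-angle play items are reproduced at the default angle."""
    return {
        p["name"]: angle_num if p["num_angles"] > 1 else DEFAULT_ANGLE
        for p in play_items
    }

print(effective_angles(play_items, 3))  # {'item1': 1, 'item2': 3}
```

This is also why, in the multi-angle case, entry points of found scenes may overlap: two scenes can share a time position yet be distinguished by their angle_num values.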
As described above, the present invention provides a storage medium storing metadata for providing an enhanced search function using various search keywords for AV data, and an apparatus and method for reproducing data from the storage medium. The present invention can also provide the enhanced search function in connection with AV data in various formats.
In other words, the metadata for providing the enhanced search function is defined by scene by an author, and each scene includes information regarding at least one search keyword. In addition, each scene includes information regarding an entry point and/or a duration, angles, and so on. Hence, the enhanced search function can be conducted using various search keywords.
Further, search results can be reproduced according to diverse scenarios, and the enhanced search function can be provided for movie titles that support multiple angles or multiple paths. Moreover, metadata can be created in multiple languages, thereby enabling the provision of the enhanced search function that supports multiple languages.
Example embodiments of the enhanced search method according to the present invention can be written as a computer program and can be implemented in a general digital computer that executes the computer program recorded on a computer-readable medium. Codes and code segments constituting the computer program can be easily construed by computer programmers skilled in the art. The computer-readable medium can be any data storage device that can store data and can thereafter be read by a computer. Examples of the computer-readable medium include read-only memory (ROM), random-access memory (RAM), CD-ROMs, magnetic tapes, floppy disks, optical data storage devices, and carrier waves (such as data transmission through the Internet). The computer-readable medium can also be distributed over network-coupled computer systems so that the computer program is stored and executed in a distributed fashion.
While the present invention has been particularly shown and described with reference to example embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention. For example, any computer-readable medium or data storage device may be utilized, as long as metadata is included in the playlist in the manner shown in
Number | Date | Country | Kind |
---|---|---|---|
10-2005-0001749 | Jan 2005 | KR | national |
10-2005-0108532 | Nov 2005 | KR | national |
Number | Date | Country
---|---|---
20060153542 A1 | Jul 2006 | US