This application claims the benefit, under 35 U.S.C. §119, of European Patent Application 12305805.9, filed Jul. 5, 2012.
The invention relates to a method and an apparatus for prioritizing metadata, and more specifically to a method and an apparatus for prioritizing metadata associated to audio or video data based on an analysis of priority variables. The invention further relates to a method and an apparatus for annotating audio or video data.
Today huge amounts of data are available in libraries, archives and databases. Digitization and metadata, i.e. data about data, have simplified the use of these data. During digitization or content analysis, different metadata extraction methods are used to extract metadata and save it to an internal metadata repository. With the help of metadata, the underlying data can be accessed efficiently. However, with the increasing number of available methods for extracting metadata, the amount of metadata in the repositories grows accordingly, which in turn reduces the efficiency of data access. Raising the quality of the metadata has thus become more and more important.

The daily increasing amount of digital audio and video content poses new challenges for content management systems in digital film and video archives. Therefore, authoring tools that can access and edit audio and video content efficiently are required. One approach for tackling the problem is to describe the contents of the audio and video files with the help of semantically linked metadata, a kind of qualified metadata, and to use this type of metadata for effective management of the huge data sets. Browsing through content as well as search and retrieval of specific content can be realized very efficiently by applying semantically linked metadata. Various types of recommendations for similar content can also be realized with semantically linked metadata.
Nonetheless, even with semantically linked metadata, the overall amount of metadata associated to an audio or video file is too large for certain applications. For example, for the semantic annotation and linking of a video file, archivists would prefer a tool that efficiently limits the amount of metadata that is presented. To give an example, a face detection algorithm detects all faces in a video regardless of how relevant the detected faces are for a semantic description of the content. Typically, in a news program only about 5% of the detected faces are relevant for the semantic annotation and linking work. Therefore, it would greatly increase the efficiency and usability of a manual semantic annotation and linking tool if only the relevant elements were presented to the user in a graphical user interface. A key aspect of such a tool is hence a prioritization of semantic metadata regarding their probable relevance to the semantic description of the video data.
A solution for prioritizing metadata has been proposed in European Patent Application 11306747.4, where prioritization values are automatically determined by combining specific characteristics of independently generated semantic metadata. According to this solution, a method for determining priority values of metadata items of a first set of metadata items associated to a video data item, the first set of metadata items being of a first type, comprises the steps of:
In order to prioritize the metadata, priority variables are used. These priority variables are calculated from the different types of metadata and/or from relationships between the types of metadata. Once the priority variables have been determined, they are analyzed to automatically classify the metadata items into different categories, e.g. important and non-important. The final priority value for each metadata item is thus represented by a flag, i.e. essentially by an integer value.
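By way of illustration, the following Python sketch shows such a flag-based prioritization for face detection metadata. The metadata fields, weights and threshold are purely illustrative assumptions and are not taken from the cited application; only the overall pattern, i.e. combining priority variables and thresholding them into an integer flag, corresponds to the approach described above.

```python
# Minimal sketch of flag-based prioritization (prior-art style): a priority
# variable is derived from characteristics of the metadata item and thresholded
# into an integer flag. Names, weights and threshold are illustrative only.

from dataclasses import dataclass

@dataclass
class FaceMetadataItem:
    face_id: str
    screen_coverage: float   # fraction of the frame covered by the face
    occurrences: int         # number of shots in which the face appears

def priority_flag(item: FaceMetadataItem,
                  coverage_weight: float = 0.7,
                  occurrence_weight: float = 0.3,
                  threshold: float = 0.5) -> int:
    """Return 1 (important) or 0 (non-important) for a metadata item."""
    # Priority variable combining two characteristics of the metadata item.
    occurrence_score = min(item.occurrences / 10.0, 1.0)
    priority_variable = (coverage_weight * item.screen_coverage
                         + occurrence_weight * occurrence_score)
    return 1 if priority_variable >= threshold else 0

if __name__ == "__main__":
    anchor = FaceMetadataItem("face-001", screen_coverage=0.6, occurrences=12)
    passerby = FaceMetadataItem("face-002", screen_coverage=0.05, occurrences=1)
    print(priority_flag(anchor), priority_flag(passerby))  # prints: 1 0
```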
It is an object of the present invention to propose an improved solution for prioritizing metadata, which is suitable for complex applications of the prioritized metadata.
According to the invention, a method for prioritizing a metadata item associated to audio or video data comprises the steps of:
Accordingly, an apparatus for prioritizing a metadata item associated to audio or video data comprises:
It has been found that a simple integer as priority value restricts the use of the prioritized metadata in more complex applications. Therefore, according to the invention the prioritization results are represented by a priority table, e.g. an SQL database table. Such a table enables much more complex metadata prioritization applications. For example, different metadata prioritization methods may be applied to the same metadata item, which may result in different priority values. Using a single priority value as proposed in the prior art would require the developer to select a "best" priority value from the multiple results obtained in the processing stage of metadata prioritization. As a result, all other priority values for a metadata item would be lost. The solution according to the invention preserves all determined priority values for future applications.
As a further advantage, the solution according to the invention allows storing additional information for the priority values.
To give an example, the additional information may include the total number of prioritization methods applied to a metadata item or of priority values available for a metadata item, a preferred priority value set by the developer or a user, information on whether the preferred priority value is an original or a post-edited value, etc.
Preferably, a priority detail table is generated and stored, which comprises information about the prioritization method that was used. This has the advantage that it is documented how a specific priority value was actually determined.
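As a non-limiting sketch, the priority table and the priority detail table could be laid out as follows, here with SQLite via Python. The field names UUID, NumberOfMethod, MethodsTracking, DefaultPriority, UserPriority, TrackingID and ProcessType are taken from the application scenario discussed further below; the Method and Value columns are assumptions added for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # illustrative in-memory database

# Priority table: one row per prioritized metadata item, referenced
# from the segment table via its UUID.
conn.execute("""
CREATE TABLE priority (
    UUID            TEXT PRIMARY KEY,  -- e.g. 'xxxA', referenced by the segment table
    NumberOfMethod  INTEGER,           -- number of priority values available
    MethodsTracking TEXT,              -- TrackingID shared by the detail entries
    DefaultPriority TEXT,              -- UUID of the developer-set detail entry
    UserPriority    TEXT               -- UUID of the user-set detail entry
)""")

# Priority detail table: one row per determined priority value,
# documenting how that value was obtained.
conn.execute("""
CREATE TABLE priority_detail (
    UUID         TEXT PRIMARY KEY,     -- e.g. 'xx1', 'xx2', ...
    TrackingID   TEXT,                 -- identical for all values of one metadata item
    ProcessType  TEXT,                 -- 'original' or 'post-edit'
    Method       TEXT,                 -- prioritization method used (assumption)
    Value        REAL                  -- the priority value itself (assumption)
)""")
conn.commit()
```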
According to another aspect of the invention, a method for annotating audio or video data comprises the steps of:
Accordingly, an apparatus for annotating audio or video data comprises:
The solution has the advantage that only a subset of the metadata items is presented to the user, e.g. in a graphical user interface on a display. This subset advantageously comprises only the most relevant metadata items, i.e. those metadata items with the highest priority values. This greatly increases the efficiency of a manual annotation of the audio or video data. For determining the priority values, either one of a plurality of prioritization methods is used, or previously determined priority tables with priority values are retrieved from the metadata repository.
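A minimal sketch of how an annotation tool could retrieve only the highest-priority metadata items from such tables is given below. It assumes the table layout sketched above; the rule of preferring the user-set value over the developer default and over the original values is likewise an assumption.

```python
def top_items(conn, limit=20):
    """Return the metadata item UUIDs with the highest preferred priority value,
    using the sqlite3 connection and tables from the sketch above."""
    return conn.execute("""
        SELECT p.UUID,
               COALESCE(u.Value, d.Value,
                        (SELECT MAX(o.Value) FROM priority_detail AS o
                          WHERE o.TrackingID = p.MethodsTracking
                            AND o.ProcessType = 'original')) AS priority_value
        FROM priority AS p
        LEFT JOIN priority_detail AS u ON u.UUID = p.UserPriority
        LEFT JOIN priority_detail AS d ON d.UUID = p.DefaultPriority
        ORDER BY priority_value DESC
        LIMIT ?""", (limit,)).fetchall()
```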
For a better understanding the invention shall now be explained in more detail in the following description with reference to the figures. It is understood that the invention is not limited to this exemplary embodiment and that specified features can also expediently be combined and/or modified without departing from the scope of the present invention as defined in the appended claims. In the figures:
In the following the invention shall be explained for metadata extracted from video data. Of course, the invention is not limited to this type of data. It can likewise be applied to audio data or other types of data, e.g. text data.
During an automatic metadata extraction from video data, a plurality of types of metadata are generated. A first type of metadata is temporal segmentation metadata, which is based on the detection of scenes, shots, sub-shots, and the like. A second type of metadata is spatial segmentation metadata, which is obtained, for example, by face detection or face group segmentation, or more generally by object detection. Another type of metadata is quality metadata, such as contrast, brightness, sharpness, information about blocking and compression artifacts, overall quality, or noise. Impairment metadata gives information about dropouts, dirt, scratches, etc. Finally, semantic metadata includes, inter alia, text annotations, subtitles and the genre of the video data. For developing high-performance metadata applications, these metadata are prioritized, as described, for example, in European Patent Application 11306747.4.
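Purely as an illustration, the extracted metadata types listed above could be modeled as follows; the class and attribute names are assumptions.

```python
from enum import Enum, auto
from dataclasses import dataclass, field

class MetadataType(Enum):
    TEMPORAL_SEGMENTATION = auto()  # scenes, shots, sub-shots
    SPATIAL_SEGMENTATION = auto()   # faces, face groups, detected objects
    QUALITY = auto()                # contrast, brightness, sharpness, noise, ...
    IMPAIRMENT = auto()             # dropouts, dirt, scratches
    SEMANTIC = auto()               # text annotations, subtitles, genre

@dataclass
class MetadataItem:
    uuid: str
    mtype: MetadataType
    payload: dict = field(default_factory=dict)  # extractor-specific attributes
```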
An exemplary metadata prioritization result in the form of a segment table is illustrated in
In order to enable more complex scenarios for metadata prioritization, according to the invention the segment table of
An exemplary priority table of a prioritized metadata item is depicted in
The above mentioned priority detail table is exemplified in
In the following an application scenario of the solution according to the present invention shall be discussed.
When the first priority value for a metadata item is generated, a new entry is inserted into the priority detail table of
Now, when another metadata prioritization method is used to generate a new priority value for the same metadata item, a further entry is inserted into the priority detail table, for example UUID "xx2", TrackingID "UUID-123", ProcessType "original", etc. Note that the TrackingID value should be the same as the one for "xx1", because the two entries refer to the same metadata item. Subsequently, the existing entry in the priority table under UUID "xxxA" is updated. In particular, the NumberOfMethod field is set to the value "2". The MethodsTracking value "UUID-123" remains unchanged. Also, the identifier string "xxxA" in the segment table remains unchanged.
A developer may set a default priority value in the priority detail table, e.g. under UUID “xx3”. The corresponding ProcessType in the priority detail table has the value “post-edit”. The priority table is updated accordingly. The NumberOfMethod field and the DefaultPriority field are set to the values “3” and “xx3”, respectively, where “xx3” is the associated UUID value in the priority detail table.
Also, a user may set a user-preferred priority value in the priority detail table, e.g. under UUID "xx4". The corresponding ProcessType in the priority detail table has the value "post-edit". The priority table is updated accordingly. The NumberOfMethod field and the UserPriority field are set to the values "4" and "xx4", respectively, where "xx4" is the associated UUID value in the priority detail table.
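The scenario described above can be traced step by step against the tables sketched earlier. The UUIDs "xxxA", "xx1" to "xx4" and "UUID-123" are the ones used in the description; the helper function, the method names and the concrete priority values are assumptions.

```python
# Sketch of the application scenario above, run against the sqlite3 connection
# and tables defined in the earlier sketch. Concrete values are illustrative.

def add_priority_value(conn, detail_uuid, tracking_id, process_type, method, value):
    """Insert one priority value into the priority detail table."""
    conn.execute("INSERT INTO priority_detail VALUES (?, ?, ?, ?, ?)",
                 (detail_uuid, tracking_id, process_type, method, value))

# 1) First priority value: new detail entry and new priority table entry.
add_priority_value(conn, "xx1", "UUID-123", "original", "face-coverage", 0.8)
conn.execute("INSERT INTO priority VALUES (?, ?, ?, NULL, NULL)",
             ("xxxA", 1, "UUID-123"))

# 2) Second prioritization method: same TrackingID, NumberOfMethod becomes 2.
add_priority_value(conn, "xx2", "UUID-123", "original", "shot-frequency", 0.6)
conn.execute("UPDATE priority SET NumberOfMethod = 2 WHERE UUID = 'xxxA'")

# 3) Developer-set default value: post-edit entry, DefaultPriority points to it.
add_priority_value(conn, "xx3", "UUID-123", "post-edit", "developer-default", 0.7)
conn.execute("UPDATE priority SET NumberOfMethod = 3, DefaultPriority = 'xx3' "
             "WHERE UUID = 'xxxA'")

# 4) User-preferred value: post-edit entry, UserPriority points to it.
add_priority_value(conn, "xx4", "UUID-123", "post-edit", "user-preference", 0.9)
conn.execute("UPDATE priority SET NumberOfMethod = 4, UserPriority = 'xx4' "
             "WHERE UUID = 'xxxA'")
conn.commit()
```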