1. Field of the Invention
The present invention relates to a method of recording additional data, such as lyrics and user input data, on a rewritable recording medium in synchronization with audio data, and to a method of reproducing them synchronously therefrom.
2. Description of the Related Art
A disk-type recording medium such as a Compact Disk (CD) can store high-quality digital audio data permanently, so it has become a very popular recording medium.
Recently, a Digital Versatile Disk (called ‘DVD’ hereinafter) has been developed as a new disk-type recording medium. A DVD can store much more data than a CD; that is, high-quality moving pictures or audio data can be recorded on a DVD for a much longer time. Therefore, DVDs are expected to be used widely in the near future.
There are three types of DVDs: the read-only DVD-ROM, the write-once DVD-R, and the rewritable DVD-RAM and DVD-RW. For rewritable DVDs, standardization of the data writing format is in progress.
The disk device configured as
If an analog signal is applied to the disk device of
The writing processor 16 converts the series of compressed data into binary signals that are written in mark/space patterns on the writable DVD 10. Digital data that arrive already compressed from an external source are processed directly by the writing processor 16 and written onto the writable DVD 10.
After recording of audio data, navigation data for the audio data are created and then recorded on the writable DVD 10.
The TXTDT_MG can include additional data for the recorded songs, such as lyrics. Therefore, when the controller 14 selects and reproduces a recorded song from the rewritable disk 10, it can present the lyric text on a screen by reading it from the TXTDT_MG.
Consequently, when a user selects a recorded song to play back from the rewritable DVD 10, he or she is able to view its lyrics on a screen.
However, each item of additional data in the TXTDT_MG, such as a lyric, is linked with a recorded song only as a whole. In other words, a lyric in the TXTDT_MG carries no information for detailed synchronization with the recorded song. Therefore, it is impossible to display the lyric data step by step at the same pace at which the recorded song is reproduced from a rewritable DVD.
It is an object of the present invention to provide a synchronizing method that records additional data such as lyric data and user input data to be synchronized minutely with audio data on a rewritable recording medium.
It is another object of the present invention to provide a synchronizing method that synchronously reproduces audio data and its additional data that have been recorded with minute synchronizing information.
It is another object of the present invention to provide a method and apparatus for providing data structures that allow a synchronous reproduction of main data and additional data, which address the limitations and disadvantages associated with the related art.
An audio data related information recording method in accordance with an aspect of the present invention segments additional information related to audio data recorded on a rewritable recording medium, records the resulting information segments, and further records synchronizing information, e.g., a time length for which each information segment is to be kept presented or a start time at which presentation of each information segment is to begin, in the vicinity of each information segment, so that each information segment can be presented in synchronization with a corresponding part of the recorded audio data.
An audio data related information reproducing method in accordance with an aspect of the present invention sequentially reads a plurality of information segments constituting a piece of additional information related to audio data recorded on a rewritable recording medium, and presents each information segment based on synchronizing information, e.g., a time length for which the information segment is to be kept presented or a start time at which its presentation is to begin, recorded in association with that information segment, so that each information segment is presented in synchronization with a corresponding part of the recorded audio data.
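As a minimal sketch of how such per-segment synchronizing information could drive presentation, consider the following Python example. The class and function names, and the use of seconds as the time unit, are assumptions made for illustration only and are not part of the recording format described here.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class InfoSegment:
    """One segment of additional information (e.g., one line of lyric text)."""
    text: str
    duration_s: Optional[float] = None   # time length to keep this segment presented
    start_s: Optional[float] = None      # or: start time of its presentation

def segment_for_time(segments: List[InfoSegment], elapsed_s: float) -> Optional[InfoSegment]:
    """Return the segment that should be presented at audio playback time `elapsed_s`."""
    cursor = 0.0
    for i, seg in enumerate(segments):
        start = seg.start_s if seg.start_s is not None else cursor
        if seg.duration_s is not None:
            end = start + seg.duration_s
        elif i + 1 < len(segments) and segments[i + 1].start_s is not None:
            # With start times only, a segment stays up until the next one begins.
            end = segments[i + 1].start_s
        else:
            end = float("inf")
        if start <= elapsed_s < end:
            return seg
        cursor = end
    return None
```

For example, `segment_for_time(segments, 12.3)` would return the lyric line to be shown 12.3 seconds into the song, regardless of whether the segments carry display durations or explicit start times.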
According to an aspect of the present invention, there is provided a method of reproducing main data and additional data, the method comprising: receiving the additional data associated with the main data, the additional data being divided into a plurality of segment units; and reproducing the additional data in a synchronous manner with the main data using time information if indication information indicates a presence of the time information, wherein the time information indicates a presentation time of the additional data with respect to the main data, and wherein the main data and the additional data are reproduced according to management data, the management data including link information for linking the main data and the additional data.
According to another aspect of the present invention, there is provided a method of reproducing main data and additional data, the method comprising: providing the additional data associated with the main data, the additional data being divided into a plurality of segment units; and reproducing the additional data in a synchronous manner with the main data using time information if indication information indicates a presence of the time information, wherein the time information is present only if the indication information indicates that the time information is present, wherein the main data and the additional data are reproduced according to link information for linking the main data and the additional data, and wherein the link information is stored separately from the main data and the additional data.
According to another aspect of the present invention, there is provided a method of providing additional data to be reproduced with main data, the method comprising: providing the additional data associated with the main data, the additional data being divided into a plurality of segment units and capable of being reproduced with the main data in a synchronous manner; and providing management data associated with the additional data, wherein the management data includes link information for linking the main data and the additional data, time information for reproducing the additional data with the main data in the synchronous manner, and attribute information for providing at least one attribute of the additional data.
According to another aspect of the present invention, there is provided a method of providing additional data to be reproduced with main data, the method comprising: providing the additional data associated with the main data, the additional data being divided into a plurality of segment units and capable of being reproduced with the main data in a synchronous manner; and providing management data associated with the additional data, wherein the management data includes time information and indication information indicating a presence of the time information, the time information being present only if the indication information indicates that the time information is present, wherein the management data further includes link information for linking the main data and the additional data, and wherein the additional data is provided separately from the main data.
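The following sketch illustrates the idea of time information whose presence is signaled by indication information within the management data. The byte layout, flag value, and field names are hypothetical; the aspects described above do not prescribe a concrete encoding.

```python
from dataclasses import dataclass
from typing import Optional

TIME_INFO_PRESENT = 0x01  # hypothetical indication flag bit

@dataclass
class ManagementEntry:
    indication: int              # indication information
    time_info_ms: Optional[int]  # present only when the indication flag is set
    link_id: str                 # link information tying the additional data to its main data

def parse_management_entry(raw: bytes) -> ManagementEntry:
    """Parse one entry: an indication byte, an optional 4-byte time field, then a link id."""
    indication = raw[0]
    offset = 1
    time_info_ms = None
    if indication & TIME_INFO_PRESENT:
        time_info_ms = int.from_bytes(raw[offset:offset + 4], "big")
        offset += 4
    link_id = raw[offset:].decode("ascii")
    return ManagementEntry(indication, time_info_ms, link_id)
```

For instance, an entry whose first byte is 0x01 would carry a 4-byte time value before the link identifier, while an entry beginning with 0x00 would omit the time field entirely.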
These and other objects of the present application will become more readily apparent from the detailed description given hereinafter. However, it should be understood that the detailed description and specific examples, while indicating preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art from this detailed description.
The accompanying drawings, which are included to provide a further understanding of the invention, illustrate the preferred embodiments of the invention, and together with the description, serve to explain the principles of the present invention.
In the drawings:
In order that the invention may be fully understood, preferred embodiments thereof will now be described with reference to the accompanying drawings.
In an audio data recording method in accordance with an embodiment of the present invention, lyric or user data related to a song recorded on a rewritable disk such as a DVD-RW is segmented into several units. Each segmented unit is linked with a corresponding part of the recorded song through segment synchronizing information, for the purpose of minute synchronization between the song and the additional data.
As shown in
The output-time related information ‘Time #i’ is either a time length for which the corresponding lyric segment is kept displayed, or the instant at which the corresponding lyric segment starts to be displayed. The lyric text may be displayed as subtitles. This time information is also used to differentiate the currently linked lyric segment from several lyric segments displayed together, by giving it a different color or font for the specified duration.
Each lyric segment includes an ID code ‘IDCD’ as shown in
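As a rough illustration of how a full lyric might be divided into segments, each carrying an ID code ‘IDCD’ and output-time information ‘Time #i’ ahead of its text ‘SG_L_TXT #i’, consider the sketch below. The numeric ID code values and the dictionary representation are assumptions for illustration, not the actual on-disc format.

```python
# Hypothetical ID code values; the real codes are defined by the recording format.
ID_TIME_LENGTH = 0x01   # 'Time' holds a display duration for the following segment
ID_START_TIME = 0x02    # 'Time' holds the instant at which the segment starts to be displayed
ID_USER_DATA = 0x10     # the segment carries user input data instead of lyric text

def build_lyric_segments(lyric_lines, durations_s):
    """Pair each lyric line with an ID code and its output-time information.

    `lyric_lines` is the full lyric split into presentation units, and
    `durations_s[i]` is how long line i should stay on screen while the song plays.
    """
    return [
        {"IDCD": ID_TIME_LENGTH, "Time": duration, "SG_L_TXT": text}
        for text, duration in zip(lyric_lines, durations_s)
    ]

# Example: three lyric lines, each displayed for the length of its sung passage.
segments = build_lyric_segments(
    ["First line of the song", "Second line", "Third line"], [4.0, 3.5, 5.0]
)
```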
Each group of segmented units is linked with a recorded song through each ALUI search pointer in TXTDT_MG as shown in
To link a recorded song with an ALUI search pointer, a piece of CI (Cell Information) related with an AOB (Audio OBject), which corresponds to a single recorded song in general, is structured in the ORG_PGCI as shown in
Each CI includes an AOBI_SRPN (AOB Information SeaRch Pointer Number) for indexing the information location of a related AOB (or song), an ASVUI_SRPN (ASVU (Audio Still Video Unit) Information SeaRch Pointer Number) for indexing the information location of still video data linked with the AOB, and an ALUI search pointer number ‘ALUI_SRPN’ for indexing an ALUI search pointer in the TXTDT_MG that points to a segment group containing the full lyric text related with the AOB.
Thus, if a song, namely an AOB, is chosen, the controller 14 reads the number written in the ALUI_SRPN of the CI associated with the selected AOB, and reads the address in the ALUI search pointer of the TXTDT_MG indexed by that number. The location of the lyric text related with the selected song is found from this address.
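A minimal sketch of this pointer chain, with the structures modeled as plain dictionaries (an assumption for illustration; the actual TXTDT_MG layout is binary), might look as follows.

```python
def locate_lyric_text(cell_info, txtdt_mg):
    """Resolve the lyric location for a selected AOB via its CI and the TXTDT_MG.

    cell_info: the CI of the selected AOB, holding the 'ALUI_SRPN' pointer number.
    txtdt_mg:  maps pointer numbers to ALUI search pointers, each holding the
               address of the segment group that contains the full lyric text.
    """
    pointer_number = cell_info["ALUI_SRPN"]     # read from the CI of the selected song
    search_pointer = txtdt_mg[pointer_number]   # index the ALUI search pointer in TXTDT_MG
    return search_pointer["address"]            # location of the linked segment group

# Example with made-up values: the CI points at search pointer number 3,
# which in turn holds the address of the lyric segment group.
address = locate_lyric_text({"ALUI_SRPN": 3}, {3: {"address": 0x4A00}})
```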
Then, the lyric segments are retrieved sequentially from the addressed location by the reproduced signal processor 12. At this time, the controller 14 examines the ID code preceding each lyric segment. If the ID code indicates the type of output-time related information, e.g., a time length for which the following lyric segment ‘SG_L_TXT #i’ is displayed, the controller 14 keeps outputting each lyric segment for the duration specified by the time length and replaces the current lyric segment with the next one after the time length expires. This operation continues until the end of the AOB or until a stop request is received.
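A compact sketch of this controller-side display loop is shown below, reusing the segment form of the earlier examples and assuming simple placeholder callbacks `display` and `is_stopped` for the player's output and stop-request check.

```python
import time

ID_TIME_LENGTH = 0x01  # same hypothetical ID code value as in the earlier sketch

def present_lyric_segments(segments, display, is_stopped):
    """Show each lyric segment for its recorded time length, then advance to the next."""
    for seg in segments:
        if is_stopped():                 # a stop request ends the synchronized display
            break
        if seg["IDCD"] == ID_TIME_LENGTH:
            display(seg["SG_L_TXT"])
            time.sleep(seg["Time"])      # keep the segment up for its specified duration
        else:
            display(seg["SG_L_TXT"])     # no time length: simply output in sequence
```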
Referring to
If the ID code indicates not output-time related information but another type of additional information, e.g., user data, the controller 14 processes the segmented additional information appropriately for the designated type.
If the ID code indicates that there is no output-time related information in the lyric segments, the lyric segments are sequentially read out irrespective of time. This operation corresponds to the conventional lyric data displaying method.
The lyric segments related with a recorded song may be written in a file outside the RTR_AMG, instead of in the TXTDT_MG.
The ALFI is composed of ALFI_GI (ALFI General Information) and a plurality of ALUI (Audio Lyric Unit Information) search pointers, each including ‘ALU_SA’ for the start address of a lyric unit, ‘ALU_SZ’ for the size of a lyric unit, and ‘L_ATR’ for an attribute of the lyric.
The ALU_SA in each ALUI search pointer points to the location of a corresponding ALU (Audio Lyric Unit) in a lyric file named ‘AR_Lyric.ARO’ that is not included in the RTR_AMG. Each ALU in the lyric file ‘AR_Lyric.ARO’ includes the lyric text associated with a single recorded song, and the lyric text is divided into several segments ‘SG_L_TXT #i’. Each lyric segment also has the output-time related information ‘Time’ and the ID code ‘IDCD’ as described in the aforementioned embodiment.
According to this structure of the RTR_AMG, the ALUI search pointer number contained in the CI indexes an ALUI search pointer pointing to the lyric unit in the file ‘AR_Lyric.ARO’ that is associated with a recorded AOB, namely a song.
Thus, if a song, namely an AOB, is chosen, the controller 14 reads the number written in the ALUI_SRPN of the CI associated with the selected AOB, and reads the address in the ALUI search pointer, contained in the ‘ALFI’ field, that is indexed by that number. The location of the lyric unit related with the selected song within the file ‘AR_Lyric.ARO’ is found from this address.
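The following sketch shows how a lyric unit might be fetched from such a separate lyric file using the start address ‘ALU_SA’ and size ‘ALU_SZ’ recorded in its ALUI search pointer; the dictionary form of the pointer and the example values are assumptions for illustration.

```python
def read_lyric_unit(lyric_file_path, alui_search_pointer):
    """Read one ALU (the full, segmented lyric text of one song) from the lyric file."""
    with open(lyric_file_path, "rb") as f:
        f.seek(alui_search_pointer["ALU_SA"])         # start address of the lyric unit
        return f.read(alui_search_pointer["ALU_SZ"])  # size of the lyric unit in bytes

# Example with made-up address and size values:
# alu_bytes = read_lyric_unit("AR_Lyric.ARO", {"ALU_SA": 2048, "ALU_SZ": 512})
```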
Then, lyric segments are retrieved sequentially from the first segment ‘SG_L_TXT #1’ at the addressed location in the file ‘AR_Lyric.ARO’ by the reproduced signal processor 12. At this time, the controller 14 examines an ID code preceding each lyric segment.
If the ID code indicates the type of output-time related information, the controller 14 conducts continuous and synchronous display of the series of lyric segments together with the reproduced audio data belonging to the selected AOB. In this synchronous lyric display operation, the lyric segment currently being displayed can be differentiated by color or font from the neighboring lyric segments displayed together with it.
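A simple way to picture this differentiation is the sketch below, which marks the currently sung segment among its neighbors; a real player would switch color or font instead of using text markers.

```python
def render_with_highlight(segments, current_index):
    """Render neighboring lyric segments, emphasizing the one currently being sung."""
    rendered = []
    for i, seg in enumerate(segments):
        marker = ">> " if i == current_index else "   "
        rendered.append(marker + seg["SG_L_TXT"])
    return "\n".join(rendered)

# Example: with the three segments built earlier, highlight the second line.
# print(render_with_highlight(segments, 1))
```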
The above-explained lyric data synchronizing method ensures minutely synchronized presentation of the lyrics with the audio data, e.g., a song, being reproduced in real time from a rewritable recording medium. Thus, a user is able to enjoy a reproduced song better through the lyric text displayed in synchronization with the song.
The detailed description of the invention has been directed to certain exemplary embodiments; various modifications of these embodiments, as well as alternative embodiments, will be suggested to those skilled in the art. The invention encompasses any modifications or alternative embodiments that fall within the scope of the claims.
Number | Date | Country | Kind
---|---|---|---
10-2001-0074382 | Nov 2001 | KR | national
This application is a continuation of U.S. application Ser. No. 11/924,656, filed on Oct. 26, 2007, now U.S. Pat. No. 7,793,131, which is a continuation of U.S. application Ser. No. 11/593,082, filed on Nov. 6, 2006, now U.S. Pat. No. 7,293,189, which is a continuation of U.S. application Ser. No. 10/305,020, filed on Nov. 27, 2002, now U.S. Pat. No. 7,181,636, which claims the benefit of a foreign priority application filed in KOREA on Nov. 27, 2001, as Serial No. 10-2001-0074382. This application claims priority to all of these applications, and all of these applications are incorporated by reference.
Number | Date | Country
---|---|---
20100169694 A1 | Jul 2010 | US

Relation | Number | Date | Country
---|---|---|---
Parent | 11924656 | Oct 2007 | US
Child | 12557659 | | US
Parent | 11593082 | Nov 2006 | US
Child | 11924656 | | US
Parent | 10305020 | Nov 2002 | US
Child | 11593082 | | US