The present invention relates to a method of synchronizing different types of data in a multimedia file. It applies, for example, to portable systems such as mobile radio terminals, pocket computers or any other equipment that can have multimedia capabilities and for which the size of multimedia files and the computation power needed to process them constitute a problem.
There are very many monomedia file formats, i.e. formats relating to only one particular type of data, such as the JPEG (Joint Photographic Experts Group) format for storing pictures or the RTF (Rich Text Format) format for storing text.
The expression “multimedia file” generally refers to integrating different types of data (such as pictures, sound and text) in the same file. Each type of data is contained in a given track. Each track is organized in the form of a series of commands. Each track is scanned by a microprocessor. Each microprocessor executes, at the same time as the others, commands from one track or simultaneous commands from more than one track and can present the data, via different interfaces, to a user of equipment with multimedia capabilities. The interfaces can be a screen for text and picture data and a loudspeaker for audio data. The user therefore sees text and pictures whilst hearing sounds.
The problem is therefore to match the text to the music and the pictures, i.e. to synchronize the different types of data contained in the same multimedia file.
Each microprocessor, associated with a track containing one type of data, uses an oscillator. Each oscillator produces a signal with a frequency slightly different from those of the other oscillators. Also, the software executed by each microprocessor can be based on a different operating system, and these drift with time in dissimilar ways. Thus two microprocessors that begin to read their respective tracks at the same time are eventually no longer synchronized with each other. For example, if the microprocessor for the sound data track is lagging behind the microprocessor for the text data track, the text of a phrase will be displayed before the sung phrase is heard.
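By way of a purely illustrative sketch (the frequency mismatch below is an assumed figure, not taken from the description), even a small relative error between the two oscillators accumulates into a perceptible offset:

```python
# Hypothetical illustration of clock drift between two track-reading
# microprocessors: a relative oscillator mismatch of 50 parts per million (ppm)
# is assumed purely for the sake of example.
relative_error_ppm = 50       # assumed frequency mismatch between the oscillators
playback_seconds = 60         # one minute of playback

drift_seconds = playback_seconds * relative_error_ppm * 1e-6
print(f"offset after {playback_seconds} s: {drift_seconds * 1000:.1f} ms")
# offset after 60 s: 3.0 ms -- and the offset keeps growing as playback continues
```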
The prior art solution is temporal synchronization.
In the prior art solution, synchronization data is inserted at regular time intervals, and two microprocessors, μp1 and μp2, read their respective tracks, track 1 and track 2.
Thus the microprocessor μp2 verifies every 3 μs whether its clock is synchronized to that of the first microprocessor μp1. If the microprocessor μp2 finds that it is in advance of the other one, it calculates the time difference and stops reading track 2 for that period of time. It then restarts in synchronism with the microprocessor μp1. It is apparent that the better the synchronization required, the greater the quantity of synchronization data that has to be sent and the more frequently it has to be sent.
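A minimal sketch of this prior-art behaviour, under assumed interfaces for the clocks and the track commands (nothing below is mandated by the description), could look like this:

```python
# Simplified sketch of the prior-art temporal synchronization (the function and
# clock interfaces are illustrative assumptions, not the actual implementation).
import time

SYNC_PERIOD = 3e-6  # the description cites a check every 3 microseconds

def run_slave(track2_commands, master_clock, slave_clock):
    """Microprocessor μp2: periodically compare the two clocks and, when ahead
    of μp1, stop reading track 2 for the measured time difference."""
    next_check = slave_clock() + SYNC_PERIOD
    for command in track2_commands:
        command()                                 # execute the next command of track 2
        if slave_clock() >= next_check:
            advance = slave_clock() - master_clock()
            if advance > 0:                       # μp2 is in advance of μp1
                time.sleep(advance)               # stop reading for that period of time
            next_check = slave_clock() + SYNC_PERIOD
```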
In mobile terminals there are severe file size constraints. The available memory is limited for reasons of overall size and battery life. What is more, multimedia files must be downloadable from a server center in a reasonable time, which is directly dependent on the file size.
Storing recurrent synchronization data is costly in terms of memory, and the flow of data exchanged between the microprocessors is burdened with frequent synchronization exchanges.
The above solution also has a further and major disadvantage: the synchronization data can reach the microprocessor μp2 while it is in the middle of displaying a phrase. The display of the phrase is then stopped short, and the user does not receive the impression of fluid presentation of data.
The object of the present invention is to reduce the size of multimedia files, to optimize the quantity of data exchanged, and to provide optimum synchronization.
To this end, the invention provides a method of synchronizing data in a multimedia document (50) comprising at least two separate data files (track1, track2) referred to as the first file, the second file, etc., in which method:
The method is advantageously characterized in that the important event corresponds to a command to display a text, a command to display a picture, or a command to reproduce a sound.
The invention also provides a device for synchronizing data in a multimedia file containing at least one track in which said data is stored and at least one synchronization command in each track, said device having first means for reading the data of each track and second means enabling the first means to communicate with each other, the data communicated between said first means concerning the occurrence of a synchronization command. The device is characterized in that one of the first data reading means is designated as having the highest priority and forces the other first means to synchronize with it.
The invention and its advantages will become clearer in the course of the following description with reference to the accompanying drawings.
The data in a multimedia file according to the invention can comprise either time values or sound, text or picture coding values. The time values can represent a note duration, an image display time, a track start or end time, or a waiting time between two events. According to the invention, the tracks of the multimedia file also include synchronization commands related to the various events included in the track (note, picture, text, etc.).
The multimedia file 50 includes a header 55 and tracks 60, 70 and 80. According to the invention, a multimedia file can include any number of tracks from 1 to n; the file 50 described here includes three of them.
The header 55 includes data that is common to all of the tracks and is not described in detail here.
Each track of the file 50 can contain a single type of data. For example, track 60 can be a MIDI (Musical Instrument Digital Interface) format track for sound, track 70 can contain a sequence of pictures, and track 80 can contain sequences of texts. The different tracks are intended to be scanned by microprocessors and their data presented simultaneously to the user; the different microprocessors therefore scan the tracks at the same time.
Each track 60, 70 and 80 has a respective header 65, 75 and 85. Each header contains an indicator of the type of data contained in the track. Thus the microprocessor able to read MIDI data knows from this indicator which track to read.
Each track also contains data organized in the form of commands which are executed sequentially by the microprocessor (for example to display a picture or a text).
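By way of illustration, a minimal data model reflecting this organization (the class and field names are assumptions introduced here, not the actual encoding of the file 50) might be:

```python
# Illustrative data model: a common header plus one track per data type, each
# track carrying a header indicator of its data type and an ordered list of
# commands executed sequentially by the microprocessor that reads it.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Command:
    kind: str                  # e.g. "START", "NOTE", "DELAY", "JPEG", "TEXT", "SYNCH", "END"
    payload: tuple = ()        # the fields attached to the command

@dataclass
class Track:
    data_type: str             # track-header indicator, e.g. "MIDI", "JPEG", "TEXT"
    commands: List[Command] = field(default_factory=list)

@dataclass
class MultimediaFile:
    header: dict               # data common to all of the tracks
    tracks: List[Track] = field(default_factory=list)
```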
In this example:
Each track has a Start field for starting presentation to the user and an End field for ending presentation to the user.
Track 1 contains data relating to sound. A first field Nf1 represents the frequency of a first note and a second field Nd1 represents its duration. Likewise, the fields Nf2 and Nd2 define a second note. The field D1 represents a waiting time before presenting the subsequent notes of the track.
The fields Nf3 and Nd3 respectively represent the frequency and the duration of a third note.
Thus fields defining a note or a waiting time can follow on from each other in track 1.
Track 2 contains data corresponding to sequences of JPEG images. In this example, two JPEG images represented by the fields JPEG1 and JPEG2 must be presented to the user for a given time represented by the field JPEGd1 for the image JPEG1 and the field JPEGd2 for the image JPEG2. The fields JPEGD0, JPEGD1 and JPEGD2 represent waiting times before or between images.
Track 3 contains data corresponding to text messages. In this example two syllables represented by the fields TEXT1 and TEXT2 must be presented to the user. The fields TEXTD0 and TEXTD1 represent waiting times before a text.
The synchronization commands are represented by fields SYNCHi for i from 1 to n.
The synchronization commands are not temporal commands, as in the prior art, but are instead dependent on a specific event. Thus the fields SYNCHi are not present in the tracks at regular time intervals.
In this example, the microprocessor in charge of track 1, which contains the sound data, is designated the master microprocessor.
It forces the other microprocessors in charge of the other tracks, referred to as slaves, to synchronize with it.
Some notes must correspond to the display of an image or a syllable.
In this example, the first fields Nf1 and Nd1 correspond to a first note. The second note, corresponding to the second fields Nf2 and Nd2, must be heard at the moment the first picture is displayed, corresponding to the field JPEG1 of track 2. Then, after a waiting time corresponding to the field D1, the third note, corresponding to the third fields Nf3 and Nd3, must be heard at the moment the first syllable is displayed, corresponding to the field TEXT1 of track 3. Finally, the fourth note, corresponding to the fourth fields Nf4 and Nd4, must be heard at the moment at which the second picture, corresponding to the field JPEG2 in track 2, and the second syllable, corresponding to the field TEXT2 in track 3, are displayed simultaneously.
Thus the first synchronization command, which corresponds to the field SYNCH1, is placed immediately before the fields Nf2 and Nd2 in track 1 and immediately before the field JPEG1 in track 2.
The second synchronization command, which corresponds to the field SYNCH2, is placed immediately before the fields Nf3 and Nd3 in track 1 and immediately before the field TEXT1 in track 3.
The third synchronization command, which corresponds to the field SYNCH3, is placed immediately before the fields Nf4 and Nd4 in track 1, immediately before the field JPEG2 in track 2 and immediately before the field TEXT2 in track 3.
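Using the illustrative data model sketched earlier, the three example tracks could then be laid out as follows; for simplicity each SYNCHi field is shown only in the tracks whose events it ties together, and the field values are placeholders:

```python
# Illustrative layout of the three example tracks: each SYNCHi field sits
# immediately before the important command it protects, at the corresponding
# place in each track concerned.
track1 = Track("MIDI", [
    Command("START"),
    Command("NOTE", ("Nf1", "Nd1")),
    Command("SYNCH", (1,)), Command("NOTE", ("Nf2", "Nd2")),   # with the first picture
    Command("DELAY", ("D1",)),
    Command("SYNCH", (2,)), Command("NOTE", ("Nf3", "Nd3")),   # with the first syllable
    Command("SYNCH", (3,)), Command("NOTE", ("Nf4", "Nd4")),   # with picture and syllable
    Command("END"),
])

track2 = Track("JPEG", [
    Command("START"),
    Command("DELAY", ("JPEGD0",)),
    Command("SYNCH", (1,)), Command("JPEG", ("JPEG1", "JPEGd1")),
    Command("DELAY", ("JPEGD1",)),
    Command("SYNCH", (3,)), Command("JPEG", ("JPEG2", "JPEGd2")),
    Command("DELAY", ("JPEGD2",)),
    Command("END"),
])

track3 = Track("TEXT", [
    Command("START"),
    Command("DELAY", ("TEXTD0",)),
    Command("SYNCH", (2,)), Command("TEXT", ("TEXT1",)),
    Command("DELAY", ("TEXTD1",)),
    Command("SYNCH", (3,)), Command("TEXT", ("TEXT2",)),
    Command("END"),
])
```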
When the multimedia document is presented to the user, the microprocessors scan all the tracks at the same time. Two situations arise, according to whether the slave microprocessors are lagging behind or in advance of the master microprocessor. Each slave microprocessor can receive data concerning the synchronization commands from the master microprocessor.
The master microprocessor, which is dedicated to track 1, reaches the first synchronization command, corresponding to the field SYNCH1, and sends first synchronization data to the other microprocessors.
Two situations then arise, according to whether the slave microprocessor concerned is in advance of or lagging behind the master microprocessor at that moment.
Thus each important command, i.e. a command whose execution must not be interrupted, is represented by a given field preceded by a field representing a synchronization command. The synchronization command is at the same place in all the other tracks. Because of this, different tracks are resynchronized before any important command, if necessary.
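The master/slave behaviour described above might be sketched as follows; the threads, the SyncBus class and the event mechanism are assumptions introduced purely for illustration, and the description itself does not impose any particular signalling mechanism:

```python
# Illustrative sketch of event-based synchronization: the master signals each
# SYNCH field as it reaches it, and every slave waits for that signal at its
# own SYNCH field before executing the important command that follows.
import threading

class SyncBus:
    """One event per synchronization command, shared by the master and the slaves."""
    def __init__(self):
        self._events = {}
        self._lock = threading.Lock()

    def _event(self, sync_id):
        with self._lock:
            return self._events.setdefault(sync_id, threading.Event())

    def signal(self, sync_id):         # master reaches SYNCHi: send synchronization data
        self._event(sync_id).set()

    def wait_for(self, sync_id):       # slave reaches SYNCHi: wait for the master if needed
        self._event(sync_id).wait()

def play_track(track, bus, is_master, execute):
    """Scan one track, resynchronizing on each SYNCH field as described."""
    for command in track.commands:
        if command.kind == "SYNCH":
            sync_id = command.payload[0]
            if is_master:
                bus.signal(sync_id)    # the master forces the slaves to follow it
            else:
                bus.wait_for(sync_id)  # a slave in advance waits here; one that is
                                       # behind passes straight through when it arrives
        else:
            execute(command)           # important commands are never cut short
```

In such a sketch, each microprocessor would run play_track on its own track in its own thread, exactly one of them with is_master=True.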
Thus the invention synchronizes data in a multimedia file without overloading the tracks, and hence the memory, with unnecessary synchronization data, thereby also restricting the transfers of synchronization data between the microprocessors, and, most importantly, without the execution of an important command being stopped part way through.