System and method for synchronizing a graphic image and a media event

Information

  • Patent Grant
  • Patent Number
    6,275,222
  • Date Filed
    Friday, September 6, 1996
  • Date Issued
    Tuesday, August 14, 2001
Abstract
In an illustrative embodiment, the invention discloses a computer system in which an arbitrary set of graphical images can be used to navigate through a musical score. The musical score may be read through by a user while listening to the music and pointing to a corresponding position in the music. To this end, tracking information may be stored with the image as a time-place mapping. In navigation through the musical score, the mapping may be used inversely to calculate musical time from a place pointed at in the score. In collaboration, the mapping may be used to calculate corresponding positions in different versions (parts) of the same musical score presented to different users.
Description




BACKGROUND OF THE INVENTION




This invention relates to a system and method for synchronizing a graphic image and a media event.




INTRODUCTION TO THE INVENTION




Our work extends to synchronizing a graphic image and a media event by way of a digital computer. By the notion “media event” we mean, e.g., the contents of an audio file, a MIDI (Musical Instrument Digital Interface) file, a computer animation, or a video file; by the notion “graphic image” we mean a semiotic construct, e.g., a printed or hand-crafted musical score, a sketch, remarks, a storyboard, or an ethnic notation. For the sake of pedagogy, however, the following discussion references a particular (illustrative) case, wherein the media event comprises a digital compact disk (CD) representation of a musical performance and the graphic image comprises a corresponding printed musical score from which the performance can be realized.




SUMMARY OF THE INVENTION




We note that navigation through musical material in computer systems has always been a difficult problem. Typical extant systems either present the music at a very low level (as an audio signal or a time-line), or produce from an internal representation of the musical material a musical score that looks clumsy or unprofessional.




We have now discerned a methodology and discovered a computer system in which any arbitrary set of graphical images can be used to navigate through musical material (i.e., the media event). A central concept, to this end, is the following. Musical material may be “read through” by a user while listening to the music and pointing to a corresponding position in the music (e.g., with a mouse, a touch screen, or a graphic tablet). This tracking information may be stored with the graphical image as a time-place mapping. In navigation through the musical material, the mapping may be used in inverse form to calculate musical time from a place pointed at in the score.
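
Purely by way of editorial illustration (the patent itself contains no source code; all names below are hypothetical), a minimal Python sketch of this mapping and its inverse use might read:

    # Minimal sketch: while the music plays, (time, x, y) samples of the
    # user's pointer are appended in time order; afterwards, a pointed-at
    # place is mapped back to a musical time by nearest-sample search.
    track = []  # list of (time_in_seconds, x, y) tuples

    def record_sample(t, x, y):
        track.append((t, x, y))  # one sample of the user's read-through

    def time_at_place(x, y):
        # Inverse use of the mapping: musical time of the nearest stored place.
        return min(track, key=lambda s: (s[1] - x) ** 2 + (s[2] - y) ** 2)[0]

A refined version of this structure, matching the detailed description below, is sketched in the DETAILED DESCRIPTION section.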




Accordingly, given this tracking information, selections of musical fragments can be made, and operations (replay, record, etc.) can be performed, depending on a particular realization of the present invention, for which the mapping thereby becomes a “front end”. In a collaborative setup, different score layouts (parts) may be coordinated through the use of the time-place maps. These advantages, and others, are detailed further below.




We now disclose the invention by way of a computer system comprising:




1) means for storing a time-based media file corresponding to a media event;




2) means for storing a graphic image corresponding to the media event; and




3) means for relating time position in the media file with spatial position in the graphic image, so that a time-place mapping is created between the media file and the graphic image.




In a second aspect, the present invention discloses a computer method comprising the steps of:




1) storing a time-based media file that corresponds to a portion of a media event;




2) storing a graphic image that corresponds to a portion of the media event; and




3) relating time position in the media file with spatial position in the graphic image by moving a pointing device over the graphic image while the media file is presented, thereby constructing a time-place mapping between the media file and the graphic image.
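
A hedged sketch of how step 3) might be driven in practice, assuming hypothetical `player`, `pointer`, and `map_builder` interfaces (none of these is specified by the patent):

    import time

    def build_mapping(player, pointer, map_builder, poll_interval=0.05):
        # Poll the pointing device while the media file is presented, feeding
        # each (media_time, position) pair to the map builder.
        player.start()
        while player.is_playing():
            pos = pointer.current_position()  # (x, y) on the image, or None
            if pos is not None:
                map_builder.add(player.current_time(), pos)
            else:
                map_builder.end_section()     # pointer lifted: section break
            time.sleep(poll_interval)
        return map_builder.finish()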











BRIEF DESCRIPTION OF THE DRAWING




The invention is illustrated in the accompanying drawing, in which:





FIG. 1 shows a computer screen display of a graphic image;

FIG. 2 shows a system architecture of the present invention during the construction of a time-place mapping;

FIG. 3 shows a flowchart for realization of a map-building component of the system architecture;

FIG. 4 shows a display mode embodiment of the system architecture;

FIG. 5 shows a flowchart for realization of a map-interpreting component of the FIG. 4 mode;

FIG. 6 shows a select mode embodiment of the system architecture;

FIG. 7 shows a flowchart for realization of a map-interpreting component of the FIG. 6 mode; and

FIG. 8 shows a collaboration mode embodiment of the system architecture for computers connected in a network.











DETAILED DESCRIPTION OF THE INVENTION




Attention is first directed to FIG. 1, which shows a computer screen display comprising a graphic image 10. The graphic image 10 has been scanned into a computer memory by known scanning techniques. In particular, the graphic image 10 includes a portion of a printed musical score. FIG. 1 further shows an arbitrary highlight 12 (shaded region) of the musical score, which highlight 12 may be earmarked in a conventional manner by a touch-screen capability.




As summarized above, it is an objective of the present invention to relate an illustrative FIG. 1 type spatially based graphic image 10 with a complementary time-based media file. FIG. 2, to which attention is now directed, shows an exemplary system architecture 14 which may be used to this end.




In overview, the FIG. 2 system architecture 14 shows that a user can follow (point to) a scanned-in score (numerals 16-20), while listening to a complementary time-based media file comprising an audio file (numerals 22-26). A map builder 28 can create a time-position map 30 which can relate movement of the pointing device over the graphic image while the media file is presented, thereby constructing the time-place mapping. An illustrative map builder flowchart is shown in FIG. 3 (numeral 32).




Central to the FIG. 3 flowchart program is the construction and interpretation of a time-position map. This map can store the track of the pointing device while the user ‘reads along’ with the music, and is used in subsequent interpretations of pointing position and time. Preferably, one such map exists for every page of the score. This map may also be pre-constructed and stored by, e.g., the distributor of the time-based media and the graphic files.




A map is preferably an array in which time-position tuples are stored in increasing time order. To prevent information overload, subsequent positions are preferably stored only when they differ by more than a certain space accuracy. In a map, disjoint sections of pointer tracks may be stored (when the pointing device is lifted or released), each ending with a specific nil mark. These sections usually correspond to lines (systems) in the score, and the breaks between them indicate where a section break needs to be drawn.
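
As an editorial sketch of the structure just described (assumed Python; the class name and the `accuracy` value are illustrative, not the patent's):

    import math

    NIL = None  # nil mark ending a disjoint section of the pointer track

    class TimePositionMap:
        # Array of (time, (x, y)) tuples in increasing time order, with nil
        # marks separating disjoint track sections (lines/systems).
        def __init__(self, accuracy=4.0):  # space accuracy, e.g. in pixels
            self.entries = []
            self.accuracy = accuracy

        def add(self, t, pos):
            # Store a new tuple only when the position differs from the last
            # stored one by more than the space accuracy; after a nil mark
            # (or on an empty map), always store.
            last = self.entries[-1] if self.entries else None
            if last is not None and math.dist(last[1], pos) <= self.accuracy:
                return
            self.entries.append((t, pos))

        def end_section(self):
            # End a section when the pointing device is lifted or released.
            if self.entries and self.entries[-1] is not NIL:
                self.entries.append(NIL)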




There is one constructor procedure that builds a map, and two accessor functions that use a map either to find the position associated with a point in time, or the time associated with a certain position.
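
A sketch of the two accessors, continuing the hypothetical map sketched above (linear interpolation and nearest-neighbour search are editorial assumptions; the patent does not fix the lookup strategy):

    import math

    NIL = None  # nil mark, as in the preceding sketch

    def position_of_time(entries, t):
        # Accessor 1: the place associated with a point in time, found by
        # linear interpolation between the two stored tuples bracketing t.
        # (For simplicity this sketch interpolates across nil marks too.)
        pts = [e for e in entries if e is not NIL]
        for (t0, p0), (t1, p1) in zip(pts, pts[1:]):
            if t0 <= t <= t1:
                f = (t - t0) / (t1 - t0) if t1 > t0 else 0.0
                return (p0[0] + f * (p1[0] - p0[0]),
                        p0[1] + f * (p1[1] - p0[1]))
        return None  # t lies outside the mapped range

    def time_of_position(entries, pos):
        # Accessor 2: the time associated with a certain position, taken
        # from the stored tuple nearest to the pointed-at place.
        pts = [e for e in entries if e is not NIL]
        if not pts:
            return None
        return min(pts, key=lambda e: math.dist(e[1], pos))[0]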




Attention is now directed to FIGS. 4, 6 and 8, which show alternative embodiments of the FIG. 2 system architecture 14, dedicated respectively to display, select, and collaboration modes (numerals 34, 36, 38).




In particular, the FIG. 4 display mode 34 is one wherein the computer can follow the score (pointing a cursor, turning the pages) while playing the audio file. A user may play a solo to the accompaniment played by the computer. FIG. 5 provides a flowchart 40 for realization of the display mode.
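
A hedged sketch of a display-mode loop, using the accessors sketched earlier (the `player` and `display` interfaces are hypothetical; the authoritative realization is the FIG. 5 flowchart):

    def display_mode(player, display, page_maps):
        # Follow the score while the audio plays: map the current playback
        # time to a score position, turn to that page, and move the cursor.
        # `page_maps` holds one TimePositionMap per score page.
        player.start()
        while player.is_playing():
            t = player.current_time()
            for page, m in enumerate(page_maps):
                pos = position_of_time(m.entries, t)
                if pos is not None:            # t falls on this page
                    display.show_page(page)
                    display.move_cursor(pos)
                    break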




The FIG. 6 select mode 36, on the other hand, is one wherein a user can select a fragment in the score, and the computer plays it. FIG. 7 provides a flowchart 42 for realization of the select mode.
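
A corresponding sketch of the select mode, again with hypothetical interfaces (the authoritative realization is the FIG. 7 flowchart): the two endpoints of the user's selection are mapped to musical times, and the fragment between them is replayed.

    def select_mode(player, m, start_pos, end_pos):
        # Map the two endpoints of the user's selection to musical times
        # and replay the fragment between them.
        t0 = time_of_position(m.entries, start_pos)
        t1 = time_of_position(m.entries, end_pos)
        if t0 is not None and t1 is not None:
            player.play_range(min(t0, t1), max(t0, t1))  # hypothetical API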




The FIG. 8 collaboration mode 38, finally, is one wherein multiple users, each with a possibly different graphical image corresponding to the same time-based media file, are shown each other's pointing, annotation, and selecting actions in the position appropriate for each graphical image.
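
A final editorial sketch of the collaboration-mode coordination: a position pointed at in one user's part is related to a time through that part's map, and that time is then related to the corresponding position in another user's (differently laid out) part. Names remain hypothetical.

    def corresponding_position(pos, map_a, map_b):
        # Two-step use of the maps (cf. claim 9): place in A -> musical time
        # via A's map, then musical time -> place in B via B's map.
        t = time_of_position(map_a.entries, pos)
        return position_of_time(map_b.entries, t) if t is not None else None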



Claims
  • 1. A computer system comprising: 1) means for storing a time-based media file that corresponds to a portion of a media event; 2) means for storing a graphic image that corresponds to the portion of the media event; and 3) means for relating time position in the media file with spatial position in the graphic image that corresponds to the portion of the media event, so that a time-place mapping is created between the media file and the graphic image.
  • 2. A computer system according to claim 1, wherein the time-based media file comprises an audio file.
  • 3. A computer system according to claim 1, wherein the time-based media file comprises a video.
  • 4. A computer system according to claim 1, wherein the graphic image corresponds to a musical score.
  • 5. A computer system according to claim 1, comprising means for moving a pointing device over the graphic image while the media file is presented, thereby constructing the time-place mapping.
  • 6. A computer method comprising the steps of: 1) storing a time-based media file that corresponds to a portion of a media event; 2) storing a graphic image that corresponds to a portion of the media event; and 3) relating time position in the media file with spatial position in the graphic image that corresponds to a portion of the media event by moving a pointing device over the graphic image while the media file is presented, thereby constructing a time-place mapping between the media file and the graphic image.
  • 7. A method according to claim 6, comprising a step of displaying a state of the media file by indicating a selected portion of the graphic image.
  • 8. A method according to claim 6, comprising a step of selecting a fragment of the media file by pointing at the graphic image.
  • 9. A method according to claim 6, comprising a step of displaying pointing gestures and selections made by one user in the graphical image so that they appear in the appropriate position in another graphical image, with another layout, presented to another user, by first relating a space position in the first image to a time position using the map corresponding to that image, and then relating this time position to a space position in the other image using the map of this second image.
  • 10. A computer system as recited in claim 1, further including means for storing said time-place mapping.
  • 11. A computer system as recited in claim 10, further including means for accessing a portion of said graphic image from storage based upon a selected position in said media file.
  • 12. A computer system as recited in claim 10, further including means for accessing a portion of said media file from storage based upon a selected spatial position in said graphic image.
  • 13. A computer system as recited in claim 11, further including means for accessing a sequence of portions of said graphic image in accordance with presentation of said media file.
  • 14. A computer system as recited in claim 13, further including means for accessing a sequence of portions of another graphic image in accordance with presentation of said media file.
  • 15. A computer system as recited in claim 14, wherein said graphic image and said another graphic image include different sets of pointing, annotation and selecting action information.
  • 16. A method as recited in claim 6, including the further step of storing said time-place mapping.
  • 17. A method as recited in claim 16, including the further step of accessing a portion of said media file in accordance with a selected location in said graphic image.
  • 18. A method as recited in claim 16, including the further step of accessing a portion of said graphic image in accordance with a selected location in said media file.
  • 19. A method as recited in claim 18, wherein said accessing step is repeated in accordance with a presentation of said media file.
  • 20. A method as recited in claim 18, including the further step of accessing a portion of another graphic image in accordance with a selected location in said media file.
US Referenced Citations (41)
Number Name Date Kind
4247571 Flament Jan 1981
4313036 Jabara et al. Jan 1982
4546212 Crowder et al. Oct 1985
4757495 Decker et al. Jul 1988
4812786 Davarian et al. Mar 1989
5012511 Hanle et al. Apr 1991
5073890 Danielsen Dec 1991
5077789 Clark, Jr. et al. Dec 1991
5146833 Lui Sep 1992
5159592 Perkins Oct 1992
5159594 Bales et al. Oct 1992
5173934 Marquet et al. Dec 1992
5185742 Bales et al. Feb 1993
5206899 Gupta et al. Apr 1993
5214641 Chen et al. May 1993
5289528 Ueno et al. Feb 1994
5315647 Araujo May 1994
5388264 Tobias, II et al. Feb 1995
5392345 Otto Feb 1995
5410543 Seitz et al. Apr 1995
5450482 Chen et al. Sep 1995
5459780 Sand Oct 1995
5471318 Ahuja et al. Nov 1995
5490212 Lautenschlager Feb 1996
5511002 Milne et al. Apr 1996
5537401 Tadamura et al. Jul 1996
5544229 Creswell et al. Aug 1996
5548636 Bannister et al. Aug 1996
5550906 Chau et al. Aug 1996
5577110 Aquino Nov 1996
5619557 Van Berkum Apr 1997
5657377 Pinard et al. Aug 1997
5657383 Gerber et al. Aug 1997
5663517 Oppenheim Sep 1997
5703943 Otto Dec 1997
5737333 Civanlar et al. Apr 1998
5742675 Kilander et al. Apr 1998
5751338 Ludwig, Jr. May 1998
5784546 Benman, Jr. Jul 1998
5812533 Cox et al. Sep 1998
5812819 Rodwin et al. Sep 1998
Foreign Referenced Citations (5)
Number Date Country
43 29 172 Aug 1993 DE
590863 Apr 1994 EP
2270814 Mar 1994 GB
2271912 Apr 1994 GB
2280334 Jan 1995 GB
Non-Patent Literature Citations (1)
Entry
Apple, Macromind MediaMaker, Macromind, pp. 145, 148, 151, and 182-183, Dec. 1990.