This application is based on Japanese Patent Application 2003-395925, filed on Nov. 26, 2003, the entire contents of which are incorporated herein by reference.
A) Field of the Invention
This invention relates to an electronic musical apparatus, and more particularly to an electronic musical apparatus that can display lyrics and chord names on another electronic musical apparatus.
B) Description of the Related Art
In an electronic musical apparatus having an automatic performance function, such as an electronic musical instrument, it is well known to display lyrics on an external display apparatus via a video-out device (image data output circuit) when music data including lyrics data is reproduced (refer, for example, to JP-A 2002-258838).
In the above-described prior art, lyrics corresponding to the music data are output to an external apparatus as image data, so that the lyrics can be displayed on a separate display device or on a display device having a large screen.
In the prior art, however, image data (image signals) for displaying the lyrics is generated based on the lyrics data and transmitted to the external apparatus via a video-out device; such an apparatus is costly because the video-out device is generally expensive.
It is an object of the present invention to provide an electronic musical apparatus that can display lyrics on an external lyrics displaying apparatus (an electronic musical apparatus) without being equipped with an expensive video-out device.
It is another object of the present invention to provide an electronic musical apparatus that is capable of displaying lyrics information output from another electronic musical apparatus in synchronization with music data reproduced by the other electronic musical apparatus.
According to one aspect of the present invention, there is provided an electronic music apparatus, comprising: an extractor that extracts lyrics data from music data for reproduction of music and comprising the lyrics data representing lyrics of the music; a transmitter that transmits the extracted lyrics data to an external device; a reproducer that reproduces the music data; and an outputting device that outputs synchronization information for controlling display of the lyrics by the external device based on the lyrics data to the external device during reproduction of the music data in accordance with a progress of the reproduction of the music data.
According to another aspect of the present invention, there is provided a lyrics displaying apparatus, comprising: a first receiver that receives lyrics data representing lyrics of music from an external device; a memory that temporarily stores the received lyrics data; a display that displays the lyrics in accordance with the received lyrics data; a second receiver that receives synchronization information from the external device; and a controller that controls display of the lyrics in accordance with the received synchronization information.
According to still another aspect of the present invention, there is provided an electronic music apparatus, comprising: an extractor that extracts text data from music data for reproduction of music and comprising the text data; a transmitter that transmits the extracted text data to an external device; a reproducer that reproduces the music data; and an outputting device that outputs synchronization information for controlling display of the text by the external device based on the text data to the external device during reproduction of the music data in accordance with a progress of the reproduction of the music data.
According to the present invention, lyrics can be displayed on an external lyrics displaying apparatus (an electronic musical apparatus) without being equipped with an expensive video-out device.
Moreover, according to the present invention, lyrics information output from another electronic musical apparatus can be displayed in synchronization with the music data reproduced by the other electronic musical apparatus.
A RAM 3, a ROM 4, a CPU 5, an external storage device 7, a detector circuit 8, a display circuit 10, a musical tone generator 12, an effecter circuit 13, a MIDI interface 16, and a communication interface 17 are connected to a bus 2 in the electronic musical apparatus 1.
A user can make various settings by using a plurality of panel switches 9 connected to the detector circuit 8. The panel switches 9 may be any devices that can output a signal corresponding to a user's input; for example, one or a combination of a rotary encoder, a switch, a mouse, an alphanumeric keyboard, a joystick, a jog-shuttle, etc. can be used as the panel switches 9.
Moreover, the panel switches 9 may be software switches or the like displayed on a display 11 and operated by using another switch such as a mouse.
The display circuit 10 is connected to the display 11, and various types of information can be displayed on the display 11.
The external storage device 7 includes an interface for the external storage device and is connected to the bus 2 via the interface. The external storage device is, for example, a floppy (a trademark) disk drive (FDD), a hard disk drive (HDD), a magneto optical disk (MO) drive, a CD-ROM (a compact disk read only memory) drive, a DVD (Digital Versatile Disk) drive, a semiconductor memory, etc.
Various types of parameters, various types of data, a program for realizing the embodiment of the present invention, music data, etc. can be stored in the external storage device 7. Moreover, in the embodiment of the present invention, at least one music data PD (
The RAM 3 provides a working area for the CPU 5 and stores flags, registers, buffers, and various types of parameters. Various types of parameters and control programs, including programs for realizing the embodiment of the present invention, can be stored in the ROM 4. The CPU 5 executes calculations and controls in accordance with a control program stored in the ROM 4 or the external storage device 7.
A timer 6 is connected to the CPU 5 and provides a basic clock signal, an interrupt process timing, etc. to the CPU.
The musical tone generator 12 generates a musical tone signal corresponding to a performance signal, such as a MIDI signal, supplied from MIDI information MD recorded in the external storage device 7 or from a MIDI device 18 connected to the MIDI interface 16, and the generated musical tone signal is supplied to a sound system 14 via the effecter circuit 13.
The musical tone generator 12 may be of any type, such as a wave-memory type, an FM type, a physical model type, a harmonic synthesis type, a formant synthesis type, a VCO+VCF+VCA analogue synthesis type, an analogue simulation type, etc. Moreover, the musical tone generator 12 may be composed of dedicated hardware, of a DSP and a micro-program, or of the CPU and a software program, or of a combination of those. Moreover, a plurality of reproduction channels may be formed by using one circuit in time division, or one reproduction channel may be formed with one circuit.
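For illustration only, the following is a minimal Python sketch of the software option mentioned above (the CPU and a software program), in which one object serves several reproduction channels in time division; all class, method, and parameter names are hypothetical and are not part of the disclosed apparatus.

```python
import math

SAMPLE_RATE = 44100  # assumed output sample rate


class SoftwareToneGenerator:
    """Minimal sine-wave tone generator: one object serves several
    reproduction channels by iterating over all of them for each output
    sample (time division), rather than one circuit per channel."""

    def __init__(self, num_channels=16):
        # each channel holds (frequency in Hz, amplitude 0.0-1.0)
        self.channels = [(0.0, 0.0)] * num_channels

    def note_on(self, channel, freq_hz, velocity):
        self.channels[channel] = (freq_hz, velocity / 127.0)

    def note_off(self, channel):
        self.channels[channel] = (0.0, 0.0)

    def render(self, num_samples):
        """Return one mono block of samples mixing all sounding channels."""
        block = []
        for n in range(num_samples):
            t = n / SAMPLE_RATE
            block.append(sum(amp * math.sin(2 * math.pi * freq * t)
                             for freq, amp in self.channels if amp > 0.0))
        return block
```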
The effecter circuit 13 applies various types of effects to the digital musical tone signal provided by the musical tone generator 12. The sound system 14 includes a D/A converter and a loudspeaker, and converts the supplied digital musical tone signal into an analogue musical tone signal for reproduction of a musical tone.
A musical performance switch 15 is connected to the detector circuit 8 and provides a musical performance signal in accordance with a user's operation (a musical performance). In the embodiment of the present invention, a musical keyboard is used as the performance switch 15. The performance switch 15 may be any type of switch that can at least output a musical performance signal such as a MIDI signal.
The MIDI interface (MIDI I/F) 16 can be connected to an electronic musical instrument, another musical instrument, an audio device, a computer, etc., and can at least receive and transmit a MIDI signal. The MIDI interface 16 is not limited to a dedicated MIDI interface and may be formed by using a widely used interface such as RS-232C, USB (universal serial bus), IEEE 1394, etc. In this case, data other than MIDI messages may be transmitted at the same time. Moreover, in the embodiment of the present invention, the electronic musical instrument 1A and the computer 1P are connected via this MIDI interface.
The MIDI device 18 is an audio device, a musical instrument, etc. connected to the MIDI interface 16. The type of the MIDI device 18 is not limited to a keyboard instrument; it may be a stringed instrument type, a wind instrument type, a percussion instrument type, etc. Also, it is not limited to an apparatus in which the musical tone generator and the automatic musical performance device are built into one apparatus; they may be separate devices connected by communication means such as MIDI or various types of communication networks. A user can input a performance signal by playing (operating) this MIDI device 18.
Moreover, the MIDI device 18 can be used as a switch for inputting various types of data other than musical performance information and various types of settings.
The communication interface 17 can be connected to a communication network 19 such as a LAN (local area network), the Internet, a telephone line, etc., and is connected to a server computer 20 via the communication network 19. The communication interface 17 can then be used to download a control program, programs for realizing the embodiment of the present invention, and performance information from the server computer 20 into the external storage device 7 such as the HDD, or into the RAM 3, etc.
Moreover, the communication interface 17 and the communication network 19 are not limited to wired connections and may be wireless. The apparatus may also be equipped with both.
The electronic musical instrument 1A includes at least a storage unit 31, a lyrics data generation unit 32, a reproduction unit 33 and a transmission unit 34. The computer (PC) 1P includes at least a receiving unit 35, a reproduction buffer 36, a display screen generation unit 37 and a display unit 38.
Music data PD including lyrics information (for example, lyrics event LE indicated in
The music data PD read from the storage unit 31 is transmitted to the generation unit 32 and the reproduction unit 33. In the reproduction unit 33, the music data PD is reproduced, and synchronization information SI is generated corresponding to progress in the reproduction of the music data PD and thereafter transmitted to the transmission unit 34.
The transmission unit 34 transmits the lyrics data LD received from the generation unit 32 to the receiving unit 35 in the computer 1P, for example, via the communication interface such as the MIDI interface. Also, the synchronization information SI received from the reproduction unit 33 is transmitted to the receiving unit 35. Moreover, transmissions of the lyrics data LD and the synchronization information SI are executed based on the MIDI Standards.
The receiving unit 35 transmits the lyrics data LD received from the transmission unit 34 to the reproduction buffer 36, and also receives the synchronization information SI transmitted from the transmission unit 34 in sequence and transmits it to the display unit 38. The reproduction buffer 36 stores the lyrics data LD temporarily. The display screen generation unit 37 generates a lyrics displaying screen for one page (a range that can be displayed at a time) based on the lyrics data LD stored in the reproduction buffer 36 and transmits it to the display unit 38. The display unit 38 displays the lyrics displaying screen in accordance with the synchronization information SI transmitted from the receiving unit 35.
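As an illustration only, the following is a minimal Python sketch of the data flow among the units 31 to 38 described above. All class, method, and message-field names are hypothetical, and an abstract `send` callable stands in for the MIDI-based transmission.

```python
class LyricsSender:
    """Rough counterpart of units 31-34 on the electronic musical instrument 1A."""

    def __init__(self, send):
        self.send = send  # callable standing in for the transmission unit 34

    def prepare(self, music_data_events):
        # lyrics data generation unit 32: extract the lyrics events from the music data PD
        lyrics_data = [ev for ev in music_data_events if ev["type"] == "lyrics"]
        self.send({"kind": "lyrics", "data": lyrics_data})
        return lyrics_data

    def on_playback_tick(self, position):
        # reproduction unit 33: emit synchronization information SI as the song progresses
        self.send({"kind": "sync", "position": position})


class LyricsReceiver:
    """Rough counterpart of units 35-38 on the computer 1P."""

    def __init__(self):
        self.reproduction_buffer = []  # reproduction buffer 36

    def receive(self, message):
        if message["kind"] == "lyrics":        # receiving unit 35
            self.reproduction_buffer = message["data"]
        elif message["kind"] == "sync":
            self.update_display(message["position"])

    def update_display(self, position):
        # display screen generation unit 37 / display unit 38: show the
        # lyrics whose timing has been reached at the current position
        shown = [ev["text"] for ev in self.reproduction_buffer
                 if ev["timing"] <= position]
        print("".join(shown))
```

In this sketch the two sides could be wired together directly, for example as `sender = LyricsSender(LyricsReceiver().receive)`; in the actual apparatus the direct call is replaced by transmission over the MIDI interface.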
Moreover, the generation and transmission of the lyrics data LD can be executed for the music data as a whole at one time. A processing example in which all the lyrics data LD is transmitted at one time is shown in
The music data PD consists of at least timing data TM, which represents a reproduction timing in terms of measure, beat and clock, and event data representing an event at each timing, such as a note-on event NE and a lyrics event LE. Also, the music data PD can be composed of a plurality of musical parts.
The timing data TM is data that represents the time at which the various types of events represented by the event data are to be processed. The processing time of an event can be represented by an absolute time from the very beginning of a musical performance or by a relative time, that is, an elapsed time from the previous event. For example, the timing data TM represents the processing time of an event by parameters of the measure number, the beat number in the measure, and the time (clock) in the beat.
The event data is data representing the contents of various types of events for reproducing a song. The event may be, for example, a note event (note data) NE, which is a note-on event or a combination of a note-on event and a note-off event and directly relates to reproduction of a musical tone; a pitch change event (pitch bend event); a tempo change event; a setting event for setting a reproduction style of the music, such as a tone color change event; or a lyrics event LE recording a text line of lyrics.
The lyrics event LE records, for example as text data, the lyrics to be displayed at the corresponding timing. A lyrics event LE is stored in correspondence with a note event NE; that is, one lyrics event LE corresponds to one note event NE. The timing represented by the timing data TM of a lyrics event LE is the same as the timing represented by the timing data of the corresponding note event NE, or a timing just before or after it that can be regarded as the same timing.
The lyrics data LD includes at least the lyrics events LE extracted from the music data PD and the timing data TM representing the display (reproduction) timing of each lyrics event LE. The lyrics event LE is composed of text data or the like representing the lyrics text line to be displayed. Moreover, the lyrics event LE includes a carriage return (new line) command and a new page command. Also, the lyrics event LE may include information about the font type, the font size and the display color of the lyrics text line to be displayed.
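For concreteness, the following is a hedged sketch of how the music data PD and the lyrics data LD described above might be represented in Python; the field names are assumptions chosen to mirror the terms used in the text and do not define any actual file format.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class TimingData:
    """Timing data TM: processing time expressed as measure / beat / clock."""
    measure: int
    beat: int
    clock: int


@dataclass
class NoteEvent:
    """Note event NE: a note directly related to reproduction of a musical tone."""
    timing: TimingData
    note_number: int     # e.g. a MIDI note number
    duration: int        # length in clocks


@dataclass
class LyricsEvent:
    """Lyrics event LE: the lyrics text line to display at the given timing."""
    timing: TimingData
    text: str
    new_line: bool = False        # carriage return (new line) command
    new_page: bool = False        # new page command
    font: Optional[str] = None    # optional display attributes (font, size, color)


@dataclass
class LyricsData:
    """Lyrics data LD: lyrics events extracted from the music data PD."""
    events: List[LyricsEvent] = field(default_factory=list)


def extract_lyrics_data(music_data_events) -> LyricsData:
    """Extract every lyrics event from a list of mixed events (music data PD)."""
    return LyricsData([ev for ev in music_data_events if isinstance(ev, LyricsEvent)])
```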
At Step SA1, a process at the electronic musical instrument side is started. At Step SA2, the music data PD (
At Step SA3, all the lyrics information (for example, the lyrics event LE in
At Step SA5, a lyrics displaying screen for the first page, that is, a lyrics displaying screen covering the portion from the beginning of the music data to the lyrics event including the first new page command and the timing data TM corresponding to that lyrics event, is generated in accordance with the lyrics data LD generated at Step SA3, and the generated lyrics displaying screen is displayed on, for example, the display 11 of the electronic musical instrument 1A.
At Step SA6, it is judged whether reproduction of the selected music data PD at Step SA2 is started or not. When the reproduction is started, the process proceeds to Step SA7 as indicated with an arrow “YES” and a start command will be transmitted to the PC. When the reproduction is not started, the process repeats Step SA6 as indicated with an arrow “NO”.
At Step SA8, the music data PD is reproduced in accordance with song progress (a progress in the reproduction). The reproduction of the music data PD is based on the note events included in the music data PD, for example, the musical tone data is generated by the musical tone generator 12, and a musical tone will be sounded with the sound system 14 based on the generated musical tone data via the effecter circuit.
At Step SA9, it is judged whether the current timing is a new page timing or not, for example, whether the reproduction of the music data PD corresponding to the lyrics displayed in the previous page is finished or not. If it is the new page timing, the process proceeds to Step SA10 as indicated with an arrow “YES”. If it is not the new page timing, the process proceeds to Step SA11 as indicated with an arrow “NO”. In the embodiment of the present invention, since the lyrics event LE of the lyrics data LD includes a new page command, the judgment of whether it is a new page timing is made by detecting the new page command in the lyrics event. Moreover, when lyrics data LD that does not include a new page command is used, the number of characters to be displayed on a page may be set in advance, for example, and the new page timing may be determined from that number of characters.
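A minimal sketch of this judgment, assuming the hypothetical LyricsEvent structure of the earlier sketch: the new page command is used when present, and a preset character count otherwise.

```python
MAX_CHARS_PER_PAGE = 120  # assumed preset value for lyrics data without new page commands


def is_new_page_timing(lyrics_event, chars_already_on_page):
    """Return True when the lyrics display should advance to the next page."""
    if lyrics_event.new_page:
        # explicit new page command recorded in the lyrics event LE
        return True
    # fallback: break the page once the preset number of characters is exceeded
    return chars_already_on_page + len(lyrics_event.text) > MAX_CHARS_PER_PAGE
```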
At Step SA10, the lyrics data LD up to the next new page timing (for every page) will be read, and a lyrics displaying screen for the next page is formed to display.
At Step SA11, a wipe process of the lyrics display is executed in accordance with the timing data TM of the lyrics data LD. The wipe process at this step is at least a process for making the lyrics corresponding to the current position in the music data visually recognizable to the user; for example, the display style of the lyrics after the current position and that of the lyrics before the current position are made different from each other. Moreover, the wipe process of the lyrics is executed character by character, or in units of one lyrics event LE corresponding to one note (note event NE). Further, a smooth wipe may be applied within one character.
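As an illustration of the wipe process, the following hedged sketch splits one page of lyrics into an already-sung part and a not-yet-sung part at the current reproduction position, per lyrics event; the position is assumed to be comparable as a (measure, beat, clock) tuple, and the caller is assumed to draw the two parts in different display styles.

```python
def split_page_for_wipe(page_events, current_position):
    """Return (wiped_text, remaining_text) for one page of lyrics events.

    page_events: LyricsEvent objects for the page, in timing order.
    current_position: (measure, beat, clock) tuple of the reproduction position."""
    wiped, remaining = [], []
    for ev in page_events:
        key = (ev.timing.measure, ev.timing.beat, ev.timing.clock)
        (wiped if key <= current_position else remaining).append(ev.text)
    return "".join(wiped), "".join(remaining)
```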
At Step SA12, a synchronization command (the synchronization information SI) is generated in accordance with the progress of the reproduction of the music data PD and transmitted to the PC. The synchronization information SI that is generated and transmitted at this step is based on the MIDI Standards, for example, a MIDI clock or a MIDI time code.
At Step SA13, a musical performance assistant function is executed if necessary. The musical performance assistant function is, for example, a fingering guide, etc.
At Step SA14, it is judged whether the reproduction of the music data PD is stopped (finished) or not. If the reproduction is stopped, the process proceeds to Step SA15 as indicated with an arrow “YES” and a stop command will be transmitted to the PC. Thereafter the process proceeds to Step SA16 to finish the process on the electronic musical instrument side. If the reproduction is continued (in progress), the process returns to Step SA7 to repeat the process after Step SA7.
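Putting the instrument-side steps together, the following is a hedged sketch of the loop from Step SA7 to Step SA15; `send` stands in for the MIDI-based transmission and `playback` for the reproduction unit 33, and both are hypothetical objects, not part of the disclosed apparatus.

```python
def instrument_side_playback(send, playback, lyrics_pages):
    """Sketch of Steps SA7-SA15 on the electronic musical instrument 1A."""
    send({"kind": "start"})                               # Step SA7: start command
    page = 0
    while not playback.stopped():                         # Step SA14: reproduction finished?
        playback.advance()                                # Step SA8: reproduce the music data PD
        if playback.at_new_page():                        # Step SA9: new page timing?
            page += 1                                     # Step SA10: show the next page locally
            print(lyrics_pages[page])
        # Step SA11: wipe the local lyrics display (see the wipe sketch above)
        send({"kind": "sync",                             # Step SA12: synchronization
              "position": playback.position()})           # information SI (e.g. MIDI clock)
        # Step SA13: optional musical performance assistant (e.g. fingering guide)
    send({"kind": "stop"})                                # Step SA15: stop command
```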
At Step SB1, the process (a lyrics displaying software program) executed by the computer (PC) is started. At Step SB2, it is judged whether the lyrics data LD for all the pages transmitted from the electronic musical instrument at Step SA4 is received or not. If the lyrics data LD is received, for example, the lyrics data LD is stored in the reproduction buffer 36 (
At Step SB3, a lyrics displaying screen is formed based on the first page data from the received lyrics data LD for all the pages to display it on the display 11 in the computer 1P.
At Step SB4, it is judged whether the start command transmitted at Step SA7 is received or not. If the start command is received, the process proceeds to Step SB5 as indicated with an arrow “YES”. If the start command is not received, Step SB4 is repeated as indicated with an arrow “NO” to wait for receiving the start command.
At Step SB5, it is judged whether the current timing is a new page timing or not, for example, whether the reproduction of the music data PD corresponding to the lyrics displayed in the previous page is finished or not. If the current timing is the new page timing, the process proceeds to Step SB6 as indicated with an arrow “YES”. If it is not the new page timing, the process proceeds to Step SB7 as indicated with an arrow “NO”. The judgment of whether it is the new page timing or not is made in a similar way as at Step SA9.
At Step SB6, the lyrics data LD up to the next new page timing (for every page) is read, and a lyrics displaying screen for the next page is formed to display.
At Step SB7, a wipe process of the lyrics display is executed in accordance with the timing data TM of the lyrics data LD. The wipe process at this step is at least a process for making the lyrics corresponding to the current position in the music data visually recognizable to the user. The velocity (tempo) of the wipe process is controlled on the PC side, and the wipe process is executed independently of that executed by the electronic musical instrument. Moreover, it is desirable that an initial value of the velocity (tempo) controlled on the PC side is, for example, received together with the lyrics data from the electronic musical instrument before the reproduction.
At Step SB8, it is judged whether the synchronization information SI transmitted at Step SA12 is received or not. If the synchronization information SI is received, the process proceeds to Step SB9 as indicated with an arrow “YES”, and timing synchronization is established by using the received synchronization information SI. That is, the progress of the wipe process controlled on the PC side is adjusted in accordance with the synchronization information. By establishing the timing synchronization, the lyrics display on the PC side can be synchronized with the reproduction of the music data by the electronic musical instrument and with the display timing of the lyrics data LD. If the synchronization information SI is not received, the process proceeds to Step SB10.
At Step SB10, it is judged whether the stop command transmitted at Step SA15 is received or not. If the stop command is received, the process proceeds to Step SB11 as indicated with an arrow “YES”. If the stop command is not received, the process returns to Step SB5 as indicated with an arrow “NO” to repeat the process after the Step SB5.
At Step SB11, the lyrics data LD stored in the reproduction buffer 36 is deleted, and the process proceeds to Step SB12 to finish the process on the PC side.
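Correspondingly, the following is a hedged sketch of the receiving-side Steps SB2 to SB12; `receive` is a hypothetical blocking call returning the next message from the electronic musical instrument, and the page-turning and wipe handling are abbreviated.

```python
def pc_side_display(receive):
    """Sketch of Steps SB2-SB12 on the computer 1P."""
    buffer = []                                    # reproduction buffer 36
    while True:                                    # Step SB2: wait for the lyrics data LD
        msg = receive()
        if msg["kind"] == "lyrics":
            buffer = msg["data"]
            break
    # Step SB3: form and display the first page from `buffer` (omitted here)
    while receive()["kind"] != "start":            # Step SB4: wait for the start command
        pass
    wipe_position = None
    while True:
        msg = receive()
        if msg["kind"] == "sync":                  # Steps SB8-SB9: re-establish timing
            wipe_position = msg["position"]        # synchronization from the received SI
        elif msg["kind"] == "stop":                # Step SB10: stop command received
            buffer.clear()                         # Step SB11: delete the lyrics data LD
            break
        # Steps SB5-SB7: page turning and the wipe proceed at the PC's own
        # tempo, corrected by `wipe_position` whenever new SI arrives (omitted)
```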
In the above-described examples shown in
Since the processes at Step SC1 and Step SC2 are similar to the processes at Step SA1 and Step SA2 in
At Step SC3, the lyrics data LD (
At Step SC4, a lyrics displaying screen for the first page is formed based on the lyrics data LD generated at Step SC3, and for example, the lyrics displaying screen is represented on the display 11 in the electronic musical instrument 1A.
Since the processes from Step SC5 to Step SC9 are similar to the processes from Step SA6 to Step SA10 in
At Step SC10, the lyrics data LD (
Since the processes from Step SC11 to Step SC16 are similar to the processes from Step SA11 to Step SA16 in
Also, since the processes from Step SD1 to Step SD4 are similar to the processes from Step SB1 to Step SB4 in
At Step SD5, it is judged whether the lyrics data LD transmitted (for a page to be displayed in the next screen) at Step SC10 is received or not. If the lyrics data LD is received, the process proceeds to Step SD6 as indicated with an arrow “YES”. If the lyrics data LD is not received, the process proceeds to Step SD7 as indicated with an arrow “NO”.
At Step SD6, as same as the process at Step SD3 (or Step SB3 in
Since the processes from Step SD7 to Step SD12 are similar to the processes from Step SB7 to Step SB12 in
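For this page-by-page variant, the following hedged sketch shows only the parts that differ from the earlier sketches: the instrument transmits the lyrics data for the next page at each new page timing (Step SC10), and the PC replaces its displayed page whenever such a page message arrives (Steps SD5 and SD6); the message layout is an assumption.

```python
def send_next_page(send, lyrics_pages, next_page_index):
    """Instrument side, Step SC10: transmit the lyrics data LD for the next page only."""
    send({"kind": "lyrics_page",
          "page": next_page_index,
          "data": lyrics_pages[next_page_index]})


def handle_page_message(msg, current_page):
    """PC side, Steps SD5-SD6: if a new page has been received, display it."""
    if msg["kind"] == "lyrics_page":
        return msg["data"]        # this becomes the page shown on the display
    return current_page           # otherwise keep the currently displayed page
```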
In the above-described examples shown in
Moreover, in the above-described examples shown in
As described above, according to the embodiment of the present invention, the lyrics data LD is extracted from the music data and transmitted to the external lyrics displaying apparatus, and the synchronization information can be transmitted during the reproduction of the music in accordance with the progress of the music. Therefore, an apparatus that can merely transmit the lyrics data LD and the synchronization information SI to the external electronic musical apparatus based on the MIDI Standards is sufficient, and the lyrics can be displayed on the external lyrics displaying apparatus without an expensive video-out device.
Moreover, if the transmissions of the above-described lyrics data LD and synchronization information are based on the MIDI Standards, no new hardware for displaying lyrics on the external electronic musical apparatus is necessary, because most electronic musical apparatuses are equipped with an interface based on the MIDI Standards.
Also, since the lyrics data LD received from the external electronic musical apparatus is displayed in accordance with the synchronization information, the lyrics can be displayed in cooperation with the external electronic musical apparatus.
Furthermore, although the deletion of the lyrics data LD from the reproduction buffer is executed immediately after the reproduction of one music at Step SB11 in
Also, in order to prohibit storing or copying of the received lyrics data to a storage medium, it is preferable that the lyrics displaying software on the receiving apparatus side has a protection function for copyright protection, or that the lyrics data is encoded (encrypted).
Although, in the embodiment, the transmission of the lyrics data LD is started when the music is selected, the invention is not limited to that. For example, the transmission may be started when the reproduction of the music is started (in this case, however, a user has to wait from the reproduction instruction until the display of the lyrics becomes possible), or the lyrics data LD of the stored music may be transmitted, regardless of music selection, during an idle time of the automatic musical performance apparatus. Moreover, if the transmission of the lyrics data is not finished when the reproduction of the music is instructed, it is desirable that the reproduction of the music is delayed until the transmission finishes.
Moreover, although the lyrics for one page is transmitted when it becomes a new page timing of the lyrics in the examples shown in
Further, although the MIDI clock and the MIDI time code are mentioned as the synchronization information SI, “start”, “stop”, a tempo clock (F8), performance position information (a measure, a beat, an elapsed clock count from the beginning of the music, an elapsed time from the beginning of the music), or any other type of information that can establish synchronization between the transmitting apparatus side and the receiving apparatus side may be used as the synchronization information SI.
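As an illustration of the last alternative listed above, the following is a hedged sketch of a synchronization message built from performance position information; the message layout is an assumption for this sketch only and is not defined by the MIDI Standards.

```python
def make_position_sync(measure, beat, clock_in_beat, elapsed_clocks, elapsed_seconds):
    """Build one synchronization message from performance position information."""
    return {
        "kind": "sync",
        "measure": measure,                  # current measure
        "beat": beat,                        # beat within the measure
        "clock": clock_in_beat,              # clock within the beat
        "elapsed_clocks": elapsed_clocks,    # clocks elapsed from the beginning of the music
        "elapsed_seconds": elapsed_seconds,  # time elapsed from the beginning of the music
    }
```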
Also, on the receiving apparatus side (the lyrics displaying apparatus), a background image may be selected corresponding to the music genre and displayed as a background of the lyrics display. The music genre may be transmitted from the electronic musical apparatus on the transmitting side to the electronic musical apparatus on the receiving side (the lyrics displaying apparatus) by including genre information in the music data, or the music genre may be determined from the contents of the lyrics data LD at the electronic musical apparatus on the receiving side (the lyrics displaying apparatus).
Also, instead of the lyrics data, or in addition to the lyrics data, chord name data may be stored in the music data, extracted, and transferred to the external apparatus, and the chord name data may be displayed in correspondence with the synchronization information received by the external apparatus. That is, the present invention can be applied not only to lyrics and chord names but also to any characters (text) that are displayed along with the progress of music. In that case, text data (including lyrics and chord names) is stored in the music data in advance, extracted from the music data at once or by a certain unit, and transmitted to the external device. When the music data is reproduced, the synchronization information is transmitted as required from the electronic music apparatus to the external device, and the external device controls the display style of the characters (text) in accordance with the received text data in synchronization with the synchronization information.
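Under the same assumptions as the earlier sketches, the extraction step generalizes directly from lyrics events to arbitrary text events such as chord names; the `type` field used here is hypothetical.

```python
def extract_text_events(music_data_events, kinds=("lyrics", "chord")):
    """Extract every text event (lyrics, chord names, ...) from the music data events."""
    return [ev for ev in music_data_events if ev.get("type") in kinds]
```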
Moreover, the electronic musical apparatus 1 (the electronic musical instrument 1A or the computer 1P) according to the embodiment of the present invention is not limited to the form of an electronic musical instrument or a computer, and it may be applied to a karaoke device, a mobile communication terminal such as a cellular phone, or an automatic performance piano. When it is applied to a mobile communication terminal, the terminal is not limited to one having complete functions; a system consisting of a terminal and a server may realize the functions as a whole, with the terminal having one part of the functions and the server having another part.
Also, when an electronic musical instrument is used, the type of the musical instrument is not limited to a keyboard instrument as explained in the embodiment of the present invention, and it may be a stringed instrument type, a wind instrument type or a percussion instrument type. Also, it is not limited to an apparatus in which the musical tone generator and the automatic musical performance device are built into one apparatus; they may be separate devices connected with each other by communication means such as MIDI and various networks.
Also, in the embodiment of the present invention, the transmitting side of the lyrics data LD is the electronic musical instrument 1A, and the receiving side (the lyrics displaying apparatus) is the computer 1P. The transmitting side may be the computer 1P, and the receiving side may be the electronic musical instrument 1A.
Also, the embodiment of the present invention may be executed by a general personal computer to which a computer program corresponding to the embodiment of the present invention is installed.
In such a case, the computer programs or the like realizing the functions of the embodiment may be stored in a computer readable storage medium such as a CD-ROM and a floppy disk and supplied to users.
The present invention has been described in connection with the preferred embodiments. The invention is not limited only to the above embodiments. It is apparent that various modifications, improvements, combinations, etc. can be made by those skilled in the art.