1. Field of the Invention
The present invention relates to a sheet music processing method and an image processing apparatus and, more particularly, to a sheet music processing method and image processing apparatus for analyzing image data of sheet music input from an image input device such as a scanner or multifunctional peripheral, and changing the display form of part of the sheet music in accordance with the analysis result.
2. Description of the Related Art
As low-cost scanner devices become available, an original can be easily scanned and converted into digital data even at home. Inkjet printers have been developed as printing apparatuses, and even a multifunctional peripheral called a multi-function printer capable of not only printing but also scanning images has become popular. A user can use a scanner device and printer device to easily copy his originals and contents at home.
An example of content which enhances user convenience when copied or printed is sheet music. Sheet music expresses music using notes, rests, musical signs, and the like. Sheet music has several notations such as staff notation and tablature. In principle, sheet music is often expressed in black. Especially for a beginner, the readability of sheet music is poor, and it is difficult to read melodies. To solve this problem, methods of changing the colors or shapes of musical signs, notes, and the like to improve readability have been disclosed. For example, Japanese Patent Laid-Open No. 07-311543 discloses a method of coloring and expressing notes in colors assigned to respective scales, and a method of writing arrows on sheet music. Japanese Patent Laid-Open No. 07-304245 discloses a method of coloring and expressing musical signs regarding the tempo and dynamics of a performance and the like, or the background of musical signs, in a color different from the color of notes.
In many cases, a player plays while not only giving attention to the notes and the like corresponding to the portion he is playing, but also recognizing the melody within a predetermined range following that portion. For this reason, it is desirable that the player be able to recognize the continuity of a melody from sheet music. The conventional techniques can improve readability by changing the display form of musical signs, notes, and the like, but do not allow the user to recognize the continuity of a melody.
Like a general book, sheet music describes information for a performance on the basis of rules built up through historical development. Sheet music is not equally convenient for all players. A player edits sheet music so that he can play it more easily by, for example, adding signs to the sheet music or coloring signs. This editing work optimizes the sheet music for the player. However, it is very cumbersome for the player to edit sheet music, so demands have arisen for easy editing of sheet music.
According to embodiments of the present invention, sheet music is analyzed for repeated melodies and undergoes image processing so that a repeating melody can be visually discriminated. Some embodiments of the present invention provide a sheet music processing method or image processing apparatus capable of easily creating sheet music from which the player can recognize the continuity of a melody.
Some embodiments of the present invention provide a sheet music processing method of processing, by an image processing apparatus, image data of sheet music input by an input device, the method comprising the steps of setting, by a user using a designation unit of the image processing apparatus, a unit in which the image data of the sheet music is processed; dividing the image data of the sheet music into units corresponding to the unit; determining whether image data of a first one of the units is repeated in one or more others of the units; and processing the image data, when the image data of the first one of the units is determined to be repeated in the one or more others of the units, to append information using the image processing apparatus to the image data of the first one of the units and the image data of the one or more others of the units.
Some embodiments of the present invention provide an image processing apparatus which processes image data of sheet music input by an input device, the apparatus comprising a setting unit configured to set a unit in which the image data of the sheet music is processed; a division unit configured to divide the image data of the sheet music into units corresponding to the unit; a determination unit configured to determine whether image data of a first one of the units is repeated in one or more others of the units; and an image data processing unit configured to, when the determination unit determines that the image data of the first one of the units is repeated in the one or more others of the units, append information to the image data of the first one of the units and the image data of the one or more others of the units.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Embodiments of the present invention will be described in detail below with reference to the accompanying drawings. The same reference numerals denote the same parts, and a repetitive description thereof will be omitted.
<PC Application>
An embodiment for practicing the present invention using application software running on a personal computer (to be referred to as a PC hereinafter) serving as a host apparatus will be explained.
An image processing system 101 in
In
The CPU 202 controls the operation of the overall image processing apparatus (PC 103) via the bus 207 in accordance with a program and data stored in the ROM 201. At the same time, the CPU 202 performs image processing using the RAM 203 as a work memory. For example, the CPU 202 performs image processing for image data input via the I/O unit 204 or NIC 205, or image data stored in advance in a storage medium such as the HDD 206. The CPU 202 outputs the processed image data via the I/O unit 204 or NIC 205, or stores it in a storage medium such as the HDD 206.
For example, the I/O unit 204 is connected via a predetermined interface to an image input/output device such as a monitor (e.g., CRT or LCD), a printer, or a scanner, or a storage device having a storage medium such as a magnetic disk or optical disk. The I/O unit 204 can receive/output image data via the NIC 205 from/to a computer device connected to the storage device and the above-mentioned image input/output device. Examples of the network are Ethernet, FDDI (Fiber Distributed Data Interface), IEEE1394 serial bus, and USB (Universal Serial Bus).
<MFP>
The same effects as those described above can also be obtained using an image processing system in which a single image processing apparatus operates without the mediation of a PC.
In
In
The reading unit 302 having a CCD reads an original image to output analog luminance data of red (R), green (G), and blue (B). The reading unit 302 may also include a contact image sensor (CIS) instead of the CCD. When the ADF 308 as shown in
The card interface 306 receives, in accordance with a predetermined operation to the operation unit 305, image data captured by a digital camera or the like and recorded on a memory card or the like. If necessary, an image processing unit 402 converts image data received via the card interface 306. For example, the image processing unit 402 converts image data corresponding to the color space (e.g., YCbCr) of a digital camera into one corresponding to the RGB color space (e.g., NTSC-RGB or sRGB). If necessary, the received image data undergoes various processing required for a given application, such as resolution conversion into an effective pixel count on the basis of header information of the image data.
A camera interface 413 is directly connected to a digital camera to read image data.
The image processing unit 402 performs image processes (to be described later) such as image analysis, calculation of the conversion characteristic, conversion from a luminance signal (RGB) into a density signal (CMYK), scaling, gamma conversion, and error diffusion. Data obtained by these image processes is stored in a RAM 407. When corrected data stored in the RAM 407 reaches a predetermined amount necessary for printing by the printing unit 303, the printing unit 303 executes a print operation.
A nonvolatile RAM 408 is, for example, a battery backup SRAM, and stores data unique to the MFP and the like.
The operation unit 305 has a photo direct print start key to select image data stored in a storage medium and start printing, a key to print an order sheet, and a key to read an order sheet. The operation unit 305 also has a copy start key to perform monochrome copying or color copying, a mode key to designate a mode such as resolution or image quality in copying, a stop key to stop a copy operation or the like, a ten-key pad to input the number of copies, a registration key, and the like. The CPU 401 detects the pressed states of these keys, and controls each unit in accordance with the states.
The display unit 304 includes a dot matrix type liquid crystal display (LCD) and LCD driver, and presents various displays under the control of the CPU 401. The display unit 304 displays the thumbnail of image data stored in a storage medium.
The printing unit 303 includes an inkjet head, general-purpose IC, and the like. The printing unit 303 reads out print data stored in the RAM 407, and prints it out as a hard copy under the control of the CPU 401.
A driving unit 411 includes a stepping motor for driving feed and delivery rollers in the operations of the reading unit 302 and printing unit 303, a gear for transmitting the driving force of the stepping motor, and a driver circuit for controlling the stepping motor.
A sensor unit 410 includes a sensor for detecting the width of a print medium, a sensor for detecting the presence/absence of a print medium, a sensor for detecting the width of an original image, a sensor for detecting the presence/absence of an original image, and a sensor for detecting the type of print medium. Based on pieces of information obtained from these sensors, the CPU 401 detects the states of an original and print medium.
A PC interface 414 is an interface between the PC and the MFP. The MFP executes operations such as printing and scanning on the basis of instructions transmitted from the PC via the PC interface 414.
In copying, image data of an original image read by the reading unit 302 is processed in the MFP, and printed by the printing unit 303. When the user designates a copy operation via the operation unit 305, the reading unit 302 reads an original set on the original table. Image data of the read original image is transmitted to the image processing unit 402, and undergoes image processing (to be described later). The processed image data is transmitted to the printing unit 303, and printed.
In step S110, image data which is read by the reading unit 302 and A/D-converted undergoes shading correction to correct for image sensor variations.
In step S120, the image data undergoes input device color conversion to convert image data corresponding to a color space unique to the reading unit 302 serving as an input device into image data corresponding to a standard color space. More specifically, the image data is converted into image data corresponding to a color space such as sRGB defined by the IEC (International Electrotechnical Commission) or AdobeRGB proposed by Adobe Systems. The conversion method is, for example, an arithmetic method based on a 3×3 or 3×9 matrix, or a look-up table method of referring to a table that describes a conversion rule and converting the color space on the basis of the table.
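As a minimal sketch of the matrix-based variant of this conversion, the following applies a 3×3 matrix to every pixel. The coefficients shown are illustrative placeholders and are not values from the embodiment; real coefficients would come from characterizing the reading unit against the target standard color space.

```python
import numpy as np

# Illustrative device-to-standard matrix; actual values depend on the
# scanner characterization and the chosen standard color space (e.g., sRGB).
DEVICE_TO_STANDARD = np.array([
    [1.05, -0.03, -0.02],
    [-0.02, 1.04, -0.02],
    [-0.01, -0.05, 1.06],
])

def device_rgb_to_standard(rgb_image):
    """Convert an HxWx3 device-RGB image (0..255) to the standard color space."""
    flat = rgb_image.reshape(-1, 3).astype(np.float64)
    converted = flat @ DEVICE_TO_STANDARD.T   # apply the 3x3 matrix per pixel
    return np.clip(converted, 0, 255).reshape(rgb_image.shape).astype(np.uint8)
```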
In step S130, the image data having undergone color conversion undergoes image correction/manipulation processing. The processing includes edge emphasis processing for correcting blurring generated upon reading an original image, processing for removing offset generated when reading by light irradiation, and character manipulation processing for improving character legibility.
In step S140, enlargement/reduction processing is executed to convert the image data at a desired ratio when the user designates resizing or in layout copying of laying out two original sheets on one paper sheet. The conversion method is generally a bicubic method, nearest neighbor method, or the like. When laying out and printing a plurality of images on one print medium in layout copying or the like, the operations in steps S110 to S140 are repeated to read a plurality of images and lay out the read images on one page. Then, the process shifts to the following print operation. Sheet music image data generation processing according to some embodiments is executed after this processing.
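To illustrate the simpler of the two conversion methods named above, here is a nearest-neighbor scaling sketch; a bicubic method would instead weight a 4×4 neighborhood of source pixels. This is an illustrative implementation, not the embodiment's exact code.

```python
import numpy as np

def resize_nearest(image, out_h, out_w):
    """Nearest-neighbor enlargement/reduction of an HxWx3 image."""
    in_h, in_w = image.shape[:2]
    # Map each output row/column back to the closest source row/column.
    ys = np.arange(out_h) * in_h // out_h
    xs = np.arange(out_w) * in_w // out_w
    return image[ys][:, xs]
```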
In step S150, the image data corresponding to the standard color space is converted into image data corresponding to a color space unique to an output device. Similar to step S120, the conversion method suffices to be an arithmetic method based on a matrix or the look-up table method. Image data corresponding to the color space unique to the output device is converted into one corresponding to the color space of the colors of inks used in an inkjet MFP, such as cyan, magenta, yellow, and black.
In step S160, the image data undergoes quantization processing. For example, when each pixel is expressed by a binary value representing whether or not to discharge an ink dot, the image data suffices to be binarized according to a quantization method such as error diffusion. As a result, the image data is converted into a data format printable by the printing unit. The print operation is executed based on the image data of this data format, forming an image.
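The sketch below binarizes one ink plane by Floyd-Steinberg error diffusion, which is one common error-diffusion method; the embodiment does not specify a particular kernel, so the weights here are an assumption for illustration.

```python
import numpy as np

def floyd_steinberg_binarize(plane):
    """Binarize one ink plane (0..255, larger = more ink) by error diffusion.
    Returns 0/1 dot data, where 1 means 'discharge an ink dot'."""
    img = plane.astype(np.float64).copy()
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            old = img[y, x]
            new = 255.0 if old >= 128 else 0.0
            out[y, x] = 1 if new else 0
            err = old - new
            # Distribute the quantization error to unprocessed neighbors.
            if x + 1 < w:
                img[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    img[y + 1, x - 1] += err * 3 / 16
                img[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1, x + 1] += err * 1 / 16
    return out
```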
In the first embodiment, sheet music is created using an MFP. Sheet music image data generation processing as a feature of the present invention will be explained with reference to the flowchart of
The first embodiment will exemplify the use of digital data (sheet music data) of sheet music obtained by reading the original sheet music with the reading unit of the MFP and digitizing it. The first embodiment is also applicable to digital data of sheet music distributed on the Internet, or MIDI data generally used as performance data, instead of such sheet music data.
In step S210, image data is input to an image processing unit 402. In the first embodiment, the image data is sheet music data obtained by reading the original sheet music using the reading unit of the MFP and digitizing it. When digital sheet music data distributed on the Internet is used as the image data, it suffices to directly input the digital data to the image processing unit 402. When MIDI data is used as the image data, the data is formed in a predetermined format (e.g., SMF file format). Thus, the data is converted into digital data expressed by general signs such as notes and rests, and the converted data is input to the image processing unit 402.
In step S220, image processing parameters necessary to implement the present invention are set. The image processing parameters are the following three types of parameters.
The first parameter designates a unit range for determining whether a set of notes, rests, and the like of sheet music is repeated, in order to execute the following sheet music data manipulation processing. The first parameter will be called a range parameter. As represented by lines 601 and 604 in
The second parameter designates, when the range parameter is, for example, one bar, the number of bars that have the same set of notes, rests, and the like as a given bar. When the number of such bars reaches the second parameter, the following sheet music data manipulation processing is executed. The second parameter will be called a count designation parameter.
The third parameter designates a concrete method of processing successive ranges when executing the sheet music data manipulation processing. In other words, the third parameter sets the type of information to be appended to the image data. The third parameter will be called a manipulation parameter.
The first embodiment performs the sheet music image data generation processing using the MFP. Parameters recommended as these three types of parameters are stored in advance in a ROM 406, and the recommended parameters are designated and set. Another parameter designation method is a method using a PC application. The parameters are set based on contents designated by the user via the user interface of the PC application. As still another method, it is also possible to arrange designation buttons corresponding to the parameters on the operation panel of an image processing apparatus, accept user instructions from these buttons, and set parameters. In this case, image processing contents which can be designated by the parameters need to be stored in advance in the ROM of the image processing apparatus.
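The three parameters might be grouped as in the following sketch. The names, the enumerated unit and manipulation types, and the defaults are hypothetical stand-ins for the recommended values described as being stored in the ROM 406.

```python
from dataclasses import dataclass
from enum import Enum, auto

class RangeUnit(Enum):
    BAR = auto()        # one bar, as in the first embodiment
    SLUR = auto()       # notes bound by a slur (second embodiment)
    REST_RUN = auto()   # a run of successive rests (third embodiment)

class Manipulation(Enum):
    FILL_ALTERNATING_GRAY = auto()  # light/dark gray backgrounds for repeats
    OUTLINE = auto()                # e.g., enclose repeated ranges in a frame

@dataclass
class SheetMusicParams:
    range_unit: RangeUnit = RangeUnit.BAR                            # range parameter
    repeat_count: int = 2                                            # count designation parameter
    manipulation: Manipulation = Manipulation.FILL_ALTERNATING_GRAY  # manipulation parameter

# Recommended defaults, analogous to the values stored in advance in the ROM 406.
RECOMMENDED = SheetMusicParams()
```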
A case will be explained where a count of two is set as the count designation parameter, and where filling the background image of a target bar in light gray in odd-numbered manipulation and in dark gray in even-numbered manipulation is designated as the manipulation parameter. When dark gray is expressed using eight bits, R=64, G=64, and B=64. When light gray is expressed using eight bits, R=128, G=128, and B=128.
In step S230, notes, musical signs, and the like are analyzed for one bar designated in step S220.
First, a method of analyzing the bar line of a bar in sheet music data will be explained.
Sheet music data undergoes skew correction processing. This is because the reading unit may read the original of sheet music with a skew or the original of sheet music itself may be set with a skew. The skew correction processing suffices to use a method of correcting a skew using the staff of sheet music. For example, the image density of read sheet music is calculated at a plurality of angles to create a histogram. When the angle is parallel to the staff, the histogram draws steep curves at portions corresponding to the five lines of the staff. From this, an angle at which the staff becomes parallel to the reading direction is calculated, and the entire image is rotated and corrected. If necessary, a binarization operation may also be done using a fixed or adaptive threshold.
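The projection-profile idea described above could be sketched as follows: rotate the binarized page over a small angle range and keep the angle at which the row sums are most sharply peaked, i.e., where the five staff lines align with image rows. The angle range, step, and variance score are illustrative assumptions, not values from the embodiment.

```python
import numpy as np
from scipy import ndimage

def estimate_skew(binary_page, angle_range=5.0, step=0.25):
    """binary_page: 0/1 array with staff ink as 1. Returns the corrective
    rotation angle (degrees) that makes the staff lines horizontal."""
    page = binary_page.astype(np.uint8)
    best_angle, best_score = 0.0, -1.0
    for angle in np.arange(-angle_range, angle_range + step, step):
        rotated = ndimage.rotate(page, angle, reshape=False, order=0)
        profile = rotated.sum(axis=1).astype(np.float64)
        score = profile.var()  # steep peaks at the five staff lines raise the variance
        if score > best_score:
            best_angle, best_score = angle, score
    return best_angle

def deskew(binary_page):
    """Rotate the whole page by the estimated corrective angle."""
    angle = estimate_skew(binary_page)
    return ndimage.rotate(binary_page.astype(np.uint8), angle, reshape=False, order=0)
```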
Then, the staff is extracted. The position of the sheet music serving as a reference (reference position of the sheet music) is obtained using staff information from the image data having undergone the skew correction processing by performing, for example, the binarization operation using a fixed or adaptive threshold. A clef (staff start clef) described at the beginning of a staff is a G or F clef. By detecting the G or F clef, the center position of the clef is defined as a staff data start position serving as the reference position of sheet music. The staff start clef can be detected by pattern matching with staff start clefs stored in advance in the ROM 406. The end of the staff can be obtained by detecting the break of the staff.
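One way to realize the pattern matching against stored staff start clefs is a naive normalized cross-correlation of a clef template over the page, as sketched below. The threshold and the brute-force search are assumptions for illustration; a practical implementation would restrict the search to the left end of each extracted staff.

```python
import numpy as np

def find_clef(page, template, threshold=0.7):
    """Slide a binary clef template (e.g., a stored G or F clef) over a binary
    page and return the (y, x) of the best match above the threshold, or None."""
    ph, pw = page.shape
    th, tw = template.shape
    t = template.astype(np.float64) - template.mean()
    t_norm = np.sqrt((t * t).sum())
    best, best_score = None, threshold
    for y in range(ph - th + 1):
        for x in range(pw - tw + 1):
            win = page[y:y + th, x:x + tw].astype(np.float64)
            w = win - win.mean()
            denom = np.sqrt((w * w).sum()) * t_norm
            if denom == 0:
                continue
            score = float((w * t).sum() / denom)  # normalized correlation
            if score > best_score:
                best, best_score = (y, x), score
    return best
```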
The extracted staff is divided into bars. The lines 601 and 604 in
After dividing the sheet music data into bars in the above-described way, notes, musical signs, and the like are analyzed for each bar. In
In step S240, it is determined whether the two successive bars extracted and analyzed in step S230 have the same notes, musical signs, and the like. More specifically, the note and staff images of the bars 605 and 606 stored in the RAM 407 in step S230 are superimposed on each other so as to make the staffs match each other.
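The superimposition test could be approximated as below: given two binary bar images already cropped so that their staffs coincide, measure how much their ink pixels overlap. The 0.98 agreement threshold and the overlap measure are assumptions, not the embodiment's exact criterion.

```python
import numpy as np

def bars_match(bar_a, bar_b, threshold=0.98):
    """Compare two binary bar images (nonzero = ink) whose staffs have been
    aligned. Returns True if the note/sign pixels agree closely enough to
    treat the bars as a repetition."""
    h = min(bar_a.shape[0], bar_b.shape[0])
    w = min(bar_a.shape[1], bar_b.shape[1])
    a, b = bar_a[:h, :w] > 0, bar_b[:h, :w] > 0
    ink = np.logical_or(a, b).sum()
    if ink == 0:
        return True  # both bars empty (e.g., whole rests)
    agree = np.logical_and(a, b).sum()
    return agree / ink >= threshold
```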
In
In step S260, the image data of the bar 606 in the RAM 407 is discarded. The image data of the bar 605 remains stored in order to determine whether the bar 605 is identical to the next bar. After that, the process advances to step S270.
In step S270, it is determined whether all bars of the image data have been processed. This image data is the data input to the image processing unit 402 in step S210. This image data may be not image data of one piece of sheet music but image data of a plurality of pieces of sheet music. Analysis of each bar starts from a bar including the G clef of the staff positioned at the top of the sheet music. After the bar at the end (right end) of the same staff is analyzed, a bar including the G clef positioned second from the top is analyzed. This processing is repeated to analyze all bars. More specifically, the processes in steps S230 to S260 are done for each bar.
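The repetition of steps S230 to S260 over all bars might be organized as in this simplified sketch, which keeps a stored reference bar and tests each later bar against it, manipulating matching pairs and discarding the working copies of non-matching bars. The function names are placeholders and the reference-bar policy is a simplification of the flow described above.

```python
def process_all_bars(bars, bars_match, manipulate):
    """bars: per-bar images in reading order (top staff first, left to right)."""
    if not bars:
        return
    reference = bars[0]   # e.g., the image data of bar 605 kept in the RAM 407
    repeat_index = 0
    for bar in bars[1:]:
        if bars_match(reference, bar):       # step S240
            repeat_index += 1
            manipulate(reference, repeat_index)  # step S250
            manipulate(bar, repeat_index)
        # Otherwise the working copy of this bar is discarded and the
        # reference bar remains stored for the next comparison (step S260).
```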
Image data of the bar 605 and a bar 608 are analyzed in the above-described way to reveal that the positions and numbers of notes in the bars 605 and 608 match each other. Hence, the bars 605 and 608 are subjected to manipulation processing. In step S240, it is determined that the image data of the bars 605 and 608 are repetitive data, and the process advances to step S250.
In step S250, manipulation processing complying with the manipulation parameter designated in step S220 is executed. More specifically, information is appended to the image data so as to fill these bars in light gray, as represented by the bars 605 and 608 in
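Appending this fill information might look like the following sketch: paint a gray background behind one bar's ink pixels, alternating light and dark gray with the manipulation count. The RGB values follow the example given above; the bar bounding box and the ink threshold are assumptions supplied by the preceding bar-line analysis.

```python
import numpy as np

LIGHT_GRAY = (128, 128, 128)  # odd-numbered manipulation
DARK_GRAY = (64, 64, 64)      # even-numbered manipulation

def fill_bar_background(rgb_image, bar_box, repeat_index):
    """Fill the background of one bar region with gray, leaving ink pixels intact.
    bar_box = (top, bottom, left, right) pixel coordinates of the bar."""
    top, bottom, left, right = bar_box
    color = LIGHT_GRAY if repeat_index % 2 == 1 else DARK_GRAY
    region = rgb_image[top:bottom, left:right]
    ink = region.mean(axis=2) < 96   # treat near-black pixels as notes/staff ink
    region[~ink] = color             # modify the background pixels in place
    return rgb_image
```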
In step S280, the image data having undergone predetermined image processing is output to an output device color converter in the image processing unit 402. Then, the process ends.
The first embodiment has exemplified a case where the range parameter is one bar. The second embodiment will exemplify a case where a different range parameter is set. Assume that the count designation parameter and manipulation parameter are the same as those in the first embodiment.
A slur is one of the musical signs; it binds several notes in an arc to indicate that they are to be played smoothly without a break, as represented by a slur 611 in
In step S230 of
The third embodiment will exemplify a case where a range parameter different from those in the first and second embodiments is set.
In the third embodiment, the range parameter is set based on the numbers of rests and notes. In some cases, it is effective to set the range parameter on the basis of the numbers of rests and notes. For example, in a part played by a musical instrument other than one played by the player in sheet music used in an orchestra or the like, rests may be successive in the part of the player. During the performance of this part, for example, the player may turn a page of sheet music or adjust a string. For this reason, it is convenient if the player can easily recognize that rests continue, as represented by 612 in
The first embodiment has exemplified a case where repetitive bars are filled in light gray in odd-numbered manipulation and in dark gray in even-numbered manipulation. However, the manipulation parameter may also be set to execute manipulation processing different from that described above.
For example, in the first embodiment, the repetitive bars 605 and 608 in
The present invention is also practiced by supplying the software program code for implementing the functions of the above-described embodiments to, for example, an apparatus connected to various devices, and operating these devices in accordance with programs stored in the computer of the apparatus or the like.
In this case, the software program code implements the functions of the above-described embodiments, and the program code itself and a means such as a computer-readable storage medium for supplying the program code to a computer constitute the present invention. Concrete examples of the computer-readable storage medium are a flexible disk, hard disk, optical disk, magnetooptical disk, CD-ROM, magnetic tape, nonvolatile memory card, and ROM.
The present invention includes program code when the functions of the above-described embodiments are implemented by executing the program code supplied to the computer. The present invention also includes program code when the functions of the above-described embodiments are implemented by an OS (Operating System) and another application software or the like running on the computer in cooperation with each other.
The functions of the above-described embodiments are also implemented when program code supplied to the computer is stored in the memories of the function expansion board and function expansion unit of the computer, and, for example, the CPUs of the function expansion board and function expansion unit execute processing on the basis of the instructions of the program code. The present invention also incorporates this program code.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2007-331074, filed Dec. 21, 2007, which is hereby incorporated by reference herein in its entirety.
Foreign Patent Documents
CA 2120657, Jul. 1995
JP 05-035924, Feb. 1993
JP 06-102869, Apr. 1994
JP 06-102870, Apr. 1994
JP 06-102871, Apr. 1994
JP 06-149235, May 1994
JP 06-149236, May 1994
JP 07-129156, May 1995
JP 07-304245, Nov. 1995
JP 07-311543, Nov. 1995
JP 2879941, Apr. 1999
JP 2000-163044, Jun. 2000
JP 3169142, May 2001
JP 2002-144664, May 2002
JP 2003-015636, Jan. 2003
JP 2003-177745, Jun. 2003
JP 2003-288075, Oct. 2003
JP 3801939, Jul. 2006