The present application is based upon and claims the benefit of priority from the prior Japanese Patent Application No. 2016-190423, filed Sep. 28, 2016, the entire contents of which are incorporated herein by reference.
The present invention relates to a chord judging apparatus and a chord judging method for judging chords of a musical piece.
There is a demand for extracting chords from music data. For instance, a standard MIDI (Musical Instrument Digital Interface) file in general includes a melody part and an accompaniment part. When a performer plays a musical piece on an electronic keyboard instrument, he/she can easily play the melody with the right hand and sometimes wants to enjoy playing the accompaniment part with the left hand. It would be preferable for standard MIDI files to include the accompaniment part, but most of them do not. As a matter of course, performers who own valuable electronic keyboard instruments will want to play their instruments with both hands. If the chords of a musical piece can be judged and indicated from its standard MIDI file, it will be a pleasure for the performers to play those chords with their left hands.
Several techniques of judging chords of music are disclosed in the following patent documents:
Japanese Unexamined Patent Publication No. 2000-259154,
Japanese Unexamined Patent Publication No. 2007-286637,
Japanese Unexamined Patent Publication No. 2015-40964, and
Japanese Unexamined Patent Publication No. 2015-79196.
According to one aspect of the invention, there is provided a chord judging method performed by a processor to judge chords of a musical piece whose data is stored in a memory, wherein the processor executes processes of estimating a first tonality based on component tones included in a first segment having a first length, the first segment being specified in the data of the musical piece; estimating a second tonality based on component tones included in a second segment having a second length different from the first length, the second segment being specified in the data of the musical piece and at least partially overlapping with the first segment; and comparing the estimated first tonality with the estimated second tonality to judge a tonality or a chord of the first segment of the musical piece.
According to another aspect of the invention, there is provided a chord judging apparatus for judging chords of a musical piece, provided with a processor and a memory for storing data of the musical piece, wherein the processor specifies plural segments in the data of the musical piece; estimates a tonality of each of the specified segments based on component tones included in the segment; and judges a chord of the plural segments of the musical piece based on modulation in tonality, when modulation is introduced in the estimated tonalities of the plural segments.
According to the invention as defined in the above claims, a tonality judgment which can detect modulation in tonality allows a more appropriate chord judgment.
The embodiments of the present invention will be described with reference to the accompanying drawings in detail.
The computer shown in
The CPU 101 serves to control the whole operation of the computer. The ROM 102 stores a chord-analysis processing program shown by flow charts of
The input unit 104 detects a user's input operation performed on a keyboard or with a mouse (both not shown), and notifies the CPU 101 of the detected result. For instance, the input operation includes an operation of selecting a musical piece, an operation of instructing execution of the chord-analysis, and an operation of instructing playback of a musical piece. Further, when the user operates the input unit 104, a standard MIDI file of a musical piece may be downloaded from the network through the communication interface 107.
The displaying unit 105 displays chord judgment data output under control of the CPU 101 on a liquid crystal display device.
When the user has operated the input unit 104 to obtain the standard MIDI file of a musical piece (music data) from the ROM 102 and/or the network and to instruct playback of that standard MIDI file, the sound system 106 successively reads the sequence of the standard MIDI file and creates a musical tone signal using an instrument sound designated by the user, outputting the musical tone signal from a speaker (not shown).
The note event holds the following structured data. ITime holds a sounding start time. IGate holds a gate time (sounding time length). "Tick" is used as a unit to measure a time length. For example, a quarter note has a time length of 480 ticks, and in a musical piece of a four-four meter, one beat has a time length of 480 ticks. byData[0] holds a status. byData[1] holds the pitch of the note made to sound. byData[2] holds the velocity of the note made to sound. byData[3] holds information required for controlling sounding of the note. "next" is a pointer which points to the following note event, and "prev" is a pointer which points to the previous note event. The CPU 101 refers to the "next" pointer and/or the "prev" pointer to access the following note event and/or the previous note event, respectively.
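For reference, the note event described above may be sketched in C roughly as follows; the declaration is inferred from the field names in the description and is not the actual declaration of the embodiment.

```c
/* A minimal sketch of the note event described above; the exact declaration
   used in the actual program may differ. */
typedef struct NoteEvent {
    long ITime;              /* sounding start time, in ticks (480 ticks per quarter note) */
    long IGate;              /* gate time, i.e. sounding time length, in ticks             */
    unsigned char byData[4]; /* [0] status, [1] pitch, [2] velocity, [3] sounding control  */
    struct NoteEvent *next;  /* pointer to the following note event                        */
    struct NoteEvent *prev;  /* pointer to the previous note event                         */
} NoteEvent;
```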
The CPU 101 refers to the pointer information such as metaev[0], metaev[1], metaev[2], . . . to obtain meta-information such as tempos and rhythms, which are necessary for controlling the sound system 106 to reproduce a musical piece.
The CPU 101 judges whether the user has tapped a specified button of the application program to instruct termination of the application program (step S402). When it is determined that the user has instructed to finish the application program (YES at step S402), the CPU 101 finishes the chord analyzing process shown by the flow chart of
When it is determined that the user has not yet instructed to finish the application program (NO at step S402), the CPU 101 judges whether the user has operated the input unit 104 to instruct to select a musical piece (step S403).
When it is determined that the user has instructed to select a musical piece (YES at step S403), the CPU 101 reads MIDI sequence data of the standard MIDI file of the musical piece having the data format shown in
Then, the CPU 101 performs the chord judging process to be described later to judge chords of the whole MIDI sequence data of the musical piece read in at step S404 (step S405). Thereafter, the CPU 101 returns to the process at step S402.
When it is determined that the user has not instructed to select a musical piece (NO at step S403), the CPU 101 judges whether the user has operated the input unit 104 to instruct to play back a musical piece (step S406).
When it is determined that the user has instructed to play back a musical piece (YES at step S406), the CPU 101 interprets the MIDI sequence data held in RAM 103 and gives the sound system 106 an instruction of generating sound to playback the musical piece (step S407). Thereafter, the CPU 101 returns to the process at step S402.
When it is determined that the user has not instructed to play back a musical piece (NO at step S406), the CPU 101 returns to the process at step S402.
The CPU 101 executes the tonality judging process to determine a tonality of each measure in the musical piece (step S501 in
The CPU 101 repeatedly executes a series of processes (step S503 to step S505) on each of the measures in the musical piece (step S502).
While repeatedly executing the processes on each of the measures, the CPU 101 repeatedly executes the processes at step S504 and step S505 on each of the beats in the measure (step S503). At step S504, the CPU 101 executes a pitch-class power creating process for each beat. In the pitch-class power creating process, the CPU 101 judges the component tones in the beat as a pitch-class power. The detail of the pitch-class power creating process will be described with reference to
At step S505, the CPU 101 executes a matching and result storing process. In the matching and result storing process, the CPU 101 judges the component tones of the beat based on accumulated values of power information of each pitch class in the current beat calculated at step S504, and decides the chord of the beat based on the component tones in the beat. The detailed process will be described with reference to
When the processes at step S504 and the step S505 have been executed in all the beats falling in the measure and the chord progressing data corresponding to all of the beats in the measure has been created, then the CPU 101 returns to the process at step S502.
When a series of processes (step S502 to step S505) have been executed on all the measures of the musical piece and the chord progressing data corresponding to all of the beats in all of the measures of the musical piece has been created, then the CPU 101 moves to the process at step S506.
In the process at step S506, the CPU 101 calculates, from among all the combinations of the chord progressing data, a combination of chords whose cost will be the minimum over the whole musical piece, the chord progressing data consisting of plural candidates in the data format shown in
As a result, the CPU 101 confirms a route of the chord progression all over the whole musical piece, whereby the optimum chords are determined (step S507). This process will be described with reference to
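For illustration, the overall flow of steps S501 to S507 may be sketched as follows in C; the function and variable names are placeholders rather than the actual routines of the embodiment.

```c
/* Illustrative outline of the chord judging process (steps S501 to S507).
   Function and variable names are placeholders, not the actual routines. */
void tonality_judging_process(void);                      /* S501 */
void pitch_class_power_creating(int meas, int beat);      /* S504 */
void matching_and_result_storing(int meas, int beat);     /* S505 */
void minimum_cost_calculating(void);                      /* S506 */
void route_confirming(void);                              /* S507 */

void chord_judging_process(int num_measures, int beats_per_measure)
{
    tonality_judging_process();                            /* decide a tonality per measure     */
    for (int meas = 0; meas < num_measures; meas++)        /* S502: every measure               */
        for (int beat = 0; beat < beats_per_measure; beat++) {   /* S503: every beat            */
            pitch_class_power_creating(meas, beat);        /* component tones of the beat       */
            matching_and_result_storing(meas, beat);       /* chord candidates of the beat      */
        }
    minimum_cost_calculating();                            /* cheapest chord connection overall */
    route_confirming();                                    /* fix the optimum chord progression */
}
```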
The tonality judging process (step S501 of
In the case of a musical piece of a quadruple meter read into the RAM 103, the measure number iMeasNo advances in the following way: 0, 1, 2, . . . , as shown at (a-3) in
In the flowchart of the tonality judging process shown in
The selection of the frame length is not restricted to the 1-measure, 2-measure, and 4-measure frame lengths; for instance, the frame length may instead be chosen from among a 2-measure, 4-measure, and 8-measure frame length. The CPU 101 shifts the starting measure of each frame by one measure (indicated by arrows (b-3), (b-4) and (b-5) in
The CPU 101 executes a key judging process (step S603). In the key judging process, the CPU 101 judges component tones in each frame defined by iFrameType and further judges a tonality of the judged component tones (the CPU 101 works as a key judging unit). This process will be described with reference to
As shown in the result of
Further, for (d) the 2-measure frame length iFrameType=1, the tonalities: B♭, C, C, B♭, A♭, and E♭ are judged for (a) the measure numbers iMeasNo which are successively displaced by one unit (two measures). The tonality judgment is made in order of the upper tier, lower tier, upper tier, lower tier, . . . as shown at (d) in
For (e) the 4-measure frame length iFrameType=2, the tonalities: B♭, C, C, A♭, A♭, and A♭ are judged for (a) the measure numbers iMeasNo which are successively displaced by one unit (four measures). The tonality judgment is made from the upper left tier to the lower right tier as shown at (e) in
Having executed the key judging process at step S603 in
As shown in the example of
As shown in the example of
As described above, in the present embodiment of the invention the result of tonality judgment made on the plural frame lengths iFrameType is comprehensively evaluated. Therefore, even if the tonality is modulated, since the judgment results made for the short frame length such as 1-measure frame length and/or 2-measure frame length are employed based on the power evaluation values, it is possible to detect modulation of tonality. Further, even in the case that it is impossible only in one measure to confirm sounding enough for judging a chord, since the judgment result made on a longer frame length such as 2-measure frame length and/or 4-measure frame length is employed based on the power evaluation value, it is possible to make an appropriate judgment. Further, in the embodiment, when a power evaluation value is calculated as described later, since a tone other than the scale tones of the tonality is taken into consideration, a precise tonality judgment can be maintained.
After having executed the process at step S604, the CPU 101 returns to the process at step S602. The CPU 101 repeatedly executes the key judging process (step S603) and the result storing process (step S604) on every measure of the musical piece with respect to one value of iFrameType with the frame start measure shifted by one measure. When having finished the above processes on every measure, the CPU 101 returns to the process at step S601. Then, the CPU 101 repeatedly executes a series of processes (step S602 to step S604) with respect to all the measure frame lengths, iFrameType=0, 1 and 2. When the processes at step S602 to step S604 have been finished with respect to iFrameType=0, 1 and 2, the tonality judging process (step S501 in
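The nested repetition described above (frame length, then starting measure) may be sketched as follows; again, the names are illustrative only.

```c
/* Illustrative outline of the tonality judging process (steps S601 to S604);
   the names are placeholders. */
void key_judging(int start_measure, int frame_len_measures);     /* S603 */
void result_storing(int start_measure, int frame_len_measures);  /* S604 */

void tonality_judging(int num_measures)
{
    static const int frame_len[3] = { 1, 2, 4 };   /* measures per frame for iFrameType = 0, 1, 2 */
    for (int iFrameType = 0; iFrameType < 3; iFrameType++)        /* S601 */
        for (int start = 0; start < num_measures; start++) {      /* S602: start shifted by one measure */
            key_judging(start, frame_len[iFrameType]);      /* judge the key of this frame            */
            result_storing(start, frame_len[iFrameType]);   /* keep the strongest key per measure     */
        }
}
```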
Then, the CPU 101 executes a series of processes (step S903 to step S910) with respect to all the values of ikey from 0 to 11 expressing the key value of the tonality (step S902). At first, the CPU 101 executes a series of processes at step S903 to step S908.
More specifically, the CPU 101 clears the first power evaluation value IPower and the second power evaluation value IOtherPower to “0” (step S903).
Then, the CPU 101 executes the processes at step S905 to step S907 with respect to each of the pitch classes iPc having a value from 0 to 11 (step S904).
The CPU 101 judges whether the current pitch class iPc designated at step S904 is included in the scale notes of the tonality determined based on the current key value ikey designated at step S902 (step S905). The judgment at step S905 is made based on calculation for determining whether a value of scalenote[(12+iPc−ikey) %12] is 1 or not. FIG. 10 is a view for explaining the scale notes. In
When it is determined that the current pitch class iPc designated at step S904 is included in the scale notes in the integrated scale corresponding to the current key value designated at step S902 (YES at step S905), the CPU 101 accumulates the pitch class power IPitchClassPower[iPc] calculated with respect to the pitch class iPc at step S901 to obtain the first power evaluation value IPower (step S906). In the process at step S906 in
Meanwhile, when it is determined that the current pitch class iPc designated at step S904 is not included in the scale notes in the integrated scale corresponding to the current key value designated at step S902 (NO at step S905), the CPU 101 accumulates the pitch class power IPitchClassPower[iPc] calculated with respect to the pitch class iPc in the process at step S901 to obtain the second power evaluation value IOtherPower (step S907).
After having executed the processes at step S905 to step S907 with respect to all the values from 0 to 11 of the pitch class iPc (the judgment at step S904 finishes), the CPU 101 divides the first power evaluation value IPower by the second power evaluation value IOtherPower to obtain a quotient as the power evaluation value doKeyPower corresponding to the current key value ikey designated at step S902 (step S908). When the process is executed at step S908, the first power evaluation value IPower indicates to what degree of strength the scale notes in the integrated scale corresponding to the current key value ikey designated at step S902 are sounding. The second power evaluation value IOtherPower indicates to what degree of strength the notes other than the scale notes in the integrated scale corresponding to the key value ikey are sounding. Therefore, the power evaluation value doKeyPower obtained by calculating “IPower/IOtherPower” is an index indicating to what degree the currently sounding notes in the current frame are similar to the scale notes in the integrated scale corresponding to the current key value ikey.
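A minimal sketch of the key judging loop of steps S902 to S910 is given below. The scalenote[] table is shown here as a plain major-scale mask purely as a placeholder for the integrated scale of the embodiment, the mapping of ikey=0 to C is an assumption, and the small constant added to IOtherPower (to avoid division by zero) is not part of the described process.

```c
/* Sketch of the key judging loop (steps S902 to S910).  The scalenote[] table
   below is a plain major-scale mask used only as a placeholder for the
   integrated scale of the embodiment, and ikey = 0 is assumed to mean C. */
static const int scalenote[12] = { 1, 0, 1, 0, 1, 1, 0, 1, 0, 1, 0, 1 };

int judge_key(const double IPitchClassPower[12], double *doMaxOut)
{
    double doMax = 0.0;
    int    imaxkey = 0;
    for (int ikey = 0; ikey < 12; ikey++) {                    /* S902 */
        double IPower = 0.0, IOtherPower = 0.0;                /* S903 */
        for (int iPc = 0; iPc < 12; iPc++) {                   /* S904 */
            if (scalenote[(12 + iPc - ikey) % 12])             /* S905 */
                IPower += IPitchClassPower[iPc];               /* S906 */
            else
                IOtherPower += IPitchClassPower[iPc];          /* S907 */
        }
        /* S908: the small constant (not in the described process) only avoids
           division by zero when no off-scale note is sounding at all. */
        double doKeyPower = IPower / (IOtherPower + 1e-9);
        if (doKeyPower >= doMax) {                             /* S909 */
            doMax   = doKeyPower;                              /* S910 */
            imaxkey = ikey;
        }
    }
    *doMaxOut = doMax;     /* power evaluation maximum value, used by the result storing process */
    return imaxkey;        /* power evaluation maximum key value                                 */
}
```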
The CPU 101 compares the power evaluation value doKeyPower corresponding to the current key value ikey calculated at step S908 with the power evaluation maximum value doMax corresponding to the key value being designated just before (step S909). When the power evaluation value doKeyPower is not smaller than the power evaluation maximum value doMax, the CPU 101 replaces the power evaluation maximum value doMax and the power evaluation maximum key value imaxkey with the current power evaluation value doKeyPower and the key value ikey, respectively (step S910). Then, the CPU 101 returns to the process at step S902, and executes the process for the following key value ikey.
The CPU 101 refers to the next pointer shown
The CPU 101 judges whether the current note event designated at step S1102 is involved in the frame (hereinafter, the “current frame range”) beginning from the starting measure designated at step S602 and having the frame length such as 1-measure frame length, 2-measure frame length, or 4-measure frame length, determined at step S601 in
When it is determined NO at step S1103, the CPU 101 determines that the current note event is not involved in the current frame range, and returns to the process at step S1102 to execute the process on the following note event.
When it is determined YES at step S1103, the CPU 101 judges whether the current frame range starting time iTickFrom comes after the sounding starting time “me-->ITime” of the current note event (step S1104).
When it is determined YES at step S1104, since the current frame range starting time iTickFrom comes after the sounding starting time “me-->ITime” of the current note event (the state of 1201 in
Meanwhile, when it is determined NO at step S1104, it is determined that the current frame range starting time iTickFrom is in the state of 1202 or 1203 in
After having executed the process at step S1105 or at step S1106, the CPU 101 judges whether the current frame range finishing time iTickTo comes after the sounding finishing time of the current note event (the sounding start time “me-->ITime”+the sounding time length “me-->IGate”) (step S1107).
When it is determined YES at step S1107, it is determined that the current frame range finishing time iTickTo comes after the sounding finishing time of the current note event (the state of 1201 or 1202 in
When it is determined NO at step S1107, it is determined that the current frame range finishing time iTickTo comes before the sounding finishing time of the current note event (the state of 1203 in
After having executed the process at step S1108 or at step S1109, the CPU 101 accesses the pitch byData[1] (Refer to
The CPU 101 divides the pitch iPitch of the current note event by 12, finding the remainder [iPitch %12] to calculate the pitch class of the current note event, and stores the following calculated value to the pitch class power IPitchClassPower[iPitch %12] of that pitch class stored in the RAM 103. The CPU 101 multiplies velocity information IPowerWeight, decided based on the velocity and part information of the current note event, by the sounding time length (ITickEnd−ITickStart) in the current frame range of the current note event to obtain the pitch class power IPitchClassPower[iPitch %12]. For instance, the velocity information IPowerWeight is obtained by multiplying the velocity me-->byData[2] (Refer to
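A sketch of steps S1102 to S1111 for one track and one frame is given below; NoteEvent refers to the structure sketched earlier, and the derivation of IPowerWeight from the velocity is simplified because the part coefficient is not fully specified here.

```c
/* Sketch of steps S1102 to S1111 for one track and one frame [iTickFrom, iTickTo).
   NoteEvent is the structure sketched earlier; the part coefficient applied to
   the velocity is omitted here (assumption). */
void accumulate_pitch_class_power(const NoteEvent *head,
                                  long iTickFrom, long iTickTo,
                                  double IPitchClassPower[12])
{
    for (int i = 0; i < 12; i++) IPitchClassPower[i] = 0.0;

    for (const NoteEvent *me = head; me != NULL; me = me->next) {       /* S1102 */
        long noteEnd = me->ITime + me->IGate;                 /* sounding finishing time          */
        if (noteEnd <= iTickFrom || me->ITime >= iTickTo)     /* S1103: note outside the frame    */
            continue;
        long ITickStart = (iTickFrom > me->ITime) ? iTickFrom : me->ITime;   /* S1104 to S1106  */
        long ITickEnd   = (iTickTo   < noteEnd)   ? iTickTo   : noteEnd;     /* S1107 to S1109  */

        int    iPitch       = me->byData[1];                  /* S1110: pitch of the note         */
        double IPowerWeight = (double)me->byData[2];          /* velocity-based weight            */
        /* S1111: weight multiplied by the sounding length inside the frame, per pitch class */
        IPitchClassPower[iPitch % 12] += IPowerWeight * (double)(ITickEnd - ITickStart);
    }
}
```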
After having executed the process at step S1111, the CPU 101 returns to the process at step S1102 and performs the process on the following note event.
When a series of processes (step S1103 to step S1111) have been repeatedly executed and the processes have finished on all the note events “me” corresponding to the current track number iTrack, then the CPU 101 returns to the process at step S1101 and executes the process on the following track number iTrack. Further, when the processes at step S1102 to step S1111 have been repeatedly executed and the processes have finished on all the track numbers iTrack, then the CPU 101 finishes the pitch class power creating process (step S901 in
The CPU 101 repeatedly executes a series of processes (step S1302 to step S1303) on every measure composing the musical piece (step S1301). In the process at step S1301, the CPU 101 gives the leading measure of the musical piece the measure number of “0” and successively gives the following measures the consecutive number “i”.
The CPU 101 judges whether the measure number “i” is included in a group of the measure numbers from the measure number of the starting measure of the frame designated at step S602 to the current frame range of the frame length designated at step S601 in
When it is determined NO at step S1302, the CPU 101 returns to the process at step S1301, and executes the process on the following measure number.
When it is determined YES at step S1302, the CPU 101 judges whether the power evaluation value doKeyPower which is calculated for the current frame range in the key judging process at step S603 in
When it is determined NO at step S1303, the CPU 101 returns to the process at step S1301, and executes the process on the following measure number.
When it is determined YES at step S1303, the CPU 101 sets the power evaluation maximum key value imaxkey calculated in the process at step S910 in
The tonality data is initially created and stored in the RAM 103 as shown in
As shown in
Further, in the result obtained in the tonality judging process shown in
Furthermore, as will be understood from the result obtained in the tonality judging process shown in
When the series of processes (step S1302 to step S1304) have been executed on all the measure numbers “i” composing the musical piece, the CPU 101 finishes the result storing process (step S604 in the flow chart of
As will be understood from the result obtained in the tonality judging process shown in
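A sketch of the result storing process of steps S1301 to S1304 is given below; the shape of the per-measure tonality record is an assumption, since the actual structure is the one shown in the figures.

```c
/* Sketch of the result storing process (steps S1301 to S1304); the shape of the
   per-measure tonality record is an assumption. */
typedef struct {
    int    ikey;      /* best key value found so far for this measure  */
    double doPower;   /* power evaluation value that produced that key */
} TonalityData;

void store_key_result(TonalityData keydata[], int num_measures,
                      int start_measure, int frame_len_measures,
                      int imaxkey, double doKeyPower)
{
    for (int i = 0; i < num_measures; i++) {                          /* S1301 */
        if (i < start_measure || i >= start_measure + frame_len_measures)
            continue;                                                 /* S1302: measure outside the frame */
        if (doKeyPower > keydata[i].doPower) {                        /* S1303: stronger evidence found   */
            keydata[i].ikey    = imaxkey;                             /* S1304: adopt this key            */
            keydata[i].doPower = doKeyPower;
        }
    }
}
```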
The pitch-class power creating process (step S504) and the matching and result storing process (step S505) will be described in detail. The pitch-class power creating process (step S504) and the matching and result storing process (step S505) are repeatedly executed on every measure in the musical piece (step S502) and on each beat in every measure (step S503) after the appropriate tonality in each measure of the musical piece has been judged in the tonality judging process at step S501 in
The pitch-class power creating process (step S504 in
The detailed process at step S504 in
After the replacement of the above variables, the CPU 101 executes the processes in accordance with the flow chart shown in
The CPU 101 executes a series of processes (step S1402 to step S1413) with respect to all the values iroot from 0 to 11, each indicating the root (fundamental note) of a chord (step S1401). The CPU 101 executes a series of processes (step S1403 to step S1413) with respect to all the chord types itype indicating types of chords (step S1402).
While repeatedly executing the processes (step S1403 to step S1413), the CPU 101 clears the first power evaluation value IPower and the second power evaluation value IOtherPower to “0” (step S1403).
The CPU 101 executes the processes at step S1405 to step S1407 on all the pitch classes iPc from 0 to 11 (step S1404).
The CPU 101 judges whether the current pitch class iPc designated at step S1404 is included in the chord tones of the chord decided based on the chord root iroot designated at step S1401 and the chord type itype designated at step S1402 (step S1405). The judgment at step S1405 is made based on whether “chordtone[itype][(12+iPc−iroot) %12]” is 1 or not.
When the current pitch class iPc designated at step S1404 is included in the chord tones of the chord decided based on the chord root iroot designated at step S1401 and the chord type itype designated at step S1402 (YES at step S1405), the CPU 101 accumulates the pitch class power IPitchClassPower[iPc] calculated at step S504 in
Meanwhile, when the current pitch class iPc designated at step S1404 is not included in the chord tones of the chord decided based on the chord root iroot designated at step S1401 and the chord type itype designated at step S1402 (NO at step S1405), the CPU 101 accumulates the pitch class power IPitchClassPower[iPc] calculated in the process at step S504 in
When having executed the processes at step S1405 to step S1407 on all the pitch classes iPc from 0 to 11 (FINISH at step S1407), the CPU 101 executes the following process. The CPU 101 decides a chord based on the chord root iroot and the chord type itype designated at present respectively at step S1401 and at step S1402 to determine the chord tones of the decided chord, and then divides the number of chord tones included in the scale tones of the tonality decided in the tonality judging process (step S501 in
TNR=(the number of chord tones included in the scale tones of the tonality)/(the number of chord tones) (1)
More specifically, the CPU 101 uses the measure number of the measure designated at present at step S502 in
For instance, when the tonality judgment results in C major, the compensation values for chords will be as follows: G7: 1.0, B dim: 1.0, B dim7: 0.75, B m7♭5: 1.0, D dim7: 0.75, F dim7: 0.75.
Further, the CPU 101 multiplies the first power evaluation value IPower calculated at step S1406 by the compensation coefficient TNR calculated at step S1408, and multiplies the second power evaluation value IOtherPower by a predetermined negative constant OPR, and then adds both the products to obtain the sum. Then, the CPU 101 sets the sum to the first power evaluation value IPower, thereby calculating a new power evaluation value IPower for the chord decided based on the chord root and the chord type designated at present respectively at step S1401 and at step S1402 (step S1409).
In the present embodiment, usage of the compensation coefficients TNR (1) will make the tonality judgment made on each measure in the tonality judging process (step S501 in
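The evaluation of one chord candidate (steps S1403 to S1409) may be sketched as follows. The chordtone[] table lists only a few chord types and the value of the negative constant OPR is assumed; scalenote_of_key[] stands for the scale-note mask of the tonality judged for the measure, rotated to that key.

```c
/* Sketch of the power evaluation of one chord candidate (steps S1403 to S1409).
   The chordtone[] table lists only a few chord types and the value of OPR is
   assumed; scalenote_of_key[] is the scale-note mask of the tonality judged for
   the measure, rotated to that key. */
enum { TYPE_MAJ, TYPE_MIN, TYPE_7TH, TYPE_DIM, NUM_TYPES };
static const int chordtone[NUM_TYPES][12] = {
    /* maj */ { 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0 },
    /* m   */ { 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0 },
    /* 7   */ { 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0 },
    /* dim */ { 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0 },
};
static const double OPR = -1.0;   /* negative weight for non-chord tones (assumed value) */

double chord_power(const double IPitchClassPower[12],
                   const int scalenote_of_key[12], int iroot, int itype)
{
    double IPower = 0.0, IOtherPower = 0.0;                    /* S1403 */
    int in_scale = 0, n_tones = 0;
    for (int iPc = 0; iPc < 12; iPc++) {                       /* S1404 */
        if (chordtone[itype][(12 + iPc - iroot) % 12]) {       /* S1405 */
            IPower += IPitchClassPower[iPc];                   /* S1406 */
            n_tones++;
            if (scalenote_of_key[iPc]) in_scale++;             /* chord tone that is also a scale tone */
        } else {
            IOtherPower += IPitchClassPower[iPc];              /* S1407 */
        }
    }
    double TNR = (double)in_scale / (double)n_tones;           /* S1408: compensation coefficient (1)  */
    return IPower * TNR + IOtherPower * OPR;                   /* S1409: final power evaluation value  */
}
```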
The CPU 101 repeatedly executes a series of processes (step S1411 to step S1413) on all the numbers "i" (i=0, 1, 2, . . . ) of the chord candidates corresponding to the beat number ICnt of the current beat in the chord progressing data shown in
In the repeatedly executed processes, the CPU 101 obtains a power evaluation value chordProg[ICnt][i].doPowerValue in the chord information referred to by the pointer information chordProg[ICnt][i] of the (i+1)th candidate (if i=0, the first candidate, if i=1, the second candidate, and if i=2, the third candidate, . . . ) corresponding to the current beat number ICnt. The current beat number ICnt is the consecutive beat number counted from the leading part of the musical piece. In the case of a musical piece of a four-four meter, the beat number ICnt is given by (4 beats×the measure number at step S502)+(the beat number at step S503). The CPU 101 judges whether the power evaluation value IPower calculated at step S1409 is larger than the above power evaluation value chordProg[ICnt][i].doPowerValue (step S1411).
When it is determined NO at step S1411, the CPU 101 returns to the process at step S1410 and increments “i” and executes the process on the following chord candidate.
When it is determined YES at step S1411, the CPU 101 sequentially accesses the chord information referred to by the pointer information chordProg[ICnt][i+1], the pointer information chordProg[ICnt][i+2], the pointer information chordProg[ICnt][i+3], . . . , and so on (step S1412). Then the CPU 101 stores the chord information (having the data format shown in
In the chord information, ITick stores a starting time of the current beat (decided at step S503) in the current measure decided at step S502. The starting time of the current beat corresponds to the current frame range starting time iTickFrom=480×(4 beats×the current measure number+the current beat number in the measure), as described in the description of the pitch class power creating process at step S504 in
After having finished executing the process on all the chord candidates (FINISH at step S1410), the CPU 101 returns to the process at step S1402 and executes the repeating process with respect to the following chord type itype.
After having finished executing the repeating process with respect to all the chord types itype (FINISH at step S1402), the CPU 101 returns to the process at step S1401 and executes the repeating process with respect to the following chord root iroot.
After having finished executing the repeating process with respect to all the chord roots iroot (FINISH at step S1401), the CPU 101 finishes the matching and result storing process (step S505 in the flow chart in
A minimum cost calculating process at step S506 in
In general, there are natural musical rules governing the connection of chords before and/or after chords notated "sus4" and "mM7". For example, in most cases the chord placed after a "sus4" chord has the same chord root as the preceding chord, and the chords placed before and/or after an "mM7" chord have the same chord root and are minor chords.
In the present embodiment, a cost of connection between two chords is defined based on a musical connection rule. At step S506 in
As shown in
In the minimum cost calculating process executed in the present embodiment, the total cost needed during the term from the time at which a chord starts sounding at the timing of the leading beat of the musical piece to the time at which the chord candidate of the candidate number iCurChord currently selected at the timing of the current beat IChordIdx starts sounding, after chord candidates have been successively selected at each beat timing, is defined as the optimum chord total minimum cost doOptimizeChordTotalMinimalCost[IChordIdx][iCurChord], an array variable stored in the RAM 103. Then, the optimum chord total minimum costs previously calculated for the three chord candidates at the next preceding beat timing IPrevChordIdx are added respectively to the connection costs between the current chord candidate and those three chord candidates, whereby three sums are obtained. The minimum sum among the three sums is determined as the optimum chord total minimum cost doOptimizeChordTotalMinimalCost[IChordIdx][iCurChord]. The chord candidate showing the minimum cost value at the next preceding beat timing IPrevChordIdx is defined as the next preceding optimum chord route iOptimizeChordRoutePrev[IChordIdx][iCurChord] leading to the current chord candidate, also an array variable stored in the RAM 103. In the minimum cost calculating process at step S506 in
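A sketch of the dynamic-programming pass of steps S1701 to S1709 is given below; NUM_CAND and the helper connection_cost_at() are placeholders, the latter standing for a lookup of the chord information of both candidates followed by the rule-based cost sketched later for the cost calculating process.

```c
/* Sketch of the dynamic-programming pass of steps S1701 to S1709.  NUM_CAND and
   connection_cost_at() are placeholders. */
#define NUM_CAND 3
extern double connection_cost_at(int IChordIdx, int iPrevChord, int iCurChord);

void minimum_cost(int num_beats,
                  double doOptimizeChordTotalMinimalCost[][NUM_CAND],
                  int    iOptimizeChordRoutePrev[][NUM_CAND])
{
    for (int c = 0; c < NUM_CAND; c++)
        doOptimizeChordTotalMinimalCost[0][c] = 0.0;            /* the leading beat costs nothing */

    for (int IChordIdx = 1; IChordIdx < num_beats; IChordIdx++) {         /* S1701 */
        int IPrevChordIdx = IChordIdx - 1;                                /* S1702 */
        for (int iCurChord = 0; iCurChord < NUM_CAND; iCurChord++) {      /* S1703 */
            double doMin = 1e30;                                /* initially a very large value   */
            int    iMinPrevChord = 0;
            for (int iPrevChord = 0; iPrevChord < NUM_CAND; iPrevChord++) {            /* S1704 */
                double doCost = connection_cost_at(IChordIdx, iPrevChord, iCurChord);  /* S1705 */
                doCost += doOptimizeChordTotalMinimalCost[IPrevChordIdx][iPrevChord];  /* S1706 */
                if (doCost <= doMin) {                                    /* S1707 */
                    doMin = doCost;                                       /* S1708 */
                    iMinPrevChord = iPrevChord;
                    doOptimizeChordTotalMinimalCost[IChordIdx][iCurChord] = doCost;
                }
            }
            iOptimizeChordRoutePrev[IChordIdx][iCurChord] = iMinPrevChord;   /* best previous candidate */
        }
    }
}
```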
The CPU 101 stores a value of (the current beat timing IChordIdx−1) to the next preceding beat timing IPrevChordIdx (step S1702).
The CPU 101 designates the candidate number iCurChord at the current beat timing with respect to all the chord candidates at every current beat timing IChordIdx designated at step S1701, to repeatedly execute a series of processes at step S1704 to step S1709 (step S1703).
The CPU 101 designates the candidate number iPrevChord at the next preceding beat timing with respect to all the chord candidates at the next preceding beat timing, for every candidate number iCurChord at the current beat timing designated at step S1703, to repeatedly execute a series of processes at step S1705 to step S1708 (step S1704).
In the processes at step S1705 to step S1709, the CPU 101 calculates the connection cost defined when the chord candidate of the candidate number IPrevChord at the next preceding beat timing designated at step S1704 is modulated to the chord candidate of the candidate number iCurChord at the current beat designated at step S1703, and stores the calculated cost as a cost doCost (as a variable) in the RAM 103 (step S1705).
The CPU 101 adds the optimum chord total minimum cost doOptimizeChordTotalMinimalCost[IPrevChordIdx][iPrevChord], which has been held for the chord candidate of the candidate number iPrevChord at the next preceding beat timing designated at step S1704, to the cost doCost (step S1706). In the case of the next preceding beat timing IPrevChordIdx=0 at the current beat timing IChordIdx=1, the optimum chord total minimum cost doOptimizeChordTotalMinimalCost[0][iPrevChord] (iPrevChord=0, 1, 2) is 0.
The CPU 101 judges whether the cost doCost updated at step S1706 is not larger than the cost minimum value doMin which has been calculated so far for the candidate number iCurChord at the current beat timing designated at step S1703 and stored in the RAM 103 (step S1707). The cost minimum value doMin is set to an initially large value when the CPU 101 designates a new candidate number iCurChord at the current beat timing at step S1703.
When it is determined NO at step S1707, the CPU 101 returns to the process at step S1704 and increments the candidate number iPrevChord to execute the process on the following candidate number iPrevChord at the next preceding beat timing.
When it is determined YES at step S1707, the CPU 101 stores the cost doCost to the cost minimum value doMin in the RAM 103 and stores the candidate number iPrevChord at the next preceding beat timing designated at step S1704 to the cost minimum next-preceding chord iMinPrevChord in the RAM 103. Further, the CPU 101 stores the cost doCost onto the optimum chord total minimum cost doOptimizeChordTotalMinimalCost[IChordIdx][iCurChord] of the chord candidate of the candidate number iCurChord at the current beat timing IChordIdx (step S1708). Thereafter, the CPU 101 returns to the process at step S1704 and increments the candidate number iPrevChord to execute the process on the following candidate number iPrevChord at the next preceding beat timing.
Having executed the series of processes (step S1705 to step S1708) on each candidate number iPrevChord at the next preceding beat timing successively designated at step S1704, the CPU 101 finishes the process on all the candidate numbers iPrevChord (=0, 1, 2) at the next preceding beat timing, and then executes the following process. The CPU 101 stores the cost minimum next-preceding chord iMinPrevChord onto the next preceding optimal chord route iOptimizeChordRoutePrev[IChordIdx][iCurChord] of the candidate number iCurChord at the current beat timing IChordIdx. Thereafter, the CPU 101 returns to the process at step S1703 and increments the candidate number iCurChord to execute the process on the following candidate number iCurChord at the current beat timing.
Executing the processes (step S1704 to step S1709) on each candidate number iCurChord successively designated at step S1703 at the current beat timing, the CPU 101 finishes the process on all the candidate numbers iCurChord (=0, 1, 2) at the current beat timing, and returns to the process at step S1701. The CPU 101 increments the beat timing IChordIdx to execute the process on the following candidate numbers at the following beat timing IChordIdx.
When the processes (step S1702 to step S1709) have been executed at each of the current beat timings IChordIdx sequentially designated at step S1701 and the process has finished at all the current beat timings IChordIdx, the CPU 101 finishes executing the minimum cost calculating process (flow chart in
Similarly, the CPU 101 stores the pointer information chordProg[IPrevChordIdx][iPrevChord] to the chord information (in the RAM 103) of the candidate number iPrevChord at the next preceding beat timing IPrevChordIdx onto the next preceding pointer (a variable) "prev" stored in the RAM 103 (step S1802).
The CPU 101 sets the connection cost doCost to an initial value 0.5 (step S1803).
The CPU 101 adds 12 to the chord root cur.iRoot (Refer to
When it is determined YES at step S1804, then it is evaluated that the modulation from the chord candidate of the candidate number iPrevChord at the next preceding beat timing IPrevChordIdx to the chord candidate of the candidate number iCurChord at the current beat timing IChordIdx introduces natural change in chords with an interval difference of 5 degrees. In this case, the CPU 101 sets the best value or the lowest value 0.0 to the connection cost doCost (step S1805).
When it is determined NO at step S1804, the CPU 101 skips over the process at step S1805 to maintain the connection cost doCost at 0.5.
The CPU 101 judges whether the chord type prev.iType (Refer to
When it is determined YES at step S1806, then it is decided that this case (chord modulation) meets well the music rule: a chord following the chord of “sus4” often has the same chord root as the chord of “sus4”, and introduces a natural chord modulation. In this case, the CPU 101 sets the best value or the lowest value 0.0 to the connection cost doCost (step S1807).
When it is determined NO at step S1806, the chord modulation is not natural. In this case, the CPU 101 sets the worst value 1.0 to the connection cost doCost (step S1808).
Then, the CPU 101 judges whether the chord type prev.iType in the chord information of the candidate number iPrevChord at the next preceding beat timing IPrevChordIdx is "mM7", the chord type cur.iType in the chord information of the candidate number iCurChord at the current beat timing IChordIdx is "m7", and the chord root prev.iRoot and the chord root cur.iRoot in both pieces of chord information are the same (step S1809).
When it is determined YES at step S1809, the chord modulation meets well the music rule and very natural. In this case, the CPU 101 sets the best value or the lowest value 0.0 to the connection cost doCost (step S1810).
When it is determined NO at step S1809, the chord modulation is not natural. In this case, the CPU 101 sets the worst value 1.0 to the connection cost doCost (step S1811).
Further, the CPU 101 judges whether the chord type prev.iType in the chord information of the candidate number iPrevChord at the next preceding beat timing IPrevChordIdx is "maj", the chord type cur.iType in the chord information of the candidate number iCurChord at the current beat timing IChordIdx is "m", and the chord root prev.iRoot and the chord root cur.iRoot in both pieces of chord information are the same (step S1812).
When it is determined YES at step S1812, the chord modulation is not natural. In this case, the CPU 101 sets the worst value 1.0 to the connection cost doCost (step S1813).
When it is determined NO at step S1812, the CPU 101 skips over the process at step S1813.
Finally, the CPU 101 subtracts the power evaluation value cur.doPowerValue in the chord information of the candidate number iCurChord at the current beat timing IChordIdx from 1 to obtain a first difference, and further subtracts the power evaluation value prev.doPowerValue in the chord information of the candidate number iPrevChord at the next preceding beat timing IPrevChordIdx from 1 to obtain a second difference. Then, the CPU 101 multiplies the first difference, the second difference and doCost together, thereby adjusting the connection cost doCost (step S1814). Then the CPU 101 finishes the cost calculating process (flow chart in
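A sketch of the cost calculating process of steps S1801 to S1814 is given below. The structure and the chord-type codes are assumptions; the comparison at step S1804 is reconstructed as checking whether the current root lies five semitones above the preceding root (dominant-style motion), since the exact expression refers to a figure; and each rule is applied only when the preceding chord actually has the type the rule mentions, an interpretation consistent with the worked example that follows.

```c
/* Sketch of the cost calculating process (steps S1801 to S1814).  The structure,
   the chord-type codes, and the reconstruction of the S1804 comparison are
   assumptions. */
enum { TYPE_MAJ, TYPE_MIN, TYPE_M7, TYPE_MM7, TYPE_SUS4 /* ... further types */ };

typedef struct {
    int    iRoot;          /* chord root, 0..11                 */
    int    iType;          /* chord type code                   */
    double doPowerValue;   /* power evaluation value of S1409   */
} ChordInfo;

double connection_cost(const ChordInfo *prev, const ChordInfo *cur)
{
    double doCost = 0.5;                                           /* S1803: neutral default value      */

    if ((cur->iRoot + 12 - prev->iRoot) % 12 == 5)                 /* S1804: natural 5-degree motion    */
        doCost = 0.0;                                              /* S1805 */

    if (prev->iType == TYPE_SUS4)                                  /* S1806: sus4 wants the same root   */
        doCost = (cur->iRoot == prev->iRoot) ? 0.0 : 1.0;          /* S1807 / S1808 */

    if (prev->iType == TYPE_MM7)                                   /* S1809: mM7 -> m7 on the same root */
        doCost = (cur->iType == TYPE_M7 &&
                  cur->iRoot == prev->iRoot) ? 0.0 : 1.0;          /* S1810 / S1811 */

    if (prev->iType == TYPE_MAJ && cur->iType == TYPE_MIN &&       /* S1812: maj -> m on the same root  */
        cur->iRoot == prev->iRoot)
        doCost = 1.0;                                              /* S1813: judged unnatural           */

    /* S1814: weight the cost by how weakly each candidate matched its own beat */
    return doCost * (1.0 - cur->doPowerValue) * (1.0 - prev->doPowerValue);
}
```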
In the minimum cost calculating process in
In the case of the chord candidate of "A mM7" of the candidate number iCurChord=1 (second candidate) at the current beat timing IChordIdx=1, a calculation is performed in a similar manner. The optimal chord total minimum cost doOptimizeChordTotalMinimalCost[1][1] of the current chord candidate "A mM7" is calculated and 0.5 is obtained as indicated in the bold line circle of "A mM7". As the next preceding optimum chord route iOptimizeChordRoutePrev[1][1] of the current chord candidate "A mM7", the next preceding chord candidate "Cm" is set, as indicated by the bold line arrow indicating the bold line circle of "A mM7".
When the current beat timing progresses by 1 to IChordIdx=2, the current chord candidate will be "Dm" at the candidate number iCurChord=0 (first candidate). In this case, at the next preceding beat timing IPrevChordIdx=1, the connection cost doCost defined when the next preceding chord candidate "Am" of the candidate number iPrevChord=0 (first candidate) is modulated to the current chord candidate "Dm" is calculated using the algorithm shown by the flow chart of
With respect to the chord candidate of "Dsus4" of the candidate number iCurChord=1 (second candidate) at the current beat timing IChordIdx=2, the calculation is performed in a similar manner. The optimal chord total minimum cost doOptimizeChordTotalMinimalCost[2][1] of the current chord candidate "Dsus4" is calculated and 0.5 is obtained as indicated in the bold line circle of "Dsus4". As the next preceding optimum chord route iOptimizeChordRoutePrev[2][1] of the current chord candidate "Dsus4", the next preceding chord candidate "Am" is set, as indicated by the bold arrow indicating the bold line circle of "Dsus4".
When the current beat timing further progresses by 1 to IChordIdx=3, the current chord candidate will be "G7" with the candidate number iCurChord=0 (first candidate). In this case, at the next preceding beat timing IPrevChordIdx=2, the connection cost doCost defined when the next preceding chord candidate "Dm" of the candidate number iPrevChord=0 (first candidate) is modulated to the current chord candidate "G7" is calculated using the algorithm shown by the flow chart of
In the case of the chord candidate of "Bdim" of the candidate number iCurChord=1 (second candidate) at the current beat timing IChordIdx=3, the calculation is performed in a similar manner. The optimal chord total minimum cost doOptimizeChordTotalMinimalCost[3][1] of the current chord candidate "Bdim" is calculated and 1.0 is obtained as indicated in the bold line circle of "Bdim". As the next preceding optimum chord route iOptimizeChordRoutePrev[3][1] of the current chord candidate "Bdim", the next preceding chord candidate "Dm" is set, as indicated by the bold arrow indicating the bold line circle of "Bdim".
The route confirming process at step S507 in
In the example shown in
In the processes at step S1902 to step S1906, the CPU 101 judges whether the tail beat timing has been designated (step S1902).
The CPU 101 repeatedly executes a series of processes (step S1904 to step S1906) on all the chord candidates of the candidate number iCurChord at the tail beat timing IChordIdx designated at step S1901 (step S1903). In these processes, the candidate number iCurChord is searched for which shows the minimum value of the optimal chord total minimum cost doOptimizeChordTotalMinimalCost[IChordIdx][iCurChord] at the tail beat timing IChordIdx, as described in
In the processes executed repeatedly at step S1904 to step S1906, the CPU 101 judges whether the optimal chord total minimum cost doOptimizeChordTotalMinimalCost[IChordIdx][iCurChord] of the candidate number iCurChord designated at step S1903 at the tail beat timing IChordIdx designated at step S1901 is not larger than the cost minimum value doMin stored in the RAM 103 (step S1904). The cost minimum value doMin is initially set to a large value.
When it is determined NO at step S1904, the CPU 101 returns to the process at step S1903 and increments the candidate number iCurChord.
When it is determined YES at step S1904, the CPU 101 sets the optimal chord total minimum cost doOptimizeChordTotalMinimalCost[IChordIdx][iCurChord] of the candidate number iCurChord designated at step S1903 and at the tail beat timing IChordIdx designated at step S1901 to the cost minimum value doMin stored in the RAM 103 (step S1905).
The CPU 101 sets the candidate number iCurChord currently designated at step S1903 to the best chord candidate number iChordBest in RAM 103 (step S1906). Then the CPU 101 returns to the process at step S1903 and increments the candidate number iCurChord to execute the process thereon.
When the processes at step S1904 to step S1906 have been executed on all the candidate numbers iCurChord, the CPU 101 moves to the process at step S1908. In this state, the chord candidate number of the chord candidate showing the minimum value of the optimal chord total minimum cost at the tail beat timing has been obtained as the best chord candidate number iChordBest. At step S1908, the CPU 101 stores the chord root chordProg[IChordIdx][iChordBest].iRoot in the chord information of the best chord candidate number iChordBest at the tail beat timing IChordIdx onto the chord root chordProg[IChordIdx][0].iRoot in the chord information of the first candidate at the tail beat timing IChordIdx (step S1908).
Then, the CPU 101 stores the chord type chordProg[IChordIdx][iChordBest].iType in the chord information of the best chord candidate number iChordBest at the current tail beat timing IChordIdx onto the chord type chordProg[IChordIdx][0].iType in the chord information of the first candidate at the current tail beat timing IChordIdx (step S1909).
The CPU 101 stores the next preceding optimal chord route iOptimizeChordRoutePrev[IChordIdx][iChordBest] of the chord candidate of the best chord candidate number iChordBest at the current tail beat timing IChordIdx onto the candidate number iPrevChord at the next preceding beat timing (step S1910). Then the CPU 101 returns to the process at step S1901 and decrements the beat timing IChordIdx to execute the process thereon.
When the timing comes to the beat timing just before the tail, it is determined NO at step S1902. The CPU 101 stores the next preceding optimal chord route which was stored in the candidate number iPrevChord of the next preceding beat timing at step S1910, onto the best chord candidate number iChordBest (step S1907).
Further, executing the processes at step S1908 and step S1909, the CPU 101 stores the chord root chordProg[IChordIdx][iChordBest].iRoot and the chord type chordProg[IChordIdx][iChordBest].iType in the chord information of the best chord candidate number iChordBest at the current beat timing IChordIdx onto the chord root chordProg[IChordIdx][0].iRoot and the chord type chordProg[IChordIdx][0].iType in the chord information of the first candidate at the current beat timing IChordIdx, respectively.
Thereafter, the CPU 101 stores the next preceding optimal chord route iOptimizeChordRoutePrev[IChordIdx][iChordBest] of the chord candidate of the best chord candidate number iChordBest at the current beat timing IChordIdx onto the candidate number iPrevChord at the next preceding beat timing (step S1910). And the CPU 101 returns to the process at step S1901 and decrements the beat timing IChordIdx to execute the process thereon.
Having repeatedly executed the processes on each beat timing IChordIdx, the CPU 101 can output the optimum progression of chords as the chord root chordProg[IChordIdx][0].iRoot and the chord type chordProg[IChordIdx][0].iType in the chord information of the first candidate at each beat timing IChordIdx, respectively.
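A sketch of the route confirming process of steps S1901 to S1910 is given below; instead of copying the chosen chord into candidate slot 0 of chordProg as the text describes, the sketch simply records the chosen candidate number for each beat.

```c
/* Sketch of the route confirming process (steps S1901 to S1910).  Instead of
   copying the chosen chord into candidate slot 0 of chordProg, the sketch
   records the chosen candidate number for each beat. */
#define NUM_CAND 3

void confirm_route(int num_beats,
                   const double doOptimizeChordTotalMinimalCost[][NUM_CAND],
                   const int    iOptimizeChordRoutePrev[][NUM_CAND],
                   int best_chord[])      /* output: chosen candidate number per beat */
{
    int iChordBest = 0;
    for (int IChordIdx = num_beats - 1; IChordIdx >= 0; IChordIdx--) {   /* S1901 */
        if (IChordIdx == num_beats - 1) {                                /* S1902: the tail beat        */
            double doMin = 1e30;
            for (int c = 0; c < NUM_CAND; c++)                           /* S1903 to S1906              */
                if (doOptimizeChordTotalMinimalCost[IChordIdx][c] <= doMin) {
                    doMin = doOptimizeChordTotalMinimalCost[IChordIdx][c];
                    iChordBest = c;
                }
        }
        best_chord[IChordIdx] = iChordBest;                              /* S1908, S1909: adopt chord   */
        if (IChordIdx > 0)                                               /* S1910: follow the stored    */
            iChordBest = iOptimizeChordRoutePrev[IChordIdx][iChordBest]; /* next preceding chord route  */
    }
}
```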
In the minimum cost calculating process at step S506 in
In the embodiments described above, a tonality judgment in which modulation is judged appropriately allows an accurate judgment of chords.
In the above embodiments, the chord judgment has been described using MIDI sequence data as the data of a musical piece, but the chord judgment can also be made based on an audio signal in place of the MIDI sequence data. In this case, a Fourier transform is used to analyze the audio signal, thereby calculating the pitch class powers.
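As one possible sketch (an assumption, not part of the described embodiment), the pitch class powers could be obtained from the magnitude spectrum of an analysis frame of the audio signal as follows, after which the rest of the chord judgment can stay unchanged.

```c
#include <math.h>

/* A possible sketch (an assumption): the pitch class powers are obtained from
   the magnitude spectrum of one analysis frame of the audio signal. */
void chroma_from_spectrum(const double *mag, int num_bins,
                          double sample_rate, int fft_size,
                          double IPitchClassPower[12])
{
    for (int i = 0; i < 12; i++) IPitchClassPower[i] = 0.0;
    for (int bin = 1; bin < num_bins; bin++) {
        double freq = bin * sample_rate / fft_size;        /* center frequency of this FFT bin */
        if (freq < 27.5 || freq > 4186.0) continue;        /* keep roughly the piano range     */
        int midi = (int)lround(69.0 + 12.0 * log2(freq / 440.0));   /* nearest MIDI note (A4 = 440 Hz) */
        IPitchClassPower[((midi % 12) + 12) % 12] += mag[bin];      /* accumulate per pitch class      */
    }
}
```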
In the embodiments described above, the control unit for performing various controlling operations is composed of a CPU (a general processor) which runs a program stored in the ROM (a memory). But it is also possible to compose the control unit of plural processors, each specialized for a particular operation. The control unit may include a general processor and/or a specialized processor having its own dedicated electronic circuit and a memory for storing a dedicated program.
For instance, when the control unit is composed of the CPU executing the program stored in ROM, examples of the programs and processes executed by the CPU will be given below:
(Configuration 1)
The processor uses music data stored in a memory; estimates a first tonality based on component tones included in a first segment having a first length, the first segment being specified in the data of the musical piece; estimates a second tonality based on component tones included in a second segment having a second length different from the first length, the second segment being specified in the data of the musical piece and at least partially overlapping with the first segment; and compares the estimated first tonality with the estimated second tonality to judge a tonality or a chord of the first segment of the musical piece.
(Configuration 2)
In the above configuration, the processor compares the estimated first tonality with the estimated second tonality to decide an appropriate tonality; and judges a chord of the first segment of the musical piece based on the decided appropriate tonality.
(Configuration 3)
In the above configuration, the processor judges component tones of each beat in a measure of the musical piece; and determines a chord of the beat based on the component tones judged at the beat.
(Configuration 4)
In the above configuration, the processor decides a value of power information of each of musical tones of the musical piece which is made note-on within a time period of the first segment, the second segment or the beat, based on the musical tone's velocity and sounding time length in the time period in judging chord tones respectively in the first segment, the second segment or the beat; and accumulates the decided values of power information for pitch classes corresponding respectively to pitches of the musical tones to calculate accumulative values of power information respectively for the pitch classes in the first segment, the second segment or the beat.
(Configuration 5)
In the above configuration, when the pitch classes corresponding to the pitches of the musical tones coincide respectively with scale tones in the candidates for the first tonality, scale tones in the candidate for the second tonality or component tones in the candidate for a chord, correspondingly to candidates for the first tonality of the first segment, the second tonality of the second segment or a chord of a beat, the processor accumulates the calculated accumulative values of power information for the pitch classes to find first power evaluation values; and when the pitch classes corresponding to the pitches of the musical tones do not coincide with the scale tones in the candidates for the first tonality, the scale tones in the candidate for the second tonality or the scale tones in the candidate for a chord, the processor accumulates the accumulative values of power information calculated in the pitch classes to find second power evaluation values; and the processor compares the first power evaluation values and the second power evaluation values found respectively for the candidates for the first tonality and the second tonality or the chord to judge the first tonality, the second tonality or the chord respectively in the first segment, the second segment, or the beat.
(Configuration 6)
In the above configuration, the first segment has a first segment length equivalent to one measure and the second segment has a second segment length equivalent to multiples of one measure; and the processor compares the first tonality and the second tonality judged for each measure in which the first segment and the second segment overlap each other to determine an appropriate tonality of the measure.
(Configuration 7)
In the above configuration, the processor sequentially specifies the first segments having the first segment length or the second segments having the second segment length in the data of the musical piece, each with a starting position shifted by one measure.
(Configuration 8)
In the above configuration, the processor displays the judged chords on a displaying unit.
(Configuration 9)
A chord judging apparatus for judging chords of a musical piece, provided with a processor and a memory for storing data of the musical piece, wherein the processor specifies plural segments in the data of the musical piece; estimates a tonality of each of the specified segments based on component tones included in the segment; and judges a chord of the plural segments of the musical piece based on modulation in tonality, when modulation is introduced in the estimated tonalities of the plural segments.
(Configuration 10)
In the above chord judging apparatus, the processor estimates a first tonality of a first segment having a first length based on component tones included in the first segment, the first segment being specified in the data of the musical piece; estimates a second tonality of a second segment having a second length based on component tones included in the second segment, the second segment being specified in the data of the musical piece and partially overlapping with the first segment; compares the estimated first tonality with the estimated second tonality to judge a tonality of the first segment of the musical piece; and judges a chord of the first segment of the musical piece based on the judged tonality of the first segment.
When the control unit is composed of plural specialized processors, it is possible to arbitrarily decide how many specialized processors are used or to which controlling operation a specialized processor is assigned. A configuration is described below, in which plural specialized processors are assigned to various sorts of controlling operations respectively.
(Configuration 11)
The control unit is composed of a tonality estimating processor (tonality estimating unit) which estimates a first tonality based on component tones included in a first segment having a first length, the first segment being specified in music data stored in the memory, and estimates a second tonality based on component tones included in the second segment having a second length different from the first length, the second segment being specified in the music data and at least partially overlapping with the first segment; a tonality deciding processor (tonality deciding unit) which compares the estimated first tonality with the estimated second tonality to decide an appropriate tonality; and a chord judging processor (chord judging unit) which judges a chord of the first segment of the musical piece based on the appropriate tonality.
Number | Date | Country | Kind
---|---|---|---
2016-190423 | Sep. 2016 | JP | national
Number | Date | Country
---|---|---
08-007589 | Jan. 1996 | JP
11-126075 | May 1999 | JP
2000-259154 | Sep. 2000 | JP
2007-286637 | Nov. 2007 | JP
2010-122630 | Jun. 2010 | JP
2012-098480 | May 2012 | JP
2015-040964 | Mar. 2015 | JP
2015-079196 | Apr. 2015 | JP