
Experiments


The proposed method was evaluated on the three classical music pieces listed in Table 2, recorded in MIDI format; each piece was performed twice by each of five players. Nineteen kinds of note values (whole note, quarter note, etc.) were handled and represented by 6859 ($=19^3$) hidden states in the HMM. The transition probabilities were trained on 13 classical pieces containing 4355 note values, and the output probabilities were trained on two music pieces played by two players, containing 1288 IOIs. The rhythm recognition rates RRR-1 (first rhythm estimation) and RRR-2 (rhythm estimation after tempo estimation) are shown in Table 3, where "Prep." denotes the rate of correct preprocessing (synchronizing grace notes, etc.).
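As a rough illustration of this setup, the following Python sketch shows how triples of note-value classes can be packed into $19^3$ hidden-state indices and decoded with standard Viterbi, plus a recognition-rate helper in the spirit of RRR-1/RRR-2. The state encoding, the output model, and the function names are assumptions made for illustration, not the paper's exact implementation.

import numpy as np

N_NOTE_VALUES = 19                    # whole note, quarter note, etc.
N_STATES = N_NOTE_VALUES ** 3         # 6859 hidden states

def state_index(v_prev2, v_prev1, v_cur):
    """Map a triple of note-value class indices (0..18) to one state index.
    Assumption: each hidden state encodes three consecutive note values."""
    return (v_prev2 * N_NOTE_VALUES + v_prev1) * N_NOTE_VALUES + v_cur

def viterbi(log_trans, log_out, log_init):
    """Standard Viterbi decoding over the observed IOI sequence.

    log_trans : (N_STATES, N_STATES) log transition probabilities
    log_out   : (T, N_STATES) log output probabilities of each observed IOI
    log_init  : (N_STATES,) log initial state probabilities
    Returns the most likely state sequence of length T.
    """
    T, S = log_out.shape
    delta = log_init + log_out[0]
    back = np.zeros((T, S), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + log_trans      # scores[i, j]: best path ending in i -> j
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + log_out[t]
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

def recognition_rate(estimated, reference):
    """Percentage of note values estimated correctly (cf. RRR-1 / RRR-2);
    the paper's exact scoring rule may differ."""
    correct = sum(e == r for e, r in zip(estimated, reference))
    return 100.0 * correct / len(reference)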



Table 2: Test music pieces for rhythm recognition

  Fuga        J. S. Bach: Fuga in C minor, BWV 847, Well-Tempered Clavier Book I
  Sonata      L. v. Beethoven: Piano Sonata No. 20, 1st Mov.
  Traumerei   R. Schumann: "Traumerei" from "Kinderszenen," Op. 15, No. 7


Table 3: Rhythm recognition results [%]

  Data        Prep.   RRR-1   RRR-2
  Fuga        97.5    94.1    94.3
  Sonata      97.4    60.7    78.5
  Traumerei   97.5    68.4    72.0



