Hello, I'm trying to use DBNDownBeatTrackingProcessor and RNNDownBeatProcessor to identify beat and bar positions in classical music. I have access to the scores of the audio, so I'm able to check the results manually as well as infer information such as the time signature or the mean BPM. However, I've run into several problems:
To steer the algorithms towards the correct beats, I've tried the constraint parameters beats_per_bar (only available for downbeat tracking) and min_bpm & max_bpm, but ran into the following failure:
Do you have any possible explanation of why these constraints do not work, or any suggestion on how to get around this? Thanks a lot for your time!
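For context, a minimal sketch of how these constraint parameters are typically passed (this is not the asker's actual code; the file name and the tempo bounds are illustrative placeholders):

```python
# Illustrative sketch; 'piece.wav' and the parameter values are placeholders.
from madmom.features.downbeats import RNNDownBeatProcessor, \
    DBNDownBeatTrackingProcessor

# Joint beat/downbeat activation function, computed at 100 frames per second.
act = RNNDownBeatProcessor()('piece.wav')

# Constrain the search space: allowed time signatures via beats_per_bar,
# allowed tempo range via min_bpm/max_bpm.
proc = DBNDownBeatTrackingProcessor(beats_per_bar=[3, 4],
                                    min_bpm=60, max_bpm=180, fps=100)
beats = proc(act)  # columns: beat time [s], position in bar (1 = downbeat)
```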
You're on the right track. I'm a bit puzzled myself that it always falls back to the wrong tempo (which should be outside the search range). There are a couple more things to try (sketched in code after this list):

- Increase transition_lambda, which makes it less probable to switch tempi.
- Limit the activations passed to DBNDownBeatTrackingProcessor to a certain range, e.g. 0..0.5. This could help to prevent switching to the wrong tempo, which can be caused by high peaks in the activation function.
- Try CRFBeatDetectionProcessor, which has shown better performance on Chopin's Mazurkas than the DBN approaches. If you have the tempo at hand, you can change…

It would also be helpful to see one of the problematic pieces; then I might have better suggestions. Last but not least, you could try to train your own model if you have annotated training material: the models are by no means trained on classical music. Together with Magdalena Fuentes and Matthew Davies, I gave a tutorial at last ISMIR; you can find the material online. HTH
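A minimal sketch of the three suggestions above, assuming the standard madmom API; the file name, beats_per_bar, and the transition_lambda value are illustrative placeholders, not recommended settings:

```python
# Sketch of the three suggestions; 'piece.wav' and all values are placeholders.
import numpy as np
from madmom.features.beats import RNNBeatProcessor, CRFBeatDetectionProcessor
from madmom.features.downbeats import RNNDownBeatProcessor, \
    DBNDownBeatTrackingProcessor

down_act = RNNDownBeatProcessor()('piece.wav')

# 1) Raise transition_lambda (default: 100) to penalise tempo changes,
#    making the DBN less likely to switch tempi.
dbn = DBNDownBeatTrackingProcessor(beats_per_bar=[3],
                                   transition_lambda=300, fps=100)

# 2) Clip the activations before decoding, so isolated high peaks cannot
#    pull the DBN towards the wrong tempo.
downbeats = dbn(np.clip(down_act, 0, 0.5))

# 3) Alternative (tracks beats only, no downbeats): the CRF beat detector.
beat_act = RNNBeatProcessor()('piece.wav')
beats = CRFBeatDetectionProcessor(fps=100)(beat_act)
```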