Ph.D. research project of Domenico Stefani
The Jupyter notebook loads a dataset of feature vectors extracted from pitched and percussive sounds recorded with many acoustic guitars.
The techniques/classes are:
- Kick (Palm on lower body)
- Snare 1 (All fingers on lower side)
- Tom (Thumb on higher body)
- Snare 2 (Fingers on the muted strings, over the end of the fingerboard)
- Natural Harmonics (Stopping the strings from sounding their dominant frequency, letting the harmonics ring)
- Palm Mute (Partially muting the strings with the palm of the picking hand)
- Pick Near Bridge (Playing toward the bridge/saddle)
- Pick Over the Soundhole (Playing over the sound hole; the neutral non-technique)
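
For orientation, here is a minimal, hypothetical sketch of how a feature dataset with these eight classes could be loaded for training. The file name `data/features.csv` and the `label` column are assumptions for illustration, not the actual format of the linked dataset files.

```python
# Hypothetical loading sketch: the real dataset files linked in data/ may use a
# different file format and different column names.
import numpy as np
import pandas as pd

CLASSES = [
    "Kick", "Snare 1", "Tom", "Snare 2",
    "Natural Harmonics", "Palm Mute",
    "Pick Near Bridge", "Pick Over the Soundhole",
]

features = pd.read_csv("data/features.csv")           # assumed file name
X = features.drop(columns=["label"]).to_numpy()       # one feature vector per recorded sound
y = features["label"].map(CLASSES.index).to_numpy()   # class name -> integer id
print(X.shape, np.bincount(y))                        # (samples, features) and per-class counts
```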
data/
: Folder containing the links to the feature dataset files. Download the dataset files into this folder, next to the links file.

phase3results/
: Results of Experiment 1 for a scientific paper submitted to IEEE/ACM Transactions on Audio, Speech, and Language Processing (TASLP).

convert_to_script.py
: Script to convert the Colab/Jupyter notebook to a Python script.

expressive-technique-classifier-phase3.ipynb
: Jupyter notebook with the code to train and test the classifier.

guitarists_touch.ipynb
: Jupyter notebook with the code to train and test the classifier for Experiment 3.

run_grid_search.py
: Script to run a grid search on the classifier (a minimal sketch of such a grid search is shown below).
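
As an illustration of the kind of hyperparameter search run_grid_search.py performs, here is a minimal sketch using scikit-learn's GridSearchCV on a synthetic 8-class placeholder dataset. The model choice (random forest), the parameter grid, and the synthetic data are assumptions, not the script's actual configuration.

```python
# Grid-search sketch on a synthetic placeholder dataset; the actual
# run_grid_search.py searches over the real guitar-technique features
# and may use a different model and parameter grid.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Placeholder data: 8 classes, standing in for the 8 techniques above.
X, y = make_classification(n_samples=800, n_features=40, n_informative=20,
                           n_classes=8, random_state=0)

param_grid = {
    "n_estimators": [100, 300, 500],
    "max_depth": [None, 10, 20],
}

search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid,
    cv=5,                   # 5-fold cross-validation
    scoring="accuracy",
    n_jobs=-1,              # use all available cores
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```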
Contact Domenico Stefani for any issues with running the code to repeat the experiments.
domenico[dot]stefani[at]unitn[dot]it
work.domenicostefani.com