The ACCompanion is an expressive accompaniment system.
This work was conducted at the Institute of Computational Perception at JKU.
This work was awarded the Science Breakthrough of the Year 2021 by the Falling Walls Foundation in Berlin. See press release, talk and demo.
The ACCompanion is an expressive accompaniment system. Similarly to a musician who accompanies a soloist playing a given musical piece, our system can produce a human-like rendition of the accompaniment part that follows the soloist's choices in terms of tempo, dynamics, and articulation. The ACCompanion works in the symbolic domain, i.e., it needs a musical instrument capable of producing and playing MIDI data, with explicitly encoded onset, offset, and pitch for each played note. We describe the components that go into such a system, from real-time score following and prediction to expressive performance generation and online adaptation to the expressive choices of the human player. Based on our experience with repeated live demonstrations in front of various audiences, we offer an analysis of the challenges of combining these components into a system that is highly reactive and precise, while still a reliable musical partner, robust to possible performance errors and responsive to expressive variations.
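As an illustration of the symbolic domain described above, each played note can be modeled with an explicit onset, offset, and pitch. The following is a hypothetical sketch, not the ACCompanion's actual internal data structure:

```python
from dataclasses import dataclass

@dataclass
class NoteEvent:
    """Hypothetical symbolic note event: explicit onset, offset, and pitch."""
    onset: float   # note-on time in seconds
    offset: float  # note-off time in seconds
    pitch: int     # MIDI pitch number (60 = middle C)

    @property
    def duration(self) -> float:
        # The sounding duration is one ingredient of articulation,
        # alongside tempo and dynamics mentioned above.
        return self.offset - self.onset

# A half-second middle C starting at the beginning of the performance:
middle_c = NoteEvent(onset=0.0, offset=0.5, pitch=60)
print(middle_c.duration)  # 0.5
```

With notes encoded this explicitly, tempo, dynamics, and articulation of the soloist can be tracked from the incoming MIDI stream rather than estimated from audio.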
To set up the ACCompanion you need two dependencies: Miniconda and Git.
Check if git
is installed by typing in your terminal:
git --version
If you get an error, please install git
by following the instructions here according to your OS.
Check if conda
is installed by typing in your terminal:
conda --version
If you get an error, please install Miniconda
by following the instructions here according to your OS.
To install the ACCompanion, run the following commands in your terminal.
Clone and install the accompanion environment:
git clone https://github.com/CPJKU/accompanion.git
cd ./accompanion
conda env create -f environment.yml
Also, initialize the submodules if this is not done automatically on cloning:
git submodule init
git submodule update
After the download and installation are complete:
conda activate accompanion
pip install -e .
If you have already installed the ACCompanion (i.e., completed the Setup steps above), remember to activate the ACCompanion environment before trying it out by typing the following command in your terminal:
conda activate accompanion
Once the accompanion environment is activated, you can follow the instructions below to try it out!
The ACCompanion features two playing modes, one for beginner and one for advanced players. In the beginner mode, for a two-handed piano piece, the ACCompanion plays the left hand as accompaniment while the user plays the right hand (usually containing the melody). In the advanced mode, the ACCompanion plays the secondo part of a four-hand piano piece, leaving the primo part to the user.
Beginner Mode
The default for the beginner mode runs 'Twinkle Twinkle Little Star':
cd Path/to/accompanion/
python ./bin/launch_acc.py --input Your_MIDI_Input -out Your_MIDI_Output
To run a different piece, use the --piece
flag and specify which piece you want to play:
cd Path/to/accompanion/
python ./bin/launch_acc.py --input Your_MIDI_Input -out Your_MIDI_Output --piece Your_Piece
Advanced Mode - Complex Pieces
The default for the advanced mode runs the Hungarian Dance No. 5 by Johannes Brahms (piano arrangement for four hands):
cd Path/to/accompanion/
python ./bin/launch_acc.py --input Your_MIDI_Input -out Your_MIDI_Output -f brahms
To find out which arguments you can use with the ACCompanion, run the following command:
python ./bin/launch_acc.py --help
To find out which MIDI input and output ports are available on your system, run the following command:
python -c "import mido; print(mido.get_input_names()); print(mido.get_output_names())"
The ACCompanion can be used with a GUI. To do so, run the following command:
cd Path/to/accompanion/bin
python app.py
Data Requirements
For both the beginner and advanced mode, you will need the score of the piece you want to play in MusicXML format*. For the advanced mode, you will additionally need recording(s)** of the piece in MIDI format (for both the primo and secondo part).
*Note: if the piece features (many) trills, make sure that they are written out as individual notes. This will ensure a (more) robust alignment.
**Note: the more recordings you have, the better the accompaniment.
Beginner Mode
Split the MusicXML-scores of the piece you want to add into a primo (right hand) and secondo (left hand) score, e.g., using a music notation software such as MuseScore. Add IDs to the notes in both scores.***
Create a new folder in the sample_pieces
folder and name it after the piece, e.g. new_piece
. Save the primo and secondo scores of your piece there as primo.musicxml
and secondo.musicxml
, respectively.
Finally, to play your piece, run:
cd Path/to/accompanion/bin
python launch_acc.py -f simple_pieces --piece new_piece
***The Parangonar package provides tools for this (see Additional Resources below).
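The steps above produce a folder layout like the following. This is a sketch to be run from the accompanion repository root; the `touch` commands merely stand in for saving your actual exported scores:

```shell
# Create the piece folder inside sample_pieces (the name is illustrative):
mkdir -p sample_pieces/new_piece

# Save your split scores there; `touch` is only a placeholder for the
# primo.musicxml and secondo.musicxml files exported from your notation software:
touch sample_pieces/new_piece/primo.musicxml
touch sample_pieces/new_piece/secondo.musicxml

# Verify the layout:
ls sample_pieces/new_piece
```

The piece is then selected by its folder name via the --piece flag, as in the launch command above.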
For more instructions, see the submodule documentation: accompanion_pieces
Add the --test
flag to your command-line arguments to switch to a dummy MIDI routing system. This is necessary for testing on VMs where MIDI ports cannot be accessed.
- Parangonada Alignment Visualisation: Webtool and GitHub repo
If you use this work please cite us:
@inproceedings{cancino2023accompanion,
title = {The ACCompanion: Combining Reactivity, Robustness, and Musical Expressivity in an Automatic Piano Accompanist},
author = {Cancino-Chacón, Carlos and Peter, Silvan and Hu, Patricia and Karystinaios, Emmanouil and Henkel, Florian and Foscarin, Francesco and Varga, Nimrod and Widmer, Gerhard},
booktitle = {Proceedings of the Thirty-Second International Joint Conference on
Artificial Intelligence, {IJCAI-23}},
pages = {5779--5787},
year = {2023},
month = {8},
}
The code in this package is licensed under the Apache 2.0 Licence. For details, please see the LICENSE file.
The data and trained models included in this repository are licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0.
Data and model files usually have (but are not limited to) .match, .npy, .npz, .h5, .hdf5, .pkl, .pth or .mat file extensions.
If you want to include any of these files (or a variation or modification thereof) or technology which utilizes them in a commercial product, please contact Gerhard Widmer.
- Carlos Cancino-Chacón, Silvan Peter, Patricia Hu, Emmanouil Karystinaios, Florian Henkel, Francesco Foscarin, Nimrod Varga and Gerhard Widmer, The ACCompanion: Combining Reactivity, Robustness, and Musical Expressivity in an Automatic Piano Accompanist. Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence (IJCAI-23), Macao S.A.R.
- Carlos Cancino-Chacón, Martin Bonev, Amaury Durand, Maarten Grachten, Andreas Arzt, Laura Bishop, Werner Goebl and Gerhard Widmer, The ACCompanion v0.1: An Expressive Accompaniment System. Proceedings of the Late-Breaking Demo Session of the 18th International Society for Music Information Retrieval Conference (ISMIR 2017), Suzhou, China, 2017.
This work is supported by the European Research Council (ERC) under the EU’s Horizon 2020 research & innovation programme, grant agreement No. 10101937 ("Whither Music?").