Python program for studying the relationship between acoustic-prosodic and emotional entrainment.
We used the Multimodal EmotionLines Dataset (MELD) to study the relationship between acoustic-prosodic and emotion entrainment. The dataset can be downloaded from https://affective-meld.github.io/
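As a starting point, the sketch below shows one way the MELD annotations could be loaded and grouped by dialogue with pandas. The file name `train_sent_emo.csv` and the column names `Speaker`, `Emotion`, and `Dialogue_ID` are assumptions based on the public MELD release; adjust them to the files you download.

```python
# Minimal sketch: load MELD dialogue-level annotations with pandas.
# Assumes the public MELD release ships a CSV such as train_sent_emo.csv
# with columns like Speaker, Emotion, and Dialogue_ID (adjust as needed).
import pandas as pd

meld = pd.read_csv("train_sent_emo.csv")

# Group utterances by dialogue to inspect speaker turns and emotion labels.
for dialogue_id, turns in meld.groupby("Dialogue_ID"):
    speakers = turns["Speaker"].tolist()
    emotions = turns["Emotion"].tolist()
    print(dialogue_id, list(zip(speakers, emotions))[:5])
```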
Requirements: ffmpeg (download from https://www.ffmpeg.org/download.html) and Praat.
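For illustration, the following is a minimal sketch of converting the MELD .mp4 utterance clips to mono 16 kHz .wav files with ffmpeg from Python, so they can be opened in Praat. The folder paths and sample rate are illustrative assumptions, not the exact settings used in the notebook.

```python
# Minimal sketch: convert MELD .mp4 utterance clips to mono 16 kHz .wav
# with ffmpeg so Praat can read them. Paths and the sample rate are
# illustrative assumptions, not the notebook's exact settings.
import pathlib
import subprocess

clip_dir = pathlib.Path("MELD/train_splits")   # hypothetical input folder
wav_dir = pathlib.Path("MELD/train_wav")       # hypothetical output folder
wav_dir.mkdir(parents=True, exist_ok=True)

for clip in clip_dir.glob("*.mp4"):
    out_path = wav_dir / (clip.stem + ".wav")
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(clip), "-ac", "1", "-ar", "16000", str(out_path)],
        check=True,
    )
```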
One Jupyter Notebook file is uploaded. It presents a step-by-step procedure for extracting features and measuring entrainment distance.
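To illustrate the kind of computation involved, below is a minimal sketch that extracts mean pitch and intensity with Praat (accessed here through the parselmouth library) and measures entrainment distance as the absolute difference between the two speakers' feature means in a dialogue. This is one common proximity-style measure and is not necessarily the exact procedure in the notebook.

```python
# Minimal sketch: extract mean pitch and intensity with Praat (via parselmouth)
# and measure entrainment distance as the absolute difference between two
# speakers' session-level feature means. This mirrors a common proximity-style
# measure; the notebook's exact procedure may differ.
import numpy as np
import parselmouth


def mean_pitch_and_intensity(wav_path):
    """Return (mean F0 in Hz over voiced frames, mean intensity in dB)."""
    snd = parselmouth.Sound(wav_path)
    pitch = snd.to_pitch()
    f0 = pitch.selected_array["frequency"]
    f0 = f0[f0 > 0]                      # keep voiced frames only
    intensity = snd.to_intensity()
    return float(np.mean(f0)), float(np.mean(intensity.values))


def entrainment_distance(speaker_a_wavs, speaker_b_wavs):
    """Absolute difference between the two speakers' mean feature values."""
    a = np.mean([mean_pitch_and_intensity(w) for w in speaker_a_wavs], axis=0)
    b = np.mean([mean_pitch_and_intensity(w) for w in speaker_b_wavs], axis=0)
    return np.abs(a - b)                 # [pitch distance, intensity distance]
```

Smaller distances indicate that the two speakers' acoustic-prosodic behaviour is closer.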
A JASP file is uploaded for further statistical analysis.
J. Kejriwal, "Relationship between speech entrainment and emotion," 2022 10th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW), Nara, Japan, 2022, pp. 1-4, doi: 10.1109/ACIIW57231.2022.10086027.
J. Kejriwal and Š. Beňuš, "Speech Entrainment and Emotion," 2023 14th IEEE International Conference on Cognitive Infocommunications (CogInfoCom), Budapest, Hungary, 2023, pp. 000099-000104, doi: 10.1109/CogInfoCom59411.2023.10397502.