This is the official PyTorch implementation for the paper "MEmoR: A Dataset for Multimodal Emotion Reasoning in Videos" (ACM Multimedia 2020).
- Python 3.6
- Clone this repo and install the Python dependencies:
```bash
git clone https://github.com/sunlightsgy/MEmoR.git
cd MEmoR
pip install -r requirements.txt
```
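As a quick sanity check that the environment is usable (this step is not part of the original instructions; the CUDA check only matters if you plan to train on GPU), the following should run without errors:

```python
# Sanity check: verify that PyTorch is installed and report
# whether a CUDA device is visible to it.
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
```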
The MEmoR dataset is released on OneDrive. Download the License Agreement in this repo and send it back to thusgy2012 at gmail.com; you will then receive the password. Once downloaded, set a soft link to the MEmoR dataset:
```bash
ln -s /path/to/MEmoR data
```
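To confirm the link was created correctly, a minimal check is:

```python
# Minimal sketch: confirm that ./data resolves to the MEmoR dataset.
from pathlib import Path

data = Path("data")
print("data ->", data.resolve())       # should point at the MEmoR dataset
print("is directory:", data.is_dir())  # False means the link is broken
```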
The training and testing configurations are set in `train.json` and `test.json`. To switch between the primary and fine-grained emotions, modify `emo_type` in these two files, as sketched below.
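A minimal sketch of changing the setting programmatically, assuming `emo_type` is a top-level key in both files (the set of legal values is defined by the configs shipped in the repo, not by this snippet):

```python
# Hypothetical helper: print and update the emo_type field of a config
# file. The value written here ("primary") is a placeholder; check
# train.json/test.json in the repo for the values they actually accept.
import json

def set_emo_type(path, value):
    with open(path) as f:
        cfg = json.load(f)
    print(f"{path}: emo_type {cfg.get('emo_type')} -> {value}")
    cfg["emo_type"] = value
    with open(path, "w") as f:
        json.dump(cfg, f, indent=4)

for path in ("train.json", "test.json"):
    set_emo_type(path, "primary")  # placeholder value
```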
Train and evaluate the model with:

```bash
python train.py -c train.json -d [gpu_id]
python test.py -c test.json -d [gpu_id] -r /path/to/model
```
We provide pretrained models for both the primary and fine-grained emotions in `data/pretrained` of the downloaded dataset.
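For example, evaluating on GPU 0 with one of these checkpoints would look like `python test.py -c test.json -d 0 -r data/pretrained/model_best.pth`, where `model_best.pth` is a placeholder; substitute the actual filename shipped in `data/pretrained`.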
If you use this code or dataset for your research, please cite our paper:
```
@inproceedings{shen2020memor,
  title={MEmoR: A Dataset for Multimodal Emotion Reasoning in Videos},
  author={Shen, Guangyao and Wang, Xin and Duan, Xuguang and Li, Hongzhi and Zhu, Wenwu},
  booktitle={Proceedings of the 28th ACM International Conference on Multimedia},
  pages={493--502},
  year={2020},
  organization={ACM}
}
```
The project template is borrowed from the PyTorch Template Project.