This repository is the official implementation of DAdEE: Unsupervised Domain Adaptation in Early Exit PLMs.
Our code is built on top of the Hugging Face Transformers library.
To fine-tune a pre-trained language model and train the internal classifiers, run:
python3 main.py --pretrain --adapt --src books --tgt dvd
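The command above adapts from the books domain to the dvd domain. If you want to sweep every source–target pair, a small shell loop can generate the runs. This is a sketch under two assumptions not stated in this README: that the domain list follows the standard Amazon-reviews benchmark (books, dvd, electronics, kitchen), and that main.py accepts each of those values for --src and --tgt. The loop only prints the commands; remove the echo to actually execute them.

```shell
# Assumed domain list (standard Amazon-reviews benchmark); adjust to
# whatever domains your data directory actually contains.
domains="books dvd electronics kitchen"

for src in $domains; do
  for tgt in $domains; do
    # Skip same-domain pairs: adaptation needs src != tgt.
    if [ "$src" != "$tgt" ]; then
      # echo prints the command instead of running it; drop echo to train.
      echo python3 main.py --pretrain --adapt --src "$src" --tgt "$tgt"
    fi
  done
done
```

With four domains this emits 4 × 3 = 12 ordered pairs, one adaptation run per pair.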
We acknowledge the bert-aad repository and thank its authors for making their source code publicly available.