
# Toxic_Detection


The technical report can be found here.

## Requirements

- Python >= 3.7
- torch >= 1.9.0
- numpy >= 1.17.2
- transformers >= 4.15.0

## Preparation

### Clone

```bash
git clone https://github.com/hiyouga/Toxic_Detection.git
```

### Create an Anaconda environment

```bash
conda create -n toxic python=3.7
conda activate toxic
pip install -r requirements.txt
```
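The repository ships its own `requirements.txt`; for reference, a minimal file consistent with the version constraints listed above would be:

```
torch>=1.9.0
numpy>=1.17.2
transformers>=4.15.0
```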

## Usage

### Split data

```bash
python data/split_data.py
```
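The exact behavior of `split_data.py` is defined in the repository. Purely as an illustration, a minimal random train/validation split over a labeled CSV (the file names, column layout, and 90/10 ratio here are assumptions, not the repository's settings) could look like this:

```python
import csv
import random

random.seed(42)  # reproducible shuffle

# Hypothetical input path; the real script's paths may differ.
with open("data/train.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.reader(f))

header, rows = rows[0], rows[1:]
random.shuffle(rows)

split = int(0.9 * len(rows))  # assumed 90% train / 10% validation
for path, part in [("data/train_split.csv", rows[:split]),
                   ("data/dev_split.csv", rows[split:])]:
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(part)
```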

### Training

```bash
python main.py
```
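The actual training logic lives in `main.py`. As a rough, hypothetical sketch of what fine-tuning a Transformer for binary toxicity classification looks like with the listed dependencies (the checkpoint name, toy data, and hyperparameters below are assumptions, not the repository's configuration):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

device = "cuda" if torch.cuda.is_available() else "cpu"

# Hypothetical backbone; the repository may use a different checkpoint.
name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(
    name, num_labels=2  # binary: toxic vs. non-toxic
).to(device)

# Toy examples standing in for the real train split.
texts = ["you are awful", "have a nice day"]
labels = torch.tensor([1, 0]).to(device)

batch = tokenizer(texts, padding=True, truncation=True,
                  return_tensors="pt").to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for _ in range(3):  # a few illustrative steps, not a full schedule
    outputs = model(**batch, labels=labels)
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    print(f"loss: {outputs.loss.item():.4f}")
```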

### Show help message

```bash
python main.py -h
```
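The available options are whatever `main.py` actually defines; run the command above to see them. A typical `argparse` setup that would produce such a help message (the flag names here are purely illustrative) looks like:

```python
import argparse

parser = argparse.ArgumentParser(description="Toxic comment detection")
# Illustrative hyperparameters only; `python main.py -h` lists the real ones.
parser.add_argument("--lr", type=float, default=2e-5, help="learning rate")
parser.add_argument("--batch_size", type=int, default=32, help="batch size")
parser.add_argument("--num_epochs", type=int, default=3, help="training epochs")
args = parser.parse_args()
print(args)
```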

## Acknowledgements

This project is a group assignment for the "Machine Learning" course at BUAA Graduate School.

## Contact

hiyouga [AT] buaa [DOT] edu [DOT] cn

## License

MIT