
# CHEF: A Pilot Chinese Dataset for Evidence-Based Fact-Checking

This repository provides the source code for "CHEF: A Pilot Chinese Dataset for Evidence-Based Fact-Checking", accepted to NAACL 2022 as a long paper.

## Quick Links

- [Installation](#installation)
- [Usage](#usage)
- [Data](#data)
- [Acknowledgements](#acknowledgements)
- [Contact](#contact)
- [Reference](#reference)

## Installation

For training, a GPU is recommended to accelerate training.

### PyTorch

The code is based on PyTorch 1.6+. Installation instructions and tutorials are available on the official PyTorch website.
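As a quick sanity check, here is a minimal sketch for verifying the PyTorch installation and GPU availability (nothing here is specific to CHEF):

```python
import torch

# The code requires PyTorch 1.6 or later.
print("PyTorch version:", torch.__version__)

# A GPU is recommended for training; check whether CUDA is usable.
print("CUDA available:", torch.cuda.is_available())
```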

## Usage

Our models are in the Joint directory, and the baseline models are under the Pipeline directory. Specific usage instructions are provided in each directory.

## Data

### Format

```
./data
└── CHEF
    ├── train.json
    ├── dev.json
    └── test.json
```
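Below is a minimal sketch for loading the three splits, assuming each file holds a single JSON array of examples (the file names come from the tree above; if the files are instead in JSON Lines format, parse line by line with `json.loads`):

```python
import json
from pathlib import Path

DATA_DIR = Path("./data/CHEF")

def load_split(name: str):
    """Load one split (train, dev, or test) from the CHEF data directory."""
    with open(DATA_DIR / f"{name}.json", encoding="utf-8") as f:
        return json.load(f)

train, dev, test = (load_split(s) for s in ("train", "dev", "test"))
print(len(train), "train /", len(dev), "dev /", len(test), "test examples")
```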

### Download

For the Joint model (ours), download the data and put it in the Data directory. For the Pipeline model, the data needs to be preprocessed; we provide the preprocessed data in the Data directory.

## Acknowledgements

- Interpretable_Predictions
- Kernel Graph Attention Network
- X-Fact

## Contact

If you have any problems with our code, feel free to contact hxm19@mails.tsinghua.edu.cn.

## Reference

If you use this code in your research, please cite our paper:

```bibtex
@inproceedings{hu2022chef,
  abbr = {NAACL},
  title = {CHEF: A Pilot Chinese Dataset for Evidence-Based Fact-Checking},
  author = {Hu, Xuming and Guo, Zhijiang and Wu, Guanyu and Liu, Aiwei and Wen, Lijie and Yu, Philip S.},
  booktitle = {Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics},
  year = {2022},
  code = {https://github.com/THU-BPM/CHEF}
}
```
