
language_model_finetuning

Scripts for fine-tuning pretrained language models (e.g., BERT, SciBERT, GPT-2) on custom data sets and downstream applications.

Examples are provided for extractive summarization with BERT-based models (SciBERT, a fine-tuned SciBERT, and the original BERT) and for text generation with GPT-2.
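The repository does not prescribe a preprocessing API, but fine-tuning a causal language model such as GPT-2 on a custom corpus typically starts by concatenating the tokenized text and slicing it into fixed-length training blocks. The sketch below illustrates that step; `make_training_blocks` is a hypothetical helper (not part of this repository), and the integer ids stand in for real tokenizer output.

```python
from typing import List


def make_training_blocks(token_ids: List[int], block_size: int) -> List[List[int]]:
    """Split a flat token stream into equal-length blocks, dropping any remainder.

    This mirrors the fixed-block preparation commonly used when fine-tuning
    GPT-2-style models on a custom data set.
    """
    n_blocks = len(token_ids) // block_size
    return [token_ids[i * block_size:(i + 1) * block_size] for i in range(n_blocks)]


# A 10-"token" corpus with block_size 4 yields two full blocks;
# the trailing two tokens are dropped.
blocks = make_training_blocks(list(range(10)), block_size=4)
print(blocks)  # [[0, 1, 2, 3], [4, 5, 6, 7]]
```

In practice the block size is chosen to fit the model's context window (1024 tokens for GPT-2), and each block serves as one training example for the language-modeling objective.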

Credits:

Copyright for the papers belongs to the ACL. The notebooks are adaptations of originals by Chris Callison-Burch and Derek Miller.
