
Quotes Generator

Project description

This project fine-tunes a GPT-2 model on the Quotes-500K dataset: given a user prompt, it generates a motivational quote that starts with that prompt. The fine-tuned weights are deployed on the Hugging Face Model Hub at https://huggingface.co/nandinib1999/quote-generator.
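
A minimal usage sketch with the `transformers` text-generation pipeline; the model id comes from the Hub link above, while the prompt and the generation settings (`max_length`, `top_p`) are illustrative choices, not the project's defaults.

```python
from transformers import pipeline

# Load the fine-tuned model directly from the Hugging Face Hub.
generator = pipeline("text-generation", model="nandinib1999/quote-generator")

# Generate a quote that starts with the given prompt.
result = generator(
    "Keep your face always toward the sunshine",
    max_length=40,           # cap the quote length (assumed value)
    num_return_sequences=1,
    do_sample=True,
    top_p=0.95,              # nucleus sampling (assumed setting)
)
print(result[0]["generated_text"])
```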

Training data

The Quotes-500K dataset was split into training, validation, and test sets for the fine-tuning task as follows (a split sketch follows the table):

Split        Examples
train        349796
validation   99942
test         49971
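
The sizes above correspond to roughly a 70/20/10 split of the ~500K quotes. A sketch of how such a split could be produced; the CSV filename and column name are assumptions, only the split proportions come from the table above.

```python
import pandas as pd
from sklearn.model_selection import train_test_split

quotes = pd.read_csv("quotes_500k.csv")  # hypothetical filename

# Carve out 70% for training, then split the remainder 2:1
# into validation and test (20% / 10% of the full dataset).
train_df, rest_df = train_test_split(quotes, train_size=0.7, random_state=42)
val_df, test_df = train_test_split(rest_df, train_size=2 / 3, random_state=42)

print(len(train_df), len(val_df), len(test_df))
```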

Training procedure

The model was fine-tuned for one epoch on a Google Colab GPU, starting from the pre-trained GPT-2 weights.
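
A minimal fine-tuning sketch using the Hugging Face `transformers` Trainer and `datasets`; apart from the single epoch and the pre-trained GPT-2 base, all hyperparameters and the `quote` column name are assumptions, not the project's exact settings. `train_df` and `val_df` are the splits from the sketch above.

```python
from transformers import (
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    Trainer,
    TrainingArguments,
    DataCollatorForLanguageModeling,
)
from datasets import Dataset

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token          # GPT-2 has no pad token by default
model = GPT2LMHeadModel.from_pretrained("gpt2")    # start from the pre-trained weights

def tokenize(batch):
    # Column name "quote" is an assumed field of the dataset.
    return tokenizer(batch["quote"], truncation=True, max_length=64)

train_ds = Dataset.from_pandas(train_df).map(tokenize, batched=True, remove_columns=["quote"])
val_ds = Dataset.from_pandas(val_df).map(tokenize, batched=True, remove_columns=["quote"])

args = TrainingArguments(
    output_dir="quote-generator",
    num_train_epochs=1,                 # the README reports one epoch
    per_device_train_batch_size=8,      # assumed batch size
    fp16=True,                          # typical choice on a Colab GPU
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    eval_dataset=val_ds,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```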

Eval results

Epoch    Perplexity
1        15.180
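
Perplexity is the exponential of the mean cross-entropy loss on the evaluation set. A short sketch of how the reported number relates to the evaluation loss, assuming the Trainer from the sketch above:

```python
import math

eval_metrics = trainer.evaluate()
perplexity = math.exp(eval_metrics["eval_loss"])   # ~15.18 after one epoch
print(f"Perplexity: {perplexity:.3f}")
```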
