Trained With Custom Dataset But Generation Isn't Working Properly #13

Open

nikhilanayak opened this issue Aug 20, 2021 · 2 comments

@nikhilanayak

I trained GeDi on a custom dataset consisting of 5 categories (Rap, R&B, Country, Pop, and Rock), with 20,000 songs per category. I trained on Google Colab, and the eval results are as follows:

08/17/2021 11:16:57 - INFO - __main__ - ***** Eval results *****
08/17/2021 11:16:57 - INFO - __main__ - acc = 0.7552531518911347
08/17/2021 11:16:57 - INFO - __main__ - overall_gen_loss = 2.471960085582733

When I try to generate with the Rap category, the generated text doesn't look like it was trained on the dataset at all. GeDi also predicts the probability that the generation belongs to the desired class at only ~0.56. Does this have to do with the size of my dataset, or is there something else I've done incorrectly?

@benkrause

I'm not surprised that this doesn't work well. GeDi guides generation from the base language model (in this case GPT-2) towards the selected control code and away from the other control codes. It sounds like you are using GeDi to guide GPT-2 towards generating rap lyrics and away from other kinds of lyrics. However, I'm guessing GPT-2 assigns a fairly low likelihood to lyrics in general, so the resulting generations probably don't contain lyrics (though they may contain more rap-like topics or phrases).
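
To make the mechanics concrete: at each decoding step, GeDi uses its class-conditional LM to form a Bayes-rule posterior over control codes for each candidate next token, and reweights the base model's next-token distribution by that posterior. Below is a heavily simplified, single-step sketch of that idea, not the repo's actual implementation (which accumulates sequence-level probabilities, normalizes for length, and adds filtering). The use of plain "gpt2" as a stand-in for the trained class-conditional LM, the "Rap:"/"Pop:" prefixes, and the omega value are all illustrative assumptions.

```python
# Simplified sketch of GeDi-style guided decoding (single step, two classes).
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tok = GPT2Tokenizer.from_pretrained("gpt2")
base_lm = GPT2LMHeadModel.from_pretrained("gpt2")  # base model being guided
cc_lm = GPT2LMHeadModel.from_pretrained("gpt2")    # stand-in for the trained class-conditional LM
base_lm.eval(); cc_lm.eval()

prompt = "Started from the bottom"
omega = 30.0  # strength of the class-posterior weighting (assumed value)

def next_token_logprobs(model, text):
    ids = tok(text, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits[0, -1]
    return torch.log_softmax(logits, dim=-1)

# Class-conditional next-token log-probs for the desired code and an "anti" code.
lp_desired = next_token_logprobs(cc_lm, "Rap: " + prompt)
lp_anti = next_token_logprobs(cc_lm, "Pop: " + prompt)

# Bayes rule at the token level: p(desired class | candidate token).
posterior = torch.softmax(torch.stack([lp_desired, lp_anti]), dim=0)[0]

# Reweight the base LM's distribution toward tokens the posterior favors.
lp_base = next_token_logprobs(base_lm, prompt)
guided = lp_base + omega * torch.log(posterior + 1e-10)
next_id = int(torch.argmax(guided))
print(tok.decode([next_id]))
```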

If your goal is simply to have generations that resemble the training data, then CC-LM style generation would likely work better, so I'd recommend setting --gen_type cclm during generation. Finetuning GPT-2 on your full lyrics dataset and then using GeDi to guide generation could also work well, but that would require a little more work.
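
For reference, CC-LM style generation just means sampling directly from the finetuned class-conditional model with the control code prepended, rather than using it to steer GPT-2. A minimal sketch with Hugging Face transformers follows; the checkpoint path and the "Rap:" prefix format are placeholders, and the repo's generation script handles this for you via --gen_type cclm.

```python
# Minimal sketch of CC-LM style generation: sample directly from the
# finetuned class-conditional model, conditioned on the control code.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tok = GPT2Tokenizer.from_pretrained("gpt2")
# Hypothetical path to the class-conditional checkpoint finetuned on the lyrics dataset.
model = GPT2LMHeadModel.from_pretrained("path/to/finetuned-cclm")

# Prepend the control code as a plain-text prefix (illustrative format).
ids = tok("Rap: Started from the bottom", return_tensors="pt").input_ids
out = model.generate(ids, max_length=100, do_sample=True, top_p=0.9,
                     pad_token_id=tok.eos_token_id)
print(tok.decode(out[0], skip_special_tokens=True))
```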

@nikhilanayak (Author)

Thanks, that makes a lot of sense. I think I'm going to try finetuning GPT-Neo and using GeDi with that, although I'm not sure whether that will run on Colab's P100/V100 GPUs.
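
(For readers hitting the same question: loading GPT-Neo in half precision is generally enough to fit the 1.3B checkpoint on a 16 GB P100/V100 for inference, but full finetuning needs considerably more headroom for optimizer states and activations. A rough sketch, with the 1.3B checkpoint chosen as an assumption:)

```python
# Rough sketch: load GPT-Neo 1.3B in fp16 on a Colab GPU for generation only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-1.3B")
model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-neo-1.3B", torch_dtype=torch.float16
).to("cuda")

ids = tok("Started from the bottom", return_tensors="pt").input_ids.to("cuda")
out = model.generate(ids, max_length=60, do_sample=True, top_p=0.9,
                     pad_token_id=tok.eos_token_id)
print(tok.decode(out[0], skip_special_tokens=True))
```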
