In train.py, what is the usage/meaning of 'prior_preservation'? #9

Open
HyeonHo99 opened this issue Mar 2, 2023 · 3 comments

@HyeonHo99

Thank you for the implementation.

There's one thing I don't clearly understand: what is the purpose/effect of 'prior_preservation'?
Is it meant to be set to a non-None value at training time and to None at inference time?
Also, what is 'unet_2' doing in this code?

Thank you again.

@HyeonHo99
Author

This is not something included in the original paper, right?
Could you explain this modification, please?

Thank you.

@bryandlee
Owner

Hi, prior_preservation was added as a regularization to prevent overfitting (it originally comes from the DreamBooth paper).
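
For reference, here is a minimal sketch of how DreamBooth-style prior preservation is typically wired into a diffusion training step. The names (`unet`, `noise_scheduler`, `instance_batch`, `prior_batch`, `prior_loss_weight`) are placeholders for illustration, not this repo's exact API:

```python
# Sketch of DreamBooth-style prior preservation (illustrative, not this repo's code).
import torch
import torch.nn.functional as F

def training_step(unet, noise_scheduler, instance_batch, prior_batch, prior_loss_weight=1.0):
    """Combine the usual denoising loss on the fine-tuning data with a
    regularization loss on 'prior' samples drawn from the original model's
    output distribution, so the model keeps its prior while fitting the new data."""
    # Concatenate instance and prior latents/conditions into one batch
    # (assumes both halves have the same batch size).
    latents = torch.cat([instance_batch["latents"], prior_batch["latents"]])
    cond = torch.cat([instance_batch["cond"], prior_batch["cond"]])

    noise = torch.randn_like(latents)
    timesteps = torch.randint(
        0, noise_scheduler.config.num_train_timesteps, (latents.shape[0],),
        device=latents.device,
    )
    noisy_latents = noise_scheduler.add_noise(latents, noise, timesteps)
    noise_pred = unet(noisy_latents, timesteps, encoder_hidden_states=cond).sample

    # Split predictions back into the instance part and the prior part.
    pred_instance, pred_prior = noise_pred.chunk(2)
    target_instance, target_prior = noise.chunk(2)

    instance_loss = F.mse_loss(pred_instance, target_instance)
    prior_loss = F.mse_loss(pred_prior, target_prior)
    return instance_loss + prior_loss_weight * prior_loss
```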

@HyeonHo99
Author

Thank you for clarifying; I will check out the DreamBooth paper.
I have one more question. I don't think "train_temporal_conv" was in the original Tune-A-Video paper.
Is this also something you added?
It defaults to False. In that case, what is the role of conv_temporal (a Conv1d initialized as identity)?
Do you recommend setting 'train_temporal_conv' to True or False?

Thank you in advance.
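
(For reference, a common reason to initialize a temporal Conv1d as identity is that the inflated model then reproduces the pretrained image model exactly at initialization; the temporal layer only starts mixing frames once it is trained. A minimal sketch with illustrative names, not this repo's exact module:)

```python
# Identity-initialized temporal convolution (illustrative sketch).
import torch
import torch.nn as nn

class IdentityTemporalConv(nn.Module):
    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        self.conv = nn.Conv1d(channels, channels, kernel_size, padding=kernel_size // 2)
        # Identity init: zero everything, then put a 1 at the center tap of each
        # channel's own filter, so the layer is a no-op before training.
        nn.init.zeros_(self.conv.weight)
        nn.init.zeros_(self.conv.bias)
        with torch.no_grad():
            idx = torch.arange(channels)
            self.conv.weight[idx, idx, kernel_size // 2] = 1.0

    def forward(self, x):
        # x: (batch, channels, frames) -- mixes information across frames once trained.
        return self.conv(x)

# At initialization the module leaves its input unchanged:
x = torch.randn(2, 64, 8)
assert torch.allclose(IdentityTemporalConv(64)(x), x, atol=1e-6)
```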
