[BUG] finetune with fparam/aparam #3256
We have to handle fparam/aparam in dp test, dp model-devi, and finetune. Thus, it would be more reasonable to add a new method that does this in one place.
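As a rough illustration of the proposal above, a single shared helper could validate and shape fparam/aparam once, so that dp test, dp model-devi, and finetune all go through the same code path. This is a hypothetical sketch, not the actual deepmd-kit API; the function name and signature are invented for illustration.

```python
import numpy as np


def prepare_frame_params(dim_fparam, dim_aparam,
                         fparam=None, aparam=None, natoms=1):
    """Hypothetical shared helper: validate and reshape fparam/aparam.

    Raises a descriptive error (instead of a bare AssertionError) when a
    model trained with fparam/aparam is given none at inference time.
    """
    if dim_fparam > 0:
        if fparam is None:
            raise ValueError(
                f"model was trained with fparam (dim={dim_fparam}) "
                "but none was provided"
            )
        # one fparam vector per frame
        fparam = np.asarray(fparam, dtype=np.float64).reshape(-1, dim_fparam)
    if dim_aparam > 0:
        if aparam is None:
            raise ValueError(
                f"model was trained with aparam (dim={dim_aparam}) "
                "but none was provided"
            )
        # one aparam vector per atom per frame
        aparam = np.asarray(aparam, dtype=np.float64).reshape(
            -1, natoms, dim_aparam
        )
    return fparam, aparam
```

With such a helper, each entry point (test, model deviation, finetune) would call it once instead of duplicating the checks, which is the kind of consolidation the comment above suggests.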
github-merge-queue bot pushed a commit that referenced this issue on Feb 21, 2024:
Fix #3256. Signed-off-by: Jinzhe Zeng <jinzhe.zeng@rutgers.edu>
njzjz added a commit to njzjz/deepmd-kit that referenced this issue on Apr 6, 2024:
Fix deepmodeling#3256. Signed-off-by: Jinzhe Zeng <jinzhe.zeng@rutgers.edu> (cherry picked from commit d629616)
njzjz added a commit that referenced this issue on Apr 6, 2024.
Discussed in #3255
Originally posted by chtchelkatchev February 10, 2024
I pretrained a DeePMD potential with fparam using the se_atten_v2 descriptor and deepmd=2.2.7. I then attempted to finetune this potential on a new database with fparam, which led to an error message: "... in _prepare_feed_dict assert fparam is not None ... AssertionError". Without fparam, finetuning works well; the issue appears only with fparam. Does finetuning work with fparam?