[fix bug] load atomic_*.npy for tf tensor model (deepmodeling#4538)
Fix bug mentioned in
deepmodeling#4536


- **Bug Fixes**
  - Updated atomic property and weight label naming conventions across the machine learning training and loss components to ensure consistent terminology.
  - Corrected placeholder key references in the training process to match updated label names.


(cherry picked from commit 380efb9)
ChiahsinChu authored and njzjz committed Feb 9, 2025
1 parent 43c8cae commit 2b7f53c
Showing 2 changed files with 8 additions and 1 deletion.
2 changes: 1 addition & 1 deletion deepmd/tf/loss/tensor.py
```diff
@@ -145,7 +145,7 @@ def label_requirement(self) -> list[DataRequirementItem]:
         # data required
         data_requirements.append(
             DataRequirementItem(
-                "atom_" + self.label_name,
+                "atomic_" + self.label_name,
                 self.tensor_size,
                 atomic=True,
                 must=False,
```
7 changes: 7 additions & 0 deletions deepmd/tf/train/trainer.py
```diff
@@ -282,6 +282,13 @@ def _build_network(self, data, suffix="") -> None:
             tf.int32, [None], name="t_mesh"
         )
         self.place_holders["is_training"] = tf.placeholder(tf.bool)
+        # update "atomic_" in self.place_holders.keys() with "atom_"
+        for kk in list(self.place_holders.keys()):
+            if "atomic_" in kk:
+                self.place_holders[kk.replace("atomic_", "atom_")] = (
+                    self.place_holders.pop(kk)
+                )
+
         self.model_pred = self.model.build(
             self.place_holders["coord"],
             self.place_holders["type"],
```
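In isolation, the renaming loop added to `_build_network` behaves as sketched below, with a plain dict standing in for the TF placeholder map and illustrative key/value names (no TensorFlow required):

```python
# Sketch of the key-renaming step: every "atomic_*" key is moved to
# the corresponding "atom_*" key, so downstream code that builds the
# feed dict with the legacy "atom_" names keeps working.
place_holders = {
    "coord": "t_coord",
    "atomic_dipole": "t_atomic_dipole",
    "atomic_polarizability": "t_atomic_polar",
}

# Iterate over a snapshot of the keys, since the dict is mutated
# inside the loop.
for kk in list(place_holders.keys()):
    if "atomic_" in kk:
        place_holders[kk.replace("atomic_", "atom_")] = place_holders.pop(kk)

print(sorted(place_holders))
# ['atom_dipole', 'atom_polarizability', 'coord']
```

Note the `list(...)` around `.keys()`: popping from a dict while iterating its live key view would raise a `RuntimeError`, so the loop iterates over a copy.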
