
fix(pt): fix seed in dpmodel fitting #3916

Merged: 1 commit merged into deepmodeling:devel on Jun 27, 2024

Conversation

@iProzd (Collaborator) commented Jun 27, 2024

Summary by CodeRabbit

  • New Features
    • Introduced a new optional seed parameter across various fitting modules to enhance customization and reproducibility of model fitting processes.

@iProzd iProzd requested review from njzjz and wanghan-iapcm June 27, 2024 06:35
@coderabbitai bot (Contributor) commented Jun 27, 2024

Walkthrough

The recent updates across the deepmd/dpmodel/fitting module add and activate a seed parameter in the __init__ methods of several fitting classes. These changes make random number generation within the module controllable, enhancing reproducibility. Some redundant comments were also removed, and initialization logic was made more explicit, for example by setting the old_impl flag to False.
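To make the effect concrete, here is a minimal, hypothetical sketch (the class and attribute names are illustrative, not the actual deepmd API) of a fitting-style class whose parameter initialization is driven by an optional seed:

```python
from typing import List, Optional

import numpy as np


class ToyFitting:
    """Illustrative stand-in for a fitting class; not the deepmd API."""

    def __init__(
        self,
        neuron: Optional[List[int]] = None,
        seed: Optional[int] = None,  # None preserves non-deterministic init
    ) -> None:
        if neuron is None:
            neuron = [120, 120, 120]
        rng = np.random.default_rng(seed)  # the seed drives all initialization
        self.weights = [rng.standard_normal((n, n)) for n in neuron]


# Two instances built with the same seed start from identical weights.
a, b = ToyFitting(seed=42), ToyFitting(seed=42)
assert all(np.array_equal(x, y) for x, y in zip(a.weights, b.weights))
```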

Changes

| File | Change Summary |
| --- | --- |
| deepmd/dpmodel/fitting/dipole_fitting.py | Removed seed from the initial declaration and added it during object initialization; set old_impl to False. |
| deepmd/dpmodel/fitting/dos_fitting.py | Activated and reordered the seed parameter in the __init__ method. |
| deepmd/dpmodel/fitting/ener_fitting.py | Made use of the previously commented-out seed parameter in the __init__ method. |
| deepmd/dpmodel/fitting/invar_fitting.py | Added a seed parameter to the __init__ method of the InvarFitting class. |
| deepmd/dpmodel/fitting/polarizability_fitting.py | Added seed to __init__, cleaned up comments, and passed seed to another function within the method. |

Recent review details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits

Files reviewed from the changes between the PR base (17cdcb0) and head (6d303ee).

Files selected for processing (5)
  • deepmd/dpmodel/fitting/dipole_fitting.py (2 hunks)
  • deepmd/dpmodel/fitting/dos_fitting.py (2 hunks)
  • deepmd/dpmodel/fitting/ener_fitting.py (2 hunks)
  • deepmd/dpmodel/fitting/invar_fitting.py (3 hunks)
  • deepmd/dpmodel/fitting/polarizability_fitting.py (2 hunks)
Additional context used
Ruff

All findings are instances of B006 ("Do not use mutable data structures for argument defaults"; replace with None and initialize within the function):

  • deepmd/dpmodel/fitting/ener_fitting.py: lines 33 and 47
  • deepmd/dpmodel/fitting/dos_fitting.py: lines 36 and 46
  • deepmd/dpmodel/fitting/dipole_fitting.py: lines 93 and 106
  • deepmd/dpmodel/fitting/invar_fitting.py: lines 121 and 136
  • deepmd/dpmodel/fitting/polarizability_fitting.py: lines 98 and 111
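For context on B006: Python evaluates default arguments once, at function definition time, so a mutable default is shared by every call. A self-contained demonstration of the pitfall and the recommended fix (not code from this PR):

```python
from typing import List, Optional


def bad(neuron: List[int] = [120, 120, 120]) -> List[int]:
    neuron.append(0)  # mutates the single shared default list
    return neuron


def good(neuron: Optional[List[int]] = None) -> List[int]:
    if neuron is None:
        neuron = [120, 120, 120]  # a fresh list on every call
    neuron.append(0)
    return neuron


print(len(bad()), len(bad()))    # 4 5 -- the shared default keeps growing
print(len(good()), len(good()))  # 4 4 -- each call is independent
```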

Additional comments not posted (6)
deepmd/dpmodel/fitting/ener_fitting.py (2)

72-72: Proper integration of seed parameter.

The seed parameter has been correctly added to the __init__ method and passed to the superclass constructor, aligning with the PR objectives.
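The pattern being reviewed, roughly (a hedged sketch; GeneralFitting and EnergyFittingNet are used here as stand-in names, and the real deepmd signatures carry many more parameters):

```python
from typing import Optional


class GeneralFitting:
    def __init__(self, seed: Optional[int] = None) -> None:
        self.seed = seed  # the base class owns the seed / RNG state


class EnergyFittingNet(GeneralFitting):
    def __init__(self, seed: Optional[int] = None) -> None:
        super().__init__(seed=seed)  # forward the seed, do not re-derive it
```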


47-47: Refactor mutable default argument.

Similar to the previous comment, mutable default values should be avoided. This applies to the trainable parameter as well.

  trainable: Optional[List[bool]] = None,
  ...
+ if trainable is None:
+     trainable = [True] * len(neuron)  # Assuming the length of the neuron list is the intended size

Likely invalid or redundant comment.

deepmd/dpmodel/fitting/dos_fitting.py (1)

71-71: Proper integration of seed parameter.

The seed parameter has been correctly added to the __init__ method and passed to the superclass constructor, aligning with the PR objectives.

deepmd/dpmodel/fitting/dipole_fitting.py (1)

144-144: Proper integration of seed parameter and handling of old_impl.

The seed parameter has been correctly added and the old_impl flag has been set to False as part of the initialization. This aligns with the PR objectives and the changes are properly integrated.

deepmd/dpmodel/fitting/invar_fitting.py (1)

138-138: Proper integration of seed parameter.

The seed parameter has been correctly added to the __init__ method and passed to the superclass constructor, aligning with the PR objectives.

deepmd/dpmodel/fitting/polarizability_fitting.py (1)

168-168: Proper integration of seed parameter.

The seed parameter has been correctly added to the __init__ method and passed to the superclass constructor, aligning with the PR objectives.
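Since the polarizability change also forwards seed to another function inside __init__, it is worth noting one common way to give several consumers independent but reproducible streams; this sketch uses NumPy's SeedSequence and is illustrative rather than what deepmd does internally:

```python
import numpy as np

seed = 20240627  # illustrative value
fit_seq, stat_seq = np.random.SeedSequence(seed).spawn(2)

fit_rng = np.random.default_rng(fit_seq)    # stream for fitting-net init
stat_rng = np.random.default_rng(stat_seq)  # stream for a second consumer
# Re-running with the same parent seed reproduces both streams exactly.
```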


@coderabbitai bot (Contributor) left a comment

Actionable comments posted: 2

Outside diff range and nitpick comments (7)
deepmd/dpmodel/fitting/ener_fitting.py (1)

Line range hint 33-33: Refactor mutable default argument.

It's recommended not to use mutable default values for function arguments as they can lead to unexpected behaviors. Use None as the default and initialize within the function if necessary.

- neuron: List[int] = [120, 120, 120],
+ neuron: Optional[List[int]] = None,
...
+ if neuron is None:
+     neuron = [120, 120, 120]

deepmd/dpmodel/fitting/dos_fitting.py (1)

Line range hint 36-36: Refactor mutable default argument.

Using mutable default values, especially numpy arrays, can lead to bugs. Initialize bias_dos within the function.

  bias_dos: Optional[np.ndarray] = None,
  ...
+ if bias_dos is None:
+     bias_dos = np.zeros((ntypes, numb_dos), dtype=DEFAULT_PRECISION)

deepmd/dpmodel/fitting/dipole_fitting.py (2)

Line range hint 93-93: Refactor mutable default argument.

Mutable default values for function arguments should be avoided. Use None as the default and initialize within the function if necessary.

- neuron: List[int] = [120, 120, 120],
+ neuron: Optional[List[int]] = None,
...
+ if neuron is None:
+     neuron = [120, 120, 120]

Line range hint 106-106: Refactor mutable default argument.

Similar to the previous comment, mutable default values should be avoided. This applies to the trainable parameter as well.

  trainable: Optional[List[bool]] = None,
  ...
+ if trainable is None:
+     trainable = [True] * len(neuron)  # Assuming the length of the neuron list is the intended size
deepmd/dpmodel/fitting/invar_fitting.py (1)

Line range hint 121-121: Refactor mutable default argument.

It's recommended not to use mutable default values for function arguments as they can lead to unexpected behaviors. Use None as the default and initialize within the function if necessary.

- neuron: List[int] = [120, 120, 120],
+ neuron: Optional[List[int]] = None,
...
+ if neuron is None:
+     neuron = [120, 120, 120]
deepmd/dpmodel/fitting/polarizability_fitting.py (2)

Line range hint 98-98: Refactor mutable default argument.

Using mutable default values, especially numpy arrays, can lead to bugs. Initialize scale within the function.

  scale: Optional[List[float]] = None,
  ...
+ if scale is None:
+     scale = [1.0] * ntypes  # Assuming ntypes is the intended size

Line range hint 111-111: Refactor mutable default argument.

The trainable parameter should not use a mutable default value. Initialize it within the function.

  trainable: Optional[List[bool]] = None,
  ...
+ if trainable is None:
+     trainable = [True] * len(neuron)  # Assuming the length of the neuron list is the intended size
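Taken together, the fix suggested across all five files follows one idiom; here is a hedged composite sketch (the parameter names are drawn from the comments above, but the sizes and the surrounding signature are assumptions, not the actual deepmd code):

```python
from typing import List, Optional


class FittingSketch:
    """Hypothetical composite of the signatures touched in this PR."""

    def __init__(
        self,
        ntypes: int,
        neuron: Optional[List[int]] = None,
        trainable: Optional[List[bool]] = None,
        scale: Optional[List[float]] = None,
        seed: Optional[int] = None,
    ) -> None:
        # Resolve every mutable default inside the body, as B006 advises.
        if neuron is None:
            neuron = [120, 120, 120]
        if trainable is None:
            trainable = [True] * len(neuron)
        if scale is None:
            scale = [1.0] * ntypes
        self.neuron, self.trainable = neuron, trainable
        self.scale, self.seed = scale, seed
```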

Resolved review comments:
  • deepmd/dpmodel/fitting/dos_fitting.py
  • deepmd/dpmodel/fitting/invar_fitting.py

codecov bot commented Jun 27, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 82.72%. Comparing base (17cdcb0) to head (6d303ee).
Report is 117 commits behind head on devel.

Additional details and impacted files
@@            Coverage Diff             @@
##            devel    #3916      +/-   ##
==========================================
- Coverage   82.72%   82.72%   -0.01%     
==========================================
  Files         519      519              
  Lines       50515    50516       +1     
  Branches     3015     3016       +1     
==========================================
- Hits        41791    41789       -2     
- Misses       7788     7791       +3     
  Partials      936      936              


@iProzd iProzd added this pull request to the merge queue Jun 27, 2024
@github-merge-queue github-merge-queue bot removed this pull request from the merge queue due to failed status checks Jun 27, 2024
@wanghan-iapcm wanghan-iapcm added this pull request to the merge queue Jun 27, 2024
Merged via the queue into deepmodeling:devel with commit 58b8944 Jun 27, 2024
60 checks passed
mtaillefumier pushed a commit to mtaillefumier/deepmd-kit that referenced this pull request Sep 18, 2024