
Conversation

Contributor

@ChiahsinChu ChiahsinChu commented Feb 27, 2025

  • add plugin mode for data modifier
  • adapt existing data modifier (dipole_charge in TF backend)
  • adapt unit tests for data modifier

Summary by CodeRabbit

  • New Features

    • Introduced a unified modifiers framework that supports dynamic instantiation and enhanced dipole charge modifications, improving data handling and model evaluation.
    • Added new classes BaseModifier, DipoleChargeModifier, and DescrptDPA3 for better management of modifier functionalities and descriptor computations.
    • Implemented new activation function "silu" across various components, enhancing flexibility in neural network configurations.
    • Added new model configuration model_dpa3 for testing purposes.
  • Refactor

    • Updated the training pipeline to adopt the generic modifier approach, replacing previous type-specific logic.
    • Restructured module organization for better clarity in imports and enhanced argument handling for modifiers, transitioning to a more dynamic retrieval process.
  • Tests

    • Improved clarity in test configurations by switching to explicitly named parameters for modifier setup.
    • Updated import paths for DipoleChargeModifier in test files to reflect new module organization.
    • Expanded testing coverage for the DescrptDPA3 descriptor and added new test classes to validate functionality across different configurations.
    • Introduced new test class TestDataMixType to validate functionality of the DeepmdData class with various type maps.


Check notices from Code scanning / CodeQL on the changed files:

  • Unused import (Note): "Import of 'deepmd' is not used." (import deepmd.tf.op  # noqa: F401)
  • Unused local variable (Note): "Variable t_mdl_name is not used." (t_mdl_name = tf.constant(self.model_name, name="mdl_name", dtype=tf.string) in build_fv_graph)
  • Unused local variable (Note): "Variable t_modi_type is not used." (t_modi_type = tf.constant(self.modifier_prefix, name="type", dtype=tf.string))
  • Unused local variable (Note): "Variable t_mdl_charge_map is not used." (t_mdl_charge_map = tf.constant(..., name="mdl_charge_map", dtype=tf.string))
  • Unused local variable (Note): "Variable t_sys_charge_map is not used." (t_sys_charge_map = tf.constant(..., name="sys_charge_map", dtype=tf.string))
  • Variable defined multiple times (Warning): "This assignment to 'nfxna' is unnecessary as it is redefined before this value is used." (nfxna = 192 * nf, reassigned shortly after)
  • Unused local variable (Note): "Variable nfxnas is not used." (nfxnas = -1)
  • Unused local variable (Note): "Variable corr_av is not used." (corr_av = np.concatenate(corr_av, axis=0))
  • Unused local variable (Note): "Variable tensor is not used." (tensor = [] before evaluation)
  • Unused local variable (Note): "Variable nframes is not used." (nframes = coord.shape[0] in modify_data)
Contributor

coderabbitai bot commented Feb 27, 2025

📝 Walkthrough

This pull request introduces new modules and updates to support a more flexible modifier system. A new base class (BaseModifier) is defined in both the dpmodel and TensorFlow packages to handle serialization, deserialization, and dynamic instantiation of modifiers. The training entry point is updated to utilize this base class for selecting the appropriate modifier based on type. Additionally, a new DipoleChargeModifier class is implemented with methods for graph-building, evaluation, and data modification. Test files are updated to reflect the new import paths and to use named parameters for clarity.

Changes

| Files | Change Summary |
| --- | --- |
| deepmd/dpmodel/modifier/__init__.py, deepmd/dpmodel/modifier/base_modifier.py | New files for the dpmodel modifier module. __init__.py imports make_base_modifier and sets the public API, while base_modifier.py introduces BaseModifier for dynamic instantiation, serialization, and deserialization. |
| deepmd/tf/entrypoints/train.py | Updated the get_modifier function to use BaseModifier.get_class_by_type, removed the direct import of DipoleChargeModifier, and updated type hints to Optional[BaseModifier]. |
| deepmd/tf/modifier/__init__.py, deepmd/tf/modifier/base_modifier.py, deepmd/tf/modifier/dipole_charge.py | New TensorFlow-specific modifier implementations. base_modifier.py defines a BaseModifier extending DeepPot plus make_base_modifier(); dipole_charge.py implements DipoleChargeModifier with methods for instantiation, graph construction, evaluation, serialization, and data modification. |
| source/tests/tf/test_data_modifier.py, source/tests/tf/test_data_modifier_shuffle.py | Updated with the new import path for DipoleChargeModifier and converted positional arguments to named parameters for clarity. |
| deepmd/utils/argcheck.py | Added the doc_dipole_charge documentation string, registered the dipole_charge modifier with documentation, and switched argument retrieval to a dynamic approach. |
| deepmd/tf/__init__.py, deepmd/tf/infer/__init__.py, deepmd/tf/infer/deep_eval.py | Updated import paths for DipoleChargeModifier to its new location in the modifier module and removed it from the infer module's public interface. |
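The plugin dispatch these changes describe (a type-keyed registry on BaseModifier plus a generic get_modifier in train.py) can be sketched roughly as follows. This is an illustrative reconstruction, not the actual deepmd code; the registry attribute, decorator, and constructor signature are assumptions.

```python
# Hypothetical sketch of the modifier plugin pattern: a registry on the base
# class maps a "type" string to the concrete modifier class, so the training
# entry point can instantiate modifiers without importing each one directly.

class BaseModifier:
    _registry: dict = {}  # illustrative; the real mechanism may differ

    @classmethod
    def register(cls, modifier_type):
        """Class decorator recording a modifier under its type name."""
        def wrapper(subclass):
            cls._registry[modifier_type] = subclass
            return subclass
        return wrapper

    @classmethod
    def get_class_by_type(cls, modifier_type):
        try:
            return cls._registry[modifier_type]
        except KeyError:
            raise ValueError(f"Unknown modifier type: {modifier_type!r}") from None


@BaseModifier.register("dipole_charge")
class DipoleChargeModifier(BaseModifier):
    def __init__(self, model_name, model_charge_map, sys_charge_map,
                 ewald_h=1.0, ewald_beta=0.25):
        self.model_name = model_name
        self.model_charge_map = model_charge_map
        self.sys_charge_map = sys_charge_map
        self.ewald_h = ewald_h
        self.ewald_beta = ewald_beta


def get_modifier(modi_data=None):
    """Build a modifier from a config dict; mirrors the generic dispatch."""
    if modi_data is None:
        return None
    params = dict(modi_data)
    modifier_type = params.pop("type")  # e.g. "dipole_charge"
    return BaseModifier.get_class_by_type(modifier_type)(**params)
```

With this shape, adding a new modifier is just defining a registered subclass; train.py never needs a type-specific branch.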

Possibly related PRs

  • fix(pt/dp): share params of repinit_three_body #4139: related because both PRs make foundational changes to instantiation and parameter-handling logic; this PR adds the make_base_modifier function and the BaseModifier class, while #4139 enhances parameter handling for descriptors, including the share_params method.
  • feat(pt/dp/jax/xp): add DPA3 descriptor #4609: related because both PRs extend the same module; this PR adds make_base_modifier and BaseModifier, while #4609 introduces the DescrptDPA3 class and its associated functionality.

Suggested reviewers

  • njzjz
  • iProzd
  • wanghan-iapcm

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🧹 Nitpick comments (10)
deepmd/tf/modifier/base_modifier.py (1)

2-4: Consider optional direct imports for readability.

Importing make_base_modifier from a nested path is fine, but if there's frequent usage, an alternative structure or aliasing might improve maintainability and clarity in the codebase.

deepmd/dpmodel/modifier/base_modifier.py (2)

23-33: Add more guidance or structure to the abstract serialize method.

Currently, the docstring is minimal, which may cause confusion for implementers about the expected format. Providing a sample return format or clarifying potential fields might improve developer experience.


34-50: Fix docstring mismatch in deserialize.

The docstring indicates the return type is BaseModel, but this file defines BaseModifier. Clarify which entity is actually returned to avoid confusion for implementers.

- BaseModel
+ BaseModifier
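One way to make the expected format concrete, as this comment suggests, is to show a round trip in the docstring. The field names below are purely illustrative, not the actual deepmd schema:

```python
# Hypothetical serialization format for a modifier; the "type" field and the
# class shown here are illustrative stand-ins, not the real deepmd classes.

class DipoleChargeModifier:
    def __init__(self, model_charge_map, sys_charge_map):
        self.model_charge_map = model_charge_map
        self.sys_charge_map = sys_charge_map

    def serialize(self) -> dict:
        """Return a plain-dict representation suitable for deserialize()."""
        return {
            "type": "dipole_charge",
            "model_charge_map": self.model_charge_map,
            "sys_charge_map": self.sys_charge_map,
        }

    @classmethod
    def deserialize(cls, data: dict) -> "DipoleChargeModifier":
        data = dict(data)  # avoid mutating the caller's dict
        assert data.pop("type") == "dipole_charge"
        return cls(**data)
```

Documenting even a minimal example like this in the abstract method's docstring tells implementers which keys are mandatory and that serialize/deserialize must be inverses.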
deepmd/tf/entrypoints/train.py (1)

279-285: Handle missing or invalid "type" gracefully.

If modi_data lacks a "type" field, modifier_type = modifier_params.pop("type") raises a KeyError. Ensure upstream logic always provides "type" or add a defensive check to avoid unhandled exceptions.

modifier_type = modifier_params.pop("type", None)
if modifier_type is None:
    # Provide a default or raise a more user-friendly error
    raise ValueError("No 'type' specified for the modifier.")
source/tests/tf/test_data_modifier.py (1)

100-104: Optional: Ensure consistency by using float literals for default float parameters.
While ewald_h=1 and ewald_beta=0.25 are acceptable, you might prefer using 1.0 for consistency with the float annotation.

 def _test_fv(self) -> None:
     dcm = DipoleChargeModifier(
-        model_name=str(tests_path / os.path.join(modifier_datapath, "dipole.pb")),
-        model_charge_map=[-8],
-        sys_charge_map=[6, 1],
-        ewald_h=1,
-        ewald_beta=0.25,
+        model_name=str(tests_path / os.path.join(modifier_datapath, "dipole.pb")),
+        model_charge_map=[-8],
+        sys_charge_map=[6, 1],
+        ewald_h=1.0,
+        ewald_beta=0.25,
     )
deepmd/tf/modifier/dipole_charge.py (5)

48-48: Use the simpler kwargs.get("model_name").
There's no need for the default argument here, as it's checked immediately after.

 def __new__(cls, *args, **kwargs):
-    model_file = kwargs.get("model_name", None)
+    model_file = kwargs.get("model_name")
     if model_file is None:
         raise TypeError("Missing required argument: 'model_name'")
     return super().__new__(cls, model_file)


136-151: Remove or comment out unused variables to tidy up the code.
These variables (t_mdl_name, t_modi_type, t_mdl_charge_map, t_sys_charge_map, t_ewald_h, t_ewald_b) are assigned but never used.

 with tf.variable_scope("modifier_attr"):
-    t_mdl_name = tf.constant(self.model_name, name="mdl_name", dtype=tf.string)
-    t_modi_type = tf.constant(
-        self.modifier_prefix, name="type", dtype=tf.string
-    )
-    t_mdl_charge_map = tf.constant(
-        " ".join([str(ii) for ii in self.model_charge_map]),
-        name="mdl_charge_map",
-        dtype=tf.string,
-    )
-    t_sys_charge_map = tf.constant(
-        " ".join([str(ii) for ii in self.sys_charge_map]),
-        name="sys_charge_map",
-        dtype=tf.string,
-    )
-    t_ewald_h = tf.constant(self.ewald_h, name="ewald_h", dtype=tf.float64)
-    t_ewald_b = tf.constant(
-        self.ewald_beta, name="ewald_beta", dtype=tf.float64
-    )
     ...


159-164: Remove or refactor unused local variables.
The variables nf, nfxnas, and nfxna are overwritten and never used in subsequent logic.

 nf = 10
 nfxnas = 64 * nf
 nfxna = 192 * nf
 nf = -1
-nfxnas = -1
-nfxna = -1
 self.t_box_reshape = tf.reshape(self.t_box, [-1, 9])


392-392: Remove the unused local variable.
tensor is assigned but never referenced, so removing it helps keep the code uncluttered.

-        tensor = []
         feed_dict_test = {}

481-481: Drop the redundant nframes assignment.
It's never used and might create confusion with similarly named variables in the same method.

 def modify_data(self, data: dict, data_sys: DeepmdData) -> None:
     ...
     atype = data["type"][:get_nframes, :]
-    nframes = coord.shape[0]
     tot_e, tot_f, tot_v = self.eval(coord, box, atype)
     ...

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 918d4de and 134b36a.

📒 Files selected for processing (8)
  • deepmd/dpmodel/modifier/__init__.py (1 hunks)
  • deepmd/dpmodel/modifier/base_modifier.py (1 hunks)
  • deepmd/tf/entrypoints/train.py (3 hunks)
  • deepmd/tf/modifier/__init__.py (1 hunks)
  • deepmd/tf/modifier/base_modifier.py (1 hunks)
  • deepmd/tf/modifier/dipole_charge.py (1 hunks)
  • source/tests/tf/test_data_modifier.py (2 hunks)
  • source/tests/tf/test_data_modifier_shuffle.py (1 hunks)
✅ Files skipped from review due to trivial changes (3)
  • deepmd/dpmodel/modifier/__init__.py
  • deepmd/tf/modifier/__init__.py
  • source/tests/tf/test_data_modifier_shuffle.py

⏰ Context from checks skipped due to timeout of 90000ms (20)
  • GitHub Check: Test Python (5, 3.9)
  • GitHub Check: Test Python (1, 3.12)
  • GitHub Check: Test Python (2, 3.9)
  • GitHub Check: Build C library (2.14, >=2.5.0rc0,<2.15, libdeepmd_c_cu11.tar.gz)
  • GitHub Check: Build C library (2.18, libdeepmd_c.tar.gz)
  • GitHub Check: Build wheels for cp311-manylinux_x86_64
  • GitHub Check: Build wheels for cp311-macosx_x86_64
  • GitHub Check: Build wheels for cp311-macosx_arm64
  • GitHub Check: Build wheels for cp311-win_amd64
  • GitHub Check: Build wheels for cp310-manylinux_aarch64
  • GitHub Check: Build wheels for cp311-manylinux_x86_64
  • GitHub Check: Build C++ (clang, clang)
  • GitHub Check: Test C++ (false)
  • GitHub Check: Analyze (python)
  • GitHub Check: Test C++ (true)
  • GitHub Check: Build C++ (cuda, cuda)
  • GitHub Check: Build C++ (rocm, rocm)
  • GitHub Check: Build C++ (cpu, cpu)
  • GitHub Check: Build C++ (cuda120, cuda)
  • GitHub Check: Analyze (c-cpp)
🔇 Additional comments (3)
deepmd/dpmodel/modifier/base_modifier.py (2)

14-22: Check for missing or invalid type in __new__.

While dynamically switching the class is powerful, this snippet assumes kwargs["type"] is always valid. A fallback or error check for a missing or incorrect "type" key could prevent runtime errors early.


52-73: Ensure friendly error for missing "type" key in get_modifier.

Although modifier_params.pop("type", None) avoids a KeyError, returning None might cause issues if a subclass strictly requires a "type". Consider at least validating that a type was provided.

source/tests/tf/test_data_modifier.py (1)

10-10: Import path looks good.
The updated import path matches the new module structure and is consistent with the introduced DipoleChargeModifier.


codecov bot commented Mar 3, 2025

Codecov Report

Attention: Patch coverage is 66.66667% with 19 lines in your changes missing coverage. Please review.

Project coverage is 84.80%. Comparing base (eb9e71d) to head (af66d73).
Report is 79 commits behind head on devel.

Files with missing lines Patch % Lines
deepmd/dpmodel/modifier/base_modifier.py 62.50% 9 Missing ⚠️
deepmd/tf/modifier/dipole_charge.py 61.53% 5 Missing ⚠️
deepmd/tf/entrypoints/train.py 40.00% 3 Missing ⚠️
deepmd/tf/infer/deep_eval.py 0.00% 1 Missing ⚠️
deepmd/tf/modifier/base_modifier.py 80.00% 1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##            devel    #4621      +/-   ##
==========================================
- Coverage   84.81%   84.80%   -0.02%     
==========================================
  Files         688      692       +4     
  Lines       66294    66343      +49     
  Branches     3539     3538       -1     
==========================================
+ Hits        56230    56263      +33     
- Misses       8923     8938      +15     
- Partials     1141     1142       +1     

☔ View full report in Codecov by Sentry.

@wanghan-iapcm wanghan-iapcm requested a review from iProzd March 3, 2025 14:25
Collaborator

@wanghan-iapcm wanghan-iapcm left a comment


Could you please tell us whether modifier.DipoleChargeModifier is a copy of infer.data_modifier.DipoleChargeModifier?
If so, it seems that you may safely remove infer/data_modifier.py.

@ChiahsinChu
Contributor Author

Could you please tell us whether modifier.DipoleChargeModifier is a copy of infer.data_modifier.DipoleChargeModifier?

Yes, it is basically a copy of infer.data_modifier.DipoleChargeModifier, except for the introduction of the parent class for the plugin mechanism. If you think this implementation is reasonable, I will remove the old class.

@ChiahsinChu ChiahsinChu force-pushed the devel-modifier-plugin branch from 8c86290 to f9c5d53 Compare March 4, 2025 03:17

@coderabbitai coderabbitai bot left a comment


Caution

Inline review comments failed to post. This is likely due to GitHub's limits when posting large numbers of comments. If you are seeing this consistently it is likely a permissions issue. Please check "Moderation" -> "Code review limits" under your organization settings.

Actionable comments posted: 2

🧹 Nitpick comments (24)
deepmd/pt/entrypoints/main.py (1)

335-362: Consider adding error handling for DDP operations.

While the DDP initialization and cleanup are correctly implemented, consider wrapping the training code in a try-finally block to ensure the process group is always destroyed, even if training fails.

    # Initialize DDP
    if os.environ.get("LOCAL_RANK") is not None:
        dist.init_process_group(backend="cuda:nccl,cpu:gloo")

-    trainer = get_trainer(
-        config,
-        init_model,
-        restart,
-        finetune,
-        force_load,
-        init_frz_model,
-        shared_links=shared_links,
-        finetune_links=finetune_links,
-    )
-    # save min_nbor_dist
-    if min_nbor_dist is not None:
-        if not multi_task:
-            trainer.model.min_nbor_dist = torch.tensor(
-                min_nbor_dist, dtype=torch.float64, device=DEVICE
-            )
-        else:
-            for model_item in min_nbor_dist:
-                trainer.model[model_item].min_nbor_dist = torch.tensor(
-                    min_nbor_dist[model_item], dtype=torch.float64, device=DEVICE
-                )
-    trainer.run()
-    if dist.is_available() and dist.is_initialized():
-        dist.destroy_process_group()
+    try:
+        trainer = get_trainer(
+            config,
+            init_model,
+            restart,
+            finetune,
+            force_load,
+            init_frz_model,
+            shared_links=shared_links,
+            finetune_links=finetune_links,
+        )
+        # save min_nbor_dist
+        if min_nbor_dist is not None:
+            if not multi_task:
+                trainer.model.min_nbor_dist = torch.tensor(
+                    min_nbor_dist, dtype=torch.float64, device=DEVICE
+                )
+            else:
+                for model_item in min_nbor_dist:
+                    trainer.model[model_item].min_nbor_dist = torch.tensor(
+                        min_nbor_dist[model_item], dtype=torch.float64, device=DEVICE
+                    )
+        trainer.run()
+    finally:
+        if dist.is_available() and dist.is_initialized():
+            dist.destroy_process_group()
deepmd/jax/descriptor/dpa3.py (1)

26-37: Consider adding docstring for clarity

While the implementation is correct, adding a docstring to the __setattr__ method would improve code maintainability by explaining the purpose of the special handling for different attribute types.

def __setattr__(self, name: str, value: Any) -> None:
+    """Custom attribute setter for JAX compatibility.
+    
+    Handles special attributes:
+    - mean, stddev: Convert to JAX arrays and wrap in ArrayAPIVariable
+    - repflows: Deserialize and reconstruct DescrptBlockRepflows
+    - type_embedding: Deserialize and reconstruct TypeEmbedNet
+    
+    Parameters
+    ----------
+    name : str
+        Attribute name
+    value : Any
+        Attribute value
+    """
    if name in {"mean", "stddev"}:
        value = to_jax_array(value)
        if value is not None:
            value = ArrayAPIVariable(value)
    elif name in {"repflows"}:
        value = DescrptBlockRepflows.deserialize(value.serialize())
    elif name in {"type_embedding"}:
        value = TypeEmbedNet.deserialize(value.serialize())
    else:
        pass
    return super().__setattr__(name, value)
source/tests/pt/model/test_dpa3.py (1)

209-209: Remove or utilize the assigned variable model.

It appears model is assigned but never used. Consider removing the assignment or verifying the compiled model if it's meant for validation or further operations.

-            model = torch.jit.script(dd0)
+            torch.jit.script(dd0)

source/tests/consistent/descriptor/test_dpa3.py (1)

39-43: Simplify the redundant if-else block.

Both branches assign DescrptDPA3PD = None. Consider consolidating it into a single statement or removing the condition altogether.

-if INSTALLED_PD:
-    # not supported yet
-    DescrptDPA3PD = None
-else:
-    DescrptDPA3PD = None
+DescrptDPA3PD = None

deepmd/dpmodel/descriptor/dpa3.py (2)

253-253: Avoid using mutable default argument.

Using a mutable list ([]) as a default can lead to unexpected behavior if it's modified in-place. It's best practice to default to None and then assign a list within the function body.

-        exclude_types: list[tuple[int, int]] = [],
+        exclude_types: Optional[list[tuple[int, int]]] = None,

...

+        if exclude_types is None:
+            exclude_types = []
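For context, the pitfall behind Ruff's B006 warning is that a mutable default is created once at function definition time and shared across all calls. A minimal standalone demonstration (the helper names are hypothetical, not deepmd code):

```python
# Demonstration of the mutable-default-argument pitfall the suggestion avoids.

def add_pair_buggy(pair, exclude_types=[]):
    # The SAME list object is reused on every call that omits exclude_types.
    exclude_types.append(pair)
    return exclude_types

def add_pair_fixed(pair, exclude_types=None):
    if exclude_types is None:
        exclude_types = []  # fresh list per call
    exclude_types.append(pair)
    return exclude_types

print(add_pair_buggy((0, 1)))  # [(0, 1)]
print(add_pair_buggy((1, 2)))  # [(0, 1), (1, 2)]  <- default list is shared!
print(add_pair_fixed((0, 1)))  # [(0, 1)]
print(add_pair_fixed((1, 2)))  # [(1, 2)]
```

This is why the None-default-plus-initialize pattern recurs across these nitpicks for exclude_types.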


574-574: Remove unused variable or integrate it into the logic.

env_mat is never used after being assigned. Consider removing it to maintain cleaner code unless it’s needed for future expansions.

-env_mat = repflow_variable.pop("env_mat")
+repflow_variable.pop("env_mat")

deepmd/dpmodel/descriptor/repflows.py (7)

156-156: Avoid using mutable default arguments.

Currently, exclude_types: list[tuple[int, int]] = [] can introduce unexpected behavior if the default list is modified. Consider switching to a non-mutable default, such as None, and initialize it in the body of the function or constructor.

- exclude_types: list[tuple[int, int]] = []
+ exclude_types: Optional[list[tuple[int, int]]] = None


353-353: Avoid using mutable default arguments.

Here, exclude_types: list[tuple[int, int]] = [] should also be replaced with a non-mutable default to prevent potential side effects.

- exclude_types: list[tuple[int, int]] = []
+ exclude_types: Optional[list[tuple[int, int]]] = None


358-469: Consider refactoring the call method into smaller functions.

The call method is sizable, handling neighbor-related logic, environment matrix computations, array expansions, and more. Breaking it up into focused sub-methods (e.g., _prepare_edge_data(), _compute_angle_data()) would enhance clarity and reduce complexity.



440-440: Unused loop variable.

The variable idx is not used within the loop body. For clarity and to avoid lint warnings, rename it to _idx or remove it if not needed.

-for idx, ll in enumerate(self.layers):
+for _idx, ll in enumerate(self.layers):


543-772: Add targeted tests and parameter validation.

The RepFlowLayer constructor also has numerous parameters and optional features (e.g., angle compression, multi-edge messages). More targeted unit tests or parameter validation would help detect misconfigurations early; consider adding validation checks to confirm consistent parameter usage.
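A validation guard of the kind suggested here might look like the following sketch. The parameter names and the allowed values are assumptions for illustration, not the actual RepFlowLayer API:

```python
# Illustrative constructor-time validation for a layer with many options.
# "update_style" values and the message-count parameter are hypothetical.

def check_repflow_args(update_style: str, n_multi_edge_message: int) -> None:
    """Fail fast with a descriptive error instead of misbehaving later."""
    allowed = {"res_avg", "res_residual"}
    if update_style not in allowed:
        raise ValueError(
            f"unknown update_style {update_style!r}; "
            f"expected one of {sorted(allowed)}"
        )
    if n_multi_edge_message < 1:
        raise ValueError("n_multi_edge_message must be >= 1")
```

Calling such a check at the top of __init__ turns a silent misconfiguration into an immediate, readable error.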


937-937: Unused variable assignment.

nall = node_ebd_ext.shape[1] is never used below. Removing or using the variable meaningfully helps keep the code clear of dead assignments.

- nall = node_ebd_ext.shape[1]


1199-1199: Remove unused local variable.

nitem is assigned but never used in list_update_res_residual. Clean up to avoid confusion.

- nitem = len(update_list)

source/tests/tf/test_deepmd_data.py (1)

385-445: Provide additional assertions for new multi-type scenarios.

The new TestDataMixType class covers mixed-type initialization and validations. You might consider adding checks for specific behaviors (e.g., loading data with mismatched types) to broaden your test coverage.

source/tests/universal/dpmodel/descriptor/test_descriptor.py (2)

21-23: Newly imported RepFlowArgs.

This import for RepFlowArgs is consistent with the newly introduced repflow logic. Ensure that additional arguments in RepFlowArgs are tested thoroughly to maintain coverage.


467-549: Avoid using mutable data structures for default parameters.

Line 474 uses exclude_types=[], which is mutable. Replace this with None and initialize a new list inside the function to prevent accidental mutation of the default.

- exclude_types=[],
+ exclude_types=None,
🧰 Tools
🪛 Ruff (0.8.2)

474-474: Do not use mutable data structures for argument defaults

Replace with None; initialize within function

(B006)
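To make the B006 concern concrete, here is a minimal, self-contained demonstration (toy functions, not the project's actual signatures) of why a mutable default is shared across calls and how the `None` sentinel avoids it:

```python
# Demonstration of the B006 pitfall: the default list is created once,
# at function definition time, and then shared by every call.
def append_bad(item, acc=[]):
    acc.append(item)
    return acc

def append_good(item, acc=None):
    if acc is None:
        acc = []  # a fresh list on each call
    acc.append(item)
    return acc

print(append_bad(1))   # [1]
print(append_bad(2))   # [1, 2]  <- state leaked from the first call
print(append_good(1))  # [1]
print(append_good(2))  # [2]
```

The same `Optional[...] = None` pattern applies to the `exclude_types` parameters flagged throughout this review.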

deepmd/pt/model/descriptor/repflows.py (2)

178-178: Avoid using mutable default arguments.
exclude_types: list[tuple[int, int]] = [] can create shared references if mutated.

-        exclude_types: list[tuple[int, int]] = [],
+        exclude_types: Optional[list[tuple[int, int]]] = None,

# Then inside __init__:
if exclude_types is None:
    exclude_types = []
🧰 Tools
🪛 Ruff (0.8.2)

178-178: Do not use mutable data structures for argument defaults

Replace with None; initialize within function

(B006)


360-360: Second mutable default argument.
exclude_types: list[tuple[int, int]] = [] in reinit_exclude may lead to accidental shared modifications.

-    def reinit_exclude(self, exclude_types: list[tuple[int, int]] = []):
+    def reinit_exclude(self, exclude_types: Optional[list[tuple[int, int]]] = None):
🧰 Tools
🪛 Ruff (0.8.2)

360-360: Do not use mutable data structures for argument defaults

Replace with None; initialize within function

(B006)

deepmd/pt/model/descriptor/repflow_layer.py (2)

534-534: Remove unused variable.
nall = node_ebd_ext.shape[1] is declared but never used. Suggest removing it to keep the code clean.

-        nall = node_ebd_ext.shape[1]
🧰 Tools
🪛 Ruff (0.8.2)

534-534: Local variable nall is assigned to but never used

Remove assignment to unused variable nall

(F841)


788-788: Remove the unused variable.
nitem = len(update_list) is not used afterwards.

-        nitem = len(update_list)
🧰 Tools
🪛 Ruff (0.8.2)

788-788: Local variable nitem is assigned to but never used

Remove assignment to unused variable nitem

(F841)

deepmd/pt/model/descriptor/dpa3.py (4)

339-339: Rename the unused loop variable to underscore for clarity.

Since ii is not used within the loop body, renaming it to _ avoids confusion and signals the variable is intentionally unused.

-        for ii, descrpt in enumerate(descrpt_list):
+        for _, descrpt in enumerate(descrpt_list):
🧰 Tools
🪛 Ruff (0.8.2)

339-339: Loop control variable ii not used within loop body

Rename unused ii to _ii

(B007)
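As an illustration of the B007 pattern (toy data, not the real descriptor list): when the index is never read, dropping `enumerate` entirely is often clearer than renaming the variable to an underscore:

```python
descrpt_list = ["repflow_0", "repflow_1"]

# flagged style: index is bound but never read
collected = []
for _ii, descrpt in enumerate(descrpt_list):
    collected.append(descrpt.upper())

# simpler: iterate directly over the sequence
collected2 = [descrpt.upper() for descrpt in descrpt_list]

print(collected)   # ['REPFLOW_0', 'REPFLOW_1']
print(collected2)  # ['REPFLOW_0', 'REPFLOW_1']
```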


415-415: Remove the unused variable env_mat.

The variable env_mat is assigned but never used. Removing the assignment prevents clutter in the code.

-        env_mat = repflow_variable.pop("env_mat")
🧰 Tools
🪛 Ruff (0.8.2)

415-415: Local variable env_mat is assigned to but never used

Remove assignment to unused variable env_mat

(F841)


443-443: Correct the spelling in the docstring.

Minor spelling fix from "aotm" to "atom."

-            The extended aotm types. shape: nf x nall
+            The extended atom types. shape: nf x nall

471-471: Remove the unused variable nall.

Because nall is never used after its assignment, it's safe to remove the line to reduce extraneous code.

-        nall = extended_coord.view(nframes, -1).shape[1] // 3
🧰 Tools
🪛 Ruff (0.8.2)

471-471: Local variable nall is assigned to but never used

Remove assignment to unused variable nall

(F841)

🛑 Comments failed to post (2)
deepmd/dpmodel/descriptor/repflows.py (1)

132-210: 🛠️ Refactor suggestion

Consider splitting the constructor for better maintainability.

The __init__ method for DescrptBlockRepflows is very extensive, featuring numerous parameters and initialization steps. Splitting some setup logic into helper methods (e.g., _init_embeddings(), _init_layers()) could improve readability, maintainability, and facilitate easier unit testing.

🧰 Tools
🪛 Ruff (0.8.2)

156-156: Do not use mutable data structures for argument defaults

Replace with None; initialize within function

(B006)
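A minimal sketch of the suggested split, assuming hypothetical helper names (`_init_embeddings`, `_init_layers`) and placeholder attributes rather than the real `DescrptBlockRepflows` fields:

```python
# Sketch only: shows the shape of the refactor, not the actual initialization
# logic of DescrptBlockRepflows.
class DescrptBlockSketch:
    def __init__(self, ntypes: int, nlayers: int) -> None:
        self.ntypes = ntypes
        self.nlayers = nlayers
        self._init_embeddings()
        self._init_layers()

    def _init_embeddings(self) -> None:
        # one embedding slot per atom type (placeholder data)
        self.embeddings = [None] * self.ntypes

    def _init_layers(self) -> None:
        # one message-passing layer per requested depth (placeholder data)
        self.layers = [f"layer_{i}" for i in range(self.nlayers)]

block = DescrptBlockSketch(ntypes=2, nlayers=3)
```

Each helper can then be unit-tested in isolation, which is the maintainability gain the comment above is after.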

deepmd/pt/model/descriptor/dpa3.py (1)

105-105: 🛠️ Refactor suggestion

Avoid mutable default arguments in function signature.

Using a mutable list for the parameter exclude_types can lead to unexpected behavior due to aliasing. Instead, consider defaulting to None and initializing an empty list inside the function body.

-        exclude_types: list[tuple[int, int]] = [],
+        exclude_types: Optional[list[tuple[int, int]]] = None,

         ...
         if exclude_types is None:
             exclude_types = []
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

        exclude_types: Optional[list[tuple[int, int]]] = None,
        ...
        if exclude_types is None:
            exclude_types = []
🧰 Tools
🪛 Ruff (0.8.2)

105-105: Do not use mutable data structures for argument defaults

Replace with None; initialize within function

(B006)

Contributor

@coderabbitai coderabbitai bot left a comment


Caution

Inline review comments failed to post. This is likely due to GitHub's limits when posting large numbers of comments. If you are seeing this consistently it is likely a permissions issue. Please check "Moderation" -> "Code review limits" under your organization settings.

Actionable comments posted: 1

🧹 Nitpick comments (26)
deepmd/jax/descriptor/dpa3.py (1)

1-37: Consider adding null checks for serialized attributes.

While re-serializing repflows and type_embedding is likely valid, calling serialize() on a None value or an unexpected type would cause runtime errors. You could handle potential edge cases to make this more robust:

    elif name in {"repflows"}:
-       value = DescrptBlockRepflows.deserialize(value.serialize())
+       if value is not None:
+           value = DescrptBlockRepflows.deserialize(value.serialize())

    elif name in {"type_embedding"}:
-       value = TypeEmbedNet.deserialize(value.serialize())
+       if value is not None:
+           value = TypeEmbedNet.deserialize(value.serialize())
source/tests/pt/model/test_dpa3.py (1)

141-210: Unused local variable model in test_jit.

On line 209, model = torch.jit.script(dd0) is assigned but never used, as the static-analysis hints also flag. If you intend to use the JIT-compiled model, verify it within the test; otherwise remove the assignment.

Proposed fix (remove unused variable):

- model = torch.jit.script(dd0)
🧰 Tools
🪛 Ruff (0.8.2)

209-209: Local variable model is assigned to but never used

Remove assignment to unused variable model

(F841)

source/tests/consistent/descriptor/test_dpa3.py (1)

39-43: Remove redundant if-else block.

Both branches assign DescrptDPA3PD = None, so the condition is redundant. Consider removing it to simplify the code.

-if INSTALLED_PD:
-    # not supported yet
-    DescrptDPA3PD = None
-else:
-    DescrptDPA3PD = None
+DescrptDPA3PD = None
🧰 Tools
🪛 Ruff (0.8.2)

39-43: Use ternary operator DescrptDPA3PD = None if INSTALLED_PD else None instead of if-else-block

(SIM108)

deepmd/dpmodel/descriptor/dpa3.py (2)

253-253: Use a non-mutable default for the exclude_types parameter.

Defaulting to an empty list can potentially cause unexpected behavior. Instead, default to None and initialize inside the method.

-def __init__(
-    self,
-    ntypes: int,
-    # args for repflow
-    repflow: Union[RepFlowArgs, dict],
-    # kwargs for descriptor
-    concat_output_tebd: bool = False,
-    activation_function: str = "silu",
-    precision: str = "float64",
-    exclude_types: list[tuple[int, int]] = [],
-    env_protection: float = 0.0,
-    ...
-) -> None:
+def __init__(
+    self,
+    ntypes: int,
+    repflow: Union[RepFlowArgs, dict],
+    concat_output_tebd: bool = False,
+    activation_function: str = "silu",
+    precision: str = "float64",
+    exclude_types: Optional[list[tuple[int, int]]] = None,
+    env_protection: float = 0.0,
+    ...
+) -> None:
+    if exclude_types is None:
+        exclude_types = []
     self.exclude_types = exclude_types
🧰 Tools
🪛 Ruff (0.8.2)

253-253: Do not use mutable data structures for argument defaults

Replace with None; initialize within function

(B006)


574-574: Remove unused variable env_mat.

The variable env_mat is never used after it is popped from repflow_variable, making this assignment redundant.

- env_mat = repflow_variable.pop("env_mat")
🧰 Tools
🪛 Ruff (0.8.2)

574-574: Local variable env_mat is assigned to but never used

Remove assignment to unused variable env_mat

(F841)

deepmd/dpmodel/descriptor/repflows.py (5)

156-156: Use a non-mutable default for exclude_types.

Defaulting to an empty list can potentially cause unexpected behavior. Instead, set it to None and initialize within the constructor.

-def __init__(
-    ...,
-    exclude_types: list[tuple[int, int]] = [],
-    env_protection: float = 0.0,
-    ...
-) -> None:
+def __init__(
+    ...,
+    exclude_types: Optional[list[tuple[int, int]]] = None,
+    env_protection: float = 0.0,
+    ...
+) -> None:
+    if exclude_types is None:
+        exclude_types = []
     self.exclude_types = exclude_types
🧰 Tools
🪛 Ruff (0.8.2)

156-156: Do not use mutable data structures for argument defaults

Replace with None; initialize within function

(B006)


353-353: Use a non-mutable default for exclude_types.

Similar to the constructor, defaulting to an empty list can lead to unintended side effects. Default to None and handle initialization inside the method.

-def reinit_exclude(
-    self,
-    exclude_types: list[tuple[int, int]] = [],
-) -> None:
+def reinit_exclude(
+    self,
+    exclude_types: Optional[list[tuple[int, int]]] = None,
+) -> None:
+    if exclude_types is None:
+        exclude_types = []
     self.exclude_types = exclude_types
🧰 Tools
🪛 Ruff (0.8.2)

353-353: Do not use mutable data structures for argument defaults

Replace with None; initialize within function

(B006)


440-440: Rename unused loop variable idx.

The loop control variable is not used inside the loop body, so renaming it to _ or _idx clarifies that it is unused.

-for idx, ll in enumerate(self.layers):
+for _idx, ll in enumerate(self.layers):
🧰 Tools
🪛 Ruff (0.8.2)

440-440: Loop control variable idx not used within loop body

Rename unused idx to _idx

(B007)


937-937: Remove the unused local variable nall.

This variable is assigned but never referenced, so removing it will improve clarity.

-nall = node_ebd_ext.shape[1]
🧰 Tools
🪛 Ruff (0.8.2)

937-937: Local variable nall is assigned to but never used

Remove assignment to unused variable nall

(F841)


1199-1199: Remove unused local variable nitem.

This variable is never referenced after being assigned. Removing it clarifies the code.

-nitem = len(update_list)
🧰 Tools
🪛 Ruff (0.8.2)

1199-1199: Local variable nitem is assigned to but never used

Remove assignment to unused variable nitem

(F841)

source/tests/universal/pt/model/test_model.py (1)

99-99: Typo in 'defalut_des_param'.
Consider renaming the variable to default_des_param for improved clarity.

- defalut_des_param = [
+ default_des_param = [
deepmd/tf/modifier/dipole_charge.py (4)

133-148: Variables assigned but never used.
t_mdl_name, t_modi_type, t_mdl_charge_map, t_sys_charge_map, t_ewald_h, and t_ewald_b are all assigned but not utilized. Consider removing these lines to reduce clutter.

🧰 Tools
🪛 Ruff (0.8.2)

133-133: Local variable t_mdl_name is assigned to but never used

Remove assignment to unused variable t_mdl_name

(F841)


134-134: Local variable t_modi_type is assigned to but never used

Remove assignment to unused variable t_modi_type

(F841)


137-137: Local variable t_mdl_charge_map is assigned to but never used

Remove assignment to unused variable t_mdl_charge_map

(F841)


142-142: Local variable t_sys_charge_map is assigned to but never used

Remove assignment to unused variable t_sys_charge_map

(F841)


147-147: Local variable t_ewald_h is assigned to but never used

Remove assignment to unused variable t_ewald_h

(F841)


148-148: Local variable t_ewald_b is assigned to but never used

Remove assignment to unused variable t_ewald_b

(F841)


160-160: Unused local variable nfxnas.
This assignment is immediately overwritten, so it can safely be removed.

🧰 Tools
🪛 Ruff (0.8.2)

160-160: Local variable nfxnas is assigned to but never used

Remove assignment to unused variable nfxnas

(F841)


390-390: Unused assignment to tensor.
This variable is never referenced afterward. Recommend removing to avoid confusion.

🧰 Tools
🪛 Ruff (0.8.2)

390-390: Local variable tensor is assigned to but never used

Remove assignment to unused variable tensor

(F841)


478-478: nframes not used.
This assignment is never consumed in the code below. You can safely remove it or refactor if needed.

🧰 Tools
🪛 Ruff (0.8.2)

478-478: Local variable nframes is assigned to but never used

Remove assignment to unused variable nframes

(F841)

deepmd/pt/model/descriptor/repflows.py (4)

178-178: Avoid using a mutable default argument for 'exclude_types'.
When using a list as a default parameter, Python reuses the same list object across all calls, which can lead to unexpected behavior.

Here is a possible fix:

-def __init__(
-    self,
-    ...
-    exclude_types: list[tuple[int, int]] = [],
-    ...
-):
+def __init__(
+    self,
+    ...
+    exclude_types: Optional[list[tuple[int, int]]] = None,
+    ...
+):
+    if exclude_types is None:
+        exclude_types = []
🧰 Tools
🪛 Ruff (0.8.2)

178-178: Do not use mutable data structures for argument defaults

Replace with None; initialize within function

(B006)


360-360: Avoid using a mutable default argument for 'exclude_types'.
Same concern arises for the reinit_exclude method signature.

Here is a possible fix:

-def reinit_exclude(
-    self,
-    exclude_types: list[tuple[int, int]] = [],
-) -> None:
+def reinit_exclude(
+    self,
+    exclude_types: Optional[list[tuple[int, int]]] = None,
+) -> None:
+    if exclude_types is None:
+        exclude_types = []
🧰 Tools
🪛 Ruff (0.8.2)

360-360: Do not use mutable data structures for argument defaults

Replace with None; initialize within function

(B006)


461-461: Rename unused loop variable 'idx'.
The variable idx is never used in the loop. For clarity, rename it to _ or _idx.

-for idx, ll in enumerate(self.layers):
+for _, ll in enumerate(self.layers):
    ...
🧰 Tools
🪛 Ruff (0.8.2)

461-461: Loop control variable idx not used within loop body

Rename unused idx to _idx

(B007)


571-575: Use a ternary expression for brevity.
A single conditional expression is more concise than the four-line if-else block.

-if callable(merged):
-    sampled = merged()
-else:
-    sampled = merged
+sampled = merged() if callable(merged) else merged
🧰 Tools
🪛 Ruff (0.8.2)

571-575: Use ternary operator sampled = merged() if callable(merged) else merged instead of if-else-block

(SIM108)

deepmd/pt/model/descriptor/repflow_layer.py (3)

307-307: Remove or use the unused local variable 'e_dim'.
The variable e_dim is assigned but never used, creating dead code.

-nb, nloc, nnei, _ = edge_ebd.shape
-e_dim = edge_ebd.shape[-1]
+nb, nloc, nnei, e_dim = edge_ebd.shape
# If e_dim is truly unneeded, remove its assignment entirely:
- e_dim = edge_ebd.shape[-1]
🧰 Tools
🪛 Ruff (0.8.2)

307-307: Local variable e_dim is assigned to but never used

Remove assignment to unused variable e_dim

(F841)


534-534: Remove or use the unused local variable 'nall'.
Currently, nall is assigned but not used, which can add confusion.

-nb, nloc, nnei, _ = edge_ebd.shape
-nall = node_ebd_ext.shape[1]
+nb, nloc, nnei, _ = edge_ebd.shape
 # Remove nall if it's not needed
🧰 Tools
🪛 Ruff (0.8.2)

534-534: Local variable nall is assigned to but never used

Remove assignment to unused variable nall

(F841)


788-788: Remove or use the unused local variable 'nitem'.
nitem is assigned and not used, leading to unnecessary code.

-nitem = len(update_list)
 uu = update_list[0]
 for ii in range(1, len(update_list)):
     ...
🧰 Tools
🪛 Ruff (0.8.2)

788-788: Local variable nitem is assigned to but never used

Remove assignment to unused variable nitem

(F841)

deepmd/pt/model/descriptor/dpa3.py (4)

105-105: Avoid using a mutable default argument for 'exclude_types'.
This repeats the same concern for default-argument lists.

-def __init__(
-    self,
-    ...
-    exclude_types: list[tuple[int, int]] = [],
-    ...
-):
+def __init__(
+    self,
+    ...
+    exclude_types: Optional[list[tuple[int, int]]] = None,
+    ...
+):
+    if exclude_types is None:
+        exclude_types = []
🧰 Tools
🪛 Ruff (0.8.2)

105-105: Do not use mutable data structures for argument defaults

Replace with None; initialize within function

(B006)


339-339: Rename unused loop variable 'ii'.
The loop variable ii is never used in the loop body. For clarity, rename it to _.

-for ii, descrpt in enumerate(descrpt_list):
+for _, descrpt in enumerate(descrpt_list):
    descrpt.compute_input_stats(merged, path)
🧰 Tools
🪛 Ruff (0.8.2)

339-339: Loop control variable ii not used within loop body

Rename unused ii to _ii

(B007)


415-415: Remove or use the unused local variable 'env_mat'.
This variable is assigned but not used.

-env_mat = repflow_variable.pop("env_mat")
 # Remove the assignment if not needed
🧰 Tools
🪛 Ruff (0.8.2)

415-415: Local variable env_mat is assigned to but never used

Remove assignment to unused variable env_mat

(F841)


471-471: Remove or use the unused local variable 'nall'.
The variable nall is assigned but not used, which can clutter the code.

-nframes, nloc, nnei = nlist.shape
-nall = extended_coord.view(nframes, -1).shape[1] // 3
+...
# Remove 'nall' if it's unused in subsequent logic
🧰 Tools
🪛 Ruff (0.8.2)

471-471: Local variable nall is assigned to but never used

Remove assignment to unused variable nall

(F841)

🛑 Comments failed to post (1)
source/tests/universal/dpmodel/descriptor/test_descriptor.py (1)

467-549: 🛠️ Refactor suggestion

Avoid mutable default for exclude_types.
In Python, using a list directly as a default parameter can lead to unexpected behavior. It is best practice to use None and initialize it inside the function body.

- def DescriptorParamDPA3(
-     ntypes,
-     rcut,
-     rcut_smth,
-     sel,
-     type_map,
-     env_protection=0.0,
-     exclude_types=[],
-     ...
- ):
+ def DescriptorParamDPA3(
+     ntypes,
+     rcut,
+     rcut_smth,
+     sel,
+     type_map,
+     env_protection=0.0,
+     exclude_types=None,
+     ...
+ ):
+     if exclude_types is None:
+         exclude_types = []
      ...
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

def DescriptorParamDPA3(
    ntypes,
    rcut,
    rcut_smth,
    sel,
    type_map,
    env_protection=0.0,
    exclude_types=None,
    update_style="res_residual",
    update_residual=0.1,
    update_residual_init="const",
    update_angle=True,
    n_multi_edge_message=1,
    a_compress_rate=0,
    a_compress_e_rate=1,
    a_compress_use_split=False,
    optim_update=True,
    fix_stat_std=0.3,
    precision="float64",
):
    if exclude_types is None:
        exclude_types = []
    input_dict = {
        # kwargs for repformer
        "repflow": RepFlowArgs(
            **{
                "n_dim": 20,
                "e_dim": 10,
                "a_dim": 8,
                "nlayers": 2,
                "e_rcut": rcut,
                "e_rcut_smth": rcut_smth,
                "e_sel": sum(sel),
                "a_rcut": rcut / 2,
                "a_rcut_smth": rcut_smth / 2,
                "a_sel": sum(sel) // 4,
                "a_compress_rate": a_compress_rate,
                "a_compress_e_rate": a_compress_e_rate,
                "a_compress_use_split": a_compress_use_split,
                "optim_update": optim_update,
                "fix_stat_std": fix_stat_std,
                "n_multi_edge_message": n_multi_edge_message,
                "axis_neuron": 2,
                "update_angle": update_angle,
                "update_style": update_style,
                "update_residual": update_residual,
                "update_residual_init": update_residual_init,
            }
        ),
        "ntypes": ntypes,
        "concat_output_tebd": False,
        "precision": precision,
        "activation_function": "silu",
        "exclude_types": exclude_types,
        "env_protection": env_protection,
        "trainable": True,
        "use_econf_tebd": False,
        "use_tebd_bias": False,
        "type_map": type_map,
        "seed": GLOBAL_SEED,
    }
    return input_dict


DescriptorParamDPA3List = parameterize_func(
    DescriptorParamDPA3,
    OrderedDict(
        {
            "update_residual_init": ("const",),
            "exclude_types": ([], [[0, 1]]),
            "update_angle": (True, False),
            "a_compress_rate": (1,),
            "a_compress_e_rate": (2,),
            "a_compress_use_split": (True, False),
            "optim_update": (True, False),
            "fix_stat_std": (0.3,),
            "n_multi_edge_message": (1, 2),
            "env_protection": (0.0, 1e-8),
            "precision": ("float64",),
        }
    ),
)
# to get name for the default function
DescriptorParamDPA3 = DescriptorParamDPA3List[0]
🧰 Tools
🪛 Ruff (0.8.2)

474-474: Do not use mutable data structures for argument defaults

Replace with None; initialize within function

(B006)

@iProzd iProzd enabled auto-merge March 13, 2025 07:57
@iProzd iProzd added this pull request to the merge queue Mar 14, 2025
Merged via the queue into deepmodeling:devel with commit 4c8600a Mar 14, 2025
61 checks passed
