feat: add gradient descent information to thermoelastic2d problem #232

Open
gapaza wants to merge 1 commit into main from thermoelastic2d-grad-info

Conversation


@gapaza gapaza commented Mar 18, 2026

Description

Please include a summary of the change and which issue is fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.

This pull request adds gradient information to the Thermoelastic2D problem. It also modifies the OptiStep data class to include additional optional fields for gradient information. These fields include:

  • x: npt.NDArray (the current design before the gradient update)
  • x_update: npt.NDArray (the update step taken by the optimizer)
  • obj_values_update: npt.NDArray (how the objective values changed after the update step)

Type of change

Please delete options that are not relevant.

  • New feature (non-breaking change which adds functionality)

Screenshots

Please attach before and after screenshots of the change if applicable.

Checklist:

  • I have run the pre-commit checks with pre-commit run --all-files
  • I have run ruff check . and ruff format
  • I have run mypy .
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes

Reviewer Checklist:

  • The content of this PR brings value to the community. It is not too specific to a particular use case.
  • The tests and checks pass (linting, formatting, type checking). For a new problem, double-check the GitHub Actions workflow to ensure the problem is being tested.
  • The documentation is updated.
  • The code is understandable and commented. No large code blocks are left unexplained, no huge files. Can I read and understand the code easily?
  • There is no merge conflict.
  • The changes are not breaking the existing results (datasets, training curves, etc.). If they do, is there a good reason for it? And is the associated problem version bumped?
  • For a new problem, has the dataset been generated with our slurm script so we can re-generate it if needed? (This also ensures that the problem is running on the HPC.)
  • For bugfixes, it is a robust fix and not a hacky workaround.

@gapaza gapaza requested a review from g-braeunlich March 18, 2026 14:58
@gapaza gapaza self-assigned this Mar 18, 2026
Comment on lines +28 to +30
x: npt.NDArray | None = None # the current design before the gradient update
x_update: npt.NDArray | None = None # the gradient update step taken by the optimizer
obj_values_update: npt.NDArray | None = None # how the objective values change after the update step
Collaborator

Suggested change
x: npt.NDArray | None = None # the current design before the gradient update
x_update: npt.NDArray | None = None # the gradient update step taken by the optimizer
obj_values_update: npt.NDArray | None = None # how the objective values change after the update step
x: npt.NDArray | None = None
"""the current design before the gradient update"""
x_update: npt.NDArray | None = None
"""the gradient update step taken by the optimizer"""
obj_values_update: npt.NDArray | None = None
"""how the objective values change after the update step"""

return True
return change < UPDATE_THRESHOLD and iterr >= MIN_ITERATIONS

def record_step(
Collaborator

PLR0913 starts getting annoying.
I would say: let's remove that check.
Could you do

--- a/pyproject.toml
+++ b/pyproject.toml
@@ -132,6 +132,7 @@ ignore = [
   "EM102",    # f-string-in-exception
   "E741",     # ambiguous-variable-name
   "FIX002",   # flake8-fixme (flake8-todos is enough)
+  "PLR0913",  # too-many-arguments
   "PTH",      # flake8-use-pathlib

and remove the corresponding noqa throughout the codebase?
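
Removing the now-redundant suppressions can be scripted. A minimal sketch, with a helper name of my own choosing (not from the PR), that strips a trailing `# noqa: PLR0913` comment from a source line; lines with combined codes like `# noqa: PLR0913, D401` would need extra handling:

```python
import re


def strip_plr0913_noqa(line: str) -> str:
    # Remove a trailing "# noqa: PLR0913" suppression (and the whitespace
    # before it), now that the rule is ignored project-wide in pyproject.toml.
    return re.sub(r"\s*#\s*noqa:\s*PLR0913\s*$", "", line)
```

Applied over the repository (e.g. via `grep -rl "noqa: PLR0913"` to find the files), this keeps unrelated lines untouched.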
