Conversation

@santisoler

No description provided.

Add an optional `callback` argument to the
`GaussNewtonConjugateGradient` minimizer that gets called before
yielding any model. The callback function takes a new `MinimizerResult`
data class with some information about the minimization process.
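
A minimal sketch of how this could look, assuming the minimizer is a generator and that `MinimizerResult` carries the iteration number, the objective value, and the current model (the field names and the placeholder update step below are assumptions, not the actual implementation):

```python
from dataclasses import dataclass
from typing import Callable, Iterator, Optional

import numpy as np


@dataclass
class MinimizerResult:
    """Hypothetical fields: the actual data class may carry different attributes."""

    iteration: int
    objective_value: float
    model: np.ndarray


class GaussNewtonConjugateGradient:
    def __init__(self, maxiter: int = 10):
        self.maxiter = maxiter

    def __call__(
        self,
        initial_model: np.ndarray,
        callback: Optional[Callable[[MinimizerResult], None]] = None,
    ) -> Iterator[np.ndarray]:
        model = initial_model
        for iteration in range(self.maxiter):
            # Placeholder update: the real minimizer runs a Gauss-Newton CG step.
            model = 0.9 * model
            result = MinimizerResult(
                iteration=iteration,
                objective_value=float(np.linalg.norm(model)),
                model=model,
            )
            # Call the optional callback before yielding the new model.
            if callback is not None:
                callback(result)
            yield model
```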
Use the Log protocol in the Inversion class. Revert this commit if it turns
out not to be a good approach; I pushed it at the end of the day.
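
A rough sketch of what such a protocol could look like with `typing.Protocol`; the `update` method here is an assumption, not the actual interface:

```python
from typing import Any, Protocol


class Log(Protocol):
    """Structural interface: any object with these methods can act as the log."""

    def update(self, result: Any) -> None:
        ...
```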
The `InversionLog` now holds a `get_minimizer_callback` method that returns
a function that can be passed to the `Minimizer` as the `callback` argument.
Each time `get_minimizer_callback` is called, the `InversionLog` creates a new
`MinimizerLog`, appends it to a running list of `minimizer_logs`, and returns
the `update` method of the newly created one. The `Inversion` then passes this
callable to the `Minimizer` when it calls it. This way, the logs of all
minimization steps are stored inside the `InversionLog`.
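
A sketch of that mechanism under the same assumptions as above (class internals simplified, helper names hypothetical):

```python
class MinimizerLog:
    def __init__(self):
        self.results = []

    def update(self, result) -> None:
        # One entry per minimizer iteration.
        self.results.append(result)


class InversionLog:
    def __init__(self):
        self.minimizer_logs: list[MinimizerLog] = []

    def get_minimizer_callback(self):
        # Create a fresh log for the upcoming minimization, keep track of it,
        # and hand back its `update` method to be used as the callback.
        minimizer_log = MinimizerLog()
        self.minimizer_logs.append(minimizer_log)
        return minimizer_log.update


# Inside the Inversion, roughly:
#     callback = self.log.get_minimizer_callback()
#     for model in self.minimizer(initial_model, callback=callback):
#         ...
```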
It's best to build the Rich renderables and store them in memory rather than
rebuilding them every time the `__rich__` method is called. This avoids async
issues that might happen when the `Live` wants to update the renderable while
some piece of the code required to build it is still running (filling the log,
for example). So it's better to always have one renderable ready to be shown.
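
For example, a log wrapper could mutate a single `rich.table.Table` kept in memory and have `__rich__` simply return it (the column names below are made up for illustration):

```python
from rich.table import Table


class MinimizerLogRich:
    def __init__(self):
        # Build the renderable once, up front.
        self.table = Table("iteration", "objective_value")

    def update(self, result) -> None:
        # Mutate the stored renderable as new iterations arrive.
        self.table.add_row(str(result.iteration), str(result.objective_value))

    def __rich__(self) -> Table:
        # Always return the already-built renderable, so a `Live` refresh
        # never catches it half-constructed.
        return self.table
```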
It's better to keep it flexible rather than implementing it as a dataclass.
Update the MinimizerLog so it can add any column in the result to the
logger.
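
One way to do that, assuming columns are selected by attribute name of the result (the default column names below are invented):

```python
class MinimizerLog:
    """Log any fields of the minimizer result, one column per field."""

    def __init__(self, columns=("iteration", "objective_value")):
        self.columns = tuple(columns)
        self.rows = []

    def update(self, result) -> None:
        # Pull whichever attributes were requested out of the result.
        self.rows.append(tuple(getattr(result, column) for column in self.columns))
```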
Print the inversion log table first and then each of the minimizer logs in
a tree structure. This way we simplify the `__rich__` method of the
`InversionLogRich`: we don't need to worry about whether it has minimizer
logs or not.
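
A sketch of that rendering order using `rich.console.Group` and `rich.tree.Tree` (the column names and tree label are placeholders):

```python
from rich.console import Group
from rich.table import Table
from rich.tree import Tree


class InversionLogRich:
    def __init__(self):
        self.table = Table("iteration", "objective_value")
        self.minimizer_tables: list[Table] = []

    def __rich__(self) -> Group:
        # The inversion table always comes first; each minimizer log hangs
        # below it as a branch of the tree. An empty tree still renders, so
        # there is no special case for "no minimizer logs yet".
        tree = Tree("Minimizations")
        for minimizer_table in self.minimizer_tables:
            tree.add(minimizer_table)
        return Group(self.table, tree)
```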