
Conversation

Arushi-Gupta13

This PR refactors the codebase to annotate closures with @closure. The change is intended to improve readability, reduce redundant closure definitions, and improve performance by letting the @closure macro handle captured variables without boxing.
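For context, @closure is the macro commonly provided by FastClosures.jl; it wraps a closure expression in a let block so that captured variables stay type-stable instead of being boxed. A minimal sketch of typical usage, with illustrative names that are not from this PR:

using FastClosures

function make_scaler(data)
    scale = isempty(data) ? 1.0 : 1 / length(data)
    # @closure rewrites this into a let-block capture of `scale`,
    # which keeps the returned closure type-stable
    return @closure x -> scale * x
end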

integral = NeuralPDE.get_numeric_integral(strategy, indvars, multioutput, chain, derivative)

_pde_loss_function = NeuralPDE.build_loss_function(eq, indvars, depvars, phi, derivative,
_pde_loss_function = @closure NeuralPDE.build_loss_function(eq, indvars, depvars, phi, derivative,
Member

this isn't a closure

Author

I added @closure here under the assumption that it would be necessary for capturing variables, but I see now that this line doesn't form a closure.
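For reference, a minimal illustration of the distinction, with entirely hypothetical names: a closure is a function that captures variables from an enclosing local scope, whereas the annotated line is just a call whose result is bound to a name.

# Not a closure: a plain call, nothing is captured
result = some_function(a, b)

# A closure: the inner anonymous function captures the local variable `a`
function make_adder(a)
    return x -> x + a
end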

phi = discretization.phi
# Analysis
# Analysis with closure
Member

why mention this?

Author

In PR #900, it was suggested to annotate closures with @closure to avoid boxing. I reviewed the places where closures might be needed, added the annotations, and updated the comment here to note that the analysis is done with the closure annotation.
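As background on the boxing issue being referenced: when a captured variable is conditionally reassigned before the closure is built, Julia's lowering boxes it (this is the captured-variables example from the performance tips). A hedged sketch of the pattern @closure is meant to address, assuming the FastClosures.jl macro:

using FastClosures

function abmult(r::Int)
    if r < 0
        r = -r   # `r` is reassigned, so a plain closure would capture a boxed `r`
    end
    # the let-block rewrite from @closure captures the current value of `r`,
    # keeping the returned closure type-stable
    return @closure x -> x * r
end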

@ChrisRackauckas
Member

Can you show a concrete improvement from this? What led to it? Can you show what functions had issues with inference and a flamegraph?
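For reference, a typical way to gather that kind of evidence in Julia is @code_warntype on the suspect function plus a profile that can be rendered as a flamegraph. A rough, self-contained sketch; the function definitions below are placeholders, not the package's API:

using InteractiveUtils, Profile

# placeholder stand-ins for the real loss function and training driver
pde_loss_function(θ) = sum(abs2, θ)
run_training() = sum(pde_loss_function(rand(10)) for _ in 1:10^4)

# inspect inference on the function suspected of boxing captured variables
@code_warntype pde_loss_function(rand(3))

# collect a profile of a representative run; a flamegraph can be rendered
# with ProfileView.@profview or PProf.jl
@profile run_training()
Profile.print()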

if: build.message !~ /\[skip tests\]/

- label: "Documentation"
- label: "Documentation"p
Member

Suggested change
- label: "Documentation"p
- label: "Documentation"

DomainSets = "5b8099bc-c8ec-5219-889f-1d9e522a28bf"
ForwardDiff = "f6369f11-7733-5829-9624-2563aa707210"
Functors = "d9f16b24-f501-4c13-a1f2-28368ffc5196"
Glob = "c27321d9-0574-5035-807b-f59d2c89b15c"
Member

what is this?

Author

It was added for searching a keyword across all files, to automate the search with a script.

Reexport = "189a3867-3050-52da-a836-e630ba90ab69"
RuntimeGeneratedFunctions = "7e49a35a-f44a-4d26-94aa-eba1b4ca6b47"
SciMLBase = "0bca4576-84f4-4d90-8ffe-ffa030f20462"
SparseArrays = "2f01184e-e22b-5df5-ae63-d93ebab69eaf"
Member

why is this needed?

Author

It was continuously throwing a KeyError: key "SparseArrays" not found error, and adding this resolved it.

MiniMaxAdaptiveLoss(args...; kwargs...) = MiniMaxAdaptiveLoss{Float64}(args...; kwargs...)

function generate_adaptive_loss_function(pinnrep::PINNRepresentation,
@closure function generate_adaptive_loss_function(pinnrep::PINNRepresentation,
Member

in the wrong spot

NonAdaptiveLoss(; kwargs...) = NonAdaptiveLoss{Float64}(; kwargs...)

function generate_adaptive_loss_function(::PINNRepresentation, ::NonAdaptiveLoss, _, __)
@closure function generate_adaptive_loss_function(::PINNRepresentation, ::NonAdaptiveLoss, _, __)
Member

wrong spot
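For context on the "wrong spot" remarks: @closure annotates a closure expression, so it belongs on the inner anonymous function that a generator returns, not on the top-level method definition. A hedged sketch with hypothetical names, not the actual NeuralPDE signature:

using FastClosures

function make_adaptive_reweighter(weights)
    # annotate the closure being returned, not the method that builds it
    return @closure losses -> sum(weights .* losses)
end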
