Allocate default_material in the MPI shared_memory when copy Numpy arrays to meep#2221

Open
estebvac wants to merge 2 commits into NanoComp:master from estebvac:master

Conversation


@estebvac estebvac commented Sep 1, 2022

Related to discussion #2158. This change avoids copying the default_material array once per process by leveraging MPI's shared-memory capabilities, so all ranks on a node map a single copy and total memory consumption is reduced. With the example code shown in discussion #2158, this made it possible to increase the number of processes running the simulation on a single node from 8 to 24.
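The mechanism this PR relies on, one physical copy of the array per node that every rank maps, is provided in MPI by shared-memory windows (MPI_Win_allocate_shared). Since that requires an MPI runtime, the sketch below illustrates the same idea with Python's stdlib multiprocessing.shared_memory instead; all names here are illustrative and not meep's actual API.

```python
import numpy as np
from multiprocessing import shared_memory

def allocate_shared_array(shape, dtype=np.float64, create=True, name=None):
    """Back a NumPy array with a named shared-memory segment so every
    process on the node maps the same physical pages instead of holding
    its own private copy."""
    nbytes = int(np.prod(shape)) * np.dtype(dtype).itemsize
    shm = shared_memory.SharedMemory(name=name, create=create, size=nbytes)
    arr = np.ndarray(shape, dtype=dtype, buffer=shm.buf)
    return arr, shm

# The "owner" process (rank 0 in MPI terms) allocates and fills once...
eps_grid, owner = allocate_shared_array((4, 4))
eps_grid[:] = 12.0

# ...and any other process on the node attaches to the same segment by name.
view, attached = allocate_shared_array((4, 4), create=False, name=owner.name)
value = float(view[0, 0])  # reads the owner's data through shared pages

# Release NumPy's references to the buffers before closing, then free
# the segment (only the owner unlinks it).
del eps_grid, view
attached.close()
owner.close()
owner.unlink()
```

In the MPI version, the attach-by-name step corresponds to querying the window base pointer with MPI_Win_shared_query on non-zero ranks of the node-local communicator.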

@codecov-commenter

Codecov Report

Merging #2221 (5affc25) into master (63cd51f) will decrease coverage by 0.04%.
The diff coverage is n/a.

@@            Coverage Diff             @@
##           master    #2221      +/-   ##
==========================================
- Coverage   73.24%   73.20%   -0.05%     
==========================================
  Files          17       17              
  Lines        4897     4904       +7     
==========================================
+ Hits         3587     3590       +3     
- Misses       1310     1314       +4     
Impacted Files Coverage Δ
python/adjoint/connectivity.py 8.64% <0.00%> (-1.84%) ⬇️
python/visualization.py 41.49% <0.00%> (-1.16%) ⬇️
python/simulation.py 76.84% <0.00%> (ø)
python/adjoint/optimization_problem.py 57.06% <0.00%> (+0.22%) ⬆️

@smartalecH
Collaborator

Do you want to try fixing the pre-commit errors?
