I have a large autogenerated Flax Linen Module that is not working well with caching. I'm using JAX on WSL2 w/ an NVIDIA RTX 4080 Super, driver 572.70 (newer drivers cause exit code 139, but that's a different story...).

cache_issue.py:

In the first run, the output ends with:

```
WARNING:2025-06-22 19:19:07,153:jax._src.compiler:107: PERSISTENT COMPILATION CACHE MISS for 'jit_forward' with key 'jit_forward-277bb252941ee475b89264b7444b78689454bb1ae5ea3536b2179087a81f7aaa'
WARNING:2025-06-22 19:19:17,873:jax._src.compilation_cache:265: Writing jit_forward to persistent compilation cache with key 'jit_forward-277bb252941ee475b89264b7444b78689454bb1ae5ea3536b2179087a81f7aaa'
All done!
```

In later runs, the output has hundreds of variations (each with a different lambda id) of:

```
WARNING:2025-06-22 19:23:28,057:jax._src.pjit:1151: TRACING CACHE MISS at /home/braun/.local/lib/python3.12/site-packages/flax/core/scope.py:950:18 (Scope.param) because:
  never seen function:
    Scope.param.<locals>.<lambda> id=138999571240800 defined at /home/braun/.local/lib/python3.12/site-packages/flax/core/scope.py:951
  but seen another function defined on the same line; maybe the function is
  being re-defined repeatedly, preventing caching?
```

and then:

```
WARNING:2025-06-22 19:23:28,578:jax._src.pjit:1151: TRACING CACHE MISS at /mnt/c/Users/braun/GitHub/myproject/cache_issue.py:38:8 (<module>) because:
  never seen function:
    forward id=139001048154176 defined at /mnt/c/Users/braun/GitHub/myproject/cache_issue.py:33
All done!
```

Note that the `__call__` of DX7Algo15 has a lambda `tick`:

Can I get some help with improving caching?
cache_issue.py.txt
algo15.py.txt
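For illustration (this is not the attached `cache_issue.py`; the function names below are made up): `jax.jit` keys its tracing cache on the identity of the Python function object, so a function or lambda that is re-created on every call never hits the cache, which is exactly the "never seen function ... but seen another function defined on the same line" situation in the warnings above. A minimal sketch of the symptom and of the usual fix (defining the function once):

```python
import jax
import jax.numpy as jnp

x = jnp.ones(3)

# Case 1: the function is re-defined on every iteration, so each jax.jit call
# sees a brand-new Python function object and must re-trace it.
fresh_traces = 0
for _ in range(3):
    def f(v):
        global fresh_traces
        fresh_traces += 1  # runs only while JAX traces, not on cache hits
        return v * 2.0
    jax.jit(f)(x)

# Case 2: the function is defined once, so its identity is stable and the
# tracing cache hits after the first call with the same shapes/dtypes.
stable_traces = 0
def g(v):
    global stable_traces
    stable_traces += 1
    return v * 2.0

g_jit = jax.jit(g)
for _ in range(3):
    g_jit(x)

print(fresh_traces, stable_traces)  # expected: 3 1
```

The same reasoning applies to a lambda created inside `__call__`: hoisting it to module level (or to `setup`) gives it a stable identity across calls.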
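For reference, the `PERSISTENT COMPILATION CACHE MISS` line in the first-run log comes from JAX's on-disk compilation cache, which is enabled via config flags like the following (the directory path here is just an example; the two threshold flags are optional and make JAX cache even small, fast-compiling programs):

```python
import jax

# Point JAX's persistent compilation cache at a directory.
jax.config.update("jax_compilation_cache_dir", "/tmp/jax_cache")

# Optional: cache every compiled program, even quick ones (by default JAX
# skips programs that compile quickly or produce small executables).
jax.config.update("jax_persistent_cache_min_compile_time_secs", 0)
jax.config.update("jax_persistent_cache_min_entry_size_bytes", -1)
```

Note that the persistent cache only avoids XLA recompilation; it cannot help with the tracing-cache misses above, which happen earlier, at the Python level.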