This repository was archived by the owner on Sep 2, 2025. It is now read-only.
Custom Spark Resource Allocation for specific set of Models #1181
javashankar started this conversation in General
Replies: 0 comments
Is there a way to define and allocate different Spark configurations, such as spark.executor.cores and spark.executor.instances, for a specific set of models? A few models in my project need large resources, while others need only minimal resources. Any pointers/inputs appreciated.
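One common pattern for this (a sketch, not a feature of any particular framework) is to keep a mapping from model name to a resource tier, and look up the Spark config overrides for that tier before building the session. All tier names, model names, and values below are hypothetical assumptions for illustration:

```python
# Hypothetical per-tier Spark configs, selected by model name.
# Tier names, model names, and resource values are illustrative only.

SPARK_CONF_BY_TIER = {
    "large": {
        "spark.executor.cores": "8",
        "spark.executor.instances": "20",
        "spark.executor.memory": "16g",
    },
    "small": {
        "spark.executor.cores": "2",
        "spark.executor.instances": "2",
        "spark.executor.memory": "4g",
    },
}

# Map each model to a resource tier; unknown models fall back to "small".
MODEL_TIER = {
    "churn_model": "large",
    "simple_scoring_model": "small",
}

def spark_conf_for(model_name: str) -> dict:
    """Return the Spark config overrides to use for the given model."""
    tier = MODEL_TIER.get(model_name, "small")
    return SPARK_CONF_BY_TIER[tier]

# The overrides can then be applied when building the session, e.g.:
#   builder = SparkSession.builder
#   for key, value in spark_conf_for("churn_model").items():
#       builder = builder.config(key, value)
#   spark = builder.getOrCreate()
```

Note that `spark.executor.cores` and `spark.executor.instances` are fixed at session startup on most cluster managers, so running models with different tiers in one job generally means separate Spark applications (or dynamic allocation via `spark.dynamicAllocation.enabled`) rather than reconfiguring a live session.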