
Conversation

@YellowRoseCx

The goal is to allow multiple GPUs to pool their VRAM for use during evaluation and quantization, and to let the user set a maximum amount of memory per GPU, with the remaining data offloaded to CPU RAM.

Motivation: a GPU with 16 GB of VRAM runs out of memory when evaluating a 30B model on the C4-new dataset, and it also runs OOM during quantization.

I believe a mixture of the current multi-GPU code (used for benchmarks) and the CPU offloading scripts will be useful for achieving this; see the sketch below.
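
For reference, here is a minimal sketch (not part of this PR) of how per-GPU memory caps with CPU offload can be expressed using Hugging Face Accelerate's `device_map` support in transformers. The checkpoint path and memory limits are placeholder assumptions, and the actual quantization path in this repo may load the model differently.

```python
# Sketch: load a 30B model across two GPUs with a per-GPU VRAM cap, spilling
# whatever does not fit onto CPU RAM via Accelerate's device_map.
# All paths and memory values below are illustrative placeholders.
import torch
from transformers import AutoModelForCausalLM

max_memory = {
    0: "14GiB",       # cap below 16 GB to leave headroom for activations
    1: "14GiB",
    "cpu": "64GiB",   # layers that don't fit on the GPUs go to CPU RAM
}

model = AutoModelForCausalLM.from_pretrained(
    "path/to/llama-30b",        # placeholder checkpoint path
    torch_dtype=torch.float16,
    device_map="auto",          # let Accelerate split layers across devices
    max_memory=max_memory,      # enforce the per-device caps above
)
```
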

YellowRoseCx changed the title from "Add MultiGPU support and CPU offloading for Evaluation and Quantization" to "WIP: Add MultiGPU support and CPU offloading for Evaluation and Quantization" on May 18, 2023.
YellowRoseCx changed the title from "WIP: Add MultiGPU support and CPU offloading for Evaluation and Quantization" to "WIP: Add MultiGPU support" on May 25, 2023.