Open
Labels
bug (Something isn't working)
Description
What happened?
I had an old install of LiteLLM using the "main-latest" image:
podman run -d --name litellm --rm \
  --network host \
  --env-file "$(pwd)/credentials.env" \
  -v "$(pwd)/config.yaml:/app/config.yaml:ro,Z" \
  ghcr.io/berriai/litellm:main-latest \
  --config /app/config.yaml \
  --host 0.0.0.0 \
  --port 8500
This install displays version "1.77.3" on the main page of the web UI and offered the option to select vllm as the provider when adding a new model.
I recently built a new LiteLLM deployment on a Kubernetes cluster by cloning the repo and installing with Helm: sudo helm -n apps install litellm . -f litellm/deploy/charts/litellm-helm/values.yaml. In the values.yaml file I tried a number of different versions, but none of them showed the vllm option for the provider.
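For reference, the way I changed versions in values.yaml was roughly the following (I'm assuming the usual image.repository / image.tag keys here, and the repository name is taken from my podman setup; the chart's actual key names may differ):

image:
  repository: ghcr.io/berriai/litellm
  tag: "v1.78.5"
  pullPolicy: IfNotPresent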
Is the vllm provider no longer supported, or is it just a bug?
I'm sorry if this issue is already fixed or being fixed, but I didn't find any relevant existing issues.
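For context, this is the kind of config.yaml entry I would expect to keep working for a vLLM backend regardless of the UI dropdown (the hosted_vllm/ prefix is what the docs describe for an OpenAI-compatible vLLM server; the model name and api_base below are placeholders):

model_list:
  - model_name: my-vllm-model
    litellm_params:
      model: hosted_vllm/meta-llama/Llama-3.1-8B-Instruct
      api_base: http://my-vllm-host:8000/v1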
Relevant log output
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
v1.78.5
Twitter / LinkedIn details
No response