Description
Currently, the AI integration in pgAdmin is tied to specific LLM providers. To improve flexibility and privacy, I am requesting the ability to configure a custom LLM provider URL/endpoint.
This would allow users to leverage any OpenAI-compatible API, such as:
- LiteLLM (for proxying multiple enterprise models)
Suggested Implementation
In the pgAdmin "Preferences" dialog, under the AI section, please add the following field:
- API Endpoint URL: (e.g., http://localhost:4000/v1 or a custom proxy URL)
Specifically, I am suggesting that the base URL be made configurable rather than hardcoded:
```python
API_URL = 'https://api.openai.com/v1/chat/completions'
```
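A minimal sketch of what this could look like, resolving the endpoint from a user preference or environment variable instead of a constant. The preference key, the `AI_API_BASE_URL` variable name, and the function itself are illustrative assumptions, not pgAdmin's actual code:

```python
import os

# Current hardcoded default; a custom base URL would override it.
DEFAULT_BASE_URL = 'https://api.openai.com/v1'


def resolve_chat_completions_url(preferences=None):
    """Return the chat/completions URL for the configured provider.

    Resolution order (hypothetical): explicit preference value,
    then the AI_API_BASE_URL environment variable, then the default.
    """
    preferences = preferences or {}
    base_url = (
        preferences.get('base_url')
        or os.environ.get('AI_API_BASE_URL')
        or DEFAULT_BASE_URL
    )
    # Tolerate a trailing slash in user-entered URLs.
    return base_url.rstrip('/') + '/chat/completions'


# Default behaviour matches the current hardcoded value:
print(resolve_chat_completions_url())
# → https://api.openai.com/v1/chat/completions

# Pointing pgAdmin at a local LiteLLM proxy only changes the base URL:
print(resolve_chat_completions_url({'base_url': 'http://localhost:4000/v1'}))
# → http://localhost:4000/v1/chat/completions
```

Because every OpenAI-compatible server exposes the same `/chat/completions` path, nothing else in the request/response handling would need to change.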
Use Case
Many organizations have strict data governance policies that prevent sending database schemas to public AI endpoints. By allowing a custom URL, pgAdmin users can point the AI feature to a self-hosted LiteLLM instance that strips sensitive data or routes requests to approved internal models.
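For illustration, a self-hosted LiteLLM proxy sitting between pgAdmin and an internal model could be configured along these lines (the model names, port, and routing target are assumptions, and the exact config keys may differ by LiteLLM version):

```yaml
# litellm_config.yaml -- illustrative sketch only
model_list:
  - model_name: gpt-4o                # name pgAdmin would request
    litellm_params:
      model: ollama/llama3            # routed to an approved internal model
      api_base: http://localhost:11434
```

With the proxy started on port 4000, pgAdmin would simply use `http://localhost:4000/v1` as its API Endpoint URL, and no database schema data would ever leave the organization's network.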
Additional Context
Supporting the OpenAI API standard is the most efficient way to achieve this, as most local and alternative LLM runners have already adopted this specification.