Support custom LLM Provider URLs (OpenAI-compatible) #9703

@javierlarota

Description

Currently, the AI integration in pgAdmin is tied to specific LLM providers. To improve flexibility and privacy, I am requesting the ability to configure a custom LLM provider URL/endpoint.

This would allow users to leverage any OpenAI-compatible API, such as:

  • LiteLLM (for proxying multiple enterprise models)

Suggested Implementation
In the pgAdmin "Preferences" settings under the AI section, please add a "Base URL" field, so that the endpoint is configurable rather than hardcoded:

API_URL = 'https://api.openai.com/v1/chat/completions'
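A minimal sketch of what this could look like on the pgAdmin side. The setting name (`AI_BASE_URL`) and helper function are hypothetical, not existing pgAdmin code; the point is just that the endpoint is derived from a configurable base rather than a constant:

```python
import os

# Hypothetical default; matches the currently hardcoded provider.
DEFAULT_BASE_URL = 'https://api.openai.com/v1'


def chat_completions_url(base_url=None):
    """Build the chat-completions endpoint from a user-configured base URL.

    Resolution order (all names here are assumptions for illustration):
    explicit argument -> AI_BASE_URL environment variable -> default.
    """
    base = base_url or os.environ.get('AI_BASE_URL') or DEFAULT_BASE_URL
    return base.rstrip('/') + '/chat/completions'
```

With this in place, pointing pgAdmin at a local proxy would be as simple as setting the base URL to e.g. `http://localhost:4000/v1`.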

Use Case
Many organizations have strict data governance policies that prevent sending database schemas to public AI endpoints. By allowing a custom URL, pgAdmin users can point the AI feature to a self-hosted LiteLLM instance that strips sensitive data or routes requests to approved internal models.

Additional Context
Supporting the OpenAI API standard is the most efficient way to achieve this, as most local and alternative LLM runners have already adopted this specification.
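To illustrate why this is low-effort: an OpenAI-compatible request only depends on the `/chat/completions` path and a standard JSON body, so swapping the base URL is enough to target any compatible server. A stdlib-only sketch that builds (but does not send) such a request; the URL, model name, and API key are placeholders:

```python
import json
import urllib.request


def build_chat_request(base_url, model, prompt, api_key='sk-placeholder'):
    """Construct an OpenAI-compatible chat-completions request object."""
    payload = {
        'model': model,
        'messages': [{'role': 'user', 'content': prompt}],
    }
    return urllib.request.Request(
        base_url.rstrip('/') + '/chat/completions',
        data=json.dumps(payload).encode('utf-8'),
        headers={
            'Content-Type': 'application/json',
            'Authorization': f'Bearer {api_key}',
        },
        method='POST',
    )


# Placeholder values: a self-hosted LiteLLM proxy and an internal model name.
req = build_chat_request('http://localhost:4000/v1', 'internal-model',
                         'Explain this schema.')
```

The same request shape works unchanged against the public OpenAI endpoint, a LiteLLM proxy, or any other runner that implements the spec.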
