Commit a7ad317

Authored by hxcva1tae0y, co-authored by tae0y
feat: add Google Vertex AI connector implementation (#493)
* feat: Add GoogleVertexAIConnector with settings validation and stubbed chat client
* feat: Add GoogleVertexAIConnector to LanguageModelConnector factory
* feat: Add Google Vertex AI parameters and environment variables to infra bicep templates
* feat: Implement GoogleVertexAIConnector using GeminiChatClient
* test: Add unit tests for GoogleVertexAIConnector
* test: Fix failing tests for GoogleVertexAIConnector when model is missing
* refactor: Replace magic strings with private const string in GoogleVertexAIConnectorTests
* test: Add new test cases to GoogleVertexAIConnectorTests
* test: Enable GoogleVertexAIConnector inheritance check in LanguageModelConnectorTests
* refactor: Remove unnecessary function parameter
* docs: Add Google Vertex AI (draft)
* test: Add unit tests for GoogleVertexAIConnector (validation and client creation)
* test: Remove GoogleVertexAI from unsupported connector tests
* Update google-vertex-ai.md - indent code blocks and notes inside numbered list items
* Update google-vertex-ai.md - apply documentation conventions: GitHub Container Registry, default and alternative model configuration
* Update README.md - reflect Google Vertex AI in the project and docs root READMEs
* Update GoogleVertexAIConnectorType.cs - apply method naming convention; expand invalid cases to null, empty string, whitespace, and tab; add missing invalid test cases; supply test inputs via InlineData
* Update GoogleVertexAIConnector.cs - add missing exception-handling logic; add guard logic for when EnsureLanguageModelSettingsValid has not been called
* Update GoogleVertexAIConnector.cs - declare AppSettings as a connector member and adjust related logic; remove unnecessary == expressions
* Update GoogleVertexAIConnectorTests.cs - fix the expected exception class for each called method
* Update resources.bicep - remove logic that defaulted the connector to Google when unspecified; add logic to pass env values to the container
* Update resources.bicep - use a secretRef for apiKey

Co-authored-by: tae0y[박태영] <[email protected]>
1 parent ea7c84c commit a7ad317

File tree: 10 files changed (+589, -3 lines)


README.md
Lines changed: 4 additions & 1 deletion

@@ -9,7 +9,7 @@ Open Chat Playground (OCP) is a web UI that is able to connect virtually any LLM
 - [x] [Amazon Bedrock](https://docs.aws.amazon.com/bedrock)
 - [x] [Azure AI Foundry](https://learn.microsoft.com/azure/ai-foundry/what-is-azure-ai-foundry)
 - [x] [GitHub Models](https://docs.github.com/github-models/about-github-models)
-- [ ] [Google Vertex AI](https://cloud.google.com/vertex-ai/docs)
+- [x] [Google Vertex AI](https://cloud.google.com/vertex-ai/docs)
 - [x] [Docker Model Runner](https://docs.docker.com/ai/model-runner)
 - [x] [Foundry Local](https://learn.microsoft.com/azure/ai-foundry/foundry-local/what-is-foundry-local)
 - [x] [Hugging Face](https://huggingface.co/docs)
@@ -63,6 +63,7 @@ Open Chat Playground (OCP) is a web UI that is able to connect virtually any LLM
 - [Use Amazon Bedrock](./docs/amazon-bedrock.md#run-on-local-machine)
 - [Use Azure AI Foundry](./docs/azure-ai-foundry.md#run-on-local-machine)
 - [Use GitHub Models](./docs/github-models.md#run-on-local-machine)
+- [Use Google Vertex AI](./docs/google-vertex-ai.md#run-on-local-machine)
 - [Use Docker Model Runner](./docs/docker-model-runner.md#run-on-local-machine)
 - [Use Foundry Local](./docs/foundry-local.md#run-on-local-machine)
 - [Use Hugging Face](./docs/hugging-face.md#run-on-local-machine)
@@ -77,6 +78,7 @@ Open Chat Playground (OCP) is a web UI that is able to connect virtually any LLM
 - [Use Amazon Bedrock](./docs/amazon-bedrock.md#run-in-local-container)
 - [Use Azure AI Foundry](./docs/azure-ai-foundry.md#run-in-local-container)
 - [Use GitHub Models](./docs/github-models.md#run-in-local-container)
+- [Use Google Vertex AI](./docs/google-vertex-ai.md#run-in-local-container)
 - [Use Docker Model Runner](./docs/docker-model-runner.md#run-in-local-container)
 - ~~Use Foundry Local~~ 👉 NOT SUPPORTED
 - [Use Hugging Face](./docs/hugging-face.md#run-in-local-container)
@@ -91,6 +93,7 @@ Open Chat Playground (OCP) is a web UI that is able to connect virtually any LLM
 - [Use Amazon Bedrock](./docs/amazon-bedrock.md#run-on-azure)
 - [Use Azure AI Foundry](./docs/azure-ai-foundry.md#run-on-azure)
 - [Use GitHub Models](./docs/github-models.md#run-on-azure)
+- [Use Google Vertex AI](./docs/google-vertex-ai.md#run-on-azure)
 - ~~Use Docker Model Runner~~ 👉 NOT SUPPORTED
 - ~~Use Foundry Local~~ 👉 NOT SUPPORTED
 - [Use Hugging Face](./docs/hugging-face.md#run-on-azure)

docs/README.md
Lines changed: 1 addition & 0 deletions

@@ -3,6 +3,7 @@
 - [Amazon Bedrock](./amazon-bedrock.md)
 - [Azure AI Foundry](./azure-ai-foundry.md)
 - [GitHub Models](./github-models.md)
+- [Google Vertex AI](./google-vertex-ai.md)
 - [Docker Model Runner](./docker-model-runner.md)
 - [Foundry Local](./foundry-local.md)
 - [Hugging Face](./hugging-face.md)

docs/google-vertex-ai.md
Lines changed: 241 additions & 0 deletions (new file)

# OpenChat Playground with Google Vertex AI

This page describes how to run OpenChat Playground (OCP) with Google Vertex AI integration.

## Get the repository root

1. Get the repository root.

    ```bash
    # bash/zsh
    REPOSITORY_ROOT=$(git rev-parse --show-toplevel)
    ```

    ```powershell
    # PowerShell
    $REPOSITORY_ROOT = git rev-parse --show-toplevel
    ```

## Run on local machine

1. Make sure you are at the repository root.

    ```bash
    cd $REPOSITORY_ROOT
    ```

1. Add the Google Vertex AI API key. Replace `{{GOOGLE_VERTEX_AI_API_KEY}}` with your key.

    ```bash
    # bash/zsh
    dotnet user-secrets --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp \
        set GoogleVertexAI:ApiKey "{{GOOGLE_VERTEX_AI_API_KEY}}"
    ```

    ```powershell
    # PowerShell
    dotnet user-secrets --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp `
        set GoogleVertexAI:ApiKey "{{GOOGLE_VERTEX_AI_API_KEY}}"
    ```

    > To get an API key, refer to the doc [Using Gemini API keys](https://ai.google.dev/gemini-api/docs/api-key#api-keys).

1. Run the app. The default model OCP uses is [Gemini 2.5 Flash Lite](https://ai.google.dev/gemini-api/docs/models#gemini-2.5-flash-lite).

    ```bash
    # bash/zsh
    dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- \
        --connector-type GoogleVertexAI
    ```

    ```powershell
    # PowerShell
    dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- `
        --connector-type GoogleVertexAI
    ```

    Alternatively, if you want to run with a different model, say [`gemini-2.5-pro`](https://ai.google.dev/gemini-api/docs/models#gemini-2.5-pro), other than the default one, you can specify it as an argument.

    ```bash
    # bash/zsh
    dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- \
        --connector-type GoogleVertexAI \
        --model gemini-2.5-pro
    ```

    ```powershell
    # PowerShell
    dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- `
        --connector-type GoogleVertexAI `
        --model gemini-2.5-pro
    ```

1. Open your web browser at `http://localhost:5280` and start entering prompts.

## Run in local container

1. Make sure you are at the repository root.

    ```bash
    cd $REPOSITORY_ROOT
    ```

1. Build a container.

    ```bash
    docker build -f Dockerfile -t openchat-playground:latest .
    ```

1. Get the Google Vertex AI API key.

    ```bash
    # bash/zsh
    API_KEY=$(dotnet user-secrets --project ./src/OpenChat.PlaygroundApp list --json | \
        sed -n '/^\/\//d; p' | jq -r '."GoogleVertexAI:ApiKey"')
    ```

    ```powershell
    # PowerShell
    $API_KEY = (dotnet user-secrets --project ./src/OpenChat.PlaygroundApp list --json | `
        Select-String -NotMatch '^//(BEGIN|END)' | ConvertFrom-Json).'GoogleVertexAI:ApiKey'
    ```

1. Run the app. The default model OCP uses is [Gemini 2.5 Flash Lite](https://ai.google.dev/gemini-api/docs/models#gemini-2.5-flash-lite).

    ```bash
    # bash/zsh - from locally built container
    docker run -i --rm -p 8080:8080 openchat-playground:latest \
        --connector-type GoogleVertexAI \
        --api-key $API_KEY
    ```

    ```powershell
    # PowerShell - from locally built container
    docker run -i --rm -p 8080:8080 openchat-playground:latest `
        --connector-type GoogleVertexAI `
        --api-key $API_KEY
    ```

    ```bash
    # bash/zsh - from GitHub Container Registry
    docker run -i --rm -p 8080:8080 ghcr.io/aliencube/open-chat-playground/openchat-playground:latest \
        --connector-type GoogleVertexAI \
        --api-key $API_KEY
    ```

    ```powershell
    # PowerShell - from GitHub Container Registry
    docker run -i --rm -p 8080:8080 ghcr.io/aliencube/open-chat-playground/openchat-playground:latest `
        --connector-type GoogleVertexAI `
        --api-key $API_KEY
    ```

    Alternatively, if you want to run with a different model, say [`gemini-2.5-pro`](https://ai.google.dev/gemini-api/docs/models#gemini-2.5-pro), other than the default one, you can specify it as an argument.

    ```bash
    # bash/zsh - from locally built container
    docker run -i --rm -p 8080:8080 openchat-playground:latest \
        --connector-type GoogleVertexAI \
        --api-key $API_KEY \
        --model gemini-2.5-pro
    ```

    ```powershell
    # PowerShell - from locally built container
    docker run -i --rm -p 8080:8080 openchat-playground:latest `
        --connector-type GoogleVertexAI `
        --api-key $API_KEY `
        --model gemini-2.5-pro
    ```

    ```bash
    # bash/zsh - from GitHub Container Registry
    docker run -i --rm -p 8080:8080 ghcr.io/aliencube/open-chat-playground/openchat-playground:latest \
        --connector-type GoogleVertexAI \
        --api-key $API_KEY \
        --model gemini-2.5-pro
    ```

    ```powershell
    # PowerShell - from GitHub Container Registry
    docker run -i --rm -p 8080:8080 ghcr.io/aliencube/open-chat-playground/openchat-playground:latest `
        --connector-type GoogleVertexAI `
        --api-key $API_KEY `
        --model gemini-2.5-pro
    ```

1. Open your web browser, navigate to `http://localhost:8080`, and enter prompts.

## Run on Azure

1. Make sure you are at the repository root.

    ```bash
    cd $REPOSITORY_ROOT
    ```

1. Login to Azure.

    ```bash
    azd auth login
    ```

1. Check login status.

    ```bash
    azd auth login --check-status
    ```

1. Initialize the `azd` template.

    ```bash
    azd init
    ```

    > **NOTE**: You will be asked to provide an environment name for provisioning.

1. Get the Google Vertex AI API key.

    ```bash
    # bash/zsh
    API_KEY=$(dotnet user-secrets --project ./src/OpenChat.PlaygroundApp list --json | \
        sed -n '/^\/\//d; p' | jq -r '."GoogleVertexAI:ApiKey"')
    ```

    ```powershell
    # PowerShell
    $API_KEY = (dotnet user-secrets --project ./src/OpenChat.PlaygroundApp list --json | `
        Select-String -NotMatch '^//(BEGIN|END)' | ConvertFrom-Json).'GoogleVertexAI:ApiKey'
    ```

1. Set the Google Vertex AI configuration to azd environment variables.

    ```bash
    azd env set GOOGLE_VERTEX_AI_API_KEY $API_KEY
    ```

    The default model OCP uses is [Gemini 2.5 Flash Lite](https://ai.google.dev/gemini-api/docs/models#gemini-2.5-flash-lite). If you want to run with a different model, say [`gemini-2.5-pro`](https://ai.google.dev/gemini-api/docs/models#gemini-2.5-pro), other than the default one, add it to the azd environment variables.

    ```bash
    azd env set GOOGLE_VERTEX_AI_MODEL gemini-2.5-pro
    ```

1. Set the connector type to `GoogleVertexAI`.

    ```bash
    azd env set CONNECTOR_TYPE GoogleVertexAI
    ```

1. Provision and deploy.

    ```bash
    azd up
    ```

    > **NOTE**: You will be asked to provide an Azure subscription and location for deployment.

    Once deployed, you will be able to see the deployed OCP app URL.

1. Open your web browser, navigate to the OCP app URL, and enter prompts.

1. Clean up.

    ```bash
    azd down --force --purge
    ```
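`dotnet user-secrets list --json` wraps its JSON payload in `//BEGIN` and `//END` marker lines, which is why the bash variant above pipes through `sed -n '/^\/\//d; p'` and the PowerShell variant filters with `Select-String -NotMatch` before parsing. A minimal Python sketch of the same filtering, using stubbed output and a placeholder secret value:

```python
import json

# Stubbed output of `dotnet user-secrets list --json` (placeholder secret value).
raw = """\
//BEGIN
{
  "GoogleVertexAI:ApiKey": "dummy-api-key"
}
//END
"""

# Drop the //BEGIN and //END wrapper lines, mirroring `sed -n '/^\/\//d; p'`.
payload = "\n".join(line for line in raw.splitlines() if not line.startswith("//"))

secrets = json.loads(payload)
api_key = secrets["GoogleVertexAI:ApiKey"]
print(api_key)  # dummy-api-key
```

Without the filtering step, `json.loads` (like `jq` or `ConvertFrom-Json`) would fail on the non-JSON marker lines.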

infra/main.bicep
Lines changed: 5 additions & 0 deletions

@@ -28,6 +28,9 @@ param azureAIFoundryDeploymentName string = ''
 param githubModelsToken string = ''
 param githubModelsModel string = ''
 // Google Vertex AI
+param googleVertexAIModel string = ''
+@secure()
+param googleVertexAIApiKey string = ''
 // Docker Model Runner - NOT SUPPORTED
 // Foundry Local - NOT SUPPORTED
 // Hugging Face
@@ -103,6 +106,8 @@ module resources 'resources.bicep' = {
     huggingFaceModel: huggingFaceModel
     githubModelsToken: githubModelsToken
     githubModelsModel: githubModelsModel
+    googleVertexAIModel: googleVertexAIModel
+    googleVertexAIApiKey: googleVertexAIApiKey
     ollamaModel: ollamaModel
     anthropicModel: anthropicModel
     anthropicApiKey: anthropicApiKey

infra/main.parameters.json
Lines changed: 6 additions & 0 deletions

@@ -38,6 +38,12 @@
     "githubModelsModel": {
       "value": "${GH_MODELS_MODEL=openai/gpt-4o-mini}"
     },
+    "googleVertexAIModel": {
+      "value": "${GOOGLE_VERTEX_AI_MODEL}"
+    },
+    "googleVertexAIApiKey": {
+      "value": "${GOOGLE_VERTEX_AI_API_KEY}"
+    },
     "huggingFaceModel": {
       "value": "${HUGGING_FACE_MODEL=hf.co/Qwen/Qwen3-0.6B-GGUF}"
     },
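In `main.parameters.json`, the `${GH_MODELS_MODEL=openai/gpt-4o-mini}` form is azd's environment-variable substitution with a fallback default, while `${GOOGLE_VERTEX_AI_MODEL}` has no default and resolves to an empty string when the variable is unset. A rough Python sketch of that resolution rule (an approximation of azd's documented behavior, not its implementation):

```python
import re

def resolve(value, env):
    """Replace ${NAME} or ${NAME=default} with the env value, the default, or ""."""
    def sub(match):
        name, default = match.group(1), match.group(2) or ""
        return env.get(name, default)
    return re.sub(r"\$\{([A-Z0-9_]+)(?:=([^}]*))?\}", sub, value)

env = {}  # GOOGLE_VERTEX_AI_MODEL is not set
print(resolve("${GH_MODELS_MODEL=openai/gpt-4o-mini}", env))  # openai/gpt-4o-mini
print(resolve("${GOOGLE_VERTEX_AI_MODEL}", env))              # empty string
```

An empty `googleVertexAIModel` is fine here because `resources.bicep` only emits the `GoogleVertexAI__Model` env entry when the parameter is non-empty, letting the app fall back to its default model.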

infra/resources.bicep
Lines changed: 21 additions & 1 deletion

@@ -23,6 +23,9 @@ param azureAIFoundryDeploymentName string = ''
 param githubModelsToken string = ''
 param githubModelsModel string = ''
 // Google Vertex AI
+param googleVertexAIModel string = ''
+@secure()
+param googleVertexAIApiKey string = ''
 // Docker Model Runner - NOT SUPPORTED
 // Foundry Local - NOT SUPPORTED
 // Hugging Face
@@ -246,6 +249,17 @@ var envGitHubModels = (connectorType == '' || connectorType == 'GitHubModels') ?
   }
 ] : []) : []
 // Google Vertex AI
+var envGoogleVertexAI = connectorType == 'GoogleVertexAI' ? concat(googleVertexAIModel != '' ? [
+  {
+    name: 'GoogleVertexAI__Model'
+    value: googleVertexAIModel
+  }
+] : [], googleVertexAIApiKey != '' ? [
+  {
+    name: 'GoogleVertexAI__ApiKey'
+    secretRef: 'google-vertex-ai-api-key'
+  }
+] : []) : []
 // Docker Model Runner - NOT SUPPORTED
 // Foundry Local - NOT SUPPORTED
 // Hugging Face
@@ -348,6 +362,11 @@ module openchatPlaygroundApp 'br/public:avm/res/app/container-app:0.18.1' = {
       name: 'github-models-token'
       value: githubModelsToken
     }
+  ] : [], googleVertexAIApiKey != '' ? [
+    {
+      name: 'google-vertex-ai-api-key'
+      value: googleVertexAIApiKey
+    }
   ] : [], anthropicApiKey != '' ? [
     {
       name: 'anthropic-api-key'
@@ -390,6 +409,7 @@ module openchatPlaygroundApp 'br/public:avm/res/app/container-app:0.18.1' = {
       envAmazonBedrock,
       envAzureAIFoundry,
       envGitHubModels,
+      envGoogleVertexAI,
       envHuggingFace,
       envOllama,
       envAnthropic,
@@ -486,4 +506,4 @@ module ollama 'br/public:avm/res/app/container-app:0.18.1' = if (useOllama == tr
 }

 output AZURE_CONTAINER_REGISTRY_ENDPOINT string = containerRegistry.outputs.loginServer
-output AZURE_RESOURCE_OPENCHAT_PLAYGROUNDAPP_ID string = openchatPlaygroundApp.outputs.resourceId
+output AZURE_RESOURCE_OPENCHAT_PLAYGROUNDAPP_ID string = openchatPlaygroundApp.outputs.resourceId
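The `envGoogleVertexAI` variable above concatenates two conditionally empty arrays, so the model entry and the api-key entry are emitted independently, and only when the selected connector is `GoogleVertexAI`; the key itself travels as a Container Apps secret reference rather than a plain value. A quick Python sketch of the same logic (names mirror the Bicep; values are placeholders):

```python
def build_env_google_vertex_ai(connector_type, model, api_key):
    """Mirror of the Bicep envGoogleVertexAI conditional concat."""
    if connector_type != "GoogleVertexAI":
        return []
    # Model entry only when a model was supplied; otherwise the app default applies.
    model_entries = [{"name": "GoogleVertexAI__Model", "value": model}] if model else []
    # The API key env var points at a secret, mirroring `secretRef` in Bicep.
    key_entries = (
        [{"name": "GoogleVertexAI__ApiKey", "secretRef": "google-vertex-ai-api-key"}]
        if api_key else []
    )
    return model_entries + key_entries

# No model supplied: only the secretRef entry is emitted.
print(build_env_google_vertex_ai("GoogleVertexAI", "", "sk-placeholder"))
```

This is why leaving `GOOGLE_VERTEX_AI_MODEL` unset in the azd environment is safe: the container simply receives no `GoogleVertexAI__Model` variable.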

src/OpenChat.PlaygroundApp/Abstractions/LanguageModelConnector.cs
Lines changed: 1 addition & 0 deletions

@@ -39,6 +39,7 @@ public static async Task<IChatClient> CreateChatClientAsync(AppSettings settings
         ConnectorType.AmazonBedrock => new AmazonBedrockConnector(settings),
         ConnectorType.AzureAIFoundry => new AzureAIFoundryConnector(settings),
         ConnectorType.GitHubModels => new GitHubModelsConnector(settings),
+        ConnectorType.GoogleVertexAI => new GoogleVertexAIConnector(settings),
         ConnectorType.DockerModelRunner => new DockerModelRunnerConnector(settings),
         ConnectorType.FoundryLocal => new FoundryLocalConnector(settings),
         ConnectorType.HuggingFace => new HuggingFaceConnector(settings),
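The factory above maps each `ConnectorType` to its connector via a switch expression, so supporting Google Vertex AI is a single new arm. A minimal Python sketch of the dispatch pattern (the classes are illustrative stand-ins for the C# connectors, trimmed to two entries):

```python
class GitHubModelsConnector:
    def __init__(self, settings):
        self.settings = settings

class GoogleVertexAIConnector:
    def __init__(self, settings):
        self.settings = settings

# One entry per supported connector type, mirroring the switch expression.
CONNECTORS = {
    "GitHubModels": GitHubModelsConnector,
    "GoogleVertexAI": GoogleVertexAIConnector,  # new arm added by this commit
}

def create_connector(connector_type, settings):
    try:
        return CONNECTORS[connector_type](settings)
    except KeyError:
        raise NotImplementedError(f"Connector '{connector_type}' is not supported")

connector = create_connector("GoogleVertexAI", {"GoogleVertexAI:ApiKey": "dummy"})
print(type(connector).__name__)  # GoogleVertexAIConnector
```

Unsupported types fail fast, which matches the commit's test that removed `GoogleVertexAI` from the unsupported-connector cases once the arm existed.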
