Commit ea7c84c

qowehsikutisa and sikutisa authored
Connector Implementation & Inheritance: Anthropic Claude (#499)
* refactor: split UI logic in ChatHeader.razor
* test: add integration tests for NewChat button and icon visibility
* feat: add support for Anthropic API integration and update documentation
* feat: add documentation for running OpenChat Playground with Anthropic Claude integration
* feat: update default model references to Claude Sonnet 4 and adjust alternative model options
* feat: add support for the Anthropic connector in LanguageModelConnector
* feat: implement the Anthropic connector for LanguageModel integration
* feat: enhance AnthropicConnector with improved error handling and add unit tests
* docs: add documentation for the Anthropic connector. Adds links in the README covering local machine, local container, and Azure deployments. Related to #261
* docs: update documentation to refer to Anthropic models instead of the specific Claude model, reflecting the broader range of models now supported. Relates to #261
* feat: refactor GetChatClientAsync to improve API client initialization and ensure proper function invocation
* feat: add the `MaxTokens` parameter to the Anthropic section in appsettings.json (Microsoft.Extensions.Configuration.Binder automatically converts string values to integers when binding configuration). Related to #342
* feat: add the `MaxTokens` property to AnthropicSettings. Related to #258
* fix: temporarily disable Anthropic. Updates LanguageModelConnector to include Anthropic in the unsupported list; it will be re-enabled when the Anthropic PR is merged
* test: skip all AnthropicConnectorTests until the connector is enabled, to avoid CI failures
* refactor: update documentation to remove "Claude" references from Anthropic settings
* feat: add the `MaxTokens` option to AnthropicArgumentOptions and update the parsing logic (ArgumentOptions.cs, AnthropicArgumentOptions.cs, AnthropicArgumentOptionsTests.cs). Related to #259
* feat: add a constant for the `--max-tokens` command-line argument
* feat: enforce the `MaxTokens` configuration in AnthropicConnector. Validates that the setting is present and a positive integer to prevent runtime errors from missing or invalid configurations, and uses it for the `MaxOutputTokens` option
* refactor: remove a redundant comment that restates the purpose of the `MaxTokens` property
* refactor: remove whitespace
* refactor: revert an unintended formatting change
* feat: improve Anthropic connector validation. Checks both the presence and validity of the `MaxTokens` setting, throwing an exception if it is missing or invalid
* feat: update Anthropic tests for the max tokens config. Uses an integer type for the max tokens configuration and environment variables, improving type safety and removing the need for string parsing
* Revert "feat: Update Anthropic tests for max tokens config" (reverts commit 4cbff1f)
* Add AnthropicConnector (#2): add AnthropicConnector (related to #261), add bicep, remove whitespace

Co-authored-by: sikutisa <[email protected]>
Co-authored-by: sikutisa <[email protected]>
1 parent 853a339 commit ea7c84c

File tree

17 files changed (+918, -64 lines)


.github/workflows/azure-dev.yml

Lines changed: 2 additions & 0 deletions

```diff
@@ -31,6 +31,7 @@ jobs:
       AZURE_ENV_NAME: ${{ vars.AZURE_ENV_NAME }}
       AZURE_LOCATION: ${{ vars.AZURE_LOCATION }}
       GH_MODELS_TOKEN: ${{ secrets.GH_MODELS_TOKEN }}
+      ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
       OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}

     steps:
@@ -149,6 +150,7 @@ jobs:
         shell: bash
         env:
           GitHubModels__Token: ${{ env.GH_MODELS_TOKEN }}
+          Anthropic__ApiKey: ${{ env.ANTHROPIC_API_KEY }}
           OpenAI__ApiKey: ${{ env.OPENAI_API_KEY }}
         run: |
           azd provision --no-prompt
```
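The `Anthropic__ApiKey` name in the workflow is the environment-variable form of the `Anthropic:ApiKey` configuration key: .NET's configuration system treats a double underscore in an environment variable name as the `:` section separator. A quick shell sketch of that mapping (the variable name is taken from the diff above; the transformation itself is what .NET performs internally):

```shell
# .NET's configuration system maps "__" in environment variable names
# to the ":" section separator, so Anthropic__ApiKey binds to the
# Anthropic:ApiKey key used elsewhere with `dotnet user-secrets`.
name='Anthropic__ApiKey'
key=$(printf '%s' "$name" | sed 's/__/:/g')
echo "$key"   # prints "Anthropic:ApiKey"
```

This is why the workflow can feed a CI secret to the same setting that local development stores via user secrets.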

README.md

Lines changed: 4 additions & 1 deletion

```diff
@@ -14,7 +14,7 @@ Open Chat Playground (OCP) is a web UI that is able to connect virtually any LLM
 - [x] [Foundry Local](https://learn.microsoft.com/azure/ai-foundry/foundry-local/what-is-foundry-local)
 - [x] [Hugging Face](https://huggingface.co/docs)
 - [x] [Ollama](https://github.com/ollama/ollama/tree/main/docs)
-- [ ] [Anthropic](https://docs.anthropic.com)
+- [x] [Anthropic](https://docs.anthropic.com)
 - [ ] [Naver](https://api.ncloud-docs.com/docs/ai-naver-clovastudio-summary)
 - [x] [LG](https://github.com/LG-AI-EXAONE)
 - [x] [OpenAI](https://openai.com/api)
@@ -67,6 +67,7 @@ Open Chat Playground (OCP) is a web UI that is able to connect virtually any LLM
 - [Use Foundry Local](./docs/foundry-local.md#run-on-local-machine)
 - [Use Hugging Face](./docs/hugging-face.md#run-on-local-machine)
 - [Use Ollama](./docs/ollama.md#run-on-local-machine)
+- [Use Anthropic](./docs/anthropic.md#run-on-local-machine)
 - [Use LG](./docs/lg.md#run-on-local-machine)
 - [Use OpenAI](./docs/openai.md#run-on-local-machine)
 - [Use Upstage](./docs/upstage.md#run-on-local-machine)
@@ -80,6 +81,7 @@ Open Chat Playground (OCP) is a web UI that is able to connect virtually any LLM
 - ~~Use Foundry Local~~ 👉 NOT SUPPORTED
 - [Use Hugging Face](./docs/hugging-face.md#run-in-local-container)
 - [Use Ollama](./docs/ollama.md#run-on-local-container)
+- [Use Anthropic](./docs/anthropic.md#run-on-local-container)
 - [Use LG](./docs/lg.md#run-in-local-container)
 - [Use OpenAI](./docs/openai.md#run-in-local-container)
 - [Use Upstage](./docs/upstage.md#run-in-local-container)
@@ -93,6 +95,7 @@ Open Chat Playground (OCP) is a web UI that is able to connect virtually any LLM
 - ~~Use Foundry Local~~ 👉 NOT SUPPORTED
 - [Use Hugging Face](./docs/hugging-face.md#run-on-azure)
 - [Use Ollama](./docs/ollama.md#run-on-azure)
+- [Use Anthropic](./docs/anthropic.md#run-on-azure)
 - [Use LG](./docs/lg.md#run-on-azure)
 - [Use OpenAI](./docs/openai.md#run-on-azure)
 - [Use Upstage](./docs/upstage.md#run-on-azure)
```

docs/README.md

Lines changed: 1 addition & 0 deletions

```diff
@@ -7,6 +7,7 @@
 - [Foundry Local](./foundry-local.md)
 - [Hugging Face](./hugging-face.md)
 - [Ollama](./ollama.md)
+- [Anthropic](./anthropic.md)
 - [LG](./lg.md)
 - [OpenAI](./openai.md)
 - [Upstage](./upstage.md)
```

docs/anthropic.md

Lines changed: 294 additions & 0 deletions (new file)

# OpenChat Playground with Anthropic

This page describes how to run OpenChat Playground (OCP) with [Anthropic models](https://docs.claude.com/en/docs/about-claude/models) integration.

## Get the repository root

1. Get the repository root.

    ```bash
    # bash/zsh
    REPOSITORY_ROOT=$(git rev-parse --show-toplevel)
    ```

    ```powershell
    # PowerShell
    $REPOSITORY_ROOT = git rev-parse --show-toplevel
    ```

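As a sanity check, `git rev-parse --show-toplevel` always resolves to the directory that contains `.git`, no matter how deep in the working tree you run it. A throwaway-repository sketch (assumes `git` is installed; the temporary directory is illustrative):

```shell
# Create a disposable repository and confirm --show-toplevel
# resolves to its root directory (the one containing .git).
tmp=$(mktemp -d)
git -C "$tmp" init -q
root=$(git -C "$tmp" rev-parse --show-toplevel)
[ -d "$root/.git" ] && echo "repository root: $root"
```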
## Run on local machine

1. Make sure you are at the repository root.

    ```bash
    cd $REPOSITORY_ROOT
    ```

1. Add the Anthropic API key for the Claude connection. Make sure to replace `{{ANTHROPIC_API_KEY}}` with your Anthropic API key.

    ```bash
    # bash/zsh
    dotnet user-secrets --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp \
        set Anthropic:ApiKey "{{ANTHROPIC_API_KEY}}"
    ```

    ```powershell
    # PowerShell
    dotnet user-secrets --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp `
        set Anthropic:ApiKey "{{ANTHROPIC_API_KEY}}"
    ```

    > For more details about Anthropic API keys, refer to the [Anthropic API documentation](https://docs.anthropic.com/claude/reference/getting-started-with-the-api).

1. Run the app. The default model OCP uses is [Claude Sonnet 4.5](https://www.anthropic.com/claude/sonnet).

    ```bash
    # bash/zsh
    dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- \
        --connector-type Anthropic
    ```

    ```powershell
    # PowerShell
    dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- `
        --connector-type Anthropic
    ```

    Alternatively, if you want to run with a different model, say [Claude Opus 4.1](http://www.anthropic.com/claude-opus-4-1-system-card), you can specify it as an argument:

    ```bash
    # bash/zsh
    dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- \
        --connector-type Anthropic \
        --model claude-opus-4-1
    ```

    ```powershell
    # PowerShell
    dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- `
        --connector-type Anthropic `
        --model claude-opus-4-1
    ```

    By default, the app limits the model's response to [512 tokens](https://docs.claude.com/en/docs/about-claude/models/overview). If you want a different limit, you can specify it as an argument:

    ```bash
    # bash/zsh
    dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- \
        --connector-type Anthropic \
        --max-tokens 2048
    ```

    ```powershell
    # PowerShell
    dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- `
        --connector-type Anthropic `
        --max-tokens 2048
    ```

1. Open your web browser, navigate to `http://localhost:5280`, and enter prompts.

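Per the commit, the connector validates that `MaxTokens` is present and a positive integer, and fails fast otherwise. The actual check lives in the C# `AnthropicConnector`; here is a hypothetical shell mirror of the same rule, useful for pre-validating a value before passing it to `--max-tokens`:

```shell
# Hypothetical mirror of the connector's MaxTokens rule:
# the value must be present and a positive integer.
validate_max_tokens() {
  case "$1" in
    ''|*[!0-9]*|0) echo "invalid" ;;
    *) echo "valid" ;;
  esac
}
validate_max_tokens 2048   # prints "valid"
validate_max_tokens 0      # prints "invalid"
```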
## Run in local container

1. Make sure you are at the repository root.

    ```bash
    cd $REPOSITORY_ROOT
    ```

1. Build a container.

    ```bash
    docker build -f Dockerfile -t openchat-playground:latest .
    ```

1. Get the Anthropic API key.

    ```bash
    # bash/zsh
    API_KEY=$(dotnet user-secrets --project ./src/OpenChat.PlaygroundApp list --json | \
        sed -n '/^\/\//d; p' | jq -r '."Anthropic:ApiKey"')
    ```

    ```powershell
    # PowerShell
    $API_KEY = (dotnet user-secrets --project ./src/OpenChat.PlaygroundApp list --json | `
        Select-String -NotMatch '^//(BEGIN|END)' | ConvertFrom-Json).'Anthropic:ApiKey'
    ```

1. Run the app. The default model OCP uses is [Claude Sonnet 4.5](https://www.anthropic.com/claude/sonnet).

    ```bash
    # bash/zsh - from locally built container
    docker run -i --rm -p 8080:8080 openchat-playground:latest --connector-type Anthropic \
        --api-key $API_KEY
    ```

    ```powershell
    # PowerShell - from locally built container
    docker run -i --rm -p 8080:8080 openchat-playground:latest --connector-type Anthropic `
        --api-key $API_KEY
    ```

    ```bash
    # bash/zsh - from GitHub Container Registry
    docker run -i --rm -p 8080:8080 ghcr.io/aliencube/open-chat-playground/openchat-playground:latest \
        --connector-type Anthropic \
        --api-key $API_KEY
    ```

    ```powershell
    # PowerShell - from GitHub Container Registry
    docker run -i --rm -p 8080:8080 ghcr.io/aliencube/open-chat-playground/openchat-playground:latest `
        --connector-type Anthropic `
        --api-key $API_KEY
    ```

    Alternatively, if you want to run with a different model, say [Claude Opus 4.1](http://www.anthropic.com/claude-opus-4-1-system-card), you can specify it as an argument:

    ```bash
    # bash/zsh - from locally built container
    docker run -i --rm -p 8080:8080 openchat-playground:latest --connector-type Anthropic \
        --api-key $API_KEY \
        --model claude-opus-4-1
    ```

    ```powershell
    # PowerShell - from locally built container
    docker run -i --rm -p 8080:8080 openchat-playground:latest --connector-type Anthropic `
        --api-key $API_KEY `
        --model claude-opus-4-1
    ```

    ```bash
    # bash/zsh - from GitHub Container Registry
    docker run -i --rm -p 8080:8080 ghcr.io/aliencube/open-chat-playground/openchat-playground:latest \
        --connector-type Anthropic \
        --api-key $API_KEY \
        --model claude-opus-4-1
    ```

    ```powershell
    # PowerShell - from GitHub Container Registry
    docker run -i --rm -p 8080:8080 ghcr.io/aliencube/open-chat-playground/openchat-playground:latest `
        --connector-type Anthropic `
        --api-key $API_KEY `
        --model claude-opus-4-1
    ```

    By default, the app limits the model's response to [512 tokens](https://docs.claude.com/en/docs/about-claude/models/overview). If you want a different limit, you can specify it as an argument:

    ```bash
    # bash/zsh - from locally built container
    docker run -i --rm -p 8080:8080 openchat-playground:latest --connector-type Anthropic \
        --api-key $API_KEY \
        --max-tokens 2048
    ```

    ```powershell
    # PowerShell - from locally built container
    docker run -i --rm -p 8080:8080 openchat-playground:latest --connector-type Anthropic `
        --api-key $API_KEY `
        --max-tokens 2048
    ```

    ```bash
    # bash/zsh - from GitHub Container Registry
    docker run -i --rm -p 8080:8080 ghcr.io/aliencube/open-chat-playground/openchat-playground:latest \
        --connector-type Anthropic \
        --api-key $API_KEY \
        --max-tokens 2048
    ```

    ```powershell
    # PowerShell - from GitHub Container Registry
    docker run -i --rm -p 8080:8080 ghcr.io/aliencube/open-chat-playground/openchat-playground:latest `
        --connector-type Anthropic `
        --api-key $API_KEY `
        --max-tokens 2048
    ```

1. Open your web browser, navigate to `http://localhost:8080`, and enter prompts.

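The key-extraction pipeline above works because `dotnet user-secrets list --json` wraps its JSON output in `//BEGIN` and `//END` comment lines, which must be stripped before a JSON parser sees it. A sketch against a mocked copy of that output (assumes `jq` is installed; the key value is a placeholder):

```shell
# Mock of `dotnet user-secrets list --json` output: the JSON payload
# is wrapped in //BEGIN and //END lines that jq cannot parse directly,
# so the comment lines are removed with sed first.
mock='//BEGIN
{
  "Anthropic:ApiKey": "sk-ant-demo"
}
//END'
API_KEY=$(printf '%s\n' "$mock" | sed '/^\/\//d' | jq -r '."Anthropic:ApiKey"')
echo "$API_KEY"   # prints "sk-ant-demo"
```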
## Run on Azure

1. Make sure you are at the repository root.

    ```bash
    cd $REPOSITORY_ROOT
    ```

1. Login to Azure.

    ```bash
    azd auth login
    ```

1. Check login status.

    ```bash
    azd auth login --check-status
    ```

1. Initialize the `azd` template.

    ```bash
    azd init
    ```

    > **NOTE**: You will be asked to provide an environment name for provisioning.

1. Get the Anthropic API key.

    ```bash
    # bash/zsh
    API_KEY=$(dotnet user-secrets --project ./src/OpenChat.PlaygroundApp list --json | \
        sed -n '/^\/\//d; p' | jq -r '."Anthropic:ApiKey"')
    ```

    ```powershell
    # PowerShell
    $API_KEY = (dotnet user-secrets --project ./src/OpenChat.PlaygroundApp list --json | `
        Select-String -NotMatch '^//(BEGIN|END)' | ConvertFrom-Json).'Anthropic:ApiKey'
    ```

1. Set the Anthropic API key in the azd environment variables.

    ```bash
    azd env set ANTHROPIC_API_KEY $API_KEY
    ```

    The default model OCP uses is [Claude Sonnet 4.5](https://www.anthropic.com/claude/sonnet). If you want to run with a different model, say [Claude Opus 4.1](http://www.anthropic.com/claude-opus-4-1-system-card), add it to the azd environment variables.

    ```bash
    azd env set ANTHROPIC_MODEL claude-opus-4-1
    ```

    By default, the app limits the model's response to [512 tokens](https://docs.claude.com/en/docs/about-claude/models/overview). If you want a different limit, add it to the azd environment variables.

    ```bash
    azd env set ANTHROPIC_MAX_TOKENS 2048
    ```

1. Set the connector type to `Anthropic`.

    ```bash
    azd env set CONNECTOR_TYPE Anthropic
    ```

1. Run the following command to provision and deploy the app.

    ```bash
    azd up
    ```

    > **NOTE**: You will be asked to provide an Azure subscription and location for deployment.

    Once deployed, you will see the deployed OCP app URL.

1. Open your web browser, navigate to the OCP app URL, and enter prompts.

1. Clean up all the resources.

    ```bash
    azd down --force --purge
    ```
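For context, `azd env set` persists each value per environment as a `KEY="value"` line in a local env file (under `.azure/<environment-name>/`), which `azd up` later feeds into the infrastructure parameters. An illustrative mock of that store and a read-back (the path and values here are placeholders, not your real environment):

```shell
# Illustrative mock of an azd environment file: `azd env set KEY value`
# appends KEY="value" lines, which later substitute into main.parameters.json.
tmp=$(mktemp -d)
printf 'ANTHROPIC_MODEL="claude-opus-4-1"\nANTHROPIC_MAX_TOKENS="2048"\n' > "$tmp/.env"
MODEL=$(grep '^ANTHROPIC_MODEL=' "$tmp/.env" | cut -d'"' -f2)
echo "$MODEL"   # prints "claude-opus-4-1"
```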

infra/main.bicep

Lines changed: 8 additions & 0 deletions

```diff
@@ -35,6 +35,11 @@ param huggingFaceModel string = ''
 // Ollama
 param ollamaModel string = ''
 // Anthropic
+@secure()
+param anthropicApiKey string = ''
+param anthropicModel string = ''
+@minValue(1)
+param anthropicMaxTokens int
 // LG
 param lgModel string = ''
 // Naver - NOT SUPPORTED
@@ -99,6 +104,9 @@ module resources 'resources.bicep' = {
     githubModelsToken: githubModelsToken
     githubModelsModel: githubModelsModel
     ollamaModel: ollamaModel
+    anthropicModel: anthropicModel
+    anthropicApiKey: anthropicApiKey
+    anthropicMaxTokens: anthropicMaxTokens
     lgModel: lgModel
     openAIModel: openAIModel
     openAIApiKey: openAIApiKey
```

infra/main.parameters.json

Lines changed: 9 additions & 0 deletions

```diff
@@ -44,6 +44,15 @@
     "ollamaModel": {
       "value": "${OLLAMA_MODEL=llama3.2}"
     },
+    "anthropicApiKey": {
+      "value": "${ANTHROPIC_API_KEY}"
+    },
+    "anthropicModel": {
+      "value": "${ANTHROPIC_MODEL=claude-sonnet-4-5}"
+    },
+    "anthropicMaxTokens": {
+      "value": "${ANTHROPIC_MAX_TOKENS=512}"
+    },
     "lgModel": {
       "value": "${LG_MODEL=hf.co/LGAI-EXAONE/EXAONE-4.0-1.2B-GGUF}"
     },
```
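The `${ANTHROPIC_MAX_TOKENS=512}` syntax in the parameters file falls back to `512` when the azd environment variable is unset, which is why `anthropicMaxTokens` can be a required Bicep parameter without forcing every user to set it. POSIX shell has an analogous default-substitution form, sketched here:

```shell
# Shell analogue of azd's "${ANTHROPIC_MAX_TOKENS=512}" parameter default:
# ${VAR:-fallback} yields the fallback when VAR is unset or empty.
unset ANTHROPIC_MAX_TOKENS
MAX_TOKENS="${ANTHROPIC_MAX_TOKENS:-512}"
echo "$MAX_TOKENS"   # prints "512"
```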

0 commit comments
