## LLM API Setup
Currently we rely on [LiteLLM](https://github.com/BerriAI/litellm) or [AutoGen v0.2](https://github.com/microsoft/autogen/tree/0.2) for LLM caching and API-Key management.
By default, LiteLLM is used. To change the default backend, set the environment variable `TRACE_DEFAULT_LLM_BACKEND` in the terminal
```bash
export TRACE_DEFAULT_LLM_BACKEND="<your LLM backend here>"  # 'LiteLLM' or 'AutoGen'
```
or in Python before importing `opto`
```python
import os
os.environ["TRACE_DEFAULT_LLM_BACKEND"] ="<your LLM backend here>"# 'LiteLLM' or 'AutoGen`
import opto
```
### Using LiteLLM as Backend
Set the keys as the environment variables, following the [documentation of LiteLLM](https://docs.litellm.ai/docs/providers). For example,
os.environ["OPENAI_API_KEY"] ="<your OpenAI API key here>"
298
+
os.environ["ANTHROPIC_API_KEY"] ="<your Anthropic API key here>"
299
+
```
300
+
For convenience, Trace adds another environment variable, `TRACE_LITELLM_MODEL`, to set the default model name used by LiteLLM. For example,
```bash
export TRACE_LITELLM_MODEL='gpt-4o'
```
This will set all LLM instances in Trace to use `gpt-4o` by default.
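As with the backend selection above, this variable can also be set in Python before importing `opto`. Here is a minimal sketch, assuming `TRACE_LITELLM_MODEL` is read at import time just like `TRACE_DEFAULT_LLM_BACKEND`:

```python
import os

# Assumed pattern, mirroring the backend example above: set the default
# LiteLLM model before importing opto so all LLM instances pick it up.
os.environ["TRACE_LITELLM_MODEL"] = "gpt-4o"

import opto
```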
### Using AutoGen as Backend
First, install Trace with the autogen flag: `pip install trace-opt[autogen]`. AutoGen relies on `OAI_CONFIG_LIST`, a file you put in your working directory. It has the following format:
```json lines
[
    {
        "model": "gpt-4",
        "api_key": "<your OpenAI API key here>"
    }
]
```
You can switch between different LLM models by changing the `model` field in this configuration file.
323
+
Note that AutoGen will by default use the first model available in this config file.
You can also set the environment variable `OAI_CONFIG_LIST` to point to the location of this file, or directly set a JSON string as its value.
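For example, either option could look like the following (a minimal sketch; the file path and JSON contents are placeholders):

```python
import os

# Option 1: point OAI_CONFIG_LIST to the location of the config file
# (hypothetical path).
os.environ["OAI_CONFIG_LIST"] = "path/to/OAI_CONFIG_LIST"

# Option 2: set a JSON string directly as the value of the variable.
os.environ["OAI_CONFIG_LIST"] = '[{"model": "gpt-4", "api_key": "<your OpenAI API key here>"}]'
```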