
Commit a7c9097

91volt authored and stefanotorneo committed
Update api key management info
1 parent 29f0fd7 commit a7c9097

File tree: 1 file changed, +13 -12 lines changed


src/arduino/app_bricks/cloud_llm/README.md

Lines changed: 13 additions & 12 deletions
@@ -24,16 +24,17 @@ This Brick acts as a gateway to powerful AI models hosted in the cloud. It is de
 
 ### Basic Conversation
 
-This example initializes the Brick with an OpenAI model and performs a simple chat interaction.
+This example initializes the Brick with an OpenAI model and performs a simple chat interaction.
+
+**Note:** The API key is not hardcoded. It is retrieved automatically from the **Brick Configuration** in App Lab.
 
 ```python
 import os
 from arduino.app_bricks.cloud_llm import CloudLLM, CloudModel
 from arduino.app_utils import App
 
-# Initialize the Brick with your API key and preferred model
+# Initialize the Brick (API key is loaded from configuration)
 llm = CloudLLM(
-    api_key="YOUR_OPENAI_API_KEY",
     model=CloudModel.OPENAI_GPT,
     system_prompt="You are a helpful assistant for an IoT device."
 )
@@ -56,8 +57,8 @@ from arduino.app_bricks.cloud_llm import CloudLLM, CloudModel
 from arduino.app_utils import App
 
 # Initialize with memory enabled (keeps last 10 messages)
+# API Key is retrieved automatically from Brick Configuration
 llm = CloudLLM(
-    api_key="YOUR_ANTHROPIC_API_KEY",
     model=CloudModel.ANTHROPIC_CLAUDE
 ).with_memory(max_messages=10)
 
@@ -81,13 +82,13 @@ App.run(chat_loop)
 
 The Brick is initialized with the following parameters:
 
-| Parameter | Type | Default | Description |
-| :-------------- | :-------------------- | :---------------------------- | :-------------------------------------------------------------------------- |
-| `api_key` | `str` | `os.getenv("API_KEY")` | The authentication key for the LLM provider. |
-| `model` | `str` \| `CloudModel` | `CloudModel.ANTHROPIC_CLAUDE` | The specific model to use. Accepts a `CloudModel` enum or its string value. |
-| `system_prompt` | `str` | `""` | A base instruction that defines the AI's behavior and persona. |
-| `temperature` | `float` | `0.7` | Controls randomness. `0.0` is deterministic, `1.0` is creative. |
-| `timeout` | `int` | `30` | Maximum time (in seconds) to wait for a response. |
+| Parameter | Type | Default | Description |
+| :-------------- | :-------------------- | :---------------------------- | :---------------------------------------------------------------------------------------------------------------------------------------- |
+| `api_key` | `str` | `os.getenv("API_KEY")` | The authentication key for the LLM provider. **Recommended:** Set this via the **Brick Configuration** menu in App Lab instead of code. |
+| `model` | `str` \| `CloudModel` | `CloudModel.ANTHROPIC_CLAUDE` | The specific model to use. Accepts a `CloudModel` enum or its string value. |
+| `system_prompt` | `str` | `""` | A base instruction that defines the AI's behavior and persona. |
+| `temperature` | `float` | `0.7` | Controls randomness. `0.0` is deterministic, `1.0` is creative. |
+| `timeout` | `int` | `30` | Maximum time (in seconds) to wait for a response. |
 
 ### Supported Models
 
@@ -105,4 +106,4 @@ You can select a model using the `CloudModel` enum or by passing the correspondi
 - **`chat_stream(message)`**: Returns a generator yielding response tokens as they arrive.
 - **`stop_stream()`**: Interrupts an active streaming generation.
 - **`with_memory(max_messages)`**: Enables history tracking. `max_messages` defines the context window size.
-- **`clear_memory()`**: Resets the conversation history.
+- **`clear_memory()`**: Resets the conversation history.
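For reference, the snippet below is a minimal sketch, not part of this commit, that combines the constructor parameters from the table with the streaming and memory methods listed in the README, under the same assumption that the API key is supplied through the Brick Configuration in App Lab rather than in code. The `chat_loop` wiring and the `/reset` command are illustrative additions, not part of the documented API.

```python
# Minimal sketch (assumption-based, not from the commit): exercises the documented
# constructor parameters plus the streaming/memory methods. The API key is assumed
# to come from the Brick Configuration, as the updated README states.
from arduino.app_bricks.cloud_llm import CloudLLM, CloudModel
from arduino.app_utils import App

llm = CloudLLM(
    model=CloudModel.OPENAI_GPT,        # the table says a plain string id also works
    system_prompt="You are a helpful assistant for an IoT device.",
    temperature=0.7,                    # default per the parameter table
    timeout=30,                         # seconds to wait for a response
).with_memory(max_messages=10)          # keep the last 10 messages as context

def chat_loop():
    # The input handling here is illustrative; only the llm calls follow the README.
    prompt = input("You: ")
    if prompt.strip() == "/reset":
        llm.clear_memory()              # reset the conversation history
        return
    for token in llm.chat_stream(prompt):   # generator yields tokens as they arrive
        print(token, end="", flush=True)
    print()

App.run(chat_loop)
```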
