
Commit 615b3c9

Apply suggestions from code review
1 parent 9fcc244 commit 615b3c9

File tree

1 file changed (+5, -5 lines)


README.md

Lines changed: 5 additions & 5 deletions
@@ -54,7 +54,7 @@ for chunk in stream:

 Run larger models by offloading to Ollama’s cloud while keeping your local workflow.

-- Supported models: `deepseek-v3.1:671b-cloud`, `gpt-oss:20b-cloud`, `gpt-oss:120b-cloud`, `kimi-k2:1t-cloud`, `qwen3-coder:480b-cloud`
+- Supported models: `deepseek-v3.1:671b-cloud`, `gpt-oss:20b-cloud`, `gpt-oss:120b-cloud`, `kimi-k2:1t-cloud`, `qwen3-coder:480b-cloud`, `kimi-k2-thinking`. See [Ollama Models - Cloud](https://ollama.com/search?c=cloud) for more information.

 ### Run via local Ollama

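For context, a minimal sketch of what offloading looks like from the Python client once one of the `-cloud` tags above has been pulled; the prompt and the non-streaming call are illustrative and not part of this diff:

```python
# Illustrative sketch: the same local client call offloads to Ollama's cloud
# when given one of the -cloud model tags listed above.
# Assumes `ollama signin` has been run and the model has been pulled.
from ollama import chat

response = chat(
    model='deepseek-v3.1:671b-cloud',  # any tag from the list above
    messages=[{'role': 'user', 'content': 'Why is the sky blue?'}],
)
print(response.message.content)
```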
@@ -70,7 +70,7 @@ ollama signin
 ollama pull gpt-oss:120b-cloud
 ```

-3) Use as usual (offloads automatically):
+3) Make a request:

 ```python
 from ollama import Client
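The snippet this step edits is split across hunks; assembled, the "Make a request" example looks roughly like the following. The `Client()` construction and the exact prompt fall outside the diff context and are assumptions:

```python
from ollama import Client

# Default client talks to the local daemon, which offloads -cloud models.
client = Client()

messages = [
    {'role': 'user', 'content': 'Why is the sky blue?'},  # placeholder prompt
]

# Stream the response and print tokens as they arrive.
for part in client.chat('gpt-oss:120b-cloud', messages=messages, stream=True):
    print(part.message.content, end='', flush=True)
```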
@@ -85,14 +85,14 @@ messages = [
 ]

 for part in client.chat('gpt-oss:120b-cloud', messages=messages, stream=True):
-  print(part['message']['content'], end='', flush=True)
+  print(part.message.content, end='', flush=True)
 ```

 ### Cloud API (ollama.com)

 Access cloud models directly by pointing the client at `https://ollama.com`.

-1) Create an API key, then set:
+1) Create an API key from [ollama.com](https://ollama.com/settings/keys), then set:

 ```
 export OLLAMA_API_KEY=your_api_key
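For the cloud API path, a rough sketch of wiring the exported key into the client. It assumes the key is sent as a bearer token in the `Authorization` header, which is an assumption rather than something this diff states; adjust the header if the service expects a different scheme:

```python
import os

from ollama import Client

# Point the client at ollama.com instead of a local daemon.
# Assumption: the API key from OLLAMA_API_KEY is passed as a bearer token.
client = Client(
    host='https://ollama.com',
    headers={'Authorization': 'Bearer ' + os.environ['OLLAMA_API_KEY']},
)

messages = [
    {'role': 'user', 'content': 'Why is the sky blue?'},  # placeholder prompt
]

for part in client.chat('gpt-oss:120b', messages=messages, stream=True):
    print(part.message.content, end='', flush=True)
```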
@@ -123,7 +123,7 @@ messages = [
 ]

 for part in client.chat('gpt-oss:120b', messages=messages, stream=True):
-  print(part['message']['content'], end='', flush=True)
+  print(part.message.content, end='', flush=True)
 ```

 ## Custom client
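The `print` changes in the last two hunks switch the README from dict-style to attribute-style access on response objects. For illustration only (model and prompt are placeholders), the same pattern on a non-streaming call:

```python
from ollama import chat

# Responses are typed objects, so the fields touched by this diff are read
# as attributes rather than dict keys.
response = chat('gpt-oss:120b-cloud', messages=[{'role': 'user', 'content': 'Hello'}])
print(response.message.role)     # 'assistant'
print(response.message.content)  # the reply text
```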
