Commit d624c89

committed
docs: updated example code
1 parent eac7433 commit d624c89

File tree

1 file changed: +64 −4 lines
  • apps/site/pages/en/learn/getting-started


apps/site/pages/en/learn/getting-started/fetch.md

Lines changed: 64 additions & 4 deletions
````diff
@@ -8,13 +8,54 @@ authors: benhalverson, LankyMoose
 
 ## Introduction
 
-[Unidici](httpss://undici.nodejs.org) is an HTTP client libary that powers the fetch API in Node.js. It was written from scratch and does not rely on the built-in HTTP client in Node.js. It includes a number of features that make it a good choice for high-performance applications.
+[Undici](https://undici.nodejs.org) is an HTTP client library that powers the fetch API in Node.js. It was written from scratch and does not rely on the built-in HTTP client in Node.js. It includes a number of features that make it a good choice for high-performance applications.
 
-```mjs
+## Basic GET Usage
+
+```js
 async function main() {
+  // Like the browser fetch API, the default method is GET
   const response = await fetch('https://jsonplaceholder.typicode.com/posts');
   const data = await response.json();
   console.log(data);
+  // returns something like:
+  // {
+  //   userId: 1,
+  //   id: 1,
+  //   title: 'sunt aut facere repellat provident occaecati excepturi optio reprehenderit',
+  //   body: 'quia et suscipit\n' +
+  //     'suscipit recusandae consequuntur expedita et cum\n' +
+  //     'reprehenderit molestiae ut ut quas totam\n' +
+  //     'nostrum rerum est autem sunt rem eveniet architecto'
+  // }
+}
+
+main().catch(console.error);
+```
+
+## Basic POST Usage
+
+```js
+// Data sent from the client to the server
+const body = {
+  title: 'foo',
+  body: 'bar',
+  userId: 1,
+};
+
+async function main() {
+  const response = await fetch('https://jsonplaceholder.typicode.com/posts', {
+    method: 'POST',
+    headers: {
+      'User-Agent': 'undici-stream-example',
+      'Content-Type': 'application/json',
+    },
+    body: JSON.stringify(body),
+  });
+  const data = await response.json();
+  console.log(data);
+  // returns something like:
+  // { title: 'foo', body: 'bar', userId: 1, id: 101 }
 }
 
 main().catch(console.error);
````
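The GET and POST examples in this hunk parse the JSON body without first checking the response status. A minimal sketch of guarding on `response.ok` before parsing; the `handleJson` helper name is illustrative and not part of the commit, and a locally constructed `Response` (a global in Node.js 18+) stands in for a real `fetch()` result so the sketch runs without a network:

```js
// `handleJson` is a hypothetical helper, not part of the commit above.
// It guards on response.ok (status 200-299) before touching the body.
async function handleJson(response) {
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`);
  }
  return response.json();
}

async function main() {
  // A locally constructed Response stands in for a real fetch() result,
  // so this sketch runs without hitting the network.
  const response = new Response(JSON.stringify({ id: 1, title: 'foo' }), {
    status: 200,
    headers: { 'Content-Type': 'application/json' },
  });
  const data = await handleJson(response);
  console.log(data.title); // 'foo'
}

main().catch(console.error);
```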
````diff
@@ -24,15 +65,30 @@ main().catch(console.error);
 
 Undici allows you to customize the Fetch API by providing options to the `fetch` function. For example, you can set custom headers, set the request method, and set the request body. Here is an example of how you can customize the Fetch API with Undici:
 
+The [fetch](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API) function takes two arguments: the URL to fetch and an options object. The options object is the [Request](https://undici.nodejs.org/#/docs/api/Dispatcher?id=parameter-requestoptions) object that you can use to customize the request. The function returns a [Promise](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Using_promises) that resolves to a [Response](https://undici.nodejs.org/#/docs/api/Dispatcher?id=parameter-responsedata) object. One difference between the Fetch API in the browser and the Fetch API in Node.js is that the Node.js version does not support
+
+In the following example, we are sending a POST request to the Ollama API with a JSON payload. Ollama is a CLI tool that allows you to run LLMs (Large Language Models) on your local machine. You can download it [here](https://ollama.com/download).
+
+```bash
+ollama run deepseek-r1:1.5b
+```
+
+This will download the `deepseek-r1:1.5b` model and run it on your local machine.
+
 With a pool, you can reuse connections to the same server, which can improve performance. Here is an example of how you can use a pool with Undici:
 
-```mjs
+```js
 import { Pool } from 'undici';
 
 const ollamaPool = new Pool('http://localhost:11434', {
   connections: 10,
 });
 
+/**
+ * Stream the completion of a prompt using the Ollama API.
+ * @param {string} prompt - The prompt to complete.
+ * @link https://github.com/ollama/ollama/blob/main/docs/api.md
+ **/
 async function streamOllamaCompletion(prompt) {
   const { statusCode, body } = await ollamaPool.request({
     path: '/api/generate',
````
````diff
@@ -43,6 +99,8 @@ async function streamOllamaCompletion(prompt) {
     body: JSON.stringify({ prompt, model: 'deepseek-r1:8b' }),
   });
 
+  // You can read about HTTP status codes here: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status
+  // 200 means the request was successful.
   if (statusCode !== 200) {
     throw new Error(`Ollama request failed with status ${statusCode}`);
   }
````
````diff
@@ -70,7 +128,9 @@ try {
 
 ## Streaming Responses with Undici
 
-```mjs
+[Streams](https://nodejs.org/docs/v22.14.0/api/stream.html#stream) are a feature in Node.js that allows you to read and write chunks of data.
+
+```js
 import { stream } from 'undici';
 import { Writable } from 'stream';
 
````
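As a complement to the `undici.stream` example this hunk introduces: in Node.js, a fetch `Response` body is a web `ReadableStream`, which is async-iterable, so it can also be consumed chunk by chunk with `for await`. A minimal sketch using a locally constructed `Response` so it runs without a network:

```js
// In Node.js, response.body is a web ReadableStream, which is async-iterable:
// each chunk arrives as a Uint8Array and can be decoded incrementally.
async function main() {
  // A locally constructed Response stands in for a real fetch() result.
  const response = new Response('hello from a stream');
  const decoder = new TextDecoder();
  let text = '';
  for await (const chunk of response.body) {
    text += decoder.decode(chunk, { stream: true });
  }
  text += decoder.decode(); // flush any buffered bytes
  console.log(text); // 'hello from a stream'
}

main().catch(console.error);
```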