
Commit 0b3a84d

Update README.md
1 parent 373503b commit 0b3a84d

1 file changed: +24 -3 lines changed

README.md

@@ -1,6 +1,27 @@
 # batched
 Rust macro utility for batching expensive async operations.
 
+## What is this?
+`batched` is designed for high-throughput async environments where many small, frequent calls would otherwise overwhelm your system or database. Instead of processing each call individually, it groups calls into batches based on configurable rules (time window, size limit, concurrency), then executes a single batched operation.
+
+This saves resources, reduces contention, and improves efficiency, all while letting callers use the function as if it were unbatched.
+
+You annotate an async function with `#[batched]`, and the macro generates the batching logic automatically.
+
+---
+
+## When it’s useful (and when it’s not)
+
+### ✅ Useful
+- **Database inserts/updates:** Instead of writing one row at a time, batch rows into multi-row `INSERT` or `UPDATE` statements.
+- **External API calls with rate limits:** Reduce request overhead by batching multiple logical calls into one HTTP request.
+- **Expensive computations:** Group repeated small computations into a single parallel-friendly call.
+- **Services with bursts of traffic:** Smooth out request spikes by accumulating calls into fewer batch operations.
+
+### ❌ Not useful
+- **Lightweight or fast operations:** If the work per call is already cheap (e.g. adding two numbers), batching only adds complexity and overhead.
+- **Strong ordering or per-call timing guarantees:** Calls may be delayed slightly while they wait for the batch window.
+
 ## Installation
 ```sh
 cargo add batched
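The flush-on-limit-or-window behavior described above can be sketched without the macro. Below is a minimal std-only illustration of the pattern (synchronous threads and channels stand in for the crate's async machinery; this is not the crate's actual generated code):

```rust
use std::sync::mpsc;
use std::thread;
use std::time::{Duration, Instant};

/// Collect items from `rx` into batches: flush when `limit` items have
/// accumulated, when `window` has elapsed since the first buffered item,
/// or when the sender hangs up with items still buffered.
fn collect_batches(rx: mpsc::Receiver<u32>, limit: usize, window: Duration) -> Vec<Vec<u32>> {
    let mut batches = Vec::new();
    let mut buf: Vec<u32> = Vec::new();
    let mut deadline: Option<Instant> = None;
    loop {
        let timeout = match deadline {
            Some(d) => d.saturating_duration_since(Instant::now()),
            None => Duration::from_secs(1), // idle wait while the buffer is empty
        };
        match rx.recv_timeout(timeout) {
            Ok(item) => {
                if buf.is_empty() {
                    // the window starts at the first call of the batch
                    deadline = Some(Instant::now() + window);
                }
                buf.push(item);
                if buf.len() >= limit {
                    batches.push(std::mem::take(&mut buf)); // limit reached
                    deadline = None;
                }
            }
            Err(mpsc::RecvTimeoutError::Timeout) => {
                if buf.is_empty() {
                    break; // nothing arrived at all in this sketch: stop
                }
                batches.push(std::mem::take(&mut buf)); // window elapsed
                deadline = None;
            }
            Err(mpsc::RecvTimeoutError::Disconnected) => {
                if !buf.is_empty() {
                    batches.push(buf); // flush the remainder
                }
                break;
            }
        }
    }
    batches
}

fn main() {
    let (tx, rx) = mpsc::channel();
    let producer = thread::spawn(move || {
        for i in 0..10u32 {
            tx.send(i).unwrap();
        }
        // tx is dropped here: the collector drains and exits
    });
    let batches = collect_batches(rx, 4, Duration::from_secs(1));
    producer.join().unwrap();
    // limit = 4 splits ten back-to-back calls into 4 + 4 + 2
    assert_eq!(batches, vec![vec![0, 1, 2, 3], vec![4, 5, 6, 7], vec![8, 9]]);
    println!("{batches:?}");
}
```

Callers still "send one item at a time"; only the collector sees the batch, which is the property the macro preserves for annotated functions.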
@@ -15,8 +36,8 @@ batched = "0.2.7"
 ## #[batched]
 - **limit**: Maximum number of items that can be grouped and processed in a single batch.
 - **concurrent**: Maximum number of concurrent batch tasks running (default: `Infinity`)
-- **window**: Minimum amount of time (in milliseconds) the background thread waits before processing a batch.
-- **window[x]**: Minimum amount of time (in milliseconds) the background thread waits before processing a batch when latest buffer size is <= x
+- **window**: Maximum amount of time (in milliseconds) the background thread waits after the first call before processing a batch.
+- **window[x]**: Maximum amount of time (in milliseconds) the background thread waits after the first call before processing a batch, when the buffer size is <= x
 
 The target function must have a single argument, a vector of items (`Vec<T>`).
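One way to read the `window[x]` rule above: small buffers may wait longer, large buffers flush sooner. A hypothetical sketch of that tier selection follows (the tier values are invented and this is not the macro's generated code; it only illustrates the "smallest matching tier wins" reading):

```rust
use std::time::Duration;

/// Pick the effective flush window for the current buffer size.
/// `tiers` holds (x, window) pairs meaning "wait up to `window` while the
/// buffer holds <= x items"; `default` applies above every tier.
fn effective_window(buffer_len: usize, tiers: &[(usize, Duration)], default: Duration) -> Duration {
    let mut sorted: Vec<(usize, Duration)> = tiers.to_vec();
    sorted.sort_by_key(|(x, _)| *x); // smallest matching tier wins
    sorted
        .into_iter()
        .find(|(x, _)| buffer_len <= *x)
        .map(|(_, w)| w)
        .unwrap_or(default)
}

fn main() {
    // Invented tiers: wait up to 500 ms while <= 10 items are buffered,
    // up to 100 ms while <= 1_000; past that, flush after only 10 ms.
    let tiers = [
        (10, Duration::from_millis(500)),
        (1_000, Duration::from_millis(100)),
    ];
    assert_eq!(effective_window(3, &tiers, Duration::from_millis(10)), Duration::from_millis(500));
    assert_eq!(effective_window(250, &tiers, Duration::from_millis(10)), Duration::from_millis(100));
    assert_eq!(effective_window(5_000, &tiers, Duration::from_millis(10)), Duration::from_millis(10));
    println!("tier selection ok");
}
```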

@@ -123,4 +144,4 @@ async fn service(messages: Vec<String>) -> Result<(), anyhow::Error> {
     let messages: Vec<Row> = insert_message_multiple(messages).await?;
     Ok(())
 }
-```
+```
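The `service` example in the hunk above funnels many messages into one batched insert. The single multi-row statement such a handler would issue can be sketched as below (the `scores` table, its columns, and the builder function are invented for illustration; real code should use the database driver's parameter binding rather than string formatting):

```rust
/// Build one multi-row INSERT from a batch of (name, score) rows.
/// Illustrative only: production code should bind parameters instead of
/// formatting values into the SQL string.
fn build_bulk_insert(rows: &[(String, i64)]) -> String {
    let values: Vec<String> = rows
        .iter()
        // escape single quotes the SQL way: ' -> ''
        .map(|(name, score)| format!("('{}', {})", name.replace('\'', "''"), score))
        .collect();
    format!("INSERT INTO scores (name, score) VALUES {};", values.join(", "))
}

fn main() {
    let batch = vec![("alice".to_string(), 10), ("bob".to_string(), 20)];
    let sql = build_bulk_insert(&batch);
    // one statement instead of one round-trip per row
    assert_eq!(sql, "INSERT INTO scores (name, score) VALUES ('alice', 10), ('bob', 20);");
    println!("{sql}");
}
```

One round-trip per batch instead of one per row is where the "useful" cases in the README earn their savings.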

0 commit comments