Releases · traceloop/openllmetry
v0.45.0 (2025-08-12)
Feat
- datasets: add dataset and datasets functionality (#3247)
Fix
- anthropic: support with_raw_response wrapper for span generation (#3250)
- langchain: fix nesting of langgraph spans (#3206)
- langchain: Add "dont_throw" to "on_llm_end" and remove blank file (#3232)
[main 2708764] bump: version 0.44.3 → 0.45.0
58 files changed, 69 insertions(+), 57 deletions(-)
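The dont_throw referenced in the langchain fix above is a common defensive pattern in instrumentation code: callback hooks are wrapped so that an internal error is logged rather than propagated into the traced application. A minimal sketch of the idea (illustrative only, not necessarily the package's exact implementation):

```python
import functools
import logging

logger = logging.getLogger(__name__)


def dont_throw(func):
    """Swallow and log exceptions so instrumentation never breaks the host app."""

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception:
            # Log at debug level; the traced application keeps running.
            logger.debug("error in %s", func.__name__, exc_info=True)
            return None

    return wrapper
```

Applied to a callback such as on_llm_end, a failure inside the hook yields None instead of an exception in the caller's code path.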
v0.44.0 (2025-08-03)
Feat
- sdk: support multiple span processors (#3207)
- semantic-conventions-ai: add LLMVendor enum to semantic conventions (#3170)
Fix
- langchain: spans dictionary memory leak (#3216)
- openai-agents: use framework's context to infer trace (#3215)
- sdk: respect truncation otel environment variable (#3212)
- anthropic: async stream manager (#3220)
- langchain: populate metadata as span attributes in batch operations (#3218)
- anthropic: various fixes around tools parsing (#3204)
- qdrant: fix qdrant-client auto instrumentation condition (#3208)
- instrumentation: remove param enrich_token_usage and simplify token calculation (#3205)
- langchain: ensure llm spans are created for sync cases (#3201)
- openai: support for openai non-consumed streams (#3155)
[main cfe309d] bump: version 0.43.1 → 0.44.0
58 files changed, 77 insertions(+), 57 deletions(-)
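The multiple-span-processors feature in the sdk entry above mirrors the standard OpenTelemetry idea of fanning each span lifecycle event out to several processors (e.g. one exporting to the console, one to a backend). A plain-Python sketch of that fan-out (class and method names here are illustrative, not the SDK's actual API):

```python
class CompositeSpanProcessor:
    """Forward each ended span to every registered child processor."""

    def __init__(self, *processors):
        self._processors = list(processors)

    def on_end(self, span):
        # Every processor sees the same span, in registration order.
        for processor in self._processors:
            processor.on_end(span)


class CollectingProcessor:
    """Toy processor that just records the span names it receives."""

    def __init__(self):
        self.spans = []

    def on_end(self, span):
        self.spans.append(span)


console_like = CollectingProcessor()
otlp_like = CollectingProcessor()
fanout = CompositeSpanProcessor(console_like, otlp_like)
fanout.on_end("llm.completion")  # both processors receive the span
```

In the real SDK the processors would be OpenTelemetry SpanProcessor instances; the fan-out shape is the same.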
v0.43.0 (2025-07-22)
Feat
- prompts: add tool function support (#3153)
Fix
- llamaindex: structured llm model and temperature parsing (#3159)
- langchain: report token usage histogram (#3059)
- openai: prioritize api-provided token over tiktoken calculation (#3142)
- milvus: Add metrics support (#3013)
[main 6704f15] bump: version 0.42.0 → 0.43.0
58 files changed, 70 insertions(+), 57 deletions(-)
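The openai fix above (#3142) encodes a simple precedence rule: when the API response already reports token usage, use it; only fall back to a local tiktoken-style estimate when it is absent. A hedged sketch of that precedence (function and parameter names are hypothetical):

```python
def resolve_prompt_tokens(api_usage, prompt, estimate_fn=len):
    """Prefer the provider-reported token count over local estimation.

    api_usage: token count from the API response, or None if absent.
    estimate_fn: local fallback estimator (stands in for tiktoken here).
    """
    if api_usage is not None:
        return api_usage
    return estimate_fn(prompt)


resolve_prompt_tokens(42, "hello")    # API value wins -> 42
resolve_prompt_tokens(None, "hello")  # falls back to the estimator -> 5
```

The point of the precedence is accuracy: the provider's count reflects its actual tokenizer, while any local estimate is an approximation.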