
Conversation

@adityamehra (Contributor) commented Nov 6, 2025

This builds on top of #29

@adityamehra requested review from a team as code owners November 6, 2025 19:08
return True

# PRIORITY 2: Check for explicit LLM span kind (even without messages, for compatibility)
if span_kind == "llm":

# PRIORITY 3: Detect ReAct agent/task spans by kind
# These are agent workflows that contain LLM calls
if span_kind in ["agent", "task", "workflow"]:
@adityamehra (Contributor, Author) commented:

Ideally, evals should be limited to LLM spans.
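
Roughly the gate I have in mind, as a sketch (the helper name and the bare `span_kind` parameter are placeholders, not the current API):

```python
def _should_evaluate(span_kind: str) -> bool:
    # Only direct LLM/chat-completion spans should be evaluated; "agent",
    # "task", and "workflow" spans wrap LLM calls rather than being one.
    return span_kind == "llm"
```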

# ------------------------------------------------------------------
# Internal helpers
# ------------------------------------------------------------------
def _is_llm_span(self, span: ReadableSpan) -> bool:
@adityamehra (Contributor, Author) commented:

Ideally, this method should return True only for spans that are direct LLM calls or chat calls to the model.
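
A sketch of the narrower predicate, written as the method body (assuming the span kind is carried in Traceloop's `traceloop.span.kind` attribute; treat that attribute name as an assumption):

```python
from opentelemetry.sdk.trace import ReadableSpan

def _is_llm_span(self, span: ReadableSpan) -> bool:
    attributes = span.attributes or {}
    span_kind = str(attributes.get("traceloop.span.kind", "")).lower()
    # True only for direct LLM/chat calls with the model; agent, task,
    # and workflow wrappers are excluded.
    return span_kind == "llm"
```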

# Mapping from original span_id to translated INVOCATION (not span) for parent-child relationship preservation
self._original_to_translated_invocation: Dict[int, Any] = {}
# Buffer spans to process them in the correct order (parents before children)
self._span_buffer: List[ReadableSpan] = []
@adityamehra (Contributor, Author) commented:

Look into the span buffer logic

@adityamehra (Contributor, Author) commented:

Remove the buffer logic if it is not required.
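
For context while deciding, a sketch of what the buffer appears to exist for, namely translating parents before children (`_flush_buffer` and `_translate_span` are hypothetical names):

```python
def _flush_buffer(self) -> None:
    # A parent span always starts before the spans it contains, so
    # sorting by start time is a cheap way to translate parents first,
    # letting each child find its parent's translated invocation.
    for span in sorted(self._span_buffer, key=lambda s: s.start_time or 0):
        self._translate_span(span)
    self._span_buffer.clear()
```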

# STEP 2: Check if this is an LLM span that needs evaluation
if self._is_llm_span(span):
_logger.debug(
"🔍 TRACELOOP PROCESSOR: LLM span '%s' detected! Processing immediately for evaluations",
@adityamehra (Contributor, Author) commented:

Remove the emojis from the log messages.
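
Something like this, keeping the log message useful (the excerpt truncates the call, so the `span.name` argument is an assumption):

```python
_logger.debug(
    "Traceloop processor: LLM span '%s' detected; processing immediately for evaluations",
    span.name,
)
```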

"Failed to stop LLM invocation: %s", stop_err
)
else:
# Non-LLM spans (tasks, workflows, tools) - buffer for optional batch processing
@adityamehra (Contributor, Author) commented:

Revisit this logic
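
For reference while revisiting, a sketch of the branch as I read it (the helper name is hypothetical):

```python
def _handle_non_llm_span(self, span: ReadableSpan) -> None:
    # Tasks, workflows, and tools are not evaluated directly; buffer
    # them so they can optionally be processed as a batch later.
    self._span_buffer.append(span)
```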

@adityamehra enabled auto-merge (squash) November 6, 2025 21:02
@adityamehra force-pushed the feature/single-tl-processor branch from beb41f3 to 8c60348 November 6, 2025 21:09
@adityamehra disabled auto-merge November 6, 2025 21:21
@adityamehra force-pushed the feature/single-tl-processor branch from 3ca1998 to fbad29e November 6, 2025 22:15
@adityamehra force-pushed the feature/single-tl-processor branch from fbad29e to b192589 November 6, 2025 23:55
@adityamehra force-pushed the feature/single-tl-processor branch from b192589 to 2579cbe November 6, 2025 23:58
@adityamehra merged commit b9142c7 into main November 6, 2025
1 of 14 checks passed
@adityamehra deleted the feature/single-tl-processor branch November 6, 2025 23:59
@github-actions bot locked and limited conversation to collaborators November 6, 2025