- AI-adjacent products (LLMs integrated into existing products)

**LLM analytics is a good fit if:**
- They need to monitor traces, spans, token costs, and latency, and to analyze usage of AI features
- They're using trace summaries to debug and evals to make product decisions
- They care about questions like: “how does interacting with LLM features correlate with retention, usage, or revenue?”
- They're already PostHog users (or should be) using [product analytics](/product-analytics) and [session replay](/session-replay) to combine qualitative context with quantitative data
- They want to start getting value right away, without needing extensive setup and configuration

**Who else might want to use LLM analytics at their org:**
- **Application Ops / SRE** to monitor production AI systems for errors, prompt injection, jailbreaks, or other security issues
- **Product Managers** to understand user sentiment and usage, and to make decisions about their AI roadmap
- **Customer Success / Support Teams** to improve documentation or investigate user issues

### Who we're NOT building for (right now)
**AI Researchers** and **Machine Learning (ML) Engineers** doing: