---
title: Trace OpenAI Codex with Langfuse
sidebarTitle: OpenAI Codex
logo: /images/integrations/openai_icon.svg
description: Send OpenAI Codex OpenTelemetry traces to Langfuse for observability into coding sessions, model activity, and runtime behavior.
category: Integrations
---

# Trace OpenAI Codex with Langfuse

This guide shows you how to send [OpenAI Codex](https://developers.openai.com/codex) telemetry to Langfuse using Codex's built-in OpenTelemetry export.

> **What is OpenAI Codex?** [Codex](https://developers.openai.com/codex) is OpenAI's coding agent that can help you inspect code, edit files, run commands, and complete software engineering tasks from your local environment.

> **What is Langfuse?** [Langfuse](https://langfuse.com/) is an open-source LLM engineering platform that helps teams trace AI applications, debug issues, monitor production behavior, and evaluate quality.

## What This Integration Does

With Codex's OpenTelemetry exporter connected to Langfuse, you can:

- **Capture Codex traces** from local coding sessions
- **Inspect model and runtime spans** emitted by Codex
- **Monitor latency and errors** in Langfuse
- **Tag traces by environment** for easier filtering
- **Optionally include prompt content** when your policies allow it

The exact spans and attributes depend on the Codex version you are running, but Langfuse can ingest Codex's OTLP/HTTP traces directly.

## How It Works

Codex reads runtime configuration from either:

- `~/.codex/config.toml` for user-level configuration
- `.codex/config.toml` for project-level configuration

You configure Codex to export traces via OTLP/HTTP and point it to the Langfuse OpenTelemetry endpoint at `/api/public/otel/v1/traces`.

Langfuse authenticates OTLP requests via **Basic Auth** using your Langfuse public and secret keys. Unlike some OTLP backends, you do not need separate workspace or project headers because the Langfuse project is determined by the API keys you use.

<Callout type="warning">
Codex telemetry export is opt-in. Keep `log_user_prompt = false` unless your security and privacy policies explicitly allow prompt text to be exported.
</Callout>

## Quick Start

<Steps>

### Set up Langfuse

Sign up for [Langfuse Cloud](https://cloud.langfuse.com) or [self-host Langfuse](/self-hosting). Then create a project and copy the **public key** and **secret key** from your project settings.

### Create the Basic Auth header

Join your Langfuse public and secret keys with a colon (`public_key:secret_key`) and Base64-encode the result:

```bash
echo -n "pk-lf-1234567890:sk-lf-1234567890" | base64
```

Use the output in your Codex configuration as:

```text
Authorization = "Basic <base64-encoded-public-key-colon-secret-key>"
```

<Callout type="info">
GNU coreutils `base64` wraps its output at 76 characters by default; pass `base64 -w 0` so long keys produce a single-line header value.
</Callout>
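Putting the two steps together, a small shell snippet (with placeholder keys) builds the full header value to paste into the config:

```shell
# Placeholder keys -- substitute your real Langfuse project keys.
PUBLIC_KEY="pk-lf-1234567890"
SECRET_KEY="sk-lf-1234567890"

# Join with a colon, Base64-encode, strip any newline the encoder adds,
# and prefix the "Basic " scheme expected by the Authorization header.
AUTH="Basic $(printf '%s' "${PUBLIC_KEY}:${SECRET_KEY}" | base64 | tr -d '\n')"
echo "$AUTH"
```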

### Update your Codex configuration

Add the following to either `~/.codex/config.toml` or `.codex/config.toml`.

<Tabs items={["Langfuse Cloud (EU)", "Langfuse Cloud (US)", "Self-hosted"]}>
<Tab>

```toml
[otel]
environment = "prod"
log_user_prompt = false

[otel.exporter.otlp-http]
endpoint = "https://cloud.langfuse.com/api/public/otel/v1/traces"
protocol = "binary"
headers = { "Authorization" = "Basic <base64-encoded-public-key-colon-secret-key>" }
```

</Tab>
<Tab>

```toml
[otel]
environment = "prod"
log_user_prompt = false

[otel.exporter.otlp-http]
endpoint = "https://us.cloud.langfuse.com/api/public/otel/v1/traces"
protocol = "binary"
headers = { "Authorization" = "Basic <base64-encoded-public-key-colon-secret-key>" }
```

</Tab>
<Tab>

```toml
[otel]
environment = "prod"
log_user_prompt = false

[otel.exporter.otlp-http]
endpoint = "https://your-langfuse-domain.com/api/public/otel/v1/traces"
protocol = "binary"
headers = { "Authorization" = "Basic <base64-encoded-public-key-colon-secret-key>" }
```

</Tab>
</Tabs>

### Start a Codex session

Save the configuration, then start Codex and complete a task so it emits traces.

If you are using a project-local config, make sure you launch Codex from that project directory so `.codex/config.toml` is applied.

### View traces in Langfuse

Open your Langfuse project and inspect the traces generated by Codex. Depending on the Codex runtime and exporter output, you can review:

- Root traces for each Codex run
- Nested spans for model and runtime activity
- Environment metadata such as `prod`
- Errors and timing information

</Steps>

## Troubleshooting

### No traces appear in Langfuse

Check the following:

1. **Endpoint path is correct** and ends with `/api/public/otel/v1/traces`
2. **Authorization header is valid** and starts with `Basic `
3. **Your Base64 value encodes exactly `public_key:secret_key`** with no trailing newline
4. **You ran a new Codex session** after updating `config.toml`
5. **You edited the correct config file** for the Codex session you started
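To rule out a bad header value, decode the Base64 portion of your `Authorization` header and confirm it round-trips (placeholder keys shown):

```shell
# Placeholder: built the same way as in the Quick Start. Substitute the
# Base64 value from your own config.toml.
ENCODED=$(printf '%s' "pk-lf-1234567890:sk-lf-1234567890" | base64 | tr -d '\n')

# Decoding should print exactly "public_key:secret_key" -- any extra
# whitespace or an embedded newline will cause authentication failures.
printf '%s' "$ENCODED" | base64 -d
```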

### Authentication errors

- Verify that your Langfuse public key starts with `pk-lf-`
- Verify that your Langfuse secret key starts with `sk-lf-`
- Confirm that your endpoint matches your Langfuse region:
- EU Cloud: `https://cloud.langfuse.com`
- US Cloud: `https://us.cloud.langfuse.com`

### Self-hosted setup does not work

- Confirm your Langfuse instance is reachable from the machine running Codex
- Make sure your Langfuse deployment supports the OpenTelemetry endpoint
- Verify TLS certificates if you are using HTTPS on a custom domain
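As a quick sketch (the domain below is a placeholder), you can check that the OTLP endpoint is reachable at the transport level. Note that `curl` exits non-zero only on DNS, TLS, or connection failures, not on HTTP error statuses, so any printed status code means the host itself is reachable:

```shell
# Placeholder domain: replace with your self-hosted Langfuse URL.
LANGFUSE_HOST="https://your-langfuse-domain.com"

# A transport-level failure (DNS, TLS, refused connection) prints the hint;
# an HTTP status code (even 401/405) means the host is reachable.
curl -sS -o /dev/null -w "HTTP %{http_code}\n" \
  "${LANGFUSE_HOST}/api/public/otel/v1/traces" \
  || echo "transport error: check DNS, TLS certificates, and firewall rules"
```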

Learn more in the [Langfuse OpenTelemetry guide](/integrations/native/opentelemetry).

### Prompt content is missing

This is expected when `log_user_prompt = false`.

Only set `log_user_prompt = true` if exporting prompt text is allowed by your internal policies.

## Resources

- [Langfuse OpenTelemetry integration](/integrations/native/opentelemetry)
- [Langfuse tracing overview](/docs/observability/overview)
- [Codex config basics](https://developers.openai.com/codex/config-basic)
- [Codex config reference](https://developers.openai.com/codex/config-reference)
- [Codex security and OpenTelemetry opt-in](https://developers.openai.com/codex/security/)