Using OpenRouter with Langfuse

Looking to auto-instrument without client code? Check out OpenRouter Broadcast to automatically send traces to Langfuse.

Using Langfuse

Langfuse provides observability and analytics for LLM applications. Since OpenRouter uses the OpenAI API schema, you can use Langfuse's native integration with the OpenAI SDK to automatically trace and monitor your OpenRouter API calls.

Installation
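A minimal setup sketch: install the Langfuse SDK alongside the OpenAI SDK, which Langfuse wraps.

```shell
pip install langfuse openai
```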

Configuration

Set up your environment variables:
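For example (the key values are placeholders; `LANGFUSE_HOST` depends on your Langfuse region or self-hosted deployment):

```shell
export LANGFUSE_SECRET_KEY="sk-lf-..."
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_HOST="https://cloud.langfuse.com"  # EU region; adjust for US or self-hosted
export OPENROUTER_API_KEY="sk-or-..."
```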

Simple LLM Call

Since OpenRouter provides an OpenAI-compatible API, you can use the Langfuse OpenAI SDK wrapper to automatically log OpenRouter calls as generations in Langfuse:
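A minimal sketch of this pattern (the model slug and prompt are illustrative; it requires the environment variables from the Configuration section). The only change from plain OpenAI SDK usage is importing the client from `langfuse.openai` and pointing `base_url` at OpenRouter:

```python
import os

# Langfuse's drop-in OpenAI wrapper: same API, but each request is
# automatically logged as a generation in Langfuse.
from langfuse.openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",        # route requests through OpenRouter
    api_key=os.environ.get("OPENROUTER_API_KEY"),   # OpenRouter key, not an OpenAI key
)

response = client.chat.completions.create(
    model="openai/gpt-4o",  # any OpenRouter model slug works here
    messages=[
        {"role": "user", "content": "What is Langfuse?"},
    ],
)
print(response.choices[0].message.content)
```

Because the wrapper is a drop-in replacement, existing OpenAI SDK code picks up tracing by changing only the import line.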

Advanced Tracing with Nested Calls

Use the @observe() decorator to capture execution details of functions with nested LLM calls:
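A sketch of nested tracing (function names and prompts are illustrative; in Langfuse SDK v2 the decorator is imported from `langfuse.decorators` instead). Each decorated function becomes a span in the trace, and the wrapped LLM calls appear as child generations:

```python
import os

from langfuse import observe           # SDK v3; v2 used `from langfuse.decorators import observe`
from langfuse.openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ.get("OPENROUTER_API_KEY"),
)

@observe()  # traces this function; the LLM call shows up as a child generation
def summarize(text: str) -> str:
    response = client.chat.completions.create(
        model="openai/gpt-4o",
        messages=[{"role": "user", "content": f"Summarize briefly: {text}"}],
    )
    return response.choices[0].message.content

@observe()  # outer span: nested calls to summarize() are grouped under it
def pipeline(topic: str) -> str:
    draft = summarize(f"A short note about {topic}.")
    return summarize(draft)  # second pass, visible as a sibling child span

if __name__ == "__main__":
    print(pipeline("LLM observability"))
```

In the Langfuse UI, `pipeline` appears as the root of the trace with the two `summarize` calls (and their generations) nested beneath it, giving per-step latency and token usage.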

Learn More