Commit 5dded9b

Merge pull request #71 from vincentkoc/docs-opik
docs: Adding Opik to OpenTelemetry
2 parents ee8d62c + ee39c2b commit 5dded9b

5 files changed

Lines changed: 309 additions & 0 deletions

open-telemetry/README.md

Lines changed: 1 addition & 0 deletions
```diff
@@ -35,6 +35,7 @@ This organization helps you track conversation-to-conversation and turn-to-turn
 | ------------------------------- | ------------------------------------------------------------------------- |
 | [Jaeger Tracing](./jaeger/)     | Tracing with Jaeger, an open-source end-to-end distributed tracing system  |
 | [Langfuse Tracing](./langfuse/) | Tracing with Langfuse, a specialized platform for LLM observability        |
+| [Opik Tracing](./opik/)         | Tracing with Opik, an open-source tracing and evaluation platform          |

 ## Common Requirements
```

open-telemetry/opik/README.md

Lines changed: 105 additions & 0 deletions
@@ -0,0 +1,105 @@

# Opik Tracing for Pipecat

This demo showcases OpenTelemetry tracing integration for Pipecat services using Opik, allowing you to visualize and analyze LLM traces, service calls, performance metrics, and dependencies.

> **Note**: Opik supports HTTP/JSON OpenTelemetry traces only (no logs or metrics).

## Setup Instructions

### 1. Get Your Opik API Key

Sign up or log in at [https://www.comet.com/opik](https://www.comet.com/opik) to get your API key and workspace name.
### 2. Environment Configuration

Create a `.env` file with your API keys and Opik configuration:

```
# Enable tracing
ENABLE_TRACING=true

# OTLP endpoint (defaults to Opik Cloud if not set)
OTEL_EXPORTER_OTLP_ENDPOINT=https://www.comet.com/opik/api/v1/private/otel/v1/traces

# Opik headers - Configure your API key, workspace, and project name
OTEL_EXPORTER_OTLP_HEADERS=Authorization=your_opik_api_key,Comet-Workspace=your_workspace_name,projectName=your_project_name

# Optional: Enable console output for debugging
# OTEL_CONSOLE_EXPORT=true

# Service API keys
DEEPGRAM_API_KEY=your_key_here
CARTESIA_API_KEY=your_key_here
OPENAI_API_KEY=your_key_here
```

For self-hosted Opik installations, update the endpoint:

```
OTEL_EXPORTER_OTLP_ENDPOINT=http://<YOUR-OPIK-INSTANCE>/api/v1/private/otel/v1/traces
```
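The `OTEL_EXPORTER_OTLP_HEADERS` value is a comma-separated list of `key=value` pairs. A small sketch (a hypothetical helper, not part of the demo) that splits it the same rough way the SDK does can help catch formatting mistakes before traces silently fail to authenticate:

```python
def parse_otlp_headers(raw: str) -> dict:
    """Split a comma-separated key=value list into a header dict.

    Rough approximation of how the OpenTelemetry SDK reads
    OTEL_EXPORTER_OTLP_HEADERS; the real parser also URL-decodes values.
    """
    headers = {}
    for pair in raw.split(","):
        key, sep, value = pair.partition("=")
        if not sep or not key.strip() or not value.strip():
            raise ValueError(f"malformed header pair: {pair!r}")
        headers[key.strip()] = value.strip()
    return headers
```

For example, `parse_otlp_headers("Authorization=abc,projectName=demo")` yields `{"Authorization": "abc", "projectName": "demo"}`.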
### 3. Install Dependencies

```bash
pip install -r requirements.txt
```

> **Important**: Use the HTTP exporter (`opentelemetry-exporter-otlp-proto-http`), not the gRPC exporter. Opik only supports HTTP transport.

### 4. Run the Demo

```bash
python bot.py
```

### 5. View Traces in Opik

Open your browser to [https://www.comet.com/opik](https://www.comet.com/opik) and navigate to your project to view traces and analyze your LLM interactions.
## Opik-Specific Configuration

In the `bot.py` file, note the HTTP exporter configuration:

```python
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Create the exporter for Opik (HTTP/JSON only)
# Headers are configured via OTEL_EXPORTER_OTLP_HEADERS environment variable
otlp_exporter = OTLPSpanExporter(
    endpoint=os.getenv(
        "OTEL_EXPORTER_OTLP_ENDPOINT",
        "https://www.comet.com/opik/api/v1/private/otel/v1/traces",
    ),
)

# Set up tracing with the exporter
setup_tracing(
    service_name="pipecat-demo",
    exporter=otlp_exporter,
    console_export=bool(os.getenv("OTEL_CONSOLE_EXPORT")),
)
```

The OpenTelemetry SDK automatically reads headers from the `OTEL_EXPORTER_OTLP_HEADERS` environment variable.
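If you prefer not to rely on the environment variable, the HTTP `OTLPSpanExporter` also accepts a `headers` argument. A sketch under that assumption (the `OPIK_*` variable names and fallback values here are placeholders of ours, not part of the demo):

```python
import os

try:
    # Ships in the opentelemetry-exporter-otlp-proto-http package
    from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
except ImportError:  # keep the sketch importable without the package installed
    OTLPSpanExporter = None

# Header names the README lists as required by Opik; values are placeholders.
opik_headers = {
    "Authorization": os.getenv("OPIK_API_KEY", "your_opik_api_key"),
    "Comet-Workspace": os.getenv("OPIK_WORKSPACE", "default"),
    "projectName": os.getenv("OPIK_PROJECT_NAME", "pipecat-demo"),
}

if OTLPSpanExporter is not None:
    otlp_exporter = OTLPSpanExporter(
        endpoint="https://www.comet.com/opik/api/v1/private/otel/v1/traces",
        headers=opik_headers,  # passed explicitly instead of via the env var
    )
```

Explicit headers can be easier to debug, since a typo raises in your code rather than being swallowed by the exporter.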
## Key Features

- **HTTP/JSON Transport**: Opik uses HTTP transport for OpenTelemetry traces
- **LLM-Focused**: Optimized for tracking and analyzing LLM interactions
- **Required Headers**:
  - `Authorization`: Your Opik API key
  - `projectName`: Your project name in Opik
  - `Comet-Workspace`: Your workspace name (required for Comet-hosted installations)

## Troubleshooting

- **No Traces in Opik**: Verify your API key, workspace name, and project name are correct
- **Authorization Errors**: Ensure the `Authorization` and `Comet-Workspace` values in `OTEL_EXPORTER_OTLP_HEADERS` are set correctly
- **Connection Errors**: Check your network connectivity and endpoint URL
- **Exporter Issues**: Try the console exporter (`OTEL_CONSOLE_EXPORT=true`) to verify tracing works locally
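For connection errors, a quick stdlib-only check that the endpoint URL is well-formed (a hypothetical helper of ours, not part of the demo) can rule out typos before digging into network issues:

```python
from urllib.parse import urlparse


def check_otlp_endpoint(url: str) -> bool:
    """Sanity-check an OTLP traces endpoint: scheme, host, and the
    /otel/v1/traces path suffix that Opik's endpoints use."""
    parsed = urlparse(url)
    return (
        parsed.scheme in ("http", "https")
        and bool(parsed.netloc)
        and parsed.path.endswith("/otel/v1/traces")
    )
```

`check_otlp_endpoint("https://www.comet.com/opik/api/v1/private/otel/v1/traces")` should pass, while a URL missing the traces path should not.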
## References

- [Opik Documentation](https://www.comet.com/docs/opik)
- [Opik OpenTelemetry Integration Guide](https://www.comet.com/docs/opik/integrations/opentelemetry)
- [OpenTelemetry Documentation](https://opentelemetry.io/docs/)

open-telemetry/opik/bot.py

Lines changed: 180 additions & 0 deletions
@@ -0,0 +1,180 @@
#
# Copyright (c) 2024–2025, Daily
#
# SPDX-License-Identifier: BSD 2-Clause License
#

import os

from dotenv import load_dotenv
from loguru import logger
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from pipecat.adapters.schemas.function_schema import FunctionSchema
from pipecat.adapters.schemas.tools_schema import ToolsSchema
from pipecat.audio.vad.silero import SileroVADAnalyzer
from pipecat.frames.frames import LLMRunFrame, TTSSpeakFrame
from pipecat.pipeline.pipeline import Pipeline
from pipecat.pipeline.runner import PipelineRunner
from pipecat.pipeline.task import PipelineParams, PipelineTask
from pipecat.processors.aggregators.openai_llm_context import OpenAILLMContext
from pipecat.runner.types import RunnerArguments
from pipecat.runner.utils import create_transport
from pipecat.services.cartesia.tts import CartesiaTTSService
from pipecat.services.deepgram.stt import DeepgramSTTService
from pipecat.services.llm_service import FunctionCallParams
from pipecat.services.openai.llm import OpenAILLMService
from pipecat.transports.base_transport import BaseTransport, TransportParams
from pipecat.transports.daily.transport import DailyParams
from pipecat.transports.websocket.fastapi import FastAPIWebsocketParams
from pipecat.utils.tracing.setup import setup_tracing

load_dotenv(override=True)

IS_TRACING_ENABLED = bool(os.getenv("ENABLE_TRACING"))

# Initialize tracing if enabled
if IS_TRACING_ENABLED:
    # Create the exporter for Opik (HTTP/JSON only)
    # Opik supports HTTP transport for traces only (no logs or metrics)
    # Headers are configured via OTEL_EXPORTER_OTLP_HEADERS environment variable
    otlp_exporter = OTLPSpanExporter(
        endpoint=os.getenv(
            "OTEL_EXPORTER_OTLP_ENDPOINT",
            "https://www.comet.com/opik/api/v1/private/otel/v1/traces",
        ),
    )

    # Set up tracing with the exporter
    setup_tracing(
        service_name="pipecat-demo",
        exporter=otlp_exporter,
        console_export=bool(os.getenv("OTEL_CONSOLE_EXPORT")),
    )
    logger.info("OpenTelemetry tracing initialized for Opik")


async def fetch_weather_from_api(params: FunctionCallParams):
    await params.result_callback({"conditions": "nice", "temperature": "75"})


# We store functions so objects (e.g. SileroVADAnalyzer) don't get
# instantiated. The function will be called when the desired transport gets
# selected.
transport_params = {
    "daily": lambda: DailyParams(
        audio_in_enabled=True,
        audio_out_enabled=True,
        vad_analyzer=SileroVADAnalyzer(),
    ),
    "twilio": lambda: FastAPIWebsocketParams(
        audio_in_enabled=True,
        audio_out_enabled=True,
        vad_analyzer=SileroVADAnalyzer(),
    ),
    "webrtc": lambda: TransportParams(
        audio_in_enabled=True,
        audio_out_enabled=True,
        vad_analyzer=SileroVADAnalyzer(),
    ),
}


async def run_bot(transport: BaseTransport):
    logger.info("Starting bot")

    stt = DeepgramSTTService(api_key=os.getenv("DEEPGRAM_API_KEY"))

    tts = CartesiaTTSService(
        api_key=os.getenv("CARTESIA_API_KEY"),
        voice_id="71a7ad14-091c-4e8e-a314-022ece01c121",  # British Reading Lady
    )

    llm = OpenAILLMService(
        api_key=os.getenv("OPENAI_API_KEY"), params=OpenAILLMService.InputParams(temperature=0.5)
    )

    # You can also register a function_name of None to get all functions
    # sent to the same callback with an additional function_name parameter.
    llm.register_function("get_current_weather", fetch_weather_from_api)

    @llm.event_handler("on_function_calls_started")
    async def on_function_calls_started(service, function_calls):
        await tts.queue_frame(TTSSpeakFrame("Let me check on that."))

    weather_function = FunctionSchema(
        name="get_current_weather",
        description="Get the current weather",
        properties={
            "location": {
                "type": "string",
                "description": "The city and state, e.g. San Francisco, CA",
            },
            "format": {
                "type": "string",
                "enum": ["celsius", "fahrenheit"],
                "description": "The temperature unit to use. Infer this from the user's location.",
            },
        },
        required=["location", "format"],
    )
    tools = ToolsSchema(standard_tools=[weather_function])

    messages = [
        {
            "role": "system",
            "content": "You are a helpful LLM in a WebRTC call. Your goal is to demonstrate your capabilities in a succinct way. Your output will be converted to audio so don't include special characters in your answers. Respond to what the user said in a creative and helpful way.",
        },
    ]

    context = OpenAILLMContext(messages, tools)
    context_aggregator = llm.create_context_aggregator(context)

    pipeline = Pipeline(
        [
            transport.input(),
            stt,
            context_aggregator.user(),
            llm,
            tts,
            transport.output(),
            context_aggregator.assistant(),
        ]
    )

    task = PipelineTask(
        pipeline,
        params=PipelineParams(
            enable_metrics=True,
            enable_usage_metrics=True,
        ),
        enable_tracing=IS_TRACING_ENABLED,
        # Optionally, add a conversation ID to track the conversation
        # conversation_id="8df26cc1-6db0-4a7a-9930-1e037c8f1fa2",
    )

    @transport.event_handler("on_client_connected")
    async def on_client_connected(transport, client):
        logger.info("Client connected")
        # Kick off the conversation.
        await task.queue_frames([LLMRunFrame()])

    @transport.event_handler("on_client_disconnected")
    async def on_client_disconnected(transport, client):
        logger.info("Client disconnected")
        await task.cancel()

    runner = PipelineRunner(handle_sigint=False)

    await runner.run(task)


async def bot(runner_args: RunnerArguments):
    """Main bot entry point compatible with Pipecat Cloud."""
    transport = await create_transport(runner_args, transport_params)
    await run_bot(transport)


if __name__ == "__main__":
    from pipecat.runner.run import main

    main()

open-telemetry/opik/env.example

Lines changed: 19 additions & 0 deletions
@@ -0,0 +1,19 @@
DEEPGRAM_API_KEY=your_deepgram_key
CARTESIA_API_KEY=your_cartesia_key
OPENAI_API_KEY=your_openai_key

# Opik Configuration
# Set to any value to enable tracing
ENABLE_TRACING=true

# OTLP endpoint (defaults to Opik Cloud if not set)
# For Opik Cloud: https://www.comet.com/opik/api/v1/private/otel/v1/traces
# For self-hosted: http://<YOUR-OPIK-INSTANCE>/api/v1/private/otel/v1/traces
OTEL_EXPORTER_OTLP_ENDPOINT=https://www.comet.com/opik/api/v1/private/otel/v1/traces

# Opik headers (get your API key from https://www.comet.com/opik)
# Format: Authorization=<your-api-key>,Comet-Workspace=<your-workspace-name>,projectName=<your-project-name>
OTEL_EXPORTER_OTLP_HEADERS=Authorization=<your-api-key>,Comet-Workspace=default,projectName=<your-project-name>

# Set to any value to enable console output for debugging
# OTEL_CONSOLE_EXPORT=true
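The demo's `bot.py` loads this file with python-dotenv's `load_dotenv(override=True)`. Conceptually, that is equivalent to this stdlib-only sketch (simplified: no quoting, no `export` keywords, no multiline values):

```python
import os


def load_env_file(path: str, override: bool = False) -> dict:
    """Minimal .env loader: KEY=VALUE lines; blanks and '#' comments skipped."""
    loaded = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            key, value = key.strip(), value.strip()
            if override or key not in os.environ:
                os.environ[key] = value
            loaded[key] = value
    return loaded
```

With `override=True` (as the demo uses), values in the file win over variables already present in the process environment.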
open-telemetry/opik/requirements.txt

Lines changed: 4 additions & 0 deletions

@@ -0,0 +1,4 @@
pipecat-ai[daily,webrtc,silero,cartesia,deepgram,openai,tracing,runner]>=0.0.92
pipecat-ai-small-webrtc-prebuilt
opentelemetry-exporter-otlp-proto-http
opik
