This guide demonstrates how to use the TrueFoundry OtelCollector along with the Traceloop SDK to instrument OpenAI API calls.

1. Install Dependencies

First, install the required dependencies:

pip install openai traceloop-sdk python-dotenv
2. Set up environment variables

To enable tracing, you’ll need to configure a few environment variables in your application.

Before proceeding, make sure you've created a tracing project and generated an API token. If you haven't done this yet, follow the instructions in Getting Started.

# Tracing configs
TRACELOOP_BASE_URL=<<control-plane-url>>/api/otel
TRACELOOP_HEADERS="Authorization=Bearer%20<<api-key>>,tfy-tracing-project=<<tracing-project-fqn>>"

# Application configs
OPENAI_API_KEY=sk-proj-*

Replace the placeholders above:

  • <<control-plane-url>>: Your actual TrueFoundry control plane URL
  • <<api-key>>: The API key associated with your tracing project
  • <<tracing-project-fqn>>: The fully qualified name of your tracing project
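
Optionally, you can fail fast on missing configuration before initializing the SDK. The snippet below is a minimal sketch that checks for the variable names used in the config above.

import os

# Optional sanity check: confirm the tracing and OpenAI variables are set
# (names taken from the configuration shown above).
required_vars = ["TRACELOOP_BASE_URL", "TRACELOOP_HEADERS", "OPENAI_API_KEY"]
missing = [name for name in required_vars if not os.environ.get(name)]
if missing:
    raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")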
3. Initialize instrumentation

from traceloop.sdk import Traceloop
from dotenv import load_dotenv

# Load environment variables from a .env file
load_dotenv()

# Initialize the Traceloop SDK
Traceloop.init()
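
Traceloop.init() also accepts optional arguments. As a minimal sketch, assuming the defaults don't fit your setup, you can pass app_name to label your service in traces and disable_batch to export spans immediately (useful for short-lived scripts); the service name below is only an example.

from traceloop.sdk import Traceloop

# Optional: name the service and export spans immediately instead of batching.
# "openai-tracing-demo" is an example name; use your own service name.
Traceloop.init(app_name="openai-tracing-demo", disable_batch=True)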
4. OpenAI code

Below is an example of an OpenAI chat completion call. With the Traceloop SDK initialized as shown above, all OpenAI API calls are traced automatically; no additional tracing code is required.

from openai import OpenAI

client = OpenAI()

stream = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "You are an AI bot."},
        {"role": "user", "content": "Explain the concept of AI in 50 words"},
    ],
    stream=True,
    temperature=0.7,
    max_tokens=256,
    top_p=0.8,
    frequency_penalty=0,
    presence_penalty=0,
)

# Print the streamed response as chunks arrive
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content is not None:
        print(chunk.choices[0].delta.content, end="")
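
Streaming is not required for tracing. A non-streaming call made with the same client should be traced automatically as well; the example below reuses the model and prompt from above for illustration.

# Non-streaming variant of the same request; also traced automatically.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "user", "content": "Explain the concept of AI in 50 words"},
    ],
)
print(response.choices[0].message.content)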
5. Run your application and view the logged trace