This guide demonstrates how to use the TrueFoundry OtelCollector along with the Traceloop SDK to instrument OpenAI API calls. In this example, we’ll instrument a simple Python application that calls OpenAI’s API.

1. Create Tracing Project, API Key and Copy Tracing Code

Follow the instructions in Getting Started to create a tracing project, generate an API key, and copy the tracing code.
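The application code in step 3 reads the TrueFoundry API key from the environment after calling load_dotenv(). One way to supply it is a .env file next to your script; the sketch below is an assumption about your setup, and the OPENAI_API_KEY entry assumes the OpenAI client's default environment-variable configuration.

# .env (hypothetical file, loaded by load_dotenv() in step 3)
TFY_API_KEY=<your_truefoundry_api_key>
OPENAI_API_KEY=<your_openai_api_key>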

2. Install Dependencies

Install the following dependencies:

pip install openai traceloop-sdk==0.38.12 python-dotenv
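If you prefer to pin dependencies in a requirements.txt instead, the equivalent file could look like this (a sketch; only traceloop-sdk is pinned, matching the command above):

openai
traceloop-sdk==0.38.12
python-dotenv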
3. Add Tracing Code to Your OpenAI Application

For OpenAI applications, add a Traceloop.init() call at application startup. Once initialized, the Traceloop SDK will automatically trace all OpenAI API calls.

OpenAI Code
from dotenv import load_dotenv
from openai import OpenAI
import os

# Import the Traceloop SDK
from traceloop.sdk import Traceloop

load_dotenv()

# Add the Traceloop init code to your application.
# Replace the placeholders with your OTel collector endpoint and tracing project FQN.
TFY_API_KEY = os.environ.get("TFY_API_KEY")
Traceloop.init(
    api_endpoint="<enter_your_api_endpoint>",
    headers={
        "Authorization": f"Bearer {TFY_API_KEY}",
        "TFY-Tracing-Project": "<enter_your_tracing_project_fqn>",
    },
)

# Initialize OpenAI client
client = OpenAI()

# Make API call - this will be automatically traced
stream = client.chat.completions.create(
    messages=[
        {"role": "system", "content": "You are an AI bot."},
        {"role": "user", "content": "Explain the concept of AI in 50 words"},
    ],
    model="gpt-4o",
    stream=True,
    temperature=0.7,
    max_tokens=256,
    top_p=0.8,
    frequency_penalty=0,
    presence_penalty=0,
)

for chunk in stream:
    if chunk.choices and len(chunk.choices) > 0 and chunk.choices[0].delta.content is not None:
        print(chunk.choices[0].delta.content, end="")
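
Streaming is not required for tracing. Since the SDK instruments all OpenAI API calls, a non-streaming request through the same client is captured as well; a minimal sketch reusing the client initialized above:

# Non-streaming call; the Traceloop SDK traces this the same way
response = client.chat.completions.create(
    messages=[{"role": "user", "content": "Explain the concept of AI in 50 words"}],
    model="gpt-4o",
)
print(response.choices[0].message.content)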
4. Run Your Application and View the Logged Trace
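
Assuming you saved the code above as app.py (a hypothetical filename) in the same directory as your .env file, run it with:

python app.py

Once the script completes, open your tracing project in TrueFoundry to view the logged trace.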