To get started with tracing, create a tracing project and a tracing application on the TrueFoundry UI, generate an API key for the tracing project, and then add the tracing initialization code to your application along with the project FQN, application name, and API key. When creating a new tracing project, you can configure which users have read/write/view access.
1

Create a tracing project

  • Navigate to the Tracing section.
  • Click New Tracing Project and enter the project name and collaborators.
  • Click Submit.

Creating tracing projects

2

Create a tracing application

  • Select the tracing project created in the previous step.
  • Navigate to Applications.
  • Click New Tracing Application and enter the application name.
  • Click Submit.

Creating tracing applications

3

Generate API Key

Generate an API key by following the steps here. Save the value, since we will need it in a later step. You can also use a Virtual Account token with access to the tracing project if you don't want to use your personal token.
4

Install Traceloop SDK

Install Traceloop SDK using pip:
pip install traceloop-sdk
5

Add the initialization code to your application

Add this initialization code to your application:
import os

from traceloop.sdk import Traceloop

TFY_API_KEY = os.environ.get("TFY_API_KEY")
Traceloop.init(
    api_endpoint="<enter_your_control_plane_url>/api/otel",
    app_name="enter_your_tracing_application_name",
    headers={
        "Authorization": f"Bearer {TFY_API_KEY}",
        "TFY-Tracing-Project": "enter_your_tracing_project_fqn",
    },
)

After adding the code snippet to your application, it should look something like this:
import os
from traceloop.sdk import Traceloop
from openai import OpenAI

TFY_API_KEY = os.environ.get("TFY_API_KEY")
Traceloop.init(
    api_endpoint="<enter_your_control_plane_url>/api/otel",
    app_name="enter_your_tracing_application_name",
    headers={
        "Authorization": f"Bearer {TFY_API_KEY}",
        "TFY-Tracing-Project": "enter_your_tracing_project_fqn",
    },
)

client = OpenAI(api_key=os.environ.get("OPENAI_API_KEY"))

stream = client.chat.completions.create(
    messages=[
        {"role": "system", "content": "You are an AI bot."},
        {"role": "user", "content": "Enter your prompt here"},
    ],
    model="openai-main/gpt-4o",
    stream=True,
    temperature=0.7,
    max_tokens=256,
    top_p=0.8,
    frequency_penalty=0,
    presence_penalty=0,
    stop=["</s>"],
)

for chunk in stream:
    if chunk.choices and len(chunk.choices) > 0 and chunk.choices[0].delta.content is not None:
        print(chunk.choices[0].delta.content, end="")
Set the TFY_API_KEY environment variable to the API key you generated in step 3.
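For example, on a Unix-like shell, you can export the variable before starting your application (the value below is a placeholder for your own key):
```shell
# Make the TrueFoundry API key available to the application via the environment
export TFY_API_KEY="<your-truefoundry-api-key>"
```
Then start your application as usual (e.g. `python app.py`, assuming that is your entrypoint) in the same shell session so the variable is inherited by the process.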
6

Run your application and view logged trace

Run your application, and you should see traces on the TrueFoundry UI.

Add Tracing to your code based on your framework

Advanced Configuration

Tracing Sampling

For production environments with high traffic, you may want to configure tracing sampling to reduce costs and improve performance. Learn more about Tracing Sampling to understand how to implement sampling strategies.
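As a rough sketch: if your setup honors the standard OpenTelemetry SDK environment variables (an assumption — check the Tracing Sampling documentation for the mechanism TrueFoundry actually supports), a probabilistic head sampler can be configured without code changes:
```shell
# Standard OpenTelemetry SDK sampler settings; assumes the tracing SDK in use
# reads these environment variables (verify against the Tracing Sampling docs)
export OTEL_TRACES_SAMPLER="traceidratio"   # sample by trace-ID ratio
export OTEL_TRACES_SAMPLER_ARG="0.1"        # keep roughly 10% of traces
```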

Annotate your Workflows, Agents and Tools

For complex workflows or chains, annotating them can help you gain better insights into their operations. With TrueFoundry, you can view the entire trace of your workflow for a comprehensive understanding. Traceloop offers a set of decorators to simplify this process. For instance, if you have a function that renders a prompt and calls an LLM, you can add the @workflow decorator to track and annotate the workflow. If your workflow calls more functions, you can annotate them with @task:
import os

from openai import OpenAI
from traceloop.sdk.decorators import workflow, task

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

@task(name="joke_creation")
def create_joke():
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "Tell me a joke about opentelemetry"}],
    )

    return completion.choices[0].message.content

@task(name="signature_generation")
def generate_signature(joke: str):
    completion = client.completions.create(
        model="davinci-002",
        prompt="add a signature to the joke:\n\n" + joke,
    )

    return completion.choices[0].text


@workflow(name="pirate_joke_generator")
def joke_workflow():
    eng_joke = create_joke()
    pirate_joke = translate_joke_to_pirate(eng_joke)
    signature = generate_signature(pirate_joke)
    print(pirate_joke + "\n\n" + signature)
Similarly, when working with autonomous agents, you can use the @agent decorator to trace the agent as a single unit. Additionally, each individual tool within the agent should be annotated with the @tool decorator.
import os

from openai import OpenAI
from traceloop.sdk.decorators import agent, tool

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

@agent(name="joke_translation")
def translate_joke_to_pirate(joke: str):
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": f"Translate the below joke to pirate-like english:\n\n{joke}"}],
    )

    history_jokes_tool()

    return completion.choices[0].message.content


@tool(name="history_jokes")
def history_jokes_tool():
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "get some history jokes"}],
    )

    return completion.choices[0].message.content