To get started with tracing, we need to create a tracing project in the TrueFoundry UI, generate an API key for the tracing project, and then add the tracing initialization code to our application along with the project name and the API key.

Tracing projects are created inside a TrueFoundry repository. You can configure which users have read/write access to a repository, and hence to the tracing projects inside it.

Only users with access to a Repository can create tracing projects and view their trace data.

1. Create a tracing project

To create a tracing project from the UI:

  • Navigate to your ML Repo.
  • Go to the Trace Projects tab.
  • Click New Trace Project and enter the project name.
  • Click Submit.


2. Generate an API key

Generate an API key by following the steps here. Save the value; we will need it in a later step.

If you don't want to use your personal token, you can also use a Virtual Account token that has access to the repository.

3. Install the Traceloop SDK

Install the Traceloop SDK using pip:

pip install traceloop-sdk
4. Copy the code from the UI to add to your application

Navigate to the Tracing Project you created in the previous steps. Copy the code snippet to add to your application.

Click the Log Traces button to generate the code snippet.

5. Add the code to your application

After adding the code snippet to your application, it should look something like this:

import os
from traceloop.sdk import Traceloop
from openai import OpenAI

# This is the code snippet that we copied in the previous step.
# Replace it with the code snippet you copied from your own tracing project.
TFY_API_KEY = os.environ.get("TFY_API_KEY")
Traceloop.init(
    api_endpoint="https://platform.live-demo.truefoundry.cloud/api/otel",
    headers={
        "Authorization": f"Bearer {TFY_API_KEY}",
        "TFY-Tracing-Project": "tracing-project:live-demo/tracing/agno-plot-agent",
    },
)

client = OpenAI(api_key=os.environ.get("OPENAI_API_KEY"))

stream = client.chat.completions.create(
    messages=[
        {"role": "system", "content": "You are an AI bot."},
        {"role": "user", "content": "Enter your prompt here"},
    ],
    model="openai-main/gpt-4o",
    stream=True,
    temperature=0.7,
    max_tokens=256,
    top_p=0.8,
    frequency_penalty=0,
    presence_penalty=0,
    stop=["</s>"],
)

for chunk in stream:
    if chunk.choices and len(chunk.choices) > 0 and chunk.choices[0].delta.content is not None:
        print(chunk.choices[0].delta.content, end="")

Set the TFY_API_KEY environment variable to the API key you generated in step 2.
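For example, you can export the variables in your shell before starting the application. The key values below are placeholders; replace them with the API key from step 2 and your own OpenAI key:

```shell
# Placeholder values - replace with your actual keys.
export TFY_API_KEY="tfy-xxxxxxxxxxxx"
export OPENAI_API_KEY="sk-xxxxxxxxxxxx"

# Confirm the variables are set in the current shell.
env | grep -E '^(TFY|OPENAI)_API_KEY='
```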
6. Run your application and view the logged trace

Run your application and you should see traces on the TrueFoundry UI.

Add Tracing to your code based on your framework

Annotate your Workflows, Agents and Tools

For complex workflows or chains, annotating them can help you gain better insights into their operations. With TrueFoundry, you can view the entire trace of your workflow for a comprehensive understanding.

Traceloop offers a set of decorators to simplify this process. For instance, if you have a function that renders a prompt and calls an LLM, you can add the @workflow decorator to track and annotate the workflow. If your workflow calls additional functions, you can annotate them with @task.

import os

from openai import OpenAI
from traceloop.sdk.decorators import workflow, task

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

@task(name="joke_creation")
def create_joke():
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "Tell me a joke about opentelemetry"}],
    )

    return completion.choices[0].message.content

@task(name="signature_generation")
def generate_signature(joke: str):
    completion = client.completions.create(
        model="davinci-002",
        prompt="add a signature to the joke:\n\n" + joke,
    )

    return completion.choices[0].text


@workflow(name="pirate_joke_generator")
def joke_workflow():
    eng_joke = create_joke()
    pirate_joke = translate_joke_to_pirate(eng_joke)
    signature = generate_signature(pirate_joke)
    print(pirate_joke + "\n\n" + signature)

Similarly, when working with autonomous agents, you can use the @agent decorator to trace the agent as a single unit. Additionally, each individual tool within the agent should be annotated with the @tool decorator.

import os

from openai import OpenAI
from traceloop.sdk.decorators import agent, tool

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

@agent(name="joke_translation")
def translate_joke_to_pirate(joke: str):
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": f"Translate the below joke to pirate-like english:\n\n{joke}"}],
    )

    history_jokes_tool()

    return completion.choices[0].message.content


@tool(name="history_jokes")
def history_jokes_tool():
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "get some history jokes"}],
    )

    return completion.choices[0].message.content
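Conceptually, each of these decorators wraps a function call in a span, so nested calls produce nested spans in the trace: the workflow span opens first and closes last, with task, agent, and tool spans nested inside it. The following is a simplified, dependency-free sketch of that idea (the `span` helper and `TRACE` list are hypothetical names for illustration; this is not the actual Traceloop implementation):

```python
import functools

# Hypothetical in-memory trace log, standing in for an exported span tree.
TRACE = []

def span(kind, name):
    """Simplified stand-in for @workflow/@task/@agent/@tool:
    records a start event before the call and an end event after it."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            TRACE.append((kind, name, "start"))
            try:
                return fn(*args, **kwargs)
            finally:
                TRACE.append((kind, name, "end"))
        return wrapper
    return decorator

@span("task", "create_joke")
def create_joke():
    return "a joke"

@span("workflow", "pirate_joke_generator")
def joke_workflow():
    return create_joke()

joke_workflow()
# The workflow span encloses the task span: start/end pairs nest.
print(TRACE)
```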