To get started with tracing, create a tracing project on the TrueFoundry UI, generate an API key for that project, and then add the tracing initialization code to your application along with the project name and the API key.
Tracing Projects are created in a TrueFoundry Repository. You can configure which users have read/write access to a repository and hence to a trace project.
Only users with access to a Repository can create tracing projects and view their trace data.
Create a tracing project
To create a tracing project from the UI:

1. Navigate to your ML Repo.
2. Go to the Trace Projects tab.
3. Click New Trace Project and enter the project name.
4. Click Submit.
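With the project created and an API key generated, initialization in your application typically looks like the sketch below. This assumes the Traceloop SDK (used in the examples later on this page); the endpoint URL, header name, and environment variable are placeholders — take the exact values from your TrueFoundry UI.

```python
import os

from traceloop.sdk import Traceloop

# Placeholder values: substitute your control-plane URL, your tracing
# project name, and the API key generated for the project in the UI.
Traceloop.init(
    app_name="my-tracing-project",  # your tracing project name
    api_endpoint="https://<your-truefoundry-host>/api/otel",  # assumed endpoint path
    headers={"Authorization": f"Bearer {os.environ['TFY_API_KEY']}"},  # assumed header
)
```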
For production environments with high traffic, you may want to configure trace sampling to reduce costs and improve performance. See Tracing Sampling to learn how to implement sampling strategies.
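The core idea behind head sampling can be sketched in plain Python: derive a deterministic keep/drop decision from the trace ID, so every span in a trace gets the same decision and each trace is kept or dropped as a whole. This is a conceptual sketch (similar in spirit to OpenTelemetry's TraceIdRatioBased sampler), not TrueFoundry's implementation; the function name and rate are illustrative.

```python
import random


def should_sample(trace_id: int, sample_rate: float) -> bool:
    """Deterministic head-sampling decision: the same trace ID always
    yields the same answer, so a trace is sampled as a whole."""
    # Map the low 64 bits of the trace ID onto [0, 1) and compare to the rate.
    bucket = (trace_id & ((1 << 64) - 1)) / float(1 << 64)
    return bucket < sample_rate


# With a 10% sample rate, roughly 1 in 10 random trace IDs is kept.
random.seed(0)
kept = sum(should_sample(random.getrandbits(128), 0.10) for _ in range(100_000))
print(kept)
```

Because the decision is a pure function of the trace ID, every service participating in a distributed trace reaches the same verdict without coordination.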
For complex workflows or chains, annotating them can help you gain better insights into their operations. With TrueFoundry, you can view the entire trace of your workflow for a comprehensive understanding.

Traceloop offers a set of decorators to simplify this process. For instance, if you have a function that renders a prompt and calls an LLM, you can add the @workflow decorator to track and annotate the workflow. If your workflow calls additional functions, you can annotate them with @task.
```python
import os

from openai import OpenAI
from traceloop.sdk.decorators import workflow, task

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

@task(name="joke_creation")
def create_joke():
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "Tell me a joke about opentelemetry"}],
    )
    return completion.choices[0].message.content

@task(name="signature_generation")
def generate_signature(joke: str):
    completion = client.completions.create(
        model="davinci-002",
        prompt="add a signature to the joke:\n\n" + joke,
    )
    return completion.choices[0].text

@workflow(name="pirate_joke_generator")
def joke_workflow():
    eng_joke = create_joke()
    # translate_joke_to_pirate is defined in the agent example below
    pirate_joke = translate_joke_to_pirate(eng_joke)
    signature = generate_signature(pirate_joke)
    print(pirate_joke + "\n\n" + signature)
```
Similarly, when working with autonomous agents, you can use the @agent decorator to trace the agent as a single unit. Additionally, each individual tool within the agent should be annotated with the @tool decorator.
```python
import os

from openai import OpenAI
from traceloop.sdk.decorators import agent, tool

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

@agent(name="joke_translation")
def translate_joke_to_pirate(joke: str):
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": f"Translate the below joke to pirate-like english:\n\n{joke}"}],
    )
    history_jokes_tool()
    return completion.choices[0].message.content

@tool(name="history_jokes")
def history_jokes_tool():
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "get some history jokes"}],
    )
    return completion.choices[0].message.content
```