Phidata is a framework for building AI applications with memory, knowledge, and tools. It provides a comprehensive platform for creating intelligent agents that can maintain context across conversations, access external knowledge bases, and use various tools to accomplish complex tasks.
Connect to TrueFoundry by configuring the OpenAI LLM in Phidata with your TrueFoundry gateway:
```python
from phi.agent import Agent
from phi.llm.openai import OpenAIChat

llm = OpenAIChat(
    base_url="your-truefoundry-base-url",
    api_key="your-truefoundry-api-key",
    model="openai-main/gpt-4o",  # Use any model from any provider
)

agent = Agent(
    llm=llm,
    description="You help people with their questions.",
    instructions=["tell fun and amazing facts about the topic"],
)

# Print a response to the client
agent.print_response("tell something about sabertooth tiger.", markdown=True)
```
Replace:
- `your-truefoundry-api-key` with your actual TrueFoundry API key
- `your-truefoundry-base-url` with your TrueFoundry Gateway URL
- the `model` value with your desired model, in the `provider-main/model-name` format
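Rather than hardcoding credentials in source, you can read them from environment variables and pass them through to `OpenAIChat`. A minimal sketch, assuming hypothetical variable names `TRUEFOUNDRY_BASE_URL` and `TRUEFOUNDRY_API_KEY` (adjust to your own setup):

```python
import os

# Hypothetical environment variable names -- set these in your deployment
# environment instead of committing credentials to source control.
base_url = os.environ.get("TRUEFOUNDRY_BASE_URL", "your-truefoundry-base-url")
api_key = os.environ.get("TRUEFOUNDRY_API_KEY", "your-truefoundry-api-key")

# Keyword arguments to pass to OpenAIChat(...)
llm_kwargs = {
    "base_url": base_url,
    "api_key": api_key,
    "model": "openai-main/gpt-4o",  # provider-main/model-name format
}
print(llm_kwargs["model"])
```

The same kwargs can then be splatted into the constructor with `OpenAIChat(**llm_kwargs)`.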
Monitor your Phidata applications through TrueFoundry’s metrics tab. With TrueFoundry’s AI gateway, you can monitor and analyze:
Performance Metrics: Track key latency metrics such as request latency, Time to First Token (TTFT), and Inter-Token Latency (ITL) at the P99, P90, and P50 percentiles
Cost and Token Usage: Gain visibility into your application’s costs with detailed breakdowns of input/output tokens and the associated expenses for each model
Usage Patterns: Understand how your application is being used with detailed analytics on user activity, model distribution, and team-based usage
Agent Performance: Monitor individual agent performance and tool usage patterns
Rate Limiting and Load Balancing: Configure rate limits, load balancing, and fallbacks for your models
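To make the percentile metrics above concrete, here is a small sketch of how P50/P90/P99 values are derived from raw request latencies, using the nearest-rank method; the sample latencies are illustrative, not real gateway data:

```python
def percentile(samples, p):
    """Nearest-rank percentile of a list of latency samples (ms)."""
    ordered = sorted(samples)
    # Nearest-rank: the smallest value such that at least p% of samples are <= it
    rank = max(1, round(p / 100 * len(ordered)))
    return ordered[rank - 1]

# Illustrative per-request latencies in milliseconds
latencies_ms = [120, 95, 240, 180, 110, 300, 150, 135, 400, 105]
for p in (50, 90, 99):
    print(f"P{p}: {percentile(latencies_ms, p)} ms")
# P50 reflects the typical request; P99 surfaces tail latency outliers
```

The gateway computes these percentiles for you; this sketch only shows what the numbers mean.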
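The cost breakdowns are per-model and priced separately for input and output tokens. A minimal sketch of that arithmetic, using hypothetical prices (not real TrueFoundry or provider rates):

```python
# Hypothetical USD prices per 1,000 tokens -- purely illustrative
PRICES_PER_1K = {
    "openai-main/gpt-4o": {"input": 0.005, "output": 0.015},
}

def request_cost(model, input_tokens, output_tokens):
    """Cost of one request: input and output tokens priced separately."""
    price = PRICES_PER_1K[model]
    return (input_tokens / 1000) * price["input"] + (output_tokens / 1000) * price["output"]

cost = request_cost("openai-main/gpt-4o", input_tokens=1200, output_tokens=400)
print(f"${cost:.4f}")  # prints $0.0120
```

The gateway aggregates this per request, per model, and per team, so you do not need to track it client-side; the sketch only illustrates how the totals decompose.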