The Proxy API allows you to route requests directly to AI provider endpoints through TrueFoundry AI Gateway without any translation logic. This means you can use provider-native request and response formats while still benefiting from TrueFoundry features like logging, rate limiting, and budget management. The TrueFoundry Proxy Endpoint forwards your request to the appropriate provider and returns the provider's response unmodified, preserving the native request/response format.
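Because the gateway passes requests through untouched, you can also call a provider endpoint over plain HTTP in its native format. The sketch below is illustrative, not a definitive reference: it assumes the same base URL as the SDK example further down, and that the TrueFoundry API key can be sent in Anthropic's standard `x-api-key` header; verify both against your deployment.

```python
import requests

# Assumed placeholders: replace with your control plane URL and TrueFoundry API key.
BASE_URL = "https://{controlPlaneUrl}/api/llm"
API_KEY = "your-truefoundry-api-key"

# The body is in Anthropic's native Messages API format;
# the gateway forwards it to the provider as-is.
response = requests.post(
    f"{BASE_URL}/v1/messages",
    headers={
        "x-api-key": API_KEY,              # Anthropic clients send the key in this header
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    },
    json={
        "model": "your-truefoundry-model-name",
        "max_tokens": 1024,
        "messages": [
            {"role": "user", "content": "Tell me a joke about programming"}
        ],
    },
)

# The response is Anthropic's native Messages response, unmodified.
print(response.json())
```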
The TrueFoundry Proxy API supports all provider endpoints. However, the following endpoints additionally get full logging, rate limiting, and budget management:
- Anthropic Messages API: `/v1/messages`
- Claude Code: available from the Vertex, Bedrock, and Anthropic providers
```python
from anthropic import Anthropic

BASE_URL = "https://{controlPlaneUrl}/api/llm"
API_KEY = "your-truefoundry-api-key"

# Configure Anthropic client with TrueFoundry settings
client = Anthropic(
    api_key=API_KEY,
    base_url=BASE_URL,
)

# Make requests as usual with your TrueFoundry model name
response = client.messages.create(
    model="your-truefoundry-model-name",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Tell me a joke about programming"}
    ],
)

print(response.content)
```
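Streamed responses are relayed in the provider's native server-sent-event format as well. As a minimal sketch, assuming streaming is enabled through the proxy in your deployment and reusing the `client` configured above, the Anthropic SDK's streaming helper works unchanged:

```python
# Streamed output arrives as Anthropic's native events;
# the gateway relays them without translation.
with client.messages.stream(
    model="your-truefoundry-model-name",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Tell me a joke about programming"}
    ],
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
```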