Anthropic’s Messages API is a powerful interface for interacting with Claude models. When using TrueFoundry as your model gateway, you can access this API through a proxy endpoint that handles authentication and routing to the appropriate model.
The Anthropic Python SDK provides a convenient way to interact with Claude models. Here’s how to configure it to work with the TrueFoundry proxy:
```python
from anthropic import Anthropic

BASE_URL = "https://{controlPlaneUrl}/api/llm"
API_KEY = "your-truefoundry-api-key"

# Configure the Anthropic client to use TrueFoundry's Gateway
client = Anthropic(
    api_key=API_KEY,
    base_url=BASE_URL,
    default_headers={
        "Authorization": f"Bearer {API_KEY}"
    }
)

# Make a request to the Messages API
def generate_response():
    response = client.messages.create(
        model="anthropic/claude-3-5",  # The model name configured in TrueFoundry
        max_tokens=1024,
        messages=[
            {
                "role": "user",
                "content": "Hello, Claude! Please explain quantum computing in simple terms."
            }
        ]
    )
    print(response.content)

generate_response()
```
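Because requests travel through the gateway, failures from either the gateway (for example, an invalid TrueFoundry API key or a misconfigured model name) or the upstream provider surface as the SDK's standard exceptions. Below is a minimal sketch of catching them, reusing the `client` from above; the specific handling shown is illustrative, not prescriptive.

```python
import anthropic

try:
    response = client.messages.create(
        model="anthropic/claude-3-5",
        max_tokens=1024,
        messages=[{"role": "user", "content": "Hello, Claude!"}],
    )
    print(response.content)
except anthropic.APIConnectionError as exc:
    # The gateway could not be reached (network issue, wrong BASE_URL, ...)
    print(f"Connection error: {exc}")
except anthropic.RateLimitError:
    # 429 from the gateway or the upstream provider
    print("Rate limited; retry with backoff")
except anthropic.APIStatusError as exc:
    # Any other non-2xx response (invalid key, misconfigured model, ...)
    print(f"API error {exc.status_code}: {exc}")
```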
The response from the Messages API will have this structure:
```json
{
  "id": "msg_01XB89YSAA2VGMCF3ZS8ATTA1B",
  "type": "message",
  "role": "assistant",
  "content": [
    {
      "type": "text",
      "text": "Quantum computing is like traditional computing but it uses quantum bits or 'qubits' instead of regular bits. While traditional bits can only be in a state of 0 or 1, qubits can exist in multiple states simultaneously thanks to a quantum property called 'superposition.' This allows quantum computers to process certain types of information much faster than regular computers.\n\nAnother key quantum property is 'entanglement,' where qubits become connected and the state of one instantly affects the other, no matter the distance between them.\n\nThese properties give quantum computers the potential to solve certain complex problems much faster than traditional computers, like factoring large numbers (important for encryption) or simulating molecular structures (useful for drug development).\n\nHowever, quantum computers are still in early development stages. They're extremely sensitive to their environment and require special conditions like ultra-cold temperatures to operate. They're not replacements for regular computers but specialized tools for specific types of problems."
    }
  ],
  "model": "claude-3-opus-20240229",
  "stop_reason": "end_turn",
  "stop_sequence": null,
  "usage": {
    "input_tokens": 14,
    "output_tokens": 178
  }
}
```
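In Python, the SDK returns this payload as a `Message` object rather than raw JSON; its attributes mirror the fields above. A short sketch of reading the generated text and token usage from the `response` returned in the first example:

```python
# `response` is the Message object returned by client.messages.create() above.
# The generated text lives in the first content block.
print(response.content[0].text)

# Token accounting mirrors the "usage" field in the JSON.
print("Input tokens:", response.usage.input_tokens)
print("Output tokens:", response.usage.output_tokens)

# Why generation stopped: "end_turn", "max_tokens", "stop_sequence", ...
print("Stop reason:", response.stop_reason)
```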
You can include a system prompt to guide Claude’s behavior:
```python
client.messages.create(
    model="anthropic/claude-3-5",
    max_tokens=1024,
    system="You are a helpful AI assistant that specializes in explaining complex topics simply.",
    messages=[
        {"role": "user", "content": "Explain quantum entanglement."}
    ]
)
```
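The system prompt applies to the entire conversation, so it combines naturally with a multi-turn history passed in `messages`. A minimal sketch, where the assistant turn is an illustrative earlier reply fed back as context:

```python
response = client.messages.create(
    model="anthropic/claude-3-5",
    max_tokens=1024,
    system="You are a helpful AI assistant that specializes in explaining complex topics simply.",
    messages=[
        {"role": "user", "content": "Explain quantum entanglement."},
        # Earlier assistant reply passed back as conversation history (illustrative)
        {"role": "assistant", "content": "Quantum entanglement links two particles so that measuring one instantly tells you about the other."},
        {"role": "user", "content": "How is that useful in quantum computing?"},
    ],
)
print(response.content[0].text)
```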
For streaming responses, use the `messages.stream()` helper:
```python
with client.messages.stream(
    model="anthropic/claude-3-5",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Write a short poem about AI."}]
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)

    # Access the final message at the end
    print("\nFinal message:", stream.get_final_message())
```
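If your application is asynchronous, the SDK's `AsyncAnthropic` client exposes the same streaming helper. A minimal sketch, assuming the same `BASE_URL` and `API_KEY` as the synchronous example above:

```python
import asyncio
from anthropic import AsyncAnthropic

# Assumes the same BASE_URL and API_KEY defined in the first example.
async_client = AsyncAnthropic(
    api_key=API_KEY,
    base_url=BASE_URL,
    default_headers={"Authorization": f"Bearer {API_KEY}"},
)

async def stream_response():
    async with async_client.messages.stream(
        model="anthropic/claude-3-5",
        max_tokens=1024,
        messages=[{"role": "user", "content": "Write a short poem about AI."}],
    ) as stream:
        async for text in stream.text_stream:
            print(text, end="", flush=True)

asyncio.run(stream_response())
```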