Adding Models
This section explains the steps to add OpenAI models and configure the required access controls.
1. Navigate to OpenAI Models in AI Gateway
From the TrueFoundry dashboard, navigate to AI Gateway > Models and select OpenAI.
2. Add OpenAI Account Details
Click Add OpenAI Account. Give your OpenAI account a unique name and complete the form with your OpenAI authentication details (API key). Add collaborators to the account to grant other users and teams access to it. Learn more about access control here.
3. Add Models
Select the model you want from the list. For models that appear in the list of checkboxes, the Gateway supports cost tracking based on their public pricing.
(Optional) If the model you are looking for is not among the options, you can add it with + Add Model at the end of the list (scroll down to see the option) and fill out the form. TrueFoundry AI Gateway supports all text and image models from OpenAI. The complete list of models supported by OpenAI can be found here.
Inference
After adding the models, you can perform inference using an OpenAI-compatible API via the Playground or by integrating with your own application.
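As a sketch, a request to the Gateway looks like any OpenAI chat-completions call. The endpoint URL, API key, and model ID below are placeholders (your dashboard shows the real values for your account), and the provider-account/model naming is an assumption:

```python
import json
import urllib.request

# Placeholders: substitute the Gateway endpoint and key from your TrueFoundry dashboard.
GATEWAY_URL = "https://my-gateway.example.com/api/llm/chat/completions"
API_KEY = "tfy-xxxx"

def build_request(prompt: str, model: str = "openai-main/gpt-4o") -> urllib.request.Request:
    """Build an OpenAI-compatible chat-completions request aimed at the Gateway."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        GATEWAY_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# To actually send it (requires a reachable Gateway):
# with urllib.request.urlopen(build_request("Hello!")) as resp:
#     reply = json.loads(resp.read())["choices"][0]["message"]["content"]
```

Because the request shape is the standard OpenAI one, the official OpenAI SDKs also work by pointing their `base_url` at the Gateway endpoint.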
Frequently Asked Questions
Can I add proprietary or self-hosted models to the Gateway?
Yes. Register OpenAI-compatible or self-hosted endpoints and route requests through the Gateway. Learn more: Self-Hosted Models
Which providers are supported out of the box?
TrueFoundry supports 20+ providers including OpenAI, Anthropic, Cohere, Google Gemini/Vertex, Azure OpenAI, AWS Bedrock, Mistral, Groq, AI21, Databricks, Together AI, DeepInfra, Cerebras, Sambanova, Perplexity, OpenRouter, and more.
Do you support adding a provider not listed (e.g., Cerebras, SambaNova)?
Yes. Use OpenAI-compatible APIs via self-hosted models. Learn more: Self-Hosted Models
Is model swapping supported without workflow changes?
Yes. The Gateway uses a unified OpenAI-compatible API that allows model swaps without code changes. Learn more: Introduction to LLM Gateway
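Because every provider sits behind the same API surface, a swap is typically just a change to the model identifier. The sketch below illustrates this; the two model IDs are hypothetical examples of the Gateway's provider-account/model naming:

```python
# Because the Gateway exposes one OpenAI-compatible API for every provider,
# switching models is a one-line change to the model identifier.
# Both identifiers below are hypothetical examples.
def make_request_payload(prompt: str, model: str) -> dict:
    """Same request shape regardless of the underlying provider."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload_a = make_request_payload("Summarize this.", "openai-main/gpt-4o")
payload_b = make_request_payload("Summarize this.", "anthropic-main/claude-sonnet-4")

# Only the "model" field differs; nothing else in the workflow changes.
assert {k: v for k, v in payload_a.items() if k != "model"} == \
       {k: v for k, v in payload_b.items() if k != "model"}
```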
Can I register multiple versions or aliases and route traffic by version?
Yes. Register models with different IDs and configure routing weights. Learn more: Load Balancing Configuration
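Conceptually, weighted routing behaves like a weighted random choice over the registered model IDs. The plain-Python sketch below illustrates the idea only; the model IDs and weights are hypothetical, and in practice routing is driven by the Gateway's load-balancing configuration, not application code:

```python
import random

# Hypothetical registered model versions and their routing weights.
routes = {
    "openai-main/gpt-4o": 90,    # 90% of traffic to the stable version
    "openai-canary/gpt-4o": 10,  # 10% to a canary version
}

def pick_model(rng: random.Random) -> str:
    """Weighted choice over registered model IDs, as a router might make."""
    models = list(routes)
    weights = [routes[m] for m in models]
    return rng.choices(models, weights=weights, k=1)[0]

rng = random.Random(0)
sample = [pick_model(rng) for _ in range(1000)]
# Over many requests, traffic approaches the configured 90/10 split.
```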
Who can add models and who can access them?
Access is controlled by RBAC and scoped API keys. Learn more: Gateway Access Control
Are Workday, Salesforce, SAP, or other SaaS systems supported?
Not natively, but you can integrate these through custom MCP Servers or plugins. Learn more: MCP Overview
Can workloads be dynamically routed across clouds?
Yes, the Gateway supports cross-cloud routing when models are deployed in multiple cloud environments. Learn more: Load Balancing Overview