Azure AI Foundry
TrueFoundry offers a secure and efficient gateway to seamlessly integrate various Large Language Models (LLMs) into your applications, including models hosted on Azure AI Foundry.
Adding Models
This section explains the steps to add Azure AI Foundry models and configure the required access controls.
- From the TrueFoundry dashboard, go to Platform > Integration Providers > Add Provider Integration and choose Azure
- Click on Azure AI Foundry Model in the integrations section to open the form
- Fill in the following required information:
  - Model Name: A unique identifier for your model
  - Azure API Key: Your Azure API key for authentication
  - Endpoint URL: The deployment endpoint for your model
- Submit the form to save the integration
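Once the integration is saved, the model can be called through the TrueFoundry gateway. Below is a minimal sketch using the OpenAI Python SDK, assuming the gateway exposes an OpenAI-compatible endpoint; the base URL, API key, and model identifier are placeholders and should be replaced with the values shown in your TrueFoundry dashboard.

```python
from openai import OpenAI

# Placeholders: the base URL, API key, and model identifier below are
# assumptions for illustration; use the values shown for your gateway
# and integration in the TrueFoundry dashboard.
client = OpenAI(
    base_url="https://<your-truefoundry-host>/api/llm",
    api_key="<your-truefoundry-api-key>",
)

response = client.chat.completions.create(
    model="<provider-account>/<model-name>",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)
```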
Supported Models
The Azure AI Foundry integration supports a wide range of AI models. To start using models through Azure AI Foundry:
- Create an Azure AI Foundry resource and deploy a model. The instructions are available here
- Once deployed, you can see the model endpoint and model name in the Deployments section of the Azure AI Foundry resource
- Note down the following details from your deployment:
  - Model endpoint URL
  - Model name and version
(Screenshot: Listing of models in Azure AI Foundry)
(Screenshot: Adding a model to TrueFoundry as an integration)
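Before adding the integration, it can help to confirm that the deployment itself responds. The sketch below uses the azure-ai-inference Python package against the model inference endpoint; the endpoint URL, API key, and model name are placeholders copied from the Deployments section, and the exact endpoint form may differ for your deployment.

```python
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import UserMessage
from azure.core.credentials import AzureKeyCredential

# Placeholders: copy the endpoint URL and model name from the Deployments
# section of your Azure AI Foundry resource; the endpoint shown here is an
# assumed shape and may differ for your deployment.
client = ChatCompletionsClient(
    endpoint="https://<your-resource>.services.ai.azure.com/models",
    credential=AzureKeyCredential("<azure-api-key>"),
)

response = client.complete(
    model="<model-name>",
    messages=[UserMessage(content="ping")],
)
print(response.choices[0].message.content)
```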
Troubleshooting
If you encounter issues while using the model:
- Verify your Azure API key is valid and has the necessary permissions
- Ensure the endpoint URL is correct and accessible
- Check if the model is properly deployed and running in Azure AI Foundry
- Confirm your Azure subscription is active and has sufficient quota
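If a call fails, the HTTP status code usually indicates which of the checks above applies. Below is a rough diagnostic sketch using requests, assuming the chat-completions route and api-key header of the Azure AI model inference API; the endpoint, api-version, and model name are placeholders.

```python
import requests

# Placeholders: replace with your deployment's endpoint, API key, and model
# name; the path and api-version shown here are assumptions and may differ.
ENDPOINT = "https://<your-resource>.services.ai.azure.com/models/chat/completions?api-version=2024-05-01-preview"
API_KEY = "<azure-api-key>"

resp = requests.post(
    ENDPOINT,
    headers={"api-key": API_KEY, "Content-Type": "application/json"},
    json={"model": "<model-name>", "messages": [{"role": "user", "content": "ping"}]},
    timeout=30,
)

# Map common failure codes to the checks listed above.
hints = {
    401: "Authentication failed - verify the Azure API key and its permissions.",
    404: "Not found - the endpoint URL or model name is likely wrong.",
    429: "Rate limited - the subscription may be out of quota for this model.",
}
if resp.ok:
    print("Deployment reachable:", resp.status_code)
else:
    print(resp.status_code, hints.get(resp.status_code, "Check the response body:"), resp.text[:300])
```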