What is Dify?
Dify is an open-source platform for building AI applications with built-in LLMOps, workflow automation, and agent capabilities. It provides a comprehensive suite of tools for creating, managing, and deploying AI-powered applications through an intuitive visual interface.

Key Features of Dify
- Workflow Studio: Create complex AI workflows using a visual drag-and-drop interface with support for conditional logic, loops, and multi-step reasoning processes
- Agent Builder: Build intelligent agents with tool integration, conversation memory, and the ability to plan and execute multi-step tasks autonomously
- Knowledge Management: Upload documents, websites, and structured data to create searchable knowledge bases for RAG (Retrieval-Augmented Generation) applications
Prerequisites
Before integrating Dify with TrueFoundry, ensure you have:

- TrueFoundry Account: Create a TrueFoundry account with at least one model provider and generate a Personal Access Token by following the instructions in Generating Tokens
- Dify Installation: Set up Dify using either the cloud version or self-hosted deployment with Docker
Integration Steps
This guide assumes you have Dify installed and running, and that you have obtained your TrueFoundry AI Gateway base URL and authentication token.

Step 1: Generate Your TrueFoundry Access Token
- Navigate to your TrueFoundry dashboard and go to Access Management.
- Click New Personal Access Token to create a new token:
- Copy and securely store your Personal Access Token; you’ll need it for the Dify configuration.
Step 2: Access Dify Model Provider Settings
- Log into your Dify workspace (cloud or self-hosted).
- Navigate to Settings and go to Model Provider:

Step 3: Install OpenAI-API-Compatible Provider
- In the Model Provider section, look for OpenAI-API-compatible and click Install.
- Configure the OpenAI-API-compatible provider with your TrueFoundry details:

- Model Name: Enter your TrueFoundry model ID (e.g., openai-main/gpt-4o)
- Model display name: Enter a display name (e.g., Gpt-4o)
- API Key: Enter your TrueFoundry Personal Access Token
- API endpoint URL: Enter your TrueFoundry Gateway base URL (e.g., https://internal.devtest.truefoundry.tech/api/llm/api/inference/openai)
- Model name for API endpoint: Enter the endpoint model name (e.g., openai-main/gpt-4o)
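Before saving the provider in Dify, you can sanity-check that the base URL, token, and model ID work together outside Dify. The sketch below uses only the Python standard library and the example values from this guide (the base URL and model ID are illustrative; substitute your own, and note that `<your-truefoundry-personal-access-token>` is a placeholder):

```python
import json
import urllib.request

# Example values from this guide -- replace with your own gateway
# base URL, Personal Access Token, and model ID.
BASE_URL = "https://internal.devtest.truefoundry.tech/api/llm/api/inference/openai"
API_KEY = "<your-truefoundry-personal-access-token>"
MODEL = "openai-main/gpt-4o"

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for the gateway."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

if __name__ == "__main__":
    # Sends one real request; requires network access and a valid token.
    with urllib.request.urlopen(build_chat_request("Say hello")) as resp:
        reply = json.load(resp)
        print(reply["choices"][0]["message"]["content"])
```

If this request succeeds, the same base URL, token, and model ID should work when entered into the Dify fields above.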
Step 4: Get Configuration Details from TrueFoundry
- Navigate to your TrueFoundry AI Gateway playground to get both the base URL and model name from the unified code snippet (use the model name exactly as it appears there):

Get Base URL and Model Name from Unified Code Snippet
- Copy the base URL and model ID and paste them into Dify’s configuration fields.
- Get the API endpoint URL from the unified code snippet provided by TrueFoundry.
Step 5: Save and Test Your Configuration
- Click Save to apply your configuration in Dify.
- Create a new application or workflow to test the integration:
- Test the integration by creating a simple LLM workflow to verify that Dify is successfully communicating with TrueFoundry’s AI Gateway.
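If the workflow test fails, a quick way to isolate the problem is to query the gateway directly, bypassing Dify. Assuming the TrueFoundry gateway exposes the standard OpenAI-compatible model-listing endpoint (this is an assumption, not confirmed by this guide), the stdlib sketch below checks that your token and base URL are valid; the URL is the example value from this guide and `<your-truefoundry-personal-access-token>` is a placeholder:

```python
import json
import urllib.request

# Example values from this guide -- replace with your own.
BASE_URL = "https://internal.devtest.truefoundry.tech/api/llm/api/inference/openai"
API_KEY = "<your-truefoundry-personal-access-token>"

def build_models_request() -> urllib.request.Request:
    """Build a GET request for the OpenAI-compatible /models endpoint."""
    return urllib.request.Request(
        url=f"{BASE_URL}/models",
        headers={"Authorization": f"Bearer {API_KEY}"},
        method="GET",
    )

if __name__ == "__main__":
    # Roughly: a 401 suggests a bad token, a 404 suggests a wrong base URL.
    with urllib.request.urlopen(build_models_request()) as resp:
        models = json.load(resp)
        print([m["id"] for m in models.get("data", [])])
```

If the model ID you configured in Dify does not appear in this listing, correct the Model Name field before retesting the workflow.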
