Dify is an open-source platform for building AI applications with built-in LLMOps, workflow automation, and agent capabilities. TrueFoundry integrates seamlessly with Dify, providing enterprise-grade AI features including cost tracking, security guardrails, and access controls. TrueFoundry’s AI Gateway routes all LLM calls through a single gateway, ensuring your AI applications are secure, compliant, and cost-effective.

Prerequisites

Before integrating Dify with TrueFoundry, ensure you have:
  1. TrueFoundry Account: Create a TrueFoundry account with at least one model provider configured, and generate a Personal Access Token by following the instructions in Generating Tokens
  2. Dify Installation: Set up Dify using either the cloud version or self-hosted deployment with Docker

Integration Steps

This guide assumes you have Dify installed and running, and have obtained your TrueFoundry AI Gateway base URL and authentication token.

Step 1: Generate Your TrueFoundry Access Token

  1. Navigate to your TrueFoundry dashboard and go to Access Management.
  2. Click New Personal Access Token to create a new token.
  3. Copy and securely store your Personal Access Token; you’ll need it for the Dify configuration.

Step 2: Access Dify Model Provider Settings

  1. Log into your Dify workspace (cloud or self-hosted).
  2. Navigate to Settings and go to Model Provider.

Step 3: Install OpenAI-API-Compatible Provider

  1. In the Model Provider section, look for OpenAI-API-compatible and click Install.
  2. Configure the OpenAI-API-compatible provider with your TrueFoundry details, filling in the following fields:
  • Model Name: Enter your TrueFoundry model ID (e.g., openai-main/gpt-4o)
  • Model display name: Enter a display name (e.g., GPT-4o)
  • API Key: Enter your TrueFoundry Personal Access Token
  • API endpoint URL: Enter your TrueFoundry Gateway base URL (e.g., https://internal.devtest.truefoundry.tech/api/llm/api/inference/openai)
  • Model name for API endpoint: Enter the endpoint model name (e.g., openai-main/gpt-4o)
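The fields above map directly onto an OpenAI-compatible chat-completions request, so you can sanity-check the same values outside Dify before saving them. A minimal sketch using only the Python standard library (the URL, token, and model below are placeholders, not real values):

```python
# Sketch: the request Dify will issue through the gateway, built by hand.
# BASE_URL, API_KEY, and MODEL are placeholders; substitute your own values.
import json
import urllib.request

BASE_URL = "https://your-gateway-host/api/llm/api/inference/openai"  # API endpoint URL
API_KEY = "your-truefoundry-personal-access-token"                   # API Key field
MODEL = "openai-main/gpt-4o"                                         # Model name field

def build_chat_request(base_url: str, api_key: str, model: str,
                       prompt: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-compatible chat-completions request."""
    payload = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request(BASE_URL, API_KEY, MODEL, "Hello!")
# urllib.request.urlopen(req)  # uncomment to actually send the request
```

If the assembled URL and headers look right here, the same values should work in Dify’s form.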

Step 4: Get Configuration Details from TrueFoundry

  1. Navigate to your TrueFoundry AI Gateway playground to get both the base URL and model name from the unified code snippet (use the model name exactly as written).

Get Base URL and Model Name from Unified Code Snippet

  1. Locate the API endpoint URL and model ID in the unified code snippet provided by TrueFoundry.
  2. Copy these values and paste them into Dify’s configuration fields.

Step 5: Save and Test Your Configuration

  1. Click Save to apply your configuration in Dify.
  2. Create a new application or workflow to test the integration.
  3. Test the integration by creating a simple LLM workflow to verify that Dify is successfully communicating with TrueFoundry’s AI Gateway.

Your Dify workspace is now integrated with TrueFoundry’s AI Gateway and ready for building AI applications, workflows, and agents.
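To double-check the wiring independently of Dify, you can call the gateway’s OpenAI-compatible endpoint yourself and confirm a well-formed completion comes back. A sketch under the assumption that the gateway returns the standard OpenAI response shape (URL and token are placeholders):

```python
# Sketch: a quick end-to-end connectivity check against the gateway.
import json
import urllib.request

def extract_reply(response_body: bytes) -> str:
    """Pull the assistant's message out of an OpenAI-style chat completion."""
    data = json.loads(response_body)
    return data["choices"][0]["message"]["content"]

def check_gateway(base_url: str, api_key: str, model: str) -> str:
    """Send a one-line prompt through the gateway and return the reply text."""
    payload = {"model": model,
               "messages": [{"role": "user", "content": "Say OK"}]}
    req = urllib.request.Request(
        f"{base_url.rstrip('/')}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:  # network call; run against your gateway
        return extract_reply(resp.read())

# Example (placeholders; run only once your gateway details are filled in):
# print(check_gateway("https://your-gateway-host/api/llm/api/inference/openai",
#                     "your-personal-access-token", "openai-main/gpt-4o"))
```

A non-empty reply here means the token, base URL, and model ID are all valid, so any remaining issues would be on the Dify side.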