This guide provides instructions for integrating Dify with the TrueFoundry AI Gateway.

What is Dify?

Dify is an open-source platform for building AI applications with built-in LLMOps, workflow automation, and agent capabilities. It provides a comprehensive suite of tools for creating, managing, and deploying AI-powered applications through an intuitive visual interface.

Key Features of Dify

  • Workflow Studio: Create complex AI workflows using a visual drag-and-drop interface with support for conditional logic, loops, and multi-step reasoning processes
  • Agent Builder: Build intelligent agents with tool integration, conversation memory, and the ability to plan and execute multi-step tasks autonomously
  • Knowledge Management: Upload documents, websites, and structured data to create searchable knowledge bases for RAG (Retrieval-Augmented Generation) applications

Prerequisites

Before integrating Dify with TrueFoundry, ensure you have:
  1. TrueFoundry Account: Create a TrueFoundry account with at least one model provider and generate a Personal Access Token by following the instructions in Generating Tokens
  2. Dify Installation: Set up Dify using either the cloud version or self-hosted deployment with Docker

Integration Steps

This guide assumes you have Dify installed and running, and have obtained your TrueFoundry AI Gateway base URL and authentication token.

Step 1: Generate Your TrueFoundry Access Token

  1. Navigate to your TrueFoundry dashboard and go to Access Management.
  2. Click New Personal Access Token to create a new token.
  3. Copy and securely store your Personal Access Token; you’ll need it for the Dify configuration.

Step 2: Access Dify Model Provider Settings

  1. Log into your Dify workspace (cloud or self-hosted).
  2. Navigate to Settings and go to Model Provider:
[Screenshot: Dify settings page showing Model Provider selection in the left sidebar menu]

Step 3: Install OpenAI-API-Compatible Provider

  1. In the Model Provider section, look for OpenAI-API-compatible and click Install.
  2. Configure the OpenAI-API-compatible provider with your TrueFoundry details:
[Screenshot: Dify OpenAI-API-compatible provider configuration form with fields for model name, API key, and endpoint URL]
Fill in the following configuration:
  • Model Name: Enter your TrueFoundry model ID (e.g., openai-main/gpt-4o)
  • Model display name: Enter a display name (e.g., Gpt-4o)
  • API Key: Enter your TrueFoundry Personal Access Token
  • API endpoint URL: Enter your TrueFoundry Gateway base URL (e.g., https://internal.devtest.truefoundry.tech/api/llm/api/inference/openai)
  • Model name for API endpoint: Enter the same model ID used above (e.g., openai-main/gpt-4o)
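Before entering these values in Dify, it can help to sanity-check them outside the UI. The sketch below builds the same OpenAI-compatible chat-completions request that Dify will issue through the gateway, without sending anything. The token placeholder, and the base URL and model ID reused from the examples above, are illustrative; substitute your own values.

```python
import json
import urllib.request


def build_chat_request(base_url, token, model, user_message):
    """Assemble the OpenAI-compatible chat-completions request that
    Dify sends through the gateway. Nothing is transmitted here."""
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }).encode()
    return urllib.request.Request(url, data=body, headers=headers, method="POST")


# Example values mirroring the configuration fields above:
req = build_chat_request(
    "https://internal.devtest.truefoundry.tech/api/llm/api/inference/openai",
    "YOUR_TRUEFOUNDRY_PAT",  # placeholder: your Personal Access Token
    "openai-main/gpt-4o",
    "Hello!",
)
print(req.full_url)  # ends with /chat/completions
```

If the resulting URL or headers look wrong here, the same values will fail inside Dify too.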

Step 4: Get Configuration Details from TrueFoundry

  1. Navigate to your TrueFoundry AI Gateway playground to get both the base URL and model name from the unified code snippet (use the model name exactly as shown):
[Screenshot: TrueFoundry playground showing the unified code snippet with base URL and model name for Dify integration]

Get Base URL and Model Name from Unified Code Snippet

  1. Copy the base URL from the unified code snippet and paste it into Dify’s API endpoint URL field.
  2. Copy the model ID from the same snippet and paste it into Dify’s Model Name field.
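A common source of failed integrations is a malformed base URL or model ID. The helper below flags a few easy-to-make mistakes before you paste the values into Dify; it assumes the gateway model ID uses the provider/model form shown in the playground snippet (e.g., openai-main/gpt-4o), which is an assumption worth verifying against your own snippet.

```python
from urllib.parse import urlparse


def check_gateway_config(base_url: str, model_id: str) -> list[str]:
    """Return a list of likely configuration problems (empty if none found)."""
    problems = []
    parsed = urlparse(base_url)
    if parsed.scheme not in ("http", "https"):
        problems.append("base URL must start with http:// or https://")
    if base_url.rstrip("/").endswith("/chat/completions"):
        # Dify appends the route itself; the base URL should stop earlier.
        problems.append("base URL should not include /chat/completions")
    if "/" not in model_id:
        problems.append("model ID should be provider/model, e.g. openai-main/gpt-4o")
    return problems


print(check_gateway_config(
    "https://internal.devtest.truefoundry.tech/api/llm/api/inference/openai",
    "openai-main/gpt-4o",
))  # []
```

An empty list does not guarantee the values are valid for your account, only that they avoid these common formatting mistakes.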

Step 5: Save and Test Your Configuration

  1. Click Save to apply your configuration in Dify.
  2. Create a new application or workflow to test the integration.
  3. Run a simple LLM workflow to verify that Dify is successfully communicating with TrueFoundry’s AI Gateway.
[Screenshot: Dify application interface showing successful test results with the TrueFoundry integration]
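If the workflow test fails, it can help to inspect the gateway response outside Dify. The sketch below parses a response body in the standard OpenAI chat-completions shape, which is what the gateway returns on success; the canned sample payload is purely illustrative.

```python
import json


def extract_reply(response_body: str) -> str:
    """Pull the assistant message out of an OpenAI-style
    chat-completions response body."""
    payload = json.loads(response_body)
    return payload["choices"][0]["message"]["content"]


# Canned response in the OpenAI-compatible shape, for illustration only:
sample = json.dumps({
    "choices": [{"message": {"role": "assistant", "content": "pong"}}]
})
print(extract_reply(sample))  # pong
```

If a real gateway response lacks the choices field, check the error payload instead; it typically indicates an authentication or model-ID problem.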
Your Dify workspace is now integrated with TrueFoundry’s AI Gateway and ready for building AI applications, workflows, and agents.