Open WebUI is an open-source LLM application interface that lets users interact with large language models through a user-friendly web UI. TrueFoundry integrates seamlessly with Open WebUI, allowing you to route all Open WebUI LLM requests through TrueFoundry's Gateway for enhanced security, load balancing, rate limiting, cost management, and more. TrueFoundry's AI Gateway provides robust integration with Open WebUI, ensuring all LLM calls are routed through the Gateway and benefit from its built-in features.
Before integrating Open WebUI with TrueFoundry, ensure you have:
TrueFoundry Account: Create a TrueFoundry account with at least one model provider configured, and generate a Personal Access Token by following the instructions in Generating Tokens
Open WebUI Instance: Deploy Open WebUI by following the official documentation for local or cloud deployment
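Optionally, you can confirm that your token can reach the Gateway before configuring Open WebUI. The sketch below is a minimal check using the openai Python client; the base URL shown is a hypothetical placeholder (the real value comes from the unified code snippet in the TrueFoundry playground, see Step 1 below), and it assumes the Gateway exposes the standard OpenAI-compatible models endpoint.

```python
# Optional sanity check: confirm your token can reach the TrueFoundry Gateway.
# The base URL and token below are placeholders; copy the real base URL from
# the unified code snippet in the TrueFoundry playground.
from openai import OpenAI

client = OpenAI(
    base_url="https://<your-truefoundry-gateway>/api/inference/openai",  # placeholder
    api_key="<your-personal-access-token>",  # PAT or Virtual Account Token
)

# List the models exposed through the Gateway; these IDs are what you will
# later enter in Open WebUI as Model IDs.
for model in client.models.list():
    print(model.id)
```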
Step 1: Configure Open WebUI to Use TrueFoundry Gateway
Open the Open WebUI settings page.
Navigate to the “Admin Panel” section from the top right icon.
[Screenshot: Navigating to the Admin Panel in Open WebUI]
Add a new connection from Settings → Connections → Manage OpenAI API Connections (+).
[Screenshot: Open WebUI Admin Panel]
Add the following details:
API Key: Enter your PAT (Personal Access Token) or Virtual Account Token generated in Generating Tokens.
Base URL and Model ID: Copy both the base URL and the model name from the unified code snippet in the TrueFoundry playground (use the model name exactly as written)
[Screenshot: Base URL and model name in the unified code snippet]
Model ID: Add the model IDs from the unified code snippet (press the + button on the right side to add each model)
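To verify the configuration outside Open WebUI, the same base URL, API key, and model ID should work with any OpenAI-compatible client, since Open WebUI talks to the Gateway over the OpenAI API. The sketch below uses the openai Python client; all values are placeholders, so substitute the ones from your unified code snippet.

```python
# Verify the Gateway connection with the same values configured in Open WebUI.
# All values below are placeholders; use the ones from your unified code snippet.
from openai import OpenAI

client = OpenAI(
    base_url="https://<your-truefoundry-gateway>/api/inference/openai",  # placeholder
    api_key="<your-personal-access-token>",  # PAT or Virtual Account Token
)

response = client.chat.completions.create(
    model="<provider>/<model-name>",  # the exact model ID you added in Open WebUI
    messages=[{"role": "user", "content": "Hello from the Open WebUI setup check!"}],
)
print(response.choices[0].message.content)
```

If this request succeeds, Open WebUI requests using the same connection details will be routed through the TrueFoundry Gateway as well.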