LLM Gateway FAQs

How can I connect my OpenAI or Anthropic account to TrueFoundry LLM Gateway?

You can connect your OpenAI, Anthropic, Mistral, or other provider accounts to the TrueFoundry LLM Gateway by adding the corresponding integration from the Integrations page. Once the integration is added, that provider's models become available for use within the TrueFoundry platform.

How do I provide the TrueFoundry API key when using the code snippet for integrating LLMs on the Gateway?

Refer to the document here for generating API keys. You can use either personal access tokens or Virtual Accounts (recommended for production settings). Our Virtual Accounts implement the service account concept, allowing you to limit access to a subset of entities.
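For example, if you call the Gateway through an OpenAI-compatible client, the TrueFoundry API key is passed as the client's api_key. The snippet below is a minimal sketch under that assumption; the base URL and model id are placeholders, so copy the exact values from the code snippet shown on the Gateway page.

```python
# Minimal sketch: an OpenAI-compatible client pointed at the Gateway.
# TFY_API_KEY holds a personal access token or virtual account token.
# The base_url and model id below are placeholders -- use the values
# from the code snippet on the Gateway page.
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["TFY_API_KEY"],  # TrueFoundry API key
    base_url="https://<your-org>.truefoundry.cloud/api/llm/api/inference/openai",  # placeholder
)

response = client.chat.completions.create(
    model="openai-main/gpt-4o-mini",  # placeholder model id from your integration
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

Keeping the key in an environment variable (rather than hard-coding it) makes it easy to swap between personal tokens in development and virtual account tokens in production.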

How do I control access to my models on the LLM Gateway?

You can restrict access to certain models for specific users or teams by configuring permissions against the model integration. This ensures that only authorized individuals or groups can utilize particular models.

If I deploy my own open-source or fine-tuned model using TrueFoundry, can I access it through the Gateway? How?

Yes. Use the "Add to LLM Gateway" feature available on the deployment details page to make your deployed model accessible through the Gateway.

Where are the prompt responses stored?

Currently, we deploy a ClickHouse database on the cluster, and all prompt responses are stored there.

Do you support function calling?

Yes. You can go to the settings section and add function calling support under the tools section.
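As a rough illustration, the request below passes a tool definition through an OpenAI-compatible chat completions call. The base URL, model id, and the get_weather tool are placeholders for illustration, not part of the Gateway's documented setup; use the values from your own code snippet and tools configuration.

```python
# Sketch of a function-calling request, assuming an OpenAI-compatible
# chat completions endpoint on the Gateway. Base URL, model id, and the
# get_weather tool are illustrative placeholders.
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["TFY_API_KEY"],
    base_url="https://<your-org>.truefoundry.cloud/api/llm/api/inference/openai",  # placeholder
)

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool for illustration
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="openai-main/gpt-4o-mini",  # placeholder model id
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)

# If the model chose to call the tool, the call details arrive here.
tool_calls = response.choices[0].message.tool_calls
if tool_calls:
    print(tool_calls[0].function.name, tool_calls[0].function.arguments)
```

Note that function calling only works with models whose underlying provider supports tools; the Gateway forwards the tool definitions and returns the model's tool calls in the response.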