Adding Models
This section explains the steps to add Databricks models and configure the required access controls.

1. Navigate to Databricks Models in AI Gateway
From the TrueFoundry dashboard, navigate to AI Gateway > Models and select Databricks.
2. Add Databricks Account and Authentication
Give a unique name for the Databricks account; this name will be used to refer to the models later. Provide the authentication details for the gateway to access your Databricks models. TrueFoundry supports both Service Principal and Personal Access Token (PAT) based authentication.
Get Databricks Authentication Details

Using Service Principal (Recommended): Service Principal authentication is the recommended approach for production environments as it provides better security and access control.
- Choose Service Principal Auth.
- Enter your Databricks Service Principal Client ID and OAuth Secret.

Using Personal Access Token (PAT): Personal Access Tokens are suitable for development and testing environments.
- Choose Databricks API Key Based Auth.
- Enter your PAT.

Finally, enter your Databricks workspace URL (e.g., https://<workspace_id>.databricks.com).
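If you want to sanity-check the Service Principal credentials before entering them, the sketch below illustrates the standard Databricks OAuth client-credentials exchange followed by a call that lists serving endpoints. This is an optional, illustrative check and not part of the TrueFoundry setup; the workspace URL and environment variable names are placeholders.

```python
import os
import requests

WORKSPACE_URL = "https://<workspace_id>.databricks.com"   # your workspace URL
CLIENT_ID = os.environ["DATABRICKS_CLIENT_ID"]            # Service Principal Client ID
CLIENT_SECRET = os.environ["DATABRICKS_CLIENT_SECRET"]    # Service Principal OAuth Secret

# Exchange the client credentials for a short-lived access token
# (Databricks OAuth machine-to-machine flow).
token_resp = requests.post(
    f"{WORKSPACE_URL}/oidc/v1/token",
    auth=(CLIENT_ID, CLIENT_SECRET),
    data={"grant_type": "client_credentials", "scope": "all-apis"},
    timeout=30,
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

# Use the token against the Databricks REST API to confirm the principal
# can see the serving endpoints you plan to add to the gateway.
endpoints_resp = requests.get(
    f"{WORKSPACE_URL}/api/2.0/serving-endpoints",
    headers={"Authorization": f"Bearer {access_token}"},
    timeout=30,
)
endpoints_resp.raise_for_status()
print([e["name"] for e in endpoints_resp.json().get("endpoints", [])])
```

For PAT-based authentication, the same listing call works with the PAT supplied directly as the Bearer token.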
3. Add Models
Click + Add Model to add a new model configuration. The Model ID in TrueFoundry must exactly match the serving endpoint name in your Databricks workspace.

How to Set Up Databricks Serving Endpoints
- Access Databricks Serving: In your Databricks workspace, navigate to Serving in the left sidebar and click Create serving endpoint.
- Configure Endpoint:
  - Endpoint name: Choose a descriptive name. This name will be your Model ID in TrueFoundry.
  - Served Entity: Choose from Foundation Models or your custom models.
- Deploy and Verify: Click Create and wait for the deployment to become Ready (the sketch after this list shows one way to confirm this programmatically).
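As an optional programmatic check, the sketch below queries a single serving endpoint and prints its readiness state. The endpoint name my-llm-endpoint is a hypothetical example; use the exact name of your endpoint, since that same string becomes the Model ID in TrueFoundry.

```python
import os
import requests

WORKSPACE_URL = "https://<workspace_id>.databricks.com"
TOKEN = os.environ["DATABRICKS_TOKEN"]      # PAT or OAuth access token
ENDPOINT_NAME = "my-llm-endpoint"           # hypothetical; must match your endpoint name

# Fetch the endpoint's details from the Databricks Model Serving REST API.
resp = requests.get(
    f"{WORKSPACE_URL}/api/2.0/serving-endpoints/{ENDPOINT_NAME}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

state = resp.json().get("state", {})
# A "READY" state means the endpoint can serve traffic and is safe to register
# in TrueFoundry with ENDPOINT_NAME as the Model ID.
print(f"{ENDPOINT_NAME}: ready={state.get('ready')}")
```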

Inference
After adding the models, you can perform inference using the gateway's OpenAI-compatible API, either via the Playground or by integrating it into your own application.
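The Playground can also generate a ready-made code snippet for your model. As a rough illustration, an integration with the OpenAI Python client typically looks like the sketch below; the gateway base URL, API key variable, and the databricks-main/my-llm-endpoint model name are placeholders, so copy the exact values from the code snippet shown in the dashboard.

```python
import os
from openai import OpenAI

# Point the OpenAI client at the TrueFoundry AI Gateway (placeholder URL).
client = OpenAI(
    base_url="https://<your-truefoundry-gateway-base-url>",
    api_key=os.environ["TFY_API_KEY"],   # your TrueFoundry API key
)

response = client.chat.completions.create(
    # Placeholder model name; use the identifier shown for your Databricks model
    # in the TrueFoundry dashboard.
    model="databricks-main/my-llm-endpoint",
    messages=[{"role": "user", "content": "Summarize what this endpoint can do."}],
)
print(response.choices[0].message.content)
```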