Deploy Model As API
MLflow
Examples
Deploy Scikit-Learn Model with MLflow MLServer
Deploy transformers model with MLflow MLServer
Deploy Custom Python function with MLflow Server
Guide
Coming Soon!