Dockerize Your Code
It's good practice to always dockerize your code before deploying so that it is guaranteed to run anywhere. TrueFoundry deploys your code on Kubernetes, for which you will first need to build a Docker image of your codebase.

For any deployment to work in TrueFoundry, it's essential to have a Docker registry integrated with TrueFoundry. This should already have been done by the infra team during the setup step.

TrueFoundry can help you deploy in all three of the common cases: you already have a built Docker image, you have a Dockerfile but no image, or you only have Python code and no Dockerfile.
Deploy a pre-built Docker image

To deploy an already built Docker image, choose the `Docker Image` option in the Service Deployment form. Choosing the Docker registry is optional: TrueFoundry will automatically check whether the image is present in any of the integrated registries.
Deploy from a Git repository with a Dockerfile

- Select the `Git Repo` option in the Service Deployment form.
- Choose your repository in the Repo URL box and select `Dockerfile (I already have DockerFile)` in the Build section.

You will need to fill in two fields here:

- Path to Build Context: The path in the Git repository where the code to be deployed is present. Usually this is the root directory of the Git repo and the value is `./`. However, if you have multiple services in the same repo, you can specify the path to the directory containing the service you want to deploy, e.g. `my-service/`.
- Path to Dockerfile: The path to the Dockerfile in the Git repository, relative to the Path to Build Context. For example, if the Dockerfile is in the `my-service` directory and you have set the build context to `my-service/`, then the path to the Dockerfile is `./Dockerfile`. However, if the Path to Build Context is `./`, then the path to the Dockerfile should be `./my-service/Dockerfile`.
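To make this concrete, here is a minimal sketch of a Dockerfile for a Python service. The file names (`app.py`, `requirements.txt`), the base image, and the port are assumptions for the example; adjust them to your project:

```dockerfile
# Minimal example Dockerfile for a Python service.
# Assumes app.py and requirements.txt sit in the build context ("./").
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so this layer is cached across code changes
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the rest of the code
COPY . .

# The port your service listens on; use the same value in the Ports section
EXPOSE 8000

CMD ["python", "app.py"]
```

Copying `requirements.txt` before the rest of the code is a common layering choice: dependency installation is re-run only when the requirements change, not on every code edit.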
Deploy Python code from a Git repository (no Dockerfile)

- Select the `Git Repo` option in the Service Deployment form.
- Choose your repository in the Repo URL box and select `Python Code (I don't have Dockerfile)` in the Build section.

You will need to fill in the following fields here:

- Path to Build Context: The path in the Git repository where the code to be deployed is present. Usually this is the root directory of the Git repo and the value is `./`. However, if you have multiple services in the same repo, you can specify the path to the directory containing the service you want to deploy, e.g. `my-service/`.
- Path to requirements.txt: The path to requirements.txt in the Git repository, relative to the Path to Build Context. For example, if the requirements.txt is in the `my-service` directory and you have set the build context to `my-service/`, then the path to the requirements.txt is `./requirements.txt`. However, if the Path to Build Context is `./`, then the path should be `./my-service/requirements.txt`.
- Python Version: The Python version used to build the image. This should match the Python version with which the code was written and tested.
- Command: The command used to run the code. This should be the same command you use to run the code locally.
Deploy code from your laptop

- Select the `Code From Laptop` option in the Service Deployment form.
- Follow the steps to install the TrueFoundry CLI and log in to TrueFoundry.
- Fill in the deployment form with the following key fields:
  - Service Name: Any name, which will later be used to identify the service on the TrueFoundry platform.
  - Source Code (Build and deploy source code): If you have already written a Dockerfile locally, choose the option `Dockerfile (I already have DockerFile)`; otherwise choose `Python Code (I don't have Dockerfile)`. The second option works if you have Python code locally and a requirements.txt file listing the dependencies. The fields are described in the previous sections.
  - Ports: The port on which the service will be exposed. It should be the same port on which the service runs locally.
  - Resources: You can start with the default values and change them after the first deployment.
- Download the deploy.py file and place it in the root of your project. Run it from the same directory where deploy.py is present:

`python deploy.py`

The TrueFoundry CLI will build the image locally if Docker is installed in the environment; otherwise it will build the image remotely on the TrueFoundry platform. It will then push the image to the Docker registry and deploy the service on Kubernetes.