Deploy a Streamlit Service
What you'll learn
- Creating a Streamlit Service
- Deploying our service via servicefoundry
This is a guide to deploying an app built with Streamlit via servicefoundry.
After you complete the guide, you will have successfully deployed a Streamlit Service.
Project structure
To complete this guide, you are going to create the following files:
- main.py: contains our Streamlit code
- deploy.py: contains our deployment code
- requirements.txt: contains our dependencies
Your final file structure is going to look like this:
.
├── main.py
├── deploy.py
└── requirements.txt
Step 1: Writing our Streamlit app
The first step is to create a Streamlit app.
In our main.py code, we download the data and cache it when the app loads for the first time. Alternatively, we can keep the data file uber-raw-data-sep14.csv.gz in the same directory and read it from there.
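As a rough sketch of that alternative (assuming you have already downloaded uber-raw-data-sep14.csv.gz next to main.py; the function name load_local_data is only illustrative), the loader could read the local file instead of the URL:
import pandas as pd

# Hypothetical local-read variant: assumes uber-raw-data-sep14.csv.gz has been
# downloaded next to main.py. pandas infers gzip compression from the extension.
def load_local_data(nrows):
    data = pd.read_csv("uber-raw-data-sep14.csv.gz", nrows=nrows)
    return data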
Create the main.py and requirements.txt files in your project directory:
.
├── main.py
└── requirements.txt
main.py
import streamlit as st
import pandas as pd
import numpy as np

st.title("Uber pickups in NYC")

DATE_COLUMN = "date/time"
DATA_URL = (
    "https://s3-us-west-2.amazonaws.com/"
    "streamlit-demo-data/uber-raw-data-sep14.csv.gz"
)

# Download the dataset once and cache the result across reruns.
@st.cache
def load_data(nrows):
    data = pd.read_csv(DATA_URL, nrows=nrows)
    # Normalize column names to lowercase and parse the timestamp column.
    lowercase = lambda x: str(x).lower()
    data.rename(lowercase, axis="columns", inplace=True)
    data[DATE_COLUMN] = pd.to_datetime(data[DATE_COLUMN])
    return data

# Show a loading message while the data is fetched.
data_load_state = st.text("Loading data...")
data = load_data(10000)
data_load_state.text("Done! (using st.cache)")

# Histogram of pickups for each hour of the day.
st.subheader("Number of pickups by hour")
hist_values = np.histogram(data[DATE_COLUMN].dt.hour, bins=24, range=(0, 24))[0]
st.bar_chart(hist_values)
main.py downloads the Uber pickups dataset, caches it with st.cache, and renders a histogram of the number of pickups per hour.
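Note: newer Streamlit releases deprecate st.cache in favor of st.cache_data. If you are on such a version, a minimal swap (everything else unchanged) would be:
import streamlit as st

# Assumption: a newer Streamlit release where st.cache is deprecated;
# st.cache_data is the recommended replacement for caching data loads.
@st.cache_data
def load_data(nrows):
    ...  # body unchanged from the version above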
requirements.txt
streamlit
pandas
numpy
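To try the app locally before deploying (assuming the dependencies above are installed in your environment), you can run the same command the service will later use:
streamlit run main.py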
Step 2: Deploying the Streamlit app
We will now deploy the Streamlit app we wrote in the section above. You can deploy either via the Python SDK or via a YAML file and the servicefoundry deploy command.
Via Python SDK
File Structure
.
├── main.py
├── deploy.py
└── requirements.txt
deploy.py
import argparse
import logging

from servicefoundry import Build, PythonBuild, Service, Resources, Port

logging.basicConfig(level=logging.INFO)

# The workspace FQN and host are passed in from the command line.
parser = argparse.ArgumentParser()
parser.add_argument("--workspace_fqn", required=True, type=str)
parser.add_argument("--host", required=True, type=str)
args = parser.parse_args()

service = Service(
    name="streamlit",
    # Build the image from the local source using the Python buildpack.
    image=Build(
        build_spec=PythonBuild(
            command="streamlit run main.py",
            requirements_path="requirements.txt",
        )
    ),
    # Streamlit serves on port 8501 by default.
    ports=[
        Port(
            port=8501,
            host=args.host,
        )
    ],
    resources=Resources(memory_limit=1500, memory_request=1000),
)
service.deploy(workspace_fqn=args.workspace_fqn)
deploy.py builds the service image from the local source with the Python buildpack, exposes port 8501 on the provided host, sets resource requests and limits, and deploys the service to your workspace.
Picking a value for host
The host value depends on the base domain URLs configured in the cluster settings. You can learn how to find the base domain URLs available to you here.
For example, if your base domain URL is *.truefoundry.your-org.com, then a valid value can be streamlit-your-workspace-8501.truefoundry.your-org.com.
Alternatively, if you have a non-wildcard base domain URL, e.g. truefoundry.your-org.com, then a valid value can be truefoundry.your-org.com/streamlit-your-workspace-8501.
You can deploy using the Python SDK with:
python deploy.py --workspace_fqn <YOUR WORKSPACE FQN HERE> --host <YOUR HOST>
Via YAML file
File Structure
.
├── main.py
├── deploy.yaml
└── requirements.txt
deploy.yaml
name: streamlit
type: service
image:
  type: build
  build_source:
    type: local
  build_spec:
    type: tfy-python-buildpack
    command: streamlit run main.py
ports:
  - port: 8501
    host: <Provide a host value based on your configured domain>
resources:
  memory_limit: 1500
  memory_request: 1000
deploy.yaml describes the same service declaratively: the build configuration, the port and host, and the resource requests and limits.
You can deploy the service with the YAML file using the command below:
servicefoundry deploy --workspace-fqn YOUR_WORKSPACE_FQN --file deploy.yaml --wait
End result
You can go to your deployments dashboard here, where you will find a new deployment with the name "streamlit".
Click on the service you just deployed. In the top-right corner you will see the endpoint of the deployed application.
Click on that link, and you will be redirected to your deployed application.