Introduction
To deploy FastAPI on Ubuntu 24.04, install Python dependencies into a virtual environment, run your app with Uvicorn, create a systemd service file to keep it alive across reboots and crashes, then configure Nginx as a reverse proxy to handle incoming HTTP traffic. The result is a production-ready API stack that restarts automatically, serves traffic over a clean domain, and is ready for HTTPS. On a Raff Ubuntu 24.04 VM, the full setup takes about 35 minutes.
FastAPI is a modern Python web framework that generates OpenAPI documentation automatically from type hints and delivers performance benchmarks that rival Node.js and Go for I/O-bound workloads. It runs on Uvicorn, an ASGI server, which handles the actual request/response loop. The combination of FastAPI + Uvicorn + Nginx is the production stack most teams land on: Uvicorn handles application concurrency, Nginx handles the network edge, and systemd handles process lifecycle. When we tested this stack on a Raff Tier 2 VM (1 vCPU / 2 GB RAM), a minimal FastAPI endpoint handled approximately 4,200 requests per second at under 3ms p99 latency — well above typical API throughput requirements for early-stage production applications.
In this tutorial, you will install Python 3 and Uvicorn on Ubuntu 24.04, create a FastAPI application with a virtual environment, write a systemd service file that runs Uvicorn as a background process, configure Nginx to reverse proxy traffic to it, and verify the full stack is responding correctly end to end.
Note
This tutorial deploys a working FastAPI skeleton. Replace the sample app code in Step 3 with your own application. The systemd, Nginx, and virtual environment patterns apply identically regardless of your application logic.
Step 1 — Update the System and Install Python Dependencies
Ubuntu 24.04 ships with Python 3.12. Confirm it is present and install pip and venv:
```bash
sudo apt update && sudo apt upgrade -y
sudo apt install -y python3 python3-pip python3-venv nginx
```
Verify Python and pip versions:
```bash
python3 --version
pip3 --version
```
Expected output:
```
Python 3.12.3
pip 24.0 from /usr/lib/python3/dist-packages/pip (python 3.12)
```
Also confirm Nginx installed correctly:
```bash
sudo systemctl status nginx
```
Expected output:
```
● nginx.service - A high performance web server and a reverse proxy server
     Active: active (running)
```
Note
Minimal Ubuntu 24.04 VM images do not ship `python3-pip` by default. If `pip3 --version` returns `command not found`, run `sudo apt install -y python3-pip` to resolve it.
Step 2 — Create the Application User and Directory
Running your FastAPI app as root is a security risk. Create a dedicated system user that owns the application directory and runs the Uvicorn process:
```bash
sudo useradd --system --no-create-home --shell /bin/false fastapi
```
Create the application directory and set ownership:
```bash
sudo mkdir -p /var/www/fastapi
sudo chown fastapi:fastapi /var/www/fastapi
```
The `--system` flag creates a service account rather than a regular user, `--no-create-home` skips creating a home directory, and `--shell /bin/false` prevents interactive logins even if the account were somehow compromised.
Step 3 — Set Up the Virtual Environment and Install FastAPI
Switch to the application directory and create a Python virtual environment owned by the fastapi user:
```bash
cd /var/www/fastapi
sudo -u fastapi python3 -m venv venv
```
There is no need to activate the virtual environment; invoke its `pip` binary directly to install packages into it:
```bash
sudo -u fastapi /var/www/fastapi/venv/bin/pip install fastapi uvicorn
```
Confirm the installations:
```bash
sudo -u fastapi /var/www/fastapi/venv/bin/pip show fastapi uvicorn
```
Expected output:
```
Name: fastapi
Version: 0.115.x
---
Name: uvicorn
Version: 0.34.x
```
Now create the application file. This example deploys a minimal FastAPI app with a health check endpoint and one sample route — replace this with your actual application code:
```bash
sudo nano /var/www/fastapi/main.py
```
Paste the following:
```python
from fastapi import FastAPI

app = FastAPI(
    title="My API",
    description="Running on Raff Technologies cloud",
    version="1.0.0"
)

@app.get("/")
def root():
    return {"status": "ok", "message": "FastAPI is running"}

@app.get("/health")
def health():
    return {"status": "healthy"}

@app.get("/items/{item_id}")
def read_item(item_id: int, query: str | None = None):
    return {"item_id": item_id, "query": query}
```
Save with Ctrl+O (press Enter to confirm the filename), then exit with Ctrl+X.
Set correct ownership:
```bash
sudo chown fastapi:fastapi /var/www/fastapi/main.py
```
Test the application runs before creating the service:
```bash
sudo -u fastapi /var/www/fastapi/venv/bin/uvicorn main:app --app-dir /var/www/fastapi --host 127.0.0.1 --port 8000
```
Expected output:
```
INFO:     Started server process [12345]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to stop)
```
Press Ctrl+C to stop the test run. The app is working — now make it permanent.
Step 4 — Create the systemd Service File
A systemd service file tells the system how to start, stop, and restart your application. Create one for Uvicorn:
```bash
sudo nano /etc/systemd/system/fastapi.service
```
Paste the following:
```ini
[Unit]
Description=FastAPI application served by Uvicorn
After=network.target

[Service]
Type=simple
User=fastapi
Group=fastapi
WorkingDirectory=/var/www/fastapi
ExecStart=/var/www/fastapi/venv/bin/uvicorn main:app --host 127.0.0.1 --port 8000 --workers 1
Restart=on-failure
RestartSec=5s

# Security hardening
NoNewPrivileges=true
PrivateTmp=true
ProtectSystem=strict
ReadWritePaths=/var/www/fastapi

[Install]
WantedBy=multi-user.target
```
Save and exit. A few things worth noting about this service file:
- `--host 127.0.0.1` binds Uvicorn to localhost only. Nginx will proxy traffic to it from outside. Never bind Uvicorn directly to `0.0.0.0` in production — doing so exposes the raw ASGI server to the internet, bypassing Nginx entirely.
- `--workers 1` is intentional for a Tier 1/Tier 2 VM with 1 vCPU. A single Uvicorn worker handles async I/O efficiently on one core. Increase to 2–4 workers if you upgrade to a Raff Tier 3 or higher VM with 2+ vCPUs.
- `Restart=on-failure` means systemd will restart the process if it exits with a non-zero code. `RestartSec=5s` adds a brief delay before restarting to avoid hammering resources on repeated failures.
- `NoNewPrivileges`, `PrivateTmp`, and `ProtectSystem` are systemd hardening directives that restrict what the process can do at the OS level — good defaults for any internet-facing service.
Reload systemd to register the new service, start it, and enable it on boot:
```bash
sudo systemctl daemon-reload
sudo systemctl start fastapi
sudo systemctl enable fastapi
```
Check the service status:
```bash
sudo systemctl status fastapi
```
Expected output:
```
● fastapi.service - FastAPI application served by Uvicorn
     Loaded: loaded (/etc/systemd/system/fastapi.service; enabled)
     Active: active (running) since Thu 2026-04-24 11:05:33 UTC; 5s ago
   Main PID: 13421 (uvicorn)
```
Confirm the app is responding locally before touching Nginx:
```bash
curl http://127.0.0.1:8000
```
Expected output:
```json
{"status":"ok","message":"FastAPI is running"}
```
Tip
If the service fails to start, read the logs immediately with `sudo journalctl -u fastapi -n 50`. The most common causes are a wrong path in `ExecStart`, a missing `main.py`, or a pip package that was not installed into the virtual environment.
Step 5 — Configure Nginx as a Reverse Proxy
Nginx will accept traffic on port 80, forward it to Uvicorn on 127.0.0.1:8000, and handle connection buffering and header forwarding. Create a new Nginx server block:
```bash
sudo nano /etc/nginx/sites-available/fastapi
```
Paste the following, replacing your-domain.com with your actual domain:
```nginx
server {
    listen 80;
    server_name your-domain.com;

    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;

        # WebSocket support (required if your FastAPI app uses WebSockets)
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";

        # Timeouts
        proxy_connect_timeout 60s;
        proxy_read_timeout 60s;
    }

    # Serve FastAPI's auto-generated OpenAPI docs and static assets
    location /docs {
        proxy_pass http://127.0.0.1:8000/docs;
        proxy_set_header Host $host;
    }

    location /openapi.json {
        proxy_pass http://127.0.0.1:8000/openapi.json;
        proxy_set_header Host $host;
    }
}
```
The `X-Forwarded-For` and `X-Real-IP` headers pass the client's actual IP address to FastAPI, which would otherwise only see `127.0.0.1` for every request. This matters for rate limiting, logging, and any IP-based access logic in your application.
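To illustrate the parsing side, here is a minimal sketch of how an application might extract the client address from `X-Forwarded-For`. The helper function is hypothetical (not part of FastAPI); it assumes the Nginx config above, where `$proxy_add_x_forwarded_for` appends the address Nginx itself observed as the last entry:

```python
def client_ip(headers: dict, fallback: str = "127.0.0.1") -> str:
    """Extract the client IP from an X-Forwarded-For header.

    With Nginx as the sole public entry point, the last entry is the
    address Nginx saw directly and is therefore trustworthy; earlier
    entries are client-supplied and can be spoofed. Fall back to the
    socket address when the header is absent.
    """
    forwarded = headers.get("x-forwarded-for")
    if forwarded:
        return forwarded.split(",")[-1].strip()
    return fallback
```

In a FastAPI route you would feed it `request.headers` and use `request.client.host` as the fallback; the key point is to trust only the entry your own proxy appended.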
Enable the site by creating a symlink:
```bash
sudo ln -s /etc/nginx/sites-available/fastapi /etc/nginx/sites-enabled/
```
Remove the default Nginx site to avoid conflicts:
```bash
sudo rm /etc/nginx/sites-enabled/default
```
Test the Nginx configuration for syntax errors:
```bash
sudo nginx -t
```
Expected output:
```
nginx: the configuration file /etc/nginx/nginx.conf syntax is ok
nginx: configuration file /etc/nginx/nginx.conf test is successful
```
If the test passes, reload Nginx:
```bash
sudo systemctl reload nginx
```
Step 6 — Open the Firewall and Test the Public Endpoint
Allow HTTP traffic through UFW:
```bash
sudo ufw allow 80/tcp
sudo ufw status
```
Confirm port 80 is also open in your Raff cloud firewall via the control panel. Both layers must permit the traffic.
Test the public endpoint from your local machine:
```bash
curl http://your-domain.com
```
Expected output:
```json
{"status":"ok","message":"FastAPI is running"}
```
Test the health endpoint:
```bash
curl http://your-domain.com/health
```
Expected output:
```json
{"status":"healthy"}
```
Test a path parameter:
```bash
curl "http://your-domain.com/items/42?query=test"
```
Expected output:
```json
{"item_id":42,"query":"test"}
```
Open the auto-generated API documentation in a browser:
http://your-domain.com/docs
FastAPI generates interactive Swagger UI documentation automatically from your route definitions and type hints. The /docs endpoint is one of FastAPI's most useful production features — your API is self-documenting from the moment it starts.
Tip
If your API is internal-only, block `/docs` and `/openapi.json` from public access by adding `allow`/`deny` directives restricted to your VPN subnet inside those location blocks. Note that simply deleting the blocks is not enough — `location /` would still proxy those paths to Uvicorn.
Step 7 — Add HTTPS with Certbot
Your API is running over plain HTTP. For any production deployment, HTTPS is non-negotiable. Install Certbot and the Nginx plugin:
```bash
sudo apt install -y certbot python3-certbot-nginx
```
Obtain and install a certificate for your domain:
```bash
sudo certbot --nginx -d your-domain.com
```
Certbot will prompt for an email address for renewal notices, ask you to agree to Let's Encrypt terms, then automatically modify your Nginx configuration to add SSL and redirect HTTP to HTTPS.
Expected output at completion:
```
Successfully received certificate.
Certificate is saved at: /etc/letsencrypt/live/your-domain.com/fullchain.pem
Key is saved at:         /etc/letsencrypt/live/your-domain.com/privkey.pem
Deploying certificate to VirtualHost /etc/nginx/sites-enabled/fastapi
Redirecting all traffic on port 80 to ssl in /etc/nginx/sites-enabled/fastapi
```
Verify HTTPS is working:
```bash
curl https://your-domain.com/health
```
Expected output:
```json
{"status":"healthy"}
```
Certbot installs a systemd timer that automatically renews certificates before they expire. Confirm it is active:
```bash
sudo systemctl status certbot.timer
```
Expected output:
```
● certbot.timer - Run certbot twice daily
     Active: active (waiting)
```
Step 8 — Verify the Full Stack End to End
Confirm every component of the stack is running and enabled:
```bash
sudo systemctl is-active fastapi nginx certbot.timer
```
Expected output:
```
active
active
active
```
Simulate a crash to confirm systemd restarts Uvicorn automatically:
```bash
# Find the Uvicorn process ID
sudo systemctl show fastapi --property=MainPID

# Kill it forcefully
sudo kill -9 <MainPID>

# Wait 6 seconds (RestartSec=5s), then check
sleep 6 && sudo systemctl status fastapi
```
Expected output:
```
     Active: active (running) since ...
```
The service restarted automatically. Without systemd, a Uvicorn crash means your API is silently down until someone notices. With the service file from Step 4, recovery is automatic and logged.
Check recent application logs:
```bash
sudo journalctl -u fastapi -n 20 --no-pager
```
This shows the last 20 log lines from Uvicorn — useful for spotting application errors, slow startup, or unexpected restarts in production.
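Logs only cover failures the process itself reports. An external probe of the `/health` endpoint also catches problems that never reach journalctl, such as an Nginx misconfiguration. Here is a hedged standard-library sketch — the URL is a placeholder, and the response-checking logic is split into a pure function so it can be tested without a live server:

```python
import json
import urllib.error
import urllib.request


def is_healthy(status_code: int, body: bytes) -> bool:
    """Interpret a /health response: HTTP 200 with {"status": "healthy"}."""
    if status_code != 200:
        return False
    try:
        payload = json.loads(body)
    except (ValueError, UnicodeDecodeError):
        return False
    return payload.get("status") == "healthy"


def probe(url: str = "https://your-domain.com/health", timeout: float = 5.0) -> bool:
    """Fetch the health endpoint and evaluate the response."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return is_healthy(resp.status, resp.read())
    except (urllib.error.URLError, OSError):
        return False
```

Run from cron or a systemd timer on a separate machine, a probe like this gives you the outside-in view that service status checks on the VM itself cannot.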
Conclusion
You now have FastAPI running in production on your Raff Ubuntu 24.04 VM: Uvicorn serving the ASGI application, systemd keeping it alive across crashes and reboots, Nginx handling the network edge, and Let's Encrypt providing automatic TLS. The stack is intentionally boring — each component does one job, and the combination is stable, auditable, and easy to extend.
A few natural next steps:
- Scale to multiple workers: On a Raff Tier 3 VM (2 vCPU / 4 GB RAM), update the systemd service file to `--workers 2`, or switch to Gunicorn with the Uvicorn worker class (`gunicorn -k uvicorn.workers.UvicornWorker`) for true multi-process parallelism.
- Add a database: PostgreSQL or SQLite with SQLAlchemy integrates cleanly into FastAPI's dependency injection system. Deploy a database on the same VM for development, or use a separate Raff VM for production separation.
- Harden the firewall: With the API running, revisit your cloud firewall rules to confirm only ports 80, 443, and SSH are publicly accessible. Your Uvicorn port (8000) should remain bound to `127.0.0.1` and never appear in an external port scan.
This tutorial was tested by Aybars on a Raff Tier 2 VM (1 vCPU / 2 GB RAM) running Ubuntu 24.04 LTS. The full stack — from apt update to a verified HTTPS /health response — was running in 31 minutes on a fresh VM image.

