Step 01
Before you start
This course picks up from Persist chatbot history. You’ll take the same gemini-chatbot/ folder — with the chatsite Django project, the chat app, and the Message model — and put it on the public internet via Vercel’s Python serverless runtime.
- A working gemini-chatbot/ folder from the previous three courses.
- Git installed locally — verify with git --version.
- A GitHub account (free tier is fine) and an empty repository to push to.
- A Vercel account signed in with the same GitHub identity — sign up free at vercel.com.
What you'll build
gemini-chatbot.vercel.app serving your Django chatbot. Every git push to main triggers a new production deploy automatically.
Step 02
Pin dependencies in requirements.txt
Vercel installs Python packages from a requirements.txt at the project root. Pin the packages you’ve used across the previous courses, plus whitenoise so Django can serve its own static assets without a separate CDN.
requirements.txt
django>=5.0
google-generativeai>=0.8
python-dotenv>=1.0
whitenoise>=6.6
Install it into your existing virtualenv to confirm the pins resolve:
terminal
(.venv) $ pip install -r requirements.txt
Step 03
Production-proof chatsite/settings.py
The settings.py you’ve been running locally has DEBUG = True and an empty ALLOWED_HOSTS. On a real domain, the first leaks stack traces to every visitor and the second makes Django reject every request. Patch the file so it reads production-sensitive values from environment variables and serves static files through WhiteNoise.
chatsite/settings.py
# chatsite/settings.py
import os
from pathlib import Path
BASE_DIR = Path(__file__).resolve().parent.parent
# Read SECRET_KEY and DEBUG from the environment in production.
SECRET_KEY = os.environ.get("DJANGO_SECRET_KEY", "dev-only-insecure-key")
DEBUG = os.environ.get("DJANGO_DEBUG", "False") == "True"
# Vercel assigns each deployment a *.vercel.app hostname.
ALLOWED_HOSTS = [".vercel.app", ".now.sh", "localhost", "127.0.0.1"]
MIDDLEWARE = [
"django.middleware.security.SecurityMiddleware",
# WhiteNoise must come right after SecurityMiddleware.
"whitenoise.middleware.WhiteNoiseMiddleware",
"django.contrib.sessions.middleware.SessionMiddleware",
"django.middleware.common.CommonMiddleware",
"django.middleware.csrf.CsrfViewMiddleware",
"django.contrib.auth.middleware.AuthenticationMiddleware",
"django.contrib.messages.middleware.MessageMiddleware",
"django.middleware.clickjacking.XFrameOptionsMiddleware",
]
# Collected static files live here; WhiteNoise serves them in production.
STATIC_URL = "static/"
STATIC_ROOT = BASE_DIR / "staticfiles"
# Django 5 configures storage backends via STORAGES
# (the old STATICFILES_STORAGE setting was removed in Django 5.1).
STORAGES = {
    "default": {"BACKEND": "django.core.files.storage.FileSystemStorage"},
    "staticfiles": {
        "BACKEND": "whitenoise.storage.CompressedManifestStaticFilesStorage",
    },
}
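The manifest storage is what makes aggressive caching safe: collectstatic renames every file after a hash of its contents, so a changed file gets a new URL and browsers can cache the old one forever. A rough illustration of the renaming — not Django’s exact algorithm, which also compresses files and rewrites URLs inside CSS:

```python
import hashlib
from pathlib import PurePosixPath

def fingerprint(name: str, content: bytes) -> str:
    # Mimic manifest-style naming: chat.css -> chat.<hash>.css.
    # Django keeps a manifest mapping original names to hashed ones.
    digest = hashlib.md5(content).hexdigest()[:12]
    path = PurePosixPath(name)
    return f"{path.stem}.{digest}{path.suffix}"

print(fingerprint("chat.css", b"body { margin: 0 }"))
```

Because the hash comes from the file contents, serving a stale asset after a deploy is impossible: the new HTML references the new hashed name.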
Why WhiteNoise
A serverless function has no separate web server or CDN in front of it to serve /static/ from. WhiteNoise lets Django itself serve compressed, fingerprinted static files from STATIC_ROOT, which is exactly what the serverless function needs.
Step 04
Add vercel.json and a build script
vercel.json tells Vercel how to build and route the project. Point the Python builder at chatsite/wsgi.py — the WSGI entry point Django generated when you ran startproject — and route every non-static URL through it.
vercel.json
{
"version": 2,
"builds": [
{
"src": "chatsite/wsgi.py",
"use": "@vercel/python",
"config": { "maxLambdaSize": "15mb", "runtime": "python3.12" }
},
{
"src": "build_files.sh",
"use": "@vercel/static-build",
"config": { "distDir": "staticfiles" }
}
],
"routes": [
{ "src": "/static/(.*)", "dest": "/static/$1" },
{ "src": "/(.*)", "dest": "chatsite/wsgi.py" }
]
}
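Route order matters: Vercel tries the "routes" array top to bottom and the first pattern that matches wins, so /static/ paths never reach Django. A toy stdlib simulation of that matching — illustrative only, since Vercel’s matcher supports more syntax than plain regexes:

```python
import re

# Mirror of the "routes" array: (pattern, destination) in priority order.
ROUTES = [
    (r"/static/(.*)", "/static/{0}"),   # collected assets, served as files
    (r"/(.*)", "chatsite/wsgi.py"),     # everything else hits Django
]

def route(path: str) -> str:
    """Return the destination for path; the first full match wins,
    with {0} standing in for Vercel's $1 capture reference."""
    for pattern, dest in ROUTES:
        match = re.fullmatch(pattern, path)
        if match:
            return dest.format(*match.groups())
    raise LookupError(path)

print(route("/static/chat.css"))  # -> /static/chat.css
print(route("/chat/send"))        # -> chatsite/wsgi.py
```

If the catch-all came first, it would swallow /static/ requests too — keep the specific route above the general one.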
The static-build step invokes a small shell script that installs dependencies and runs collectstatic so WhiteNoise has something to serve.
build_files.sh
#!/usr/bin/env bash
# build_files.sh — runs on Vercel before the Python build
python -m pip install --upgrade pip
pip install -r requirements.txt
python manage.py collectstatic --noinput
Make the script executable so Vercel’s Linux runner can run it:
terminal
(.venv) $ chmod +x build_files.sh
Step 05
Plan for a read-only filesystem
Vercel runs your Django code as stateless serverless functions. That’s great for request handling, but it means db.sqlite3 — which the persist-history course wrote to — won’t survive a redeploy and isn’t shared across parallel invocations.
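The usual fix is a hosted Postgres reached over the network, configured from a single DATABASE_URL environment variable. A stdlib-only sketch of how such a URL maps onto Django’s DATABASES shape — in practice the dj-database-url package does this for you, handling more engines and edge cases, and the URL below is a placeholder:

```python
from urllib.parse import urlparse

def database_from_url(url: str) -> dict:
    """Turn postgres://user:pass@host:port/name into the dict Django
    expects as DATABASES["default"]. Minimal sketch for illustration."""
    parts = urlparse(url)
    return {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": parts.path.lstrip("/"),
        "USER": parts.username,
        "PASSWORD": parts.password,
        "HOST": parts.hostname,
        "PORT": parts.port or 5432,
    }

# In settings.py you would read the real value from the environment:
# DATABASES = {"default": database_from_url(os.environ["DATABASE_URL"])}
cfg = database_from_url("postgres://chat:s3cret@db.example.com:5432/chatdb")
print(cfg["HOST"])  # -> db.example.com
```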
Pick a managed database
Swap DATABASES in settings.py for Vercel Postgres or Neon — both offer free tiers and give you a DATABASE_URL you can read with dj-database-url.
Step 06
Commit, .gitignore, push to GitHub
Extend the .gitignore from earlier courses so build output and local databases stay off GitHub:
.gitignore
# In addition to the entries from earlier courses
db.sqlite3
staticfiles/
.vercel
Then initialise git, commit everything, and push to a fresh GitHub repository:
terminal
# From inside gemini-chatbot/
git init
git add .
git commit -m "Prepare Django chatbot for Vercel"
# Create an empty repo on github.com first, then:
git branch -M main
git remote add origin git@github.com:YOUR_USER/gemini-chatbot.git
git push -u origin main
Never commit .env
Check that .env is still ignored. Your GEMINI_API_KEY will live as a Vercel environment variable, not in the repository.
Step 07
Import the repo on Vercel
Go to vercel.com/new, pick the GitHub repo you just pushed, and let Vercel auto-detect the Python project. Before clicking Deploy, expand Environment Variables and add:
- GEMINI_API_KEY — the same value from your local .env.
- DJANGO_SECRET_KEY — a long random string; generate one with python -c "import secrets; print(secrets.token_urlsafe(50))".
- DJANGO_DEBUG — leave this unset (or set to False).
Click Deploy. Vercel runs build_files.sh, bundles the Python runtime, and assigns a your-project.vercel.app URL.
Prefer the CLI?
terminal
# Optional — deploy straight from the terminal
npm install -g vercel
vercel login
vercel # preview deployment
vercel --prod # production deployment
Step 08
Verify and iterate
Open the *.vercel.app URL in a browser and send a message. If the reply comes back from Gemini, you’re live. Open the Deployments tab on Vercel for build logs and the Functions tab for runtime logs — both show the exact traceback if a request 500s.
From here, every git push to main produces a new production deployment. Pushing to any other branch produces a preview deployment on its own URL — great for reviewing a change before it hits production.
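You can also smoke-test a deployment from a script instead of a browser. A minimal check with the standard library — the hostname below is a placeholder for your real *.vercel.app URL:

```python
import urllib.error
import urllib.request

def smoke_check(url: str, timeout: float = 10.0) -> int:
    """Fetch url and return the HTTP status code. Error responses
    like 404 or 500 are returned as codes rather than raised."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

# print(smoke_check("https://your-project.vercel.app/"))  # expect 200
```

Drop a call like this into CI and a broken deploy fails loudly instead of waiting for a user to find it.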
Step 09
Where to take it next
- Attach a custom domain from the Vercel project settings — point a DNS record and Vercel handles TLS for you.
- Swap SQLite for Vercel Postgres so chat history survives redeploys and scales across parallel invocations.
- Enable Vercel Analytics to see real traffic and response times without wiring up a separate tool.
- Use preview deployments — open a pull request and Vercel builds a throwaway URL for it, perfect for sharing work-in-progress tweaks before merging.