Tutorial · Intermediate

Create a chatbot

By Ian Freitz de la Cerna · ianfreitz.com

Build a conversational AI in Python using Google's free Gemini API — from API key to a chatbot with its own personality.

Intermediate · Gemini · Python · ~8 min · Free tier

What you’ll become

A developer who can ship an AI chatbot from an empty folder to a running demo.

  • Wire up the Gemini API with a safely stored key
  • Hold a multi-turn conversation loop in Python
  • Give the bot its own persona and voice
  • Structure a .env-safe, git-ready project

Step 01

Prerequisites & tools

Before writing any code, make sure the following are ready on your machine. These are the only prerequisites for the tutorial.

  • Python 3.9 or higher — verify with python --version (python3 --version on some systems).
  • A terminal — PowerShell or CMD on Windows; Terminal on macOS or Linux.
  • A code editor — VS Code is recommended, but any editor works.
  • A Google account — required to access Google AI Studio and generate a free API key.

Free tier

Google AI Studio provides up to 15 requests per minute and 1,500 requests per day at no cost — more than enough for this tutorial and a first real project.
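If you later script against the API in a loop, it's worth pacing requests client-side so you stay under that 15-requests-per-minute ceiling. A minimal sketch (the helper name and approach are illustrative, not part of the Gemini SDK):

```python
import time

# Space requests to stay under the free tier's 15 requests per minute.
MIN_INTERVAL = 60 / 15  # seconds between requests

_last_call = float("-inf")  # time the previous request went out

def wait_for_slot(now=None):
    """Return how many seconds to sleep before the next request is allowed."""
    global _last_call
    now = time.monotonic() if now is None else now
    delay = max(0.0, _last_call + MIN_INTERVAL - now)
    _last_call = now + delay  # when this request will actually go out
    return delay
```

Call `time.sleep(wait_for_slot())` before each API request. For the interactive chatbot in this tutorial, human typing speed keeps you well under the limit, so no pacing is needed there.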

Step 02

Get your Gemini API key

The API key is the credential that lets your Python script communicate with Gemini. Generating one is free.

  • Go to aistudio.google.com and sign in with your Google account.
  • In the left sidebar, click Get API Key (or visit aistudio.google.com/apikey directly).
  • Click Create API Key and select an existing Google Cloud project, or create a new one.
  • Copy the generated key — it starts with AIza… — and keep it somewhere safe.

Security

Never share the key publicly or commit it to a repository. The next step stores it safely in a .env file.

Step 03

Create the project & install libraries

Set up a clean project folder and install the two Python packages we need. A virtual environment keeps the dependencies isolated.

terminal

# 1. Create a project folder
mkdir gemini-chatbot && cd gemini-chatbot

# 2. Create a virtual environment
python -m venv .venv

# 3. Activate it (macOS / Linux)
source .venv/bin/activate

#    Activate it (Windows PowerShell)
.venv\Scripts\Activate.ps1

# 4. Install required packages
pip install google-generativeai python-dotenv

The folder should now look like this:

project structure

gemini-chatbot/
  .venv/         virtual environment (don't touch)
  .env           your API key (next step)
  chatbot.py     main chatbot script
  .gitignore     keeps .env out of git

Step 04

Store the API key in .env

Never hardcode the key into the Python file. Put it in a .env file and let python-dotenv load it at runtime.

.env

# Paste your actual Gemini API key here
GEMINI_API_KEY=AIzaSy...your_key_here

Then add a .gitignore so the key never leaves your machine:

.gitignore

.env
.venv/
__pycache__/
*.pyc

Step 05

Write the chatbot

Create chatbot.py in the project folder. The whole program fits in a single file.

chatbot.py

# Gemini AI Chatbot — powered by Google AI Studio
import os
from dotenv import load_dotenv
import google.generativeai as genai

# 1. Load API key from .env
load_dotenv()
api_key = os.getenv("GEMINI_API_KEY")

if not api_key:
    raise ValueError("GEMINI_API_KEY not found. Check your .env file.")

# 2. Configure the Gemini client
genai.configure(api_key=api_key)

# 3. Choose the model
model = genai.GenerativeModel("gemini-2.0-flash")

# 4. Start a chat session (keeps history)
chat = model.start_chat(history=[])

# 5. Conversation loop
print("Gemini chatbot ready. Type 'quit' to exit.\n")

while True:
    user_input = input("You: ").strip()

    if user_input.lower() in ("quit", "exit", "bye"):
        print("Goodbye.")
        break

    if not user_input:
        continue

    try:
        response = chat.send_message(user_input)
        print(f"\nGemini: {response.text}\n")
    except Exception as e:
        print(f"\nError: {e}\n")

What each section does:

  • load_dotenv() reads .env and exposes GEMINI_API_KEY through os.getenv.
  • genai.configure() authenticates every subsequent API call.
  • GenerativeModel("gemini-2.0-flash") selects a fast, free model well-suited to chat.
  • model.start_chat() creates a session that retains the conversation history across turns automatically.
  • The while loop reads user input, sends it to Gemini, and prints the reply until the user types quit.
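To see what the session is actually retaining, you can inspect chat.history: in google-generativeai, each entry has a role ("user" or "model") and a list of parts. A small pretty-printer (a sketch assuming that shape; the helper name is ours):

```python
def format_history(history):
    """Render a chat history as 'role: text' lines.

    Assumes each item exposes .role and .parts, as entries in
    chat.history do; parts with a .text attribute are unwrapped.
    """
    lines = []
    for item in history:
        text = " ".join(getattr(p, "text", str(p)) for p in item.parts)
        lines.append(f"{item.role}: {text}")
    return "\n".join(lines)
```

Dropping `print(format_history(chat.history))` into the loop after each reply shows the full context Gemini sees on the next turn.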

Step 06

Run the chatbot

Make sure the virtual environment is active — the prompt should start with (.venv) — then run the script.

terminal

(.venv) $ python chatbot.py
Gemini chatbot ready. Type 'quit' to exit.

You: Hello! What can you help me with?

Gemini: Hi! I'm a Gemini-powered assistant. I can help
with questions, writing, coding, analysis, and more.

You: quit
Goodbye.

A working chatbot in roughly thirty lines of Python. Each turn sends your message to Gemini, the model generates a response, and the reply prints in your terminal — with previous turns retained as context.

Step 07

Give the bot a personality

Right now the chatbot is a general-purpose assistant. Specialise it by adding a system prompt — a hidden instruction that shapes its tone, role and behaviour for the whole conversation.

Replace the model = genai.GenerativeModel(...) line in chatbot.py with the following:

chatbot.py

# Define the bot's personality
system_prompt = """
You are Alex, a friendly and knowledgeable tech support assistant
for a software company. Your traits:

- Warm, encouraging, patient tone
- Ask clarifying questions when a problem is unclear
- Provide step-by-step solutions
- Use simple language; avoid jargon unless the user does
"""

# Create the model with a system instruction
model = genai.GenerativeModel(
    model_name="gemini-2.0-flash",
    system_instruction=system_prompt,
)

chat = model.start_chat(history=[])

Ideas

Try a customer-service agent, a coding mentor, a recipe generator, a travel planner, or a fictional character. The system prompt is the single biggest lever on a chatbot’s feel.
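As one concrete variation, here is what a coding-mentor persona might look like (the prompt text is illustrative — swap it in for system_prompt above):

```python
# An alternative persona: use this as system_prompt in chatbot.py.
coding_mentor_prompt = """
You are Sam, a patient coding mentor.

- Explain the concept before showing any code
- Prefer small, runnable examples over long listings
- When the user shares an error, ask for the full traceback
- Never hand over a complete solution; guide the user to it
"""
```

Everything else in chatbot.py stays the same — only the system instruction changes the bot's behaviour.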

Step 08

Where to take it next

  • Add a web UI with Django so the chatbot runs in a browser — there’s a full follow-up course for this.
  • Ground it on your own documents with a retrieval step — let the bot answer from your PDFs or notes.
  • Persist conversations so history survives restarts — a full follow-up course walks through doing this with Django and SQLite.
  • Deploy to Vercel and share a public URL — there’s a full follow-up course that walks through shipping the Django version end-to-end.
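As a first cut at persistence before reaching for a database, the conversation turns can be dumped to JSON between runs. A minimal standard-library sketch (the filename and the {"role", "text"} record shape are our assumptions, not part of the SDK):

```python
import json
from pathlib import Path

HISTORY_FILE = Path("history.json")  # filename is an assumption

def save_turns(turns):
    """Write a list of {'role': ..., 'text': ...} dicts to disk."""
    HISTORY_FILE.write_text(json.dumps(turns, indent=2))

def load_turns():
    """Read turns back, or return an empty list on first run."""
    if HISTORY_FILE.exists():
        return json.loads(HISTORY_FILE.read_text())
    return []
```

In the chat loop, append each user message and reply to a list and call save_turns after every turn; on startup, load_turns gives you the transcript from the previous session.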
