Using Docker to run your applications makes everything more reproducible: running the Docker container on your computer and on your deployment service of choice should behave identically, so you shouldn't run into unforeseen issues in production.
But in production you rarely run a single service: databases, background workers, and caches all need to run as well.
If you're using something like Railway or Render, you'll be used to setting up the different services with the UI:
[Image: a complex project set-up in Render, using several different services]
Although it's handy that you can use their UI to set up your services, it does pose a problem: how do you run all these locally, so you can make sure they all play together nicely before deploying?
This is where Docker Compose comes into play.
Creating a Docker container for a Flask app
Before we dive into Docker Compose, let's start by creating a basic Docker container for a Flask application. First, create a new directory for your project and navigate to it:
mkdir flask-docker-compose
cd flask-docker-compose
Create a new file named Dockerfile with the following content:
FROM python:3.12
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "app.py"]
This Dockerfile specifies the base Python image, sets the working directory, installs dependencies, copies the application files, and defines the command to run the Flask app.
Create a requirements.txt file with the following content:
flask
Finally, create an app.py file with a basic Flask application:
from flask import Flask

app = Flask(__name__)


@app.route("/")
def hello():
    return "Hello, World!"


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
Running the Flask app using Docker
To build and run the Docker container for your Flask app, execute the following commands:
docker build -t flask-app .
docker run -p 5000:5000 flask-app
Open your web browser and navigate to http://localhost:5000. You should see the "Hello, World!" message.
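If you prefer the terminal, a quick check with curl works too:

curl http://localhost:5000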
Use Docker Compose to run the Flask app and PostgreSQL
Now that we have a basic Flask app running in a Docker container, let's add PostgreSQL to the mix using Docker Compose.
The Docker Compose code for Flask
Create a new file named docker-compose.yml with the following content:
version: '3'

services:
  web:
    build: .
    ports:
      - "5000:5000"
    depends_on:
      - db
    environment:
      - DATABASE_URL=postgresql://postgres:password@db/myapp
    volumes:
      - .:/app

  db:
    image: postgres
    environment:
      - POSTGRES_PASSWORD=password
      - POSTGRES_DB=myapp
    volumes:
      - postgres_data:/var/lib/postgresql/data

volumes:
  postgres_data:
Let's break down the different parts of this Docker Compose file:
Port mapping
The ports section maps the container's port 5000 to the host's port 5000, allowing you to access the Flask app from your local machine.
Healthchecks
Docker Compose also lets you define healthchecks for your services. We haven't added any in this example, but they're a good way to make sure a service is actually ready rather than merely started; see the sketch below.
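For instance, a minimal sketch of a healthcheck on the db service could use pg_isready, which ships with the official postgres image (the interval and retry values here are just illustrative):

db:
  image: postgres
  environment:
    - POSTGRES_PASSWORD=password
    - POSTGRES_DB=myapp
  healthcheck:
    test: ["CMD-SHELL", "pg_isready -U postgres"]  # exits 0 once Postgres accepts connections
    interval: 5s
    timeout: 5s
    retries: 5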
Environment variables and files
The environment section allows you to set environment variables for your services. In this case, we set the DATABASE_URL for the Flask app to connect to the PostgreSQL database.
If you wanted to use a .env file for your service instead, you'd define the following block in docker-compose.yml:
web:
  build: .
  # ...
  env_file:
    - .env
Your .env file would contain the environment variables you'd like defined in your service:
DATABASE_URL=postgresql://postgres:password@db/myapp
REDIS_URL=redis://redis:6379
SECRET_KEY=your-secret-key
Virtual network
Docker Compose creates a virtual network for your project. When a container starts, it joins that network under its service name, which doubles as its hostname. That's why the PostgreSQL address is @db and the Redis hostname (which we'll add later) is simply redis.
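To make that concrete, here's a minimal sketch of connecting to the database by its service name, assuming you've added psycopg2-binary to requirements.txt (the credentials match the compose file above):

import psycopg2

# "db" resolves to the PostgreSQL container on the Compose network
conn = psycopg2.connect(
    host="db",
    dbname="myapp",
    user="postgres",
    password="password",
)
print("Connected to Postgres:", not conn.closed)
conn.close()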
Whether you use environment variables or environment files, to access their values in your Flask app you'd use the os module:
import os
from flask import Flask
app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = os.getenv("DATABASE_URL")
app.config["SECRET_KEY"] = os.getenv("SECRET_KEY")
Docker Compose for Flask + Postgres
To run the Flask app and PostgreSQL using Docker Compose, execute the following command:
docker-compose up
Docker Compose will build the Flask app image and start both the Flask app and PostgreSQL containers.
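A few variations you'll probably use day to day:

# Rebuild the images and start everything in the background
docker-compose up -d --build

# Follow the logs of all services
docker-compose logs -f

# Stop and remove the containers (named volumes are kept)
docker-compose down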
Local volumes
In the volumes section, we mount the current directory (.) to the /app directory inside the Flask app container. This allows you to make changes to your code and see them instantly without rebuilding the container.
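For the Flask development server to actually pick those changes up, its reloader needs to be on; one way to do that (fine for development, not for production) is to run the app in debug mode:

if __name__ == "__main__":
    # debug=True turns on the reloader, so edits in the mounted volume
    # restart the server automatically
    app.run(host="0.0.0.0", port=5000, debug=True)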
We also define a named volume postgres_data for PostgreSQL to persist the database data across container restarts.
Specifying dependencies between services
The depends_on section allows you to specify dependencies between services. In this case, the Flask app depends on the PostgreSQL database, so Docker Compose will start the db service before the web service.
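Keep in mind that depends_on only controls start-up order; it doesn't wait for the database to be ready to accept connections. If you've added a healthcheck like the one sketched earlier, the long form lets you wait for it (support for condition depends on your Compose version):

web:
  build: .
  depends_on:
    db:
      condition: service_healthy  # wait until the db healthcheck passes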
Adding more services to your Docker Compose set-up
Now that we have a basic Flask app and PostgreSQL running with Docker Compose, let's explore how to add more services like Celery and Flower.
Celery
To add Celery, a distributed task queue, to your Docker Compose setup, update your docker-compose.yml file:
version: '3'

services:
  # ...

  celery:
    build: .
    command: celery -A tasks worker --loglevel=info
    volumes:
      - .:/app
    depends_on:
      - db
      - redis

  redis:
    image: redis
This adds a Celery worker service that depends on the PostgreSQL database and a Redis instance for message brokering.
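One thing the snippet above glosses over: the worker reads the broker address from REDIS_URL (see tasks.py below), so that variable has to be set on the celery service, and celery and redis need to be added to requirements.txt. A sketch of the service with the variable added:

celery:
  build: .
  command: celery -A tasks worker --loglevel=info
  environment:
    - REDIS_URL=redis://redis:6379/0
  volumes:
    - .:/app
  depends_on:
    - db
    - redis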
To run Celery you'll need a tasks.py file (feel free to name it differently, but that's the convention!) with the Celery object definition and the task definitions. For example:
import os

from celery import Celery

app = Celery("tasks", broker=os.getenv("REDIS_URL"))


@app.task
def example_task(param):
    # Perform task logic here
    print(f"Executing task with param: {param}")
Flower
Flower is a web-based tool for monitoring Celery. To add Flower to your Docker Compose setup:
version: '3'

services:
  # ...

  flower:
    build: .
    command: celery -A tasks flower --port=5555
    ports:
      - "5555:5555"
    depends_on:
      - celery
This adds a Flower service that depends on the Celery worker and exposes port 5555 for accessing the Flower web interface. You'll also need to add flower to requirements.txt, since it's a separate package.
With these additions, your Docker Compose setup now includes a Flask app, PostgreSQL database, Celery worker, Redis message broker, and Flower monitoring tool. You can start all the services with a single docker-compose up command and develop your application locally with ease.
Conclusion
Thank you for reading! If you've enjoyed this article, consider subscribing to get notified when we publish our next one.