Django, Postgres & Docker

Docker setup files for Django and Postgres, using docker-compose v3.7.

First we need a Dockerfile to set up the Python environment:

FROM python:3.7.6-slim

# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1

# Set work directory
WORKDIR /code

# Install dependencies
COPY Pipfile Pipfile.lock /code/
RUN pip install pipenv && pipenv install --system

# Copy project
COPY . /code/

This sets up the Docker environment for Python. On the top line we’re using the official Docker image for Python 3.7.6-slim, a slimmed-down variant of the Python image. Next we create two environment variables: PYTHONDONTWRITEBYTECODE stops Python from writing .pyc files, which we don’t need inside the container, and PYTHONUNBUFFERED keeps console output unbuffered so it shows up immediately in the Docker logs.
The next line sets WORKDIR to /code. With the working directory set to /code, later commands such as manage.py can be run from there without having to remember exactly where in the container the code actually lives.
Then we install our dependencies: we copy the local Pipfile and Pipfile.lock into the image, install pipenv with pip, and run pipenv install --system so the dependencies are installed into the container’s system Python. The RUN instruction lets us run commands in Docker just as we would on the command line.
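For reference, a minimal Pipfile that this Dockerfile could consume might look like the following; the package list and versions here are illustrative assumptions rather than the original project’s file:

[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true

[packages]
django = "~=3.0"
psycopg2-binary = "*"

[requires]
python_version = "3.7"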
We can’t run a Docker container until it has an image, so let’s build the image for the first time:
docker build .
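
If you’d like the image to have a recognizable name, you can also tag it at build time; the tag here is just an example:

docker build -t django_web .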

Next we need a new docker-compose.yml file.

version: '3.7'

services:
  db:
    image: postgres:12.1-alpine
    volumes:
      - postgres_data:/var/lib/postgresql/data/
  web:
    build: .
    command: python /code/manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - 8000:8000
    depends_on:
      - db

volumes:
  postgres_data:

On the top line we declare the Compose file format version, 3.7, the most recent at the time of writing.

Under db, the database service, we use the Docker image for Postgres 12.1-alpine, again a slimmed-down image, and mount the named volume postgres_data at /var/lib/postgresql/data/ so the database files persist even when the container is recreated.
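
One caveat: recent builds of the official postgres image refuse to start unless a superuser password (or an explicit trust setting) is supplied, so depending on the exact image you pull you may need an environment block on the db service, roughly like this (the password value is just a placeholder):

  db:
    image: postgres:12.1-alpine
    environment:
      - POSTGRES_PASSWORD=postgres  # placeholder; pick your own
    volumes:
      - postgres_data:/var/lib/postgresql/data/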

For web we’re specifying how the web service will run. Compose builds an image from the current directory and starts the development server at 0.0.0.0:8000. The volumes entry mounts the current directory into the container at /code, so local code changes are picked up without rebuilding. The ports config maps port 8000 on our machine to port 8000 in the container, the default Django port. And finally depends_on says that db should start before the web service.
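
Note that depends_on only controls start-up order; the Django project still has to be pointed at the db service by name. A sketch of the relevant DATABASES block in settings.py, assuming the postgres image’s default user and database name (the actual settings aren’t shown in this gist):

# settings.py (sketch; NAME, USER and PASSWORD are assumptions)
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': 'db',  # the Compose service name
        'PORT': 5432,
    }
}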

The last section, volumes, is there because Compose requires named volumes to be declared under a top-level volumes key.

For a Django deployment we need to migrate the database:
docker-compose run web python /code/manage.py migrate --noinput
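
Other manage.py commands can be run the same way; for example, to create an admin account:

docker-compose run web python /code/manage.py createsuperuser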

Finally everything is ready; build and start the containers with:
docker-compose up -d --build
Stop and remove the containers with:
docker-compose down
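
To check that both services are up, or to follow the web container’s output, the usual Compose commands apply:

docker-compose ps
docker-compose logs -f web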
