@robkooper
Last active November 4, 2021 17:38
Deploying Clowder using Docker
.env
# This file will override the configuration options in the docker-compose
# files. Copy this file to the same folder as the docker-compose files as .env
# ----------------------------------------------------------------------
# GENERAL CONFIGURATION
# ----------------------------------------------------------------------
# project name (-p flag for docker-compose)
#COMPOSE_PROJECT_NAME=dev
# ----------------------------------------------------------------------
# TRAEFIK CONFIGURATION
# ----------------------------------------------------------------------
# hostname of server
TRAEFIK_HOST=Host:clowder-docker.ncsa.illinois.edu;
# only allow access from localhost and NCSA
#TRAEFIK_IPFILTER=172.16.0.0/12, 141.142.0.0/16
# Run traefik on port 80 (http) and port 443 (https)
TRAEFIK_HTTP_PORT=80
TRAEFIK_HTTPS_PORT=443
TRAEFIK_HTTPS_OPTIONS=TLS
# enable SSL certificate generation
TRAEFIK_ACME_ENABLE=true
# Use your real email address here to be notified when the cert is about to expire
TRAEFIK_ACME_EMAIL=devnull@example.com
# Always use https, traffic to http is redirected to https
TRAEFIK_HTTP_REDIRECT=Redirect.EntryPoint:https
# ----------------------------------------------------------------------
# CLOWDER CONFIGURATION
# ----------------------------------------------------------------------
# what version of clowder to use
CLOWDER_VERSION=develop
# path for clowder
#CLOWDER_CONTEXT=/clowder/
# list of initial admins
#CLOWDER_ADMINS=admin@example.com
# require approval by the clowder admins before a user can log in
#CLOWDER_REGISTER=true
# secret used, for example, to encrypt cookies
CLOWDER_SECRET=this_is_something_you_should_change
# admin key to clowder
CLOWDER_KEY=super_secret
# use SSL for login pages (set this if you enable ACME)
CLOWDER_SSL=true
# mock sending email (false means really send email using the smtp server below)
SMTP_MOCK=false
# name of the smtp server that will handle the emails from clowder
SMTP_SERVER=smtp
# ----------------------------------------------------------------------
# RABBITMQ CONFIGURATION
# ----------------------------------------------------------------------
# RabbitMQ username and password
#RABBITMQ_DEFAULT_USER=clowder
#RABBITMQ_DEFAULT_PASS=cats
# create the correct URI with above username and password
#RABBITMQ_URI=amqp://clowder:cats@rabbitmq/%2F
# exchange to be used
#RABBITMQ_EXCHANGE=clowder
# in case of an external rabbitmq, the url used to reach clowder
#RABBITMQ_CLOWDERURL=https://clowder-docker.ncsa.illinois.edu/
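Before deploying, replace CLOWDER_SECRET and CLOWDER_KEY in the .env above with random values. A minimal sketch, assuming openssl is available on the host:

openssl rand -hex 32   # paste the output into CLOWDER_SECRET
openssl rand -hex 16   # paste the output into CLOWDER_KEY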

This will get Clowder up and running. It assumes you have docker and docker-compose installed.

Grab the 4 files below and modify them as required:

  • docker-compose.yml : the main docker setup file (same as in the develop branch as of writing)
  • docker-compose.override.yml : changes to the docker-compose file, for example where to write the data
  • docker-compose.extractors.yml : the list of extractors you want to start with Clowder
  • .env : any changed environment variables that are used in the docker-compose files

Once you have the files in place, create the folders used for persistent storage (the paths must match the device entries in docker-compose.override.yml):

mkdir -p /home/clowder/clowder/{custom,data,elasticsearch,mongo,rabbitmq,traefik}

Start clowder:

docker-compose -f docker-compose.yml -f docker-compose.override.yml -f docker-compose.extractors.yml up -d

Wait for the containers to start, then connect to the server: https://clowder-docker.ncsa.illinois.edu
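Startup can take a few minutes. To watch progress from the folder containing the compose files, for example:

docker-compose -f docker-compose.yml -f docker-compose.override.yml -f docker-compose.extractors.yml ps
docker-compose -f docker-compose.yml -f docker-compose.override.yml -f docker-compose.extractors.yml logs -f clowder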

docker-compose.extractors.yml
version: '3.5'

# to use the extractors start with
# docker-compose -f docker-compose.yml -f docker-compose.override.yml -f docker-compose.extractors.yml up -d

services:
  # ----------------------------------------------------------------------
  # EXTRACTORS
  # ----------------------------------------------------------------------

  # extract checksum
  filedigest:
    image: clowder/extractors-digest:latest
    restart: unless-stopped
    networks:
      - clowder
    depends_on:
      - rabbitmq
      - clowder
    environment:
      - RABBITMQ_URI=${RABBITMQ_URI:-amqp://guest:guest@rabbitmq/%2F}

  # extract preview image
  imagepreview:
    image: clowder/extractors-image-preview:latest
    restart: unless-stopped
    networks:
      - clowder
    depends_on:
      - rabbitmq
      - clowder
    environment:
      - RABBITMQ_URI=${RABBITMQ_URI:-amqp://guest:guest@rabbitmq/%2F}

  # extract image metadata
  imagemetadata:
    image: clowder/extractors-image-metadata:latest
    restart: unless-stopped
    networks:
      - clowder
    depends_on:
      - rabbitmq
      - clowder
    environment:
      - RABBITMQ_URI=${RABBITMQ_URI:-amqp://guest:guest@rabbitmq/%2F}

  # extract preview image from audio spectrogram
  audiopreview:
    image: clowder/extractors-audio-preview:${CLOWDER_VERSION:-latest}
    restart: unless-stopped
    networks:
      - clowder
    depends_on:
      - rabbitmq
      - clowder
    environment:
      - RABBITMQ_URI=${RABBITMQ_URI:-amqp://guest:guest@rabbitmq/%2F}

  # extract pdf preview image
  pdfpreview:
    image: clowder/extractors-pdf-preview:${CLOWDER_VERSION:-latest}
    restart: unless-stopped
    networks:
      - clowder
    depends_on:
      - rabbitmq
      - clowder
    environment:
      - RABBITMQ_URI=${RABBITMQ_URI:-amqp://guest:guest@rabbitmq/%2F}

  # extract video preview image as well as smaller video
  videopreview:
    image: clowder/extractors-video-preview:${CLOWDER_VERSION:-latest}
    restart: unless-stopped
    networks:
      - clowder
    depends_on:
      - rabbitmq
      - clowder
    environment:
      - RABBITMQ_URI=${RABBITMQ_URI:-amqp://guest:guest@rabbitmq/%2F}
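Each extractor is a separate worker consuming jobs from RabbitMQ, so busy deployments can run more than one copy of a given extractor. A sketch using docker-compose's --scale flag (the service name imagepreview is taken from the file above):

docker-compose -f docker-compose.yml -f docker-compose.override.yml -f docker-compose.extractors.yml up -d --scale imagepreview=2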
version: "3.5"
volumes:
traefik:
driver_opts:
type: none
o: bind
device: /home/kooper/clowder/traefik
clowder-data:
driver_opts:
type: none
o: bind
device: /home/kooper/clowder/data
clowder-custom:
driver_opts:
type: none
o: bind
device: /home/kooper/clowder/custom
mongo:
driver_opts:
type: none
o: bind
device: /home/kooper/clowder/mongo
rabbitmq:
driver_opts:
type: none
o: bind
device: /home/kooper/clowder/rabbitmq
elasticsearch:
driver_opts:
type: none
o: bind
device: /home/kooper/clowder/elasticsearch
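These volumes are bind mounts, so the device directories must exist on the host before the containers start (that is what the mkdir step above is for; adjust the paths to match your setup). A quick sanity check, assuming the paths in this file:

ls -ld /home/kooper/clowder/{custom,data,elasticsearch,mongo,rabbitmq,traefik}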
docker-compose.yml
version: '3.5'

services:
  # ----------------------------------------------------------------------
  # SINGLE ENTRYPOINT
  # ----------------------------------------------------------------------

  # webserver to handle all traffic. This can use Let's Encrypt to generate an SSL cert.
  traefik:
    image: traefik:1.7
    command:
      - --loglevel=INFO
      - --api
      # Entrypoints
      - --defaultentrypoints=https,http
      - --entryPoints=Name:http Address::80 ${TRAEFIK_HTTP_REDIRECT:-""}
      - --entryPoints=Name:https Address::443 ${TRAEFIK_HTTPS_OPTIONS:-TLS}
      # Configuration for acme (https://letsencrypt.org/)
      - --acme=${TRAEFIK_ACME_ENABLE:-false}
      #- --acme.caserver=https://acme-staging-v02.api.letsencrypt.org/directory
      - --acme.email=${TRAEFIK_ACME_EMAIL:-""}
      - --acme.entrypoint=https
      - --acme.onhostrule=true
      - --acme.storage=/config/acme.json
      - --acme.httpchallenge.entrypoint=http
      - --acme.acmelogging=true
      # DOCKER
      - --docker=true
      - --docker.endpoint=unix:///var/run/docker.sock
      - --docker.exposedbydefault=false
      - --docker.watch=true
    restart: unless-stopped
    networks:
      - clowder
    ports:
      - "${TRAEFIK_HTTP_PORT-8000}:80"
      - "${TRAEFIK_HTTPS_PORT-8443}:443"
    labels:
      - "traefik.enable=true"
      - "traefik.backend=traefik"
      - "traefik.port=8080"
      - "traefik.frontend.rule=${TRAEFIK_HOST:-}PathPrefixStrip: /traefik"
      - "traefik.website.frontend.whiteList.sourceRange=${TRAEFIK_IPFILTER:-172.16.0.0/12}"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
      - traefik:/config

  # ----------------------------------------------------------------------
  # CLOWDER APPLICATION
  # ----------------------------------------------------------------------

  # main clowder application
  clowder:
    image: clowder/clowder:${CLOWDER_VERSION:-latest}
    restart: unless-stopped
    networks:
      - clowder
    depends_on:
      - mongo
    environment:
      - CLOWDER_ADMINS=${CLOWDER_ADMINS:-admin@example.com}
      - CLOWDER_REGISTER=${CLOWDER_REGISTER:-false}
      - CLOWDER_CONTEXT=${CLOWDER_CONTEXT:-/}
      - CLOWDER_SSL=${CLOWDER_SSL:-false}
      - RABBITMQ_URI=${RABBITMQ_URI:-amqp://guest:guest@rabbitmq/%2F}
      - RABBITMQ_EXCHANGE=${RABBITMQ_EXCHANGE:-clowder}
      - RABBITMQ_CLOWDERURL=${RABBITMQ_CLOWDERURL:-http://clowder:9000}
      - SMTP_MOCK=${SMTP_MOCK:-true}
      - SMTP_SERVER=${SMTP_SERVER:-smtp}
    labels:
      - "traefik.enable=true"
      - "traefik.backend=clowder"
      - "traefik.port=9000"
      - "traefik.frontend.rule=${TRAEFIK_HOST:-}PathPrefix: ${CLOWDER_CONTEXT:-/}"
    volumes:
      - clowder-custom:/home/clowder/custom
      - clowder-data:/home/clowder/data

  # ----------------------------------------------------------------------
  # CLOWDER DEPENDENCIES
  # ----------------------------------------------------------------------

  # database to hold metadata (required)
  mongo:
    image: mongo:3.4
    restart: unless-stopped
    networks:
      - clowder
    volumes:
      - mongo:/data/db

  # message broker (optional but needed for extractors)
  rabbitmq:
    image: rabbitmq:management-alpine
    restart: unless-stopped
    networks:
      - clowder
    environment:
      - RABBITMQ_SERVER_ADDITIONAL_ERL_ARGS=-rabbitmq_management path_prefix "/rabbitmq"
      - RABBITMQ_DEFAULT_USER=${RABBITMQ_DEFAULT_USER:-guest}
      - RABBITMQ_DEFAULT_PASS=${RABBITMQ_DEFAULT_PASS:-guest}
    labels:
      - "traefik.enable=true"
      - "traefik.backend=rabbitmq"
      - "traefik.port=15672"
      - "traefik.frontend.rule=${TRAEFIK_HOST:-}PathPrefix: /rabbitmq"
      - "traefik.website.frontend.whiteList.sourceRange=${TRAEFIK_IPFILTER:-172.16.0.0/12}"
    volumes:
      - rabbitmq:/var/lib/rabbitmq

  # search index (optional, needed for the search and sorting features)
  elasticsearch:
    image: elasticsearch:2
    command: elasticsearch -Des.cluster.name="clowder"
    networks:
      - clowder
    restart: unless-stopped
    environment:
      - cluster.name=clowder
    volumes:
      - elasticsearch:/usr/share/elasticsearch/data

  # monitor clowder extractors
  monitor:
    image: clowder/extractors-monitor:${CLOWDER_VERSION:-latest}
    restart: unless-stopped
    networks:
      - clowder
    depends_on:
      - rabbitmq
    environment:
      - RABBITMQ_URI=${RABBITMQ_URI:-amqp://guest:guest@rabbitmq/%2F}
      - RABBITMQ_MGMT_PORT=15672
      - RABBITMQ_MGMT_PATH=/rabbitmq
    labels:
      - "traefik.enable=true"
      - "traefik.backend=monitor"
      - "traefik.port=9999"
      - "traefik.frontend.rule=${TRAEFIK_FRONTEND_RULE:-}PathPrefixStrip: /monitor"

# ----------------------------------------------------------------------
# NETWORK FOR CONTAINER COMMUNICATION
# ----------------------------------------------------------------------
networks:
  clowder:

# ----------------------------------------------------------------------
# VOLUMES FOR PERSISTENT STORAGE
# ----------------------------------------------------------------------
volumes:
  traefik:
  clowder-data:
  clowder-custom:
  mongo:
  rabbitmq:
  elasticsearch:
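To upgrade later, pull newer images and recreate the containers; a sketch using the same set of compose files:

docker-compose -f docker-compose.yml -f docker-compose.override.yml -f docker-compose.extractors.yml pull
docker-compose -f docker-compose.yml -f docker-compose.override.yml -f docker-compose.extractors.yml up -d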