Basic MW dev environment with Docker

Basic Docker Dev setup with PHP/Apache, MariaDB, and ElasticSearch

This Gist is a simple working example of how to use Docker containers to set up a MediaWiki development environment.

The following containers are included:

  • MariaDB image (stock) as the app database. The DB name is dev_wiki and there is no root password (use in dev only!)
  • Custom PHP/Apache image based on the mediawiki-docker image; this image sets up some appropriate PHP and Apache config options and installs PECL extensions. Custom .ini files can be provided as well, see uploads.ini as an example.
  • ElasticSearch image (stock) to power the search extensions.

Setup

Create top-level src and web directories wherever you download these files. The src folder will be mounted as a volume inside the MediaWiki container, so download/clone MediaWiki core and any necessary extensions there. The web folder is used to build the MW web application image.

Your file structure should look like this:

.
├── docker-compose.yml
├── src/
└── web/
    ├── Dockerfile
    ├── uploads.ini
    └── xdebug.ini
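
For example, one way to arrive at this layout (a sketch; the clone URLs below are the Wikimedia Gerrit repositories, and you could just as well unpack a MediaWiki tarball into src/ instead):

# create the directory layout
mkdir -p src web

# copy the Dockerfile, uploads.ini, and xdebug.ini from this gist into web/

# clone MediaWiki core into src/ (this is what gets mounted at /var/www/html)
git clone https://gerrit.wikimedia.org/r/mediawiki/core.git src

# clone any extensions you need into src/extensions, for example:
git clone https://gerrit.wikimedia.org/r/mediawiki/extensions/CirrusSearch.git src/extensions/CirrusSearch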

Once your files are in place, run docker-compose up to create and spin up containers.
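
The usual docker-compose commands apply if you want to run the stack in the background or tear it down; for example:

# start the containers in the background
docker-compose up -d

# follow the MediaWiki container's logs
docker-compose logs -f mediawiki

# stop and remove the containers when you are done
docker-compose down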

Web-based install

Navigate to localhost:8080 in your browser and follow the steps in the web installer. Make sure that your DB name and credentials match what has been specified in the docker-compose.yml file. The DB host can just be listed as database thanks to the links property specified in the mediawiki container.

Save the generated LocalSettings.php file in src/ alongside the other MW code, and you should now have a working version of MediaWiki running inside your containers.
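
If you want to double-check (or hand-edit) the generated file, the database and server values in LocalSettings.php should line up with docker-compose.yml. A sketch of the relevant lines, assuming you installed with the root account and the defaults described above:

$wgServer = "http://localhost:8080";
$wgDBtype = "mysql";
$wgDBserver = "database";   # the linked "database" service
$wgDBname = "dev_wiki";     # matches MYSQL_DATABASE in docker-compose.yml
$wgDBuser = "root";
$wgDBpassword = "";         # MYSQL_ALLOW_EMPTY_PASSWORD is enabled (dev only!)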

Configuring Extensions

Manual steps to configure extensions will still be required. Maintenance scripts can be run with the docker exec command. For example, to run the setup script for CirrusSearch, you can do the following:

# in your shell, execute the bash command inside your Mediawiki container:
docker exec -it <container id> /bin/bash

# now you are running bash inside the container
php extensions/CirrusSearch/maintenance/updateSearchIndexConfig.php
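
You can also skip the interactive shell and run a script in one shot. Note that <container id> is a placeholder; use docker ps to find the real ID or name. The working directory is already the MediaWiki root (/var/www/html for the php:apache base image):

# list running containers to find the MediaWiki container
docker ps

# run a maintenance script directly, without an interactive shell
docker exec -it <container id> php maintenance/update.php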

CirrusSearch Configuration

An ElasticSearch image is provided ready-to-go in the docker-compose.yml file. To use the CirrusSearch extension, follow these steps:

  1. Download/clone and enable the Elastica extension (wfLoadExtension( "Elastica" );) in LocalSettings.php if you have not done so already
  2. Download/clone the CirrusSearch extension
  3. Add the following to the end of LocalSettings.php:
require_once( "$IP/extensions/CirrusSearch/CirrusSearch.php" );
$wgDisableSearchUpdate = true;
$wgCirrusSearchServers = [ "elasticsearch.svc" ];
  4. Find the ID of the running MediaWiki container using docker ps
  5. Shell into this container so you can run the configuration scripts: docker exec -it <container_id> /bin/bash
  6. At the new bash prompt, run: php extensions/CirrusSearch/maintenance/updateSearchIndexConfig.php
  7. Now remove the $wgDisableSearchUpdate line from LocalSettings.php
  8. Inside the MediaWiki container's bash prompt, run: php extensions/CirrusSearch/maintenance/forceSearchIndex.php --skipLinks --indexOnSkip and then:
    php extensions/CirrusSearch/maintenance/forceSearchIndex.php --skipParse. These scripts can take some time to run if there is a lot of content in the wiki.
  9. Finally, add this line to LocalSettings.php: $wgSearchType = "CirrusSearch";
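
Once all of the steps above are complete, the search-related portion of LocalSettings.php should look roughly like this (a sketch; the temporary $wgDisableSearchUpdate line has been removed because it is only needed while the index is being built):

wfLoadExtension( "Elastica" );
require_once( "$IP/extensions/CirrusSearch/CirrusSearch.php" );
$wgCirrusSearchServers = [ "elasticsearch.svc" ];
$wgSearchType = "CirrusSearch";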

Wikibase Configuration

  1. Download/clone Wikibase and its submodules into extensions/Wikibase
  2. Install Composer inside your MediaWiki container. The easiest way is probably to shell into the container and run the installer's PHP commands (see the sketch after this list). This builds a copy of composer.phar that you can run with PHP, so any subsequent step that says composer install can be replaced with php composer.phar install, etc.
  3. Install the Composer merge plugin by running php composer.phar require wikimedia/composer-merge-plugin at the MediaWiki container's bash prompt.
  4. Add a composer.local.json file to the top-level Mediawiki folder with these contents:
{
  "extra": {
    "merge-plugin": {
      "include": [
        "extensions/Wikibase/composer.json"
      ]
    }
  }
}
  5. Run composer install (or php composer.phar install)
  6. Add the following to the end of LocalSettings:
$wgEnableWikibaseRepo = true;
$wgEnableWikibaseClient = true;
require_once "$IP/extensions/Wikibase/repo/Wikibase.php";
require_once "$IP/extensions/Wikibase/repo/ExampleSettings.php";
require_once "$IP/extensions/Wikibase/client/WikibaseClient.php";
require_once "$IP/extensions/Wikibase/client/ExampleSettings.php";
  7. Shell into the MediaWiki container and run the setup scripts from the top level:
    php maintenance/update.php
    php extensions/Wikibase/lib/maintenance/populateSitesTable.php
    php extensions/Wikibase/repo/maintenance/rebuildItemsPerSite.php
    php extensions/Wikibase/client/maintenance/populateInterwiki.php
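
For step 2, a sketch of the Composer installer commands, based on the download instructions published at getcomposer.org (that page also includes a hash-verification step which is omitted here):

# inside the MediaWiki container, download and run the Composer installer
php -r "copy('https://getcomposer.org/installer', 'composer-setup.php');"
php composer-setup.php
php -r "unlink('composer-setup.php');"

# composer.phar is created in the current directory; use it in place of "composer", e.g.:
php composer.phar install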

WikibaseMediaInfo

Installation of this extension is pretty straightforward; follow the installation instructions on the extension's page.

To enable the "depicts" feature-flag, add this to LocalSettings: $wgMediaInfoEnableFilePageDepicts = true;
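
A sketch of the resulting LocalSettings.php additions, assuming the extension is loaded in the standard way with wfLoadExtension (check the extension's documentation for its exact requirements, since it depends on Wikibase being set up first):

wfLoadExtension( "WikibaseMediaInfo" );
$wgMediaInfoEnableFilePageDepicts = true;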

This gist will be revised as more improvements are added to this process.

# Put this in docker-compose.yml
version: '3'
services:
  mediawiki:
    build: ./web
    depends_on:
      - database
      - elasticsearch
    environment:
      - MW_ELASTIC_HOST=elasticsearch.svc
      - MW_ELASTIC_PORT=9200
    volumes:
      - ./src:/var/www/html
    ports:
      - 8080:80
    links:
      - database
  database:
    image: mariadb
    restart: always
    environment:
      MYSQL_DATABASE: dev_wiki
      MYSQL_ALLOW_EMPTY_PASSWORD: 'yes'
    volumes:
      - ./src:/var/www/html
  elasticsearch:
    image: elasticsearch:5.6
    volumes: # Persist ES data in separate "esdata" volume
      - esdata:/usr/share/elasticsearch/data
    environment:
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
      - discovery.type=single-node
    ports:
      - "9300:9300"
      - "9200:9200"
    networks:
      default:
        aliases:
          - elasticsearch.svc
volumes:
  esdata:
# Put this in web/Dockerfile
# Adapted from https://github.com/wikimedia/mediawiki-docker/blob/master/1.32/Dockerfile
FROM php:7.2-apache

# System Dependencies.
RUN apt-get update && apt-get install -y \
        git \
        imagemagick \
        libicu-dev \
        # Required for SyntaxHighlighting
        python3 \
        --no-install-recommends && rm -r /var/lib/apt/lists/*

# Install the PHP extensions we need
RUN docker-php-ext-install mbstring mysqli opcache intl

# Install Xdebug extension
RUN yes | pecl install xdebug
COPY xdebug.ini $PHP_INI_DIR/conf.d/

# Install the default object cache.
RUN pecl channel-update pecl.php.net \
    && pecl install apcu \
    && docker-php-ext-enable apcu

# Enable Short URLs
RUN a2enmod rewrite \
    && { \
        echo '<Directory /var/www/html>'; \
        echo ' RewriteEngine On'; \
        echo ' RewriteCond %{REQUEST_FILENAME} !-f'; \
        echo ' RewriteCond %{REQUEST_FILENAME} !-d'; \
        echo ' RewriteRule ^ %{DOCUMENT_ROOT}/index.php [L]'; \
        echo '</Directory>'; \
    } > "$APACHE_CONFDIR/conf-available/short-url.conf" \
    && a2enconf short-url

# set recommended PHP.ini settings
# see https://secure.php.net/manual/en/opcache.installation.php
RUN { \
        echo 'opcache.memory_consumption=128'; \
        echo 'opcache.interned_strings_buffer=8'; \
        echo 'opcache.max_accelerated_files=4000'; \
        echo 'opcache.revalidate_freq=60'; \
        echo 'opcache.fast_shutdown=1'; \
        echo 'opcache.enable_cli=1'; \
    } > /usr/local/etc/php/conf.d/opcache-recommended.ini

# Copy any custom .ini files here
COPY uploads.ini $PHP_INI_DIR/conf.d/

# SQLite Directory Setup
RUN mkdir -p /var/www/data \
    && chown -R www-data:www-data /var/www/data

# Version
# ENV MEDIAWIKI_MAJOR_VERSION 1.32
# ENV MEDIAWIKI_BRANCH REL1_32
# ENV MEDIAWIKI_VERSION 1.32.0
# ENV MEDIAWIKI_SHA512 5e198844bba12f5a3a73a05dd7d855d3e883914c6e7c23676921a169dc1c7089ed31adfb7369c24cbaf10b43171dd2a12929284b65edde44d7b9721385ff1cc3

# MediaWiki setup
# RUN curl -fSL "https://releases.wikimedia.org/mediawiki/${MEDIAWIKI_MAJOR_VERSION}/mediawiki-${MEDIAWIKI_VERSION}.tar.gz" -o mediawiki.tar.gz \
#     && echo "${MEDIAWIKI_SHA512} *mediawiki.tar.gz" | sha512sum -c - \
#     && tar -xz --strip-components=1 -f mediawiki.tar.gz \
#     && rm mediawiki.tar.gz \
#     && chown -R www-data:www-data extensions skins cache images
; Put this in web/uploads.ini
file_uploads = On
memory_limit = 20M
upload_max_filesize = 20M
post_max_size = 20M
max_execution_time = 600
; Put this in web/xdebug.ini
; The zend_extension install location may change.
; xdebug.remote_host can be set to host.docker.internal if you are running
; Docker for Mac. Otherwise this should be the local IP address of the host machine.
zend_extension = /usr/local/lib/php/extensions/no-debug-non-zts-20170718/xdebug.so
xdebug.remote_host = host.docker.internal
xdebug.remote_enable = 1