- Linux - be proficient with a distribution such as Ubuntu and with the CLI, and understand how the shell works: environment variables, subshells, processes/tasks, etc.
- Docker (and docker-compose) - what containers are and how they work (conceptually), and how to create and run them
- Git - what a version control system is and how to use Git
- RDB (relational databases) - what relational databases are; understand tables, how to create them, and how to define relations between them as needed. Learn this through SQLite and PostgreSQL (preferred) or MySQL
- Python - how to write Python very well and understand its [OOP] implementation
- Virtualenv - how to create virtual environments for Python, to isolate a project's packages from the system's installed version
- Virtualenvwrapper - to manage virtual environments easily
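As a quick illustration of the environment-variable and subshell behavior mentioned in the Linux bullet, here is a small Python sketch (the variable name `DEMO_VAR` is just a placeholder):

```python
import os
import subprocess
import sys

# Exported variables are inherited by child processes.
os.environ["DEMO_VAR"] = "hello"
child = subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ['DEMO_VAR'])"],
    capture_output=True,
    text=True,
)
print(child.stdout.strip())  # hello

# Like a subshell, a child process can change its own copy of the
# environment without affecting the parent.
subprocess.run([sys.executable, "-c", "import os; os.environ['DEMO_VAR'] = 'bye'"])
print(os.environ["DEMO_VAR"])  # still hello
```

The same two experiments at a shell prompt would be `export DEMO_VAR=hello` followed by `( DEMO_VAR=bye )`: the parenthesized subshell's change never reaches the parent shell.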
As often happens, I found the official documentation and forum answers to be "close, but no cigar", and so had to experiment a little to get things working.
The main problem for me was a lack of concrete configuration examples. That's not entirely GitHub's fault: having migrated from Google Domains to Namecheap in the middle of this project, I was once again reminded of how many different ways there are to do things in the name service universe [1].
Although you'd think the simplest setup would be configuring only the subdomain case (https://www.example.com), in my experience using the apex domain (https://example.com) instead resulted in fewer complications.
So here's my recipe for using a custom domain with GitHub pages where Namecheap is the DNS provider:
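A sketch of what the Namecheap Advanced DNS host records typically look like for this setup, using GitHub's documented Pages IP addresses (USERNAME is a placeholder for your GitHub account name):

```
Type    Host    Value
A       @       185.199.108.153
A       @       185.199.109.153
A       @       185.199.110.153
A       @       185.199.111.153
CNAME   www     USERNAME.github.io.
```

On the GitHub side, set the custom domain in the repository's Pages settings (which commits a CNAME file to the repo), then enable "Enforce HTTPS" once the certificate has been issued.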
# this is inspired by the official ImportDicomFiles.py - however, this is MUCH faster because
# it uses multiple workers to dramatically reduce IO bottlenecks
# it doesn't re-instantiate the Http() and headers for every file, but only once per folder
# it does assume the input folders are structured in the orthanc format, e.g. /00/00/0000a8768bd86...
# and expects PATH_TO_SYNC to be something like:
##   "/path/to/orthanc/root/*" for all root folders
##   "/path/to/orthanc/root/1*" for all folders beginning with 1
##   "/path/to/orthanc/root/23" for root folder 23 only
This is a simple way of importing a MySQL database into a Docker container.
- In your Dockerfile (or run configuration) you must have a shared folder. A shared folder is a directory on your host machine that is mounted into the Docker container.
- Put the exported SQL file in the shared folder.
- Log in to your Docker container via
docker exec -it DOCKER_CONTAINER_ID /bin/bash
- Log in to MySQL via
mysql -u USERNAME -p
- Import the file via (the path depends on where the shared folder is mounted inside the container, and the dump must create or select a database, otherwise run USE DB_NAME; first)
source /PATH/TO/SHARED/FILE.sql
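Alternatively, the interactive steps can be collapsed into a single non-interactive command run from the host. CONTAINER, USERNAME, PASSWORD, DB_NAME, and the dump path are placeholders:

```shell
# Stream the dump from the host's shared folder straight into MySQL in the container.
# With stdin redirected to the file, -p cannot prompt, so the password is supplied
# inline here (or export MYSQL_PWD instead to keep it out of the command line).
docker exec -i CONTAINER mysql -u USERNAME --password=PASSWORD DB_NAME < /path/to/shared/dump.sql
```

Note `-i` without `-t`: no TTY is allocated, so the redirected file flows through stdin.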
<!-- Iterate N times (replace N with a number) -->
{% for i in "x"|ljust:"N" %}
<!-- Access numeric variable (0-based index) -->
{{ forloop.counter0 }}
<!-- Access numeric variable (1-based index) -->
{{ forloop.counter }}
{% endfor %}
<!-- For example, to iterate from 0 to 3, replace N with 4 -->
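The trick works because the `ljust` filter pads the string out to length N, and `{% for %}` then iterates over the string's characters. Plain Python (outside Django) shows the same counting:

```python
# "x"|ljust:"4" in the template is equivalent to "x".ljust(4) in Python:
# a 4-character string, so the loop body runs four times.
s = "x".ljust(4)
print(len(s))                           # 4
print(list(range(len(s))))              # forloop.counter0 values: [0, 1, 2, 3]
print([i + 1 for i in range(len(s))])   # forloop.counter values: [1, 2, 3, 4]
```

The loop variable `i` is just a space (or `"x"`) character and is ignored; only the counters matter.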