apt-get install supervisor ngrok
Create the config files:
- /root/.ngrok2/ngrok.yml
- /etc/supervisor/conf.d/ngrok.conf
supervisorctl reread
supervisorctl update
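For reference, the two config files might look roughly like the sketch below. This assumes an ngrok v2 configuration exposing SSH over a TCP tunnel; the authtoken, tunnel name, and log path are placeholders to substitute for your setup.

```yaml
# /root/.ngrok2/ngrok.yml
authtoken: <your-ngrok-authtoken>
tunnels:
  ssh:
    proto: tcp
    addr: 22
```

```ini
; /etc/supervisor/conf.d/ngrok.conf
[program:ngrok]
command=/usr/local/bin/ngrok start --all --config /root/.ngrok2/ngrok.yml
autostart=true
autorestart=true
user=root
redirect_stderr=true
stdout_logfile=/var/log/ngrok.log
```

`supervisorctl reread` only detects the new config; `supervisorctl update` actually starts the program group.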
Unless otherwise noted (either in this file or in a file's copyright section) the contents of this gist are Copyright ©️2020 by Christopher Allen, and are shared under the Creative Commons Attribution-ShareAlike 4.0 International (SPDX: CC-BY-SA-4.0) open-source license.
If you'd like more tips and advice like these, you can become a monthly patron on my GitHub Sponsor Page for as little as $5 a month; your contributions will be multiplied, as GitHub is matching the first $5,000! This gist is all about Homebrew, so if you like it you can support it by donating to them or becoming one of their GitHub Sponsors.
# /etc/systemd/system/airflow-scheduler-health.service
# Description: A simple systemd service that executes the health check scripts.
# To monitor: `sudo journalctl -u airflow-scheduler-health -f`
# To investigate unhealthy periods: `tail -f /var/log/airflow/scheduler/unhealthy-periods.log`
[Unit]
Description=Airflow Scheduler Health Checks
[Service]
User=user
Group=group
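The unit above is truncated after the `[Service]` header fields. A complete unit might look like the following sketch; the `ExecStart` path is a hypothetical placeholder for the health-check script the description mentions, and the restart settings are assumptions, not part of the original.

```ini
[Unit]
Description=Airflow Scheduler Health Checks
After=network.target

[Service]
User=user
Group=group
# Hypothetical path to the script that polls the scheduler and
# appends to /var/log/airflow/scheduler/unhealthy-periods.log
ExecStart=/opt/airflow/scheduler-health-check.sh
Restart=always
RestartSec=30

[Install]
WantedBy=multi-user.target
```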
To remove a submodule you need to:
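The steps can be sketched as the self-contained demo below: it first builds a throwaway repository containing one submodule (pure scaffolding, with placeholder identity settings), then walks the actual removal sequence on the placeholder path `vendor/sub`.

```shell
set -e

# --- setup: a throwaway repo with one submodule (demo scaffolding only) ---
tmp=$(mktemp -d)
git init -q "$tmp/sub"
git -C "$tmp/sub" -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "init"
git init -q "$tmp/main"
cd "$tmp/main"
git -c protocol.file.allow=always submodule add -q "$tmp/sub" vendor/sub
git -c user.email=demo@example.com -c user.name=demo commit -qm "add submodule"

# --- the actual removal steps ---
# 1. Unregister the submodule from .git/config
git submodule deinit -f vendor/sub
# 2. Remove the working-tree entry and the .gitmodules entry
git rm -qf vendor/sub
# 3. Delete the internal clone git keeps under .git/modules
rm -rf .git/modules/vendor/sub
# 4. Commit the result
git -c user.email=demo@example.com -c user.name=demo commit -qm "Remove submodule"

test ! -e vendor/sub && echo "submodule removed"
```

Step 3 matters because `git rm` leaves the submodule's object store behind under `.git/modules`, which blocks re-adding a submodule at the same path later.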
""" | |
Copies all keys from the source Redis host to the destination Redis host. | |
Useful to migrate Redis instances where commands like SLAVEOF and MIGRATE are | |
restricted (e.g. on Amazon ElastiCache). | |
The script scans through the keyspace of the given database number and uses | |
a pipeline of DUMP and RESTORE commands to migrate the keys. | |
Requires Redis 2.8.0 or higher. |
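The core of that approach can be sketched as below. The `migrate` function is written against the `redis-py` client interface (`scan_iter`, `pipeline`, `pttl`, `dump`, `restore`) but takes the clients as parameters, so the sketch itself is stdlib-only; the batch size is an arbitrary placeholder. Note that plain RESTORE fails if the key already exists at the destination.

```python
from itertools import islice


def batched(iterable, n):
    """Yield successive lists of up to n items (pure helper)."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, n))
        if not chunk:
            return
        yield chunk


def migrate(src, dst, batch_size=100):
    """Copy every key from src to dst using pipelined DUMP + RESTORE.

    `src` and `dst` are redis-py style clients. SCAN (Redis 2.8+) walks
    the keyspace without blocking the server; PTTL preserves each key's
    remaining TTL (RESTORE treats a ttl of 0 as "no expiry").
    """
    for keys in batched(src.scan_iter(count=batch_size), batch_size):
        # One round-trip to fetch TTLs and serialized values
        pipe = src.pipeline(transaction=False)
        for key in keys:
            pipe.pttl(key)
            pipe.dump(key)
        results = pipe.execute()

        # One round-trip to recreate the keys on the destination
        pipe = dst.pipeline(transaction=False)
        ttls, payloads = results[::2], results[1::2]
        for key, ttl, payload in zip(keys, ttls, payloads):
            if payload is not None:  # key may have expired mid-scan
                pipe.restore(key, max(ttl, 0), payload)  # PTTL -1 -> 0
        pipe.execute()
```

Usage would look like `migrate(redis.Redis(host="src-host"), redis.Redis(host="dst-host"))`, with host names and `db` chosen for your instances.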
# Use envFrom to load Secrets and ConfigMaps into environment variables
apiVersion: apps/v1beta2
kind: Deployment
metadata:
  name: mans-not-hot
  labels:
    app: mans-not-hot
spec:
  replicas: 1
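The spec above is cut off before the pod template, where the `envFrom` the comment refers to would live. A sketch of that part follows; the image, ConfigMap name, and Secret name are placeholders, not from the original.

```yaml
  selector:
    matchLabels:
      app: mans-not-hot
  template:
    metadata:
      labels:
        app: mans-not-hot
    spec:
      containers:
        - name: mans-not-hot
          image: registry.example.com/mans-not-hot:latest  # placeholder
          envFrom:
            # Every key in the referenced ConfigMap/Secret becomes an
            # environment variable in the container
            - configMapRef:
                name: mans-not-hot-config
            - secretRef:
                name: mans-not-hot-secrets
```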
""" | |
Code that goes along with the Airflow tutorial located at: | |
https://github.com/airbnb/airflow/blob/master/airflow/example_dags/tutorial.py | |
""" | |
from airflow import DAG | |
from airflow.operators.python_operator import PythonOperator | |
from airflow.operators.generic_transfer import GenericTransfer | |
from airflow.contrib.hooks import FTPHook | |
from airflow.hooks.mysql_hook import MySqlHook |
# Showing the necessary part of adding the logging config to the python path
COPY config/log_config.py config/log_config.py
COPY config/__init__.py config/__init__.py
RUN chown -R airflow: ${AIRFLOW_HOME}
ENV PYTHONPATH ${PYTHONPATH}:/usr/lib/python2.7/site-packages/:${AIRFLOW_HOME}/config/
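Putting `config/` on `PYTHONPATH` only makes the module importable; Airflow still has to be pointed at it. In Airflow 1.x (consistent with the python2.7 path above) that is done in `airflow.cfg`, assuming `log_config.py` exposes a dict named `LOGGING_CONFIG`:

```ini
# airflow.cfg
[core]
logging_config_class = log_config.LOGGING_CONFIG
```

The same setting can be supplied as the environment variable `AIRFLOW__CORE__LOGGING_CONFIG_CLASS`, which fits naturally in a Dockerfile `ENV` line.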