Checking minimum requirements...
Creating volumes for persistent storage...
Created sentry-data.
Created sentry-postgres.
Created sentry-redis.
Created sentry-zookeeper.
Created sentry-kafka.
Created sentry-clickhouse.
Created sentry-symbolicator.
sentry/sentry.conf.py already exists, skipped creation.
sentry/config.yml already exists, skipped creation.
sentry/requirements.txt already exists, skipped creation.
symbolicator/config.yml already exists, skipped creation.
relay/config.yml already exists, skipped creation.
Fetching and updating Docker images...
Some service image(s) must be built from source by running:
docker-compose build ingest-consumer sentry-cleanup symbolicator-cleanup snuba-cleanup post-process-forwarder web cron worker
latest: Pulling from getsentry/sentry
8d691f585fa8: Pulling fs layer
3fd6980f9df6: Pulling fs layer
c4f890363b98: Pulling fs layer
b8cfb9853b0a: Pulling fs layer
eb82833faede: Pulling fs layer
bb30ddde69f1: Pulling fs layer
190f2138791e: Pulling fs layer
bc183caa6183: Pulling fs layer
5ac490fa6934: Pulling fs layer
e24542cd4a06: Pulling fs layer
ea93165aeea7: Pulling fs layer
d7140356e379: Pulling fs layer
bd25bf66d54d: Pulling fs layer
33fd49a407f4: Pulling fs layer
bb30ddde69f1: Waiting
5ac490fa6934: Waiting
190f2138791e: Waiting
e24542cd4a06: Waiting
bc183caa6183: Waiting
b8cfb9853b0a: Waiting
eb82833faede: Waiting
ea93165aeea7: Waiting
bd25bf66d54d: Waiting
33fd49a407f4: Waiting
d7140356e379: Waiting
3fd6980f9df6: Verifying Checksum
3fd6980f9df6: Download complete
b8cfb9853b0a: Verifying Checksum
b8cfb9853b0a: Download complete
eb82833faede: Verifying Checksum
eb82833faede: Download complete
bb30ddde69f1: Verifying Checksum
bb30ddde69f1: Download complete
190f2138791e: Download complete
c4f890363b98: Verifying Checksum
c4f890363b98: Download complete
8d691f585fa8: Verifying Checksum
8d691f585fa8: Download complete
8d691f585fa8: Pull complete
3fd6980f9df6: Pull complete
c4f890363b98: Pull complete
b8cfb9853b0a: Pull complete
eb82833faede: Pull complete
bb30ddde69f1: Pull complete
190f2138791e: Pull complete
5ac490fa6934: Verifying Checksum
5ac490fa6934: Download complete
ea93165aeea7: Download complete
d7140356e379: Verifying Checksum
d7140356e379: Download complete
bd25bf66d54d: Verifying Checksum
bd25bf66d54d: Download complete
33fd49a407f4: Verifying Checksum
33fd49a407f4: Download complete
e24542cd4a06: Verifying Checksum
e24542cd4a06: Download complete
bc183caa6183: Verifying Checksum
bc183caa6183: Download complete
bc183caa6183: Pull complete
5ac490fa6934: Pull complete
e24542cd4a06: Pull complete
ea93165aeea7: Pull complete
d7140356e379: Pull complete
bd25bf66d54d: Pull complete
33fd49a407f4: Pull complete
Digest: sha256:725a427d696d2100448ac59b5e06a19410fa2a7816659b3bec74358a94fa207b
Status: Downloaded newer image for getsentry/sentry:latest
docker.io/getsentry/sentry:latest
Building and tagging Docker images...
Building web
Step 1/4 : ARG SENTRY_IMAGE
Step 2/4 : FROM ${SENTRY_IMAGE}
---> 4d632bb8842c
Step 3/4 : COPY . /usr/src/sentry
---> 56a25e1855ec
Step 4/4 : RUN if [ -s /usr/src/sentry/requirements.txt ]; then pip install -r /usr/src/sentry/requirements.txt; fi
---> Running in 9e2f5613ba56
/usr/local/lib/python2.7/site-packages/cryptography/__init__.py:39: CryptographyDeprecationWarning: Python 2 is no longer supported by the Python core team. Support for it is now deprecated in cryptography, and will be removed in a future release.
CryptographyDeprecationWarning,
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Removing intermediate container 9e2f5613ba56
---> eff187aa5b6b
Successfully built eff187aa5b6b
Successfully tagged sentry-onpremise-local:latest
smtp uses an image, skipping
memcached uses an image, skipping
redis uses an image, skipping
postgres uses an image, skipping
zookeeper uses an image, skipping
kafka uses an image, skipping
clickhouse uses an image, skipping
snuba-api uses an image, skipping
snuba-consumer uses an image, skipping
snuba-outcomes-consumer uses an image, skipping
snuba-sessions-consumer uses an image, skipping
snuba-transactions-consumer uses an image, skipping
snuba-replacer uses an image, skipping
symbolicator uses an image, skipping
relay uses an image, skipping
nginx uses an image, skipping
Building snuba-cleanup ...
Building symbolicator-cleanup ...
Building web ...
Building cron ...
Building worker ...
Building ingest-consumer ...
Building post-process-forwarder ...
Building sentry-cleanup ...
Building post-process-forwarder
Building web
Building snuba-cleanup
Building cron
Building sentry-cleanup
Step 1/4 : ARG SENTRY_IMAGE
Step 2/4 : FROM ${SENTRY_IMAGE}
---> 4d632bb8842c
Step 3/4 : COPY . /usr/src/sentry
---> Using cache
---> 56a25e1855ec
Step 4/4 : RUN if [ -s /usr/src/sentry/requirements.txt ]; then pip install -r /usr/src/sentry/requirements.txt; fi
---> Using cache
---> eff187aa5b6b
Successfully built eff187aa5b6b
Successfully tagged sentry-onpremise-local:latest
Building cron ... done
Building worker
Step 1/5 : ARG BASE_IMAGE
Step 2/5 : FROM ${BASE_IMAGE}
Step 1/5 : ARG BASE_IMAGE
Step 2/5 : FROM ${BASE_IMAGE}
---> eff187aa5b6b
Step 3/5 : RUN apt-get update && apt-get install -y --no-install-recommends cron && rm -r /var/lib/apt/lists/*
---> 654d0e15252b
Step 3/5 : RUN apt-get update && apt-get install -y --no-install-recommends cron && rm -r /var/lib/apt/lists/*
Step 1/4 : ARG SENTRY_IMAGE
Step 2/4 : FROM ${SENTRY_IMAGE}
---> 4d632bb8842c
Step 3/4 : COPY . /usr/src/sentry
---> Using cache
---> 56a25e1855ec
Step 4/4 : RUN if [ -s /usr/src/sentry/requirements.txt ]; then pip install -r /usr/src/sentry/requirements.txt; fi
---> Using cache
---> eff187aa5b6b
Successfully built eff187aa5b6b
Successfully tagged sentry-onpremise-local:latest
Building web ... done
Building ingest-consumer
Step 1/4 : ARG SENTRY_IMAGE
Step 2/4 : FROM ${SENTRY_IMAGE}
---> 4d632bb8842c
Step 3/4 : COPY . /usr/src/sentry
---> Using cache
---> 56a25e1855ec
Step 4/4 : RUN if [ -s /usr/src/sentry/requirements.txt ]; then pip install -r /usr/src/sentry/requirements.txt; fi
---> Using cache
---> eff187aa5b6b
Successfully built eff187aa5b6b
Successfully tagged sentry-onpremise-local:latest
Building worker ... done
Building symbolicator-cleanup
---> Running in 748d4fb65d37
---> Running in 401014ca5840
Step 1/4 : ARG SENTRY_IMAGE
Step 2/4 : FROM ${SENTRY_IMAGE}
---> 4d632bb8842c
Step 3/4 : COPY . /usr/src/sentry
---> Using cache
---> 56a25e1855ec
Step 4/4 : RUN if [ -s /usr/src/sentry/requirements.txt ]; then pip install -r /usr/src/sentry/requirements.txt; fi
---> Using cache
---> eff187aa5b6b
Successfully built eff187aa5b6b
Successfully tagged sentry-onpremise-local:latest
Building ingest-consumer ... done
Step 1/5 : ARG BASE_IMAGE
Step 2/5 : FROM ${BASE_IMAGE}
---> a2ec8e59d81f
Step 3/5 : RUN apt-get update && apt-get install -y --no-install-recommends cron && rm -r /var/lib/apt/lists/*
---> Running in ceb8ab0ec6e0
Step 1/4 : ARG SENTRY_IMAGE
Step 2/4 : FROM ${SENTRY_IMAGE}
---> 4d632bb8842c
Step 3/4 : COPY . /usr/src/sentry
---> Using cache
---> 56a25e1855ec
Step 4/4 : RUN if [ -s /usr/src/sentry/requirements.txt ]; then pip install -r /usr/src/sentry/requirements.txt; fi
---> Using cache
---> eff187aa5b6b
Successfully built eff187aa5b6b
Successfully tagged sentry-onpremise-local:latest
Building post-process-forwarder ... done
Get:1 http://deb.debian.org/debian buster InRelease [122 kB]
Get:2 http://security.debian.org/debian-security buster/updates InRelease [65.4 kB]
Get:3 http://deb.debian.org/debian buster-updates InRelease [51.9 kB]
Get:1 http://deb.debian.org/debian buster InRelease [122 kB]
Get:1 http://security.debian.org/debian-security stretch/updates InRelease [53.0 kB]
Get:2 http://security.debian.org/debian-security buster/updates InRelease [65.4 kB]
Ign:2 http://deb.debian.org/debian stretch InRelease
Get:3 http://deb.debian.org/debian stretch-updates InRelease [93.6 kB]
Get:3 http://deb.debian.org/debian buster-updates InRelease [51.9 kB]
Get:4 http://deb.debian.org/debian stretch Release [118 kB]
Get:5 http://deb.debian.org/debian stretch Release.gpg [2410 B]
Get:4 http://deb.debian.org/debian buster/main amd64 Packages [7906 kB]
Get:6 http://security.debian.org/debian-security stretch/updates/main amd64 Packages [551 kB]
Get:4 http://security.debian.org/debian-security buster/updates/main amd64 Packages [218 kB]
Get:5 http://security.debian.org/debian-security buster/updates/main amd64 Packages [218 kB]
Get:7 http://deb.debian.org/debian stretch-updates/main amd64 Packages [2596 B]
Get:5 http://deb.debian.org/debian buster/main amd64 Packages [7906 kB]
Get:8 http://deb.debian.org/debian stretch/main amd64 Packages [7080 kB]
Get:6 http://deb.debian.org/debian buster-updates/main amd64 Packages [7868 B]
Get:6 http://deb.debian.org/debian buster-updates/main amd64 Packages [7868 B]
Fetched 8372 kB in 7s (1226 kB/s)
Reading package lists...Fetched 8372 kB in 8s (1102 kB/s)
Reading package lists...
Reading package lists...Fetched 7900 kB in 8s (971 kB/s)
Reading package lists...
Reading package lists...
Building dependency tree...
Reading state information...
The following additional packages will be installed:
lsb-base sensible-utils
Suggested packages:
anacron logrotate checksecurity
Recommended packages:
default-mta | mail-transport-agent
Reading package lists...The following NEW packages will be installed:
cron lsb-base sensible-utils
0 upgraded, 3 newly installed, 0 to remove and 16 not upgraded.
Need to get 143 kB of archives.
After this operation, 383 kB of additional disk space will be used.
Get:1 http://deb.debian.org/debian buster/main amd64 sensible-utils all 0.0.12 [15.8 kB]
Get:2 http://deb.debian.org/debian buster/main amd64 lsb-base all 10.2019051400 [28.4 kB]
Get:3 http://deb.debian.org/debian buster/main amd64 cron amd64 3.0pl1-134+deb10u1 [99.0 kB]
Building dependency tree...debconf: delaying package configuration, since apt-utils is not installed

Reading state information...
Fetched 143 kB in 0s (1467 kB/s)
The following additional packages will be installed:
sensible-utils
Suggested packages:
anacron logrotate checksecurity
Recommended packages:
default-mta | mail-transport-agent
The following NEW packages will be installed:
cron sensible-utils
Selecting previously unselected package sensible-utils.
(Reading database ...
(Reading database ... 5%
(Reading database ... 10%
(Reading database ... 15%
(Reading database ... 20%
(Reading database ... 25%
(Reading database ... 30%
(Reading database ... 35%
(Reading database ... 40%
(Reading database ... 45%
(Reading database ... 50%
(Reading database ... 55%
(Reading database ... 60%
(Reading database ... 65%
(Reading database ... 70%
(Reading database ... 75%
(Reading database ... 80%
(Reading database ... 85%
(Reading database ... 90%
(Reading database ... 95%
(Reading database ... 100%
(Reading database ... 11918 files and directories currently installed.)
Preparing to unpack .../sensible-utils_0.0.12_all.deb ...
Unpacking sensible-utils (0.0.12) ...
0 upgraded, 2 newly installed, 0 to remove and 0 not upgraded.
Need to get 115 kB of archives.
After this operation, 333 kB of additional disk space will be used.
Get:1 http://deb.debian.org/debian buster/main amd64 sensible-utils all 0.0.12 [15.8 kB]
Get:2 http://deb.debian.org/debian buster/main amd64 cron amd64 3.0pl1-134+deb10u1 [99.0 kB]
Selecting previously unselected package lsb-base.
Preparing to unpack .../lsb-base_10.2019051400_all.deb ...
Unpacking lsb-base (10.2019051400) ...
debconf: delaying package configuration, since apt-utils is not installed

Fetched 115 kB in 0s (1633 kB/s)
Building dependency tree...Selecting previously unselected package sensible-utils.
(Reading database ...
(Reading database ... 5%
(Reading database ... 10%
(Reading database ... 15%
(Reading database ... 20%
(Reading database ... 25%
(Reading database ... 30%
(Reading database ... 35%
(Reading database ... 40%
(Reading database ... 45%
(Reading database ... 50%
(Reading database ... 55%
(Reading database ... 60%
(Reading database ... 65%
(Reading database ... 70%
(Reading database ... 75%
(Reading database ... 80%
(Reading database ... 85%
(Reading database ... 90%
(Reading database ... 95%
(Reading database ... 100%
(Reading database ... 6838 files and directories currently installed.)
Preparing to unpack .../sensible-utils_0.0.12_all.deb ...
Selecting previously unselected package cron.
Preparing to unpack .../cron_3.0pl1-134+deb10u1_amd64.deb ...
Unpacking sensible-utils (0.0.12) ...
Unpacking cron (3.0pl1-134+deb10u1) ...
Reading state information...
Suggested packages:
anacron logrotate checksecurity
Recommended packages:
exim4 | postfix | mail-transport-agent
The following NEW packages will be installed:
cron
0 upgraded, 1 newly installed, 0 to remove and 7 not upgraded.
Need to get 95.4 kB of archives.
After this operation, 257 kB of additional disk space will be used.
Get:1 http://deb.debian.org/debian stretch/main amd64 cron amd64 3.0pl1-128+deb9u1 [95.4 kB]
Setting up lsb-base (10.2019051400) ...
Selecting previously unselected package cron.
Preparing to unpack .../cron_3.0pl1-134+deb10u1_amd64.deb ...
Unpacking cron (3.0pl1-134+deb10u1) ...
Setting up sensible-utils (0.0.12) ...
debconf: delaying package configuration, since apt-utils is not installed
Fetched 95.4 kB in 0s (630 kB/s)
Setting up cron (3.0pl1-134+deb10u1) ...
Setting up sensible-utils (0.0.12) ...
Selecting previously unselected package cron.
(Reading database ...
(Reading database ... 5%
(Reading database ... 10%
(Reading database ... 15%
(Reading database ... 20%
(Reading database ... 25%
(Reading database ... 30%
(Reading database ... 35%
(Reading database ... 40%
(Reading database ... 45%
(Reading database ... 50%
(Reading database ... 55%
(Reading database ... 60%
(Reading database ... 65%
(Reading database ... 70%
(Reading database ... 75%
(Reading database ... 80%
(Reading database ... 85%
(Reading database ... 90%
(Reading database ... 95%
(Reading database ... 100%
(Reading database ... 6658 files and directories currently installed.)
Preparing to unpack .../cron_3.0pl1-128+deb9u1_amd64.deb ...
Setting up cron (3.0pl1-134+deb10u1) ...
Unpacking cron (3.0pl1-128+deb9u1) ...
Setting up cron (3.0pl1-128+deb9u1) ...
Adding group `crontab' (GID 101) ...
Done.
Adding group `crontab' (GID 101) ...
invoke-rc.d: could not determine current runlevel
invoke-rc.d: policy-rc.d denied execution of start.
Done.
invoke-rc.d: could not determine current runlevel
Adding group `crontab' (GID 101) ...
invoke-rc.d: policy-rc.d denied execution of start.
Done.
update-rc.d: warning: start and stop actions are no longer supported; falling back to defaults
invoke-rc.d: could not determine current runlevel
invoke-rc.d: policy-rc.d denied execution of start.
Removing intermediate container 401014ca5840
---> 0984bad58f31
Step 4/5 : COPY entrypoint.sh /entrypoint.sh
Removing intermediate container 748d4fb65d37
---> 36ea799a48c0
Step 4/5 : COPY entrypoint.sh /entrypoint.sh
Removing intermediate container ceb8ab0ec6e0
---> ee0915672099
Step 4/5 : COPY entrypoint.sh /entrypoint.sh
---> 1af2aa19978d
Step 5/5 : ENTRYPOINT ["/entrypoint.sh"]
---> 2285296f5782
Step 5/5 : ENTRYPOINT ["/entrypoint.sh"]
---> Running in 0eeafcc65b18
---> 28ff865a9a5b
Step 5/5 : ENTRYPOINT ["/entrypoint.sh"]
---> Running in 3f6b8078e705
---> Running in 12727950bd3f
Removing intermediate container 0eeafcc65b18
---> 9f84d2b13426
Successfully built 9f84d2b13426
Successfully tagged sentry-cleanup-onpremise-local:latest
Building sentry-cleanup ... done
Removing intermediate container 3f6b8078e705
---> b618f5d91fbd
Successfully built b618f5d91fbd
Successfully tagged snuba-cleanup-onpremise-local:latest
Building snuba-cleanup ... done
Removing intermediate container 12727950bd3f
---> 38c1c7fac21c
Successfully built 38c1c7fac21c
Successfully tagged symbolicator-cleanup-onpremise-local:latest
Building symbolicator-cleanup ... done
Docker images built.
Removing network onpremise_default
Network onpremise_default not found.
Removing network sentry_onpremise_default
Network sentry_onpremise_default not found.
Creating network "sentry_onpremise_default" with the default driver
Creating volume "sentry_onpremise_sentry-secrets" with default driver
Creating volume "sentry_onpremise_sentry-smtp" with default driver
Creating volume "sentry_onpremise_sentry-zookeeper-log" with default driver
Creating volume "sentry_onpremise_sentry-kafka-log" with default driver
Creating volume "sentry_onpremise_sentry-smtp-log" with default driver
Creating volume "sentry_onpremise_sentry-clickhouse-log" with default driver
Creating sentry_onpremise_clickhouse_1 ...
Creating sentry_onpremise_clickhouse_1 ... done
Bootstrapping and migrating Snuba...
Starting sentry_onpremise_clickhouse_1 ...
Starting sentry_onpremise_clickhouse_1 ... done
Creating sentry_onpremise_redis_1 ...
Creating sentry_onpremise_zookeeper_1 ...
Creating sentry_onpremise_zookeeper_1 ... done
Creating sentry_onpremise_kafka_1 ...
Creating sentry_onpremise_redis_1 ... done
Creating sentry_onpremise_kafka_1 ... done
+ '[' b = - ']'
+ snuba bootstrap --help
+ set -- snuba bootstrap --force
+ set gosu snuba snuba bootstrap --force
+ exec gosu snuba snuba bootstrap --force
%3|1597439866.439|FAIL|rdkafka#producer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#192.168.0.5:9092 failed: Connection refused (after 1ms in state CONNECT)
%3|1597439867.438|FAIL|rdkafka#producer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#192.168.0.5:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
2020-08-14 21:17:47,438 Connection to Kafka failed (attempt 0)
Traceback (most recent call last):
  File "/usr/src/snuba/snuba/cli/bootstrap.py", line 56, in bootstrap
    client.list_topics(timeout=1)
cimpl.KafkaException: KafkaError{code=_TRANSPORT,val=-195,str="Failed to get metadata: Local: Broker transport failure"}
%3|1597439868.441|FAIL|rdkafka#producer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#192.168.0.5:9092 failed: Connection refused (after 0ms in state CONNECT)
%3|1597439869.440|FAIL|rdkafka#producer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#192.168.0.5:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
2020-08-14 21:17:49,442 Connection to Kafka failed (attempt 1)
Traceback (most recent call last):
  File "/usr/src/snuba/snuba/cli/bootstrap.py", line 56, in bootstrap
    client.list_topics(timeout=1)
cimpl.KafkaException: KafkaError{code=_TRANSPORT,val=-195,str="Failed to get metadata: Local: Broker transport failure"}
%3|1597439870.453|FAIL|rdkafka#producer-3| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#192.168.0.5:9092 failed: Connection refused (after 0ms in state CONNECT)
2020-08-14 21:17:51,455 Connection to Kafka failed (attempt 2)
Traceback (most recent call last):
  File "/usr/src/snuba/snuba/cli/bootstrap.py", line 56, in bootstrap
    client.list_topics(timeout=1)
cimpl.KafkaException: KafkaError{code=_TRANSPORT,val=-195,str="Failed to get metadata: Local: Broker transport failure"}
2020-08-14 21:17:53,069 Topic events created
2020-08-14 21:17:53,069 Topic errors-replacements created
2020-08-14 21:17:53,069 Topic cdc created
2020-08-14 21:17:53,069 Topic outcomes created
2020-08-14 21:17:53,069 Topic ingest-sessions created
2020-08-14 21:17:53,069 Topic event-replacements created
2020-08-14 21:17:53,069 Topic snuba-commit-log created
2020-08-14 21:17:53,093 Creating tables for storage events
2020-08-14 21:17:53,117 Migrating storage events
2020-08-14 21:17:53,127 Executing migration: ALTER TABLE sentry_local ADD COLUMN _tags_hash_map Array(UInt64) MATERIALIZED arrayMap((k, v) -> cityHash64(concat(replaceRegexpAll(k, '(\\=|\\\\)', '\\\\\\1'), '=', v)), tags.key, tags.value) AFTER _tags_flattened
2020-08-14 21:17:53,148 Creating tables for storage errors
2020-08-14 21:17:53,160 Migrating storage errors
2020-08-14 21:17:53,168 Executing migration: ALTER TABLE errors_local ADD COLUMN _tags_hash_map Array(UInt64) MATERIALIZED arrayMap((k, v) -> cityHash64(concat(replaceRegexpAll(k, '(\\=|\\\\)', '\\\\\\1'), '=', v)), tags.key, tags.value) AFTER _tags_flattened
2020-08-14 21:17:53,187 Creating tables for storage groupedmessages
2020-08-14 21:17:53,195 Migrating storage groupedmessages
2020-08-14 21:17:53,203 Creating tables for storage groupassignees
2020-08-14 21:17:53,211 Migrating storage groupassignees
2020-08-14 21:17:53,216 Creating tables for storage outcomes_raw
2020-08-14 21:17:53,224 Migrating storage outcomes_raw
2020-08-14 21:17:53,230 Creating tables for storage outcomes_hourly
2020-08-14 21:17:53,243 Migrating storage outcomes_hourly
2020-08-14 21:17:53,249 Creating tables for storage sessions_raw
2020-08-14 21:17:53,257 Migrating storage sessions_raw
2020-08-14 21:17:53,264 Creating tables for storage sessions_hourly
2020-08-14 21:17:53,274 Migrating storage sessions_hourly
2020-08-14 21:17:53,285 Creating tables for storage transactions
2020-08-14 21:17:53,295 Migrating storage transactions
2020-08-14 21:17:53,301 Executing migration: ALTER TABLE transactions_local ADD COLUMN _tags_hash_map Array(UInt64) MATERIALIZED arrayMap((k, v) -> cityHash64(concat(replaceRegexpAll(k, '(\\=|\\\\)', '\\\\\\1'), '=', v)), tags.key, tags.value) AFTER _tags_flattened
Setting up database...
Starting sentry_onpremise_zookeeper_1 ...
Starting sentry_onpremise_zookeeper_1 ... done
Starting sentry_onpremise_redis_1 ...
Creating sentry_onpremise_memcached_1 ...
Starting sentry_onpremise_redis_1 ... done
Creating sentry_onpremise_symbolicator_1 ...
Starting sentry_onpremise_clickhouse_1 ...
Starting sentry_onpremise_clickhouse_1 ... done
Creating sentry_onpremise_postgres_1 ...
Creating sentry_onpremise_smtp_1 ...
Starting sentry_onpremise_kafka_1 ...
Starting sentry_onpremise_kafka_1 ... done
Creating sentry_onpremise_snuba-api_1 ...
Creating sentry_onpremise_snuba-outcomes-consumer_1 ...
Creating sentry_onpremise_snuba-sessions-consumer_1 ...
Creating sentry_onpremise_snuba-replacer_1 ...
Creating sentry_onpremise_snuba-consumer_1 ...
Creating sentry_onpremise_snuba-transactions-consumer_1 ...
Creating sentry_onpremise_smtp_1 ... done
Creating sentry_onpremise_symbolicator_1 ... done
Creating sentry_onpremise_memcached_1 ... done
Creating sentry_onpremise_snuba-sessions-consumer_1 ... done
Creating sentry_onpremise_snuba-outcomes-consumer_1 ... done
Creating sentry_onpremise_snuba-replacer_1 ... done
Creating sentry_onpremise_postgres_1 ... done
Creating sentry_onpremise_snuba-api_1 ... done
Creating sentry_onpremise_snuba-consumer_1 ... done
Creating sentry_onpremise_snuba-transactions-consumer_1 ... done
!! Configuration error: IndentationError: unexpected indent (<string>, line 166)
An error occurred, caught SIGERR on line 264
Cleaning up...
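The install aborts at this point because the configuration file (most likely sentry/sentry.conf.py, pasted below) fails to compile: Python reports an unexpected indent around line 166 of the rendered config. A minimal sketch for pinpointing the offending line before re-running ./install.sh, assuming the file sits at sentry/sentry.conf.py inside the onpremise checkout (the path and helper name are illustrative, not part of the installer):

# check_conf.py -- hedged helper: compile sentry/sentry.conf.py to locate the
# IndentationError reported by install.sh. The path below is an assumption;
# adjust it to wherever your config actually lives.
import sys

path = "sentry/sentry.conf.py"

with open(path) as f:
    source = f.read()

try:
    compile(source, path, "exec")
    print("%s compiles cleanly" % path)
except SyntaxError as exc:  # IndentationError is a subclass of SyntaxError
    sys.stderr.write(
        "%s at %s:%s: %r\n" % (type(exc).__name__, path, exc.lineno, exc.text)
    )
    sys.exit(1)

Once the file compiles cleanly, re-running ./install.sh should get past the "Setting up database..." step.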
# This file is just Python, with a touch of Django which means
# you can inherit and tweak settings to your heart's content.
from sentry.conf.server import * # NOQA
# Generously adapted from pynetlinux: https://git.io/JJmga
def get_internal_network():
    import ctypes
    import fcntl
    import math
    import socket
    import struct

    iface = "eth0"
    sockfd = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    ifreq = struct.pack("16sH14s", iface, socket.AF_INET, b"\x00" * 14)
    try:
        ip = struct.unpack(
            "!I", struct.unpack("16sH2x4s8x", fcntl.ioctl(sockfd, 0x8915, ifreq))[2]
        )[0]
        netmask = socket.ntohl(
            struct.unpack("16sH2xI8x", fcntl.ioctl(sockfd, 0x891B, ifreq))[2]
        )
    except IOError:
        return ()
    base = socket.inet_ntoa(struct.pack("!I", ip & netmask))
    netmask_bits = 32 - int(round(math.log(ctypes.c_uint32(~netmask).value + 1, 2), 1))
    return ("{0:s}/{1:d}".format(base, netmask_bits),)
INTERNAL_SYSTEM_IPS = get_internal_network()
DATABASES = {
    "default": {
        "ENGINE": "sentry.db.postgres",
        "NAME": "postgres",
        "USER": "postgres",
        "PASSWORD": "",
        "HOST": "postgres",
        "PORT": "",
    }
}
# You should not change this setting after your database has been created
# unless you have altered all schemas first
SENTRY_USE_BIG_INTS = True
# If you're expecting any kind of real traffic on Sentry, we highly recommend
# configuring the CACHES and Redis settings
###########
# General #
###########
# Instruct Sentry that this install intends to be run by a single organization
# and thus various UI optimizations should be enabled.
SENTRY_SINGLE_ORGANIZATION = True
SENTRY_OPTIONS["system.event-retention-days"] = int(
env("SENTRY_EVENT_RETENTION_DAYS", "90")
)
#########
# Redis #
#########
# Generic Redis configuration used as defaults for various things including:
# Buffers, Quotas, TSDB
SENTRY_OPTIONS["redis.clusters"] = {
"default": {
"hosts": {0: {"host": "redis", "password": "", "port": "6379", "db": "0"}}
}
}
#########
# Queue #
#########
# See https://docs.getsentry.com/on-premise/server/queue/ for more
# information on configuring your queue broker and workers. Sentry relies
# on a Python framework called Celery to manage queues.
rabbitmq_host = None
if rabbitmq_host:
    BROKER_URL = "amqp://{username}:{password}@{host}/{vhost}".format(
        username="guest", password="guest", host=rabbitmq_host, vhost="/"
    )
else:
    BROKER_URL = "redis://:{password}@{host}:{port}/{db}".format(
        **SENTRY_OPTIONS["redis.clusters"]["default"]["hosts"][0]
    )
#########
# Cache #
#########
# Sentry currently utilizes two separate mechanisms. While CACHES is not a
# requirement, it will optimize several high throughput patterns.
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.memcached.MemcachedCache",
        "LOCATION": ["memcached:11211"],
        "TIMEOUT": 3600,
    }
}
# A primary cache is required for things such as processing events
SENTRY_CACHE = "sentry.cache.redis.RedisCache"
DEFAULT_KAFKA_OPTIONS = {
    "bootstrap.servers": "kafka:9092",
    "message.max.bytes": 50000000,
    "socket.timeout.ms": 1000,
}
SENTRY_EVENTSTREAM = "sentry.eventstream.kafka.KafkaEventStream"
SENTRY_EVENTSTREAM_OPTIONS = {"producer_configuration": DEFAULT_KAFKA_OPTIONS}
KAFKA_CLUSTERS["default"] = DEFAULT_KAFKA_OPTIONS
###############
# Rate Limits #
###############
# Rate limits apply to notification handlers and are enforced per-project
# automatically.
SENTRY_RATELIMITER = "sentry.ratelimits.redis.RedisRateLimiter"
##################
# Update Buffers #
##################
# Buffers (combined with queueing) act as an intermediate layer between the
# database and the storage API. They will greatly improve efficiency on large
# numbers of the same events being sent to the API in a short amount of time.
# (read: if you send any kind of real data to Sentry, you should enable buffers)
SENTRY_BUFFER = "sentry.buffer.redis.RedisBuffer"
##########
# Quotas #
##########
# Quotas allow you to rate limit individual projects or the Sentry install as
# a whole.
SENTRY_QUOTAS = "sentry.quotas.redis.RedisQuota"
########
# TSDB #
########
# The TSDB is used for building charts as well as making things like per-rate
# alerts possible.
SENTRY_TSDB = "sentry.tsdb.redissnuba.RedisSnubaTSDB"
# Automatic switchover 90 days after Fri, Aug 14, 2020 4:56:01 PM. Can be removed afterwards.
SENTRY_TSDB_OPTIONS = {"switchover_timestamp": 1597438561 + (90 * 24 * 3600)}
#########
# SNUBA #
#########
SENTRY_SEARCH = "sentry.search.snuba.EventsDatasetSnubaSearchBackend"
SENTRY_SEARCH_OPTIONS = {}
SENTRY_TAGSTORE_OPTIONS = {}
###########
# Digests #
###########
# The digest backend powers notification summaries.
SENTRY_DIGESTS = "sentry.digests.backends.redis.RedisBackend"
##############
# Web Server #
##############
SENTRY_WEB_HOST = "0.0.0.0"
SENTRY_WEB_PORT = 9000
SENTRY_WEB_OPTIONS = {
    "http": "%s:%s" % (SENTRY_WEB_HOST, SENTRY_WEB_PORT),
    "protocol": "uwsgi",
    # This is needed to prevent https://git.io/fj7Lw
    "uwsgi-socket": None,
    "so-keepalive": True,
    # Keep this between 15s-75s as that's what Relay supports
    "http-keepalive": 15,
    "http-chunked-input": True,
    # the number of web workers
    "workers": 3,
    "threads": 4,
    "memory-report": False,
    # Some stuff so uwsgi will cycle workers sensibly
    "max-requests": 100000,
    "max-requests-delta": 500,
    "max-worker-lifetime": 86400,
    # Duplicate options from sentry default just so we don't get
    # bit by sentry changing a default value that we depend on.
    "thunder-lock": True,
    "log-x-forwarded-for": False,
    "buffer-size": 32768,
    "limit-post": 209715200,
    "disable-logging": True,
    "reload-on-rss": 600,
    "ignore-sigpipe": True,
    "ignore-write-errors": True,
    "disable-write-exception": True,
}
###########
# SSL/TLS #
###########
# If you're using a reverse SSL proxy, you should enable the X-Forwarded-Proto
# header and enable the settings below
# SECURE_PROXY_SSL_HEADER = ('HTTP_X_FORWARDED_PROTO', 'https')
# SESSION_COOKIE_SECURE = True
# CSRF_COOKIE_SECURE = True
# SOCIAL_AUTH_REDIRECT_IS_HTTPS = True
# End of SSL/TLS settings
############
# Features #
############
SENTRY_FEATURES["projects:sample-events"] = False
SENTRY_FEATURES.update(
    {
        feature: True
        for feature in (
            "organizations:discover",
            "organizations:events",
            "organizations:global-views",
            "organizations:integrations-issue-basic",
            "organizations:integrations-issue-sync",
            "organizations:invite-members",
            "organizations:sso-basic",
            "organizations:sso-rippling",
            "organizations:sso-saml2",
            "organizations:performance-view",
            "projects:custom-inbound-filters",
            "projects:data-forwarding",
            "projects:discard-groups",
            "projects:plugins",
            "projects:rate-limits",
            "projects:servicehooks",
        )
    }
)
######################
# GitHub Integration #
######################
GITHUB_EXTENDED_PERMISSIONS = ["repo"]
#########################
# Bitbucket Integration #
#########################
# BITBUCKET_CONSUMER_KEY = 'YOUR_BITBUCKET_CONSUMER_KEY'
# BITBUCKET_CONSUMER_SECRET = 'YOUR_BITBUCKET_CONSUMER_SECRET'