Created February 8, 2017 17:39
deployer$ docker-compose -f docker-compose.production.yml up
Starting redash_postgres_1
Starting redash_redis_1
Starting redash_worker_1
Starting redash_server_1
Starting redash_nginx_1
Attaching to redash_postgres_1, redash_redis_1, redash_server_1, redash_nginx_1, redash_worker_1
postgres_1 | LOG: database system was interrupted; last known up at 2017-02-08 17:29:18 UTC
postgres_1 | LOG: database system was not properly shut down; automatic recovery in progress
postgres_1 | LOG: record with zero length at 0/18734B8
postgres_1 | LOG: redo is not required
postgres_1 | LOG: MultiXact member wraparound protections are now enabled
postgres_1 | LOG: database system is ready to accept connections
postgres_1 | LOG: autovacuum launcher started
redis_1 | [1] 08 Feb 17:37:36.030 # Warning: no config file specified, using the default config. In order to specify a config file use redis-server /path/to/redis.conf
redis_1 | _._
redis_1 | _.-``__ ''-._
redis_1 | _.-`` `. `_. ''-._ Redis 2.8.23 (00000000/0) 64 bit
redis_1 | .-`` .-```. ```\/ _.,_ ''-._
redis_1 | ( ' , .-` | `, ) Running in stand alone mode
redis_1 | |`-._`-...-` __...-.``-._|'` _.-'| Port: 6379
redis_1 | | `-._ `._ / _.-' | PID: 1
redis_1 | `-._ `-._ `-./ _.-' _.-'
redis_1 | |`-._`-._ `-.__.-' _.-'_.-'|
redis_1 | | `-._`-._ _.-'_.-' | http://redis.io
redis_1 | `-._ `-._`-.__.-'_.-' _.-'
redis_1 | |`-._`-._ `-.__.-' _.-'_.-'|
redis_1 | | `-._`-._ _.-'_.-' |
redis_1 | `-._ `-._`-.__.-'_.-' _.-'
redis_1 | `-._ `-.__.-' _.-'
redis_1 | `-._ _.-'
redis_1 | `-.__.-'
redis_1 |
redis_1 | [1] 08 Feb 17:37:36.032 # Server started, Redis version 2.8.23
redis_1 | [1] 08 Feb 17:37:36.032 # WARNING overcommit_memory is set to 0! Background save may fail under low memory condition. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
redis_1 | [1] 08 Feb 17:37:36.032 # WARNING: The TCP backlog setting of 511 cannot be enforced because /proc/sys/net/core/somaxconn is set to the lower value of 128.
redis_1 | [1] 08 Feb 17:37:36.033 * DB loaded from disk: 0.000 seconds
redis_1 | [1] 08 Feb 17:37:36.033 * The server is now ready to accept connections on port 6379
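The two redis WARNINGs above (overcommit_memory, somaxconn) are kernel parameters, so setting them inside the container has no effect; they must be applied on the Docker host. A sketch for a Linux host with root access, using the exact settings redis names in its warnings:

```
# Run on the Docker host as root; takes effect immediately:
#   sysctl -w vm.overcommit_memory=1
#   sysctl -w net.core.somaxconn=511
# To persist across reboots, append to /etc/sysctl.conf on the host:
vm.overcommit_memory = 1
net.core.somaxconn = 511
```

Both warnings are advisory; redis starts anyway, as the "ready to accept connections" line shows.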
server_1 | [2017-02-08 17:37:36 +0000] [8] [INFO] Starting gunicorn 19.4.5
worker_1 | Starting scheduler and 2 workers for queues: queries,scheduled_queries,celery...
server_1 | [2017-02-08 17:37:36 +0000] [8] [INFO] Listening at: http://0.0.0.0:5000 (8)
nginx_1 | 2017/02/08 17:37:36 [emerg] 1#1: host not found in upstream "redash:5000" in /etc/nginx/conf.d/default.conf:2
nginx_1 | nginx: [emerg] host not found in upstream "redash:5000" in /etc/nginx/conf.d/default.conf:2
redash_nginx_1 exited with code 1
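This is the fatal error in the log: nginx exits because the hostname `redash` referenced on line 2 of its `default.conf` does not resolve. Judging from the container names (`redash_server_1`), the compose service serving port 5000 is named `server`, not `redash`. One way to reconcile the two without editing the nginx image is a network alias; the fragment below is a sketch against an assumed compose v2 file, with the service and network names guessed from the container names above:

```yaml
# docker-compose.production.yml (fragment, assumed layout): give the
# "server" service the extra DNS name "redash" that nginx's
# default.conf expects, so the upstream resolves.
services:
  server:
    networks:
      default:
        aliases:
          - redash
```

Alternatively, change `redash:5000` to `server:5000` in `/etc/nginx/conf.d/default.conf` and restart the nginx container.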
server_1 | [2017-02-08 17:37:36 +0000] [8] [INFO] Using worker: sync
server_1 | [2017-02-08 17:37:36 +0000] [13] [INFO] Booting worker with pid: 13
server_1 | [2017-02-08 17:37:36 +0000] [16] [INFO] Booting worker with pid: 16
server_1 | [2017-02-08 17:37:36 +0000] [19] [INFO] Booting worker with pid: 19
server_1 | [2017-02-08 17:37:36 +0000] [22] [INFO] Booting worker with pid: 22
worker_1 | [2017-02-08 17:37:38,578][PID:7][WARNING][redash.query_runner] BigQueryGCE query runner enabled but not supported, not registering. Either disable or install missing dependencies.
worker_1 | [2017-02-08 17:37:39,219: WARNING/MainProcess] /usr/local/lib/python2.7/dist-packages/celery/apps/worker.py:161: CDeprecationWarning:
worker_1 | Starting from version 3.2 Celery will refuse to accept pickle by default.
worker_1 |
worker_1 | The pickle serializer is a security concern as it may give attackers
worker_1 | the ability to execute any command. It's important to secure
worker_1 | your broker from unauthorized access when using pickle, so we think
worker_1 | that enabling pickle should require a deliberate action and not be
worker_1 | the default choice.
worker_1 |
worker_1 | If you depend on pickle then you should set a setting to disable this
worker_1 | warning and to be sure that everything will continue working
worker_1 | when you upgrade to Celery 3.2::
worker_1 |
worker_1 | CELERY_ACCEPT_CONTENT = ['pickle', 'json', 'msgpack', 'yaml']
worker_1 |
worker_1 | You must only enable the serializers that you will actually use.
worker_1 |
worker_1 |
worker_1 | warnings.warn(CDeprecationWarning(W_PICKLE_DEPRECATED))
worker_1 |
worker_1 | -------------- celery@edf5a95b36fb v3.1.23 (Cipater)
worker_1 | ---- **** -----
worker_1 | --- * *** * -- Linux-4.8.6-x86_64-linode78-x86_64-with-Ubuntu-16.04-xenial
worker_1 | -- * - **** ---
worker_1 | - ** ---------- [config]
worker_1 | - ** ---------- .> app: redash:0x7f8b5ab62890
worker_1 | - ** ---------- .> transport: redis://redis:6379/0
worker_1 | - ** ---------- .> results: redis://redis:6379/0
worker_1 | - *** --- * --- .> concurrency: 2 (prefork)
worker_1 | -- ******* ----
worker_1 | --- ***** ----- [queues]
worker_1 | -------------- .> celery exchange=celery(direct) key=celery
worker_1 | .> queries exchange=queries(direct) key=queries
worker_1 | .> scheduled_queries exchange=scheduled_queries(direct) key=scheduled_queries
worker_1 |
worker_1 | [tasks]
worker_1 | . redash.tasks.check_alerts_for_query
worker_1 | . redash.tasks.cleanup_query_results
worker_1 | . redash.tasks.cleanup_tasks
worker_1 | . redash.tasks.execute_query
worker_1 | . redash.tasks.record_event
worker_1 | . redash.tasks.refresh_queries
worker_1 | . redash.tasks.refresh_schemas
worker_1 | . redash.tasks.send_mail
worker_1 | . redash.tasks.subscribe
worker_1 | . redash.tasks.version_check
worker_1 |
worker_1 | [2017-02-08 17:37:39,483: INFO/MainProcess] Connected to redis://redis:6379/0
worker_1 | [2017-02-08 17:37:39,508: INFO/MainProcess] mingle: searching for neighbors
worker_1 | [2017-02-08 17:37:39,656: INFO/Beat] beat: Starting...
worker_1 | [2017-02-08 17:37:39,673: ERROR/Beat] Removing corrupted schedule file 'celerybeat-schedule': DBAccessError(13, 'Permission denied')
worker_1 | Traceback (most recent call last):
worker_1 | File "/usr/local/lib/python2.7/dist-packages/celery/beat.py", line 376, in setup_schedule
worker_1 | self._store = self._open_schedule()
worker_1 | File "/usr/local/lib/python2.7/dist-packages/celery/beat.py", line 366, in _open_schedule
worker_1 | return self.persistence.open(self.schedule_filename, writeback=True)
worker_1 | File "/usr/lib/python2.7/shelve.py", line 243, in open
worker_1 | return DbfilenameShelf(filename, flag, protocol, writeback)
worker_1 | File "/usr/lib/python2.7/shelve.py", line 227, in __init__
worker_1 | Shelf.__init__(self, anydbm.open(filename, flag), protocol, writeback)
worker_1 | File "/usr/lib/python2.7/anydbm.py", line 85, in open
worker_1 | return mod.open(file, flag, mode)
worker_1 | File "/usr/lib/python2.7/dbhash.py", line 18, in open
worker_1 | return bsddb.hashopen(file, flag, mode)
worker_1 | File "/usr/lib/python2.7/bsddb/__init__.py", line 364, in hashopen
worker_1 | d.open(file, db.DB_HASH, flags, mode)
worker_1 | DBAccessError: (13, 'Permission denied')
worker_1 | [2017-02-08 17:37:39,678: ERROR/Beat] Process Beat
worker_1 | Traceback (most recent call last):
worker_1 | File "/usr/local/lib/python2.7/dist-packages/billiard/process.py", line 292, in _bootstrap
worker_1 | self.run()
worker_1 | File "/usr/local/lib/python2.7/dist-packages/celery/beat.py", line 553, in run
worker_1 | self.service.start(embedded_process=True)
worker_1 | File "/usr/local/lib/python2.7/dist-packages/celery/beat.py", line 470, in start
worker_1 | humanize_seconds(self.scheduler.max_interval))
worker_1 | File "/usr/local/lib/python2.7/dist-packages/kombu/utils/__init__.py", line 325, in __get__
worker_1 | value = obj.__dict__[self.__name__] = self.__get(obj)
worker_1 | File "/usr/local/lib/python2.7/dist-packages/celery/beat.py", line 512, in scheduler
worker_1 | return self.get_scheduler()
worker_1 | File "/usr/local/lib/python2.7/dist-packages/celery/beat.py", line 507, in get_scheduler
worker_1 | lazy=lazy)
worker_1 | File "/usr/local/lib/python2.7/dist-packages/celery/utils/imports.py", line 53, in instantiate
worker_1 | return symbol_by_name(name)(*args, **kwargs)
worker_1 | File "/usr/local/lib/python2.7/dist-packages/celery/beat.py", line 358, in __init__
worker_1 | Scheduler.__init__(self, *args, **kwargs)
worker_1 | File "/usr/local/lib/python2.7/dist-packages/celery/beat.py", line 185, in __init__
worker_1 | self.setup_schedule()
worker_1 | File "/usr/local/lib/python2.7/dist-packages/celery/beat.py", line 384, in setup_schedule
worker_1 | self._store = self._destroy_open_corrupted_schedule(exc)
worker_1 | File "/usr/local/lib/python2.7/dist-packages/celery/beat.py", line 372, in _destroy_open_corrupted_schedule
worker_1 | return self._open_schedule()
worker_1 | File "/usr/local/lib/python2.7/dist-packages/celery/beat.py", line 366, in _open_schedule
worker_1 | return self.persistence.open(self.schedule_filename, writeback=True)
worker_1 | File "/usr/lib/python2.7/shelve.py", line 243, in open
worker_1 | return DbfilenameShelf(filename, flag, protocol, writeback)
worker_1 | File "/usr/lib/python2.7/shelve.py", line 227, in __init__
worker_1 | Shelf.__init__(self, anydbm.open(filename, flag), protocol, writeback)
worker_1 | File "/usr/lib/python2.7/anydbm.py", line 85, in open
worker_1 | return mod.open(file, flag, mode)
worker_1 | File "/usr/lib/python2.7/dbhash.py", line 18, in open
worker_1 | return bsddb.hashopen(file, flag, mode)
worker_1 | File "/usr/lib/python2.7/bsddb/__init__.py", line 364, in hashopen
worker_1 | d.open(file, db.DB_HASH, flags, mode)
worker_1 | DBAccessError: (13, 'Permission denied')
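The Beat process dies because the container user cannot write the `celerybeat-schedule` state file into its working directory (errno 13). Two generic workarounds, both sketches; the app module, user name, and paths below are assumptions, not taken from this image:

```
# Option 1: point beat's state file at a writable location using
# celery's -s/--schedule flag (here with embedded beat, the -B flag):
#   celery worker -B -A <app> -s /tmp/celerybeat-schedule
#
# Option 2: fix ownership of the directory the worker writes into,
# e.g. by running chown once as root through compose:
#   docker-compose -f docker-compose.production.yml run --user root worker \
#       chown -R <worker-user> <workdir>
```

Note the worker itself survives this: billiard restarts only the Beat child, and the "celery@edf5a95b36fb ready." line below shows the consumers come up, so only scheduled queries are affected.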
server_1 | [2017-02-08 17:37:39,824][PID:13][WARNING][redash.query_runner] BigQueryGCE query runner enabled but not supported, not registering. Either disable or install missing dependencies.
server_1 | [2017-02-08 17:37:39,944][PID:16][WARNING][redash.query_runner] BigQueryGCE query runner enabled but not supported, not registering. Either disable or install missing dependencies.
server_1 | [2017-02-08 17:37:40,086][PID:22][WARNING][redash.query_runner] BigQueryGCE query runner enabled but not supported, not registering. Either disable or install missing dependencies.
server_1 | [2017-02-08 17:37:40,100][PID:19][WARNING][redash.query_runner] BigQueryGCE query runner enabled but not supported, not registering. Either disable or install missing dependencies.
worker_1 | [2017-02-08 17:37:40,518: INFO/MainProcess] mingle: all alone
worker_1 | [2017-02-08 17:37:40,532: WARNING/MainProcess] celery@edf5a95b36fb ready.