@daosborne
Created March 3, 2020 22:52
Arches 5.0 Docker setup
~/Research/archesdemo $ docker-compose up
Creating elasticsearch ... done
Creating letsencrypt ... done
Creating couchdb ... done
Creating db ... done
Creating arches ... done
Creating nginx ... done
Attaching to letsencrypt, elasticsearch, couchdb, db, arches, nginx
arches | Full command: run_arches
arches | Command: run_arches
arches | Testing if database server is up...
db | Add rule to pg_hba: 0.0.0.0/0
db | Add rule to pg_hba: replication replicator
db | Setup master database
letsencrypt | Running on localhost, so not downloading certificates. Exiting...
letsencrypt exited with code 0
nginx |
nginx | Initializing NginX to run on: localhost
nginx |
nginx | Setting Nginx up as reverse proxy for **local** Docker container: arches...
nginx | Copying Nginx configuration files...
nginx | Setting NginX conf to use certificates in /etc/letsencrypt/live/localhost...
nginx | Running Nginx on in the foreground
nginx | 2020/03/03 22:28:08 [warn] 8#8: "ssl_stapling" ignored, no OCSP responder URL in the certificate "/etc/letsencrypt/live/localhost/fullchain.pem"
nginx | nginx: [warn] "ssl_stapling" ignored, no OCSP responder URL in the certificate "/etc/letsencrypt/live/localhost/fullchain.pem"
elasticsearch | OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
db | psql: error: could not connect to server: could not connect to server: No such file or directory
db | Is the server running locally and accepting
db | connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
db | 2020-03-03 22:28:11.314 UTC [27] LOG: starting PostgreSQL 12.2 (Debian 12.2-2.pgdg100+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 8.3.0-6) 8.3.0, 64-bit
db | 2020-03-03 22:28:11.314 UTC [27] LOG: listening on IPv4 address "127.0.0.1", port 5432
db | 2020-03-03 22:28:11.323 UTC [27] LOG: listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
db | 2020-03-03 22:28:11.397 UTC [39] LOG: database system was shut down at 2020-02-25 12:51:23 UTC
db | 2020-03-03 22:28:11.419 UTC [27] LOG: database system is ready to accept connections
db | List of databases
db | Name | Owner | Encoding | Collate | Ctype | Access privileges
db | -----------+----------+----------+---------+---------+-----------------------
db | postgres | postgres | UTF8 | C.UTF-8 | C.UTF-8 |
db | template0 | postgres | UTF8 | C.UTF-8 | C.UTF-8 | =c/postgres +
db | | | | | | postgres=CTc/postgres
db | template1 | postgres | UTF8 | C.UTF-8 | C.UTF-8 | =c/postgres +
db | | | | | | postgres=CTc/postgres
db | (3 rows)
db |
db | postgres ready
db | Setup postgres User:Password
db | Creating superuser postgres
db | ALTER ROLE
db | Creating replication user replicator
db | CREATE ROLE
db | postgres db already exists
db | List of databases
db | Name | Owner | Encoding | Collate | Ctype | Access privileges
db | -----------+----------+----------+---------+---------+-----------------------
db | postgres | postgres | UTF8 | C.UTF-8 | C.UTF-8 |
db | template0 | postgres | UTF8 | C.UTF-8 | C.UTF-8 | =c/postgres +
db | | | | | | postgres=CTc/postgres
db | template1 | postgres | UTF8 | C.UTF-8 | C.UTF-8 | =c/postgres +
db | | | | | | postgres=CTc/postgres
db | (3 rows)
db |
db | /docker-entrypoint.sh: running /docker-entrypoint-initdb.d/init.sql
db | psql:/docker-entrypoint-initdb.d/init.sql: error: could not read from input file: Is a directory
db |
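The `could not read from input file: Is a directory` error above usually means the bind mount for the init script points at a host path that does not exist: Docker then creates the target as an empty directory, and psql fails when it tries to read it as a file. A hypothetical sketch of the intended mount (the host path `./init/init.sql` is an assumption, not taken from this setup):

```yaml
# Hypothetical fragment of docker-compose.yml for the "db" service.
# If ./init/init.sql is absent on the host when the container starts,
# Docker creates a directory at the container path instead, producing
# the "Is a directory" error seen in the log.
services:
  db:
    volumes:
      - ./init/init.sql:/docker-entrypoint-initdb.d/init.sql
```

Checking that the host file exists before `docker-compose up` avoids the directory being created in its place.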
db | 2020-03-03 22:28:13.124 UTC [27] LOG: received smart shutdown request
db | 2020-03-03 22:28:13.138 UTC [27] LOG: background worker "logical replication launcher" (PID 45) exited with exit code 1
db | 2020-03-03 22:28:13.139 UTC [40] LOG: shutting down
db | 2020-03-03 22:28:13.193 UTC [27] LOG: database system is shut down
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:13,716Z", "level": "INFO", "component": "o.e.e.NodeEnvironment", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "using [1] data paths, mounts [[/usr/share/elasticsearch/data (/dev/sda1)]], net usable_space [43.6gb], net total_space [58.4gb], types [ext4]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:13,722Z", "level": "INFO", "component": "o.e.e.NodeEnvironment", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "heap size [386.6mb], compressed ordinary object pointers [true]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:13,733Z", "level": "INFO", "component": "o.e.n.Node", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "node name [ceeca20511f1], node ID [XKbpzIF3TXah1X1JNsIAYQ], cluster name [docker-cluster]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:13,734Z", "level": "INFO", "component": "o.e.n.Node", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "version[7.5.1], pid[1], build[default/docker/3ae9ac9a93c95bd0cdc054951cf95d88e1e18d96/2019-12-16T22:57:37.835892Z], OS[Linux/4.19.76-linuxkit/amd64], JVM[AdoptOpenJDK/OpenJDK 64-Bit Server VM/13.0.1/13.0.1+9]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:13,736Z", "level": "INFO", "component": "o.e.n.Node", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "JVM home [/usr/share/elasticsearch/jdk]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:13,738Z", "level": "INFO", "component": "o.e.n.Node", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "JVM arguments [-Des.networkaddress.cache.ttl=60, -Des.networkaddress.cache.negative.ttl=10, -XX:+AlwaysPreTouch, -Xss1m, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djna.nosys=true, -XX:-OmitStackTraceInFastThrow, -Dio.netty.noUnsafe=true, -Dio.netty.noKeySetOptimization=true, -Dio.netty.recycler.maxCapacityPerThread=0, -Dio.netty.allocator.numDirectArenas=0, -Dlog4j.shutdownHookEnabled=false, -Dlog4j2.disable.jmx=true, -Djava.locale.providers=COMPAT, -Xms1g, -Xmx1g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -Djava.io.tmpdir=/tmp/elasticsearch-7349473778430179926, -XX:+HeapDumpOnOutOfMemoryError, -XX:HeapDumpPath=data, -XX:ErrorFile=logs/hs_err_pid%p.log, -Xlog:gc*,gc+age=trace,safepoint:file=logs/gc.log:utctime,pid,tags:filecount=32,filesize=64m, -Des.cgroups.hierarchy.override=/, -Xms400m, -Xmx400m, -XX:MaxDirectMemorySize=209715200, -Des.path.home=/usr/share/elasticsearch, -Des.path.conf=/usr/share/elasticsearch/config, -Des.distribution.flavor=default, -Des.distribution.type=docker, -Des.bundled_jdk=true]" }
db | Postgres initialisation process completed .... restarting in foreground
db | 2020-03-03 22:28:14.181 UTC [151] LOG: starting PostgreSQL 12.2 (Debian 12.2-2.pgdg100+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 8.3.0-6) 8.3.0, 64-bit
db | 2020-03-03 22:28:14.185 UTC [151] LOG: listening on IPv4 address "0.0.0.0", port 5432
db | 2020-03-03 22:28:14.192 UTC [151] LOG: listening on IPv6 address "::", port 5432
db | 2020-03-03 22:28:14.199 UTC [151] LOG: listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
db | 2020-03-03 22:28:14.237 UTC [153] LOG: database system was shut down at 2020-03-03 22:28:13 UTC
db | 2020-03-03 22:28:14.258 UTC [151] LOG: database system is ready to accept connections
couchdb | [info] 2020-03-03T22:28:15.191449Z nonode@nohost <0.7.0> -------- Application couch_log started on node nonode@nohost
couchdb | [info] 2020-03-03T22:28:15.228114Z nonode@nohost <0.7.0> -------- Application folsom started on node nonode@nohost
couchdb | [info] 2020-03-03T22:28:15.264161Z nonode@nohost <0.47.0> -------- alarm_handler: {set,{system_memory_high_watermark,[]}}
couchdb | [info] 2020-03-03T22:28:15.468407Z nonode@nohost <0.7.0> -------- Application couch_stats started on node nonode@nohost
couchdb | [info] 2020-03-03T22:28:15.468897Z nonode@nohost <0.7.0> -------- Application khash started on node nonode@nohost
couchdb | [info] 2020-03-03T22:28:15.506152Z nonode@nohost <0.7.0> -------- Application couch_event started on node nonode@nohost
arches | Database server is up
arches | Testing if Elasticsearch is up...
arches | Elasticsearch is up
arches | Checking if database archesdemo exists...
couchdb | [info] 2020-03-03T22:28:15.566070Z nonode@nohost <0.7.0> -------- Application ibrowse started on node nonode@nohost
couchdb | [info] 2020-03-03T22:28:15.634835Z nonode@nohost <0.7.0> -------- Application ioq started on node nonode@nohost
arches | Database archesdemo does not exists yet, starting setup...
arches | Current work directory: /web_root/arches
arches |
arches |
arches | *** Initializing database ***
arches |
arches | *** Any existing Arches database will be deleted ***
arches |
arches |
arches | 5
couchdb | [info] 2020-03-03T22:28:15.635342Z nonode@nohost <0.7.0> -------- Application mochiweb started on node nonode@nohost
couchdb | [info] 2020-03-03T22:28:15.686090Z nonode@nohost <0.195.0> -------- Apache CouchDB 2.1.1 is starting.
couchdb |
couchdb | [info] 2020-03-03T22:28:15.686537Z nonode@nohost <0.196.0> -------- Starting couch_sup
couchdb | [notice] 2020-03-03T22:28:15.793262Z nonode@nohost <0.82.0> -------- config: [admins] admin set to -pbkdf2-88ef5f9e36f8e6458699cc0ee1cdc71761c9cf34,31816c3b9a9817c91640dd8212859330,10 for reason nil
couchdb | [notice] 2020-03-03T22:28:15.845509Z nonode@nohost <0.82.0> -------- config: [couchdb] uuid set to bb177d920f0343eb54bead62fd051ca2 for reason nil
arches | 4
couchdb | [info] 2020-03-03T22:28:17.212357Z nonode@nohost <0.201.0> -------- open_result error {not_found,no_db_file} for _users
couchdb | [info] 2020-03-03T22:28:17.387306Z nonode@nohost <0.195.0> -------- Apache CouchDB has started. Time to relax.
couchdb |
couchdb | [info] 2020-03-03T22:28:17.387599Z nonode@nohost <0.195.0> -------- Apache CouchDB has started on http://any:5986/
couchdb | [info] 2020-03-03T22:28:17.387964Z nonode@nohost <0.7.0> -------- Application couch started on node nonode@nohost
couchdb | [info] 2020-03-03T22:28:17.388077Z nonode@nohost <0.7.0> -------- Application ets_lru started on node nonode@nohost
couchdb | [info] 2020-03-03T22:28:17.492705Z nonode@nohost <0.261.0> -------- Opening index for db: _users idx: _design/_auth sig: "3e823c2a4383ac0c18d4e574135a5b08"
couchdb | [info] 2020-03-03T22:28:17.506927Z nonode@nohost <0.7.0> -------- Application rexi started on node nonode@nohost
couchdb | [info] 2020-03-03T22:28:17.555924Z nonode@nohost <0.201.0> -------- open_result error {not_found,no_db_file} for _nodes
couchdb | [warning] 2020-03-03T22:28:17.556027Z nonode@nohost <0.278.0> -------- creating missing database: _nodes
couchdb | [info] 2020-03-03T22:28:17.599596Z nonode@nohost <0.201.0> -------- open_result error {not_found,no_db_file} for _dbs
couchdb | [warning] 2020-03-03T22:28:17.599656Z nonode@nohost <0.291.0> -------- creating missing database: _dbs
couchdb | [warning] 2020-03-03T22:28:17.599716Z nonode@nohost <0.290.0> -------- creating missing database: _dbs
couchdb | [info] 2020-03-03T22:28:17.622930Z nonode@nohost <0.7.0> -------- Application mem3 started on node nonode@nohost
couchdb | [info] 2020-03-03T22:28:17.623354Z nonode@nohost <0.7.0> -------- Application fabric started on node nonode@nohost
arches | 3
couchdb | [info] 2020-03-03T22:28:17.714398Z nonode@nohost <0.7.0> -------- Application chttpd started on node nonode@nohost
couchdb | [notice] 2020-03-03T22:28:17.739450Z nonode@nohost <0.327.0> -------- chttpd_auth_cache changes listener died database_does_not_exist at mem3_shards:load_shards_from_db/6(line:403) <= mem3_shards:load_shards_from_disk/1(line:378) <= mem3_shards:load_shards_from_disk/2(line:407) <= mem3_shards:for_docid/3(line:91) <= fabric_doc_open:go/3(line:38) <= chttpd_auth_cache:ensure_auth_ddoc_exists/2(line:187) <= chttpd_auth_cache:listen_for_changes/1(line:134)
couchdb | [error] 2020-03-03T22:28:17.740371Z nonode@nohost emulator -------- Error in process <0.328.0> with exit value: {database_does_not_exist,[{mem3_shards,load_shards_from_db,"_users",[{file,"src/mem3_shards.erl"},{line,403}]},{mem3_shards,load_shards_from_disk,1,[{file,"src/mem3_shards.erl"},{line,378}]},{mem3_shards,load_shards_from_disk...
couchdb | [info] 2020-03-03T22:28:17.753066Z nonode@nohost <0.7.0> -------- Application couch_index started on node nonode@nohost
couchdb | [info] 2020-03-03T22:28:17.753368Z nonode@nohost <0.7.0> -------- Application couch_mrview started on node nonode@nohost
couchdb | [info] 2020-03-03T22:28:17.754862Z nonode@nohost <0.7.0> -------- Application couch_plugins started on node nonode@nohost
couchdb | [notice] 2020-03-03T22:28:17.869884Z nonode@nohost <0.82.0> -------- config: [features] scheduler set to true for reason nil
couchdb | [info] 2020-03-03T22:28:17.919379Z nonode@nohost <0.201.0> -------- open_result error {not_found,no_db_file} for _replicator
couchdb | [notice] 2020-03-03T22:28:17.929580Z nonode@nohost <0.344.0> -------- creating replicator ddoc <<"_replicator">>
couchdb | [info] 2020-03-03T22:28:17.949751Z nonode@nohost <0.7.0> -------- Application couch_replicator started on node nonode@nohost
couchdb | [info] 2020-03-03T22:28:17.971023Z nonode@nohost <0.7.0> -------- Application couch_peruser started on node nonode@nohost
couchdb | [info] 2020-03-03T22:28:18.036550Z nonode@nohost <0.7.0> -------- Application ddoc_cache started on node nonode@nohost
couchdb | [info] 2020-03-03T22:28:18.080153Z nonode@nohost <0.7.0> -------- Application global_changes started on node nonode@nohost
couchdb | [info] 2020-03-03T22:28:18.080846Z nonode@nohost <0.7.0> -------- Application jiffy started on node nonode@nohost
couchdb | [info] 2020-03-03T22:28:18.128914Z nonode@nohost <0.7.0> -------- Application mango started on node nonode@nohost
couchdb | [info] 2020-03-03T22:28:18.152576Z nonode@nohost <0.7.0> -------- Application setup started on node nonode@nohost
couchdb | [info] 2020-03-03T22:28:18.153239Z nonode@nohost <0.7.0> -------- Application snappy started on node nonode@nohost
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,565Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [aggs-matrix-stats]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,566Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [analysis-common]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,567Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [flattened]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,567Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [frozen-indices]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,568Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [ingest-common]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,569Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [ingest-geoip]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,569Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [ingest-user-agent]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,571Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [lang-expression]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,572Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [lang-mustache]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,573Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [lang-painless]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,574Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [mapper-extras]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,574Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [parent-join]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,575Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [percolator]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,576Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [rank-eval]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,577Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [reindex]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,578Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [repository-url]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,578Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [search-business-rules]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,579Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [spatial]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,580Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [transform]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,580Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [transport-netty4]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,581Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [vectors]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,581Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [x-pack-analytics]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,582Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [x-pack-ccr]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,582Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [x-pack-core]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,583Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [x-pack-deprecation]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,584Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [x-pack-enrich]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,584Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [x-pack-graph]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,585Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [x-pack-ilm]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,585Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [x-pack-logstash]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,586Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [x-pack-ml]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,586Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [x-pack-monitoring]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,587Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [x-pack-rollup]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,587Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [x-pack-security]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,588Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [x-pack-sql]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,588Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [x-pack-voting-only-node]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,589Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "loaded module [x-pack-watcher]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:18,590Z", "level": "INFO", "component": "o.e.p.PluginsService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "no plugins loaded" }
arches | 2
arches | 1
arches | 0
arches |
arches | Running: python manage.py setup_db --force
db | 2020-03-03 22:28:22.593 UTC [162] postgres@postgres ERROR: template database "template_postgis" does not exist
db | 2020-03-03 22:28:22.593 UTC [162] postgres@postgres STATEMENT:
db | CREATE DATABASE archesdemo
db | WITH OWNER = postgres
db | ENCODING = 'UTF8'
db | CONNECTION LIMIT=-1
db | TEMPLATE = template_postgis;
arches | Drop and recreate the database...
arches |
arches | SELECT pg_terminate_backend(pid) FROM pg_stat_activity
arches | WHERE datname IN ('archesdemo', 'template_postgis');
arches |
arches | DROP DATABASE IF EXISTS archesdemo;
arches |
arches | CREATE DATABASE archesdemo
arches | WITH OWNER = postgres
arches | ENCODING = 'UTF8'
arches | CONNECTION LIMIT=-1
arches | TEMPLATE = template_postgis;
arches |
arches | ERROR: template database "template_postgis" does not exist
arches |
arches | It looks like your PostGIS template database is not correctly referenced in
arches | settings.py/settings_local.py, or it has not yet been created.
arches |
arches | To create it, use:
arches |
arches | psql -U postgres -c "CREATE DATABASE template_postgis;"
arches | psql -U postgres -d template_postgis -c "CREATE EXTENSION postgis;"
arches |
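The hint printed above can be run as a one-off against the `db` container (for example via `docker-compose exec db psql -U postgres`). A minimal sketch, assuming passwordless superuser access from inside the container:

```sql
-- Hypothetical one-off fix: create the PostGIS template database that
-- the Arches setup_db step expects, matching the hint in the log.
CREATE DATABASE template_postgis;
\c template_postgis
CREATE EXTENSION postgis;
```

With `template_postgis` in place, re-running the arches container's setup should let `CREATE DATABASE archesdemo ... TEMPLATE = template_postgis;` succeed, which in turn resolves the later `database "archesdemo" does not exist` failures.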
arches | /web_root/arches/arches/app/search/base_index.py:12: SyntaxWarning: "is" with a literal. Did you mean "=="?
arches | if index_name is None or index_name is "":
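The SyntaxWarning above is unrelated to the database failure but worth noting: CPython warns because `is` compares object identity rather than string contents, so `index_name is ""` only works by accident of string interning. A minimal sketch of the intended comparison (the function name here is illustrative, not from the Arches source):

```python
def index_name_missing(index_name):
    # CPython warns on `index_name is ""` because `is` tests object
    # identity, not string equality; `==` is the intended comparison.
    return index_name is None or index_name == ""

assert index_name_missing(None)
assert index_name_missing("")
assert not index_name_missing("terms")
```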
couchdb | [error] 2020-03-03T22:28:22.730802Z nonode@nohost emulator -------- Error in process <0.465.0> with exit value: {database_does_not_exist,[{mem3_shards,load_shards_from_db,"_users",[{file,"src/mem3_shards.erl"},{line,403}]},{mem3_shards,load_shards_from_disk,1,[{file,"src/mem3_shards.erl"},{line,378}]},{mem3_shards,load_shards_from_disk...
couchdb | [notice] 2020-03-03T22:28:22.732101Z nonode@nohost <0.327.0> -------- chttpd_auth_cache changes listener died database_does_not_exist at mem3_shards:load_shards_from_db/6(line:403) <= mem3_shards:load_shards_from_disk/1(line:378) <= mem3_shards:load_shards_from_disk/2(line:407) <= mem3_shards:for_docid/3(line:91) <= fabric_doc_open:go/3(line:38) <= chttpd_auth_cache:ensure_auth_ddoc_exists/2(line:187) <= chttpd_auth_cache:listen_for_changes/1(line:134)
arches | Running: Creating couchdb system databases
arches | % Total % Received % Xferd Average Speed Time Time Time Current
arches | Dload Upload Total Spent Left Speed
couchdb | [error] 2020-03-03T22:28:23.112609Z nonode@nohost <0.449.0> 7346d25530 Request to create N=3 DB but only 1 node(s)
arches | 100 12 100 12 0 0 34 0 --:--:-- --:--:-- --:--:-- 34
arches | {"ok":true}
couchdb | [notice] 2020-03-03T22:28:23.304387Z nonode@nohost <0.449.0> 7346d25530 couchdb:5984 172.19.0.4 admin PUT /_users 201 ok 283
arches | % Total % Received % Xferd Average Speed Time Time Time Current
arches | Dload Upload Total Spent Left Speed
couchdb | [error] 2020-03-03T22:28:23.321820Z nonode@nohost <0.450.0> 74d710a33a Request to create N=3 DB but only 1 node(s)
arches | 100 12 100 12 0 0 279 0 --:--:-- --:--:-- --:--:-- 279
couchdb | [notice] 2020-03-03T22:28:23.358028Z nonode@nohost <0.450.0> 74d710a33a couchdb:5984 172.19.0.4 admin PUT /_global_changes 201 ok 37
arches | {"ok":true}
arches | % Total % Received % Xferd Average Speed Time Time Time Current
arches | Dload Upload Total Spent Left Speed
couchdb | [error] 2020-03-03T22:28:23.377151Z nonode@nohost <0.451.0> 0e73ce8079 Request to create N=3 DB but only 1 node(s)
arches | 100 12 100 12 0 0 255 0 --:--:-- --:--:-- --:--:-- 255
arches | {"ok":true}
couchdb | [notice] 2020-03-03T22:28:23.418164Z nonode@nohost <0.451.0> 0e73ce8079 couchdb:5984 172.19.0.4 admin PUT /_replicator 201 ok 42
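The repeated `Request to create N=3 DB but only 1 node(s)` errors are a known single-node CouchDB 2.x quirk: databases are still created (note the `201 ok` responses), but each PUT requests the default three replicas. If the noise is unwanted, the default replica count can be lowered in the container's local configuration; a hypothetical fragment (the path `/opt/couchdb/etc/local.d/local.ini` is the image's usual location, not confirmed from this setup):

```ini
; Hypothetical fragment for /opt/couchdb/etc/local.d/local.ini:
; on a single-node deployment, create databases with one replica.
[cluster]
n = 1
```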
arches | psql: error: could not connect to server: FATAL: database "archesdemo" does not exist
arches | Running: python manage.py packages -o import_graphs
db | 2020-03-03 22:28:23.529 UTC [163] postgres@archesdemo FATAL: database "archesdemo" does not exist
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:25,073Z", "level": "INFO", "component": "o.e.x.s.a.s.FileRolesStore", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "parsed [0] roles from file [/usr/share/elasticsearch/config/roles.yml]" }
db | 2020-03-03 22:28:25.215 UTC [164] postgres@archesdemo FATAL: database "archesdemo" does not exist
arches | operation: import_graphs
arches | /web_root/arches/arches/db/graphs/branches/Information Resource Copyright.json
arches | Traceback (most recent call last):
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 217, in ensure_connection
arches | self.connect()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 195, in connect
arches | self.connection = self.get_new_connection(conn_params)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/postgresql/base.py", line 178, in get_new_connection
arches | connection = Database.connect(**conn_params)
arches | File "/web_root/ENV/lib/python3.8/site-packages/psycopg2/__init__.py", line 126, in connect
arches | conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
arches | psycopg2.OperationalError: FATAL: database "archesdemo" does not exist
arches |
arches |
arches | The above exception was the direct cause of the following exception:
arches |
arches | Traceback (most recent call last):
arches | File "manage.py", line 27, in <module>
arches | execute_from_command_line(sys.argv)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/core/management/__init__.py", line 381, in execute_from_command_line
arches | utility.execute()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/core/management/__init__.py", line 375, in execute
arches | self.fetch_command(subcommand).run_from_argv(self.argv)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/core/management/base.py", line 323, in run_from_argv
arches | self.execute(*args, **cmd_options)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/core/management/base.py", line 364, in execute
arches | output = self.handle(*args, **options)
arches | File "/web_root/arches/arches/management/commands/packages.py", line 254, in handle
arches | self.import_graphs(options["source"])
arches | File "/web_root/arches/arches/management/commands/packages.py", line 1038, in import_graphs
arches | ResourceGraphImporter(archesfile["graph"], overwrite_graphs)
arches | File "/web_root/arches/arches/app/utils/data_management/resource_graphs/importer.py", line 69, in import_graph
arches | with transaction.atomic():
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/transaction.py", line 175, in __enter__
arches | if not connection.get_autocommit():
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 379, in get_autocommit
arches | self.ensure_connection()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 217, in ensure_connection
arches | self.connect()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/utils.py", line 89, in __exit__
arches | raise dj_exc_value.with_traceback(traceback) from exc_value
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 217, in ensure_connection
arches | self.connect()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 195, in connect
arches | self.connection = self.get_new_connection(conn_params)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/postgresql/base.py", line 178, in get_new_connection
arches | connection = Database.connect(**conn_params)
arches | File "/web_root/ENV/lib/python3.8/site-packages/psycopg2/__init__.py", line 126, in connect
arches | conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
arches | django.db.utils.OperationalError: FATAL: database "archesdemo" does not exist
arches |
db | 2020-03-03 22:28:25.565 UTC [165] postgres@archesdemo FATAL: database "archesdemo" does not exist
arches | psql: error: could not connect to server: FATAL: database "archesdemo" does not exist
arches | Running: python manage.py packages -o import_reference_data -s "arches/db/schemes/arches_concept_scheme.rdf"
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:26,418Z", "level": "INFO", "component": "o.e.x.m.p.l.CppLogMessageHandler", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "[controller/104] [Main.cc@110] controller (64 bit): Version 7.5.1 (Build ae3c3c51b849be) Copyright (c) 2019 Elasticsearch BV" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:27,779Z", "level": "DEBUG", "component": "o.e.a.ActionModule", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "Using REST wrapper from plugin org.elasticsearch.xpack.security.Security" }
couchdb | [notice] 2020-03-03T22:28:27.801650Z nonode@nohost <0.348.0> -------- couch_replicator_clustering : cluster stable
couchdb | [notice] 2020-03-03T22:28:27.810290Z nonode@nohost <0.364.0> -------- Started replicator db changes listener <0.722.0>
couchdb | [notice] 2020-03-03T22:28:27.876376Z nonode@nohost <0.722.0> -------- creating replicator ddoc <<"shards/80000000-9fffffff/_replicator.1583274503">>
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:28,228Z", "level": "INFO", "component": "o.e.d.DiscoveryModule", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "using discovery type [single-node] and seed hosts providers [settings]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:30,031Z", "level": "INFO", "component": "o.e.n.Node", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "initialized" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:30,031Z", "level": "INFO", "component": "o.e.n.Node", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "starting ..." }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:30,360Z", "level": "INFO", "component": "o.e.t.TransportService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "publish_address {172.19.0.3:9300}, bound_addresses {0.0.0.0:9300}" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:30,684Z", "level": "INFO", "component": "o.e.c.c.Coordinator", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "setting initial configuration to VotingConfiguration{XKbpzIF3TXah1X1JNsIAYQ}" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:30,882Z", "level": "INFO", "component": "o.e.c.s.MasterService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "elected-as-master ([1] nodes joined)[{ceeca20511f1}{XKbpzIF3TXah1X1JNsIAYQ}{bMcZEZawSM2C5I_foGJHnQ}{172.19.0.3}{172.19.0.3:9300}{dilm}{ml.machine_memory=2086154240, xpack.installed=true, ml.max_open_jobs=20} elect leader, _BECOME_MASTER_TASK_, _FINISH_ELECTION_], term: 1, version: 1, delta: master node changed {previous [], current [{ceeca20511f1}{XKbpzIF3TXah1X1JNsIAYQ}{bMcZEZawSM2C5I_foGJHnQ}{172.19.0.3}{172.19.0.3:9300}{dilm}{ml.machine_memory=2086154240, xpack.installed=true, ml.max_open_jobs=20}]}" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:30,951Z", "level": "INFO", "component": "o.e.c.c.CoordinationState", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "cluster UUID set to [ziBse8vDQFq3NdQ6z54p_w]" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:30,982Z", "level": "INFO", "component": "o.e.c.s.ClusterApplierService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "master node changed {previous [], current [{ceeca20511f1}{XKbpzIF3TXah1X1JNsIAYQ}{bMcZEZawSM2C5I_foGJHnQ}{172.19.0.3}{172.19.0.3:9300}{dilm}{ml.machine_memory=2086154240, xpack.installed=true, ml.max_open_jobs=20}]}, term: 1, version: 1, reason: Publication{term=1, version=1}" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:31,177Z", "level": "INFO", "component": "o.e.h.AbstractHttpServerTransport", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "publish_address {172.19.0.3:9200}, bound_addresses {0.0.0.0:9200}", "cluster.uuid": "ziBse8vDQFq3NdQ6z54p_w", "node.id": "XKbpzIF3TXah1X1JNsIAYQ" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:31,178Z", "level": "INFO", "component": "o.e.n.Node", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "started", "cluster.uuid": "ziBse8vDQFq3NdQ6z54p_w", "node.id": "XKbpzIF3TXah1X1JNsIAYQ" }
db | 2020-03-03 22:28:31.220 UTC [166] postgres@archesdemo FATAL: database "archesdemo" does not exist
arches | operation: import_reference_data
arches | Traceback (most recent call last):
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 217, in ensure_connection
arches | self.connect()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 195, in connect
arches | self.connection = self.get_new_connection(conn_params)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/postgresql/base.py", line 178, in get_new_connection
arches | connection = Database.connect(**conn_params)
arches | File "/web_root/ENV/lib/python3.8/site-packages/psycopg2/__init__.py", line 126, in connect
arches | conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
arches | psycopg2.OperationalError: FATAL: database "archesdemo" does not exist
arches |
arches |
arches | The above exception was the direct cause of the following exception:
arches |
arches | Traceback (most recent call last):
arches | File "manage.py", line 27, in <module>
arches | execute_from_command_line(sys.argv)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/core/management/__init__.py", line 381, in execute_from_command_line
arches | utility.execute()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/core/management/__init__.py", line 375, in execute
arches | self.fetch_command(subcommand).run_from_argv(self.argv)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/core/management/base.py", line 323, in run_from_argv
arches | self.execute(*args, **cmd_options)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/core/management/base.py", line 364, in execute
arches | output = self.handle(*args, **options)
arches | File "/web_root/arches/arches/management/commands/packages.py", line 251, in handle
arches | self.import_reference_data(options["source"], options["overwrite"], options["stage"], options["bulk_load"])
arches | File "/web_root/arches/arches/management/commands/packages.py", line 887, in import_reference_data
arches | ret = skos.save_concepts_from_skos(rdf, overwrite, stage)
arches | File "/web_root/arches/arches/app/utils/skos.py", line 80, in save_concepts_from_skos
arches | skos_value_types_list = list(skos_value_types.values_list("valuetype", flat=True))
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/models/query.py", line 256, in __len__
arches | self._fetch_all()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/models/query.py", line 1242, in _fetch_all
arches | self._result_cache = list(self._iterable_class(self))
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/models/query.py", line 182, in __iter__
arches | for row in compiler.results_iter(chunked_fetch=self.chunked_fetch, chunk_size=self.chunk_size):
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/models/sql/compiler.py", line 1085, in results_iter
arches | results = self.execute_sql(MULTI, chunked_fetch=chunked_fetch, chunk_size=chunk_size)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/models/sql/compiler.py", line 1131, in execute_sql
arches | cursor = self.connection.cursor()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 256, in cursor
arches | return self._cursor()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 233, in _cursor
arches | self.ensure_connection()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 217, in ensure_connection
arches | self.connect()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/utils.py", line 89, in __exit__
arches | raise dj_exc_value.with_traceback(traceback) from exc_value
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 217, in ensure_connection
arches | self.connect()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 195, in connect
arches | self.connection = self.get_new_connection(conn_params)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/postgresql/base.py", line 178, in get_new_connection
arches | connection = Database.connect(**conn_params)
arches | File "/web_root/ENV/lib/python3.8/site-packages/psycopg2/__init__.py", line 126, in connect
arches | conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
arches | django.db.utils.OperationalError: FATAL: database "archesdemo" does not exist
arches |
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:31,316Z", "level": "INFO", "component": "o.e.g.GatewayService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "recovered [0] indices into cluster_state", "cluster.uuid": "ziBse8vDQFq3NdQ6z54p_w", "node.id": "XKbpzIF3TXah1X1JNsIAYQ" }
db | 2020-03-03 22:28:31.734 UTC [167] postgres@archesdemo FATAL: database "archesdemo" does not exist
arches | psql: error: could not connect to server: FATAL: database "archesdemo" does not exist
arches | Running: python manage.py packages -o import_reference_data -s "arches/db/schemes/arches_concept_collections.rdf"
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:31,739Z", "level": "INFO", "component": "o.e.c.m.MetaDataIndexTemplateService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "adding template [.watches] for index patterns [.watches*]", "cluster.uuid": "ziBse8vDQFq3NdQ6z54p_w", "node.id": "XKbpzIF3TXah1X1JNsIAYQ" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:31,834Z", "level": "INFO", "component": "o.e.c.m.MetaDataIndexTemplateService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "adding template [.watch-history-10] for index patterns [.watcher-history-10*]", "cluster.uuid": "ziBse8vDQFq3NdQ6z54p_w", "node.id": "XKbpzIF3TXah1X1JNsIAYQ" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:31,930Z", "level": "INFO", "component": "o.e.c.m.MetaDataIndexTemplateService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "adding template [.triggered_watches] for index patterns [.triggered_watches*]", "cluster.uuid": "ziBse8vDQFq3NdQ6z54p_w", "node.id": "XKbpzIF3TXah1X1JNsIAYQ" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:31,987Z", "level": "INFO", "component": "o.e.c.m.MetaDataIndexTemplateService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "adding template [.slm-history] for index patterns [.slm-history-1*]", "cluster.uuid": "ziBse8vDQFq3NdQ6z54p_w", "node.id": "XKbpzIF3TXah1X1JNsIAYQ" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:32,046Z", "level": "INFO", "component": "o.e.c.m.MetaDataIndexTemplateService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "adding template [.monitoring-logstash] for index patterns [.monitoring-logstash-7-*]", "cluster.uuid": "ziBse8vDQFq3NdQ6z54p_w", "node.id": "XKbpzIF3TXah1X1JNsIAYQ" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:32,141Z", "level": "INFO", "component": "o.e.c.m.MetaDataIndexTemplateService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "adding template [.monitoring-es] for index patterns [.monitoring-es-7-*]", "cluster.uuid": "ziBse8vDQFq3NdQ6z54p_w", "node.id": "XKbpzIF3TXah1X1JNsIAYQ" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:32,227Z", "level": "INFO", "component": "o.e.c.m.MetaDataIndexTemplateService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "adding template [.monitoring-beats] for index patterns [.monitoring-beats-7-*]", "cluster.uuid": "ziBse8vDQFq3NdQ6z54p_w", "node.id": "XKbpzIF3TXah1X1JNsIAYQ" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:32,305Z", "level": "INFO", "component": "o.e.c.m.MetaDataIndexTemplateService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "adding template [.monitoring-alerts-7] for index patterns [.monitoring-alerts-7]", "cluster.uuid": "ziBse8vDQFq3NdQ6z54p_w", "node.id": "XKbpzIF3TXah1X1JNsIAYQ" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:32,380Z", "level": "INFO", "component": "o.e.c.m.MetaDataIndexTemplateService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "adding template [.monitoring-kibana] for index patterns [.monitoring-kibana-7-*]", "cluster.uuid": "ziBse8vDQFq3NdQ6z54p_w", "node.id": "XKbpzIF3TXah1X1JNsIAYQ" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:32,456Z", "level": "INFO", "component": "o.e.a.s.m.TransportMasterNodeAction", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "adding index lifecycle policy [watch-history-ilm-policy]", "cluster.uuid": "ziBse8vDQFq3NdQ6z54p_w", "node.id": "XKbpzIF3TXah1X1JNsIAYQ" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:32,539Z", "level": "INFO", "component": "o.e.a.s.m.TransportMasterNodeAction", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "adding index lifecycle policy [slm-history-ilm-policy]", "cluster.uuid": "ziBse8vDQFq3NdQ6z54p_w", "node.id": "XKbpzIF3TXah1X1JNsIAYQ" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:32,726Z", "level": "INFO", "component": "o.e.l.LicenseService", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "license [da78237d-6879-4770-93fb-25c881805f1d] mode [basic] - valid", "cluster.uuid": "ziBse8vDQFq3NdQ6z54p_w", "node.id": "XKbpzIF3TXah1X1JNsIAYQ" }
elasticsearch | {"type": "server", "timestamp": "2020-03-03T22:28:32,728Z", "level": "INFO", "component": "o.e.x.s.s.SecurityStatusChangeListener", "cluster.name": "docker-cluster", "node.name": "ceeca20511f1", "message": "Active license is now [BASIC]; Security is disabled", "cluster.uuid": "ziBse8vDQFq3NdQ6z54p_w", "node.id": "XKbpzIF3TXah1X1JNsIAYQ" }
db | 2020-03-03 22:28:34.991 UTC [168] postgres@archesdemo FATAL: database "archesdemo" does not exist
arches | operation: import_reference_data
arches | Traceback (most recent call last):
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 217, in ensure_connection
arches | self.connect()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 195, in connect
arches | self.connection = self.get_new_connection(conn_params)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/postgresql/base.py", line 178, in get_new_connection
arches | connection = Database.connect(**conn_params)
arches | File "/web_root/ENV/lib/python3.8/site-packages/psycopg2/__init__.py", line 126, in connect
arches | conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
arches | psycopg2.OperationalError: FATAL: database "archesdemo" does not exist
arches |
arches |
arches | The above exception was the direct cause of the following exception:
arches |
arches | Traceback (most recent call last):
arches | File "manage.py", line 27, in <module>
arches | execute_from_command_line(sys.argv)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/core/management/__init__.py", line 381, in execute_from_command_line
arches | utility.execute()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/core/management/__init__.py", line 375, in execute
arches | self.fetch_command(subcommand).run_from_argv(self.argv)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/core/management/base.py", line 323, in run_from_argv
arches | self.execute(*args, **cmd_options)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/core/management/base.py", line 364, in execute
arches | output = self.handle(*args, **options)
arches | File "/web_root/arches/arches/management/commands/packages.py", line 251, in handle
arches | self.import_reference_data(options["source"], options["overwrite"], options["stage"], options["bulk_load"])
arches | File "/web_root/arches/arches/management/commands/packages.py", line 887, in import_reference_data
arches | ret = skos.save_concepts_from_skos(rdf, overwrite, stage)
arches | File "/web_root/arches/arches/app/utils/skos.py", line 80, in save_concepts_from_skos
arches | skos_value_types_list = list(skos_value_types.values_list("valuetype", flat=True))
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/models/query.py", line 256, in __len__
arches | self._fetch_all()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/models/query.py", line 1242, in _fetch_all
arches | self._result_cache = list(self._iterable_class(self))
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/models/query.py", line 182, in __iter__
arches | for row in compiler.results_iter(chunked_fetch=self.chunked_fetch, chunk_size=self.chunk_size):
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/models/sql/compiler.py", line 1085, in results_iter
arches | results = self.execute_sql(MULTI, chunked_fetch=chunked_fetch, chunk_size=chunk_size)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/models/sql/compiler.py", line 1131, in execute_sql
arches | cursor = self.connection.cursor()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 256, in cursor
arches | return self._cursor()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 233, in _cursor
arches | self.ensure_connection()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 217, in ensure_connection
arches | self.connect()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/utils.py", line 89, in __exit__
arches | raise dj_exc_value.with_traceback(traceback) from exc_value
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 217, in ensure_connection
arches | self.connect()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 195, in connect
arches | self.connection = self.get_new_connection(conn_params)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/postgresql/base.py", line 178, in get_new_connection
arches | connection = Database.connect(**conn_params)
arches | File "/web_root/ENV/lib/python3.8/site-packages/psycopg2/__init__.py", line 126, in connect
arches | conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
arches | django.db.utils.OperationalError: FATAL: database "archesdemo" does not exist
arches |
arches |
arches |
arches | ----- RUNNING DATABASE MIGRATIONS -----
arches |
arches | ../entrypoint.sh: line 55: cd: /web_root/archesdemo: No such file or directory
arches | Current work directory: /web_root/archesdemo
db | 2020-03-03 22:28:36.519 UTC [169] postgres@archesdemo FATAL: database "archesdemo" does not exist
arches | Traceback (most recent call last):
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 217, in ensure_connection
arches | self.connect()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 195, in connect
arches | self.connection = self.get_new_connection(conn_params)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/postgresql/base.py", line 178, in get_new_connection
arches | connection = Database.connect(**conn_params)
arches | File "/web_root/ENV/lib/python3.8/site-packages/psycopg2/__init__.py", line 126, in connect
arches | conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
arches | psycopg2.OperationalError: FATAL: database "archesdemo" does not exist
arches |
arches |
arches | The above exception was the direct cause of the following exception:
arches |
arches | Traceback (most recent call last):
arches | File "manage.py", line 27, in <module>
arches | execute_from_command_line(sys.argv)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/core/management/__init__.py", line 381, in execute_from_command_line
arches | utility.execute()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/core/management/__init__.py", line 375, in execute
arches | self.fetch_command(subcommand).run_from_argv(self.argv)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/core/management/base.py", line 323, in run_from_argv
arches | self.execute(*args, **cmd_options)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/core/management/base.py", line 364, in execute
arches | output = self.handle(*args, **options)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/core/management/base.py", line 83, in wrapped
arches | res = handle_func(*args, **kwargs)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/core/management/commands/migrate.py", line 85, in handle
arches | connection.prepare_database()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/contrib/gis/db/backends/postgis/base.py", line 25, in prepare_database
arches | with self.cursor() as cursor:
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 256, in cursor
arches | return self._cursor()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 233, in _cursor
arches | self.ensure_connection()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 217, in ensure_connection
arches | self.connect()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/utils.py", line 89, in __exit__
arches | raise dj_exc_value.with_traceback(traceback) from exc_value
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 217, in ensure_connection
arches | self.connect()
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/base/base.py", line 195, in connect
arches | self.connection = self.get_new_connection(conn_params)
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/db/backends/postgresql/base.py", line 178, in get_new_connection
arches | connection = Database.connect(**conn_params)
arches | File "/web_root/ENV/lib/python3.8/site-packages/psycopg2/__init__.py", line 126, in connect
arches | conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
arches | django.db.utils.OperationalError: FATAL: database "archesdemo" does not exist
arches |
arches | Checking if Arches project archesdemo exists...
arches |
arches | ----- Custom Arches project 'archesdemo' does not exist. -----
arches | ----- Creating 'archesdemo'... -----
arches |
arches | Current work directory: /web_root
arches | Traceback (most recent call last):
arches | File "/web_root/ENV/bin/arches-project", line 6, in <module>
arches | exec(compile(open(__file__).read(), __file__, 'exec'))
arches | File "/web_root/arches/arches/install/arches-project", line 150, in <module>
arches | main()
arches | File "/web_root/arches/arches/install/arches-project", line 143, in main
arches | COMMANDS[args.command](args)
arches | File "/web_root/arches/arches/install/arches-project", line 56, in command_create_app
arches | cmd.handle(options)
arches | File "/web_root/arches/arches/install/arches-project", line 33, in handle
arches | self.validate_name(project_name, "project")
arches | File "/web_root/ENV/lib/python3.8/site-packages/django/core/management/templates.py", line 228, in validate_name
arches | raise CommandError(
arches | django.core.management.base.CommandError: 'archesdemo' conflicts with the name of an existing Python module and cannot be used as a project name. Please try another name.
arches | /web_root/arches/arches/install/arches-project:8: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
arches | import imp
arches | Something went wrong when creating your Arches project: archesdemo.
arches | Exiting...
arches exited with code 1
docker-compose.yml:

version: '2'
services:
arches:
container_name: arches
image: getty/arches:5.0
build:
context: .
dockerfile: ./Dockerfile
command: run_arches
volumes:
- arches-log:/arches/arches/logs
- arches-static:/static_root
environment:
- ARCHES_PROJECT=archesdemo
- INSTALL_DEFAULT_GRAPHS=True
- INSTALL_DEFAULT_CONCEPTS=True
- PGUSERNAME=postgres
- PGPASSWORD=postgres
- PGDBNAME=archesdemo
- PGHOST=db
- PGPORT=5432
- COUCHDB_HOST=couchdb
- COUCHDB_PORT=5984
- COUCHDB_USER=admin
- COUCHDB_PASS=password
- ESHOST=elasticsearch
- ESPORT=9200
- DJANGO_MODE=DEV
- DJANGO_DEBUG=True
# - DJANGO_REMOTE_DEBUG=False
- DOMAIN_NAMES=localhost
- PYTHONUNBUFFERED=0
- TZ=GMT
ports:
- '8000:8000'
depends_on:
- db
- elasticsearch
- couchdb
nginx:
container_name: nginx
image: cvast/cvast-nginx:1.2.0
restart: unless-stopped
ports:
- '80:80'
- '443:443'
volumes:
- arches-static:/www/static
- letsencrypt-acme-challenge:/var/www
- letsencrypt:/etc/letsencrypt
environment:
- NGINX_PROXY_MODE=local
- NGINX_PROTOCOL=http
- LOCAL_PROXY_HOST=arches
- LOCAL_PROXY_PORT=8000
- DOMAIN_NAMES=localhost
- PUBLIC_MODE=False
- TZ=GMT
depends_on:
- arches
db:
container_name: db
image: kartoza/postgis:12.0
volumes:
- postgres-data:/var/lib/postgresql/data
- postgres-log:/var/log/postgresql
- ./arches/install/init-unix.sql:/docker-entrypoint-initdb.d/init.sql # to set up the DB template
ports:
- '5432:5432'
environment:
- POSTGRES_USER=postgres
- POSTGRES_PASS=postgres
- POSTGRES_DB=postgres
- POSTGRES_MULTIPLE_EXTENSIONS=postgis,postgis_topology
- TZ=GMT
couchdb:
container_name: couchdb
image: couchdb:2.1.1
ports:
- "5984:5984"
environment:
COUCHDB_USER: admin
COUCHDB_PASSWORD: password
volumes:
- couchdb-data:/usr/local/var/lib/couchdb
- couchdb-log:/usr/local/var/log/couchdb
elasticsearch:
container_name: elasticsearch
image: elasticsearch:7.5.1
volumes:
- elasticsearch-data:/usr/share/elasticsearch/data
ports:
- "9200:9200"
- "9300:9300"
environment:
- TZ=GMT
- discovery.type=single-node
- discovery.seed_hosts=
- "ES_JAVA_OPTS=-Xms400m -Xmx400m"
letsencrypt:
container_name: letsencrypt
image: cvast/cvast-letsencrypt:1.1
volumes:
- letsencrypt-acme-challenge:/var/www
- letsencrypt:/etc/letsencrypt
- letsencrypt-log:/var/log/letsencrypt
command: get_certificate
environment:
- MODE=regular
- LETSENCRYPT_EMAIL=info@example.com
- DOMAIN_NAMES=localhost
- PRODUCTION_MODE=False
- PERSISTENT_MODE=True
- TZ=GMT
volumes:
arches-log:
arches-static:
couchdb-data:
couchdb-log:
postgres-data:
postgres-log:
elasticsearch-data:
letsencrypt:
letsencrypt-log:
letsencrypt-acme-challenge:
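The repeated `FATAL: database "archesdemo" does not exist` errors in the log above suggest the `arches` entrypoint queried Postgres before the project database had been created; note that compose-v2 `depends_on` only orders container startup and does not wait for Postgres to finish initializing. A hedged workaround, assuming the `postgres`/`postgres` credentials set in this compose file (exact behavior may vary with the `kartoza/postgis` image), is to create the database by hand and let the entrypoint retry:

```shell
# Sketch, not a verified fix: create the missing project database inside
# the running db container, using the user/database names from this
# docker-compose.yml, then restart arches so its entrypoint re-runs
# migrations against the now-existing database.
docker-compose exec db psql -U postgres -c "CREATE DATABASE archesdemo OWNER postgres;"
docker-compose restart arches
```

If the `init-unix.sql` volume mount is expected to create the database template, it only runs on a fresh data volume; an existing `postgres-data` volume from an earlier attempt will skip it.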
@daosborne
Here I'm trying to create an Arches 5.0 installation in Docker, following the instructions at https://github.com/archesproject/arches/blob/master/docker/Readme.md as far as step 6 under Quick Start (`docker-compose up`), using the modified docker-compose.yml included in this gist.

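The final failure in the log (`'archesdemo' conflicts with the name of an existing Python module`) comes from Django's project-name validation in `arches-project create`, which rejects any name that is already importable inside the container. One sketch of a workaround, assuming nothing else depends on the name (`demo_arches` below is an arbitrary example, not a name from this gist), is to pick a non-conflicting project name in docker-compose.yml, keep the database name in sync, and recreate the stack:

```shell
# Hypothetical workaround: in docker-compose.yml, under the arches
# service's environment, rename the project and database together, e.g.:
#   - ARCHES_PROJECT=demo_arches
#   - PGDBNAME=demo_arches
# Then recreate the stack from scratch so the new values take effect.
docker-compose down -v   # -v also removes named volumes, incl. postgres-data
docker-compose up
```

Note that `down -v` discards all the named volumes declared in this compose file (Postgres data, Elasticsearch data, letsencrypt state), so only use it when starting over is acceptable.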