Docker image of Apache Superset
abhioncbr, January 12, 2019

A couple of days back, I wrote a post about how to run Apache Superset in a production environment serving hundreds or thousands of users. Superset community members and users appreciated the post, for which I am thankful to them; however, on the Superset Slack and Gitter channels, many users asked questions about setting up Superset as a Docker container and how to use and run it. In this post, I explore the Docker image of Superset in more detail, and I hope that after reading it you will have a conceptual understanding of setting up Superset as a Docker container and of its benefits.

Container Image
First, pull the image from Docker Hub, substituting the Superset release you want for <version-tag>:

```bash
docker pull abhioncbr/docker-superset:<version-tag>
```

In cluster mode, the same image runs either as a Celery worker or as the Superset web server. Both containers mount the config directory as a volume and take the Superset metadata database URL and the Redis URL as arguments; the server publishes the web UI on port 8088:

```bash
cd docker-superset && docker run -p 5555:5555 -v config:/home/superset/config/ abhioncbr/docker-superset:<version-tag> cluster worker <superset_metadata_db_url> <redis_url>
cd docker-superset && docker run -p 8088:8088 -v config:/home/superset/config/ abhioncbr/docker-superset:<version-tag> cluster server <superset_metadata_db_url> <redis_url>
```

Alternatively, bring the whole stack up with docker-compose, either pinning a Superset version for production or relying on the defaults from the .env file (a sketch of such a compose file follows below):

```bash
cd docker-superset/docker-files/ && SUPERSET_ENV=prod SUPERSET_VERSION=<version-tag> docker-compose up -d
cd docker-superset/docker-files/ && docker-compose up -d
```
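The actual docker-files/docker-compose.yml ships in the repository; purely as an illustration, a minimal compose file for this setup might look like the sketch below. The service names, the Postgres metadata database, and the URL wiring are my assumptions, not the repository's exact contents:

```yaml
# Hypothetical sketch of docker-files/docker-compose.yml -- service
# names and wiring are assumptions, not the repository's actual file.
version: "2"

services:
  redis:                # Celery broker / results backend
    image: redis:3.2
    restart: unless-stopped

  postgres:             # Superset metadata database
    image: postgres:9.6
    restart: unless-stopped
    environment:
      POSTGRES_DB: superset
      POSTGRES_USER: superset
      POSTGRES_PASSWORD: superset

  superset-server:      # web UI on port 8088
    image: abhioncbr/docker-superset:${SUPERSET_VERSION}
    command: ["cluster", "server", "postgresql://superset:superset@postgres:5432/superset", "redis://redis:6379/0"]
    ports:
      - "8088:8088"
    volumes:
      - ../config:/home/superset/config/
    depends_on:
      - postgres
      - redis

  superset-worker:      # Celery worker
    image: abhioncbr/docker-superset:${SUPERSET_VERSION}
    command: ["cluster", "worker", "postgresql://superset:superset@postgres:5432/superset", "redis://redis:6379/0"]
    ports:
      - "5555:5555"
    volumes:
      - ../config:/home/superset/config/
    depends_on:
      - postgres
      - redis
```

Because docker-compose reads variable defaults from the .env file sitting next to the compose file, the second command above works without exporting SUPERSET_ENV or SUPERSET_VERSION by hand.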
All of the commands above assume the following layout of the docker-superset repository (a sketch of superset_config.py follows below):

```
docker-superset
   |__config
   |    |__superset_config.py
   |
   |__docker-files
   |    |__docker-compose.yml
   |    |__.env
```
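The post does not reproduce superset_config.py, so here is a rough illustration of what a minimal file for this setup could contain. The keys are standard Superset configuration options; the database and Redis URLs are placeholders matching the arguments passed to the containers above:

```python
# Hypothetical sketch of config/superset_config.py -- URLs and the
# secret key are placeholders; adjust them to your environment.
ROW_LIMIT = 5000
SUPERSET_WEBSERVER_PORT = 8088

# Used by Flask to sign session cookies -- change it in production.
SECRET_KEY = "change-me"

# Superset metadata database (the <superset_metadata_db_url> argument).
SQLALCHEMY_DATABASE_URI = "postgresql://superset:superset@postgres:5432/superset"

# Cache chart/query results in Redis (the <redis_url> argument).
CACHE_CONFIG = {
    "CACHE_TYPE": "redis",
    "CACHE_DEFAULT_TIMEOUT": 300,
    "CACHE_KEY_PREFIX": "superset_",
    "CACHE_REDIS_URL": "redis://redis:6379/0",
}

# Celery settings so that SQL Lab queries run on the worker container.
class CeleryConfig(object):
    BROKER_URL = "redis://redis:6379/0"
    CELERY_IMPORTS = ("superset.sql_lab",)
    CELERY_RESULT_BACKEND = "redis://redis:6379/1"

CELERY_CONFIG = CeleryConfig
```

Since the config directory is mounted into both containers, the server and the worker pick up the same file.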
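Once the containers are up, a quick sanity check from the host looks like the snippet below; the port mapping matches the run commands above, and /health is the liveness endpoint of upstream Superset (an assumption about this image, since its entrypoint wraps the stock server):

```bash
# List running containers started from the Superset image.
docker ps --filter "ancestor=abhioncbr/docker-superset:<version-tag>"

# Upstream Superset answers 200 OK on its /health endpoint.
curl -i http://localhost:8088/health
```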
