To use custom endpoints with the latest Spark distribution, you need to add an external package (hadoop-aws). Custom endpoints can then be configured according to the docs.
bin/spark-shell --packages org.apache.hadoop:hadoop-aws:2.7.2
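For an S3-compatible store, the endpoint is set through Hadoop `fs.s3a.*` properties passed as Spark configuration. A hedged sketch; the endpoint URL and credential values below are placeholders, not from the original doc:

```shell
# Launch spark-shell with hadoop-aws and point s3a at a custom endpoint.
# http://localhost:9000 and the key values are illustrative placeholders.
bin/spark-shell --packages org.apache.hadoop:hadoop-aws:2.7.2 \
  --conf spark.hadoop.fs.s3a.endpoint=http://localhost:9000 \
  --conf spark.hadoop.fs.s3a.access.key=YOUR_ACCESS_KEY \
  --conf spark.hadoop.fs.s3a.secret.key=YOUR_SECRET_KEY
```

The `spark.hadoop.` prefix forwards each property to the Hadoop configuration, so `fs.s3a.endpoint` reaches the s3a filesystem client.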
FROM redhat/ubi10-minimal:latest
# JDK_VERSION was previously undefined; set it from the JDK version in the download URL below.
ENV JDK_VERSION="jdk-25.0.1+8"
ENV JAVA_HOME="/usr/lib/jvm/${JDK_VERSION}"
RUN microdnf install -y tar gzip strace
RUN mkdir -p "${JAVA_HOME}"
RUN curl -#LfS --retry 8 "https://api.adoptium.net/v3/binary/version/jdk-25.0.1+8/linux/aarch64/jdk/hotspot/normal/eclipse?project=jdk" -o /jdktarfile
ENV QEMU_STRACE=1
RUN tar -zxf /jdktarfile --strip 1 -C "${JAVA_HOME}"
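The `--strip 1` flag in the final `RUN` is GNU tar shorthand for `--strip-components=1`: it drops the archive's single top-level directory so the JDK contents land directly in `JAVA_HOME`. A minimal local sketch of that behavior (the temporary paths and stub file are illustrative, using the portable spelling):

```shell
# Build a tarball with one top-level directory, as JDK archives have,
# then extract while stripping that directory.
tmp=$(mktemp -d)
cd "$tmp"
mkdir -p jdk-25.0.1+8/bin
echo 'stub' > jdk-25.0.1+8/bin/java
tar -czf jdktarfile.tgz jdk-25.0.1+8
mkdir jvm
tar -zxf jdktarfile.tgz --strip-components=1 -C jvm
ls jvm   # bin/ appears directly, with no jdk-25.0.1+8/ wrapper
```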
#!/bin/bash
# This works for workers only; the coordinator doesn't support graceful shutdown.
# This script blocks until the server has actually shut down.
set -x
http_port="$(grep 'http-server.http.port' /usr/lib/presto/etc/config.properties | sed 's/^.*=\(.*\)$/\1/')"
https_port="$(grep 'http-server.https.port' /usr/lib/presto/etc/config.properties | sed 's/^.*=\(.*\)$/\1/')"
if [ -n "$http_port" ] ; then
res=$(curl -s -o /dev/null -w "%{http_code}" -XPUT --data '"SHUTTING_DOWN"' -H "Content-Type: application/json" "http://localhost:${http_port}/v1/info/state")
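The grep/sed pipeline in the script pulls the value after `=` from the matching property line. A standalone sketch against a fabricated `config.properties` (the file contents here are illustrative, not from a real Presto install):

```shell
tmp=$(mktemp -d)
cat > "$tmp/config.properties" <<'EOF'
coordinator=false
http-server.http.port=8080
EOF
# Same extraction as the shutdown script: keep everything after '='.
http_port="$(grep 'http-server.http.port' "$tmp/config.properties" | sed 's/^.*=\(.*\)$/\1/')"
echo "$http_port"
```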
#!/bin/bash
#==============================================================================================================
# Title       : Merge.sh
# Description : Merge two repositories using a graft and git filter-branch.
# Author      : oneonestar
# Date        : 20150610
# Version     : 1.2
# Usage       : ./Merge.sh [fork repo] [your original repo]
# Example     : ./Merge.sh git@github.com:pmembrey/uppercaser.git git@github.com:oneonestar/LSP_ROT13
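The header describes grafting one repository's history beneath another and then running git filter-branch to make the rewrite permanent. A minimal local sketch of that idea using `git replace --graft` (the modern equivalent of a `.git/info/grafts` entry); the repo names, identities, and commit messages are fabricated for illustration:

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"

# "fork" stands in for the fork repo, "mine" for your original repo.
git init -q fork && (cd fork \
  && git config user.email you@example.com && git config user.name you \
  && echo a > f && git add f && git commit -qm 'fork: initial')

git init -q mine && cd mine
git config user.email you@example.com && git config user.name you
echo b > g && git add g && git commit -qm 'mine: initial'

# Fetch the fork's history and graft it beneath our root commit.
git fetch -q ../fork HEAD
root=$(git rev-list --max-parents=0 HEAD)
git replace --graft "$root" FETCH_HEAD

# Rewrite history so the graft becomes real parentage.
FILTER_BRANCH_SQUELCH_WARNING=1 git filter-branch -f -- --all >/dev/null
git log --oneline   # two commits: the fork's history beneath ours
```

`git filter-branch` follows replace refs while rewriting, which is what bakes the temporary graft into permanent commit parentage.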