# oshinko python36 experimental s2i builder template
apiVersion: v1
kind: Template
labels:
  application: oshinko-python-spark
  createdBy: template-oshinko-python36-spark-build-dc
metadata:
  annotations:
    description: Create a buildconfig, imagestream and deploymentconfig using source-to-image and Python Spark source files hosted in git
    openshift.io/display-name: Apache Spark Python
  name: oshinko-python36-spark-build-dc
objects:
- apiVersion: v1
  kind: ImageStream
  metadata:
    name: ${APPLICATION_NAME}
    labels:
      app: ${APPLICATION_NAME}
  spec:
    dockerImageRepository: ${APPLICATION_NAME}
    tags:
    - name: latest
- apiVersion: v1
  kind: BuildConfig
  metadata:
    name: ${APPLICATION_NAME}
    labels:
      app: ${APPLICATION_NAME}
  spec:
    output:
      to:
        kind: ImageStreamTag
        name: ${APPLICATION_NAME}:latest
    source:
      contextDir: ${CONTEXT_DIR}
      git:
        ref: ${GIT_REF}
        uri: ${GIT_URI}
      type: Git
    strategy:
      sourceStrategy:
        env:
        - name: APP_FILE
          value: ${APP_FILE}
        forcePull: true
        from:
          kind: DockerImage
          name: elmiko/radanalytics-pyspark:python36-latest
      type: Source
    triggers:
    - imageChange: {}
      type: ImageChange
    - type: ConfigChange
    - github:
        secret: ${APPLICATION_NAME}
      type: GitHub
    - generic:
        secret: ${APPLICATION_NAME}
      type: Generic
- apiVersion: v1
  kind: DeploymentConfig
  metadata:
    name: ${APPLICATION_NAME}
    labels:
      deploymentConfig: ${APPLICATION_NAME}
      app: ${APPLICATION_NAME}
  spec:
    replicas: 1
    selector:
      deploymentConfig: ${APPLICATION_NAME}
    strategy:
      type: Rolling
    template:
      metadata:
        labels:
          deploymentConfig: ${APPLICATION_NAME}
          app: ${APPLICATION_NAME}
      spec:
        containers:
        - env:
          - name: DRIVER_HOST
            value: ${APPLICATION_NAME}-headless
          - name: OSHINKO_CLUSTER_NAME
            value: ${OSHINKO_CLUSTER_NAME}
          - name: APP_ARGS
            value: ${APP_ARGS}
          - name: SPARK_OPTIONS
            value: ${SPARK_OPTIONS}
          - name: OSHINKO_DEL_CLUSTER
            value: ${OSHINKO_DEL_CLUSTER}
          - name: APP_EXIT
            value: "true"
          - name: OSHINKO_NAMED_CONFIG
            value: ${OSHINKO_NAMED_CONFIG}
          - name: OSHINKO_SPARK_DRIVER_CONFIG
            value: ${OSHINKO_SPARK_DRIVER_CONFIG}
          - name: POD_NAME
            valueFrom:
              fieldRef:
                fieldPath: metadata.name
          image: ${APPLICATION_NAME}
          imagePullPolicy: IfNotPresent
          name: ${APPLICATION_NAME}
          resources: {}
          terminationMessagePath: /dev/termination-log
          volumeMounts:
          - mountPath: /etc/podinfo
            name: podinfo
            readOnly: false
        dnsPolicy: ClusterFirst
        restartPolicy: Always
        serviceAccount: oshinko
        volumes:
        - downwardAPI:
            items:
            - fieldRef:
                fieldPath: metadata.labels
              path: labels
          name: podinfo
    triggers:
    - imageChangeParams:
        automatic: true
        containerNames:
        - ${APPLICATION_NAME}
        from:
          kind: ImageStreamTag
          name: ${APPLICATION_NAME}:latest
      type: ImageChange
    - type: ConfigChange
- apiVersion: v1
  kind: Service
  metadata:
    name: ${APPLICATION_NAME}
    labels:
      app: ${APPLICATION_NAME}
  spec:
    ports:
    - name: 8080-tcp
      port: 8080
      protocol: TCP
      targetPort: 8080
    selector:
      deploymentConfig: ${APPLICATION_NAME}
- apiVersion: v1
  kind: Service
  metadata:
    name: ${APPLICATION_NAME}-headless
    labels:
      app: ${APPLICATION_NAME}
  spec:
    clusterIP: None
    ports:
    - name: driver-rpc-port
      port: 7078
      protocol: TCP
      targetPort: 7078
    - name: blockmanager
      port: 7079
      protocol: TCP
      targetPort: 7079
    selector:
      deploymentConfig: ${APPLICATION_NAME}
parameters:
- description: The name to use for the buildconfig, imagestream and deployment components
  from: 'python-spark-[a-z0-9]{4}'
  generate: expression
  name: APPLICATION_NAME
  required: true
- description: The URL of the repository with your application source code
  displayName: Git Repository URL
  name: GIT_URI
- description: Optional branch, tag or commit
  displayName: Git Reference
  name: GIT_REF
- description: Git sub-directory path
  name: CONTEXT_DIR
- description: The name of the main py file to run. If this is not specified and there is a single py file at the top level of the git repository, that file will be chosen.
  name: APP_FILE
- description: Command line arguments to pass to the Spark application
  name: APP_ARGS
- description: List of additional Spark options to pass to spark-submit (for example, --conf property=value --conf property=value). Note, --master and --class are set by the launcher and should not be set here.
  name: SPARK_OPTIONS
- description: The name of the Spark cluster to run against. The cluster will be created if it does not exist, and a random cluster name will be chosen if this value is left blank.
  name: OSHINKO_CLUSTER_NAME
- description: The name of a stored cluster configuration to use if a cluster is created, default is 'default'.
  name: OSHINKO_NAMED_CONFIG
- description: The name of a configmap to use for the Spark configuration of the driver. If this configmap is empty the default Spark configuration will be used.
  name: OSHINKO_SPARK_DRIVER_CONFIG
- description: If a cluster is created on-demand, delete the cluster when the application finishes if this option is set to 'true'
  name: OSHINKO_DEL_CLUSTER
  required: true
  value: 'true'
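
A sketch of how the template might be used, assuming it is saved as `template.yaml`, you have a logged-in `oc` session in a project where the `oshinko` service account exists, and a Git repository of your own (the `GIT_URI` and `APP_FILE` values below are placeholders, not part of the template):

```shell
# Upload the template object into the current project
oc create -f template.yaml

# Instantiate it; GIT_URI and APP_FILE are illustrative values --
# substitute your own repository and driver script
oc new-app --template=oshinko-python36-spark-build-dc \
  -p GIT_URI=https://github.com/example/my-pyspark-app.git \
  -p APP_FILE=app.py
```

`oc new-app` processes the template with the given parameters and creates the ImageStream, BuildConfig, DeploymentConfig, and both Services; parameters not supplied on the command line (such as `APPLICATION_NAME`) fall back to the defaults or generated expressions defined above.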