FROM phusion/baseimage
MAINTAINER name
# Install basic packages; chain with && so the build fails fast on any error
RUN \
  apt-get update && apt-get upgrade -y -qq && \
  apt-get install -y -qq wget curl && \
  apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*
LOG_DIR empty; logging will go to /tmp/job-server
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
[2016-05-01 18:35:28,375] INFO spark.jobserver.JobServer$ [] [] - Starting JobServer with config {
  # system properties
  "app" : {
    # system properties
    "name" : "spark.jobserver.JobServer"
  },
  # merge of /app/docker.conf: 39,application.conf: 101
  # universal context configuration. These settings can be overridden, see README.md
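The startup log above shows spark-jobserver merging `/app/docker.conf` over its bundled `application.conf` (HOCON layering). An override file might look roughly like this; `spark.master` and `spark.jobserver.port` are standard spark-jobserver keys, but the concrete values here are illustrative assumptions, not the settings from this deployment:

```hocon
# Hypothetical /app/docker.conf fragment layered over application.conf
spark {
  master = "local[4]"        # assumed: run everything in the driver JVM
  jobserver {
    port = 8090              # the jobserver's default REST port
  }
}
```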
16/05/11 13:01:28 INFO DAGScheduler: ResultStage 29 (map at DrilldownArtist.scala:337) finished in 2,172 s
16/05/11 13:01:28 INFO DAGScheduler: Job 13 finished: map at DrilldownArtist.scala:337, took 38,416625 s
[error] (run-main-0) org.apache.spark.SparkException: Job aborted due to stage failure: Task 697 in stage 1.0 failed 1 times, most recent failure: Lost task 697.0 in stage 1.0 (TID 705, localhost): ExecutorLostFailure (executor driver exited caused by one of the running tasks) Reason: Executor heartbeat timed out after 156451 ms
[error] Driver stacktrace:
16/05/11 13:01:28 INFO MapOutputTrackerMaster: Size of output statuses for shuffle 12 is 14186 bytes
org.apache.spark.SparkException: Job aborted due to stage failure: Task 697 in stage 1.0 failed 1 times, most recent failure: Lost task 697.0 in stage 1.0 (TID 705, localhost): ExecutorLostFailure (executor driver exited caused by one of the running tasks) Reason: Executor heartbeat timed out after 156451 ms
Driver stacktrace:
	at org.apache.sp
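A heartbeat timeout like the one above usually means the executor was stalled (long GC pauses or an overloaded `local` driver), not that it crashed. One common mitigation is to raise the relevant timeouts and give the driver more memory. `spark.network.timeout` and `spark.executor.heartbeatInterval` are real Spark configuration keys; the values, jar path, and memory size below are illustrative assumptions, not tuned recommendations:

```shell
# Raise heartbeat/network timeouts so slow-but-alive tasks are not killed,
# and give the single local driver/executor JVM more headroom.
spark-submit \
  --conf spark.network.timeout=600s \
  --conf spark.executor.heartbeatInterval=60s \
  --driver-memory 4g \
  --class jobs.outlier.DrilldownArtist \
  target/scala-2.10/myjob.jar
```

If raising timeouts only delays the failure, the underlying cause is typically memory pressure in the task itself (e.g. a skewed partition at the `map` in DrilldownArtist.scala:337).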
package jobs.outlier
import java.sql.Timestamp
import java.text.SimpleDateFormat
import play.api.libs.json._
import play.api.libs.ws.WSResponse
import play.api.libs.ws.ning.NingWSClient
import scala.concurrent.duration._
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<xsd:schema version="1.0" xmlns:xsd="http://www.w3.org/2001/XMLSchema"
            xmlns:vc="http://www.w3.org/2007/XMLSchema-versioning" vc:minVersion="1.1"
            targetNamespace="http://some/awesome/schema"
            xmlns="http://some/awesome/schema">
    <xsd:element name="myElem" type="myElem"/>
    <xsd:complexType name="myElem">
        <xsd:sequence>
            <xsd:element name="geburtsdatum" type="xsd:date" minOccurs="0"/>
        </xsd:sequence>
    </xsd:complexType>
</xsd:schema>
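An instance document for the schema above might look like this (a minimal sketch; the date value is an arbitrary placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- geburtsdatum (German for "date of birth") has minOccurs="0",
     so the element may also be omitted entirely -->
<myElem xmlns="http://some/awesome/schema">
    <geburtsdatum>1981-01-01</geburtsdatum>
</myElem>
```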
FROM jenkinsci/jenkins:2.11
MAINTAINER geoHeil
USER root
RUN \
  apt-get update; apt-get upgrade -y -qq; \
  apt-get install -y -qq wget; \
  apt-get install -y -qq git; \
  apt-get install -y -qq tar; \
FROM openjdk:8-jdk
MAINTAINER geoHeil
# Environment variables
ENV PIO_VERSION 0.10.0-incubating
ENV SPARK_VERSION 1.6.2
ENV ELASTICSEARCH_VERSION 1.7.5
ENV HBASE_VERSION 1.0.3
# Base paths
package foo
import java.sql.Date
import org.apache.log4j.{Level, Logger}
import org.apache.spark.SparkConf
import org.apache.spark.sql.expressions.WindowSpec
import org.apache.spark.sql.functions._
import org.apache.spark.sql.{Column, SparkSession}
package at.ac.tuwien.thesis.problem
import org.apache.log4j.{Level, Logger}
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
case class FooBar(city: String, postcode: String)
object Foo extends App {
import java.sql.Date
import org.apache.log4j.{Level, Logger}
import org.apache.spark.SparkConf
import org.apache.spark.ml.feature.VectorAssembler
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
object Foo extends App {