I hereby claim:
- I am joshrosen on github.
- I am joshrosen (https://keybase.io/joshrosen) on keybase.
- I have a public key whose fingerprint is DD10 7726 FCC2 C9F2 6BBE 0688 5CDE B147 5FD1 9FBE
To claim this, I am signing this object:
The following gist is an extract from the article Flask-SQLAlchemy Caching. It enables simple automated query caching and event-driven invalidation of cached relations, among other features.
# pulling one User object
user = User.query.get(1)
Several Apache Spark APIs rely on the ability to serialize Scala closures. Closures may reference non-Serializable objects, preventing them from being serialized. In some cases (SI-1419 and others), however, these references are unnecessary and can be nulled out, allowing otherwise-unserializable closures to be serialized (in Spark, this nulling is performed by the ClosureCleaner).
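To make the capture problem concrete, here is a minimal sketch (the Driver class and its methods are hypothetical, not from any of these gists): a closure that reads a field of a non-serializable enclosing class captures the enclosing instance, while copying the field into a local val first removes that reference entirely.

import org.apache.spark.SparkContext

// Hypothetical sketch of an unnecessary outer reference.
class Driver(sc: SparkContext) {  // Driver does NOT extend Serializable
  val offset = 42

  // Reading this.offset inside the lambda captures the Driver
  // instance, so Spark would have to serialize the whole
  // (non-serializable) object and the job fails with
  // "Task not serializable".
  def broken(): Long =
    sc.parallelize(1 to 100).map(x => x + offset).count()

  // Copying the field into a local val means the lambda captures only
  // an Int. The ClosureCleaner automates this kind of pruning when a
  // captured reference turns out to be genuinely unused.
  def fixed(): Long = {
    val localOffset = offset
    sc.parallelize(1 to 100).map(x => x + localOffset).count()
  }
}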
Scala 2.12's use of Java 8 lambdas for implementing closures appears to have broken our ability to serialize closures which contain local defs. If we cannot resolve this problem, Spark will be unable to support Scala 2.12 and will be stuck on 2.10 and 2.11 forever.
As an example which illustrates this problem, the following closure has a nested local def and is defined inside of a non-serializable class:
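The code block itself is missing from this extract; what follows is a sketch of the kind of closure described (NotSerializableDemo and localDef are illustrative names, not necessarily those in the original gist):

import org.apache.spark.SparkContext

// Illustrative reconstruction: a closure with a nested local def,
// defined inside a class that does not extend Serializable.
class NotSerializableDemo(sc: SparkContext) {
  def run(): Array[Int] = {
    def localDef(x: Int): Int = x + 1  // nested local def
    // In 2.12, localDef is lifted to a method of the enclosing class,
    // so the lambda below can end up capturing the non-serializable
    // enclosing instance solely in order to call it.
    sc.parallelize(1 to 10).map(i => localDef(i)).collect()
  }
}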
diff --git a/.generated-mima-member-excludes b/member-excludes-new
index 1ebc496..5c4b58c 100644
--- a/.generated-mima-member-excludes
+++ b/member-excludes-new
@@ -1,3 +1,7 @@
+akka.actor.Actor.aroundPostStop
+akka.actor.Actor.aroundPreRestart
+akka.actor.Actor.aroundPreStart
+akka.actor.Actor.aroundReceive
 com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassVisitor.api
diff --git a/.generated-mima-class-excludes b/generate-class-excludes-new
index 68d31fa..7d3b0b6 100644
--- a/.generated-mima-class-excludes
+++ b/generate-class-excludes-new
@@ -6,17 +6,13 @@ org.apache.spark.AccumulatorParam$StringAccumulatorParam$
 org.apache.spark.AccumulatorParam$UpdatedBlockStatusesAccumulatorParam$
 org.apache.spark.Accumulators
 org.apache.spark.Accumulators#
-org.apache.spark.Accumulators#
 org.apache.spark.Accumulators$
This code is part of a Jinja CMS for Google App Engine, Python 2.7, and the NDB datastore. A Jinja environment is created for every CMS site: site_key_id = 'example'. The modules are created using compiler.py; the resulting code objects are stored in the datastore using the kind Runtimes and a BlobProperty. The modules can also be saved / downloaded as .pyc files in a zip archive: -compiled-templates.zip
import org.apache.spark.rdd.RDD

// read a text file, cache it as a single-column DataFrame, then pull
// the lines back out as an RDD[String]
val fileToRead = "/path/to/my/file"
val df = sc.textFile(fileToRead).map(l => Tuple1(l)).toDF("line").cache
val rdd: RDD[String] = df.rdd.map(_.getString(0))
import org.apache.spark._

object Main {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().set("spark.speculation", "true")
    // local[2, 4] = 2 worker threads, allowing up to 4 task failures,
    // so speculative / retried tasks can be exercised locally
    val sc = new SparkContext("local[2, 4]", "test", conf)
    //sc.setLogLevel("DEBUG")
    sc.hadoopConfiguration.set("mapred.output.committer.class", classOf[MyOutputCommitter].getCanonicalName)
    val tempDir = java.nio.file.Files.createTempDirectory("outputcommitter-test")
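The MyOutputCommitter class referenced above is not included in this extract. A minimal sketch, assuming it only needs to extend Hadoop's old-API FileOutputCommitter (matching the mapred.output.committer.class key set above) and log commits:

import org.apache.hadoop.mapred.{FileOutputCommitter, TaskAttemptContext}

// Sketch only: the real committer from the gist is not shown. This
// version logs each task commit so that races between speculative
// task attempts are visible.
class MyOutputCommitter extends FileOutputCommitter {
  override def commitTask(context: TaskAttemptContext): Unit = {
    println("commitTask: " + context.getTaskAttemptID)
    super.commitTask(context)
  }
}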
""" | |
Spaghetti code to delete comments from AmplabJenkins. | |
""" | |
import os | |
import sys | |
import requests | |
from link_header import parse as parse_link_header | |
import logging | |
import json |
[INFO] --- maven-dependency-plugin:2.10:tree (default-cli) @ spark-core_2.11 ---
[INFO] org.apache.spark:spark-core_2.11:jar:1.4.1
[INFO] +- com.google.guava:guava:jar:14.0.1:provided
[INFO] +- com.twitter:chill_2.11:jar:0.5.0:compile
[INFO] |  \- com.esotericsoftware.kryo:kryo:jar:2.21:compile
[INFO] |     +- com.esotericsoftware.reflectasm:reflectasm:jar:shaded:1.07:compile
[INFO] |     +- com.esotericsoftware.minlog:minlog:jar:1.2:compile
[INFO] |     \- org.objenesis:objenesis:jar:1.2:compile
[INFO] +- com.twitter:chill-java:jar:0.5.0:compile
[INFO] +- org.apache.hadoop:hadoop-client:jar:2.2.0:compile