@Krasnyanskiy
Created July 24, 2016 23:00
scala: spark: accumulator
val acc = sc.accumulator(0)
sc.parallelize(Seq(1, 2, 3)).map(acc += _).count() // count() is an action; it forces the lazy map, so the accumulator actually gets incremented
println(acc.value)
@Krasnyanskiy (Author)

Use foreach instead of map. foreach is an action, so the update runs immediately and no extra count() is needed to trigger the computation:

sc.parallelize(Seq(1, 2, 3)).foreach(acc += _)
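
For context, a self-contained sketch of the foreach version using the newer AccumulatorV2 API (sc.longAccumulator, available since Spark 2.0). The object name, app name, and local master setting are illustrative assumptions, not part of the original gist:

import org.apache.spark.{SparkConf, SparkContext}

object AccumulatorExample {                          // hypothetical object name
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("accumulator-demo").setMaster("local[*]"))

    val acc = sc.longAccumulator("sum")              // Spark 2.x replacement for sc.accumulator(0)

    // foreach is an action, so it triggers the job directly; no extra count() needed
    sc.parallelize(Seq(1, 2, 3)).foreach(x => acc.add(x))

    println(acc.value)                               // prints 6; accumulators are read on the driver only

    sc.stop()
  }
}

With map, the increment is wrapped in a lazy transformation that only executes once an action such as count() runs, which is why the original snippet needs that extra call.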
