package org.apache.spark.graphx

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.rdd.RDD
import org.apache.spark._

object repl {
  val sc = new SparkContext("local", "test")  //> sc : org.apache.spark.SparkContext = org.apache.spark.SparkContext@3724af13

  val vertices = sc.parallelize(1L to 5L)     //> vertices : org.apache.spark.rdd.RDD[Long] = ParallelCollectionRDD[0] at parallelize at org.apache.spark.graphx.repl.scala:15
  println(vertices.count)                     //> 5
}
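For anyone who would rather sidestep the worksheet quirks entirely, the same snippet can be run as a plain standalone application instead. This is a minimal sketch assuming a Spark 2.x dependency on the classpath; the object name `WorksheetDemo` and the `local[*]` master URL are my own choices, not part of the gist:

```scala
import org.apache.spark.SparkContext

// Standalone version of the worksheet snippet above.
// Assumes Spark 2.x artifacts are on the classpath; run it from the
// IDE's normal Run configuration or via spark-submit.
object WorksheetDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local[*]", "worksheet-demo")
    try {
      // Same data as the worksheet: five Long values distributed as an RDD.
      val vertices = sc.parallelize(1L to 5L)
      println(vertices.count()) // prints 5, as in the worksheet output
    } finally {
      sc.stop() // always release the local Spark context
    }
  }
}
```

Running it as an ordinary `main` avoids the worksheet's compilation model altogether, so none of the IntelliJ worksheet settings discussed below come into play.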
I use IntelliJ CE 2016.3 and Spark 2.0.2, and I run the Scala worksheet in eclipse compatibility mode. So far most things work; there is only one minor problem left.
Open Preferences -> type "scala" -> under Languages & Frameworks, choose Scala -> choose Worksheet -> select only "eclipse compatibility" mode.
@detectivebag - Would you be so kind as to provide a sample of the code you run? I have tried the exact same code as RAbraham with the settings you recommend, and I still get errors. I have been searching for this answer for quite a while and would love to solve this issue.
@detectivebag eclipse compatibility mode solves the issue, thanks!
Yes, it solved it for me also. There are four options there (IntelliJ IDEA Community Edition 2016.3 on macOS):
- Run worksheet in the compiler
- Run worksheet in the interactive mode
- Use "eclipse compatibility" mode
- Treat Scala scratch files as worksheet files.
It worked for me when I checked only the third option and left the other three unchecked.
Still works! Thank you!
Spark code is not working in a Scala worksheet; trying it in the Eclipse IDE instead.