@andrassy
Created January 29, 2015 22:16
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.elasticsearch.spark._
import org.elasticsearch.spark.rdd.EsSpark

// Reproduction case: indexing a document whose parent id contains a
// backslash, with the parent taken from the "parentId" field.
object ParentWithBackslash {
  def main(args: Array[String]) {
    val conf = new SparkConf()
    conf.set("es.nodes", "localhost:9200")
    conf.set("es.mapping.parent", "parentId")
    val sc = new SparkContext(conf)
    // Triple-quoted string, so the backslash in "a\s" is literal, not an escape
    val simpleDocWithParent = Map("itemId" -> "1", "parentId" -> """a\s""")
    val docs = sc.makeRDD(Seq(simpleDocWithParent))
    docs.saveToEs("sparkdemo2/revenue", Map("es.mapping.id" -> "itemId"))
  }
}

costin commented Jan 30, 2015

Just tried this in master and it works correctly:

val doc = Map("itemId" -> "1", "parent" -> """a\s""")
sc.makeRDD(Seq(doc)).saveToEs("spark-test/escaped-char", Map("es.mapping.id" -> "itemId"))
Tx [PUT]@[192.168.1.50:9500][spark-test/escaped-char/_bulk] w/ payload [{"index":{"_id":"1"}}
{"itemId":"1","parent":"a\\s"}
]

andrassy (Author) commented:

It's not the writing of the field value in the source that's failing; it's the setting of the parent id (i.e. the parent/child relationship).

val doc = Map("itemId" -> "1", "parent" -> """a\s""")
sc.makeRDD(Seq(doc)).saveToEs("spark-test/escaped-char", Map("es.mapping.id" -> "itemId", "es.mapping.parent" -> "parent"))

I've managed to get the case class version working using a Metadata/Source pair RDD, though, setting both ID and PARENT in the metadata Map (roughly as sketched below). I still can't get the JSON or plain Map versions to work.
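
For reference, a minimal sketch of that working pair-RDD variant, assuming the saveToEsWithMeta API and reusing the index/field names from the examples above:

import org.elasticsearch.spark._
import org.elasticsearch.spark.rdd.Metadata._

// Each element is a (metadata, source) pair: ID and PARENT are set
// explicitly in the metadata Map rather than mapped from document fields.
val source = Map("itemId" -> "1")
val meta   = Map(ID -> "1", PARENT -> """a\s""")
sc.makeRDD(Seq((meta, source))).saveToEsWithMeta("spark-test/escaped-char")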

Incidentally, the _parent mapping already exists in my Elasticsearch index.
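
Such a pre-existing _parent mapping would look something like this (hypothetical: the actual parent type name isn't shown in this thread):

curl -XPUT 'localhost:9200/sparkdemo2/revenue/_mapping' -d '{
  "revenue": {
    "_parent": { "type": "parentType" }
  }
}'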
