PairRDD.saveToEsWithMeta Object not found

I am attempting the saveToEsWithMeta example from the Spark support page of the Elasticsearch for Hadoop documentation.

https://www.elastic.co/guide/en/elasticsearch/hadoop/master/spark.html

import org.elasticsearch.spark.rdd.Metadata._

val otp = Map("iata" -> "OTP", "name" -> "Otopeni")
val muc = Map("iata" -> "MUC", "name" -> "Munich")
val sfo = Map("iata" -> "SFO", "name" -> "San Fran")

// metadata for each document
// note it's not required for them to have the same structure
val otpMeta = Map(ID -> 1, TTL -> "3h")
val mucMeta = Map(ID -> 2, VERSION -> "23")
val sfoMeta = Map(ID -> 3)

// instance of SparkContext
val sc = ...

val airportsRDD = sc.makeRDD(Seq((otpMeta, otp), (mucMeta, muc), (sfoMeta, sfo)))
pairRDD.saveToEsWithMeta(airportsRDD, "airports/2015")

I cannot get it to compile because the pairRDD object is not found:

[error] /testspark/src/main/scala/testspark.scala:43: not found: value PairRDD
[error] PairRDD.saveToEsWithMeta(airportsRDD, "airports/2015")
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 31 s, completed Mar 15, 2016 3:44:46 PM

I'm using sbt; here is my build.sbt:

name := "testspark"

version := "1.0"

scalaVersion := "2.10.6"

libraryDependencies += "org.elasticsearch" % "elasticsearch-spark_2.10" % "2.2.0"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.0"

I've also tried Spark 1.5.2 with the same results.

Note: I do have a SparkContext defined, which I've used for the earlier examples in the documentation.

I'm sure I must be missing something somewhere.

Thanks,

An update to this issue: I was able to use the EsSpark object to run the example successfully.

...

val otp = Map("iata" -> "OTP", "name" -> "Otopeni")
val muc = Map("iata" -> "MUC", "name" -> "Munich")
val sfo = Map("iata" -> "SFO", "name" -> "San Fran")

// metadata for each document
// note it's not required for them to have the same structure
val otpMeta = Map(ID -> 1, TTL -> "3h")
val mucMeta = Map(ID -> 2, VERSION -> "23")
val sfoMeta = Map(ID -> 3)


val airportsRDD = sc.makeRDD(Seq((otpMeta, otp), (mucMeta, muc), (sfoMeta, sfo)))

import org.elasticsearch.spark.rdd.EsSpark

EsSpark.saveToEsWithMeta(airportsRDD, "airports/2015")

Not sure if this is correct, but it works. The documentation could be improved a bit.

Sorry for the typo. Besides using EsSpark, you could have just used
airportsRDD.saveToEsWithMeta("airports/2015"), which works as long as you have import org.elasticsearch.spark._ specified.
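For reference, here is a minimal sketch of that implicit form, assuming a configured SparkContext named sc as in the original example (the index name "airports/2015" is carried over from above):

```scala
import org.apache.spark.SparkContext
import org.elasticsearch.spark._              // adds saveToEsWithMeta to pair RDDs implicitly
import org.elasticsearch.spark.rdd.Metadata._ // ID, TTL, VERSION metadata keys

// assumed to be created and configured elsewhere, as in the original example
val sc: SparkContext = ???

val otp = Map("iata" -> "OTP", "name" -> "Otopeni")
val otpMeta = Map(ID -> 1)

// pair RDD of (metadata, document)
val airportsRDD = sc.makeRDD(Seq((otpMeta, otp)))

// the method comes from the implicit enrichment in org.elasticsearch.spark,
// so it is called on the RDD itself -- there is no PairRDD object to reference
airportsRDD.saveToEsWithMeta("airports/2015")
```

Either form works; the implicit call just saves you the explicit EsSpark import.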