Using FunSuite to test Spark throws NullPointerException


I would like to use FunSuite to test my Spark jobs by extending FunSuite with a new function, localTest, which runs a test with a default SparkContext:



import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.FunSuite

class SparkFunSuite extends FunSuite {

  def localTest(name: String)(f: SparkContext => Unit): Unit = {
    val conf = new SparkConf().setAppName(name).setMaster("local")
    val sc = new SparkContext(conf)
    try {
      this.test(name)(f(sc))
    } finally {
      sc.stop()
    }
  }
}


Then I can add tests easily to my testing suites:



class MyTestSuite extends SparkFunSuite {

  localTest("My Spark test") { sc =>
    assertResult(2)(sc.parallelize(Seq(1, 2, 3)).filter(_ <= 2).map(_ + 1).count)
  }
}


The problem is that when I run the tests I get a NullPointerException:



[info] MyTestSuite:
[info] - My Spark test *** FAILED ***
[info] java.lang.NullPointerException:
[info] at org.apache.spark.SparkContext.defaultParallelism(SparkContext.scala:1215)
[info] at org.apache.spark.SparkContext.parallelize$default$2(SparkContext.scala:435)
[info] at MyTestSuite$$anonfun$1.apply(FunSuiteTest.scala:24)
[info] at MyTestSuite$$anonfun$1.apply(FunSuiteTest.scala:23)
[info] at SparkFunSuite$$anonfun$localTest$1.apply$mcV$sp(FunSuiteTest.scala:13)
[info] at SparkFunSuite$$anonfun$localTest$1.apply(FunSuiteTest.scala:13)
[info] at SparkFunSuite$$anonfun$localTest$1.apply(FunSuiteTest.scala:13)
[info] at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
[info] at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
[info] at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info] ...


What is causing the NullPointerException? Am I using Spark incorrectly in this context?


I'm using Scala 2.10.4 with spark-core 1.0.2 and scalatest 2.2.2.
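
In case it helps: my current suspicion is that FunSuite's test(name)(body) only registers the test body to be run later, after the suite's constructor has finished, which would mean the finally block above stops the SparkContext before the registered body ever executes. Here is a minimal sketch of localTest reworked under that assumption, creating and stopping the context inside the registered test body:


import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.FunSuite

class SparkFunSuite extends FunSuite {

  def localTest(name: String)(f: SparkContext => Unit): Unit = {
    // Defer everything Spark-related into the registered test body, so the
    // SparkContext is created and stopped when ScalaTest actually runs the
    // test, not while it is still constructing the suite.
    this.test(name) {
      val conf = new SparkConf().setAppName(name).setMaster("local")
      val sc = new SparkContext(conf)
      try {
        f(sc)
      } finally {
        sc.stop()
      }
    }
  }
}


Is that diagnosis correct, and is this the idiomatic way to manage the SparkContext lifecycle in ScalaTest?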




