java.lang.NullPointerException SparkContext: Error initializing SparkContext

Hi,

I'm getting errors with both spark-shell and pyspark. With pyspark the error appears at launch; with spark-shell it appears only when I try to save to ES.
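For context, the save that fails in spark-shell is just the basic connector call; a minimal sketch of what I'm running (the index/type name and document fields below are placeholders, not my actual data):

import org.elasticsearch.spark._   // elasticsearch-spark 2.1.0: adds saveToEs to RDDs

// sc is the SparkContext that spark-shell provides
val docs = sc.makeRDD(Seq(Map("title" -> "test", "views" -> 1)))
docs.saveToEs("myindex/mytype")    // placeholder index/type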

Here is what I'm getting:

Starting pyspark (Spark 1.4.0) on Windows:

$ pyspark --packages org.elasticsearch:elasticsearch-spark_2.10:2.1.0
AppData\Local\Temp\spark-6a5be2b6-1492-4774-9d97-6ef60d718db0\userFiles-c7f551b1-1ed1-4aa6-8bd1-43c2fff1e76d\org.elasticsearch_elasticsearch-spark_2.10-2.1.0.jar
15/08/25 08:19:35 ERROR SparkContext: Error initializing SparkContext.
java.lang.NullPointerException
at java.lang.ProcessBuilder.start(Unknown Source)
at org.apache.hadoop.util.Shell.runCommand(Shell.java:482)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
...
at py4j.GatewayConnection.run(GatewayConnection.java:207)
at java.lang.Thread.run(Unknown Source)
15/08/25 08:19:35 INFO SparkUI: Stopped Spark web UI at http://192.168.1.8:4040
15/08/25 08:19:35 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
15/08/25 08:19:35 ERROR SparkContext: Error stopping SparkContext after init error.
java.lang.NullPointerException
at org.apache.spark.network.netty.NettyBlockTransferService.close(NettyBlockTransferService.scala:152)
at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1214)
at org.apache.spark.SparkEnv.stop(SparkEnv.scala:96)
at org.apache.spark.SparkContext.stop(SparkContext.scala:1657)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:565)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
...
at py4j.GatewayConnection.run(GatewayConnection.java:207)
at java.lang.Thread.run(Unknown Source)
Traceback (most recent call last):
File "[folder]spark-1.4.0-bin-hadoop2.6\bin..\python\pyspark\shell.py", line 43, in
sc = SparkContext(appName="PySparkShell", pyFiles=add_files)
File "[folder]spark-1.4.0-bin-hadoop2.6\python\pyspark\context.py", line 113, in init
conf, jsc, profiler_cls)
File "[folder]spark-1.4.0-bin-hadoop2.6\python\pyspark\context.py", line 165, in _do_init
self._jsc = jsc or self._initialize_context(self._conf._jconf)
File "[folder]spark-1.4.0-bin-hadoop2.6\python\pyspark\context.py", line 219, in _initialize_context
return self._jvm.JavaSparkContext(jconf)
File "[folder]spark-1.4.0-bin-hadoop2.6\python\lib\py4j-0.8.2.1-src.zip\py4j\java_gateway.py", line 701, in call
File "[folder]spark-1.4.0-bin-hadoop2.6\python\lib\py4j-0.8.2.1-src.zip\py4j\protocol.py", line 300, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.NullPointerException

This means that Spark couldn't initialize the SparkContext, so perhaps you're passing incorrect parameters.

How are you launching spark-shell or pyspark?
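For comparison, here's a minimal standalone app that builds the context with everything set explicitly; this is only a sketch, and the app name, master, and ES host/port are illustrative values, not taken from your logs:

import org.apache.spark.{SparkConf, SparkContext}

object EsTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("es-test")          // illustrative app name
      .setMaster("local[*]")          // run locally on all cores
      .set("es.nodes", "localhost")   // assumed ES host
      .set("es.port", "9200")         // default ES REST port
    val sc = new SparkContext(conf)
    sc.stop()
  }
}

If even this fails with the same NullPointerException, the problem is in your environment rather than in the job parameters.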

Hi eliasah,
I get the same error. I'm logging in through the Cloudera Hue option.


Hi, I get the same error. Did you solve the problem?