Livy pyspark Python Session Error in Jupyter with Spark Magic - ERROR repl.PythonInterpreter: Process has died with 1
I'm running a Spark v2.0.0 standalone cluster, with Livy running beside the Spark master.
I have set up a Jupyter Python 3 notebook, installed Spark Magic, and followed the necessary instructions to connect Spark Magic to Livy.
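For reference, the Spark Magic side of the wiring is just the Livy endpoint in ~/.sparkmagic/config.json. A minimal sketch of the relevant section, assuming Livy is running without auth:

{
  "kernel_python_credentials": {
    "username": "",
    "password": "",
    "url": "http://spark-master:8998"
  }
}

When I create a session from the Manage Spark widget, I get an error message in the notebook: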
Added endpoint http://spark-master:8998
Starting Spark application
ID YARN Application ID Kind State Spark UI Driver log Current session?
0 None pyspark idle ✔
---------------------------------------------------------------------------
LivyUnexpectedStatusException Traceback (most recent call last)
/opt/conda/lib/python3.5/site-packages/hdijupyterutils/ipywidgetfactory.py in submit_clicked(self, button)
63
64 def submit_clicked(self, button):
---> 65 self.parent_widget.run()
/opt/conda/lib/python3.5/site-packages/sparkmagic/controllerwidget/createsessionwidget.py in run(self)
56
57 try:
---> 58 self.spark_controller.add_session(alias, endpoint, skip, properties)
59 except ValueError as e:
60 self.ipython_display.send_error("""Could not add session with
/opt/conda/lib/python3.5/site-packages/sparkmagic/livyclientlib/sparkcontroller.py in add_session(self, name, endpoint, skip_if_exists, properties)
79 session = self._livy_session(http_client, properties, self.ipython_display)
80 self.session_manager.add_session(name, session)
---> 81 session.start()
82
83 def get_session_id_for_client(self, name):
/opt/conda/lib/python3.5/site-packages/sparkmagic/livyclientlib/livysession.py in start(self)
148 else:
149 command = Command("sqlContext")
--> 150 (success, out) = command.execute(self)
151 if success:
152 self.ipython_display.writeln(u"SparkContext available as 'sc'.")
/opt/conda/lib/python3.5/site-packages/sparkmagic/livyclientlib/command.py in execute(self, session)
29 statement_id = -1
30 try:
---> 31 session.wait_for_idle()
32 data = {u"code": self.code}
33 response = session.http_client.post_statement(session.id, data)
/opt/conda/lib/python3.5/site-packages/sparkmagic/livyclientlib/livysession.py in wait_for_idle(self, seconds_to_wait)
238 .format(self.id, self.status)
239 self.logger.error(error)
--> 240 raise LivyUnexpectedStatusException(u'{} See logs:\n{}'.format(error, self.get_logs()))
241
242 if seconds_to_wait <= 0.0:
LivyUnexpectedStatusException: Session 0 unexpectedly reached final status 'error'. See logs:
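The exception only points back at the Livy session logs, which can also be pulled directly from the Livy REST API. A sketch in Python, assuming the endpoint above and session id 0:

import requests

# GET /sessions/<id>/log returns a JSON body with a "log" field (a list of lines)
resp = requests.get("http://spark-master:8998/sessions/0/log", params={"size": 100})
for line in resp.json()["log"]:
    print(line)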
This is the error I see in the Livy logs when creating a new session from the Manage Spark section of Jupyter:
17/02/10 13:06:08 INFO StateStore$: Using BlackholeStateStore for recovery.
17/02/10 13:06:08 INFO BatchSessionManager: Recovered 0 batch sessions. Next session id: 0
17/02/10 13:06:08 INFO InteractiveSessionManager: Recovered 0 interactive sessions. Next session id: 0
17/02/10 13:06:08 INFO InteractiveSessionManager: Heartbeat watchdog thread started.
17/02/10 13:06:08 INFO WebServer: Starting server on http://spark-master:8998
17/02/10 13:06:34 INFO InteractiveSession$: Creating LivyClient for sessionId: 0
17/02/10 13:06:34 WARN RSCConf: Your hostname, spark-master, resolves to a loopback address, but we couldn't find any external IP address!
17/02/10 13:06:34 WARN RSCConf: Set livy.rsc.rpc.server.address if you need to bind to another address.
17/02/10 13:06:35 INFO InteractiveSessionManager: Registering new session 0
17/02/10 13:06:35 INFO ContextLauncher: 17/02/10 13:06:35 INFO driver.RSCDriver: Starting RPC server...
17/02/10 13:06:35 INFO ContextLauncher: 17/02/10 13:06:35 WARN rsc.RSCConf: Set livy.rsc.rpc.server.address if you need to bind to another address.
17/02/10 13:06:35 INFO ContextLauncher: 17/02/10 13:06:35 INFO driver.RSCDriver: Received job request 3ca8a52b-8dd5-41f0-8151-a8201d72d422
17/02/10 13:06:35 INFO ContextLauncher: 17/02/10 13:06:35 INFO driver.RSCDriver: SparkContext not yet up, queueing job request.
17/02/10 13:06:36 INFO ContextLauncher: Setting default log level to "WARN".
17/02/10 13:06:36 INFO ContextLauncher: To adjust logging level use sc.setLogLevel(newLevel).
17/02/10 13:06:36 INFO ContextLauncher: 17/02/10 13:06:36 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/02/10 13:06:37 INFO ContextLauncher: 17/02/10 13:06:37 ERROR repl.PythonInterpreter: Process has died with 1
17/02/10 13:06:37 INFO RSCClient: Received result for 3ca8a52b-8dd5-41f0-8151-a8201d72d422
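The RSCConf warnings above name livy.rsc.rpc.server.address explicitly. I'm not sure the loopback resolution is related, but for completeness that key can be pinned in Livy's client configuration; a one-line sketch, assuming the stock conf/livy-client.conf location:

# conf/livy-client.conf (assumed path): bind the RSC RPC server to a reachable address
livy.rsc.rpc.server.address = spark-master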
I'm unable to put my finger on what the exact issue or fix is. I can create a session successfully if I set the session language to Scala instead of Python; the error appears only when the session language is Python.
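For what it's worth, the pyspark-only failure should be reproducible directly against the Livy REST API, taking Jupyter and Spark Magic out of the picture. A sketch in Python (the endpoint is the one above; the polling loop is my own):

import json
import time

import requests

LIVY = "http://spark-master:8998"
headers = {"Content-Type": "application/json"}

# POST /sessions with kind "pyspark"; kind "spark" (Scala) comes up fine for me
resp = requests.post(LIVY + "/sessions",
                     data=json.dumps({"kind": "pyspark"}),
                     headers=headers)
session_url = LIVY + resp.headers["location"]

# Poll until the session leaves its startup states
state = requests.get(session_url).json()["state"]
while state in ("not_started", "starting"):
    time.sleep(2)
    state = requests.get(session_url).json()["state"]

print("final state:", state)  # "idle" on success; I expect "error" for pyspark

If someone knows a solution to connecting a livy-repl pyspark session in Jupyter, please let me know!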