Thursday, 19 February 2015

I am not sure whether the application is running on just the master or the whole cluster for Spark on EC2






I am using Spark 1.1.1. I followed the instructions at http://ift.tt/1CQrlrc and have an EC2 cluster running with 1 master node and 1 worker.


I built a jar of the application and rsynced it to the slaves. When I submit the application with spark-submit in client deploy mode, it works. However, when I submit it in cluster deploy mode, it fails with an error saying the jar cannot be found on the worker. The jar has permissions 755 on both the master and the worker.
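For reference, the two invocations I am comparing look roughly like this (the class name, jar path, and master host below are placeholders, not my actual values):

```shell
# Client mode: the driver runs on the machine where spark-submit is invoked,
# so a jar path local to that machine is enough.
spark-submit \
  --class com.example.MyApp \
  --master spark://ec2-master-host:7077 \
  --deploy-mode client \
  /home/ec2-user/myapp.jar

# Cluster mode: the driver is launched on one of the workers, and that worker
# resolves the jar path on its OWN filesystem. The jar must exist at the same
# path on every worker (e.g. rsynced in advance), or be supplied via a
# globally visible URL such as hdfs:// or http://.
spark-submit \
  --class com.example.MyApp \
  --master spark://ec2-master-host:7077 \
  --deploy-mode cluster \
  /home/ec2-user/myapp.jar
```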


I am also not sure whether the application actually uses the worker when I run it with deploy-mode=client. I suspect it does not, since the worker's web UI shows no completed jobs, although it does show failed jobs when I use deploy-mode=cluster.


Am I doing something wrong? Thank you for your help.










