How to fix error when launching PySpark in standalone mode


I'm new to PySpark and I tried to launch a PySpark standalone cluster.

  • I launched the master using: bin\spark-class2.cmd org.apache.spark.deploy.master.Master
  • I launched the worker using: bin\spark-class2.cmd org.apache.spark.deploy.worker.Worker -c 2 -m 2G spark://192.168.43.78:7077 (spark://192.168.43.78:7077 is the URL of the master)
  • I launched my code, which is:
import findspark
findspark.init(r'C:\spark\spark-3.0.3-bin-hadoop2.7')  # must run before importing pyspark

from pyspark import SparkConf, SparkContext
from pyspark.sql import SparkSession

conf = SparkConf()
conf.setMaster('spark://192.168.43.78:7077')  # URL of the standalone master
conf.setAppName('firstapp')
sc = SparkContext(conf=conf)
spark = SparkSession(sc)
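
For reference, the equivalent one-step setup with the SparkSession.builder API (a minimal sketch, assuming the same master URL and app name as above) would be:

import findspark
findspark.init(r'C:\spark\spark-3.0.3-bin-hadoop2.7')  # must run before importing pyspark

from pyspark.sql import SparkSession

# builder creates the SparkContext and the SparkSession in one step
spark = SparkSession.builder \
    .master('spark://192.168.43.78:7077') \
    .appName('firstapp') \
    .getOrCreate()
sc = spark.sparkContext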

and I got this error:

ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
ERROR SparkContext: Error initializing SparkContext.
java.lang.NullPointerException

sc = SparkContext(conf=conf)   <--- the line that raises the error

ERROR AsyncEventQueue: Listener AppStatusListener threw an exception
java.lang.NullPointerException

Is there a way to fix this error?
