My program runs fine in client mode, but when I try to run it in cluster mode it fails. The reason is that the Python version on the cluster nodes is different.
I am trying to set the Python driver path when my application runs in cluster mode.
Below is my spark-submit command in cluster mode:
spark-submit --master yarn --deploy-mode cluster --num-executors 10 --executor-cores 3 --driver-memory 50G --executor-memory 20G \
--conf spark.dynamicAllocation.enabled=false \
--conf spark.kryoserializer.buffer.max=1024 --conf spark.yarn.keytab=keytab_path --conf spark.yarn.principal=${10} \
--conf spark.yarn.appMasterEnv.PYSPARK_PYTHON=/bin/python3 \
--jars path_to_jars \
--py-files Pipeline.egg-info,<path>/app.py <application_path>/app.py arguments
Below is the error:
22/08/04 06:09:34 INFO yarn.ApplicationMaster: Starting the user application in a separate Thread
22/08/04 06:09:34 INFO yarn.ApplicationMaster: Waiting for spark context initialization...
22/08/04 06:09:34 ERROR yarn.ApplicationMaster: User application exited with status 1
22/08/04 06:09:34 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 13, (reason: User application exited with status 1)
22/08/04 06:09:34 ERROR yarn.ApplicationMaster: Uncaught exception:
org.apache.spark.SparkException: Exception thrown in awaitResult:
at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:226)
at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:447)
at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:275)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:805)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:804)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:804)
at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Caused by: org.apache.spark.SparkUserAppException: User application exited with 1
at org.apache.spark.deploy.PythonRunner$.main(PythonRunner.scala:106)
at org.apache.spark.deploy.PythonRunner.main(PythonRunner.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:673)
22/08/04 06:09:34 INFO yarn.ApplicationMaster: Deleting staging directory hdfs://test-scc/user/tst_rdip_cross/.sparkStaging/application_1643123069214_48871
22/08/04 06:09:35 INFO util.ShutdownHookManager: Shutdown hook called
On exploring the console logs and application logs we didn't find the cause of the error. We then explored the YARN logs and found that the Python version is incompatible on the cluster nodes.
Can someone please help?
Thanks in advance
You can use spark.pyspark.python and spark.pyspark.driver.python to set your Python paths via --conf when you submit your Spark job, provided you are using Spark version >= 2.1.0.
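For example, a minimal sketch of the submit command (the interpreter path /usr/bin/python3 and the application path are placeholders; point them at the same Python version installed on every node):

# spark.pyspark.driver.python -> interpreter used by the driver
# spark.pyspark.python        -> interpreter used by the executors
spark-submit --master yarn --deploy-mode cluster \
  --conf spark.pyspark.driver.python=/usr/bin/python3 \
  --conf spark.pyspark.python=/usr/bin/python3 \
  <application_path>/app.py arguments

Note that these properties take precedence over the PYSPARK_DRIVER_PYTHON and PYSPARK_PYTHON environment variables, so they are a more reliable way to pin the interpreter than setting env vars on each node.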