SparkAppHandle states not getting updated in Kubernetes


When launching a Spark application through SparkLauncher, the SparkAppHandle state is never updated.

SparkLauncher sparkLaunch = new SparkLauncher()
.setSparkHome("/root/test/spark-2.4.0-bin-hadoop2.7")
.setMaster("k8s://https://172.16.23.30:6443")
.setVerbose(true)
.addSparkArg("--verbose")
.setAppResource("local:///opt/spark/examples/jars/spark-examples_2.11-2.4.0.jar")
.setConf("spark.app.name","spark-pi")
.setMainClass("org.apache.spark.examples.SparkPi")
.setConf("spark.executor.instances","5")
.setConf("spark.kubernetes.container.image","registry.renovite.com/spark:v2")
.setConf("spark.kubernetes.driver.pod.name","spark-pi-driver")
.setConf("spark.kubernetes.container.image.pullSecrets","dev-registry-key")
.setConf("spark.kubernetes.authenticate.driver.serviceAccountName","spark")
.setDeployMode("cluster")
;

SparkAppHandle handle = sparkLaunch.startApplication();
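For completeness, state changes can also be observed by passing a listener to startApplication. A minimal sketch (assuming the sparkLaunch builder above; stateChanged and infoChanged are the two callbacks of the SparkAppHandle.Listener interface):

```java
import org.apache.spark.launcher.SparkAppHandle;

// Sketch: register a listener at launch time so every state transition
// and app-id update is reported as it happens.
SparkAppHandle handle = sparkLaunch.startApplication(new SparkAppHandle.Listener() {
    @Override
    public void stateChanged(SparkAppHandle h) {
        // Fired on each state transition (CONNECTED, RUNNING, FINISHED, ...).
        System.out.println("State changed to: " + h.getState());
    }

    @Override
    public void infoChanged(SparkAppHandle h) {
        // Fired when other handle info (such as the app id) changes.
        System.out.println("App id: " + h.getAppId());
    }
});
```

Even with this listener registered, I see the same behavior described below.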

Observations:

I tried registering listeners, but handle.getState() always returns UNKNOWN, and when the Spark application completes, the state changes to LOST.
The SparkAppHandle itself is not null.
handle.getAppId() is always null.
My best guess is that the communication between the launcher process and the Spark driver is not working properly in Kubernetes.