I have a Spark job deployed on Kubernetes, running Spark 3.3.2. Recently some vulnerabilities were reported in Spark 3.3.2,
so I changed my Dockerfile to download 3.4.0 instead of 3.3.2, and my application jar is now also built against Spark 3.4.0.
However, while deploying I get this error:
Exception in thread "main" java.nio.file.NoSuchFileException: <path>/spark-assembly-1.0.jar
where "spark-assembly-1.0.jar" is the jar that contains my Spark job.
I have this in the app's deployment.yaml:
mainApplicationFile: "local:///<path>/spark-assembly-1.0.jar"
and I have not changed anything related to that. I see that some code in Spark 3.4.0's core source has changed regarding the jar location.
Has the behaviour really changed? Is anyone else facing the same issue? Should the path be specified in a different way?
I hit this same issue. I believe the behaviour change was introduced in:
Our Dockerfile was overriding the working directory (WORKDIR) of the base Spark image; restoring the base image's default working directory solved the problem.
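For illustration, a minimal sketch of that kind of fix (the exact base image tag and copy destination here are assumptions, not taken from the original post): the official Apache Spark images set their working directory to /opt/spark/work-dir, so if a custom Dockerfile changes WORKDIR along the way, setting it back as the last step keeps the image's layout as Spark 3.4.0 expects:

```dockerfile
FROM apache/spark:3.4.0

# Custom build steps may have moved the working directory, e.g.:
# WORKDIR /app

# Copy the application jar into the image (destination path is illustrative).
COPY spark-assembly-1.0.jar /opt/spark/work-dir/

# Restore the base image's default working directory so spark-submit
# resolves the application jar path as expected.
WORKDIR /opt/spark/work-dir
```

The mainApplicationFile in deployment.yaml would then point at the same path inside the image, e.g. local:///opt/spark/work-dir/spark-assembly-1.0.jar (again, adjust to wherever the jar actually lives in your image).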
Hope this helps!