I am trying to pass a custom log4j configuration file to the Spark executors.
So far I am using the --files param to spark-submit and can confirm that the file is indeed shipped to the executor working directory. However, when I try to use that file by passing --conf spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties to spark-submit, I see a FileNotFound error in the executor logs. After that, I see a message that the config file is being downloaded and placed inside the executor work dir.
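For reference, this is roughly the invocation I am using (the master URL, paths, class name, and jar are placeholders, not my real values):

```shell
spark-submit \
  --master mesos://mesos-master:5050 \
  --deploy-mode cluster \
  --files /local/path/to/log4j.properties \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
  --class com.example.MyApp \
  my-app.jar
```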
To me it seems that, given the order of the log messages, the log4j config file location is resolved first and the --files are downloaded only afterwards, which would mean there is no way to set the executor logging config using spark-submit alone.
Can anyone confirm this? Is there a workaround if my suspicion is correct? If I am wrong, where is my mistake in composing the spark-submit parameters?
I am using Spark 3.2 on Mesos.