I am trying to execute an Oozie 5.1 workflow on a Spark cluster. The jar I am trying to run is built with Scala 2.11.
I am getting the following error:
Caused by: java.lang.IllegalStateException: Library directory '/yarn/nm/usercache/analytics/appcache/application_1701161245298_15213/container_e12_1701161245298_15213_01_000001/./assembly/target/scala-2.11/jars' does not exist; make sure Spark is built.
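(The path in the error is the YARN container's working directory plus assembly/target/scala-2.11/jars, which is the jar layout of a Spark source build, so Spark appears to be resolving its home directory to the container's working directory rather than to the installed Spark.)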
My workflow definition is:
<workflow-app xmlns="uri:oozie:workflow:1.0" name="master-workflow">
    <start to="spark-workflow1"/>
    <action name="spark-workflow1">
        <spark xmlns="uri:oozie:spark-action:1.0">
            <!-- Define spark action configuration for workflow1 -->
            <master>yarn</master>
            <name>File Write Oozie Job</name>
            <class>com.crifhighmark.CrifAdhocTasks.FileCreationWithStateArohan</class>
            <jar>${nameNode}/analytics_files/SSP/spark_jar/DataExtractSSPMain-0.0.1-SNAPSHOT.jar</jar>
            <spark-opts>
                --executor-memory 2G
                --num-executors 4
                --driver-memory 2G
                --executor-cores 1
                --conf spark.sql.shuffle.partitions=1000
                --conf spark.serializer=org.apache.spark.serializer.KryoSerializer
                --conf spark.datasource.hive.warehouse.metastoreUri="thrift://<HOST>:<PORT>"
                --conf spark.hadoop.metastore.catalog.default=hive
                --jars /BigData/analytics_files/SSP/Workflow/lib/ojdbc7.jar, /opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p2000.37147774/lib/spark/jars/*.jar
                --files /etc/hive/conf/hive-site.xml
                --conf spark.yarn.keytab=/home/analytics/analytics.keytab
--conf [email protected]
            </spark-opts>
            <arg>${customerID}</arg>
            <arg>${product}</arg>
            <arg>${selectColumn}</arg>
            <arg>${filterColumnName}</arg>
            <arg>${filterColumnValues}</arg>
            <arg>${viewName}</arg>
            <arg>${wf:id()}</arg>
            <arg>${userName}</arg>
            <arg>${outputPath}</arg>
        </spark>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Workflow failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
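For completeness, the workflow is parameterized through a job.properties along these lines (a minimal sketch; every value below is an anonymized placeholder, and the oozie.use.system.libpath line reflects my understanding that the share lib is normally enabled for a spark action):

# job.properties (sketch; all values are placeholders)
nameNode=hdfs://<NAMENODE_HOST>:8020
# assumption: Oozie share lib enabled so the spark action can find its libraries
oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/<PATH_TO_WORKFLOW_DIR>
customerID=<CUSTOMER_ID>
product=<PRODUCT>
selectColumn=<SELECT_COLUMN>
filterColumnName=<FILTER_COLUMN_NAME>
filterColumnValues=<FILTER_COLUMN_VALUES>
viewName=<VIEW_NAME>
userName=<USER_NAME>
outputPath=<OUTPUT_PATH>

The job is submitted with the standard Oozie CLI:

oozie job -oozie http://<OOZIE_HOST>:11000/oozie -config job.properties -run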
I have checked all the path locations, jar files, the thrift URL, the property file values, etc. The job gets submitted, but it always fails with the error above. The cluster runs CDH 7.1.7 (per the parcel path in --jars).
Can anyone throw some light on this?