Using databricks dbutils in spark submit (Scala) - Null pointer exception


I'm trying to use dbutils in Scala Spark code that I submit to Databricks with spark-submit, but I'm getting a null pointer exception.

import com.databricks.dbutils_v1.DBUtilsHolder.dbutils

try {
  val s3_ls = dbutils.fs.ls(targetS3Dir)
}
catch {
  case e: Exception => logger.error(e)
}

I have added the following dependency in build.sbt:

"com.databricks" %% "dbutils-api" % "0.0.4"

I'm even adding com.databricks:dbutils-api:0.0.4 to --packages in spark-submit.

I'm building a jar and passing it to the spark-submit command, but I'm still getting the null pointer exception.

Is there anything I'm missing here?

1 Answer
This library is just a placeholder so that you can compile your code locally (and you need to mark this dependency as provided), but it doesn't contain the actual implementation. You don't need to include it when submitting a job, because the real jar is part of the Databricks Runtime.
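
As a sketch, the build.sbt line from the question would then look like this (same artifact and version as in the question, with the provided scope added):

```scala
// Mark dbutils-api as provided: it is only needed at compile time,
// because the Databricks Runtime supplies the real implementation at run time.
"com.databricks" %% "dbutils-api" % "0.0.4" % "provided"
```

With this scope, sbt uses the jar for compilation but leaves it out of the assembled artifact, so the runtime's own dbutils is the one resolved when the job runs on Databricks.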