I am trying to build a Spark application to access SAP HANA Vora content.
My Scala code is:
import org.apache.spark.sql._
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object Vora_Test {
  def main(args: Array[String]) {
    val sconf = new SparkConf().setAppName("VoraTestApp")
    val sc = new SparkContext(sconf)
    val sqlc = new SapSQLContext(sc)
    val queryResult = sqlc.sql("SELECT * from DATA")
    queryResult.collect().foreach(println)
  }
}
I want to include the third-party jar "spark-sap-datasources-1.2.33-assembly.jar" in my build. I tried both sbt package and sbt assembly.
I used the following build.sbt:
name := "VoraApp"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies ++= Seq("org.apache.spark" %% "spark-core" % "1.5.2",
"com.sap.spark" % "extensiondist" % "1.2.37" from "file:///local/file/loc/lib/spark-sap-datasources-1.2.33-assembly.jar")
Neither worked. I am getting the error below:
Compiling 1 Scala source to local/file/loc/target/scala-2.10/classes...
[error] bad symbolic reference. A signature in ExtendableSQLContext.class refers to type SQLContext
[error] in package org.apache.spark.sql which is not available.
[error] It may be completely missing from the current classpath, or the version on
[error] the classpath might be incompatible with the version used when compiling ExtendableSQLContext.class.
How can I overcome this error? I am new to Scala, sbt, Spark, and Vora.
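For context on what I have tried: my understanding is that "bad symbolic reference ... type SQLContext ... not available" usually means spark-sql is missing from the compile classpath, since SQLContext lives in the spark-sql module rather than spark-core, and the SAP extension classes (like ExtendableSQLContext) compile against it. A sketch of the build.sbt with that dependency added, assuming Spark 1.5.2 and my local jar path are correct:

```scala
name := "VoraApp"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.5.2",
  // spark-sql provides org.apache.spark.sql.SQLContext, which the
  // SAP extension classes reference at compile time
  "org.apache.spark" %% "spark-sql" % "1.5.2",
  "com.sap.spark" % "extensiondist" % "1.2.37" from "file:///local/file/loc/lib/spark-sap-datasources-1.2.33-assembly.jar"
)
```

I am not sure whether the Spark dependencies should also be marked "provided" when submitting to a cluster, so corrections are welcome.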
Is that the actual path to the file? It looks unusual to me...