Not found value spark SBT project


Hi, I am trying to set up a small Spark application in SBT.

My build.sbt is:

import Dependencies._

name := "hello"

version := "1.0"

scalaVersion := "2.11.8"

val sparkVersion = "1.6.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-streaming-twitter" % sparkVersion
)

libraryDependencies += scalaTest % Test

Everything works fine and SBT resolves all the dependencies, but when I try to use spark in my hello.scala project file I get this error: not found: value spark

My hello.scala file is:

package example
import org.apache.spark._
import org.apache.spark.SparkContext._

object Hello extends fileImport with App {
  println(greeting)
  anime.select("*").orderBy($"rating".desc).limit(10).show()
}

trait fileImport {
  lazy val greeting: String = "hello"
  var anime = spark.read.option("header", true).csv("C:/anime.csv")
  var ratings = spark.read.option("header", true).csv("C:/rating.csv")
}

Here is the error I get:

[info] Compiling 1 Scala source to C:\Users\haftab\Downloads\sbt-0.13.16\sbt\alfutaim\target\scala-2.11\classes...
[error] C:\Users\haftab\Downloads\sbt-0.13.16\sbt\alfutaim\src\main\scala\example\Hello.scala:12: not found: value spark
[error]   var anime = spark.read.option("header", true).csv("C:/anime.csv")
[error]               ^
[error] C:\Users\haftab\Downloads\sbt-0.13.16\sbt\alfutaim\src\main\scala\example\Hello.scala:13: not found: value spark
[error]   var ratings = spark.read.option("header", true).csv("C:/rating.csv")
[error]                 ^
[error] two errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 3 s, completed Sep 10, 2017 1:44:47 PM

1 Answer


spark is initialized automatically only in spark-shell.

In your own code you need to create the spark variable yourself:

import org.apache.spark.sql.SparkSession
val spark = SparkSession.builder().appName("testings").master("local").getOrCreate

You can change "testings" to whatever application name you want. The .master option is optional if you run the code with spark-submit, since the master can be supplied on the command line instead.
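Putting it together, your trait could look like the sketch below (using the appName and CSV paths from your question; adjust them to your setup). Two caveats: SparkSession lives in the spark-sql module and only exists in Spark 2.0+, so your build.sbt would also need a 2.x sparkVersion and a "spark-sql" dependency; and the $"rating" column syntax additionally needs import spark.implicits._ in scope:

import org.apache.spark.sql.SparkSession

object Hello extends fileImport with App {
  println(greeting)
  anime.select("*").orderBy($"rating".desc).limit(10).show()
}

trait fileImport {
  lazy val greeting: String = "hello"

  // spark-shell creates this for you; in a standalone app you build it yourself
  val spark = SparkSession.builder()
    .appName("testings")
    .master("local")
    .getOrCreate()

  // needed for the $"colName" column syntax used in orderBy
  import spark.implicits._

  var anime = spark.read.option("header", true).csv("C:/anime.csv")
  var ratings = spark.read.option("header", true).csv("C:/rating.csv")
}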