Using "provided" with sbt breaks IntelliJ syntax highlighting for Spark


My dependencies are below. I've been told by the Spark people that I'm supposed to add `% "provided"` after the Spark dependencies so they don't get bundled into my jar (not that I agree, but anyway). However, if I do add "provided", the syntax highlighting for the Spark modules breaks :( everything is shown in red (unresolved) after I run sbt gen-idea. The build itself still succeeds.

Has anyone found a solution? I'm worried that I might have to hack together a script that adds the "provided" words just before I compile and then removes them after the build.

Making Spark, SBT and IntelliJ all play together is like playing whack-a-mole: whack one problem and another just sprouts up!

libraryDependencies ++= Seq(
  "org.scalacheck" %% "scalacheck" % "1.10.1" % "test" withSources() withJavadoc(),
  "org.specs2" %% "specs2" % "1.14" % "test" withSources() withJavadoc(),
  "org.scalaz" %% "scalaz-core" % "7.0.5" withSources() withJavadoc(),
  "org.apache.commons" % "commons-math3" % "3.2" withSources() withJavadoc(),
  "io.spray" %% "spray-json" % "1.3.1" withSources() withJavadoc(),
  ("org.apache.spark" % "spark-sql_2.10" % "1.0.0-cdh5.1.3") withSources() withJavadoc(),
  ("org.apache.spark" % "spark-core_2.10" % "1.0.0-cdh5.1.3") withSources() withJavadoc()
)

With provided:

  ("org.apache.spark" % "spark-sql_2.10" % "1.0.0-cdh5.1.3" % "provided") withSources() withJavadoc(),
  ("org.apache.spark" % "spark-core_2.10" % "1.0.0-cdh5.1.3" % "provided") withSources() withJavadoc()
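
Rather than scripting edits to the build file before and after each compile, one possible workaround is to make the scope itself conditional on a JVM property, so that a plain `sbt gen-idea` sees a compile scope (and IntelliJ resolves Spark normally) while a packaging build run with the flag sees "provided". This is only a sketch; the property name `spark.provided` is made up for the example:

```scala
// Sketch: choose the Spark dependency scope from a (hypothetical) JVM property.
// `sbt -Dspark.provided=true package` would get "provided", while a plain
// `sbt gen-idea` defaults to "compile", so IntelliJ can resolve the sources.
//
// In build.sbt the dependency lines would then read, e.g.:
//   ("org.apache.spark" % "spark-core_2.10" % "1.0.0-cdh5.1.3" % sparkScope) withSources() withJavadoc()
val sparkScope: String =
  if (sys.props.getOrElse("spark.provided", "false") == "true") "provided"
  else "compile"
```

This keeps a single build definition and avoids rewriting the file on every build, at the cost of remembering to pass the flag when producing the deployable jar.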