Unable to import plugin into Scala project


I added this to <my_project_name>/project/plugins.sbt:

resolvers += "bintray-spark-packages" at "https://dl.bintray.com/spark-packages/maven/"

addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")

in order to import sbt-spark-package, but sbt tells me "Extracting structure failed: Build status: Error".

I tried other plugins, but the behavior is always the same.

sbt version: 1.8.2

scala version: 2.13.10


There is 1 answer below.

Answered by Dmytro Mitin

See the ticket "dl.bintray.com/spark-packages/maven is forbidden": https://github.com/databricks/sbt-spark-package/issues/50

Bintray has been sunset, so the old resolver URL no longer works.

The package's page at https://spark-packages.org/package/databricks/sbt-spark-package says:

"This package doesn't have any releases published in the Spark Packages repo, or with maven coordinates supplied. You may have to build this package from source, or it may simply be a script."

Do

git clone https://github.com/databricks/sbt-spark-package.git
cd sbt-spark-package
git reset --hard v0.2.6
sbt package

Now you can find a JAR at sbt-spark-package/target/scala-2.10/sbt-0.13/sbt-spark-package-0.2.6.jar.

Do

sbt publishLocal

and it will be published at ~/.ivy2/local/org.spark-packages/sbt-spark-package/scala_2.10/sbt_0.13/0.2.6/jars/sbt-spark-package.jar.

Now you can use this sbt plugin in your project:

build.sbt

lazy val root = (project in file("."))
  .settings(
    name := "scalademo",
    scalaVersion := "2.11.12"
  )

project/build.properties

sbt.version = 0.13.18

project/plugins.sbt

addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")
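Once the plugin resolves, you would configure it in build.sbt roughly like this. This is a sketch based on the plugin's README: the keys spName, sparkVersion, and sparkComponents are assumed from that documentation, and the values here are placeholders, not settings from the question.

```scala
// build.sbt — a sketch assuming the plugin's documented keys
// (spName, sparkVersion, sparkComponents); values are placeholders.
lazy val root = (project in file("."))
  .settings(
    name := "scalademo",
    scalaVersion := "2.11.12",
    spName := "myorg/scalademo",             // Spark Packages name, in "org/repo" form
    sparkVersion := "2.4.8",                 // Spark version to build against
    sparkComponents ++= Seq("sql", "mllib")  // adds spark-sql and spark-mllib as provided deps
  )
```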

Please note that sbt-spark-package is a plugin for sbt 0.13.x, not sbt 1.x; see "Support SBT 1.x": https://github.com/databricks/sbt-spark-package/issues/40

In order to use the plugin with sbt 1.8.2 and Scala 2.13.10 you'll have to upgrade it yourself.
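One hedged starting point for that upgrade is sbt's built-in plugin cross-building: in your clone of the plugin's own build, declare the sbt versions to cross-build against and publish one artifact per sbt series. This is only a sketch of the mechanism; the plugin's sources will almost certainly need porting to the sbt 1.x API before it compiles.

```scala
// In the cloned sbt-spark-package build.sbt — a sketch, not a working upgrade:
// the plugin code itself must first be ported to the sbt 1.x API.
crossSbtVersions := Seq("0.13.18", "1.8.2")

// Then, from the sbt shell of the plugin build:
//   ^compile        // compiles against each sbt version in crossSbtVersions
//   ^publishLocal   // publishes an artifact per sbt series to ~/.ivy2/local
```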

Moreover, sbt-spark-package seems to be outdated, abandoned, and deprecated:

java.lang.NoSuchMethodError: sbt.UpdateConfiguration.copy$default$1()Lscala/Option https://github.com/databricks/sbt-spark-package/issues/51

Is this plugin deprecated? https://github.com/databricks/sbt-spark-package/issues/48