java.lang.NoSuchMethodError: breeze.linalg.tile$.tile_DM_Impl2


I have a Spark job that uses breeze. I can see the breeze version that my project pulls in:

$ gradle dependencies | grep breeze
     |    |    +--- org.scalanlp:breeze_2.11:0.12
     |    |    |    +--- org.scalanlp:breeze-macros_2.11:0.12
     +--- org.scalanlp:breeze_2.11:0.12 (*)
     |    |    +--- org.scalanlp:breeze_2.11:0.12
     |    |    |    +--- org.scalanlp:breeze-macros_2.11:0.12
     +--- org.scalanlp:breeze_2.11:0.12 (*)
     |    |    +--- org.scalanlp:breeze_2.11:0.12
     |    |    |    +--- org.scalanlp:breeze-macros_2.11:0.12
     +--- org.scalanlp:breeze_2.11:0.12 (*)
|    |    |    +--- org.scalanlp:breeze_2.11:0.12
|    |    |    |    +--- org.scalanlp:breeze-macros_2.11:0.12
|    +--- org.scalanlp:breeze_2.11:0.12 (*)
|    |    |    +--- org.scalanlp:breeze_2.11:0.12
|    |    |    |    +--- org.scalanlp:breeze-macros_2.11:0.12
|    +--- org.scalanlp:breeze_2.11:0.12 (*)

The version of breeze included in Spark 2.1.1 is 0.12. I can see this by looking in the Spark jars directory:

spark-2.1.1-bin-hadoop2.4$ find . -name '*.jar' | grep breeze
./jars/breeze_2.11-0.12.jar
./jars/breeze-macros_2.11-0.12.jar

But when I submit the job to Spark (even in local mode), I get this error:

java.lang.NoSuchMethodError: breeze.linalg.tile$.tile_DM_Impl2(Lscala/reflect/ClassTag;Lbreeze/storage/Zero;Lbreeze/generic/UFunc$InPlaceImpl2;)Lbreeze/generic/UFunc$UImpl2;
    at mypackage.MyClass.calcOne(MyClass.scala:51)
    at mypackage.MyClass$$anonfun$1.apply(MyClass.scala:36)
    at mypackage.MyClass$$anonfun$1.apply(MyClass.scala:35)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
    at scala.collection.Iterator$class.foreach(Iterator.scala:893)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
    at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
    at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1336)
    at scala.collection.TraversableOnce$class.fold(TraversableOnce.scala:212)
    at scala.collection.AbstractIterator.fold(Iterator.scala:1336)
    at org.apache.spark.rdd.RDD$$anonfun$fold$1$$anonfun$20.apply(RDD.scala:1044)

The command line used:

spark-2.1.1-bin-hadoop2.4/bin/spark-submit --class my.Main myjar.jar
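
Since bin/spark-submit hands off to bin/spark-class, which locates the jars directory from the environment, a sanity check like the following can show which installation actually gets used (this is a sketch, not from the original run; it assumes a POSIX shell and uses the paths from this post):

# Which installation will spark-class pick up?
echo $SPARK_HOME
spark-2.1.1-bin-hadoop2.4/bin/spark-submit --version
# Which breeze does that installation ship?
ls "${SPARK_HOME:-spark-2.1.1-bin-hadoop2.4}"/jars | grep breeze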

Found the problem:

My SPARK_HOME environment variable was set to an old Spark installation.

So bin/spark-class was resolving the jar dependencies from that other path, which shipped a different (older) breeze than the 0.12 my job was compiled against.
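
The fix is just to point SPARK_HOME at the installation that is actually being launched (or unset it) and re-submit; the path below is illustrative, not the exact one from my machine:

export SPARK_HOME=/path/to/spark-2.1.1-bin-hadoop2.4   # illustrative path
"$SPARK_HOME"/bin/spark-submit --class my.Main myjar.jar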