When the geotools-wrapper library is loaded into spark-shell, even only on the driver classpath, it causes the error below.
spark-shell --driver-memory 5G --executor-memory 5G --num-executors 1 --executor-cores 5 --conf 'spark.driver.extraClassPath=/home/hadoop/segy_processor_libs/libs/geotools-wrapper-1.5.0-28.2.jar'
error:
java.lang.NoSuchFieldError: JAVA_17
at org.apache.spark.util.ClosureCleaner$.getFinalModifiersFieldForJava17(ClosureCleaner.scala:424)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:402)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:163)
at org.apache.spark.SparkContext.clean(SparkContext.scala:2531)
at org.apache.spark.rdd.RDD.$anonfun$map$1(RDD.scala:413)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:405)
at org.apache.spark.rdd.RDD.map(RDD.scala:412)
... 47 elided
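For what it's worth, judging by the ClosureCleaner frames in the trace, any RDD operation that takes a closure seems to hit this; even a trivial map reproduces it in the same session (hypothetical minimal repro, any .map should do):
scala> sc.parallelize(1 to 10).map(_ + 1).count()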
This seems to be because the geotools-wrapper library bundles a different (older) version of the org.apache.commons.lang3 package. When geotools-wrapper is not on the classpath, JavaVersion.JAVA_17 is defined; when it is included, JavaVersion.JAVA_17 is not defined:
scala> import org.apache.commons.lang3.JavaVersion
scala> JavaVersion.JAVA_17
And got:
<console>:24: error: value JAVA_17 is not a member of object org.apache.commons.lang3.JavaVersion
JavaVersion.JAVA_17
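To confirm which jar the conflicting class is actually loaded from, plain JDK reflection in the same spark-shell session should work (this is the standard java.lang.Class API, nothing specific to lang3):
scala> classOf[JavaVersion].getProtectionDomain.getCodeSource.getLocation
When the conflict is present, I would expect this to print the path of the geotools-wrapper jar rather than Spark's own commons-lang3 jar.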
My question is, how can I solve this problem? Do I have to exclude org.apache.commons.lang3 from the geotools-wrapper library? (I'm not sure how to do that yet; my rough idea is sketched below.)
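Since the project is built with SBT (see the note at the end), I assume a transitive exclusion would look roughly like this in build.sbt, though I'm not sure it helps if the commons-lang3 classes are shaded inside the geotools-wrapper jar itself rather than pulled in as a separate dependency:
// Hypothetical sketch: drop commons-lang3 if geotools-wrapper declares it transitively
libraryDependencies += ("org.datasyslab" % "geotools-wrapper" % "1.5.0-28.2")
  .exclude("org.apache.commons", "commons-lang3")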
Another question is, why does the latest geotools-wrapper (1.5.0-28.2) still depend on sedona 1.3.1-incubating (as shown in its POM: https://repo1.maven.org/maven2/org/datasyslab/geotools-wrapper/1.5.0-28.2/geotools-wrapper-1.5.0-28.2.pom) rather than on sedona 1.5.0? I wonder whether this will be a problem if I use sedona 1.5.0 together with geotools-wrapper 1.5.0, while geotools-wrapper 1.5.0 actually depends on sedona 1.3.1-incubating.
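For reference, the combination I intend to use would be declared like this in build.sbt (the Sedona artifact name here assumes Spark 3.4 / Scala 2.12, taken from the Sedona docs; adjust to your versions):
// The pairing in question: Sedona 1.5.0 alongside geotools-wrapper 1.5.0-28.2
libraryDependencies ++= Seq(
  "org.apache.sedona" % "sedona-spark-shaded-3.4_2.12" % "1.5.0",
  "org.datasyslab" % "geotools-wrapper" % "1.5.0-28.2"
)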
Note: this project is built using SBT.