Error when creating a Spark session in Zeppelin Docker: java.lang.NoSuchMethodError with Scala and Java

Hello, StackOverflow community!

I am facing a problem when trying to run Apache Spark in Zeppelin using Docker. When creating a SparkSession in PySpark, the following error occurs:

org.apache.zeppelin.interpreter.InterpreterException: java.lang.NoSuchMethodError: scala.tools.nsc.Settings.usejavacp()Lscala/tools/nsc/settings/AbsSettings$AbsSetting

Here is the configuration of my environment:

  • Spark version 3.5.0
  • Java OpenJDK version 17.0.7
  • Scala version 2.12.18
  • Python version 3.9.0
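For completeness, here is the minimal sanity check of the Python and Java versions as seen from inside the container (a quick sketch run in a notebook paragraph; it assumes java is on the PATH):

%python
import subprocess
import sys

# Print the Python version the interpreter is running on.
print(sys.version)

# `java -version` writes its output to stderr, so read stderr here.
result = subprocess.run(["java", "-version"], capture_output=True, text=True)
print(result.stderr)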

I am using the following code to create a SparkSession in PySpark:

%pyspark
from pyspark.sql import SparkSession  # SparkSession lives in pyspark.sql, not the top-level package

spark = (SparkSession.builder
         .master("local[*]")
         .appName('PySpark_Tutorial')
         .getOrCreate())
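For reference, Zeppelin's %pyspark interpreter normally injects a ready-made spark variable, so a minimal sanity check (assuming the interpreter manages to start at all) would be:

%pyspark
# Zeppelin pre-creates `spark` in the %pyspark interpreter; if the
# interpreter starts, this prints the Spark version without building
# a new session.
print(spark.version)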

I run Zeppelin with the following Docker command:

docker run -u $(id -u) -p 8080:8080 -p 4040:4040 --rm -v /mnt/c/spark:/opt/spark -e SPARK_HOME=/opt/spark --name zeppelin apache/zeppelin:0.11.0

The HOME and PATH environment variables are set correctly. I have not tried older versions of Spark or Scala. My suspicion is that the problem is caused by an incompatibility between the Scala and Spark versions.
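To test that hypothesis, one way to see which Scala version the mounted Spark distribution was built against is to list its bundled scala-library jar, whose filename encodes that version (a minimal sketch; the /opt/spark default matches the volume mount in the Docker command above):

%python
import glob
import os

# The scala-library jar name encodes the Scala version the Spark build
# targets, e.g. scala-library-2.12.18.jar.
spark_home = os.environ.get("SPARK_HOME", "/opt/spark")
for jar in sorted(glob.glob(os.path.join(spark_home, "jars", "scala-*.jar"))):
    print(os.path.basename(jar))

If the version printed there does not match what Zeppelin's own Spark interpreter expects, that would be consistent with the NoSuchMethodError above.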

I tried running Apache Spark in Zeppelin via Docker with the versions of Java, Scala, and Python listed above, expecting the Spark session to be created without errors. Instead, I got the java.lang.NoSuchMethodError shown above, which appears to be related to Scala compiler settings. I have checked that the HOME and PATH environment variables are set correctly and point to the required paths, and that compatible versions of all components are used, but the error persists.
