Start PySpark in Jupyter notebook on EMR 6.5


I am trying to start a PySpark job through the Amazon EMR JupyterHub feature, as follows:

[Screenshot: cluster config]

And with following code:

from pyspark.sql import SparkSession

spark = SparkSession \
    .builder \
    .appName("My App") \
    .getOrCreate()
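
Note that the notebook uses the sparkmagic PySpark kernel, so the session is actually created on the cluster through Livy, and a SparkSession named spark should already be bound. Since the Livy session dies at startup, even a minimal cell fails the same way:

    # The sparkmagic kernel pre-binds `spark` to a Livy-backed session,
    # so even this minimal cell hits the same "dead" session error.
    spark.version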

But in the end, I always get:

The code failed because of a fatal error:
    Session 0 unexpectedly reached final status 'dead'. See logs:
stdout: 

stderr: 
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/spark/jars/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/share/aws/emr/emrfs/lib/slf4j-log4j12-1.7.12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/share/aws/redshift/jdbc/redshift-jdbc42-1.2.37.1061.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/usr/lib/spark/jars/spark-unsafe_2.12-3.1.2-amzn-1.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
22/05/04 13:21:11 INFO RSCDriver: Connecting to: ip-10-42-255-42.eu-west-1.compute.internal:10000
22/05/04 13:21:11 INFO RSCDriver: Starting RPC server...
22/05/04 13:21:11 INFO RpcServer: Connected to the port 10001
22/05/04 13:21:11 WARN RSCConf: Your hostname, ip-10-42-255-42.eu-west-1.compute.internal, resolves to a loopback address, but we couldn't find any external IP address!
22/05/04 13:21:11 WARN RSCConf: Set livy.rsc.rpc.server.address if you need to bind to another address.
Exception in thread "main" java.lang.IncompatibleClassChangeError: Inconsistent constant pool data in classfile for class org/apache/livy/shaded/json4s/DefaultFormats. Method 'java.text.SimpleDateFormat $anonfun$df$1(org.apache.livy.shaded.json4s.DefaultFormats)' at index 156 is CONSTANT_MethodRef and should be CONSTANT_InterfaceMethodRef
    at org.apache.livy.shaded.json4s.DefaultFormats.$init$(Formats.scala:318)
    at org.apache.livy.shaded.json4s.DefaultFormats$.<init>(Formats.scala:296)
    at org.apache.livy.shaded.json4s.DefaultFormats$.<clinit>(Formats.scala)
    at org.apache.livy.repl.Session.<init>(Session.scala:66)
    at org.apache.livy.repl.ReplDriver.initializeSparkEntries(ReplDriver.scala:43)
    at org.apache.livy.rsc.driver.RSCDriver.run(RSCDriver.java:337)
    at org.apache.livy.rsc.driver.RSCDriverBootstrapper.main(RSCDriverBootstrapper.java:93)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:959)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1047)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1056)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
22/05/04 13:21:11 INFO ShutdownHookManager: Shutdown hook called
22/05/04 13:21:11 INFO ShutdownHookManager: Deleting directory /mnt/tmp/spark-2804f6ee-21f1-4773-98dc-8b3e3bd1924a

It seems a library version on the cluster is clashing with the one embedded in the Livy shaded jar (the trace points at org.apache.livy.shaded.json4s.DefaultFormats), so I tried to override it with a fat jar containing all the Spark jars I normally use, importing it with the following config:

%%configure -f
{
    "conf": {
        "spark.jars": "s3://mybucket/myfatjar.jar"
    }
}

But it had no effect.
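
For reference, a variant I have not tried (so this is only a sketch) would be to put the jar on the driver's classpath instead of spark.jars, since the stack trace shows the crash inside the Livy RSC driver itself, before user jars from spark.jars are distributed. spark.driver.extraClassPath only accepts local paths, so the jar would first have to be copied onto the nodes (e.g. by a bootstrap action); the path below is hypothetical, and EMR already sets a default spark.driver.extraClassPath in spark-defaults.conf that would need to be preserved:

%%configure -f
{
    "conf": {
        "spark.driver.extraClassPath": "/home/hadoop/myfatjar.jar:<EMR default classpath>"
    }
}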
