Error initializing SparkContext caused by java.nio.channels.UnresolvedAddressException: null


I'm getting an error when running a Scala Spark application on my local machine that appears to be network-related. Every Spark application I've tried has failed with the same error since earlier this week; they worked fine before, and nothing has changed as far as I'm aware. The same code runs fine on GitHub CI and on other people's machines. The failures block me from even compiling my code, since the build runs unit tests as part of compilation.

I am on a MacBook Pro with an Intel Core i7 (x64), using IntelliJ IDEA and sbt to write, test, and compile the code. The error initially affected only sbt runs from the terminal, but it now also affects IntelliJ IDEA's built-in run/test tooling.
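
Since the stack trace below fails inside ServerSocketChannel.bind, my working assumption is that the machine's own hostname no longer resolves. A minimal sketch to check that (plain Scala, no Spark involved):

import java.net.InetAddress

object HostnameCheck extends App {
  // InetAddress.getLocalHost throws UnknownHostException when the
  // machine's hostname has no entry in DNS or /etc/hosts -- the same
  // condition that surfaces as UnresolvedAddressException in NIO binds.
  val local = InetAddress.getLocalHost
  println(s"${local.getHostName} -> ${local.getHostAddress}")
}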

Stack trace (I've changed some identifying names for privacy):

[info] Loading project definition from /Users/carlosplanelles/path/to/repo/project
[info] Set current project to ProjectName (in build file:/Users/carlosplanelles/path/to/repo/)
[info] Compiling 7 Scala sources to /Users/carlosplanelles/path/to/repo/target/scala-2.12/classes...
[info] Compiling 9 Scala sources to /Users/carlosplanelles/path/to/repo/target/scala-2.12/test-classes...
[warn] 7 deprecations (since 1.3.2); re-run with -deprecation for details
[warn] one warning found
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/carlosplanelles/.ivy2/cache/ch.qos.logback/logback-classic/jars/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/carlosplanelles/.ivy2/cache/org.slf4j/slf4j-log4j12/jars/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]
[2024-02-09 16:43:27,956] [pool-4-thread-1] WARN  o.a.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
[2024-02-09 16:43:29,158] [pool-4-thread-1] ERROR org.apache.spark.SparkContext - Error initializing SparkContext. 
java.nio.channels.UnresolvedAddressException: null
        at sun.nio.ch.Net.checkAddress(Net.java:100)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:220)
        at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:134)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:550)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1334)
        at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:506)
        at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:491)
        at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:973)
        at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:248)
        at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:356)
        at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:500)
        at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
        at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
        at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.lang.Thread.run(Thread.java:750)
[info] TestClassName:
[info] com.unit.testing.package *** ABORTED ***
[info]   java.nio.channels.UnresolvedAddressException:
[info]   at sun.nio.ch.Net.checkAddress(Net.java:100)
[info]   at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:220)
[info]   at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:134)
[info]   at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:550)
[info]   at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1334)
[info]   at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:506)
[info]   at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:491)
[info]   at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:973)
[info]   at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:248)
[info]   at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:356)
[info]   ...
[info] Run completed in 3 seconds, 451 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 1
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] *** 1 SUITE ABORTED ***
[error] Error during tests:
[error]         com.unit.testing.package
[error] (test:testOnly) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 36 s, completed 9-Feb-2024 4:43:30 PM
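
For context, java.nio.channels.UnresolvedAddressException is thrown when a channel is asked to bind or connect to an InetSocketAddress whose hostname could not be resolved. A minimal sketch that reproduces the same exception outside of Spark (the hostname is a hypothetical placeholder):

import java.net.InetSocketAddress
import java.nio.channels.ServerSocketChannel

object UnresolvedRepro extends App {
  // "some-unresolvable-host" is a placeholder; any name that fails to
  // resolve leaves the InetSocketAddress in an unresolved state.
  val addr = new InetSocketAddress("some-unresolvable-host", 0)
  println(s"isUnresolved = ${addr.isUnresolved}")  // true
  // Binding to an unresolved address throws UnresolvedAddressException,
  // matching the top frames of the Spark stack trace above.
  ServerSocketChannel.open().bind(addr)
}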

The SparkContext is initialized via this shared trait (I have also tried local and local[1] as the master URL):

import org.apache.spark.sql.SparkSession

trait SharedSparkContext {
  // Shared session for the test suites; the master URL has also been
  // tried as "local" and "local[1]" with the same failure.
  implicit val _spark: SparkSession = SparkSession
    .builder()
    .master("local[*]")
    .getOrCreate()
}
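
If the root cause really is the local hostname failing to resolve, one variant would be to pin the driver host and bind address explicitly. This is a sketch of what I understand the workaround to look like, not something I have confirmed fixes it:

import org.apache.spark.sql.SparkSession

trait SharedSparkContext {
  // Pinning both settings to the loopback address sidesteps hostname
  // resolution entirely; spark.driver.host and spark.driver.bindAddress
  // are standard Spark configuration keys.
  implicit val _spark: SparkSession = SparkSession
    .builder()
    .master("local[*]")
    .config("spark.driver.host", "127.0.0.1")
    .config("spark.driver.bindAddress", "127.0.0.1")
    .getOrCreate()
}

As I understand it, setting the SPARK_LOCAL_IP environment variable to 127.0.0.1 before launching sbt should have a similar effect.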

I have tried cleaning up and reinstalling Java (the JDK and JRE), reinstalling sbt, running sbt clean, clearing IntelliJ IDEA's caches, and restarting both the software and the laptop. None of these changed the stack trace at all.
