org.apache.spark.shuffle.FetchFailedException: The relative remote executor is dead

Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: ResultStage 9 (runJob at FileFormatWriter.scala:237) has failed the maximum allowable number of times: 4. Most recent failure reason: org.apache.spark.shuffle.FetchFailedException: The relative remote executor(Id: 156), 
which maintains the block data to fetch is dead.
 at org.apache.spark.storage.ShuffleBlockFetcherIterator.throwFetchFailedException(ShuffleBlockFetcherIterator.scala:747)
 at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:662)
 at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:70)
 at org.apache.spark.util.CompletionIterator.next(CompletionIterator.scala:29)
 at scala.collection.Iterator$$anon$11.next(Iterator.scala:410)
 at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
 at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
 at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
 at org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:31)
 at 

What can be done to fix this?
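This error typically means the remote executor (Id: 156 here) holding the shuffle blocks died before they could be fetched, most often because the container was killed for exceeding its memory limits, or the node was lost. A common mitigation path, sketched below with illustrative (not prescriptive) values you would tune for your own cluster, is to give executors more memory headroom, make shuffle fetches more tolerant of transient failures, and decouple shuffle files from executor lifetime via the external shuffle service:

```shell
# Hypothetical spark-submit invocation; flag values are examples, not recommendations.
spark-submit \
  --conf spark.executor.memory=8g \
  --conf spark.executor.memoryOverhead=2g \      # headroom against OOM kills by YARN/K8s
  --conf spark.network.timeout=600s \            # tolerate slow/overloaded executors
  --conf spark.shuffle.io.maxRetries=10 \        # retry transient fetch failures
  --conf spark.shuffle.io.retryWait=15s \
  --conf spark.shuffle.service.enabled=true \    # shuffle files survive executor loss (YARN)
  --conf spark.sql.shuffle.partitions=400 \      # smaller partitions reduce per-task memory pressure
  your-job.jar
```

It is also worth checking the executor logs (e.g. the YARN NodeManager or Kubernetes pod logs for executor 156) to confirm *why* it died; if it was OOM-killed, raising `spark.executor.memoryOverhead` or repartitioning the shuffle-heavy stage usually addresses the root cause rather than just the retries.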

