Apache Spark error: "Task failed while writing rows" when saving a SequenceFile


I am creating a JavaPairRDD and saving it in SequenceFile format with Apache Spark (version 2.3). I am running this on a normal 4-node cluster, and the path is a normal HDFS path. This is the Spark code (Java):

JavaSparkContext sc = new JavaSparkContext(conf);
JavaRDD<Integer> jr = sc.parallelize(Arrays.asList(1, 2, 3, 4, 5));
JavaPairRDD<NullWritable, Integer> outputData = jr.mapToPair(p -> {
    return new Tuple2<>(NullWritable.get(), p);
});
outputData.saveAsHadoopFile("hdfs://master:54310/user/output12",
        NullWritable.class, IntWritable.class, SequenceFileOutputFormat.class);
sc.close();

But when I try to run the code, I get the following exception:

org.apache.spark.SparkException: Task failed while writing rows
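A likely cause (my reading of the code above, not confirmed by the question): the RDD holds plain `Integer` values, but `saveAsHadoopFile` declares `IntWritable.class` as the value class, and `SequenceFileOutputFormat` can only serialize Hadoop `Writable` types. The mismatch surfaces at write time as "Task failed while writing rows". A minimal sketch of the fix, wrapping each value in an `IntWritable` before saving (the `SparkConf` setup and app name here are assumptions added to make the example self-contained):

```java
import java.util.Arrays;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.mapred.SequenceFileOutputFormat;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

import scala.Tuple2;

public class SequenceFileFix {
    public static void main(String[] args) {
        // Assumed configuration; replace with the conf from the question.
        SparkConf conf = new SparkConf().setAppName("seqfile-fix");
        JavaSparkContext sc = new JavaSparkContext(conf);

        JavaRDD<Integer> jr = sc.parallelize(Arrays.asList(1, 2, 3, 4, 5));

        // Wrap each Integer in an IntWritable so the RDD's value type matches
        // the value class declared in saveAsHadoopFile below.
        JavaPairRDD<NullWritable, IntWritable> outputData = jr.mapToPair(p ->
                new Tuple2<>(NullWritable.get(), new IntWritable(p)));

        outputData.saveAsHadoopFile("hdfs://master:54310/user/output12",
                NullWritable.class, IntWritable.class, SequenceFileOutputFormat.class);
        sc.close();
    }
}
```

Note that `saveAsHadoopFile` uses the old MapReduce API, so the `SequenceFileOutputFormat` imported must be the one from `org.apache.hadoop.mapred`, not `org.apache.hadoop.mapreduce.lib.output`.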