How to solve the "JOB Loading Failed" error in Spark Jobserver when submitting jobs in Java?


This is a simple Java job, based on the example in the Spark Jobserver GitHub repo:

package com.sample.wordcount;

import com.typesafe.config.Config;
import com.typesafe.config.ConfigFactory;
import org.apache.spark.api.java.JavaSparkContext;
import spark.jobserver.japi.JSparkJob;
import spark.jobserver.api.JobEnvironment;

public class SparkJavaJob implements JSparkJob {
    @Override
    public Object run(Object sc, JobEnvironment runtime, Config data) {
        return "OK";
    }

    @Override
    public Config verify(Object sc, JobEnvironment runtime, Config config) {
        return ConfigFactory.empty();
    }
}

Submitting it to Spark Jobserver fails with "JOB LOADING FAILED":

{
  "status": "JOB LOADING FAILED",
  "result": {
    "message": "com.sample.wordcount.SparkJavaJob cannot be cast to spark.jobserver.api.SparkJobBase",
    "errorClass": "java.lang.ClassCastException"
  }
}

Can anyone help me out with this?


Answer by Abhisek Ray:

I was able to solve this issue: the context type has to be changed when creating the context.

curl -d "" 'localhost:8090/contexts/jcontext?context-factory=spark.jobserver.context.JavaSparkContextFactory&num-cpu-cores=2&memory-per-node=1g'

We have to use "spark.jobserver.context.JavaSparkContextFactory": the default context factory creates contexts for Scala jobs extending spark.jobserver.api.SparkJobBase, so a Java job implementing JSparkJob fails the cast unless its context was created by the Java factory. Note that the URL must be quoted, otherwise the shell treats each "&" as a command separator.
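For completeness, a sketch of the full submission flow against a local Jobserver using its REST API. The binary name (wordcount) and jar path below are placeholder assumptions, not from the original post; adjust them to your build.

```shell
# 1. Create a Java-capable context (quoted URL so the shell does not
#    treat '&' as a command separator).
curl -d "" 'localhost:8090/contexts/jcontext?context-factory=spark.jobserver.context.JavaSparkContextFactory&num-cpu-cores=2&memory-per-node=1g'

# 2. Upload the jar containing SparkJavaJob (path is an example).
curl -X POST localhost:8090/binaries/wordcount \
     -H 'Content-Type: application/java-archive' \
     --data-binary @target/wordcount.jar

# 3. Run the job in the Java context created in step 1.
curl -d "" 'localhost:8090/jobs?appName=wordcount&classPath=com.sample.wordcount.SparkJavaJob&context=jcontext&sync=true'
```

With sync=true the call blocks until the job finishes and returns its result ("OK" for the job above) instead of just a job ID.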