SparkLauncher launching only one application even when submitting multiple applications


I am submitting/running multiple applications through SparkLauncher from my Java web app, but it seems to submit only one app. Here is my code:

Runnable run = new Runnable() {
        public void run() {
            try {
                SparkAppHandle sparkApp = new SparkLauncher()
                 .setAppResource("C:\\BigData\\spark\\examples\\jars\\spark-examples_2.11-2.4.0.jar")
                    .setMainClass("org.apache.spark.examples.SparkPi")
                    .setMaster("spark://192.168.2.233:7077")
                    .setConf("spark.scheduler.mode", "FAIR")
                    .setVerbose(true)
                    .setConf("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
                    .setConf("spark.sql.inMemoryColumnarStorage.batchSize", "10000")
                    .setConf("spark.sql.codegen","false")
                    .setConf("spark.submit.deployMode", "client")
                    .setConf("spark.executor.memory", "1g")
                    .setConf("spark.driver.memory", "1g")
                    .setConf("spark.cores.max", "1")
                    .setConf("spark.executor.cores", "1")
                    .setConf("spark.executor.instances", "1")
                    .setConf("spark.driver.host","192.168.2.233")
//                  .setConf("spark.dynamicAllocation.enabled", "true")
//                  .setConf("spark.shuffle.service.enabled", "true")
                    .startApplication();


                System.out.println(sparkApp.getState());

            } catch (IOException e) {
                e.printStackTrace();
            }
        }
     };

// run twice so as to submit two applications in parallel;
// in the real application logic we would pass different args to each app
     new Thread(run).start();
     new Thread(run).start();

I have a standalone cluster with two workers: node1 (8 GB, 4 cores) and node2 (8 GB, 2 cores). The master runs on node1, and the driver also runs on node1.

It seems that even though the second thread launches an application, nothing happens to it: the second application never even appears in the WAITING state, which would at least have been understandable.
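Since the code above only prints the handle's state once, right after `startApplication()` returns, a failed or silently dying second submission is easy to miss. A sketch of how the same submission could be instrumented with `SparkAppHandle.Listener` (a real part of the `org.apache.spark.launcher` API) to log every state transition, assuming the jar path and master URL from the question; the class name `MultiSubmit` and the app names `pi-1`/`pi-2` are illustrative:

```java
import java.io.IOException;
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class MultiSubmit {

    // Submit one SparkPi run and log every state change of its handle,
    // so that a second submission that dies early becomes visible.
    static SparkAppHandle submit(final String appName) throws IOException {
        return new SparkLauncher()
            .setAppResource("C:\\BigData\\spark\\examples\\jars\\spark-examples_2.11-2.4.0.jar")
            .setMainClass("org.apache.spark.examples.SparkPi")
            .setMaster("spark://192.168.2.233:7077")
            .setConf("spark.submit.deployMode", "client")
            .setConf("spark.cores.max", "1")
            .setAppName(appName)
            .startApplication(new SparkAppHandle.Listener() {
                @Override
                public void stateChanged(SparkAppHandle h) {
                    System.out.println(appName + " -> " + h.getState());
                }
                @Override
                public void infoChanged(SparkAppHandle h) {
                    // app ID becomes available here once the master accepts the app
                }
            });
    }

    public static void main(String[] args) throws IOException {
        // keep both handles referenced so each launcher's monitoring
        // connection stays alive while the apps run
        SparkAppHandle first = submit("pi-1");
        SparkAppHandle second = submit("pi-2");
    }
}
```

With the listener in place, the logs should show whether the second submission reaches SUBMITTED/RUNNING, stays in an early state, or transitions to FAILED/LOST, which narrows down whether the problem is in the launcher side or in cluster scheduling.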
