Multiple spark-submit using spark operator on k8s


Is it possible to run multiple spark-submit jobs through a single Spark operator on k8s? Or is a dedicated Spark operator required for each spark-submit?

1 Answer

The Spark operator adds Spark capabilities to Kubernetes and exposes them through the SparkApplication custom resource.

Once it is installed, you can define and run multiple Spark applications on the same cluster with a single operator; they run concurrently as long as resources are available:

∑ app_resources < cluster_resources

If the sum of the resources requested by the applications exceeds the cluster's capacity, the additional jobs are queued until resources free up.
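As a rough illustration, here is a minimal SparkApplication manifest of the kind a single operator instance can manage; the name, namespace, image, jar path and service account below are placeholders, and you can apply any number of such manifests against the same operator (for example with kubectl apply -f), each becoming its own driver and executor pods.

```yaml
# Hypothetical example: one of several SparkApplication manifests
# that a single Spark operator instance can reconcile.
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: spark-pi-1            # placeholder; spark-pi-2, spark-pi-3, ... can coexist
  namespace: default
spec:
  type: Scala
  mode: cluster
  image: "apache/spark:3.4.0"                      # placeholder image
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: "local:///opt/spark/examples/jars/spark-examples_2.12-3.4.0.jar"  # placeholder path
  sparkVersion: "3.4.0"
  driver:
    cores: 1
    memory: "512m"
    serviceAccount: spark     # placeholder service account with pod-creation rights
  executor:
    cores: 1
    instances: 2
    memory: "512m"
```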

Moreover, the operator uses spark-submit under the hood, but you interact with it only through SparkApplication resources. You can also run Spark workloads with native spark-submit, but in that case you are not using the Kubernetes operator.

Check the operator's documentation for how to install it and how to write SparkApplication manifests; it also includes example commands.