How to kill a spark job if application id is known?


I am using DSE with Spark. I have submitted a Spark job to the master using dse spark-submit. How can I kill the job if I know its application ID?


dse spark-class org.apache.spark.deploy.Client kill <spark-master> <driver-id>

or

kill it directly from the Spark web UI (just make sure spark.ui.killEnabled is set to true).
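For the web-UI route, the kill links only appear when spark.ui.killEnabled is enabled. A minimal sketch of the setting, assuming a standard spark-defaults.conf (the file's location may differ on a DSE install):

```
# spark-defaults.conf -- enable the "kill" links in the Spark web UI
spark.ui.killEnabled true
```

After restarting, the UI exposes kill links for running applications and stages. For the command-line route, the <spark-master> argument is the master URL (e.g. spark://host:7077) and <driver-id> is the driver identifier shown in the master UI or in the spark-submit output.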