Azure Synapse - How to stop an Apache Spark application / notebook?


When I run (in debug mode) a Spark notebook in Azure Synapse Analytics, it doesn't seem to shut down as expected.

In the last cell I call: mssparkutils.notebook.exit("exiting notebook")

But then when I fire off another notebook (again in debug mode, same pool), I get this error:

AVAILABLE_COMPUTE_CAPACITY_EXCEEDED: Livy session has failed. Session state: Error. Error code: AVAILABLE_COMPUTE_CAPACITY_EXCEEDED. Your job requested 12 vcores. However, the pool only has 0 vcores available out of quota of 12 vcores. Try ending the running job(s) in the pool, reducing the numbers of vcores requested, increasing the pool maximum size or using another pool. Source: User.

So I go to Monitor => Apache Spark applications and I see the first notebook I ran still in a "Running" status, and I can manually stop it there.

How do I automatically stop the Notebook / Apache Spark application? I thought that was what the notebook.exit() call was for, but apparently not...


There are 3 best solutions below

Answer 1:

You can run:

mssparkutils.session.stop()

This will end the Spark session / application and release its resources back to the pool.
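As a minimal sketch, the last cell of the notebook might look like this (assuming mssparkutils is available in the Synapse notebook runtime, where it is normally pre-injected; the import is shown only for clarity):

# Last cell: end the Livy session and release the pool's vCores.
# mssparkutils is pre-injected in Synapse notebooks; the explicit import is optional.
from notebookutils import mssparkutils

mssparkutils.session.stop()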

Answer 2:

In debug mode, the pool's vCores are allocated to the notebook for the entire duration of the debug session (that is, until one hour of inactivity has passed or until you terminate it manually).

So you have two options: work on one notebook at a time, closing the debug session before starting another,

OR

Configure the session to request fewer executors, so that the Spark pool can provision multiple debug sessions at the same time (you might need to increase the pool's maximum size).
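As an illustrative sketch only (the exact values depend on your pool and workload), a %%configure cell at the top of the notebook, run before the session starts, can request a smaller session so more than one fits inside the pool's vCore quota:

%%configure -f
{
    "driverMemory": "4g",
    "driverCores": 2,
    "executorMemory": "4g",
    "executorCores": 2,
    "numExecutors": 1
}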

Answer 3:

While using a Spark notebook, I encountered an error message that said 'Session failed. Run the notebook to start a new session.'

However, I resolved the issue by running the command 'mssparkutils.session.stop()', as suggested in an earlier answer.

I just wanted to express my gratitude for sharing this solution.