I have a Databricks notebook with some Python code that uploads a file from DBFS to a SharePoint location. The notebook runs correctly when executed standalone, and the file is uploaded, but when I schedule it with ADF or a Databricks job, the command for the SharePoint upload gets skipped.
The other commands execute fine. I'm using the Office365-REST-Python-Client library for the SharePoint upload, and I'm not sure whether my choice of library is causing this.
Has anyone faced something similar?
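For reference, my upload code looks roughly like this (the site URL, credentials, and paths are placeholders, and the exact upload_file signature can differ between library versions):

from office365.runtime.auth.client_credential import ClientCredential
from office365.sharepoint.client_context import ClientContext

site_url = "https://mytenant.sharepoint.com/sites/mysite"  # placeholder
ctx = ClientContext(site_url).with_credentials(
    ClientCredential("client_id", "client_secret")  # placeholders
)

# Read the file from DBFS via the local /dbfs mount on the driver
with open("/dbfs/tmp/report.csv", "rb") as f:
    content = f.read()

folder = ctx.web.get_folder_by_server_relative_url("Shared Documents")
folder.upload_file("report.csv", content).execute_query()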
From the info given, it's not clear whether this is what's happening in your code, but it may help you or others hitting the mysterious "Command skipped" problem from the title when running in job mode:
This happens when a notebook runs another notebook via a %run call, e.g.,
%run ./subordinate_notebook
and that subordinate notebook ends with dbutils.notebook.exit("Some message").
In this situation, once the subordinate notebook exits, the remaining cells in the primary notebook are skipped and show the message "Command skipped".
Note that %run behaves differently from dbutils.notebook.run(): %run executes the subordinate notebook inline, in the caller's context, so its exit call ends the whole run, whereas dbutils.notebook.run() starts it as a separate run whose exit value is simply returned to the caller.
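For example, take a primary notebook with two cells (the names are illustrative):

Cmd 1:

%run ./subordinate_notebook

Cmd 2:

print("upload to SharePoint would happen here")

If ./subordinate_notebook ends with dbutils.notebook.exit("Some message"), Cmd 2 never executes in a job run and shows "Command skipped".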
Using
result_message = dbutils.notebook.run("./subordinate_notebook", 600)
will avoid this problem (the second argument is the required timeout in seconds). Removing the dbutils.notebook.exit("Some message") call will also eliminate the issue. I hope that helps.
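A minimal sketch of the primary notebook with this fix (the notebook path and the 600-second timeout are just example values):

# dbutils.notebook.run() executes the subordinate notebook as a
# separate, ephemeral run, so its dbutils.notebook.exit("Some message")
# only supplies the return value instead of ending the caller's run
result_message = dbutils.notebook.run("./subordinate_notebook", 600)
print(result_message)  # prints: Some message

# Later cells, such as the SharePoint upload, now execute in job mode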