Is it possible to export the published SQL script files in your Synapse workspace to your sandbox environment via code, without using pipelines?
If not, is it somehow possible to access your published SQL scripts from a notebook, e.g. with PySpark, Scala, Spark SQL, or C#?
You can get the list of SQL scripts from the Synapse workspace using the following REST API, called from a Synapse notebook (PySpark).
First, create a service principal and a client secret, then grant that service principal access to the Synapse workspace (in Synapse Studio, under Manage > Access control).
Here, these are my SQL scripts in a workspace attached to a dedicated SQL pool named rakeshdedipool.
Next, generate a bearer token for the service principal. I followed the code in this SO answer by @Saideep Arikontham, which uses msal to generate the bearer token. If you prefer, you can also use Postman to obtain the token.
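For reference, here is a minimal sketch of the client-credentials token flow with msal. The tenant ID, client ID, and secret are placeholders you must fill in with your service principal's values; the scope for the Synapse data plane is `https://dev.azuresynapse.net/.default`.

```python
def synapse_authority(tenant_id: str) -> str:
    # Azure AD authority URL for the tenant
    return f"https://login.microsoftonline.com/{tenant_id}"

def get_synapse_token(tenant_id: str, client_id: str, client_secret: str) -> str:
    """Acquire a bearer token for the Synapse data plane via client credentials."""
    import msal  # pip install msal

    app = msal.ConfidentialClientApplication(
        client_id,
        authority=synapse_authority(tenant_id),
        client_credential=client_secret,
    )
    # Scope for the Synapse data-plane resource must end with /.default
    result = app.acquire_token_for_client(
        scopes=["https://dev.azuresynapse.net/.default"]
    )
    return result["access_token"]

# bearer_token = get_synapse_token("<tenant-id>", "<client-id>", "<client-secret>")
```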
Now, use the bearer token in PySpark and you can see the list of SQL scripts.
You can retrieve the script contents this way and use them in the notebook as your requirement demands.
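The listing call can be sketched as below, assuming a `bearer_token` from the previous step and a placeholder workspace name. The data-plane operation is `GET https://{workspace}.dev.azuresynapse.net/sqlScripts?api-version=2020-12-01`, and each returned item carries the SQL text under `properties.content.query`.

```python
def list_sql_scripts_url(workspace: str) -> str:
    # Data-plane endpoint for "SQL Script - Get SQL Scripts By Workspace"
    return f"https://{workspace}.dev.azuresynapse.net/sqlScripts?api-version=2020-12-01"

def list_sql_scripts(workspace: str, bearer_token: str) -> list:
    """Return the published SQL scripts of the workspace as a list of dicts."""
    import requests  # pip install requests

    resp = requests.get(
        list_sql_scripts_url(workspace),
        headers={"Authorization": f"Bearer {bearer_token}"},
    )
    resp.raise_for_status()
    return resp.json()["value"]

# scripts = list_sql_scripts("<workspace-name>", bearer_token)
# for s in scripts:
#     print(s["name"])
#     print(s["properties"]["content"]["query"])
```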
(OR) Alternatively, use the PowerShell cmdlet Export-AzSynapseSqlScript to export the script files to a storage account, or try the Python SDK.
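For the Python SDK route, a rough sketch with the azure-synapse-artifacts package might look like the following. The endpoint and credential values are placeholders, and I have not verified this exact flow end to end; it writes each published script to a local .sql file.

```python
def script_filename(script_name: str) -> str:
    # Local file name for an exported script
    return f"{script_name}.sql"

def export_scripts(endpoint: str, tenant_id: str, client_id: str, client_secret: str) -> None:
    """Download every published SQL script in the workspace to local .sql files."""
    from azure.identity import ClientSecretCredential  # pip install azure-identity
    from azure.synapse.artifacts import ArtifactsClient  # pip install azure-synapse-artifacts

    credential = ClientSecretCredential(
        tenant_id=tenant_id,
        client_id=client_id,
        client_secret=client_secret,
    )
    client = ArtifactsClient(endpoint=endpoint, credential=credential)
    # Iterate the published SQL scripts and save each query text
    for script in client.sql_script.get_sql_scripts_by_workspace():
        with open(script_filename(script.name), "w") as f:
            f.write(script.properties.content.query)

# export_scripts("https://<workspace-name>.dev.azuresynapse.net",
#                "<tenant-id>", "<client-id>", "<client-secret>")
```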