I'm trying to deploy a Databricks workflow that is configured with YAML files. Currently I'm using dbx. Instead of keeping the YAML files locally in my project and uploading them with the dbx deploy command, is there any way to point the workflow at a project-specific conf folder, mirroring the expected layout, that is stored somewhere else (an S3 bucket, an Azure container, etc.)?
Using dbx version 0.18
I tried manually finding the path for the YAML file uploads, as well as digging through all of the documentation.
I tried the dbx CLI to deploy a deployment.yml file stored in Azure Blob Storage. To quickly test it, I only generated a SAS token for the deployment file, but dbx was unable to find the file. Therefore, the workaround was to use azcopy to download the remote deployment file from the cloud to local storage and then run the dbx deploy command. If the file resides in a git repository, you can simply clone the repo and deploy the file:
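A sketch of that workaround, assuming dbx's --deployment-file option and a conf/ layout; the storage account, container, SAS token, and repository names are placeholders you must substitute:

```shell
# Download the remote deployment file into the local conf/ folder
# (the SAS URL below is a placeholder, not a real endpoint).
azcopy copy \
  "https://<account>.blob.core.windows.net/<container>/deployment.yml?<sas-token>" \
  conf/deployment.yml

# Deploy using the now-local file.
dbx deploy --deployment-file conf/deployment.yml
```

For the git-repository case, cloning first looks like:

```shell
# Placeholder repo URL; substitute your own.
git clone https://github.com/<org>/<repo>.git
cd <repo>
dbx deploy --deployment-file conf/deployment.yml
```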
If it is a raw file URL, you can use the curl or wget command:
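For example, with a placeholder raw-file URL (substitute your own):

```shell
# Fetch the raw deployment file into the local conf/ folder dbx reads from.
mkdir -p conf
curl -fsSL "https://raw.githubusercontent.com/<org>/<repo>/main/conf/deployment.yml" \
  -o conf/deployment.yml
# or equivalently:
# wget -O conf/deployment.yml "https://raw.githubusercontent.com/<org>/<repo>/main/conf/deployment.yml"

# Then deploy from the downloaded copy.
dbx deploy --deployment-file conf/deployment.yml
```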