ADF passing blank parameters to Databricks as double quotes


I'm using ADF to call Databricks notebooks via linked service.

When ADF passes parameters as blank (either by leaving the input box empty or by typing ""), the Databricks notebook widgets read them as two double quotes (the string "").

As in the screenshot below, both params would be read as double quotes in the Databricks notebook (using dbutils.widgets.get()).

[screenshot of the base parameters passed as blank]

To my knowledge, one way to handle this issue is to add handling code in the notebook after reading the widget input. However, I'm wondering if there's something I've missed, or if there are any other ways around it.
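For reference, the in-notebook handling I mean looks roughly like this (a minimal sketch; normalize_param is a hypothetical helper name, and it assumes the blank value arrives as the two-character string ""):

```python
def normalize_param(raw: str) -> str:
    # Treat a literal "" (two double-quote characters) or an already-blank
    # value as an empty string; otherwise return the value unchanged.
    if raw in ("", '""'):
        return ""
    return raw

# Values as Databricks would return them from dbutils.widgets.get():
print(normalize_param('""'))          # prints an empty line
print(normalize_param("2024-01-01"))  # prints 2024-01-01
```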


2 Answers

BEST ANSWER

In Parameters, if you give a blank value, it takes the value from the Default value you gave earlier. So, check the Default value you have given.

[screenshot of the Default value in the base parameters]

Below is the input passed when I run the notebook with blank values.

[screenshots of the activity input]

If you observe this input closely, the quotes were added even though I did not type them: the parameter is of type String, so whatever you pass is automatically enclosed in quotes.

The same happened for you when you passed double quotes.

[screenshot of the base parameters passed as ""]

Inputs:

[screenshot of the activity input]

So, when these inputs are sent to Databricks, it takes them with the double quotes.

And below are the run details in Databricks and the run output in ADF.

[screenshot of the Databricks run details]

[screenshot of the ADF run output]

So, do json.loads on whichever parameter has double quotes.

Below is the parameter I sent.

[screenshot of the parameter value]

and while sending back to ADF, do json.dumps on the data.

import json

# The widget values arrive wrapped in double quotes, so decode them first
x = json.loads(dbutils.widgets.get("name"))
y = json.loads(dbutils.widgets.get("id"))
dt = x + " " + y
# Re-encode before returning the value to ADF
dt = json.dumps(dt)
dbutils.notebook.exit(dt)

Output in Databricks:

[screenshot of the notebook output]

My suggestion is to send the text directly without double quotes for the String type, or to do a json.loads() in the notebook.
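To illustrate why json.loads works here (a standalone sketch, independent of ADF): a widget value that arrives as a JSON-encoded string decodes to the underlying text, the literal "" decodes to an empty string, and json.dumps re-encodes the result before it goes back to ADF.

```python
import json

# '""' is the JSON encoding of an empty string
print(json.loads('""') == "")     # True

# A quoted value decodes to its contents
print(json.loads('"Hello"'))      # Hello

# json.dumps re-encodes before returning the value to ADF
print(json.dumps("Hello World"))  # "Hello World"
```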

Other answer

What happens is that ADF passes the parameter as a null value, and Databricks widgets don't accept null values, only empty strings.

To resolve this issue, you can transform the pipeline parameter before it is sent to the "Base Parameters" under Settings in the Databricks activity, so that the Databricks widgets receive empty strings instead of null values.

An example:

@if(empty(pipeline().parameters.start_date), '', pipeline().parameters.start_date)

This way, if the parameter start_date is empty, ADF will send an empty string instead of a null value.

In Databricks you can simply do this:

start_date = dbutils.widgets.get('start_date')
if start_date:
    # any code here
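Building on that, if you want a fallback value instead of skipping the logic, a small helper can supply a default when the widget comes through empty (a sketch; default_if_blank and the date fallback are hypothetical choices, not from the answer above):

```python
def default_if_blank(value: str, default: str) -> str:
    # The ADF expression guarantees an empty string (not null) when the
    # pipeline parameter is blank, so a simple truthiness check suffices.
    return value if value else default

# e.g. start_date = default_if_blank(dbutils.widgets.get("start_date"), "2024-01-01")
print(default_if_blank("", "2024-01-01"))            # 2024-01-01
print(default_if_blank("2023-05-01", "2024-01-01"))  # 2023-05-01
```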