I've been at this for a day now, trying every variation possible and searching for other people's solutions. I have an ADF pipeline with a few Databricks notebooks whose end result is saved to blob storage, but it needs to be joined with a SQL table in order to update some values. I am using dynamic folder names to pull the blob file, which works fine, but when I try the same thing for my SQL query, it doesn't fail; it just doesn't seem to select any records. So how does one use pipeline parameters in a Data Flow SQL query?
Any help is greatly appreciated!
Pipeline parameters can't be referenced directly in the Azure SQL Database source dataset query inside a data flow. You need to create a data flow parameter, map the pipeline parameter to it in the Execute Data Flow activity, and then use the data flow parameter in the query.
For example:
Create the data flow parameter:
Query option:
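As a rough sketch, the Query option takes a data flow expression, so you can build the SQL string with `concat()` around the data flow parameter. The parameter name `$folderName` and the table/column names below are illustrative, not from the original post:

```
concat("select * from dbo.MyTable where FolderName = '", $folderName, "'")
```

Note that the parameter value is spliced into the string, so a string parameter needs the surrounding single quotes as shown.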
Pipeline parameter:
Set the Data Flow parameter value from pipeline:
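In the Execute Data Flow activity's Parameters tab, you assign the pipeline parameter to the data flow parameter. Assuming a pipeline parameter also called `folderName`, the value expression would look something like this (for a string data flow parameter, the pipeline expression is wrapped in single quotes):

```
'@{pipeline().parameters.folderName}'
```

With that mapping in place, the value you pass at pipeline run time flows through to `$folderName` in the data flow query.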