Azure Dataflow - Resource references resources which could not be loaded


I have a set of pipelines, dataflows, and a trigger that, when validated, show the error message: "This resource references resources which could not be loaded." This occurred after manually merging a feature branch into the develop and main branches. See below:

Workspace validation output

Example pipeline below:

{
  "name": "NewListingToStaging",
  "properties": {
    "activities": [
      {
        "name": "Transform for Insert",
        "type": "ExecuteDataFlow",
        "dependsOn": [],
        "policy": {
          "timeout": "0.00:10:00",
          "retry": 1,
          "retryIntervalInSeconds": 30,
          "secureOutput": false,
          "secureInput": false
        },
        "userProperties": [],
        "typeProperties": {
          "dataflow": {
            "referenceName": "TransformListingForInsert",
            "type": "DataFlowReference"
          },
          "integrationRuntime": {
            "referenceName": "cc-integration-large-compute",
            "type": "IntegrationRuntimeReference"
          },
          "traceLevel": "Fine",
          "runConcurrently": true
        }
      }
    ],
    "folder": {
      "name": "Rex CRM/Listings/Workers"
    },
    "annotations": []
  }
}

I have fixed the names of several code sections so that each output name matches the input name of the next section (e.g. [flatten1]), because they differed in a few places after the code merge. I also checked that every referenced resource exists in the code base. Despite this, the code is still not valid. Where possible, I also compared the code with previous versions of the code base and found no indication of the cause of the validation error.
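Checking that every referenced resource exists can be automated rather than done by eye. A minimal sketch, assuming the Git repository follows the usual ADF/Synapse layout of one JSON file per resource in per-type folders (the `dataflow/` folder name and repo layout here are assumptions, not taken from the question):

```python
import json
import pathlib

# Assumed repository root; adjust to the actual collaboration-branch checkout.
REPO = pathlib.Path(".")

def referenced_dataflows(pipeline_json: dict) -> set[str]:
    """Collect every DataFlowReference name used by the pipeline's activities."""
    names = set()
    for activity in pipeline_json["properties"].get("activities", []):
        ref = activity.get("typeProperties", {}).get("dataflow", {})
        if ref.get("type") == "DataFlowReference":
            names.add(ref["referenceName"])
    return names

def missing_dataflows(pipeline_path: pathlib.Path) -> set[str]:
    """Return referenced dataflow names with no matching JSON file in dataflow/."""
    pipeline = json.loads(pipeline_path.read_text())
    existing = {p.stem for p in (REPO / "dataflow").glob("*.json")}
    return referenced_dataflows(pipeline) - existing
```

Running `missing_dataflows()` over each file under `pipeline/` would surface any dangling reference left behind by the merge; the same pattern extends to DatasetReference, LinkedServiceReference, and IntegrationRuntimeReference.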

There is 1 answer below.

Answer by Pratik Lad:

This resource references resources which could not be loaded

This error typically arises when essential resources, such as datasets, linked services, or dataflows, are absent within the Azure Data Factory (ADF) environment. It indicates a discrepancy between the referenced components and their availability within the ADF infrastructure.

To resolve the issue, you can follow the steps below:

  • Manually verify that every referenced resource exists, including datasets, linked services, integration runtimes, and dataflows.
  • If there is a branch in which the pipeline still works correctly, comparing the two JSON definitions can help pinpoint the difference.
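For the comparison step, `git diff develop main -- <path-to-resource.json>` works at the file level, but key-order churn from a merge can drown out the real change. A small helper that normalizes both documents before diffing makes the comparison cleaner; this is a sketch, and the "good-branch"/"broken-branch" labels are placeholders:

```python
import difflib
import json

def json_diff(good: str, broken: str) -> list[str]:
    """Return a unified diff of two JSON documents, normalized so that
    key order and whitespace differences do not hide real changes."""
    a = json.dumps(json.loads(good), indent=2, sort_keys=True).splitlines()
    b = json.dumps(json.loads(broken), indent=2, sort_keys=True).splitlines()
    return list(difflib.unified_diff(a, b, "good-branch", "broken-branch",
                                     lineterm=""))
```

An empty result means the two definitions are semantically identical, which would point the investigation away from that resource and toward the ones that still differ.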

This process aims to confirm the availability and functionality of each component within the ADF.