Can we use different runtimes in the Copy Activity of Azure Data Factory?


I am using a Copy Activity to migrate data from an on-premises database to a cloud database. I am using a Self-Hosted Integration Runtime for both the on-premises and the cloud database.

The integration runtime is different for the on-premises and the cloud database.

When I execute the pipeline, it shows an error saying that the source and target need to use the same self-hosted integration runtime.

Is it possible to execute a pipeline that uses 2 self-hosted integration runtimes?

If it is possible, please let me know how we can execute a pipeline that uses 2 different self-hosted integration runtimes.

3 Answers

Answer 1:

Check this article: https://learn.microsoft.com/en-us/azure/data-factory/concepts-integration-runtime

You don't need a self-hosted IR on both ends to move data from on-premises to the cloud. If the source is on-premises, you can use a self-hosted IR for it. If the target database is in the cloud, you can either use an Azure IR (with a private endpoint if required) or the same self-hosted IR. Also, please let me know why you want 2 self-hosted IRs here.
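
For illustration, here is a minimal sketch of how the two linked services could be defined in ADF JSON under that setup. The names (SelfHostedIR1, OnPremSqlServer, AzureSqlTarget) and connection strings are placeholders, not taken from the question: the on-premises linked service points at the self-hosted IR via connectVia, while the cloud-side linked service omits connectVia and therefore resolves to the default Azure IR.

```json
{
  "name": "OnPremSqlServer",
  "properties": {
    "type": "SqlServer",
    "typeProperties": {
      "connectionString": "<on-premises SQL Server connection string>"
    },
    "connectVia": {
      "referenceName": "SelfHostedIR1",
      "type": "IntegrationRuntimeReference"
    }
  }
}
```

```json
{
  "name": "AzureSqlTarget",
  "properties": {
    "type": "AzureSqlDatabase",
    "typeProperties": {
      "connectionString": "<Azure SQL Database connection string>"
    }
  }
}
```

A Copy activity whose source dataset uses OnPremSqlServer and whose sink dataset uses AzureSqlTarget then involves only one self-hosted IR, which should avoid the "same integration runtime" error.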

Answer 2:

I think this is what you have at the moment:

On-premises -> Self-hosted IR 1

Cloud databases -> Self-hosted IR 2

Can you please clarify what you mean by "on-cloud databases"? In any case, you can try a two-step approach: write the data to a blob first, then read from the blob and write to the cloud databases.
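
As a rough sketch of that two-step idea, the pipeline below chains two Copy activities: the first moves data from the on-premises source to a blob staging dataset (running on the source's self-hosted IR), and the second moves it from the blob to the cloud database. All dataset names (OnPremSqlDataset, StagingBlobDataset, CloudSqlDataset) are hypothetical placeholders.

```json
{
  "name": "TwoStepCopyViaBlob",
  "properties": {
    "activities": [
      {
        "name": "CopyOnPremToBlob",
        "type": "Copy",
        "inputs": [ { "referenceName": "OnPremSqlDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "StagingBlobDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "SqlServerSource" },
          "sink": { "type": "DelimitedTextSink" }
        }
      },
      {
        "name": "CopyBlobToCloudDb",
        "type": "Copy",
        "dependsOn": [
          { "activity": "CopyOnPremToBlob", "dependencyConditions": [ "Succeeded" ] }
        ],
        "inputs": [ { "referenceName": "StagingBlobDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "CloudSqlDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "AzureSqlSink" }
        }
      }
    ]
  }
}
```

Each Copy activity is free to use whichever integration runtime its own source and sink linked services require, so the two self-hosted IRs never have to appear in the same copy.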

Answer 3:

Just in case you never resolved this: I overcame this same issue very recently, but I had to use an Azure Data Lake Gen2 storage account as a staging area. I think you were trying to avoid this option, but you should know that it added almost no time to the copy activity, and the data is temporary: it only exists during the activity and not afterwards. You'll find the [Enable staging] option on the Settings tab of the Copy activity.
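
For completeness, here is a hedged sketch of what the [Enable staging] setting looks like in the Copy activity JSON. The linked service name StagingAdlsGen2 and the dataset names are placeholders standing in for your own staging storage account and datasets.

```json
{
  "name": "CopyWithStaging",
  "type": "Copy",
  "inputs": [ { "referenceName": "OnPremSqlDataset", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "CloudSqlDataset", "type": "DatasetReference" } ],
  "typeProperties": {
    "source": { "type": "SqlServerSource" },
    "sink": { "type": "AzureSqlSink" },
    "enableStaging": true,
    "stagingSettings": {
      "linkedServiceName": {
        "referenceName": "StagingAdlsGen2",
        "type": "LinkedServiceReference"
      },
      "path": "staging-container/temp"
    }
  }
}
```

Data Factory writes the interim copy under that path and deletes it automatically once the activity finishes, which matches the temporary behaviour described above.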