I have an Azure Data Factory pipeline with a Copy Data step that reads data from a view (on a staging table) and pushes it into a database table. In the database, I changed a column name from QuanityReqUoM to QuantityReqUoM in the destination table. In Azure Data Factory I then re-imported the schema for the destination table dataset and re-imported the mapping schemas in the Copy Data activity in the pipeline. I saved everything and published, but when I trigger the pipeline I get:
```
ErrorCode=SqlOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=A database operation failed. Please search error to get more details.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Data.SqlClient.SqlException,Message=Invalid column name 'QuanityRequiredUoM'. Invalid column name 'QuanityRequiredUoM'.,Source=.Net SqlClient Data Provider,SqlErrorNumber=207,Class=16,ErrorCode=-2146232060,State=1,Errors=[{Class=16,Number=207,State=1,Message=Invalid column name 'QuanityRequiredUoM'.,},{Class=16,Number=207,State=1,Message=Invalid column name 'QuanityRequiredUoM'.,},],'
```
I have tried re-importing the schemas, both at the destination dataset level and at the pipeline Copy Data mapping level, then re-saving and re-publishing, with no effect.
Select the dataset and open its definition in JSON format, which looks like the example below.
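This is a minimal sketch of an Azure SQL table dataset definition; the dataset name, linked service name, table name, and the extra ItemId column are placeholders. The part that matters here is the `schema` array, where a stale column entry such as the old QuanityReqUoM would need to be corrected by hand:

```json
{
    "name": "DestinationTableDataset",
    "properties": {
        "linkedServiceName": {
            "referenceName": "AzureSqlLinkedService",
            "type": "LinkedServiceReference"
        },
        "type": "AzureSqlTable",
        "schema": [
            {
                "name": "ItemId",
                "type": "int"
            },
            {
                "name": "QuantityReqUoM",
                "type": "nvarchar"
            }
        ],
        "typeProperties": {
            "schema": "dbo",
            "table": "DestinationTable"
        }
    }
}
```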
Just add or modify the definition manually and press “OK”, then go to the Copy Data or Data Flow activity and refresh/reset the mapping. It will pick up the changes you made to the JSON definition.