Scenario: We have a Copy activity in an Azure Data Factory pipeline that copies data from Azure Blob Storage (ADLS) to Azure SQL Database. In the sink, we use a SQL stored procedure to upsert the data with a MERGE statement. The stored procedure takes a table type (table-valued parameter) that holds the data from the source.
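As a rough sketch of this setup (the actual table, type, and procedure names are not given in the post, so the names below are hypothetical), the sink stored procedure typically looks something like this:

```sql
-- Hypothetical names: dbo.CustomerType, dbo.usp_UpsertCustomer, dbo.Customer.
-- The Copy activity passes the source rows into @Customers as a table type.
CREATE TYPE dbo.CustomerType AS TABLE
(
    CustomerId INT,
    Name       NVARCHAR(100),
    Email      NVARCHAR(256)
);
GO

CREATE PROCEDURE dbo.usp_UpsertCustomer
    @Customers dbo.CustomerType READONLY
AS
BEGIN
    -- Upsert: update matching rows, insert new ones.
    MERGE dbo.Customer AS tgt
    USING @Customers AS src
        ON tgt.CustomerId = src.CustomerId
    WHEN MATCHED THEN
        UPDATE SET tgt.Name = src.Name, tgt.Email = src.Email
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerId, Name, Email)
        VALUES (src.CustomerId, src.Name, src.Email);
END
```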
While running the pipeline, it fails with the error message below:
- Error: The MERGE statement attempted to UPDATE or DELETE the same row more than once.
- { "errorCode": "2200", "message": "ErrorCode=SqlOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=A database operation failed with the following error: 'The MERGE statement attempted to UPDATE or DELETE the same row more than once. This happens when a target row matches more than one source row. A MERGE statement cannot UPDATE/DELETE the same row of the target table multiple times. Refine the ON clause to ensure a target row matches at most one source row, or use the GROUP BY clause to group the source rows.',Source=,''Type=System.Data.SqlClient.SqlException,Message=The MERGE statement attempted to UPDATE or DELETE the same row more than once. This happens when a target row matches more than one source row. A MERGE statement cannot UPDATE/DELETE the same row of the target table multiple times. Refine the ON clause to ensure a target row matches at most one source row, or use the GROUP BY clause to group the source rows.,Source=.Net SqlClient Data Provider,SqlErrorNumber=8672,Class=16,ErrorCode=-2146232060,State=1,Errors=[{Class=16,Number=8672,State=1,Message=The MERGE statement attempted to UPDATE or DELETE the same row more than once. This happens when a target row matches more than one source row. A MERGE statement cannot UPDATE/DELETE the same row of the target table multiple times. Refine the ON clause to ensure a target row matches at most one source row, or use the GROUP BY clause to group the source rows.,},],'", "failureType": "UserError", "target": "Copy data to Azure SQL", "details": [] }
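SQL error 8672 is raised whenever more than one source row matches the same target row in a MERGE. Following the guidance in the error text itself, one common remedy (a sketch of the general technique, not necessarily the fix used in this post; the table and column names are hypothetical) is to deduplicate the source rows on the merge key before merging:

```sql
-- Keep at most one source row per merge key (CustomerId here) so each
-- target row matches at most one source row, avoiding error 8672.
WITH DedupedSource AS
(
    SELECT CustomerId, Name, Email,
           ROW_NUMBER() OVER (PARTITION BY CustomerId
                              ORDER BY CustomerId) AS rn
    FROM @Customers
)
MERGE dbo.Customer AS tgt
USING (SELECT CustomerId, Name, Email
       FROM DedupedSource
       WHERE rn = 1) AS src
    ON tgt.CustomerId = src.CustomerId
WHEN MATCHED THEN
    UPDATE SET tgt.Name = src.Name, tgt.Email = src.Email
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerId, Name, Email)
    VALUES (src.CustomerId, src.Name, src.Email);
```

In practice you would replace `ORDER BY CustomerId` with a column that picks the "winning" duplicate deterministically, such as a last-modified timestamp.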
Solution: On analyzing, we found that our SQL table had one column set to NOT NULL that we were no longer using in our database. After deleting this column from the SQL table, our pipeline ran successfully.
- If we had required this column, we would have checked whether the source was sending data for it. If the data was not consistent (i.e., we saw some NULLs), we would have altered the table schema to make the column allow NULLs.
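Either of the two options above is a one-line schema change. As a sketch (the table and column names here are hypothetical, and the column's data type must match your actual schema):

```sql
-- Option 1: drop the unused NOT NULL column entirely.
ALTER TABLE dbo.Customer DROP COLUMN LegacyCode;

-- Option 2 (instead of dropping): keep the column but allow NULLs,
-- so rows where the source sends no value can still be loaded.
ALTER TABLE dbo.Customer ALTER COLUMN LegacyCode NVARCHAR(50) NULL;
```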