https://www.social.msdn.microsoft.com/Forums/en-US…
ADF Custom activity to push data from data lake to on-premises database ...
It is not supported to use a Data Management Gateway from a custom activity to access on-premises data sources. Currently, the gateway supports only the Copy activity and Stored Procedure activity in Data Factory. So I would suggest staging your data from ADLS to Azure Storage using the custom activity, and then copying your data from the storage table to the Oracle table using a Copy activity.
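As a rough sketch of that staging pattern, an ADF v2-style pipeline (the thread predates v2, so this is a translation, and names such as AzureBatchLinkedService, StagedBlobDataset, and OracleTableDataset are hypothetical) would chain a Custom activity into a Copy activity:

{
  "name": "StageThenCopyPipeline",
  "properties": {
    "activities": [
      {
        "name": "StageAdlsToBlob",
        "type": "Custom",
        "linkedServiceName": { "referenceName": "AzureBatchLinkedService", "type": "LinkedServiceReference" },
        "typeProperties": {
          "command": "Staging.exe",
          "resourceLinkedService": { "referenceName": "BlobStagingLinkedService", "type": "LinkedServiceReference" },
          "folderPath": "customactivity/staging"
        }
      },
      {
        "name": "CopyBlobToOracle",
        "type": "Copy",
        "dependsOn": [ { "activity": "StageAdlsToBlob", "dependencyConditions": [ "Succeeded" ] } ],
        "inputs": [ { "referenceName": "StagedBlobDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "OracleTableDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": { "type": "OracleSink" }
        }
      }
    ]
  }
}

Note that the Copy hop to on-premises Oracle still needs a gateway, or in v2 terms a self-hosted integration runtime, on the Oracle linked service.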
https://www.social.msdn.microsoft.com/Forums/en-US…
Unable to access Monitor & Control of Azure Data Factory
The problem is that when you deploy your data factory via PowerShell, for some reason it expects a gateway object before it can continue. It works if you deploy through the portal (at least for DB -> DB; I haven't tested API -> DB).
https://www.social.msdn.microsoft.com/Forums/sqlse…
Azure Data Factory Linked Service error - Failed to get the secret from ...
I am creating a linked service to a remote server in Azure Data Factory v2. The remote server uses a username-password authentication mechanism. I have already created a linked service to the same server by entering both the username and password in the linked service creation window, and it works fine. I would like to store the password as a secret in Azure Key Vault and access that secret from Azure ...
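For reference, the documented way to point a v2 linked service at a Key Vault secret is an AzureKeyVaultSecret property in place of the literal password; a minimal sketch, where RemoteSqlServerLinkedService, AzureKeyVaultLinkedService, and RemoteServerPassword are hypothetical names:

{
  "name": "RemoteSqlServerLinkedService",
  "properties": {
    "type": "SqlServer",
    "typeProperties": {
      "connectionString": "Server=remoteserver;Database=mydb;User ID=myuser;",
      "password": {
        "type": "AzureKeyVaultSecret",
        "store": {
          "referenceName": "AzureKeyVaultLinkedService",
          "type": "LinkedServiceReference"
        },
        "secretName": "RemoteServerPassword"
      }
    }
  }
}

A "failed to get the secret" error with this shape is commonly the data factory's managed identity missing a Get permission on secrets in the vault's access policy.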
https://www.social.msdn.microsoft.com/Forums/windo…
Error while creating metric alerts in ADF using .NET SDK
Hi, I am trying to create metric alerts in Azure Data Factory using the .NET SDK. Below is my JSON request:
https://www.social.msdn.microsoft.com/Forums/en-US…
Copying multiple tables from a SQL Server to Blob Storage
I would now like to be able to transfer all the tables in a database from SQL Server to Blob storage. While defining my dataset for this, I noticed that there wasn't a way to define all the tables in the same JSON file.
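A common pattern for the all-tables case in ADF v2 (offered here as a sketch, not from the thread) is a Lookup activity that queries INFORMATION_SCHEMA.TABLES, feeding a ForEach that runs a parameterized Copy per table; SqlServerDataset, SqlTableDataset, and BlobFolderDataset are hypothetical datasets, the latter two parameterized:

{
  "name": "CopyAllTablesPipeline",
  "properties": {
    "activities": [
      {
        "name": "ListTables",
        "type": "Lookup",
        "typeProperties": {
          "source": {
            "type": "SqlServerSource",
            "sqlReaderQuery": "SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'"
          },
          "dataset": { "referenceName": "SqlServerDataset", "type": "DatasetReference" },
          "firstRowOnly": false
        }
      },
      {
        "name": "CopyEachTable",
        "type": "ForEach",
        "dependsOn": [ { "activity": "ListTables", "dependencyConditions": [ "Succeeded" ] } ],
        "typeProperties": {
          "items": { "value": "@activity('ListTables').output.value", "type": "Expression" },
          "activities": [
            {
              "name": "CopyTableToBlob",
              "type": "Copy",
              "inputs": [
                {
                  "referenceName": "SqlTableDataset",
                  "type": "DatasetReference",
                  "parameters": { "schema": "@item().TABLE_SCHEMA", "table": "@item().TABLE_NAME" }
                }
              ],
              "outputs": [ { "referenceName": "BlobFolderDataset", "type": "DatasetReference" } ],
              "typeProperties": {
                "source": { "type": "SqlServerSource" },
                "sink": { "type": "BlobSink" }
              }
            }
          ]
        }
      }
    ]
  }
}

Because the inner dataset takes schema and table as parameters, a single JSON definition covers every table.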
https://www.social.msdn.microsoft.com/Forums/azure…
Complex JSON with nested arrays to SQL: Help!
Summary: a REST data source returns JSON that needs to be transferred into an on-premises SQL environment. I'd like one table per array object. I've got the REST pull working fine and can drop it into a JSON blob. A sample of the JSON is at the bottom.
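If the Copy activity route is taken, its TabularTranslator can unroll one nested array per copy run via collectionReference, so "one table per array object" becomes one Copy activity per array; a sketch of the relevant typeProperties, where $.orders and the column names are hypothetical:

"typeProperties": {
  "source": { "type": "RestSource" },
  "sink": { "type": "SqlSink" },
  "translator": {
    "type": "TabularTranslator",
    "collectionReference": "$.orders",
    "mappings": [
      { "source": { "path": "$.customerId" }, "sink": { "name": "CustomerId" } },
      { "source": { "path": "id" },           "sink": { "name": "OrderId" } },
      { "source": { "path": "amount" },       "sink": { "name": "Amount" } }
    ]
  }
}

Paths that start with $ are resolved from the document root; bare paths such as id and amount are relative to the collectionReference array, and each additional array would get its own Copy activity with its own collectionReference.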
https://www.social.msdn.microsoft.com/Forums/sqlse…
Azure Portal: How to pass Docker-related settings from Azure Data ...
But when I trigger my batch job from Azure Data Factory, I get the error: Task failed: "Container-enabled compute node requires task container settings". I know we need to pass parameters like "Image name" to a task, like the following:
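For context, the error refers to the containerSettings block that an Azure Batch task itself must carry when it runs on a container-enabled pool; a sketch of the Batch task JSON (the task id, image name, and command line are hypothetical), while whether ADF's Custom activity can forward these settings is exactly the open question in the thread:

{
  "id": "adf-custom-activity-task",
  "commandLine": "python process.py",
  "containerSettings": {
    "imageName": "myregistry.azurecr.io/myimage:latest",
    "containerRunOptions": "--rm"
  }
}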
https://social.msdn.microsoft.com/Forums/security/…
Copy data activity - ErrorCode 2200 ...
OK, I get it. Thanks for checking that out. Now back to part of the original question: I could not find any way to turn off case sensitivity within the Copy Data activity. Is there a way to turn off that case sensitivity so that a column named MyField will be matched successfully with a column named myField?
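There is no case-insensitivity switch for the Copy activity's default by-name matching as far as I can tell, but an explicit column mapping in the translator sidesteps the name comparison entirely; a sketch using the thread's MyField/myField pair (OtherField is a hypothetical second column):

"translator": {
  "type": "TabularTranslator",
  "mappings": [
    { "source": { "name": "MyField" },    "sink": { "name": "myField" } },
    { "source": { "name": "OtherField" }, "sink": { "name": "otherField" } }
  ]
}

With explicit mappings the default name matching is bypassed altogether, so the case mismatch never comes into play.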
https://www.social.msdn.microsoft.com/Forums/sqlse…
Upsert is not working properly in Copy Data Activity with D365 as the sink
I have already created an alternate key in D365 from a combination of two columns on the said entity, and made sure to set it in the Copy Data Activity. But it still goes ahead and creates the existing record again.
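For comparison, a Copy activity sink configured for Dynamics upsert against an alternate key looks roughly like this sketch; my_compositealternatekey is a hypothetical placeholder, and a common gotcha is that alternateKeyName must be the key's name as defined in D365, not its display name:

"sink": {
  "type": "DynamicsSink",
  "writeBehavior": "upsert",
  "alternateKeyName": "my_compositealternatekey",
  "ignoreNullValues": false
}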