dataverse azure integration Archives - Microsoft Dynamics 365 Blog
https://microsoftdynamics.in/tag/dataverse-azure-integration/

Microsoft Entra ID Security Groups Management
https://microsoftdynamics.in/2023/12/26/microsoft-entra-id-security-groups-management/
Tue, 26 Dec 2023

As we all know, Microsoft renamed Azure Active Directory (AAD) to Microsoft Entra ID in 2023. The reasons for this move are explained in detail in this Microsoft doc. This is purely a product name change: all of the existing features and capabilities are still available in Microsoft Entra ID.

This blog provides a quick walkthrough of managing Microsoft Entra ID security groups and team members.

Managing teams from the Microsoft Entra admin center

Navigate to the Power Platform admin center; under Admin centers, "Microsoft Entra ID" is available as one of the options.


Clicking it redirects you to the Microsoft Entra admin center. From the sitemap, navigate to Identity > expand Users > select All users; all the users are displayed as shown below.


Managing from Azure Portal

The other way is to log in to the Azure Portal; under All services, select "Microsoft Entra ID" as shown below:


NOTE: The Azure AD product icon has been replaced with the Microsoft Entra ID product icon.

Now let's create a team in CRM with the type "Microsoft Entra ID Security Group" and manage its members. If you don't know how to create teams in CRM, refer to our previous blog, where we explained how to create a team of type "AAD Security Group", which has now been renamed "Microsoft Entra ID Security Group".

To create a team, go to the Power Platform admin center > select the appropriate Environment > Settings > Teams > click Create team. A quick create form opens as below.

You will observe that the renamed options are displayed under Team Type: "Microsoft Entra ID Security Group" (formerly "AAD Security Group") and "Microsoft Entra ID Office Group" (formerly "AAD Office Group").


When you select a Team Type, its relevant fields become visible. In this scenario, with Team Type set to "Microsoft Entra ID Security Group", the fields below appear on the form:

  • Group Name
  • Membership Type


  • Group Name – When you start typing the group name here, it helps you select from the groups that already exist in the Microsoft Entra admin center. For this demonstration, select the "Sales Team" group as below:


Before entering the text, make sure the groups are pre-created in the Microsoft Entra admin center. As you can see, "Sales Team" was already created as a security group in the Microsoft Entra admin center:


  • Membership Type – This defaults to "Members and guests". If needed, you can change it to Members, Owners, or Guests.

When you open the team in CRM, you will find that the "Azure AD Object Id for a group" field is auto-populated, matching the Object ID of the group created in the Microsoft Entra admin center.
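For reference, a group team like this can also be created programmatically. Below is a minimal Dataverse Web API sketch (not from the original post): teamtype 2 is the documented option value for an Entra (AAD) security group team, membershiptype 0 is "Members and guests", and the object ID is a placeholder for your group's ID.

```http
POST [Organization URI]/api/data/v9.2/teams HTTP/1.1
Content-Type: application/json

{
  "name": "Sales Team",
  "teamtype": 2,
  "membershiptype": 0,
  "azureactivedirectoryobjectid": "00000000-0000-0000-0000-000000000000"
}
```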

Screen clip of “Sales Team” in CRM


NOTE: It has been observed that when an "Entra Security Group" type team is created in CRM, its associated queue is not created, and hence no "Default Queue" is granted for this type of team, as happens when you create an Owner-type team.

Screen clip of “Sales Team” in Microsoft Entra admin center

From the sitemap, navigate to Identity > expand Groups > select All groups, and open the appropriate group as shown below:


After creating the team, you can add members and select corresponding security roles.

NOTE: When you add members to the team from the Microsoft Entra admin center, the change is not reflected in the CRM team instantly; it appears only after that user accesses the environment for the first time.


As you can see in the Microsoft Entra admin center, the above group has 2 members, but not all users are synced instantly to the corresponding team created in CRM. If you check the team members list in CRM, you may therefore see a discrepancy in the count; only 1 member is shown below:


The simple reason the other member (Mike in our scenario) is not displayed is that the list only shows users who have accessed the environment, and Mike hasn't accessed it yet. Once he accesses the environment, he will be added to the team instantly and will inherit its security roles at runtime.
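To verify which group members have actually been synced into the CRM team, you can query the team's members through the Dataverse Web API (an illustrative sketch, not from the original post; the team ID is a placeholder, and teammembership_association is the documented team-to-user relationship):

```http
GET [Organization URI]/api/data/v9.2/teams(00000000-0000-0000-0000-000000000000)/teammembership_association?$select=fullname,domainname HTTP/1.1
Accept: application/json
```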

NOTE: As per Microsoft, the team member list in CRM doesn't show all the members of the Microsoft Entra group. A group member is added to or removed from the CRM group team only the next time that member accesses the environment. You can refer to the Note section of this Microsoft doc for more details.

Conclusion

From within the Microsoft Entra Admin Center you can manage Groups, Group Members, Group Licensing, and Group Security quickly and easily.


Load Data in Dynamics 365 CRM using Azure Copy Data Activity tool
https://microsoftdynamics.in/2021/06/30/load-data-in-dynamics-365-crm-using-azure-copy-data-activity-tool/
Wed, 30 Jun 2021

Introduction

We recently had a business requirement to load data into Dynamics 365 CRM. In our case, users create Account records in one CRM and, at the end of the day, those records should be loaded into another Dynamics 365 CRM. To achieve this we used the Azure Copy Data activity, and we configured a schedule on the "Copy Data" tool so that it upserts records into the target system once a day.

In this blog, we explain how we configured the 'Azure Copy Data Activity' tool to load records. So, let's look at the steps we followed.

Step 1: Sign in to Azure at https://portal.azure.com and navigate to your Data factory. If you don't have a Data factory, create one first in order to use the 'Copy Data' activity. Please refer to the link below to learn how to create a Data factory:

https://docs.microsoft.com/en-us/azure/data-factory/quickstart-create-data-factory-portal#create-a-data-factory

Step 2: We already have a Data factory, "DataLoadActivity", so we are using that; to add the Copy Data activity, you need to add a pipeline.

DataloadActivity (Data factory) -> Author & Monitor -> Author -> Create new/Existing Pipeline -> Drag and Drop Copy Data.

Please refer to the below screenshots:

Navigate to Author & Monitor


Navigate to Author


Add Pipeline


Drag and drop the 'Copy Data' tool.


We can change its name from the 'General' tab; we have renamed it 'Load Accounts'.


Step 3: Next, we need to add a new dataset and configure a connection to Dynamics 365 CRM. To configure the dataset, click on the Dataset menu and add a New Dataset as shown below:

This opens the Data Stores window. Here we need to select Dataverse (Common Data Service for Apps) and click Continue.


Once the dataset is added, we need to add a linked service. To do so, click on the +New button; another window opens where we need to set the CRM connection details.


Here, we need to provide the Name, Service URI, Authentication type (AAD Service Principal), Service Principal ID (Azure Active Directory App ID) and Service Principal Key (secret key). After the 'Test connection' succeeds, click Create.


  • Name: Connection name
  • Connect via integration runtime: AutoResolveIntegrationRuntime
  • Deployment Type: Has two options, Online and OnPremisesWithIfd. As we are connecting to Dynamics 365 CRM Online, select Online.
  • Service URI: The Dynamics 365 CRM URL
  • Authentication Type: Has two options, AAD Service Principal and Office365. Select the 'AAD Service Principal' option; FYI, Office365 is now deprecated.
  • Service Principal credential type: Select 'Service principal key'; it will then ask for the Azure Active Directory App ID and secret key.
  • Service Principal ID: The Azure Active Directory Application (client) ID.
  • Service Principal Key: The Azure Active Directory secret key.
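Put together, these settings correspond to a Data Factory linked service definition along the following lines. This is a sketch with placeholder names and values; the property names are the ones used by the Data Factory Dynamics connector.

```json
{
  "name": "TargetCrmLinkedService",
  "properties": {
    "type": "Dynamics",
    "typeProperties": {
      "deploymentType": "Online",
      "serviceUri": "https://yourorg.crm.dynamics.com",
      "authenticationType": "AADServicePrincipal",
      "servicePrincipalCredentialType": "ServicePrincipalKey",
      "servicePrincipalId": "<application-client-id>",
      "servicePrincipalCredential": {
        "type": "SecureString",
        "value": "<client-secret>"
      }
    },
    "connectVia": {
      "referenceName": "AutoResolveIntegrationRuntime",
      "type": "IntegrationRuntimeReference"
    }
  }
}
```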

To establish the connection, you must have an Azure Active Directory app and use its App ID and secret key. You can refer to the article below to learn how to configure/create an Azure Active Directory app:

https://www.inogic.com/blog/create-azure-active-directory-app

Once the Azure Active Directory app is configured, you need to create an Application User in CRM. To configure the Application User, we need to provide the Azure Active Directory Application ID (App ID) as shown below:


After this, we need to select the table name under Connection. As we want to load data for the Account table, we select the Account table here. In the same way, we set up a dataset/connection for both the source and target CRMs.


Step 4: Now we can set up the "Copy Data" tool. First, we need to set up the Source details, as shown below. Here we have selected a table, but as per the requirement we could also select Query and pass a FetchXML query.
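For example, the source side of the copy activity with a FetchXML query would look roughly like this (an illustrative sketch; the fetch XML is a trivial placeholder):

```json
"source": {
  "type": "DynamicsSource",
  "query": "<fetch><entity name='account'><attribute name='name'/><attribute name='accountnumber'/></entity></fetch>"
}
```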


Step 5: Next, we need to set up the Sink details. Here we have selected the target CRM and set the write behavior to Upsert. It also has other properties, e.g. whether to ignore null values and whether to upsert based on an alternate key.


Note: If you don't want to upsert records based on the primary key field (i.e., accountid), you can use the alternate key property. Define an alternate key on your target CRM table, and it will then appear here in the Alternate key name list.

Please refer to the article below to learn how to configure an alternate key in CRM:

https://docs.microsoft.com/en-us/dynamics365/customerengagement/on-premises/customize/define-alternate-keys-reference-records?view=op-9-1#define-alternate-keys
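The sink configuration then looks roughly like the following sketch. The alternateKeyName value is a hypothetical key name for illustration and is only needed when you are not upserting on the primary key.

```json
"sink": {
  "type": "DynamicsSink",
  "writeBehavior": "upsert",
  "ignoreNullValues": false,
  "alternateKeyName": "new_accountnumberkey"
}
```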

Step 6: Once Sink details are configured, we need to set field mapping as shown below:


Note: If you don’t see mapping fields, please click on Import Schemas.

Step 7: We can also add a trigger and configure a schedule to run the 'Copy Data' tool.


There are multiple options for the trigger interval, i.e., minutes/hourly/day/week. As per our requirement, we have set it to Day(s), as shown in the screenshot below.
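A daily schedule trigger definition looks roughly like this (a sketch; the trigger name, pipeline name, and start time are placeholders):

```json
{
  "name": "DailyLoadTrigger",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "startTime": "2021-07-01T00:00:00Z",
        "timeZone": "UTC"
      }
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "LoadAccountsPipeline",
          "type": "PipelineReference"
        }
      }
    ]
  }
}
```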


Step 8: Publish all.


We can also monitor the run history. To check it, navigate to Monitor -> Pipeline runs.


We can also run the pipeline by clicking Debug, without setting any trigger; please refer to the screenshot below:


Note: I couldn't find an option to perform only an Update operation instead of an Upsert.

Conclusion:

As illustrated above, with the help of the Azure Copy Data activity, we can load data into Dynamics 365 CRM.

Reference link: https://docs.microsoft.com/en-us/azure/data-factory/quickstart-create-data-factory-copy-data-tool#start-the-copy-data-tool


Impersonation within Azure Function or Custom Connector when using AAD authentication
https://microsoftdynamics.in/2020/09/03/impersonation-within-azure-function-or-custom-connector-when-using-aad-authentication/
Thu, 03 Sep 2020

In earlier blog posts, we discussed setting up an Azure Function with AAD authentication and then creating a custom connector for that Azure Function, which also requires AAD authentication to make a connection.

The Azure Function is configured for AAD authentication in the Authentication / Authorization section of the Function App, as shown below.


Now that we have AAD authentication requiring a user login, it would be good if all operations were executed within the context of that same user.

Within your Azure Function, you can get the details of the logged-in user from the ClaimsPrincipal:

```csharp
ClaimsPrincipal principal = req.HttpContext.User;

if (principal.Identity != null)
{
    log.LogInformation("Claims identity " + principal.Identity.Name);
}

if (principal.Claims != null)
{
    foreach (Claim c in principal.Claims)
    {
        log.LogInformation("CLAIM TYPE: " + c.Type + "; CLAIM VALUE: " + c.Value);
    }
}
```

In the console, you can see all the claims returned


One of the claims returned is AADID


Read this specific claim value:

```csharp
Claim claim = principal.Claims.FirstOrDefault(c => c.Type.Contains("objectidentifier"));
string aadobjid = "";

if (claim != null)
{
    aadobjid = claim.Value;
    log.LogInformation("aadobjid = " + aadobjid);
}
```

Every CRM user we create has an associated AAD ID stored alongside it, which is this object ID.
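(For reference, this mapping lives in the azureactivedirectoryobjectid column of the systemuser table, so a user can be looked up from the claim value with a Dataverse Web API query like the following; this is an illustrative sketch, and the GUID is a placeholder.)

```http
GET [Organization URI]/api/data/v9.2/systemusers?$select=fullname&$filter=azureactivedirectoryobjectid eq 00000000-0000-0000-0000-000000000000 HTTP/1.1
Accept: application/json
```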

Set this on the CDS client object we have created, to enable impersonation:

```csharp
// establish connection with CDS
CdsServiceClient client = new CdsServiceClient(connectionString);

if (!string.IsNullOrEmpty(aadobjid))
{
    client.CallerAADObjectId = new Guid(aadobjid);
}
```

Do note that if you run a WhoAmI request, it still returns the ID of the original credentials used to establish the connection.

However, when you create a record, you will notice that the owner of the new record is the user who logged in to the connector.

Building Custom Connectors for Power Apps and Power Automate Flows – Part 2
https://microsoftdynamics.in/2020/08/31/building-custom-connectors-for-power-apps-and-power-automate-flows-part-2/
Mon, 31 Aug 2020

With our Azure Function ready from the earlier post, we now look at the steps to create a custom connector for it.

Do remember, we have enabled AAD authentication for our Azure function.

The APP registration provided while enabling AAD authentication was set for multi-tenant authentication.


While there is an easy way to create a custom connector outside the scope of a solution, since we will also cover moving custom connectors from one environment to another, we will look at the steps to create the connector from within a solution.

Navigate to https://make.powerapps.com and open your solution. Next, choose New > Other > Custom Connector.


In the first step, provide connector details like name, image and a short description. In the Host field, add the host name of your Azure Function, i.e. if your Azure Function URL was

https://xxx.azurewebsites.net/api/GetTaxRate?code=xxxx

the host would be xxx.azurewebsites.net

Next comes security. Since we are looking for AAD authentication, we will choose OAuth 2.0 in this step.


Client ID = the App ID of the app registration created while setting up AAD security for the Azure Function

Client Secret = the secret key of the above app.

Leave the login URL and tenant ID as is.

Resource URL = copy and paste the client ID (the app ID of the app registration)

Ref: Why leave common in tenant id –  https://docs.microsoft.com/en-us/azure/active-directory/develop/howto-convert-app-to-be-multi-tenant
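For reference, these security settings end up in the connector's OpenAPI (swagger) definition in roughly this form. This is a hedged sketch of the shape only; the definition name, endpoints, and empty scopes are illustrative placeholders rather than an exact export.

```json
"securityDefinitions": {
  "oauth2_auth": {
    "type": "oauth2",
    "flow": "accessCode",
    "authorizationUrl": "https://login.windows.net/common/oauth2/authorize",
    "tokenUrl": "https://login.windows.net/common/oauth2/token",
    "scopes": {}
  }
}
```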

After you save the connector, the Redirect URL will be generated as shown above. Copy the redirect URL; we need to add it to our app registration (the same one whose client ID you provided here).

Next, in the Definition tab, we provide the details of the actions that we wish to expose through our connector.

Under Actions, choose New action and provide the action details. The details entered here will be displayed to users of your connector and this action, so make sure to include all the descriptive information here.


In the Request section, click 'Import from sample' and provide the request details. You can copy the URL from Postman, where we made the test call.


My request has 2 query parameters, and in the header I need to pass the CRM connection details, such as the URL and the client ID and secret for connecting to the target environment.

When you click Import, you will see that all the query parameters and header information have been generated for you.


Click Update Connector to save all changes to the connector.

Before we test the connector, we need to complete a few additional settings in the Azure app registration.

Completing the authentication settings on Azure AD APP registration

Navigate to Authentication section of the App Registration


Click Add a platform


Choose Web and paste the Redirect URI copied from the connector


Click Save to save this redirect uri.

Next, navigate to the 'Expose an API' section of the app registration and click 'Add a scope'.

The details provided here are displayed on the consent screen when a user from another tenant makes a connection to our connector.


Test the connection for the connector:

Navigate to https://make.powerapps.com > Data > Custom Connectors


Click the + button against the connector to build a connection to it. When you click, you should be prompted to log in to Azure AD.


Enter the credentials. If you enter credentials for an environment other than the current one, you will be prompted with the following screen:


Click Accept and you should have a successful connection object created for this connector.

Note: sometimes, after this screen, it may display an error about the app registration. Try again and the next time it succeeds 🙂

Moving this custom connector to another environment

Since we added this connector to a solution, simply export the solution as a managed solution and import it into the other environment.

Navigate to Custom connectors tab to find the connector listed there


Click the + button to test connection with the connector

If you get an error about invalid client id and secret key,


edit the connector and fill in the Security page once again. Use the same client ID and secret key that we added when creating this connector in the other environment.


Consuming this connector through Power Automate Flows

We are now ready to create a flow and add a step for the operation in our connector.

Choose Manual Trigger of flow


In the Custom tab, you should find our connector


Choose our action


And now provide the requested details in an end user friendly UI


And the result is


Make Power BI report using data from Azure SQL server and view in Dynamics 365
https://microsoftdynamics.in/2018/11/05/make-power-bi-report-using-data-from-azure-sql-server-and-view-in-dynamics-365/
Mon, 05 Nov 2018

Introduction:

In this blog, we will see how to show data from SQL Server in Dynamics CRM. First, you need to connect Power BI Desktop to SQL Server; after that, we can show the Power BI report in Dynamics CRM.

Step 1:

Register with Microsoft Power BI and sign in with your Dynamics 365 credentials.


Step 2:

Once you sign in to Power BI, click on the Get Data button at the top left.


Step 3:

Then select Azure SQL Database as the source and click Connect.


Step 4:

After clicking Connect, enter the server details of your Azure SQL database and specify the database on which you are going to create reports.


Meaning of the Data Connectivity Mode selection:

  1. Import – Imports all the tables you selected from SQL Server into Power BI. For real-time data, you have to refresh the dataset manually, or you can schedule a refresh for it.

  2. DirectQuery – Runs queries directly against SQL Server, meaning the data is not imported into Power BI. Here you get real-time data, so there is no need to refresh the dataset in Power BI.

Step 5:

After clicking OK, select the tables you want to import into Power BI, then click the Load button.


Step 6:

Here you can see your tables imported from SQL Server.


Step 7:

Here you can see relationship between your tables.


Step 8:

Here you can see all the data of tables.


Step 9:

Now create a report using power bi visualizations and filters.


Step 10:

To view this report in Dynamics CRM, we have to publish the report.


Step 11:

After publishing, the report will appear in the Power BI service.


Step 12:

Now click on Pin Live Page to pin the report to a Dynamics CRM dashboard. Then add that report to a new or existing dashboard.


Step 13:

Now go to the CRM dashboards; from here you can access this Power BI report.

Step 1: From here you can get all Power BI dashboard reports.


Step 2: After that, choose the workspace and dashboard where you pinned that Power BI report.


Step 3: Here is your Power BI report shown in the Dynamics CRM dashboard.


Conclusion:

Using the simple steps above, we can show Power BI reports backed by an Azure SQL database (data source) in Dynamics CRM.


Passing data from Dynamics 365 to Azure Service Bus Queue using Plugins/Workflows
https://microsoftdynamics.in/2018/08/16/passing-data-from-dynamics-365-to-azure-service-bus-queue-using-plugins-workflows/
Thu, 16 Aug 2018

Introduction:

Recently we had a business requirement where we needed to pass data from Dynamics 365 CRM to an Azure Service Bus queue through plugins and workflows. After some research and experimentation, we found a solution.

For this, we require an Azure Service Bus, an Azure Function, and a webhook. The CRM data is delivered to Azure using the webhook, which is registered to connect to the Azure Function. When a plugin/workflow is triggered, the webhook is called from the code, passing the data to the Azure Function in JSON format; the Azure Function then adds the data to the Azure Service Bus (ASB) queue.


The detailed steps are as follows:-

1. Create an Azure Service Bus:

Open the Azure portal for your CRM organization and create a Service Bus namespace by navigating to + Create a Resource >> Integration >> Service Bus

2. Create an Azure function which will add the CRM data to Azure Service Bus Queue:-

a. Navigate to + Create a Resource >> Compute >> Function App

b. Create a C# HttpTrigger Function in it

c. Click on “Get function URL” link


This URL link is important and will be used in CRM Plugin/Workflow code and webhook registration.

d. Click on Integrate and + New Output to create a queue in ASB and provide it to Azure function


e. Select Azure Service Bus


f. Select "Service Bus Queue" as the Message Type. You can set the "Service Bus connection" by clicking New and selecting the desired Service Bus.


g. Click on the function "f functionname" and paste the following code, which adds the JSON data from CRM to the ASB queue. Save the code.

```csharp
using System;
using System.Net;

public static async Task<object> Run(HttpRequestMessage req, IAsyncCollector<object> outputSbMsg, TraceWriter log)
{
    log.Info($"Webhook was triggered!");

    // read the JSON payload sent from CRM
    string jsonContent = await req.Content.ReadAsStringAsync();
    log.Info("jsonContent " + jsonContent);

    // add the message to the Service Bus queue via the output binding
    await outputSbMsg.AddAsync(jsonContent);
    log.Info("added to queue");

    return req.CreateResponse(HttpStatusCode.OK);
}
```

Here, HttpRequestMessage req carries the JSON data from CRM, and IAsyncCollector<object> outputSbMsg is the ASB queue output binding.
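The "+ New Output" step in 2d generates a Service Bus output binding behind the scenes; the function's function.json ends up looking roughly like this (a sketch; the queue name and connection setting name are placeholders):

```json
{
  "bindings": [
    {
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "authLevel": "function"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    },
    {
      "type": "serviceBus",
      "direction": "out",
      "name": "outputSbMsg",
      "queueName": "crmqueue",
      "connection": "MyServiceBusConnection"
    }
  ]
}
```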


3. Assume we are sending Account entity data to Azure Service Bus (ASB). The data should be in JSON format before it is passed to Azure. Below is the plugin/workflow code:

```csharp
using System.IO;
using System.Net;
using System.Runtime.Serialization.Json; // Install from NuGet Package Manager
using System.Text;

// Create a class whose members are the data attributes to be passed to Azure
[DataContract]
public class AccountObj
{
    [DataMember]
    public string AccountID { get; set; }
    [DataMember]
    public string Name { get; set; }
    [DataMember]
    public string Telephone { get; set; }
}

// Insert the following code after getting the primary entity in the plugin/workflow
using (WebClient webClient = new WebClient())
{
    // Prepare the JSON data
    DataContractJsonSerializer serializer = new DataContractJsonSerializer(typeof(AccountObj));
    MemoryStream memoryStream = new MemoryStream();
    serializer.WriteObject(memoryStream, accObj);
    var jsonObject = Encoding.Default.GetString(memoryStream.ToArray());

    // Prepare the webhook call
    webClient.Headers[HttpRequestHeader.ContentType] = "application/json";

    // Azure Function key
    var code = "xPhPPB5tGCq86NRbe7wzgJika3bv4ahP9kw7xe5Asoja2vEk4fPqVw==&clientId=default";

    // Azure Function URL
    var serviceUrl = "https://callwebhookfromcrmassebly.azurewebsites.net/api/GenericWebhookCSharp1?code=" + code;

    // upload the JSON data to the service URL
    string response = webClient.UploadString(serviceUrl, jsonObject);
}
```

4. Webhook Registration:-

With the July 2017 Update, we now have the option to register a new Webhook through the Plugin Registration Tool. Download the latest Plugin Registration Tool from NuGet using the PowerShell script:

https://docs.microsoft.com/en-us/dynamics365/customer-engagement/developer/download-tools-nuget.

By registering a Webhook, we can send data (the execution context) about any operation performed in Dynamics 365 to external services or applications. The execution context is passed in JSON format.

a. Open Plugin Registration tool and Register New WebHook


b. Enter the registration details using the "Get function URL" value from step 2c


5. Register your plugin/workflow assembly.

Working: When the "Account Number" of an account record is changed, the plugin/workflow is triggered and executes the webhook. The webhook passes the data to the Azure Function, and the Azure Function adds the data to the queue. The result can be seen in the Azure Function Monitor section.


The CRM data is added as a message in the Azure Service Bus queue.

Conclusion:

Using the simple steps above, a user can pass data from Dynamics 365 to an Azure Service Bus queue using plugins/workflows.


The post Passing data from Dynamics 365 to Azure Service Bus Queue using Plugins/Workflows appeared first on Microsoft Dynamics 365 Blog.

Parse JSON string that represents the Dynamics 365 plugin execution context received in Azure Function https://microsoftdynamics.in/2018/06/25/parse-json-string-that-represents-the-dynamics-365-plugin-execution-context-received-in-azure-function/ Mon, 25 Jun 2018 12:20:28 +0000 https://www.inogic.com/blog/?p=12110 Introduction: In our previous blogs of this Integrating Dynamics 365 with Azure Functions series, we have gone through a walkthrough with an example of creating an Azure Function and call the same through the following, An example of directly calling an Azure function from traditional workflows. Register as a WebHook and invoke it from a...

The post Parse JSON string that represents the Dynamics 365 plugin execution context received in Azure Function appeared first on Microsoft Dynamics 365 Blog.

Introduction:

In our previous blogs of this Integrating Dynamics 365 with Azure Functions series, we walked through an example of creating an Azure Function and calling it in the following ways:

  1. An example of directly calling an Azure function from traditional workflows.
  2. Register as a WebHook and invoke it from a workflow.
  3. Register as a WebHook and register steps for messages that you would like the custom logic to be executed for.

In this blog, we will illustrate how to parse the JSON data received in the Azure Function. Let's assume we have registered a plugin step on update of an Account record which invokes a WebHook (in our case, an Azure Function).

  • Read JSON data from the request body:

When the plugin triggers and invokes a WebHook, three types of data are received in the request: the query string, header data, and the request body. The request body contains a string that represents the JSON value of the RemoteExecutionContext class. This class defines the contextual information sent to a remote service endpoint at run time. The code snippet below reads the content from the HttpRequestMessage and converts the received JSON string into a properly deserializable JSON string.

using System.Text;
using System.Net;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request");
    string jsonContext = await req.Content.ReadAsStringAsync();
    log.Info("Read context: " + jsonContext);
    jsonContext = FormatJson(jsonContext);
    log.Info("Formatted JSON Context string: " + jsonContext);
    return req.CreateResponse(HttpStatusCode.OK, "Success");
}

/// <summary>
/// Function to convert an unformatted JSON string to a formatted JSON string
/// </summary>
/// <param name="unformattedJson">string unformattedJson</param>
/// <returns>string formattedJson</returns>
public static string FormatJson(string unformattedJson)
{
    string formattedJson = string.Empty;
    try
    {
        formattedJson = unformattedJson.Trim('"');
        formattedJson = System.Text.RegularExpressions.Regex.Unescape(formattedJson);
    }
    catch (Exception ex)
    {
        throw new Exception(ex.Message);
    }
    return formattedJson;
}

 

Remark: Add a reference to System.Text.RegularExpressions so that the Regex.Unescape call resolves.
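To make the FormatJson behaviour concrete, here is a self-contained sketch of the same Trim-plus-Unescape logic; the sample input mimics how the webhook body arrives as a quoted, escaped JSON string (the payload value is made up):

```csharp
using System;
using System.Text.RegularExpressions;

public static class JsonUnwrap
{
    // Same logic as FormatJson above: strip the outer quotes,
    // then unescape the embedded JSON string.
    public static string FormatJson(string unformattedJson)
    {
        string formatted = unformattedJson.Trim('"');
        return Regex.Unescape(formatted);
    }
}
```

Given a body whose raw characters are "{\"MessageName\":\"Update\"}" (outer quotes included), the helper returns the plain JSON {"MessageName":"Update"}.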

 

  • Sample JSON string that gets received in the Azure Function:

Below is the sample JSON string.

{ "BusinessUnitId": "f0bf3c9a-8150-e811-a953-000d3af29fc0", "CorrelationId": "39499111-e689-42a1-ae8a-5b14a84514ce", "Depth": 1, "InitiatingUserId": "df010dad-f103-4589-ba66-76a5a04c2a11", "InputParameters": [ { "key": "Target", "value": { "__type": "Entity:http:\/\/schemas.microsoft.com\/xrm\/2011\/Contracts", "Attributes": [ { "key": "telephone1", "value": "1111" }, { "key": "accountid", "value": "ec4e2f7d-9d60-e811-a95a-000d3af24950" }, { "key": "modifiedon", "value": "\/Date(1527757524000)\/" }, { "key": "modifiedby", "value": { "__type": "EntityReference:http:\/\/schemas.microsoft.com\/xrm\/2011\/Contracts", "Id": "df010dad-f103-4589-ba66-76a5a04c2a11", "KeyAttributes": [], "LogicalName": "systemuser", "Name": null, "RowVersion": null } }, { "key": "modifiedonbehalfby", "value": null } ], "EntityState": null, "FormattedValues": [], "Id": "ec4e2f7d-9d60-e811-a95a-000d3af24950", "KeyAttributes": [], "LogicalName": "account", "RelatedEntities": [], "RowVersion": null } } ], "IsExecutingOffline": false, "IsInTransaction": true, "IsOfflinePlayback": false, "IsolationMode": 1, "MessageName": "Update", "Mode": 0, "OperationCreatedOn": "\/Date(1527757530151)\/", "OperationId": "08fec203-ec78-4f7a-a024-c96e329a64fe", "OrganizationId": "b0714265-8e72-4d3b-8239-ecf0970a3da6", "OrganizationName": "org94971a24", "OutputParameters": [], "OwningExtension": { "Id": "3db800fe-0963-e811-a95a-000d3af24324", "KeyAttributes": [], "LogicalName": "sdkmessageprocessingstep", "Name": "D365WebHookHttpTrigger: Update of account", "RowVersion": null }, "ParentContext": { "BusinessUnitId": "f0bf3c9a-8150-e811-a953-000d3af29fc0", "CorrelationId": "39499111-e689-42a1-ae8a-5b14a84514ce", "Depth": 1, "InitiatingUserId": "df010dad-f103-4589-ba66-76a5a04c2a11", "InputParameters": [ { "key": "Target", "value": { "__type": "Entity:http:\/\/schemas.microsoft.com\/xrm\/2011\/Contracts", "Attributes": [ { "key": "telephone1", "value": "1111" }, { "key": "accountid", "value": 
"ec4e2f7d-9d60-e811-a95a-000d3af24950" } ], "EntityState": null, "FormattedValues": [], "Id": "ec4e2f7d-9d60-e811-a95a-000d3af24950", "KeyAttributes": [], "LogicalName": "account", "RelatedEntities": [], "RowVersion": null } }, { "key": "SuppressDuplicateDetection", "value": false } ], "IsExecutingOffline": false, "IsInTransaction": true, "IsOfflinePlayback": false, "IsolationMode": 1, "MessageName": "Update", "Mode": 0, "OperationCreatedOn": "\/Date(1527757524631)\/", "OperationId": "08fec203-ec78-4f7a-a024-c96e329a64fe", "OrganizationId": "b0714265-8e72-4d3b-8239-ecf0970a3da6", "OrganizationName": "org94971a24", "OutputParameters": [], "OwningExtension": { "Id": "63cdbb1b-ea3e-db11-86a7-000a3a5473e8", "KeyAttributes": [], "LogicalName": "sdkmessageprocessingstep", "Name": "ObjectModel Implementation", "RowVersion": null }, "ParentContext": null, "PostEntityImages": [], "PreEntityImages": [], "PrimaryEntityId": "ec4e2f7d-9d60-e811-a95a-000d3af24950", "PrimaryEntityName": "account", "RequestId": "08fec203-ec78-4f7a-a024-c96e329a64fe", "SecondaryEntityName": "none", "SharedVariables": [ { "key": "ChangedEntityTypes", "value": [ { "__type": "KeyValuePairOfstringstring:#System.Collections.Generic", "key": "account", "value": "Update" } ] } ], "Stage": 30, "UserId": "df010dad-f103-4589-ba66-76a5a04c2a11" }, "PostEntityImages": [], "PreEntityImages": [], "PrimaryEntityId": "ec4e2f7d-9d60-e811-a95a-000d3af24950", "PrimaryEntityName": "account", "RequestId": "08fec203-ec78-4f7a-a024-c96e329a64fe", "SecondaryEntityName": "none", "SharedVariables": [], "Stage": 40, "UserId": "df010dad-f103-4589-ba66-76a5a04c2a11" }
  • Parse JSON string to ExpandoObject dynamic object:

The code snippet below deserializes the JSON string to a dynamic ExpandoObject.

dynamic dynObj = Newtonsoft.Json.JsonConvert.DeserializeObject(jsonContext);

  • Read values from dynamic object:

Simple string type,

log.Info("BusinessUnitId: " + dynObj["BusinessUnitId"]);

or

log.Info("BusinessUnitId: " + dynObj.BusinessUnitId);

Complex ParameterCollection type,

log.Info("InputParameters->Target->LogicalName: " + dynObj["InputParameters"][0]["value"]["LogicalName"].ToString());

  • Parse JSON string to RemoteExecutionContext object:

The code snippet below deserializes the JSON string to a RemoteExecutionContext object.

#r "bin/Newtonsoft.Json.dll"
#r "bin/Microsoft.Xrm.Sdk.dll"
#r "bin/System.Runtime.Serialization.dll"

using System.Net;
using System.Dynamic;
using System.Text;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    string jsonContext = await req.Content.ReadAsStringAsync();
    log.Info("Read context: " + jsonContext);
    jsonContext = FormatJson(jsonContext);
    log.Info("Formatted JSON Context string: " + jsonContext);
    Microsoft.Xrm.Sdk.RemoteExecutionContext remoteExecutionContext =
        DeserializeJsonString<Microsoft.Xrm.Sdk.RemoteExecutionContext>(jsonContext);
    return req.CreateResponse(HttpStatusCode.OK, "Success");
}

/// <summary>
/// Function to deserialize a JSON string using DataContractJsonSerializer
/// </summary>
/// <typeparam name="RemoteContextType">RemoteContextType generic type</typeparam>
/// <param name="jsonString">string jsonString</param>
/// <returns>Generic RemoteContextType object</returns>
public static RemoteContextType DeserializeJsonString<RemoteContextType>(string jsonString)
{
    // create an instance of the generic type
    RemoteContextType obj = Activator.CreateInstance<RemoteContextType>();
    MemoryStream ms = new MemoryStream(Encoding.Unicode.GetBytes(jsonString));
    System.Runtime.Serialization.Json.DataContractJsonSerializer serializer =
        new System.Runtime.Serialization.Json.DataContractJsonSerializer(obj.GetType());
    obj = (RemoteContextType)serializer.ReadObject(ms);
    ms.Close();
    return obj;
}

 

Remarks: Add references to the following DLLs:

  • Microsoft.Xrm.Sdk.dll
  • System.Runtime.Serialization.dll
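The generic helper above is not tied to RemoteExecutionContext; a self-contained check with a hypothetical Person data contract (note Encoding.Unicode, matching the snippet above) looks like this:

```csharp
using System;
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Json;
using System.Text;

[DataContract]
public class Person
{
    [DataMember(Name = "name")]
    public string Name { get; set; }
}

public static class JsonHelper
{
    // Same pattern as DeserializeJsonString<RemoteContextType> above
    public static T DeserializeJsonString<T>(string jsonString)
    {
        T obj = Activator.CreateInstance<T>();
        using (var ms = new MemoryStream(Encoding.Unicode.GetBytes(jsonString)))
        {
            var serializer = new DataContractJsonSerializer(obj.GetType());
            obj = (T)serializer.ReadObject(ms);
        }
        return obj;
    }
}
```

DataContractJsonSerializer detects UTF-8 and UTF-16 streams automatically, which is why the UTF-16 bytes produced by Encoding.Unicode deserialize correctly.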

The code snippet below shows how to read values:

// read plugin message name
string messageName = remoteExecutionContext.MessageName;
// read execution depth of plugin
Int32 depth = remoteExecutionContext.Depth;
// read BusinessUnitId
Guid businessUnitid = remoteExecutionContext.BusinessUnitId;
// read Target entity
Microsoft.Xrm.Sdk.Entity targetEntity = (Microsoft.Xrm.Sdk.Entity)remoteExecutionContext.InputParameters["Target"];
// read attribute from Target entity
string phoneNumber = targetEntity.Attributes["telephone1"].ToString();

log.Info("Message Name: " + messageName);
log.Info("BusinessUnitId: " + businessUnitid);
log.Info("Plugin Depth: " + depth);
log.Info("TargetEntity Logical Name: " + targetEntity.LogicalName);
log.Info("Phone Number: " + phoneNumber);

 


Conclusion:

The steps given above describe how to parse a JSON string that represents the Dynamics 365 plugin execution context received in an Azure Function. To read all blogs about Azure Functions visit: http://bit.ly/inogic-azurefunctions


Integrating Dynamics 365 with Azure Functions – Part 3 https://microsoftdynamics.in/2018/06/18/integrating-dynamics-365-with-azure-functions-part-3/ Mon, 18 Jun 2018 12:21:31 +0000 https://www.inogic.com/blog/?p=12038 Introduction: In our last post about Azure functions, we saw how to register the workflow assembly and execute the workflow using Microsoft Flow in Dynamics 365. Continuing with our series to call webhooks from CRM, let us register the Azure function created as a Webhook. We register this using the Plugin Registration Tool (PRT) From the PRT...

The post Integrating Dynamics 365 with Azure Functions – Part 3 appeared first on Microsoft Dynamics 365 Blog.

Integrating Dynamics 365 with Azure Functions Part 3

Introduction:

In our last post about Azure Functions, we saw how to register the workflow assembly and execute the workflow using Microsoft Flow in Dynamics 365. Continuing with our series on calling webhooks from CRM, let us register the Azure function we created as a Webhook. We register it using the Plugin Registration Tool (PRT).

From the PRT choose Register → Register New Webhook


In the Endpoint URL provide the function URL.

Choose the Authentication method WebhookKey and provide the key. This key is what you passed as the code query string earlier.

Note: the Webhook is registered as a Service Endpoint in CRM.


You can now register a step for this webhook just like we do for a plugin assembly.


When you create an Account, the plugin execution context is received in the message body of the HTTP request in your Azure Function. You then need to parse it to get the context-related details.

Once registered as a webhook, you can also invoke it through a workflow by executing the endpoint using the following code:

string serviceEndPoint = WebhookURL.Get(executionContext);
IWorkflowContext context = executionContext.GetExtension<IWorkflowContext>();
string fetch = @"<fetch>
  <entity name='serviceendpoint' >
    <attribute name='serviceendpointid' />
    <attribute name='name' />
    <attribute name='authtype' />
    <attribute name='url' />
    <filter type='and'>
      <condition attribute='contract' operator='eq' value='8' />
      <condition attribute='url' operator='eq' value='" + serviceEndPoint + @"' />
    </filter>
  </entity>
</fetch>";
EntityCollection coll = crmWorkflowContext.OrganizationService.RetrieveMultiple(new FetchExpression(fetch));
if (coll != null && coll.Entities.Count > 0)
{
    IServiceEndpointNotificationService endpointService = executionContext.GetExtension<IServiceEndpointNotificationService>();
    crmWorkflowContext.Trace("serviceEndPoint found: " + coll.Entities[0].Id.ToString());
    endpointService.Execute(coll.Entities[0].ToEntityReference(), context);
}

Once again, you receive the workflow context in the message body, which you can parse to read the information passed.

Conclusion:

There are other ways to invoke Azure Functions from D365. Azure Functions can be called through Logic Apps, and you could register Logic Apps on D365 entities and actions. That's content for another post, so keep visiting this space 🙂


Integrating Dynamics 365 with Azure Functions – Part 2 https://microsoftdynamics.in/2018/06/11/integrating-dynamics-365-with-azure-functions-part-2/ Mon, 11 Jun 2018 09:46:36 +0000 https://www.inogic.com/blog/?p=11946 Introduction: In our recent blog, We saw how to create an Azure function and now that we have our Azure function ready and hosted, let’s look at invoking the function through a workflow. At this point, we will execute the function through an HTTP request instead of registering the function as a Webhook. Let us modify the...

The post Integrating Dynamics 365 with Azure Functions – Part 2 appeared first on Microsoft Dynamics 365 Blog.

Integrating D365 with Azure Functions

Introduction:

In our recent blog, we saw how to create an Azure function. Now that we have our Azure function ready and hosted, let's look at invoking the function through a workflow. At this point, we will execute the function through an HTTP request instead of registering the function as a Webhook.

Let us modify the code from our previous blog to connect to a SQL database and return a record. We will then use the data returned from SQL to create a record in CRM.

Since VS Code did not provide IntelliSense for .csx files at the time, we will write the code in Visual Studio and then copy it to VS Code. (Is there a better way to do this?)

In VS, we have the following function ready

using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Dynamic;

static ExpandoObject ReadData(string id)
{
    DataSet ds = new DataSet(); // must be initialized before adapter.Fill
    dynamic obj = null;
    try
    {
        // connection string
        string connectionString = "Server=xxx.database.windows.net,1433;Initial Catalog=SampleSQL;Persist Security Info=False;User ID=xx;Password=xx;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;";
        // ADO connection object
        SqlConnection con = new SqlConnection(connectionString);
        // query customer (use a parameterized query in production to avoid SQL injection)
        string query = "select * from [SalesLT].[Customer] where CustomerID = " + id;
        // create adapter
        SqlDataAdapter adapter = new SqlDataAdapter(query, con);
        // execute query
        adapter.Fill(ds);
        // check if a record was found
        if (ds != null && ds.Tables.Count > 0 && ds.Tables[0].Rows.Count > 0)
        {
            DataRow dr = ds.Tables[0].Rows[0];
            obj = new ExpandoObject();
            obj.Fname = dr["FirstName"].ToString();
            obj.Lname = dr["LastName"].ToString();
            obj.Email = dr["EmailAddress"].ToString();
        }
    }
    catch (Exception ex)
    {
        throw new Exception(ex.Message);
    }
    return obj;
}

We will copy this function to VS Code and save it. You will find errors reported for missing references.


To reference assemblies in .csx, we need to add #r directives at the top of run.csx.


Create a bin folder and copy Newtonsoft dll there so that it is referenced correctly here.
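For reference, run.csx assembly directives take roughly this form (the paths are an assumption about your folder layout; the Newtonsoft line matches the Part 3 snippet later in this series):

```csharp
// In run.csx: framework assemblies can be referenced by simple name,
// private assemblies by a path relative to the function folder.
#r "System.Data"                 // for SqlConnection / DataSet / DataRow
#r "bin/Newtonsoft.Json.dll"     // copied into the function's bin folder
```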

Next, replace the code to use fully qualified type names, e.g. "System.Data.DataRow".

Let us call this function from the main function, passing the data received in the name query string so it can look up the record in SQL.

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");

    // parse query parameter
    string name = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
        .Value;

    if (name == null)
    {
        // get request body
        dynamic data = await req.Content.ReadAsAsync<object>();
        name = data?.name;
    }

    // call function to retrieve data from Azure SQL
    dynamic obj = ReadData(name);

    // convert the result to a JSON string to be returned
    string resp = Newtonsoft.Json.JsonConvert.SerializeObject(obj);
    return req.CreateResponse(HttpStatusCode.OK, resp);
}

Once it compiles fine, upload the code to Azure and test it.

Executing from Postman will now return the serialized JSON result.


Let us now call this from a workflow assembly.

The following code snippet would invoke the azure function using HTTP POST request.

Note: We are accepting the Function URL and secret as workflow parameters.

// read url
string url = executionContext.GetValue(AzureFunctionURL);
// read secret
string secret = executionContext.GetValue(AuthCode);

tracingService.Trace("Before calling azure function");
UriBuilder uriBuilder = new UriBuilder(url);
uriBuilder.Query = "code=" + secret;
tracingService.Trace("Azure URL: " + uriBuilder.Uri.ToString());
string result = Post2Azure(uriBuilder, tracingService, service).Result;

private static async Task<string> Post2Azure(UriBuilder uriBuilder, ITracingService tracingService, IOrganizationService service)
{
    string result = string.Empty;
    // create http client object
    HttpClient client = new HttpClient();
    // data to send
    var data = "{\"name\": \"5\"}";
    // send in byte format
    var buffer = System.Text.Encoding.UTF8.GetBytes(data);
    var byteContent = new ByteArrayContent(buffer);
    byteContent.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/json");
    tracingService.Trace("send request");
    // execute request
    HttpResponseMessage response = client.PostAsync(uriBuilder.Uri, byteContent).Result;
    if (response == null)
    {
        tracingService.Trace("no response received");
        throw new InvalidOperationException("Failed to obtain the httpResponse");
    }
    result = await response.Content.ReadAsStringAsync();
    tracingService.Trace("response read: " + result);
    // remove extra backslashes
    result = result.Replace(@"\", "");
    // remove the starting double quote
    result = result.Remove(0, 1);
    // remove the ending double quote
    result = result.Remove(result.Length - 1, 1);
    JObject ob = JObject.Parse(result);
    tracingService.Trace("response parsed to json");
    // create a CRM contact
    Entity contact = new Entity("contact");
    contact["firstname"] = ob["Fname"].ToString();
    tracingService.Trace("firstname: " + ob["Fname"].ToString());
    contact["lastname"] = ob["Lname"].ToString();
    tracingService.Trace("lastname: " + ob["Lname"].ToString());
    contact["emailaddress1"] = ob["Email"].ToString();
    Guid id = service.Create(contact);
    tracingService.Trace("contact created " + id.ToString());
    return result;
}

Register the workflow assembly and execute the workflow. It will read from Azure SQL and create a contact in CRM.

In the next part, we will register the Azure function as a Webhook.

Conclusion:

Using the simple steps above, the user can register the workflow assembly and execute the workflow. In our next blog, we will see how to register the Azure function as a webhook and register steps for the messages you would like the custom logic to be executed for.


Integrating Dynamics 365 with Azure Functions – Part 1 https://microsoftdynamics.in/2018/06/05/integrating-dynamics-365-with-azure-functions-part-1/ Tue, 05 Jun 2018 10:30:07 +0000 https://www.inogic.com/blog/?p=11876 Introduction: Azure function is a serverless architecture where your code is hosted in the cloud and you do not need any infrastructure to host this. Traditionally extending business logic for Dynamics 365 Customer Engagement (D365 CE) included creating plugin and workflow assemblies which would be deployed to CRM using Plugin registration tool. The assembly could...

The post Integrating Dynamics 365 with Azure Functions – Part 1 appeared first on Microsoft Dynamics 365 Blog.

Integrating Dynamics 365 with Azure Functions

Introduction:

Azure Functions is a serverless architecture where your code is hosted in the cloud and you do not need any infrastructure to host it. Traditionally, extending business logic for Dynamics 365 Customer Engagement (D365 CE) meant creating plugin and workflow assemblies that were deployed to CRM using the Plugin Registration Tool. The assembly could be stored either in the database (the preferred way) or on disk (on-premises only). The assembly had to be registered in sandbox mode, which limited the actions that could be performed through these assemblies.

For a few versions now, we have been allowed to register Azure-aware plugins and workflows. With v9.x, Webhooks are now supported. Webhooks are, in layman's terms, event-driven programming: when an action occurs, an event is raised and the data is pushed to all subscribers of that event. The plugin and workflow architecture was likewise based on notifications: we register/subscribe to messages on entities, and when those actions occur, the business logic in our assemblies executes.
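As a loose analogy in plain C# (not CRM code), the publish/subscribe behaviour described above looks like this:

```csharp
using System;
using System.Collections.Generic;

public class AccountEvents
{
    // Subscribers register callbacks; when the action occurs,
    // the event pushes the data to every subscriber.
    public event Action<string> AccountUpdated;

    public void Update(string accountName)
    {
        AccountUpdated?.Invoke(accountName);
    }
}
```

A webhook registration is conceptually such a subscription: CRM raises the event (the registered message/step) and pushes the execution context to the subscriber's URL.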

In this series, we will walk through an example of creating an Azure function and calling it in the following ways:

  1. An example of directly calling an Azure function from traditional workflows.
  2. Register as a WebHook and invoke it from a workflow.
  3. Register as a WebHook and register steps for messages that you would like the custom logic to be executed for.

Creating an Azure function:

Let us begin by creating the dev environment for Azure Functions to enable local execution and debugging.

You could use either Visual Studio or VS Code for developing Azure Functions. You could actually just write it all in Notepad 🙂 Azure Functions supports various scripting languages; however, in this example we will use C#.

Let us begin by creating a folder (Azure) which will be the root folder and will hold all the functions we create. In order to develop and test Azure Functions locally, you must have the "Azure Functions Core Tools" installed. Use the npm install -g azure-functions-core-tools command to install it.

Next, in the VS Code integrated terminal, type the following command to initialize the environment: func init


Now we shall go ahead and create our first function.

C:\user\AzureBlog>func new --language C# --template "Http Trigger" --name D365HttpTrigger

This will create the following structure in the AzureBlog folder:


The scaffolding is now in place, and the generated function is ready to be executed.

This function could be executed through either an HTTP Get or HTTP POST request.

Let us host this function locally so that we can execute this as is using Postman

Execute the below command in VS code to start the host service

func host start --debug vscode

Once the function is hosted, a message displays the local URL where it is listening.


This is the URL where your function is being hosted. You can now execute this function by calling this URL.


Back in your command window in VS Code, you will see the debug info.


While the function is hosted, the tools also track and compile code changes in real time and report any errors.


You can now host this function on Azure. Log in to the Azure portal and create a new Function App.



There are two plans available: Consumption and App Service. Functions hosted in the Consumption plan need to finish execution within 5 minutes, which can be extended to 10 minutes by changing configuration settings; any function that would take longer than that to execute should move to the App Service plan.
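The timeout extension mentioned above is configured in the Function App's host.json; a minimal sketch (the value shown is the Consumption-plan maximum for this runtime generation):

```json
{
  "functionTimeout": "00:10:00"
}
```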

Once your resource has been provisioned, this is where you land.


There are multiple ways to upload your function to Azure

Manually create the function and update the code

1. Click the + button and create a function. Choose HTTP trigger for our example.


2. Choose Function Level Authorization


This will require a code to be passed to invoke this function. If the code does not match, it will not execute the function.

3. You get the same scaffolding that you had generated using the command in VS Code.


4. If you changed the code in the local environment, you can simply copy/paste it into run.csx and save.

5. You can also use the View files section to upload files.

VSTS continuous integration

You can also have the files/folders created and uploaded directly. For this, you need to create a Deployment Plan using the Deployment Options.


Now let us test this function through Postman.

Click Get Function URL (found on the run.csx view) to get the URL for executing this function. You will notice the URL comes with the code query string attached.


If you provide an incorrect code, you get an error status code.


With the Azure function hosted, it is now ready to be invoked from D365.

Conclusion:

In this blog, we saw how to create an Azure function. Now that we have our Azure function ready and hosted, in our next blog we will look at invoking the function through a workflow.

