The post Power BI Integration with Canvas App in Microsoft Dynamics 365 CRM appeared first on Microsoft Dynamics 365 Blog.
In this blog, we will walk through how to integrate a Power BI report with a Canvas app. Microsoft recently added the ability to embed a Power BI report within a Canvas app, which is handy for users working in the mobile app at a service location.
Let's consider a scenario: a Field Service technician works on a Work Order and generates the Work Order report, and this report needs to be surfaced within the Canvas app.
We will need to follow the steps below to achieve this requirement.
Step 1: Navigate to the Power BI Service at https://app.powerbi.com and make sure you have Power BI Dashboard added to the Power BI workspace.
Below is a screenshot of the same.
Step 2: Now develop a Canvas app to embed this Power BI Report. Navigate to https://make.powerapps.com and, within the appropriate environment, create a Canvas app.
Step 3: Insert Power BI Tile on the screen as shown in the below screenshot.
Step 4: Now we need to choose the below setting to enable Power BI Report.
Please refer to the below screenshot.
Step 5: Save the app and publish the changes. This will display the app with a report, as shown in the below screenshot.
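Behind the scenes, the Power BI tile control references the dashboard tile through an embed URL. The helper below sketches the kind of URL the control stores; the exact query-string shape is an assumption, and the workspace/dashboard/tile IDs are hypothetical.

```javascript
// Sketch: build the embed URL a Power BI tile control points at.
// The dashboardId/tileId/groupId parameter names are assumptions.
function buildTileUrl(groupId, dashboardId, tileId) {
  return "https://app.powerbi.com/embed" +
    "?dashboardId=" + encodeURIComponent(dashboardId) +
    "&tileId=" + encodeURIComponent(tileId) +
    "&groupId=" + encodeURIComponent(groupId);
}

const url = buildTileUrl("ws-1", "dash-1", "tile-1");
console.log(url);
```

In practice you never type this URL yourself; the tile picker in step 4 fills it in when you choose the workspace, dashboard, and tile.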
Using Power BI within a Canvas app gives mobile users an analytical view alongside their other operations.
The post Power BI Integration with Canvas App in Microsoft Dynamics 365 CRM first appeared on Microsoft Dynamics 365 CRM Tips and Tricks.
The post Power Automate pane in Canvas App appeared first on Microsoft Dynamics 365 Blog.
Nowadays everyone wants a mobile app for their work. To build a mobile app for Dynamics 365 CRM, a Canvas app is a good option. A Canvas app is a no-code/low-code business app where you design the app by dragging and dropping elements onto a canvas. To automate a process, Power Automate is used. Combining the capabilities of Canvas apps and Power Automate is therefore a big advantage, as it enables building apps with rich functionality.
Recently, a feature was introduced in Canvas apps to show a Power Automate pane so that we can easily access and use Power Automate within the app. To show it, we first need to enable it from settings.
To enable the Power Automate pane, click Settings in the top menu. In the window that opens, select the "Upcoming features" option from the left navigation and then enable "Power Automate pane" as shown below:
Once it is enabled the following message is displayed.
After the app is reopened, go to Action tab and then click on Power Automate. It will now open the Power Automate pane as shown below:
Using this new Power Automate pane, we can easily create a new flow, add existing flows to the Canvas app, and get an overview of the flows the app uses.
To add a flow to the Canvas app, click "Add flow". This opens the window below with a list of Power Automate flows. Here, select the flow that is to be added to the Canvas app. To create a new flow instead, click the "Create new flow" button.
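Once a flow has been added from the pane, it is invoked from a control's formula like any other function. A minimal sketch, assuming a flow named NotifyOwnerFlow that takes a single text input (both names are hypothetical):

```
// OnSelect of a button in the Canvas app
'NotifyOwnerFlow'.Run(TextInput1.Text)
```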
The pane also gives an overview of all flows that have been added to the Canvas app. From here, we can easily navigate to the respective flows and edit them.
In this way, the Power Automate pane lets us easily and quickly perform actions related to Power Automate.
The post Custom Pages: A step towards disappearing lines between Canvas Apps and Model Driven Apps in Power Platform / Dynamics 365 CRM appeared first on Microsoft Dynamics 365 Blog.
Model-driven apps – data-first approach – aimed more at back-office users, with a full form experience.
Canvas apps – design-first approach – aimed more at field users, with a mobile experience: simple-to-use, easily accessible apps designed to serve a specific purpose.
Traditionally these have been aligned as Model-driven apps for web experience with full support for pro-dev extensions and Canvas Apps for citizen developers to design using low-code expression language now branded as Power Fx.
Custom pages are the next big step in the journey to unify the app experience, enabling developers and designers to combine the best of both worlds and give a seamless experience to end users without them having to worry about the technology being used. And wait a second… they do not have to worry about licensing either.
Read here, canvas apps designed as pages do not count towards app limits.
With one of the earlier updates, we were allowed to embed Canvas apps within a model-driven form. But with custom pages we can now take advantage of the Canvas app designer to build quick UI solutions that we can surface within model-driven apps as: a standalone page in the app navigation, or a dialog invoked from a ribbon button.
Let us have a quick look at each of these options.
To add a custom page, we need to go through the new app designer experience that is currently in preview and choose the Custom option.
Next either choose an existing page if you have already designed one or create a new one.
Note: Since it talks about Canvas apps, I thought "use an existing custom page" would show me a list of all my existing canvas apps from my previous work and allow me to quickly add one of those and get going… BUT canvas apps are not custom pages. A custom page is a new component type; though it uses the Canvas designer, it is not a canvas app, and it won't even be listed as an APP in your app listing.
Currently there seems to be no way to use existing canvas apps as-is on this screen. Check out the recommended way shared by the product team to get them migrated.
If you would like to include this as a standalone page in sitemap then check Show in navigation. Once you click Add, the very familiar Canvas App Designer shows up for you to go ahead and build your app.
For this exercise I added a gallery component with Contacts listed in there.
With very minimal effort, we are now able to add visually appealing listing with images. Before Custom pages, this would perhaps have to be a Power Apps Component Framework Control requiring substantial efforts to develop.
Save and publish this app from the canvas designer and click back to navigate back to the app designer to see this added to the navigation.
Make sure to publish the App to see the preview for the page.
And that is all it takes to converge a canvas app with a model driven app!
Have a look at your solution and you will see this listed as a Page and not a Canvas App.
Calling a custom page from a ribbon button
Note, you need to first have a page component created. You now have the option of adding a new page in the Add new option within a solution.
For this example, we will create an unbound screen to accept notification types from the user as shown below:
These controls have simply been arranged on the canvas. No further code is added to any control here.
Now let us modify the text for the record label to instead show the name of the record that this button was invoked for.
In the App -> OnStart event type the following code:
Set(RecordId, Param("recordId"));
Set(selectedRecord, LookUp(Accounts, Account = GUID(RecordId)));
Notify(RecordId);
Param(“recordId”) is a parameter that we will pass to this page when invoking it from the ribbon button from model driven app. This will be the recordId of the selected record from the home grid.
LookUp(Accounts, Account = GUID(RecordId))
Here we search for an account in the database with same recordid as the parameter received.
Now modify the text property of the record label added to the canvas to show the name of the account retrieved using the recordId passed as parameter.
With this, page design is complete, let us save and publish this page and return to the solution explorer.
Note: Make sure to also publish the app each time you edit the page. I noticed the page throws an error until the app is published and the app editor is reopened.
Go back to the App designer and choose the Edit in preview option to use the new designer interface.
Choose edit command bar for the account entity.
Here, choose main grid that’s where we will add the button.
Add + New Command.
Since we would like this button to show up only when exactly one record in the grid is selected, we write a Power Fx expression in the Visible property as shown above.
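The expression itself appears only in the screenshot; one way to express "exactly one record selected" in the command's Visible property would be something along these lines (treat this as a sketch — the exact expression depends on the commanding context):

```
CountRows(Self.Selected.AllItems) = 1
```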
Next, to show the page on click of the button, we need to call a JavaScript function. Choose Run JavaScript for the Action property and select the JavaScript library containing the code.
Type in the function name to execute from the library; in our case, we also pass a parameter, which is the id of the selected record from the grid.
Here is what the showCanvas function looks like:
function showCanvas(id) {
    alert("in here");
    alert("text " + id);
    // set the pageType as custom, to call the custom page that we just created
    // name is the logical name of the page; you can pick this up from solution explorer
    var pageInput = {
        pageType: "custom",
        name: "rooh_querydialog_bd7bb",
        entityName: "account",
        recordId: id
    };
    // target = 2 is for dialog
    // position = 1 is for center dialog
    var navigationOptions = {
        target: 2,
        position: 1,
        title: "Notification"
    };
    Xrm.Navigation.navigateTo(pageInput, navigationOptions).then(
        function success() {
            // Run code on success
            alert("loaded");
        },
        function error() {
            // Handle errors
            alert("error");
        });
}
Here is where you can pick up the logical name of the page you just created.
It doesn't allow you to copy the name, so be careful when typing it out.
Save and publish the command bar and click Play to preview.
Note that while we can pass context information as parameters from the model driven app to the custom page, we are not able to pass any information back to the model driven app from the custom page. In the above example, there isn’t a way to return information of the selection made by the user on the custom page.
An alternate method could certainly be to update a field of the record using Patch() from within the custom page on the OK or Cancel button click.
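As a sketch of that alternative — assuming a column named 'Notification Type' exists on Account and a dropdown control sits on the page (both names are hypothetical) — the OK button's OnSelect could look like:

```
// OnSelect of the OK button; column and control names are hypothetical
Patch(Accounts, selectedRecord, { 'Notification Type': Dropdown1.Selected });
Back()
```

The model-driven form would then pick up the written value on its next refresh.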
Being a canvas app, you can now also invoke a flow from the buttons on the custom page to process the selection further.
Conclusion: Custom pages will certainly help with many of the UI scenarios that have traditionally required developing custom web resources or PCF controls, and with licensing considerations out of the picture, adoption should be a lot easier.
The post Use Relevance Search API in Power Automate & Canvas App – Part 2 appeared first on Microsoft Dynamics 365 Blog.
In the previous blog, we obtained the response from the Relevance Search API request through a Power Automate flow. In this blog, we will use that response in a Canvas app and see how we can leverage the potential of Relevance Search there.
If you are not familiar with how to create a Canvas app, you can refer to the following article:
https://docs.microsoft.com/en-us/powerapps/maker/canvas-apps/get-started-test-drive
Let us start with creating a new canvas app,
• Sign in to https://make.powerapps.com and create a new Canvas app.
• As we are going to create Canvas app for Dynamics 365 app, we will select the Common Data Service Phone Layout,
• In the next screen, we will first add the connection then add the table.
• This will create a canvas app with the default Browse, Detail, and Edit screens. As we are going to work with Contact and Lead, we added screens for both Lead and Contact.
• We will now add a new screen for the Relevance Search, like shown below
• We will call the Power Automate flow created in the previous blog from the OnSelect of the Search button and the OnChange of the textbox.
We will use the below expression on the OnSelect of the button and the OnChange of the textbox.
This expression will run the ‘RelevanceSearchAPIFlow’ and will pass the text entered in the TextBox. After successful run, it will store the response in the Collection called “SearchCollection”.
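The expression itself is shown only as a screenshot in the original post; it would be along these lines (the textbox name is an assumption):

```
ClearCollect(
    SearchCollection,
    RelevanceSearchAPIFlow.Run(SearchTextBox.Text)
)
```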
• Save and run the app. Enter some text in the textbox and either press Enter or click the Search button. In the background, the Power Automate flow should run successfully and the response should get stored in the 'SearchCollection' collection.
• To check the collection, click on View -> Collections
You should see the response added in the collection.
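For reference, a relevance-search style response carries its matches in a value array, each item holding the record id, primary name, and entity logical name. The property names below are assumptions about the payload shape — adjust them to your actual response. A small sketch of flattening that response into the fields the list control needs:

```javascript
// Flatten a relevance-search style response into simple list items,
// dropping entries with an empty name. Property names (objectid, name,
// @search.entityname) are assumptions about the response shape.
function toListItems(response) {
  return (response.value || [])
    .filter(function (item) { return item.name; })
    .map(function (item) {
      return {
        id: item.objectid,
        name: item.name,
        entity: item["@search.entityname"]
      };
    });
}

const sample = {
  value: [
    { objectid: "11-22", name: "Alan Steiner", "@search.entityname": "contact" },
    { objectid: "33-44", name: "", "@search.entityname": "lead" }
  ]
};
console.log(toListItems(sample));
```

In the Canvas app the same filtering is done in Power Fx rather than JavaScript, but the shape of the data is the same.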
• Now we will use this collection to bind to the List control. Select the List control and add the below expression to the Items property,
Here we are assigning non-empty items from the SearchCollection collection to the List control.
• After this, select the List control and add below expression OnSelect property of the control.
Here we are extracting and setting the GUID of the record that we have in the collection to the ‘SelectedItemID’ variable.
• We will now bind the record values to the list item. Select the first label and add below expression,
Here we are binding the primary field value of the record.
Similarly, select the second label and add below expression,
Here we are binding the Entity logical name to understand the entity type of the record,
• Select the ‘NextArrow’ button and add below expression upon OnSelect property,
Here we are first setting the GUID of the record to the SelectedItemID variable and then switching the Detail screen based on the Entity logical name. For example, if the entity is Contact, we are navigating the screen to DetailsScreenContact_1 and if the entity is lead, we are navigating to the DetailsScreenLead_1 screen.
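The navigation logic described above would look roughly like this (screen names match the ones used in this walkthrough; the collection column names are assumptions):

```
// OnSelect of the NextArrow icon
Set(SelectedItemID, ThisItem.ObjectId);
Switch(
    ThisItem.EntityName,
    "contact", Navigate(DetailsScreenContact_1),
    "lead", Navigate(DetailsScreenLead_1)
)
```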
• Now select the Detail screen where we will see the details of the record. Add below expression to Item property of the DetailScreen,
This expression is for the contact screen,
Similarly, add expression for other details screen,
This expression is for the Lead screen,
• Here we have set up the detail screens for Lead and Contact to show the record details when a search result item is clicked on the Relevance Search screen.
Relevance Search screen,
On click of the next arrow button, it will open the appropriate entity record.
• You can add the entities you need, and you can add fields to show in the detail screen or the search results.
This way we can execute the Relevance search API from Canvas App and represent the Search results using List control.
You can use the logic explained in this blog to add Relevance search into your Canvas app.
Below is the small clip of the working Relevance Search screen,
The post Streamline Power BI Refresh: Refresh dataset after a successful refresh of dataflow appeared first on Microsoft Dynamics 365 Blog.
Do you have a Power BI dataset that gets data from a dataflow? Have you ever thought, "Can I get the dataset refreshed only after the refresh of the dataflow has completed successfully?" The answer is yes, you can. One of the recent updates from the data integration team of Power BI made this available for you. Let's see in this blog and video how this is possible.
If you are using both dataflows and datasets in your Power BI architecture, then your datasets are very likely getting part of their data from Power BI dataflows. It would be great if you could get the Power BI dataset refreshed right after a successful refresh of the dataflow. In fact, you can build a scenario like the one below.
Power Automate recently announced the availability of a connector that allows you to trigger a flow when a dataflow refresh completes.
You can then choose the workspace (or environment if you are using Power Platform dataflows), and the dataflow.
The dataflow refresh can succeed or fail, and you can choose the proper action in each case. To do this, check whether the refresh result is Success.
In the event of successful refresh of the dataflow, you can then run the refresh of the Power BI dataset.
Refreshing a Power BI dataset through Power Automate is an ability we have had for some time in the service.
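Under the covers, the dataset refresh action calls the Power BI REST API's refresh endpoint. A sketch of building that request — the workspace and dataset IDs are hypothetical, and acquiring the Azure AD bearer token is out of scope here:

```javascript
// Build the Power BI REST call that triggers a dataset refresh:
// POST https://api.powerbi.com/v1.0/myorg/groups/{groupId}/datasets/{datasetId}/refreshes
function buildRefreshRequest(groupId, datasetId, token) {
  return {
    method: "POST",
    url: "https://api.powerbi.com/v1.0/myorg/groups/" + groupId +
         "/datasets/" + datasetId + "/refreshes",
    headers: { Authorization: "Bearer " + token }
  };
}

const req = buildRefreshRequest("ws-guid", "ds-guid", "<token>");
console.log(req.url);
```

The Power Automate connector hides all of this behind a simple action, which is why the flow stays so compact.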
You can also capture the failure details and send a notification (or you can add a record in a database table for further log reporting) in the case of failure.
The overall flow is really simple but gives effective control of the refresh, as you can see below.
Making sure that the refresh of the dataset happens after the refresh of the dataflow, was one of the challenges of Power BI developers if they use dataflow. Now, using this simple functionality, you can get the refresh process streamlined all the way from the dataflow.
A dataflow refresh can also be started as an action in Power Automate, which might be useful for some scenarios, such as running the refresh of the dataflow after a certain event.
This is not only good for refreshing a Power BI dataset after a dataflow; it is also good for refreshing one dataflow after another. Especially in best-practice dataflow scenarios, I always recommend having layers of dataflows for staging, data transformation, etc., as I explained in the below article.
Dataflows are not a replacement for data warehouses, but having features like this helps the usability and adoption of this great transformation service.
Do you think of any scenarios you would use this for? Let me know in the comments below; I'd love to hear about them.
The post Streamline Power BI Refresh: Refresh dataset after a successful refresh of dataflow appeared first on RADACAD.
The post Power Automate for OCR appeared first on Microsoft Dynamics 365 Blog.
In the last post, I explained how to use the text recognizer for the aim of detecting text in a picture using AI Builder in Power Apps.
In this post and video, you will see how to call it in Power Automate and extract the text from an image located in OneDrive, with the result written back to Azure Blob Storage.
To start, open Power Apps, click on AI Builder, then Build, and choose text recognition.
Then use the option Use in flow (Power Automate).
Next, in Power Automate, create a new flow with the trigger When a file is created.
Next, add the Predict action with the TextRecognition model, passing the image from the trigger.
Next, use Create blob to write the output to Azure Blob Storage.
For the name of the file, I will use the current time plus the extension for a text file.
Next, for the blob content, use the concat function and, from the Dynamic content, select the PredictionOutput.
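As a sketch of the "current time plus file type" naming, here is the equivalent logic in plain JavaScript; in the flow itself this would be a utcNow()-based expression:

```javascript
// Build a blob name from a UTC timestamp, e.g. "2024-01-31T10-15-30.txt".
// Colons are replaced because they are awkward in file names.
function buildBlobName(now) {
  return now.toISOString().slice(0, 19).replace(/:/g, "-") + ".txt";
}

console.log(buildBlobName(new Date()));
```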
All done!
Just put an image file like this one in the folder in OneDrive and run the flow to see the result in the blob storage!
See the video from
The post Power Automate for OCR appeared first on RADACAD.
The post Recent Update in Predict Module in Power Automate Predict- Part 4 appeared first on Microsoft Dynamics 365 Blog.
In the previous posts (Post 1, Post 2, and Post 3), the process of analyzing forms using the Form Processing feature in AI Builder was explained.
Since last month, some changes have happened in the Predict action in Microsoft Power Automate that impact the form processing experience in a flow.
In this post, I will show you how to work with the new version. The flows you created previously will still work.
The changes happen in the Predict action.
In the flow below, the old Predict action has a single Request Payload: we needed to specify the type of the image and the file content as a string, and later parse the result with a separate component named Parse JSON.
Now everything gets a bit easier: the Predict action takes the document type and the document itself in separate boxes, and in the Create file action we just need to use the concat function to get the result values.
I have used the below code for the File Content:
concat(' Session Title: ', outputs('Predict')?['body/responsev2/predictionOutput/labels/Session_85648318f538ad191ff5651a30c24966/value'], ' ,Speaker Name: ', outputs('Predict')?['body/responsev2/predictionOutput/labels/Speaker_64a30215a5867855cdcb495ee6eef3ec/value'], ' ,Comments: ', outputs('Predict')?['body/responsev2/predictionOutput/labels/What_0027580d34b848d69176d3f4a73de5c5450/value'])
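The same assembly can be sketched in plain JavaScript to make the structure of the expression clearer: each labelled value is pulled out of the Predict output and joined into one line of text. The label keys are simplified here (the real ones carry the generated hashes specific to my trained model):

```javascript
// Mirror of the flow's concat(...) expression: pull three labelled values
// out of a Predict output and join them into one line of text.
function formatPrediction(labels) {
  return " Session Title: " + labels.session.value +
         " ,Speaker Name: " + labels.speaker.value +
         " ,Comments: " + labels.comments.value;
}

const prediction = {
  session: { value: "Power BI Tips" },
  speaker: { value: "Leila" },
  comments: { value: "Great session" }
};
console.log(formatPrediction(prediction));
// → " Session Title: Power BI Tips ,Speaker Name: Leila ,Comments: Great session"
```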
Also, watch the below video.
The post Recent Update in Predict Module in Power Automate Predict- Part 4 appeared first on RADACAD.
The post Automation of Sentiment Analysis using AI Builder appeared first on Microsoft Dynamics 365 Blog.
There are many usages of AI Builder: object detection, form processing, binary prediction, text classification, business card reader. Besides these, there is another one now available in Power Automate: sentiment analysis. In this post, I am going to show you how to use this feature to create a process that automatically applies sentiment analysis, creates an output file stored back to OneDrive, and then cleans the result using Power Query.
Navigate to Power Automate.
Log in to an account that has an AI Builder environment. Then, under Solutions, create a new flow with the trigger "When a file is created" in OneDrive.
In the next step, select the Predict component from the Next step action; in the model section, choose the SentimentAnalysisModel.
Then in the Request Payload section, you need to first click in the box, then write the below code
{"text":"","language":"en"} (make sure the quotes are plain double quotes, not smart quotes)
In the new window, choose Expression, click between the "" after "text", and from the Dynamic content choose the File content.
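Building the payload programmatically sidesteps the smart-quote problem entirely, since JSON.stringify always emits plain double quotes. A sketch matching the {"text","language"} shape above:

```javascript
// Build the sentiment-analysis request payload with guaranteed plain quotes.
function buildSentimentPayload(fileContent) {
  return JSON.stringify({ text: fileContent, language: "en" });
}

console.log(buildSentimentPayload("The suite was awesome."));
// → {"text":"The suite was awesome.","language":"en"}
```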
Now choose a new action and select Create file to send the result back to OneDrive.
For the folder path, choose the location where you want to save the file; for the file name, you can use the Expression box to create a simple name and type, like:
Then, in the File content, select the ResponsePayload from Dynamic content.
Now save the flow and test it: upload a text file that has some comments, like a hotel review, in text format, then run the flow.
The below text is a sample from a hotel review:
“The suite was awesome. We did not have much interaction with the staff. We did sleep on the queen size sofa bed in the living room instead of the queen size bed in the actual bedroom due to the temperature. It was very hot and humid. The air conditioner in the living room does not cool off the bedroom. This suite is very close to shopping and dining. After each day of adventures we would walk to dinner at different joints. I will stay here again when I return.”
Now a JSON file should be created, and you can use Power BI to analyze it easily.
In the next post, I will explain how to analyze the result with Power Query.
see the full video from here
The post Automation of Sentiment Analysis using AI Builder appeared first on RADACAD.
The post Refresh Power BI Queries Through Power Platform Dataflows: Unlimited Times with Any Frequency appeared first on Microsoft Dynamics 365 Blog.
If you don't have a Power BI Premium capacity license, then you are limited to refreshing your dataflows up to eight times a day, with a minimum frequency of 30 minutes. The good news is that there is a way to do unlimited refreshes with whatever frequency you like, using Power Platform dataflows. Read the rest of this article to learn how.
In the world of Power BI, you can separate the data transformation layer using dataflows. Dataflows can run one or multiple Power Query scripts in the cloud and store the result in Azure Data Lake. The result of the dataflow (which is stored in Azure Data Lake) can then be used in a Power BI dataset using Get Data from Power BI dataflows. If you want to learn more about dataflows, I highly recommend reading my dataflows series, which starts with this article.
Dataflows are not only a concept for Power BI; they are also available as Power Platform dataflows. You can check out one of the announcements of features added in Power Platform dataflows by Miguel Llopis (one of the product managers of the Data Integration team at Microsoft) here. You can create Power Platform dataflows very similarly to the way you create Power BI dataflows. The only difference at the beginning is where you create them: Power BI dataflows are created in the Power BI service, Power Platform dataflows in the Power Apps (Power Platform) portal.
Login to Power Platform Portal here.
Then go to Dataflows under the Data section,
Start a New dataflow
Just provide a name for the dataflow, then create it.
At this step, you can either go ahead and get data from anywhere you want, or use a Blank Query and copy a Power Query script from Power BI Desktop into it.
After preparing your queries, you can go to the load step.
The load to entity step in Power Platform Dataflows is a bit different from Power BI Dataflows. In Power BI Dataflows, the result will be loaded into CSV files into Azure Data Lake, so no more configuration is needed. However, Power Platform Dataflows stores the data into CDS (Common Data Services), and you have to select the entity you want to load this data into it.
You can either choose one of the existing entities or create a new entity, then map fields. Because these are CDS entities, there are some rules for having key fields, name fields, etc. If you don’t know what the CDS is, and where it stores the data, read my article about CDS here.
After setting up the load to entity and mapping fields correctly, you will get into the Refresh settings. This is where you can refresh your dataflow even with the frequency of a minute. You can refresh it as many times as you want. You are not limited to eight times or even 48 times a day.
Here you can see the frequency of the test dataflow that I set up, which ran every minute.
I know what you are going to ask now! What about licensing? What type of license do you need for this? Well, for Power Platform dataflows, you don't need a Power BI license at all – not Premium, not even Pro. You do need, however, a Power Apps license. At the time of writing this blog post, there are two options: $10 a month and $40 a month. Both of these can be cheaper than Power BI Premium if you are a single user (or even a few users). The main difference between the two Power Apps plans, as I see it, is the database and file capacity for CDS.
You can use the result of dataflows in Power BI (similar to the way that you can use the result of Power BI dataflows in Power BI). You need to Get Data from Power Platform Dataflows.
However, at the time of writing this blog post, this feature is still beta and under development, and might not show the Power Platform dataflows.
Another option is also to get data from the Common Data Service. (Because Power Platform Dataflows stores the data into CDS)
Having a Power Query script that can refresh even every minute is great. However, note that the dataflow refresh is different from the dataset refresh. If you use the result of the dataflow in a Power BI dataset, you still need to get your dataset refreshed, and that means you are limited to eight times a day, 48 times a day, or API refreshes depending on your Power BI license limits.
However, having the ability to refresh the dataflow more frequently can still be useful for some scenarios. I bet if you came to this post searching for such functionality, you already have some ideas about that.
The post Refresh Power BI Queries Through Power Platform Dataflows: Unlimited Times with Any Frequency appeared first on RADACAD.
The post Form Processing Work flow- Create Form Processing Model Part 1 appeared first on Microsoft Dynamics 365 Blog.
Another exciting new update in AI Builder is Form Processing. With Form Processing you can train a model with the forms you have, and AI Builder is able to detect each section of the form and the related values. This feature is especially exciting for me: for 5 years I have been one of the organizers of SQL Saturday Auckland, and I have always had challenges with session evaluation. In some strange way, people prefer to evaluate the sessions on paper rather than via an online link. So I needed to create an application with a cognitive service able to detect the elements in a form, such as the title of the session, the speaker name, and the comments from the attendees. In this post and the next one, I will explain the process from creating the form processing model to the flow application that helps me.
Before you start, make sure:
Now you can log into Power Apps; make sure to access an environment where you can see AI Builder. You should see the AI Builder icon on the left side of the window.
Next, under AI Builder, choose the Build option, then from the AI models choose Form Processing.
On the new Form Processing page, enter a Model Name, then select Create.
The first step is Analyze Documents: you need to import at least 5 samples of the filled form you have. Just make sure the quality is good and that no file exceeds 4MB.
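The two constraints in this step — at least 5 sample documents and no file over 4MB — can be sketched as a small pre-upload check. The file objects here are hypothetical {name, sizeBytes} records, not anything AI Builder exposes:

```javascript
const MIN_SAMPLES = 5;
const MAX_BYTES = 4 * 1024 * 1024; // 4MB per file

// Returns a list of problems; an empty list means the sample set looks uploadable.
function validateSamples(files) {
  const problems = [];
  if (files.length < MIN_SAMPLES) {
    problems.push("need at least " + MIN_SAMPLES + " samples, got " + files.length);
  }
  files.forEach(function (f) {
    if (f.sizeBytes > MAX_BYTES) {
      problems.push(f.name + " exceeds 4MB");
    }
  });
  return problems;
}

console.log(validateSamples([
  { name: "eval1.jpg", sizeBytes: 900000 },
  { name: "eval2.jpg", sizeBytes: 5 * 1024 * 1024 }
]));
```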
As I mentioned before, I want to use this form processing for session evaluation, so I imported 6 different session evaluations like the ones below:
After importing the forms, you need to analyse the documents, so AI Builder can detect each section of the forms and related values.
Analysing the forms takes a couple of minutes; after that, you need to check the identified fields and choose the ones that matter to you.
As you can see in the picture below, in this example three fields have been identified, and you can see the confidence for each by hovering your mouse over the field. Choose the fields you are happy with. The selected fields will be shown in the right panel.
After selecting the fields, you are in the second step of building form processing, to navigate to the third step, click on the Next.
Now that your sample forms are imported, you can train your model. You will see a summary of the model: the number of forms and the number of selected fields. Click the Train option to move to the last step.
Training takes a couple of minutes, depending on the number of documents. After training completes, go to the details page.
In the details page, you can see the number of documents you uploaded for training, the number of detected fields in each form, You can also test the model and then publish it.
To test the model, import a sample of the form you have, then wait to see the result on the Quick Test page.
In the sample, all trained fields should be identified. Next, go back to the details page and publish the model so you can use it in other applications.
I want to create a workflow so I can upload pictures to OneDrive, run this model, and store the result back to OneDrive as a text file so I can use it in a Power BI report. So, after creating the model, I navigate to Microsoft Flow and log in to the same environment where I created the form processing model.
To make sure I can see the model I just created in Microsoft Flow, click on the model; you should see the Form Processing model available with a Publish Status of Live.
In the next post, I will explain how I create a workflow that reads a scanned file from the OneDrive folder, applies the created model, and stores the result back in a text file in OneDrive.
The post Form Processing Work flow- Create Form Processing Model Part 1 appeared first on RADACAD.