Power BI Archives - Microsoft Dynamics 365 Blog
http://microsoftdynamics.in/tag/power-bi/

Power BI: Empowering Businesses to Make Data-Driven Decisions and Drive Innovation in Marketing Analytics
http://microsoftdynamics.in/2023/07/12/power-bi-empowering-businesses-to-make-data-driven-decisions-and-drive-innovation-in-marketing-analytics/
Wed, 12 Jul 2023 08:16:47 +0000


Data and Analytics (D&A) are widely recognized as the next frontier for driving innovation and enhancing productivity in the business realm. According to a report by McKinsey, organizations that embrace data-driven approaches can achieve impressive increases in EBITDA, with potential gains of up to 25%. Moreover, the majority of the world’s top 10 innovative companies, as identified by Boston Consulting Group, are data-focused enterprises, emphasizing the crucial role of data in shaping competitive advantages within the market.

To effectively leverage data and gain a competitive edge, organizations require powerful tools that enable marketers to make informed, data-driven decisions. One such tool that has made its mark in marketing analytics is Power BI. This blog will explore the key features and benefits of Power BI for marketing analytics and show how it empowers businesses to make data-driven decisions.

Power BI and Marketing Analytics

Power BI, developed by Microsoft, is a user-friendly analytics tool that connects businesses to data sources, transforming raw data into valuable insights and appealing reports. It facilitates marketing analytics with data integration, diverse visualizations, real-time reporting, and advanced analytics like predictive analytics and machine learning. Marketers can make informed decisions, identify trends, monitor campaigns, and optimize strategies efficiently.

Leveraging Power BI for Marketing Analytics

1. Data Preparation and Integration

Before diving into marketing analytics using Power BI, it is essential to ensure that your data is well-prepared and properly integrated. Follow these steps to leverage Power BI effectively:

  • Identify Key Data Sources: Determine the primary data sources relevant to your marketing analytics, such as CRM systems, Google Analytics, social media platforms, and advertising platforms.
  • Data Cleaning and Transformation: Cleanse and transform your data to ensure consistency and accuracy. Remove duplicates, handle missing values, and standardize data formats.
  • Data Integration: Connect Power BI to your data sources and import the relevant datasets. It offers a wide range of connectors that allow seamless integration with various data sources.
  • Data Modeling: Create a data model that aligns with your marketing goals. Define relationships between tables, create calculated columns and measures, and optimize data for efficient analysis.
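
In practice, the cleaning and transformation step happens in Power Query, but the idea can be sketched in a few lines of plain Python. This is only an illustrative sketch; the field names (`email`, `source`, `created`) and the date formats are assumptions for the example, not fields from any specific CRM.

```python
from datetime import datetime

def clean_leads(rows):
    """Deduplicate records, fill missing values, and standardize
    mixed date formats to ISO 8601 -- the three cleaning steps above."""
    seen, cleaned = set(), []
    for row in rows:
        key = (row.get("email") or "").lower()
        if not key or key in seen:          # drop duplicates and rows missing the key field
            continue
        seen.add(key)
        row["source"] = row.get("source") or "unknown"   # handle missing values
        for fmt in ("%Y-%m-%d", "%d/%m/%Y", "%b %d, %Y"):  # standardize formats
            try:
                row["created"] = datetime.strptime(row["created"], fmt).date().isoformat()
                break
            except ValueError:
                pass
        cleaned.append(row)
    return cleaned

rows = [
    {"email": "a@x.com", "source": "Google Ads", "created": "12/07/2023"},
    {"email": "A@X.com", "source": "Google Ads", "created": "2023-07-12"},  # duplicate
    {"email": "b@x.com", "source": None, "created": "Jul 12, 2023"},
]
print(clean_leads(rows))
```

The same dedupe, fill, and standardize operations map one-to-one onto Power Query steps (Remove Duplicates, Replace Values, Change Type Using Locale).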

2. Visualizing Marketing Data

Once your data is prepared and integrated, you can start visualizing your marketing data using Power BI. Follow these best practices for effective data visualization:

  • Right Visualizations: Select appropriate visualizations based on the type of data you want to represent. Power BI offers a variety of visualizations, including bar charts, line charts, pie charts, maps, and more.
  • Key Metrics: Identify the key metrics and KPIs (Key Performance Indicators) relevant to your marketing goals. Highlight these metrics prominently in your reports and dashboards for easy tracking.
  • Filters and Slicers: Implement filters and slicers to enable interactive exploration of your data. Users can drill down into specific dimensions or filter data based on specific criteria, enhancing data analysis capabilities.
  • Interactive Dashboards: Build interactive dashboards that provide an overview of your marketing performance. Arrange visualizations logically and use drill-through features to enable deeper analysis.

3. Advanced Analytics and Insights

Power BI offers advanced analytics capabilities that can take your marketing analytics to the next level. Here are some ways to leverage advanced analytics in Power BI:

  • Predictive Analytics: Forecast future outcomes like customer churn, lifetime value, and conversion rates using predictive analytics algorithms. Optimize marketing strategies based on these insights.
  • Segmentation and Targeting: Effectively segment your customer base using machine learning algorithms. Tailor marketing campaigns based on demographic, behavioral, or transactional attributes.
  • Text Analytics: Extract valuable insights from unstructured text data such as social media comments and customer reviews. Analyze sentiment, identify trends, and gauge customer satisfaction using Power BI’s text analytics features.
  • Marketing Attribution: Measure the impact of marketing activities across channels and touchpoints. Use Power BI to attribute conversions and analyze campaign effectiveness.
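
To make the attribution idea concrete, here is a minimal sketch (in Python, outside Power BI itself) of linear multi-touch attribution, where every channel on a converting customer journey receives an equal share of the conversion credit. The journeys and channel names are invented for the example.

```python
from collections import defaultdict

def linear_attribution(journeys):
    """Split 1.0 conversion credit equally across the channels
    in each converting customer journey."""
    credit = defaultdict(float)
    for touchpoints, converted in journeys:
        if not converted or not touchpoints:
            continue  # non-converting journeys earn no credit
        share = 1.0 / len(touchpoints)
        for channel in touchpoints:
            credit[channel] += share
    return dict(credit)

journeys = [
    (["email", "social", "search"], True),
    (["search"], True),
    (["social", "display"], False),
]
print(linear_attribution(journeys))
```

Other attribution models (first-touch, last-touch, time-decay) only change how `share` is assigned; the same per-channel totals would then feed a Power BI visual comparing channel effectiveness.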

Fostering a culture centered around data empowers organizations to enhance their effectiveness and efficiency. As per Forrester, organizations that leverage data to extract valuable insights for decision-making are nearly three times as likely to achieve double-digit growth.

Power BI empowers marketers with robust analytics and visualization capabilities, enabling data-driven decision-making in marketing analytics. By leveraging its data integration, visualization, real-time reporting, and advanced analytics features, marketers can gain a comprehensive view of their efforts, target segments effectively, optimize campaigns, and drive business growth. Organizations that harness Power BI for marketing analytics can foster a culture centered around data, stay ahead of the competition, and achieve significant growth through informed decision-making.

Inogic is a renowned Microsoft Gold ISV Partner, specializing in Dynamics 365 CRM and Power Platform professional services in India. With over 15 innovative apps, we bridge functional gaps, enhance user adoption, and boost productivity within Microsoft Dynamics 365 CRM. Our applications streamline business processes, ensuring a high return on investment. In addition to our app offerings, Inogic offers top-notch offshore development services for Dynamics CRM and Power Platform, delivering high-quality solutions at competitive prices.

When it comes to Power BI for marketing analytics, Inogic as an Offshore Techno-functional consultant excels at maximizing its potential. Our Power Platform development services enable businesses to integrate data from multiple sources, effectively visualize insights, access real-time reporting, and leverage advanced analytics. This partnership empowers organizations to make data-driven decisions and gain a competitive edge.

We also contribute to the community through our popular Inogic Blogs, which provide regular high-quality tips and tricks on all things Dynamics 365 CRM and Power Platform. To learn more about how Inogic can enhance your user experience with Dynamics 365 CRM and Power Platform, visit our website, or contact us at crm@inogic.com.

The post Power BI: Empowering Businesses to Make Data-Driven Decisions and Drive Innovation in Marketing Analytics first appeared on Microsoft Dynamics 365 CRM Tips and Tricks.


The post Power BI: Empowering Businesses to Make Data-Driven Decisions and Drive Innovation in Marketing Analytics appeared first on Microsoft Dynamics 365 Blog.

What is Data Factory in Microsoft Fabric
http://microsoftdynamics.in/2023/05/30/what-is-data-factory-in-microsoft-fabric/
Tue, 30 May 2023 04:49:34 +0000

What is Data Factory in Microsoft Fabric

Microsoft Fabric is an end-to-end data analytics solution in the cloud, and one of its workloads is called Data Factory. In this article, you will learn what Data Factory is, how it works with the rest of Microsoft Fabric, and what the elements and functions of Data Factory are.

Video

Microsoft Fabric

To understand Data Factory, it is best to understand Microsoft Fabric first. Microsoft Fabric is an end-to-end data analytics software-as-a-service offering from Microsoft. It combines several products and services into an end-to-end, easy-to-use platform for data analytics. Here are the components (also called workloads) of Microsoft Fabric.

Microsoft Fabric

To learn more about Microsoft Fabric and how to enable it in your organization, I recommend reading the articles below:

Data Factory Origin

Microsoft Fabric has a workload for data integration. Any end-to-end data analytics system should have a data integration component. Microsoft has been a strong leader in data integration tools and services for decades. This started with SQL Server tools such as DTS (Data Transformation Services) and SSIS (SQL Server Integration Services) and then stepped into cloud-based technologies such as ADF (Azure Data Factory). Microsoft also built Power Query, a data transformation engine that first targeted citizen data analysts.

Data Factory is the data integration component of Microsoft Fabric, which brings the power of Azure Data Factory and Power Query Dataflows into one place. For many years, we had these two technologies doing data transformations separately. Now, these two are combined under Fabric as Data Factory.

Power Query

Power Query Dataflows was first announced a few years ago as an additional component to Power BI for data transformation as a cloud technology that is simple to use for data analysts. But soon, it became more than just for Power BI; it became Power Platform Dataflows. These days, Power Query Dataflows are used for data transformations in Power BI projects and data migration in Power Apps projects.

Power Query

Although Power Query Dataflows covered the dataflow side, they needed some enhancements in scalability and in control of execution, with control flow elements (such as loop structures, conditional execution, etc.).

Azure Data Factory

Azure Data Factory came into the market many years ago as the next generation of SSIS for in-the-cloud ETL. However, the data transformation engine of Azure Data Factory was not built on a strong basis, so most of the time, ADF was used for data ingestion, and then with the help of SQL stored procedures, etc., for doing the transformation afterward. ADF was not a tool for citizen data analysts. It was instead for data engineers and developers. ADF used data pipelines to execute a group of activities as a flow, and among those activities, there were tasks such as copy data, running a stored procedure, etc.

Azure Data Factory. Image sourced from: https://learn.microsoft.com/en-us/azure/data-factory/introduction

For the past few years, we have always had this split: if you wanted a simple-to-use data transformation engine for not much data, you used Power Query Dataflows; if you wanted scalable data ingestion, you used Azure Data Factory.

Best of Both Worlds

Now, in Microsoft Fabric, the best of Power Query Dataflows and Azure Data Factory Data Pipelines is combined into one stream: Data Factory. Data Factory ensures that you still have the simple-to-use yet powerful Power Query transformation engine for data transformation, while on the other hand, you also have the scalability of Data Pipelines and can build a control flow for the execution of the ETL using them. In other words, Data Factory is a state-of-the-art ETL software-as-a-service offering within Microsoft Fabric.

Data Factory in Microsoft Fabric combines Azure Data Factory and Power Query Dataflows together.

Elements of Data Factory

Combining these two services brings great features that make Data Factory an ultimate ETL service. Here are some of them:

Data Connectors

For an ETL (Extract, Transform, Load) system, one of the most important aspects is what sources the data can be fetched from. Data Factory offers hundreds of data connectors, enabling you to get data from sources such as databases, files, folders, software-as-a-service systems, etc.

Data Factory Connectors

It is also possible to create your own connector if you are keen.

Dataflows

Dataflows are the heart of Data Factory. This is where you get the data from the sources, define the data transformations to prepare it in any shape needed, and finally load it into destinations. Dataflows use the Power Query data transformation engine, and you create them through the simple-to-use Power Query Editor online.

Dataflow

Power Query Editor online is not only a powerful graphical interface; it also enables the developer to write code in the M language, which is the data transformation language of Power Query.

Power Query Editor online

To learn more about Dataflows, I suggest reading my article below.

At the time of writing this article, Dataflows support a few destinations, which are:

  • Azure Data Explorer (Kusto)
  • Azure SQL Database
  • Data Warehouse
  • Lakehouse

Data Pipelines

Although Dataflows are the main ETL component of the Data Factory, they can be enhanced when wrapped by a control flow execution component. This control flow execution component is called Data Pipeline. A Data Pipeline is a group of activities (or tasks) defined by a particular flow of execution. The activities in a Pipeline can involve copying data, running a Dataflow, executing a stored procedure, looping until a certain condition is met, or executing a particular set of activities if a condition is met, etc.

Data Pipeline

Data Pipelines can then be scheduled, and there is a monitoring tool to check the execution state of the pipeline. In addition, through activity state outputs, you can define what happens if a certain task fails or succeeds.
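
As a rough mental model only (this is not Fabric's actual API, and the activity names are invented), the control-flow idea of a pipeline can be sketched in a few lines of Python:

```python
def run_pipeline(activities):
    """Execute activities in order and stop at the first failure,
    mimicking a pipeline's on-success/on-failure control flow at a toy level."""
    for name, action in activities:
        try:
            action()
            print(f"{name}: Succeeded")
        except Exception as exc:
            print(f"{name}: Failed ({exc})")
            return False
    return True

staged = []
pipeline = [
    ("Copy data",            lambda: staged.append("raw rows")),
    ("Run dataflow",         lambda: staged.append("transformed rows")),
    ("Run stored procedure", lambda: staged.append("loaded")),
]
ok = run_pipeline(pipeline)
```

A real pipeline adds loops and conditional activities on top of this sequential core, plus the monitoring view described above.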

As mentioned, one of the most important activities that can be done in a Pipeline is the execution of a Dataflow. This is where Dataflows and Data Pipelines work together in their best way.

Executing Dataflows from Data Pipeline

To learn more about Data Pipelines, read my article below;

Summary

Data Factory is an ETL-in-the-cloud solution that serves as the data integration workload of Microsoft Fabric. Data Factory is not a new product or service; it comes from many years of Microsoft data transformation tools and services and is built on top of Power Query and Azure Data Factory. Data Factory uses two main components to deliver the best ETL scenarios possible: Dataflows and Data Pipelines. Dataflows handle the main get-data, transform, and load process, and Data Pipelines control the rest of the execution with control flow activities.

I highly recommend reading the articles below to study more about Data Factory;

The post What is Data Factory in Microsoft Fabric appeared first on RADACAD.


The post What is Data Factory in Microsoft Fabric appeared first on Microsoft Dynamics 365 Blog.

Streamline Power BI Refresh: Refresh dataset after a successful refresh of dataflow
http://microsoftdynamics.in/2021/01/07/streamline-power-bi-refresh-refresh-dataset-after-a-successful-refresh-of-dataflow/
Thu, 07 Jan 2021 02:58:11 +0000


streamline Power BI dataflow and dataset refresh

Do you have a Power BI dataset that gets data from a dataflow? Have you ever thought, “Can I get the dataset refreshed only after the refresh of the dataflow has completed successfully?” The answer is yes, you can. One of the recent updates from the data integration team of Power BI made this available. Let’s see in this blog and video how this is possible.

The scenario

If you are using both dataflows and datasets in your Power BI architecture, then your datasets are very likely getting part of their data from Power BI dataflows. It would be great if you could get the Power BI dataset refreshed right after a successful refresh of the dataflow. In fact, you can build a scenario like the one below.

streamline the refresh of Power BI dataset automatically after successful refresh of the Power BI dataflow

Power Automate connector for dataflow

Power Automate recently announced the availability of a connector that allows you to trigger a flow when a dataflow refresh completes.

Trigger for when the dataflow refresh completes

Choosing the dataflow

You can then choose the workspace (or environment if you are using Power Platform dataflows), and the dataflow.

Dataflow setting in the Power Automate dataflow connector

Condition on success or fail

The dataflow refresh can succeed or fail, and you can choose the proper action in each case. To do this, check whether the result of the refresh is Success.

checking if the dataflow refresh was successful

Refresh Power BI dataset

In the event of successful refresh of the dataflow, you can then run the refresh of the Power BI dataset.

refresh Power BI dataset from Power Automate

Refreshing a Power BI dataset through Power Automate is an ability that we have had for some time in the service.

Capture the failure

You can also capture the failure details and send a notification (or you can add a record in a database table for further log reporting) in the case of failure.

send email notification if the dataflow refresh failed

Overall flow

As you can see below, the overall flow is really simple but gives effective control of the refresh.
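
Translated into plain logic (a sketch only; the real thing is clicked together in Power Automate, and the function names here are invented), the flow branches like this:

```python
def on_dataflow_refresh_completed(refresh_status, refresh_dataset, notify):
    """Branch the way the flow does: refresh the dataset only when the
    dataflow refresh result is 'Success'; otherwise send a notification."""
    if refresh_status == "Success":
        return refresh_dataset()
    return notify(f"Dataflow refresh ended with status: {refresh_status}")

log = []
on_dataflow_refresh_completed("Success",
                              refresh_dataset=lambda: log.append("dataset refresh triggered"),
                              notify=lambda msg: log.append(msg))
on_dataflow_refresh_completed("Failed",
                              refresh_dataset=lambda: log.append("dataset refresh triggered"),
                              notify=lambda msg: log.append(msg))
print(log)
```

In the actual flow, `refresh_dataset` is the Power BI "Refresh a dataset" action and `notify` is an email or a row inserted into a log table.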

refresh Power BI dataset after dataflow

My thoughts

Making sure that the refresh of the dataset happens after the refresh of the dataflow has been one of the challenges for Power BI developers who use dataflows. Now, using this simple functionality, you can get the refresh process streamlined all the way from the dataflow.

A dataflow refresh can also be run as a task in Power Automate, which might be useful for some scenarios, such as running the refresh of the dataflow after a certain event.

refresh a dataflow from Power Automate

This is not only good for refreshing the Power BI dataset after the dataflow; it is also good for refreshing one dataflow after another. Especially in best-practice dataflow scenarios, I always recommend having layers of dataflows for staging, data transformation, etc., as I explained in the article below.

multi-layered dataflow. source: https://docs.microsoft.com/en-us/power-query/dataflows/best-practices-reusing-dataflows

Dataflows are not a replacement for data warehouses. However, having features like this helps the usability and adoption of this great transformation service.

Can you think of any scenarios you would use this for? Let me know in the comments below; I’d love to hear about your scenarios.

Video


The post Streamline Power BI Refresh: Refresh dataset after a successful refresh of dataflow appeared first on RADACAD.


The post Streamline Power BI Refresh: Refresh dataset after a successful refresh of dataflow appeared first on Microsoft Dynamics 365 Blog.

Get Data using SQL Server of Microsoft Dynamics 365 Online in Power BI and Embed the Power BI Report as Dashboard in Dynamics 365
http://microsoftdynamics.in/2020/06/06/get-data-using-sql-server-o-microsoft-dynamics-365-online-in-power-bi-and-embed-the-power-bi-report-as-dashboard-in-dynamis-365/
Sat, 06 Jun 2020 18:00:21 +0000


Check3: Create a Report in Power BI with SQL Server Connection and Embed the Power BI Report as Dashboard in Dynamics 365

One of the best parts, which we checked above, is that relationships are correctly shown just as they are in CDS.

  • Second, lookup, option set, and regarding fields all give both the name/label and the value as separate fields, something that needed extra work when getting data from OData.
  • Below, I have selected lookup and option set fields as both Name and ID
https://i0.wp.com/microsoftdynamics.in/wp-content/uploads/2020/06/img_5edb6726ab9cd.png?fit=1832%2C784

Check4: Publish Report in Power BI Desktop Client

  • We will use the above report itself to embed in Microsoft Dynamics 365
  • We have added only the account and contact entities with some fields to demonstrate how relationships are pre-built and how we get both the lookup name and ID
  • Now we will publish the report and select the workspace
https://i0.wp.com/microsoftdynamics.in/wp-content/uploads/2020/06/img_5edb81d545c56.png?fit=1747%2C863

Check5: Pin Report as Dashboard in Power BI Portal

  • Once published, we will edit the credentials to use OAuth
  • Navigate to My Workspace -> Datasets and edit the credentials
https://i0.wp.com/microsoftdynamics.in/wp-content/uploads/2020/06/img_5edb8397d89e8.png?fit=1925%2C909
  • Selecting Authentication Method as “OAUTH”
  • Remember to enable Below.
    End users use their own OAuth2 credentials when accessing this data source via DirectQuery.
https://i0.wp.com/microsoftdynamics.in/wp-content/uploads/2020/06/img_5edb85a1c3a8c.png?fit=1628%2C860
  • Now we will pin the report live so it can be used in Dynamics 365
  • Navigate to the report from the workspace
https://i0.wp.com/microsoftdynamics.in/wp-content/uploads/2020/06/img_5edb8661ecd4c.png?fit=1918%2C579
  • Once pinned, we can see the pinned item in Dashboards, and it will also be available as a dashboard in D365
  • I changed the name while pinning it as a dashboard
https://i0.wp.com/microsoftdynamics.in/wp-content/uploads/2020/06/img_5edb86fe3d722.png?fit=1916%2C489

Check6: Embed the Power BI Dashboard in Dynamics 365

  • Navigate to the D365 dashboards
  • Click NEW and select either Microsoft Dynamics 365 or Power BI Dashboard
  • Select a layout

https://i0.wp.com/microsoftdynamics.in/wp-content/uploads/2020/06/img_5edb8815399a8.png?fit=1715%2C782
  • Select Power BI Tile
  • All our dashboards in the selected workspace will be available
https://i0.wp.com/microsoftdynamics.in/wp-content/uploads/2020/06/img_5edb88cde6f4d.png?fit=1897%2C918

TESTING: Power BI Dashboard is added to D365

https://i0.wp.com/microsoftdynamics.in/wp-content/uploads/2020/06/img_5edb898a7dd33.png?fit=1919%2C886

The post Get Data using SQL Server o Microsoft dynamics 365 Online in Power BI and Embed the Power BI report as Dashboard in Dynamis 365 appeared first on Microsoft Dynamics 365 Blog.

Get Data using SQL Server Connection of Microsoft Dynamics 365 CDS Online within Power BI and embedding in Dynamics 365
http://microsoftdynamics.in/2020/06/06/get-data-using-sql-server-connection-of-microsoft-dynamics-365-cds-online-within-power-bi-and-embedding-in-dynamics-365/
Sat, 06 Jun 2020 17:47:35 +0000


In this post, we will get data into Power BI using DirectQuery over the SQL Server connection to Microsoft Dynamics 365 CRM.

In the previous post, we enabled the TDS endpoint using the utility Microsoft.Crm.SE.OrgDBOrgSettingsTool. But now we have an option to enable the TDS endpoint from http://admin.powerplatform.microsoft.com/.

  • Navigate to http://admin.powerplatform.microsoft.com/ -> Select the environment -> Under Features, we will find a checkbox for enabling the TDS endpoint. That will make a connection to the SQL Server endpoint possible.
  • Also, enable Power BI visualization embedding
  • Verify that your environment has at least version 9.1.0.17437.
https://i0.wp.com/microsoftdynamics.in/wp-content/uploads/2020/06/img_5edb5a87d2a96.png?fit=1638%2C829

Check1: Connect to SQL Server database in Power BI

Once the prerequisites are completed, we are ready to get data using SQL Server.

  • Click on Get Data -> select SQL Server and enter the details,
    i.e., ORGNAME.crm.dynamics.com,5558 (replace crm according to your region), and optionally the database (ORGNAME)
  • Select DirectQuery as the option and click OK
https://i0.wp.com/microsoftdynamics.in/wp-content/uploads/2020/06/img_5edb5f5ba4c68.png?fit=1919%2C813
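
As a sketch of how that server address is assembled (this helper function is an illustration, not an official API; the TDS endpoint listens on port 5558, and the `crm` segment varies by region):

```python
def tds_server_address(org, region="crm"):
    """Build the server value Power BI's SQL Server connector expects
    for the Dynamics 365 TDS endpoint: ORGNAME.<region>.dynamics.com,5558."""
    return f"{org}.{region}.dynamics.com,5558"

print(tds_server_address("orgname"))          # -> orgname.crm.dynamics.com,5558
# A regional org might use a segment such as "crm4" instead of "crm"
print(tds_server_address("orgname", "crm4"))  # -> orgname.crm4.dynamics.com,5558
```
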
  • Now we will authenticate using a Microsoft account and connect
https://i0.wp.com/microsoftdynamics.in/wp-content/uploads/2020/06/img_5edb60a56f29a.png?fit=1709%2C807

Check2: Select Entities and check the Model for relationship testing

  • After a successful connection, let us select some entities and select Load

  • I will be selecting Account and Contact, then Load

  • Previously, when we used to get data using OData, we needed to transform the data instead of just loading it. We can still transform data, but this connection has reduced a lot of the work after getting the entities, as we will see next
https://i0.wp.com/microsoftdynamics.in/wp-content/uploads/2020/06/img_5edb617d639ff.png?fit=1099%2C874

Once the connection is created, in the model we will see the relationships pre-created; we don’t have to do much work regarding relationships.

Previously, we used to have to create the relationships again in Power BI.

https://i0.wp.com/microsoftdynamics.in/wp-content/uploads/2020/06/img_5edb62230a517.png?fit=1509%2C676

Check3: Create a Report in Power BI with SQL Server Connection and Embed the Power BI Report as Dashboard in Dynamics 365

One of the best parts, which we checked above, is that relationships are correctly shown just as they are in CDS.

  • Second, lookup, option set, and regarding fields all give both the name/label and the value as separate fields, something that needed extra work when getting data from OData.
  • Below, I have selected lookup and option set fields as both Name and ID
https://i0.wp.com/microsoftdynamics.in/wp-content/uploads/2020/06/img_5edb6726ab9cd.png?fit=1832%2C784

The post Get Data using SQL Server Connection of Microsoft dynamics 365 CDS online within Power BI and embedding in dynamics 365 appeared first on Microsoft Dynamics 365 Blog.

Analyse the JSON File with Power Query
http://microsoftdynamics.in/2020/01/30/analyse-the-json-file-with-power-query/
Thu, 30 Jan 2020 02:18:34 +0000


Following on from the last post, I will explain how to analyze a JSON file that was generated in the sentiment analysis process.

Some explanation: this is a JSON file that contains the sentiment analysis for the comment one traveler put on the hotel website, shown below.

The suite was awesome. We did not have much interaction with the staff. We did sleep on the queen size sofa bed in the living room instead of the queen size bed in the actual bedroom due to the temperature. It was very hot and humid. The air conditioner in the living room does not cool off the bedroom. This suite is very close to shopping and dining. After each day of adventures, we would walk to dinner at different joints. I will stay here again when I return.

The JSON file is like below:

{
  "predictionOutput": {
    "result": {
      "sentiment": "mixed",
      "documentScores": {"positive": 0.64, "neutral": 0.05, "negative": 0.31},
      "sentences": [
        {"sentiment": "positive", "sentenceScores": {"positive": 1.0, "neutral": 0.0, "negative": 0.0}, "offset": 0, "length": 22},
        {"sentiment": "neutral", "sentenceScores": {"positive": 0.01, "neutral": 0.83, "negative": 0.16}, "offset": 23, "length": 48},
        {"sentiment": "neutral", "sentenceScores": {"positive": 0.0, "neutral": 1.0, "negative": 0.0}, "offset": 72, "length": 134},
        {"sentiment": "neutral", "sentenceScores": {"positive": 0.03, "neutral": 0.93, "negative": 0.04}, "offset": 207, "length": 26},
        {"sentiment": "negative", "sentenceScores": {"positive": 0.01, "neutral": 0.06, "negative": 0.93}, "offset": 234, "length": 69},
        {"sentiment": "positive", "sentenceScores": {"positive": 0.92, "neutral": 0.08, "negative": 0.0}, "offset": 304, "length": 48},
        {"sentiment": "neutral", "sentenceScores": {"positive": 0.02, "neutral": 0.96, "negative": 0.02}, "offset": 353, "length": 73},
        {"sentiment": "neutral", "sentenceScores": {"positive": 0.03, "neutral": 0.92, "negative": 0.05}, "offset": 427, "length": 37}
      ]
    }
  },
  "operationStatus": "Success",
  "error": null
}

The JSON file allocates an overall score to the whole comment, and then, for each sentence, you get a separate sentiment score. The result in this JSON format is not ready to analyze, so I decided to use Power Query to solve this problem.
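
Before moving to Power Query, it may help to see the structure unpacked in plain code. This sketch uses Python's standard json module on an abbreviated version of the file (overall scores plus the first two sentence entries only) and recovers each sentence's text from its offset and length:

```python
import json

# Abbreviated version of the JSON above (first two sentences only).
payload = json.loads("""
{"predictionOutput": {"result": {
    "sentiment": "mixed",
    "documentScores": {"positive": 0.64, "neutral": 0.05, "negative": 0.31},
    "sentences": [
      {"sentiment": "positive",
       "sentenceScores": {"positive": 1.0, "neutral": 0.0, "negative": 0.0},
       "offset": 0, "length": 22},
      {"sentiment": "neutral",
       "sentenceScores": {"positive": 0.01, "neutral": 0.83, "negative": 0.16},
       "offset": 23, "length": 48}
    ]}},
 "operationStatus": "Success", "error": null}
""")

comment = "The suite was awesome. We did not have much interaction with the staff."

result = payload["predictionOutput"]["result"]
print("overall:", result["sentiment"], result["documentScores"])

# Slice each sentence out of the original comment by offset/length --
# the same pairing the Power Query merge-by-index steps achieve.
for s in result["sentences"]:
    text = comment[s["offset"]: s["offset"] + s["length"]]
    print(s["sentiment"], "->", text)
```

The Power Query steps that follow do the equivalent: expand the record into a sentence table, then line it up with the sentences of the original comment.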

First, open Power BI Desktop and navigate to Power Query. Import the JSON file, then load the data, and click on the record to expand it and see the record and list. Right-click on both of them and add each as a separate query.

Then click on To Table and expand the records to see what is inside. Finally, you should see the eight rows of data labeled positive, negative, and neutral, with the last two columns holding the offset and length of each sentence.

Now we need to add the original text to merge them together, import the text file, then use the dot separator and Transpose to make them as a table. Then use the Add Column and add index column, to both the Sentence query and the hotel comment. Then in the Home Tab click on the Merge columns and merge them based on the Index

 

 

 


The post Analyse the JSON File with Power Query appeared first on RADACAD.

Automation of Sentiment Analysis using AI Builder http://microsoftdynamics.in/2020/01/28/automation-of-sentiment-analysis-using-ai-builder/ Tue, 28 Jan 2020 07:55:19 +0000 https://radacad.com/?p=12575

AI Builder has many uses: object detection, form processing, binary prediction, text classification, and business card reading. Besides these, another capability is now available in Power Automate. In this post, I am going to show you how to use this feature to create a process that automatically applies sentiment analysis, creates an output file to store back to OneDrive, and then cleans the result using Power Query.

Power Automate

Navigate to Power Automate.

Log in to an account that has an AI Builder environment. Then, under Solutions, create a new flow with the trigger "When a file is created" in OneDrive.

 

In the next step, select the Predict component from the Next step action list; in the Model section, choose SentimentAnalysisModel.

Then, in the Request Payload section, click in the box and enter the code below:

{"text":"","language":"en"}

(Make sure every quote is a straight double quote, ", not a curly quote.)
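If you ever build this payload in code rather than typing it, a JSON serializer guarantees straight quotes. A minimal Python sketch, where file_content is just a stand-in for the File content dynamic value supplied by the flow:

```python
import json

# Stand-in for the "File content" value supplied by the flow trigger.
file_content = "The suite was awesome."

# json.dumps always emits straight double quotes, which the Predict
# action requires; curly (typographic) quotes would break the payload.
payload = json.dumps({"text": file_content, "language": "en"})
print(payload)
```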

 

 

In the new window, choose Expression, click between the empty quotes after "text", and from Dynamic content select File content.

Now add a new action and select Create file to send the result back to OneDrive.

For the folder path, choose the location where you want to save the file; for the file name, you can use the Expression box to create a simple name, such as:

concat(utcNow(), '.json')
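This expression simply produces a unique, timestamp-based file name. A rough Python equivalent of what it generates:

```python
from datetime import datetime, timezone

# Rough equivalent of concat(utcNow(), '.json'): the current UTC
# timestamp with a .json extension, giving a unique name per run.
file_name = datetime.now(timezone.utc).isoformat() + ".json"
print(file_name)
```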

Then, in File content, select ResponsePayload from Dynamic content.

 

Now save the flow and test it: upload a text file containing some comments (such as a hotel review), then run the flow.

The text below can serve as a sample hotel review:

"The suite was awesome. We did not have much interaction with the staff. We did sleep on the queen size sofa bed in the living room instead of the queen size bed in the actual bedroom due to the temperature. It was very hot and humid. The air conditioner in the living room does not cool off the bedroom. This suite is very close to shopping and dining. After each day of adventures we would walk to dinner at different joints. I will stay here again when I return."

A JSON file should now be created, and you can use Power BI to analyze it easily.

In the next post, I will explain how to analyze the result with Power Query.

See the full video here.

 


The post Automation of Sentiment Analysis using AI Builder appeared first on RADACAD.

Refresh Power BI Queries Through Power Platform Dataflows: Unlimited Times with Any Frequency http://microsoftdynamics.in/2020/01/06/refresh-power-bi-queries-through-power-platform-dataflows-unlimited-times-with-any-frequency/ Mon, 06 Jan 2020 02:44:18 +0000 https://radacad.com/?p=12167

If you don't have a Power BI Premium capacity license, you are limited to refreshing your dataflows up to eight times a day, at a minimum frequency of 30 minutes. The good news is that there is a way to do unlimited refreshes, at whatever frequency you like, using Power Platform Dataflows. Read the rest of this article to learn how.

What is Dataflow?

In the world of Power BI, you can separate out the data transformation layer using dataflows. A dataflow runs one or more Power Query scripts in the cloud and stores the result in Azure Data Lake. The result of the dataflow (stored in Azure Data Lake) can then be used in a Power BI dataset via Get Data from Power BI dataflows. If you want to learn more about dataflows, I highly recommend reading my dataflows series, which starts with this article.

Creating Power Platform Dataflows

Dataflow is not only a Power BI concept; it is also available as a Power Platform dataflow. You can check out one of the announcements of features added in Power Platform Dataflows by Miguel Llopis (one of the product managers on the Data Integration team at Microsoft) here. You create Power Platform Dataflows very much the way you create Power BI dataflows. The only difference at the beginning is where you create them: Power BI dataflows are created in the Power BI service, while Power Platform Dataflows are created in the Power Apps (or Power Platform) portal.

Log in to the Power Platform portal here.

Then go to Dataflows under the Data section.

Start a new dataflow.

Provide a name for the dataflow, and then click Create.

At this step, you can either get data from any source you want, or use Blank Query and copy a Power Query script from Power BI Desktop into it.

After preparing your queries, you can move on to loading them.

Load to Entity

The load to entity step in Power Platform Dataflows is a bit different from Power BI Dataflows. In Power BI Dataflows, the result is loaded as CSV files into Azure Data Lake, so no further configuration is needed. Power Platform Dataflows, however, store the data in CDS (Common Data Service), and you have to select the entity you want to load the data into.

You can either choose one of the existing entities or create a new entity, then map the fields. Because these are CDS entities, there are some rules about key fields, name fields, and so on. If you don't know what CDS is, or where it stores the data, read my article about CDS here.

Refresh Settings

After setting up the load to entity and mapping the fields correctly, you get to the refresh settings. This is where you can refresh your dataflow even with a frequency of one minute. You can refresh it as many times as you want; you are not limited to eight, or even 48, times a day.

Refresh as much as you want

Here you can see the frequency of the test dataflow that I set up, which ran every minute.

What about Licensing?

I know what you are going to ask now: what about licensing? What type of license do you need for this? Well, for Power Platform dataflows, you don't need a Power BI license at all: not Premium, not even Pro. You do, however, need a Power Apps license. At the time of writing this blog post, there are two options: $10 a month and $40 a month. Both can be cheaper than Power BI Premium if you are a single user (or even a few users). The main difference between the two Power Apps plans, as I see it, is the database and file capacity for CDS.

How to use the results in Power BI?

You can use the result of dataflows in Power BI (similar to the way that you can use the result of Power BI dataflows in Power BI). You need to Get Data from Power Platform Dataflows.

However, at the time of writing this blog post, this feature is still in beta and under development, and might not show the Power Platform dataflows.

Another option is to get data from the Common Data Service (because Power Platform Dataflows store the data in CDS).

Note that Power BI Dataset Refresh Limits Still Apply

Having a Power Query script that can refresh every minute is great. However, note that a dataflow refresh is different from a dataset refresh. If you use the result of the dataflow in a Power BI dataset, you still need to refresh that dataset, which means you are limited to eight refreshes a day, 48 a day, or API-triggered refreshes, depending on your Power BI license.

However, being able to refresh even just the dataflow more frequently can still be useful in some scenarios. I bet if you came to this post searching for such functionality, you already have some ideas about that 😉.


The post Refresh Power BI Queries Through Power Platform Dataflows: Unlimited Times with Any Frequency appeared first on RADACAD.

Make Power BI report using data from Azure SQL server and view in Dynamics 365 http://microsoftdynamics.in/2018/11/05/make-power-bi-report-using-data-from-azure-sql-server-and-view-in-dynamics-365/ Mon, 05 Nov 2018 12:58:42 +0000 https://www.inogic.com/blog/?p=13289

Introduction:

In this blog, we will see how to show data from SQL Server in Dynamics CRM. First, you need to connect Power BI to your SQL Server database; after that, we can display the Power BI report in Dynamics CRM.

Step 1:

Register with Microsoft Power BI and sign in with your Dynamics 365 credentials.


Step 2:

Once you sign in to Power BI, click the Get Data button at the top left.


Step 3:

Then select Azure SQL Database as the source and click Connect.


Step 4:

After clicking Connect, enter the server details of your Azure SQL database and specify the database on which you are going to create reports.


Meaning of the Data Connectivity Mode selection:

  1. Import

This imports all the tables you selected from SQL Server into Power BI. For real-time data, you have to refresh the dataset manually, or you can schedule a refresh.

  2. DirectQuery

This runs queries directly against SQL Server, meaning the data is not imported into Power BI. You get real-time data, so there is no need to refresh the dataset in Power BI.
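The practical difference between the two modes can be sketched with a toy example in Python (the in-memory "database" below is purely illustrative): Import takes a snapshot that goes stale until refreshed, while DirectQuery reads the source at query time.

```python
# Toy in-memory "database" standing in for the Azure SQL source.
database = {"rows": [1, 2, 3]}

# Import mode: a snapshot is copied into the model; it stays as-is
# until a manual or scheduled dataset refresh.
snapshot = list(database["rows"])

# DirectQuery mode: every read goes back to the source.
def direct_query():
    return list(database["rows"])

database["rows"].append(4)   # the source data changes

print(snapshot)        # snapshot is stale: [1, 2, 3]
print(direct_query())  # always current: [1, 2, 3, 4]
```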

Step 5:

After clicking OK, select the tables you want to import into Power BI, then click the Load button.


Step 6:

Here you can see your tables imported from SQL Server.


Step 7:

Here you can see the relationships between your tables.


Step 8:

Here you can see all the data in the tables.


Step 9:

Now create a report using Power BI visualizations and filters.


Step 10:

To view this report in Dynamics CRM, we have to publish it.


Step 11:

After publishing, the report will appear in the Power BI service.


Step 12:

Now click Pin Live Page to place the report on a Dynamics CRM dashboard. Then add the report to a new or existing dashboard.


Step 13:

Now go to the CRM dashboards; from here you can access this Power BI report.

Step 1: From here you can get all Power BI dashboard reports.


Step 2: After that, choose the workspace and dashboard where you pinned the Power BI report.


Step 3: Here is your Power BI report, shown in the Dynamics CRM dashboard.


Conclusion:

Using the simple steps above, we can show Power BI reports built on an Azure SQL database data source in Dynamics CRM.



The post Make Power BI report using data from Azure SQL server and view in Dynamics 365 appeared first on Microsoft Dynamics 365 Blog.
