
The Modern Data Warehouse in Azure Part 4: The Serving Layer

In this video blog post I cover the serving layer step of building your Modern Data Warehouse in Azure. There are certainly decisions to be made around how you want to structure your schema as you get it ready for presentation in your business intelligence tool of choice (Power BI in this example), so I discuss some of the areas you should focus on:

  • What is your schema type? Star, snowflake, or something else? (See the sketch after this list.)
  • Where should you serve up the data? SQL Server, Synapse, ADLS, Databricks, or something else?
  • What are your service-level agreements for the business? What are your data processing times?
  • Can you save costs by using an option that's less compute-heavy?
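
To make the star-versus-snowflake question concrete, here's a toy sketch in Python with pandas: a single fact table joined directly to denormalized dimension tables, which is a star schema. In a snowflake schema, the dimensions themselves would be normalized into further lookup tables. All table and column names here are invented for illustration.

```python
import pandas as pd

# A tiny star schema: one fact table joined directly to denormalized dimensions.
dim_product = pd.DataFrame({
    "product_key": [1, 2],
    "product_name": ["Widget", "Gadget"],
    "category": ["Hardware", "Hardware"],  # denormalized into the dimension (star)
})
dim_date = pd.DataFrame({
    "date_key": [20190101, 20190102],
    "calendar_date": ["2019-01-01", "2019-01-02"],
})
fact_sales = pd.DataFrame({
    "date_key": [20190101, 20190102, 20190102],
    "product_key": [1, 1, 2],
    "sales_amount": [100.0, 150.0, 80.0],
})

# A typical serving-layer query: every dimension joins straight to the fact table.
report = (
    fact_sales
    .merge(dim_product, on="product_key")
    .merge(dim_date, on="date_key")
    .groupby(["calendar_date", "category"])["sales_amount"]
    .sum()
    .reset_index()
)
print(report)
```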

What Power BI XMLA Endpoints Mean for You

Are you taking advantage of Power BI's modeling capability? It's a terrific built-in capability, and many users build models as they begin to design their reports.

However, as we transition Power BI into being our enterprise visualization and reporting tool, some of the legacy applications just aren't going to go away. Many of those applications connect to XMLA endpoints from other semantic model providers, such as Analysis Services in your on-premises SQL Server environment.

So, the Power BI team decided to give you the ability to do the same. There's a newly announced feature in public preview called XMLA Endpoints for Power BI. Beyond the ability to connect to the XMLA endpoints from other analytical dashboarding tools (like Tableau), you can also connect from any other toolset that supports XMLA.

For example, many of the Microsoft development tools can connect as well: SQL Server Management Studio (SSMS), SQL Server Profiler, DAX Studio, and even Excel pivot tables. Just keep in mind the endpoints are currently read-only, so some of your capabilities will be limited. Microsoft's documentation on this feature states that a read/write option is planned.
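
To make that concrete, here's a minimal sketch of querying a published dataset over the XMLA endpoint from Python. It assumes the third-party pyadomd package (a wrapper around the ADOMD.NET client libraries) and an XMLA-enabled Premium workspace named "Sales" with a dataset named "SalesModel"; those names, the install path, and the elided authentication details are all placeholders.

```python
from sys import path

# pyadomd needs the ADOMD.NET client libraries on the path
# (assumed install location; adjust for your machine).
path.append(r"C:\Program Files\Microsoft.NET\ADOMD.NET\150")

from pyadomd import Pyadomd  # third-party: pip install pyadomd

# Hypothetical workspace/dataset; the powerbi:// URL is how Premium
# workspaces expose their XMLA endpoint. Credentials are elided here.
conn_str = (
    "Provider=MSOLAP;"
    "Data Source=powerbi://api.powerbi.com/v1.0/myorg/Sales;"
    "Initial Catalog=SalesModel;"
)

# A simple DAX query, the same kind of request DAX Studio would send.
dax = "EVALUATE TOPN(10, Sales)"

with Pyadomd(conn_str) as conn:
    with conn.cursor().execute(dax) as cur:
        for row in cur.fetchall():
            print(row)
```

Since the endpoints are read-only in the preview, this kind of query access is the whole surface area for now.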

From a licensing perspective, access to XMLA Endpoints is available for datasets in Power BI Premium only, but any user can connect to the endpoints regardless of whether they have a Pro license. The feature itself is turned on within the settings of your Power BI Premium capacity. It's only available in preview at this time, so it's not supported for production use.

In this scenario, tools such as SQL Server Data Tools (SSDT) give you the ability to change your model from Visual Studio. Now you can start to see a scenario where you use GitHub repositories to manage source control for your models, among endless other uses.

I must say I'm impressed with the modular approach the team has taken to roll out Power BI as a 100% enterprise-class tool. First we saw the Dataflows feature, which pulls ETL/ELT into its own lane. Then composite models were introduced, where you can have offline and online data sources. Now we've got XMLA Endpoints, which clearly defines 3 separate layers of development within the Power BI ecosystem.

These recently added features have opened some great avenues and spread user adoption, and I know there are more great things to come. The team's commitment to evolving and improving this product has been excellent, and I look forward to what they'll roll out next.

New Development Feature for Azure Stream Analytics

Gaining insights from our data, especially in real time, is an important part of any business. Today I'd like to talk about some new development options for Azure Stream Analytics. If you're not clear on what Azure Stream Analytics is, it's a fully managed cloud service in Azure that allows you to rapidly develop and deploy low-cost solutions to gain real-time insights from devices, sensors, infrastructure and applications.

Stream Analytics is part of the Azure IoT suite, which brings IoT to life and allows you to easily connect your devices, analyze previously untapped data and integrate business systems. The IoT space is expanding, as it offers so much capability and information for things like production floors, jet engines and automobiles, just to name a few. I did another blog on some of the features here.

Today my focus is a new feature that allows you to do local testing of query logic within Visual Studio, with live data, without needing to run in the cloud. You can test your queries locally while using live data streams from sources such as Event Hub, IoT Hub or Blob Storage. You can also use the Stream Analytics time policies, and start and stop queries in a matter of seconds.

This offers a big improvement in development productivity, as you can save a lot of time on the inner loop of query logic testing.
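
As a hedged illustration of the "live data" half of this: the azure-eventhub Python SDK can push sample telemetry into an Event Hub so that a query running locally in Visual Studio has a live stream to consume. The connection string, hub name and payload shape below are all placeholders.

```python
import json
import time

from azure.eventhub import EventData, EventHubProducerClient  # pip install azure-eventhub

# Placeholder connection details for your Event Hubs namespace.
CONN_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<name>;SharedAccessKey=<key>"
EVENTHUB_NAME = "device-telemetry"

producer = EventHubProducerClient.from_connection_string(
    CONN_STR, eventhub_name=EVENTHUB_NAME
)

with producer:
    for i in range(10):
        batch = producer.create_batch()
        # A simple sensor reading for the local Stream Analytics query to consume.
        reading = {"deviceId": "sensor-1", "temperature": 20 + i}
        batch.add(EventData(json.dumps(reading)))
        producer.send_batch(batch)
        time.sleep(1)
```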

Some major benefits are:

  • Query behavior consistency, so you get the same experience whether you're using Visual Studio or the cloud interface.
  • Much shorter test cycles. You can normally expect a lag in cloud development; testing queries directly in Visual Studio in your local environment shows you the shape of the data coming in, helping you easily adjust the query and see immediate results.

A couple of caveats with deployment in this new feature:

  • The local testing feature should only be used for functional testing purposes. It doesn’t replace the performance or scalability tests that you would do inside the cloud.
  • It really should not be used for production purposes since it doesn’t guarantee any kind of SLA.
  • Also note that when you're on your machine, you rely on local resources, but when you deploy to the cloud, you can scale out to multiple nodes, which allows you to add more streams and the additional resources to process them.
  • Cloud deployment provides things like checkpointing, upgrades and other features you need for production deployments, as well as the infrastructure to run your jobs 24/7.

So, remember, this new enhancement is just for testing purposes, to help shorten the query development cycle and avoid the lag of other testing and development approaches. But it's a cool, time-saving feature to investigate, and Microsoft is adding more features to Azure Stream Analytics.

Azure Data Factory vs Logic Apps

Customers often ask, should I use Logic Apps or Data Factory for this? Of course, the answer I give is the same as with most technology: it depends. What is the business use case we're talking about?

Logic Apps can help you simplify how you build automated, scalable workflows that integrate apps and data across cloud and on-premises services. Azure Data Factory is a cloud-based data integration service that allows you to create data-driven workflows in the cloud for orchestrating and automating data movement and data transformation. Similar definitions, so that probably didn't help at all, right?

Let me try to clear up some confusion. There are some situations where the best-case scenario is to use both: where a feature is lacking in Data Factory, it can often be found in Logic Apps, which has been around longer. A great use case is alerting and notifications, for instance. You can call out from Data Factory's Web activity and send a notification through a Logic App via email back to a user to say a job has completed or failed (see the sketch below).
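
As a sketch of that pattern: an HTTP-triggered Logic App exposes a callback URL, and anything that can make a POST request, whether a Data Factory Web activity or this illustrative Python stand-in, can fire the notification. The URL and payload shape are hypothetical; the real URL comes from the Logic App's "When an HTTP request is received" trigger and includes a SAS signature.

```python
import requests  # pip install requests

# Hypothetical callback URL copied from the Logic App's HTTP trigger.
LOGIC_APP_URL = "https://prod-00.eastus.logic.azure.com/workflows/<id>/triggers/manual/paths/invoke?<sas>"

# The payload the Logic App parses to build the notification email.
payload = {
    "pipeline": "LoadSalesWarehouse",  # invented pipeline name
    "status": "Failed",
    "message": "Copy activity failed on step 3",
}

resp = requests.post(LOGIC_APP_URL, json=payload, timeout=30)
resp.raise_for_status()  # Logic Apps replies 200/202 once the run is accepted
```

A Data Factory Web activity would make this same kind of call, with the payload built from pipeline system variables.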

To answer the question of why you would use one over the other, I'd say it comes down to how much data we're moving and how much transformation we need to do on that data to make it ready for consumption. Are we reporting on it, putting it in Azure SQL Data Warehouse, building facts and dimensions, creating our enterprise data warehouse and then reporting off that with Power BI? That would all require a decent amount of heavy lifting, and I would not suggest a Logic App for it.

If you're monitoring a folder on-prem or in OneDrive, looking to see when files get posted there, and you simply want to move that file to another location or send a notification about an action on the file, this is a great use case for a Logic App.

However, the real sweet spot is when you can use them together, as that helps you maximize cost efficiency. Depending on the operation, it can be more or less expensive to run in Data Factory versus Logic Apps.

You can also make your operations more efficient. Utilize the power of Azure Data Factory with its SSIS integration runtime and feature sets that include things like Databricks and HDInsight clusters, where you can process huge amounts of data with massively parallel processing, or use your Hadoop file stores for reporting off structured, unstructured or semi-structured data. Logic Apps can then help you enhance the process.

Clear as mud, right? Hopefully I was able to break it down a bit better. To put it in simple terms: when you think about Logic Apps, think about business applications; when you think about Azure Data Factory, think about moving data, especially large data sets, transforming that data and building data warehouses.


Azure Common Data Services

What do you know about Azure Common Data Services? Today I'd like to talk about this product, which Microsoft recently reworked to expand upon the product's vision for business applications. Common Data Services is an Azure-based business application platform that enables you to easily build and extend applications with your customers' business data.

Common Data Services helps you bring together your data from across the Dynamics 365 suite (CRM, AX, Nav, GP) and use this common service to more easily extract data, rather than having to get into the core of those applications. It also allows you to focus on building and delivering the apps you want, and on the insights and process automation that will help you run more efficiently. Plus, it integrates nicely with PowerApps, Power BI and Microsoft Flow.

Some other key things:

  • If you want to build Power BI reports from your Dynamics 365 CRM data, there are pre-canned entities provided by Microsoft.
  • Data within the Common Data Services (CDS) is stored within a set of entities. An entity is just a set of records used to store data, similar to how a table stores data within a database. (See the sketch after this list.)
  • CDS should be thought of as a managed database service. You don't have to create indexes or do any kind of database tuning; you're not managing a database server as you would with SQL Server or a data warehouse. It's designed to be somewhat of a centralized data repository from which you can report or do further things.
  • PowerApps is quickly becoming a good replacement for things like Microsoft Access, as it comes with comparable functionality and feature sets. A common use for PowerApps is extending that data rather than having to dig into the underlying platform.
  • This technology is easy to use, to share and to secure. You set up your user account as you would with Azure Services, giving specific permissions/access based on the user.
  • It gives you the metadata you need based on that data, and you can specify what kind of field or column you're working with in that entity.
  • It gives you logic and validation; you can create business process rules and data flows from entity to entity or vice versa, or from an app to an entity to PowerApps.
  • You can create workflows that automate business processes, such as data cleansing or record updating; these workflows can run in the background without manual management.
  • It has good connectivity with Excel, which makes it user-friendly for people comfortable with that platform.
  • For power users, there’s an SDK available for developers, which allows you to extend the product and build some cool custom apps.
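
As a sketch of what entities look like from code, here's a hedged example of reading records from the built-in account entity through the OData Web API that CDS and Dynamics 365 expose. The organization URL is a placeholder, and acquiring the Azure AD access token is elided.

```python
import requests  # pip install requests

# Placeholders: your organization URL and an Azure AD access token.
ORG_URL = "https://contoso.crm.dynamics.com"
ACCESS_TOKEN = "<token acquired from Azure AD>"

headers = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "Accept": "application/json",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
}

# Query the 'account' entity: top 5 records, two columns, much like
# a SELECT against a table.
resp = requests.get(
    f"{ORG_URL}/api/data/v9.0/accounts",
    headers=headers,
    params={"$select": "name,telephone1", "$top": "5"},
)
resp.raise_for_status()
for record in resp.json()["value"]:
    print(record["name"], record.get("telephone1"))
```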

I don't think of this as a replacement for Azure SQL DW or DB, but it does give you the capability to have table-based data in the cloud with some nice hooks into the Dynamics 365 space, as well as outputs to PowerApps and Power BI.

3 Power BI Offerings to Consider…

I'm often asked by clients which Power BI offering is best for their business and where they should store their data. The 3 main offerings around Power BI each have areas where they excel. It comes down to understanding what each offers to decide the best fit for your organization's data and needs.


Overview of Power BI Embedded

Everyone is familiar with Power BI Desktop, cloud and on-prem, but not as many are familiar with Power BI Embedded. So, what is it? Power BI Embedded allows your company to embed dashboards and reports in your in-house developed applications, and you only need one Power BI account to have a Power BI Embedded environment.

This Azure service is separate from Power BI Premium or Pro and is billed by compute rather than per user, as with other Power BI iterations. The design lets you focus on your applications and your customers instead of management and maintenance.

You have options when setting up your Azure tenant. You can use your existing tenant ID, create a new application for the tenant, or create a tenant for a specific customer. There are 3 straightforward steps to get you up and running:

1.  Set up your Power BI Embedded environment within Azure. Then set up your tenants, user requirements and workspaces.

2.  Then embed your content: go to your backend, set up your application and connect to Azure through the REST API that Azure provides (see the sketch after these steps). This is all secure, encrypted traffic going over SSL. If you're handling authentication when displaying your reports and dashboards, you do it through your backend application's authentication system, rather than the Azure application authentication system.

3.  Lastly, release your reports and dashboards to production. You'll need to decide what compute requirements you have, then set up your tiered pricing, pick your plan and you're ready to go.
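
To illustrate step 2, here's a hedged sketch of the backend call that makes embedding work: asking the Power BI REST API to generate an embed token for a report, which your application then hands to the client-side embed code. The workspace and report IDs are placeholders, and acquiring the Azure AD access token is elided.

```python
import requests  # pip install requests

# Placeholders: an Azure AD token for the Power BI API, plus the
# workspace (group) and report IDs from the portal.
ACCESS_TOKEN = "<Azure AD token for the Power BI API>"
GROUP_ID = "<workspace-guid>"
REPORT_ID = "<report-guid>"

# GenerateToken issues a short-lived embed token that the frontend
# uses to render the report inside your own application.
url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/reports/{REPORT_ID}/GenerateToken"
)
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"accessLevel": "View"},
)
resp.raise_for_status()
print(resp.json()["token"])  # hand this embed token to the client
```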