
Accelerate Your AI with Machine Learning on Azure Data Box Edge

In some past blogs I've discussed Azure Data Box and how the Data Box family has expanded. Today I'll talk about Azure Data Box Edge (in preview) and elaborate on the machine learning capability it provides on your premises with the power of Azure behind it.

If you don't know, Azure Data Box Edge is a physical hardware device that sits in your environment and collects data from sources like IoT devices, letting you take advantage of the AI features offered by the device. It then sends the data to Azure for further processing, storage or reporting purposes.

Microsoft recently announced Azure Machine Learning hardware accelerated models, powered by Project Brainwave, on the Data Box Edge. Because so much real-world data is generated and used at the edge of our networks – like images and videos collected from factories, retail stores or hospitals – it can now be used for things such as manufacturing defect analysis, inventory out-of-stock detection or diagnostics.

Applying machine learning models to the data on Data Box Edge provides lower latency (and savings on bandwidth cost), since we don't have to send all the data to Azure for analysis, while still offering real-time insight and speed to action for critical business decisions.

You can enable data scientists to simplify and accelerate the building, training and deployment of machine learning models using the Azure Machine Learning service, which is already generally available. They can access all these capabilities from their favorite Python environment, using the latest open source frameworks such as PyTorch, TensorFlow and scikit-learn.

These models can run on CPUs and GPUs, but this preview expands that to field-programmable gate arrays (FPGAs), which is the processor on the Data Box Edge.

The preview is currently a bit limited, but you're able to use the Azure Machine Learning service to train a TensorFlow model for image classification scenarios, containerize that model in a Docker container, and then deploy it to the Data Box Edge via IoT Hub.
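
To give a sense of how those pieces fit together, here is a rough sketch of registering a trained TensorFlow image-classification model and packaging it as a Docker image with the Azure Machine Learning SDK (v1), which could then be deployed to the Data Box Edge as an IoT Edge module. The workspace config, model path, entry script and model name are placeholders, not part of the announcement itself.

```python
# Hedged sketch: register a trained TensorFlow model and package it as a
# container image with the Azure ML SDK v1 (pip install azureml-sdk).
# Assumes an existing workspace described by config.json, a saved model in
# ./outputs/model, a score.py entry script and an environment.yml file.
from azureml.core import Workspace
from azureml.core.model import Model, InferenceConfig
from azureml.core.environment import Environment

ws = Workspace.from_config()  # loads workspace details from config.json

# Register the trained model so the service can version and package it
model = Model.register(workspace=ws,
                       model_path="./outputs/model",   # local folder with the saved model
                       model_name="defect-classifier") # placeholder name

# Scoring environment with TensorFlow and other dependencies
env = Environment.from_conda_specification(name="tf-scoring",
                                           file_path="environment.yml")

# score.py implements init() and run() for inference
inference_config = InferenceConfig(entry_script="score.py", environment=env)

# Build a Docker image containing the model and scoring code; the resulting
# image can be pushed to a registry and deployed to the device through IoT Hub
package = Model.package(ws, [model], inference_config)
package.wait_for_creation(show_output=True)
print(package.location)  # address of the built image in the workspace registry
```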

A good use case for this is if you’re using AI models for quality control purposes. Let’s say you know what a finished product should look like and what the quality specs are, and you build a model defining those parameters. Then you take an image of that product as it comes off the assembly line; now you can send those images to the Data Box Edge in your environment and more quickly capture defects.

Now you're finding the root cause of defects more quickly and throwing away fewer defective products, and therefore saving money. I'm looking forward to seeing how enterprises are going to leverage this awesome technology.

What Power BI XMLA Endpoints Mean for You

Are you taking advantage of Power BI's modeling capability? This is a terrific built-in capability, and many users build models as they begin to design their reports.

However, as we transition Power BI into being our enterprise visualization and reporting tool, some legacy applications just aren't going to go away. Many of those applications connect to XMLA endpoints from other semantic model providers, such as Analysis Services on your on-premises SQL Server.

So, the Power BI team decided to give you the ability to do the same. There is a newly announced feature in public preview called XMLA Endpoints for Power BI. Beyond connecting to the XMLA endpoints with other analytical dashboarding tools (like Tableau), you can also connect from other toolsets that support XMLA.

For example, many of the Microsoft development tools can connect as well, things like SQL Server Management Studio (SSMS), SQL Server Profiler, DAX Studio and even Excel pivot tables. Just keep in mind it's currently available for read access only, so some of your capabilities will be limited. Microsoft's documentation on this feature states that a read/write option is planned soon.
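
As an illustration of connecting from outside the Microsoft BI tools, here is a hedged sketch of querying a Premium dataset's XMLA endpoint from Python with the pyadomd package, which wraps the ADOMD.NET client (so it requires Windows and the ADOMD.NET client libraries). The workspace name, dataset name, measure and install path are placeholders, and sign-in behavior depends on your environment.

```python
# Hedged sketch: run a DAX query against a Power BI Premium dataset through
# its XMLA endpoint using pyadomd (pip install pyadomd). Requires the
# ADOMD.NET client libraries; the path, workspace, dataset and measure names
# below are placeholders.
from sys import path
path.append("C:\\Program Files\\Microsoft.NET\\ADOMD.NET\\160")  # ADOMD.NET DLLs

from pyadomd import Pyadomd

# XMLA endpoint format for a Premium workspace; the dataset acts as the catalog
conn_str = (
    "Provider=MSOLAP;"
    "Data Source=powerbi://api.powerbi.com/v1.0/myorg/Sales Workspace;"
    "Initial Catalog=Sales Dataset;"
)

# Any DAX (or MDX) query works; remember the preview is read-only
dax = "EVALUATE SUMMARIZECOLUMNS('Date'[Year], \"Sales\", [Total Sales])"

with Pyadomd(conn_str) as conn:             # prompts for Azure AD sign-in if needed
    with conn.cursor().execute(dax) as cur:
        for row in cur.fetchall():
            print(row)
```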

From a licensing perspective, access to XMLA endpoints is available for datasets in Power BI Premium only, but any user can connect to the endpoints regardless of whether they have a Pro license. The feature itself is turned on within the settings of your Power BI Premium capacity. This is only available in preview at this time, so it's not supported in production.

In this scenario, other tools such as SQL Server Data Tools in Visual Studio, where you can work with the model directly, give you the ability to change your model. Now you can start to see a scenario where you use GitHub repositories to manage source control for your models, among other endless uses.

I must say I'm impressed with the modular approach the team is taking to roll out Power BI as a 100% enterprise-class tool. First, we saw the Dataflows feature, which breaks the ETL/ELT out into its own lane. Then things like composite models were introduced, where you can have offline and online data sources. Now we've got XMLA endpoints, which clearly define 3 separate layers of development within the Power BI ecosystem.

These recently added features have opened some great avenues and spread user adoption and I know there are more great things to come. The commitment of the team to evolve and improve this product has been excellent and I look forward to what they’ll roll out next.

What is Azure Cost Management?

If you’re using Azure, one key piece is effectively planning and controlling the costs involved in running your business. In this post, I’ll discuss what I believe to be one of the most critical functions in Azure that will help you properly deploy and manage your environment.

Azure Cost Management helps you with this planning and cost control. Don't confuse cost management with billing; they are two different things. Billing is simply retrieving the bill, auditing it for accuracy and paying it, while cost management shows organizational cost and usage patterns with advanced analytics.

Reports in Cost Management show the usage-based costs consumed by Azure services and third-party Marketplace offerings. Costs are based on negotiated prices, and they factor in things like reservation and Azure Hybrid Benefit discounts, for example.

These reports show all your internal and external usage costs and some of the Azure Marketplace charges. One thing to be aware of when looking at the reports is that some other charges, such as reservation purchases, support and taxes, are not yet shown.

They'll also help you understand your spending and resource usage, as well as find any spending anomalies. Some predictive analytics are also available to help with future budgeting needs based on your previous usage trends. Using this, Azure Cost Management can apply management groups, budgets and recommendations to show you how your organization's expenses are organized and how you can reduce costs going forward.
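
If you want to pull these cost and usage numbers programmatically rather than through the portal, the Cost Management query API can be called from code. Below is a minimal sketch assuming the azure-identity and azure-mgmt-costmanagement Python packages; the subscription ID is a placeholder, and the exact cost column name can vary by offer type.

```python
# Hedged sketch: query month-to-date daily costs for a subscription with the
# Cost Management SDK (pip install azure-identity azure-mgmt-costmanagement).
# The subscription ID below is a placeholder.
from azure.identity import DefaultAzureCredential
from azure.mgmt.costmanagement import CostManagementClient
from azure.mgmt.costmanagement.models import (
    QueryDefinition, QueryDataset, QueryAggregation
)

client = CostManagementClient(DefaultAzureCredential())
scope = "/subscriptions/00000000-0000-0000-0000-000000000000"  # placeholder

query = QueryDefinition(
    type="ActualCost",             # actual usage-based costs
    timeframe="MonthToDate",
    dataset=QueryDataset(
        granularity="Daily",
        aggregation={"totalCost": QueryAggregation(name="Cost", function="Sum")},
    ),
)

result = client.query.usage(scope=scope, parameters=query)
for row in result.rows:            # each row holds cost, date and currency
    print(row)
```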

In addition, in 2017 Microsoft purchased a company called Cloudyn. Cloudyn was a third-party tool that allowed you to manage other cloud providers as well as your Azure services. Since the purchase, Microsoft has begun to move some of those services over to Azure Cost Management and to pull them out of, or stop supporting them in, Cloudyn. The Azure Cost Management page gives you guidance as to which service is best for you.

Besides the service offering, Microsoft also offers some best practices around cost management including some key principles such as:

  • Planning ahead of deploying any resources so that the appropriate sizing and services are configured before you ‘jump in the pond’.
  • Visibility gives you alerting and reporting so you can ensure all the affected parties are notified appropriately of the services they are using and the costs they're incurring.
  • Accountability ensures that those groups are accountable for the services they are using, that they understand the implications of the services they are deploying, and that they get this information in a timely manner so they know what their spending is.
  • Optimization is the process of deploying compute resources while making sure you're looking at where you can save money, such as using the appropriate licensing structures, hybrid benefits, bringing your own license (BYOL) and reserved instances. It also means looking at what your compute is actually consuming so you can see where you may be able to cut costs.

Azure Cost Management is a continuous, iterative process and the key to managing expenses appropriately is being sure you are keeping an eye on these things I’ve discussed and continually tweaking the services that you’re using.

What is Azure Active Directory B2C?

How important is secure identity management to you? If you're like most, it is a top priority. In today's post I'll talk about Azure Active Directory B2C, which is an identity management service that enables you to customize and control how users securely interact with your web, desktop, mobile or even single-page applications.

Using Azure AD B2C, users can sign up, sign in, reset passwords and edit profiles for the various applications they’re using.

When implementing these policies, you have two choices:

  • Using common identity user flows within the Azure portal or,
  • For the more skilled developer, or if the templates in the portal don't support your use case, you can use XML-based custom policies.

Once you make that decision, your choice will define the path of authentication, commonly referred to as the user journey. User journeys allow you to control behaviors by configuring settings, such as which social accounts (like Facebook) the user can use to sign up for the application.

You can also configure the data collected from the user, such as first name or postal code, the multi-factor authentication options, the look and feel of how users interact with pages, and the information returned to the application.

Azure Active Directory B2C supports the OpenID Connect and OAuth 2.0 protocols for these user journeys; following the journey is ultimately what produces the token that authenticates the user. The interaction of every application follows a similar high-level pattern shown in the graphic below, and a minimal token-acquisition sketch follows the numbered steps:

AAD B2C Flow

The steps here are:

1. The application directs the user to run a policy.

2. The user completes the policy according to the policy definition.

3. Then the application receives a token.

4. And then uses that token to try to access a resource.

5. The resource server then validates the token to verify that access can be granted.

6. And the application periodically refreshes the token in the background (there really are 5 steps, but this 6th step happens over and over).
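
To make steps 1 through 3 concrete, here is a minimal sketch of an application acquiring a token through a B2C user flow using the MSAL library for Python. The tenant name, client ID, user-flow name and API scope are placeholders for illustration only.

```python
# Hedged sketch: acquire a token from an Azure AD B2C user flow (policy)
# with MSAL for Python (pip install msal). All names below are placeholders.
import msal

TENANT = "contosob2c"                           # hypothetical B2C tenant
POLICY = "B2C_1_signupsignin"                   # hypothetical sign-up/sign-in user flow
CLIENT_ID = "00000000-0000-0000-0000-000000000000"
AUTHORITY = f"https://{TENANT}.b2clogin.com/{TENANT}.onmicrosoft.com/{POLICY}"
SCOPES = ["https://contosob2c.onmicrosoft.com/demo-api/demo.read"]  # placeholder API scope

app = msal.PublicClientApplication(CLIENT_ID, authority=AUTHORITY)

# Steps 1-2: the app directs the user to run the policy; a browser opens and the
# user completes it (sign up, sign in, MFA, etc.)
result = app.acquire_token_interactive(scopes=SCOPES)

# Step 3 onward: the app receives a token and presents it to the resource server
if "access_token" in result:
    print("Token acquired; expires in", result.get("expires_in"), "seconds")
else:
    print("Sign-in failed:", result.get("error_description"))
```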

Azure AD B2C can also work with additional identity providers such as Amazon, Facebook and Google that will create, maintain and manage identity information while providing authentication services to their (and other) applications.

Typically, you would only use one identity provider in your application but there are no restrictions for using more if your use case calls for it.

The main value of this service is the ability to lessen the need for username and password management across so many applications, thus improving the user experience. Our lives have been made a bit easier now that many applications, both web and desktop based, allow a single sign-on or no sign-on experience because they are already pre-authenticated with a service like this.

What is Azure Active Directory Seamless Single Sign-On?

We’re all dealing with many usernames and passwords in our everyday life, right? Today I’d like to talk about an authentication feature within Azure Active Directory that can help you with easier, faster access.

Azure Active Directory Seamless Single Sign-on (Azure AD Seamless SSO) automatically signs users in when they are on their corporate devices connected to their corporate network. When this is enabled, users don’t have to type their passwords, or even their username, to sign in to Azure Active Directory.

This feature provides users with easy access to cloud-based applications without needing any additional on premises components.

First let’s discuss how this is set up:

  • SSO is enabled using Azure AD Connect. The following steps occur while enabling this feature:
    • A computer account representing Azure AD is created in your on premises Active Directory in each AD forest.
    • The computer account's Kerberos decryption key is shared securely with Azure AD, and then two Kerberos service principal names (SPNs) are created to represent two URLs that are used during Azure AD sign-on.

Authenticating in Browser

  • When doing authentication from a web browser for a web app, essentially a user navigates to a website and signs into Azure AD (see below).
AADSSSO-Image 1
  • Azure AD sends a Kerberos request to on-premises AD, and on-premises AD looks for an account related to the device you're signing in on as well as a user account. If authorized, you get access.

Authenticating with Native Application

  • For a native client, like Outlook for instance, the process is a bit different (see below).
AADSSSO-Image 2
  • Here, the request is made from the device you're using and authenticated off Azure AD, with a Kerberos ticket issued when it is successful.
  • When that ticket is authenticated off Azure AD and approved, a SAML token is sent to the app. The token then gets sent back to Azure AD for OAuth 2.0 authentication.
  • Once all that checks out, access is granted.

Now let’s talk about the benefits.

  • First, it's a much better user experience. Users are automatically signed in to both on-premises and cloud-based applications using their built-in authentication, so there's no need for users to repeatedly reenter their passwords.
  • It's also easy to deploy and administer. There are no additional components needed on premises; it works with Azure AD Connect, which synchronizes your on-premises AD to Azure Active Directory. Plus, it works with either method of cloud authentication: password hash synchronization or pass-through authentication.
  • Additionally, it can be rolled out to only some or all of your users by using group policy.

So, this is a great way to allow users to authenticate to multiple websites and applications using only one authentication tool. It will minimize the amount of administration required to set up those users once it's in place, and it should reduce the number of password resets for your help desk team or whoever oversees that.

How and Why to Add a Source Code Repository to Azure Data Factory

For developers, it's very beneficial to have a source code repository. A source code repository helps you keep all your changes, manage tasks and branches, share the code with a team and, simply put, keep it in a safe place.

In this post, I’ll tell you why you should connect your Azure Data Factories to a source code repository, and I’ll demo how to do so:

  • To do this, I’ll start in my data factory inside my Azure portal.
  • When I go into Author & Monitor, I have the ability to either:
    • Set up a code repository within the landing page or main page here or,
    • I can go directly into my data factory, use the drop-down in the left-hand corner and select 'Set up code repository'.
  • One thing to note is that the code repository itself has been supported for a while, but recently, with the release of Data Flows, GitHub is now supported for the repositories as well (previously it was only Azure DevOps).
  • So, I select my GitHub account and fill in the information (a scripted version of this setup is sketched after this list). *If you're doing this for the first time, it's going to prompt you to log into your GitHub account. In this case, I've already connected it previously, so it knows about my repositories.
  • Next, I select my repository name and I’ll go to my playground branch and I’m going to use my existing playground.
  • One field will ask me ‘Branch to import resources into:’ so if I’m importing resources, I can select an existing one or create a new one. For this demo I’m going to pick my playground.
  • Before I hit Save, notice on the left-hand side I've got zero pipelines, one data set and zero data flows. But when I connect to my playground, it's going to bring in everything I've previously saved or checked in on that playground branch. So, now you'll see I have 6 pipelines, 23 data sets and 4 data flows.
  • One of the other nice pieces of having source control is adding a new data set. I just select the SQL Server I was previously connected to, leave it on the defaults for now and connect.
  • I then select one of the tables; I selected the Product Category Table.
  • You'll see at the top you have the option to Save All or Publish. Save All is going to save any of the changes across the tabs; you can tell when there's been a change, whether it be a data set, pipeline or data flow, by the star next to the name.
  • Now, instead of needing to publish every time you're doing development, you can just save your work. Rather than having to publish the entire pipeline, run the error checking and make sure the pipeline is in good standing, you can save part way through. This is a huge advantage, because publishing the entire pipeline every time can cause challenges and isn't efficient for development.
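
For completeness, the same "set up code repository" step can also be scripted. Below is a hedged sketch using the azure-mgmt-datafactory Python package; the subscription, resource group, factory, GitHub account and repository names are placeholders.

```python
# Hedged sketch: attach a GitHub repository to an existing Data Factory from
# code, mirroring the portal's "Set up code repository" dialog.
# pip install azure-identity azure-mgmt-datafactory; all names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import FactoryRepoUpdate, FactoryGitHubConfiguration

subscription_id = "00000000-0000-0000-0000-000000000000"        # placeholder
client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

repo_config = FactoryGitHubConfiguration(
    account_name="my-github-account",        # GitHub account or organization
    repository_name="adf-playground",        # repository name
    collaboration_branch="playground",       # branch to import resources into
    root_folder="/",                         # folder in the repo for the ADF JSON
)

update = FactoryRepoUpdate(
    factory_resource_id=(
        "/subscriptions/00000000-0000-0000-0000-000000000000"
        "/resourceGroups/my-rg/providers/Microsoft.DataFactory/factories/my-adf"
    ),
    repo_configuration=repo_config,
)

# The call is made against the factory's Azure region
result = client.factories.configure_factory_repo("eastus", update)
print(result.repo_configuration)
```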

The addition of GitHub support gives us another great option if you weren't previously using Azure DevOps (formerly known as Visual Studio Team Services). I'll be doing more upcoming Azure Every Day blogs around Data Factory that will walk you through some of the nuances, as the product has greatly evolved since the release of Data Flows.

Expanding the Azure Data Box Family

In a previous blog I introduced Azure Data Box. Today I’d like to talk about how Microsoft is expanding the Azure Data Box family by introducing you to the Azure Data Box Gateway and the Azure Data Box Edge devices.

Until now the Data Box family has been the Disk, the Box and the Heavy. Each has its own storage limits, but all are designed to improve the way you upload massive amounts of data into Azure without having to wait for it to travel across the wire or saturate your bandwidth (consider that the offline method).

Microsoft learned from customers that they want a better way to sync their local storage directly with Azure storage for operations like archival and disaster recovery. Here’s where Azure Data Box Gateway comes in.

The Data Box Gateway is a cloud storage gateway device that resides on premises and sends your image, media and other data directly to Azure.

  • The Gateway is a virtual machine provisioned in your hypervisor (VMware or Hyper-V); you write data directly to this virtual device using the NFS or SMB protocols, and it then sends that data to Azure.
  • One use case for the Data Box Gateway is continuously ingesting massive amounts of data. We have a local data source producing data in large amounts and capacities, and we can stream it and sync it directly with our Azure storage.
  • Another use case is cloud archival of data in a secure and efficient way. Think of the Gateway as handling the incremental data transfer over the network after the initial bulk transfer is done using the Data Box of your choice, tying directly into the same Azure storage account you used for your Data Box.

Azure Data Box Edge is a storage solution that allows you to process data and send it over the network to Azure.

  • Data Box Edge uses a physical device supplied by Microsoft to accelerate the secure data transfer.
  • The device resides on premises in your network stack and you write data to it (also using NFS or SMB.)
  • It is additionally equipped with AI enabled Edge computing capabilities which help to analyze, process or filter data as it moves to Azure block blob, page blob or Azure files.
  • It has the appropriate chips to process intelligent learning (artificial intelligence, machine learning, deep learning and such).
  • Use cases include things like pre-processing data. We can analyze data from on-premises or IoT devices to get faster information about the data. That pre-processing allows us to do things like aggregating data before it gets sent to Azure, or modifying data, such as taking out PII.
  • You can also subset and transfer the data needed for deeper analytics in the cloud.
  • Additionally, you can analyze and react to IoT events. If you're running IoT devices on prem and you want to be able to respond more quickly when those events occur, this is a great way to handle that.
  • Another great use case is you can run machine learning models to get quick results that can be acted on before the data is sent to the cloud.
  • With these IoT use cases, you don't have to wait for the data to be transmitted over the wire, have the munging happen up in Azure and then get results back. You can return those results on the fly, in real time, and react more quickly.
  • Eventually the full data set is transferred to help you retain and improve your machine learning models. You can continually feed them data and have those models retrained repeatedly, so they become more precise over time.

The Data Box family is a very cool technology, and these online additions further extend its capabilities.

5 Ways Azure Makes Your Enterprise More Secure

Security is, or should be, a top priority; nothing is more important than making your enterprise secure. In this post I’ll tell you 5 ways Azure makes your enterprise more secure.

First off, Azure is a Microsoft product. When you’re one of the world’s largest companies, there are an enormous amount of threats that need to be evaluated every second of the day. So, obviously Microsoft is aware of these challenges.

With that in mind, Microsoft has developed centers of excellence over the past ten years in order to be ready for these attacks. The Microsoft Threat Intelligence Center processes over 6.5 trillion signals a day so they can better understand what kind of information is out there and what types of attack vectors exist.

Each month they block over 5 billion distinct malware threats. And they staff over 3500 security professionals in their defense operations centers to help thwart these attacks. Since Active Directory is a standard for user authentication control, they introduced Azure Active Directory years ago to extend that to their Azure platform.

All that being said, here are 5 ways that Azure makes your enterprise more secure:

1. Minimize the requirement for password use – By using Microsoft Authenticator when connecting to Software as a Service applications (like Dropbox, Salesforce, etc.), the authenticator replaces your password with a multi-factor sign-in using something like your phone and your fingerprint, face ID or a PIN, based on the Windows device that you're using.

With 2-factor authentication on those devices, you have a simpler method instead of remembering a bunch of different passwords.

2. Security Scorecard – A while back I did a post on Azure Secure Score and the Secure Score Center. With this, you use the Azure portal to gain awareness of where there are potential exposures or best practices that need to be followed, which helps your organization stay better secured.

3. Microsoft Threat Protection Suite – Helps detect, investigate and remediate issues across your organization, including endpoints, email, documents, identity and infrastructure elements. It also helps your security team automate many of those manual, mundane security tasks.

4. Confidentiality – Microsoft was the first cloud vendor to introduce confidentiality and integrity for data while it's in use. So, consumers don't have to worry about their data being put in the wrong hands (like with some of those other cloud vendors you may have heard of recently in the news).

Data is always encrypted at rest and in transit. The security will soon extend to the chip level for added security on certain Azure VMs. Intel has built in some security measures inside their chips and now Microsoft is going to interact directly with those chips to keep you more secure.

5. Microsoft Information Protection Service – This enables you to automatically discover, classify, label, protect and monitor data no matter where it lives or travels on your Microsoft devices.

We’re now seeing many more open source capabilities and seeing more of these applications being sent over to Macs and Linux PCs for instance. Essentially this labeling capability is built into office apps and such across all the major platforms and can add protection capability to things like PDF documents, a feature currently in preview.

But the idea is that it's going to help you protect against things such as PII being exposed. So, it's an added level of protection to ensure there are no security leaks.

So, it's clear from all this that Microsoft is committed not only to securing their own services and software, but also to securing the enterprises and individuals who use them.

If you’re concerned about security, check out some of the things I mentioned here and remember, Microsoft is making the investment and doing all they can to keep things secure.

Microsoft and BlackRock Announce Retirement Planning Partnership

At this point, the state of financial planning is a potential major crisis, with current and future generations approaching retirement age with little or no savings to fall back on.

As we've moved away from the pensions of old, the responsibility previously held by companies has now shifted to individuals, who have to invest and save on their own to ensure they're set up after they retire.

I wanted to share a recent press release from Microsoft announcing a partnership with BlackRock to help reimagine the way people manage their retirement planning. BlackRock is a world leader in wealth management, including providing solutions to consumers, and currently manages approximately $6.5 trillion in assets for investors worldwide.

The goal of this alliance is to find ways for people to interact with their retirement assets more, so they know what kind of contributions they’re making. BlackRock will design and manage a suite of next generation investment tools that aim to provide a ‘lifetime’ of income in retirement. This would be made available to US workers through their employer’s workplace savings plan.

The press release did not share much detail about what exactly the two firms will partner on, but the following is a quote from Microsoft CEO, Satya Nadella: “Together with BlackRock, we will apply the power of the cloud and AI to introduce new solutions that address this important challenge and reimagine retirement planning.”

As we know, AI, deep learning and machine learning and all their related technologies, can have a profound impact on information gathering, processing and the intelligence we can extract from it. This helps us make better decisions.

The idea here is to offer technology options to businesses for their employees to consume and to promote fiduciary responsibility. This will include more complex options that employers have previously shunned because of their complexity and cost.

BlackRock has shown that they want to move their technology footprint forward with acquisitions and investments in recent years. In 2015, they acquired a robo-advisor company, and they have also invested in Acorns, a company which helps millennials save their spare change by putting it into a savings account.

Last year, BlackRock acquired Investment, a company that gives them more sophisticated online investment tooling. It is also believed that additional partnerships will come along to help support any of these new investment options, the plans and the employees.

When it comes to how the world is changing, AI is thought to be one of the biggest conversations occurring in 2019. At the heart of AI is data—data quality and consistency. These important factors are something we focus on at Pragmatic Works, as well as knowing that this is what our clients need to rely on.

This press release shows where we’re going with some of the AI technology that’s a huge topic of conversations in organizations today.

3 Common Analytics Use Cases for Azure Databricks

Pragmatic Works is considered an expert in the Microsoft Data Platform, both on-premises and in Azure. That being said, we often get asked questions like: how can a certain technology benefit my company? One technology we're asked about a lot is Azure Databricks. It was released in preview in the Azure portal over a year ago, and we're starting to see massive adoption by many companies. But not everyone is ready to delve into data science and deep analytics, so they haven't had much exposure to what Databricks is and what it can do for their business.

There are some barriers preventing organizations from adopting data science and machine learning, even though these techniques can be applied to solve many common business challenges. Collaboration between the data scientists, data engineers and business analysts who are working with data (structured and unstructured) from a multitude of sources is one example of those barriers.

In addition, there’s a complexity involved when you try to do things with these massive volumes of data. Then add in some cultural aspects, having multiple teams and using consultants, and with all these factors, how do you get that one common theme and common platform where everybody can work and be on the same page? Azure Databricks is one answer.

Here’s an overview of 3 common use cases that we’re beginning to see and how they can benefit your organization:

1. Recommendation Engines – Recommendation engines are becoming an integral part of applications and software products as mobile apps and other advances in technology continue to change the way users choose and consume information. Most likely, when you're shopping on any major retail site, it will recommend related products based on the products you've selected or are looking at.
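
As a flavor of what this looks like in a Databricks notebook, here is a minimal sketch of a collaborative-filtering recommender built with Spark MLlib's ALS algorithm. The table and column names are placeholders, and it assumes the `spark` session that Databricks provides.

```python
# Hedged sketch: product recommendations with ALS in a Databricks notebook.
# Assumes a ratings table with userId, productId and rating columns
# (placeholders) and the built-in `spark` session.
from pyspark.ml.recommendation import ALS
from pyspark.ml.evaluation import RegressionEvaluator

ratings = spark.table("retail.product_ratings")          # hypothetical table
train, test = ratings.randomSplit([0.8, 0.2], seed=42)

als = ALS(userCol="userId", itemCol="productId", ratingCol="rating",
          coldStartStrategy="drop")      # drop predictions for unseen users/items
model = als.fit(train)

# Evaluate with RMSE, then produce top-5 product recommendations per user
rmse = RegressionEvaluator(metricName="rmse", labelCol="rating",
                           predictionCol="prediction").evaluate(model.transform(test))
print(f"Test RMSE: {rmse:.3f}")

model.recommendForAllUsers(5).show(truncate=False)
```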

2. Churn Analysis – Commonly known as customer attrition; basically, it's when we lose customers. Using Databricks, there are ways to find the warning signs behind that. Think about it: if you can correlate the data that leads to a customer leaving your company, then you have a better chance of saving that customer.

And we all know that keeping a customer and giving them the service they need or the product they want is significantly less costly than having to acquire new customers.
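
In practice, churn analysis on Databricks often starts with a simple classifier over historical customer features. Here is a hedged sketch using Spark ML's logistic regression; the tables, feature columns and label are illustrative placeholders.

```python
# Hedged sketch: flag likely churners with a logistic regression pipeline.
# Table, column and label names are placeholders; assumes the Databricks
# `spark` session.
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

history = spark.table("crm.customer_history")      # labeled historical data
features = ["tenure_months", "support_tickets", "monthly_spend", "days_since_last_login"]

assembler = VectorAssembler(inputCols=features, outputCol="features")
lr = LogisticRegression(featuresCol="features", labelCol="churned")

model = Pipeline(stages=[assembler, lr]).fit(history)

# Score current customers and surface the accounts predicted to churn
scored = model.transform(spark.table("crm.current_customers"))
scored.filter("prediction = 1") \
      .select("customer_id", "probability") \
      .show(20, truncate=False)
```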

3. Intrusion Detection – This is needed to monitor networks, systems and activities for malicious activity or policy violations, and to produce electronic reports to some kind of dashboard or management station, wherever that is captured.

With the combination of streaming and batch technologies tightly integrated with Databricks and the Azure Data Platform, we get access to more real-time and static data correlations that help us make faster decisions and try to avoid some of these intrusions.

Once we get a trigger that there is a problem, we can shut it off very quickly, or use automation options to do that as well.
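
On the streaming side, a Structured Streaming job is a common way to watch for suspicious patterns as events arrive. The sketch below uses placeholder connection details, schema and thresholds (with Event Hubs/Kafka authentication options omitted); it counts failed logins per source address in short windows and flags bursts.

```python
# Hedged sketch: streaming intrusion detection on Databricks with Structured
# Streaming. The event source, schema, threshold and sink are placeholders,
# and Event Hubs/Kafka authentication options are omitted for brevity.
from pyspark.sql import functions as F

events = (spark.readStream
          .format("kafka")                  # Event Hubs exposes a Kafka-compatible endpoint
          .option("kafka.bootstrap.servers", "myhub.servicebus.windows.net:9093")
          .option("subscribe", "security-events")
          .load())

parsed = (events
          .select(F.from_json(F.col("value").cast("string"),
                              "source_ip STRING, action STRING, ts TIMESTAMP").alias("e"))
          .select("e.*"))

# Count failed logins per source IP in 5-minute windows; alert above a threshold
alerts = (parsed.filter("action = 'login_failed'")
          .withWatermark("ts", "10 minutes")
          .groupBy(F.window("ts", "5 minutes"), "source_ip")
          .count()
          .filter(F.col("count") > 50))

# For the sketch, stream alerts to the console; in practice you would write to
# a table or dashboard sink and trigger an automated response from there.
query = (alerts.writeStream
         .outputMode("update")
         .format("console")
         .option("checkpointLocation", "/tmp/checkpoints/intrusion")
         .start())
```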

Today I wanted to highlight some of the ways that you can utilize Databricks to help your organization. If you have questions or would like to break down some of these barriers to adopting machine learning and data science for your business, we can help.

We use all the Azure technologies and talk about them with our customers all the time, as well as deploying real-world workload scenarios.