Category Archives: Infrastructure

Microsoft Reimagines Traditional SIEMs with Azure Sentinel

If you’re like most, security is at the forefront of your mind for your organization. You need the right tools and the right team to keep up with an increasing number of sophisticated threats while security teams are inundated with requests and alerts.

Today I’d like to tell you about Microsoft’s reimagined SIEM tool, Azure Sentinel. Over the past 10 to 15 years, Security Information and Event Management (SIEM) has become extremely popular as an aggregation solution for security events that happen across our networks.

There are also software tools, hardware appliances and managed service providers that can help your organization better understand its level of risk, both in real time and over time. They handle log aggregation, event correlation and forensic analysis, and offer features for alerting, dashboards and compliance checks.

These are great resources to help secure our environments, users and devices. But unfortunately, the reality is that security teams are being inundated with requests and alerts. Combine this with the noteworthy shortage of security professionals worldwide – an estimated 3.5 million unfilled security jobs by 2021 – and you have a major concern.

Microsoft decided to take a different approach with Azure Sentinel. Azure Sentinel provides intelligent security analytics at cloud scale for your entire enterprise. It makes it easy to collect data across your entire hybrid organization on any cloud, from devices to users to applications to servers. Azure Sentinel uses the power of AI to ensure you’re quickly identifying real threats.

With this tool:

  • You’ll eliminate the burden of traditional SIEMs, since there’s no longer any infrastructure to set up, maintain or scale to support them.
  • Since it’s built on Azure, it offers virtually limitless cloud scale while addressing all your security needs.

Now let’s talk cost. Traditional SIEMs have proven to be expensive to own and operate, often requiring you to commit upfront and incur high costs for infrastructure maintenance and data ingestion. With Sentinel, you pay for what you use, with no up-front costs. Even better, because of Microsoft’s relationships with so many enterprise vendors (and more partners being added), it easily connects to popular solutions, including offerings from Palo Alto Networks, F5 Networks, Symantec and Check Point.

Azure Sentinel integrates with the Microsoft Graph Security API, enabling you to import your own threat intelligence feeds and to customize threat detection and alert rules. Custom dashboards give you a view optimized for your specific use case.
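To give a sense of what working with that API looks like, here’s a minimal, hedged Python sketch that pulls recent high-severity alerts from the Microsoft Graph Security API. It assumes an Azure AD app registration granted the SecurityEvents.Read.All application permission; the tenant, client ID and secret values are placeholders.

```python
# Hedged sketch: read alerts from the Microsoft Graph Security API.
# Assumes an app registration granted SecurityEvents.Read.All (application permission).
import msal
import requests

TENANT_ID = "<your-tenant-id>"        # placeholder
CLIENT_ID = "<your-app-client-id>"    # placeholder
CLIENT_SECRET = "<your-app-secret>"   # placeholder

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)

# Acquire an app-only token for Microsoft Graph
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

# Pull the ten most recent high-severity alerts
resp = requests.get(
    "https://graph.microsoft.com/v1.0/security/alerts",
    headers={"Authorization": f"Bearer {token['access_token']}"},
    params={"$top": 10, "$filter": "severity eq 'high'"},
)
resp.raise_for_status()

for alert in resp.json().get("value", []):
    print(alert.get("title"), "-", alert.get("severity"))
```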

Lastly, if you’d like to try this out for free, Microsoft is allowing you to connect your Office 365 tenant to do some testing and check it out in greater detail. This product is currently in preview, so there may be some kinks, but I’m looking forward to seeing how it develops into a true enterprise-class security solution for your environment – whether that’s in the cloud, on premises, in data centers, or with remote users and devices.

Microsoft Announces Windows Virtual Desktop in Azure

Today I’m here with some exciting news out of Microsoft: the public preview of Windows Virtual Desktop. Virtual desktops are not a new invention, and they are currently offered by multiple vendors.

Windows Virtual Desktop comprises the Windows desktops themselves, the applications you publish to users, and a management solution for both, all hosted in Azure. During the public preview, desktops and apps can be deployed on virtual machines in any Azure region in the US, with the management solution and data for these virtual machines residing in the US as well.

As the service moves closer to general availability, Microsoft will start to scale out the management solution and data localization to all Azure regions. Virtual desktops can be deployed with Windows 7 (with extended support) or Windows 10 on the workstation side, and with Windows Server 2012 through 2019 on the server side.

It provides a full desktop virtualization environment inside your Azure subscription without having to run any additional gateway servers, as you would if you were deploying on other vendors’ platforms or on premises.

With Windows Virtual Desktop you can build custom images or pick from the canned images provided in the Azure gallery. Images can be personalized and remain static with persistent desktops. There are also many configuration options – for instance, publishing a single remote application in a client/server-style deployment, or deploying pooled multi-session resources.

Another key point is that you’re deploying this with significantly reduced overhead, as you no longer need to manage the Remote Desktop roles like you would with Remote Desktop Services on premises or with some of the other providers. You only have to manage the virtual machines inside your Azure subscription.

Historically, there have been many great use cases for virtual desktops in fields like education and healthcare, along with many others.

Accelerate Your AI with Machine Learning on Azure Data Box Edge

In some past blogs I’ve discussed Azure Data Box and how the Data Box family has expanded. Today I’ll talk about Azure Data Box Edge (in preview) and elaborate on the machine learning capability it provides on premises with the power of Azure behind it.

If you don’t know, Azure Data Box Edge is a physical hardware device that sits in your environment and collects data from sources such as IoT devices, letting you take advantage of the AI features offered by the device. It then sends the data to Azure for further processing, storage or reporting.

Microsoft recently announced Azure Machine Learning hardware accelerated models, powered by Project Brainwave, on the Data Box Edge. Because so much real-world data is created and used at the edge of our networks – like images and video collected from factories, retail stores or hospitals – it can now be used for things such as manufacturing defect analysis, out-of-stock detection or diagnostics.

Applying machine learning models to the data on Data Box Edge provides lower latency (and savings on bandwidth costs), since we don’t have to send all the data to Azure for analysis, while still offering real-time insight and speed to action for critical business decisions.

You can enable data scientists to simplify and accelerate the building, training and deployment of machine learning models using the Azure Machine Learning service, which is already generally available. They can access all these capabilities in their favorite Python environment, using the latest open source frameworks such as PyTorch, TensorFlow and scikit-learn.
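As a small illustration of that workflow, here’s a hedged sketch using the Azure Machine Learning Python SDK (azureml-core) to register a trained TensorFlow model so it can later be packaged into a container and pushed toward an edge target. The workspace config, model path and names are placeholders, not the exact steps Microsoft prescribes for Data Box Edge.

```python
# Hedged sketch: register a trained model with the Azure Machine Learning service
# using the azureml-core SDK. Paths and names below are placeholders.
from azureml.core import Workspace
from azureml.core.model import Model

# Loads the workspace from a local config.json downloaded from the Azure portal
ws = Workspace.from_config()

# Register the saved TensorFlow model; once registered it can be built into a
# container image and deployed to an edge target (such as Data Box Edge via IoT Hub)
model = Model.register(
    workspace=ws,
    model_path="outputs/image_classifier",   # placeholder: folder with the saved model
    model_name="defect-classifier",          # placeholder name
    tags={"framework": "tensorflow", "scenario": "quality-control"},
)

print(f"Registered {model.name}, version {model.version}")
```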

These models can run on CPUs and GPUs, but this preview expands that to field-programmable gate arrays (FPGAs), which is the processor type on the Data Box Edge.

The preview is currently a bit limited, but you’re able to use the Azure Machine Learning service to train a TensorFlow model for image classification scenarios, containerize that model in a Docker container and then deploy it to the Data Box Edge via IoT Hub.

A good use case for this is if you’re using AI models for quality control purposes. Let’s say you know what a finished product should look like and what the quality specs are, and you build a model defining those parameters. Then you take an image of that product as it comes off the assembly line; now you can send those images to the Data Box Edge in your environment and more quickly capture defects.

Now you’re finding the root cause of defects more quickly, throwing away fewer defective products and, therefore, saving money. I’m looking forward to seeing how enterprises are going to leverage this awesome technology.

What is Azure Active Directory B2C?

How important is secure identity management to you? If you’re like most, it’s a top priority. In today’s post I’ll talk about Azure Active Directory B2C, an identity management service that enables you to customize and control how users securely interact with your web, desktop, mobile or even single-page applications.

Using Azure AD B2C, users can sign up, sign in, reset passwords and edit profiles for the various applications they’re using.

When implementing these policies, you have two choices:

  • Using common identity user flows within the Azure portal or,
  • For the more skilled developer, or if the templates in the portal don’t support your use case, you can use XML-based custom policies.

Once you make that decision, your choice will define the path of authentication, commonly referred to as the user journey. User journeys allow you to control behaviors by configuring settings such as the social accounts (Facebook, for example) a user can use to sign up for the application.

Data collected from the user, such as a first name or postal code, can also be used during authentication. You also have multi-factor authentication options, as well as control over the look and feel of how users interact with pages and the information returned to the application.

Azure Active Directory B2C supports the OpenID Connect and OAuth 2.0 protocols for these user journeys. These protocols ultimately result in the application receiving a token that allows the user to be authenticated (a hedged sketch of this from the application’s side follows the steps below). The interaction of every application follows a similar high-level pattern shown in the graphic below:

AAD B2C Flow

The steps here are:

1. The application directs the user to run a policy.

2. The user completes the policy according to the policy definition.

3. Then the application receives a token.

4. And then uses that token to try to access a resource.

5. The resource server then validates the token to verify that access can be granted.

6. And the application periodically refreshes the token in the background (there are really five steps, but this sixth step happens over and over).
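To make steps 1 through 3 a little more concrete, here’s a hedged Python sketch using the MSAL library to run a B2C user flow and receive a token. The tenant name, policy name, client ID and scope are all placeholders, and details can vary by MSAL version.

```python
# Hedged sketch: acquire a token from an Azure AD B2C user flow with MSAL for Python.
# Tenant, policy, client ID and scope values are placeholders.
import msal

TENANT = "contosob2c"                       # placeholder B2C tenant name
POLICY = "B2C_1_signupsignin"               # placeholder user flow (policy) name
CLIENT_ID = "<your-app-client-id>"          # placeholder app registration
SCOPES = ["https://contosob2c.onmicrosoft.com/api/demo.read"]  # placeholder API scope

# The B2C authority embeds the policy, so the app "runs" that policy (step 1)
authority = f"https://{TENANT}.b2clogin.com/{TENANT}.onmicrosoft.com/{POLICY}"
app = msal.PublicClientApplication(CLIENT_ID, authority=authority)

# Opens a browser so the user completes the policy (step 2); MSAL returns the token (step 3)
result = app.acquire_token_interactive(scopes=SCOPES)

if "access_token" in result:
    print("Token acquired; expires in", result["expires_in"], "seconds")
else:
    print("Sign-in failed:", result.get("error_description"))
```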

Azure AD B2C can also work with external identity providers such as Amazon, Facebook and Google, which create, maintain and manage identity information while providing authentication services to their (and other) applications.

Typically, you would only use one identity provider in your application but there are no restrictions for using more if your use case calls for it.

The main value of this service is the ability to lessen the need for username and password management across so many applications, thus improving the user experience. Our lives have been made a bit easier now that many applications, both web and desktop based, allow a single sign-on or no sign-on experience because they are already pre-authenticated with a service like this.

What is Azure Active Directory Seamless Single Sign On?

We’re all dealing with many usernames and passwords in our everyday life, right? Today I’d like to talk about an authentication feature within Azure Active Directory that can help you with easier, faster access.

Azure Active Directory Seamless Single Sign-on (Azure AD Seamless SSO) automatically signs users in when they are on their corporate devices connected to their corporate network. When this is enabled, users don’t have to type their passwords, or even their username, to sign in to Azure Active Directory.

This feature provides users with easy access to cloud-based applications without needing any additional on premises components.

First let’s discuss how this is set up:

  • SSO is enabled using Azure AD Connect. The following steps occur while enabling this feature:
    • A computer account representing Azure AD is created in your on premises Active Directory in each AD forest.
    • The computer account’s Kerberos decryption key is shared securely with Azure AD, and then two Kerberos service principal names (SPNs) are created to represent the two URLs that are used during Azure AD sign-on.

Authenticating in Browser

  • When doing authentication from a web browser for a web app, essentially a user navigates to a website and signs into Azure AD (see below).
AADSSSO-Image 1
  • Azure AD sends a Kerberos request to on premises AD, which looks for a computer account related to the device you’re signing in on as well as your user account. If authorized, you get access.

Authenticating with Native Application

  • For a native client, like Outlook for instance, the process is a bit different (see below).
AADSSSO-Image 2
  • Here, the request is made from the device you’re using, which obtains a Kerberos ticket from on premises AD when authentication is successful.
  • That ticket is presented to Azure AD and, once approved, a SAML token is sent to the app. The app then sends it back to Azure AD for OAuth 2.0 authentication.
  • Once all that checks out, access is granted.

Now let’s talk about the benefits.

  • First, it’s a much better user experience. Users are automatically signed in to both on premises and cloud-based applications using their existing corporate credentials, so there’s no need for users to repeatedly reenter their passwords.
  • It’s also easy to deploy and administer. There are no additional components needed on premises beyond Azure AD Connect, which synchronizes your on premises AD to Azure Active Directory. Plus, it works with either method of cloud authentication – password hash synchronization or pass-through authentication.
  • Additionally, it can be rolled out to only some or all of your users by using group policy.

So, this is a great way to let users authenticate to multiple websites and applications with a single set of credentials. It will minimize the amount of administration required to set up those users once it’s in place, and it should reduce the number of password resets for your help desk team or whoever oversees that.

How and Why to Add a Source Code Repository to Azure Data Factory

For developers, it’s very beneficial to have a source code repository. A source code repository helps you track all your changes, manage tasks and branches, share code with a team and, simply put, keep it in a safe place.

In this post, I’ll tell you why you should connect your Azure Data Factories to a source code repository, and I’ll demo how to do so:

  • To do this, I’ll start in my data factory inside my Azure portal.
  • When I go into Author & Monitor, I have the ability to either:
    • Set up a code repository within the landing page or main page here or,
    • I can go directly into my data factory, open the drop-down in the left-hand corner and select ‘Set up code repository’.
  • One thing to note: the code repository itself has been supported for a while, but recently, with the release of Data Flows, GitHub is now supported as well (previously it was only Azure DevOps). See the sketch after this list for how the same repo configuration can be applied programmatically.
  • So, I select my GitHub account and fill in the information. (If you’re doing this for the first time, it will prompt you to log into your GitHub account. In this case, I’ve already connected it, so it knows about my repositories.)
  • Next, I select my repository name and my branch; I’m going to use my existing playground branch.
  • One field will ask me ‘Branch to import resources into:’ so if I’m importing resources, I can select an existing one or create a new one. For this demo I’m going to pick my playground.
  • Before I hit Save, notice on the left-hand side I’ve got zero pipelines, one data set and zero data flows. But when I connect to my playground branch, it brings in everything I’ve previously saved or checked in to that branch. So, now you’ll see I have 6 pipelines, 23 data sets and 4 data flows.
  • One of the other nice pieces of adding source control shows up when I want to add a new data set. I just select the SQL Server I was previously connected to, leave it on the defaults for now and connect.
  • I then select one of the tables; I selected the Product Category Table.
  • You’ll see at the top you have the option to Save All or Publish. Save All saves the changes across all tabs; you can tell when there’s been a change – whether to a data set, pipeline or data flow – by the star next to its name.
  • Now, instead of needing to publish every time you’re doing development, you can just save your work. Rather than having to publish the entire pipeline, run the error checking and make sure the pipeline is in good standing, you can save part way through. This is a huge advantage over having to publish the whole pipeline, which isn’t efficient for development.
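For completeness, here’s the hedged sketch I mentioned above: wiring the same GitHub configuration up programmatically with the azure-mgmt-datafactory SDK. All names are placeholders, and exact model and parameter names can differ between SDK versions, so treat it as a sketch rather than a recipe.

```python
# Hedged sketch: attach a GitHub repository to a Data Factory with the Azure SDK
# for Python (azure-mgmt-datafactory). All names below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import FactoryGitHubConfiguration, FactoryRepoUpdate

subscription_id = "<subscription-id>"     # placeholder
resource_group = "<resource-group>"       # placeholder
factory_name = "<data-factory-name>"      # placeholder
location = "eastus"                       # placeholder: the factory's region

client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

repo_config = FactoryGitHubConfiguration(
    account_name="<github-account>",      # placeholder GitHub account
    repository_name="<repo-name>",        # placeholder repository
    collaboration_branch="playground",    # branch to import resources into
    root_folder="/",                      # folder where the ADF JSON definitions live
)

factory_id = (
    f"/subscriptions/{subscription_id}/resourceGroups/{resource_group}"
    f"/providers/Microsoft.DataFactory/factories/{factory_name}"
)

# The repo configuration is applied against the factory's region
client.factories.configure_factory_repo(
    location,
    FactoryRepoUpdate(factory_resource_id=factory_id, repo_configuration=repo_config),
)
```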

The addition of GitHub support gives us another great option if you didn’t previously use Azure DevOps (formerly known as Visual Studio Team Services). I’ll be doing more upcoming Azure Every Day blogs around Data Factory that will be beneficial to you, covering some of the nuances, as the product has greatly evolved since the release of Data Flows.

Expanding the Azure Data Box Family

In a previous blog I introduced Azure Data Box. Today I’d like to talk about how Microsoft is expanding the Azure Data Box family by introducing you to the Azure Data Box Gateway and the Azure Data Box Edge devices.

Until now, the Data Box family has consisted of the Disk, the Data Box and the Heavy. Each has its own storage limits, but all are designed to improve the way you upload massive amounts of data into Azure without having to wait for it to travel across the wire or saturate your bandwidth (consider that the offline method).

Microsoft learned from customers that they want a better way to sync their local storage directly with Azure storage for operations like archival and disaster recovery. Here’s where Azure Data Box Gateway comes in.

The Data Box Gateway is a cloud storage gateway device that resides on premises and sends your image, media and other data directly to Azure.

  • The Gateway is a virtual machine provisioned in your hypervisor (VMware or Hyper-V). You write data directly to this virtual device using the NFS or SMB protocols, and it then sends that data to Azure (see the sketch after this list).
  • One use case for the Data Box Gateway is continuously ingesting massive amounts of data. When a local data source generates data at large volumes, we can stream it and sync it directly with our Azure storage.
  • Another use case is cloud archival of data in a secure and efficient way. Think of it as handling the incremental data transfer over the network after the initial bulk transfer is done using the Data Box of your choice, tying directly into the same Azure storage container that you used for your Data Box.
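Since the Gateway simply exposes SMB or NFS shares, writing to it from a script is just a file copy to the mounted share. Here’s a minimal Python sketch; the share and source paths are placeholders for wherever your Gateway share is mounted.

```python
# Minimal sketch: copy local files onto a Data Box Gateway SMB share.
# The Gateway then syncs them to the Azure storage account it is tied to.
import shutil
from pathlib import Path

GATEWAY_SHARE = Path(r"\\databox-gateway\archiveshare")  # placeholder UNC path to the share
LOCAL_SOURCE = Path(r"D:\exports")                       # placeholder local folder to archive

for file in LOCAL_SOURCE.rglob("*"):
    if file.is_file():
        destination = GATEWAY_SHARE / file.relative_to(LOCAL_SOURCE)
        destination.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(file, destination)
        print(f"Copied {file} -> {destination}")
```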

Azure Data Box Edge is a storage solution that allows you to process data and send it over the network to Azure.

  • Data Box Edge uses a physical device supplied by Microsoft to accelerate the secure data transfer.
  • The device resides on premises in your network stack and you write data to it (also using NFS or SMB.)
  • It is additionally equipped with AI enabled Edge computing capabilities which help to analyze, process or filter data as it moves to Azure block blob, page blob or Azure files.
  • It has the appropriate chips to process intelligent learning (artificial intelligence, machine learning, deep learning and such).
  • Use cases include things like pre-processing data. We can analyze data from on premises or IoT devices to get faster information about it. That pre-processing allows us to do things like aggregating data before it gets sent to Azure, or modifying data, such as removing PII.
  • You can also subset and transfer the data needed for deeper analytics in the cloud.
  • Additionally, you can analyze and react to IoT events. If you’re running IoT devices on premises and want to respond more quickly when those events occur, this is a great way to handle that.
  • Another great use case is you can run machine learning models to get quick results that can be acted on before the data is sent to the cloud.
  • With these IoT use cases, you don’t have to wait for the data to be transmitted over the wire, have the munging happen up in Azure and then wait for results. You can return those results on the fly in real time and react more quickly.
  • Eventually the full data set is transferred to help you retain and improve your machine learning models. You can continually feed them data and have those models retrained repeatedly, becoming more precise over time.

The Data Box family is a very cool technology, and these online members further extend its capabilities.

What is Azure Automation?

So, what do you know about Azure Automation? In this post, I’ll fill you in on this cool, cloud-based automation service that gives you the ability to configure process automation, update management and system configuration, managed across your on-premises resources as well as your Azure cloud-based resources.

Azure Automation provides complete control over the deployment, operation and decommissioning of workloads and resources in your hybrid environment, so you can have a single pane of glass for managing all your resources through automation.

Some features I’d like to point out are:

  • It allows you to automate those mundane, error-prone activities that you perform as part of your system configuration and maintenance.
  • You can create runbooks in PowerShell or Python that help reduce the chance of misconfiguration errors. They will also help lower the operational cost of maintaining those systems, as you can script tasks to run when needed instead of performing them manually.
  • Runbooks can be developed for on-premises or Azure resources, and they use webhooks that allow you to trigger automation from things such as ITSM, DevOps and monitoring systems. So, you can run them remotely and trigger them from wherever you need to (see the sketch after this list).
  • On the configuration management side, you can build desired state configurations for your enterprise environment. This will help you set a baseline for how your systems should operate and will identify when there’s a variance from the initial system configuration, alerting you to any anomalies that could be problematic.
  • It has a rich reporting back end and alerting interface for full visibility into what’s happening in your Windows and Linux systems – on-premises and in Azure.
  • It gives you update management (for Windows and Linux) to help you define how updates are applied. It helps administrators specify which updates will be deployed, see which deployments succeeded or failed, and specify which updates should not be deployed to systems – all of which can be driven through PowerShell or Python scripts.
  • It has shared capabilities, so when you’re using multiple resources or building runbooks for automation, you can share resources to simplify management. You can build multiple scripts but reuse the same resources over and over – things like role-based access control, variables, credentials, certificates, connections, schedules, source control access and PowerShell modules. You can check these in and out of source control like any other code-based project.
  • Lastly – and one of the coolest features in my opinion – because these are templates you’re deploying in your systems and everyone faces similar challenges, there’s a community gallery where you can download templates others have created or upload ones you’ve created to share. With a few basic configuration tweaks and a review to make sure they’re secure, this is a great way to speed things up by finding an existing script, cleaning it up and deploying it in your environment.
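Here’s the sketch referenced in the webhook point above: a short Python snippet that starts a runbook remotely by POSTing to the webhook URL Azure Automation generates. The URL is a placeholder – the real one is shown only once, when the webhook is created – and the payload keys are whatever your runbook expects.

```python
# Hedged sketch: trigger an Azure Automation runbook through its webhook.
# The webhook URL is a placeholder; store the real one securely (it is only shown once).
import requests

WEBHOOK_URL = "https://<region>.webhook.azure-automation.net/webhooks?token=<token>"  # placeholder

# Optional input for the runbook; PowerShell runbooks receive it via $WebhookData
payload = {"ServerName": "web01", "Action": "restart-service"}  # placeholder parameters

response = requests.post(WEBHOOK_URL, json=payload)
response.raise_for_status()

# Azure Automation responds with the ID(s) of the queued job(s)
print("Queued job IDs:", response.json().get("JobIds"))
```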

So, there’s a lot you can do with this service and I think it’s worth checking out as it can make your maintenance and management much simpler.

What is Azure Firewall?

I’d like to discuss the recently announced Azure Firewall service, which has just been released to general availability (GA). Azure Firewall is a managed, cloud-based network security service that protects your Azure Virtual Network resources. It is a fully stateful PaaS firewall with built-in high availability and unrestricted cloud scalability.

Because it lives in the cloud and within the Azure ecosystem, much of that capability is built in. With Azure Firewall you can centrally create, enforce and log application and network connectivity policies across subscriptions and virtual networks, giving you a lot of flexibility.

It is also fully integrated with Azure Monitor for logging and analytics. That’s big, because a lot of firewalls are not, which means you can’t centralize their logs in something like a Log Analytics (OMS) workspace – a single pane of glass for monitoring many of the technologies being used in Azure.

Some of the features within:

  • Built-in high availability, so there are no additional load balancers that need to be built and nothing to configure.
  • Unrestricted cloud scalability. It can scale up as much as you need to accommodate changing network traffic flows – no need to budget for your peak traffic, it will accommodate any peaks or valleys automatically.
  • It has application FQDN filtering rules. You can limit outbound HTTP/S traffic to a specified list of fully qualified domain names, including wildcards, and the feature does not require SSL termination (see the sketch after this list).
  • There are network traffic filtering rules, so you can centrally create allow or deny rules by source and destination IP address, port and protocol. Those rules are enforced and logged across multiple subscriptions and virtual networks. This is another great example of having the availability and elasticity to manage many components at one time.
  • It has fully qualified domain name (FQDN) tags. If you’re running Windows Update across multiple servers, for example, you can allow that service through by its tag, and that then becomes the standard for all the servers behind that firewall.
  • Outbound SNAT and inbound DNAT support: traffic originating from your virtual network to remote Internet destinations is identified and allowed (Source Network Address Translation), while inbound traffic to your firewall’s public IP address is translated (Destination Network Address Translation) and filtered to the private IP addresses on your virtual networks.
  • The Azure Monitor integration I mentioned: all events are integrated with Azure Monitor, allowing you to archive logs to a storage account, stream events to your Event Hub, or send them to Log Analytics.
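As a hedged sketch of what an application (FQDN) rule might look like when managed from code, here’s one way to add an allow rule for Windows Update traffic to an existing firewall using the azure-mgmt-network SDK. The resource names are placeholders and the model names may differ slightly between SDK versions, so check the SDK reference before relying on it.

```python
# Hedged sketch: add an application (FQDN) rule collection to an existing Azure
# Firewall with azure-mgmt-network. Names are placeholders; verify model names
# against your SDK version.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient
from azure.mgmt.network.models import (
    AzureFirewallApplicationRule,
    AzureFirewallApplicationRuleCollection,
    AzureFirewallApplicationRuleProtocol,
    AzureFirewallRCAction,
)

subscription_id = "<subscription-id>"   # placeholder
resource_group = "<resource-group>"     # placeholder
firewall_name = "<firewall-name>"       # placeholder

client = NetworkManagementClient(DefaultAzureCredential(), subscription_id)

# Fetch the existing firewall and append a rule collection that allows
# outbound HTTPS to *.update.microsoft.com from the internal address space.
firewall = client.azure_firewalls.get(resource_group, firewall_name)

rule = AzureFirewallApplicationRule(
    name="allow-windows-update",
    source_addresses=["10.0.0.0/16"],   # placeholder source range
    protocols=[AzureFirewallApplicationRuleProtocol(protocol_type="Https", port=443)],
    target_fqdns=["*.update.microsoft.com"],
)

firewall.application_rule_collections = (firewall.application_rule_collections or []) + [
    AzureFirewallApplicationRuleCollection(
        name="app-rules",
        priority=200,
        action=AzureFirewallRCAction(type="Allow"),
        rules=[rule],
    )
]

client.azure_firewalls.begin_create_or_update(resource_group, firewall_name, firewall).result()
```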

Another nice thing to note: when you set up ExpressRoute or a VPN from your on premises environment to Azure, you can use this as the single firewall for all those virtual networks, allow traffic in and out from there, and monitor it all from that single place.

This was just released to GA, so there are a few hiccups, but if none of the service challenges affect you, I suggest you give it a try. It will only continue to improve, as all Azure services do. I think it’s going to be a great firewall service option for many.

What is Azure Data Box and Data Box Disk?

Are you looking to move large amounts of data into Azure? How does doing it for free sound and with an easier process? Today I’m here to tell you how to do just that with the Azure Data Box.

Picture this: you have a ton of data, let’s say 50 terabytes on premises, and you need to get it into Azure because you’re going to start doing incremental backups of a SQL database, for instance. You have two options to get this done.

The first option is to move that data manually, which means you have to chunk it up, transfer it using AzCopy or a similar Azure data tool, put it in blob storage, then extract it and continue the process. Sounds pretty painful, right?
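To show why that manual route gets painful at scale, here’s a minimal, hedged Python sketch of uploading a single backup file to blob storage with the azure-storage-blob SDK (account, container and file names are placeholders). It works fine for a few files; repeating it across 50 terabytes is exactly the pain Data Box removes.

```python
# Minimal sketch: manually upload one file to Azure Blob Storage with azure-storage-blob.
# Account URL, container name and file path are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://<storageaccount>.blob.core.windows.net",  # placeholder account
    credential=DefaultAzureCredential(),
)

container = service.get_container_client("sql-backups")            # placeholder container

# Stream a local backup into the container; at 50 TB you'd repeat this thousands of times
with open(r"D:\backups\sales_db.bak", "rb") as data:                # placeholder path
    container.upload_blob(name="sales_db.bak", data=data, overwrite=True)

print("Upload complete")
```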

Your second option is to use Azure Data Box which allows you to move large chunks of data up into Azure. Here’s how simple it is:

  • You order the Data Box through Azure (currently available in the US and EU)
  • Once received, you connect it to your environment however you plan to move that data
  • It uses standard protocols like SMB and CIFS
  • You copy the data you want to move and return the Data Box, and Microsoft will upload the data into your storage container(s)
  • Once the data is uploaded, they will securely erase that Data Box

With the Data Box you get:

  • 256-bit encryption
  • A super tough, hardened box that can withstand drops or water, etc.
  • It can be pushed into Azure Blob
  • You can copy data to up to 10 storage accounts
  • There are two 1 gigabit/second and two 10 gigabit/second connections to allow quick movement of data off your network onto the box

In addition, Microsoft has recently announced the Data Box Disk, a small 8 terabyte disk that you can order in packs of up to five per order.

With Data Box Disk you get:

  • 35 terabytes of usable capacity per order
  • Supports Azure Blobs
  • A USB SATA 2 and 3 interface
  • Uses 128-bit encryption
  • Like Data Box, it’s a simple process: connect it, unlock it, copy the data onto the disk and send it back, and the data is copied into a single storage account for you

Here comes the best part—while Azure Data Box and Data Box Disk are in Preview, this is a free service. Yes, you heard it right, Microsoft will send you the Data Box or Data Box Disk for free and you can move your data up into Azure for no cost.

Sure, it will cost you money when you buy your storage account and start storing large amounts of data, but storage is cheap in Azure, so that won’t break the bank.