Category Archives: Security

What is Azure Firewall?

I’d like to discuss the recently announced Azure Firewall service, which has just been released to general availability (GA). Azure Firewall is a managed, cloud-based network security service that protects your Azure Virtual Network resources. It is a fully stateful PaaS firewall with built-in high availability and unrestricted cloud scalability.

Because it’s native to the cloud and the Azure ecosystem, much of this capability is built in. With Azure Firewall you can centrally create, enforce and log application and network connectivity policies across subscriptions and virtual networks, giving you a lot of flexibility.

It is also fully integrated with Azure Monitor for log analytics. That’s big, because many firewalls are not integrated with Log Analytics, which means you can’t centralize their logs in OMS, for instance – a single pane of glass for monitoring many of the technologies being used in Azure.
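For example, once the firewall’s diagnostic logs are flowing into a Log Analytics workspace, you can query them programmatically. Here’s a minimal sketch using Python’s azure-monitor-query package – the workspace ID is a placeholder, and the table and category names assume Azure Firewall’s default diagnostic settings:

```python
# Minimal sketch: query Azure Firewall logs in a Log Analytics workspace.
# Requires: pip install azure-identity azure-monitor-query
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# Azure Firewall diagnostics land in the AzureDiagnostics table; this
# category holds the application (FQDN) rule hits.
query = """
AzureDiagnostics
| where Category == "AzureFirewallApplicationRule"
| take 20
"""

response = client.query_workspace(
    workspace_id="<your-workspace-id>",  # placeholder
    query=query,
    timespan=timedelta(days=1),
)

for table in response.tables:
    for row in table.rows:
        print(row)
```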

Some of the key features:

  • Built-in high availability, so there are no additional load balancers to build and nothing to configure.
  • Unrestricted cloud scalability. It can scale up as much as you need to accommodate changing network traffic flows – no need to budget for your peak traffic, as it accommodates any peaks or valleys automatically.
  • Application FQDN filtering rules. You can limit outbound HTTP/S traffic to a specified list of fully qualified domain names, including wildcards, and the feature does not require SSL termination (a minimal sketch follows this list).
  • Network traffic filtering rules, so you can create allow or deny rules by source and destination IP address, port and protocol. Those rules are enforced and logged across multiple subscriptions and virtual networks, letting you manage many components at one time.
  • FQDN tagging. If you’re running Windows Update across multiple servers, for example, you can allow that service by its tag, and it then becomes the standard for all your services behind that firewall.
  • Outbound SNAT and inbound DNAT support. Traffic originating from your virtual network to remote Internet destinations is translated to the firewall’s public IP address so it can be identified and allowed, while inbound traffic to that public IP address is translated (Destination Network Address Translation) and filtered to the private IP addresses on your virtual networks.
  • The Azure Monitor integration I mentioned, in which all events can be archived to a storage account, streamed to your Event Hub, or sent to Log Analytics.
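To make the application rule idea concrete, here’s a hedged sketch of adding an FQDN-based rule with Python’s azure-mgmt-network SDK. Resource names are placeholders, and exact model names and import paths can vary by SDK version:

```python
# Hedged sketch: append an FQDN application rule collection to a firewall.
# Requires: pip install azure-identity azure-mgmt-network
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient
from azure.mgmt.network.models import (
    AzureFirewallApplicationRule,
    AzureFirewallApplicationRuleCollection,
    AzureFirewallApplicationRuleProtocol,
    AzureFirewallRCAction,
)

client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Allow outbound HTTPS to *.microsoft.com from one subnet.
rule = AzureFirewallApplicationRule(
    name="allow-microsoft",
    source_addresses=["10.0.1.0/24"],
    target_fqdns=["*.microsoft.com"],
    protocols=[AzureFirewallApplicationRuleProtocol(protocol_type="Https", port=443)],
)

collection = AzureFirewallApplicationRuleCollection(
    name="app-rules",
    priority=200,
    action=AzureFirewallRCAction(type="Allow"),
    rules=[rule],
)

# Fetch the existing firewall, append the collection, and push the update.
fw = client.azure_firewalls.get("<resource-group>", "<firewall-name>")
fw.application_rule_collections.append(collection)
client.azure_firewalls.begin_create_or_update(
    "<resource-group>", "<firewall-name>", fw
).result()  # wait for the deployment to finish
```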

Another nice thing to note: when you set up ExpressRoute or a VPN from your on-premises environment to Azure, you can use this as the single firewall for all those virtual networks, allowing traffic in and out from there and monitoring it all from that single place.
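The usual way to wire that up is a user-defined route on each spoke subnet that points default traffic at the firewall’s private IP. A minimal sketch, again with the azure-mgmt-network SDK and placeholder values:

```python
# Minimal sketch: route a spoke subnet's outbound traffic through the firewall.
# Requires: pip install azure-identity azure-mgmt-network
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient
from azure.mgmt.network.models import Route, RouteTable

client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

route_table = RouteTable(
    location="eastus",  # placeholder region
    routes=[
        Route(
            name="default-via-firewall",
            address_prefix="0.0.0.0/0",        # all outbound traffic
            next_hop_type="VirtualAppliance",  # the firewall acts as the appliance
            next_hop_ip_address="10.0.0.4",    # firewall's private IP (placeholder)
        )
    ],
)

client.route_tables.begin_create_or_update(
    "<resource-group>", "spoke-routes", route_table
).result()
# Afterwards, associate the route table with each spoke subnet.
```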

This was just released to GA, so there are a few hiccups, but if none of the service challenges affect you, I suggest you give it a try. As with all Azure services, it will only continue to mature and get better. I think it’s going to be a great firewall service option for many.

What is Azure Data Box and Data Box Disk?

Are you looking to move large amounts of data into Azure? How does doing it for free, with an easier process, sound? Today I’m here to tell you how to do just that with the Azure Data Box.

Picture this: you have a ton of data, let’s say 50 terabytes on-prem, and you need to get it into Azure because you’re going to start doing incremental backups of a SQL database, for instance. You have two options to get this done.

The first option is to move that data manually. That means chunking it, setting it up with AzCopy or a similar Azure data tool, putting it up in Blob storage, then extracting it and continuing with the process. Sounds pretty painful, right?
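To give you a flavor of that manual path, here’s a minimal sketch of a single large-file upload with Python’s azure-storage-blob package (connection string, container and file names are placeholders) – now imagine repeating this across 50 terabytes over your internet link:

```python
# Minimal sketch: upload one large file to Blob storage.
# Requires: pip install azure-storage-blob
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
blob = service.get_blob_client(container="backups", blob="sqlbackup-001.bak")

with open("sqlbackup-001.bak", "rb") as data:
    # The SDK splits large uploads into blocks; max_concurrency
    # parallelizes those block uploads.
    blob.upload_blob(data, overwrite=True, max_concurrency=8)
```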

Your second option is to use Azure Data Box, which allows you to move large chunks of data up into Azure. Here’s how simple it is:

  • You order the Data Box through Azure (currently available in the US and EU)
  • Once received, you connect it to your environment however you plan to move that data
  • It uses standard protocols like SMB and CIFS
  • You copy the data you want to move and return the Data Box to Microsoft, and they will upload the data into your storage container(s)
  • Once the data is uploaded, they will securely erase that Data Box

With the Data Box you get:

  • 256-bit encryption
  • A super tough, hardened box that can withstand drops or water, etc.
  • Data can be pushed into Azure Blob storage
  • You can copy data to up to 10 storage accounts
  • There are two 1 gigabit/second and two 10 gigabit/second connections to allow quick movement of data off your network onto the box

In addition, Microsoft has recently announced the Data Box Disk, a small 8 terabyte disk that you can order up to five of per order.

With Data Box Disk you get:

  • 35 terabytes of usable capacity per order
  • Supports Azure Blobs
  • A USB/SATA II and III interface
  • Uses 128-bit encryption
  • Like Data Box, it’s a simple process: connect it, unlock it, copy the data onto the disk and send it back, and they copy the data into a single storage account for you

Here comes the best part – while Azure Data Box and Data Box Disk are in preview, this is a free service. Yes, you heard it right: Microsoft will send you the Data Box or Data Box Disk for free and you can move your data up into Azure at no cost.

Sure, it will cost you money when you buy your storage account and start storing large amounts of data, but storage is cheap in Azure, so that won’t break the bank.


Informatica Enterprise Data Catalog in Azure

If you’re like many Azure customers, you’ve been on the lookout for a data catalog and data lineage tool that has all the key capabilities you need. Today, I’d like to tell you more about the Informatica Enterprise Data Catalog, which was discussed briefly in a previous Azure Every Day post.

The Informatica tool helps you to analyze, consolidate and understand large volumes of metadata in your enterprise. It allows you to extract both physical and business metadata for objects and organize it based on business concepts, as well as view data lineage and relationships for each of those objects.

Sources include databases, data warehouses, business glossaries, data integration and Business Intelligence reports and more – anything data related. The catalog maintains an indexed inventory of all the data objects or ‘assets’ in your enterprise, such as tables, columns, reports, views and schemas.

Metadata and statistical information in the catalog include things like profile results, as well as info about data domains and data relationships. It’s really the who, what, when, where and how of the data in your enterprise.

The Informatica Data Catalog can be used for tasks such as:

  • Discovering assets by scouring your network or cloud space for assets that aren’t yet cataloged.
  • Viewing lineage for those assets, as well as relationships between assets.
  • Enriching assets by tagging them with additional attributes – for example, tagging a specific report as a critical item.

There are lots of useful features in the Data Catalog. Some key ones are:

  • Data Discovery – Semantic search, dynamic filtering, and data lineage and relationship views for assets across your enterprise.
  • Data Classification – Automatically or manually annotate data classifications to help with governance and discovery – who should have access to what data and what the data contains.
  • Resource Administration – Resource, schedule and attribute management, as well as connection and profile configuration management – all the items that surround the data and help you manage it and the metadata around it.
  • Create and edit reusable profile definition settings.
  • Monitor resources and tasks within your environment.
  • Data domain management, where you can create and edit domains and group like data and reports together.
  • Assign logical data domains to data groups.
  • Build composite data domains for management purposes.
  • Monitor the status of tasks in progress and look at the transformation logic for assets.

On top of this, you can look at how frequently the data is accessed and how valuable it is to your business users; surfacing this type of information about your data lets you trim reports that aren’t being used, for instance.

When we talk about modern data warehousing in the Azure cloud, this is something we’ve been looking for. It’s a useful and valuable tool for those who want data governance and lineage capabilities.

Introducing Microsoft Azure Sphere

As we continually see more and more consumer-grade internet devices, like appliances, baby monitors, thermostats, etc., we need a more robust way to manage these devices securely. Approximately 9 billion of these devices are built every year, and each has a tiny chip, or microcontroller (MCU), that hosts the compute, storage and memory, as well as the operating system.

I’m excited to tell you about a newer offering called Microsoft Azure Sphere. As these consumer-grade devices grow exponentially, the problem is that some of these devices are not built in a secure way, making them easily susceptible to hacking. There have been plenty of news stories about devices that have been hacked and then used for malicious purposes.

Microsoft is not alone in recognizing this issue, but they jumped on it in 2015 and began developing what they believed was a good approach to securing these devices, creating Microsoft Azure Sphere. This is a solution for creating highly secure, internet-connected microcontroller devices with three main components:

1. Azure Sphere Certified MCUs – Manufacturers architect a solution that combines real-time and application processors on the MCU, using built-in Microsoft security technology and connectivity capabilities.

Microsoft used the experience, processes and lessons learned from the Xbox consoles built over the past 15 years and put that into the design of these chips. So, third-party companies can build these chips using those processes and be certified.

2. Azure Sphere Operating System – Once an MCU is certified, this operating system can be installed on it. It is intended to be highly secure and agile to serve those MCU purposes, with layers of security from Windows and Linux plus dedicated security monitoring software, all built into the operating system.

3. Azure Sphere Security Service – This allows you to protect each device, but also allows secure communication from device to device or device to the cloud.

This is similar to what we talk about with IoT and IoT Hub, but there is a certified way of doing it to ensure you’re using an architecture that will remain secure and supported by Microsoft for years to come. And this will apply to the tens of thousands of companies that are going to build devices for all the areas I’ve mentioned and more, giving them a secure platform to build on.

Azure Sphere was just announced earlier this year and there’s a long way to go. But we’ll soon start to see things like software and hardware development kits (which you can pre-order now from Microsoft) and really get a chance to understand how it works and how it’s going to offer better solutions in the coming years for these connected devices.

It’s a cool technology, and however they brand it – whether it’s Azure Sphere MCU Certified or something else – it will give us the assurance that we’re getting a quality product without the danger of being exposed.


New Options for SQL 2008 End of Support

If you’re using SQL Server 2008 or 2008 R2, you need to be aware that extended support for those versions ends on July 9, 2019. That means the end of regular security updates, which leads to more vulnerabilities; the software won’t be updated; and you’ll face out-of-compliance risks. As such, let’s look at some new options for SQL 2008 end of support.

The best option would be either a migration or an upgrade, but Microsoft has some options in place to help people out, as they understand this can be easier said than done when you have applications that need to be upgraded and you must figure out how best to handle that.

That being said, upgrading provides better performance, efficiency, security features and updates, as well as new technology features and capabilities within the whole stack of SQL products (SSIS, SSRS, SSAS).
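Before picking a path, it helps to inventory which instances are actually affected. Here’s a hypothetical sketch in Python with pyodbc – the server list is a placeholder, and SQL Server 2008 and 2008 R2 both report major version 10 (10.0.x and 10.50.x):

```python
# Hypothetical helper: flag SQL Server 2008 / 2008 R2 instances.
# Requires: pip install pyodbc (and an installed SQL Server ODBC driver)
import pyodbc

servers = ["sqlprod01", "sqlprod02"]  # placeholder inventory

for server in servers:
    conn = pyodbc.connect(
        f"DRIVER={{ODBC Driver 17 for SQL Server}};SERVER={server};"
        "Trusted_Connection=yes;",
        timeout=5,
    )
    row = conn.cursor().execute(
        "SELECT CAST(SERVERPROPERTY('ProductVersion') AS nvarchar(128)), @@VERSION"
    ).fetchone()
    conn.close()
    if row[0].startswith("10."):  # 2008 = 10.0.x, 2008 R2 = 10.50.x
        print(f"{server}: end of support July 2019 -> {row[1].splitlines()[0]}")
```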

Other Options

Here are some options that Microsoft is offering to help with the end of support of 2008/2008R2:

    • First, they are making extended security updates available for free in Azure for 2008/2008 R2 for 3 more years. So, if you simply move your workload to an IaaS VM in Azure, you’ll be supported without requiring application changes. You’ll have to pay for those virtual machine costs, but it’s still a good deal to get you started.
    • You can migrate your workloads to Managed Instances, which will be in GA by the end of 2018. This will support a broad range of applications, so you can start the transition up into Azure.
    • You can take advantage of the Azure hybrid licensing model to save money on licensing when you migrate. With this you can save up to 55% on some of your PaaS SQL Server costs, but only if you have Enterprise Edition and Software Assurance.
    • For on-premises servers that need more time to upgrade, you’ll be able to purchase extended security service plans to extend coverage 3 years past the July 2019 date. So, if you’re struggling to get an application upgraded and validated, they’ll extend that out for a fee. Again, this is for customers with Software Assurance or subscription licenses under an Enterprise Agreement. These can be purchased annually and can cover only the servers that need updates.
    • Extended security updates will also be available for purchase as they get closer to the end of support, for both SQL Server and Windows Server.

Again, the first choice would be to upgrade or migrate those databases and move to Azure, but there are some challenges with doing so, and if none of those options work, there are some great options to extend your support.

Overview of Azure Operations Management Suite (OMS)

In this post I’d like to give an overview of what Azure Operations Management Suite is and what it can be used for. First, Operations Management Suite, or OMS, is a collection of management services designed for the Azure cloud. As new services are added to Azure, more capabilities are built into OMS to allow for integration.

OMS allows you to collect things in one central place – the many Azure services that need deeper insight and manageability – all from one portal, as well as to set up different groups and different ways of viewing your data. OMS can also be used with on-premises resources via the Windows and Linux agents, so you can collect logs or back up your servers or files to Azure, for example.

The key Operations Management Suite services are:

  • Log Analytics allows you to monitor and analyze the availability and performance of different resources, including physical and virtual machines, Azure Data Factory and other Azure services.
  • Proactive alerting for when an issue or problem is detected in your environment, so you can either take corrective action or trigger a preprogrammed corrective action.
  • The ability to automate manual processes and enforce configuration for physical and virtual machines, like automating the clean-up operations you do on servers. You do this through runbooks, which are based on PowerShell scripts or PowerShell workflows, letting you programmatically do what you need to do within OMS (a sample runbook follows this list).
  • Integrated backup, where the agent allows for backing up at the service or file level – whatever you need to do for critical data – whether the resources are on-prem or cloud-based.
  • Azure Site Recovery runs through OMS and helps you provide high availability for the apps and servers you’re running.
  • Orchestrated replication up into Azure, from physical servers, Hyper-V or VMware servers running Windows or Linux.
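Runbooks are most often PowerShell, but Azure Automation also supports Python runbooks. Here’s a hypothetical cleanup runbook sketch that, when run on a Hybrid Runbook Worker (so it executes on your server rather than in the Azure sandbox), prunes stale temp files; the target path and retention window are placeholders:

```python
# Hypothetical Azure Automation Python runbook: prune temp files
# older than 30 days on the machine where it runs.
import os
import time

TARGET_DIR = r"C:\Temp"           # placeholder cleanup target
MAX_AGE_SECONDS = 30 * 24 * 3600  # 30 days

removed = 0
for name in os.listdir(TARGET_DIR):
    path = os.path.join(TARGET_DIR, name)
    if os.path.isfile(path) and time.time() - os.path.getmtime(path) > MAX_AGE_SECONDS:
        os.remove(path)
        removed += 1

print(f"Removed {removed} stale files from {TARGET_DIR}")
```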

It also provides management solutions. These are prepackaged sets of templates provided by Microsoft and/or partners that help implement multiple OMS services at one time. One example is the Update Management solution, which creates a log search, dashboard and alerting inside Log Analytics and, at the same time, creates an automation runbook for installing updates on the server. This will tell you when updates are available and needed, and then let you automate the install of those updates.

There is a lot of power and capability that comes with the Operations Management Suite. It’s a great centralized management solution within Azure that is quick to configure and start using.


Hybrid Cloud Strategies and Management

Are you running a hybrid environment between on-premises and Azure? Do you want to be? In a recent webinar, Sr. Principal Architect Chris Seferlis answered the question: how can my organization begin using hybrid cloud today? In this webinar, he defines the four key pillars of a true hybrid cloud – identity, security, data platform and development – and shows actionable resources to help get you started down the hybrid road.

Hybrid cloud presents a great opportunity for your organization and is the path most are going down:

80% of enterprises see themselves operating hybrid clouds for the foreseeable future

58% of enterprises have a hybrid cloud strategy (up from 55% a year ago)

87% of organizations are planning to integrate on-premises datacenters with public cloud

In this in-depth webinar, Chris covers:

Hybrid Identity with Windows Server Active Directory and Azure Active Directory – Identity is the new control plane. We’ve all got lots of services, devices and internal apps, and firewalls do not protect your users in the cloud.

With Azure AD you:

  • Have thousands of apps with one identity
  • Enable business without borders
  • Manage access at scale
  • Have cloud-powered protection

Security – Better security starts at the OS – protect identity, protect the OS on-premises or in the cloud, help secure virtual machines.

Coupling Server 2016 with Azure enables security for your environment at cloud speed.

Azure enables rapid deployment of built-in security controls, as well as products and services from security partners, and provides integration of partner solutions. Microsoft is making a major commitment to integration with 3rd party tools for ease of transition and a true hybrid approach.

Data and AI – AI investment increased by 300% in 2017. Organizations that harness data, cloud and AI out-perform and out-innovate their peers, with nearly double the operating margin.

This webinar will tell you how to transform your business with a modern data estate.

Other areas covered are:

Azure Stack – the first consistent hybrid cloud platform

Hybrid Protection with Azure Site Recovery – Azure reduces the common challenges of cost and complexity with increased compliance.

Azure File Sync – If you’re using a file server on-prem, let’s make it better with Azure.

Project Honolulu – A modern management platform to manage on-prem and Azure.

This webinar is chock-full of information to get you on the right path to running a hybrid environment between on-premises and Azure. Watch the complete webinar here and click here to download the slides from the session. If you want to learn more about hybrid cloud strategies, contact us – we’re here to help.

Azure Enterprise Security Package for HDInsight

In today’s post I’d like to talk about the Enterprise Security Package for Azure HDInsight. HDInsight is a managed cloud Platform as a Service offering built on the Hadoop framework. It allows you to build big data solutions using Hadoop, Spark, Hive, LLAP and R, among others.

Let’s think about the traditional deployment of Hadoop. In a traditional deployment, you would deploy a cluster and give users local admin access with SSH access to that cluster. Then you would hand it over to the data scientists so they could do what they needed to run those data science workloads: train the models, run scripts and such.

As these types of big data workloads were adopted into the enterprise, they became much more reliant on enterprise security. There was a need for role-based access control with Active Directory permissions. Admins wanted greater visibility into who was accessing the data and when, as well as what they tried to get into and whether their attempts were successful – basically all the audit requirements that come with working on large data sets.

Who is the leader in enterprise security? Microsoft, of course, with Active Directory. The Enterprise Security Package allows you to add the cluster to the domain during the creation process, as a sort of ‘add-on’ in your Azure portal. Other things it allows you to do are:

  • Add an HDInsight cluster to Active Directory Domain Services.
  • Apply role-based access control for Hive, Spark and Interactive Hive using Apache Ranger (a rough sketch follows this list).
  • Set specific file and folder permissions for the data inside an Azure Data Lake Store.
  • Audit logs to see who has accessed what, and when.
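To give a feel for the Ranger piece, here’s a rough sketch of creating a Hive access policy through Apache Ranger’s public REST API. The URL, service name, user and credentials are placeholders, and the JSON layout follows Ranger’s v2 API, so verify it against your cluster’s Ranger version:

```python
# Rough sketch: create a read-only Hive policy via Ranger's REST API.
# Requires: pip install requests
import requests

RANGER_URL = (
    "https://<cluster>.azurehdinsight.net/ranger"
    "/service/public/v2/api/policy"
)

policy = {
    "name": "sales-readonly",
    "service": "<cluster>_hive",  # placeholder Ranger service name
    "isEnabled": True,
    "resources": {
        "database": {"values": ["sales"]},
        "table": {"values": ["*"]},
        "column": {"values": ["*"]},
    },
    "policyItems": [
        {
            "users": ["analyst@contoso.com"],  # placeholder AD user
            "accesses": [{"type": "select", "isAllowed": True}],
        }
    ],
}

resp = requests.post(
    RANGER_URL,
    json=policy,
    auth=("admin", "<password>"),  # placeholder admin credentials
)
resp.raise_for_status()
print("Created policy id:", resp.json().get("id"))
```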

Currently, these features are only available for Spark, Hadoop and Interactive Query workloads, but more workloads will be adopted soon.

A Guide to GDPR Compliance with Microsoft Data Platform

As most people know, the GDPR deadline is approaching quickly – May 25th, to be exact. Most companies will need to review or modify their database management and data handling procedures, especially focusing on the security of data processing. In a recent webinar, 3 experts in the Azure, SQL Data Platform and software arenas – Abraham Samuel, Technical Support Personnel, Microsoft; Brian Knight, Founder and CEO, Pragmatic Works; and myself, Sr. Principal Architect, Pragmatic Works – offered an informational session on the steps you need to take now to help in your journey to compliance.

This 2-hour webinar covers the key changes that need to be addressed for GDPR: controls, modifications, transparent policies, and IT and training. It also discusses how modernizing your data platform, on-premises and in Azure, will immediately reduce areas out of compliance, as well as what Azure tools and services are offered to help ensure you remain in compliance.

It also taps into the Pragmatic Works team’s experience with some of the danger areas customers face and how our suite of software tools can help you expose areas of concern in your environment. Still using SQL Server 2008 or 2008 R2? Here you’ll learn what 2008/2008 R2 end of support means and the paths to upgrade your SQL Server.

Take some time and watch this information-packed webinar; it will help eliminate confusion around GDPR and cover the steps you need to take to be in compliance, as well as how to make your plans actionable. GDPR goes into effect this month. This webinar will educate you and give you options to move along your journey into GDPR and a Microsoft modern data platform.