Category Archives: Security

What is Azure Network Watcher?

Most of us are starting to deploy more and more cloud assets. When you deploy assets in Azure, you typically build out a virtual network; you can tie it in with your on-premises network through ExpressRoute or a VPN, or run it independently in the cloud. The question is, how do you monitor and manage that virtual network – its components and how the virtual machines interact? Here’s where Azure Network Watcher comes in.

Azure Network Watcher allows you to monitor, diagnose and gain insight into your network performance between various points in your network infrastructure.

Here’s a breakdown of some of the elements:

1. The Monitoring Element – You can monitor from one endpoint to another with Connection Monitor to ensure connectivity between two points, like a web application and a database for instance. You’ll be alerted to potential issues, such as a disconnect between those two services.

It also monitors latency times for evaluation. When you look at those latency times over a period, you’ll know the average latency as well as the maximum and minimum, and you can consider whether you’d get better service in a different Azure region.

2. The Network Performance Monitor – Allows monitoring between Azure and on-premises resources for hybrid scenarios using VPN or ExpressRoute. It also has advanced detection of traffic blackholing and routing errors – in other words, some advanced intelligence when it comes to these network issues.

Best of all, as you add more endpoints it will develop a visual diagram of your network with a topology tool, which looks like a Visio diagram showing IP addresses, host names, etc.
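
If you want to pull that topology programmatically rather than through the portal, here’s a minimal sketch using the azure-mgmt-network Python package. The subscription ID, Network Watcher name and resource group names below are placeholders, and the Network Watcher instance is assumed to already exist in the region:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

# The Network Watcher instance lives in its own resource group (often
# "NetworkWatcherRG"); the parameters name the resource group to diagram.
topology = client.network_watchers.get_topology(
    "NetworkWatcherRG",
    "NetworkWatcher_eastus",
    {"target_resource_group_name": "my-app-rg"},
)

# Each topology resource lists its associations (NICs, subnets, etc.).
for resource in topology.resources:
    print(resource.name, [assoc.name for assoc in resource.associations])
```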

3. Diagnostic Tools – There are several diagnostic tools that give you better insight into your virtual network by identifying possible causes of traffic issues.

IP Flow Verify – Tells you which security rule allowed or denied traffic to or from a virtual machine in your virtual network, for further inspection or remediation.

The Next Hop tool tests routing by letting you enter a source and destination IP address, then shows where that traffic gets routed, again so you can investigate further or remediate.
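
Both checks can be scripted as well. Here’s a rough sketch with recent versions of the azure-mgmt-network Python package; the subscription, VM ID and addresses are placeholders, and the Network Watcher instance is assumed to exist in the VM’s region:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")
watcher_rg, watcher = "NetworkWatcherRG", "NetworkWatcher_eastus"
vm_id = ("/subscriptions/<subscription-id>/resourceGroups/my-app-rg"
         "/providers/Microsoft.Compute/virtualMachines/my-vm")

# IP flow verify: which NSG rule allows or denies this inbound flow?
flow = client.network_watchers.begin_verify_ip_flow(
    watcher_rg, watcher,
    {
        "target_resource_id": vm_id,
        "direction": "Inbound",
        "protocol": "TCP",
        "local_ip_address": "10.0.0.4",
        "local_port": "443",
        "remote_ip_address": "203.0.113.10",
        "remote_port": "40000",
    },
).result()
print(flow.access, flow.rule_name)

# Next hop: where does traffic from this VM to a destination IP get routed?
hop = client.network_watchers.begin_get_next_hop(
    watcher_rg, watcher,
    {
        "target_resource_id": vm_id,
        "source_ip_address": "10.0.0.4",
        "destination_ip_address": "13.107.21.200",
    },
).result()
print(hop.next_hop_type, hop.next_hop_ip_address)
```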

The Connection Troubleshooting Tool – Enables you to test a connection between a VM and another VM, an FQDN, a URI or an IPv4 address. It returns information like the Connection Monitor does, but about point-in-time latency rather than latency over a span of time.
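
A minimal sketch of that one-off check, under the same assumptions as the earlier snippets (placeholder IDs and addresses, azure-mgmt-network package):

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

vm_id = ("/subscriptions/<subscription-id>/resourceGroups/my-app-rg"
         "/providers/Microsoft.Compute/virtualMachines/web-vm")

# One-off connectivity check from a VM to a database endpoint.
check = client.network_watchers.begin_check_connectivity(
    "NetworkWatcherRG", "NetworkWatcher_eastus",
    {
        "source": {"resource_id": vm_id},
        "destination": {"address": "sql-server.example.com", "port": 1433},
    },
).result()

print(check.connection_status)                       # Reachable / Unreachable
print(check.avg_latency_in_ms, check.probes_failed)  # point-in-time numbers
```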

The Packet Capture Tool – Allows traffic to and from a virtual machine to be captured, with some fine-grained filtering, stored in Azure Storage and further analyzed with network capture tools like Wireshark, for instance.
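
Starting a filtered capture can also be scripted. A sketch under the same assumptions (placeholder IDs; the Network Watcher agent extension is assumed to be installed on the target VM):

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

vm_id = ("/subscriptions/<subscription-id>/resourceGroups/my-app-rg"
         "/providers/Microsoft.Compute/virtualMachines/my-vm")
storage_id = ("/subscriptions/<subscription-id>/resourceGroups/my-app-rg"
              "/providers/Microsoft.Storage/storageAccounts/mycaptures")

# Capture up to 100 MB of HTTPS traffic on the VM into a storage account;
# the resulting .cap file can then be opened in Wireshark.
capture = client.packet_captures.begin_create(
    "NetworkWatcherRG", "NetworkWatcher_eastus", "https-capture",
    {
        "target": vm_id,
        "storage_location": {"storage_id": storage_id},
        "total_bytes_per_session": 100 * 1024 * 1024,
        "filters": [{"protocol": "TCP", "remote_port": "443"}],
    },
).result()
print(capture.provisioning_state)
```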

4. Metrics Tools – There are limits on how many resources you can deploy within an Azure network, based on subscription or region. The Metrics Tool gives you the visibility you need to understand exactly where you sit against those limits. It shows you how many of those resources you’ve deployed and how many you can still deploy – so it helps you plan for the future as you deploy more and more resources.
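
You can pull those numbers yourself. As a small sketch with the azure-mgmt-network Python package, the usages operation reports current consumption against the subscription limit for each networking resource type in a region:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Each entry reports how much of a networking resource type you've used in
# the region versus the subscription limit.
for usage in client.usages.list("eastus"):
    print(f"{usage.name.value}: {usage.current_value}/{usage.limit}")
```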

5. Logging – We’ve done some interesting things with Log Analytics. Log Analytics provides the ability to capture data about a bunch of Azure networking components, like network security groups, public IP addresses, load balancers, virtual networks and application gateways, to name a few.

All these logs can be captured, stored in Azure Storage and further analyzed. Many can be fed into Operations Management Suite (OMS). This gives you a single-pane-of-glass experience when you want to look at your environment at that “50,000-foot level”.

So, as you begin to deploy more and more assets into your Azure environment, this is a helpful service to monitor and manage your virtual network. You get a high-level overview of what that network looks like.

Improve Your Security Posture with Azure Secure Score

Security is a top priority for every business and we can never have enough of it, right? But at what point does it become too much to administer and prioritize security threats? I’m excited to tell you about a newly announced offering called Azure Secure Score which is part of the Azure Security Center.

If you’re unfamiliar, the Azure Security Center is a centralized place where you can get security recommendations based on the workloads you’ve deployed. In September at Ignite, Microsoft announced Secure Score, a security analytics tool that provides visibility into your organization’s security posture and helps you understand how secure your workloads are by assigning them a score.

The new Secure Score helps you prioritize and triage your response to security recommendations. It takes into consideration the severity and impact of each recommendation and, based on that, assigns a numerical value showing how fixing the recommendation will improve your security posture.

Once you implement a recommendation, its score – and your overall Secure Score – updates.

The main goals of Secure Score are:

  • Provide capabilities that let you visualize your security posture.
  • Quickly triage and suggest impactful actions that increase your security posture.
  • Measure your workload security over time.

So, how do Azure Security Center and Secure Score work?

  • Azure Security Center constantly reviews your active recommendations and calculates your Secure Score based on these.
  • The score of a recommendation is derived from its severity and security best practices that will affect your workload security over time.
  • It looks at your security posture over a period of time. It’s not an immediate result and it won’t change right away, but your score builds up as you implement recommendations and resolve them.
  • The Secure Score is calculated based on the ratio between your healthy resources and your total resources. If the number of healthy resources is equal to your total resources, you get the highest score value (see the sketch after this list).
  • The overall score is an accumulation of all your recommendations. You can view your overall Secure Score across your subscriptions or management groups depending on the scope you select. The score will also vary based on the subscriptions selected and the active recommendations on them.
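
To make that ratio concrete, here’s a back-of-the-envelope illustration in Python – a sketch only, with hypothetical numbers and a hypothetical 10-point maximum, not Security Center’s exact formula:

```python
# Hypothetical illustration of the documented ratio: healthy resources over
# total resources, scaled to the recommendation's maximum score. The numbers
# and the 10-point maximum are made up for illustration.
def recommendation_score(healthy: int, total: int, max_score: float = 10.0) -> float:
    if total == 0:
        return max_score  # nothing to remediate counts as fully healthy
    return max_score * healthy / total

# Example: 40 of 50 VMs already have disk encryption enabled.
print(recommendation_score(healthy=40, total=50))  # 8.0
```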

Remember, this is a marathon, not a sprint. It takes time to do the remediation, whether it be patching machines or closing ports or shutting off services. There are so many remedies offered that will make you more secure down the road. With this offering, you get a ‘scorecard’ for yourself and a look at what’s most imperative to implement first.

Be sure to check out the Azure Security Center. There are a lot of free options there as well as options to add additional services at a cost.

Using Azure to Drive Security in Banking Using Biometrics

In the digital world we live in today, it’s getting harder to verify identity in industries such as banking. We now do fewer and fewer transactions in person. No longer do we go into banks with passbook in hand and make deposits or withdrawals face to face with a bank teller. Many of us have moved from ATM transactions to digital banking.

With this move, banks have tried many approaches to two-factor authentication, some better than others, and the need is clearly there for secure forms of user authentication. Let me tell you how Azure is driving identity security in banking using biometric identification. By combining biometrics with artificial intelligence, banks can now take new approaches to verifying the digital identity of their customers and prospects.

If you don’t know, biometrics is the process of uniquely identifying a person by physical and personal traits. These are recorded into a database, and those images or features are captured by an electronic device and used as a unique form of identification. Some common biometric methods are fingerprint and facial recognition, hand geometry, iris or eye scans and even odor or scent.

Because of their uniqueness, these are much more reliable in confirming a person’s identity than a password or access card. So, how do you verify a person is who they say they are if they’re not in person? Microsoft partners are now leveraging Azure platform offerings to do this – things such as the Cognitive Services Vision API and Azure Machine Learning tools for performing multi-factor authentication in the banking industry.

The way this works is that the user provides a government-issued ID (a license or passport, for example), which is validated against standards provided by the ID issuer – effectively building an algorithm for verifying that ID and putting it into a database. So, when someone submits an ID from a particular state, we know what that ID is supposed to look like and we look for all its distinguishing features.

To take this a step further, the second factor uses facial recognition software on devices like your phone or computer, similar to Face ID on the iPhone. It takes your photo, but it also takes a video of you and makes you move your head in certain motions in order to validate that it’s really you – that you’re not wearing a mask or something – and that you’re alive.

It takes a picture of your ID, matches it to your facial features and compares them side by side; this becomes your digital signature. This is considered extremely secure, as you now have two forms of verification and you’re using biometrics. Crazy stuff when you think about it, but in the digital world we live in, you must go to these lengths to verify someone’s identity when they’re not right in front of you.
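
To give a flavor of what the side-by-side verification looks like in code, here’s a minimal sketch using the Cognitive Services Face client library for Python (azure-cognitiveservices-vision-face). The endpoint, key and image URLs are placeholders, and the liveness checks described above happen on the device, not in this call:

```python
from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

face_client = FaceClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<face-api-key>"),
)

# Detect one face in the ID photo and one in the live capture.
id_faces = face_client.face.detect_with_url("https://example.com/id-photo.jpg")
live_faces = face_client.face.detect_with_url("https://example.com/live-capture.jpg")

# Verify whether the two detected faces belong to the same person.
result = face_client.face.verify_face_to_face(
    id_faces[0].face_id, live_faces[0].face_id
)
print(result.is_identical, result.confidence)  # e.g. True 0.83
```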

This is still in the early phase of what we’ll see but it’s cool to see how it’s being used and will be interesting to see how it progresses in the future. We’ve got great consultants working with Cognitive Services and Machine Learning. Anything data or Azure related, we’re doing it.

What is Azure Automation?

So, what do you know about Azure Automation? In this post, I’ll fill you in on this cool, cloud-based automation service that gives you the ability to configure process automation, update management and system configuration, managed across your on-premises resources as well as your Azure cloud-based resources.

Azure Automation provides complete control of the deployment, operation and decommissioning of workloads and resources in your hybrid environment, so you have a single pane of glass for managing all your resources through automation.

Some features I’d like to point out are:

  • It allows you to automate those mundane, error-prone activities that you perform as part of your system configuration and maintenance.
  • You can create runbooks in PowerShell or Python that help reduce the chance of misconfiguration errors. They’ll also lower the operational costs of maintaining those systems, since you can script work to run when you need it instead of doing it manually.
  • Runbooks can be developed for on-premises or Azure resources, and they use webhooks that let you trigger automation from things such as ITSM, DevOps and monitoring systems – so you can run them remotely and trigger them from wherever you need to (see the sketch after this list).
  • On the configuration management side, you can build desired state configurations for your enterprise environment. These set a baseline for how your systems should operate and identify when there’s a variance from the initial system configuration, alerting you to any anomalies that could be problematic.
  • It has a rich reporting back end and alerting interface for full visibility into what’s happening in your Windows and Linux systems – on-premises and in Azure.
  • Gives you update management (for Windows and Linux) to help define how updates are applied. Administrators can specify which updates will be deployed, see successful and unsuccessful deployments, and specify which updates should not be deployed to systems, all done through PowerShell or Python scripts.
  • It can share capabilities, so when you’re using multiple resources or building runbooks for automation, you can share resources to simplify management. You can build multiple scripts but reuse the same resources over and over as references – things like role-based access control, variables, credentials, certificates, connections, schedules and access to source control and PowerShell modules. You can check these in and out of source control like any code-based project.
  • Lastly, and one of the coolest features in my opinion: since these are templates you’re deploying to your systems, and everyone faces similar challenges, there’s a community gallery where you can download templates others have created or upload ones you’ve created to share. With a few basic configuration tweaks and a review to make sure they’re secure, this is a great way to speed things up by finding an existing script, cleaning it up and deploying it in your environment.
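
As mentioned in the webhooks item above, a runbook can be kicked off with nothing more than an HTTP POST to its webhook URL. A minimal sketch – the URL token and the payload fields are placeholders you define when you create the webhook:

```python
import requests

# The full webhook URL (including its token) is shown only once, when you
# create the webhook on the runbook - treat it as a secret.
webhook_url = "https://s1events.azure-automation.net/webhooks?token=<token>"

# Optional payload; the runbook receives it in its WebhookData parameter.
response = requests.post(
    webhook_url,
    json={"VMName": "web-01", "Action": "cleanup"},
)
print(response.status_code)  # 202 means the job was queued
```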

So, there’s a lot you can do with this service and I think it’s worth checking out as it can make your maintenance and management much simpler.

What is Azure Firewall?

I’d like to discuss the recently announced Azure Firewall service, which was just released to general availability (GA). Azure Firewall is a managed, cloud-based network security service that protects your Azure Virtual Network resources. It is a fully stateful PaaS firewall with built-in high availability and unrestricted cloud scalability.

Because it lives in the cloud within the Azure ecosystem, that capability is built in. With Azure Firewall you can centrally create, enforce and log application and network connectivity policies across subscriptions and virtual networks, giving you a lot of flexibility.

It is also fully integrated with Azure Monitor for log analytics. That’s big, because a lot of firewalls are not – which means you can’t centralize their logs in OMS, for instance, and that centralization gives you a great single-pane-of-glass platform for monitoring many of the technologies being used in Azure.

Some of the features within:

  • Built-in high availability, so there are no additional load balancers to build and nothing to configure.
  • Unrestricted cloud scalability. It can scale up as much as you need to accommodate changing network traffic flows – no need to budget for your peak traffic; it will accommodate any peaks or valleys automatically.
  • It has application FQDN filtering rules. You can limit outbound HTTP/S traffic to a specified list of fully qualified domain names, including wildcards, and the feature does not require SSL termination (see the sketch after this list).
  • There are network traffic filtering rules, so you can create allow or deny rules by source and destination IP address, port and protocol. Those rules are enforced and logged across multiple subscriptions and virtual networks. This is another great example of having the availability and elasticity to manage many components at one time.
  • It has fully qualified domain name (FQDN) tagging. If you’re running Windows Update across multiple servers, you can tag that service as allowed through the firewall, and it becomes a set standard for all your services behind that firewall.
  • Outbound SNAT and inbound DNAT support. Traffic originating from your virtual network to remote Internet destinations is identified and allowed (Source Network Address Translation), while inbound network traffic to your firewall’s public IP address is translated (Destination Network Address Translation) and filtered to the private IP addresses on your virtual networks.
  • That integration with Azure Monitor I mentioned: all events are integrated with Azure Monitor, allowing you to archive logs to a storage account, stream events to your Event Hub, or send them to Log Analytics.
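
To make the FQDN filtering rules concrete (see the bullet above), here’s a hedged sketch that appends an application rule collection to an existing firewall using the azure-mgmt-network Python package; the resource names and address space are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Fetch the existing firewall and append a rule collection that allows
# outbound HTTPS only to *.windowsupdate.com from the virtual network.
fw = client.azure_firewalls.get("my-network-rg", "my-firewall")
fw.application_rule_collections.append({
    "name": "allow-windows-update",
    "priority": 200,
    "action": {"type": "Allow"},
    "rules": [{
        "name": "windows-update",
        "source_addresses": ["10.0.0.0/16"],
        "protocols": [{"protocol_type": "Https", "port": 443}],
        "target_fqdns": ["*.windowsupdate.com"],
    }],
})
poller = client.azure_firewalls.begin_create_or_update(
    "my-network-rg", "my-firewall", fw
)
print(poller.result().provisioning_state)
```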

Another nice thing to note: when you set up ExpressRoute or a VPN from your on-premises environment to Azure, you can use this as the single firewall for all those virtual networks, allow traffic in and out from there, and monitor it all from that single place.

This was just released to GA, so there are a few hiccups, but if none of the service challenges affect you, I suggest you give it a try. It will only continue to improve, as with all Azure services. I think it’s going to be a great firewall service option for many.

What is Azure Data Box and Data Box Disk?

Are you looking to move large amounts of data into Azure? How does doing it for free sound and with an easier process? Today I’m here to tell you how to do just that with the Azure Data Box.

Picture this: you have a ton of data, let’s say 50 terabytes on-prem, and you need to get it into Azure because you’re going to start doing incremental backups of a SQL database, for instance. You have two options to get this done.

The first option is to move that data manually. That means you have to chunk it, set it up using AzCopy or a similar Azure data tool, put it up in blob storage, then extract it and continue with the process. Sounds pretty painful, right?
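
To give a flavor of that manual path, here’s a minimal per-file sketch with the azure-storage-blob Python package (the account, container and file names are placeholders). AzCopy does the same job faster and with retries, but at 50 terabytes either route is a long slog:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    "https://<storage-account>.blob.core.windows.net",
    credential=DefaultAzureCredential(),
)
container = service.get_container_client("sql-backups")

# Upload one backup file; at tens of terabytes you'd be doing this in
# parallel, in chunks, for days - which is the pain Data Box removes.
with open("db-full.bak", "rb") as data:
    container.upload_blob(name="db-full.bak", data=data, overwrite=True)
```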

Your second option is to use Azure Data Box which allows you to move large chunks of data up into Azure. Here’s how simple it is:

  • You order the Data Box through Azure (currently available in the US and EU)
  • Once received, you connect it to your environment however you plan to move that data
  • It uses standard protocols like SMB and CIFS
  • You copy the data you want to move, then return the Data Box to Azure, where they upload the data into your storage container(s)
  • Once the data is uploaded, they securely erase the Data Box

With the Data Box you get:

  • 256-bit encryption
  • A super tough, hardened box that can withstand drops or water, etc.
  • Data can be pushed into Azure Blob storage
  • You can copy data to up to 10 storage accounts
  • There are two 1 gigabit/second and two 10 gigabit/second connections to allow quick movement of data off your network onto the box

In addition, Microsoft has recently announced the Data Box Disk, a small 8-terabyte disk; you can order up to five disks per order.

With Data Box Disk you get:

  • 35 terabytes of usable capacity per order
  • Supports Azure Blobs
  • A USB/SATA II and III interface
  • Uses 128-bit encryption
  • Like Data Box, it’s a simple process: connect it, unlock it, copy the data onto the disk and send it back, and they’ll copy it into a single storage account for you

Here comes the best part – while Azure Data Box and Data Box Disk are in preview, this is a free service. Yes, you heard it right: Microsoft will send you the Data Box or Data Box Disk for free and you can move your data up into Azure for no cost.

Sure, it will cost you money when you buy your storage account and start storing large sums of data, but storage is cheap in Azure, so that won’t break the bank.

 

Informatica Enterprise Data Catalog in Azure

If you’re like many Azure customers, you’ve been on the lookout for a data catalog and data lineage tool with all the key capabilities you need. Today, I’d like to tell you more about the Informatica Enterprise Data Catalog, which was discussed briefly in a previous Azure Every Day post.

The Informatica tool helps you to analyze, consolidate and understand large volumes of metadata in your enterprise. It allows you to extract both physical and business metadata for objects and organize it based on business concepts, as well as view data lineage and relationships for each of those objects.

Sources include databases, data warehouses, business glossaries, data integration and Business Intelligence reports and more – anything data related. The catalog maintains an indexed inventory of all the data objects or ‘assets’ in your enterprise, such as tables, columns, reports, views and schemas.

Metadata and statistical information in the catalog include things like profile results, as well as info about data domains and data relationships. It’s really the who, what, when, where and how of the data in your enterprise.

Informatica Data Catalog can be used for tasks such as:

  • Find assets by scouring your network or cloud space for assets that aren’t yet cataloged.
  • View lineage for those assets, as well as relationships between assets.
  • Enrich assets by tagging them with additional attributes – for example, tagging a specific report as a critical item.

There are lots of useful features in the Data Catalog. Some key ones are:

  • Data Discovery – Semantic search, dynamic filtering, and data lineage and relationships for assets across your enterprise.
  • Data Classification – Automatically or manually annotate data classifications to help with governance and discovery – who should have access to what data and what does the data contain.
  • Resource Administration – Resource, schedule and attribute management, as well as connection and profile configuration management – all the items surrounding the data that help you manage it and the metadata around it.
  • Create and edit reusable profile definition settings.
  • Monitor resources and tasks within your environment.
  • Data domain management, where you can create and edit domains and group like data and reports together.
  • Assign logical data domains to data groups.
  • Build composite data domains for management purposes.
  • Monitor the status of tasks in progress and look at some transformation logic for assets.

On top of this, you can look at how frequently the data is accessed and how valuable it is to your business users. Having this type of information around your data lets you, for instance, trim reports that aren’t being used.

When we talk about modern data warehousing in the Azure cloud, this is something we’ve been looking for. It’s a useful and valuable tool for those who want those data governance and lineage tools.

Introducing Microsoft Azure Sphere

As we continually see more and more consumer-grade internet devices – appliances, baby monitors, thermostats, etc. – we need a more robust way to manage these devices securely. Approximately 9 billion of these devices are built every year. Each has a tiny chip, or microcontroller (MCU), that hosts the compute, storage and memory, as well as the operating system.

I’m excited to tell you about a newer offering called Microsoft Azure Sphere. As these consumer-grade devices grow exponentially, the problem is that some of these devices are not built in a secure way, making them easily susceptible to hacking. There have been plenty of news stories about devices that have been hacked and then used for malicious purposes.

Microsoft is not alone in recognizing this issue, but they jumped on it in 2015, began developing what they believed was a good approach to securing these devices, and created Microsoft Azure Sphere. This is a solution for creating highly secure, internet-connected microcontroller devices, with three main components:

1. Azure Sphere Certified MCUs – Manufacturers architect a solution that combines real-time and application processors on the MCU, using built-in Microsoft security technology and connectivity capabilities.

They took the experience, processes and lessons learned from the Xbox consoles built over the past 15 years and put that into the design of these chips, so third-party companies can build chips using these processes and be certified.

2. Azure Sphere Operating System – Once an MCU is certified, you can install this operating system, which is intended to be highly secure and agile to serve those MCU purposes, including layers of security from Windows and Linux plus specific security monitoring software, all built into the operating system.

3. Azure Sphere Security Service – This allows you to protect each device, but also allows secure communication from device to device or device to the cloud.

It’s similar to what we talk about with IoT and IoT Hub, but there’s a certified way of doing it to ensure you’re using an architecture that will remain secure and supported by Microsoft for years to come. And this will apply to the tens of thousands of companies that are going to build devices for all the areas I’ve mentioned and more, giving them a secure platform to build on.

Azure Sphere was just announced earlier this year and there’s a long way to go. But we’ll soon start to see things like software and hardware development kits (which you can pre-order now from Microsoft) and really get a chance to understand how it works and how it’s going to offer better solutions in the coming years for these connected devices.

It’s a cool technology, and however they brand it – whether it’s Azure Sphere Certified MCU or something else – it will give us the assurance that we’re getting a quality product without the danger of being exposed.

 

New Options for SQL 2008 End of Support

If you’re using SQL Server 2008 or 2008 R2, you need to be aware that extended support for those versions ends on July 9, 2019. That means the end of regular security updates, which leads to more vulnerabilities; the software won’t be updated; and you’ll have out-of-compliance risks. As such, here are some new options for SQL 2008 end of support:

The best option would be a migration or an upgrade, but Microsoft has some options in place to help people out, as they understand this can be easier said than done when you have applications that need to be upgraded and you must figure out how best to handle that.

That being said, upgrading provides better performance, efficiency, security features and updates, as well as new technology features and capabilities within the whole stack of SQL products (SSIS, SSRS, SSAS).

Other Options

Here are some options that Microsoft is offering to help with the end of support of 2008/2008R2:

    • First, they are going to make extended security updates available for free in Azure for 2008/2008 R2 for three more years. So, if you simply move your workload to an IaaS VM in Azure, you’ll be supported without requiring application changes. You’ll have to pay the virtual machine costs, but it’s still a good deal to get you started.
    • You can migrate your workloads to Azure SQL Database Managed Instance, which will be generally available by the end of 2018 and can support your existing applications, so you can start the transition up into Azure.
    • You can take advantage of the Azure Hybrid Benefit licensing model to save money when you migrate. With this you can save up to 55% on some of your PaaS SQL Server costs, but only if you have Enterprise Edition and Software Assurance.
    • For on-premises servers that need more time to upgrade, you’ll be able to purchase extended security update plans covering up to three years past the July 2019 date. So, if you’re struggling to get an application upgraded and validated, they’ll extend support for a fee. Again, this is for customers with Software Assurance or subscription licenses under an Enterprise Agreement. These can be purchased annually and can cover only the servers that need updates.
    • Extended security updates will also be available for purchase as they get closer to the end of support, for both SQL Server and Windows Server.

Again, the first choice would be to upgrade or migrate those databases and move to Azure, but there are some challenges with doing so, and if none of those options work, there are some great options to extend your support.

Overview of Azure Operations Management Suite (OMS)

In this post I’d like to give an overview of what Azure Operations Management Suite is and what it can be used for. First, Operations Management Suite, or OMS, is a collection of management services designed for the Azure cloud. As new services are added to Azure, more capabilities are built into OMS to allow for integration.

OMS collects things in one central place – the many Azure services that need deeper insight and manageability – all from one portal, and lets you set up different groups and different ways of viewing your data. OMS can also be used with on-premises resources via the Windows and Linux agents, so you can collect logs or back up your servers or files to Azure, for example.

The key Operations Management Suite services are:

  • Log Analytics, which allows you to monitor and analyze the availability and performance of different resources, including physical and virtual machines, Azure Data Factory and other Azure services (see the query sketch after this list).
  • Proactive alerting for when an issue or problem is detected in your environment, so you can either take corrective action or have a preprogrammed corrective action run automatically.
  • The ability to automate manual processes and enforce configuration for physical and virtual machines – like automating the clean-up operations you do on servers, for instance. You can do this through runbooks, which are based on PowerShell scripts or PowerShell workflows, where you can programmatically do what you need to within OMS.
  • Integrated backups: the agent and integration allow for backing up at the server or file level – whatever you need to do for critical data – and storing it, whether for on-prem or cloud-based resources.
  • Azure Site Recovery runs through OMS and helps you provide high availability for apps and servers that you’re running.
  • Orchestrated replication up into Azure, from physical servers, Hyper-V or VMware servers running Windows or Linux.
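
For the Log Analytics piece (noted in the first bullet above), queries can also be run programmatically. This sketch uses the current azure-monitor-query Python package – it postdates the OMS branding but talks to the same Log Analytics workspaces; the workspace ID is a placeholder:

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# Count heartbeats per computer over the last day - a quick way to spot
# agents that have stopped reporting in.
response = client.query_workspace(
    workspace_id="<workspace-id>",
    query="Heartbeat | summarize Count = count() by Computer",
    timespan=timedelta(days=1),
)
for table in response.tables:
    for row in table.rows:
        print(row)
```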

Beyond these, OMS provides management solutions – prepackaged sets of templates from Microsoft and/or partners that implement multiple OMS services at one time. One example is the Update Management solution, which creates a log search, dashboard and alerting inside Log Analytics, and at the same time creates an automation runbook for installing updates on a server. It tells you when updates are available and needed, then lets you automate the install of those updates.

There is a lot of power and capability that comes with the Operations Management Suite. It’s a great centralized management solution within Azure that is quick to configure and start using.