Microsoft Reimagines Traditional SIEMs with Azure Sentinel

If you’re like most, security for your organization is at the forefront of your mind. You need the right tools and the right team to keep up with an increasing number of sophisticated threats while your security staff is inundated with requests and alerts.

Today I’d like to tell you about Microsoft’s reimagined SIEM tool, Azure Sentinel. Over the past 10 to 15 years, Security Information and Event Management (SIEM) has become extremely popular as an aggregation solution for the security events that happen in our networks.

SIEM offerings span software tools, hardware appliances and managed service providers, all of which can help you understand your level of risk both in real time and over a span of time. They handle log aggregation, event correlation and forensic analysis, and offer features for alerting, dashboarding and compliance checks.

These are great resources to help secure our environments, users and devices. But unfortunately, the reality is that security teams are being inundated with requests and alerts. Compound this with the noteworthy worldwide shortage of security professionals – an estimated 3.5 million unfilled security jobs by 2021 – and you have a major concern.

Microsoft decided to take a different approach with Azure Sentinel. Azure Sentinel provides intelligent security analytics at cloud scale for your entire enterprise. It makes it easy to collect data across your entire hybrid organization on any cloud, from devices to users to applications to servers. Azure Sentinel uses the power of AI to ensure you’re quickly identifying real threats.

With this tool:

  • You’ll eliminate the burden of traditional SIEMs, since there’s no need to spend time setting up, maintaining and scaling infrastructure.
  • Since it’s built on Azure, it offers virtually limitless cloud scale while addressing all your security needs.

Now let’s talk cost. Traditional SIEMs have proven expensive to own and operate, often requiring you to commit up front and incur high costs for infrastructure maintenance and data ingestion. With Sentinel, you pay for what you use, with no up-front costs. Even better, because of Microsoft’s relationships with so many enterprise vendors (and more partners being added), it easily connects to popular solutions, including offerings from Palo Alto Networks, F5 Networks, Symantec and Check Point.

Azure Sentinel integrates with the Microsoft Graph Security API, enabling you to import your own threat intelligence feeds and customize threat detection and alert rules. Custom dashboards give you views optimized for your specific use case.
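To make that concrete, here’s a hypothetical sketch of pushing a single indicator through the Graph Security API’s beta tiIndicators endpoint from Python. The token, IP address and field values are placeholders, and the endpoint was in beta at the time of writing, so treat this as illustrative rather than definitive:

```python
# Hypothetical sketch: submit a custom threat indicator via the Microsoft Graph
# Security API (beta tiIndicators endpoint) so Sentinel can alert on it.
# TOKEN is a placeholder for an Azure AD token with the
# ThreatIndicators.ReadWrite.OwnedBy permission.
from datetime import datetime, timedelta

import requests

TOKEN = "<bearer-token>"  # placeholder; acquire via your auth library of choice
URL = "https://graph.microsoft.com/beta/security/tiIndicators"

indicator = {
    "action": "alert",
    "description": "Known-bad IP from our threat feed (example)",
    "expirationDateTime": (datetime.utcnow() + timedelta(days=30)).isoformat() + "Z",
    "networkIPv4": "203.0.113.42",  # documentation-range example address
    "targetProduct": "Azure Sentinel",
    "threatType": "WatchList",
    "tlpLevel": "amber",
}

resp = requests.post(URL, json=indicator, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
print(resp.json().get("id"))  # the id of the newly created indicator
```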

Lastly, if you’d like to try this out for free, Microsoft lets you connect Azure Sentinel to your Office 365 tenant to do some testing and check it out in greater detail. The product is currently in preview, so there may be some kinks, but I’m looking forward to seeing how it develops into a true enterprise-class security solution for your environment – whether in the cloud, on premises, in data centers, or for remote users and devices.

Microsoft Announces Windows Virtual Desktop in Azure

Today I’m here with some exciting news out of Microsoft: the public preview of Windows Virtual Desktop. Virtual desktops are not a new invention, and multiple vendors currently offer them.

Windows Virtual Desktop comprises the Windows desktops themselves, the applications you publish to users, and a management solution, all hosted in Azure. During the public preview, desktops and apps can be deployed on virtual machines in any Azure region in the US, with the management solution and data for these virtual machines residing in the US as well.

As the service moves closer to general availability, Microsoft will start to scale out the management solution and data localization to all Azure regions. Virtual desktops can be deployed with Windows 7 (with extended support) or Windows 10 on the workstation side, and with Windows Server 2012 through 2019 on the server side.

It provides a full desktop virtualization environment inside your Azure subscription without having to run any additional gateway servers, as you would if you were deploying on other vendors’ platforms or on premises.

With Windows Virtual Desktop you can build custom images or pick from the canned images provided in the Azure gallery. Images can be personalized and remain static with persistent desktops. There are also many configuration options: for instance, you can publish a single remote application in a client/server-style deployment, or you can deploy pooled multi-session resources.

Another key point is that you’re deploying this with significantly reduced overhead, as you no longer need to manage the Remote Desktop roles like you would with Remote Desktop Services on premises or with some of the other providers. You just have to manage the virtual machines inside your Azure subscription.

Historically there have been many great use cases for virtual desktops in fields like education and healthcare, among many others.

Accelerate Your AI with Machine Learning on Azure Data Box Edge

In some past blogs I’ve discussed Azure Data Box and how the Data Box family has expanded. Today I’ll talk about Azure Data Box Edge (in preview) and elaborate on the machine learning capability it provides on your premises with the power of Azure behind it.

If you don’t know, Azure Data Box Edge is a physical hardware device that sits in your environment and collects data from local sources, such as IoT devices, where you might take advantage of the AI features the device offers. It then sends the data to Azure for further processing, storage or reporting.

Microsoft recently announced Azure Machine Learning hardware accelerated models, powered by Project Brainwave, on Data Box Edge. Because most data in real-world applications is created and used at the edge of our networks – like images and video collected from factories, retail stores or hospitals – it can now be used for things such as manufacturing defect analysis, out-of-stock detection or diagnostics.

Applying machine learning models to the data directly on Data Box Edge provides lower latency (and savings on bandwidth costs), as we don’t have to send all the data to Azure for analysis, while still offering real-time insight and speed to action for critical business decisions.

Data scientists can simplify and accelerate the building, training and deployment of machine learning models using the Azure Machine Learning service, which is already generally available. They can access all these capabilities in their favorite Python environment, using the latest open-source frameworks such as PyTorch, TensorFlow and scikit-learn.
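For a flavor of what that looks like, here’s a minimal sketch of registering a trained model, assuming the azureml-core package and a config.json for an existing workspace; the file and model names are placeholders I’ve made up:

```python
# Minimal sketch: register a trained TensorFlow model with the Azure Machine
# Learning service. Assumes azureml-core is installed and a config.json for an
# existing workspace sits in the working directory.
from azureml.core import Workspace
from azureml.core.model import Model

ws = Workspace.from_config()  # loads subscription/resource group/workspace

# "outputs/defect_model.pb" and "defect-classifier" are hypothetical names.
model = Model.register(
    workspace=ws,
    model_path="outputs/defect_model.pb",
    model_name="defect-classifier",
    description="TensorFlow image classification model for defect detection",
)
print(model.name, model.version)
```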

These models can run on CPUs and GPUs, but this preview expands support to field-programmable gate arrays (FPGAs), the processor type on Data Box Edge.

The preview is currently a bit limited but, in this case, you’re able to use the Azure Machine Learning service to train a TensorFlow model for image classification scenarios. You would then containerize that model in a Docker container and deploy it to Data Box Edge through IoT Hub.
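Sketching that containerization step under the same assumptions, using the preview-era ContainerImage API (since superseded by newer packaging APIs); score.py and myenv.yml are hypothetical files you’d supply:

```python
# Sketch: package a registered model as a Docker image with the preview-era
# azureml-core ContainerImage API (later superseded). score.py must define
# init() and run() for inference; myenv.yml declares tensorflow and other deps.
from azureml.core import Workspace
from azureml.core.image import ContainerImage
from azureml.core.model import Model

ws = Workspace.from_config()
model = Model(ws, name="defect-classifier")  # the model registered earlier

image_config = ContainerImage.image_configuration(
    execution_script="score.py",
    runtime="python",
    conda_file="myenv.yml",
)

image = ContainerImage.create(
    name="defect-classifier-image",
    models=[model],
    image_config=image_config,
    workspace=ws,
)
image.wait_for_creation(show_output=True)
# From here, the image would be deployed to the Data Box Edge device as an
# IoT Edge module via IoT Hub.
```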

A good use case for this is using AI models for quality control. Let’s say you know what a finished product should look like and what the quality specs are, and you build a model defining those parameters. Then you take an image of each product as it comes off the assembly line; you can send those images to the Data Box Edge in your environment and catch defects more quickly.

Now you’re finding the root cause of defects faster, throwing away fewer defective products and, as a result, saving money. I’m looking forward to seeing how enterprises leverage this awesome technology.

What Power BI XMLA Endpoints Mean for You

Are you taking advantage of Power BI’s modeling capability? It’s a terrific built-in feature, and many users build models as they begin to design their reports.

However, as we transition Power BI into being our enterprise visualization and reporting tool, some legacy applications just aren’t going to go away. Many of those applications connect to XMLA endpoints from other semantic model providers, such as SQL Server Analysis Services on premises.

So, the Power BI team decided to give you the ability to do the same. A newly announced feature in public preview, XMLA Endpoints for Power BI, exposes Power BI datasets over XMLA. Beyond connecting to those endpoints with other analytical dashboarding tools (like Tableau), you can also connect from any other toolset that supports XMLA.

For example, many of the Microsoft development tools can connect as well – things like SQL Server Management Studio (SSMS), SQL Server Profiler, DAX Studio and even Excel pivot tables. Just keep in mind the endpoints are currently read-only, so some of your capabilities will be limited. Microsoft’s documentation states that a read/write option is planned.
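If you’d rather script against these endpoints than use a GUI tool, here’s a hypothetical sketch using the third-party pyadomd package, which wraps the ADOMD.NET client libraries (so it needs those installed, on Windows). The workspace name, dataset name and DAX query are placeholders:

```python
# Hypothetical sketch: query a Power BI Premium dataset over its XMLA endpoint
# with the third-party pyadomd package. Requires the ADOMD.NET client libraries;
# authentication details are omitted and depend on your environment.
from pyadomd import Pyadomd

# "Sales" (workspace) and "SalesDataset" (dataset) are placeholder names.
conn_str = (
    "Data Source=powerbi://api.powerbi.com/v1.0/myorg/Sales;"
    "Initial Catalog=SalesDataset;"
)

dax_query = "EVALUATE TOPN(10, 'Sales')"  # first 10 rows of a hypothetical table

with Pyadomd(conn_str) as conn:
    with conn.cursor().execute(dax_query) as cur:
        for row in cur.fetchall():
            print(row)
```

Since the preview is read-only, anything beyond these EVALUATE-style queries will have to wait for the read/write option.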

From a licensing perspective, access to XMLA Endpoints is available for datasets in Power BI Premium only, but any user can connect to the endpoints regardless of whether they have a Pro license. The feature itself is turned on within the settings of the Power BI Premium tenant. This is only available in preview at this time, so it’s not supported in production.

Once read/write arrives, tools such as SQL Server Data Tools, where you can work with the model directly, will give you the ability to change your model from Visual Studio. Then you can start to see a scenario where you use GitHub repositories to manage source control on your models, among other endless uses.

I must say I’m impressed with the team’s modular approach to rolling out Power BI as a 100% enterprise-class tool. First we saw the Dataflows feature, which breaks the ETL/ELT work out into its own lane. Then composite models were introduced, where you can combine offline and online data sources. Now we’ve got XMLA Endpoints, which clearly define three separate layers of development within the Power BI ecosystem.

These recently added features have opened some great avenues and broadened user adoption, and I know there are more great things to come. The team’s commitment to evolving and improving this product has been excellent, and I look forward to what they’ll roll out next.

What is Azure Cost Management?

If you’re using Azure, one key piece of running your business is effectively planning and controlling your costs. In this post, I’ll discuss what I believe to be one of the most critical functions in Azure to help you properly deploy and manage your environment.

Azure Cost Management helps you with this planning and cost control. Don’t confuse cost management with billing; they are two different things. Billing is simply retrieving the bill, auditing it for accuracy and paying it. Cost management shows your organization’s cost and usage patterns with advanced analytics.

Reports in cost management show the usage-based costs consumed by Azure services and third-party Marketplace offerings. Costs are based on negotiated prices, and they factor in discounts such as reservations and the Azure Hybrid Benefit.

These reports show all your internal and external usage costs and some of the Azure Marketplace charges. One thing to be aware of when looking at the reports is that some other charges, such as reservation purchases, support and taxes, are not yet shown.

They’ll also help you understand your spending and resource usage, as well as find any spending anomalies. Some predictive analytics is also available to help with future budgeting based on your previous usage trends. On top of this, Azure Cost Management provides management groups, budgets and recommendations that show how your organization’s expenses are organized and how you can reduce costs going forward.
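If you want to pull this cost and usage data programmatically, here’s a sketch against the Cost Management Query REST API; the subscription ID and token are placeholders, and the exact api-version available to you may differ from what’s shown:

```python
# Sketch: month-to-date cost grouped by service, via the Azure Cost Management
# Query REST API. SUBSCRIPTION_ID and TOKEN are placeholders; in practice you
# would acquire the token with azure-identity or the az CLI.
import requests

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder
TOKEN = "<bearer-token>"  # placeholder

url = (
    f"https://management.azure.com/subscriptions/{SUBSCRIPTION_ID}"
    "/providers/Microsoft.CostManagement/query?api-version=2019-11-01"
)

body = {
    "type": "Usage",
    "timeframe": "MonthToDate",
    "dataset": {
        "granularity": "None",
        "aggregation": {"totalCost": {"name": "PreTaxCost", "function": "Sum"}},
        "grouping": [{"type": "Dimension", "name": "ServiceName"}],
    },
}

resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
for row in resp.json()["properties"]["rows"]:
    print(row)  # e.g. [cost, service name, currency]
```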

In addition, Microsoft purchased a company called Cloudyn, a third-party tool that allowed you to manage other cloud providers as well as your Azure services. Since the purchase, Microsoft has been moving those capabilities over to Azure Cost Management and pulling them out of, or ending support for them in, Cloudyn. The Azure Cost Management page gives you guidance as to which service is best for you.

Besides the service offering, Microsoft also publishes some best practices around cost management, built on key principles such as:

  • Planning ahead of deploying any resources, so that the appropriate sizing and services are configured before you ‘jump in the pond’.
  • Visibility gives you alerting and reporting so that all the affected parties are notified appropriately of the services they’re using and the costs they’re incurring.
  • Accountability ensures those groups are accountable for the services they use, understand the implications of what they deploy and get this information in a timely manner so they know what they’re spending.
  • Optimization is the process of right-sizing your deployed resources and looking at where you can save money – for example, appropriate licensing structures, hybrid benefits, bring-your-own-license (BYOL) and reserved instances – as well as reviewing what your compute is actually consuming so you can see where to cut costs (a quick back-of-the-envelope for reservations is sketched below).
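As an illustration of that last point, here’s the reservation arithmetic with made-up numbers (not real Azure pricing):

```python
# Hypothetical back-of-the-envelope: annual savings from a reserved instance.
# The hourly rate and discount are illustrative, not actual Azure pricing.
PAYG_RATE = 0.10             # $/hour, pay-as-you-go (hypothetical)
RESERVATION_DISCOUNT = 0.40  # 40% off with a reservation (hypothetical)
HOURS_PER_YEAR = 24 * 365

payg_cost = PAYG_RATE * HOURS_PER_YEAR
reserved_cost = payg_cost * (1 - RESERVATION_DISCOUNT)
print(f"Pay-as-you-go: ${payg_cost:,.2f}/yr")    # $876.00/yr
print(f"Reserved:      ${reserved_cost:,.2f}/yr")  # $525.60/yr
print(f"Savings:       ${payg_cost - reserved_cost:,.2f}/yr")  # $350.40/yr
```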

Azure Cost Management is a continuous, iterative process, and the key to managing expenses appropriately is keeping an eye on the things I’ve discussed and continually tweaking the services you’re using.