In this video blog post I covered the serving layer step of building your Modern Data Warehouse in Azure. There are certainly some decisions to be made around how to structure your schema as you get it ready for presentation in your business intelligence tool of choice (Power BI in this example), so I discuss some of the areas you should focus on:
What is your schema type? Snowflake, star, or something else?
Where should you serve up the data? SQL Server, Synapse, ADLS, Databricks, or something else?
What are your service-level agreements with the business? What are your data processing times?
Can you save cost by using an option that’s less compute heavy?
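To make the star-schema option above concrete, here is a minimal sketch in Python (all table and column names are invented for illustration) of a fact table resolved against one of its dimension tables:

```python
# Minimal star-schema sketch: one fact table keyed to two dimensions.
# All table and column names are hypothetical examples.
dim_date = {1: {"date": "2019-01-01", "quarter": "Q1"},
            2: {"date": "2019-04-01", "quarter": "Q2"}}
dim_product = {10: {"name": "Widget", "category": "Hardware"}}

fact_sales = [
    {"date_key": 1, "product_key": 10, "amount": 250.0},
    {"date_key": 2, "product_key": 10, "amount": 300.0},
]

def sales_by_quarter(facts, dates):
    """Aggregate fact rows by an attribute resolved from a dimension."""
    totals = {}
    for row in facts:
        quarter = dates[row["date_key"]]["quarter"]
        totals[quarter] = totals.get(quarter, 0.0) + row["amount"]
    return totals

print(sales_by_quarter(fact_sales, dim_date))  # {'Q1': 250.0, 'Q2': 300.0}
```

In a snowflake schema, `dim_product` would itself reference a separate category dimension instead of carrying `category` inline; the star layout keeps dimensions denormalized for simpler BI queries.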
If you’re like most, security is at the forefront of your mind for your organization. You need the right tools and the right team to keep up with an increasing number of sophisticated threats while security teams are being inundated with requests and alerts.
Today I’d like to tell you about Microsoft’s reimagined SIEM tool, Azure Sentinel.
Over the past 10 to 15 years, Security Information and Event Management (SIEM) has become extremely popular as an aggregation solution for security information and events in our networks.
There are also software tools, hardware appliances and managed service providers that can help support your corporate needs to better understand the level of risk in real time and over a span of time. They do things such as log aggregation, event correlation and forensic analysis, and offer features for alerting, dashboarding and compliance.
These are great resources to help secure our environment, users and devices. But unfortunately, the reality is security teams are being inundated with requests and alerts. Combine this with the noteworthy shortage of security professionals in the world – an estimated 3.5 million unfilled security jobs by 2021 – and this is a major challenge.
Microsoft decided to take a different approach with Azure Sentinel. Azure Sentinel provides intelligent security analytics at cloud scale for your entire enterprise. It makes it easy to collect data across your entire hybrid organization on any cloud, from devices to users to applications to servers. Azure Sentinel uses the power of AI to ensure you’re quickly identifying real threats.
With this tool:
You’ll eliminate the burden of traditional SIEMs, as there’s no need to spend time setting up, maintaining and scaling the infrastructure those tools require.
Since it’s built on Azure, it offers virtually limitless cloud scale while addressing all your security needs.
Now let’s talk cost. Traditional SIEMs have proven to be expensive to
own and operate, often requiring you to commit upfront and incur high
cost for infrastructure maintenance and data ingestion. With Sentinel, you pay for what you use with no up-front costs. Even better, because
of Microsoft’s relationships with so many enterprise vendors (and more
partners being added) it easily connects to popular solutions, including Palo Alto Networks, F5 Networks, Symantec and Check Point offerings.
Azure Sentinel integrates with Microsoft Graph Security API,
enabling you to import your own threat intelligence feeds and to
customize threat detection and alert rules. There are custom dashboards that give you a view to allow you to optimize whatever your specific use case is.
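As a rough illustration of what a custom detection rule does (this is not Sentinel’s actual API – the event fields and threshold here are hypothetical), here is a threshold-style alert over sign-in events:

```python
# Sketch of a custom threshold-style detection rule, similar in spirit
# to a Sentinel analytics rule. Event fields and the threshold are
# hypothetical examples, not a real Sentinel schema.
from collections import Counter

def failed_login_alerts(events, threshold=3):
    """Alert on any account with >= threshold failed sign-ins."""
    failures = Counter(e["account"] for e in events
                       if e["type"] == "SigninFailure")
    return [acct for acct, n in failures.items() if n >= threshold]

events = [
    {"account": "alice", "type": "SigninFailure"},
    {"account": "alice", "type": "SigninFailure"},
    {"account": "alice", "type": "SigninFailure"},
    {"account": "bob", "type": "SigninSuccess"},
]
print(failed_login_alerts(events))  # ['alice']
```

In Sentinel itself, rules like this are expressed as scheduled queries over your connected log data, with the AI layer helping to surface the alerts that matter.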
Lastly, if you’d like to try this out for free, Microsoft is
allowing you to connect to your Office 365 tenant to do some testing and
check it out in greater detail. This product is currently in
preview, so there may be some kinks but I’m looking forward to seeing
how it develops in the future, as a true enterprise-class security
solution for your environment, whether in the cloud, on premises, in
data centers or remote users or devices.
Today I’m here with some exciting news out of Microsoft: the public preview of Windows Virtual Desktop. Virtual desktops are not a new invention and are currently offered by multiple vendors.
Windows Virtual Desktop comprises the Windows desktops themselves and the applications you pass out to users; the management solution for these is hosted in Azure. During the public preview, desktops and apps can be deployed on virtual machines in any Azure region in the US, with the management solution and data for these virtual machines residing in the US as well.
As the service moves closer to general availability, Microsoft will start to scale out the management solution and data localization to all Azure regions. Virtual desktops can be deployed with Windows 7 (with extended support) or Windows 10 for the workstation modes, and for the server versions you can run 2012 through 2019.
It provides a full desktop virtualization environment inside your Azure subscription without having to run any additional gateway servers, as you would if you were deploying on other vendors’ platforms or on premises.
With Windows Virtual Desktop you can build custom images or pick from the canned images provided in the Azure gallery. Images can be personalized and remain static with persistent desktops. There are also many configuration options; for instance, you can publish a single application in a server/client-style deployment, or deploy pooled multi-session resources.
Another key point is you’re deploying this with significantly reduced overhead as you no longer need to manage the remote desktop roles like you would with remote desktop services on prem or with some of the other providers. You just have to manage the virtual machines inside of your Azure subscription.
Historically there are many great use cases for virtual desktops in fields like education and healthcare, along with many others.
If you’re using Azure, one key piece is effectively planning and controlling the costs involved in running your business. In this post, I’ll discuss what I believe to be one of the most critical functions in Azure that will help you properly deploy and manage your environment.
Azure Cost Management helps you with this planning and cost control. Don’t confuse cost management with billing; they are two different things. Billing is simply retrieving the bill, auditing it for accuracy and paying it. Cost management shows organizational cost and usage patterns with advanced analytics.
Reports in cost management show the usage-based costs consumed by Azure services and third-party Marketplace offerings. Costs are based on negotiated prices and factor in discounts such as reservations and the Azure Hybrid Benefit.
These reports show all your internal and external costs for usage and some of the Azure Marketplace charges. One thing to be aware of when looking at the reports is that other charges, such as reservation purchases, support and taxes, are not yet shown.
They’ll also help you understand your spending and resource usage, and find any spending anomalies. Predictive analytics are also available to help with future budgeting needs based on your previous usage trends. Using this, Azure Cost Management can apply groups, budgets and recommendations to show how your organization’s expenses are organized and how you can reduce costs going forward.
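As a toy version of the predictive-budgeting idea above (the monthly figures are made up, and real forecasting is more sophisticated), a simple least-squares trend can project next month’s spend from past months:

```python
# Simple linear-trend forecast of next month's spend from past monthly
# costs -- a toy sketch of predictive budgeting, not Azure's model.
def forecast_next(monthly_costs):
    """Fit a least-squares line to the history and project one period ahead."""
    n = len(monthly_costs)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(monthly_costs) / n
    slope = (sum((x - x_mean) * (y - y_mean)
                 for x, y in zip(xs, monthly_costs))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean
    return intercept + slope * n

costs = [100.0, 110.0, 120.0, 130.0]  # hypothetical monthly spend
print(forecast_next(costs))  # 140.0
```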
In addition, in 2018 Microsoft purchased a company called Cloudyn, a third-party tool that allowed you to manage other cloud providers as well as your Azure services. Since the purchase, Microsoft has begun to move some of those services over to Azure Cost Management and pulled them out of, or stopped supporting them on, Cloudyn. The Azure Cost Management page gives you guidance as to which service is best for you.
Besides the service offering, Microsoft also offers some best practices around cost management including some key principles such as:
Planning ahead of deploying any resources so that the appropriate sizing and services are configured before you ‘jump in the pond’.
Visibility, which gives you alerting and reporting so you can ensure all the affected parties are notified appropriately of the services they are using and the costs they’re incurring.
Accountability ensures that groups are accountable for the services they are using, understand the implications of what they deploy, and get this information in a timely manner so they know what they are spending.
Optimization is the process of deploying compute resources while looking at where you can save money: appropriate licensing structures, hybrid benefits, bring-your-own-license (BYOL) and reserved instances, for instance. It also means looking at what your compute is actually consuming so you can see where you may be able to cut costs.
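The optimization principle above comes down to simple arithmetic. As a back-of-the-envelope sketch (the hourly rate and the 40% discount are invented examples, not real Azure prices):

```python
# Back-of-the-envelope comparison of pay-as-you-go vs a reserved-
# instance style discount. All prices and the discount rate are
# made-up examples for illustration only.
def monthly_cost(hourly_rate, hours=730, discount=0.0):
    """Monthly compute cost, optionally with a reservation discount."""
    return hourly_rate * hours * (1 - discount)

payg = monthly_cost(0.10)                      # pay-as-you-go
reserved = monthly_cost(0.10, discount=0.40)   # e.g. a 40% RI discount
print(round(payg - reserved, 2))  # 29.2 saved per VM per month
```

Multiply that per-VM delta across a fleet and the case for reviewing reservations and hybrid benefits makes itself.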
Azure Cost Management is a continuous, iterative process and the key to managing expenses appropriately is being sure you are keeping an eye on these things I’ve discussed and continually tweaking the services that you’re using.
Security is, or should be, a top priority; nothing is more important than making your enterprise secure. In this post I’ll tell you 5 ways Azure makes your enterprise more secure.
First off, Azure is a Microsoft product. When you’re one of the world’s largest companies, there is an enormous number of threats that need to be evaluated every second of the day. So, obviously, Microsoft is aware of these challenges.
With that in mind, Microsoft developed centers of excellence over the past ten years in order to be ready for these attacks.
The Microsoft Threat Intelligence Center processed over 6.5 trillion signals so they could better understand what kind of information and what types of attack vectors there are.
Each month they block over 5 billion distinct malware threats. And they staff over 3,500 security professionals in their defense operations centers to help thwart these attacks. Since Active Directory is a standard for user authentication control, they introduced Azure Active Directory years ago to extend that to the cloud.
All that being said, here are 5 ways that Azure makes your enterprise more secure:
1. Minimize the requirement for password use – By using Microsoft Authenticator and connecting to Software as a Service applications (like Dropbox, Salesforce, etc.), the Authenticator replaces your password with a multi-factor sign-in using something like your phone and your fingerprint, face ID or a PIN, depending on the Windows device you’re using.
With 2-factor authentication on those devices, you have a more simplified method instead of remembering a bunch of different passwords.
2. Security Scorecard – A while back I did a post on the Azure Secure Score and the Secure Score Center.
With this, you use the Azure portal to stay aware of where there are potential exposures or best practices that need to be followed, which helps your organization stay better secured.
3. Microsoft Threat Protection Suite – Helps detect, investigate and remediate issues across your organization, including endpoints, email, documents, identity and infrastructure elements. It also helps your security team automate many of those manual, mundane tasks.
4. Confidentiality – Microsoft was the first cloud vendor to introduce confidential computing, protecting the integrity of data while it’s in use. So, consumers don’t have to worry about their data being put in the wrong hands (like with some of those other cloud vendors you may have heard of recently in the news).
Data is always encrypted at rest and in transit. The security will soon extend to the chip level for added protection on certain Azure VMs. Intel has built security measures into their chips, and Microsoft is going to interact directly with those chips to keep you secure.
5. Microsoft Information Protection Service – This enables you to automatically discover, classify, label, protect and monitor data no matter where it lives or travels across your Microsoft environment.
We’re now seeing many more open source capabilities and seeing more of these applications being sent over to Macs and Linux PCs for instance. Essentially this labeling capability is built into office apps and such across all the major platforms and can add protection capability to things like PDF documents, a feature currently in preview.
But the idea is it’s going to help you protect against things such as PII being exposed. So, it’s an added level of protection to ensure there are no security leaks.
So, it’s clear from all this that Microsoft not only has a commitment to securing their own services and software, but also considers enterprises and individuals of critical importance when it comes to security.
If you’re concerned about security, check out some of the things I mentioned here and remember, Microsoft is making the investment and doing all they can to keep things secure.
What do you know about Azure Automation? In this post, I’ll fill you in on this cool, cloud-based automation service that provides the ability to configure process automation, update management and system configuration, managed across your on-premises resources as well as your Azure cloud-based resources.
Azure Automation provides complete control of deployment operation and decommissions of workloads and resources for your hybrid environment. So, we can have a single pane of glass for managing all our resources through automation.
Some features I’d like to point out are:
It allows you to automate those mundane, error-prone activities that you perform as part of your system configuration and maintenance.
You can create runbooks in PowerShell or Python that help reduce the chance of misconfiguration errors. They’ll also help lower operational costs for maintaining those systems, since you can script tasks to run when you need them instead of performing them manually.
Runbooks can be developed for on-premises or Azure resources, and they use webhooks that allow you to trigger automation from things such as ITSM, DevOps and monitoring systems. So, you can run these remotely and trigger them from wherever you need to.
On the configuration management side, you can build desired state configurations for your enterprise environment. This helps you set a baseline for how your systems should operate and identifies when there’s a variance from the initial system configuration, alerting you to any anomalies that could be problematic.
It has a rich reporting back end and alerting interface for full visibility into what’s happening in your Windows and Linux systems – on-premises and in Azure.
It gives you update management (for Windows and Linux) to help define how updates are applied. It helps administrators specify which updates will be deployed and which should not be deployed to systems, and reports on successful and unsuccessful deployments – all done through PowerShell or Python scripts.
It can share capabilities, so when you’re using multiple resources or building those Runbooks for automation, it allows you to share the resources to simplify management. You can build multiple scripts but use the same resources over and over as references for things like role-based access control, variables, credentials, certificates, connections, schedules and access to source control and PowerShell modules. You can check these in and out of source control like any kind of code-based project.
Lastly, and one of the coolest features in my opinion: since these are templates you’re deploying to your systems, and everyone has similar challenges, there’s a community gallery where you can download templates others have created or upload ones you’ve created to share. With a few basic configuration tweaks and a review to make sure they’re secure, this is a great option for speeding things up by finding an existing script, cleaning it up and deploying it in your environment.
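To sketch the desired-state idea from the configuration management feature above (this is not Azure Automation’s DSC engine – the setting names and values are hypothetical), a drift check can be as simple as comparing a baseline against the actual state:

```python
# Sketch of a desired-state drift check, in the spirit of Azure
# Automation's configuration management. Setting names and values
# are hypothetical examples.
def find_drift(baseline, actual):
    """Return settings whose actual value differs from the baseline."""
    return {key: {"expected": want, "actual": actual.get(key)}
            for key, want in baseline.items()
            if actual.get(key) != want}

baseline = {"ntp_server": "time.contoso.com", "firewall": "enabled"}
actual = {"ntp_server": "time.contoso.com", "firewall": "disabled"}
print(find_drift(baseline, actual))
# {'firewall': {'expected': 'enabled', 'actual': 'disabled'}}
```

A runbook built around a check like this could run on a schedule or via a webhook and raise an alert whenever the result is non-empty.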
So, there’s a lot you can do with this service and I think it’s worth checking out, as it can make your maintenance and management much easier.
The operations required to fix a drill or piece of equipment in the field are much more significant when the failure is unexpected. Shell can use AI to determine when maintenance is required on compressors, valves and other equipment used for oil drilling. This will help reduce unplanned downtime and repair efforts. If they can keep up with maintenance before equipment fails, they can plan downtime and do so at much less cost.
They’ll use AI to help steer the drill bits through shale to find the best quality deposits.
Failures of equipment of great size, such as drilling equipment, can have a lot of related damage and danger. This technology will improve the safety of employees and customers by helping to reduce unexpected failures.
AI enabled drills will help chart a course for the well itself as it’s being drilled, as well as providing constant data from the drill bits on what type of material is being drilled through. The benefits here are 2-fold; they will get data on quality deposits and reduce the wear and tear on the drill. If the drill is using an IoT device to detect a harder material, they’ll have the knowledge to drill in a different area or to figure out the best path to reduce the wear and tear.
It will free up the geologists and engineers to be able to manage more drills at one time, making them more efficient, as well as reactive to deal with problems as they arise while drilling.
As with everything in Azure, this platform is a highly scalable platform that will allow Shell to grow with what is required, plus have the flexibility to take on new workloads. With IoT and AI, these workloads are very easily scaled using Azure as a platform and all the services available with it.
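As a toy illustration of the predictive-maintenance idea in this scenario (the sensor values and tolerance are invented; Shell’s actual models are far more sophisticated), an anomaly check might flag readings that drift from a recent running mean:

```python
# Toy predictive-maintenance check: flag sensor readings that deviate
# from the running mean of recent readings. Values and the tolerance
# are invented for illustration.
def flag_anomalies(readings, window=3, tolerance=0.5):
    """Flag indices where a reading deviates from the mean of the
    previous `window` readings by more than `tolerance` (fractional)."""
    flagged = []
    for i in range(window, len(readings)):
        mean = sum(readings[i - window:i]) / window
        if abs(readings[i] - mean) > tolerance * mean:
            flagged.append(i)
    return flagged

vibration = [1.0, 1.1, 0.9, 1.0, 2.5, 1.0]  # hypothetical vibration levels
print(flag_anomalies(vibration))  # [4]
```

In an IoT deployment, checks like this run against streaming telemetry so a spike in vibration or temperature triggers a maintenance ticket before the equipment actually fails.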
I wanted to share this interesting use case about Shell because it really displays the capabilities of the Azure Platform to solve the mundane and enable the unthinkable.
Are you looking to move large amounts of data into Azure? How does doing it for free sound and with an easier process? Today I’m here to tell you how to do just that with the Azure Data Box.
Picture this: you have a ton of data, let’s say 50 terabytes on-prem, and you need to get that into Azure because you’re going to start doing incremental back ups of a SQL Database, for instance. You have two options to get this done.
The first option is to move the data manually, which means you have to chunk it, set it up using AzCopy or a similar Azure data tool, put it up in Blob storage, then extract it and continue with the process. Sounds pretty painful, right?
Your second option is to use Azure Data Box which allows you to move large chunks of data up into Azure. Here’s how simple it is:
You order the Data Box through Azure (currently available in the US and EU)
Once received, you connect it to your environment however you plan to move that data
It uses standard protocols like SMB and CIFS
You copy the data you want to move and return the Data Box back to Azure and then they will upload the data into your storage container(s)
Once the data is uploaded, they will securely erase that Data Box
With the Data Box you get:
A super tough, hardened box that can withstand drops or water, etc.
It can be pushed into Azure Blob
You can copy data to up to 10 storage accounts
There are two 1 gigabit/second and two 10 gigabit/second connections to allow quick movement of data off your network onto the box
In addition, Microsoft has recently announced the Data Box Disk: small 8 terabyte disks that you can order in packs of up to five.
With Data Box Disk you get:
35 terabytes of usable capacity per order
Supports Azure Blobs
A USB/SATA II and III interface
Uses 128-bit encryption
Like Data Box, it’s a simple process: connect it, unlock it, copy the data onto the disk and send it back, and the data is copied into a single storage account for you
Here comes the best part—while Azure Data Box and Data Box Disk are in Preview, this is a free service. Yes, you heard it right, Microsoft will send you the Data Box or Data Box Disk for free and you can move your data up into Azure for no cost.
Sure, it will cost you money when you buy your storage account and start storing large sums of data, but storage is cheap in Azure, so that won’t break the bank.
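The arithmetic behind the Data Box decision is worth sketching. Assuming illustrative link speeds and an 80% utilization factor (both made-up figures), uploading 50 TB over the wire takes weeks:

```python
# Rough arithmetic for the Data Box decision: how long would 50 TB take
# to upload over a typical link? Speeds and the utilization factor are
# illustrative assumptions, not measurements.
def transfer_days(terabytes, megabits_per_sec, efficiency=0.8):
    """Days to upload at a given link speed and utilization."""
    bits = terabytes * 1e12 * 8
    seconds = bits / (megabits_per_sec * 1e6 * efficiency)
    return seconds / 86400

print(round(transfer_days(50, 100), 1))   # ~57.9 days over 100 Mbps
print(round(transfer_days(50, 1000), 1))  # ~5.8 days over 1 Gbps
```

Against numbers like these, loading the box locally over its 1 and 10 gigabit ports and shipping it for a few days is an easy win.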
In today’s post I’d like to talk about a site-to-site networking service. Azure already has a site-to-site VPN service, but Azure Virtual WAN is a newer service currently in Preview. This networking service is optimized for branch-to-service connectivity and offers the ability to use devices from preferred partners (currently Riverbed and Cisco) or to manually configure this connectivity in your environment.
Azure Virtual WAN has some big differences to consider:
Automated setup and configuration of these devices by preferred partners makes them much easier to configure. You simply set up the connections, which you can export directly from the device into Azure, and it automatically configures everything for you.
It is designed for large scalability and more throughput. The site-to-site VPN service is great for smaller workloads, but this new service opens the pipe and allows data to move much faster.
It’s designed as a hub-and-spoke model, with the hub being Azure and the spokes being your branch offices – all managed within Azure.
Let’s look at the 4 main components of this service:
The Virtual WAN Service itself – This asset is where the resources are collected, and it represents a virtual overlay of the Azure network. Think of it as a top down view of the connectivity between all the components in Azure and in your offices.
A site represents the on-premises VPN device and its settings. I mentioned the preferred devices from Riverbed and Cisco (with more to come); if you’re using a supported device, you can easily drop that configuration into Azure.
The hub is the connection point in Azure for those sites. The site connects to the hub and the virtual WAN is overlooking all of these components.
The hub virtual network connection allows your connection point for your hub to your virtual network.
So, your hub and your virtual network are connected through that virtual network connection. This allows the communication between your virtual networks in Azure and your site to site virtual WAN.
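The four components above can be sketched as a tiny hub-and-spoke model (the names are made up and this is not any Azure API – it only shows the relationships):

```python
# Tiny model of the hub-and-spoke layout: sites connect to a hub, and
# the hub links to virtual networks. All names are hypothetical.
class Hub:
    def __init__(self, region):
        self.region = region
        self.sites = set()   # on-premises VPN sites
        self.vnets = set()   # hub virtual network connections

    def connect_site(self, site):
        self.sites.add(site)

    def connect_vnet(self, vnet):
        self.vnets.add(vnet)

    def can_reach(self, site, vnet):
        """A site reaches a VNet only through this hub."""
        return site in self.sites and vnet in self.vnets

hub = Hub("eastus")
hub.connect_site("branch-chicago")
hub.connect_vnet("vnet-prod")
print(hub.can_reach("branch-chicago", "vnet-prod"))  # True
print(hub.can_reach("branch-denver", "vnet-prod"))   # False
```

The virtual WAN itself would be the overlay holding one or more hubs like this, giving you the top-down view of all branch and VNet connectivity.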
This offering makes the landscape a bit different with how people are doing connectivity into Azure and connecting their remote offices by consolidating what that network looks like, as well as making it easier by offering these preferred devices.
Again, this is still in Preview but definitely something I would suggest checking out.
If you’re like many Azure customers, you’ve been on the lookout for a data catalog and data lineage tool with all the key capabilities you’re looking for. Today, I’d like to tell you more about the Informatica Data Catalog, which was discussed briefly in a previous Azure Every Day post.
The Informatica tool helps you to analyze, consolidate and understand large volumes of metadata in your enterprise. It allows you to extract both physical and business metadata for objects and organize it based on business concepts, as well as view data lineage and relationships for each of those objects.
Sources include databases, data warehouses, business glossaries, data integration and Business Intelligence reports and more – anything data related. The catalog maintains an indexed inventory of all the data objects, or ‘assets’, in your enterprise, such as tables, columns, reports, views and schemas.
Metadata and statistical information in the catalog include things like profile results, as well as info about data domains and data relationships. It’s really the who, what, when, where and how of the data in your enterprise.
The Informatica Data Catalog can be used for tasks such as:
Discover assets by scouring your network or cloud space for assets that aren’t yet cataloged.
View lineage for those assets, as well as relationships between assets.
Enrich assets by tagging them with additional attributes – for example, tagging a specific report as a critical item.
There are lots of useful features in the Data Catalog. Some key ones are:
Data Discovery – Semantic search, dynamic filtering, and views of data lineage and relationships for assets across your enterprise.
Data Classification – Automatically or manually annotate data classifications to help with governance and discovery – who should have access to what data and what does the data contain.
Resource Administration – Like resource, schedule and attribute management, as well as connection or profile configuration management. All the items that surround the data that help you manage the data and the metadata around it.
Create and edit reusable profile definition settings.
Monitor resources and tasks within your environment.
Data domain management, where you can create and edit domains and group together like data and reports.
Assign logical data domains to data groups.
Build composite data domains for management purposes.
Monitor the status of tasks in progress and look at some transformation logic for assets.
On top of this, you can look at how frequently the data is accessed and how valuable it is to your business users, so you can, for instance, trim reports that aren’t being used.
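Several of the catalog ideas above – registering assets, tagging them, and walking upstream lineage – can be sketched in a few lines (the asset names are hypothetical and this is not Informatica’s API, just an illustration of the concepts):

```python
# Minimal in-memory sketch of catalog concepts: register assets, tag
# them, and walk transitive upstream lineage. Names are hypothetical.
class Catalog:
    def __init__(self):
        self.assets = {}  # name -> {"tags": set, "upstream": set}

    def register(self, name, upstream=()):
        self.assets[name] = {"tags": set(), "upstream": set(upstream)}

    def tag(self, name, label):
        self.assets[name]["tags"].add(label)

    def lineage(self, name):
        """All transitive upstream sources of an asset."""
        seen = set()
        stack = list(self.assets[name]["upstream"])
        while stack:
            src = stack.pop()
            if src not in seen:
                seen.add(src)
                stack.extend(self.assets.get(src, {}).get("upstream", ()))
        return seen

cat = Catalog()
cat.register("sales_table")
cat.register("sales_view", upstream=["sales_table"])
cat.register("revenue_report", upstream=["sales_view"])
cat.tag("revenue_report", "critical")
print(sorted(cat.lineage("revenue_report")))  # ['sales_table', 'sales_view']
```

A real catalog adds profiling statistics, scheduling, and access metadata on top of this core graph of assets and relationships.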
When we talk about modern data warehousing in the Azure cloud, this is something we’ve been looking for. It’s a useful and valuable tool for those who want those data governance and lineage tools.