Category Archives: Azure

Azure Cognitive Services

AI solutions are exploding, and Azure has the most complete offering of any cloud provider! Watch this video to get started with our API-based Cognitive Services in Azure and a sample architecture for employing them with the Azure Bot Service. Azure Cognitive Services are cloud-based services with REST APIs and client library SDKs available to help you build cognitive intelligence into your applications.

You can add cognitive features to your applications without having artificial intelligence (AI) or data science skills. Azure Cognitive Services comprise various AI services that enable you to build cognitive solutions that can see, hear, speak, understand, and even make decisions. Azure Bot Service enables you to build intelligent, enterprise-grade bots with ownership and control of your data. Begin with a simple Q&A bot or build a sophisticated virtual assistant.
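As a rough illustration of the client-library route, here is a minimal Python sketch that calls one Cognitive Service (the Text Analytics sentiment API) through its SDK; the endpoint and key are placeholders for your own Cognitive Services resource.

    # Minimal sketch: analyze sentiment with the Text Analytics client library.
    # Replace the endpoint and key with values from your own Cognitive Services resource.
    from azure.ai.textanalytics import TextAnalyticsClient
    from azure.core.credentials import AzureKeyCredential

    endpoint = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
    key = "<your-key>"                                                 # placeholder

    client = TextAnalyticsClient(endpoint=endpoint, credential=AzureKeyCredential(key))

    docs = [
        "The bot answered my question immediately.",
        "I waited an hour and never got a response.",
    ]

    # Each result carries an overall sentiment label plus per-sentence scores.
    for doc, result in zip(docs, client.analyze_sentiment(documents=docs)):
        print(f"{result.sentiment:>8}: {doc}")

The same pattern applies to the other services: swap in the vision, speech or language client library, or call the REST endpoints directly if you'd rather not take an SDK dependency.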

https://docs.microsoft.com/en-us/azure/cognitive-services/what-are-cognitive-services

https://docs.microsoft.com/en-us/azure/cognitive-services/cognitive-services-apis-create-account?tabs=multiservice%2Cwindows

https://dev.botframework.com/

#Azure #AI #CognitiveServices #ArtificialIntelligence #Bots #ReferenceArchitecture #MachineLearning #API #Cloud #Data #DataScience

What is HTAP in Azure

Hybrid Transactional and Analytical Processing, or HTAP, is an advanced database capability that allows transactional and analytical workloads to run against the same data without one impacting the performance of the other.

In this Video Blog, I cover some of the history of HTAP, some of the challenges and benefits of these systems, and where you can find them in Azure.

Overview of Azure Synapse Link featuring CosmosDB

Azure Synapse Link allows you to connect to your transactional system directly to run analytical and machine learning workloads while eliminating the need for ETL/ELT, batch processing and reload wait times.

In this vLog, I explain how to enable the Synapse Link capability in CosmosDB, and what's happening under the covers to give access to that analytical workload without impacting the performance of your transactional processing system.
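As a rough sketch of what that looks like in code (assuming Synapse Link and the analytical store have already been enabled on the Cosmos DB account, and using placeholder account details), a container can be created with its analytical store turned on via the Python SDK:

    # Sketch: create a Cosmos DB container with the analytical store enabled,
    # which is what Synapse Link reads from. Account URL and key are placeholders,
    # and Synapse Link must already be enabled at the account level (portal or CLI).
    from azure.cosmos import CosmosClient, PartitionKey

    client = CosmosClient(
        url="https://<your-account>.documents.azure.com:443/",
        credential="<your-key>",
    )

    database = client.create_database_if_not_exists("retail")
    container = database.create_container_if_not_exists(
        id="orders",
        partition_key=PartitionKey(path="/customerId"),
        analytical_storage_ttl=-1,  # -1 keeps analytical store data indefinitely
    )

From there, the Synapse workspace can link to the account and query that analytical store without touching the request units that serve your transactional traffic.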

Check it out here and let me know what you think!

Getting started with Spark Pools in Azure Synapse

In my latest video blog I discuss getting started with the newly generally available Spark pools in Azure Synapse, another great option for data engineering/preparation, data exploration, and machine learning workloads.

Without going too deep into the history of Apache Spark, I'll start with the basics. Essentially, in the early days of big data workloads (the basis for machine learning and deep learning for advanced analytics and AI), we would use a Hadoop cluster and move all these datasets across disks, but the disks were always the bottleneck in the process. So the creators of Spark said, hey, why don't we do this in memory and remove that bottleneck? They developed Apache Spark as an in-memory data processing engine, a faster way to process these massive datasets.

When the Azure Synapse team set out to offer the best possible data solution for all different kinds of workloads, Spark gave them an option for customers who were already familiar with the Spark environment, so they included it as part of the complete Azure Synapse Analytics offering.

Behind the scenes, the Synapse team is managing many of the components you'd find in open-source Spark, such as:

  • Apache Hadoop YARN – for the management of the clusters where the data is being processed
  • Apache Livy – for job orchestration
  • Anaconda – a package manager, environment manager, Python/R data science distribution and a collection of over 7,500 open-source packages for extending the capabilities of the Spark clusters
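To make the notebook experience concrete, here is roughly what a first cell in a Synapse Spark notebook might look like (the spark session is pre-created for you in a notebook; the storage account, containers and paths below are made up):

    # Read raw CSV files from the workspace's ADLS Gen2 account into a DataFrame,
    # run a simple aggregation in memory, and write the result back as Parquet.
    # Storage account, containers and paths are placeholders.
    df = (
        spark.read
        .option("header", "true")
        .option("inferSchema", "true")
        .csv("abfss://raw@yourdatalake.dfs.core.windows.net/sales/2020/*.csv")
    )

    summary = df.groupBy("region").sum("amount")

    summary.write.mode("overwrite").parquet(
        "abfss://curated@yourdatalake.dfs.core.windows.net/sales_by_region/"
    )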

I hope you enjoy the post. Let me know your thoughts or questions!

Connecting to External Data with Azure Synapse

In my latest video blog I discuss and demonstrate some of the ways to connect to external data in Azure Synapse when there isn't a need to import the data into the database or you want to do some ad-hoc analysis. I also talk about using COPY and CTAS statements if the requirement is to import the data after all. Check it out here!
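As a hedged sketch of the ad-hoc route, this is roughly what querying a file in the data lake through the serverless SQL endpoint looks like from Python; the server name, file path and authentication details are placeholders, and the same OPENROWSET query can be pasted straight into Synapse Studio instead.

    # Query Parquet files in ADLS Gen2 ad hoc through the Synapse serverless SQL endpoint,
    # without importing anything into a database first. Connection details are placeholders
    # and will vary with your driver version and authentication setup.
    import pyodbc

    conn = pyodbc.connect(
        "Driver={ODBC Driver 17 for SQL Server};"
        "Server=yourworkspace-ondemand.sql.azuresynapse.net;"
        "Database=master;"
        "Authentication=ActiveDirectoryInteractive;"
        "UID=you@yourtenant.com;"
    )

    sql = """
    SELECT TOP 10 *
    FROM OPENROWSET(
        BULK 'https://yourdatalake.dfs.core.windows.net/raw/sales/2020/*.parquet',
        FORMAT = 'PARQUET'
    ) AS r;
    """

    for row in conn.execute(sql):
        print(row)

If the data does need to land in a dedicated SQL pool after all, the same file paths feed a COPY INTO or CREATE TABLE AS SELECT statement instead.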

The Modern Data Warehouse in Azure Part 4: The Serving Layer

In this video blog post I covered the serving layer step of building your Modern Data Warehouse in Azure. There are certainly some decisions to be made around how you want to structure your schema as you get it ready for presentation in your business intelligence tool of choice (for this example I used Power BI), so I discuss some of the areas you should focus on:

  • What is your schema type? Star or snowflake, or something else?
  • Where should you serve up the data? SQL Server, Synapse, ADLS, Databricks, or something else?
  • What are your service level agreements for the business? What are your data processing times?
  • Can you save cost by using an option that's less compute-heavy?

Microsoft Reimagines Traditional SIEMs with Azure Sentinel

If you're like most, security is at the forefront of your mind for your organization. You need the right tools and the right team to keep up with an increasing number of sophisticated threats while security teams are inundated with requests and alerts.

Today I’d like to tell you about Microsoft’s reimagined SIEM tool Azure Sentinel. Over the past 10 to 15 years, Security Information and Event Management (SIEM) has become extremely popular as an aggregation solution for security and events that happen in our network.

There are also software tools, hardware appliances and managed service providers that can help support your corporate needs to better understand your level of risk, both in real time and over a span of time. They do things such as log aggregation, event correlation and forensic analysis, and offer features for alerting, dashboarding and compliance checks.

These are great resources to help secure our environment, our users and devices. But unfortunately, the reality is security teams are being inundated with requests and alerts. Compound this with the noteworthy shortage of security professionals in the world – an estimated 3.5 million unfilled security jobs by 2021 – and this is a major concern.

Microsoft decided to take a different approach with Azure Sentinel. Azure Sentinel provides intelligent security analytics at cloud scale for your entire enterprise. It makes it easy to collect data across your entire hybrid organization on any cloud, from devices to users to applications to servers. Azure Sentinel uses the power of AI to ensure you’re quickly identifying real threats.

With this tool:

  • You'll eliminate the burden of traditional SIEMs, as you no longer need to spend time setting up, maintaining and scaling the infrastructure to support them.
  • Since it’s built on Azure, it offers virtually limitless cloud scale while addressing all your security needs.

Now let's talk cost. Traditional SIEMs have proven to be expensive to own and operate, often requiring you to commit upfront and incur high costs for infrastructure maintenance and data ingestion. With Sentinel, you pay for what you use with no up-front costs. Even better, because of Microsoft's relationships with so many enterprise vendors (and more partners being added), it easily connects to popular solutions, including Palo Alto Networks, F5 Networks, Symantec and Check Point offerings.

Azure Sentinel integrates with the Microsoft Graph Security API, enabling you to import your own threat intelligence feeds and to customize threat detection and alert rules. Custom dashboards give you a view you can optimize for whatever your specific use case is.

Lastly, if you'd like to try this out for free, Microsoft is allowing you to connect to your Office 365 tenant to do some testing and check it out in greater detail. This product is currently in preview, so there may be some kinks, but I'm looking forward to seeing how it develops as a true enterprise-class security solution for your environment, whether in the cloud, on premises, in data centers, or for remote users and devices.

Microsoft Announces Windows Virtual Desktop in Azure

Today I'm here with some exciting news out of Microsoft with the public preview of Windows Virtual Desktop. Virtual desktops are not a new invention, and they are currently offered by multiple vendors.

Windows Virtual Desktop comprises the Windows desktops themselves, the applications that you would pass out to users, and the management solution for these, all hosted in Azure. During the public preview, desktops and apps can be deployed on virtual machines in any Azure region in the US, with the management solution and data for these virtual machines residing in the US as well.

As the service moves closer to general availability, Microsoft will start to scale out the management solution and data localization to all Azure regions. Virtual desktops can be deployed with Windows 7 (with extended support) or Windows 10 for the workstation modes, and for the server versions you can run Windows Server 2012 through 2019.

It will provide a full desktop virtualization environment inside your Azure subscription without having to run any additional gateway servers as you would if you were deploying this on other vendors' platforms or in your on-premises environment.

With Windows Virtual Desktop you can build custom images or pick some of the canned images provided in the Azure gallery. Images can be personalized and remain static with persistent desktops. There are also many configuration options; for instance, you can publish a single application in a server/client-type deployment, or you can deploy pooled multi-session resources.

Another key point is you're deploying this with significantly reduced overhead, as you no longer need to manage the Remote Desktop roles like you would with Remote Desktop Services on premises or with some of the other providers. You just have to manage the virtual machines inside your Azure subscription.

Historically, there have been many great use cases for virtual desktops in areas like education and healthcare, along with many others.

Accelerate Your AI with Machine Learning on Azure Data Box Edge

In some past blogs I've discussed Azure Data Box and how the Data Box family has expanded. Today I'll talk about Azure Data Box Edge (in preview) and elaborate on the machine learning service that it provides on your premises with the power of Azure behind it.

If you don't know, Azure Data Box Edge is a physical hardware device that sits in your environment and collects data from sources like IoT devices and other systems where you might take advantage of the AI features offered by the device. It then sends the data to Azure for further processing, storage or reporting purposes.

Microsoft recently announced Azure Machine Learning hardware accelerated models, provided by Project Brainwave, on the Data Box Edge. Because so much of our data in real-world applications is created and used at the edge of our networks – like images and videos collected from factories, retail stores or hospitals – it can now be used for things such as manufacturing defect analysis, out-of-stock inventory detection or diagnostics.

By applying machine learning models to the data on Data Box Edge, it provides lower latency (and savings on bandwidth cost), as we don't have to send all the data to Azure for analysis. But it still offers that real-time insight and speed to action for critical business decisions.

You can enable data scientists to simplify and accelerate the building, training and deployment of machine learning models using the Azure Machine Learning service, which is already generally available. They can access all these capabilities in their favorite Python environment, using the latest open source frameworks such as PyTorch, TensorFlow and scikit-learn.

These models can run on CPUs and GPUs, but this preview expands that out to field-programmable gate arrays (FPGAs), which is the kind of processor on the Data Box Edge.

The preview is currently a bit limited but, in this case, you're able to enhance the Azure Machine Learning service by training a TensorFlow model for image classification scenarios. So, you would containerize that model in a Docker container and then deploy it to the Data Box Edge via IoT Hub.
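Purely as an illustrative sketch of the Azure Machine Learning side of that workflow (workspace details and paths are placeholders, and the container packaging and IoT Edge deployment are separate steps), registering the trained TensorFlow model looks roughly like this:

    # Register a trained TensorFlow image-classification model with the Azure Machine
    # Learning service so it can later be packaged into a container image and deployed
    # to the edge device. Workspace details and the model path are placeholders.
    from azureml.core import Workspace
    from azureml.core.model import Model

    ws = Workspace.get(
        name="your-aml-workspace",
        subscription_id="<subscription-id>",
        resource_group="your-resource-group",
    )

    model = Model.register(
        workspace=ws,
        model_path="outputs/defect_classifier",  # local folder containing the saved model
        model_name="defect-classifier",
        tags={"framework": "tensorflow", "scenario": "image-classification"},
    )

    print(model.name, model.version)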

A good use case for this is if you’re using AI models for quality control purposes. Let’s say you know what a finished product should look like and what the quality specs are, and you build a model defining those parameters. Then you take an image of that product as it comes off the assembly line; now you can send those images to the Data Box Edge in your environment and more quickly capture defects.

Now you're finding the root cause of defects more quickly, throwing away fewer defective products and, therefore, saving money. I'm looking forward to seeing how enterprises are going to leverage this awesome technology.

What Power BI XMLA Endpoints mean for you

Are you taking advantage of Power BI's modeling capability? This is a terrific built-in capability, and many users build models as they begin to design their reports.

However, as we transition Power BI into being our enterprise visualization and reporting tool, some of the legacy applications just aren't going to go away. Many of those applications connect to XMLA endpoints from other semantic model providers, such as Analysis Services on your on-premises SQL Server.

So, the Power BI team decided to give you the ability to do the same. There is a newly announced feature in public preview called XMLA Endpoints for Power BI. Beyond the ability to connect to the XMLA endpoints with other analytical dashboarding tools (like Tableau), you can also connect to it from other toolsets that support XMLA.

For example, many of the Microsoft development tools give you the ability to connect as well: things like SQL Server Management Studio (SSMS), SQL Server Profiler, DAX Studio and even Excel pivot tables. Just keep in mind it's currently only available for read access, so some of your capabilities will be limited. Microsoft documentation on this feature states that they plan to offer a read/write option soon.

From a licensing perspective, access to XMLA Endpoints is available for datasets in Power BI Premium only, but any user can connect to the endpoints regardless of whether they have a Pro license. The feature itself is turned on within the settings of the Power BI Premium tenant. This is only available in preview at this time, so it’s not supported in production.

In this scenario, other tools such as SQL Server Data Tools, where you can work with the model, give you the ability to change your model with Visual Studio. Now you can start to see a scenario where you use GitHub repositories to manage source control on your models, among other endless uses.

I must say I'm impressed with the modular approach of the team to roll out Power BI as a 100% enterprise-class tool. First, we saw the Data Flows feature, which extracts the ETL/ELT into its own lane. Then things like composite models were introduced, where you can have offline and online data sources. Now we've got XMLA Endpoints, which clearly define three separate layers of development within the Power BI ecosystem.

These recently added features have opened some great avenues and spread user adoption and I know there are more great things to come. The commitment of the team to evolve and improve this product has been excellent and I look forward to what they’ll roll out next.