Accelerate Your AI with Machine Learning on Azure Data Box Edge

In some past blogs I’ve discussed Azure Data Box and how the Data Box family has expanded. Today I’ll talk about Azure Data Box Edge (in preview) and elaborate on the machine learning capabilities it brings to your premises with the power of Azure behind it.

If you’re not familiar with it, Azure Data Box Edge is a physical hardware appliance that sits in your environment and collects data from local sources, such as IoT devices, so you can take advantage of the AI features offered by the device. It then sends that data on to Azure for further processing, storage, or reporting.

Microsoft recently announced Azure Machine Learning hardware-accelerated models, powered by Project Brainwave, on Data Box Edge. Because so much real-world data is generated and consumed at the edge of our networks – such as images and videos collected from factories, retail stores, or hospitals – it can now be used for things like manufacturing defect analysis or out-of-stock detection in inventory.

Applying machine learning models to the data on Data Box Edge lowers latency (and saves on bandwidth cost) because we don’t have to send all the data to Azure for analysis, while still delivering real-time insight and speed to action for critical business decisions.

Data scientists can simplify and accelerate building, training, and deploying machine learning models using the Azure Machine Learning service, which is already generally available. They can access all these capabilities from their favorite Python environment, using the latest open-source frameworks such as PyTorch, TensorFlow, and scikit-learn.
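To give a sense of that workflow, here is a minimal sketch of training a model locally and registering it with an Azure Machine Learning workspace. It assumes the azureml-core and scikit-learn packages are installed and that a config.json for your workspace is available; the dataset and model name are just examples, not part of the announcement.

```python
# Minimal sketch: train a scikit-learn model locally and register it with an
# Azure Machine Learning workspace. Assumes config.json exists for the workspace
# and that azureml-core and scikit-learn are installed; "demo-classifier" is a
# made-up model name used purely for illustration.
import joblib
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from azureml.core import Workspace
from azureml.core.model import Model

# Train a simple classifier locally.
X, y = load_iris(return_X_y=True)
clf = LogisticRegression(max_iter=200).fit(X, y)
joblib.dump(clf, "model.joblib")

# Connect to the workspace (reads config.json) and register the trained model
# so it can later be deployed from Azure Machine Learning.
ws = Workspace.from_config()
Model.register(workspace=ws, model_path="model.joblib", model_name="demo-classifier")
```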

These models can run on CPUs and GPUs, but this preview extends support to field-programmable gate arrays (FPGAs), which is the processor used on Data Box Edge.

The preview is currently a bit limited: for now, you use the Azure Machine Learning service to train a TensorFlow model for image classification scenarios, package that model in a Docker container, and then deploy it to Data Box Edge through IoT Hub.
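As a rough sketch of what that first step looks like, the snippet below trains a small TensorFlow image classifier and saves it to disk so it could be packaged into a container. The dataset, layer sizes, and output path are placeholders for illustration, not the specific model the preview ships.

```python
# Sketch of a small TensorFlow image-classification model. Fashion-MNIST, the
# layer sizes, and the output path are stand-ins chosen to keep the example
# self-contained and quick to run.
import tensorflow as tf

(x_train, y_train), _ = tf.keras.datasets.fashion_mnist.load_data()
x_train = x_train[..., None] / 255.0  # normalize pixels and add a channel axis

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, batch_size=64)

# Save the trained model so it can be packaged into a Docker image and
# deployed to the device through IoT Hub.
model.save("image_classifier")
```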

A good use case for this is using AI models for quality control. Let’s say you know what a finished product should look like and what the quality specs are, and you build a model defining those parameters. Then you take images of the product as it comes off the assembly line; you can send those images to the Data Box Edge in your environment and catch defects more quickly, as in the sketch below.
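Purely as an illustration, and assuming the containerized model exposes an HTTP scoring endpoint on the device (the URL and payload format here are hypothetical and depend on how the container is built), sending a product image for local scoring might look like this:

```python
# Hedged sketch of scoring a product image against a model running locally on
# the device. The endpoint URL, port, and response shape are hypothetical; the
# real scoring contract depends on how the container is built and deployed.
import requests

with open("assembly_line_photo.jpg", "rb") as f:
    image_bytes = f.read()

response = requests.post(
    "http://data-box-edge.local:5001/score",  # hypothetical local endpoint
    data=image_bytes,
    headers={"Content-Type": "application/octet-stream"},
)
response.raise_for_status()
print(response.json())  # e.g. predicted class or defect probability
```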

Now you’re finding the root cause of defects faster, throwing away fewer defective products, and therefore saving money. I’m looking forward to seeing how enterprises are going to leverage this awesome technology.
