Category Archives: Azure

Hybrid Identity Management with Azure Active Directory

With everything organizations need to manage identity for – on-premises environments, mobile devices, laptops and other managed devices, plus internal Active Directory systems – identity management is becoming increasingly difficult. We are in a new mobile-first, cloud-first world.

Here are a few stats to think about:

  • 63% of confirmed data breaches involve weak, default or stolen passwords
  • More than 80% of employees admit to using non-approved SaaS applications in their jobs
  • As we are trying to manage all this, IT budgets are barely growing – we’re seeing less than 1% growth year over year

In reality, those Software as a Service (SaaS) apps integrate nicely and help users be more efficient, but we must be able to manage all of those identities. Consider a user who comes into your environment, uses all kinds of web applications with a separate account for each, and perhaps has access to a corporate credit card. When that person leaves the company or gets let go, it’s difficult to track down all of those accounts if they are individually managed.

With Azure Active Directory, you can manage thousands of apps with one identity, enable business without borders and manage access at scale, plus you’re offering cloud-powered protection. With Azure AD at the core of your business, you are enabling identity as the control plane.

So, how does this look?

    • Starting from your current on-premises environment, you’ll want to link up with all of those cloud applications (Azure, SaaS, Office 365, any public cloud).
    • In between sits Azure Active Directory, which you can easily sync back with your on-premises directory and then tie into all of those SaaS applications.
    • This allows you to offer self-service, single sign-on to your users for all of those apps, plus any internal on-premises systems they use with usernames and passwords.
    • Everything is synchronized across the landscape, and you can extend that out to your customers and partners as well.
    • This is a powerful way to enable your workforce, as well as to bring in your customers and partners when you want them to have access to certain areas.

Simply put: thousands of apps with one identity, with single sign-on to any app through Microsoft Azure Active Directory. To take it one step further, if you want to move any of your VMs into Azure or any of your services into a PaaS solution, you already have that integration, and with Azure AD Domain Services you can make that lift and shift much easier.
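To make the “one identity, many apps” idea concrete, here is a minimal sketch of an application trusting Azure AD as its identity provider, using the MSAL library for Python. The tenant ID, client ID and secret are placeholders you would get from a hypothetical app registration in your own directory.

    # pip install msal
    import msal

    # Placeholder values from a hypothetical Azure AD app registration.
    app = msal.ConfidentialClientApplication(
        client_id="<application-client-id>",
        client_credential="<client-secret>",
        authority="https://login.microsoftonline.com/<tenant-id>",
    )

    # Ask Azure AD for a token the app can use to call Microsoft Graph.
    result = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

    if "access_token" in result:
        print("Token acquired; present it as a Bearer token to the API.")
    else:
        print("Token request failed:", result.get("error_description"))

Any SaaS or line-of-business app registered to trust the same tenant gives your users that same single sign-on experience.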

 

An Overview of Azure File Sync

I have a question… Who is still using a file server? No need to answer, I know that most of us still are and need to use them for various reasons. We love them—well, we also hate them, as they are a pain to manage.

The pains with Windows File Server:

  • They never seem to have enough storage.
  • They never seem to be properly cleaned up; users don’t delete the files they’re supposed to.
  • The data never seems accessible when and where you need it.

In this blog, I’d like to walk you through Azure File Sync, so you can see for yourself how much better it is.

    • Let’s say I have a file server in my Seattle headquarters and that file server begins having problems; maybe I’m running out of space, for example.
    • I decide to hook this server up to a file share in Azure.
    • I can set up cloud tiering with a threshold (say 50%), so that once the server goes beyond that threshold, files start moving up into Azure.
    • When that threshold is hit, the oldest files are tiered first and appear grayed out to users. The files are still there and visible, but their contents have been pushed off to the cloud, so that space has now been freed up on the file server.
    • If users ever need those files, they can click on them and redownload.
    • Now, let’s say I want to bring on another server at a branch office. I can simply bring up that server and synchronize the branch office with those same files in Azure.
    • From here, I can hook up my SMB and NFS shares for my users and applications, as well as my work folders, using multi-site sync. All my files are synchronized, and I get direct cloud access to them (see the sketch after this list).
    • I can also hook up my IaaS and PaaS solutions to access these files over the REST API or SMB shares.
    • With everything synchronized, I’m able to have a rapid file server disaster/data recovery. If my server in Seattle goes down, I simply remove it; my files are already up in Azure.
    • I bring on a new server and sync it back to Azure. My folders start to populate, and as they get used, people download the files back and the tiering rules that were set up keep applying.
    • The great thing is it can be used with Windows Server 2012 R2, as well as Windows Server 2016.
    • Now I have an all-encompassing solution (with integrated cloud backup within Azure) with better availability, better DR capability and essentially bottomless storage. The share gets backed up automatically to an Azure Backup vault, and that storage is very cheap.
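As a rough illustration of that direct cloud access, here is a minimal sketch using the azure-storage-file-share package to list and download files straight from the Azure file share that Azure File Sync keeps in sync. The connection string, share name and file path are hypothetical.

    # pip install azure-storage-file-share
    from azure.storage.fileshare import ShareClient, ShareFileClient

    conn_str = "<storage-account-connection-string>"  # hypothetical; copied from the storage account
    share_name = "corp-files"                         # hypothetical share acting as the sync endpoint

    # List the contents of the share, the same namespace the file servers sync against.
    share = ShareClient.from_connection_string(conn_str, share_name=share_name)
    for item in share.list_directories_and_files():
        print(item.name)

    # Download one file directly from the cloud endpoint, no file server required.
    file_client = ShareFileClient.from_connection_string(
        conn_str, share_name=share_name, file_path="reports/q1-summary.xlsx")
    with open("q1-summary.xlsx", "wb") as handle:
        handle.write(file_client.download_file().readall())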

With Azure File Sync I get:

1. A centralized file service in Azure storage.

2. Cache in multiple locations for fast, local performance.

3. Cloud-based backup and fast data/disaster recovery.

3 Power BI Offerings to Consider…

I’m often asked by clients which Power BI offering is best for their business and where they should store their data. The three main Power BI offerings each have their strong points and areas where they excel. It comes down to understanding what each offers to decide the best fit for your organization’s data and needs.


Overview and Benefits of Azure Cognitive Services

With Artificial Intelligence and Machine Learning, the possibilities for your applications are endless. Would you like to be able to infuse your apps, websites and bots with intelligent algorithms to see, hear, speak, understand and interpret your user needs through natural methods of communication, all without having any data science expertise?


What is Azure Cosmos DB?

Are you familiar with Azure Cosmos DB? Cosmos DB is Microsoft’s globally distributed, multi-model database. With the click of a button, it allows you to elastically and independently scale throughput and storage across any number of Azure’s geographic regions, so you can put the data where your customers are.

Cosmos DB has purpose-built APIs that let you work with a multitude of data models, like SQL, MongoDB and Azure Tables, as well as offering five consistency models. It offers comprehensive Service Level Agreements (SLAs) with money-back guarantees for availability (99.99% to be exact), latency, consistency and throughput; a big deal when you need to serve your customers at optimum performance.

Cosmos DB is a great option for many different use cases:

  • Companies doing IoT and telematics. Cosmos DB can ingest huge bursts of data, and process and analyze that data in near real time. Then it will automatically archive all the data it ingests.
  • Retail and marketing. Take an auto parts product catalog, for example, with tons of parts in the catalog, each with its own properties (some unique and some shared across parts). The next year, new vehicles or new part models come out, with some similar and some different properties. All that data adds up very quickly. Cosmos DB offers a very flexible schema in a hierarchical structure that can easily absorb the data as things change (see the sketch after this list).
  • Gaming industry. Games like Halo 5 by Microsoft are built on Cosmos DB because they need performance that scales quickly and dynamically. You get things like millisecond read times, which avoid any lag in game play. You can index player-related data, and a social graph is easily implemented with a flexible schema for the social aspects of gaming.
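To show what that flexible schema looks like in practice, here is a minimal sketch using the azure-cosmos Python SDK. The account endpoint, key, database, container and item properties are all hypothetical.

    # pip install azure-cosmos
    from azure.cosmos import CosmosClient, PartitionKey

    # Hypothetical endpoint and key from a Cosmos DB account.
    client = CosmosClient("https://<your-account>.documents.azure.com:443/", credential="<primary-key>")
    database = client.create_database_if_not_exists("catalog")
    container = database.create_container_if_not_exists(id="parts", partition_key=PartitionKey(path="/vehicle"))

    # Two parts with different shapes; no schema change is needed when new properties appear.
    container.upsert_item({"id": "brake-pad-001", "vehicle": "sedan-2018", "material": "ceramic"})
    container.upsert_item({"id": "headlamp-204", "vehicle": "truck-2019", "lumens": 1400, "led": True})

    # Query by the shared property; per-item differences don't matter.
    for item in container.query_items(
            query="SELECT c.id, c.vehicle FROM c WHERE c.vehicle = @v",
            parameters=[{"name": "@v", "value": "truck-2019"}],
            enable_cross_partition_query=True):
        print(item)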

Azure Cosmos DB ensures that your data gets there and gets there fast, with a wealth of features and benefits to make your life easier. And it’s easy to set up and manage.

 

Overview of Azure Databricks

I’d like to tell you about Azure Databricks. If you don’t know what that is, Azure Databricks provides an end-to-end, managed Apache Spark platform optimized for the cloud. It’s a fast, easy and collaborative analytics platform designed to help bridge the gap between data scientists, data engineers and business decision-makers using the power of Databricks on Azure.

Azure Databricks uses Microsoft Azure Active Directory as its security infrastructure and it’s optimized for ease of use, as well as ease of deployment within Azure. It features optimized connectors to Azure storage platforms (e.g. Data Lake and Blob Storage) for the fastest possible data access, and one-click management directly from the Azure console.

Some key features are:

Auto-scaling – This feature makes scaling much quicker and allows you to scale up or down as you need.

Auto-termination – Helps you control the costs of your compute time, as well as helping you prevent cost overruns (a concern for many cloud users).

Notebook Platform – The notebook platform supports standard languages (SQL, Python and R, for example) and builds a whole discussion environment around those notebooks, enhancing collaboration amongst teams.

Here are some simple steps to get you started (a short notebook sketch follows the list):

  • First, you’re going to prepare your data by ingesting it from your Azure storage platform, which has native support with Azure Databricks.
  • Next, you’re going to do any kind of transformation you need on your ingested data and store it in a Data Warehouse.
  • From here, you’ll want to start performing analytics on your data. These platforms are built for lots of data, giving you the capability to explore large data sets quickly and in real time.
  • Lastly, you’re going to display the data. Databricks has native support for tools like Power BI to build your dashboards and analytics models.
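Pulling those steps together, here is a minimal notebook sketch in PySpark, the kind of cell you might run on an Azure Databricks cluster. The storage paths and column names are hypothetical.

    # A hypothetical Azure Databricks notebook cell; `spark` is provided by the cluster.
    # The paths assume raw data has already landed in mounted Azure storage.
    raw = (spark.read
           .option("header", "true")
           .option("inferSchema", "true")
           .csv("/mnt/rawdata/sales/*.csv"))

    # Transform: keep completed orders and aggregate revenue per region.
    summary = (raw.filter(raw.status == "completed")
                  .groupBy("region")
                  .agg({"amount": "sum"})
                  .withColumnRenamed("sum(amount)", "total_revenue"))

    # Persist the result where downstream tools such as Power BI can pick it up.
    summary.write.mode("overwrite").parquet("/mnt/curated/sales_by_region")

    # Databricks' built-in notebook visualization.
    display(summary)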

So, Azure Databricks provides an end-to-end data solution. You can quickly spin up a cluster or do advanced analytics with this powerful platform. And with it, you can create and monitor robust pipelines that will help you dig deep and better understand your data, allowing you to make better business decisions.

3 Reasons Why You Should Move Your Business to the Cloud

Cyber security is on everyone’s mind these days and it can be a challenge for many organizations. If this sounds like you and you haven’t moved to the cloud, it’s something you should think about. I’d like to tell you why you should move your business to the cloud and why it could be more secure there.

1.  When you’re in the cloud business, having a secure cloud drives more business. That’s why cloud companies are willing to invest more to hire the best and brightest. So, the top security people in the world are going to the top cloud companies in the world.

2.  When moving to the cloud, the customer typically only has to focus on one aspect of security because the rest is already taken care of; you’re secure by default. You’d have to intentionally unlock something to make yourself less secure.

3.  Regulatory and certification requirements are more easily satisfied. With a foundation in place that’s already secure and certified, it allows you to focus on your app or infrastructure or whatever requirements you need to satisfy those regulatory compliance issues.

So, make this your year to move to the cloud and take some of the cyber security challenges off your mind.

Why Your Infrastructure Belongs in the Cloud

You haven’t moved to the cloud yet? In this Azure Every Day installment, I’d like to tell you the top 5 reasons why you may want to move your infrastructure to the cloud.

1. Cost – Many people can take advantage of operational cost savings by not having to invest in a bunch of hardware that sits unused. In the cloud, you only pay for what you use.

2.  Business Continuity – With the cloud, you get better, more dependable uptime without having to worry about in-house appliances, infrastructure or servers. You also get easier administration. The cloud locations in Azure are set up so you can easily maintain and migrate your systems. And there’s no need for a second data center, giving you high availability as well as more cost savings.

3.  Agility – You don’t have to spend money having something running all the time. It’s easy to spin up and spin down as you need it. You also have the ability to scale at an exponential rate. You can start small, but quickly build in traffic or performance capabilities or whatever you need.

4.  Management and Maintenance – You can drastically reduce the time needed to maintain and manage your environment, as well as have one central area for monitoring and maintaining your systems. You’ll save time otherwise spent running backups and maintaining servers.

5.  Improved Security – Cloud providers have it in their best interest to be secure. There are over 300,000 open security jobs in the US alone. Where do you think those people want to work when there are top-quality companies paying top dollar? You guessed it – cloud companies.

Most Important Components of Azure Data Factory

Are you new to Azure and unsure what Azure Data Factory is? Azure Data Factory is Microsoft’s cloud version of an ETL or ELT tool that helps you get your data from one place to another and transform it along the way. Today, I’d like to tell you about the high-level components within Azure Data Factory, with a short sketch after the list of how they fit together. These components pull together a data factory that helps your data flow from its source to an end product ready for consumption.

  • Pipeline – A pipeline is a logical grouping of activities that performs a unit of work. For example, you might copy on-premises data from a data source to the cloud (Azure Data Lake, for instance), then run it through an HDInsight Hadoop cluster for further processing and analysis, and finally put it into a reporting area. The activities are contained inside the pipeline and chained together to create a sequence of events, depending on your specific requirements.
  • Linked Service – This is very similar to the concept of a connection string in SQL Server; it defines the connection to the source or destination of your data.
  • Trigger – A trigger is a unit of processing that determines when a pipeline needs to be run. Triggers can be scheduled or set off (triggered) by a different event.
  • Parameter – Essentially, read-only values defined inside a pipeline that are filled in by arguments passed when the pipeline runs, for example to say which dataset or linked service to use.
  • Control Flow – The control flow in a data factory is what orchestrates how the pipeline is sequenced. This includes activities you’ll be performing with those pipelines, such as sequencing, branching and looping.
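As a rough sketch of how these components fit together, here is how a pipeline with one Copy activity and a parameter might be created with the azure-mgmt-datafactory management SDK. The resource group, factory, dataset names and parameter are hypothetical, and exact model constructors can vary between SDK versions.

    # pip install azure-identity azure-mgmt-datafactory
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import (
        PipelineResource, CopyActivity, DatasetReference,
        BlobSource, BlobSink, ParameterSpecification,
    )

    adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

    # One Copy activity: read from a source dataset and write to a sink dataset.
    copy = CopyActivity(
        name="CopySourceToSink",
        inputs=[DatasetReference(type="DatasetReference", reference_name="SourceBlobDataset")],
        outputs=[DatasetReference(type="DatasetReference", reference_name="SinkBlobDataset")],
        source=BlobSource(),
        sink=BlobSink(),
    )

    # The pipeline groups activities and declares a parameter that callers (or triggers) pass in.
    pipeline = PipelineResource(
        activities=[copy],
        parameters={"windowStart": ParameterSpecification(type="String")},
    )
    adf.pipelines.create_or_update("<resource-group>", "<factory-name>", "SalesPipeline", pipeline)

    # Kick off a run, supplying an argument for the parameter.
    adf.pipelines.create_run("<resource-group>", "<factory-name>", "SalesPipeline",
                             parameters={"windowStart": "2020-01-01T00:00:00Z"})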

What is Internet of Things (IoT) and Why It Matters to IT

The Internet of Things (IoT) has become a growing topic both inside and outside of the workplace. It has the ability to change how we live and how we work. Many are already on board with IoT, and global IoT revenues are projected to reach over a trillion dollars by 2020. If you’re not there yet, I’d like to talk today about what IoT is and how it’s being used.

The Internet of Things, or IoT, refers to devices that used to be stand-alone but are now connected to the internet. Consumer devices include Google Home, Alexa, smart watches, Fitbits and home thermostats. These products are already changing the way their owners live.

From a business standpoint, with the help of services like Azure IoT Hub, this gets much bigger, with a much larger impact on how people work. Large engines on trains and planes, for example, have millions of components being monitored all the time, producing real-time statistics about what is happening on those devices.

Chevron is using Microsoft Azure IoT Hub on the back end as they build out their IoT infrastructure for monitoring their oil and deep-well drilling equipment. John Deere is mounting IoT devices on their equipment that track things such as where, how far apart and how deep seeds are being planted, so farmers can get planting information right from these devices.
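To give a feel for how a device feeds readings into Azure IoT Hub, here is a minimal sketch using the azure-iot-device Python SDK. The connection string, device name and telemetry fields are hypothetical.

    # pip install azure-iot-device
    import json
    import time
    from azure.iot.device import IoTHubDeviceClient, Message

    # Hypothetical device connection string from an IoT Hub device registration.
    conn_str = "HostName=<your-hub>.azure-devices.net;DeviceId=planter-01;SharedAccessKey=<key>"
    client = IoTHubDeviceClient.create_from_connection_string(conn_str)

    # Send a few telemetry readings, e.g. seed depth and spacing from a planter.
    for reading in range(3):
        payload = {"seed_depth_cm": 4.2, "row_spacing_cm": 76, "reading": reading}
        client.send_message(Message(json.dumps(payload)))
        time.sleep(1)

    client.disconnect()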

Yes, IoT is a big deal, and it’s helping manufacturers and other companies, as well as consumers, by expanding the capabilities of the things that are, or can be, connected to the internet.