Category Archives: Review

Overview and Benefits of Azure Cognitive Services

With Artificial Intelligence and Machine Learning, the possibilities for your applications are endless. Would you like to be able to infuse your apps, websites and bots with intelligent algorithms that see, hear, speak, understand and interpret your users’ needs through natural methods of communication, all without having any data science expertise?


Overview of Azure Databricks

I’d like to tell you about Azure Databricks. If you’re not familiar with it, Azure Databricks is an end-to-end, managed Apache Spark platform optimized for the cloud. It’s a fast, easy and collaborative analytics platform designed to help bridge the gap between data scientists, data engineers and business decision-makers using the power of Databricks on Azure.

Azure Databricks uses Microsoft Azure Active Directory as its security infrastructure and it’s optimized for ease of use, as well as ease of deployment within Azure. It features optimized connectors to Azure storage platforms (e.g. Data Lake and Blob Storage) for the fastest possible data access, and one-click management directly from the Azure console.
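
As a concrete illustration of those connectors, here is a minimal sketch, written as it would appear in a Databricks notebook, of mounting an Azure Blob Storage container and reading from it with Spark. The storage account, container, secret scope and file names are placeholder values, not details from this post.

```python
# Minimal sketch (Databricks notebook): mount an Azure Blob Storage container so its
# files can be read like a local path. Account, container and secret names are placeholders.
storage_account = "mystorageaccount"   # hypothetical storage account name
container = "raw-data"                 # hypothetical container name

dbutils.fs.mount(
    source=f"wasbs://{container}@{storage_account}.blob.core.windows.net",
    mount_point="/mnt/raw-data",
    extra_configs={
        f"fs.azure.account.key.{storage_account}.blob.core.windows.net":
            dbutils.secrets.get(scope="my-scope", key="storage-key")  # hypothetical secret scope/key
    },
)

# Once mounted, Spark reads the files directly from the mount point.
df = spark.read.option("header", "true").csv("/mnt/raw-data/sales.csv")
display(df)
```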

Some key features are (a short cluster configuration sketch follows the list):

Auto-scaling – Makes scaling much quicker and allows clusters to scale up or down automatically as your workload demands.

Auto-termination – Shuts down idle clusters to help you control the costs of your compute time and prevent cost overruns (a concern for many cloud users).

Notebook Platform – Supports standard languages (SQL, Python and R, for example) and builds a whole discussion environment around those languages, enhancing collaboration among teams.
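
To make the auto-scaling and auto-termination features concrete, here is a minimal sketch of creating a cluster through the Databricks REST API with both enabled. The workspace URL, access token, runtime version and node size are placeholder values assumed for illustration, not details from this post.

```python
# Minimal sketch: create a Databricks cluster with auto-scaling and auto-termination
# via the REST API (POST /api/2.0/clusters/create). All values are placeholders.
import requests

workspace_url = "https://<your-workspace>.azuredatabricks.net"  # hypothetical workspace URL
token = "<personal-access-token>"                               # hypothetical access token

cluster_spec = {
    "cluster_name": "demo-cluster",
    "spark_version": "5.3.x-scala2.11",                 # example runtime version
    "node_type_id": "Standard_DS3_v2",                  # example Azure VM size
    "autoscale": {"min_workers": 2, "max_workers": 8},  # scale workers up/down with load
    "autotermination_minutes": 30,                      # shut the cluster down after 30 idle minutes
}

resp = requests.post(
    f"{workspace_url}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {token}"},
    json=cluster_spec,
)
resp.raise_for_status()
print(resp.json())  # the response includes the new cluster_id
```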

Here are some simple steps to get you started:

  • First, you’re going to prepare your data by ingesting it from your Azure storage platform, which is natively supported by Azure Databricks.
  • Next, you’re going to do whatever transformations you need on your ingested data and store it in a Data Warehouse (a rough sketch of this step appears after the list).
  • From here, you’ll want to start performing analytics on your data. These platforms are built for large volumes of data, so you’ll be able to explore large data sets quickly and in real time.
  • Lastly, you’re going to visualize the data. Databricks has native support for tools like Power BI, so you can build your dashboards and analytics models.
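
As a rough sketch of the ingest-transform-store steps above, the following Databricks notebook snippet reads the raw data mounted earlier, aggregates it, and writes the curated result out for downstream analytics and dashboards such as Power BI. The column names and paths are assumptions for illustration only.

```python
# Minimal sketch (Databricks notebook): transform ingested data and persist the result.
# Column names and paths are placeholders.
from pyspark.sql import functions as F

raw = spark.read.option("header", "true").csv("/mnt/raw-data/sales.csv")

# Example transformation: total sales per region per day.
daily_sales = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .groupBy("region", "order_date")
       .agg(F.sum("amount").alias("total_sales"))
)

# Persist the curated result. Writing to Parquet keeps the sketch simple; in practice you
# might write to your data warehouse (for example over JDBC) so Power BI can report on it.
daily_sales.write.mode("overwrite").parquet("/mnt/curated/daily_sales")
```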

So, Azure Databricks provides an end-to-end data solution. You can quickly spin up a cluster or do advanced analytics with this powerful platform. And with it, you can create and monitor robust pipelines that will help you dig deep and better understand your data, allowing you to make better business decisions.

Top 5 Takeaways from the Microsoft ICA Boot Camp

I recently attended the Microsoft International Cloud Architect Boot Camp, where I had the opportunity to participate in hands-on sessions, working closely with Microsoft teams and specialists, as well as other Microsoft Partners. The boot camp contained exclusive content that Pragmatic Works gets access to as a partner and preferred service provider within the Microsoft stack.

Here, I’d like to share my top 5 takeaways from this event:

1. Commitment to Security – As a cloud solution architect, I’m asked many questions about security and Microsoft Azure. One thing that amazed me was the commitment Microsoft has made to security. They spend over a billion dollars each year on security to protect against threats. Microsoft is also the #1 attack surface in the world. They are truly committed to making sure that your data and services are secure.

2. Security Certifications – Microsoft has passed over 70 regulatory and government certifications when it comes to security and standardized processes. Their second-place competitor, AWS, has only completed 44 of these certifications. Getting these certifications and adhering to certain security and regulatory standards can be expensive, but there is a significant benefit for enterprise, government and small/medium-sized businesses.

3. Right-sizing Their Environment – This can be a challenge for many companies. Microsoft’s internal teams have moved completely to Azure and manage their platforms there for SQL databases, virtual machines and all the other services Azure offers. By doing some specific right-sizing and keeping watch on what’s offered, they reduced their workloads and kept CPU utilization at the 95th percentile, and more importantly, they were able to cut spending on their internal needs – to the tune of over 3 million dollars a month!

4. Differentiators from AWS – AWS is currently the #1 cloud platform in terms of revenue and volume, but Microsoft is quickly catching up and has identified several differentiators from AWS. Some key differentiators, such as Azure Recovery Zones and other services that have been slow to arrive, are expected to reach general availability by the end of 2018. Microsoft does not see any other differentiators that would allow AWS to continue to hold that lead.

5. Connections/Partnerships – Having Office 365, Dynamics 365, and Skype and LinkedIn connections, as well as the commitments to partners and ISVs, gives Microsoft a competitive advantage over AWS in terms of its ecosystem. A common complaint is that AWS doesn’t work well with, or cater to, partners, leaving them to figure things out themselves.

Azure Site Recovery in 3 Minutes

Having a data/disaster recovery site or plan in place is crucial today, whether for compliance or to ensure that, if anything does happen, your business can still operate. Today’s episode of Azure Every Day focuses on Azure Site Recovery.

Azure Site Recovery is Microsoft’s business continuity and disaster recovery service. With this service, you can move your VMs to the cloud, back them up or replicate site to site. To use this service, you’ll need to coordinate and set up replication between the sites and/or servers.

You have some options for how to do this. You can replicate an Azure VM to another Azure VM in a different geographic region, or replicate a physical server, VMware infrastructure or Hyper-V environment up to Azure. Physical and VMware replication is real-time, whereas with Hyper-V you can get the replication window down to about 30 seconds.

Azure Site Recovery has many great features, such as:
Application Awareness – It knows what you’re running (e.g. SharePoint, SQL, Exchange, Active Directory, etc.). Because of this, it’s able to easily stop a workload in one location and start it in another in the event of a disaster.
Region to Region Replication – If you want to take your replication from the East Coast to the West Coast, this is built into the service, so it’s easily done.
Encryption – From a security standpoint, it supports encryption at rest and encryption in transit. This is extremely helpful when you’re backing up from one Azure Virtual Machine to another, or from your local VMware infrastructure to Azure. This will all be encryption in transit, as well as at rest, when it lands in Azure.

Some other key features are the auto failover and auto failback capabilities, as well as continuous replication, so your RTO and RPO are easily met on this platform. You can also run automated recovery scenarios, so you can test your disaster recovery plan without any impact on your environment.
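
To make the RTO/RPO point concrete, here is a tiny sketch that compares hypothetical recovery targets against a replication interval and a measured test failover time. All of the numbers are made up for illustration.

```python
# Tiny sketch: checking hypothetical recovery targets. All figures are assumptions.
from datetime import timedelta

replication_interval = timedelta(seconds=30)  # e.g. the Hyper-V replication window mentioned above
rpo_target = timedelta(minutes=5)             # max acceptable data loss (assumed target)
rto_target = timedelta(minutes=60)            # max acceptable downtime (assumed target)
measured_failover = timedelta(minutes=20)     # time observed during a test failover (assumed)

# Worst-case data loss is roughly one replication interval.
print("RPO met:", replication_interval <= rpo_target)
print("RTO met:", measured_failover <= rto_target)
```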

Pragmatic Works SSRS Master Class Review

A little while back, the boys from Pragmatic Works (www.pragmaticworks.com) came up to Boston for their “Master” level training series on SSRS. I attended the class, and wanted to share my experience so other interested people can get a preview of what to expect from taking a PW class.

History…

So, up front, I want to be honest about the fact that I am somewhat biased about the training services provided by PW due to my own past experiences taking their classes. They have a plethora of offerings related to the SQL Server stack and data topics. I have attended several of their virtual and on-site classes in the past, and am always pleased with the course material, humorous interjections, and interesting nuggets they provide.

Content…

Ok, now that the free advertisement is over, let’s get to the meat and potatoes of the class.  Taught by Devin Knight (@knight_devin) and Mike Davis (@MikeDavisSQL), the focus of the class was to look at some of the deeper components of SSRS and how to really expand its capabilities beyond the “canned” options and features.  One thing I really like about the presentation style of the courses, beyond the humor, is that they talk about best practices, give demonstrations and examples of them, then follow up with tips and tricks to get around some of the nuances of the technology being taught.  This course focused on several areas of particular interest to me, and some others that don’t apply to my situation but were still good for informational purposes.  The areas covered within SSRS were:

    • Good Report Design
    • Custom Code
    • Reporting from Cubes
    • Utilizing Reports
    • Subscriptions
    • IDEs
    • Configuration and Security

There was also a section on Power View.

Takeaways…

Overall, I thought the delivery of the class and the ability of the presenters to break up the material to keep it interesting were very good.  Devin and Mike clearly know their stuff, very obviously love doing it, and enjoy sharing their insight and the various “Microsoftisms” that can occur.  Below is a bit of detail on what worked and what didn’t work as well for me about the class specifically.

Good stuff:

  • The examples of making numbers more readable by highlighting values that fall within certain ranges – in-report KPIs, essentially – were very helpful.
  • Getting into some of the deeper security and configuration topics offered some different techniques on establishing better security alternatives.
  • The modules on linking and mapping within reports were good to have in this course, as those areas have caused some headaches for me and my peers in the past.

Neutral stuff:

  • Personally, I don’t do a whole lot with SSAS and Cubes, as most of my work revolves around SSIS and SSRS, but there were some interesting nuggets revealed and “aha” moments, as this appears to be somewhat of a tricky subject.

Not as useful stuff:

  • Given the class is at the “Master” level, I’m not sure an entire module and lab needed to be dedicated to Good Report Design.  My assumption is that the majority of the people in the class were there because they have a fair amount of experience delivering reports and were looking for ways to better extend them.  Maybe instead of a full module, poor report design could be used as a “pop quiz” throughout the presentation to break up the monotony and add some humor.
  • I found the preparation materials for the class to be lacking a bit.  Many of the students in the class (I believe there were around 75 of us) did not have their environments set up correctly on the first day, so Mike and Devin spent much of the day running from person to person to make sure the configuration was correct.  I especially remember this being the case for the SSAS and Cube sections.

Conclusion…

I’ve been a big fan of Pragmatic Works products and training for a while now and recommend them to anyone who is looking to brush up on their SQL skills, or who might have a need for their suite of tools.  I found this class to be helpful and was able to use some of the topics covered almost immediately after taking the class.

Take 1: My First Presentation at SQL Saturday

I recently presented at my first SQL Saturday event, SQL Saturday #334 – the Boston BI session, and wanted to share my experience for future first-timers in the hopes it might help them with their presentations.  Special thanks to Mike Hillwig (@mikehillwig or http://mikehillwig.com/) for giving me a shot at this great event.  I did a fair amount of preparation leading up to the event in order not to be a total flop, and I was able to speak at a local user group a couple of months before, which helped immensely.  I’m a member of the SeacoastSQL User Group (http://seacoastsql.org/) out of Portsmouth, NH, and got some great guidance and feedback from Jack Corbett (@unclebiguns) after my first demo.  The group is co-run by Mike Walsh (@mike_walsh) and is regularly attended by 10-15 members at our monthly meetings.  Below, I’ll take readers through the process I followed to try to improve my presentation skills, share the finished product, and note the elements of the presentation I believe I can improve on, and hope to, for any upcoming SQL Saturday events.

Preparation:

Pick a technology:

In order to prepare for my session, I took some time to think about what SQL technology I had the most experience with, and wanted to give an overview about.  I have seen some phenomenally brilliant people in a specific technology completely flop when trying to explain that technology, or freeze up when getting in front of a crowd, so I really wanted to make sure it was something I was very comfortable with.  When choosing my topic, SSRS, I decided it would be good to give an overview of the technology, as well as some “best practice” items for attendees to ponder as they walked away.  Also, I wanted to choose something I was very familiar with in case something went wrong with my demo, and I needed to adjust on the fly in order to keep things from becoming awkward.

Brush up on those speaking skills:

I’ve always been relatively comfortable being in front of a crowd and have loved the opportunity for good discussion.  There seems to be a general mix of people who like speaking and those who don’t.  Being in front of a crowd of your peers should be something to get excited about, and in order to build the community and our knowledge, everyone should try it at least once.  Talking about tech can get a bit boring and tedious at times, so keeping the overview light and throwing some “softball” questions out to the audience to keep them engaged were things I focused on when building my presentation.  For those who aren’t comfortable, start with a small group to get feedback, or even just record a session so you can play back your voice and notice anything you’re doing that might annoy people.

For my presentation, I was showing a slideshow and a demo all in the same hour-long session.

I’ve actually seen 3 different types of speakers:

  • All presenting with slides and examples
  • Some presenting and some demo
  • All demo

It’s really up to you what you want to do, and what you think will deliver an effective session to the audience.  My topic required some demonstration, and at the same time, gave me the opportunity to instill some methods and best practices for success.

Create a Script:

Some of the best advice I read and received while preparing for the session was to create a script with some easy-to-reference queries for the coding elements of the presentation.  The other piece of advice I picked up was to avoid typing during a demo: copy and paste code wherever possible in order to avoid errors and delays.

Build your presentation:

I started with an overview of my background as well as the topics I wanted to cover.  Some people are better at reading from cue cards, but I’m more of an “off the cuff” speaker, so I just jotted down some notes I wanted to highlight, in a basic order, to go along with a PowerPoint presentation.  Rules for successful PowerPoint design – how much content to include, bullet points, static text, and ways to keep people interested – are posted all around the web, so do some reading on how to make the presentation flow cleanly.

Practice the presentation:

The old saying is that practice makes perfect, and nothing could be closer to the truth.  I did a dry run about six times to get a sense of how long the whole presentation would take and to commit the order of topics to memory.  From there, I recorded a session using Camtasia Studio and sent it to a few friends to critique.  I knew I would be presenting for about an hour, including questions, so I made sure to leave time for interruptions, system stalls, and anything that might slow me down a bit.  My dry runs were taking about 45 minutes, and the actual presentation took 1 hour and 1 minute, so I was pleased with the timing.

Feedback:

When I reviewed the comments from the session evaluations, there was a mix of people who came to get introduced to the technology, and people who were refreshing their skills from some time ago.  Most people felt that they walked away having learned something, which means I succeeded in my mission. On a 1-5 scoring system, 5 being the best, I received many 4’s and 5’s, and a few 3’s, so it would seem people were pretty pleased with the topic and presentation.

Next time:

One of the things I learned from this presentation is that you can expect a wide range of questions from people, both on topic and off. I found myself spending time on questions that weren’t necessarily relevant to the conversation, so be aware of your audience and do your best to filter off-topic questions without being rude to the questioner.

SQL Saturday events are for learning and networking, so if you find someone is showing interest in your topic, and/or somewhat jumping in and answering questions directed at you, I would suggest engaging that person after your session is over. This is a good opportunity for you to possibly learn more about the topic, or have a resource to rely on when you might be running into issues with a project.