For many years, Microsoft offered a simple set of Business Intelligence (BI) tools. That all-purpose suite featured a relational database, an ETL (extract, transform, and load) tool, an OLAP (online analytical processing) database, and a reporting tool. There were a handful of additions over the years, but the core was this simple, monolithic toolset. One person could master it and deliver world-class BI solutions. Today’s technologists have a wealth of options at their disposal. These abundant new offerings hold exciting possibilities, allowing enterprising solution architects to tailor each design to today’s business demands.

Flexible Offerings Replace the Multi-Tool Approach

When formulating their Azure strategy, Microsoft took a different approach. I think they found inspiration in the established trades – builders, plumbers, mechanics, and electricians. Each tradesperson relies on a toolbox. Take the mechanic’s toolbox, for example: a huge chest, with drawer after drawer of instruments. One of those drawers will hold dozens of wrenches and sockets of varying sizes – the opposite of a one-size-fits-all tool. Each instrument accomplishes a single, specific task. The chest will also hold plenty of specialized tools that aren’t used often, but that can transform a few nearly impossible feats into easily managed tasks. On any given job, a mechanic might use a dozen or more of them. Microsoft’s rich cloud assortment gives technologists the same capability – offering tools to design tailored solutions for complex needs.


The Microsoft Azure and Power Platform ecosystems – the data equivalent of the mechanic’s toolbox – feature a vast array of options compared to the on-premises offerings. Microsoft’s database offerings in Azure are a great example. Rather than featuring a single Azure database that does everything, the company offers multiple products designed to excel at different workloads. There’s the traditional relational database in Azure SQL Database (and Azure SQL Managed Instance), or the massively parallel processing (MPP) version in Azure Synapse. Cosmos DB and the Databricks or Hadoop-based offerings are some of the more exotic options. Then there are the OLAP databases with Azure Analysis Services and Power BI. The story is the same with ETL: there’s Azure Data Factory, Power Automate (formerly Microsoft Flow), and Databricks. Other categories are rich with choices, too.

Superior Instruments Afford Sophisticated Solutions

These Azure tools all offer more refined, specialized abilities than their on-premises kin. Continuing in the database vein, Azure SQL Database is the familiar relational SQL Server engine delivered as a cloud service. While it works for many types of workloads, it best suits OLTP (online transaction processing) workloads. If you are working with a massive data warehouse, you would probably look at Azure Synapse (formerly Azure SQL Data Warehouse) instead. If you need document storage, you would look at Cosmos DB. Designing a solution around the workload is nothing new to solution architects. In fact, veterans of on-premises architecture would probably ask, “Which tool should I be using: Azure SQL Database or Azure Synapse?” But if we find ourselves framing this as an either/or question, we need to take a step back. In the cloud, the solution might need both tools to operate best. Architects should instead be asking, “Which suite of technologies can work together to handle our needs?”
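
To make the workload-fit idea concrete, here is a minimal Python sketch contrasting the two access patterns: an OLTP-style insert into Azure SQL Database through pyodbc, and the equivalent “save” as a schema-free document upsert into Cosmos DB through the azure-cosmos SDK. The server names, credentials, and table/container names are hypothetical placeholders, not a recommended design.

```python
# Illustrative only: servers, credentials, and object names are placeholders.
import pyodbc
from azure.cosmos import CosmosClient

# OLTP workload: a classic relational insert into Azure SQL Database.
sql_conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;"
    "DATABASE=ordersdb;UID=appuser;PWD=<secret>"
)
cursor = sql_conn.cursor()
cursor.execute(
    "INSERT INTO dbo.Orders (OrderId, CustomerId, Total) VALUES (?, ?, ?)",
    1001, 42, 99.95,
)
sql_conn.commit()

# Document workload: the equivalent "save" in Cosmos DB is an upsert of a
# free-form JSON document - no fixed schema required.
cosmos = CosmosClient("https://myaccount.documents.azure.com:443/", credential="<key>")
container = cosmos.get_database_client("appdb").get_container_client("orders")
container.upsert_item({
    "id": "1001",                         # Cosmos DB requires a string id
    "customerId": 42,
    "items": [{"sku": "A-7", "qty": 3}],  # nested structures are natural here
})
```

Neither call pattern is better in the abstract; each is simply the native idiom of the engine built for that workload, which is the whole point of picking the right drawer in the toolbox.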

Let’s get a little more concrete with an example. Suppose we are creating a high-volume data warehouse in Azure. We’ve estimated that the data volume we need to support would work well with Synapse. But there’s a problem: we have lots of control and audit data that doesn’t work well in a massively parallel environment. It’s small, but it’s important to the success of the project. When we architects ask ourselves, “Which tool should I use?”, we might land on Azure SQL Database in the Hyperscale tier. It scales up and scales out, and it isn’t bound by the storage limits of the other Azure SQL Database tiers. But is it the right solution? It may be, but the more experienced cloud architects among us will have an “Aha!” moment and ask, “Why not both?” Our scenario might allow the control and audit data to be separated from the bulk of the data. If so, it may make more sense to store the small data in the standard SQL engine and load the bulk dimension and fact data into Synapse, as sketched below. Then we can play each database engine to its strengths, rather than handicapping our solution simply to dump all our data into a single bucket.
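
Here is a minimal sketch of that split, assuming the bulk data has already been staged as Parquet files in blob storage; every server, database, table, and path below is a hypothetical placeholder. The small control and audit rows go to the standard Azure SQL engine one statement at a time, while the fact data is loaded into the Synapse dedicated SQL pool with a single set-based COPY INTO.

```python
# Illustrative sketch only: all names and paths are placeholders for the
# "why not both?" pattern described above.
import pyodbc

AZURE_SQL = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=control-srv.database.windows.net;DATABASE=etlcontrol;"
    "UID=loader;PWD=<secret>"
)
SYNAPSE = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=dw-ws.sql.azuresynapse.net;DATABASE=salesdw;"
    "UID=loader;PWD=<secret>"
)

# 1. Small-but-critical control/audit rows: a natural fit for the
#    standard relational engine in Azure SQL Database.
ctl = pyodbc.connect(AZURE_SQL)
cur = ctl.cursor()
cur.execute(
    "INSERT INTO audit.LoadRun (RunId, SourceSystem, Status) VALUES (?, ?, ?)",
    "run-2024-001", "pos", "started",
)
ctl.commit()

# 2. Bulk fact data: already staged as Parquet in blob storage, loaded into
#    the Synapse dedicated SQL pool with a set-based COPY statement -
#    the kind of work the MPP engine is built for.
dw = pyodbc.connect(SYNAPSE, autocommit=True)
dw.cursor().execute(
    """
    COPY INTO dbo.FactSales
    FROM 'https://stagingacct.blob.core.windows.net/landing/factsales/*.parquet'
    WITH (FILE_TYPE = 'PARQUET', CREDENTIAL = (IDENTITY = 'Managed Identity'))
    """
)

# 3. Close the loop: record completion back in the control database.
cur.execute("UPDATE audit.LoadRun SET Status = ? WHERE RunId = ?", "done", "run-2024-001")
ctl.commit()
```

The point isn’t the specific SDK calls; it’s the shape of the solution, where each engine receives only the workload it was built for.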

Complex Data Requires Creative Solutions

The flexibility of Azure is what allows this new way of approaching a problem. Because the primary cloud structures are virtualized, we can now mix and match technologies in a way that would be prohibitively expensive, and difficult to integrate, in an on-premises environment. Play with the building blocks. Mix and match. Embrace the diversity that the cloud enables.

Contact us today if you have any questions or would like to discuss your organization’s needs.