Bimodal analytics: What modern DataOps teams need to know when choosing their analytics platform mix

The most recent Gartner Magic Quadrant (MQ) for analytics and business intelligence (BI) platforms, released in early 2019, does what all Gartner MQs do best: rank the market leaders in their respective areas of IT focus. The analytics and BI edition of the MQ not only ranks 21 vendors on completeness of vision and ability to execute, it also details the evolution of modern analytics platforms. This article won’t explore the rankings of the 2019 edition (you can review the Magic Quadrant yourself) but will instead delve into the bimodal analytics model proposed within the report.

We’ll touch on some of the analytics platforms on either side of this bimodal model, and what BI and DataOps teams need to know when leveraging both traditional and experimental analytics technologies in their organisations. We’ll also explore how to implement this modern mix of legacy and newcomer analytics platforms without compromising the query performance, agility, and security of your IT team’s existing analytics structure.

What is the bimodal analytics model?

Gartner’s framing of the business intelligence market differs from that of other analyst firms in that it splits IT processes into two modes: traditional and forward-thinking.

Specifically, Gartner’s definition of bimodal is the following: “Bimodal IT refers to having two modes of IT, each designed to develop and deliver information- and technology-intensive services in its own way. Mode 1 is traditional, emphasising scalability, efficiency, safety and accuracy. Mode 2 is nonsequential, emphasising agility and speed.

“To put it another way, ‘Type 1 is traditional IT, focused on stability and efficiency, while Type 2 is an experimental, agile organisation focused on time-to-market, rapid application evolution, and, in particular, tight alignment with business units.’”

Translated to analytics specifically, Mode 1 comprises traditional BI processes such as preparing data, scheduling and distributing reports, and querying multiple tables and complex schemas in a single data source. Mode 2 covers modern processes such as visualisation, agility, and self-service BI.

Both modes are here to stay

Many BI teams have Tableau as an analytics tool in their stack - it’s ubiquitous in both smaller businesses and the enterprise. Along with Microsoft, which Gartner deems a leader in the space, this shows that a traditional approach to analytics (closer to Mode 1, that is) still reigns supreme in the enterprise. Salesforce certainly thought so with its gigantic $15.7 billion purchase of Tableau earlier this year.

At the same time, upstart and more experimental companies like Looker, touting a ‘Mode 2’ analytics platform, are quickly becoming the go-to choice for many enterprises. In fact, 73 per cent of Looker’s reference customers report Looker being their only enterprise analytics and BI platform. Google made its view of Looker’s potential apparent with its $2.6 billion purchase of the company earlier this year. The bimodal approach to analytics is clearly no Gartner hype term, with the market shares of both traditional and experimental analytics platforms holding steady. Gartner, too, deems a combination of Mode 1 and Mode 2 technologies the ideal end state for IT.

There is deep value in having a mix of both technologies as a cornerstone of your enterprise analytics strategy - neither is going away, and each mode of analytics process has its own distinct value for an enterprise. The challenge lies in what BI and DataOps teams need to consider when leveraging both technologies, and how this mix of tools can be maintained and optimised without choking the query performance, agility, and security of your IT team’s existing analytics infrastructure.

Optimising every analytics investment with an adaptive analytics fabric

An adaptive analytics fabric is a new way to enable agile analytics through the combination of intelligent data virtualisation and autonomous data engineering. It gives BI users uniform, shared access to data, whether that data is stored in Mode 1 analytics platforms such as a data warehouse or within Mode 2 technologies such as a data lake.

An adaptive analytics fabric is completely agnostic to both the analytical tools used and the location of data sources, providing users with uniform, fast, transparent, and secure access to data from any tool they choose. There are three main areas where adaptive analytics fabric efforts should be focused to maximise the benefits of your entire mix of analytics tools across the enterprise: performance, agility, and security.
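The tool- and location-agnostic access described above can be illustrated with a minimal Python sketch. The `VirtualCatalog` class, the `register`/`read` interface, and the dict-backed "warehouse" and "lake" are all hypothetical stand-ins invented for illustration, not any vendor's actual API; the point is simply that one facade resolves a dataset name regardless of which backend holds it.

```python
# Toy sketch of data virtualisation: a single facade exposes
# datasets from different backends (a "warehouse" and a "lake",
# both faked here as plain dicts) through one uniform interface.

class VirtualCatalog:
    def __init__(self):
        self.sources = {}

    def register(self, name, source):
        # source maps dataset name -> rows
        self.sources[name] = source

    def read(self, dataset):
        # The caller never says (or knows) which backend holds the data.
        for source in self.sources.values():
            if dataset in source:
                return source[dataset]
        raise KeyError(dataset)

catalog = VirtualCatalog()
catalog.register("warehouse", {"orders": [{"id": 1}]})
catalog.register("lake", {"clicks": [{"id": 2}]})
print(catalog.read("orders"))  # served from the warehouse
print(catalog.read("clicks"))  # served from the lake
```

A real fabric would also push queries down to each backend rather than fetching rows wholesale, but the uniform-access contract is the same.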

Enhanced query performance

Legacy analytics systems can be slow when querying billions of records. Acceleration technologies apply machine learning as queries run inside the adaptive analytics fabric, determining the optimal path to satisfy each query and making future queries faster and less resource-intensive. The resulting time savings (performance gains from 2x up to as much as 1,000x) get BI users the data they need sooner, regardless of which end of the bimodal analytics spectrum a BI tool lies on.
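One common form this acceleration takes is learning which aggregations are requested repeatedly and materialising them so later queries skip the full scan. The sketch below is a deliberately simple illustration of that idea, not the fabric's actual engineering: the `QueryAccelerator` class, the hit-count threshold, and the dict-of-rows "fact table" are all assumptions made for the example.

```python
from collections import Counter

class QueryAccelerator:
    """Toy aggregate-acceleration sketch: once a group-by query has
    been seen `threshold` times, its result is materialised so later
    runs hit a small pre-computed table instead of scanning all rows."""

    def __init__(self, rows, threshold=2):
        self.rows = rows              # list of dicts (the "fact table")
        self.threshold = threshold    # materialise after this many hits
        self.hits = Counter()         # how often each query shape was asked
        self.aggregates = {}          # (group_key, measure) -> result

    def _scan(self, group_key, measure):
        agg = {}
        for row in self.rows:
            agg[row[group_key]] = agg.get(row[group_key], 0) + row[measure]
        return agg

    def query(self, group_key, measure):
        cache_key = (group_key, measure)
        if cache_key in self.aggregates:          # fast path: pre-aggregated
            return self.aggregates[cache_key]
        result = self._scan(group_key, measure)   # slow path: full scan
        self.hits[cache_key] += 1
        if self.hits[cache_key] >= self.threshold:
            self.aggregates[cache_key] = result   # materialise for next time
        return result

rows = [
    {"region": "EMEA", "sales": 100},
    {"region": "EMEA", "sales": 50},
    {"region": "APAC", "sales": 75},
]
acc = QueryAccelerator(rows)
print(acc.query("region", "sales"))   # first run scans all rows
print(acc.query("region", "sales"))   # second run triggers materialisation
```

Production systems use far richer signals than a hit counter (query cost, data freshness, storage budget), but the routing decision, serve from an aggregate when one exists, otherwise scan and learn, is the core idea.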

Agile analysis for all BI & AI users

BI teams, data scientists, analysts - and really, any business user - need to get the same answers to the same questions regardless of the tool they use, whether Power BI, Tableau, or Excel today, or a new tool in the future. However, different BI tools speak varying dialects of query language, making it difficult to ensure business users receive consistent answers across tools. By applying a “universal semantic layer”, queries are normalised and standardised in the adaptive analytics fabric. This layer should support any BI tool, regardless of the access protocol required, whether MDX, SQL, JDBC, ODBC, or REST API.

Preserving security rules for all data

A critical consideration when choosing the analytics platform mix is security. Enterprises should seek a universal approach that preserves the security policies of each constituent data source and seamlessly merges them into a single combined view, reflecting all data security policies across the enterprise.
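The "preserve and merge" behaviour can be sketched as each source contributing its own row-level filter, with the combined view admitting only rows every applicable policy allows. The policy functions, the user and row shapes, and the source names below are all invented for illustration.

```python
# Toy sketch of merging per-source security policies: each data
# source keeps its own row-level rule, and the fabric's combined
# view exposes a row only if every applicable policy allows it.

policies = {
    # Hypothetical rules: warehouse restricts by region,
    # lake restricts personally identifiable information (PII).
    "warehouse": lambda user, row: row["region"] in user["regions"],
    "lake":      lambda user, row: not row.get("pii") or user["pii_ok"],
}

def visible(user, row, sources):
    # A row survives only if all source policies permit it.
    return all(policies[s](user, row) for s in sources)

user = {"regions": {"EMEA"}, "pii_ok": False}
rows = [
    {"region": "EMEA", "pii": False},
    {"region": "APAC", "pii": False},   # blocked by the warehouse rule
    {"region": "EMEA", "pii": True},    # blocked by the lake rule
]
allowed = [r for r in rows if visible(user, r, ["warehouse", "lake"])]
print(allowed)   # only the EMEA, non-PII row survives
```

The key property is that neither source's policy had to be rewritten or weakened: the merged view is simply the intersection of what each source would allow on its own.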

Adaptive analytics fabric for a cohesive analytics mix

According to Forbes, more than 60 per cent of enterprises use two or more BI tools, and this mix frequently includes both Mode 1 and Mode 2 analytics approaches. Organisations need to find a way to tie in all aspects of their analytics infrastructure across the enterprise, without compromising query performance, agility, and security. By employing adaptive analytics fabrics, enterprises can have the best of bimodal analytics in their organisation - the ideal mix, according to Gartner - and give their BI, data scientists and analysts one single, secure, and accessible source of truth.

Dave Mariani, VP of Technology, AtScale

Dave is a co-founder of AtScale and its VP of Technology. Prior to AtScale, he was VP of Engineering at Klout and at Yahoo!, where he built the world's largest multi-dimensional cube for BI on Hadoop. Mariani is a big data visionary and serial entrepreneur.