2017 will see AI and Intelligent Systems help enterprises accelerate towards a 21st-century analytics mindset, away from the 1980s style that continues to pervade enterprises today.
Analytics operations reach enterprise mainstream
Data analytics, AI and Intelligent Systems (operational systems that include predictive analytics) have largely been talked about in technology terms to date. Mu Sigma’s Zubin Dowlaty sees 2017 as the year when these systems will become more widely and systematically deployed across the enterprise, with computing conversations in corporate IT shifting from technology discussions to use case implementations.
Mu Sigma’s recent ‘State of Analytics and Decision Sciences’ report showed that while two-thirds of senior decision makers recognise the positive impact analytics can have on business growth, many companies are failing to manage and exploit analytics effectively. For Intelligent Systems, AI and data analytics generally to make the desired business impact, companies need to shift their focus away from the technology itself and toward applying it to business issues.
“The technology is quite sophisticated now, so Chief Data Officers and CIOs need to stop deliberating the merits of Spark, Hadoop or MapReduce,” says Dowlaty. “It’s time to put Intelligent Systems into production and manage them much more effectively.”
Dowlaty says that doing so is being hampered by enterprise decision-makers still clinging to the analytics models of the 1980s, with their myriad dashboards and Excel spreadsheets.
“They are not thinking about how they can actually get their analytics into the real world. Firms that are digital natives are doing it, but not traditional enterprises.”
As a first step, those responsible for data analytics should take inspiration from DevOps and the way it has revolutionised corporate IT management and governance, and devise a similar approach for analytics, suggests Dowlaty: “We need AnalyticsOps, so to speak. This will bring about a renaissance in the enterprise, where the same rigour applied to managing IT assets is applied to model development, algorithms and problem solving. This is long overdue.”
Algorithms move from Cloud to the Edge
In recent years, the IT world has moved steadily to the Cloud. While the Cloud is good for many things, it’s not a cure-all, especially in a heavily instrumented world. 2017 will usher in greater emphasis on ‘fog’ or ‘edge’ computing, driven by the Internet of Things (IoT).
According to Gartner Group, nearly 21 billion devices will be connected to the Internet by 2020. It would be incredibly inefficient to transmit all the data generated by these devices to the Cloud. Doing so would require a great deal of bandwidth and affect processing speeds, especially when it comes to ‘heavy’ data such as video content.
Edge computing, as the name implies, is a distributed approach that moves data processing away from the Cloud to the edge of the network, where it takes place in smart routers, gateways or mobile devices. “You can use very low-cost devices – think Raspberry Pis – to run analytic algorithms at the network edge, performing work like video capture and analysis,” explains Dowlaty.
“To many large enterprises, this is still in the domain of hobbyists. They don’t realise that it’s often a much cheaper, more efficient approach than processing in the cloud or on-premise, especially in situations where real-time decisioning is required.”
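Dowlaty’s point about low-cost devices can be sketched concretely. The following is a minimal, illustrative example (not from Mu Sigma) of the kind of filtering that might run at the network edge: a rolling anomaly check that forwards only unusual readings upstream, so the bulk of the data never has to cross the network.

```python
from collections import deque
from statistics import mean, stdev

def make_edge_filter(window=50, z_threshold=3.0):
    """Flag readings that deviate sharply from the recent rolling window.

    Only flagged readings would be forwarded to the cloud; the rest
    stay on the device, saving bandwidth.
    """
    history = deque(maxlen=window)

    def check(reading):
        anomalous = False
        if len(history) >= 10:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(reading - mu) > z_threshold * sigma:
                anomalous = True
        history.append(reading)
        return anomalous

    return check

# Simulate a stream: steady readings followed by one spike.
check = make_edge_filter()
stream = [20.0 + 0.1 * (i % 5) for i in range(60)] + [95.0]
flagged = [r for r in stream if check(r)]
# Only the spike (95.0) is flagged for transmission upstream.
```

The same logic runs happily on a Raspberry Pi, since it needs only the Python standard library and a few kilobytes of state.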
Literate programming brings transparency and reproducible research to analytics
Literate programming – writing software as a blend of explanatory prose and executable code – has been around for some time, but it is now back in the limelight thanks to the growing popularity of enterprise notebooks like Jupyter, RStudio and Zeppelin within the scientific computing and analytics community.
Online notebooks blend computer code and rich text. They can be read by humans, but the embedded code can also be executed as a computer program to perform tasks such as data analysis.
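As an illustration of that dual nature (the notebook content below is invented for the example), a Jupyter notebook is simply a JSON document in which markdown cells carry the human-readable prose and code cells carry an executable program:

```python
# A minimal Jupyter notebook (nbformat 4): markdown prose and executable
# code live side by side in the same JSON-style document.
notebook = {
    "nbformat": 4,
    "nbformat_minor": 5,
    "metadata": {},
    "cells": [
        {
            "cell_type": "markdown",
            "metadata": {},
            "source": ["## Monthly churn analysis\n",
                       "We compute the churn rate from raw counts."],
        },
        {
            "cell_type": "code",
            "metadata": {},
            "execution_count": None,
            "outputs": [],
            "source": ["churned, total = 120, 4800\n",
                       "churn_rate = churned / total\n"],
        },
    ],
}

# The human-readable half: markdown cells are plain prose.
prose = [c["source"] for c in notebook["cells"]
         if c["cell_type"] == "markdown"]

# The executable half: code cells can be run as an ordinary program.
code = "".join(notebook["cells"][1]["source"])
namespace = {}
exec(code, namespace)  # namespace["churn_rate"] is now 120 / 4800
```

Because the prose and the code travel in one file, a non-technical reader and an automated pipeline can both consume the same artefact, which is what makes the analysis reproducible.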
“Enterprise notebooks are finally becoming the standard for top data scientists. Using notebooks, with their mixed approach, highlights that data analytics is no longer just geek territory – it’s now a team sport involving multiple disciplines. The notebooks ensure that both data scientists and non-technical contributors have a transparent view of analytics projects as they develop, and make those projects highly reproducible,” says Dowlaty.
“Analytical problem solving is now a team sport, requiring an open platform for collaboration. Such platforms also mitigate the risk of data scientists leaving and taking their IP with them.”
The Maker Culture is ushering in inexpensive analytics experimentation
The final important trend in data analytics that Dowlaty sees developing in 2017 is big businesses waking up to the potential of the fast, low-cost innovation strategies of the Maker Culture – and replacing more of their traditional R&D processes that are overly complex, rigid and expensive.
The Maker community can be described as a technology-enabled version of DIY culture: ‘kitchen table’ inventors who develop technology-driven solutions to everyday problems on a small, experimental scale. The time is right for the concept to be applied to analytics – not least because it enables cheap experimentation with intelligent systems.
Zubin Dowlaty explains: “The advantage is that you can create cheap prototypes quickly – for example using 3D printing – test and improve them in a short timeframe, and then either build them into enterprise-grade solutions – or discard them with minimal losses. 2017 will see enterprises make much more use of these approaches.
“For example, one can easily create a custom harness using 3D printing that houses various sensors alongside a Raspberry Pi with WiFi capability. The harness can then be deployed to stream sensor data so its efficacy can be evaluated. Armed with the final designs and that knowledge, you can produce the solution confidently at scale.”
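The software side of such a harness can be prototyped in a few lines. The sketch below is illustrative rather than Mu Sigma’s code: the sensor driver is simulated, and the publish step is injected as a callback so the same harness can be bench-tested before being pointed at a real collector over WiFi.

```python
import json
import random
import time

def read_sensor():
    # Stand-in for a real driver (e.g. an I2C temperature sensor on the Pi);
    # here we just simulate plausible readings.
    return round(20.0 + random.uniform(-0.5, 0.5), 2)

def stream_readings(publish, n_readings=10, batch_size=5):
    """Collect readings and hand off JSON batches to `publish`.

    In a real deployment `publish` would POST over WiFi to a collector;
    injecting it keeps the harness testable on a desk before scaling up.
    """
    batch = []
    for _ in range(n_readings):
        batch.append({"t": time.time(), "temp_c": read_sensor()})
        if len(batch) == batch_size:
            publish(json.dumps(batch))
            batch = []
    if batch:  # flush any partial final batch
        publish(json.dumps(batch))

# Bench test: capture batches in a list instead of sending them.
sent = []
stream_readings(sent.append, n_readings=12, batch_size=5)
# sent now holds three JSON batches of 5, 5 and 2 readings.
```

Swapping the simulated driver for real sensor code and the list for an HTTP or MQTT publisher is exactly the cheap, incremental path from prototype to production that the Maker approach promises.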
In summary, Dowlaty predicts that 2017 will shake up the traditional analytics approaches enterprises commonly use, both by creating new organisational structures and processes to accommodate modern analytics, and by opening the gates of corporate IT to more low-cost ‘grassroots’ technologies and innovation strategies.