January is the customary time to make predictions about what the year holds in store. Working in partnership with companies across multiple industries that are looking to develop data science and AI skills in their workforce, I have a good vantage point on the trends developing across technology. In addition, I have published recent research with colleagues at Cambridge University on the challenges organizations face in deploying machine learning. From this perspective, a clear picture is forming: 2021 will be the year leading businesses make operationalizing AI a priority. In fact, the second half of 2020 saw a new crop of tools, platforms and startups receiving investment to solve this difficult problem.
This prediction may seem surprising to some readers, who might rightly point to enterprises that have already launched AI projects. Indeed, dozens of AI proof of concept (PoC) projects have made headlines over the past few years: international breweries using AI to improve operations, energy conglomerates creating a "digital power plant", car manufacturers trying to predict when a vehicle will need servicing.
However, the key distinction is whether these projects ever move past the PoC phase into full operation, and, even if they do, whether these AI deployments actually deliver any value for the business beyond publicity.
Gartner predicted that in 2020, 80 percent of AI projects would remain what they call "alchemy, run by wizards whose talents will not scale in the organization." Others are even more pessimistic, with Algorithmia's 2020 State of Enterprise Machine Learning report estimating that only 8 percent of enterprises have sophisticated models in production.
There is nothing intrinsically wrong with pursuing PoC projects. In fact, I would encourage all organizations to start with low-hanging fruit and look at how they can "hack" their processes as a way of enabling experimentation, innovation and commercial advantage. However, it is beyond time that some of these pursuits moved towards operationalization and scalable value.
More than anything else, if enterprises continue with the approach of getting excited about running experiments and stopping there, there is a risk that AI and ML will be seen as nothing more than a financial hole. The myriad opportunities for ML to have a material impact on a company's bottom line may never be realized if the funding runs out before we get to the operational stage. As an industry, then, we need to move past geeky PoCs and start actual deployments that will make businesses money.
The evolution of MLOps
Thankfully, there are many lessons to be learned from the software engineering field and DevOps principles. The catalyst that may finally move the majority of AI projects beyond "alchemy" and into robust engineering in 2021 is called MLOps. This combination of "machine learning" and "operations" is, in the simplest terms, the practice of collaboration and communication between data scientists, developers and platform engineers to improve the production lifecycle of ML projects.
It is not a new idea. In fact, the term is appropriated from DevOps, a practice that has existed for two decades. Practicing MLOps means following standardized processes to automate and monitor every step of the ML deployment workflow, including data and infrastructure management, model training, testing, integration, release, deployment and security. Ultimately, MLOps expedites and removes the pain from embedding ML into scalable systems.
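To make the idea concrete, the workflow described above can be sketched as an automated pipeline with a quality gate: train, evaluate against held-out data, and deploy only if the model clears a threshold. This is a minimal, hypothetical illustration (the function names, the toy "mean predictor" model and the dict-based registry are all assumptions for the sake of the sketch, not any particular MLOps product):

```python
# Hypothetical sketch of an MLOps-style pipeline: train -> evaluate -> gated deploy.
# All names and the toy model here are illustrative assumptions, not a real tool's API.

def train_model(data):
    # Stand-in "training": fit a mean predictor over (x, y) pairs.
    mean = sum(y for _, y in data) / len(data)
    return {"type": "mean_predictor", "mean": mean}

def evaluate(model, holdout):
    # Mean absolute error on held-out data.
    errors = [abs(model["mean"] - y) for _, y in holdout]
    return sum(errors) / len(errors)

def deploy(model, registry):
    # "Deployment" here is just recording a versioned model in a registry dict;
    # in practice this would push to a model registry and serving infrastructure.
    version = len(registry) + 1
    registry[version] = model
    return version

def run_pipeline(data, holdout, registry, max_error=1.0):
    # The automated quality gate: only deploy if the error is within tolerance.
    model = train_model(data)
    error = evaluate(model, holdout)
    if error <= max_error:
        return deploy(model, registry)
    return None  # block the release; in a real pipeline, alert the team

registry = {}
data = [(x, 2.0) for x in range(10)]
holdout = [(x, 2.1) for x in range(5)]
version = run_pipeline(data, holdout, registry)
print(version)  # the model passes the gate and is deployed as version 1
```

The point of the sketch is the shape, not the model: every step is a function that can be automated, monitored and tested, which is what moves a project from one-off "alchemy" to a repeatable engineering process.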
While 2021 is not the year the idea of MLOps was conceived, there is strong evidence to suggest it will be the year enterprises formalize an MLOps strategy and put it into practice.
This is partly driven by demand within organizations to shorten production cycles, given the high costs that have yet to be converted into ROI. According to the Algorithmia report, 22 percent of companies have had ML models in production for 1-2 years. Naturally, these organizations will be applying pressure to see results sooner rather than later, and MLOps is a solution for shortening the time it takes to put a model into full production.
Tools and skills
Clearer evidence of the acceleration of MLOps as a practice can be seen in the investment into MLOps products. Like software development, MLOps requires an ecosystem of tools and frameworks that industrialize the process of creating ML models and create an environment for developers and operational professionals to collaborate. The availability of solutions is one of the fundamentals needed for MLOps to be put into practice.
Companies such as H2O and DataRobot have led the way in the field of AutoML tools but there has now been an explosion of startups being funded in this space. A new report from Cognilytica predicts exponential growth of the market, to the tune of $126.1 billion by 2025. This represents a 33.73 percent compound annual growth rate and demonstrates the recognition in the industry that new tools and platforms are required for AI deployments to be successful.
Once the tools are in place, the last piece of the puzzle in creating true MLOps is skills. It goes without saying that building a consistent approach to ML will require a skills pool beyond what Gartner describes as the AI "wizards". The talent pool has to be dramatically expanded to ensure there are enough skilled professionals on both the development and operations sides of MLOps to bring the tools and processes together. Through our own apprenticeships and courses, we have seen a significant increase in demand, which once again suggests that enterprises are beginning to approach MLOps in earnest.
The implementation of MLOps and closer collaboration of software developers and AI practitioners will bring a maturity to the market in 2021. This will mean more processes and systems that enable the scaling and acceleration of machine learning capabilities. Hopefully, this will make 2021 the year that enterprises begin to reap the benefits of AI deployment through efficiencies and savings, leading to more investment in innovation.
Dr Raoul-Gabriel Urma, CEO and founder, Cambridge Spark