Why the UK Government won’t move to cloud…yet

The potential value of service exploration and innovation is enormous for citizens and delivery teams alike. The business of government (namely policy delivery) is, at its very core, dynamic, and the services that underpin it should be too. 

Iterative, DevOps-led value chains with rapid time-to-market and short feedback cycles allow the public sector to increase its performance many times over, focusing limited technological budgets on what is required and efficiently delivering services to engage citizens. 

A fundamental part of this, in my view, is the use of scalable and reliable public cloud services, policy on which has recently been refreshed by the Cabinet Office in its ‘Cloud First’ guidance on the use of public cloud in the public sector. This is a crucial enabler of digital innovation, allowing departments to move much faster at potentially lower cost and with reduced risk. 

The UK Government Digital Service (GDS) and the digital teams within the central government departments have shown that a radically different way of delivering public sector services can be a success, helping the UK government to rank top in the world for digital innovation. 

However, public sector delivery does still bear the scars of “big IT”: legacy and heritage systems, continually outsourced teams, and the ghosts of big bangs past. These are slowly making way for new behaviours, approaches and thinking, but it’s an evolving process. 

The Cloud First policy, originally published in 2013, mandates that during any procurement process public sector organisations should “consider and fully evaluate potential cloud solutions first – before they consider any other option”, whilst the guidance on use of the cloud seeks to reassure users that the public cloud is safe for sensitive data – a point not yet fully embraced by departmental security teams. 

But why does the government have to ‘refresh’ the emphasis on public cloud, as it did back in March 2017? It’s my view that the message hasn’t yet trickled down into actual widespread practice. 

As James Stewart, Director of Technical Architecture and Head of Technology at GDS, commented: “we've still got lots of myths to bust and best practice to share”. Not least, how the security patterns and approaches adopted in global financial services and retail for some time now differ from the security measures and models currently employed in government. 

Using public cloud services is the next logical step in driving public sector innovation forward. However, there is a threat that this good work is being undone. 

Said the Rt Hon Lord Francis Maude, the ministerial architect of the GDS: “just at the moment when the UK has just recently been ranked top in the world for digital government, we are beginning to unwind precisely the arrangements that had led to that ... there is a sense these old structures in government, which are essentially about preserving the power of the mandarins, are being reasserted.”  

The genie is, however, out of the bottle, and the value of rapidly deployable, scalable services able to respond to business and user needs is now a common expectation across the private and public sectors.    

So, what’s stopping that next step? 

There are numerous critical and interrelated barriers, I believe: underdeveloped in-house capabilities, an overreliance on contractors and corporate consultancies, the big-bang nature of government projects and a lack of transparency and collaboration. 

Most cited is a lack of mature in-house skills and capabilities – for the last fifteen years government departments haven’t owned their own IT, which has left very little opportunity to develop capability internally. 

This has resulted in a very limited number of experienced civil service engineers with a deep understanding of both business processes and the technical services that underpin them. 

The resultant short-termism deprioritises forward-planning for skills and creates a consequent over-reliance on contractors and large corporate consultancies. The results have not been optimal: the recent £46m failure of a Scottish IT project counted amongst its causes “Accenture’s decision to take a waterfall approach, a fundamental loss of trust between the partners, and an underestimation of the complexities of the project.” 

Moreover, this has made the migration to an optimised public cloud model much more difficult, as the strata of historical service provider systems have had to be unpicked and rationalised without in-house experience. 

The problem is, as my colleague Hibri Marzook comments: “you need to find people to set a vision, not just install a product. You need digital leaders”.   

But the lack of skills and vision, and the consequent reliance on contractors, spills over into self-reinforcing project management problems. The emerging use of specialist ‘rainbow teams’, blending professional service providers, contractors and civil service skills (where available) to deliver outcomes, is arguably struggling to have the intended impact due to limited public sector capabilities and leadership. 

Coupled with this is the amount of unplanned work in each delivery cycle spent firefighting, and refactoring or removing dead code left by previous incumbents, all without the context that established, experienced in-house IT teams would provide. 

When it comes to delivery methods, the service framework has done a considerable amount to improve matters, but there is still some way to go. Hibri comments: “teams don’t have the autonomy; there’s still top-down management and old-school project management rather than agile ways of working. 

“The public sector should be looking to engage the tech talent pool available by adopting an open source model, so you’re not tied into big consultancies. This makes it more open and collaborative. They then need iterative small changes. Employ a continuous delivery mindset; then the taxpayer gets the most value and isn’t funding huge projects that are likely to fail”. 

Another of my colleagues, Henry Bell, has previously worked at a number of major government departments. He told me: “At one large department…there were enormous legacy mainframes that made up the back end that couldn’t realistically be touched. It used to be incredibly expensive and time-consuming to make any changes. 

“But the front-end, consumer-facing portals were more accessible. So individual services were picked out, made into microservices and moved up the stack. This process was then slowly repeated across existing services. 

“That department used to release twice a year. Now they release many times a day. And each change is much more granular, much more isolated. There are no huge deadlines, no massive expenditure of cash. Just small steps in the right direction.”
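
To make that migration pattern concrete, here is a minimal sketch of the approach Henry describes, often called the ‘strangler fig’ pattern: a thin routing layer in front of the estate sends one extracted service to a new microservice, while every other request continues to flow to the legacy back end. The stack, URLs and service names below are illustrative assumptions, not details from his example.

```typescript
// Minimal strangler-fig routing sketch (TypeScript + Express).
// All hostnames and paths are hypothetical, for illustration only.
import express from "express";
import { createProxyMiddleware } from "http-proxy-middleware";

const app = express();

// One service, carved out of the monolith, now lives in its own deployable unit
// and can be changed and released independently of the mainframe estate.
app.use(
  "/check-status",
  createProxyMiddleware({ target: "http://status-microservice.internal", changeOrigin: true })
);

// Everything not yet migrated still falls through to the legacy back end, untouched.
app.use(
  "/",
  createProxyMiddleware({ target: "http://legacy-gateway.internal", changeOrigin: true })
);

app.listen(8080, () => console.log("Edge router listening on port 8080"));
```

Each subsequent extraction is then just another route added to this layer, which is what makes the small, granular, frequent releases he mentions possible.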

The old guard can’t help 

Once the key barriers have been identified, what’s the next step for government? 

What’s sure is that the old guard can’t help. These are the same companies that have been failing to deliver government IT projects for decades, and the amount of money wasted on them runs into the billions.  

Alistair Smith, Public Sector Lead and Client Principal, Contino

Image Credit: Melpomene / Shutterstock