The future of mainframe computing: Legacy or legendary?

The demise of the mainframe computer has been predicted for decades, yet it still thrives as the reliable core processing workhorse for many industries. In fact, IBM reported a more than 50 per cent increase in mainframe revenues in 2013.

But as the technology world evolves, the game is changing, and the mainframe world is struggling to keep pace with the expanding demands of today's information-hungry customers and with trends such as cloud computing. This is compounded by recent high-profile mainframe outages that have dented not only customer service but also the reputation (and share price) of some major organisations. Can the mainframe overcome the core challenges it faces: its assumed high cost, the IT skills crisis, and a perceived irrelevance to modern computing?

In many large-scale organisations, the mainframe is considered powerful, secure, cost-effective and unrivalled in reliability. However, when an application's processing consumption rises, or its response times fail to meet users' real-time expectations, the cost-benefit justification of these systems is often jeopardised. This is either because MIPS/MSU costs continue to rise and in turn force hardware upgrades, or simply because the systems become too complex and expensive to maintain.

By making sure that mainframe environments are kept up-to-date and understanding the new workloads with which the mainframe must contend, companies can overcome the surprises of degraded service, rising costs or unplanned hardware upgrades. IBM has invested billions in research and development, and the zEnterprise brand offers a mainframe environment that boasts staggering price performance improvements.

At the same time, it introduces a level of flexibility in terms of platform options that would have been unthinkable even a decade ago. For example, the new IBM zEC12 mainframe environment is capable of executing more than 78,000 MIPS, meaning that it has 50 per cent more total system capacity than its predecessor.

As well as erroneous perceptions regarding cost, there is also a shortage of academic institutions supporting the survival and development of the mainframe.

The lack of IT skills in mainframe languages remains a significant issue for major enterprises and has been one of the primary drivers (along with cost) of the growth of the outsourcing market over the last decade or so. Because of this apparent lack of skilled mainframe developers, organisations are seriously considering a "rip and replace" approach, swapping their systems for off-the-shelf solutions - a move that in most cases is completely unnecessary and exceedingly expensive.

Recent research with over 100 universities around the world examined how mainframe programming languages such as COBOL are being taught, and it is clear we still have a long way to go. Nearly 75 per cent of academics running IT courses at universities do not include COBOL programming in their curriculum. That is despite the fact that 71 per cent believe that today's organisations will continue to rely on applications built using COBOL for the next 10 years and beyond - a massive disparity between what is being taught and the skills businesses need.

Proponents of the mainframe development paradigm have forged partnerships with training organisations, academic institutions and others to help build the next generation of mainframe programming staff. Commerce, academia and even IT students themselves must come together with vendor support. Interestingly, one organisation sought to retrain some programming staff in key mainframe skills, and found that modern tooling enabled developers to pick up key mainframe skills such as COBOL in a matter of hours, effectively eradicating the perceived skills crisis in a single stroke.

Furthermore, the wider industry perception of the mainframe remains a genuine problem. While the mainframe still processes many business-critical applications for the enterprise, the very fact that it (in most cases) quietly goes about its business with no fuss means that little mindshare is devoted to it. As a result, there is also very little competitive edge or corporate emphasis placed on the mainframe, outside the efforts of the vendor community.

Indeed, only when an outage takes place does anyone mention the word mainframe, or so it seems. The rest of the time, it is the silent majority of business value in IT. The untold story of the enduring value of the mainframe is a vacuum into which negative perceptions can flood, and this needs to change. Key to this is the investment in innovation being made in the mainframe world as major vendors seek to retain and improve mindshare.

Ultimately, all of this means that the mainframe can and should be seen as every bit as relevant today as it ever was. Indeed, instead of labelling the mainframe a "legacy" environment, it is often far more appropriate to call such machines "legendary systems." These are the systems that run businesses, and they need management, investment and innovation to support their future.

The mainframe is keeping organisations running and provides the bedrock for future innovation. While the vendor community is providing the technology platforms that organisations need (Eclipse, zEnterprise etc.), training a new generation of mainframe-savvy professionals who aren't afraid to extol the virtues of big iron is going to be imperative for the survival of the mainframe. The ingredients are there, but the mainframe community must continue to be vocal about the value it continues to provide.

Derek Britton is director of solution marketing at application modernisation and management specialist Micro Focus.