
Debunking the three biggest mainframe myths

(Image credit: IBM)

New technologies are constantly introduced into our lives only to be usurped a few years later, but amidst all the toing and froing of the past few decades in large enterprise IT, there has been one constant: the mainframe. Despite repeated claims of its demise, it remains the bedrock of many organisations, powering high-volume transactional workloads and mission-critical business functions. The mainframe’s strategic relevance is only growing in the digital age: 64 per cent of mainframe-powered organisations now plan to run more than half of their mission-critical workloads on the platform, up from 57 per cent in 2018.

The reliability, power, scalability, security and transactional cost efficiency of the mainframe are business-critical virtues unique to the platform, and they are why countless organisations are preserving the mainframe’s position in their IT environment. Yet despite its proven benefits, and the growing confidence enterprises have in its ability to power mission-critical workloads, some pundits still question its future. With that in mind, let’s debunk these fallacies by looking at the three biggest mainframe myths and uncovering the truth about IT’s longest mainstay.

Myth 1 – You need to re-platform to take advantage of the cloud

Consumers don’t know that their mobile banking app is retrieving data from a mainframe and making it available on their device via an app hosted in the cloud, nor do they really care. The only thing that matters to a consumer is that the product or service they’re using is delivered well and makes their lives easier. To provide the best customer experience, companies need to use the best possible platform for each job, and while some platforms are better than others at certain tasks, no single system is best at everything.
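That division of labour — mainframe as the system of record, a cloud-hosted layer shaping data for the app — can be sketched in a few lines. This is a minimal, hypothetical illustration: the fixed-width record layout, field widths and `translate_record` function are invented for the example, not taken from any real banking system.

```python
# Hypothetical sketch: a cloud-hosted API layer translating a
# fixed-width record, as a mainframe extract might supply it,
# into the JSON shape a mobile banking app consumes.
# The record layout below is invented for this example.

import json

# Invented layout: account number (10 chars), balance in pence (12), currency (3).
RAW_RECORD = "0012345678000001234567GBP"

def translate_record(raw: str) -> dict:
    """Map the fixed-width record to the structure the app expects."""
    return {
        "account": raw[0:10].lstrip("0"),
        "balance": int(raw[10:22]) / 100,   # pence -> pounds
        "currency": raw[22:25],
    }

if __name__ == "__main__":
    print(json.dumps(translate_record(RAW_RECORD)))
```

The point of the sketch is only that the two platforms meet at a thin translation layer; neither side needs to be rewritten for the other.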

Adopting a two-platform, one-speed IT strategy based on the mainframe and the cloud is therefore beneficial. Each platform contributes its respective strengths, and both can run at the speed and uncompromised quality that Agile and DevOps processes demand. This empowers development teams to innovate at the same pace, and with the same quality, regardless of platform and without the stress of migrating off systems that are proven to work.

Some are quick to jump on the cloud-first bandwagon, but history shows that organisations attempting to migrate off the mainframe often create business disasters for themselves and the customers they serve. Ultimately, two-platform, one-speed IT is a better, cheaper and faster way for mainframe-reliant organisations to make innovative progress than the risky alternative of migrating decades of critical data to the cloud.

Myth 2 – Mainframe development teams can’t be agile

When it comes to implementing Agile development processes, it’s not just a question of whether the hardware is modern enough. For enterprises trying to improve the velocity, quality and efficiency of their software development and delivery, the culture, processes and tools surrounding the platform are just as important as the hardware.

Adopting Agile or DevOps practices, and modernising tools and processes, makes it possible for developers of any experience level to interact with the mainframe. Through this approach, enterprises can rest assured that the platform is adaptable to new tools, compatible with new technologies and, crucially, accessible to developers new to the mainframe. However, there is no one-stop solution to modernising mainframe software delivery and the tools within it – it’s a process.

At the beginning of the journey, replacing outdated green-screen environments with modern, intuitive IDEs is key, as this creates an improved developer experience on the mainframe. Organisations can then begin automating complex, repetitive processes, such as testing. A comprehensive plan will ultimately enable siloed mainframe development and operations teams to form cross-platform Agile/DevOps teams and begin to work as part of the wider IT department.
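The testing automation mentioned above often starts with regression checks: proving that a modernised routine still behaves exactly like the legacy routine it replaces. Here is a minimal, hypothetical sketch — both interest functions are invented stand-ins for real mainframe logic, not code from any actual system.

```python
# Hypothetical sketch: an automated regression check confirming that a
# modernised routine reproduces the behaviour of the legacy routine.
# Both functions are invented stand-ins for real mainframe logic.

def legacy_interest(balance_pence: int) -> int:
    # Legacy rule: 1.5% annual interest, truncated to whole pence.
    return balance_pence * 15 // 1000

def refactored_interest(balance_pence: int) -> int:
    # The modernised routine must reproduce the same truncation rule.
    return (balance_pence * 15) // 1000

def regression_check(cases) -> bool:
    """Compare old and new implementations across a batch of inputs."""
    return all(legacy_interest(c) == refactored_interest(c) for c in cases)

if __name__ == "__main__":
    print(regression_check([0, 100, 99_999, 12_345_678]))
```

Once checks like this run automatically on every change, the repetitive verification work that keeps teams siloed becomes a shared, cross-platform pipeline step.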

Myth 3 – COBOL contains asbestos

First, COBOL is nothing more than a programming language: syntax that, when compiled into machine code, drives compute resources. Of the ever-growing list of more than 1,000 programming languages in the world, no single one is always best; every language has pros and cons by design. What’s unique about COBOL is the compiler. IBM has done a masterful job of continuously improving the COBOL compiler, with updates every few months that fine-tune the resulting machine code to drive the latest compute resources in the engineering marvel that is the mainframe as effectively and efficiently as possible. That’s why the world’s economy rightfully runs on more than 220 billion lines of COBOL code, a number that is increasing every day.

Second, in the age of software, code that is proven to be efficient and effective is one of an organisation’s greatest assets. Converting proven COBOL code to unproven Java code is likely to increase the lines of code by 50 per cent, decrease performance and reliability, increase security risks and increase total costs. That’s beyond crazy. Instead, an organisation’s precious time, money and energy should be invested in delivering new features that customers care about and removing technical debt, improving future flow through the software delivery process.

Third, today’s incoming developers are polyglot programmers. By necessity, they’re comfortable working with a variety of programming languages, and COBOL is just another one to add to the list. For COBOL to slot seamlessly into the modern programmer’s skillset, modern DevOps tools are required that enable developers to work on multiple kinds of programs with ease and confidence. Fortunately, these tools are readily available and maturing. A modern IDE, for example, provides developers with a familiar, preferred experience in which they can code in almost any language on most platforms – demonstrating how COBOL can slot in alongside the other languages the modern developer uses every day.

The modern mainframe reality

The mainframe continues to be an integral component of the IT infrastructure of countless household names around the world, and – as with Mark Twain – rumours of its death have been greatly exaggerated. However, it can still be difficult for companies to know how to incorporate the mainframe into their modern IT strategy. Companies don’t want to sacrifice the reliability, security and power of the mainframe, nor do they have to. By adopting modern tooling and methodologies such as Agile development and DevOps, enterprises can retain the many benefits the mainframe has to offer, integrate the platform with newer technologies, and achieve greater velocity, quality and efficiency in software development and delivery.

Christopher O’Malley, CEO, Compuware
