In the fast-paced world of business technology, innovators and users scramble to stay at the cutting edge – at the head of the comet. Yet further back in the comet's tail lies long-standing, trusted technology, often referred to as "legacy" systems. The idea of a "legacy system" means different things to different people. Yes, there are some consistent elements: people are usually talking about COBOL or mainframe systems. But the binary of new vs. old that the term presents – good vs. bad – is at best misleading, and at worst potentially harmful to business.
One CIO, when some of his applications were referred to as "legacy systems", brusquely responded, "these are not legacy; they are my core business". Therein lies the core challenge: what might be a legacy system to an outside observer may in fact represent crown-jewel business functionality to the organization. The real dilemma, of course, is ensuring today's IT needs are addressed while also supporting tomorrow's opportunities. Keeping things running smoothly in the short term is critical, no matter what is going on or whatever future plans are made. And simply put, rip-and-replace tactics can be hugely cost-prohibitive, cause business downtime, and all but guarantee technical debt.
Legendary, not legacy
What makes an IT application "legacy"? Consider applications that calculate interest rates, provide insurance quotations, manage critical stock-control algorithms for a retailer, process healthcare benefits, administer government benefits for millions, or book parcel shipments, haulage logistics and travel schedules. If those activities are what the organization does as its core business, what makes any of those systems legacy, except perhaps their perceived "age"?
COBOL and mainframe systems are widely misrepresented: historical "scandals" long since proven false, claims of a "dying" language, or simple marketing tactics designed to convince prospects to buy new services, infrastructure and solutions. The reality is that these systems are alive, invested in, and still widely used. They are not just surviving; they are thriving.
These systems do have a long history – no one is denying it – but knowing that history, being able to see the journey to where they are today makes them more valuable, not less. Legendary, not legacy, if you like.
The roots of COBOL
In 1959, the idea of an enterprise-scale, robust language of choice for business was devised: COBOL, the COmmon Business-Oriented Language. That's what the B stands for. Built through a committee process, COBOL remains an open standard, today maintained as an international (ISO) standard.
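That business-oriented design is visible in the language itself: data is declared in business terms and arithmetic reads almost like an instruction to a clerk. As a minimal sketch (not taken from any particular system, with illustrative figures), an interest calculation of the kind mentioned earlier might look like this:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. INTEREST.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      *    Fixed-point decimal fields, declared with PICTURE clauses.
       01  PRINCIPAL       PIC 9(7)V99  VALUE 1000.00.
       01  ANNUAL-RATE     PIC 9V9999   VALUE 0.0450.
       01  TERM-YEARS      PIC 99       VALUE 10.
       01  SIMPLE-INTEREST PIC 9(7)V99.
       PROCEDURE DIVISION.
      *    Decimal arithmetic with no binary floating-point rounding.
           COMPUTE SIMPLE-INTEREST = PRINCIPAL * ANNUAL-RATE * TERM-YEARS
           DISPLAY "SIMPLE INTEREST: " SIMPLE-INTEREST
           STOP RUN.
```

The point is less the arithmetic than the idiom: fixed-point decimal types declared in the DATA DIVISION are one reason financial institutions have trusted the language with money for decades.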
The language was designed to be portable, running on mainframes and mid-range systems, then UNIX, AIX, OS/2, Windows, Linux and all points in between. Its portability, while not especially important to the mainframe community per se at the time, increasingly became a key feature as IT decentralized and IT teams began to explore new environments and new business models. Nowadays, nodes often no longer sit within owned networks, instead sitting on third-party public or private clouds. That level of interconnectedness and hybrid IT is a de facto model in many of the largest organizations, and it is what makes modern COBOL so prevalent, and so viable.
The challenge and opportunity lie in getting the best out of any technology investment to deliver business value. It comes down to appropriateness: defining where something is useful, where it makes most sense for the organization. Do you have the people with the necessary skills? Do you have technology that easily connects to the right pieces of other, newer technology? Is it working on the right platforms? Does it provide the right throughput, robustness and reliability?
COBOL ticks many, many of those boxes, and is a fair candidate for a host of systems – the grown-up stuff where failure matters. Granted, COBOL isn't used to build Angry Birds, but it does run over half of the global banking system, while countless other industries depend on it to function. Modern COBOL is part of many contemporary toolchains. Its usability allows for a breadth of deployments spanning plug-and-play integration, service disentanglement, discrete microservices delivery, straight-out-of-the-box use, and more.
A modern technology?
Due to this misrepresentation, the wider COBOL programmer community is not given the opportunity to bid to use COBOL in these new ways to solve the challenges of today. Add to that the accepted wisdom, however clichéd, that in order to do something new, you have to use a new piece of technology. Fortunately, there has been a consistent, concerted effort by the community (such as the Open Mainframe Project) and by large technology providers like IBM and ourselves at Micro Focus to invest, modernize and advocate. That effort has allowed COBOL's users to keep transforming as the digital economy has changed – running and transforming their business functions and capabilities at the same time.
As the cloud and container world continues to make new deployment options available, alongside continued innovation in the mainframe world, the result is choice – it doesn't mean one world will simply swallow the other. These worlds will continue to collide, overlap and interact. They will continue to require mashups and connections that are based on working technology, but which support brand new use cases. When someone says, "We might need some new technology for that," we say, "Great – why don't you use COBOL? It will work anywhere you need it."
It's not going to get any easier for organizations to figure all of this out, but within some of that complexity lies a great deal of new opportunity, along with trusted technology that has been proving its value again and again for decades. COBOL may be a 1959 idea, but it's a 2021 technology.
Derek Britton, Director of communications and brand strategy, Micro Focus