The programming language Cobol has been around for 61 years in some form or another. For many organisations, that age shows, and people who can keep mainframe-based Cobol applications upright are becoming harder and harder to find, especially as most computer science programs aren’t teaching it any more.
The importance, and brittleness, of these systems was on show back in April 2020, when, at the height of the COVID-19 pandemic, various state authorities from New Jersey to Kansas started to put out desperate pleas for Cobol programmers to volunteer or come out of retirement to keep their creaking unemployment systems running in the face of unprecedented demand.
That’s because, even at the ripe old age of 61, Cobol is still being used by many big banks, insurance companies, and public organisations to run core transactional business processes, like paying unemployment benefits or dispensing money from an ATM.
Cobol does the job, but it’s hard to maintain and integrate
“These are 20- to 30-year-old apps that have served the business well, but they accumulate technical debt and are very specific to what that business has. [Cobol] is functionally rich but it happens to run on a platform that is restrictive and doesn’t play with other modern systems,” said Tim Jones, managing director of application modernisation at software service provider Advanced.
Jones sees the dual mainframe and Cobol skills crises, associated cost considerations, and the need to maintain competitive advantage via technology innovations as the major drivers behind organisations moving on from Cobol today, especially those applications that still reside on a mainframe.
“It’s not just a Cobol skills issue, it’s a broader legacy modernisation issue, of which Cobol is just one element,” Jones said.
“Modern, mobile, and cloud systems integrating into mainframe is really difficult. When you are thinking about your competitive edge, if you are on mainframe, you will fall behind. If you are going through a digital transformation, you will eventually have to deal with the elephant in the room of that big mainframe in the corner that is hosting 70 per cent of your business applications.”
“As long as those workloads reside on the mainframe, there must be a business case to move away from Cobol,” said Markus Zahn, cluster lead for new mainframes at Commerzbank. “This might be due to a resource shortage caused by demographic effects or changes in customer expectations in terms of timely availability of data.”
How to modernise Cobol applications
For workloads where that business case has been successfully made, there are several ways to free yourself of those legacy constraints. You could try to find a similar piece of software off the shelf, conduct a straight lift-and-shift of the application onto cheaper infrastructure, or commit to a wholesale application rewrite in a modern language like Java or C#. But all of those routes are fraught with risk.
By lift-and-shift, we typically mean a straight copy of an application from a legacy platform to a more modern one, without making any changes to the design of the application. This often requires using an emulator on the target platform to enable the application to continue to run as before.
But lift-and-shift may not provide the long-term value many organisations seek. “Lift-and-shift was a popular strategy a decade ago, but it is typical for organisations today to want more value from a modernisation project,” Jones said.
“Not only do they want to reduce operating costs, they need to increase agility to meet market demands, deliver future-ready technologies, adopt cloud and modern development practices, and drive IT innovation to support the business.”
Then there is the total rewrite option, where you break down your application and rewrite it in a more modern language.
“The challenge with a rewrite is whenever you look at something and the business requirements, these systems are 30 years old with limited documentation and no one person that understands it end to end, so it takes a long time,” Jones said. “If you do it right, you have something completely cloud-native out of the gate, but by the time you finish you could already be behind.”
Instead, many organisations are opting for something of a safe middle option, known as code refactoring. This involves some restructuring of the code, often through automated refactoring software from specialist providers. This technique maintains the existing business logic but enables better performance and portability, without drastically changing the core functionality.
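To make the distinction concrete, here is a minimal sketch (in Java, purely illustrative, and not output from any vendor’s refactoring tool) of what “maintaining the existing business logic” means in practice. A Cobol paragraph might have declared a `PIC 9(7)V99` field and used `COMPUTE ... ROUNDED`; the refactored code keeps the same fixed-point, scale-2, half-up arithmetic so that results match the legacy system to the penny, even though the structure of the code is now idiomatic to the new platform.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

// Illustrative only: a business rule as it might read after automated
// refactoring from Cobol. The rule itself (pro-rating a weekly benefit
// for a partial week) is hypothetical; the point is that the arithmetic
// semantics of the legacy code are preserved exactly.
public class BenefitCalc {

    // Pro-rate a weekly benefit for a partial week of eligibility.
    static BigDecimal prorate(BigDecimal weeklyRate, int eligibleDays) {
        // rate * days / 7, rounded half-up to two decimal places,
        // mirroring Cobol's COMPUTE ... ROUNDED on a V99 field.
        return weeklyRate
                .multiply(BigDecimal.valueOf(eligibleDays))
                .divide(BigDecimal.valueOf(7), 2, RoundingMode.HALF_UP);
    }

    public static void main(String[] args) {
        System.out.println(prorate(new BigDecimal("84.00"), 3)); // prints 36.00
    }
}
```

The value of the automated approach is that this semantic fidelity is enforced mechanically across millions of lines, rather than depending on a human translator remembering to reproduce every rounding rule.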
“That is the route that seems to have the most success,” said Mark Cresswell, executive chairman at LzLabs, a service provider that specialises in modernising mainframe applications.
Here are two examples of organisations freeing themselves from mainframe-based Cobol, both using code refactoring techniques and automated refactoring software to modernise their applications.
How the UK DWP got off the mainframe
The UK Department for Work and Pensions (DWP), which is responsible for various welfare, pension, and child maintenance payment schemes that serve as many as 20 million claimants a year, opted for a conservative “like-to-better” automated conversion of its Cobol applications, with no lift-and-shift mainframe emulation.
As of 2015, many of the systems responsible for these payments—including war pensions and Jobseeker’s Allowance unemployment benefits—were still written in Cobol, residing on ICL mainframes hosted by HP. “As you can appreciate, that is a really costly platform to support and maintain,” said Andy Jones, lead infrastructure engineer at the DWP, in a video interview last year.
The cost of maintaining the legacy infrastructure underpinning these systems, paired with the increasing difficulty the department was having in finding and retaining people who could maintain mainframe-based Cobol applications, highlighted the need to update those 25 million lines of code, all before support for the VME mainframe operating system was due to expire in December 2020.
“This was more about the infrastructure that supports the Cobol code, which was ICL infrastructure, which was aging, and the individuals with the expertise to support that were aging themselves,” Mark Bell, the VME replacement program lead at the DWP, told InfoWorld.
This wasn’t DWP engineers’ first attempt to move on from these legacy systems, however. Over the years, at least four efforts to modernise these payment systems, including one lengthy attempt to rewrite the Cobol code in Java, had failed. Now that the platform was truly reaching its end of life, a simplified strategy was settled on, and application modernisation specialists from Advanced were drafted in to help.
This strategy involved a code conversion from Cobol to the more modern, object-oriented Micro Focus Visual Cobol, and moving from the VME operating system to Red Hat Linux, hosted on private cloud servers by Crown Hosting Data Centres, a joint venture between the UK Cabinet Office government agency and the private Ark Data Centres.
More than 10 billion data rows would also be moved from the closed, hierarchical IDMSX database to Oracle relational databases as part of the project.
For the uninitiated, Micro Focus Visual Cobol is a more modern implementation of the Cobol language, with the aim of opening it up to modern development environments and concepts.
“Importantly it provides mixed-language support, which means you can bring on board Java and C# developers who can write new programs that seamlessly integrate with the existing Cobol programs, all from within the same development environment,” Advanced’s Jones said.
Work started with the smallest system, which pays £1.5 billion in housing benefits once per month, in batches, to about 360 local authorities. The converted code was tested in parallel against the original over a four-week period. Once everyone was satisfied that no disruption would occur, engineers cut over to the new system in February 2018.
From there, DWP engineers and Advanced specialists went system by system, including converting DWP’s largest service, for the Jobseeker’s Allowance, over Easter 2020, just as the COVID-19 pandemic was starting to grip the UK.
“That was an intense time to do an application migration for a benefits service that was starting to see an avalanche of claims because of how COVID was hitting the country,” Bell said. After that, there were three smaller systems to move, with the department finally free of its VME mainframe-based Cobol applications in January 2021.
The cost savings and performance gains have already been significant for the government body. “This more modern operating model allows us to [reduce] resources and support,” Bell said. On the performance side, where Jobseeker’s Allowance claims used to be processed in multiple four-hour batches, they now complete in about an hour.
The new cloud-based Micro Focus Visual Cobol setup may seem like kicking the problem down the road by still relying on a version of Cobol, but it allows the whole organisation to be more responsive and modern in its approach to software. Where updates to the old monolithic Cobol systems could only be deployed once or twice a year, the new object-oriented systems allow for smaller, more regular changes to be made by the developer team.
Those developers can also start to experiment in a dev/test environment on Amazon Web Services (AWS), build out a set of reusable APIs to expose key data sources, and push changes through a CI/CD pipeline. The move to Micro Focus Visual Cobol even opens the door to a more drastic rewrite into something like Java or C# in the future.
New York Times delivers on Cobol conversion
Similarly, the New York Times needed to update the application responsible for its daily home delivery service in 2015. The 35-year-old application was built in Cobol and ran on an IBM Z mainframe. The publishing company wanted to convert the application to Java and run it in the cloud on AWS, as it was becoming costly to maintain and didn’t integrate well with other, more modern systems.
An attempt to manually rewrite the home delivery application between 2006 and 2009 had already failed, leaving the application largely untouched until 2015, when it was running 600 batch jobs with 3,500 files sent daily to downstream consumers and systems, consuming about 3TB of data, and storing 20TB of backup data.
That year, engineers at the publisher decided on a strategy of migrating code and data using the code refactoring technique, using proprietary automated refactoring software from specialist partner Modern Systems (later acquired by Advanced).
The code was converted to Java and the data was shifted from indexed files to a relational Oracle database. This process took two years and, after less than a year of running in a private data centre, the new application—named Aristo—was migrated to AWS in March 2018 after eight further months of work.
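The data side of such a migration means mapping fixed-width records, whose layouts were described by Cobol copybooks, onto the named, typed columns of a relational row. A minimal sketch of that mapping is below; the field names and layout are hypothetical, invented for illustration rather than taken from the Times’s actual copybooks, and a real migration tool would generate this kind of mapping automatically from the copybook itself.

```java
import java.math.BigDecimal;

// Hypothetical example: parsing one fixed-width record (of the kind a
// Cobol copybook describes) into the fields a relational row would hold.
// Layout assumed here: columns 0-7 account number; 8-27 surname, space
// padded; 28-34 an amount with an implied two decimal places, so
// "0001250" means 12.50 (Cobol's PIC 9(5)V99 convention).
public class RecordMapper {

    record Subscriber(String accountNo, String surname, BigDecimal weeklyRate) {}

    static Subscriber parse(String line) {
        String accountNo = line.substring(0, 8);
        String surname = line.substring(8, 28).trim();
        // Shift the decimal point left two places to recover the
        // implied V99 scale.
        BigDecimal rate = new BigDecimal(line.substring(28, 35)).movePointLeft(2);
        return new Subscriber(accountNo, surname, rate);
    }

    public static void main(String[] args) {
        String rec = "00012345" + String.format("%-20s", "SMITH") + "0001250";
        Subscriber s = parse(rec);
        System.out.println(s.accountNo() + " " + s.surname() + " " + s.weeklyRate());
    }
}
```

Repeated ten billion times, with referential integrity checks and parallel verification against the source files, this is the unglamorous bulk of moving from indexed files to a relational database.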
“If the New York Times had its cloud strategy already in place before starting the mainframe migration, the company would have chosen to migrate the mainframe directly to AWS, avoiding the extra work for designing and implementing the on-premises Aristo deployment,” said an AWS blog post about the project.
Now the delivery system has been integrated into the more modern Digital Subscription Platform, which is run, built, and maintained by the same Subscription Platforms group at the New York Times. It has helped reduce the total cost of ownership for the Aristo application by 70 per cent per year.