The fast-paced digital economy creates a major paradox for IT departments. On the one hand, they need to accelerate time-to-market and deliver new solutions faster than ever. At the same time, they must maintain robust security, which doesn’t always come naturally with rapid release cycles. Greater speed can come at the cost of an increased risk of mistakes, and as the frequent headlines decrying the latest cybersecurity catastrophe clearly show, those mistakes present a greater threat than ever. However, security concerns cannot be a barrier to progress. If they delay innovation, there’s a much bigger risk that organisations will disappear beneath the waves of digital disruption created by newer, more agile competitors. It’s easy to see why many IT leaders feel like they’re stuck between a rock and a hard place; but is there a way for speed and security to co-exist?
Go back to where it all began
At the heart of this puzzle lies the mainframe, which has been the mainstay of enterprise IT for decades. Security is firmly entrenched in the mainframe culture, which has a reputation for large, drawn-out units of work with meticulous planning, strategic thinking and safe decisions at their core. As such, the mainframe offers the perfect template for robust security. However, the machinery of IT can only move as fast as its slowest cog, so if businesses are looking to achieve speed, then mainframe teams must break with tradition and find a way to deliver their innovations faster.
If they are to do so without sacrificing the quality and security that have been central to their history, mainframe teams must break complete solutions down into their component parts and refocus on the minimum viable product (MVP). This means that rather than working on long, drawn-out projects to deliver a complete solution to the business, they instead release minor updates and improvements to specific features and functionality within the core application. That’s a lot like the way the manufacturing industry works; production line teams focus on creating their own part of the whole, rather than trying to single-handedly turn out a finished product. As such, they can satisfy the needs of the business in a much shorter timeframe, whilst maintaining the stability for which they’ve established their reputation.
Automation, repeating your creation
The second piece of the puzzle is automation, which is critical to achieving the speed the business needs, without that being to the detriment of security. Mainframe teams need to again look to the manufacturing industry as an example and work to emulate a similar model. The objective should be to implement as close to full automation across the production line as possible, to speed up any repeatable and routine tasks, such as quality checks. The core process of creating new code will remain the role of the human workforce, but everything that comes after can be automated, to establish and maintain a uniform standard of quality and security.
The first step on this journey is to introduce automated unit tests, to speed up the quality assurance process. Rather than the IT team wasting valuable time gathering test data, setting parameters and creating test environments that are later discarded, the whole process is automated. Furthermore, it’s possible to create reusable testing scenarios as part of this process, so that standardised quality and security can be achieved more easily by ensuring that newly created features are subjected to the same checks. However, there’s another often-overlooked consideration that businesses must address within this process if they are to avoid creating a major security weakness and compliance failure.
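The idea of reusable testing scenarios can be sketched as follows. This is a minimal illustration only, in Python rather than mainframe tooling; the scenario data and the `validate_payment` feature are hypothetical names, not anything from the article.

```python
# Reusable scenarios: the same standard checks are applied to every
# newly created feature, so quality gates stay uniform instead of
# relying on hand-built, throwaway test data.
STANDARD_SCENARIOS = [
    {"account": "12345678", "amount": 100.00, "expect_ok": True},
    {"account": "", "amount": 100.00, "expect_ok": False},         # missing account
    {"account": "12345678", "amount": -5.00, "expect_ok": False},  # negative amount
]

def validate_payment(account: str, amount: float) -> bool:
    """Stand-in for the feature under test (hypothetical)."""
    return bool(account) and amount > 0

def run_standard_checks(feature) -> list:
    """Run a feature through the shared scenarios; return any failures."""
    failures = []
    for case in STANDARD_SCENARIOS:
        result = feature(case["account"], case["amount"])
        if result != case["expect_ok"]:
            failures.append(case)
    return failures

# Every new feature is subjected to identical, automated checks:
assert run_standard_checks(validate_payment) == []
```

Because the scenarios live in one place, adding a new check automatically tightens the quality gate for every feature that passes through the pipeline.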
Averting a data protection catastrophe
Research shows that 83 per cent of organisations test their mainframe applications using live customer data, to ensure they’re exposed to scenarios that are as close to real-world conditions as possible. Their concern is that if they used mock data, the tests wouldn’t provide an accurate understanding of how the new functionality will behave when it’s launched into the real world.
However, most of them also don’t seek explicit consent from their customers to use their data in this way, which could leave them at odds with the new European General Data Protection Regulation (GDPR) when it comes into effect next May. Automation also provides a solution to this quandary, offering a means to quickly and easily extract the data needed for each test and then run it through an engine that masks anything that can be linked to an individual, whilst maintaining its value for testing.
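One way such a masking engine can preserve a dataset’s value for testing is deterministic pseudonymisation: the same input always maps to the same masked value, so records still join up across extracted tables. The sketch below is purely illustrative (the article names no specific tool); the field names are hypothetical, and in practice the salt would be kept secret and pseudonymised data still needs careful handling under GDPR.

```python
import hashlib

# Fields treated as personally identifiable (assumed for illustration).
PII_FIELDS = {"name", "email", "account_number"}

def mask_value(value: str, salt: str = "test-env-salt") -> str:
    """Deterministically replace a PII value with a same-length pseudonym."""
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return digest[: len(value)].upper()

def mask_record(record: dict) -> dict:
    """Mask PII fields; leave non-identifying fields intact for testing."""
    return {
        field: mask_value(val) if field in PII_FIELDS else val
        for field, val in record.items()
    }

live = {"name": "Jane Doe", "account_number": "12345678", "balance": "1042.50"}
masked = mask_record(live)
# Identical inputs always produce identical pseudonyms, so joins between
# extracted tables still line up, but the customer is no longer named.
```

Field lengths and formats are preserved, so the masked data exercises the same code paths as the live data it replaces.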
Firing on all pistons with automated roll-outs
Finally, mainframe teams need to automate the deployment process to maximise the speed of delivery to the business. The dwindling pool of mainframe specialists means that in many cases, workloads spend significantly more time queued up waiting for available resources to move them along than they do being worked on. Removing the need for a developer to manually push new updates into production can therefore be of major benefit in speeding up the mainframe cog. However, if trust is to be maintained, it’s vital that mainframe teams retain full accountability and visibility, with an approval layer and audit capability built in. Mainframe teams must have the ability to roll back changes and reinstate a previous version of an application if needed. Without that, there is a risk that speed will come at the cost of security and stability, undermining the whole process.
Ultimately, taking this approach will allow the mainframe to serve as an example to the entire IT organisation. Refocusing on the MVP and automating the manual, repeatable tasks that have been the key to the stability, security and reliability championed by mainframe teams over the past half century will prove that speed and security can co-exist, driving businesses into the digital future.
Elizabeth Maxwell, technical director, Compuware
Image Credit: Skeeze / Pixabay