In today’s digital economy, competition is fierce. The rising consumerisation of IT and ubiquitous connectivity have fostered a culture of immediacy, which means that speed matters.
Organisations are tested on their ability to deliver applications and updates on a weekly, daily and sometimes hourly basis to satisfy the whims of their customers. That’s all while ensuring that data is secure and can be used efficiently by multiple teams across the business.
In trying to meet demand for innovation and faster time to market, fast access to data has never been more important to fuelling business operations. Yet we see business after business hamstrung by the constraints of their data supply chain. Processes for requesting and provisioning data are slow, manual and largely ad hoc, with organisations relying on passing information hand-to-hand, from IT operations to the teams that need it.
Moving large data sets is so difficult and so slow that many teams end up using subsets or synthesised data during development or testing. In turn this increases defects, bugs and errors and creates real challenges for ensuring security and agility.
Getting over the data hump
To truly overcome data constraints, organisations must shift away from the bucket brigade approach and help teams create a unified model for data management. What currently happens in many scenarios is that if a project manager needs fresh data for a new feature on a mobile app, they must request it from a line manager before going to the database administrator. The request is then passed to the system admin, who in turn needs to work with storage and network admins to get access and bandwidth for moving the data. These hurdles mean long delays, with many large companies waiting weeks, or even months, to get the information they need.
Only by making data more agile can businesses overcome the cost of delay, reduce the time it takes to provision data for critical applications and eliminate bottlenecks between teams. The ability to copy, secure and deliver near real-time data on demand has become the jewel in the crown for business decision makers.
To achieve this level of agility, organisations must introduce a new layer into IT infrastructure that shifts the business towards self-service and automation. For example, by taking a snapshot of the original data and then applying new tools like virtualisation at the data level, it is possible to replicate virtual sets of data on demand. Through this approach, data sets can be refreshed and reset on demand, environments can be bookmarked and shared between users and data can be rewound instantly to any point in time – all without the need for multiple parties to intervene with manual processes.
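The snapshot-plus-virtualisation idea described above can be sketched in a few lines. This is a minimal illustration of the underlying copy-on-write principle, not any vendor’s implementation; all class and method names here are hypothetical.

```python
class VirtualDataset:
    """A lightweight virtual copy of a data set: it shares the parent
    snapshot read-only and stores only this copy's own changes
    (copy-on-write), so many copies cost little extra storage."""

    def __init__(self, snapshot):
        self._snapshot = snapshot   # shared, read-only baseline
        self._changes = {}          # per-copy deltas only
        self._bookmarks = {}        # named states that can be shared or rewound to

    def read(self, key):
        # Prefer this copy's own change; fall back to the shared snapshot.
        return self._changes.get(key, self._snapshot.get(key))

    def write(self, key, value):
        self._changes[key] = value  # only the delta is stored

    def bookmark(self, name):
        self._bookmarks[name] = dict(self._changes)

    def rewind(self, name):
        self._changes = dict(self._bookmarks[name])

    def reset(self):
        self._changes.clear()       # back to the pristine snapshot, instantly

# One physical snapshot serves many independent virtual copies on demand.
production_snapshot = {"customer:1": "Alice", "customer:2": "Bob"}
dev = VirtualDataset(production_snapshot)
test = VirtualDataset(production_snapshot)
dev.write("customer:1", "Alice-edited")  # dev's change never touches test's view
```

Because each virtual copy holds only its deltas, refreshing, resetting or rewinding it is a metadata operation rather than a bulk data move, which is what makes self-service provisioning fast.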
Adding data security into the mix
More importantly, IT can take back control. Currently, 90 per cent of the data that exists within an organisation is a duplicate. By centralising data management, it’s possible to reduce an organisation’s risk exposure and start to apply additional controls, such as data masking, that enhance security further. After all, it is IT that answers for the fallout whenever companies fall foul of an attack, so the security of agile data needs to be an absolute priority.
While many have put data masking, the process of scrambling sensitive data, into practice, it has traditionally been a costly, time-consuming and labour-intensive process. Bringing it into a service-based model within the virtualised layer means organisations can readily extend masked data to any environment. Teams can then speed up app development by creating one clone of the database for each team to test against the freshest data, as if they were the only ones using it.
IT can retain control by setting the data masking policy, data retention rules and who has access to it. Developers, testers and analysts can all provision, refresh and reset data and safeguard information for whoever needs it, for whatever project.
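A policy-driven masking step of the kind described above can be sketched as follows. This is a simplified illustration under assumed names (the policy table, column names and masking functions are all hypothetical); real masking tools preserve formats and referential integrity as well.

```python
import hashlib

# Hypothetical masking policy, set centrally by IT: which columns are
# sensitive and how each one is scrambled before data leaves production.
MASKING_POLICY = {
    # Deterministic pseudonym so the same person masks to the same token,
    # which keeps joins and test cases consistent across data sets.
    "name": lambda v: "USER_" + hashlib.sha256(v.encode()).hexdigest()[:8],
    # Irreversibly replace contact details with a safe placeholder.
    "email": lambda v: "masked@example.com",
}

def mask_rows(rows, policy=MASKING_POLICY):
    """Return masked copies of each row; columns not in the policy pass through."""
    return [
        {col: policy[col](val) if col in policy else val
         for col, val in row.items()}
        for row in rows
    ]

production_rows = [{"name": "Alice", "email": "alice@corp.test", "plan": "gold"}]
safe_rows = mask_rows(production_rows)  # what developers and testers actually see
```

Applying the policy once, at the virtual layer, means every clone handed to a developer, tester or analyst is masked by construction rather than by per-project manual effort.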
Removing the dependency
With many of today’s businesses trapped by a dependency on IT infrastructure to deliver complete, protected data on time, a change in approach must be prioritised. The ability to create and deliver innovative new services that delight existing customers, compete with industry heavyweights and increase wallet share depends on how well organisations balance speed, agility and quality.
Only those who embrace technologies like Data as a Service have a hope of joining the innovation race and supporting the on-demand access to secure data that will be the key to deploying fast, failing fast, learning fast and improving fast.
Iain Chidgey, vice president and general manager, Delphix