Biodiversity is an important concept in ecology – for example, the World Wildlife Fund reports that the island of Borneo has 15,000 plant species, a figure that rivals the entire continent of Africa. This diversity of life represents a huge potential resource for humanity and a challenge for the country itself to manage.
What does this example mean for IT professionals, and what can they learn?
Today, the complexity of new digital environments has led to a new situation for many organisations: digital biodiversity. The range of platforms, software ecosystems and approaches to delivering services back to the business has led to more variance across IT. Each of these platforms develops and grows independently, evolving to deliver what the business needs. Nevertheless, for those responsible for managing this IT, this diversity creates a serious visibility problem.
In Borneo, new species are being discovered at a rate of three a month. Visibility across IT poses a similar challenge, with new devices constantly entering and leaving the network. Yet this visibility is essential, as you can’t assess what you do not see; you can’t defend what you do not see; you can’t secure what you do not see.
Device diversity in IT leads to more problems
The problem that IT has to deal with is systemic – each new approach brings new potential problems with it. For example, traditional software development methodologies based on the Waterfall model evolved into Agile software delivery and then DevOps. This brought an acceleration in code delivery that was hardly conceivable before, getting software into production faster through continuous integration and continuous deployment (CI/CD) pipelines.
That said, this speed of delivery creates another problem. As part of the software process, the role of digital certificates expanded, and they became universally adopted for protecting the integrity and confidentiality of business applications. However, the responsibility for tracking and managing those certificates did not grow alongside this increase in speed. Certificate expirations therefore pose a serious threat to business continuity and application availability.
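Tracking expirations across an inventory can be sketched in a few lines. The hostnames, dates and 30-day warning threshold below are purely illustrative:

```python
# Minimal sketch of certificate-expiry tracking across an inventory.
# Hostnames, dates and the 30-day threshold are illustrative only.
from datetime import datetime, timezone

NOT_AFTER_FMT = "%b %d %H:%M:%S %Y %Z"  # e.g. "Jan 05 00:00:00 2030 GMT"

def days_until_expiry(not_after: str, now: datetime) -> int:
    """Days remaining before a certificate's notAfter timestamp."""
    expires = datetime.strptime(not_after, NOT_AFTER_FMT)
    expires = expires.replace(tzinfo=timezone.utc)
    return (expires - now).days

def expiring_soon(inventory, now, threshold_days=30):
    """Return hostnames whose certificates expire within the threshold."""
    return [host for host, not_after in inventory
            if days_until_expiry(not_after, now) <= threshold_days]

inventory = [
    ("app.example.com", "Mar 01 00:00:00 2030 GMT"),
    ("api.example.com", "Jan 05 00:00:00 2030 GMT"),
]
now = datetime(2029, 12, 20, tzinfo=timezone.utc)
print(expiring_soon(inventory, now))  # only api.example.com is within 30 days
```

In practice the expiry dates would come from scanning live endpoints or from a certificate management system rather than a hard-coded list; the point is that expiry must be checked continuously, not at deployment time.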
Similarly, cloud environments created a new way of using traditional IT implementations with Infrastructure, Platform and Software as a Service. Each of these options promises faster delivery and agility, but each opens up new problems around shared responsibility for those projects. Without the right understanding of these assets, companies can introduce new risks.
Alongside these major shifts, new technologies continue to complicate the picture too. Software containers like Docker defined new standards and sometimes new meanings for words like agility, velocity and time-to-market. What once was called Bring Your Own Device evolved into a complex, pervasive and ubiquitous computing approach that now is called Enterprise Mobility, where user devices and company-owned machines are used interchangeably by employees.
Traditional data centres and corporate networks have been enriched with a new range of devices, from specialised hardware and tablets to smart vending machines and automated ways to control ambient lights or temperature. It might seem like everything has an IP address, a connection, and is on the network. Yet tracing and tracking all these devices is not easy.
Visibility is a big data problem
Getting visibility in place across IT is a challenge worthy of Gartner’s original definition of big data, which was based on work by Doug Laney in the early 2000s. Visibility maps well to the three Vs:
- The first V is for Volume, because digital transformation brought in a huge number of assets and resources unseen previously.
- The second V is for Velocity, the pace at which these resources are changing.
- The third and most subtle V is for Variety, which covers the range of different devices, software assets, hardware and platforms that might be in place, as well as how current those assets are.
Getting visibility across all these assets therefore involves understanding the distinct challenges that exist across IT. Thinking about IT as a mix of different ecosystems with their own digital biodiversity can help.
To start with, you need specialised eyes to collect data about every part of your IT landscape, no matter how diversified it is. For example, you cannot treat a project that has lifted a set of existing virtual machines into the cloud in the same way as a new cloud-based application running on microservices and containers.
Once you have properly distributed eyes, you need to have a brain able to collect data coming from the digital biodiversity. This central system should provide a single source of truth for all IT implementations and ensure they are up to date and accurate. In practice, this means taking data on IT assets, then normalising and categorising this data from multiple systems, then overlaying intelligence and context to help transform this raw data into consumable and actionable information.
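The normalise-then-categorise step can be sketched as follows. The two feeds (a network scanner and a cloud API), their field names and the category rules are all hypothetical, chosen only to show how differently shaped records collapse into one schema:

```python
# Minimal sketch, assuming two hypothetical feeds that report the same
# assets under different field names. Normalisation maps both onto one
# schema; categorisation overlays context; merging builds the single
# source of truth.
def normalise(record: dict, source: str) -> dict:
    """Map a source-specific record onto a common asset schema."""
    if source == "scanner":
        return {"hostname": record["host"].lower(),
                "ip": record["addr"],
                "os": record.get("os_name", "unknown")}
    if source == "cloud":
        return {"hostname": record["name"].lower(),
                "ip": record["private_ip"],
                "os": record.get("image_os", "unknown")}
    raise ValueError(f"unknown source: {source}")

def categorise(asset: dict) -> dict:
    """Illustrative rule: tag the asset type from its OS string."""
    os_name = asset["os"].lower()
    if "linux" in os_name or "server" in os_name:
        asset["category"] = "server"
    elif "windows" in os_name or "mac" in os_name:
        asset["category"] = "endpoint"
    else:
        asset["category"] = "unclassified"
    return asset

def merge(feeds: dict) -> dict:
    """Build a single source of truth keyed by hostname."""
    inventory = {}
    for source, records in feeds.items():
        for record in records:
            asset = categorise(normalise(record, source))
            inventory.setdefault(asset["hostname"], {}).update(asset)
    return inventory

feeds = {
    "scanner": [{"host": "DB01", "addr": "10.0.0.5", "os_name": "Ubuntu Linux"}],
    "cloud":   [{"name": "db01", "private_ip": "10.0.0.5", "image_os": "Ubuntu Linux"}],
}
inventory = merge(feeds)
print(len(inventory), inventory["db01"]["category"])  # both feeds collapse into one asset
```

The design point is that deduplication only works after normalisation: as long as the scanner says `DB01` and the cloud API says `db01`, the two records look like two assets.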
It’s also important to understand how fast this data changes. For traditional IT instances, like data centres, the pace of change may be low as new application projects occur infrequently; any change could be easily captured. However, desktop estates change much more frequently as new PCs are implemented and imaged to replace old, obsolete or broken machines. In the software world, cloud services and applications can change even more rapidly, with new updates being implemented all the time. Keeping up with this change – regardless of what sources of data are used – is therefore essential.
Once this information is in place, it can support multiple processes across the organisation, harmonising IT, Security, Compliance, Procurement, SecOps, Incident Response and countless other teams. Equally, it should be built to keep pace with the rate of expansion across your IT landscape, rather than constantly needing its own updates or another terabyte of storage. With so much IT moving to the cloud, a cloud-based approach makes the most sense.
With this visibility in place, data becomes a tool to be used rather than a burden. For teams involved in keeping up with IT – like security and IT asset management – this data becomes a key collaboration point that makes everyone more productive. This information can also flow automatically to other platforms and surrounding technologies, using standard APIs to ensure interoperability, minimise human error through automation and enable transparent orchestration of workflows.
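That orchestration pattern can be sketched as a small event bus that fans asset events out to downstream systems. The event name and the two subscribers (a ticketing system and a SIEM) are hypothetical; in a real deployment each handler would call the downstream platform’s REST API:

```python
# Minimal sketch of workflow orchestration: asset events fan out to
# registered handlers. Event names and subscribers are hypothetical.
from collections import defaultdict

class AssetEventBus:
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, event_type: str, handler):
        self._handlers[event_type].append(handler)

    def publish(self, event_type: str, payload: dict):
        # Each subscriber would typically call a downstream API here.
        return [handler(payload) for handler in self._handlers[event_type]]

bus = AssetEventBus()
bus.subscribe("cert.expiring", lambda e: f"ticket opened for {e['host']}")
bus.subscribe("cert.expiring", lambda e: f"SIEM alert for {e['host']}")

results = bus.publish("cert.expiring", {"host": "api.example.com"})
print(results)
```

Publishing once and letting subscribers react is what keeps humans out of the copy-paste loop between tools.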
Digital biodiversity needs data
It is important to recognise the value that digital biodiversity represents. Each new platform supports the business in new ways, to achieve new goals. For IT, stopping this shift would be the equivalent of cutting down the rainforest – while it might achieve short-term goals to reduce costs, it would represent a huge long-term loss. By working across each platform and making better use of data, you can achieve far better results.
Keeping up with new IT implementations and changes in behaviour means understanding the whole ecosystem that exists. With so many new approaches and new changes taking place, it’s impossible for individuals to manage this manually. Making use of asset data and automation can help improve this process.
Marco Rottigni, Chief Technical Security Officer, Qualys