The pressures on security teams are mounting as organisations advance their digital transformation initiatives. The move towards cloud-first, or even cloud-only, infrastructure is inevitably increasing workloads for security professionals. Not only are they being brought in to secure new services, they are still expected to ensure the security of existing on-premise infrastructure and assets. And while the ultimate goal of embracing a cloud-only perspective may be to remove network fragmentation, the CISO and their team have to manage the stresses of increased fragmentation with the deployment of each new device or service.
Cybersecurity is complex, and that complexity shows no sign of dissipating anytime soon. Managing the security of private clouds – often maintained by a range of cloud service providers (CSPs) and frequently misconfigured – alongside that of public clouds is just one example of how innovation is making life more difficult for security teams.
Hybrid networks have become the new normal for most organisations – they have to manage an ever-expanding attack surface made up of on-premise IT, OT, cloud and third-party environments. There are a lot of plates to keep spinning and there is an increasing burden placed on the CISO to prevent any breakages. The need for strong, focused and enduring risk management strategies has never been clearer. Fluid security is one such strategy, used by some of the world’s largest, most complex organisations.
The concept of fluid security focusses on developing a unified, agnostic and continuous security program that can support rapidly evolving enterprise needs with minimal impact on the rest of the organisation. This means creating processes that can manage the security of existing infrastructure through to its retirement, and ensuring the diverse security environment is free of redundancy – not just in technology, but in personnel and processes too.
Keep data equality front of mind
Being agnostic in a security sense means that, irrespective of where data comes from, it must be regularly collected in a central location, and like data must be normalised and amalgamated – regardless of the type of environment, vendor or tool. Normalising data means the information fits a taxonomy universal to the organisation, rather than being vendor-specific. Merging data results in clean datasets without duplicates, so the resulting analysis and other processes can be more efficient. When seeking to simplify and centralise the management of a complex, fragmented environment, these data handling practices must be the first steps taken to ensure success.
These practices also ensure that all processes are founded on an accurate, complete and consistent dataset. Take vulnerability prioritisation as an example: running an analysis on one complete set of vulnerability occurrences is far more efficient than analysing multiple sets from numerous scanners, running both on-premise and in the cloud, each producing a different priority outcome.
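The normalise-merge-prioritise flow described above can be sketched in a few lines of Python. This is a minimal illustration, not a real scanner integration: the two scanner schemas, field names and severity scale are invented assumptions.

```python
# Hypothetical raw findings, each scanner with its own vendor-specific schema.
scanner_a = [
    {"ip": "10.0.0.5", "cve": "CVE-2021-44228", "sev": "Critical"},
    {"ip": "10.0.0.5", "cve": "CVE-2019-0708", "sev": "High"},
]
scanner_b = [
    {"host": "10.0.0.5", "vuln_id": "CVE-2021-44228", "risk_score": 10.0},
    {"host": "10.0.0.9", "vuln_id": "CVE-2019-0708", "risk_score": 7.8},
]

# Illustrative mapping from one vendor's severity labels to a common scale.
SEVERITY_MAP = {"Critical": 10.0, "High": 8.0, "Medium": 5.0, "Low": 2.0}

def normalise(record):
    """Map a vendor-specific record onto one organisation-wide taxonomy."""
    if "cve" in record:                        # scanner A's schema
        return {"asset": record["ip"],
                "cve": record["cve"],
                "score": SEVERITY_MAP[record["sev"]]}
    return {"asset": record["host"],           # scanner B's schema
            "cve": record["vuln_id"],
            "score": record["risk_score"]}

# Amalgamate and de-duplicate on (asset, CVE), keeping the higher score.
merged = {}
for record in map(normalise, scanner_a + scanner_b):
    key = (record["asset"], record["cve"])
    if key not in merged or record["score"] > merged[key]["score"]:
        merged[key] = record

# One prioritised list instead of a separate, conflicting list per scanner.
prioritised = sorted(merged.values(), key=lambda r: r["score"], reverse=True)
for r in prioritised:
    print(r["asset"], r["cve"], r["score"])
```

Once every record fits the same taxonomy, prioritisation becomes a single sort over one clean dataset rather than a reconciliation exercise across tools.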
If the first step to reducing complexity is centralising data, the next is finding a way to model that data. An always up-to-date model of the hybrid network infrastructure, security controls, assets, vulnerabilities and threats can reveal new insights into the interrelationships of an environment. Modelling can assist a range of security management processes, serving to unify teams with a complete overview of a company’s attack surface.
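One simple way to think about such a model is as a graph: nodes are assets and controls, edges are permitted connections, and reachability from an entry point gives one view of the attack surface. The sketch below assumes an invented topology with hypothetical node names; a real model would be far richer.

```python
from collections import defaultdict, deque

# Hypothetical hybrid topology: cloud and on-premise assets in one model.
edges = [
    ("internet", "cloud-lb"),
    ("cloud-lb", "web-container"),
    ("web-container", "cloud-db"),
    ("web-container", "vpn-gateway"),
    ("vpn-gateway", "onprem-erp"),
]

graph = defaultdict(set)
for src, dst in edges:
    graph[src].add(dst)

def exposed_assets(entry="internet"):
    """Breadth-first search: every asset reachable from an entry point."""
    seen, queue = set(), deque([entry])
    while queue:
        node = queue.popleft()
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(sorted(exposed_assets()))
```

Even this toy model surfaces an interrelationship that siloed views miss: an internet-facing container can expose an on-premise ERP system via the VPN gateway.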
Having a united front
A key factor that commonly contributes to the failure to secure any environment is disconnected processes. In a hybrid environment, the opportunity for process disconnect only rises, mainly because separate teams are put in charge of different areas of the network. In an increasing number of workplaces, it’s not just the security and operations teams that operate in silos, but DevOps/DevSecOps teams as well.
Although every team has its specific function, the processes involved in their day-to-day tasks must point towards one common goal. DevSecOps teams, for example, may have procedures for “security in code,” but any changes to new or existing services could have ramifications for compliance status and will need to be monitored for how they alter risk. This is where a thorough view of the cloud is crucial – with this visibility, security teams can discover and analyse vulnerabilities on services and containers. Furthermore, for policy compliance, operations teams must be able to test accessibility, security tags, cloud firewall rules and configurations.
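A compliance test of this kind can be automated as policy checks over the modelled firewall rules and resource tags. The sketch below is illustrative only: the two policies, rule fields and tag names are assumptions, not any particular cloud provider's API.

```python
# Hypothetical modelled cloud state: firewall rules and tagged resources.
rules = [
    {"name": "allow-web", "port": 443, "source": "0.0.0.0/0"},
    {"name": "allow-ssh", "port": 22, "source": "0.0.0.0/0"},   # violation
]
resources = [
    {"id": "vm-1", "tags": {"env": "prod", "owner": "team-a"}},
    {"id": "vm-2", "tags": {"env": "prod"}},                     # missing owner
]

def check_compliance(rules, resources):
    """Return a list of human-readable policy violations."""
    violations = []
    for rule in rules:
        # Example policy: management ports must never be open to the internet.
        if rule["port"] == 22 and rule["source"] == "0.0.0.0/0":
            violations.append(f"rule {rule['name']}: SSH open to the internet")
    for res in resources:
        # Example policy: every resource must carry an 'owner' tag.
        if "owner" not in res["tags"]:
            violations.append(f"resource {res['id']}: missing owner tag")
    return violations

for v in check_compliance(rules, resources):
    print(v)
```

Because the checks run against a model rather than the live platform, they can be executed as often as needed without touching production.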
These are situations where the usefulness of a hybrid environment model shines through. With an offline model updated frequently via application programming interface (API) connections, security and operations teams do not require administrative access to cloud platforms and can carry out their processes with minimal disruption to the cloud deployment. Upon discovery of a violation or risk, the issue can be eliminated simply by security and operations teams reporting back to DevSecOps and working with them to make the required amendments.
Loyalty is key
Ongoing cyber hygiene processes designed to reduce risk and compliance violations are another consideration in ensuring the longevity of any fluid security strategy. Because cloud services often have short lifecycles, there’s a tendency for teams to “set it and forget it” during deployments. However, DevOps is founded on replication – the easy construction of container-based services, the move from image to instance, and so on – so it is also easy to replicate risk, and compared with on-premise infrastructure this can happen on a faster and wider scale. That’s why cloud services must be treated with the same vigilance applied to other areas of the infrastructure, even if the tools and processes used to achieve that vigilance are different.
The future security of hybrid networks can only be guaranteed by making the data handling and unified management processes described above the standard. A fluid approach to security lays the groundwork for an established program prepared to deal with the challenges of today and to support innovation going forward. Even though cloud is considered the “must-have” technology of the decade, technological developments are moving so quickly that dynamic computing could look very different in a few years. In fact, given the speed of cloud adoption, the next solution could be here in a matter of months.
Amrit Williams, vice president of products, Skybox Security