No-stalgia: why cybersecurity can’t keep looking back

(Image credit: Wright Studio / Shutterstock)

According to Verizon’s DBIR last year, social engineering showed the largest year-on-year increase in threat actions, and the human user has been the fastest-growing target category for seven years running. Furthermore, according to the UK ICO, over 90 per cent of data breaches are caused by human error.

From these stats, it is clear that attackers are winning, but their advantage is not that they have the best tech or the best minds – it is that they are better organised. At present, security professionals live with a cognitive dissonance: we understand the problem, yet we continue to deploy the same solutions and insist on the same poor practices. The cybersecurity industry keeps looking backwards, whilst our enemies look forward – no wonder we are falling behind.

The technology arms race continues, and we keep looking for techies to staff up our teams, spending a disproportionate amount of our budgets on technical security and creating a self-inflicted gap in our defences. We devote an inordinate amount of energy preparing for the most sophisticated threat vectors, while most incidents occur due to basic issues, or unforced errors.

Preparing for the worst does not prepare you for the likely, and we have been thinking in this broken way for too long. As an ex-CISO, I can understand that it is hard to adjust this mindset. We have been doing the same things for so long that making widescale changes can seem impossible, but the adjustments don’t have to be huge. All I ask is that as an industry we start listening to our users instead of vilifying them, and think outside the box instead of clinging to outdated habits.

Resketching security roles

The outcome we want is a sustainable, resilient cybersecurity standard, not a continuous game of technical cyber-ping pong. To achieve this, as security leaders, we have to reorganise our defences, and rethink our “culture”. This starts with a resketching of security roles.

It is unreasonable to call our end users the first line of defence and hold them accountable for cyber-hygiene. Yes, we need to continue to educate users, but it is time to shift our focus and work smarter. We have to start listening to users instead of demonising them, to understand them and their behaviour and quantify the human risk so that we can deliver processes and technical security that works for both users and the organisation.

As security leaders, we must take full responsibility and accept that we and our teams are, and will continue to be, the first line of defence, instead of expecting our users to take responsibility for keeping organisations safe. Armed forces don’t expect ordinary citizens to go to war, so why are we, as highly trained professionals, expecting our untrained users to defend our organisations?

This isn’t an attack on technical CISOs. Their focus on patching critical technical vulnerabilities is paramount, but they are ultimately accountable for reducing cyber-exposure for their users, ensuring that the business has the right processes in place and that users have the basic tools at their disposal, enabling them to behave securely while remaining productive. Training their users is only the starting point, as most workers simply don’t have the capacity to think about security and keeping the company safe – they are busy enough with their own jobs to have to take on the extra burden of cybersecurity.

Changing security culture

We need to change our culture from one of elitism to one of inclusion. Expanding the talent pool and filling our talent gap means finding defenders outside the tech community. Let's stop being science, technology, engineering, and mathematics snobs and start to think outside the box a little. One untapped resource is neurodiverse talent – people whose brains are simply wired differently. They can bring new perspectives to a company’s security stance because they solve problems in different ways. We are also increasingly seeing psychology being brought into cybersecurity, especially in the way we train and communicate with users. I welcome these steps and would encourage the industry to keep thinking in this way. We must be open to considering potential, and not just expertise, when we hire.

Let's create a culture of pragmatism. I am not saying we need to shift our focus from tech or lower our technical security standards, but we should take care of the basics first: tools for information classification, multifactor authentication and easy-to-use encryption. Building a best-in-breed SOC that puts three monitors on the desk of each security analyst may make us feel like technical wizards, but it will never be enough to keep up with the velocity of cyber-threats if we haven’t got the basics right.

If we want to build sustainable cybersecurity and win the war, we must reorganise our defences and rethink our culture as defenders. We have to go beyond just talking about it and be true to the concept of cybersecurity being People, Process and Technology, in this exact order. For too long we have prioritised the Technology and Process elements of security, whilst People have been neglected, despite accounting for the vast majority of security incidents.

To address the People component, we need a dramatic shift in how we think about security – instead of blaming users, listen to them; hire people that bring something new to the table and challenge the traditional way of thinking. We have to throw off the broken, outdated practices and policies that have continuously failed us and understand that training users is only the starting point, not the answer to all our problems.

Flavius Plesu, founder and CEO, OutThink

Flavius Plesu is founder and CEO of OutThink, the world’s first Predictive Human Risk Intelligence platform (SaaS), aimed at revolutionising security awareness and giving security teams the power that comes with identifying high-risk users – fully understanding who is not behaving securely, and why.