Why businesses need to reconsider their shark-cage approach to cyber security

(Image credit: Den Rise / Shutterstock)

With the introduction of tighter data protection laws earlier this year – most notably the enactment of the EU's GDPR – businesses have had to considerably adjust their approach to cyber security. More than ever, organisations are being held to account for their responsibility to protect customers' sensitive data and personal details.

This is already having a profound impact on how consumers view brands. A recent study by Veritas found that 47 per cent of respondents would consider switching to a competitor in the event of a data breach – enough to have most large corporations running for cover, and hastily building digital forts that often neglect the kind of loopholes in infrastructure that hackers strive to discover. Beyond reputation, data breaches can cost corporations millions in damages. If we consider Facebook as an example, $1.63 billion can certainly be viewed as a hefty fine for failing to adequately protect user data.

In response to the new legislation and a series of very public leaks, many businesses have had to address the issue of data protection afresh. Discussion around this topic has been ongoing for several years, but GDPR has brought it back into focus. How data is managed, stored and handled internally has proved problematic for large corporations. It's also worth noting that this issue transcends Europe – it is a global problem for global corporations.

Personal data stores – a gold mine for hackers

Customer data is the new currency of the digital era, and its value has only recently begun to be understood. Other valuable commodities up for grabs during a breach include business-critical IPR, which can be extracted from systems and networks that are not properly safeguarded; data controllers and processors are very much in the firing line if this intellectual property is stolen. The journey data undergoes once it has been submitted to a company is a long one, and it is exposed to several risks along the way. One analogy is swimming in an ocean full of sharks: shedding a single drop of blood in an ocean of hackers can invite savage attacks from spyware operators and organised crime outfits. Unsurprisingly, once hackers know that a company's security measures are flawed, their attacks become more focused and direct. Supporting this, FireEye's M-Trends 2018 report found that 56 per cent of organisations that were targets of a significant attack in the last year were targeted again in the same period.

Isolationist attitudes to data protection

The current approach to tackling this is very much centred on an ethos of self-preservation. Constructing a self-preserving sphere, however, neglects wider issues that can prove problematic – and costly – for large corporations.

No company can afford to become shark bait. In the wake of GDPR, businesses have been frantically sourcing tools and procedures to combat cybercrime and avoid potential data breaches. The first port of call is usually to fortify in-house systems and infrastructure – a logical first step, but an isolated method that is short-sighted in the long run. The protective shark cage can be constructed, yes – but is it necessarily the right approach?

In today’s global economy, it doesn’t benefit businesses to view themselves as standing in isolation. Companies are affiliated with all sorts of external vendors, suppliers and clients in their extended supply chain, yet these relationships are rarely taken into consideration. Partner businesses are inextricably linked, and any one of them can open a backdoor, deliberately or accidentally – much to the detriment of brands like Facebook. Data that leaves local storage to travel across global networks is at risk of attack, and the craftier cyber criminals recognise that data is most vulnerable in this state of transit rather than at rest in secured network end points. On average, it takes organisations 191 days to identify a data breach, which could partly be attributed to this transient data being targeted outside the secured network. In recent years we have witnessed significant damage caused by third-party attacks, as criminals sneak upwards through the supply chain. Their goal is to identify weaknesses and entry points, and often they find them. Aside from threats against carrier networks themselves, criminals have also been known to siphon off data from network platforms rather than individual devices.

Moving forward: providing holistic data protection

With all of this in mind, how can organisations tackle these ongoing cyber threats? Clearly, the individualistic approach hasn’t proved effective so far. Hackers will target any weakness unapologetically and go for the underbelly of the beast, so to speak. Your company might not be a victim this time, but how long should you leave data exposed to an ocean of cybercrime? Rather than relying on shark cages, firms should consider searching for shark-free water instead. Network providers need the right tools and strategy in place to protect the volume, velocity and value of data crossing their infrastructure, providing a ‘clean’ network for their partners. A recent survey by the Ponemon Institute found that 77 per cent of a sample of 2,800 IT professionals admitted to not having a formal cyber security incident response plan – worrying news given the risks at hand. Without a clear strategy for handling breaches, it’s more important than ever to bolster the boat and avoid the risk of leakage in the first place.

Investment in security tools and solutions built for large, expansive networks is crucial for operators. Few scalable solutions can cope with the enormity and reach of the threats currently facing carrier-grade networks. A multi-layered approach to security is arguably the only way to create a true barrier against such a multitude of threats – tools designed to operate in a 100Gbps environment are what’s needed. Operators need technology that can detect and prevent cyber attacks and breaches by providing total visibility, and that can monitor network performance and deliver diagnostics by analysing the traffic that traverses the network. The technology also needs to be able to detect threats and anomalies by matching traffic against signatures of known threats. Several components are required to protect network infrastructure at this level, including:

  • integrated network visibility software
  • real-time monitoring tools
  • scrutiny of every packet
  • event-driven intrusion detection systems
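To illustrate the signature-matching component described above, here is a minimal sketch of the core idea: comparing packet payloads against a database of known-bad byte patterns and raising an alert on a match. The signature names and byte patterns below are illustrative placeholders invented for this example, not real threat indicators, and a carrier-grade system would do this in hardware-accelerated pipelines rather than in Python.

```python
# Minimal sketch of signature-based threat detection.
# Signatures here are hypothetical placeholders, not real indicators.
KNOWN_SIGNATURES = {
    "eicar-test": b"EICAR-STANDARD-ANTIVIRUS-TEST-FILE",
    "fake-exploit": b"\xde\xad\xbe\xef",
}

def match_signatures(payload: bytes) -> list[str]:
    """Return the names of all known signatures found in a packet payload."""
    return [name for name, sig in KNOWN_SIGNATURES.items() if sig in payload]

def inspect_packet(payload: bytes) -> list[str]:
    """Check one payload; in a real system, hits would feed an event-driven IDS."""
    hits = match_signatures(payload)
    if hits:
        print(f"ALERT: payload matched signatures {hits}")
    return hits

# Example: a payload carrying one of the placeholder signatures.
inspect_packet(b"GET /index.html\r\n\xde\xad\xbe\xef")
```

Real deployments use far more sophisticated matching (multi-pattern automata, protocol-aware rules, anomaly scoring), but the principle of scrutinising every packet against known threat signatures is the same.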

Public and private organisations should demand all of this from their network provider as standard, and it is what providers of carrier-grade networks should be more than willing to deliver. As one technology giant suggested years ago, the network is the computer – and this has never been truer. Safeguarding the network properly means protecting all of its extremities.

It’s time for companies to rethink their centralised approach to cyber security.

Steve Patton, Cyber Security Specialist & Director, Telesoft

Steve is an experienced technical B2B cyber security specialist and Director. Steve is a frequent speaker on topics including security breaches, big data analytics, audit and compliance, and IT forensics.