Web traffic - the actions of real people, or of other computers? A recent study has the answer.
Cloud-based security service Incapsula has revealed research indicating that 51 per cent of website traffic comes from automated software programs, many of them designed for malicious activity.
According to ZDNet, only around 49 per cent of a site’s visitors are genuine humans. The non-human traffic is largely invisible because it is not recognised by analytics software, leaving many sites exposed to a much higher risk of business disruption, among other damaging consequences.
The breakdown of an average site’s traffic is as follows:
- 5% from hacking tools probing for new or unpatched vulnerabilities in a site.
- 5% from scrapers.
- 2% from automated comment spammers.
- 19% from “spies” collating competitive intelligence.
- 20% from search engines (non-human but benign traffic).
- 49% from people browsing the Internet.
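The figures above hint at why so much automated traffic slips past analytics tools: the simplest detection method only catches bots that identify themselves in their User-Agent header. The Python sketch below is purely illustrative and not part of Incapsula’s research; the function name and signature list are assumptions for demonstration.

```python
# Illustrative sketch: a naive User-Agent check of the kind basic analytics
# relies on. Malicious bots often spoof a browser User-Agent, which is one
# reason so much non-human traffic goes unrecognised.
BOT_SIGNATURES = ("bot", "crawler", "spider", "scraper", "curl", "wget")

def looks_automated(user_agent: str) -> bool:
    """Return True if the User-Agent string contains a known bot signature."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

requests = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",  # human browser
    "Googlebot/2.1 (+http://www.google.com/bot.html)",               # benign crawler
    "curl/7.88.1",                                                   # scripted client
]
flagged = [ua for ua in requests if looks_automated(ua)]
```

A check like this would flag only the crawler and the scripted client above; a bot sending a forged browser User-Agent would pass straight through, which is why services such as Incapsula rely on behavioural signals instead.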
Co-founder of Incapsula, Marc Gaffan, said: “Few people realize how much of their traffic is non-human, and that much of it is potentially harmful.”
“Because we have thousands of web sites as customers, we spot exploits way ahead of others and we can then block them for all our customers. That’s the benefit of scale. We also maintain a virtual patch service that prevents harmful exploits days and sometimes weeks before a patch is ready.”