Bad bots account for almost 40% of internet traffic


Almost 40% of all traffic on the internet is “bad bot” activity, according to the latest industry data.

Automated bots, including those from search engines and social media networks, make up 64% of all internet traffic, the Barracuda Networks report found.

The “Bot Attacks: Top Threats and Trends” report found that good bots accounted for only a quarter of overall traffic, while nearly two-fifths (39%) came from “bad bots.” The report said these bad bots included basic web scrapers, attack scripts, and advanced persistent bots.

“These advanced bots try their best to evade standard defenses and attempt to perform their malicious activities under the radar. In our dataset, the most common of these persistent bots were ones that went after e-commerce applications and login portals,” the report said.

North America had the dubious distinction of accounting for the largest share of bad bot traffic at 67%, most of it originating from data centers, followed by Europe and then Asia, according to the report. The report said European bot traffic is more likely to come from hosting services (VPS) or residential IPs than North American traffic.

Although automated, these bad bots were designed to operate during standard working hours, the report found.


“The attackers running these bad bots prefer to hide within the normal human traffic stream to avoid raising alarm bells. The common stereotype of a ‘hacker’ performing their attacks late into the night in a dark room with green fonts on a black screen has been replaced by people who set up their bots to carry out the automated attacks while they go about their day,” the report said.

Barracuda Networks gave some examples of bad bot activity, including a bot that pretended to be a known vulnerability scanner. This bot attempted to perform reconnaissance and probe for vulnerabilities using some basic attacks. It used a standard browser user agent but added custom HTTP headers that spoofed the headers of a scanner the victim organization used.
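To illustrate the pattern Barracuda describes, the sketch below shows what such a request might look like: an ordinary browser user agent paired with extra headers imitating a scanner the target already trusts. The header names, values, and target URL are hypothetical assumptions for illustration, not details taken from the report.

```python
# Minimal sketch (hypothetical values): a bot request that combines a normal
# browser User-Agent with custom headers spoofing an internal scanner.
import requests

TARGET = "https://example.com/login"  # placeholder target, not from the report

headers = {
    # Looks like an ordinary desktop browser...
    "User-Agent": (
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
        "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0 Safari/537.36"
    ),
    # ...but carries extra headers mimicking a scanner the organization uses.
    "X-Scanner-ID": "acme-vuln-scanner",     # hypothetical header name and value
    "X-Scan-Session": "weekly-audit-0042",   # hypothetical header name and value
}

response = requests.get(TARGET, headers=headers, timeout=10)
print(response.status_code)
```

Because the traffic carries a familiar user agent and scanner-style headers, it can blend in with legitimate security tooling unless defenders verify where scans actually originate.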

"While some bots like search engine crawlers are good, our research shows that over 60% of bots are dedicated to carrying out malicious activities at scale," said Nitzan Miron, VP of Product Management, Application Security, Barracuda.

"When left unchecked, these bad bots can steal data, affect site performance, and even lead to a breach. That's why it's critically important to detect and effectively block bot traffic."

Rene Millman

Rene Millman is a freelance writer and broadcaster who covers cybersecurity, AI, IoT, and the cloud. He also works as a contributing analyst at GigaOm and has previously worked as an analyst for Gartner covering the infrastructure market. He has made numerous television appearances to give his views and expertise on technology trends and companies that affect and shape our lives. You can follow Rene Millman on Twitter.