Distil Networks recently released its fifth annual Bad Bot Report, titled “Bad Bot Report 2018: The Year Bad Bots Went Mainstream.” The research analyzed hundreds of billions of bad bot requests at the application layer to provide insight and guidance on the nature and impact of automated threats in 2017. The findings come from Distil’s newly launched Distil Research Lab, a team of dedicated analysts who examine the most sophisticated automated threats for some of the world’s most attacked websites.
“This year bots took over public conversation, as the FBI continues its investigation into Russia’s involvement in the 2016 U.S. presidential election and new legislation made way for stricter regulations,” said Tiffany Olson Jones, CEO of Distil Networks. “Yet, as awareness grows, bot traffic and sophistication continue to escalate at an alarming rate. Despite bad bot awareness being at an all-time high, this year’s Bad Bot Report illustrates that no industry is immune to automated threats and constant vigilance is required in order to thwart attacks of this kind.”
Bad bots are used by competitors, hackers and fraudsters and are the key culprits behind web scraping, brute force attacks, competitive data mining, online fraud, account hijacking, data theft, spam, digital ad fraud and downtime. The report revealed an increase in bad bot traffic over 2016 and illustrated how public perception of bots has impacted enterprise behavior, such as handling abusive traffic from foreign IP addresses.
Key Findings from the 2018 Bad Bot Report
In 2017, bad bots accounted for 21.8 percent of all website traffic, a 9.5 percent increase over the previous year. Good bots increased by 8.7 percent to make up 20.4 percent of all website traffic.
For the first time, Russia became the most blocked country, with one in five companies (20.7 percent) implementing country-specific IP block requests. Last year’s leader, China, dropped down to sixth place with 8.3 percent.
Gambling companies and airlines suffer from higher proportions of bad bot traffic than other industries, with 53.1 percent and 43.9 percent of traffic coming from bad bots, respectively. Ecommerce, healthcare and ticketing websites suffer from highly sophisticated bots, which are difficult to detect.
83.2 percent of bad bots report their user agent as a web browser such as Chrome, Firefox, Safari or Internet Explorer. Another 10.4 percent claim to come from mobile browsers such as Safari Mobile, Android or Opera.
82.7 percent of bad bot traffic emanated from data centers in 2017, compared to 60.1 percent in 2016. The availability and low cost of cloud computing explains the dominance of data center use.
74 percent of bad bot traffic is made up of moderate or sophisticated bots, which evade detection by distributing their attacks across multiple IP addresses or by simulating human behavior such as mouse movements and mobile swipes (see the sketch following these findings).
Account takeover attacks occur 2-3 times per month on the average website, but immediately following a breach, they are 3x more frequent, as bot operators know that people reuse the same credentials across multiple websites.
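To make the mechanics behind these findings concrete, the following is a minimal, hypothetical Python sketch (not taken from Distil’s product or the report itself) of the two naive defenses the findings describe bots slipping past: a self-reported User-Agent check and a per-IP request counter. The browser tokens and the 60-requests-per-minute threshold are illustrative assumptions; the point is that a spoofed User-Agent passes the first check, while traffic spread across many data-center IP addresses, or sent “low and slow,” stays under the second.

```python
import time
from collections import defaultdict

BROWSER_TOKENS = ("Chrome", "Firefox", "Safari", "MSIE", "Trident")
REQUESTS_PER_MINUTE_LIMIT = 60  # hypothetical per-IP threshold

window_start = time.time()
requests_per_ip = defaultdict(int)  # request counts per source IP in the current window

def looks_like_browser(user_agent: str) -> bool:
    # The User-Agent header is self-reported, so a bad bot can simply
    # claim to be Chrome, which is how most bad bots pass checks like this.
    return any(token in user_agent for token in BROWSER_TOKENS)

def within_rate_limit(ip: str) -> bool:
    global window_start
    if time.time() - window_start > 60:  # reset the counters every minute
        requests_per_ip.clear()
        window_start = time.time()
    requests_per_ip[ip] += 1
    # A bot that spreads requests across thousands of data-center IPs,
    # or goes "low and slow," never crosses this per-IP threshold.
    return requests_per_ip[ip] <= REQUESTS_PER_MINUTE_LIMIT

def allow_request(ip: str, user_agent: str) -> bool:
    return looks_like_browser(user_agent) and within_rate_limit(ip)

# A spoofed request sails through both checks:
print(allow_request("203.0.113.10", "Mozilla/5.0 (Windows NT 10.0) Chrome/63.0"))  # True
```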
The report also includes an explanation and categorization of bad bot behavior, as well as a deep dive into last year’s GiftGhostBot discovery.
According to Gartner, in Hype Cycle for Application Security, 2017: “Although bots have existed for more than 10 years in the form of applications or scripts, such as search engine crawlers and security testing and exploitation tools, they are increasingly employed to abuse web application and API functionality. Malicious or undesirable uses, such as content scraping, scalping, fraud and credential stuffing are affecting a growing number of organizations’ web assets. Bots can be distributed on multiple hosts to perform automated distributed denial of service (DDoS), but can also be ‘low and slow,’ use browser automation or other evasion techniques to bypass existing web application security controls, such as IP blacklisting and rate limiting. The rise of more sophisticated bots in recent years therefore requires greater sophistication in detection and response.”
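As a companion to the Gartner observation above, here is another small, hypothetical sketch, an assumption-laden illustration rather than anyone’s actual implementation, of the static IP blacklisting the quote says sophisticated bots bypass. The blocked ranges below are documentation-only example addresses; an operator rotating through fresh cloud or residential IPs simply never appears on such a list.

```python
import ipaddress

# Hypothetical static blacklist: individual addresses and whole ranges
# (for example, ranges an operator associates with a blocked country or data center).
BLOCKED_NETWORKS = [
    ipaddress.ip_network("198.51.100.0/24"),
    ipaddress.ip_network("203.0.113.42/32"),
]

def is_blacklisted(ip: str) -> bool:
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in BLOCKED_NETWORKS)

# A bot operator who rotates through fresh cloud IPs never matches the list,
# so every request below is allowed through.
for source in ("192.0.2.7", "192.0.2.8", "192.0.2.9"):
    print(source, "blocked" if is_blacklisted(source) else "allowed")
```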