A new study by Incapsula based on 1.45 billion bot visits to some 20,000 websites worldwide in a 90-day period found that these code-based visitors account for 61.5 percent of all website traffic, an increase of 21 percent over 2012.
The good news is that most of that growth comes from good bots, such as search engine crawlers, SEO service crawlers, and other legitimate software agents. Spam bots, meanwhile, are down from 2 percent of traffic in 2012 to 0.5 percent this year, a drop due largely to Google's efforts to discourage comment-spamming and link-spamming SEO tactics.
"We've noticed a 75 percent reduction in comment spammers, and that's really significant," says Marc Gaffan, co-founder of Incapsula.
The bad news is that 31 percent of bots are malicious. Incapsula recorded an 8 percent increase in unclassified bots with hostile intentions -- bots posing as legitimate agents, such as search engine crawlers or browser user agents. The aim of these "impersonators" is to bypass a website's security, and they are typically built for a specific malicious activity, such as automated DDoS attacks or Trojan-activated browsing.
"The increase in impersonation is obviously a bad sign ... and it's also a bad symptom of increased malicious activities," Gaffan says. These automated bots also can be used to scan websites for holes or to impersonate a Google bot, he says.
"Sixteen percent of all websites had some type of good-bot impersonation going on," he says.
The key to combating unwanted impersonator bots is to benchmark legitimate ones, and to get the proper visibility into their presence and activity, he says. "You want to make sure you don't block some of the good bots. Blocking Google bots by mistake can be hazardous" to your SEO investment, for example, Gaffan says.
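One concrete way to tell a real Googlebot from an impersonator faking its user-agent string is the reverse/forward DNS check that Google itself documents: reverse-resolve the visitor's IP, confirm the hostname belongs to Google's crawler domains, then forward-resolve that hostname and confirm it maps back to the same IP. Below is a minimal Python sketch of that check; the function name and the injectable-resolver parameters are illustrative, not from the Incapsula study.

```python
import socket

# Domains Google publishes for its crawler hostnames.
GOOD_BOT_DOMAINS = (".googlebot.com", ".google.com")


def is_verified_googlebot(ip, reverse_lookup=None, forward_lookup=None):
    """Return True only if `ip` passes the reverse/forward DNS check.

    A spoofed user-agent header is trivial to fake; the DNS round trip
    is not, because the impersonator does not control Google's DNS.
    The resolver arguments exist so the logic can be tested offline.
    """
    reverse_lookup = reverse_lookup or (lambda addr: socket.gethostbyaddr(addr)[0])
    forward_lookup = forward_lookup or socket.gethostbyname

    try:
        host = reverse_lookup(ip)          # step 1: IP -> hostname
    except OSError:
        return False
    if not host.endswith(GOOD_BOT_DOMAINS):  # step 2: hostname in Google's domains?
        return False
    try:
        return forward_lookup(host) == ip  # step 3: hostname -> same IP?
    except OSError:
        return False
```

Blocking on user-agent alone risks exactly the hazard Gaffan describes: a false positive that blocks the real Googlebot. The DNS check fails closed only for impersonators, since they cannot make Google's DNS point at their own IP.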