

Gather Intelligence On Web Bots To Aid Defense

BotoPedia, a registry of Web bots, could help companies keep their sites open to good crawlers but closed to attackers and site scrapers

Automated traffic to Web sites has steadily increased, driven by legitimate search-engine indexing, questionable crawlers, and malicious attackers. Companies need to know which is which.

To that end, Web-security cloud service Incapsula launched a site on Wednesday for cataloging Web bots, the automated programs that crawl websites to index pages, grab competitive pricing information, gather data on social-networking users, or scan for vulnerabilities. With the site, dubbed BotoPedia, the company is gathering the Internet addresses Web bots use, their user-agent strings, and any other identifying information. The catalog will be open but moderated, in much the same way as Wikipedia, says Marc Gaffan, co-founder and vice president of business development for Incapsula.

"This is essentially trying to take the gray area and classify it to a higher level of granularity, so that website operators have got the ability to cherry-pick who they want to let in and who they don't," Gaffan says.

While many services attempt to identify bots by their user-agent strings -- the text that typically identifies the visitor's browser -- that signature is too easily changed to be useful on its own, he says. Instead, BotoPedia will record the user-agent string, IP addresses, and other details.

While that is a simple change, it's an important one, says Bogdan Botezatu, senior threat analyst with security firm BitDefender.

"If you block my spider, I will change its name and come back and crawl your Web server in a few minutes without losing much money or time," Botezatu says. "But if you block my IP address, then I will have to either change my IP or change my provider or move to a different data center."

[ Researchers release free search engine-based data mining tools to identify and extract sensitive information from many popular cloud-based services. See Researchers To Launch New Tools For Search Engine Hacking. ]

BotoPedia was initially seeded with data on the top 50 bots, but another dozen had been submitted by outside sources by Wednesday evening. While the operators of good Web bots will self-submit, researchers will likely add information on bad bots as well, Gaffan says.

"I do expect a lot of bad bots to get in there, but obviously not by them coming forward," he says.

The rise of automated Web traffic is playing out against the backdrop of an estimated quadrupling of Internet traffic by 2016, according to networking giant Cisco's bandwidth forecasts. Web traffic will grow slightly faster, expanding some five-fold between 2011 and 2016, the company estimates.

Automated traffic is taking an increasing share of the pie. Currently, slightly more than half of the traffic to websites comes from bots, according to Incapsula's data. Of all site traffic, 20 percent comes from good page indexers and other desired bots, another 19 percent from intelligence-gathering bots that sites may not want, and the remaining 12 percent from scrapers, comment spammers, and outright attacks.

Those attacks range from automated SQL injection against back-end databases to scraping of user information to automated login attempts. Overall, sites should expect each Web application to suffer a sustained attack nearly 120 days of each year, according to a report issued earlier this week by Web security firm Imperva. Companies should prepare for intense automated attacks, the company says.

"The success of the whole mission depends on the defense performance when under attack," states the report. "Therefore, the defense solutions and procedures should be designed to accommodate attack bursts."

Increasingly, attackers will cloak themselves in the appearance of legitimacy. By appearing to be a search-engine index bot, attackers will be able to bypass most filters, Incapsula's Gaffan says.

In a study of 1,000 customers, Incapsula found that more than 16 percent encountered Web bots that impersonated Google's automated crawlers. Because Google search rankings are so important, no site wants to block the company from indexing its pages.
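One common countermeasure against such impersonation, and one Google itself documents, is a reverse-then-forward DNS check on any visitor claiming to be Googlebot: the crawler's IP should resolve to a googlebot.com or google.com hostname, and that hostname should resolve back to the same IP. The sketch below is a minimal illustration under those assumptions, not part of BotoPedia or Incapsula's service, and omits the caching and timeout handling a production filter would need.

import socket

def is_genuine_googlebot(client_ip: str) -> bool:
    """Verify a claimed Googlebot with a reverse-then-forward DNS check."""
    try:
        hostname, _, _ = socket.gethostbyaddr(client_ip)  # reverse lookup
    except socket.herror:
        return False
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        forward_ip = socket.gethostbyname(hostname)  # forward confirmation
    except socket.gaierror:
        return False
    return forward_ip == client_ip

A scraper can set its user-agent string to Googlebot's in one line of code, but it cannot make an arbitrary IP address resolve into Google's DNS namespace, which is why pairing the claimed identity with network-level evidence is the harder check to evade.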

In the end, Incapsula hopes the online catalog will help website operators make better decisions about which automated traffic they allow to peruse their sites and which they block, Gaffan says.

"This will give website owners a lot of different information and better awareness into who they want to let in," Gaffan says.
