Vulnerabilities / Threats
1/21/2011
02:58 PM

Google Acknowledges Web Spam Complaints

Low-quality content has some Internet users worried about the relevance of Google search results.

Google on Friday tried to quell grumblings about the quality of its search results.

In recent months, prominent bloggers and tech news sites have noted many instances in which Google searches returned poor results. Programmer Jeff Atwood, who runs the popular Coding Horror blog, characterized the volume of complaints as "deafening" lately.

The issue is Web spam. Google principal engineer Matt Cutts, who runs Google's Web spam team, defines Web spam as "the junk you see in search results when Web sites try to cheat their way into higher positions in search results or otherwise violate search engine quality guidelines."

Web spam is a critical issue for Google, perhaps to the point that it imperils Google's search business. If low-quality content continues to find prominent placement in Google's search results and generates enough revenue -- through Google ads, third-party ads, or direct sales -- to fund further Web spam creation, users will slowly but surely turn to other means of content discovery. Social search is often mentioned as a contender in this scenario, which explains why the rise of Facebook has Google worried.

What makes Web spam particularly pernicious is that it's not as easy to identify as malware. Web spam runs the gamut, from blatant attempts to trick Google with unlawfully copied content and repeated search keywords to low-quality writing produced by so-called "content farms."
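
To make the "repeated search keywords" tactic concrete, here is a minimal sketch of the kind of naive keyword-density check a spam filter might apply. The tokenization and the 8% threshold are illustrative assumptions, not anything Google has disclosed; real classifiers combine many more signals.

```python
import re
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of tokens in `text` that match `keyword` (case-insensitive)."""
    tokens = re.findall(r"[a-z0-9']+", text.lower())
    if not tokens:
        return 0.0
    return Counter(tokens)[keyword.lower()] / len(tokens)

def looks_stuffed(text: str, keyword: str, threshold: float = 0.08) -> bool:
    """Flag pages where a single target keyword dominates the copy.

    The threshold is an assumption for illustration only; production spam
    detection also weighs link patterns, duplication, and user behavior.
    """
    return keyword_density(text, keyword) > threshold

# Example: a page that repeats "widgets" in nearly every sentence
page = "Cheap widgets here. Buy cheap widgets. Best cheap widgets online. Widgets widgets widgets."
print(looks_stuffed(page, "widgets"))  # True -- the keyword is a large share of the text
```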

Cutts's response to the growing chorus of criticism is simultaneously to deny the accuracy of the complaints and to offer assurance that further steps to stamp out Web spam are being taken. Google's search quality is "better than it has ever been in terms of relevance, freshness and comprehensiveness," he insists, even as he acknowledges there has been "a slight uptick of spam in recent months," which Google's engineers are addressing.

Cutts cites a number of steps Google has taken to beat back Web spam, to identify hacked sites, and to alter its search algorithm to deemphasize low-quality Web sites. And he stresses the fact that being a Google advertising client doesn't buy a better search rank.
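
As a rough illustration of what "deemphasizing low-quality Web sites" can mean mechanically, the sketch below folds a hypothetical per-site quality score into an otherwise relevance-only ranking. The blend weight and score scales are assumptions; Google does not publish its ranking formula.

```python
from dataclasses import dataclass

@dataclass
class Result:
    url: str
    relevance: float     # query/document match score, 0..1
    site_quality: float  # hypothetical per-site quality signal, 0..1

def rank(results: list[Result], quality_weight: float = 0.3) -> list[Result]:
    """Order results by a blend of relevance and site quality.

    A low site_quality score drags a page down even when its on-page
    relevance is high -- the general idea behind demoting content farms.
    The 0.3 weight is an illustrative assumption.
    """
    def score(r: Result) -> float:
        return (1 - quality_weight) * r.relevance + quality_weight * r.site_quality
    return sorted(results, key=score, reverse=True)

# A highly "relevant" content-farm page can still fall below a trusted source.
results = [
    Result("contentfarm.example/how-to-x", relevance=0.95, site_quality=0.10),
    Result("trusted.example/how-to-x", relevance=0.85, site_quality=0.90),
]
for r in rank(results):
    print(r.url)
```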

Cutts concedes that Google can and should do better, even as he suggests that users' perception of the prevalence of Web spam may be the result of "skyrocketing expectations."

The trouble is that Web spammers are trying to do better too.

If Google is to prevail, it may have to look beyond the security arms race, where stalemates rather than victories seem to be the norm, and forgo some ad revenue in order to starve the content farms that feed from Google's trough.
