"It really takes a lot of work to keep all this clean," he said. Between 20% and 30% of edits are vandalism or vandalism repair, he added.
Wikipedia, Hochman explained, is known for being an online encyclopedia that anyone can edit. What's less well known is that the site relies on computer-driven editing. "There actually are bots that are allowed to edit," he said.
ClueBot, for example, gets credit for almost 800,000 contributions to Wikipedia. ClueBot does a lot of article reversion, rolling vandalized articles back to a point before the defacement.
Wikipedia's bots help spot copyright violations and can even call for help on IRC if needed. They're trained to avoid fights with people: when someone repeatedly re-vandalizes a page, they ask for human intervention rather than keep reverting.
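The behavior described above — revert vandalism to the last clean revision, but stop and escalate to a human after repeated re-vandalism — can be sketched roughly as follows. This is a minimal illustration, not ClueBot's actual code: the `is_vandalism` heuristic, the revert threshold, and the escalation mechanism are all assumptions made for the example (real anti-vandalism bots use trained scoring models and the MediaWiki API).

```python
MAX_REVERTS = 3  # assumed threshold: after this many reverts, ask a human


def is_vandalism(text):
    # Stand-in heuristic for illustration only; real bots score edits
    # with much more sophisticated models.
    return "BUY CHEAP" in text.upper()


class AntiVandalBot:
    def __init__(self):
        self.revert_counts = {}  # page title -> consecutive reverts so far
        self.escalated = []      # pages handed off to humans (e.g. via IRC)

    def handle_edit(self, page, new_text, last_clean_text):
        """Return the text the page should show after the bot acts."""
        if not is_vandalism(new_text):
            self.revert_counts.pop(page, None)  # a good edit resets the counter
            return new_text
        count = self.revert_counts.get(page, 0) + 1
        self.revert_counts[page] = count
        if count > MAX_REVERTS:
            # Avoid an edit war with a persistent vandal: stop reverting
            # and call for human intervention instead.
            self.escalated.append(page)
            return new_text
        return last_clean_text  # roll back to the pre-defacement revision
```

The key design point the panel highlighted is the escalation branch: the bot is deliberately limited so a determined human vandal triggers a handoff rather than an endless revert loop.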
Sanjay Sehgal, founder & CEO of Pramana, made the case for his company's HumanPresent technology, which the startup has deployed to prevent abuse in an unnamed, recently launched, massively multiplayer game. CAPTCHAs, he said, are too easy to defeat.
Pramana's technology attempts to differentiate between bot and human activity -- too many bots can ruin a game -- but it's about more than stopping spam, he said. It helps improve the user experience and provides traffic pattern data, which can lead to better ad revenue and lower bandwidth costs. When deployed for the unnamed online game, he said, the game publisher was able to determine that between 12% and 15% of its traffic came from bots.
Panel moderator Jonah Stein, an SEO consultant, observed that advertisers can play a role in reducing online blight too. For example, he pointed out that online ad network Right Media allows its advertisers to opt out of deceptive advertising. He didn't provide any information about how many advertisers choose to do so.