In May, Victoria Grand, YouTube's director of global communications and policy, said the company was working on software to blur faces in posted videos, responding to requests from the human rights community and offering a way to address privacy complaints from people captured on video against their will.
On Wednesday, YouTube made its face blurring system available through its Video Enhancements tool.
"Whether you want to share sensitive protest footage without exposing the faces of the activists involved, or share the winning point in your 8-year-old's basketball game without broadcasting the children's faces to the world, our face blurring technology is a first step towards providing visual anonymity for video on YouTube," said Amanda Conway, YouTube policy associate, in a blog post.
As InformationWeek recently noted, videos of significant events posted to YouTube have become an important element in professionally produced video news segments and in standalone reportage. An easy way to protect the privacy of individuals shown in such videos should reduce the chance that human rights protesters caught on camera will also be caught by authorities.
YouTube's face blurring system creates a new copy of the video in question with the blur effect, and provides an option to delete the original, unaltered video. This prevents authorities from seeking the original raw footage from YouTube, although the person posting the video must make sure he or she hasn't retained a local copy.
YouTube says it would be difficult to bypass its blurring technology. "We can't say it's impossible to unblur, but we have made it incredibly difficult through pixelating the blurred regions and adding noise," a company spokeswoman said in an email. "In fact, we feel we've made it so difficult that it's not worth the immense effort required to try."
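YouTube has not published its implementation, but the combination the spokeswoman describes, pixelating a region and then adding noise, is a standard obscuring technique. A minimal sketch of that idea, with hypothetical function and parameter names of my own choosing, might look like this: block averaging discards fine detail, and the added noise prevents an attacker from simply inverting the averaging step.

```python
import numpy as np

def pixelate_and_noise(region, block=8, noise_sigma=12.0, seed=None):
    """Obscure a grayscale image region (HxW uint8 array) by replacing
    each block-by-block tile with its mean value (pixelation), then
    adding Gaussian noise. Illustrative only; not YouTube's actual code."""
    h, w = region.shape
    out = region.astype(np.float64)
    # Pixelate: every tile becomes a single flat value.
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = out[y:y + block, x:x + block]
            tile[:] = tile.mean()
    # Add noise so the per-tile averages themselves are perturbed.
    rng = np.random.default_rng(seed)
    out += rng.normal(0.0, noise_sigma, size=out.shape)
    return np.clip(out, 0, 255).astype(np.uint8)

# Example: obscure a synthetic 32x32 "face" region.
face = np.arange(32 * 32, dtype=np.uint8).reshape(32, 32)
blurred = pixelate_and_noise(face, seed=0)
```

The irreversibility claim rests on the noise step: pixelation alone maps many inputs to one output, but adding random noise on top means even the averaged values cannot be recovered exactly.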
Security researcher Ashkan Soltani observed via Twitter that although face blurring is helpful, it's not a panacea for the privacy risks that confront human rights advocates. He points to research related to Bertillonage, a system for identifying people based on physical characteristics such as height, weight, eye color, and the like--biometrics, in other words.
Grand, in a blog post, makes just that point, noting that voices, names said on camera, and details captured even briefly might reveal information about the location or identity of people on-screen or off.
Conway offers even more reason to be cautious: This is emerging technology and it might not adequately conceal a person's identity. "It's possible that certain faces or frames will not be blurred," she says. "If you are not satisfied with the accuracy of the blurring as you see it in the preview, you may wish to keep your video private."