Microsoft will offer its PhotoDNA software to law enforcement free of charge to help investigators track down verified child porn.

J. Nicholas Hoover, Senior Editor, InformationWeek Government

March 21, 2012

Microsoft will offer free technology to help law enforcement agencies identify, track down, and rescue victims of sexual abuse and child pornography, the company said this week.

The company is making PhotoDNA freely available to law enforcement both as source code and packaged with software that the agencies already use. The software is currently used by Microsoft and Facebook to find, delete, and report child pornography online.

PhotoDNA, which was co-developed by Microsoft Research and Dartmouth College professor Hany Farid, creates and stores unique mathematical "signatures" of digital images and uses those signatures to identify copies even if they have been resized or recolored.
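Microsoft has not published PhotoDNA's exact algorithm, but the general technique it describes, often called robust or perceptual hashing, can be illustrated with a minimal sketch. The example below uses a generic "difference hash" and a Hamming-distance comparison; the function names, hash size, and match threshold are illustrative assumptions, not PhotoDNA's actual implementation.

```python
# A minimal perceptual-hash sketch (a generic "difference hash") illustrating the
# idea of image signatures that survive resizing and recoloring. This is NOT
# PhotoDNA's algorithm; it only demonstrates the hash-and-compare concept.
from PIL import Image

def dhash(path, hash_size=8):
    """Return a 64-bit integer signature for the image at `path`."""
    # Convert to grayscale and shrink: color and scale details are discarded,
    # so recolored or resized copies map to similar signatures.
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())

    # Each bit records whether a pixel is brighter than its right-hand neighbor.
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Count differing bits between two signatures; a small distance suggests a match."""
    return bin(a ^ b).count("1")

# Usage (assumed threshold): signatures within a few bits of each other
# indicate the same underlying image.
# sig1 = dhash("original.jpg")
# sig2 = dhash("resized_copy.jpg")
# is_match = hamming(sig1, sig2) <= 5
```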

PhotoDNA includes 15,000 signatures of the "worst of the worst" images of child pornography already known from previous investigations, according to Microsoft Digital Crimes Unit public affairs manager Sam Doerr. Those signatures will enable law enforcement personnel to find child porn online, which will help them track down and prosecute child abusers.

The software is released by Microsoft’s Digital Crimes Unit, a group that works to promote a more secure Internet and protect children from Internet crime. Since PhotoDNA doesn’t come in an easy-to-use package, Microsoft is making it available through commonly used law enforcement software.

Microsoft is plugging PhotoDNA into two law enforcement platforms: NetClean Analyze and the Child Exploitation Tracking System. NetClean Analyze is a case management tool for child sex abuse investigations. The Child Exploitation Tracking System is based on Microsoft technology and is used by law enforcement agencies in seven countries, including the United States, for case management in child protection cases.

Microsoft is also making the source code available for direct licensing through the non-profit National Center for Missing and Exploited Children. The source code is targeted at law enforcement agencies that have the technical capacity and expertise to integrate it into their own software platforms. In addition to the National Center for Missing and Exploited Children, New Zealand’s Department of Internal Affairs and the Netherlands Forensic Institute are among the agencies already using the software.

Microsoft reported last year that its use of PhotoDNA had led to thousands of matches, and Doerr said there have been child sexual abuse cases with ties to PhotoDNA. However, Doerr could not provide specifics about PhotoDNA's influence in those cases, since Microsoft has no direct role in the investigations.

Microsoft also uses PhotoDNA internally to combat child porn that users attempt to upload to its services and to prevent child porn from showing up in Bing search results. For example, when Bing indexes images, it does a PhotoDNA check, and when users share photos on Hotmail or upload them to Microsoft’s SkyDrive online storage service, PhotoDNA does its work behind the scenes there as well.

Once a photo is matched against the PhotoDNA signature database, Microsoft removes the image from its servers or search results and reports its findings to law enforcement.
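For illustration, a hedged sketch of that match-and-respond workflow might look like the following, reusing the dhash() and hamming() helpers from the earlier sketch. The signature set, distance threshold, and the removal and reporting hooks are all hypothetical stand-ins, not Microsoft's actual pipeline.

```python
# Hedged sketch of the match-and-respond flow described above. The signature set,
# threshold, and the removal/reporting stubs are hypothetical stand-ins; this is
# not Microsoft's pipeline. Reuses dhash() and hamming() from the earlier sketch.

KNOWN_SIGNATURES = set()   # assumed: signatures of known images, loaded from a vetted database
MATCH_THRESHOLD = 5        # assumed bit-distance cutoff for treating two images as the same

def remove_from_service(path):
    """Hypothetical hook: pull the matched file from servers or search results."""
    print(f"removed: {path}")

def report_to_authorities(path):
    """Hypothetical hook: report the match to law enforcement."""
    print(f"reported: {path}")

def check_image(path):
    """Return True, and trigger removal and reporting, if the image matches a known signature."""
    signature = dhash(path)
    for known in KNOWN_SIGNATURES:
        if hamming(signature, known) <= MATCH_THRESHOLD:
            remove_from_service(path)
            report_to_authorities(path)
            return True
    return False
```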

PhotoDNA is just one tool in Microsoft’s child porn-fighting arsenal. According to Doerr, the company also uses other technology to help flag suspicious images, and it has a team that manually responds to reports and monitors Microsoft services for pornographic images.

Microsoft isn’t the only technology company that partners with law enforcement to track down and prosecute the purveyors of child porn. Other service providers, most notably Facebook, also use PhotoDNA with their services, and are part of the National Center for Missing and Exploited Children’s Technology Coalition. For example, Google has developed software that helps child abuse investigators more easily search and identify images that might contain child pornography and that helps investigators more efficiently review video. The National Center for Missing and Exploited Children announced in December 2011 that Google helped it launch a new version of the CyberTipline, the national reporting mechanism for child sexual abuse, by donating technology to help the search company respond to reports of child porn on its services.

Every year, the NCMEC spokesman said, Google responds to thousands of subpoenas, requests from law enforcement, and users who flag possible child porn residing on Google servers and services. The company also maintains a database of sites suspected of containing these images, and has dedicated corporate and legal staff focused on related issues.

"When we become aware of any sexual abuse images in search results or hosted on our sites, we immediately remove them and report them to the National Center for Missing and Exploited Children," a Google spokesman told InformationWeek. However, the company declined to discuss proactive approaches that it might take to prevent child porn from showing up on its services or to confirm that such efforts are underway.
