Adobe & PhotoDNA

Last updated: January 5, 2017

Online photo storage and sharing has exploded in recent years, and with it, so has the online manufacture and distribution of child sexual abuse material. As of June 30, 2015, the National Center for Missing and Exploited Children (NCMEC) CyberTipline had received more than 5.4 million reports of suspected child sexual exploitation; over 2 million of these reports were received in the first six months of 2015 alone. Child sexual abuse victims may be rescued only to have images of the crimes committed against them circulating online for many years afterwards. The NCMEC Child Victim Identification Program has reviewed and analyzed more than 147 million child sexual abuse images and videos since it was created in 2002.

That is why Adobe has partnered with NCMEC and Microsoft to deploy PhotoDNA technology across our cloud storage platforms. PhotoDNA is a signature-based image-matching technology developed by Microsoft and Dartmouth College to help hosted service providers find, remove, and report some of the worst known images of child pornography circulating online. PhotoDNA technology converts an image to grayscale at a uniform size, divides it into squares, and assigns a numerical value representing the unique shading found within each square. Together, those numerical values form the ‘PhotoDNA signature’ (or ‘hash’) of the image, which can then be compared against the signatures of other images to find copies of a given image with a very high degree of accuracy.
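
To make this concrete, here is a minimal sketch of a block-based image signature in the same spirit. PhotoDNA’s actual algorithm is proprietary and considerably more robust; the image size, grid size, libraries (Pillow and NumPy), and function names below are illustrative assumptions, not PhotoDNA parameters.

    # Illustrative only: a simplified block-based signature, not PhotoDNA itself.
    from PIL import Image
    import numpy as np

    SIDE = 96   # assumed: normalize every image to 96x96 pixels
    GRID = 6    # assumed: divide the normalized image into a 6x6 grid of squares

    def signature(path: str) -> np.ndarray:
        """Grayscale, resize to a uniform size, then record per-square mean shading."""
        img = Image.open(path).convert("L").resize((SIDE, SIDE))
        pixels = np.asarray(img, dtype=np.float32)
        block = SIDE // GRID
        return np.array([
            pixels[r * block:(r + 1) * block, c * block:(c + 1) * block].mean()
            for r in range(GRID) for c in range(GRID)
        ])

    def distance(sig_a: np.ndarray, sig_b: np.ndarray) -> float:
        """Smaller distances mean the two images are more likely copies of each other."""
        return float(np.linalg.norm(sig_a - sig_b))

Because only these numerical values are stored and compared, copies of a known image can be found without the image itself ever being shared or reconstructed.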

NCMEC has created and maintains a list of PhotoDNA signatures for the ‘worst of the worst’ child sexual abuse imagery. This signature list—never the images themselves—is then shared with hosted service providers like Adobe who participate in the NCMEC PhotoDNA program.

Adobe uses PhotoDNA technology to check images users upload to our servers against these signatures of illegal child sexual abuse material. Adobe’s Trust & Safety team investigates any account that is flagged as containing an image that matches a known PhotoDNA signature. Accounts confirmed to contain illegal child sexual abuse material are then permanently closed and reported to NCMEC’s CyberTipline, as required by federal law.
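
As an illustration of the matching step, the sketch below (continuing the assumptions from the earlier example) checks an uploaded image’s signature against a list of known signatures. The threshold is an arbitrary demonstration value, not a PhotoDNA parameter, and in the workflow described above a match only flags the account for investigation by the Trust & Safety team.

    # Illustrative only; THRESHOLD is a demonstration value, not a PhotoDNA parameter.
    import numpy as np

    THRESHOLD = 5.0  # assumed maximum distance for two signatures to count as a match

    def matches_known_signature(upload_sig: np.ndarray,
                                known_sigs: list[np.ndarray]) -> bool:
        """Return True if the upload's signature is close to any known signature,
        which would flag the owning account for manual review."""
        return any(float(np.linalg.norm(upload_sig - known)) < THRESHOLD
                   for known in known_sigs)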

It is also important to understand what PhotoDNA is not: PhotoDNA is not facial or object recognition technology. A PhotoDNA signature cannot be used to recreate an image or identify people or items within an image. PhotoDNA cannot mistakenly flag an innocent child image (or an image of a young-looking adult) as an illegal one. It can only be used to identify copies of ‘worst of the worst’ child sexual abuse material for which NCMEC has assigned a PhotoDNA signature.

  • To learn more about PhotoDNA, please visit Microsoft’s PhotoDNA center.
  • To learn more about the National Center for Missing & Exploited Children, please visit www.missingkids.org.
  • If you have questions about Adobe’s use of PhotoDNA technology, please email us at privacy@adobe.com.

Adobe proudly participates in or sponsors the following organizations and initiatives:
  • National Center for Missing & Exploited Children
  • We Protect Global Alliance
  • Thorn - Digital Defenders of Children
  • Crimes Against Children Conference
  • The Technology Coalition
  • Internet Watch Foundation (IWF)