Adobe’s Commitment to Child Safety


Last updated: April 12, 2023


Adobe is deeply committed to keeping children safe online and doing our part to fight the spread of child sexual abuse material (CSAM). We have a zero-tolerance policy against any material uploaded to our servers that sexualizes, sexually exploits, or endangers children.  


There are several ways we carry out our commitment to keeping children and our users safe, including: 


  • We utilize scanning technologies such as Microsoft’s PhotoDNA and YouTube’s CSAI Match that enable us to compare digital signatures (or “hashes”) of images and videos uploaded by any Adobe user to our servers against databases of known CSAM hashes (a simplified illustration of this hash-matching approach follows this list). Adobe reports all confirmed CSAM to the National Center for Missing and Exploited Children (NCMEC) as required by US federal law. We review all material reported to NCMEC to ensure that every report is accurate and of high quality. To learn more about the National Center for Missing & Exploited Children, please visit www.missingkids.org.

  • Our Trust and Safety team reviews, reports, and removes CSAM discovered through scanning technology, user reports, or account investigations. We contribute hashes of previously undetected CSAM to the NCMEC-managed industry hash list to help prevent the redistribution of CSAM on other platforms. We continue to advance Adobe’s CSAM detection capabilities and to build wellness features into our moderation tools.

  • Where possible, we permanently terminate any Adobe user account found to have uploaded or created content sexualizing minors on our platform. In Fiscal Year 2022, Adobe sent 1,809 CyberTips to NCMEC; 99.9% of the CSAM we reported was detected proactively through scanning technology or account investigations.

  • We implement additional search protections in certain products that present deterrence and intervention messaging when a CSAM-seeking query is identified. This messaging warns the user that CSAM is illegal, explains how to report such material to Adobe, and links to mental health resources for users seeking CSAM. The goal is to interrupt individuals early on an offending pathway and ultimately to prevent further CSAM-seeking behavior and physical child sexual abuse.

  • We collaborate with others in industry to fight CSAM at scale and have a longstanding partnership with NCMEC. We donate technology and technical expertise to NCMEC to assist in its mission to help find missing children, reduce child sexual exploitation, and prevent child victimization. You can read more about Adobe’s partnership with NCMEC here.

  • We also support and participate in the Technology Coalition, an organization composed of technology industry leaders whose goal is to prevent and eradicate online child sexual exploitation and abuse. You can read more about the Technology Coalition here.
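
For illustration only, the sketch below shows in simplified Python how hash matching works in principle: compute a signature for an uploaded file and check it against a set of hashes of known CSAM supplied by a vetted source. This is a hypothetical example, not Adobe’s implementation; it uses a cryptographic hash (SHA-256) as a stand-in, whereas technologies such as PhotoDNA rely on proprietary perceptual hashes that remain stable across resizing and re-encoding.

    import hashlib

    # Hypothetical set of signatures of known CSAM, as would be supplied by a
    # vetted database such as the NCMEC-managed industry hash list.
    KNOWN_HASHES: set[str] = set()

    def compute_signature(file_bytes: bytes) -> str:
        # SHA-256 is used purely for illustration; production systems use
        # perceptual hashes (e.g., PhotoDNA) that tolerate re-encoding.
        return hashlib.sha256(file_bytes).hexdigest()

    def should_escalate(file_bytes: bytes) -> bool:
        # A match against a known hash is escalated to human review and,
        # once confirmed, reported to NCMEC.
        return compute_signature(file_bytes) in KNOWN_HASHES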


What you can do

If you encounter material you believe to be child sexual abuse material while using Adobe's products and services, please report it through our in-product abuse reporting mechanisms or email abuse@adobe.com.


You can also make a report directly to the National Center for Missing and Exploited Children through www.CyberTipline.org or by calling 1-800-843-5678. 


Adobe proudly participates in or sponsors the following organizations and initiatives:

National Center for Missing & Exploited Children
The Technology Coalition
We Protect Global Alliance
Thorn - Digital Defenders of Children
Crimes Against Children Conference