Adobe Transparency Center
EU Digital Services Act (DSA) Transparency Report
Published: February 17, 2025
Reporting Period: February 17, 2024 – December 31, 2024
About this Transparency Report
Adobe Systems Software Ireland (“Adobe” or “we”) shares the European Commission's goal of making the internet safer through transparency and accountability. In accordance with Articles 15 and 24 of the European Union (“EU”) Digital Services Act (“DSA” or “Regulation”), Adobe is publishing our first transparency report (“Report”) for products and services we determined to be within scope of the Regulation. Adobe has products and services ranging from subscription enterprise collaboration tools to free photo editing mobile apps, and we are continually assessing whether our offerings fall within scope of the DSA. This report outlines our efforts to moderate content in the EU across our diverse offerings in accordance with our General Terms of Use, Content Policies, and product-specific Community Guidelines (collectively, “Policies”), and provides detailed metrics regarding those efforts for the reporting period, 17 February 2024 to 31 December 2024.
We are committed to regularly updating our Policies and processes and publishing this report on an annual basis in accordance with our obligations under this Regulation. More information about how we approach content moderation and the DSA may be found at our Transparency Center.
This report includes the following information:
1. Orders Received from Member States’ Authorities
2. Notices Received in Accordance with Article 16
3. Content Moderation Engaged in at Adobe’s Own Initiative
4. Complaints Received Through Adobe’s Internal Complaint Handling Systems
5. Out-of-Court Dispute Settlements
6. Article 23 Suspensions Imposed to Mitigate Misuse
7. Average Monthly Active Users
1. Orders Received from Member States’ Authorities
This section details the volume of requests Adobe has received from law enforcement, courts, and other government agencies in the EU to remove content or provide user information pursuant to Articles 9 and 10.
1.1 Government Removal Orders from EU Member States
Adobe did not receive any removal orders from law enforcement, courts or government agencies in the EU during the reporting period.
1.2 Requests for User Data from EU Member States1
User data requests include orders issued pursuant to local law, law enforcement requests, and emergency disclosure requests. Our Trust & Safety Team carefully reviews these requests to determine the validity of the legal process, assess the proportionality of the request, and ensure compliance with international data protection commitments made by Adobe. More information about how Adobe handles requests for user data may be found in our guidelines on Government and Third-Party Requests for User Data.
The table below provides information on the number of requests for user data Adobe received from EU Member State authorities and whether any information was produced for the reporting period.
EU Member State | Requests | Requests for Which Some Information Was Produced
France | 5 | 0
Germany | 25 | 8
Ireland | 3 | 2
Slovakia | 1 | 1
Spain | 3 | 0
Median time to acknowledge receipt: <1 day
Median time to give effect to the orders: <3 days
2. Notices Received in Accordance with Article 16
Adobe has processes in place to allow anyone in the EU, including users, non-users, Trusted Flaggers as defined by Article 22, and other entities, to report content that they believe violates Member State law in accordance with Article 16. When we receive these specific notices, we review the reported content in accordance with our Policies and take the appropriate action. If the reported content does not violate Adobe’s Policies, we review it for legality based on the provided information and may restrict access to the content in the relevant jurisdiction. Article 16 notices are reviewed manually by the Adobe Trust & Safety and IP teams and are not processed by automated means for any of Adobe’s in-scope products or services.
Trusted Flaggers who have been designated by the Digital Services Coordinator of the Member State in which they are established can, as described above, report allegedly illegal content in line with Article 22. Adobe has not received any reports from designated Trusted Flaggers as of 31 December 2024.
2.1 Number of Article 16 Notices Submitted, by Category of Alleged Illegal Content2
The table below provides information on the number of notices submitted by EU users pursuant to Article 16 during the reporting period, broken down by category of alleged illegal content and product/service.
Type of Alleged Illegal Content | Adobe Commerce Cloud | Adobe Document Cloud | Adobe Express | Adobe InDesign | Adobe Stock | Behance
Child Safety | 0 | 0 | 0 | 0 | 2 | 0
Intellectual Property | 36 | 3 | 1 | 1 | 233 | 456
2.2 Number of Actions Taken in Response to Article 16 Notices3
The table below provides information on the number of actions4 taken in response to notices submitted by EU users pursuant to Article 16 during the reporting period, broken down by whether the action was taken because the content violated Adobe’s Policies or because the content was deemed to be illegal under local law.
Adobe Product or Service | Actions Taken Due to Violation of Adobe’s Policies | Actions Taken Because the Content was Deemed to be Illegal under Local Law
Adobe Commerce | 0 | 143
Adobe Document Cloud | 0 | 4
Adobe InDesign | 0 | 1
Adobe Stock | 17 | 1,663
Behance | 0 | 242
2.3 Median Time to Take Action on Content Identified in Article 16 Notices5
The table below provides information on the median time required for Adobe to take action on content identified in Article 16 notices, broken down by product/service.
Adobe Product or Service | Median Time to Take Action
Adobe Commerce | 8-9 days
Adobe Document Cloud | 1-2 days
Adobe InDesign | 4-5 days
Adobe Stock | <1 day
Behance | <1 day
3. Content Moderation Engaged in at Adobe’s Own Initiative
3.1 Adobe’s Approach to Content Moderation
Adobe believes that maintaining engaging and trustworthy communities that foster creativity requires clear guidelines for acceptable behavior and robust processes for consistent content enforcement. Our Policies, which may be found at our Transparency Center, establish standards for user conduct across all Adobe products and services. We discover policy-violating and allegedly illegal content through in-product reporting, our publicly available reporting channels, and automated technologies.
When content violates our Policies, we take action against it globally. Although our Policies typically cover material that is locally illegal, Adobe is also committed to respecting applicable laws of the EU and its member states. If we determine that content violates local law but does not otherwise violate our Policies, we may disable it locally by blocking it in the relevant jurisdiction.
Content Reporting Mechanisms
In-Product Reporting
With many Adobe products and services, users can report content they believe violates our Policies via in-product reporting options. Those reporting options are detailed on a per-product basis here. For any other products and services, users and non-users may always reach out to abuse@adobe.com to file a report with Adobe’s Trust & Safety team. Whenever someone reports an alleged violation of our Policies, our team may review the content in question to determine whether a violation took place and action that content accordingly.
Reporting Forms
Anyone in the EU can report content on Adobe products or services that they believe violates applicable laws of the EU or its Member States through our Illegal Content Reporting Form. Reporters are asked to provide additional context about the allegedly illegal content, including the basis for the report and the country where they allege the law has been violated. Whenever someone reports an alleged violation of our Policies, our team may review the content in question to determine whether a violation took place and action that content accordingly.
Anyone around the world can report an intellectual property violation by visiting our infringement reporting form. We also accept notices via mail or fax as detailed in our Intellectual Property Removal Policy.
Intellectual Property Removal Policy
At Adobe, we respect the intellectual property rights of others, and we expect our users to do the same. As outlined above, intellectual property infringement is a violation of our Policies across all Adobe products and services. We disable content in response to complete and valid notices of infringement. When one effective notice is filed with Adobe against a user regarding one or more pieces of allegedly infringing content, the user will receive one 'strike' against their account. If the user receives three 'strikes' within a one-year period, their user account will be terminated. Our Intellectual Property Removal Policy is set out here.
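To make the repeat-infringer mechanics described above concrete, here is a minimal sketch of how a three-strikes rule over a rolling one-year window could be tracked. It is illustrative only: the `Account` structure and `record_strike` function are hypothetical and do not describe Adobe's actual systems.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import List, Optional

STRIKE_LIMIT = 3                      # three strikes within one year leads to termination
STRIKE_WINDOW = timedelta(days=365)   # rolling one-year window

@dataclass
class Account:
    """Illustrative account record tracking when effective IP notices were received."""
    account_id: str
    strike_times: List[datetime] = field(default_factory=list)
    terminated: bool = False

def record_strike(account: Account, now: Optional[datetime] = None) -> Account:
    """Record one strike for one effective notice (however many assets it names)
    and terminate the account if three strikes fall within a one-year period."""
    now = now or datetime.utcnow()
    account.strike_times.append(now)
    # Only strikes received within the rolling one-year window count toward termination.
    recent = [t for t in account.strike_times if now - t <= STRIKE_WINDOW]
    if len(recent) >= STRIKE_LIMIT:
        account.terminated = True
    return account
```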
Abusive Content Detection
To enforce our Policies on a global scale, Adobe relies on a variety of tools and mechanisms to detect and remove potentially violative content that is hosted on our servers. We utilize different measures depending on whether the content at issue is posted on an online platform (such as posted to Behance), shared using a publicly accessible link (such as shared via a public Adobe Express page) (collectively, “publicly accessible content”) or if the content is kept in private cloud storage. We do not utilize any of these measures on locally stored content.
Fully Automated Tools
Our automated tools use multiple signals to detect and remove publicly accessible content that may violate our Policies. For example, these tools enable us to detect and automatically remove fraud, phishing, and spam content on products such as Adobe Express. They also enable us to detect and remove content on Behance that might violate our nudity or violence and gore policies. Classifiers assign scores to text, images, and videos detected across our products and services, and content is removed based on these scores. We never automatically remove content located in private storage. Using these automated models helps us detect more problematic content and make quicker enforcement decisions, which in turn helps keep our communities safe.
Hybrid Tools
In addition to fully automated content removal, in some cases, we supplement automatic detection of violative publicly accessible content with human review to ensure the accuracy of our actions. Classifiers assign scores to text, images, and videos detected across our products and services, and our Trust & Safety team reviews the detected content and takes appropriate enforcement action. In all situations, human review of content only occurs after it has been flagged by our abuse detection models or reported by another user. We also use this hybrid system of review to combat child sexual abuse material, which may also be stored in private cloud storage, as detailed below.
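As a rough illustration of the fully automated and hybrid flows described in the two paragraphs above, the sketch below routes a classifier detection by score: publicly accessible content above a high-confidence threshold is removed automatically, lower-scoring detections are queued for human review, and content in private storage is not removed automatically. The thresholds, class names, and function names are hypothetical, not Adobe's actual values or code.

```python
from dataclasses import dataclass
from enum import Enum

class Decision(Enum):
    AUTO_REMOVE = "auto_remove"      # fully automated removal
    HUMAN_REVIEW = "human_review"    # hybrid: queued for Trust & Safety review
    NO_ACTION = "no_action"

# Hypothetical thresholds on classifier confidence scores in [0, 1].
AUTO_REMOVE_THRESHOLD = 0.98
REVIEW_THRESHOLD = 0.80

@dataclass
class Detection:
    content_id: str
    policy: str                 # e.g. "spam", "nudity", "violence_and_gore"
    score: float                # classifier confidence score
    publicly_accessible: bool   # platform content or public-link content

def route(detection: Detection) -> Decision:
    """Route one classifier detection. In this simplified sketch, content in
    private storage is never acted on automatically; the separate CSAM
    workflow described below is not modeled here."""
    if not detection.publicly_accessible:
        return Decision.NO_ACTION
    if detection.score >= AUTO_REMOVE_THRESHOLD:
        return Decision.AUTO_REMOVE
    if detection.score >= REVIEW_THRESHOLD:
        return Decision.HUMAN_REVIEW
    return Decision.NO_ACTION
```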
Tools Used to Combat Child Sexual Abuse Material (CSAM)
Adobe is deeply committed to keeping children safe online and doing our part to fight the spread of child sexual abuse material (CSAM). We have a zero-tolerance policy against any material uploaded to our servers, whether publicly accessible or kept in private storage, that sexualizes, sexually exploits, or endangers children. As part of these efforts, we employ several mechanisms combining automatic detection of content with human review, including:
- We utilize multiple methods to detect CSAM, such as sophisticated machine learning models and hash matching technology. This scanning enables us to compare digital signatures (or “hashes”) of images and videos uploaded by any Adobe user to our servers against databases of known CSAM hashes (a simplified sketch of this hash-matching step follows this list). Adobe reports all confirmed CSAM to the National Center for Missing and Exploited Children (NCMEC), and our Trust & Safety team reviews all material reported to NCMEC to ensure a high level of accuracy and quality of all reports.
- Our Trust & Safety team will then review, report, and remove CSAM discovered by our machine learning models and hash matching technology, as well as by user reports or account investigations. We share hashes of previously undetected CSAM to the NCMEC-managed industry hash list to help prevent the redistribution of CSAM on other platforms. We continue to advance Adobe’s CSAM detection capabilities and build wellness features into our moderation tools.
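The hash-matching step mentioned in the list above can be pictured as comparing a fingerprint of each upload against a set of known hashes, as in the simplified sketch below. The plain SHA-256 digest and the function names are stand-ins only; real systems rely on vetted industry hash lists and perceptual hashing, and matches are routed to human review rather than actioned automatically.

```python
import hashlib
from typing import Set

def file_digest(data: bytes) -> str:
    """Compute a digital signature ("hash") for an uploaded file.
    A plain SHA-256 digest stands in for the hashing actually used
    for known-image matching."""
    return hashlib.sha256(data).hexdigest()

def matches_known_hashes(data: bytes, known_hashes: Set[str]) -> bool:
    """Return True if the upload's hash appears in a database of known hashes.
    Per the report, a match is reviewed by the Trust & Safety team before
    the material is reported and removed."""
    return file_digest(data) in known_hashes
```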
Content Enforcement
Content Enforcement Teams
Adobe has several specially trained teams in place to ensure that reported content across our broad set of products and services is promptly reviewed and appropriately assessed:
- The Trust & Safety team is responsible for moderation of content on our user-generated content products and services (such as Behance) that violates the Adobe Content Policies.
- In addition, Adobe Stock has a team of individuals responsible for reviewing contributor submissions prior to allowing them to be offered for licensing as part of the Adobe Stock collection.
- Lastly, we have a team of IP agents who are specifically trained to handle IP infringement claims across all our products and services.
Our content enforcement teams receive detailed training on a variety of topics during onboarding and are updated on new laws or relevant political or historical contexts. In the event of particularly complex reports, content enforcement teams can discuss with leadership or escalate to members of Adobe’s IP and Trust & Safety Legal and Policy teams, who may in turn consult with both internal and external specialists with expertise in the laws of the EU and the Member States.
Content and Account Actions
Adobe acts quickly to take appropriate action against violations of our Policies or applicable law. Our Trust & Safety, Stock and IP teams review content that has been reported or detected for potential violations. When we determine that content is violative, we take action that may include:
- Global Deactivation: We first review content for violations of our Policies, and violative content is deactivated globally. When we deactivate content, it is no longer available to users or non-users anywhere in the world.
- Local Block: Adobe may restrict access to content in each relevant jurisdiction if we determine that the content violates local law but does not violate our policies. When we locally block content, that content is not visible to users or non-users in the relevant jurisdiction but remains visible elsewhere.
- Limiting Distribution: When we limit distribution of a piece of content, that content will remain on the product or service but may not be visible to certain users.
We consider several factors when determining the appropriate content enforcement action. As described above, we may make the decision to globally deactivate or locally block the content based on the nature of the violation or other context specific to each case. We also may decide on a case-by-case basis to allow certain content to remain on our products and services but take steps to ensure that it cannot be discovered accidentally or viewed by certain users.
Adobe may also restrict or limit the distribution or capabilities of any account for violations of our policies.
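The enforcement options above follow a simple precedence, which the hypothetical sketch below summarizes: violations of Adobe's Policies lead to global deactivation (or, case by case, limited distribution), while content that is only illegal under local law is blocked in the relevant jurisdiction. The function and enum names are illustrative, not Adobe's implementation.

```python
from enum import Enum
from typing import Optional

class Action(Enum):
    GLOBAL_DEACTIVATION = "global_deactivation"
    LOCAL_BLOCK = "local_block"
    LIMIT_DISTRIBUTION = "limit_distribution"
    NO_ACTION = "no_action"

def choose_action(violates_policies: bool,
                  illegal_in: Optional[str] = None,
                  limit_distribution_only: bool = False) -> Action:
    """Hypothetical precedence for the enforcement actions described above.

    violates_policies        -- content violates Adobe's global Policies
    illegal_in               -- Member State where the content is deemed illegal, if any
    limit_distribution_only  -- case-by-case choice to restrict discoverability
                                instead of deactivating the content
    """
    if violates_policies:
        if limit_distribution_only:
            return Action.LIMIT_DISTRIBUTION
        return Action.GLOBAL_DEACTIVATION
    if illegal_in is not None:
        return Action.LOCAL_BLOCK
    return Action.NO_ACTION
```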
Notice and Appeals
For our in-scope products and services, Adobe provides email notice of content enforcement actions to impacted users and individuals or entities who report content.
Both impacted users and reporters can appeal our content enforcement decisions. If a user or reporter believes that our decision was made in error, they can file an appeal via our appeals form or via email. Some users may have additional appeal options or redress mechanisms available under their local law.
When we send an email to a user or reporter to detail an enforcement action, we typically provide a link to our appeals form within that email. If an appeal is submitted via the form, the user and reporter will receive additional updates via email. If a user or reporter chooses to contact us via email instead of through the appeals form, we will also send additional updates via email.
3.2 Number of Content Moderation Actions Taken at Adobe’s Initiative6
Adobe considers content moderation actions taken at our own initiative to be actions taken on content available in the EU because that content violates our Policies or is deemed to be illegal, where the content was not formally reported to Adobe via an Article 9 order or Article 16 notice.
These content moderation actions include both proactive and reactive enforcement. Proactive enforcement occurs when an Adobe employee or contractor, or Adobe technology, identifies potentially policy-violating content and actions that content based on our Policies. Reactive enforcement occurs when a user or other external entity reports content to Adobe and that content is actioned if it violates our Policies.
Type of Policy Violation | Adobe Creative Cloud storage⁷ | Adobe Document Cloud | Adobe Express | Adobe InDesign | Adobe Photoshop Express | Adobe Photoshop Lightroom | Adobe Portfolio | Adobe Stock | Behance
Child Sexualization or Exploitation | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 374
Fraud or Phishing | 164 | 537 | 550 | 113 | 0 | 0 | 9 | 0 | 26
Hate content | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0
Intellectual Property | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 22,342 | 0
Nudity and Sexual Content | 0 | 0 | 1 | 0 | 479 | 92 | 0 | 0 | 55,209
Posting of Private Information | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 51
Profanity | 0 | 0 | 0 | 0 | 0 | 40 | 0 | 0 | 26
Regulated Goods and Services | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1
Spam | 83 | 62 | 0 | 0 | 7 | 4 | 0 | 0 | 7,710
Violence and Gore | 0 | 0 | 0 | 0 | 1 | 7 | 0 | 0 | 36,862
Other⁸ | 1 | 2 | 0 | 0 | 7 | 34 | 0 | 61,720⁹ | 7
3.3 Number of Content Moderation Actions Taken by Automated Means10
The table below provides information on the number of pieces of content removed as a direct result of automated enforcement, broken down by product/service. Not all Adobe products and services have automated enforcement mechanisms.
Type of Policy Violation | Adobe Express | Adobe InDesign | Adobe Photoshop Express | Adobe Portfolio | Behance
Fraud or Phishing | 308 | 113 | 0 | 9 | 0
Nudity and Sexual Content | 0 | 0 | 139 | 0 | 1,758
Violence and Gore | 0 | 0 | 0 | 0 | 99
Spam | 0 | 0 | 0 | 0 | 5,979
3.4 Number of Account Restrictions11
Adobe issues accounts to users for its products and services at what we call the “Adobe ID” level. This means that one individual account may leverage multiple Adobe products and services that have shared cloud storage capabilities. Any account restrictions that occur at the Adobe ID level rather than at the product level are listed below as a “Multi-Service” account restriction.
The table below provides information on the volume of account restrictions, broken down by product/service.
Product/Service | Restrictions Volume
Adobe Stock | 44
Behance | 11,929
Frame.io | 57
Multi-Service | 221
4. Complaints Received Through Adobe’s Internal Complaint Handling Systems
4.1 Number of Appeals Received12
The table below provides information on the number of appeals received through all appeal channels, i.e., via the appeal link included in our notification emails and via email, organized by product/service.
Adobe Product or Service | Number of Appeals Received
Adobe Creative Cloud Storage | 2
Adobe Document Cloud | 2
Adobe Express | 1
Adobe InDesign | 6
Adobe Photoshop Express | 5
Adobe Photoshop Lightroom | 11
Adobe Stock | 113
Behance | 601
Multi-Service | 49
4.2 Number of Appeals, by Reason and Product/Service13
The table below provides information on the number of appeals received through all appeal channels, i.e., via the appeal link included in our notification emails and via email, organized by the basis for the original content moderation action and by product/service.
Type of Policy Violation | Adobe Commerce | Adobe Creative Cloud storage | Adobe Document Cloud | Adobe Express | Adobe InDesign | Adobe Photoshop Express | Adobe Photoshop Lightroom | Adobe Stock | Behance | Multi-Service
Fraud or Phishing | 0 | 0 | 0 | 0 | 5 | 0 | 0 | 50 | 0 | 10
Intellectual Property | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 20 | 5 | 0
Nudity and Sexual Content | 0 | 0 | 0 | 1 | 0 | 4 | 9 | 0 | 513 | 0
Posting of Private Information | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0
Profanity | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0
Spam | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 | 0
Violence and Gore | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 67 | 0
Other¹⁴ | 0 | 2 | 2 | 0 | 1 | 0 | 2 | 38 | 3 | 39
4.3 Number of Appeals, by Product/Service and Appeal Outcome15
The below data represents the number of pieces of content and accounts restored following a successful appeal.
Type of Policy Violation | Adobe Commerce | Adobe Creative Cloud storage | Adobe Document Cloud | Adobe Express | Adobe InDesign | Adobe Photoshop Express | Adobe Photoshop Lightroom | Adobe Stock | Behance | Multi-Service
Fraud or Phishing | 0 | 0 | 0 | 0 | 5 | 0 | 0 | 34 | 0 | 9
Intellectual Property | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 12 | 5 | 0
Nudity and Sexual Content | 0 | 0 | 0 | 0 | 0 | 4 | 2 | 0 | 428 | 0
Posting of Private Information | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Profanity | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0
Spam | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0
Violence and Gore | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 64 | 0
Other¹⁶ | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 15 | 0 | 12
4.4 Median Time to Resolve Appeals17
The table below provides information on the median time required for Adobe to act on an appeal, broken down by product/service. We receive appeals through multiple channels, e.g., via the appeal link included in our notification emails and through our appeals form. The below data represents the median turnaround time for both appeal types.
Adobe Product or Service | Median Time to Resolve an Appeal
Adobe Commerce | <1 day
Adobe Creative Cloud storage | 4-5 days
Adobe Document Cloud | 3-4 days
Adobe Express | N/A¹⁸
Adobe InDesign | <1 day
Adobe Photoshop Express | <1 day
Adobe Photoshop Lightroom | <1 day
Adobe Stock | 13-14 days
Behance | 4-5 days
Multi-Service | 1 day
4.5 Number of Appeals Received from Content Reporters
Adobe did not receive any appeals from content reporters during the reporting period.
5. Out-of-Court Dispute Settlements
Adobe will inform both the user and the reporter of their opportunity to seek additional review by certified, EU-based out-of-court dispute settlement bodies. Adobe did not receive any requests from users or reporters to settle content moderation disputes out of court during the reporting period.
6. Article 23 Suspensions Imposed to Mitigate Misuse
6.1 Number of Suspensions for Manifestly Illegal Content Imposed Pursuant to Article 23
If an Adobe user frequently uploads manifestly illegal content, or a reporter frequently submits manifestly unfounded reports or appeals through our channels, we may suspend, for a reasonable period of time, their ability to use the relevant product, service, or system. Adobe did not suspend any accounts for submitting manifestly unfounded reports or appeals through our channels for the reporting period.
7. Average Monthly Active Users
Article 24(2) of the DSA requires “online platforms” to publish information on their average monthly active users (MAUs) in the EU every six months. Adobe provides information about the EU MAUs for our products and services that may fall within the DSA’s scope of “online platforms” here.
1 We have included in the above table only those EU Member States whose authorities submitted requests for user data during the reporting period.
2 We have included in the above table only those Adobe products and services, and categories of alleged violations for which we received Article 16 notices during the reporting period.
3 We have included in the above table only those Adobe products and services for which we received Article 16 notices during the reporting period.
4 More than one action may be taken on an Article 16 notice.
5 We have included in the above table only those Adobe products and services for which we received Article 16 notices during the reporting period.
6 We have included in the above table only those products and services, and policy categories that have data to report for the reporting period.
7 Adobe Creative Cloud Storage refers to cloud storage (and content stored therein) tied to an Adobe Creative Cloud account.
8 Not all of the policy violation categories enumerated in this table apply to every Adobe product or service, given our diverse offerings and the different ways our products and services are used. For this reason, some products will have content moderation actions listed under “Other” to account for enforcement of other product-specific terms and policies.
9 Adobe Stock may remove contributor assets in accordance with its Content Submission Guidelines, which contain product-specific requirements that do not conform with the policy categories enumerated in this table.
10 We have included in the below table only those products and services, and policy categories that have data to report for the reporting period.
11 We have included in the below table only those Adobe products and services for which accounts were restricted in the European Union during the reporting period.
12 We have included in the below table only those Adobe products and services for which we received appeals during the reporting period.
13 We have included in the below table only those Adobe products and services for which we received appeals, and their associated policy categories, during the reporting period.
14 Not all of the policy violation categories enumerated in this table apply to every Adobe product or service, given our diverse offerings and the different ways our products and services are used. For this reason, some products will have content moderation appeals listed under “Other” to account for enforcement of other product-specific terms and policies.
15 We have included in the below table only those Adobe products and services for which we received appeals, and their associated policy categories, during the reporting period.
16 Not all of the policy violation categories enumerated in this table apply to every Adobe product or service, given our diverse offerings and the different ways our products and services are used. For this reason, some products will have content moderation appeals listed under “Other” to account for enforcement of other product-specific terms and policies.
17 We have included in the below table only those Adobe products and services for which we received appeals during the reporting period.
18 Although Adobe Express received one (1) appeal during the reporting period, that appeal had not been resolved by the time of this report. For this reason, there is no median time for resolution to report.