Published: [DATE OF PUBLICATION]

Reporting Period: February 17, 2024 – December 31, 2024

About this Transparency Report

Adobe Systems Software Ireland Limited (“Adobe” or “we”) shares the European Commission's goal of making the internet safer through transparency and accountability. In accordance with Articles 15 and 24 of the European Union (“EU”) Digital Services Act (“DSA” or “Regulation”), Adobe is publishing our first transparency report (“Report”) for the products and services we determined to be within scope of the Regulation. It outlines our efforts to moderate content across our diverse offerings in accordance with our General Terms of Use, Content Policies, and product-specific Community Guidelines (collectively, “Policies”), and provides detailed metrics regarding those efforts for the reporting period, 17 February 2024 to 31 December 2024.

Adobe's products and services range from subscription enterprise collaboration tools to free photo editing mobile apps, and we continually assess whether our offerings fall within the scope of the DSA. We are committed to regularly updating our Policies and processes and to publishing this report annually in accordance with our obligations under the Regulation. More information about how we approach content moderation and the DSA may be found in our Transparency Center.

Section 1: Orders received from Member States’ authorities

Article 15(1)(a): information about orders received from Member States’ authorities

This section details the volume of requests from law enforcement and government agencies to remove content or provide user information pursuant to Articles 9 and 10.

1.1 Government removal orders from EU member states

Adobe did not receive any removal orders from courts or government agencies in the EU during the reporting period.

1.2 Requests for user data from EU member states

User data requests may include requests issued pursuant to local law and law enforcement requests, such as emergency disclosure requests. These requests are carefully reviewed by the Adobe Trust & Safety team to determine the validity of the legal process, to assess the proportionality of the request, and to ensure compliance with international data protection commitments made by Adobe.

Scope of data: Requests for user data from EU member state authorities

Median time to acknowledge receipt: [Placeholder]

Median time to give effect to the order: [Placeholder]

Table 1.2.1: Requests for user data from EU member state authorities

| Country | Requests | Requests for which some information was produced |
| --- | --- | --- |
| France | | |
| Germany | | |
| Ireland | | |
| Slovakia | | |
| Spain | | |
[Placeholder: statement explaining that we’ve omitted from the above chart Member States whose authorities did not submit any requests]

Section 2: Notices received in accordance with Article 16

Article 15(1)(b): information about notices submitted in accordance with Art. 16

[Placeholder: overview text explaining that Adobe has content policies in place that apply worldwide, but we also take action pursuant to local laws.]

Adobe has updated its notice mechanisms in accordance with Article 16 to allow users, Trusted Flaggers (as defined by Article 22), and other entities to report content that they believe violates local law. When we receive such a notice, we review the reported content in line with our Content Policies as outlined in [Placeholder section] and take the appropriate action. If the reported content does not violate Adobe’s policies, we review it for legality based on the provided information and may restrict access to the content in the European Union. Article 16 notices are not processed by automated means for any of Adobe’s in-scope products or services.

Trusted Flaggers who have been designated by the Digital Services Coordinator of the Member State in which they are established can, as described above, report allegedly illegal content in line with Article 22. Adobe has not received any reports by designated Trusted Flaggers as of 31 December 2024.

2.1 Number of Article 16 notices submitted, by category of alleged illegal content

Table 2.1.1 reflects the number of notices submitted by EU users pursuant to Article 16 during the reporting period, broken down by category of alleged illegal content and product/service.

Table 2.1.1: Number of Article 16 notices submitted, by category of alleged illegal content and product/service

| Type of alleged illegal content | Behance | Stock | Lightroom Community | Photoshop Express | Adobe Commerce Marketplace | Adobe Community |
| --- | --- | --- | --- | --- | --- | --- |
| Child Safety (Illegal Content Form + Report Abuse) | 0 | 0 | 0 | 0 | 0 | 0 |
| Extremist Content | 0 | 0 | 0 | 0 | 0 | 0 |
| Fake Accounts | 0 | 0 | 0 | 0 | 0 | 0 |
| Fraud, Scams, and Phishing | 0 | 0 | 0 | 0 | 0 | 0 |
| Harassment and Cyberbullying | 0 | 0 | 0 | 0 | 0 | 0 |
| Hate | 0 | 0 | 0 | 0 | 0 | 0 |
| Intellectual Property | 143 | 122 | 0 | 0 | 0 | 0 |
| Misinformation and Disinformation | 0 | 0 | 0 | 0 | 0 | 0 |
| Nudity and Sexual Content | 0 | 0 | 0 | 0 | 0 | 0 |
| Regulated Goods and Services | 0 | 0 | 0 | 0 | 0 | 0 |
| Violence and Gore | 0 | 0 | 0 | 0 | 0 | 0 |
| Other Legal | 0 | 0 | 0 | 0 | 0 | 0 |
Number of Art. 16 Notices Received (continued)

| Type of alleged illegal content | Substance 3D Community Assets | Adobe Exchange | Adobe Document Cloud 1 | AEP Assets | Adobe Commerce Cloud | Frame.io |
| --- | --- | --- | --- | --- | --- | --- |
| Child Safety (Illegal Content Form + Report Abuse) | 0 | 0 | 0 | 0 | 0 | 0 |
| Extremist Content | 0 | 0 | 0 | 0 | 0 | 0 |
| Fake Accounts | 0 | 0 | 0 | 0 | 0 | 0 |
| Fraud, Scams, and Phishing | 0 | 0 | 0 | 0 | 0 | 0 |
| Harassment and Cyberbullying | 0 | 0 | 0 | 0 | 0 | 0 |
| Hate | 0 | 0 | 0 | 0 | 0 | 0 |
| Intellectual Property | 0 | 0 | 1 | 0 | 30 | 0 |
| Misinformation and Disinformation | 0 | 0 | 0 | 0 | 0 | 0 |
| Nudity and Sexual Content | 0 | 0 | 0 | 0 | 0 | 0 |
| Regulated Goods and Services | 0 | 0 | 0 | 0 | 0 | 0 |
| Violence and Gore | 0 | 0 | 0 | 0 | 0 | 0 |
| Other Legal | 0 | 0 | 0 | 0 | 0 | 0 |
Number of Art. 16 Notices Received (continued)

| Type of alleged illegal content | Adobe Express | Portfolio | InDesign | Creative Cloud Libraries/Storage | Other (Product Unspecified) 2 |
| --- | --- | --- | --- | --- | --- |
| Child Safety (Illegal Content Form + Report Abuse) | 0 | 0 | 0 | 0 | 0 |
| Extremist Content | 0 | 0 | 0 | 0 | 0 |
| Fake Accounts | 0 | 0 | 0 | 0 | 0 |
| Fraud, Scams, and Phishing | 0 | 0 | 0 | 0 | 0 |
| Harassment and Cyberbullying | 0 | 0 | 0 | 0 | 0 |
| Hate | 0 | 0 | 0 | 0 | 0 |
| Intellectual Property | 1 | 2 | 1 | 0 | 0 |
| Misinformation and Disinformation | 0 | 0 | 0 | 0 | 0 |
| Nudity and Sexual Content | 0 | 0 | 0 | 0 | 1 |
| Regulated Goods and Services | 0 | 0 | 0 | 0 | 0 |
| Violence and Gore | 0 | 0 | 0 | 0 | 0 |
| Other Legal | 0 | 0 | 0 | 0 | 0 |
[Placeholder: statement explaining that we’ve omitted services for which we didn’t receive any Art. 16 notices]

2.2 Number of actions taken in response to Article 16 notices

Table 2.2.1: Number of actions taken in response to Article 16 notices, by product/service

| Product/Service | Actions taken because the content was deemed to violate Adobe’s policies 3 | Actions taken because the content was deemed to be illegal under local law |
| --- | --- | --- |
| Behance | 0 | |
| Stock | 0 | |
| Lightroom Community | 0 | |
| Photoshop Express | 0 | |
| Firefly Community Gallery | 0 | |
| Adobe Commerce Marketplace | 0 | |
| Adobe Community | 0 | |
| Substance 3D Community Assets | 0 | |
| Adobe Exchange | 0 | |
| Adobe Document Cloud 1 | 0 | 1 |
| AEP Assets | 0 | |
| Adobe Commerce Cloud | 0 | 114 |
| Frame.io | 0 | |
| Adobe Express | 0 | |
| Portfolio | 0 | |
| InDesign | 0 | 1 |
| Creative Cloud Libraries/Storage | 0 | |
| Multi-service | | |

2.3 Median time to take action on content identified in Article 16 notices

Table 2.3.1 reflects the median time, in hours, required to take action on content identified in Article 16 notices.

Table 2.3.1: Median time to take action on Article 16 notices, by product/service

| Product/Service | Median time to take action (hours) |
| --- | --- |
| Behance | 13.6 |
| Stock | 16.9 |
| Lightroom Community | N/A |
| Photoshop Express | N/A |
| Firefly Community Gallery | N/A |
| Adobe Commerce Marketplace | N/A |
| Adobe Community | N/A |
| Substance 3D Community Assets | N/A |
| Adobe Exchange | N/A |
| Adobe Document Cloud 1 | 30.33 |
| AEP Assets | N/A |
| Adobe Commerce Cloud | 196 |
| Frame.io | N/A |
| Adobe Express | N/A |
| Portfolio | N/A |
| InDesign | 115 |
| Creative Cloud Libraries/Storage | N/A |
[Placeholder: statement explaining that we’ve omitted services for which we didn’t receive any Art. 16 notices]

Section 3: Content moderation engaged in at Adobe’s own initiative

3.1 Proactive content moderation

[Placeholder: overview text detailing Adobe’s commitments to online safety and transparency]

[Placeholder: text outlining Adobe’s broad strategy to address harmful content across its products/services, including reporting options, reactive content moderation, proactive efforts to identify and moderate content, and participation in industry groups and organizations also aiming to create a safer internet.]

[Placeholder: explanation that enforcement actions may differ between products/services]

[Placeholder: training provided to content moderators]

[Placeholder: number and type of measures taken that affect availability of content]

[Placeholder: automated means for content moderation. Explain both “hybrid” mechanisms (e.g., CSAM scanning, classifiers for spam, etc.) and fully automated mechanisms (e.g., classifiers for phishing and nudity). For discussions of automated tools, also include safeguards and notes on their accuracy]

Table 3.1.1: Number of content moderation actions taken at Adobe’s initiative
Behance
Stock
Lightroom Community
Photoshop Express
Adobe Commerce Marketplace
Profanity
Nudity or sexual content
1
Violence or gore
Harassment or bullying
Hate speech or symbols
Child sexualization, exploitation, or abuse 4
Extremist content
Nudity and Sexual Content
4
74
148
Regulated goods or services
Posting of private information
6
Copyright violations 4
10,001
Trademark violations 4
Fraud or phishing
Malicious link or file
Spam
Impersonation or fake account
Misinformation or disinformation
Adobe Commerce Cloud
Substance 3D Community Assets
Adobe Exchange and Extensibility
Adobe Document Cloud 1
AEP Assets
Profanity
Nudity or sexual content
1
Violence or gore
Harassment or bullying
Hate speech or symbols
Child sexualization, exploitation, or abuse 4
Extremist content
Nudity and Sexual Content
4
74
148
Regulated goods or services
Posting of private information
6
Copyright violations 4
10,001
Trademark violations 4
Fraud or phishing
Malicious link or file
Spam
Impersonation or fake account
Misinformation or disinformation
Adobe Commerce Cloud
Frame.io
Adobe Express
Portfolio
InDesign
Creative Cloud Libraries/Storage
Profanity
Nudity or sexual content
1
Violence or gore
Harassment or bullying
Hate speech or symbols
Child sexualization, exploitation, or abuse 4
Extremist content
Nudity and Sexual Content
4
74
148
Regulated goods or services
Posting of private information
6
Copyright violations 4
10,001
Trademark violations 4
Fraud or phishing
Malicious link or file
Spam
Impersonation or fake account
Misinformation or disinformation

Table 3.1.2: Number of content moderation actions taken by automated means

[Placeholder: explaining our classifiers for phishing, nudity, etc.]

Removal automation volume
Behance
Stock
Lightroom Community
Photoshop Express
Firefly Community Gallery
Phishing
0
0
0
0
Nudity
0
0
88
0
Spam
0
0
0
0
Removal automation volume (continued)
Adobe Commerce Marketplace
Substance 3D Community Assets
Adobe Exchange and Extensibility
Adobe Document Cloud 1
AEP Assets
Phishing
0
Nudity
0
Spam
0
Removal automation volume (continued)
Adobe Commerce Cloud
Frame.io
Adobe Express
Portfolio
InDesign
Creative Cloud Libraries/Storage
Phishing
0
0
49
0
Nudity
0
0
0
0
0
Spam
0
0
0
0
0
[Placeholder: statement explaining that we’ve omitted services on which no content moderation actions were taken by automated means]
Table 3.1.3: Number of account terminations in the European Union

| Product/Service | Number of Accounts |
| --- | --- |
| Behance | |
| Stock | 1068 |
| Multi-service | |
[Placeholder: statement explaining that we’ve omitted services on which no accounts were terminated]

Section 4: Complaints received through Adobe’s internal complaint-handling systems (i.e., appeals)

Article 15(1)(d): information about complaints received through Adobe’s internal complaint-handling systems

4.1 Number of appeals received

Table 4.1: Number of appeals received, by product/service

| Product/Service | Number of complaints received |
| --- | --- |
| Behance | |
| Stock | |
| Lightroom Community | |
| Photoshop Express Discover | |
| Firefly Community Gallery | |
| Adobe Commerce | |
| Adobe Community | |
| Substance 3D Community Assets | |
| Adobe Exchange | |
| Adobe Document Cloud (Adobe Sign, Adobe Web, Doc Cloud Storage) | |
| AEP Assets | |
| Adobe Commerce Cloud | |
| Frame.io | |
| Adobe Express | |
| Portfolio | |
| InDesign | |
| Creative Cloud Libraries/Storage | |
| Multi-product | |

4.2 Number of appeals, by reason

Table 4.2.1: Number of appeals, by reason and product/service