Adobe Transparency Center
EU Digital Services Act Transparency Report
Published: [DATE OF PUBLICATION]
Reporting Period: February 17, 2024 – December 31, 2024
About this Transparency Report
Adobe Software Systems Ireland (“Adobe” or “we”) shares the European Commission's goal of making the internet safer through transparency and accountability. In accordance with Articles 15 and 24 of the European Union (“EU”) Digital Services Act (“DSA” or “Regulation”), Adobe is publishing our first transparency report (“Report”) for the products and services we determined to be within scope of the Regulation. It outlines our efforts to moderate content across our diverse offerings in accordance with our General Terms of Use, Content Policies, and product-specific Community Guidelines (collectively, “Policies”), and provides detailed metrics regarding those efforts for the reporting period, 17 February 2024 to 31 December 2024.
Adobe's products and services range from subscription enterprise collaboration tools to free photo editing mobile apps, and we are continually assessing whether our offerings fall within the scope of the DSA. We are committed to regularly updating our Policies and processes and to publishing this report annually in accordance with our obligations under the Regulation. More information about how we approach content moderation and the DSA may be found at our Transparency Center.
Section 1: Orders received from Member States’ authorities
Article 15(1)(a): information about orders received from Member States’ authorities
This section details the volume of requests from law enforcement and government agencies to remove content or provide user information pursuant to Articles 9 and 10.
1.1 Government removal orders from EU member states
Adobe did not receive any removal orders from courts or government agencies in the EU during the reporting period.
1.2 Requests for user data from EU member states
User data requests may include requests issued pursuant to local law as well as law enforcement requests, such as emergency disclosure requests. The Adobe Trust & Safety team carefully reviews these requests to determine the validity of the legal process, to assess the proportionality of the request, and to ensure compliance with Adobe's international data protection commitments.
Scope of data: Requests for user data from EU member state authorities
Median time to acknowledge receipt: [Placeholder]
Median time to give effect to the order: [Placeholder]
Section 2: Notices received in accordance with Article 16
Article 15(1)(b): information about notices submitted in accordance with Art. 16
[Placeholder: overview text explaining that Adobe has content policies in place that apply worldwide, but we also take action pursuant to local laws.]
Adobe has updated its notice mechanisms in accordance with Article 16 to allow users, Trusted Flaggers (as defined by Article 22), and other entities to report content that they believe violates local law. When we receive such a notice, we review the reported content in line with our Content Policies as outlined in [Placeholder section] and take the appropriate action. If the reported content does not violate Adobe’s policies, we review it for legality based on the provided information and may restrict access to the content in the European Union. Article 16 notices are not processed by automated means for any of Adobe’s in-scope products or services.
Trusted Flaggers who have been designated by the Digital Services Coordinator of the Member State in which they are established can, as described above, report allegedly illegal content in line with Article 22. Adobe has not received any reports from designated Trusted Flaggers as of 31 December 2024.
2.1 Number of Article 16 notices submitted, by category of alleged illegal content
Table 2.1.1 reflects the number of notices submitted by EU users pursuant to Article 16 during the reporting period, broken down by category of alleged illegal content and product/service.
2.2 Number of actions taken in response to Article 16 notices
2.3 Median time to take action on content identified in Article 16 notices
Table 2.3.1 reflects the median time, in hours, required to take action on content identified in Article 16 notices.
Section 3: Content moderation engaged in at Adobe’s own initiative
3.1 Proactive content moderation
[Placeholder: overview text detailing Adobe’s commitments to online safety and transparency]
[Placeholder: text outlining Adobe’s broad strategy to address harmful content across its products/services, including reporting options, reactive content moderation, proactive efforts to identify and moderate content, and participation in industry groups and organizations also aiming to create a safer internet.]
[Placeholder: explanation that enforcement actions may differ between products/services]
[Placeholder: training provided to content moderators]
[Placeholder: number and type of measures taken that affect availability of content]
[Placeholder: automated means for content moderation. Explain both “hybrid” mechanisms (e.g., CSAM scanning, classifiers for spam, etc.) and fully automated mechanisms (e.g., classifiers for phishing and nudity). For discussions of automated tools, also include safeguards and notes on their accuracy]
Table 3.1.1: Number and type of measures taken affecting the availability of content, by product/service [Placeholder table; rows include InDesign and Creative Cloud Libraries/Storage]
Table 3.1.2: Number of content moderation actions taken by automated means
[Placeholder: explaining our classifiers for phishing, nudity, etc.]