The DSA is Live! So… How are Companies Adapting?
By Nima Mozhgani and Alice Hunsberger, Integrity Institute Members
Nima Mozhgani is the CEO and Co-Founder of Unveil. Unveil automates compliance tasks, such as transparency reporting, for regulations like the EU DSA. Prior to Unveil, Nima was a Sr. Program Manager at Snap Inc., leading transparency reporting and compliance efforts. You can find them on LinkedIn.
Alice Hunsberger is VP of Trust & Safety and Content Moderation at PartnerHero. She is also co-founder and host of the Trust in Tech podcast. You can find them on LinkedIn or at alicelinks.com.
The European Digital Services Act (DSA) went live on February 17, 2024, and companies must now navigate an array of compliance requirements. Although Very Large Online Platforms and Search Engines (VLOPs & VLOSEs) have been aligning with these mandates since mid-2023, the vast majority of companies are now stepping into unexplored compliance territory.
Leading up to the deadline, some non-VLOPs and non-VLOSEs proactively introduced solutions, and we’ve compiled a list of those tackling compliance head on. Some are simple, others extensive and in-depth, but the examples on this list show meaningful progress toward compliance with the DSA’s requirements. Our hope is that they can serve as inspiration and reference points for companies charting a course to DSA compliance.
User Reporting
Under Article 16 of the DSA, Notice and Action Mechanisms, hosting services must give users the tools to report content and accounts. Whether something is illegal or otherwise inappropriate, users should be able to flag it, and platforms should be able to take action by removing or suspending violating content and accounts on their services. For online platforms, this requirement works hand in hand with Article 20, which governs how platforms handle complaints about the decisions they make. The idea behind these articles is to make services safer by ensuring users can easily report issues and platforms can respond effectively.
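To make the mechanics concrete, here is a minimal sketch, in Python, of the kind of information a notice-and-action intake might capture. The Notice class and its field names are our own illustration of the elements Article 16 describes, not any platform’s actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class Notice:
    """One user-submitted report, loosely modeled on the elements Article 16 asks for."""
    content_url: str                      # exact electronic location of the reported content
    explanation: str                      # why the reporter believes it is illegal or violating
    reporter_name: Optional[str] = None   # may be omitted for certain sensitive offense types
    reporter_email: Optional[str] = None
    good_faith_confirmed: bool = False    # reporter's statement that the notice is accurate
    submitted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def is_actionable(self) -> bool:
        # A notice needs a location, an explanation, and a good-faith confirmation
        # before it is routed to moderators for review.
        return bool(self.content_url and self.explanation.strip() and self.good_faith_confirmed)

# Example: a report submitted through a web form outside the app.
report = Notice(
    content_url="https://example.com/post/123",
    explanation="This listing appears to sell counterfeit goods.",
    reporter_email="reporter@example.com",
    good_faith_confirmed=True,
)
print(report.is_actionable())  # True
```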
Hinge - Hinge provides clear guidance on reporting both in-app and online, and offers a web form outside the app. The web form is straightforward and allows users to provide detailed information about the harm that occurred. Their web form and reporting menus map to their Community Guidelines, which makes remediation easier.
Twitch - Twitch provides detailed guidelines for reporting across desktop and mobile. They include intuitive graphics that walk users through each reporting menu, and explain tools like “batch reporting” that help them moderate at scale. They also strengthen their reporting capabilities by speaking candidly about their appeals and enforcement approach, which is likewise relevant to keeping users safe under the DSA.
Appeals
It’s not just about reporting problems: under Article 20, online platforms must also give users a way to challenge the decisions they make. If a user thinks their content was unfairly removed or their account was wrongly restricted, they must be given a way to appeal, that is, to ask the platform to reconsider its decision. The appeals process gives users a voice when they feel they have been treated unjustly and helps ensure that platforms’ decisions are fair and reasonable.
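As a rough illustration of the bookkeeping involved, an internal complaint-handling system might track appeals along these lines. The class and field names below are invented for this sketch; the six-month window reflects Article 20’s requirement that appeals remain available for at least six months after a decision, and the routing comment reflects its expectation that complaints are not decided solely by automated means.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

APPEAL_WINDOW = timedelta(days=183)  # at least six months after the decision

@dataclass
class Appeal:
    """A user's request to reconsider a moderation decision (illustrative only)."""
    decision_id: str            # the enforcement decision being challenged
    user_id: str
    decision_made_at: datetime
    user_statement: str         # why the user believes the decision was wrong

    def is_within_window(self, now: datetime | None = None) -> bool:
        now = now or datetime.now(timezone.utc)
        return now - self.decision_made_at <= APPEAL_WINDOW

appeal = Appeal(
    decision_id="dec-42",
    user_id="user-7",
    decision_made_at=datetime(2024, 3, 1, tzinfo=timezone.utc),
    user_statement="My post quoted the slur to report it, not to use it.",
)

if appeal.is_within_window():
    # Route to a human reviewer rather than resolving by automation alone.
    print("Queue for human review")
else:
    print("Outside the appeal window")
```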
Grindr - Grindr provides a streamlined web form for appeals that spans their full list of enforcement reasons. This matters because it clearly presents users with the decisions they can appeal and makes it easier for Grindr to respond to appeals accurately on the backend. The form also requires users to confirm that they’ve read the Grindr Community Guidelines and each individual harm type’s description, which adds some friction to deter spam while remaining mindful of people looking to appeal in earnest.
Discord - Discord provides both in-app and web appeals for users who have been enforced against for violating their Community Guidelines or Terms of Service. Moreover, they provide alternative resolution options for EU users and a general safety roadmap for their DSA compliance. Ultimately, Discord’s appeals efforts reflect the requirements of the DSA and provide ample means to resolve appeals-related challenges.
Statement of Reasons
Under Article 17 of the DSA, anytime a hosting service or an online platform decides to restrict or remove a user's content, they have to clearly tell the user why. This explanation, called a “statement of reasons,” should provide details about what the user did wrong and what actions were taken against their content or account. The idea is to make sure that users aren't left in the dark about why their posts were removed or their accounts were affected, ensuring that these decisions are transparent and easy to understand.
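A hedged sketch of what such a statement might contain, using illustrative field names rather than any platform’s real notification format:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class StatementOfReasons:
    """The explanation sent to a user after an enforcement action (illustrative)."""
    action_taken: str            # e.g. "content_removed", "account_suspended", "visibility_restricted"
    facts: str                   # what the user did, in plain language
    ground: str                  # "illegal_content" or "terms_of_service"
    rule_reference: str          # the law or policy section relied on
    automated_detection: bool    # whether automated means flagged the content
    automated_decision: bool     # whether the decision itself was automated
    redress_options: list[str]   # e.g. internal appeal, out-of-court settlement, courts

sor = StatementOfReasons(
    action_taken="content_removed",
    facts="Your post of 12 March contained a link to a phishing site.",
    ground="terms_of_service",
    rule_reference="Community Guidelines, section on scams and fraud",
    automated_detection=True,
    automated_decision=False,
    redress_options=["internal appeal", "out-of-court dispute settlement", "judicial redress"],
)

# The same record can drive the in-product notification and any downstream reporting.
print(json.dumps(asdict(sor), indent=2))
```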
Riot Games - Across their various games, notably League of Legends and Valorant, Riot issues notifications to both the reporting and the reported account after a successful report. Reporting accounts are told whether their report was successful and the reason for enforcement. Reported accounts are informed of their punishment, which can range from chat bans to ranked-match bans. Riot’s approach shows a detailed effort toward compliance and offers a model for other platforms to strive for.
Tinder - Tinder announced a simple approach to providing users with a statement of reasons. This represents a meaningful first step in notifying those affected by content or account removals, though it does not currently delineate between harm types or clarify whether automation or humans were involved in the enforcement. These additional distinctions are required under the DSA, but Tinder’s approach nonetheless shows how platforms can begin providing a statement of reasons to their users.
Monthly Active Users (MAU) Disclosures
Under Article 24(2), online platforms must publish their average monthly active users in the EU at least once every six months. In other words, by August 17, 2024, platforms should publish their first set of MAUs publicly on their websites. The goal of this exercise is fairly straightforward: to enable additional transparency around the scale and reach of platforms operating in the EU.
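The calculation itself is simple arithmetic: an average of the past six months of active recipients in the EU. A small illustrative sketch follows; the monthly figures are made up, and 45 million is the DSA’s threshold for VLOP designation.

```python
# Illustrative only: monthly active EU recipients for the last six months.
monthly_active_recipients = [4_100_000, 4_250_000, 4_300_000, 4_450_000, 4_600_000, 4_700_000]

VLOP_THRESHOLD = 45_000_000  # average MAU at or above this can trigger VLOP designation

average_mau = sum(monthly_active_recipients) / len(monthly_active_recipients)
print(f"Average EU MAU over the past six months: {average_mau:,.0f}")
print("Approaching VLOP threshold" if average_mau >= VLOP_THRESHOLD * 0.9 else "Below VLOP threshold")
```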
eBay - eBay has established a hub for its DSA efforts that includes its EU MAUs. The MAU figures are placed up front, alongside additional context about the law and how eBay calculated them. This added transparency reflects the spirit of the regulation and helps demystify platform activity for users.
Roblox - Since February 2023, Roblox has consistently reported MAU metrics under the DSA. Roblox also clearly lists MAUs from previous periods for easy comparison, providing additional transparency and signaling to the EU when Roblox may cross the threshold to become a VLOP.
Transparency Reporting
Under Articles 15, 24, and 42, the DSA outlines numerous requirements for transparency reports. These requirements affect all companies subject to the DSA: regardless of whether a company is designated as an intermediary service, hosting service, or online platform, it must publish a transparency report, with the details varying by designation. Within their transparency reports, companies must publish both quantitative metrics, which include the results of their notice and action mechanisms, user reports, and appeals, and qualitative information, which covers their content moderation approach, training, and policies.
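As a rough sketch of the quantitative side, a report pipeline might aggregate individual moderation actions into the kinds of counters the DSA asks about. The record shape below is our own invention, not a prescribed format.

```python
from collections import Counter

# Illustrative moderation log entries; real pipelines would pull these from a data warehouse.
moderation_actions = [
    {"source": "user_report", "ground": "illegal_content",   "automated": False},
    {"source": "user_report", "ground": "terms_of_service",  "automated": True},
    {"source": "proactive",   "ground": "terms_of_service",  "automated": True},
    {"source": "appeal",      "ground": "terms_of_service",  "automated": False},
]

# Roll the raw records up into report-ready counts.
report = {
    "actions_by_source": Counter(a["source"] for a in moderation_actions),
    "actions_by_ground": Counter(a["ground"] for a in moderation_actions),
    "automated_actions": sum(a["automated"] for a in moderation_actions),
}

for section, values in report.items():
    print(section, dict(values) if isinstance(values, Counter) else values)
```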
Reddit - As an online platform, Reddit makes their transparency report easily accessible to the general public through detailed, visual breakdowns of their moderation policies and operations. In doing so, they position their transparency report as a brand and strategy differentiator by emphasizing their commitment to safety. They delineate between content types (e.g., chats, comments, posts, subreddits) and analyze trends between reporting periods. Although they must publish new data points under the DSA, such as breakdowns by EU member state, Reddit’s current approach showcases how effective transparency reporting can yield positive brand safety results.
Cloudflare - As a hosting service, Cloudflare has been proactively and voluntarily publishing transparency reports since 2013. Their transparency center provides thorough context around their policies and practices, working to make transparency reporting more understandable for lay audiences, which feels very much in the spirit of the DSA! Moreover, by making their report accessible to a wider audience, Cloudflare positions privacy and trust as a key differentiator. While they will need to expand their report with new data points and breakdowns, such as delineations between reasons for moderation decisions, their transparency reports highlight a commitment to user safety.