The Research Team Goes to Brussels

Matt Motyl spent the past 15 years studying political psychology and social media. He was a professor until Meta recruited him to join the Civic Integrity team in preparation for the 2020 US election. He worked primarily on building recommendation algorithms that elevated high-quality civic and health content, and decreased distribution of harmful content. Since leaving Meta in 2023, Matt has been advising civil society organizations, elected officials, and technology companies on how social technologies work (or fail to work).

Matt is a Resident Fellow in Research & Policy at the Integrity Institute, Senior Advisor to the University of Southern California’s Neely Center where he is the lead analyst for the Neely Social Media Index, and founder of Unmoderated Insights.

The full text of this post may be accessed here.


I am writing from the seat of the European Union in Brussels, Belgium. After 15 years studying social media, 4 years working in social media, and the past year-plus trying to fix social media from the outside, I was invited for an action-packed week with key stakeholders responsible for enforcing the Digital Services Act, the EU’s expansive regulation of tech companies.

European Digital Media Observatory

The week began with the annual European Digital Media Observatory conference, which centered on the potential impacts of artificial intelligence and social media on the EU’s first parliamentary elections since 2019. The top fear raised by researchers was deepfake videos released on the eve of the elections: more people now have access to increasingly believable generative AI video tools, and they can post those creations to social media platforms that are unlikely to be able to verify whether the videos are real before distributing them to billions of voters. While experimental research on the impact of social media and deepfakes on elections is unclear, it’s fair to say that recommending videos showing people doing objectionable things they never did is… not great.

Another major theme, and the reason for my invitation, was data access and platform transparency. Specifically, the Digital Services Act requires platforms to share data with key stakeholders and researchers. Despite this requirement, many platforms have actually been reducing access to their data. For example, Meta announced it was shutting down CrowdTangle, which was the best tool for tracking in real time what news Facebook and Instagram were amplifying to their billions of users. Although Meta announced a replacement tool, almost nobody in the large audience who previously had access to CrowdTangle has been granted access to it (myself included). And X (formerly Twitter) now charges around $500,000 per year for access to less data than it used to offer for free, effectively putting X data out of reach for most academic and non-profit researchers (again, myself included). The Digital Services Act requires data access, which is the basis for some of the current investigations into these companies. If found non-compliant, companies could be fined up to 6% of their annual global revenue. For Meta, that fine would be a cool $8,040,000,000 (based on 2023 revenue).
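
Spelling out the arithmetic, with Meta’s 2023 global revenue taken as roughly $134 billion (the base implied by that figure, and close to Meta’s reported full-year number):

$$0.06 \times \$134{,}000{,}000{,}000 \approx \$8{,}040{,}000{,}000$$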

After the conference, EDMO hosted representatives from the 27 EU member states’ hubs for a discussion around platform data. They invited me to speak about what data tech companies collect about their users, how the data are structured, and what researchers and regulators might need to ask for when requesting data. In preparing that presentation, and through many conversations over the course of the week, it became clear to me that explaining these things to people who haven’t worked on the technical side of these companies is sorely needed. So, I’ve begun working on an interactive e-book on social technology data practices and how outsiders can learn to work with the data that these companies are increasingly going to be required to make publicly available. For a brief summary, check out my slide deck here.
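
To give a flavor of what “how the data are structured” means in practice, here is a deliberately simplified, hypothetical sketch of the kind of engagement-event record a platform might log. The field names are my own illustration, not any company’s actual schema:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical illustration only: these fields are a sketch of a typical
# engagement-event record, not any platform's actual schema.
@dataclass
class EngagementEvent:
    user_id: str          # pseudonymous identifier for the acting account
    content_id: str       # the post, video, or ad that was interacted with
    event_type: str       # e.g., "impression", "like", "share", "report"
    surface: str          # where it happened, e.g., "feed", "search", "reels"
    timestamp: datetime   # when the platform logged the event (UTC)
    ranking_score: float  # model score that influenced whether it was shown

# One example record of the kind researchers might request in aggregate
event = EngagementEvent(
    user_id="u_12345",
    content_id="c_67890",
    event_type="share",
    surface="feed",
    timestamp=datetime(2024, 6, 3, 14, 30, tzinfo=timezone.utc),
    ranking_score=0.87,
)
print(event)
```

Even a toy example like this makes clear why data requests need to specify which event types, which surfaces, and what time windows are in scope.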

European Commission, Center for Democracy & Technology - Europe, Mozilla, and more

The Integrity Institute Research & Policy team and I had many other meetings with key partners like the European Commission, the Center for Democracy & Technology - Europe, and the Mozilla Foundation. While I can’t yet share too much about those conversations, I will highlight three observations:

  1. The European Commission has an incredibly sophisticated group of experts working on all-things-Digital-Services-Act-related. The depth of expertise was striking to me, as most of my prior advising and consulting has been with United States-based elected officials and staff, who historically have not been particularly well versed in how the latest technology works (or fails to work). However, I have seen improvements on the US side, especially in the US Senate Judiciary Hearings on Child Safety earlier this year, and I am optimistic that the improvement will continue thanks to the federal push to hire more technologists into the public sector and programs like TechCongress.

  2. The European Union is way ahead of the US when it comes to tech regulation, and it should provide many lessons to the rest of the world as other governments move toward regulating an industry that has operated without much oversight for the past few decades. It’s too early to say whether the DSA will be effective in reducing harms caused by or related to social technologies, but we will start to see some of the effects in the upcoming months as companies’ risk assessments are released to the public (ETA late August 2024) and their data transparency programs become more accessible. Regardless, this is the most far-reaching tech regulatory policy in the world and should be incredibly informative.

  3. Much like in the US, there are some excellent civil society and non-profit organizations working to create a safer and more useful social internet. Much gratitude to the Center for Democracy & Technology - Europe and the Mozilla Foundation for their important work.

All in all, it was a great trip, and I may be heading back later this year to lead workshops on how to understand and work with online platform data.

