How User Content Creation Changes the Impact of Social Media Enforcement

By Naomi Shiffman, Integrity Institute member


Summary:

  • When studying social media surfaces to assess the impacts of policy enforcement and deplatforming, academics have historically treated different surface types with equivalent audience sizes as interchangeable. 
  • Surfaces span a spectrum from broadcast-only to highly interactive public surfaces -- and sometimes, users’ actual engagement and experience with a surface differ from its designed intent. 
  • It’s important for academics to understand the true nature of the user experience on a given surface when studying it -- this experience will predict how enforcement actions play out in real life.

What is a social media surface?

Surfaces are the places in which users can create and consume content within a particular social media platform. Surfaces are differentiated by their key features and functionalities -- including posting, direct messaging, and commenting in the context of larger objects such as groups or accounts. They are also differentiated by different broadcasting or storytelling media (Snapchat stories, TikTok videos, etc.), and by the ability to engage in commerce or advertising.

What is an enforcement action?

Enforcement actions can be thought of as the operationalization of a platform's policies. These include:
  • Reductions in distribution for particular users or pieces of content
  • Removal of content or users
  • Creating friction and “cool off” periods following certain violations
  • Strikes on accounts or other complex objects (like groups or subreddits) to curb repeat offenders
  • And more.

There is a surface spectrum

Social media surfaces span a spectrum: from broadcast-only, to highly interactive. On one side of the spectrum are surfaces like Facebook pages and YouTube channels, where one person or a small group of people is typically responsible for creating content. On the other side are highly interactive surfaces, like Facebook Groups and subreddits, where a large number of people can post and create content. 

[Figure: A very rough approximation of the spectrum]

In contrast to the intended purpose of a surface, the actual behaviors that occur on a surface can blur the broadcast/interactive distinction. For example, on TikTok and Twitter, tight networks of individual accounts may behave like an interactive surface, even though each individual account is run by one person or a small group of creators. And sometimes, surfaces initially designed to be interactive turn into broadcast surfaces, such as when a Facebook Group ends up becoming dominated by a single user and other members stop contributing content, making it functionally a page.
The fluidity of the spectrum means that when trying to assess the impact of enforcement, it is critical to understand the nature of user interaction with the surface being studied. Is it truly an interactive surface? How many users does an enforcement action directly impact?

The impacts of enforcement tend to depend on the number of unique content creators and their sense of belonging

In broadcast-type surfaces like Facebook pages, YouTube channels, or Twitter or TikTok accounts, enforcement actions most frequently impact a small group of creators’ ability to create. This means that most people consuming the content don’t lose their own audience or their own ability to create content as a result of the enforcement action. In these cases, content from the individual channel affected by the enforcement action may not resurface -- for example, President Donald Trump’s deplatforming on Twitter and Facebook largely meant his messages didn’t find the same audience as when he was on those platforms. 
In interactive community surfaces, like Telegram groups, Facebook Groups, and subreddits, enforcement actions directly impact a much larger group of people. When one of these surfaces gets shut down, tens of thousands of users can lose a community, an audience, and/or their main creator medium. Additionally, by the time a surface-wide enforcement action occurs, individual users in that community may have already been subject to enforcement actions against their own posts or accounts, increasing the feeling of being targeted and making the mass surface shut-down all the more salient.
This salience means that enforcement actions -- especially shut-downs -- on highly interactive surfaces tend to result in a hydra effect. Users feel personally attacked by the loss of their community and/or audience, and are likely to create or find new venues to replace what they lost, often changing names or using codewords to evade repeated enforcement action. While this sometimes results in cross-platform flight (such as the creation of Gettr, launched by former Trump adviser Jason Miller as a haven for right-wing users who felt silenced by “Big Tech”), evasion and the resurfacing of violating communities often happen on the same platform where the enforcement action occurred, subverting the moderation attempt. After the Stop the Steal Facebook group was removed, clone groups quickly popped up and persisted in defiance of the enforcement action. After the r/The_Donald subreddit was removed, former members organized on a Discord server for continued coordination, but also encouraged members to join specific new subreddits to serve as the group’s new space. When people’s sense of community is violated, they will go to great lengths to retain it.
The large number of people who could potentially be held accountable for violating behavior or content changes the way decisions are made within platform policy teams, as well. These types of enforcement actions tend to be much more heavily debated and take longer to execute, leading to lagging action. This is problematic: by the time violating content and behavior are growing organically on interactive surfaces with a high number of content contributors, the ideas and behavior have become more normalized and have taken on a life of their own. 

When studying content and behavior violations on platforms, different surfaces with equal numbers of consumers are not necessarily alike.

Academic study of platforms has sometimes insufficiently differentiated between different surface types. This is likely due to the challenges in obtaining the data necessary to robustly compare similar types of surfaces across different platforms. As government regulation improves academic access to data, it is critical that academics ground their research in user behavior and experience on social media surfaces, rather than just looking at audience size, or assuming that the prescribed use of a surface is the way it is actually used. This grounding can effectively happen by engaging with integrity workers to get a sense of potential gaps in understanding -- such as through the Integrity Institute, which connects integrity workers with academics for this exact purpose. Better collaboration between integrity workers and academics will lead to growth in the field, and ultimately to a stronger understanding of the impact of social media on society.