What do we do about Telegram?

Hello,

For the past couple of years, Telegram has been my boogeyman when I needed to remind people to remain nervous about where social media could head. Telegram is an example of a platform that shows little to no interest in integrity or trust & safety work. It seems perfectly content to host violent hate groups, dangerous conspiracy groups, and authoritarian influence operations. And it’s also most likely going to be the next platform to hit a billion users.

A new article in Wired covers an investigation we did into Telegram and the extent to which it is content to host dangerous groups. You can read it here.

The Apple App and Google Play stores have policies around the content that apps can distribute. If an app runs afoul of those policies, it risks being removed from the app stores and thus losing the ability to reach many of its users. As with all content policies, there are plenty of gray areas, and the app stores don’t always get it right. But many of their policies are pretty uncontroversial: for example, both Apple and Google prohibit apps that advocate or plan violence against individuals and groups. Basic stuff, but it has real impact. If an app is distributing content advocating violence, it runs a real risk of being removed from the app stores.

Telegram has found an “innovative” way to remain in the app stores while still hosting content that violates the app stores’ policies against calls to violence. Telegram can place “restrictions” on channels that prevent them from showing up in versions of the app downloaded from the app stores, while still letting those groups exist on Telegram: they remain visible in the web-based version of Telegram and in the version of the app that can be downloaded directly from Telegram’s website. Telegram touts that self-hosted version as having “fewer restrictions”, both on its website and in its press releases.

What does this mean practically? It means that Telegram has looked at a channel and thought, “Oh crap, this channel is full of hate speech and calls to violence. If we show this channel on iPhones, we’ll get kicked out of the App Store. But hey, let’s still keep it on Telegram. So rather than remove the violent, hate-filled channel from our platform, let’s just hide it from Apple and Google. And oh hey, let’s have users download the Android app directly from us or use the browser-based version so they can still have access to the channel.” Telegram has looked at and evaluated the channels in question and decided that yes, they were full of violence, but it wanted to keep them on the platform anyway.
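
To make that pattern concrete, here is a minimal, purely hypothetical sketch. The channel names, flag, and build labels are invented for illustration; this is not Telegram’s actual code or API. It only shows what client-side filtering of this kind amounts to: the content is never removed from the backend, it is merely hidden from the builds that Apple and Google review.

```python
# Hypothetical illustration of client-side "restriction" filtering.
# None of these names come from Telegram; they are assumptions used
# only to show the pattern described above.

RESTRICTED_CHANNELS = {"violent_channel_123"}  # flagged, but never deleted


def visible_channels(all_channels, build_source):
    """Return the channels a given client build will display.

    build_source is assumed to be one of:
    "app_store", "play_store", "web", or "direct_download".
    """
    if build_source in ("app_store", "play_store"):
        # Hide flagged channels only where Apple/Google policies are enforced.
        return [c for c in all_channels if c not in RESTRICTED_CHANNELS]
    # Web and directly downloaded builds show everything.
    return list(all_channels)


channels = ["news_channel", "violent_channel_123"]
print(visible_channels(channels, "app_store"))        # ['news_channel']
print(visible_channels(channels, "direct_download"))  # both channels
```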

And of course, hiding channels and content is ineffective and leaky. Text can simply be copied from the restricted channels, images and videos downloaded, and that content posted directly into non-restricted channels. Users can link to and tag the restricted channels and even instruct each other on how to get around the restrictions. As with most fixes that attack social media problems only at a superficial level, it just doesn’t work! And the non-restricted channels still act as an effective way for the restricted groups to reach large audiences.

[An aside: The founder of Telegram has said that “it’s unlikely that Telegram channels can be used to significantly amplify propaganda” because Telegram doesn’t algorithmically amplify content. But this is of course nonsense. Telegram has a reshare button, which makes it trivial, with a couple of taps, to share a message to dozens of groups. This is a design choice, and it is one that Integrity Institute research has seen lead to huge amplification of misinformation. WhatsApp even restricts reshares around critical societal moments because they lead to the amplification of harmful content. So while Telegram doesn’t have algorithmic amplification of content, it does have “design amplification” of content. And the fact that a user has to choose to join a particular channel doesn’t mean that channel won’t be flooded with content and ideas from channels they did not opt into, or that the channel won’t be co-opted for uses the user didn’t anticipate when they joined it.]
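
To put rough numbers on that “design amplification” point, here is a toy one-hop calculation. Every figure in it is an assumption chosen purely for illustration, not measured Telegram data; the point is only that a reshare button plus ordinary group sizes can produce large reach with no algorithm involved.

```python
# Toy one-hop reshare arithmetic. All numbers are assumptions for
# illustration only; they are not measured Telegram data.
channel_subscribers = 10_000  # assumed size of the originating channel
forward_rate = 0.01           # assume 1% of subscribers tap the reshare button
groups_per_forwarder = 5      # groups each forwarder shares into (a couple of taps)
avg_group_size = 200          # assumed average size of those groups

# Ignores audience overlap; treats each group as a fresh audience.
extra_reach = (channel_subscribers * forward_rate
               * groups_per_forwarder * avg_group_size)
print(f"~{int(extra_reach):,} additional people reached after one hop of resharing")
# Prints ~100,000: ten times the original channel's audience, from one round of forwards.
```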

This poses a challenge to the Apple App and Google Play stores. The policies that govern the app stores act as a “backstop” for how bad apps can get. And the app stores are one of the very few ways in which meaningful and existential pressure can be applied to social media companies from an external source. What do they do when an app in their stores creates the appearance of compliance with their policies but does not genuinely comply and actively subverts them?

But it poses a bigger problem to societies. What do we do when a platform that is perfectly happy distributing violent content hits a billion users?

When the European Commission released its list of platforms that would be regulated as "Very Large Online Platforms" (VLOPs) under the DSA, my first reaction was literally "wait, where the hell is Telegram??" Apparently, Telegram claimed that it didn't have enough users to be regulated as a VLOP, a claim which the EC is thankfully looking into. It just doesn't seem plausible that a platform with 700 million monthly active users, an alleged spot among the 10 most downloaded apps in the world, widespread popularity in eastern Europe, and public access to profiles (here's a typical example) has fewer than 33 million users in the EU.

In general, Telegram is my go-to example of why, in the end, government regulation of social media platforms is necessary. As much as people love to hate on the major platforms for not taking enough action against various harms, those platforms at least did some level of listening to complaints and hearing people out. Facebook, Twitter, and Google haven't always done an amazing job of taking criticism seriously, but they have listened to many critics and acted on some of their concerns. And they have all done proactive work to identify harms that could occur on their platforms and taken some steps against them.

Telegram is simply not interested in engaging with civil society. The only way to get it to take action on the harms the platform is causing seems to be to threaten it existentially, either through government threats to block access or the app stores’ threats to remove it.

Telegram is, in short, a platform that does not belong in polite society. And for now, it looks like Telegram is successfully evading any real accountability. So let this be a reminder that even as society, in general, turns its attention to AI and the future problems that may come from there, we still have yet to see all the harms that social media platforms can cause, and we would be wise to continue to work towards finding comprehensive policy solutions there.

Jeff


P.S. – The new article on Telegram is not the only Wired piece featuring perspectives from the Integrity Institute this month. Check out a previous Wired article, in which Sahar and II members were quoted, covering the impact of trust & safety layoffs and the downstream effect on the startup ecosystem.

Jeff Allen

Jeff Allen is the co-founder and chief research officer of the Integrity Institute. He was a data scientist at Facebook from 2016 to 2019. While at Facebook, he worked on tackling systemic issues in the public content ecosystems of Facebook and Instagram, developing strategies to ensure that the incentive structure the platforms created for publishers was in alignment with Facebook’s company mission statement.
