Dear Elon: Protect the Public Square

We agree that transparency, ethics, and free expression are keys to Twitter’s future. Here’s how you can make that happen.

We are a collection of folks with experience keeping online platforms safe, legal, and unbiased: a field known as Integrity. We’ve heard you’re purchasing Twitter with the aim of making it a true modern public square. We all agree that social media is pretty broken, and we share your commitment to protecting people’s ability to express themselves freely and hash out disagreements without fear of censorship or harassment. But a forum for “free expression” requires more than giving everyone the mic. It also requires clear processes, a culture of transparency, and a product that guides people toward best practices and behavior. Importantly, it also requires top executives to support the employees on the ground who are tasked with building these systems, enforcing their decisions consistently, and being open and public about their processes.

With Twitter, you have the opportunity to foster this dynamic and succeed where many platforms have failed. Be open about the decisions you make and the reasons you make them: let the public see and understand which tweets rise to the top of their timelines, and why. We share your desire for radical transparency. By respecting Twitter's users enough to share the logic behind your decisions, and by giving people ownership and understanding of their own experience, you can come closer to a true public square than any other platform has been able to achieve.

The first step in that process, as you’ve said, is algorithmic transparency. We agree that Twitter should be open about how its algorithms decide which posts get seen by millions of people and which do not. To understand the algorithm, users need to know which content is most successful on Twitter and why the algorithm scores that content so highly. Transparency around high-performing content begets trust, and greater trust in and affinity for Twitter leads to more substantive debate about how the product can be improved.

To go even further, we believe that Twitter and all social media platforms need to share public, aggregate metrics on behavior. Users need to be informed about how much spam, hate speech, and misinformation is being posted and spread on the platforms they use, so they have the full context of what is happening, the approaches taken to address those issues, and the reasoning that identified them as problems in the first place. You are now in a position to lead this effort among your peers. By making Twitter the example to follow, you can expand access to the tools and data that help the public understand how a platform can support the health of the information ecosystem.

To be clear, we want to protect freedom of expression. Many of us went into this line of work because we wanted to help build an internet where people can fully express themselves, even when their beliefs differ from those with power. However, freedom of expression can be limited not only by censorship but also by fear of cancellation or violence. Sometimes our attempts to make some people feel safe expressing themselves have made others feel censored. Companies have made mistakes here, and we don’t claim there are easy answers, but we hope you will work with us on finding ways to do this better.

Though moderation, ranking, and the processes that surround those functions are important, the push toward transparency shouldn’t stop there. You can also lead by being transparent about changes to Twitter’s product design. It is prudent to scrutinize Twitter’s features and the kinds of behavior they encourage: the retweet button, comment threads, and the much-discussed “edit function” all elicit dramatically different user behaviors. Product design dictates the culture of an online space, and it can mitigate the need for enforcement and oversight down the line.

Lastly, we know how important freedom from political bias or financial incentives is to the health of the public square. As a result, platforms need to bake in rigorous standards to uphold that responsibility. This is exactly why we echo your concern about platforms making high-profile decisions (content moderation or otherwise) in ways that are ad hoc or subject to political or financial pressure. We know that many people believe such decisions are made in a biased way, favoring one side of the political aisle or the other. Those decisions ought to be made in a principled way: through a transparent process, rather than by any individual.

When this work is done in the dark, especially during times of confusion or unrest, it will always breed mistrust. By shining a light on these decisions, you could set a new gold standard for social media platforms and increase public trust and accountability.

We know we don’t have all the answers. No single person does. But we do have the expertise to know that doing nothing isn’t the answer. Social media can be a tool for fostering democracy, but we have also seen high-profile users harness disinformation to drive political instability. Leaving a space open to misinformation, spam, and incitements to violence doesn’t work. The process is important, but so is doing the process the right way. We’re committed to trying new things and evaluating results through an iterative, scientific process that engenders trust and transparency. We hope that by engaging with us on this work in a meaningful way, with the care and respect it deserves, you can change it for the better.

Fundamentally, Integrity workers share your goal of free expression and an open, vibrant internet. We have years of testing, research, and work invested in those ideals. Public mistrust of unseen content arbiters largely stems from the current opacity of this work. We urge you to empower Twitter’s integrity workers with the freedom to be transparent, the independence to execute policy consistently, and the funding they need to protect the platform’s users. We hope to see you champion these ideals and take this opportunity to forge collaborative relationships with integrity professionals, both inside and outside Twitter.

Signatories

Sagnik Ghosh

Jeff Allen

Dylan Moses

Chris Campbell

Rob Ennals

Tim Gavin

Katie Harbath

Sahar Massachi

Arturo Béjar

Grady Ward

Juliet Shen

Brandon Silverman

Abe Katz

Chad Woodford

Nichole Sessego

Naomi Shiffman

Ravi Sandepudi

Bri Riggio

Elise Liu

Simeon Anderson

Sam Plank

Colleen Mearn