Yoel Roth, the former head of Trust & Safety at Twitter, is sounding the alarm about the challenges of moderating decentralized social media platforms, known collectively as the fediverse. This network of applications spans decentralized spaces like Mastodon, Threads, and Pixelfed, and it is increasingly targeted by actors intent on undermining the integrity and safety of online discourse. In a podcast conversation with @Rabble on revolution.social, Roth outlined these issues and argued for robust moderation tools to sustain a healthy online environment.
Roth praised the fediverse's progress in democratizing online interaction, but said that progress alone does not provide the oversight needed for effective content moderation. Without the right tools, he argued, users and community administrators cannot uphold the standards of safety and accountability that today's online world demands. His view echoes a growing sentiment in the tech community that decentralized platforms face mounting difficulty in moderating user behavior.
Lack of Moderation Tools
Roth underscored a key limitation in the content moderation capacity of most fediverse platforms. While services like Mastodon and Bluesky offer distinctive advantages thanks to their decentralized structure, they too often lack the moderation tools they need.
“We saw it coming two years ago. IFTAS saw it coming. Everybody who’s been working in this space is largely volunteering their time and efforts, and that only goes so far, because at some point, people have families and need to pay bills, and compute costs stack up if you need to run ML models to detect certain types of bad content,” – Yoel Roth
Roth's concerns extend to data collection. He believes privacy considerations may deter many federated platform administrators from collecting or reviewing key logs, including those tied to user-generated content. That gap in oversight can allow abusive, hateful, and dangerous content to spread unchecked.
Roth’s tenure at Twitter deeply shapes his perspective on the economics of moderation. He argues that the federated approach, while appealing for its democratic values, is inherently unsustainable without significant investment in moderation technology.
The Impact of Misinformation
Misinformation is an especially urgent problem on decentralized platforms. Roth recounted a telling story involving Twitter co-founder Jack Dorsey, who once retweeted a post from a Russian troll without realizing it. The episode underscores how susceptible social networks are to coordinated manipulation.
“The CEO of the company liked this content, amplified it, and had no way of knowing as a user that Crystal Johnson was actually a Russian troll,” – Yoel Roth
Roth noted that large language models can now produce posts more persuasive than human-written ones, a reality that makes effective, fair moderation exceedingly difficult for platforms that focus on content alone.
“If you’re starting with the content, you’re in an arms race against leading AI models and you’ve already lost,” – Yoel Roth
Instead, he argued, moderation should focus on behavioral signals, since malicious intent can hide inside otherwise engaging content. That shift requires platforms to rethink their moderation methods, and Roth's advocacy for better tools is about more than technical improvement; it is about making a safer online community possible.
The Need for Enforced Protections
Roth argues for a proactive approach to moderation, emphasizing the importance of enforcing protections across all user interactions, even those outside primary applications like Bluesky. Without enforcement mechanisms, he fears, the ambition of Web3 innovators to build decentralized, democratically governed spaces could backfire.
“I don’t blame startups for being startups, or new pieces of software for lacking all the bells and whistles, but if the whole point of the project was increasing democratic legitimacy of governance, and what we’ve done is take a step back on governance, then has this actually worked at all?” – Yoel Roth
He noted a paradox: community-based control is touted as an advantage, yet it rarely comes with the technical instruments needed for enforcement. That tension is a conundrum for anyone dedicated to building healthy communities online.
Taken together, Roth's reflections cast the fediverse as an exciting new chapter in social media's story, one that demands careful attention to the obstacles to moderation. As users flock to decentralized platforms seeking greater control over their online experience, the need for responsible oversight only grows more pressing.