Top headlines

Lead story

The dumpster fire that is the state of content moderation on X, formerly Twitter, is well documented. But even when social media platforms do a better job of weeding out hate speech and other harmful content, a key question remains: If social media platforms are today’s public spaces, how is it that a handful of big tech companies get to decide what speech and behavior are acceptable?

The question is far from academic. The fate of nations hinges on the intersection of politics, misinformation, social media platforms and the billionaires who own them. UMass Amherst digital media scholars Ethan Zuckerman and Chand Rajendra-Nicolucci apply the lens of history to the problem and find a potential solution in the early days of the internet, before content moderation was a job description.

Eric Smalley

Science + Technology Editor

Content moderators like these workers make decisions about online communities based on company dictates. Ilana Panich-Linsman for The Washington Post via Getty Images

Let the community work it out: Throwback to early internet days could fix social media’s crisis of legitimacy

Ethan Zuckerman, UMass Amherst; Chand Rajendra-Nicolucci, UMass Amherst

In the days of online bulletin board systems, community members decided what was acceptable. Reviving that approach to content moderation offers big tech platforms a path to legitimacy as public spaces.
