By now, everyone has seen Mark Zuckerberg’s video announcing various policy changes at Meta, which include the phasing-out of fact-checking in favor of X-style “community notes,” the re-embrace of political content on Meta properties, and a form of free-speech absolutism that will finally — finally! — allow users to call gay people mentally ill.
Can you feel our national conversation becoming freer, healthier, and more rational?
Yeah, neither can I.
The video is a fascinating cultural artifact. I am convinced that when our AI scholars are studying the effects of Jenga Capitalism in thirty years, they’ll return to it again and again, debating the meaning of its distinctive qualities: Zuckerberg’s triumphalist tone; the nakedly ideological framing; his very expensive watch.
But Zuckerberg’s video is equally fascinating for what it leaves out. At no point is he honest about why he made these changes.
Much of the coverage has focused on the fact that Zuckerberg wants to cozy up to Trump, or that he is sincerely motivated by a return to the free speech principles he felt forced to abandon when Sheryl Sandberg was running the show. But I haven’t seen anyone apply Occam’s Razor to cut through to the core of Zuckerberg’s main motivation: money.
Meta’s business model is built on engagement. That engagement is a function of content and algorithmic amplification. Now that the algorithms are free to consider most anything a human being (or AI) wants to say, they can more easily keep people glued to Meta’s family of apps. No need to leave Instagram for politically unhinged discourse! You can now find it on the same platform that facilitates eating disorders in teenage girls! Phew!
Despite its economic heft, Meta also doesn’t operate in a vacuum: The company faces new threats in the social media sector that likely drove Zuckerberg to make these changes. For example, I think it’s telling that he wants political content back on Facebook and Instagram after claiming he prefers to focus the experience on “meaningful updates” from “family and friends” (lol). He can’t afford to lose valuable engagement to the Mad Max world of X or the CCP-sanctioned environs of TikTok, which both allow human beings to be the political animals they are (unless, of course, they’re saying something Elon Musk doesn’t like or talking about Uighur Muslims).
To put the finest possible point on it: The election of Consuls Trump and Elon precipitated a vibe shift, and that vibe shift gave Zuckerberg the cultural cover to do the things he’s long wanted to do — cede responsibility for what happens on Meta’s apps, pretend algorithms are neutral pieces of software, and, most critically of all, make money for shareholders.
He’s triumphant for a reason. It’s freeing to disentangle oneself from social responsibility.
For what it’s worth, I believe in free speech, both as a legal concept and cultural value. If someone wants to write homophobic slurs on their personal blog, no one should stop them. I even agree with Zuckerberg that fact-checking has in many ways done more harm than good, especially since fact-checkers suppressed information about the Covid-19 pandemic that deserved a fair hearing: For example, do we even have scientific institutions capable of doing responsible “gain-of-function” research? (No, btw.) In another context, I’d be unambiguously supportive of giving users the freedom to debate a broader range of topics.
But let’s not pretend that what happens on algorithmically mediated platforms relates — except in an indirect way — to the principles of free speech. It’s one thing to let people say whatever they want. It’s another thing entirely to amplify it, and that’s what Meta does.
It should be obvious by now, but apparently it bears repeating: Meta’s apps are entertainment products, not town squares. And whether it wants to admit it or not, Meta is responsible for the entertainment it chooses to provide. If the creator of a self-driving car programmed it to crash into a building, they would be held responsible. If an AI scriptwriter wrote a painful scene of dialogue, viewers would look to the model creator to fix it.
But Meta has long refused to fix anything, and Zuckerberg’s announcement only celebrates that long-unacknowledged fact. For decades (!), the company has swerved humanity into oncoming traffic again and again and again, then hidden behind algorithmic obfuscation: “The AI knows what you want; if you don’t like the outcomes, it’s only because you can’t accept human beings for what they really are.”
Which is both true and not true. We don’t know what the algorithms would surface if they weren’t so relentlessly optimized for short-run engagement and monetization. Moreover, as the exceptional Henry Farrell writes, society uses social media to build a mental representation of what our fellow citizens are like. If all we see are car accidents, we begin to think human beings can’t drive, and update our views accordingly.
Concretely, in 2021, many people might have believed DEI was more popular than it turned out to be, while in 2025, we might conclude that people are more racist than they actually are. Regardless, when we use algorithmically driven platforms to make these inferences, we can’t help but think differently ourselves. Act differently ourselves. We are social animals, continually trying to situate ourselves among our tribe.
We free speech defenders shouldn’t delude ourselves into believing that algorithmic platforms can ever be a boon to free speech. It’s impossible. The speech these platforms choose to elevate is fundamentally driven by commerce — by what transforms people into the best possible advertising targets. And, as purveyors of late-night infomercials figured out long ago, the best possible advertising targets are tired, anxious, and a bit depressed. I know it’s reductive, but it’s true. Psychologically, we’re just not very complicated.
Now, Meta’s algorithms will be even more empowered to turn its users into frazzled and emotionally exhausted scroller-consumers. Its apps will become havens of abuse so that Zuckerberg can boost engagement by a few hundred basis points. So he can sell a few more pairs of socks.
Someone’s gotta pay for that watch.