The news about Elon Musk’s intended purchase of Twitter has brought about a lot of hooey and hand-wringing. Here are my thoughts. I first listened to a very interesting interview of Renee DiResta, an expert on tech policy at the Stanford Internet Observatory whom I have quoted numerous times in the past, by ex-White House speechwriter Jon Favreau. She makes the case that Elon has a fundamental misunderstanding of what online free speech means, even setting aside the fact that free-speech protections apply to governments, not companies. DiResta expands on a piece she wrote for The Atlantic a few weeks ago, arguing that Elon is more about attention than freedom (and who knows if his bid will even go through). “Free expression should be a foundational value,” she wrote. She also makes the case that all online social media products moderate their content – and most do so reactively, inconsistently, or clumsily, or all three. This includes Truth Social, Gettr and Parler, just to name some of the more notable “free speech” ones. (The hyperlinks will take you to their community guidelines for your future reference.)
Suzanne Nossel, the CEO of the writers’ group PEN America, writes that “Musk will learn the hard way that there is no return to a mythic online Eden where all forms of speech flourish in miraculous harmony.” However, she agrees with him (and others) that our current content moderation methods are deeply flawed. If you haven’t learned the terms “shadow banned” (where followers are deleted from your social accounts without telling them) or “retconned” (officially sanctioned revisionist history), you will hear them more often during these discussions.
So what is the solution? DiResta and others penned this piece in SciAm, suggesting that social media companies need to become more transparent. “The only way to understand what is happening on the platforms is for lawmakers and regulators to require social media companies to provide researchers and others access to data on the structures of social media, like platform features and algorithms.” PEN’s Nossel also favors more transparency. She suggests that more moderation is essential to prevent spammers, trolls, and other purveyors of quackery from taking over social media and that “robust content moderation is here to stay,” especially to try to stem the tide of false-positive takedowns of content and users. For example: TikTok restores more than 1M videos each month after initially removing them for violations. Of course, it allows millions more to be posted to its site. But still, that is an awful lot of content to judge.
I think there is a bigger question that many of the commentators aren’t really addressing: do we really want an online town square? The comparison doesn’t really work when millions of people are shouting to be heard, or in places in the world that are under the grip of authoritarians. It very quickly devolves from the marketplace of ideas into mob rule. DiResta spoke about the “high harm areas of online that are worth moderating,” which is a good way to look at this, especially given the falsehoods spewed there and how they are amplified and become part of the conversation offline.