Elon Musk’s crusade to buy Twitter and, well, crusades in general, have a lot in common: Nobody asked for them.
On Tuesday, Musk made an offer to buy Twitter for about $43 billion and take the public company private. The attempted takeover is in the name of increasing “freedom of speech” on the platform. Musk did his own “research” on freedom of speech issues on Twitter — in the form of a Twitter poll.
As Musk explained in a TED interview, he thinks Twitter needs to be more sparing in both its content moderation and the people it suspends or bans for violating rules.
“If in doubt, let the speech exist,” Musk said, adding, “I don’t have all the answers here.”
The thing is, research shows that most Americans, on both sides of the aisle, disagree. A new working paper from MIT and Yale researchers found that 80 percent of Americans think social media companies should take action to reduce the spread of misinformation. That view spans party lines: even among Republicans, 68 percent agree.
“Our data suggests that [Musk’s views] are not representative,” David Rand, an MIT professor of management science and one of the study’s co-authors, said. “A lot of people in Silicon Valley have this kind of maybe libertarian, extreme free speech orientation that I don’t think is in line with actually how most Americans and social media platform users think about things.”
The study surveyed 4,900 Americans on their opinions about content moderation on social media. The researchers asked whether platforms should moderate misinformation generally, but also asked about one specific case of content moderation: QAnon. It turns out that even with a relatively partisan case of misinformation, there is still bipartisan support for cleaning up social media: 52.5 percent of people who identified as “Strong Republicans” agreed that companies should “try to reduce the spread of the QAnon conspiracy theory on their platforms.”
“A concern that often comes up when you talk about enforcement on misinformation is like, okay, great, nobody wants misinformation, but people really disagree about what’s misinformation,” Rand said. “And I think that’s actually overstated. There’s actually much less partisan disagreement [on what qualifies as misinformation] than you might think.”
Other studies have shown that the public thinks social media companies should bear responsibility for misinformation spreading on their platforms, and a 2021 Pew survey found rising support for the idea that the government should intervene to reduce the spread of misinformation online. But the new MIT paper asks specifically about the actions social media companies themselves should take, and it found bipartisan support for this type of content moderation.
While the study, drawn from data gathered using the online market research tool Lucid, has not yet been published or peer reviewed, other researchers view its findings as sound. Jonathan Nagler, the co-director of NYU’s Center for Social Media and Politics, said it was a “good study,” though he viewed sampling several thousand people, rather than tens of thousands, as a “limitation.” Still, Nagler echoed the disconnect between Musk’s “freedom of speech” views and those of the public.
“Most of the public thinks it’s a good idea to get misinformation off of platforms,” Nagler said. “Freedom of speech problems are a little bit in the eye of the beholder.”