
Shutting down X won’t stop misinformation

Linda Yaccarino, chief executive officer of X Corp., at the VivaTech conference in Paris, 2024. Credit: Bloomberg via Getty

August 8, 2024 - 10:00am

Apparently, it’s Elon Musk’s fault. The rioting, that is. Jessica Simor KC wants Parliament to “pass a short Bill closing Twitter down in the UK”. Peter Jukes, the co-founder of Byline Media, compares X under Musk’s management to “Paris under Nazi occupation”. Edward Luce, associate editor of the Financial Times, argues that “Musk’s menace to democracy is intolerable”.

The British public is pointing the finger too. According to YouGov, 92% of Labour voters, 94% of Lib Dem voters, 89% of Conservative voters and 78% of Reform voters believe that social media is at least partly responsible for the riots.

The case against social media — including Twitter — is based on the fact that false reports had circulated online before the first riots. The truth is that the alleged perpetrator of the Southport stabbings is the son of Rwandan immigrants and not, as baselessly claimed, a Muslim who’d arrived illegally on a boat.

Obvious misinformation then, but is it really responsible for the rioting? That would presuppose that the rioters — be they far-Right provocateurs or local yobs — care a great deal about the distinction between one kind of migrant and another. But for the sake of argument, let’s assume that misinformation of this kind can cause riots. The next question, then, is whether banning Twitter would stop false rumours from spreading.

The answer, of course, is no. Even if there wasn’t a way of getting round a national ban on a global website (and there is), the misinformation would still circulate on other social networks. So what we’re really talking about here is a complete ban on social media.

We’d need to look at other forms of electronic communication too. In the wake of the 2011 riots, there was a major flap about the role played by the BlackBerry Messenger app (BBM) — an example of techno-scapegoating that looks rather silly in retrospect.

The fact is that misinformation doesn’t need the internet at all. You may recall the foot-and-mouth epidemic of 2001 — which was so bad that it delayed the general election that year. In the run-up to the campaign, Conservative Central Office received multiple reports from party members that government officials were taking rooms in country pubs and hotels across the land. Supposedly, the visitors had told the locals that a massive further cull of the nation’s farm animals was being planned — one conveniently scheduled for after the election.

As the relevant desk officer in the Conservative Research Department, I was tasked with verifying this information. But upon contacting the informants, a familiar pattern emerged: basically the same story in every case, but always told at one remove. My interviewees hadn’t spoken to the visiting officials directly — instead, it was always a friend or a friend-of-a-friend. In other words, it shared the dynamics of a classic urban legend (or, in this case, a rural one).

2001 was before the age of social media. Facebook was still three years away from its launch date and Twitter five years away. But that didn’t stop an unfounded rumour from going viral.

It’s therefore doubtful that a digital clampdown could stop misinformation in its tracks. The proposal does, however, provide a distraction from the failures of immigration policy. There’s more than one way of obscuring the truth.


Peter Franklin is Associate Editor of UnHerd. He was previously a policy advisor and speechwriter on environmental and social issues.

