With Britain struggling to put down far-right riots, Elon Musk is only fanning the flames. Days after British Prime Minister Keir Starmer warned social media firms over online misinformation that fueled the unrest, Musk waded in, suggesting the riots were the result of mass migration and adding that “civil war is inevitable.”
In fact, evidence suggests that falsehoods amplified on Musk’s platform have fueled the unrest, and the world’s richest man has faced almost no repercussions. Even in Britain, which has passed one of the most ambitious laws to regulate toxic online behavior, authorities are hamstrung in addressing dangerous lies that proliferate across Telegram, TikTok or Musk’s X.
At tense times such as these, when online sparks can turn into real-world wildfires, some platforms still have not done enough to act against the misinformation that spread across their services after the stabbing attack in Southport, England, that killed three young girls attending a dance class. The alleged attacker was born in the UK, but posts that went viral on X falsely claimed he was a Muslim “asylum seeker” named Ali al-Shakati who had come to the UK by boat last year. Andrew Tate, a repellent online influencer, told his 9 million followers on X that the suspect was “an illegal migrant.”
Rioters went on to attack mosques and set fire to hotels housing asylum seekers.
There have also been calls for violence. On Telegram, a messaging and broadcasting app, a user on Monday posted to a channel with 13,000 followers a list of locations across the UK for rioters to gather at. “WEDNESDAY NIGHT LADS,” it said. “MASK UP.”
When I alerted Telegram’s press team to the post, they responded the following day, saying the channel had been removed for breaking Telegram’s rules against calls to violence.
However, by the following morning, another channel with the same name was carrying the same inflammatory post. Telegram is manually playing whack-a-mole, which suggests a poorly designed system at best.
“Each day, millions of pieces of dangerous content are removed before they can cause harm,” the Telegram spokesman said.
Britain’s new Online Safety Act, passed last year, had the potential to improve such systems. Loosely inspired by health and safety policy, it was cleverly designed to thread the needle of maintaining free speech by giving large online platforms a “duty of care” toward their users. Rather than go after specific content or individual posters, the law gave the British regulator Ofcom the power to conduct risk assessments on platforms like Telegram or X, to check that misinformation was not spreading in a way that could cause real-world harm. It targeted systems and processes, not individual content. It also gave Ofcom the power to issue substantial fines and, in extreme cases, block non-compliant services.
However, the act does not fully come into force until early next year. Even if it were in force today, it would not address the specific lies that fueled the current violence.
Jonathan Brash, the member of the British Parliament for Hartlepool, a town that saw violent clashes with police, told BBC radio that lies were being “spread quite deliberately” to stoke tensions in different communities.
However, under the new law, platforms like TikTok, X and Telegram did absolutely nothing wrong by letting those falsehoods go viral, even at a time when they could inflame further violence.
That ludicrous failing is the handiwork of Britain’s previous Conservative government, which watered down the Online Safety Act just before it was passed, removing a section that banned “legal but harmful” content so that the rules would only apply to content that was already illegal under existing law. The calls to violence on Telegram, for instance, would break the rules, but Musk’s viral “civil war” comment, or posts claiming the Southport suspect was Muslim, would not.
While the Internet, and notably Musk’s social media site, is a cesspool, it is of course not the cause of all social unrest; there are underlying social tensions and economic inequalities to consider. However, dog whistles can now have a far-reaching impact thanks to the amplification of social media, particularly on platforms that have gutted their trust and safety teams.
The thugs rioting on Britain’s streets could not care less about the little girls who were killed more than a week ago. They are exploiting a tragedy to act out their pathologies. They are capitalizing on the continued, unfettered power that online platforms have over the flow of information. A groundbreaking law that could have finally held those companies to account for dangerous misinformation lost much of its bite because of the short-sightedness of politicians. Passing the law at all was a step forward, but not a big enough one.
For now, Starmer should avoid being drawn into a war of words with Musk that could get increasingly ugly. Having found his latest shiny object, the billionaire is retweeting videos and memes slamming Britain’s policing and taunting the prime minister with hashtags like #TwoTierKeir. There is little point engaging with an Internet troll who seems to have little regard for the consequences of his actions, or of the actions of those who use his platform.
Parmy Olson is a Bloomberg Opinion columnist covering technology. A former reporter for the Wall Street Journal and Forbes, she is the author of “We Are Anonymous.” This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.