Sir Keir Starmer has condemned comments by Elon Musk suggesting that “civil war is inevitable” following violent unrest in the UK.
The owner of X, formerly Twitter, posted the remarks on the platform in response to a video showing people aiming fireworks at police.
The prime minister’s spokesperson said there was “no justification” for Mr Musk’s comments, adding there was more that social media companies “can and should be doing”.
Justice Minister Heidi Alexander also criticised the tech billionaire for his “totally unjustifiable” and “pretty deplorable” comments.
Disorder has now lasted almost a week, following the fatal stabbing of three girls in Southport. The subsequent unrest in towns and cities across England and in parts of Northern Ireland has been fuelled by online misinformation, the far right and anti-immigration sentiment.
Since posting about civil war on Saturday, Mr Musk has continued to comment on the unrest in the UK.
Mr Musk replied to a post on X from the prime minister – in which Sir Keir said he would not tolerate attacks on mosques or Muslim communities – asking: “Shouldn’t you be concerned about attacks on *all* communities?”
The tech billionaire has also replied to a post criticising UK policing, suggesting the police’s response “does seem one-sided”.
‘Pretty deplorable’
Commenting on Mr Musk’s remarks, Ms Alexander told BBC Breakfast that he “has a responsibility given the huge platform he has, and so, to be honest, I think his comments are pretty deplorable”.
Mr Musk has more than 192 million followers on the platform.
When asked about accusations of two-tier policing in the UK, Ms Alexander said that was a “baseless assertion” that does a “disservice to police men and women who go out to do their jobs and uphold the rule of law”.
She added social media companies had a “moral responsibility” to call for calm and help clamp down on misinformation.
The PM’s spokesperson also said social media firms “have a responsibility” to ensure criminal activity – including from those outside the UK – is not being shared online, and that state actors may be amplifying misinformation.
But they would not say which countries the government believes are behind the posts.
On Monday, Technology Secretary Peter Kyle said he had met representatives from TikTok, Facebook’s parent company Meta, Google and X “to make clear their responsibility to continue to work with us to stop the spread of hateful misinformation and incitement”.
Home Secretary Yvette Cooper has also said the government would not tolerate “armchair thuggery” and that social media companies need to “take responsibility” over online posts encouraging criminality.
The BBC has approached X, Meta, TikTok and Snap for comment.
Offences concerning incitement under UK law predate social media and are listed under the Public Order Act 1986. These include provoking violence and harassment, as well as engaging in rioting.
Meanwhile the Online Safety Act, which became law in 2023 but has not yet fully come into effect, will require social media firms to “take robust action against illegal content and activity”, including “racially or religiously aggravated” offences as well as inciting violence.
The criminal offences introduced by the act will cover sending “threatening communications” online, and sharing “false information intended to cause non-trivial harm”.
On Monday, Sir Keir emphasised that “criminal law applies online as well as offline”.
Social media involvement
Mr Musk’s comments have drawn criticism from some online, with satirist Armando Iannucci saying the Tesla and SpaceX CEO had been “taken in by your own platform, which amplifies noise at the expense of facts”.
Sunder Katwala, director of think tank British Future, said the post was “spreading a narrative that is crucial to socialising people with fairly extreme views towards condoning violence to protect their group”.
He said there needs to be “strong responses from government, Ofcom, and parliament” to the comments.
An Ofcom spokesperson told BBC News it is “moving quickly” to implement the Online Safety Act, so it can be enforced “as soon as possible”.
“When it comes fully into force, tech firms will have to assess the risk of illegal content on their platforms, take steps to stop it appearing and act quickly to remove it when they become aware of it,” they said.
“We expect the illegal harms duties to come into force from around the end of the year… and the additional duties on the largest services in 2026.”
Additional reporting by Tom Gerken, technology reporter