
Over 2,000 abusive posts sent about Premier League and WSL figures in one weekend

Photo: Reuters
A BBC investigation has uncovered more than 2,000 extremely abusive social media posts - including death threats, rape threats, and racist slurs - directed at managers and players across the Premier League and the Women's Super League (WSL) over a single weekend.

Working with data science company Signify, the BBC analysed posts made during 10 Premier League and six WSL fixtures on 8 and 9 November.

The findings paint a stark picture: abusive content is rising, managers receive more abuse than players, and platforms are failing to remove large volumes of dangerous messages.

Managers Targeted Most - Amorim, Slot and Howe Among Worst Hit

Of the abusive posts identified, 82% appeared on X (formerly Twitter).

Premier League managers Ruben Amorim, Arne Slot, and Eddie Howe were the most frequently targeted individuals, whilst Chelsea Women manager Sonia Bompastor and the club themselves received half of all abuse aimed at WSL teams.

A majority of toxic messages - about 61% - originated from accounts based in the UK and the Republic of Ireland.

Signify's data suggests the overall volume of serious abuse is rising year on year.

Liverpool manager Slot acknowledged the issue, saying: "Abuse is never a good thing, whether it's about me or other managers.

"I do not have social media so I don't see it, but I'm not stupid, I know it's there."

AI Flags 22,000 Posts - But Only 2,015 Confirmed as Extreme Abuse

Signify uses an AI system called Threat Matrix to scan social media posts across X, Instagram, Facebook, and TikTok.

Across the selected weekend, it analysed over 500,000 posts - flagging 22,389 as potentially abusive.

Every flagged message then went through a two-stage human review process.

Analysts confirmed that 2,015 messages met the threshold for extreme abuse, including threats to life, rape threats, and hate speech.

Thirty-nine posts were deemed serious enough to warrant further investigation, with one being referred to the police.

Only one of the posts flagged to Meta was removed. Some posts flagged to X were deleted, whilst others had their visibility reduced but remained online.

PFA chief executive Maheta Molango criticised the lack of accountability: "If this happened on the street, this would have criminal consequences.

"So why is it that online people have got this sense of impunity? We need to put an end to this."

Tottenham vs Man Utd Produced Weekend's Worst Surge

The highest spike in abusive messages occurred during Tottenham Hotspur's dramatic 2-2 draw with Manchester United on 8 November - a match swung by two late goals.

Managers and players from both clubs were subjected to concentrated hate, including explicit death threats against Amorim.

One message seen by the BBC read: "Kill Amorim - someone get that dirty Portuguese."

WSL Abuse Personal and Violent

Chelsea's controversial 1-1 draw with Arsenal generated the vast majority of WSL abuse that weekend.

Of the 97 verified abusive posts, over half were directed at Chelsea manager Bompastor, including threats of violence and homophobic insults.

Bompastor said the impact extends beyond those targeted: "I have a family, including kids. They don't want to see those comments online. They are so young, and people need to realise the effect it can have on them too."

Ex-Chelsea Women defender Jess Carter was subjected to racist abuse during Euro 2025, and Bompastor believes social platforms must take greater responsibility.

Clubs Begin Taking Action as Trust in Platforms Declines

With frustration growing, clubs are increasingly adopting their own protective measures.

Arsenal have partnered with Signify for three years and have seen a 90% reduction in abusive posts from affiliated fans after banning offenders and running education programmes.

Chelsea Women have now partnered with Signify, whilst Tottenham are conducting investigations into season-ticket holders suspected of sending abusive content.

The Premier League's director of content protection, Tim Cooper, said: "We're constantly monitoring around matches where the abuse can happen and looking for trigger instances such as a goal being scored, missed penalties, or even things like yellow and red cards.

"The platforms can do more by changing their algorithms. That would be a step in the right direction."

Law Now Demands Platforms Act - But Are They Complying?

The Online Safety Act, which became law in October 2023, requires social media companies to proactively remove illegal content such as hate speech and threats.

Ofcom is responsible for enforcing these rules.

Platforms, however, continue to argue that free speech concerns limit their actions.

Meanwhile, Signify reports a 25% annual rise in abusive content detected by its systems.

CEO Jonathan Hirshler said: "We understand the platforms' position on free speech but some of the stuff we're talking about is so egregious. Really nasty death threats and really horrible, violent content.

"If the people who are the free speech absolutists out there read some of those messages, they wouldn't question why some of these are being reported and why action needs to be taken."

Despite repeated requests, neither X nor Meta responded to the BBC's questions.
