AI Research Study on Online Abuse

Raheem Sterling, Manchester City

New recommendations unveiled for football stakeholders and social media platforms

A PFA Charity report into online abuse aimed at professional footballers has revealed significant blind spots in combating it, with 43% of Premier League players in the study experiencing targeted and explicitly racist abuse.

The PFA Charity’s study, conducted in partnership with data science company Signify Group and supported by Kick It Out, used machine learning systems to analyse messages sent publicly via Twitter to 44 high-profile current and former players from across the top divisions of English football.

During the six weeks of ‘Project Restart’, Signify analysed 825,515 tweets directed at the selected players, identifying over 3,000 explicitly abusive messages. 56% of all the discriminatory abuse identified during the study was racist. 

Key Findings:

  • 43% of Premier League players in the study experienced targeted and explicitly racist abuse. 
  • 29% of racially abusive posts came in emoji form. 
  • 50% of the total online abuse recorded was received by just three players, who called out racial abuse during Project Restart. 

Nearly half of the players’ accounts monitored in the study received abuse that would constitute a sanctionable offence in The FA handbook, demonstrating that players are held to a higher code of conduct than the people they engage with online. However, players are limited in how they can respond to this level of abuse, with action from social networks relying heavily on the victim of abuse to read and report every abusive message they receive. 

The study also found that Twitter’s algorithms were not effectively intercepting racially abusive posts that were sent using emojis. This is a glaring oversight: abusive emoji posts constituted 29% of the racist messages players received. The use of emojis in racist abuse had previously been highlighted by the PFA during meetings with Twitter; however, it remains a significant blind spot in tackling online abuse.
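To illustrate the gap described above (this sketch is not from the report, and the block list and emoji set are hypothetical placeholders, not any real moderation system), a filter that only matches listed words will pass an emoji-only message unless the characters themselves are also screened:

```python
# Illustrative sketch only: why a word-list filter misses emoji-based abuse
# unless emoji characters are screened explicitly. The block list and the
# flagged-emoji set below are hypothetical placeholders.

ABUSIVE_WORDS = {"slur1", "slur2"}             # placeholder block list
FLAGGED_EMOJI = {"\U0001F34C", "\U0001F412"}   # e.g. banana, monkey emoji

def keyword_filter(text: str) -> bool:
    """Word-based filter: splits on whitespace and checks a block list."""
    return any(word.lower() in ABUSIVE_WORDS for word in text.split())

def emoji_aware_filter(text: str) -> bool:
    """Also scans each character against a flagged-emoji set."""
    return keyword_filter(text) or any(ch in FLAGGED_EMOJI for ch in text)

msg = "\U0001F412\U0001F412\U0001F412"  # emoji-only message, no listed words
print(keyword_filter(msg))       # False: no block-listed word present
print(emoji_aware_filter(msg))   # True: flagged emoji detected
```

In practice, context matters (the same emoji can be innocuous or abusive depending on the target and surrounding text), which is why the report points to monitoring technology rather than simple character lists.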

England and Manchester City forward, Raheem Sterling said: “I don’t know how many times I need to say this, but football and the social media platforms need to step up, show real leadership and take proper action in tackling online abuse. The technology is there to make a difference, but I’m increasingly questioning if there is the will.”   

Following the PFA’s Enough campaign in 2019, football’s stakeholders met at Wembley and identified machine learning as a potential method to capture, analyse and quantify online abuse. This resulting pilot study has demonstrated that this is a viable way for football to offer more protection for players, clubs, officials and fans. 

Wycombe Wanderers forward, Adebayo Akinfenwa said: “As someone who has experienced online abuse first-hand and spoken to teammates who have experienced the same, I can say that players don’t want warm words of comfort from football’s authorities and social media giants, we want action. The time for talking has passed, we now need action by those who can make a difference.” 

The PFA Charity wants to see a change in practice so that repercussions for online abuse are not solely reliant on victim complaints or platform intervention. We will be calling on football’s stakeholders and clubs to adopt a centralised system that collates and submits relevant evidence to the police and highlights any possible action within the game. All those involved in football need to work together collectively to ensure this practice then becomes the industry norm.

The damning data in this report means now is the time for football to take decisive action: 

  • Proactive monitoring of social media platforms: football’s stakeholders and clubs should now work together to fund a centralised, AI-driven system that proactively monitors abusive users across social media platforms.
  • Apply offline consequences for online actions: aim to identify abusive users and then pursue real-world consequences, including prosecutions, stadium bans, and suspensions within amateur and grassroots football.
  • Evidence-led pressure on social media platforms: to gain commitment to more proactive interventions against abusive posts and to stronger measures against abusive users.
  • Call on social media platforms to address abusive emojis: greater use of monitoring and technology to address the use of emojis as a form of abuse.

Simone Pound, Head of Equalities at the PFA says: “Online abuse is a problem that will not go away without concerted action by the government, football authorities and social media platforms. The recommendations we have announced today can make a real impact, but it needs everyone to work together to achieve change. The players we represent are demanding action, but we need all the key stakeholders to step up and take responsibility together.  

“Social media companies must do more to address abuse on their channels and not consider it an expected experience. Twitter and Instagram are used extensively by players to engage with the footballing community and wider public. Both platforms generate huge profits off the content created by the football industry and have a duty of care to ensure players and all other users are protected from racist abuse while using their products.”

Sanjay Bhandari, Chair at Kick It Out says: “This report confirms what we have known for a while: that social media can be a battleground of hate with few consequences for abusers. We also know that players and the public who witness this hate are victims, who are let down by the cracks in the system. The question is, what do we do about it? We need government, law enforcement, the leagues and clubs to commit to working together to fill in those cracks in the enforcement system. We need better government regulation and improved sharing of data and intelligence to scratch below the surface, in order to understand and address root causes. Crucially, we need fans and social media organisations to be part of the solution. This is a behavioural and technological problem. We need behavioural and technological solutions.”

Signify’s CEO, Jonathan Hirshler said: “This has been an important initiative, developed with the PFA, to support and protect players, staff and their families online through tangible action. It is a continuation of our work to help move the default response to online abuse from reactive to proactive. Driven by our proprietary threat monitoring service - Threat Matrix - we have been able to identify, analyse, source and de-anonymise targeted online abuse.”

Download the report

The findings in this report provide irrefutable, tangible evidence that must not be ignored. Please note, the report contains abusive and racist language.