The rise of artificial intelligence has brought a strange new reality to football. Scroll through TikTok and you might stumble across Lionel Messi and Cristiano Ronaldo cutting each other's hair, boarding the Titanic in Edwardian dress, or serving burgers. You might even find Kylian Mbappe riding a ski-lift with a turtle.

Welcome to the age of AI slop.

These tools are becoming more sophisticated and accessible by the day. The content they generate ranges from the obviously absurd to the eerily realistic, and distinguishing what is real from what is fabricated is becoming increasingly difficult.

For now, much of it seems like harmless fun. But underneath the comedy lurks a genuine question: at what point do players and clubs decide enough is enough?

The Brand Protection Problem

As football has evolved into a commercial powerhouse, players and clubs have had to become increasingly vigilant about protecting their brands. Chelsea midfielder Cole Palmer is a case in point. The 23-year-old has trademarked the term "Cold Palmer" with the UK government's Intellectual Property Office, along with his name, autograph, and signature shivering celebration.

But creating legal protections is one thing. Keeping up with the relentless tide of AI-generated content is another challenge entirely.

In the UK, there is limited legislation covering someone's likeness, or what football commonly refers to as image rights. Jonty Cowan, legal director at law firm Wiggin LLP, said AI was presenting "lots of novel challenges" for the industry.

"Various governments around the world are trying to figure out how do we react to AI?" he said.

When Fake Looks Indistinguishable From Real

AI is no longer just creating silly scenarios. It is putting players into situations that look entirely genuine.

Take the January signings of Antoine Semenyo and Marc Guehi by Manchester City. Before the club had released its official photographs, AI-generated images had already appeared online showing both players apparently signing contracts alongside manager Pep Guardiola. There was even an image of Semenyo being greeted at the training centre by former City star Yaya Toure. None of it happened, but you would never know from looking at the pictures.

A similar image surfaced last month showing Manchester United head coach Michael Carrick posing with Frank Ilett, the supporter who refuses to cut his hair until the Red Devils win five consecutive games. Once again, it never happened, but it looked completely authentic.

Cowan said legal recourse is difficult when content is presented "in a non-contentious manner". Unless a person has suffered clear commercial or reputational damage, options are limited.

"It's always been quite challenging for an individual to enforce IP rights," he explained. "If it is a deepfake showing them in a compromising position, that's different."

Where the Law Currently Stands

The Data (Use and Access) Act came into force last month, making it a criminal offence to create, share, or request a sexually explicit deepfake. That is significant, but it covers only the most extreme end of the spectrum.

What about a video showing Celtic midfielder Luke McCowan punching an assistant referee? Could that damage his reputation, or is it simply too outlandish to be believed?

A more pressing concern for players may be the legal concept of "passing off", where someone unfairly associates their products with an established brand or individual to mislead consumers. In December 2024, the UK government said it was considering "introducing some kind of personality right", which would give players broader scope to take legal action.

Clubs have more options available. If someone puts a player in a Manchester City shirt, the club could pursue trademark infringement claims around their crest or design rights in their kit. The BBC understands City believe fans know their official channels remain the only source for genuine content. But as the lines between real and fake continue to blur, that stance may be tested.

Challenging the Platforms Directly

Taking creators to court is long and costly. Cowan suggested a quicker route: challenging the platforms themselves.

"The Online Safety Act has been introduced in the UK recently, and that is putting an obligation on platforms to tackle illegal content," he said. "Often, that is the easiest and quickest way to tackle these images."

This is fuelling growth in companies specialising in digital rights protection. These firms scrape websites and apps, using AI itself, to identify where a player's image has been used without permission. They can request takedowns without the affected parties getting involved.

AI as a Tool and a Threat

AI is not all bad news for football. Adverts and promotional material can now be created without players needing to leave their homes. But alongside the legitimate uses, it is far too easy for unauthorised parties to take a player's likeness and use it to promote their products.

Last year, Meta's oversight board banned a gambling advert on Facebook that used a manipulated video of former Brazil striker Ronaldo, imitating his voice. The advert had not been picked up by Meta's automated detection tools, and the platform was told to create "easily identifiable indicators that distinguish AI content" to prevent scam content.

The Football Association faced similar issues during Euro 2024, when fake AI-generated interviews appeared showing England head coach Gareth Southgate making derogatory remarks about his players. The videos were removed for breaching TikTok's AI content policy, but not before millions had already viewed and shared them.

What Happens Next?

Today, it is rare for anyone posting AI-generated content to label it as such, despite TikTok's guidelines explicitly asking users to do so.

Cowan believes major legislative change is unlikely soon, but platforms could face tougher rules.

"Under advertising regulations, influencers have to disclose where a video has been sponsored. I suspect we may end up with similar transparency requirements. A little '#AI generated' or similar label in the corner."

The problem is whether creators will bother to comply, and how effectively platforms can police it. As Cowan pointed out: "If you've got those egregious videos, where someone's putting out a hideous deepfake, they're not going to worry about adding that label."

For now, most clubs treat AI slop as just another feature of social media. But there may come a point when more decisive action is required. The technology is not slowing down. The question is whether football can keep pace.
