
07/10/25
AI Britain
Do you remember the Egg Monster?
We first stumbled across it back in November: a strange, slightly disturbing AI-generated character that appeared on our Instagram feed. What started as a curiosity quickly took us down a rabbit hole – suddenly we were noticing similar surreal AI content everywhere, each clip stranger than the last.
The Egg Monster was strange, grotesque even – but ultimately harmless. A glimpse into AI's creative potential that felt safely absurd. But AI-generated content is evolving. The bizarre is giving way to something that feels more serious, more pointed. Now it's creating caricatures of Britain: our towns, our politics, our divisions.
Much of what we’ve seen is obviously created to be funny.
For example, some of the AI videos tap into Britain's fondness for regional stereotypes – how people from different areas supposedly dress, talk, and behave. A similar formula appears in clips about 'UK shoppers', showing customer stereotypes tied to different retailers. Whilst these kinds of regional jokes aren't new or necessarily problematic, it's worth pausing to ask whether this new trend suggests anything deeper about how AI views us as a country.
Some AI representations of Britain are more overtly political – leaning into division and 'culture wars'. Videos range from caricatures of 'the woke left wing' to negative portrayals of flag-waving Unite the Kingdom attendees. Often these reels are framed as mock news reports, with an AI 'reporter' interviewing participants and claiming to reveal what people think.
Research by LSE, reported in the Times, has highlighted the circulation of anti-Muslim AI content – though this research only examined Britain First and Europe Invasion accounts. What we've found when looking more broadly is far harder to categorise – part of a sprawling landscape poking fun across races, cultures, left and right. What unites these videos is a common pattern: individuals distorted into caricatures, visuals exaggerated, viewpoints cut down into provocative soundbites.
The result feels like parody – and perhaps that's the intention. Many viewers in the comments treat them as humorous, though reactions vary widely.
It's perhaps unsurprising that immigration also features prominently in the videos we’ve seen. These AI-generated reels stage interviews with 'migrants' arriving on small boats, asking them why they've come to Britain, with the 'migrants' offering responses that poke fun at British culture.
Whilst it's currently relatively easy to identify these videos as AI-generated, they are rapidly getting better. The days of counting fingers or spotting an extra leg seem to have gone; sometimes it's only the audio, rather than the video, that gives it away.
Historically, satire and entertainment were more clearly identifiable by the bounded context in which we encountered them. Now the lines between satire and ideology, entertainment and propaganda, factual reporting and artistic representation are becoming increasingly blurred. This is AI Britain.
At a time when people are forming opinions based on fragments of content seen on social media, how are these videos – which will only become more convincing – contributing to wider public opinion? And whose responsibility is it to take appropriate action?
Beyond the algorithm, beyond the caricatures, lies the real Britain: a complex, contradictory, deeply human place that can't be captured in a soundbite or generated by a model.
The research industry has increasingly moved away from in-depth work with people from all walks of life in their own environments – favouring remote interviews, online panels, and digital communities. We’re concerned it’s already operating in its own filter bubble – perpetuating assumptions rather than reflecting what people actually do and think. Now there's talk of going further still: using AI-generated ‘synthetic respondents’. But if these videos are anything to go by, what would those respondents be like? Caricatures shaped by our own biases? Reflections of their creators' worldviews? This risks pushing both the industry and its clients further from the nuanced reality of people's lives.
We are launching In Reality: Meeting people where they are – an ongoing ethnographic project, spending time with people across the UK, in the places they live and the contexts of their everyday lives.
Internally, our Revealing Reality research team call this project 'Connections'. It's the antithesis of AI.
We will be sharing live updates from our researchers, who are out talking to people across the UK about some of the big questions currently facing society.
Contact us if you would like to find out more: damon.deionno@revealingreality.co.uk