The defend campaign didn’t begin with a grand plan. It began with a moment — one of those small, ordinary moments that suddenly shifts the ground beneath you.
It was early 2025. Jason Tanner switched on the radio, expecting the usual morning noise, and instead heard that Elon Musk was giving public commentary on British child‑grooming gangs. It was jarring. Why was a tech billionaire, with no background in child protection or UK social policy, suddenly positioning himself as an authority on one of Britain’s most sensitive issues?
Curiosity turned into unease. Jason opened X, Musk's platform, to see what was happening. What he found felt less like a conversation and more like a man-made storm.
A pattern hiding in plain sight
Strands of unrelated issues were being stitched together — child protection, immigration, crime, “law and order,” cultural identity — all woven into a single, emotionally charged narrative. Posts implied that the offenders were illegal immigrants, despite a lack of supporting evidence. Others tried to pin blame on the Prime Minister, calling for unrest and political upheaval.
It didn’t feel organic. It felt engineered.
Jason began responding to some of the more extreme posts, not to argue, but to inject a little truth into the noise. But something strange happened: his replies vanished into the void. Engagement collapsed. It was as if he’d been quietly pushed into a soundproof room.
Yet on Threads, a competing platform, similar posts reached people normally. The contrast was too sharp to ignore.
As he watched more closely, Jason noticed an odd phenomenon. Interactions on X often came from accounts linked to Musk’s family or companies — or from profiles that existed solely to retweet pro‑Musk content. They didn't feel like real conversations. They were signals. Echoes. A kind of artificial crowd.
And the same messaging — the same tone, the same themes — was appearing in posts relating to other countries too. Figures seemingly aligned with Musk’s interests were pushing identical narratives about immigration, crime, and threats to women’s safety. It was a template, replicated across borders.
Meanwhile, official data told a very different story. The loudest claims on X simply didn’t match reality.
The moment everything clicked
After days of watching the same patterns repeat, Jason realised something fundamental:
the idea of “free speech” was being reshaped in front of our eyes.
Platforms claiming to defend it were using algorithms to decide who gets heard and who gets buried. Calls to shut down traditional media were sold as liberation for ordinary people, when in fact they shifted communicative power towards the platform itself and away from established outlets, now branded with a loaded term: the legacy media.
People believed they were speaking freely. But their visibility, their very presence, depended on an algorithm they couldn’t see. Like Jason, most people had heard about algorithms and simply assumed they were trendy new things. A touch of the Emperor's New Clothes had descended on a society too busy to question or understand.
That was the moment the campaign took root. Not as a project, but as a responsibility.
Once Jason recognised the pattern, he began documenting it. Not as a researcher in a lab, but as someone watching a system behave in ways that didn’t match its promises.
The more he looked, the clearer the picture became.
Posts challenging dominant narratives were quietly suppressed. Accounts pushing divisive themes appeared in clusters, often created within days of each other. Racialised language spread rapidly, amplified by networks of profiles that behaved more like coordinated units than individuals.
It wasn’t chaos. It was choreography.
The myth of “the only place for free speech”
Jason got the sense that X, under Elon Musk, was positioning itself as the last refuge of “real free speech.” It was a seductive message — simple, bold, and repeated endlessly. But the reality was far more controlled.
Users were encouraged to believe they were finally being heard. Yet many were speaking into a hall of mirrors, where visibility depended on whether the algorithm approved.
The platform wasn’t protecting free speech. It was curating it.
In November 2025, Jason published findings that added a new layer to the story. X’s own AI system, Grok, criticised Musk for failing to deploy existing technologies that could detect and remove bots — tools that have been available since 2018.
The implication was stark:
the platform had the ability to clean up its own manipulation, but chose not to.
Bots weren’t a flaw. They were part of the ecosystem.
When a single platform can distort public debate, amplify misinformation, and present itself as the only “truthful” space, democracy becomes fragile. The public conversation narrows. Independent voices fade. And the idea of free speech becomes a slogan rather than a reality.
Fairer free speech requires diversity, transparency, and independence — not dependence on a single, privately controlled platform.
The campaign actively seeks supporters, volunteers and partners. Defend also welcomes case studies from individuals, private and public sector organisations, enterprises and charities. Contributions can be kept anonymous if preferred. Contact: jason@jasontanner.uk