THE KIDS AREN’T ALRIGHT - AND THE INTERNET KNOWS IT
- Nico Dekens | dutch_osintguy
Somewhere in Europe last month, a 13-year-old boy sat in his bedroom playing games after school. His mother thought he was battling aliens. In reality, he was being groomed by someone he’d never met, inside a private chatroom on a platform she didn’t even know existed.
And by the time she realised something had shifted, the vocabulary had changed. The humour was sharper. The tone was angrier. He talked in absolutes, in enemies, in purity.
Phrases that sounded eerily similar to the patterns documented in the AIVD’s 2025 report on minors and online radicalisation. This wasn’t a child exploring edgy ideas. This was a child being reshaped.
This is no longer a fringe problem. It is the new baseline.
Across Europe, Asia, Africa, and the Americas, intelligence agencies are reporting a rise in youth radicalisation happening entirely online. The NCTV’s 2025 update warns that radicalised individuals in Dutch investigations are increasingly “younger and more diverse.” Europol’s 2025 Referral Action Day uncovered thousands of extremist posts aimed directly at minors. The Council of Europe’s analysis on the misuse of technology by terrorists describes entire cases where children were groomed into violent ideologies without ever having real-world contact with extremists.
It’s unsettling. And necessary to hear.

I chose to write this blog after reading and analysing the documentation that has been made public in recent years. I spent years investigating terrorist groups, networks, and online radicalisation myself; I was already researching this topic almost a decade ago, in 2016.
And one thing is very clear: kids are being targeted, everywhere.
With some OSINT and resilience we can fight back. We must, for future generations.
Below are three real-world case stories drawn from documented 2024–2025 patterns. They aren’t hypotheticals. They’re signals of what may already be happening to kids you know.
CASE 1: The Gaming Lobby That Became a Classroom
A boy in the UK, just fourteen, joined a private “squad” in his favourite online game. It felt harmless at first: a group of older players joking, strategising, coaching.
His mother was relieved he finally seemed to have friends.
But things changed. Subtly at first.
The group introduced jokes about “outsiders,” then videos making fun of immigrants, then rants about defending “our people.” The boy laughed along, because laughing was how you stayed in the group. He felt chosen. Included.
Inside that private lobby, the older players weren’t helping him train his aim. They were training his worldview.
Their language, tone, and humour bore a striking resemblance to themes highlighted in the 2025 Guardian investigation into extremist recruitment in gaming communities. It didn’t look like indoctrination. It looked like bonding.
His mother sensed something was wrong long before she understood what it was. The way he snapped at his sister. The way he rolled his eyes at basic empathy. The way he suddenly praised concepts he didn’t understand a month earlier.
If she had known to look at the emotional temperature (not the content), she might have caught it sooner. Radicalisation rarely announces itself through ideology. It announces itself through mood.
And the escape wasn’t a parental crackdown. It started with a conversation, gentle and curious, about what he liked about the group. What they gave him. Why he felt important there. Once he understood that the appeal wasn’t the beliefs but the belonging, he was (slowly) willing to reconsider who got to define his identity.
CASE 2: The Algorithm That Built a Worldview
A twelve-year-old girl opened TikTok after school and watched a flashy edit of a protest. Strong music. Strong visuals. Strong emotions. TikTok noticed. So the platform showed her another one. And another. And another.
Within a week, her “For You” page was a constant stream of conflict, outrage, conspiracy rhetoric, and calls to action. She felt empowered. “Awake.” Special.
Exactly the psychological profile described in Europol’s 2025 operation on extremist content targeting minors.
From the outside, her parents saw nothing but a child spending time on her phone, something she’d always done. But something in her shifted. She grew overwhelmed by injustice she couldn’t articulate, angry at adults who “didn’t get it,” convinced that she was part of a struggle far bigger than her.
It wasn’t ideology pulling her in. It was emotion.
When her parents finally intervened, they didn’t ban the app. They sat next to her and scrolled with her. They asked her how she thought the app decided what to show her.
They asked her why the feed made her feel the way it did. They showed her how different the content became when they interacted with other topics.
Understanding the algorithm broke the spell.
Self-awareness restored autonomy.
Narrative literacy became a shield.
CASE 3: The Discord Server That Wasn’t What It Seemed
A fifteen-year-old Dutch student joined a “historical roleplay” server on Discord. The branding looked harmless. The themes seemed educational. His parents thought: finally, something better than endless scrolling.
But the server had layers.
Public channels were clean. Private channels required “trust.”
And the deeper he went, the more the tone shifted from historical discussion to racial mythology to violent fantasies about defending civilisation.
The structure mirrored patterns described in the Council of Europe’s 2025 report on extremist misuse of private servers and echoed themes in the AIVD’s 2025 findings on the types of content Dutch minors were consuming.
The child wasn’t being radicalised by ideology. He was being radicalised by the community. By secrecy. By identity.

His parents eventually noticed he locked his screen the moment they entered the room, wore headphones constantly, and reacted defensively when asked about “online friends.”
Extracting him wasn’t a confrontation. It was an invitation:
Tell me why you like them.
Tell me what you get there that you don’t get here.
Tell me who you are when you’re with them.
Radicalisation thrives when identity collapses around a single community.
It dissolves when identity expands beyond it.
IT IS ABOUT RESILIENCE, NOT RULES
We all instinctively want to protect our children with rules: less screen time, more supervision, stricter devices, more locked-down apps. But rules aren’t shields; they’re delays. The world always finds a way around them.
And that’s because radicalisation doesn’t begin online.
It begins in emotion.
A child who feels disconnected will look for connection.
A child who feels unimportant will look for importance.
A child who feels misunderstood will look for someone who “gets it.”
A child who feels powerless will look for a cause.
Extremists know this. Algorithms amplify this.
And children rarely see it happening.

If we want to protect them, we need to understand the landscape they live inside: a behavioural-engineering system designed to optimise attention, not wellbeing.
The real shield isn’t restriction. It’s resilience.
It’s raising kids who recognise manipulation the way we recognise smoke from a fire. It’s teaching them that information is designed, not discovered. That “truth” online often has a marketing department behind it. That outrage is a product.
And that the moment they feel “chosen,” “special,” “uniquely awake,” or “part of something big,” they should pause, because every manipulator, from conspiracy theorists to extremist recruiters, begins with the promise of significance.
OSINT isn’t just a profession anymore. It’s a form of digital literacy. Parents don’t need to become analysts, but they do need to adopt the analyst mindset. Not suspicious. Not paranoid. Just curious. Just aware.
If we raise kids who question, who examine sources, who understand algorithms, who see emotional manipulation for what it is, who know that belonging doesn’t have to come from strangers behind screens, then they’re not just safer. They’re unplayable.
And that is how you win a war you can’t see.
You may also want to read my previous blog on this topic.
REFERENCES - FULL SOURCE LIST
Intelligence & Government Reports
- AIVD (2025), “A web of hate: The online hold of extremism and terrorism on minors”
- NCTV (2025), “Rising number of young people radicalised online”, https://english.nctv.nl/latest/news/2025/06/17/nctv-rising-number-of-young-people-radicalised-online
- Europol (2025), “TE-SAT 2025: Terrorism Situation and Trend Report” (2024–2025 trends)
- Europol (2025), “Europol coordinates operation against terrorist content online targeting minors”
- Council of Europe (2025), “Report on the emerging patterns of misuse of technology by terrorists and extremists”, https://rm.coe.int/report-on-the-emerging-patterns-of-misuse-of-technology-by-terrorist-a/4880281a94
Investigative Journalism & Research
- The Guardian (2025), investigation into far-right recruitment via gaming and livestreaming platforms
- UNICRI / UNODC (2025), “Level Up – Gaming and Violent Extremism in Africa”
- Vision of Humanity / IEP (2025), “Lone Wolf and Youth Terrorism: Evolving Patterns” (youth terrorism analysis)
Academic & Analytical Sources
- Extremist Eleven Model (2025), cross-ideological extremist-language detection model (arXiv)
- Own research (2016), “Radical Engagement”: hypothesising a radical multi-platform journey