
THE ALGORITHMIC ANXIETY MACHINE | You can expose the social media algorithm dangers with OSINT

  • Writer: Nico Dekens | dutch_osintguy

How social media platforms like TikTok, Instagram, and YouTube quietly rewire the mental health of an entire generation, and how OSINT can help expose it.


The experiment every parent needs to see: social media algorithm dangers


If you want to understand what teenagers see online, you only need a clean phone, a quiet room, and about half an hour. You will see the dangers of social media algorithms fast.


Take a factory-reset device and install TikTok, Instagram, and YouTube. Make a new account and set the age somewhere around fourteen or fifteen. Once you begin scrolling, don’t touch anything. 


Do not like, comment, or follow. Treat the phone like a lab specimen. Just observe.


The first few minutes look innocent enough. There are jokes, dances, pets, and harmless trends. But gradually something shifts. A clip about anxiety drifts into view. Then a trembling late-night confession. Then a panic video. Then an influencer explaining how she hasn’t eaten a full meal in days. Then a breakup story told in a shaky bathroom mirror. Then a lonely boy confessing that he feels invisible.


You never touched the screen.


The phone still decided what you “needed” to see.


Within thirty minutes, the feed turns into a psychological pressure chamber. It feels curated in a way that has nothing to do with choice and everything to do with emotional engineering.


Algorithm "decides" what our kids see

You will see the internet’s real face, the one your kid sees when you’re not looking.


And you will watch, in real time, how the feed shifts from harmless entertainment to something darker, tighter, and more psychologically surgical:


• anxious confessions

• late-night panic clips

• “5 signs you might have anxiety”

• what-I-eat-in-a-day starvation routines

• breakup edits

• emotional breakdown POVs

• body comparison clips

• trauma dumping

• “he doesn’t really love you” scenarios

• doom-scroll aesthetics


All delivered in a rapid, rhythmic pattern engineered to capture a developing brain: emotion first, logic later.


This is not accidental, organic, or “kids being kids online”.

This is the modern youth internet, an industrial-scale psychological optimisation system disguised as entertainment.


Most adults still believe kids are simply scrolling. They’re not!


The truth is uncomfortable but simple: the algorithm is not reflecting a teen’s emotional state. It is manufacturing it.


Welcome to the Algorithmic Anxiety Machine.



The first hour: how the algorithm quietly tests a teenager


What looks like entertainment is actually an examination. Modern recommendation systems run a kind of psychological probe the moment a new user arrives. They watch how long the screen lingers. They study hesitation. They measure whether your thumb slows down by a fraction of a second.


In OSINT testing across dozens of clean devices, patterns appear quickly. The first ten minutes are a flood of emotional categories, almost like the system is asking a rapid series of questions without saying a word. It shows sadness, then humor, then loneliness, then confidence, then insecurity, then intensity, then softness. The goal is not enjoyment. It is measurement.


A slight pause on a breakup clip? Logged.


A full view of a self-diagnosis meme? Logged.


A momentary hesitation on a girl discussing insecurity? Logged.


From these milliseconds, the system forms a hypothesis:


“This user responds to emotional intensity.”


Once the assumption is made, escalation begins. The feed slowly shifts from variety to emotional concentration. The more minutes that pass, the narrower the emotional palette becomes. What began as a playful mix of content transforms into a tailored environment defined by anxiety, insecurity, heartbreak, or self-diagnosis. The feed has chosen a direction, and the teenager didn’t choose anything at all.
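The feedback loop described above can be sketched as a toy simulation. To be clear, this is not any platform's real recommender; the categories, the dwell signal, and the weight-update rule are illustrative assumptions only. It only shows how a single signal, slightly longer dwell time on one category, is enough to narrow a feed on its own:

```python
import random

# Toy model of the narrowing loop described above. The categories,
# the boost factor, and the dwell signal are invented for illustration.
CATEGORIES = ["humor", "pets", "dance", "anxiety", "heartbreak", "body-image"]

def simulate_feed(dwell_bias, steps=200, boost=0.15, seed=42):
    """Sample clips by category weight. Longer dwell inflates that
    category's weight, so the feed narrows toward it over time."""
    rng = random.Random(seed)
    weights = {c: 1.0 for c in CATEGORIES}
    history = []
    for _ in range(steps):
        total = sum(weights.values())
        pick = rng.choices(CATEGORIES,
                           [weights[c] / total for c in CATEGORIES])[0]
        history.append(pick)
        # Dwell time is the only "signal": a slightly longer pause is
        # logged and fed straight back into the next recommendation.
        dwell = rng.random() + dwell_bias.get(pick, 0.0)
        weights[pick] += boost * dwell
    return history

# A viewer who lingers a little longer on anxiety clips, nothing else:
feed = simulate_feed({"anxiety": 2.0})
print("share of anxiety clips, first 50:", feed[:50].count("anxiety") / 50)
print("share of anxiety clips, last 50:", feed[-50:].count("anxiety") / 50)
```

The viewer never likes, comments, or follows in this model, exactly as in the clean-phone experiment; the hesitation alone steers the feed.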



How curiosity becomes a psychological loop


The danger is not a single video. It is the sequence. Algorithms behave like emotional accelerators. They do not deliver one piece of similar content. They deliver an endless chain that intensifies the core emotion.


  1. The anxiety spiral usually appears first. A simple confession about having a rough day becomes a stream of panic videos, catastrophe scenarios, overthinking edits, “I can’t breathe” monologues, and content that frames anxiety as a permanent personality trait rather than a temporary state. What starts as empathy becomes identity formation.


  2. The self-diagnosis spiral is equally aggressive. A single ADHD joke triggers an avalanche of autism “checklists,” trauma explanations, dissociation tests, and personality-disorder summaries. Teenagers begin adopting diagnostic labels as if they were fashion accessories. The algorithm rewards creators who speak in clinical-sounding language, whether accurate or not, and teens internalize it as truth.


  3. Body-image spirals strike with remarkable speed. Girls are pushed toward comparison, body-checking, hunger disguised as aesthetic, and faces sculpted by filters. Boys are pushed toward hyper-muscular gym culture, steroid-coded physiques, “alpha” posturing, and emotional suppression. The system detects insecurity and then amplifies it.


Underneath all of this lies the most powerful loop: emotional addiction. Teens begin craving the rise-and-fall rhythm of intense emotion. Heartbreak becomes dopamine. Panic becomes stimulation. Jealousy becomes engagement. As long as it makes them feel something strong, the algorithm keeps feeding it.



Age and gender: different feeds, different wounds


Repeated testing shows consistent patterns across age groups. Younger teens around twelve to fourteen are shown stories about awkwardness, social exclusion, and being left out. The underlying suggestion becomes “you don’t fit in, and everyone else does.”


Teens in the fifteen to seventeen range are given darker emotional content: dramatic relationships, depressive confessionals, trauma narratives, late-night breakdown aesthetics. The message evolves into “you are emotionally damaged,” even if they never felt that way before.


By eighteen to twenty-one, a noticeably more existential tone takes over. Burnout. Hopelessness. Cynicism. “Life is meaningless” humor. Every video reinforces the idea that the future is doomed.


Three kids, each with a targeted and tailored algorithmic feed

The algorithm also tailors harm by gender. Girls are pushed toward overthinking, perfectionism, body comparison, and relationship anxiety. Boys are pushed toward aggression, emotional suppression, gym hyper-fixation, and communities that turn rejection into resentment.


Different paths. Same psychological outcome.



Why this system is so dangerous


A generation shaped by anxiety and insecurity is not just mentally fragile. It is fragile in a way that invites manipulation. Teens who live inside algorithmically induced emotional storms are easier to influence, recruit, and exploit.


They become highly susceptible to:


  • extremist recruitment

  • manipulative online communities

  • predatory relationships

  • conspiracy networks

  • red-pill ideology

  • financial and emotional scams

  • foreign psychological operations


The platforms have already done the segmentation. They have sorted teens by emotional vulnerability. A malicious actor simply has to inject content into the right stream. The algorithm delivers it to the most susceptible without any further work.


Intention doesn’t matter here. Impact does.



The OSINT method: how investigators can expose the machine


To understand and document this system, we do not need insider leaks from tech companies. We need disciplined OSINT methodology. A proper Youth Algorithm Lab creates multiple personas of different ages and genders, logs their behaviour, records their screens daily, and tags every form of emotional escalation.


By comparing feeds across personas, investigators can see:


  • how fast emotional escalation occurs

  • which topics dominate each age group

  • how gender influences content selection

  • which vulnerabilities are being amplified

  • where regional or political manipulation appears

  • how spirals form and reinforce themselves


This produces concrete, undeniable evidence. Evidence that parents can understand and policymakers can’t ignore.
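The comparison step can be sketched in a few lines of Python. The personas, tags, and session logs below are hypothetical stand-ins; in a real Youth Algorithm Lab they would come from screen recordings and human coding of each clip:

```python
from collections import Counter

# Hypothetical session logs: one ordered list of content tags per
# persona. Personas and tags are invented for illustration only.
EMOTIONAL = {"anxiety", "self-diagnosis", "body-image", "heartbreak"}

sessions = {
    "girl_14":  ["pets", "dance", "anxiety", "body-image",
                 "anxiety", "heartbreak"],
    "boy_16":   ["humor", "gym", "gym", "alpha-posturing",
                 "heartbreak", "anxiety"],
    "adult_35": ["news", "humor", "pets", "dance", "cooking", "humor"],
}

def escalation_report(feed):
    """When does emotional content first appear, and how much of the
    feed does it make up? Returns (first_index_or_None, share)."""
    first = next((i for i, tag in enumerate(feed) if tag in EMOTIONAL), None)
    share = sum(tag in EMOTIONAL for tag in feed) / len(feed)
    return first, share

for persona, feed in sessions.items():
    first, share = escalation_report(feed)
    dominant = Counter(feed).most_common(1)[0][0]
    when = "never" if first is None else f"clip #{first}"
    print(f"{persona}: first emotional content at {when}, "
          f"{share:.0%} emotional, dominant tag: {dominant}")
```

Even this crude tagging makes the contrast between personas visible at a glance: how early the emotional content arrives, how much of the feed it occupies, and which theme dominates each persona's timeline.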



What parents, policymakers, platforms, and OSINT investigators can do


Parents do not need to know every detail of machine learning. They simply need to see what their children actually see. Running the clean-phone experiment wakes people up faster than any newspaper article ever could.


The most powerful thing parents can teach is that emotions are temporary states, not permanent labels. Anxiety is not identity. Sadness is not destiny. Confusion is not a diagnosis.


Policymakers can demand transparency, require independent algorithm audits, and push for emotional-impact regulations. Platforms have the capability to limit emotional repetition, interrupt spirals, provide real feed-resets, and introduce grounding, neutral content. They simply lack the incentive.


OSINT investigators can map the patterns, expose the harm, partner with mental-health experts, and build publicly accessible datasets. Schools desperately need digital self-defense guides. Teens need training in synthetic awareness. Parents need tools that show them what their kids are surrounded by.



Conclusion


We blame kids for being anxious, overwhelmed, numb, insecure, and constantly online. But we hand them the most powerful psychological manipulation system ever built and call it harmless entertainment. We treat their symptoms as personal failings while the root cause sits humming in their pockets, learning them faster than any parent ever could.


This is not merely a mental-health issue or a parenting challenge. It is a societal stability problem, an international security concern, a child-safety crisis, and a threat to the cognitive foundation of the next generation.


The Algorithmic Anxiety Machine never sleeps. It never doubts itself. It never wonders whether it is causing harm. It only optimizes.


But once you understand how the machine works, once you see the mechanics behind the curtain, everything changes.
