You taught the algorithm exactly what hurts you and now it shows you that thing every single day

  • Tension: We train algorithms to find our wounds, then wonder why scrolling feels like picking at scabs.
  • Noise: The belief that algorithms are neutral observers rather than mirrors we’ve taught to exploit us.
  • Direct Message: Your feed isn’t showing you what you need; it’s showing you what you can’t stop watching.

Imagine you’re sitting with a friend who keeps dating the same type of person. Not the same person, but the same dynamic, the same inevitable ending.

They show you their dating app, proud of how well it knows them now, how it only shows them exactly their type. You watch them swipe through a parade of future disappointments, each one algorithmically selected to activate the exact pattern they claim they want to break.

This is what we’re all doing with our feeds, except the pattern isn’t about romance. It’s about whatever specific flavor of pain we’ve trained the machine to serve us.

The algorithm is an excellent student

I spent twelve years as a clinical psychologist, and one thing became clear: people are remarkably good at finding exactly what hurts them. Before algorithms, we had to work harder at it.

We had to seek out the relationships that confirmed our worst beliefs about ourselves, find the situations that activated our oldest wounds. Now we carry a device that has memorized every pause, every replay, every three-second hesitation that revealed what we can’t look away from.

The machine doesn’t care about your growth or healing. It cares about engagement, and nothing engages quite like an unresolved wound. Imran Ahmed, CEO of the Center for Countering Digital Hate, puts it starkly: “The algorithm recognises vulnerability and, instead of seeing it as something it should be careful around, it sees it as a potential point of addiction.”

Think about what you’ve taught it. Every time you lingered on that post about someone achieving what you haven’t. Every time you hate-watched someone living the life you think you should have. Every time you paused on content that made you feel less than, not enough, behind. You were training a very sophisticated system to understand your specific vulnerabilities.

We mistake recognition for healing

There’s something seductive about seeing our pain reflected back to us. In my practice, I watched clients light up when they found the perfect article about their attachment style, the ideal TikTok that explained their childhood trauma. Recognition feels like progress. It feels like understanding. But consuming content about your wounds isn’t the same as tending to them.

The algorithm has learned this too. It knows that you’ll watch twenty videos about anxious attachment, that you’ll save every post about healing from emotional neglect, that you’ll share content about setting boundaries while never actually setting them. It feeds you the language of healing without the actual work, recognition without integration.

I think about my mother, who spent thirty years managing undiagnosed anxiety while everyone called her “just a worrier.” If she’d had today’s internet, her feed would have been an endless stream of anxiety content. Would that have helped her? Or would it have simply given her more sophisticated ways to describe the cage she was in while keeping her firmly inside it?

The paradox of algorithmic comfort

Here’s what’s particularly insidious: the content that hurts us often feels comforting. It’s familiar. It confirms what we already suspect about ourselves and the world. If you believe you’re unlovable, the algorithm will find endless evidence. If you’re convinced everyone else has it figured out, your feed will confirm that too.

During my years in practice, I saw how people could become attached to their pain narratives. There’s a strange safety in knowing exactly how the story goes, even if it’s a tragedy. The algorithm has weaponized this. It doesn’t challenge your narrative; it reinforces it. It doesn’t push you toward growth; it keeps you scrolling in the same emotional loops you’ve been stuck in for years.

I catch myself doing this in the evenings, the time I’ve supposedly protected for reading. The algorithm knows I’m drawn to content about professional burnout, about the complexities of choosing solitude, about women who’ve stepped away from traditional paths. It’s not harmful content, exactly. But it’s content that keeps me circling the same questions instead of moving through them.

Breaking the training cycle

The most disturbing part isn’t that the algorithm learned our vulnerabilities. It’s that we keep teaching it, even after we understand what’s happening. We know the machine is exploiting our wounds, and yet we continue to offer them up, scroll after scroll.

In therapy, we called this repetition compulsion — the unconscious need to recreate familiar patterns, even destructive ones. The algorithm has made this compulsion frictionless. You don’t have to seek out situations that activate your core wounds. They’re delivered to you, personalized, optimized for maximum engagement with your specific pain points.

But here’s what I’ve learned, both as a therapist and as someone who lives with these same devices: awareness without action is just another form of consumption. Knowing that the algorithm is exploiting your vulnerabilities doesn’t protect you if you keep opening the app. Understanding the mechanism doesn’t neutralize its power.

What lives beneath the scroll

Sometimes I wonder what would happen if we all stopped feeding the machine our wounds. What if we became boring to the algorithm? What if we refused to engage with content that activates our oldest patterns?

The answer isn’t to pretend we don’t have vulnerabilities or to achieve some state of digital enlightenment where nothing affects us. It’s to recognize that the algorithm is showing us something important: a map of what we haven’t resolved. Every targeted ad for the life we think we should be living, every suggested video that makes us feel behind, every recommended post that activates our comparison reflex — these are invitations to look at what we’re avoiding in ourselves.

The real work isn’t in the feed. It never was. The real work is in what happens when we put the phone down and sit with whatever the algorithm was helping us avoid.

Conclusion

The algorithm knows us the way a dealer knows an addict — intimately, exploitatively, and without any interest in our recovery. We taught it exactly what hurts us, thinking we were just passing time, not realizing we were creating a detailed map of our vulnerabilities.

The question isn’t whether we can outsmart the algorithm or train it to show us better things. The question is whether we’re willing to stop offering up our wounds as engagement metrics. Whether we can recognize that the comfort of seeing our pain reflected back to us isn’t the same as healing it. Whether we can finally admit that we know exactly what we’re doing when we open the app for the twentieth time today, looking for something we know it can’t give us.

The algorithm will keep showing you exactly what hurts. That’s what you taught it to do. The only choice is whether you’ll keep watching.

Rachel Summers

Rachel Summers is a behavioral psychology writer and cultural commentator based in New York. With a background in social psychology and over a decade of experience exploring why people think, act, and feel the way they do, Rachel’s work sits at the intersection of science and everyday life. She writes about emotional intelligence, generational patterns, relationship dynamics, and the quiet psychology behind modern living.