Every app that tracks your health is also building a case about you

  • Tension: We surrender our most intimate data believing it serves our wellness while it actually serves someone else’s profit.
  • Noise: The promise of optimization distracts from the reality of surveillance capitalism dressed as self-care.
  • Direct Message: Your health app isn’t tracking your progress; it’s building a profile worth more than you imagine.


The moment you download a health app, you become data. Not a person seeking wellness, not someone trying to understand their patterns better, but a collection of metrics waiting to be harvested, packaged, and sold.

I’ve watched this transformation happen in real time with clients over my years in practice. They’d come in with their phones, showing me sleep graphs, mood charts, step counts — believing these numbers held some key to understanding themselves. What they didn’t realize was that while they were building a picture of their health, these apps were building something else entirely: a comprehensive dossier of their most vulnerable moments.

The illusion of private tracking

We tell these apps things we wouldn’t tell our closest friends. When we’re menstruating. How much we weigh after a binge. The nights we can’t sleep. Our heart rate during a panic attack. We type in symptoms we’re too embarrassed to mention to our doctors, log sexual encounters, track fertility windows. We do this believing we’re having a private conversation with ourselves, mediated by technology.

But privacy in health apps is largely theatrical. A study analyzing over 20,000 mobile health apps found that 88% collect user data, with 55% transmitting it to third-party servers. This isn’t a bug in the system — it’s the business model.

The psychological appeal is obvious. These apps promise what I could never quite deliver in therapy: perfect recall, objective measurement, undeniable progress. They offer the fantasy of the quantified self, where every aspect of our being can be tracked, optimized, improved. It’s compelling because it sidesteps the messy reality of being human — that our patterns are complex, our motivations contradictory, our healing non-linear.

What happens to your midnight anxieties

Every tap, every entry, every moment of vulnerability you share with your health app becomes part of a larger narrative about you — one you’re not writing and can’t edit. When you log that you’re feeling depressed on a Thursday afternoon, when you record your weight after the holidays, when you track your drinking habits, you’re not just creating a personal record. You’re feeding a machine that’s learning to know you better than you know yourself.

This data doesn’t stay put. It travels through networks of advertisers, insurance companies, data brokers — entities that have a vested interest in knowing your vulnerabilities. Your period tracking app might share data with Facebook. Your mental health app might sell insights to marketing firms. Your fitness tracker might inform your insurance premiums.

The violation isn’t always obvious or immediate. It accumulates quietly, like interest on a loan you didn’t know you’d taken. One day you realize that the ads following you around the internet know about your insomnia, your irregular periods, your anxiety — because you told an app you thought you could trust.

The commodification of vulnerability

Muhammad Ikram, a lecturer at the Macquarie University Cyber Security Hub, puts it plainly: “Some of this information collected is used for tracking purposes and profiling purposes, which is done by third parties like advertisers and tracking companies and which is basically a form of data mining and this is done without user consent and it is being done explicitly and implicitly.”

What strikes me most about this isn’t the betrayal — though that’s real — but the fundamental misunderstanding of what healing requires. Real wellness work, the kind that actually changes things, requires safety. It requires a container that can hold your most difficult truths without judgment or exploitation. These apps promise that container, but they’re actually sieves, leaking your private struggles into marketplaces you never agreed to enter.

I think about my years in practice, about the careful boundaries we maintained, the confidentiality that made honest exploration possible. Those boundaries weren’t just ethical requirements — they were the foundation that made the work possible. Without them, people couldn’t risk being real about what hurt, what scared them, what they needed.

Health apps have inverted this relationship. They’ve turned our need for understanding into their commodity. They’ve transformed our attempts at self-care into their profit streams. They’ve made our vulnerability their asset.

The cost of convenient insight

The tragedy isn’t just that we’re being surveilled — it’s that we’re participating so willingly, even eagerly. We’ve been sold the idea that tracking equals progress, that measurement equals understanding, that data equals truth. But anyone who’s done real psychological work knows that the most important things about us can’t be quantified. The texture of grief, the weight of inherited trauma, the particular way anxiety sits in your chest — these aren’t data points. They’re experiences that require witness, not metrics.

Yet we keep downloading, keep tracking, keep feeding these systems because they offer something therapy often can’t: immediate feedback, constant availability, the illusion of control. They’re always there at 3 AM when you can’t sleep, ready to receive whatever you need to confess or count or catalog.

The real cost isn’t just privacy — it’s the subtle reshaping of how we understand ourselves. When we filter our experience through apps designed to extract value from our vulnerability, we start to see ourselves as they see us: as problems to be optimized, conditions to be managed, data to be mined. We lose touch with the deeper truths that can’t be graphed or tracked or sold.

Finding a different way forward

I’m not suggesting we abandon all technology or return to paper journals — though there’s something to be said for records that can’t be hacked, sold, or subpoenaed. But we need to understand what we’re trading when we hand over our most intimate data to companies whose primary obligation is to their shareholders, not our wellbeing.

Real self-knowledge doesn’t come from perfect tracking. It comes from patient observation, from sitting with discomfort long enough to understand it, from recognizing patterns that repeat across years, not days. It comes from having our experiences witnessed by people who can hold them without agenda, who aren’t secretly calculating their market value.

If you must use these apps — and I understand the appeal, truly — treat them like you would any other extractive relationship. Give them the minimum required. Use fake names, secondary email addresses, approximate data. Reserve your real vulnerabilities for spaces that have earned them.

Because your midnight anxieties, your body’s rhythms, your private struggles — these aren’t content. They’re not data points. They’re the tender parts of being human that deserve protection, not monetization. And no app, no matter how sophisticated its algorithm or compelling its interface, should be trusted with what you wouldn’t sell yourself.

The promise of technology was that it would free us, connect us, help us understand ourselves better. Instead, we’ve built a system where our attempts at healing become someone else’s profit, where our vulnerability becomes their commodity. Until we recognize this fundamental violation for what it is, we’ll keep building cases against ourselves, one tracked symptom at a time.


Rachel Summers

Rachel Summers is a behavioral psychology writer and cultural commentator based in New York. With a background in social psychology and over a decade of experience exploring why people think, act, and feel the way they do, Rachel's work sits at the intersection of science and everyday life. She writes about emotional intelligence, generational patterns, relationship dynamics, and the quiet psychology behind modern living.
