Here’s the contradiction no one in tech wants to sit with: the same platforms that amplified #BodyPositivity into a global movement are simultaneously running algorithms that punish non-conforming bodies by burying them in the feed. TikTok banned the #SkinnyTok hashtag. Instagram rolled out tools to hide cosmetic surgery ads from minors. And yet — the underlying recommendation engines continue to do exactly what they were designed to do: surface content that maximizes engagement, which overwhelmingly means content featuring idealized, filtered bodies.
The platforms didn’t create beauty standards. But they built the most efficient distribution system for enforcing them that has ever existed.
A significant new analysis published by Books & Ideas lays out the problem with uncomfortable precision. The piece, titled “Algorithms and the Female Silhouette,” draws on Foucauldian theory to argue that social media platforms now function as what scholars call dispositifs — sociotechnical arrangements of power that promote idealized representations of the female body while simultaneously governing dietary, aesthetic, and fitness practices. The language is academic, but the reality it describes is visceral: women’s bodies are no longer monitored solely by family, peers, or colleagues. They are monitored — ranked, scored, amplified, or suppressed — by algorithms whose logic remains opaque even to the people they affect most.
The research highlights a cascade of now-familiar examples. Chubby and skinny filters on TikTok. Slimming tools embedded directly into smartphone cameras. The explosive promotion of Ozempic through “What I Eat in a Day” content that blurs the line between health advice and pharmaceutical marketing. Each example points to the same mechanism: bodies being fed into a digital feedback loop where conformity is rewarded with visibility and deviation is punished with silence.
What makes this moment particularly charged is the collision between two narratives that the platforms themselves have been eager to promote. On one side, there’s the empowerment story — social media as a democratizing force, giving voice to fat activists, body-positive influencers, and people who challenge beauty norms. On the other, there’s what the Books & Ideas analysis calls a “new form of biopower” — one that disciplines bodies not through explicit coercion but through the quiet mathematics of engagement optimization.
Both narratives are true. And that’s precisely the problem.
The algorithmic beauty economy isn’t a bug. It’s a business model.
To understand why filtered, idealized content dominates feeds, you have to follow the money. Recent case studies in digital marketing reveal how brands like Sephora are deploying AI-powered personalization to target consumers with hyper-specific beauty recommendations — often based on the same engagement data that platforms collect. The strategy is sophisticated: use algorithms to identify insecurities, then serve content that both triggers and addresses them. It’s a closed loop. The feed makes you feel inadequate, and then the ad offers the solution.
This isn’t speculation. It’s documented strategy. Brands are building entire campaigns around what marketers call “zero-party data” — information consumers volunteer through quizzes, filters, and interactive tools. As Crowdspring’s analysis of low-cost marketing tactics notes, AI-powered quizzes like “discover your skin type” aren’t just engagement tools. They’re data extraction mechanisms that feed personalized ad pipelines — pipelines that disproportionately target women with appearance-related messaging.
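To make the mechanism concrete, here is a deliberately simplified sketch of how a quiz like “discover your skin type” can double as a data-extraction step. Every name in it — the segment labels, the answer-to-segment mapping, the `AdProfile` structure — is invented for illustration; real targeting taxonomies are proprietary and far more granular. The point is structural: each volunteered answer maps directly onto ad-targeting segments, including appearance-insecurity ones.

```python
from dataclasses import dataclass, field

@dataclass
class AdProfile:
    """Hypothetical per-user targeting profile built from volunteered data."""
    user_id: str
    segments: set = field(default_factory=set)

# Illustrative mapping from quiz answers to targeting segments.
# Note how "concern" answers feed an insecurity-flavored segment
# alongside the product-category ones.
ANSWER_TO_SEGMENTS = {
    "oily": {"skincare_oily", "mattifying_products"},
    "dry": {"skincare_dry", "moisturizers"},
    "acne": {"acne_concern", "appearance_insecurity"},
    "anti_aging": {"aging_concern", "appearance_insecurity"},
}

def ingest_quiz(profile: AdProfile, answers: list[str]) -> AdProfile:
    """Fold quiz answers into the user's targeting profile."""
    for answer in answers:
        profile.segments |= ANSWER_TO_SEGMENTS.get(answer, set())
    return profile

profile = ingest_quiz(AdProfile("u123"), ["oily", "anti_aging"])
print(sorted(profile.segments))
# → ['aging_concern', 'appearance_insecurity', 'mattifying_products', 'skincare_oily']
```

Two quiz taps, four targeting segments — no tracking consent dialog required, because the user handed the data over voluntarily. That asymmetry is exactly what makes zero-party data so valuable to marketers.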
Meanwhile, the rise of interactive marketing at live events — AI photo booths, augmented reality filters, gamified beauty experiences — is extending the algorithmic beauty economy into physical spaces. The filter doesn’t stay on your phone anymore. It follows you to the conference, the product launch, the pop-up shop. The line between your face and its optimized digital version gets thinner every day.
And the beauty filter itself has become something more insidious than most coverage acknowledges. It’s not just a playful overlay. It’s a training mechanism. Every time a user sees their “improved” face — smoother skin, narrower nose, fuller lips — their brain recalibrates what “normal” looks like. Dermatologists have coined the term “Snapchat dysmorphia” to describe patients bringing filtered selfies to consultations as reference images for procedures. The filter doesn’t just change how you look on screen. It changes how you see yourself when the camera is off.
The body positivity movement was supposed to be the counterweight. And in many ways, it has been — creating space for larger bodies, scarred bodies, aging bodies, bodies that don’t conform. But the Books & Ideas analysis identifies what I’d call the visibility trap: even body-positive content must play by the algorithm’s rules to gain traction. Influencers who challenge beauty norms still need likes, shares, and watch time. The most successful body-positive posts are often those that are still visually polished, well-lit, and aesthetically conforming in every way except size. The algorithm rewards a specific kind of rebellion — the kind that still looks good on a feed.
This creates a strange paradox. The movement designed to liberate bodies from scrutiny only gains power by subjecting those same bodies to a different kind of scrutiny — the algorithmic kind. Visibility becomes the currency, and the algorithm is the bank.
What struck me about the Books & Ideas piece is its insistence that this is not a gender-neutral phenomenon. The analysis makes clear that the algorithmic disciplining of bodies is profoundly gendered. For social and historical reasons, women’s bodies have been the primary targets of control, shaping, and constraint. The digital era hasn’t disrupted this pattern. It has industrialized it. Fatphobia, the researchers argue, isn’t a personal bias — it’s a structural phenomenon that algorithms amplify at scale, making it feel like the natural order rather than a social construction.
And here’s what most of the coverage around beauty filters and algorithmic harm keeps missing: the solution isn’t just better algorithms.
There’s a comforting narrative that if platforms simply tweaked their recommendation engines — showed more diverse bodies, suppressed pro-eating-disorder content, labeled filtered images — the problem would be meaningfully addressed. And yes, those interventions help at the margins. But they fundamentally misunderstand the architecture of the problem.
The issue isn’t that algorithms are accidentally biased toward thin, filtered, idealized bodies. The issue is that engagement maximization, by design, rewards content that triggers comparison, aspiration, and insecurity — because that content keeps people scrolling. You can’t reform the algorithm without reforming the business model. And no major platform has shown any willingness to do that.
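The structural point can be shown in a toy model — emphatically not any platform’s real system, and all post names and numbers below are invented. The ranker’s objective is a single predicted-engagement score, and nothing in that objective distinguishes “kept watching because inspired” from “kept watching because anxious”:

```python
# Toy feed ranker: the only signal is predicted watch time.
# Whether a post triggers comparison is known to us (third field)
# but invisible to the objective function.
posts = [
    # (post_id, predicted_watch_seconds, triggers_comparison)
    ("unfiltered_daily_life", 8.0, False),
    ("idealized_transformation", 31.0, True),
    ("body_positive_polished", 22.0, True),
    ("health_info_plain", 11.0, False),
]

def rank_feed(posts):
    """Rank purely by predicted engagement; the 'why' behind the
    engagement never enters the sort key."""
    return sorted(posts, key=lambda p: p[1], reverse=True)

for post_id, watch, _ in rank_feed(posts):
    print(post_id, watch)
```

Under these made-up numbers, the two comparison-triggering posts land at the top of the feed without anyone having written a line of code that prefers them. That is what “biased by design rather than by accident” means: you could audit this ranker forever and find no explicit preference for idealized bodies, because the preference lives in the objective, not the code.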
This is the tension at the heart of our relationship with social media: we keep expecting tools built to capture attention to somehow also nurture well-being. It’s like asking a casino to promote financial literacy. The incentives point in opposite directions.
The deeper shift happening — the one this story is really about — is the colonization of bodily experience by digital systems. That’s not hyperbole. When your smartphone camera has a built-in beautification filter that activates by default, the device is making a judgment about your face before you do. When an algorithm decides which bodies get seen and which get suppressed, it’s exercising a form of power that no individual — no brand, no influencer, no policy team — fully controls.
We’re living through a moment where the most intimate aspects of self-perception — how you see your own body when you look in the mirror — are being shaped by systems designed to sell ads. The platforms know this. The brands know this. The research increasingly documents it.
The question is whether knowing it changes anything — or whether awareness itself becomes another piece of content to scroll past.