Every day, we make thousands of choices. What to watch, what to buy, what news to read, even who to date. We believe these decisions are ours alone, expressions of our unique tastes and preferences. But increasingly, the choices we make are being quietly curated by algorithms designed not to reflect our desires, but to predict and shape them.
The Invisible Hand of Recommendation
When you open Netflix, scroll through TikTok, or browse Amazon, you’re not seeing everything available. You’re seeing what an algorithm has determined you’re most likely to engage with. These recommendation systems analyse millions of data points: your viewing history, pause patterns, search queries, time spent hovering over thumbnails, and even the shows you started but never finished.
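To make that concrete, here is a minimal Python sketch of how a recommender might fold such signals into a single score. The signal names and weights are illustrative assumptions, not any platform’s actual model.

```python
from dataclasses import dataclass

@dataclass
class ViewingSignals:
    watch_fraction: float   # share of the title actually watched
    hover_seconds: float    # time spent hovering over the thumbnail
    searched_for_it: bool   # explicit search for this title
    abandoned: bool         # started but never finished

def engagement_score(s: ViewingSignals) -> float:
    """Fold behavioural signals into one predicted-engagement score."""
    score = 0.6 * s.watch_fraction
    score += 0.2 * min(s.hover_seconds / 10.0, 1.0)  # cap the hover signal
    if s.searched_for_it:
        score += 0.3   # explicit intent weighs heavily
    if s.abandoned:
        score -= 0.2   # an unfinished show counts against similar titles
    return score

# A feed is then just the catalogue sorted by this score, highest first:
# feed = sorted(candidates, key=engagement_score, reverse=True)
```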
The result is a personalised feed that feels tailored to your interests. And it is. But it’s also training you, nudging you toward content that keeps you engaged longer, that makes you click more often, that turns browsing into buying.
The Feedback Loop of Preference
Here’s where it gets interesting: algorithms don’t just respond to your preferences; they create them. If you watch one true crime documentary, the algorithm serves you ten more. Soon, your feed is dominated by true crime content. You start to think of yourself as someone who loves true crime. But did you choose that identity, or did the algorithm choose it for you?
This feedback loop is self-reinforcing. The more content you consume in a particular category, the more the algorithm assumes that’s what you want. Your digital world narrows, even as you believe you’re exploring freely.
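A few lines of Python can simulate this dynamic. In the toy model below (all numbers are invented for illustration), the recommender shows genres in proportion to past clicks, and the user clicks whatever is shown; a small early tilt snowballs into a feed dominated by one genre.

```python
import random

genres = ["true crime", "comedy", "sci-fi", "cooking"]
weights = {g: 1.0 for g in genres}  # start with no learned preference

random.seed(42)
for _ in range(200):
    # Recommend a genre in proportion to its learned weight...
    shown = random.choices(genres, weights=[weights[g] for g in genres])[0]
    # ...and the user engages with whatever is in front of them,
    # which reinforces that weight for next time.
    weights[shown] += 1.0

total = sum(weights.values())
print({g: round(w / total, 2) for g, w in weights.items()})
# The reinforcement typically skews the split far from the even 0.25 start,
# even though every genre began with an identical weight.
```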
The Architecture of Engagement
Social media platforms are particularly adept at this manipulation. Facebook’s News Feed algorithm doesn’t show you every post from your friends and pages, and it doesn’t show them in chronological order. It shows you content engineered to maximise engagement, often content that triggers strong emotional responses like outrage, fear, or tribal belonging.
YouTube’s autoplay feature isn’t random. It’s calculated to keep you watching, often leading viewers down recommendation rabbit holes that can radicalise opinions or reinforce existing biases. Former YouTube engineer Guillaume Chaslot revealed that the platform’s algorithm prioritised watch time above all else, regardless of accuracy or social impact.
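As a sketch of what optimising for a single metric looks like, here is a toy autoplay selector. The class and field names are hypothetical; the point is that the objective never consults anything except predicted watch time.

```python
from typing import NamedTuple

class Video(NamedTuple):
    title: str
    predicted_watch_minutes: float  # the model's estimate for this viewer
    fact_checked: bool              # the objective below never looks at this

def pick_next(candidates: list[Video]) -> Video:
    # Maximise one metric only: expected minutes watched.
    return max(candidates, key=lambda v: v.predicted_watch_minutes)

queue = [
    Video("Measured explainer", 4.0, True),
    Video("Outrage compilation", 11.5, False),
]
print(pick_next(queue).title)  # "Outrage compilation" wins on watch time alone
```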
The Paradox of Personalisation
The promise of algorithmic curation is convenience and relevance. Who wants to wade through thousands of options when a smart system can surface exactly what you’ll enjoy? But this convenience comes at a cost.
Filter bubbles emerge when algorithms only show us content that confirms what we already believe. We lose exposure to diverse perspectives, challenging ideas, and the serendipitous discoveries that don’t fit our predicted profile. Our worldview calcifies, shaped not by genuine exploration but by mathematical optimisation.
E-Commerce and the Illusion of Discovery
Online shopping platforms employ similar tactics. Dynamic pricing algorithms adjust prices based on your browsing history and perceived willingness to pay. Product placements aren’t neutral; they’re influenced by advertising dollars and engagement metrics. That “you might also like” suggestion isn’t based solely on similarity, but on what maximises the platform’s revenue.
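A hypothetical dynamic-pricing rule might look like the sketch below. The thresholds and multipliers are invented for illustration; real systems are far more elaborate, but the logic of pricing the shopper rather than the item is the same.

```python
def personalised_price(base_price: float, visits_to_product: int,
                       browsed_premium_brands: bool) -> float:
    """Nudge the price up for shoppers whose history suggests they'll pay more.
    All thresholds and multipliers here are illustrative assumptions."""
    multiplier = 1.0
    if visits_to_product >= 3:      # repeat visits signal strong intent
        multiplier += 0.05
    if browsed_premium_brands:      # premium browsing signals deeper pockets
        multiplier += 0.08
    return round(base_price * multiplier, 2)

print(personalised_price(100.0, visits_to_product=4, browsed_premium_brands=True))
# 113.0 — the same item, priced to the shopper rather than the market
```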
Even the scarcity you see (“only 2 left in stock” or “5 people viewing this now”) is often algorithmic manipulation designed to create urgency and override rational decision-making.
Reclaiming Agency
Understanding how algorithms work is the first step toward reclaiming genuine choice. We can actively seek out diverse sources, question why we’re seeing certain content, and recognise when our preferences are being manufactured rather than expressed.
Some practical strategies include regularly clearing recommendation histories, actively searching for content outside your usual patterns, using incognito modes to see unfiltered results, and consciously consuming media from sources that challenge rather than confirm your existing views.
Reclaiming Choice in a Curated World
Algorithms aren’t inherently evil. They’re tools that can surface valuable content and connect us with communities we’d never find otherwise. But we must remain aware that every swipe, click, and view feeds a system designed to keep us engaged, not necessarily to make us informed, happy, or free.
The choices we think we’re making are often choices we’ve been guided toward. True autonomy in the digital age requires recognising this reality and actively working against the grain of algorithmic curation. Only then can we distinguish between what we genuinely choose and what we’ve been algorithmically persuaded to want.