The Short Version (like me)
Your feeds are ranked by algorithms designed to keep you scrolling—not to keep you informed.
Outrage and falsehood spread faster than facts, because emotional content is what machines reward.
Bots, bot farms, and covert influence networks push narratives daily; platforms remove some but never all.
Paid influence goes far beyond ads. It includes influencers, campaigns, and even governments shaping what you see.
Tools exist to audit your feeds—and it’s time to use them.
Part I — What Your Feed Actually Is
Facebook, Instagram, YouTube and TikTok use machine-learning systems called recommender algorithms. These don’t simply show you posts in order. They weigh signals like your watch time, clicks, likes, comments, and follows, then predict what you’re most likely to engage with next.
TikTok perfected this with its For You Page—you don’t even need to follow anyone before the app knows your tastes.
The catch: when the incentive is “keep watching,” outrage wins. Peer-reviewed studies show social feedback mechanisms train users to post more outrage over time, and a 2018 study in Science found that false news was roughly 70% more likely to be retweeted than true stories on Twitter/X.
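To make the incentive concrete, here is a toy sketch of engagement-based ranking. The weights and signal names are invented for illustration — no platform publishes its real formula — but the structure is the point: the feed is sorted by predicted engagement, and accuracy never enters the calculation.

```python
def rank_feed(posts):
    """Sort posts by a weighted sum of predicted engagement signals.
    Weights are hypothetical; real systems learn them from user behavior."""
    weights = {"watch_time": 0.5, "likes": 0.2, "comments": 0.2, "shares": 0.1}

    def score(post):
        return sum(weights[signal] * post[signal] for signal in weights)

    return sorted(posts, key=score, reverse=True)

feed = rank_feed([
    {"id": "calm_explainer", "watch_time": 0.4, "likes": 0.3, "comments": 0.1, "shares": 0.1},
    {"id": "outrage_bait",   "watch_time": 0.9, "likes": 0.6, "comments": 0.8, "shares": 0.7},
])
print([post["id"] for post in feed])  # outrage_bait ranks first
```

Notice that nothing in the scoring function asks whether a post is true — only whether people will react to it.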
Part II — Bots, Bot Farms, and Bought Attention
Not all manipulation looks the same. There are three main buckets:
1. Bots & Sockpuppets
Bots are automated accounts that can like, share, or post without a human behind the keyboard.
Sockpuppets are fake personas run by real people pretending to be someone else.
Example: A “middle-aged Ohio dad” account that always posts at 3 a.m. Moscow time and only shares political talking points? Probably not your neighbor Bob — it’s a sockpuppet.
2. Industrialized Manipulation
This isn’t just a guy in his basement. It’s organized, professional operations often backed by governments or private firms. The Oxford Internet Institute calls them “cyber troops.”
They run teams of people who manage thousands of accounts, pump out memes, and flood comment sections.
Their goal is to sway public opinion or at least make you feel like a certain view is more popular than it is.
Example: Ahead of elections in multiple countries, researchers have found networks of “cyber troops” seeding hashtags, hijacking trending topics, and drowning out critics with spam.
3. Bought Engagement
This is the black-market side: fake likes, fake follows, fake comments — all for sale.
Follower factories sell you 10,000 “fans” overnight.
Phone farms are literally rooms full of cheap smartphones, each logged into different accounts, auto-liking and auto-following to fake popularity.
Example: Ever notice an influencer suddenly gaining 50,000 followers in a week — but their comments are all “Nice pic 🔥🔥🔥” from accounts with no profile photos? That’s bought engagement.
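That “50,000 followers in a week” pattern is detectable. Below is a hypothetical heuristic (the function name, threshold, and numbers are all illustrative, not a real platform tool): flag any week whose follower growth is many times the account’s typical weekly growth.

```python
def flag_follower_spikes(weekly_counts, factor=10):
    """Return indices of weeks whose follower growth exceeds
    `factor` times the account's median weekly growth."""
    growth = [after - before for before, after in zip(weekly_counts, weekly_counts[1:])]
    baseline = sorted(growth)[len(growth) // 2]  # median weekly growth
    return [i + 1 for i, g in enumerate(growth)
            if baseline > 0 and g > factor * baseline]

# An account that usually gains a few hundred followers a week,
# then suddenly gains 50,000:
counts = [10_000, 10_200, 10_450, 10_600, 60_600, 60_900]
print(flag_follower_spikes(counts))  # [4] -- the suspicious week
```

Real detection systems combine many such signals (timing, account age, engagement ratios), but a lone spike with no viral post behind it is a classic tell of bought followers.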
Why This Works
Because platforms reward engagement — not accuracy. If an account racks up likes, comments, or shares (even if they’re fake), the algorithm assumes it must be worth showing to more people.
That means lies, spam, and manipulation can climb just as easily as truth, and sometimes more easily.
Part III — Governments and Campaigns Going “Creator-First”
Covert ops. Meta’s reports show Russia, China, and Iran regularly running covert campaigns with fake personas.
Influencer loopholes. In the U.S., political influencers often avoid ad disclosures. The FTC tightened rules in 2023, but enforcement is spotty.
Global transparency push. The EU’s Digital Services Act now requires ad libraries and risk assessments, forcing platforms to open their black boxes.
Case study: Netanyahu. Just last week, Israeli PM Benjamin Netanyahu told U.S. influencers that TikTok is “the most important weapon” in securing support, urging them to shape narratives. Reports show influencers aligned with Israel are being paid between $6,100 and $7,000 per post.
Part IV — The Receipts on Algorithm Incentives
Engagement drives exposure. Predicted watch time and clicks outweigh chronology.
Outrage pays. Content with moral-emotional language spreads further and faster.
Humans amplify lies. Studies show people—not bots—are the main drivers of false content going viral.
Ops are constant. Meta reports uncover new disinformation networks every quarter.
Transparency is patchy. Meta’s Ad Library, Google’s political ads database, and TikTok’s Commercial Content Library exist—but enforcement is uneven.
Part V — Paid Influence in Plain Sight
The ad isn’t always a banner. Today it’s often a trusted influencer holding up a talking point.
U.S. law requires “clear and conspicuous” disclosures (#ad upfront).
EU rules demand searchable ad libraries.
Yet many posts slip through, particularly when they’re issue-based or international.
If you can’t easily tell whether a post was sponsored, that’s the problem.
Part VI — What You Can Do
Audit your feeds. Use Meta Ad Library, Google Ads Transparency, and TikTok Commercial Content Library.
Don’t feed the algorithm. Scroll quickly past content you hate; don’t let dwell time reward it.
Spot coordination. Repeated talking points, brand-new accounts, and copy-paste replies often signal coordinated manipulation.
Expect disclosures. If a creator is pushing politics or products, demand visible labels.
Balance your sources. Subscribe directly, switch to chronological feeds, diversify inputs.
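One coordination signal from the list above — copy-paste replies — can be sketched in a few lines. This is a toy illustration (the function and sample data are invented): group comments by normalized text and surface any message posted verbatim by several different accounts.

```python
from collections import defaultdict

def find_copy_paste(comments, min_accounts=3):
    """Return messages posted verbatim by at least `min_accounts`
    distinct accounts, after normalizing case and whitespace."""
    by_text = defaultdict(set)
    for account, text in comments:
        normalized = " ".join(text.lower().split())
        by_text[normalized].add(account)
    return {text: sorted(accounts) for text, accounts in by_text.items()
            if len(accounts) >= min_accounts}

comments = [
    ("user_a", "Great point!"),
    ("acct_1", "This policy will destroy our jobs"),
    ("acct_2", "this policy will destroy our   jobs"),
    ("acct_3", "This policy will destroy our jobs"),
]
print(find_copy_paste(comments))
```

Researchers who study influence operations use far richer signals (posting times, account creation dates, follower networks), but identical talking points from unrelated accounts is where many investigations start.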
Part VII — What Platforms Owe the Public
Algorithm transparency. Not PR gloss, but auditable details on how systems weight engagement vs. quality.
Complete ad archives. Searchable, stable, global.
Clear influencer rules. Paid political influence should meet the same disclosure bar as ads.
Closing Argument
Your feed isn’t a mirror. It’s a marketplace of influence—algorithms maximizing attention, campaigns renting credibility, and covert ops seeding narratives.
The fix is not unplugging. The fix is power: the power to know who paid, the power to tune your own signals, and the power for independent watchdogs to audit what platforms won’t admit.
Until then, treat your feed like a courtroom. Demand evidence. Follow the money. And before you hit share, ask: who benefits if I amplify this?
"Audit your feeds. Use Meta Ad Library, Google Ads Transparency, and TikTok Commercial Content Library."
Thanks for the sources.I do my best to check sources before I repost something.
I don't know if I really understand a word of this essay, unfortunately. I know the words, just not what they actually mean all put together.
It takes me all morning to read all the people I follow on substack, read The Guardian, follow up on links to see what I think on Threads or FB.
I don't know if I have the time/mental bandwidth to dig deeper. Checking out the links will help, I hope. But understanding each company's "explanations" is hard, and then actually trying to fix my feeds is mind-boggling.
Maybe there's a job out there for someone to help people my age (74) fix their feeds.