The Red Pill Files


The Rise of the Artificial Expert

Why You Can No Longer Trust the People You Follow

Aurel Nance
May 14, 2025
Illustration – All rights reserved – The Red Pill Files (TRPF)

You open LinkedIn.
A smiling face pops up — someone you’ve never heard of.
Underneath, it reads: “Top Voice in Leadership. Keynote Speaker. Bestselling Author.”
Two hundred thousand followers.
TEDx speaker.
Three newsletters.
A new book.
You scan their timeline: dozens of posts, each neatly crafted, each gathering thousands of likes.

You don't know why, but something feels off.
Maybe it's the hollow phrasing.
Maybe it's the generic advice dressed up as revelation.
Maybe it's the way their whole presence feels... designed.

Here’s a thought that should scare you more than anything else today:
That person might not even exist.
And even if they do, their “expertise” — their brand, their authority, their followers — might have been crafted, enhanced, and optimized entirely by machines.

Welcome to the new era.
Not where AI replaces experts.
Where AI creates them.

And if you think it’s just about "assistants" or "productivity tools," you’re already behind.


The Great Leap: From Helping Experts To Manufacturing Them

At first, AI was sold as a writing assistant.
Need help with a blog post? A LinkedIn caption? Maybe a quick outline for your ebook?
No big deal. Just a tool.

But then came the second wave:

  • Full ghostwriting engines, churning out complete newsletters and “thought leadership” pieces.

  • Autopilot publishing tools, scheduling a year’s worth of content based on trending topics.

  • AI-generated testimonials, “reviewing” fake books by fake experts who had never set foot on a stage, let alone lived the experiences they preached.

Today, entire personal brands are being built without a human at the wheel.

Here’s how it works:

  • A template for a “successful” thought leader is fed into an AI.

  • Posts are generated daily, optimized for engagement.

  • Books are assembled overnight with AI summarization tools, then self-published via Amazon KDP or Draft2Digital.

  • Endorsements are fabricated using fake accounts or cheap automation.

  • Paid promotions amplify the facade until it becomes indistinguishable from earned influence.

And it works because it looks close enough to reality to fool the audience.
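To make the loop above concrete, here is a deliberately toy sketch of the persona-manufacturing cycle in Python. Everything in it is invented for illustration — the persona name, the post templates, the follower numbers — and no real platform API is involved; it only shows how mechanical the template → post → fabricated-endorsement loop really is.

```python
import random
from dataclasses import dataclass, field

# Hypothetical engagement-tested templates and topics (invented for illustration).
TEMPLATES = [
    "Most leaders get {topic} wrong. Here's what 10 years taught me:",
    "Unpopular opinion: {topic} is overrated. Thread below.",
    "I failed at {topic} three times. The fourth changed everything.",
]
TOPICS = ["delegation", "personal branding", "remote work", "hiring"]

@dataclass
class SyntheticPersona:
    name: str
    followers: int = 0
    posts: list = field(default_factory=list)

    def publish_daily_post(self, rng: random.Random) -> str:
        # Step 2 of the loop: generate a daily post from a template.
        post = rng.choice(TEMPLATES).format(topic=rng.choice(TOPICS))
        self.posts.append(post)
        return post

    def fabricate_endorsements(self, rng: random.Random, bot_count: int) -> None:
        # Step 4 of the loop: bot accounts inflate the follower count.
        self.followers += bot_count + rng.randint(0, bot_count // 10)

rng = random.Random(42)  # fixed seed so the toy run is reproducible
persona = SyntheticPersona("Alex Q. Thoughtleader")
for day in range(90):  # one quarter of fully automated "thought leadership"
    persona.publish_daily_post(rng)
    persona.fabricate_endorsements(rng, bot_count=120)

print(len(persona.posts), persona.followers)
```

Ninety days, ninety posts, ten-thousand-plus "followers" — and not one minute of lived expertise anywhere in the loop.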

In 2024 alone, Amazon removed over 13,000 AI-generated books posing as genuine nonfiction guides (source: The Guardian).
LinkedIn quietly adjusted its algorithms to combat an influx of AI-manufactured profiles that gained 10,000+ followers in under 90 days (source: LinkedIn Transparency Report).

The tools are getting better, not worse.
And the arms race for perceived authority has officially started.


Invisible Mechanics: How Machines Are Shaping Our Perception

It’s not just the volume of content.
It’s the surgical precision with which AI now manipulates the credibility signals we’ve learned to trust.

  • Brand voice modeling — training an AI to mimic the tone, cadence, and style of a “credible” figure.

  • Engagement optimization — deploying machine learning to predict what phrasing, topics, and posting times will maximize virality.

  • Social proof fabrication — manufacturing likes, comments, and shares via networks of bots or cheap click farms.
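"Engagement optimization" sounds sophisticated, but at its core it can be as banal as this sketch: look at which posting hours historically drew the most likes, and always post then. The data below is invented for illustration; real systems do the same thing with far more signals.

```python
from collections import defaultdict

# Invented history of (hour_posted, likes) pairs — purely illustrative.
history = [
    (9, 120), (9, 180), (12, 300), (12, 260),
    (18, 90), (21, 410), (21, 380),
]

# Average the likes earned at each posting hour.
totals, counts = defaultdict(int), defaultdict(int)
for hour, likes in history:
    totals[hour] += likes
    counts[hour] += 1

# Always post at the hour with the best historical average.
best_hour = max(totals, key=lambda h: totals[h] / counts[h])
print(best_hour)  # → 21
```

A dozen lines of code, no insight required — and the output looks, from the outside, exactly like an expert who "knows their audience."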

One click.
One subscription to a “personal branding accelerator.”
And suddenly, you’re “an expert.”

You didn’t have to live the life.
You didn’t have to fail, rebuild, endure — the things that normally forge real authority.
You just had to look like you had.

There’s a word for this: Simulacrum.
Jean Baudrillard nailed it decades ago in Simulacra and Simulation:
When the copy becomes more real, more desirable, than the original.

Today’s “experts” aren’t copies of something real.
They are copies of copies of copies — synthetic echoes crafted to sound right, without having earned the right to speak.


The Shattering Consequences

The real danger isn’t that AI will flood the market with bad content.
It’s that we will lose the ability to tell who’s real and who’s not.

Trust isn’t broken overnight.
It decays, drip by drip.

When you realize the leadership advice you read was optimized by a language model, not earned through years of painful trial and error...
When you discover the activist you admired doesn’t exist beyond a content calendar...
When the next "CEO" giving you advice has never built anything except a ghostwritten blog...

You stop believing.
In anyone.
In anything.

Authority collapses.
Not with a bang, but with an algorithmic shrug.

And here’s the most chilling part:
You won’t even realize it’s happening until it’s too late.


Before we go further, a quick reality check.

Three books you should absolutely revisit right now if you want to grasp the scale of what’s unfolding:

  • "Trust Me, I’m Lying" by Ryan Holiday — how media manipulation works, and why it’s easier than ever in the AI age.

  • "The Shallows" by Nicholas Carr — how the internet rewires our brains, making us more susceptible to superficial authority.

  • "The Revolt of the Public" by Martin Gurri — how loss of trust in elites reshapes society, and why fake authority could trigger real chaos.

Each of these works, written before the AI boom, now reads like prophecy.


Before You Hit The Paywall

If this feels unsettling, that’s good.
You’re supposed to be unsettled.

Because the second part of this article will show you:

  • Real examples of AI-built thought leaders already gaining traction (names, numbers, methods).

  • The hidden systems that fuel their rise (ghostwriting farms, viral distribution networks, fake PR boosts).

  • What it means for democracy, business, and human connection.

  • How to build resistance — not by fighting AI, but by cultivating unmistakably human signals of credibility.

Comment below if this first part already triggered some realizations — I read and answer every single one.
Your feedback also helps push this article into more feeds. 🔥

This post is for paid subscribers

© 2025 The Red Pill Files