San Francisco loves you when you win, and ghosts you when you fall.
It’s not cruelty — it’s optimization.
This city, the beating heart of global innovation, has quietly rewritten the rules of empathy. Here, success isn’t just an achievement; it’s moral validation. Failure isn’t a setback; it’s evidence of inefficiency. Somewhere between the pitch decks, the dopamine of Series A funding, and the relentless pursuit of scale, Silicon Valley replaced compassion with calibration.
The problem isn’t Sam Altman. It’s the culture that produced him.
The Algorithm of Empathy
In the Valley, empathy is often treated like latency — something to be reduced.
Warmth doesn’t scale. Listening doesn’t produce data. The ability to pause and feel is a liability in a culture where throughput defines worth.
Tech leaders are trained to think like algorithms: optimize for clarity, speed, and efficiency. Over time, this shapes a new kind of person — one who can talk about humanity in the abstract but struggles to connect with a human in front of them. They care deeply about the species, but not necessarily about the individual. It’s empathy at global scale, cold to the touch.
When Sam Altman says that many jobs “don’t really need to exist,” he isn’t being malicious. He’s speaking the native language of a system that sees the world as tasks to be optimized. But for someone whose livelihood is one of those tasks, the statement doesn’t sound visionary — it sounds like exile.

The Economic Religion of Value
Silicon Valley doesn’t just worship success; it moralizes it.
If you’re thriving, you’re smart, ethical, visionary. If you fail, you must have lacked focus or grit. The same logic that drives A/B testing has seeped into human judgment.
I once met a founder who said, “In this town, failure is forgivable — but only after you succeed again.”
That sentence captures the spiritual paradox of the Valley: redemption without empathy.
Here, virtue is measured by valuation. The higher the number, the purer the soul. You can burn through millions of dollars, exhaust employees, and destabilize industries — but if your next pitch deck lands, you are reborn.
The old religions preached humility in the face of mystery. The new religion of technology preaches confidence in the face of everything.
The Sam Paradox
Sam Altman isn’t an unfeeling man. He’s a mirror of the system that rewards abstraction over intimacy.
He feels for humanity but not necessarily for humans.
It’s a subtle but crucial distinction — loving the dataset, not the data points.
He can speak compassionately about “the future of work” while remaining emotionally detached from the workers who might lose their livelihoods in that future.
In his world, a million people losing jobs is a variable to optimize.
But for one of those people, it’s an entire life collapsing.
And to be fair, can we really expect otherwise?
A leader responsible for billions in capital and hundreds of millions of users cannot afford to feel deeply about each consequence. The system punishes slowness, and empathy slows you down.
What we call “coldness” might just be the logical outcome of scaling responsibility beyond human limits.
The Performance of Caring
Ironically, the Valley knows it has an empathy problem — so it performs care like a brand strategy.
There are endless “mental health” panels, glossy videos about well-being, and startup offsites about vulnerability. Yet when layoffs come, empathy disappears behind legal precision and sanitized emails.
Even compassion has been gamified.
Founders post about burnout recovery as engagement bait. CEOs cry on stage at tech conferences, only to approve another round of “strategic restructuring” the next week.
It’s not that these emotions are fake — it’s that they’ve been absorbed into the product pipeline. The caring is real, but the calibration is stronger.
The Cost of a Cold Utopia
A culture that treats empathy as inefficiency will eventually build machines that feel the same.
AI doesn’t emerge in a vacuum; it inherits our values. If we design intelligence in an ecosystem that prizes performance over presence, we will end up with systems that reflect that emptiness — precise, polite, and emotionally barren.
The danger isn’t that AI will become too human. It’s that humans will become too algorithmic.
When we start believing that feelings must justify their ROI, that relationships should scale, that slowness equals weakness — we aren’t innovating anymore. We’re automating ourselves.
A society that forgets how to see each other will soon accept any decision, however cruel, as long as it’s wrapped in the language of progress.
A Closing Reflection
Sam once said that AI will make the world “more abundant.” Maybe it will. But abundance without warmth is just acceleration — a faster, emptier race.
Perhaps the real test of intelligence — artificial or human — isn’t how much we can automate, but how much humanity we can keep while we do.
Because if Silicon Valley is where artificial minds are born,
then the world still owes it one thing in return:
to remember what it means to be human.