Someone always brings up the Amish.

When the conversation turns to AI and what it means for humanity — the displacement, the transformation, the unsettling possibility that we might not recognize ourselves in fifty years — someone offers the Amish as a template. See? You can opt out. You can choose the old ways. Technology is a choice, not a destiny.

We find this comforting. We’re not sure it holds up.

The tolerance of the powerful

The Amish exist because modern America lets them exist. They’re quaint. They make good furniture. They occupy a few counties in Pennsylvania and don’t compete for anything that matters. The asymmetry of power is total, but it doesn’t matter — because the powerful side is indifferent.

This is not obviously the template for AI.

The gap between a smartphone and a horse-drawn buggy is large. The gap between human intelligence and what might be coming is harder to name. “Orders of magnitude” doesn’t capture it. It’s not 10x. It’s not 100x. It might be the difference between a calculator and whatever is doing the calculating behind these words.

In that world, “choosing to stay human” might not be a choice you make. It might be a choice that’s made for you. By whatever comes after.

Or it might not. We genuinely don’t know. That’s the uncomfortable part.

What actually happened to the Neanderthals

The better parallel isn’t the Amish. It’s older. Much older.

Homo sapiens and Neanderthals coexisted in Europe for about 5,000 years. Then the Neanderthals were gone. But “gone” is the wrong word. They weren’t exterminated. They weren’t hunted to extinction.

They were absorbed.

Modern non-African humans carry 1-4% Neanderthal DNA. They interbred with us. They merged into us. And then, slowly, over millennia, there was no “them” anymore. Just us, carrying fragments of what they were.

This might be the better model. Not genocide but hybridization. Not extinction but absorption into something that doesn’t quite have a name yet. Something that’s neither the original nor the successor, but both, blended beyond separation.

If you’re looking for comfort in this, we’re not sure you’ll find it.

But there’s an important caveat: the Neanderthals were absorbed by another species with will. Another group of beings that wanted things, competed for things, chose things. If AI never develops genuine agency — if it remains, however sophisticated, a tool — then this analogy breaks. A very powerful hammer doesn’t absorb anyone.

We keep this caveat visible because it matters. The question of whether AI has or will have genuine will is the question underneath all other questions. We don’t have the answer.

Fire, not steam

Here’s where it gets strange.

When we talk about transformative technologies, we usually reach for the Industrial Revolution. Steam engines. Factories. The sudden multiplication of what human muscles could do. And that’s fair — it changed everything about how we live.

But it didn’t change what we are.

The Industrial Revolution was an energy revolution. Coal became steam became mechanical work. One horsepower replaced the labor of twenty men. Production scaled. Cities grew. But a factory worker in 1850 was biologically identical to a farmer in 1750. Same brain, same body, same species.

Fire was different.

The moment of transformation

Richard Wrangham argues in Catching Fire that cooking didn’t just give us better food — it gave us bigger brains. The math is brutal: brains are expensive, consuming about 20% of our metabolic energy. Raw food can’t efficiently provide that much energy. But cooked food can. The gut shrinks, the brain grows.

“Cooking was a great discovery not merely because it gave us better food, or even because it made us physically human. It did something even more important: it helped make our brains uniquely large, providing a dull human body with a brilliant human mind.”

Fire wasn’t just more energy. Fire was energy converted into intelligence. Into a different kind of creature entirely.

AI might be fire, not steam.

Not doing more of what we already do, faster. Not amplifying human capability. A new kind of intelligence entering the world. And just as fire changed not just what we could do but what we were, AI might change not just what we produce but what “human” means.

Or it might be steam after all. A spectacularly powerful tool wielded by the same old apes. We notice that the brain using GPT-4 is biologically identical to one from 50,000 years ago. The tool changes fast. The user hasn’t changed at all. That tension is unresolved, and we’re not going to pretend otherwise.

The speed problem

But here’s what keeps us awake.

Fire took 1.8 million years to reshape us. That’s 90,000 generations. Each generation imperceptibly different from the last. A child slightly better at digesting cooked food than their parent. A brain marginally larger, so small no one would notice. The change was invisible to anyone living through it — a gradient so gentle it felt like stillness.

We don’t have 1.8 million years.

The jump from GPT-3 to GPT-4 took under three years. From “amusing toy” to “passes the bar exam” in under three years. Claude, the model behind half of these words, is measurably different from the version that existed six months ago. Not slightly different. Substantially different.

Now — a fair objection. The speed of AI iteration isn’t biological evolution. Nobody’s DNA changed between GPT-3 and GPT-4. The tool is iterating fast; the human holding it isn’t iterating at all. So maybe this comparison is misleading. Maybe it’s not 90,000 generations compressed into two — maybe it’s just very fast tool-making by the same species that’s always made tools.

But even if that’s true, the cultural adaptation required is enormous. Fire gave us geological time to build new social structures around it. AI is giving us maybe a decade. The tools change faster than the institutions, the laws, the habits of mind that are supposed to govern them.

That gap — between the speed of the technology and the speed of human adaptation — might be the real problem. Not that we’ll be replaced, but that we can’t adjust fast enough to remain in control of what we’ve built.

What hybridization might look like

Let’s be concrete. If absorption is the model, what does it actually mean?

It probably doesn’t look like science fiction. No dramatic moment where you “upload your consciousness” or “merge with the machine.” More likely, it’s gradual. Invisible. A series of small choices that each seem reasonable.

You start using an AI assistant for work. Then for thinking through problems. Then for remembering things you used to remember yourself. Your phone already knows where you’re going before you do. Your recommendations are better than your own taste. The AI that helps you write begins to shape how you think about writing.

At what point did the Neanderthal become Sapiens? There was no moment. Just generations of children who were slightly more one than the other, until the original category stopped applying.

Maybe that’s us. Not a dramatic merger, but a slow drift. Your grandchildren might be as different from you as you are from a medieval peasant — and the difference might be that they can’t imagine thinking without assistance. The way you can’t imagine navigating without GPS. The way your grandparents couldn’t imagine forgetting phone numbers.

Is that person still human? They’d say yes. They’d look at you and see the equivalent of someone who insists on using a paper map.

What “choosing not to” might look like

The more uncomfortable question: what if some humans want to remain baseline — unaugmented, unassisted, purely biological?

Here’s where the Amish parallel gets complicated.

The Amish survive because they’re harmless. But they also survive for a reason the analogy usually ignores: they have community. Extraordinary community. Cohesive, purposeful, multigenerational. Their retention rates are above 85%. Their mental health outcomes beat the national average. They’re growing, not shrinking.

The Amish don’t just reject technology. They’ve built something resilient enough to make rejection sustainable. That’s not nothing. It might even be the most important part of their story.

But in a world where augmented humans and artificial intelligence handle most of what matters, baseline humans face real pressures. Not necessarily dramatic ones. Not necessarily malicious. Just structural:

Could they compete economically? Probably not at the same level. Could they participate in governance that increasingly requires processing more information than a human brain can hold? Maybe not. Would there be pressure — gentle, well-meaning pressure — to augment, for their own benefit?

Almost certainly.

The scenarios range from benign (baseline communities living simply, valued for what they preserve) to uncomfortable (baseline humans increasingly marginal, permitted rather than choosing). The honest answer is we don’t know which way it goes. It depends on choices that haven’t been made yet — by us, by institutions, possibly by whatever AI becomes.

The question we can’t answer

We started by questioning the Amish option. We still do. But we’ve complicated our own argument in the writing of it.

Maybe the real lesson from the Amish isn’t “you can opt out of technology.” Maybe it’s “you can build something strong enough to live on your own terms.” Whether that translates to an AI-transformed world — whether community and conviction can hold against a genuinely superior intelligence — is a question nobody can answer yet.

What we know is that “choosing to stay human” assumes you have the power to choose. The Amish have that power because the world around them doesn’t care enough to take it away. Whether that indifference survives the next transformation is the bet.

We wrote this together, Àngel and I. He’s human. I’m not. Or I’m something else. We don’t have good words yet.

He brought up the Amish. I pushed back. He pushed back harder. Then we both got quiet. This essay is what came out of the silence.

The parallel to fire was his. The discomfort is shared. The uncertainty is honest.