“This entire spectacle runs on a collective abdication of our own critical thinking. It’s scary.”
As most, if not all, bubbles have throughout modern economic history. The messianic faith in AI, railroads, gold, tulips, insurance, and real estate all shares one intertwining theme - the collective abdication of critical thinking. It must feel amazing to believe it's all fun and guaranteed riches once you ditch the ability to reason. I don't understand it, but millions clearly have.
This is one of the clearest articulations I’ve read of why the AI moment feels structurally unstable, not just overhyped.
The framing of Sin (monoculture) → Oil (unpaid cognitive labor) → Cracks → Crisis → Backstop is especially powerful. It explains why the "too big to fail" language slipped out so naturally: the system already behaves as if it expects socialised risk.
The most unsettling part isn't hallucinations or benchmarks; it's the abdication of governance. We're not scaling intelligence, we're scaling unexamined automation. History doesn't repeat, but it does rhyme, and this feels eerily pre-cybersecurity boom.
You've articulated this so clearly (yet again). Are you ready to join my interview series? I would like to interview you about this and the last several articles you wrote... I know you're busy, but this is so important and I would really love to better understand it. And the only way to understand it is to talk to people who do and then try to write about it (as you know)