Here’s Why People Will Never Care About AI Risk

It’s an irrational fear, but people are afraid of stupider things

Alberto Romero


Eric Schmidt says:

I think a big problem with getting the public to care about AI risk is that it’s just a huge emotional ask — for someone to really consider that there’s a solid chance that the whole world’s about to end. People will instinctively resist it tooth-and-nail.

People instinctively resist the idea that the world could be about to end. I agree with that, for three reasons. First, we don’t want to die, and the brain is, if anything, an apt machine for manufacturing defense mechanisms. Denial is universal. Second, we literally cannot imagine a “no-world” reality in which our beloved Earth no longer exists. Third, we are very bad at imagining unprecedented events and unprecedented change.

AI risk is a huge cognitive ask we can’t afford

All of that applies just as well, if not better, to threats other than AI: a meteorite, an alien invasion, a deadly pandemic, the sun swallowing our collective home in a spectacle of fire, energy rays, and indifference. But there’s a different reason AI-driven existential risk is harder to imagine than any of those.

Besides being a “huge emotional ask,” as Schmidt puts it, AI risk is a huge cognitive ask.

Unless you’ve spent a disturbing amount of time thinking about this and have read the arguments for and against AI risk and AI safety, including those Yudkowsky Sequences, it’s very hard to imagine how we get from ChatGPT — a chatbot best analogized, as Ethan Mollick likes to say, as a dumb intern with superhuman speed and an eidetic memory — to a superintelligent rogue AI that wipes us out by quietly synthesizing a nano-pathogen in the water supply.

The mental gymnastics required to follow the chain of reasoning that leads there, and to believe it deeply enough to overcome our resistance to the idea of dying, are too much. People will never care about AI existential risk as much as, say, climate change or nuclear war, because thinking about it is intrinsically abstract. It feels too distant, too improbable, too cognitively demanding. It’s something that lives, and will only ever live, in philosophical discussions that have nothing…