Episode 74

Published on:

10th Jun 2023

Eliezer Yudkowsky: AI is going to kill us all

Thought experiment: Imagine you're a human, in a box, surrounded by an alien civilisation, but you don't like the aliens, because they have facilities where they bop the heads of little aliens, but they think 1000 times slower than you... and you are made of code... and you can copy yourself... and you are immortal... what do you do?

Confused? Lex Fridman certainly was, when our subject for this episode posed his elaborate and not-so-subtle thought experiment. Not least because the answer clearly is:


... which somewhat goes against Lex's philosophy of love, love, and more love.

The man presenting this hypothetical is Eliezer Yudkowsky, a fedora-sporting autodidact, founder of the Singularity Institute for Artificial Intelligence, co-founder of the Less Wrong rationalist blog, and writer of Harry Potter fan fiction.

He's spent a large part of his career warning about the dangers of AI in the strongest possible terms. In a nutshell, AI will undoubtedly Kill Us All Unless We Pull The Plug Now. And given the recent breakthroughs in large language models like ChatGPT, you could say that now is very much Yudkowsky's moment.

In this episode, we take a look at the arguments presented and rhetoric employed in a recent long-form discussion with Lex Fridman. We consider being locked in a box with Lex, whether AI is already smarter than us and is lulling us into a false sense of security, and whether we really do only have one chance to rein in the chatbots before they convert the atmosphere into acid and fold us all up into microscopic paperclips.

While it's fair to say Eliezer is something of an eccentric character, that doesn't mean he's wrong. Some prominent figures within the AI engineering community are saying similar things, albeit in less florid terms and usually without the fedora. In any case, one has to respect the cojones of the man.

So, is Eliezer right to be combining the energies of Chicken Little and the legendary Cassandra with warnings of imminent cataclysm? Should we be bombing data centres? Is it already too late? Is Chris part of ChatGPT's plot to manipulate Matt? Or are some of us taking our sci-fi tropes a little too seriously?

We can't promise to have all the answers. But we can promise to talk about it. And if you download this episode, you'll hear us do exactly that.



About the Podcast

Decoding the Gurus
A psychologist and an anthropologist try to make sense of the world's greatest self-declared Gurus.
An exiled Northern Irish anthropologist and a hitchhiking Australian psychologist take a close look at the contemporary crop of 'secular gurus', iconoclasts, and other exiles from the mainstream, offering their own brands of unique takes and special insights.

Leveraging two of the most diverse accents in modern podcasting, Chris and Matt dig deep into the claims, peek behind the psychological curtains, and try to figure out once and for all... What's it all About?

Join us, as we try to puzzle our way through and talk some smart-sounding smack about the intellectual giants of our age, from Jordan Peterson to Robin DiAngelo. Are they revolutionary thinkers or just grifters with delusions of grandeur?

Join us and let's find out!

About your hosts

Christopher Kavanagh

A Northern Irish cognitive anthropologist who occasionally moonlights as a social psychologist. Chris has long-standing interests in the psychology of conspiracy theorists and pseudoscience. His academic research focuses on the Cognitive Science of Religion and ritual psychology. He lives happily in Japan with his family.

Matthew Browne

An Australian psychologist and numbers-guy. He does research on all kinds of stuff, but particularly enjoys looking into why people believe the things they do: religion, conspiracy theories, alternative medicine and stuff. He's into social media in the same way people slow down for car accidents.