March 30, 2023

#368 – Eliezer Yudkowsky: Dangers of AI and the End of Human Civilization

Eliezer Yudkowsky is a researcher, writer, and philosopher on the topic of superintelligent AI. Please support this podcast by checking out our sponsors:
Linode: https://linode.com/lex to get $100 free credit
House of Macadamias: https://houseofmacadamias.com/lex and use code LEX to get 20% off your first order
InsideTracker: https://insidetracker.com/lex to get 20% off

EPISODE LINKS:
Eliezer’s Twitter: https://twitter.com/ESYudkowsky
LessWrong Blog: https://lesswrong.com
Eliezer’s Blog page: https://www.lesswrong.com/users/eliezer_yudkowsky
Books and resources mentioned:
1. AGI Ruin (blog post): https://lesswrong.com/posts/uMQ3cqWDPHhjtiesc/agi-ruin-a-list-of-lethalities
2. Adaptation and Natural Selection: https://amzn.to/40F5gfa

PODCAST INFO:
Podcast website: https://lexfridman.com/podcast
Apple Podcasts: https://apple.co/2lwqZIr
Spotify: https://spoti.fi/2nEwCF8
RSS: https://lexfridman.com/feed/podcast/
YouTube Full Episodes: https://youtube.com/lexfridman
YouTube Clips: https://youtube.com/lexclips

SUPPORT & CONNECT:
– Check out the sponsors above; it’s the best way to support this podcast
– Support on Patreon: https://www.patreon.com/lexfridman
– Twitter: https://twitter.com/lexfridman
– Instagram: https://www.instagram.com/lexfridman
– LinkedIn: https://www.linkedin.com/in/lexfridman
– Facebook: https://www.facebook.com/lexfridman
– Medium: https://medium.com/@lexfridman

OUTLINE:
Here are the timestamps for the episode. On some podcast players, you can click a timestamp to jump to that time.
(00:00) – Introduction
(05:19) – GPT-4
(28:00) – Open sourcing GPT-4
(44:18) – Defining AGI
(52:14) – AGI alignment
(1:35:06) – How AGI may kill us
(2:27:27) – Superintelligence
(2:34:39) – Evolution
(2:41:09) – Consciousness
(2:51:41) – Aliens
(2:57:12) – AGI Timeline
(3:05:11) – Ego
(3:11:03) – Advice for young people
(3:16:21) – Mortality
(3:18:02) – Love