Connor Leahy on Why Humanity Risks Extinction from AGI
· Existential Risk


Episode Description

Connor Leahy joins the podcast to discuss the motivations of AGI corporations, how modern AI is "grown" rather than written, the need for a science of intelligence, the effects of AI on work, the radical implications of superintelligence, open-source AI, and what you might be able to do about all of this.

Here's the document we discuss in the episode:

https://www.thecompendium.ai  

Timestamps:

00:00 The Compendium
15:25 The motivations of AGI corps
31:17 AI is grown, not written
52:59 A science of intelligence
01:07:50 Jobs, work, and AGI
01:23:19 Superintelligence
01:37:42 Open-source AI
01:45:07 What can we do?
