80,000 Hours Podcast With Rob Wiblin
Risks from power-seeking AI systems (article narration by Zershaaneh Qureshi)
- Author: Various
- Narrator: Various
- Publisher: Podcast
- Duration: 1:29:32
Synopsis
Hundreds of prominent AI scientists and other notable figures signed a statement in 2023 saying that mitigating the risk of extinction from AI should be a global priority. At 80,000 Hours, we’ve considered risks from AI to be the world’s most pressing problem since 2016. But what led us to this conclusion? Could AI really cause human extinction? We’re not certain, but we think the risk is worth taking very seriously. In particular, as companies create increasingly powerful AI systems, there’s a concerning chance that:
- These AI systems may develop dangerous long-term goals we don’t want.
- To pursue these goals, they may seek power and undermine the safeguards meant to contain them.
- They may even aim to disempower humanity and potentially cause our extinction.

This article is written by Cody Fenwick and Zershaaneh Qureshi, and narrated by Zershaaneh Qureshi. It discusses why future AI systems could disempower humanity, what current AI research reveals about behaviours like power-seeking and deception, and how you ca