Listen ""Let’s think about slowing down AI" by Katja Grace"
Episode Synopsis
---
client: ea_forum
project_id: curated
feed_id: ai, ai_safety, ai_safety__governance
narrator: pw
qa: km
narrator_time: 5h00m
qa_time: 1h50m
---

If you fear that someone will build a machine that will seize control of the world and annihilate humanity, then one kind of response is to try to build further machines that will seize control of the world even earlier without destroying it, forestalling the ruinous machine’s conquest. An alternative or complementary kind of response is to try to avert such machines being built at all, at least while the degree of their apocalyptic tendencies is ambiguous.

The latter approach seems to me like the kind of basic and obvious thing worthy of at least consideration, and also in its favor, fits nicely in the genre ‘stuff that it isn’t that hard to imagine happening in the real world’. Yet my impression is that for people worried about extinction risk from artificial intelligence, strategies under the heading ‘actively slow down AI progress’ have historically been dismissed and ignored (though ‘don’t actively speed up AI progress’ is popular).

Original article:
https://forum.effectivealtruism.org/posts/vwK3v3Mekf6Jjpeep/let-s-think-about-slowing-down-ai-1

Narrated for the Effective Altruism Forum by TYPE III AUDIO.

Share feedback on this narration.
More episodes of the podcast TYPE III AUDIO
Part 10: How to make your career plan
14/06/2023
Part 8: How to find the right career for you
14/06/2023
Part 6: Which jobs help people the most?
14/06/2023