Listen "Algorithms for a Just Future"
Episode Synopsis
One of the supposed promises of AI was that it would take the bias out of human decisions, and maybe even lead to more equity in society. But the reality is that the errors of the past are embedded in the data of today, keeping prejudice and discrimination in place. Pair that with surveillance capitalism, and what you get are algorithms that shape how consumers are treated, from how much they pay for things, to what kinds of ads they are shown, to whether a bank will even lend them money. But it doesn't have to be that way, because the same techniques that prey on people can also lift them up. Vinhcent Le from the Greenlining Institute joins Cindy and Danny to talk about how AI can be used to make things easier for people who need a break.

In this episode you'll learn about:
- Redlining—the pernicious system that denies historically marginalized people access to loans and financial services—and how modern civil rights laws have attempted to ban this practice.
- How the vast amount of our data collected through modern technology, especially browsing the Web, is often used to target consumers for products, and in effect recreates the illegal practice of redlining.
- The weaknesses of consent-based models for safeguarding consumer privacy, which often mean that people are unknowingly waiving away their privacy whenever they agree to a website's terms of service.
- How the United States currently has an insufficient patchwork of state laws that guard different types of data, and how a federal privacy law is needed to set a floor for basic privacy protections.
- How we might reimagine machine learning as a tool that actively helps us root out and combat bias in consumer-facing financial services and pricing, rather than exacerbating those problems.
- The importance of transparency in the algorithms that make decisions about our lives.
- How we might create technology to help consumers better understand the government services available to them.

If you have any feedback on this episode, please email [email protected]. Please visit the site page at https://eff.org/pod107 where you'll find resources, including links to important legal cases and research discussed in the podcast and a full transcript of the audio.

This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
- Drops of H2O (The Filtered Water Treatment) by J.Lang (c) copyright 2012. Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/djlang59/37792 Ft: Airtone
- Come Inside by Zep Hurme (c) copyright 2019. Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/zep_hurme/59681 Ft: snowflake
- Warm Vacuum Tube by Admiral Bob (c) copyright 2019. Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/admiralbob77/59533 Ft: starfrosch
- reCreation by airtone (c) copyright 2019. Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/airtone/59721
More episodes of the podcast How to Fix the Internet
Protecting Privacy in Your Brain
27/08/2025
Separating AI Hope from AI Hype
13/08/2025
Smashing the Tech Oligarchy
30/07/2025
Finding the Joy in Digital Security
16/07/2025
Cryptography Makes a Post-Quantum Leap
02/07/2025
Why Three is Tor's Magic Number
04/06/2025
Love the Internet Before You Hate On It
21/05/2025
Digital Autonomy for Bodily Autonomy
07/05/2025