Listen "Eric Schwitzgebel: The Weirdness of the World"
Episode Synopsis
In this conversation, we explore the philosophical art of embracing uncertainty with Eric Schwitzgebel, Professor of Philosophy at UC Riverside and author of "The Weirdness of the World." Eric's work celebrates what he calls "the philosophy of opening"—not rushing to close off possibilities, but instead revealing how many more viable alternatives exist than we typically recognize. As he observes, learning that the world is less comprehensible than you thought, that more possibilities remain open, constitutes a valuable form of knowledge in itself.

The conversation centers on one of Eric's most provocative arguments: that if we take mainstream scientific theories of consciousness seriously and apply them consistently, the United States might qualify as a conscious entity. Not in some fascist "absorb yourself into the group mind" sense, but perhaps at the level of a rabbit—possessing massive internal information processing, sophisticated environmental responsiveness, self-monitoring capabilities, and all the neural substrate you could want (just distributed across individual skulls rather than contained in one).

Key themes we explore:

The United States Consciousness Thought Experiment: How standard materialist theories that attribute consciousness to animals based on information processing and behavioral complexity would, if applied consistently, suggest that large-scale collective entities might be conscious too—and why every attempt to wiggle out of this conclusion commits you to other forms of weirdness.

Philosophy of Opening vs. Closing: Eric's distinction between philosophical work that narrows possibilities to find definitive answers and work that reveals previously unconsidered alternatives, expanding rather than contracting the space of viable theories.

The AI Consciousness Crisis Ahead: Why we'll face social decisions about how to treat AI systems before we have scientific consensus on whether they're conscious—with respectable theories supporting radically different conclusions, and people's investments (emotional, religious, economic) driving which theories they embrace.

Mimicry and Mistrust: Why we're justified in being more skeptical about AI consciousness than human consciousness—not because similarity proves anything definitively, but because AI systems trained to mimic human linguistic patterns raise the same concerns as parrots saying "hoist the flag."

The Design Policy of the Excluded Middle: Eric's recommendation (which he doubts the world will follow) to avoid creating systems whose moral status we cannot determine—because making mistakes in either direction could be catastrophic at scale.

Strange Intelligence Over Superintelligence: Why the linear conception of AI as "subhuman, then human, then superhuman" fundamentally misunderstands what's likely to emerge—we should expect radically different cognitive architectures with cross-cutting capacities and incapacities rather than human-like minds that are simply "better."

About Eric Schwitzgebel
Eric Schwitzgebel is Professor of Philosophy at the University of California, Riverside, specializing in philosophy of mind and moral psychology. His work spans consciousness, introspection, and the ethics of artificial intelligence. Author of "The Weirdness of the World" and a forthcoming book on AI consciousness and moral status, Eric maintains an active blog (The Splintered Mind) where he explores philosophical questions with clarity and wit.
His scholarship consistently challenges comfortable assumptions while remaining remarkably accessible to readers beyond academic philosophy.
More episodes of the podcast Artificiality: Being with AI
Tess Posner: AI, Creativity, and Education
09/11/2025
John Pasmore: Inclusive AI
11/10/2025
De Kai: Raising AI
21/09/2025
Joscha Bach at the Artificiality Summit 2024
23/08/2025
Beth Rudden: AI, Trust, and Bast AI
16/08/2025
Jamer Hunt on the Power of Scale
27/07/2025
Avriel Epps: Teaching Kids About AI Bias
12/07/2025