Listen "Deep Fakes & AI Safety: What Every School Leader Must Know"
Episode Synopsis
In this episode of the Help 100 Schools Podcast, Karl Boehm interviews Evan Harris, president of Pathos Consulting Group, to discuss how deep fakes and AI abuse are creating new safety challenges for schools, and what leaders must do now to prepare. From fake crisis videos to AI-generated sextortion, these threats are no longer hypothetical.

Evan brings a combination of experience as a former teacher and administrator, national advisor on AI risks, and researcher with Stanford's Human-Centered AI Institute. Having co-authored the NAIS legal guide on deep fake sexual abuse, he shares real-world cases, prevention strategies, and actionable steps schools can take to protect students and strengthen digital safety.

What's Covered:

1. The Reality of Deep Fake Risks
- Deep fakes aren't just a tech buzzword; they're fueling new forms of bullying, sextortion, and reputational damage.
- Real-world cases: a Baltimore principal framed with cloned audio, and fake videos of gunshots or fires disrupting schools.
- Why schools can't wait until it happens to them.

2. Three Essential Buckets for School Safety
- Policy: Write tech-neutral rules that cover both harmful media and threats to create it.
- Crisis Readiness: Don't just keep a binder; practice scenarios as a team.
- Prevention: Train staff, build student awareness, and partner with parents early.

3. The Human Factor
- Why leaders, when confronted with this issue, instantly shift from administrator to parent.
- How victim notification can either lessen or multiply the damage.
- The importance of trauma-informed counseling, agency, and dignity for students.

4. Big Schools vs. Small Schools
- Larger schools may have more resources, but small schools feel disruption more acutely.
- The biggest vulnerability isn't your IT system; it's your people.

5. Five Steps You Can Take Today
- Update your handbook policy with broad language and clear examples.
- Use inoculation theory: show your community a safe fake example so they know what to look for.
- Engage parents first so they're prepared when kids come home with questions.
- Run a crisis comms tabletop with your leadership and MarCom teams.
- Plan victim notification protocols with compassion and care.

Evan's Top Takeaways for Schools:
- Prevention is possible, but you must start before a crisis.
- This isn't niche. Research shows up to 1 in 5 high schoolers know of a classmate targeted with deep fake abuse.
- Parent partnerships are non-negotiable. They're critical for prevention and communication.
- Skills matter more than binders. Crisis readiness comes from practice, not paperwork.
- Every child deserves safety. Protecting students' digital dignity is core to your mission.

Resources & Links
- Email Evan: [email protected]
- On LinkedIn: Evan Harris AI
- Explore: Pathos Consulting Group

Join the Conversation
Have questions about AI safety or deep fakes in schools? Tag us on social media and let us know what you're seeing in your community. And don't forget to subscribe to Help 100 Schools for more insights on leadership, safety, and the future of education.
More episodes of the podcast Help One Hundred Schools
Crisis Leadership Lessons with Leigh Toomey
11/11/2025
Daring Leadership & Authenticity with Nicole McDermott: Building Courageous School Communities
15/10/2025
Your School Has Been Breached: Now What?
09/07/2025
Elevation Feature Ep 7: Olney Friends School
31/03/2025
Gesher JDS: Elevation Feature Ep. 5
30/09/2024