This is AGI (S1E4): Hallucinations

29/09/2025 9 min Season 1 Episode 4


Episode Synopsis

Hallucinating LLMs are a critical step towards artificial general intelligence (AGI). We should not try to fix them; instead, we should build more complex agents that channel the LLMs' runaway creativity into self-perpetuating cycles of knowledge discovery.

'This Is AGI' is a podcast about the path to artificial general intelligence. Listen every Monday morning on your favourite podcast platform.