"Architectural Darwinism or Computational Profligacy"
Episode Synopsis
This podcast takes a look at the application of Neural Architecture Search, as a form of evolutionary algorithm or brute-force search, to improve Large Language Model architectures. The idea that random modification of these architectures could yield long-term improvements is compelling in its simplicity. However, the case for it is undermined by the astronomical cost of training and by the neural scaling laws, which identify scale, rather than architecture, as the primary driver of LLM performance.