Deep Dive - Frontier AI with Dr. Jerry A. Smith
Why AI Hallucinates: The Math OpenAI Got Right and the Politics They Ignored
08 Sep 2025
Medium: https://medium.com/@jsmith0475/why-ai-hallucinates-the-math-openai-got-right-and-the-politics-they-ignored-1802138739f5

In this article, Dr. Jerry A. Smith argues that AI hallucinations are not merely technical glitches but socio-technical constructs. He draws on two perspectives. First, Kalai et al. (2025) give a statistical account of why hallucinations are mathematically inevitable under current training and evaluation methods, and advocate rewarding models for abstaining when uncertain. Second, Smith (2025) introduces a Kantian framework, positing that what counts as a "hallucination" is inherently subjective, shaped by human evaluative choices, including benchmarks that embed specific cultural and political values. The article ultimately calls for moving beyond the "neutrality myth" in AI evaluation, advocating multi-perspective assessments and the democratization of benchmark governance so that AI systems become more accountable and reflective of diverse human realities.
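The incentive argument behind the abstention point can be illustrated with a toy expected-score calculation. This is a minimal sketch under assumed, illustrative numbers, not the scoring rule from Kalai et al. (2025): under classic binary grading a wrong answer costs nothing relative to silence, so guessing always weakly dominates; adding a modest abstention reward and a wrong-answer penalty makes abstaining the rational choice for an uncertain model.

```python
# Illustrative sketch (not from the article or Kalai et al. 2025):
# why binary "right/wrong" grading incentivizes guessing over
# abstaining, and how abstention-aware scoring flips that incentive.

def expected_score(p_correct: float, abstain_reward: float, wrong_penalty: float) -> dict:
    """Expected score for guessing vs. abstaining.

    p_correct: model's probability of answering correctly (assumed known here).
    abstain_reward: score granted for saying "I don't know".
    wrong_penalty: score for a wrong answer (0.0 = classic binary grading).
    """
    guess = p_correct * 1.0 + (1.0 - p_correct) * wrong_penalty
    return {"guess": guess, "abstain": abstain_reward}

# Classic binary grading: abstaining scores 0, so guessing always weakly wins,
# even at 30% confidence -- hallucinating is the reward-maximizing strategy.
binary = expected_score(p_correct=0.3, abstain_reward=0.0, wrong_penalty=0.0)
print(binary)  # guessing (0.30) beats abstaining (0.00)

# Abstention-aware grading: partial credit for abstaining, penalty for errors.
# Now an uncertain model does better by declining to answer.
aware = expected_score(p_correct=0.3, abstain_reward=0.5, wrong_penalty=-0.5)
print(aware)   # abstaining (0.50) beats guessing (0.30 - 0.35 = -0.05)
```

The specific reward and penalty values are arbitrary; the point is structural: any benchmark that scores abstention at zero and errors at zero makes confident guessing the optimal policy for an uncertain model.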