
AI: post transformers

DSPy and TextGrad: Compiling Language Model Systems

10 Nov 2025

Description

These two academic papers introduce novel programming models for systematically optimizing complex AI systems built from Large Language Model (LLM) calls. The first presents **DSPy**, a framework that abstracts traditional, hard-coded LLM pipelines into parameterized, declarative modules that a compiler and **teleprompters** can optimize automatically, outperforming hand-crafted prompts on tasks such as math word problems. The second introduces **TextGrad**, a general optimization framework in which LLMs generate and propagate **natural language gradients** (textual feedback) through computation graphs; this "textual differentiation" approach is applied successfully across diverse domains, including prompt optimization, code refinement, and scientific applications such as molecule design and medical treatment planning. Both works highlight the shift from relying on expert prompt engineering to employing systematic, programmatic optimization of compound AI systems.

Sources:

- DSPy: Compiling Declarative Language Model Calls into Self-Improving Pipelines (October 5, 2023), https://arxiv.org/pdf/2310.03714
- TextGrad: Automatic "Differentiation" via Text (June 11, 2024), https://arxiv.org/pdf/2406.07496
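To make the DSPy programming model concrete, here is a minimal sketch of a declarative pipeline compiled with a teleprompter. The `ChainOfThought` module, `Example`, and `BootstrapFewShot` teleprompter follow DSPy's released library; the LM name, the metric, and the training examples are illustrative assumptions, not taken from the paper.

```python
import dspy
from dspy.teleprompt import BootstrapFewShot

# Configure the underlying LM (model name is an illustrative assumption;
# the exact configuration call varies across DSPy versions).
dspy.configure(lm=dspy.LM("openai/gpt-4o-mini"))

class MathQA(dspy.Module):
    """A declarative pipeline: the signature 'question -> answer' replaces
    a hand-written prompt; DSPy expands it into a chain-of-thought prompt."""
    def __init__(self):
        super().__init__()
        self.solve = dspy.ChainOfThought("question -> answer")

    def forward(self, question):
        return self.solve(question=question)

# A few labeled examples (hypothetical) for bootstrapping demonstrations.
trainset = [
    dspy.Example(question="If 3 pens cost $6, how much do 7 pens cost?",
                 answer="$14").with_inputs("question"),
    dspy.Example(question="What is 15% of 80?",
                 answer="12").with_inputs("question"),
]

# Simplified metric: exact match on the answer field.
def exact_match(example, pred, trace=None):
    return example.answer.strip() == pred.answer.strip()

# "Compiling" searches for few-shot demonstrations that maximize the metric,
# replacing manual prompt tuning with programmatic optimization.
teleprompter = BootstrapFewShot(metric=exact_match)
compiled = teleprompter.compile(MathQA(), trainset=trainset)

pred = compiled(question="A train travels 60 km in 1.5 hours. What is its speed?")
print(pred.answer)
```

The key design point is that the program's prompts are parameters: swapping the teleprompter or the metric re-optimizes the same declarative code.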
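Similarly, a minimal TextGrad sketch of the loop described above: a natural-language loss critiques an output, `backward()` turns the critique into textual gradients, and Textual Gradient Descent (`TGD`) rewrites the variable. The `Variable`/`TextLoss`/`TGD` API follows the paper's released library; the engine names, the question, and the loss instruction are illustrative assumptions.

```python
import textgrad as tg

# LLM engines for the forward pass and for generating textual feedback
# (engine names are illustrative assumptions).
tg.set_backward_engine("gpt-4o", override=True)
model = tg.BlackboxLLM("gpt-4o")

# Inputs are frozen; the answer is the optimizable variable.
question = tg.Variable(
    "A bat and a ball cost $1.10 together; the bat costs $1.00 more "
    "than the ball. How much does the ball cost?",
    requires_grad=False,
    role_description="math question",
)
answer = model(question)
answer.set_role_description("concise and accurate answer to the question")

# The "loss" is itself natural language: an LLM critiques the answer.
loss_fn = tg.TextLoss(
    "Evaluate the answer to this math question; point out any arithmetic "
    "or logical errors step by step."
)
loss = loss_fn(answer)

# backward() propagates the critique as a natural language gradient;
# TGD (Textual Gradient Descent) then edits the answer accordingly.
loss.backward()
optimizer = tg.TGD(parameters=[answer])
optimizer.step()

print(answer.value)
```

The analogy to autograd is deliberate: variables, losses, gradients, and an optimizer all have textual counterparts, which is what lets the same machinery cover prompts, code, molecules, and treatment plans.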
