
AI: post transformers

Doubly Stochastic Attention for Transformers

10 Nov 2025

Description

The four papers we review, dating from 1967 up to two papers in 2025, collectively discuss the mathematical properties and deep learning applications of **doubly stochastic matrices**: nonnegative matrices whose rows and columns each sum to one. One paper, "Concerning Nonnegative Matrices and Doubly Stochastic Matrices," provides the **foundational mathematical theory** for the convergence of iterative row and column scaling (the Sinkhorn algorithm) to a unique doubly stochastic matrix, contingent on the original matrix having "total support." Two other papers focus on **Transformer architecture enhancements**, proposing "Sinkformers" and "ESPFormer" as variants that replace the standard row-wise softmax with **doubly stochastic attention matrices**, obtained via the Sinkhorn algorithm and expected sliced transport plans respectively, for improved performance and theoretical properties such as a connection to the Wasserstein metric. Finally, the "Gradient Multi-Normalization" paper introduces a **stateless optimizer** built on a multi-normalization procedure, including a "Square-Root Sinkhorn" variant, and demonstrates its efficacy and efficiency in training large language models.

Sources:

- 1967: "Concerning Nonnegative Matrices and Doubly Stochastic Matrices" (https://projecteuclid.org/journalArticle/Download?urlId=pjm%2F1102992505)
- June 24, 2022: "Sinkformers: Transformers with Doubly Stochastic Attention" (https://arxiv.org/pdf/2110.11773)
- February 10, 2025: "Gradient Multi-Normalization for Stateless and Scalable LLM Training" (https://arxiv.org/pdf/2502.06742)
- July 12, 2025: "ESPFormer: Doubly-Stochastic Attention with Expected Sliced Transport Plans" (https://arxiv.org/pdf/2502.07962)
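
As a rough illustration of the Sinkhorn normalization the episode describes, here is a minimal NumPy sketch (not drawn from any of the papers; the function name, iteration count, and toy matrix are illustrative): alternating row and column normalization of a positive score matrix drives its row and column sums toward one, in place of the single row-wise softmax used in standard attention.

```python
import numpy as np

def sinkhorn_attention(scores, n_iters=10, eps=1e-9):
    """Illustrative sketch: alternate row and column normalization
    (Sinkhorn iterations) on exponentiated attention scores, so the
    result is approximately doubly stochastic rather than merely
    row-stochastic as with a single row-wise softmax."""
    K = np.exp(scores - scores.max())            # positive matrix, as in softmax attention
    for _ in range(n_iters):
        K /= K.sum(axis=1, keepdims=True) + eps  # normalize rows
        K /= K.sum(axis=0, keepdims=True) + eps  # normalize columns
    return K

# Toy usage on a random 4x4 score matrix (e.g. queries @ keys.T / sqrt(d)).
rng = np.random.default_rng(0)
A = sinkhorn_attention(rng.normal(size=(4, 4)))
print(A.sum(axis=1))  # row sums, all close to 1
print(A.sum(axis=0))  # column sums, all close to 1
```

Per the 1967 result summarized above, these iterations converge to a unique doubly stochastic matrix whenever the starting matrix has total support; a strictly positive matrix, as produced by the exponential here, always does.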

Featured in this Episode

No persons identified in this episode.

Transcription

This episode hasn't been transcribed yet.

