Conditional Gradient Methods
Gábor Braun, Alejandro Carderera, Cyrille W. Combettes, Hamed Hassani, Amin Karbasi, Aryan Mokhtari, Sebastian Pokutta
From Core Principles to AI Applications
Release Date: 15/12/2025
Blending solid theoretical foundations with practical insight, this work demystifies projection-free optimization through the Frank–Wolfe method and its adaptive variants. The book tackles constrained optimization challenges in machine learning, signal processing, and large-scale data science, supported by rigorous proofs and clear illustrations.
Conditional Gradient Methods: From Core Principles to AI Applications offers a definitive and modern treatment of one of the most elegant and versatile algorithmic families in optimization: the Frank–Wolfe method and its many variants. Originally proposed in the 1950s, these projection-free techniques have seen a powerful resurgence, now playing a central role in machine learning, signal processing, and large-scale data science.
This comprehensive monograph unites deep theoretical insights with practical considerations, guiding readers through the foundations of constrained optimization and into cutting-edge territory, including stochastic, online, and distributed settings. With a clear narrative, rigorous proofs, and illuminating illustrations, the book demystifies adaptive variants, away-steps, and the nuances of dealing with structured convex sets. A FrankWolfe.jl Julia package that implements most of the algorithms in the book is available on a supplementary website.
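To give a flavor of the projection-free idea the book develops, here is a minimal sketch of the classic Frank–Wolfe iteration on the probability simplex, written in Python. This is an illustration only, not the book's FrankWolfe.jl implementation: the function names, the quadratic objective, and the choice of feasible set are assumptions made for the example. The key point is that feasibility is maintained by a linear minimization oracle (here, picking a simplex vertex) and a convex combination, with no projection step.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, num_steps=1000):
    """Frank-Wolfe (conditional gradient) over the probability simplex.

    The linear minimization oracle (LMO) over the simplex returns the
    standard basis vector at the most negative gradient coordinate,
    so each iterate is a convex combination of vertices -- no
    projection onto the feasible set is ever needed.
    """
    x = x0.copy()
    for t in range(num_steps):
        g = grad(x)
        v = np.zeros_like(x)
        v[np.argmin(g)] = 1.0            # LMO: argmin over simplex of <g, v>
        gamma = 2.0 / (t + 2.0)          # classic open-loop step size
        x = (1.0 - gamma) * x + gamma * v  # convex combination stays feasible
    return x

# Example (assumed for illustration): minimize f(x) = 0.5 * ||x - b||^2
# over the simplex, where b itself lies in the simplex, so the optimum is b.
b = np.array([0.2, 0.5, 0.3])
x_star = frank_wolfe_simplex(lambda x: x - b, np.array([1.0, 0.0, 0.0]))
```

With the step size 2/(t+2), the standard analysis gives an O(1/t) primal gap, which is why the iterate above approaches `b` while remaining exactly on the simplex at every step.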