Liam Madden

My research focuses on probability theory, linear algebra, and real analysis, with applications to optimization, machine learning, and quantum computing. Among other things, I have studied the approximate quantum compiling problem and developed algorithms and software for it. I have also studied the high-probability convergence properties of stochastic gradient descent, as well as the finite-sample expressivity of feedforward neural networks and of next-token prediction models such as transformers. I received my BS from California Polytechnic State University in 2017 with a double major in Mechanical Engineering and Mathematics. I received my MS from the Department of Applied Mathematics at the University of Colorado Boulder in 2020 and my PhD in 2022, advised by Emiliano Dall'Anese and Stephen Becker. From 2022 to 2024, I held a Data Science Institute Postdoctoral Fellowship at the University of British Columbia, supervised by Christos Thrampoulidis and Mark Schmidt. In my free time, I enjoy walking in the woods, composing poems, and dancing with friends.
selected publications
- Bounds for the Tracking Error of First-Order Online Optimization Methods. Journal of Optimization Theory and Applications, 2021.
- Best Approximate Quantum Compiling Problems. ACM Transactions on Quantum Computing, 2022.
- Memory Capacity of Two Layer Neural Networks with Smooth Activations. SIAM Journal on Mathematics of Data Science, 2024.
- High-Probability Convergence Bounds for Non-Convex Stochastic Gradient Descent with Sub-Weibull Noise. Journal of Machine Learning Research, 2024.
- Next-Token Prediction Capacity: General Upper Bounds and a Lower Bound for Transformers. IEEE Transactions on Information Theory, 2025.