Jonathan Geuter
Hello!
I’m a third-year PhD student in Applied Mathematics in the ML Foundations group at Harvard University, advised by David Alvarez-Melis. I am fortunate to be supported by a Kempner Graduate Fellowship at the Kempner Institute for the Study of Natural & Artificial Intelligence. My current research focuses on developing new and efficient methods for LLMs, in particular for test-time scaling and model distillation. I’m also very interested in masked diffusion models, and in diffusion models and flow matching more generally. I’ve worked extensively on optimal transport (OT) for machine learning, using ideas from OT to derive rigorous machine learning algorithms. Previously, I completed my Bachelor’s and Master’s in mathematics at TU Berlin, including a year at UC Berkeley.
Feel free to reach out; I’m always happy to discuss research ideas! If you’re an undergrad or master’s student interested in working with me, please drop me an email with [Interested in collaboration] in the subject line.
news
| Date | News |
|---|---|
| Oct 08, 2025 | New preprint out: Boomerang Distillation Enables Zero-Shot Model Size Interpolation. We show that given a teacher and a single distilled student model, you can create models of intermediate sizes without any additional training! |
| May 08, 2025 | I was selected to receive a Kempner Institute Graduate Fellowship! |
| May 01, 2025 | Three papers accepted to ICML 2025: Universal Neural Optimal Transport (main conference), Entropy-Driven Pre-Tokenization for Byte-Pair Encoding (Tokenization Workshop), and Guided Speculative Inference for Efficient Test-Time Alignment of LLMs (Spotlight at ES-FoMo Workshop). |