About

I am a postdoctoral research associate in the Department of Mathematics and Statistics at UMass Amherst, working with Markos Katsoulakis, Luc Rey-Bellet, and Paul Dupuis. My research interests lie broadly in the mathematics of machine learning: I analyze and develop novel generative modeling algorithms from the perspectives of mathematical control theory and mean-field games. I also develop new methods for rare event simulation in dynamical systems and new sampling methods for Bayesian computation using tools from generative flows.

I earned my PhD in Computational Science and Engineering from MIT in 2022, advised by Youssef Marzouk, who heads the Uncertainty Quantification group. I earned my Master’s degree in Aeronautics & Astronautics at MIT in 2017 and my Bachelor’s degrees in Engineering Physics and Applied Mathematics at UC Berkeley in 2015. I was an MIT School of Engineering MathWorks Fellow for 2019-2020. I spent the summer of 2017 as a research intern at United Technologies Research Center (now Raytheon Technologies), where I worked with Tuhin Sahai on novel queueing systems.

APMA 1930Z: Introduction to Mathematical Machine Learning (Fall 2024)

I will be co-teaching APMA 1930Z at Brown University in Fall 2024. APMA 1930Z is the second iteration of Math 590STA, first offered at UMass Amherst in Spring 2024. We cover classical approaches to machine learning tasks such as regression, classification, and dimension reduction, building them up from fundamental mathematical concepts.

Learning Learning

I am the co-organizer of the Learning Learning seminar, along with Hyemin Gu. This is an internal seminar at UMass Amherst where graduate students and postdocs discuss the latest developments in machine learning and data science through reading groups and tutorials. It is also a venue for students to present their research. Please contact us if you wish to participate in the group!

Recent news & upcoming events

July: Excited to announce a new preprint: Combining Wasserstein-1 and Wasserstein-2 proximals: robust manifold learning via well-posed generative flows. We introduce a generative flow trained on a combination of Wasserstein-1 and Wasserstein-2 proximals of $f$-divergences. While optimal transport costs (Wasserstein-2 proximals) have been used to stabilize the training of generative flows, such flows still struggle to learn high-dimensional data supported on low-dimensional manifolds. Our new $\mathcal{W}_1\oplus\mathcal{W}_2$ generative flow learns distributions without densities by using Wasserstein-1 proximals of the KL divergence. This is joint work with Hyemin Gu, Markos Katsoulakis, and Luc Rey-Bellet.
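For intuition, a Wasserstein proximal regularizes a divergence functional $F$ (here, an $f$-divergence to the target distribution) with an optimal transport penalty. Schematically, and with notation that is mine rather than the preprint's,

$$\operatorname{prox}^{\mathcal{W}_2}_{\tau F}(\rho) = \operatorname*{arg\,min}_{\mu}\; F(\mu) + \frac{1}{2\tau}\,\mathcal{W}_2^2(\mu,\rho), \qquad \operatorname{prox}^{\mathcal{W}_1}_{\tau F}(\rho) = \operatorname*{arg\,min}_{\mu}\; F(\mu) + \frac{1}{\tau}\,\mathcal{W}_1(\mu,\rho),$$

where $\tau>0$ is a step size. The $\mathcal{W}_1$ penalty, an infimal convolution as in Lipschitz-regularized divergences, keeps the objective finite even when the compared distributions are mutually singular, which is the regime relevant to data supported on low-dimensional manifolds.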

May: Excited to announce two new preprints! In Score-based generative models are provably robust: an uncertainty quantification perspective, we prove generalization bounds for score-based generative models (SGMs) in terms of integral probability metrics, using the regularity theory of Hamilton-Jacobi-Bellman equations and a novel uncertainty propagation perspective. This is joint work with Nikiforos Mimikos-Stamatopoulos and Markos Katsoulakis.
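As a reminder (notation mine), an integral probability metric compares two distributions over a class of test functions $\Gamma$,

$$d_\Gamma(P, Q) = \sup_{g \in \Gamma}\, \big| \mathbb{E}_{P}[g] - \mathbb{E}_{Q}[g] \big|;$$

taking $\Gamma$ to be the 1-Lipschitz functions recovers the Wasserstein-1 distance, while bounded test functions recover the total variation distance up to a constant.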

In Nonlinear denoising score matching for enhanced learning of structured distributions, we propose SGMs with nonlinear forward processes, which yield structure-preserving generative models. We develop a nonlinear implementation of denoising score matching to facilitate the use of such processes. This is joint work with Jeremy Birrell, Markos Katsoulakis, Luc Rey-Bellet, and Wei Zhu.
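For context, standard denoising score matching for a linear forward process (e.g., Ornstein-Uhlenbeck) minimizes an objective of the form below; the notation is illustrative, and the nonlinear version in the preprint generalizes this setting:

$$\mathcal{L}(\theta) = \mathbb{E}_{t,\, x_0,\, x_t \mid x_0}\Big[\, \lambda(t)\, \big\| s_\theta(x_t, t) - \nabla_{x_t} \log p_t(x_t \mid x_0) \big\|^2 \,\Big],$$

where $s_\theta$ is the score network, $\lambda(t)$ is a weighting, and $p_t(\cdot \mid x_0)$ is the transition kernel of the forward process, which is Gaussian with a closed-form score in the linear case; nonlinear forward processes no longer admit this closed form.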

News archive.