News
2024
October:
Our paper Transport map unadjusted Langevin algorithms has been accepted for publication in Foundations of Data Science!
New preprint on structure-preserving generative modeling! In Equivariant score-based generative models provably learn distributions with symmetries efficiently, we prove generalization bounds for equivariant score-based generative modeling and show that the common practice of data augmentation is provably inferior to using an explicitly equivariant score function when learning distributions invariant under a group. This work builds upon the PDE-theoretic analysis of generative models in Score-based generative models are provably robust: an uncertainty quantification perspective. This is joint work with Ziyu Chen and Markos Katsoulakis.
Happy to announce that our paper Score-based generative models are provably robust: an uncertainty quantification perspective has been accepted to NeurIPS 2024 (main track)!
I am organizing a minisymposium at the SIAM Conference on Mathematics of Data Science 2024 titled Foundations of structure-exploiting flow-based generative models. I will also be presenting in the Optimization algorithms for mean-field games and applications in data science minisymposium on our recent work relating mean-field games with generative modeling.
August: I have moved to the Division of Applied Mathematics at Brown University. This is the continuation of the same AFOSR postdoc I started at UMass Amherst. My mentors are Paul Dupuis, Markos Katsoulakis, and Luc Rey-Bellet. This year, the focus will be on interacting particle methods for sampling and stochastic control.
July: Excited to announce a new preprint: Combining Wasserstein-1 and Wasserstein-2 proximals: robust manifold learning via well-posed generative flows. We introduce a generative flow trained on a combination of Wasserstein-1 and Wasserstein-2 proximals of $f$-divergences. While optimal transport costs (Wasserstein-2 proximals) have been used to stabilize the training of generative flows, they still struggle to learn high-dimensional data supported on low-dimensional manifolds. Our new $\mathcal{W}_1\oplus\mathcal{W}_2$ generative flow learns distributions without densities by using Wasserstein-1 proximals of the KL divergence. This is joint work with Hyemin Gu, Markos Katsoulakis, and Luc Rey-Bellet.
May: Excited to announce two new preprints! In Score-based generative models are provably robust: an uncertainty quantification perspective, we prove SGM generalization bounds in terms of integral probability metrics using regularity theory of Hamilton-Jacobi-Bellman equations and a novel uncertainty propagation perspective. This is joint work with Nikiforos Mimikos-Stamatopoulos and Markos Katsoulakis.
In Nonlinear denoising score matching for enhanced learning of structured distributions, we propose SGMs with nonlinear forward processes, which produce structure-preserving generative models. A nonlinear implementation of denoising score matching is developed to facilitate the use of nonlinear processes. This is joint work with Jeremiah Birrell, Markos Katsoulakis, Luc Rey-Bellet, and Wei Zhu.
February: I am co-organizing a minisymposium at SIAM UQ 2024 titled Optimal Transport for Uncertainty Quantification with Panagiota Birmpa. I will also be presenting in the Computational Transport minisymposium on our recent work relating mean-field games with generative modeling.
I am excited to announce our new preprint titled Wasserstein proximal operators describe score-based generative models and resolve memorization. We show that score-based generative models can be fundamentally understood as Wasserstein proximal operators of cross-entropy, and we build informed models that resolve the memorization phenomenon in SGMs. This is joint work with Siting Liu, Wuchen Li, Markos Katsoulakis, and Stan Osher.
2023
November: I will be giving a talk at NYU Shanghai on our recent work Mean-Field Games Laboratory for Generative Modeling.
October: I will be visiting Emory University and speaking in their Computational and data-enabled science seminar series.
I gave a talk on our work Mean-Field Games Laboratory for Generative Modeling to the Machine Learning and Mean Field Games Seminar series.
June: I gave a talk on our work Mean-Field Games Laboratory for Generative Modeling to the UCLA Level Set Collective.
May: I will be attending the Optimal Transport in Data Science workshop at ICERM. I will be presenting a poster on our recent work on the Mean-Field Games Laboratory for Generative Modeling.
April: Excited to announce a new preprint titled A Mean-Field Games Laboratory for Generative Modeling. This is joint work with Markos Katsoulakis. We show that flow- and diffusion-based generative models, including normalizing flows, score-based models, and Wasserstein gradient flows, can be derived from a single unifying mean-field games framework.
February: I am the creator and organizer of Learning Learning, a seminar for students and postdocs to present their in-progress research and practice giving research presentations.
Announced a new preprint titled Transport map unadjusted Langevin algorithms. This is joint work with Youssef Marzouk and Konstantinos Spiliopoulos. We show that Langevin algorithms applied to target distributions preconditioned with a normalizing transport map can accelerate sampling and are related to reversible perturbations of Langevin dynamics.
2022
September: I will be presenting at SIAM MDS 2022 in the minisymposium on Frontiers in Monte Carlo Methods for Physics. My talk will be about our work on the Transport map unadjusted Langevin algorithm.
Our paper on Geometry-informed irreversible perturbations for accelerated convergence of Langevin dynamics was accepted for publication in Statistics and Computing.
I joined the Department of Mathematics and Statistics at UMass Amherst as a postdoctoral research associate.
April: I am co-organizing a minisymposium at SIAM UQ 2022 titled Data-Driven Approaches to Rare and Extreme Events, in which I will be presenting our work on Data-driven methods for rare event simulation in stochastic dynamical systems.
February: Attending the Data Assimilation - Mathematical Foundations and Applications workshop at the Mathematical Research Institute of Oberwolfach, February 20-26.
January: Our paper on A Koopman framework for rare event simulation in stochastic differential equations was accepted for publication in the Journal of Computational Physics.
2021
December: I successfully defended my PhD thesis on December 13!
Presented a poster on our Sampling via Controlled SDEs work at the ICBINB workshop at NeurIPS2021, December 13.