Sitemap

A list of all the posts and pages found on the site. For the robots out there, an XML version is available for digesting as well.

Pages

Posts

• A General Discussion of Flow Matching

5 minute read

Published:

Flow matching [1][2] is a continuous-time generative framework in which you learn a time-dependent vector field $v_{\theta}$ whose flow transports samples from a simple prior distribution (usually a standard Gaussian) at $t=0$ to the target data distribution at $t=1$. A minimal sketch of the idea follows the references below.

  1. Lipman Y., Chen R. T. Q., Ben-Hamu H., et al. Flow Matching for Generative Modeling. arXiv preprint arXiv:2210.02747, 2022.

  2. Lipman Y., Havasi M., Holderrieth P., et al. Flow Matching Guide and Code. arXiv preprint arXiv:2412.06264, 2024.
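To make this concrete, here is a minimal, self-contained PyTorch sketch of conditional flow matching under a linear interpolation path, with Euler integration of the learned field at sampling time. The toy two-Gaussian data sampler, the small MLP, and all hyperparameters are illustrative assumptions, not the setup used in the post.

```python
import torch
import torch.nn as nn

def sample_data(n):
    # Toy target: mixture of two 2-D Gaussians (a stand-in for real data).
    centers = torch.tensor([[-2.0, 0.0], [2.0, 0.0]])
    idx = torch.randint(0, 2, (n,))
    return centers[idx] + 0.3 * torch.randn(n, 2)

# v_theta(x, t): a small MLP over the concatenation (x, t); architecture is illustrative.
model = nn.Sequential(nn.Linear(3, 128), nn.SiLU(), nn.Linear(128, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(2000):
    x1 = sample_data(256)                      # data samples (t = 1)
    x0 = torch.randn_like(x1)                  # prior samples (t = 0)
    t = torch.rand(256, 1)                     # uniform times in [0, 1]
    xt = (1 - t) * x0 + t * x1                 # linear interpolation path
    target = x1 - x0                           # conditional velocity of this path
    pred = model(torch.cat([xt, t], dim=-1))
    loss = ((pred - target) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

# Sampling: integrate dx/dt = v_theta(x, t) from t = 0 to t = 1 with Euler steps.
with torch.no_grad():
    x = torch.randn(512, 2)
    for k in range(100):
        t = torch.full((512, 1), k / 100)
        x = x + (1 / 100) * model(torch.cat([x, t], dim=-1))
```

The regression target $x_1 - x_0$ is the conditional velocity of the straight-line path; averaging the loss over $(x_0, x_1, t)$ matches the marginal vector field in expectation, which is what makes this simple objective work.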

• PF-ODE Sampling in Diffusion Models

37 minute read

Published:

Diffusion sampling can be cast as integrating the probability flow ODE (PF-ODE), but dropping it into a generic ODE toolbox rarely delivers the best speed–quality trade-off. This post first revisits core numerical-analysis ideas. It then explains why vanilla integrators underperform on the semi-linear, sometimes stiff PF-ODE in low-NFE regimes, and surveys families that exploit diffusion-specific structure: pseudo-numerical samplers (PLMS/PNDM) and semi-analytic/high-order solvers (DEIS, DPM-Solver/++/UniPC). The goal is a practical, unified view of when and why these PF-ODE samplers work beyond “just use RK4.”
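For reference, the generic baseline that these specialized samplers improve upon can be as simple as explicit Euler on the PF-ODE. Below is a minimal sketch in the $\sigma$-parameterization, where the drift is $(x - D(x, \sigma))/\sigma$ for a trained denoiser $D$; the denoiser handle, the crude linear $\sigma$ schedule, and the step count are placeholder assumptions, not recommendations.

```python
import torch

def euler_pf_ode(denoiser, x, sigmas):
    # sigmas: decreasing noise levels, sigma_max ... down to 0.
    for i in range(len(sigmas) - 1):
        sigma, sigma_next = sigmas[i], sigmas[i + 1]
        d = (x - denoiser(x, sigma)) / sigma      # PF-ODE drift at (x, sigma)
        x = x + (sigma_next - sigma) * d          # explicit Euler step
    return x

# Usage: start from pure noise scaled by sigma_max (schedule is an assumption).
sigmas = torch.linspace(80.0, 0.0, 19)
x0 = 80.0 * torch.randn(4, 3, 32, 32)
# samples = euler_pf_ode(trained_denoiser, x0, sigmas)  # needs a trained model
```

Methods like DPM-Solver and UniPC replace this single-slope Euler update with multistep or predictor–corrector estimates of the same drift, which is where the accuracy at low NFE comes from.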

• A Panoramic View of Diffusion Model Sampling: From Classic Theory to Frontier Research

34 minute read

Published:

This article takes a deep dive into the evolution of diffusion model sampling techniques, tracing the progression from early score-based models with Langevin Dynamics, through discrete and non-Markov diffusion processes, to continuous-time SDE/ODE formulations, specialized numerical solvers, and cutting-edge methods such as consistency models, distillation, and flow matching. Our goal is to provide both a historical perspective and a unified theoretical framework to help readers understand not only how these methods work but why they were developed.

• Analysis of the Stability of Diffusion Model Training

19 minute read

Published:

While diffusion models have revolutionized generative AI, their training challenges stem from a combination of resource intensity, optimization intricacies, and deployment hurdles. A stable training process ensures that the model produces high-quality samples and converges efficiently without suffering from numerical instabilities.

• An Overview of Diffusion Models

11 minute read

Published:

Diffusion models have proven to be a highly promising approach in the field of image generation. They frame image generation as two complementary processes: the forward process, which transforms a complex data distribution into a known prior distribution (typically a standard normal distribution) by gradually injecting noise; and the reverse process, which transforms the prior distribution back into the complex data distribution by gradually removing the noise.
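As an illustration of the forward process, the noise injection admits a closed form under the standard DDPM parameterization: $q(x_t \mid x_0) = \mathcal{N}(\sqrt{\bar{\alpha}_t}\, x_0, (1 - \bar{\alpha}_t) I)$. A minimal sketch, assuming the usual linear $\beta$ schedule and $T = 1000$ steps (both conventional choices, not specifics from the post):

```python
import torch

T = 1000
betas = torch.linspace(1e-4, 0.02, T)             # linear noise schedule
alpha_bars = torch.cumprod(1.0 - betas, dim=0)    # bar{alpha}_t = prod_s (1 - beta_s)

def q_sample(x0, t):
    """Draw x_t ~ q(x_t | x_0) in one shot by injecting Gaussian noise at step t."""
    noise = torch.randn_like(x0)
    ab = alpha_bars[t].view(-1, *([1] * (x0.dim() - 1)))  # broadcast over pixels
    return ab.sqrt() * x0 + (1.0 - ab).sqrt() * noise

# As t -> T, bar{alpha}_t -> 0, so x_t approaches a standard normal sample
# regardless of x_0 -- this is what makes the prior known in closed form.
x0 = torch.rand(8, 3, 32, 32)
xT = q_sample(x0, torch.full((8,), T - 1, dtype=torch.long))
```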

awards

books

patents

projects

publications

services

talks