Selected Publications

* indicates equal contribution.

Generalization of Diffusion Models Arises with a Balanced Representation Space

ICLR 2026 Zekai Zhang*, Xiao Li*, Xiang Li, Lianghe Shi, Meng Wu, Molei Tao, Qing Qu

Diffusion models generalize by extracting the underlying structure of the data, learning balanced and informative representations.

A Closer Look at Model Collapse: From a Generalization-to-Memorization Perspective

NeurIPS 2025 (Spotlight) Lianghe Shi*, Meng Wu*, Huijie Zhang, Zekai Zhang, Molei Tao, Qing Qu

When diffusion models are iteratively trained on synthetic data, the generated distribution collapses toward a stable but low-diversity, low-quality mode.

Understanding Representation Dynamics of Diffusion Models via Low-Dimensional Modeling

NeurIPS 2025 Xiao Li*, Zekai Zhang*, Xiang Li, Siyi Chen, Zhihui Zhu, Peng Wang, Qing Qu

Analyzes how time conditioning shapes diffusion representations and how these representation dynamics can be used to diagnose overfitting.

Efficient Compression of Overparameterized Deep Models through Low-Dimensional Learning Dynamics

AISTATS 2024 Soo Min Kwon*, Zekai Zhang*, Dogyoon Song, Laura Balzano, Qing Qu

LeaF: Learning Frames for 4D Point Cloud Sequence Understanding

ICCV 2023 Yunze Liu, Junyu Chen, Zekai Zhang, Jingwei Huang, Li Yi