Liu Ziyin

Email: liu.ziyin.p (at) gmail.com / ziyinl (at) mit.edu
Office: Room 26-209, MIT

CV



I am a Postdoctoral Fellow at MIT and NTT Research. At MIT, I work with Prof. Isaac Chuang. I also collaborate with Prof. Tomaso Poggio in the BCS department. My research focuses on the theoretical foundations of deep learning. Prior to coming to MIT, I received my PhD in physics from the University of Tokyo under the supervision of Prof. Masahito Ueda. I received my Bachelor's degree in physics and mathematics from Carnegie Mellon University. Personally, I am interested in art, literature, and philosophy. I also play Go. If you have questions, want to collaborate, or just want to say hi, feel free to send an email. Also, NTT Research might be hiring new interns. Please consider applying.

A talk I gave recently: How does physics help understand deep learning?

Doctoral thesis: Symmetry breaking in deep learning (深層学習に於ける対称性の破れ, 2023).
Master's thesis: Mean-field learning dynamics of deep neural networks (2020).

Research Interest

I am particularly interested in identifying scientific principles of artificial intelligence (what is a principle?), and I think tools and intuitions from other fields of science can be of great help. Broadly speaking, I work to advance the theoretical foundations of deep learning.

Recent Preprints

  1. Parameter Symmetry Breaking and Restoration Determines the Hierarchical Learning in AI Systems
    Liu Ziyin, Yizhou Xu, Tomaso Poggio, Isaac Chuang
    Preprint 2025
    [arXiv]
  2. Self-Assembly of a Biologically Plausible Learning Circuit
    Qianli Liao*, Liu Ziyin*, Yulu Gan*, Brian Cheung, Mark Harnett, Tomaso Poggio
    Preprint 2024
    [arXiv]
  3. Law of Balance and Stationary Distribution of Stochastic Gradient Descent
    Liu Ziyin*, Hongchao Li*, Masahito Ueda
    Preprint 2023
    [arXiv]
  4. Probabilistic Stability of Stochastic Gradient Descent
    Liu Ziyin, Botao Li, Tomer Galanti, Masahito Ueda
    Preprint 2023
    [arXiv]

Publications

  1. Formation of Representations in Neural Networks
    Liu Ziyin, Isaac Chuang, Tomer Galanti, Tomaso Poggio
    ICLR 2025 (spotlight: 5% of all submissions)
    [paper]
  2. Remove Symmetries to Control Model Expressivity
    Liu Ziyin*, Yizhou Xu*, Isaac Chuang
    ICLR 2025
    [paper]
  3. When Does Feature Learning Happen? Perspective from an Analytically Solvable Model
    Yizhou Xu*, Liu Ziyin*
    ICLR 2025
    [paper]
  4. Parameter Symmetry and Noise Equilibrium of Stochastic Gradient Descent
    Liu Ziyin, Mingze Wang, Hongchao Li, Lei Wu
    NeurIPS 2024
    [paper] [arXiv]
  5. Symmetry Induces Structure and Constraint of Learning
    Liu Ziyin
    ICML 2024
    [arXiv]
  6. Zeroth, first, and second-order phase transitions in deep neural networks
    Liu Ziyin, Masahito Ueda
    Physical Review Research 2023
    [arXiv]
  7. Exact Solutions of a Deep Linear Network
    Liu Ziyin, Botao Li, Xiangming Meng
    Journal of Statistical Mechanics: Theory and Experiment, 2023
    [paper] [arXiv]
  8. On the stepwise nature of self-supervised learning
    James B. Simon, Maksis Knutins, Liu Ziyin, Daniel Geisz, Abraham J. Fetterman, Joshua Albrecht
    ICML 2023
    [paper] [arXiv]
  9. Sparsity by Redundancy: Solving L1 with SGD
    Liu Ziyin*, Zihao Wang*
    ICML 2023
    [paper] [arXiv]
  10. What shapes the loss landscape of self-supervised learning?
    Liu Ziyin, Ekdeep Singh Lubana, Masahito Ueda, Hidenori Tanaka
    ICLR 2023
    [paper] [arXiv]
  11. Exact Solutions of a Deep Linear Network
    Liu Ziyin, Botao Li, Xiangming Meng
    NeurIPS 2022
    [paper] [arXiv]
  12. Posterior Collapse of a Linear Latent Variable Model
    Zihao Wang*, Liu Ziyin*
    NeurIPS 2022 (oral: 1% of all submissions)
    [paper] [arXiv]
  13. Universal Thermodynamic Uncertainty Relation in Non-Equilibrium Dynamics
    Liu Ziyin, Masahito Ueda
    Physical Review Research (2022)
    [paper] [arXiv]
  14. Theoretically Motivated Data Augmentation and Regularization for Portfolio Construction
    Liu Ziyin, Kentaro Minami, Kentaro Imajo
    ICAIF 2022 (3rd ACM International Conference on AI in Finance)
    [paper] [arXiv]
  15. Power Laws and Symmetries in a Minimal Model of Financial Market Economy
    Liu Ziyin, Katsuya Ito, Kentaro Imajo, Kentaro Minami
    Physical Review Research (2022)
    [paper] [arXiv]
  16. Logarithmic landscape and power-law escape rate of SGD
    Takashi Mori, Liu Ziyin, Kangqiao Liu, Masahito Ueda
    ICML 2022
    [paper] [arXiv]
  17. SGD with a Constant Large Learning Rate Can Converge to Local Maxima
    Liu Ziyin, Botao Li, James B. Simon, Masahito Ueda
    ICLR 2022 (spotlight: 5% of all submissions)
    [paper] [arXiv]
  18. Strength of Minibatch Noise in SGD
    Liu Ziyin*, Kangqiao Liu*, Takashi Mori, Masahito Ueda
    ICLR 2022 (spotlight: 5% of all submissions)
    [paper] [arXiv]
  19. On the Distributional Properties of Adaptive Gradients
    Zhang Zhiyi*, Liu Ziyin*
    UAI 2021
    [paper] [arXiv]
  20. Noise and Fluctuation of Finite Learning Rate Stochastic Gradient Descent
    Kangqiao Liu*, Liu Ziyin*, Masahito Ueda
    ICML 2021
    [paper] [arXiv]
  21. Cross-Modal Generalization: Learning in Low Resource Modalities via Meta-Alignment
    Paul Pu Liang*, Peter Wu*, Liu Ziyin, Louis-Philippe Morency, Ruslan Salakhutdinov
    ACM Multimedia 2021
    NeurIPS 2020 Workshop on Meta Learning
    [arXiv] [code]
  22. Neural Networks Fail to Learn Periodic Functions and How to Fix It
    Liu Ziyin, Tilman Hartwig, Masahito Ueda
    NeurIPS 2020
    [paper] [arXiv]
  23. Deep Gamblers: Learning to Abstain with Portfolio Theory
    Liu Ziyin, Zhikang Wang, Paul Pu Liang, Ruslan Salakhutdinov, Louis-Philippe Morency, Masahito Ueda
    NeurIPS 2019
    [paper] [arXiv] [code]
  24. Think Locally, Act Globally: Federated Learning with Local and Global Representations
    Paul Pu Liang*, Terrance Liu*, Liu Ziyin, Ruslan Salakhutdinov, Louis-Philippe Morency
    NeurIPS 2019 Workshop on Federated Learning (oral, distinguished student paper award)
    [paper] [arXiv] [code]
  25. Multimodal Language Analysis with Recurrent Multistage Fusion
    Paul Pu Liang, Ziyin Liu, Amir Zadeh, Louis-Philippe Morency
    EMNLP 2018 (oral presentation)
    [paper] [supp] [arXiv] [slides]
