Publications

You can also find my articles on my Google Scholar profile.

* corresponding author

Preprints

  1. Kuangyu Ding, Marie Maros, Gesualdo Scutari*. A New Decomposition Paradigm for Graph-structured Nonlinear Programs via Message Passing. 2025. [ arXiv ]
  2. Kuangyu Ding*, Kim-Chuan Toh. On exploration of an interior mirror descent flow for stochastic nonconvex constrained problem. 2025. [ arXiv ]
  3. Xingyu Xie, Kuangyu Ding, Kim-Chuan Toh, Shuicheng Yan*, Tianwen Wei*. Optimization hyper-parameter laws for large language models. 2024. [ arXiv ]

Journal Publications

  1. Nachuan Xiao, Kuangyu Ding*, Xiaoyin Hu, Kim-Chuan Toh. Developing Lagrangian-based methods for nonsmooth nonconvex optimization. Mathematics of Operations Research, 2026. [ MOOR ]
  2. Kuangyu Ding, Kim-Chuan Toh*. Stochastic Bregman Subgradient Methods for Nonsmooth Nonconvex Optimization Problems. Journal of Optimization Theory and Applications, 2025. [ JOTA ]
  3. Kuangyu Ding, Nachuan Xiao*, Kim-Chuan Toh. Adam-family methods with decoupled weight decay in deep learning. Transactions on Machine Learning Research, 2025. [ TMLR , J2C Certification ]
  4. Kuangyu Ding*, Jingyang Li, Kim-Chuan Toh. Nonconvex stochastic Bregman proximal gradient method with application to deep learning. Journal of Machine Learning Research, 2025. [ JMLR ]
  5. Kuangyu Ding, Xin-Yee Lam*, Kim-Chuan Toh. On proximal augmented Lagrangian based decomposition methods for dual block-angular convex composite programming problems. Computational Optimization and Applications, 2023. [ COAP ]

Conference Publications

  1. Jingyang Li, Kuangyu Ding, Kim-Chuan Toh, Pan Zhou. Memory-efficient 4-bit preconditioned stochastic optimization. Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2025. [ ICCV ]
  2. Jingyang Li, Pan Zhou, Kuangyu Ding, Kim-Chuan Toh*, Yinyu Ye*. Dimension-Reduced Adaptive Gradient Method. OPT 2022 Workshop (Optimization for Machine Learning), 2022. [ Workshop , Workshop paper ]