Wei Jiang @ LAMDA, NJU-CS


姜伟
Wei Jiang
Ph.D. student, LAMDA Group
Department of Computer Science and Technology
National Key Laboratory for Novel Software Technology
Nanjing University, Nanjing 210023, China

Google Scholar


Supervisor

      Professor Lijun Zhang

Education

Research Interests

      My research interests include Machine Learning, Stochastic Optimization, and Online Learning.

Preprints

  1. Adaptive Variance Reduction for Stochastic Optimization under Weaker Assumptions [arXiv]
    W. Jiang, S. Yang, Y. Wang, and L. Zhang

  2. Efficient Sign-Based Optimization: Accelerating Convergence via Variance Reduction [arXiv]
    W. Jiang, S. Yang, W. Yang, and L. Zhang

Publications

  1. Projection-Free Variance Reduction Methods for Stochastic Constrained Multi-Level Compositional Optimization [arXiv]
    W. Jiang, S. Yang, W. Yang, Y. Wang, Y. Wan, and L. Zhang
    In Proceedings of the 41st International Conference on Machine Learning (ICML 2024), to appear, 2024.

  2. Small-loss Adaptive Regret for Online Convex Optimization
    W. Yang, W. Jiang, Y. Wang, P. Yang, Y. Hu, and L. Zhang
    In Proceedings of the 41st International Conference on Machine Learning (ICML 2024), to appear, 2024.

  3. Efficient Algorithms for Empirical Group Distributional Robust Optimization and Beyond [arXiv]
    D. Yu, Y. Cai, W. Jiang, and L. Zhang
    In Proceedings of the 41st International Conference on Machine Learning (ICML 2024), to appear, 2024.

  4. Non-stationary Projection-free Online Learning with Dynamic and Adaptive Regret Guarantees [arXiv]
    Y. Wang, W. Yang, W. Jiang, S. Lu, B. Wang, H. Tang, Y. Wan, and L. Zhang
    In Proceedings of the 38th AAAI Conference on Artificial Intelligence (AAAI 2024), to appear, 2024.

  5. Learning Unnormalized Statistical Models via Compositional Optimization [PDF]
    W. Jiang, J. Qin, L. Wu, C. Chen, T. Yang, and L. Zhang
    In Proceedings of the 40th International Conference on Machine Learning (ICML 2023), pages 15105 - 15124, 2023.

  6. Multi-block-Single-probe Variance Reduced Estimator for Coupled Compositional Optimization [PDF, Supplementary]
    W. Jiang, G. Li, Y. Wang, L. Zhang, and T. Yang
    In Advances in Neural Information Processing Systems 35 (NeurIPS 2022), pages 32499 - 32511, 2022.

  7. Smoothed Online Convex Optimization Based on Discounted-Normal-Predictor [PDF, Supplementary]
    L. Zhang, W. Jiang, J. Yi, and T. Yang
    In Advances in Neural Information Processing Systems 35 (NeurIPS 2022), pages 4928 - 4942, 2022.

  8. Optimal Algorithms for Stochastic Multi-Level Compositional Optimization [PDF, Bibtex]
    W. Jiang, B. Wang, Y. Wang, L. Zhang, and T. Yang
    In Proceedings of the 39th International Conference on Machine Learning (ICML 2022), pages 10195 - 10216, 2022.

  9. Revisiting Smoothed Online Learning [PDF, Supplementary, Bibtex]
    L. Zhang, W. Jiang, S. Lu, and T. Yang
    In Advances in Neural Information Processing Systems 34 (NeurIPS 2021), pages 13599 - 13612, 2021.

  10. Dual Adaptivity: A Universal Algorithm for Minimizing the Adaptive Regret of Convex Functions [PDF, Supplementary, Bibtex]
    L. Zhang, G. Wang, W.-W. Tu, W. Jiang, and Z.-H. Zhou
    In Advances in Neural Information Processing Systems 34 (NeurIPS 2021), pages 24968 - 24980, 2021.

Honors and Awards

Funding

Distributed Optimization of Compositional Loss Functions. Postgraduate Research & Practice Innovation Program of Jiangsu Province (KYCX24_0231), 2024.05–2025.05.

Academic Service

Teaching Assistant

Correspondence