Yi-Kai Zhang @ LAMDA, NJU-AI

Yi-Kai Zhang
Ph.D. Candidate, LAMDA Group
School of Artificial Intelligence
Nanjing University, Nanjing 210023, China

Supervisors: Associate Professor Han-Jia Ye and Professor De-Chuan Zhan

Email: zhangyk@lamda.nju.edu.cn
Laboratory: Computer Science Building, Xianlin Campus of Nanjing University


I am a graduate student at the School of Artificial Intelligence, Nanjing University, and a member of the LAMDA Group, which is led by Professor Zhi-Hua Zhou.

I received my B.Sc. degree from the Department of Computer Science and Technology, Nanjing University, in June 2021, with a minor in Statistics from the Department of Mathematics. In the same year, I was admitted, without entrance examination, to the M.Sc. program at Nanjing University under the supervision of Associate Professor Han-Jia Ye and Professor De-Chuan Zhan.

In September 2023, I began my Ph.D. studies in machine learning under the supervision of Associate Professor Han-Jia Ye and Professor De-Chuan Zhan.

Research Interests

My research interests include Machine Learning and Data Mining. Currently, I focus on learning in extreme environments, including Few-Shot (Meta) Learning, Debiasing, and Model Reuse (Transfer of Pre-trained Models).

Publications - Preprints

  • Yi-Kai Zhang, Lu Ren, Chao Yi, Qi-Wei Wang, De-Chuan Zhan, Han-Jia Ye. ZhiJian: A Unifying and Rapidly Deployable Toolbox for Pre-trained Model Reuse. arXiv:2308.09158, 2023. [Paper] [Code] [Docs]

  • We introduce ZhiJian, a comprehensive and user-friendly toolbox for model reuse built on the PyTorch backend. ZhiJian presents a novel paradigm that unifies diverse perspectives on model reuse, encompassing target architecture construction with a pre-trained model (PTM), tuning the target model with a PTM, and PTM-based inference.

Publications - Conference Papers

  • Yi-Kai Zhang, Ting-Ji Huang, Yao-Xiang Ding, De-Chuan Zhan, Han-Jia Ye. Model Spider: Learning to Rank Pre-Trained Models Efficiently. [Spotlight] In: Advances in Neural Information Processing Systems 36 (NeurIPS'23), New Orleans, 2023. [Paper] [Code]

  • We propose Model Spider, which tokenizes pre-trained models (PTMs) and tasks to enable efficient PTM selection. Leveraging PTMs' approximated performance on historical tasks, Model Spider learns to rank model-task pairs and generalizes to new downstream tasks. PTM-specific task tokens further improve PTM selection. Model Spider balances efficiency and selection ability, demonstrating promising performance across various configurations of model zoos.

  • Yi-Kai Zhang, Qi-Wei Wang, Han-Jia Ye, De-Chuan Zhan. Learning Debiased Representations via Conditional Attribute Interpolation. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR'23), Vancouver, Canada, 2023. [Paper] [Code]

  • We propose the chi-square model, a novel method for learning debiased representations. The chi-square model addresses dataset bias by identifying Intermediate Attribute Samples (IASs) that exhibit a chi-shaped pattern and rectifying representations through a chi-structured metric-learning objective. It achieves strong performance across diverse datasets.

  • Yi-Kai Zhang, Da-Wei Zhou, Han-Jia Ye, De-Chuan Zhan. Audio-Visual Generalized Few-Shot Learning with Prototype-Based Co-Adaptation. In: Proceedings of the 23rd INTERSPEECH Conference (INTERSPEECH'22), Incheon, Korea (hybrid), 2022. [Paper] [Code]

  • We propose Prototype-based Co-Adaptation with Transformer (Proto-CAT), a multi-modal generalized few-shot learning method for audio-visual speech recognition systems. In other words, Proto-CAT learns to recognize novel multi-modal classes from few-shot training data while maintaining its ability on the base closed-set categories.

Awards & Honors

Teaching Assistant

Service Work


Mail Address

Yi-Kai Zhang
National Key Laboratory for Novel Software Technology, Nanjing University, Xianlin Campus Mailbox 603,
163 Xianlin Avenue, Qixia District, Nanjing 210023, China
(南京市栖霞区仙林大道163号, 南京大学仙林校区603信箱, 软件新技术国家重点实验室, 210023.)