Su Lu @ LAMDA, NJU-CS

Su Lu
Ph.D. Candidate
LAMDA Group
Department of Computer Science & Technology
Nanjing University, Nanjing 210023, China.

Email: lus [at] lamda.nju.edu.cn
      

[Resume-ENG] [Resume-CHN]
      

Short Bio

Main Research Interests

Su mainly focuses on machine learning, especially:

Meta-Learning

Knowledge Distillation

Publications - Preprints

  • Su Lu, Han-Jia Ye, De-Chuan Zhan. Faculty Distillation with Optimal Transport. arXiv:2204.11526, 2022. [ArXiv]

  • This paper studies a new knowledge distillation paradigm called 'Faculty Distillation', in which the most suitable teacher is selected from a pool of candidate models before performing generalized knowledge reuse.

  • Su Lu, Han-Jia Ye, De-Chuan Zhan. Few-Shot Action Recognition with Compromised Metric via Optimal Transport. arXiv:2104.03737, 2021. [ArXiv]

  • In this paper, we propose a compromised video metric which simultaneously considers and balances long-term and short-term temporal relations in videos. We define the distance between two videos as the optimal transportation cost between their segment sequences, achieving promising performance in few-shot action recognition.
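The core idea of defining a video distance as an optimal transportation cost can be sketched as below. This is only an illustrative toy version with uniform segment weights and a plain Euclidean cost matrix; the paper's actual cost additionally balances long-term and short-term temporal relations, which this sketch omits. All function names here are illustrative, not from the paper's code.

```python
import numpy as np

def sinkhorn(cost, a, b, reg=0.1, n_iters=200):
    """Entropy-regularized optimal transport via Sinkhorn iterations.

    cost : (m, n) pairwise cost matrix between segments of two videos
    a, b : marginal weights of the two segment sequences (each sums to 1)
    Returns the transport plan and the total transport cost.
    """
    K = np.exp(-cost / reg)          # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)            # scale to match column marginals
        u = a / (K @ v)              # scale to match row marginals
    plan = u[:, None] * K * v[None, :]
    return plan, float((plan * cost).sum())

def video_distance(segs_x, segs_y):
    """Distance between two videos as the OT cost between their
    segment-embedding sequences, with uniform segment weights."""
    cost = np.linalg.norm(segs_x[:, None, :] - segs_y[None, :, :], axis=-1)
    a = np.full(len(segs_x), 1.0 / len(segs_x))
    b = np.full(len(segs_y), 1.0 / len(segs_y))
    _, d = sinkhorn(cost, a, b)
    return d
```

Because the distance is computed over segment sequences rather than pooled clip features, temporal structure enters through the cost matrix while the transport plan handles alignment between sequences of different lengths.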

Publications - Conference Papers

  • Su Lu, Han-Jia Ye, Le Gan, De-Chuan Zhan. Towards Enabling Meta-Learning from Target Models. In: Advances in Neural Information Processing Systems 34 (NeurIPS'21), online, 2021. [ArXiv] [Paper] [Supplementary] [Code]

  • The widely adopted support-query protocol in meta-learning suffers from biased and noisy sampling of query instances. As an alternative, we evaluate the base model by measuring its distance to a target model. We show that a small number of target models can improve classic meta-learning algorithms.

  • Su Lu, Han-Jia Ye, De-Chuan Zhan. Tailoring Embedding Function to Heterogeneous Few-Shot Tasks by Global and Local Feature Adaptors. In: Proceedings of the 35th AAAI Conference on Artificial Intelligence (AAAI'21), online, 2021. [Paper] [Supplementary] [Code]

  • Task-adaptive representation is beneficial to metric-based few-shot learning algorithms. In this paper, we claim that class-specific local transformations help to improve the representation ability of the feature adaptor, especially for heterogeneous tasks.

  • Han-Jia Ye, Su Lu, De-Chuan Zhan. Distilling Cross-Task Knowledge via Relationship Matching. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR'20), Seattle, Washington, 2020. [oral] [Paper] [Supplementary] [Code]

  • To reuse cross-task knowledge, we distill the comparison ability of the embedding and the local classification ability of the top-layer classifier from a teacher model. Our proposed method can also be applied to standard knowledge distillation and middle-shot learning.
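The relationship-matching idea of aligning a student's pairwise instance similarities with the teacher's can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses a plain cosine-similarity matrix and a mean-squared penalty, and all names are illustrative.

```python
import numpy as np

def pairwise_cosine(feats):
    """Pairwise cosine-similarity matrix for a batch of embeddings."""
    normed = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    return normed @ normed.T

def relation_matching_loss(teacher_feats, student_feats):
    """Penalize mismatch between teacher and student pairwise similarities.

    Only instance-to-instance relations are compared, so the teacher's
    label space never enters the loss; this is what allows knowledge to
    transfer across different tasks.
    """
    rt = pairwise_cosine(teacher_feats)
    rs = pairwise_cosine(student_feats)
    return float(np.mean((rt - rs) ** 2))
```

In practice such a relational term would be added to the student's ordinary classification loss on its own task, since the relational loss alone does not pin down the student's label semantics.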

  • Xiang-Rong Sheng, De-Chuan Zhan, Su Lu, Yuan Jiang. Multi-View Anomaly Detection: Neighborhood in Locality Matters. In: Proceedings of the 33rd AAAI Conference on Artificial Intelligence (AAAI'19), Honolulu, Hawaii, 2019. [Paper] [Supplementary]

  • We study the problem of multi-view anomaly detection and propose a novel neighbor-based method. Our method can simultaneously detect dissension anomalies and unanimous anomalies without relying on a clustering assumption.

Publications - Journal Articles

  • Han-Jia Ye, Su Lu, De-Chuan Zhan. Generalized Knowledge Distillation via Relationship Matching. IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), DOI: 10.1109/TPAMI.2022.3160328, 2022. [ArXiv] [Paper] [Supplementary] [Code] [Official Version]

  • We study the problem of distilling knowledge from a generalized teacher, whose label space may be the same as, overlap with, or differ entirely from the student's. The teacher's comparison ability is reused to bridge the different label spaces.

Academic and Industrial Projects


App Usage Prediction and Preloading

Awards & Honors & Contests

Academic Service

Teaching Assistant

Correspondence

Email: lus [at] lamda.nju.edu.cn
Office: Yifu Building A201, Xianlin Campus of Nanjing University
Address: Su Lu
         National Key Laboratory for Novel Software Technology
         Nanjing University, Xianlin Campus
         163 Xianlin Avenue, Qixia District, Nanjing 210023, China