Shenghe Zheng is a second-year Master's degree candidate in the Department of Computer Science at Harbin Institute of Technology, fortunately advised by Prof. Hongzhi Wang. He obtained his Bachelor's degree from Harbin Institute of Technology in June 2023.
His current research interests lie in Large Language Models, Multimodal Models, Efficient AI, and Neural Architecture Search (NAS). He is currently interning at the Shanghai AI Lab, conducting research on large language models under the guidance of Dr. Peng Ye and collaborating closely with Dr. Ganqu Cui. If you are seeking any form of academic cooperation, please feel free to email me at shenghez.zheng@gmail.com.
📖 Education
- 2023.08 - now, Master's Candidate in Computer Science, Harbin Institute of Technology.
- Supervisor: Prof. Hongzhi Wang, Rank: 1/129.
- 2019.08 - 2023.06, B.Eng. in Computer Science, Harbin Institute of Technology.
🔥 News
- 2025.06: 🎉 Our paper FREE-Merging has been accepted to ICCV 2025.
- 2024.09: 🎉 Our paper IntraMix has been accepted to NeurIPS 2024.
📝 Publications
† indicates equal contribution

FREE-Merging: Fourier Transform for Model Merging with Lightweight Experts
Shenghe Zheng, Hongzhi Wang. ICCV 2025. Code
- We find that task-specific information introduced during fine-tuning harms model merging, so we apply frequency-domain filtering to reduce conflicts and add lightweight experts to compensate for the information lost during merging.

IntraMix: Intra-Class Mixup Generation for Accurate Labels and Neighbors
Shenghe Zheng, Hongzhi Wang, Xianglong Liu. NeurIPS 2024. Code
- This paper introduces IntraMix, which seamlessly applies Mixup to graph augmentation and improves model performance in data-scarce scenarios.

DCLP: Neural Architecture Predictor with Curriculum Contrastive Learning
Shenghe Zheng, Hongzhi Wang, Tianyu Mu. Code
- This paper introduces contrastive and curriculum learning for neural predictors, greatly reducing the performance evaluation cost in NAS.

AutoTSC: Optimization Algorithm to Automatically Solve the Time Series Classification Problem
Tianyu Mu, Hongzhi Wang, Shenghe Zheng
- This paper applies AutoML to time series classification, using dataset similarity to automatically recommend algorithms for new datasets.

Assassin: An Automatic Classification System Based on Algorithm Selection
Tianyu Mu, Hongzhi Wang, Shenghe Zheng
- This paper uses reinforcement learning to select meta-features and automatically recommend algorithms and hyperparameters for new datasets.
📝 Preprints

Decouple and Orthogonalize: A Data-Free Framework for LoRA Merging
Shenghe Zheng, Hongzhi Wang, Chenyu Huang, Xiaohui Wang, Tao Chen, Jiayuan Fan, Shuyue Hu, Peng Ye
- We find that existing model merging methods are ineffective for LoRA because of large differences in parameter magnitudes. We propose a decoupling approach and optimize it via data-free orthogonalization.

Scaling Physical Reasoning with the PHYSICS Dataset
Shenghe Zheng†, Qianjia Cheng†, Junchi Yao†, Mengsong Wu, Haonan He, Ning Ding, Yu Cheng, Shuyue Hu, Lei Bai, Dongzhan Zhou, Ganqu Cui, Peng Ye
- To address the lack of training and evaluation data for the physical reasoning capabilities of large language models, we collect a high-quality, diverse set of physics problems and design a physics-specific evaluation framework.

MoE-Gyro: Self-Supervised Over-Range Reconstruction and Denoising for MEMS Gyroscopes
Feiyang Pan†, Shenghe Zheng†, Chunyan Yin, Guangbin Dou
- To address the poor noise performance and limited measurement range of low-cost MEMS gyroscopes, we propose a generative approach for denoising and over-range compensation. Because data in this domain are scarce, we also construct a targeted dataset.
🎖 Honors and Awards
- National Scholarship, 2024
- Tencent Scholarship, 2024
- Outstanding Graduate of Harbin Institute of Technology, 2023
- Second Prize at the 14th Undergraduate Academic Forum, Harbin Institute of Technology, 2021
- Honorable Mention in the Mathematical Contest in Modeling (MCM/ICM), 2021
- Outstanding Student of Harbin Institute of Technology, 2019-2024
- Renmin Scholarship, 2019-2022
Professional Activities
Reviewer: NeurIPS 2025, ECAI 2025