Hi, I am Yuechi Zhou (周月驰), a second-year graduate student supervised by Assoc. Prof. Juntao Li at Soochow University. In September 2025, I will begin pursuing a Ph.D. under Prof. Zhenghua Li at Soochow University through the combined Master-Ph.D. program.
Before that, I received my Bachelor's degree in Computer Science (2019-2023) from Soochow University.
My research interests include Natural Language Processing (NLP) and Large Language Models (LLMs). I am committed to building high-performance, efficient deployment algorithms for LLMs and to exploring more economical and convenient applications of LLMs across various fields.
Exploration is limitless; innovation knows no boundaries.
🔥 News
- 2025.05: 🎉🎉 Two papers were accepted by ACL 2025!
- 2024.10: 🎉🎉 One paper received the Outstanding Paper Award at NLPCC 2024!
- 2024.08: 🎉🎉 One paper was accepted by NLPCC 2024!
- 2024.05: 🎉🎉 We released our new pretrained model, OpenBA-V2!
📝 Publications
ACL 2025 Findings
ALW: Adaptive Layer-Wise Contrastive Decoding Enhancing Reasoning Ability in Large Language Models
Yuechi Zhou, Chuyue Zhou, Jianxin Zhang, Juntao Li, Min Zhang
ACL 2025 Main
Accurate KV Cache Quantization with Outlier Tokens Tracing
Yi Su*, Yuechi Zhou*, Quantong Qiu, Juntao Li, Qingrong Xia, Ping Li, Xinyu Duan, Zhefeng Wang, Min Zhang
NLPCC 2024 Outstanding Paper
The Benefits in Shallow: Merge Decoding Across Large Language Model Layers
Yuechi Zhou, Chuyue Zhou, Wenjing Xie, Xinrui Wang, Jiuchang Chen, Zhenghua Ni, Juntao Li
arXiv 2024.05
OpenBA-V2: Reaching 77.3% High Compression Ratio with Fast Multi-Stage Pruning
Dan Qiao*, Yi Su*, Pinzheng Wang*, Jing Ye, Wenjing Xie, Yuechi Zhou, Yuyang Ding, Zecheng Tang, Jikai Wang, Yixin Ji, Yue Wang, Pei Guo, Zechen Sun, Zikang Zhang, Juntao Li, Pingfu Chao, Wenliang Chen, Guohong Fu, Guodong Zhou, Qiaoming Zhu, Min Zhang
arXiv 2025.03
Chinese Grammatical Error Correction Based on Knowledge Distillation
Peng Xia, Yuechi Zhou, Ziyan Zhang, Zecheng Tang, Juntao Li
🗞️ Projects
- Participated in the pre-training of OpenBA-V2 at Soochow University (evaluation).
- Participated in SFT code development for LLaMA-2 and OpenBA-V2 on the Chinese AI chip Tecorigin T100 (evaluation).
- Developed prompt selection algorithms for model scheduling (Huawei Cloud).
- Worked on generation acceleration for large language models (Huawei Cloud).
🎖 Honors and Awards
- Third Prize in the Teco-RAG Large Model Development Challenge
- Soochow University Outstanding Graduate Student
- Soochow University First-Class Academic Scholarship
- National Encouragement Scholarship
- Soochow University Diligence and Encouragement Scholarship
📖 Education
- 2019.09 - 2023.06, Bachelor's degree, Institute of Computer Science and Technology, Soochow University, Suzhou.
- 2023.06 - present, second-year Master's student, Artificial Intelligence Research Institute, Soochow University, Suzhou.
💻 Internships
- 2024.11 - 2025.02, Huawei Cloud System AI Innovation Lab, Hangzhou, China.