Jiwei Tang (汤济玮)

M.Eng. Student in Computer Science and Technology, Tsinghua University
Current Role: Research Intern at Alibaba Group
Advisor: Prof. Hai-Tao Zheng
Selected Highlight: Total paper views 460K+

About Me

I am a Master's student in Computer Science and Technology at Tsinghua University, advised by Prof. Hai-Tao Zheng. My research focuses on Natural Language Processing, Large Language Models, and Context Compression. I am currently a Research Intern at the Future Living Lab of Alibaba Group (foundation model team), working on efficient large language models.

Research Interests: Natural Language Processing · Large Language Models · Context Compression

Education

M.Eng., Computer Technology
Tsinghua University | 2024.09 – 2027.06 (Expected)
🏆 Department-level Second-class Scholarship
B.Eng., Computer Science and Technology
Jinan University | 2020.09 – 2024.06 | Rank: 1 / 99
🏆 National Scholarship · Outstanding Graduate · Outstanding Thesis · Excellent Student × 2 · First-class Scholarship × 2

Experience

Research Intern — Alibaba Group
Future Living Lab · Foundation Model Team · 2025.07 – Present
  • Working on context compression for efficient large language models
  • Published a first-author paper at ICLR 2026 (247K+ views)
  • Two papers under review: one at ACL 2026 (217K+ views) and one at ICML 2026

Skills

Programming & Tools: Python, PyTorch, LaTeX, Linux, SSH
Languages: Chinese (Native), English (CET-6)

Publications (* denotes equal contribution)

COMI: Coarse-to-Fine Context Compression via Marginal Information Gain
Jiwei Tang, et al.
ICLR 2026 (CCF-A, Scores: 8/6/6/6, 247K+ views)
Perception Compressor: A Training-Free Prompt Compression Framework in Long Context Scenarios
Jiwei Tang, et al.
NAACL 2025 Findings (CCF-B)
CoS: Towards Optimal Event Scheduling via Chain-of-Scheduling
Yuxin Zhao, Jiwei Tang, et al.
AAAI 2026 (CCF-A)
GMSA: Enhancing Context Compression via Group Merging and Layer Semantic Alignment
Jiwei Tang, et al.
ACL 2026 Under Review (CCF-A, Meta Review 3.50, OA: 3.50)
Read As Human: Compressing Context via Parallelizable Close Reading and Skimming
Jiwei Tang, et al.
ACL 2026 Under Review (CCF-A, Meta Review 3.50, OA: 3.33, 217K+ views)
Data Distribution Matters: A Data-Centric Perspective on Context Compression for Large Language Models
Kangtao Lv*, Jiwei Tang*, et al.
ICML 2026 Under Review (CCF-A)
When Hard Negatives Hurt: Bridging the Generative-Discriminative Gap in Hard Negative Synthesis
Zhicheng Zhang*, Jiwei Tang*, et al.
KDD 2026 Under Review (CCF-A)