I am a senior research engineer at working on natural language processing (NLP). You can reach me at .
My recent research interests center on the representation-learning perspective of NLP, specifically: (1) general pre-trained language models (PLMs) and their applications; (2) parameter-efficient tuning for PLMs, e.g., prompt-based learning; (3) efficient pre-training building blocks; (4) adversarial and contrastive representation learning.