About Me
Currently, I am a final-year Ph.D. candidate at the Hong Kong University of Science and Technology, supervised by Prof. Wei Wang. My research focuses on Natural Language Processing (NLP) and its applications.
Previously, I received an M.S. degree in Big Data Technology (advised by Prof. Fangzhen Lin) from the Hong Kong University of Science and Technology in 2021. I obtained a B.S. degree in Mathematics and Applied Mathematics (advised by Prof. Qingwen Wang) from Shanghai University in 2020.
I’m on the job market. If you are interested in research collaborations or have job recommendations, please feel free to contact me.
(This webpage was last updated on Oct 9, 2024)
Research Interests
- Instruction Tuning of Large Language Models, especially on enhancing and evaluating the capability of language models to comprehend and execute complex instructions accurately.
- Alignment Tuning of Large Language Models, concentrating on averting models’ unintended behaviors and ensuring LLMs act in line with human expectations.
- Contrastive Learning in NLP, focusing on leveraging contrastive learning to improve embedding quality and to enable more nuanced, context-aware language model behavior.
Selected Publications
Bridging and Modeling Correlations in Pairwise Data for Direct Preference Optimization
Yuxin Jiang, Bo Huang, Yufei Wang, Xingshan Zeng, Liangyou Li, Yasheng Wang, Xin Jiang, Lifeng Shang, Ruiming Tang, Wei Wang.
arXiv preprint, 2024 [pdf] [bibtex] [BMC]

Learning to Edit: Aligning LLMs with Knowledge Editing
Yuxin Jiang, Yufei Wang, Chuhan Wu, Wanjun Zhong, Xingshan Zeng, Jiahui Gao, Liangyou Li, Xin Jiang, Lifeng Shang, Ruiming Tang, Qun Liu, Wei Wang.
ACL-2024 [pdf] [bibtex] [LTE]

FollowBench: A Multi-level Fine-grained Constraints Following Benchmark for Large Language Models
Yuxin Jiang, Yufei Wang, Xingshan Zeng, Wanjun Zhong, Liangyou Li, Fei Mi, Lifeng Shang, Xin Jiang, Qun Liu, Wei Wang.
ACL-2024 [pdf] [bibtex] [FollowBench]

Lion: Adversarial Distillation of Proprietary Large Language Models
Yuxin Jiang, Chunkit Chan*, Mingyang Chen*, Wei Wang.
EMNLP-2023 [pdf] [bibtex] [Lion]

Global and Local Hierarchy-aware Contrastive Framework for Implicit Discourse Relation Recognition
Yuxin Jiang, Linhan Zhang, Wei Wang.
ACL Findings-2023 [pdf] [bibtex] [GOLF-for-IDRR]

Improved Universal Sentence Embeddings with Prompt-based Contrastive Learning and Energy-based Learning
Yuxin Jiang, Linhan Zhang, Wei Wang.
EMNLP Findings-2022 [pdf] [bibtex] [PromCSE]
More publications can be found [HERE].
Education
The Hong Kong University of Science and Technology, Ph.D. in Individualized Interdisciplinary Program (Data Science and Analytics), Sep. 2021 -- Jul. 2025 (Expected)
The Hong Kong University of Science and Technology, M.S. in Big Data Technology, Sep. 2020 -- Jul. 2021
Shanghai University, B.S. in Mathematics and Applied Mathematics, Sep. 2016 -- Jul. 2020
Awards
- [2024] ACL 2024 Outstanding Paper Award (Top 1%)
- [2024] Research Travel Grant Award at HKUST
- [2023] Top 3% Paper Recognition of all papers accepted at ICASSP 2023
- [2023] Research Travel Grant Award at HKUST
- [2021] Postgraduate Studentship at HKUST (Top 5%)
- [2021] School of Engineering Excellent Student Scholarship at HKUST (Top 5%)
- [2020] School of Engineering Entrance Scholarship at HKUST (Top 5%)
- [2020] Outstanding Graduates of Shanghai (Top 1%)
- [2019] Third Prize at the Science and Technology Innovation Competition for Undergraduates (Top 10%)
- [2016-2019] Grand Prize Scholarship, Leadership Scholarship, and Excellent Student at Shanghai University (Top 3%)
Talks
- December 2023: EMNLP Conference on Large Language Models and the Future of NLP. Lion: Adversarial Distillation of Proprietary Large Language Models. [slides] [video]
Teaching Assistant
- INFH6780: Career Development for Information Hub Students. (Spring 2024)
- INFH5000: Information Science and Technology: Essentials and Trends. (Fall 2022)
Academic Service
- Conference Reviewer: EMNLP'22/'23, ACL'23, ACL Rolling Review'23/'24, ICLR'25.
- Conference External Reviewer: DASFAA'21, SIGIR'22/'23, ICDE'23, NeurIPS'23/'24.