Tencent AI Lab
Zhaopeng Tu is a principal researcher at Tencent AI Lab, whose research focuses on deep learning for natural language processing (NLP).
He is currently working on neural machine translation (NMT) and Seq2Seq learning for other NLP tasks, such as dialogue and question answering.
He has published over 30 papers in leading NLP/AI journals and conferences such as ACL, EMNLP, TACL, AAAI, and IJCAI.
He has served as a machine translation area co-chair for ACL 2019, EMNLP 2018-2019, and NAACL 2019, and as a senior PC member for AAAI 2019.
- 05/15/2019: Two papers were accepted to ACL 2019.
- 02/23/2019: Four papers were accepted to NAACL 2019.
- 11/01/2018: Three papers were accepted to AAAI 2019.
- 08/10/2018: Four papers were accepted to EMNLP 2018.
- 07/27/2018: A lecture at the CIPS Summer School, giving a survey of recent progress on improving NMT models (covering 70 papers). [slides (in Chinese)]
- 04/21/2018: One paper on robust NMT was accepted to ACL 2018.
Improving Language Understanding for the Transformer
- Representation Learning
- Zi-Yi Dou, Zhaopeng Tu*, Xing Wang, Shuming Shi, and Tong Zhang. Exploiting Deep Representations for Neural Machine Translation. EMNLP 2018.
- Zi-Yi Dou, Zhaopeng Tu*, Xing Wang, Longyue Wang, Shuming Shi, and Tong Zhang. Dynamic Layer Aggregation for Neural Machine Translation with Routing-by-Agreement. AAAI 2019.
- Jie Hao, Xing Wang, Baosong Yang, Longyue Wang, Jinfeng Zhang, and Zhaopeng Tu*. Modeling Recurrence for Transformer. NAACL 2019.
- Xing Wang, Zhaopeng Tu, Longyue Wang, and Shuming Shi. Exploiting Sentential Context for Neural Machine Translation. ACL 2019 (Short).
- Self-Attention Networks
- Baosong Yang, Zhaopeng Tu*, Derek F. Wong, Fandong Meng, Lidia S. Chao, and Tong Zhang. Modeling Localness for Self-Attention Networks. EMNLP 2018.
- Baosong Yang, Jian Li, Derek F. Wong, Lidia S. Chao, Xing Wang, and Zhaopeng Tu*. Context-Aware Self-Attention Networks. AAAI 2019.
- Baosong Yang, Longyue Wang, Derek F. Wong, Lidia S. Chao, and Zhaopeng Tu*. Convolutional Self-Attention Networks. NAACL 2019 (Short).
- Baosong Yang, Longyue Wang, Derek F. Wong, Lidia S. Chao, and Zhaopeng Tu*. Assessing the Ability of Self-Attention Networks to Learn Word Order. ACL 2019.
- Adequacy Modeling
- Zhaopeng Tu, Zhengdong Lu, Yang Liu, Xiaohua Liu, and Hang Li. Modeling Coverage for Neural Machine Translation. ACL 2016. [code]
- Zhaopeng Tu, Yang Liu, Zhengdong Lu, Xiaohua Liu, and Hang Li. Context Gates for Neural Machine Translation. TACL 2017. [code]
- Zhaopeng Tu, Yang Liu, Lifeng Shang, Xiaohua Liu, and Hang Li. Neural Machine Translation with Reconstruction. AAAI 2017. [code]
- Xiang Kong, Zhaopeng Tu*, Shuming Shi, Eduard Hovy, and Tong Zhang. Neural Machine Translation with Adequacy-Oriented Learning. AAAI 2019.
Selected Professional Services
- Computational Linguistics: Reviewer
- ACL: Area Chair (2019), PC member (2017, 2018)
- EMNLP: Area Chair (2018, 2019)
- NAACL: Area Chair (2019)
- AAAI: Senior PC member (2019)