Abstract
Notwithstanding recent advances, syntactic generalization remains a challenge for text decoders. While some studies have shown gains from incorporating source-side symbolic syntactic and semantic structure into text generation Transformers, very little work has addressed the decoding of such structure. We propose a general transition-based approach to tree decoding. Examining the challenging test case of incorporating Universal Dependencies syntax into machine translation, we present substantial improvements on test sets that focus on syntactic generalization, while achieving improved or comparable performance on standard MT benchmarks. Further qualitative analysis addresses cases where syntactic generalization in the vanilla Transformer decoder is inadequate and demonstrates the advantages afforded by integrating syntactic information.
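The transition-based formalism behind such tree decoders can be illustrated with a minimal arc-standard sketch: instead of emitting a tree directly, the decoder emits a sequence of transitions (SHIFT, LEFT-ARC, RIGHT-ARC) that incrementally builds the dependency structure. The function name, action encoding, and example below are illustrative assumptions, not the paper's implementation:

```python
# Minimal sketch of an arc-standard transition system, the classic
# formalism used in transition-based dependency parsing/decoding.
# Names and action encoding here are illustrative assumptions.

def arc_standard(tokens, actions):
    """Apply a transition sequence and return labeled dependency arcs.

    tokens  -- list of word strings
    actions -- list of ("SHIFT",), ("LEFT-ARC", label), ("RIGHT-ARC", label)
    """
    buffer = list(range(len(tokens)))  # indices of tokens not yet read
    stack = []                         # partially processed token indices
    arcs = []                          # (head_index, label, dependent_index)
    for action in actions:
        if action[0] == "SHIFT":
            # Move the next buffer token onto the stack.
            stack.append(buffer.pop(0))
        elif action[0] == "LEFT-ARC":
            # Second-from-top becomes a dependent of the top.
            dep = stack.pop(-2)
            arcs.append((stack[-1], action[1], dep))
        elif action[0] == "RIGHT-ARC":
            # Top becomes a dependent of the second-from-top.
            dep = stack.pop()
            arcs.append((stack[-1], action[1], dep))
    return arcs

# "dogs bark": SHIFT, SHIFT, LEFT-ARC(nsubj) attaches "dogs" to "bark".
arcs = arc_standard(["dogs", "bark"],
                    [("SHIFT",), ("SHIFT",), ("LEFT-ARC", "nsubj")])
# arcs == [(1, "nsubj", 0)]
```

In the neural setting, the decoder predicts one such transition per step, so the usual sequence-decoding machinery of a Transformer carries over while the output is guaranteed to be a well-formed tree.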
| Original language | English |
| --- | --- |
| Title of host publication | CoNLL 2022 - 26th Conference on Computational Natural Language Learning, Proceedings of the Conference |
| Publisher | Association for Computational Linguistics (ACL) |
| Pages | 384-404 |
| Number of pages | 21 |
| ISBN (Electronic) | 9781959429074 |
| State | Published - 2022 |
| Event | 26th Conference on Computational Natural Language Learning, CoNLL 2022, collocated and co-organized with EMNLP 2022 - Abu Dhabi, United Arab Emirates. Duration: 7 Dec 2022 → 8 Dec 2022 |
Publication series
| Name | CoNLL 2022 - 26th Conference on Computational Natural Language Learning, Proceedings of the Conference |
| --- | --- |
Conference
| Conference | 26th Conference on Computational Natural Language Learning, CoNLL 2022, collocated and co-organized with EMNLP 2022 |
| --- | --- |
| Country/Territory | United Arab Emirates |
| City | Abu Dhabi |
| Period | 7/12/22 → 8/12/22 |
Bibliographical note
Publisher Copyright: © 2022 Association for Computational Linguistics.