Abstract
The ability to consolidate information of different types is at the core of intelligence, and has tremendous practical value in allowing learning for one task to benefit from generalizations learned for others. In this paper we tackle the challenging task of improving semantic parsing performance, taking UCCA parsing as a test case, and AMR, SDP and Universal Dependencies (UD) parsing as auxiliary tasks. We experiment on three languages, using a uniform transition-based system and learning architecture for all parsing tasks. Despite notable conceptual, formal and domain differences, we show that multitask learning significantly improves UCCA parsing in both in-domain and out-of-domain settings. Our code is publicly available.
| Original language | English |
| --- | --- |
| Title of host publication | ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers) |
| Publisher | Association for Computational Linguistics (ACL) |
| Pages | 373-385 |
| Number of pages | 13 |
| ISBN (Electronic) | 9781948087322 |
| DOIs | |
| State | Published - 2018 |
| Event | 56th Annual Meeting of the Association for Computational Linguistics, ACL 2018 - Melbourne, Australia. Duration: 15 Jul 2018 → 20 Jul 2018 |
Publication series
| Name | ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers) |
| --- | --- |
| Volume | 1 |
Conference
| Conference | 56th Annual Meeting of the Association for Computational Linguistics, ACL 2018 |
| --- | --- |
| Country/Territory | Australia |
| City | Melbourne |
| Period | 15/07/18 → 20/07/18 |
Bibliographical note
Publisher Copyright: © 2018 Association for Computational Linguistics