Multitask parsing across semantic representations

Daniel Hershcovich, Omri Abend, Ari Rappoport

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

58 Scopus citations

Abstract

The ability to consolidate information of different types is at the core of intelligence, and has tremendous practical value in allowing learning for one task to benefit from generalizations learned for others. In this paper we tackle the challenging task of improving semantic parsing performance, taking UCCA parsing as a test case, and AMR, SDP and Universal Dependencies (UD) parsing as auxiliary tasks. We experiment on three languages, using a uniform transition-based system and learning architecture for all parsing tasks. Despite notable conceptual, formal and domain differences, we show that multitask learning significantly improves UCCA parsing in both in-domain and out-of-domain settings. Our code is publicly available.
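
The abstract describes hard parameter sharing: a single encoder shared across all parsing tasks, with task-specific components on top. The paper's actual system (a uniform transition-based parser, available in the authors' public code) is considerably more elaborate; the minimal PyTorch sketch below only illustrates that sharing pattern. All names, dimensions, and transition-set sizes here are illustrative assumptions, not values from the paper.

```python
import torch
import torch.nn as nn

class MultitaskTransitionParser(nn.Module):
    """Shared BiLSTM encoder with one transition classifier per task
    (main task: UCCA; auxiliary tasks: AMR, SDP, UD)."""

    def __init__(self, vocab_size, emb_dim, hidden_dim, actions_per_task):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Encoder parameters are shared across all parsing tasks,
        # so auxiliary-task gradients also update them.
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        # Each task gets its own classifier over its own transition set.
        self.heads = nn.ModuleDict({
            task: nn.Linear(2 * hidden_dim, n_actions)
            for task, n_actions in actions_per_task.items()
        })

    def forward(self, token_ids, task):
        states, _ = self.encoder(self.embed(token_ids))
        # Score the task's transitions from the shared representation.
        return self.heads[task](states)

# Hypothetical usage: alternate batches between the main and auxiliary
# tasks so the shared encoder benefits from all of them.
model = MultitaskTransitionParser(
    vocab_size=10_000, emb_dim=100, hidden_dim=200,
    actions_per_task={"ucca": 12, "amr": 15, "sdp": 9, "ud": 8})
batch = torch.randint(0, 10_000, (2, 7))  # 2 sentences, 7 tokens each
scores = model(batch, task="ucca")        # shape: (2, 7, 12)
```

Note that a real transition-based parser classifies actions from parser-state features rather than scoring tokens directly; the sketch simplifies that step to keep the focus on the shared-encoder design the abstract describes.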

Original language: English
Title of host publication: ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers)
Publisher: Association for Computational Linguistics (ACL)
Pages: 373-385
Number of pages: 13
ISBN (Electronic): 9781948087322
State: Published - 2018
Event: 56th Annual Meeting of the Association for Computational Linguistics, ACL 2018 - Melbourne, Australia
Duration: 15 Jul 2018 - 20 Jul 2018

Publication series

Name: ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers)
Volume: 1

Conference

Conference: 56th Annual Meeting of the Association for Computational Linguistics, ACL 2018
Country/Territory: Australia
City: Melbourne
Period: 15/07/18 - 20/07/18

Bibliographical note

Publisher Copyright:
© 2018 Association for Computational Linguistics
