Abstract
The ability to learn from large unlabeled corpora has allowed neural language models to advance the frontier in natural language understanding. However, existing self-supervision techniques operate at the word form level, which serves as a surrogate for the underlying semantic content. This paper proposes a method to employ weak supervision directly at the word sense level. Our model, named SenseBERT, is pre-trained to predict not only the masked words but also their WordNet supersenses. Accordingly, we attain a lexical-semantic level language model, without the use of human annotation. SenseBERT achieves significantly improved lexical understanding, as we demonstrate by experimenting on SemEval Word Sense Disambiguation, and by attaining a state-of-the-art result on the Word-in-Context task.
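The abstract describes pre-training with two prediction objectives over masked positions: the standard masked-word objective and an additional supersense objective. Below is a minimal sketch of such a dual-head masked-LM loss, not the authors' implementation; the class name `DualMaskedLMHead`, the sizes, and the use of 45 supersense classes (WordNet's lexicographer classes) are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the SenseBERT codebase): alongside the usual
# masked-word prediction head, a second classifier predicts the masked token's
# WordNet supersense, and the two cross-entropy losses are summed.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualMaskedLMHead(nn.Module):
    def __init__(self, hidden_size=768, vocab_size=30522, num_supersenses=45):
        super().__init__()
        self.word_head = nn.Linear(hidden_size, vocab_size)        # standard MLM head
        self.sense_head = nn.Linear(hidden_size, num_supersenses)  # weak supersense supervision

    def forward(self, hidden_states, word_labels, sense_labels):
        # hidden_states: (batch, seq_len, hidden_size) from any BERT-style encoder.
        # Labels hold -100 at unmasked positions so cross_entropy ignores them.
        word_logits = self.word_head(hidden_states)
        sense_logits = self.sense_head(hidden_states)
        word_loss = F.cross_entropy(word_logits.reshape(-1, word_logits.size(-1)),
                                    word_labels.reshape(-1), ignore_index=-100)
        sense_loss = F.cross_entropy(sense_logits.reshape(-1, sense_logits.size(-1)),
                                     sense_labels.reshape(-1), ignore_index=-100)
        return word_loss + sense_loss  # combined pre-training objective

# Toy usage with random tensors standing in for encoder output and labels.
if __name__ == "__main__":
    head = DualMaskedLMHead()
    hidden = torch.randn(2, 8, 768)
    word_labels = torch.full((2, 8), -100, dtype=torch.long)
    sense_labels = torch.full((2, 8), -100, dtype=torch.long)
    word_labels[0, 3], sense_labels[0, 3] = 1234, 7  # one masked position
    print(head(hidden, word_labels, sense_labels).item())
```

In this reading, the supersense labels come from WordNet rather than human annotators, which is what makes the supervision "weak" and keeps pre-training annotation-free.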
Original language | English |
---|---|
Title of host publication | ACL 2020 - 58th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference |
Publisher | Association for Computational Linguistics (ACL) |
Pages | 4656-4667 |
Number of pages | 12 |
ISBN (Electronic) | 9781952148255 |
State | Published - 2020 |
Externally published | Yes |
Event | 58th Annual Meeting of the Association for Computational Linguistics, ACL 2020 - Virtual, Online, United States; Duration: 5 Jul 2020 → 10 Jul 2020 |
Publication series
Name | Proceedings of the Annual Meeting of the Association for Computational Linguistics |
---|---|
ISSN (Print) | 0736-587X |
Conference
Conference | 58th Annual Meeting of the Association for Computational Linguistics, ACL 2020 |
---|---|
Country/Territory | United States |
City | Virtual, Online |
Period | 5/07/20 → 10/07/20 |
Bibliographical note
Publisher Copyright: © 2020 Association for Computational Linguistics