A Consolidated Open Knowledge Representation for Multiple Texts

Rachel Wities*, Vered Shwartz, Gabriel Stanovsky, Meni Adler, Ori Shapira, Shyam Upadhyay, Dan Roth, Eugenio Martinez Camara, Iryna Gurevych, Ido Dagan

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

We propose to move from Open Information Extraction (OIE) ahead to Open Knowledge Representation (OKR), aiming to represent information conveyed jointly in a set of texts in an open text-based manner. We do so by consolidating OIE extractions using entity and predicate coreference, while modeling information containment between coreferring elements via lexical entailment. We suggest that generating OKR structures can be a useful step in the NLP pipeline, to give semantic applications an easy handle on consolidated information across multiple texts.
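To make the abstract's idea concrete, here is a minimal, hypothetical sketch of consolidating OIE extractions into an OKR-like structure. The data model (`Extraction`, `Cluster`, `consolidate`) and the coreference maps are illustrative assumptions, not the paper's actual representation: coreferring entity and predicate mentions are grouped into clusters, so that tuples from different texts expressing the same fact collapse into one consolidated proposition.

```python
# Hypothetical sketch of OKR-style consolidation (not the paper's code):
# OIE tuples from multiple texts are merged by mapping coreferring entity
# and predicate mentions to shared cluster ids.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Extraction:
    subj: str
    pred: str
    obj: str

@dataclass
class Cluster:
    mentions: set = field(default_factory=set)

def consolidate(extractions, entity_coref, predicate_coref):
    """Group mentions into clusters using caller-supplied coreference maps
    (mention -> cluster id) and return consolidated propositions over
    cluster ids instead of surface strings."""
    entities, predicates, propositions = {}, {}, set()
    for ex in extractions:
        s = entity_coref.get(ex.subj, ex.subj)
        p = predicate_coref.get(ex.pred, ex.pred)
        o = entity_coref.get(ex.obj, ex.obj)
        entities.setdefault(s, Cluster()).mentions.add(ex.subj)
        entities.setdefault(o, Cluster()).mentions.add(ex.obj)
        predicates.setdefault(p, Cluster()).mentions.add(ex.pred)
        propositions.add((s, p, o))
    return entities, predicates, propositions

# Two sentences from different texts expressing the same fact:
exts = [
    Extraction("Obama", "visited", "Paris"),
    Extraction("The president", "arrived in", "the French capital"),
]
entity_coref = {"Obama": "E1", "The president": "E1",
                "Paris": "E2", "the French capital": "E2"}
predicate_coref = {"visited": "P1", "arrived in": "P1"}

entities, predicates, props = consolidate(exts, entity_coref, predicate_coref)
print(props)  # both extractions collapse to a single proposition
```

In the full OKR proposal, cluster members are additionally related by lexical entailment (e.g. "arrived in" entails being "in" a place), which this sketch leaves out for brevity.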

Original language: English
Title of host publication: LSDSem 2017 - 2nd Workshop on Linking Models of Lexical, Sentential and Discourse-Level Semantics, Proceedings of the Workshop
Publisher: Association for Computational Linguistics (ACL)
Pages: 12-24
Number of pages: 13
ISBN (Electronic): 9781945626401
State: Published - 2017
Externally published: Yes
Event: 2nd Workshop on Linking Models of Lexical, Sentential and Discourse-Level Semantics, LSDSem 2017 - Valencia, Spain
Duration: 3 Apr 2017 → …

Publication series

Name: LSDSem 2017 - 2nd Workshop on Linking Models of Lexical, Sentential and Discourse-Level Semantics, Proceedings of the Workshop

Conference

Conference: 2nd Workshop on Linking Models of Lexical, Sentential and Discourse-Level Semantics, LSDSem 2017
Country/Territory: Spain
City: Valencia
Period: 3/04/17 → …

Bibliographical note

Publisher Copyright:
© 2017 Association for Computational Linguistics
