We propose to move ahead from Open Information Extraction (OIE) to Open Knowledge Representation (OKR), aiming to represent the information conveyed jointly by a set of texts in an open, text-based manner. We do so by consolidating OIE extractions using entity and predicate coreference, while modeling information containment between coreferring elements via lexical entailment. We suggest that generating OKR structures can be a useful step in the NLP pipeline, giving semantic applications an easy handle on consolidated information across multiple texts.
| Field | Value |
| --- | --- |
| Original language | American English |
| Title of host publication | LSDSem 2017 - 2nd Workshop on Linking Models of Lexical, Sentential and Discourse-Level Semantics, Proceedings of the Workshop |
| Publisher | Association for Computational Linguistics (ACL) |
| Number of pages | 13 |
| State | Published - 2017 |
| Event | 2nd Workshop on Linking Models of Lexical, Sentential and Discourse-Level Semantics, LSDSem 2017 - Valencia, Spain. Duration: 3 Apr 2017 → … |
| Name | LSDSem 2017 - 2nd Workshop on Linking Models of Lexical, Sentential and Discourse-Level Semantics, Proceedings of the Workshop |
| Conference | 2nd Workshop on Linking Models of Lexical, Sentential and Discourse-Level Semantics, LSDSem 2017 |
| Period | 3/04/17 → … |
Bibliographical note
Funding Information:
This work was supported in part by grants from the MAGNET program of the Israeli Office of the Chief Scientist (OCS) and the German Research Foundation through the German-Israeli Project Cooperation (DIP, grant DA 1600/1-1), and by Contract HR0011-15-2-0025 with the US Defense Advanced Research Projects Agency (DARPA).
© 2017 Association for Computational Linguistics