Proper noun compounds, e.g., “Covid vaccine”, convey information in a succinct manner (a “Covid vaccine” is a “vaccine that immunizes against the Covid disease”). These are commonly used in short-form domains, such as news headlines, but are largely ignored in information-seeking applications. To address this limitation, we release a new manually annotated dataset, PRONCI, consisting of 22.5K proper noun compounds along with their free-form semantic interpretations. PRONCI is 60 times larger than prior noun compound datasets and also includes non-compositional examples, which have not been previously explored. We experiment with various neural models for automatically generating the semantic interpretations of proper noun compounds, ranging from few-shot prompting to supervised learning, with varying degrees of knowledge about the constituent nouns. We find that adding targeted knowledge, particularly about the common noun, results in performance gains of up to 2.8%. Finally, we integrate our model-generated interpretations with an existing Open IE system and observe a 7.5% increase in yield at a precision of 85%. The dataset and code are available at https://github.com/dair-iitd/pronci.
|Original language||American English|
|Number of pages||14|
|State||Published - 2022|
|Event||2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022 - Abu Dhabi, United Arab Emirates|
Duration: 7 Dec 2022 → 11 Dec 2022
|Conference||2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022|
|Country/Territory||United Arab Emirates|
|Period||7/12/22 → 11/12/22|
Bibliographical note (Funding Information):
Keshav is supported by a TCS Fellowship. Mausam is supported by grants from Huawei, Google, Verisk, and a Jai Gupta Chair Fellowship. We thank the KnowDis team for their help with data annotations, and HPC, IIT Delhi, for the computational resources. This work was supported in part by research grant no. 2088 from the Israeli Ministry of Science and Technology.
© 2022 Association for Computational Linguistics.