Abstract
This work reimplements a recent semantic bootstrapping child language acquisition (CLA) model, which was originally designed for English, and trains it to learn a new language: Hebrew. The model learns from pairs of utterances and logical forms as meaning representations, and acquires both syntax and word meanings simultaneously. The results show that the model mostly transfers to Hebrew, but that a number of factors, including the richer morphology in Hebrew, make the learning slower and less robust. This suggests that a clear direction for future work is to enable the model to leverage the similarities between different word forms.
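To make the training setup described above concrete, the sketch below shows one way an utterance paired with a logical-form meaning representation might be encoded. The transliterated Hebrew sentences, the lambda-free predicate notation, and the `TrainingPair` structure are all illustrative assumptions, not the model's actual data format.

```python
# Illustrative sketch of the training data described in the abstract:
# each example pairs a child-directed utterance with a logical form.
# The Hebrew sentences and the predicate notation are hypothetical
# placeholders, not the dataset's actual encoding.
from dataclasses import dataclass


@dataclass
class TrainingPair:
    utterance: list[str]   # tokenized utterance
    logical_form: str      # meaning representation


corpus = [
    # "the dog runs" (transliterated Hebrew), paired with its meaning
    TrainingPair(["ha-kelev", "ratz"], "run(dog)"),
    # "the girl loves an apple"
    TrainingPair(["ha-yalda", "ohevet", "tapuax"], "love(girl, apple)"),
]

# The learner must jointly infer a lexicon (word form -> syntactic
# category plus meaning fragment) and a parse for each utterance;
# here we only show the shape of the input data.
for pair in corpus:
    print(" ".join(pair.utterance), "->", pair.logical_form)
```

Because Hebrew inflects word forms richly (e.g. `ratz` vs. a feminine `ratza`), each surface form is a separate lexicon entry under this setup, which is one plausible reading of why the abstract points to sharing information between related word forms as future work.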
| Original language | English |
|---|---|
| Article number | 101714 |
| Journal | Computer Speech and Language |
| Volume | 90 |
| State | Published - Mar 2025 |
Bibliographical note
Publisher Copyright: © 2024
Keywords
- Child language acquisition
- Combinatory categorial grammar
- Semantic bootstrapping