We describe a method for learning an incremental semantic grammar from data in which utterances are paired with logical forms representing their meaning. Working in an inherently incremental framework, Dynamic Syntax, we show how words can be associated with probabilistic procedures for the incremental projection of meaning, yielding a grammar that can be used directly in incremental probabilistic parsing and generation. We test this on child-directed utterances from the CHILDES corpus, and show that it achieves good coverage and semantic accuracy, without requiring word-level annotation or any independent notion of syntax.
Title of host publication: Proceedings of the Fourth Annual Workshop on Cognitive Modeling and Computational Linguistics
Publisher: Association for Computational Linguistics
Number of pages: 10
Publication status: Published - 2013
Event: 51st Annual Meeting of the Association for Computational Linguistics (ACL 2013), Sofia, Bulgaria, 4–9 Aug 2013
Eshghi, A., Hough, J., & Purver, M. (2013). Incremental Grammar Induction from Child-directed Dialogue Utterances. In Proceedings of the Fourth Annual Workshop on Cognitive Modeling and Computational Linguistics (pp. 94-103). Association for Computational Linguistics.