Incremental Grammar Induction from Child-directed Dialogue Utterances

Arash Eshghi, Julian Hough, Matthew Purver

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

10 Citations (Scopus)


We describe a method for learning an incremental semantic grammar from data in which utterances are paired with logical forms representing their meaning. Working in an inherently incremental framework, Dynamic Syntax, we show how words can be associated with probabilistic procedures for the incremental projection of meaning, providing a grammar which can be used directly in incremental probabilistic parsing and generation. We test this on child-directed utterances from the CHILDES corpus, and show that it results in good coverage and semantic accuracy, without requiring annotation at the word level or any independent notion of syntax.
Original language: English
Title of host publication: Proceedings of the Fourth Annual Workshop on Cognitive Modeling and Computational Linguistics
Publisher: Association for Computational Linguistics
Number of pages: 10
Publication status: Published - 2013
Event: 51st Annual Meeting of the Association for Computational Linguistics - Sofia, Bulgaria
Duration: 4 Aug 2013 – 9 Aug 2013


Conference: 51st Annual Meeting of the Association for Computational Linguistics
Abbreviated title: ACL 2013

