Probabilistic Grammar Induction in an Incremental Semantic Framework

Arash Eshghi, Matthew Purver, Julian Hough, Yo Sato

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Citations (Scopus)


We describe a method for learning an incremental semantic grammar from a corpus in which sentences are paired with logical forms as predicate-argument structure trees. Working in the framework of Dynamic Syntax, and assuming a set of generally available compositional mechanisms, we show how lexical entries can be learned as probabilistic procedures for the incremental projection of semantic structure, providing a grammar suitable for use in an incremental probabilistic parser. By inducing these from a corpus generated using an existing grammar, we demonstrate that this results in both good coverage and compatibility with the original entries, without requiring annotation at the word level. We show that this semantic approach to grammar induction has the novel ability to learn the syntactic and semantic constraints on pronouns.
Original language: English
Title of host publication: Constraint Solving and Language Processing
Subtitle of host publication: 7th International Workshop, CSLP 2012, Orléans, France, September 13-14, 2012, Revised Selected Papers
Editors: Denys Duchier, Yannick Parmentier
Number of pages: 16
ISBN (Electronic): 9783642415784
ISBN (Print): 9783642415777
Publication status: Published - 2013
Event: 7th International Workshop on Constraint Solving and Language Processing 2012, Orléans, France
Duration: 13 Sept 2012 - 14 Sept 2012

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer Berlin Heidelberg
ISSN (Print): 0302-9743


Conference: 7th International Workshop on Constraint Solving and Language Processing 2012

ASJC Scopus subject areas

  • General Computer Science
  • Theoretical Computer Science


