We describe a method for learning an incremental semantic grammar from a corpus in which sentences are paired with logical forms as predicate-argument structure trees. Working in the framework of Dynamic Syntax, and assuming a set of generally available compositional mechanisms, we show how lexical entries can be learned as probabilistic procedures for the incremental projection of semantic structure, providing a grammar suitable for use in an incremental probabilistic parser. By inducing these from a corpus generated using an existing grammar, we demonstrate that this results in both good coverage and compatibility with the original entries, without requiring annotation at the word level. We show that this semantic approach to grammar induction has the novel ability to learn the syntactic and semantic constraints on pronouns.
Name: Lecture Notes in Computer Science
Publisher: Springer Berlin Heidelberg
Conference: 7th International Workshop on Constraint Solving and Language Processing 2012
Period: 13/09/12 → 14/09/12
Subject areas:
- Computer Science (all)
- Theoretical Computer Science