Neural AMR: Sequence-to-Sequence Models for Parsing and Generation

Ioannis Konstas, Srinivasan Iyer, Mark Yatskar, Yejin Choi, Luke Zettlemoyer

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

73 Citations (Scopus)

Abstract

Sequence-to-sequence models have shown strong performance across a broad range of applications. However, their application to parsing and generating text using Abstract Meaning Representation (AMR) has been limited, due to the relatively small amount of labeled data and the non-sequential nature of the AMR graphs. We present a novel training procedure that can lift this limitation using millions of unlabeled sentences and careful preprocessing of the AMR graphs. For AMR parsing, our model achieves competitive results of 62.1 SMATCH, the current best score reported without significant use of external semantic resources. For AMR generation, our model establishes a new state-of-the-art performance of BLEU 33.8. We present extensive ablative and qualitative analysis including strong evidence that sequence-based AMR models are robust against ordering variations of graph-to-sequence conversions.
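The "graph-to-sequence conversions" mentioned in the abstract refer to linearizing an AMR graph into a flat token sequence that a sequence-to-sequence model can consume. The sketch below shows one plausible depth-first linearization of a Penman-style nested AMR structure; the data representation and token scheme here are illustrative assumptions, not the authors' actual preprocessing pipeline.

```python
def linearize(node):
    """Flatten a nested AMR node into a token list via depth-first traversal.

    A node is (concept, [(relation, child), ...]); a child is either another
    nested node or a plain string constant (used here to stand in for a
    re-entrant variable, a simplification of full AMR).
    """
    concept, edges = node
    tokens = ["(", concept]
    for relation, child in edges:
        tokens.append(relation)
        if isinstance(child, tuple):
            tokens.extend(linearize(child))  # recurse into nested subgraph
        else:
            tokens.append(child)             # leaf constant / re-entrancy
    tokens.append(")")
    return tokens

# AMR for "The boy wants to go":
# (w / want-01 :ARG0 (b / boy) :ARG1 (g / go-01 :ARG0 b))
amr = ("want-01", [(":ARG0", ("boy", [])),
                   (":ARG1", ("go-01", [(":ARG0", "boy")]))])
print(" ".join(linearize(amr)))
# → ( want-01 :ARG0 ( boy ) :ARG1 ( go-01 :ARG0 boy ) )
```

Different traversal orders of the same graph yield different token sequences; the paper's ablations report that sequence-based AMR models are robust to such ordering variations.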
Original language: English
Title of host publication: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Publisher: Association for Computational Linguistics
Pages: 146-157
Number of pages: 12
ISBN (Print): 9781945626753
DOIs: 10.18653/v1/P17-1014
Publication status: Published - 1 Jul 2017
Event: 55th Annual Meeting of the Association for Computational Linguistics 2017 - Vancouver, Canada
Duration: 1 Jul 2017 - 4 Aug 2017

Conference

Conference: 55th Annual Meeting of the Association for Computational Linguistics 2017
Abbreviated title: ACL 2017
Country: Canada
City: Vancouver
Period: 1/07/17 - 4/08/17

Cite this

Konstas, I., Iyer, S., Yatskar, M., Choi, Y., & Zettlemoyer, L. (2017). Neural AMR: Sequence-to-Sequence Models for Parsing and Generation. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) (pp. 146-157). Association for Computational Linguistics. https://doi.org/10.18653/v1/P17-1014