Understanding Disrupted Sentences Using Underspecified Abstract Meaning Representation

Angus Addlesee, Marco Damonte

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)


Voice assistant accessibility is generally overlooked as today's spoken dialogue systems are trained on huge corpora to help them understand the 'average' user. This raises frustrating barriers for certain user groups as their speech shifts from the average. People with dementia pause more frequently mid-sentence for example, and people with hearing impairments may mispronounce words learned post-diagnosis. We explore whether semantic parsing can improve accessibility for people with nonstandard speech, and consequently become more robust to external disruptions like dogs barking, sirens passing, or doors slamming mid-utterance. We generate corpora of disrupted sentences paired with their underspecified Abstract Meaning Representation (AMR) graphs, and use these to train pipelines to understand and repair disruptions. Our best disruption recovery pipeline lost only 1.6% graph similarity f-score when compared to a model given the full original sentence.
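The abstract describes pairing disrupted sentences with underspecified Abstract Meaning Representation (AMR) graphs and evaluating with a graph-similarity F-score. As a minimal illustrative sketch, not the authors' code or notation: the example sentence, the triple encoding, the `amr-unknown` placeholder (borrowed from standard AMR), and the naive F-score below are all assumptions standing in for the paper's underspecification scheme and for Smatch.

```python
# Illustrative sketch only: AMR encodes a sentence as a rooted graph of
# (source, role, target) triples. A mid-utterance disruption can be
# modelled by leaving the interrupted argument underspecified with a
# placeholder concept; the paper's exact notation may differ.

FULL = [                                   # "Set a timer for ten minutes"
    ("s", ":instance", "set-02"),
    ("s", ":ARG1", "t"),
    ("t", ":instance", "timer"),
    ("s", ":duration", "d"),
    ("d", ":instance", "temporal-quantity"),
    ("d", ":quant", "10"),
    ("d", ":unit", "m"),
    ("m", ":instance", "minute"),
]

DISRUPTED = [                              # "Set a timer for <dog barks>"
    ("s", ":instance", "set-02"),
    ("s", ":ARG1", "t"),
    ("t", ":instance", "timer"),
    ("s", ":duration", "u"),
    ("u", ":instance", "amr-unknown"),     # underspecified argument
]

def triple_f1(pred, gold):
    """Naive graph-similarity F-score: precision/recall over shared
    triples. (Smatch, the metric behind the paper's 1.6% figure,
    additionally searches over variable alignments; using identical
    variable names in both graphs makes that step trivial here.)"""
    shared = len(set(pred) & set(gold))
    if shared == 0:
        return 0.0
    p = shared / len(set(pred))
    r = shared / len(set(gold))
    return 2 * p * r / (p + r)

print(round(triple_f1(DISRUPTED, FULL), 3))  # → 0.462
```

The underspecified graph still scores partial credit against the full-sentence graph because the intact prefix ("Set a timer for …") yields the same root and `:ARG1` triples; a repair pipeline would then try to fill the `amr-unknown` slot.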

Original language: English
Title of host publication: Proceedings of the Annual Conference of the International Speech Communication Association
Number of pages: 5
Publication status: Published - 2023
Event: 24th International Speech Communication Association Conference 2023 - Dublin, Ireland
Duration: 20 Aug 2023 – 24 Aug 2023


Conference: 24th International Speech Communication Association Conference 2023
Abbreviated title: INTERSPEECH 2023


Keywords
  • accessibility
  • human-computer interaction
  • semantic parsing
  • spoken dialogue systems

ASJC Scopus subject areas

  • Language and Linguistics
  • Human-Computer Interaction
  • Signal Processing
  • Software
  • Modelling and Simulation


