Multilevel auditory displays for mobile eyes-free location-based interaction

Yolanda Vazquez-Alvarez, Matthew P Aylett, Stephen A. Brewster, Rocio Von-Jungenfeld, Antti Virolainen

Research output: Chapter in Book/Report/Conference proceeding › Chapter

6 Citations (Scopus)


This paper explores the use of multilevel auditory displays to enable eyes-free mobile interaction with location-based information in a conceptual art exhibition space. Multilevel auditory displays enable user interaction with concentrated areas of information. However, it is necessary to consider how to present the auditory streams without overloading the user. We present an initial study in which a top-level exocentric sonification layer was used to advertise information present in a gallery-like space. Then, in a secondary interactive layer, three different conditions were evaluated that varied in the presentation (sequential versus simultaneous) and spatialisation (non-spatialised versus egocentric spatialisation) of multiple auditory sources. Results show that 1) participants spent significantly more time interacting with spatialised displays, 2) there was no evidence that a switch from an exocentric to an egocentric display increased workload or lowered satisfaction, and 3) there was no evidence that simultaneous presentation of spatialised Earcons in the secondary display increased workload.
Original language: English
Title of host publication: CHI'14 Extended Abstracts on Human Factors in Computing Systems
Place of publication: New York, NY, United States
Publisher: Association for Computing Machinery
Number of pages: 6
ISBN (Print): 9781450324748
Publication status: Published - 26 Apr 2014
Event: 2014 CHI Conference on Human Factors in Computing Systems - Toronto, ON, Canada
Duration: 26 Apr 2014 - 1 May 2014


Conference: 2014 CHI Conference on Human Factors in Computing Systems
Abbreviated title: CHI '14
City: Toronto, ON


  • Eyes-free interaction
  • auditory displays
  • spatial audio
  • H.5.2 [User Interfaces]
  • Interaction styles
  • evaluation

