Development of a speech-based augmented reality system to support exploration of cityscape

Phil Bartie, William Mackaness

Research output: Contribution to journal › Article › peer-review

40 Citations (Scopus)


When people explore new environments they often use landmarks as reference points to help them navigate and orient themselves. This paper examines how spatial datasets can be used to build a city-guide system for urban environments, one that announces Features of Interest (FoIs) as they become visible to the user (not merely proximal) while the user moves freely around the city. Visibility for each FoI was pre-calculated from a digital surface model derived from LIDAR (Light Detection and Ranging) data, and the results were stored in a text-based relational database management system (RDBMS) for rapid retrieval. All interaction between the user and the system was via a speech-based interface, allowing the user to record and request further information on any announced FoI. A prototype, the Edinburgh Augmented Reality System (EARS), was designed, implemented and field tested to assess the effectiveness of these ideas. The application proved to be an innovative, 'non-invasive' approach to augmenting the user's reality.
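The core pre-computation described above is a line-of-sight test between an observer location and each FoI over a gridded digital surface model. The following is a minimal illustrative sketch of such a test, not the authors' implementation; the function name `is_visible`, the grid representation, and the sampling strategy are assumptions for demonstration only.

```python
import numpy as np

def is_visible(dsm, observer, target, observer_height=1.7):
    """Check line of sight between two cells of a gridded DSM.

    Illustrative only -- real viewshed tools use more careful ray
    traversal and account for Earth curvature and cell resolution.

    dsm: 2D array of surface elevations (metres), one value per cell.
    observer, target: (row, col) cell indices.
    observer_height: eye height added above the surface at the observer.
    """
    r0, c0 = observer
    r1, c1 = target
    eye = dsm[r0, c0] + observer_height
    tgt = dsm[r1, c1]
    steps = int(max(abs(r1 - r0), abs(c1 - c0)))
    if steps == 0:
        return True
    for i in range(1, steps):
        t = i / steps
        # elevation of the straight sight line at this fraction of the path
        line_z = eye + t * (tgt - eye)
        # nearest DSM cell along the path between observer and target
        r = int(round(r0 + t * (r1 - r0)))
        c = int(round(c0 + t * (c1 - c0)))
        if dsm[r, c] > line_z:
            return False  # a building or terrain blocks the view
    return True
```

Running this test for every FoI against a grid of candidate observer positions, and storing the resulting visibility sets in a database keyed by location, matches the pre-calculation-and-retrieval pattern the abstract describes.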
Original language: English
Pages (from-to): 63-86
Number of pages: 24
Journal: Transactions in GIS
Issue number: 1
Publication status: Published - Jan 2006

