Detecting emotions in conversations between driver and in-car information systems

Christian Martyn Jones, Ing Marie Jonsson

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution

11 Citations (Scopus)

Abstract

Speech interaction with in-car controls is becoming more commonplace, as speech is considered less distracting to the driver. Cars of today are equipped with speech recognition systems to dial phone numbers and to control the cockpit environment. Furthermore, satellite navigation systems provide the driver with verbal directions to their destination. This paper extends the speech interaction between driver and car to consider automatic recognition of the driver's emotional state and appropriate responses by the car to improve the driver's mood. The emotional state of the driver has been found to influence driving performance, and by actively responding to the driver's emotions the car could improve their driving. © Springer-Verlag Berlin Heidelberg 2005.
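The abstract does not describe the recognition method itself, but systems of this kind typically derive emotion cues from prosodic features of the speech signal. As an illustrative sketch only (not the authors' method), the fragment below computes two such cues, frame-level RMS energy and zero-crossing rate, over a synthetic waveform standing in for recorded driver speech:

```python
import math

def frame_features(samples, rate, frame_ms=25):
    """Split a mono signal into fixed-length frames and compute two
    prosodic cues often used in speech-emotion work:
    RMS energy and zero-crossing rate (hypothetical helper)."""
    n = max(1, int(rate * frame_ms / 1000))  # samples per frame
    feats = []
    for i in range(0, len(samples) - n + 1, n):
        frame = samples[i:i + n]
        rms = math.sqrt(sum(s * s for s in frame) / n)
        # Fraction of adjacent sample pairs whose signs differ.
        zcr = sum(1 for a, b in zip(frame, frame[1:]) if a * b < 0) / n
        feats.append((rms, zcr))
    return feats

# Synthetic 440 Hz tone as a stand-in for a short speech recording.
rate = 8000
tone = [math.sin(2 * math.pi * 440 * t / rate) for t in range(rate)]
feats = frame_features(tone, rate)
```

A classifier would then map such per-frame feature vectors to emotion labels; the choice of features and classifier here is an assumption for illustration, not drawn from the paper.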

Original language: English
Title of host publication: Affective Computing and Intelligent Interaction - First International Conference, ACII 2005, Proceedings
Pages: 780-787
Number of pages: 8
Volume: 3784 LNCS
Publication status: Published - 2005
Event: 1st International Conference on Affective Computing and Intelligent Interaction 2005 - Beijing, China
Duration: 22 Oct 2005 - 24 Oct 2005

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 3784 LNCS
ISSN (Print): 0302-9743

Conference

Conference: 1st International Conference on Affective Computing and Intelligent Interaction 2005
Abbreviated title: ACII 2005
Country/Territory: China
City: Beijing
Period: 22/10/05 - 24/10/05
