Emotions analysis of speech for call classification

Esraa Ali Hassan*, Neamat El Gayar, Moustafa M. Ghanem

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Citations (Scopus)

Abstract

Most existing research in the area of emotion recognition has focused on short segments or utterances of speech. In this paper we propose a machine learning system for classifying the overall sentiment of long conversations as either Positive or Negative. Our system has three main phases: first, it divides a call into short segments; second, it applies machine learning to recognize the emotion of each segment; finally, it learns a binary classifier that takes the recognized emotions of the individual segments as features. We investigate different approaches for this final phase, varying both how the emotions of individual segments are aggregated and which classification model is used. We present our experimental results and analysis based on a simulated data set collected specifically for this research.
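To make the final phase concrete, the following is a minimal sketch (not the authors' code) of aggregating recognized per-segment emotions into call-level features and training a binary Positive/Negative classifier. The four-emotion inventory, the proportion-based aggregation, and the choice of scikit-learn's LogisticRegression are all illustrative assumptions; the paper itself compares several aggregation approaches and classification models.

```python
# Illustrative sketch only: one plausible way to turn per-segment emotion
# labels into call-level features for Positive/Negative classification.
import numpy as np
from sklearn.linear_model import LogisticRegression

EMOTIONS = ["angry", "happy", "neutral", "sad"]  # assumed emotion inventory

def call_features(segment_emotions):
    """Aggregate one call's per-segment emotion labels into a single
    feature vector: the fraction of segments assigned to each emotion."""
    counts = np.zeros(len(EMOTIONS))
    for label in segment_emotions:
        counts[EMOTIONS.index(label)] += 1
    return counts / max(len(segment_emotions), 1)

# Toy training data: each call is the list of emotions recognized for its
# segments, paired with an overall sentiment label (1 = Positive, 0 = Negative).
calls = [
    (["happy", "happy", "neutral"], 1),
    (["neutral", "happy", "happy", "neutral"], 1),
    (["angry", "sad", "angry"], 0),
    (["neutral", "angry", "sad", "angry"], 0),
]
X = np.array([call_features(segments) for segments, _ in calls])
y = np.array([label for _, label in calls])

# Final-phase binary classifier over the aggregated features.
clf = LogisticRegression().fit(X, y)
print(clf.predict([call_features(["happy", "angry", "happy"])]))
```

Histogram-of-emotions features, as used here, are only one aggregation choice; alternatives such as emotion sequences or majority voting would slot into the same pipeline.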

Original language: English
Title of host publication: 2010 10th International Conference on Intelligent Systems Design and Applications
Publisher: IEEE
Pages: 242-247
Number of pages: 6
ISBN (Electronic): 9781424481361
ISBN (Print): 9781424481347
DOIs
Publication status: Published - 13 Jan 2011
Event: 10th International Conference on Intelligent Systems Design and Applications 2010 - Cairo, Egypt
Duration: 29 Nov 2010 – 1 Dec 2010

Conference

Conference: 10th International Conference on Intelligent Systems Design and Applications 2010
Abbreviated title: ISDA'10
Country/Territory: Egypt
City: Cairo
Period: 29/11/10 – 1/12/10

Keywords

  • Classification of calls
  • Emotions recognition
  • Machine learning
  • Speech analysis

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Science Applications
  • Hardware and Architecture
