LiteEmo: Lightweight Deep Neural Networks for Image Emotion Recognition

Yan-Han Chew, Lai-Kuan Wong, John See, Huai-Qian Khor, Balasubramanian Abivishaq

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Psychology studies have shown that an image can evoke various emotions, depending on the visual features as well as the semantic content of the image. The ability to identify image emotion can be very useful for many applications, including image retrieval and aesthetics prediction. Notably, most existing deep learning-based emotion recognition models do not capitalize on additional semantic or contextual information and are computationally expensive. To overcome these limitations, we propose a lightweight multi-stream deep network that concatenates several MobileNet networks for image emotion analysis. The streams of the multi-stream deep network represent the core emotion recognition, object recognition, and image category recognition models, respectively. Experimental results demonstrate the effectiveness of the additional contextual information in producing performance comparable to state-of-the-art emotion models, but with fewer parameters, thus improving practicality.
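The late-fusion idea described in the abstract, concatenating embeddings from separate emotion, object, and category streams before a classifier head, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the per-stream feature dimensions, the number of emotion classes, and the linear head are all assumptions, and random vectors stand in for the MobileNet stream outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-stream embedding sizes (the abstract does not state
# the actual dimensions used in LiteEmo).
EMOTION_DIM, OBJECT_DIM, CATEGORY_DIM = 128, 128, 128
NUM_EMOTIONS = 8  # assumed number of emotion classes

def fuse_streams(f_emotion, f_object, f_category):
    """Late fusion: concatenate the three stream embeddings into one vector."""
    return np.concatenate([f_emotion, f_object, f_category])

# Stand-in embeddings for a single image; in the actual model these would
# come from the three MobileNet streams.
f_emo = rng.standard_normal(EMOTION_DIM)
f_obj = rng.standard_normal(OBJECT_DIM)
f_cat = rng.standard_normal(CATEGORY_DIM)

fused = fuse_streams(f_emo, f_obj, f_cat)  # 384-dim fused representation

# Assumed linear classifier head over the fused representation,
# followed by a numerically stable softmax.
W = rng.standard_normal((NUM_EMOTIONS, fused.size)) * 0.01
logits = W @ fused
probs = np.exp(logits - logits.max())
probs /= probs.sum()
```

The design choice illustrated here is that each stream can be pretrained on its own task (emotion, objects, scene categories) and only the small fused head needs to learn how to combine them, which keeps the parameter count low.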

Original language: English
Title of host publication: 2019 IEEE 21st International Workshop on Multimedia Signal Processing (MMSP)
Publisher: IEEE
ISBN (Electronic): 9781728118178
DOIs
Publication status: Published - 18 Nov 2019
Event: 21st IEEE International Workshop on Multimedia Signal Processing 2019 - Kuala Lumpur, Malaysia
Duration: 27 Sep 2019 - 29 Sep 2019

Conference

Conference: 21st IEEE International Workshop on Multimedia Signal Processing 2019
Abbreviated title: MMSP 2019
Country: Malaysia
City: Kuala Lumpur
Period: 27/09/19 - 29/09/19

Keywords

  • Image emotion
  • lightweight
  • multi-stream network

ASJC Scopus subject areas

  • Signal Processing
  • Media Technology

