Object classification with convolution neural network based on the time-frequency representation of their echo

Mariia Dmitrieva, Matias Valdenegro Toro, Keith Brown, Gary Heald, David Lane

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

11 Citations (Scopus)

Abstract

This paper presents the classification of spherical objects with different physical properties. The classification is based on the energy distribution in wideband pulses scattered from the objects. The echo is represented in the Time-Frequency Domain (TFD) using the Short Time Fourier Transform (STFT) with different window lengths and is fed into a Convolution Neural Network (CNN) for classification. The results for different window lengths are analysed to study the influence of time and frequency resolution on classification. The CNN achieves its best result, an accuracy of (98.44 ± 0.8)% over 5 object classes, when trained on grayscale TFD images computed with a 0.1 ms STFT window. The CNN is compared with a Multilayer Perceptron classifier, a Support Vector Machine, and Gradient Boosting.
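
The abstract describes a pipeline in which a scattered wideband echo is converted into a grayscale TFD image via the STFT, with the window length controlling the time/frequency resolution trade-off. The following is a minimal sketch of that preprocessing step, not the authors' code: the sampling rate, the synthetic chirp standing in for a recorded echo, and the set of window lengths are assumptions for illustration only.

```python
# Minimal sketch (not the authors' code): turning a wideband echo into
# grayscale time-frequency images via STFT with different window lengths.
# Sampling rate, signal, and window lengths below are illustrative assumptions.
import numpy as np
from scipy import signal

fs = 500_000                      # assumed sampling rate of the echo recording (Hz)
t = np.arange(0, 0.005, 1 / fs)   # 5 ms of signal
echo = signal.chirp(t, f0=20e3, t1=t[-1], f1=150e3)  # stand-in for a recorded echo

def tfd_image(x, fs, window_length_s):
    """STFT magnitude spectrogram scaled to a grayscale [0, 1] image.

    Longer windows give finer frequency resolution but coarser time
    resolution; shorter windows do the opposite. This is the trade-off
    the paper studies by varying the STFT window length.
    """
    nperseg = int(window_length_s * fs)
    _, _, Z = signal.stft(x, fs=fs, nperseg=nperseg, noverlap=nperseg // 2)
    mag_db = 20 * np.log10(np.abs(Z) + 1e-12)
    # Normalise to [0, 1] so the spectrogram can be treated as a grayscale image.
    return (mag_db - mag_db.min()) / (mag_db.max() - mag_db.min())

# The paper reports its best CNN accuracy for a 0.1 ms STFT window.
for w in (0.05e-3, 0.1e-3, 0.2e-3):
    img = tfd_image(echo, fs, w)
    print(f"window {w * 1e3:.2f} ms -> TFD image shape {img.shape}")
```

Each resulting image would then serve as the input to the classifier (CNN, MLP, SVM, or Gradient Boosting) described in the abstract.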

Original language: English
Title of host publication: 2017 IEEE International Workshop on Machine Learning for Signal Processing (MLSP)
Publisher: IEEE
ISBN (Electronic): 9781509063413
DOIs
Publication status: Published - 7 Dec 2017
Event: 2017 IEEE 27th International Workshop on Machine Learning for Signal Processing - Tokyo, Japan
Duration: 25 Sept 2017 to 28 Sept 2017

Conference

Conference: 2017 IEEE 27th International Workshop on Machine Learning for Signal Processing
Abbreviated title: MLSP 2017
Country/Territory: Japan
City: Tokyo
Period: 25/09/17 to 28/09/17

Keywords

  • Convolution neural networks
  • Object classification
  • Time-frequency representation
  • Wideband pulses

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Signal Processing
