Real-time classification of multi-modal sensory data for prosthetic hand control

Iris Kyranou, Agamemnon Krasoulis, Mustafa Suphi Erden, Kianoush Nazarpour, Sethu Vijayakumar

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

14 Citations (Scopus)
40 Downloads (Pure)

Abstract

Recent work on myoelectric prosthetic control has shown that incorporating accelerometry information along with surface electromyography (sEMG) has the potential to improve the performance and robustness of a prosthetic device by increasing classification accuracy. In this study, we investigated whether myoelectric control could further benefit from additional sensory modalities such as gyroscopes and magnetometers. We trained a multi-class linear discriminant analysis (LDA) classifier to discriminate between six hand grip patterns and used its predictions to control a robotic prosthetic hand in real time. We recorded initial training data using a total of 12 sEMG sensors, each of which integrated a 9-degree-of-freedom inertial measurement unit (IMU). For classification, four different decoding schemes were used: 1) sEMG and IMU from all sensors; 2) sEMG from all sensors; 3) IMU from all sensors; and, finally, 4) sEMG and IMU from a nearly optimal subset of sensors. These schemes were evaluated based on offline classification accuracy on the training data, as well as on task-related metrics such as completion rates and times for a real-time pick-and-place experiment. We found that the classifier trained with all sensory modalities and sensors (scheme 1) attained the best decoding performance, achieving a 90.4% completion rate with an average completion time of 41.9 s in real-time experiments. We also found that classifiers incorporating sEMG and IMU information outperformed, on average, those that used only sEMG signals, even when the former used fewer than half as many sensors. These results suggest that using extra modalities along with sEMG might be more beneficial than including additional sEMG sensors.
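The decoding pipeline described above can be sketched as follows. This is an illustrative example only, not the authors' code: the feature layout (one sEMG feature plus nine IMU channels per sensor), window counts, and the use of synthetic Gaussian data are all assumptions made for the sketch, with scikit-learn's LDA standing in for the multi-class classifier.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Hypothetical feature layout: 12 sensors, each contributing one sEMG
# feature (e.g. a mean-absolute-value per window) plus a 9-DOF IMU
# (accelerometer, gyroscope, magnetometer) -> 12 * (1 + 9) = 120 features.
n_sensors, semg_feats, imu_feats = 12, 1, 9
n_features = n_sensors * (semg_feats + imu_feats)
n_classes = 6    # six hand-grip patterns
n_windows = 300  # synthetic feature windows per class

# Synthetic, well-separated class clusters stand in for real recordings.
X = np.concatenate(
    [rng.normal(loc=3.0 * c, scale=1.0, size=(n_windows, n_features))
     for c in range(n_classes)]
)
y = np.repeat(np.arange(n_classes), n_windows)

# Fit the multi-class LDA decoder; in a real-time system, each new
# feature window would be passed to clf.predict() to select a grip.
clf = LinearDiscriminantAnalysis()
clf.fit(X, y)
print(f"training accuracy: {clf.score(X, y):.3f}")
```

Dropping the IMU columns (or the sEMG column) from `X` before fitting would mimic the single-modality schemes 2 and 3 compared in the study.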

Original language: English
Title of host publication: 2016 6th IEEE International Conference on Biomedical Robotics and Biomechatronics (BioRob)
Publisher: IEEE
Pages: 536-541
Number of pages: 6
ISBN (Electronic): 9781509032877
DOIs: 10.1109/BIOROB.2016.7523681
Publication status: Published - 28 Jul 2016
Event: 6th IEEE RAS/EMBS International Conference on Biomedical Robotics and Biomechatronics 2016 - Singapore, Singapore
Duration: 26 Jun 2016 – 29 Jun 2016

Publication series

Name: IEEE/RAS-EMBS International Conference on Biomedical Robotics and Biomechatronics
Publisher: IEEE
ISSN (Electronic): 2155-1782

Conference

Conference: 6th IEEE RAS/EMBS International Conference on Biomedical Robotics and Biomechatronics 2016
Abbreviated title: BioRob 2016
Country: Singapore
City: Singapore
Period: 26/06/16 – 29/06/16

ASJC Scopus subject areas

  • Artificial Intelligence
  • Biomedical Engineering
  • Mechanical Engineering


  • Cite this

Kyranou, I., Krasoulis, A., Erden, M. S., Nazarpour, K., & Vijayakumar, S. (2016). Real-time classification of multi-modal sensory data for prosthetic hand control. In 2016 6th IEEE International Conference on Biomedical Robotics and Biomechatronics (BioRob) (pp. 536-541). [7523681] (IEEE/RAS-EMBS International Conference on Biomedical Robotics and Biomechatronics). IEEE. https://doi.org/10.1109/BIOROB.2016.7523681