Automated nudity recognition using very deep residual learning network

Rasoul Banaeeyan, Hezerul Abdul Karim, Haris Lye, Mohammad Faizal Ahmad Fauzi, Sarina Mansor, John See

Research output: Contribution to journal › Article › peer-review

4 Citations (Scopus)

Abstract

The exponentially growing volume of pornographic material poses many challenges to modern daily life, particularly where children and minors have unrestricted access to the internet. In Malaysia, all local and foreign films must obtain suitability approval before distribution or public viewing, and this process of screening the visual content of every TV channel imposes a large censorship cost on service providers such as Unifi TV. To address this issue, this paper proposes to use an emerging Deep Learning (DL) model, the Residual Learning Convolutional Neural Network (ResNet), to automate nudity detection in visual content. The pre-trained ResNet model, with 101 layers, was used for transfer learning to solve a new binary classification problem of nudity versus non-nudity. The performance of the proposed model was evaluated on a newly created dataset comprising more than 4k samples of nudity and non-nudity images. In experiments on this nudity dataset, the deep learning method achieved a best performance of 70.42% F-score, 84.04% accuracy, and 93.72% AUC.
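The abstract does not specify the framework or training configuration, so the following is only a minimal sketch, assuming PyTorch/torchvision and scikit-learn, of how a pre-trained 101-layer ResNet could be adapted for the binary nudity/non-nudity task and scored with the reported metrics (F-score, accuracy, AUC). All function names, layer choices, and hyperparameters below are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn as nn
from torchvision import models
from sklearn.metrics import f1_score, accuracy_score, roc_auc_score

# Load the 101-layer residual network with ImageNet weights (assumed starting point).
model = models.resnet101(pretrained=True)

# Freeze the convolutional backbone so only the new head is trained
# (one common transfer-learning choice; the paper does not state this).
for param in model.parameters():
    param.requires_grad = False

# Replace the 1000-way ImageNet classifier with a 2-way head: nudity vs non-nudity.
model.fc = nn.Linear(model.fc.in_features, 2)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.fc.parameters(), lr=1e-3, momentum=0.9)

# ... fine-tune on the nudity dataset, then evaluate with the metrics
# reported in the abstract. y_score is the predicted probability of the
# 'nudity' class for each test image.
def evaluate(y_true, y_pred, y_score):
    return {
        "f1": f1_score(y_true, y_pred),
        "accuracy": accuracy_score(y_true, y_pred),
        "auc": roc_auc_score(y_true, y_score),
    }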

Original language: English
Pages (from-to): 136-141
Number of pages: 6
Journal: International Journal of Recent Technology and Engineering
Volume: 8
Issue number: 3S
DOIs
Publication status: Published - Oct 2019

Keywords

  • Convolutional neural network
  • Deep learning
  • Nudity recognition
  • Residual learning block

ASJC Scopus subject areas

  • General Engineering
  • Management of Technology and Innovation
