TY - JOUR
T1 - Restorable-Inpainting: A Novel Deep Learning Approach for Shoeprint Restoration
AU - Hassan, Muhammad
AU - Wang, Yan
AU - Wang, Di
AU - Pang, Wei
AU - Wang, Kangping
AU - Li, Daixi
AU - Zhou, You
AU - Xu, Dong
N1 - Funding Information:
This research was supported by the National Natural Science Foundation of China (Grant Nos. 61772227, 61972174, and 61972175), the Science and Technology Development Foundation of Jilin Province (Nos. 20180201045GX, 20200201300JC, 20200401083GX, and 20200201163JC), and the Paul K. and Diane Shumaker Endowment Fund to DX.
Publisher Copyright:
© 2022 Elsevier Inc.
PY - 2022/7
Y1 - 2022/7
N2 - Shoeprints are important evidence collected at crime scenes and are of great value for forensic analysis. Shoeprints collected in real-world scenarios are often unclear, abraded, and missing contextual and other information. In this research, we apply a novel deep learning technique called restorable inpainting to repair shoeprint contours and missing parts. Existing inpainting methods aim to fill artificially occluded areas with plausible pixels, but they may not restore the missing information in occluded shoeprint images. In addition, because no ground-truth shoeprints exist for the training samples, inpainting occluded regions is challenging. In this paper, we propose DeepShoePaint, a novel deep learning approach that performs restorable inpainting by restoring synthetic information resembling the desirable shoeprint images needed for forensics. DeepShoePaint adapts a probabilistic distribution borrowed from the variational autoencoder into a U-Net-like structure, forming a unified architecture trained in an unsupervised fashion to restore occluded and masked regions and produce human-verifiable shoeprints. The experimental results show that DeepShoePaint achieves outstanding results in both human inspection and statistical assessment and outperforms conventional inpainting models. We believe this study can provide valuable insights, not limited to inpainting, into restoring desirable shoeprints to automate and facilitate the forensic investigation and examination process in place of handcrafted methods.
AB - Shoeprints are important evidence collected at crime scenes and are of great value for forensic analysis. Shoeprints collected in real-world scenarios are often unclear, abraded, and missing contextual and other information. In this research, we apply a novel deep learning technique called restorable inpainting to repair shoeprint contours and missing parts. Existing inpainting methods aim to fill artificially occluded areas with plausible pixels, but they may not restore the missing information in occluded shoeprint images. In addition, because no ground-truth shoeprints exist for the training samples, inpainting occluded regions is challenging. In this paper, we propose DeepShoePaint, a novel deep learning approach that performs restorable inpainting by restoring synthetic information resembling the desirable shoeprint images needed for forensics. DeepShoePaint adapts a probabilistic distribution borrowed from the variational autoencoder into a U-Net-like structure, forming a unified architecture trained in an unsupervised fashion to restore occluded and masked regions and produce human-verifiable shoeprints. The experimental results show that DeepShoePaint achieves outstanding results in both human inspection and statistical assessment and outperforms conventional inpainting models. We believe this study can provide valuable insights, not limited to inpainting, into restoring desirable shoeprints to automate and facilitate the forensic investigation and examination process in place of handcrafted methods.
KW - Forensics
KW - Restorable inpainting
KW - Shoeprint
KW - U-Net
KW - Variational autoencoder
UR - http://www.scopus.com/inward/record.url?scp=85127359365&partnerID=8YFLogxK
U2 - 10.1016/j.ins.2022.03.080
DO - 10.1016/j.ins.2022.03.080
M3 - Article
SN - 0020-0255
VL - 600
SP - 22
EP - 42
JO - Information Sciences
JF - Information Sciences
ER -