S&M 4046, Research Paper of Special Issue
https://doi.org/10.18494/SAM5591
Published: May 30, 2025

Harnessing Deep Neural Networks for Rapid Knife Wound Identification in Forensic Science: A Proof-of-concept Study

Chun-Ta Wei, Jeff Cheng-Lung Lee, and Yu-Guang Chen
(Received February 6, 2025; Accepted April 28, 2025)

Keywords: crime scene investigations, forensic science, deep learning, field programmable gate array, knife mark
An automated knife-wound weapon-identification system that integrates forensic science, image processing, and deep learning is introduced in this study. Aimed at improving the analysis of tool marks, particularly knife-related injuries common in violent crimes, this research addresses the limitations of traditional forensic methods, such as the scarcity of real wound images and the need for fast processing. To overcome these challenges, a simulated wound feature dataset was developed to train a LeNet-5 neural network model, which is deployed on a portable field-programmable-gate-array-based system. This enables the real-time recognition of tools from wound images captured by a camera sensor, achieving an average recognition accuracy of 98.8% on the test dataset and processing at least 600 recognition operations per second, providing forensic experts with rapid and accurate insights. The importance of continuously refining the neural network and expanding the dataset to include real wound images is emphasized in this study, with the aim of further improving accuracy. This system has the potential to transform crime scene investigations, offering quick, objective, and reliable analyses that can expedite weapon identification and aid in the swift resolution of criminal cases.
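As a rough illustration of the network named in the abstract, the following is a minimal PyTorch sketch of a LeNet-5-style classifier for wound images. The input resolution (32 × 32 grayscale), the number of tool classes, and all layer names are assumptions for illustration only and are not taken from the paper, which deploys the trained network on a field-programmable gate array rather than running it in a Python framework.

```python
# Minimal LeNet-5-style classifier sketch (illustrative only).
# Assumptions not taken from the paper: 32x32 grayscale inputs and
# NUM_CLASSES tool categories.
import torch
import torch.nn as nn

NUM_CLASSES = 5  # hypothetical number of knife/tool classes


class LeNet5(nn.Module):
    def __init__(self, num_classes: int = NUM_CLASSES):
        super().__init__()
        # Two convolution + pooling stages, as in the classic LeNet-5 layout.
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5),   # 32x32 -> 28x28
            nn.ReLU(),
            nn.AvgPool2d(2),                  # 28x28 -> 14x14
            nn.Conv2d(6, 16, kernel_size=5),  # 14x14 -> 10x10
            nn.ReLU(),
            nn.AvgPool2d(2),                  # 10x10 -> 5x5
        )
        # Fully connected classifier head.
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 5 * 5, 120),
            nn.ReLU(),
            nn.Linear(120, 84),
            nn.ReLU(),
            nn.Linear(84, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


if __name__ == "__main__":
    model = LeNet5()
    dummy = torch.randn(1, 1, 32, 32)  # one simulated wound-feature image
    print(model(dummy).shape)          # torch.Size([1, NUM_CLASSES])
```

A compact, fixed-topology network of this kind is a natural fit for the FPGA deployment described in the abstract, since its small parameter count and regular layer structure map readily onto hardware with limited resources.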
Corresponding author: Yu-Guang Chen

This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article: Chun-Ta Wei, Jeff Cheng-Lung Lee, and Yu-Guang Chen, Harnessing Deep Neural Networks for Rapid Knife Wound Identification in Forensic Science: A Proof-of-concept Study, Sens. Mater., Vol. 37, No. 5, 2025, pp. 2115-2133.