Bone fracture classification using convolutional neural network architecture for high-accuracy image classification

Solikhun Solikhun, Agus Perdana Windarto, Putrama Alkhairi

Abstract


This research introduces an innovative method for bone fracture classification using convolutional neural networks (CNNs) for high-accuracy image classification. The study addresses the subjectivity and limited accuracy of traditional diagnostic methods. By harnessing the capability of CNNs to autonomously extract hierarchical features from medical images, this research surpasses the limitations of manual interpretation and existing automated systems. The goal is to create a robust CNN-based methodology for precise and reliable fracture classification, potentially transforming current diagnostic practice. The dataset for this research is sourced from Kaggle's public medical image repository, ensuring a diverse range of fracture images. This study highlights CNNs' potential to significantly enhance diagnostic precision, leading to more effective treatment and improved patient care in orthopedics. The novelty lies in applying a purpose-built CNN architecture to fracture classification, an area not extensively explored before. Testing results show a significant improvement in classification accuracy: the proposed model achieves an accuracy of 0.9922, compared to 0.9844 for ResNet50. The research suggests that adopting CNN-based systems in medical practice can enhance diagnostic accuracy, optimize treatment plans, and improve patient outcomes.
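The hierarchical feature extraction the abstract refers to is built from convolution, nonlinearity, and pooling layers. The paper's actual architecture is not specified in this abstract, so the following is only a minimal NumPy sketch of those three primitive operations on a toy single-channel "image" with a vertical edge, using a hypothetical hand-set edge-detector kernel rather than learned weights:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2-D cross-correlation of a single-channel image."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Elementwise rectified linear unit."""
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Non-overlapping max pooling; trims edges that don't fit a full window."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

# Toy 6x6 image: dark left half, bright right half (a vertical edge at column 3).
image = np.zeros((6, 6))
image[:, 3:] = 1.0

# Hand-set vertical-edge kernel (in a trained CNN these weights are learned).
kernel = np.array([[-1.0, 1.0],
                   [-1.0, 1.0]])

feature_map = max_pool(relu(conv2d(image, kernel)))
print(feature_map.shape)  # (2, 2) -- the edge response survives pooling
```

In a real classifier these layers are stacked, the kernels are learned by backpropagation, and the final feature maps feed fully connected layers with a softmax over fracture classes; frameworks such as TensorFlow or PyTorch implement all of this, and the sketch above only makes the underlying arithmetic concrete.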

Keywords


Bone fractures; Convolutional neural network; Deep learning; Image classification; Medical imaging



DOI: http://doi.org/10.11591/ijece.v14i6.pp6466-6477

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

International Journal of Electrical and Computer Engineering (IJECE)
p-ISSN 2088-8708, e-ISSN 2722-2578

This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).