A Deep Learning Approach for Hand and Arm Bone Fracture Detection Based on ResNet18 and EfficientNetB0 Using X-ray Images
DOI: https://doi.org/10.57041/vol4iss2%20(Suppl.)pp6-15

Keywords: bone fracture detection, deep learning, ResNet18, EfficientNetB0, X-ray images

Abstract
Quick diagnosis and successful treatment depend on the accurate identification of bone fractures. Despite their value, traditional diagnostic techniques such as MRI and CT scans are frequently costly, less accessible, and may fail to reveal small fractures. Manual X-ray image assessment is time-consuming and prone to human error, which can delay critical clinical decisions. This paper presents an automated deep learning-based method for detecting and classifying bone fractures using two advanced convolutional neural network architectures: ResNet18 and EfficientNetB0. The study's dataset consists of hand and arm X-ray images labelled as normal (4,383 images) or fractured (4,480 images). Preprocessing and data augmentation were applied to improve image quality and reduce overfitting. EfficientNetB0 outperformed the other evaluated model, achieving an accuracy of 99%, precision ranging from 0.97 to 1.00, and an F1-score of 0.99, demonstrating strong classification performance. The ResNet18 model was also reliable, identifying fracture patterns robustly with an accuracy of 98%. These findings demonstrate how deep learning can automate fracture identification quickly and accurately. Moreover, EfficientNetB0's lightweight design makes it suitable for real-time clinical applications, even on systems with limited resources. This approach can help clinicians identify fractures early and accurately, reduce errors, and substantially increase diagnostic efficiency.
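To illustrate the kind of transfer-learning pipeline the abstract describes, the sketch below fine-tunes an ImageNet-pretrained ResNet18 or EfficientNetB0 for binary fractured-vs-normal classification with PyTorch and torchvision. The folder layout, augmentation choices, hyperparameters, and the build_model helper are illustrative assumptions, not the authors' exact implementation.

import torch
import torch.nn as nn
from torchvision import datasets, models, transforms
from torch.utils.data import DataLoader

# Augmentation and preprocessing for training (assumed values; the paper
# only states that preprocessing and augmentation were applied).
train_tfms = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),   # X-rays are single-channel
    transforms.Resize((224, 224)),
    transforms.RandomHorizontalFlip(),
    transforms.RandomRotation(10),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

# Assumed directory layout: data/train/{fractured,normal}/
train_ds = datasets.ImageFolder("data/train", transform=train_tfms)
train_loader = DataLoader(train_ds, batch_size=32, shuffle=True)

def build_model(name: str, num_classes: int = 2) -> nn.Module:
    """Load an ImageNet-pretrained backbone and replace its classifier head."""
    if name == "resnet18":
        model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        model.fc = nn.Linear(model.fc.in_features, num_classes)
    elif name == "efficientnet_b0":
        model = models.efficientnet_b0(weights=models.EfficientNet_B0_Weights.DEFAULT)
        model.classifier[1] = nn.Linear(model.classifier[1].in_features, num_classes)
    else:
        raise ValueError(f"unknown backbone: {name}")
    return model

device = "cuda" if torch.cuda.is_available() else "cpu"
model = build_model("efficientnet_b0").to(device)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(10):          # epoch count is illustrative
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

Swapping "efficientnet_b0" for "resnet18" in build_model is enough to train the second backbone under the same assumed pipeline; only the final classification layer differs between the two architectures.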
License
Copyright (c) 2024 https://pjosr.com/index.php/pjosr/cr

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
