OCNet-23: a fine-tuned transfer learning approach for oral cancer detection from histopathological images
Abstract
Oral squamous cell carcinoma (OSCC) is emerging as a significant global health concern, underscoring the need for prompt detection and treatment. Our study introduces a diagnostic method for OSCC that leverages artificial intelligence (AI) applied to histopathological images (HIs), with the primary objective of expediting the identification process for medical professionals. To achieve this, we employ transfer learning with well-established models: VGG16, VGG19, MobileNet_v1, MobileNet_v2, DenseNet, and InceptionV3. A key feature of our approach is the careful fine-tuning of the VGG19 architecture, paired with image preprocessing techniques such as contrast limited adaptive histogram equalization (CLAHE) and median blur. An ablation study with optimized hyperparameters culminated in 95.32% accuracy. Using a dataset of 5,192 images meticulously categorized into OSCC and normal classes, our work advances the field of OSCC detection: by providing medical professionals with a robust tool that enhances their diagnostic capabilities, the method supports accurate and timely diagnoses, improved patient outcomes, and a meaningful step forward in the application of AI to oral cancer diagnostics.
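In practice, the CLAHE-plus-median-blur preprocessing described above is usually applied with OpenCV (`cv2.createCLAHE` and `cv2.medianBlur`). As a dependency-free illustration of the underlying operations, the pure-Python sketch below applies global histogram equalization (the uncontrasted ancestor of CLAHE, which performs the same remapping per tile with a contrast clip) followed by a 3×3 median filter to a toy grayscale patch; the function names and toy data are illustrative, not taken from the paper.

```python
from statistics import median

def equalize_histogram(img):
    """Global histogram equalization on a 2D list of 8-bit grey levels.
    (CLAHE, used in the paper, does this per tile with a contrast clip.)"""
    flat = [p for row in img for p in row]
    n = len(flat)
    # Build the histogram and its cumulative distribution.
    hist = [0] * 256
    for p in flat:
        hist[p] += 1
    cdf, running = [0] * 256, 0
    for level in range(256):
        running += hist[level]
        cdf[level] = running
    cdf_min = min(c for c in cdf if c > 0)
    # Look-up table mapping each grey level through the normalized CDF.
    lut = [round((cdf[l] - cdf_min) / max(n - cdf_min, 1) * 255)
           for l in range(256)]
    return [[lut[p] for p in row] for row in img]

def median_blur_3x3(img):
    """3x3 median filter; border pixels are left unchanged for simplicity."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [img[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = int(median(window))
    return out

# Toy 4x4 patch with one salt-noise pixel (255) for the median filter to suppress.
patch = [
    [52, 55, 61, 59],
    [62, 59, 255, 65],
    [63, 65, 66, 68],
    [70, 71, 73, 75],
]
cleaned = median_blur_3x3(equalize_histogram(patch))
```

Equalization stretches the patch's narrow grey range across 0–255 (improving contrast as CLAHE would, but globally), and the median filter then replaces the outlier pixel with the median of its neighborhood, removing speckle noise without blurring edges the way a mean filter would.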
Keywords
Deep learning; Image processing; Oral cancer detection; Oral squamous cell carcinoma; Transfer learning
Full Text: PDF
DOI: http://doi.org/10.11591/ijece.v15i2.pp1826-1833
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
International Journal of Electrical and Computer Engineering (IJECE)
p-ISSN 2088-8708, e-ISSN 2722-2578
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).