Thai culture image classification with transfer learning

Munlika Rattaphun, Kritaphat Songsri-in

Abstract


Classifying images of Thai culture is important for a variety of applications, such as tourism, education, and cultural preservation. However, building a machine learning model from scratch to classify Thai cultural images can be challenging due to the limited availability of annotated data. In this study, we investigate the use of transfer learning for image classification on a dataset of Thai cultural images. We utilize three popular convolutional neural network models, namely MobileNet, EfficientNet, and residual network (ResNet), as baseline pre-trained models. Their performance was evaluated when they were trained from random initialization, used as feature extractors, and fully fine-tuned. The results showed that all three models performed better in terms of both accuracy and training time when used as feature extractors, with EfficientNet achieving the highest accuracy of 95.87% at a training time of 24 ms/iteration. To better understand the reasoning behind the models' predictions, we applied the gradient-weighted class activation mapping (Grad-CAM) visualization technique to generate heatmaps of the image regions the models attend to when making predictions. Both our quantitative and qualitative experiments demonstrate that transfer learning is an effective approach to image classification on Thai cultural images.
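To make the "feature extractor" versus "fully fine-tuned" regimes described above concrete, the following is a minimal sketch (not the authors' exact code) of how a pretrained EfficientNet backbone from keras.applications can be frozen and topped with a new classification head; NUM_CLASSES and the image size are placeholder assumptions.

```python
# Illustrative sketch only: transfer learning with a frozen pretrained backbone.
import tensorflow as tf
from tensorflow import keras

NUM_CLASSES = 10          # hypothetical number of Thai cultural categories
IMG_SIZE = (224, 224)

# Pretrained ImageNet backbone; include_top=False drops the original classifier,
# pooling="avg" yields a single feature vector per image.
backbone = keras.applications.EfficientNetB0(
    include_top=False, weights="imagenet",
    input_shape=IMG_SIZE + (3,), pooling="avg",
)
backbone.trainable = False  # "feature extractor" regime: backbone weights stay fixed

# New classification head trained on the target (Thai culture) dataset.
inputs = keras.Input(shape=IMG_SIZE + (3,))
x = keras.applications.efficientnet.preprocess_input(inputs)
x = backbone(x, training=False)
outputs = keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = keras.Model(inputs, outputs)

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# For the "fully fine-tuned" regime, one would instead unfreeze the backbone
# and recompile with a smaller learning rate, e.g.:
# backbone.trainable = True
# model.compile(optimizer=keras.optimizers.Adam(1e-5),
#               loss="sparse_categorical_crossentropy", metrics=["accuracy"])
```

The same pattern applies to MobileNet and ResNet by swapping the backbone class; the training-from-scratch baseline corresponds to passing weights=None and leaving the backbone trainable.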

Keywords


convolutional neural network; deep learning; image classification; Thai culture; transfer learning



DOI: http://doi.org/10.11591/ijece.v13i6.pp6259-6267

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

International Journal of Electrical and Computer Engineering (IJECE)
p-ISSN 2088-8708, e-ISSN 2722-2578

This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).