Optimizing olive disease classification through transfer learning with unmanned aerial vehicle imagery

El Mehdi Raouhi, Mohamed Lachgar, Hamid Hrimech, Ali Kartit

Abstract


Early detection of diseases in growing olive trees is essential for reducing costs and increasing productivity in this crucial economic activity. The quality and quantity of olive oil depend on the health of the fruit, so accurate and timely information on olive tree diseases is critical for monitoring growth and anticipating fruit output. The use of unmanned aerial vehicles (UAVs) and deep learning (DL) makes it possible to monitor olive diseases rapidly over large areas instead of relying on limited sampling methods. Moreover, the scarcity of research studies on olive disease detection motivated us to enrich the literature by introducing new disease classes and classification methods for this tree. In this study, we present a UAV system based on convolutional neural networks (CNNs) and transfer learning (TL). We constructed an olive disease dataset of 14K images, processed it, and trained several CNN architectures in addition to the proposed MobileNet-TL model for improved classification and generalization. The simulation results confirm that this model enables efficient disease classification, achieving 99% validation accuracy. In summary, TL has a positive impact on the MobileNet architecture, improving its performance and reducing training time for new tasks.
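The transfer-learning setup described above can be sketched as follows. This is a minimal illustrative example in Keras, not the authors' exact pipeline: the number of disease classes, the input resolution, and the classifier head are assumptions, and `weights=None` is used here only to keep the sketch offline (the paper's approach would start from ImageNet-pretrained weights and fine-tune on the olive dataset).

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 5  # hypothetical number of olive disease classes

# MobileNet as a feature extractor; in practice, weights="imagenet"
# would load the pretrained backbone used for transfer learning.
base = tf.keras.applications.MobileNet(
    input_shape=(224, 224, 3), include_top=False, weights=None)
base.trainable = False  # freeze the convolutional base

# New classification head trained on the olive disease dataset.
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.2),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

Freezing the backbone and training only the head is what shortens training time for the new task; a later fine-tuning stage may unfreeze some top layers at a lower learning rate.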

Keywords


Classification; Convolutional neural network; Olive diseases; Sensors; Transfer learning; Unmanned aerial vehicles;



DOI: http://doi.org/10.11591/ijece.v14i1.pp891-903

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

International Journal of Electrical and Computer Engineering (IJECE)
p-ISSN 2088-8708, e-ISSN 2722-2578

This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).