Robust features extraction for general fish classification
Abstract
Image recognition can be hampered by many problems, including noise, overlap, distortion, segmentation errors, and occlusion of objects within the image. Based on feature selection and the combination of major extracted features, this study attempts to establish a system that recognizes fish objects within an image using texture, anchor points, and statistical measurements. A generic fish classification is then executed, with classification evaluated through a meta-heuristic algorithm known as a Memetic Algorithm (a Genetic Algorithm with Simulated Annealing) combined with the back-propagation algorithm (the MA-B Classifier). Images of dangerous and non-dangerous fish are recognized: dangerous fish are further classified into the predatory or poisonous fish families, whereas non-dangerous fish are classified into the garden and food families. A total of 24 fish families, each comprising a different number of species, were used to test the proposed prototype. Classification was successfully carried out on 400 distinct fish images in the experimental tests, of which 250 were used for the training phase and 150 for the testing phase. The back-propagation algorithm and the proposed MA-B Classifier achieved general recognition accuracy rates of 82.25% and 90%, respectively.
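The abstract names the MA-B Classifier, a memetic algorithm pairing a genetic algorithm's global search with simulated annealing as a local refiner, but gives no implementation details. A minimal, stdlib-only Python sketch of such a memetic loop is shown below; the toy sigmoid model, synthetic data, and all numeric parameters (population size, mutation rate, cooling schedule) are illustrative assumptions, not the paper's actual method or features.

```python
import math
import random

random.seed(0)

# Synthetic stand-in data: (feature vector, label). The paper's real inputs
# would be texture, anchor-point, and statistical measurements of fish images.
DATA = [((0.2, 0.1), 0), ((0.9, 0.8), 1), ((0.1, 0.3), 0), ((0.8, 0.9), 1)]

def predict(w, x):
    """Toy sigmoid classifier: two weights plus a bias."""
    s = w[0] * x[0] + w[1] * x[1] + w[2]
    return 1.0 / (1.0 + math.exp(-s))

def fitness(w):
    """Mean squared error over the data (lower is better)."""
    return sum((predict(w, x) - y) ** 2 for x, y in DATA) / len(DATA)

def anneal(w, steps=50, temp=1.0, cool=0.9):
    """Simulated-annealing local refinement of one individual."""
    best, cur, cur_f = list(w), list(w), fitness(w)
    for _ in range(steps):
        cand = [g + random.gauss(0, 0.3) for g in cur]
        cand_f = fitness(cand)
        # Accept improvements always, worse moves with a temperature-dependent
        # probability; the temperature decays geometrically each step.
        if cand_f < cur_f or random.random() < math.exp((cur_f - cand_f) / temp):
            cur, cur_f = cand, cand_f
            if cur_f < fitness(best):
                best = list(cur)
        temp *= cool
    return best

def memetic(pop_size=20, gens=30):
    """GA loop (selection, crossover, mutation) with SA applied to offspring."""
    pop = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]          # keep the fitter half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, 3)
            child = a[:cut] + b[cut:]         # one-point crossover
            if random.random() < 0.3:         # mutation
                i = random.randrange(3)
                child[i] += random.gauss(0, 0.5)
            children.append(anneal(child))    # memetic step: SA local search
        pop = elite + children
    return min(pop, key=fitness)

best = memetic()
print(round(fitness(best), 4))
```

In the paper's setting the fitness would instead score back-propagation network parameters on the extracted fish features; the memetic structure (global GA moves refined by SA) is the part this sketch illustrates.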
Keywords
Feature extraction; anchor point measurements; texture measurements; statistical measurements; back-propagation algorithm; meta-heuristic algorithm.
Full Text: PDF
DOI: http://doi.org/10.11591/ijece.v9i6.pp5192-5204
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
International Journal of Electrical and Computer Engineering (IJECE)
p-ISSN 2088-8708, e-ISSN 2722-2578
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).