Please use this identifier to cite or link to this item: http://cmuir.cmu.ac.th/jspui/handle/6653943832/62542
Title: Mass estimation of mango fruits (Mangifera indica L., cv. ‘Nam Dokmai’) by linking image processing and artificial neural network
Authors: Katrin Utai
Marcus Nagle
Simone Hämmerle
Wolfram Spreer
Busarakorn Mahayothee
Joachim Müller
Keywords: Agricultural and Biological Sciences;Chemical Engineering;Engineering
Issue Date: 1-Jan-2018
Abstract: © 2018 Asian Agricultural and Biological Engineering Association. Computer-aided mass estimation of irregularly shaped fruits is a constructive advancement towards improved post-harvest technologies. In image processing of unsymmetrical and varying samples, object recognition and feature extraction are challenging tasks. This paper presents algorithms that transform images of the mango cultivar ‘Nam Dokmai’ to simplify subsequent object recognition; the extracted features (length, width, thickness, and area) were then used as inputs to an artificial neural network (ANN) model to estimate fruit mass. Seven different approaches are presented and discussed, explaining the application of specific algorithms to obtain the fruit dimensions and to estimate the fruit mass. The performances of the different image processing approaches were evaluated. Overall, all treatments gave satisfactory results, with a highest success rate of 97% and a highest coefficient of efficiency of 0.99 using either two input parameters (area and thickness) or three input parameters (length, width, and thickness).
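The workflow summarized in the abstract (geometric features extracted from images, fed to an ANN regressor that predicts mass) can be sketched as follows. This is a hedged illustration only: the dimension ranges, the ellipsoid-based mass formula, and the synthetic data are assumptions standing in for the paper's image-derived measurements, and the generic scikit-learn regressor is not the authors' actual model.

```python
# Illustrative sketch: a small feed-forward ANN mapping fruit dimensions
# (length, width, thickness, in mm) to mass (g). All data here is synthetic;
# the real study derived these features from images of 'Nam Dokmai' mangoes.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 500
length = rng.uniform(90, 140, n)     # hypothetical dimension ranges (mm)
width = rng.uniform(50, 80, n)
thickness = rng.uniform(45, 70, n)
# Assumed ground truth: mass roughly proportional to an ellipsoid volume
# (pi/6 * L * W * T), scaled to grams, plus measurement noise.
mass = 0.55 * (np.pi / 6) * length * width * thickness / 1000 \
    + rng.normal(0, 5, n)

# Three-input variant (length, width, thickness), as in the abstract.
X = np.column_stack([length, width, thickness])
X_train, X_test, y_train, y_test = train_test_split(X, mass, random_state=0)

# Scaling the inputs is important for MLP convergence.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0),
)
model.fit(X_train, y_train)
r2 = model.score(X_test, y_test)
print(f"Test R^2: {r2:.3f}")
```

With smooth synthetic data like this, the small network fits the dimension-to-mass relationship closely; the paper's reported coefficient of efficiency of 0.99 reflects a comparable goodness-of-fit measure on real fruit data.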
URI: https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85055892420&origin=inward
http://cmuir.cmu.ac.th/jspui/handle/6653943832/62542
ISSN: 1881-8366
Appears in Collections:CMUL: Journal Articles

