Full metadata record
DC Field | Value | Language
dc.contributor.author | Katrin Utai | en_US
dc.contributor.author | Marcus Nagle | en_US
dc.contributor.author | Simone Hämmerle | en_US
dc.contributor.author | Wolfram Spreer | en_US
dc.contributor.author | Busarakorn Mahayothee | en_US
dc.contributor.author | Joachim Müller | en_US
dc.description.abstract | © 2018 Asian Agricultural and Biological Engineering Association. Computer-aided estimation of mass for irregularly shaped fruits is a constructive advancement towards improved post-harvest technologies. In image processing of unsymmetrical and varying samples, object recognition and feature extraction are challenging tasks. This paper presents developed algorithms that transform images of the mango cultivar ‘Nam Dokmai’ to simplify subsequent object recognition tasks; the extracted features, such as length, width, thickness, and area, are further used as inputs in an artificial neural network (ANN) model to estimate the fruit mass. Seven different approaches are presented and discussed, explaining the application of specific algorithms to obtain the fruit dimensions and to estimate the fruit mass. The performances of the different image processing approaches were evaluated. Overall, all treatments gave satisfactory results, with highest success rates of 97% and highest coefficients of efficiency of 0.99 using two input parameters (area and thickness) or three input parameters (length, width, and thickness). | en_US
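The abstract describes feeding extracted fruit dimensions into an ANN to regress mass. A minimal sketch of that idea, assuming synthetic data only: here mass is approximated by an ellipsoid volume with an assumed flesh density of 1.0 g/cm³, and a small one-hidden-layer network is trained by plain gradient descent. None of this reproduces the authors' actual model, data, or network architecture.

```python
# Hypothetical sketch (NOT the paper's model): map fruit dimensions
# (length, width, thickness, in cm) to mass (g) with a tiny neural network.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic, roughly mango-sized dimensions (cm); density of 1.0 g/cm^3 is an assumption
X = rng.uniform([9.0, 6.0, 5.0], [14.0, 9.0, 8.0], size=(200, 3))
y = (np.pi / 6.0) * X.prod(axis=1) * 1.0  # ellipsoid volume * assumed density

# Standardize inputs and targets for stable training
mu, sigma = X.mean(axis=0), X.std(axis=0)
Xn = (X - mu) / sigma
ym, ys = y.mean(), y.std()
yn = (y - ym) / ys

# One hidden layer of 8 tanh units, trained by full-batch gradient descent
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(3000):
    h = np.tanh(Xn @ W1 + b1)        # hidden activations
    out = (h @ W2 + b2).ravel()      # predicted (standardized) mass
    err = out - yn                   # residuals
    # Backpropagation of the mean squared error
    g2 = h.T @ err[:, None] / len(y)
    gh = (err[:, None] @ W2.T) * (1 - h**2)
    g1 = Xn.T @ gh / len(y)
    W2 -= lr * g2; b2 -= lr * err.mean()
    W1 -= lr * g1; b1 -= lr * gh.mean(axis=0)

pred_g = out * ys + ym  # back to grams

# Coefficient of efficiency (Nash-Sutcliffe), the metric named in the abstract
ce = 1 - np.sum((y - pred_g) ** 2) / np.sum((y - y.mean()) ** 2)
```

The coefficient of efficiency computed at the end is the same goodness-of-fit measure the abstract reports (0.99 for the paper's best configurations); on this smooth synthetic target the toy network fits well, but the value here says nothing about the published results.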
dc.subject | Agricultural and Biological Sciences | en_US
dc.subject | Chemical Engineering | en_US
dc.title | Mass estimation of mango fruits (Mangifera indica L., cv. ‘Nam Dokmai’) by linking image processing and artificial neural network | en_US
article.title.sourcetitle | Engineering in Agriculture, Environment and Food | en_US
Appears in Collections:CMUL: Journal Articles

Files in This Item:
There are no files associated with this item.

Items in CMUIR are protected by copyright, with all rights reserved, unless otherwise indicated.