Extracting Pasture Phenotype and Biomass Percentages using Weakly Supervised Multi-target Deep Learning on a Small Dataset

DC Field | Value | Language
dc.contributor.author | Narayanan, Badri | -
dc.contributor.author | Saadeldin, Mohamed | -
dc.contributor.author | Albert, Paul | -
dc.contributor.author | McGuinness, Kevin | -
dc.contributor.author | MacNamee, Brian | -
dc.description | The Irish Machine Vision and Image Processing Conference 2020 (IMVIP 2020), Sligo, Ireland (held online due to coronavirus outbreak), 31 August - 2 September 2020 | en_US
dc.description.abstract | The dairy industry uses clover and grass as fodder for cows. Accurate estimation of grass and clover biomass yield enables smart decisions in optimizing fertilization and seeding density, resulting in increased productivity and positive environmental impact. Grass and clover are usually planted together, since clover is a nitrogen-fixing plant that brings nutrients to the soil. Maintaining the right percentages of clover and grass in a field reduces the need for external fertilization. Existing approaches for estimating the grass-clover composition of a field are expensive and time-consuming: random samples of the pasture are clipped, and the components are physically separated to weigh and calculate the percentages of dry grass, clover, and weeds in each sample. There is growing interest in developing novel deep-learning-based approaches that non-destructively extract pasture phenotype indicators and biomass yield predictions for different plant species from agricultural imagery collected in the field. Providing these indicators and predictions from images alone remains a significant challenge. Heavy occlusion in the dense mixture of grass, clover, and weeds makes it difficult to estimate each component accurately. Moreover, although supervised deep learning models perform well with large datasets, it is tedious to acquire large and diverse collections of field images with precise ground truth for different biomass yields. In this paper, we demonstrate that applying data augmentation and transfer learning is effective in predicting multi-target biomass percentages of different plant species, even with a small training dataset. The scheme proposed in this paper used a training set of only 261 images and predicted the biomass percentages of grass, clover, white clover, red clover, and weeds with mean absolute errors (MAE) of 6.77%, 6.92%, 6.21%, 6.89%, and 4.80%, respectively. Evaluation and testing were performed on a publicly available dataset provided by the Biomass Prediction Challenge [Skovsen et al., 2019]. These results lay the foundation for our next set of experiments with semi-supervised learning to improve on these benchmarks, and further the quest to identify phenotype characteristics from imagery in a non-destructive way. | en_US
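The per-species mean absolute error (MAE) reported in the abstract can be sketched as follows. This is a minimal illustration in Python with NumPy of how a multi-target MAE is computed, one value per predicted species; the prediction and ground-truth arrays below are made-up demonstration values, not data from the paper or the Biomass Prediction Challenge dataset.

```python
import numpy as np

# Illustrative values only; columns are biomass percentages for
# grass, clover, white clover, red clover, and weeds per image.
y_true = np.array([
    [60.0, 25.0, 15.0, 10.0, 15.0],
    [50.0, 35.0, 20.0, 15.0, 15.0],
])
y_pred = np.array([
    [55.0, 30.0, 18.0, 12.0, 15.0],
    [52.0, 33.0, 17.0, 16.0, 15.0],
])

# MAE per target: average the absolute errors over images (axis 0),
# keeping one score per species, as in the abstract's reported results.
mae_per_species = np.mean(np.abs(y_true - y_pred), axis=0)
species = ["grass", "clover", "white clover", "red clover", "weeds"]
for name, mae in zip(species, mae_per_species):
    print(f"{name}: MAE = {mae:.2f}%")
```

Averaging per column rather than over the whole error matrix is what makes the metric multi-target: each species gets its own error score, so a model can be compared species by species.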
dc.description.sponsorship | Department of Agriculture, Food and the Marine | en_US
dc.description.sponsorship | Science Foundation Ireland | en_US
dc.subject | Computer vision | en_US
dc.subject | Deep learning | en_US
dc.subject | Transfer learning | en_US
dc.subject | Smart agriculture | en_US
dc.subject | Data augmentation | en_US
dc.subject | Weak supervision | en_US
dc.title | Extracting Pasture Phenotype and Biomass Percentages using Weakly Supervised Multi-target Deep Learning on a Small Dataset | en_US
dc.type | Conference Publication | en_US
dc.status | Peer reviewed | en_US
item.fulltext | With Fulltext | -
Appears in Collections:
- Computer Science Research Collection
- Insight Research Collection
Files in This Item:
File | Size | Format
IMVIP2020_Paper17_Camera_2.pdf | 2.05 MB | Adobe PDF