Improving multi-label classification using inter-label associations and a new Kalman filter based ensemble method

Files in This Item:
Access to this item has been restricted by the copyright holder until: 2021-07-31
File: 7892871.pdf (3.21 MB, Adobe PDF); a copy may be requested
Title: Improving multi-label classification using inter-label associations and a new Kalman filter based ensemble method
Authors: Pakrashi, Arjun
Permanent link: http://hdl.handle.net/10197/11664
Date: 2020
Online since: 2020-11-04T08:58:47Z
Abstract: In machine learning, classification algorithms are used to train models to recognise the class, or category, that an object belongs to. Most classification problems are multi-class, in which an object can belong to at most one class. However, there are many important real-world problems in which an object can belong to more than one class simultaneously. These are known as multi-label classification problems, as an object can be labelled with more than one class. Multi-label classification algorithms in the literature range from very simple approaches, such as binary relevance, in which an independent binary classifier is built for each label, to sophisticated ensemble techniques, such as classifier chains, that build collections of interconnected classifiers. The most effective approaches tend to explicitly exploit relationships between the labels themselves, known as inter-label associations, and to use ensembles. There is an opportunity, however, to exploit inter-label associations more explicitly and to use ensembling techniques more sophisticated than the bagging-based approaches that dominate the multi-label classification literature.

The most basic methods in the literature are binary relevance and label powerset. Binary relevance treats each label as an independent binary classification task and learns a separate binary classifier for it, while label powerset converts each unique combination of label assignments into a distinct class and then trains a single multi-class classifier. Other methods can improve on these by considering inter-label associations or by using ensemble algorithms. In the multi-class domain, ensemble methods generally perform much better than individual classifier models, yet, apart from bagging-like methods, little work has been done on boosting or boosting-like methods for multi-label classification.

This thesis investigates new algorithms for training multi-label classification models that exploit inter-label associations and/or utilise ensemble models, especially boosting-like methods. Three new methods are proposed: Stacked-MLkNN, a stacked-ensemble-based lazy learning algorithm that exploits inter-label associations at the stacked layer; CascadeML, a neural network training algorithm that uses a cascade architecture to exploit inter-label associations and evolves the network architecture during training, which minimises the need for hyperparameter tuning; and KFHE-HOMER, a multi-label ensemble training algorithm built on a newly proposed perspective that views ensemble training as a static state estimation problem solvable using the sensor fusion properties of the Kalman filter. This new perspective on ensemble training is itself a contribution of this thesis, as are two new multi-class classification algorithms that exploit it: Kalman Filter-based Heuristic Ensemble (KFHE) and KalmanTune.

Each newly proposed method is extensively evaluated across a set of well-known benchmark multi-label classification datasets and compared to the performance of current state-of-the-art methods. Each is found to be highly effective. Stacked-MLkNN performs better than all other existing instance-based multi-label classification algorithms against which it was compared. CascadeML creates models with performance comparable to the best-performing multi-label methods, without requiring extensive hyperparameter tuning. KFHE outperforms leading multi-class ensemble methods, and KalmanTune can improve the performance of ensembles trained using boosting. Finally, KFHE-HOMER was found to perform better than all other multi-label classification methods against which it was compared.
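To make the two baseline strategies described in the abstract concrete, the following is a minimal sketch of binary relevance and label powerset. It assumes scikit-learn's LogisticRegression as the base learner; the function names and the choice of base classifier are illustrative assumptions, not code from the thesis.

    # Minimal sketch of binary relevance and label powerset (illustrative only).
    # X: (n_samples, n_features); Y: (n_samples, n_labels) with 0/1 entries.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def binary_relevance_fit(X, Y):
        """Train one independent binary classifier per label column of Y."""
        return [LogisticRegression(max_iter=1000).fit(X, Y[:, j])
                for j in range(Y.shape[1])]

    def binary_relevance_predict(models, X):
        """Predict each label independently; inter-label associations are ignored."""
        return np.column_stack([m.predict(X) for m in models])

    def label_powerset_fit(X, Y):
        """Map each unique label combination to a single class, then train
        one multi-class classifier on those combination classes."""
        combos, y_class = np.unique(Y, axis=0, return_inverse=True)
        model = LogisticRegression(max_iter=1000).fit(X, y_class)
        return model, combos

    def label_powerset_predict(model, combos, X):
        """Map predicted combination classes back to full label vectors."""
        return combos[model.predict(X)]

As the sketch makes visible, binary relevance ignores inter-label associations entirely, which is exactly the limitation the more sophisticated methods in the abstract aim to address, while label powerset captures associations implicitly but can only ever predict label combinations seen during training.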
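The stacking idea for exploiting inter-label associations can likewise be sketched generically: a first layer predicts each label, and a second layer re-predicts each label with the first layer's outputs appended as input features. This shows the general two-layer scheme only, not the thesis's Stacked-MLkNN algorithm (which is a lazy, instance-based method); the names and base classifier are again assumptions.

    # Generic two-layer stacked binary relevance (a sketch of the general idea,
    # not Stacked-MLkNN). Layer 2 sees layer 1's predictions for all labels,
    # which lets it model inter-label associations.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def stacked_fit(X, Y):
        layer1 = [LogisticRegression(max_iter=1000).fit(X, Y[:, j])
                  for j in range(Y.shape[1])]
        # Augment the inputs with layer-1 predictions for every label.
        Z = np.column_stack([m.predict(X) for m in layer1])
        X_aug = np.hstack([X, Z])
        layer2 = [LogisticRegression(max_iter=1000).fit(X_aug, Y[:, j])
                  for j in range(Y.shape[1])]
        return layer1, layer2

    def stacked_predict(layer1, layer2, X):
        Z = np.column_stack([m.predict(X) for m in layer1])
        return np.column_stack([m.predict(np.hstack([X, Z])) for m in layer2])

In practice the layer-1 predictions used to train layer 2 are usually produced out-of-fold to avoid leakage; in-sample predictions are used here only to keep the sketch short.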
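Finally, the Kalman filter perspective mentioned in the abstract rests on the filter's sensor-fusion behaviour for a static state: each new noisy "measurement" is blended into a running estimate in proportion to its reliability. The scalar sketch below illustrates only that standard fusion update; it is not the KFHE or KFHE-HOMER algorithm.

    # Generic Kalman-filter sensor fusion for a static state (not KFHE itself).
    # Each measurement z with variance r refines the estimate x with variance p.
    def kalman_fuse(measurements, variances, x0=0.0, p0=1e6):
        x, p = x0, p0                 # prior estimate and its (large) variance
        for z, r in zip(measurements, variances):
            k = p / (p + r)           # Kalman gain: trust placed in the measurement
            x = x + k * (z - x)       # blend the estimate toward the measurement
            p = (1.0 - k) * p         # fused variance shrinks with each fusion step
        return x, p

    # Fusing three noisy readings of the same quantity; readings with smaller
    # variance pull the estimate harder.
    est, var = kalman_fuse([1.2, 0.9, 1.05], [0.5, 0.2, 0.1])

In the analogy the abstract describes, the static state being estimated plays the role of the combined ensemble model, and each component learner acts as one noisy measurement of it.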
Type of material: Doctoral Thesis
Publisher: University College Dublin. School of Computer Science
Qualification Name: Ph.D.
Copyright (published version): 2020 the Author
Keywords: Multi-label; Ensemble; Classification
Language: en
Status of Item: Peer reviewed
Appears in Collections: Computer Science Theses

This item is available under the Attribution-NonCommercial-NoDerivs 3.0 Ireland licence. No item may be reproduced for commercial purposes. For other possible restrictions on use, please refer to the publisher's URL where this is made available, or to notes contained in the item itself. Other terms may apply.