Title: CascadeML: An Automatic Neural Network Architecture Evolution and Training Algorithm for Multi-label Classification
Authors: Pakrashi, Arjun; MacNamee, Brian
Date available: 2020-11-06
Date issued: 2019-11-19
Publisher: Springer, 2019
ISBN: 978-3-030-34884-7
Handle: http://hdl.handle.net/10197/11677
Conference: The 39th SGAI International Conference on Artificial Intelligence (AI 2019), Cambridge, United Kingdom, 17-19 December 2019

Abstract: In multi-label classification, a datapoint can be labelled with more than one class at the same time. A common but trivial approach to multi-label classification is to train an individual binary classifier per label, but performance can be improved by considering associations between the labels, as algorithms like classifier chains and RAKEL do effectively. Like most machine learning algorithms, however, these approaches require accurate hyperparameter tuning, a computationally expensive optimisation problem, and tuning is important for training a good multi-label classifier model. Effective multi-label classification approaches that do not require extensive hyperparameter tuning are scarce in the literature. This paper addresses that scarcity by proposing CascadeML, a multi-label classification approach based on cascade neural networks that takes label associations into account and requires minimal hyperparameter tuning. The performance of CascadeML is evaluated using 10 multi-label datasets and compared with other leading multi-label classification algorithms. Results show that CascadeML performs comparably with the leading approaches but without a need for hyperparameter tuning.

Language: en
Rights: The final publication is available at www.springerlink.com.
Keywords: Machine learning; Multi-label classification; Cascade neural networks
Type: Conference Publication
DOI: 10.1007/978-3-030-34885-4_1
Date of record: 2020-02-03
Funder: SFI/12/RC/2289_P2
License: https://creativecommons.org/licenses/by-nc-nd/3.0/ie/
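To make the abstract's contrast concrete, the following is a minimal sketch of the "trivial approach" it mentions: binary relevance, i.e. one independent binary classifier per label. The class names (`BinaryRelevance`, `NearestCentroid`) and the toy nearest-centroid base learner are illustrative assumptions, not taken from the paper; CascadeML itself is a cascade neural network and is not reproduced here.

```python
# Sketch of binary relevance for multi-label classification:
# train one independent binary classifier per label. Label
# associations are ignored entirely, which is the weakness that
# approaches like classifier chains, RAKEL, and CascadeML address.
# NearestCentroid is a deliberately simple stand-in base learner.

class NearestCentroid:
    """Toy binary classifier: predict 1 if a point is closer to the
    centroid of the positive training examples than to the negative one."""

    def fit(self, X, y):
        pos = [x for x, t in zip(X, y) if t == 1]
        neg = [x for x, t in zip(X, y) if t == 0]
        self.pos_c = [sum(col) / len(pos) for col in zip(*pos)]
        self.neg_c = [sum(col) / len(neg) for col in zip(*neg)]
        return self

    def predict(self, X):
        def sqdist(a, b):
            return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
        return [1 if sqdist(x, self.pos_c) < sqdist(x, self.neg_c) else 0
                for x in X]


class BinaryRelevance:
    """Fit one base classifier per label column of Y, independently."""

    def __init__(self, base_factory):
        self.base_factory = base_factory

    def fit(self, X, Y):
        n_labels = len(Y[0])
        # One model per label; each sees only its own label column.
        self.models = [
            self.base_factory().fit(X, [row[j] for row in Y])
            for j in range(n_labels)
        ]
        return self

    def predict(self, X):
        per_label = [m.predict(X) for m in self.models]
        return [list(row) for row in zip(*per_label)]  # rows = datapoints


# Tiny 2-D example with two labels per datapoint.
X = [[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]]
Y = [[1, 0], [1, 0], [0, 1], [1, 1]]
clf = BinaryRelevance(NearestCentroid).fit(X, Y)
print(clf.predict([[0.05, 0.1], [0.95, 1.0]]))  # -> [[1, 0], [0, 1]]
```

Because each per-label model is trained in isolation, any correlation between labels (e.g. labels that frequently co-occur) is lost, which is why the abstract calls this approach trivial.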