Efficient Sequence Regression by Learning Linear Models in All-Subsequence Space

Files in This Item:
insight_publication.pdf (710.72 kB, Adobe PDF)
Title: Efficient Sequence Regression by Learning Linear Models in All-Subsequence Space
Authors: Gsponer, Severin
Smyth, Barry
Ifrim, Georgiana
Permanent link: http://hdl.handle.net/10197/9054
Date: 22-Sep-2017
Abstract: We present a new approach for learning a sequence regression function, i.e., a mapping from sequential observations to a numeric score. Our learning algorithm employs coordinate gradient descent and Gauss-Southwell optimization in the feature space of all subsequences. We give a tight upper bound for the coordinate-wise gradients of squared error loss that enables efficient Gauss-Southwell selection. The proposed bound is built by separating the positive and the negative gradients of the loss function and exploits the structure of the feature space. Extensive experiments on simulated as well as real-world sequence regression benchmarks show that the bound is effective and our proposed learning algorithm is efficient and accurate. The resulting linear regression model provides the user with a list of the most predictive features selected during the learning stage, adding to the interpretability of the method.
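The abstract names the core ingredients of the method: a linear model over the space of all subsequences, trained by coordinate gradient descent with Gauss-Southwell (greedy) coordinate selection under squared error loss. As a rough illustration only, the following Python sketch applies those ingredients to an explicitly enumerated feature space of contiguous subsequences up to a fixed length; the paper's actual algorithm searches all subsequences implicitly and relies on its gradient bound to prune that search, which this sketch does not reproduce. All names and parameters here (subsequence_features, gauss_southwell_regression, max_len) are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def subsequence_features(sequences, max_len=2):
        # Enumerate contiguous subsequences up to max_len as an explicit
        # feature space (occurrence counts per sequence). This is a
        # simplification: the paper works in the space of ALL subsequences
        # without enumerating it, using its gradient bound to prune.
        vocab = sorted({s[i:i + k]
                        for s in sequences
                        for k in range(1, max_len + 1)
                        for i in range(len(s) - k + 1)})
        index = {f: j for j, f in enumerate(vocab)}
        X = np.zeros((len(sequences), len(vocab)))
        for r, s in enumerate(sequences):
            for k in range(1, max_len + 1):
                for i in range(len(s) - k + 1):
                    X[r, index[s[i:i + k]]] += 1.0
        return X, vocab

    def gauss_southwell_regression(X, y, iterations=200):
        # Coordinate descent on L(beta) = 0.5 * ||X beta - y||^2.
        # Gauss-Southwell rule: pick the coordinate whose gradient has
        # the largest magnitude, then minimize the loss exactly along it.
        beta = np.zeros(X.shape[1])
        for _ in range(iterations):
            grad = X.T @ (X @ beta - y)       # coordinate-wise gradients
            j = np.argmax(np.abs(grad))       # Gauss-Southwell selection
            beta[j] -= grad[j] / (X[:, j] @ X[:, j])  # exact 1-D minimization
        return beta

    # Toy usage: numeric scores for three short sequences.
    seqs = ["abba", "baab", "abab"]
    y = np.array([1.0, 0.0, 0.5])
    X, vocab = subsequence_features(seqs, max_len=2)
    beta = gauss_southwell_regression(X, y)
    # As in the paper, the learned linear model exposes its most
    # predictive features, which aids interpretability.
    print(sorted(zip(vocab, beta), key=lambda t: -abs(t[1]))[:5])

The exact one-dimensional minimization step is a standard choice for squared error coordinate descent (each column's curvature X[:, j] @ X[:, j] is available in closed form); the paper itself may use a different step rule.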
Funding Details: Science Foundation Ireland
Type of material: Conference Publication
Keywords: Machine learning; Statistics
Language: en
Status of Item: Peer reviewed
Conference Details: The European Conference on Machine Learning & Principles and Practice of Knowledge Discovery in Databases, Skopje, Macedonia, 18-22 September 2017
Appears in Collections: Insight Research Collection

This item is available under the Attribution-NonCommercial-NoDerivs 3.0 Ireland licence. No item may be reproduced for commercial purposes. For other possible restrictions on use, please refer to the publisher's URL where this is made available, or to notes contained in the item itself. Other terms may apply.