Open Source Dataset and Deep Learning Models for Online Digit Gesture Recognition on Touchscreens

Files in This Item:
File: IMVIP_2017_paper_52.pdf (161.5 kB, Adobe PDF)
Title: Open Source Dataset and Deep Learning Models for Online Digit Gesture Recognition on Touchscreens
Authors: Corr, Philip J.
Silvestre, Guenole C.
Bleakley, Christopher J.
Permanent link: http://hdl.handle.net/10197/9349
Date: 1-Sep-2017
Abstract: This paper presents an evaluation of deep neural networks for the recognition of digits entered by users on a smartphone touchscreen. A new large dataset of Arabic numerals was collected for training and evaluation of the networks. The dataset consists of spatial and temporal touch data recorded for 80 digits entered by 260 users. Two neural network models were investigated. The first model was a 2D convolutional neural network (ConvNet) applied to bitmaps of the glyphs created by interpolation of the sensed screen touches; its topology is similar to that of previously published models for offline handwriting recognition from scanned images. The second model used a 1D ConvNet architecture applied to the sequence of polar vectors connecting the touch points. The models were found to provide accuracies of 98.50% and 95.86%, respectively. The second model was much simpler, reducing the number of parameters from 1,663,370 to 287,690. The dataset has been made available to the community as an open source resource.
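
The second model operates on the sequence of polar vectors connecting successive touch points rather than on rendered bitmaps. As a rough illustration of that input representation only, the sketch below converts raw (x, y) touch samples into (r, theta) pairs; the function name, ordering, and angle convention are assumptions for illustration, not the paper's exact preprocessing.

```python
import numpy as np

def touches_to_polar(points):
    """Convert a sequence of (x, y) touch points into the polar vectors
    (r, theta) connecting consecutive points. Illustrative only: the
    paper's exact normalisation and conventions may differ."""
    pts = np.asarray(points, dtype=float)        # shape (n, 2)
    deltas = np.diff(pts, axis=0)                # vectors between consecutive touches
    r = np.hypot(deltas[:, 0], deltas[:, 1])     # magnitude of each vector
    theta = np.arctan2(deltas[:, 1], deltas[:, 0])  # angle in radians
    return np.stack([r, theta], axis=1)          # shape (n-1, 2)

# Example: a short stroke sampled from the touchscreen.
stroke = [(10, 10), (12, 15), (15, 21), (20, 24)]
print(touches_to_polar(stroke))
```

A sequence of such (r, theta) pairs is a natural input for a 1D ConvNet, which helps explain the large reduction in parameter count relative to the 2D bitmap model.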
Type of material: Conference Publication
Publisher: The Irish Pattern Recognition & Classification Society
Keywords: Computer vision; Machine vision; Image processing; Pattern recognition
Language: en
Status of Item: Peer reviewed
Conference Details: Irish Machine Vision and Image Processing Conference (IMVIP), Maynooth University, Ireland, 30 August - 1 September 2017
Appears in Collections: Computer Science Research Collection

This item is available under the Attribution-NonCommercial-NoDerivs 3.0 Ireland licence. No item may be reproduced for commercial purposes. For other possible restrictions on use, please refer to the publisher's URL where this is made available, or to notes contained in the item itself. Other terms may apply.