University College Dublin

FedFast: Going Beyond Average for Faster Training of Federated Recommender Systems

Author(s)
Muhammad, Khalil  
Wang, Qinqin  
O'Reilly-Morgan, Diarmuid  
Tragos, Elias  
Smyth, Barry  
Hurley, Neil J.  
Geraci, James  
Lawlor, Aonghus  
URI
http://hdl.handle.net/10197/12120
Date Issued
2020-08-27
Date Available
2021-04-22T15:51:32Z
Abstract
Federated learning (FL) is quickly becoming the de facto standard for the distributed training of deep recommendation models, using on-device user data and reducing server costs. In a typical FL process, a central server tasks end-users to train a shared recommendation model using their local data. The local models are trained over several rounds on the users' devices and the server combines them into a global model, which is sent to the devices for the purpose of providing recommendations. Standard FL approaches use randomly selected users for training at each round, and simply average their local models to compute the global model. The resulting federated recommendation models require significant client effort to train and many communication rounds before they converge to a satisfactory accuracy. Users are left with poor quality recommendations until the late stages of training. We present a novel technique, FedFast, to accelerate distributed learning which achieves good accuracy for all users very early in the training process. We achieve this by sampling from a diverse set of participating clients in each training round and applying an active aggregation method that propagates the updated model to the other clients. Consequently, with FedFast the users benefit from far lower communication costs and more accurate models that can be consumed anytime during the training process, even at the very early stages. We demonstrate the efficacy of our approach across a variety of benchmark datasets and in comparison to state-of-the-art recommendation techniques.
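The abstract contrasts FedFast with the standard FL baseline: random client selection each round, followed by plain averaging of the local models. The sketch below illustrates only that baseline loop, not the authors' FedFast method; the toy model (a flat list of weights) and the `local_update` training step are hypothetical stand-ins for on-device training.

```python
import random

def local_update(global_weights, client_data):
    # Hypothetical local training step: nudge each weight toward the
    # client's (toy) data mean. Real FL would run SGD on local data.
    mean = sum(client_data) / len(client_data)
    return [w + 0.1 * (mean - w) for w in global_weights]

def fed_avg(clients, num_rounds=5, clients_per_round=2, dim=3, seed=0):
    rng = random.Random(seed)
    global_weights = [0.0] * dim
    for _ in range(num_rounds):
        # Standard FL: sample a random subset of clients each round...
        sampled = rng.sample(clients, clients_per_round)
        local_models = [local_update(global_weights, data) for data in sampled]
        # ...and simply average their local models into the global model,
        # which is then redistributed for the next round.
        global_weights = [sum(ws) / len(ws) for ws in zip(*local_models)]
    return global_weights

weights = fed_avg([[1.0, 2.0], [3.0], [2.0, 2.0, 2.0]])
print(weights)
```

FedFast replaces the two commented steps: the random sample becomes a draw from a diverse (clustered) set of clients, and the plain average becomes an active aggregation that also propagates the update to non-participating clients, which is what lets accuracy rise early in training.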
Sponsorship
Science Foundation Ireland
Other Sponsorship
Insight Research Centre
Samsung Research
Samsung Electronics
Type of Material
Conference Publication
Publisher
ACM
Copyright (Published Version)
2020 the Authors
Subjects
Recommender systems
Federated learning
Active sampling
Faster training
Communication costs
DOI
10.1145/3394486.3403176
Language
English
Status of Item
Peer reviewed
Journal
KDD '20: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining
Conference Details
The 26th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD '20), San Diego, California (held online due to the coronavirus outbreak), 23-27 August 2020
This item is made available under a Creative Commons License
https://creativecommons.org/licenses/by-nc-nd/3.0/ie/
File(s)
Name: insight_publication.pdf
Size: 903.42 KB
Format: Adobe PDF
Checksum (MD5): 0ad0f2eef838c153b02dc32588e91bdb
Owning collection
Insight Research Collection

Item descriptive metadata is released under a CC-0 (public domain) license: https://creativecommons.org/public-domain/cc0/.
All other content is subject to copyright.
