University College Dublin
Improving explainable recommendations by deep review-based explanations

Author(s)
Ouyang, Sixun  
Lawlor, Aonghus  
URI
http://hdl.handle.net/10197/25679
Date Issued
2021-04-28
Date Available
2024-04-22T07:18:23Z
Abstract
Many e-commerce sites encourage their users to write product reviews, in the knowledge that these reviews exert a considerable influence on users’ decision-making processes. These snippets of real-world experience provide an essential source of data for interpretable recommendations. However, current methods that rely on user-generated content to make recommendations can run into problems because of well-known issues with reviews, such as noise, sparsity and irrelevant content. On the other hand, recent advances in text generation methods demonstrate significant improvements in text quality and show promise in addressing these problems. In this paper, we develop two character-level deep neural network-based personalised review generation models and improve recommendation accuracy by generating high-quality text that meets the input criteria of text-aware recommender systems. To make fair comparisons, we train review-aware recommender systems on human-written reviews and obtain improved recommendations by feeding generated reviews at the inference step. Our experiments are conducted on four large review datasets from multiple domains. We evaluate our methods against both non-review-based recommender systems and advanced review-aware recommender systems. The results demonstrate that we beat the baselines on a range of metrics and obtain state-of-the-art performance on both rating prediction and top-N ranking. Our sparsity experiments validate that our generation models can produce high-quality text to tackle the sparsity problem. We also demonstrate the generation of useful reviews, achieving RMSE improvements of up to 13.53%. For explanation evaluation, quantitative analyses reveal good understandability scores for our generated review-based explanations, and qualitative case studies substantiate that we can capture critical aspects in generating explanations.
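The abstract reports rating-prediction quality as a percentage RMSE improvement over baselines. As a minimal sketch of how such a figure is computed, the snippet below evaluates two sets of predicted ratings against ground truth; the rating values and the "baseline" / "review-aware" labels are hypothetical illustrations, not data from the paper.

```python
import math

def rmse(predicted, actual):
    """Root mean squared error between predicted and true ratings."""
    assert len(predicted) == len(actual) and predicted
    return math.sqrt(
        sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(predicted)
    )

# Hypothetical ratings on a 1-5 scale (illustration only).
true_ratings = [4.0, 3.0, 5.0, 2.0]
baseline_preds = [3.5, 3.5, 4.0, 3.0]       # a non-review-based model
review_aware_preds = [3.9, 3.1, 4.8, 2.2]   # a model fed generated reviews

base = rmse(baseline_preds, true_ratings)
ours = rmse(review_aware_preds, true_ratings)

# Relative improvement, the form in which the paper's 13.53% is stated.
improvement = (base - ours) / base * 100
```

A lower RMSE means predicted ratings sit closer to the true ratings, and the relative reduction `(base - ours) / base` is what a claim like "up to 13.53% RMSE improvement" quantifies.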
Sponsorship
Science Foundation Ireland
Other Sponsorship
Insight Research Centre
Type of Material
Journal Article
Journal
IEEE Access
Volume
9
Start Page
67444
End Page
67455
Subjects
Computing methodologi...
Deep neural networks
Information systems
Natural language gene...
Recommender systems
DOI
10.1109/ACCESS.2021.3076146
Language
English
Status of Item
Peer reviewed
This item is made available under a Creative Commons License
https://creativecommons.org/licenses/by/3.0/ie/
File(s)
Name
Improving explainable recommendations by deep review-based explanations.pdf
Size
908.93 KB
Format
Adobe PDF
Checksum (MD5)
2290975490517c61e8f0628e5c237372
Owning collection
Insight Research Collection

Item descriptive metadata is released under a CC-0 (public domain) license: https://creativecommons.org/public-domain/cc0/.
All other content is subject to copyright.

For all queries please contact research.repository@ucd.ie.
