AMBIQUAL - a full reference objective quality metric for ambisonic spatial audio

dc.contributor.author: Narbutt, Miroslaw
dc.contributor.author: Allen, Andrew
dc.contributor.author: Skoglund, Jan
dc.contributor.author: Chinen, Michael
dc.contributor.author: Hines, Andrew
dc.publisher: IEEE (en_US)
dc.description: The 2018 Tenth International Conference on Quality of Multimedia Experience (QoMEX), Sardinia, Italy, 29 May - 1 June 2018 (en_US)
dc.description.abstract: Streaming spatial audio over networks requires efficient encoding techniques that compress the raw audio content without compromising quality of experience. Streaming service providers such as YouTube need a perceptually relevant objective audio quality metric to monitor users' perceived quality and spatial localization accuracy. In this paper we introduce AMBIQUAL, a full reference objective spatial audio quality metric that assesses both Listening Quality and Localization Accuracy. In our solution both metrics are derived directly from the B-format Ambisonic audio. The metric extends and adapts the algorithm used in ViSQOLAudio, a full reference objective metric designed for assessing speech and audio quality. In particular, Listening Quality is derived from the omnidirectional channel, and Localization Accuracy is derived from a weighted sum of similarities across the B-format directional channels. This paper evaluates whether AMBIQUAL can predict both factors by comparing its predictions with results from MUSHRA subjective listening tests. In particular, we evaluated the Listening Quality and Localization Accuracy of First- and Third-Order Ambisonic audio compressed with the Opus 1.2 codec at various bitrates (i.e. 32, 128 and 256, 512 kbps respectively). The sample set for the tests comprised both recorded and synthetic audio clips with a wide range of time-frequency characteristics. To evaluate the Localization Accuracy of compressed audio, a number of fixed and dynamic (moving vertically and horizontally) source positions were selected for the test samples. Results showed a strong correlation between objective quality scores derived from the B-format Ambisonic audio using AMBIQUAL and subjective scores obtained during the MUSHRA listening tests (PCC = 0.919, Spearman = 0.882 for Listening Quality; PCC = 0.854, Spearman = 0.842 for Localization Accuracy). AMBIQUAL displays very promising quality assessment predictions for spatial audio. Future work will optimise the algorithm to generalise and validate it for higher-order Ambisonic formats. (en_US)
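The abstract's construction (Listening Quality from the omnidirectional W channel, Localization Accuracy from a weighted sum of directional-channel similarities) can be sketched in a few lines. This is a minimal illustration only, not the published algorithm: the function names `ambiqual_sketch` and `channel_similarity`, the uniform channel weights, and the normalized-cross-correlation similarity (standing in for the ViSQOLAudio-derived similarity AMBIQUAL actually uses) are all assumptions made for the example.

```python
import numpy as np

def channel_similarity(ref, deg):
    # Placeholder similarity: zero-mean normalized cross-correlation.
    # AMBIQUAL adapts ViSQOLAudio's similarity measure instead; this
    # stand-in only illustrates the overall structure.
    ref = ref - ref.mean()
    deg = deg - deg.mean()
    denom = np.linalg.norm(ref) * np.linalg.norm(deg)
    return float(np.dot(ref, deg) / denom) if denom else 0.0

def ambiqual_sketch(ref_bformat, deg_bformat, weights=(1.0, 1.0, 1.0)):
    """Sketch of the two AMBIQUAL outputs for first-order B-format audio.

    ref_bformat, deg_bformat: arrays of shape (4, n_samples), channels
    ordered W, X, Y, Z. Returns (listening_quality, localization_accuracy).
    The directional-channel weights here are illustrative, not the
    published values.
    """
    # Listening Quality: similarity of the omnidirectional (W) channel.
    lq = channel_similarity(ref_bformat[0], deg_bformat[0])
    # Localization Accuracy: weighted sum of directional-channel
    # (X, Y, Z) similarities, normalized by the total weight.
    total = sum(weights)
    la = sum(w * channel_similarity(ref_bformat[i + 1], deg_bformat[i + 1])
             for i, w in enumerate(weights)) / total
    return lq, la
```

An undegraded copy of the reference scores 1.0 on both measures under this placeholder similarity, while a codec-degraded signal scores lower on whichever channels it distorts.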
dc.description.sponsorship: European Commission - European Regional Development Fund (en_US)
dc.description.sponsorship: Science Foundation Ireland (en_US)
dc.relation.ispartof: 2018 Tenth International Conference on Quality of Multimedia Experience (QoMEX) (en_US)
dc.rights: © 2018 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. (en_US)
dc.subject: Virtual reality (en_US)
dc.subject: Spatial audio (en_US)
dc.subject: Audio coding (en_US)
dc.subject: Audio compression (en_US)
dc.subject: Opus codec (en_US)
dc.title: AMBIQUAL - a full reference objective quality metric for ambisonic spatial audio (en_US)
dc.type: Conference Publication (en_US)
dc.status: Peer reviewed (en_US)
dc.description.othersponsorship: Google, Inc. (en_US)
item.fulltext: With Fulltext
Appears in Collections: Computer Science Research Collection
Files in This Item:
File: QoMEX2018_Ambiqual.pdf (1.79 MB, Adobe PDF)

Citations: 50 (checked on Jun 18, 2019)




This item is available under the Attribution-NonCommercial-NoDerivs 3.0 Ireland licence. No item may be reproduced for commercial purposes. For other possible restrictions on use, please refer to the publisher's URL where this is made available, or to notes contained in the item itself. Other terms may apply.