Complexity-Reduced Model Adaptation for Digital Predistortion of RF Power Amplifiers With Pretraining-Based Feature Extraction
File(s)
File | Description | Size | Format
---|---|---|---
Pretraining DPD.pdf | | 4.38 MB | PDF
Author(s)
Date Issued
March 2021
Date Available
2021-03-10T17:29:02Z
Abstract
In this article, we present a new method to reduce the model adaptation complexity for digital predistortion (DPD) of radio frequency (RF) power amplifiers (PAs) under varying operating conditions, using a pretrained transformation of model coefficients. Experimental studies show that PA behavior variations can be effectively tracked using a small number of "transformed" coefficients, even with large deviations in the output characteristics. Based on this finding, to avoid re-extracting all the original coefficients every time the operating condition changes, we propose a one-time offline pretraining stage that first extracts the common features of PA behavior under different operating conditions. The online model adaptation process then only needs to identify a small number of transformed coefficients, which results in a drastic reduction in the computational complexity of the model adaptation process. The proposed solution is validated by experimental results with varying signal bandwidths and output power levels on a high-efficiency gallium-nitride Doherty PA, where the computational complexity is significantly reduced and the system performance is not compromised.
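The abstract describes a two-stage scheme: an offline pretraining step that learns a transformation capturing the common features of the PA coefficients across operating conditions, and an online step that identifies only a few transformed coefficients. The following is a minimal sketch of that idea, not the paper's actual algorithm: it assumes a memory-polynomial model and an SVD-derived transformation matrix, and all function names and parameters (mp_basis, pretrain_transform, adapt_online, K, M, r) are illustrative choices rather than details taken from the article.

```python
# Hypothetical sketch of the pretraining-based reduced adaptation idea.
# Assumptions (not from the paper): memory-polynomial PA/DPD model,
# SVD over coefficient sets gathered offline under several conditions.
import numpy as np

def mp_basis(x, K=5, M=3):
    """Memory-polynomial regression matrix for complex input samples x."""
    N = len(x)
    cols = []
    for m in range(M):                       # memory taps
        xm = np.concatenate([np.zeros(m, dtype=complex), x[:N - m]])
        for k in range(K):                   # nonlinearity orders
            cols.append(xm * np.abs(xm) ** k)
    return np.stack(cols, axis=1)            # shape (N, K*M)

def pretrain_transform(coeff_sets, r=4):
    """Offline stage: extract r common features from full coefficient
    vectors identified under different operating conditions."""
    C = np.stack(coeff_sets, axis=1)          # (K*M, num_conditions)
    U, _, _ = np.linalg.svd(C, full_matrices=False)
    return U[:, :r]                            # transformation matrix T

def adapt_online(T, x, d):
    """Online stage: identify only r transformed coefficients by least
    squares, then map them back to the full coefficient vector."""
    A = mp_basis(x) @ T                        # reduced regression matrix (N, r)
    w, *_ = np.linalg.lstsq(A, d, rcond=None)  # r transformed coefficients
    return T @ w                               # full (K*M,) coefficient vector
```

Under these assumptions, the online least-squares problem shrinks from K*M unknowns to r unknowns, which is the source of the complexity reduction the abstract refers to.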
Sponsorship
Science Foundation Ireland
Type of Material
Journal Article
Publisher
IEEE
Journal
IEEE Transactions on Microwave Theory and Techniques
Volume
69
Issue
3
Start Page
1780
End Page
1790
Copyright (Published Version)
2020 IEEE
Language
English
Status of Item
Peer reviewed
ISSN
0018-9480
This item is made available under a Creative Commons License
Owning collection
Scopus© citations
8
Acquisition Date
Feb 1, 2023
Views
391
Acquisition Date
Feb 1, 2023
Downloads
315
Last Week
10
Last Month
13
Acquisition Date
Feb 1, 2023