Journal on Communications ›› 2014, Vol. 35 ›› Issue (1): 16-23. doi: 10.3969/j.issn.1000-436x.2014.01.003

• Academic Paper •

Research on an algorithm for HPA predistortion using a memory-type BP neural network

Chun-hui HUANG, Yong-jie WEN

  1. College of Physics and Information Engineering, Fuzhou University, Fuzhou, Fujian 350108, China
  • Online: 2014-01-25  Published: 2017-06-17
  • Supported by:
    The National Natural Science Foundation of China

Algorithm study of digital HPA predistortion using a novel memory-type BP neural network

Chun-hui HUANG, Yong-jie WEN

  1. College of Physics and Information Engineering, Fuzhou University, Fuzhou 350108, China
  • Online: 2014-01-25  Published: 2017-06-17

Abstract:

Based on an analysis of the characteristics of the high power amplifier (HPA) in wide-band CMMB repeater stations, a time-delay neural network model (FIR-NLNNN) is proposed that can process the memory effect and the nonlinearity of the power amplifier separately. The model is built on the real-valued time-delay neural network (RVTDNN) and uses the Levenberg-Marquardt (LM) optimization algorithm to determine the network coefficients; a new parameter w0 is added to the model, and the corresponding modified formulas of the LM algorithm are given. A Bayesian mechanism is then introduced into the predistortion neural network system to eliminate the over-fitting of the LM algorithm, and an indirect-learning predistorter for the CMMB digital repeater station is constructed to fit the nonlinearity and memory effect of the HPA. The results show that both the RVTDNN and FIR-NLNNN predistorters significantly improve system performance and reduce the adjacent channel power ratio by about 30 dB. With the mean square error (MSE) kept below 10^-6, the FIR-NLNNN structure uses nearly 50% fewer network parameters than the RVTDNN structure, and the numbers of multiplications and additions in the iterative process are reduced by about 75%.
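The abstract names the FIR-NLNNN structure but does not give its topology, so the following Python sketch only illustrates the separation it describes and is not the implementation from the paper: a linear tap-delay (FIR) stage handles the memory effect of the I and Q samples, and a small memoryless network handles the static nonlinearity. The tap count, hidden-layer size, and tanh activation are assumptions.

    import numpy as np

    # Illustrative sketch, not the paper's FIR-NLNNN: the FIR taps model the
    # memory effect, the memoryless MLP models the static nonlinearity.
    class FirNlnnSketch:
        def __init__(self, taps=5, hidden=9, seed=0):
            rng = np.random.default_rng(seed)
            self.taps = taps
            # FIR (memory) stage: one set of real-valued taps per branch
            self.h_i = rng.normal(scale=0.1, size=taps)
            self.h_q = rng.normal(scale=0.1, size=taps)
            # Memoryless MLP stage: 2 inputs (filtered I, Q) -> hidden -> 2 outputs
            self.W1 = rng.normal(scale=0.1, size=(hidden, 2))
            self.b1 = np.zeros(hidden)
            self.W2 = rng.normal(scale=0.1, size=(2, hidden))
            self.b2 = np.zeros(2)

        def forward(self, i_taps, q_taps):
            """i_taps, q_taps: current and past I and Q samples (length = taps)."""
            u = np.array([self.h_i @ i_taps, self.h_q @ q_taps])  # memory stage
            z = np.tanh(self.W1 @ u + self.b1)                    # nonlinear stage
            return self.W2 @ z + self.b2                          # predistorted I, Q

    # Example: predistort one sample from the five most recent I and Q samples
    pd = FirNlnnSketch(taps=5, hidden=9, seed=0)
    print(pd.forward(np.array([0.10, 0.08, 0.05, 0.02, 0.00]),
                     np.array([0.03, 0.02, 0.01, 0.00, 0.00])))

In a real predistorter these coefficients would of course be trained jointly from HPA measurements rather than left at random initial values; the sketch only shows how memory and nonlinearity can sit in separate, smaller blocks.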

Key words: high power amplifier, predistorter, neural network, memory effect, LM algorithm, Bayesian algorithm

Abstract:

Based on an analysis of the characteristics of the high power amplifier (HPA) in wide-band CMMB repeater stations, a novel neural network was proposed that can separately process the memory effect and the nonlinearity of the power amplifier. The novel model, based on the real-valued time-delay neural network (RVTDNN), uses Levenberg-Marquardt (LM) optimization to iteratively update the coefficients of the neural network. Because of the new parameter w0 in the novel NN model, the modified formulas of the LM algorithm were provided. Next, in order to eliminate the over-fitting of the LM algorithm, the Bayesian regularization algorithm was applied to the predistortion system. Additionally, a predistorter for CMMB repeater stations based on the indirect learning method was constructed to fit the nonlinearity and memory effect of the HPA. Simulation results show that both NN models can improve system performance and reduce the ACEPR (adjacent channel error power ratio) by about 30 dB. Moreover, with the mean square error kept below 10^-6, the number of network coefficients for the FIR-NLNNN is about half of that for the RVTDNN, and the numbers of multiplications and additions in the iterative process of the FIR-NLNNN are about 25% of those for the RVTDNN.
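As a rough illustration of the training step described above, the sketch below applies a Levenberg-Marquardt update to the standard Bayesian-regularized cost F(w) = beta*E_D + alpha*E_W used with LM training. The paper's modified LM formulas for the extra parameter w0 are not given in the abstract and are not reproduced here; the fixed damping factor, the toy model, and all names are assumptions.

    import numpy as np

    def lm_bayes_step(w, residuals, jacobian, mu, alpha, beta):
        """One damped Gauss-Newton (LM) step on F(w) = beta*sum(e^2) + alpha*sum(w^2)."""
        e = residuals(w)   # error vector on the training data
        J = jacobian(w)    # Jacobian of the errors w.r.t. the weights
        A = beta * (J.T @ J) + (alpha + mu) * np.eye(w.size)
        g = beta * (J.T @ e) + alpha * w
        return w - np.linalg.solve(A, g)

    # Toy usage: fit y = w[0]*tanh(w[1]*x) to assumed data (the real use would
    # be the indirect-learning predistorter driven by measured HPA output).
    x = np.linspace(-1.0, 1.0, 50)
    y = np.tanh(2.0 * x)
    res = lambda w: w[0] * np.tanh(w[1] * x) - y
    jac = lambda w: np.column_stack([np.tanh(w[1] * x),
                                     w[0] * x / np.cosh(w[1] * x) ** 2])
    w = np.array([0.5, 0.5])
    for _ in range(30):
        w = lm_bayes_step(w, res, jac, mu=1e-3, alpha=1e-4, beta=1.0)
    print(w)  # should approach [1, 2]

In the indirect-learning arrangement mentioned in the abstract, a step of this kind trains a post-inverse model from the HPA output back to its input, and the trained weights are then copied to the predistorter placed in front of the amplifier; in practice mu is also adapted between iterations, and alpha and beta are re-estimated as in standard Bayesian regularization rather than held fixed as in this toy example.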

Key words: HPA, predistortion, neural network, memory effect, LM algorithm, Bayesian algorithm
