Abstract
With the wide use of array antennas on various mobile platforms, time-varying amplitude and phase errors have become an important factor limiting the engineering application of array signal processing techniques. To address the problem that such time-varying amplitude and phase errors currently cannot be corrected effectively, a deep-learning-based correction algorithm for time-varying array amplitude and phase errors is proposed by drawing on the idea of the autoencoder. The algorithm makes full use of the feature-extraction and reconstruction capability of the autoencoder network: a deep learning network dedicated to correcting the time-varying amplitude and phase errors of the channels is designed, and a learning mechanism jointly driven by data without time-varying amplitude and phase errors (unperturbed data) and data containing such errors (perturbed data) is given. Under the principle of minimizing the mean square error between the desired output and the ideal model, the hidden features of the array manifold are extracted, thereby achieving effective correction of the time-varying array amplitude and phase errors. Simulation results show that the proposed algorithm can effectively correct the time-varying amplitude and phase errors of each channel: with ±80% random time-varying amplitude errors and ±5° random time-varying phase errors in the channels, the mean square errors of the corrected amplitude and phase are within 0.5% and 1.5%, respectively, and when the SNR is no less than 0 dB the MUSIC direction-finding accuracy on the corrected data is essentially the same as that on error-free data, which verifies the effectiveness of the proposed algorithm.
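The following is a minimal, illustrative Python/PyTorch sketch of the dual-driven training idea summarized above: perturbed array snapshots are fed to an autoencoder-style network whose training target is the corresponding unperturbed snapshots, with a mean-square-error loss. The array geometry, error ranges, network depth and layer widths are assumptions made for demonstration only and are not the authors' exact network or parameters.

# Minimal sketch (not the authors' exact network): a denoising-autoencoder-style
# corrector trained with perturbed snapshots as input and unperturbed snapshots
# as targets, using an MSE loss. All sizes and ranges below are illustrative.
import numpy as np
import torch
import torch.nn as nn

M, N_SNAP, N_TRAIN = 8, 64, 2000           # sensors, snapshots per sample, training samples
d_lambda = 0.5                              # element spacing in wavelengths (assumed ULA)

def steering(theta_deg):
    theta = np.deg2rad(theta_deg)
    return np.exp(1j * 2 * np.pi * d_lambda * np.arange(M) * np.sin(theta))

def make_pair(rng):
    """One (perturbed, unperturbed) snapshot pair for a random source direction."""
    a = steering(rng.uniform(-60, 60))
    s = (rng.standard_normal(N_SNAP) + 1j * rng.standard_normal(N_SNAP)) / np.sqrt(2)
    clean = np.outer(a, s)                                  # M x N_SNAP, error-free data
    g = 1 + rng.uniform(-0.8, 0.8, M)                       # ±80% random amplitude error
    phi = np.deg2rad(rng.uniform(-5, 5, M))                 # ±5° random phase error
    dirty = (g * np.exp(1j * phi))[:, None] * clean         # per-channel amplitude-phase error
    to_vec = lambda x: np.concatenate([x.real.ravel(), x.imag.ravel()])
    return to_vec(dirty), to_vec(clean)

rng = np.random.default_rng(0)
pairs = [make_pair(rng) for _ in range(N_TRAIN)]
X = torch.tensor(np.array([p[0] for p in pairs]), dtype=torch.float32)   # perturbed input
Y = torch.tensor(np.array([p[1] for p in pairs]), dtype=torch.float32)   # unperturbed target

dim = 2 * M * N_SNAP
net = nn.Sequential(                        # encoder-decoder with a narrow bottleneck
    nn.Linear(dim, 256), nn.ReLU(),
    nn.Linear(256, 64), nn.ReLU(),          # latent code: hidden array-manifold features
    nn.Linear(64, 256), nn.ReLU(),
    nn.Linear(256, dim),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(50):
    opt.zero_grad()
    loss = loss_fn(net(X), Y)               # drive the reconstruction toward the unperturbed data
    loss.backward()
    opt.step()

# After training, net(perturbed_snapshot) approximates the error-free snapshot,
# which can then be passed to MUSIC or another DOA estimator for evaluation.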
Authors
张梓轩
齐子森
许华
史蕴豪
ZHANG Zixuan; QI Zisen; XU Hua; SHI Yunhao (Institute of Telecommunication Engineering, Air Force Engineering University, Xi'an 710077, China)
Source
《西北工业大学学报》
EI
CAS
CSCD
PKU Core Journals (北大核心)
2023, No. 6, pp. 1134-1145 (12 pages)
Journal of Northwestern Polytechnical University
Funding
Supported by the National Natural Science Foundation of China (62131020).
Keywords
array correction
time-varying error
autoencoder
machine learning