A joint estimation method for the under-ice acoustic channel and impulsive noise based on multi-task sparse Bayesian learning
Abstract: To address ice-induced impulsive noise in polar underwater acoustic channels, a multi-task algorithm that jointly estimates the channel and the impulsive noise is proposed to improve sparse recovery performance over time-varying channels. The received data block is divided into sub-blocks, and the temporal correlation between the sub-block channels is exploited in the algorithm design. Impulsive noise is further incorporated into the multi-task sparse Bayesian learning channel estimation model, and an iterative algorithm for the joint estimation of the channel and the impulsive noise is derived with the variational Bayesian method. The algorithm also introduces an adaptive scheme for the weighting factors used in the channel message passing between data sub-blocks; optimizing these key weights further improves the channel estimation accuracy during the Turbo iterations and effectively suppresses error propagation. The algorithm is validated with under-ice acoustic communication data from the eleventh Chinese Arctic scientific expedition. The results show that the proposed algorithm effectively suppresses the interference of impulsive noise while mitigating error propagation. For high-order modulation data over an 11 km link in a single-input single-output system, the proposed algorithm achieves an average relative reduction in bit error rate of about 92.5% after five iterations, compared with the same receiver without impulsive noise cancellation.
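The joint treatment of the channel and the impulsive noise described above can be illustrated with a minimal, single-block sparse Bayesian learning sketch. The snippet below is an assumption-laden illustration rather than the paper's algorithm: it models one received block as y = Φh + e + n, stacks the sparse channel h and the sparse impulse vector e on an augmented dictionary [Φ, I], and runs standard EM-style SBL updates on the stacked vector. The multi-task coupling across sub-blocks, the variational Bayesian derivation, the adaptive inter-block weighting, and the Turbo iteration with the decoder are all omitted; the function name, parameters, and toy data are hypothetical.

```python
import numpy as np

def sbl_joint_channel_impulse(y, Phi, n_iter=50, tol=1e-6):
    """Minimal single-block sketch: jointly recover a sparse channel h and a sparse
    impulsive-noise vector e from y = Phi @ h + e + n (n: white Gaussian noise).
    The impulse term is absorbed by augmenting the dictionary with an identity
    block, so [h; e] is estimated by standard sparse Bayesian learning (EM form)."""
    M, N = Phi.shape
    A = np.hstack([Phi, np.eye(M)])              # augmented dictionary [Phi, I]
    D = N + M                                    # length of the stacked vector [h; e]
    gamma = np.ones(D)                           # per-coefficient prior variances
    beta = 1.0 / (np.var(y) + 1e-12)             # Gaussian-noise precision (initial guess)
    mu = np.zeros(D)
    for _ in range(n_iter):
        gamma_old = gamma.copy()
        # Posterior moments of [h; e] given the current hyperparameters
        Sigma = np.linalg.inv(beta * (A.T @ A) + np.diag(1.0 / (gamma + 1e-12)))
        mu = beta * (Sigma @ (A.T @ y))
        # EM updates of the prior variances and the noise precision
        gamma = mu ** 2 + np.diag(Sigma)
        resid = y - A @ mu
        beta = M / (resid @ resid + np.trace(A @ Sigma @ A.T) + 1e-12)
        if np.max(np.abs(gamma - gamma_old)) < tol:
            break
    return mu[:N], mu[N:]                        # channel estimate, impulse estimate

# Toy usage: a sparse channel plus a few large impulses (hypothetical data)
rng = np.random.default_rng(0)
M, N = 128, 64
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
h_true = np.zeros(N); h_true[[3, 17, 40]] = [1.0, -0.7, 0.5]
e_true = np.zeros(M); e_true[[10, 90]] = [8.0, -6.0]
y = Phi @ h_true + e_true + 0.01 * rng.standard_normal(M)
h_hat, e_hat = sbl_joint_channel_impulse(y, Phi)
```

In the multi-task setting of the abstract, the hyperparameters playing the role of `gamma` would additionally be shared or smoothed across neighbouring sub-blocks according to their temporal correlation; the adaptive weighting factors mentioned above control how strongly each sub-block's estimate is passed to its neighbours during the Turbo iterations.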