TY - JOUR
T1 - Low-Complexity Neural Belief Propagation Decoding Algorithm Based on Tensor Ring Decomposition
AU - Liang, Yuanhui
AU - Lam, Chan Tong
AU - Wu, Qingle
AU - Ng, Benjamin K.
AU - Im, Sio Kei
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Neural belief propagation (NBP) decoding can improve the performance of belief propagation (BP) decoding for high-density parity-check (HDPC) codes, at the expense of higher memory storage requirements and computational complexity due to the addition of trainable weight coefficients. To reduce the high storage requirement of NBP, the cyclically equivariant neural BP (CENBP) algorithm exploits the cyclically invariant property of cyclic codes and optimizes and reuses the weight coefficients of the NBP algorithm, at the expense of further increasing its computational complexity. In this paper, we propose NBP and CENBP decoding algorithms based on Tensor Ring (TR) decomposition that are low-complexity in terms of both memory storage and computational complexity. First, to reduce the memory storage and computational complexity of the NBP algorithm, we propose a TR-based compression algorithm, called the TR-NBP algorithm, that compresses the messages and mathematical calculations in NBP decoding. Second, to address the high computational complexity of the CENBP algorithm, we apply TR decomposition-based compression to the odd layers of the CENBP decoding algorithm, called TR-CENBP, which reduces the computational complexity and further reduces the memory storage requirement of the CENBP algorithm. Furthermore, we use TR decomposition-based compression to simplify the mathematical computations associated with the tanh function in the NBP algorithm, further reducing the complexity of the hardware implementation. Experimental results show that direct compression of the BP algorithm using TR decomposition results in significant performance degradation, whereas our proposed low-complexity TR-NBP and TR-CENBP algorithms greatly reduce both the memory storage requirement and computational complexity without significant performance degradation for typical BCH and LDPC codes.
AB - Neural belief propagation (NBP) decoding can improve the performance of belief propagation (BP) decoding for high-density parity-check (HDPC) codes, at the expense of higher memory storage requirements and computational complexity due to the addition of trainable weight coefficients. To reduce the high storage requirement of NBP, the cyclically equivariant neural BP (CENBP) algorithm exploits the cyclically invariant property of cyclic codes and optimizes and reuses the weight coefficients of the NBP algorithm, at the expense of further increasing its computational complexity. In this paper, we propose NBP and CENBP decoding algorithms based on Tensor Ring (TR) decomposition that are low-complexity in terms of both memory storage and computational complexity. First, to reduce the memory storage and computational complexity of the NBP algorithm, we propose a TR-based compression algorithm, called the TR-NBP algorithm, that compresses the messages and mathematical calculations in NBP decoding. Second, to address the high computational complexity of the CENBP algorithm, we apply TR decomposition-based compression to the odd layers of the CENBP decoding algorithm, called TR-CENBP, which reduces the computational complexity and further reduces the memory storage requirement of the CENBP algorithm. Furthermore, we use TR decomposition-based compression to simplify the mathematical computations associated with the tanh function in the NBP algorithm, further reducing the complexity of the hardware implementation. Experimental results show that direct compression of the BP algorithm using TR decomposition results in significant performance degradation, whereas our proposed low-complexity TR-NBP and TR-CENBP algorithms greatly reduce both the memory storage requirement and computational complexity without significant performance degradation for typical BCH and LDPC codes.
KW - High-Density Parity-Check Code
KW - Neural Belief Propagation Decoding
KW - Tensor Ring Decomposition
UR - http://www.scopus.com/inward/record.url?scp=85208394439&partnerID=8YFLogxK
U2 - 10.1109/TCCN.2024.3487999
DO - 10.1109/TCCN.2024.3487999
M3 - Article
AN - SCOPUS:85208394439
SN - 2332-7731
JO - IEEE Transactions on Cognitive Communications and Networking
JF - IEEE Transactions on Cognitive Communications and Networking
ER -