TY - JOUR
T1 - Joint-Way Compression for LDPC Neural Decoding Algorithm With Tensor-Ring Decomposition
AU - Liang, Yuanhui
AU - Lam, Chan Tong
AU - Ng, Benjamin K.
N1 - Publisher Copyright:
© 2013 IEEE.
PY - 2023
Y1 - 2023
N2 - In this paper, we propose low-complexity joint-way compression algorithms with Tensor-Ring (TR) decomposition and weight sharing to further lower the storage and computational complexity requirements of low-density parity-check (LDPC) neural decoding. Compared with Tensor-Train (TT) decomposition, TR decomposition is more flexible in the selection of ranks and is also more amenable to rank optimization algorithms. In particular, we use TR decomposition to decompose not only the weight parameter matrix of the Neural Normalized Min-Sum (NNMS)+ algorithm, but also the message matrix transmitted between variable nodes and check nodes. Furthermore, we combine TR decomposition with weight sharing, which we call joint-way compression, to further lower the complexity of the LDPC neural decoding algorithm. We show that the joint-way compression algorithm achieves better compression efficiency than any single compression algorithm while maintaining comparable bit error rate (BER) performance. From the numerical experiments, we found that all the compression algorithms, with an appropriate selection of ranks, give almost no performance degradation, and that the TRwm-ssNNMS+ algorithm, which combines spatial sharing with TR decomposition of both the weight and message matrices, has the best compression efficiency. Compared with our TT-NNMS+ algorithm proposed in Yuanhui et al. (2022), the number of parameters is reduced by about 70 times and the number of multiplications is reduced by about 6 times.
AB - In this paper, we propose low-complexity joint-way compression algorithms with Tensor-Ring (TR) decomposition and weight sharing to further lower the storage and computational complexity requirements of low-density parity-check (LDPC) neural decoding. Compared with Tensor-Train (TT) decomposition, TR decomposition is more flexible in the selection of ranks and is also more amenable to rank optimization algorithms. In particular, we use TR decomposition to decompose not only the weight parameter matrix of the Neural Normalized Min-Sum (NNMS)+ algorithm, but also the message matrix transmitted between variable nodes and check nodes. Furthermore, we combine TR decomposition with weight sharing, which we call joint-way compression, to further lower the complexity of the LDPC neural decoding algorithm. We show that the joint-way compression algorithm achieves better compression efficiency than any single compression algorithm while maintaining comparable bit error rate (BER) performance. From the numerical experiments, we found that all the compression algorithms, with an appropriate selection of ranks, give almost no performance degradation, and that the TRwm-ssNNMS+ algorithm, which combines spatial sharing with TR decomposition of both the weight and message matrices, has the best compression efficiency. Compared with our TT-NNMS+ algorithm proposed in Yuanhui et al. (2022), the number of parameters is reduced by about 70 times and the number of multiplications is reduced by about 6 times.
KW - Joint-way compression
KW - LDPC neural decoding
KW - tensor ring decomposition
KW - weight sharing
UR - http://www.scopus.com/inward/record.url?scp=85149848857&partnerID=8YFLogxK
U2 - 10.1109/ACCESS.2023.3252907
DO - 10.1109/ACCESS.2023.3252907
M3 - Article
AN - SCOPUS:85149848857
SN - 2169-3536
VL - 11
SP - 22871
EP - 22879
JO - IEEE Access
JF - IEEE Access
ER -