Abstract
In this letter, we propose a low-complexity hypernetwork for channel neural decoding whose learning weights are stored in tensor ring (TR) format, called TRHyper. The internal parameters and the number of layers of the TRHyper-based channel neural decoding algorithm can be updated without retraining. We design the size of each TRHyper layer according to the size of the corresponding factor tensor in TR format. During the training phase, we reuse the storage space for the learning weights of the main decoding network, so the proposed TRHyper requires no additional storage space for its learning weights. Numerical results show that for low-density parity-check (LDPC) codes, the performance of the TRHyper-based channel neural decoder is similar to that of the original decoder, while for Bose-Chaudhuri-Hocquenghem (BCH) codes, it slightly exceeds that of the original decoder.
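The tensor ring format referenced in the abstract is a standard decomposition: a weight tensor is stored as a cycle of small factor ("core") tensors, and each element is recovered as the trace of a product of matrix slices. The sketch below illustrates the general TR format only; it is not the paper's TRHyper implementation, and the dimensions and rank values are arbitrary choices for the example.

```python
import numpy as np

# Illustrative sketch of tensor ring (TR) format (not the paper's code):
# a tensor T of shape (d_1, ..., d_N) is stored as N cores G_k of shape
# (r_k, d_k, r_{k+1}), with r_{N+1} = r_1 closing the ring. An element
# T[i_1, ..., i_N] equals the trace of the product of slices G_k[:, i_k, :].

def tr_element(cores, idx):
    """Reconstruct one element of a TR-format tensor."""
    m = cores[0][:, idx[0], :]
    for core, i in zip(cores[1:], idx[1:]):
        m = m @ core[:, i, :]
    return np.trace(m)

rng = np.random.default_rng(0)
dims, ranks = (4, 5, 6), (2, 3, 2)  # example sizes, chosen arbitrarily
cores = [rng.standard_normal((ranks[k], dims[k], ranks[(k + 1) % 3]))
         for k in range(3)]

# Full reconstruction for comparison: contract the ring of ranks a, b, c.
full = np.einsum('aib,bjc,cka->ijk', *cores)

# TR storage cost is sum_k r_k * d_k * r_{k+1}, versus prod_k d_k for
# the dense tensor -- the source of the low-complexity / low-storage claim.
tr_params = sum(c.size for c in cores)
dense_params = int(np.prod(dims))
```

Element-wise reconstruction (`tr_element`) matches the dense tensor built by `einsum`, and with these sizes the TR cores hold 78 parameters versus 120 for the dense tensor; the gap widens rapidly for higher-order tensors.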
| Original language | English |
| --- | --- |
| Journal | IEEE Communications Letters |
| Publication status | Accepted/In press - 2025 |
Keywords
- BCH
- Channel Neural Decoding
- Hypernetwork
- LDPC
- Tensor Ring