TRHyper: Low-Complexity Hypernetwork for Channel Neural Decoding with Learning Weights in Tensor Ring Format

Liang Yuanhui, Chan Tong Lam, Qingle Wu, Benjamin K. Ng, Sio Kei Im

Research output: Contribution to journal › Article › peer-review

Abstract

In this letter, we propose a low-complexity hypernetwork for channel neural decoding with learning weights in tensor ring (TR) format, called TRHyper. The internal parameters and the number of layers of the TRHyper-based channel neural decoding algorithm can be updated without retraining. We design the size of each TRHyper layer according to the size of the corresponding factor tensor in the tensor ring format. During the training phase, we reuse the storage space allocated for the learning weights of the main decoding network, so the proposed TRHyper requires no additional storage space for its own learning weights. Numerical results show that for low-density parity-check (LDPC) codes, the performance of the TRHyper-based channel neural decoder is similar to that of the original decoder, while for Bose-Chaudhuri-Hocquenghem (BCH) codes, it slightly exceeds that of the original decoder.
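The abstract refers to weights stored in tensor ring (TR) format, i.e. a chain of small factor tensors whose ring contraction reconstructs the full weight tensor. The sketch below is not the paper's implementation; it is a generic, minimal illustration of TR reconstruction and of the parameter counts involved, with arbitrary toy ranks and dimensions chosen only for demonstration.

```python
import numpy as np

def tr_reconstruct(cores):
    """Rebuild a full tensor from tensor-ring (TR) cores.

    cores[k] has shape (r_k, n_k, r_{k+1}), with the last rank wrapping back
    to r_0 so the chain of ranks closes into a ring. Entry (i_1, ..., i_d) of
    the full tensor equals trace(G_1[:, i_1, :] @ ... @ G_d[:, i_d, :]).
    """
    # Contract the cores sequentially along their shared rank dimensions.
    result = cores[0]                      # shape (r_0, n_0, r_1)
    for core in cores[1:]:
        # result: (r_0, N, r_k), core: (r_k, n_k, r_{k+1})
        result = np.einsum('anb,bmc->anmc', result, core)
        r0, n, m, rk = result.shape
        result = result.reshape(r0, n * m, rk)
    # Close the ring: trace over the first and last rank dimensions.
    full = np.einsum('ana->n', result)
    dims = [c.shape[1] for c in cores]
    return full.reshape(dims)

# Toy example (illustrative shapes only): a 4 x 6 x 8 weight tensor
# stored as three TR cores of rank 3.
rng = np.random.default_rng(0)
ranks, dims = [3, 3, 3], [4, 6, 8]
cores = [rng.standard_normal((ranks[k], dims[k], ranks[(k + 1) % 3]))
         for k in range(3)]
W = tr_reconstruct(cores)
print(W.shape)                              # (4, 6, 8)
print(np.prod(dims))                        # 192 entries in the full tensor
print(sum(c.size for c in cores))           # 162 entries across the TR cores
```

For larger weight tensors and small TR ranks, the core storage grows linearly in the number of modes rather than multiplicatively, which is the usual motivation for keeping learning weights in TR format; how TRHyper sizes its layers to match these factor tensors is detailed in the letter itself.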

Original language: English
Journal: IEEE Communications Letters
DOI:
Publication status: Accepted/In press, 2025

Keywords

  • BCH
  • Channel Neural Decoding
  • Hypernetwork
  • LDPC
  • Tensor Ring
