Application of Tensor Decomposition to Reduce the Complexity of Neural Min-Sum Channel Decoding Algorithm

Qingle Wu, Benjamin K. Ng, Yuanhui Liang, Chan Tong Lam, Yan Ma

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

Neural channel decoding is very promising, as it outperforms traditional channel decoding algorithms. Unfortunately, it still suffers from high computational and storage complexity compared with traditional decoding algorithms. In this paper, we propose that low-rank decomposition techniques based on tensor train decomposition and tensor ring decomposition can be utilized in the neural offset min-sum (NOMS) and neural scale min-sum (NSMS) decoding algorithms. The experimental results show that the two proposed algorithms achieve near state-of-the-art performance with low complexity.
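A minimal illustrative sketch, not taken from the paper, of the kind of low-rank compression the abstract refers to: a dense decoder weight matrix is folded into a higher-order tensor and factorized with a tensor-train (TT) decomposition via sequential truncated SVDs. The function names (tt_decompose, tt_reconstruct), the fold shape, and the rank are assumptions chosen purely for illustration.

    import numpy as np

    def tt_decompose(tensor, max_rank):
        # TT-SVD: sequentially reshape and truncate SVDs to obtain TT cores.
        # NOTE: illustrative sketch only; not the authors' implementation.
        dims = tensor.shape
        cores, r_prev, mat = [], 1, tensor
        for k in range(len(dims) - 1):
            mat = mat.reshape(r_prev * dims[k], -1)
            u, s, vt = np.linalg.svd(mat, full_matrices=False)
            r = min(max_rank, s.size)                      # rank truncation
            cores.append(u[:, :r].reshape(r_prev, dims[k], r))
            mat = np.diag(s[:r]) @ vt[:r]                  # carry the remainder
            r_prev = r
        cores.append(mat.reshape(r_prev, dims[-1], 1))
        return cores

    def tt_reconstruct(cores):
        # Contract the cores back into the full tensor (for checking the error).
        full = cores[0]
        for core in cores[1:]:
            full = np.tensordot(full, core, axes=([-1], [0]))
        return full.squeeze(axis=(0, -1))

    # Hypothetical 256x256 decoder weight matrix folded into a 6-way tensor.
    w = np.random.randn(256, 256)
    cores = tt_decompose(w.reshape(4, 8, 8, 4, 8, 8), max_rank=8)
    approx = tt_reconstruct(cores).reshape(256, 256)
    print(sum(c.size for c in cores), "TT parameters vs.", w.size, "dense")
    print("relative error:", np.linalg.norm(w - approx) / np.linalg.norm(w))

The point of such a factorization is the storage trade-off: the TT cores hold far fewer parameters than the dense matrix, at the cost of an approximation error controlled by the chosen rank. Tensor ring decomposition follows the same idea with a cyclically closed chain of cores.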

Original language: English
Article number: 2255
Journal: Applied Sciences (Switzerland)
Volume: 13
Issue number: 4
DOIs
Publication status: Published - Feb 2023

Keywords

  • neural offset min-sum
  • neural scale min-sum
  • tensor ring decomposition
  • tensor train decomposition
