Multi-Way Compression for Channel Neural Decoding with Quantization

Yuanhui Liang, Chan Tong Lam, Qingle Wu, Benjamin K. Ng, Sio Kei Im

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

The performance of model-driven channel neural decoding has surpassed that of traditional channel decoding algorithms, but at the cost of higher complexity, which makes it difficult to implement on resource-constrained communication hardware. In this paper, we propose a quantization scheme for model-driven channel neural decoding and combine it with tensor ring (TR) decomposition and weight-sharing algorithms to form different types of multi-way compression methods. Experimental results on LDPC, BCH, and Hamming codes show that the proposed quantization and multi-way compression methods can effectively reduce the complexity of channel neural decoding without significant performance degradation.
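The abstract does not detail the schemes themselves, so as a rough illustration only, the sketch below combines uniform weight quantization with weight sharing via a small k-means codebook, two standard ingredients of such compression pipelines. All function names and parameters here are illustrative assumptions, not the paper's method, and the TR-decomposition stage is omitted.

```python
import random

def uniform_quantize(weights, bits):
    """Map each weight to the nearest of 2**bits uniformly spaced levels.

    Illustrative sketch; the paper's actual quantization scheme may differ.
    """
    levels = 2 ** bits
    w_min, w_max = min(weights), max(weights)
    step = (w_max - w_min) / (levels - 1)
    return [round((w - w_min) / step) * step + w_min for w in weights]

def share_weights(weights, n_clusters, iters=20):
    """Weight sharing: tie all weights to an n_clusters-entry codebook
    found by a simple 1-D k-means, so only codebook indices need storing."""
    lo, hi = min(weights), max(weights)
    centers = [lo + k * (hi - lo) / (n_clusters - 1) for k in range(n_clusters)]
    for _ in range(iters):
        # Assign each weight to its nearest codebook entry.
        assign = [min(range(n_clusters), key=lambda k: abs(w - centers[k]))
                  for w in weights]
        # Move each codebook entry to the mean of its assigned weights.
        for k in range(n_clusters):
            members = [w for w, a in zip(weights, assign) if a == k]
            if members:
                centers[k] = sum(members) / len(members)
    return [centers[a] for a in assign]

random.seed(0)
w = [random.gauss(0.0, 1.0) for _ in range(64)]        # stand-in decoder weights
shared = share_weights(uniform_quantize(w, bits=4), n_clusters=8)
print(len(set(shared)))  # at most 8 distinct values remain
```

Applied in sequence like this, quantization caps the precision of each weight while sharing caps the number of distinct values, so the two compress along different axes, which is the sense in which combining them yields a "multi-way" compression method.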

Original language: English
Title of host publication: 2023 9th International Conference on Computer and Communications, ICCC 2023
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 968-973
Number of pages: 6
ISBN (Electronic): 9798350317251
DOIs
Publication status: Published - 2023
Event: 9th International Conference on Computer and Communications, ICCC 2023 - Hybrid, Chengdu, China
Duration: 8 Dec 2023 - 11 Dec 2023

Publication series

Name: 2023 9th International Conference on Computer and Communications, ICCC 2023

Conference

Conference: 9th International Conference on Computer and Communications, ICCC 2023
Country/Territory: China
City: Hybrid, Chengdu
Period: 8/12/23 - 11/12/23

Keywords

  • channel neural decoding
  • model-driven
  • multi-way compression
  • quantization
