LSTM-GRU-Based cGAN With Multi-Head Attention for One-Bit Channel Estimation in Multi-User Massive MIMO System

Qingying Wu, Junqi Bao, Hui Xu, Benjamin K. Ng, Chan Tong Lam

Research output: Contribution to journal › Article › peer-review

Abstract

The application of one-bit Analog-to-Digital Converters (ADCs) in massive Multiple-Input Multiple-Output (MIMO) systems offers significant advantages in terms of reduced power consumption and hardware complexity. However, the severe quantization introduced by one-bit ADCs poses serious challenges for accurate channel estimation. To address this issue, this paper proposes a hybrid framework based on a conditional Generative Adversarial Network (cGAN), integrated with Long Short-Term Memory-Gated Recurrent Unit (LSTM-GRU) and Multi-Head Attention (MHA) modules. Specifically, the cGAN component facilitates the generation of high-fidelity channel matrices through adversarial learning, while the LSTM-GRU and MHA modules enhance the model’s ability to extract features from sequential channel structures and spatial correlations across antenna dimensions. In addition, the model is trained under a task-specific loss formulation to improve generalization performance in the presence of heavily quantized signals. Experiments are conducted under various signal-to-noise ratios (SNRs), antenna configurations, and pilot lengths. Numerical results demonstrate that the proposed method outperforms state-of-the-art CNN, cGAN, and LSTM-GRU methods, achieving Normalized Mean Square Error (NMSE) gains of 8.84 dB, 5.92 dB, and 4.34 dB, respectively.
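To make the two quantities the abstract revolves around concrete, the following is a minimal sketch of one-bit ADC quantization (keeping only the signs of the real and imaginary parts of each received sample) and of the NMSE metric in dB used to score channel estimates. The function names and the naive sign-based estimate are illustrative assumptions, not the paper's actual estimator.

```python
import math
import random

def one_bit_quantize(y):
    """One-bit ADC: keep only the sign of the real and imaginary parts."""
    sgn = lambda x: 1.0 if x >= 0 else -1.0
    return complex(sgn(y.real), sgn(y.imag))

def nmse_db(h_true, h_est):
    """Normalized Mean Square Error in dB between two channel vectors."""
    err = sum(abs(t - e) ** 2 for t, e in zip(h_true, h_est))
    pwr = sum(abs(t) ** 2 for t in h_true)
    return 10 * math.log10(err / pwr)

# Illustrative example: quantize a random Rayleigh-like channel and score
# a crude power-rescaled sign estimate (a stand-in, not the paper's method).
random.seed(0)
h = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(8)]
y = [one_bit_quantize(x) for x in h]  # heavily quantized observations
scale = math.sqrt(sum(abs(x) ** 2 for x in h) / len(h)) / math.sqrt(2)
h_hat = [scale * q for q in y]        # naive sign-based channel estimate
print(round(nmse_db(h, h_hat), 2))
```

This illustrates why one-bit estimation is hard: all amplitude information is destroyed at the ADC, so any estimator must recover it from statistics across pilots and antennas, which is what the learned generator is trained to do.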

Original language: English
Pages (from-to): 191884-191893
Number of pages: 10
Journal: IEEE Access
Volume: 13
DOIs
Publication status: Published - 2025

Keywords

  • channel estimation
  • conditional generative adversarial network
  • One-bit MIMO
  • recurrent neural network
