WiSACL: A subdomain adaptive Wi-Fi-based gesture recognition via contrastive learning

Research output: Contribution to journal › Article › peer-review

Abstract

Wi-Fi-based gesture recognition suffers significant performance degradation in cross-domain scenarios due to distribution shifts across environments, user locations, and facing orientations. To address this challenge, we propose WiSACL, a novel Wi-Fi-based gesture recognition framework that integrates innovative CSI signal processing with advanced subdomain adaptation techniques. We first design a novel Wi-Fi CSI phase-difference representation that suppresses environmental noise and highlights motion features via phase ratio computation, wavelet denoising, and temporal differencing; the resulting signals are encoded into discriminative image representations for enhanced gesture recognition. Building on this robust signal representation, we develop an Adaptive Distribution-Calibrated Pseudo-Label (ADPL) module that progressively refines target labels through subdomain distribution alignment and confidence-aware selection. To fully exploit these pseudo-labels while mitigating the adverse effects of domain shift and label noise, we further introduce a contrastive learning (CL) model that explicitly enhances intra-class compactness and inter-class separation in the feature space. Extensive experiments on the Widar3.0 dataset demonstrate that WiSACL achieves an average accuracy of 98.62% across cross-location, cross-orientation, and cross-environment scenarios, significantly outperforming state-of-the-art methods and validating the effectiveness of our approach for robust cross-domain wireless gesture recognition.
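The abstract names concrete processing stages (phase ratio computation, wavelet denoising, temporal differencing) and a contrastive objective. A minimal NumPy sketch of those stages is given below; all function names are illustrative, the one-level Haar transform stands in for whatever wavelet family the paper uses, and the loss is a generic supervised-contrastive-style formulation, not the authors' exact implementation.

```python
import numpy as np

def phase_ratio(csi_a, csi_b):
    """Phase of the complex ratio between two antennas' CSI.

    Multiplying by the conjugate cancels the carrier/sampling phase
    offsets common to both antennas, a standard trick for cleaning
    raw CSI phase.
    """
    return np.angle(csi_a * np.conj(csi_b))

def haar_denoise(x, thresh=0.1):
    """One-level Haar wavelet denoising with soft-thresholded details."""
    n = len(x) - len(x) % 2
    approx = (x[:n:2] + x[1:n:2]) / np.sqrt(2)
    detail = (x[:n:2] - x[1:n:2]) / np.sqrt(2)
    detail = np.sign(detail) * np.maximum(np.abs(detail) - thresh, 0.0)
    y = np.empty(n)
    y[0::2] = (approx + detail) / np.sqrt(2)
    y[1::2] = (approx - detail) / np.sqrt(2)
    return np.concatenate([y, x[n:]])

def temporal_diff(x):
    """First-order difference along time, emphasizing motion changes."""
    return np.diff(x, axis=0)

def supcon_loss(features, labels, temp=0.1):
    """Supervised-contrastive-style loss over (pseudo-)labels.

    Same-label samples are pulled together (intra-class compactness),
    different-label samples pushed apart (inter-class separation).
    """
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    logits = f @ f.T / temp
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    n = len(labels)
    off_diag = ~np.eye(n, dtype=bool)             # exclude self-pairs
    log_denom = np.log((np.exp(logits) * off_diag).sum(axis=1))
    total, count = 0.0, 0
    for i in range(n):
        pos = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if pos:
            total -= np.mean([logits[i, j] - log_denom[i] for j in pos])
            count += 1
    return total / max(count, 1)
```

In this sketch, a lower `supcon_loss` on correctly clustered pseudo-labels than on shuffled ones is exactly the compactness/separation property the abstract attributes to the CL component.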

Original language: English
Journal: IEEE Internet of Things Journal
DOIs
Publication status: Accepted/In press - 2026

Keywords

  • Contrastive learning
  • Cross-domain
  • Pseudo-label
  • Subdomain adaptive learning
  • Wi-Fi sensing
