The Structure-sharing Hypergraph Reasoning Attention Module for CNNs

Jingchao Wang, Guoheng Huang, Xiaochen Yuan, Guo Zhong, Tongxu Lin, Chi Man Pun, Fenfang Xie

Research output: Article, peer-reviewed

Abstract

Attention mechanisms improve model performance by selectively processing relevant information. However, existing attention mechanisms for CNNs do not exploit the high-order semantic similarity between different channels of the input when inferring attention. To address this issue, in this paper we propose the Structure-sharing Hypergraph Reasoning Attention Module (SHRA Module), which explores the high-order similarity among nodes via hypergraph learning. The SHRA Module transforms the input CNN feature maps into hypergraph node representations, over which attention is reasoned through a set of learnable hypergraph convolutions. To perform these convolutions, the SHRA Module uses our proposed structure-sharing hypergraph convolution (SHGCN), in which the hypergraphs from different groups are paired with the convolution weight matrices in a right-shifted permutation sequence. As a result, the weight matrices are shared across all groups of hypergraphs during convolution, allowing the module to exploit global information and take a deeper look into the input features. We evaluate the SHRA Module on object detection, lesion segmentation, and image classification tasks to demonstrate its effectiveness. Experimental results show that the SHRA Module significantly enhances model performance, surpassing classic attention modules.
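To illustrate the weight-sharing idea described above, the following is a minimal NumPy sketch of a grouped hypergraph convolution in which each group's hypergraph is paired with a right-shifted weight matrix, so every weight matrix is shared across all groups over the permutation cycle. All function names are hypothetical, and the normalization is a simplified form of the standard hypergraph convolution; this is an illustrative sketch under our own assumptions, not the paper's implementation.

```python
import numpy as np

def hypergraph_conv(X, H, Theta):
    """Simplified hypergraph convolution: aggregate node features over
    hyperedges via the incidence matrix H (nodes x hyperedges), normalize
    by edge/node degrees, then project with the weight matrix Theta.
    (Simplified from the usual symmetric normalization.)"""
    De = np.maximum(H.sum(axis=0), 1.0)   # hyperedge degrees
    Dv = np.maximum(H.sum(axis=1), 1.0)   # node degrees
    agg = (H / De) @ (H.T @ X)            # gather node features per hyperedge, scatter back
    return (agg / Dv[:, None]) @ Theta    # degree-normalize and project

def structure_sharing_conv(X_groups, H_groups, Thetas, shift=1):
    """Pair group g's hypergraph with weight matrix (g + shift) mod G,
    a right-shifted permutation, so the same set of weight matrices is
    shared across all hypergraph groups."""
    G = len(X_groups)
    return [hypergraph_conv(X_groups[g], H_groups[g], Thetas[(g + shift) % G])
            for g in range(G)]
```

In a trained module the shift would be applied at every layer, so after G layers each group's features have been transformed by every weight matrix, which is one way the shared weights can propagate global information across groups.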

Original language: English
Article number: 125240
Journal: Expert Systems with Applications
Volume: 259
Publication status: Published - 1 Jan 2025
