TY - JOUR
T1 - The Structure-sharing Hypergraph Reasoning Attention Module for CNNs
AU - Wang, Jingchao
AU - Huang, Guoheng
AU - Yuan, Xiaochen
AU - Zhong, Guo
AU - Lin, Tongxu
AU - Pun, Chi Man
AU - Xie, Fenfang
N1 - Publisher Copyright:
© 2024 Elsevier Ltd
PY - 2025/1/1
Y1 - 2025/1/1
N2 - Attention mechanisms improve model performance by selectively processing relevant information. However, existing attention mechanisms for CNNs do not exploit the high-order semantic similarity between different channels of the input when inferring attention. To address this issue, in this paper we propose the Structure-sharing Hypergraph Reasoning Attention Module (SHRA Module), which explores the high-order similarity among nodes via hypergraph learning. The SHRA Module transforms the input CNN feature maps into hypergraph node representations, from which attention is reasoned through a set of learnable hypergraph convolutions. These convolutions are performed with our proposed structure-sharing hypergraph convolution (SHGCN), in which the hypergraphs from different groups and the weight matrices of the hypergraph convolutions are combined following a right-shifted permutation of the hypergraph sequence. As a result, the weight matrices are shared across all groups of hypergraphs during hypergraph convolution, allowing the module to exploit global information for a deeper analysis of the input features. We evaluate the SHRA Module with models on object detection, lesion segmentation, and image classification tasks to demonstrate its effectiveness. Experimental results show that the SHRA Module significantly enhances model performance, surpassing classic attention modules.
AB - Attention mechanisms improve model performance by selectively processing relevant information. However, existing attention mechanisms for CNNs do not exploit the high-order semantic similarity between different channels of the input when inferring attention. To address this issue, in this paper we propose the Structure-sharing Hypergraph Reasoning Attention Module (SHRA Module), which explores the high-order similarity among nodes via hypergraph learning. The SHRA Module transforms the input CNN feature maps into hypergraph node representations, from which attention is reasoned through a set of learnable hypergraph convolutions. These convolutions are performed with our proposed structure-sharing hypergraph convolution (SHGCN), in which the hypergraphs from different groups and the weight matrices of the hypergraph convolutions are combined following a right-shifted permutation of the hypergraph sequence. As a result, the weight matrices are shared across all groups of hypergraphs during hypergraph convolution, allowing the module to exploit global information for a deeper analysis of the input features. We evaluate the SHRA Module with models on object detection, lesion segmentation, and image classification tasks to demonstrate its effectiveness. Experimental results show that the SHRA Module significantly enhances model performance, surpassing classic attention modules.
KW - Attention mechanism
KW - Hypergraph
KW - Structure-sharing
UR - http://www.scopus.com/inward/record.url?scp=85202935276&partnerID=8YFLogxK
U2 - 10.1016/j.eswa.2024.125240
DO - 10.1016/j.eswa.2024.125240
M3 - Article
AN - SCOPUS:85202935276
SN - 0957-4174
VL - 259
JO - Expert Systems with Applications
JF - Expert Systems with Applications
M1 - 125240
ER -