TY - JOUR
T1 - Mutual sample-center interaction with hard queue mining for face recognition
AU - Chen, Yongqiang
AU - Li, Jianqing
AU - Yuan, Xiaochen
AU - Yang, Guanghua
AU - Li, Xiaofan
AU - Gong, Xueyuan
N1 - Publisher Copyright:
© 2026 Elsevier Inc.
PY - 2026/7/5
Y1 - 2026/7/5
N2 - In large-scale face recognition datasets, the huge number of classes can lead to storage demands for the fully connected (FC) layer parameters that exceed even those of the backbone network, causing an Out of Memory (OOM) error when training on WebFace21M. Consequently, mainstream FC-based methods become impractical when training models with limited computational resources. Existing approaches address this issue by utilizing only a subset of negative classes during training. However, a key drawback of these methods lies in their handling of backpropagation: while FC-based approaches involve all negative classes in gradient updates, the partial-negative-class strategy engages only a subset, which weakens the quality of inter-class interactions. To address this limitation, we present MutualFace. Our approach simulates the mutual interaction between the positive class center W_yi and its corresponding feature x_i that is observed in FC-based approaches, where W_yi and x_i mutually update each other during backpropagation. Critically, we maintain a dynamic repository of hard negative class centers per class, exposing the model to a wider variety of challenging negative instances during training. Our MutualFace framework demonstrates comprehensive superiority over other methods, achieving improved performance on both the large-scale IJBC test set and multiple smaller benchmarks. The code is available at https://github.com/isBoMula/MutualFace.
AB - In large-scale face recognition datasets, the huge number of classes can lead to storage demands for the fully connected (FC) layer parameters that exceed even those of the backbone network, causing an Out of Memory (OOM) error when training on WebFace21M. Consequently, mainstream FC-based methods become impractical when training models with limited computational resources. Existing approaches address this issue by utilizing only a subset of negative classes during training. However, a key drawback of these methods lies in their handling of backpropagation: while FC-based approaches involve all negative classes in gradient updates, the partial-negative-class strategy engages only a subset, which weakens the quality of inter-class interactions. To address this limitation, we present MutualFace. Our approach simulates the mutual interaction between the positive class center W_yi and its corresponding feature x_i that is observed in FC-based approaches, where W_yi and x_i mutually update each other during backpropagation. Critically, we maintain a dynamic repository of hard negative class centers per class, exposing the model to a wider variety of challenging negative instances during training. Our MutualFace framework demonstrates comprehensive superiority over other methods, achieving improved performance on both the large-scale IJBC test set and multiple smaller benchmarks. The code is available at https://github.com/isBoMula/MutualFace.
KW - Deep neural networks
KW - Fully connected layer
KW - Inter-class interaction
KW - Intra-class compactness
KW - Large-scale face recognition
UR - https://www.scopus.com/pages/publications/105032513480
U2 - 10.1016/j.ins.2026.123352
DO - 10.1016/j.ins.2026.123352
M3 - Article
AN - SCOPUS:105032513480
SN - 0020-0255
VL - 743
JO - Information Sciences
JF - Information Sciences
M1 - 123352
ER -