TY - GEN
T1 - Knowledge-Supervised Learning
T2 - 29th ACM International Conference on Multimedia, MM 2021
AU - Wang, Li
AU - Fan, Baoyu
AU - Guo, Zhenhua
AU - Zhao, Yaqian
AU - Zhang, Runze
AU - Li, Rengang
AU - Gong, Weifeng
AU - Wang, Endong
N1 - Publisher Copyright:
© 2021 ACM.
PY - 2021/10/17
Y1 - 2021/10/17
N2 - The consensus of multiple views on the same data provides extra regularization, thereby improving accuracy. Based on this idea, we propose a novel Knowledge-Supervised Learning (KSL) method for person re-identification (Re-ID), which improves performance without introducing extra inference cost. First, we introduce an isomorphic auxiliary training strategy that constructs multiple basic views by simultaneously training multiple classifier heads of the same network on the same training data. The consensus constraints aim to maximize the agreement among these views. To impose this regularization, we draw inspiration from knowledge distillation: paired branches can be trained collaboratively through mutual imitation learning. Three novel constraint losses are proposed to distill the knowledge transferred across branches: similarity of predicted classification probabilities for cosine-space constraints, distance between embedding features for Euclidean-space constraints, and hard-sample mutual mining for hard-sample-space constraints. These losses complement each other from different perspectives. Experiments on four mainstream Re-ID datasets show that a standard model trained from scratch with the KSL method outperforms its ImageNet pre-trained counterpart by a clear margin. With the KSL method, a lightweight model without ImageNet pre-training outperforms most large models. We expect these findings to shift some attention from the current de facto "pre-training and fine-tuning" paradigm in the Re-ID task toward knowledge discovery during model training.
AB - The consensus of multiple views on the same data provides extra regularization, thereby improving accuracy. Based on this idea, we propose a novel Knowledge-Supervised Learning (KSL) method for person re-identification (Re-ID), which improves performance without introducing extra inference cost. First, we introduce an isomorphic auxiliary training strategy that constructs multiple basic views by simultaneously training multiple classifier heads of the same network on the same training data. The consensus constraints aim to maximize the agreement among these views. To impose this regularization, we draw inspiration from knowledge distillation: paired branches can be trained collaboratively through mutual imitation learning. Three novel constraint losses are proposed to distill the knowledge transferred across branches: similarity of predicted classification probabilities for cosine-space constraints, distance between embedding features for Euclidean-space constraints, and hard-sample mutual mining for hard-sample-space constraints. These losses complement each other from different perspectives. Experiments on four mainstream Re-ID datasets show that a standard model trained from scratch with the KSL method outperforms its ImageNet pre-trained counterpart by a clear margin. With the KSL method, a lightweight model without ImageNet pre-training outperforms most large models. We expect these findings to shift some attention from the current de facto "pre-training and fine-tuning" paradigm in the Re-ID task toward knowledge discovery during model training.
KW - consensus constraints
KW - isomorphic auxiliary training
KW - knowledge distillation
KW - person retrieval
UR - http://www.scopus.com/inward/record.url?scp=85119366369&partnerID=8YFLogxK
U2 - 10.1145/3474085.3475340
DO - 10.1145/3474085.3475340
M3 - Conference contribution
AN - SCOPUS:85119366369
T3 - MM 2021 - Proceedings of the 29th ACM International Conference on Multimedia
SP - 1866
EP - 1874
BT - MM 2021 - Proceedings of the 29th ACM International Conference on Multimedia
PB - Association for Computing Machinery, Inc
Y2 - 20 October 2021 through 24 October 2021
ER -