CLG: Contrastive Label Generation with Knowledge for Few-Shot Learning

Han Ma, Baoyu Fan, Benjamin K. Ng, Chan Tong Lam

Research output: Article, peer-reviewed

Abstract

Training large-scale models requires big data. However, few-shot problems are hard to solve because the training data are inadequate. In application scenarios where big data cannot be used owing to cost and resource constraints, it is valuable to perform a task with only a few training samples. To tackle this problem, we present a simple and efficient method: contrastive label generation with knowledge for few-shot learning (CLG). Specifically, we (1) propose contrastive label generation to align labels with the input data and enhance feature representations; (2) propose a label knowledge filter to avoid noise when injecting explicit knowledge into the data and labels; (3) employ a label logits mask to simplify the task; and (4) employ a multi-task fusion loss to learn different perspectives of the training set. Experiments demonstrate that CLG achieves an accuracy of 59.237%, about 3% higher than the best baseline. This shows that CLG learns better features and provides the model with more information about the input sentences, improving its classification ability.
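To make the abstract's components (1) and (4) concrete, here is a minimal sketch of how a contrastive label-alignment loss can be fused with a standard classification loss. This is an illustrative InfoNCE-style formulation, not the paper's actual implementation; the function names, the `temperature` value, and the `alpha` weighting are all hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def contrastive_label_loss(sent_emb, label_emb, temperature=0.1):
    """InfoNCE-style loss aligning each sentence embedding with its own
    label embedding; the other labels in the batch act as negatives."""
    # Normalize to unit length so the dot product is cosine similarity.
    s = sent_emb / np.linalg.norm(sent_emb, axis=1, keepdims=True)
    t = label_emb / np.linalg.norm(label_emb, axis=1, keepdims=True)
    logits = s @ t.T / temperature        # (batch, batch) similarity matrix
    probs = softmax(logits, axis=1)
    # Each sentence's matching label sits on the diagonal.
    return -np.mean(np.log(np.diag(probs) + 1e-12))

def fusion_loss(class_logits, targets, sent_emb, label_emb, alpha=0.5):
    """Multi-task fusion: weighted sum of cross-entropy classification
    loss and the contrastive alignment loss (hypothetical weighting)."""
    probs = softmax(class_logits, axis=1)
    ce = -np.mean(np.log(probs[np.arange(len(targets)), targets] + 1e-12))
    return alpha * ce + (1 - alpha) * contrastive_label_loss(sent_emb, label_emb)
```

Under this sketch, correctly paired sentence/label embeddings yield a lower contrastive term than mismatched pairs, which is the alignment pressure the abstract's point (1) describes.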

Original language: English
Article number: 472
Journal: Mathematics
Volume: 12
Issue number: 3
DOIs
Publication status: Published - Feb 2024
