Abstract
Training large-scale models requires big data; in few-shot settings, however, the available training data are inadequate, which makes the problem difficult to resolve. Since collecting big data for every application scenario is costly and resource-intensive, it is valuable to perform a task using only a few training samples. To tackle this problem, we present a simple and efficient method: contrastive label generation with knowledge for few-shot learning (CLG). Specifically, we (1) propose contrastive label generation to align labels with the input data and enhance feature representations; (2) propose a label knowledge filter to avoid noise while injecting explicit knowledge into the data and labels; (3) employ a label logits mask to simplify the task; and (4) employ a multi-task fusion loss to learn different perspectives from the training set. The experiments demonstrate that CLG achieves an accuracy of 59.237%, about 3% higher than the best baseline. This shows that CLG obtains better features and gives the model more information about the input sentences, improving its classification ability.
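The abstract mentions a label logits mask used to simplify the task. The paper's implementation is not given here, but such a mask conventionally restricts a model's output distribution to the candidate label tokens. A minimal NumPy sketch of that idea, with hypothetical logits and candidate ids (not taken from the paper):

```python
import numpy as np

def label_logits_mask(logits, candidate_ids):
    """Keep only candidate label positions; all other positions are
    set to -inf so that softmax assigns them zero probability."""
    masked = np.full_like(logits, -np.inf)
    masked[candidate_ids] = logits[candidate_ids]
    return masked

def softmax(x):
    # Numerically stable softmax; exp(-inf) evaluates to 0.
    x = x - np.max(x)
    e = np.exp(x)
    return e / e.sum()

# Hypothetical vocabulary logits and label-token ids for illustration.
vocab_logits = np.array([2.0, 0.5, -1.0, 3.0, 0.0])
candidates = [0, 3]

probs = softmax(label_logits_mask(vocab_logits, candidates))
# All probability mass now falls on positions 0 and 3.
```

The design choice is that masking shrinks the effective output space from the full vocabulary to the label set, which is what "simplifies the task" in the few-shot setting.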
| Original language | English |
| --- | --- |
| Article number | 472 |
| Journal | Mathematics |
| Volume | 12 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - February 2024 |
Press/Media
Study Data from Faculty of Applied Sciences Provide New Insights into Mathematics (CLG: Contrastive Label Generation with Knowledge for Few-Shot Learning)
CHAN TONG LAM & HAN MA
15/02/24