
Local to global learning: Gradually adding classes for training deep neural networks

  • Hao Cheng
  • Dongze Lian
  • Bowen Deng
  • Shenghua Gao
  • Tao Tan
  • Yanlin Geng

Research output: Conference contribution › Peer-reviewed

13 citations (Scopus)

Abstract

We propose a new learning paradigm, Local to Global Learning (LGL), for Deep Neural Networks (DNNs) to improve performance on classification problems. The core of LGL is to train a DNN model gradually, from fewer categories (local) to more categories (global), within the entire training set. LGL is most closely related to the Self-Paced Learning (SPL) algorithm, but its formulation is different: SPL orders its training data from simple to complex, whereas LGL trains from local to global. In this paper, we incorporate the idea of LGL into the learning objective of DNNs and explain, from an information-theoretic perspective, why LGL works better. Experiments on toy data and on the CIFAR-10, CIFAR-100, and ImageNet datasets show that LGL outperforms the baseline and SPL-based algorithms.
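The gradual class-addition schedule described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: `lgl_stages` and its parameters are hypothetical names, and the actual method retrains the DNN at each stage on the selected subset.

```python
def lgl_stages(labels, num_classes, classes_per_stage):
    """Yield (active_classes, example_indices) for each LGL stage.

    At each stage the next block of classes is added to the active
    (local) set, and all training examples whose label is currently
    active are selected; the final stage covers all classes (global).
    """
    active = []
    for start in range(0, num_classes, classes_per_stage):
        # Grow the active class set by the next block of classes.
        active.extend(range(start, min(start + classes_per_stage, num_classes)))
        active_set = set(active)
        # Select every training example whose label is now active.
        idx = [i for i, y in enumerate(labels) if y in active_set]
        yield list(active), idx

# Toy example: 6 examples over 4 classes, adding 2 classes per stage.
labels = [0, 1, 2, 3, 0, 2]
for stage_classes, stage_idx in lgl_stages(labels, num_classes=4, classes_per_stage=2):
    print(stage_classes, stage_idx)
# Stage 1 trains on classes {0, 1} only; stage 2 trains on all four classes.
```

In practice each stage would fine-tune the network trained at the previous stage rather than restart from scratch, so earlier (local) knowledge is carried forward as classes are added.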

Original language: English
Title of host publication: Proceedings - 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2019
Publisher: IEEE Computer Society
Pages: 4743-4751
Number of pages: 9
ISBN (Electronic): 9781728132938
DOIs
Publication status: Published - Jun 2019
Published externally: Yes
Event: 32nd IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2019 - Long Beach, United States
Duration: 16 Jun 2019 → 20 Jun 2019

Publication series

Name: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
Volume: 2019-June
ISSN (Print): 1063-6919

Conference

Conference: 32nd IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2019
Country/Territory: United States
City: Long Beach
Period: 16/06/19 → 20/06/19
