Self-Adaptive Layer: An Application of Function Approximation Theory to Enhance Convergence Efficiency in Neural Networks

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

3 Citations (Scopus)

Abstract

Neural networks provide a general architecture for modelling complex nonlinear systems, but the source data are often contaminated with noise and interfering information. A common way to smooth over this issue during training is to increase the neuron or layer count. In this paper, a new self-adaptive layer is developed to overcome these problems, achieving faster convergence and avoiding local minima. We incorporate function approximation theory into the arrangement of the layer's elements, so that the training process and the network's approximation properties can be analysed via linear algebra, and the precision of adaptation can be controlled by the order of the polynomials used. Experimental results show that the proposed layer converges significantly faster and thereby greatly improves training accuracy. Moreover, the design and implementation can be easily deployed in most current systems.
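The abstract does not give the paper's exact layer construction, but its central idea, arranging layer elements by polynomial order so that fitting reduces to linear algebra, can be sketched with an orthogonal-polynomial expansion. The choice of the Chebyshev basis and the class/function names below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def chebyshev_features(x, order):
    """Expand each input into Chebyshev polynomials T_0..T_order.

    Uses the recurrence T_0(x) = 1, T_1(x) = x,
    T_{n+1}(x) = 2x * T_n(x) - T_{n-1}(x).
    Inputs are assumed to be scaled into [-1, 1].
    """
    feats = [np.ones_like(x), x]
    for _ in range(2, order + 1):
        feats.append(2 * x * feats[-1] - feats[-2])
    return np.stack(feats[: order + 1], axis=-1)  # shape (n, order + 1)

class PolynomialLayer:
    """Hypothetical layer: a linear map over an orthogonal-polynomial
    expansion of its input.

    Because the output is linear in the trainable weights, the
    least-squares fit has a closed form solvable with linear algebra,
    and the expansion order controls the approximation precision.
    """
    def __init__(self, order):
        self.order = order
        self.w = None

    def fit(self, x, y):
        phi = chebyshev_features(x, self.order)
        self.w, *_ = np.linalg.lstsq(phi, y, rcond=None)
        return self

    def predict(self, x):
        return chebyshev_features(x, self.order) @ self.w

# Example: approximate a smooth nonlinear target on [-1, 1].
x = np.linspace(-1.0, 1.0, 200)
y = np.sin(3.0 * x)
layer = PolynomialLayer(order=8).fit(x, y)
err = np.max(np.abs(layer.predict(x) - y))
```

Raising `order` tightens the fit, which is one concrete sense in which "the precision of adaptation can be controlled by the order of polynomials being used".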

Original language: English
Title of host publication: 34th International Conference on Information Networking, ICOIN 2020
Publisher: IEEE Computer Society
Pages: 447-452
Number of pages: 6
ISBN (Electronic): 9781728141985
Publication status: Published - Jan 2020
Event: 34th International Conference on Information Networking, ICOIN 2020, Barcelona, Spain
Duration: 7 Jan 2020 to 10 Jan 2020

Publication series

Name: International Conference on Information Networking
Volume: 2020-January
ISSN (Print): 1976-7684

Conference

Conference: 34th International Conference on Information Networking, ICOIN 2020
Country/Territory: Spain
City: Barcelona
Period: 7/01/20 to 10/01/20

Keywords

  • Function Approximation
  • Neural Network
  • Orthogonal Polynomial
  • Self-Adaptive
