CARU: A Content-Adaptive Recurrent Unit for the Transition of Hidden State in NLP

Research output: Conference contribution · peer-reviewed

22 citations (Scopus)

Abstract

This article introduces a novel RNN unit inspired by GRU, namely the Content-Adaptive Recurrent Unit (CARU). The design of CARU retains all the features of GRU but requires fewer training parameters. We use the concept of weights in our design to analyze the transition of hidden states, and we describe how the content-adaptive gate handles incoming words and alleviates the long-term dependency problem. As a result, the unit improves accuracy in our experiments: CARU not only performs better than GRU but also trains faster. Moreover, the proposed unit is general and can be applied to all RNN-related neural network models.
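The abstract does not reproduce CARU's equations, but the idea it describes can be sketched as a GRU-like cell in which a content-adaptive gate, computed from the current word, modulates a standard update gate. The following is a minimal illustrative sketch in PyTorch under that assumption; the cell name, parameter shapes, and exact gate composition are this sketch's own choices and may differ from the paper's formulation.

```python
import torch
import torch.nn as nn

class CARUCell(nn.Module):
    """Illustrative content-adaptive recurrent cell (an assumption-based
    sketch, not the paper's exact equations)."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # Project the current word into the hidden space.
        self.proj = nn.Linear(input_size, hidden_size)
        # Candidate state from the previous hidden state; the projected
        # word is added in, saving one input-to-hidden matrix vs. GRU.
        self.cand = nn.Linear(hidden_size, hidden_size)
        # GRU-style update gate over the word and the previous state.
        self.gate = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        x_proj = self.proj(x)                      # word feature
        n = torch.tanh(self.cand(h) + x_proj)      # candidate state
        z = torch.sigmoid(self.gate(torch.cat([x, h], dim=-1)))
        # Content-adaptive gate: scale the update by how informative the
        # current word looks, so uninformative words barely move the state.
        l = torch.sigmoid(x_proj) * z
        return (1.0 - l) * h + l * n               # hidden-state transition

# Usage: step the cell over a sequence of word embeddings.
cell = CARUCell(input_size=32, hidden_size=64)
h = torch.zeros(8, 64)                             # batch of 8
for x_t in torch.randn(10, 8, 32):                 # 10 time steps
    h = cell(x_t, h)
```

Because the gate depends on the word content itself, a weak or filler word yields a small update, which is one plausible reading of how such a design could ease the long-term dependency problem while using fewer parameters than GRU.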

Original language: English
Title of host publication: Neural Information Processing - 27th International Conference, ICONIP 2020, Proceedings
Editors: Haiqin Yang, Kitsuchart Pasupa, Andrew Chi-Sing Leung, James T. Kwok, Jonathan H. Chan, Irwin King
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 693-703
Number of pages: 11
ISBN (Print): 9783030638290
DOIs
Publication status: Published - 2020
Event: 27th International Conference on Neural Information Processing, ICONIP 2020 - Bangkok, Thailand
Duration: 18 Nov 2020 → 22 Nov 2020

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12532 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 27th International Conference on Neural Information Processing, ICONIP 2020
Country/Territory: Thailand
City: Bangkok
Period: 18/11/20 → 22/11/20
