Applying and Optimizing NLP Model with CARU

Ka Hou Chan, Sio Kei Im, Giovanni Pau

Research output: Conference contribution, peer-reviewed

6 Citations (Scopus)

Abstract

RNN-based language models can address the sparsity and high-dimensional feature problems of traditional N-gram models. However, due to overfitting and vanishing gradients, the original RNN still struggles to capture long-term content dependencies and is susceptible to noise interference. This paper proposes an improved context-word-vector method for an RNN with CARU. To alleviate the overfitting problem, a modified DropConnect layer is employed in the proposed model. In addition, a multilayer CARU adds contextual word vectors to the model at the feature layer, strengthening its ability to learn long-distance information during training. Experimental results show that the proposed method effectively improves the performance of the RNN-based language model.
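The abstract does not give the exact CARU equations or the details of the modified DropConnect layer, but a minimal PyTorch sketch of the two ideas it describes is shown below: a DropConnect-style mask applied to recurrent weights, and a CARU-like cell whose update gate is modulated by a current-word feature. The class names (DropConnectLinear, CARUCell), the drop_prob parameter, and the gate equations are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DropConnectLinear(nn.Linear):
    """Linear layer that randomly zeroes individual weights during training
    (a DropConnect-style regularizer; the paper's modified layer may differ)."""

    def __init__(self, in_features, out_features, drop_prob=0.5):
        super().__init__(in_features, out_features)
        self.drop_prob = drop_prob

    def forward(self, x):
        weight = self.weight
        if self.training and self.drop_prob > 0:
            # Sample a fresh binary mask over individual weights and rescale.
            mask = (torch.rand_like(weight) > self.drop_prob).float()
            weight = weight * mask / (1.0 - self.drop_prob)
        return F.linear(x, weight, self.bias)


class CARUCell(nn.Module):
    """Illustrative content-adaptive recurrent cell: the update gate is scaled
    by a word-level feature so the current word decides how much of the
    candidate state replaces the previous hidden state (assumed formulation)."""

    def __init__(self, input_size, hidden_size, drop_prob=0.5):
        super().__init__()
        self.x_proj = nn.Linear(input_size, hidden_size)                 # current-word feature
        self.h_proj = DropConnectLinear(hidden_size, hidden_size, drop_prob)
        self.gate = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, x, h):
        x_feat = self.x_proj(x)                                          # word-level feature
        n = torch.tanh(self.h_proj(h) + x_feat)                          # candidate state
        z = torch.sigmoid(self.gate(torch.cat([x, h], dim=-1)))          # update gate
        l = torch.sigmoid(x_feat) * z                                    # content-adaptive gate
        return (1.0 - l) * h + l * n                                     # new hidden state


# Usage sketch: stack two cells over an embedded token sequence (multilayer CARU).
if __name__ == "__main__":
    batch, steps, emb, hid = 4, 10, 32, 64
    embed = nn.Embedding(1000, emb)
    layers = nn.ModuleList([CARUCell(emb, hid), CARUCell(hid, hid)])
    tokens = torch.randint(0, 1000, (batch, steps))
    states = [torch.zeros(batch, hid) for _ in layers]
    for t in range(steps):
        inp = embed(tokens[:, t])
        for i, cell in enumerate(layers):
            states[i] = cell(inp, states[i])
            inp = states[i]
    print(inp.shape)  # torch.Size([4, 64])
```

In this sketch the stacked cells stand in for the multilayer CARU described in the abstract; how the contextual word vectors are injected at the feature layer is not specified here and would follow the paper's design.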

Original language: English
Title of host publication: 8th International Conference on Advanced Computing and Communication Systems, ICACCS 2022
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1018-1022
Number of pages: 5
ISBN (Electronic): 9781665408165
DOIs
Publication status: Published - 2022
Event: 8th International Conference on Advanced Computing and Communication Systems, ICACCS 2022 - Coimbatore, India
Duration: 25 Mar 2022 - 26 Mar 2022

Publication series

Name: 8th International Conference on Advanced Computing and Communication Systems, ICACCS 2022

Conference

Conference: 8th International Conference on Advanced Computing and Communication Systems, ICACCS 2022
Country/Territory: India
City: Coimbatore
Period: 25/03/22 - 26/03/22
