An Investigation of Multilayer RNNs in Sentiment Analysis

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Recurrent Neural Networks (RNNs) are among the most powerful deep learning architectures and are commonly used to process sequential input features, such as video sequences and natural sentences. They perform well on Natural Language Processing (NLP) tasks such as sentiment analysis. RNN models have the advantage of receiving data recurrently and extracting the main information from the feature encodings of previous time steps. In this work, four types of RNN units are analysed: the Linear RNN, Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), and Content-Adaptive Recurrent Unit (CARU). The implementation of all these units in multilayer RNN architectures is investigated, and their performance is evaluated on two benchmark sentiment analysis datasets: IMDB and SST2. The complete source code and experimental results are also provided for future study.
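The abstract describes stacking recurrent units into multilayer RNNs for sentiment classification. As an illustration only (not the authors' implementation), the sketch below assumes PyTorch and shows a multilayer RNN classifier with a selectable recurrent unit; LSTM, GRU, and the plain (linear) RNN are built-in PyTorch modules, while CARU is not and is therefore omitted here.

```python
# Minimal sketch (assumed PyTorch, not the paper's source code):
# a multilayer RNN sentiment classifier with a selectable recurrent unit.
import torch
import torch.nn as nn

class SentimentRNN(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256,
                 num_layers=2, unit="lstm", num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Choose the recurrent unit; CARU would require a custom module.
        rnn_cls = {"rnn": nn.RNN, "lstm": nn.LSTM, "gru": nn.GRU}[unit]
        # Multilayer RNN: num_layers stacked recurrent layers.
        self.rnn = rnn_cls(embed_dim, hidden_dim, num_layers=num_layers,
                           batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer-encoded sentences.
        embedded = self.embedding(token_ids)
        outputs, _ = self.rnn(embedded)
        # Classify from the feature encoding of the last time step.
        return self.classifier(outputs[:, -1, :])

# Example usage: a 2-layer GRU classifier over a 20k-word vocabulary.
model = SentimentRNN(vocab_size=20000, unit="gru")
logits = model(torch.randint(0, 20000, (4, 50)))  # batch of 4 sequences
```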

Original language: English
Title of host publication: Proceedings - 2023 3rd International Conference on Engineering Education and Information Technology, EEIT 2023
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 46-50
Number of pages: 5
ISBN (Electronic): 9798350326840
DOIs
Publication status: Published - 2023
Event: 3rd International Conference on Engineering Education and Information Technology, EEIT 2023 - Nanjing, China
Duration: 17 May 2023 to 19 May 2023

Publication series

Name: Proceedings - 2023 3rd International Conference on Engineering Education and Information Technology, EEIT 2023

Conference

Conference: 3rd International Conference on Engineering Education and Information Technology, EEIT 2023
Country/Territory: China
City: Nanjing
Period: 17/05/23 to 19/05/23

Keywords

  • CARU
  • GRU
  • LSTM
  • NLP
  • RNN
  • Sentiment Analysis
