Variable-Depth Convolutional Neural Network for Text Classification

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

11 Citations (Scopus)


This article introduces a recurrent-CNN framework for classifying natural-language text of arbitrary length. The model combines a complete CNN design with a recurrent structure that captures as much contextual information as possible when learning sentence representations, so it accepts arbitrary-length sentences and analyzes complete sentences more flexibly than traditional CNN-based networks. In addition, the model greatly reduces the number of layers in the architecture and requires fewer training parameters, which lowers memory consumption, and it reaches $$O(\log n)$$ time complexity. As a result, the model achieves improved training accuracy, and the design can be readily deployed in existing text classification systems.
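The abstract's claim that the model handles arbitrary-length sentences rests on a standard property of convolution plus pooling over time: the pooled feature vector has a fixed size regardless of sequence length. The minimal numpy sketch below illustrates that property only; it is not the authors' variable-depth architecture, and the function and parameter names are illustrative assumptions.

```python
import numpy as np

def conv1d_maxpool(embeddings, filters):
    """Slide each filter over the token embeddings and max-pool over time.

    embeddings: (seq_len, emb_dim) array; seq_len may vary per sentence.
    filters: (num_filters, window, emb_dim) array of learned weights.
    Returns a (num_filters,) feature vector whose size is independent of
    seq_len, which is what lets a downstream classifier accept
    arbitrary-length input.
    """
    seq_len, emb_dim = embeddings.shape
    num_filters, window, _ = filters.shape
    feats = np.full(num_filters, -np.inf)
    for i in range(seq_len - window + 1):
        patch = embeddings[i:i + window]            # (window, emb_dim)
        # Dot each filter with the current window of embeddings.
        scores = np.tensordot(filters, patch, axes=([1, 2], [0, 1]))
        feats = np.maximum(feats, scores)           # max-pool over positions
    return feats

# A short and a long sentence map to feature vectors of the same shape.
rng = np.random.default_rng(0)
filters = rng.normal(size=(4, 3, 8))                # 4 filters, window 3
short_feats = conv1d_maxpool(rng.normal(size=(5, 8)), filters)
long_feats = conv1d_maxpool(rng.normal(size=(40, 8)), filters)
assert short_feats.shape == long_feats.shape == (4,)
```

In the paper's setting the recurrent structure additionally carries context across the sentence before pooling; this sketch shows only the length-invariance half of the argument.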

Original language: English
Title of host publication: Neural Information Processing - 27th International Conference, ICONIP 2020, Proceedings
Editors: Haiqin Yang, Kitsuchart Pasupa, Andrew Chi-Sing Leung, James T. Kwok, Jonathan H. Chan, Irwin King
Publisher: Springer Science and Business Media Deutschland GmbH
Number of pages: 8
ISBN (Print): 9783030638221
Publication status: Published - 2020
Event: 27th International Conference on Neural Information Processing, ICONIP 2020 - Bangkok, Thailand
Duration: 18 Nov 2020 – 22 Nov 2020

Publication series

Name: Communications in Computer and Information Science
ISSN (Print): 1865-0929
ISSN (Electronic): 1865-0937


Conference: 27th International Conference on Neural Information Processing, ICONIP 2020


Keywords:

  • Convolutional neural network
  • Machine learning
  • Recurrent
  • Text classification


