CRAN: A Hybrid CNN-RNN Attention-Based Model for Text Classification

Authors: Bin Cui, Dongxiang Zhang, Han Wang, Lei Wang, Long Guo

Tags: 2018, conceptual modeling

Text classification is one of the fundamental tasks in natural language processing. CNN-based and RNN-based approaches have shown different strengths in representing a piece of text. In this paper, we propose a hybrid CNN-RNN attention-based neural network, named CRAN, which effectively combines a convolutional neural network and a recurrent neural network with the help of an attention mechanism. We validate the proposed model on several large-scale datasets (eight multi-class and five multi-label text classification tasks) and compare it with state-of-the-art models. Experimental results show that CRAN achieves state-of-the-art performance on most of the datasets. In particular, CRAN yields better performance with far fewer parameters than a very deep convolutional network with 29 layers, which demonstrates its effectiveness and efficiency.
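To make the idea of combining the two branches concrete, the following is a minimal, hypothetical PyTorch sketch of a hybrid CNN-RNN classifier with attention. The layer sizes, the choice of a GRU, and the specific fusion (CNN features scoring attention weights over RNN hidden states) are illustrative assumptions, not the exact CRAN architecture described in the paper.

```python
# Hypothetical sketch: a hybrid CNN-RNN text classifier with attention.
# Layer sizes and the fusion scheme are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class HybridCnnRnnAttention(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128,
                 num_filters=128, kernel_size=3, num_classes=4):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # RNN branch: contextual representation of each token.
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True,
                          bidirectional=True)
        # CNN branch: local n-gram features used to score each position.
        self.conv = nn.Conv1d(embed_dim, num_filters, kernel_size,
                              padding=kernel_size // 2)
        self.attn_score = nn.Linear(num_filters, 1)
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        x = self.embedding(token_ids)                        # (B, T, E)
        rnn_out, _ = self.rnn(x)                             # (B, T, 2H)
        conv_out = F.relu(self.conv(x.transpose(1, 2)))      # (B, F, T)
        scores = self.attn_score(conv_out.transpose(1, 2))   # (B, T, 1)
        weights = torch.softmax(scores, dim=1)               # attention over positions
        doc_vec = (weights * rnn_out).sum(dim=1)             # weighted sum of RNN states
        return self.fc(doc_vec)                              # class logits

# Example usage with random token ids.
model = HybridCnnRnnAttention(vocab_size=10000)
logits = model(torch.randint(0, 10000, (2, 50)))
print(logits.shape)  # torch.Size([2, 4])
```

In this sketch the CNN branch provides position-wise attention scores while the RNN branch provides the token representations that are pooled; other fusion strategies (e.g., concatenating pooled CNN and RNN features) are equally plausible readings of a "hybrid CNN-RNN attention" design.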

Read the full paper here: https://link-springer-com.proxy2.hec.ca/chapter/10.1007/978-3-030-00847-5_42