CRNN: a joint neural network for redundancy detection

Bibliographic Details
Main Authors: Fu, Xinyu; Ch’ng, Eugene; Aickelin, Uwe; See, Simon
Format: Conference or Workshop Item
Published: 2017
Subjects: Logic gates; Training; Redundancy; Recurrent neural networks; Benchmark testing; Computational modeling
Online Access: https://eprints.nottingham.ac.uk/42463/
Repository: Nottingham Research Data Repository
Description: This article proposes a novel framework for detecting redundancy in supervised sentence categorisation. Unlike a traditional singleton neural network, our model combines a character-aware convolutional neural network (Char-CNN) with a character-aware recurrent neural network (Char-RNN) to form a convolutional recurrent neural network (CRNN). The model benefits from the Char-CNN in that only salient features are selected and fed into the integrated Char-RNN, which then learns long-sequence semantics through its gated update mechanism. We compare our framework against state-of-the-art text classification algorithms on four popular benchmark corpora. On the Google-news dataset, our model achieves competitive precision, recall, and F1 score. On the twenty-news-groups dataset, our algorithm obtains the best precision, recall, and F1 score. On the Brown Corpus, our framework obtains the best F1 score and near-equivalent precision and recall relative to the top competitor. On the question classification collection, CRNN produces the best recall and F1 score with comparable precision. We also analyse the impact of three different recurrent hidden cells on performance and runtime efficiency, observing that MGU achieves the best runtime with performance comparable to GRU and LSTM. For TFIDF-based algorithms, we experiment with word2vec, GloVe, and sent2vec embeddings and report their performance differences.
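The record gives no implementation details beyond the abstract, but the pipeline it describes — a Char-CNN selecting salient features that feed a gated Char-RNN — can be sketched roughly as follows. This is a minimal NumPy illustration with invented toy dimensions and random weights, using a Minimal Gated Unit (MGU) cell because the abstract highlights its runtime advantage over GRU and LSTM; none of the names, sizes, or weights come from the paper itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def char_cnn(x, filters, pool=2):
    """1-D convolution over a character-embedding sequence, ReLU,
    then non-overlapping max-pooling so only salient features survive."""
    T, _ = x.shape
    n_f, width, _ = filters.shape
    conv = np.array([[max(0.0, float(np.sum(filters[f] * x[t:t + width])))
                      for f in range(n_f)]
                     for t in range(T - width + 1)])
    return np.array([conv[i:i + pool].max(axis=0)
                     for i in range(0, len(conv) - pool + 1, pool)])

def mgu_step(h, x, Wf, Uf, bf, Wh, Uh, bh):
    """Minimal Gated Unit: a single forget gate does the work of both
    GRU gates, which is why MGU is the cheapest of the three cells."""
    f = sigmoid(Wf @ x + Uf @ h + bf)
    h_tilde = np.tanh(Wh @ x + Uh @ (f * h) + bh)
    return (1.0 - f) * h + f * h_tilde

# Toy dimensions (purely illustrative): 20 characters, 8-dim character
# embeddings, 6 conv filters of width 3, 5-dim recurrent state.
T, d, n_f, width, hdim = 20, 8, 6, 3, 5
x = rng.standard_normal((T, d))                 # character embeddings
filters = 0.1 * rng.standard_normal((n_f, width, d))

feats = char_cnn(x, filters)                    # Char-CNN stage
Wf = 0.1 * rng.standard_normal((hdim, n_f))
Wh = 0.1 * rng.standard_normal((hdim, n_f))
Uf = 0.1 * rng.standard_normal((hdim, hdim))
Uh = 0.1 * rng.standard_normal((hdim, hdim))
bf = np.zeros(hdim)
bh = np.zeros(hdim)

h = np.zeros(hdim)
for step in feats:                              # Char-RNN stage (MGU)
    h = mgu_step(h, step, Wf, Uf, bf, Wh, Uh, bh)

print(feats.shape, h.shape)                     # (9, 6) (5,)
```

In a full classifier the final state `h` would go through a softmax readout and the whole stack would be trained end-to-end; the sketch only shows how pooling shortens the sequence the recurrent cell must consume, which is the stated benefit of joining the two networks.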
Institution: University of Nottingham Malaysia Campus
Citation: Fu, Xinyu, Ch’ng, Eugene, Aickelin, Uwe and See, Simon (2017) CRNN: a joint neural network for redundancy detection. In: 3rd IEEE International Conference on Smart Computing (Smartcomp 2017), 29-31 May 2017, Hong Kong, China. Peer reviewed; deposited 2017-06-15.
DOI: 10.1109/SMARTCOMP.2017.7946996
IEEE Xplore: http://ieeexplore.ieee.org/document/7946996/