text-classification-cnn-rnn and text_rnn_attention
These two projects are alternatives for the same task: both implement neural network classifiers for Chinese text, the first with character-level CNN and RNN models, the second with a Word2vec-initialized RNN plus an attention layer.
About text-classification-cnn-rnn
gaussic/text-classification-cnn-rnn
CNN and RNN for Chinese text classification, based on TensorFlow
Implements character-level CNN and RNN architectures using TensorFlow 1.3+ with Conv1D operations and multi-layer GRU/LSTM cells for sequence modeling. Provides complete preprocessing pipeline including vocabulary building, fixed-length sequence padding (600 characters), and batch iteration with shuffling for the THUCNews dataset (10 categories, 65K training samples). Achieves 96%+ test accuracy on Chinese news classification with detailed evaluation metrics including per-category precision/recall and confusion matrices.
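The preprocessing pipeline described above (character-level vocabulary building, fixed-length padding to 600 characters, and shuffled batch iteration) can be sketched roughly as follows. This is a minimal illustration, not the project's actual code; function names and the pad-index convention are assumptions.

```python
import numpy as np
from collections import Counter

def build_vocab(texts, vocab_size=5000):
    # Character-level: count individual characters across the corpus
    counter = Counter(ch for text in texts for ch in text)
    # Reserve index 0 for the padding token (assumed convention)
    chars = ['<PAD>'] + [ch for ch, _ in counter.most_common(vocab_size - 1)]
    return {ch: i for i, ch in enumerate(chars)}

def encode_and_pad(text, vocab, max_len=600):
    # Map known characters to ids, truncate, then pad to a fixed length
    ids = [vocab[ch] for ch in text[:max_len] if ch in vocab]
    return ids + [0] * (max_len - len(ids))

def batch_iter(x, y, batch_size=64, shuffle=True):
    # Yield shuffled mini-batches once per epoch
    n = len(x)
    idx = np.random.permutation(n) if shuffle else np.arange(n)
    for start in range(0, n, batch_size):
        sel = idx[start:start + batch_size]
        yield x[sel], y[sel]
```

The fixed 600-character length lets every batch be a dense integer matrix that feeds directly into an embedding lookup before the Conv1D or GRU/LSTM layers.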
About text_rnn_attention
cjymz886/text_rnn_attention
RNN + attention Chinese text classification with embedded Word2vec word vectors
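The attention mechanism this project adds on top of the RNN typically scores each timestep's hidden state and pools them into a single weighted sum. A minimal NumPy sketch of that pooling step, assuming a standard additive-attention formulation (the parameter names `w`, `b`, `u` are illustrative, not taken from the repository):

```python
import numpy as np

def attention_pool(hidden_states, w, b, u):
    # hidden_states: (T, H) RNN outputs over T timesteps
    # w: (H, A), b: (A,), u: (A,) learned attention parameters
    scores = np.tanh(hidden_states @ w + b) @ u   # one score per timestep, (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                      # softmax attention weights
    return weights @ hidden_states                # (H,) pooled sentence vector
```

The pooled vector replaces the last-hidden-state summary a plain RNN classifier would use, letting the model weight informative characters or words more heavily.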
Related comparisons