textClassifier and Hierarchical-attention-networks-pytorch
These are independent implementations of the same paper (Hierarchical Attention Networks for Document Classification) that serve as alternative codebases for the same task: richliao/textClassifier offers a more feature-complete Keras package, while vietnh1009's version provides a simpler PyTorch reference implementation.
About textClassifier
richliao/textClassifier
Text classifier for Hierarchical Attention Networks for Document Classification
Implements three distinct architectures—hierarchical attention networks with word and sentence-level attention, CNNs with convolutional filters, and bidirectional LSTMs with attention mechanisms—all built on Keras. Supports interpretability by extracting attention weights to identify important words for predictions. Compatible with pre-trained GloVe embeddings and includes training pipelines on standard datasets like IMDB reviews.
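To make the attention mechanism concrete, here is a minimal sketch of one level of HAN-style attention in PyTorch: a bidirectional GRU encoder followed by additive attention pooling over time steps. All layer sizes and names here are illustrative assumptions, not code from either repository.

```python
import torch
import torch.nn as nn

class WordAttention(nn.Module):
    """One attention level of a hierarchical attention network:
    encode tokens with a bidirectional GRU, then pool the hidden
    states with a learned additive-attention context vector.
    (Sizes are illustrative; neither repo's exact configuration.)"""

    def __init__(self, embed_dim: int = 50, hidden_dim: int = 50):
        super().__init__()
        self.gru = nn.GRU(embed_dim, hidden_dim,
                          bidirectional=True, batch_first=True)
        self.proj = nn.Linear(2 * hidden_dim, 2 * hidden_dim)
        # Context vector u_w from the paper, expressed as a linear layer.
        self.context = nn.Linear(2 * hidden_dim, 1, bias=False)

    def forward(self, x: torch.Tensor):
        # x: (batch, seq_len, embed_dim)
        h, _ = self.gru(x)                               # (batch, seq_len, 2*hidden)
        u = torch.tanh(self.proj(h))                     # (batch, seq_len, 2*hidden)
        weights = torch.softmax(self.context(u), dim=1)  # (batch, seq_len, 1)
        # Weighted sum over time steps gives the sentence vector;
        # the weights themselves are what the repo extracts for interpretability.
        return (weights * h).sum(dim=1), weights.squeeze(-1)

sent_vec, attn = WordAttention()(torch.randn(2, 12, 50))
```

A full HAN stacks two of these: one over words to produce sentence vectors, and one over those sentence vectors to produce a document vector. The returned `attn` weights are exactly the quantity used to highlight important words in a prediction.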
About Hierarchical-attention-networks-pytorch
vietnh1009/Hierarchical-attention-networks-pytorch
Hierarchical Attention Networks for document classification
Implements two-level attention mechanisms at word and sentence levels to capture document structure, with GloVe word embeddings (50-300d) used to initialize the embedding layer instead of default random initialization. Built on PyTorch with early-stopping regularization and TensorBoard integration for training visualization. Includes a web demo interface and pre-trained models evaluated across eight datasets (including AG News, DBPedia, Yelp, Amazon, and Yahoo Answers), with configurable batch size, learning rate, and embedding dimensions.
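Initializing an embedding layer from pre-trained GloVe vectors follows a standard PyTorch pattern: build a weight matrix indexed by the vocabulary, copy in pre-trained rows where available, and leave the rest randomly initialized. The tiny in-memory `pretrained` dict below is a stand-in assumption for vectors parsed from a real GloVe file (e.g. a 50d text file); the vocabulary is likewise hypothetical.

```python
import torch
import torch.nn as nn

# Stand-in for vectors parsed from a GloVe file; in practice each line of
# the file maps a token to a 50/100/200/300-dim vector.
pretrained = {"movie": torch.randn(50), "great": torch.randn(50)}
vocab = ["<pad>", "<unk>", "movie", "great"]  # hypothetical vocabulary
embed_dim = 50

# Small random init for tokens without a pre-trained vector (e.g. <unk>).
weights = torch.randn(len(vocab), embed_dim) * 0.1
for i, tok in enumerate(vocab):
    if tok in pretrained:
        weights[i] = pretrained[tok]

# freeze=False lets the embeddings be fine-tuned during training;
# padding_idx=0 keeps the <pad> row out of gradient updates.
embedding = nn.Embedding.from_pretrained(weights, freeze=False, padding_idx=0)

ids = torch.tensor([[2, 3, 0]])  # "movie great <pad>"
out = embedding(ids)             # (1, 3, 50)
```

This is the generic recipe rather than the repo's exact loading code, but it captures the difference the description highlights: known tokens start from GloVe vectors instead of random values.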