Hierarchical-attention-networks-pytorch and bert-han

These tools are **competitors**: both implement the Hierarchical Attention Network (HAN) architecture for document classification. "vietnh1009/Hierarchical-attention-networks-pytorch" is a pure PyTorch implementation built around GloVe word embeddings, while "Hazoom/bert-han" leverages BERT, so each offers a distinct approach to the same core task.

| Score | Hierarchical-attention-networks-pytorch | bert-han |
| --- | --- | --- |
| Maintenance | 0/25 | 0/25 |
| Adoption | 10/25 | 8/25 |
| Maturity | 8/25 | 16/25 |
| Community | 24/25 | 17/25 |
| Stat | Hierarchical-attention-networks-pytorch | bert-han |
| --- | --- | --- |
| Stars | 406 | 46 |
| Forks | 107 | 9 |
| Downloads | – | – |
| Commits (30d) | 0 | 0 |
| Language | Python | Python |
| License | None | Apache-2.0 |

Both repositories are flagged stale (6 months of inactivity), publish no package, and have no tracked dependents.

About Hierarchical-attention-networks-pytorch

vietnh1009/Hierarchical-attention-networks-pytorch

Hierarchical Attention Networks for document classification

Implements two-level attention mechanisms at the word and sentence levels to capture document structure, with GloVe word embeddings (50–300d) used to initialize the embedding layer instead of random initialization. Built on PyTorch with early-stopping regularization and TensorBoard integration for training visualization. Includes a web demo interface and pre-trained models evaluated across eight datasets (including AG News, DBPedia, Yelp, Amazon, and Yahoo Answers), with configurable batch size, learning rate, and embedding dimensions.
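
The paragraph above describes the standard two-level HAN recipe. The sketch below shows that recipe in plain PyTorch; the `AttentionPool` and `HANSketch` names, layer sizes, and the GloVe-initialization hook are illustrative assumptions, not the repository's actual module names or defaults.

```python
# Minimal sketch of a two-level (word -> sentence) attention network in PyTorch.
# Names and dimensions are illustrative, not taken from the repository.
import torch
import torch.nn as nn


class AttentionPool(nn.Module):
    """Additive attention that collapses a sequence of hidden states into one vector."""

    def __init__(self, hidden_dim):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, hidden_dim)
        self.context = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, states):                                 # states: (batch, seq, hidden)
        scores = self.context(torch.tanh(self.proj(states)))   # (batch, seq, 1)
        weights = torch.softmax(scores, dim=1)
        return (weights * states).sum(dim=1)                   # (batch, hidden)


class HANSketch(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden=50, num_classes=5,
                 pretrained_embeddings=None):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        if pretrained_embeddings is not None:                  # e.g. GloVe vectors instead of random init
            self.embedding.weight.data.copy_(pretrained_embeddings)
        self.word_gru = nn.GRU(embed_dim, hidden, bidirectional=True, batch_first=True)
        self.word_attn = AttentionPool(2 * hidden)
        self.sent_gru = nn.GRU(2 * hidden, hidden, bidirectional=True, batch_first=True)
        self.sent_attn = AttentionPool(2 * hidden)
        self.classifier = nn.Linear(2 * hidden, num_classes)

    def forward(self, docs):                                   # docs: (batch, n_sents, n_words) token ids
        batch, n_sents, n_words = docs.shape
        words = self.embedding(docs.view(batch * n_sents, n_words))
        word_states, _ = self.word_gru(words)
        sent_vecs = self.word_attn(word_states).view(batch, n_sents, -1)
        sent_states, _ = self.sent_gru(sent_vecs)
        doc_vec = self.sent_attn(sent_states)
        return self.classifier(doc_vec)                        # unnormalized class logits
```

A batch of documents enters as a padded `(batch, sentences, words)` tensor of token ids: word-level attention pools each sentence into a vector, and sentence-level attention pools those vectors into a single document representation fed to the classifier.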

About bert-han

Hazoom/bert-han

Hierarchical-Attention-Network
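
The repository description is terse, but the comparison above notes that bert-han leverages BERT within the HAN architecture. As a rough illustration of that idea only (a hedged sketch, not Hazoom/bert-han's actual code), the snippet below encodes each sentence with a pretrained BERT model and applies sentence-level attention over the resulting [CLS] vectors.

```python
# Hedged sketch of how BERT could replace the word-level encoder in a HAN:
# each sentence is encoded with BERT, and sentence-level attention pools the
# sentence vectors into a document representation. Illustration only.
import torch
import torch.nn as nn
from transformers import AutoModel


class BertHANSketch(nn.Module):
    def __init__(self, num_classes, model_name="bert-base-uncased", hidden=768):
        super().__init__()
        self.bert = AutoModel.from_pretrained(model_name)    # word-level encoder
        self.proj = nn.Linear(hidden, hidden)
        self.context = nn.Linear(hidden, 1, bias=False)      # sentence-level attention
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        # input_ids / attention_mask: (batch, n_sents, n_tokens)
        batch, n_sents, n_tokens = input_ids.shape
        flat_ids = input_ids.view(batch * n_sents, n_tokens)
        flat_mask = attention_mask.view(batch * n_sents, n_tokens)
        # The [CLS] vector of each sentence serves as its sentence embedding.
        cls = self.bert(input_ids=flat_ids, attention_mask=flat_mask).last_hidden_state[:, 0]
        sents = cls.view(batch, n_sents, -1)
        scores = self.context(torch.tanh(self.proj(sents)))  # (batch, n_sents, 1)
        weights = torch.softmax(scores, dim=1)
        doc_vec = (weights * sents).sum(dim=1)
        return self.classifier(doc_vec)
```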

Scores updated daily from GitHub, PyPI, and npm data.