transformer-abstractive-summarization and Text-Summarization

These are competitors offering overlapping functionality: both implement transformer-based abstractive summarization. transformer-abstractive-summarization focuses exclusively on the abstractive approach, while Text-Summarization adds extractive methods as an alternative summarization strategy.

                 transformer-abstractive-summarization   Text-Summarization
Maintenance      0/25                                    0/25
Adoption         10/25                                   9/25
Maturity         16/25                                   16/25
Community        22/25                                   10/25
Stars            168                                     86
Forks            47                                      7
Downloads
Commits (30d)    0                                       0
Language         Jupyter Notebook                        Jupyter Notebook
License          Apache-2.0                              MIT
Status           Stale 6m, No Package, No Dependents     Stale 6m, No Package, No Dependents

About transformer-abstractive-summarization

rojagtap/transformer-abstractive-summarization

Abstractive Text Summarization using Transformer

Implements the multi-head self-attention architecture from "Attention is All You Need" with encoder-decoder stacks for sequence-to-sequence summarization. Trained on the Inshorts news dataset to generate abstractive summaries by learning compressed semantic representations rather than extracting existing sentences. Includes detailed blog walkthroughs covering transformer mechanics and training procedures for reproducibility.
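The multi-head self-attention mechanism described above can be sketched in a few lines of NumPy. This is an illustrative toy with random, untrained weights, not the repository's code; function and variable names here are invented for the example:

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(x, num_heads, seed=0):
    """Scaled dot-product multi-head self-attention over x of shape
    (seq_len, d_model). Projection weights are random here purely for
    illustration; a trained transformer learns them."""
    seq_len, d_model = x.shape
    assert d_model % num_heads == 0
    d_head = d_model // num_heads
    rng = np.random.default_rng(seed)
    w_q, w_k, w_v, w_o = (0.1 * rng.standard_normal((d_model, d_model))
                          for _ in range(4))

    def project_and_split(w):
        # Project, then split into heads: (num_heads, seq_len, d_head).
        return (x @ w).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = project_and_split(w_q), project_and_split(w_k), project_and_split(w_v)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)   # (heads, seq, seq)
    attn = softmax(scores, axis=-1)                       # rows sum to 1
    heads = attn @ v                                      # (heads, seq, d_head)
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ w_o                                   # (seq_len, d_model)

x = np.random.default_rng(1).standard_normal((5, 8))
out = multi_head_self_attention(x, num_heads=2)
print(out.shape)  # (5, 8)
```

In the full encoder-decoder stack, this block is repeated in layers with residual connections and feed-forward sublayers; the decoder additionally attends over the encoder's output to generate the summary token by token.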

About Text-Summarization

aj-naik/Text-Summarization

Abstractive and Extractive Text summarization using Transformers.
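To make the abstractive/extractive distinction concrete: an extractive method scores and copies existing sentences rather than generating new text. A minimal frequency-based sketch of the extractive idea (a toy baseline, not this repository's actual transformer-based implementation):

```python
import re
from collections import Counter

def extractive_summary(text, num_sentences=2):
    """Toy extractive summarizer: score each sentence by the summed
    corpus frequency of its words, then return the top-scoring
    sentences in their original order."""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    freq = Counter(re.findall(r'\w+', text.lower()))
    # Rank sentence indices by descending total word frequency.
    ranked = sorted(range(len(sentences)),
                    key=lambda i: -sum(freq[w] for w in
                                       re.findall(r'\w+', sentences[i].lower())))
    keep = sorted(ranked[:num_sentences])  # restore document order
    return ' '.join(sentences[i] for i in keep)

text = ("Cats sleep a lot. Dogs bark. "
        "Cats and dogs are pets. The weather is nice.")
print(extractive_summary(text, num_sentences=2))
# Cats sleep a lot. Cats and dogs are pets.
```

An abstractive model, by contrast, would paraphrase, producing sentences that never appear verbatim in the source.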

Scores updated daily from GitHub, PyPI, and npm data.