transformer-abstractive-summarization and Text-Summarization
These repositories offer overlapping functionality: both implement transformer-based abstractive summarization. transformer-abstractive-summarization focuses exclusively on the abstractive approach, while Text-Summarization adds extractive methods as an alternative summarization strategy.
About transformer-abstractive-summarization
rojagtap/transformer-abstractive-summarization
Abstractive Text Summarization using Transformer
Implements the multi-head self-attention architecture from "Attention is All You Need" with encoder-decoder stacks for sequence-to-sequence summarization. Trained on the Inshorts news dataset to generate abstractive summaries by learning compressed semantic representations rather than extracting existing sentences. Includes detailed blog walkthroughs covering transformer mechanics and training procedures for reproducibility.
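The core mechanism the description refers to, scaled dot-product attention combined into multiple heads, can be sketched in a few lines of NumPy. This is an illustrative toy with random, untrained projection weights, not code from the repository:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, as in
    # "Attention is All You Need".
    d_k = Q.shape[-1]
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # rows sum to 1 over the keys
    return weights @ V, weights

def multi_head_attention(x, num_heads, rng):
    # Toy multi-head self-attention: each head gets its own random
    # (untrained) Q/K/V projections; head outputs are concatenated.
    batch, seq_len, d_model = x.shape
    d_head = d_model // num_heads
    heads = []
    for _ in range(num_heads):
        Wq = rng.standard_normal((d_model, d_head)) / np.sqrt(d_model)
        Wk = rng.standard_normal((d_model, d_head)) / np.sqrt(d_model)
        Wv = rng.standard_normal((d_model, d_head)) / np.sqrt(d_model)
        out, _ = scaled_dot_product_attention(x @ Wq, x @ Wk, x @ Wv)
        heads.append(out)
    return np.concatenate(heads, axis=-1)

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 5, 8))          # (batch, tokens, d_model)
y = multi_head_attention(x, num_heads=2, rng=rng)
print(y.shape)                              # same shape as the input
```

In the full encoder-decoder stack, these attention layers are interleaved with feed-forward layers and residual connections, and the decoder additionally attends over the encoder's output to generate the summary token by token.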
About Text-Summarization
aj-naik/Text-Summarization
Abstractive and Extractive Text summarization using Transformers.
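To make the contrast with the first project concrete: extractive summarization selects existing sentences rather than generating new text. A minimal frequency-based sketch (not the repository's actual method, which uses transformers) looks like this:

```python
import re
from collections import Counter

def extractive_summary(text, k=1):
    # Score each sentence by the corpus frequency of its words and
    # keep the top-k sentences, preserving their original order.
    sentences = [s.strip() for s in re.split(r'(?<=[.!?])\s+', text) if s.strip()]
    freq = Counter(re.findall(r'\w+', text.lower()))
    scored = sorted(
        range(len(sentences)),
        key=lambda i: -sum(freq[w] for w in re.findall(r'\w+', sentences[i].lower())),
    )
    keep = sorted(scored[:k])
    return ' '.join(sentences[i] for i in keep)

text = ("Transformers changed NLP. Attention lets models weigh every token. "
        "Attention-based models now dominate summarization benchmarks.")
print(extractive_summary(text, k=1))
```

An abstractive model would instead paraphrase the passage in its own words, which is why the two strategies are offered as alternatives.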