rojagtap/transformer-abstractive-summarization
Abstractive Text Summarization using Transformer
Implements the multi-head self-attention architecture from "Attention is All You Need" with encoder-decoder stacks for sequence-to-sequence summarization. Trained on the Inshorts news dataset to generate abstractive summaries by learning compressed semantic representations rather than extracting existing sentences. Includes detailed blog walkthroughs covering transformer mechanics and training procedures for reproducibility.
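The core operation the description refers to, scaled dot-product attention from "Attention is All You Need", can be sketched in a few lines of NumPy. This is an illustrative sketch, not code from the repository; shapes and names are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Q: (batch, q_len, d_k), K: (batch, k_len, d_k), V: (batch, k_len, d_v).
    Returns the attended values and the attention weights.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)  # (batch, q_len, k_len)
    weights = softmax(scores)                        # rows sum to 1
    return weights @ V, weights
```

In the full model, this is applied per head across several projected subspaces (multi-head attention), with the decoder additionally masking future positions during training.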
168 stars. No commits in the last 6 months.
Stars: 168
Forks: 47
Language: Jupyter Notebook
License: Apache-2.0
Category:
Last pushed: Dec 02, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/rojagtap/transformer-abstractive-summarization"
Open to everyone: 100 requests/day with no API key. Get a free key for 1,000 requests/day.
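The endpoint path above follows the pattern `/api/v1/quality/transformers/{owner}/{repo}`, so the URL for any repository can be built programmatically. A minimal sketch (the helper name is hypothetical; the response schema is not documented here, so only the URL construction is shown):

```python
BASE_URL = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    # Build the per-repository quality endpoint URL.
    # Assumed pattern, inferred from the curl example above.
    return f"{BASE_URL}/{owner}/{repo}"

# Fetch it with any HTTP client, e.g.:
#   curl "$(python -c 'print(quality_url(...))')"
```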
Higher-rated alternatives
abelriboulot/onnxt5
Summarization, translation, sentiment-analysis, text-generation and more at blazing speed using...
pszemraj/textsum
CLI & Python API to easily summarize text-based files with transformers
HHousen/DocSum
A tool to automatically summarize documents abstractively using the BART or PreSumm Machine...
abhilash1910/LongPegasus
LongPegasus package is used for inducing longformer self attention over base pegasus abstractive...
aj-naik/Text-Summarization
Abstractive and Extractive Text summarization using Transformers.