rojagtap/transformer-abstractive-summarization

Abstractive Text Summarization using Transformer

Quality score: 48 / 100 (Emerging)

Implements the multi-head self-attention architecture from "Attention is All You Need" with encoder-decoder stacks for sequence-to-sequence summarization. Trained on the Inshorts news dataset to generate abstractive summaries by learning compressed semantic representations rather than extracting existing sentences. Includes detailed blog walkthroughs covering transformer mechanics and training procedures for reproducibility.
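
For readers who want the core mechanism before opening the notebook, below is a minimal NumPy sketch of the scaled dot-product attention that the multi-head layers are built from. This illustrates the idea from the paper; it is not code taken from the repository's notebook.

import numpy as np

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k: (seq_len, d_k); v: (seq_len, d_v)
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                  # query-key similarity, scaled
    if mask is not None:
        scores = np.where(mask, scores, -1e9)        # block masked positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ v                               # weighted sum of values

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                          # 4 tokens, model width 8
out = scaled_dot_product_attention(x, x, x)          # self-attention: q = k = v
print(out.shape)                                     # (4, 8)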

168 stars. No commits in the last 6 months.

Flags: Stale (6m), No Package, No Dependents
Score breakdown:
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 22 / 25
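
Note that the four category scores add up to the headline number: 0 + 10 + 16 + 22 = 48 out of 100.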


Stars: 168
Forks: 47
Language: Jupyter Notebook
License: Apache-2.0
Last pushed: Dec 02, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/rojagtap/transformer-abstractive-summarization"

Open to everyone: 100 requests/day with no key needed. Get a free API key for 1,000 requests/day.
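
If you would rather call the endpoint from Python, here is a minimal standard-library sketch. The response schema is not documented on this page, so inspect the raw payload before relying on any particular field name.

import json
import urllib.request

url = ("https://pt-edge.onrender.com/api/v1/quality/"
       "transformers/rojagtap/transformer-abstractive-summarization")

with urllib.request.urlopen(url) as resp:
    data = json.load(resp)             # parse the JSON body

print(json.dumps(data, indent=2))      # inspect first; field names are undocumented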