santhoshkolloju/Abstractive-Summarization-With-Transfer-Learning

Abstractive summarisation using BERT as the encoder and a Transformer decoder

Quality score: 42 / 100 (Emerging)

Leverages the Texar library to build an encoder-decoder architecture in which pretrained BERT weights initialize the encoder while the Transformer decoder is trained from scratch, converging faster than sequential LSTM models. Training parallelizes across sequence positions thanks to the Transformer's self-attention, and a Flask-based inference server generates abstractive summaries in response to HTTP POST requests. Input story-summary pairs must be preprocessed into TFRecord format, and model and training settings are customizable through a dedicated config file.
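As a rough illustration of the inference flow, a client could POST an article to the Flask server and read back the generated summary. This is a minimal sketch only: the host, port, route, and payload key below are assumptions rather than values taken from the repository, so check its Flask app for the actual schema.

import requests

# Assumed endpoint of the locally running Flask inference server
# (host, port, and route are placeholders; the repository defines the real ones).
URL = "http://localhost:5000/summarize"

article = "Long news story text to be summarised goes here."

# POST the story text and print whatever JSON the server returns.
# The "text" payload key is an assumption about the request schema.
response = requests.post(URL, json={"text": article}, timeout=60)
response.raise_for_status()
print(response.json())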

412 stars. No commits in the last 6 months.

Flags: No License, Stale (6 months), No Package, No Dependents

Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 8 / 25
Community: 24 / 25
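The overall score appears to be the sum of these four subscores: 0 + 10 + 8 + 24 = 42 out of 100.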


Stars: 412
Forks: 98
Language: Python
License: None
Last pushed: May 30, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/santhoshkolloju/Abstractive-Summarization-With-Transfer-Learning"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
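For scripted access, the same endpoint can be queried from Python. This sketch simply prints the raw JSON, since the response fields are not documented on this page:

import requests

# Public quality endpoint shown in the curl example above.
url = (
    "https://pt-edge.onrender.com/api/v1/quality/nlp/"
    "santhoshkolloju/Abstractive-Summarization-With-Transfer-Learning"
)

response = requests.get(url, timeout=30)
response.raise_for_status()
print(response.json())  # field names are undocumented here, so print everything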