santhoshkolloju/Abstractive-Summarization-With-Transfer-Learning
Abstractive summarisation using BERT as encoder and a Transformer decoder
Leverages the Texar library to build an encoder-decoder architecture in which pretrained BERT weights initialize the encoder while the decoder is trained from scratch, converging faster than sequential LSTM models. Training is parallelized via the Transformer's self-attention mechanism, and a Flask-based inference server generates abstractive summaries in response to HTTP POST requests. Input story-summary pairs must be preprocessed into TFRecord format, and the model is fully configurable through a dedicated config file.
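A minimal sketch of such a Flask inference endpoint. The route name, JSON field names, and the `summarize` stub are assumptions for illustration, not taken from the repo; the real server would call the trained BERT-encoder / Transformer-decoder model here.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def summarize(story: str) -> str:
    # Hypothetical placeholder: the repo's server would run the trained
    # encoder-decoder model here instead of this trivial first-sentence stub.
    return story.split(".")[0]

@app.route("/summarize", methods=["POST"])
def summarize_route():
    # Expect a JSON body like {"story": "..."} and return {"summary": "..."}.
    payload = request.get_json(force=True)
    return jsonify({"summary": summarize(payload["story"])})
```

A client would POST a JSON body with the article text and read the summary from the JSON response.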
412 stars. No commits in the last 6 months.
Stars: 412
Forks: 98
Language: Python
License: —
Category: —
Last pushed: May 30, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/santhoshkolloju/Abstractive-Summarization-With-Transfer-Learning"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.
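The same request can be made from Python with the standard library. The URL-building helper below is a sketch; the JSON shape of the response is not documented here, so the fetch is left as a hedged comment rather than parsed fields.

```python
import urllib.parse

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    # Build the API URL for a repo, percent-encoding each path segment.
    parts = [urllib.parse.quote(s) for s in (category, owner, repo)]
    return "/".join([API_BASE, *parts])

url = quality_url("nlp", "santhoshkolloju",
                  "Abstractive-Summarization-With-Transfer-Learning")

# To actually fetch (response schema is an assumption, not specified here):
# import json, urllib.request
# data = json.loads(urllib.request.urlopen(url).read())
```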
Higher-rated alternatives
kenlimmj/rouge
A JavaScript implementation of the Recall-Oriented Understudy for Gisting Evaluation (ROUGE)...
uoneway/KoBertSum
KoBertSum is a Korean summarization model that adapts the BertSum model to Korean data.
udibr/headlines
Automatically generate headlines to short articles
bheinzerling/pyrouge
A Python wrapper for the ROUGE summarization evaluation package
xiongma/transformer-pointer-generator
An abstractive summarization implementation with a Transformer and a pointer-generator