Text-Summarization and Finetune-Transformers

Text-Summarization — Score: 35 (Emerging)
Maintenance 0/25 | Adoption 9/25 | Maturity 16/25 | Community 10/25
Stars: 86
Forks: 7
Downloads:
Commits (30d): 0
Language: Jupyter Notebook
License: MIT
Flags: Stale 6m, No Package, No Dependents

Finetune-Transformers — Score: 32 (Emerging)
Maintenance 0/25 | Adoption 7/25 | Maturity 8/25 | Community 17/25
Stars: 39
Forks: 10
Downloads:
Commits (30d): 0
Language: Python
License:
Flags: No License, Stale 6m, No Package, No Dependents

About Text-Summarization

aj-naik/Text-Summarization

Abstractive and Extractive Text summarization using Transformers.

This project helps students, researchers, and anyone dealing with large volumes of text quickly grasp the main points. You provide a long document, article, or research paper, and it generates either a condensed version highlighting key sentences (extractive) or an entirely new, shorter summary written in fresh wording (abstractive). It's designed for anyone who needs to process information efficiently and get to the core message without reading everything.
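The project's notebooks rely on Transformer models, but the extractive idea itself can be shown in a few lines. The sketch below is my own minimal illustration (not the repo's code): score each sentence by the average corpus frequency of its words, then keep the top-scoring sentences in their original order.

```python
import re
from collections import Counter

def extractive_summary(text, num_sentences=2):
    """Pick the num_sentences highest-scoring sentences, where a
    sentence's score is the mean frequency of the words it contains."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))
    scores = []
    for i, sent in enumerate(sentences):
        words = re.findall(r"[a-z']+", sent.lower())
        if words:
            scores.append((sum(freq[w] for w in words) / len(words), i))
    # Take the top sentences by score, then restore document order.
    top = sorted(sorted(scores, reverse=True)[:num_sentences], key=lambda t: t[1])
    return " ".join(sentences[i] for _, i in top)
```

Real Transformer-based extractive models score sentences with learned contextual embeddings rather than raw word counts, but the select-and-reorder structure is the same.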

academic-research content-analysis information-retrieval study-notes report-writing

About Finetune-Transformers

nsi319/Finetune-Transformers

Abstractive text summarization by fine-tuning seq2seq models.

This project helps developers fine-tune large language models for abstractive text summarization. It takes a pre-trained sequence-to-sequence model and your domain-specific text data, then outputs a specialized model that summarizes text more accurately for your particular use case. It's aimed at machine learning engineers and data scientists who need to adapt generic summarization models to specific datasets, such as news articles or research papers.
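Fine-tuning simply means continuing gradient training of an already-trained model on new, domain-specific examples. As a conceptual toy (my own sketch, standing in for the repo's seq2seq training loop; a real run would minimize cross-entropy over a Transformer's parameters):

```python
# Toy illustration of the fine-tuning loop: start from "pretrained"
# weights and continue gradient descent on domain-specific data.
# A one-parameter linear model stands in for the whole network.

def fine_tune(weight, data, lr=0.05, epochs=50):
    """Minimize mean squared error of y ~ weight * x via plain SGD."""
    for _ in range(epochs):
        for x, y in data:
            pred = weight * x
            grad = 2 * (pred - y) * x   # d/dw of (w*x - y)^2
            weight -= lr * grad
    return weight

# Weight "pretrained" on generic data (true slope 1.0) ...
pretrained = 1.0
# ... adapted to a domain where the true slope is 3.0.
domain_data = [(1.0, 3.0), (2.0, 6.0), (0.5, 1.5)]
adapted = fine_tune(pretrained, domain_data)
```

The same pattern scales up: the pre-trained checkpoint supplies the starting weights, the domain dataset supplies the gradients, and the result is a model specialized to that domain.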

natural-language-processing machine-learning-engineering text-summarization model-fine-tuning data-science

Scores updated daily from GitHub, PyPI, and npm data.