injamul3798/Fine-Tuning-BERT-for-E-commerce-Text-Classification-A-Multi-category-Approach
In this Kaggle project, we leverage the BERT (Bidirectional Encoder Representations from Transformers) model, fine-tuned for multi-category text classification in the context of E-commerce. Our dataset comprises product descriptions from four distinct categories: "Electronics," "Household," "Books," and "Clothing & Accessories."
No commits in the last 6 months.
Stars: 5
Forks: —
Language: Jupyter Notebook
License: —
Category:
Last pushed: Jan 27, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/injamul3798/Fine-Tuning-BERT-for-E-commerce-Text-Classification-A-Multi-category-Approach"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
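The endpoint above can also be called from any HTTP client. Below is a minimal Python sketch using only the standard library; the JSON response schema is not documented on this page, so the fetch helper only returns the parsed body rather than assuming particular fields, and the owner/repo path segments mirror the curl example:

```python
import json
import urllib.request

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/nlp"

def quality_url(owner: str, repo: str) -> str:
    # Build the per-repository quality endpoint URL.
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    # Anonymous access is limited to 100 requests/day; supplying a free
    # key raises that to 1,000/day (how the key is passed is not
    # documented here, so no auth header is shown).
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

url = quality_url(
    "injamul3798",
    "Fine-Tuning-BERT-for-E-commerce-Text-Classification-A-Multi-category-Approach",
)
print(url)
```

Calling `fetch_quality(...)` performs the same request as the curl command and returns the decoded JSON payload.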
Higher-rated alternatives
urchade/GLiNER
Generalist and Lightweight Model for Named Entity Recognition (Extract any entity types from...
HySonLab/ViDeBERTa
ViDeBERTa: A powerful pre-trained language model for Vietnamese, EACL 2023
lgalke/text-clf-baselines
WideMLP for Text Classification
stccenter/Comparative-Analysis-of-BERT-and-GPT-for-Classifying-Crisis-News-with-Sudan-Conflict-as-an-Example
Comparative Analysis of BERT and GPT for Conflict-Related Multiclass Label Classification from...
NLP-AI-Wizards/clef2025-checkthat
Challenge to distinguish whether a sentence from a news article expresses the subjective view of...