BERTopic and turftopic

These are competing libraries offering alternative approaches to neural topic modeling. BERTopic pairs BERT embeddings with c-TF-IDF to produce interpretable topics at scale, while turftopic builds on sentence-transformers with a focus on robustness and speed. Practitioners typically choose between them based on their priorities around interpretability versus performance.

BERTopic: score 71 (Verified)
Maintenance 10/25, Adoption 15/25, Maturity 25/25, Community 21/25
Stars: 7,443 | Forks: 882 | Downloads: | Commits (30d): 0 | Language: Python | License: MIT
No risk flags

turftopic: score 47 (Emerging)
Maintenance 10/25, Adoption 9/25, Maturity 16/25, Community 12/25
Stars: 94 | Forks: 9 | Downloads: | Commits (30d): 0 | Language: Python | License: MIT
Risk flags: No Package, No Dependents
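From the numbers shown, each overall score equals the sum of its four 0–25 subscores. A quick check (the summing rule is an observation from these figures, not a documented formula of the scoring site):

```python
# Subscores copied from the cards above; the overall score appears
# to be their plain sum (an inference, not a documented formula).
bertopic_subscores = {"maintenance": 10, "adoption": 15, "maturity": 25, "community": 21}
turftopic_subscores = {"maintenance": 10, "adoption": 9, "maturity": 16, "community": 12}

assert sum(bertopic_subscores.values()) == 71   # BERTopic's overall score
assert sum(turftopic_subscores.values()) == 47  # turftopic's overall score
```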

About BERTopic

MaartenGr/BERTopic

Leveraging BERT and c-TF-IDF to create easily interpretable topics.

Combines dense transformer embeddings with dimensionality reduction (UMAP) and clustering (HDBSCAN) to discover coherent topics, then applies class-based TF-IDF to extract semantically meaningful keywords per cluster. Supports diverse modeling paradigms including supervised, hierarchical, dynamic, multimodal, and zero-shot approaches, with optional LLM-based topic representation for natural language summaries. Integrates with 🤗 Hugging Face transformers and offers pluggable backends for embeddings (Flair, spaCy, Gensim) and vision models for cross-modal topic discovery.
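The class-based TF-IDF step can be sketched in pure Python: weight each term by its frequency within a cluster, discounted by how common the term is across all clusters. This is a minimal illustration of the idea only; BERTopic's real implementation operates on sparse matrices and differs in detail, and the example clusters below are invented.

```python
import math
from collections import Counter

def ctfidf(class_docs):
    """Sketch of class-based TF-IDF: class_docs maps a cluster label to
    the tokens of all documents assigned to that cluster. Returns a dict
    of per-cluster term weights."""
    # Term frequency per class.
    tf = {label: Counter(tokens) for label, tokens in class_docs.items()}
    # Frequency of each term across all classes combined.
    total = Counter()
    for counts in tf.values():
        total.update(counts)
    # A = average number of words per class.
    a = sum(len(tokens) for tokens in class_docs.values()) / len(class_docs)
    # Weight: in-class frequency scaled by log(1 + A / cross-class frequency),
    # so terms concentrated in one cluster score highest.
    return {
        label: {t: f * math.log(1 + a / total[t]) for t, f in counts.items()}
        for label, counts in tf.items()
    }

# Toy clusters (invented for illustration).
clusters = {
    "pets": "cat dog cat hamster".split(),
    "space": "rocket moon rocket the".split(),
}
weights = ctfidf(clusters)
# The top-weighted term in each cluster is its most distinctive keyword.
```

With these toy clusters, "cat" and "rocket" come out as the top keywords for their respective clusters, since they are frequent in one cluster and rare overall.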

About turftopic

x-tabdeveloping/turftopic

Robust and fast topic models with sentence-transformers.

Scores updated daily from GitHub, PyPI, and npm data.