TabPFN and tabicl

These are competitors: both are pretrained foundation models designed to achieve state-of-the-art performance on tabular classification through different approaches (TabPFN is built on prior-data fitted networks, while TabICLv2 relies on in-context learning), and practitioners would typically evaluate both and select one based on their specific dataset and performance requirements.

                 TabPFN             tabicl
Score            83 (Verified)      69 (Established)
Maintenance      23/25              23/25
Adoption         15/25              10/25
Maturity         25/25              16/25
Community        20/25              20/25
Stars            5,846              603
Forks            586                80
Downloads        (not listed)       (not listed)
Commits (30d)    27                 37
Language         Python             Python
License          (not listed)       (not listed)
No risk flags. No package registry entry or dependents detected.

About TabPFN

PriorLabs/TabPFN

⚡ TabPFN: Foundation Model for Tabular Data ⚡

Based on the README, here's a technical summary: Built on a pretrained transformer architecture trained exclusively on synthetic data, TabPFN performs in-context learning by processing entire training sets through the model at inference time rather than traditional parameter updates. It provides scikit-learn compatible classifiers and regressors optimized for GPU inference on tabular datasets under 100K samples and 2000 features, with the ecosystem offering SHAP-based interpretability, synthetic data generation, embedding extraction, and hyperparameter optimization extensions.
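The scikit-learn-compatible interface described above can be sketched as follows. `TabPFNClassifier` is the class name the `tabpfn` package exports; since TabPFN downloads pretrained weights at first use (and is optimized for GPU inference), this sketch falls back to a plain scikit-learn model when the package isn't installed, so the workflow itself is the point, not the specific estimator:

```python
# Sketch of the drop-in scikit-learn workflow, assuming the `tabpfn` package;
# falls back to a RandomForest stand-in if TabPFN isn't available locally.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

try:
    from tabpfn import TabPFNClassifier as Classifier  # real package import
except ImportError:
    # Stand-in so the sketch runs anywhere; TabPFN itself needs its weights.
    from sklearn.ensemble import RandomForestClassifier as Classifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = Classifier()            # TabPFN ships pretrained: no task-specific training
clf.fit(X_train, y_train)     # for TabPFN, fit() stores the set for in-context use
acc = accuracy_score(y_test, clf.predict(X_test))
print(f"test accuracy: {acc:.3f}")
```

Because fitting amounts to storing the training set for in-context inference, the same `fit`/`predict` calls work, but "training" is nearly free and the cost shifts to prediction time, which is why the README scopes it to datasets within the stated sample and feature limits.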

About tabicl

soda-inria/tabicl

TabICLv2: A state-of-the-art tabular foundation model

Scores updated daily from GitHub, PyPI, and npm data.