stchakwdev/Mamba_KAN
A rigorous 2x3 factorial comparison of neural network architectures: KAN vs. MLP feedforward layers combined with Transformer vs. Mamba sequence models. Investigates whether KAN advantages stem from B-spline activations or from network topology.
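As a rough illustration of the factorial design the description sketches, the snippet below enumerates architecture variants by crossing feedforward-layer types with sequence models. The factor names and levels are assumptions for illustration only, not the repo's actual identifiers; note the description says "2x3", which suggests a third level on one axis that is not named in this listing.

```python
# Hedged sketch of enumerating a factorial architecture grid.
# Factor names/levels are illustrative assumptions; only two levels
# per factor are named in the description, so a 2x2 grid is shown.
from itertools import product

FEEDFORWARD = ("mlp", "kan")                # feedforward-layer factor
SEQUENCE_MIXER = ("transformer", "mamba")   # sequence-model factor

for seq, ff in product(SEQUENCE_MIXER, FEEDFORWARD):
    print(f"variant: {seq}+{ff}")
```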
Stars: 6
Forks: —
Language: Python
License: MIT
Category: —
Last pushed: Jan 06, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/stchakwdev/Mamba_KAN"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000 requests/day.
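For programmatic use, a minimal Python sketch along these lines should work, assuming the endpoint returns a JSON body. The response schema is not documented in this listing, so the snippet dumps the full payload rather than guessing field names:

```python
# Minimal sketch: fetch the quality record for this repo and inspect it.
# Assumes the endpoint returns JSON; field names are not documented
# here, so we print the whole payload instead of picking keys.
import json
import urllib.request

URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "ml-frameworks/stchakwdev/Mamba_KAN")

with urllib.request.urlopen(URL, timeout=10) as resp:
    data = json.load(resp)

print(json.dumps(data, indent=2))
```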
Higher-rated alternatives
NVIDIAGameWorks/kaolin
A PyTorch Library for Accelerating 3D Deep Learning Research
Jim137/qkan
PyTorch implementation of QKAN "Quantum-inspired Kolmogorov-Arnold Network"...
AntonioTepsich/Convolutional-KANs
This project extends the idea of the innovative architecture of Kolmogorov-Arnold Networks (KAN)...
lgy112112/ikan
ikan: many KAN variants for everybody
stchakwdev/kan_transformer
Baantu Research: Hybrid KAN-Transformer for investigating learnable activations in LLM...