Kitsunp/Small-lenguaje-Model-Hybrid-Norm-Furier-Formers
A compact language model implementing HybridNorm and Fourier-based attention. Combines CoLA (low-rank projections), FANformer, and hybrid normalization to create an efficient decoder-only transformer. Leverages periodicity modeling and gated residuals to enhance performance while maintaining a small parameter footprint.
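The description names three techniques: CoLA-style low-rank projections, FANformer-style periodicity modeling, and hybrid normalization. As a rough illustration only (this is not the repo's actual code, and all class names, dimensions, and the exact norm placement are assumptions), the three ideas can be sketched together in PyTorch:

```python
import torch
import torch.nn as nn


class LowRankLinear(nn.Module):
    """CoLA-style low-rank projection: W is factored as up @ down,
    with rank r much smaller than min(d_in, d_out) to cut parameters."""

    def __init__(self, d_in: int, d_out: int, rank: int):
        super().__init__()
        self.down = nn.Linear(d_in, rank, bias=False)  # d_in -> r
        self.up = nn.Linear(rank, d_out, bias=False)   # r -> d_out

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.up(self.down(x))


class FANProjection(nn.Module):
    """FAN-style periodicity modeling (as in FANformer): part of the output
    is cos/sin of a learned linear map, the rest is a plain linear map."""

    def __init__(self, d_in: int, d_out: int, p_ratio: float = 0.25):
        super().__init__()
        d_p = int(d_out * p_ratio) // 2          # half for cos, half for sin
        self.periodic = nn.Linear(d_in, d_p, bias=False)
        self.linear = nn.Linear(d_in, d_out - 2 * d_p)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        p = self.periodic(x)
        return torch.cat([torch.cos(p), torch.sin(p), self.linear(x)], dim=-1)


class HybridNormBlock(nn.Module):
    """Hypothetical hybrid-normalization block: pre-norm going into attention,
    post-norm applied to the FFN residual sum (one plausible reading of
    'HybridNorm'; the repo may place norms differently)."""

    def __init__(self, d_model: int, n_heads: int, rank: int):
        super().__init__()
        self.norm1 = nn.LayerNorm(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm2 = nn.LayerNorm(d_model)
        self.ffn = nn.Sequential(
            FANProjection(d_model, 4 * d_model),
            nn.GELU(),
            LowRankLinear(4 * d_model, d_model, rank),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.norm1(x)                                 # pre-norm
        a, _ = self.attn(h, h, h, need_weights=False)
        x = x + a
        return self.norm2(x + self.ffn(x))                # post-norm on FFN path
```

A shape check: feeding a `(batch=2, seq=8, d_model=64)` tensor through `HybridNormBlock(64, n_heads=4, rank=16)` returns a tensor of the same shape, while the low-rank FFN output projection uses `(256*16 + 16*64)` weights instead of `256*64`.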
No commits in the last 6 months.
Stars: 4
Forks: —
Language: Python
License: Apache-2.0
Category: —
Last pushed: Aug 08, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/Kitsunp/Small-lenguaje-Model-Hybrid-Norm-Furier-Formers"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
Lightning-AI/litgpt
20+ high-performance LLMs with recipes to pretrain, finetune and deploy at scale.
liangyuwang/Tiny-DeepSpeed
Tiny-DeepSpeed, a minimalistic re-implementation of the DeepSpeed library
microsoft/Text2Grad
🚀 Text2Grad: Converting natural language feedback into gradient signals for precise model...
catherinesyeh/attention-viz
Visualizing query-key interactions in language + vision transformers (VIS 2023)
huangjia2019/llm-gpt
From classic NLP to modern LLMs: building language models step by step. Companion repo for the book 《GPT图解：大模型是怎样构建的》 ("GPT Illustrated: How Large Models Are Built") -...