SatvikPraveen/JAX-NSL
Comprehensive JAX implementation of neural networks and scientific computing. Features include distributed training, physics-informed networks, custom autodiff, and advanced optimization. The code is written with production use in mind, emphasizing numerical stability, multi-device parallelism, and research-grade implementations.
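For illustration, the kind of custom-autodiff and numerical-stability pattern the description refers to might look like the minimal JAX sketch below. It is not taken from the repository itself; the softplus example is a generic stand-in.

```python
# Illustrative sketch only -- not code from the JAX-NSL repository.
# A numerically stable softplus with a hand-written gradient via custom_vjp.
import jax
import jax.numpy as jnp

@jax.custom_vjp
def softplus(x):
    # log(1 + exp(x)) computed without overflow for large positive x
    return jnp.where(x > 0, x + jnp.log1p(jnp.exp(-x)), jnp.log1p(jnp.exp(x)))

def softplus_fwd(x):
    return softplus(x), x            # save x as the residual for the backward pass

def softplus_bwd(x, g):
    return (g * jax.nn.sigmoid(x),)  # d/dx softplus(x) = sigmoid(x), computed stably

softplus.defvjp(softplus_fwd, softplus_bwd)

print(jax.grad(softplus)(1000.0))    # 1.0, with no overflow or NaN
```

Without the custom VJP, differentiating the naive `where`-based formula produces NaN for large inputs, which is the sort of pitfall a numerical-stability-focused JAX codebase guards against.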
Stars: 1
Forks: —
Language: Jupyter Notebook
License: MIT
Category:
Last pushed: Mar 09, 2026
Commits (last 30 days): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/SatvikPraveen/JAX-NSL"
The endpoint is open to everyone at 100 requests/day with no API key; a free key raises the limit to 1,000 requests/day.
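For programmatic access, a minimal Python sketch along the lines below wraps the same endpoint shown in the curl command. The JSON response shape and the helper name fetch_quality are assumptions for illustration, not part of a documented client.

```python
# Minimal sketch of calling the endpoint above from Python (stdlib only).
# Assumptions: the endpoint returns JSON; fetch_quality is a hypothetical
# helper name, not a published client function.
import json
import urllib.request

def fetch_quality(owner: str, repo: str) -> dict:
    url = f"https://pt-edge.onrender.com/api/v1/quality/transformers/{owner}/{repo}"
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    data = fetch_quality("SatvikPraveen", "JAX-NSL")
    print(json.dumps(data, indent=2))
```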
Higher-rated alternatives
ThilinaRajapakse/simpletransformers
Transformers for Information Retrieval, Text Classification, NER, QA, Language Modelling,...
explosion/curated-transformers
🤖 A PyTorch library of curated Transformer models and their composable components
google/deepconsensus
DeepConsensus uses gap-aware sequence transformers to correct errors in Pacific Biosciences...
jsksxs360/How-to-use-Transformers
A quick-start tutorial (in Chinese) for the Transformers library
ruanchaves/hashformers
Accurate word segmentation for hashtags and text, powered by Transformers and Beam Search. A...