mindspore-lab/mindnlp
MindSpore + 🤗Huggingface: Run any Transformers/Diffusers model on MindSpore with seamless compatibility and acceleration.
913 stars. Actively maintained with 157 commits in the last 30 days.
Stars: 913
Forks: 267
Language: Python
License: Apache-2.0
Category:
Last pushed: Mar 08, 2026
Commits (30d): 157
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/mindspore-lab/mindnlp"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
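The same lookup can be done from Python. A minimal sketch, using only the endpoint shown above; the response schema is not documented here, so the parsed fields are assumptions:

```python
import json
import urllib.request

# Base endpoint taken from the curl example on this page.
BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-API URL for a given GitHub repository."""
    return f"{BASE}/{owner}/{repo}"

url = quality_url("mindspore-lab", "mindnlp")

# Uncomment to fetch live data (100 requests/day without a key).
# Field names in the JSON response are an assumption, not a documented schema.
# data = json.load(urllib.request.urlopen(url))
# print(data.get("stars"), data.get("forks"))
```

With a free API key, the daily limit rises to 1,000 requests; how the key is passed (header or query parameter) is not specified on this page.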
Related models
AI-Hypercomputer/maxtext
A simple, performant and scalable Jax LLM!
mosaicml/llm-foundry
LLM training code for Databricks foundation models
rasbt/reasoning-from-scratch
Implement a reasoning LLM in PyTorch from scratch, step by step
ridgerchu/matmulfreellm
Implementation for MatMul-free LM.
CASE-Lab-UMD/LLM-Drop
The official implementation of the paper "Uncovering the Redundancy in Transformers via a...