optimum-habana and optimum-graphcore
These are ecosystem siblings: both are specialized hardware-acceleration libraries that extend Hugging Face Transformers to a proprietary AI accelerator (Habana Gaudi HPUs and Graphcore IPUs, respectively), following the same `optimum-*` naming pattern for their platforms.
|               | optimum-habana                      | optimum-graphcore         |
|---------------|-------------------------------------|---------------------------|
| Maintenance   | 13/25                               | 0/25                      |
| Adoption      | 10/25                               | 9/25                      |
| Maturity      | 16/25                               | 16/25                     |
| Community     | 25/25                               | 21/25                     |
| Stars         | 207                                 | 87                        |
| Forks         | 270                                 | 33                        |
| Downloads     | —                                   | —                         |
| Commits (30d) | 0                                   | 0                         |
| Language      | Python                              | Python                    |
| License       | Apache-2.0                          | Apache-2.0                |
| Flags         | No Package, No Dependents, Stale 6m | No Package, No Dependents |
About optimum-habana
huggingface/optimum-habana
Easy and lightning fast training of 🤗 Transformers on Habana Gaudi processor (HPU)
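The library's pitch is that training on Gaudi follows the familiar Transformers `Trainer` workflow, swapping in Gaudi-aware classes. A minimal sketch of that pattern, assuming a Gaudi machine with the Habana SDK and `optimum-habana` installed (the model name, output directory, and dataset are illustrative placeholders):

```python
from transformers import AutoModelForSequenceClassification
from optimum.habana import GaudiTrainer, GaudiTrainingArguments

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

# GaudiTrainingArguments mirrors TrainingArguments, plus HPU-specific flags;
# gaudi_config_name points at a Gaudi configuration published on the Hub.
args = GaudiTrainingArguments(
    output_dir="./out",
    use_habana=True,
    use_lazy_mode=True,
    gaudi_config_name="Habana/bert-base-uncased",
)

# train_dataset is assumed to be an already-tokenized dataset.
trainer = GaudiTrainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()
```

The only changes from a stock Transformers script are the `Gaudi*` class names and the Gaudi configuration; the rest of the training loop is inherited from `Trainer`.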
About optimum-graphcore
huggingface/optimum-graphcore
Blazing fast training of 🤗 Transformers on Graphcore IPUs
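optimum-graphcore follows the same drop-in pattern for IPUs: an `IPUTrainer` replaces `Trainer`, with an extra `IPUConfig` describing how the model is pipelined across IPUs. A hedged sketch, assuming access to IPU hardware with the Poplar SDK and `optimum-graphcore` installed (model, config name, and dataset are illustrative):

```python
from transformers import AutoModelForSequenceClassification
from optimum.graphcore import IPUConfig, IPUTrainer, IPUTrainingArguments

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

# IPU execution/parallelism settings, loaded from a config published on the Hub.
ipu_config = IPUConfig.from_pretrained("Graphcore/bert-base-ipu")

args = IPUTrainingArguments(output_dir="./out", per_device_train_batch_size=1)

# train_dataset is assumed to be an already-tokenized dataset.
trainer = IPUTrainer(
    model=model,
    ipu_config=ipu_config,
    args=args,
    train_dataset=train_dataset,
)
trainer.train()
```

Note that with the repository stale (0 commits in 30 days, "Stale 6m" flag), it may only support older Transformers releases.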
Scores updated daily from GitHub, PyPI, and npm data.