optimum and optimum-habana
optimum-habana is a hardware-specific plugin for Optimum that accelerates Transformer training on Habana Gaudi processors (HPUs). The two libraries are complements rather than competitors: you install optimum-habana alongside Optimum when you want to target HPU hardware specifically.
About optimum
huggingface/optimum
🚀 Accelerate inference and training of 🤗 Transformers, Diffusers, TIMM and Sentence Transformers with easy-to-use hardware optimization tools
Supports hardware-specific backends including ONNX Runtime, OpenVINO, TensorRT-LLM, AWS Neuron, and Intel Gaudi through modular installations, enabling optimized inference across diverse accelerators. Provides unified APIs for model export, quantization, and graph optimization while maintaining compatibility with PyTorch, enabling deployment from research to production without refactoring model code.
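To illustrate the unified export API described above, here is a minimal sketch of exporting a Transformers checkpoint to ONNX through Optimum's ONNX Runtime backend. It assumes `optimum[onnxruntime]` is installed; the model name and output path are illustrative placeholders, not values from this page.

```python
def export_to_onnx(model_id: str, output_dir: str):
    """Sketch: convert a PyTorch Transformers checkpoint to ONNX via Optimum.

    Assumes `pip install optimum[onnxruntime]`; `model_id` and
    `output_dir` are illustrative assumptions.
    """
    # Import inside the function so the sketch can be read (and the module
    # loaded) even where optimum is not installed.
    from optimum.onnxruntime import ORTModelForSequenceClassification

    # export=True converts the PyTorch weights to ONNX at load time.
    model = ORTModelForSequenceClassification.from_pretrained(model_id, export=True)
    model.save_pretrained(output_dir)  # writes model.onnx plus config files
    return model
```

The resulting `ORTModel*` classes mirror the `transformers` `AutoModel*` API, which is what allows deployment without refactoring model code.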
About optimum-habana
huggingface/optimum-habana
Easy and lightning fast training of 🤗 Transformers on Habana Gaudi processor (HPU)
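A minimal sketch of how optimum-habana plugs into the familiar `transformers` training loop: `GaudiTrainer` and `GaudiTrainingArguments` are drop-in counterparts of `Trainer` and `TrainingArguments`. It assumes `optimum[habana]` is installed on a Gaudi machine; the output directory and the Gaudi config name are illustrative assumptions.

```python
def build_hpu_trainer(model, train_dataset):
    """Sketch: build a Trainer that runs on Habana Gaudi (HPU).

    Assumes `pip install optimum[habana]` on a Gaudi host; the paths and
    config name below are illustrative, not prescribed by this page.
    """
    # Import inside the function so the sketch loads even without optimum-habana.
    from optimum.habana import GaudiTrainer, GaudiTrainingArguments

    args = GaudiTrainingArguments(
        output_dir="./out",        # illustrative checkpoint directory
        use_habana=True,           # run training on HPU devices
        use_lazy_mode=True,        # use Habana's lazy-mode graph execution
        gaudi_config_name="Habana/bert-base-uncased",  # a published reference config (assumption)
    )
    # Same call shape as transformers.Trainer, so existing scripts port over.
    return GaudiTrainer(model=model, args=args, train_dataset=train_dataset)
```

Swapping these two classes in is typically the only change needed to move an existing `transformers` training script onto Gaudi hardware.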