ibnaleem/mixtral.py
A Python module for running the Mixtral-8x7B language model with customisable precision and attention mechanisms.
No commits in the last 6 months. Available on PyPI.
Stars: 2
Forks: —
Language: Python
License: GPL-3.0
Category: —
Last pushed: Feb 06, 2024
Commits (30d): 0
Dependencies: 2
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/ibnaleem/mixtral.py"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
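The same request can be made from Python. A minimal stdlib sketch is below; only the endpoint URL comes from this page, and the JSON response schema is not documented here, so decoding is left as a comment rather than assumed:

```python
import urllib.request
import json

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, repo: str) -> str:
    """Build the quality-endpoint URL shown in the curl example above."""
    return f"{BASE}/{category}/{repo}"

url = quality_url("transformers", "ibnaleem/mixtral.py")
# Anonymous access allows 100 requests/day; with a free key, 1,000/day.
# The response schema is not documented on this page, so inspect it first:
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
```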
Higher-rated alternatives
kyegomez/LIMoE
Implementation of "the first large-scale multimodal mixture of experts models" from the...
dohlee/chromoformer
The official code implementation for Chromoformer in PyTorch. (Lee et al., Nature Communications. 2022)
ahans30/goldfish-loss
[NeurIPS 2024] Goldfish Loss: Mitigating Memorization in Generative LLMs
yinboc/trans-inr
Transformers as Meta-Learners for Implicit Neural Representations, in ECCV 2022
bloomberg/MixCE-acl2023
Implementation of the MixCE method described in the ACL 2023 paper by Zhang et al.