StrombergNLP/Low-Carbon-NLP
Code and experiments for low-power LLM architecture search, SustaiNLP 2021
This project helps researchers and engineers optimize the energy efficiency of training large language models. By analyzing how different hyperparameters affect power consumption during training, you can identify configurations that reduce the carbon footprint of your NLP models. It is designed for anyone working on deep learning for natural language processing who wants to make their computations more sustainable.
No commits in the last 6 months.
Use this if you are training large language models and want to understand and reduce the energy consumption associated with different training parameters.
Not ideal if you are looking for pre-trained low-carbon models or tools to measure the energy of an already-trained model in production.
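To illustrate the kind of measurement this project is built around, here is a minimal sketch of profiling energy use across a hyperparameter sweep. It uses the codecarbon library as a stand-in for the repo's own tooling (which is not documented on this page), and the training loop and hyperparameter values are hypothetical placeholders.

# Minimal sketch: log estimated emissions per hyperparameter configuration.
# codecarbon stands in for the repo's own measurement code; train() is a
# hypothetical placeholder for your actual training loop.
from codecarbon import EmissionsTracker

def train(learning_rate: float, batch_size: int) -> None:
    """Hypothetical training loop; replace with a real one."""
    for _ in range(1000):
        pass  # forward/backward passes would go here

for lr in (1e-3, 1e-4):
    for bs in (16, 32):
        tracker = EmissionsTracker(project_name=f"lr={lr}-bs={bs}")
        tracker.start()
        train(learning_rate=lr, batch_size=bs)
        emissions_kg = tracker.stop()  # estimated kg CO2-eq for this run
        print(f"lr={lr} bs={bs}: {emissions_kg:.6f} kg CO2-eq")

Comparing the per-configuration totals is what lets you pick the setting with the lowest energy cost at acceptable model quality.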
Stars: 5
Forks: —
Language: Python
License: —
Category: —
Last pushed: Jun 16, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/StrombergNLP/Low-Carbon-NLP"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
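If you prefer to fetch the listing from a script, here is a minimal Python sketch using requests. The response schema and the header name for an API key are assumptions, not documented on this page.

# Minimal sketch: fetch this repo's listing data from the API above.
# The JSON schema and the API-key header name are assumptions.
import requests

URL = "https://pt-edge.onrender.com/api/v1/quality/nlp/StrombergNLP/Low-Carbon-NLP"

resp = requests.get(URL, timeout=10)  # e.g. headers={"X-API-Key": "..."} if you have a key
resp.raise_for_status()
print(resp.json())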
Higher-rated alternatives
stanford-oval/genienlp
GenieNLP: A versatile codebase for any NLP task
Quantinuum/Quixer
Code repository for the preprint "Quixer: A Quantum Transformer Model"
ICHEC/QNLP
ICHEC Quantum natural language processing (QNLP) toolkit
hans/nn-decoding
Brain decoding/encoding with neural network language models
mullzhang/quantum-nlp
NLP (Natural Language Processing) using quantum annealer