StrombergNLP/Low-Carbon-NLP

Code and experiments for low-power LLM architecture search, SustaiNLP 2021

Quality score: 12 / 100 · Experimental

This project helps researchers and engineers optimize the energy efficiency of training large language models. By analyzing how different hyperparameters impact power consumption during training, it allows you to identify configurations that reduce the carbon footprint of your NLP models. This is designed for anyone working on deep learning for natural language processing who wants to make their computations more sustainable.
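The general pattern described above, measuring energy per hyperparameter configuration during training, can be sketched roughly as follows. This is an illustrative assumption using the codecarbon library, not code from this repository; the configurations and the train() stub are hypothetical placeholders.

# Illustrative sketch, not from this repository: track estimated emissions
# for each hyperparameter configuration with codecarbon.
from codecarbon import EmissionsTracker

configs = [
    {"hidden_size": 256, "num_layers": 4},   # hypothetical settings
    {"hidden_size": 512, "num_layers": 8},
]

def train(config):
    # Placeholder for the actual training loop of the chosen architecture.
    pass

for config in configs:
    tracker = EmissionsTracker(project_name="low-carbon-nlp-sweep")
    tracker.start()
    train(config)
    emissions_kg = tracker.stop()  # estimated kg CO2-eq for this run
    print(config, f"{emissions_kg:.4f} kg CO2-eq")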

No commits in the last 6 months.

Use this if you are training large language models and want to understand and reduce the energy consumption associated with different training parameters.

Not ideal if you are looking for pre-trained low-carbon models or tools to measure the energy of an already-trained model in production.

sustainable-AI NLP-research deep-learning-optimization energy-efficiency carbon-footprint-reduction
No License · Stale (6m) · No Package · No Dependents

Maintenance: 0 / 25
Adoption: 4 / 25
Maturity: 8 / 25
Community: 0 / 25


Stars: 5
Forks:
Language: Python
License: None
Last pushed: Jun 16, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/StrombergNLP/Low-Carbon-NLP"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
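The same data can also be fetched from Python; a minimal sketch, assuming the endpoint returns JSON:

# Minimal sketch: query the quality endpoint shown above; assumes a JSON response.
import requests

url = "https://pt-edge.onrender.com/api/v1/quality/nlp/StrombergNLP/Low-Carbon-NLP"
response = requests.get(url, timeout=30)
response.raise_for_status()
print(response.json())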