VHellendoorn/Code-LMs

Guide to using pre-trained large language models of source code

Score: 48 / 100 (Emerging)

Provides pre-trained PolyCoder models (160M–2.7B parameters) trained on 12-language code corpora using GPT-NeoX, available via Hugging Face transformers or custom checkpoints. Supports code generation with configurable temperature and beam search, includes evaluation harnesses for perplexity and HumanEval benchmarks, and offers Docker containerization with GPU support for inference and fine-tuning workflows.
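The decoding knobs mentioned above (temperature, beam search) map onto Hugging Face `generate` arguments. A minimal sketch, assuming the `NinedayWang/PolyCoder-160M` checkpoint id on the Hugging Face Hub and the decoding defaults shown here (both assumptions, not the repository's own settings):

```python
def decoding_options(temperature: float = 0.2, num_beams: int = 1) -> dict:
    """Collect generate() kwargs for the knobs the summary mentions:
    sampling temperature and beam-search width. Defaults are illustrative."""
    opts = {
        "max_new_tokens": 64,
        "do_sample": temperature > 0,  # greedy/beam decoding when temperature is 0
        "num_beams": num_beams,
    }
    if temperature > 0:
        opts["temperature"] = temperature
    return opts

if __name__ == "__main__":
    # Heavy part: downloads the checkpoint on first run.
    from transformers import AutoTokenizer, AutoModelForCausalLM

    model_id = "NinedayWang/PolyCoder-160M"  # assumed Hub id; swap in your checkpoint
    tok = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    prompt = "def binary_search(arr, target):"
    ids = tok(prompt, return_tensors="pt").input_ids
    out = model.generate(ids, **decoding_options(temperature=0.2))
    print(tok.decode(out[0], skip_special_tokens=True))
```

Setting `num_beams > 1` with `temperature = 0` gives deterministic beam search; a positive temperature enables sampling instead.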

1,842 stars. No commits in the last 6 months.

Flags: Stale (6m), No Package, No Dependents

Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 22 / 25


Stars: 1,842
Forks: 265
Language: Python
License: MIT
Last pushed: Jul 07, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/VHellendoorn/Code-LMs"

Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000/day.