ewijaya/protein-lm-distill
Compact protein language models via knowledge distillation from ProtGPT2. 10-50x faster inference with maintained generation quality. Enables high-throughput screening workflows and secure on-premise deployment. Pre-trained models available on HuggingFace.
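The description mentions knowledge distillation from ProtGPT2 but the card does not show the training objective. Below is a minimal sketch of a standard temperature-scaled distillation loss (Hinton-style soft-target KL divergence), which is one common way such a student model is trained; the function names and the temperature value are illustrative, not taken from this repository.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Numerically stable softmax with temperature T."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distill_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) over temperature-softened distributions.

    The T**2 factor keeps gradient magnitudes comparable across
    temperatures, as in standard knowledge distillation. Shapes:
    (..., vocab_size) for both logit arrays.
    """
    p_t = softmax(teacher_logits, T)
    log_p_s = np.log(softmax(student_logits, T))
    log_p_t = np.log(p_t)
    return float((T * T) * (p_t * (log_p_t - log_p_s)).sum(axis=-1).mean())
```

When the student exactly matches the teacher the loss is zero; any divergence in the softened distributions makes it positive.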
Stars: 2
Forks: —
Language: Python
License: —
Category: —
Last pushed: Mar 11, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/ewijaya/protein-lm-distill"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
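For scripting, the same endpoint can be called from Python with only the standard library. The URL comes from the curl example above; the response schema and the API-key header name are assumptions (the page does not document them), so treat this as a sketch.

```python
import json
import urllib.request

# Endpoint from the curl example on this page.
URL = "https://pt-edge.onrender.com/api/v1/quality/transformers/ewijaya/protein-lm-distill"

def fetch_quality(url=URL, api_key=None):
    """Fetch the repo-quality record as a parsed JSON object.

    api_key is optional (100 requests/day without one, per the page);
    the "X-API-Key" header name is a guess, not documented here.
    """
    req = urllib.request.Request(url)
    if api_key:
        req.add_header("X-API-Key", api_key)  # hypothetical header name
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)
```

Usage: `data = fetch_quality()` returns whatever JSON the endpoint serves; inspect the keys rather than assuming a schema.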
Higher-rated alternatives
lucidrains/alphagenome
Implementation of AlphaGenome, DeepMind's updated genomic attention model
BiomedSciAI/biomed-multi-omic
Builds foundation models for RNA or DNA data
BioinfoMachineLearning/DeepInteract
A geometric deep learning framework (Geometric Transformers) for predicting protein interface...
Gleghorn-Lab/Protify
Low-code molecular property prediction
Bindwell/PLAPT
Codebase and CLI for PLAPT: A state-of-the-art protein-ligand binding affinity model for drug discovery