julianscher/gpt-adaprune
An integrated PyTorch pipeline for pretraining GPT-2 on linear regression tasks with curriculum learning and fine-tuning on extended downstream tasks. Includes adaptive pruning (AdaPrune) to compare pruning-based adaptation against standard gradient-based fine-tuning.
No commits in the last 6 months.
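To make the description concrete: curriculum-scheduled linear regression pretraining typically samples a fresh weight vector per task and grows the effective problem size over training. The sketch below is a hypothetical illustration of that data-sampling pattern; the function and parameter names (`sample_batch`, `cur_dim`, `cur_points`) are invented for this example and are not taken from the gpt-adaprune codebase.

```python
# Hypothetical sketch of curriculum-scheduled linear-regression task sampling;
# names and schedule constants are illustrative, not gpt-adaprune's actual API.
import torch

def sample_batch(batch_size, max_dim, cur_dim, cur_points):
    """Sample in-context linear regression prompts.

    Each task draws a weight vector w ~ N(0, I) and produces
    (x_i, y_i = <w, x_i>) pairs. The curriculum restricts the
    effective input dimension to `cur_dim` by zeroing the rest.
    """
    xs = torch.randn(batch_size, cur_points, max_dim)
    xs[:, :, cur_dim:] = 0.0           # curriculum: mask out unused dimensions
    w = torch.randn(batch_size, max_dim, 1)
    w[:, cur_dim:, :] = 0.0
    ys = (xs @ w).squeeze(-1)          # noiseless linear targets, shape (B, P)
    return xs, ys

# Curriculum schedule: start small, grow toward the full task size.
for step in range(20_000):
    cur_dim = min(5 + step // 1_000, 20)      # e.g. 5 -> 20 dimensions
    cur_points = min(10 + step // 500, 40)    # e.g. 10 -> 40 in-context points
    xs, ys = sample_batch(64, max_dim=20, cur_dim=cur_dim, cur_points=cur_points)
    # ... feed (xs, ys) to the GPT-2 model and take a gradient step
```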
Stars: —
Forks: —
Language: Python
License: MIT
Category: —
Last pushed: Aug 17, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/julianscher/gpt-adaprune"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.
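For scripted access, the same endpoint can be queried from Python. This is a minimal sketch that assumes the API returns JSON; the response schema is not documented in this listing.

```python
# Minimal sketch of calling the quality API from Python.
# Assumes a JSON response body; the schema is not documented here.
import requests

url = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/julianscher/gpt-adaprune"
resp = requests.get(url, timeout=10)
resp.raise_for_status()
print(resp.json())  # inspect the returned quality data
```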
Higher-rated alternatives
- Xilinx/brevitas - Brevitas: neural network quantization in PyTorch
- fastmachinelearning/qonnx - QONNX: Arbitrary-Precision Quantized Neural Networks in ONNX
- open-mmlab/mmengine - OpenMMLab Foundational Library for Training Deep Learning Models
- google/qkeras - QKeras: a quantization deep learning library for Tensorflow Keras
- tensorflow/model-optimization - A toolkit to optimize ML models for deployment for Keras and TensorFlow, including quantization...