UKPLab/arxiv2025-inherent-limits-plms
Code repository for the paper "The Inherent Limits of Pretrained LLMs: The Unexpected Convergence of Instruction Tuning and In-Context Learning Capabilities"
No commits in the last 6 months.
Stars: 13
Forks: —
Language: Python
License: Apache-2.0
Category: —
Last pushed: Jan 16, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/UKPLab/arxiv2025-inherent-limits-plms"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
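The same endpoint can be called from code. A minimal sketch in Python using only the standard library, assuming the endpoint returns a JSON body (the response schema is not documented here; `build_url` and `fetch_quality` are illustrative names):

```python
import json
import urllib.request

# Endpoint shape taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def build_url(owner: str, repo: str) -> str:
    """Build the quality-API URL for a given GitHub owner/repo pair."""
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch repo quality data; assumes a JSON response (schema undocumented here).

    No key is needed for up to 100 requests/day.
    """
    with urllib.request.urlopen(build_url(owner, repo)) as resp:
        return json.load(resp)

# Usage (performs a live request):
# data = fetch_quality("UKPLab", "arxiv2025-inherent-limits-plms")
```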
Higher-rated alternatives
jncraton/languagemodels: Explore large language models in 512MB of RAM
microsoft/unilm: Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
haizelabs/verdict: Inference-time scaling for LLMs-as-a-judge
bytedance/Sa2VA: Official Repo For Pixel-LLM Codebase
albertan017/LLM4Decompile: Reverse Engineering: Decompiling Binary Code with Large Language Models