thunlp/InfLLM
The code of our paper "InfLLM: Unveiling the Intrinsic Capacity of LLMs for Understanding Extremely Long Sequences with Training-Free Memory"
Quality score: 35 / 100 (Emerging)
395 stars. No commits in the last 6 months.
Flags: Stale (6 months), No Package, No Dependents

Score breakdown:
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 9 / 25
Community: 16 / 25
Stars: 395
Forks: 39
Language: Python
License: MIT
Category:
Last pushed: Apr 20, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/thunlp/InfLLM"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
Higher-rated alternatives:
jncraton/languagemodels (score 60): Explore large language models in 512MB of RAM
microsoft/unilm (score 57): Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
albertan017/LLM4Decompile (score 54): Reverse Engineering: Decompiling Binary Code with Large Language Models
haizelabs/verdict (score 48): Inference-time scaling for LLMs-as-a-judge
bytedance/Sa2VA (score 47): Official Repo For Pixel-LLM Codebase