richardwhiteii/rlm
Recursive Language Model patterns for Claude Code — handle massive contexts (10M+ tokens) by treating them as external variables
Stars: 41
Forks: 4
Language: Python
License: MIT
Last pushed: Feb 11, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/richardwhiteii/rlm"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
Higher-rated alternatives
Dao-AILab/flash-attention
Fast and memory-efficient exact attention
wuwangzhang1216/abliterix
Fully automatic censorship removal for language models. LoRA abliteration + Optuna TPE optimization.
lucidrains/deep-cross-attention
Implementation of the proposed DeepCrossAttention by Heddes et al. at Google Research, in PyTorch
qnbs/CannaGuide-2025
🌿 CannaGuide 2025 Cannabis Grow Guide: AI-powered digital companion for the entire cannabis...
modelscope/mcore-bridge
MCore-Bridge: Providing Megatron-Core model definitions for state-of-the-art large models and...