neoheartbeats/neoheartbeats-kernel
An architecture for LLMs' continual-learning and long-term memories
This project offers an architecture for building large language models (LLMs) that learn continuously and maintain long-term memories. It takes in conversational data, including custom personas, and outputs refined LLMs that produce more natural, human-preference-aligned responses. It is aimed at AI researchers and developers working on advanced LLM capabilities.
No commits in the last 6 months.
Use this if you are an AI researcher or developer aiming to build LLMs that adapt and remember over extended interactions, moving beyond static model limitations.
Not ideal if you are looking for a plug-and-play solution for basic LLM deployment without deep customization or architectural exploration.
Stars
6
Forks
—
Language
Jupyter Notebook
License
MIT
Category
Last pushed
Sep 23, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/neoheartbeats/neoheartbeats-kernel"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.
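For programmatic use, the curl call above can be wrapped in a small helper. This is a minimal sketch using only the Python standard library; the response schema is not documented here, so the returned JSON is treated as an opaque dict, and the `fetch_quality` helper name is our own.

```python
import json
import urllib.request
from urllib.parse import quote

# Base endpoint taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"


def quality_url(owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a given GitHub owner/repo pair."""
    return f"{API_BASE}/{quote(owner)}/{quote(repo)}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality record as parsed JSON.

    The response schema is an assumption (the API docs are not shown here);
    callers should inspect the returned dict before relying on any field.
    """
    with urllib.request.urlopen(quality_url(owner, repo), timeout=10) as resp:
        return json.load(resp)


if __name__ == "__main__":
    print(quality_url("neoheartbeats", "neoheartbeats-kernel"))
```

Without a key this stays within the 100 requests/day anonymous limit; how a key is attached (header vs. query parameter) is not specified on this page, so it is left out of the sketch.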
Higher-rated alternatives
OptimalScale/LMFlow
An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Models for All.
adithya-s-k/AI-Engineering.academy
Mastering Applied AI, One Concept at a Time
young-geng/scalax
A simple library for scaling up JAX programs
jax-ml/jax-llm-examples
Minimal yet performant LLM examples in pure JAX
riyanshibohra/TuneKit
Upload your data → Get a fine-tuned SLM. Free.