neoheartbeats/neoheartbeats-kernel

An architecture for continual learning and long-term memory in LLMs

Score: 20 / 100 (Experimental)

This project offers an architecture for building large language models (LLMs) that can learn continuously and maintain long-term memories. It takes in conversational data, including custom personas, and outputs refined LLMs capable of more natural, human-preference-aligned responses. It is aimed at AI researchers and developers working on advanced LLM capabilities.

No commits in the last 6 months.

Use this if you are an AI researcher or developer aiming to build LLMs that adapt and remember over extended interactions, moving beyond static model limitations.

Not ideal if you are looking for a plug-and-play solution for basic LLM deployment without deep customization or architectural exploration.

Tags: AI research, LLM development, continual learning, long-term memory, natural language processing
Status: Stale (6 months), no package published, no dependents

Maintenance: 0 / 25
Adoption: 4 / 25
Maturity: 16 / 25
Community: 0 / 25


Stars: 6
Forks:
Language: Jupyter Notebook
License: MIT
Category: llm-fine-tuning
Last pushed: Sep 23, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/neoheartbeats/neoheartbeats-kernel"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
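To consume the endpoint programmatically rather than via curl, the sketch below fetches the same URL with Python's standard library and prints the JSON payload. It is a minimal sketch under the assumption that the endpoint returns JSON; no response field names are assumed, so the whole body is printed as-is.

import json
import urllib.request

# Quality-score endpoint shown above (no API key needed for up to 100 requests/day).
URL = "https://pt-edge.onrender.com/api/v1/quality/transformers/neoheartbeats/neoheartbeats-kernel"

with urllib.request.urlopen(URL, timeout=10) as resp:
    data = json.load(resp)

# The response schema is not documented here, so print the full payload.
print(json.dumps(data, indent=2))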