theaiautomators/insights-lm-local-package

Open-source, fully private, local alternative to NotebookLM. Chat with your documents, generate audio summaries, and ground AI in your own sources. Built with Supabase and n8n behind a React frontend, using Ollama for local inference.

Score: 51 / 100 (Established)

Supports local audio transcription via Whisper ASR and text-to-speech synthesis through Coqui TTS, enabling fully offline podcast generation and multi-modal document interaction. The backend re-engineers N8N workflows to orchestrate local inference pipelines—LLM embeddings, transcription, and synthesis—while Supabase edge functions handle document processing and citation retrieval without external API calls. Containerized via Docker with GPU acceleration support, the stack decouples from cloud dependencies while maintaining the original frontend's React/Vite interface.
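The containerized stack described above can be approximated with plain `docker run` commands. This is a hedged sketch only: the image tags, ports, and service names below are illustrative assumptions, not the repository's actual compose configuration, which should be taken from the project itself.

```shell
# Sketch of the local stack (assumed images/ports, not the repo's compose file).

# Ollama for local LLM inference and embeddings (official image)
docker run -d --name ollama -p 11434:11434 ollama/ollama

# n8n to orchestrate the local inference workflows (official image)
docker run -d --name n8n -p 5678:5678 n8nio/n8n

# Whisper ASR webservice for local audio transcription (assumed image)
docker run -d --name whisper -p 9000:9000 onerahmet/openai-whisper-asr-webservice

# Coqui TTS server for speech synthesis (assumed image)
docker run -d --name tts -p 5002:5002 ghcr.io/coqui-ai/tts
```

With GPU acceleration, each `docker run` would additionally pass `--gpus all`; Supabase itself is typically started via its own self-hosting setup rather than a single container.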

200 stars. No commits in the last 6 months.

Stale (6m) · No Package · No Dependents
Maintenance 2 / 25
Adoption 10 / 25
Maturity 15 / 25
Community 24 / 25


Stars: 200
Forks: 102
Language: TypeScript
License: MIT
Last pushed: Sep 12, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/rag/theaiautomators/insights-lm-local-package"

Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
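The same endpoint can be called from code. A minimal Python sketch using only the standard library is shown below; the response's JSON schema is not documented here, so the body is decoded generically rather than into named fields, and the `quality_url`/`fetch_quality` helper names are this sketch's own.

```python
import json
import urllib.request

# Base endpoint taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/rag"


def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """GET the endpoint and decode the JSON body (requires network access)."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)


# URL construction matches the curl command above:
print(quality_url("theaiautomators", "insights-lm-local-package"))
```

Authentication details for the keyed 1,000/day tier are not specified on this page, so no key handling is shown.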