louisgv/local.ai
🎒 local.ai - Run AI locally on your PC!
Provides a curated model registry with hardware recommendations and cryptographic verification, plus a streaming inference API compatible with OpenAI's completion endpoint. Built on the Rust `llm` crate for efficient local inference, it integrates with window.ai to enable any web application to run AI without cloud costs. Includes a note-taking interface with per-note inference configurations exported as MDX.
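Since the project advertises a streaming inference API compatible with OpenAI's completion endpoint, a short sketch of what a client interaction might look like can help. The port, path, and response fields below are assumptions modeled on OpenAI's completion format, which the project says it mirrors; verify them against local.ai's own docs:

```typescript
// Sketch of talking to a local.ai-style server. The port, path, and
// response shape are assumptions based on OpenAI's completion API,
// not confirmed by this listing.

interface CompletionRequest {
  prompt: string;
  max_tokens: number;
  temperature: number;
  stream: boolean;
}

// Build an OpenAI-style completion payload.
function buildRequest(prompt: string): CompletionRequest {
  return { prompt, max_tokens: 128, temperature: 0.7, stream: true };
}

// Parse one server-sent-events line ("data: {...}") and pull out the
// completion text; return null for non-data lines and the final "[DONE]".
function parseSSELine(line: string): string | null {
  if (!line.startsWith("data: ")) return null;
  const payload = line.slice("data: ".length).trim();
  if (payload === "[DONE]") return null;
  const chunk = JSON.parse(payload);
  return chunk.choices?.[0]?.text ?? null;
}

// Usage (network call omitted; the endpoint URL is hypothetical):
//   fetch("http://localhost:8000/completions", {
//     method: "POST",
//     headers: { "Content-Type": "application/json" },
//     body: JSON.stringify(buildRequest("Hello")),
//   });
console.log(parseSSELine('data: {"choices":[{"text":"Hi"}]}')); // "Hi"
```

Because the endpoint follows OpenAI's wire format, any client that already speaks that protocol should only need its base URL pointed at the local server.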
716 stars. No commits in the last 6 months.
Stars
716
Forks
68
Language
TypeScript
License
GPL-3.0
Category
llm-tools
Last pushed
Sep 24, 2023
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/louisgv/local.ai"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
ParisNeo/lollms-webui
Lord of Large Language and Multi modal Systems Web User Interface
ggozad/oterm
the terminal client for Ollama
hand-e-fr/OpenHosta
A lightweight library integrating LLM natively into Python
owndev/Open-WebUI-Functions
Open-WebUI-Functions is a collection of custom pipelines, filters, and integrations designed to...
F33RNI/LlM-Api-Open
Unofficial open APIs for popular LLMs (currently for ChatGPT and MS Copilot) with self-hosted...