shadowfax92/Fyin
Open-source alternative to Perplexity AI with the ability to run locally
Combines web search, local vector embeddings, and parallel scraping to generate cited answers, supporting both OpenAI and local Ollama models. Integrates with multiple search backends (Bing, SearXNG, DuckDuckGo) and uses configurable embedding models for semantic search. Built in Rust, with Docker support for containerized deployment.
229 stars. No commits in the last 6 months.
Stars: 229
Forks: 24
Language: Rust
License: AGPL-3.0
Category:
Last pushed: Oct 09, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/rag/shadowfax92/Fyin"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000 requests/day.
Higher-rated alternatives
undreamai/LLMUnity
Create characters in Unity with LLMs!
Mintplex-Labs/anythingllm-docs
Documentation of AnythingLLM by Mintplex Labs Inc.
bloodworks-io/phlox
Open source, local first AI medical scribe for desktop and web.
mamei16/LLM_Web_search
An extension for oobabooga/text-generation-webui that enables the LLM to search the web
snexus/llm-search
Querying local documents, powered by LLM