LLocalSearch and local-genAI-search
Both tools are competitors: each is a locally running, LLM-powered generative search engine designed to answer user questions. LLocalSearch aggregates live web search results, while local-genAI-search (built on Llama 3) answers questions from your local files.
About LLocalSearch
nilsherzig/LLocalSearch
LLocalSearch is a completely locally running search aggregator using LLM Agents. The user can ask a question and the system will use a chain of LLMs to find the answer. The user can see the progress of the agents and the final answer. No OpenAI or Google API keys are needed.
It implements agentic recursion: local LLMs autonomously select and chain multiple web search tools based on intermediate results, with live execution logs and source links displayed during the reasoning process. Built with LangChain/LangChainGo and Ollama for model inference, it offers a privacy-first alternative to commercial search aggregators that prioritize advertiser placement over result quality.
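The agentic loop described above can be sketched in miniature: a model repeatedly picks a tool, observes the result, and stops once it can answer. This is a simplified illustration, not LLocalSearch's actual code; the `stub_llm` function and the tool registry are hypothetical stand-ins for the Ollama-backed model and web search tools the real project uses.

```python
from typing import Callable

# Hypothetical tool registry; the real project wires these to web search.
TOOLS: dict[str, Callable[[str], str]] = {
    "web_search": lambda q: f"top results for '{q}'",
    "read_page": lambda url: f"contents of {url}",
}

def stub_llm(scratchpad: list[str]) -> tuple[str, str]:
    """Rule-based stand-in for the local LLM's tool-selection step."""
    if not scratchpad:
        return ("web_search", "llocalsearch architecture")
    if len(scratchpad) == 1:
        return ("read_page", "https://example.org/result-1")
    return ("answer", "summary drawn from: " + "; ".join(scratchpad))

def agent_loop(max_steps: int = 5) -> str:
    # The scratchpad corresponds to the live execution logs shown in the UI.
    scratchpad: list[str] = []
    for _ in range(max_steps):
        action, arg = stub_llm(scratchpad)
        if action == "answer":
            return arg
        scratchpad.append(TOOLS[action](arg))
    return "no answer within step budget"

print(agent_loop())
```

The key design point is that the chain length is not fixed: the model decides at each step whether to call another tool or emit a final answer, which is what distinguishes this from a static search-then-summarize pipeline.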
About local-genAI-search
nikolamilosevic86/local-genAI-search
Local-GenAI-Search is a generative search engine based on Llama 3, LangChain, and Qdrant that answers questions based on your local files.
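The retrieve-then-generate pattern behind this kind of local-file search can be sketched without the heavy dependencies. In the real project, Qdrant stores dense embeddings and Llama 3 generates the answer; in this hedged sketch, a bag-of-words cosine similarity and a prompt template stand in for both.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; the real project uses dense vectors in Qdrant.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def answer(question: str, files: dict[str, str], k: int = 2) -> str:
    # Rank local files by similarity to the question, keep the top k.
    q = embed(question)
    ranked = sorted(files, key=lambda f: cosine(q, embed(files[f])), reverse=True)
    context = "\n".join(f"[{f}] {files[f]}" for f in ranked[:k])
    # A real system would feed this prompt to Llama 3; we return it directly.
    return f"Context:\n{context}\nQ: {question}"

docs = {
    "notes.txt": "qdrant stores dense vectors for semantic search",
    "todo.txt": "buy groceries and water the plants",
}
print(answer("how does semantic search store vectors", docs, k=1))
```

The useful property to notice is that generation is grounded: the model only sees the retrieved file snippets, which is what lets the tool cite which local documents an answer came from.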