ragflow and agentic-rag-for-dummies

RAGFlow is a comprehensive, production-ready RAG engine, while Agentic RAG for Dummies is an educational framework for learning agentic RAG patterns. The two are complements: one serves as a reference implementation, the other as a learning resource. They can also function as alternatives, depending on whether production deployment or learning is the priority.

ragflow: score 72 (Verified)
  Maintenance 25/25 | Adoption 10/25 | Maturity 16/25 | Community 21/25
  Stars: 74,911 | Forks: 8,368 | Commits (30d): 243
  Language: Python | License: Apache-2.0
  No package published, no dependents

agentic-rag-for-dummies: score 65 (Established)
  Maintenance 20/25 | Adoption 10/25 | Maturity 13/25 | Community 22/25
  Stars: 2,743 | Forks: 383 | Commits (30d): 15
  Language: Jupyter Notebook | License: MIT
  No package published, no dependents

About ragflow

infiniflow/ragflow

RAGFlow is a leading open-source Retrieval-Augmented Generation (RAG) engine that fuses cutting-edge RAG with Agent capabilities to create a superior context layer for LLMs

This tool helps create advanced AI assistants that can accurately answer questions using your specific business documents and data. You input various documents like PDFs, Word files, web pages, and even structured data, and it outputs a system that provides precise, traceable answers. It's designed for business leaders, knowledge managers, or AI product developers who need to build reliable question-answering systems for internal teams or customers.
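The "precise, traceable answers" described above come from the retrieve-then-cite pattern: every answer carries a pointer back to the document chunk it was drawn from. The following is a minimal pure-Python sketch of that pattern under simplifying assumptions (keyword overlap instead of embeddings, hypothetical function names); it is not RAGFlow's actual API.

```python
# Hypothetical sketch of retrieve-then-cite RAG (not RAGFlow's API).
# Each answer is returned together with a citation to the source
# document and chunk number, which is what makes answers traceable.

def index_documents(docs: dict, chunk_words: int = 30):
    """docs maps a document name (e.g. 'handbook.pdf') to its text."""
    chunks = []  # list of (doc_name, chunk_no, chunk_text)
    for name, text in docs.items():
        words = text.split()
        for i, start in enumerate(range(0, len(words), chunk_words)):
            chunks.append((name, i, " ".join(words[start:start + chunk_words])))
    return chunks

def answer(query: str, chunks):
    """Return the best-matching chunk plus a citation to its source."""
    q = set(query.lower().split())
    name, no, text = max(chunks, key=lambda c: len(q & set(c[2].lower().split())))
    return {"answer": text, "source": f"{name}#chunk{no}"}
```

A real engine replaces the keyword overlap with embedding similarity and the raw chunk with an LLM-generated answer, but the citation bookkeeping is the same idea.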

knowledge-management enterprise-search customer-support-automation business-intelligence document-intelligence

About agentic-rag-for-dummies

GiovanniPasq/agentic-rag-for-dummies

A modular Agentic RAG built with LangGraph — learn Retrieval-Augmented Generation Agents in minutes.

Built on LangGraph's agentic framework, this system implements hierarchical parent-child chunk indexing for precise search paired with context-rich retrieval, conversation memory across turns, and human-in-the-loop query clarification. A multi-agent map-reduce stage parallelizes sub-query resolution with self-correction and context compression. It supports pluggable LLM providers (Ollama, OpenAI, Anthropic, Google) and Qdrant vector storage, all orchestrated through observable graph execution with Langfuse integration.
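The core idea of parent-child chunk indexing is that small child chunks are matched for precision, but the larger parent chunk they belong to is what gets handed to the LLM for context. Here is a minimal pure-Python sketch of that scheme under simplifying assumptions (keyword-overlap scoring stands in for the vector search a real system would do with Qdrant; function names are illustrative, not from this repo):

```python
# Illustrative sketch of hierarchical parent-child chunk indexing.
# Small child chunks are searched for precision; each child maps back
# to its larger parent chunk, which is returned for rich context.

def build_index(document: str, parent_size: int = 200, child_size: int = 50):
    """Split the text into parent chunks, then each parent into children."""
    parents = []           # large, context-rich chunks
    child_to_parent = {}   # child text -> index of its parent
    words = document.split()
    for p_start in range(0, len(words), parent_size):
        parent = " ".join(words[p_start:p_start + parent_size])
        p_id = len(parents)
        parents.append(parent)
        p_words = parent.split()
        for c_start in range(0, len(p_words), child_size):
            child = " ".join(p_words[c_start:c_start + child_size])
            child_to_parent[child] = p_id
    return parents, child_to_parent

def retrieve(query: str, parents, child_to_parent):
    """Match children by keyword overlap; return the best child's parent."""
    q = set(query.lower().split())
    best = max(child_to_parent, key=lambda c: len(q & set(c.lower().split())))
    return parents[child_to_parent[best]]
```

Swapping the overlap score for embedding similarity against a vector store yields the precision-plus-context behavior the description above refers to.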

Scores are updated daily from GitHub, PyPI, and npm data.