paper-qa and ask-pdf

These projects are competitors: both implement retrieval-augmented generation (RAG) pipelines for PDF question answering, combining a vector database with an LLM. paper-qa differentiates itself through optimizations for scientific documents and citation generation, while ask-pdf is a simpler, general-purpose implementation.
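The shared pipeline both projects implement can be sketched minimally: chunk a document, embed the chunks, retrieve the nearest neighbors for a question, and pass them to an LLM as context. A toy, self-contained sketch in which bag-of-words cosine similarity stands in for a real embedding model (all function names here are hypothetical, not either project's API):

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy stand-in for a real embedding model: bag-of-words counts.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(chunks: list[str], question: str, k: int = 2) -> list[str]:
    # Rank stored chunks by similarity to the question; in a real
    # pipeline the top-k land in the LLM prompt as context.
    q = embed(question)
    ranked = sorted(chunks, key=lambda c: cosine(embed(c), q), reverse=True)
    return ranked[:k]

chunks = [
    "Transformers use self-attention to weigh token interactions.",
    "The dataset was collected from 2019 to 2021.",
    "Self-attention scales quadratically with sequence length.",
]
print(retrieve(chunks, "How does self-attention scale?", k=1))
```

Both tools replace the toy embedding with a real model and a persistent vector store; paper-qa additionally tracks which source chunk supports each sentence of the answer so it can emit citations.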

paper-qa: score 77 (Verified)
  Maintenance 20/25 · Adoption 12/25 · Maturity 25/25 · Community 20/25
  Stars: 8,264 · Forks: 838 · Downloads: (not listed) · Commits (30d): 7
  Language: Python · License: Apache-2.0
  Risk flags: none

ask-pdf: score 26 (Experimental)
  Maintenance 2/25 · Adoption 5/25 · Maturity 9/25 · Community 10/25
  Stars: 14 · Forks: 2 · Downloads: (not listed) · Commits (30d): 0
  Language: Jupyter Notebook · License: MIT
  Risk flags: Stale 6m · No Package · No Dependents

About paper-qa

Future-House/paper-qa

High accuracy RAG for answering questions from scientific documents with citations

Implements agentic RAG with iterative query refinement and LLM-based re-ranking, automatically enriches documents with metadata (citations, journal quality) from Semantic Scholar and Crossref, and supports multiple document formats (PDFs, text, code, Office files) with full-text search via tantivy. Integrates with any LiteLLM-supported model provider and offers local embedding alternatives, enabling deployment without proprietary APIs.
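The LLM-based re-ranking step mentioned above can be illustrated as a second scoring pass over the chunks returned by vector search. A hedged sketch in which a keyword-overlap stub stands in for the LLM relevance judgment (function names are hypothetical, not paper-qa's API):

```python
def llm_relevance_score(question: str, chunk: str) -> float:
    # Stub for an LLM call that returns a 0-10 relevance judgment;
    # keyword overlap stands in for the model here.
    q_terms = set(question.lower().split())
    c_terms = set(chunk.lower().split())
    return 10.0 * len(q_terms & c_terms) / max(len(q_terms), 1)

def rerank(question: str, candidates: list[str], keep: int = 2) -> list[str]:
    # Second pass: re-score vector-search candidates with the
    # (stubbed) judge and keep only the strongest evidence.
    scored = sorted(
        candidates,
        key=lambda c: llm_relevance_score(question, c),
        reverse=True,
    )
    return scored[:keep]

candidates = [
    "citations improve trust in generated answers",
    "the weather dataset covers europe",
    "generated answers should cite their sources",
]
print(rerank("why cite sources in generated answers", candidates, keep=2))
```

Re-ranking with a stronger model after cheap vector retrieval is a common RAG design choice: the vector store narrows thousands of chunks to dozens, and the LLM pass filters those down to the few worth citing.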

About ask-pdf

ralphcajipe/ask-pdf

A RAG Application to Ask Questions from a PDF Document using Large Language Models and Vector Database

Scores updated daily from GitHub, PyPI, and npm data.