llmware and End-to-End-LLM-Projects

These are complements: llmware provides a production-ready framework for building RAG pipelines, while End-to-End-LLM-Projects is a set of educational reference implementations demonstrating similar RAG, function-calling, and agent patterns that practitioners could integrate into a llmware-based system.

llmware — score 84 (Verified)
  Maintenance 17/25 · Adoption 17/25 · Maturity 25/25 · Community 25/25
  Stars: 14,864 · Forks: 2,964 · Downloads: 1,177 · Commits (30d): 12
  Language: Python · License: Apache-2.0
  Risk flags: none

End-to-End-LLM-Projects — score 21 (Experimental)
  Maintenance 0/25 · Adoption 5/25 · Maturity 9/25 · Community 7/25
  Stars: 11 · Forks: 1 · Downloads: n/a · Commits (30d): 0
  Language: Jupyter Notebook · License: MIT
  Risk flags: Stale 6m, No Package, No Dependents

About llmware

llmware-ai/llmware

Unified framework for building enterprise RAG pipelines with small, specialized models

Brings together prepackaged quantized models (50+ specialized for RAG tasks such as extraction, classification, and summarization) and a modular RAG pipeline with multi-format document parsing, vector embedding across multiple backends (ChromaDB, Milvus), and hybrid query capabilities (text, semantic, and metadata filters). The unified ModelCatalog interface abstracts over diverse inference engines (GGUF, OpenVINO, ONNX Runtime, Hugging Face), enabling the same code to run on-device across CPUs, GPUs, and NPUs on Windows, macOS, and Linux. Prompt objects orchestrate end-to-end knowledge retrieval and generation, automatically batching sources to fit model context windows while tracking provenance for fact-checking answers against source materials.
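The source-batching idea in that last sentence can be illustrated with a short, self-contained sketch. This is not llmware's actual implementation or API; the function and field names below are hypothetical, and a character budget stands in for a real token-counted context window.

```python
# Hypothetical sketch: pack source text chunks into batches that fit a
# context budget, recording which chunks went into each batch so that a
# generated answer can later be traced back to its sources (provenance).

def pack_sources(chunks, context_budget):
    """chunks: list of (source_id, text) pairs.
    context_budget: max characters of source text per batch (a stand-in
    for a token-counted model context window).
    Returns a list of {"context": str, "sources": [source_id, ...]} dicts."""
    batches, current_text, current_ids, used = [], [], [], 0
    for source_id, text in chunks:
        # Flush the current batch when the next chunk would overflow it.
        if used + len(text) > context_budget and current_text:
            batches.append({"context": "\n".join(current_text),
                            "sources": list(current_ids)})
            current_text, current_ids, used = [], [], 0
        current_text.append(text)
        current_ids.append(source_id)
        used += len(text)
    if current_text:
        batches.append({"context": "\n".join(current_text),
                        "sources": current_ids})
    return batches

chunks = [("doc1:p1", "a" * 600), ("doc1:p2", "b" * 600), ("doc2:p1", "c" * 300)]
batches = pack_sources(chunks, context_budget=1000)
# Two batches: the second 600-char chunk would overflow the first batch,
# so it starts a new one, and each batch keeps its list of source ids.
```

Each batch would then be sent to the model as one prompt, with the recorded source ids available afterward for fact-checking the response against the original documents.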

About End-to-End-LLM-Projects

pd2871/End-to-End-LLM-Projects

This repo contains code for LLM-based projects built with LangChain and LlamaIndex. It currently covers RAG, function calling, and agents with tools for interacting with data.
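The core retrieval step behind the RAG patterns these notebooks demonstrate can be sketched in a few lines of plain Python. This is an illustrative toy, not code from the repo: bag-of-words cosine similarity stands in for the vector embeddings a LangChain or LlamaIndex pipeline would use, and all names are made up.

```python
# Toy RAG retrieval sketch (illustrative only): score documents against a
# query with bag-of-words cosine similarity, then build a prompt that
# grounds the model's answer in the best-matching document.
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two texts as word-count vectors."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = (math.sqrt(sum(v * v for v in va.values()))
            * math.sqrt(sum(v * v for v in vb.values())))
    return dot / norm if norm else 0.0

docs = [
    "llmware packages quantized models for RAG tasks",
    "agents can call tools to interact with data",
]
query = "which models are packaged for RAG"

# Retrieve the most relevant document, then assemble a grounded prompt.
best = max(docs, key=lambda d: cosine(query, d))
prompt = f"Answer using this context:\n{best}\n\nQuestion: {query}"
```

A real pipeline replaces the word-count vectors with learned embeddings and a vector store, but the retrieve-then-prompt shape is the same.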


Scores updated daily from GitHub, PyPI, and npm data.