llama_index and llama-index-vector-search-javascript
LlamaIndex is the core RAG framework, while the Azure sample is a reference implementation demonstrating how to deploy LlamaIndex with Azure-specific services for vector search and LLM inference—making them complements designed to be used together.
About llama_index
run-llama/llama_index
LlamaIndex is the leading framework for building LLM-powered agents over your data
Provides a modular architecture with 300+ integration packages through LlamaHub, enabling flexible composition of LLM, embedding, and vector store providers without vendor lock-in. Core capabilities include agentic workflow orchestration, structured data extraction, and multi-format document parsing (130+ formats) via LlamaParse, with a namespace pattern that cleanly separates core abstractions from provider-specific implementations.
About llama-index-vector-search-javascript
Azure-Samples/llama-index-vector-search-javascript
A sample app for the Retrieval-Augmented Generation (RAG) pattern, built with LlamaIndex.ts and running on Azure. It uses Azure AI Search for retrieval and Azure OpenAI large language models to power ChatGPT-style and Q&A experiences over your own data.
Implements document ingestion with LlamaIndex's data connectors to populate Azure AI Search indexes, enabling semantic search over custom documents. The full-stack JavaScript solution deploys to Azure Container Apps with infrastructure-as-code (Bicep) via Azure Developer CLI, including built-in monitoring through Azure Monitor for production observability.
Scores updated daily from GitHub, PyPI, and npm data.