tecno-consultores/llm-lab
Docker Compose setup for installing N8N, Open WebUI, Qdrant, Ollama, EvolutionAPI, and other systems.
Provides modular Docker Compose profiles for assembling LLM and automation stacks, enabling selective deployment of workflow engines (N8N with distributed workers), vector databases (Qdrant), local LLMs (Ollama), speech processing (Whisper), web scraping (Crawl4ai), and messaging infrastructure (Kafka). Supports heterogeneous hardware including Nvidia GPUs, AMD64, and ARM64 architectures through architecture-specific profiles. Includes observability tooling (Redis Insight), reverse proxy management (NGINX), and integrations with chat APIs (EvolutionAPI) and low-code platforms (Flowise).
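Profile-based Compose stacks like this typically work by tagging each service with a `profiles:` key and selecting profiles at `up` time, so only the chosen subsystems start. A minimal sketch of the pattern; the service and profile names below are illustrative assumptions, not necessarily the ones this repo uses:

```yaml
# Sketch of profile-gated services in a docker-compose.yml.
# Service and profile names are illustrative; check the repo's own
# compose files for the actual ones.
services:
  n8n:
    image: n8nio/n8n
    profiles: ["n8n"]           # started only when the n8n profile is selected
  qdrant:
    image: qdrant/qdrant
    profiles: ["qdrant"]
  ollama:
    image: ollama/ollama
    profiles: ["ollama", "gpu"] # a service may belong to several profiles

# Bring up only the selected subset, e.g.:
#   docker compose --profile n8n --profile qdrant up -d
```

Services without a `profiles:` key start on every `docker compose up`, which is how a base stack can be combined with optional, hardware-specific add-ons.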
Stars: 11
Forks: 6
Language: Shell
License: GPL-2.0
Category:
Last pushed: Feb 07, 2026
Commits (30d): 0
Higher-rated alternatives
agno-agi/agno
Build, run, manage agentic software at scale.
agentscope-ai/agentscope
Build and run agents you can see, understand and trust.
adaline/gateway
The only fully local production-grade Super SDK that provides a simple, unified, and powerful...
iflytek/astron-agent
Enterprise-grade, commercial-friendly agentic workflow platform for building next-generation SuperAgents.
agentuse/agentuse
🤖 AI agents on autopilot. Any model. Runs local, cron, CI/CD, or Docker.