voluminor/openwebui-ollama-proxy
A proxy server written entirely in Go, with no external dependencies, that translates Ollama API requests into an OpenAI-compatible API format. It lets native Ollama clients (Ollie, Enchanted, CLI) work transparently with models hosted on Open WebUI. It supports streaming, multimodal requests (images), three-level caching, AES-256-GCM session ...
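To illustrate the translation step the description refers to, here is a minimal sketch of the request mapping in dependency-free Go, assuming Ollama's POST /api/chat schema on the client side and an OpenAI-style POST /v1/chat/completions endpoint upstream. The upstream path, the OPENWEBUI_URL and OPENWEBUI_API_KEY names, and the pass-through response handling are illustrative assumptions, not the project's actual code.

package main

import (
	"bytes"
	"encoding/json"
	"io"
	"log"
	"net/http"
	"os"
)

// Message is the shape shared by the Ollama and OpenAI chat schemas.
type Message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

// ollamaChatRequest mirrors the body of Ollama's POST /api/chat.
type ollamaChatRequest struct {
	Model    string    `json:"model"`
	Messages []Message `json:"messages"`
	Stream   *bool     `json:"stream,omitempty"`
}

// openAIChatRequest mirrors the body of an OpenAI-style POST /v1/chat/completions.
type openAIChatRequest struct {
	Model    string    `json:"model"`
	Messages []Message `json:"messages"`
	Stream   bool      `json:"stream"`
}

func main() {
	upstream := os.Getenv("OPENWEBUI_URL")   // e.g. http://localhost:3000 (assumed variable name)
	apiKey := os.Getenv("OPENWEBUI_API_KEY") // assumed variable name

	http.HandleFunc("/api/chat", func(w http.ResponseWriter, r *http.Request) {
		var in ollamaChatRequest
		if err := json.NewDecoder(r.Body).Decode(&in); err != nil {
			http.Error(w, err.Error(), http.StatusBadRequest)
			return
		}
		// Ollama clients default to streaming when the field is omitted.
		stream := in.Stream == nil || *in.Stream

		body, err := json.Marshal(openAIChatRequest{Model: in.Model, Messages: in.Messages, Stream: stream})
		if err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
			return
		}
		req, err := http.NewRequestWithContext(r.Context(), http.MethodPost,
			upstream+"/v1/chat/completions", bytes.NewReader(body)) // path is an assumption
		if err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
			return
		}
		req.Header.Set("Content-Type", "application/json")
		req.Header.Set("Authorization", "Bearer "+apiKey)

		resp, err := http.DefaultClient.Do(req)
		if err != nil {
			http.Error(w, err.Error(), http.StatusBadGateway)
			return
		}
		defer resp.Body.Close()
		// Simplification: pass the upstream body through unchanged; a real proxy
		// re-encodes OpenAI responses (and SSE streams) into Ollama's format.
		w.WriteHeader(resp.StatusCode)
		io.Copy(w, resp.Body)
	})

	// 11434 is the port native Ollama clients expect.
	log.Fatal(http.ListenAndServe(":11434", nil))
}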
Stars: 1
Forks: —
Language: Go
License: LGPL-2.1
Category: llm-tools
Last pushed: Mar 15, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/voluminor/openwebui-ollama-proxy"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
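For programmatic access from Go, a minimal sketch of the same call using the keyless tier (how a key is attached for the 1,000/day tier isn't documented here, so none is sent):

package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
)

func main() {
	// Same endpoint as the curl example above; keyless tier, 100 requests/day.
	const url = "https://pt-edge.onrender.com/api/v1/quality/llm-tools/voluminor/openwebui-ollama-proxy"

	resp, err := http.Get(url)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	data, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(resp.Status)
	fmt.Println(string(data)) // JSON quality record for the repository
}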
Higher-rated alternatives
cloudwego/eino: The ultimate LLM/AI application development framework in Go.
xyproto/algernon: Small self-contained pure-Go web server with Lua, Teal, Markdown, Ollama, HTTP/2, QUIC, Redis, ...
instill-ai/instill-core: 🔮 Instill Core is a full-stack AI infrastructure tool for data, model and pipeline ...
gocnn/candy: Minimalist ML framework for Go.
voocel/litellm: LiteLLM for Go, the easiest way to write LLM-based programs in Go.