voluminor/openwebui-ollama-proxy

A proxy server, written in pure Go with no external dependencies, that translates Ollama API requests into an OpenAI-compatible API format. It lets native Ollama clients (Ollie, Enchanted, the Ollama CLI) work transparently with models hosted on Open WebUI. It supports streaming, multimodal (image) requests, three-level caching, and AES-256-GCM session encryption.
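The core translation is easiest to see on a concrete request. The sketch below is illustrative and not taken from this repository: it rewrites an Ollama /api/generate body as an OpenAI-style chat completion. The struct fields follow the public Ollama and OpenAI APIs; the upstream URL and function names are hypothetical.

```go
package main

import (
	"bytes"
	"encoding/json"
	"net/http"
)

// ollamaGenerateRequest mirrors the basic fields of Ollama's /api/generate body.
type ollamaGenerateRequest struct {
	Model  string `json:"model"`
	Prompt string `json:"prompt"`
	Stream bool   `json:"stream"`
}

type openAIMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

// openAIChatRequest mirrors the basic fields of an OpenAI-compatible
// /v1/chat/completions body.
type openAIChatRequest struct {
	Model    string          `json:"model"`
	Messages []openAIMessage `json:"messages"`
	Stream   bool            `json:"stream"`
}

// translateGenerate converts an Ollama generate call into an OpenAI-style
// chat call: the bare prompt becomes a single user message.
func translateGenerate(in ollamaGenerateRequest) openAIChatRequest {
	return openAIChatRequest{
		Model:    in.Model,
		Messages: []openAIMessage{{Role: "user", Content: in.Prompt}},
		Stream:   in.Stream,
	}
}

func main() {
	// Hypothetical upstream endpoint; real Open WebUI deployments differ.
	const upstream = "http://localhost:3000/api/chat/completions"

	out := translateGenerate(ollamaGenerateRequest{
		Model:  "llama3",
		Prompt: "Why is the sky blue?",
	})
	body, _ := json.Marshal(out)
	resp, err := http.Post(upstream, "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
}
```

Streaming and image support imply the same kind of field mapping on the response side as well (Ollama's NDJSON stream versus OpenAI's SSE chunks), which this sketch does not cover.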

Quality score: 23 / 100 (Experimental)
No package published · No dependents

Maintenance: 13 / 25
Adoption: 1 / 25
Maturity: 9 / 25
Community: 0 / 25

Stars: 1
Forks:
Language: Go
License: LGPL-2.1
Last pushed: Mar 15, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/voluminor/openwebui-ollama-proxy"

Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
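The same call in Go, as a minimal sketch: the response schema isn't documented here, so the body is decoded into a generic map and printed.

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	const url = "https://pt-edge.onrender.com/api/v1/quality/llm-tools/voluminor/openwebui-ollama-proxy"

	// Unauthenticated request, within the 100 requests/day limit.
	resp, err := http.Get(url)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Decode into a generic map since the JSON schema isn't shown above.
	var data map[string]any
	if err := json.NewDecoder(resp.Body).Decode(&data); err != nil {
		panic(err)
	}
	fmt.Printf("%+v\n", data)
}
```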