Ujstor/self-hosting-ai-models
Guide to self-hosting AI models using Traefik on a home network, offering cost-effective and controlled alternatives to cloud-based services.
Orchestrates Ollama, Stable Diffusion, and Fooocus in Docker containers, with Traefik acting as reverse proxy and load balancer: it handles HTTPS/SSL termination, basic-auth middleware, and dynamic routing. Two access patterns are supported: secure SSH tunneling for local use, and web access through registered subdomains with port forwarding. A Makefile automates model installation, environment setup, and service lifecycle management across the AI inference workloads.
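The Traefik setup described above can be sketched as docker-compose labels on one of the services. This is a hypothetical minimal example, not the repo's actual configuration: the hostname, certificate resolver name, and credentials are placeholders, and only Ollama (default port 11434) is shown.

```yaml
# Sketch: exposing Ollama behind Traefik with HTTPS and basic auth.
# Assumes a Traefik v2 instance with a "websecure" entrypoint and a
# "letsencrypt" certificate resolver already configured.
services:
  ollama:
    image: ollama/ollama
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.ollama.rule=Host(`ollama.example.com`)"
      - "traefik.http.routers.ollama.entrypoints=websecure"
      - "traefik.http.routers.ollama.tls.certresolver=letsencrypt"
      # htpasswd-style user:hash pair, e.g. from `htpasswd -nb user pass`
      # ($$ escapes $ in compose files); the hash here is a placeholder.
      - "traefik.http.middlewares.ollama-auth.basicauth.users=user:$$apr1$$placeholder"
      - "traefik.http.routers.ollama.middlewares=ollama-auth@docker"
      - "traefik.http.services.ollama.loadbalancer.server.port=11434"
```

For the SSH-tunnel access pattern, a local port forward such as `ssh -L 11434:localhost:11434 user@homeserver` would expose the same Ollama API on the local machine without going through Traefik at all.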
No commits in the last 6 months.
Stars: 29
Forks: 10
Language: Makefile
License: —
Category: mlops
Last pushed: Dec 18, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mlops/Ujstor/self-hosting-ai-models"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.