Ujstor/self-hosting-ai-models

Guide to self-hosting AI models with Traefik on a home network, offering a cost-effective, self-controlled alternative to cloud-based services.

Score: 25 / 100 (Experimental)

Orchestrates Ollama, Stable Diffusion, and Fooocus across Docker containers, with Traefik acting as reverse proxy and load balancer: it handles HTTPS/SSL termination, basic-auth middleware, and dynamic routing. Two access patterns are supported: secure SSH tunneling for local use, and web access through registered subdomains with port forwarding. A Makefile automates model installation, environment setup, and service lifecycle management across the AI inference workloads.
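As a rough sketch of the routing layer described above (the hostname, email, and credentials are illustrative assumptions, not the repository's actual configuration), a minimal docker-compose service exposing Ollama through Traefik with TLS termination and basic auth might look like:

```yaml
services:
  traefik:
    image: traefik:v2.10
    command:
      - --providers.docker=true
      - --entrypoints.websecure.address=:443
      - --certificatesresolvers.le.acme.tlschallenge=true
      - --certificatesresolvers.le.acme.email=admin@example.com      # assumption
      - --certificatesresolvers.le.acme.storage=/letsencrypt/acme.json
    ports:
      - "443:443"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
      - ./letsencrypt:/letsencrypt

  ollama:
    image: ollama/ollama
    labels:
      - traefik.enable=true
      - traefik.http.routers.ollama.rule=Host(`ollama.example.com`)  # assumption
      - traefik.http.routers.ollama.entrypoints=websecure
      - traefik.http.routers.ollama.tls.certresolver=le
      # Ollama's default API port
      - traefik.http.services.ollama.loadbalancer.server.port=11434
      - traefik.http.routers.ollama.middlewares=ollama-auth
      # htpasswd-style user:hash pair; generate one with `htpasswd -nb user pass`
      # ($$ escapes a literal $ in docker-compose)
      - traefik.http.middlewares.ollama-auth.basicauth.users=user:$$apr1$$example$$hash
```

For the local access pattern, an SSH tunnel bypasses Traefik entirely: something like `ssh -N -L 11434:localhost:11434 user@home-server` forwards the Ollama API to the client machine, which then talks to `http://localhost:11434`.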

No commits in the last 6 months.

No License · Stale 6m · No Package · No Dependents
Maintenance 0 / 25
Adoption 7 / 25
Maturity 1 / 25
Community 17 / 25


Stars: 29
Forks: 10
Language: Makefile
License: none
Last pushed: Dec 18, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/mlops/Ujstor/self-hosting-ai-models"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.