SMNETSTUDIO/Groq2API
Free Groq API
Provides a containerized API wrapper exposing Groq's inference models (Llama 3, Mixtral, Gemma) via OpenAI-compatible endpoints with configurable streaming and token limits. Deploys as a lightweight Docker container or serverless function on Vercel, Koyeb, Render, or Railway, abstracting Groq's authentication behind standard `/v1/chat/completions` routes. Supports multiple open-source LLM backends with per-model token constraints and optional streaming responses for real-time inference.
339 stars. No commits in the last 6 months.
Stars: 339
Forks: 86
Language: Go
License: —
Category: —
Last pushed: Jan 17, 2025
Commits (30d): 0
Get this data via API
```shell
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/SMNETSTUDIO/Groq2API"
```
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.
Higher-rated alternatives
- langfuse/langfuse-docs: 🪢 Langfuse documentation -- Langfuse is the open source LLM Engineering Platform. Observability,...
- google/generative-ai-go: Go SDK for Google Generative AI
- IBM/watsonx-go: watsonx API Client for Go
- jetify-com/ai: The AI framework for Go developers. Build powerful AI applications and agents using our free,...
- tech1024/goai: A friendly API and abstractions for developing AI applications.