callstackincubator/ai

On-device LLM execution in React Native with Vercel AI SDK compatibility

Quality score: 45 / 100 (Emerging)

Supports multiple on-device execution runtimes—Apple's native Foundation Models (iOS 17+, no downloads), GGUF models via llama.rn, and MLC LLM for optimized inference—while maintaining full compatibility with Vercel AI SDK's `generateText`, `embed`, and experimental APIs. Built-in OpenTelemetry integration pipes performance metrics to Rozenite DevTools for runtime profiling across all providers.
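To illustrate what "Vercel AI SDK compatibility" means in practice, here is a minimal self-contained sketch of the provider pattern: any on-device runtime that implements a common model interface can be swapped into the same `generateText`-style call. The interface names and the mock runtime below are simplified stand-ins, not this package's actual API.

```typescript
// Simplified model interface, loosely modeled on the AI SDK's
// language-model abstraction (names are illustrative assumptions).
interface LanguageModel {
  modelId: string;
  doGenerate(prompt: string): Promise<{ text: string }>;
}

// Stand-in for an on-device runtime (e.g. Foundation Models,
// llama.rn, or MLC LLM behind a shared provider interface).
const onDeviceModel: LanguageModel = {
  modelId: "mock-on-device",
  async doGenerate(prompt) {
    // A real provider would run local inference here.
    return { text: `echo: ${prompt}` };
  },
};

// Minimal generateText-style helper: accepts any conforming model,
// so swapping runtimes requires no change to calling code.
async function generateText(opts: { model: LanguageModel; prompt: string }) {
  const { text } = await opts.model.doGenerate(opts.prompt);
  return { text, modelId: opts.model.modelId };
}
```

Because every runtime sits behind the same interface, application code stays identical whether inference runs on Apple's native models or a downloaded GGUF file.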

1,219 stars. Actively maintained with 4 commits in the last 30 days.

No package · No dependents
Maintenance: 13 / 25
Adoption: 10 / 25
Maturity: 9 / 25
Community: 13 / 25


Stars: 1,219
Forks: 41
Language: TypeScript
License: MIT
Last pushed: Mar 02, 2026
Commits (30d): 4

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/callstackincubator/ai"

Open to everyone — 100 requests/day with no key needed; a free key raises the limit to 1,000 requests/day.