react-native-ai and ai
These libraries are complementary rather than competing: react-native-ai provides a full-stack framework for AI app development, while ai (callstackincubator) focuses specifically on on-device LLM inference, letting developers embed local model execution into a broader application architecture.
About react-native-ai
dabit3/react-native-ai
Full stack framework for building cross-platform mobile AI apps
Supports streaming responses from multiple LLM providers (OpenAI, Anthropic, Gemini) and image generation models with a Node.js backend proxy for authentication. Built on React Native with a modular architecture—add new models by extending constants, creating chat/image handlers on the server, and updating router logic. Includes five pre-configured themes and real-time chat/image UIs out of the box.
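The "extend constants, add a handler, update the router" workflow described above can be sketched as a handler registry. This is an illustrative pattern only, assuming a keyed map of per-provider chat handlers; the constant names and handler signatures here are not react-native-ai's actual API.

```typescript
// Sketch of the modular provider pattern: adding a model means adding a
// constant and a handler, then the router dispatches on the model key.
// All names below are illustrative assumptions, not react-native-ai's API.

type ChatHandler = (prompt: string) => Promise<string>;

// Step 1: extend the constants with the new model's identifier.
const MODELS = {
  openai: 'gpt-4o',
  anthropic: 'claude-sonnet',
} as const;

type ModelKey = keyof typeof MODELS;

// Step 2: register a chat handler per provider. Real handlers would call
// the provider's API from the Node.js proxy and stream the response; these
// stubs just echo so the sketch runs offline.
const handlers: Record<ModelKey, ChatHandler> = {
  openai: async (prompt) => `openai:${MODELS.openai}:${prompt}`,
  anthropic: async (prompt) => `anthropic:${MODELS.anthropic}:${prompt}`,
};

// Step 3: the router picks the handler for the selected model.
async function routeChat(model: ModelKey, prompt: string): Promise<string> {
  const handler = handlers[model];
  if (!handler) throw new Error(`No handler registered for ${model}`);
  return handler(prompt);
}
```

Supporting a new provider then touches only these three places, which is what keeps the architecture modular.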
About ai
callstackincubator/ai
On-device LLM execution in React Native with Vercel AI SDK compatibility
Supports multiple on-device execution runtimes—Apple's native Foundation Models framework (iOS 26+, no model downloads), GGUF models via llama.rn, and MLC LLM for optimized inference—while maintaining full compatibility with the Vercel AI SDK's `generateText`, `embed`, and experimental APIs. Built-in OpenTelemetry integration pipes performance metrics to Rozenite DevTools for runtime profiling across all providers.
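The practical payoff of Vercel AI SDK compatibility is that an on-device model sits behind the same `generateText`-style call as a cloud model. The sketch below is a self-contained mock of that pattern: the interface and `generateText` function are simplified stand-ins for the SDK's real types, and the stub runtime is not llama.rn or MLC.

```typescript
// Self-contained sketch of how an on-device runtime slots behind a
// generateText-style API. These are simplified stand-ins, not the Vercel
// AI SDK's actual types, and the "inference" below is a stub.

interface LanguageModel {
  modelId: string;
  doGenerate(prompt: string): Promise<{ text: string }>;
}

// Provider factory in the spirit of the package's on-device providers:
// a real implementation would hand the prompt to a local inference runtime
// instead of echoing it.
function localModel(modelId: string): LanguageModel {
  return {
    modelId,
    // Stub inference so the example runs offline with no model weights.
    doGenerate: async (prompt) => ({ text: `[${modelId}] ${prompt}` }),
  };
}

// Simplified generateText: the call site is identical whether the model
// runs on-device or in the cloud -- only the provider object changes.
async function generateText(opts: { model: LanguageModel; prompt: string }) {
  const { text } = await opts.model.doGenerate(opts.prompt);
  return { text };
}
```

Swapping `localModel(...)` for a cloud provider leaves application code unchanged, which is the point of keeping SDK compatibility.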