LocalLLMClient and LLMFarm
About LocalLLMClient
tattn/LocalLLMClient
Swift package to run local LLMs on iOS, macOS, Linux
This Swift package helps developers integrate large language models (LLMs) directly into their iOS, macOS, or Linux applications. Developers can use it to add AI capabilities like text generation, question answering, and even image analysis. It takes local LLM files and user prompts as input, and outputs text responses or tool calls, enabling apps to perform intelligent tasks offline.
About LLMFarm
guinmoon/LLMFarm
LLaMA and other large language models on iOS and macOS, offline, using the GGML library.
This app lets you run large language models (LLMs) such as LLaMA and GPT-2 directly on your iPhone or Mac, without an internet connection. You load a language model of your choice and receive generated text responses, enabling private, fast AI interactions. It's designed for anyone who wants to experiment with or use AI language models offline on Apple devices.