MiroFish-Offline and MiroFish-local
These are competing implementations of the same offline multi-agent simulation framework: the first is an English fork using a Neo4j + Ollama stack, and the second is a Chinese version using Graphiti + Neo4j as alternatives to paid services. Users would choose one based on language preference and specific backend requirements rather than use both together.
About MiroFish-Offline
nikmcfly/MiroFish-Offline
Offline multi-agent simulation & prediction engine. English fork of MiroFish with Neo4j + Ollama local stack.
Generates hundreds of AI agents with distinct personalities that simulate social media reactions to documents, tracking sentiment evolution and opinion shifts in real time via a Neo4j knowledge graph. Uses a modular architecture with pluggable storage backends and hybrid vector+BM25 search, running entirely locally through Ollama for LLM inference and embeddings, eliminating cloud API dependencies.
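The hybrid vector+BM25 search mentioned above can be sketched in a few lines. This is a minimal illustration, not MiroFish-Offline's actual retrieval code: it combines a normalized BM25 lexical score with cosine similarity over embeddings (which in the real project would come from Ollama), blended by a hypothetical `alpha` weight.

```python
import math
from collections import Counter

def bm25_scores(query_terms, docs, k1=1.5, b=0.75):
    """Score each tokenized document against the query with classic BM25."""
    n = len(docs)
    avgdl = sum(len(d) for d in docs) / n
    df = Counter()
    for d in docs:
        for t in set(d):
            df[t] += 1
    scores = []
    for d in docs:
        tf = Counter(d)
        s = 0.0
        for t in query_terms:
            if t not in tf:
                continue
            idf = math.log(1 + (n - df[t] + 0.5) / (df[t] + 0.5))
            s += idf * tf[t] * (k1 + 1) / (
                tf[t] + k1 * (1 - b + b * len(d) / avgdl)
            )
        scores.append(s)
    return scores

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def hybrid_search(query_terms, query_vec, docs, doc_vecs, alpha=0.5):
    """Blend normalized BM25 with vector similarity; higher alpha favors vectors.

    Returns document indices ranked best-first.
    """
    bm25 = bm25_scores(query_terms, docs)
    top = max(bm25) or 1.0  # avoid division by zero when no term matches
    combined = [
        alpha * cosine(query_vec, dv) + (1 - alpha) * (s / top)
        for s, dv in zip(bm25, doc_vecs)
    ]
    return sorted(range(len(docs)), key=lambda i: combined[i], reverse=True)
```

In practice the embeddings and tokenization would come from the local Ollama models, and the ranked indices would be used to pull agent memories out of the Neo4j graph.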
About MiroFish-local
tt-a1i/MiroFish-local
Free locally-run version of MiroFish | Graphiti+Neo4j replaces the paid Zep | Simple and convenient to get the project running
Implements multi-agent swarm intelligence simulation using Graphiti + Neo4j for local knowledge graph construction and memory management, replacing cloud dependencies with on-premises GraphRAG pipelines. The OASIS engine drives parallel agent interactions to simulate collective behavior across scenarios like public opinion prediction, market sentiment analysis, and policy impact assessment. Supports seamless backend switching between Zep Cloud and local modes via environment variables, with Docker Compose orchestration for Neo4j deployment and OpenAI-compatible SDK integration for any LLM provider.
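The environment-variable backend switch described above might look like the following sketch. The variable names (`MEMORY_BACKEND`, `ZEP_API_KEY`, `NEO4J_URI`, etc.) are assumptions for illustration; MiroFish-local's actual configuration keys may differ.

```python
import os

def make_graph_backend():
    """Pick Zep Cloud or local Graphiti+Neo4j from an env flag.

    Hypothetical config shape -- the real project wires these values
    into Graphiti / Zep client constructors instead of a dict.
    """
    mode = os.environ.get("MEMORY_BACKEND", "local").lower()
    if mode == "zep":
        # Cloud mode: requires a Zep API key.
        return {"backend": "zep", "api_key": os.environ["ZEP_API_KEY"]}
    # Local mode: Graphiti over the Neo4j instance started by Docker Compose.
    return {
        "backend": "graphiti",
        "neo4j_uri": os.environ.get("NEO4J_URI", "bolt://localhost:7687"),
        "neo4j_user": os.environ.get("NEO4J_USER", "neo4j"),
        "neo4j_password": os.environ.get("NEO4J_PASSWORD", ""),
    }
```

Because the LLM side is reached through an OpenAI-compatible SDK, pointing the client's base URL at a local or third-party endpoint is all that is needed to swap providers.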