OpenMind/OM1

Modular AI runtime for robots

Score: 76 / 100 (Verified)

Supports multimodal sensor inputs (camera, LIDAR, web data) and outputs (motion commands, speech, navigation) through a plugin-based architecture that connects to ROS2, Zenoh, and CycloneDDS middleware. Agents are configured declaratively via JSON5 files with system prompts, enabling rapid customization across different robot form factors without code changes. Includes web-based debugging, pre-integrated LLM/VLM endpoints, and simulator support (Gazebo, Isaac Sim) for prototyping before hardware deployment.
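
To illustrate the declarative JSON5 configuration style mentioned above, here is a minimal sketch that parses a hypothetical agent config with the third-party json5 package; the field names (system_prompt, inputs, outputs) are illustrative assumptions, not OM1's documented schema.

import json5  # third-party JSON5 parser (pip install json5)

# Hypothetical agent config in the declarative style described above.
# Every field name here is an assumption for illustration, not OM1's actual schema.
AGENT_CONFIG = """
{
  // system prompt that steers the agent's behavior
  system_prompt: "You are a helpful quadruped robot assistant.",
  inputs: ["camera", "lidar"],    // sensor plugins to enable
  outputs: ["move", "speak"],     // action plugins to enable
}
"""

config = json5.loads(AGENT_CONFIG)
print(config["system_prompt"])
print("inputs:", ", ".join(config["inputs"]))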

2,681 stars. Actively maintained with 54 commits in the last 30 days.

No package published; no dependents.
Maintenance: 25 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 25 / 25
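
The four component scores sum to the overall rating: 25 + 10 + 16 + 25 = 76 / 100.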

Stars: 2,681
Forks: 965
Language: Python
License: MIT
Last pushed: Mar 13, 2026
Commits (30d): 54

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/OpenMind/OM1"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
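
The same endpoint can be called from Python; the sketch below uses the requests library and simply pretty-prints the JSON response, since this card does not document the response schema.

import json
import requests  # pip install requests

# Same endpoint as the curl example above; anonymous access allows 100 requests/day.
URL = "https://pt-edge.onrender.com/api/v1/quality/llm-tools/OpenMind/OM1"

response = requests.get(URL, timeout=10)
response.raise_for_status()

# Pretty-print whatever comes back; the response schema is not documented here.
print(json.dumps(response.json(), indent=2))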