Davidwarchy/the-self-grounded-agent
A tiny two-wheel robot with a 100-ray lidar learns objecthood from pure sensorimotor experience. With no labels or tasks, it samples simple actions and tracks how they distort the lidar stream, revealing stable “things” in the world—structures it can use to predict, navigate, and act coherently.
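The core idea in the description, correlating self-chosen actions with changes in the lidar stream, can be sketched in a toy form. Everything below is an illustrative assumption (a 1-D world, a fake 100-ray lidar, simple forward/backward commands), not code from the repository:

```python
# Toy sketch: the agent applies random motion commands and measures how
# each lidar ray's reading co-varies with its own action. Rays whose
# deltas are predictable from the action reveal stable structure.
import random

N_RAYS = 100
WALL = 10.0                  # distance to a static wall at pose 0
OBJECT_RAYS = range(40, 60)  # rays that hit a nearby object
OBJECT_DIST = 3.0

def lidar(pose: float) -> list[float]:
    """Fake lidar: all distances shrink as the robot moves forward."""
    return [(OBJECT_DIST if i in OBJECT_RAYS else WALL) - pose
            for i in range(N_RAYS)]

pose = 0.0
deltas = [0.0] * N_RAYS
for _ in range(50):
    action = random.choice([-0.1, 0.1])   # sample a simple action
    before = lidar(pose)
    pose += action
    after = lidar(pose)
    # Accumulate how each ray's reading tracks the chosen action.
    for i in range(N_RAYS):
        deltas[i] += (after[i] - before[i]) / action

# In this degenerate world every ray co-varies perfectly with
# self-motion (average delta per unit action is -1 for each ray).
print(round(deltas[50] / 50, 3))  # → -1.0
```

In a richer simulated world, rays striking rigid objects would shift coherently under motion while noise or free space would not, which is the sense in which "things" become separable from raw experience.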
Stars: —
Forks: —
Language: Jupyter Notebook
License: MIT
Category:
Last pushed: Feb 27, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Davidwarchy/the-self-grounded-agent"
Open to everyone at 100 requests/day with no key; a free key raises the limit to 1,000/day.
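The same request can be made programmatically. The base URL and path below are taken from the curl example; the response is presumably JSON, but its field names are not documented here, so none are assumed:

```python
# Hedged sketch of calling the quality API shown above with the
# standard library only.
import json
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, repo: str) -> str:
    """Build the endpoint URL for a category and an owner/name repo slug."""
    return f"{BASE}/{category}/{repo}"

def fetch_quality(category: str, repo: str) -> dict:
    """Fetch and decode one quality record (100 requests/day without a key)."""
    with urlopen(quality_url(category, repo)) as resp:
        return json.load(resp)

url = quality_url("ml-frameworks", "Davidwarchy/the-self-grounded-agent")
print(url)
```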
Higher-rated alternatives
microsoft/AirSim
Open source simulator for autonomous vehicles built on Unreal Engine / Unity, from Microsoft AI...
learn-to-race/l2r
Open-source reinforcement learning environment for autonomous racing — featured as a conference...
lgsvl/simulator
A ROS/ROS2 Multi-robot Simulator for Autonomous Vehicles
microsoft/AirSim-NeurIPS2019-Drone-Racing
Drone Racing @ NeurIPS 2019, built on Microsoft AirSim
DeepTecher/AutonomousVehiclePaper
A quick digest of papers related to autonomous driving