david-wb/gaze-estimation
A deep learning based gaze estimation framework implemented with PyTorch
Trains on synthetically generated eye images from UnityEyes to predict both eye region landmarks and gaze direction (pitch/yaw angles), achieving ~14% mean angular error on MPIIGaze. Uses a modified stacked hourglass architecture with a dedicated pre-hourglass branch for gaze prediction whose features are fused with landmark predictions via concatenation before final regression layers. Includes real-time webcam inference and end-to-end training pipeline with normalized eye region preprocessing.
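The fusion step described above (concatenating pre-hourglass gaze features with landmark predictions before the final regression layers) can be sketched roughly as follows. This is a minimal illustration, not the repository's actual code: the feature dimension, landmark count, and layer sizes are assumptions chosen for the example.

```python
import torch
import torch.nn as nn

class GazeFusionHead(nn.Module):
    """Sketch of a fusion head: concatenate gaze-branch features with
    flattened eye-landmark predictions, then regress pitch/yaw angles.
    Dimensions here (128 features, 34 landmarks) are illustrative."""

    def __init__(self, gaze_feat_dim: int = 128, n_landmarks: int = 34):
        super().__init__()
        self.regress = nn.Sequential(
            nn.Linear(gaze_feat_dim + n_landmarks * 2, 64),
            nn.ReLU(),
            nn.Linear(64, 2),  # output: (pitch, yaw)
        )

    def forward(self, gaze_feats: torch.Tensor, landmarks: torch.Tensor) -> torch.Tensor:
        # gaze_feats: (B, gaze_feat_dim), landmarks: (B, n_landmarks, 2)
        fused = torch.cat([gaze_feats, landmarks.flatten(1)], dim=1)
        return self.regress(fused)

head = GazeFusionHead()
angles = head(torch.randn(4, 128), torch.randn(4, 34, 2))
print(tuple(angles.shape))  # (4, 2): one (pitch, yaw) pair per image
```

The concatenation lets the regression layers see both the raw gaze-branch features and the explicit landmark geometry, which is the general idea behind fusing the two prediction paths.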
194 stars. No commits in the last 6 months.
Stars: 194
Forks: 37
Language: Jupyter Notebook
License: —
Category: —
Last pushed: Feb 26, 2020
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/david-wb/gaze-estimation"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
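For programmatic access, the same endpoint can be called from Python. The URL shape below is taken from the curl example on this page; the response schema shown is an assumption for illustration only, so inspect a real response to see the actual fields.

```python
import json
from urllib.parse import quote

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-API URL for a repo, following the
    /{category}/{owner}/{repo} pattern from the curl example."""
    return f"{API_BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"

url = quality_url("ml-frameworks", "david-wb", "gaze-estimation")
print(url)

# Hypothetical response payload: field names are assumed, not documented.
sample = json.loads('{"stars": 194, "forks": 37, "language": "Jupyter Notebook"}')
print(sample["stars"])  # 194
```

From here, a single `urllib.request.urlopen(url)` (or `requests.get(url)`) call would fetch the live data, subject to the 100 requests/day limit.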
Higher-rated alternatives
sbobek/tobii-pytracker
Tobii Eyetracker usage and analysis with Python SDK (no Tobii Labs needed)
cpury/lookie-lookie
Learning to track eye movement in the browser
glefundes/mobile-face-gaze
Lightweight gaze estimation with PyTorch.
emilyxxie/mona_lisa_eyes
A machine learning project. Turn on your webcam. Mona Lisa's eyes will follow you around.
manishanis/eye-training
Train your eyes. Read faster.