QuickDraw and QuickDraw-AirGesture-tensorflow
QuickDraw-AirGesture-tensorflow extends QuickDraw with gesture recognition capabilities, making the two projects complementary rather than competitive: it uses QuickDraw as a base and augments it with TensorFlow-based air gesture detection.
About QuickDraw
vietnh1009/QuickDraw
Implementation of Quickdraw - an online game developed by Google
Leverages a convolutional neural network trained on Google's Quick Draw dataset to classify hand-drawn sketches across 20 object categories in real time. Provides two interaction modes: a camera app that tracks a blue, red, or green pen by color via OpenCV for direct webcam drawing, and a canvas-based drawing interface. Built with PyTorch for model inference and trained on 10,000 images per class with an 80/20 train-test split.
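The color-based pen tracking described above boils down to two steps: threshold the frame to isolate pixels matching the pen color, then take the centroid of the surviving pixels as the pen-tip position. A minimal sketch of that logic, using plain NumPy in place of OpenCV's `cv2.inRange` and `cv2.moments` (the function names and thresholds here are illustrative, not the repo's actual code):

```python
import numpy as np

def color_mask(frame, lower, upper):
    """Binary mask of pixels whose channels all fall inside [lower, upper].

    Mirrors what cv2.inRange does; `frame` is an H x W x 3 uint8 array.
    """
    lower, upper = np.asarray(lower), np.asarray(upper)
    return np.all((frame >= lower) & (frame <= upper), axis=-1)

def pen_centroid(mask):
    """Centroid (row, col) of the mask, or None if nothing matched.

    Stands in for computing moments of the largest detected contour.
    """
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return (float(ys.mean()), float(xs.mean()))

# Synthetic 100x100 frame with a "blue pen tip" blob centered at (30, 70)
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[28:33, 68:73] = (255, 0, 0)  # BGR-style blue patch

mask = color_mask(frame, lower=(200, 0, 0), upper=(255, 80, 80))
print(pen_centroid(mask))  # → (30.0, 70.0)
```

Appending one such centroid per frame yields the stroke that gets drawn on screen and, once the stroke ends, fed to the classifier.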
About QuickDraw-AirGesture-tensorflow
vietnh1009/QuickDraw-AirGesture-tensorflow
Implementation of QuickDraw - an online game developed by Google, combined with AirGesture - a simple gesture recognition application
Combines hand-pose detection via webcam with a TensorFlow CNN classifier trained on 18 QuickDraw categories, enabling real-time air-drawing recognition without canvas interaction. The system uses OpenCV for hand tracking and centroid detection, feeding gesture sequences to a pre-trained model that classifies drawings mid-sketch. Includes Docker-based training and inference pipelines and automatic session video recording.
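Before a tracked gesture sequence can reach a QuickDraw-style CNN, the path of hand centroids has to be rendered into a small grayscale image like the dataset samples the model was trained on. A hedged sketch of that preprocessing step, assuming normalized (x, y) centroids in [0, 1) and a 28x28 input canvas (the function name and canvas size are assumptions, not the repo's actual API):

```python
import numpy as np

def rasterize_stroke(points, size=28):
    """Paint a sequence of (x, y) hand centroids onto a square canvas.

    Hypothetical preprocessing: the tracked air-drawing path becomes the
    grayscale image a QuickDraw-style CNN classifies. Points lie in [0, 1).
    """
    canvas = np.zeros((size, size), dtype=np.float32)
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        # Linearly interpolate between consecutive centroids so fast hand
        # motion still leaves a connected line on the canvas.
        for t in np.linspace(0.0, 1.0, num=size):
            x = x0 + t * (x1 - x0)
            y = y0 + t * (y1 - y0)
            canvas[int(y * size), int(x * size)] = 1.0
    return canvas

# A horizontal swipe across the middle of the frame
stroke = [(0.1, 0.5), (0.9, 0.5)]
img = rasterize_stroke(stroke)
print(img.shape)  # → (28, 28)
```

Re-rasterizing the growing point list on every frame is what makes mid-sketch classification possible: the partial drawing is always available as a complete model input.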