talking-head-anime-demo and talking-head-anime-4-demo
These are sequential versions of the same project: the second is an improved iteration of the first, so they are successive releases rather than alternatives. You would upgrade from one to the next rather than choose between them.
About talking-head-anime-demo
pkhungurn/talking-head-anime-demo
Demo for the "Talking Head Anime from a Single Image."
Provides two applications for anime character animation: a manual pose editor with slider controls, and a real-time puppeteer that mirrors head movements from webcam input using dlib face tracking. Built on PyTorch with custom neural network modules (face_morpher, face_rotator, combiner) that process 256×256 RGBA character images and output animated frames. Requires an NVIDIA GPU, with Google Colab supported as a deployment option for users without local hardware.
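As a rough sketch of the input format described above (256×256 RGBA character images fed to PyTorch-style modules), preprocessing might look like the following. The function name and the [-1, 1] value range are illustrative assumptions, not the repository's actual API:

```python
import numpy as np

def preprocess_character_image(rgba: np.ndarray) -> np.ndarray:
    """Convert a 256x256 RGBA uint8 image into a CHW float array in [-1, 1].

    Hypothetical helper illustrating the kind of preprocessing such
    networks typically expect; the name and scaling are assumptions,
    not the project's documented interface.
    """
    assert rgba.shape == (256, 256, 4), "networks expect 256x256 RGBA input"
    x = rgba.astype(np.float32) / 255.0      # bytes -> [0, 1]
    x = x * 2.0 - 1.0                        # rescale to [-1, 1]
    return np.transpose(x, (2, 0, 1))        # HWC -> CHW for conv layers

# Example: a fully transparent placeholder image
img = np.zeros((256, 256, 4), dtype=np.uint8)
tensor = preprocess_character_image(img)
print(tensor.shape)  # (4, 256, 256)
```

The resulting CHW array maps directly onto a PyTorch tensor (e.g. via `torch.from_numpy`) before being passed through the morpher/rotator/combiner pipeline.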
About talking-head-anime-4-demo
pkhungurn/talking-head-anime-4-demo
Demo Programs for the "Talking Head(?) Anime from a Single Image 4: Improved Models and Its Distillation" Project