pkhungurn/talking-head-anime-demo
Demo for the "Talking Head Anime from a Single Image" project.
Provides two applications for anime character animation: a manual pose editor with slider controls, and a real-time puppeteer that mirrors head movements from webcam input using dlib face tracking. Built on PyTorch with custom neural-network modules (face_morpher, face_rotator, combiner) that take a 256×256 RGBA character image and output animated frames. Requires an NVIDIA GPU; a Google Colab deployment is available for users without local hardware.
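Since the networks described above expect a 256×256 RGBA character image, input preparation can be sketched as follows. This is a hypothetical helper, not the repository's actual API; the [-1, 1] scaling is a common convention and an assumption here.

```python
# Hypothetical input-preparation sketch: force an image to 256x256 RGBA and
# return a channels-first float array. Function name and normalization range
# are illustrative assumptions, not the repository's actual code.
import numpy as np
from PIL import Image

def prepare_input(path_or_image, size=256):
    """Load an image, convert to 256x256 RGBA, and scale pixels to [-1, 1]."""
    img = path_or_image if isinstance(path_or_image, Image.Image) else Image.open(path_or_image)
    img = img.convert("RGBA").resize((size, size), Image.LANCZOS)
    arr = np.asarray(img, dtype=np.float32) / 255.0  # HWC layout, values in [0, 1]
    arr = arr * 2.0 - 1.0                            # rescale to [-1, 1]
    return np.transpose(arr, (2, 0, 1))              # CHW: (4, 256, 256)
```

The resulting `(4, 256, 256)` array can then be wrapped in a tensor and batched before being fed to the morpher/rotator/combiner pipeline.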
2,021 stars. No commits in the last 6 months.
Stars: 2,021
Forks: 287
Language: Python
License: MIT
Last pushed: Jun 29, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/pkhungurn/talking-head-anime-demo"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
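The endpoint above follows a `category/owner/repo` pattern. A minimal Python sketch using only the standard library, assuming just the URL shape shown (key handling and error handling are omitted, since they are not documented here):

```python
# Minimal sketch of calling the quality endpoint shown above. Only the URL
# pattern comes from the curl example; everything else is an assumption.
import json
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category, owner, repo):
    """Build the endpoint URL for one repository."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category, owner, repo, timeout=10):
    """Fetch and decode the JSON response for one repository."""
    with urlopen(quality_url(category, owner, repo), timeout=timeout) as resp:
        return json.load(resp)
```

For example, `quality_url("ml-frameworks", "pkhungurn", "talking-head-anime-demo")` reproduces the URL in the curl command above.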
Higher-rated alternatives
balewgize/background-remover
Remove background from images using pre-trained ML models.
pkhungurn/talking-head-anime-2-demo
Demo programs for the Talking Head Anime from a Single Image 2: More Expressive project.
pkhungurn/talking-head-anime-3-demo
Demo Programs for the "Talking Head(?) Anime from a Single Image 3: Now the Body Too" Project
pkhungurn/talking-head-anime-4-demo
Demo Programs for the "Talking Head(?) Anime from a Single Image 4: Improved Models and Its...
Netesh5/image_background_remover
A Flutter package that removes the background from images using an ONNX model. The package...