Paperspace/DinoRunTutorial
Accompanying code for the Paperspace tutorial "Build an AI to play Dino Run"
Implements a deep convolutional neural network trained via model-free reinforcement learning to autonomously play Chrome's Dino Run by processing visual game frames. Uses Selenium for browser automation and ChromeDriver to interact with the game environment, while Keras/TensorFlow handle the neural network training. The approach learns action patterns directly from pixel input without explicit game state modeling.
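A pixel-input pipeline like the one described typically shrinks each captured game frame to a small grayscale array before it is fed to the CNN. A minimal sketch of that preprocessing step, assuming NumPy and an illustrative 80x80 network input size (the function name and sizes are not taken from the repository):

```python
import numpy as np

def preprocess(frame: np.ndarray, out_h: int = 80, out_w: int = 80) -> np.ndarray:
    """Convert an RGB game frame to a small grayscale array scaled to [0, 1]."""
    # Luminance-weighted grayscale conversion (standard Rec. 601 weights).
    gray = frame[..., :3] @ np.array([0.299, 0.587, 0.114])
    # Naive nearest-neighbour downsampling to the network's input size.
    ys = np.linspace(0, gray.shape[0] - 1, out_h).astype(int)
    xs = np.linspace(0, gray.shape[1] - 1, out_w).astype(int)
    small = gray[np.ix_(ys, xs)]
    # Scale pixel values to [0, 1] for stable training.
    return small / 255.0

# Example: a dummy 150x600 RGB frame, roughly the shape of the game canvas.
frame = np.random.randint(0, 256, size=(150, 600, 3), dtype=np.uint8)
state = preprocess(frame)
print(state.shape)  # (80, 80)
```

In the tutorial's setting, frames like this would be captured via Selenium/ChromeDriver and stacked over time so the network can perceive motion.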
326 stars. No commits in the last 6 months.
Stars: 326
Forks: 101
Language: Jupyter Notebook
License: MIT
Category:
Last pushed: Jun 15, 2020
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Paperspace/DinoRunTutorial"
Open to everyone: 100 requests/day with no key. A free key raises the limit to 1,000/day.
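The same endpoint can be called from Python using only the standard library. The URL pattern below is taken from the curl example; the helper names and the assumption that the response is JSON are illustrative:

```python
import json
import urllib.request

# Base path copied from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks"

def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality-stats URL."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the payload (assumed to be JSON; schema not documented here)."""
    with urllib.request.urlopen(quality_url(owner, repo), timeout=10) as resp:
        return json.load(resp)

print(quality_url("Paperspace", "DinoRunTutorial"))
```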
Related frameworks
utay/dino-ml
🦎 Simple AI to teach Google Chrome's offline dino to jump obstacles
cxwithyxy/Neuroevolution_T-rex
Trains a neural network to play Chrome's offline little-dinosaur game on its own (translated from Chinese).
simply-TOOBASED/dino-bot-3000
A simple bot to play Google's dinosaur game using neural networks and genetic algorithms to get smarter.
kilian-kier/Dino-Game-AI
A simple pygame dino game which can also be trained and played by a NEAT AI
Dewep/T-Rex-s-neural-network
T-Rex's neural network (AI for the game T-Rex / Dinosaur on Google Chrome)