digantamisra98/Mish
Official Repository for "Mish: A Self Regularized Non-Monotonic Neural Activation Function" [BMVC 2020]
Defined by f(x) = x·tanh(softplus(x)), Mish introduces self-regularization through a smooth, non-monotonic curve that maintains gradient flow across negative and positive domains. Integrated across major frameworks—PyTorch, TensorFlow, MXNet, and OpenVINO—with optimized CUDA variants and memory-efficient implementations enabling deployment from edge to large-scale models. Demonstrates state-of-the-art performance in computer vision tasks including object detection on MS-COCO, with variants like H-Mish enabling architectural flexibility.
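As a minimal sketch of the definition above (not the repository's optimized CUDA or memory-efficient implementations), Mish can be written in a few lines of NumPy; the softplus term is computed in a numerically stable form so large positive inputs don't overflow `exp`:

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x)) = max(x, 0) + log1p(exp(-|x|))
    return np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x)))

def mish(x):
    # Mish: f(x) = x * tanh(softplus(x))
    return x * np.tanh(softplus(x))

# The curve is smooth and non-monotonic: it passes small negative values
# for negative inputs (dipping near x ≈ -1.19) before decaying toward 0,
# which is what preserves gradient flow on the negative side.
```

In practice, PyTorch users can reach for the built-in `torch.nn.Mish` instead; this sketch is only meant to make the formula concrete.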
1,303 stars. Actively maintained with 9 commits in the last 30 days.
Stars: 1,303
Forks: 128
Language: Jupyter Notebook
License: MIT
Last pushed: Mar 09, 2026
Commits (30d): 9
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/digantamisra98/Mish"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related frameworks
itdxer/neupy
NeuPy is a TensorFlow-based Python library for prototyping and building neural networks.
Sentdex/nnfs_book
Sample code from the Neural Networks from Scratch book.
vzhou842/cnn-from-scratch
A Convolutional Neural Network implemented from scratch (using only numpy) in Python.
nicklashansen/rnn_lstm_from_scratch
How to build RNNs and LSTMs from scratch with NumPy.
Synthaze/EpyNN
Educational Python library for neural networks.