Quasi Hyperbolic Rectified DEMON Adam/Amsgrad with AdaMod, Gradient Centralization, Lookahead, iterative averaging and decorrelated Weight Decay
Updated Sep 23, 2020 · Python
Neural networks and optimizers from scratch in NumPy, featuring newer optimizers such as DemonAdam and QHAdam.
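For context, the QHAdam update (Ma & Yarats, 2019) referenced above replaces Adam's bias-corrected moment estimates with quasi-hyperbolic blends of the raw gradient (weighted by `nu1`, `nu2`) and those estimates. The sketch below is an illustrative NumPy implementation of that rule, not code from either repository; the function name `qhadam_step` and the toy quadratic are assumptions for demonstration.

```python
import numpy as np

def qhadam_step(theta, grad, state, lr=1e-3, beta1=0.9, beta2=0.999,
                nu1=0.7, nu2=1.0, eps=1e-8):
    """One QHAdam step: quasi-hyperbolic blend of the raw gradient
    with Adam's bias-corrected first and second moment estimates.
    (Illustrative sketch, not the repository's implementation.)"""
    state["t"] += 1
    t = state["t"]
    # Exponential moving averages of the gradient and its square (as in Adam).
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad ** 2
    # Bias-corrected moment estimates.
    m_hat = state["m"] / (1 - beta1 ** t)
    v_hat = state["v"] / (1 - beta2 ** t)
    # Quasi-hyperbolic interpolation: nu1/nu2 weight the moment estimates
    # against the instantaneous gradient; nu1 = nu2 = 1 recovers Adam.
    num = (1 - nu1) * grad + nu1 * m_hat
    den = np.sqrt((1 - nu2) * grad ** 2 + nu2 * v_hat) + eps
    return theta - lr * num / den

# Usage: minimize f(x) = x^2 from x = 5 for a few steps.
theta = np.array([5.0])
state = {"m": np.zeros_like(theta), "v": np.zeros_like(theta), "t": 0}
for _ in range(100):
    grad = 2 * theta
    theta = qhadam_step(theta, grad, state, lr=0.1)
```

With `nu1 = nu2 = 1` the update reduces to plain Adam, which is why QHAdam is usually described as a one-parameter-family generalization of it.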