A New Optimization Technique for Deep Neural Networks
Updated Jan 13, 2022 - Python
Optimizer, LR scheduler, and loss function collections in PyTorch
Multi-label classification based on timm.
Instantly improve the training performance of your TensorFlow models with just 2 lines of code!
Multi-label classification based on timm, with SimCLR added to timm.
Quasi-Hyperbolic Rectified DEMON Adam/AMSGrad with AdaMod, Gradient Centralization, Lookahead, iterative averaging, and decorrelated Weight Decay
Gradient Centralization for MXNet optimizers
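Several of the repositories above apply Gradient Centralization, which centers each weight gradient by subtracting its mean before the optimizer step. A minimal PyTorch sketch (the function name `centralize_gradient` is illustrative, not from any repo above):

```python
import torch

def centralize_gradient(grad: torch.Tensor) -> torch.Tensor:
    """Subtract the gradient's mean over all dims except dim 0.

    Typically applied only to tensors with rank > 1 (conv kernels,
    linear weights); bias vectors are returned unchanged.
    """
    if grad.dim() > 1:
        grad = grad - grad.mean(dim=tuple(range(1, grad.dim())), keepdim=True)
    return grad

# Example: centralize a linear layer's gradient before optimizer.step().
w = torch.randn(4, 3, requires_grad=True)
loss = (w ** 2).sum()
loss.backward()
w.grad = centralize_gradient(w.grad)
```

In practice this is wired into an optimizer's `step` so every parameter gradient is centralized automatically, as the MXNet and PyTorch collections listed here do.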