AdamW optimizer implemented incorrectly - weight decay does not incorporate learning rate #706

Triggered via issue comment, October 25, 2024 14:23
@ToucheSir commented on #182 (c2ae321)
Status: Skipped
Total duration: 3s

Workflow: TagBot.yml
on: issue_comment
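
For context, the issue title refers to AdamW's decoupled weight decay, in which the decay term is scaled by the learning rate (Loshchilov & Hutter, 2019). Below is a minimal Python sketch of one such step; the function and parameter names are illustrative only, not the repository's actual API.

import numpy as np

def adamw_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
               eps=1e-8, wd=1e-2):
    # Standard Adam moment estimates with bias correction.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    # Decoupled weight decay: both the Adam update and the decay term
    # are multiplied by the learning rate.
    theta = theta - lr * (m_hat / (np.sqrt(v_hat) + eps) + wd * theta)
    return theta, m, v

# The behaviour the issue title describes would instead subtract
# wd * theta without the lr factor:
#   theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps) - wd * theta

# Example usage:
theta, m, v = np.ones(3), np.zeros(3), np.zeros(3)
theta, m, v = adamw_step(theta, 0.1 * np.ones(3), m, v, t=1)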