I was wondering if the authors experimented with LayerNorm instead of BatchNorm, and whether they noticed any differences. From what I've read, BatchNorm is heavily dependent on batch size. My maximum batch size with this architecture is 32 on 2 A100 GPUs, which is why I had the idea of replacing BatchNorm with LayerNorm.
Did anyone else try it?
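For context on why batch size matters here, this is a minimal NumPy sketch (illustrative only, not the repo's code) of the key difference: BatchNorm's statistics are computed across the batch, so its output for a given sample changes when the batch shrinks, while LayerNorm's statistics are per-sample and independent of batch size.

```python
import numpy as np

def batchnorm(x, eps=1e-5):
    # Normalize over the batch axis (axis 0): statistics depend on the batch.
    mean = x.mean(axis=0, keepdims=True)
    var = x.var(axis=0, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def layernorm(x, eps=1e-5):
    # Normalize over the feature axis (axis 1): each sample is independent.
    mean = x.mean(axis=1, keepdims=True)
    var = x.var(axis=1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

rng = np.random.default_rng(0)
x = rng.normal(size=(32, 64))  # batch of 32 samples, 64 features

# Compare the first 8 samples normalized inside a batch of 32
# versus the same 8 samples normalized as a batch of 8.
ln_full = layernorm(x)[:8]
ln_small = layernorm(x[:8])
bn_full = batchnorm(x)[:8]
bn_small = batchnorm(x[:8])

print(np.allclose(ln_full, ln_small))  # True: LayerNorm ignores batch size
print(np.allclose(bn_full, bn_small))  # False: BatchNorm stats changed
```

In PyTorch terms this corresponds to swapping `nn.BatchNorm2d` for `nn.LayerNorm` (or `nn.GroupNorm`, which is often used for small batches), though how well that works for this particular architecture is exactly the open question.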