The total batch size (in tokens) is 0.5M per iteration, which follows the GPT-3 paper. By the same logic, since GPT-3's warmup is 375M tokens, why is `warmup_iters` set to 2000 rather than 375/0.5 = 750? Is there any further consideration?
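For reference, a quick sketch of the arithmetic behind the question. The micro-batch numbers below are my assumption of nanoGPT's `train_gpt2`-style config (they are not stated in this issue), chosen so the product lands near the 0.5M tokens per iteration mentioned above:

```python
# Back-of-the-envelope check of the warmup arithmetic in the question.
# batch_size / block_size / grad_accum values are assumptions, not from this issue.

batch_size = 12           # sequences per micro-batch (assumed)
block_size = 1024         # tokens per sequence (assumed)
grad_accum_steps = 5 * 8  # micro-batches per iteration across 8 GPUs (assumed)

tokens_per_iter = batch_size * block_size * grad_accum_steps
print(f"tokens per iteration: {tokens_per_iter:,}")  # 491,520 ~= 0.5M

gpt3_warmup_tokens = 375e6  # warmup length from the GPT-3 paper
implied_warmup_iters = gpt3_warmup_tokens / tokens_per_iter
print(f"implied warmup_iters: {implied_warmup_iters:.0f}")  # ~763, i.e. the ~750 in the question

repo_warmup_iters = 2000  # value the question says the repo actually uses
print(f"repo warmup is {repo_warmup_iters / implied_warmup_iters:.1f}x the GPT-3 schedule")
```

So a strict token-for-token match to GPT-3 would indeed give roughly 750 warmup iterations, and 2000 is about 2.6x longer.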