Why does CPU training work well, but GPU training doesn't? #1650
-
Hi GPyTorch team, you've done great work, thanks for this library, I love it! Currently I'm trying to get a classifier to work. I noticed that it trains well on the CPU for a small subset of my large dataset.

Training on CPU:

```
Iter 1/50 - Loss: 0.908
```

The loss seems to decrease nicely! Training on GPU:

```
[0, 999] loss: 0.699
```

The loss doesn't seem to decrease at all. Does anyone have an idea? I'm grateful for any comment :) Thanks!
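For reference, here is a minimal sketch of what a GPU training loop for a GPyTorch variational classifier might look like. Everything below (model, toy data, batch size, learning rate) is an illustrative assumption, not the code from the original question:

```python
import torch
import gpytorch
from torch.utils.data import DataLoader, TensorDataset


class GPClassifier(gpytorch.models.ApproximateGP):
    def __init__(self, inducing_points):
        variational_distribution = gpytorch.variational.CholeskyVariationalDistribution(
            inducing_points.size(0)
        )
        variational_strategy = gpytorch.variational.VariationalStrategy(
            self, inducing_points, variational_distribution, learn_inducing_locations=True
        )
        super().__init__(variational_strategy)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x)
        )


device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Toy stand-in data; the real dataset comes from the original question.
train_x = torch.randn(1000, 2)
train_y = (train_x.sum(dim=-1) > 0).float()  # BernoulliLikelihood expects {0, 1} labels

model = GPClassifier(train_x[:64].clone()).to(device)
likelihood = gpytorch.likelihoods.BernoulliLikelihood().to(device)

model.train()
likelihood.train()
optimizer = torch.optim.Adam(
    list(model.parameters()) + list(likelihood.parameters()), lr=0.01
)
mll = gpytorch.mlls.VariationalELBO(likelihood, model, num_data=train_y.size(0))

loader = DataLoader(TensorDataset(train_x, train_y), batch_size=64, shuffle=True)

for epoch in range(50):
    for x_batch, y_batch in loader:
        # Every minibatch must live on the same device as the model.
        x_batch, y_batch = x_batch.to(device), y_batch.to(device)
        optimizer.zero_grad()
        loss = -mll(model(x_batch), y_batch)
        loss.backward()
        optimizer.step()
    print(f"Epoch {epoch + 1}/50 - Loss: {loss.item():.3f}")
```

Keeping the logging format identical on CPU and GPU makes the two loss curves directly comparable, which helps rule out a reporting difference before debugging the optimization itself.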
-
Try a larger batch size than 16.
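To make that concrete with the hypothetical loader from the sketch above (256 is just an example value, not a tuned setting):

```python
# Larger minibatches give lower-variance stochastic ELBO gradients,
# at the cost of fewer optimizer steps per epoch.
loader = DataLoader(TensorDataset(train_x, train_y), batch_size=256, shuffle=True)
```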