
DDP script and code conflict bug #1683

Open
wcyjerry opened this issue Nov 15, 2024 · 0 comments
Hi, referring to PR 1675: there is a conflict bug when using DDP.
For me, it makes the training process get stuck at the first epoch, and memory usage keeps increasing.
Usually, if the code has already called mp.set_start_method("spawn") and spawns its own workers, one should not launch it with torch.distributed.launch xxx train.py;
plain python train.py is enough. So dist_train.sh is not suitable for this code.
After I commented out the set_start_method call and kept using dist_train.sh, the whole training process started fine and there was no more huge memory cost.
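
To make the conflict concrete, here is a minimal sketch of the two incompatible launch styles. It is not the repo's actual train.py; the environment-variable guard is just one illustrative way to support both entry points:

```python
# Minimal sketch (not the repo's train.py) of the two DDP launch styles.
# Assumption: checking RANK/LOCAL_RANK is an illustrative way to detect
# whether torch.distributed.launch / torchrun started this process.
import os

import torch.distributed as dist
import torch.multiprocessing as mp


def worker(rank: int, world_size: int) -> None:
    # Each spawned child joins the process group under its own rank.
    dist.init_process_group(
        backend="gloo",  # use "nccl" on GPU machines
        init_method="tcp://127.0.0.1:29500",
        rank=rank,
        world_size=world_size,
    )
    print(f"rank {rank}/{world_size} initialized")
    dist.destroy_process_group()


if __name__ == "__main__":
    if "RANK" in os.environ or "LOCAL_RANK" in os.environ:
        # Started by torch.distributed.launch / torchrun: the launcher has
        # already created one process per GPU, so do NOT spawn more workers.
        dist.init_process_group(backend="gloo")  # rank comes from env://
        print(f"launcher-managed rank {dist.get_rank()} initialized")
        dist.destroy_process_group()
    else:
        # Started as plain `python train.py`: spawn the workers ourselves.
        mp.set_start_method("spawn", force=True)
        world_size = 2  # e.g. torch.cuda.device_count() on a GPU box
        mp.spawn(worker, args=(world_size,), nprocs=world_size)
```

Combining the two (the launcher plus mp.spawn in code) means each launcher-created process spawns its own full set of workers, so multiple processes claim the same rank and init_process_group never completes; that would match the hang at the first epoch and the growing memory use described above.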
