mosec/examples/resnet50_server_pytorch.py (line 49, commit 1627a9b): Hi, for multi-instance usage, PyTorch uses the default CUDA stream by default, which prevents the instances from running concurrently. How do you solve this problem? Multi-process?
Answered by lkevinzc, Nov 17, 2021:
Hi @ShiyangZhang, yes, we use multi-process. All concurrent requests are queued, and multiple worker processes consume the queue.
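For context, here is a minimal sketch of that multi-process pattern using mosec's `Server`/`Worker` API. The worker class name, the commented-out model loading, and the `num=2` value are illustrative rather than taken from the linked example:

```python
# Minimal sketch of mosec's multi-process worker pattern (illustrative).
from mosec import Server, Worker


class Inference(Worker):
    """Each appended worker instance runs in its own OS process, so every
    instance gets its own CUDA context (and its own default stream); requests
    are pulled from a shared queue and handled concurrently."""

    def __init__(self):
        super().__init__()
        # Load the model once per process, e.g. (hypothetical):
        # self.model = torchvision.models.resnet50(pretrained=True).cuda().eval()

    def forward(self, data):
        # Run inference on the dequeued request and return the result.
        return data


if __name__ == "__main__":
    server = Server()
    # Spawn 2 worker processes that consume the request queue concurrently.
    server.append_worker(Inference, num=2)
    server.run()
```

Because each worker lives in a separate process, each one holds its own CUDA context and default stream, so the single-default-stream limitation no longer serializes the instances.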