OpenVINO vs Torch Inference #142
Replies: 5 comments 2 replies
-
Hi @alexriedel1, it depends on the model, but using a GPU is advantageous in many cases. Here is an example I ran some time ago on an NVIDIA 3090 GPU and an i9-10980XE CPU. Please note that these numbers are unofficial and experimental only.
-
Thanks @samet-akcay!
-
We plan to add some Jupyter notebooks to demonstrate this sort of experimentation.
-
Meanwhile, you could use the following benchmarking script to reproduce the throughput results for GPU and CPU.
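The script itself did not survive in this export, so here is a minimal sketch of the kind of timing harness one could use to compare throughput. The names `measure_throughput` and `infer_fn` are illustrative, not part of anomalib's API; in practice `infer_fn` would wrap either a Torch model call (inside `torch.no_grad()`, with `torch.cuda.synchronize()` around the timed region for CUDA) or an OpenVINO compiled model's inference call.

```python
import time

def measure_throughput(infer_fn, batch, n_warmup=10, n_iters=100):
    """Return throughput in samples/second for repeated calls to infer_fn.

    infer_fn: callable taking one batch; a placeholder for a Torch or
              OpenVINO inference call.
    batch:    the input batch; len(batch) is taken as the batch size.
    """
    # Warm-up iterations let caches, JIT compilation, and GPU clocks settle
    # before timing starts.
    for _ in range(n_warmup):
        infer_fn(batch)

    start = time.perf_counter()
    for _ in range(n_iters):
        infer_fn(batch)
    elapsed = time.perf_counter() - start

    return (n_iters * len(batch)) / elapsed

# Dummy stand-in workload so the sketch runs without torch or openvino
# installed; replace with a real model's forward pass.
dummy_batch = [0.0] * 8
throughput = measure_throughput(lambda b: sum(x * x for x in b), dummy_batch)
print(f"{throughput:.0f} samples/s")
```

Running the same harness once with a Torch `infer_fn` and once with an OpenVINO one, on identical batches, gives directly comparable numbers.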
-
@samet-akcay |
-
Hello,
when using a GPU for inference, will inference with an OpenVINO IR outperform Torch CUDA inference in terms of speed?
Thanks, Alex