Open3D-ML can use Intel OpenVINO as an optional backend for deep learning model inference.
Install a compatible version of OpenVINO with:

```sh
pip install -r requirements-openvino.txt
```
To enable OpenVINO, wrap a model in the `ml3d.models.OpenVINOModel` class. For example:

```python
net = ml3d.models.PointPillars(**cfg.model, device='cpu')
net = ml3d.models.OpenVINOModel(net)
```

Then use `net` as usual.
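Conceptually, this works because the wrapper keeps a reference to the original network and forwards calls to it, so the wrapped object can be used like the original. The sketch below illustrates that delegation pattern in plain Python; the class and attribute names (`InferenceWrapper`, `DummyNet`) are illustrative assumptions, not Open3D-ML's actual implementation:

```python
class InferenceWrapper:
    """Illustrative wrapper: holds a base model and forwards attribute
    access to it, so the wrapped object behaves like the original."""

    def __init__(self, base_model):
        self._base = base_model

    def __getattr__(self, name):
        # Called only when `name` is not found on the wrapper itself,
        # so unknown attributes fall through to the wrapped model.
        return getattr(self._base, name)


class DummyNet:
    """Stand-in for a real model such as PointPillars."""
    name = "PointPillars"

    def run_inference(self, data):
        return {"boxes": data}


net = InferenceWrapper(DummyNet())
print(net.name)                        # resolved on the wrapped model
print(net.run_inference([1, 2, 3]))    # forwarded to the wrapped model
```

This is why the wrapped `net` can still be passed to the usual Open3D-ML pipelines: any method the pipeline calls that the wrapper does not override falls through to the underlying model.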
OpenVINO supports Intel CPUs, GPUs, and VPUs. By default, the model is executed on the CPU. To switch between devices, use `net.to`:

```python
net.to("cpu")     # CPU device (default)
net.to("gpu")     # GPU device
net.to("myriad")  # VPU device
```
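Internally, device selection amounts to mapping these lowercase strings onto OpenVINO plugin names (e.g. `"CPU"`, `"GPU"`, `"MYRIAD"`). The sketch below shows one way such a `to()` method could validate and store the device; the mapping table and `Net` class are assumptions for illustration, not Open3D-ML's actual code:

```python
# Assumed mapping from Open3D-ML style device strings to
# OpenVINO plugin names (illustrative, not the library's code).
_DEVICES = {"cpu": "CPU", "gpu": "GPU", "myriad": "MYRIAD"}


class Net:
    """Minimal stand-in for a wrapped model with a .to() method."""

    def __init__(self):
        self.device = "CPU"  # default device, matching the text above

    def to(self, device):
        key = device.lower()
        if key not in _DEVICES:
            raise ValueError(f"Unsupported device: {device!r}")
        self.device = _DEVICES[key]
        return self


net = Net()
net.to("myriad")
print(net.device)  # MYRIAD
```

Returning `self` from `to()` keeps the call chainable, mirroring the PyTorch convention of `net.to(device)`.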
The following models are supported with OpenVINO:

* RandLA-Net (tf, torch)
* KPConv (tf, torch)
* PointPillars (torch)