[Bug]: The variable convert_model seems missing in anomalib v1.0.1 #2308
Describe the bug: The variable `convert_model` seems missing (code below).
Error message: see Logs.
Dataset: N/A
Model: Other (please specify in the field below)
Steps to reproduce the behavior: Code
OS information: Kaggle notebook
Expected behavior: The trained model can be converted to OpenVINO format.
Screenshots: No response
Pip/GitHub: pip
What version/branch did you use? No response
Configuration YAML: Using default YAML
Logs:
Calculate Validation Dataset Quantiles: 100%|██████████| 1/1 [00:00<00:00, 2.13it/s]
Calculate Validation Dataset Quantiles: 100%|██████████| 1/1 [00:00<00:00, 1.92it/s]
Calculate Validation Dataset Quantiles: 100%|██████████| 1/1 [00:00<00:00, 2.13it/s]
Calculate Validation Dataset Quantiles: 100%|██████████| 1/1 [00:00<00:00, 1.91it/s]
Calculate Validation Dataset Quantiles: 100%|██████████| 1/1 [00:00<00:00, 2.11it/s]
INFO: `Trainer.fit` stopped: `max_epochs=5` reached.
/opt/conda/lib/python3.10/site-packages/anomalib/models/image/efficient_ad/torch_model.py:29: TracerWarning: torch.tensor results are registered as constants in the trace. You can safely ignore this warning if you use this function to create tensors out of constant variables that would be the same every time you call this function. In any other case, this might cause the trace to be incorrect.
mean = torch.tensor([0.485, 0.456, 0.406])[None, :, None, None].to(x.device)
/opt/conda/lib/python3.10/site-packages/anomalib/models/image/efficient_ad/torch_model.py:30: TracerWarning: torch.tensor results are registered as constants in the trace. You can safely ignore this warning if you use this function to create tensors out of constant variables that would be the same every time you call this function. In any other case, this might cause the trace to be incorrect.
std = torch.tensor([0.229, 0.224, 0.225])[None, :, None, None].to(x.device)
/opt/conda/lib/python3.10/site-packages/anomalib/models/image/efficient_ad/torch_model.py:332: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
return any(value.sum() != 0 for _, value in p_dic.items())
/opt/conda/lib/python3.10/site-packages/anomalib/models/image/efficient_ad/torch_model.py:212: TracerWarning: Converting a tensor to a Python float might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
math.ceil(image_size[0] / 4) if self.padding else math.ceil(image_size[0] / 4) - 8,
/opt/conda/lib/python3.10/site-packages/anomalib/models/image/efficient_ad/torch_model.py:213: TracerWarning: Converting a tensor to a Python float might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
math.ceil(image_size[1] / 4) if self.padding else math.ceil(image_size[1] / 4) - 8,
/opt/conda/lib/python3.10/site-packages/torch/onnx/_internal/jit_utils.py:314: UserWarning: Constant folding - Only steps=1 can be constant folded for opset >= 10 onnx::Slice op. Constant folding not applied. (Triggered internally at /usr/local/src/pytorch/torch/csrc/jit/passes/onnx/constant_fold.cpp:179.)
_C._jit_pass_onnx_node_shape_type_inference(node, params_dict, opset_version)
/opt/conda/lib/python3.10/site-packages/torch/onnx/utils.py:739: UserWarning: Constant folding - Only steps=1 can be constant folded for opset >= 10 onnx::Slice op. Constant folding not applied. (Triggered internally at /usr/local/src/pytorch/torch/csrc/jit/passes/onnx/constant_fold.cpp:179.)
_C._jit_pass_onnx_graph_shape_type_inference(
/opt/conda/lib/python3.10/site-packages/torch/onnx/utils.py:1244: UserWarning: Constant folding - Only steps=1 can be constant folded for opset >= 10 onnx::Slice op. Constant folding not applied. (Triggered internally at /usr/local/src/pytorch/torch/csrc/jit/passes/onnx/constant_fold.cpp:179.)
_C._jit_pass_onnx_graph_shape_type_inference(
---------------------------------------------------------------------------
NameError Traceback (most recent call last)
Cell In[2], line 33
29 # logger.info("Checkpoint path: {}".format(engine.trainer.default_root_dir))
30
31 # Train the model
32 engine.fit(datamodule=datamodule, model=model)
---> 33 engine.export(export_type=ExportType.OPENVINO,
34 model=model,
35 export_root='/kaggle/working/anomalib_weight')
File /opt/conda/lib/python3.10/site-packages/anomalib/engine/engine.py:910, in Engine.export(self, model, export_type, export_root, transform, ov_args, ckpt_path)
903 exported_model_path = export_to_onnx(
904 model=model,
905 export_root=export_root,
906 transform=transform,
907 task=self.task,
908 )
909 elif export_type == ExportType.OPENVINO:
--> 910 exported_model_path = export_to_openvino(
911 model=model,
912 export_root=export_root,
913 transform=transform,
914 task=self.task,
915 ov_args=ov_args,
916 )
917 else:
918 logging.error(f"Export type {export_type} is not supported yet.")
File /opt/conda/lib/python3.10/site-packages/anomalib/deploy/export.py:295, in export_to_openvino(export_root, model, transform, ov_args, task)
293 ov_model_path = model_path.with_suffix(".xml")
294 ov_args = {} if ov_args is None else ov_args
--> 295 if convert_model is not None and serialize is not None:
296 model = convert_model(model_path, **ov_args)
297 serialize(model, ov_model_path)
NameError: name 'convert_model' is not defined
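The NameError at the `if convert_model is not None` check suggests the optional OpenVINO import failed without leaving a `None` fallback binding. A minimal sketch of the defensive guarded-import pattern (the module name `some_optional_backend` and the function `export_model` are hypothetical, not anomalib's actual code):

```python
import logging

# Guarded optional import: if the dependency is missing, bind the names to
# None so later availability checks are a plain comparison, not a NameError.
try:
    from some_optional_backend import convert_model, serialize  # hypothetical
except ImportError:
    logging.debug("Optional backend not installed; export will be unavailable.")
    convert_model = None
    serialize = None


def export_model(model_path: str) -> str:
    """Convert and serialize a model, or fail with an actionable message."""
    if convert_model is not None and serialize is not None:
        ov_model = convert_model(model_path)
        serialize(ov_model, model_path + ".xml")
        return model_path + ".xml"
    msg = "Backend missing; install it first (e.g. pip install openvino)."
    raise ModuleNotFoundError(msg)
```

With the `None` fallbacks in place, a missing dependency produces a clear ModuleNotFoundError instead of the NameError seen in the traceback above.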
Answered by samet-akcay, Sep 18, 2024
Hi @blueclowd, it is probably because you don't have OpenVINO installed in your environment. As you can see below, `convert_model` is imported from `openvino` in anomalib/src/anomalib/deploy/export.py (line 30 in 5b01918), so installing OpenVINO (e.g. `pip install openvino`) should resolve the NameError.
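Before calling `engine.export(...)`, you can verify the optional dependency is importable. A small helper along these lines (the `require` function is my own sketch, not an anomalib API):

```python
import importlib.util


def require(package: str) -> None:
    """Raise a helpful error when an optional dependency is not importable."""
    if importlib.util.find_spec(package) is None:
        raise ModuleNotFoundError(
            f"{package} is not installed; try: pip install {package}",
        )


# Usage (would raise if OpenVINO is not installed):
# require("openvino")
```

Running such a check before training saves discovering the missing dependency only at export time, after the fit has already completed.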