Replies: 2 comments
-
The quantized palm model palm_detection_builtin_256_integer_quant.tflite has an input size of 1x256x256x3, which matches the MediaPipe BlazePalm model, whereas palm_detection_1x3x128x128_post.onnx has a different input H and W. I wanted to know why these H and W values were chosen.
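A side note on reading the two shapes in the question: beyond the different resolutions (256 vs 128), the tflite model uses NHWC layout (1x256x256x3) while the ONNX export uses NCHW (1x3x128x128), which is typical for the two formats. A minimal sketch of converting between the layouts, assuming a NumPy array as the input tensor:

```python
import numpy as np

# A dummy NHWC tensor shaped like the tflite model's input (1x256x256x3).
nhwc = np.zeros((1, 256, 256, 3), dtype=np.float32)

# Transpose NHWC -> NCHW: move the channel axis from last to second.
nchw = nhwc.transpose(0, 3, 1, 2)

print(nchw.shape)  # (1, 3, 256, 256)
```

So when comparing model input sizes across formats, the axis order has to be normalized first; only the spatial resolution (256 vs 128) is the real difference here.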
0 replies
-
The ONNX models such as hand_landmark_080_Nx3x224x224.onnx under https://github.com/PINTO0309/PINTO_model_zoo/tree/main/033_Hand_Detection_and_Tracking/30_batchN_post-process_marged have a dynamic batch size. When one is converted to a TensorFlow or tflite model, the batch size is fixed to 1 [1x3x224x224].
Does TensorFlow support a dynamic batch size? If so, is it possible to convert the ONNX model to tflite while retaining the dynamic batch size?
Any advice would be helpful for understanding the conversion process more deeply.