This repository has been archived by the owner on Jul 1, 2021. It is now read-only.
I'm trying to install deeplabcut-core on my Jetson Nano to download a pretrained model. However, I found that DLC requires intel-openmp, which isn't available on ARM. ARM CPUs are becoming more and more common (e.g. Apple M1), so perhaps you could consider supporting them. I also wonder how you test your models on the Jetson Xavier: if I train a model on an x86 machine, can it run on ARM directly with TensorRT support? Thanks!
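For the TensorRT part of the question, the usual route (not something confirmed by the DLC maintainers here) is to train and export the network as a TensorFlow SavedModel on the x86 machine, copy it to the Jetson, and run the TF-TRT conversion on the device itself, since TensorRT engines are built for a specific GPU and are not portable across architectures. A minimal sketch, assuming TensorFlow 2.x with TF-TRT support (as in NVIDIA's JetPack builds) and a hypothetical `exported-models/my_dlc_model` directory:

```python
# Hypothetical sketch: optimize an exported SavedModel with TF-TRT on the Jetson.
# The paths and FP16 precision are assumptions, not part of the original issue.
from tensorflow.python.compiler.tensorrt import trt_convert as trt

SAVED_MODEL_DIR = "exported-models/my_dlc_model"      # trained/exported on x86
TRT_OUTPUT_DIR = "exported-models/my_dlc_model_trt"   # optimized copy for the Jetson

# Use FP16 to take advantage of the Jetson's half-precision throughput.
params = trt.DEFAULT_TRT_CONVERSION_PARAMS._replace(precision_mode="FP16")

converter = trt.TrtGraphConverterV2(
    input_saved_model_dir=SAVED_MODEL_DIR,
    conversion_params=params,
)
converter.convert()            # TensorRT engines are built lazily at first inference
converter.save(TRT_OUTPUT_DIR)
```

The saved `TRT_OUTPUT_DIR` can then be loaded with `tf.saved_model.load()` for inference on the Jetson; only the conversion/engine-building step needs to happen on the ARM device.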
This is probably a question that belongs on the forum/Gitter, but since someone brought it up here, I'm also curious about the same thing. I'm on macOS with an M1 chip, and was wondering if you plan to support it anytime in the future.