Replies: 1 comment
-
You could build the Docker container and then mount a volume to it that contains your model. I have not tested this with BentoML, but I have used this approach before with a FastAPI app and a custom Dockerfile.
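A minimal sketch of that volume-mount approach (the host path, mount point, and image name here are hypothetical; `TRANSFORMERS_CACHE` is the environment variable the `transformers` library checks for its model cache):

```shell
# Save or download the model into ./models on the host first, then mount
# that directory into the container and point the transformers cache at it:
docker run \
    -v "$(pwd)/models:/app/models" \
    -e TRANSFORMERS_CACHE=/app/models \
    my-bento-image:latest
```

Because the model files live on the host, rebuilding or restarting the container does not trigger a re-download.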
-
I have tried using the saved Bento model in the `service.py` file, but whenever I build the Docker image and run the container, the transformer model used in the pipeline is downloaded again separately.
I can't find a way to load the transformer model locally from inside the container instead of downloading it from the internet each time the container runs.
Any suggestions?
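One common pattern for this (a sketch, not BentoML-specific: the helper name and paths are hypothetical) is to save the model into the image at build time with `save_pretrained`, then have `service.py` prefer that local directory over the Hub id when constructing the pipeline:

```python
import os

def resolve_model_source(local_dir: str, hub_id: str) -> str:
    """Return local_dir if it contains saved model files, else fall back
    to the Hugging Face Hub id (which would trigger a download)."""
    if os.path.isdir(local_dir) and os.listdir(local_dir):
        return local_dir
    return hub_id

# In service.py you would then build the pipeline from whichever source
# resolves, e.g. (model name and path are placeholders):
#
#   from transformers import pipeline
#   pipe = pipeline(
#       "text-classification",
#       model=resolve_model_source("/app/models/my-model",
#                                  "distilbert-base-uncased"),
#   )
```

If `/app/models/my-model` was populated during `docker build` (for example via a `RUN` step that calls `save_pretrained`), the container never touches the network at startup.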