Replies: 1 comment
-
I would recommend controlling the lifecycle of the subprocess manually instead of using a context manager. Additionally, after starting the server, you can sleep for a certain amount of time to avoid a race condition:

```python
server = subprocess.Popen(COMMAND)
time.sleep(30)
```

Since the client does support being used as a context manager, you can do this:

```python
try:
    with bentoml.SyncHTTPClient(...) as client:
        ...
finally:
    server.terminate()
```

Hope this helps.
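A fixed `time.sleep(30)` can be flaky (too short on a slow machine, wasteful on a fast one). As an alternative, the `/readyz` probe mentioned later in this thread can be polled until it answers. This is only a sketch using the standard library; the URL, timeout, and interval values are assumptions:

```python
import time
import urllib.error
import urllib.request

def wait_for_ready(url: str, timeout: float = 60.0, interval: float = 0.5) -> bool:
    """Poll `url` until it answers HTTP 200, or give up after `timeout` seconds."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass  # server not accepting connections yet; retry
        time.sleep(interval)
    return False
```

With this helper, the fixed sleep can be replaced by `wait_for_ready("http://localhost:8100/readyz")`, and the test can terminate the server and fail fast if it returns `False`.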
-
Hello! I'm using BentoML and greatly appreciate the excellent serving framework you've created.
I'm writing an integration test code following the official guide, but I've encountered a problem.
I want to start the BentoML server via subprocess and request images using `bentoml.SyncHTTPClient`. However, I'm getting the following error:
Based on the test results, it appears that the component loading and the TorchScript JIT compilation warm-up process are functioning correctly:
It seems the `/readyz` endpoint is functioning without issues, but the error occurs at `/schema.json`. However, even when running the server normally with `bentoml serve`, `http://localhost:8100/schema.json` does not seem to exist. My API uses `Multipart(data=File())` from `bentoml.io`, and I couldn't find appropriate guides for this setup either.
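As a workaround while the schema-driven client fails, a `multipart/form-data` request for a `Multipart(data=File())` endpoint can be assembled by hand with the standard library. This is only a sketch: the field name `data` mirrors the descriptor above, but the route `/predict` and the content type in the commented usage are assumptions, not anything confirmed by the service:

```python
import uuid

def build_multipart(field: str, filename: str, payload: bytes,
                    content_type: str = "application/octet-stream") -> tuple[bytes, str]:
    """Return a multipart/form-data body plus the matching Content-Type header value."""
    boundary = uuid.uuid4().hex
    body = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="{field}"; filename="{filename}"\r\n'
        f"Content-Type: {content_type}\r\n\r\n"
    ).encode() + payload + f"\r\n--{boundary}--\r\n".encode()
    return body, f"multipart/form-data; boundary={boundary}"

# Hypothetical usage -- "/predict" is an assumed route name:
# body, ctype = build_multipart("data", "image.png", open("image.png", "rb").read(), "image/png")
# req = urllib.request.Request("http://localhost:8100/predict", data=body,
#                              headers={"Content-Type": ctype}, method="POST")
# with urllib.request.urlopen(req) as resp:
#     print(resp.read())
```

This sidesteps `SyncHTTPClient` entirely, so it works even when `/schema.json` is unavailable.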
Additionally, I was unable to find any relevant information about this schema issue online.
Could you please provide guidance on how to resolve this issue?
test_integration.py:
service.py: