Shared layer weights are leaked when disposing a model #8471
Hi @virgil-king, I am not able to reproduce the issue from the piece of code you shared. Could you please clarify how to run it? Also, from the error you shared, I could check that if the model's weights are disposed individually, instead of disposing the model itself, the leak does not occur.
Let me know if that helps. Thank you!
Hi @shmishra99, thanks for looking at the bug. To repro, you need Node.js installed. Create a directory with a file named …, paste the sample code from my earlier comment into …, and run …. I changed my sample code to log …; a sketch of a repro along these lines is below.
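For reference, here is a minimal sketch of this kind of repro. It is my own reconstruction, not the original sample code: the layer sizes, the `shared_dense` name, and whether this exact shape triggers the reported leak are all assumptions. The pattern it illustrates is a layer instance applied more than once, so its weights are shared within the graph.

```js
const tf = require('@tensorflow/tfjs-node');

// Build a model in which one layer instance is applied twice,
// so its weights are shared within the model's graph.
const input = tf.input({shape: [4]});
const shared = tf.layers.dense({units: 4, name: 'shared_dense'});
const output = shared.apply(shared.apply(input));
const model = tf.model({inputs: input, outputs: output});

// Run one prediction; tf.tidy disposes the tensors it creates,
// leaving only the model's weight variables allocated.
tf.tidy(() => model.predict(tf.zeros([1, 4])));

// If dispose works as expected, no tensors should remain allocated.
model.dispose();
console.log('tensors still allocated:', tf.memory().numTensors);
```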
Your suggestion to dispose all of the weights of the model instead of disposing the model itself does fix the leak (a sketch of that workaround is below). But …
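A sketch of that workaround, assuming the `weights` property of `LayersModel` and disposing each underlying variable via `val` (the `isDisposed` guard is my own addition):

```js
// Workaround: dispose every weight variable of the model directly,
// instead of relying on model.dispose(). The isDisposed check guards
// against double-disposing a variable that appears more than once.
function disposeAllWeights(model) {
  for (const weight of model.weights) {
    if (!weight.val.isDisposed) {
      weight.val.dispose();
    }
  }
}
```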
If that's really the intent, it should be added to the documentation (unless I missed it there), or else no one will know to do that.

But speaking of the intent, IMO model.dispose should dispose these weights. I guess the current logic of model.dispose is to only dispose weights whose reference count is one? If so, it should do that in a loop, until one iteration is unable to dispose any weights, since one loop iteration may cause additional weights to become disposable in the next iteration (sketched below). Otherwise, to be correct, every user of the API would have to do exactly that, and I don't think that's possible currently since the reference count isn't exposed. Other approaches, like disposing every weight of the model, would be incorrect when weights are shared across models.
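A sketch of that proposed loop. This is hypothetical: `refCount` stands in for the internal per-weight reference count, which tfjs does not expose publicly.

```js
// Repeatedly dispose weights that have become uniquely referenced,
// until a full pass over the weights makes no progress.
// refCount() is hypothetical; tfjs keeps this count internally.
function disposeModelWeights(weights) {
  let progress = true;
  while (progress) {
    progress = false;
    for (const w of weights) {
      if (!w.val.isDisposed && refCount(w) === 1) {
        w.dispose();
        progress = true;
      }
    }
  }
}
```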
System information
Describe the current behavior
Result: 36 weights are not disposed. Those weights are all part of a subgraph of which there are 4 copies in the model:
conv2d_Conv2D[7-16]/kernel
conv2d_Conv2D[7-16]/bias
board_analysis_output/kernel
board_analysis_output/bias
Describe the expected behavior
All weights should be disposed.
Standalone code to reproduce the issue
model.zip
Other info / logs