Out_of_memory with Dns_resolver_mirage #298
Do you know what version of ocaml-dns was used?
The version is
Thanks for your report. I've no insight which
Currently, my unikernel is available here: https://github.com/mirage/dns-resolver. I will relaunch it with
That would be good, though I'd expect more useful data if you add some metrics (or periodic log messages) about memory consumption and cache size. Note that the LRU cache is not measured in actual bytes used but in items: the weight function is the size of the resource record map, and each resource record may differ in size.
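The weight function described above could look roughly like the following sketch. `Rr_map` here is a hypothetical stand-in for the resolver's actual resource record map type, and `weight` is an assumed name; the point is only that a cache entry's weight is its item count, not its size in bytes.

```ocaml
(* Stand-in for the real resource record map: keys would be RR types
   in the actual resolver, values the records themselves. *)
module Rr_map = Map.Make (Int)

(* The weight of a cache entry is the number of resource records it
   holds, regardless of how large each record is in bytes. *)
let weight (m : string list Rr_map.t) = Rr_map.cardinal m
```

This is why two caches with the same weight can occupy very different amounts of memory: an entry with one large record weighs less than an entry with many tiny ones.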
The reason I asked is that I can't make sense of the line number
I opened #299, which should help figure out whether the cache (as created) is causing the OOM. Is there a way to get an over-approximation of the memory used by an OCaml value? (AFAICT not, especially not when using bigarrays.)
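One partial answer from the standard library, assuming OCaml >= 4.04, is `Obj.reachable_words`, which counts the heap words reachable from a value. Consistent with the caveat above, it does not account for out-of-heap payloads such as bigarray data, so it remains an under-approximation in that case. The helper names below are assumptions for illustration.

```ocaml
(* Approximate the heap footprint of a value. Counts all heap blocks
   (including headers) reachable from [v]; bigarray payloads live
   outside the OCaml heap and are NOT counted. *)
let approx_words v = Obj.reachable_words (Obj.repr v)

(* Convert words to bytes: a word is 8 bytes on 64-bit, 4 on 32-bit. *)
let approx_bytes v = approx_words v * (Sys.word_size / 8)
```

Beware that `Obj.reachable_words` traverses the value, so calling it on a large cache is itself costly and best done only occasionally.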
It's probably due to that
These are my last logs just before the
Not sure if this can help, as it seems to take a long time to trigger the memory fault, but I tried to print memory usage; my modifications can be seen at https://github.com/palainp/dns-resolver/tree/print-mem-usage. Every 10 seconds it displays three memory sizes: the memory available above the top of the heap (including the stack), which malloc can use to grow the OCaml memory; the real free memory (the memory above the heap, including the stack, plus the free memory inside the heap); and the total memory available (the --mem argument minus the kernel size).
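To compare such figures against the unikernel's memory budget, the word counts reported by the OCaml `Gc` module need converting to bytes. A minimal sketch (`bytes_of_words` and `log_heap_bytes` are assumed names, not part of the branch above):

```ocaml
(* A word is Sys.word_size bits, i.e. 8 bytes on 64-bit targets. *)
let bytes_of_words w = w * (Sys.word_size / 8)

(* Log the current and peak OCaml heap size in bytes.
   Gc.quick_stat is cheap: it does not force a heap traversal. *)
let log_heap_bytes () =
  let s = Gc.quick_stat () in
  Printf.printf "heap: %d bytes (peak: %d bytes)\n"
    (bytes_of_words s.Gc.heap_words)
    (bytes_of_words s.Gc.top_heap_words)
```

Comparing `top_heap_words` over time against the 32 MB given via --mem should show whether the OCaml heap itself is growing or whether the pressure comes from outside it.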
After ~2 weeks, I systematically get an Out_of_memory with this trace: it seems to involve an internal Hashtbl. The unikernel is launched with 32 MB. I will try to introspect further to understand where the error is.
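If an internal `Hashtbl` is suspected, `Hashtbl.stats` from the standard library can show how large a table has grown before the OOM hits. The sketch below fills a toy table (`table_stats` is an assumed name, and the table is not the resolver's actual one) and prints its statistics:

```ocaml
(* Build a table with [n] bindings and return its statistics.
   Hashtbl resizes by doubling as it fills, so num_buckets grows
   alongside num_bindings. *)
let table_stats n =
  let h = Hashtbl.create 16 in
  for i = 1 to n do
    Hashtbl.add h i (string_of_int i)
  done;
  Hashtbl.stats h

let () =
  let st = table_stats 1_000 in
  Printf.printf "bindings=%d buckets=%d max_bucket=%d\n"
    st.Hashtbl.num_bindings st.Hashtbl.num_buckets
    st.Hashtbl.max_bucket_length
```

Logging these numbers periodically for the suspect table would show whether it grows without bound, which would point at a missing eviction rather than at the LRU cache.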