Performance problem #116
Wonder what takes so long. Which Node version are you using? About the only logic executed on import that is not itself an import is this:
Update: looks like the main reason is an old version of swagger-parser that brought core-js with it.
https://github.com/PayU/api-schema-builder/pull/45/files removes the dependency on core-js, so I think that would help.
I'm on Node 8 with a Firebase Functions setup.
I'll also be waiting for the update to get approved. Can't wait to test it :)
@manwithsteelnerves Could you try the version from #121 and see if it makes any difference?
Released in 3.0.0.
Also, in the above test, I was only referencing the InputValidationError type. Is there any way to reference that type (for type info only) without loading the complete module?
Here are a few more tests where it took 1.87 s, which I see is more consistent across several runs.
Should be possible to short-circuit that. Will investigate. I'm curious, what is your use-case? Why optimize for startup speed?
I'm using it in serverless functions, specifically Firebase Cloud Functions. Unfortunately there is no guaranteed idle time for the instances that spin up, so module loading effectively happens on nearly every request and needs to be fast.
@manwithsteelnerves, can you share your definition file?
Here is the schema.
@kobik Do you see anything wrong with the spec?
I'll have a look tomorrow.
@kobik If I split the JSON, will it be of any help? Edit: I'm trying to split the spec into multiple files so that there is one file per route. Is it possible to pass init the main spec file and have it reference its model definitions from another file?
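Splitting should in principle be possible, because Swagger/OpenAPI allows $ref to point into another file; whether init resolves such external references depends on the parser the library uses (swagger-parser normally dereferences them). A hypothetical fragment, with made-up file and model names:

```json
{
  "paths": {
    "/items": {
      "get": {
        "responses": {
          "200": {
            "description": "OK",
            "schema": { "$ref": "definitions.json#/Item" }
          }
        }
      }
    }
  }
}
```

Keep in mind that resolving external references adds file I/O at load time, so splitting can trade parse time for extra reads.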
Hi @manwithsteelnerves, I'm very sorry for the late response. I'm on it now.
According to my measurements:
@manwithsteelnerves would you mind testing the beta version? I did some optimization of the require statements and I'd like to see if it makes things better for you.
Sure! Happy to do that :) Just to let you know, I split the bulky single schema into multiple parts (one per endpoint + HTTP method), but I still find it taking some time, so I can get you those values.
So your actual schemas are much smaller? Can you share an example of a real schema you're using?
Sorry! I missed your comment! Yes, the schema is now per endpoint, and it is loaded at runtime every time the endpoint is triggered. Please check the schema here. If overall performance improves, I'll definitely revert my code to the old single-schema approach, since that is a one-time load.
@kobik May I know how to download the latest version so I can profile it?
@manwithsteelnerves Judging by what I see on npm, I believe 3.0.0-beta.0 is it.
I saw it but wanted a confirmation. Let me try that!
Hi, I kept the old setup so we can continue our comparisons. If you see any way I can reduce it further, please let me know, as this is crucial on serverless platforms. I'm attaching the screenshot as well as the trace file, which you can load using this tool. Please check from 2.7 s onward. Also, I'm importing this type, and it alone adds nearly 500 ms.
It would be great if you could let me know if I'm doing anything wrong.
@manwithsteelnerves, I'm not sure what else we can do here. Can you try using
I actually have another idea I've toyed with: caching the validation object. Allow creating it from a definitions file and exporting the validation object to a file, then init the middleware by giving it the exported validation object file. Do you think that would be an acceptable solution?
Just to be on the same page, can you share more details about it?
It's like creating and committing a "compiled" version of the definitions file and using it instead of the definitions file for faster loading times. This is just an idea; we still need to see whether it's even feasible. But first I'd like to hear what you think of it, and second, I'd still appreciate it if you'd test the asyncInit. Thanks.
That's a nice one. Most likely it can give better performance.
@kobik Did you make any performance progress after this? I don't see release notes to go through, so please let me know if you have any updates.
@kobik Would love to know if you have made any progress on this.
@manwithsteelnerves Some of the heavier dependencies were dropped, so that should speed up at least the import part. I don't think schema loading has changed much since then.
Currently it's taking around 1.2 seconds to load the module. Any way to get better loading times?