Rank reduction using random matrix theory #13
Replies: 2 comments 1 reply
-
Hi Fernando, Thank you for your interest in our work! I am excited to read your extension. I do have a few questions:
Being able to replace the full grid search with something smarter would be super beneficial. We are excited to hear more about your work. Best,
-
This is exciting! Finding a computationally efficient way to select LASER hyperparameters is indeed an open question. By the way, we have now put a results table on the website; if you prefer, we would be happy to paste your laserRMT results there and acknowledge/cite you. My questions/comments are the same as @pratyushasharma's. While I am optimistic that there are hyperparameters that do well across a range of tasks, by fixing a single setting you may not see the large gains we reported in the paper, which came from task-specific hyperparameter selection. It would also be nice to either run vanilla LASER with grid-search hyperparameter selection on the tasks you have tried, or run laserRMT on the datasets in our paper.
-
Hi! I really enjoyed the paper.
I've implemented a version that uses the Marchenko-Pastur law to pick the rank directly, which speeds up the search compared to looking within a grid (a minimal sketch of the idea follows below this message).
If you're interested, we could join efforts.
I would be glad if you could take a look at https://github.com/cognitivecomputations/laserRMT
Congratulations on your work.
Best,
Fernando
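
A minimal sketch of the thresholding idea, assuming the "noise" part of a weight matrix behaves like an i.i.d. random matrix with a known entry scale `sigma`; the function names and the way `sigma` is supplied are illustrative assumptions, not laserRMT's actual API:

```python
import numpy as np

def mp_rank(W: np.ndarray, sigma: float) -> int:
    """Count the singular values of W above the Marchenko-Pastur bulk edge.

    For an m x n i.i.d. noise matrix with entry standard deviation sigma,
    the largest singular value concentrates near sigma * (sqrt(m) + sqrt(n)).
    Singular values above that edge are treated as signal; the rest as noise.
    """
    m, n = W.shape
    s = np.linalg.svd(W, compute_uv=False)        # singular values, descending
    cutoff = sigma * (np.sqrt(m) + np.sqrt(n))    # MP upper edge for singular values
    return int(np.sum(s > cutoff))

def low_rank_replace(W: np.ndarray, k: int) -> np.ndarray:
    """Rank-k truncated-SVD reconstruction of W (the LASER-style replacement)."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k, :]

# Example: pick the rank from the MP edge instead of sweeping a grid of ranks.
W = np.random.randn(4096, 1024) * 0.02            # stand-in for a transformer weight
k = mp_rank(W, sigma=0.02)                        # sigma would be estimated in practice
W_reduced = low_rank_replace(W, max(k, 1))
```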