
SCML: Add warm_start parameter #345

Open
wants to merge 7 commits into master
Conversation

maxi-marufo

Because the SCML optimization procedure is based on stochastic subgradient descent, we can save the weights after fitting the model and reuse them in a subsequent fit call (with a different set of triplets). The decision to use a warm_start parameter instead of a new partial_fit method is that partial_fit in scikit-learn is expected to fit only one epoch, whereas fit runs for multiple epochs (until the loss converges or max_iter is reached), which is also the case for SCML.
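
To illustrate the intended usage, here is a minimal sketch (it assumes this PR's proposed warm_start flag on top of the existing metric_learn.SCML triplet API; the synthetic data and the explicit n_basis value are only placeholders for the example):

```python
import numpy as np
from metric_learn import SCML

rng = np.random.RandomState(42)
X = rng.randn(100, 5)
# Triplets of shape (n_triplets, 3, n_features): (anchor, positive, negative)
triplets_a = X[rng.randint(0, 100, size=(200, 3))]
triplets_b = X[rng.randint(0, 100, size=(200, 3))]

# warm_start is the parameter proposed in this PR
scml = SCML(warm_start=True, n_basis=100, random_state=42)
scml.fit(triplets_a)  # first call: weights start from their default initialization
scml.fit(triplets_b)  # second call: optimization resumes from the previously learned weights
```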

@maxi-marufo maxi-marufo changed the title [WIP] Add warm_start parameter to SCML SCML: Add warm_start parameter Mar 28, 2022
@perimosocordiae (Contributor)

This change looks fine to me, though I'm not sure when this warm-start option is useful in practice. Sorry for the extreme delay in reviewing!

@grudloff want to take a look?

@grudloff (Contributor)

LGTM!

@bellet bellet (Member) left a comment

Thanks @maxi-marufo !
Maybe it is possible to design a better test, something in the spirit of https://github.com/scikit-learn/scikit-learn/blob/80598905e517759b4696c74ecc35c6e2eb508cff/sklearn/linear_model/tests/test_sgd.py#L274, where the idea is to check that fitting with warm_start=True is equivalent to manually setting the initial parameters?
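
For reference, one possible shape for such a test (a sketch only: it assumes this PR's warm_start parameter, and instead of the exact manual-initialization equivalence used in the scikit-learn test, it checks that a second fit actually resumes from the learned state rather than restarting from scratch):

```python
import numpy as np
from numpy.testing import assert_allclose
from metric_learn import SCML


def test_scml_warm_start():
    rng = np.random.RandomState(0)
    X = rng.randn(50, 4)
    triplets = X[rng.randint(0, 50, size=(100, 3))]

    # Warm-started model: fit twice on the same triplets.
    warm = SCML(warm_start=True, n_basis=80, max_iter=200, output_iter=50,
                random_state=0)
    warm.fit(triplets)
    components_after_first_fit = warm.components_.copy()
    warm.fit(triplets)  # should resume from the previously learned weights

    # A cold-started model with the same seed and data reproduces the first
    # fit (assuming warm_start has no effect on the very first fit call) ...
    cold = SCML(warm_start=False, n_basis=80, max_iter=200, output_iter=50,
                random_state=0)
    cold.fit(triplets)
    assert_allclose(cold.components_, components_after_first_fit)

    # ... while the warm-started second fit should have moved on from it.
    assert not np.allclose(warm.components_, components_after_first_fit)
```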

@maxi-marufo maxi-marufo requested a review from bellet November 23, 2024 15:29