[ENH] sklearn compatible tuning wrapper estimator #85
Comments
This is the estimator I'd use as a template:
I wrote the documentation for a possible design of the sklearn integration API in version 4.8 to show my current progress on this issue: Thoughts or suggestions on the progress or missing features?
Looks reasonable, although the docstring seems to imply the search space can only be a grid. Is this intended?
It is. What should the search space look like instead?
I was thinking of an abstract search space? In sklearn, for example, RandomizedSearchCV accepts distributions rather than only grids.
I did not have this on my radar, to be honest. I knew of the distributions for RandomizedSearchCV, though. I would like to add some or all of those features to Hyperactive. For this I see two ways of getting those features:
I would go for the first way.
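For reference, sklearn itself draws exactly this distinction: GridSearchCV takes an explicit grid, while RandomizedSearchCV also accepts scipy.stats distributions to sample from. A minimal illustration (standard sklearn API, not Hyperactive code):

```python
from scipy.stats import uniform
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

est = LogisticRegression(solver="liblinear")

# Grid search space: every candidate value is enumerated explicitly.
grid_search = GridSearchCV(est, param_grid={"C": [0.01, 0.1, 1, 10]})

# Distribution-based search space: candidates are sampled from a
# continuous distribution, so the space is not restricted to a grid.
random_search = RandomizedSearchCV(
    est, param_distributions={"C": uniform(loc=0.01, scale=10)}, n_iter=20
)
```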
I did have an abstract search space, whose type is specific to the tuner, in mind with this: #93 For now, I would suggest that supporting a minimal version makes sense; it can later be extended without breaking lower-version interfaces?
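To make that idea concrete, here is a sketch under my own assumptions (the `Discrete`, `Continuous`, and `validate_space` names are hypothetical, not from Hyperactive or #93): each tuner declares which dimension types it accepts, and the space can later grow new dimension types without breaking existing tuners.

```python
from dataclasses import dataclass
from typing import Dict, Sequence, Union

@dataclass
class Discrete:
    """A finite set of candidate values; every tuner can handle this."""
    values: Sequence

@dataclass
class Continuous:
    """A bounded real interval; only some tuners handle this natively."""
    low: float
    high: float

# Hypothetical abstract search space: parameter name -> dimension.
SearchSpace = Dict[str, Union[Discrete, Continuous]]

def validate_space(space: SearchSpace, supports_continuous: bool) -> None:
    """Reject dimension types that the chosen tuner does not support."""
    for name, dim in space.items():
        if isinstance(dim, Continuous) and not supports_continuous:
            raise ValueError(
                f"Dimension {name!r} is continuous, but this tuner only "
                "supports discrete search spaces."
            )
```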
I suppose in addition to the current interface, to avoid a major version change? From what I can tell, this interface would be suited to act as a general optimization interface in the future and could replace the current one. I would suggest adding this new interface as a "beta" or "experimental" feature, similar to this. That signals that it might be subject to change within a major version. This way we are free to refine the interface with more flexibility.
Edit: Just a small addition to your comment about the abstract search space: I just want to make clear that adding support for continuous ranges in general would be quite a challenge. Many optimization algorithms are specialized to work in either discrete or continuous search spaces. The easiest approach would be to add a sampling algorithm to each discrete optimizer, so that it transforms the continuous space into a discrete one internally. This might work, but it would also just "hide" the discretisation of the search space from the user.
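To illustrate that last point, a minimal sketch of such an internal sampling step (a hypothetical helper, not Hyperactive code):

```python
import numpy as np

def discretize(low: float, high: float, n_samples: int = 100, seed=None):
    """Turn a continuous range into a finite candidate set by sampling.

    This is the kind of step a discrete optimizer would perform internally;
    the search still happens over a discretised space, it is merely hidden
    from the user.
    """
    rng = np.random.default_rng(seed)
    return np.sort(rng.uniform(low, high, size=n_samples))

# The sampled values can then be treated like any other grid dimension.
search_space = {"learning_rate": discretize(1e-4, 1e-1, n_samples=50)}
```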
I would suggest exposing the tuners as sklearn-compatible tuning wrappers, e.g., `HyperactiveCV(sklearn_estimator, config)`, or `HyperactiveCV(sklearn_estimator, hyperopt_tuning_algo, config)`, where `HyperactiveCV` inherits from sklearn's `BaseEstimator` and gets tested by `parametrize_with_checks` in the CI.
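A minimal sketch of what such a wrapper might look like, with a plain grid loop standing in for whatever Hyperactive optimizer would actually run there (this is not the library's implementation):

```python
import numpy as np
from sklearn.base import BaseEstimator, clone
from sklearn.model_selection import ParameterGrid, cross_val_score

class HyperactiveCV(BaseEstimator):
    """Sketch of a sklearn-compatible tuning wrapper.

    Only the estimator contract is illustrated: get_params/set_params come
    from BaseEstimator, and fit stores its results in trailing-underscore
    attributes, as sklearn's conventions require.
    """

    def __init__(self, estimator, param_grid, cv=3):
        self.estimator = estimator
        self.param_grid = param_grid
        self.cv = cv

    def fit(self, X, y=None):
        best_score, best_params = -np.inf, None
        # Stand-in for the Hyperactive tuning loop: plain grid enumeration.
        for params in ParameterGrid(self.param_grid):
            candidate = clone(self.estimator).set_params(**params)
            score = cross_val_score(candidate, X, y, cv=self.cv).mean()
            if score > best_score:
                best_score, best_params = score, params
        self.best_params_ = best_params
        self.best_estimator_ = (
            clone(self.estimator).set_params(**best_params).fit(X, y)
        )
        return self

    def predict(self, X):
        return self.best_estimator_.predict(X)
```

The CI check the proposal refers to would then look roughly like the following; note that passing all of sklearn's checks usually takes more care (input validation, tags, etc.) than this bare sketch provides:

```python
from sklearn.linear_model import LogisticRegression
from sklearn.utils.estimator_checks import parametrize_with_checks

@parametrize_with_checks(
    [HyperactiveCV(LogisticRegression(), {"C": [0.1, 1.0]})]
)
def test_sklearn_compatible_estimator(estimator, check):
    check(estimator)
```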