
New version: MLJFlux v0.5.0 #108693

Merged: 1 commit into master from registrator-mljflux-094fc8d1-v0.5.0-151880cf2b on Jun 11, 2024

Conversation

JuliaRegistrator (Contributor):

- (**new model**) Add `NeuralNetworkBinaryClassifier`, an optimised form of `NeuralNetworkClassifier` for the special case of two target classes, with `Flux.σ` in place of `softmax` as the default finaliser (#248). See the first sketch after this list.
- (**internals**) Switch from implicit to explicit differentiation (#251)
- (**breaking**) Use optimisers from Optimisers.jl instead of Flux.jl (#251). Note that the new optimisers are immutable.
- (**RNG changes**) Change the default value of the model field `rng` from `Random.GLOBAL_RNG` to `Random.default_rng()`, and change the seeded RNG, obtained by specifying an integer value for `rng`, from `MersenneTwister` to `Xoshiro` (#251)
- (**RNG changes**) Update the `Short` builder so that the `rng` argument of `build(::Short, rng, ...)` is passed on to the `Dropout` layer, as these layers now support this on a GPU, at least for `rng=Random.default_rng()` (#251)
- (**weakly breaking**) Change the implementation of L1/L2 regularization from explicit loss penalization to weight/sign decay (internally chained with the user-specified optimiser). The only breakage for users is that the losses reported in the history are no longer penalized, because the penalty is never explicitly computed (#251). See the second sketch after this list.
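
For users upgrading, here is a minimal sketch combining the first, third, and fourth items above. It assumes an environment with MLJ, MLJFlux v0.5.0, and Optimisers.jl installed; `make_moons` and the `epochs` hyperparameter follow the MLJ/MLJFlux documentation, and the learning rate shown is an arbitrary choice:

```julia
using MLJ                 # assumes MLJ, MLJFlux and Optimisers are installed
import MLJFlux, Optimisers

# Load the model type added in this release:
NeuralNetworkBinaryClassifier = @load NeuralNetworkBinaryClassifier pkg=MLJFlux

# Optimisers are now the immutable ones from Optimisers.jl, and an integer
# `rng` now seeds a `Xoshiro` generator instead of a `MersenneTwister`:
clf = NeuralNetworkBinaryClassifier(
    optimiser = Optimisers.Adam(0.001),
    rng = 123,            # equivalent to `rng = Xoshiro(123)` under v0.5.0
    epochs = 10,
)

# Toy two-class problem (`make_moons` ships with MLJ):
X, y = make_moons(200)
mach = machine(clf, X, y)
fit!(mach)
```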
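
And on the last item: regularization is still requested through the existing `lambda` (strength) and `alpha` (L1/L2 mix) hyperparameters; only its implementation and reporting change. A hedged sketch continuing from the snippet above, assuming the `training_losses` entry of the report as documented for MLJFlux models:

```julia
# Regularization is now applied as weight/sign decay chained into the
# optimiser, so the penalty never enters the computed loss:
clf_reg = NeuralNetworkBinaryClassifier(lambda = 0.01, alpha = 0.5, epochs = 10)
mach_reg = machine(clf_reg, X, y)
fit!(mach_reg)

# Under v0.5.0 these per-epoch losses are unpenalized; under v0.4.x they
# included the explicit L1/L2 penalty term:
losses = report(mach_reg).training_losses
```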

UUID: 094fc8d1-fd35-5302-93ea-dabda2abf845
Repo: https://github.com/FluxML/MLJFlux.jl.git
Tree: 2fcdce39d979f2865aaa82d5750c6ee4ce543f4d

Registrator tree SHA: 17aec322677d9b81cdd6b9b9236b09a3f1374c6a

Your new version pull request met all of the guidelines for auto-merging and is scheduled to be merged in the next round.


If you want to prevent this pull request from being auto-merged, simply leave a comment. If you want to post a comment without blocking auto-merging, you must include the text [noblock] in your comment. You can edit blocking comments, adding [noblock] to them in order to unblock auto-merging.

@JuliaTagBot JuliaTagBot merged commit a9ee273 into master Jun 11, 2024
22 checks passed
@JuliaTagBot JuliaTagBot deleted the registrator-mljflux-094fc8d1-v0.5.0-151880cf2b branch June 11, 2024 01:14