Releases · FluxML/Optimisers.jl
Optimisers v0.2.0
First registered release; almost a rewrite under new management. (A sketch of the resulting API follows the pull-request list below.)
Closed issues:
- Getting Optimisers moved out of Flux (#5)
- memory impact of the functional approach (#12)
- Missing definition in the basic usage example (#26)
- Handle non-array leaves in `state(o, x)` (#28)
- Support ChainRules types (#32)
- Optimise a subset of parameters (#35)
- Port over rule changes from Flux (#38)
- Correct the optimiser's eltype? (#55)
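
Issue #35 above (and PR #36 below) concern optimising only a subset of parameters, handled by overloading `trainable`. A minimal sketch of that pattern; `Layer` and its fields are hypothetical, and the overload style follows the package's documented convention:

```julia
using Functors, Optimisers

struct Layer
    weight
    bias
    cache   # scratch space that should not be optimised
end
Functors.@functor Layer

# Restrict optimisation to a subset of fields: only the returned
# NamedTuple's entries receive gradient updates.
Optimisers.trainable(l::Layer) = (weight = l.weight, bias = l.bias)

model = Layer(rand(2, 2), zeros(2), zeros(2, 2))
state = Optimisers.setup(Descent(0.1), model)   # `cache` gets no optimiser state
```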
Merged pull requests:
- Add gradient clipping (#27) (@mcabbott)
- Optimise only at `isnumeric` leaves (#29) (@mcabbott)
- Wrap optimiser and state in a struct (#30) (@mcabbott)
- Add lazy broadcasting (#31) (@mcabbott)
- Add code coverage (#34) (@ToucheSir)
- Add `trainable` (#36) (@mcabbott)
- Make the example run (#37) (@mcabbott)
- Add some tests using StaticArrays (#39) (@mcabbott)
- Allow types from ChainRules (#41) (@mcabbott)
- Trivial cases of `OptimiserChain` (#43) (@mcabbott)
- Stop testing on windows + mac (#44) (@mcabbott)
- Disable codecov auto comment and add badge (#45) (@ToucheSir)
- Complex numbers alla Flux 1776 (#47) (@mcabbott)
- Separate `@lazy` from `@..` macro, fix some bugs (#48) (@mcabbott)
- Ensure that `update` without a gradient is an error (#50) (@mcabbott)
- Add `destructure`, take II (#54) (@mcabbott)
- Explicitly preserve eltype (#56) (@mcabbott)
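
The pull requests above together define the v0.2 interface: the rule and its state are wrapped in a struct (#30), rules compose via `OptimiserChain` (#43), gradients can be clipped (#27), and models can be flattened with `destructure` (#54). A minimal sketch of how these pieces fit together, under the API as of this release; `model` and `grad` are made-up stand-ins, and later versions may differ in detail:

```julia
using Optimisers

model = (weight = rand(3, 3), bias = zeros(3))    # any nested (functor) structure

# Compose rules: clip each gradient entry to ±1, then plain gradient descent.
rule = OptimiserChain(ClipGrad(1.0), Descent(0.1))

# setup pairs the rule with per-leaf state, mirroring the model's structure.
state = Optimisers.setup(rule, model)

grad = (weight = ones(3, 3), bias = ones(3))      # stand-in for a real gradient
state, model = Optimisers.update(state, model, grad)  # returns new state and new model

# destructure flattens the trainable parameters into one vector,
# plus a function that rebuilds the original structure.
flat, re = destructure(model)
model2 = re(flat)
```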
Optimisers v0.1.0
Merged pull requests:
- add ADAM (#3) (@DhairyaLGandhi)
- add GA ci (#6) (@DhairyaLGandhi)
- Add some more Optimisers (#7) (@DhairyaLGandhi)
- Remove noarg constructors for some optimisers (#8) (@DhairyaLGandhi)
- Move all optimizers to Optimisers.jl (#9) (@darsnack)
- Add optimiser updates utility (#14) (@DhairyaLGandhi)
- RFC: Switch interface for higher-order optimizers (#16) (@darsnack)
- fix Descent apply rule (#17) (@DhairyaLGandhi)
- Add docs (#19) (@DhairyaLGandhi)
- Update compat sections for registration (#22) (@DhairyaLGandhi)