Optax 0.1.5
What's Changed
- Fix arXiv link to Optax Optimistic Gradient Descent optimizer by @8bitmp3 in #458
- Fix the Yogi optimizer paper year, change link to NeurIPS site by @8bitmp3 in #461
- Add exponent to cosine decay schedule and warmup + cosine decay by @copybara-service in #476 (see the schedule sketch after this list)
- Fix typos in docstring by @pomonam in #480
- Fix global_norm() signature by @brentyi in #481
- Fix `inject_hyperparams()` for python < 3.10. by @copybara-service in #486
- Fixed NaN issues in `kl_divergence` loss function by @LukasMut in #473
- feat(ci/tests): bump `setup-python` version and enable cache by @SauravMaheshkar in #485
- Better tests for utils by @acforvs in #465
- Run GitHub CI every day at 03:00. by @copybara-service in #490
- Fix JIT for `piecewise_interpolate_schedule`, `cosine_onecycle_schedule`, `linear_onecycle_schedule` by @brentyi in #504
- Explicitly export "softmax_cross_entropy_with_integer_labels" by @nasyxx in #499 (see the loss sketch after this list)
- Add the Lion optimizer, discovered by symbolic program search. by @copybara-service in #500 (see the optimizer sketch after this list)
- Replaces references to jax.numpy.DeviceArray with jax.Array. by @copybara-service in #511
- Update pytypes. by @copybara-service in #514
- Fix pytype failures related to teaching pytype about NumPy scalar types. by @copybara-service in #517
- Release v0.1.5. by @copybara-service in #523
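
The new exponent option in the cosine schedules (#476) controls how sharply the learning rate decays. Below is a minimal sketch, assuming the keyword is named `exponent` and the other arguments keep their existing names in the optax schedules API:

```python
import optax

# Minimal sketch of the new exponent option; argument names are assumed
# from the existing optax schedules API, and exponent=1.0 should recover
# the previous behaviour.
schedule = optax.warmup_cosine_decay_schedule(
    init_value=0.0,     # learning rate at step 0
    peak_value=1e-3,    # learning rate reached after warmup
    warmup_steps=100,
    decay_steps=1_000,
    end_value=0.0,
    exponent=2.0,       # raises the cosine decay factor to a power
)
print(schedule(0), schedule(100), schedule(1_000))
```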
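With #499 the integer-label cross-entropy loss is exported at the top level. A short usage sketch, with shapes chosen purely for illustration:

```python
import jax.numpy as jnp
import optax

# Sketch only: logits have shape [batch, num_classes], labels are
# integer class indices with shape [batch].
logits = jnp.array([[2.0, 0.5, -1.0],
                    [0.1, 1.5, 0.3]])
labels = jnp.array([0, 1])
loss = optax.softmax_cross_entropy_with_integer_labels(logits, labels)
print(loss.shape)  # per-example losses: (2,)
```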
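The Lion optimizer (#500) plugs into the usual optax GradientTransformation workflow. A minimal sketch, assuming `optax.lion` takes a `learning_rate` like the library's other optimizers; the values and parameter tree are illustrative only:

```python
import jax.numpy as jnp
import optax

# Hedged sketch: optax.lion is used here like any other optax optimizer.
params = {"w": jnp.ones((3,))}
optimizer = optax.lion(learning_rate=1e-4)
opt_state = optimizer.init(params)

grads = {"w": jnp.full((3,), 0.5)}  # stand-in for real gradients
updates, opt_state = optimizer.update(grads, opt_state, params)
params = optax.apply_updates(params, updates)
```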
New Contributors
- @pomonam made their first contribution in #480
- @brentyi made their first contribution in #481
- @LukasMut made their first contribution in #473
- @SauravMaheshkar made their first contribution in #485
- @acforvs made their first contribution in #465
Full Changelog: v0.1.4...v0.1.5