Meaning of ProxGradState.error #224
-
Hello, I'm using `ProximalGradient`. At the beginning I was assuming that the state attribute `error` …
Replies: 1 comment
-
The `error` field has different semantics for each solver in general, but as a general rule of thumb, since JAXopt is designed to be used in the context of implicit differentiation, the `error` field is usually related to the value of `optimality_fun`, i.e. in your case the value of `norm(step_size * grad(fun)(params))`:

jaxopt/jaxopt/_src/proximal_gradient.py, line 176 in 52cbdd4

So you may be far away from your optimum, but at this particular point the loss landscape is flat, and that triggers early termination of the algorithm. For strongly convex problems (typically those for which gradient descent has convergence guarantees), this means you are close to the optimum. For other problems, well, it is not a great criterion. Hence, I suggest you set …

As a concluding remark, I would like to say that for constrained problems a huge (non-zero) value of the loss is expected even at the optimum, so you may also want to do this sanity check.
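The criterion described above can be sketched in plain NumPy. Everything below (the quadratic objective, the plateau loss, and all their names) is an illustrative assumption, not code from JAXopt:

```python
import numpy as np

# Part 1: the error criterion on a toy quadratic, fun(p) = sum((p - 1)^2).
# Its gradient is 2 * (p - 1); both are assumptions for illustration.
def grad_quadratic(params):
    return 2.0 * (params - 1.0)

step_size = 0.1
params = np.zeros(3)
# error = norm(step_size * grad(fun)(params)), as in the stopping criterion
error_quadratic = np.linalg.norm(step_size * grad_quadratic(params))
print(error_quadratic)  # ~0.346: clearly nonzero, still far from the optimum at params = 1

# Part 2: why a small error can be misleading on a flat landscape.
# Toy loss 1 - sigmoid(x): minimized as x -> +inf, but nearly flat
# for very negative x.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def loss_plateau(x):
    return 1.0 - sigmoid(x)

def grad_plateau(x):
    s = sigmoid(x)
    return -s * (1.0 - s)

x_far = -20.0  # on the plateau, far from the (asymptotic) optimum
error_plateau = abs(1.0 * grad_plateau(x_far))
print(error_plateau)  # ~2e-9: small enough to trigger early termination,
                      # even though loss_plateau(x_far) is still close to 1.0
```

The second part is exactly the failure mode mentioned in the reply: on a plateau the scaled gradient norm is tiny, so the solver stops early even though the loss is nowhere near its infimum.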