
Meaning of ProxGradState.error #224


The error field has solver-specific semantics in general, but as a rule of thumb, since Jaxopt is designed to be used in the context of implicit differentiation, the error field is usually related to the value of optimality_fun, i.e. in your case, the value of norm(step_size * grad(fun)(params)):

def _error(self, x, x_fun_grad, hyperparams_prox):

So you may be far away from your optimum, but at this particular point the loss landscape is flat, and this triggers early termination of the algorithm. For strongly convex problems (typically those for which Gradient Descent has convergence guarantees) t…
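To make the flat-landscape failure mode concrete, here is a minimal sketch of the stopping criterion described above, norm(step_size * grad(fun)(params)). The objective and starting point are made up for illustration (plain Python rather than JAX, with an analytic gradient): the function has a plateau far from its infimum, so the criterion falls below any reasonable tolerance even though the loss is still large.

```python
import math

# Hypothetical objective with a plateau: f(x) = 1 / (1 + exp(x)).
# Its infimum is 0 (approached as x -> +inf), but the landscape is
# nearly flat for very negative x.
def fun(x):
    return 1.0 / (1.0 + math.exp(x))

def grad_fun(x):
    # Analytic gradient: -exp(x) / (1 + exp(x))**2
    e = math.exp(x)
    return -e / (1.0 + e) ** 2

step_size = 1.0
tol = 1e-3          # a typical stopping tolerance

x = -20.0           # on the plateau, far from where f is small
error = abs(step_size * grad_fun(x))   # the criterion sketched above

print(fun(x))       # ~1.0: the loss is still large
print(error)        # ~2e-9: yet the criterion is tiny
print(error < tol)  # True -> a solver using this test would stop here
```

This is why a small error does not by itself guarantee closeness to the optimum: it only certifies that the (scaled) gradient is small at the current point.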

Answer selected by lucagrementieri