
JAXopt with nonlinear optimization and neural networks #177

Answered by Algue-Rythme
mhr asked this question in Q&A

Hi,

It's almost that! But you cannot simply add the constraints to the objective the way you did, as objective(x) + eq_constraints(x) + ineq_constraints(x): that sum is not a meaningful objective.
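The standard way to couple the objective and the constraints is through the Lagrangian, which introduces a dual variable (multiplier) per constraint; the solution is then characterized by the KKT conditions, stated here for min_x f(x) subject to h(x) = 0 and g(x) <= 0 (this is the standard formulation, matching page 5 of the paper linked below):

$$
\begin{aligned}
\nabla_x f(x^\star) + \nu^{\star\top} \partial h(x^\star) + \lambda^{\star\top} \partial g(x^\star) &= 0 \quad &&\text{(stationarity of the Lagrangian)} \\
h(x^\star) &= 0 \quad &&\text{(primal feasibility)} \\
\lambda^\star \odot g(x^\star) = 0,\quad \lambda^\star \ge 0,\quad g(x^\star) &\le 0 \quad &&\text{(complementary slackness and feasibility)}
\end{aligned}
$$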

To enable implicit differentiation of constrained optimization problems, you need both the primal and the dual variables (see page 5 of https://arxiv.org/pdf/2105.15183.pdf). There is a function specially made to handle the KKT conditions, jaxopt._src.implicit_diff.make_kkt_optimality_fun, whose signature is:

def make_kkt_optimality_fun(obj_fun, eq_fun, ineq_fun=None):
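To make the role of the dual variables concrete, here is a minimal hand-written sketch of the kind of KKT residual that make_kkt_optimality_fun constructs. The objective, the constraints, and the (primal, dual_eq, dual_ineq) layout are illustrative assumptions, not the library's exact API:

```python
# Hand-written KKT residual for: min_x obj(x, theta)  s.t.  eq(x) = 0,  ineq(x) <= 0.
# The residual vanishes exactly at a KKT point; implicit differentiation
# linearizes this condition to obtain gradients with respect to theta.
import jax
import jax.numpy as jnp


def obj(x, theta):
    # Hypothetical objective, parameterized by theta.
    return jnp.sum((x - theta) ** 2)


def eq(x):
    # Hypothetical equality constraint h(x) = 0: entries sum to one.
    return jnp.sum(x) - 1.0


def ineq(x):
    # Hypothetical inequality constraint g(x) <= 0: x must be non-negative.
    return -x


def kkt_residual(primal, dual_eq, dual_ineq, theta):
    # Stationarity: gradient of the Lagrangian with respect to the primal variable.
    lagrangian = lambda x: obj(x, theta) + dual_eq * eq(x) + dual_ineq @ ineq(x)
    stationarity = jax.grad(lagrangian)(primal)
    # Primal feasibility of the equality constraint.
    eq_feasibility = eq(primal)
    # Complementary slackness for the inequality constraint.
    comp_slackness = dual_ineq * ineq(primal)
    return stationarity, eq_feasibility, comp_slackness
```

The optimality function returned by make_kkt_optimality_fun plays this role in JAXopt, and it can be passed to the implicit-differentiation machinery (e.g. jaxopt.implicit_diff.custom_root) so that the solver's output becomes differentiable with respect to the problem parameters.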

You will find an example of usage here.

As yo…
