How to properly specify Jacobian & Hessian functions of inequality constraints in Optim

I’m trying to use the Optim package in Julia to optimize an objective function with 19 variables, and the following inequality constraints:

0 <= x[1]/3 - x[2] <= 1/3 
5 <= 1/x[3] + 1/x[4] <= 6
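(If I understand Optim's interval-constraint convention correctly, these bounds would be expressed as the vectors

   lc = [0.0, 5.0]    # lower bounds on the two constraint values
   uc = [1/3, 6.0]    # upper bounds on the two constraint values

but please correct me if that mapping is wrong.)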

I’m trying to use either IPNewton() or NewtonTrustRegion(), so I need to supply both a Jacobian and a Hessian for the constraints. My question is: what is the correct way to write the Jacobian and Hessian functions?
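For reference, here is roughly how I was planning to wire everything together, using the lc and uc vectors above (obj, obj_grad!, obj_hess!, and x0 are placeholders for my actual objective, its derivatives, and my starting point):

   df  = TwiceDifferentiable(obj, obj_grad!, obj_hess!, x0)
   lx, ux = Float64[], Float64[]    # empty, i.e. no box constraints on x itself
   dfc = TwiceDifferentiableConstraints(con_c!, con_jacobian!, con_h!, lx, ux, lc, uc)
   res = optimize(df, dfc, x0, IPNewton())

where con_c!, con_jacobian!, and con_h! are the constraint functions I try to write below.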

I believe the constraint function would be

function con_c!(c, x)
   c[1] = x[1]/3 - x[2]      # first constraint value
   c[2] = 1/x[3] + 1/x[4]    # second constraint value
   c
end

Would the Jacobian function then be the following?

function con_jacobian!(J, x)
   # first constraint: x[1]/3 - x[2]
   J[1,1] = 1/3
   J[1,2] = -1.0
   # second constraint: 1/x[3] + 1/x[4]
   J[2,3] = -1/x[3]^2
   J[2,4] = -1/x[4]^2
   J
end

(I assume all other entries of J are automatically set to zero?)

My main question: what would the Hessian function be? This is where I’m most confused. My understanding was that Hessians are defined for scalar-valued functions, so do we have to supply multiple Hessians, one for each constraint function (two in my case)?

I’ve looked at the multiple-constraints example given here: https://github.com/JuliaNLSolvers/ConstrainedOptim.jl, but I’m still confused. In the example, it looks like they are adding two Hessian matrices together…? Would greatly appreciate some help.
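My best guess, going off that example, is that the Hessian function receives the vector of constraint multipliers λ and adds the λ-weighted Hessian of each constraint into h, something like this (the first constraint is linear, so it would contribute nothing):

   function con_h!(h, x, λ)
      # constraint 1: x[1]/3 - x[2] is linear, so its Hessian is zero
      # constraint 2: second derivatives of 1/x[3] + 1/x[4]
      h[3,3] += λ[2] * 2/x[3]^3
      h[4,4] += λ[2] * 2/x[4]^3
      h
   end

Is that the right pattern, or am I misreading the example?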

Full disclosure: I posted this question on Discourse two days ago but didn't receive a single response, which is why I'm posting it here.
