R's optimx doesn't respect my constraint


I am working on a constrained optimization problem: I have a number of variables that must sum to 100. I have followed the approach in Why does optimx in R not give the correct solution to this simple nonparametric likelihood maximization? and am attempting to minimize the norm of the gradient, as shown there. My function is relatively well behaved, and the norm looks like this:

grad.norm <- function(x) {
  lambda <- tail(x, 1)   # the Lagrange multiplier is the last element
  p <- head(x, -1)       # the p's are everything else
  sum(((test.betas * 81) / p + lambda)^2) +  # derivative terms w.r.t. each p
    (sum(p) - 100)^2                         # derivative w.r.t. lambda (the constraint)
}

I include the factor of 81 because ln(p) appears 81 times in my original equation. When I optimize with this code, seemingly regardless of where I set lambda, I get output that does not respect my constraint that sum(p) = 100.
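To spell out the setup I'm borrowing from that answer (as I understand it, so please correct me if this is wrong), the Lagrangian is roughly

$$
\mathcal{L}(p, \lambda) = \sum_i 81\,\beta_i \ln p_i + \lambda\Big(\sum_i p_i - 100\Big),
\qquad
\frac{\partial \mathcal{L}}{\partial p_i} = \frac{81\,\beta_i}{p_i} + \lambda,
\qquad
\frac{\partial \mathcal{L}}{\partial \lambda} = \sum_i p_i - 100,
$$

so grad.norm is the squared norm of that gradient, and I expected its minimizer to drive the second squared term, and hence the constraint violation, to (near) zero.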

Below is a reproducible version, with test.betas similar to what I actually observe. Notice in the output that lambda is nowhere near either of its bounds, so I don't think the box constraints on lambda are the problem.

library(optimx)

set.seed(43215)

test.betas <- c(rnorm(5, 350, 20), runif(120, 0.01, 1))

grad.norm <- function(x) {
  lambda <- tail(x, 1)
  p <- head(x, -1)
  sum(((test.betas * 81) / p + lambda)^2) +
    (sum(p) - 100)^2
}

(out <-
    optimx(c(runif(length(test.betas), 5, 10), -50),  # start: random p's, lambda = -50
           grad.norm,
           lower = c(rep(0.01, length(test.betas)), -500),
           upper = c(rep(99.99, length(test.betas)), -1),
           method = "L-BFGS-B"))

sum(out[, 1:length(test.betas)])  # = 505.4267, not the required 100
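One direction I've been wondering about is handing sum(p) = 100 to a solver that supports equality constraints directly, instead of folding it into the objective as a penalty. This is an untested sketch; I'm assuming the Rsolnp package and its solnp() interface, so treat the details as a guess rather than something I know works:

library(Rsolnp)

## equality constraint: solnp() will force this to equal eqB
eq.con <- function(x) sum(head(x, -1))

start <- c(runif(length(test.betas), 5, 10), -50)

fit <- solnp(pars  = start,
             fun   = grad.norm,
             eqfun = eq.con,
             eqB   = 100,
             LB    = c(rep(0.01, length(test.betas)), -500),
             UB    = c(rep(99.99, length(test.betas)), -1))

sum(head(fit$pars, -1))  # should come back as 100, up to solver tolerance

I'm not sure whether that is the right way to frame the problem, though.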

Any ideas on how to get the optimization to respect my constraint, or pointers to resources, would be much appreciated.
