Solving a non-convex optimization problem using Python


I need to maximize a two-variable non-convex function, which contains a log to the base 2.

I tried CVXPY, but it raises an error:

cvxpy.error.DCPError: Problem does not follow DCP rules.

The function is like this:

import cvxpy as cp
import numpy as np

x = cp.Variable(2, nonneg=True)
A = 23
B = 34
C = 45
Ob = cp.Minimize(-cp.log(1 + x[0] / (A * x[1] + C)) / np.log(2)
                 - cp.log(1 + B * x[1]) / np.log(2))
constraints = [0 <= x, x <= 1, cp.sum(x) == 1, x[0] >= x[1]]
prob = cp.Problem(Ob, constraints)
result = prob.solve()

where A, B, and C are just constants.

Should I use another Python library to solve the problem, and what does the DCP error mean?

2 Answers


As far as I understand, you are running into a fundamental limitation of CVXPY: it only accepts problems that follow the disciplined convex programming (DCP) rules, which is exactly what the DCPError is telling you. Your objective contains the ratio x[0]/(A*x[1] + C) of two affine expressions, whose curvature CVXPY cannot certify as convex, so the problem is rejected.

I suggest using scipy.optimize instead.
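A minimal sketch of that approach, assuming the constants and constraints from the question; scipy.optimize.minimize with the SLSQP method handles the equality constraint, the inequality, and the box bounds directly:

```python
import numpy as np
from scipy.optimize import minimize

A, B, C = 23, 34, 45

def objective(x):
    # Negated sum of base-2 logs, so minimizing this maximizes the original.
    return -np.log2(1 + x[0] / (A * x[1] + C)) - np.log2(1 + B * x[1])

constraints = [
    {"type": "eq", "fun": lambda x: x[0] + x[1] - 1},  # x[0] + x[1] == 1
    {"type": "ineq", "fun": lambda x: x[0] - x[1]},    # x[0] >= x[1]
]
bounds = [(0, 1), (0, 1)]

res = minimize(objective, x0=[0.7, 0.3], bounds=bounds,
               constraints=constraints, method="SLSQP")
print(res.x, res.fun)  # expect roughly [0.5, 0.5] and about -4.18
```

Keep in mind that SLSQP is a local solver; for a genuinely non-convex problem it is worth trying several starting points (or a global method such as scipy.optimize.differential_evolution) to gain confidence in the result.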


You can use Mathematica to solve the optimization problem.

Clear["Global`*"];
(*Define the variables*)
vars = {x1, x2};

(*Define the constants*)
A1 = 23;
B1 = 34;
C1 = 45;

(*Define the objective function*)
objective = -Log[2, 1 + x1/(A1*x2 + C1)] - Log[2, 1 + B1*x2];

(*Define the constraints*)
constraints = {0 <= x1, x1 <= 1, 0 <= x2, x2 <= 1, x1 + x2 == 1, 
   x1 >= x2};

(*Solve the optimization problem*)
solution = NMinimize[{objective, constraints}, vars];

(*Display the result*)
solution // Print
(* {-4.18264, {x1 -> 0.5, x2 -> 0.5}} *)