Scipy Optimise with Mystic - constraint keeps getting violated


I'm trying to optimize a 52x5 matrix to maximize a return value y. I first flatten the matrix into an array of 260 elements, then apply scipy.optimize with a mystic constraint. However, the max_limit constraint keeps getting violated. Why?
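For context, flattening a 52x5 matrix into a 260-element list can be done with a list comprehension (an illustrative sketch; the actual flattening code isn't shown in the question, and the matrix here is dummy data):

```python
# Build a dummy 52x5 nested-list matrix (stands in for the real data).
matrix = [[row * 5 + col for col in range(5)] for row in range(52)]

# Flatten row by row into a single list of 260 values.
flat = [value for row in matrix for value in row]
assert len(flat) == 260
```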

Please see the main part of the code below:

max_limit = 2000

def constraint_func():
    var_number = ['x'+str(i) for i in range(260)]
    constraint = ' + '.join(var_number) + f' <= {max_limit}'

    return constraint

eqns = ms.simplify(constraint_func(), all=True)
constraint = ms.generate_constraint(ms.generate_solvers(eqns), join=my.constraints.and_)

def objective_func(x):
    constraint_vars = constraint(x)
    y = -model.func(constraint_vars)
    return y

initial_matrix = [random.randint(0,3) for i in range(260)]

output = so.minimize(objective_func, initial_matrix, method='SLSQP', bounds=[(0,max_limit)]*260, tol=0.01, options={'disp': True, 'maxiter': 100})

Best answer:

I'm the mystic author. Your code above is incomplete, so I'm going to make some assumptions to produce a full working example.

I'm guessing the constraints are not actually being violated; rather, you are looking at the result from the scipy.optimize result object, which preserves the unconstrained value of x.

Let's use a simple model function that's just the sum of the parameter vector. I also note that you only have a single constraint (i.e. sum(x) <= 2000), so you don't need all=True or join=and_ in the constraints definition. It doesn't hurt to keep them, however.
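For intuition: the constraint function that ms.generate_constraint builds from the symbolic string transforms any candidate x into one satisfying the inequality, roughly by solving the simplified form for one variable (e.g. x0 <= 2000 - (x1 + ... + x259)) and adjusting that variable when the inequality is violated. A rough stdlib-only analogue of that behavior (illustrative only; mystic derives the real solver from the symbolic string, and bounds handling is ignored here):

```python
max_limit = 2000

def apply_constraint(x):
    """Rough analogue of the generated constraint: when sum(x) exceeds
    the limit, lower x[0] so that x0 <= max_limit - (x1 + ... + x259)
    holds with equality. (Illustrative only; mystic's generated solver
    is built from the symbolic inequality, and this sketch ignores the
    (0, max_limit) bounds on each variable.)"""
    excess = sum(x) - max_limit
    if excess > 0:
        x = [x[0] - excess] + list(x[1:])
    return x

x = apply_constraint([10.0] * 260)  # sum was 2600; now reduced to the limit
```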

Python 3.8.16 (default, Dec  7 2022, 05:25:02) 
[Clang 10.0.1 (clang-1001.0.46.4)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import random
>>> import mystic as my
>>> import mystic.symbolic as ms
>>> import scipy.optimize as so
>>> 
>>> def model_func(x):
...     return sum(x)
... 
>>> max_limit = 2000
>>> 
>>> def constraint_func():
...     var_number = ['x'+str(i) for i in range(260)]
...     constraint = ' + '.join(var_number) + f' <= {max_limit}'
...     return constraint
... 
>>> eqns = ms.simplify(constraint_func(), all=True)
>>> constraint = ms.generate_constraint(ms.generate_solvers(eqns), join=my.constraints.and_)
>>> 

The problem is that scipy tracks your objective_func in the result object: it records y for a given x, where y = objective_func(x). As written, the constrained x that you are interested in is only known inside objective_func, as constraint_vars. So, if you are ok with being a little inefficient (i.e. being a bit lazy and doing a minimal rewrite of your code), we can use one of mystic's monitors to capture the value of constraint_vars. I'll use a callback to do that, so we record the desired values after each iteration.

>>> mon = my.monitors.VerboseMonitor(1)
>>> 
>>> def objective_func(x):
...     constraint_vars = constraint(x)
...     y = -model_func(constraint_vars)
...     return y
... 
>>> def callback(x):
...     constraint_vars = constraint(x)
...     y = -model_func(constraint_vars)
...     mon(constraint_vars, y)
...     return
... 
>>> initial_matrix = [random.randint(0,3) for i in range(260)]

We can see the results printed by the verbose monitor, and we can see the difference between extracting the results from the monitor and extracting them from the result object.

>>> output = so.minimize(objective_func, initial_matrix, method='SLSQP', bounds=[(0,max_limit)]*260, tol=0.01, options={'disp': True, 'maxiter': 100}, callback=callback)
Generation 0 has ChiSquare: -681.0
Generation 1 has ChiSquare: -1980.9999999999995
Generation 2 has ChiSquare: -1999.9999999999961
Generation 3 has ChiSquare: -1999.9999999999998
Optimization terminated successfully    (Exit mode 0)
            Current function value: -1999.9999999999998
            Iterations: 4
            Function evaluations: 1050
            Gradient evaluations: 4
>>> 
>>> print("cost: %s" % output.fun)
cost: -1999.9999999999998
>>> print("unconstrained: %s" % model_func(output.x))
unconstrained: 2102.450852711366
>>> print("constrained: %s" % model_func(mon.x[-1]))
constrained: 1999.9999999999998
>>> print("callback: %s" % mon.y[-1])
callback: -1999.9999999999998

The constrained solution is found at mon.x[-1].
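Finally, since the original problem is a 52x5 matrix, the constrained flat vector from the monitor can be reshaped back into matrix form (a stdlib sketch, using a dummy vector in place of mon.x[-1]):

```python
flat = [1.0] * 260  # stands in for mon.x[-1], the constrained flat solution
# Slice the flat vector back into 52 rows of 5 values each.
matrix = [flat[i * 5:(i + 1) * 5] for i in range(52)]
assert len(matrix) == 52 and all(len(row) == 5 for row in matrix)
```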