Compiling hard constraints with Mystic optimisation


I am writing a constrained integer optimization and am having trouble formulating the constraints. Here is a summary of my optimization (schedule is my own function that returns a single int value):

Objective: MIN (1500 * x1) + (625 * x2) +(100 * x3)

Constraints:

  1. schedule(x1,x2,x3) >=480

  2. x2 > x1

  3. x2 > x3

Bounds: (5<= x1 <=50), (5<= x2 <=100), (1<= x3 <=20)

What is the best way to group the hard constraints to pass to diffev2, given that constraint 1 calls a function and so cannot be written symbolically?

Below is a rough attempt. Another minor issue I have experienced is the bounds being breached (values went negative), so if there are any red flags in how I have set the bounds, please say. I'm also very new to mystic, so don't worry about explaining things in simple terms.

import mystic as ms
import mystic.symbolic as msym
import numpy as np

def objective(x):
    rounded = np.around(x)
    integer = rounded.astype(int)
    x1, x2, x3 = integer
    return (1500 * x1) + (625 * x2) + (100 * x3)


bounds = [(5,50),(5,100),(1,20)]


def constraint1(x):
    rounded = np.around(x)
    integer = rounded.astype(int)
    x1, x2, x3 = integer
    return 480 - schedule(x1, x2, x3)  # <= 0 when schedule(x1,x2,x3) >= 480

eqns = '''
x2 > x1
x2 > x3
'''
cons = msym.generate_constraint(msym.generate_solvers(msym.simplify(eqns)))

constraint = ms.constraints.and_(cons, constraint1) #I know this is wrong but I want to join them

from mystic.solvers import diffev2
from mystic.monitors import VerboseMonitor
mon = VerboseMonitor(10)

result = diffev2(objective,x0=bounds, bounds=bounds, constraints=constraint, npop=50, gtol=200, \
                  disp=False, full_output=True, itermon=mon)
Best answer:

I'd slightly rewrite it like this...

Objective: MIN (1500 * x0) + (625 * x1) + (100 * x2)

Constraints:

  1. schedule(x0,x1,x2) >= 480

  2. x1 > x0

  3. x1 > x2

  4. x0, x1, x2 are integers

Bounds: (5 <= x0 <= 50), (5 <= x1 <= 100), (1 <= x2 <= 20)

Here, we define schedule as a function, along with the bounds and the objective.

>>> import mystic as my
>>> import mystic.symbolic as ms
>>> import numpy as np
>>>
>>> def objective(x):
...     x0,x1,x2 = x
...     return (1500 * x0) + (625 * x1) + (100 * x2)
...
>>> bounds = [(5,50),(5,100),(1,20)]
>>>
>>> def schedule(x0,x1,x2):
...     return x0 * x1 - x2 * x2
...

Then we start building the constraints. Constraints functions always have the form x' = constraints(x) (i.e. they take a parameter vector and return a parameter vector). Thus, we have two special cases here: (1) the symbolic equations, for which we will build x' = cons(x), and (2) the constraint on the output of schedule, which needs a bit of round-about logic to construct.
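To make the x' = constraints(x) signature concrete, here is a minimal hand-written example (plain Python, not generated by mystic) that maps a parameter vector to one satisfying the bounds by clipping each parameter into its interval:

```python
# A constraints function maps a parameter vector to a (possibly
# modified) parameter vector that satisfies the constraint.
def clip_to_bounds(x, bounds=((5, 50), (5, 100), (1, 20))):
    # clamp each parameter into its (lower, upper) interval
    return [min(max(xi, lo), hi) for xi, (lo, hi) in zip(x, bounds)]

print(clip_to_bounds([0, 200, 10]))   # -> [5, 100, 10]
print(clip_to_bounds([10, 10, 10]))   # already feasible: unchanged
```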

>>> eqns = '''
... x1 > x0
... x2 < x1
... '''
>>>
>>> ints = np.round
>>> and_ = my.constraints.and_
>>> cons = ms.generate_constraint(ms.generate_solvers(ms.simplify(eqns)), join=and_)
>>>
>>> cons([1,2,3])
[1, 2, 1.999999999999997]
>>> cons([5,5,1])
[5, 5.000000000000006, 1]
>>>

Note that I've rewritten the symbolic constraints so that a different variable appears on the left-hand side of each equation. Without and_, the constraint that is built would just use the last equation; using and_ forces an internal optimization to be run so that both are respected. Not reusing variables on the left-hand side also makes that optimization easier to solve. Lastly, I check that the constraints behave as expected.
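To see why the join matters, here's a toy caricature in plain Python (not mystic's internals): when both inequalities are solved for the same left-hand variable x1, a naive join that keeps only the last solver silently drops the first rule, while applying both solvers together satisfies both. (mystic's and_ actually runs an internal optimization rather than simple chaining, but the failure mode it prevents is the same.)

```python
# Both inequalities are solved for the SAME variable x1, so a naive
# join that keeps only the last solver loses the first equation.
def solve_x1_gt_x0(x):            # "x1 > x0", solved for x1
    x0, x1, x2 = x
    return [x0, max(x1, x0 + 1), x2]

def solve_x1_gt_x2(x):            # "x1 > x2", solved for x1
    x0, x1, x2 = x
    return [x0, max(x1, x2 + 1), x2]

naive = solve_x1_gt_x2            # keeps only the last equation
print(naive([9, 1, 3]))           # [9, 4, 3] -- x1 > x0 is violated!

joint = lambda x: solve_x1_gt_x2(solve_x1_gt_x0(x))  # respect both
print(joint([9, 1, 3]))           # [9, 10, 3] -- both satisfied
```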

Note also I didn't use the integer constraint just yet. We can apply that when we put all the constraints together.

To apply a constraint on the output of schedule, we first build a penalty, then convert it to a constraint. This is not efficient, as it again requires an internal optimization.
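Conceptually, a linear inequality penalty is zero when the condition holds and grows linearly with the violation. This pure-Python sketch models that behavior (mystic's actual linear_inequality decorator has more machinery, e.g. a tunable multiplier, so treat the k*max(0, ...) form as an illustration, not mystic's exact implementation):

```python
def schedule(x0, x1, x2):         # the stand-in schedule from above
    return x0 * x1 - x2 * x2

def penalty1(x):                  # <= 0 exactly when schedule(...) >= 480
    x0, x1, x2 = x
    return 480 - schedule(x0, x1, x2)

# a linear inequality penalty: zero when feasible, linear in the violation
def penalty(x, k=1.0):
    return k * max(0.0, penalty1(x))

print(penalty([5, 5, 1]))         # 456.0 -- schedule is 24, far below 480
print(penalty([50, 50, 10]))      # 0.0   -- schedule is 2400, feasible
```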

>>> def penalty1(x):
...     x0,x1,x2 = x
...     return 480 - schedule(x0,x1,x2)
...
>>> @my.penalty.linear_inequality(penalty1)
... def penalty(x):
...     return 0.0
...
>>> lb,ub = zip(*bounds)
>>> c = my.constraints.as_constraint(penalty, lower_bounds=lb, upper_bounds=ub, nvars=3)
>>> c([5,5,1])
[13.126545665004528, 44.97820356778645, 1.0138152329128338]
>>> schedule(*_)
589.3806217359323
>>> c([50,50,10])
[50.0, 50.0, 10.0]
>>> schedule(*_)
2400.0
>>>

Then we put it all together, again using and_.

>>> constraint = and_(c, cons, ints)
>>>
>>> constraint([5,5,1])
[23.0, 30.0, 2.0]
>>> c(_)
[23.0, 30.0, 2.0]
>>> cons(_)
[23.0, 30.0, 2.0]
>>> objective(_)
53450

Lastly, we solve.

>>> from mystic.solvers import diffev2
>>> from mystic.monitors import VerboseMonitor
>>> mon = VerboseMonitor(10)
>>>
>>> result = diffev2(objective,x0=bounds, bounds=bounds, constraints=constraint, npop=50, gtol=100, disp=False, full_output=True, itermon=mon)
Generation 0 has ChiSquare: 43850.000000
Generation 10 has ChiSquare: 42725.000000
Generation 20 has ChiSquare: 42725.000000
Generation 30 has ChiSquare: 42725.000000
Generation 40 has ChiSquare: 42725.000000
Generation 50 has ChiSquare: 42725.000000
Generation 60 has ChiSquare: 42725.000000
Generation 70 has ChiSquare: 42725.000000
Generation 80 has ChiSquare: 42725.000000
Generation 90 has ChiSquare: 42725.000000
Generation 100 has ChiSquare: 42725.000000
STOP("ChangeOverGeneration with {'tolerance': 0.005, 'generations': 100}")
>>> result[0]
array([13., 37.,  1.])
>>> result[1]
42725.0

And again with a different solver...

>>> from mystic.solvers import fmin
>>> mon = VerboseMonitor(10)
>>> result = fmin(objective, x0=[13,37,1], bounds=bounds, constraints=constraint, ftol=1e-6, disp=False, full_output=True, itermon=mon)
Generation 0 has ChiSquare: 42725.000000
Generation 10 has ChiSquare: 42725.000000
Generation 20 has ChiSquare: 42725.000000
STOP("CandidateRelativeTolerance with {'xtol': 0.0001, 'ftol': 1e-06}")

You'll note that the optimizer found better values than we started with just by applying the constraint, but then didn't search much further. In tightly constrained spaces, it is often the case that a solution is found right away. You'll have to investigate a bit more to confirm this is indeed the global minimum.
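As a quick sanity check (using the stand-in schedule from above), the reported solution sits exactly on the schedule boundary and respects the ordering constraints:

```python
# verify the reported solution [13, 37, 1] by hand
def schedule(x0, x1, x2):
    return x0 * x1 - x2 * x2

def objective(x):
    x0, x1, x2 = x
    return (1500 * x0) + (625 * x1) + (100 * x2)

sol = [13, 37, 1]
print(schedule(*sol))    # 480 -- exactly meets schedule(...) >= 480
print(objective(sol))    # 42725 -- matches the solver's reported cost
assert sol[1] > sol[0] and sol[1] > sol[2]  # ordering constraints hold
```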