I've been using the COBYLA (Constrained Optimization BY Linear Approximation) routine, scipy.optimize.fmin_cobyla, to minimize my objective function subject to two non-negativity constraints. It works well most of the time, but I run into trouble when I don't have good starting values for the two variables, which happens more often than not.
from scipy.optimize import fmin_cobyla

def constraint1(x):
    return x[0]  # require x[0] >= 0

def constraint2(x):
    return x[1]  # require x[1] >= 0

def objective(x):
    return x[0] * x[1]  # not actual objective function

x0 = [1.2, 3]
result = fmin_cobyla(objective, x0, [constraint1, constraint2])
I usually test my optimization routines in Excel as a second set of eyes, which is how I found Solver's "Set Objective f(n) to Value Of: 0" option. Is this something I can easily duplicate in scipy.optimize, or am I making this too complicated? Since my actual objective function minimizes a sum of squared differences, my goal is to stop the optimization routine if/when it gets as close as possible to 0.
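One thought, as a sketch rather than a definitive answer: since the real objective is a sum of squared differences and the constraints are simple non-negativity bounds, scipy.optimize.least_squares may be a more natural fit than COBYLA. It takes the vector of differences directly (squaring and summing internally), accepts bounds for the non-negativity requirement, and its ftol/gtol tolerances stop the solver once the cost can no longer be driven further toward 0. The residual function and targets below are made up purely for illustration:

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical residuals: the differences we want driven to 0.
# The targets 2.0 and 3.0 are placeholders, not from the real problem.
def residuals(x):
    return np.array([x[0] - 2.0, x[1] - 3.0])

# bounds=(0, inf) enforces non-negativity on both variables,
# replacing the two COBYLA constraint functions.
res = least_squares(residuals, x0=[1.2, 3.0], bounds=(0.0, np.inf))

# res.x is the minimizer; res.cost is 0.5 * sum(residuals(x)**2),
# so a cost near 0 means the squared differences were driven to ~0.
```

If the problem must stay with COBYLA, another common workaround is to raise a custom exception from inside the objective once the value falls below a tolerance and catch it around the fmin_cobyla call, since fmin_cobyla itself has no target-value stopping criterion.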