I have a 40-dimensional Rosenbrock function with nonlinear and bound constraints. I am using the code below to find the minimum. I let SciPy's internal finite-difference approximation handle the gradients, and I work with scaled variables. I do not understand why the optimization terminates with "Positive directional derivative for linesearch".
I played with the initial guess, but that did not help. Since I am working with scaled variables, the bounds on the optimization variables must always be lb=0 and ub=1. Here is my code.
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint, Bounds

def Denormalizer(u_norm, maximum, minimum):
    """Map scaled variables in [0, 1] back to [minimum, maximum]."""
    return u_norm * (maximum - minimum) + minimum

def rosenbrock_40d(x):
    """The Rosenbrock function, evaluated on the denormalized variables."""
    x = Denormalizer(x, maximum, minimum)
    return sum(100.0 * (x[1:] - x[:-1]**2.0)**2.0 + (1 - x[:-1])**2.0)

def nonlinear_constraint(x):
    """Single scalar constraint: the sum of the 39 cubic terms."""
    x = Denormalizer(x, maximum, minimum)
    total = 0
    for i in range(39):
        total += 0.1 - (x[i] - 1)**3 - (x[i + 1] - 1)
    return total

# The summed constraint value must stay <= 0.
nlcfd = NonlinearConstraint(nonlinear_constraint, -np.inf, 0)
Opt_Bounds = Bounds(0, 1)
x_values = [
2, -8, 5, 10, -3, 7, -9, 1, -6, 4,
-2, 8, -7, 3, 6, -5, 9, -1, 0, -4,
7, -3, 6, -10, 8, -2, 9, -5, 4, -1,
0, -6, 10, -8, 3, -4, 2, -7, 1, -9
]
maximum, minimum = np.max(x_values), np.min(x_values)
# Min-max scale the initial guess into [0, 1].
x_values_norm = (np.asarray(x_values) - minimum) / (maximum - minimum)

resultfd = minimize(rosenbrock_40d, x_values_norm, method='SLSQP',
                    bounds=Opt_Bounds, constraints=nlcfd,
                    options={'maxiter': 200, 'disp': True, 'iprint': 2, 'ftol': 1e-6})
This is the output:
NIT FC OBJFUN GNORM
5 41 8.636645E+06 2.061750E+07
Positive directional derivative for linesearch (Exit mode 8)
Current function value: 8636645.0
Iterations: 5
Function evaluations: 41
Gradient evaluations: 1
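For reference, here is a minimal sanity check I can run on the scaled starting point (it reuses the functions above; scipy.optimize.approx_fprime with a 1e-8 step is my own probe, separate from whatever SLSQP does internally):

from scipy.optimize import approx_fprime

x0 = np.asarray(x_values_norm)
# The summed constraint must be <= 0 for x0 to be feasible.
print("constraint at x0:", nonlinear_constraint(x0))
print("objective at x0:", rosenbrock_40d(x0))
# A crude forward-difference gradient probe at the starting point.
grad = approx_fprime(x0, rosenbrock_40d, 1e-8)
print("max |grad| at x0:", np.max(np.abs(grad)))

If this gradient comes out on the order of the GNORM of roughly 2e7 shown above, the objective is extremely steep at the starting point, which would at least be consistent with the line search failing to find a descent step.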