Black box optimization with Scikit Optimize

I have to optimize a black-box problem that depends on external software (there is no function definition and no derivatives) and that is quite expensive to evaluate. It depends on several variables; some of them are real and some are integers.
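
Just to give an idea, the cost evaluation looks roughly like the sketch below (the executable name, argument handling, and output parsing are placeholders, not my real tool):

import subprocess

def external_cost(x, y, z):
    # Hypothetical call to the external software; "my_simulator" and the
    # argument/output format are placeholders for the real tool
    out = subprocess.run(
        ["my_simulator", str(x), str(int(y)), str(z)],
        capture_output=True, text=True, check=True,
    )
    # Assume the tool prints a single number (the cost) on stdout
    return float(out.stdout.strip())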

I think Scikit Optimize may be a good choice.

I was wondering whether the following example (from the Scikit Optimize documentation) could be adapted to my actual problem, with "f" being an external function that returns the cost of a given set of parameters. Here it is a dummy function just so the example is reproducible. Instead of depending only on "x", it should also depend on "y" and "z", with one of them restricted to integer values.

I have seen some other Scikit Optimize examples oriented towards hyperparameter optimization (based on Scikit Learn), but they are less clear to me.

Here is the minimal reproducible example (which crashes):

import numpy as np
from skopt import gp_minimize
from skopt.space import Integer
from skopt.space import Real


np.random.seed(123)
def f(x,y,z):
    return (np.sin(5 * x[0]) * (1 - np.tanh(x[0] ** 2)) * np.random.randn() * 0.1 - y[0] ** 2 + z[0] ** 2)

search_space = list()
search_space.append(Real(-2, 2, name='x'))
search_space.append(Integer(-2, 2, name='y'))
search_space.append(Real(0, 2, name='z'))

res = gp_minimize(f, search_space, n_calls=20)
print("x*=%.2f, y*=%.2f, f(x*,y*)=%.2f" % (res.x[0],res.y[0],res.z[0], res.fun))

Best regards and thank you

There is 1 answer below, marked as the best answer.

You can use the decorator function use_named_args from scikit-optimize to pass your search space with names to your cost function:

import numpy as np
from skopt import gp_minimize
from skopt.space import Integer
from skopt.space import Real
from skopt.utils import use_named_args

np.random.seed(123)

# Define the search space; the Integer dimension restricts y to integer values
search_space = [
    Real(-2, 2, name='x'),
    Integer(-2, 2, name='y'),
    Real(0, 2, name='z'),
]

# The decorator unpacks the point passed by gp_minimize into named arguments
@use_named_args(search_space)
def f(x, y, z):
    return (np.sin(5 * x) * (1 - np.tanh(x ** 2)) * np.random.randn() * 0.1
            - y ** 2 + z ** 2)

res = gp_minimize(f, search_space, n_calls=20)

Note that your OptimizeResult res stores all optimized parameters in the attribute x, which is an array of the best values. That is why your print statement crashes (there are no attributes y and z on res). You can get a dictionary mapping the dimension names to the optimized values as follows:

optimized_params = {p.name: res.x[i] for i, p in enumerate(search_space)}
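
For example, you could then print the result like this (a small sketch using the dictionary above):

print("x*=%.2f, y*=%d, z*=%.2f, f*=%.2f" % (
    optimized_params['x'],
    optimized_params['y'],
    optimized_params['z'],
    res.fun,
))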