Using scikit-optimize and scikit-learn, I get the following error on calling the gp_minimize function:
TypeError: predict() got an unexpected keyword argument 'return_mean_grad'
I think it may be some version incompatibility, but both are the latest versions (according to pip):
- scikit-optimize 0.9.0
- scikit-learn 1.2.0
What could be the problem?
Sample code I am using:
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern
from skopt import gp_minimize
# Define the kernel for the Gaussian process
kernel = Matern(nu=2.5)
# Initialize the Gaussian process regressor
gpr = GaussianProcessRegressor(kernel=kernel, n_restarts_optimizer=10)
# Define the objective function
def f(x):
    x1, x2, x3 = x
    return -x1**2 - x2**2 - x3**2
# Define the variable bounds
bounds = [(0, 1), (0, 1), (0, 1)]
# Perform the optimization
res = gp_minimize(f, bounds, n_calls=20, random_state=0, verbose=True, n_random_starts=10, acq_func="EI", base_estimator=gpr)
# Print the optimal variables and function value
print("x1 = {:.4f}, x2 = {:.4f}, x3 = {:.4f}, f(x) = {:.4f}".format(res.x[0], res.x[1], res.x[2], res.fun))
You're using 2 packages, Scikit-Optimize and Scikit-Learn, and the error comes from mixing them: gp_minimize cannot take a plain sklearn GaussianProcessRegressor as base_estimator. During the search, skopt calls predict(..., return_mean_grad=True, return_std_grad=True) on the estimator to get the gradients it needs for optimizing the acquisition function, and those keyword arguments exist only on skopt's own subclass, skopt.learning.GaussianProcessRegressor. sklearn's predict() doesn't accept them, hence the TypeError. So it is not a version incompatibility.
If you really want to customize the Gaussian process (kernel, number of optimizer restarts, etc.), build the regressor and kernel from skopt.learning instead and pass that object as base_estimator; it exposes the same interface as sklearn's regressor but also returns the gradient values skopt asks for. Otherwise, simply drop base_estimator entirely: gp_minimize constructs a sensible default GP (with a Matern nu=2.5 kernel) on its own.
Alternatively, for Bayesian optimization you can use the bayesian-optimization package (bayes_opt) instead; it has a simpler API and avoids mixing sklearn estimators into skopt at all.
Thus, your code's problems: