I'm using the following kernel for my Gaussian process regression, with Adam as the optimizer. But when I print the model's optimized parameters, the lengthscales are not within the interval I provided.
import torch
import gpytorch
from gpytorch.constraints import Interval

class GPR(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood):
        super(GPR, self).__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        # constrain the RQ kernel lengthscales to (1, 10)
        lengthscale_constraint1 = Interval(lower_bound=1, upper_bound=10)
        covar_module1 = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RQKernel(
            ard_num_dims=train_x.shape[1], lengthscale_constraint=lengthscale_constraint1))
        # fix the outputscale and alpha so Adam does not update them
        covar_module1.outputscale = 3.542
        covar_module1.raw_outputscale.requires_grad = False
        raw_alpha = torch.nn.Parameter(torch.tensor([1e-20]))
        covar_module1.base_kernel.register_parameter("raw_alpha", raw_alpha)
        covar_module1.base_kernel.raw_alpha.requires_grad = False
        self.covar_module = gpytorch.kernels.LinearKernel() + covar_module1

    def forward(self, x):
        mean_x = self.mean_module(x)
        covar_x = self.covar_module(x)
        return gpytorch.distributions.MultivariateNormal(mean_x, covar_x)
Actual output
The printed lengthscales are negative and not in the range.
Expected output
The lengthscales should lie in the range 1 to 10.
The reason for this is that GPyTorch keeps both raw and true parameters: the optimizer works on unconstrained raw parameters, and the true parameter is obtained by applying the constraint's transform to the raw one (setting a true value inverse-transforms it into the raw parameter; see the docs). This is how constraints are applied to the true parameters without restricting the optimizer.
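As a minimal sketch of that mechanism, using the same Interval(1, 10) constraint as in the question (gpytorch.constraints.Interval provides transform and inverse_transform): whatever raw value Adam produces, the transformed (true) lengthscale stays inside the interval.

import torch
from gpytorch.constraints import Interval

constraint = Interval(lower_bound=1, upper_bound=10)
raw = torch.tensor([-20.0, 0.0, 20.0])     # unconstrained values the optimizer actually updates
true = constraint.transform(raw)           # squashed into (1, 10)
print(true)                                # approximately tensor([1.0000, 5.5000, 10.0000])
print(constraint.inverse_transform(true))  # maps the true values back to the raw ones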
The raw parameters have the prefix "raw" (e.g. "raw_lengthscale"). For example, the lengthscale setter defined in the Kernel class inverse-transforms the true parameter value (the true lengthscale) into the corresponding raw_lengthscale, as sketched below.
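Here is a sketch of what that setter does, paraphrased from the GPyTorch source (the exact code may differ between versions):

@lengthscale.setter
def lengthscale(self, value):
    self._set_lengthscale(value)

def _set_lengthscale(self, value):
    if not torch.is_tensor(value):
        value = torch.as_tensor(value).to(self.raw_lengthscale)
    # the true value is mapped back to the unconstrained raw parameter
    self.initialize(raw_lengthscale=self.raw_lengthscale_constraint.inverse_transform(value))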
If you are printing out parameters by doing something like the following (model here is assumed to be an instance of the GPR class above),
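for name, param in model.named_parameters():
    print(name, param)  # prints the raw_* values, which may be negative or out of range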
this will print the raw parameters. You will then need to apply the appropriate constraint transform to the raw parameters to recover the true values. One way to do this is sketched below.
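Assuming a GPyTorch version that provides Module.named_parameters_and_constraints (which yields each parameter together with its constraint, if any):

for name, param, constraint in model.named_parameters_and_constraints():
    if constraint is not None:
        print(name, constraint.transform(param))  # the true, constrained value
    else:
        print(name, param)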
Note that name will still contain "raw" even if the true value of the parameter is being printed.