What are the different ways to constrain a kernel's lengthscales to be positive and to lie within a given interval?

I'm using the following kernel for my Gaussian process regression, with Adam as the optimizer. But when I print the model's optimized parameters, the lengthscales are not in the interval I provided.

import torch
import gpytorch
from gpytorch.constraints import Interval

class GPR(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood):
        super(GPR, self).__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        # Constrain the RQ kernel lengthscales to lie in [1, 10]
        lengthscale_constraint1 = Interval(lower_bound=1, upper_bound=10)
        covar_module1 = gpytorch.kernels.ScaleKernel(
            gpytorch.kernels.RQKernel(ard_num_dims=train_x.shape[1],
                                      lengthscale_constraint=lengthscale_constraint1))
        # Fix the outputscale and the RQ alpha so the optimizer does not update them
        covar_module1.outputscale = 3.542
        covar_module1.raw_outputscale.requires_grad = False
        raw_alpha = torch.nn.Parameter(torch.tensor([1e-20]))
        covar_module1.base_kernel.register_parameter("raw_alpha", raw_alpha)
        covar_module1.base_kernel.raw_alpha.requires_grad = False
        self.covar_module = gpytorch.kernels.LinearKernel() + covar_module1

Actual Output

The lengthscales come out negative and outside the specified range.

Expected Output

The lengthscales should be in the range of 1 to 10.

There is 1 answer below

The reason for this is that GPyTorch distinguishes between raw and true parameters: the true (constrained) value is obtained by applying the constraint's transform to the raw parameter, and setting a true value stores its inverse transform as the raw parameter (see the docs). This is how constraints are applied to the true parameters.

The raw parameters have the prefix "raw" (e.g. "raw_lengthscale"). For example, here is how the lengthscale setter is defined in the Kernel class:

    def _set_lengthscale(self, value: Tensor):
        # Used by the lengthscale_prior
        if not self.has_lengthscale:
            raise RuntimeError("Kernel has no lengthscale.")

        if not torch.is_tensor(value):
            value = torch.as_tensor(value).to(self.raw_lengthscale)

        self.initialize(raw_lengthscale=self.raw_lengthscale_constraint.inverse_transform(value))

As you can see, the true parameter value (the true lengthscale) is inverse transformed into the corresponding raw_lengthscale.
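
To make the relationship concrete, here is a minimal sketch using the same kind of Interval(1, 10) constraint as in your question (the numbers are only illustrative, the point is the transform/inverse_transform round trip):

import gpytorch
from gpytorch.constraints import Interval

# A stand-alone kernel with an interval-constrained lengthscale
kernel = gpytorch.kernels.RQKernel(lengthscale_constraint=Interval(1, 10))
kernel.lengthscale = 5.0  # the setter stores inverse_transform(5.0) in raw_lengthscale

raw = kernel.raw_lengthscale                              # unconstrained raw value
true = kernel.raw_lengthscale_constraint.transform(raw)  # mapped back into [1, 10]
print(raw.item(), true.item())  # raw differs from 5.0; the transformed value is (approximately) 5.0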

If you are printing out parameters by doing

for param_name, param in model.named_parameters():
    print(param_name, param)

this will print the raw parameters. You then need to apply the appropriate constraint transform to the raw parameters to recover the true values. To do this you can use

for name, param, constraint in model.named_parameters_and_constraints():
    print(name, constraint.transform(param))
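
One caveat: parameters that were registered without a constraint (the custom raw_alpha in your model, for instance) come back with constraint equal to None, so you may want to guard against that. A sketch, assuming the model from the question:

for name, param, constraint in model.named_parameters_and_constraints():
    if constraint is not None:
        print(name, constraint.transform(param))
    else:
        # No constraint registered, so the raw value is the true value
        print(name, param)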

Note that name will still contain "raw" even though the true (transformed) value of the parameter is being printed.
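
Alternatively, convenience properties such as lengthscale already return the transformed value, so you can read the constrained lengthscales directly off the kernel. A sketch assuming the model from your question, where the ScaleKernel-wrapped RQKernel is the second component of the additive kernel:

# Index 1 of the additive kernel is covar_module1 (the LinearKernel is index 0)
rq_kernel = model.covar_module.kernels[1].base_kernel
print(rq_kernel.lengthscale)  # already transformed, so it should lie in [1, 10]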