I have an SGPR model:
```python
import numpy as np
import gpflow

X, Y = np.random.randn(50, 2), np.random.randn(50, 1)
Z1 = np.random.randn(13, 2)
k = gpflow.kernels.SquaredExponential()
m = gpflow.models.SGPR(data=(X, Y), kernel=k, inducing_variable=Z1)
```
And I would like to assign the inducing variable a new value with a different shape, like:

```python
Z2 = np.random.randn(29, 2)
m.inducing_variable.Z.assign(Z2)
```

But when I do, I get:

```
ValueError: Shapes (13, 2) and (29, 2) are incompatible
```
Is there a way to reassign the inducing variables without redefining the model?
Context: instead of optimizing the inducing variables along with the rest of the model, I would like to exclude them from the optimization and manually reassign them at each optimization step.
UPDATE: This issue is resolved by https://github.com/GPflow/GPflow/pull/1594, which will become part of the next GPflow patch release (2.1.4).
With that fix, you don't need a custom class. All you need to do is explicitly set the static shape of the variable holding `Z` to `None` along the first dimension when constructing the model. Then `m.inducing_variable.Z.assign(Z2)` should work just fine. Note that in this case `Z` cannot be trainable, as the TensorFlow optimizers need to know the shape at construction time and don't support dynamic shapes.

Right now (as of GPflow 2.1.2) there is no built-in way to change the shape of the inducing variables for
`SGPR`, though it is in principle possible. You can get what you want with your own inducing-variable class that stores `Z` in a dynamically shaped variable, and construct the model with that class instead. Then your `m.inducing_variable.Z.assign()` should work as you like.

(For `SVGP`, the size of the inducing variable and of the distribution defined by `q_mu` and `q_sqrt` have to match, as well as be known at construction time, so in this case changing the number of inducing variables is less trivial.)