code:
import theano
import theano.tensor as T

a = T.vector()
b = T.vector()
loss = T.sum(a - b)
dy = T.grad(loss, a)    # first derivative of loss w.r.t. a
d2y = T.grad(loss, dy)  # raises DisconnectedInputError
f = theano.function([a, b], d2y)
print(f([.5, .5, .5], [1, 0, 1]))
output:
theano.gradient.DisconnectedInputError: grad method was asked to compute
the gradient with respect to a variable that is not part of the
computational graph of the cost, or is used only by a non-differentiable
operator: Elemwise{second}.0
How is a derivative of the graph not part of the graph? Is this why scan is used to compute the Hessian?
Here you are attempting to compute the gradient of the loss with respect to dy. However, the loss depends only on the values of a and b, and not on dy, hence the error. It only makes sense to compute partial derivatives of the loss with respect to parameters that actually affect its value.

The easiest way to compute the Hessian in Theano is to use the theano.gradient.hessian convenience function:
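Applied to your example, something along these lines should work (a minimal sketch; hessian takes the scalar cost and, as wrt, the variable to differentiate with respect to):

import theano
import theano.tensor as T

a = T.vector('a')
b = T.vector('b')
loss = T.sum(a - b)

# Hessian of the scalar loss with respect to the vector a
H = theano.gradient.hessian(loss, wrt=a)

f = theano.function([a, b], H)
print(f([.5, .5, .5], [1, 0, 1]))  # 3x3 matrix of zeros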
See the documentation here for an alternative manual method that uses a combination of theano.grad and theano.scan; a sketch of that approach, adapted to your example, is below.
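The manual method builds the Hessian one row at a time by differentiating each component of the gradient again. Roughly (a sketch, not the exact code from the docs; disconnected_inputs='ignore' is needed here because for this particular loss the gradient dy is a constant vector of ones that no longer depends on a):

import theano
import theano.tensor as T

a = T.vector('a')
b = T.vector('b')
loss = T.sum(a - b)
dy = T.grad(loss, a)

# Row i of the Hessian is the gradient of dy[i] w.r.t. a.
H, updates = theano.scan(
    lambda i, dy, a: T.grad(dy[i], a, disconnected_inputs='ignore'),
    sequences=T.arange(dy.shape[0]),
    non_sequences=[dy, a])

f = theano.function([a, b], H, updates=updates)
print(f([.5, .5, .5], [1, 0, 1]))  # 3x3 matrix of zeros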
In your example the Hessian will be a 3x3 matrix of zeros, since the partial derivative of the loss w.r.t. a is independent of a (it's just a vector of ones).