How to compute gradient wrt concatenated variables in theano


I have a loss function y that is a function of multiple shared variables theta1, theta2, ...

Then, gradient descent of y with respect to the thetas can be written simply as

import theano
import theano.tensor as T

theta_list = [theta1, theta2, theta3, theta4]
grad_list = T.grad(y, theta_list)
# One (shared_variable, new_value) update pair per parameter
updates = [(theta, theta - learning_rate * gradient)
           for theta, gradient in zip(theta_list, grad_list)]
train = theano.function([], y, updates=updates)

However, I do not want to use the list representation of the thetas. That is,

import theano.tensor as T
thetas = <Properly concatenate thetas>
gradient = T.grad(y, thetas)
thetas = thetas - learning_rate * gradient

Is there any way to enable this?

A simple thetas = T.concatenate(theta_list) raises DisconnectedInputError when calculating the gradient, since y was built from the individual thetas and does not depend on the new concatenated node.
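For context, the usual pattern is the reverse of concatenation: store all parameters in one flat vector and expose the individual thetas as slices (views) of it, so that the loss is built from the flat vector and the gradient with respect to it is connected. Below is a framework-free sketch of that idea in plain Python, with a hand-derived gradient for a toy quadratic loss; all names and the loss itself are illustrative, not from the question:

```python
# Sketch: keep every parameter in ONE flat vector ("thetas") and treat the
# individual parameters as index ranges into it, e.g. theta1 = thetas[0:2],
# theta2 = thetas[2:4]. Gradient descent then updates the flat vector
# directly, mirroring the desired
#     thetas = thetas - learning_rate * gradient
#
# Toy loss: y = sum((theta_i - target_i)^2); its gradient with respect to
# theta_i is 2 * (theta_i - target_i).

thetas = [0.0, 0.0, 0.0, 0.0]      # flat parameter vector
targets = [1.0, 2.0, 3.0, 4.0]     # arbitrary illustrative targets
learning_rate = 0.1

def loss(params):
    return sum((p - t) ** 2 for p, t in zip(params, targets))

def gradient(params):
    # Hand-derived gradient of the toy quadratic loss.
    return [2.0 * (p - t) for p, t in zip(params, targets)]

for _ in range(100):
    g = gradient(thetas)
    thetas = [p - learning_rate * gi for p, gi in zip(thetas, g)]

print(round(loss(thetas), 6))  # → 0.0 (the loss has converged)
```

In Theano itself, the analogous approach would be to create a single shared vector once and define theta1, theta2, ... as symbolic slices of it (e.g. theta1 = thetas[0:2]) before constructing y; T.grad(y, thetas) is then well defined because the graph genuinely depends on thetas.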
