Sharing parameters between different nn.Modules in PyTorch


I've got the model that you can see below, but I need to create two instances of it that share x2h and h2h. Does anyone know how to do it?

import torch
import torch.nn as nn

class RNN(nn.Module):
    def __init__(self, input_size, hidden_size, output_size):
        super(RNN, self).__init__()

        self.hidden_size = hidden_size
        self.x2h = nn.Linear(input_size, hidden_size)
        self.h2h = nn.Linear(hidden_size, hidden_size)
        self.h2o = nn.Linear(hidden_size, output_size)

        #self.softmax = nn.LogSoftmax(dim=1)
        self.softmax = nn.Softmax(dim=1)

    def forward(self, input, hidden):

        hidden1 = self.x2h(input)
        hidden2 = self.h2h(hidden)
        hidden = hidden1 + hidden2
        output = self.h2o(hidden)
        output = self.softmax(output)

        return output, hidden

    def initHidden(self):
        return torch.zeros(1, self.hidden_size)

1 Answer


This is a Python question, I assume.

Variables declared in the class body, rather than inside a method, are class (or static) variables: they live on the class object itself, so every instance sees the same object.

Ref: https://radek.io/2011/07/21/static-variables-and-methods-in-python/
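A minimal sketch of that idea applied to the model above (the name `SharedRNN` and the sizes are illustrative, not from the question). One PyTorch-specific caveat: modules declared at class level are not registered in the instance's `_modules`, so `model.parameters()` will not include them and you have to hand their parameters to the optimizer explicitly.

```python
import torch
import torch.nn as nn

class SharedRNN(nn.Module):
    # Hypothetical sizes, for illustration only.
    INPUT_SIZE, HIDDEN_SIZE, OUTPUT_SIZE = 4, 8, 3

    # Declared in the class body -> shared by every instance.
    x2h = nn.Linear(INPUT_SIZE, HIDDEN_SIZE)
    h2h = nn.Linear(HIDDEN_SIZE, HIDDEN_SIZE)

    def __init__(self):
        super().__init__()
        # Per-instance (unshared) layers still go in __init__.
        self.h2o = nn.Linear(self.HIDDEN_SIZE, self.OUTPUT_SIZE)
        self.softmax = nn.Softmax(dim=1)

    def forward(self, input, hidden):
        hidden = self.x2h(input) + self.h2h(hidden)
        return self.softmax(self.h2o(hidden)), hidden

a, b = SharedRNN(), SharedRNN()
# Both instances really do point at the same weight tensors:
assert a.x2h.weight is b.x2h.weight

# Caveat: class-level modules are invisible to a.parameters(), so
# collect them explicitly when building the optimizer.
params = (list(a.parameters()) + list(b.parameters())
          + list(SharedRNN.x2h.parameters())
          + list(SharedRNN.h2h.parameters()))
```

Because the shared layers are single objects, gradients from both instances accumulate into the same `.grad` tensors during backpropagation, which is usually exactly what you want when tying weights.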