Can't share parameters between modules in cutorch


I tried to use the function below to share parameters between two LSTM cells. It runs fine on the CPU, but after switching to the GPU it fails with "bad argument #1 to 'set' (expecting number or torch.DoubleTensor or torch.DoubleStorage ..)". I think this error occurs when I share the parameters of a GPU gModule. How can I solve this problem?

    function share_params(cell, src)
      if torch.type(cell) == 'nn.gModule' then
        -- walk the graph nodes and share each wrapped module's parameters
        for i = 1, #cell.forwardnodes do
          local node = cell.forwardnodes[i]
          if node.data.module then
            --print(src.forwardnodes[i].data.module['weight'])
            node.data.module:share(src.forwardnodes[i].data.module,
              'weight', 'bias', 'gradWeight', 'gradBias')
          end
        end
      elseif torch.isTypeOf(cell, 'nn.Module') then
        cell:share(src, 'weight', 'bias', 'gradWeight', 'gradBias')
      else
        error('parameters cannot be shared for this input')
      end
    end
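
From the error message, `:share()` appears to be mixing tensor types: `torch.Tensor:set` is handed a `torch.CudaTensor` where a `torch.DoubleTensor` is expected (or vice versa), which would happen if only one of the two cells had been moved to the GPU, or if `:cuda()` was called after sharing (it allocates fresh tensors and breaks the aliasing). A minimal sketch of the call order I would expect to work, assuming a hypothetical `make_lstm` builder for the cell:

    require 'cutorch'
    require 'cunn'

    -- make_lstm is a placeholder for whatever builds the nn.gModule cell
    local src  = make_lstm():cuda()  -- master cell; owns the real parameters
    local cell = make_lstm():cuda()  -- clone that should alias them

    -- both sides are now CudaTensors, so :share() compares like with like
    share_params(cell, src)
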