Randomly set some elements in a tensor to zero (with low computational time)


I have a tensor of shape (3072, 1000) that represents the weights in my neural network. I want to:

  1. Randomly set 60% of its elements to zero.
  2. After each weight update, set 60% of the elements to zero again, but at freshly drawn random positions, i.e., not the same elements as before.

Note: my network is not a usual artificial neural network trained with backpropagation; it is a biophysical model of neurons in the brain, so I use special weight-update rules. Therefore, I suspect that ready-made functions in PyTorch, if any exist, might not be helpful here.

I tried the following code. It works, but it takes very long because I have to rerun it every time I update my weight tensor, to set the tensor back to 60% zeros:

import numpy as np

# choose 60% of the row indices and 60% of the column indices, without repeats
row_indices = np.random.choice(np.size(mytensor.weight, 0), replace=False,
                               size=int(np.size(mytensor.weight, 0) * 0.6))
column_indices = np.random.choice(np.size(mytensor.weight, 1), replace=False,
                                  size=int(np.size(mytensor.weight, 1) * 0.6))

# zero every element at the intersection of the chosen rows and columns
for r in row_indices:
    for c in column_indices:
        mytensor.weight[r, c] = 0
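
As an aside, the double Python loop can be collapsed into a single advanced-indexing assignment that zeroes the same elements (a sketch, assuming mytensor.weight behaves like a plain torch tensor):

import torch

# rows[:, None] broadcasts against cols, so every (r, c) pair
# from the two index sets is zeroed in one call
rows = torch.as_tensor(row_indices)
cols = torch.as_tensor(column_indices)
mytensor.weight[rows[:, None], cols] = 0

But even then, the random selection has to be redrawn and reapplied after every single update.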

2 Answers

BEST ANSWER

You can use the dropout function for this:

import torch.nn.functional as F

# each element is zeroed independently with probability p
mytensor.weight = F.dropout(mytensor.weight, p=0.6)
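
Note that in training mode (the default for the functional form) dropout also rescales the surviving elements by 1/(1 - p), so the non-zero weights come out larger than before. If only the zeroing is wanted, without the rescaling, a random Boolean mask does the same job; a minimal sketch:

import torch

# keep each weight with probability 0.4, zero it with probability 0.6,
# without dropout's 1/(1 - p) rescaling of the survivors
mask = torch.rand_like(mytensor.weight) > 0.6
mytensor.weight = mytensor.weight * mask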

iacob's answer is perfect if you want approximately 60% of the weights set to zero. If you want to set exactly m values in your tensor to zero, you can use something like this:

import numpy as np

n = mytensor.weight.numel()
m = int(round(n * 0.6))

# m distinct flat indices into the weight tensor
indices = np.random.choice(n, m, replace=False)  # alternative: torch.randperm(n)[:m]

# flatten() returns a view of a contiguous tensor,
# so this assignment writes through to the original storage
mytensor.weight = mytensor.weight.contiguous()
mytensor.weight.flatten()[indices] = 0
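
Since the zeroing has to be repeated after every weight update, it may be convenient to wrap it in a small helper. Here is a pure-torch sketch built on the torch.randperm alternative from the comment above (the name rezero_fraction is just for illustration):

import torch

def rezero_fraction(weight: torch.Tensor, frac: float = 0.6) -> torch.Tensor:
    """Return weight with a freshly drawn, exact fraction of entries zeroed."""
    weight = weight.contiguous()
    n = weight.numel()
    m = int(round(n * frac))
    indices = torch.randperm(n)[:m]  # m distinct flat indices, redrawn each call
    weight.flatten()[indices] = 0    # in-place write through the flattened view
    return weight

# after each update of the model:
mytensor.weight = rezero_fraction(mytensor.weight, 0.6)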