Large Neural Network Pruning


I have done some experiments on neural network pruning, but only on small models. I used to prune the relevant weights as follows (similar to what is explained in the official tutorial https://pytorch.org/tutorials/intermediate/pruning_tutorial.html):

    import torch.nn.utils.prune as prune

    parameters_to_prune = []
    for name, module in model.named_modules():
        if 'layer' in name:
            # use the module object itself; getattr(model, name) fails
            # for nested submodules with dotted names like 'layer1.0.conv1'
            parameters_to_prune.append((module, 'weight'))

    prune.global_unstructured(
        parameters_to_prune,
        pruning_method=prune.L1Unstructured,
        amount=sparsity_constant,
    )

The main problem with this approach is that I have to define a list (or tuple) of layers to prune. This works when I define the model by hand and know the names of the different layers (for example, in the code above, I knew that all the fully connected layers had the string "layer" in their name).

How can I avoid this process, and define a pruning method that prunes all the parameters of a given model, without having to call the layers by name?

All in all, I'm looking for a function that, given a model and a sparsity constant, globally prunes the model (by masking it):

    model = models.resnet18()
    function_that_prunes(model, sparsity_constant)
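For reference, one way such a function could be sketched is to collect every prunable module automatically with `isinstance` checks instead of relying on layer names. The function name `prune_model_globally` below is my own; the restriction to `nn.Conv2d` and `nn.Linear` is an assumption about which layer types should be pruned:

```python
import torch.nn as nn
import torch.nn.utils.prune as prune


def prune_model_globally(model, sparsity_constant):
    """Globally prune the weights of all Conv2d and Linear layers
    by L1 magnitude, masking the smallest weights model-wide.

    NOTE: hypothetical helper; the choice of layer types to include
    is an assumption, not part of the original question.
    """
    parameters_to_prune = [
        (module, 'weight')
        for module in model.modules()
        if isinstance(module, (nn.Conv2d, nn.Linear))
    ]
    prune.global_unstructured(
        parameters_to_prune,
        pruning_method=prune.L1Unstructured,
        amount=sparsity_constant,
    )
    return model
```

Because the masks are chosen globally, individual layers may end up more or less sparse than `sparsity_constant`; only the total fraction of masked weights across all collected layers matches it.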