Dataloader with a different batch size for each iteration for a deep learning project


For my deep learning project I need a CIFAR-10 data loader (Python) for my model that uses a different batch size on each iteration. Unfortunately, `torch.utils.data.DataLoader` uses a constant batch size throughout training.

The batch size for each iteration should be drawn at random from a geometric distribution.

If any of you have code for a data loader (ideally for CIFAR-10) that uses a varying batch size per iteration, I would love to have a look.

I tried building a batch sampler class with code generated by ChatGPT, but it didn't work. I also tried writing a class based on `torch.utils.data.DataLoader`, but I got a couple of errors.
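For reference, here is one way this can be done without subclassing `DataLoader` at all: `DataLoader` accepts a `batch_sampler` argument, which can be any iterable that yields lists of indices, so the batches it yields do not all have to be the same size. The sketch below is a minimal, pure-Python batch sampler whose batch sizes are drawn from a geometric distribution via inverse-CDF sampling; the parameter values (`p=0.05`, `max_batch_size=256`) and the class/function names are illustrative choices, not anything prescribed by PyTorch.

```python
import math
import random


def geometric_sample(p, rng):
    # Inverse-CDF sample from a geometric distribution on {1, 2, 3, ...}
    # with success probability p (mean ~ 1/p).
    u = rng.random()  # u in [0, 1)
    return int(math.log(1.0 - u) / math.log(1.0 - p)) + 1


class GeometricBatchSampler:
    """Yields shuffled index lists whose lengths follow a geometric
    distribution. Can be passed to torch.utils.data.DataLoader via the
    batch_sampler= argument (any iterable of index lists is accepted)."""

    def __init__(self, dataset_size, p=0.05, max_batch_size=256, seed=None):
        self.dataset_size = dataset_size
        self.p = p
        self.max_batch_size = max_batch_size  # cap to avoid huge batches
        self.rng = random.Random(seed)

    def __iter__(self):
        indices = list(range(self.dataset_size))
        self.rng.shuffle(indices)
        i = 0
        while i < self.dataset_size:
            size = min(
                geometric_sample(self.p, self.rng),
                self.max_batch_size,
                self.dataset_size - i,  # last batch may be truncated
            )
            yield indices[i:i + size]
            i += size
```

Hooking it up to CIFAR-10 would then look something like this (50,000 is the size of the CIFAR-10 training split):

```python
# dataset = torchvision.datasets.CIFAR10(root="./data", train=True,
#                                        download=True, transform=transform)
# loader = torch.utils.data.DataLoader(
#     dataset, batch_sampler=GeometricBatchSampler(len(dataset)))
```

Note that `batch_sampler` is mutually exclusive with `batch_size`, `shuffle`, `sampler`, and `drop_last`, so none of those should be passed alongside it.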

Thanks in advance
