If I am using a DataLoader in PyTorch and want to define something that needs the size of the current batch, how do I access it?
The issue with using my defined batch size (say, r) is this: suppose the dataset is 1009 samples long but r = 100 (in a generic function). How do I ensure that the last batch doesn't throw an error due to a dimension mismatch (100 vs. 9)?
To get the size of the current batch while iterating through a PyTorch DataLoader, query the batch itself instead of relying on the fixed batch_size: use len(batch), or batch.size(0) if the batch is a tensor (e.g. inputs.size(0) for an (inputs, labels) pair). This returns 100 for the full batches and 9 for the last one, so any shape-dependent logic stays consistent. Alternatively, if you would rather not handle a ragged final batch at all, pass drop_last=True when creating the DataLoader; note this discards those last 9 samples each epoch.
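A minimal sketch of both approaches, using a synthetic 1009-sample dataset to mirror the sizes in the question (the tensor shapes here are arbitrary placeholders):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Synthetic dataset of 1009 samples, matching the question's numbers.
data = torch.randn(1009, 4)
labels = torch.randint(0, 2, (1009,))

loader = DataLoader(TensorDataset(data, labels), batch_size=100)
batch_sizes = []
for inputs, targets in loader:
    r = inputs.size(0)  # size of the *current* batch: 100, except 9 for the last
    batch_sizes.append(r)
print(batch_sizes)  # ten batches of 100, then one of 9

# If the ragged final batch is unwanted, drop it instead:
full_loader = DataLoader(TensorDataset(data, labels), batch_size=100, drop_last=True)
print(sum(1 for _ in full_loader))  # 10 batches, all exactly 100 samples
```

The first loop is usually preferable, since drop_last=True silently loses data; it only makes sense when a fixed batch dimension is required (e.g. some hand-written reshape logic or hardware constraints).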