I am getting the total number of bytes of the 32 largest files in the folder:
$big32 = Get-ChildItem C:\temp -Recurse |
    Sort-Object Length -Descending |
    Select-Object -First 32 |
    Measure-Object -Property Length -Sum
$big32.Sum / 1GB
However, it's working very slowly. We have about 10 TB of data in 1.4 million files.
I can think of some improvements, especially to memory usage, but the following should be considerably faster than Get-ChildItem.

Edit
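The answer's own code does not appear in this excerpt. As a hedged sketch of the usual technique for beating Get-ChildItem on large trees — written in Python for illustration — the idea is to stream directory entries lazily (here with os.scandir, which reuses the stat data the OS already returned) instead of materializing a rich wrapper object per file, the way Get-ChildItem builds one for each of the 1.4 million files. The function name largest_files and the error handling are my assumptions, not the original answer:

```python
import os

def largest_files(root: str, n: int = 32) -> list[tuple[int, str]]:
    """Return (size, path) pairs for the n largest files under root.

    os.scandir streams entries and exposes cached stat data, so no
    per-file wrapper object is materialized during the walk.
    """
    sizes: list[tuple[int, str]] = []
    stack = [root]
    while stack:
        directory = stack.pop()
        try:
            with os.scandir(directory) as entries:
                for entry in entries:
                    if entry.is_dir(follow_symlinks=False):
                        stack.append(entry.path)
                    elif entry.is_file(follow_symlinks=False):
                        sizes.append(
                            (entry.stat(follow_symlinks=False).st_size,
                             entry.path))
        except OSError:
            continue  # skip unreadable directories and keep walking
    return sorted(sizes, reverse=True)[:n]
```

The total in gigabytes would then be `sum(s for s, _ in largest_files(r"C:\temp")) / 1024**3`, mirroring the original pipeline.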
I would look at implementing an implicit heap to reduce memory usage without hurting performance (it might even improve it... to be tested).
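The implicit-heap idea can be sketched as follows, again in Python for illustration: heapq maintains a binary min-heap inside a flat list (i.e. an implicit heap), so keeping only the 32 largest entries costs O(32) memory regardless of how many of the 1.4 million files stream past, and each file costs at most one comparison plus an O(log 32) replace. The helper name top_n_files is mine, not from the answer:

```python
import heapq

def top_n_files(entries, n: int = 32) -> list[tuple[int, str]]:
    """Keep only the n largest (size, path) pairs using a bounded min-heap.

    The smallest kept value always sits at heap[0], so any entry that is
    not larger than it is rejected with a single comparison.
    """
    heap: list[tuple[int, str]] = []
    for item in entries:
        if len(heap) < n:
            heapq.heappush(heap, item)
        elif item > heap[0]:
            heapq.heapreplace(heap, item)  # pop smallest, push new entry
    return sorted(heap, reverse=True)
```

Because the heap is bounded at n entries, this replaces both the full sort and the full in-memory result list of the original pipeline.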
Edit 2
If the filenames are not required, the easiest memory gain is to leave them out of the results.
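Continuing the illustrative Python sketch: with names dropped, each heap slot holds a bare integer instead of a (size, path) pair, and the total falls out of a plain sum (top_n_sizes is a hypothetical helper, not the answer's code):

```python
import heapq

def top_n_sizes(sizes, n: int = 32) -> list[int]:
    """Track only the n largest file sizes; no paths are retained,
    so each heap slot holds a single int rather than a (size, path) pair."""
    heap: list[int] = []
    for size in sizes:
        if len(heap) < n:
            heapq.heappush(heap, size)
        elif size > heap[0]:
            heapq.heapreplace(heap, size)
    return sorted(heap, reverse=True)

total_bytes = sum(top_n_sizes([5, 1, 9, 7, 3, 8], n=3))
print(total_bytes)  # → 24  (9 + 8 + 7)
```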