I need to move older duplicate files into another directory while keeping the latest copy of each original filename. The moved files need to stay in that location for a week and then be deleted. The Remove-Item part is already working; I just need some guidance on moving the older duplicates.
Example:

C:\Temp
    text1.txt    14:03
    text2.txt    14:03
    txt1.csv     14:04

C:\Temp\Source
    text1.txt    14:01
    text2.txt    14:04
    txt1.csv     14:06
The goal is to write the older files to D:\ (for example) while keeping the directory structure the same, to avoid overwrite errors in the future with other files of the same name:

D:\Temp
    txt1.csv     14:04
    text2.txt    14:03

D:\Temp\Source
    text1.txt    14:01
The result should leave C:\ looking like this:

C:\Temp
    text1.txt    14:03

C:\Temp\Source
    text2.txt    14:04
    txt1.csv     14:06
I have tried this, but the Move-Item is not moving anything:
$dirC = 'C:\Temp'
$dirD = 'D:\Temp'

Get-ChildItem -Path $dirC -Recurse -File |
    Select-Object -Property FullName, Name, LastWriteTime |
    Group-Object -Property LastWriteTime |
    Where-Object -Property Count -GT 1 |
    ForEach-Object {
        $_.Group |
            Sort-Object -Property LastWriteTime -Descending |
            Select-Object -Skip 1 |
            Move-Item $_.FullName $dirD -Force #-WhatIf
    }
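Two things likely stop this attempt from moving anything: Select-Object -Property replaces the FileInfo objects with bare PSCustomObjects, and inside the outer ForEach-Object the $_ in $_.FullName refers to the group object, which has no FullName property. A minimal corrected sketch (still matching on timestamp alone, and still flattening everything into $dirD):

    $dirC = 'C:\Temp'
    $dirD = 'D:\Temp'

    Get-ChildItem -Path $dirC -Recurse -File |
        Group-Object -Property LastWriteTime |
        Where-Object -Property Count -GT 1 |
        ForEach-Object {
            $_.Group |
                Sort-Object -Property LastWriteTime -Descending |
                Select-Object -Skip 1 |
                # Let the piped-in FileInfo objects supply the path.
                Move-Item -Destination $dirD -Force -WhatIf
        }

Note that files grouped by an identical LastWriteTime all sort equal, so which one is "kept" within a group is arbitrary; matching on timestamp alone is fragile compared to comparing content.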
I also tried hashing them, but that does not preserve the directory structure when the files are moved over:
$dirC = 'C:\Temp'
$dirD = 'D:\Temp'
$hash = @{}

Get-ChildItem -Path $dirC -Recurse -File | ForEach-Object {
    $filepath = $_.FullName
    $filehash = Get-FileHash -Path $filepath -Algorithm SHA256 |
        Select-Object -ExpandProperty Hash
    if ($hash.ContainsKey($filehash)) {
        Move-Item -Path $filepath -Destination $dirD -Force
    }
    else {
        # Remember the first copy seen so later duplicates are detected.
        $hash[$filehash] = $filepath
    }
}
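One way to keep the structure is to rebuild each file's path relative to the source root under the destination root, creating directories as needed. A sketch of just the path-mirroring part (the duplicate detection is left out here):

    $dirC = 'C:\Temp'
    $dirD = 'D:\Temp'

    Get-ChildItem -Path $dirC -Recurse -File | ForEach-Object {
        # Rebuild the file's path under $dirD, e.g.
        # C:\Temp\Source\text1.txt -> D:\Temp\Source\text1.txt
        $relative = $_.FullName.Substring($dirC.Length).TrimStart('\')
        $target   = Join-Path $dirD $relative

        # Make sure the mirrored directory exists before moving.
        $targetDir = Split-Path $target -Parent
        if (-not (Test-Path $targetDir)) {
            New-Item -ItemType Directory -Path $targetDir -Force | Out-Null
        }

        # Once a file is identified as an older duplicate:
        # Move-Item -Path $_.FullName -Destination $target -Force
    }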
This might help you get going. Essentially, it first groups files by their .Length to avoid the overhead of hashing them without need. Then, for groups with more than one file, it groups those by MD5 hash and, only if the files share the same hash, sorts them by .LastWriteTime and skips the newest one; the rest get moved to $dirD. -WhatIf should be removed once you have confirmed the code is doing what you need.
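Put together, that description might look something like the following sketch (not the exact code from the thread; the structure-mirroring at the end is an addition, and the paths are assumptions):

    $dirC = 'C:\Temp'
    $dirD = 'D:\Temp'

    Get-ChildItem -Path $dirC -Recurse -File |
        # Cheap pre-filter: only files of identical size can have identical content.
        Group-Object -Property Length |
        Where-Object -Property Count -GT 1 |
        ForEach-Object {
            # Expensive check: group the size-matched files by content hash.
            $_.Group |
                Group-Object { (Get-FileHash -Path $_.FullName -Algorithm MD5).Hash } |
                Where-Object -Property Count -GT 1 |
                ForEach-Object {
                    $_.Group |
                        Sort-Object -Property LastWriteTime -Descending |
                        # Skip the newest copy so it stays in place.
                        Select-Object -Skip 1 |
                        ForEach-Object {
                            # Mirror the source structure under $dirD.
                            $relative  = $_.FullName.Substring($dirC.Length).TrimStart('\')
                            $target    = Join-Path $dirD $relative
                            $targetDir = Split-Path $target -Parent
                            if (-not (Test-Path $targetDir)) {
                                New-Item -ItemType Directory -Path $targetDir -Force | Out-Null
                            }
                            Move-Item -LiteralPath $_.FullName -Destination $target -WhatIf
                        }
                }
        }

Remove -WhatIf once the preview shows the moves you expect.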