I am using the script below to get details about the port status of multiple remote servers.
Workflow Test-OpenPortWF
{
    [CmdletBinding()]
    param
    (
        [Parameter(Position=0)]
        [String[]]$Target,
        [Parameter(Mandatory=$true, Position=1)]
        [int]$Port
    )
    If(Test-Path -Path C:\Temp\Results.csv -ErrorAction SilentlyContinue){ Remove-Item -Path C:\Temp\Results.csv -Force }
    If(Test-Path -Path C:\Temp\Report.csv -ErrorAction SilentlyContinue){ Remove-Item -Path C:\Temp\Report.csv -Force }
    foreach -parallel -throttle 50 ($t in $Target)
    {
        Sequence
        {
            $Out = Test-NetConnection -ComputerName $t -Port $Port -WarningAction SilentlyContinue |
                Select-Object ComputerName,RemoteAddress,RemotePort,@{N="PortTestSucceeded"; E={$_.TcpTestSucceeded}}
            Start-Sleep -Milliseconds 100
            $Out | Export-Csv -Path C:\Temp\Results.csv -NoTypeInformation -Append
        }
    }
    InlineScript
    {
        Import-Csv C:\Temp\Results.csv |
            Select-Object ComputerName,RemoteAddress,RemotePort,PortTestSucceeded |
            Export-Csv C:\Temp\Report.csv -NoTypeInformation
        Remove-Item C:\Temp\Results.csv -Force
        Write-Host "Execution completed! Check Report.csv for output."
    }
}
# Example: test one port (5985) on multiple servers and export the results to a CSV file.
# Assuming all target servers are found in c:\temp\Servers.txt (new line separated)
#
# PS C:\Temp> Test-OpenPortWF -Target (Get-Content .\Servers.txt) -Port 5985
Mostly it works, but it does not return complete results. Because this runs as a parallel workflow, if two servers finish processing at the same time, both iterations try to write to the CSV file at once, which produces the error below. Around 6% of the results are missing from the CSV file:
Microsoft.PowerShell.Utility\Write-Error : The process cannot access the file 'C:\Temp\Results.csv' because it is being used by another process.
At Test-OpenPortWF:54 char:54
+
    + CategoryInfo          : NotSpecified: (:) [Write-Error], CmdletInvocationException
    + FullyQualifiedErrorId : System.Management.Automation.CmdletInvocationException,Microsoft.PowerShell.Commands.WriteErrorCommand
    + PSComputerName        : [localhost]
How can we get around this problem?
Because you use parallel processing, multiple threads conflict when they try to write to your CSV file at the same time (judging by the error, you are running into file locks).

Instead, have each iteration write to its own temporary file (with a unique name), and at the end merge those files into one report (and delete the temp files).

For example, use the target name to build a unique file name and output the results to -Path "C:\Temp\Results_$t.csv". (A shared counter incremented with $x++ is not reliable inside foreach -parallel, because each parallel iteration of a workflow gets its own copy of the variable.)
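A minimal sketch of that approach, reusing the workflow from the question (the per-target file name pattern Results_$t.csv is an assumption; it works as long as $t contains only characters valid in a file name, which is the case for host names):

```powershell
Workflow Test-OpenPortWF
{
    [CmdletBinding()]
    param
    (
        [Parameter(Position=0)]
        [String[]]$Target,
        [Parameter(Mandatory=$true, Position=1)]
        [int]$Port
    )
    # Clear out per-server temp files and the report from any previous run
    Remove-Item -Path C:\Temp\Results_*.csv -Force -ErrorAction SilentlyContinue
    Remove-Item -Path C:\Temp\Report.csv -Force -ErrorAction SilentlyContinue
    foreach -parallel -throttle 50 ($t in $Target)
    {
        Sequence
        {
            $Out = Test-NetConnection -ComputerName $t -Port $Port -WarningAction SilentlyContinue |
                Select-Object ComputerName,RemoteAddress,RemotePort,@{N="PortTestSucceeded"; E={$_.TcpTestSucceeded}}
            # Each iteration writes to its own file, so no two threads ever
            # open the same file and the lock error cannot occur
            $Out | Export-Csv -Path "C:\Temp\Results_$t.csv" -NoTypeInformation
        }
    }
    InlineScript
    {
        # Merge the per-server files into one report, then clean up
        Get-ChildItem -Path C:\Temp\Results_*.csv |
            ForEach-Object { Import-Csv -Path $_.FullName } |
            Export-Csv -Path C:\Temp\Report.csv -NoTypeInformation
        Remove-Item -Path C:\Temp\Results_*.csv -Force
        Write-Host "Execution completed! Check Report.csv for output."
    }
}
```

Since every file now holds exactly one result, the Start-Sleep throttle and the -Append switch are no longer needed.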