My script takes too long to run


My script reports error events from the Application, System, and Security logs from the last 24 hours. It takes an average of 8 minutes to run. I believe these statements in particular contribute to the problem:

$Lap = Get-EventLog -ComputerName $server -LogName $logname -After (Get-Date).AddHours(-24)
$Lap | Where-Object { $_.EntryType -match "Error" } | Sort-Object EventID |
  Group-Object EventID | Format-Table Name, Count -Auto | Format-List |
  Out-String | Add-Content $eventlogfile

If I replace the -After parameter with -Newest, the script runs quickly. Any thoughts on why the statements above take so long to run?

Edit: After doing some research, I learned that with the Get-EventLog cmdlet, PowerShell parses the entire event log.

Measuring how long it takes to find the Application errors from the last 24 hours:

Measure-Command -Expression { Get-EventLog -ComputerName server123 -LogName Application -EntryType Error -After (Get-Date).AddHours(-24) }

Total time: 54 seconds

Counting how many Application errors occurred in the last 24 hours:

$logs1 = Get-EventLog -ComputerName server123 -LogName Application -EntryType Error -After (Get-Date).AddHours(-24)
$logs1.Count

There are 3 errors

Measure-Command -Expression { Get-EventLog -ComputerName server123 -LogName Application -Newest 3 }

Total time: 43 milliseconds

Any thoughts on why Get-EventLog takes so much longer to return results with the -After parameter than with the -Newest parameter?

3 Answers

Answer 1

The Measure-Command cmdlet tells you how long a command takes. Use it to find out what is going on: split the script into stages and measure each one to find the most expensive operation. Something like this:

Measure-Command -Expression { $Lap = Get-EventLog -ComputerName $server -LogName $logname -After (Get-Date).AddHours(-24) }
Measure-Command -Expression { $Lap2 = $Lap | Where-Object { $_.EntryType -match "Error" } }
Measure-Command -Expression { $Lap3 = $Lap2 | Sort-Object EventID }
# Keep splitting / unsplitting statements until you find the costly one
Measure-Command -Expression { $Lap3 | Group-Object EventID | Format-Table Name, Count -Auto | Format-List | Out-String | Add-Content $eventlogfile }
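
To see the stage timings side by side, a Stopwatch-based variant is handy; it also sidesteps a Measure-Command quirk, where variables assigned inside the measured script block may not survive into the next one. A minimal sketch, assuming $server and $logname are set as in the question:

# Time each stage with one Stopwatch, restarting it between stages
$sw = [System.Diagnostics.Stopwatch]::StartNew()

$Lap = Get-EventLog -ComputerName $server -LogName $logname -After (Get-Date).AddHours(-24)
"Fetch : {0:N1}s" -f $sw.Elapsed.TotalSeconds
$sw.Restart()

$Lap2 = $Lap | Where-Object { $_.EntryType -match "Error" }
"Filter: {0:N1}s" -f $sw.Elapsed.TotalSeconds
$sw.Restart()

$Lap3 = $Lap2 | Sort-Object EventID
"Sort  : {0:N1}s" -f $sw.Elapsed.TotalSeconds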
Answer 2

You transfer all events of the past 24 hours over the network and then filter them on the local computer:

$Lap = Get-EventLog -ComputerName $server -LogName $logname -After (Get-Date).AddHours(-24)
$Lap | Where-Object { $_.EntryType -match "Error" } | Sort-Object EventID | ...

Doing the filtering directly with Get-EventLog should speed things up, because that way only the error events of the past 24 hours are transferred over the network:

$Lap = Get-EventLog -ComputerName $server -LogName $logname -EntryType Error -After (Get-Date).AddHours(-24)
$Lap | Sort-Object EventID | ...
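
As an aside: where Get-WinEvent is available, its -FilterHashtable parameter pushes the same filter down into the event log service, which is usually faster still. A minimal sketch under that assumption, with $server and $logname as above (Level 2 is the numeric severity for Error):

# Filter by log name, severity, and time window at the source
$filter = @{
    LogName   = $logname
    Level     = 2                        # 2 = Error
    StartTime = (Get-Date).AddHours(-24)
}
$Lap = Get-WinEvent -ComputerName $server -FilterHashtable $filter
$Lap | Sort-Object Id | Group-Object Id | Format-Table Name, Count -Auto

Note that Get-WinEvent records expose the event number as Id rather than EventID.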
Answer 3

I've seen this before, and using -After with large log files is definitely a performance problem.

I've used the record index numbers to speed things up when doing repeated log reads:

# Read the index recorded by the previous run
$last_index = Get-Content index_history.txt

# Get the index number of the most recent log entry
$index = (Get-EventLog -ComputerName $server -LogName $logname -Newest 1).Index

# Calculate the number of new events to retrieve
if ($last_index) { $n = $index - $last_index }

# Get the log entries
$Lap = Get-EventLog -ComputerName $server -LogName $logname -Newest $n

# Save the current index for the next run
$index | Set-Content index_history.txt
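
One caveat with this approach: on the first run index_history.txt does not exist yet, and the index counter restarts if the log is cleared, so $n can end up undefined or negative. A hedged sketch of a fallback for those cases:

# Fall back to a time-based query on the first run or after the log was cleared
if (-not $last_index -or $index -lt $last_index) {
    $Lap = Get-EventLog -ComputerName $server -LogName $logname -After (Get-Date).AddHours(-24)
} elseif ($index -eq $last_index) {
    $Lap = @()   # nothing new since the last run
} else {
    $Lap = Get-EventLog -ComputerName $server -LogName $logname -Newest ($index - $last_index)
}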