I'm using Logstash to collect logs from my ASA 5505, and I want to extract the source IP, destination IP, source port, and destination port so I can use them in Kibana. What should I write in the filter?
Here are some sample log messages:
<166>Aug 20 2014 05:51:34: %ASA-6-302014: Teardown TCP connection 8440 for inside:192.168.2.209/51483 to outside:104.16.13.8/80 duration 0:00:53 bytes 13984 TCP FINs
<166>Aug 20 2014 06:50:55: %ASA-6-305012: Teardown dynamic TCP translation from inside:192.168.2.209/33388 to outside:192.168.1.101/33388 duration 0:04:00
<167>Aug 20 2014 06:50:55: %ASA-7-609002: Teardown local-host outside:74.125.206.95 duration 0:04:00
<166>Aug 20 2014 06:50:55: %ASA-6-305012: Teardown dynamic TCP translation from inside:192.168.2.209/33390 to outside:192.168.1.101/33390 duration 0:04:00
<166>Aug 20 2014 06:50:54: %ASA-6-302014: Teardown TCP connection 10119 for inside:192.168.2.209/48466 to outside:173.194.66.84/443 duration 0:05:34 bytes 3160 TCP FINs
<167>Aug 20 2014 06:50:53: %ASA-7-710005: UDP request discarded from 192.168.1.199/3205 to outside:255.255.255.255/3206
And this is the filter being used:
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
Thanks
You'll want to add an extra grok block before the syslog_pri line. Basically, you'll need to build a pattern that matches each message type you care about. If any event comes through tagged _grokparsefailure, none of your patterns matched it and you'll need to figure out why. One way to do that is with http://grokdebug.herokuapp.com/ (it's how I came up with the patterns in the first place).
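As a starting point, here is a sketch covering the two Teardown formats in your samples. The field names (src_ip, src_port, dst_ip, dst_port, etc.) are my own choices, and since grok matches anywhere in the string, these patterns work on the raw message field even though the ASA timestamp (which includes a year) won't match %{SYSLOGTIMESTAMP} in your first grok:

```
grok {
  match => { "message" => [
    # %ASA-6-302014: Teardown TCP connection ... extracts connection id, interfaces, IPs, and ports
    "Teardown TCP connection %{INT:connection_id} for %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port} to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port}",
    # %ASA-6-305012: Teardown dynamic TCP translation ... same fields, different wording
    "Teardown dynamic TCP translation from %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port} to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port}"
  ] }
}
```

You'd add one more pattern per message type (e.g. the 710005 UDP-discard line, which has no interface on the source side). Also worth checking: recent Logstash releases ship ready-made Cisco firewall patterns in the firewalls pattern file (e.g. %{CISCOFW302013_302014_302015_302016}); if your version has them, they may save you writing these by hand.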