entries not entering logstash filter


I've been trying to parse Rails log entries sent from Beaver to my Logstash indexer, but certain entries are not entering the filter section at all; they appear on my Kibana dashboard in their original state (i.e. without the fields being extracted).

beaver conf
    path = [path to the log]
    exclude = [.gz]
    multiline_regex_before = ^\s+
    multiline_regex_after = ^F

Basically, any entry that starts on a new line with an F begins a multiline entry, and its continuation lines start with whitespace (see the sketch below).
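
For illustration, a hypothetical Rails fatal entry (the exception and file paths below are made up) would be shipped by Beaver as a single event, since the first line starts with F and the backtrace lines start with whitespace:

    F, [2015-03-10T10:53:01.000000 #26480] FATAL -- : NoMethodError (undefined method `name' for nil:NilClass):
      app/controllers/orders_controller.rb:12:in `show'
      lib/middleware/flash_session_cookie_middleware.rb:17:in `call'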

logstash conf
    input {
      sqs {
        ------parameters for the queue-------
      }
    }
    filter {
      grok {
        match => ["message", "%{NOTSPACE:level}, \[%{TIMESTAMP_ISO8601:timestamp} #%{POSINT:pid}\]%{SPACE}%{LOGLEVEL:event_level} -- %{GREEDYDATA:info}"]
      }
      multiline {
        pattern => "^F"
        what => "next"
        negate => false
      }
    }
    output {
      elasticsearch_http {
        host => "host address"
        port => "80"
      }
    }
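
Before restarting Logstash, the config can be sanity-checked without starting the pipeline (a generic sketch for a Logstash 1.x install, where the elasticsearch_http output still exists; the binary path depends on your setup):

    bin/logstash agent -f logstash.conf --configtest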

sample log entry:

E, [2015-03-10T10:52:34.125841 #26480] ERROR -- : [1;32mDeface:[0m 'auth_shared_login_bar' matched 0 times with 'li#search-bar'
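
Assuming the grok pattern above matches this line, the extracted fields would look roughly like the following (a sketch in rubydebug style; Logstash's @timestamp and @version metadata is omitted, and the [1;32m / [0m fragments are ANSI color codes left in the message):

    {
      "level"       => "E",
      "timestamp"   => "2015-03-10T10:52:34.125841",
      "pid"         => "26480",
      "event_level" => "ERROR",
      "info"        => ": [1;32mDeface:[0m 'auth_shared_login_bar' matched 0 times with 'li#search-bar'"
    }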

There is 1 best solution below.

BEST ANSWER

Thanks, but I got this to work properly. Basically, two instances of Logstash were running with different config files, and one of them did not have the grok patterns for the Rails log. I just killed that process and it worked fine.
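
For anyone debugging the same symptom, a quick way to spot stray Logstash instances is to list the running processes and the config file each was started with (a generic sketch; paths and process names vary by install):

    # the [l] trick keeps grep from matching itself
    ps aux | grep [l]ogstash

    # stop the instance running the wrong config (replace 12345 with its PID)
    kill 12345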