How to watch or read new log files with Logstash?


EDIT: Trying to watch Magento report logs. Magento writes its crash report to a new file every time an exception occurs.

According to the official Logstash documentation (https://www.elastic.co/guide/en/logstash/current/plugins-inputs-file.html), the option discover_interval means "How often (in seconds) we expand the filename patterns in the path option to discover new files to watch."

The problem: I have a system that writes its logs to a new file every time a new exception occurs (the file name is the exception number). On startup, Logstash starts watching the existing files, but I can't get it to read the new files.

My conf file:

    input {
      file {
        type => "error-report-log"
        path => "/srv/www/var/report"
        #start_position => "beginning"
        ignore_older => 30
        close_older => 30
        discover_interval => 5
        codec => multiline {
          pattern => "."
          what => "previous"
        }
      }
    }

As you can see, I've tried discover_interval with no luck. New files are not being picked up.

Am I missing something, or does Logstash simply not support this kind of behavior?

Thanks in advance.


There are 3 answers below.


I guess you're missing the sincedb_path in your file input. What if you have your input like this:

input {
  file {
    type => "error-report-log"
    path => "/srv/www/var/report"
    sincedb_path => "/dev/null"   # <-- add this line
    start_position => "beginning" # <-- uncomment this
    ignore_older => 0             # <-- change it to zero
    codec => multiline {
      pattern => "."
      what => "previous"
    }
  }
}

Once you have that, Logstash should pick up any new lines or new files that are added.
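
One more thing worth double-checking that the answer above does not mention (so treat it as an assumption about your setup): the file input's path option expects glob patterns rather than a bare directory, so if /srv/www/var/report is a directory you would likely also need a wildcard so new report files match:

path => "/srv/www/var/report/*"   # glob pattern so newly created report files are matched

With sincedb_path set to "/dev/null", Logstash also forgets read positions between restarts, so every matched file is read from the beginning on each start.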


SOLUTION: Using Filebeat and passing the logs through Logstash is a much better approach:

filebeat:
  prospectors:
    -
      input_type: log
      scan_frequency: 5s
      paths:
        - /srv/www/var/report/*
      fields:
        app_id: "${APP_ID}"
        type: error-report-log
      multiline:
        pattern: '^\$'
        negate: true
        match: after
output:
  logstash:
    enabled: true
    hosts: ["my.logstash.com:5044"]

This solution has been tested with Magento 2.

Magento 2 writes its exception/crash report to a new file every time.
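
For completeness, the Logstash side of this setup needs a beats input listening on the port Filebeat ships to (5044 above). A minimal sketch, assuming Filebeat's default fields_under_root: false (so the custom type ends up under [fields]); the Elasticsearch host and index name are placeholders, not part of the original answer:

input {
  beats {
    port => 5044    # must match the port in the Filebeat output above
  }
}

output {
  if [fields][type] == "error-report-log" {
    elasticsearch {
      hosts => ["localhost:9200"]                  # assumed Elasticsearch endpoint
      index => "magento-reports-%{+YYYY.MM.dd}"    # illustrative index name
    }
  }
}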


Thanks @kaßta for the template!

I improved it a bit so files are not picked up again and again, by using ignore_older together with scan_frequency. Also, close_eof keeps processing resources to a minimum.

filebeat.prospectors:
  - input_type: log
    close_eof: true
    ignore_older: 1m
    scan_frequency: 1m
    paths:
      - "/srv/www/var/report/*"
    fields:
      project: shop
      app: magento
      env: development
    multiline:
      pattern: '^\$'
      negate: true
      match: after

output.logstash:
  hosts: ['logstash:5044']
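
As a side note (not from the original answer): the extra fields in this config (project, app, env) arrive in Logstash under [fields] by default, so they can be used for routing or tagging. A small illustrative sketch, with a hypothetical tag name:

filter {
  if [fields][app] == "magento" and [fields][env] == "development" {
    mutate {
      add_tag => ["magento-dev-report"]   # tag these events for later filtering
    }
  }
}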