How to import Linux logs into ELK stack for log analysis?


I'm building a log analysis environment for analyzing Linux logs such as /var/log/auth.log, /var/log/cron, /var/log/syslog, etc. The goal is to be able to upload such a log file and analyze it properly with Kibana/Elasticsearch. To do so, I created the .conf file seen below, which includes the patterns needed to parse auth.log as well as the required input and output sections. Unfortunately, when connecting to Kibana I cannot see any data in the "Discover" panel and cannot find the related index pattern. I tested the grok patterns and they work well.

input {
  file {
    type => "linux-auth"
    path => [ "/home/ubuntu/logs/auth.log"]
  }
}
filter {
  if [type] == "linux-auth" {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:time} %{WORD:method}\[%{POSINT:auth_pid}\]\: %{DATA:message} for %{DATA:user} from %{IPORHOST:IP_address} port %{POSINT:port}" }
      }
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:time} %{WORD:method}\[%{POSINT:auth_pid}\]\:%{DATA:message} for %{GREEDYDATA:username}" }
      }
  }
}
output{
    elasticsearch {
        hosts => "elasticsearch:9200"
    }
}

Example of auth.log:

2018-12-02T14:01:00Z sshd[0000001]: Accepted keyboard-interactive/pam for root from 185.118.167.241 port 64965 ssh2
2018-12-02T14:02:00Z sshd[0000002]: Failed keyboard-interactive/pam for invalid user ubuntu from 36.104.140.175 port 57512 ssh2
2018-12-02T14:03:00Z sshd[0000003]: pam_unix(sshd:session): session closed for user root
1 Answer

Here are a few recommendations:

  1. Run Logstash in debug mode to check what the exact error is:
bin/logstash --debug -f file_path.conf
  2. Add a stdout plugin to the output section to print the incoming events, so you can be sure that Logstash is reading the file correctly (see the sketch after this list).
  3. Most importantly, since you mention you want to read system logs and visualize the data, I would recommend using Filebeat with its system module. Filebeat is built especially for use cases like reading from files.
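
For reference, a minimal sketch of such an output section (the elasticsearch host is taken from your config; the rubydebug codec pretty-prints each parsed event with all its fields):

output {
  # print every event to the console for debugging
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "elasticsearch:9200"
  }
}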

It is a simple setup: under Filebeat's system module you just specify which system log files to read, point the output at your Elasticsearch endpoint, and run Filebeat (a sketch follows below).

It will start reading the files and pushing the data to Elasticsearch.
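
As a rough sketch (the paths and host below are assumptions, adjust them to your environment), the system module config in modules.d/system.yml could look like this:

- module: system
  syslog:
    enabled: true
    var.paths: ["/var/log/syslog*"]    # assumed path, adjust as needed
  auth:
    enabled: true
    var.paths: ["/var/log/auth.log*"]  # assumed path, adjust as needed

with the endpoint in filebeat.yml (host taken from your Logstash config):

output.elasticsearch:
  hosts: ["elasticsearch:9200"]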

Also, you don't need to build custom dashboards in Kibana (as you would have to in the Logstash case); Filebeat comes with pre-configured dashboards for system logs (see the setup commands below).
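
Loading the dashboards is a one-time step. The commands below are standard Filebeat CLI usage (run from the Filebeat directory; Kibana must be reachable for the setup step):

filebeat modules enable system
filebeat setup --dashboards   # loads the pre-built Kibana dashboards
filebeat -e                   # start shipping, logging to stderr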

You can check the official Filebeat documentation for more details.