How to write log data from a file to Splunk


An application writes its logs into files which are saved, e.g., in the /home/my-user/myapp/ directory.

I want to send the data (logs) from those files to Splunk.

I was thinking of using the [inputs.file] or [inputs.tail] plugin in Telegraf.

In order to verify that this will work, I first want to output the data from a log file, /home/my-user/myapp/connect.log.1, to another file, /tmp/testoutput_log.

The example log data is as follows:

[2022-09-02 20:06:30,199] INFO [sftp_source_bht_extract|task-0] No files matching [^\s]+(\.(?i)(csv))$ were found in /dci/BHT (io.confluent.connect.sftp.source.SftpFileDequeue:86)
[2022-09-02 20:06:30,446] INFO [mongo_sftp_source_billing_statement_history|task-0] File queue out of files, searching for new file(s) in /dci/genius/out (io.confluent.connect.sftp.source.SftpFileDequeue:66)
[2022-09-02 20:06:30,449] INFO [sftp_source_bht_extract|task-0] File queue out of files, searching for new file(s) in /dci/BHT (io.confluent.connect.sftp.source.SftpFileDequeue:66)

Here is the part of the Telegraf config covering the use case described above:

[[inputs.file]]
  files = ["/home/my-user/myapp/connect.log.1"]
  data_format = "grok"
  grok_patterns = ["%{GREEDYDATA}"]

[[outputs.file]]
  files = ["stdout", "/tmp/testoutput_log"]
  data_format = "influx"

But the output file remains empty. I also tried various output data formats, without success.

What am I doing wrong?


There is 1 answer below.


Per the docs, you need to define at least one field when using the grok parser. A bare %{GREEDYDATA} pattern matches the whole line but does not name a capture, so no field is created and Telegraf emits no metrics.

When using the influx data format, a metric must contain at least one field to be valid, hence this requirement.
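A minimal fix is to name the capture in the grok pattern using the %{PATTERN:field_name} syntax, which turns the match into a field on the metric. A sketch of the corrected config (the field name "message" is an arbitrary choice, any name works):

```toml
[[inputs.file]]
  files = ["/home/my-user/myapp/connect.log.1"]
  data_format = "grok"
  # Naming the capture ("message") creates a field on the metric,
  # satisfying the influx serializer's requirement of at least one field.
  grok_patterns = ["%{GREEDYDATA:message}"]

[[outputs.file]]
  files = ["stdout", "/tmp/testoutput_log"]
  data_format = "influx"
```

With this change, each log line should appear in /tmp/testoutput_log as an influx line-protocol record with a message field holding the raw line.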