Logstash config: how to transfer an AWS S3 CSV without a header to Elasticsearch


I have a sample CSV file in S3 with 3 columns and no header row. During the data transfer from the S3 CSV to Elasticsearch, I want to give each column a name (in my case id, name, and age for columns 0 to 2 respectively).

Input Sample.csv:

  1,myname,23
  2,myname2,24

The expected output should be the following docs in the ES index:

        [{
            "_index": "user_detail",
            "_type": "user_detail_type",
            "_id": "1",
            "_score": 1.0,
            "_source": {
                "id": "1",
                "name": "myname",
                "age": "23"
            }
        },
        {
            "_index": "user_detail",
            "_type": "user_detail_type",
            "_id": "2",
            "_score": 1.0,
            "_source": {
                "id": "2",
                "name": "myname2",
                "age": "24"
            }
        }]
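
For reference, docs of this shape can be retrieved after indexing with a standard Elasticsearch search request against the index used below:

  curl 'localhost:9200/user_detail/_search?pretty'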

The Logstash config I have written so far is:

input {
  s3 {
    bucket => "users"
    region => "us-east-1"
    watch_for_new_files => false
    prefix => "user.csv"
  }
}

filter {
  # Need help here
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "user_detail"
    document_type => "user_detail_type"
    document_id => "%{id}"
  }
}

Doubt: What should I write in the filter section (or change elsewhere in the config) to map column[0] => id, column[1] => name, and column[2] => age during insertion into Elasticsearch?
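
A minimal sketch of what the filter block could look like, assuming the stock Logstash csv filter plugin (bundled with the default distribution). The field names id, name, and age come from the question; the list of metadata fields removed by mutate is an assumption about what you may not want in _source:

filter {
  csv {
    # Split each line on commas and assign the three positional
    # columns to named fields, in order.
    separator => ","
    columns => ["id", "name", "age"]
  }
  mutate {
    # Assumed optional cleanup: drop Logstash bookkeeping fields so
    # _source holds only id, name, and age (adjust to taste).
    remove_field => ["message", "@version", "@timestamp"]
  }
}

With this filter in place, the document_id => "%{id}" in the existing output block resolves to the value parsed from the first column, so re-running the pipeline updates documents in place instead of duplicating them.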
