I have a log containing a JSON object. The log gets parsed when the JSON object has no spaces, but when there is a space between a key and its value, it is not parsed.
Configuration file used:
input {
  syslog {
    port => 3011
  }
}
filter {
  grok {
    match => { "message" =>
      [
        "%{SYSLOGTIMESTAMP:timestamp4} %{DATA:time_ms}\|%{DATA:field1}\|%{DATA:field2}\|99\|%{DATA:field3}\|%{DATA:field4}\|%{DATA:field5}\|%{DATA:field6}\|%{DATA:field7}\|%{DATA:field8}\|%{DATA:field9}\|%{DATA:field10}\|%{DATA:field11}\|%{DATA:field12}\|%{GREEDYDATA:field13}"
      ]
    }
  }
  date {
    match => ["timestamp4", "MMM dd HH:mm:ss"]
  }
  if [field13] {
    mutate {
      add_field => { "log_type" => "my-logs" }
    }
  }
}
output {
  if [log_type] == "my-logs" {
    stdout { codec => rubydebug }
    elasticsearch {
      hosts => ["ES_HOST:9200"]
      index => "my-logs-000001"
    }
  }
}
Logs getting parsed:
echo "Mar 21 13:27:11 11:11.366293|dataadwhw1|ebsmp4713user5_@maiator|99|4064|22|SUCCESS|data|19|UA101|10.1.1.70|https|data.com|{"wrg_id":"200000337"}|200" | nc localhost 3011 echo "Mar 21 13:27:11 11:11.366293|dataadwhw1|ebsmp4713user5_@maiator|99|4064|22|SUCCESS|data|19|UA101|10.1.1.70|https|data.com|{"wrg_id" :"200000337"}|200" | nc localhost 3011
Log not getting parsed:
echo "Mar 21 13:27:11 11:11.366293|dataadwhw1|ebsmp4713user5_@maiator|99|4064|22|SUCCESS|data|19|UA101|10.1.1.70|https|data.com|{"wrg_id": "200000337"}|200" | nc localhost 3011
It looks like your data is in CSV format delimited with |, so I recommend using the Logstash csv filter instead of grok. After that, if you want to remove the whitespace, you can use the mutate filter's strip option.
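A minimal sketch of that approach, replacing the grok block in the filter section above. The column names (ts_and_ms, field1 ... field13) are assumptions that mirror your grok pattern, and the first column will hold both the syslog timestamp and time_ms, since those two are separated by a space rather than a pipe:

filter {
  csv {
    # Split the message on the pipe delimiter instead of grok-matching it.
    separator => "|"
    # Assumed column names mirroring the grok pattern above;
    # "code" stands in for the literal 99 column.
    columns => ["ts_and_ms", "field1", "field2", "code", "field3", "field4",
                "field5", "field6", "field7", "field8", "field9", "field10",
                "field11", "field12", "field13"]
  }
  mutate {
    # strip trims leading and trailing whitespace from the listed fields.
    strip => ["field12", "field13"]
  }
}

Note that strip does not touch whitespace inside a field, but a space after the colon is still valid JSON, so if you want the keys of the JSON column (field12 here) as event fields you could additionally run it through the json filter, e.g. json { source => "field12" }.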