Hi Team,
The design we use to read log files is: Filebeat -> Kafka topic -> Logstash -> Elasticsearch.
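For context, the Logstash input side is the standard kafka input plugin, roughly along these lines (the broker address and group id below are placeholders, not my real values; decorate_events is enabled because the filter further down reads [@metadata][kafka][topic]):

input {
  kafka {
    bootstrap_servers => "kafkabroker01:9092"  # placeholder broker address
    topics            => ["kafka_broker_1"]    # topic named in the sample event below
    group_id          => "logstash"            # placeholder consumer group
    codec             => "json"                # assumed; parses the Filebeat JSON payload into event fields
    decorate_events   => true                  # populates [@metadata][kafka][topic]
  }
}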
When the message arrives in Logstash, it is in the following format:
{
  "@timestamp": "2018-09-20T00:23:48.543Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "doc",
    "version": "6.3.2",
    "topic": "kafka_broker_1"
  },
  "source": "/apps/kafka/confluent-4.0.0/logs/server.log",
  "offset": 5158,
  "message": "[2018-09-19 19:23:46,612] INFO Incrementing log start offset of partition _confluent-metrics-5 to 8644625 in dir /data/kafka/kafka-logs (kafka.log.Log)",
  "prospector": {
    "type": "log"
  },
  "input": {
    "type": "log"
  },
  "beat": {
    "name": "test01",
    "hostname": "test01",
    "version": "6.3.2"
  },
  "host": {
    "name": "test01"
  }
}
And I'm using the following filter in Logstash:
filter {
  mutate {
    add_field => {
      "source"     => "%{[message][source]}"
      "kafkaTopic" => "%{[@metadata][kafka][topic]}"
    }
  }
  grok {
    match => { "inputjson" => "\[%{TIMESTAMP_ISO8601:logTime}\] %{LOGLEVEL:severity} %{GREEDYDATA:logMessage} \(%{JAVACLASS:loggerName}\)" }
  }
  date {
    match  => [ "logTime", "yyyy-MM-dd HH:mm:ss,SSS" ]
    target => "logTime"
  }
  if "_grokparsefailure" in [tags] {
    mutate {
      remove_field => [ "@version", "path", "type", "host" ]
    }
  } else {
    mutate {
      remove_field => [ "message", "@version", "path", "type", "host" ]
    }
  }
}
And the source field is coming through as a literal string in Elasticsearch:
"source" : "%{[message][source]}"
How do I access the nested fields in the incoming message? Any help or directions to fix this issue would be much appreciated. Thanks!
Regards,
Logeswaran Radhakrishnan