I'm trying to use the json filter for Logstash 2.1. I find the "add_field" option is ignored. I had a more complex config earlier, but I stripped it down to test just the add_field.
Your JSON message gets deserialized by the json codec in your input, so the subsequent json filter has no message field to parse; it bails out early and never applies add_field.
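To make this concrete, a minimal config along these lines (a hypothetical reconstruction, not the original file) reproduces the symptom: the input's json codec decodes the payload before any filter runs, so the json filter finds no message field and skips add_field entirely:

```
input {
  generator {
    lines => ['{"clicked":true}']
    count => 1
    codec => "json"    # payload is decoded here, before any filter runs
  }
}

filter {
  json {
    source => "message"                        # no such field exists on the event
    add_field => { "tag" => "clicked-event" }  # hypothetical field; never added
  }
}

output {
  stdout { codec => rubydebug }
}
```

Dropping the codec line (the generator input defaults to plain) leaves the raw JSON string in message, and the json filter, add_field included, then works as expected.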
Thank you. When I switched from the generator input to reading the JSON from a simple text file, it worked.
However, when I move back towards my original pipeline (kafka input, json filter, stdout output), I once more find add_field ineffective.
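For reference, that pipeline would look roughly like this (a hypothetical reconstruction; the broker/ZooKeeper settings and field name are placeholders):

```
input {
  kafka {
    zk_connect => "localhost:2181"   # placeholder ZooKeeper address
    topic_id => "elktest"
    # codec defaults to "json" for this input
  }
}

filter {
  json {
    source => "message"
    add_field => { "pipeline" => "kafka-test" }  # hypothetical field; ignored again
  }
}

output {
  stdout { codec => rubydebug }
}
```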
C:\kafka_2.10-0.8.2.1>.\bin\windows\kafka-console-producer.bat --broker-list localhost:9092 --topic elktest
[2015-12-07 15:17:59,764] WARN Property topic is not valid (kafka.utils.VerifiableProperties)
{"clicked":true}
Logstash console:
E:\ELK\logstash-2.1.0>bin\logstash agent -f add_field.conf
io/console not supported; tty will not be manipulated
Settings: Default filter workers: 2
Logstash startup completed
{
"clicked" => true,
"@version" => "1",
"@timestamp" => "2015-12-07T09:48:06.234Z"
}
Am I doing something wrong here?
My original configuration needed to extract a nested timestamp from the JSON consumed from Kafka. The input JSON was something like:
AFAICT the events don't have a message field for the json filter to parse. Apart from @timestamp (which Logstash probably added on its own) there's just clicked.
Sorry, I'm rather new to Logstash. Isn't "message" where the payload ends up by default? Won't the entire json read from Kafka be available in "message"?
Not with codec => "json", which is the default for the kafka input. The JSON payload is then deserialized by the input plugin itself, and there will only be a message field if the JSON payload defines one.
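Two ways around this, sketched under that assumption (field names and connection settings are placeholders): either add the field with a mutate filter, which runs unconditionally and does not depend on a message field, or override the input's codec to plain so the raw JSON lands in message and the json filter genuinely has something to parse:

```
# Option 1: keep the default codec => "json" and add the field via mutate
filter {
  mutate {
    add_field => { "ingested_from" => "kafka" }  # hypothetical field name
  }
}

# Option 2: force the raw payload into "message" and parse it yourself
input {
  kafka {
    zk_connect => "localhost:2181"   # placeholder
    topic_id => "elktest"
    codec => "plain"
  }
}
filter {
  json {
    source => "message"
    add_field => { "ingested_from" => "kafka" }  # applied once parsing succeeds
  }
}
```

With Option 2 the json filter's add_field fires only when the parse succeeds, which can be handy if you also want _jsonparsefailure tagging on bad payloads.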