Logstash json filter's add_field ignored

I'm trying to use the json filter with Logstash 2.1, and I find that the "add_field" option is ignored. I had a more complex config earlier, but I stripped it down to test just add_field.

input {
  generator {
    lines => ['{"@message": {"clicked":true}}']
    codec => "json"
    count => 1
  }
}

filter {
  json {
    source => "message"
    add_field => { "somefield" => "Well hello there!" }
  }
}

output {
  stdout { codec => rubydebug }
}

Console on run:

E:\ELK\logstash-2.1.0>bin\logstash agent -f logstash.conf
io/console not supported; tty will not be manipulated
Settings: Default filter workers: 2
Logstash startup completed
{
      "@message" => {
        "clicked" => true
    },
      "@version" => "1",
    "@timestamp" => "2015-12-07T08:34:45.508Z",
          "host" => "IND-PUN-LAP-096",
      "sequence" => 0
}

Is there something I'm doing wrong here, or is this a bug with 2.1?

Your JSON message gets unmarshaled by the json codec in your input, so the subsequent json filter has no message field to parse; it bails out early and never triggers add_field (add_field only runs when the filter succeeds).
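
For example, dropping the json codec from the generator (whose codec defaults to plain) should leave the raw string in the message field for the json filter to parse, letting add_field fire. A minimal sketch of the adjusted config:

input {
  generator {
    # no json codec: the raw line lands in the message field as plain text
    lines => ['{"@message": {"clicked":true}}']
    count => 1
  }
}

filter {
  json {
    # now there is a message field to parse, so add_field fires on success
    source => "message"
    add_field => { "somefield" => "Well hello there!" }
  }
}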

Thank you. When I switched from the generator to reading the JSON from a simple text file, it worked.
However, when I move back to my original pipeline (kafka input, json filter, console output), I once again find add_field ineffective.

input {
  kafka {
    zk_connect => "localhost:2181"
    topic_id => "elktest"
  }
}

filter {
  json {
    source => "message"
    add_field => { "somefield" => "Well hello there!" }
  }
}

output {
  stdout { codec => rubydebug }
}

My Kafka producer console:

C:\kafka_2.10-0.8.2.1>.\bin\windows\kafka-console-producer.bat --broker-list localhost:9092 --topic elktest
[2015-12-07 15:17:59,764] WARN Property topic is not valid (kafka.utils.VerifiableProperties)
{"clicked":true}

Logstash console:

E:\ELK\logstash-2.1.0>bin\logstash agent -f add_field.conf
io/console not supported; tty will not be manipulated
Settings: Default filter workers: 2
Logstash startup completed
{
       "clicked" => true,
      "@version" => "1",
    "@timestamp" => "2015-12-07T09:48:06.234Z"
}

Am I doing something wrong here?

My original configuration needed to extract a nested timestamp from the JSON consumed from Kafka, which looked something like this:

{
  "userInfo": {
    "logTime": "2015-10-06 05:54:53.106",
    ...
  }
}

I needed to extract logTime and assign it to an eventTimestamp.
So my filter looked like this:

filter {
  json {
    source => "message"
    add_field => { "eventTimestamp" => "%{[userInfo][logTime]}" }
  }
}

Is this incorrect?

AFAICT the events don't have a message field for the json filter to parse. Apart from @timestamp and @version (which Logstash probably added on its own), there's just clicked.
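
Since the input's json codec has already placed the parsed fields at the top level of the event, the nested timestamp from the original configuration could be copied with a mutate filter instead of a json filter. A minimal sketch, reusing the eventTimestamp field name from the question above:

filter {
  mutate {
    # the fields already exist on the event, so no JSON parsing is needed
    add_field => { "eventTimestamp" => "%{[userInfo][logTime]}" }
  }
}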

Sorry, I'm rather new to Logstash. Isn't "message" where the payload ends up by default? Won't the entire JSON read from Kafka be available in "message"?

Not with codec => "json", which is the default for the kafka input. In that case the JSON payload is unmarshaled by the input plugin itself, and there will only be a message field if the JSON payload defines one.
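
If you want the json filter (and its add_field) to do the work, one option would be to force a plain codec on the input so the raw JSON string is kept in the message field. A sketch under that assumption:

input {
  kafka {
    zk_connect => "localhost:2181"
    topic_id => "elktest"
    # keep the payload as a raw string in the message field
    codec => plain
  }
}

filter {
  json {
    source => "message"
    add_field => { "eventTimestamp" => "%{[userInfo][logTime]}" }
  }
}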

Aaaah! Thanks a ton!